Sample records for improving existing models

  1. Improving inflow forecasting into hydropower reservoirs through a complementary modelling framework

    NASA Astrophysics Data System (ADS)

    Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.

    2014-10-01

    Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and the benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead-time is considered within the day-ahead (Elspot) market of the Nordic exchange market. We present here a new approach for issuing hourly reservoir inflow forecasts that aims to improve on existing forecasting models that are in place operationally, without modifying the pre-existing approach: instead, an additive or complementary model is formulated that is independent of the existing model and captures the structure it may be missing. Besides improving the forecast skill of operational models, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that carry suitable information for reducing uncertainty in decision-making in hydropower systems operation. The procedure comprises an error model added on top of an unalterable constant-parameter conceptual model, demonstrated with reference to the 207 km² Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead-times up to 17 h. Season-based evaluations indicated that the improvement in inflow forecasts varies across seasons; inflow forecasts in autumn and spring are less successful, with the 95% prediction interval bracketing less than 95% of the observations for lead-times beyond 17 h.
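
The complementary-modelling idea (leave the operational model untouched and fit a separate model to its residuals) can be sketched with a hypothetical AR(1) error model; the paper derives its actual error-model structure from the residual attributes, so the form and numbers below are illustrative only.

```python
# Sketch of a complementary (additive) error model: the operational
# conceptual model is treated as a black box, and an AR(1) model fitted
# to its residuals corrects the forecast at a given lead-time.

def fit_ar1(residuals):
    """Least-squares estimate of phi in e[t] = phi * e[t-1] + noise."""
    num = sum(residuals[t] * residuals[t - 1] for t in range(1, len(residuals)))
    den = sum(e * e for e in residuals[:-1])
    return num / den

def complementary_forecast(base_forecast, last_residual, phi, lead):
    """Base (unaltered) forecast plus the propagated residual correction."""
    return base_forecast + (phi ** lead) * last_residual

# Hypothetical hourly residuals of the conceptual model (m3/s):
residuals = [2.0, 1.6, 1.3, 1.0, 0.8, 0.65, 0.5, 0.4]
phi = fit_ar1(residuals)

# Correct a 3-hour-ahead inflow forecast of 120 m3/s:
corrected = complementary_forecast(120.0, residuals[-1], phi, lead=3)
```

The correction decays toward zero with lead-time, consistent with the paper's finding that improvement fades beyond roughly 17 h.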

  2. Systems oncology: towards patient-specific treatment regimes informed by multiscale mathematical modelling.

    PubMed

    Powathil, Gibin G; Swat, Maciej; Chaplain, Mark A J

    2015-02-01

    The multiscale complexity of cancer as a disease necessitates a corresponding multiscale modelling approach to produce truly predictive mathematical models capable of improving existing treatment protocols. To capture all the dynamics of solid tumour growth and its progression, mathematical modellers need to couple biological processes occurring at various spatial and temporal scales (from genes to tissues). Because the effectiveness of cancer therapy is considerably affected by intracellular and extracellular heterogeneities as well as by dynamical changes in the tissue microenvironment, any model that attempts to optimise existing protocols must consider these factors, ultimately leading to improved multimodal treatment regimes. By improving existing mathematical models of cancer and building new ones, modellers can play an important role in preventing the use of potentially sub-optimal treatment combinations. In this paper, we analyse a multiscale computational mathematical model for cancer growth and spread, incorporating the multiple effects of radiation therapy and chemotherapy on the patient survival probability, and implement the model using two different cell-based modelling techniques. We show that the insights provided by such multiscale modelling approaches can ultimately help in designing optimal patient-specific multi-modality treatment protocols that may increase patients' quality of life. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. How Theory-Building Research on Instruction Can Support Instructional Improvement: Toward a Modelling Perspective in Secondary Geometry

    ERIC Educational Resources Information Center

    Herbst, Patricio

    2016-01-01

    How can basic research on mathematics instruction contribute to instructional improvement? In our research on the practical rationality of geometry teaching we describe existing instruction and examine how existing instruction responds to perturbations. In this talk I consider the proposal that geometry instruction could be improved by infusing it…

  4. A new enhanced index tracking model in portfolio optimization with sum weighted approach

    NASA Astrophysics Data System (ADS)

    Siew, Lam Weng; Jaaman, Saiful Hafizah; Hoe, Lam Weng

    2017-04-01

    Index tracking is a portfolio management approach that aims to construct an optimal portfolio achieving a return similar to the benchmark index return at minimum tracking error, without purchasing all the stocks that make up the index. Enhanced index tracking is an improved form of index tracking that aims to generate a higher portfolio return than the benchmark index return while minimizing the tracking error. The objective of this paper is to propose a new enhanced index tracking model with a sum weighted approach to improve the existing index tracking model for tracking the benchmark Technology Index in Malaysia. The optimal portfolio composition and performance of both models are determined and compared in terms of portfolio mean return, tracking error and information ratio. The results of this study show that the optimal portfolio of the proposed model is able to generate a higher mean return than the benchmark index at minimum tracking error. In addition, the proposed model outperforms the existing model in tracking the benchmark index. The significance of this study is to propose a new enhanced index tracking model with a sum weighted approach which contributes a 67% improvement in portfolio mean return compared with the existing model.
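
The two objectives described above (minimise tracking error, prefer higher mean return) can be sketched with a toy three-stock universe; a coarse simplex grid search stands in for the paper's optimisation model, and all return figures are hypothetical.

```python
# Toy enhanced index tracking: choose portfolio weights to minimise
# tracking error against a benchmark, breaking ties by mean return.

import itertools

# Weekly returns (hypothetical): three stocks and the benchmark index.
stocks = [
    [0.02, -0.01, 0.03, 0.00],
    [0.01,  0.02, -0.02, 0.01],
    [0.03,  0.00, 0.02, -0.01],
]
index = [0.02, 0.01, 0.01, 0.00]

def portfolio_returns(weights):
    return [sum(w * s[t] for w, s in zip(weights, stocks))
            for t in range(len(index))]

def tracking_error(weights):
    diffs = [p - b for p, b in zip(portfolio_returns(weights), index)]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def mean_return(weights):
    r = portfolio_returns(weights)
    return sum(r) / len(r)

# Enumerate weights on a 5% grid of the simplex w1 + w2 + w3 = 1.
grid = [i / 20 for i in range(21)]
candidates = []
for w1, w2 in itertools.product(grid, grid):
    w3 = 1.0 - w1 - w2
    if w3 >= -1e-9:
        candidates.append((w1, w2, max(0.0, w3)))

# Enhanced tracking: minimise tracking error, then maximise mean return.
best = min(candidates,
           key=lambda w: (round(tracking_error(w), 6), -mean_return(w)))
```

A real formulation would also constrain turnover and holdings, but the lexicographic objective is the essence of "enhanced" tracking.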

  5. Diagnostic utility of appetite loss in addition to existing prediction models for community-acquired pneumonia in the elderly: a prospective diagnostic study in acute care hospitals in Japan

    PubMed Central

    Yamamoto, Yosuke; Terada, Kazuhiko; Ohta, Mitsuyasu; Mikami, Wakako; Yokota, Hajime; Hayashi, Michio; Miyashita, Jun; Azuma, Teruhisa; Fukuma, Shingo; Fukuhara, Shunichi

    2017-01-01

    Objective Diagnosis of community-acquired pneumonia (CAP) in the elderly is often delayed because of atypical presentation and non-specific symptoms, such as appetite loss, falls and disturbance in consciousness. The aim of this study was to investigate the external validity of existing prediction models and the added value of the non-specific symptoms for the diagnosis of CAP in elderly patients. Design Prospective cohort study. Setting General medicine departments of three teaching hospitals in Japan. Participants A total of 109 elderly patients who consulted for upper respiratory symptoms between 1 October 2014 and 30 September 2016. Main outcome measures The reference standard for CAP was chest radiographs evaluated by two certified radiologists. The existing models were externally validated for diagnostic performance by calibration plot and discrimination. To evaluate the additional value of the non-specific symptoms to the existing prediction models, we developed an extended logistic regression model. Calibration, discrimination, category-free net reclassification improvement (NRI) and decision curve analysis (DCA) were investigated in the extended model. Results Among the existing models, the model by van Vugt demonstrated the best performance, with an area under the curve of 0.75 (95% CI 0.63 to 0.88); the calibration plot showed good fit despite a significant Hosmer-Lemeshow test (p=0.017). Among the non-specific symptoms, appetite loss had a positive likelihood ratio of 3.2 (2.0–5.3), a negative likelihood ratio of 0.4 (0.2–0.7) and an OR of 7.7 (3.0–19.7). Addition of appetite loss to the model by van Vugt led to improved calibration (p=0.48), an NRI of 0.53 (p=0.019) and higher net benefit by DCA. Conclusions Information on appetite loss improved the performance of an existing model for the diagnosis of CAP in the elderly. PMID:29122806
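
The likelihood-ratio figures reported for appetite loss come from a standard 2×2 calculation, which can be sketched as follows; the counts below are hypothetical, not the study's data.

```python
# Positive and negative likelihood ratios for a diagnostic symptom
# (e.g. appetite loss) from a 2x2 table of symptom vs disease status.

def likelihood_ratios(tp, fn, fp, tn):
    """LR+ = sensitivity / (1 - specificity); LR- = (1 - sens) / spec."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

# Hypothetical counts: symptom present/absent vs pneumonia yes/no.
lr_pos, lr_neg = likelihood_ratios(tp=24, fn=12, fp=15, tn=58)
```

An LR+ above ~3 and an LR- below ~0.5, as reported for appetite loss, shift post-test probability enough to be clinically useful.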

  6. Improved dual-porosity models for petrophysical analysis of vuggy reservoirs

    NASA Astrophysics Data System (ADS)

    Wang, Haitao

    2017-08-01

    A new vug interconnection, the isolated vug (IVG), was investigated through resistivity modeling, and the dual-porosity model for connected-vug (CVG) vuggy reservoirs was tested. The vuggy models were built by pore-scale modeling, and their electrical resistivity was calculated by the finite difference method. For CVG vuggy reservoirs, the CVG reduced the formation factors and increased the porosity exponents, and the existing dual-porosity model failed to match these results. Based on the existing dual-porosity model, a conceptual dual-porosity model for CVG was developed by introducing a decoupled term to reduce the resistivity of the model. For IVG vuggy reservoirs, the IVG increased the formation factors and porosity exponents. The existing dual-porosity model succeeded here because it accurately calculates the formation factors of the interparticle porous media deformed by the insertion of the IVG. Based on the existing dual-porosity model, a new porosity model for IVG vuggy reservoirs was developed by recalculating the formation factors of the altered interparticle pore-scale models. The formation factors and porosity exponents from the improved and extended dual-porosity models for CVG and IVG vuggy reservoirs matched the simulated values well. This work is helpful for understanding the influence of connected and disconnected vugs on resistivity factors, an issue of particular importance in carbonates.
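
The porosity exponent discussed throughout is the m of Archie's law, F = phi**(-m); a minimal sketch of inverting it from a simulated formation factor follows. The numbers are illustrative, and the paper's dual-porosity models modify this basic single-porosity relation.

```python
# Archie's law relates formation factor F to porosity phi via
# F = phi**(-m), where m is the porosity (cementation) exponent.

import math

def porosity_exponent(formation_factor, phi):
    """Invert F = phi**(-m) for m."""
    return -math.log(formation_factor) / math.log(phi)

def formation_factor(phi, m):
    return phi ** (-m)

# Illustrative: total porosity 0.15, simulated formation factor 45.
m = porosity_exponent(45.0, 0.15)
F_check = formation_factor(0.15, m)
```

In the vuggy-reservoir workflow, vugs change the simulated F at a given total porosity, so the apparent m shifts; the dual-porosity models account for that shift.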

  7. Model Uncertainty Quantification Methods In Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of data assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification: the outcome of any data assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
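
The dependence of the assimilation result on the assigned uncertainties is visible even in the scalar Kalman update, sketched here with illustrative numbers.

```python
# Scalar Kalman update: the core of many data assimilation schemes.
# A forecast (variance var_f) is combined with an observation
# (variance var_o); the weighting depends entirely on the
# uncertainties assigned to model and observation.

def kalman_update(forecast, var_f, obs, var_o):
    gain = var_f / (var_f + var_o)          # Kalman gain
    analysis = forecast + gain * (obs - forecast)
    var_a = (1.0 - gain) * var_f            # analysis variance
    return analysis, var_a

# Model says 10.0 with variance 4.0; observation says 12.0 with variance 1.0:
analysis, var_a = kalman_update(10.0, 4.0, 12.0, 1.0)
```

Misspecifying var_f or var_o moves the analysis toward the wrong source, which is why model uncertainty quantification is central.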

  8. 78 FR 13874 - Watershed Modeling To Assess the Sensitivity of Streamflow, Nutrient, and Sediment Loads to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-01

    ... an improved understanding of methodological challenges associated with integrating existing tools and... methodological challenges associated with integrating existing tools (e.g., climate models, downscaling... sensitivity to methodological choices such as different approaches for downscaling global climate change...

  9. Using Bayesian Networks to Improve Knowledge Assessment

    ERIC Educational Resources Information Center

    Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra

    2013-01-01

    In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…
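
The core update performed by a Bayesian student model can be sketched as a single-skill Bayes-rule update with slip and guess parameters, a drastic simplification of the generic model described above; all parameter values here are illustrative.

```python
# Posterior probability that a student has mastered a skill, updated
# after each answer via Bayes' rule with slip and guess probabilities.

def posterior_mastery(prior, slip, guess, correct):
    if correct:
        num = prior * (1.0 - slip)                  # mastered, no slip
        den = num + (1.0 - prior) * guess           # or lucky guess
    else:
        num = prior * slip                          # mastered but slipped
        den = num + (1.0 - prior) * (1.0 - guess)   # or genuinely unknown
    return num / den

# Two correct answers followed by one incorrect answer:
p = 0.5
for answer in (True, True, False):
    p = posterior_mastery(p, slip=0.1, guess=0.2, correct=answer)
```

A full Bayesian student model networks many such skill nodes with dependencies, but each node's update follows this pattern.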

  10. Facilities Management of Existing School Buildings: Two Models.

    ERIC Educational Resources Information Center

    Building Technology, Inc., Silver Spring, MD.

    While all school districts are responsible for the management of their existing buildings, they often approach the task in different ways. This document presents two models that offer ways a school district administration, regardless of size, may introduce activities into its ongoing management process that will lead to improvements in earthquake…

  11. wfip2.model/realtime.hrrr_esrl.graphics.01 (Model: Real Time)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  12. wfip2.model/realtime.rap_esrl.icbc.01 (Model: Real Time)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  13. wfip2.model/refcst.01.fcst.02 (Model: Year-Long Reforecast)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  14. wfip2.model/refcst.coldstart.icbc.02 (Model: Year-Long Reforecast)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  15. wfip2.model/realtime.hrrr_esrl.icbc.01 (Model: Real Time)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  16. wfip2.model/realtime.rap_esrl.graphics.01 (Model: Real Time)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  17. wfip2.model/refcst.01.fcst.01 (Model: Year-Long Reforecast)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  18. wfip2.model/refcst.coldstart.icbc.01 (Model: Year-Long Reforecast)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  19. wfip2.model/refcst.02.fcst.02 (Model: Year-Long Reforecast)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  20. Additional Research Needs to Support the GENII Biosphere Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napier, Bruce A.; Snyder, Sandra F.; Arimescu, Carmen

    In the course of evaluating the current parameter needs for the GENII Version 2 code (Snyder et al. 2013), areas of possible improvement for both the data and the underlying models have been identified. As the data review was implemented, PNNL staff identified areas where the models can be improved both to accommodate the locally significant pathways identified and also to incorporate newer models. The areas are general data needs for the existing models and improved formulations for the pathway models.

  1. wfip2.model/refcst.02.fcst.01

    DOE Data Explorer

    Macduff, Matt

    2017-10-26

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  2. Diagnostic utility of appetite loss in addition to existing prediction models for community-acquired pneumonia in the elderly: a prospective diagnostic study in acute care hospitals in Japan.

    PubMed

    Takada, Toshihiko; Yamamoto, Yosuke; Terada, Kazuhiko; Ohta, Mitsuyasu; Mikami, Wakako; Yokota, Hajime; Hayashi, Michio; Miyashita, Jun; Azuma, Teruhisa; Fukuma, Shingo; Fukuhara, Shunichi

    2017-11-08

    Diagnosis of community-acquired pneumonia (CAP) in the elderly is often delayed because of atypical presentation and non-specific symptoms, such as appetite loss, falls and disturbance in consciousness. The aim of this study was to investigate the external validity of existing prediction models and the added value of the non-specific symptoms for the diagnosis of CAP in elderly patients. Prospective cohort study. General medicine departments of three teaching hospitals in Japan. A total of 109 elderly patients who consulted for upper respiratory symptoms between 1 October 2014 and 30 September 2016. The reference standard for CAP was chest radiographs evaluated by two certified radiologists. The existing models were externally validated for diagnostic performance by calibration plot and discrimination. To evaluate the additional value of the non-specific symptoms to the existing prediction models, we developed an extended logistic regression model. Calibration, discrimination, category-free net reclassification improvement (NRI) and decision curve analysis (DCA) were investigated in the extended model. Among the existing models, the model by van Vugt demonstrated the best performance, with an area under the curve of 0.75 (95% CI 0.63 to 0.88); the calibration plot showed good fit despite a significant Hosmer-Lemeshow test (p=0.017). Among the non-specific symptoms, appetite loss had a positive likelihood ratio of 3.2 (2.0-5.3), a negative likelihood ratio of 0.4 (0.2-0.7) and an OR of 7.7 (3.0-19.7). Addition of appetite loss to the model by van Vugt led to improved calibration (p=0.48), an NRI of 0.53 (p=0.019) and higher net benefit by DCA. Information on appetite loss improved the performance of an existing model for the diagnosis of CAP in the elderly. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  3. PROATES, a computer modelling system for power plant: its description and application to heat rate improvement within PowerGen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, C.H.; Ready, A.B.; Rea, J.

    1995-06-01

    Versions of the computer program PROATES (PROcess Analysis for Thermal Energy Systems) have been used since 1979 to analyse plant performance improvement proposals relating to existing plant and also to evaluate new plant designs. Several plant modifications have been made to improve performance based on the model predictions, and the predicted performance has been realised in practice. The program was born out of a need to model the overall steady-state performance of complex plant so that proposals to change plant component items or operating strategy could be evaluated. To do this with confidence it is necessary to model the multiple thermodynamic interactions between the plant components. The modelling system is modular in concept, allowing the configuration of individual plant components to represent any particular power plant design. A library of physics-based modules exists which have been extensively validated and which provide representations of a wide range of boiler, turbine and CW system components. Changes to model data and construction are achieved via a user-friendly graphical model editing/analysis front-end, with results presented via the computer screen or hard copy. The paper describes briefly the modelling system but concentrates mainly on the application of the modelling system to assess design re-optimisation, firing with different fuels and the re-powering of an existing plant.

  4. Improved Modeling of Open Waveguide Aperture Radiators for use in Conformal Antenna Arrays

    NASA Astrophysics Data System (ADS)

    Nelson, Gregory James

    Open waveguide apertures have been used as radiating elements in conformal arrays, and individual radiating-element model patterns are used in constructing overall array models. The existing models for these aperture radiating elements may not accurately predict the array pattern for TEM waves that are not on boresight for each radiating element. In particular, surrounding structures can affect the far-field patterns of these apertures, which ultimately affects the overall array pattern. New models of open waveguide apertures are developed here with the goal of accounting for the effects of the surrounding structure on the aperture far-field patterns, so that the new models make accurate pattern predictions. These aperture patterns (both E-plane and H-plane) are measured in an anechoic chamber, and the manner in which they deviate from existing model patterns is studied. Using these measurements as a basis, the existing models for both E and H planes are updated with new factors and terms that allow the prediction of far-field open waveguide aperture patterns with improved accuracy. These improved individual radiator models are then used to predict overall conformal array patterns. Arrays of open waveguide apertures are constructed and measured in a similar fashion to the individual apertures. The measured array patterns are compared with the newly modeled array patterns to verify the improved accuracy of the new models relative to the existing models in predicting array far-field patterns. The array pattern lobe characteristics are then studied for fully circularly conformal arrays of varying radii, with lobe angular location and magnitude tracked as the radii are varied. A constructed, measured array that closely conforms to a circular surface is compared with a fully circularly conformal modeled array pattern prediction, with the predicted lobe angular locations and magnitudes tracked, plotted and tabulated. The close match between the patterns of the measured array and the modeled circularly conformal array verifies the validity of the modeled circularly conformal array pattern predictions.
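
For context, the baseline aperture pattern that such improved models refine is often approximated by a textbook uniform-aperture expression. The sketch below uses that standard E-plane approximation, not the improved models developed in the thesis, and the aperture dimension and frequency are illustrative (WR-90-like aperture at 10 GHz).

```python
# Uniform-aperture approximation for the E-plane pattern of an open
# rectangular waveguide: F(theta) ~ (1 + cos(theta))/2 * sinc(k*b/2 * sin(theta)).

import math

def e_plane_pattern(theta, b, freq_hz):
    k = 2.0 * math.pi * freq_hz / 3.0e8      # free-space wavenumber
    u = 0.5 * k * b * math.sin(theta)
    sinc = 1.0 if abs(u) < 1e-12 else math.sin(u) / u
    return 0.5 * (1.0 + math.cos(theta)) * sinc

b = 0.01016                                   # aperture height, metres
boresight = e_plane_pattern(0.0, b, 10e9)
off_axis = e_plane_pattern(math.radians(60.0), b, 10e9)
```

This baseline ignores surrounding-structure effects entirely, which is exactly the deficiency the measured-pattern corrections address.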

  5. wfip2.model/retro.hrrr.01.fcst.01 (Model: 10-Day Retrospective)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  6. wfip2.model/retro.hrrr.02.fcst.01 (Model: 10-Day Retrospective)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  7. wfip2.model/retro.hrrr.02.fcst.02 (Model: 10-Day Retrospective)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  8. wfip2.model/retro.rap.01.fcst.01 (Model: 10-Day Retrospective)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  9. wfip2.model/realtime.hrrr_wfip2.graphics.02 (Model: Real Time)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  10. wfip2.model/retro.rap.02.fcst.01 (Model: 10-Day Retrospective)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  11. wfip2.model/realtime.hrrr_wfip2.icbc.02 (Model: Real Time)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  12. wfip2.model/retro.hrrr.01.fcst.02 (Model: 10-Day Retrospective)

    DOE Data Explorer

    Macduff, Matt

    2017-10-27

    The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.

  13. A framework for considering business models.

    PubMed

    Anderson, James G

    2003-01-01

    Information technology (IT) such as computerized physician order entry, computer-based decision support and alerting systems, and electronic prescribing can reduce medical errors and improve the quality of health care. However, the business value of these systems is frequently questioned. At present, a number of barriers exist to realizing the potential of IT to improve quality of care, among them the ineffectiveness of existing error reporting systems, low investment in IT infrastructure, legal impediments to reform, and the difficulty of demonstrating a sufficient return on investment to justify expenditures for quality improvement. This paper provides an overview of these issues, a framework for considering business models, and examples of successful implementations of IT to improve quality of patient care.

  14. Improving surgeon utilization in an orthopedic department using simulation modeling

    PubMed Central

    Simwita, Yusta W; Helgheim, Berit I

    2016-01-01

    Purpose Worldwide, more than two billion people lack appropriate access to surgical services because of a mismatch between the existing human resources and patient demand. Improving utilization of the existing workforce capacity can reduce the gap between surgical demand and available workforce capacity. In this paper, the authors use discrete event simulation to explore the care process at an orthopedic department. Our main focus is improving utilization of surgeons while minimizing patient wait time. Methods The authors collaborated with orthopedic department personnel to map the current orthopedic care process and identify factors that contribute to poor surgeon utilization and high patient waiting time. The authors used an observational approach to collect data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. The authors developed a proposal scenario to show how to improve surgeon utilization. Results The simulation results showed that if ancillary services could be performed before the start of clinic examination services, the orthopedic care process could be greatly improved, that is, surgeon utilization improved and patient waiting time reduced. Simulation results demonstrate that with improved surgeon utilization, up to a 55% increase in future demand can be accommodated without patients exceeding the current waiting time at this clinic, thus improving patient access to health care services. Conclusion This study shows how simulation modeling can be used to improve health care processes. The study was limited to a single care process; however, the findings can be applied to improve other orthopedic care processes with similar operational characteristics. PMID:29355193
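
The kind of discrete-event logic used in such studies can be sketched with a single-surgeon queue that tracks utilization and waiting time. The arrival and service distributions below are hypothetical; the study's model covered a full orthopedic care process with ancillary services.

```python
# Minimal discrete-event style simulation of one surgeon serving a
# stream of patients: compute surgeon utilization and mean patient wait.

import random

def simulate(n_patients, mean_interarrival, mean_service, seed=1):
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)

    surgeon_free_at = 0.0
    busy_time = 0.0
    total_wait = 0.0
    for arrival in arrivals:
        start = max(arrival, surgeon_free_at)   # wait if surgeon is busy
        service = rng.expovariate(1.0 / mean_service)
        total_wait += start - arrival
        busy_time += service
        surgeon_free_at = start + service

    utilization = busy_time / surgeon_free_at   # fraction of elapsed time busy
    return utilization, total_wait / n_patients

# Hypothetical clinic: a patient every 30 min on average, 24 min per case.
utilization, avg_wait = simulate(500, mean_interarrival=30.0, mean_service=24.0)
```

Scenario analysis then amounts to rerunning the simulation with modified process parameters, e.g. ancillary services moved before the examination.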

  15. Improved indexes for targeting placement of buffers of Hortonian runoff

    Treesearch

    M.G. Dosskey; Z. Qiu; M.J. Helmers; D.E. Eisenhauer

    2011-01-01

    Targeting specific locations within agricultural watersheds for installing vegetative buffers has been advocated as a way to enhance the impact of buffers and buffer programs on stream water quality. Existing models for targeting buffers of Hortonian, or infiltration-excess, runoff are not well developed. The objective was to improve on an existing soil survey–based...

  16. Setting priorities in health research using the model proposed by the World Health Organization: development of a quantitative methodology using tuberculosis in South Africa as a worked example.

    PubMed

    Hacking, Damian; Cleary, Susan

    2016-02-09

    Setting priorities is important in health research given the limited resources available for research. Various guidelines exist to assist in the priority setting process; however, priority setting still faces significant challenges, such as the clear ranking of identified priorities. The World Health Organization (WHO) proposed a Disability Adjusted Life Year (DALY)-based model to rank priorities by research area (basic, health systems and biomedical) by dividing the DALYs into 'unavertable with existing interventions', 'avertable with improved efficiency' and 'avertable with existing but non-cost-effective interventions', respectively. However, the model has conceptual flaws and no clear methodology for its construction. Therefore, the aim of this paper was to amend the model to address these flaws and to develop a clear methodology, using tuberculosis in South Africa as a worked example. An amended model was constructed to represent total DALYs as the product of DALYs per person and the absolute burden of disease. These figures were calculated for all countries from WHO datasets. The lowest figures achieved by any country were assumed to represent 'unavertable with existing interventions' if extrapolated to South Africa. The ratio of 'cost per patient treated' (adjusted for purchasing power and outcome weighted) between South Africa and the best-performing country was used to calculate the 'avertable with improved efficiency' section. Finally, 'avertable with existing but non-cost-effective interventions' was calculated using Disease Control Priorities Project efficacy data and the ratio between the best intervention and South Africa's current intervention, irrespective of cost. The amended model shows that South Africa has a tuberculosis burden of 1,009,837.3 DALYs; 0.009% of DALYs are unavertable with existing interventions and 96.3% of DALYs could be averted with improvements in efficiency. Of the remaining DALYs, a further 56.9% could be averted with existing but non-cost-effective interventions. The amended model was successfully constructed using limited data sources. The generalizability of the data used is the main limitation of the model. More complex formulas are required to deal with such potential confounding variables; however, the results act as a starting point for the development of a more robust model.
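    The headline figures in the abstract can be reproduced with simple arithmetic. One interpretive assumption is made here: "56.9% of the remaining DALYs" is read as a share of the burden left after the first two categories are removed.

```python
# Figures quoted in the abstract for tuberculosis in South Africa
total = 1_009_837.3                    # total burden, DALYs

unavertable = total * 0.00009          # 0.009% unavertable with existing interventions
avert_efficiency = total * 0.963       # 96.3% avertable with improved efficiency
remaining = total - unavertable - avert_efficiency
avert_non_cost_effective = remaining * 0.569   # 56.9% of the remaining DALYs
```

    On this reading, roughly 91 DALYs are unavertable, about 972,473 are avertable through efficiency gains, and about 21,208 of the remaining ~37,273 are avertable with existing but non-cost-effective interventions.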

  17. Test code for the assessment and improvement of Reynolds stress models

    NASA Technical Reports Server (NTRS)

    Rubesin, M. W.; Viegas, J. R.; Vandromme, D.; Minh, H. HA

    1987-01-01

    An existing two-dimensional, compressible flow, Navier-Stokes computer code, containing a full Reynolds stress turbulence model, was adapted for use as a test bed for assessing and improving turbulence models based on turbulence simulation experiments. To date, the results of using the code in comparison with simulated channel flow and flow over an oscillating flat plate have shown that the turbulence model used in the code needs improvement for these flows. It is also shown that direct simulations of turbulent flows over a range of Reynolds numbers are needed to guide subsequent improvement of turbulence models.

  18. SU-F-T-350: Continuous Leaf Optimization (CLO) for IMRT Leaf Sequencing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, T; Chen, M; Jiang, S

    Purpose: To study a new step-and-shoot IMRT leaf sequencing model that avoids the two main pitfalls of conventional leaf sequencing: (1) target fluence being stratified into a fixed number of discrete levels and/or (2) aperture leaf positions being restricted to a discrete set of locations. These assumptions induce error into the sequence or reduce the feasible region of potential plans, respectively. Methods: We develop a one-dimensional (single leaf pair) methodology that does not make assumptions (1) or (2) and that can be easily extended to a multi-row model. The proposed continuous leaf optimization (CLO) methodology takes in an existing set of apertures and associated intensities, or solution “seed,” and improves the plan without the restrictiveness of (1) or (2). It then uses a first-order descent algorithm to converge onto a locally optimal solution. A seed solution can come from models that assume (1) and (2), thus allowing the CLO model to improve upon existing leaf sequencing methodologies. Results: The CLO model was applied to 208 generated target fluence maps in one dimension. In all cases for all tested sequencing strategies, the CLO model made improvements on the starting seed objective function. The CLO model also was able to keep MUs low. Conclusion: The CLO model can improve upon existing leaf sequencing methods by avoiding the restrictions of (1) and (2). By allowing for more flexible leaf positioning, error can be reduced when matching some target fluence. This study lays the foundation for future models and solution methodologies that can incorporate continuous leaf positions explicitly into the IMRT treatment planning model. Supported by Cancer Prevention & Research Institute of Texas (CPRIT) - ID RP150485.

  19. An individual-based model of zebrafish population dynamics accounting for energy dynamics.

    PubMed

    Beaudouin, Rémy; Goussen, Benoit; Piccini, Benjamin; Augustine, Starrlight; Devillers, James; Brion, François; Péry, Alexandre R R

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget for individual zebrafish (DEB model) was coupled to an individual based model of zebrafish population dynamics (IBM model). Next, we fitted the DEB model to new experimental data on zebrafish growth and reproduction thus improving existing models. We further analysed the DEB-model and DEB-IBM using a sensitivity analysis. Finally, the predictions of the DEB-IBM were compared to existing observations on natural zebrafish populations and the predicted population dynamics are realistic. While our zebrafish DEB-IBM model can still be improved by acquiring new experimental data on the most uncertain processes (e.g. survival or feeding), it can already serve to predict the impact of compounds at the population level.

  20. An Individual-Based Model of Zebrafish Population Dynamics Accounting for Energy Dynamics

    PubMed Central

    Beaudouin, Rémy; Goussen, Benoit; Piccini, Benjamin; Augustine, Starrlight; Devillers, James; Brion, François; Péry, Alexandre R. R.

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget for individual zebrafish (DEB model) was coupled to an individual based model of zebrafish population dynamics (IBM model). Next, we fitted the DEB model to new experimental data on zebrafish growth and reproduction thus improving existing models. We further analysed the DEB-model and DEB-IBM using a sensitivity analysis. Finally, the predictions of the DEB-IBM were compared to existing observations on natural zebrafish populations and the predicted population dynamics are realistic. While our zebrafish DEB-IBM model can still be improved by acquiring new experimental data on the most uncertain processes (e.g. survival or feeding), it can already serve to predict the impact of compounds at the population level. PMID:25938409

  1. Theory of low frequency noise transmission through turbines

    NASA Technical Reports Server (NTRS)

    Matta, R. K.; Mani, R.

    1979-01-01

    Improvements of the existing theory of low frequency noise transmission through turbines and development of a working prediction tool are described. The existing actuator-disk model and a new finite-chord model were utilized in an analytical study. The interactive effect of adjacent blade rows, higher order spinning modes, blade-passage shocks, and duct area variations were considered separately. The improved theory was validated using the data acquired in an earlier NASA program. Computer programs incorporating the improved theory were produced for transmission loss prediction purposes. The programs were exercised parametrically and charts constructed to define approximately the low frequency noise transfer through turbines. The loss through the exhaust nozzle and flow(s) was also considered.

  2. Calibration of PMIS pavement performance prediction models.

    DOT National Transportation Integrated Search

    2012-02-01

    Improve the accuracy of TxDOT's existing pavement performance prediction models by calibrating these models using actual field data obtained from the Pavement Management Information System (PMIS). : Ensure logical performance superiority patte...

  3. Distribution system model calibration with big data from AMI and PV inverters

    DOE PAGES

    Peppanen, Jouni; Reno, Matthew J.; Broderick, Robert J.; ...

    2016-03-03

    Efficient management and coordination of distributed energy resources with advanced automation schemes requires accurate distribution system modeling and monitoring. Big data from smart meters and photovoltaic (PV) micro-inverters can be leveraged to calibrate existing utility models. This paper presents computationally efficient distribution system parameter estimation algorithms to improve the accuracy of existing utility feeder radial secondary circuit model parameters. The method is demonstrated using a real utility feeder model with advanced metering infrastructure (AMI) and PV micro-inverters, along with alternative parameter estimation approaches that can be used to improve secondary circuit models when limited measurement data is available. Lastly, the parameter estimation accuracy is demonstrated for both a three-phase test circuit with typical secondary circuit topologies and single-phase secondary circuits in a real mixed-phase test system.
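    The kind of parameter estimation the paper describes can be sketched as an ordinary least-squares fit of a secondary-circuit series impedance from AMI-style measurements. This is a generic illustration, not the paper's algorithm: the linearized voltage-drop model dV ≈ (R·P + X·Q)/V, the units, and the synthetic data are all assumptions.

```python
import random

def estimate_line_params(measurements):
    """Ordinary least squares for the two unknowns (R, X) in the
    linearized secondary-circuit drop model  dV ~= (R*P + X*Q) / V,
    solved via the 2x2 normal equations."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for P, Q, V, dV in measurements:
        x1, x2 = P / V, Q / V
        a11 += x1 * x1; a12 += x1 * x2; a22 += x2 * x2
        b1 += x1 * dV;  b2 += x2 * dV
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det,   # R estimate
            (a11 * b2 - a12 * b1) / det)   # X estimate

# Synthetic AMI-style data generated from a known impedance plus meter noise
rng = random.Random(0)
R_true, X_true = 0.08, 0.05               # hypothetical line parameters
data = []
for _ in range(200):
    P, Q, V = rng.uniform(1, 10), rng.uniform(0.5, 5), 240.0
    dV = (R_true * P + X_true * Q) / V + rng.gauss(0, 1e-4)
    data.append((P, Q, V, dV))
R_est, X_est = estimate_line_params(data)
```

    With enough meter readings, the fit recovers the assumed parameters to well within the noise level, which is the basic mechanism behind calibrating circuit models from AMI data.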

  4. Distribution system model calibration with big data from AMI and PV inverters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peppanen, Jouni; Reno, Matthew J.; Broderick, Robert J.

    Efficient management and coordination of distributed energy resources with advanced automation schemes requires accurate distribution system modeling and monitoring. Big data from smart meters and photovoltaic (PV) micro-inverters can be leveraged to calibrate existing utility models. This paper presents computationally efficient distribution system parameter estimation algorithms to improve the accuracy of existing utility feeder radial secondary circuit model parameters. The method is demonstrated using a real utility feeder model with advanced metering infrastructure (AMI) and PV micro-inverters, along with alternative parameter estimation approaches that can be used to improve secondary circuit models when limited measurement data is available. Lastly, the parameter estimation accuracy is demonstrated for both a three-phase test circuit with typical secondary circuit topologies and single-phase secondary circuits in a real mixed-phase test system.

  5. Improving real-time inflow forecasting into hydropower reservoirs through a complementary modelling framework

    NASA Astrophysics Data System (ADS)

    Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.

    2015-08-01

    Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead time is considered within the day-ahead (Elspot) market of the Nordic exchange market. A complementary modelling framework presents an approach for improving real-time forecasting without needing to modify the pre-existing forecasting model, but instead formulating an independent additive or complementary model that captures the structure the existing operational model may be missing. We present here the application of this principle for issuing improved hourly inflow forecasts into hydropower reservoirs over extended lead times, and the parameter estimation procedure reformulated to deal with bias, persistence and heteroscedasticity. The procedure presented comprises an error model added on top of an unalterable constant parameter conceptual model. This procedure is applied in the 207 km2 Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Besides improving forecast skills of operational models, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that entrain suitable information for reducing uncertainty in the decision-making processes in hydropower systems operation. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead times up to 17 h. Evaluation of the percentage of observations bracketed in the forecasted 95 % confidence interval indicated that the degree of success in containing 95 % of the observations varies across seasons and hydrologic years.
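    The complementary-modelling idea (an additive error model layered on an unalterable base model) can be sketched with a simple AR(1) residual model. The paper's actual error model also handles bias and heteroscedasticity and produces probabilistic forecasts; the geometric decay of the correction with lead time below is only the simplest illustrative choice.

```python
def fit_ar1(residuals):
    """Least-squares AR(1) coefficient of the base model's residual series."""
    num = sum(a * b for a, b in zip(residuals[1:], residuals[:-1]))
    den = sum(r * r for r in residuals[:-1])
    return num / den

def corrected_forecast(base, phi, last_residual, lead_hours):
    """Complementary correction: add the error model's residual forecast,
    which decays geometrically with lead time, to the base forecast."""
    return [b + phi ** (h + 1) * last_residual
            for h, b in enumerate(base[:lead_hours])]

# A persistent error in the base model (here a synthetic decaying residual
# series) is picked up by the AR(1) term and fed back into the forecast
phi = fit_ar1([0.8 ** i for i in range(50)])
corrected = corrected_forecast([100.0, 100.0, 100.0], phi, 2.0, 3)
```

    The base model is never modified: the correction is purely additive and shrinks towards zero at long lead times, consistent with the finding that improvements fade beyond roughly 17 h.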

  6. Institutional Transformation Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-19

    Reducing the energy consumption of large institutions with dozens to hundreds of existing buildings while maintaining and improving existing infrastructure is a critical economic and environmental challenge. SNL's Institutional Transformation (IX) work integrates facilities and infrastructure sustainability technology capabilities and collaborative decision support modeling approaches to help facilities managers at Sandia National Laboratories (SNL) simulate different future energy reduction strategies and meet long term energy conservation goals.

  7. A Whole School Approach: Collaborative Development of School Health Policies, Processes, and Practices

    ERIC Educational Resources Information Center

    Hunt, Pete; Barrios, Lisa; Telljohann, Susan K.; Mazyck, Donna

    2015-01-01

    Background: The Whole School, Whole Community, Whole Child (WSCC) model shows the interrelationship between health and learning and the potential for improving educational outcomes by improving health outcomes. However, current descriptions do not explain how to implement the model. Methods: The existing literature, including scientific articles,…

  8. Mammographic density, breast cancer risk and risk prediction

    PubMed Central

    Vachon, Celine M; van Gils, Carla H; Sellers, Thomas A; Ghosh, Karthik; Pruthi, Sandhya; Brandt, Kathleen R; Pankratz, V Shane

    2007-01-01

    In this review, we examine the evidence for mammographic density as an independent risk factor for breast cancer, describe the risk prediction models that have incorporated density, and discuss the current and future implications of using mammographic density in clinical practice. Mammographic density is a consistent and strong risk factor for breast cancer in several populations and across age at mammogram. Recently, this risk factor has been added to existing breast cancer risk prediction models, increasing the discriminatory accuracy with its inclusion, albeit slightly. With validation, these models may replace the existing Gail model for clinical risk assessment. However, absolute risk estimates resulting from these improved models are still limited in their ability to characterize an individual's probability of developing cancer. Promising new measures of mammographic density, including volumetric density, which can be standardized using full-field digital mammography, will likely result in a stronger risk factor and improve accuracy of risk prediction models. PMID:18190724

  9. CMMI for Services (SVC): The Strategic Landscape for Service

    DTIC Science & Technology

    2012-01-01

    processes. • Many existing models are designed for specific services or industries. • Other existing models do not provide a clear improvement path... Production, such as engineering and manufacturing; disciplines and industries, such as education, health care, insurance, utilities, and hospitality... as a Service: “More and more major businesses and industries are being run on software and delivered as online services—from movies to agriculture”

  10. A Critical Review for Developing Accurate and Dynamic Predictive Models Using Machine Learning Methods in Medicine and Health Care.

    PubMed

    Alanazi, Hamdan O; Abdullah, Abdul Hanan; Qureshi, Kashif Naseer

    2017-04-01

    Recently, Artificial Intelligence (AI) has been used widely in medicine and the health care sector. In machine learning, classification or prediction is a major field of AI. Today, the study of existing predictive models based on machine learning methods is extremely active. Doctors need accurate predictions of the outcomes of their patients' diseases. In addition, for accurate predictions, timing is another significant factor that influences treatment decisions. In this paper, existing predictive models in medicine and health care are critically reviewed. Furthermore, the most well-known machine learning methods are explained, and the confusion between a statistical approach and machine learning is clarified. A review of related literature reveals that the predictions of existing predictive models differ even when the same dataset is used. Therefore, existing predictive models are essential, and current methods must be improved.

  11. Collaborative Care in Schools: Enhancing Integration and Impact in Youth Mental Health

    PubMed Central

    Lyon, Aaron R.; Whitaker, Kelly; French, William P.; Richardson, Laura P.; Wasse, Jessica Knaster; McCauley, Elizabeth

    2016-01-01

    Collaborative Care is an innovative approach to integrated mental health service delivery that focuses on reducing access barriers, improving service quality, and lowering healthcare expenditures. A large body of evidence supports the effectiveness of Collaborative Care models with adults and, increasingly, for youth. Although existing studies examining these models for youth have focused exclusively on primary care, the education sector is also an appropriate analog for the accessibility that primary care offers to adults. Collaborative Care aligns closely with the practical realities of the education sector and may represent a strategy to achieve some of the objectives of increasingly popular multi-tiered systems of supports frameworks. Unfortunately, no resources exist to guide the application of Collaborative Care models in schools. Based on the existing evidence for Collaborative Care models, the current paper (1) provides a rationale for the adaptation of Collaborative Care models to improve mental health service accessibility and effectiveness in the education sector; (2) presents a preliminary Collaborative Care model for use in schools; and (3) describes avenues for research surrounding school-based Collaborative Care, including the currently funded Accessible, Collaborative Care for Effective School-based Services (ACCESS) project. PMID:28392832

  12. Assessment of the Crashworthiness of Existing Urban Rail Vehicles. Volume 3. Train-Collision Model Users Manual.

    DOT National Transportation Integrated Search

    1975-11-01

    The crashworthiness of existing urban rail vehicles (passenger cars) and the feasibility of improvements in this area were investigated. Both rail-car structural configurations and impact absorption devices were studied. This final report issued unde...

  13. Consumer preference models: fuzzy theory approach

    NASA Astrophysics Data System (ADS)

    Turksen, I. B.; Wilson, I. A.

    1993-12-01

    Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).
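    A fuzzy-set preference model of the kind described represents linguistic terms ("cheap", "moderate") as membership functions and aggregates them at the individual level. The triangular memberships, the price attribute, and the max-min aggregation below are generic textbook choices, not the authors' specification.

```python
def triangular(a, b, c):
    """Membership function for a linguistic term (peak 1.0 at b)."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# Hypothetical linguistic terms for a 'price' attribute (in dollars)
cheap = triangular(0, 10, 20)
moderate = triangular(15, 25, 35)

# Hypothetical individual-level importance weights for each term
weights = {'cheap': 0.7, 'moderate': 0.3}

def preference(price):
    """Max-min (Zadeh) aggregation of the weighted linguistic terms."""
    return max(min(weights['cheap'], cheap(price)),
               min(weights['moderate'], moderate(price)))
```

    Such individual-level fuzzy scores can then be compared against a conventional conjoint utility for the same respondent, which is how a fuzzy model could run "in parallel with existing conjoint models".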

  14. An improved k-epsilon model for near wall turbulence

    NASA Technical Reports Server (NTRS)

    Shih, T. H.; Hsu, Andrew T.

    1991-01-01

    An improved k-epsilon model for low Reynolds number turbulence near a wall is presented. In the first part of this work, the near-wall asymptotic behavior of the eddy viscosity and the pressure transport term in the turbulent kinetic energy equation are analyzed. Based on these analyses, a modified eddy viscosity model with the correct near-wall behavior is suggested, and a model for the pressure transport term in the k-equation is proposed. In addition, a modeled dissipation rate equation is reformulated, and a boundary condition for the dissipation rate is suggested. In the second part of the work, one of the deficiencies of the existing k-epsilon models, namely, the wall distance dependency of the equations and the damping functions, is examined. An improved model that does not depend on any wall distance is introduced. Fully developed turbulent channel flows and turbulent boundary layers over a flat plate are studied as validations for the proposed new models. Numerical results obtained from the present and other previous k-epsilon models are compared with data from direct numerical simulation. The results show that the present k-epsilon model, with added robustness, performs as well as or better than other existing models in predicting the behavior of near-wall turbulence.
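    For orientation, the standard low-Reynolds-number eddy-viscosity form that such models modify near the wall can be sketched as follows. The Van Driest-style damping function is a conventional example of the wall-distance-dependent ingredient this paper's improved model is designed to eliminate; the constants are textbook values, not the paper's.

```python
import math

C_MU = 0.09       # standard k-epsilon model constant
A_PLUS = 26.0     # conventional Van Driest constant

def f_mu(y_plus):
    """Van Driest-style near-wall damping: 0 at the wall, tends to 1 far away."""
    return (1.0 - math.exp(-y_plus / A_PLUS)) ** 2

def eddy_viscosity(k, eps, y_plus):
    """nu_t = C_mu * f_mu * k^2 / eps (kinematic eddy viscosity)."""
    return C_MU * f_mu(y_plus) * k * k / eps
```

    Because f_mu depends on the wall distance y_plus, any geometry without a single well-defined wall distance becomes awkward, which is the deficiency the wall-distance-free model in this paper addresses.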

  15. Spreadsheet WATERSHED modeling for nonpoint-source pollution management in a Wisconsin basin

    USGS Publications Warehouse

    Walker, J.F.; Pickard, S.A.; Sonzogni, W.C.

    1989-01-01

    Although several sophisticated nonpoint pollution models exist, few are available that are easy to use, cover a variety of conditions, and integrate a wide range of information to allow managers and planners to assess different control strategies. Here, a straightforward pollutant input accounting approach is presented in the form of an existing model (WATERSHED) that has been adapted to run on modern electronic spreadsheets. As an application, WATERSHED is used to assess options to improve the quality of highly eutrophic Delavan Lake in Wisconsin. WATERSHED is flexible in that several techniques, such as the Universal Soil Loss Equation or unit-area loadings, can be used to estimate nonpoint-source inputs. Once the model parameters are determined (and calibrated, if possible), the spreadsheet features can be used to conduct a sensitivity analysis of management options. In the case of Delavan Lake, it was concluded that, although some nonpoint controls were cost-effective, the overall reduction in phosphorus would be insufficient to measurably improve water quality.
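    As a reminder of one of the input techniques named above, the Universal Soil Loss Equation is a simple product of factors, which is what makes it well suited to spreadsheet accounting. The field values and the practice-factor change below are hypothetical.

```python
def usle(R, K, LS, C, P):
    """Universal Soil Loss Equation: average annual soil loss A as the
    product of rainfall erosivity R, soil erodibility K, slope
    length-steepness LS, cover-management C and support practice P."""
    return R * K * LS * C * P

# Hypothetical field: effect of a support-practice change (P 1.0 -> 0.5),
# e.g. adding a conservation practice, with the other factors held fixed
before = usle(R=150, K=0.3, LS=1.2, C=0.25, P=1.0)   # tons/acre/yr
after = usle(R=150, K=0.3, LS=1.2, C=0.25, P=0.5)
```

    Because every factor enters multiplicatively, a sensitivity analysis of a management option reduces to rescaling one factor, exactly the kind of what-if calculation a spreadsheet model supports.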

  16. Existence of topological multi-string solutions in Abelian gauge field theories

    NASA Astrophysics Data System (ADS)

    Han, Jongmin; Sohn, Juhee

    2017-11-01

    In this paper, we consider a general form of self-dual equations arising from Abelian gauge field theories coupled with the Einstein equations. By applying the super/subsolution method, we prove that topological multi-string solutions exist for any coupling constant, which improves previously known results. We provide two examples for application: the self-dual Einstein-Maxwell-Higgs model and the gravitational Maxwell gauged O(3) sigma model.

  17. Improving Deterministic Reserve Requirements for Security Constrained Unit Commitment and Scheduling Problems in Power Systems

    NASA Astrophysics Data System (ADS)

    Wang, Fengyu

    Traditional deterministic reserve requirements rely on ad-hoc, rule-of-thumb methods to determine adequate reserve in order to ensure a reliable unit commitment. Since congestion and uncertainties exist in the system, both the quantity and the location of reserves are essential to ensure system reliability and market efficiency. Existing deterministic reserve requirements acquire operating reserves on a zonal basis and do not fully capture the impact of congestion. The purpose of a reserve zone is to ensure that operating reserves are spread across the network. Operating reserves are shared inside each reserve zone, but intra-zonal congestion may block the deliverability of operating reserves within a zone. Thus, improving reserve policies such as reserve zones may improve the location and deliverability of reserve. As more non-dispatchable renewable resources are integrated into the grid, it will become increasingly difficult to predict the transfer capabilities and the network congestion. At the same time, renewable resources require operators to acquire more operating reserves. With existing deterministic reserve requirements unable to ensure optimal reserve locations, the importance of reserve location and reserve deliverability will increase. While stochastic programming can be used to determine reserve by explicitly modelling uncertainties, there are still scalability as well as pricing issues. Therefore, new methods to improve existing deterministic reserve requirements are desired. One key barrier to improving existing deterministic reserve requirements is their potential market impacts. A metric, quality of service, is proposed in this thesis to evaluate the price signal and market impacts of proposed hourly reserve zones.
Three main goals of this thesis are: 1) to develop a theoretical and mathematical model to better locate reserve while maintaining the deterministic unit commitment and economic dispatch structure, especially with the consideration of renewables, 2) to develop a market settlement scheme of proposed dynamic reserve policies such that the market efficiency is improved, 3) to evaluate the market impacts and price signal of the proposed dynamic reserve policies.

  18. Using Geographic Information Systems to Evaluate Energy Initiatives in Austere Environments

    DTIC Science & Technology

    2013-03-01

    conducting economic analysis of energy reduction initiatives. This research examined the energy savings potential of improving the thermal properties... shelter improvements in any climate and location in the world. Specifically, solar flies developed through the Solar Integrated Power Shelter System... Improvements to the Existing Model

  19. Health service changes to address diabetes in pregnancy in a complex setting: perspectives of health professionals.

    PubMed

    Kirkham, R; Boyle, J A; Whitbread, C; Dowden, M; Connors, C; Corpus, S; McCarthy, L; Oats, J; McIntyre, H D; Moore, E; O'Dea, K; Brown, A; Maple-Brown, L

    2017-08-03

    Australian Aboriginal and Torres Strait Islander women have high rates of gestational and pre-existing type 2 diabetes in pregnancy. The Northern Territory (NT) Diabetes in Pregnancy Partnership was established to enhance systems and services to improve health outcomes. It has three arms: a clinical register, developing models of care, and a longitudinal birth cohort. This study used a process evaluation to report on health professionals' perceptions of models of care and related quality improvement activities since the implementation of the Partnership. Changes to models of care were documented according to the goals and aims of the Partnership and reviewed annually by the Partnership Steering group. A 'systems assessment tool' was used to guide six focus groups (49 healthcare professionals). Transcripts were coded and analysed according to the pre-identified themes of orientation and guidelines, education, communication, logistics and access, and information technology. Key improvements since implementation of the Partnership include: health professional relationships, communication and education; and integration of quality improvement activities. The focus groups with 49 health professionals provided in-depth information about how these activities have impacted their practice and models of care for diabetes in pregnancy. Co-ordination of care was reported to have improved; however, it was also identified as an opportunity for further development. Recommendations included a central care coordinator, better integration of information technology systems, and ongoing comprehensive quality improvement processes. The Partnership has facilitated quality improvement by supporting the development of improved systems that enhance models of care. Persistent challenges exist in delivering care to a high-risk population; however, improvements in formal processes and structures, as demonstrated in this work thus far, play an important role in working towards improved health outcomes.

  20. [The methods of assessment of health risk from exposure to radon and radon daughters].

    PubMed

    Demin, V F; Zhukovskiy, M V; Kiselev, S M

    2014-01-01

    A critical analysis of existing dose-effect relationship (RDE) models for the effect of radon exposure on human health has been performed. It is concluded that improving these models is both necessary and possible. A new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, comprising a method for estimating radon exposure doses, the improved RDE model, and the risk assessment methodology proper. The methodology is proposed for use in the territory of Russia.

  1. FABRIC FILTER MODEL FORMAT CHANGE; VOLUME II. USER'S GUIDE

    EPA Science Inventory

    The report describes an improved mathematical model for use by control personnel to determine the adequacy of existing or proposed filter systems designed to minimize coal fly ash emissions. Several time-saving steps have been introduced to facilitate model application by Agency ...

  2. Development of the information model for consumer assessment of key quality indicators by goods labelling

    NASA Astrophysics Data System (ADS)

    Koshkina, S.; Ostrinskaya, L.

    2018-04-01

    An information model for “key” quality indicators of goods has been developed. The model is based on an assessment of the existing state of standardization and of product labelling quality. In the authors’ opinion, the proposed “key” indicators are the most significant for purchasing decisions. Customers will be able to use this model through their mobile devices. The developed model allows existing processes to be decomposed into data flows and reveals the levels of possible architectural solutions. In further research, in-depth analysis of the decomposition levels of the presented information model will allow the stages of its improvement to be determined and additional indicators of goods quality that are of interest to customers to be revealed. Examining the architectural solutions for the functioning of the customer’s information environment when integrating existing databases will allow the boundaries of the model’s flexibility and customizability to be determined.

  3. Mach Stability Improvements Using an Existing Second Throat Capability at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Chan, David T.; Balakrishna, Sundareswara; Walker, Eric L.; Goodliff, Scott L.

    2015-01-01

    Recent data quality improvements at the National Transonic Facility (NTF) have an intended goal of reducing the Mach number variation in a data point to within plus or minus 0.0005, with the ultimate goal of reducing the data repeatability of the drag coefficient for full-span subsonic transport models at transonic speeds to within half a drag count. This paper will discuss the Mach stability improvements achieved through the use of an existing second throat capability at the NTF to create a minimum area at the end of the test section. These improvements were demonstrated using both the NASA Common Research Model and the NTF Pathfinder-I model in recent experiments. Sonic conditions at the throat were verified using sidewall static pressure data. The Mach variation levels from both experiments in the baseline tunnel configuration and the choked tunnel configuration will be presented and the correlation between Mach number and drag will also be examined. Finally, a brief discussion is given on the consequences of using the second throat in its location at the end of the test section.

  4. Mach Stability Improvements Using an Existing Second Throat Capability at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Chan, David T.

    2015-01-01

    Recent data quality improvements at the National Transonic Facility (NTF) have an intended goal of reducing the Mach number variation in a data point to within plus or minus 0.0005, with the ultimate goal of reducing the data repeatability of the drag coefficient for full-span subsonic transport models at transonic speeds to within half of a drag count. This paper will discuss the Mach stability improvements achieved through the use of an existing second throat capability at the NTF to create a minimum area at the end of the test section. These improvements were demonstrated using both the NASA Common Research Model and the NTF Pathfinder-I model in recent experiments. Sonic conditions at the throat were verified using sidewall static pressure data. The Mach variation levels from both experiments in the baseline tunnel configuration and the choked tunnel configuration will be presented. Finally, a brief discussion is given on the consequences of using the second throat in its location at the end of the test section.

  5. Modelling of different measures for improving removal in a stormwater pond.

    PubMed

    German, J; Jansons, K; Svensson, G; Karlsson, D; Gustafsson, L G

    2005-01-01

    The effect of retrofitting an existing pond on removal efficiency and hydraulic performance was modelled using the commercial software Mike21 and compartmental modelling. The Mike21 model had previously been calibrated on the studied pond. Installation of baffles, the addition of culverts under a causeway and removal of an existing island were all studied as possible improvement measures in the pond. The subsequent effect on hydraulic performance and removal of suspended solids was then evaluated. Copper, cadmium, BOD, nitrogen and phosphorus removal were also investigated for that specific improvement measure showing the best results. Outcomes of this study reveal that all measures increase the removal efficiency of suspended solids. The hydraulic efficiency is improved for all cases, except for the case where the island is removed. Compartmental modelling was also used to evaluate hydraulic performance and facilitated a better understanding of the way each of the different measures affected the flow pattern and performance. It was concluded that the installation of baffles is the best of the studied measures resulting in a reduction in the annual load on the receiving lake by approximately 8,000 kg of suspended solids (25% reduction of the annual load), 2 kg of copper (10% reduction of the annual load) and 600 kg of BOD (10% reduction of the annual load).

  6. Improved thermal lattice Boltzmann model for simulation of liquid-vapor phase change

    NASA Astrophysics Data System (ADS)

    Li, Qing; Zhou, P.; Yan, H. J.

    2017-12-01

    In this paper, an improved thermal lattice Boltzmann (LB) model is proposed for simulating liquid-vapor phase change, which is aimed at improving an existing thermal LB model for liquid-vapor phase change [S. Gong and P. Cheng, Int. J. Heat Mass Transfer 55, 4923 (2012), 10.1016/j.ijheatmasstransfer.2012.04.037]. First, we emphasize that the replacement of ∇·(λ∇T)/(ρc_V) with ∇·(χ∇T) is an inappropriate treatment for diffuse interface modeling of liquid-vapor phase change. Furthermore, the error terms ∂_{t0}(Tv) + ∇·(Tvv), which exist in the macroscopic temperature equation recovered from the previous model, are eliminated in the present model in a way that is consistent with the philosophy of the LB method. Moreover, the discrete effect of the source term is also eliminated in the present model. Numerical simulations are performed for droplet evaporation and bubble nucleation to validate the capability of the model for simulating liquid-vapor phase change. It is shown that the numerical results of the improved model agree well with those of a finite-difference scheme. Meanwhile, it is found that the replacement of ∇·(λ∇T)/(ρc_V) with ∇·(χ∇T) leads to significant numerical errors, and the error terms in the recovered macroscopic temperature equation also result in considerable errors.
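    The two diffusion-term treatments contrasted in this abstract differ whenever ρc_V varies in space, as it does across a diffuse liquid-vapor interface. A short derivation, using χ = λ/(ρc_V) with symbols as in the abstract, makes the discrepancy explicit:

```latex
% With \chi = \lambda/(\rho c_V), expand the form used as a replacement:
\nabla\cdot(\chi\nabla T)
  = \nabla\cdot\!\left(\frac{\lambda\nabla T}{\rho c_V}\right)
  = \frac{\nabla\cdot(\lambda\nabla T)}{\rho c_V}
    + \lambda\,\nabla T\cdot\nabla\!\left(\frac{1}{\rho c_V}\right)
% The two treatments therefore coincide only where \rho c_V is constant;
% across a diffuse interface \nabla(1/(\rho c_V)) \neq 0, and the extra
% term is a source of the numerical errors discussed above.
```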

  7. FABRIC FILTER MODEL FORMAT CHANGE; VOLUME 1. DETAILED TECHNICAL REPORT

    EPA Science Inventory

    The report describes an improved mathematical model for use by control personnel to determine the adequacy of existing or proposed filter systems designed to minimize coal fly ash emissions. Several time-saving steps have been introduced to facilitate model application by Agency ...

  8. Critical Speed of The Glass Glue Machine's Creep and Influence Factors Analysis

    NASA Astrophysics Data System (ADS)

    Yang, Jianxi; Huang, Jian; Wang, Liying; Shi, Jintai

    When an automatic glass glue machine operates, two problems exist: vibration at start-up and stick-slip motion. To address these problems, a model of the glue machine for studying stick-slip is established. Based on a dynamic description of the system in this model, a mathematical expression is presented. The critical creep speed expression is constructed with reference to existing research results, and a new conclusion is reached. The influence of stiffness, damping, mass, velocity, and the difference between the static and kinetic coefficients of friction is analyzed through Matlab simulation. The research shows that a reasonable choice of these parameters can mitigate the creep phenomenon. These results supply theoretical evidence for improving the machine's motion stability.

  9. Archetype modeling methodology.

    PubMed

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes have been reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to developing quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the participants and tools involved. It also describes possible strategies for organizing the modeling process. The proposed methodology is inspired by existing best practices in CIM, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also serve as a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. A Bayesian model averaging method for improving SMT phrase table

    NASA Astrophysics Data System (ADS)

    Duan, Nan

    2013-03-01

    Previous methods for improving translation quality by employing multiple SMT models usually operate as a second-pass decision procedure on hypotheses from multiple systems, using extra features rather than exploiting the features of existing models in more depth. In this paper, we propose translation model generalization (TMG), an approach that updates probability feature values for the translation model being used based on the model itself and a set of auxiliary models, aiming to alleviate the over-estimation problem and enhance translation quality in the first-pass decoding phase. We validate our approach for translation models based on auxiliary models built in two different ways. We also introduce novel probability variance features into the log-linear models for further improvements. Our approach can be developed independently and integrated directly into the current SMT pipeline. We demonstrate BLEU improvements on the NIST Chinese-to-English MT tasks for single-system decoding.
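    A minimal sketch of the idea the abstract describes: phrase probabilities from the model in use are smoothed toward a set of auxiliary models, and the cross-model variance is kept as an extra feature. The function name, the uniform auxiliary average, and the interpolation weight are assumptions for illustration, not the paper's exact formulation.

```python
# Hypothetical sketch of translation model generalization (TMG):
# smooth each phrase pair's P(tgt|src) with auxiliary models and
# record the cross-model variance as an additional feature.
from statistics import mean, pvariance

def generalize_phrase_table(main, auxiliaries, alpha=0.5):
    """Return updated probabilities and a variance feature per phrase pair.

    main        -- dict mapping (src, tgt) -> probability in the model in use
    auxiliaries -- list of dicts with the same structure
    alpha       -- interpolation weight kept on the main model (assumed)
    """
    updated = {}
    for pair, p_main in main.items():
        p_aux = [aux.get(pair, 0.0) for aux in auxiliaries]
        p_avg = mean(p_aux) if p_aux else p_main
        # Interpolating toward the auxiliary average damps over-estimated
        # probabilities before first-pass decoding.
        p_new = alpha * p_main + (1.0 - alpha) * p_avg
        variance = pvariance([p_main] + p_aux) if p_aux else 0.0
        updated[pair] = (p_new, variance)
    return updated

table = generalize_phrase_table(
    {("maison", "house"): 0.9},
    [{("maison", "house"): 0.5}, {("maison", "house"): 0.7}],
)
```

    A large variance across models for the same phrase pair signals an unreliable estimate, which is the motivation for exposing it as a log-linear feature.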

  11. Applying the Model of the Interrelationship of Leadership Environments and Outcomes for Nurse Executives: a community hospital's exemplar in developing staff nurse engagement through documentation improvement initiatives.

    PubMed

    Adams, Jeffrey M; Denham, Debra; Neumeister, Irene Ramirez

    2010-01-01

    The Model of the Interrelationship of Leadership, Environments & Outcomes for Nurse Executives (MILE ONE) was developed on the basis of existing literature related to identifying strategies for simultaneous improvement of leadership, professional practice/work environments (PPWE), and outcomes. Through existing evidence, the MILE ONE identifies the continuous and dependent interrelationship of 3 distinct concept areas: (1) nurse executives influence PPWE, (2) PPWE influence patient and organizational outcomes, and (3) patient and organizational outcomes influence nurse executives. This article highlights the application of the MILE ONE framework to a community district hospital's clinical documentation performance improvement projects. Results suggest that the MILE ONE is a valid and useful framework yielding both anticipated and unexpected enhancements to leaders, environments, and outcomes.

  12. A review of methods for predicting air pollution dispersion

    NASA Technical Reports Server (NTRS)

    Mathis, J. J., Jr.; Grose, W. L.

    1973-01-01

    Air pollution modeling, and problem areas in air pollution dispersion modeling were surveyed. Emission source inventory, meteorological data, and turbulent diffusion are discussed in terms of developing a dispersion model. Existing mathematical models of urban air pollution, and highway and airport models are discussed along with their limitations. Recommendations for improving modeling capabilities are included.

  13. Development of an Improved Simulator for Chemical and Microbial EOR Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, Gary A.; Sepehrnoori, Kamy; Delshad, Mojdeh

    2000-09-11

    The objective of this research was to extend the capability of an existing simulator (UTCHEM) to improved oil recovery methods that use surfactants, polymers, gels, alkaline chemicals, microorganisms and foam, as well as various combinations of these, in both conventional and naturally fractured oil reservoirs. Task 1 is the addition of a dual-porosity model for chemical improved oil recovery processes in naturally fractured oil reservoirs. Task 2 is the addition of a foam model. Task 3 addresses several numerical and coding enhancements that will greatly improve the versatility and performance of UTCHEM. Task 4 is the enhancement of physical property models.

  14. Improved Accuracy Using Recursive Bayesian Estimation Based Language Model Fusion in ERP-Based BCI Typing Systems

    PubMed Central

    Orhan, U.; Erdogmus, D.; Roark, B.; Oken, B.; Purwar, S.; Hild, K. E.; Fowler, A.; Fried-Oken, M.

    2013-01-01

    RSVP Keyboard™ is an electroencephalography (EEG) based brain-computer interface (BCI) typing system, designed as an assistive technology for the communication needs of people with locked-in syndrome (LIS). It relies on rapid serial visual presentation (RSVP) and does not require precise eye gaze control. Existing BCI typing systems that use event-related potentials (ERPs) in EEG suffer from low accuracy due to low signal-to-noise ratio. RSVP Keyboard™ therefore utilizes context-based decision making, incorporating a language model to improve the accuracy of letter decisions. To further improve the contribution of the language model, we propose recursive Bayesian estimation, which relies on non-committing string decisions, and conduct an offline analysis comparing it with the existing naïve Bayesian fusion approach. The results indicate the superiority of recursive Bayesian fusion, and we plan to incorporate this new approach in the next generation of RSVP Keyboard™. PMID:23366432
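    The fusion described above can be sketched in a few lines: the language-model probability serves as the prior over candidate letters, and the posterior is updated recursively after each presentation by multiplying in the EEG likelihood and renormalizing. The likelihood numbers below are illustrative, not real classifier output, and the function is an assumption rather than the RSVP Keyboard™ implementation.

```python
# Hedged sketch of recursive Bayesian fusion for an ERP-based speller:
# language model prior * product of per-presentation EEG likelihoods.

def recursive_update(prior, likelihood_sequence):
    """prior: dict letter -> P(letter) from the language model.
    likelihood_sequence: list of dicts letter -> P(EEG evidence | letter)."""
    posterior = dict(prior)
    for likelihood in likelihood_sequence:
        # Bayes rule applied to the running posterior, without committing
        # to a letter decision between presentations.
        unnorm = {c: posterior[c] * likelihood[c] for c in posterior}
        z = sum(unnorm.values())
        posterior = {c: v / z for c, v in unnorm.items()}
    return posterior

lm_prior = {"a": 0.6, "b": 0.3, "c": 0.1}
evidence = [{"a": 0.2, "b": 0.5, "c": 0.3},
            {"a": 0.1, "b": 0.7, "c": 0.2}]
posterior = recursive_update(lm_prior, evidence)
best = max(posterior, key=posterior.get)
```

    Here two presentations of consistent evidence for "b" overturn the language model's initial preference for "a", which is the behavior that makes non-committing fusion attractive.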

  15. Using Uncertainty Quantification to Guide Development and Improvements of a Regional-Scale Model of the Coastal Lowlands Aquifer System Spanning Texas, Louisiana, Mississippi, Alabama and Florida

    NASA Astrophysics Data System (ADS)

    Foster, L. K.; Clark, B. R.; Duncan, L. L.; Tebo, D. T.; White, J.

    2017-12-01

    Several historical groundwater models exist within the Coastal Lowlands Aquifer System (CLAS), which spans the Gulf Coastal Plain in Texas, Louisiana, Mississippi, Alabama, and Florida. The largest of these models, called the Gulf Coast Regional Aquifer System Analysis (RASA) model, has been brought into a new framework using the Newton formulation for MODFLOW-2005 (MODFLOW-NWT) and serves as the starting point of a new investigation underway by the U.S. Geological Survey to improve understanding of the CLAS and provide predictions of future groundwater availability within an uncertainty quantification (UQ) framework. The use of an UQ framework will not only provide estimates of water-level observation worth, hydraulic parameter uncertainty, boundary-condition uncertainty, and uncertainty of future potential predictions, but it will also guide the model development process. Traditionally, model development proceeds from dataset construction to the process of deterministic history matching, followed by deterministic predictions using the model. This investigation will combine the use of UQ with existing historical models of the study area to assess in a quantitative framework the effect model package and property improvements have on the ability to represent past-system states, as well as the effect on the model's ability to make certain predictions of water levels, water budgets, and base-flow estimates. Estimates of hydraulic property information and boundary conditions from the existing models and literature, forming the prior, will be used to make initial estimates of model forecasts and their corresponding uncertainty, along with an uncalibrated groundwater model run within an unconstrained Monte Carlo analysis. First-Order Second-Moment (FOSM) analysis will also be used to investigate parameter and predictive uncertainty, and guide next steps in model development prior to rigorous history matching by using PEST++ parameter estimation code.
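    A FOSM analysis of the kind mentioned above propagates prior parameter uncertainty through a linearized model to give a forecast variance before any history matching. A minimal sketch follows; the two parameters, their prior covariance, and the forecast sensitivities are illustrative assumptions, not values from the study.

```python
# Minimal First-Order Second-Moment (FOSM) sketch: forecast variance
# sigma^2 = y * C_p * y^T, where y holds the sensitivities of one model
# prediction to each parameter and C_p is the prior parameter covariance.

def fosm_forecast_variance(sensitivities, param_cov):
    """Quadratic form y C_p y^T computed with plain lists."""
    n = len(sensitivities)
    return sum(sensitivities[i] * param_cov[i][j] * sensitivities[j]
               for i in range(n) for j in range(n))

# Assumed sensitivities of a predicted water level to (log-K, recharge),
# with an assumed diagonal prior covariance for those two parameters.
sensitivities = [2.0, 0.5]
prior_cov = [[0.25, 0.0],
             [0.0, 0.04]]
forecast_var = fosm_forecast_variance(sensitivities, prior_cov)
forecast_std = forecast_var ** 0.5
```

    Because the calculation is linear in the covariance, repeating it with a conditioned (post-calibration) covariance shows directly how much a given observation dataset reduces forecast uncertainty, which is what makes FOSM useful for guiding model development.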

  16. A Vision for Incorporating Environmental Effects into Nitrogen Management Decision Support Tools for U.S. Maize Production

    PubMed Central

    Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D.; Pittelkow, Cameron M.

    2017-01-01

    Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field- and regional-scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in delivery and adoption of such a tool by stakeholders. PMID:28804490

  17. A Vision for Incorporating Environmental Effects into Nitrogen Management Decision Support Tools for U.S. Maize Production.

    PubMed

    Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D; Pittelkow, Cameron M

    2017-01-01

    Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field- and regional-scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in delivery and adoption of such a tool by stakeholders.

  18. Improving Listening Comprehension through a Whole-Schema Approach.

    ERIC Educational Resources Information Center

    Ellermeyer, Deborah

    1993-01-01

    Examines the development of the schema, or cognitive structure, theory of reading comprehension. Advances a model for improving listening comprehension within the classroom through a teacher-facilitated approach that leads students to select and utilize existing schemas within a whole-language environment. (MDM)

  19. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
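    A simulation of the kind described above can be sketched briefly: satisfaction drivers are sampled from assumed distributions, a weighted index is computed per trial, and the empirical distribution yields a baseline and a prediction interval for future scores. The driver names, weights, and distributions below are illustrative assumptions, not the ACSI's actual structure.

```python
# Hedged Monte Carlo sketch of a customer satisfaction index baseline.
import random

random.seed(42)

# Assumed drivers and weights (not the real ACSI structure).
DRIVER_WEIGHTS = {"expectations": 0.3, "perceived_quality": 0.5, "value": 0.2}

def simulate_index(n_trials=10_000):
    scores = []
    for _ in range(n_trials):
        # Sample each driver on a 0-100 scale around an assumed mean.
        sampled = {d: min(100.0, max(0.0, random.gauss(75.0, 8.0)))
                   for d in DRIVER_WEIGHTS}
        scores.append(sum(DRIVER_WEIGHTS[d] * v for d, v in sampled.items()))
    scores.sort()
    baseline = sum(scores) / len(scores)
    # 90% prediction interval from the empirical distribution.
    lo, hi = scores[int(0.05 * len(scores))], scores[int(0.95 * len(scores))]
    return baseline, (lo, hi)

baseline, interval = simulate_index()
```

    Sensitivity analysis then amounts to perturbing a driver's mean or weight and re-running the simulation to see how the baseline and interval move.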

  20. Metropolitan Model Deployment Initiative : San Antonio evaluation report

    DOT National Transportation Integrated Search

    2000-05-01

    This report presents results from the evaluation of the San Antonio Texas Metropolitan Model Deployment Initiative (MMDI). The MMDI had six key goals directed at improving existing services and deploying new services. The goals were directed at: 1) e...

  1. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.

  2. Quantum protocols within Spekkens' toy model

    NASA Astrophysics Data System (ADS)

    Disilvestro, Leonardo; Markham, Damian

    2017-05-01

    Quantum mechanics is known to provide significant improvements in information processing tasks when compared to classical models. These advantages range from computational speedups to security improvements. A key question is where these advantages come from. The toy model developed by Spekkens [R. W. Spekkens, Phys. Rev. A 75, 032110 (2007), 10.1103/PhysRevA.75.032110] mimics many of the features of quantum mechanics, such as entanglement and no cloning, regarded as being important in this regard, despite being a local hidden variable theory. In this work, we study several protocols within Spekkens' toy model where we see it can also mimic the advantages and limitations shown in the quantum case. We first provide explicit proofs for the impossibility of toy bit commitment and the existence of a toy error correction protocol and consequent k-threshold secret sharing. Then, defining a toy computational model based on the quantum one-way computer, we prove the existence of blind and verified protocols. Importantly, these two last quantum protocols are known to achieve a better-than-classical security. Our results suggest that such quantum improvements need not arise from any Bell-type nonlocality or contextuality, but rather as a consequence of steering correlations.

  3. Online Knowledge-Based Model for Big Data Topic Extraction.

    PubMed

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

    Lifelong machine learning (LML) models learn from experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. Existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while reducing the processing cost by half.

  4. A Modest Proposal for Improving the Education of Reading Teachers. Technical Report No. 487.

    ERIC Educational Resources Information Center

    Anderson, Richard C.; And Others

    A gap exists between talk about teaching that is featured in most preservice teacher education and the working knowledge and problem-solving expertise that characterize skilled teaching. This gap exists because typical teacher training does not embody the principles of modeling, coaching, scaffolding, articulation, and reflection. Three methods…

  5. A Narrowing Target for Early Mars Climate Models: Which Models Survive Confrontation with Improved Hydrology Constraints?

    NASA Astrophysics Data System (ADS)

    Kite, E. S.; Goldblatt, C.; Gao, P.; Mayer, D. P.; Sneed, J.; Wilson, S. A.

    2016-12-01

    The wettest climates in Mars' geologic history represent habitability optima, and also set the tightest constraints on climate models. For lake-forming climates on Early Mars, geologic data constrain discharge, duration, intermittency, and the number of lake-forming events. We synthesise new and existing data to suggest that post-Noachian lake-forming climates were widely separated in time, lasted >10^4 yr individually, were few in number, but cumulatively lasted <10^7 yr (to allow olivine to survive globally). We compare these data against existing models, set out a new model involving methane bursts, and conclude with future directions for Early Mars geologic analysis and modelling work.

  6. Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karali, Nihan; Xu, Tengfang; Sathaye, Jayant

    2012-12-12

    The goal of this study is to develop a new bottom-up industry-sector energy-modeling framework aimed at addressing least-cost regional and global carbon reduction strategies, improving on the capabilities and limitations of existing models by allowing trading across regions and countries as an alternative.

  7. How Well Does LCA Model Land Use Impacts on Biodiversity?--A Comparison with Approaches from Ecology and Conservation.

    PubMed

    Curran, Michael; de Souza, Danielle Maia; Antón, Assumpció; Teixeira, Ricardo F M; Michelsen, Ottar; Vidal-Legaz, Beatriz; Sala, Serenella; Milà i Canals, Llorenç

    2016-03-15

    The modeling of land use impacts on biodiversity is considered a priority in life cycle assessment (LCA). Many diverging approaches have been proposed in an expanding literature on the topic. The UNEP/SETAC Life Cycle Initiative is engaged in building consensus on a shared modeling framework to highlight best-practice and guide model application by practitioners. In this paper, we evaluated the performance of 31 models from both the LCA and the ecology/conservation literature (20 from LCA, 11 from non-LCA fields) according to a set of criteria reflecting (i) model completeness, (ii) biodiversity representation, (iii) impact pathway coverage, (iv) scientific quality, and (v) stakeholder acceptance. We show that LCA models tend to perform worse than those from ecology and conservation (although not significantly), implying room for improvement. We identify seven best-practice recommendations that can be implemented immediately to improve LCA models based on existing approaches in the literature. We further propose building a "consensus model" through weighted averaging of existing information, to complement future development. While our research focuses on conceptual model design, further quantitative comparison of promising models in shared case studies is an essential prerequisite for future informed model choice.

  8. Models of Integrating Physical Therapists into Family Health Teams in Ontario, Canada: Challenges and Opportunities

    PubMed Central

    Mandoda, Shilpa; Landry, Michel D.

    2011-01-01

    ABSTRACT Purpose: To explore the potential for different models of incorporating physical therapy (PT) services within the emerging network of family health teams (FHTs) in Ontario and to identify challenges and opportunities of each model. Methods: A two-phase mixed-methods qualitative descriptive approach was used. First, FHTs were mapped in relation to existing community-based PT practices. Second, semi-structured key-informant interviews were conducted with representatives from urban and rural FHTs and from a variety of community-based PT practices. Interviews were digitally recorded, transcribed verbatim, and analyzed using a categorizing/editing approach. Results: Most participants agreed that the ideal model involves embedding physical therapists directly into FHTs; in some situations, however, partnering with an existing external PT provider may be more feasible and sustainable. Access and funding remain the key issues, regardless of the model adopted. Conclusion: Although there are differences across the urban/rural divide, there exist opportunities to enhance and optimize existing delivery models so as to improve client access and address emerging demand for community-based PT services. PMID:22654231

  9. Quality Improvement on the Acute Inpatient Psychiatry Unit Using the Model for Improvement

    PubMed Central

    Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean

    2013-01-01

    Background A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. Methods We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients—those starting or continuing on standing neuroleptics—with the Abnormal Involuntary Movement Scale (AIMS). Results After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Conclusion Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team. PMID:24052768

  10. Quality improvement on the acute inpatient psychiatry unit using the model for improvement.

    PubMed

    Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean

    2013-01-01

    A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients-those starting or continuing on standing neuroleptics-with the Abnormal Involuntary Movement Scale (AIMS). After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team.

  11. Plausible combinations: An improved method to evaluate the covariate structure of Cormack-Jolly-Seber mark-recapture models

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; McDonald, Trent L.; Amstrup, Steven C.

    2013-01-01

    Mark-recapture models are extensively used in quantitative population ecology, providing estimates of population vital rates, such as survival, that are difficult to obtain using other methods. Vital rates are commonly modeled as functions of explanatory covariates, adding considerable flexibility to mark-recapture models, but also increasing the subjectivity and complexity of the modeling process. Consequently, model selection and the evaluation of covariate structure remain critical aspects of mark-recapture modeling. The difficulties involved in model selection are compounded in Cormack-Jolly-Seber models because they are composed of separate sub-models for survival and recapture probabilities, which are conceptualized independently even though their parameters are not statistically independent. The construction of models as combinations of sub-models, together with multiple potential covariates, can lead to a large model set. Although desirable, estimation of the parameters of all models may not be feasible. Strategies to search a model space and base inference on a subset of all models exist and enjoy widespread use. However, even though the methods used to search a model space can be expected to influence parameter estimation, the assessment of covariate importance, and therefore the ecological interpretation of the modeling results, the performance of these strategies has received limited investigation. We present a new strategy for searching the space of a candidate set of Cormack-Jolly-Seber models and explore its performance relative to existing strategies using computer simulation. The new strategy provides an improved assessment of the importance of covariates and covariate combinations used to model survival and recapture probabilities, while requiring only a modest increase in the number of models on which inference is based in comparison to existing techniques.

  12. The economic impact of drag in general aviation

    NASA Technical Reports Server (NTRS)

    Neal, R. D.

    1975-01-01

    General aviation aircraft fuel consumption and operating costs are closely linked to drag reduction methods. Improvements in airplane drag are envisioned for new models; their effects will be in the 5 to 10% range. Major improvements in fuel consumption over existing turbofan airplanes will be the combined results of improved aerodynamics plus additional effects from advanced turbofan engine designs.

  13. Improving Student Writing Skills through the Modeling of the Writing Process.

    ERIC Educational Resources Information Center

    Kapka, Dawn; Oberman, Dina A.

    This study describes a program designed to improve students' writing skills in order to improve academic achievement. The targeted population consists of third and fifth grade elementary students in two separate communities ranging from low to middle class, located in two midwestern suburbs of a large city. Evidence for the existence of the…

  14. Construction of mathematical model for measuring material concentration by colorimetric method

    NASA Astrophysics Data System (ADS)

    Liu, Bing; Gao, Lingceng; Yu, Kairong; Tan, Xianghua

    2018-06-01

    This paper uses multiple linear regression to analyse the data of Problem C of the 2017 mathematical modeling contest. First, we established regression models for the concentrations of five substances, but only the regression model for the concentration of urea in milk passed the significance test. The regression model established from the second set of data passed the significance test but exhibited serious multicollinearity. We improved the model by principal component analysis. The improved model is used to control the system so that material concentration can be measured by a direct colorimetric method.
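    The principal-component remedy for multicollinearity described above amounts to principal component regression. A minimal sketch on synthetic, deliberately collinear data (the data and the number of retained components are assumptions, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic colorimetric data: 50 samples, 4 highly collinear features
X = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 4))   # rank-2 structure
X += 0.01 * rng.normal(size=X.shape)                     # small perturbation
beta_true = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ beta_true + 0.1 * rng.normal(size=50)

# Principal component regression: regress on the leading PCs, map back
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2                                    # components kept (an assumption)
Z = Xc @ Vt[:k].T                        # principal component scores
gamma, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
beta_pcr = Vt[:k].T @ gamma              # coefficients in feature space

y_hat = Xc @ beta_pcr + y.mean()
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

    Because the regression is done on orthogonal component scores rather than the collinear raw features, the coefficient estimates are stable even though the original design matrix is nearly rank-deficient.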

  15. The public health nutrition intervention management bi-cycle: a model for training and practice improvement.

    PubMed

    Hughes, Roger; Margetts, Barrie

    2012-11-01

    The present paper describes a model for public health nutrition practice designed to facilitate practice improvement and provide a step-wise approach to assist with workforce development. The bi-cycle model for public health nutrition practice has been developed based on existing cyclical models for intervention management, but modified to integrate discrete capacity-building practices. The model is intended for education and practice settings and will have applications for educators and practitioners. Modifications to existing models have been informed by the authors' observations and experiences as practitioners and educators, and reflect a conceptual framework with applications in workforce development and practice improvement. From a workforce development and educational perspective, the model is designed to reflect adult learning principles, exposing students to experiential, problem-solving and practical learning experiences that reflect the realities of work as a public health nutritionist. In doing so, it assists the development of competency beyond knowing to knowing how, showing how and doing. This progression of learning from knowledge to performance is critical to effective competency development for effective practice. Public health nutrition practice is dynamic and varied, and models need to be adaptable and applicable to the practice context to have utility. The paper serves to stimulate debate in the public health nutrition community and to encourage critical feedback about the validity, applicability and utility of this model in different practice contexts.

  16. Snow model design for operational purposes

    NASA Astrophysics Data System (ADS)

    Kolberg, Sjur

    2017-04-01

    A parsimonious distributed energy balance snow model intended for operational use is evaluated using discharge, snow covered area and grain size; the latter two as observed from the MODIS sensor. The snow model is an improvement of the existing GamSnow model, which is part of the Enki modelling framework. Core requirements for the new version have been:
    1. Reduction of calibration freedom, motivated by previous experience of non-identifiable parameters in the existing version.
    2. Improvement of process representation, based on recent advances in physically based snow modelling.
    3. Limited sensitivity to forcing data that are poorly known over the spatial domain of interest (often in mountainous areas).
    4. Preference for observable states, and the ability to improve from updates.
    The albedo calculation is completely revised and is now based on grain size, through an emulation of the SNICAR model (Flanner and Zender, 2006; Gardner and Sharp, 2010). The number of calibration parameters in the albedo model is reduced from 6 to 2, and the wind function governing turbulent energy fluxes has been reduced from 2 parameters to 1. Following Raleigh et al. (2011), the snow surface radiant temperature is split from the top-layer thermodynamic temperature, using bias-corrected wet-bulb temperature to model the former. Analyses are ongoing, and the poster will bring evaluation results from 16 years of MODIS observations and more than 25 catchments in southern Norway.

  17. Improving Localization Accuracy: Successive Measurements Error Modeling

    PubMed Central

    Abu Ali, Najah; Abu-Elkheir, Mervat

    2015-01-01

    Vehicle self-localization is an essential requirement for many of the safety applications envisioned for vehicular networks. The mathematical models used in current vehicular localization schemes focus on modeling the localization error itself, and overlook the potential correlation between successive localization measurement errors. In this paper, we first investigate the existence of correlation between successive positioning measurements, and then incorporate this correlation into the modeling of positioning error. We use the Yule-Walker equations to determine the degree of correlation between a vehicle’s future position and its past positions, and then propose a p-order Gauss–Markov model to predict the future position of a vehicle from its past p positions. We investigate the existence of correlation for two datasets representing the mobility traces of two vehicles over a period of time. We prove the existence of correlation between successive measurements in the two datasets, and show that the time correlation between measurements can have a value up to four minutes. Through simulations, we validate the robustness of our model and show that it is possible to use the first-order Gauss–Markov model, which has the least complexity, and still maintain an accurate estimation of a vehicle’s future location over time using only its current position. Our model can assist in providing better modeling of positioning errors and can be used as a prediction tool to improve the performance of classical localization algorithms such as the Kalman filter. PMID:26140345
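    The Yule-Walker step can be sketched as follows; the trace below is a synthetic first-order autoregressive position series, not the paper's vehicle dataset.

```python
import numpy as np

def yule_walker(x, p):
    """Estimate AR(p) coefficients of series x via the Yule-Walker equations."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased autocovariance estimates r[0..p]
    r = np.array([x[:n - k] @ x[k:] / n for k in range(p + 1)])
    # Toeplitz system R * phi = r[1:]
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, r[1:])

# Synthetic 1-D position trace with strong first-order correlation
rng = np.random.default_rng(1)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.9 * x[t - 1] + rng.normal()

phi = yule_walker(x, p=1)             # recovers roughly 0.9
# One-step prediction of the next position from the current one
x_next = phi[0] * (x[-1] - x.mean()) + x.mean()
```

    For a p-order Gauss-Markov model, the same routine with a larger p yields the weights applied to the past p positions.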

  18. Enhancing Access to Patient Education Information: A Pilot Usability Study

    PubMed Central

    Beaudoin, Denise E.; Rocha, Roberto A.; Tse, Tony

    2005-01-01

    Health care organizations are developing Web-based portals to provide patient access to personal health information and enhance patient-provider communication. This pilot study investigates two navigation models (“serial” and “menu-driven”) for improving access to education materials available through a portal. There was a trend toward greater user satisfaction with the menu-driven model. Model preference was influenced by frequency of Web use. Results should aid in the improvement of existing portals and in the development of new ones. PMID:16779179

  19. Analysis and model on space-time characteristics of wind power output based on the measured wind speed data

    NASA Astrophysics Data System (ADS)

    Shi, Wenhui; Feng, Changyou; Qu, Jixian; Zha, Hao; Ke, Dan

    2018-02-01

    Most existing studies of wind power output focus on the fluctuation of wind farms, while the spatial self-complementarity of wind power output time series has been ignored. The existing probability models therefore cannot reflect the features of power systems incorporating wind farms. This paper analysed the spatial self-complementarity of wind power and proposed a probability model that reflects the temporal characteristics of wind power on seasonal and diurnal timescales, based on sufficient measured data and an improved clustering method. This model can provide an important reference for power system simulation incorporating wind farms.

  20. Combustion of Nitramine Propellants

    DTIC Science & Technology

    1983-03-01

    through development of a comprehensive analytical model. The ultimate goals are to enable prediction of deflagration rate over a wide pressure range...superior in burn rate prediction, both simple models fail in correlating existing temperature-sensitivity data. (2) In the second part, a...auxiliary condition to enable independent burn rate prediction; improved melt phase model including decomposition-gas bubbles; model for far-field

  1. Technical Note: Harmonizing met-ocean model data via standard web services within small research groups

    USGS Publications Warehouse

    Signell, Richard; Camossi, E.

    2016-01-01

    Work over the last decade has resulted in standardised web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by (1) making it simple for providers to enable web service access to existing output files; (2) using free technologies that are easy to deploy and configure; and (3) providing standardised, service-based tools that work in existing research environments. We present a simple, local brokering approach that lets modellers continue to use their existing files and tools, while serving virtual data sets that can be used with standardised tools. The goal of this paper is to convince modellers that a standardised framework is not only useful but can be implemented with modest effort using free software components. We use NetCDF Markup language for data aggregation and standardisation, the THREDDS Data Server for data delivery, pycsw for data search, NCTOOLBOX (MATLAB®) and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS.
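    The NcML "local brokering" step described above can be illustrated with a minimal wrapper file; the variable name, attributes and file paths below are hypothetical, while `orgName`, `joinExisting` and `scan` are standard NcML constructs.

```xml
<!-- Hypothetical NcML wrapper: aggregate a directory of model output files
     along time and standardise a variable name, without touching the
     original files -->
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
  <variable name="sea_water_temperature" orgName="temp">
    <attribute name="standard_name" value="sea_water_temperature"/>
    <attribute name="units" value="degree_Celsius"/>
  </variable>
  <aggregation dimName="time" type="joinExisting">
    <scan location="/data/model/run1/" suffix=".nc" subdirs="false"/>
  </aggregation>
</netcdf>
```

    A THREDDS Data Server pointed at such a file serves the aggregation as a single virtual dataset, which standard clients can then reach via OPeNDAP or WMS.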

  2. Technical note: Harmonising metocean model data via standard web services within small research groups

    NASA Astrophysics Data System (ADS)

    Signell, Richard P.; Camossi, Elena

    2016-05-01

    Work over the last decade has resulted in standardised web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by (1) making it simple for providers to enable web service access to existing output files; (2) using free technologies that are easy to deploy and configure; and (3) providing standardised, service-based tools that work in existing research environments. We present a simple, local brokering approach that lets modellers continue to use their existing files and tools, while serving virtual data sets that can be used with standardised tools. The goal of this paper is to convince modellers that a standardised framework is not only useful but can be implemented with modest effort using free software components. We use NetCDF Markup language for data aggregation and standardisation, the THREDDS Data Server for data delivery, pycsw for data search, NCTOOLBOX (MATLAB®) and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS.

  3. 3D modeling based on CityEngine

    NASA Astrophysics Data System (ADS)

    Jia, Guangyin; Liao, Kaiju

    2017-03-01

    Currently, there are many 3D modeling software packages, such as 3DMAX and AUTOCAD, as well as the more popular BIM software represented by REVIT. The CityEngine modeling software introduced in this paper can fully utilize existing GIS data and combine them with other built models to carry out rapid, batch 3D modeling of the interior and exterior of buildings, so as to improve 3D modeling efficiency.

  4. Numerical simulation for the air entrainment of aerated flow with an improved multiphase SPH model

    NASA Astrophysics Data System (ADS)

    Wan, Hang; Li, Ran; Pu, Xunchi; Zhang, Hongwei; Feng, Jingjie

    2017-11-01

    Aerated flow is a complex hydraulic phenomenon that exists widely in the field of environmental hydraulics. It is generally characterised by large deformation and violent fragmentation of the free surface. Compared to Eulerian methods (the volume of fluid (VOF) method or the rigid-lid hypothesis method), the existing single-phase Smoothed Particle Hydrodynamics (SPH) method has performed well for solving particle motion. A lack of research on interphase interaction and air concentration, however, has limited the application of the SPH model. In our study, an improved multiphase SPH model is presented to simulate aerated flows. A drag force is included in the momentum equation to ensure the accuracy of the air particle slip velocity. Furthermore, a calculation method for air concentration is developed to analyse the air entrainment characteristics. Two case studies were used to simulate the hydraulic and air entrainment characteristics, and the simulation results agree well with the experimental results.

  5. Waffle mode error in the AEOS adaptive optics point-spread function

    NASA Astrophysics Data System (ADS)

    Makidon, Russell B.; Sivaramakrishnan, Anand; Roberts, Lewis C., Jr.; Oppenheimer, Ben R.; Graham, James R.

    2003-02-01

    Adaptive optics (AO) systems have improved astronomical imaging capabilities significantly over the last decade, and have the potential to revolutionize the kinds of science done with 4-5 m class ground-based telescopes. However, provided sufficient detailed study and analysis, existing AO systems can be improved beyond their original specified error budgets. Indeed, modeling AO systems has been a major activity in the past decade: sources of noise in the atmosphere and the wavefront sensing (WFS) control loop have received a great deal of attention, and many detailed and sophisticated control-theoretic and numerical models predicting AO performance are already in existence. However, in terms of AO system performance improvements, wavefront reconstruction (WFR) and wavefront calibration techniques have commanded relatively little attention. We elucidate the nature of some of these reconstruction problems, and demonstrate their existence in data from the AEOS AO system. We simulate the AO correction of AEOS in the I-band, and show that the magnitude of the 'waffle mode' error in the AEOS reconstructor is considerably larger than expected. We suggest ways of reducing the magnitude of this error, and, in doing so, open up ways of understanding how wavefront reconstruction might handle bad actuators and partially illuminated WFS subapertures.

  6. Online Knowledge-Based Model for Big Data Topic Extraction

    PubMed Central

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

    Lifelong machine learning (LML) models learn with experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. The existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, as topic coherence, by 7% for streaming data while reducing the processing cost to half. PMID:27195004

  7. Establishing NWP capabilities in African Small Island States (SIDs)

    NASA Astrophysics Data System (ADS)

    Rögnvaldsson, Ólafur

    2017-04-01

    Íslenskar orkurannsóknir (ÍSOR), in collaboration with Belgingur Ltd. and the United Nations Economic Commission for Africa (UNECA) signed a Letter of Agreement in 2015 regarding collaboration in the "Establishing Operational Capacity for Building, Deploying and Using Numerical Weather and Seasonal Prediction Systems in Small Island States in Africa (SIDs)" project. The specific objectives of the collaboration were the following: - Build capacity of National Meteorological and Hydrology Services (NMHS) staff on the use of the WRF atmospheric model for weather and seasonal forecasting, interpretation of model results, and the use of observations to verify and improve model simulations. - Establish a platform for integrating short to medium range weather forecasts, as well as seasonal forecasts, into already existing infrastructure at NMHS and Regional Climate Centres. - Improve understanding of existing model results and forecast verification, for improving decision-making on the time scale of days to weeks. To meet these challenges the operational Weather On Demand (WOD) forecasting system, developed by Belgingur, is being installed in a number of SIDs countries (Cabo Verde, Guinea-Bissau, and Seychelles), as well as being deployed for the Pan-Africa region, with forecasts being disseminated to collaborating NMHSs.

  8. An improved water budget for the El Yunque National Forest, Puerto Rico, as determined by the Water Supply Stress Index Model

    Treesearch

    Liangxia Zhang; Ge Sun; Erika Cohen; Steven McNulty; Peter Caldwell; Suzanne Krieger; Jason Christian; Decheng Zhou; Kai Duan; Keren J. Cepero-Pérez

    2018-01-01

    Quantifying the forest water budget is fundamental to making science-based forest management decisions. This study aimed at developing an improved water budget for the El Yunque National Forest (ENF) in Puerto Rico, one of the wettest forests in the United States. We modified an existing monthly scale water balance model, Water Supply Stress Index (WaSSI), to reflect...

  9. Turbulence Modeling Workshop

    NASA Technical Reports Server (NTRS)

    Rubinstein, R. (Editor); Rumsey, C. L. (Editor); Salas, M. D. (Editor); Thomas, J. L. (Editor); Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    Advances in turbulence modeling are needed in order to calculate high Reynolds number flows near the onset of separation and beyond. To this end, the participants in this workshop made the following recommendations. (1) A national/international database and standards for turbulence modeling assessment should be established. Existing experimental data sets should be reviewed and categorized. Advantage should be taken of other efforts already underway, such as that of the European Research Community on Flow, Turbulence, and Combustion (ERCOFTAC) consortium. Carefully selected "unit" experiments will be needed, as well as advances in instrumentation, to fill the gaps in existing datasets. A high priority should be given to documenting existing turbulence model capabilities in a standard form, including numerical implementation issues such as grid quality and resolution. (2) NASA should support long-term research on Algebraic Stress Models and Reynolds Stress Models. The emphasis should be placed on improving the length-scale equation, since it is the least understood and is a key component of two-equation and higher models. Second priority should be given to the development of improved near-wall models. Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) would provide valuable guidance in developing and validating new Reynolds-averaged Navier-Stokes (RANS) models. Although not the focus of this workshop, DNS, LES, and hybrid methods currently represent viable approaches for analysis on a limited basis. Therefore, although computer limitations require the use of RANS methods for realistic configurations at high Reynolds number in the foreseeable future, a balanced effort in turbulence modeling development, validation, and implementation should include these approaches as well.

  10. Elemental composition and energy spectra of galactic cosmic rays

    NASA Technical Reports Server (NTRS)

    Mewaldt, R. A.

    1988-01-01

    A brief review is presented of the major features of the elemental composition and energy spectra of galactic cosmic rays. The requirements for phenomenological models of cosmic ray composition and energy spectra are discussed, and possible improvements to an existing model are suggested.

  11. Fitmunk: improving protein structures by accurate, automatic modeling of side-chain conformations.

    PubMed

    Porebski, Przemyslaw Jerzy; Cymborowski, Marcin; Pasenkiewicz-Gierula, Marta; Minor, Wladek

    2016-02-01

    Improvements in crystallographic hardware and software have allowed automated structure-solution pipelines to approach a near-'one-click' experience for the initial determination of macromolecular structures. However, in many cases the resulting initial model requires a laborious, iterative process of refinement and validation. A new method has been developed for the automatic modeling of side-chain conformations that takes advantage of rotamer-prediction methods in a crystallographic context. The algorithm, which is based on deterministic dead-end elimination (DEE) theory, uses new dense conformer libraries and a hybrid energy function derived from experimental data and prior information about rotamer frequencies to find the optimal conformation of each side chain. In contrast to existing methods, which incorporate the electron-density term into protein-modeling frameworks, the proposed algorithm is designed to take advantage of the highly discriminatory nature of electron-density maps. This method has been implemented in the program Fitmunk, which uses extensive conformational sampling. This improves the accuracy of the modeling and makes it a versatile tool for crystallographic model building, refinement and validation. Fitmunk was extensively tested on over 115 new structures, as well as a subset of 1100 structures from the PDB. It is demonstrated that the ability of Fitmunk to model more than 95% of side chains accurately is beneficial for improving the quality of crystallographic protein models, especially at medium and low resolutions. Fitmunk can be used for model validation of existing structures and as a tool to assess whether side chains are modeled optimally or could be better fitted into electron density. Fitmunk is available as a web service at http://kniahini.med.virginia.edu/fitmunk/server/ or at http://fitmunk.bitbucket.org/.
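    Fitmunk's implementation is not reproduced here, but the Goldstein form of dead-end elimination (DEE) on which such algorithms build can be sketched with made-up energies: a rotamer is eliminated when some competitor beats it even in the eliminated rotamer's most favourable environment.

```python
import numpy as np

def dee_pass(E_self, E_pair):
    """One pass of Goldstein dead-end elimination.

    E_self[i] is a vector of self energies for the rotamers at position i;
    E_pair[(i, j)] is a matrix of pair energies between rotamers at i and j.
    Returns, per position, a boolean mask of rotamers that survive the pass.
    """
    n = len(E_self)
    alive = [np.ones(len(e), dtype=bool) for e in E_self]
    for i in range(n):
        for r in range(len(E_self[i])):
            for r2 in range(len(E_self[i])):
                if r == r2 or not alive[i][r2]:
                    continue
                # Goldstein criterion: r is dead if r2 beats it even in
                # r's most favourable environment at every other position.
                gap = E_self[i][r] - E_self[i][r2]
                for j in range(n):
                    if j != i:
                        gap += np.min(E_pair[(i, j)][r] - E_pair[(i, j)][r2])
                if gap > 0:
                    alive[i][r] = False
                    break
    return alive

# Two positions with two rotamers each (hypothetical energies)
E_self = [np.array([5.0, 0.0]), np.array([0.0, 0.0])]
E_pair = {
    (0, 1): np.array([[1.0, 2.0], [0.0, 0.0]]),
    (1, 0): np.array([[1.0, 0.0], [2.0, 0.0]]),
}
alive = dee_pass(E_self, E_pair)
# Rotamer 0 at position 0 is dominated by rotamer 1 and is eliminated
```

    A production implementation would iterate such passes to convergence and combine them with pair elimination; the hybrid energy function and density term of the paper are not modelled here.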

  12. Creating and Using Interactive, 3D-Printed Models to Improve Student Comprehension of the Bohr Model of the Atom, Bond Polarity, and Hybridization

    ERIC Educational Resources Information Center

    Smiar, Karen; Mendez, J. D.

    2016-01-01

    Molecular model kits have been used in chemistry classrooms for decades but have seen very little recent innovation. Using 3D printing, three sets of physical models were created for a first semester, introductory chemistry course. Students manipulated these interactive models during class activities as a supplement to existing teaching tools for…

  13. Embodied Agents, E-SQ and Stickiness: Improving Existing Cognitive and Affective Models

    NASA Astrophysics Data System (ADS)

    de Diesbach, Pablo Brice

    This paper synthesizes results from two previous studies of embodied virtual agents on commercial websites. We analyze and criticize the proposed models and discuss the limits of the experimental findings. Results from other important research in the literature are integrated. We also integrate concepts from deeper, more business-related analyses of the mechanisms of rhetoric in marketing and communication, and of the possible role of E-SQ in man-agent interaction. We finally suggest a refined model for the impacts of these agents on website users, and comment on the limits of the improved model.

  14. Multiclassifier fusion in human brain MR segmentation: modelling convergence.

    PubMed

    Heckemann, Rolf A; Hajnal, Joseph V; Aljabar, Paul; Rueckert, Daniel; Hammers, Alexander

    2006-01-01

    Segmentations of MR images of the human brain can be generated by propagating an existing atlas label volume to the target image. By fusing multiple propagated label volumes, the segmentation can be improved. We developed a model that predicts the improvement of labelling accuracy and precision based on the number of segmentations used as input. Using a cross-validation study on brain image data as well as numerical simulations, we verified the model. Fit parameters of this model are potential indicators of the quality of a given label propagation method or the consistency of the input segmentations used.
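    The simplest form of such label fusion is a per-voxel majority (plurality) vote over the propagated label volumes, sketched below on toy arrays; the paper's convergence model and fit parameters are not reproduced.

```python
import numpy as np

def fuse_labels(label_volumes):
    """Per-voxel majority (plurality) vote over propagated label volumes."""
    stack = np.stack(label_volumes)          # shape: (n_inputs, *volume_shape)
    n_labels = stack.max() + 1
    # Count the votes for each label at every voxel, then take the winner
    counts = np.stack([(stack == lab).sum(axis=0) for lab in range(n_labels)])
    return counts.argmax(axis=0)

# Three hypothetical 2x3 "segmentations" of the same image
a = np.array([[0, 1, 1], [2, 2, 0]])
b = np.array([[0, 1, 2], [2, 2, 0]])
c = np.array([[1, 1, 1], [2, 0, 0]])
fused = fuse_labels([a, b, c])
# fused -> [[0, 1, 1], [2, 2, 0]]
```

    The accuracy gain from adding more input segmentations saturates, which is exactly the behaviour the paper's convergence model predicts as a function of the number of inputs.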

  15. Sculpting bespoke mountains: Determining free energies with basis expansions

    NASA Astrophysics Data System (ADS)

    Whitmer, Jonathan K.; Fluitt, Aaron M.; Antony, Lucas; Qin, Jian; McGovern, Michael; de Pablo, Juan J.

    2015-07-01

    The intriguing behavior of a wide variety of physical systems, ranging from amorphous solids or glasses to proteins, is a direct manifestation of underlying free energy landscapes riddled with local minima separated by large barriers. Exploring such landscapes has arguably become one of the great challenges of statistical physics. A new method is proposed here for uniform sampling of rugged free energy surfaces. The method, which relies on special Green's functions to approximate the Dirac delta function, improves significantly on existing simulation techniques by providing a boundary-agnostic approach that is capable of mapping complex features in multidimensional free energy surfaces. The usefulness of the proposed approach is established in the context of a simple model glass former and model proteins, demonstrating improved convergence and accuracy over existing methods.

  16. Modeling Aromatic Liquids: Toluene, Phenol, and Pyridine.

    PubMed

    Baker, Christopher M; Grant, Guy H

    2007-03-01

    Aromatic groups are now acknowledged to play an important role in many systems of interest. However, existing molecular mechanics methods provide a poor representation of these groups. In a previous paper, we have shown that the molecular mechanics treatment of benzene can be improved by the incorporation of an explicit representation of the aromatic π electrons. Here, we develop this concept further, developing charge-separation models for toluene, phenol, and pyridine. Monte Carlo simulations are used to parametrize the models, via the reproduction of experimental thermodynamic data, and our models are shown to outperform an existing atom-centered model. The models are then used to make predictions about the structures of the liquids at the molecular level and are tested further through their application to the modeling of gas-phase dimers and cation-π interactions.

  17. Thermo-chemical modelling of a village cookstove for design improvement

    NASA Astrophysics Data System (ADS)

    Honkalaskar, Vijay H.; Sohoni, Milind; Bhandarkar, Upendra V.

    2014-05-01

    Cookstove operation comprises three basic processes, namely combustion of firewood, natural air draft due to the buoyancy induced by the temperature difference between the hearth and its surroundings, and heat transfer to the pot, stove body and surrounding atmosphere. Owing to the heterogeneous and unsteady burning of solid fuel, nonlinear and dynamic interrelationships exist among these process parameters. A steady-state analytical model of cookstove operation is developed for design improvement by splitting the hearth into three zones to study char combustion, volatile combustion and heat transfer to the pot bottom separately. It comprises a total of seven relations corresponding to a thorough analysis of the three basic processes. A novel method is proposed to model the combustion of wood that closely mimics reality: the combustion space above the fuel bed is split into 1000 discrete parts to study the combustion of volatiles by considering a set of representative volatile gases. Model results are validated against a set of water boiling tests carried out on a traditional cookstove in the laboratory. The major thrust areas for improving thermal performance are found to be the combustion of volatiles and the heat transfer to the pot, while the existing design dimensions of the traditional cookstove are revealed to be close to their optimal values. Adding twisted-tape inserts in the hearth improves thermal performance by increasing the heat transfer coefficient to the pot bottom and improving the combustion of volatiles.

  18. Investigation of Stimulation-Response Relationships for Complex Fracture Systems in Enhanced Geothermal Reservoirs

    DOE Data Explorer

    Fu, Pengcheng; Johnson, Scott M.; Carrigan, Charles R.

    2011-01-01

    Hydraulic fracturing is currently the primary method for stimulating low-permeability geothermal reservoirs and creating Enhanced (or Engineered) Geothermal Systems (EGS) with improved permeability and heat production efficiency. Complex natural fracture systems usually exist in the formations to be stimulated and it is therefore critical to understand the interactions between existing fractures and newly created fractures before optimal stimulation strategies can be developed. Our study aims to improve the understanding of EGS stimulation-response relationships by developing and applying computer-based models that can effectively reflect the key mechanisms governing interactions between complex existing fracture networks and newly created hydraulic fractures. In this paper, we first briefly describe the key modules of our methodology, namely a geomechanics solver, a discrete fracture flow solver, a rock joint response model, an adaptive remeshing module, and most importantly their effective coupling. After verifying the numerical model against classical closed-form solutions, we investigate responses of reservoirs with different preexisting natural fractures to a variety of stimulation strategies. The factors investigated include: the in situ stress states (orientation of the principal stresses and the degree of stress anisotropy), pumping pressure, and stimulation sequences of multiple wells.

  19. Assessing shortfalls and complementary conservation areas for national plant biodiversity in South Korea.

    PubMed

    Choe, Hyeyeong; Thorne, James H; Huber, Patrick R; Lee, Dongkun; Quinn, James F

    2018-01-01

    Protected areas (PAs) are often considered the most important biodiversity conservation areas in national plans, but PAs often do not represent national-scale biodiversity. We evaluate the current conservation status of plant biodiversity within existing PAs and identify potential additional PAs for South Korea. We modeled species ranges for 2,297 plant species using Multivariate Adaptive Regression Splines and compared the mean range representation in South Korea's existing PAs, which comprise 5.7% of the country's mainland area, with an equal-area alternative PA strategy selected with the reserve algorithm Marxan. We also used Marxan to model two additional conservation scenarios that add lands to approach the Aichi Biodiversity Target objectives (17% of the country). Existing PAs in South Korea contain an average of 6.3% of each plant species' range, compared to 5.9% in the modeled equal-area alternative. However, existing PAs primarily represent a high percentage of the ranges of high-elevation and small-range species. The additional-PAs scenario that adds lands to the existing PAs covers 14,587.55 km2 and would improve overall plant range representation to a mean of 16.8% of every species' range. The other scenario, which selects new PAs from all lands and covers 13,197.35 km2, would improve overall representation to a mean of 13.5%. Although the scenario that includes existing PAs represents higher percentages of species' ranges, it misses many biodiversity hotspots in non-mountainous areas; the scenario that does not lock in the existing PAs represents almost all species' ranges evenly, including low-elevation species with larger ranges. Some priority conservation areas we identified are expansions of, or near, existing PAs, especially in northeastern and southern South Korea. However, lowland coastal areas and areas surrounding the capital city, Seoul, are also critical for biodiversity conservation in South Korea.

  20. Assessing shortfalls and complementary conservation areas for national plant biodiversity in South Korea

    PubMed Central

    Thorne, James H.; Huber, Patrick R.; Lee, Dongkun; Quinn, James F.

    2018-01-01

    Protected areas (PAs) are often considered the most important biodiversity conservation areas in national plans, but PAs often do not represent national-scale biodiversity. We evaluate the current conservation status of plant biodiversity within existing PAs and identify potential additional PAs for South Korea. We modeled species ranges for 2,297 plant species using Multivariate Adaptive Regression Splines and compared the mean range representation in South Korea’s existing PAs, which comprise 5.7% of the country’s mainland area, with an equal-area alternative PA strategy selected with the reserve algorithm Marxan. We also used Marxan to model two additional conservation scenarios that add lands to approach the Aichi Biodiversity Target objectives (17% of the country). Existing PAs in South Korea contain an average of 6.3% of each plant species’ range, compared to 5.9% in the modeled equal-area alternative. However, existing PAs primarily represent a high percentage of the ranges of high-elevation and small-range species. The additional-PAs scenario that adds lands to the existing PAs covers 14,587.55 km2 and would improve overall plant range representation to a mean of 16.8% of every species’ range. The other scenario, which selects new PAs from all lands and covers 13,197.35 km2, would improve overall representation to a mean of 13.5%. Although the scenario that includes existing PAs represents higher percentages of species’ ranges, it misses many biodiversity hotspots in non-mountainous areas; the scenario that does not lock in the existing PAs represents almost all species’ ranges evenly, including low-elevation species with larger ranges. Some priority conservation areas we identified are expansions of, or near, existing PAs, especially in northeastern and southern South Korea. However, lowland coastal areas and areas surrounding the capital city, Seoul, are also critical for biodiversity conservation in South Korea. PMID:29474355

  1. On Optimizing H.264/AVC Rate Control by Improving R-D Model and Incorporating HVS Characteristics

    NASA Astrophysics Data System (ADS)

    Zhu, Zhongjie; Wang, Yuer; Bai, Yongqiang; Jiang, Gangyi

    2010-12-01

    The state-of-the-art JVT-G012 rate control algorithm of H.264 is improved in two respects. First, the quadratic rate-distortion (R-D) model is modified based on both empirical observations and theoretical analysis. Second, drawing on existing physiological and psychological research on human vision, the rate control algorithm is optimized by incorporating the main characteristics of the human visual system (HVS), such as contrast sensitivity, multichannel theory, and masking effects. Experimental results show that the improved algorithm simultaneously enhances overall subjective visual quality and improves rate control precision.
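    The quadratic R-D model referenced above is commonly written T = X1·MAD/Q + X2·MAD/Q², where T is the target bit budget, MAD the predicted residual complexity, and Q the quantization step. The abstract's specific modification is not given, but the baseline solve-for-Q step can be sketched as follows (`solve_qstep` is an illustrative name):

```python
import math

def solve_qstep(target_bits, mad, x1, x2):
    """Solve the quadratic R-D model  T = x1*MAD/Q + x2*MAD/Q**2
    for the quantization step Q, taking the positive root.
    """
    if x2 == 0:
        # Degenerate linear case: T = x1*MAD/Q
        return x1 * mad / target_bits
    # Rearranged: T*Q**2 - x1*MAD*Q - x2*MAD = 0
    disc = (x1 * mad) ** 2 + 4 * target_bits * x2 * mad
    return (x1 * mad + math.sqrt(disc)) / (2 * target_bits)
```

    In a rate controller, Q is then mapped to a quantization parameter and the model coefficients X1, X2 are re-estimated after each coded frame.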

  2. Improving RNA nearest neighbor parameters for helices by going beyond the two-state model.

    PubMed

    Spasic, Aleksandar; Berger, Kyle D; Chen, Jonathan L; Seetin, Matthew G; Turner, Douglas H; Mathews, David H

    2018-06-01

    RNA folding free energy change nearest neighbor parameters are widely used to predict folding stabilities of secondary structures. They were determined by linear regression to datasets of optical melting experiments on small model systems. Traditionally, the optical melting experiments are analyzed assuming a two-state model, i.e. a structure is either complete or denatured. Experimental evidence, however, shows that structures exist in an ensemble of conformations. Partition functions calculated with existing nearest neighbor parameters predict that secondary structures can be partially denatured, which also directly conflicts with the two-state model. Here, a new approach for determining RNA nearest neighbor parameters is presented. Available optical melting data for 34 Watson-Crick helices were fit directly to a partition function model that allows an ensemble of conformations. Fitting parameters were the enthalpy and entropy changes for helix initiation, terminal AU pairs, stacks of Watson-Crick pairs and disordered internal loops. The resulting set of nearest neighbor parameters shows a 38.5% improvement in the sum of residuals in fitting the experimental melting curves compared to the current literature set.
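    Nearest neighbor parameters are applied by summing an initiation term, one term per stack of adjacent base pairs, and terminal AU penalties over a helix. A sketch with placeholder parameter values (deliberately not the fitted set from this work; the pairing and stack-key encoding are also simplified for illustration):

```python
# Illustrative nearest-neighbor evaluation of a Watson-Crick helix:
# total dG = initiation + sum of stack terms + terminal AU penalties.
# STACK_DG keys are (pair_i, pair_i+1) written 5'->3' on the top strand;
# values here are placeholders, not fitted Turner parameters.
STACK_DG = {("GC", "CG"): -3.4, ("CG", "GC"): -2.4, ("AU", "UA"): -1.1}
INIT_DG, TERMINAL_AU_DG = 4.1, 0.45

def helix_dg(strand5to3):
    """Folding free energy change of a helix given its 5'->3' top strand,
    assuming perfect Watson-Crick pairing with the complementary strand."""
    pair = {"A": "U", "U": "A", "G": "C", "C": "G"}
    dg = INIT_DG
    for i in range(len(strand5to3) - 1):
        top = strand5to3[i] + pair[strand5to3[i]]
        bottom = strand5to3[i + 1] + pair[strand5to3[i + 1]]
        dg += STACK_DG.get((top, bottom), 0.0)  # unknown stacks contribute 0 here
    for end in (strand5to3[0], strand5to3[-1]):
        if end in "AU":
            dg += TERMINAL_AU_DG
    return dg
```

    The paper's contribution is to fit such parameters directly to a partition function over an ensemble of conformations rather than to two-state melting analyses.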

  3. Low Reynolds number two-equation modeling of turbulent flows

    NASA Technical Reports Server (NTRS)

    Michelassi, V.; Shih, T.-H.

    1991-01-01

    A k-epsilon model that accounts for viscous and wall effects is presented. The proposed formulation does not contain the local wall distance, which greatly simplifies its application to complex geometries. The formulation is based on an existing k-epsilon model that agrees very well with direct numerical simulation results. The new form is compared with nine different two-equation models and with direct numerical simulation for a fully developed channel flow at Re = 3300. The simple flow configuration allows a comparison free from numerical inaccuracies. The computed results show that few of the considered forms exhibit satisfactory agreement with the channel flow data. The model shows an improvement over the existing formulations.

  4. Evaluation of Modeled and Measured Energy Savings in Existing All Electric Public Housing in the Pacific Northwest

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon, A.; Lubliner, M.; Howard, L.

    2014-04-01

    This project analyzes the cost effectiveness of energy savings measures installed by a large public housing authority in Salishan, a community in Tacoma, Washington. Research focuses on the modeled and measured energy usage of the first six phases of construction, and compares the energy usage of those phases to phase 7. Market-ready energy solutions were also evaluated to improve the efficiency of new and existing (built since 2001) affordable housing in the marine climate of Washington State.

  5. The Collaboration Model and Reading Improvement of High School Students with Learning Disabilities

    ERIC Educational Resources Information Center

    Sacchetto, Jorge A.

    2014-01-01

    In the field of reading research, studies that focus on improving the reading achievement of high school students with learning disabilities are lacking. Although collaborative interventions for elementary age students have been shown to be effective, a gap exists in the current research regarding effective collaborative reading interventions for…

  6. New Research Strengthens Home Visiting Field: The Pew Home Visiting Campaign

    ERIC Educational Resources Information Center

    Doggett, Libby

    2013-01-01

    Extensive research has shown that home visiting parental education programs improve child and family outcomes, and they save money for states and taxpayers. Now, the next generation of research is deepening understanding of those program elements that are essential to success, ways to improve existing models, and factors to consider in tailoring…

  7. Educational Technology Classics: Educational Technology Doesn't Really Exist

    ERIC Educational Resources Information Center

    Silvern, Leonard C.

    2013-01-01

    The improvement of a professional group is due, in part, to its ability for introspection and self-evaluation. This is essentially the process of "analyzing" the profession as it currently is practiced, identifying necessary changes and improvements, and "synthesizing" or creating a new image or model of the profession to…

  8. Improved global prediction of 300 nautical mile mean free air anomalies

    NASA Technical Reports Server (NTRS)

    Cruz, J. Y.

    1982-01-01

    Current procedures used for the global prediction of 300nm mean anomalies starting from known values of 1 deg by 1 deg mean anomalies yield unreasonable results when applied to 300nm blocks that have a rapidly varying gravity anomaly field and contain relatively few observed 60nm blocks. Overall 300nm anomaly prediction is first improved by using area-weighted rather than unweighted averaging of the 25 generated 60nm mean anomalies inside the 300nm block. Prediction over rough 300nm blocks is then improved through the use of fully known 1 deg by 1 deg mean elevations, taking advantage of the correlation that locally exists between 60nm mean anomalies and 60nm mean elevations inside the 300nm block. An improved prediction model that adapts itself to the roughness of the local anomaly field is found to be least squares collocation with systematic parameters, the systematic parameter being the slope b, a type of Bouguer slope expressing the local correlation between 60nm mean anomalies and 60nm mean elevations.

  9. Adaptive estimation of state of charge and capacity with online identified battery model for vanadium redox flow battery

    NASA Astrophysics Data System (ADS)

    Wei, Zhongbao; Tseng, King Jet; Wai, Nyunt; Lim, Tuti Mariana; Skyllas-Kazacos, Maria

    2016-11-01

    Reliable state estimation depends largely on an accurate battery model. However, the parameters of a battery model vary with operating conditions and battery aging. Existing co-estimation methods address this model uncertainty by integrating online model identification with state estimation and have shown improved accuracy; however, cross interference may arise from the integrated framework and compromise numerical stability and accuracy. This paper therefore proposes decoupling model identification from state estimation to eliminate the possibility of cross interference. The model parameters are adapted online with the recursive least squares (RLS) method, based on which a novel joint estimator based on the extended Kalman filter (EKF) is formulated to estimate state of charge (SOC) and capacity concurrently. The proposed joint estimator effectively compresses the filter order, leading to substantial improvement in computational efficiency and numerical stability. Lab-scale experiments on a vanadium redox flow battery show that the proposed method is highly accurate, with good robustness to varying operating conditions and battery aging. Compared with existing methods, it is superior in terms of accuracy, convergence speed, and computational cost.
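    The identification half of such a decoupled scheme, the recursive least squares update, can be sketched as a generic RLS step with a forgetting factor (a textbook form, not the paper's exact formulation; the state estimator would consume the adapted parameters separately):

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=0.99):
    """One recursive least squares update of model parameters theta.

    phi: regressor vector, y: measured output, lam: forgetting factor
    that discounts old data so parameters can track aging and drift.
    Returns the updated (theta, P).
    """
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + (phi.T @ P @ phi).item())   # gain vector
    err = y - (phi.T @ theta.reshape(-1, 1)).item()  # prediction error
    theta = theta + k.ravel() * err
    P = (P - k @ phi.T @ P) / lam                    # covariance update
    return theta, P
```

    At each sample the updated `theta` is handed to the state estimator, so identification errors cannot feed back into the parameter update.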

  10. Implementation strategies for collaborative primary care-mental health models.

    PubMed

    Franx, Gerdien; Dixon, Lisa; Wensing, Michel; Pincus, Harold

    2013-09-01

    Extensive research shows that collaborative primary care-mental health models can improve care and outcomes for patients. These programs are currently being implemented throughout the United States and beyond. The purpose of this study is to review the literature and generate an overview of strategies currently used to implement such models in daily practice. Six overlapping strategies to implement collaborative primary care-mental health models were described in 18 selected studies: interactive educational strategies, quality improvement change processes, technological support tools, stakeholder engagement in the design and execution of implementation plans, organizational changes such as expanding the tasks of nurses, and financial strategies such as additional collaboration fees and pay-for-performance incentives. Considering the overwhelming evidence for the effectiveness of primary care-mental health models, good studies of their implementation strategies are lacking. In practice, these strategies are multifaceted and locally defined, as a result of intensive and required stakeholder engagement. Although many barriers still exist, the implementation of collaborative models could succeed in the United States, where new service delivery and payment models, such as the Patient-Centered Medical Home, the Health Home and the Accountable Care Organization, are being promoted.

  11. Drug-drug interaction extraction from the literature using a recursive neural network

    PubMed Central

    Lim, Sangrak; Lee, Kyubum

    2018-01-01

    Detecting drug-drug interactions (DDI) is important because information on DDIs can help prevent adverse effects from drug combinations. Since there are many new DDI-related papers published in the biomedical domain, manually extracting DDI information from the literature is a laborious task. However, text mining can be used to find DDIs in the biomedical literature. Among the recently developed neural networks, we use a Recursive Neural Network to improve the performance of DDI extraction. Our recursive neural network model uses a position feature, a subtree containment feature, and an ensemble method to improve the performance of DDI extraction. Compared with the state-of-the-art models, the DDI detection and type classifiers of our model performed 4.4% and 2.8% better, respectively, on the DDIExtraction Challenge’13 test data. We also validated our model on the PK DDI corpus that consists of two types of DDIs data: in vivo DDI and in vitro DDI. Compared with the existing model, our detection classifier performed 2.3% and 6.7% better on in vivo and in vitro data respectively. The results of our validation demonstrate that our model can automatically extract DDIs better than existing models. PMID:29373599

  12. Risk Classification with an Adaptive Naive Bayes Kernel Machine Model.

    PubMed

    Minnier, Jessica; Yuan, Ming; Liu, Jun S; Cai, Tianxi

    2015-04-22

    Genetic studies of complex traits have uncovered only a small number of risk markers, explaining a small fraction of heritability and adding little improvement to disease risk prediction. Standard single-marker methods may lack power in selecting informative markers or estimating effects, and most existing methods do not account for non-linearity. Identifying markers with weak signals and estimating their joint effects among many non-informative markers remains challenging. One potential approach is to group markers based on biological knowledge such as gene structure; if markers in a group tend to have similar effects, proper use of the group structure can improve power and efficiency in estimation. We propose a two-stage method relating markers to disease risk by taking advantage of known gene-set structures. Imposing a naive Bayes kernel machine (KM) model, we estimate gene-set-specific risk models that relate each gene-set to the outcome in stage I. The KM framework efficiently models potentially non-linear effects of predictors without requiring explicit specification of functional forms. In stage II, we aggregate information across gene-sets via a regularization procedure. Estimation and computational efficiency are further improved with kernel principal component analysis. Asymptotic results for model estimation and gene-set selection are derived, and numerical studies suggest that the proposed procedure could outperform existing procedures for constructing genetic risk models.

  13. Towards the ecotourism: a decision support model for the assessment of sustainability of mountain huts in the Alps.

    PubMed

    Stubelj Ars, Mojca; Bohanec, Marko

    2010-12-01

    This paper studies mountain hut infrastructure in the Alps as an important element of ecotourism in the Alpine region. To improve the decision-making process regarding the implementation of future infrastructure and improvement of existing infrastructure in the vulnerable natural environment of mountain ecosystems, a new decision support model has been developed. The methodology is based on qualitative multi-attribute modelling supported by the DEXi software. The integrated rule-based model is hierarchical and consists of two submodels that cover the infrastructure of the mountain huts and that of the huts' surroundings. The final goal for the designed tool is to help minimize the ecological footprint of tourists in environmentally sensitive and undeveloped mountain areas and contribute to mountain ecotourism. The model has been tested in the case study of four mountain huts in Triglav National Park in Slovenia. Study findings provide a new empirical approach to evaluating existing mountain infrastructure and predicting improvements for the future. The assessment results are of particular interest for decision makers in protected areas, such as Alpine national parks managers and administrators. In a way, this model proposes an approach to the management assessment of mountain huts with the main aim of increasing the quality of life of mountain environment visitors as well as the satisfaction of tourists who may eventually become ecotourists.

  14. Holographic heat engine within the framework of massive gravity

    NASA Astrophysics Data System (ADS)

    Mo, Jie-Xiong; Li, Gu-Qiang

    2018-05-01

    Heat engine models are constructed within the framework of massive gravity in this paper. For four-dimensional charged black holes in massive gravity, it is shown that the existence of graviton mass improves the heat engine efficiency significantly. The situation is more complicated for five-dimensional neutral black holes, since the constant corresponding to the third massive potential also contributes to the efficiency; nevertheless, the graviton mass is again shown to improve the efficiency. Moreover, we probe how massive gravity influences the behavior of the heat engine efficiency approaching the Carnot efficiency.

  15. Air Quality Modeling | Air Quality Planning & Standards | US ...

    EPA Pesticide Factsheets

    2016-06-08

    The basic mission of the Office of Air Quality Planning and Standards is to preserve and improve the quality of our nation's air. One facet of accomplishing this goal requires that new and existing air pollution sources be modeled for compliance with the National Ambient Air Quality Standards (NAAQS).

  16. Economic Benefits of Predictive Models for Pest Control in Agricultural Crops

    USDA-ARS?s Scientific Manuscript database

    Various forms of crop models or decision making tools for managing crops have existed for many years. The potential advantage of all of these decision making tools is that more informed and economically improved crop management or decision making is accomplished. However, examination of some of thes...

  17. Epidemic models with an infected-infectious period

    NASA Astrophysics Data System (ADS)

    Méndez, Vicenç

    1998-03-01

    The effect of introducing an infected-infectious period on the geographic spread of epidemics is considered in two different models. The classical evolution equations from the literature are generalized and the existence of epidemic wave fronts is revisited. The asymptotic speed is obtained and improves previous results for the Black Death plague.

  18. Global scale analysis and evaluation of an improved mechanistic representation of plant nitrogen and carbon dynamics in the Community Land Model (CLM)

    NASA Astrophysics Data System (ADS)

    Ghimire, B.; Riley, W. J.; Koven, C. D.; Randerson, J. T.; Mu, M.; Kattge, J.; Rogers, A.; Reich, P. B.

    2014-12-01

    In many ecosystems, nitrogen is the most limiting nutrient for plant growth and productivity. However, mechanistic representation of nitrogen uptake linked to root traits, and of functional nitrogen allocation among the leaf enzymes involved in respiration and photosynthesis, is currently lacking in Earth System models. The linkage between nitrogen availability and plant productivity is represented simplistically by potential photosynthesis rates, which are subsequently downregulated depending on nitrogen supply and other nitrogen consumers in the model (e.g., nitrification). This potential-rate calculation is problematic for several reasons. First, plants do not photosynthesize at potential rates and then downregulate. Second, there is considerable subjectivity in the meaning of potential photosynthesis rates. Third, it is unclear how to model these potential rates in a changing climate. Beyond these structural issues, the role of plant roots in nutrient acquisition has been largely ignored in Earth System models; in CLM4.5, for example, nitrogen uptake is linked to leaf-level processes (primarily productivity) rather than to the root-scale processes involved in uptake. We present a new plant model for CLM with an improved mechanistic representation of plant nitrogen uptake based on root-scale Michaelis-Menten kinetics, and stronger linkages between leaf nitrogen and plant productivity inferred from relationships observed in global databases of plant traits (including the TRY database and several individual studies). We also incorporate an improved representation of leaf nitrogen allocation, especially in tropical regions where CLM4.5 simulations significantly over-predict plant growth and productivity. We evaluate the improved global model simulations using the International Land Model Benchmarking (ILAMB) framework. We conclude that mechanistic representation of leaf-level nitrogen allocation and a theoretically consistent treatment of competition with belowground consumers lead to overall improvements in CLM4.5's global carbon cycling predictions.
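    The root-scale Michaelis-Menten uptake mentioned above has the familiar saturating form, uptake = Vmax·[N]/(Km + [N]), scaled here by root biomass. A sketch with illustrative parameters (not CLM's calibrated values):

```python
def nitrogen_uptake(n_conc, root_biomass, vmax=1.0, km=0.5):
    """Root nitrogen uptake via Michaelis-Menten kinetics.

    n_conc: soil mineral nitrogen concentration, root_biomass: root mass
    doing the uptake, vmax: maximum uptake rate per unit root, km:
    half-saturation constant. Uptake rises nearly linearly at low [N]
    and saturates at root_biomass * vmax for high [N].
    """
    return root_biomass * vmax * n_conc / (km + n_conc)
```

    Tying uptake to root biomass and soil concentration in this way, rather than to leaf-level potential photosynthesis, is the structural change the abstract argues for.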

  19. The effect of the Mihalas, Hummer, and Daeppen equation of state and the molecular opacity on the standard solar model

    NASA Technical Reports Server (NTRS)

    Kim, Y.-C.; Demarque, P.; Guenther, D. B.

    1991-01-01

    Improvements to the Yale Rotating Stellar Evolution Code (YREC) have been made by incorporating the Mihalas-Hummer-Daeppen equation of state, an improved opacity interpolation routine, and the effects of molecular opacities calculated at Los Alamos. The effect of each improvement on the standard solar model has been tested independently by computing the corresponding solar nonradial oscillation frequencies. According to these tests, the Mihalas-Hummer-Daeppen equation of state has very little effect on the model's low-l p-mode oscillation spectrum compared to the model using the existing analytical equation of state implemented in YREC. The molecular opacity, on the other hand, does improve the model's oscillation spectrum, and its effect on the computed solar oscillation frequencies is much larger than that of the Mihalas-Hummer-Daeppen equation of state. Together, the two improvements to the physics reduce the discrepancy with observations by 10 microHz for the low-l modes.

  20. Modeling and prediction of ionospheric scintillation

    NASA Technical Reports Server (NTRS)

    Fremouw, E. J.

    1974-01-01

    Scintillation modeling performed thus far is based on the theory of diffraction by a weakly modulating phase screen developed by Briggs and Parkin (1963). Shortcomings of the existing empirical model for the scintillation index are discussed together with questions of channel modeling, giving attention to the needs of the communication engineers. It is pointed out that much improved scintillation index models may be available in a matter of a year or so.

  1. Predictive control strategy of a gas turbine for improvement of combined cycle power plant dynamic performance and efficiency.

    PubMed

    Mohamed, Omar; Wang, Jihong; Khalil, Ashraf; Limhabrash, Marwan

    2016-01-01

    This paper presents a novel strategy for implementing model predictive control (MPC) on a large gas turbine power plant, as part of ongoing research to improve plant thermal efficiency and load-frequency control performance. A generalized state space model for a large gas turbine covering the whole steady operational range is designed with the subspace identification method, using closed-loop data as input to the identification algorithm. The model is then used in developing an MPC that is integrated into the plant's existing control strategy. The principle is to feed the reference signals of the pilot valve, natural gas valve, and compressor pressure ratio controller with the optimized decisions given by the MPC, instead of applying the control signals directly. If the set points for the compressor controller and turbine valves are sent in a timely manner, more kinetic energy is available in the plant, yielding faster output responses and improved overall system efficiency. Simulation results illustrate the feasibility of the proposed application, which achieves significant improvement in frequency variations and load-following capability, translating into an improvement in overall combined cycle thermal efficiency of around 1.1% compared with the existing strategy.
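    The receding-horizon principle behind MPC, optimizing a control sequence over a finite horizon but applying only the first move, can be sketched for a toy scalar plant. The model, horizon, candidate set, and weights below are illustrative, not the identified turbine model from the paper:

```python
import itertools

def mpc_step(x, x_ref, a=0.9, b=0.5, horizon=3,
             candidates=(-1.0, 0.0, 1.0), r=0.1):
    """One receding-horizon step for a scalar plant x+ = a*x + b*u.

    Enumerates candidate control sequences over the horizon, scores each
    by tracking error plus a control-effort penalty, and returns only the
    first move of the best sequence (the MPC principle). Real MPC solves
    this optimization with a QP rather than enumeration.
    """
    best_u, best_cost = 0.0, float("inf")
    for seq in itertools.product(candidates, repeat=horizon):
        xi, cost = x, 0.0
        for u in seq:
            xi = a * xi + b * u                       # simulate the model
            cost += (xi - x_ref) ** 2 + r * u ** 2    # stage cost
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u
```

    At the next sample the measured state replaces the prediction and the optimization repeats, which is what lets MPC anticipate set-point changes rather than merely react to them.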

  2. A Plan for Academic Biobank Solvency-Leveraging Resources and Applying Business Processes to Improve Sustainability.

    PubMed

    Uzarski, Diane; Burke, James; Turner, Barbara; Vroom, James; Short, Nancy

    2015-10-01

    Researcher-initiated biobanks based at academic institutions contribute valuable biomarker and translational research advances to medicine. With many legacy banks once supported by federal funding, reductions in fiscal support threaten the future of existing and new biobanks. When the Brain Bank at Duke University's Bryan Alzheimer's Disease Center (ADRC) faced a funding crisis, a collaborative, multidisciplinary team embarked on a 2-year biobank sustainability project utilizing a comprehensive business strategy, dedicated project management, and a systems approach involving many Duke University entities. By synthesizing and applying existing knowledge, Duke Translational Medicine Institute created and launched a business model that can be adjusted and applied to legacy and start-up academic biobanks. This model provides a path to identify new funding mechanisms, while also emphasizing improved communication, business development, and a focus on collaborating with industry to improve access to biospecimens. Benchmarks for short-term Brain Bank stabilization have been successfully attained, and the evaluation of long-term sustainability metrics is ongoing. © 2015 Wiley Periodicals, Inc.

  3. A Plan for Academic Biobank Solvency—Leveraging Resources and Applying Business Processes to Improve Sustainability

    PubMed Central

    Burke, James; Turner, Barbara; Vroom, James; Short, Nancy

    2015-01-01

    Researcher‐initiated biobanks based at academic institutions contribute valuable biomarker and translational research advances to medicine. With many legacy banks once supported by federal funding, reductions in fiscal support threaten the future of existing and new biobanks. When the Brain Bank at Duke University's Bryan Alzheimer's Disease Center (ADRC) faced a funding crisis, a collaborative, multidisciplinary team embarked on a 2‐year biobank sustainability project utilizing a comprehensive business strategy, dedicated project management, and a systems approach involving many Duke University entities. By synthesizing and applying existing knowledge, Duke Translational Medicine Institute created and launched a business model that can be adjusted and applied to legacy and start‐up academic biobanks. This model provides a path to identify new funding mechanisms, while also emphasizing improved communication, business development, and a focus on collaborating with industry to improve access to biospecimens. Benchmarks for short‐term Brain Bank stabilization have been successfully attained, and the evaluation of long‐term sustainability metrics is ongoing. PMID:25996355

  4. Multiple R&D projects scheduling optimization with improved particle swarm algorithm.

    PubMed

    Liu, Mengqi; Shan, Miyuan; Wu, Juan

    2014-01-01

    For most enterprises, a key step in winning the initiative in fierce market competition is to improve their R&D ability so as to meet the various demands of customers more quickly and at lower cost. This paper discusses the features of multiple R&D project environments in large make-to-order enterprises under constrained human resources and budget, and puts forward a multi-project scheduling model for a given period. Furthermore, we make some improvements to the existing particle swarm algorithm and apply the improved version to the resource-constrained multi-project scheduling model in a simulation experiment. The experiment demonstrates both the feasibility of the model and the validity of the algorithm.

  5. Io's Magnetospheric Interaction: An MHD Model with Day-Night Asymmetry

    NASA Technical Reports Server (NTRS)

    Kabin, K.; Combi, M. R.; Gombosi, T. I.; DeZeeuw, D. L.; Hansen, K. C.; Powell, K. G.

    2001-01-01

    In this paper we present the results of an improved three-dimensional MHD model for Io's interaction with Jupiter's magnetosphere. We have included day-night asymmetry in the spatial distribution of our mass loading, which allowed us to reproduce several smaller features of the Galileo December 1995 data set. The calculation is performed using our newly modified description of the pick-up processes, which accounts for the effects of the corotational electric field existing in the Jovian magnetosphere. This change in the formulation of the source terms for the MHD equations resulted in significant improvements in the comparison with the Galileo measurements. We briefly discuss the limitations of our model and possible future improvements.

  6. Mainstreaming implementation science into immunization systems in the decade of vaccines: a programmatic imperative for the African Region.

    PubMed

    Adamu, Abdu A; Adamu, Aishatu L; Dahiru, Abdulkarim I; Uthman, Olalekan A; Wiysonge, Charles S

    2018-05-17

    Several innovations that can improve immunization systems already exist. Some interventions target service consumers within communities to raise awareness, build trust, improve understanding, remind caregivers, reward service users, and improve communication. Other interventions target health facilities to improve access and quality of vaccination services among others. Despite available empirical evidence, there is a delay in translating innovations into routine practice by immunization programmes. Drawing on an existing implementation science framework, we propose an interactive, and multi-perspective model to improve uptake and utilization of available immunization-related innovations in the African region. It is important to stress that our framework is by no means prescriptive. The key intention is to advocate for the entire immunization system to be viewed as an interconnected system of stakeholders, so as to foster better interaction, and proactive transfer of evidence-based innovation into policy and practice.

  7. Improving smoothing efficiency of rigid conformal polishing tool using time-dependent smoothing evaluation model

    NASA Astrophysics Data System (ADS)

    Song, Chi; Zhang, Xuejun; Zhang, Xin; Hu, Haifei; Zeng, Xuefeng

    2017-06-01

    A rigid conformal (RC) lap can smooth mid-spatial-frequency (MSF) errors, which are naturally smaller than the tool size, while still removing large-scale errors in a short time. However, the RC-lap smoothing efficiency is poorer than expected, and existing smoothing models cannot explicitly specify methods to improve this efficiency. We present an explicit time-dependent smoothing evaluation model that contains specific smoothing parameters directly derived from the parametric smoothing model and the Preston equation. Based on the time-dependent model, we propose a strategy to improve the RC-lap smoothing efficiency, which incorporates the theoretical model, tool optimization, and efficiency limit determination. Two sets of smoothing experiments were performed to demonstrate the smoothing efficiency achieved using the time-dependent smoothing model. A high, theory-like tool influence function and a limiting tool speed of 300 RPM were obtained.

  8. Improving acceptance for Higgs events at CDF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sforza, Federico; /INFN, Pisa

    2008-03-01

    The Standard Model of elementary particles predicts the existence of the Higgs boson as the particle responsible for electroweak symmetry breaking, the process by which fermions and vector bosons acquire mass. The existence of the Higgs boson is one of the most important open questions in present high energy physics research. This work concerns the search for WH associated production at the CDF II experiment (Collider Detector at Fermilab).

  9. Using Item-Type Performance Covariance to Improve the Skill Model of an Existing Tutor

    ERIC Educational Resources Information Center

    Pavlik, Philip I., Jr.; Cen, Hao; Wu, Lili; Koedinger, Kenneth R.

    2008-01-01

    Using data from an existing pre-algebra computer-based tutor, we analyzed the covariance of item-types with the goal of describing a more effective way to assign skill labels to item-types. Analyzing covariance is important because it allows us to place the skills in a related network in which we can identify the role each skill plays in learning…

  10. Health-based risk adjustment: improving the pharmacy-based cost group model by adding diagnostic cost groups.

    PubMed

    Prinsze, Femmeke J; van Vliet, René C J A

    Since 1991, risk-adjusted premium subsidies have existed in the Dutch social health insurance sector, which covered about two-thirds of the population until 2006. In 2002, pharmacy-based cost groups (PCGs) were included in the demographic risk adjustment model, which improved the goodness-of-fit, as measured by the R2, to 11.5%. The model's R2 reached 22.8% in 2004, when inpatient diagnostic information was added in the form of diagnostic cost groups (DCGs). PCGs and DCGs appear to be complementary in their ability to predict future costs. PCGs particularly improve the R2 for outpatient expenses, whereas DCGs improve the R2 for inpatient expenses. In 2006, this system of risk-adjusted premium subsidies was extended to cover the entire population.

  11. Improving LHC searches for dark photons using lepton-jet substructure

    NASA Astrophysics Data System (ADS)

    Barello, G.; Chang, Spencer; Newby, Christopher A.; Ostdiek, Bryan

    2017-03-01

    Collider signals of dark photons are an exciting probe for new gauge forces and are characterized by events with boosted lepton jets. Existing techniques are efficient in searching for muonic lepton jets but due to substantial backgrounds have difficulty constraining lepton jets containing only electrons. This is unfortunate since upcoming intensity frontier experiments are sensitive to dark photon masses which only allow electron decays. Analyzing a recently proposed model of kinetic mixing, with new scalar particles decaying into dark photons, we find that existing techniques for electron jets can be substantially improved. We show that using lepton-jet-substructure variables, in association with a boosted decision tree, improves background rejection, significantly increasing the LHC's reach for dark photons in this region of parameter space.

  12. Assessment and improvement of biotransfer models to cow's milk and beef used in exposure assessment tools for organic pollutants.

    PubMed

    Takaki, Koki; Wade, Andrew J; Collins, Chris D

    2015-11-01

    The aim of this study was to assess and improve the accuracy of biotransfer models for the organic pollutants (PCBs, PCDD/Fs, PBDEs, PFCAs, and pesticides) into cow's milk and beef used in human exposure assessment. Metabolic rate in cattle is known as a key parameter for this biotransfer, however few experimental data and no simulation methods are currently available. In this research, metabolic rate was estimated using existing QSAR biodegradation models of microorganisms (BioWIN) and fish (EPI-HL and IFS-HL). This simulated metabolic rate was then incorporated into the mechanistic cattle biotransfer models (RAIDAR, ACC-HUMAN, OMEGA, and CKow). The goodness of fit tests showed that RAIDAR, ACC-HUMAN, OMEGA model performances were significantly improved using either of the QSARs when comparing the new model outputs to observed data. The CKow model is the only one that separates the processes in the gut and liver. This model showed the lowest residual error of all the models tested when the BioWIN model was used to represent the ruminant metabolic process in the gut and the two fish QSARs were used to represent the metabolic process in the liver. Our testing included EUSES and CalTOX which are KOW-regression models that are widely used in regulatory assessment. New regressions based on the simulated rate of the two metabolic processes are also proposed as an alternative to KOW-regression models for a screening risk assessment. The modified CKow model is more physiologically realistic, but has equivalent usability to existing KOW-regression models for estimating cattle biotransfer of organic pollutants. Copyright © 2015. Published by Elsevier Ltd.

  13. An Evaluation of Understandability of Patient Journey Models in Mental Health.

    PubMed

    Percival, Jennifer; McGregor, Carolyn

    2016-07-28

    There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. 
Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers.

  14. Sawmill simulation and the best opening face system : a user`s guide

    Treesearch

    D. W. Lewis

    1985-01-01

    Computer sawmill simulation models are being used to increase lumber yield and improve management control. Although there are few managers or technical people in the sawmill industry who are not aware of the existence of these models, many do not realize the models' full potential. The first section of this paper describes computerized sawmill simulation models and...

  15. Patient, staff and physician satisfaction: a new model, instrument and their implications.

    PubMed

    York, Anne S; McCarthy, Kim A

    2011-01-01

    Customer satisfaction's importance is well-documented in the marketing literature and is rapidly gaining wide acceptance in the healthcare industry. The purpose of this paper is to introduce a new customer-satisfaction measuring method - Reichheld's ultimate question - and compare it with traditional techniques using data gathered from four healthcare clinics. A new survey method, called the ultimate question, was used to collect patient satisfaction data. It was subsequently compared with the data collected via an existing method. Findings suggest that the ultimate question provides similar ratings to existing models at lower costs. A relatively small sample size may affect the generalizability of the results; it is also possible that potential spill-over effects exist owing to two patient satisfaction surveys administered at the same time. This new ultimate question method greatly improves the process and ease with which hospital or clinic administrators are able to collect patient (as well as staff and physician) satisfaction data in healthcare settings. Also, the feedback gained from this method is actionable and can be used to make strategic improvements that will impact business and ultimately increase profitability. The paper's real value is pinpointing specific quality improvement areas based not just on patient ratings but also physician and staff satisfaction, which often underlie patients' clinical experiences.

  16. Interactive Reliability Model for Whisker-toughened Ceramics

    NASA Technical Reports Server (NTRS)

    Palko, Joseph L.

    1993-01-01

    Wider use of ceramic matrix composites (CMC) will require the development of advanced structural analysis technologies. The use of an interactive model to predict the time-independent reliability of a component subjected to multiaxial loads is discussed. The deterministic, three-parameter Willam-Warnke failure criterion serves as the theoretical basis for the reliability model. The strength parameters defining the model are assumed to be random variables, thereby transforming the deterministic failure criterion into a probabilistic criterion. The ability of the model to account for multiaxial stress states with the same unified theory is an improvement over existing models. The new model was coupled with a public-domain finite element program through an integrated design program. This allows a design engineer to predict the probability of failure of a component. A simple structural problem is analyzed using the new model, and the results are compared to existing models.

  17. A disease management programme for patients with diabetes mellitus is associated with improved quality of care within existing budgets.

    PubMed

    Steuten, L M G; Vrijhoef, H J M; Landewé-Cleuren, S; Schaper, N; Van Merode, G G; Spreeuwenberg, C

    2007-10-01

    To assess the impact of a disease management programme for patients with diabetes mellitus (Type 1 and Type 2) on cost-effectiveness, quality of life and patient self-management. By organizing care in accordance with the principles of disease management, it is aimed to increase quality of care within existing budgets. Single-group, pre-post design with 2-year follow-up in 473 patients. Substantial significant improvements in glycaemic control, health-related quality of life (HRQL) and patient self-management were found. No significant changes were detected in total costs of care. The probability that the disease management programme is cost-effective compared with usual care amounts to 74%, expressed in an average saving of 117 per additional life year at 5% improved HRQL. Introduction of a disease management programme for patients with diabetes is associated with improved intermediate outcomes within existing budgets. Further research should focus on long-term cost-effectiveness, including diabetic complications and mortality, in a controlled setting or by using decision-analytic modelling techniques.

  18. The CEOP Inter-Monsoon Studies (CIMS)

    NASA Technical Reports Server (NTRS)

    Lau, William K. M.

    2003-01-01

    Prediction of climate relies on models, and better model prediction depends on good model physics. Improving model physics requires the maximal utilization of climate data of the past, present and future. CEOP provides the first example of a comprehensive, integrated global and regional data set, consisting of globally gridded data, reference site in-situ observations, model location time series (MOLTS), and integrated satellite data for a two-year period covering two complete annual cycles of 2003-2004. The monsoon regions are the most important socio-economically, in terms of devastation by floods and droughts, and potential impacts from climate change and fluctuations of the hydrologic cycle. Scientifically, they are among the most challenging, because of complex interactions of atmosphere, land and oceans, and of local vs. remote forcings contributing to climate variability and change in the region. Given that many common features and physical teleconnections exist among different monsoon regions, an international research focus on monsoons must be coordinated and sustained. Current models of the monsoon are grossly inadequate for regional predictions. For improvement, models must be confronted with relevant observations, and model physics developers must be made aware of the wealth of information from existing climate data, field measurements, and satellite data that can be used to improve models. Model transferability studies must be conducted. CIMS is a major initiative under CEOP to engage the modeling and observational communities in a coordinated effort to study the monsoons. The objectives of CIMS are (a) to provide a better understanding of fundamental physical processes (diurnal cycle, annual cycle, and intraseasonal oscillations) in monsoon regions around the world and (b) to demonstrate the synergy and utility of CEOP data in providing a pathway for model physics evaluation and improvement. 
In this talk, I will present the basic concepts of CIMS and the key scientific problems facing monsoon climates and provide examples of common monsoon features, and possible monsoon induced teleconnections linking different parts of the world.

  19. Best Practices Article: Gradually Increasing Individuality: Suggestions for Improving Alternative Teacher Education Programs

    ERIC Educational Resources Information Center

    Henning-Smith, Jeff

    2018-01-01

    The purpose of this article was to examine the use of a gradual release of responsibility (GRR) model (Pearson & Gallagher, 1983) embedded in a coteaching framework (Heck & Bacharach, 2016) during the student-teaching portion of an alternative teaching licensure program. The goal was to improve an already existing student-teacher field…

  20. Prioritizing environmental justice and equality: diesel emissions in southern California.

    PubMed

    Marshall, Julian D; Swor, Kathryn R; Nguyen, Nam P

    2014-04-01

    Existing environmental policies aim to reduce emissions but lack standards for addressing environmental justice. Environmental justice research documents disparities in exposure to air pollution; however, little guidance currently exists on how to make improvements or on how specific emission-reduction scenarios would improve or deteriorate environmental justice conditions. Here, we quantify how emission reductions from specific sources would change various measures of environmental equality and justice. We evaluate potential emission reductions for fine diesel particulate matter (DPM) in Southern California for five sources: on-road mobile, off-road mobile, ships, trains, and stationary. Our approach employs state-of-the-science dispersion and exposure models. We compare four environmental goals: impact, efficiency, equality, and justice. Results indicate potential trade-offs among those goals. For example, reductions in train emissions produce the greatest improvements in terms of efficiency, equality, and justice, whereas off-road mobile source reductions can have the greatest total impact. Reductions in on-road emissions produce improvements in impact, equality, and justice, whereas emission reductions from ships would widen existing population inequalities. Results are similar for complex versus simplified exposure analyses. The approach employed here could usefully be applied elsewhere to evaluate opportunities for improving environmental equality and justice in other locations.

  1. Reducing ambulance response times using discrete event simulation.

    PubMed

    Wei Lam, Sean Shao; Zhang, Zhong Cheng; Oh, Hong Choon; Ng, Yih Ying; Wah, Win; Hock Ong, Marcus Eng

    2014-01-01

    The objectives of this study are to develop a discrete-event simulation (DES) model for the Singapore Emergency Medical Services (EMS), and to demonstrate the utility of this DES model for the evaluation of different policy alternatives to improve ambulance response times. A DES model was developed based on retrospective emergency call data over a continuous 6-month period in Singapore. The main outcome measure is the distribution of response times. The secondary outcome measure is ambulance utilization levels based on unit hour utilization (UHU) ratios. The DES model was used to evaluate different policy options in order to improve the response times, while maintaining reasonable fleet utilization. Three policy alternatives looking at the reallocation of ambulances, the addition of new ambulances, and alternative dispatch policies were evaluated. Modifications of dispatch policy combined with the reallocation of existing ambulances were able to achieve response time performance equivalent to that of adding 10 ambulances. The median (90th percentile) response time was 7.08 minutes (12.69 minutes). Overall, this combined strategy managed to narrow the gap between the ideal and existing response time distribution by 11-13%. Furthermore, the median UHU under this combined strategy was 0.324 with an interquartile range (IQR) of 0.047 versus a median utilization of 0.285 (IQR of 0.051) resulting from the introduction of additional ambulances. Response times were shown to be improved via a more effective reallocation of ambulances and dispatch policy. More importantly, the response time improvements were achieved without a reduction in the utilization levels and additional costs associated with the addition of ambulances. We demonstrated the effective use of DES as a versatile platform to model the dynamic system complexities of Singapore's national EMS systems for the evaluation of operational strategies to improve ambulance response times.

  2. Improved design of constrained model predictive tracking control for batch processes against unknown uncertainties.

    PubMed

    Wu, Sheng; Jin, Qibing; Zhang, Ridong; Zhang, Junfeng; Gao, Furong

    2017-07-01

    In this paper, an improved constrained tracking control design is proposed for batch processes under uncertainties. A new process model that facilitates process state and tracking error augmentation with further additional tuning is first proposed. Then a subsequent controller design is formulated using robust stable constrained MPC optimization. Unlike conventional robust model predictive control (MPC), the proposed method enables the controller design to bear more degrees of tuning so that improved tracking control can be acquired, which is very important since uncertainties exist inevitably in practice and cause model/plant mismatches. An injection molding process is introduced to illustrate the effectiveness of the proposed MPC approach in comparison with conventional robust MPC. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  3. Comparison of Different Attitude Correction Models for ZY-3 Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Song, Wenping; Liu, Shijie; Tong, Xiaohua; Niu, Changling; Ye, Zhen; Zhang, Han; Jin, Yanmin

    2018-04-01

    ZY-3 satellite, launched in 2012, is the first civilian high resolution stereo mapping satellite of China. This paper analyzed the positioning errors of ZY-3 satellite imagery and compensated them to improve geo-positioning accuracy using different correction models, including attitude quaternion correction, attitude angle offset correction, and attitude angle linear correction. The experimental results revealed that systematic errors exist in the ZY-3 attitude observations and that positioning accuracy can be improved after attitude correction with the aid of ground control. There is no significant difference between the results of the attitude quaternion correction method and the attitude angle correction method. However, the attitude angle offset correction model produced steadier improvement than the linear correction model when only limited ground control points are available for a single scene.

  4. Implementation of the dynamical system of the deposit and loan growth based on the Lotka-Volterra model and the improved model

    NASA Astrophysics Data System (ADS)

    Fadhlurrahman, Akmal; Sumarti, Novriana

    2016-04-01

    The Lotka-Volterra model is a popular mathematical model based on the ecological relationship between a predator, an organism that eats another organism, and its prey, the organism the predator eats. Predator and prey evolve together: the prey is part of the predator's environment, and the existence of the predator depends on the existence of the prey. As a dynamical system, this model can generate limit cycles, an interesting type of equilibrium that can arise in systems of two or more dimensions. In [1,2], a dynamical system for deposit and loan volumes based on the Lotka-Volterra model was developed. In this paper, we improve the definition of the parameters in the model and then fit the model to banking data from January 2003 to December 2014 covering 4 (four) types of banks. The data are represented in the form of returns in order to obtain data in a periodical-like form. The results show periodicity in the deposit and loan growth data, in line with the paper in [3], which suggests a positive correlation between loan growth and deposit growth, and vice versa.

  5. An Integrated Model for Effective Knowledge Management in Chinese Organizations

    ERIC Educational Resources Information Center

    An, Xiaomi; Deng, Hepu; Wang, Yiwen; Chao, Lemen

    2013-01-01

    Purpose: The purpose of this paper is to provide organizations in the Chinese cultural context with a conceptual model for an integrated adoption of existing knowledge management (KM) methods and to improve the effectiveness of their KM activities. Design/methodology/approaches: A comparative analysis is conducted between China and the western…

  6. Estimating effective roughness parameters of the L-MEB model for soil moisture retrieval using passive microwave observations from SMAPVEX12

    USDA-ARS?s Scientific Manuscript database

    Although there have been efforts to improve existing soil moisture retrieval algorithms, the ability to estimate soil moisture from passive microwave observations is still hampered by problems in accurately modeling the observed microwave signal. This paper focuses on the estimation of effective sur...

  7. Signal Partitioning Algorithm for Highly Efficient Gaussian Mixture Modeling in Mass Spectrometry

    PubMed Central

    Polanski, Andrzej; Marczyk, Michal; Pietrowska, Monika; Widlak, Piotr; Polanska, Joanna

    2015-01-01

    Mixture modeling of mass spectra is an approach with many potential applications, including peak detection and quantification, smoothing, de-noising, feature extraction and spectral signal compression. However, existing algorithms do not allow for automated analyses of whole spectra. Therefore, despite the highlighted potential advantages of mixture modeling of mass spectra of peptide/protein mixtures and some preliminary results presented in several papers, the mixture modeling approach has so far not been developed to a stage enabling systematic comparisons with existing software packages for proteomic mass spectra analyses. In this paper we present an efficient algorithm for Gaussian mixture modeling of proteomic mass spectra of different types (e.g., MALDI-ToF profiling, MALDI-IMS). The main idea is automated partitioning of the protein mass spectral signal into fragments. The obtained fragments are separately decomposed into Gaussian mixture models, and the parameters of the fragment mixture models are then aggregated to form the mixture model of the whole spectrum. We compare the elaborated algorithm to existing algorithms for peak detection and demonstrate the improvements in peak detection efficiency obtained by using Gaussian mixture modeling. We also show applications of the elaborated algorithm to real proteomic datasets of low and high resolution. PMID:26230717

  8. Improved water resource management for a highly complex environment using three-dimensional groundwater modelling

    NASA Astrophysics Data System (ADS)

    Moeck, Christian; Affolter, Annette; Radny, Dirk; Dressmann, Horst; Auckenthaler, Adrian; Huggenberger, Peter; Schirmer, Mario

    2018-02-01

    A three-dimensional groundwater model was used to improve water resource management for a study area in north-west Switzerland, where drinking-water production is close to former landfills and industrial areas. To avoid drinking-water contamination, artificial groundwater recharge with surface water is used to create a hydraulic barrier between the contaminated sites and drinking-water extraction wells. The model was used for simulating existing and proposed water management strategies as a tool to ensure the utmost security for drinking water. A systematic evaluation of the flow direction between existing observation points using a developed three-point estimation method for a large number of scenarios was carried out. It is demonstrated that systematically applying the developed methodology helps to identify vulnerable locations which are sensitive to changing boundary conditions such as those arising from changes to artificial groundwater recharge rates. At these locations, additional investigations and protection are required. The presented integrated approach, using the groundwater flow direction between observation points, can be easily transferred to a variety of hydrological settings to systematically evaluate groundwater modelling scenarios.
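    The abstract does not spell out the three-point estimation method, but the classic hydrogeological three-point construction fits a plane through the hydraulic heads at three observation wells and takes the groundwater flow direction down-gradient. A minimal sketch, assuming that construction stands in for the paper's method:

```python
import numpy as np

def flow_direction(points):
    """Estimate the horizontal hydraulic-gradient direction from three wells.

    points: three (x, y, head) tuples. Fits the plane h(x, y) = a*x + b*y + c
    through the heads; flow is down-gradient, i.e. along -(a, b). Returns the
    flow azimuth in degrees (0 = +y "north", 90 = +x "east") and the gradient
    magnitude. This is the classic three-point construction, a plausible
    stand-in for the paper's method.
    """
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(3)])
    a, b, _ = np.linalg.solve(A, pts[:, 2])
    azimuth = np.degrees(np.arctan2(-a, -b)) % 360.0
    return azimuth, float(np.hypot(a, b))

# Heads decreasing toward the east: flow should point east (azimuth 90).
az, grad = flow_direction([(0, 0, 10.0), (100, 0, 9.0), (0, 100, 10.0)])
```

    Evaluating such directions between many observation-point triplets, under varying recharge scenarios, is one way to flag locations where the flow direction is sensitive to the boundary conditions.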

  9. Existing and Emerging Payment and Delivery Reforms in Cardiology

    PubMed Central

    Farmer, Steven A.; Darling, Margaret L.; George, Meaghan; Casale, Paul N.; Hagan, Eileen; McClellan, Mark B.

    2017-01-01

    IMPORTANCE Recent health care reforms aim to increase patient access, reduce costs, and improve health care quality as payers turn to payment reform for greater value. Cardiologists need to understand emerging payment models to succeed in the evolving payment landscape. We review existing payment and delivery reforms that affect cardiologists, present 4 emerging examples, and consider their implications for clinical practice. OBSERVATIONS Public and commercial payers have recently implemented payment reforms, and new models are evolving. Most cardiology models are modified fee-for-service or address procedural or episodic care, but population models are also emerging. Although there is widespread agreement that payment reform is needed, existing programs have significant limitations and the adoption of new programs has been slow. New payment reforms address some of these problems, but many details remain undefined. CONCLUSIONS AND RELEVANCE Early payment reforms were voluntary and cardiologists’ participation has been variable. However, conventional fee-for-service will become less viable, and enrollment in new payment models will be unavoidable. Early participation in new payment models will allow clinicians to develop expertise in new care pathways during a period of relatively lower risk. PMID:27851858

  10. Towards an Enhancement of Organizational Information Security through Threat Factor Profiling (TFP) Model

    NASA Astrophysics Data System (ADS)

    Sidi, Fatimah; Daud, Maslina; Ahmad, Sabariah; Zainuddin, Naqliyah; Anneisa Abdullah, Syafiqa; Jabar, Marzanah A.; Suriani Affendey, Lilly; Ishak, Iskandar; Sharef, Nurfadhlina Mohd; Zolkepli, Maslina; Nur Majdina Nordin, Fatin; Amat Sejani, Hashimah; Ramadzan Hairani, Saiful

    2017-09-01

    Information security has been identified by organizations as part of internal operations that need to be well implemented and protected. Each day, organizations face a growing number of threats to their networks and services that lead to information security issues. Thus, effective information security management is required to protect their information assets. Threat profiling is a method an organization can use to address these security challenges. It allows analysts to understand and organize intelligence related to threat groups. This paper presents a comparative analysis of existing threat profiling models. It was found that existing threat models were constructed for specific objectives, so each model is limited to certain components or factors, such as assets, threat sources, countermeasures, threat agents, threat outcomes and threat actors. It is suggested that threat profiling can be improved by combining the components found in each existing threat profiling model/framework. The proposed model can be used by an organization to execute a proactive approach to incident management.

  11. Description of Data Acquisition Efforts

    DOT National Transportation Integrated Search

    1999-09-01

    As part of the overall strategy of refining and improving the existing transportation and air-quality modeling framework, the current project focuses extensively on acquiring disaggregate and reliable data for analysis. In this report, we discuss the...

  12. Characterization of Orbital Debris via Hyper-Velocity Ground-Based Tests

    NASA Technical Reports Server (NTRS)

    Cowardin, Heather

    2016-01-01

    The purpose of the DebriSat project is to replicate a hyper-velocity fragmentation event using modern-day spacecraft materials and construction techniques in order to improve the existing DoD and NASA breakup models.

  13. Center for Modeling of Turbulence and Transition: Research Briefs, 1995

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This research brief contains the progress reports of the research staff of the Center for Modeling of Turbulence and Transition (CMOTT) from July 1993 to July 1995. It also constitutes a progress report to the Institute of Computational Mechanics in Propulsion located at the Ohio Aerospace Institute and the Lewis Research Center. CMOTT has been in existence for about four years. In the first three years, its main activities were to develop and validate turbulence and combustion models for propulsion systems, in an effort to remove the deficiencies of existing models. Three workshops on computational turbulence modeling were held at LeRC (1991, 1993, 1994). At present, CMOTT is integrating the CMOTT developed/improved models into CFD tools which can be used by the propulsion systems community. This activity has resulted in an increased collaboration with the Lewis CFD researchers.

  14. Center for modeling of turbulence and transition: Research briefs, 1995

    NASA Astrophysics Data System (ADS)

    1995-10-01

    This research brief contains the progress reports of the research staff of the Center for Modeling of Turbulence and Transition (CMOTT) from July 1993 to July 1995. It also constitutes a progress report to the Institute of Computational Mechanics in Propulsion located at the Ohio Aerospace Institute and the Lewis Research Center. CMOTT has been in existence for about four years. In the first three years, its main activities were to develop and validate turbulence and combustion models for propulsion systems, in an effort to remove the deficiencies of existing models. Three workshops on computational turbulence modeling were held at LeRC (1991, 1993, 1994). At present, CMOTT is integrating the CMOTT developed/improved models into CFD tools which can be used by the propulsion systems community. This activity has resulted in an increased collaboration with the Lewis CFD researchers.

  15. Wind tunnel measurements for dispersion modelling of vehicle wakes

    NASA Astrophysics Data System (ADS)

    Carpentieri, Matteo; Kumar, Prashant; Robins, Alan

    2012-12-01

    Wind tunnel measurements downwind of reduced-scale car models have been made to study the wake regions in detail, test the usefulness of existing vehicle wake models, and draw key information needed for dispersion modelling in vehicle wakes. The experiments simulated a car moving in still air. These aims were addressed through (i) the experimental characterisation of the flow, turbulence and concentration fields in both the near- and far-wake regions, (ii) a preliminary assessment of existing wake models using the experimental database, and (iii) a comparison of previous field measurements in the wake of a real diesel car with the wind tunnel measurements. The experiments highlighted the very large gradients of velocity and concentration that exist, in particular, in the near-wake. Of course, the measured fields are strongly dependent on the geometry of the modelled vehicle, and a generalisation to other vehicles may prove difficult. The methodology applied in the present study, although improvable, could constitute a first step towards the development of mathematical parameterisations. Experimental results were also compared with the estimates from two wake models. It was found that they can adequately describe the far-wake of a vehicle in terms of velocities, but a better characterisation in terms of turbulence and pollutant dispersion is needed. Parameterised models able to predict velocities and concentrations in fine enough detail at the near-wake scale do not exist.

  16. A cross-correlation-based estimate of the galaxy luminosity function

    NASA Astrophysics Data System (ADS)

    van Daalen, Marcel P.; White, Martin

    2018-06-01

    We extend existing methods for using cross-correlations to derive redshift distributions for photometric galaxies, without using photometric redshifts. The model presented in this paper simultaneously yields highly accurate and unbiased redshift distributions and, for the first time, redshift-dependent luminosity functions, using only clustering information and the apparent magnitudes of the galaxies as input. In contrast to many existing techniques for recovering unbiased redshift distributions, the output of our method is not degenerate with the galaxy bias b(z), which is achieved by modelling the shape of the luminosity bias. We successfully apply our method to a mock galaxy survey and discuss improvements to be made before applying our model to real data.
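    Schematically, clustering-based redshift recovery divides the cross-correlation amplitude with spectroscopic reference bins by a model of the bias evolution, which is why modelling the bias is what breaks the b(z) degeneracy. The toy below illustrates the idea under strong simplifying assumptions (a constant matter-clustering amplitude and known bias curves); it is not the estimator of the paper.

```python
import numpy as np

def recover_dndz(w_cross, w_auto_ref, bias_phot):
    """Clustering-based dN/dz estimate (Newman-style, schematic).

    w_cross[i]: cross-correlation of the photometric sample with the
    spectroscopic reference in redshift bin i; w_auto_ref[i]: the reference
    sample's autocorrelation in that bin (removes the reference bias);
    bias_phot[i]: assumed photometric bias evolution b(z). All inputs are
    schematic, assuming the matter-clustering amplitude is constant in z.
    """
    phi = w_cross / (bias_phot * np.sqrt(w_auto_ref))
    phi = np.clip(phi, 0.0, None)
    return phi / phi.sum()

# Toy forward model consistent with the assumptions above.
z = np.linspace(0.1, 1.5, 30)
true_phi = np.exp(-0.5 * ((z - 0.7) / 0.2) ** 2)
truth = true_phi / true_phi.sum()
b_ref = 1.0 + z            # spectroscopic reference bias (toy)
b_phot = 0.8 + 0.5 * z     # photometric bias evolution (toy)
w_auto_ref = b_ref ** 2
w_cross = truth * b_phot * b_ref
recovered = recover_dndz(w_cross, w_auto_ref, b_phot)
```

    With the bias curves known, the recovered distribution matches the input exactly; in practice the bias-shape model and the redshift evolution of matter clustering are where the real work lies.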

  17. Efficient Translation of LTL Formulae into Buchi Automata

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Lerda, Flavio

    2001-01-01

    Model checking is a fully automated technique for checking that a system satisfies a set of required properties. With explicit-state model checkers, properties are typically defined in linear-time temporal logic (LTL), and are translated into Büchi automata in order to be checked. This report presents how we have combined and improved existing techniques to obtain an efficient LTL to Büchi automata translator. In particular, we optimize the core of existing tableau-based approaches to generate significantly smaller automata. Our approach has been implemented and is being released as part of the Java PathFinder software (JPF), an explicit state model checker under development at the NASA Ames Research Center.

  18. Improved tsunami impact assessments: validation, comparison and the integration of hydrodynamic modeling

    NASA Astrophysics Data System (ADS)

    Tarbotton, C.; Walters, R. A.; Goff, J. R.; Dominey-Howes, D.; Turner, I. L.

    2012-12-01

    As communities become increasingly aware of the risks posed by tsunamis, it is important to develop methods for predicting the damage they can cause to the built environment. This will provide the information needed to make informed decisions regarding land-use, building codes, and evacuation. At present, a number of tsunami-building vulnerability assessment models are available; however, the relative infrequency and destructive nature of tsunamis has long made it difficult to obtain the data necessary to adequately validate and compare them. Further complicating matters is that the inundation of a tsunami in the built environment is very difficult to model, as is the response of a building to the hydraulic forces that a tsunami generates. Variations in building design and condition will significantly affect a building's susceptibility to damage. Likewise, factors affecting the flow conditions at a building (e.g. surrounding structures and topography) will greatly affect its exposure. This presents significant challenges for practitioners, as they are often left in the dark on how to use hazard modeling and vulnerability assessment techniques together to conduct the community-scale impact studies required for tsunami planning. This paper presents the results of an in-depth case study of Yuriage, Miyagi Prefecture - a coastal city in Japan that was badly damaged by the 2011 Tohoku tsunami. The aim of the study was twofold: 1) To test and compare existing tsunami vulnerability assessment models and 2) To more effectively utilize hydrodynamic models in the context of tsunami impact studies. Following the 2011 Tohoku event, an unprecedented quantity of field data, imagery and video emerged. Yuriage, in particular, features a comprehensive set of street-level Google Street View imagery, available both before and after the event.
    This has enabled the collection of a large dataset describing the characteristics of the buildings that existed before the event, as well as the damage they subsequently sustained. These data, together with detailed results from hydrodynamic models, have been used to provide the building, damage and hazard data necessary to rigorously test and compare existing vulnerability assessment techniques. The result is a much-improved understanding of the capabilities of existing vulnerability assessment techniques, as well as important improvements to their assessment framework. This provides much-needed guidance to practitioners on how to conduct tsunami impact assessments in the future. Furthermore, the study introduces some new methods of integrating hydrodynamic models into vulnerability assessment models, offering guidance on how to more effectively model tsunami inundation in the built environment.

  19. Using environmental tracer data to identify deep-aquifer, long-term flow patterns and recharge distributions in the Surat Basin, Queensland, Australia

    NASA Astrophysics Data System (ADS)

    Siade, A. J.; Suckow, A. O.; Morris, R.; Raiber, M.; Prommer, H.

    2017-12-01

    The calibration of regional groundwater flow models, including those investigating coal-seam gas (CSG) impacts in the Surat Basin, Australia, is not typically constrained using environmental tracers, although the use of such data can potentially provide significant reductions in predictive uncertainty. These additional sources of information can also improve the conceptualisation of flow systems and the quantification of groundwater fluxes. In this study, new multi-tracer data (14C, 39Ar, 81Kr, and 36Cl) were collected for the eastern recharge areas of the basin and within the deeper Hutton and Precipice Sandstone formations to complement existing environmental tracer data. These data were used to better understand the recharge mechanisms, recharge rates and hydraulic properties associated with the deep aquifer systems of the Surat Basin. Together with newly acquired pressure data documenting the response to the large-scale reinjection of highly treated CSG co-produced water, the environmental tracer data helped to improve the conceptualisation of the aquifer system, forming the basis for a more robust quantification of the long-term impacts of CSG-related activities. An existing regional-scale MODFLOW-USG groundwater flow model of the area was used as the basis for our analysis of existing and new observation data. A variety of surrogate modelling approaches were used to develop simplified models that focussed on the flow and transport behaviour of the deep aquifer systems. These surrogate models were able to represent sub-system behaviour in terms of flow, multi-environmental tracer transport and the observed large-scale hydrogeochemical patterns.
    The incorporation of the environmental tracer data into the modelling framework provides an improved understanding of the flow regimes of the deeper aquifer systems, as well as valuable information on how to reduce uncertainties in hydraulic properties where there are few or no historical observations of hydraulic heads.

  20. Turbulent flow separation in three-dimensional asymmetric diffusers

    NASA Astrophysics Data System (ADS)

    Jeyapaul, Elbert

    2011-12-01

    Turbulent three-dimensional flow separation is more complicated than in 2-D, and the physics of the flow is not well understood. Turbulent flow separation is nearly independent of the Reynolds number, and separation in 3-D occurs at singular points and along convergence lines emanating from these points. Most engineering turbulence research is driven by the need to gain knowledge of the flow field that can be used to improve modeling predictions. This work is motivated by the need for a detailed study of 3-D separation in asymmetric diffusers: to understand the separation phenomena using eddy-resolving simulation methods, assess the predictability of existing RANS turbulence models and propose modeling improvements. The Cherry diffuser has been used as a benchmark. All existing linear eddy-viscosity RANS models (k-ω SST, k-ε and v2-f) fail in predicting such flows, placing the separation on the wrong side. The geometry has a doubly-sloped wall, with the other two walls orthogonal to each other and aligned with the diffuser inlet, giving the diffuser its asymmetry. The top and side flare angles are different, which gives rise to a different pressure gradient in each transverse direction. Eddy-resolving simulations using the scale-adaptive simulation (SAS) and large eddy simulation (LES) methods have been used to predict separation in the benchmark diffuser and validated. A series of diffusers with the same configuration has been generated, each having the same streamwise pressure gradient and parametrized only by the inlet aspect ratio. The RANS models were put to the test and the flow physics explored using the SAS-generated flow field. The RANS models indicate a transition of the separation surface from the top sloped wall to the side sloped wall at an inlet aspect ratio much lower than observed in the LES results. This over-sensitivity of RANS models to transverse pressure gradients is due to the lack of anisotropy in the linear Reynolds stress formulation.
    The complexity of the flow separation is due to the effects of lateral straining, streamline curvature, secondary flow of the second kind and transverse pressure gradients on turbulence. Resolving these effects is possible with anisotropic turbulence models such as the explicit algebraic Reynolds stress model (EARSM). This model has provided accurate predictions of the streamwise and transverse velocities; however, the wall pressure is underpredicted. An improved EARSM model is developed by correcting the coefficients, which predicts a more accurate wall pressure. There remains scope for improving this model by including convective effects and the dynamics of velocity gradient invariants.

  1. Modeling nanomaterial environmental fate in aquatic systems.

    PubMed

    Dale, Amy L; Casman, Elizabeth A; Lowry, Gregory V; Lead, Jamie R; Viparelli, Enrica; Baalousha, Mohammed

    2015-03-03

    Mathematical models improve our fundamental understanding of the environmental behavior, fate, and transport of engineered nanomaterials (NMs, chemical substances or materials roughly 1-100 nm in size) and facilitate risk assessment and management activities. Although today's large-scale environmental fate models for NMs are a considerable improvement over early efforts, a gap still remains between the experimental research performed to date on the environmental fate of NMs and its incorporation into models. This article provides an introduction to the current state of the science in modeling the fate and behavior of NMs in aquatic environments. We address the strengths and weaknesses of existing fate models, identify the challenges facing researchers in developing and validating these models, and offer a perspective on how these challenges can be addressed through the combined efforts of modelers and experimentalists.

  2. Evaluation of machine learning algorithms for improved risk assessment for Down's syndrome.

    PubMed

    Koivu, Aki; Korpimäki, Teemu; Kivelä, Petri; Pahikkala, Tapio; Sairanen, Mikko

    2018-05-04

    Prenatal screening generates a great amount of data that is used for predicting the risk of various disorders. Prenatal risk assessment is based on multiple clinical variables, and overall performance is defined by how well the risk algorithm is optimized for the population in question. This article evaluates machine learning algorithms for improving the performance of first-trimester screening for Down syndrome. Machine learning algorithms pose an adaptive alternative for developing better risk assessment models using the existing clinical variables. Two real-world data sets were used to experiment with multiple classification algorithms. Implemented models were tested with a third real-world data set and performance was compared to a predicate method, a commercial risk assessment software. The best-performing deep neural network model gave an area under the curve of 0.96 and a detection rate of 78% at a 1% false positive rate with the test data. A support vector machine model gave an area under the curve of 0.95 and a detection rate of 61% at a 1% false positive rate with the same test data. When compared with the predicate method, the best support vector machine model was slightly inferior, but an optimized deep neural network model was able to give higher detection rates at the same false positive rate, or a similar detection rate with a markedly lower false positive rate. This finding could further improve first-trimester screening for Down syndrome by using existing clinical variables and large training data derived from a specific population. Copyright © 2018 Elsevier Ltd. All rights reserved.
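    The reported operating points (detection rate at a 1% false-positive rate) can be computed directly from classifier scores. A minimal sketch on synthetic scores, not the study's data:

```python
import numpy as np

def detection_rate_at_fpr(scores, labels, target_fpr=0.01):
    """Detection rate (TPR) at the threshold giving the target FPR.

    scores: higher = more likely affected; labels: 1 = affected.
    Picks the score threshold such that at most target_fpr of the
    unaffected cases exceed it, then reports the fraction of affected
    cases above that threshold.
    """
    scores = np.asarray(scores, float)
    labels = np.asarray(labels)
    neg = np.sort(scores[labels == 0])[::-1]
    k = int(np.floor(target_fpr * neg.size))
    thresh = neg[k] if k < neg.size else -np.inf
    return float(np.mean(scores[labels == 1] > thresh))

rng = np.random.default_rng(0)
# Synthetic risk scores: unaffected ~ N(0,1), affected ~ N(3,1).
scores = np.concatenate([rng.normal(0, 1, 10000), rng.normal(3, 1, 100)])
labels = np.concatenate([np.zeros(10000), np.ones(100)])
dr = detection_rate_at_fpr(scores, labels, 0.01)
```

    Fixing the false-positive rate rather than the threshold is what makes detection rates comparable across models, which is how the abstract compares the neural network, the support vector machine and the predicate method.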

  3. Improving the accuracy of energy baseline models for commercial buildings with occupancy data

    DOE PAGES

    Liang, Xin; Hong, Tianzhen; Shen, Geoffrey Qiping

    2016-07-07

    More than 80% of energy is consumed during the operation phase of a building's life cycle, so energy efficiency retrofit for existing buildings is considered a promising way to reduce energy use in buildings. The investment strategies of retrofit depend on the ability to quantify energy savings by “measurement and verification” (M&V), which compares actual energy consumption to how much energy would have been used without the retrofit (called the “baseline” of energy use). Although numerous models exist for predicting the baseline of energy use, a critical limitation is that occupancy has not been included as a variable. However, occupancy rate is essential for energy consumption and was emphasized by previous studies. This study develops a new baseline model which is built upon the Lawrence Berkeley National Laboratory (LBNL) model but includes the use of building occupancy data. The study also proposes metrics to quantify the accuracy of prediction and the impacts of variables. However, the results show that including occupancy data does not significantly improve the accuracy of the baseline model, especially for HVAC load. The reasons are discussed further. In addition, sensitivity analysis is conducted to show the influence of parameters in baseline models. To conclude, the results from this study can help us understand the influence of occupancy on energy use, improve energy baseline prediction by including the occupancy factor, reduce risks of M&V and facilitate investment strategies of energy efficiency retrofit.
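    The core of a baseline-model comparison like this one can be sketched with ordinary least squares: fit a weather-only baseline and a weather-plus-occupancy baseline on the same data and compare errors. The model form and numbers below are illustrative, not the LBNL baseline model:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
temp = rng.uniform(10, 35, n)        # outdoor temperature, deg C
occ = rng.uniform(0.2, 1.0, n)       # occupancy fraction
# Synthetic hourly load dominated by weather, with a small occupancy term.
cdd = np.maximum(temp - 18.0, 0.0)   # cooling-degree driver
load = 50 + 3.0 * cdd + 5.0 * occ + rng.normal(0, 4, n)

def fit_rmse(X, y):
    """OLS baseline fit; returns the in-sample RMSE."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return float(np.sqrt(np.mean((y - X1 @ beta) ** 2)))

rmse_weather = fit_rmse(cdd[:, None], load)
rmse_with_occ = fit_rmse(np.column_stack([cdd, occ]), load)
```

    Adding the occupancy regressor can only reduce in-sample error; the study's point is that on real commercial-building data the reduction can be too small to matter, especially for HVAC load, so out-of-sample M&V metrics are the fairer comparison.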

  4. Federating Cyber and Physical Models for Event-Driven Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Pawlowski, Ronald A.; Sridhar, Siddharth

    The purpose of this paper is to describe a novel method to improve the interoperability of electric power system monitoring and control software applications. This method employs the concept of federation, defined as the use of existing models that represent aspects of a system in specific domains (such as the physical and cyber security domains) and the building of interfaces to link all of the domain models.

  5. The application of feature selection to the development of Gaussian process models for percutaneous absorption.

    PubMed

    Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P

    2010-06-01

    The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset, to employ these methods, particularly feature selection, to determine the key physicochemical descriptors that exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance determination (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures was used to determine model quality. The inherently nonlinear nature of the skin dataset was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information.
However, it was also shown that significant synergy existed between certain parameters, and as such it was possible to interchange certain descriptors (i.e. molecular weight and melting point) without incurring a loss of model quality. Such synergy suggested that a model constructed from discrete terms in an equation may not be the most appropriate way of representing mechanistic understandings of skin absorption.
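    Automatic relevance determination works by giving each descriptor its own kernel length scale; a descriptor whose fitted length scale grows very large contributes nothing to the covariance and is flagged as irrelevant. A numpy-only sketch of the underlying marginal-likelihood comparison (not the GPRARD implementation used in the paper), on synthetic data where only the first of two descriptors matters:

```python
import numpy as np

def log_marginal_likelihood(X, y, length_scales, noise=0.1):
    """GP log marginal likelihood with an ARD squared-exponential kernel.

    One length scale per descriptor; a very large length scale makes
    that descriptor effectively irrelevant to the kernel.
    """
    Z = X / np.asarray(length_scales, float)
    sq = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    K = np.exp(-0.5 * sq) + noise**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return float(-0.5 * y @ alpha - np.log(np.diag(L)).sum()
                 - 0.5 * len(y) * np.log(2 * np.pi))

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(60, 2))
y = np.sin(X[:, 0])   # the target depends on descriptor 0 only

lml_iso = log_marginal_likelihood(X, y, [1.0, 1.0])
lml_ard = log_marginal_likelihood(X, y, [1.0, 100.0])  # descriptor 1 "switched off"
```

    In a full ARD fit the length scales are optimized by maximizing this marginal likelihood; here the comparison simply shows that switching the irrelevant descriptor off yields the higher evidence, which is the signal feature selection exploits.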

  6. Prediction of total organic carbon content in shale reservoir based on a new integrated hybrid neural network and conventional well logging curves

    NASA Astrophysics Data System (ADS)

    Zhu, Linqi; Zhang, Chong; Zhang, Chaomo; Wei, Yang; Zhou, Xueqing; Cheng, Yuan; Huang, Yuyang; Zhang, Le

    2018-06-01

    There is increasing interest in shale gas reservoirs due to their abundant reserves. As a key evaluation criterion, the total organic carbon content (TOC) of a reservoir reflects its hydrocarbon generation potential. Existing TOC calculation models are not very accurate, and there is still room for improvement. In this paper, an integrated hybrid neural network (IHNN) model is proposed for predicting the TOC. It is motivated by the fact that TOC information from low-TOC reservoirs, where the TOC is easy to evaluate, dominates the prediction problem, which is an inherent limitation of the existing algorithms. By comparing the prediction models established on 132 rock samples from the shale gas reservoir within the Jiaoshiba area, it can be seen that the accuracy of the proposed IHNN model is much higher than that of the other prediction models. The mean square error on samples not used to establish the models was reduced from 0.586 to 0.442. The results show that TOC prediction is easier once the logging-based prediction has been improved. Furthermore, this paper puts forward the next research direction for the prediction model. The IHNN algorithm can help evaluate the TOC of shale gas reservoirs.

  7. Launch and Landing Effects Ground Operations (LLEGO) Model

    NASA Technical Reports Server (NTRS)

    2008-01-01

    LLEGO is a model for understanding recurring launch and landing operations costs at Kennedy Space Center for human space flight. Launch and landing operations are often referred to as ground processing, or ground operations. Currently, this function is specific to the ground operations for the Space Shuttle Space Transportation System within the Space Shuttle Program. The Constellation system to follow the Space Shuttle consists of the crewed Orion spacecraft atop an Ares I launch vehicle and the uncrewed Ares V cargo launch vehicle. The Constellation flight and ground systems build upon many elements of the existing Shuttle flight and ground hardware, as well as upon existing organizations and processes. In turn, the LLEGO model builds upon past ground operations research, modeling, data, and experience in estimating for future programs. Rather than simply providing estimates, the LLEGO model's main purpose is to improve expense estimates by relating complex relationships among functions (ground operations contractor, subcontractors, civil service technical, center management, operations, etc.) to tangible drivers. Drivers include flight system complexity and reliability, as well as operations and supply chain management processes and technology. Together these factors define the operability and potential improvements for any future system, from the most direct to the least direct expenses.

  8. Potential Improvements to Remote Primary Productivity Estimation in the Southern California Current System

    NASA Astrophysics Data System (ADS)

    Jacox, M.; Edwards, C. A.; Kahru, M.; Rudnick, D. L.; Kudela, R. M.

    2012-12-01

    A 26-year record of depth integrated primary productivity (PP) in the Southern California Current System (SCCS) is analyzed with the goal of improving satellite net primary productivity (PP) estimates. The ratio of integrated primary productivity to surface chlorophyll correlates strongly to surface chlorophyll concentration (chl0). However, chl0 does not correlate to chlorophyll-specific productivity, and appears to be a proxy for vertical phytoplankton distribution rather than phytoplankton physiology. Modest improvements in PP model performance are achieved by tuning existing algorithms for the SCCS, particularly by empirical parameterization of photosynthetic efficiency in the Vertically Generalized Production Model. Much larger improvements are enabled by improving accuracy of subsurface chlorophyll and light profiles. In a simple vertically resolved production model, substitution of in situ surface data for remote sensing estimates offers only marginal improvements in model r2 and total log10 root mean squared difference, while inclusion of in situ chlorophyll and light profiles improves these metrics significantly. Autonomous underwater gliders, capable of measuring subsurface fluorescence on long-term, long-range deployments, significantly improve PP model fidelity in the SCCS. We suggest their use (and that of other autonomous profilers such as Argo floats) in conjunction with satellites as a way forward for improved PP estimation in coastal upwelling systems.

  9. The potential for improving remote primary productivity estimates through subsurface chlorophyll and irradiance measurement

    NASA Astrophysics Data System (ADS)

    Jacox, Michael G.; Edwards, Christopher A.; Kahru, Mati; Rudnick, Daniel L.; Kudela, Raphael M.

    2015-02-01

    A 26-year record of depth-integrated primary productivity (PP) in the Southern California Current System (SCCS) is analyzed with the goal of improving satellite net primary productivity estimates. Modest improvements in PP model performance are achieved by tuning existing algorithms for the SCCS, particularly by parameterizing the carbon fixation rate in the vertically generalized production model as a function of surface chlorophyll concentration and distance from shore. Much larger improvements are enabled by improving the accuracy of subsurface chlorophyll and light profiles. In a simple vertically resolved production model for the SCCS (VRPM-SC), substitution of in situ surface data for remote sensing estimates offers only marginal improvements in model r2 (from 0.54 to 0.56) and total log10 root mean squared difference (from 0.22 to 0.21), while inclusion of in situ chlorophyll and light profiles improves these metrics to 0.77 and 0.15, respectively. Autonomous underwater gliders, capable of measuring subsurface properties on long-term, long-range deployments, significantly improve PP model fidelity in the SCCS. We suggest their use (and that of other autonomous profilers such as Argo floats) in conjunction with satellites as a way forward for large-scale improvements in PP estimation.
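The vertically generalized production model (VGPM) tuned in both of the studies above has a standard closed form (Behrenfeld and Falkowski, 1997). The tuned SCCS coefficients are not reproduced in these abstracts, so this sketch uses the canonical VGPM constants; the function name and example values are illustrative.

```python
def vgpm_pp(chl0, pb_opt, e0, z_eu, day_length):
    """Depth-integrated primary productivity (mg C m^-2 d^-1), standard VGPM.

    chl0       : surface chlorophyll concentration (mg m^-3)
    pb_opt     : maximum chlorophyll-specific carbon fixation rate (mg C mg Chl^-1 h^-1)
    e0         : daily surface PAR (mol quanta m^-2 d^-1)
    z_eu       : euphotic depth (m)
    day_length : photoperiod (h)
    """
    light_term = e0 / (e0 + 4.1)  # saturating dependence on surface irradiance
    return 0.66125 * pb_opt * light_term * chl0 * z_eu * day_length
```

Tuning the model for a region such as the SCCS amounts to replacing `pb_opt` (or the constants) with an empirical regional parameterization, while the vertically resolved alternative replaces the `chl0 * z_eu` product with measured chlorophyll and light profiles.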

  10. Computational studies of horizontal axis wind turbines in high wind speed condition using advanced turbulence models

    NASA Astrophysics Data System (ADS)

    Benjanirat, Sarun

    Next-generation horizontal-axis wind turbines (HAWTs) will operate at very high wind speeds. Existing engineering approaches for modeling the flow phenomena are based on blade element theory, and cannot adequately account for 3-D separated, unsteady flow effects. Therefore, researchers around the world are beginning to model these flows using first-principles-based computational fluid dynamics (CFD) approaches. In this study, an existing first-principles-based Navier-Stokes approach is being enhanced to model HAWTs at high wind speeds. The enhancements include improved grid topology, implicit time-marching algorithms, and advanced turbulence models. The advanced turbulence models include the Spalart-Allmaras one-equation model and the k-epsilon, k-omega and Shear Stress Transport (k-omega SST) models. These models are also integrated with detached eddy simulation (DES) models. Results are presented for a range of wind speeds for a configuration termed the National Renewable Energy Laboratory Phase VI rotor, tested at NASA Ames Research Center. Grid sensitivity studies are also presented. Additionally, the effects of existing transition models on the predictions are assessed. Data presented include power/torque production, radial distribution of normal and tangential pressure forces, root bending moments, and surface pressure fields. Good agreement was obtained between the predictions and experiments for most of the conditions, particularly with the Spalart-Allmaras DES model.

  11. Testing the efficacy of existing force-endurance models to account for the prevalence of obesity in the workforce.

    PubMed

    Pajoutan, Mojdeh; Cavuoto, Lora A; Mehta, Ranjana K

    2017-10-01

    This study evaluates whether existing force-endurance relationship models are predictive of endurance time for overweight and obese individuals and, if not, provides revised models that can be applied in ergonomics practice. Data were collected from 141 participants (49 normal weight, 50 overweight, 42 obese) who each performed isometric endurance tasks of hand grip, shoulder flexion, and trunk extension at four levels of relative workload. Subject-specific fatigue rates and a general model of the force-endurance relationship were determined and compared to two fatigue models from the literature. There was a lack of fit between the previous models and the current data for the grip (ICC = 0.8), with a shift toward lower endurance times in the new data. Application of the revised models can facilitate improved workplace design and job evaluation to accommodate the capacities of the current workforce.
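Force-endurance relationships in this literature are commonly expressed as a power law in relative workload (Rohmert-type models): endurance time falls steeply as %MVC rises. The revised coefficients are not reproduced in the abstract, so the sketch below fits a generic power law to entirely hypothetical grip data at four relative workloads.

```python
import math

def fit_power_law(rel_loads, endurance_times):
    """Fit ET = a * load^(-b) by least squares in log-log space.

    rel_loads       : relative workloads as fractions of MVC (0.3 = 30 %MVC)
    endurance_times : observed endurance times (s)
    """
    xs = [math.log(l) for l in rel_loads]
    ys = [math.log(t) for t in endurance_times]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx                 # log-log slope is -b
    intercept = my - slope * mx       # log-log intercept is ln(a)
    return math.exp(intercept), -slope

# hypothetical grip endurance data at four relative workloads
loads = [0.30, 0.45, 0.60, 0.75]
times = [180.0, 75.0, 40.0, 25.0]
a, b = fit_power_law(loads, times)
predicted = a * 0.5 ** (-b)  # predicted endurance time (s) at 50 %MVC
```

A population-specific revision of such a model would amount to refitting `a` and `b` per BMI group, which is in the spirit of (but not identical to) the revised models the study provides.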

  12. Enhanced model of photovoltaic cell/panel/array considering the direct and reverse modes

    NASA Astrophysics Data System (ADS)

    Zegaoui, Abdallah; Boutoubat, Mohamed; Sawicki, Jean-Paul; Kessaissia, Fatma Zohra; Djahbar, Abdelkader; Aillerie, Michel

    2018-05-01

    This paper presents an improved generalized physical model for photovoltaic (PV) cells, panels and arrays that accounts for the behavior of these devices under both direct and reverse bias. Existing PV physical models are generally very efficient at simulating the influence of irradiation changes on the short-circuit current, but they cannot capture the influence of temperature changes. The Enhanced Direct and Reverse Mode (EDRM) model captures the influence of both temperature and irradiation on the short-circuit current in the reverse mode of the considered PV devices. Owing to its easy implementation, the proposed model can be a useful tool for the development of new photovoltaic systems that accounts more exhaustively for environmental conditions. The developed model was tested on a marketed PV panel and gives satisfactory results compared with the parameters given in the manufacturer's datasheet.

  13. Toward a descriptive model of galactic cosmic rays in the heliosphere

    NASA Technical Reports Server (NTRS)

    Mewaldt, R. A.; Cummings, A. C.; Adams, James H., Jr.; Evenson, Paul; Fillius, W.; Jokipii, J. R.; Mckibben, R. B.; Robinson, Paul A., Jr.

    1988-01-01

    Researchers review the elements that enter into phenomenological models of the composition, energy spectra, and the spatial and temporal variations of galactic cosmic rays, including the so-called anomalous cosmic ray component. Starting from an existing model, designed to describe the behavior of cosmic rays in the near-Earth environment, researchers suggest possible updates and improvements to this model, and then propose a quantitative approach for extending such a model into other regions of the heliosphere.

  14. Using kaizen to improve employee well-being: Results from two organizational intervention studies.

    PubMed

    von Thiele Schwarz, Ulrica; Nielsen, Karina M; Stenfors-Hayes, Terese; Hasson, Henna

    2017-08-01

    Participatory intervention approaches that are embedded in existing organizational structures may improve the efficiency and effectiveness of organizational interventions, but concrete tools are lacking. In the present article, we use a realist evaluation approach to explore the role of kaizen, a lean tool for participatory continuous improvement, in improving employee well-being in two cluster-randomized, controlled participatory intervention studies. Case 1 is from the Danish Postal Service, where kaizen boards were used to implement action plans. The results of multi-group structural equation modeling showed that kaizen served as a mechanism that increased the level of awareness of and capacity to manage psychosocial issues, which, in turn, predicted increased job satisfaction and mental health. Case 2 is from a regional hospital in Sweden that integrated occupational health processes with a pre-existing kaizen system. Multi-group structural equation modeling revealed that, in the intervention group, kaizen work predicted better integration of organizational and employee objectives after 12 months, which, in turn, predicted increased job satisfaction and decreased discomfort at 24 months. The findings suggest that participatory and structured problem-solving approaches that are familiar and visual to employees can facilitate organizational interventions.

  15. Using kaizen to improve employee well-being: Results from two organizational intervention studies

    PubMed Central

    von Thiele Schwarz, Ulrica; Nielsen, Karina M; Stenfors-Hayes, Terese; Hasson, Henna

    2016-01-01

    Participatory intervention approaches that are embedded in existing organizational structures may improve the efficiency and effectiveness of organizational interventions, but concrete tools are lacking. In the present article, we use a realist evaluation approach to explore the role of kaizen, a lean tool for participatory continuous improvement, in improving employee well-being in two cluster-randomized, controlled participatory intervention studies. Case 1 is from the Danish Postal Service, where kaizen boards were used to implement action plans. The results of multi-group structural equation modeling showed that kaizen served as a mechanism that increased the level of awareness of and capacity to manage psychosocial issues, which, in turn, predicted increased job satisfaction and mental health. Case 2 is from a regional hospital in Sweden that integrated occupational health processes with a pre-existing kaizen system. Multi-group structural equation modeling revealed that, in the intervention group, kaizen work predicted better integration of organizational and employee objectives after 12 months, which, in turn, predicted increased job satisfaction and decreased discomfort at 24 months. The findings suggest that participatory and structured problem-solving approaches that are familiar and visual to employees can facilitate organizational interventions. PMID:28736455

  16. Metropolitan Model Deployment Initiative : Seattle evaluation report

    DOT National Transportation Integrated Search

    2000-05-30

    The Washington State Department of Transportation (WSDOT) and others in the public and private sectors are looking to emerging technologies to help improve the performance of the Seattle region's existing transportation system. Their goal is to app...

  17. The Navy Oceanic Vertical Aerosol Model

    DTIC Science & Technology

    1993-12-01

    development of models from the basic research community in the future. Another area of concern is the use of the model in close-in coastal areas. Compensation... "windows" exist in the molecular absorption of the electromagnetic energy through which transmissions in IR communication can take place. In these... commercial market) will greatly improve the overall operation of the model. It will do this in conjunction with the optical visibility by pinning down

  18. Joint Optimization of Vertical Component Gravity and Seismic P-wave First Arrivals by Simulated Annealing

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Basler-Reeder, K.; Kent, G. M.; Pullammanappallil, S. K.

    2015-12-01

    Simultaneous joint seismic-gravity optimization improves P-wave velocity models in areas with sharp lateral velocity contrasts. Optimization is achieved using simulated annealing, a metaheuristic global optimization algorithm that does not require an accurate initial model. Balancing the seismic-gravity objective function is accomplished by a novel approach based on analysis of Pareto charts. Gravity modeling uses a newly developed convolution algorithm, while seismic modeling utilizes the highly efficient Vidale eikonal equation traveltime generation technique. Synthetic tests show that joint optimization improves velocity model accuracy and provides velocity control below the deepest headwave raypath. Detailed first-arrival picking followed by trial velocity modeling remediates inconsistent data. We use a set of highly refined first-arrival picks to compare results of a convergent joint seismic-gravity optimization to the Plotrefa™ and SeisOpt® Pro™ velocity modeling packages. Plotrefa™ uses a nonlinear least squares approach that is initial-model dependent and produces shallow velocity artifacts. SeisOpt® Pro™ utilizes the simulated annealing algorithm and is limited to depths above the deepest raypath. Joint optimization increases the depth of constrained velocities, improving reflector coherency at depth. Kirchhoff prestack depth migrations reveal that joint optimization ameliorates shallow velocity artifacts caused by limitations in refraction ray coverage. Seismic and gravity data from the San Emidio Geothermal field of the northwest Basin and Range province demonstrate that joint optimization changes interpretation outcomes. The prior shallow-valley interpretation gives way to a deep-valley model, while shallow reflectors that could have been interpreted as antiformal folds are flattened. Furthermore, joint optimization provides a clearer image of the range-front fault. This technique can readily be applied to existing datasets and could replace the existing strategy of forward modeling to match gravity data.
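The joint scheme described above can be sketched generically: one energy function weights the seismic and gravity misfits, and simulated annealing accepts uphill moves with a temperature-dependent probability while tracking the best model seen. The forward models below are hypothetical one-line stand-ins, not the convolution gravity or Vidale eikonal solvers the abstract names, and the weight `w` plays the role of the Pareto-chart balance.

```python
import math
import random

def anneal(x0, energy, step, t0=1.0, cooling=0.95, iters=500, seed=1):
    """Minimal simulated annealing; returns the best model encountered."""
    rng = random.Random(seed)
    x, e = list(x0), energy(x0)
    best_x, best_e = list(x), e
    t = t0
    for _ in range(iters):
        cand = step(x, rng)
        ec = energy(cand)
        # Metropolis criterion: always accept downhill, sometimes uphill
        if ec < e or rng.random() < math.exp(-(ec - e) / t):
            x, e = cand, ec
            if e < best_e:
                best_x, best_e = list(x), e
        t *= cooling
    return best_x, best_e

# --- hypothetical stand-ins for the real forward models ---
obs_tt, obs_g = [0.50, 0.34, 0.26], 2.0  # fake travel times (s), gravity value

def seismic_misfit(v):  # placeholder: 1 km of path per layer
    return sum((1.0 / vi - t) ** 2 for vi, t in zip(v, obs_tt))

def gravity_misfit(v):  # placeholder: gravity proxied by mean velocity
    return (sum(v) / len(v) - obs_g) ** 2

def joint_energy(v, w=0.5):  # weighted combination of the two misfits
    return w * seismic_misfit(v) + (1 - w) * gravity_misfit(v)

def perturb(v, rng):
    i = rng.randrange(len(v))
    out = list(v)
    out[i] = max(0.5, out[i] + rng.gauss(0, 0.1))  # keep velocities positive
    return out

v0 = [1.5, 2.5, 3.5]  # crude initial 3-layer velocity model (km/s)
best_v, best_e = anneal(v0, joint_energy, perturb)
```

Because the criterion accepts some uphill moves at high temperature, the search is not dependent on an accurate initial model, which is the property the abstract emphasizes.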

  19. Accurate and dynamic predictive model for better prediction in medicine and healthcare.

    PubMed

    Alanazi, H O; Abdullah, A H; Qureshi, K N; Ismail, A S

    2018-05-01

    Information and communication technologies (ICTs) have introduced new integrated operations and methods in all fields of life. The health sector has also adopted new technologies to improve its systems and provide better services to customers. Predictive models in health care have likewise been influenced by new technologies to predict different disease outcomes. However, existing predictive models still suffer from limitations in predictive performance. To improve predictive performance, this paper proposes a predictive model that classifies disease predictions into different categories. To evaluate model performance, this paper uses traumatic brain injury (TBI) datasets. TBI is one of the most serious diseases worldwide and needs more attention owing to its severe impact on human life. The proposed model improves the predictive performance for TBI. The TBI dataset was developed, and its features approved, by neurologists. The experimental results show that the proposed model achieves significant results in terms of accuracy, sensitivity, and specificity.
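The evaluation metrics named at the end of the abstract are standard confusion-matrix quantities. A minimal sketch, with entirely hypothetical counts for a binary outcome classifier:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity (recall) and specificity from a 2x2 confusion matrix."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)  # true-positive rate
    specificity = tn / (tn + fp)  # true-negative rate
    return accuracy, sensitivity, specificity

# hypothetical counts for a binary TBI-outcome classifier
acc, sens, spec = diagnostic_metrics(tp=80, fp=10, tn=90, fn=20)
```

Reporting all three together matters because a model can score high accuracy on an imbalanced outcome while having poor sensitivity for the rare class.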

  20. A Novel Evaluation Model for the Vehicle Navigation Device Market Using Hybrid MCDM Techniques

    NASA Astrophysics Data System (ADS)

    Lin, Chia-Li; Hsieh, Meng-Shu; Tzeng, Gwo-Hshiung

    A development strategy for vehicle navigation devices (NDs) is also presented to initiate the product roadmap. Criteria for evaluation are constructed by reviewing papers, interviewing experts and brainstorming. The ISM (interpretive structural modeling) method was used to construct the relationships between the criteria. Existing NDs were sampled to benchmark the gap between consumers' aspired/desired utilities and the utilities of existing/developing NDs. The VIKOR method was applied to rank the sampled NDs. This paper proposes the key criteria driving the purchase of a new ND and compares the behavior of consumers of various characters. These conclusions can serve as a reference for ND producers in improving existing functions or planning further utilities in the next e-era ND generation.
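VIKOR ranks alternatives by a compromise index Q built from group utility S (weighted sum of normalized distances to the best score) and individual regret R (the worst single-criterion distance). The paper's actual criteria, weights and device scores are not listed in the abstract, so the sketch below uses hypothetical values and treats every criterion as a benefit criterion.

```python
def vikor(matrix, weights, v=0.5):
    """Compute VIKOR Q values; lower Q = closer to the compromise solution.

    matrix[i][j] : score of alternative i on criterion j (higher is better)
    weights[j]   : criterion weights
    v            : trade-off between group utility and individual regret
    """
    ncrit = len(weights)
    f_best = [max(row[j] for row in matrix) for j in range(ncrit)]
    f_worst = [min(row[j] for row in matrix) for j in range(ncrit)]
    S, R = [], []
    for row in matrix:
        terms = [weights[j] * (f_best[j] - row[j]) / (f_best[j] - f_worst[j])
                 for j in range(ncrit)]
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    s_min, s_max = min(S), max(S)
    r_min, r_max = min(R), max(R)
    return [v * (s - s_min) / (s_max - s_min)
            + (1 - v) * (r - r_min) / (r_max - r_min)
            for s, r in zip(S, R)]

# hypothetical scores for three navigation devices on three criteria
scores = [[7, 8, 6], [9, 6, 7], [6, 7, 9]]
q = vikor(scores, weights=[0.5, 0.3, 0.2])
```

The device with the smallest Q is the compromise choice; comparing its criterion scores against the aspired levels shows which functions most need improvement.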

  1. CAN A MODEL TRANSFERABILITY FRAMEWORK IMPROVE ECOSYSTEM SERVICE ESTIMATES? A CASE STUDY OF SOIL FOREST CARBON SEQUESTRATION IN TILLAMOOK BAY, OR, USA

    EPA Science Inventory

    Budget constraints and policies that limit primary data collection have fueled a practice of transferring estimates (or models to generate estimates) of ecological endpoints from sites where primary data exists to sites where little to no primary data were collected. Whereas bene...

  2. Catholic School Faculty Meetings: A Case Study Linking Catholic Identity, School Improvement, and Teacher Engagement

    ERIC Educational Resources Information Center

    Hagan, Daryl C.; Houchens, Gary

    2016-01-01

    While research on faculty meetings is limited, existing literature suggests that meetings could be an arena where schools can address their most pressing challenges (Brandenburg, 2008; Michel, 2011; Riehl, 1998). Building on Macey and Schneider's (2008) Model of Employee Engagement and McGrath's Model of Group Effectiveness (1964), this case study…

  3. Pre-Service Teachers Learn to Teach Geography: A Suggested Course Model

    ERIC Educational Resources Information Center

    Mitchell, Jerry T.

    2018-01-01

    How to improve geography education via teacher preparation programs has been a concern for nearly three decades, but few examples of a single, comprehensive university-level course exist. The purpose of this article is to share the model of a pre-service geography education methods course. Within the course, geography content (physical and social)…

  4. Clinical prediction models for mortality and functional outcome following ischemic stroke: A systematic review and meta-analysis

    PubMed Central

    Crayton, Elise; Wolfe, Charles; Douiri, Abdel

    2018-01-01

    Objective We aim to identify and critically appraise clinical prediction models of mortality and function following ischaemic stroke. Methods Electronic databases, reference lists and citations were searched from inception to September 2015. Studies were selected for inclusion according to pre-specified criteria and critically appraised by independent, blinded reviewers. The discrimination of the prediction models was measured by the area under the receiver operating characteristic curve (c-statistic) in random-effects meta-analysis. Heterogeneity was measured using I2. Appropriate appraisal tools and reporting guidelines were used in this review. Results 31,395 references were screened, of which 109 articles were included in the review. These articles described 66 different predictive risk models. Appraisal identified poor methodological quality and a high risk of bias for most models; however, all models precede the development of reporting guidelines for prediction modelling studies. Generalisability of the models could be improved: less than half of the included models have been externally validated (n = 27/66). 152 predictors of mortality and 192 predictors of functional outcome were identified. No studies assessing the ability to improve patient outcome (model impact studies) were identified. Conclusions Further external validation and model impact studies are required to confirm the utility of existing models in supporting decision-making. Existing models have much potential. Those wishing to predict stroke outcome are advised to build on previous work and to update and adapt validated models to their specific contexts rather than designing new ones. PMID:29377923

  5. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. 
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.

  6. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications.
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.

  7. Satellite-based terrestrial production efficiency modeling

    PubMed Central

    McCallum, Ian; Wagner, Wolfgang; Schmullius, Christiane; Shvidenko, Anatoly; Obersteiner, Michael; Fritz, Steffen; Nilsson, Sten

    2009-01-01

    Production efficiency models (PEMs) are based on the theory of light use efficiency (LUE) which states that a relatively constant relationship exists between photosynthetic carbon uptake and radiation receipt at the canopy level. Challenges remain however in the application of the PEM methodology to global net primary productivity (NPP) monitoring. The objectives of this review are as follows: 1) to describe the general functioning of six PEMs (CASA; GLO-PEM; TURC; C-Fix; MOD17; and BEAMS) identified in the literature; 2) to review each model to determine potential improvements to the general PEM methodology; 3) to review the related literature on satellite-based gross primary productivity (GPP) and NPP modeling for additional possibilities for improvement; and 4) based on this review, propose items for coordinated research. This review noted a number of possibilities for improvement to the general PEM architecture - ranging from LUE to meteorological and satellite-based inputs. Current PEMs tend to treat the globe similarly in terms of physiological and meteorological factors, often ignoring unique regional aspects. Each of the existing PEMs has developed unique methods to estimate NPP and the combination of the most successful of these could lead to improvements. It may be beneficial to develop regional PEMs that can be combined under a global framework. The results of this review suggest the creation of a hybrid PEM could bring about a significant enhancement to the PEM methodology and thus terrestrial carbon flux modeling. 
Key items topping the PEM research agenda identified in this review include the following: LUE should not be assumed constant, but should vary by plant functional type (PFT) or photosynthetic pathway; evidence is mounting that PEMs should consider incorporating diffuse radiation; continue to pursue relationships between satellite-derived variables and LUE, GPP and autotrophic respiration (Ra); there is an urgent need for satellite-based biomass measurements to improve Ra estimation; and satellite-based soil moisture data could improve determination of soil water stress. PMID:19765285
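The core light-use-efficiency relationship shared by the six reviewed PEMs, including the review's recommendation that LUE vary by plant functional type (PFT) and be down-regulated by temperature and soil water, can be sketched as a generic skeleton. All numeric values below are illustrative assumptions, not parameters from CASA, GLO-PEM, TURC, C-Fix, MOD17 or BEAMS.

```python
# assumed PFT-specific maximum LUE values (g C per MJ APAR); illustrative only
LUE_BY_PFT = {"evergreen_needleleaf": 1.0, "grassland": 0.6, "cropland": 1.2}

def pem_npp(pft, fapar, par, t_scalar, w_scalar, carbon_use_eff=0.5):
    """Generic production-efficiency model.

    GPP = LUE_max * T_scalar * W_scalar * fAPAR * PAR
    NPP = CUE * GPP   (autotrophic respiration folded into a fixed CUE)

    fapar    : fraction of absorbed PAR (0-1), satellite-derived
    par      : incident PAR (MJ m^-2)
    t_scalar : temperature down-regulation factor (0-1)
    w_scalar : soil-water down-regulation factor (0-1)
    """
    gpp = LUE_BY_PFT[pft] * t_scalar * w_scalar * fapar * par
    return carbon_use_eff * gpp

npp = pem_npp("grassland", fapar=0.6, par=10.0, t_scalar=0.9, w_scalar=0.8)
```

The review's improvement agenda maps directly onto this skeleton: PFT-varying LUE replaces a global constant, satellite soil moisture informs `w_scalar`, and satellite biomass would replace the fixed carbon-use-efficiency treatment of autotrophic respiration.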

  8. IMPROVED ALGORITHMS FOR RADAR-BASED RECONSTRUCTION OF ASTEROID SHAPES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenberg, Adam H.; Margot, Jean-Luc

    We describe our implementation of a global-parameter optimizer and Square Root Information Filter into the asteroid-modeling software shape. We compare the performance of our new optimizer with that of the existing sequential optimizer when operating on various forms of simulated data and actual asteroid radar data. In all cases, the new implementation performs substantially better than its predecessor: it converges faster, produces shape models that are more accurate, and solves for spin axis orientations more reliably. We discuss potential future changes to improve shape's fitting speed and accuracy.

  9. Modeling of near-wall turbulence

    NASA Technical Reports Server (NTRS)

    Shih, T. H.; Mansour, N. N.

    1990-01-01

    An improved k-epsilon model and a second-order closure model are presented for low-Reynolds-number turbulence near a wall. For the k-epsilon model, a modified form of the eddy viscosity having the correct asymptotic near-wall behavior is suggested, and a model for the pressure diffusion term in the turbulent kinetic energy equation is proposed. For the second-order closure model, the existing models for the Reynolds stress equations are modified to have proper near-wall behavior. A dissipation rate equation for the turbulent kinetic energy is also reformulated. The proposed models satisfy realizability and will not produce unphysical behavior. Fully developed channel flows are used for model testing. The calculations are compared with direct numerical simulations. It is shown that the present models, both the k-epsilon model and the second-order closure model, perform well in predicting the behavior of the near-wall turbulence. Significant improvements over previous models are obtained.
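The corrected near-wall behavior in low-Reynolds-number k-epsilon models hinges on a damping function applied to the standard eddy-viscosity relation nu_t = C_mu * f_mu * k^2 / eps, so that nu_t vanishes at the wall. The sketch below uses an illustrative Van Driest-style damping function; the paper's own modified form is not reproduced in the abstract.

```python
import math

C_MU = 0.09  # standard k-epsilon model constant

def eddy_viscosity(k, eps, y_plus, a_plus=26.0):
    """Low-Reynolds-number eddy viscosity: nu_t = C_mu * f_mu * k^2 / eps.

    k      : turbulent kinetic energy
    eps    : dissipation rate
    y_plus : nondimensional wall distance
    f_mu is an illustrative Van Driest-style damping function that drives
    nu_t to zero as y_plus -> 0 (an assumption, not the paper's form).
    """
    f_mu = (1.0 - math.exp(-y_plus / a_plus)) ** 2
    return C_MU * f_mu * k * k / eps

nu_wall = eddy_viscosity(k=0.5, eps=10.0, y_plus=1.0)    # near the wall
nu_core = eddy_viscosity(k=0.5, eps=10.0, y_plus=300.0)  # outer layer
```

Far from the wall f_mu tends to one and the standard high-Reynolds-number relation is recovered, which is why such models can be tested against fully developed channel-flow DNS across the whole wall-normal profile.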

  10. From guidelines to practice: a pharmacist-driven prospective audit and feedback improvement model for peri-operative antibiotic prophylaxis in 34 South African hospitals.

    PubMed

    Brink, Adrian J; Messina, Angeliki P; Feldman, Charles; Richards, Guy A; van den Bergh, Dena

    2017-04-01

    Few data exist on the implementation of process measures to facilitate adherence to peri-operative antibiotic prophylaxis (PAP) guidelines in Africa. To implement an improvement model for PAP utilizing existing resources, in order to achieve a reduction in surgical site infections (SSIs) across a heterogeneous group of 34 urban and rural South African hospitals. A pharmacist-driven, prospective audit and feedback strategy involving change management and improvement principles was utilized. This 2.5-year intervention involved a pre-implementation phase to test a PAP guideline and a 'toolkit' at pilot sites. Following antimicrobial stewardship committee and clinician endorsement, the model was introduced in all institutions and a survey of baseline SSI and compliance rates with four process measures (antibiotic choice, dose, administration time and duration) was performed. The post-implementation phase involved audit, intervention and monthly feedback to facilitate improvements in compliance. Over 70 weeks of standardized measurements and feedback, 24,206 surgical cases were reviewed. There was a significant improvement in compliance with all process measures (composite compliance) from 66.8% (95% CI 64.8-68.7) to 83.3% (95% CI 80.8-85.8), representing a 24.7% increase (P < 0.0001). The SSI rate decreased by 19.7%, from a mean group rate of 2.46 (95% CI 2.18-2.73) pre-intervention to 1.97 (95% CI 1.79-2.15) post-intervention (P = 0.0029). The implementation of process improvement initiatives and principles targeted to institutional needs, utilizing pharmacists, can effectively improve PAP guideline compliance and sustainable patient outcomes. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Re-refinement from deposited X-ray data can deliver improved models for most PDB entries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joosten, Robbie P.; Womack, Thomas; Vriend, Gert, E-mail: vriend@cmbi.ru.nl

    2009-02-01

    An evaluation of validation and real-space intervention possibilities for improving existing automated (re-)refinement methods. The deposition of X-ray data along with the customary structural models defining PDB entries makes it possible to apply large-scale re-refinement protocols to these entries, thus giving users the benefit of improvements in X-ray methods that have occurred since the structure was deposited. Automated gradient refinement is an effective method to achieve this goal, but real-space intervention is most often required in order to adequately address problems detected by structure-validation software. In order to improve the existing protocol, automated re-refinement was combined with structure validation and difference-density peak analysis to produce a catalogue of problems in PDB entries that are amenable to automatic correction. It is shown that re-refinement can be effective in producing improvements, which are often associated with the systematic use of the TLS parameterization of B factors, even for relatively new and high-resolution PDB entries, while the accompanying manual or semi-manual map analysis and fitting steps show good prospects for eventual automation. It is proposed that the potential for simultaneous improvements in methods and in re-refinement results be further encouraged by broadening the scope of depositions to include refinement metadata and ultimately primary rather than reduced X-ray data.

  12. Reviewing innovative Earth observation solutions for filling science-policy gaps in hydrology

    NASA Astrophysics Data System (ADS)

    Lehmann, Anthony; Giuliani, Gregory; Ray, Nicolas; Rahman, Kazi; Abbaspour, Karim C.; Nativi, Stefano; Craglia, Massimo; Cripe, Douglas; Quevauviller, Philippe; Beniston, Martin

    2014-10-01

Improved data sharing is needed for hydrological modeling and water management that require better integration of data, information and models. Technological advances in Earth observation and Web technologies have allowed the development of Spatial Data Infrastructures (SDIs) for improved data sharing at various scales. International initiatives catalyze data sharing by promoting interoperability standards to maximize the use of data and by supporting easy access to and utilization of geospatial data. A series of recent European projects is contributing to the promotion of innovative Earth observation solutions and the uptake of scientific outcomes in policy. Several success stories involving different hydrologists' communities can be reported around the world. Gaps still exist in hydrological, agricultural, meteorological and climatological data access for a variety of reasons. While many sources of data exist at all scales, it remains difficult and time-consuming to assemble hydrological information for most projects. Furthermore, data and sharing formats remain very heterogeneous. Improvements require implementing and endorsing commonly agreed standards and documenting data with adequate metadata. The brokering approach allows binding heterogeneous resources published by different data providers and adapting them to the tools and interfaces commonly used by consumers of these resources. The challenge is to provide decision-makers with reliable information, based on integrated data and tools derived from both Earth observations and scientific models. Successful SDIs therefore rely on several factors: a shared vision among all participants, the need to solve a common problem, adequate data policies, incentives, and sufficient resources. New data streams from remote sensing or crowdsourcing are also producing valuable information to improve our understanding of the water cycle, while field sensors are developing rapidly and becoming less costly.
More recent data standards are enhancing interoperability between hydrology and other scientific disciplines, while solutions exist to communicate uncertainty of data and models, which is an essential pre-requisite for decision-making. Distributed computing infrastructures can handle complex and large hydrological data and models, while Web Processing Services bring the flexibility to develop and execute simple to complex workflows over the Internet. The need for capacity building at human, infrastructure and institutional levels is also a major driver for reinforcing the commitment to SDI concepts.

  13. Performance model-directed data sieving for high-performance I/O

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yong; Lu, Yin; Amritkar, Prathamesh

    2014-09-10

Many scientific computing applications and engineering simulations exhibit noncontiguous I/O access patterns. Data sieving is an important technique to improve the performance of noncontiguous I/O accesses by combining small and noncontiguous requests into a large and contiguous request. It has been proven effective even though more data are potentially accessed than demanded. In this study, we propose a new data sieving approach namely performance model-directed data sieving, or PMD data sieving in short. It improves the existing data sieving approach from two aspects: (1) dynamically determines when it is beneficial to perform data sieving; and (2) dynamically determines how to perform data sieving if beneficial. It improves the performance of the existing data sieving approach considerably and reduces the memory consumption as verified by both theoretical analysis and experimental results. Given the importance of supporting noncontiguous accesses effectively and reducing the memory pressure in a large-scale system, the proposed PMD data sieving approach in this research holds a great promise and will have an impact on high-performance I/O systems.
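A minimal sketch of the underlying decision, whether sieving pays off for a given batch of requests, might compare a latency-plus-bandwidth cost estimate of one hole-filling contiguous read against many small reads. The cost parameters and the decision rule below are illustrative only, not the paper's actual performance model:

```python
# Toy cost model: is one large read over the whole span cheaper than
# issuing each small noncontiguous read separately? Latency and bandwidth
# figures are made-up defaults for illustration.

def read_cost(nbytes, latency_s=1e-3, bandwidth_Bps=100e6):
    """Estimated wall time for a single contiguous read."""
    return latency_s + nbytes / bandwidth_Bps

def sieving_beneficial(requests, latency_s=1e-3, bandwidth_Bps=100e6):
    """requests: list of (offset, length) noncontiguous read requests."""
    direct = sum(read_cost(n, latency_s, bandwidth_Bps) for _, n in requests)
    lo = min(off for off, _ in requests)
    hi = max(off + n for off, n in requests)
    sieved = read_cost(hi - lo, latency_s, bandwidth_Bps)  # one hole-filling read
    return sieved < direct
```

Closely spaced requests amortize the per-read latency and favor sieving; widely scattered requests make the hole-filling read too large, which is exactly the case a dynamic check should reject.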

  14. Enhancing and Adapting Treatment Foster Care: Lessons Learned in Trying to Change Practice.

    PubMed

    Murray, Maureen M; Southerland, Dannia; Farmer, Elizabeth M; Ballentine, Kess

    2010-01-01

Evidence-based practices to improve outcomes for children with severe behavioral and emotional problems have received a great deal of attention in children's mental health. Treatment Foster Care (TFC), a residential intervention for youth with emotional or behavioral problems, is one of the few community-based programs that is considered to be evidence-based. However, as for most treatment approaches, the vast majority of existing programs do not deliver the evidence-based version. In an attempt to fill this gap and improve practice across a wide range of TFC agencies, we developed an enhanced model of TFC based on input from both practice and research. It includes elements associated with improved outcomes for youth in "usual care" TFC agencies as well as key elements from Chamberlain's evidence-based model. The current manuscript describes this "hybrid" intervention - Together Facing the Challenge - and discusses key issues in implementation. We describe the sample and settings, highlight key implementation strategies, and provide "lessons learned" to help guide others who may wish to change practice in existing agencies.

  15. The relevance of human stem cell-derived organoid models for epithelial translational medicine

    PubMed Central

    Hynds, Robert E.; Giangreco, Adam

    2014-01-01

    Epithelial organ remodeling is a major contributing factor to worldwide death and disease, costing healthcare systems billions of dollars every year. Despite this, most fundamental epithelial organ research fails to produce new therapies and mortality rates for epithelial organ diseases remain unacceptably high. In large part, this failure in translating basic epithelial research into clinical therapy is due to a lack of relevance in existing preclinical models. To correct this, new models are required that improve preclinical target identification, pharmacological lead validation, and compound optimization. In this review, we discuss the relevance of human stem cell-derived, three-dimensional organoid models for addressing each of these challenges. We highlight the advantages of stem cell-derived organoid models over existing culture systems, discuss recent advances in epithelial tissue-specific organoids, and present a paradigm for using organoid models in human translational medicine. PMID:23203919

  16. Modeling highly transient flow, mass, and heat transport in the Chattahoochee River near Atlanta, Georgia

    USGS Publications Warehouse

    Jobson, Harvey E.; Keefer, Thomas N.

    1979-01-01

    A coupled flow-temperature model has been developed and verified for a 27.9-km reach of the Chattahoochee River between Buford Dam and Norcross, Ga. Flow in this reach of the Chattahoochee is continuous but highly regulated by Buford Dam, a flood-control and hydroelectric facility located near Buford, Ga. Calibration and verification utilized two sets of data collected under highly unsteady discharge conditions. Existing solution techniques, with certain minor improvements, were applied to verify the existing technology of flow and transport modeling. A linear, implicit finite-difference flow model was coupled with implicit, finite-difference transport and temperature models. Both the conservative and nonconservative forms of the transport equation were solved, and the difference in the predicted concentrations of dye were found to be insignificant. The temperature model, therefore, was based on the simpler nonconservative form of the transport equation. (Woodard-USGS)
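As a hedged illustration of the nonconservative transport formulation the report adopts (not its actual scheme), a backward-Euler upwind discretization of dc/dt + u dc/dx = 0 yields a lower-bidiagonal implicit system that can be solved by simple forward substitution; grid, velocity and boundary treatment here are illustrative choices:

```python
# Minimal implicit (backward-Euler, upwind) step for 1-D advection,
# dc/dt + u*dc/dx = 0 with u > 0 and a fixed upstream boundary value:
#   (1 + s) c_i^{n+1} - s c_{i-1}^{n+1} = c_i^n,   s = u*dt/dx
# The system is lower bidiagonal, so forward substitution solves it exactly.

def implicit_upwind_step(c, u, dt, dx):
    """Advance the concentration profile `c` one time step."""
    s = u * dt / dx
    new = c[:]                      # upstream boundary new[0] = c[0] held fixed
    for i in range(1, len(c)):
        new[i] = (c[i] + s * new[i - 1]) / (1 + s)
    return new
```

The implicit form is unconditionally stable, which matters for the highly unsteady, dam-regulated discharges described above, where explicit schemes would force very small time steps.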

  17. Robust Models for Operator Workload Estimation

    DTIC Science & Technology

    2015-03-01

    piloted aircraft (RPA) simultaneously, a vast improvement in resource utilization compared to existing operations that require several operators per...into distinct cognitive channels (visual, auditory, spatial, etc.) based on our ability to multitask effectively as long as no one channel is

  18. ASTER Global Digital Elevation Model Version 2 - summary of validation results

    USGS Publications Warehouse

    Tachikawa, Tetushi; Kaku, Manabu; Iwasaki, Akira; Gesch, Dean B.; Oimoen, Michael J.; Zhang, Z.; Danielson, Jeffrey J.; Krieger, Tabatha; Curtis, Bill; Haase, Jeff; Abrams, Michael; Carabajal, C.; Meyer, Dave

    2011-01-01

    Based on these findings, the GDEM validation team recommends the release of the GDEM2 to the public, acknowledging that, while vastly improved, some artifacts still exist which could affect its utility in certain applications.

  19. Development of an improved system for contract time determination : phase III.

    DOT National Transportation Integrated Search

    2010-09-30

This study developed Daily Work Report (DWR) based prediction models to determine reasonable production rates of controlling activities of highway projects. The study used available resources such as DWR, soil data, AADT and other existing projec...

  20. Using a safety forecast model to calculate future safety metrics.

    DOT National Transportation Integrated Search

    2017-05-01

This research sought to identify a process to improve long-range planning prioritization by using forecasted safety metrics in place of the existing Utah Department of Transportation Safety Index, a metric based on historical crash data. The res...

  1. A model to estimate the cost effectiveness of the indoorenvironment improvements in office work

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seppanen, Olli; Fisk, William J.

    2004-06-01

Deteriorated indoor climate is commonly related to increases in sick building syndrome (SBS) symptoms, respiratory illnesses, sick leave, reduced comfort and losses in productivity. The cost of deteriorated indoor climate to society is high. Some calculations show that the cost is higher than the heating energy costs of the same buildings. Building-level calculations have also shown that many measures taken to improve indoor air quality and climate are cost-effective when the potential monetary savings resulting from an improved indoor climate are included as benefits gained. As an initial step towards systemizing these building-level calculations we have developed a conceptual model to estimate the cost-effectiveness of various measures. The model shows the links between improvements in the indoor environment and the following potential financial benefits: reduced medical care cost, reduced sick leave, better performance of work, lower turnover of employees, and lower cost of building maintenance due to fewer complaints about indoor air quality and climate. The pathways to these potential benefits from changes in building technology and practices go via several human responses to the indoor environment such as infectious diseases, allergies and asthma, sick building syndrome symptoms, perceived air quality, and thermal environment. The model also includes the annual cost of investments, operation costs, and cost savings of improved indoor climate. The conceptual model illustrates how various factors are linked to each other. SBS symptoms are probably the most commonly assessed health responses in IEQ studies and have been linked to several characteristics of buildings and IEQ. While the available evidence indicates that SBS symptoms can affect these outcomes and suggests that such a linkage exists, at present we cannot quantify the relationships sufficiently for cost-benefit modeling.
New research and analyses of existing data to quantify the financial importance of SBS symptoms would enable more widespread consideration of the effects of IEQ in cost-benefit calculations.
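The structure of such a calculation can be shown with a toy version of the conceptual model: annualize the investment with the standard capital-recovery formula and compare it with the monetized benefits. The formula is standard, but every figure and benefit category passed in would be a placeholder, not data from the paper:

```python
# Toy cost-effectiveness calculation for an indoor-environment improvement:
# net annual benefit = sum of monetized annual savings
#                      - operating cost - annualized investment.

def annualized_cost(investment, interest, years):
    """Capital recovery factor times the up-front investment."""
    a = interest * (1 + interest) ** years / ((1 + interest) ** years - 1)
    return investment * a

def net_annual_benefit(investment, interest, years, operating_cost, benefits):
    """benefits: dict of annual savings, e.g. sick leave, productivity."""
    return (sum(benefits.values())
            - operating_cost
            - annualized_cost(investment, interest, years))
```

A measure is cost-effective in this framing when the net annual benefit is positive, mirroring the building-level calculations the abstract describes.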

  2. Interprofessional Collaborative Practice Models in Chronic Disease Management.

    PubMed

    Southerland, Janet H; Webster-Cyriaque, Jennifer; Bednarsh, Helene; Mouton, Charles P

    2016-10-01

Interprofessional collaboration in health care has become essential to providing high-quality care, decreasing costs, and improving outcomes. Patient-centered care requires synthesis of all the components of primary and specialty medicine to address patient needs. For individuals living with chronic diseases, this model is even more critical to obtaining better health outcomes. Studies have shown that oral health and systemic disease are correlated with respect to disease development and progression. Thus, inclusion of oral health in many of the existing and new collaborative models could result in better management of chronic illnesses and improve overall health outcomes. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Using Model Based Systems Engineering and the Systems Modeling Language to Develop Space Mission Area Architectures

    DTIC Science & Technology

    2013-09-01

processes used in space system acquisitions, simply implementing a data exchange specification would not fundamentally improve how information is...and manage the configuration of all critical program models, processes, and tools used throughout the DoD. Second, mandate a data exchange

  4. Sensitivity analysis of a soil-vegetation-atmosphere transfer (SVAT) model parameterised for a British floodplain meadow

    NASA Astrophysics Data System (ADS)

    Morris, P. J.; Verhoef, A.; Van der Tol, C.; Macdonald, D.

    2011-12-01

    Rationale: Floodplain meadows are highly species-rich grassland ecosystems, unique in that their vegetation and soil structures have been shaped and maintained by ~1,000 yrs of traditional, low-intensity agricultural management. Widespread development on floodplains over the last two centuries has left few remaining examples of these once commonplace ecosystems and they are afforded high conservation value by British and European agencies. Increased incidences and severity of summer drought and winter flooding in Britain in recent years have placed floodplain plant communities under stress through altered soil moisture regimes. There is a clear need for improved management strategies if the last remaining British floodplain meadows are to be conserved under changing climates. Aim: As part of the Floodplain Underground Sensors Experiment (FUSE, a 3-year project funded by the Natural Environment Research Council) we aim to understand the environmental controls over soil-vegetation-atmosphere transfers (SVAT) of water, CO2 and energy at Yarnton Mead, a floodplain meadow in southern England. An existing model, SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes; van der Tol et al., 2009), uses remotely-sensed infrared radiance spectra to predict heat and water transfers between a vegetation canopy and the atmosphere. We intend to expand SCOPE by developing a more realistic, physically-based representation of water, gas and energy transfers between soil and vegetation. This improved understanding will eventually take the form of a new submodel within SCOPE, allowing more rigorous estimation of soil-canopy-atmosphere exchanges for the site using predominantly remotely-sensed data. In this context a number of existing SVAT models will be tested and compared to ensure that only reliable and robust underground model components will be coupled to SCOPE. 
Approach: For this study, we parameterised an existing and widely-used SVAT model (CoupModel; Jansson, 2011) for our study site and analysed the model's sensitivity to a comprehensive set of soil/plant biophysical processes and parameter values. Findings: The sensitivity analysis indicates those processes and parameters most important to soil-vegetation-atmosphere transfers at the site. We use the outcomes of the sensitivity analysis to indicate directly the desired structure of the new SCOPE submodel. In addition, existing soil-moisture, soil matric-potential and meteorological data for the site indicate that evapotranspiration is heavily water-limited during summer months, although soil moisture and soil matric-potential data alone provide very little explanation of the ratio of potential to actual evapotranspiration. A mechanistic representation of stomatal resistance and its response to short-term changes in meteorological conditions - independent of soil moisture status - will also likely improve SCOPE's predictions of heat and water transfers. Ultimately our work will contribute to improved understanding and management of floodplain meadows in Britain and elsewhere.

  5. Estimates of Nitrogen, Phosphorus, Biochemical Oxygen Demand, and Fecal Coliforms Entering the Environment Due to Inadequate Sanitation Treatment Technologies in 108 Low and Middle Income Countries.

    PubMed

    Fuhrmeister, Erica R; Schwab, Kellogg J; Julian, Timothy R

    2015-10-06

Understanding the excretion and treatment of human waste (feces and urine) in low and middle income countries (LMICs) is necessary to design appropriate waste management strategies. However, excretion and treatment are often difficult to quantify due to decentralization of excreta management. We address this gap by developing a mechanistic, stochastic model to characterize phosphorus, nitrogen, biochemical oxygen demand (BOD), and fecal coliform pollution from human excreta for 108 LMICs. The model estimates excretion and treatment given three scenarios: (1) use of existing sanitation systems, (2) use of World Health Organization-defined "improved sanitation", and (3) use of best available technologies. Our model estimates that more than 10^9 kg/yr each of phosphorus, nitrogen and BOD are produced. Of this, 22 (19-27)%, 11 (7-15)%, 17 (10-23)%, and 35 (23-47)% (mean and 95% range) of BOD, nitrogen, phosphorus, and fecal coliforms, respectively, are removed by existing sanitation systems. Our model estimates that upgrading to "improved sanitation" increases mean removal only slightly, to between 17 and 53%. Under the best available technology scenario, only approximately 60-80% of pollutants are treated. To reduce the impact of nutrient and microbial pollution on human and environmental health, improvements in both access to adequate sanitation and sanitation treatment efficiency are needed.
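The "mean and 95% range" style of estimate can be reproduced in miniature with a Monte Carlo draw over per-system removal efficiencies. The coverage shares and efficiency ranges below are invented for illustration; they are not the paper's input data:

```python
# Stochastic sketch: draw removal efficiencies per sanitation system,
# weight them by population coverage shares, and summarize the draws
# with a median and a 95% range.

import random

def simulated_removal(coverage, efficiency_ranges, n=10_000, seed=0):
    """coverage: {system: population share}; efficiency_ranges: {system: (lo, hi)}."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        total = sum(share * rng.uniform(*efficiency_ranges[system])
                    for system, share in coverage.items())
        draws.append(total)
    draws.sort()
    return draws[n // 2], (draws[int(0.025 * n)], draws[int(0.975 * n)])
```

Propagating the efficiency uncertainty through the coverage weighting is what turns single point estimates into the interval-style results quoted in the abstract.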

  6. Pan-Arctic River Discharge: Where Can We Improve Monitoring of Future Change?

    NASA Astrophysics Data System (ADS)

    Bring, A.; Shiklomanov, A. I.; Lammers, R. B.

    2016-12-01

    The Arctic freshwater cycle is changing rapidly, which will require adequate monitoring of river flow to detect, observe and understand changes and provide adaptation information. There has however been little detail about where the greatest flow changes are projected, and where monitoring therefore may need to be strengthened. In this study, we used a set of recent climate model runs and an advanced macro-scale hydrological model to analyze how flows across the continental pan-Arctic are projected to change, and where the climate models agree on significant changes. We also developed a method to identify where monitoring stations should be placed to observe these significant changes, and compared this set of suggested locations with the existing network of monitoring stations. Overall, our results reinforce earlier indications of large increases in flow over much of the Arctic, but we also identify some areas where projections agree on significant changes but disagree on the sign of change. For monitoring, central and eastern Siberia, Alaska and central Canada are hot spots for the highest changes. To take advantage of existing networks, a number of stations across central Canada and western and central Siberia could form a prioritized set. Further development of model representation of high-latitude hydrology would improve confidence in the areas we identify here. Nevertheless, ongoing observation programs may consider these suggested locations in efforts to improve monitoring of the rapidly changing Arctic freshwater cycle.
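The prioritization logic described, keeping candidate gauge locations where the model runs agree on a significant change and ranking them by its magnitude, can be sketched as follows; the agreement threshold, the 10% significance cutoff, and the station data are all hypothetical:

```python
# Schematic station prioritization: a candidate qualifies when at least
# `agreement` of the model runs project a significant flow change
# (|change| >= 0.1, i.e. 10%); qualifying stations are ranked by the
# absolute median projected change.

def prioritize_stations(candidates, agreement=0.8):
    """candidates: {station: list of per-model projected flow changes}."""
    ranked = []
    for name, changes in candidates.items():
        significant = [c for c in changes if abs(c) >= 0.1]
        if len(significant) / len(changes) >= agreement:
            median = sorted(changes)[len(changes) // 2]
            ranked.append((abs(median), name))
    return [name for _, name in sorted(ranked, reverse=True)]
```

Requiring multi-model agreement before ranking reflects the abstract's distinction between areas where projections agree on significant change and areas where they disagree even on its sign.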

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kandler A; Santhanagopalan, Shriram; Yang, Chuanbo

    Computer models are helping to accelerate the design and validation of next generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting electrochemical performance, thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification including under aging, and predicting the performance of a proposed electrode material recipe a priori using microstructure models.

  8. Improved earthquake monitoring in the central and eastern United States in support of seismic assessments for critical facilities

    USGS Publications Warehouse

    Leith, William S.; Benz, Harley M.; Herrmann, Robert B.

    2011-01-01

Evaluation of seismic monitoring capabilities in the central and eastern United States for critical facilities - including nuclear power plants - focused on specific improvements to better understand the seismic hazards in the region. The report is not an assessment of seismic safety at nuclear plants. To accomplish the evaluation and to provide suggestions for improvements using funding from the American Recovery and Reinvestment Act of 2009, the U.S. Geological Survey examined the addition of new strong-motion seismic stations in areas of seismic activity and of new seismic stations near nuclear power-plant locations, along with integration of data from the Transportable Array of some 400 mobile seismic stations. Some 38 and 68 stations, respectively, were suggested for addition in active seismic zones and near power-plant locations. Expansion of databases for strong-motion and other earthquake source-characterization data also was evaluated. Recognizing pragmatic limitations of station deployment, augmentation of existing deployments provides improvements in source characterization by quantification of near-source attenuation in regions where larger earthquakes are expected. That augmentation also supports systematic data collection from existing networks. The report further utilizes the application of modeling procedures and processing algorithms, with the additional stations and the improved seismic databases, to leverage the capabilities of existing and expanded seismic arrays.

  9. Numerical study on turbulence modulation in gas-particle flows

    NASA Astrophysics Data System (ADS)

    Yan, F.; Lightstone, M. F.; Wood, P. E.

    2007-01-01

    A mathematical model is proposed based on the Eulerian/Lagrangian approach to account for both the particle crossing trajectory effect and the extra turbulence production due to particle wake effects. The resulting model, together with existing models from the literature, is applied to two different particle-laden flow configurations, namely a vertical pipe flow and axisymmetric downward jet flow. The results show that the proposed model is able to provide improved predictions of the experimental results.

  10. Global attractivity of an almost periodic N-species nonlinear ecological competitive model

    NASA Astrophysics Data System (ADS)

    Xia, Yonghui; Han, Maoan; Huang, Zhenkun

    2008-01-01

By using the comparison theorem and constructing a suitable Lyapunov functional, we study an almost periodic nonlinear N-species competitive Lotka-Volterra model. A set of sufficient conditions is obtained for the existence and global attractivity of a unique positive almost periodic solution of the model. As applications, some special competition models are studied again; our new results improve and generalize former results. Examples and their simulations show the feasibility of our main results.
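The qualitative behaviour the attractivity result guarantees can be illustrated numerically with a constant-coefficient two-species special case whose coefficients satisfy the usual diagonal-dominance condition; the simulation below is an illustration, not part of the paper's proof:

```python
# Forward-Euler simulation of a competitive Lotka-Volterra system
#   dx_i/dt = x_i * (r_i - sum_j a[i][j] * x_j).
# With r = (1, 1) and a = [[2, 1], [1, 2]] the unique positive equilibrium
# is x* = (1/3, 1/3), and trajectories from positive initial data approach it.

def simulate_lv(x0, r, a, dt=0.01, steps=50_000):
    """Integrate the N-species system with forward Euler; returns final state."""
    x = list(x0)
    n = len(x)
    for _ in range(steps):
        dx = [x[i] * (r[i] - sum(a[i][j] * x[j] for j in range(n)))
              for i in range(n)]
        x = [x[i] + dt * dx[i] for i in range(n)]
    return x
```

Global attractivity means this convergence does not depend on the choice of positive initial condition, which a few extra runs from different starting points would illustrate.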

  11. A Review of Mathematical Models for Leukemia and Lymphoma

    PubMed Central

    Clapp, Geoffrey; Levy, Doron

    2014-01-01

    Recently, there has been significant activity in the mathematical community, aimed at developing quantitative tools for studying leukemia and lymphoma. Mathematical models have been applied to evaluate existing therapies and to suggest novel therapies. This article reviews the recent contributions of mathematical modeling to leukemia and lymphoma research. These developments suggest that mathematical modeling has great potential in this field. Collaboration between mathematicians, clinicians, and experimentalists can significantly improve leukemia and lymphoma therapy. PMID:26744598

  12. Characterization of structural connections using free and forced response test data

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Huckelbridge, Arthur A.

    1989-01-01

The accurate prediction of system dynamic response often has been limited by deficiencies in existing capabilities to characterize connections adequately. Connections between structural components often are mechanically complex and difficult to model accurately in analysis. Improved analytical models for connections are needed to improve system dynamic predictions. A procedure for identifying physical connection properties from free and forced response test data is developed, then verified utilizing a system having both a linear and a nonlinear connection. Connection properties are computed in terms of physical parameters so that the physical characteristics of the connections can be better understood, in addition to providing improved input for the system model. The identification procedure is applicable to multi-degree-of-freedom systems, and does not require that the test data be measured directly at the connection locations.
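In the same spirit, for the simplest possible connection model f(t) = k·x(t) + c·v(t), the physical parameters k (stiffness) and c (damping) can be recovered from sampled response data by linear least squares. This measurement model is an assumption of the sketch, not the paper's full multi-degree-of-freedom procedure:

```python
# Identify connection stiffness k and damping c from sampled displacement x,
# velocity v, and connection force f, assuming f = k*x + c*v, by solving the
# 2x2 normal equations of the least-squares problem.

def identify_kc(x, v, f):
    sxx = sum(xi * xi for xi in x)
    svv = sum(vi * vi for vi in v)
    sxv = sum(xi * vi for xi, vi in zip(x, v))
    sxf = sum(xi * fi for xi, fi in zip(x, f))
    svf = sum(vi * fi for vi, fi in zip(v, f))
    det = sxx * svv - sxv * sxv          # nonzero when x and v are not collinear
    k = (sxf * svv - svf * sxv) / det
    c = (svf * sxx - sxf * sxv) / det
    return k, c
```

Because the identified quantities are physical parameters rather than abstract modal coefficients, they can both be interpreted directly and fed back into the system model, as the abstract notes.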

  13. MysiRNA: improving siRNA efficacy prediction using a machine-learning model combining multi-tools and whole stacking energy (ΔG).

    PubMed

    Mysara, Mohamed; Elhefnawi, Mahmoud; Garibaldi, Jonathan M

    2012-06-01

The investigation of small interfering RNA (siRNA) and its posttranscriptional gene regulation has become an extremely important research topic, both for fundamental reasons and for potential longer-term therapeutic benefits. Several factors affect the functionality of siRNA, including positional preferences, target accessibility and other thermodynamic features. State-of-the-art tools aim to optimize the selection of target siRNAs by identifying those that may have high experimental inhibition. Such tools implement artificial neural network models, such as Biopredsi and ThermoComposition21, and linear regression models, such as DSIR, i-Score and Scales, among others. However, all these models have limitations in performance. In this work, a neural-network-trained siRNA scoring/efficacy prediction model was developed by combining two existing scoring algorithms (ThermoComposition21 and i-Score), together with the whole stacking energy (ΔG), in a multi-layer artificial neural network. These three parameters were chosen after a comparative combinatorial study of five well-known tools. Our model, 'MysiRNA', was trained on 2431 siRNA records and tested using three further datasets. MysiRNA was compared with 11 alternative scoring tools in an evaluation study assessing predicted versus experimental siRNA efficiency, where it achieved the highest performance both in terms of correlation coefficient (R² = 0.600) and receiver operating characteristic analysis (AUC = 0.808), improving prediction accuracy by up to 18% with respect to the sensitivity and specificity of the best available tools. MysiRNA is a novel, freely accessible model capable of predicting siRNA inhibition efficiency with improved specificity and sensitivity. This multiclassifier approach could help improve the performance of prediction in several bioinformatics areas.
MysiRNA model, part of MysiRNA-Designer package [1], is expected to play a key role in siRNA selection and evaluation. Copyright © 2012 Elsevier Inc. All rights reserved.
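Structurally, such a combiner maps the three inputs named above (ThermoComposition21 score, i-Score, ΔG) to a single efficacy score. The single sigmoid unit and the weights below are arbitrary placeholders standing in for MysiRNA's trained multi-layer network:

```python
# Structural sketch only: one sigmoid unit combining two tool scores and the
# whole stacking energy into a 0-1 efficacy prediction. The weights and bias
# are illustrative placeholders, not trained parameters.

import math

def mysirna_like_score(tc21, i_score, dG, w=(0.8, 0.9, -0.05), bias=-1.0):
    """Map (ThermoComposition21, i-Score, ΔG) features to a score in (0, 1)."""
    z = w[0] * tc21 + w[1] * i_score + w[2] * dG + bias
    return 1.0 / (1.0 + math.exp(-z))
```

The point of the multiclassifier design is that the combined score can outperform either input score alone; here that is only suggested by the monotone dependence on each feature.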

  14. Improving the process of process modelling by the use of domain process patterns

    NASA Astrophysics Data System (ADS)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.
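A toy analogue of the pattern-derivation step, counting recurring activity n-grams across a set of linearized models and keeping those that meet a support threshold, conveys the idea; real process models are graphs, and the paper's algorithms are considerably more involved:

```python
# Count activity n-grams across linearized process models and keep those
# occurring in at least `min_support` distinct models, as a stand-in for
# mining Domain Process Patterns from a model collection.

def frequent_patterns(models, n=2, min_support=2):
    """models: list of activity-label sequences (one per process model)."""
    support = {}
    for model in models:
        # count each n-gram at most once per model
        seen = {tuple(model[i:i + n]) for i in range(len(model) - n + 1)}
        for gram in seen:
            support[gram] = support.get(gram, 0) + 1
    return {gram for gram, s in support.items() if s >= min_support}
```

Combining such frequency information with historic usage data is what lets a modelling tool suggest relevant content while a new model is being drawn.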

  15. The Application of Satellite-Derived, High-Resolution Land Use/Land Cover Data to Improve Urban Air Quality Model Forecasts

    NASA Technical Reports Server (NTRS)

    Quattrochi, D. A.; Lapenta, W. M.; Crosson, W. L.; Estes, M. G., Jr.; Limaye, A.; Kahn, M.

    2006-01-01

Local and state agencies are responsible for developing state implementation plans to meet National Ambient Air Quality Standards. Numerical models used for this purpose simulate the transport and transformation of criteria pollutants and their precursors. The specification of land use/land cover (LULC) plays an important role in controlling modeled surface meteorology and emissions. NASA researchers have worked with partners and Atlanta stakeholders to incorporate an improved high-resolution LULC dataset for the Atlanta area within their modeling system and to assess meteorological and air quality impacts of Urban Heat Island (UHI) mitigation strategies. The new LULC dataset provides a more accurate representation of land use, has the potential to improve model accuracy, and facilitates prediction of LULC changes. Use of the new LULC dataset for two summertime episodes improved meteorological forecasts, with an existing daytime cold bias of approximately 3 °C reduced by 30%. Model performance for ozone prediction did not show improvement. In addition, LULC changes due to Atlanta area urbanization were predicted through 2030, for which model simulations predict higher urban air temperatures. The incorporation of UHI mitigation strategies partially offset this warming trend. The data and modeling methods used are generally applicable to other U.S. cities.

  16. Improved Dynamic Modeling of the Cascade Distillation Subsystem and Integration with Models of Other Water Recovery Subsystems

    NASA Technical Reports Server (NTRS)

    Perry, Bruce; Anderson, Molly

    2015-01-01

    The Cascade Distillation Subsystem (CDS) is a rotary multistage distiller being developed to serve as the primary processor for wastewater recovery during long-duration space missions. The CDS could be integrated with a system similar to the International Space Station (ISS) Water Processor Assembly (WPA) to form a complete Water Recovery System (WRS) for future missions. Independent chemical process simulations with varying levels of detail have previously been developed using Aspen Custom Modeler (ACM) to aid in the analysis of the CDS and several WPA components. The existing CDS simulation could not model behavior during thermal startup and lacked detailed analysis of several key internal processes, including heat transfer between stages. The first part of this paper describes modifications to the ACM model of the CDS that improve its capabilities and the accuracy of its predictions. Notably, the modified version of the model can accurately predict behavior during thermal startup for both NaCl solution and pretreated urine feeds. The model is used to predict how changing operating parameters and design features of the CDS affects its performance, and conclusions from these predictions are discussed. The second part of this paper describes the integration of the modified CDS model and the existing WPA component models into a single WRS model. The integrated model is used to demonstrate the effects that changes to one component can have on the dynamic behavior of the system as a whole.

  17. Towards a New Generation of Agricultural System Data, Models and Knowledge Products: Design and Improvement

    NASA Technical Reports Server (NTRS)

    Antle, John M.; Basso, Bruno; Conant, Richard T.; Godfray, H. Charles J.; Jones, James W.; Herrero, Mario; Howitt, Richard E.; Keating, Brian A.; Munoz-Carpena, Rafael; Rosenzweig, Cynthia

    2016-01-01

    This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process that is needed to achieve sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive space" for knowledge product development and through private-public partnerships for new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and integration, and linkages of model integration platforms to new data management and visualization tools.

  18. Towards a new generation of agricultural system data, models and knowledge products: Design and improvement.

    PubMed

    Antle, John M; Basso, Bruno; Conant, Richard T; Godfray, H Charles J; Jones, James W; Herrero, Mario; Howitt, Richard E; Keating, Brian A; Munoz-Carpena, Rafael; Rosenzweig, Cynthia; Tittonell, Pablo; Wheeler, Tim R

    2017-07-01

    This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process that is needed to achieve sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive space" for knowledge product development and through private-public partnerships for new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and integration, and linkages of model integration platforms to new data management and visualization tools.

  19. A Simple Lightning Assimilation Technique For Improving Retrospective WRF Simulations

    EPA Science Inventory

    Convective rainfall is often a large source of error in retrospective modeling applications. In particular, positive rainfall biases commonly exist during summer months due to overactive convective parameterizations. In this study, lightning assimilation was applied in the Kain...

  20. A simple lightning assimilation technique for improving retrospective WRF simulations.

    EPA Science Inventory

    Convective rainfall is often a large source of error in retrospective modeling applications. In particular, positive rainfall biases commonly exist during summer months due to overactive convective parameterizations. In this study, lightning assimilation was applied in the Kain-F...

  1. An improved canopy wind model for predicting wind adjustment factors and wildland fire behavior

    Treesearch

    W. J. Massman; J. M. Forthofer; M. A. Finney

    2017-01-01

    The ability to rapidly estimate wind speed beneath a forest canopy or near the ground surface in any vegetation is critical to practical wildland fire behavior models. The common metric of this wind speed is the "mid-flame" wind speed, UMF. However, the existing approach for estimating UMF has some significant shortcomings. These include the assumptions that...

  2. Behavioral Design Teams: The Next Frontier in Clinical Delivery Innovation?

    PubMed

    Robertson, Ted; Darling, Matthew; Leifer, Jennifer; Footer, Owen; Gordski, Dani

    2017-11-01

    A deep understanding of human behavior is critical to designing effective health care delivery models, tools, and processes. Currently, however, few mechanisms exist to systematically apply insights about human behavior to improve health outcomes. Behavioral design teams (BDTs) are a successful model for applying behavioral insights within an organization. Already operational within government, this model can be adapted to function in a health care setting. The authors explore how BDTs could be applied to clinical care delivery and review models for integrating these teams within health care organizations, drawing on interviews with experts in clinical delivery innovation and applied behavioral science, as well as leaders of existing government BDTs. BDTs are most effective when they enjoy top-level executive support, are co-led by a domain expert and a behavioral scientist, collaborate closely with key staff and departments, have access to data and IT support, and operate a portfolio of projects. BDTs could be embedded in health care organizations in multiple ways, including in or just below the CEO's office, within a quality improvement unit, or within an internal innovation center. When running a portfolio, BDTs achieve a greater number and diversity of insights at lower cost. They also become a platform for strategic learning and scaling.

  3. Process Improvements in Training Device Acceptance Testing: A Study in Total Quality Management

    DTIC Science & Technology

    1990-12-12

    Quality Management, a small group of Government and industry specialists examined the existing training device acceptance test process for potential improvements. The agreed-to mission of the Air Force/Industry partnership was to continuously identify and promote implementable approaches to minimize the cost and time required for acceptance testing while ensuring that validated performance supports the user training requirements. Application of a Total Quality process improvement model focused on the customers and their requirements, analyzed how work was accomplished, and

  4. Improved Casting Furnace Conceptual Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fielding, Randall Sidney; Tolman, David Donald

    In an attempt to ensure more consistent casting results and remove some schedule variance associated with casting, an improved casting furnace concept has been developed. The improved furnace uses the existing arc melter hardware and glovebox utilities. The furnace concept was designed around physical and operational requirements such as: a charge size of less than 30 grams, high heating rates, and minimal additional footprint. The conceptual model is shown in the report, as well as a summary of how the requirements were met.

  5. A study of optical design of backlight module with external illuminance

    NASA Astrophysics Data System (ADS)

    Yen, Chih-Ta; Fang, Yi-Chin

    2011-10-01

    This research proposes the concept of a Light Guide Film (LGF) at the back side of the Back Light Unit (BLU). The new design can guide exterior light into the unit and thereby improve the power savings of an existing BLU. Two design models are presented: one for a 14 inch LCD monitor of a notebook computer, which might improve power savings by 21% compared to a traditional unit, and another for a 3.5 inch LCD for a mobile phone display, which might improve power savings by 15%.

  6. Correlated Topic Vector for Scene Classification.

    PubMed

    Wei, Pengxu; Qin, Fei; Wan, Fang; Zhu, Yi; Jiao, Jianbin; Ye, Qixiang

    2017-07-01

    Scene images usually involve semantic correlations, particularly when considering large-scale image data sets. This paper proposes a novel generative image representation, the correlated topic vector, to model such semantic correlations. Derived from the correlated topic model, the correlated topic vector is intended to naturally utilize the correlations among topics, which are seldom considered in conventional feature encoding, e.g., the Fisher vector, but do exist in scene images. It is expected that the involvement of correlations can increase the discriminative capability of the learned generative model and consequently improve recognition accuracy. Incorporated with the Fisher kernel method, the correlated topic vector inherits the advantages of the Fisher vector. The contributions of visual words to the topics are further employed within the Fisher kernel framework to indicate the differences among scenes. Combined with deep convolutional neural network (CNN) features and a Gibbs sampling solution, the correlated topic vector shows great potential when processing large-scale and complex scene image data sets. Experiments on two scene image data sets demonstrate that the correlated topic vector significantly improves on the deep CNN features, and outperforms existing Fisher kernel-based features.

  7. Improving the Statistical Modeling of the TRMM Extreme Precipitation Monitoring System

    NASA Astrophysics Data System (ADS)

    Demirdjian, L.; Zhou, Y.; Huffman, G. J.

    2016-12-01

    This project improves upon an existing extreme precipitation monitoring system based on the Tropical Rainfall Measuring Mission (TRMM) daily product (3B42) using new statistical models. The proposed system utilizes a regional modeling approach, where data from similar grid locations are pooled to increase the quality and stability of the resulting model parameter estimates to compensate for the short data record. The regional frequency analysis is divided into two stages. In the first stage, the region defined by the TRMM measurements is partitioned into approximately 27,000 non-overlapping clusters using a recursive k-means clustering scheme. In the second stage, a statistical model is used to characterize the extreme precipitation events occurring in each cluster. Instead of utilizing the block-maxima approach used in the existing system, where annual maxima are fit to the Generalized Extreme Value (GEV) probability distribution at each cluster separately, the present work adopts the peak-over-threshold (POT) method of classifying points as extreme if they exceed a pre-specified threshold. Theoretical considerations motivate the use of the Generalized-Pareto (GP) distribution for fitting threshold exceedances. The fitted parameters can be used to construct simple and intuitive average recurrence interval (ARI) maps which reveal how rare a particular precipitation event is given its spatial location. The new methodology eliminates much of the random noise that was produced by the existing models due to a short data record, producing more reasonable ARI maps when compared with NOAA's long-term Climate Prediction Center (CPC) ground based observations. The resulting ARI maps can be useful for disaster preparation, warning, and management, as well as increased public awareness of the severity of precipitation events. Furthermore, the proposed methodology can be applied to various other extreme climate records.
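The peak-over-threshold step described above can be sketched as follows. The threshold choice and the method-of-moments estimators for the Generalized-Pareto shape and scale are generic textbook devices for illustration, not necessarily those used in the monitoring system.

```python
# Peaks-over-threshold sketch: values above a high quantile are the
# "extremes"; Generalized Pareto (GP) shape/scale for the excesses are
# then estimated by method of moments.
import random

def pot_gp_fit(series, quantile=0.95):
    data = sorted(series)
    u = data[int(quantile * (len(data) - 1))]        # threshold
    excesses = [x - u for x in series if x > u]
    n = len(excesses)
    mean = sum(excesses) / n
    var = sum((x - mean) ** 2 for x in excesses) / n
    shape = 0.5 * (1.0 - mean * mean / var)          # xi estimate
    scale = 0.5 * mean * (1.0 + mean * mean / var)   # sigma estimate
    return u, shape, scale, n

random.seed(0)
sample = [random.expovariate(1.0) for _ in range(10000)]
u, shape, scale, n = pot_gp_fit(sample)
# Exponential excesses are again exponential, so shape ~ 0, scale ~ 1.
```

The fitted shape and scale are what an average recurrence interval (ARI) map would be derived from, one fit per cluster.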

  8. Elastic and inelastic scattering of neutrons on 238U nucleus

    NASA Astrophysics Data System (ADS)

    Capote, R.; Trkov, A.; Sin, M.; Herman, M. W.; Soukhovitskiĩ, E. Sh.

    2014-04-01

    Advanced modelling of neutron induced reactions on the 238U nucleus is aimed at improving our knowledge of neutron scattering. Capture and fission channels are well constrained by available experimental data and the neutron standards evaluation. The focus of this contribution is on elastic and inelastic scattering cross sections. The employed nuclear reaction model includes: a new rotational-vibrational dispersive optical model potential coupling the low-lying collective bands of vibrational character observed in even-even actinides; the Engelbrecht-Weidenmüller transformation, allowing for inclusion of compound-direct interference effects; and a multi-humped fission barrier with absorption in the secondary well, described within the optical model for fission. The impact of the advanced modelling on elastic and inelastic scattering cross sections, including angular distributions and emission spectra, is assessed both by comparison with selected microscopic experimental data and by integral criticality benchmarks including measured reaction rates (e.g. JEMIMA, FLAPTOP and BIG TEN). Benchmark calculations provided feedback to improve the reaction modelling. Improvement of existing libraries will be discussed.

  9. An Evaluation of Understandability of Patient Journey Models in Mental Health

    PubMed Central

    2016-01-01

    Background: There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information systems and health information technology can benefit patients, staff, and the delivery of care. Objectives: This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing-based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Method: Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. Results: The preliminary evaluation of the usability of the 5 modeling techniques showed increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. Conclusions: The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers. PMID:27471006

  10. End-user satisfaction analysis on library management system unnes using technology acceptance model towards national standard of integrated library

    NASA Astrophysics Data System (ADS)

    Hardyanto, W.; Purwinarko, A.; Adhi, M. A.

    2018-03-01

    The library, as the gateway of the University, should be supported by an adequate information system in order to provide excellent and optimal service to every user. The library management system that has been in place since 2009 needs to be re-evaluated so that the system can meet the needs of both operators and Unnes users in particular, and users from outside Unnes in general. This study aims to evaluate and improve the existing library management system to produce a system that is accountable and able to meet the needs of end users, as well as to produce a library management system that is integrated across Unnes. The research is directed toward producing an evaluation report based on the Technology Acceptance Model (TAM) approach and a library management system integrated with the national standard.

  11. Global Optimization Ensemble Model for Classification Methods

    PubMed Central

    Anwar, Hina; Qamar, Usman; Muzaffar Qureshi, Abdul Wahab

    2014-01-01

    Supervised learning is the process of data mining for deducing rules from training datasets. A broad array of supervised learning algorithms exists, each with its own advantages and drawbacks. Some basic issues affect the accuracy of a classifier while solving a supervised learning problem, like the bias-variance tradeoff, the dimensionality of the input space, and noise in the input data. All these problems affect the accuracy of a classifier and are the reason that there is no globally optimal method for classification. No generalized improvement method exists that can increase the accuracy of any classifier while addressing all the problems stated above. This paper proposes a global optimization ensemble model for classification methods (GMC) that can improve the overall accuracy for supervised learning problems. The experimental results on various public datasets showed that the proposed model improved the accuracy of the classification models by 1% to 30%, depending upon the algorithm complexity. PMID:24883382
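The abstract does not detail GMC's optimization procedure, so as a minimal illustration of combining classifiers, a plain majority vote over base-model predictions can be sketched. The classifiers and labels below are hypothetical.

```python
# Generic majority-vote ensemble: pick the most common label per sample
# across several base classifiers. This illustrates ensemble combination
# in spirit only; GMC's actual global optimization is not reproduced.
from collections import Counter

def majority_vote(predictions_per_model):
    """predictions_per_model: list of equal-length label lists, one per model."""
    combined = []
    for labels in zip(*predictions_per_model):
        combined.append(Counter(labels).most_common(1)[0][0])
    return combined

# Three hypothetical classifiers voting on four samples:
preds = [
    ["a", "b", "b", "a"],
    ["a", "b", "a", "a"],
    ["b", "b", "b", "a"],
]
combined = majority_vote(preds)
```

Even this naive combination can outperform any single base model when their errors are not perfectly correlated, which is the intuition behind more elaborate ensemble schemes.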

  12. Reynolds averaged turbulence modelling using deep neural networks with embedded invariance

    DOE PAGES

    Ling, Julia; Kurzawski, Andrew; Templeton, Jeremy

    2016-10-18

    There exists significant demand for improved Reynolds-averaged Navier–Stokes (RANS) turbulence models that are informed by and can represent a richer set of turbulence physics. This paper presents a method of using deep neural networks to learn a model for the Reynolds stress anisotropy tensor from high-fidelity simulation data. A novel neural network architecture is proposed which uses a multiplicative layer with an invariant tensor basis to embed Galilean invariance into the predicted anisotropy tensor. It is demonstrated that this neural network architecture provides improved prediction accuracy compared with a generic neural network architecture that does not embed this invariance property. Furthermore, the Reynolds stress anisotropy predictions of this invariant neural network are propagated through to the velocity field for two test cases. For both test cases, significant improvement versus baseline RANS linear eddy viscosity and nonlinear eddy viscosity models is demonstrated.
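The multiplicative layer can be illustrated as a linear combination over an invariant tensor basis, b = sum_n g_n T_n, where the scalar coefficients g_n would come from the trained network as functions of the invariants. The coefficients and basis tensors below are hypothetical stand-ins, not values from the paper.

```python
# Sketch of the tensor-basis output layer: the network emits scalar
# coefficients g_n, and the predicted anisotropy is the combination
# b = sum_n g_n * T_n. Galilean invariance is inherited from the basis,
# regardless of the coefficient values.

def combine_basis(coeffs, basis):
    """Linear combination of same-shape matrices (lists of lists)."""
    rows, cols = len(basis[0]), len(basis[0][0])
    b = [[0.0] * cols for _ in range(rows)]
    for g, T in zip(coeffs, basis):
        for i in range(rows):
            for j in range(cols):
                b[i][j] += g * T[i][j]
    return b

# Two basis tensors and hypothetical coefficients g = (0.5, -1.0):
T1 = [[1.0, 0.0], [0.0, -1.0]]
T2 = [[0.0, 1.0], [1.0, 0.0]]
b = combine_basis([0.5, -1.0], [T1, T2])
```

The design choice worth noting is that the learned part is confined to the scalars g_n, so the symmetry properties of the output are guaranteed by construction rather than learned from data.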

  13. Reynolds averaged turbulence modelling using deep neural networks with embedded invariance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ling, Julia; Kurzawski, Andrew; Templeton, Jeremy

    There exists significant demand for improved Reynolds-averaged Navier–Stokes (RANS) turbulence models that are informed by and can represent a richer set of turbulence physics. This paper presents a method of using deep neural networks to learn a model for the Reynolds stress anisotropy tensor from high-fidelity simulation data. A novel neural network architecture is proposed which uses a multiplicative layer with an invariant tensor basis to embed Galilean invariance into the predicted anisotropy tensor. It is demonstrated that this neural network architecture provides improved prediction accuracy compared with a generic neural network architecture that does not embed this invariance property. Furthermore, the Reynolds stress anisotropy predictions of this invariant neural network are propagated through to the velocity field for two test cases. For both test cases, significant improvement versus baseline RANS linear eddy viscosity and nonlinear eddy viscosity models is demonstrated.

  14. A Generalized Framework for Modeling Next Generation 911 Implementations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelic, Andjelka; Aamir, Munaf Syed

    This document summarizes the current state of Sandia 911 modeling capabilities and then addresses key aspects of Next Generation 911 (NG911) architectures for expansion of existing models. Analysis of three NG911 implementations was used to inform heuristics, associated key data requirements, and assumptions needed to capture NG911 architectures in the existing models. Modeling of NG911 necessitates careful consideration of its complexity and the diversity of implementations. Draft heuristics for constructing NG911 models are presented based on the analysis, along with a summary of current challenges and ways to improve future NG911 modeling efforts. We found that NG911 relies on Enhanced 911 (E911) assets such as 911 selective routers to route calls originating from traditional telephony service, which are a majority of 911 calls. We also found that the diversity and transitional nature of NG911 implementations necessitates significant and frequent data collection to ensure that adequate models are available for crisis action support.

  15. Design for navigation improvements at Nome Harbor, Alaska: Coastal model investigation. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bottin, R.R.; Acuff, H.F.

    1998-09-01

    A 1:90-scale (undistorted) three-dimensional coastal hydraulic model was used to investigate the design of proposed navigation improvements at Nome Harbor, Alaska, with respect to wave, current, and shoaling conditions at the site. The model reproduced about 3,350 m (11,000 ft) of the Alaskan shoreline, the existing harbor and lower reaches of the Snake River, and sufficient offshore bathymetry in the Norton Sound to permit generation of the required experimental waves. The model was used to determine the impacts of a new entrance channel on wave-induced current patterns and magnitudes, sediment transport patterns, and wave conditions in the new channel and harbor area, as well as to optimize the lengths and alignments of new breakwaters and causeway extensions. A 24.4-m-long (80-ft-long) unidirectional, spectral wave generator, an automated data acquisition and control system, and a crushed coal tracer material were utilized in model operation. It was concluded from study results that: (a) existing conditions are characterized by rough and turbulent wave conditions in the existing entrance. Very confused wave patterns were observed in the entrance due to wave energy reflected off the vertical walls lining the entrance. Wave heights in excess of 1.5 m (5 ft) were obtained in the entrance for typical storm conditions; and wave heights of almost 3.7 m (12 ft) were obtained in the entrance for 50-year storm wave conditions with an extreme high-water level of +4 m (+13 ft); (b) wave conditions along the vertical-faced causeway docks were excessive for existing conditions. Wave heights in excess of 3.7 and 2.7 m (12 and 9 ft) were obtained along the outer and inner docks, respectively, for typical storm conditions; and wave heights of almost 7 and 5.8 m (23 and 19 ft) were recorded along these docks, respectively, for 50-year storm wave conditions with extreme high-water levels.

  16. New constraints and discovery potential for Higgs to Higgs cascade decays through vectorlike leptons

    DOE PAGES

    Dermíšek, Radovan; Lunghi, Enrico; Shin, Seodong

    2016-10-17

    One of the cleanest signatures of a heavy Higgs boson in models with vectorlike leptons is H → e4±ℓ∓ → hℓ+ℓ-, which, in the two Higgs doublet model type-II, can even be the dominant decay mode of heavy Higgses. Among the decay modes of the standard model like Higgs boson, h, we consider bb̄ and γγ as representative channels with sizable and negligible background, respectively. We obtained new model independent limits on the production cross section for this process from recasting existing experimental searches and interpret them within the two Higgs doublet model. In addition, we show that these limits can be improved by about two orders of magnitude with appropriate selection cuts immediately with existing data sets. We also discuss expected sensitivities with integrated luminosity up to 3 ab⁻¹ and present a brief overview of other channels.

  17. New constraints and discovery potential for Higgs to Higgs cascade decays through vectorlike leptons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dermíšek, Radovan; Lunghi, Enrico; Shin, Seodong

    One of the cleanest signatures of a heavy Higgs boson in models with vectorlike leptons is H → e4±ℓ∓ → hℓ+ℓ-, which, in the two Higgs doublet model type-II, can even be the dominant decay mode of heavy Higgses. Among the decay modes of the standard model like Higgs boson, h, we consider bb̄ and γγ as representative channels with sizable and negligible background, respectively. We obtained new model independent limits on the production cross section for this process from recasting existing experimental searches and interpret them within the two Higgs doublet model. In addition, we show that these limits can be improved by about two orders of magnitude with appropriate selection cuts immediately with existing data sets. We also discuss expected sensitivities with integrated luminosity up to 3 ab⁻¹ and present a brief overview of other channels.

  18. Interacting multiple model forward filtering and backward smoothing for maneuvering target tracking

    NASA Astrophysics Data System (ADS)

    Nandakumaran, N.; Sutharsan, S.; Tharmarasa, R.; Lang, Tom; McDonald, Mike; Kirubarajan, T.

    2009-08-01

    The Interacting Multiple Model (IMM) estimator has been proven to be effective in tracking agile targets. Smoothing or retrodiction, which uses measurements beyond the current estimation time, provides better estimates of target states. Various methods have been proposed for multiple model smoothing in the literature. In this paper, a new smoothing method, which involves forward filtering followed by backward smoothing while maintaining the fundamental spirit of the IMM, is proposed. The forward filtering is performed using the standard IMM recursion, while the backward smoothing is performed using a novel interacting smoothing recursion. This backward recursion mimics the IMM estimator in the backward direction, where each mode-conditioned smoother uses the standard Kalman smoothing recursion. The resulting algorithm provides improved but delayed estimates of target states. Simulation studies are performed to demonstrate the improved performance with a maneuvering target scenario. The comparison with existing methods confirms the improved smoothing accuracy. This improvement results from avoiding the augmented state vector used by other algorithms. In addition, the new technique to account for model switching in smoothing is key to improving the performance.
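The single-model building block of such a smoother can be sketched as a scalar Kalman filter forward pass followed by a Rauch-Tung-Striebel (RTS) backward pass. The IMM smoother runs one such mode-conditioned recursion per model and adds a mixing step, which is omitted here; all parameter values are hypothetical.

```python
# Scalar Kalman filter + RTS smoother sketch (one mode of an IMM bank).
# f: state transition, q: process noise, h: measurement map, r: meas. noise.

def kalman_rts(zs, f=1.0, q=0.01, h=1.0, r=0.25, x0=0.0, p0=1.0):
    xs, ps, xps, pps = [], [], [], []
    x, p = x0, p0
    for z in zs:                               # forward filter
        xp, pp = f * x, f * p * f + q          # predict
        k = pp * h / (h * pp * h + r)          # Kalman gain
        x, p = xp + k * (z - h * xp), (1 - k * h) * pp
        xs.append(x); ps.append(p); xps.append(xp); pps.append(pp)
    xs_s = xs[:]                               # backward RTS pass
    for t in range(len(zs) - 2, -1, -1):
        g = ps[t] * f / pps[t + 1]             # smoother gain
        xs_s[t] = xs[t] + g * (xs_s[t + 1] - xps[t + 1])
    return xs, xs_s

filtered, smoothed = kalman_rts([1.0, 1.1, 0.9, 1.2, 1.0])
```

The smoothed estimates use future measurements, which is why they improve on the filtered ones at the cost of delay, exactly the trade-off the abstract describes.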

  19. An Update on Improvements to NiCE Support for PROTEUS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Andrew; McCaskey, Alexander J.; Billings, Jay Jay

    2015-09-01

    The Department of Energy Office of Nuclear Energy's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program has supported the development of the NEAMS Integrated Computational Environment (NiCE), a modeling and simulation workflow environment that provides services and plugins to facilitate tasks such as code execution, model input construction, visualization, and data analysis. This report details the development of workflows for the reactor core neutronics application, PROTEUS. This advanced neutronics application (primarily developed at Argonne National Laboratory) aims to improve nuclear reactor design and analysis by providing an extensible and massively parallel, finite-element solver for current and advanced reactor fuel neutronics modeling. The integration of PROTEUS-specific tools into NiCE is intended to make the advanced capabilities that PROTEUS provides more accessible to the nuclear energy research and development community. This report will detail the work done to improve existing PROTEUS workflow support in NiCE. We will demonstrate and discuss these improvements, including the development of flexible IO services, an improved interface for input generation, and the addition of advanced Fortran development tools natively in the platform.

  20. Coupling population dynamics with earth system models: the POPEM model.

    PubMed

    Navarro, Andrés; Moreno, Raúl; Jiménez-Alcázar, Alfonso; Tapiador, Francisco J

    2017-09-16

    Precise modeling of CO2 emissions is important for environmental research. This paper presents a new model of human population dynamics that can be embedded into ESMs (Earth System Models) to improve climate modeling. Through a system dynamics approach, we develop a cohort-component model that successfully simulates historical population dynamics with fine spatial resolution (about 1°×1°). The population projections are used to improve the estimates of CO2 emissions, thus transcending the bulk approach of existing models and allowing more realistic non-linear effects to feature in the simulations. The module, dubbed POPEM (from Population Parameterization for Earth Models), is compared with current emission inventories and validated against UN aggregated data. Finally, it is shown that the module can be used to advance toward fully coupling the social and natural components of the Earth system, an emerging research path for environmental science and pollution research.
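A cohort-component step of the kind the model is built on can be sketched in a few lines: each age group advances one period under a survival rate, and births from fertile cohorts enter the youngest group. The rates below are hypothetical; POPEM's gridded, calibrated rates differ.

```python
# Minimal cohort-component projection sketch: one time step advances
# cohorts by survival and adds births from fertile age groups.

def project(cohorts, survival, fertility):
    """cohorts[i] = persons in age group i; returns next-step cohorts."""
    births = sum(n * f for n, f in zip(cohorts, fertility))
    aged = [births] + [n * s for n, s in zip(cohorts[:-1], survival[:-1])]
    # Oldest group keeps its own survivors in addition to new entrants:
    aged[-1] += cohorts[-1] * survival[-1]
    return aged

cohorts = [100.0, 90.0, 80.0]        # young, adult, old (hypothetical)
survival = [0.99, 0.97, 0.50]
fertility = [0.0, 0.40, 0.0]
next_step = project(cohorts, survival, fertility)
```

Iterating this step per grid cell, with spatially varying rates, is what yields the gridded projections that feed the emission estimates.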

  1. Analysis on trust influencing factors and trust model from multiple perspectives of online Auction

    NASA Astrophysics Data System (ADS)

    Yu, Wang

    2017-10-01

    Current reputation models lack research on online auction trading, so they cannot fully reflect the reputation status of users and may cause operability problems. To evaluate user trust in online auctions correctly, a trust computing model based on multiple influencing factors is established. It aims to overcome the inefficiency of current trust computing methods and the limitations of traditional theoretical trust models. The improved model comprehensively considers the trust degree evaluation factors of three types of participants according to the different participation modes of online auctioneers, to improve the accuracy, effectiveness and robustness of the trust degree. Experiments test the efficiency and performance of the model under different scales of malicious users, in environments like eBay and the Sporas model. The experimental results show that the model proposed in this paper makes up for the deficiencies of existing models and has better feasibility.
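A multi-factor trust score of the general kind described can be sketched as a weighted mean over factor scores. The factor names and weights below are purely illustrative; the paper's actual factor set per participant type is not reproduced here.

```python
# Hypothetical weighted aggregation of trust factors for one participant
# role; each role (buyer, seller, etc.) would carry its own factor set
# and weights.

def trust_score(factors, weights):
    """Weighted mean of factor scores in [0, 1]; weights need not sum to 1."""
    total = sum(weights.values())
    return sum(factors[k] * w for k, w in weights.items()) / total

seller_factors = {"feedback": 0.9, "transaction_value": 0.7, "recency": 0.8}
seller_weights = {"feedback": 0.5, "transaction_value": 0.3, "recency": 0.2}
score = trust_score(seller_factors, seller_weights)
```

Robustness against malicious raters would then come from how the individual factor scores are computed, not from this aggregation step itself.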

  2. On the Conditioning of Machine-Learning-Assisted Turbulence Modeling

    NASA Astrophysics Data System (ADS)

    Wu, Jinlong; Sun, Rui; Wang, Qiqi; Xiao, Heng

    2017-11-01

    Recently, several researchers have demonstrated that machine learning techniques can be used to improve the RANS-modeled Reynolds stress by training on available databases of high fidelity simulations. However, obtaining an improved mean velocity field remains an unsolved challenge, restricting the predictive capability of current machine-learning-assisted turbulence modeling approaches. In this work we define a condition number to evaluate the model conditioning of data-driven turbulence modeling approaches, and propose a stability-oriented machine learning framework to model the Reynolds stress. Two canonical flows, the flow in a square duct and the flow over periodic hills, are investigated to demonstrate the predictive capability of the proposed framework. The satisfactory prediction of the mean velocity field for both flows demonstrates the predictive capability of the proposed framework for machine-learning-assisted turbulence modeling. By demonstrating this capability to improve the prediction of the mean flow field, the proposed stability-oriented machine learning framework bridges the gap between existing machine-learning-assisted turbulence modeling approaches and the predictive capability demanded of turbulence models in real applications.
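The abstract's condition number is defined in the paper itself; as a generic stand-in, the idea of conditioning can be illustrated by the ratio of relative output change to relative input change for a fixed linear map. The matrix and perturbation below are hypothetical.

```python
# Generic conditioning illustration: how strongly a small relative input
# perturbation is amplified in the output of a linear map. This is a
# stand-in metric, not the condition number defined in the paper.

def apply_map(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def norm(v):
    return sum(u * u for u in v) ** 0.5

def sensitivity(A, x, dx):
    """(||A(x+dx) - Ax|| / ||Ax||) / (||dx|| / ||x||)."""
    y = apply_map(A, x)
    y2 = apply_map(A, [a + b for a, b in zip(x, dx)])
    dy = [b - a for a, b in zip(y, y2)]
    return (norm(dy) / norm(y)) / (norm(dx) / norm(x))

ill = [[1.0, 0.0], [0.0, 1e-3]]      # map with very unequal gains
s = sensitivity(ill, [0.0, 1.0], [1e-6, 0.0])
```

Here the input lies in the small-gain direction, so a perturbation toward the large-gain direction is amplified by the gain ratio; an ill-conditioned propagation from Reynolds stress to mean velocity behaves analogously.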

  3. Improving Our Ability to Evaluate Underlying Mechanisms of Behavioral Onset and Other Event Occurrence Outcomes: A Discrete-Time Survival Mediation Model

    PubMed Central

    Fairchild, Amanda J.; Abara, Winston E.; Gottschall, Amanda C.; Tein, Jenn-Yun; Prinz, Ronald J.

    2015-01-01

    The purpose of this article is to introduce and describe a statistical model that researchers can use to evaluate underlying mechanisms of behavioral onset and other event occurrence outcomes. Specifically, the article develops a framework for estimating mediation effects with outcomes measured in discrete-time epochs by integrating the statistical mediation model with discrete-time survival analysis. The methodology has the potential to help strengthen health research by targeting prevention and intervention work more effectively as well as by improving our understanding of discretized periods of risk. The model is applied to an existing longitudinal data set to demonstrate its use, and programming code is provided to facilitate its implementation. PMID:24296470
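
    Discrete-time survival analysis operates on a person-period data set; a minimal sketch of that expansion (variable names hypothetical, not the authors' code) is below. A logistic regression of the binary outcome on period indicators plus treatment and mediator columns then yields the discrete-time hazard model into which the mediation paths are embedded:

```python
def person_period(ids, event_times, events):
    """Expand (id, event time, event flag) records into person-period rows.

    Each subject contributes one row per discrete period at risk; the
    binary outcome y is 1 only in the period where the event occurs
    (censored subjects never get a 1).
    """
    rows = []
    for i, t, e in zip(ids, event_times, events):
        for period in range(1, t + 1):
            y = 1 if (e == 1 and period == t) else 0
            rows.append((i, period, y))
    return rows

# subject 1 experiences onset in period 2; subject 2 is censored after period 3
rows = person_period(ids=[1, 2], event_times=[2, 3], events=[1, 0])
for r in rows:
    print(r)
```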

  4. An improved Multimodel Approach for Global Sea Surface Temperature Forecasts

    NASA Astrophysics Data System (ADS)

    Khan, M. Z. K.; Mehrotra, R.; Sharma, A.

    2014-12-01

    The concept of ensemble combinations for formulating improved climate forecasts has gained popularity in recent years. However, many climate models share similar physics or modeling processes, which may lead to similar (or strongly correlated) forecasts. Recent approaches for combining forecasts that account for differences in model accuracy over space and time have either ignored the similarity of forecasts among the models or followed a pairwise dynamic combination approach. Here we present a basis for combining model predictions, illustrating the improvements that can be achieved when procedures for factoring in inter-model dependence are utilised. The utility of the approach is demonstrated by combining sea surface temperature (SST) forecasts from five climate models over the period 1960-2005. The variable of interest, the monthly global sea surface temperature anomaly (SSTA) on a 5° × 5° latitude-longitude grid, is predicted three months in advance to demonstrate the utility of the proposed algorithm. Results indicate that the proposed approach offers consistent and significant improvements for the majority of grid points compared to the case where the dependence among the models is ignored. Therefore, the proposed approach of combining multiple models while taking into account their interdependence provides an attractive alternative for obtaining improved climate forecasts. In addition, an approach to combine seasonal forecasts from multiple climate models with varying periods of availability is also demonstrated.
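
    One standard way to factor inter-model dependence into a combination (a sketch, not necessarily the authors' exact scheme) is to weight models through the inverse of their error covariance matrix, w = Σ⁻¹1 / (1ᵀΣ⁻¹1), so strongly correlated models share weight instead of being counted twice:

```python
import numpy as np

def dependence_aware_weights(errors):
    """Minimum-variance combination weights from model-error samples.

    errors: (n_samples, n_models) array of forecast errors. Weights
    solve min wᵀΣw subject to sum(w) = 1, i.e. w = Σ⁻¹1 / (1ᵀΣ⁻¹1).
    """
    sigma = np.cov(errors, rowvar=False)
    inv = np.linalg.inv(sigma)
    ones = np.ones(errors.shape[1])
    w = inv @ ones
    return w / (ones @ w)

rng = np.random.default_rng(1)
base = rng.normal(size=(200, 1))
# models 1 and 2 share an error component; model 3 is independent
errors = np.hstack([base + 0.1 * rng.normal(size=(200, 1)),
                    base + 0.1 * rng.normal(size=(200, 1)),
                    rng.normal(size=(200, 1))])
print(dependence_aware_weights(errors))
```

    The independent model receives roughly the weight of the two correlated models combined, which is the behaviour an ignore-the-dependence average misses.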

  5. Application of neural networks and sensitivity analysis to improved prediction of trauma survival.

    PubMed

    Hunter, A; Kennedy, L; Henry, J; Ferguson, I

    2000-05-01

    The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models.
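
    The paper's exact sensitivity-analysis formula is not reproduced in this abstract; a common permutation-style variant, which likewise handles numeric and nominal inputs, can be sketched as follows (toy model and data, purely illustrative):

```python
import numpy as np

def sensitivity(model, X, y, rng=None):
    """Rank input variables by the loss increase when each column is
    shuffled. Shuffling only permutes observed values, so the technique
    applies to numeric and nominal columns alike."""
    if rng is None:
        rng = np.random.default_rng(0)
    base = np.mean((model(X) - y) ** 2)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        col = Xp[:, j].copy()
        rng.shuffle(col)
        Xp[:, j] = col
        scores.append(np.mean((model(Xp) - y) ** 2) - base)
    return np.array(scores)

# toy "trained model": survival probability depends only on column 0
model = lambda X: 1.0 / (1.0 + np.exp(-X[:, 0]))
rng = np.random.default_rng(2)
X = rng.normal(size=(300, 3))
y = (rng.random(300) < model(X)).astype(float)
print(sensitivity(model, X, y))
```

    Columns the model ignores score near zero, so low-scoring TRISS variables would be candidates for removal from an improved scoring scheme.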

  6. Creating High Reliability in Health Care Organizations

    PubMed Central

    Pronovost, Peter J; Berenholtz, Sean M; Goeschel, Christine A; Needham, Dale M; Sexton, J Bryan; Thompson, David A; Lubomski, Lisa H; Marsteller, Jill A; Makary, Martin A; Hunt, Elizabeth

    2006-01-01

    Objective The objective of this paper was to present a comprehensive approach to help health care organizations reliably deliver effective interventions. Context Reliability in healthcare translates into using valid rate-based measures. Yet high reliability organizations have proven that the context in which care is delivered, called organizational culture, also has important influences on patient safety. Model for Improvement Our model to improve reliability, which also includes interventions to improve culture, focuses on valid rate-based measures. This model includes (1) identifying evidence-based interventions that improve the outcome, (2) selecting interventions with the most impact on outcomes and converting to behaviors, (3) developing measures to evaluate reliability, (4) measuring baseline performance, and (5) ensuring patients receive the evidence-based interventions. The comprehensive unit-based safety program (CUSP) is used to improve culture and guide organizations in learning from mistakes that are important, but cannot be measured as rates. Conclusions We present how this model was used in over 100 intensive care units in Michigan to improve culture and eliminate catheter-related blood stream infections—both were accomplished. Our model differs from existing models in that it incorporates efforts to improve a vital component for system redesign—culture, it targets 3 important groups—senior leaders, team leaders, and front line staff, and facilitates change management—engage, educate, execute, and evaluate for planned interventions. PMID:16898981

  7. The SURFEXv7.2 land and ocean surface platform for coupled or offline simulation of earth surface variables and fluxes

    NASA Astrophysics Data System (ADS)

    Masson, V.; Le Moigne, P.; Martin, E.; Faroux, S.; Alias, A.; Alkama, R.; Belamari, S.; Barbu, A.; Boone, A.; Bouyssel, F.; Brousseau, P.; Brun, E.; Calvet, J.-C.; Carrer, D.; Decharme, B.; Delire, C.; Donier, S.; Essaouini, K.; Gibelin, A.-L.; Giordani, H.; Habets, F.; Jidane, M.; Kerdraon, G.; Kourzeneva, E.; Lafaysse, M.; Lafont, S.; Lebeaupin Brossier, C.; Lemonsu, A.; Mahfouf, J.-F.; Marguinaud, P.; Mokhtari, M.; Morin, S.; Pigeon, G.; Salgado, R.; Seity, Y.; Taillefer, F.; Tanguy, G.; Tulet, P.; Vincendon, B.; Vionnet, V.; Voldoire, A.

    2013-07-01

    SURFEX is a new externalized land and ocean surface platform that describes the surface fluxes and the evolution of four types of surfaces: nature, town, inland water and ocean. It is mostly based on pre-existing, well-validated scientific models that are continuously improved. The motivation for the building of SURFEX is to use strictly identical scientific models in a high range of applications in order to mutualise the research and development efforts. SURFEX can be run in offline mode (0-D or 2-D runs) or in coupled mode (from mesoscale models to numerical weather prediction and climate models). An assimilation mode is included for numerical weather prediction and monitoring. In addition to momentum, heat and water fluxes, SURFEX is able to simulate fluxes of carbon dioxide, chemical species, continental aerosols, sea salt and snow particles. The main principles of the organisation of the surface are described first. Then, a survey is made of the scientific module (including the coupling strategy). Finally, the main applications of the code are summarised. The validation work undertaken shows that replacing the pre-existing surface models by SURFEX in these applications is usually associated with improved skill, as the numerous scientific developments contained in this community code are used to good advantage.

  8. Technical note: Harmonizing met-ocean model data via standard web services within small research groups

    NASA Astrophysics Data System (ADS)

    Signell, R. P.; Camossi, E.

    2015-11-01

    Work over the last decade has resulted in standardized web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by: (1) making it simple for providers to enable web service access to existing output files; (2) using technology that is free, and that is easy to deploy and configure; and (3) providing tools to communicate with web services that work in existing research environments. We present a simple, local brokering approach that lets modelers continue producing custom data, but virtually aggregates and standardizes the data using NetCDF Markup Language. The THREDDS Data Server is used for data delivery, pycsw for data search, NCTOOLBOX (Matlab®) and Iris (Python) for data access, and the Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS. (Mention of trade names or commercial products does not constitute endorsement or recommendation for use by the US Government.)

  9. Parameterizing the Variability and Uncertainty of Wind and Solar in CEMs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frew, Bethany

    We present current and improved methods for estimating the capacity value and curtailment impacts from variable generation (VG) in capacity expansion models (CEMs). The ideal calculation of these variability metrics is through an explicit co-optimized investment-dispatch model using multiple years of VG and load data. Because of data and computational limitations, existing CEMs typically approximate these metrics using a subset of all hours from a single year and/or using statistical methods, which often do not capture the tail-event impacts or the broader set of interactions between VG, storage, and conventional generators. In our proposed new methods, we use hourly generation and load values across all hours of the year to characterize the (1) contribution of VG to system capacity during high load hours, (2) the curtailment level of VG, and (3) the reduction in VG curtailment due to storage and shutdown of select thermal generators. Using CEM model outputs from a preceding model solve period, we apply these methods to exogenously calculate capacity value and curtailment metrics for the subsequent model solve period. Preliminary results suggest that these hourly methods offer improved capacity value and curtailment representations of VG in the CEM from existing approximation methods without additional computational burdens.
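
    The first metric, the contribution of VG to system capacity during high load hours, and the curtailment level can be sketched roughly as follows (a simplified illustration with synthetic numbers, not the actual CEM implementation):

```python
import numpy as np

def capacity_value(vg, load, top_n=100):
    """Mean VG output during the top-N load hours (capacity contribution)."""
    idx = np.argsort(load)[-top_n:]
    return vg[idx].mean()

def curtailment(vg, load, must_run):
    """Energy curtailed when VG exceeds load net of must-run generation."""
    headroom = np.maximum(load - must_run, 0.0)
    return np.maximum(vg - headroom, 0.0).sum()

# one synthetic year of hourly data (illustrative numbers only)
rng = np.random.default_rng(3)
load = 1000.0 + 200.0 * rng.random(8760)   # MW
vg = 300.0 * rng.random(8760)              # MW
print(capacity_value(vg, load), curtailment(vg, load, must_run=900.0))
```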

  10. Global analysis of fermion mixing with exotics

    NASA Technical Reports Server (NTRS)

    Nardi, Enrico; Roulet, Esteban; Tommasini, Daniele

    1991-01-01

    Limits on deviations of the lepton and quark weak couplings from their standard model values are analyzed in a general class of models where the known fermions are allowed to mix with new heavy particles with exotic SU(2) x U(1) quantum number assignments (left-handed singlets or right-handed doublets). These mixings appear in many extensions of the electroweak theory, such as models with mirror fermions, E(sub 6) models, etc. The results update previous analyses and considerably improve the existing bounds.

  11. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    NASA Technical Reports Server (NTRS)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Onar, Arzu; Bayliss, Jon; Ludwig, Larry

    2009-01-01

    In this experiment, an empirical model was developed to quantify the probability of occurrence of an electrical short circuit from tin whiskers as a function of voltage. This empirical model can be used to improve existing risk simulation models. FIB and TEM images confirmed a rare polycrystalline structure in one of the three whiskers studied. FIB cross-sections of the card guides verified that the tin finish was bright tin.
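
    The fitted form of the empirical model is not reproduced here; a minimal sketch of estimating short-circuit probability per voltage bin from bridging-test outcomes (all numbers illustrative, not the experiment's data) might look like:

```python
import numpy as np

def short_probability(voltages, shorted, bins):
    """Empirical P(short) per voltage bin from whisker bridging tests.

    voltages: test voltages; shorted: 0/1 outcome per test;
    bins: ascending bin edges. Empty bins return NaN.
    """
    which = np.digitize(voltages, bins)
    probs = []
    for b in range(1, len(bins)):
        mask = which == b
        probs.append(shorted[mask].mean() if mask.any() else np.nan)
    return np.array(probs)

voltages = np.array([1.0, 2.0, 5.0, 6.0, 10.0, 12.0, 15.0, 18.0])
shorted = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0])
print(short_probability(voltages, shorted, bins=[0, 5, 10, 20]))
```

    A smooth curve (e.g. a logistic fit) over these bin estimates would then plug into a risk simulation as P(short | voltage).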

  12. PiTS-1: Carbon Partitioning in Loblolly Pine after 13C Labeling and Shade Treatments

    DOE Data Explorer

    Warren, J. M.; Iversen, C. M.; Garten, Jr., C. T.; Norby, R. J.; Childs, J.; Brice, D.; Evans, R. M.; Gu, L.; Thornton, P.; Weston, D. J.

    2013-01-01

    The PiTS task was established with the objective of improving the C partitioning routines in existing ecosystem models by exploring mechanistic model representations of partitioning tested against field observations. We used short-term field manipulations of C flow, through 13CO2 labeling, canopy shading and stem girdling, to dramatically alter C partitioning, and resultant data are being used to test model representation of C partitioning processes in the Community Land Model (CLM4 or CLM4.5).

  13. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Within the last decade, better reliability models (hardware, software, system) than those currently used have been theorized and developed, but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g. OO) have appeared, and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that their products meet NASA requirements for reliability measurement. For the new software-component models of the last decade, there is a great need to bring them into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability models to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability models, which could then be incorporated in a tool such as SMERFS'3. This tool, with better models, would greatly add value in assessing GSFC projects.

  14. An integrated decision model for the application of airborne sensors for improved response to accidental and terrorist chemical vapor releases

    NASA Astrophysics Data System (ADS)

    Kapitan, Loginn

    This research created a new model which provides an integrated approach to planning the effective selection and employment of airborne sensor systems in response to accidental or intentional chemical vapor releases. The approach taken was to use systems engineering and decision analysis methods to construct a model architecture which produced a modular structure for integrating both new and existing components into a logical procedure to assess the application of airborne sensor systems to address chemical vapor hazards. The resulting integrated process model includes an internal aggregation model which allowed differentiation among alternative airborne sensor systems. Both models were developed and validated by experts and demonstrated using appropriate hazardous chemical release scenarios. The resultant prototype integrated process model or system fills a current gap in capability allowing improved planning, training and exercise for HAZMAT teams and first responders when considering the selection and employment of airborne sensor systems. Through the research process, insights into the current response structure and how current airborne capability may be most effectively used were generated. Furthermore, the resultant prototype system is tailorable for local, state, and federal application, and can potentially be modified to help evaluate investments in new airborne sensor technology and systems. Better planning, training and preparedness exercising holds the prospect for the effective application of airborne assets for improved response to large scale chemical release incidents. Improved response will result in fewer casualties and lives lost, reduced economic impact, and increased protection of critical infrastructure when faced with accidental and intentional terrorist release of hazardous industrial chemicals. 
With the prospect of more airborne sensor systems becoming available, this prototype system integrates existing and new tools into an effective process for the selection and employment of airborne sensors to better plan, train and exercise ahead of potential chemical release events.

  15. Physically based estimation of soil water retention from textural data: General framework, new models, and streamlined existing models

    USGS Publications Warehouse

    Nimmo, J.R.; Herkelrath, W.N.; Laguna, Luna A.M.

    2007-01-01

    Numerous models are in widespread use for the estimation of soil water retention from more easily measured textural data. Improved models are needed for better prediction and wider applicability. We developed a basic framework from which new and existing models can be derived to facilitate improvements. Starting from the assumption that every particle has a characteristic dimension R associated uniquely with a matric pressure ψ and that the form of the ψ-R relation is the defining characteristic of each model, this framework leads to particular models by specification of geometric relationships between pores and particles. Typical assumptions are that particles are spheres, pores are cylinders with volume equal to the associated particle volume times the void ratio, and that the capillary inverse proportionality between radius and matric pressure is valid. Examples include fixed-pore-shape and fixed-pore-length models. We also developed alternative versions of the model of Arya and Paris that eliminate its interval-size dependence and other problems. The alternative models are calculable by direct application of algebraic formulas rather than manipulation of data tables and intermediate results, and they easily combine with other models (e.g., incorporating structural effects) that are formulated on a continuous basis. Additionally, we developed a family of models based on the same pore geometry as the widely used unsaturated hydraulic conductivity model of Mualem. Predictions of measurements for different suitable media show that some of the models provide consistently good results and can be chosen based on ease of calculations and other factors. © Soil Science Society of America. All rights reserved.
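
    The capillary inverse proportionality between radius and matric pressure mentioned above is the Young-Laplace relation; a minimal sketch, assuming pure water at room temperature and a zero contact angle (constants are assumptions, not the paper's values):

```python
SIGMA = 0.072     # surface tension of water at ~20 °C, N/m (assumed)
GAMMA_W = 9810.0  # unit weight of water, N/m^3

def matric_pressure(radius_m):
    """Capillary matric pressure for a cylindrical pore of radius r,
    zero contact angle: psi = 2*sigma/r (Pa)."""
    return 2.0 * SIGMA / radius_m

def matric_head(radius_m):
    """Equivalent matric head in metres of water."""
    return matric_pressure(radius_m) / GAMMA_W

print(matric_pressure(1e-6))  # 1 µm pore -> ~1.44e5 Pa
print(matric_head(1e-6))      # -> roughly 14.7 m of water
```

    Each model in the paper's framework then differs only in how a particle dimension R is mapped to an effective pore radius before this conversion.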

  16. VMT Mix Modeling for Mobile Source Emissions Forecasting: Formulation and Empirical Application

    DOT National Transportation Integrated Search

    2000-05-01

    The purpose of the current report is to propose and implement a methodology for obtaining improved link-specific vehicle miles of travel (VMT) mix values compared to those obtained from existent methods. Specifically, the research is developing a fra...

  17. Proceedings - International Conference on Wheel/Rail Load and Displacement Measurement Techniques : January 19-20, 1981

    DOT National Transportation Integrated Search

    1981-09-01

    Measurement of wheel/rail characteristics generates information for improvement of design tools such as model validation, establishment of load spectra and vehicle/track system interaction. Existing and new designs are assessed from evaluation of veh...

  18. IMPROVED SCIENCE AND DECISION SUPPORT FOR MANAGING WATERSHED NUTRIENT LOADS

    EPA Science Inventory

    The proposed research addresses two critical gaps in the TMDL process: (1) the inadequacy of presently existing receiving water models to accurately simulate nutrient-sediment-water interactions and fixed plants; and (2) the lack of decision-oriented optimization f...

  19. A network-based approach for semi-quantitative knowledge mining and its application to yield variability

    NASA Astrophysics Data System (ADS)

    Schauberger, Bernhard; Rolinski, Susanne; Müller, Christoph

    2016-12-01

    Variability of crop yields is detrimental for food security. Under climate change its amplitude is likely to increase, thus it is essential to understand the underlying causes and mechanisms. Crop models are the primary tool to project future changes in crop yields under climate change. A systematic overview of drivers and mechanisms of crop yield variability (YV) can thus inform crop model development and facilitate improved understanding of climate change impacts on crop yields. Yet there is a vast body of literature on crop physiology and YV, which makes a prioritization of mechanisms for implementation in models challenging. Therefore this paper takes on a novel approach to systematically mine and organize existing knowledge from the literature. The aim is to identify important mechanisms lacking in models, which can help to set priorities in model improvement. We structure knowledge from the literature in a semi-quantitative network. This network consists of complex interactions between growing conditions, plant physiology and crop yield. We utilize the resulting network structure to assign relative importance to causes of YV and related plant physiological processes. As expected, our findings confirm existing knowledge, in particular on the dominant role of temperature and precipitation, but also highlight other important drivers of YV. More importantly, our method allows for identifying the relevant physiological processes that transmit variability in growing conditions to variability in yield. We can identify explicit targets for the improvement of crop models. The network can additionally guide model development by outlining complex interactions between processes and by easily retrieving quantitative information for each of the 350 interactions. We show the validity of our network method as a structured, consistent and scalable dictionary of literature. The method can easily be applied to many other research fields.

  20. Modified compensation algorithm of lever-arm effect and flexural deformation for polar shipborne transfer alignment based on improved adaptive Kalman filter

    NASA Astrophysics Data System (ADS)

    Wang, Tongda; Cheng, Jianhua; Guan, Dongxue; Kang, Yingyao; Zhang, Wei

    2017-09-01

    Due to the lever-arm effect and flexural deformation in the practical application of transfer alignment (TA), the TA performance is decreased. The existing polar TA algorithm only compensates a fixed lever-arm without considering the dynamic lever-arm caused by flexural deformation; traditional non-polar TA algorithms also have some limitations. Thus, the performance of existing compensation algorithms is unsatisfactory. In this paper, a modified compensation algorithm of the lever-arm effect and flexural deformation is proposed to promote the accuracy and speed of the polar TA. On the basis of a dynamic lever-arm model and a noise compensation method for flexural deformation, polar TA equations are derived in grid frames. Based on the velocity-plus-attitude matching method, the filter models of polar TA are designed. An adaptive Kalman filter (AKF) is improved to promote the robustness and accuracy of the system, and then applied to the estimation of the misalignment angles. Simulation and experiment results have demonstrated that the modified compensation algorithm based on the improved AKF for polar TA can effectively compensate the lever-arm effect and flexural deformation, and then improve the accuracy and speed of TA in the polar region.
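
    The paper's filter matches velocity plus attitude in grid frames; as a much-reduced sketch, an innovation-based adaptive Kalman filter in one dimension (a common AKF scheme, not the authors' model) looks like:

```python
import numpy as np

def adaptive_kf(zs, q=1e-4, r0=1.0, window=20):
    """1-D random-walk Kalman filter with innovation-based R adaptation.

    The measurement-noise variance R is re-estimated from a moving
    window of innovations, which improves robustness when the assumed
    noise level is wrong."""
    x, p, r = 0.0, 1.0, r0
    innovations, estimates = [], []
    for z in zs:
        p += q                        # predict (random-walk state)
        nu = z - x                    # innovation
        innovations.append(nu)
        if len(innovations) >= window:
            recent = np.array(innovations[-window:])
            r = max(np.mean(recent ** 2) - p, 1e-8)  # adapt R
        k = p / (p + r)               # Kalman gain
        x += k * nu
        p *= (1.0 - k)
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(4)
zs = 5.0 + 0.5 * rng.normal(size=200)  # noisy measurements of a constant
est = adaptive_kf(zs)
print(est[-1])  # should approach 5.0
```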

  1. Improving Crop Productions Using the Irrigation & Crop Production Model Under Drought

    NASA Astrophysics Data System (ADS)

    Shin, Y.; Lee, T.; Lee, S. H.; Kim, J.; Jang, W.; Park, S.

    2017-12-01

    We aimed to improve crop production by providing optimal irrigation water amounts (IWAs) for various soils and crops using the Irrigation & Crop Production (ICP) model under various hydro-climatic regions. We selected the Little Washita (LW 13/21) and Bangdong-ri sites in Oklahoma (United States of America) and Chuncheon (Republic of Korea) for the synthetic studies. Our results showed that the ICP model performed well in improving crop production by providing optimal IWAs during the study period (2000 to 2016). Crop production was significantly affected by solar radiation and precipitation, but the maximum and minimum temperatures showed less impact. Since the weather variables cannot be adjusted by artificial activities, irrigation might be the only solution for improving crop production under drought. Also, the presence of shallow groundwater (SGW) table depths highly influences crop production. Although uncertainties exist in the synthetic studies, our results showed the robustness of the ICP model for improving crop production under drought conditions. Thus, the ICP model can contribute to efficient water management plans under drought in regions where water availability is limited.
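
    The ICP model's internals are not given in this abstract; as a hedged illustration, an optimal irrigation water amount is often approximated by a simple water-balance deficit (function name and parameter values hypothetical):

```python
def irrigation_requirement(et_crop, rain, efficiency=0.75):
    """Daily irrigation need: crop water demand minus effective rainfall,
    divided by application efficiency. All water depths in mm/day."""
    deficit = max(et_crop - rain, 0.0)
    return deficit / efficiency

# a dry day: 6 mm/day crop demand, 1 mm effective rain
print(irrigation_requirement(6.0, 1.0))
```

    A shallow groundwater table would reduce the deficit through capillary rise, which is one reason the abstract flags SGW depth as a strong influence.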

  2. Continuing Development of a Hybrid Model (VSH) of the Neutral Thermosphere

    NASA Technical Reports Server (NTRS)

    Burns, Alan

    1996-01-01

    We propose to continue the development of a new operational model of neutral thermospheric density, composition, temperatures and winds to improve current engineering environment definitions of the neutral thermosphere. This model will be based on simulations made with the National Center for Atmospheric Research (NCAR) Thermosphere-Ionosphere-Electrodynamic General Circulation Model (TIEGCM) and on empirical data. It will be capable of using real-time geophysical indices or data from ground-based and satellite inputs, and will provide neutral variables at specified locations and times. This "hybrid" model will be based on a Vector Spherical Harmonic (VSH) analysis technique, developed over the last 8 years at the University of Michigan, that permits the incorporation of the TIEGCM outputs and data into the model. The VSH model will be a more accurate version of existing models of the neutral thermosphere, and will thus improve density specification for satellites flying in low Earth orbit (LEO).

  3. Technical Review of the CENWP Computational Fluid Dynamics Model of the John Day Dam Forebay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rakowski, Cynthia L.; Serkowski, John A.; Richmond, Marshall C.

    The US Army Corps of Engineers Portland District (CENWP) has developed a computational fluid dynamics (CFD) model of the John Day forebay on the Columbia River to aid in the development and design of alternatives to improve juvenile salmon passage at the John Day Project. At the request of CENWP, Pacific Northwest National Laboratory (PNNL) Hydrology Group has conducted a technical review of CENWP's CFD model run in CFD solver software, STAR-CD. PNNL has extensive experience developing and applying 3D CFD models run in STAR-CD for Columbia River hydroelectric projects. The John Day forebay model developed by CENWP is adequately configured and validated. The model is ready for use simulating forebay hydraulics for structural and operational alternatives. The approach and method are sound, however CENWP has identified some improvements that need to be made for future models and for modifications to this existing model.

  4. Hypersonic flow analysis

    NASA Technical Reports Server (NTRS)

    Chow, Chuen-Yen; Ryan, James S.

    1987-01-01

    While the zonal grid system of Transonic Navier-Stokes (TNS) provides excellent modeling of complex geometries, improved shock capturing, and a higher Mach number range will be required if flows about hypersonic aircraft are to be modeled accurately. A computational fluid dynamics (CFD) code, the Compressible Navier-Stokes (CNS), is under development to combine the required high Mach number capability with the existing TNS geometry capability. One of several candidate flow solvers for inclusion in the CNS is that of F3D. This upwinding flow solver promises improved shock capturing, and more accurate hypersonic solutions overall, compared to the solver currently used in TNS.

  5. A Self-Adaptive Dynamic Recognition Model for Fatigue Driving Based on Multi-Source Information and Two Levels of Fusion

    PubMed Central

    Sun, Wei; Zhang, Xiaorui; Peeta, Srinivas; He, Xiaozheng; Li, Yongfu; Zhu, Senlai

    2015-01-01

    To improve the effectiveness and robustness of fatigue driving recognition, a self-adaptive dynamic recognition model is proposed that incorporates information from multiple sources and involves two sequential levels of fusion, constructed at the feature level and the decision level. Compared with existing models, the proposed model introduces a dynamic basic probability assignment (BPA) to the decision-level fusion such that the weight of each feature source can change dynamically with the real-time fatigue feature measurements. Further, the proposed model can combine the fatigue state at the previous time step in the decision-level fusion to improve the robustness of the fatigue driving recognition. An improved correction strategy of the BPA is also proposed to accommodate the decision conflict caused by external disturbances. Results from field experiments demonstrate that the effectiveness and robustness of the proposed model are better than those of models based on a single fatigue feature and/or single-source information fusion, especially when the most effective fatigue features are used in the proposed model. PMID:26393615
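
    Decision-level fusion with basic probability assignments (BPAs) is typically built on Dempster's rule of combination; a sketch over a two-state frame {fatigued, alert} follows (the classic rule with illustrative masses, not the paper's dynamic BPA or conflict-correction strategy):

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments over the same frame.

    m1, m2: dicts mapping frozenset hypotheses to masses summing to 1.
    Mass assigned to conflicting (empty-intersection) pairs is
    renormalised away."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict between sources")
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

F, A = frozenset({"fatigued"}), frozenset({"alert"})
BOTH = F | A  # ignorance: mass committed to neither state
eye_source = {F: 0.6, A: 0.2, BOTH: 0.2}   # e.g. eyelid-feature evidence
lane_source = {F: 0.5, A: 0.3, BOTH: 0.2}  # e.g. driving-behaviour evidence
print(dempster_combine(eye_source, lane_source))
```

    A dynamic BPA, as in the paper, would rescale each source's masses before combination according to the real-time reliability of its features.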

  6. Probabilistic Determination of Green Infrastructure Pollutant Removal Rates from the International Stormwater BMP Database

    NASA Astrophysics Data System (ADS)

    Gilliom, R.; Hogue, T. S.; McCray, J. E.

    2017-12-01

    There is a need for improved parameterization of stormwater best management practices (BMP) performance estimates to improve modeling of urban hydrology, planning and design of green infrastructure projects, and water quality crediting for stormwater management. Percent removal is commonly used to estimate BMP pollutant removal efficiency, but there is general agreement that this approach has significant uncertainties and is easily affected by site-specific factors. Additionally, some fraction of monitored BMPs have negative percent removal, so it is important to understand the probability that a BMP will provide the desired water quality function versus exacerbating water quality problems. The widely used k-C* equation has been shown to provide a more adaptable and accurate method to model BMP contaminant attenuation, and previous work has begun to evaluate the strengths and weaknesses of the k-C* method. However, no systematic method exists for obtaining the first-order removal rate constants needed to use the k-C* equation for stormwater BMPs; thus there is minimal application of the method. The current research analyzes existing water quality data in the International Stormwater BMP Database to provide screening-level parameterization of the k-C* equation for selected BMP types and analysis of factors that skew the distribution of efficiency estimates from the database. Results illustrate that while certain BMPs are more likely to provide desired contaminant removal than others, site- and design-specific factors strongly influence performance. For example, bioretention systems show both the highest and lowest removal rates of dissolved copper, total phosphorus, and total nitrogen.
Though data limitations exist, this research will facilitate improved accuracy of BMP modeling and ultimately aid decision-making for stormwater quality management in urban systems.
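
    As a concrete illustration of the k-C* form referenced above, the sketch below applies the common first-order relation C_out = C* + (C_in - C*)·exp(-k/q), where q is the hydraulic loading rate. The function names and the numeric values in the usage example are illustrative assumptions, not values from the BMP Database.

```python
import math

def kcstar_outflow(c_in, k, q, c_star):
    """First-order k-C* prediction of BMP outflow concentration.

    c_in   : inflow concentration (e.g., mg/L)
    k      : first-order removal rate constant (same units as q)
    q      : hydraulic loading rate
    c_star : irreducible background concentration
    """
    return c_star + (c_in - c_star) * math.exp(-k / q)

def fit_k(c_in, c_out, q, c_star):
    """Back-calculate k from a paired inflow/outflow observation."""
    return -q * math.log((c_out - c_star) / (c_in - c_star))
```

    For example, a hypothetical BMP with inflow 2.0, outflow 0.8, loading rate 30, and background 0.1 yields a rate constant near 30 in the same units as q; feeding that k back into `kcstar_outflow` reproduces the observed outflow.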

  7. Risk prediction models of breast cancer: a systematic review of model performances.

    PubMed

    Anothaisintawee, Thunyarat; Teerawattananon, Yot; Wiratkapun, Chollathip; Kasamesup, Vijj; Thakkinstian, Ammarin

    2012-05-01

    A growing number of risk prediction models have been developed to estimate the risk of breast cancer in individual women, but the performance of these models is questionable. We therefore conducted a study to systematically review previous risk prediction models. The results of this review help to identify the most reliable model and indicate the strengths and weaknesses of each model, guiding future model development. We searched MEDLINE (PubMed) from 1949 and EMBASE (Ovid) from 1974 until October 2010. Observational studies which constructed models using regression methods were selected. Information about model development and performance was extracted. Twenty-five out of 453 studies were eligible. Of these, 18 developed prediction models and 7 validated existing prediction models. Up to 13 variables were included in the models, and sample sizes for each study ranged from 550 to 2,404,636. Internal validation was performed in four models, while five models had external validation. The Gail and the Rosner and Colditz models were the seminal models that were subsequently modified by other scholars. Calibration performance of most models was fair to good (expected/observed ratio: 0.87-1.12), but discriminatory accuracy was poor to fair in both internal validation (concordance statistic: 0.53-0.66) and external validation (concordance statistic: 0.56-0.63). Most models yielded relatively poor discrimination in both internal and external validation. This poor discriminatory accuracy of existing models might be due to a lack of knowledge about risk factors, heterogeneous subtypes of breast cancer, and different distributions of risk factors across populations. In addition, the concordance statistic itself is insensitive for measuring improvements in discrimination. Therefore, newer methods such as the net reclassification index should be considered when evaluating the improvement in performance of a newly developed model.
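
    The concordance (c) statistic cited in the ranges above can be computed directly as the fraction of case/non-case pairs that the model ranks correctly; this small sketch (the function name and data are illustrative) shows the calculation:

```python
def concordance_statistic(case_risks, control_risks):
    """Concordance (c) statistic: the probability that a randomly chosen
    case receives a higher predicted risk than a randomly chosen non-case.
    Ties count as half; 0.5 means no discrimination, 1.0 is perfect."""
    wins = 0.0
    for rc in case_risks:
        for rn in control_risks:
            if rc > rn:
                wins += 1.0
            elif rc == rn:
                wins += 0.5
    return wins / (len(case_risks) * len(control_risks))
```

    With two cases scored 0.8 and 0.6 against two non-cases scored 0.4 and 0.7, three of the four pairs are ordered correctly, giving c = 0.75.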

  8. Hybrid active contour model for inhomogeneous image segmentation with background estimation

    NASA Astrophysics Data System (ADS)

    Sun, Kaiqiong; Li, Yaqin; Zeng, Shan; Wang, Jun

    2018-03-01

    This paper proposes a hybrid active contour model for inhomogeneous image segmentation. The data term of the energy function in the active contour consists of a global region fitting term in a difference image and a local region fitting term in the original image. The difference image is obtained by subtracting the background from the original image. The background image is dynamically estimated from a linear filtered result of the original image on the basis of the varying curve locations during the active contour evolution process. As in existing local models, fitting the image to local region information makes the proposed model robust against an inhomogeneous background and maintains the accuracy of the segmentation result. Furthermore, fitting the difference image to the global region information makes the proposed model robust against the initial contour location, unlike existing local models. Experimental results show that the proposed model can obtain improved segmentation results compared with related methods in terms of both segmentation accuracy and initial contour sensitivity.
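
    Although the paper's exact functional is not reproduced here, a hybrid energy of the kind described (a global Chan-Vese-style fitting term on the difference image plus a local kernel-weighted fitting term on the original image, plus a length regularizer) typically takes a form like the following. The symbols and weights below are assumptions for illustration, with I the original image, B the estimated background, I_d = I - B the difference image, H the Heaviside function, and K_sigma a Gaussian kernel:

```latex
E(\phi, c_1, c_2, f_1, f_2) =
  \lambda_g \int_\Omega \Big( |I_d - c_1|^2 H(\phi) + |I_d - c_2|^2 \big(1 - H(\phi)\big) \Big)\, dx
+ \lambda_l \int_\Omega \int_\Omega K_\sigma(x - y)
    \Big( |I(y) - f_1(x)|^2 H(\phi(y)) + |I(y) - f_2(x)|^2 \big(1 - H(\phi(y))\big) \Big)\, dy\, dx
+ \mu \int_\Omega |\nabla H(\phi)|\, dx
```

    Here c_1, c_2 are global region means of the difference image, f_1, f_2 are local fitted intensities of the original image, and phi is the level-set function whose zero level is the evolving contour.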

  9. Developing and implementing the use of predictive models for estimating water quality at Great Lakes beaches

    USGS Publications Warehouse

    Francy, Donna S.; Brady, Amie M.G.; Carvin, Rebecca B.; Corsi, Steven R.; Fuller, Lori M.; Harrison, John H.; Hayhurst, Brett A.; Lant, Jeremiah; Nevers, Meredith B.; Terrio, Paul J.; Zimmerman, Tammy M.

    2013-01-01

    Predictive models have been used at beaches to improve the timeliness and accuracy of recreational water-quality assessments over the most common current approach to water-quality monitoring, which relies on culturing fecal-indicator bacteria such as Escherichia coli (E. coli). Beach-specific predictive models use environmental and water-quality variables that are easily and quickly measured as surrogates to estimate concentrations of fecal-indicator bacteria or to provide the probability that a State recreational water-quality standard will be exceeded. When predictive models are used for beach closure or advisory decisions, they are referred to as “nowcasts.” During the recreational seasons of 2010-12, the U.S. Geological Survey (USGS), in cooperation with 23 local and State agencies, worked to improve existing nowcasts at 4 beaches, validate predictive models at another 38 beaches, and collect data for predictive-model development at 7 beaches throughout the Great Lakes. This report summarizes efforts by multiple agencies to collect data and develop predictive models, and compiles existing information on the beaches and beach-monitoring programs into one comprehensive report. Local agencies measured E. coli concentrations and variables expected to affect E. coli concentrations, such as wave height, turbidity, water temperature, and numbers of birds at the time of sampling. In addition to these field measurements, equipment was installed by the USGS or local agencies at or near several beaches to collect water-quality and meteorological measurements in near real time, including nearshore buoys, weather stations, and tributary staff gages and monitors. The USGS worked with local agencies to retrieve data from existing sources either manually or by use of tools designed specifically to compile and process data for predictive-model development.
    Predictive models were developed by use of linear regression and (or) partial least squares techniques for 42 beaches that had at least 2 years of data (2010-11 and sometimes earlier) and for 1 beach that had 1 year of data. For most models, software designed for model development by the U.S. Environmental Protection Agency (Virtual Beach) was used. The selected model for each beach was based on a combination of explanatory variables including, most commonly, turbidity, day of the year, change in lake level over 24 hours, wave height, wind direction and speed, and antecedent rainfall for various time periods. Forty-two predictive models were validated against data collected during an independent year (2012) and compared to the current method for assessing recreational water quality, which uses the previous day’s E. coli concentration (the persistence model). Goals for good predictive-model performance were responses that were at least 5 percent greater than the persistence model and overall correct responses greater than or equal to 80 percent, sensitivities (percentage of exceedances of the bathing-water standard that were correctly predicted by the model) greater than or equal to 50 percent, and specificities (percentage of nonexceedances correctly predicted by the model) greater than or equal to 85 percent. Out of 42 predictive models, 24 yielded overall correct responses that were at least 5 percent greater than those of the persistence model. Predictive-model responses met the performance goals more often than the persistence-model responses in terms of overall correctness (28 versus 17 models, respectively), sensitivity (17 versus 4 models), and specificity (34 versus 25 models). Gaining knowledge of each beach and the factors that affect E. coli concentrations is important for developing good predictive models. Collection of additional years of data with a wide range of environmental conditions may also help to improve future model performance. The USGS will continue to work with local agencies in 2013 and beyond to develop and validate predictive models at beaches and improve existing nowcasts, restructuring monitoring activities to accommodate future uncertainties in funding and resources.
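
    The performance measures used to judge the nowcast models above (overall correctness, sensitivity, specificity) follow directly from counts of correctly and incorrectly predicted exceedances; a minimal sketch, with hypothetical data in the usage note:

```python
def nowcast_performance(predicted, observed):
    """Overall correctness, sensitivity, and specificity for a nowcast.
    `predicted` and `observed` are parallel lists of booleans, where
    True means the bathing-water standard was (predicted to be) exceeded."""
    pairs = list(zip(predicted, observed))
    tp = sum(1 for p, o in pairs if p and o)          # exceedances correctly predicted
    tn = sum(1 for p, o in pairs if not p and not o)  # nonexceedances correctly predicted
    fp = sum(1 for p, o in pairs if p and not o)      # false alarms
    fn = sum(1 for p, o in pairs if not p and o)      # missed exceedances
    return {
        "overall_correct": (tp + tn) / len(pairs),
        "sensitivity": tp / (tp + fn) if tp + fn else float("nan"),
        "specificity": tn / (tn + fp) if tn + fp else float("nan"),
    }
```

    For five hypothetical days with predictions [True, False, False, True, False] and observations [True, True, False, False, False], the model is 60 percent correct overall, with 50 percent sensitivity and about 67 percent specificity.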

  10. Retargeting of existing FORTRAN program and development of parallel compilers

    NASA Technical Reports Server (NTRS)

    Agrawal, Dharma P.

    1988-01-01

    The software models used in implementing the parallelizing compiler for the B-HIVE multiprocessor system are described. The models and strategies used in the compiler development are: a flexible granularity model, which allows a compromise between two extreme granularity models; a communication model, which is capable of precisely describing interprocessor communication timings and patterns; a loop type detection strategy, which identifies different types of loops; a critical path with coloring scheme, which is a versatile scheduling strategy for any multicomputer with associated communication costs; and a loop allocation strategy, which realizes optimum overlap between computation and communication in the system. Using these models, several sample routines of the AIR3D package are examined and tested. The automatically generated codes are highly parallelized to maximize the degree of parallelism, achieving speedup on systems of up to 28 to 32 processors. A comparison of parallel codes for both the existing and proposed communication models is performed, and the corresponding expected speedup factors are obtained. The experimentation shows that the B-HIVE compiler produces more efficient codes than existing techniques. Work is progressing well in completing the final phase of the compiler. Numerous enhancements are needed to improve the capabilities of the parallelizing compiler.

  11. Studies on Experimental Ontology and Knowledge Service Development in Bio-Environmental Engineering

    NASA Astrophysics Data System (ADS)

    Zhang, Yunliang

    2018-01-01

    The existing domain-related ontologies and information service patterns are analyzed, and the main problems faced by experimental scheme knowledge services are clarified. An ontology framework model for knowledge services in Bio-environmental Engineering is proposed, covering experimental materials, experimental conditions, and experimental instruments; this ontology is combined with existing knowledge organization systems to organize scientific and technological literature, data, and experimental schemes. With similarity and priority calculations, the framework can improve research in the related domains.

  12. Point Source X-Ray Lithography System for Sub-0.15 Micron Design Rules

    DTIC Science & Technology

    1998-05-22

    consist of a SAL developed stepper, an SRL developed Dense Plasma Focus (DPF) X-Ray source, and a CXrL developed beam line. The system will be...existing machine that used spark gap switching, SRL has developed an all solid state driver and improved head electrode assembly for their dense plasma focus X-Ray source. Likewise, SAL has used their existing Model 4 stepper installed at CXrL as a design starting point, and has developed an advanced

  13. Next-Generation NATO Reference Mobility Model (NG-NRMM)

    DTIC Science & Technology

    2016-05-11

    facilitate comparisons between vehicle design candidates and to assess the mobility of existing vehicles under specific scenarios. Although NRMM has...of different deployed platforms in different areas of operation and routes  Improved flexibility as a design and procurement support tool through...Element Method DEM Digital Elevation Model DIL Driver in the Loop DP Drawbar Pull Force DOE Design of Experiments DTED Digital Terrain Elevation Data

  14. Model I & II Organizations: Examining Organizational Learning in Institutions Participating in the Academy for the Assessment of Student Learning

    ERIC Educational Resources Information Center

    Haywood, Antwione Maurice

    2012-01-01

    The Academy was an assessment enhancement program created by the HLC to help institutions strengthen and improve the assessment of student learning. Using a multiple case study approach, this study applies Argyris and Schön's (1976) Theory of Action to explore the espoused values and existence of Model I and II behavior characteristics. Argyris…

  15. Bidirectional reflectance distribution function measurements and analysis of retroreflective materials.

    PubMed

    Belcour, Laurent; Pacanowski, Romain; Delahaie, Marion; Laville-Geay, Aude; Eupherte, Laure

    2014-12-01

    We compare the performance of various analytical retroreflecting bidirectional reflectance distribution function (BRDF) models to assess how they reproduce accurately measured data of retroreflecting materials. We introduce a new parametrization, the back vector parametrization, to analyze retroreflecting data, and we show that this parametrization better preserves the isotropy of data. Furthermore, we update existing BRDF models to improve the representation of retroreflective data.

  16. Quicklook overview of model changes in Melcor 2.2: Rev 6342 to Rev 9496

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphries, Larry L.

    2017-05-01

    MELCOR 2.2 is a significant official release of the MELCOR code with many new models and model improvements. This report provides the code user with a quick review and characterization of new models added, changes to existing models, the effect of code changes during this code development cycle (rev 6342 to rev 9496), and a preview of validation results with this code version. More detailed information is found in the code Subversion logs as well as the User Guide and Reference Manuals.

  17. Temperature modelling and prediction for activated sludge systems.

    PubMed

    Lippi, S; Rosso, D; Lubello, C; Canziani, R; Stenstrom, M K

    2009-01-01

    Temperature is an important factor affecting biomass activity, which is critical to maintaining efficient biological wastewater treatment, as well as physicochemical properties of the mixed liquor such as dissolved oxygen saturation and settling velocity. Controlling temperature is not normally possible for treatment systems, but incorporating factors that impact temperature into the design process, such as the aeration system, surface-to-volume ratio, and tank geometry, can reduce the range of temperature extremes and improve the overall process performance. Determining how much these design or upgrade options affect the tank temperature requires a temperature model that can be used with existing design methodologies. This paper presents a new steady-state temperature model developed by incorporating the best aspects of previously published models, introducing new functions for selected heat exchange paths, and improving the method for predicting the effects of covering aeration tanks. Numerical improvements with embedded reference data provide simpler formulation, faster execution, and easier sensitivity analyses using an ordinary spreadsheet. The paper presents several cases to validate the model.
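
    A steady-state tank temperature model of the general kind described balances heat advected by the flow against heat exchanged along the various paths. The following deliberately simplified sketch lumps all exchange paths into a single surface coefficient; it is an illustrative assumption, not the paper's model, and all names and numbers are hypothetical:

```python
RHO_CP = 4.186e6  # volumetric heat capacity of water, J/(m^3*K)

def steady_tank_temp(q_m3s, t_in_c, k_wm2k, area_m2, t_air_c):
    """Linearized steady-state heat balance for an aeration tank:
    the equilibrium temperature is the capacity-weighted average of
    the influent temperature and the air temperature."""
    c_flow = RHO_CP * q_m3s   # W/K carried in and out by the flow
    c_surf = k_wm2k * area_m2 # W/K exchanged with the atmosphere
    return (c_flow * t_in_c + c_surf * t_air_c) / (c_flow + c_surf)
```

    With a 0.1 m^3/s flow entering at 20 C, a 1000 m^2 surface with an overall coefficient of 20 W/(m^2*K), and 10 C air, the tank settles near 19.5 C: the flow's heat capacity dominates the surface exchange, so the tank stays close to the influent temperature.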

  18. STATISTICAL METHODOLOGY FOR THE SIMULTANEOUS ANALYSIS OF MULTIPLE TYPES OF OUTCOMES IN NONLINEAR THRESHOLD MODELS.

    EPA Science Inventory

    Multiple outcomes are often measured on each experimental unit in toxicology experiments. These multiple observations typically imply the existence of correlation between endpoints, and a statistical analysis that incorporates it may result in improved inference. When both disc...

  19. Improving Student Services in Secondary Schools.

    ERIC Educational Resources Information Center

    Maddy-Bernstein, Carolyn; Cunanan, Esmeralda S.

    1995-01-01

    No single comprehensive student services delivery model exists, and "student services" terminology remains problematic. The Office of Student Services has defined student services as those services provided by educational institutions to facilitate learning and the successful transition from school to work, military, or more education. To be…

  20. An Investigation of the Electrical Short Circuit Characteristics of Tin Whiskers

    NASA Technical Reports Server (NTRS)

    Courey, Karim J.

    2008-01-01

    In this experiment, an empirical model was developed to quantify the probability of occurrence of an electrical short circuit from tin whiskers as a function of voltage. This model can be used to improve existing risk simulation models. FIB and TEM images of a tin whisker confirmed the rare polycrystalline structure in one of the three whiskers studied. FIB cross-sections of the card guides verified that the tin finish was bright tin.

  1. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

    The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, for integrating the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection system framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which is left out of focus in many existing event detection system models, is confirmed to be a crucial part of the system, and modelling it with a discrete choice model improves its performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
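
    A minimal sketch of the fusion idea: a binary logit maps the weighted sum of single-indicator alarm signals to one event probability. The weights and bias here are illustrative placeholders for coefficients that would be estimated by maximum likelihood, as in the paper:

```python
import math

def fused_event_probability(indicator_alarms, weights, bias):
    """Logit (discrete choice) fusion: combine per-indicator alarm
    signals (0/1 flags or probabilities) into a single contamination
    event probability via the logistic function."""
    z = bias + sum(w * a for w, a in zip(weights, indicator_alarms))
    return 1.0 / (1.0 + math.exp(-z))
```

    With hypothetical weights [2.0, 1.5, 1.0] and bias -2.5, no single-indicator alarms yields a low fused probability, while alarms on all three indicators push it well above 0.5, which is the behavior a simple heuristic vote approximates far more coarsely.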

  2. Humanitarian response: improving logistics to save lives.

    PubMed

    McCoy, Jessica

    2008-01-01

    Each year, millions of people worldwide are affected by disasters, underscoring the importance of effective relief efforts. Many highly visible disaster responses have been inefficient and ineffective. Humanitarian agencies typically play a key role in disaster response (eg, procuring and distributing relief items to an affected population, assisting with evacuation, providing healthcare, assisting in the development of long-term shelter), and thus their efficiency is critical for a successful disaster response. The field of disaster and emergency response modeling is well established, but the application of such techniques to humanitarian logistics is relatively recent. This article surveys models of humanitarian response logistics and identifies promising opportunities for future work. Existing models analyze a variety of preparation and response decisions (eg, warehouse location and the distribution of relief supplies), consider both natural and manmade disasters, and typically seek to minimize cost or unmet demand. Opportunities to enhance the logistics of humanitarian response include the adaptation of models developed for general disaster response; the use of existing models, techniques, and insights from the literature on commercial supply chain management; the development of working partnerships between humanitarian aid organizations and private companies with expertise in logistics; and the consideration of behavioral factors relevant to a response. Implementable, realistic models that support the logistics of humanitarian relief can improve the preparation for and the response to disasters, which in turn can save lives.

  3. Building Protection Against External Ionizing Fallout Radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillon, Michael B.; Homann, Steven G.

    A nuclear explosion has the potential to injure or kill tens to hundreds of thousands of people through exposure to fallout (external gamma) radiation. Existing buildings can protect their occupants (reducing external radiation exposures) by placing material and distance between fallout particles and indoor individuals. This protection is not well captured in current fallout risk assessment models, and so the US Department of Defense is implementing the Regional Shelter Analysis methodology to improve the ability of the Hazard Prediction and Assessment Capability (HPAC) model to account for building protection. This report supports the HPAC improvement effort by identifying a set of building attributes that, when collectively specified, are sufficient to calculate reasonably accurate, i.e., within a factor of 2, fallout shelter quality estimates for many individual buildings. The set of building attributes was determined by first identifying the key physics controlling building protection from fallout radiation and then assessing which building attributes are relevant to the identified physics. This approach was evaluated by developing a screening model (PFscreen) based on the identified physics and comparing the screening model results against the set of existing independent experimental, theoretical, and modeled building protection estimates. In the interests of transparency, we have developed a benchmark dataset containing (a) most of the relevant primary experimental data published by prior generations of fallout protection scientists as well as (b) the screening model results.

  4. A Complete Procedure for Predicting and Improving the Performance of HAWT's

    NASA Astrophysics Data System (ADS)

    Al-Abadi, Ali; Ertunç, Özgür; Sittig, Florian; Delgado, Antonio

    2014-06-01

    A complete procedure for predicting and improving the performance of horizontal axis wind turbines (HAWTs) has been developed. The first process is predicting the power extracted by the turbine and the derived rotor torque, which should be identical to that of the drive unit. The BEM method and a developed post-stall treatment for resolving stall-regulated HAWTs are incorporated in the prediction. For that, a modified stall-regulated prediction model, which can predict HAWT performance over the operating range of oncoming wind velocity, is derived from existing models. The model involves radius and chord, which makes it more generally applicable for predicting the performance of HAWTs of different scales and rotor shapes. The second process is modifying the rotor shape through an optimization process, which can be applied to any existing HAWT to improve its performance. A gradient-based optimization is used to adjust the chord and twist angle distributions of the rotor blade to increase the power extraction while keeping the drive torque constant, so that the same drive unit can be kept. The final process is testing the modified turbine to predict its enhanced performance. The procedure is applied to the NREL Phase VI 10 kW turbine as a baseline. The study has proven the applicability of the developed model in predicting the performance of the baseline as well as the optimized turbine. In addition, the optimization method has shown that the power coefficient can be increased while keeping the same design rotational speed.
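
    The power coefficient mentioned in the closing sentence is the standard ratio of extracted power to the kinetic power of the wind passing through the rotor disk. A short sketch follows; the rotor radius and operating point in the usage example are illustrative assumptions, not measured NREL Phase VI values:

```python
import math

def power_coefficient(power_w, rho_kgm3, rotor_radius_m, wind_speed_ms):
    """Cp = P / (0.5 * rho * A * V^3): the fraction of available wind
    power extracted by the rotor. The Betz limit caps Cp at 16/27."""
    area = math.pi * rotor_radius_m ** 2  # swept rotor disk area
    return power_w / (0.5 * rho_kgm3 * area * wind_speed_ms ** 3)
```

    For instance, a rotor of radius about 5 m extracting 6 kW from a 7 m/s wind at sea-level air density operates at a Cp of roughly 0.36, comfortably below the Betz limit of about 0.593.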

  5. Physical Activity Predicts Performance in an Unpracticed Bimanual Coordination Task.

    PubMed

    Boisgontier, Matthieu P; Serbruyns, Leen; Swinnen, Stephan P

    2017-01-01

    Practice of a given physical activity is known to improve the motor skills related to this activity. However, whether unrelated skills are also improved is still unclear. To test the impact of physical activity on an unpracticed motor task, 26 young adults completed the international physical activity questionnaire and performed a bimanual coordination task they had never practiced before. Results showed that higher total physical activity predicted higher performance in the bimanual task, controlling for multiple factors such as age, physical inactivity, music practice, and computer games practice. Linear mixed models allowed this effect of physical activity to be generalized to a large population of bimanual coordination conditions. This finding runs counter to the notion that generalized motor abilities do not exist and supports the existence of a "learning to learn" skill that could be improved through physical activity and that impacts performance in tasks that are not necessarily related to the practiced activity.

  6. Integrated Practice Improvement Solutions-Practical Steps to Operating Room Management.

    PubMed

    Chernov, Mikhail; Pullockaran, Janet; Vick, Angela; Leyvi, Galina; Delphin, Ellise

    2016-10-01

    Perioperative productivity is a vital concern for surgeons, anesthesiologists, and administrators, as the OR is a major source of hospital elective admissions and revenue. Based on elements of existing Practice Improvement Methodologies (PIMs), "Integrated Practice Improvement Solutions" (IPIS) is a practical and simple solution incorporating aspects of multiple management approaches into a single open source framework to increase OR efficiency and productivity by better utilization of existing resources. OR efficiency was measured both before and after IPIS implementation using the total number of cases versus room utilization, OR/anesthesia revenue, and staff overtime (OT) costs. Other parameters of efficiency, such as the first-case on-time start and the turnover time (TOT), were measured in parallel. IPIS implementation resulted in an average 10.7% increase in the number of surgical procedures performed, and OR and anesthesia revenue increases of 18.5% and 6.9%, respectively, with a simultaneous decrease in TOT (15%) and OT for anesthesia staff (26%). The number of perioperative adverse events was stable during the two-year study period, which involved a total of 20,378 patients. IPIS, an effective and flexible practice improvement model, was designed to quickly, significantly, and sustainably improve OR efficiency by better utilization of existing resources. Success of its implementation directly correlates with the involvement of and acceptance by the entire OR team and hospital administration.

  7. Double the dates and go for Bayes - Impacts of model choice, dating density and quality on chronologies

    NASA Astrophysics Data System (ADS)

    Blaauw, Maarten; Christen, J. Andrés; Bennett, K. D.; Reimer, Paula J.

    2018-05-01

    Reliable chronologies are essential for most Quaternary studies, but little is known about how age-depth model choice, as well as dating density and quality, affect the precision and accuracy of chronologies. A meta-analysis suggests that most existing late-Quaternary studies contain fewer than one date per millennium, and provide millennial-scale precision at best. We use existing and simulated sediment cores to estimate what dating density and quality are required to obtain accurate chronologies at a desired precision. For many sites, a doubling in dating density would significantly improve chronologies and thus their value for reconstructing and interpreting past environmental changes. Commonly used classical age-depth models stop becoming more precise after a minimum dating density is reached, but the precision of Bayesian age-depth models which take advantage of chronological ordering continues to improve with more dates. Our simulations show that classical age-depth models severely underestimate uncertainty and are inaccurate at low dating densities, and also perform poorly at high dating densities. On the other hand, Bayesian age-depth models provide more realistic precision estimates, including at low to average dating densities, and are much more robust against dating scatter and outliers. Indeed, Bayesian age-depth models outperform classical ones at all tested dating densities, qualities and time-scales. We recommend that chronologies should be produced using Bayesian age-depth models taking into account chronological ordering and based on a minimum of 2 dates per millennium.
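
    A classical age-depth model of the kind compared above can be as simple as linear interpolation between dated depths. This sketch uses illustrative point ages and, unlike the Bayesian models discussed, carries no date uncertainties and cannot exploit chronological ordering:

```python
def classical_age_depth(dated_depths, dated_ages, query_depth):
    """Classical linear-interpolation age-depth model: the age at an
    undated depth is interpolated between the bracketing dated depths.
    Depths and ages are point estimates (no uncertainty propagation)."""
    pairs = sorted(zip(dated_depths, dated_ages))
    for (d0, a0), (d1, a1) in zip(pairs, pairs[1:]):
        if d0 <= query_depth <= d1:
            frac = (query_depth - d0) / (d1 - d0)
            return a0 + frac * (a1 - a0)
    raise ValueError("query depth outside the dated range")
```

    With hypothetical dates of 0, 5000, and 9000 cal yr at depths of 0, 100, and 200 cm, the model assigns 2500 yr to 50 cm and 7000 yr to 150 cm; a single outlying date would distort every interpolated age, which is one reason Bayesian models that weigh dates against their errors are more robust.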

  8. Steering operational synergies in terrestrial observation networks: opportunity for advancing Earth system dynamics modelling

    NASA Astrophysics Data System (ADS)

    Baatz, Roland; Sullivan, Pamela L.; Li, Li; Weintraub, Samantha R.; Loescher, Henry W.; Mirtl, Michael; Groffman, Peter M.; Wall, Diana H.; Young, Michael; White, Tim; Wen, Hang; Zacharias, Steffen; Kühn, Ingolf; Tang, Jianwu; Gaillardet, Jérôme; Braud, Isabelle; Flores, Alejandro N.; Kumar, Praveen; Lin, Henry; Ghezzehei, Teamrat; Jones, Julia; Gholz, Henry L.; Vereecken, Harry; Van Looy, Kris

    2018-05-01

    Advancing our understanding of Earth system dynamics (ESD) depends on the development of models and other analytical tools that apply physical, biological, and chemical data. This ambition to increase understanding and develop models of ESD based on site observations was the stimulus for creating the networks of Long-Term Ecological Research (LTER), Critical Zone Observatories (CZOs), and others. We organized a survey, the results of which identified pressing gaps in data availability from these networks, in particular for the future development and evaluation of models that represent ESD processes, and provide insights for improvement in both data collection and model integration. From this survey overview of data applications in the context of LTER and CZO research, we identified three challenges: (1) widen the application of terrestrial observation network data in Earth system modelling, (2) develop integrated Earth system models that incorporate process representation and data of multiple disciplines, and (3) identify complementarity in measured variables and spatial extent, and promote synergies among the existing observational networks. These challenges lead to perspectives and recommendations for an improved dialogue between the observation networks and the ESD modelling community, including co-location of sites in the existing networks and further formalizing these recommendations among these communities. Developing these synergies will enable cross-site and cross-network comparison and synthesis studies, which will help produce insights around organizing principles, classifications, and general rules of coupling processes with environmental conditions.

  9. An Innovative Approach to Health Care Delivery for Patients with Chronic Conditions.

    PubMed

    Clarke, Janice L; Bourn, Scott; Skoufalos, Alexis; Beck, Eric H; Castillo, Daniel J

    2017-02-01

    Although the health care reform movement has brought about positive changes, lingering inefficiencies and communication gaps continue to hamper system-wide progress toward achieving the overarching goal: higher quality health care and improved population health outcomes at a lower cost. The multiple interrelated barriers to improvement are most evident in care for the population of patients with multiple chronic conditions. During transitions of care, the lack of integration among various silos and inadequate communication among providers cause delays in delivering appropriate health care services to these vulnerable patients and their caregivers, diminishing positive health outcomes and driving costs ever higher. Long-entrenched acute care-focused treatment and reimbursement paradigms hamper more effective deployment of existing resources to improve the ongoing care of these patients. New models for care coordination during transitions, longitudinal high-risk care management, and unplanned acute episodic care have been conceived and piloted with promising results. Utilizing existing resources, Mobile Integrated Healthcare is an emerging model focused on closing these care gaps by means of a round-the-clock, technologically sophisticated, physician-led interprofessional team to manage care transitions and chronic care services on-site in patients' homes or workplaces.

  10. Two-Dimensional Hydrodynamic Modeling and Analysis of the Proposed Channel Modifications and Grade Control Structure on the Blue River near Byram's Ford Industrial Park, Kansas City, Missouri

    USGS Publications Warehouse

    Huizinga, Richard J.

    2007-01-01

    The Blue River Channel Modification project being implemented by the U.S. Army Corps of Engineers (USACE) is intended to provide flood protection within the Blue River valley in the Kansas City, Mo., metropolitan area. In the latest phase of the project, concerns have arisen about preserving the Civil War historic area of Byram's Ford and the associated Big Blue Battlefield while providing flood protection for the Byram's Ford Industrial Park. In 1996, the USACE used a physical model built at the Waterways Experiment Station (WES) in Vicksburg, Miss., to examine the feasibility of a proposed grade control structure (GCS) that would be placed downstream from the historic river crossing of Byram's Ford to provide a subtle transition of flow from the natural channel to the modified channel. The U.S. Geological Survey (USGS), in cooperation with the USACE, modified an existing two-dimensional finite element surface-water model of the river between 63d Street and Blue Parkway (the 'original model'), used the modified model to simulate the existing (as of 2006) unimproved channel and the proposed channel modifications and GCS, and analyzed the results from the simulations and those from the WES physical model. Modifications were made to the original model to create a model that represents existing (2006) conditions between the north end of Swope Park immediately upstream from 63d Street and the upstream limit of channel improvement on the Blue River (the 'model of existing conditions'). The model of existing conditions was calibrated to two measured floods. The model of existing conditions also was modified to create a model that represents conditions along the same reach of the Blue River with proposed channel modifications and the proposed GCS (the 'model of proposed conditions'). The models of existing conditions and proposed conditions were used to simulate the 30-, 50-, and 100-year recurrence floods. 
The discharge from the calibration flood of May 15, 1990, also was simulated in the models of existing and proposed conditions to provide results for that flood with the current downstream channel modifications and with the proposed channel modifications and GCS. Results from the model of existing conditions show that the downstream channel modifications as they exist (2006) may already be affecting flows in the unmodified upstream channel. The 30-year flood does not inundate most of the Byram's Ford Industrial Park near the upstream end of the study area. Analysis of the 1990 flood (with the historical 1990 channel conditions) and the 1990 flood simulated with the existing (2006) conditions indicates a substantial increase in velocity throughout the study area and a substantial decrease in inundated area from 1990 to 2006. Results from the model of proposed conditions show that the proposed channel modifications will contain the 30-year flood and that the spoil berm designed to provide additional flood protection for the Byram's Ford Industrial Park for the 30-year flood prevents inundation of the industrial park. In the vicinity of Byram's Ford for the 30-year flood, the maximum depth increased from 39.7 feet (ft) in the model of existing conditions to 43.5 ft in the model of proposed conditions, with a resulting decrease in velocity from 6.61 to 4.55 feet per second (ft/s). For the 50-year flood, the maximum depth increased from 42.3 to 45.8 ft, with a decrease in velocity from 6.12 to 4.16 ft/s from existing to proposed conditions. For the 100-year flood, the maximum depth increased from 44.0 to 46.6 ft, with a decrease in velocity from 5.64 to 4.12 ft/s from existing to proposed conditions. 
When the May 15, 1990, discharge is simulated in the model of existing conditions (with the existing (2006) modified channel downstream of the study area), the maximum depth increases from 38.4 to 42.0 ft, with a decrease in velocity from 6.54 to 4.84 ft/s from existing (2006) to proposed conditions. Analysis of the results fro

  11. An improved method for predicting the lightning performance of high and extra-high-voltage substation shielding

    NASA Astrophysics Data System (ADS)

    Vinh, T.

    1980-08-01

    There is a need for better and more effective lightning protection for transmission and switching substations. In the past, a number of empirical methods were utilized to design systems to protect substations and transmission lines from direct lightning strokes. The need exists for convenient analytical lightning models adequate for engineering use. In this study, analytical lightning models were developed, along with a method for improved analysis of the physical properties of lightning through their use. This method of analysis is based upon the most recent statistical field data. The result is an improved method for predicting the occurrence of shielding failure and for designing more effective protection of high and extra-high-voltage substations from direct strokes.

  12. Power System Transient Stability Improvement by the Interline Power Flow Controller (IPFC)

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Yokoyama, Akihiko

    This paper presents a study on power system transient stability improvement by means of the interline power flow controller (IPFC). A power injection model of the IPFC for transient analysis is proposed that can be easily incorporated into existing power system models. Based on energy function analysis, the operation of the IPFC should guarantee that the time derivative of the global energy of the system is not greater than zero in order to damp the electromechanical oscillations. Accordingly, control laws of the IPFC are proposed for its application to the single-machine infinite-bus (SMIB) system and to multimachine systems, respectively. Numerical simulations on the corresponding model power systems are presented to demonstrate their effectiveness in improving power system transient stability.
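The energy-function reasoning above (a control that keeps the energy derivative non-positive damps the oscillation) can be sketched on a toy SMIB system. This is a generic illustration with made-up parameters, not the paper's IPFC injection model: the injection term u = k*w gives dV/dt = -k*w^2 <= 0.

```python
import math

# Generic SMIB swing-equation sketch (illustrative parameters, not from the
# paper): M*dw/dt = Pm - Pmax*sin(delta) - u, d(delta)/dt = w.
# For the energy function
#   V = 0.5*M*w^2 - Pm*(delta - ds) - Pmax*(cos(delta) - cos(ds)),
# an injection u = k*w gives dV/dt = -k*w^2 <= 0, which damps the oscillation.

M, Pm, Pmax, k, dt = 0.1, 0.8, 1.2, 0.3, 0.001
delta_s = math.asin(Pm / Pmax)              # stable equilibrium angle

def simulate(gain, steps=20000, delta0=delta_s + 0.5, omega0=0.0):
    delta, omega = delta0, omega0
    for _ in range(steps):
        u = gain * omega                    # gain = 0 reproduces the undamped system
        accel = (Pm - Pmax * math.sin(delta) - u) / M
        delta += omega * dt
        omega += accel * dt
    return delta, omega

d_ctrl, w_ctrl = simulate(k)                # damped: settles at the equilibrium
d_free, w_free = simulate(0.0)              # undamped: keeps oscillating
print(round(d_ctrl - delta_s, 4), round(w_ctrl, 4))
```

With the damping injection the trajectory settles at the equilibrium; with gain zero the system energy is conserved and the rotor keeps swinging, which is the behavior the IPFC control laws are designed to suppress.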

  13. Product component genealogy modeling and field-failure prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Caleb; Hong, Yili; Meeker, William Q.

    Many industrial products consist of multiple components that are necessary for system operation. There is an abundance of literature on modeling the lifetime of such components through competing risks models. During the life-cycle of a product, it is common for there to be incremental design changes to improve reliability, to reduce costs, or due to changes in availability of certain part numbers. These changes can affect product reliability but are often ignored in system lifetime modeling. By incorporating this information about changes in part numbers over time (information that is readily available in most production databases), better accuracy can be achieved in predicting time to failure, thus yielding more accurate field-failure predictions. This paper presents methods for estimating parameters and predictions for this generational model and a comparison with existing methods through the use of simulation. Our results indicate that the generational model has important practical advantages and outperforms the existing methods in predicting field failures.
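A hypothetical sketch (not the paper's estimation method) of why tracking part-number "generations" matters: units built before and after a design change follow different lifetime distributions, so pooling them biases field-failure predictions for the newer generation. All distributions and numbers below are invented.

```python
import math
import random

# Units from two "generations" with different Weibull lifetimes; a
# generation-blind (pooled) failure estimate is biased for the new units.

random.seed(42)

def weibull(shape, scale):
    # inverse-CDF sampling of a Weibull lifetime
    return scale * (-math.log(1.0 - random.random())) ** (1.0 / shape)

old = [weibull(1.5, 2.0) for _ in range(5000)]   # pre-change units (years)
new = [weibull(1.5, 4.0) for _ in range(5000)]   # improved units last longer

t = 1.0                                          # prediction horizon (years)

def frac_failed(lifetimes):
    return sum(x <= t for x in lifetimes) / len(lifetimes)

pooled = frac_failed(old + new)                  # generation-blind estimate
by_gen = frac_failed(new)                        # generation-aware estimate
print(round(pooled, 3), round(by_gen, 3))
```

Pooling overpredicts failures for the improved generation; conditioning on part-number history, as the generational model does, removes that bias.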

  14. Product component genealogy modeling and field-failure prediction

    DOE PAGES

    King, Caleb; Hong, Yili; Meeker, William Q.

    2016-04-13

    Many industrial products consist of multiple components that are necessary for system operation. There is an abundance of literature on modeling the lifetime of such components through competing risks models. During the life-cycle of a product, it is common for there to be incremental design changes to improve reliability, to reduce costs, or due to changes in availability of certain part numbers. These changes can affect product reliability but are often ignored in system lifetime modeling. By incorporating this information about changes in part numbers over time (information that is readily available in most production databases), better accuracy can be achieved in predicting time to failure, thus yielding more accurate field-failure predictions. This paper presents methods for estimating parameters and predictions for this generational model and a comparison with existing methods through the use of simulation. Our results indicate that the generational model has important practical advantages and outperforms the existing methods in predicting field failures.

  15. Determination of wind tunnel constraint effects by a unified pressure signature method. Part 2: Application to jet-in-crossflow

    NASA Technical Reports Server (NTRS)

    Hackett, J. E.; Sampath, S.; Phillips, C. G.

    1981-01-01

    The development of an improved jet-in-crossflow model for estimating wind tunnel blockage and angle-of-attack interference is described. Experiments showed that the simpler existing models fall seriously short of representing far-field flows properly. A new, vortex-source-doublet (VSD) model was therefore developed which employs curved trajectories and experimentally-based singularity strengths. The new model is consistent with existing and new experimental data and it predicts tunnel wall (i.e. far-field) pressures properly. It is implemented as a preprocessor to the wall-pressure-signature-based tunnel interference predictor. The supporting experiments and theoretical studies revealed some new results. Comparative flow field measurements with 1-inch "free-air" and 3-inch impinging jets showed that vortex penetration into the flow, in diameters, was almost unaltered until 'hard' impingement occurred. In modeling impinging cases, a 'plume redirection' term was introduced which is apparently absent in previous models. The effects of this term were found to be very significant.

  16. Feedforward Controller of Ill-Conditioned Hysteresis Using Singularity-Free Prandtl–Ishlinskii Model

    PubMed Central

    Tan, U-Xuan; Latt, Win Tun; Shee, Cheng Yap; Riviere, Cameron N.; Ang, Wei Tech

    2009-01-01

    Piezoelectric, magnetostrictive, and shape memory alloy actuators are gaining importance in high-frequency precision applications constrained by space. Their intrinsic hysteretic behavior makes control difficult. The Prandtl–Ishlinskii (PI) operator can model hysteresis well, albeit with a major inadequacy: the inverse operator does not exist when the gradient of the hysteretic curve is not positive definite, i.e., ill-conditioning occurs when the slope is negative. An inevitable tradeoff between modeling accuracy and inversion stability exists: hysteretic modeling improves with an increasing number of play operators, but as the piecewise continuous interval of each operator shrinks, the model tends to become ill-conditioned, especially at the turning points. A similar ill-conditioned situation arises when these actuators move heavy loads or operate at high frequency. This paper proposes an extended PI operator that maps the hysteresis to a domain where inversion is well behaved. The inverse weights are then evaluated to determine the inverse hysteresis model for the feedforward controller. A piezoelectric actuator is used for illustration. PMID:19936032
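A minimal classical PI construction (generic textbook form, not the paper's singularity-free extension) shows how play operators superpose into a hysteresis loop; the thresholds and weights below are arbitrary illustrative values.

```python
import math

# Classical Prandtl-Ishlinskii sketch: the output is a weighted superposition
# of "play" (backlash) operators with thresholds r_i.

def play(x_seq, r, y0=0.0):
    y, out = y0, []
    for x in x_seq:
        y = max(x - r, min(x + r, y))   # play operator update rule
        out.append(y)
    return out

def pi_model(x_seq, thresholds, weights):
    branches = [play(x_seq, r) for r in thresholds]
    return [sum(w * b[k] for w, b in zip(weights, branches))
            for k in range(len(x_seq))]

# one sine period of input; thresholds/weights are illustrative only
x = [math.sin(2 * math.pi * n / 200) for n in range(400)]
y = pi_model(x, thresholds=[0.0, 0.1, 0.2], weights=[1.0, 0.5, 0.25])

# same input value on loading (n=25) and unloading (n=75), different outputs:
print(round(y[25], 3), round(y[75], 3))
```

The two printed values differ at the same input, which is the hysteresis loop; with positive weights the loop slope stays positive, while weight choices that make the slope non-positive are exactly the ill-conditioned case the abstract describes.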

  17. Endometrial cancer risk prediction including serum-based biomarkers: results from the EPIC cohort.

    PubMed

    Fortner, Renée T; Hüsing, Anika; Kühn, Tilman; Konar, Meric; Overvad, Kim; Tjønneland, Anne; Hansen, Louise; Boutron-Ruault, Marie-Christine; Severi, Gianluca; Fournier, Agnès; Boeing, Heiner; Trichopoulou, Antonia; Benetou, Vasiliki; Orfanos, Philippos; Masala, Giovanna; Agnoli, Claudia; Mattiello, Amalia; Tumino, Rosario; Sacerdote, Carlotta; Bueno-de-Mesquita, H B As; Peeters, Petra H M; Weiderpass, Elisabete; Gram, Inger T; Gavrilyuk, Oxana; Quirós, J Ramón; Maria Huerta, José; Ardanaz, Eva; Larrañaga, Nerea; Lujan-Barroso, Leila; Sánchez-Cantalejo, Emilio; Butt, Salma Tunå; Borgquist, Signe; Idahl, Annika; Lundin, Eva; Khaw, Kay-Tee; Allen, Naomi E; Rinaldi, Sabina; Dossus, Laure; Gunter, Marc; Merritt, Melissa A; Tzoulaki, Ioanna; Riboli, Elio; Kaaks, Rudolf

    2017-03-15

    Endometrial cancer risk prediction models including lifestyle, anthropometric and reproductive factors have limited discrimination. Adding biomarker data to these models may improve predictive capacity; to our knowledge, this has not been investigated for endometrial cancer. Using a nested case-control study within the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort, we investigated the improvement in discrimination gained by adding serum biomarker concentrations to risk estimates derived from an existing risk prediction model based on epidemiologic factors. Serum concentrations of sex steroid hormones, metabolic markers, growth factors, adipokines and cytokines were evaluated in a step-wise backward selection process; biomarkers were retained at p < 0.157 indicating improvement in the Akaike information criterion (AIC). Improvement in discrimination was assessed using the C-statistic for all biomarkers alone, and change in C-statistic from addition of biomarkers to preexisting absolute risk estimates. We used internal validation with bootstrapping (1000-fold) to adjust for over-fitting. Adiponectin, estrone, interleukin-1 receptor antagonist, tumor necrosis factor-alpha and triglycerides were selected into the model. After accounting for over-fitting, discrimination was improved by 2.0 percentage points when all evaluated biomarkers were included and 1.7 percentage points in the model including the selected biomarkers. Models including etiologic markers on independent pathways and genetic markers may further improve discrimination. © 2016 UICC.
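The "change in C-statistic" being reported can be illustrated with a toy calculation on made-up risk scores; the function below is the standard concordance computation, not the EPIC analysis code, and all scores are invented.

```python
# C-statistic (AUC) for two nested risk models on fabricated data, to show
# the kind of discrimination change the abstract reports.

def c_statistic(scores, labels):
    # probability a random case scores higher than a random control (ties = 0.5)
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels       = [1, 1, 1, 0, 0, 0, 0]
base_risk    = [0.30, 0.22, 0.18, 0.25, 0.20, 0.10, 0.05]   # epidemiologic model
with_markers = [0.40, 0.28, 0.21, 0.24, 0.15, 0.08, 0.04]   # + biomarkers
delta = c_statistic(with_markers, labels) - c_statistic(base_risk, labels)
print(round(delta, 3))
```

In the study the analogous delta, after bootstrap correction for over-fitting, was about 0.02 (2.0 percentage points).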

  18. Examining quality improvement programs: the case of Minnesota hospitals.

    PubMed

    Olson, John R; Belohlav, James A; Cook, Lori S; Hays, Julie M

    2008-10-01

    To determine whether there is a hierarchy of improvement program adoption by hospitals, and to outline that hierarchy. Primary data were collected in the spring of 2007 via e-survey from 210 individuals representing 109 Minnesota hospitals; secondary data from 2006 were assembled from the Leapfrog database. As part of a larger survey, respondents were given a list of improvement programs and asked to identify those programs used in their hospital. Rasch model analysis was used to assess whether a unidimensional construct exists that defines a hospital's ability to implement performance improvement programs, and linear regression analysis was used to assess the relationship of the Rasch ability scores with Leapfrog Safe Practices Scores to validate the research findings. The results of the study show that hospitals vary widely in their ability to implement improvement programs, and that improvement programs present differing levels of difficulty for hospitals trying to implement them. Our findings also indicate that the ability to adopt improvement programs is important to the overall performance of hospitals. There is a hierarchy of improvement programs in the health care context: a hospital's ability to successfully adopt improvement programs is a function of its existing capabilities, and as a hospital's capability increases, the ability to successfully implement higher-level programs also increases.
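The Rasch model underlying the analysis relates a latent ability theta and an item difficulty b through a logistic function; in this setting the "items" are improvement programs. The program names and all numbers below are hypothetical.

```python
import math

# Rasch idea in miniature: probability that a hospital with latent ability
# theta adopts a program of difficulty b is logistic in (theta - b).

def p_adopt(theta, b):
    return 1.0 / (1.0 + math.exp(-(theta - b)))

difficulties = {"5S": -1.0, "PDSA cycles": 0.0, "Six Sigma": 1.5}
for theta in (-0.5, 1.0):                 # a lower- and a higher-ability hospital
    probs = {prog: round(p_adopt(theta, b), 2) for prog, b in difficulties.items()}
    print(theta, probs)
```

Harder programs (larger b) are adopted with lower probability at every ability level, and raising ability raises all adoption probabilities, which is the hierarchy the study reports.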

  19. Computer model for the cardiovascular system: development of an e-learning tool for teaching of medical students.

    PubMed

    Warriner, David Roy; Bayley, Martin; Shi, Yubing; Lawford, Patricia Victoria; Narracott, Andrew; Fenner, John

    2017-11-21

    This study combined themes in cardiovascular modelling, clinical cardiology and e-learning to create an on-line environment that would assist undergraduate medical students in understanding key physiological and pathophysiological processes in the cardiovascular system. An interactive on-line environment was developed incorporating a lumped-parameter mathematical model of the human cardiovascular system. The model outputs were used to characterise the progression of key disease processes and allowed students to classify disease severity with the aim of improving their understanding of abnormal physiology in a clinical context. Access to the on-line environment was offered to students at all stages of undergraduate training as an adjunct to routine lectures and tutorials in cardiac pathophysiology. Student feedback was collected on this novel on-line material in the course of routine audits of teaching delivery. Medical students, irrespective of their stage of undergraduate training, reported that they found the models and the environment interesting and a positive experience. After exposure to the environment, there was a statistically significant improvement in student performance on a series of 6 questions based on cardiovascular medicine, with 33% and 22% increases in the number of questions answered correctly (p < 0.0001 and p < 0.001, respectively). Considerable improvement was found in students' knowledge and understanding during assessment after exposure to the e-learning environment. Opportunities exist for development of similar environments in other fields of medicine, refinement of the existing environment and further engagement with student cohorts. This work combines some exciting and developing fields in medical education, but routine adoption of these types of tool will be possible only with the engagement of all stakeholders, from educationalists, clinicians and modellers to, most importantly, medical students.

  20. Recent progress in empirical modeling of ion composition in the topside ionosphere

    NASA Astrophysics Data System (ADS)

    Truhlik, Vladimir; Triskova, Ludmila; Bilitza, Dieter; Kotov, Dmytro; Bogomaz, Oleksandr; Domnin, Igor

    2016-07-01

    The last deep and prolonged solar minimum revealed shortcomings of existing empirical models, especially of models of parameters that depend strongly on solar activity, such as the IRI (International Reference Ionosphere) ion composition model, which are based on data sets from previous solar cycles. We have improved the TTS-03 ion composition model (Triskova et al., 2003), which has been included in IRI since version 2007. The new model, called AEIKion-13, employs an improved description of the dependence of ion composition on solar activity. We have also developed new global models of the upper transition height based on large data sets of vertical electron density profiles from ISIS, Alouette and COSMIC; the upper transition height is used as an anchor point for adjustment of the AEIKion-13 ion composition model. Additionally, we also show progress on improving the altitudinal dependence of the ion composition in the AEIKion-13 model. Results of the improved model are compared with data from other types of measurements, including data from the Atmosphere Explorer C and E and C/NOFS satellites and the Kharkiv and Arecibo incoherent scatter radars. Possible real-time updating of the model with the upper transition height from real-time COSMIC vertical profiles is discussed. Triskova, L., Truhlik, V., Smilauer, J., 2003. An empirical model of ion composition in the outer ionosphere. Adv. Space Res. 31(3), 653-663.

  1. Steps towards Improving GNSS Systematic Errors and Biases

    NASA Astrophysics Data System (ADS)

    Herring, T.; Moore, M.

    2017-12-01

    Four general areas of analysis method improvements, three related to data analysis models and the fourth to calibration methods, were recommended at the recent unified analysis workshop (UAW), and we discuss aspects of each. The gravity fields used in the GNSS orbit integrations should be updated to match modern fields, making them consistent with the fields used by the other IAG services; the update would include the static part of the field and a time-variable component. The force models associated with radiation forces are the most uncertain, and modeling of these forces can be made more consistent through the exchange of attitude information. The International GNSS Service (IGS) will develop an attitude format and make attitude information available so that analysis centers can validate their models. The IGS has noted the appearance of the GPS draconitic period and its harmonics in time series of various geodetic products (e.g., positions and Earth orientation parameters). An updated short-period (diurnal and semidiurnal) model is needed, along with a method to determine the best model. The final area, not directly related to analysis models, is the recommendation that site-dependent calibration of GNSS antennas is needed, since these calibrations have a direct effect on the ITRF realization and on position offsets when antennas are changed. The effects of using antenna-specific phase center models will be evaluated for those sites where these values are available without disturbing an existing antenna installation. Potential development of an in-situ antenna calibration system is strongly encouraged; in-situ calibration would be deployed at core sites where GNSS sites are tied to other geodetic systems. 
With the recent expansion of the number of GPS satellites transmitting unencrypted codes on the GPS L2 frequency and the availability of software GNSS receivers, in-situ calibration between an existing installation and a movable directional antenna is now more likely to generate accurate results than earlier analog switching systems. With all of these improvements, there is the expectation of better agreement between the space geodetic methods, allowing more definitive assessment and modeling of the Earth's time-variable shape and gravity field.

  2. Video quality assessment using a statistical model of human visual speed perception.

    PubMed

    Wang, Zhou; Li, Qiang

    2007-12-01

    Motion is one of the most important types of information contained in natural video, but direct use of motion information in the design of video quality assessment algorithms has not been deeply investigated. Here we propose to incorporate a recent model of human visual speed perception [Nat. Neurosci. 9, 578 (2006)] and model visual perception in an information communication framework. This allows us to estimate both the motion information content and the perceptual uncertainty in video signals. Improved video quality assessment algorithms are obtained by incorporating the model as spatiotemporal weighting factors, where the weight increases with the information content and decreases with the perceptual uncertainty. Consistent improvement over existing video quality assessment algorithms is observed in our validation with the video quality experts group Phase I test data set.
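A hedged sketch of the weighting idea only (not the paper's exact formulas): give each spatiotemporal block a weight that grows with its estimated motion information content and shrinks with its perceptual uncertainty, then use the weights to pool local quality scores into one number. All values below, including the "local SSIM" scores, are invented.

```python
# Weighted pooling of local quality scores, with weights derived from
# information content minus perceptual uncertainty (both nominally in bits).

def pooled_quality(local_scores, info_bits, uncertainty_bits, eps=1e-6):
    weights = [max(i - u, 0.0) for i, u in zip(info_bits, uncertainty_bits)]
    total = sum(weights) + eps
    return sum(w * q for w, q in zip(weights, local_scores)) / total

scores      = [0.9, 0.4, 0.7]     # per-block quality (e.g. local SSIM)
info        = [3.0, 1.0, 2.0]     # motion information content per block
uncertainty = [1.0, 2.0, 1.0]     # perceptual uncertainty per block
print(round(pooled_quality(scores, info, uncertainty), 3))
```

The middle block, where uncertainty exceeds information, gets zero weight and drops out of the pooled score; this is the sense in which "the weight increases with the information content and decreases with the perceptual uncertainty."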

  3. Effect of quantum learning model in improving creativity and memory

    NASA Astrophysics Data System (ADS)

    Sujatmika, S.; Hasanah, D.; Hakim, L. L.

    2018-04-01

    Quantum learning combines the many interactions that occur during learning. The model can be applied by presenting current, interesting topics contextually and repetitively, and by giving students opportunities to demonstrate their abilities. The bases of the quantum learning model are left-brain and right-brain theory, triune brain theory, visual, auditory, and kinesthetic learning styles, and game, symbol, holistic, and experiential learning theory. Creativity plays an important role in success in the working world; it reveals alternative ways to solve problems or to create something new. Good memory likewise plays a role in successful learning. Through quantum learning, students use all of their abilities, become interested in learning, and create their own ways of memorizing the concepts of the material being studied. From this idea, the researchers hypothesize that quantum learning models can improve students' creativity and memory.

  4. An improved source model for aircraft interior noise studies

    NASA Technical Reports Server (NTRS)

    Mahan, J. R.; Fuller, C. R.

    1985-01-01

    There is concern that advanced turboprop engines currently being developed may produce excessive aircraft cabin noise levels. This concern has stimulated renewed interest in developing aircraft interior noise reduction methods that do not significantly increase takeoff weight. An existing analytical model for noise transmission into aircraft cabins was utilized to investigate the behavior of an improved propeller source model for use in aircraft interior noise studies. The new source model, a virtually rotating dipole, is shown to adequately match measured fuselage sound pressure distributions, including the correct phase relationships, for published data. The virtually rotating dipole is used to study the sensitivity of synchrophasing effectiveness to the fuselage sound pressure trace velocity distribution. Results of calculations are presented which reveal the importance of correctly modeling the surface pressure phase relations in synchrophasing and other aircraft interior noise studies.

  5. An improved source model for aircraft interior noise studies

    NASA Technical Reports Server (NTRS)

    Mahan, J. R.; Fuller, C. R.

    1985-01-01

    There is concern that advanced turboprop engines currently being developed may produce excessive aircraft cabin noise levels. This concern has stimulated renewed interest in developing aircraft interior noise reduction methods that do not significantly increase takeoff weight. An existing analytical model for noise transmission into aircraft cabins was utilized to investigate the behavior of an improved propeller source model for use in aircraft interior noise studies. The new source model, a virtually rotating dipole, is shown to adequately match measured fuselage sound pressure distributions, including the correct phase relationships, for published data. The virtually rotating dipole is used to study the sensitivity of synchrophasing effectiveness to the fuselage sound pressure trace velocity distribution. Results of calculations are presented which reveal the importance of correctly modeling the surface pressure phase relations in synchrophasing and other aircraft interior noise studies.

  6. Optimization of single photon detection model based on GM-APD

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Yang, Yi; Hao, Peiyu

    2017-11-01

    High-precision laser ranging over distances of one hundred kilometers requires a detector with very strong sensitivity to extremely weak light. At present, the Geiger-mode avalanche photodiode (GM-APD) is widely used: it offers high sensitivity and high photoelectric conversion efficiency. Selecting and designing the detector parameters according to the system requirements is of great importance for improving photon detection efficiency, and design optimization requires a good model. In this paper, we examine the existing Poisson-distribution model and account for the important detector parameters of dark count rate, dead time, quantum efficiency, and so on. We improve the detection model and select appropriate parameters to achieve optimal photon detection efficiency. The simulation is carried out in Matlab and compared with actual test results, verifying the rationality of the model. The model has reference value in engineering applications.
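The Poisson detection model the authors start from can be sketched in its generic textbook form (parameter values below are illustrative, not from the paper): primary electrons in a range gate are Poisson-distributed, so the probability of at least one avalanche combines quantum efficiency, signal photons, and dark counts.

```python
import math

# Generic GM-APD firing probability: mean primary electrons in the gate is
# eta * n_photons + dark_rate * gate, and P(fire) = 1 - P(zero events).

def p_fire(n_photons, eta, dark_rate, gate_s):
    mean = eta * n_photons + dark_rate * gate_s   # mean primary electrons in gate
    return 1.0 - math.exp(-mean)

eta, dark_rate, gate = 0.2, 1e3, 100e-9           # QE, dark counts/s, 100 ns gate
p_signal = p_fire(5.0, eta, dark_rate, gate)      # echo of ~5 photons present
p_dark   = p_fire(0.0, eta, dark_rate, gate)      # false-alarm probability
print(round(p_signal, 3), round(p_dark, 6))
```

Trading off gate width, quantum efficiency, and dark rate against detection and false-alarm probability is exactly the kind of parameter selection the abstract describes (dead time would further reduce the effective detection rate and is omitted from this sketch).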

  7. Cumulative complexity: a functional, patient-centered model of patient complexity can improve research and practice.

    PubMed

    Shippee, Nathan D; Shah, Nilay D; May, Carl R; Mair, Frances S; Montori, Victor M

    2012-10-01

    To design a functional, patient-centered model of patient complexity with practical applicability to analytic design and clinical practice. Existing literature on patient complexity has mainly identified its components descriptively and in isolation, lacking clarity as to their combined functions in disrupting care or to how complexity changes over time. The authors developed a cumulative complexity model, which integrates existing literature and emphasizes how clinical and social factors accumulate and interact to complicate patient care. A narrative literature review is used to explicate the model. The model emphasizes a core, patient-level mechanism whereby complicating factors impact care and outcomes: the balance between patient workload of demands and patient capacity to address demands. Workload encompasses the demands on the patient's time and energy, including demands of treatment, self-care, and life in general. Capacity concerns ability to handle work (e.g., functional morbidity, financial/social resources, literacy). Workload-capacity imbalances comprise the mechanism driving patient complexity. Treatment and illness burdens serve as feedback loops, linking negative outcomes to further imbalances, such that complexity may accumulate over time. With its components largely supported by existing literature, the model has implications for analytic design, clinical epidemiology, and clinical practice. Copyright © 2012 Elsevier Inc. All rights reserved.
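A toy simulation (ours, not the authors') of the feedback loop the model describes: when workload exceeds capacity, outcomes slip, and the burden of poor outcomes adds back into the next period's workload, so complexity accumulates. All numbers are arbitrary.

```python
# Cumulative complexity sketch: workload grows each period by new demands plus
# a feedback share of any workload-capacity imbalance.

def accumulate(workload, capacity, new_demands, periods=5, feedback=0.5):
    history = []
    for _ in range(periods):
        imbalance = max(workload - capacity, 0.0)
        workload = workload + new_demands + feedback * imbalance
        history.append(round(workload, 2))
    return history

balanced   = accumulate(workload=2.0, capacity=6.0, new_demands=0.5)
overloaded = accumulate(workload=5.0, capacity=4.0, new_demands=0.5)
print(balanced)    # stays linear while workload < capacity
print(overloaded)  # compounds once the imbalance feedback kicks in
```

The qualitative point is the model's: below capacity, demands add up linearly; above capacity, the treatment- and illness-burden feedback makes complexity compound over time.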

  8. Application of nonlinear least-squares regression to ground-water flow modeling, west-central Florida

    USGS Publications Warehouse

    Yobbi, D.K.

    2000-01-01

    A nonlinear least-squares regression technique for estimating ground-water flow model parameters was applied to an existing model of the regional aquifer system underlying west-central Florida. The regression technique minimizes the differences between measured and simulated water levels. Regression statistics, including parameter sensitivities and correlations, were calculated for reported parameter values in the existing model. Optimal parameter values for selected hydrologic variables of interest were estimated by nonlinear regression. Optimal parameter estimates ranged from about 140 times greater than to about 0.01 times the reported values. Independently estimating all parameters by nonlinear regression was impossible, given the existing zonation structure and number of observations, because of parameter insensitivity and correlation. Although the model yields parameter values similar to those estimated by other methods and reproduces the measured water levels reasonably accurately, a simpler parameter structure should be considered. Some possible ways of improving model calibration are to: (1) modify the defined parameter-zonation structure by omitting and/or combining parameters to be estimated; (2) carefully eliminate observation data based on evidence that they are likely to be biased; (3) collect additional water-level data; (4) assign values to insensitive parameters; and (5) estimate the most sensitive parameters first, then, using the optimized values for these parameters, estimate the entire data set.
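The core regression step (adjust parameters to minimize squared differences between measured and simulated water levels) can be sketched with a toy drawdown formula standing in for the regional flow model; the Cooper-Jacob expression and the small Gauss-Newton loop below are generic, not the study's code, and every number is invented.

```python
import math

# Toy calibration: fit transmissivity T and storativity S of a Cooper-Jacob
# drawdown model to synthetic "observed" heads by Gauss-Newton least squares.

def drawdown(T, S, t, Q=100.0, r=50.0):
    return (Q / (4 * math.pi * T)) * math.log(2.25 * T * t / (r * r * S))

times = [0.5, 1.0, 2.0, 4.0, 8.0]
observed = [drawdown(300.0, 2e-4, t) for t in times]    # synthetic observations

def residuals(p):
    T, S = math.exp(p[0]), math.exp(p[1])               # log parameters keep T, S > 0
    return [drawdown(T, S, t) - o for t, o in zip(times, observed)]

def gauss_newton(p, steps=50, h=1e-7):
    n = len(times)
    for _ in range(steps):
        r0 = residuals(p)
        J = [[0.0, 0.0] for _ in range(n)]
        for j in range(2):                              # finite-difference Jacobian
            q = list(p)
            q[j] += h
            rq = residuals(q)
            for i in range(n):
                J[i][j] = (rq[i] - r0[i]) / h
        # solve the 2x2 normal equations (J^T J) dp = -J^T r
        a = sum(J[i][0] * J[i][0] for i in range(n))
        b = sum(J[i][0] * J[i][1] for i in range(n))
        d = sum(J[i][1] * J[i][1] for i in range(n))
        g0 = -sum(J[i][0] * r0[i] for i in range(n))
        g1 = -sum(J[i][1] * r0[i] for i in range(n))
        det = a * d - b * b
        p = [p[0] + (d * g0 - b * g1) / det,
             p[1] + (a * g1 - b * g0) / det]
    return math.exp(p[0]), math.exp(p[1])

T_est, S_est = gauss_newton([math.log(200.0), math.log(1e-4)])
print(round(T_est, 2), round(S_est, 8))
```

The two Jacobian columns in this toy problem are strongly correlated, a small-scale version of the parameter insensitivity and correlation that made independent estimation of all parameters impossible in the study.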

  9. Comparing and improving reconstruction methods for proxies based on compositional data

    NASA Astrophysics Data System (ADS)

    Nolan, C.; Tipton, J.; Booth, R.; Jackson, S. T.; Hooten, M.

    2017-12-01

    Many types of studies in paleoclimatology and paleoecology involve compositional data. Often, these studies aim to use compositional data to reconstruct an environmental variable of interest; the reconstruction is usually done via the development of a transfer function. Transfer functions have been developed using many different methods. Existing methods tend to relate the compositional data and the reconstruction target in very simple ways. Additionally, the results from different methods are rarely compared. Here we seek to address these two issues. First, we introduce a new hierarchical Bayesian multivariate Gaussian process model; this model allows the relationship between each species in the compositional dataset and the environmental variable to be modeled in a way that captures the underlying complexities. Then, we compare this new method to machine learning techniques and commonly used existing methods. The comparisons are based on reconstructing the water table depth history of Caribou Bog (an ombrotrophic Sphagnum peat bog in Old Town, Maine, USA) from a new 7500 year long record of testate amoebae assemblages. The resulting reconstructions from different methods diverge in both their resulting means and uncertainties. In particular, uncertainty tends to be drastically underestimated by some common methods. These results will help to improve inference of water table depth from testate amoebae. Furthermore, this approach can be applied to test and improve inferences of past environmental conditions from a broad array of paleo-proxies based on compositional data.
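A one-dimensional Gaussian-process regression sketch conveys the basic mechanics of a GP transfer function (this is the standard GP predictive equation with an RBF kernel, not the paper's hierarchical multivariate model; the proxy scores and depths are invented):

```python
import math

# GP regression: fit (proxy score -> water table depth) pairs, then predict
# mean and variance at a new score via the standard predictive equations.

def rbf(a, b, ls=1.0, var=25.0):
    return var * math.exp(-0.5 * ((a - b) / ls) ** 2)

def solve(A, b):
    # naive Gaussian elimination with partial pivoting (A is small and PD here)
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

xs = [0.0, 1.0, 2.0, 3.0]          # hypothetical proxy (assemblage) scores
ys = [5.0, 9.0, 14.0, 18.0]        # water table depths (cm), training data
noise = 1e-4
K = [[rbf(a, b) + (noise if i == j else 0.0) for j, b in enumerate(xs)]
     for i, a in enumerate(xs)]
alpha = solve(K, ys)

def predict(x_new):
    ks = [rbf(x_new, a) for a in xs]
    mean = sum(k * al for k, al in zip(ks, alpha))
    v = solve(K, ks)
    var = rbf(x_new, x_new) - sum(k * vi for k, vi in zip(ks, v))
    return mean, max(var, 0.0)

m, v = predict(1.5)
print(round(m, 1), round(v, 4))
```

Unlike the simple linear transfer functions the abstract criticizes, the GP returns a predictive variance alongside the mean, which is how this family of methods avoids the drastic uncertainty underestimation noted in the comparison.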

  10. Optimizing airport capacity utilization in air traffic flow management subject to constraints at arrival and departure fixes

    DOT National Transportation Integrated Search

    1997-09-01

    This paper formulates a new approach for improvement of air traffic flow management at airports, which leads to more efficient utilization of existing airport capacity to alleviate the consequences of congestion. A new model is presented, whi...

  11. Potential capabilities of lunar laser ranging for geodesy and relativity

    NASA Technical Reports Server (NTRS)

    Muller, Jurgen; Williams, James G.; Turyshev, Slava G.; Shelus, Peter J.

    2005-01-01

    Here, we review the LLR technique, focusing on its impact on geodesy and relativity. We discuss the modern observational accuracy and the level of existing LLR modeling. We present the near-term objectives and emphasize improvements needed to fully utilize the scientific potential of LLR.

  12. Prediction of brittleness based on anisotropic rock physics model for kerogen-rich shale

    NASA Astrophysics Data System (ADS)

    Qian, Ke-Ran; He, Zhi-Liang; Chen, Ye-Quan; Liu, Xi-Wu; Li, Xiang-Yang

    2017-12-01

    The construction of a shale rock physics model and the selection of an appropriate brittleness index (BI) are two significant steps that can influence the accuracy of brittleness prediction. On one hand, the existing models of kerogen-rich shale are controversial, so a reasonable rock physics model needs to be built. On the other hand, several types of equations already exist for predicting the BI, whose feasibility needs to be carefully considered. This study constructed a kerogen-rich rock physics model by applying the self-consistent approximation and the differential effective medium theory to model intercoupled clay and kerogen mixtures. The feasibility of our model was confirmed by comparison with classical models, showing better accuracy. Templates were constructed based on our model to link physical properties and the BI. Different equations for the BI had different sensitivities, making them suitable for different types of formations. Equations based on Young's modulus were sensitive to variations in lithology, while those using Lamé's coefficients were sensitive to porosity and pore fluids. Physical information must be considered to improve brittleness prediction.

  13. Modelling and characterization of primary settlers in view of whole plant and resource recovery modelling.

    PubMed

    Bachis, Giulia; Maruéjouls, Thibaud; Tik, Sovanna; Amerlinck, Youri; Melcer, Henryk; Nopens, Ingmar; Lessard, Paul; Vanrolleghem, Peter A

    2015-01-01

    Characterization and modelling of primary settlers have largely been neglected to date. However, whole plant and resource recovery modelling requires primary settler model development, as current models lack detail in describing the dynamics and the diversity of the removal process for different particulate fractions. This paper focuses on the improved modelling and experimental characterization of primary settlers. First, a new modelling concept based on particle settling velocity distribution is proposed and then applied to the development of an improved primary settler model as well as to its characterization under addition of chemicals (chemically enhanced primary treatment, CEPT). This model is compared to two existing simple primary settler models (Otterpohl and Freund; Lessard and Beck), proving better than the first and statistically comparable to the second, but with easier calibration thanks to the ease with which wastewater characteristics can be translated into model parameters. Second, the changes in activated sludge model (ASM)-based chemical oxygen demand fractionation between inlet and outlet induced by primary settling are investigated, showing that typical wastewater fractions are modified by primary treatment. As they clearly impact the downstream processes, both model improvements demonstrate the need for more detailed primary settler models in view of whole plant modelling.

  14. Modeling Input Errors to Improve Uncertainty Estimates for Sediment Transport Model Predictions

    NASA Astrophysics Data System (ADS)

    Jung, J. Y.; Niemann, J. D.; Greimann, B. P.

    2016-12-01

    Bayesian methods using Markov chain Monte Carlo algorithms have recently been applied to sediment transport models to assess the uncertainty in the model predictions due to the parameter values. Unfortunately, the existing approaches can only attribute overall uncertainty to the parameters. This limitation is critical because no model can produce accurate forecasts if forced with inaccurate input data, even if the model is well founded in physical theory. In this research, an existing Bayesian method is modified to consider the potential errors in input data during the uncertainty evaluation process. The input error is modeled using Gaussian distributions, and the means and standard deviations are treated as uncertain parameters. The proposed approach is tested by coupling it to the Sedimentation and River Hydraulics - One Dimension (SRH-1D) model and simulating a 23-km reach of the Tachia River in Taiwan. The Wu equation in SRH-1D is used for computing the transport capacity for a bed material load of non-cohesive material. Three types of input data are considered uncertain: (1) the input flowrate at the upstream boundary, (2) the water surface elevation at the downstream boundary, and (3) the water surface elevation at a hydraulic structure in the middle of the reach. The benefits of modeling the input errors in the uncertainty analysis are evaluated by comparing the accuracy of the most likely forecast and the coverage of the observed data by the credible intervals to those of the existing method. The results indicate that the internal boundary condition has the largest uncertainty among those considered. Overall, the uncertainty estimates from the new method are notably different from those of the existing method for both the calibration and forecast periods.
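
    The key move in this record is to treat input errors as extra uncertain parameters inside the Markov chain Monte Carlo sampler. A minimal sketch with a plain Metropolis random walk is shown below; the power-law "transport model", the constant inflow bias, and all numbers are invented stand-ins, not SRH-1D or the Tachia River data:

```python
import numpy as np

rng = np.random.default_rng(0)

def transport_model(capacity, inflow):
    """Toy stand-in for a sediment transport model: load grows with inflow."""
    return capacity * inflow**1.5

# Synthetic truth: the recorded inflows carry a constant bias of +0.2, so a
# model forced with the raw record cannot match the observations.
true_capacity, true_bias = 2.0, 0.2
inflow_record = rng.uniform(1.0, 3.0, size=40)
obs = transport_model(true_capacity, inflow_record + true_bias) \
      + rng.normal(0.0, 0.1, size=40)

def log_post(theta):
    capacity, bias = theta
    if capacity <= 0.0:
        return -np.inf
    pred = transport_model(capacity, inflow_record + bias)
    log_lik = -0.5 * np.sum((obs - pred) ** 2) / 0.1**2
    log_prior = -0.5 * bias**2 / 0.5**2  # Gaussian prior on the input error
    return log_lik + log_prior

# Plain Metropolis random walk over (capacity, input bias): the input error
# is sampled alongside the physical parameter, so forecast uncertainty
# reflects both sources.
theta = np.array([2.0, 0.0])
lp = log_post(theta)
samples = []
for _ in range(6000):
    prop = theta + rng.normal(0.0, 0.005, size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples[2000:])
```

After burn-in, the marginal for the bias term concentrates near the true input error, which is the diagnostic the record uses to flag the internal boundary condition as the dominant uncertainty.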

  15. A constitutive model accounting for strain ageing effects on work-hardening. Application to a C-Mn steel

    NASA Astrophysics Data System (ADS)

    Ren, Sicong; Mazière, Matthieu; Forest, Samuel; Morgeneyer, Thilo F.; Rousselier, Gilles

    2017-12-01

    One of the most successful models for describing the Portevin-Le Chatelier effect in engineering applications is the Kubin-Estrin-McCormick model (KEMC). In the present work, the influence of dynamic strain ageing on dynamic recovery due to dislocation annihilation is introduced in order to improve the KEMC model. This modification accounts for additional strain hardening rate due to limited dislocation annihilation by the diffusion of solute atoms and dislocation pinning at low strain rate and/or high temperature. The parameters associated with this novel formulation are identified based on tensile tests for a C-Mn steel at seven temperatures ranging from 20 °C to 350 °C. The validity of the model and the improvement compared to existing models are tested using 2D and 3D finite element simulations of the Portevin-Le Chatelier effect in tension.

  16. The ODD protocol: A review and first update

    USGS Publications Warehouse

    Grimm, Volker; Berger, Uta; DeAngelis, Donald L.; Polhill, J. Gary; Giske, Jarl; Railsback, Steve F.

    2010-01-01

    The 'ODD' (Overview, Design concepts, and Details) protocol was published in 2006 to standardize the published descriptions of individual-based and agent-based models (ABMs). The primary objectives of ODD are to make model descriptions more understandable and complete, thereby making ABMs less subject to criticism for being irreproducible. We have systematically evaluated existing uses of the ODD protocol and identified, as expected, parts of ODD needing improvement and clarification. Accordingly, we revise the definition of ODD to clarify aspects of the original version and thereby facilitate future standardization of ABM descriptions. We discuss frequently raised critiques of ODD but also two emerging, and unanticipated, benefits: ODD improves the rigorous formulation of models and helps make the theoretical foundations of large models more visible. Although the protocol was designed for ABMs, it can help with documenting any large, complex model, alleviating some general objections against such models.

  17. Modeling Analysis for NASA GRC Vacuum Facility 5 Upgrade

    NASA Technical Reports Server (NTRS)

    Yim, J. T.; Herman, D. A.; Burt, J. M.

    2013-01-01

    A model of the VF5 test facility at NASA Glenn Research Center was developed using the direct simulation Monte Carlo Hypersonic Aerothermodynamics Particle (HAP) code. The model results were compared to several cold flow and thruster hot fire cases. The main uncertainty in the model is the determination of the effective sticking coefficient, which sets the pumping effectiveness of the cryopanels and oil diffusion pumps, including baffle transmission. An effective sticking coefficient of 0.25 was found to provide generally good agreement with the experimental chamber pressure data. The model, which assumes a cold diffuse inflow, also fared satisfactorily in predicting the pressure distribution during thruster operation. The model was used to assess other chamber configurations to improve the local effective pumping speed near the thruster. A new configuration of the existing cryopumps is found to show more than a 2x improvement over the current baseline configuration.

  18. Improving Global Health Education: Development of a Global Health Competency Model

    PubMed Central

    Ablah, Elizabeth; Biberman, Dorothy A.; Weist, Elizabeth M.; Buekens, Pierre; Bentley, Margaret E.; Burke, Donald; Finnegan, John R.; Flahault, Antoine; Frenk, Julio; Gotsch, Audrey R.; Klag, Michael J.; Lopez, Mario Henry Rodriguez; Nasca, Philip; Shortell, Stephen; Spencer, Harrison C.

    2014-01-01

    Although global health is a recommended content area for the future of education in public health, no standardized global health competency model existed for master-level public health students. Without such a competency model, academic institutions are challenged to ensure that students are able to demonstrate the knowledge, skills, and attitudes (KSAs) needed for successful performance in today's global health workforce. The Association of Schools of Public Health (ASPH) sought to address this need by facilitating the development of a global health competency model through a multistage modified-Delphi process. Practitioners and academic global health experts provided leadership and guidance throughout the competency development process. The resulting product, the Global Health Competency Model 1.1, includes seven domains and 36 competencies. The Global Health Competency Model 1.1 provides a platform for engaging educators, students, and global health employers in discussion of the KSAs needed to improve human health on a global scale. PMID:24445206

  19. Audio visual speech source separation via improved context dependent association model

    NASA Astrophysics Data System (ADS)

    Kazemi, Alireza; Boostani, Reza; Sobhanmanesh, Fariborz

    2014-12-01

    In this paper, we exploit the non-linear relation between a speech source and its associated lip video as a source of extra information to propose an improved audio-visual speech source separation (AVSS) algorithm. The audio-visual association is modeled using a neural associator which estimates the visual lip parameters from a temporal context of acoustic observation frames. We define an objective function based on the mean square error (MSE) between estimated and target visual parameters. This function is minimized to estimate the de-mixing vector/filters that separate the relevant source from linear instantaneous or time-domain convolutive mixtures. We also propose a hybrid criterion which uses AV coherency together with kurtosis as a non-Gaussianity measure. Experimental results are presented and compared in terms of visually relevant speech detection accuracy and output signal-to-interference ratio (SIR) of source separation. The suggested audio-visual model significantly improves relevant speech classification accuracy compared to the existing GMM-based model, and the proposed AVSS algorithm improves the speech separation quality compared to reference ICA- and AVSS-based methods.

  20. NASA's Potential Contributions for Remediation of Retention Ponds Using Solar Ultraviolet Radiation and Photocatalysis

    NASA Technical Reports Server (NTRS)

    Underwood, Lauren W.; Ryan, Robert E.

    2007-01-01

    This Candidate Solution uses NASA Earth science research on atmospheric ozone and aerosols data (1) to help improve the prediction capabilities of water runoff models that are used to estimate runoff pollution from retention ponds, and (2) to understand the pollutant removal contribution and potential of photocatalytically coated materials that could be used in these ponds. Models (the EPA's SWMM and the USGS SLAMM) exist that estimate the release of pollutants into the environment from storm-water-related retention pond runoff. UV irradiance data acquired from the satellite mission Aura and from the OMI Surface UV algorithm will be incorporated into these models to enhance their capabilities, not only by increasing the general understanding of retention pond function (both the efficacy and efficiency) but additionally by adding photocatalytic materials to these retention ponds, augmenting their performance. State and local officials who run pollution protection programs could then develop and implement photocatalytic technologies for water pollution control in retention ponds and use them in conjunction with existing runoff models. More effective decisions about water pollution protection programs could be made, the persistence and toxicity of waste generated could be minimized, and subsequently our natural water resources would be improved. This Candidate Solution is in alignment with the Water Management and Public Health National Applications.

  1. An Integrated Ensemble-Based Operational Framework to Predict Urban Flooding: A Case Study of Hurricane Sandy in the Passaic and Hackensack River Basins

    NASA Astrophysics Data System (ADS)

    Saleh, F.; Ramaswamy, V.; Georgas, N.; Blumberg, A. F.; Wang, Y.

    2016-12-01

    Advances in computational resources and modeling techniques are opening the path to effectively integrate existing complex models. In the context of flood prediction, recent extreme events have demonstrated the importance of integrating components of the hydrosystem to better represent the interactions amongst different physical processes and phenomena. As such, there is a pressing need to develop holistic and cross-disciplinary modeling frameworks that effectively integrate existing models and better represent the operative dynamics. This work presents a novel Hydrologic-Hydraulic-Hydrodynamic Ensemble (H3E) flood prediction framework that operationally integrates existing predictive models representing coastal (New York Harbor Observing and Prediction System, NYHOPS), hydrologic (US Army Corps of Engineers Hydrologic Modeling System, HEC-HMS) and hydraulic (2-dimensional River Analysis System, HEC-RAS) components. The state-of-the-art framework is forced with 125 ensemble meteorological inputs from numerical weather prediction models including the Global Ensemble Forecast System, the European Centre for Medium-Range Weather Forecasts (ECMWF), the Canadian Meteorological Centre (CMC), the Short Range Ensemble Forecast (SREF) and the North American Mesoscale Forecast System (NAM). The framework produces, within a 96-hour forecast horizon, on-the-fly Google Earth flood maps that provide critical information for decision makers and emergency preparedness managers. The utility of the framework was demonstrated by retrospectively forecasting an extreme flood event, hurricane Sandy in the Passaic and Hackensack watersheds (New Jersey, USA). Hurricane Sandy caused significant damage to a number of critical facilities in this area including the New Jersey Transit's main storage and maintenance facility. 
The results of this work demonstrate that ensemble-based frameworks provide improved flood predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared to a deterministic forecast. The work offers perspectives for short-term flood forecasts, flood mitigation strategies and best management practices for climate change scenarios.

  2. Intervertebral reaction force prediction using an enhanced assembly of OpenSim models.

    PubMed

    Senteler, Marco; Weisse, Bernhard; Rothenfluh, Dominique A; Snedeker, Jess G

    2016-01-01

    OpenSim offers a valuable approach to investigating otherwise difficult to assess yet important biomechanical parameters such as joint reaction forces. Although the range of available models in the public repository is continually increasing, there currently exists no OpenSim model for the computation of intervertebral joint reactions during flexion and lifting tasks. The current work combines and improves elements of existing models to develop an enhanced model of the upper body and lumbar spine. Models of the upper body with extremities, neck and head were combined with an improved version of a lumbar spine from the model repository. Translational motion was enabled for each lumbar vertebra with six controllable degrees of freedom. Motion segment stiffness was implemented at lumbar levels and mass properties were assigned throughout the model. Moreover, body coordinate frames of the spine were modified to allow straightforward variation of sagittal alignment and to simplify interpretation of results. Evaluation of model predictions for levels L1-L2, L3-L4 and L4-L5 in various postures of forward flexion and moderate lifting (8 kg) revealed agreement within 10% of experimental studies and model-based computational analyses. However, in an extended posture or during lifting of heavier loads (20 kg), computed joint reactions differed substantially from reported in vivo measures using instrumented implants. We conclude that agreement between the model and available experimental data was good in view of limitations of both the model and the validation datasets. The presented model is useful in that it permits computation of realistic lumbar spine joint reaction forces during flexion and moderate lifting tasks. The model and corresponding documentation are now available in the online OpenSim repository.

  3. New lightcurve of asteroid (216) Kleopatra to evaluate the shape model

    NASA Astrophysics Data System (ADS)

    Hannan, Melissa A.; Howell, Ellen S.; Woodney, Laura M.; Taylor, Patrick A.

    2014-11-01

    Asteroid 216 Kleopatra is an M-class asteroid in the Main Belt with an unusual shape model that looks like a dog bone. This model was created from radar data taken at Arecibo Observatory (Ostro et al. 1999). The discovery of satellites orbiting Kleopatra (Marchis et al. 2008) has led to determination of its mass and density (Descamps et al. 2011). New higher-quality data were taken to improve upon the existing shape model. Radar images were obtained in November and December 2013 at Arecibo Observatory with a resolution of 10.5 km per pixel. In addition, observations were made with the fully automated 20-inch telescope of the Murillo Family Observatory located on the CSUSB campus. The telescope was equipped with an Apogee U16M CCD camera with a 31 arcmin square field of view and BVR filters. Image data were acquired on 7 and 9 November 2013 under mostly clear conditions and with 2x2 binning to a pixel scale of 0.9 arcseconds per pixel. These images were taken close in time to the radar observations in order to determine the rotational phase. These data can also be used to look for color changes with rotation. We used the lightcurve and the existing radar shape model to simulate the new radar observations. Although the model matches fairly well overall, it does not reproduce all of the features in the images, indicating that the model can be improved. Results of this analysis will be presented.

  4. Rapid Automated Aircraft Simulation Model Updating from Flight Data

    NASA Technical Reports Server (NTRS)

    Brian, Geoff; Morelli, Eugene A.

    2011-01-01

    Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.

  5. Genetic Algorithms and Local Search

    NASA Technical Reports Server (NTRS)

    Whitley, Darrell

    1996-01-01

    The first part of this presentation is a tutorial level introduction to the principles of genetic search and models of simple genetic algorithms. The second half covers the combination of genetic algorithms with local search methods to produce hybrid genetic algorithms. Hybrid algorithms can be modeled within the existing theoretical framework developed for simple genetic algorithms. An application of a hybrid to geometric model matching is given. The hybrid algorithm yields results that improve on the current state-of-the-art for this problem.
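
    The hybrid idea above is simply a genetic algorithm whose offspring are refined by a local search step before re-entering the population (often called a memetic algorithm). A minimal sketch on an invented bit-matching objective, not the geometric model matching application of the record:

```python
import random

random.seed(1)
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0]

def fitness(bits):
    """Count of positions matching the target pattern."""
    return sum(b == t for b, t in zip(bits, TARGET))

def local_search(bits):
    """One pass of bit-flip hill climbing: the 'local' half of the hybrid."""
    bits = bits[:]
    for i in range(len(bits)):
        flipped = bits[:]
        flipped[i] ^= 1
        if fitness(flipped) > fitness(bits):
            bits = flipped
    return bits

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
for _ in range(15):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]              # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = random.sample(parents, 2)
        child = crossover(a, b)
        if random.random() < 0.2:   # point mutation
            j = random.randrange(len(child))
            child[j] ^= 1
        children.append(local_search(child))  # the hybrid step
    pop = children

best = max(pop, key=fitness)
```

On this separable toy objective the local search alone solves the problem; in harder landscapes the genetic operators supply the global exploration that hill climbing lacks, which is the division of labor the presentation describes.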

  6. Predicting Appropriate Admission of Bronchiolitis Patients in the Emergency Department: Rationale and Methods.

    PubMed

    Luo, Gang; Stone, Bryan L; Johnson, Michael D; Nkoy, Flory L

    2016-03-07

    In young children, bronchiolitis is the most common illness resulting in hospitalization. For children less than age 2, bronchiolitis incurs an annual total inpatient cost of $1.73 billion. Each year in the United States, 287,000 emergency department (ED) visits occur because of bronchiolitis, with a hospital admission rate of 32%-40%. Due to a lack of evidence and objective criteria for managing bronchiolitis, ED disposition decisions (hospital admission or discharge to home) are often made subjectively, resulting in significant practice variation. Studies reviewing admission need suggest that up to 29% of admissions from the ED are unnecessary. About 6% of ED discharges for bronchiolitis result in ED returns with admission. These inappropriate dispositions waste limited health care resources, increase patient and parental distress, expose patients to iatrogenic risks, and worsen outcomes. Existing clinical guidelines for bronchiolitis offer limited improvement in patient outcomes. Methodological shortcomings include the lack of specific thresholds for ED decisions to admit or to discharge, an insufficient level of detail, and failure to account for differences in patient and illness characteristics, including co-morbidities. Predictive models are frequently used to complement clinical guidelines, reduce practice variation, and improve clinicians' decision making. Used in real time, predictive models can present objective criteria supported by historical data for an individualized disease management plan and guide admission decisions. However, existing predictive models for ED patients with bronchiolitis have limitations, including low accuracy and the assumption that the actual ED disposition decision was appropriate. To date, no operational definition of appropriate admission exists. No model has been built based on appropriate admissions, which include both actual admissions that were necessary and actual ED discharges that were unsafe.
The goal of this study is to develop a predictive model to guide appropriate hospital admission for ED patients with bronchiolitis. This study will: (1) develop an operational definition of appropriate hospital admission for ED patients with bronchiolitis, (2) develop and test the accuracy of a new model to predict appropriate hospital admission for an ED patient with bronchiolitis, and (3) conduct simulations to estimate the impact of using the model on bronchiolitis outcomes. We are currently extracting administrative and clinical data from the enterprise data warehouse of an integrated health care system. Our goal is to finish this study by the end of 2019. This study will produce a new predictive model that can be operationalized to guide and improve disposition decisions for ED patients with bronchiolitis. Broad use of the model would reduce iatrogenic risk, patient and parental distress, health care use, and costs and improve outcomes for bronchiolitis patients.

  7. User's Manual for Data for Validating Models for PV Module Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marion, W.; Anderberg, A.; Deline, C.

    2014-04-01

    This user's manual describes performance data measured for flat-plate photovoltaic (PV) modules installed in Cocoa, Florida, Eugene, Oregon, and Golden, Colorado. The data include PV module current-voltage curves and associated meteorological data for approximately one-year periods. These publicly available data are intended to facilitate the validation of existing models for predicting the performance of PV modules, and for the development of new and improved models. For comparing different modeling approaches, using these public data will provide transparency and more meaningful comparisons of the relative benefits.

  8. Yield Model Development (YMD) implementation plan for fiscal years 1981 and 1982

    NASA Technical Reports Server (NTRS)

    Ambroziak, R. A. (Principal Investigator)

    1981-01-01

    A plan is described for supporting USDA crop production forecasting and estimation by (1) testing, evaluating, and selecting crop yield models for application testing; (2) identifying areas of feasible research for improvement of models; and (3) conducting research to modify existing models and to develop new crop yield assessment methods. Tasks to be performed for each of these efforts are described as well as for project management and support. The responsibilities of USDA, USDC, USDI, and NASA are delineated as well as problem areas to be addressed.

  9. Vegetation projections for Wind Cave National Park with three future climate scenarios: Final report in completion of Task Agreement J8W07100052

    USGS Publications Warehouse

    King, David A.; Bachelet, Dominique M.; Symstad, Amy J.

    2013-01-01

    Since the initial application of MC1 to a small portion of WICA (Bachelet et al. 2000), the model has been altered to improve model performance with the inclusion of dynamic fire. Applying this improved version to WICA required substantial recalibration, during which we have made a number of improvements to MC1 that will be incorporated as permanent changes. In this report we document these changes and our calibration procedure following a brief overview of the model. We compare the projections of current vegetation to the current state of the park and present projections of vegetation dynamics under future climates downscaled from three GCMs selected to represent the existing range in available GCM projections. In doing so, we examine the consequences of different management options regarding fire and grazing, major aspects of biotic management at Wind Cave.

  10. An Ensemble Deep Convolutional Neural Network Model with Improved D-S Evidence Fusion for Bearing Fault Diagnosis.

    PubMed

    Li, Shaobo; Liu, Guokai; Tang, Xianghong; Lu, Jianguang; Hu, Jianjun

    2017-07-28

    Intelligent machine health monitoring and fault diagnosis are becoming increasingly important for modern manufacturing industries. Current fault diagnosis approaches mostly depend on expert-designed features for building prediction models. In this paper, we propose IDSCNN, a novel bearing fault diagnosis algorithm based on ensemble deep convolutional neural networks and an improved Dempster-Shafer theory based evidence fusion. The convolutional neural networks take the root mean square (RMS) maps of the FFT (Fast Fourier Transformation) features of the vibration signals from two sensors as inputs. The improved D-S evidence theory is implemented via a distance matrix computed from the evidences and a modified Gini index. Extensive evaluations of IDSCNN on the Case Western Reserve Dataset showed that our algorithm can achieve better fault diagnosis performance than existing machine learning methods by fusing complementary or conflicting evidences from different models and sensors and adapting to different load conditions.
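
    The fusion step rests on Dempster's rule of combination. The sketch below implements only the classic rule over singleton hypotheses (the paper's improved variant with distance weighting and a modified Gini index is not reproduced), and the sensor mass values are invented:

```python
def dempster_combine(m1, m2):
    """Classic Dempster's rule for two mass functions over singleton
    hypotheses: multiply agreeing masses, then renormalize by the
    non-conflicting mass."""
    hypotheses = set(m1) | set(m2)
    raw = {h: m1.get(h, 0.0) * m2.get(h, 0.0) for h in hypotheses}
    conflict = 1.0 - sum(raw.values())
    if conflict >= 1.0:
        raise ValueError("total conflict: evidences cannot be combined")
    return {h: v / (1.0 - conflict) for h, v in raw.items()}

# Two sensors' (hypothetical) evidence over bearing conditions.
sensor_a = {"inner-race": 0.6, "outer-race": 0.3, "normal": 0.1}
sensor_b = {"inner-race": 0.7, "outer-race": 0.2, "normal": 0.1}
fused = dempster_combine(sensor_a, sensor_b)
```

Fusing concentrates mass on the hypothesis both sensors agree on, which is how conflicting or complementary CNN outputs from the two sensors are reconciled into one diagnosis.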

  12. Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software

    NASA Astrophysics Data System (ADS)

    Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.

    2017-09-01

    This paper demonstrates a case study on simulation, modelling and analysis for the X1 motorcycle model. In this research, a motorcycle assembly plant was selected as the main place of research study. Simulation techniques using the Witness software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and find out the significant impact on the overall performance of the system for future improvement. The validation process began with identification of the assembly-line layout. All components were evaluated to validate whether the data are significant for future improvement. Machine and labor statistics are among the parameters that were evaluated for process improvement. The average total cycle time for given workstations is used as the criterion for comparison of possible variants. From the simulation process, the data used are appropriate and meet the criteria for two-sided assembly line problems.

  13. Earth System Modeling and Field Experiments in the Arctic-Boreal Zone - Report from a NASA Workshop

    NASA Technical Reports Server (NTRS)

    Sellers, Piers; Rienecker Michele; Randall, David; Frolking, Steve

    2012-01-01

    Early climate modeling studies predicted that the Arctic Ocean and surrounding circumpolar land masses would heat up earlier and faster than other parts of the planet as a result of greenhouse gas-induced climate change, augmented by the sea-ice albedo feedback effect. These predictions have been largely borne out by observations over the last thirty years. However, despite constant improvement, global climate models have greater difficulty in reproducing the current climate in the Arctic than elsewhere, and the scatter between projections from different climate models is much larger in the Arctic than for other regions. Biogeochemical cycle (BGC) models indicate that the warming in the Arctic-Boreal Zone (ABZ) could lead to widespread thawing of the permafrost, along with massive releases of CO2 and CH4, and large-scale changes in the vegetation cover in the ABZ. However, the uncertainties associated with these BGC model predictions are even larger than those associated with the physical climate system models used to describe climate change. These deficiencies in climate and BGC models reflect, at least in part, an incomplete understanding of the Arctic climate system and can be related to inadequate observational data or analyses of existing data. A workshop was held at NASA/GSFC, May 22-24, 2012, to assess the predictive capability of the models, prioritize the critical science questions, and make recommendations regarding new field experiments needed to improve model subcomponents. This presentation will summarize the findings and recommendations of the workshop, including the need for aircraft and flux tower measurements and extension of existing in-situ measurements to improve process modeling of both the physical climate and biogeochemical cycle systems. Studies should be directly linked to remote sensing investigations with a view to scaling up the improved process models to the Earth System Model scale.
Data assimilation and observing system simulation studies should be used to guide the deployment pattern and schedule for inversion studies as well. Synthesis and integration of previously funded Arctic-Boreal projects (e.g., ABLE, BOREAS, ICESCAPE, ICEBRIDGE, ARCTAS) should also be undertaken. Such an effort would include the integration of multiple remotely sensed products from the EOS satellites and other resources.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santos-Villalobos, Hector J; Barstow, Del R; Karakaya, Mahmut

    Iris recognition has been proven to be an accurate and reliable biometric. However, the recognition of non-ideal iris images such as off-angle images is still an unsolved problem. We propose a new biometric-targeted eye model and a method to reconstruct the off-axis eye to its frontal view, allowing for recognition using existing methods and algorithms. This allows existing enterprise-level algorithms and approaches to remain largely unmodified by using our work as a pre-processor to improve performance. In addition, we describe the `Limbus effect' and its importance for an accurate segmentation of off-axis irides. Our method uses an anatomically accurate human eye model and ray-tracing techniques to compute a transformation function, which reconstructs the iris to its frontal, non-refracted state. Then, the same eye model is used to render a frontal view of the reconstructed iris. The proposed method is fully described, and results from synthetic data are shown to establish an upper limit on performance improvement and to establish the importance of the proposed approach over traditional linear elliptical unwrapping methods. Our results with synthetic data demonstrate the ability to perform accurate iris recognition with an image taken as much as 70 degrees off-axis.

  15. Fused methods for visual saliency estimation

    NASA Astrophysics Data System (ADS)

    Danko, Amanda S.; Lyu, Siwei

    2015-02-01

    In this work, we present a new model of visual saliency by combining results from existing methods, improving upon their performance and accuracy. By fusing pre-attentive and context-aware methods, we highlight the abilities of state-of-the-art models while compensating for their deficiencies. We put this theory to the test in a series of experiments, comparatively evaluating the visual saliency maps and employing them for content-based image retrieval and thumbnail generation. We find that on average our model yields definitive improvements in recall and f-measure metrics with comparable precision. In addition, we find that all image searches using our fused method return more correct images and rank them higher than the searches using the original methods alone.

  16. Community evolution mining and analysis in social network

    NASA Astrophysics Data System (ADS)

    Liu, Hongtao; Tian, Yuan; Liu, Xueyan; Jian, Jie

    2017-03-01

    With the development of digital and network technology, a variety of social platforms have emerged. These platforms have greatly facilitated access to information and attract more and more users, who rely on them every day to work, study and communicate; as a result, social platforms are constantly generating massive amounts of data. These data can often be modeled as complex networks, making large-scale social network analysis possible. In this paper, the existing community evolution classification model is improved based on the evolution relationships of communities over time in dynamic social networks, and an Evolution-Tree structure is proposed that shows the whole life cycle of a community more clearly. Comparative test results show that the improved model can mine the evolution relationships of communities well.

  17. An improved model of homogeneous nucleation for high supersaturation conditions: aluminum vapor.

    PubMed

    Savel'ev, A M; Starik, A M

    2016-12-21

    A novel model of stationary nucleation that treats the thermodynamic functions of small clusters has been built. The model is validated against experimental data on the nucleation rate of water vapor obtained over a broad range of supersaturation values (S = 10-120), and, at high supersaturation values, it reproduces the experimental data much better than the traditional classical nucleation model. A comprehensive analysis of the nucleation of aluminum vapor using the developed stationary and non-stationary nucleation models has been performed. It has been shown that, at some values of supersaturation, there exists a double potential nucleation barrier. It has been revealed that the existence of this barrier notably delays the establishment of a stationary distribution of subcritical clusters. It has also been demonstrated that the non-stationary model of the present work and the liquid-droplet approximation model predict different values of the nucleation delay time, τs. In particular, the liquid-droplet model can notably underestimate (by more than an order of magnitude) the value of τs.

  18. Multiple Roles: The Conflicted Realities of Community College Mission Statements

    ERIC Educational Resources Information Center

    Mrozinski, Mark D.

    2010-01-01

    Questions of efficacy have always plagued the use of the mission statement as a strategic planning tool. In most planning models, the mission statement serves to clarify goals and guide the formation of strategies. However, little empirical evidence exists validating that mission statements actually improve the performance of organizations, even…

  19. Defining Learning Disability: Does IQ Have Anything Significant to Say?

    ERIC Educational Resources Information Center

    Dunn, Michael W.

    2010-01-01

    A debate exists in the research community about replacing the traditional IQ/achievement discrepancy method for learning disability identification with a "response-to-intervention model". This new assessment paradigm uses a student's level of improvement with small-group or individual programming to determine a possible need for…

  20. The Asthma Dialogues: A Model of Interactive Education for Skills

    ERIC Educational Resources Information Center

    Morrow, Robert; Fletcher, Jason; Mulvihill, Michael; Park, Heidi

    2007-01-01

    Introduction: A gap exists between asthma guidelines and actual care delivered. We developed an educational intervention using simulated physician-patient encounters as part of a project to improve asthma management by community-based primary care providers. We hypothesized that this type of skills-based interactive training would improve…

  1. Promoting Teacher Growth through Lesson Study: A Culturally Embedded Approach

    ERIC Educational Resources Information Center

    Ebaeguin, Marlon

    2015-01-01

    Lesson Study has captured the attention of many international educators with its promise of improved student learning and sustained teacher growth. Lesson Study, however, has cultural underpinnings that a simple transference model overlooks. A culturally embedded approach attends to the existing cultural orientations and values of host schools.…

  2. Systematically Evaluating the Effectiveness of Quality Assurance Programmes in Leading to Improvements in Institutional Performance

    ERIC Educational Resources Information Center

    Lillis, Deirdre

    2012-01-01

    Higher education institutions worldwide invest significant resources in their quality assurance systems. Little empirical evidence exists that demonstrates the effectiveness (or otherwise) of these systems. Methodological approaches for determining effectiveness are also underdeveloped. Self-study-with-peer-review is a widely used model for…

  3. Improved parameterization for the vertical flux of dust aerosols emitted by an eroding soil

    USDA-ARS?s Scientific Manuscript database

    The representation of the dust cycle in atmospheric circulation models hinges on an accurate parameterization of the vertical dust flux at emission. However, existing parameterizations of the vertical dust flux vary substantially in their scaling with wind friction velocity, require input parameters...

  4. Improvement of High-Resolution Tropical Cyclone Structure and Intensity Forecasts using COAMPS-TC

    DTIC Science & Technology

    2013-09-30

    scientific community including the recent T-PARC/TCS08, ITOP, and HS3 field campaigns to build upon the existing modeling capabilities. We will...heating and cooling rates in developing and non-developing tropical disturbances during TCS-08: radar-equivalent retrievals from mesoscale numerical

  5. BayMeth: improved DNA methylation quantification for affinity capture sequencing data using a flexible Bayesian approach

    PubMed Central

    2014-01-01

    Affinity capture of DNA methylation combined with high-throughput sequencing strikes a good balance between the high cost of whole genome bisulfite sequencing and the low coverage of methylation arrays. We present BayMeth, an empirical Bayes approach that uses a fully methylated control sample to transform observed read counts into regional methylation levels. In our model, inefficient capture can readily be distinguished from low methylation levels. BayMeth improves on existing methods, allows explicit modeling of copy number variation, and offers computationally efficient analytical mean and variance estimators. BayMeth is available in the Repitools Bioconductor package. PMID:24517713
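BayMeth's actual empirical Bayes model is richer than the abstract conveys; as a rough illustration of the core idea only — turning read counts into methylation levels via a capture-efficiency offset taken from a fully methylated control — here is a toy Gamma-Poisson conjugate sketch. The function name and all parameter values are hypothetical, not BayMeth's API.

```python
import numpy as np

def posterior_methylation(y, offset, a=1.0, b=1.0):
    """Posterior mean methylation under a toy Gamma-Poisson model.

    y      : observed read counts per region
    offset : per-region capture efficiency, estimated from the fully
             methylated control sample
    a, b   : shape/rate of the Gamma prior on the methylation level
    """
    y = np.asarray(y, dtype=float)
    offset = np.asarray(offset, dtype=float)
    # Conjugacy: y ~ Poisson(offset * mu), mu ~ Gamma(a, b)
    # gives mu | y ~ Gamma(a + y, b + offset), with mean:
    return (a + y) / (b + offset)

counts = np.array([0, 5, 40])            # few reads despite good capture
control = np.array([10.0, 10.0, 10.0])   # => genuinely low methylation
print(posterior_methylation(counts, control))
```

In this toy version, few reads combined with a high control offset is read as genuinely low methylation, whereas few reads with a small offset stays ambiguous — the capture-efficiency distinction the abstract highlights.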

  6. Incompressible viscous flow simulations of the NFAC wind tunnel

    NASA Technical Reports Server (NTRS)

    Champney, Joelle Milene

    1986-01-01

    The capabilities of an existing 3-D incompressible Navier-Stokes flow solver, INS3D, are extended and improved to solve turbulent flows through the incorporation of zero- and two-equation turbulence models. The two-equation model equations are solved in their high Reynolds number form and utilize wall functions in the treatment of solid wall boundary conditions. The implicit approximate factorization scheme is modified to improve the stability of the two-equation solver. Applications to the 3-D viscous flow inside the 80- by 120-foot open-return wind tunnel of the National Full Scale Aerodynamics Complex (NFAC) are discussed and described.
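As a generic illustration of the wall-function treatment mentioned above (a textbook log-law sketch, not the INS3D implementation), the friction velocity at a near-wall point can be recovered from the log-law by fixed-point iteration; the constants κ = 0.41 and B = 5.0 are the conventional smooth-wall values.

```python
import math

def friction_velocity(U, y, nu, kappa=0.41, B=5.0, iters=50):
    """Solve the log-law U/u_tau = (1/kappa) * ln(y * u_tau / nu) + B
    for the friction velocity u_tau by fixed-point iteration.

    U  : mean velocity at wall distance y (m/s)
    y  : distance of the first grid point from the wall (m)
    nu : kinematic viscosity (m^2/s)
    """
    u_tau = max(1e-6, 0.05 * U)  # crude initial guess
    for _ in range(iters):
        u_tau = U / ((1.0 / kappa) * math.log(y * u_tau / nu) + B)
    return u_tau

# Air-like example: U = 10 m/s measured 1 cm from the wall
u_tau = friction_velocity(10.0, 0.01, 1.5e-5)
print(u_tau)
```

The iteration map is a mild contraction here, so a few dozen iterations drive the log-law residual to machine precision; a production solver would typically use Newton's method and blend with the viscous sublayer law at low y+.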

  7. OpenIPSL: Open-Instance Power System Library - Update 1.5 to "iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations"

    NASA Astrophysics Data System (ADS)

    Baudette, Maxime; Castro, Marcelo; Rabuzin, Tin; Lavenius, Jan; Bogodorova, Tetiana; Vanfretti, Luigi

    2018-01-01

    This paper presents the latest improvements implemented in the Open-Instance Power System Library (OpenIPSL). The OpenIPSL is a fork from the original iTesla Power Systems Library (iPSL) by some of the original developers of the iPSL. This fork's motivation comes from the will of the authors to further develop the library with additional features tailored to research and teaching purposes. The enhancements include improvements to existing models, the addition of a new package of three phase models, and the implementation of automated tests through continuous integration.

  8. Seluge++: A Secure Over-the-Air Programming Scheme in Wireless Sensor Networks

    PubMed Central

    Doroodgar, Farzan; Razzaque, Mohammad Abdur; Isnin, Ismail Fauzi

    2014-01-01

    Over-the-air dissemination of code updates in wireless sensor networks has been a point of interest for researchers in the last few years, and, more importantly, security challenges in the remote propagation of code updates have occupied the majority of efforts in this context. Many security models have been proposed to establish a balance between energy consumption and security strength, concentrating on the constrained nature of wireless sensor network (WSN) nodes. For authentication purposes, most of them have used a Merkle hash tree to avoid using multiple public-key cryptography operations. These models have mostly assumed an environment in which security has to be at a standard level. Therefore, they have not investigated the tree structure for mission-critical situations in which security has to be at the maximum possible level (e.g., military applications, healthcare). Considering this, we investigate existing security models used in over-the-air dissemination of code updates for possible vulnerabilities, and then we provide a set of countermeasures, correspondingly named Security Model Requirements. Based on the investigation, we concentrate on Seluge, one of the existing over-the-air programming schemes, and we propose an improved version of it, named Seluge++, which complies with the Security Model Requirements and replaces the use of the inefficient Merkle tree with a novel method. Analytical and simulation results show the improvements in Seluge++ compared to Seluge. PMID:24618781
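The Merkle hash tree these schemes use for packet authentication can be illustrated generically: a receiver holding only the authenticated root can verify any packet against a logarithmic-size proof of sibling hashes. This is the textbook construction, not Seluge's exact packet layout.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Compute the Merkle root over a list of packet payloads (bytes)."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:               # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes needed to authenticate leaves[index] against the root."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        proof.append((level[sib], sib < index))  # (sibling hash, sibling-is-left)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root

packets = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
root = merkle_root(packets)
print(verify(b"pkt2", merkle_proof(packets, 2), root))  # True
```

Verification costs log2(N) hashes and no public-key operations, which is why the construction appears in so many WSN code-dissemination schemes; the abstract's point is that tree shape matters once the threat model is raised to mission-critical levels.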

  9. Wind power application research on the fusion of the determination and ensemble prediction

    NASA Astrophysics Data System (ADS)

    Lan, Shi; Lina, Xu; Yuzhu, Hao

    2017-07-01

    The fused wind speed product for the wind farm is designed using wind speed products of ensemble prediction from the European Centre for Medium-Range Weather Forecasts (ECMWF) and professional numerical model products on wind power based on Mesoscale Model 5 (MM5) and the Beijing Rapid Update Cycle (BJ-RUC), which are suitable for short-term wind power forecasting and electric dispatch. The single-valued forecast is formed by calculating different ensemble statistics of the Bayesian probabilistic forecast representing the uncertainty of the ECMWF ensemble prediction. An autoregressive integrated moving average (ARIMA) model is used to improve the time resolution of the single-valued forecast, and, based on Bayesian model averaging (BMA) and the deterministic numerical model prediction, the optimal wind speed forecasting curve and its confidence interval are provided. The results show that the fusion forecast clearly improves accuracy relative to the existing numerical forecasting products. Compared with the 0-24 h existing deterministic forecast in the validation period, the mean absolute error (MAE) is decreased by 24.3% and the correlation coefficient (R) is increased by 12.5%. In comparison with the ECMWF ensemble forecast, the MAE is reduced by 11.7% and R is increased by 14.5%. Additionally, the MAE did not increase with the prolongation of the forecast lead time.
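The BMA step can be sketched with a toy EM fit of member weights and a common Gaussian spread, following the standard Raftery-style formulation; the paper's exact implementation is not given in the abstract, and the data below are synthetic.

```python
import numpy as np

def bma_fit(forecasts, obs, iters=200):
    """Fit BMA weights and a common Gaussian variance by EM.

    forecasts : (T, K) array, K ensemble members over T training times
    obs       : (T,) verifying observations
    """
    T, K = forecasts.shape
    w = np.full(K, 1.0 / K)
    var = max(np.var(obs - forecasts.mean(axis=1)), 1e-6)
    for _ in range(iters):
        # E-step: responsibility of member k for observation t
        dens = np.exp(-0.5 * (obs[:, None] - forecasts) ** 2 / var)
        dens /= np.sqrt(2 * np.pi * var)
        z = w * dens
        z /= z.sum(axis=1, keepdims=True)
        # M-step: update weights and shared spread
        w = z.mean(axis=0)
        var = np.sum(z * (obs[:, None] - forecasts) ** 2) / T
    return w, var

rng = np.random.default_rng(0)
truth = rng.normal(8.0, 2.0, 500)                      # "observed" wind speed
members = np.stack([truth + rng.normal(0, 0.5, 500),   # skilful member
                    truth + rng.normal(3, 2.0, 500)],  # biased, noisy member
                   axis=1)
w, var = bma_fit(members, truth)
mean_forecast = members @ w                            # single-valued forecast
print(w)
```

The BMA mixture also yields a predictive density, from which the confidence interval mentioned in the abstract can be read off as mixture quantiles; the skilful member receives most of the weight.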

  10. Seluge++: a secure over-the-air programming scheme in wireless sensor networks.

    PubMed

    Doroodgar, Farzan; Abdur Razzaque, Mohammad; Isnin, Ismail Fauzi

    2014-03-11

    Over-the-air dissemination of code updates in wireless sensor networks has been a point of interest for researchers in the last few years, and, more importantly, security challenges in the remote propagation of code updates have occupied the majority of efforts in this context. Many security models have been proposed to establish a balance between energy consumption and security strength, concentrating on the constrained nature of wireless sensor network (WSN) nodes. For authentication purposes, most of them have used a Merkle hash tree to avoid using multiple public-key cryptography operations. These models have mostly assumed an environment in which security has to be at a standard level. Therefore, they have not investigated the tree structure for mission-critical situations in which security has to be at the maximum possible level (e.g., military applications, healthcare). Considering this, we investigate existing security models used in over-the-air dissemination of code updates for possible vulnerabilities, and then we provide a set of countermeasures, correspondingly named Security Model Requirements. Based on the investigation, we concentrate on Seluge, one of the existing over-the-air programming schemes, and we propose an improved version of it, named Seluge++, which complies with the Security Model Requirements and replaces the use of the inefficient Merkle tree with a novel method. Analytical and simulation results show the improvements in Seluge++ compared to Seluge.

  11. Improved engineering models for turbulent wall flows

    NASA Astrophysics Data System (ADS)

    She, Zhen-Su; Chen, Xi; Zou, Hong-Yue; Hussain, Fazle

    2015-11-01

    We propose a new approach, called structural ensemble dynamics (SED), involving new concepts to describe the mean quantities in wall-bounded flows, and apply it to improving existing engineering turbulence models, together with its physical interpretation. First, a revised k - ω model for pipe flows is obtained, which accurately predicts, for the first time, both the mean velocity and the (streamwise) kinetic energy for a wide range of Reynolds numbers (Re), validated against Princeton experimental data. In particular, a multiplicative factor is introduced in the dissipation term to model an anomaly in the energy cascade in a meso-layer, predicting the outer peak in agreement with the data. Secondly, a new one-equation model is obtained for compressible turbulent boundary layers (CTBL), building on a multi-layer formula for the stress length function and a generalized temperature-velocity relation. The former refines the multi-layer description - viscous sublayer, buffer layer, logarithmic layer and a newly defined bulk zone - while the latter characterizes a parabolic relation between the mean velocity and temperature. DNS data show our predictions to have 99% accuracy for several Mach numbers, Ma = 2.25 and 4.5, improving by up to 10% on a similar previous one-equation model (Baldwin & Lomax, 1978). Our results promise notable improvements in engineering models.

  12. Expanded modeling of temperature-dependent dielectric properties for microwave thermal ablation

    PubMed Central

    Ji, Zhen; Brace, Christopher L

    2011-01-01

    Microwaves are a promising source for thermal tumor ablation due to their ability to rapidly heat dispersive biological tissues, often to temperatures in excess of 100 °C. At these high temperatures, tissue dielectric properties change rapidly and, thus, so do the characteristics of energy delivery. Precise knowledge of how tissue dielectric properties change during microwave heating promises to facilitate more accurate simulation of device performance and helps optimize device geometry and energy delivery parameters. In this study, we measured the dielectric properties of liver tissue during high-temperature microwave heating. The resulting data were compiled into either a sigmoidal function of temperature or an integration of the time–temperature curve for both relative permittivity and effective conductivity. Coupled electromagnetic–thermal simulations of heating produced by a single monopole antenna using the new models were then compared to simulations with existing linear and static models, and experimental temperatures in liver tissue. The new sigmoidal temperature-dependent model more accurately predicted experimental temperatures when compared to temperature–time integrated or existing models. The mean percent differences between simulated and experimental temperatures over all times were 4.2% for sigmoidal, 10.1% for temperature–time integration, 27.0% for linear and 32.8% for static models at the antenna input power of 50 W. Correcting for tissue contraction improved agreement for powers up to 75 W. The sigmoidal model also predicted substantial changes in heating pattern due to dehydration. We can conclude from these studies that a sigmoidal model of tissue dielectric properties improves prediction of experimental results. More work is needed to refine and generalize this model. PMID:21791728
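A sigmoidal temperature dependence of the kind described can be sketched as follows; the transition near 100 °C reflects water vaporization, and every numerical value here is a hypothetical placeholder, not the paper's fitted parameters.

```python
import numpy as np

def sigmoid_property(T, p_low, p_high, T50=100.0, width=4.0):
    """Sigmoidal temperature dependence of a dielectric property.

    Decays smoothly from its low-temperature value p_low to its
    high-temperature (dehydrated) value p_high around T50 (deg C),
    with `width` controlling the sharpness of the transition.
    """
    return p_high + (p_low - p_high) / (1.0 + np.exp((T - T50) / width))

T = np.linspace(20, 130, 5)
# Placeholder magnitudes for liver-like tissue at microwave frequencies:
eps_r = sigmoid_property(T, p_low=48.0, p_high=5.0)    # relative permittivity
sigma = sigmoid_property(T, p_low=2.0, p_high=0.1)     # conductivity (S/m)
print(eps_r)
```

In a coupled electromagnetic-thermal simulation, each mesh cell's local temperature would feed this function so that dehydration progressively detunes the antenna's near field, which is the behavior the sigmoidal model captured better than linear or static properties.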

  13. Developing and integrating a practice model for health finance reform into wound healing programs: an examination of the triple aim approach.

    PubMed

    Flattau, Anna; Thompson, Maureen; Meara, Anne

    2013-10-01

    Throughout the United States, government and private payers are exploring new payment models such as accountable care organizations and shared savings agreements. These models are widely based on the construct of the Triple Aim, a set of three principles for health services reform: improving population-based outcomes, improving patient care experiences, and reducing costs through better delivery systems. Wound programs may adapt to the new health financing environment by incorporating initiatives known to promote the Triple Aim, such as diabetes amputation reduction and pressure ulcer prevention programs, and by rethinking how health services can best be delivered to meet these new criteria. The existing literature supports that programmatic approaches can improve care, quality, and cost, especially in the field of diabetic foot ulcers. Wound healing programs have opportunities to develop new business plan models that provide quality, cost-efficient care to their patient population and to be leaders in the development of new types of partnerships with payers and health delivery organizations.

  14. Simultaneous optimization of biomolecular energy function on features from small molecules and macromolecules

    PubMed Central

    Park, Hahnbeom; Bradley, Philip; Greisen, Per; Liu, Yuan; Mulligan, Vikram Khipple; Kim, David E.; Baker, David; DiMaio, Frank

    2017-01-01

    Most biomolecular modeling energy functions for structure prediction, sequence design, and molecular docking have been parameterized using existing macromolecular structural data; this contrasts with molecular mechanics force fields, which are largely optimized using small-molecule data. In this study, we describe an integrated method that enables optimization of a biomolecular modeling energy function simultaneously against small-molecule thermodynamic data and high-resolution macromolecular structural data. We use this approach to develop a next-generation Rosetta energy function that utilizes a new anisotropic implicit solvation model and an improved electrostatics and Lennard-Jones model, illustrating how energy functions can be considerably improved in their ability to describe large-scale energy landscapes by incorporating both small-molecule and macromolecule data. The energy function improves performance in a wide range of protein structure prediction challenges, including monomeric structure prediction, protein-protein and protein-ligand docking, protein sequence design, and prediction of the free energy changes by mutation, while reasonably recapitulating small-molecule thermodynamic properties. PMID:27766851

  15. A comparison of two coaching approaches to enhance implementation of a recovery-oriented service model.

    PubMed

    Deane, Frank P; Andresen, Retta; Crowe, Trevor P; Oades, Lindsay G; Ciarrochi, Joseph; Williams, Virginia

    2014-09-01

    Moving to recovery-oriented service provision in mental health may entail retraining existing staff, as well as training new staff. This represents a substantial burden on organisations, particularly since transfer of training into practice is often poor. Follow-up supervision and/or coaching have been found to improve the implementation and sustainment of new approaches. We compared the effect of two coaching conditions, skills-based and transformational coaching, on the implementation of a recovery-oriented model following training. Training followed by coaching led to significant sustained improvements in the quality of care planning in accordance with the new model over the 12-month study period. No interaction effect was observed between the two conditions. However, post hoc analyses suggest that transformational coaching warrants further exploration. The results support the provision of supervision in the form of coaching in the implementation of a recovery-oriented service model, and suggest the need to better elucidate the mechanisms within different coaching approaches that might contribute to improved care.

  16. Parallel equilibrium current effect on existence of reversed shear Alfvén eigenmodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Hua-sheng, E-mail: huashengxie@gmail.com; Xiao, Yong, E-mail: yxiao@zju.edu.cn

    2015-02-15

    A new fast global eigenvalue code, where the terms are segregated according to their physics contents, is developed to study Alfvén modes in tokamak plasmas, particularly the reversed shear Alfvén eigenmode (RSAE). Numerical calculations show that the parallel equilibrium current corresponding to the kink term is strongly unfavorable for the existence of the RSAE. An improved criterion for RSAE existence is given with and without the parallel equilibrium current. In the limits of ideal magnetohydrodynamics (MHD) and zero pressure, the toroidicity effect is the main possible favorable factor for the existence of the RSAE, which is however usually small. This suggests that it is necessary to include additional physics, such as a kinetic term, in the MHD model to overcome the strong unfavorable effect of the parallel current in order to enable the existence of the RSAE.

  17. Brian hears: online auditory processing using vectorization over channels.

    PubMed

    Fontaine, Bertrand; Goodman, Dan F M; Benichoux, Victor; Brette, Romain

    2011-01-01

    The human cochlea includes about 3000 inner hair cells which filter sounds at frequencies between 20 Hz and 20 kHz. This massively parallel frequency analysis is reflected in models of auditory processing, which are often based on banks of filters. However, existing implementations do not exploit this parallelism. Here we propose algorithms to simulate these models by vectorizing computation over frequency channels, which are implemented in "Brian Hears," a library for the spiking neural network simulator package "Brian." This approach allows us to use high-level programming languages such as Python, because with vectorized operations, the computational cost of interpretation represents a small fraction of the total cost. This makes it possible to define and simulate complex models in a simple way, while all previous implementations were model-specific. In addition, we show that these algorithms can be naturally parallelized using graphics processing units, yielding substantial speed improvements. We demonstrate these algorithms with several state-of-the-art cochlear models, and show that they compare favorably with existing, less flexible, implementations.
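The channel-vectorization idea — loop over time samples, vectorize across frequency channels — can be illustrated with a simple bank of second-order band-pass resonators. This is a generic sketch of the strategy, not Brian Hears' gammatone implementation; the filter design is the standard RBJ band-pass biquad.

```python
import numpy as np

def bandpass_bank(x, fs, cfs, q=5.0):
    """Filter one sound through a bank of band-pass biquads.

    The state update is computed for ALL channels at once with NumPy
    (vectorized over channels), so Python-level interpretation cost is
    paid once per sample, not once per sample per channel.
    """
    w0 = 2 * np.pi * cfs / fs
    alpha = np.sin(w0) / (2 * q)
    b0, a0 = alpha, 1 + alpha            # RBJ band-pass (0 dB peak gain)
    a1, a2 = -2 * np.cos(w0), 1 - alpha
    y = np.zeros((len(x), len(cfs)))
    x1 = x2 = 0.0
    y1 = y2 = np.zeros(len(cfs))
    for n, xn in enumerate(x):
        yn = (b0 * (xn - x2) - a1 * y1 - a2 * y2) / a0   # vector over channels
        y[n] = yn
        x2, x1 = x1, xn
        y2, y1 = y1, yn
    return y

fs = 16000
t = np.arange(0, 0.05, 1 / fs)
tone = np.sin(2 * np.pi * 1000 * t)            # 1 kHz probe tone
cfs = np.array([250.0, 1000.0, 4000.0])        # channel centre frequencies
out = bandpass_bank(tone, fs, cfs)
rms = np.sqrt((out ** 2).mean(axis=0))         # strongest in the 1 kHz channel
```

With thousands of cochlear channels, the per-sample vector update is where nearly all the arithmetic lives, which is why interpretation overhead becomes negligible and why the same structure maps naturally onto GPUs.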

  18. Geometric model of pseudo-distance measurement in satellite location systems

    NASA Astrophysics Data System (ADS)

    Panchuk, K. L.; Lyashkov, A. A.; Lyubchinov, E. V.

    2018-04-01

    The existing mathematical model of pseudo-distance measurement in satellite location systems does not provide a precise solution of the problem, but rather an approximate one. This inaccuracy, together with bias in the measured distance from satellite to receiver, results in errors of several meters. Hence, refinement of the current mathematical model is clearly relevant. The solution of the system of quadratic equations used in the current mathematical model is based on linearization. The objective of the paper is refinement of the current mathematical model and derivation of an analytical solution of the system of equations on its basis. To attain this objective, a geometric analysis is performed and a geometric interpretation of the equations is given. As a result, an equivalent system of equations, which admits an analytical solution, is derived. An example of the analytical solution's implementation is presented. Applying the analytical solution algorithm to the problem of pseudo-distance measurement in satellite location systems makes it possible to improve the accuracy of such measurements.
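The linearized treatment the authors set out to refine solves the pseudo-distance equations rho_i = ||s_i - r|| + b (receiver position r, clock-bias range b) by iterative least squares. A minimal sketch on synthetic data follows; the satellite geometry and bias value are made up for the example.

```python
import numpy as np

def solve_position(sats, pseudoranges, iters=10):
    """Estimate receiver position and clock-bias range b (metres) from
    pseudoranges rho_i = ||s_i - r|| + b by iterative linearized
    least squares (Gauss-Newton), starting from the Earth's centre."""
    x = np.zeros(4)                                  # [rx, ry, rz, b]
    for _ in range(iters):
        diff = sats - x[:3]
        dist = np.linalg.norm(diff, axis=1)
        residual = pseudoranges - (dist + x[3])
        # Jacobian of the model: unit line-of-sight vectors and the bias column
        J = np.hstack([-diff / dist[:, None], np.ones((len(sats), 1))])
        x += np.linalg.lstsq(J, residual, rcond=None)[0]
    return x[:3], x[3]

# Synthetic check with a made-up satellite constellation and 150 m bias
sats = np.array([[ 2.0e7,  1.0e7, 2.0e7],
                 [-1.5e7,  2.2e7, 1.8e7],
                 [ 1.0e7, -2.0e7, 2.3e7],
                 [-2.0e7, -1.0e7, 2.1e7],
                 [ 0.5e7,  0.3e7, 2.6e7]])
truth = np.array([1.2e6, -2.3e6, 5.1e6])
rho = np.linalg.norm(sats - truth, axis=1) + 150.0
pos, b = solve_position(sats, rho)
```

Because satellite ranges are far larger than the receiver's offset from the initial guess, the linearization error shrinks quadratically and a handful of iterations reach numerical precision; the paper's contribution is an exact geometric solution that avoids this linearization step altogether.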

  19. Reviews and syntheses: Four decades of modeling methane cycling in terrestrial ecosystems

    NASA Astrophysics Data System (ADS)

    Xu, Xiaofeng; Yuan, Fengming; Hanson, Paul J.; Wullschleger, Stan D.; Thornton, Peter E.; Riley, William J.; Song, Xia; Graham, David E.; Song, Changchun; Tian, Hanqin

    2016-06-01

    Over the past 4 decades, a number of numerical models have been developed to quantify the magnitude, investigate the spatial and temporal variations, and understand the underlying mechanisms and environmental controls of methane (CH4) fluxes within terrestrial ecosystems. These CH4 models are also used for integrating multi-scale CH4 data, such as laboratory-based incubation and molecular analysis, field observational experiments, remote sensing, and aircraft-based measurements across a variety of terrestrial ecosystems. Here we summarize 40 terrestrial CH4 models to characterize their strengths and weaknesses and to suggest a roadmap for future model improvement and application. Our key findings are that (1) the focus of CH4 models has shifted from theoretical to site- and regional-level applications over the past 4 decades, (2) large discrepancies exist among models in terms of representing CH4 processes and their environmental controls, and (3) significant data-model and model-model mismatches are partially attributed to different representations of landscape characterization and inundation dynamics. Three areas for future improvements and applications of terrestrial CH4 models are that (1) CH4 models should more explicitly represent the mechanisms underlying land-atmosphere CH4 exchange, with an emphasis on improving and validating individual CH4 processes over depth and horizontal space, (2) models should be developed that are capable of simulating CH4 emissions across highly heterogeneous spatial and temporal scales, particularly hot moments and hotspots, and (3) efforts should be invested to develop model benchmarking frameworks that can easily be used for model improvement, evaluation, and integration with data from molecular to global scales. These improvements in CH4 models would be beneficial for the Earth system models and further simulation of climate-carbon cycle feedbacks.

  20. An open-source model and solution method to predict co-contraction in the finger.

    PubMed

    MacIntosh, Alexander R; Keir, Peter J

    2017-10-01

    A novel open-source biomechanical model of the index finger, together with an electromyography (EMG)-constrained static optimization solution method, is developed with the goal of improving co-contraction estimates and providing a means to assess tendon tension distribution through the finger. The Intrinsic model has four degrees of freedom and seven muscles (with a 14-component extensor mechanism). A novel plugin developed for the OpenSim modelling software applies the EMG-constrained static optimization solution method. Ten participants performed static pressing in three finger postures and five dynamic free-motion tasks. Index finger 3D kinematics, force (5, 15, 30 N), and EMG (4 extrinsic muscles and the first dorsal interosseous) were used in the analysis. The Intrinsic model predicted that co-contraction increased by 29% during static pressing over the existing model. Further, tendon tension distribution patterns and forces, known to be essential to produce finger action, were determined by the model across all postures. The Intrinsic model and custom solution method improved co-contraction estimates to facilitate force propagation through the finger. These tools improve our interpretation of loads in the finger to develop better rehabilitation and workplace injury risk reduction strategies.
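An EMG-constrained static optimization of this kind can be sketched with a toy one-joint, three-muscle system: the usual minimum-activation criterion is solved, but any muscle with recorded EMG is pinned to its measured activation. The moment arms, maximal forces, and EMG value below are hypothetical, and the real model has four degrees of freedom and seven muscles.

```python
import numpy as np
from scipy.optimize import minimize

# One joint, three muscles: muscle 0 is an antagonist (negative moment arm)
# whose activation is measured by EMG; muscles 1-2 are unmeasured agonists.
R = np.array([[-0.005, 0.01, 0.008]])     # moment arms (m), hypothetical
Fmax = np.array([120.0, 100.0, 80.0])     # maximal isometric forces (N)
target_moment = np.array([0.9])           # net joint moment to produce (N*m)
emg = np.array([0.3, np.nan, np.nan])     # recorded activation, muscle 0 only

def objective(a):
    return np.sum(a ** 2)                 # classic min-activation criterion

cons = [{"type": "eq",
         "fun": lambda a: R @ (a * Fmax) - target_moment}]
# EMG constraint: pin each measured muscle to its recorded activation
bounds = [(e, e) if np.isfinite(e) else (0.0, 1.0) for e in emg]
x0 = np.where(np.isfinite(emg), emg, 0.1)

res = minimize(objective, x0, bounds=bounds, constraints=cons)
# Plain static optimization would silence the antagonist; the EMG pin
# forces its co-contraction, so the agonists must produce extra moment.
print(res.x)
```

Unconstrained static optimization notoriously zeroes out antagonists; pinning measured muscles to EMG is one common way to recover the co-contraction the abstract reports, at the cost of requiring a channel per constrained muscle.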

  1. Vision-based building energy diagnostics and retrofit analysis using 3D thermography and building information modeling

    NASA Astrophysics Data System (ADS)

    Ham, Youngjib

    The emerging energy crisis in the building sector and the legislative measures on improving energy efficiency are steering the construction industry towards adopting new energy-efficient design concepts and construction methods that decrease the overall energy loads. However, the problems of energy efficiency are not limited to the design and construction of new buildings. Today, a significant amount of input energy in existing buildings is still being wasted during the operational phase. One primary source of the energy waste is attributed to unnecessary heat flows through building envelopes during hot and cold seasons. This inefficiency increases the operational frequency of heating and cooling systems to keep the desired thermal comfort of building occupants, and ultimately results in excessive energy use. Improving the thermal performance of building envelopes can reduce the energy consumption required for space conditioning and in turn provide building occupants with optimal thermal comfort at a lower energy cost. In this sense, energy diagnostics and retrofit analysis for existing building envelopes are key enablers for improving energy efficiency. Since proper retrofit decisions for existing buildings directly translate into future energy cost savings, building practitioners are increasingly interested in methods for reliable identification of potential performance problems so that they can take timely corrective actions. However, sensing what and where energy problems are emerging or are likely to emerge, and then analyzing how the problems influence the energy consumption, are not trivial tasks. The overarching goal of this dissertation is to understand the gaps in knowledge in methods for building energy diagnostics and retrofit analysis, and to fill these gaps by devising a new method for multi-modal visual sensing and analytics using thermography and Building Information Modeling (BIM).
First, to address the scaling and localization limitations of 2D thermal-image-based inspection, a new computer vision-based method is presented for automated 3D spatio-thermal modeling of building environments from images; the thermal images are localized within the 3D reconstructed scenes, which helps better characterize the as-is condition of existing buildings in 3D. Using these models, auditors can conduct virtual walk-throughs of buildings and explore the as-is building geometry and the associated thermal conditions in 3D. Second, to address the qualitative and subjective interpretation of visual data, a new model-based method is presented to convert the 3D thermal profiles of building environments into associated energy performance metrics. More specifically, Energy Performance Augmented Reality (EPAR) models are formed that integrate the actual 3D spatio-thermal models ('as-is') with energy performance benchmarks ('as-designed') in 3D. In the EPAR models, the presence and location of potential energy problems in building environments are inferred from performance deviations. The as-is thermal resistances of the building assemblies are also calculated at the mesh-vertex level in 3D. Then, based on historical weather data reflecting the energy load for space conditioning, the amount of heat transfer that could be saved by improving the as-is thermal resistance of the defective areas to the recommended level is calculated, and the equivalent energy cost saving is estimated. The outcome provides building practitioners with unique information that can facilitate energy-efficient retrofit decision-making. This is a major departure from offhand calculations based on historical cost data of industry best practices.
Finally, to improve the reliability of BIM-based energy performance modeling and analysis for existing buildings, a new model-based automated method is presented to map actual thermal resistance measurements at the level of 3D vertices to the associated BIM elements and to update their corresponding thermal properties in the gbXML schema. By reflecting the as-is building condition in the BIM-based energy modeling process, this method bridges the gap between the architectural information in the as-designed BIM and the as-is building condition, enabling accurate energy performance analysis. The performance of each method was validated on ten case studies of interiors and exteriors of existing residential and instructional buildings in IL and VA. The extensive experimental results show the promise of the proposed methods in addressing the fundamental challenges of (1) visual sensing: scaling 2D visual assessments to real-world building environments and localizing energy problems; (2) analytics: subjective and qualitative assessments; and (3) BIM-based building energy analysis: the lack of procedures for reflecting the as-is building condition in the energy modeling process. Beyond the technical contributions, the domain expert surveys conducted in this dissertation show that the proposed methods have the potential to improve the quality of thermographic inspection processes and to complement current building energy analysis tools.
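The gbXML update step lends itself to a short sketch. The element and attribute names below (Construction, U-value, id) only loosely mimic the gbXML convention, and the measured values are invented; this is an illustration of the mapping idea, not the dissertation's implementation.

```python
import xml.etree.ElementTree as ET

# Illustrative gbXML-like fragment: two wall constructions with
# as-designed U-values (names are a loose stand-in for the real schema).
SAMPLE = """<gbXML>
  <Construction id="wall-1"><U-value unit="WPerSquareMeterK">0.35</U-value></Construction>
  <Construction id="wall-2"><U-value unit="WPerSquareMeterK">0.35</U-value></Construction>
</gbXML>"""

def update_u_values(xml_text, measured):
    """Overwrite as-designed U-values with measured as-is values.

    `measured` maps a construction id to a U-value in W/(m^2 K),
    e.g. aggregated from per-vertex thermal resistance measurements.
    """
    root = ET.fromstring(xml_text)
    for con in root.iter("Construction"):
        cid = con.get("id")
        if cid in measured:
            con.find("U-value").text = f"{measured[cid]:.3f}"
    return ET.tostring(root, encoding="unicode")

# Suppose the inspection found wall-2 performing worse than designed:
updated = update_u_values(SAMPLE, {"wall-2": 0.62})
```

The energy simulation then consumes the updated document, so the as-is condition flows into the analysis without manual re-entry.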

  2. An Abstraction-Based Data Model for Information Retrieval

    NASA Astrophysics Data System (ADS)

    McAllister, Richard A.; Angryk, Rafal A.

Language ontologies provide an avenue for automated lexical analysis that may be used to supplement existing information retrieval methods. This paper presents a method of information retrieval that takes advantage of WordNet, a lexical database, to generate paths of abstraction and uses them as the basis for an inverted index structure for retrieving documents from an indexed corpus. We present this method as an entrée to a line of research on using ontologies to perform word-sense disambiguation and improve the precision of existing information retrieval techniques.
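The abstraction-path indexing idea can be sketched with a hand-made hypernym map standing in for WordNet; the words and chains below are illustrative only.

```python
# Toy hypernym chains (WordNet would supply these via synset hypernyms).
HYPERNYMS = {
    "dog": "canine", "canine": "mammal", "mammal": "animal",
    "cat": "feline", "feline": "mammal",
}

def abstraction_path(word):
    """Walk the word up its chain of abstractions, most specific first."""
    path = [word]
    while path[-1] in HYPERNYMS:
        path.append(HYPERNYMS[path[-1]])
    return path

def build_index(corpus):
    """Post every concept on each term's abstraction path to an inverted index."""
    index = {}
    for doc_id, words in corpus.items():
        for w in words:
            for concept in abstraction_path(w):
                index.setdefault(concept, set()).add(doc_id)
    return index

index = build_index({"d1": ["dog"], "d2": ["cat"], "d3": ["dog", "cat"]})
```

A query at any level of abstraction then retrieves all documents whose terms abstract to that concept, e.g. querying "mammal" matches documents mentioning "dog" or "cat".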

  3. St. Paul Harbor, St. Paul Island, Alaska; Design for Wave and Shoaling Protection; Hydraulic Model Investigation

    DTIC Science & Technology

    1988-09-01

Measured prototype wave data, on which a comprehensive statistical analysis of wave conditions could be based, were used in the selection of test waves. Prior to testing of the various improvement plans, comprehensive tests were conducted for existing conditions.

  4. A Polygon Model for Wireless Sensor Network Deployment with Directional Sensing Areas

    PubMed Central

    Wu, Chun-Hsien; Chung, Yeh-Ching

    2009-01-01

The modeling of the sensing area of a sensor node is essential for the deployment algorithm of wireless sensor networks (WSNs). In this paper, a polygon model is proposed for sensor nodes with directional sensing areas. In addition, a WSN deployment algorithm is presented with topology control and scoring mechanisms to maintain network connectivity and improve the sensing coverage rate. To evaluate the proposed polygon model and WSN deployment algorithm, a simulation is conducted. The simulation results show that the proposed polygon model outperforms the existing disk model and circular sector model in terms of maximum sensing coverage rate. PMID:22303159
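The coverage-rate metric used in such evaluations can be sketched by Monte Carlo sampling over the deployment region, with each directional sensing area approximated by a polygon. The geometry below is illustrative, not the paper's algorithm.

```python
import math, random

def point_in_polygon(pt, poly):
    """Standard ray-casting point-in-polygon test."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            xint = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xint:
                inside = not inside
    return inside

def sector_polygon(cx, cy, r, start, end, steps=16):
    """Approximate a directional (circular-sector) sensing area by a polygon."""
    pts = [(cx, cy)]
    for k in range(steps + 1):
        a = start + (end - start) * k / steps
        pts.append((cx + r * math.cos(a), cy + r * math.sin(a)))
    return pts

def coverage_rate(polys, region=(0, 0, 1, 1), samples=20000, seed=0):
    """Fraction of random points in `region` covered by at least one sensor."""
    rng = random.Random(seed)
    x0, y0, x1, y1 = region
    hits = sum(
        any(point_in_polygon((rng.uniform(x0, x1), rng.uniform(y0, y1)), p)
            for p in polys) is False or 1  # placeholder replaced below
        for _ in range(samples))
    return hits / samples

def coverage_rate(polys, region=(0, 0, 1, 1), samples=20000, seed=0):
    rng = random.Random(seed)
    x0, y0, x1, y1 = region
    hits = 0
    for _ in range(samples):
        p = (rng.uniform(x0, x1), rng.uniform(y0, y1))
        if any(point_in_polygon(p, poly) for poly in polys):
            hits += 1
    return hits / samples

# One quarter-circle sensor of radius 0.4 in a unit square:
sensors = [sector_polygon(0.5, 0.5, 0.4, 0, math.pi / 2)]
rate = coverage_rate(sensors)  # analytic value: pi * 0.4**2 / 4 ≈ 0.126
```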

  5. Ecological impact of historical and future land-use patterns in Senegal

    USGS Publications Warehouse

    Parton, W.; Tappan, G. Gray; Ojima, D.; Tschakert, P.

    2004-01-01

    The CENTURY model was used to simulate changes in total system carbon resulting from land-use history (1850–2000), and impacts of climatic changes and improved land-use management practices in Senegal. Results show that 0.477 Gtons of carbon have been lost from 1850 to 2000. Improved management practices have the potential of increasing carbon levels by 0.116 Gtons from 2000 to 2100. Potential to store carbon exists for improved forest management and agriculture practices in southern Senegal. Potential climatic changes decrease plant production (30 percent), total system carbon (14 percent), and the potential to store carbon from improved management practices (31 percent).

  6. Implications of human tissue studies for radiation protection.

    PubMed

    Kathren, R L

    1988-08-01

Through radiochemical analysis of voluntary tissue donations, the U.S. Transuranium and Uranium Registries (USTR) are gaining an improved understanding of the distribution and biokinetics of actinide elements in occupationally exposed persons. Evaluation of the first two whole-body contributions to the USTR revealed an inverse proportionality between actinide concentration and bone ash. The analysis of a whole body with significant 241Am deposition indicated a significantly shorter half-time in liver and a greater fraction resident in the skeleton than predicted by existing models. Other studies with tissues obtained at autopsy suggest that existing biokinetic models for 238Pu and 241Am, and the currently accepted models and limits on intake that use these models as their basis, may be inaccurate, implying that revisions of existing safety standards may be necessary. Other studies of the registries are designed to compare in-vivo estimates of actinide deposition with those derived from postmortem tissue analysis, to compare results of animal experiments with human data, and to review histopathologic slides for tissue changes that might be attributable to exposure to transuranic elements. The implications of these recent findings and other work of the registries are discussed from the standpoint of their potential effect on biokinetic modeling, internal dose assessment, safety standards, and operational health physics practices.

  7. A nationwide survey of patient centered medical home demonstration projects.

    PubMed

    Bitton, Asaf; Martin, Carina; Landon, Bruce E

    2010-06-01

The patient centered medical home (PCMH) has received considerable attention as a potential way to improve primary care quality and limit cost growth. Little information exists that systematically compares PCMH pilot projects across the country. Cross-sectional key-informant interviews were conducted with leaders from existing PCMH demonstration projects with external payment reform. We used a semi-structured interview tool covering the following domains: project history, organization and participants, practice requirements and selection process, medical home recognition, payment structure, practice transformation, and evaluation design. A total of 26 demonstrations in 18 states were interviewed. Current demonstrations include over 14,000 physicians caring for nearly 5 million patients. A majority of demonstrations are single payer, and most utilize a three-component payment model (traditional fee for service, per person per month fixed payments, and bonus performance payments). The median incremental revenue per physician per year was $22,834 (range $720 to $91,146). Two major practice transformation models were identified: consultative and implementation of the chronic care model. A majority of demonstrations did not have well-developed evaluation plans. Current PCMH demonstration projects with external payment reform include large numbers of patients and physicians as well as a wide spectrum of implementation models. Key questions remain about the adequacy of current payment mechanisms and evaluation plans as public and policy interest in the PCMH model grows.
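The three-component payment model composes additively on top of fee-for-service. A minimal sketch with an invented panel size, per-member-per-month rate, and bonus (these are not figures from the survey, which reports only the $22,834 median incremental revenue):

```python
def incremental_revenue(panel_size, pmpm, annual_bonus):
    """Incremental revenue above traditional fee-for-service:
    fixed per-member-per-month payments plus a performance bonus.
    Fee-for-service billing continues unchanged alongside these."""
    return panel_size * pmpm * 12 + annual_bonus

# Hypothetical practice: 300 attributed patients at $6.00 PMPM
# plus a $1,200 annual quality bonus.
rev = incremental_revenue(panel_size=300, pmpm=6.0, annual_bonus=1200)
# 300 * 6.0 * 12 + 1200 = 22800.0, near the reported median
```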

  8. Implications of human tissue studies for radiation protection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kathren, R.L.

    1988-08-01

Through radiochemical analysis of voluntary tissue donations, the U.S. Transuranium and Uranium Registries (USTR) are gaining an improved understanding of the distribution and biokinetics of actinide elements in occupationally exposed persons. Evaluation of the first two whole-body contributions to the USTR revealed an inverse proportionality between actinide concentration and bone ash. The analysis of a whole body with significant 241Am deposition indicated a significantly shorter half-time in liver and a greater fraction resident in the skeleton than predicted by existing models. Other studies with tissues obtained at autopsy suggest that existing biokinetic models for 238Pu and 241Am, and the currently accepted models and limits on intake that use these models as their basis, may be inaccurate, implying that revisions of existing safety standards may be necessary. Other studies of the registries are designed to compare in-vivo estimates of actinide deposition with those derived from postmortem tissue analysis, to compare results of animal experiments with human data, and to review histopathologic slides for tissue changes that might be attributable to exposure to transuranic elements. The implications of these recent findings and other work of the registries are discussed from the standpoint of their potential effect on biokinetic modeling, internal dose assessment, safety standards, and operational health physics practices.

  9. Properties predictive modeling through the concept of a hybrid interphase existing between phases in contact

    NASA Astrophysics Data System (ADS)

    Portan, D. V.; Papanicolaou, G. C.

    2018-02-01

From a practical point of view, predictive modeling based on the physics of composite material behavior is wealth generating: it guides material system selection and process choices, cuts down on experimentation and associated costs, and speeds up the time frame from the research stage to the marketplace. The presence of areas with different properties and the existence of an interphase between them have a pronounced influence on the behavior of a composite system. The Viscoelastic Hybrid Interphase Model (VHIM) considers the existence of a non-homogeneous, viscoelastic, and anisotropic interphase whose properties depend on the degree of adhesion between the two phases in contact. The model applies to any physical/mechanical property (e.g. mechanical, thermal, electrical, and/or biomechanical). Knowing the interphasial variation of a specific property, one can predict the corresponding macroscopic behavior of the composite. Moreover, the model acts as an algorithm, and a two-way approach can be used: (i) phases in contact may be chosen to obtain the desired properties of the final composite system, or (ii) the initial phases in contact determine the final behavior of the composite system, which can be approximately predicted. The VHIM has proven useful, among other applications, in biomaterial design for improved contact with human tissues.

  10. Automated Transition State Theory Calculations for High-Throughput Kinetics.

    PubMed

    Bhoorasingh, Pierre L; Slakman, Belinda L; Seyedzadeh Khanshan, Fariba; Cain, Jason Y; West, Richard H

    2017-09-21

A scarcity of known chemical kinetic parameters leads to the use of many reaction rate estimates, which are not always sufficiently accurate, in the construction of detailed kinetic models. To reduce the reliance on these estimates and improve the accuracy of predictive kinetic models, we have developed a high-throughput, fully automated reaction rate calculation method, AutoTST. The algorithm integrates automated saddle-point geometry search methods and a canonical transition state theory kinetics calculator. The automatically calculated reaction rates compare favorably to existing estimated rates. Comparison against high-level theoretical calculations shows that the new automated method performs better than rate estimates when the estimate is made by a poor analogy. The method will improve by accounting for internal rotor contributions and by improving methods to determine molecular symmetry.
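The canonical transition state theory step can be illustrated by the Eyring expression for a unimolecular rate constant. In AutoTST the activation quantities come from automated quantum-chemical calculations on the located saddle point; here the free energy of activation is simply an assumed input.

```python
import math

KB = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34  # Planck constant, J*s
R = 8.314462618     # gas constant, J/(mol*K)

def eyring_rate(delta_g_act, T):
    """Canonical TST (Eyring) rate for a unimolecular step, in 1/s:
    k(T) = (kB*T/h) * exp(-dG_act / (R*T)), dG_act in J/mol."""
    return (KB * T / H) * math.exp(-delta_g_act / (R * T))

# Assumed 80 kJ/mol free-energy barrier, evaluated at two temperatures:
k300 = eyring_rate(80e3, 300.0)
k400 = eyring_rate(80e3, 400.0)
```

The strong temperature dependence (orders of magnitude between 300 K and 400 K for this barrier) is why rate estimates by poor analogy can be badly wrong.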

  11. Improving the accuracy of vehicle emissions profiles for urban transportation greenhouse gas and air pollution inventories.

    PubMed

    Reyna, Janet L; Chester, Mikhail V; Ahn, Soyoung; Fraser, Andrew M

    2015-01-06

Metropolitan greenhouse gas and air emissions inventories can better account for the variability in vehicle movement, fleet composition, and infrastructure that exists within and between regions, to develop more accurate information for environmental goals. With emerging access to high-quality data, new methods are needed to inform transportation emissions assessment practitioners of the relevant vehicle and infrastructure characteristics that should be prioritized in modeling to improve the accuracy of inventories. The sensitivity of light- and heavy-duty vehicle greenhouse gas (GHG) and conventional air pollutant (CAP) emissions to speed, weight, age, and roadway gradient is examined with second-by-second velocity profiles on freeway and arterial roads under free-flow and congestion scenarios. By creating upper and lower bounds for each factor, the potential variability that could exist in transportation emissions assessments is estimated. When the effects of changes in these characteristics across U.S. cities are compared against average characteristics of the U.S. fleet and infrastructure, significant variability in emissions is found. GHGs from light-duty vehicles could vary by -2% to 11% and CAPs by -47% to 228% compared to the baseline. For heavy-duty vehicles, the variability is -21% to 55% and -32% to 174%, respectively. The results show that cities should more aggressively pursue the integration of emerging big data into regional transportation emissions modeling; the integration of these data is likely to affect GHG and CAP inventories and how aggressively policies must be implemented to meet reductions. A web tool is developed to aid cities in reducing emissions uncertainty.

  12. Multi-Step Time Series Forecasting with an Ensemble of Varied Length Mixture Models.

    PubMed

    Ouyang, Yicun; Yin, Hujun

    2018-05-01

Many real-world problems require modeling and forecasting of time series, such as weather temperature, electricity demand, stock prices, and foreign exchange (FX) rates. Often, the tasks involve predicting over a long-term period, e.g. several weeks or months. Most existing time series models are inherently one-step predictors, that is, they predict one time point ahead. Multi-step or long-term prediction is difficult and challenging due to the lack of information and the accumulation of uncertainty or error. The main existing approaches, iterative and independent, either use a one-step model recursively or treat the multi-step task as an independent model; both generally perform poorly in practical applications. In this paper, as an extension of the self-organizing mixture autoregressive (AR) model, varied length mixture (VLM) models are proposed to model and forecast time series over multiple steps. The key idea is to preserve the dependencies between the time points within the prediction horizon. Training data are segmented into various lengths corresponding to various forecasting horizons, and the VLM models are trained in a self-organizing fashion on these segments to capture these dependencies in component AR models of various prediction horizons. The VLM models form a probabilistic mixture of these varied length models. A combination of short and long VLM models and an ensemble of them are proposed to further enhance prediction performance. The effectiveness of the proposed methods and their marked improvements over existing methods are demonstrated through a number of experiments on synthetic data, real-world FX rates, and weather temperatures.
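The segmentation step, cutting the training series into (history, future) pairs for each forecasting horizon, can be sketched as follows (the window lengths are illustrative, not the paper's settings):

```python
def varied_length_segments(series, history, horizons):
    """Return {h: [(history_window, future_window), ...]} so that each
    forecasting horizon h gets training pairs preserving dependencies
    across the whole h-step prediction window."""
    out = {h: [] for h in horizons}
    for h in horizons:
        for t in range(len(series) - history - h + 1):
            past = series[t:t + history]
            future = series[t + history:t + history + h]
            out[h].append((past, future))
    return out

# Toy series 0..9, 3-point histories, horizons of 1, 2 and 4 steps:
segs = varied_length_segments(list(range(10)), history=3, horizons=[1, 2, 4])
```

Each horizon's segment set would then train its own component AR model, with the mixture combining them probabilistically.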

  13. Theoretical and software considerations for nonlinear dynamic analysis

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1983-01-01

In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends multilevel substructure modeling to dynamic analysis and defines the requirements for a general-purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general-purpose structural software system is presented.
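One common formulation of substructuring's reduction step is static (Guyan) condensation: interior degrees of freedom of a substructure are eliminated via a Schur complement so that only boundary DOFs appear at the next level. A minimal sketch on a small symmetric system, assuming the static linear case (the dynamic extension discussed above requires additional treatment of mass and inertia terms):

```python
import numpy as np

# Small symmetric stiffness matrix and load vector; DOFs 0-1 are the
# substructure boundary, DOFs 2-3 are interior.
K = np.array([[4., -1., -1.,  0.],
              [-1., 4.,  0., -1.],
              [-1., 0.,  4., -1.],
              [0., -1., -1.,  4.]])
f = np.array([0., 0., 1., 2.])
b, i = [0, 1], [2, 3]

Kbb, Kbi = K[np.ix_(b, b)], K[np.ix_(b, i)]
Kib, Kii = K[np.ix_(i, b)], K[np.ix_(i, i)]

# Condensed (Schur-complement) stiffness and load seen by the boundary:
Kc = Kbb - Kbi @ np.linalg.solve(Kii, Kib)
fc = f[b] - Kbi @ np.linalg.solve(Kii, f[i])

ub = np.linalg.solve(Kc, fc)      # boundary displacements from reduced system
u_full = np.linalg.solve(K, f)    # reference solution of the full system
```

For the static linear case the condensation is exact: the boundary displacements from the reduced system match the full solve, which is what makes the technique attractive when many substructures share a small boundary interface.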

  14. A simulation model of hospital management based on cost accounting analysis according to disease.

    PubMed

    Tanaka, Koji; Sato, Junzo; Guo, Jinqiu; Takada, Akira; Yoshihara, Hiroyuki

    2004-12-01

Since shortly before 2000, hospital cost accounting has been increasingly performed at Japanese national university hospitals. At Kumamoto University Hospital, for instance, departmental costs have been analyzed since 2000, and since 2003 the cost balance has been obtained for certain diseases in preparation for Diagnosis-Related Groups and the Prospective Payment System. On the basis of these experiences, we have constructed a simulation model of hospital management. The program has worked correctly in repeated trials and with satisfactory speed. Although there is room for improvement in the detailed accounts and the cost accounting engine, the basic model has proved satisfactory. We have constructed a hospital management model based on the financial data of an existing hospital, and we plan to improve the program by refining its structure and using a wider variety of hospital management data. A prospective outlook may then be obtained for the practical application of this hospital management model.

  15. Towards High Spatio-Temporal Resolution Estimates of Surface Radiative Fluxes from Geostationary Satellite Observations for the Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Niu, X.; Yang, K.; Tang, W.; Qin, J.

    2014-12-01

Surface Solar Radiation (SSR) plays an important role in hydrological and land process modeling; in particular, it contributes more than 90% of the total melt energy for ice melting on the Tibetan Plateau (TP). Neither surface measurements nor existing remote sensing products can provide SSR at the resolution these applications require over the TP. The well-known satellite products (i.e., ISCCP-FD and GEWEX-SRB) have relatively low spatial resolution (0.5°-2.5°) and temporal resolution (3-hourly, daily, or monthly). The objective of this study is to develop improved estimates of SSR over the TP based on geostationary satellite observations from the Multi-functional Transport Satellite (MTSAT), with high spatial (0.05°) and temporal (hourly) resolution. An existing physical model, the UMD-SRB (University of Maryland Surface Radiation Budget), which is the basis of the GEWEX-SRB model, is revisited to improve SSR estimates over the TP. The UMD-SRB algorithm transforms TOA radiances into broadband albedos in order to infer the atmospheric transmissivity, which finally determines the SSR. Specifically, the main updates introduced in this study are: implementation at 0.05° spatial resolution at hourly intervals, integrated to daily and monthly time scales; and improvement of the surface albedo model by introducing the recently developed Global Land Surface Broadband Albedo Product (GLASS) based on MODIS data. This updated inference scheme will be evaluated against ground observations from China Meteorological Administration (CMA) radiation stations and three TP radiation stations operated by the Institute of Tibetan Plateau Research.
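The last link of the inference chain, from transmissivity to SSR, is simple solar geometry. A back-of-envelope sketch follows; the transmissivity value is assumed rather than inferred from TOA albedo, and the Sun-Earth distance (eccentricity) correction is omitted.

```python
import math

S0 = 1361.0  # solar constant, W/m^2

def surface_solar_radiation(transmissivity, solar_zenith_deg):
    """SSR = S0 * cos(SZA) * tau for the sun above the horizon.
    In the full algorithm, tau is inferred from TOA broadband albedo."""
    mu = math.cos(math.radians(solar_zenith_deg))
    if mu <= 0.0:
        return 0.0  # sun below the horizon
    return S0 * mu * transmissivity

# Assumed clear-ish sky (tau = 0.7) at a 30-degree solar zenith angle:
ssr = surface_solar_radiation(0.7, 30.0)
```

Hourly values like this, driven by the geostationary retrievals, would then be integrated to the daily and monthly scales mentioned above.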

  16. Testing the effects of in-stream sediment sources and sinks on simulated watershed sediment yield using the coupled U.S. Army Corps of Engineers GSSHA Model and SEDLIB Sediment Transport Library

    NASA Astrophysics Data System (ADS)

    Floyd, I. E.; Downer, C. W.; Brown, G.; Pradhan, N. R.

    2017-12-01

The Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model is the US Army Corps of Engineers' (USACE) only fully coupled overland/in-stream sediment transport model. While the overland sediment transport formulation in GSSHA is considered state of the art, the existing in-stream sediment transport formulation is less robust. A major omission in the existing GSSHA in-stream formulation is the lack of in-stream sources of fine materials. In this effort, we enhanced the in-stream sediment transport capability of GSSHA by linking GSSHA to the SEDLIB sediment transport library. SEDLIB was developed at the Coastal and Hydraulics Laboratory (CHL) under the System Wide Water Resources Program (SWWRP) and the Flood and Coastal (F&C) research program. It is designed to provide a library of sediment flux formulations for hydraulic and hydrologic models such as GSSHA. This new version of GSSHA, with the updated in-stream sediment transport simulation capability afforded by the linkage to SEDLIB, was tested against observations in an experimental watershed that had previously served as a test bed for GSSHA. The results show a significant improvement in the ability to model in-stream sources of fine sediment. This improved capability will broaden the applicability of GSSHA to larger watersheds and to watersheds with complex sediment dynamics, such as those subjected to fire hydrology.

  17. Modeling Circulation along the Vietnamese Coast Influenced by Monsoon Variability in Meteorology, River Discharge and Interactions with the Vietnamese East Sea

    DTIC Science & Technology

    2013-09-30

Advanced variational methods for the assimilation of satellite and in situ observations are applied to achieve improved state estimation in the South China Sea (SCS), using the Regional Ocean Modeling System (ROMS) with Incremental Strong Constraint 4-Dimensional Variational (IS4DVAR) data assimilation.

  18. Inclusion of an ultraviolet radiation transfer component in an urban forest effects model for predicting tree influences on potential below-canopy exposure to UVB radiation

    Treesearch

    Gordon M. Heisler; Richard H. Grant; David J. Nowak; Wei Gao; Daniel E. Crane; Jeffery T. Walton

    2003-01-01

    Evaluating the impact of ultraviolet-B radiation (UVB) on urban populations would be enhanced by improved predictions of the UVB radiation at the level of human activity. This paper reports the status of plans for incorporating a UVB prediction module into an existing Urban Forest Effects (UFORE) model. UFORE currently has modules to quantify urban forest structure,...

  19. Application of the Fractions Skill Score for Tracking the Effectiveness of Improvements Made to Weather Research and Forecasting Model Simulations

    DTIC Science & Technology

    2017-11-22

By John W Raby and Huaqing Cai, Computational and Information Sciences Directorate, ARL.

  20. Understanding and Improving Modifiable Cardiovascular Risks within the Air Force

    DTIC Science & Technology

    2013-10-04

Findings were framed using the Health Promotion Model (HPM). The definition of health included exercise, proper eating, sleep, and a spiritual connection. Participants related influences to health behaviors, including what it takes to be healthy, knowing oneself, and existing Air Force policies. Data were arranged into data-driven themes, which were then compared to the elements of the HPM; the HPM did not fully address all of the themes.

  1. Quantifying the impacts of land surface schemes and dynamic vegetation on the model dependency of projected changes in surface energy and water budgets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Miao; Wang, Guiling; Chen, Haishan

Assessing and quantifying the uncertainties in projected future changes of energy and water budgets over land surface are important steps toward improving our confidence in climate change projections. In our study, the contribution of land surface models to the inter-GCM variation of projected future changes in land surface energy and water fluxes are assessed based on output from 19 global climate models (GCMs) and offline Community Land Model version 4 (CLM4) simulations driven by meteorological forcing from the 19 GCMs. Similar offline simulations using CLM4 with its dynamic vegetation submodel are also conducted to investigate how dynamic vegetation feedback, a process that is being added to more earth system models, may amplify or moderate the intermodel variations of projected future changes. Projected changes are quantified as the difference between the 2081–2100 period from the Representative Concentration Pathway 8.5 (RCP8.5) future experiment and the 1981–2000 period from the historical simulation. Under RCP8.5, projected changes in surface water and heat fluxes show a high degree of model dependency across the globe. Although precipitation is very likely to increase in the high latitudes of the Northern Hemisphere, a high degree of model-related uncertainty exists for evapotranspiration, soil water content, and surface runoff, suggesting discrepancy among land surface models (LSMs) in simulating the surface hydrological processes and snow-related processes. Large model-related uncertainties for the surface water budget also exist in the Tropics including southeastern South America and Central Africa. Moreover, these uncertainties would be reduced in the hypothetical scenario of a single near-perfect land surface model being used across all GCMs, suggesting the potential to reduce uncertainties through the use of more consistent approaches toward land surface model development.
Under such a scenario, the most significant reduction is likely to be seen in the Northern Hemisphere high latitudes. Including representation of vegetation dynamics is expected to further amplify the model-related uncertainties in projected future changes in surface water and heat fluxes as well as soil moisture content. This is especially the case in the high latitudes of the Northern Hemisphere (e.g., northwestern North America and central North Asia) where the projected vegetation changes are uncertain and in the Tropics (e.g., the Amazon and Congo Basins) where dense vegetation exists. Finally, findings from this study highlight the importance of improving land surface model parameterizations related to soil and snow processes, as well as the importance of improving the accuracy of dynamic vegetation models.
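The projected-change and inter-model-spread calculation described above can be sketched with synthetic flux values; the numbers are invented, and only the arithmetic mirrors the study's definition (2081-2100 RCP8.5 mean minus 1981-2000 historical mean, per model, with the spread across the 19 models as the uncertainty measure).

```python
import numpy as np

rng = np.random.default_rng(42)
n_models = 19  # as in the 19-GCM ensemble

# Synthetic 20-year annual-mean fluxes (e.g. W/m^2) per model:
hist = rng.normal(loc=60.0, scale=2.0, size=(n_models, 20))    # 1981-2000
future = rng.normal(loc=65.0, scale=4.0, size=(n_models, 20))  # 2081-2100

delta = future.mean(axis=1) - hist.mean(axis=1)  # per-model projected change
ensemble_change = delta.mean()                   # multi-model mean change
model_spread = delta.std(ddof=1)                 # inter-model uncertainty
```

In the study this is done grid cell by grid cell, so `model_spread` becomes a map of where LSM-related uncertainty concentrates.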

  2. Quantifying the impacts of land surface schemes and dynamic vegetation on the model dependency of projected changes in surface energy and water budgets

    DOE PAGES

    Yu, Miao; Wang, Guiling; Chen, Haishan

    2016-03-01

Assessing and quantifying the uncertainties in projected future changes of energy and water budgets over land surface are important steps toward improving our confidence in climate change projections. In our study, the contribution of land surface models to the inter-GCM variation of projected future changes in land surface energy and water fluxes are assessed based on output from 19 global climate models (GCMs) and offline Community Land Model version 4 (CLM4) simulations driven by meteorological forcing from the 19 GCMs. Similar offline simulations using CLM4 with its dynamic vegetation submodel are also conducted to investigate how dynamic vegetation feedback, a process that is being added to more earth system models, may amplify or moderate the intermodel variations of projected future changes. Projected changes are quantified as the difference between the 2081–2100 period from the Representative Concentration Pathway 8.5 (RCP8.5) future experiment and the 1981–2000 period from the historical simulation. Under RCP8.5, projected changes in surface water and heat fluxes show a high degree of model dependency across the globe. Although precipitation is very likely to increase in the high latitudes of the Northern Hemisphere, a high degree of model-related uncertainty exists for evapotranspiration, soil water content, and surface runoff, suggesting discrepancy among land surface models (LSMs) in simulating the surface hydrological processes and snow-related processes. Large model-related uncertainties for the surface water budget also exist in the Tropics including southeastern South America and Central Africa. Moreover, these uncertainties would be reduced in the hypothetical scenario of a single near-perfect land surface model being used across all GCMs, suggesting the potential to reduce uncertainties through the use of more consistent approaches toward land surface model development.
Under such a scenario, the most significant reduction is likely to be seen in the Northern Hemisphere high latitudes. Including representation of vegetation dynamics is expected to further amplify the model-related uncertainties in projected future changes in surface water and heat fluxes as well as soil moisture content. This is especially the case in the high latitudes of the Northern Hemisphere (e.g., northwestern North America and central North Asia) where the projected vegetation changes are uncertain and in the Tropics (e.g., the Amazon and Congo Basins) where dense vegetation exists. Finally, findings from this study highlight the importance of improving land surface model parameterizations related to soil and snow processes, as well as the importance of improving the accuracy of dynamic vegetation models.

  3. Snow Microwave Radiative Transfer (SMRT): A new model framework to simulate snow-microwave interactions for active and passive remote sensing applications

    NASA Astrophysics Data System (ADS)

    Loewe, H.; Picard, G.; Sandells, M. J.; Mätzler, C.; Kontu, A.; Dumont, M.; Maslanka, W.; Morin, S.; Essery, R.; Lemmetyinen, J.; Wiesmann, A.; Floury, N.; Kern, M.

    2016-12-01

    Forward modeling of snow-microwave interactions is widely used to interpret microwave remote sensing data from active and passive sensors. Although several models are already available for this purpose, a joint effort has been undertaken over the past two years within the ESA Project "Microstructural origin of electromagnetic signatures in microwave remote sensing of snow". The new Snow Microwave Radiative Transfer (SMRT) model primarily facilitates a flexible treatment of snow microstructure as seen by X-ray tomography and seeks to unite the respective advantages of existing models. In its main setting, SMRT considers radiative transfer in a plane-parallel snowpack consisting of homogeneous layers, with each layer's microstructure represented by an autocorrelation function. The electromagnetic model, which underlies the permittivity, absorption, and scattering calculations within a layer, is based on the improved Born approximation. The resulting vector radiative transfer equation in the snowpack is solved using spectral decomposition within a discrete ordinates discretization. SMRT is implemented in Python and employs an object-oriented, modular design intended to (i) provide an intuitive and fail-safe API for basic users, (ii) enable efficient community development of extensions (e.g. improved sub-models for microstructure, permittivity, soil, or interface reflectivity) by advanced users, and (iii) encapsulate the numerical core, which is maintained by the developers. For cross-validation and inter-model comparison, SMRT implements various ingredients of existing models as selectable options (e.g. Rayleigh or DMRT-QCA phase functions) and shallow wrappers to invoke legacy model code directly (MEMLS, DMRT-QMS, HUT). In this paper we give an overview of the model components and show examples and results from different validation schemes.
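The microstructure ingredient mentioned above enters through a per-layer autocorrelation function. As a minimal sketch (not SMRT's API), the exponential form with correlation length xi is a common choice; the lag distances and correlation length below are illustrative.

```python
import numpy as np

# Exponential autocorrelation C(r) = exp(-r / xi), a common single-parameter
# microstructure model; xi is the correlation length. Values are illustrative,
# not taken from SMRT or the paper.

def exp_autocorrelation(r, xi):
    """Evaluate C(r) = exp(-r / xi) at lag distances r (metres)."""
    return np.exp(-r / xi)

r = np.linspace(0.0, 1.0e-3, 5)            # lags from 0 to 1 mm
c = exp_autocorrelation(r, xi=2.0e-4)      # 0.2 mm correlation length
```

In an improved-Born-type calculation, this function (through its Fourier transform) controls the layer's scattering coefficient.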

  4. A fuzzy set preference model for market share analysis

    NASA Technical Reports Server (NTRS)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    Consumer preference models are widely used in new product design, marketing management, pricing, and market segmentation. The success of new products depends on accurate market share prediction and design decisions based on consumer preferences. The vague linguistic nature of consumer preferences and product attributes, combined with the substantial differences between individuals, creates a formidable challenge to marketing models. The most widely used methodology is conjoint analysis. Conjoint models, as currently implemented, represent linguistic preferences as ratio or interval-scaled numbers, use only numeric product attributes, and require aggregation of individuals for estimation purposes. It is not surprising that these models are costly to implement, are inflexible, and have a predictive validity that is not substantially better than chance. This affects the accuracy of market share estimates. A fuzzy set preference model can easily represent linguistic variables either in consumer preferences or product attributes with minimal measurement requirements (ordinal scales), while still estimating overall preferences suitable for market share prediction. This approach results in flexible individual-level conjoint models which can provide more accurate market share estimates from a smaller number of more meaningful consumer ratings. Fuzzy sets can be incorporated within existing preference model structures, such as a linear combination, using the techniques developed for conjoint analysis and market share estimation. The purpose of this article is to develop and fully test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. 
The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation), and how much to make (market share prediction).
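The linguistic-variable idea above can be sketched with triangular membership functions on an ordinal rating scale, the kind of minimal measurement the abstract argues for. The labels and breakpoints below are illustrative assumptions, not the authors' model.

```python
# Sketch: map an ordinal consumer rating (1-7) to fuzzy memberships in three
# linguistic categories. Breakpoints and labels are hypothetical.

def tri(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def preference(rating):
    """Memberships of an ordinal rating in {low, medium, high}."""
    return {
        "low": tri(rating, 0, 1, 4),
        "medium": tri(rating, 2, 4, 6),
        "high": tri(rating, 4, 7, 8),
    }

m = preference(5)   # a rating of 5 is partly "medium", partly "high"
```

Memberships like these can then be combined linearly across attributes, mirroring the structure of a conventional conjoint model at the individual level.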

  5. Systematic Review of Model-Based Economic Evaluations of Treatments for Alzheimer's Disease.

    PubMed

    Hernandez, Luis; Ozen, Asli; DosSantos, Rodrigo; Getsios, Denis

    2016-07-01

    Numerous economic evaluations using decision-analytic models have assessed the cost effectiveness of treatments for Alzheimer's disease (AD) in the last two decades. It is important to understand the methods used in the existing models of AD and how they could impact results, as they could inform new model-based economic evaluations of treatments for AD. The aim of this systematic review was to provide a detailed description on the relevant aspects and components of existing decision-analytic models of AD, identifying areas for improvement and future development, and to conduct a quality assessment of the included studies. We performed a systematic and comprehensive review of cost-effectiveness studies of pharmacological treatments for AD published in the last decade (January 2005 to February 2015) that used decision-analytic models, also including studies considering patients with mild cognitive impairment (MCI). The background information of the included studies and specific information on the decision-analytic models, including their approach and components, assumptions, data sources, analyses, and results, were obtained from each study. A description of how the modeling approaches and assumptions differ across studies, identifying areas for improvement and future development, is provided. At the end, we present our own view of the potential future directions of decision-analytic models of AD and the challenges they might face. The included studies present a variety of different approaches, assumptions, and scope of decision-analytic models used in the economic evaluation of pharmacological treatments of AD. 
The major areas for improvement in future models of AD are to include domains of cognition, function, and behavior, rather than cognition alone; include a detailed description of how data used to model the natural course of disease progression were derived; state and justify the economic model selected and structural assumptions and limitations; provide a detailed (rather than high-level) description of the cost components included in the model; and report on the face-, internal-, and cross-validity of the model to strengthen the credibility and confidence in model results. The quality scores of most studies were rated as fair to good (average 87.5, range 69.5-100, on a scale of 0-100). Despite the advancements in decision-analytic models of AD, there remain several areas of improvement that are necessary to more appropriately and realistically capture the broad nature of AD and the potential benefits of treatments in future models of AD.

  6. A novel double loop control model design for chemical unstable processes.

    PubMed

    Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He

    2014-03-01

    In this manuscript, based on the Smith predictor control scheme for unstable processes in industry, an improved double loop control model is proposed for chemical unstable processes. The inner loop stabilizes the integrating or unstable process and transforms the original process into a stable first-order plus dead-time process. The outer loop enhances the performance of the set-point response, and a disturbance controller is designed to enhance the performance of the disturbance response. The improved control system is simple and has a clear physical meaning, and its characteristic equation is easy to stabilize. The three controllers are designed separately, so each can be tuned for good performance of its respective closed-loop transfer function. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method gives better system performance than existing design methods.
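The first-order plus dead-time (FOPDT) process that the inner loop produces has the well-known step response y(t) = K(1 - exp(-(t - L)/T)) for t >= L. The sketch below evaluates it with illustrative gain, time constant, and dead time; the paper's controller designs are not reproduced.

```python
import numpy as np

# Step response of an FOPDT process: gain K, time constant T, dead time L.
# Parameter values are illustrative only.

def fopdt_step(t, K=1.0, T=5.0, L=2.0):
    """Unit-step response of K * exp(-L s) / (T s + 1)."""
    t = np.asarray(t, dtype=float)
    y = K * (1.0 - np.exp(-(t - L) / T))
    return np.where(t < L, 0.0, y)     # no response before the dead time elapses

t = np.linspace(0, 30, 301)
y = fopdt_step(t)                      # rises from 0 toward K after t = L
```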

  7. Improved darunavir genotypic mutation score predicting treatment response for patients infected with HIV-1 subtype B and non-subtype B receiving a salvage regimen.

    PubMed

    De Luca, Andrea; Flandre, Philippe; Dunn, David; Zazzi, Maurizio; Wensing, Annemarie; Santoro, Maria Mercedes; Günthard, Huldrych F; Wittkop, Linda; Kordossis, Theodoros; Garcia, Federico; Castagna, Antonella; Cozzi-Lepri, Alessandro; Churchill, Duncan; De Wit, Stéphane; Brockmeyer, Norbert H; Imaz, Arkaitz; Mussini, Cristina; Obel, Niels; Perno, Carlo Federico; Roca, Bernardino; Reiss, Peter; Schülter, Eugen; Torti, Carlo; van Sighem, Ard; Zangerle, Robert; Descamps, Diane

    2016-05-01

    The objective of this study was to improve the prediction of the impact of HIV-1 protease mutations in different viral subtypes on virological response to darunavir. Darunavir-containing treatment change episodes (TCEs) in patients previously failing PIs were selected from large European databases. HIV-1 subtype B-infected patients were used as the derivation dataset and HIV-1 non-B-infected patients were used as the validation dataset. The adjusted association of each mutation with week 8 HIV RNA change from baseline was analysed by linear regression. A prediction model was derived based on best subset least squares estimation with mutational weights corresponding to regression coefficients. Virological outcome prediction accuracy was compared with that from existing genotypic resistance interpretation systems (GISs) (ANRS 2013, Rega 9.1.0 and HIVdb 7.0). TCEs were selected from 681 subtype B-infected and 199 non-B-infected adults. Accompanying drugs were NRTIs in 87%, NNRTIs in 27% and raltegravir or maraviroc or enfuvirtide in 53%. The prediction model included weighted protease mutations, HIV RNA, CD4 and activity of accompanying drugs. The model's association with week 8 HIV RNA change in the subtype B (derivation) set was R² = 0.47 [average squared error (ASE) = 0.67, P < 10⁻⁶]; in the non-B (validation) set, ASE was 0.91. Accuracy investigated by means of area under the receiver operating characteristic curves with a binary response (above the threshold value of HIV RNA reduction) showed that our final model outperformed models with existing interpretation systems in both training and validation sets. A model with a new darunavir-weighted mutation score outperformed existing GISs in both B and non-B subtypes in predicting virological response to darunavir.
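The core of the weighted-mutation-score approach can be sketched as an ordinary least squares fit: regress the week-8 RNA change on binary mutation indicators, then use the fitted coefficients as mutation weights. The mutations, weights, and data below are synthetic; the study used best-subset selection on large cohort data, which is not reproduced here.

```python
import numpy as np

# Synthetic illustration: 200 patients, 4 hypothetical protease mutations.
rng = np.random.default_rng(1)
true_w = np.array([0.6, 0.3, 0.0, 0.9])              # hypothetical mutation weights
X = rng.integers(0, 2, size=(200, 4)).astype(float)  # mutation presence/absence
y = X @ true_w + rng.normal(0, 0.05, 200)            # week-8 log10 RNA change (toy)

# Least squares with an intercept column; coefficients become mutation weights.
coef, *_ = np.linalg.lstsq(np.c_[np.ones(200), X], y, rcond=None)
weights = coef[1:]
score = X @ weights                                  # genotypic score per patient
```

In the study, such a score is combined with baseline HIV RNA, CD4, and the activity of accompanying drugs before comparison against rule-based interpretation systems.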

  8. Simulated cold bias being improved by using MODIS time-varying albedo in the Tibetan Plateau in WRF model

    NASA Astrophysics Data System (ADS)

    Meng, X.; Lyu, S.; Zhang, T.; Zhao, L.; Li, Z.; Han, B.; Li, S.; Ma, D.; Chen, H.; Ao, Y.; Luo, S.; Shen, Y.; Guo, J.; Wen, L.

    2018-04-01

    Systematic cold biases exist in simulations of 2 m air temperature in the Tibetan Plateau (TP) by regional climate models and global atmospheric general circulation models. We updated the albedo in the Weather Research and Forecasting (WRF) model lower boundary condition using the Global LAnd Surface Satellite Moderate-Resolution Imaging Spectroradiometer albedo products and demonstrated evident improvement in the cold temperature biases in the TP. It is the large overestimation of albedo in winter and spring in the WRF model that caused the large cold temperature biases. The overestimated albedo resulted from simulated precipitation biases and over-parameterization of snow albedo. Furthermore, light-absorbing aerosols can cause a large reduction of albedo over snow and ice cover. The results suggest the necessity of developing a snow albedo parameterization from observations in the TP, where snow cover and melting differ greatly from those in lower-elevation regions, and the influence of aerosols should be considered as well. Beyond refining snow albedo, our results highlight an urgent need to improve precipitation simulation in the TP.

  9. Improved Modeling in a Matlab-Based Navigation System

    NASA Technical Reports Server (NTRS)

    Deutschmann, Julie; Bar-Itzhack, Itzhack; Harman, Rick; Larimore, Wallace E.

    1999-01-01

    An innovative approach to autonomous navigation is available for low earth orbit satellites. The system is developed in Matlab and utilizes an Extended Kalman Filter (EKF) to estimate the attitude and trajectory based on spacecraft magnetometer and gyro data. Preliminary tests of the system with real spacecraft data from the Rossi X-Ray Timing Explorer Satellite (RXTE) indicate the existence of unmodeled errors in the magnetometer data. Incorporating into the EKF a statistical model that describes the colored component of the effective measurement of the magnetic field vector could improve the accuracy of the trajectory and attitude estimates and also improve the convergence time. This model is identified as a first order Markov process. With the addition of the model, the EKF attempts to identify the non-white components of the noise, allowing for more accurate estimation of the original state vector, i.e. the orbital elements and the attitude. Working in Matlab allows for easy incorporation of new models into the EKF, and the resulting navigation system is generic and can easily be applied to future missions, providing an alternative for onboard or ground-based navigation.
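A first-order Markov (Gauss-Markov) error model like the one identified above evolves as b(k+1) = phi*b(k) + w(k) with phi = exp(-dt/tau); augmenting the EKF state with b lets the filter track the colored noise. The sketch below simulates such a process in Python rather than Matlab, with an illustrative time constant and amplitude (not values from the RXTE analysis).

```python
import numpy as np

# First-order Gauss-Markov process: correlation time tau, steady-state
# standard deviation sigma. Values are illustrative.

def simulate_gauss_markov(n, dt, tau, sigma, seed=0):
    """Simulate b[k+1] = phi*b[k] + w[k], phi = exp(-dt/tau)."""
    rng = np.random.default_rng(seed)
    phi = np.exp(-dt / tau)
    q = sigma * np.sqrt(1.0 - phi**2)   # keeps steady-state variance at sigma**2
    b = np.zeros(n)
    for k in range(n - 1):
        b[k + 1] = phi * b[k] + q * rng.standard_normal()
    return b

bias = simulate_gauss_markov(n=1000, dt=1.0, tau=300.0, sigma=5.0)
```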

  10. Improved MODIS aerosol retrieval in urban areas using a land classification approach and empirical orthogonal functions

    NASA Astrophysics Data System (ADS)

    Levitan, Nathaniel; Gross, Barry

    2016-10-01

    New, high-resolution aerosol products are required in urban areas to improve the spatial coverage of the products, in terms of both resolution and retrieval frequency. These new products will improve our understanding of the spatial variability of aerosols in urban areas and will be useful in the detection of localized aerosol emissions. Urban aerosol retrieval is challenging for existing algorithms because of the high spatial variability of the surface reflectance, indicating the need for improved urban surface reflectance models. This problem can be stated in the language of novelty detection as the problem of selecting aerosol parameters whose effective surface reflectance spectrum is not an outlier in some space. In this paper, empirical orthogonal functions (EOFs), a reconstruction-based novelty detection technique, are used to perform single-pixel aerosol retrieval using the single angular and temporal sample provided by the MODIS sensor. The empirical orthogonal basis functions are trained for different land classes using the MODIS BRDF MCD43 product. Existing land classification products are used in training and aerosol retrieval. The retrieval is compared against the existing operational MODIS 3 km Dark Target (DT) aerosol product and co-located AERONET data. Based on the comparison, our method allows for a significant increase in retrieval frequency and a moderate decrease in the known biases of MODIS urban aerosol retrievals.
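Reconstruction-based novelty detection with EOFs works as described: fit an orthogonal basis to training spectra, then score a candidate spectrum by its reconstruction error in that basis, with a large error flagging an outlier. The sketch below uses synthetic stand-ins for the MCD43-derived training spectra.

```python
import numpy as np

# Synthetic training spectra: 100 samples in 7 bands, lying (by construction)
# in a 2-dimensional subspace. Illustrative only.
rng = np.random.default_rng(0)
basis_true = rng.standard_normal((2, 7))
train = rng.standard_normal((100, 2)) @ basis_true

mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
eofs = vt[:2]                               # leading 2 EOFs span the training space

def novelty(spectrum, eofs, mean):
    """Reconstruction error of a spectrum in the EOF basis (novelty score)."""
    x = spectrum - mean
    recon = (x @ eofs.T) @ eofs
    return float(np.linalg.norm(x - recon))

inlier = novelty(train[0], eofs, mean)                               # ~0
outlier = novelty(train[0] + 5.0 * rng.standard_normal(7), eofs, mean)  # large
```

In the retrieval, aerosol parameters are chosen so that the atmospherically corrected surface spectrum has a low novelty score for its land class.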

  11. Examining Quality Improvement Programs: The Case of Minnesota Hospitals

    PubMed Central

    Olson, John R; Belohlav, James A; Cook, Lori S; Hays, Julie M

    2008-01-01

    Objective To determine if there is a hierarchy of improvement program adoption by hospitals and outline that hierarchy. Data Sources Primary data were collected in the spring of 2007 via e-survey from 210 individuals representing 109 Minnesota hospitals. Secondary data from 2006 were assembled from the Leapfrog database. Study Design As part of a larger survey, respondents were given a list of improvement programs and asked to identify those programs that are used in their hospital. Data Collection/Data Extraction Rasch Model Analysis was used to assess whether a unidimensional construct exists that defines a hospital's ability to implement performance improvement programs. Linear regression analysis was used to assess the relationship of the Rasch ability scores with Leapfrog Safe Practices Scores to validate the research findings. Principal Findings The results of the study show that hospitals have widely varying abilities in implementing improvement programs. In addition, improvement programs present differing levels of difficulty for hospitals trying to implement them. Our findings also indicate that the ability to adopt improvement programs is important to the overall performance of hospitals. Conclusions There is a hierarchy of improvement programs in the health care context. A hospital's ability to successfully adopt improvement programs is a function of its existing capabilities. As a hospital's capability increases, the ability to successfully implement higher level programs also increases. PMID:18761677
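The Rasch model underlying the analysis places hospitals (ability theta) and improvement programs (difficulty b) on a single scale: the probability that a hospital has adopted a program is logistic in (theta - b), so higher-ability hospitals adopt harder programs. The parameter values below are illustrative, not estimates from the survey.

```python
import math

# Dichotomous Rasch model: P(adopted) = 1 / (1 + exp(-(theta - b))).
# theta = hospital ability, b = program difficulty (illustrative values).

def rasch_prob(theta, b):
    return 1.0 / (1.0 + math.exp(-(theta - b)))

p_easy = rasch_prob(theta=0.5, b=-1.0)   # capable hospital, easy program
p_hard = rasch_prob(theta=0.5, b=2.0)    # same hospital, harder program
```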

  12. A Bayesian method for using simulator data to enhance human error probabilities assigned by existing HRA methods

    DOE PAGES

    Groth, Katrina M.; Smith, Curtis L.; Swiler, Laura P.

    2014-04-05

    In the past several years, several international agencies have begun to collect data on human performance in nuclear power plant simulators [1]. These data provide a valuable opportunity to improve human reliability analysis (HRA), but the improvements will not be realized without implementation of Bayesian methods. Bayesian methods are widely used to incorporate sparse data into models in many parts of probabilistic risk assessment (PRA), but they have not been adopted by the HRA community. In this article, we provide a Bayesian methodology to formally use simulator data to refine the human error probabilities (HEPs) assigned by existing HRA methods. We demonstrate the methodology with a case study, wherein we use simulator data from the Halden Reactor Project to update the probability assignments from the SPAR-H method. The case study demonstrates the ability to use performance data, even sparse data, to improve existing HRA methods. Furthermore, this paper serves as a demonstration of the value of Bayesian methods in improving the technical basis of HRA.
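A minimal version of this kind of update is the conjugate Beta-binomial scheme: center a Beta prior on the HEP assigned by the existing method, treat simulator outcomes as binomial evidence, and read off the posterior mean. The prior strength and counts below are illustrative, not the paper's actual methodology or data.

```python
# Conjugate Bayesian update of a human error probability (HEP).
# Beta(a, b) prior with mean prior_hep and pseudo-sample size prior_strength;
# binomial evidence (errors out of trials) gives a Beta posterior.

def update_hep(prior_hep, prior_strength, errors, trials):
    """Posterior mean HEP after observing simulator data."""
    a = prior_hep * prior_strength
    b = (1.0 - prior_hep) * prior_strength
    a_post = a + errors
    b_post = b + (trials - errors)
    return a_post / (a_post + b_post)

# Illustrative: an HRA method assigns HEP = 0.01; 2 errors in 50 simulator trials.
posterior = update_hep(0.01, prior_strength=20.0, errors=2, trials=50)
```

Even sparse data shifts the estimate: here the posterior lies between the prior (0.01) and the observed error rate (0.04), weighted by the prior strength.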

  13. The Local Ensemble Transform Kalman Filter with the Weather Research and Forecasting Model: Experiments with Real Observations

    NASA Astrophysics Data System (ADS)

    Miyoshi, Takemasa; Kunii, Masaru

    2012-03-01

    The local ensemble transform Kalman filter (LETKF) is implemented with the Weather Research and Forecasting (WRF) model, and real observations are assimilated to assess the newly-developed WRF-LETKF system. The WRF model is a widely-used mesoscale numerical weather prediction model, and the LETKF is an ensemble Kalman filter (EnKF) algorithm particularly efficient in parallel computer architecture. This study aims to provide the basis of future research on mesoscale data assimilation using the WRF-LETKF system, an additional testbed to the existing EnKF systems with the WRF model used in the previous studies. The particular LETKF system adopted in this study is based on the system initially developed in 2004 and has been continuously improved through theoretical studies and wide applications to many kinds of dynamical models including realistic geophysical models. Most recent and important improvements include an adaptive covariance inflation scheme which considers the spatial and temporal inhomogeneity of inflation parameters. Experiments show that the LETKF successfully assimilates real observations and that adaptive inflation is advantageous. Additional experiments with various ensemble sizes show that using more ensemble members improves the analyses consistently.
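The covariance inflation that the adaptive scheme tunes per region and time is, in its simplest multiplicative form, a rescaling of ensemble perturbations about the mean so that the ensemble variance grows by a factor rho. The ensemble and rho below are illustrative.

```python
import numpy as np

# Multiplicative covariance inflation: scale perturbations by sqrt(rho) so the
# ensemble variance grows by rho while the mean is unchanged. rho illustrative.

def inflate(ensemble, rho):
    """Inflate an (n_members, n_state) ensemble about its mean."""
    mean = ensemble.mean(axis=0)
    return mean + np.sqrt(rho) * (ensemble - mean)

ens = np.array([[1.0, 2.0], [3.0, 2.0], [2.0, 5.0]])
inflated = inflate(ens, rho=1.21)    # variance up 21%, mean preserved
```

An adaptive scheme replaces the fixed rho with values estimated online from observation-minus-forecast statistics, varying in space and time.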

  14. A Robust Sound Source Localization Approach for Microphone Array with Model Errors

    NASA Astrophysics Data System (ADS)

    Xiao, Hua; Shao, Huai-Zong; Peng, Qi-Cong

    In this paper, a robust sound source localization approach is proposed that retains good performance even when model errors exist. Compared with previous work in this field, the contributions of this paper are as follows. First, an improved broad-band, near-field array model is proposed. It takes array gain and phase perturbations into account, is based on the actual positions of the elements, and can be used with arbitrary planar array geometries. Second, a subspace model-error estimation algorithm and a Weighted 2-Dimension Multiple Signal Classification (W2D-MUSIC) algorithm are proposed. The subspace model-error estimation algorithm estimates the unknown parameters of the array model, i.e., the gain and phase perturbations and the positions of the elements, with high accuracy, and its performance improves with increasing SNR or number of snapshots. The W2D-MUSIC algorithm, based on the improved array model, is implemented to locate sound sources. Together, these two algorithms compose the robust localization approach, and the more accurate steering vectors they provide can support further processing such as adaptive beamforming. Numerical examples confirm the effectiveness of the proposed approach.
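For orientation, the sketch below implements plain narrowband MUSIC on an ideal uniform linear array; the paper's W2D-MUSIC extends this idea to broadband, near-field, and perturbed arrays, which is not reproduced here. The array geometry, source angle, and noise level are all illustrative.

```python
import numpy as np

# Narrowband MUSIC on a half-wavelength uniform linear array (illustrative).

def music_doa(snapshots, n_sources, n_elems, spacing=0.5):
    """Return the angle (degrees) maximizing the MUSIC pseudospectrum."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)                      # ascending eigenvalues
    En = eigvecs[:, :n_elems - n_sources]                     # noise subspace
    k = np.arange(n_elems)
    best, best_p = 0.0, -np.inf
    for a in np.linspace(-90, 90, 721):                       # 0.25 deg grid
        sv = np.exp(2j * np.pi * spacing * k * np.sin(np.radians(a)))
        p = 1.0 / np.linalg.norm(En.conj().T @ sv) ** 2       # pseudospectrum
        if p > best_p:
            best, best_p = a, p
    return best

# Simulate one source at 20 degrees on an 8-element array, low noise.
rng = np.random.default_rng(0)
n_elems, true_angle = 8, 20.0
k = np.arange(n_elems)
sv = np.exp(2j * np.pi * 0.5 * k * np.sin(np.radians(true_angle)))
sig = rng.standard_normal(200) + 1j * rng.standard_normal(200)
X = np.outer(sv, sig) + 0.01 * (rng.standard_normal((n_elems, 200))
                                + 1j * rng.standard_normal((n_elems, 200)))
est = music_doa(X, n_sources=1, n_elems=n_elems)
```

Gain/phase perturbations and element-position errors distort `sv`, which is exactly why the paper estimates those model errors before applying MUSIC.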

  15. Scoring the correlation of genes by their shared properties using OScal, an improved overlap quantification model.

    PubMed

    Liu, Hui; Liu, Wei; Lin, Ying; Liu, Teng; Ma, Zhaowu; Li, Mo; Zhang, Hong-Mei; Kenneth Wang, Qing; Guo, An-Yuan

    2015-05-27

    Scoring the correlation between two genes by their shared properties is a common and basic task in biological studies. A prospective way to score this correlation is to quantify the overlap between the two sets of homogeneous properties of the two genes. However, the proper model has not been settled; here we focus on the quantification of overlap and propose a more effective model after theoretically comparing 7 existing models. We define three characteristic parameters (d, R, r) of an overlap, which highlight essential differences among the 7 models and group them into two classes. The pros and cons of the two groups of models are then fully examined via their solution spaces in the (d, R, r) coordinate system. Finally, we propose a new model called OScal (Overlap Score calculator), modified from the Poisson distribution (one of the 7 models) to avoid its disadvantages. Tested in assessing gene relations using different data, OScal performs better than existing models. In addition, OScal is a basic mathematical model with very low computation cost and few restrictive conditions, so it can be used in a wide range of research areas to measure the overlap or similarity of two entities.
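The (d, R, r) parameterization is simply the overlap size and the two set sizes. As a baseline (not the OScal model itself, whose exact form is not given here), one classical way to score an overlap is the hypergeometric tail probability of seeing at least d shared items by chance.

```python
from math import comb

# (d, R, r): overlap size and the larger/smaller set sizes. The hypergeometric
# tail is a classical baseline score, shown for illustration only.

def overlap_params(set_a, set_b):
    d = len(set_a & set_b)
    R, r = max(len(set_a), len(set_b)), min(len(set_a), len(set_b))
    return d, R, r

def hypergeom_tail(d, R, r, N):
    """P(overlap >= d) for random sets of sizes R and r from N items."""
    return sum(comb(R, k) * comb(N - R, r - k) for k in range(d, r + 1)) / comb(N, r)

d, R, r = overlap_params({"g1", "g2", "g3"}, {"g2", "g3", "g4", "g5"})
p = hypergeom_tail(d, R, r, N=20)    # small p = more overlap than chance predicts
```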

  16. A Secure and Efficient Handover Authentication Protocol for Wireless Networks

    PubMed Central

    Wang, Weijia; Hu, Lei

    2014-01-01

    Handover authentication protocols are a promising access control technology in the fields of WLANs and mobile wireless sensor networks. In this paper, we first review an efficient handover authentication protocol, named PairHand, together with its existing security attacks and improvements. Then, we present an improved key recovery attack using the linear combination method and reanalyze its feasibility on the improved PairHand protocol. Finally, we present a new handover authentication protocol which not only achieves the same desirable efficiency features as PairHand, but also enjoys provable security in the random oracle model. PMID:24971471

  17. Short note: the experimental geopotential model XGM2016

    NASA Astrophysics Data System (ADS)

    Pail, R.; Fecher, T.; Barnes, D.; Factor, J. F.; Holmes, S. A.; Gruber, T.; Zingerle, P.

    2018-04-01

    As a precursor study for the upcoming combined Earth Gravitational Model 2020 (EGM2020), the Experimental Gravity Field Model XGM2016, parameterized as a spherical harmonic series up to degree and order 719, is computed. XGM2016 shares the same combination methodology as its predecessor model GOCO05c (Fecher et al. in Surv Geophys 38(3): 571-590, 2017. doi: 10.1007/s10712-016-9406-y). The main difference between these models is that XGM2016 is supported by an improved terrestrial data set of 15′ × 15′ gravity anomaly area-means provided by the United States National Geospatial-Intelligence Agency (NGA), resulting in significant upgrades compared to existing combined gravity field models, especially in continental areas such as South America, Africa, parts of Asia, and Antarctica. A combination strategy of relative regional weighting provides for improved performance in near-coastal ocean regions, including regions where the altimetric data are mostly unchanged from previous models. Comparing cumulative height anomalies from both EGM2008 and XGM2016 at degree/order 719 yields differences of 26 cm in Africa and 40 cm in South America. These differences result from including additional information of satellite data, as well as from the improved ground data in these regions. XGM2016 also yields a smoother Mean Dynamic Topography with significantly reduced artifacts, which indicates an improved modeling of the ocean areas.
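A maximum degree of 719 corresponds to a spatial resolution that can be estimated with the standard half-wavelength rule: roughly pi * R_earth / L. The sketch below applies it; the mean Earth radius used is the conventional 6371 km.

```python
import math

# Half-wavelength resolution of a spherical harmonic expansion to degree L:
# approximately pi * R_earth / L.

R_EARTH_KM = 6371.0

def half_wavelength_km(degree):
    return math.pi * R_EARTH_KM / degree

res = half_wavelength_km(719)   # about 28 km for XGM2016
```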

  18. Radioactive threat detection using scintillant-based detectors

    NASA Astrophysics Data System (ADS)

    Chalmers, Alex

    2004-09-01

    We present an update on the performance of AS&E's Radioactive Threat Detection (RTD) sensor technology. A model is presented detailing the components of the scintillant-based RTD system employed in AS&E products aimed at detecting radiological WMD. An overview of recent improvements in the sensors, electrical subsystems, and software algorithms is presented; the resulting improvements in performance are described and sample results shown from existing systems. Advanced and future capabilities are described with an assessment of their feasibility and their application to Homeland Defense.

  19. The T.O.S.CA. Project: research, education and care.

    PubMed

    Bossone, Eduardo; Limongelli, Giuseppe; Malizia, Graziella; Ferrara, Francesco; Vriz, Olga; Citro, Rodolfo; Marra, Alberto Maria; Arcopinto, Michele; Bobbio, Emanuele; Sirico, Domenico; Caliendo, Luigi; Ballotta, Andrea; D'Andrea, Antonello; Frigiola, Alessandro; Isgaard, Jorgen; Saccà, Luigi; Antonio, Cittadini

    2011-12-01

    Despite recent and substantial improvements in diagnostic-therapeutic pathways, a gap persists between the "real world care" and the "optimal care" of patients with chronic heart failure (CHF). We present the T.O.S.CA. Project (Trattamento Ormonale dello Scompenso CArdiaco), an Italian multicenter initiative involving different health care professionals and services that aims to explore the CHF "metabolic pathophysiological model" and to improve the quality of care of HF patients through research and continuing medical education.

  20. The admissible portfolio selection problem with transaction costs and an improved PSO algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Zhang, Wei-Guo

    2010-05-01

    In this paper, we discuss the portfolio selection problem with transaction costs under the assumption that there exist admissible errors on the expected returns and risks of assets. We propose a new admissible efficient portfolio selection model and design an improved particle swarm optimization (PSO) algorithm, because traditional optimization algorithms fail to work efficiently for our proposed problem. Finally, we offer a numerical example to illustrate the effectiveness of the proposed approaches and compare the admissible portfolio efficient frontiers under different constraints.
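A bare-bones PSO for a toy mean-variance portfolio shows the mechanics the paper builds on: particles carry candidate weight vectors, and velocities are pulled toward personal and global bests. The returns, covariance, fitness function, and hyperparameters below are illustrative; the paper's admissible-error bounds and transaction cost terms are not reproduced.

```python
import numpy as np

# Toy PSO: minimize lam * w'Cw - mu'w over long-only weights summing to 1.
# All data and hyperparameters are illustrative.

def pso_portfolio(mu, cov, lam=1.0, n_particles=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(mu)
    pos = rng.random((n_particles, dim))
    vel = np.zeros_like(pos)

    def fitness(w):
        w = np.abs(w)
        w = w / w.sum()                    # project onto the simplex (long-only)
        return lam * w @ cov @ w - mu @ w  # risk penalty minus expected return

    pbest = pos.copy()
    pbest_val = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([fitness(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    w = np.abs(gbest)
    return w / w.sum()

mu = np.array([0.10, 0.12, 0.07])
cov = np.diag([0.04, 0.09, 0.01])
w = pso_portfolio(mu, cov)        # feasible long-only weight vector
```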

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebetrau, A.M.

    Work is underway at Pacific Northwest Laboratory (PNL) to improve the probabilistic analysis used to model pressurized thermal shock (PTS) incidents in reactor pressure vessels, and, further, to incorporate these improvements into the existing Vessel Integrity Simulation Analysis (VISA) code. Two topics related to work on input distributions in VISA are discussed in this paper. The first involves the treatment of flaw size distributions and the second concerns errors in the parameters of the (Guthrie) equation used to compute ΔRT_NDT, the shift in reference temperature for nil ductility transition.

  2. Stacking transgenes in forest trees.

    PubMed

    Halpin, Claire; Boerjan, Wout

    2003-08-01

    Huge potential exists for improving plant raw materials and foodstuffs via metabolic engineering. To date, progress has mostly been limited to modulating the expression of single genes of well-studied pathways, such as the lignin biosynthetic pathway, in model species. However, a recent report illustrates a new level of sophistication in metabolic engineering by overexpressing one lignin enzyme while simultaneously suppressing the expression of another lignin gene in a tree, aspen. This novel approach to multi-gene manipulation has succeeded in concurrently improving several wood-quality traits.

  3. Ammonia flux above fertilized corn in central Illinois, USA, using relaxed eddy accumulation

    USDA-ARS?s Scientific Manuscript database

    The objective of this research is to quantify NH3 flux above an intensively managed cornfield in the Midwestern United States to improve understanding of NH3 emissions and evaluations of new and existing emission models. A relaxed eddy accumulation (REA) system was deployed above a corn canopy in ce...

  4. Collaborative Care in Schools: Enhancing Integration and Impact in Youth Mental Health

    ERIC Educational Resources Information Center

    Lyon, Aaron R.; Whitaker, Kelly; French, William P.; Richardson, Laura P.; Wasse, Jessica Knaster; McCauley, Elizabeth

    2016-01-01

    Collaborative care (CC) is an innovative approach to integrated mental health service delivery that focuses on reducing access barriers, improving service quality and lowering health care expenditures. A large body of evidence supports the effectiveness of CC models with adults and, increasingly, for youth. Although existing studies examining…

  5. Cultural Challenges in Adapting Lesson Study to a Philippines Setting

    ERIC Educational Resources Information Center

    Ebaeguin, Marlon; Stephens, Max

    2014-01-01

    Promising improved student and teacher learning, Japanese lesson study has attracted many international educators to try to implement it in their own contexts. However, a simple transference model of implementation is likely to meet difficulties. Key determinants of any adaptation will be differences between existing conventions of pedagogy and of…

  6. Information Technology for Schools: Creating Practical Knowledge To Improve Student Performance. The Jossey-Bass Education Series.

    ERIC Educational Resources Information Center

    Kallick, Bena, Ed.; Wilson, James M., III, Ed.

    This book chronicles practitioners' struggles in implementing information technology, identifies the existing barriers to implementation, and provides a set of frameworks from the current understanding of this process to support learning through information creation. The chapters are: chapter 1, "A Model for Organizational Learning: The…

  7. Transgender Individuals' Workplace Experiences: The Applicability of Sexual Minority Measures and Models

    ERIC Educational Resources Information Center

    Brewster, Melanie E.; Velez, Brandon; DeBlaere, Cirleen; Moradi, Bonnie

    2012-01-01

    The present study explored whether 3 existing measures of workplace constructs germane to the experiences of sexual minority people could be modified to improve their applicability with transgender individuals. To this end, the Workplace Heterosexist Experiences Questionnaire (WHEQ; C. R. Waldo, 1999); the Lesbian, Gay, Bisexual, and Transgendered…

  8. 78 FR 59654 - Possible Models for the Administration and Support of Discipline-Specific Guidance Groups for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-27

    ... science by improving coordination across a broad range of forensic science disciplines. The new initiative... intended to provide structured forums for the exchange of ideas among operational, technical, research, and... needs of forensic science research and measurement standards, and verifying the scientific basis exists...

  9. From Quick Start Teams to Home Teams: The Duke TQM Experience.

    ERIC Educational Resources Information Center

    Lubans, John; Gordon, Heather

    This paper describes the Duke University Libraries' transition in early 1994 from its traditional hierarchical model to an organization emphasizing Total Quality Management (TQM) concepts such as self-managing teams and continuous improvement. Existing conditions at the libraries that played a role in the decision to switch included: (1) rising…

  10. Raman spectra of lignin model compounds

    Treesearch

    Umesh P. Agarwal; Richard S. Reiner; Ashok K. Pandey; Sally A. Ralph; Kolby C. Hirth; Rajai H. Atalla

    2005-01-01

    To fully exploit the value of Raman spectroscopy for analyzing lignins and lignin containing materials, a detailed understanding of lignins’ Raman spectra needs to be achieved. Although advances made thus far have led to significant growth in application of Raman techniques, further developments are needed to improve upon the existing knowledge. Considering that lignin...

  11. Improving the Effectiveness of English Vocabulary Review by Integrating ARCS with Mobile Game-Based Learning

    ERIC Educational Resources Information Center

    Wu, Ting-Ting

    2018-01-01

    Memorizing English vocabulary is often considered uninteresting, and a lack of motivation exists during learning activities. Moreover, most vocabulary practice systems automatically select words from articles and do not provide integrated model methods for students. Therefore, this study constructed a mobile game-based English vocabulary practice…

  12. Characterization of Orbital Debris Via Hyper-Velocity Ground-Based Tests

    NASA Technical Reports Server (NTRS)

    Cowardin, Heather

    2015-01-01

    The goal is to replicate a hypervelocity fragmentation event using modern-day spacecraft materials and construction techniques in order to improve the existing DoD and NASA breakup models. DebriSat is intended to be representative of modern LEO satellites. Major design decisions were reviewed and approved by Aerospace subject matter experts from different disciplines. DebriSat includes seven major subsystems: attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated from the same materials. A key laboratory-based test supporting the development of the DoD and NASA satellite breakup models, the Satellite Orbital debris Characterization Impact Test (SOCIT), was conducted at AEDC in 1992. Breakup models based on SOCIT have supported many applications and have matched on-orbit events reasonably well over the years.

  13. Improving material removal determinacy based on the compensation of tool influence function

    NASA Astrophysics Data System (ADS)

    Zhong, Bo; Chen, Xian-hua; Deng, Wen-hui; Zhao, Shi-jie; Zheng, Nan

    2018-03-01

    In computer-controlled optical surfacing (CCOS), the key to correcting the surface error of optical components is ensuring consistency between the simulated tool influence function (TIF) and the actual TIF. The existing removal model usually adopts a fixed-point TIF to remove material along the planned path and velocity, treating the polishing process as linear and time-invariant. In the actual polishing process, however, the TIF is a function of the feed speed. In this paper, the relationship between the actual TIF and the feed speed (i.e., the compensation relationship between static removal and dynamic removal) is determined experimentally. The existing removal model is then modified based on this compensation relationship to improve the agreement between simulated and actual processing. Finally, surface error correction tests were carried out. The results show that the fitting degree between the simulated and experimental surfaces is better than 88%, and the surface correction accuracy can be better than λ/10 (λ = 632.8 nm).
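
    The compensation idea can be sketched numerically. The following is a minimal illustration, not the paper's model: it assumes a Gaussian static TIF and a hypothetical linear correction factor standing in for the experimentally determined static-vs-dynamic relationship (all names and constants are invented for demonstration).

```python
import numpy as np

def static_tif(r, peak=1.0, sigma=1.0):
    """Fixed-point (static) Gaussian tool influence function:
    removal depth per unit dwell time at radial distance r."""
    return peak * np.exp(-r**2 / (2 * sigma**2))

def dynamic_tif(r, feed_speed, k=0.15, **kw):
    """Feed-speed-compensated TIF sketch: removal per pass scales as
    1/v (dwell time), with an assumed empirical correction (1 - k*v)
    standing in for the measured compensation relationship."""
    return static_tif(r, **kw) * (1.0 - k * feed_speed) / feed_speed

r = np.linspace(0.0, 3.0, 61)
slow = dynamic_tif(r, feed_speed=1.0)  # slower feed: longer dwell
fast = dynamic_tif(r, feed_speed=2.0)  # faster feed: shorter dwell
```

    A fixed-point model would use `static_tif` everywhere; the compensated version lets simulated removal track the feed-speed schedule along the planned path.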

  14. Improving Metallic Thermal Protection System Hypervelocity Impact Resistance Through Design of Experiments Approach

    NASA Technical Reports Server (NTRS)

    Poteet, Carl C.; Blosser, Max L.

    2001-01-01

    A design of experiments approach has been implemented using computational hypervelocity impact simulations to determine the most effective place to add mass to an existing metallic Thermal Protection System (TPS) to improve hypervelocity impact protection. Simulations were performed using axisymmetric models in CTH, a shock-physics code developed by Sandia National Laboratories, and validated by comparison with existing test data. The axisymmetric models were then used in a statistical sensitivity analysis to determine the influence of five design parameters on the degree of hypervelocity particle dispersion. Several damage metrics were identified and evaluated. Damage metrics related to the extent of substructure damage were seen to produce misleading results; however, damage metrics related to the degree of dispersion of the hypervelocity particle produced results that corresponded to physical intuition. Based on analysis of variance results, it was concluded that the most effective way to increase hypervelocity impact resistance is to increase the thickness of the outer foil layer. Increasing the spacing between the outer surface and the substructure is also very effective at increasing dispersion.

  15. Hydraulic and Condition Assessment of Existing Sewerage Network: A Case Study of an Educational Institute

    NASA Astrophysics Data System (ADS)

    Sourabh, Nishant; Timbadiya, P. V.

    2018-04-01

    The hydraulic simulation of an existing sewerage network provides information about critical points, helps assess deteriorating conditions, and supports rehabilitation of the existing network and future expansion. In the present study, hydraulic and condition assessment of the existing network of an educational institute (Sardar Vallabhbhai National Institute of Technology, Surat, Gujarat, India), with an area of 100 ha and ground levels in the range of 5.0-9.0 m above mean sea level, has been carried out using sewage flow simulation for existing and future scenarios in SewerGEMS V8i. The paper describes the features of the institute's 4.79 km long sewerage network, followed by network model simulation for the aforesaid scenarios and recommendations for improving the existing network for future use. The total sewer loads for the present and future scenarios are 1.67 and 3.62 million litres per day (MLD), considering a peak factor of 3 on the basis of population. The hydraulic simulation of the existing scenario indicated a depth-to-diameter (d/D) ratio in the range of 0.02-0.48 and a velocity range of 0.08-0.53 m/s for the existing network. For the future scenario, the existing network needs to be modified: a total of 11 conduits (length: 464.8 m) should be replaced with the next larger available diameter, i.e., 350 mm. The study provides a methodology for condition assessment of an existing network and its utilization per the guidelines of the Central Public Health and Environmental Engineering Organization (2013), which municipal/public health engineers can use to assess an existing sewerage network for serviceability and future improvement.
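
    The d/D and velocity checks reported above follow from standard partial-flow hydraulics. A minimal sketch using Manning's formula for a circular conduit running partly full; the roughness and slope values here are assumptions for illustration, not taken from the study:

```python
import math

def partial_flow(D, dD, n=0.013, S=0.0005):
    """Velocity and discharge in a circular sewer running partly full
    (Manning's formula). D: diameter (m), dD: depth/diameter ratio,
    n: Manning roughness, S: invert slope."""
    theta = 2 * math.acos(1 - 2 * dD)            # wetted central angle (rad)
    A = (D**2 / 8) * (theta - math.sin(theta))   # flow area (m^2)
    P = D * theta / 2                            # wetted perimeter (m)
    R = A / P                                    # hydraulic radius (m)
    v = (1 / n) * R**(2 / 3) * math.sqrt(S)      # Manning velocity (m/s)
    return v, A * v                              # velocity, discharge (m^3/s)

# A 350 mm pipe at the study's maximum reported d/D of 0.48, on an
# assumed mild slope: the resulting velocity falls in the reported range.
v, Q = partial_flow(D=0.35, dD=0.48)
```

    Such a check is the basic screening a modeler runs before replacing conduits: if v falls below the self-cleansing minimum or d/D exceeds the design limit, the pipe is a candidate for upsizing.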

  16. Research of autonomous celestial navigation based on new measurement model of stellar refraction

    NASA Astrophysics Data System (ADS)

    Yu, Cong; Tian, Hong; Zhang, Hui; Xu, Bo

    2014-09-01

    Autonomous celestial navigation based on stellar refraction has attracted widespread attention for its high accuracy and full autonomy. In this navigation method, establishing an accurate stellar refraction measurement model is the foundation and the key to achieving high-accuracy navigation. However, existing measurement models are limited by the uncertainty of atmospheric parameters. Temperature, pressure, and other factors that affect stellar refraction within the Earth's stratosphere are studied, and a model of atmospheric variation with altitude is derived from standard atmospheric data. Furthermore, a novel measurement model of stellar refraction over a continuous range of altitudes from 20 km to 50 km is produced by modifying the fixed-altitude (25 km) measurement model; a state equation including orbit perturbations is established, and a simulation is performed using an improved extended Kalman filter. The results show that the new model improves navigation accuracy and has practical application value.
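
    The altitude dependence at the heart of such a measurement model can be sketched with an exponential-atmosphere assumption: refraction scales with air density, which falls off roughly exponentially with tangent height. This is only an illustration of extending a fixed-altitude reference across 20-50 km; the reference refraction and scale height below are order-of-magnitude assumptions, not the paper's fitted values.

```python
import math

def refraction_angle(h_km, R_ref=170.0, h_ref=25.0, H=7.0):
    """Illustrative stellar refraction (arcsec) at tangent height h_km,
    extrapolated from an assumed reference value R_ref at h_ref km
    using an exponential density falloff with scale height H km."""
    return R_ref * math.exp(-(h_km - h_ref) / H)

# A continuous-altitude model replaces a single fixed-altitude value:
angles = [refraction_angle(h) for h in range(20, 51, 5)]
```

    In a filter, each observed refraction angle is inverted through this curve to a tangent height, which constrains the spacecraft position.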

  17. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions: the basic principle is to incorporate measurement information into a model to reduce its error. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head, and snowpack into hydrologic models. More recently, remotely sensed retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage, and land surface temperature have been successfully assimilated into hydrological models. Assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (in time and space), and poorly determined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. Its basic design philosophy is to break DA down into a set of building blocks programmed in object-oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get and set variables (or parameters), and free the model once DA is completed. An open-source interface for hydrological models capable of all these tasks already exists: OpenMI, an open standard interface adopted by key hydrological model providers. It defines a universal approach for interacting with hydrological models during simulation to exchange data at runtime, thus facilitating interactions between models and data sources. The interface is flexible enough that models can interact even if they are coded in different languages, represent processes from different domains, or have different spatial and temporal resolutions. An open-source framework that bridges OpenMI and OpenDA is presented. The framework provides a generic and easy means for any OpenMI-compliant model to assimilate observations. A test case is presented using MikeSHE, an OpenMI-compliant, fully coupled, integrated hydrological model that can accurately simulate the feedback dynamics of overland flow, the unsaturated zone, and the saturated zone.
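
    At its core, each DA analysis step blends a model forecast with observations, weighted by their respective uncertainties. A minimal sketch of one linear Kalman analysis step, the simplest member of the family of algorithms such toolboxes provide (the toy state, observation operator, and covariances below are assumptions for illustration, not OpenDA API calls):

```python
import numpy as np

def kalman_update(x_f, P_f, y, H, R):
    """One Kalman analysis step: combine forecast state x_f (covariance
    P_f) with observation y (operator H, covariance R)."""
    S = H @ P_f @ H.T + R                  # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_a = x_f + K @ (y - H @ x_f)          # analysis state
    P_a = (np.eye(len(x_f)) - K @ H) @ P_f # analysis covariance
    return x_a, P_a

# Toy example: two state variables (e.g. discharge and soil moisture),
# only the first is observed, with equal forecast and observation error.
x_f = np.array([1.0, 2.0])
P_f = np.diag([1.0, 1.0])
H = np.array([[1.0, 0.0]])
y = np.array([2.0])
R = np.array([[1.0]])
x_a, P_a = kalman_update(x_f, P_f, y, H, R)
```

    With equal error variances the analysis splits the difference on the observed component and leaves the unobserved one unchanged; a framework like the one described automates exactly this get-state/update/set-state cycle through the OpenMI interface.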

  18. Lentivirus-mediated platelet gene therapy of murine hemophilia A with pre-existing anti-FVIII immunity

    PubMed Central

    Kuether, E. L.; Schroeder, J. A.; Fahs, S. A.; Cooley, B. C.; Chen, Y.; Montgomery, R. R.; Wilcox, D. A.; Shi, Q.

    2012-01-01

    Background: The development of inhibitory antibodies, referred to as inhibitors, against exogenous FVIII in a significant subset of patients with hemophilia A remains a persistent challenge to the efficacy of protein replacement therapy. Our previous studies using the transgenic approach provided proof-of-principle that platelet-specific expression could be successful for treating hemophilia A in the presence of inhibitory antibodies. Objective: To investigate a clinically translatable approach for platelet gene therapy of hemophilia A with pre-existing inhibitors. Methods: Platelet-FVIII expression in pre-immunized FVIIInull mice was introduced by transplantation of lentivirus-transduced bone marrow or enriched hematopoietic stem cells. FVIII expression was determined by a chromogenic assay. The transgene copy number per cell was quantitated by real-time PCR. Inhibitor titer was measured by Bethesda assay. Phenotypic correction was assessed by the tail clipping assay and an electrolytic-induced venous injury model. Integration sites were analyzed by LAM-PCR. Results: Therapeutic levels of platelet-FVIII expression were sustained long-term without evoking an anti-FVIII memory response in the transduced pre-immunized recipients. The tail clip survival test and the electrolytic injury model confirmed that hemostasis was improved in the treated animals. Sequential bone marrow transplants showed sustained platelet-FVIII expression resulting in phenotypic correction in pre-immunized secondary and tertiary recipients. Conclusions: Lentivirus-mediated platelet-specific gene transfer improves hemostasis in hemophilia A mice with pre-existing inhibitors, indicating that this approach may be a promising strategy for gene therapy of hemophilia A even in the high-risk setting of pre-existing inhibitory antibodies. PMID:22632092

  19. Analysis and improvement measures of flight delay in China

    NASA Astrophysics Data System (ADS)

    Zang, Yuhang

    2017-03-01

    This paper first establishes a principal component regression model to analyze flight-delay data quantitatively, using principal component analysis to extract three principal component factors of flight delays. The least squares method is then used to fit the factors, and the regression equation is obtained by substitution; the analysis finds that the main cause of flight delays is the airlines themselves, followed by weather and traffic. Addressing these findings, the paper focuses on the controllable aspect of traffic flow control. An adaptive genetic queuing model is established for the runway terminal area, and an optimization method is developed for fifteen planes landing simultaneously on three runways, based on Beijing Capital International Airport. Comparing the results with the existing first-come-first-served (FCFS) algorithm demonstrates the superiority of the model.
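
    Principal component regression, as described above, projects the predictors onto their leading principal components and then regresses the response on the component scores. A minimal numpy-only sketch on synthetic data; the three columns standing in for airline, weather, and traffic factors are invented for illustration, not the paper's dataset:

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: standardize X, take the leading
    principal components via SVD, then least-squares fit y on the scores."""
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T                     # PC scores
    beta, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
    return beta, Vt[:n_components]

rng = np.random.default_rng(0)
# Synthetic stand-ins for three delay drivers; the first dominates.
X = rng.normal(size=(200, 3))
y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(scale=0.1, size=200)
beta, components = pcr_fit(X, y, n_components=3)
```

    Ranking the fitted coefficients against the component loadings is what lets the paper attribute delays to airlines first, then weather and traffic.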

  20. Explicating an Evidence-Based, Theoretically Informed, Mobile Technology-Based System to Improve Outcomes for People in Recovery for Alcohol Dependence

    PubMed Central

    Gustafson, David H.; Isham, Andrew; Baker, Timothy; Boyle, Michael G.; Levy, Michael

    2011-01-01

    Post treatment relapse to uncontrolled alcohol use is common. More cost-effective approaches are needed. We believe currently available communication technology can use existing models for relapse prevention to cost-effectively improve long-term relapse prevention. This paper describes: 1) research-based elements of alcohol related relapse prevention and how they can be encompassed in Self Determination Theory (SDT) and Marlatt’s Cognitive Behavioral Relapse Prevention Model, 2) how technology could help address the needs of people seeking recovery, 3) a technology-based prototype, organized around Self Determination Theory and Marlatt’s model and 4) how we are testing a system based on the ideas in this article and related ethical and operational considerations. PMID:21190410

  1. Randomized trial of two e-learning programs for oral health students on secondary prevention of eating disorders.

    PubMed

    DeBate, Rita D; Severson, Herbert H; Cragun, Deborah; Bleck, Jennifer; Gau, Jeff; Merrell, Laura; Cantwell, Carley; Christiansen, Steve; Koerber, Anne; Tomar, Scott L; Brown, Kelli McCormack; Tedesco, Lisa A; Hendricson, William; Taris, Mark

    2014-01-01

    The purpose of this study was to test whether an interactive, web-based training program is more effective than an existing, flat-text, e-learning program at improving oral health students' knowledge, motivation, and self-efficacy to address signs of disordered eating behaviors with patients. Eighteen oral health classes of dental and dental hygiene students were randomized to either the Intervention (interactive program; n=259) or Alternative (existing program; n=58) conditions. Hierarchical linear modeling assessed for posttest differences between groups while controlling for baseline measures. Improvement among Intervention participants was superior to those who completed the Alternative program for three of the six outcomes: benefits/barriers, self-efficacy, and skills-based knowledge (effect sizes ranging from 0.43 to 0.87). This study thus suggests that interactive training programs may be better than flat-text e-learning programs for improving the skills-based knowledge and self-efficacy necessary for behavior change.

  2. Improved dense trajectories for action recognition based on random projection and Fisher vectors

    NASA Astrophysics Data System (ADS)

    Ai, Shihui; Lu, Tongwei; Xiong, Yudian

    2018-03-01

    As an important application of intelligent monitoring systems, action recognition in video has become a very important research area of computer vision. To improve the accuracy of action recognition based on improved dense trajectories, an advanced vector-encoding method is introduced that combines Fisher vectors with random projection. The method reduces the dimensionality of the trajectory features by projecting the high-dimensional trajectory descriptor into a low-dimensional subspace via random projection, after defining and analyzing a Gaussian mixture model; a GMM-FV hybrid model is introduced to encode the trajectory feature vector and reduce its dimension. The computational complexity is reduced by using random projection to shrink the Fisher coding vector. Finally, a linear SVM classifier is used to predict labels. We tested the algorithm on the UCF101 and KTH datasets. Compared with some existing algorithms, the results show that the method not only reduces computational complexity but also improves the accuracy of action recognition.
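
    The random projection step reduces descriptor dimensionality while approximately preserving pairwise geometry (the Johnson-Lindenstrauss property). A minimal sketch on synthetic descriptors; the 426-dimensional input is chosen to echo the improved-dense-trajectories descriptor length, and all values are illustrative rather than from the paper:

```python
import numpy as np

def random_project(X, out_dim, seed=0):
    """Gaussian random projection: map d-dimensional descriptors to
    out_dim dimensions. Scaling by 1/sqrt(out_dim) keeps expected
    squared norms unchanged, so distances are roughly preserved."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    R = rng.normal(scale=1.0 / np.sqrt(out_dim), size=(d, out_dim))
    return X @ R

# 50 synthetic trajectory descriptors, projected 426 -> 64 dimensions.
X = np.random.default_rng(1).normal(size=(50, 426))
Z = random_project(X, out_dim=64)
```

    In the pipeline described, this projection shrinks the vectors before (and after) Fisher-vector encoding, which is where the computational savings come from.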

  3. Carcinogenicity and Mutagenicity Data: New Initiatives to ...

    EPA Pesticide Factsheets

    Current models for prediction of chemical carcinogenicity and mutagenicity rely upon a relatively small number of publicly available data resources, where the data being modeled are highly summarized and aggregated representations of the actual experimental results. A number of new initiatives are underway to improve access to existing public carcinogenicity and mutagenicity data for use in modeling, as well as to encourage new approaches to the use of data in modeling. Rodent bioassay results from the NIEHS National Toxicology Program (NTP) and the Berkeley Carcinogenic Potency Database (CPDB) have provided the largest public data resources for building carcinogenicity prediction models to date. However, relatively few and limited representations of these data have actually informed existing models. Initiatives such as EPA's DSSTox Database Network offer elaborated and quality-reviewed presentations of the CPDB and expanded data linkages and coverage of chemical space for carcinogenicity and mutagenicity. In particular, the latest published DSSTox CPDBAS structure-data file includes a number of species-specific and summary activity fields, including a species-specific normalized score for carcinogenic potency (TD50) and various weighted summary activities. These data are being incorporated into PubChem to provide broad…

  4. Peatlands through the Last Glacial Cycle: Evidence and Model Results

    NASA Astrophysics Data System (ADS)

    Kleinen, T.; Treat, C. C.; Brovkin, V.

    2017-12-01

    The spatiotemporal distribution of peatlands prior to the last glacial maximum (LGM) is largely unknown. However, some evidence of non-extant peatlands is available in the form of buried organic-rich sediments. We have undertaken a synthesis of these "buried" peatlands from > 1000 detailed stratigraphic descriptions and combined it with data on extant peatlands to derive a first synthesis of global peatland extent through the last glacial cycle. We present results of this synthesis in combination with modeling results, in which we determined peatland extents and carbon stocks from a transient simulation of the last glacial cycle with the CLIMBER2-LPJ model. We show that peat has existed in boreal latitudes at all times since the last interglacial, that evidence for tropical peatlands exists for the last 50,000 yrs, and that the model results in general agree well with the collected evidence of past peatlands, allowing a first estimate of peat carbon stock changes through the last glacial cycle. We discuss data and model limitations, with a focus on requirements for improving model-based peatland estimates.

  5. Mosquito population dynamics from cellular automata-based simulation

    NASA Astrophysics Data System (ADS)

    Syafarina, Inna; Sadikin, Rifki; Nuraini, Nuning

    2016-02-01

    In this paper we present a model for simulating mosquito-vector population dynamics. The simulation consists of two stages: demography and dispersal dynamics. For the demography stage we follow an existing model of the mosquito life cycle, while dispersal of the vector is simulated with a cellular automata-based model in which each individual is able to move to neighbouring grid cells via a random walk. The model can also represent an immunity factor for each grid cell. We ran simulations to evaluate the model's correctness and conclude that it behaves correctly; however, realistic parameters still need to be identified to match real data.
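
    The dispersal stage can be sketched compactly: each individual on a lattice takes one random-walk step per tick. The grid size, release point, and boundary handling below are assumptions for illustration, not the paper's configuration.

```python
import numpy as np

MOVES = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # 4-neighbour steps

def disperse(grid, rng):
    """One cellular-automaton dispersal step: every mosquito in every
    cell random-walks to a 4-neighbour cell; steps off the lattice are
    clipped back to the boundary."""
    n, m = grid.shape
    new = np.zeros_like(grid)
    for i in range(n):
        for j in range(m):
            for _ in range(int(grid[i, j])):
                di, dj = MOVES[rng.integers(4)]
                ni = min(max(i + di, 0), n - 1)
                nj = min(max(j + dj, 0), m - 1)
                new[ni, nj] += 1
    return new

rng = np.random.default_rng(42)
grid = np.zeros((9, 9), dtype=int)
grid[4, 4] = 100          # release 100 individuals at the centre cell
for _ in range(5):
    grid = disperse(grid, rng)
```

    Dispersal conserves the population while spreading it outward; a demography stage (births/deaths per cell) and a per-cell immunity factor would be applied between dispersal steps.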

  6. Search for Muonic Dark Forces at BABAR

    NASA Astrophysics Data System (ADS)

    Godang, Romulus

    2017-04-01

    Many models of physics beyond the Standard Model predict the existence of light Higgs states, dark photons, and new gauge bosons mediating interactions between dark sectors and the Standard Model. Using the full data sample collected with the BABAR detector at the PEP-II e+e- collider, we report searches for a light non-Standard-Model Higgs boson, a dark photon, and a new muonic dark force mediated by a gauge boson (Z') coupling only to the second and third lepton families. Our results significantly improve upon the current bounds and further constrain the remaining allowed parameter space.

  7. Reusable Launch Vehicle (RLV) Market Analysis Model

    NASA Technical Reports Server (NTRS)

    Prince, Frank A.

    1999-01-01

    The RLV Market Analysis model is at best a rough-order approximation of actual market behavior. However, it does give a quick indication of whether sufficient flights exist to enable an economically viable RLV, and of the assumptions necessary for the vehicle to capture those flights. Additional analysis, market research, and updating with the latest information on payloads and launches would improve the model. Plans are to update the model as new information becomes available and new requirements are levied. This tool will continue to be a vital part of NASA's RLV business analysis capability for the foreseeable future.

  8. Modelling of additive manufacturing processes: a review and classification

    NASA Astrophysics Data System (ADS)

    Stavropoulos, Panagiotis; Foteinopoulos, Panagis

    2018-03-01

    Additive manufacturing (AM) is a very promising technology; however, there are a number of open issues related to the different AM processes. The literature on modelling the existing AM processes is reviewed and classified. A categorization of the different AM processes in process groups, according to the process mechanism, has been conducted and the most important issues are stated. Suggestions are made as to which approach is more appropriate according to the key performance indicator desired to be modelled and a discussion is included as to the way that future modelling work can better contribute to improving today's AM process understanding.

  9. Assessment of Higher-Order RANS Closures in a Decelerated Planar Wall-Bounded Turbulent Flow

    NASA Technical Reports Server (NTRS)

    Jeyapaul, Elbert; Coleman, Gary N.; Rumsey, Christopher L.

    2014-01-01

    A reference DNS database is presented, which includes third- and fourth-order moment budgets for unstrained and strained planar channel flow. Existing RANS closure models for third- and fourth-order terms are surveyed, and new model ideas are introduced. The various models are then compared with the DNS data term by term using a priori testing of the higher-order budgets of turbulence transport, velocity-pressure-gradient, and dissipation for both the unstrained and strained databases. Generally, the models for the velocity-pressure-gradient terms are most in need of improvement.

  10. High resolution infrared datasets useful for validating stratospheric models

    NASA Technical Reports Server (NTRS)

    Rinsland, Curtis P.

    1992-01-01

    An important objective of the High Speed Research Program (HSRP) is to support research in the atmospheric sciences that will improve the basic understanding of the circulation and chemistry of the stratosphere and lead to an interim assessment of the impact of a projected fleet of High Speed Civil Transports (HSCT's) on the stratosphere. As part of this work, critical comparisons between models and existing high quality measurements are planned. These comparisons will be used to test the reliability of current atmospheric chemistry models. Two suitable sets of high resolution infrared measurements are discussed.

  11. Improving ontology matching with propagation strategy and user feedback

    NASA Astrophysics Data System (ADS)

    Li, Chunhua; Cui, Zhiming; Zhao, Pengpeng; Wu, Jian; Xin, Jie; He, Tianxu

    2015-07-01

    Markov logic networks, which unify probabilistic graphical models and first-order logic, provide an excellent framework for ontology matching. The existing approach requires a threshold to produce matching candidates and uses a small set of constraints as a filter to select the final alignments. We introduce a novel match propagation strategy to model the influences between potential entity mappings across ontologies, which helps identify correct correspondences and recover missed ones. Because estimating an appropriate threshold is difficult, we propose an interactive method for threshold selection through which we obtain an additional measurable improvement. Experiments on a public dataset demonstrate the effectiveness of the proposed approach in terms of the quality of the resulting alignment.

  12. Zoonotic Transmission of Waterborne Disease: A Mathematical Model.

    PubMed

    Waters, Edward K; Hamilton, Andrew J; Sidhu, Harvinder S; Sidhu, Leesa A; Dunbar, Michelle

    2016-01-01

    Waterborne parasites that infect both humans and animals are common causes of diarrhoeal illness, but the relative importance of transmission between humans and animals and vice versa remains poorly understood. Transmission of infection from animals to humans via environmental reservoirs, such as water sources, has attracted attention as a potential source of endemic and epidemic infections, but existing mathematical models of waterborne disease transmission have limitations for studying this phenomenon, as they only consider contamination of environmental reservoirs by humans. This paper develops a mathematical model that represents the transmission of waterborne parasites within and between both animal and human populations. It also improves upon existing models by including animal contamination of water sources explicitly. Linear stability analysis and simulation results, using realistic parameter values to describe Giardia transmission in rural Australia, show that endemic infection of an animal host with zoonotic protozoa can result in endemic infection in human hosts, even in the absence of person-to-person transmission. These results imply that zoonotic transmission via environmental reservoirs is important.
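
    The structure of such a model can be sketched as a two-host SIW system: susceptible/infected humans (h) and animals (a) who both shed into a shared water compartment W and are infected only through W, with no person-to-person term. The rates below are invented for illustration, not the Giardia parameter values fitted in the paper; forward-Euler integration stands in for a proper ODE solver.

```python
def simulate(days=400, dt=0.1):
    """Toy two-host waterborne SIW model. Fractions of each host
    population: Sh+Ih = 1 (humans), Sa+Ia = 1 (animals); W is pathogen
    concentration in the shared water reservoir."""
    Sh, Ih, Sa, Ia, W = 0.99, 0.01, 0.9, 0.1, 0.0
    bh, ba = 0.5, 0.5     # water-to-host transmission rates (assumed)
    gh, ga = 0.1, 0.05    # recovery rates (assumed)
    sh, sa = 0.2, 1.0     # shedding rates into water (assumed)
    xi = 0.5              # pathogen decay rate in water (assumed)
    for _ in range(int(days / dt)):
        dSh = -bh * Sh * W + gh * Ih
        dIh = bh * Sh * W - gh * Ih
        dSa = -ba * Sa * W + ga * Ia
        dIa = ba * Sa * W - ga * Ia
        dW = sh * Ih + sa * Ia - xi * W   # both hosts contaminate water
        Sh += dt * dSh
        Ih += dt * dIh
        Sa += dt * dSa
        Ia += dt * dIa
        W += dt * dW
    return Ih, Ia, W

Ih, Ia, W = simulate()
```

    With these illustrative rates, animal shedding alone sustains a nonzero water concentration and hence endemic human infection, mirroring the paper's conclusion that zoonotic transmission via environmental reservoirs can maintain infection without person-to-person spread.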

  13. Fluorescence microscopy point spread function model accounting for aberrations due to refractive index variability within a specimen.

    PubMed

    Ghosh, Sreya; Preza, Chrysanthe

    2015-07-01

    A three-dimensional (3-D) point spread function (PSF) model for wide-field fluorescence microscopy, suitable for imaging samples with variable refractive index (RI) in multilayered media, is presented. This PSF model is a key component for accurate 3-D image restoration of thick biological samples, such as lung tissue. Microscope- and specimen-derived parameters are combined with a rigorous vectorial formulation to obtain a new PSF model that accounts for additional aberrations due to specimen RI variability. Experimental evaluation and verification of the PSF model were accomplished using images of 175-nm fluorescent beads in a controlled test sample. Experimental validation of the advantage of using improved PSFs in depth-variant restoration was accomplished by restoring experimental data from beads (6 μm in diameter) mounted in a sample with RI variation. In this study, an improvement in restoration accuracy in the range of 18 to 35% was observed when PSFs from the proposed model were used instead of PSFs from an existing model. The new PSF model was further validated by showing that its prediction matches an experimental PSF (determined from 175-nm beads located below a thick rat lung slice) with 42% better accuracy than the current model's prediction.

  14. Assessing Spectral Shortwave Cloud Observations at the Southern Great Plains Facility

    NASA Technical Reports Server (NTRS)

    McBride, P. J.; Marshak, A.; Wiscombe, W. J.; Flynn, C. J.; Vogelmann, A. M.

    2012-01-01

    The Atmospheric Radiation Measurement (ARM) program (now Atmospheric System Research) was established, in part, to improve radiation models so that they could be used reliably to compute radiation fluxes through the atmosphere, given knowledge of the surface albedo, atmospheric gases, and the aerosol and cloud properties. Despite years of observations, discrepancies still exist between radiative transfer models and observations, particularly in the presence of clouds. Progress has been made at closing discrepancies in the spectral region beyond 3 micron, but the progress lags at shorter wavelengths. Ratios of observed visible and near infrared cloud albedo from aircraft and satellite have shown both localized and global discrepancies between model and observations that are, thus far, unexplained. The capabilities of shortwave surface spectrometry have been improved in recent years at the Southern Great Plains facility (SGP) of the ARM Climate Research Facility through the addition of new instrumentation, the Shortwave Array Spectroradiometer, and upgrades to existing instrumentation, the Shortwave Spectroradiometer and the Rotating Shadowband Spectroradiometer. An airborne-based instrument, the HydroRad Spectroradiometer, was also deployed at the ARM site during the Routine ARM Aerial Facility Clouds with Low Optical Water Depths (CLOWD) Optical Radiative Observations (RACORO) field campaign. Using the new and upgraded spectral observations along with radiative transfer models, cloud scenes at the SGP are presented with the goal of characterizing the instrumentation and the cloud fields themselves.

  15. Pan-Arctic river discharge: Prioritizing monitoring of future climate change hot spots

    NASA Astrophysics Data System (ADS)

    Bring, Arvid; Shiklomanov, Alexander; Lammers, Richard B.

    2017-01-01

    The Arctic freshwater cycle is changing rapidly, which will require adequate monitoring of river flows to detect, observe, and understand changes and provide adaptation information. There has, however, been little detail about where the greatest flow changes are projected, and where monitoring therefore may need to be strengthened. In this study, we used a set of recent climate model runs and an advanced macro-scale hydrological model to analyze how flows across the continental pan-Arctic are projected to change and where the climate models agree on significant changes. We also developed a method to identify where monitoring stations should be placed to observe these significant changes, and compared this set of suggested locations with the existing network of monitoring stations. Overall, our results reinforce earlier indications of large increases in flow over much of the Arctic, but we also identify some areas where projections agree on significant changes but disagree on the sign of change. For monitoring, central and eastern Siberia, Alaska, and central Canada are hot spots for the highest changes. To take advantage of existing networks, a number of stations across central Canada and western and central Siberia could form a prioritized set. Further development of model representation of high-latitude hydrology would improve confidence in the areas we identify here. Nevertheless, ongoing observation programs may consider these suggested locations in efforts to improve monitoring of the rapidly changing Arctic freshwater cycle.

  16. Development of a Two-fluid Drag Law for Clustered Particles using Direct Numerical Simulation and Validation through Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gokaltun, Seckin; Munroe, Norman; Subramaniam, Shankar

    2014-12-31

    This study presents a new drag model, based on cohesive inter-particle forces, implemented in the MFIX code. The new drag model combines an existing standard model in MFIX with a particle-based drag model through a switching principle: the computation switches between the two models wherever strong particle-to-particle cohesion potential is detected. Three versions of the new model were obtained by using a different standard drag model in each version. The performance of each version was then compared against experimental data for a fluidized bed that are published in the literature and used extensively by other researchers for validation purposes. In our analysis of the results, we first observed that the standard models used in this research were incapable of producing closely matching results on their own. We then showed, for a simple case, that a threshold must be set on the solid volume fraction to avoid non-physical clustering predictions when the governing equation of the solid granular temperature is solved. Using our hybrid technique, we observed that our approach improves the numerical results significantly; however, the degree of improvement depends on the threshold of the cohesive index used in the switching procedure. Our results showed that small values of this threshold can significantly reduce the computational error for all versions of the proposed drag model. In addition, we redesigned an existing circulating fluidized bed (CFB) test facility in order to create validation cases for the clustering regime of Geldart A type particles.
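
    The switching principle described above can be sketched in a few lines. The correlations and parameters below are illustrative assumptions (a Wen-Yu-style standard law and a simple cluster reduction factor), not the actual MFIX implementation; only the switch on a cohesive-index threshold mirrors the idea in the abstract.

```python
# Hypothetical sketch of a hybrid drag law: use a standard two-fluid drag
# correlation everywhere except where the local cohesive index exceeds a
# threshold, where a cluster-adjusted drag law takes over instead.

def wen_yu_drag(eps_g, re_p):
    """Standard Wen-Yu style drag function (illustrative form only)."""
    cd = 24.0 / re_p * (1.0 + 0.15 * re_p**0.687) if re_p < 1000 else 0.44
    return cd * eps_g**-2.65  # voidage correction

def cluster_drag(eps_g, re_p, reduction=0.3):
    """Cluster drag: particle clusters reduce effective drag (assumed factor)."""
    return reduction * wen_yu_drag(eps_g, re_p)

def hybrid_drag(eps_g, re_p, cohesive_index, threshold=0.5):
    """Switch to the cluster law where particle-to-particle cohesion is strong."""
    if cohesive_index > threshold:
        return cluster_drag(eps_g, re_p)
    return wen_yu_drag(eps_g, re_p)
```

    Lowering `threshold` widens the region treated as clustered, which is why the abstract reports the error reduction as sensitive to this parameter.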

  17. The Application of Fractal and Multifractal Theory in Hydraulic-Flow-Unit Characterization and Permeability Estimation

    NASA Astrophysics Data System (ADS)

    Chen, X.; Yao, G.; Cai, J.

    2017-12-01

    Pore structure characteristics, such as pore-throat ratio, pore connectivity, pore size distribution, and wettability, are important factors influencing the fluid transport behavior of porous media. To accurately characterize the diversity of pore structure among hydraulic flow units (HFUs), five samples from different HFUs (with approximately equal porosities but widely varying permeabilities) were selected for micro-computerized tomography testing, to acquire direct 3D images of pore geometries, and for mercury injection experiments, to obtain the pore volume-radius distribution. Three classic fractal geometry models were applied to characterize the complex and highly nonlinear pore structure of all samples. Results showed that each HFU has a similar box-counting fractal dimension and generalized fractal dimension in the number-area model, but there are significant differences in the multifractal spectra. In the radius-volume model, there are three distinct linear segments, corresponding to three fractal dimension values, of which the middle one is identified as the actual fractal dimension according to the maximum radius. In the number-radius model, the spherical-pore size distribution extracted by the maximum ball algorithm exhibits a decrease in the number of small pores relative to the fractal power law, rather than following the traditional linear law. Among the three models, only multifractal analysis can classify the HFUs accurately. Additionally, because of the tightness and low permeability of the reservoir rocks, the connate water film on the inner surface of pore channels commonly forms bound water, and the conventional Yu-Cheng model has proved largely inapplicable. Considering the effect of irreducible water saturation, an improved fractal permeability model was therefore deduced theoretically. Comparison results showed that the improved model can be applied to calculate permeability directly and accurately in such unconventional rocks.
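
    The box-counting fractal dimension mentioned above can be illustrated with a minimal sketch, assuming a 2-D binary image of the pore space (the study works with 3-D micro-CT volumes, and the grid sizes below are arbitrary choices):

```python
# Box-counting dimension sketch: count boxes of side s that contain any pore
# pixels, then take the slope of log N(s) versus log(1/s).
import numpy as np

def box_counting_dimension(image, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary 2-D pore image."""
    counts = []
    for s in sizes:
        n = 0
        for i in range(0, image.shape[0], s):
            for j in range(0, image.shape[1], s):
                if image[i:i + s, j:j + s].any():  # box intersects pore space
                    n += 1
        counts.append(n)
    # slope of log N vs log(1/s) is the box-counting dimension
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope
```

    For a fully filled image the estimate recovers the embedding dimension (2 in 2-D); a fractal pore network yields a non-integer value between 1 and 2.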

  18. Yoga as Coping: A Conceptual Framework for Meaningful Participation in Yoga.

    PubMed

    Crowe, Brandi M; Van Puymbroeck, Marieke; Schmid, Arlene A

    2016-07-27

    Yoga facilitates relaxation and connection of mind, body, and spirit through the use of breathing, meditation, and physical postures. Participation in yoga has been extensively linked to decreased stress, and as a result, is considered a therapeutic intervention by many. However, few theories exist that explain the link between yoga participation and improved psychosocial wellbeing. The leisure-stress coping conceptual framework suggests that through participation in leisure, an individual can decrease stress while concurrently restoring and building up sustainable mental and physical capacities. Three types of leisure coping strategies exist: palliative coping, mood enhancement, and companionship. The purpose of this article is to propose the leisure-stress coping conceptual framework as a model for explaining benefits received from yoga participation via leisure coping strategies, which may explain or support improved ability to manage stress.

  20. Publishing and sharing of hydrologic models through WaterHUB

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Ruddell, B. L.; Song, C.; Zhao, L.; Kim, J.; Assi, A.

    2011-12-01

    Most hydrologists use hydrologic models to simulate hydrologic processes and to understand hydrologic pathways and fluxes for research, decision making, and engineering design. Once these tasks are complete, including publication of results, the models generally are not published or made available to the public for further use and improvement. Although publication or sharing of models is not required for journal publications, sharing models may open doors for new collaborations and avoids duplication of effort if other researchers are interested in simulating a particular watershed for which a model already exists. For researchers who are interested in sharing models, there are limited avenues for publishing their models to the wider community. Towards filling this gap, a prototype cyberinfrastructure (CI), called WaterHUB, is developed for sharing hydrologic data and modeling tools in an interactive environment. To test the utility of WaterHUB for sharing hydrologic models, a system to publish and share SWAT (Soil Water Assessment Tool) models is developed. Users can use WaterHUB to search for and download existing SWAT models, and also to upload new SWAT models. Metadata such as the name of the watershed, the name of the person or agency who developed the model, the simulation period, the time step, and the list of calibrated parameters are also published with each model.

  1. Healthcare technologies, quality improvement programs and hospital organizational culture in Canadian hospitals

    PubMed Central

    2013-01-01

    Background Healthcare technology and quality improvement programs have been identified as a means to influence healthcare costs and healthcare quality in Canada. This study seeks to identify whether a hospital's ability to implement healthcare technology was related to the usage of quality improvement programs within the hospital, and whether the culture within a hospital plays a role in the adoption of quality improvement programs. Methods A cross-sectional study of Canadian hospitals was conducted in 2010. The sample consisted of hospital administrators who were selected by provincial review boards. The questionnaire consisted of 3 sections: 20 healthcare technology items, 16 quality improvement program items and 63 culture items. Results Rasch model analysis revealed that a hierarchy existed among the healthcare technologies, based upon the difficulty of implementation. The results also showed that a significant relationship existed between the ability to implement healthcare technologies and the number of quality improvement programs adopted. In addition, culture within a hospital served a mediating role in quality improvement program adoption. Conclusions Healthcare technologies each have different levels of difficulty. As a consequence, hospitals need to understand their current level of capability before selecting a particular technology in order to assess the level of resources needed. Further, the usage of quality improvement programs is related to the ability to implement technology and to the culture within a hospital. PMID:24119419

  2. Numerical Modeling of River Ice Processes on the Lower Nelson River

    NASA Astrophysics Data System (ADS)

    Malenchak, Jarrod Joseph

    Water resource infrastructure in cold regions of the world can be significantly impacted by the existence of river ice. Major engineering concerns related to river ice include ice jam flooding; the design and operation of hydropower facilities and other hydraulic structures; water supplies; and ecological, environmental, and morphological effects. The use of numerical simulation models has been identified as one of the most efficient means by which river ice processes can be studied and the effects of river ice evaluated. The continued advancement of these simulation models will help to develop new theories and evaluate potential mitigation alternatives for these ice issues. In this thesis, a literature review of existing river ice numerical models, of anchor ice formation and modeling studies, and of aufeis formation and modeling studies is conducted. A high-level summary of the two-dimensional CRISSP numerical model is presented, as well as the developed freeze-up model, with a focus specifically on the anchor ice and aufeis growth processes. This model includes developments in the detailed heat transfer calculations, an improved surface ice mass exchange model that includes the rapids entrainment process, and an improved dry bed treatment model, along with the expanded anchor ice and aufeis growth model. The developed sub-models are tested in an idealized channel setting as an initial model confirmation. A case study of significant anchor ice and aufeis growth on the Nelson River in northern Manitoba, Canada, serves as the primary field test case for the anchor ice and aufeis model. A second case study on the same river is used to evaluate the surface ice components of the model in a field setting. The results from these case studies are used to highlight the capabilities and deficiencies of the numerical model and to identify areas of further research and model development.

  3. Current issues with standards in the measurement and documentation of human skeletal anatomy.

    PubMed

    Magee, Justin; McClelland, Brian; Winder, John

    2012-09-01

    Digital modeling of human anatomy has become increasingly important and relies on well-documented quantitative anatomy literature. This type of documentation is common for the spine and pelvis; however, significant issues exist due to the lack of standardization in measurement and technique. Existing literature on quantitative anatomy for the spine and pelvis of white adults (aged 18-65 years, separated into decadal categories) was reviewed from the disciplines of anatomy, manipulative therapy, anthropometrics, occupational ergonomics, biomechanics and forensic science. The data were unified into a single normative model of the sub-axial spine. Two-dimensional orthographic drawings were produced from the 590 individual measurements identified, which informed the development of a 3D digital model. A similar review of full range of motion data was conducted as a meta-analysis and the results were applied to the existing model, providing an inter-connected, articulated digital spine. During these data analysis processes several inconsistencies were observed accompanied by an evidential lack of standardization with measurement and recording of data. These have been categorized as: anatomical terminology; scaling of measurements; measurement methodology, dimension and anatomical reference positions; global coordinate systems. There is inconsistency in anatomical terminology where independent researchers use the same terms to describe different aspects of anatomy or different terms for the same anatomy. Published standards exist for measurement methods of the human body regarding spatial interaction, anthropometric databases, automotive applications, clothing industries and for computer manikins, but none exists for skeletal anatomy. Presentation of measurements often lacks formal structure in clinical publications, seldom providing geometric reference points, therefore making digital reconstruction difficult. Published quantitative data does not follow existing international published standards relating to engineering drawing and visual communication. Large variations are also evident in standards or guidelines used for global coordinate systems across biomechanics, ergonomics, software systems and 3D software applications. This paper identifies where established good practice exists and suggests additional recommendations, informing an improved communication protocol, to assist reconstruction of skeletal anatomy using 3D digital modeling. © 2012 The Authors. Journal of Anatomy © 2012 Anatomical Society.

  4. Current issues with standards in the measurement and documentation of human skeletal anatomy

    PubMed Central

    Magee, Justin; McClelland, Brian; Winder, John

    2012-01-01

    Digital modeling of human anatomy has become increasingly important and relies on well-documented quantitative anatomy literature. This type of documentation is common for the spine and pelvis; however, significant issues exist due to the lack of standardization in measurement and technique. Existing literature on quantitative anatomy for the spine and pelvis of white adults (aged 18–65 years, separated into decadal categories) was reviewed from the disciplines of anatomy, manipulative therapy, anthropometrics, occupational ergonomics, biomechanics and forensic science. The data were unified into a single normative model of the sub-axial spine. Two-dimensional orthographic drawings were produced from the 590 individual measurements identified, which informed the development of a 3D digital model. A similar review of full range of motion data was conducted as a meta-analysis and the results were applied to the existing model, providing an inter-connected, articulated digital spine. During these data analysis processes several inconsistencies were observed accompanied by an evidential lack of standardization with measurement and recording of data. These have been categorized as: anatomical terminology; scaling of measurements; measurement methodology, dimension and anatomical reference positions; global coordinate systems. There is inconsistency in anatomical terminology where independent researchers use the same terms to describe different aspects of anatomy or different terms for the same anatomy. Published standards exist for measurement methods of the human body regarding spatial interaction, anthropometric databases, automotive applications, clothing industries and for computer manikins, but none exists for skeletal anatomy. Presentation of measurements often lacks formal structure in clinical publications, seldom providing geometric reference points, therefore making digital reconstruction difficult. Published quantitative data does not follow existing international published standards relating to engineering drawing and visual communication. Large variations are also evident in standards or guidelines used for global coordinate systems across biomechanics, ergonomics, software systems and 3D software applications. This paper identifies where established good practice exists and suggests additional recommendations, informing an improved communication protocol, to assist reconstruction of skeletal anatomy using 3D digital modeling. PMID:22747678

  5. Invitation to a forum: architecting operational `next generation' earth monitoring satellites based on best modeling, existing sensor capabilities, with constellation efficiencies to secure trusted datasets for the next 20 years

    NASA Astrophysics Data System (ADS)

    Helmuth, Douglas B.; Bell, Raymond M.; Grant, David A.; Lentz, Christopher A.

    2012-09-01

    Architecting the operational Next Generation of earth monitoring satellites based on matured climate modeling, reuse of existing sensor & satellite capabilities, attention to affordability and evolutionary improvements integrated with constellation efficiencies - becomes our collective goal for an open architectural design forum. Understanding the earth's climate and collecting requisite signatures over the next 30 years is a shared mandate by many of the world's governments. But there remains a daunting challenge to bridge scientific missions to 'operational' systems that truly support the demands of decision makers, scientific investigators and global users' requirements for trusted data. In this paper we will suggest an architectural structure that takes advantage of current earth modeling examples including cross-model verification and a first order set of critical climate parameters and metrics; that in turn, are matched up with existing space borne collection capabilities and sensors. The tools used and the frameworks offered are designed to allow collaborative overlays by other stakeholders nominating different critical parameters and their own treaded connections to existing international collection experience. These aggregate design suggestions will be held up to group review and prioritized as potential constellation solutions including incremental and spiral developments - including cost benefits and organizational opportunities. This Part IV effort is focused on being an inclusive 'Next Gen Constellation' design discussion and is the natural extension to earlier papers.

  6. Faster Mass Spectrometry-based Protein Inference: Junction Trees are More Efficient than Sampling and Marginalization by Enumeration

    PubMed Central

    Serang, Oliver; Noble, William Stafford

    2012-01-01

    The problem of identifying the proteins in a complex mixture using tandem mass spectrometry can be framed as an inference problem on a graph that connects peptides to proteins. Several existing protein identification methods make use of statistical inference methods for graphical models, including expectation maximization, Markov chain Monte Carlo, and full marginalization coupled with approximation heuristics. We show that, for this problem, the majority of the cost of inference usually comes from a few highly connected subgraphs. Furthermore, we evaluate three different statistical inference methods using a common graphical model, and we demonstrate that junction tree inference substantially improves rates of convergence compared to existing methods. The Python code used for this paper is available at http://noble.gs.washington.edu/proj/fido. PMID:22331862
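
    To see why full marginalization by enumeration becomes costly on highly connected subgraphs, here is a hedged toy version of the peptide-protein inference problem. The two-protein graph, priors, and emission probabilities below are invented for illustration, and the paper's Fido model differs in detail; the point is that the enumeration cost grows as 2^n in the number of proteins in a connected subgraph, which is exactly what junction tree inference avoids.

```python
# Marginalization by enumeration on a tiny peptide-protein graph:
# enumerate every protein presence state, weight by prior x likelihood,
# then marginalize to a posterior for each protein.
from itertools import product

proteins = ["A", "B"]
peptides = {"p1": ["A"], "p2": ["A", "B"]}  # observed peptide -> parent proteins
prior = 0.5      # prior probability a protein is present (assumed)
p_emit = 0.9     # P(peptide observed | some parent present) (assumed)
p_noise = 0.05   # P(peptide observed | no parent present) (assumed)

def likelihood(state):
    """P(all peptides observed | protein presence state)."""
    lik = 1.0
    for pep, parents in peptides.items():
        present = any(state[p] for p in parents)
        lik *= p_emit if present else p_noise
    return lik

def posteriors():
    """Exact posteriors by enumerating all 2^n presence states."""
    joint = {}
    for bits in product([0, 1], repeat=len(proteins)):
        state = dict(zip(proteins, bits))
        pr = likelihood(state)
        for b in bits:
            pr *= prior if b else 1 - prior
        joint[bits] = pr
    z = sum(joint.values())
    return {p: sum(v for k, v in joint.items() if k[i]) / z
            for i, p in enumerate(proteins)}
```

    Here protein A explains both peptides and ends up with a much higher posterior than B, whose only evidence is the shared peptide p2.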

  7. An agenda-based routing protocol in delay tolerant mobile sensor networks.

    PubMed

    Wang, Xiao-Min; Zhu, Jin-Qi; Liu, Ming; Gong, Hai-Gang

    2010-01-01

    Routing in delay tolerant mobile sensor networks (DTMSNs) is challenging due to the networks' intermittent connectivity. Most existing routing protocols for DTMSNs use simplistic random mobility models for algorithm design and performance evaluation. In the real world, however, due to the unique characteristics of human mobility, existing random mobility models may not work well in environments where mobile sensor units are carried (such as DTMSNs). Taking a person's social activities into consideration, in this paper we seek to improve DTMSN routing in terms of social structure and propose an agenda-based routing protocol (ARP). In ARP, humans are classified based on their agendas, and data transmission is made according to sensor nodes' transmission rankings. The effectiveness of ARP is demonstrated through comprehensive simulation studies.
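
    The agenda-based forwarding idea can be sketched as follows; the agenda classes and ranking values below are invented placeholders for illustration, not the paper's actual classification or rankings.

```python
# ARP-style forwarding sketch: a node carrying data hands it to a neighbour
# only if that neighbour's agenda-derived transmission ranking beats its own.

AGENDA_RANK = {"office_worker": 0.4, "student": 0.6, "courier": 0.9}  # assumed

def transmission_ranking(node):
    """Ranking derived from the node carrier's agenda class."""
    return AGENDA_RANK.get(node["agenda"], 0.1)

def best_next_hop(current, neighbours):
    """Return the best neighbour to forward to, or None to keep carrying."""
    best = max(neighbours, key=transmission_ranking, default=None)
    if best is not None and transmission_ranking(best) > transmission_ranking(current):
        return best
    return None
```

    The carrier keeps the data when no neighbour ranks higher, which is the store-carry-forward behaviour typical of delay tolerant routing.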

  8. A spatial Bayesian network model to assess the benefits of early warning for urban flood risk to people

    NASA Astrophysics Data System (ADS)

    Balbi, Stefano; Villa, Ferdinando; Mojtahed, Vahid; Hegetschweiler, Karin Tessa; Giupponi, Carlo

    2016-06-01

    This article presents a novel methodology to assess flood risk to people by integrating people's vulnerability and ability to cushion hazards through coping and adapting. The proposed approach extends traditional risk assessments beyond material damages; complements quantitative and semi-quantitative data with subjective and local knowledge, improving the use of commonly available information; and produces estimates of model uncertainty by providing probability distributions for all of its outputs. Flood risk to people is modeled using a spatially explicit Bayesian network model calibrated on expert opinion. Risk is assessed in terms of (1) likelihood of non-fatal physical injury, (2) likelihood of post-traumatic stress disorder and (3) likelihood of death. The study area covers the lower part of the Sihl valley (Switzerland) including the city of Zurich. The model is used to estimate the effect of improving an existing early warning system, taking into account the reliability, lead time and scope (i.e., coverage of people reached by the warning). Model results indicate that the potential benefits of an improved early warning in terms of avoided human impacts are particularly relevant in case of a major flood event.
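
    A minimal sketch of how warning reliability and scope can enter such a risk estimate, using invented placeholder probabilities rather than the paper's expert-calibrated Bayesian network:

```python
# Toy model: a person is effectively warned only if the system fires
# (reliability) AND the warning reaches them (scope); being warned lowers
# the conditional injury probability. All numbers are assumptions.

def p_injury(p_hazard, reliability, scope,
             p_injury_warned=0.02, p_injury_unwarned=0.10):
    """Marginal probability of injury for one person per event."""
    p_warned = reliability * scope
    return p_hazard * (p_warned * p_injury_warned +
                       (1 - p_warned) * p_injury_unwarned)

baseline = p_injury(0.01, reliability=0.7, scope=0.6)
improved = p_injury(0.01, reliability=0.9, scope=0.9)
```

    Improving either reliability or scope alone helps less than improving both, since the effective warning probability is their product; this mirrors the paper's finding that warning benefits depend jointly on reliability, lead time, and coverage.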

  9. Impact of Learning Model Based on Cognitive Conflict toward Student’s Conceptual Understanding

    NASA Astrophysics Data System (ADS)

    Mufit, F.; Festiyed, F.; Fauzan, A.; Lufri, L.

    2018-04-01

    A problem that often occurs in the learning of physics is misconception and low understanding of concepts. Misconceptions occur not only among school students, but also among college students and teachers. Existing learning models have not had much impact on improving conceptual understanding or on remediating student misconceptions. This study aims to assess the impact of a cognitive conflict-based learning model in improving conceptual understanding and remediating student misconceptions. The research method used is design/development research. The product developed is a cognitive conflict-based learning model along with its components. This article reports the product design results, validity tests, and practicality tests. The study resulted in the design of a cognitive conflict-based learning model with 4 learning syntaxes, namely (1) preconception activation, (2) presentation of cognitive conflict, (3) discovery of concepts and equations, and (4) reflection. The results of validity tests by several experts on aspects of content, didactics, and appearance or language indicate very valid criteria. Product trial results also show that the product is very practical to use. Based on pretest and posttest results, the cognitive conflict-based learning model has a good impact on improving conceptual understanding and remediating misconceptions, especially in high-ability students.

  10. [Integrated Quality Management System (IQMS): a model for improving the quality of reproductive health care in rural Kenya].

    PubMed

    Herrler, Claudia; Bramesfeld, Anke; Brodowski, Marc; Prytherch, Helen; Marx, Irmgard; Nafula, Maureen; Richter-Aairijoki, Heide; Musyoka, Lucy; Marx, Michael; Szecsenyi, Joachim

    2015-01-01

    The aim was to develop a model for improving the quality of reproductive health care services in rural Kenya, designed to measure the quality of reproductive health services in a way that allows these services to identify measures for improving their performance. The Integrated Quality Management System (IQMS) was developed on the basis of a pre-existing, validated model for quality promotion, namely the European Practice Assessment (EPA). The methodology for quality assessment and for feeding assessment results back to service teams was adopted from the EPA model. The quality assessment methodology included data collection through staff and patient surveys and a service visitation. Quality is assessed by indicators, so indicators appropriate for assessing reproductive health care in rural Kenya had to be developed. A search of the Kenyan and international literature was conducted to identify potential indicators, which were then rated for relevance and clarity by a panel of Kenyan experts. In total, 260 indicators were rated as relevant and assigned to 29 quality dimensions and 5 domains. The implementation of IQMS in ten facilities showed that IQMS is a feasible model for assessing the quality of reproductive health services in rural Kenya, and that it enables these services to identify quality improvement targets and the necessary improvement measures. Both strengths and limitations of IQMS are discussed. Copyright © 2015. Published by Elsevier GmbH.

  11. Improving PAGER's real-time earthquake casualty and loss estimation toolkit: a challenge

    USGS Publications Warehouse

    Jaiswal, K.S.; Wald, D.J.

    2012-01-01

    We describe the on-going developments of PAGER’s loss estimation models, and discuss value-added web content that can be generated related to exposure, damage and loss outputs for a variety of PAGER users. These developments include identifying vulnerable building types in any given area, estimating earthquake-induced damage and loss statistics by building type, and developing visualization aids that help locate areas of concern for improving post-earthquake response efforts. While detailed exposure and damage information is highly useful and desirable, significant improvements are still necessary in order to improve underlying building stock and vulnerability data at a global scale. Existing efforts with the GEM’s GED4GEM and GVC consortia will help achieve some of these objectives. This will benefit PAGER especially in regions where PAGER’s empirical model is less-well constrained; there, the semi-empirical and analytical models will provide robust estimates of damage and losses. Finally, we outline some of the challenges associated with rapid casualty and loss estimation that we experienced while responding to recent large earthquakes worldwide.

  12. Development of a 3D Stream Network and Topography for Improved Large-Scale Hydraulic Modeling

    NASA Astrophysics Data System (ADS)

    Saksena, S.; Dey, S.; Merwade, V.

    2016-12-01

    Most digital elevation models (DEMs) used for hydraulic modeling do not include channel bed elevations. As a result, the DEMs must be complemented with additional bathymetric data for accurate hydraulic simulations. Existing methods to acquire bathymetric information through field surveys or through conceptual models are limited to reach-scale applications. With an increasing focus on large-scale hydraulic modeling of rivers, a framework to estimate and incorporate bathymetry for an entire stream network is needed. This study proposes an interpolation-based algorithm to estimate bathymetry for a stream network by modifying the reach-based empirical River Channel Morphology Model (RCMM). The effect of a 3D stream network that includes river bathymetry is then investigated by creating a 1D hydraulic model (HEC-RAS) and a 2D hydrodynamic model (Integrated Channel and Pond Routing) for the Upper Wabash River Basin in Indiana, USA. Results show improved simulation of flood depths and storage in the floodplain. Moreover, the impact of incorporating river bathymetry is more significant in the 2D model than in the 1D model.
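
    The interpolation idea, assigning each in-channel DEM cell a depth below the water surface from a smooth cross-section shape, can be sketched as below. The quadratic profile and its parameters are illustrative assumptions, not the actual RCMM formulation, which derives the cross-section shape empirically from channel morphology.

```python
# Conceptual bathymetry sketch: depth is zero at the banks and reaches
# max_depth at the channel centre (thalweg), following a smooth parabola.

def channel_depth(dist_from_bank, half_width, max_depth):
    """Depth below the water surface at a lateral position in the channel."""
    x = min(dist_from_bank, 2 * half_width - dist_from_bank)  # distance to nearer bank
    t = x / half_width                                        # 0 at bank, 1 at centre
    return max_depth * (2 * t - t * t)  # parabola, flat at the thalweg

def bed_elevation(water_surface, dist_from_bank, half_width, max_depth):
    """Replace a flat water-surface DEM value with an estimated bed elevation."""
    return water_surface - channel_depth(dist_from_bank, half_width, max_depth)
```

    Applying this cell by cell along a stream network "burns" an estimated channel into the DEM, which is what allows the 1D and 2D hydraulic models to convey in-channel flow realistically.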

  13. Performance Evaluation and Improvement of Ferroelectric Field-Effect Transistor Memory

    NASA Astrophysics Data System (ADS)

    Yu, Hyung Suk

    Flash memory is rapidly reaching its scaling limits due to the reduction of charge in floating gates, charge leakage, and capacitive coupling between cells, which cause threshold voltage fluctuations, short retention times, and interference. Many new memory technologies are being considered as alternatives to flash memory in an effort to overcome these limitations. The Ferroelectric Field-Effect Transistor (FeFET) is one of the main emerging candidates because of its structural similarity to conventional FETs and its fast switching speed. Nevertheless, the performance of FeFETs has not been systematically compared with and analyzed against other competing technologies. In this work, we first benchmark the intrinsic performance of FeFETs and other memories by simulation in order to identify the strengths and weaknesses of FeFETs. To simulate realistic memory applications, we compare memories in an array structure. For these comparisons, we construct an accurate delay model and verify it by benchmarking against exact HSPICE simulations. Second, we propose an accurate model for the FeFET memory window, since the existing model has limitations: it assumes symmetric operation voltages, which is not valid for the practical asymmetric operation voltages. In this model, we consider practical operation voltages and device dimensions. We also investigate realistic changes of the memory window over time and the retention time of FeFETs. Last, to improve the memory window and subthreshold swing, we suggest nonplanar junctionless structures for FeFETs. Using the suggested structures, we study the dimensional dependences of crucial parameters such as memory window and subthreshold swing, and we analyze key interference mechanisms.
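
    As context for the memory-window modeling discussed above, a commonly quoted first-order estimate (an assumption here, not the refined asymmetric-voltage model this work proposes) puts the memory window at roughly twice the coercive field times the ferroelectric thickness:

```python
# First-order FeFET memory window estimate: MW ~ 2 * Ec * t_FE.
# This ignores depolarization fields, charge trapping, and asymmetric
# operation voltages, the effects an improved model must account for.

def memory_window(e_c, t_fe):
    """Memory window in volts, for coercive field e_c (V/m) and
    ferroelectric thickness t_fe (m)."""
    return 2.0 * e_c * t_fe

# e.g. Ec = 1 MV/cm = 1e8 V/m at t_FE = 10 nm gives roughly a 2 V window
mw = memory_window(1e8, 10e-9)
```

    The linear scaling with thickness is one reason thin scaled ferroelectrics struggle to keep a usable window, motivating the structural improvements the abstract suggests.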

  14. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    William M. Bond; Salih Ersayin

    2007-03-30

    This project involved industrial-scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing the mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. The data were then balanced and provided a basis for assessing the efficiency of individual devices and of the plant as a whole, and for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement, combined with replacement of the existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate the findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming the simulation results. After the modifications were implemented, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, a concentrator energy improvement of 7% was obtained, and further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of the simulation-based approach. Currently, the Center provides simulation-based services to all the iron ore mining companies operating in northern Minnesota, and proposals are pending with non-taconite mineral processing applications.

  15. CelOWS: an ontology based framework for the provision of semantic web services related to biological models.

    PubMed

    Matos, Ely Edison; Campos, Fernanda; Braga, Regina; Palazzi, Daniele

    2010-02-01

    The amount of information generated by biological research has led to an intensive use of models. Mathematical and computational modeling requires accurate descriptions so that models can be shared, reused, and simulated as formulated by their original authors. In this paper, we introduce the Cell Component Ontology (CelO), expressed in OWL-DL. This ontology captures both the structure of a cell model and the properties of functional components. We use this ontology in a Web project (CelOWS) to describe, query, and compose CellML models using semantic web services. It aims to improve the reuse and composition of existing components and to allow semantic validation of new models.

  16. Vector Autoregression, Structural Equation Modeling, and Their Synthesis in Neuroimaging Data Analysis

    PubMed Central

    Chen, Gang; Glen, Daniel R.; Saad, Ziad S.; Hamilton, J. Paul; Thomason, Moriah E.; Gotlib, Ian H.; Cox, Robert W.

    2011-01-01

    Vector autoregression (VAR) and structural equation modeling (SEM) are two popular brain-network modeling tools. VAR, which is a data-driven approach, assumes that connected regions exert time-lagged influences on one another. In contrast, the hypothesis-driven SEM is used to validate an existing connectivity model in which connected regions have contemporaneous interactions. We present the two models in detail and discuss their applicability to FMRI data and their interpretational limits. We also propose a unified approach that models both lagged and contemporaneous effects. The unifying model, structural vector autoregression (SVAR), may improve statistical and explanatory power and avoids some prevalent pitfalls that can occur when VAR and SEM are used separately. PMID:21975109
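
    The lagged-influence idea behind VAR can be made concrete with a small simulation. The sketch below assumes a two-region VAR(1) system with a made-up coefficient matrix and recovers the lagged influences by ordinary least squares; it illustrates the model class only, not the authors' SVAR estimator.

```python
import numpy as np

# Hedged sketch: estimating a first-order vector autoregression, VAR(1),
# by ordinary least squares. The coefficient matrix A is hypothetical;
# a real FMRI analysis would use measured region time series.
rng = np.random.default_rng(0)
A = np.array([[0.5, 0.2],
              [0.0, 0.4]])          # lagged influences between two "regions"
T = 2000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=2)

# OLS: regress y_t on y_{t-1}; lstsq solves X B = Y, so B ~ A^T
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(np.round(A_hat, 2))           # close to A
```

With enough samples the least-squares estimate recovers the lag matrix; an SVAR would additionally model contemporaneous (lag-zero) effects among the residuals.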

  17. Toward Improved Force-Field Accuracy through Sensitivity Analysis of Host-Guest Binding Thermodynamics

    PubMed Central

    Yin, Jian; Fenley, Andrew T.; Henriksen, Niel M.; Gilson, Michael K.

    2015-01-01

    Improving the capability of atomistic computer models to predict the thermodynamics of noncovalent binding is critical for successful structure-based drug design, and the accuracy of such calculations remains limited by non-optimal force field parameters. Ideally, one would incorporate protein-ligand affinity data into force field parametrization, but this would be inefficient and costly. We now demonstrate that sensitivity analysis can be used to efficiently tune Lennard-Jones parameters of aqueous host-guest systems for increasingly accurate calculations of binding enthalpy. These results highlight the promise of a comprehensive use of calorimetric host-guest binding data, along with existing validation data sets, to improve force field parameters for the simulation of noncovalent binding, with the ultimate goal of making protein-ligand modeling more accurate and hence speeding drug discovery. PMID:26181208

  18. Existing Whole-House Solutions Case Study: Cascade Apartments - Deep Energy Multifamily Retrofit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2014-02-01

    In December of 2009-10, King County Housing Authority (KCHA) implemented energy retrofit improvements in the Cascade multifamily community, located in Kent, Washington, which resulted in annual energy cost savings of 22%, improved comfort and air quality for residents, and increased durability of the units. This research effort involved significant coordination among stakeholders: KCHA, the WA State Department of Commerce, the utility Puget Sound Energy, and Cascade tenants. This report focuses on three primary Building America research questions: 1. What are the modeled energy savings using the DOE low-income weatherization approved TREAT software? 2. How did the modeled energy savings compare with measured energy savings from aggregate utility billing analysis? 3. What is the savings-to-investment ratio of the retrofit package after considering utility window incentives and KCHA capital improvement funding?

  19. The Integrated Behavioural Model for Water, Sanitation, and Hygiene: a systematic review of behavioural models and a framework for designing and evaluating behaviour change interventions in infrastructure-restricted settings.

    PubMed

    Dreibelbis, Robert; Winch, Peter J; Leontsini, Elli; Hulland, Kristyna R S; Ram, Pavani K; Unicomb, Leanne; Luby, Stephen P

    2013-10-26

    Promotion and provision of low-cost technologies that enable improved water, sanitation, and hygiene (WASH) practices are seen as viable solutions for reducing high rates of morbidity and mortality due to enteric illnesses in low-income countries. A number of theoretical models, explanatory frameworks, and decision-making models have emerged which attempt to guide behaviour change interventions related to WASH. The design and evaluation of such interventions would benefit from a synthesis of this body of theory informing WASH behaviour change and maintenance. We completed a systematic review of existing models and frameworks through a search of related articles available in PubMed and in the grey literature. Information on the organization of behavioural determinants was extracted from the references that fulfilled the selection criteria and synthesized. Results from this synthesis were combined with other relevant literature and with feedback from concurrent formative and pilot research conducted in the context of two cluster-randomized trials on the efficacy of WASH behaviour change interventions, in order to develop a framework to guide the design and evaluation of WASH interventions: the Integrated Behavioural Model for Water, Sanitation, and Hygiene (IBM-WASH). We identified 15 WASH-specific theoretical models, behaviour change frameworks, or programmatic models, of which 9 addressed our review questions. Existing models under-represented the potential role of technology in influencing behavioural outcomes, focused on individual-level behavioural determinants, and largely ignored the role of the physical and natural environment. IBM-WASH attempts to correct this by acknowledging three dimensions (Contextual Factors, Psychosocial Factors, and Technology Factors) that operate at five levels (structural, community, household, individual, and habitual). A number of WASH-specific models and frameworks exist, yet with some limitations. 
The IBM-WASH model aims to provide both a conceptual and practical tool for improving our understanding and evaluation of the multi-level, multi-dimensional factors that influence water, sanitation, and hygiene practices in infrastructure-constrained settings. We outline future applications of our proposed model as well as future research priorities needed to advance our understanding of the sustained adoption of water, sanitation, and hygiene technologies and practices.

  20. The Integrated Behavioural Model for Water, Sanitation, and Hygiene: a systematic review of behavioural models and a framework for designing and evaluating behaviour change interventions in infrastructure-restricted settings

    PubMed Central

    2013-01-01

    Background Promotion and provision of low-cost technologies that enable improved water, sanitation, and hygiene (WASH) practices are seen as viable solutions for reducing high rates of morbidity and mortality due to enteric illnesses in low-income countries. A number of theoretical models, explanatory frameworks, and decision-making models have emerged which attempt to guide behaviour change interventions related to WASH. The design and evaluation of such interventions would benefit from a synthesis of this body of theory informing WASH behaviour change and maintenance. Methods We completed a systematic review of existing models and frameworks through a search of related articles available in PubMed and in the grey literature. Information on the organization of behavioural determinants was extracted from the references that fulfilled the selection criteria and synthesized. Results from this synthesis were combined with other relevant literature and with feedback from concurrent formative and pilot research conducted in the context of two cluster-randomized trials on the efficacy of WASH behaviour change interventions, in order to develop a framework to guide the design and evaluation of WASH interventions: the Integrated Behavioural Model for Water, Sanitation, and Hygiene (IBM-WASH). Results We identified 15 WASH-specific theoretical models, behaviour change frameworks, or programmatic models, of which 9 addressed our review questions. Existing models under-represented the potential role of technology in influencing behavioural outcomes, focused on individual-level behavioural determinants, and largely ignored the role of the physical and natural environment. IBM-WASH attempts to correct this by acknowledging three dimensions (Contextual Factors, Psychosocial Factors, and Technology Factors) that operate at five levels (structural, community, household, individual, and habitual). 
Conclusions A number of WASH-specific models and frameworks exist, yet with some limitations. The IBM-WASH model aims to provide both a conceptual and practical tool for improving our understanding and evaluation of the multi-level, multi-dimensional factors that influence water, sanitation, and hygiene practices in infrastructure-constrained settings. We outline future applications of our proposed model as well as future research priorities needed to advance our understanding of the sustained adoption of water, sanitation, and hygiene technologies and practices. PMID:24160869

  1. Improved Doubly Robust Estimation when Data are Monotonely Coarsened, with Application to Longitudinal Studies with Dropout

    PubMed Central

    Tsiatis, Anastasios A.; Davidian, Marie; Cao, Weihua

    2010-01-01

    Summary A routine challenge is that of making inference on parameters in a statistical model of interest from longitudinal data subject to dropout, a special case of the more general setting of monotonely coarsened data. Considerable recent attention has focused on doubly robust estimators, which in this context involve positing models for both the missingness (more generally, coarsening) mechanism and aspects of the distribution of the full data; such estimators have the appealing property of yielding consistent inferences if only one of these models is correctly specified. Doubly robust estimators have, however, been criticized for potentially disastrous performance when both of these models are even mildly misspecified. We propose a doubly robust estimator applicable in general monotone coarsening problems that achieves comparable or improved performance relative to existing doubly robust methods, which we demonstrate via simulation studies and by application to data from an AIDS clinical trial. PMID:20731640
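
    The double-robustness idea can be seen in the simplest missing-outcome setting. The sketch below is an illustration of the augmented inverse-probability-weighted (AIPW) form that underlies doubly robust estimation, with the true observation probability pi(x) and outcome regression m(x) plugged in purely for clarity; the paper's estimator for general monotone coarsening is more involved, and all data below are simulated.

```python
import numpy as np

# Hedged illustration (not the paper's estimator): AIPW estimation of
# E[Y] when Y is missing at random given X. pi(x) is the probability
# Y is observed; m(x) = E[Y|X=x] is the outcome model. In practice both
# would be estimated; the estimator stays consistent if either one is right.
rng = np.random.default_rng(1)
n = 200_000
x = rng.normal(size=n)
y = 1.0 + x + rng.normal(size=n)          # true mean of Y is 1.0
pi = 1.0 / (1.0 + np.exp(-(0.5 + x)))     # observation probability
r = rng.random(n) < pi                    # r=1: observed, r=0: dropout
m = 1.0 + x                               # outcome regression E[Y|X]

# AIPW estimator: IPW term plus augmentation term
mu_dr = np.mean(r * y / pi - (r - pi) / pi * m)
print(round(mu_dr, 2))                    # close to the true mean 1.0
```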

  2. Bayesian hierarchical models for smoothing in two-phase studies, with application to small area estimation.

    PubMed

    Ross, Michelle; Wakefield, Jon

    2015-10-01

    Two-phase study designs are appealing because they allow for the oversampling of rare sub-populations, which improves efficiency. In this paper we describe a Bayesian hierarchical model for the analysis of two-phase data. Such a model is particularly appealing in a spatial setting in which random effects are introduced to model between-area variability. In such a situation, one may be interested in estimating regression coefficients or, in the context of small area estimation, in reconstructing the population totals by strata. The efficiency gains of the two-phase sampling scheme are compared to standard approaches using 2011 birth data from the Research Triangle area of North Carolina. We show that the proposed method can overcome small-sample difficulties and improve on existing techniques. We conclude that the two-phase design is an attractive approach for small area estimation.

  3. Unit mechanisms of fission gas release: Current understanding and future needs

    DOE PAGES

    Tonks, Michael; Andersson, David; Devanathan, Ram; ...

    2018-03-01

    Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel and gap properties. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are beginning to reveal new understanding of the unit mechanisms that define fission product behavior. Here, existing research on the basic mechanisms of fission gas release during normal reactor operation is summarized and critical areas where work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior and to design fuels with improved performance. In addition, this work can serve as a model of how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.

  4. Unit mechanisms of fission gas release: Current understanding and future needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tonks, Michael; Andersson, David; Devanathan, Ram

    Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel and gap properties. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are beginning to reveal new understanding of the unit mechanisms that define fission product behavior. Here, existing research on the basic mechanisms of fission gas release during normal reactor operation is summarized and critical areas where work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior and to design fuels with improved performance. In addition, this work can serve as a model of how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.

  5. ISG hybrid powertrain: a rule-based driver model incorporating look-ahead information

    NASA Astrophysics Data System (ADS)

    Shen, Shuiwen; Zhang, Junzhi; Chen, Xiaojiang; Zhong, Qing-Chang; Thornton, Roger

    2010-03-01

    According to European regulations, if the amount of regenerative braking is determined by the travel of the brake pedal, more stringent standards must be applied; otherwise, regeneration may adversely affect the existing vehicle safety system. Using engine or vehicle speed to derive regenerative braking is one way to avoid these strict design standards, but it introduces a discontinuity in powertrain torque when the driver releases the accelerator pedal or applies the brake pedal. This is shown to cause oscillations in the pedal input and powertrain torque when a conventional driver model is adopted. Look-ahead information, together with other predicted vehicle states, is therefore used to control the vehicle speed, in particular during deceleration, and to improve the driver model so that these oscillations are avoided. The improved driver model makes analysis and validation of the control strategy for an integrated starter generator (ISG) hybrid powertrain possible.
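
    The continuity issue can be illustrated with a toy rule: if regenerative torque is derived from vehicle speed through a continuous ramp rather than a step, powertrain torque cannot jump when the driver lifts off a pedal. The function and all numbers below are invented for illustration and are not the paper's control strategy.

```python
# Hedged sketch: deriving regenerative braking torque (Nm) from vehicle
# speed via a smooth ramp, so torque varies continuously with speed.
# t_max, v0 and v1 are hypothetical calibration values.
def regen_torque(speed_kmh, t_max=120.0, v0=10.0, v1=30.0):
    """Ramp regen torque from 0 at v0 km/h up to t_max at v1 km/h."""
    if speed_kmh <= v0:
        return 0.0                  # too slow: no regeneration
    if speed_kmh >= v1:
        return t_max                # full regeneration above v1
    return t_max * (speed_kmh - v0) / (v1 - v0)

print(regen_torque(20.0))           # halfway up the ramp
```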

  6. A diffusion modelling approach to understanding contextual cueing effects in children with ADHD

    PubMed Central

    Weigard, Alexander; Huang-Pollock, Cynthia

    2014-01-01

    Background Strong theoretical models suggest implicit learning deficits may exist among children with Attention Deficit Hyperactivity Disorder (ADHD). Method We examine implicit contextual cueing (CC) effects among children with ADHD (n=72) and non-ADHD Controls (n=36). Results Using Ratcliff’s drift diffusion model, we found that among Controls, the CC effect is due to improvements in attentional guidance and to reductions in response threshold. Children with ADHD did not show a CC effect; although they were able to use implicitly acquired information to deploy attentional focus, they had more difficulty adjusting their response thresholds. Conclusions Improvements in attentional guidance and reductions in response threshold together underlie the CC effect. Results are consistent with neurocognitive models of ADHD that posit sub-cortical dysfunction but intact spatial attention, and encourage the use of alternative data analytic methods when dealing with reaction time data. PMID:24798140
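
    The drift-diffusion process referenced above can be sketched directly: evidence accumulates from a starting point toward two boundaries, with the drift rate playing the role of attentional guidance quality and the boundary separation the response threshold. All parameter values below are illustrative, not fitted to the study's data.

```python
import numpy as np

# Hedged sketch of a drift-diffusion (Ratcliff-style) trial simulator.
# Evidence starts at a/2 and random-walks with drift v and noise s until
# it hits 0 (error) or a (correct). Parameters are invented for illustration.
def simulate_ddm(v=0.3, a=1.0, s=1.0, dt=0.001, n=2000, seed=2):
    rng = np.random.default_rng(seed)
    rts, correct = [], []
    for _ in range(n):
        x, t = a / 2, 0.0
        while 0 < x < a:
            x += v * dt + s * np.sqrt(dt) * rng.normal()  # Euler step
            t += dt
        rts.append(t)
        correct.append(x >= a)      # upper boundary = correct response
    return np.mean(rts), np.mean(correct)

rt_mean, acc = simulate_ddm()
print(f"mean RT {rt_mean:.2f}s, accuracy {acc:.2f}")
```

Lowering the threshold a shortens reaction times at the cost of accuracy, which is the trade-off the authors examine in the contextual cueing effect.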

  7. Unit mechanisms of fission gas release: Current understanding and future needs

    NASA Astrophysics Data System (ADS)

    Tonks, Michael; Andersson, David; Devanathan, Ram; Dubourg, Roland; El-Azab, Anter; Freyss, Michel; Iglesias, Fernando; Kulacsy, Katalin; Pastore, Giovanni; Phillpot, Simon R.; Welland, Michael

    2018-06-01

    Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel and gap properties. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are beginning to reveal new understanding of the unit mechanisms that define fission product behavior. Here, existing research on the basic mechanisms of fission gas release during normal reactor operation is summarized and critical areas where work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior and to design fuels with improved performance. In addition, this work can serve as a model of how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.

  8. Exploiting salient semantic analysis for information retrieval

    NASA Astrophysics Data System (ADS)

    Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui

    2016-11-01

    Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification, and information retrieval. Among these methods, salient semantic analysis (SSA) has proven to be an effective way to generate conceptual representations for words or documents. However, its feasibility and effectiveness in information retrieval are largely unknown. In this paper, we study how to use SSA efficiently to improve information retrieval performance, and propose an SSA-based retrieval method under the language model framework. First, the SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations are used in combination to estimate the language models of queries and documents. The proposed method is evaluated on several standard Text REtrieval Conference (TREC) collections, where it consistently outperforms existing Wikipedia-based retrieval methods.
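
    The combination idea can be sketched as a query-likelihood score that linearly interpolates a smoothed bag-of-words language model with a conceptual similarity. The "concept" vectors below are toy stand-ins for real SSA representations, and the weights beta and lam are illustrative, not the paper's tuned values.

```python
import math
from collections import Counter

# Hedged sketch: Jelinek-Mercer-smoothed query likelihood (BOW term)
# combined linearly with a concept-level cosine similarity.
def bow_loglik(query, doc, collection, lam=0.5):
    d, c = Counter(doc), Counter(collection)
    score = 0.0
    for w in query:
        p_d = d[w] / len(doc)                  # document language model
        p_c = c[w] / len(collection)           # collection smoothing
        score += math.log(lam * p_d + (1 - lam) * p_c + 1e-12)
    return score

def concept_score(q_vec, d_vec):
    # cosine similarity between SSA-style concept vectors (toy stand-ins)
    dot = sum(q_vec[k] * d_vec.get(k, 0.0) for k in q_vec)
    nq = math.sqrt(sum(v * v for v in q_vec.values()))
    nd = math.sqrt(sum(v * v for v in d_vec.values()))
    return dot / (nq * nd + 1e-12)

def combined(query, doc, collection, q_vec, d_vec, beta=0.7):
    return beta * bow_loglik(query, doc, collection) + \
           (1 - beta) * concept_score(q_vec, d_vec)

# toy usage: the on-topic document ranks higher
query = ["river", "flood"]
doc1, doc2 = ["river", "flood", "model"], ["cat", "dog", "pet"]
collection = doc1 + doc2
s1 = combined(query, doc1, collection, {"hydro": 1.0}, {"hydro": 0.9})
s2 = combined(query, doc2, collection, {"hydro": 1.0}, {"animal": 1.0})
print(s1 > s2)
```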

  9. Expanded Study on the accumulation effect of tourism under the constraint of structure

    NASA Astrophysics Data System (ADS)

    Wang, Qiang; Yang, Zhenzhi; Huang, Lu

    2017-05-01

    Sectoral structure and accumulation and growth influence each other: the accumulation and growth of the tourism industry are constrained by the existing industrial structure and, conversely, act upon that structure. In "Research on tourism growth based on structural constraints", Li Jingyi examined the relationship between tourism growth and the existing industrial structure, identifying the specific interdependence between tourism and other economic sectors in terms of accumulation and growth. However, Li Jingyi's research is based on the trichotomy of social product value; while understandable in theory, it is too abstract, and in practice it is difficult to use that paper's model to deal with specific problems. How to improve the industry-association model in Li's paper and make it better fit actual conditions therefore becomes our concern. In this paper, we improve the model of Li's paper by simplifying the decomposition of social product value, and make a further study of accumulation elasticity and growth elasticity. On this basis, some suggestions are put forward for guiding the development of other industries based on the tourism industry.

  10. Differential effects on socioeconomic groups of modelling the location of mammography screening clinics using Geographic Information Systems.

    PubMed

    Hyndman, J C; Holman, C D

    2000-06-01

    To evaluate spatial access to mammography clinics and to investigate whether relocating clinics can improve global access. To determine whether any change in access is distributed equitably between different social groups. The study was undertaken in Perth, Western Australia in 1996. It was an analysis of travel distances to mammography clinics, comparing distances under the existing pattern of clinics with those under modelled relocated clinics. The study population was the 151,162 women aged 40-64 years resident in Perth in 1991. Overall travel distance to the existing clinics was reduced by 14% when a GIS was used to relocate them so as to minimise the travel distance for all women. The travel distance of the most disadvantaged group fell by 2%, and that of the least disadvantaged group by 24%. GIS modelling can be used to advantage to evaluate potential locations for screening clinics that improve access for the target population; however, global analysis should be supplemented by analysis of particular groups to ensure that no group is disadvantaged by the proposal. If new technology is not used to evaluate the placement of health services, population travel distances may be greater than necessary, with possible impacts on attendance rates.
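
    The optimisation behind such a relocation analysis is essentially a p-median problem: choose clinic sites minimising the population's total travel distance to its nearest clinic. The brute-force sketch below uses made-up coordinates and population weights, not the Perth data, and straight-line distance in place of road travel distance.

```python
import itertools
import math

# Hedged sketch of a p-median site search. Each tuple is
# ((x, y), population weight); all values are invented for illustration.
women = [((0, 0), 500), ((5, 0), 300), ((0, 5), 200), ((8, 8), 400)]
candidates = [(0, 0), (2, 2), (5, 1), (7, 7), (4, 4)]

def total_distance(sites):
    # population-weighted distance to the nearest chosen site
    return sum(w * min(math.dist(p, s) for s in sites) for p, w in women)

# exhaustively try every pair of candidate sites (fine at toy scale;
# real problems use heuristics or integer programming)
best = min(itertools.combinations(candidates, 2), key=total_distance)
print(best, round(total_distance(best), 1))
```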

  11. Aerodynamic study of state transport bus using computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Kanekar, Siddhesh; Thakre, Prashant; Rajkumar, E.

    2017-11-01

    The main purpose of this study was to develop an aerodynamic study of a Maharashtra state road transport bus. Rising fuel prices and strict government regulations make road transport uneconomical nowadays, motivating the objectives of increasing fuel efficiency and reducing the emission of harmful exhaust gases. It has been shown experimentally that a vehicle consumes almost 40% of the available useful engine power to overcome drag resistance, which provides large scope for studying the influence of aerodynamic drag. The initial phase of the project was to identify the drag coefficient of the existing ordinary-type model called “Parivartan” using ANSYS Fluent. After preliminary analysis of the existing model, changes were made in such a way that their implementation would be possible at workshop level. The simulation of the air flow over the bus was performed in two steps: the geometry was designed in SolidWorks CAD, and ANSYS Fluent was used as a virtual analysis tool to estimate the drag coefficient of the bus. We used the realizable k-ε turbulence model, which gives a better approximation of the actual result. The CFD-driven changes in the bus design improved the drag coefficient by around 28% and increased fuel efficiency by 20%.
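
    A back-of-the-envelope check shows why a 28% drag-coefficient reduction matters: aerodynamic drag power grows with the cube of speed, P = ½ρ·Cd·A·v³. The frontal area, baseline Cd, and cruising speed below are assumed typical bus values, not figures reported in the study.

```python
# Hedged sketch: drag power before and after a 28% Cd reduction.
# rho, A, v and the baseline Cd are illustrative assumptions.
rho = 1.2          # air density, kg/m^3
A = 7.0            # assumed bus frontal area, m^2
v = 80 / 3.6       # 80 km/h expressed in m/s
cd_before = 0.70   # assumed baseline drag coefficient
cd_after = cd_before * (1 - 0.28)

def drag_power_kw(cd):
    # P = 0.5 * rho * Cd * A * v^3, converted to kW
    return 0.5 * rho * cd * A * v ** 3 / 1000.0

saving = drag_power_kw(cd_before) - drag_power_kw(cd_after)
print(f"{drag_power_kw(cd_before):.1f} kW -> {drag_power_kw(cd_after):.1f} kW "
      f"({saving:.1f} kW saved at 80 km/h)")
```

Because drag power scales linearly with Cd, the 28% coefficient reduction cuts drag power by 28% at any given speed, consistent with a substantial fuel-economy gain at highway speeds.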

  12. Optimized Structure of the Traffic Flow Forecasting Model With a Deep Learning Approach.

    PubMed

    Yang, Hao-Fan; Dillon, Tharam S; Chen, Yi-Ping Phoebe

    2017-10-01

    Forecasting accuracy is an important issue for successful intelligent traffic management, especially in the domain of traffic efficiency and congestion reduction. The dawning of the big data era brings opportunities to greatly improve prediction accuracy. In this paper, we propose a novel model, the stacked autoencoder Levenberg-Marquardt model, a deep neural network architecture aimed at improving forecasting accuracy. The proposed model is designed using the Taguchi method to develop an optimized structure and to learn traffic flow features through layer-by-layer feature granulation with a greedy layerwise unsupervised learning algorithm. It is applied to real-world data collected from the M6 freeway in the U.K. and is compared with three existing traffic predictors. To the best of our knowledge, this is the first time that an optimized structure of a traffic flow forecasting model with a deep learning approach has been presented. The evaluation results demonstrate that the proposed model with an optimized structure has superior performance in traffic flow forecasting.
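
    The greedy layerwise idea can be sketched in a few lines: each autoencoder layer is trained on its own to reconstruct its input, and its learned encoding then becomes the input to the next layer. This is a minimal numpy illustration only; the paper's Taguchi structure optimisation and Levenberg-Marquardt training are omitted, and all dimensions and data are invented.

```python
import numpy as np

# Hedged sketch of greedy layerwise autoencoder pretraining with plain
# gradient descent on squared reconstruction error.
rng = np.random.default_rng(4)

def train_autoencoder(X, hidden, lr=0.1, epochs=500):
    n, d = X.shape
    W1 = rng.normal(0, 0.1, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, d)); b2 = np.zeros(d)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)          # encode
        R = H @ W2 + b2                   # decode (linear output)
        err = R - X                       # reconstruction error
        dW2 = H.T @ err / n; db2 = err.mean(0)
        dH = err @ W2.T * (1 - H ** 2)    # backprop through tanh
        dW1 = X.T @ dH / n; db1 = dH.mean(0)
        W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2
    return np.tanh(X @ W1 + b1)           # learned encoding

# toy "traffic flow" features; stack two layers greedily: 8 -> 5 -> 3
X = rng.normal(size=(200, 8))
H1 = train_autoencoder(X, 5)              # layer 1 trained alone
H2 = train_autoencoder(H1, 3)             # layer 2 trained on layer 1's codes
print(H2.shape)
```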

  13. Estimating abundance of an open population with an N-mixture model using auxiliary data on animal movements.

    PubMed

    Ketz, Alison C; Johnson, Therese L; Monello, Ryan J; Mack, John A; George, Janet L; Kraft, Benjamin R; Wild, Margaret A; Hooten, Mevin B; Hobbs, N Thompson

    2018-04-01

    Accurate assessment of abundance forms a central challenge in population ecology and wildlife management. Many statistical techniques have been developed to estimate population sizes because populations change over time and space, and to correct for the bias resulting from animals that are present in a study area but not observed. The mobility of individuals makes it difficult to design sampling procedures that account for movement into and out of areas with fixed jurisdictional boundaries. Aerial surveys are the gold standard for obtaining data on large mobile species in geographic regions with harsh terrain, but these surveys can be prohibitively expensive and dangerous. Estimating abundance with ground-based census methods has practical advantages, but it can be difficult to simultaneously account for temporary emigration and observer error to avoid biased results. Contemporary research in population ecology increasingly relies on telemetry observations of the states and locations of individuals to gain insight into vital rates, animal movements, and population abundance, yet analytical models that use observations of movements to improve estimates of abundance have not previously been developed. Here we build upon existing multi-state mark-recapture methods using a hierarchical N-mixture model with multiple sources of data, including telemetry data on locations of individuals, to improve estimates of population sizes. We used a state-space approach to model animal movements to approximate the number of marked animals present within the study area at any observation period, thereby accounting for a frequently changing number of marked individuals. We illustrate the approach using data on a population of elk (Cervus elaphus nelsoni) in Northern Colorado, USA, demonstrate substantial improvement over existing abundance estimation methods, and corroborate our ground-based estimates with estimates from aerial surveys conducted during the same seasons. 
We develop a hierarchical Bayesian N-mixture model using multiple sources of data on abundance, movement and survival to estimate the population size of a mobile species that uses remote conservation areas. The model improves accuracy of inference relative to previous methods for estimating abundance of open populations. © 2018 by the Ecological Society of America.
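
    The closed-population N-mixture likelihood the authors build on can be written compactly: site abundance N_i is Poisson(lam), repeated counts y_ij are Binomial(N_i, p), and the unobserved N_i is marginalised out. The sketch below is this basic version only, with a coarse grid search for the maximum-likelihood estimates; the paper's hierarchical movement and telemetry components are not reproduced, and all data and parameter values are simulated.

```python
import math
import numpy as np

# Hedged sketch of the basic (closed-population) N-mixture likelihood.
def nmix_nll(lam, p, counts, n_max=60):
    nll = 0.0
    for site in counts:                   # site = counts over repeat visits
        like = 0.0
        for n in range(max(site), n_max + 1):     # marginalise over N
            pois = math.exp(-lam) * lam ** n / math.factorial(n)
            binom = math.prod(
                math.comb(n, y) * p ** y * (1 - p) ** (n - y) for y in site)
            like += pois * binom
        nll -= math.log(like + 1e-300)
    return nll

# simulate 50 sites x 4 visits at a hypothetical truth lam=20, p=0.5,
# then grid-search the likelihood surface
rng = np.random.default_rng(3)
N = rng.poisson(20, size=50)
counts = [list(rng.binomial(n, 0.5, size=4)) for n in N]
grid = [(l, p) for l in range(10, 31) for p in (0.3, 0.4, 0.5, 0.6, 0.7)]
lam_hat, p_hat = min(grid, key=lambda t: nmix_nll(*t, counts))
print(lam_hat, p_hat)
```

The product lam*p (expected count per visit) is well identified even when lam and p individually trade off against each other, which is one reason auxiliary movement data can sharpen the abundance estimate.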

  14. Improved Delayed-Neutron Spectroscopy Using Trapped Ions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norman, Eric B.

    The neutrons emitted following the β decay of fission fragments (known as delayed neutrons because they are emitted after fission on the timescale of the β-decay half-lives) play a crucial role in reactor performance and control. Reviews of delayed-neutron properties highlight the need for high-quality data for a wide variety of delayed-neutron emitters to better understand the time dependence and energy spectrum of the neutrons, as these properties are essential for the detailed understanding of reactor kinetics needed for reactor safety and for understanding the behavior of reactors under various accident and component-failure scenarios. For fast breeder reactors, criticality calculations require accurate delayed-neutron energy spectra; approximations that are acceptable for light-water reactors, such as assuming that the delayed-neutron and fission-neutron energy spectra are identical, are not acceptable, and improved β-delayed neutron data are needed for safety and accident analyses of these reactors. With improved nuclear data, the delayed-neutron flux and energy spectrum could be calculated from the contributions of individual isotopes and therefore could be accurately modeled for any fuel-cycle concept, actinide mix, or irradiation history. High-quality β-delayed neutron measurements are also critical to constrain modern nuclear-structure calculations and empirical models that predict decay properties for nuclei for which no data exist, and to improve the accuracy and flexibility of the existing empirical descriptions of delayed neutrons from fission, such as the six-group representation.
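
    The six-group representation mentioned above models the delayed-neutron precursor population after a fission burst as a sum of six exponentials. The decay constants and relative group yields below are approximate textbook-style values for thermal fission of U-235, given only to illustrate the functional form; evaluated nuclear data should be consulted for any real analysis.

```python
import math

# Hedged sketch of the six-group delayed-neutron representation.
# Values are approximate illustrative constants, not evaluated data.
lams = [0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01]   # decay constants, 1/s
fracs = [0.033, 0.219, 0.196, 0.395, 0.115, 0.042]  # relative group yields

def delayed_fraction_remaining(t):
    """Fraction of delayed-neutron precursors still undecayed at time t (s)."""
    return sum(f * math.exp(-lam * t) for f, lam in zip(fracs, lams))

print(round(delayed_fraction_remaining(10.0), 3))
```

The long-lived groups dominate after a few seconds, which is what gives a reactor its controllable response time despite prompt neutrons appearing almost instantly.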

  15. Intermittent Fasting: Is the Wait Worth the Weight?

    PubMed

    Stockman, Mary-Catherine; Thomas, Dylan; Burke, Jacquelyn; Apovian, Caroline M

    2018-06-01

    We review the underlying mechanisms and potential benefits of intermittent fasting (IF) from animal models and recent clinical trials. Numerous variations of IF exist, and study protocols vary greatly in their interpretations of this weight loss trend. Most human IF studies result in minimal weight loss and marginal improvements in metabolic biomarkers, though outcomes vary. Some animal models have found that IF reduces oxidative stress, improves cognition, and delays aging. Additionally, IF has anti-inflammatory effects, promotes autophagy, and benefits the gut microbiome. The benefit-to-harm ratio varies by model, IF protocol, age at initiation, and duration. We provide an integrated perspective on potential benefits of IF as well as key areas for future investigation. In clinical trials, caloric restriction and IF result in similar degrees of weight loss and improvement in insulin sensitivity. Although these data suggest that IF may be a promising weight loss method, IF trials have been of moderate sample size and limited duration. More rigorous research is needed.

  16. Sensor-Based Optimization Model for Air Quality Improvement in Home IoT

    PubMed Central

    Kim, Jonghyuk

    2018-01-01

We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that this practical marketing application method will help expand the home IoT market. PMID:29570684
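The abstract names the natural cubic spline method but gives no implementation. Below is a generic, self-contained natural cubic spline interpolant (second derivative zero at both ends, interior system solved by the Thomas algorithm) — a standard construction, not the authors' pipeline — of the kind that could smooth irregular sensor readings.

```python
def natural_cubic_spline(xs, ys):
    """Return a callable natural cubic spline through (xs, ys).
    xs must be strictly increasing."""
    n = len(xs) - 1
    h = [xs[i + 1] - xs[i] for i in range(n)]
    # Tridiagonal system for the second derivatives M[0..n];
    # natural boundary conditions force M[0] = M[n] = 0.
    a = [0.0] * (n + 1); b = [0.0] * (n + 1)
    c = [0.0] * (n + 1); d = [0.0] * (n + 1)
    b[0] = b[n] = 1.0
    for i in range(1, n):
        a[i] = h[i - 1]
        b[i] = 2.0 * (h[i - 1] + h[i])
        c[i] = h[i]
        d[i] = 6.0 * ((ys[i + 1] - ys[i]) / h[i] - (ys[i] - ys[i - 1]) / h[i - 1])
    # Thomas algorithm: forward elimination, then back substitution.
    for i in range(1, n + 1):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    M = [0.0] * (n + 1)
    M[n] = d[n] / b[n]
    for i in range(n - 1, -1, -1):
        M[i] = (d[i] - c[i] * M[i + 1]) / b[i]

    def f(x):
        # Locate the interval containing x (clamped to the end intervals).
        i = max(0, min(n - 1, next((j for j in range(n) if x <= xs[j + 1]), n - 1)))
        t = x - xs[i]
        slope = (ys[i + 1] - ys[i]) / h[i] - h[i] * (2.0 * M[i] + M[i + 1]) / 6.0
        return (ys[i] + t * slope + t * t * M[i] / 2.0
                + t * t * t * (M[i + 1] - M[i]) / (6.0 * h[i]))
    return f
```

On linear data the interior system is zero, so the spline degenerates to the straight line through the knots, which makes a convenient sanity check.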

  17. Sensor-Based Optimization Model for Air Quality Improvement in Home IoT.

    PubMed

    Kim, Jonghyuk; Hwangbo, Hyunwoo

    2018-03-23

We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that this practical marketing application method will help expand the home IoT market.

  18. Derivation and calibration of a gas metal arc welding (GMAW) dynamic droplet model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reutzel, E.W.; Einerson, C.J.; Johnson, J.A.

    1996-12-31

A rudimentary, existing dynamic model for droplet growth and detachment in gas metal arc welding (GMAW) was improved and calibrated to match experimental data. The model simulates droplets growing at the end of an imaginary spring. Mass is added to the drop as the electrode melts, the droplet grows, and the spring is displaced. Detachment occurs when one of two criteria is met, and the amount of mass that is detached is a function of the droplet velocity at the time of detachment. Improvements to the model include the addition of a second criterion for drop detachment, a more sophisticated model of the power supply and secondary electric circuit, and the incorporation of a variable electrode resistance. Relevant physical parameters in the model were adjusted during model calibration. The average current, droplet frequency, and the parameter-space location of the globular-to-streaming mode transition were used as criteria for tuning the model. The average current predicted by the calibrated model matched the experimental average current to within 5% over a wide range of operating conditions.
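The spring-mass picture described above can be sketched as a toy simulation: mass accretes at a constant melt rate, the droplet obeys m·x'' = m·g − k·x − c·x', and a drop is recorded whenever displacement exceeds a critical value. All parameter values below are invented for illustration and are not calibrated to the paper's experiments; the paper's second detachment criterion, circuit model, and variable electrode resistance are omitted.

```python
def simulate(melt_rate=1e-5, k=10.0, c=1e-3, g=9.81,
             x_crit=1.2e-6, m0=1e-7, dt=1e-5, t_end=0.5):
    """Integrate the toy droplet equation m*x'' = m*g - k*x - c*x' with
    semi-implicit Euler while the mass grows at melt_rate (kg/s); record
    (time, mass) whenever the displacement exceeds x_crit (m)."""
    drops = []
    m, x, v, t = m0, 0.0, 0.0, 0.0
    while t < t_end:
        m += melt_rate * dt                  # electrode melting feeds the drop
        a = g - (k * x + c * v) / m          # gravity vs. spring + damping
        v += a * dt                          # semi-implicit (symplectic) Euler
        x += v * dt
        if x > x_crit:                       # single displacement criterion
            drops.append((t, m))
            m, x, v = m0, 0.0, 0.0           # small residual stub restarts the cycle
        t += dt
    return drops
```

With these toy values the drop tracks its quasi-static equilibrium x = m·g/k, so detachment occurs near m ≈ k·x_crit/g and repeats several times over the simulated half second.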

  19. Statistical models for incorporating data from routine HIV testing of pregnant women at antenatal clinics into HIV/AIDS epidemic estimates.

    PubMed

    Sheng, Ben; Marsh, Kimberly; Slavkovic, Aleksandra B; Gregson, Simon; Eaton, Jeffrey W; Bao, Le

    2017-04-01

HIV prevalence data collected from routine HIV testing of pregnant women at antenatal clinics (ANC-RT) are potentially available from all facilities that offer testing services to pregnant women and can be used to improve estimates of national and subnational HIV prevalence trends. We develop methods to incorporate this new data source into the Joint United Nations Programme on HIV/AIDS (UNAIDS) Estimation and Projection Package in Spectrum 2017. We develop a new statistical model for incorporating ANC-RT HIV prevalence data, aggregated either to the health facility level (site-level) or regionally (census-level), to estimate HIV prevalence alongside existing sources of HIV prevalence data from ANC unlinked anonymous testing (ANC-UAT) and household-based national population surveys. Synthetic data are generated to understand how the availability of ANC-RT data affects the accuracy of various parameter estimates. We estimate HIV prevalence and additional parameters using both ANC-RT and other existing data. Fitting HIV prevalence using synthetic data generally gives precise estimates of the underlying trend and other parameters. More years of ANC-RT data should improve prevalence estimates. More ANC-RT sites and continuation with existing ANC-UAT sites may improve the estimate of calibration between ANC-UAT and ANC-RT sites. We have proposed methods to incorporate ANC-RT data into Spectrum to obtain more precise estimates of prevalence and other measures of the epidemic. Many assumptions about the accuracy, consistency, and representativeness of ANC-RT prevalence underlie the use of these data for monitoring HIV epidemic trends and should be tested as more data become available from national ANC-RT programs.
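A common simplification of the calibration problem described above is to model ANC prevalence as population prevalence shifted by a constant bias on the logit scale. The sketch below generates synthetic data under that assumption (the bias value, noise level, and trend are invented for illustration, not taken from the paper) and recovers the bias as the mean logit difference.

```python
import math
import random

def logit(p):
    return math.log(p / (1.0 - p))

def inv_logit(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)

# Synthetic example: population prevalence rises linearly; clinic-based
# (ANC) prevalence equals it plus a constant logit-scale bias, observed
# with small noise.  All numbers here are illustrative assumptions.
true_bias = 0.3
pop_prev = [0.05 + 0.01 * t for t in range(10)]
anc_prev = [inv_logit(logit(p) + true_bias + random.gauss(0.0, 0.02))
            for p in pop_prev]

# Estimate the calibration bias as the mean difference of logits.
bias_hat = sum(logit(a) - logit(p)
               for a, p in zip(anc_prev, pop_prev)) / len(pop_prev)
```

The full Spectrum model instead places site-level random effects and separate ANC-UAT/ANC-RT calibration terms inside a Bayesian trend model; this two-line estimator only illustrates the logit-offset idea.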

  20. Statistical Models for Incorporating Data from Routine HIV Testing of Pregnant Women at Antenatal Clinics into HIV/AIDS Epidemic Estimates

    PubMed Central

    Sheng, Ben; Marsh, Kimberly; Slavkovic, Aleksandra B.; Gregson, Simon; Eaton, Jeffrey W.; Bao, Le

    2017-01-01

Objective: HIV prevalence data collected from routine HIV testing of pregnant women at antenatal clinics (ANC-RT) are potentially available from all facilities that offer testing services to pregnant women, and can be used to improve estimates of national and sub-national HIV prevalence trends. We develop methods to incorporate this new data source into the UNAIDS Estimation and Projection Package (EPP) in Spectrum 2017. Methods: We develop a new statistical model for incorporating ANC-RT HIV prevalence data, aggregated either to the health facility level ('site-level') or regionally ('census-level'), to estimate HIV prevalence alongside existing sources of HIV prevalence data from ANC unlinked anonymous testing (ANC-UAT) and household-based national population surveys. Synthetic data are generated to understand how the availability of ANC-RT data affects the accuracy of various parameter estimates. Results: We estimate HIV prevalence and additional parameters using both ANC-RT and other existing data. Fitting HIV prevalence using synthetic data generally gives precise estimates of the underlying trend and other parameters. More years of ANC-RT data should improve prevalence estimates. More ANC-RT sites and continuation with existing ANC-UAT sites may improve the estimate of calibration between ANC-UAT and ANC-RT sites. Conclusion: We have proposed methods to incorporate ANC-RT data into Spectrum to obtain more precise estimates of prevalence and other measures of the epidemic. Many assumptions about the accuracy, consistency, and representativeness of ANC-RT prevalence underlie the use of these data for monitoring HIV epidemic trends, and should be tested as more data become available from national ANC-RT programs. PMID:28296804

Top