Horiguchi, Masatoshi; Miyata, Nariaki; Mizuno, Hiroshi
2017-04-01
In order to avoid epidermal heat damage, we developed a novel irradiation method termed "focused multiple laser beams (FMLB)," in which long-pulse neodymium:yttrium aluminum garnet (Nd:YAG) laser beams are delivered from several directions in a concentric fashion and focused into the dermis, sparing the epidermis. This study aimed to assess whether FMLB achieves the desired dermal improvement without epidermal damage. The dorsal skin of New Zealand White rabbits was irradiated with FMLB. Macroscopic and histological analyses were performed after 1 hour and at 1, 2, 3, and 4 weeks. Real-time PCR analysis of type I and III collagen expression was performed at two and four weeks. The control groups exhibited skin ulcers that healed with scar formation, whereas the skin in the FMLB groups remained macroscopically intact. Histologically, the FMLB group showed an increase in dermal thickness at four weeks while the epidermis remained intact. Real-time PCR demonstrated that both type I and III collagen expression increased at two weeks but decreased at four weeks. FMLB can deliver the target laser energy to the dermis without significantly affecting the epidermis.
NASA Astrophysics Data System (ADS)
Zhuo, Congshan; Zhong, Chengwen
2016-11-01
In this paper, a three-dimensional filter-matrix lattice Boltzmann (FMLB) model based on large eddy simulation (LES) was verified for simulating wall-bounded turbulent flows. The Vreman subgrid-scale model, which has been shown to predict the turbulent near-wall region accurately, was employed in the present FMLB-LES framework. Fully developed turbulent channel flows were simulated at a friction Reynolds number Reτ of 180. The turbulence statistics computed from the present FMLB-LES simulations, including the mean streamwise velocity profile, Reynolds stress profile, and root-mean-square velocity fluctuations, agreed well with the LES results of the multiple-relaxation-time (MRT) LB model; some discrepancies with respect to the direct numerical simulation (DNS) data of Kim et al. were also observed, owing to the relatively low grid resolution. Moreover, to investigate the influence of grid resolution on the present LES simulation, a DNS simulation on a finer grid was also performed with the present FMLB-D3Q19 model. Comparisons of the various computed turbulence statistics with available DNS benchmark data showed good agreement.
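The paper's FMLB-LES code is not reproduced here; as a minimal sketch of how the turbulence statistics it reports (mean velocity profile, RMS fluctuations, Reynolds stress) are typically computed from velocity snapshots, assuming samples stored as NumPy arrays:

```python
import numpy as np

def channel_statistics(u, v):
    """Basic turbulence statistics from velocity snapshots.

    u, v : arrays of shape (n_snapshots, n_y) -- streamwise and
    wall-normal velocity sampled at wall-normal profile points.
    Averaging over snapshots stands in for averaging over time
    and the homogeneous directions of the channel.
    """
    u_mean = u.mean(axis=0)                 # mean streamwise profile
    u_fluc = u - u_mean                     # fluctuations about the mean
    v_fluc = v - v.mean(axis=0)
    u_rms = np.sqrt((u_fluc ** 2).mean(axis=0))       # RMS fluctuations
    reynolds_stress = (u_fluc * v_fluc).mean(axis=0)  # <u'v'>
    return u_mean, u_rms, reynolds_stress
```

The same reductions apply whether the snapshots come from an LB solver or a spectral DNS; only the sampling differs.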
Yu, Weiwen; Du, Pengcheng; Chen, Chen; Lu, Shan; Kan, Biao; Du, Xiaoping; Xu, Jianguo
2014-03-01
Farmers' markets with live birds (FMLB) are key sites where human infections with influenza A virus subtype H7N9 have occurred; approximately 80% of cases had been exposed to FMLB. This study investigated the geographic relationship between FMLB and human cases based on internet data on their geographic locations. Using big data from the internet, we searched for all FMLB in the cities where human cases had been reported, analyzed their geographic relations, and evaluated the possibility that patients had visited the FMLB around them. The densities of FMLB, population, and live poultry were also analyzed. Forty-two cities and 10 615 markets were included in the study. The number of human cases was positively correlated with population density and with the number and density of markets. Except for three markets in Foshan, human cases had been reported within 5 km of 10 of 13 markets, showing that live bird trading is highly relevant to the distribution of cases. We identified 13 hot spots, in cities including Hangzhou and Shenzhen, where clustered cases have emerged. The number of human cases is significantly higher in cities where FMLB have tested positive for H7N9 virus. These virus-positive markets usually affect residents within a 5 km area. The number and location of FMLB in cities should be re-evaluated and re-planned for a healthy city in which the risk of residents being infected with avian influenza virus is greatly reduced or eliminated.
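The study's exact GIS workflow is not given; a minimal sketch of the 5 km proximity screen it describes, using the haversine great-circle distance (function names and the example coordinates are illustrative, not from the paper):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def cases_near_market(cases, market, radius_km=5.0):
    """Count case locations within radius_km of a market's (lat, lon)."""
    return sum(1 for lat, lon in cases
               if haversine_km(lat, lon, market[0], market[1]) <= radius_km)
```

Applying this screen to every (case, market) pair yields the "cases within 5 km" counts the abstract reports.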
Hunt, R.J.; Anderson, M.P.; Kelson, V.A.
1998-01-01
This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.
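Analytic element models build far-field solutions by superposing closed-form elements. As a toy sketch of that superposition idea (not the authors' model; all parameter values, the reference distance, and the uniform-flow-plus-Thiem-well setup are hypothetical):

```python
import math

def head(x, y, T=100.0, h0=50.0, q0=0.5, wells=((0.0, 0.0, 300.0),)):
    """Head [m] at (x, y) from superposing a uniform far-field flow in the
    +x direction and pumping wells, the simplest analytic elements.

    T  : transmissivity [m^2/d];  h0 : far-field head [m]
    q0 : uniform discharge per unit width [m^2/d]
    wells : tuples (xw, yw, Q) with pumping rate Q [m^3/d]
    """
    r_ref = 1000.0  # reference distance at which a well causes no drawdown
    phi = T * h0 - q0 * x  # discharge potential of the uniform flow
    for xw, yw, Q in wells:
        r = math.hypot(x - xw, y - yw)
        phi -= Q / (2.0 * math.pi) * math.log(r_ref / r)  # Thiem drawdown
    return phi / T
```

Evaluating such a superposed solution along the boundary of a finite-difference grid is one way the far-field solution can supply boundary conditions to the near-field model.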
[The improvement of mixed human serum-induced anaphylactic reaction death model in guinea pigs].
Chen, Jiong-Yuan; Lai, Yue; Li, Dang-Ri; Yue, Xia; Wang, Hui-Jun
2012-12-01
To increase the death rate of fatal anaphylaxis in guinea pigs and the detectable level of mast cell tryptase in blood serum. Seventy-four guinea pigs were randomly divided into five groups: original model group, original model control group, improved model group, improved model control group, and improved model with non-anaphylaxis group. Using mixed human serum as the allergen, the route of injection, sensitization, and induction were improved. ELISA was used to detect the serum mast cell tryptase and total IgE in guinea pigs of each group. The death rate of fatal anaphylaxis in the original model group was 54.2%, with different degrees of hemopericardium. Severe pericardial tamponade appeared in 9 guinea pigs in the original model group and original model control group. The death rate of fatal anaphylaxis in the improved model group was 75%, without pericardial tamponade. The concentration of serum total IgE showed no statistical difference between the original model group and the original model control group (P > 0.05), but the serum mast cell tryptase level was higher in the original model group than in the original model control group (P > 0.05). The concentrations of serum total IgE and serum mast cell tryptase were significantly higher in the improved model group than in the improved model control group (P < 0.05). The death rate in the improved model is significantly increased, which can provide an effective animal model for the study of serum total IgE and mast cell tryptase.
Toward an improvement over Kerner-Klenov-Wolf three-phase cellular automaton model.
Jiang, Rui; Wu, Qing-Song
2005-12-01
The Kerner-Klenov-Wolf (KKW) three-phase cellular automaton model exhibits a nonrealistic velocity of the upstream front of the widening synchronized flow pattern, which separates synchronized flow downstream from free flow upstream. This paper presents an improved model that combines the initial KKW model with a modified Nagel-Schreckenberg (MNS) model. In the improved KKW model, a parameter is introduced to determine whether a vehicle moves according to the MNS model or the initial KKW model. The improved KKW model not only reproduces the empirical observations as the initial KKW model does, but also overcomes the nonrealistic velocity problem. The mechanism of the improvement is discussed.
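The KKW and MNS rule sets are not reproduced in the abstract; as a minimal sketch of the classic Nagel-Schreckenberg update that the MNS component modifies (parameter values illustrative):

```python
import random

def nasch_step(pos, vel, road_len, v_max=5, p_slow=0.3, rng=None):
    """One parallel update of the Nagel-Schreckenberg CA on a ring road.

    pos, vel : per-vehicle cell positions (sorted) and speeds.
    Returns new positions and speeds on a circular road of road_len cells.
    """
    rng = rng or random.Random()
    n = len(pos)
    new_vel = []
    for i in range(n):
        gap = (pos[(i + 1) % n] - pos[i] - 1) % road_len  # empty cells ahead
        v = min(vel[i] + 1, v_max)          # 1. acceleration
        v = min(v, gap)                     # 2. braking to avoid collision
        if v > 0 and rng.random() < p_slow:
            v -= 1                          # 3. random slowdown
        new_vel.append(v)
    new_pos = [(x + v) % road_len for x, v in zip(pos, new_vel)]  # 4. move
    return new_pos, new_vel
```

Three-phase models such as KKW replace the simple braking rule with synchronization-distance rules; the parallel update structure is the same.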
Researchers' Roles in Patient Safety Improvement.
Pietikäinen, Elina; Reiman, Teemu; Heikkilä, Jouko; Macchi, Luigi
2016-03-01
In this article, we explore how researchers can contribute to patient safety improvement. We aim to expand the instrumental role researchers have often occupied in relation to patient safety improvement. We reflect on our own improvement model and experiences as patient safety researchers in an ongoing Finnish multi-actor innovation project through self-reflective narration. Our own patient safety improvement model can be described as systemic. Based on the purpose of the innovation project, our improvement model, and the improvement models of the other actors in the project, we have carried out a wide range of activities. Our activities can be summarized in 8 overlapping patient safety improvement roles: modeler, influencer, supplier, producer, ideator, reflector, facilitator, and negotiator. When working side by side with "practice," researchers are offered and engage in several different activities. The way researchers contribute to patient safety improvement and balance between different roles depends on the purpose of the study, as well as on the underlying patient safety improvement models. Different patient safety research paradigms seem to emphasize different improvement roles, and thus, they also face different challenges. Open reflection on the underlying improvement models and roles can help researchers with different backgrounds-as well as other actors involved in patient safety improvement-in structuring their work and collaborating productively.
A residency clinic chronic condition management quality improvement project.
Halverson, Larry W; Sontheimer, Dan; Duvall, Sharon
2007-02-01
Quality improvement in chronic disease management is a major agenda for improving health and reducing health care costs. A six-component chronic disease management model can help guide this effort. Several characteristics of the "new model" of family medicine described by the Future of Family Medicine (FFM) Project Leadership Committee are promulgated to foster practice changes that improve quality. Our objective was to implement and assess a quality improvement project guided by the components of a chronic disease management model and FFM new model characteristics. Diabetes was selected as a model chronic disease focus. Multiple practice changes were implemented. A mature electronic medical record facilitated data collection and measurement of quality improvement progress. Data from the diabetes registry demonstrates that our efforts have been effective. Significant improvement occurred in five out of six quality indicators. Multidisciplinary teamwork in a model residency practice guided by chronic disease management principles and the FFM new model characteristics can produce significant management improvements in one important chronic disease.
Quality Improvement on the Acute Inpatient Psychiatry Unit Using the Model for Improvement
Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean
2013-01-01
Background A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. Methods We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients—those starting or continuing on standing neuroleptics—with the Abnormal Involuntary Movement Scale (AIMS). Results After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Conclusion Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team. PMID:24052768
An improved Burgers cellular automaton model for bicycle flow
NASA Astrophysics Data System (ADS)
Xue, Shuqi; Jia, Bin; Jiang, Rui; Li, Xingang; Shan, Jingjing
2017-12-01
As an energy-efficient and healthy transport mode, bicycling has recently attracted the attention of governments, transport planners, and researchers. The dynamic characteristics of the bicycle flow must be investigated to improve the facility design and traffic operation of bicycling. We model the bicycle flow by using an improved Burgers cellular automaton model. Through a following move mechanism, the modified model enables bicycles to move smoothly and increase the critical density to a more rational level than the original model. The model is calibrated and validated by using experimental data and field data. The results show that the improved model can effectively simulate the bicycle flow. The performance of the model under different parameters is investigated and discussed. Strengths and limitations of the improved model are suggested for future work.
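The paper's following-move mechanism is not specified in the abstract; as a sketch of the underlying Burgers cellular automaton (the multi-value generalization of rule 184, where each cell holds up to m bicycles), on which the improved model is built:

```python
def burgers_ca_step(u, m=3):
    """One step of the Burgers cellular automaton on a ring.

    u : list with u[j] = number of bicycles in cell j (0 <= u[j] <= m).
    Each cell passes min(u[j], m - u[j+1]) bicycles forward: as many as
    it holds, limited by the free space in the cell ahead.
    """
    n = len(u)
    flow = [min(u[j], m - u[(j + 1) % n]) for j in range(n)]  # outflow of j
    return [u[j] - flow[j] + flow[j - 1] for j in range(n)]
```

The update conserves the number of bicycles, and the capacity m per cell is what lets several bicycles occupy one cell side by side, unlike car-traffic CA.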
Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems
2015-12-01
Dissertation by Sang M. Sok, December 2015; distribution is unlimited. The improved conceptual model methodology (ICoMM) is developed in support of improving the structure of the CoM for both face and
Agricultural model intercomparison and improvement project: Overview of model intercomparisons
USDA-ARS?s Scientific Manuscript database
Improvement of crop simulation models to better estimate growth and yield is one of the objectives of the Agricultural Model Intercomparison and Improvement Project (AgMIP). The overall goal of AgMIP is to provide an assessment of crop model through rigorous intercomparisons and evaluate future clim...
NASA Astrophysics Data System (ADS)
Jonny, Zagloed, Teuku Yuri M.
2017-11-01
This paper aims to present an integrated health care model for the Indonesian health care industry. Previous research describes two health care models in the industry: disease-centered and patient-centered care models. Of the two, the patient-centered care model is widely applied owing to its capability to reduce cost and improve quality simultaneously. However, there is still no comprehensive model that yields cost reduction, quality improvement, patient satisfaction, and hospital profitability simultaneously; this research is intended to develop such a model. First, a conceptual model using Kano's Model, Quality Function Deployment (QFD), and the Balanced Scorecard (BSC) is developed to generate the important elements of the model as required by stakeholders. Then, a case study of an Indonesian hospital is presented to evaluate the validity of the model using correlation analysis. The model is validated, implying several managerial insights among its elements: 1) leadership (r=0.85) and context of the organization (r=0.77) improve operations; 2) planning (r=0.96), support processes (r=0.87), and continual improvement (r=0.95) also improve operations; 3) operations improve customer satisfaction (r=0.89) and financial performance (r=0.93); and 4) customer satisfaction improves financial performance (r=0.98).
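The r values quoted above are Pearson correlation coefficients; a minimal computation of the kind used in such a correlation analysis (the input vectors here are illustrative, not the hospital data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)
```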
Interpreting incremental value of markers added to risk prediction models.
Pencina, Michael J; D'Agostino, Ralph B; Pencina, Karol M; Janssens, A Cecile J W; Greenland, Philip
2012-09-15
The discrimination of a risk prediction model measures that model's ability to distinguish between subjects with and without events. The area under the receiver operating characteristic curve (AUC) is a popular measure of discrimination. However, the AUC has recently been criticized for its insensitivity in model comparisons in which the baseline model has performed well. Thus, 2 other measures have been proposed to capture improvement in discrimination for nested models: the integrated discrimination improvement and the continuous net reclassification improvement. In the present study, the authors use mathematical relations and numerical simulations to quantify the improvement in discrimination offered by candidate markers of different strengths as measured by their effect sizes. They demonstrate that the increase in the AUC depends on the strength of the baseline model, which is true to a lesser degree for the integrated discrimination improvement. On the other hand, the continuous net reclassification improvement depends only on the effect size of the candidate variable and its correlation with other predictors. These measures are illustrated using the Framingham model for incident atrial fibrillation. The authors conclude that the increase in the AUC, integrated discrimination improvement, and net reclassification improvement offer complementary information and thus recommend reporting all 3 alongside measures characterizing the performance of the final model.
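Both measures discussed above can be computed directly from the predicted risks of the nested models; a sketch using their standard definitions, assuming NumPy arrays of risks and a 0/1 event indicator:

```python
import numpy as np

def idi(p_old, p_new, y):
    """Integrated discrimination improvement: gain in mean predicted risk
    among events minus the gain among nonevents."""
    p_old, p_new, y = map(np.asarray, (p_old, p_new, y))
    ev, ne = y == 1, y == 0
    return ((p_new[ev].mean() - p_old[ev].mean())
            - (p_new[ne].mean() - p_old[ne].mean()))

def continuous_nri(p_old, p_new, y):
    """Continuous (category-free) net reclassification improvement:
    net proportion of events moved up plus nonevents moved down."""
    p_old, p_new, y = map(np.asarray, (p_old, p_new, y))
    ev, ne = y == 1, y == 0
    up, down = p_new > p_old, p_new < p_old
    return ((up[ev].mean() - down[ev].mean())
            + (down[ne].mean() - up[ne].mean()))
```

Note the abstract's point is visible in the definitions: the NRI depends only on the directions of risk changes, while the IDI weights them by magnitude.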
Model-based software process improvement
NASA Technical Reports Server (NTRS)
Zettervall, Brenda T.
1994-01-01
The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.
NASA Technical Reports Server (NTRS)
Maiorano, Andrea; Martre, Pierre; Asseng, Senthold; Ewert, Frank; Mueller, Christoph; Roetter, Reimund P.; Ruane, Alex C.; Semenov, Mikhail A.; Wallach, Daniel; Wang, Enli
2016-01-01
To improve climate change impact estimates and to quantify their uncertainty, multi-model ensembles (MMEs) have been suggested. Model improvements can improve the accuracy of simulations and reduce the uncertainty of climate change impact assessments. Furthermore, they can reduce the number of models needed in a MME. Herein, 15 wheat growth models of a larger MME were improved through re-parameterization and/or incorporating or modifying heat stress effects on phenology, leaf growth and senescence, biomass growth, and grain number and size using detailed field experimental data from the USDA Hot Serial Cereal experiment (calibration data set). Simulation results from before and after model improvement were then evaluated with independent field experiments from a CIMMYT worldwide field trial network (evaluation data set). Model improvements decreased the variation (10th to 90th model ensemble percentile range) of grain yields simulated by the MME on average by 39% in the calibration data set and by 26% in the independent evaluation data set for crops grown in mean seasonal temperatures greater than 24 °C. MME mean squared error in simulating grain yield decreased by 37%. A reduction in MME uncertainty range by 27% increased MME prediction skills by 47%. Results suggest that the mean level of variation observed in field experiments and used as a benchmark can be reached with half the number of models in the MME. Improving crop models is therefore important to increase the certainty of model-based impact assessments and allow more practical, i.e., smaller MMEs to be used effectively.
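The "10th to 90th model ensemble percentile range" used above as the MME uncertainty measure is straightforward to compute; a sketch assuming an array of simulated yields indexed by model and site:

```python
import numpy as np

def ensemble_range(yields, lo=10, hi=90):
    """Width of the lo-th..hi-th percentile band across ensemble members.

    yields : array-like of shape (n_models, n_sites) of simulated yields.
    Returns the mean band width across sites -- the kind of MME
    uncertainty measure that model improvement is reported to shrink.
    """
    yields = np.asarray(yields, dtype=float)
    p_lo = np.percentile(yields, lo, axis=0)
    p_hi = np.percentile(yields, hi, axis=0)
    return float((p_hi - p_lo).mean())
```

Comparing this quantity before and after re-parameterization gives the percentage reductions the abstract reports.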
[Analysis of the stability and adaptability of near infrared spectra qualitative analysis model].
Cao, Wu; Li, Wei-jun; Wang, Ping; Zhang, Li-ping
2014-06-01
The stability and adaptability of near infrared spectra qualitative analysis models were studied. Separate modeling can significantly improve the stability and adaptability of a model, but its ability to improve adaptability is limited. Joint modeling can improve not only the adaptability but also the stability of the model; at the same time, compared with separate modeling, it shortens the modeling time, reduces the modeling workload, extends the term of validity of the model, and improves modeling efficiency. The model adaptability experiment shows that the correct recognition rate of the separate modeling method is relatively low and cannot meet the requirements of application, whereas the joint modeling method reaches a correct recognition rate of 90% and significantly enhances the recognition effect. The model stability experiment shows that the identification results of the jointly built model are better than those of the separately built model, and the method has good application value.
NASA Astrophysics Data System (ADS)
Shen, Chengcheng; Shi, Honghua; Liu, Yongzhi; Li, Fen; Ding, Dewen
2016-07-01
Marine ecosystem dynamic models (MEDMs) are important tools for the simulation and prediction of marine ecosystems. This article summarizes the methods and strategies used for the improvement and assessment of MEDM skill, and it attempts to establish a technical framework to inspire further ideas concerning MEDM skill improvement. The skill of MEDMs can be improved by parameter optimization (PO), which is an important step in model calibration. An efficient approach to solve the problem of PO constrained by MEDMs is the global treatment of both sensitivity analysis and PO. Model validation is an essential step following PO, which validates the efficiency of model calibration by analyzing and estimating the goodness-of-fit of the optimized model. Additionally, by focusing on the degree of impact of various factors on model skill, model uncertainty analysis can supply model users with a quantitative assessment of model confidence. Research on MEDMs is ongoing; however, improvement in model skill still lacks global treatments and its assessment is not integrated. Thus, the predictive performance of MEDMs is not strong and model uncertainties lack quantitative descriptions, limiting their application. Therefore, a large number of case studies concerning model skill should be performed to promote the development of a scientific and normative technical framework for the improvement of MEDM skill.
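The parameter optimization (PO) step described above minimizes model-observation misfit over the parameter space; a toy stand-in using exhaustive grid search (real MEDM calibration uses global optimizers, and the model, forcing, and grid here are all hypothetical):

```python
import itertools

def calibrate(model, observed, forcing, grid):
    """Grid-search parameter optimization minimizing sum of squared errors.

    model : f(forcing_value, params) -> prediction
    grid  : dict mapping parameter name -> list of candidate values.
    Returns (best_params, best_sse).
    """
    names = list(grid)
    best_params, best_sse = None, float("inf")
    for combo in itertools.product(*(grid[n] for n in names)):
        params = dict(zip(names, combo))
        sse = sum((model(f, params) - o) ** 2
                  for f, o in zip(forcing, observed))
        if sse < best_sse:
            best_params, best_sse = params, sse
    return best_params, best_sse
```

The goodness-of-fit of the optimized model on data held out from calibration is then what the validation step assesses.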
Multiscale Methods for Accurate, Efficient, and Scale-Aware Models of the Earth System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldhaber, Steve; Holland, Marika
The major goal of this project was to contribute improvements to the infrastructure of an Earth System Model in order to support research in the Multiscale Methods for Accurate, Efficient, and Scale-Aware models of the Earth System project. In support of this, the NCAR team accomplished two main tasks: improving input/output performance of the model and improving atmospheric model simulation quality. Improvement of the performance and scalability of data input and diagnostic output within the model required a new infrastructure which can efficiently handle the unstructured grids common in multiscale simulations. This allows for a more computationally efficient model, enabling more years of Earth System simulation. The quality of the model simulations was improved by reducing grid-point noise in the spectral element version of the Community Atmosphere Model (CAM-SE). This was achieved by running the physics of the model using grid-cell data on a finite-volume grid.
Accelerating quality improvement within your organization: Applying the Model for Improvement.
Crowl, Ashley; Sharma, Anita; Sorge, Lindsay; Sorensen, Todd
2015-01-01
To discuss the fundamentals of the Model for Improvement and how the model can be applied to quality improvement activities associated with medication use, including understanding the three essential questions that guide quality improvement, applying a process for actively testing change within an organization, and measuring the success of these changes on care delivery. PubMed from 1990 through April 2014 using the search terms quality improvement, process improvement, hospitals, and primary care. At the authors' discretion, studies were selected based on their relevance in demonstrating the quality improvement process and tests of change within an organization. Organizations are continuously seeking to enhance quality in patient care services, and much of this work focuses on improving care delivery processes. Yet change in these systems is often slow, which can lead to frustration or apathy among frontline practitioners. Adopting and applying the Model for Improvement as a core strategy for quality improvement efforts can accelerate the process. While the model is frequently well known in hospitals and primary care settings, it is not always familiar to pharmacists. In addition, while some organizations may be familiar with the "plan, do, study, act" (PDSA) cycles-one element of the Model for Improvement-many do not apply it effectively. The goal of the model is to combine a continuous process of small tests of change (PDSA cycles) within an overarching aim with a longitudinal measurement process. This process differs from other forms of improvement work that plan and implement large-scale change over an extended period, followed by months of data collection. In this scenario it may take months or years to determine whether an intervention will have a positive impact. 
By following the Model for Improvement, frontline practitioners and their organizational leaders quickly identify strategies that make a positive difference and result in a greater degree of success.
Agent-Based Computing in Distributed Adversarial Planning
2010-08-09
plans. An agent is expected to agree to deviate from its optimal uncoordinated plan only if it improves its position. Process models for opponent modeling: we have analyzed the suitability of business process models for creating
Model-driven approach to data collection and reporting for quality improvement
Curcin, Vasa; Woodcock, Thomas; Poots, Alan J.; Majeed, Azeem; Bell, Derek
2014-01-01
Continuous data collection and analysis have been shown essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems or not routinely collected. Furthermore, improvement teams are often restricted in time and funding thus requiring inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, and grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate required electronic data collection instruments and reporting tools. To that goal, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to the project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports, based on Statistical Process Control (SPC) for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed and the benefits and limitations of the approach are discussed. PMID:24874182
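The WISH reports described above are based on Statistical Process Control; a minimal sketch of one standard SPC computation, the control limits of an individuals (XmR) chart of the kind used in improvement run charts (the tool's internal implementation is not given in the abstract):

```python
def xmr_limits(values):
    """Control limits for an individuals (XmR) chart.

    Returns (lcl, centre, ucl) using the standard estimate of
    three-sigma limits: centre +/- 2.66 * mean moving range.
    """
    centre = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar
```

Points outside these limits, or sustained runs on one side of the centre line, are the signals an improvement team looks for when judging whether a change made a difference.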
NASA Astrophysics Data System (ADS)
Nijzink, R. C.; Samaniego, L.; Mai, J.; Kumar, R.; Thober, S.; Zink, M.; Schäfer, D.; Savenije, H. H. G.; Hrachowitz, M.
2015-12-01
Heterogeneity of landscape features like terrain, soil, and vegetation properties affect the partitioning of water and energy. However, it remains unclear to which extent an explicit representation of this heterogeneity at the sub-grid scale of distributed hydrological models can improve the hydrological consistency and the robustness of such models. In this study, hydrological process complexity arising from sub-grid topography heterogeneity was incorporated in the distributed mesoscale Hydrologic Model (mHM). Seven study catchments across Europe were used to test whether (1) the incorporation of additional sub-grid variability on the basis of landscape-derived response units improves model internal dynamics, (2) the application of semi-quantitative, expert-knowledge based model constraints reduces model uncertainty; and (3) the combined use of sub-grid response units and model constraints improves the spatial transferability of the model. Unconstrained and constrained versions of both, the original mHM and mHMtopo, which allows for topography-based sub-grid heterogeneity, were calibrated for each catchment individually following a multi-objective calibration strategy. In addition, four of the study catchments were simultaneously calibrated and their feasible parameter sets were transferred to the remaining three receiver catchments. In a post-calibration evaluation procedure the probabilities of model and transferability improvement, when accounting for sub-grid variability and/or applying expert-knowledge based model constraints, were assessed on the basis of a set of hydrological signatures. In terms of the Euclidian distance to the optimal model, used as overall measure for model performance with respect to the individual signatures, the model improvement achieved by introducing sub-grid heterogeneity to mHM in mHMtopo was on average 13 %. 
The addition of semi-quantitative constraints to mHM and mHMtopo resulted in improvements of 13 and 19 %, respectively, compared to the base case of the unconstrained mHM. The most significant improvements in signature representation were achieved for low flow statistics. The application of prior semi-quantitative constraints further improved the partitioning between runoff and evaporative fluxes. In addition, it was shown that suitable semi-quantitative prior constraints in combination with the transfer-function-based regularization approach of mHM can be beneficial for spatial model transferability, as the Euclidian distances for the signatures improved on average by 2 %. The effect of semi-quantitative prior constraints combined with topography-guided sub-grid heterogeneity on transferability showed a more variable picture of improvements and deteriorations, but most improvements were observed for low flow statistics.
NASA Astrophysics Data System (ADS)
Nijzink, Remko C.; Samaniego, Luis; Mai, Juliane; Kumar, Rohini; Thober, Stephan; Zink, Matthias; Schäfer, David; Savenije, Hubert H. G.; Hrachowitz, Markus
2016-03-01
Heterogeneity of landscape features like terrain, soil, and vegetation properties affects the partitioning of water and energy. However, it remains unclear to what extent an explicit representation of this heterogeneity at the sub-grid scale of distributed hydrological models can improve the hydrological consistency and the robustness of such models. In this study, hydrological process complexity arising from sub-grid topography heterogeneity was incorporated into the distributed mesoscale Hydrologic Model (mHM). Seven study catchments across Europe were used to test whether (1) the incorporation of additional sub-grid variability on the basis of landscape-derived response units improves model internal dynamics, (2) the application of semi-quantitative, expert-knowledge-based model constraints reduces model uncertainty, and whether (3) the combined use of sub-grid response units and model constraints improves the spatial transferability of the model. Unconstrained and constrained versions of both the original mHM and mHMtopo, which allows for topography-based sub-grid heterogeneity, were calibrated for each catchment individually following a multi-objective calibration strategy. In addition, four of the study catchments were simultaneously calibrated and their feasible parameter sets were transferred to the remaining three receiver catchments. In a post-calibration evaluation procedure the probabilities of model and transferability improvement, when accounting for sub-grid variability and/or applying expert-knowledge-based model constraints, were assessed on the basis of a set of hydrological signatures. In terms of the Euclidian distance to the optimal model, used as an overall measure of model performance with respect to the individual signatures, the model improvement achieved by introducing sub-grid heterogeneity to mHM in mHMtopo was on average 13 %. 
The addition of semi-quantitative constraints to mHM and mHMtopo resulted in improvements of 13 and 19 %, respectively, compared to the base case of the unconstrained mHM. The most significant improvements in signature representation were achieved for low flow statistics. The application of prior semi-quantitative constraints further improved the partitioning between runoff and evaporative fluxes. In addition, it was shown that suitable semi-quantitative prior constraints in combination with the transfer-function-based regularization approach of mHM can be beneficial for spatial model transferability, as the Euclidian distances for the signatures improved on average by 2 %. The effect of semi-quantitative prior constraints combined with topography-guided sub-grid heterogeneity on transferability showed a more variable picture of improvements and deteriorations, but most improvements were observed for low flow statistics.
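The Euclidean distance to the optimal model used above as an overall performance measure can be sketched as follows; the signature names and the normalization to a perfect score of 1.0 are illustrative assumptions, not taken from the mHM/mHMtopo code:

```python
import math

def euclidean_distance_to_optimum(signatures, optimum):
    """Overall performance measure: Euclidean distance between a model's
    normalized signature scores and the optimal (perfect) model.
    Hypothetical helper; signature names and normalization are assumptions."""
    return math.sqrt(sum((signatures[k] - optimum[k]) ** 2 for k in optimum))

# Example: three normalized signatures where 1.0 is a perfect score.
model = {"runoff_ratio": 0.8, "baseflow_index": 0.9, "flow_duration": 0.7}
perfect = {"runoff_ratio": 1.0, "baseflow_index": 1.0, "flow_duration": 1.0}
distance = euclidean_distance_to_optimum(model, perfect)
```

A smaller distance means the model sits closer to the ideal point across all signatures at once, which is what makes it usable as a single aggregate score.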
Development of 3D Oxide Fuel Mechanics Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, B. W.; Casagranda, A.; Pitts, S. A.
This report documents recent work to improve the accuracy and robustness of the mechanical constitutive models used in the BISON fuel performance code. These developments include migration of the fuel mechanics models to be based on the MOOSE Tensor Mechanics module, improving the robustness of the smeared cracking model, implementing a capability to limit the time step size based on material model response, and improving the robustness of the return mapping iterations used in creep and plasticity models.
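The return-mapping iteration mentioned above can be illustrated with a one-dimensional radial-return sketch for linear isotropic hardening; the 1-D simplification and all names are ours for illustration, not BISON's API:

```python
def radial_return_1d(strain, plastic_strain, E, sigma_y, H):
    """1-D radial return with linear isotropic hardening.
    strain: total strain; plastic_strain: accumulated plastic strain (>= 0,
    monotonic loading assumed); E: Young's modulus; sigma_y: initial yield
    stress; H: hardening modulus. Returns (stress, updated plastic strain)."""
    sigma_trial = E * (strain - plastic_strain)          # elastic predictor
    f = abs(sigma_trial) - (sigma_y + H * plastic_strain)  # trial yield function
    if f <= 0.0:
        return sigma_trial, plastic_strain               # elastic step
    # For linear hardening the Newton iteration collapses to one closed-form step.
    dgamma = f / (E + H)                                 # plastic multiplier
    sign = 1.0 if sigma_trial > 0 else -1.0
    sigma = sigma_trial - E * dgamma * sign              # return to yield surface
    return sigma, plastic_strain + dgamma
```

For nonlinear creep or hardening laws the same predictor-corrector structure holds, but the scalar correction becomes an iterative solve, which is where robustness work of the kind described above matters.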
ERIC Educational Resources Information Center
Blank, Rolf K.; Smithson, John; Porter, Andrew; Nunnaley, Diana; Osthoff, Eric
2006-01-01
The instructional improvement model Data on Enacted Curriculum was tested with an experimental design using randomized place-based trials. The improvement model is based on using data on instructional practices and achievement to guide professional development and decisions to refocus on instruction. The model was tested in 50 U.S. middle schools…
Test code for the assessment and improvement of Reynolds stress models
NASA Technical Reports Server (NTRS)
Rubesin, M. W.; Viegas, J. R.; Vandromme, D.; Minh, H. Ha
1987-01-01
An existing two-dimensional, compressible-flow Navier-Stokes computer code containing a full Reynolds stress turbulence model was adapted for use as a test bed for assessing and improving turbulence models based on turbulence simulation experiments. To date, comparisons of the code's results with simulated channel flow and flow over an oscillating flat plate have shown that the turbulence model used in the code needs improvement for these flows. It is also shown that direct simulations of turbulent flows over a range of Reynolds numbers are needed to guide subsequent improvement of turbulence models.
A Bayesian model averaging method for improving SMT phrase table
NASA Astrophysics Data System (ADS)
Duan, Nan
2013-03-01
Previous methods for improving translation quality by employing multiple SMT models usually operate as a second-pass decision procedure on hypotheses from multiple systems, using extra features rather than exploiting the features of existing models in more depth. In this paper, we propose translation model generalization (TMG), an approach that updates probability feature values for the translation model in use based on the model itself and a set of auxiliary models, aiming to alleviate the over-estimation problem and enhance translation quality in the first-pass decoding phase. We validate our approach for translation models based on auxiliary models built in two different ways. We also introduce novel probability variance features into the log-linear models for further improvements. Our approach can be developed independently and integrated directly into the current SMT pipeline. We demonstrate BLEU improvements on the NIST Chinese-to-English MT tasks for single-system decoding.
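The core idea of damping over-estimated phrase probabilities with auxiliary models can be sketched as a simple interpolation; the blending form and the weight `alpha` are our illustrative assumptions, not the paper's exact formulation:

```python
def generalize_phrase_prob(p_main, p_aux_list, alpha=0.5):
    """Blend the main translation model's phrase probability with the mean
    of auxiliary models' estimates, so a value the main model over-estimates
    (and the auxiliaries do not support) is pulled back down.
    Hypothetical sketch of the TMG smoothing idea."""
    p_aux = sum(p_aux_list) / len(p_aux_list)
    return alpha * p_main + (1 - alpha) * p_aux

# A phrase the main model scores 0.9 but auxiliaries score ~0.4 is damped:
smoothed = generalize_phrase_prob(0.9, [0.5, 0.3])
```

Because the update happens inside the translation model's feature values, it affects first-pass decoding directly, unlike second-pass system combination.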
Capability Maturity Model (CMM) for Software Process Improvements
NASA Technical Reports Server (NTRS)
Ling, Robert Y.
2000-01-01
This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.
Automated Student Model Improvement
ERIC Educational Resources Information Center
Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.
2012-01-01
Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…
Toward Improved Fidelity of Thermal Explosion Simulations
NASA Astrophysics Data System (ADS)
Nichols, Albert; Becker, Richard; Burnham, Alan; Howard, W. Michael; Knap, Jarek; Wemhoff, Aaron
2009-06-01
We present results of an improved thermal/chemical/mechanical model of HMX-based explosives like LX04 and LX10 for thermal cook-off. The original HMX model and analysis scheme were developed by Yoh et al. for use in the ALE3D modeling framework. The improvements were concentrated in four areas. First, we added porosity to the chemical material model framework in ALE3D used to model HMX explosive formulations, to handle the roughly 2% porosity in solid explosives. Second, we improved the HMX reaction network, including the addition of a reactive phase change model based on work by Henson et al. Third, we added early decomposition gas species to the CHEETAH material database to improve equations of state for gaseous intermediates and products. Finally, we improved the implicit mechanics module in ALE3D to more naturally handle the long time scales associated with thermal cook-off. The application of the resulting framework to the analysis of the Scaled Thermal Explosion (STEX) experiments will be discussed.
Additional Research Needs to Support the GENII Biosphere Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Napier, Bruce A.; Snyder, Sandra F.; Arimescu, Carmen
In the course of evaluating the current parameter needs for the GENII Version 2 code (Snyder et al. 2013), areas of possible improvement for both the data and the underlying models have been identified. As the data review was implemented, PNNL staff identified areas where the models can be improved, both to accommodate the locally significant pathways identified and to incorporate newer models. The areas are general data needs for the existing models and improved formulations for the pathway models.
NASA Astrophysics Data System (ADS)
Caldararu, Silvia; Purves, Drew W.; Smith, Matthew J.
2017-04-01
Improving international food security under a changing climate and increasing human population will be greatly aided by improving our ability to modify, understand and predict crop growth. What we predominantly have at our disposal are either process-based models of crop physiology or statistical analyses of yield datasets, both of which suffer from various sources of error. In this paper, we present a generic process-based crop model (PeakN-crop v1.0) which we parametrise using a Bayesian model-fitting algorithm against three different data sources: space-based vegetation indices, eddy covariance productivity measurements and regional crop yields. We show that the model parametrised without data, based on prior knowledge of the parameters, can largely capture the observed behaviour, but the data-constrained model greatly improves the model fit and reduces prediction uncertainty. We investigate the extent to which each dataset contributes to model performance and show that while all data improve on the prior model fit, the satellite-based data and crop yield estimates are particularly important for reducing model error and uncertainty. Despite these improvements, we conclude that significant knowledge gaps remain in terms of the data available for model parametrisation, but our study can help indicate the data collection necessary to improve our predictions of crop yields and crop responses to environmental changes.
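A Bayesian model-fitting step of the kind used above to constrain crop model parameters against data can be sketched with a bare-bones Metropolis sampler; the random-walk proposal, step size, and function names are simplifying assumptions, not the paper's actual algorithm:

```python
import math
import random

def metropolis_fit(log_posterior, theta0, step=0.1, n_steps=5000, seed=0):
    """Random-walk Metropolis sampler. log_posterior(theta) combines the
    prior and the data likelihoods (e.g. vegetation indices, flux towers,
    yields); here it is supplied by the caller. Returns the chain of samples,
    whose spread reflects the remaining parameter uncertainty."""
    rng = random.Random(seed)
    theta = list(theta0)
    lp = log_posterior(theta)
    samples = []
    for _ in range(n_steps):
        proposal = [x + rng.gauss(0.0, step) for x in theta]
        lp_prop = log_posterior(proposal)
        # Accept with probability min(1, posterior ratio), in log space.
        if math.log(rng.random()) < lp_prop - lp:
            theta, lp = proposal, lp_prop
        samples.append(theta[:])
    return samples
```

Adding more independent data streams to `log_posterior` tightens the posterior, which is the mechanism behind the reduced prediction uncertainty reported above.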
Model-driven approach to data collection and reporting for quality improvement.
Curcin, Vasa; Woodcock, Thomas; Poots, Alan J; Majeed, Azeem; Bell, Derek
2014-12-01
Continuous data collection and analysis have been shown to be essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems or not routinely collected. Furthermore, improvement teams are often restricted in time and funding, thus requiring inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate the required electronic data collection instruments and reporting tools. To that goal, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to the project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports, based on Statistical Process Control (SPC), for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, the Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed and the benefits and limitations of the approach are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
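The SPC reports mentioned above typically plot Shewhart individuals (XmR) charts; the limit computation can be sketched generically as follows (this is the standard XmR formula, not WISH's actual implementation):

```python
def spc_individuals_limits(values):
    """Shewhart individuals (XmR) chart: centre line at the mean, control
    limits at +/- 2.66 times the mean moving range (2.66 = 3 / d2, d2 = 1.128
    for subgroups of size 2). Points outside the limits signal special-cause
    variation rather than routine noise."""
    n = len(values)
    mean = sum(values) / n
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

# Weekly counts of a hypothetical process measure:
lcl, centre, ucl = spc_individuals_limits([10, 12, 11, 13, 12])
```

Live charts built this way let an improvement team distinguish genuine change from week-to-week variation without statistical training.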
NASA Astrophysics Data System (ADS)
Chen, Y.; Li, J.; Xu, H.
2016-01-01
Physically based distributed hydrological models (hereafter referred to as PBDHMs) divide the terrain of the whole catchment into a number of grid cells at fine resolution and assimilate different terrain data and precipitation to different cells. They are regarded as having the potential to improve catchment hydrological process simulation and prediction capability. In the early stage, physically based distributed hydrological models were assumed to derive model parameters from terrain properties directly, so that there would be no need to calibrate model parameters. Unfortunately, the uncertainties associated with this parameter derivation are very high, which has limited their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting using the particle swarm optimization (PSO) algorithm, to test its competence and to improve its performance; the second is to explore the possibility of improving physically based distributed hydrological model capability in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, the improved PSO algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting. The improvements include adoption of a linearly decreasing inertia weight strategy to change the inertia weight and an arccosine function strategy to adjust the acceleration coefficients. 
This method has been tested in two catchments of different sizes in southern China, and the results show that the improved PSO algorithm can be used effectively for Liuxihe model parameter optimization and can largely improve model capability in catchment flood forecasting, thus proving that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It has also been found that the appropriate particle number and maximum evolution number of the PSO algorithm for Liuxihe model catchment flood forecasting are 20 and 30, respectively.
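The two PSO improvements described above (a linearly decreasing inertia weight and arccosine-scheduled acceleration coefficients) can be sketched as follows; the exact constants, bounds handling, and schedule endpoints are illustrative assumptions, not the paper's calibrated values:

```python
import math
import random

def improved_pso(objective, bounds, n_particles=20, n_iter=30, seed=0):
    """Minimize `objective` over box `bounds` with PSO. The inertia weight w
    decreases linearly from 0.9 to 0.4; the cognitive coefficient c1 falls
    and the social coefficient c2 rises following an arccosine schedule."""
    rng = random.Random(seed)
    dim = len(bounds)
    w_max, w_min = 0.9, 0.4
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for t in range(n_iter):
        frac = t / max(n_iter - 1, 1)
        w = w_max - (w_max - w_min) * frac              # linear decrease
        theta = math.acos(2 * frac - 1) / math.pi       # 1 -> 0 over the run
        c1 = 0.5 + 2.0 * theta                          # ~2.5 -> ~0.5
        c2 = 0.5 + 2.0 * (1 - theta)                    # ~0.5 -> ~2.5
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

The defaults of 20 particles and 30 iterations mirror the values the study found appropriate for flood forecasting; in practice `objective` would run the hydrological model and score the simulated flood hydrograph.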
Dong, Ling-Bo; Liu, Zhao-Gang; Li, Feng-Ri; Jiang, Li-Chun
2013-09-01
Using branch analysis data of 955 standard branches from 60 sampled trees in 12 sampling plots of a Pinus koraiensis plantation at Mengjiagang Forest Farm in Heilongjiang Province, Northeast China, and based on linear mixed-effects model theory and methods, models for predicting branch variables, including primary branch diameter, length, and angle, were developed. Considering the tree effect, the MIXED module of SAS software was used to fit the prediction models. The results indicated that the fitting precision of the models could be improved by choosing appropriate random-effect parameters and variance-covariance structures. Correlation structures, including the compound symmetry structure (CS), the first-order autoregressive structure [AR(1)], and the first-order autoregressive and moving average structure [ARMA(1,1)], were then added to the optimal branch size mixed-effects model. AR(1) significantly improved the fitting precision of the branch diameter and length mixed-effects models, but none of the three structures improved the precision of the branch angle mixed-effects model. To account for heteroscedasticity when building the mixed-effects models, the CF1 and CF2 functions were added to the branch mixed-effects models. The CF1 function significantly improved the fit of the branch angle mixed model, whereas the CF2 function significantly improved the fit of the branch diameter and length mixed models. Model validation confirmed that the mixed-effects models improve prediction precision compared with traditional regression models for branch size prediction in Pinus koraiensis plantations.
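The AR(1) correlation structure that improved the branch diameter and length models above has a simple closed form, corr(e_i, e_j) = rho^|i-j|; a generic sketch (not the SAS MIXED implementation) is:

```python
def ar1_correlation(n, rho):
    """AR(1) within-subject correlation matrix for n ordered observations
    (e.g. branches ordered along a stem): correlation decays geometrically
    with the distance between observation indices."""
    return [[rho ** abs(i - j) for j in range(n)] for i in range(n)]

# Three branches on one tree with rho = 0.5:
m = ar1_correlation(3, 0.5)
```

Compound symmetry would instead give every off-diagonal the same value, which is why AR(1) fits serially ordered branch measurements better when nearby branches are more alike than distant ones.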
NASA Astrophysics Data System (ADS)
Chen, Y.; Li, J.; Xu, H.
2015-10-01
Physically based distributed hydrological models discretize the terrain of the whole catchment into a number of grid cells at fine resolution, assimilate different terrain data and precipitation to different cells, and are regarded as having the potential to improve the simulation and prediction of catchment hydrological processes. In the early stage, physically based distributed hydrological models were assumed to derive model parameters from terrain properties directly, so there would be no need to calibrate model parameters; unfortunately, the uncertainties associated with this parameter derivation are very high, which has limited their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting by using the PSO algorithm, to test its competence and to improve its performance; the second is to explore the possibility of improving physically based distributed hydrological model capability in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of the PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, the improved particle swarm optimization (PSO) algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting. The improvements include adopting a linearly decreasing inertia weight strategy to change the inertia weight and an arccosine function strategy to adjust the acceleration coefficients. 
This method has been tested in two catchments of different sizes in southern China, and the results show that the improved PSO algorithm can be used effectively for Liuxihe model parameter optimization and can largely improve model capability in catchment flood forecasting, thus proving that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It has also been found that the appropriate particle number and maximum evolution number of the PSO algorithm for Liuxihe model catchment flood forecasting are 20 and 30, respectively.
NASA Technical Reports Server (NTRS)
Rosenzweig, Cynthia E.; Jones, James W.; Hatfield, Jerry L.; Antle, John M.; Ruane, Alexander C.; Mutter, Carolyn Z.
2015-01-01
The Agricultural Model Intercomparison and Improvement Project (AgMIP) was founded in 2010. Its mission is to improve substantially the characterization of world food security as affected by climate variability and change, and to enhance adaptation capacity in both developing and developed countries. The objectives of AgMIP are to: (1) incorporate state-of-the-art climate, crop/livestock, and agricultural economic model improvements into coordinated multi-model regional and global assessments of future climate impacts, adaptation, and other key aspects of the food system; (2) utilize multiple models, scenarios, locations, crops/livestock, and participants to explore uncertainty and the impact of data and methodological choices; (3) collaborate with regional experts in agronomy, animal sciences, economics, and climate to build a strong basis for model applications, addressing key climate-related questions and sustainable intensification of farming systems; (4) improve scientific and adaptive capacity in modeling for major agricultural regions in the developing and developed world, with a focus on vulnerable regions; (5) improve agricultural data and enhance data-sharing based on their intercomparison and evaluation using best scientific practices; and (6) develop modeling frameworks to identify and evaluate promising adaptation technologies and policies and to prioritize strategies.
Automated dynamic analytical model improvement for damped structures
NASA Technical Reports Server (NTRS)
Fuh, J. S.; Berman, A.
1985-01-01
A method is described to improve a linear, nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, achieved without eigensolutions or inversion of a large matrix.
Improved heat transfer modeling of the eye for electromagnetic wave exposures.
Hirata, Akimasa
2007-05-01
This study proposed an improved heat transfer model of the eye for exposure to electromagnetic (EM) waves. Particular attention was paid to the differences from the simplified heat transfer model commonly used in this field. Our computational results show that the temperature elevation in the eye calculated with the simplified heat transfer model was largely influenced by the EM absorption outside the eyeball, whereas with our improved model it was not.
Using simulation modeling to improve patient flow at an outpatient orthopedic clinic.
Rohleder, Thomas R; Lewkonia, Peter; Bischak, Diane P; Duffy, Paul; Hendijani, Rosa
2011-06-01
We report on the use of discrete event simulation modeling to support process improvements at an orthopedic outpatient clinic. The clinic was effective in treating patients, but waiting time and congestion in the clinic created patient dissatisfaction and staff morale issues. The modeling helped to identify improvement alternatives, including optimized staffing levels, better patient scheduling, and an emphasis on staff arriving promptly. Quantitative results from the modeling provided motivation to implement the improvements. Statistical analysis of data taken before and after the implementation indicates that waiting time measures improved significantly and overall patient time in the clinic was reduced.
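A discrete event simulation of the kind used above can be sketched in a few lines with an event queue; the arrival and service distributions, parameter values, and function names below are illustrative assumptions, not the clinic's calibrated model:

```python
import heapq
import random

def simulate_clinic(n_patients=200, n_staff=2, arrival_mean=5.0,
                    service_mean=8.0, seed=1):
    """Patients arrive with exponential inter-arrival times (minutes), queue
    first-come-first-served for one of n_staff providers, and we return the
    mean waiting time before being seen."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / arrival_mean)
        arrivals.append(t)
    free_at = [0.0] * n_staff   # min-heap of times each provider becomes free
    heapq.heapify(free_at)
    total_wait = 0.0
    for arrive in arrivals:
        ready = heapq.heappop(free_at)          # earliest-available provider
        start = max(arrive, ready)
        total_wait += start - arrive
        heapq.heappush(free_at, start + rng.expovariate(1.0 / service_mean))
    return total_wait / n_patients
```

Re-running the simulation with different `n_staff` or arrival patterns is exactly how staffing and scheduling alternatives can be compared before committing to a change.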
El-Gabbas, Ahmed; Dormann, Carsten F
2018-02-01
Species distribution modeling (SDM) is an essential method in ecology and conservation. SDMs are often calibrated within one country's borders, typically along a limited environmental gradient with biased and incomplete data, making the quality of these models questionable. In this study, we evaluated how adequate national presence-only data are for calibrating regional SDMs. We trained SDMs for Egyptian bat species at two different scales: only within Egypt and at a species-specific global extent. We used two modeling algorithms: Maxent and elastic net, both under the point-process modeling framework. For each modeling algorithm, we measured the congruence of the predictions of global and regional models for Egypt, assuming that the lower the congruence, the lower the appropriateness of the Egyptian dataset to describe the species' niche. We inspected the effect of incorporating predictions from global models as an additional predictor ("prior") to regional models, and quantified the improvement in terms of AUC and the congruence between regional models run with and without priors. Moreover, we analyzed predictive performance improvements after correction for sampling bias at both scales. On average, predictions from global and regional models for Egypt concurred only weakly. Collectively, the use of priors did not lead to much improvement: similar AUC and high congruence between regional models calibrated with and without priors. Correction for sampling bias led to higher model performance whatever prior was used, making the contribution of the priors less pronounced. Under biased and incomplete sampling, the use of global bat data did not improve regional model performance. Without enough bias-free regional data, we cannot objectively identify the actual improvement of regional models after incorporating information from the global niche. 
However, we still believe in great potential for global model predictions to guide future surveys and improve regional sampling in data-poor regions.
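The AUC used above to quantify improvement has a simple rank interpretation: the probability that a randomly chosen presence scores higher than a randomly chosen background point. A plain sketch (ties count as half; an O(n·m) illustration rather than an optimized implementation):

```python
def auc(presence_scores, background_scores):
    """Rank-based AUC: fraction of (presence, background) pairs where the
    presence point receives the higher model score."""
    wins = 0.0
    for p in presence_scores:
        for b in background_scores:
            if p > b:
                wins += 1.0
            elif p == b:
                wins += 0.5
    return wins / (len(presence_scores) * len(background_scores))

score = auc([0.9, 0.8], [0.1, 0.8])
```

Comparing this statistic for regional models fitted with and without the global-model prior is one way to measure whether the prior adds discriminative power.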
NASA Astrophysics Data System (ADS)
Lauer, Axel; Jones, Colin; Eyring, Veronika; Evaldsson, Martin; Hagemann, Stefan; Mäkelä, Jarmo; Martin, Gill; Roehrig, Romain; Wang, Shiyu
2018-01-01
The performance of updated versions of the four earth system models (ESMs) CNRM, EC-Earth, HadGEM, and MPI-ESM is assessed in comparison to their predecessor versions used in Phase 5 of the Coupled Model Intercomparison Project. The Earth System Model Evaluation Tool (ESMValTool) is applied to evaluate selected climate phenomena in the models against observations. This is the first systematic application of the ESMValTool to assess and document the progress made during an extensive model development and improvement project. This study focuses on the South Asian monsoon (SAM) and the West African monsoon (WAM), the coupled equatorial climate, and Southern Ocean clouds and radiation, which are known to exhibit systematic biases in present-day ESMs. The analysis shows that the tropical precipitation in three out of four models is clearly improved. Two of three updated coupled models show an improved representation of tropical sea surface temperatures with one coupled model not exhibiting a double Intertropical Convergence Zone (ITCZ). Simulated cloud amounts and cloud-radiation interactions are improved over the Southern Ocean. Improvements are also seen in the simulation of the SAM and WAM, although systematic biases remain in regional details and the timing of monsoon rainfall. Analysis of simulations with EC-Earth at different horizontal resolutions from T159 up to T1279 shows that the synoptic-scale variability in precipitation over the SAM and WAM regions improves with higher model resolution. The results suggest that the reasonably good agreement of modeled and observed mean WAM and SAM rainfall in lower-resolution models may be a result of unrealistic intensity distributions.
Lin, Ching-Hua; Yen, Yung-Chieh; Chen, Ming-Chao; Chen, Cheng-Chung
2013-12-02
The objective of this study was to investigate the effects of depression relief and pain relief on the improvement in daily functioning and quality of life (QOL) for depressed patients receiving a 6-week treatment of fluoxetine. A total of 131 acutely ill inpatients with major depressive disorder (MDD) were enrolled to receive 20 mg of fluoxetine daily for 6 weeks. Depression severity, pain severity, daily functioning, and health-related QOL were assessed at baseline and again at week 6. Depression severity, pain severity, and daily functioning were assessed using the 17-item Hamilton Depression Rating Scale, the Short-Form 36 (SF-36) Body Pain Index, and the Work and Social Adjustment Scale. Health-related QOL was assessed by three primary domains of the SF-36, including social functioning, vitality, and general health perceptions. Pearson's correlation and structural equation modeling were used to examine relationships among the study variables. Five models were proposed. In model 1, depression relief alone improved daily functioning and QOL. In model 2, pain relief alone improved daily functioning and QOL. In model 3, depression relief, mediated by pain relief, improved daily functioning and QOL. In model 4, pain relief, mediated by depression relief, improved daily functioning and QOL. In model 5, both depression relief and pain relief improved daily functioning and QOL. One hundred and six patients completed all the measures at baseline and at week 6. Model 5 was the best-fitting structural equation model (χ² = 8.62, df = 8, p = 0.376, GFI = 0.975, AGFI = 0.935, TLI = 0.992, CFI = 0.996, RMSEA = 0.027). Interventions which relieve depression and pain improve daily functioning and QOL among patients with MDD. The proposed model can provide quantitative estimates of improvement in treating patients with MDD. © 2013 Elsevier Inc. All rights reserved.
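The RMSEA fit index reported above follows directly from the model's chi-square, degrees of freedom, and sample size via the standard formula, which can be checked against the published values:

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation for a structural equation
    model: sqrt(max(chi2 - df, 0) / (df * (n - 1))). Values near or below
    0.05 conventionally indicate close fit."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# The study's model 5: chi-square 8.62 on 8 df with 106 completers.
value = rmsea(8.62, 8, 106)
```

The max(..., 0) clamp means a chi-square below its degrees of freedom yields an RMSEA of exactly zero, i.e. fit at least as good as expected by chance.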
The origin of consistent protein structure refinement from structural averaging.
Park, Hahnbeom; DiMaio, Frank; Baker, David
2015-06-02
Recent studies have shown that explicit solvent molecular dynamics (MD) simulation followed by structural averaging can consistently improve protein structure models. We find that improvement upon averaging is not limited to explicit water MD simulation, as consistent improvements are also observed for more efficient implicit solvent MD or Monte Carlo minimization simulations. To determine the origin of these improvements, we examine the changes in model accuracy brought about by averaging at the individual residue level. We find that the improvement in model quality from averaging results from the superposition of two effects: a dampening of deviations from the correct structure in the least well modeled regions, and a reinforcement of consistent movements towards the correct structure in better modeled regions. These observations are consistent with an energy landscape model in which the magnitude of the energy gradient toward the native structure decreases with increasing distance from the native state. Copyright © 2015 Elsevier Ltd. All rights reserved.
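The structural averaging step discussed above amounts to a per-atom mean of coordinates across simulation snapshots; a minimal sketch, assuming the snapshots are already superposed (the alignment step is omitted for brevity):

```python
def average_structure(snapshots):
    """Average atomic coordinates across snapshots. `snapshots` is a list of
    conformations, each a list of (x, y, z) atom positions in the same order.
    Random deviations in poorly modeled regions tend to cancel in the mean,
    while consistent shifts toward the native structure are preserved."""
    n_snap = len(snapshots)
    n_atom = len(snapshots[0])
    return [tuple(sum(snap[a][d] for snap in snapshots) / n_snap
                  for d in range(3))
            for a in range(n_atom)]

# Two 2-atom snapshots of a hypothetical model:
avg = average_structure([[(0, 0, 0), (2, 2, 2)],
                         [(2, 0, 0), (4, 2, 2)]])
```

This cancellation of incoherent fluctuations versus reinforcement of coherent motion is exactly the two-part mechanism the residue-level analysis above identifies.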
wfip2.model/realtime.hrrr_esrl.graphics.01 (Model: Real Time)
Macduff, Matt
2017-10-27
The primary purpose of the WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2's observational field study and analysis.
wfip2.model/realtime.rap_esrl.icbc.01 (Model: Real Time)
Macduff, Matt
2017-10-27
wfip2.model/refcst.01.fcst.02 (Model: Year-Long Reforecast)
Macduff, Matt
2017-10-27
wfip2.model/refcst.coldstart.icbc.02 (Model: Year-Long Reforecast)
Macduff, Matt
2017-10-27
wfip2.model/realtime.hrrr_esrl.icbc.01 (Model: Real Time)
Macduff, Matt
2017-10-27
wfip2.model/realtime.rap_esrl.graphics.01 (Model: Real Time)
Macduff, Matt
2017-10-27
wfip2.model/refcst.01.fcst.01 (Model: Year-Long Reforecast)
Macduff, Matt
2017-10-27
wfip2.model/refcst.coldstart.icbc.01 (Model: Year-Long Reforecast)
Macduff, Matt
2017-10-27
wfip2.model/refcst.02.fcst.02 (Model: Year-Long Reforecast)
Macduff, Matt
2017-10-27
USDA-ARS?s Scientific Manuscript database
To improve climate change impact estimates, multi-model ensembles (MMEs) have been suggested. MMEs enable quantifying model uncertainty, and their medians are more accurate than that of any single model when compared with observations. However, multi-model ensembles are costly to execute, so model i...
Orem's Self-Care Model With Trauma Patients: A Quasi-Experimental Study.
Khatiban, Mahnaz; Shirani, Fatemeh; Oshvandi, Khodayar; Soltanian, Ali Reza; Ebrahimian, Ramin
2018-07-01
To examine whether the application of Orem's self-care model could improve the self-care knowledge, attitudes, practices, and respiratory conditions of trauma patients with chest tubes, a quasi-experimental study was conducted. The participants were assigned to two groups: Orem's model and routine care. Although the patients' self-care knowledge, attitudes, and practices improved in both groups over the 3 days following the initial assessments, the degree of improvement was greater in the experimental group than in the control group. However, there were no differences between the two groups in the improvement of the chest parameters. Orem's model was effective in improving self-care in patients with chest tubes.
Mechanical testing of bones: the positive synergy of finite-element models and in vitro experiments.
Cristofolini, Luca; Schileo, Enrico; Juszczyk, Mateusz; Taddei, Fulvia; Martelli, Saulo; Viceconti, Marco
2010-06-13
Bone biomechanics has been extensively investigated in the past, both with in vitro experiments and with numerical models. In most cases one approach or the other is chosen, without exploiting synergies. Both experiments and numerical models suffer from limitations in their accuracy and in their respective fields of application. In vitro experiments can improve numerical models by: (i) preliminarily identifying the most relevant failure scenarios; (ii) improving the model identification with experimentally measured material properties; (iii) improving the model identification with accurately measured actual boundary conditions; and (iv) providing quantitative validation based on mechanical properties (strain, displacements) directly measured from physical specimens tested in parallel with the modelling activity. Likewise, numerical models can improve in vitro experiments by: (i) identifying the most relevant loading configurations among a number of motor tasks that cannot be replicated in vitro; (ii) identifying acceptable simplifications for the in vitro simulation; (iii) optimizing the use of transducers to minimize errors and provide measurements at the most relevant locations; and (iv) exploring a variety of different conditions (material properties, interface, etc.) that would require enormous experimental effort. By reporting an example of a successful investigation of the femur, we show how a combination of numerical modelling and controlled experiments within the same research team can be designed to create a virtuous circle in which models are used to improve experiments, experiments are used to improve models, and their combination synergistically provides more detailed and more reliable results than can be achieved with either approach alone.
Improving inflow forecasting into hydropower reservoirs through a complementary modelling framework
NASA Astrophysics Data System (ADS)
Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.
2014-10-01
Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and the benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead time is considered within the day-ahead (Elspot) market of the Nordic exchange market. We present a new approach for issuing hourly reservoir inflow forecasts that aims to improve on existing operational forecasting models without modifying the pre-existing approach: an additive, or complementary, model is formulated independently to capture structure the existing model may be missing. Besides improving the forecast skill of operational models, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that carry information useful for reducing uncertainty in decision-making in hydropower system operation. The procedure comprises an error model added on top of an unalterable constant-parameter conceptual model, demonstrated here on the 207 km² Krinsvatn catchment in central Norway. The structure of the error model is established from attributes of the residual time series of the conceptual model. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead times up to 17 h. Season-based evaluations indicated that the improvement in inflow forecasts varies across seasons; forecasts in autumn and spring are less successful, with the 95% prediction interval bracketing less than 95% of the observations for lead times beyond 17 h.
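The complementary-model idea can be illustrated with a minimal sketch. Here a simple AR(1) error model stands in for the paper's residual-derived error model, and all data are synthetic; the base forecast itself is left untouched, as in the additive formulation:

```python
import numpy as np

def fit_ar1(residuals):
    """Least-squares AR(1) coefficient for the base model's errors."""
    r0, r1 = residuals[:-1], residuals[1:]
    return float(np.dot(r0, r1) / np.dot(r0, r0))

def complementary_forecast(base_forecast, last_error, phi, steps):
    """Correct an unaltered base forecast with a decaying AR(1) error term."""
    corrections = phi ** np.arange(1, steps + 1) * last_error
    return base_forecast[:steps] + corrections

# Toy example: persistent (autocorrelated) base-model errors, as one
# would diagnose from the residual time series of a conceptual model.
rng = np.random.default_rng(1)
errors = np.zeros(500)
for t in range(1, 500):
    errors[t] = 0.8 * errors[t - 1] + rng.normal(scale=0.1)
phi = fit_ar1(errors)
```

The correction decays toward zero with lead time, which is consistent with the abstract's finding that gains fade at longer lead times.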
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, L.; Britt, J.; Birkmire, R.
ITN Energy Systems, Inc., and Global Solar Energy, Inc., assisted by NREL's PV Manufacturing R&D program, have continued to advance CIGS production technology by developing trajectory-oriented predictive/control models, fault-tolerance control, control platform development, in-situ sensors, and process improvements. Modeling activities included developing physics-based and empirical models for CIGS and sputter-deposition processing, implementing model-based control, and applying predictive models to the construction of new evaporation sources and for control. Model-based control is enabled by implementing reduced or empirical models into a control platform. Reliability improvement activities include implementing preventive maintenance schedules; detecting failed sensors/equipment and reconfiguring to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which in turn has been enabled by control and reliability improvements due to this PV Manufacturing R&D program.
A Final Approach Trajectory Model for Current Operations
NASA Technical Reports Server (NTRS)
Gong, Chester; Sadovsky, Alexander
2010-01-01
Predicting accurate trajectories with limited intent information is a challenge faced by air traffic management decision support tools in operation today. One such tool is the FAA's Terminal Proximity Alert system which is intended to assist controllers in maintaining safe separation of arrival aircraft during final approach. In an effort to improve the performance of such tools, two final approach trajectory models are proposed; one based on polynomial interpolation, the other on the Fourier transform. These models were tested against actual traffic data and used to study effects of the key final approach trajectory modeling parameters of wind, aircraft type, and weight class, on trajectory prediction accuracy. Using only the limited intent data available to today's ATM system, both the polynomial interpolation and Fourier transform models showed improved trajectory prediction accuracy over a baseline dead reckoning model. Analysis of actual arrival traffic showed that this improved trajectory prediction accuracy leads to improved inter-arrival separation prediction accuracy for longer look ahead times. The difference in mean inter-arrival separation prediction error between the Fourier transform and dead reckoning models was 0.2 nmi for a look ahead time of 120 sec, a 33 percent improvement, with a corresponding 32 percent improvement in standard deviation.
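As a rough illustration of why a polynomial model can beat dead reckoning on final approach, consider a toy decelerating track. The data, degree, and look-ahead time below are all synthetic; the study's actual models and parameters are not reproduced here:

```python
import numpy as np

def poly_predict(times, positions, t_future, degree=2):
    """Fit a polynomial to recent track history and extrapolate it."""
    coeffs = np.polyfit(times, positions, degree)
    return np.polyval(coeffs, t_future)

def dead_reckoning(times, positions, t_future):
    """Baseline: extrapolate the last observed velocity."""
    v = (positions[-1] - positions[-2]) / (times[-1] - times[-2])
    return positions[-1] + v * (t_future - times[-1])

# Toy decelerating ground track along the approach path (metres).
t = np.arange(0.0, 60.0, 5.0)          # seconds of track history
x = 100.0 * t - 0.4 * t ** 2           # steadily decelerating aircraft
truth_120 = 100.0 * 120.0 - 0.4 * 120.0 ** 2
```

Dead reckoning assumes constant speed and overshoots a decelerating aircraft; the quadratic fit captures the deceleration from the same limited history, which mirrors the improvement over the dead-reckoning baseline reported above.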
NASA Astrophysics Data System (ADS)
Janpaule, Inese; Haritonova, Diana; Balodis, Janis; Zarins, Ansis; Silabriedis, Gunars; Kaminskis, Janis
2015-03-01
Development of a digital zenith telescope prototype, improved zenith camera construction, and analysis of experimental vertical-deflection measurements for the improvement of the Latvian geoid model have been carried out at the Institute of Geodesy and Geoinformatics (GGI), University of Latvia. GOCE satellite data were used to compute a geoid model for the Riga region, and the European gravimetric geoid model EGG97 together with 102 GNSS/levelling data points were used as input data in the calculation of the Latvian geoid model.
2016 KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carrington, David Bradley; Waters, Jiajia
Los Alamos National Laboratory and its collaborators are facilitating engine modeling by improving the accuracy and robustness of the modeling and of the software itself. We also continue to improve the physical modeling methods, developing and implementing new mathematical algorithms that represent the physics within an engine. We provide software that others may use directly or may extend with their own models, e.g., sophisticated chemical kinetics, different turbulence closure methods, or other fuel injection and spray systems.
wfip2.model/retro.hrrr.01.fcst.01 (Model: 10-Day Retrospective)
Macduff, Matt
2017-10-27
wfip2.model/retro.hrrr.02.fcst.01 (Model: 10-Day Retrospective)
Macduff, Matt
2017-10-27
wfip2.model/retro.hrrr.02.fcst.02 (Model: 10-Day Retrospective)
Macduff, Matt
2017-10-27
wfip2.model/retro.rap.01.fcst.01 (Model: 10-Day Retrospective)
Macduff, Matt
2017-10-27
wfip2.model/realtime.hrrr_wfip2.graphics.02 (Model: Real Time)
Macduff, Matt
2017-10-27
wfip2.model/retro.rap.02.fcst.01 (Model: 10-Day Retrospective)
Macduff, Matt
2017-10-27
wfip2.model/realtime.hrrr_wfip2.icbc.02 (Model: Real Time)
Macduff, Matt
2017-10-27
wfip2.model/retro.hrrr.01.fcst.02 (Model: 10-Day Retrospective)
Macduff, Matt
2017-10-27
Sun, Jin; Xu, Xiaosu; Liu, Yiting; Zhang, Tao; Li, Yao
2016-07-12
In order to reduce the influence of fiber optic gyroscope (FOG) random drift error on inertial navigation systems, an improved autoregressive (AR) model is put forward in this paper. First, based on real-time observations at each restart of the gyroscope, the model of FOG random drift can be established online. In the improved AR model, the FOG measured signal is employed instead of the zero-mean signal. Then, the modified Sage-Husa adaptive Kalman filter (SHAKF) is introduced, which can directly carry out real-time filtering of the FOG signals. Finally, static and dynamic experiments are conducted to verify its effectiveness, and the filtering results are analyzed with the Allan variance. The analysis shows that the improved AR model has high fitting accuracy and strong adaptability, with a minimum single-noise fitting accuracy of 93.2%. Based on the improved AR(3) model, SHAKF denoising is more effective than traditional methods, improving on them by more than 30%. The random drift error of the FOG is reduced effectively, and the precision of the FOG is improved.
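The key ingredient, an AR fit that keeps the measured signal's mean (via an intercept term) rather than zero-meaning it first, can be sketched as below. This is a plain least-squares AR fit on synthetic data, not the paper's full SHAKF pipeline:

```python
import numpy as np

def fit_ar(signal, order=3):
    """Least-squares AR fit with an intercept, so the measured signal
    can be modeled directly without removing its mean first.
    Returns [intercept, a1, ..., a_order]."""
    # Column k holds lag (k+1) of the signal, aligned with y = signal[order:].
    X = np.column_stack(
        [signal[order - k - 1 : len(signal) - k - 1] for k in range(order)]
    )
    X = np.column_stack([np.ones(len(X)), X])
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_next(signal, coeffs):
    """One-step-ahead AR prediction from the latest samples."""
    order = len(coeffs) - 1
    lags = signal[-1 : -order - 1 : -1]   # most recent sample first
    return coeffs[0] + np.dot(coeffs[1:], lags)

# Hypothetical gyro output: an AR(3) drift process around a nonzero bias.
rng = np.random.default_rng(2)
s = np.full(2000, 0.05)
for t in range(3, 2000):
    s[t] = 0.02 + 0.5 * s[t-1] + 0.2 * s[t-2] + 0.1 * s[t-3] \
           + rng.normal(scale=0.01)
coeffs = fit_ar(s, order=3)
```

Because the intercept absorbs the bias, the AR coefficients can be re-estimated online at each gyroscope restart without a separate mean-removal pass.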
Process Correlation Analysis Model for Process Improvement Identification
Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong
2014-01-01
Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170
NASA Astrophysics Data System (ADS)
Jacox, M.; Edwards, C. A.; Kahru, M.; Rudnick, D. L.; Kudela, R. M.
2012-12-01
A 26-year record of depth integrated primary productivity (PP) in the Southern California Current System (SCCS) is analyzed with the goal of improving satellite net primary productivity (PP) estimates. The ratio of integrated primary productivity to surface chlorophyll correlates strongly to surface chlorophyll concentration (chl0). However, chl0 does not correlate to chlorophyll-specific productivity, and appears to be a proxy for vertical phytoplankton distribution rather than phytoplankton physiology. Modest improvements in PP model performance are achieved by tuning existing algorithms for the SCCS, particularly by empirical parameterization of photosynthetic efficiency in the Vertically Generalized Production Model. Much larger improvements are enabled by improving accuracy of subsurface chlorophyll and light profiles. In a simple vertically resolved production model, substitution of in situ surface data for remote sensing estimates offers only marginal improvements in model r2 and total log10 root mean squared difference, while inclusion of in situ chlorophyll and light profiles improves these metrics significantly. Autonomous underwater gliders, capable of measuring subsurface fluorescence on long-term, long-range deployments, significantly improve PP model fidelity in the SCCS. We suggest their use (and that of other autonomous profilers such as Argo floats) in conjunction with satellites as a way forward for improved PP estimation in coastal upwelling systems.
NASA Astrophysics Data System (ADS)
Jacox, Michael G.; Edwards, Christopher A.; Kahru, Mati; Rudnick, Daniel L.; Kudela, Raphael M.
2015-02-01
A 26-year record of depth integrated primary productivity (PP) in the Southern California Current System (SCCS) is analyzed with the goal of improving satellite net primary productivity (PP) estimates. Modest improvements in PP model performance are achieved by tuning existing algorithms for the SCCS, particularly by parameterizing carbon fixation rate in the vertically generalized production model as a function of surface chlorophyll concentration and distance from shore. Much larger improvements are enabled by improving the accuracy of subsurface chlorophyll and light profiles. In a simple vertically resolved production model for the SCCS (VRPM-SC), substitution of in situ surface data for remote sensing estimates offers only marginal improvements in model r2 (from 0.54 to 0.56) and total log10 root mean squared difference (from 0.22 to 0.21), while inclusion of in situ chlorophyll and light profiles improves these metrics to 0.77 and 0.15, respectively. Autonomous underwater gliders, capable of measuring subsurface properties on long-term, long-range deployments, significantly improve PP model fidelity in the SCCS. We suggest their use (and that of other autonomous profilers such as Argo floats) in conjunction with satellites as a way forward for large-scale improvements in PP estimation.
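For reference, the vertically generalized production model mentioned in both abstracts has a standard published form (Behrenfeld and Falkowski, 1997). The sketch below implements that standard form, not the tuned SCCS parameterization, whose coefficients are not given in the abstracts:

```python
def vgpm_npp(chl0, pb_opt, e0, z_eu, day_length):
    """Standard VGPM depth-integrated net primary production
    (mg C m^-2 d^-1):
    chl0       surface chlorophyll concentration (mg m^-3)
    pb_opt     maximum carbon fixation rate (mg C (mg chl)^-1 h^-1)
    e0         surface PAR (mol quanta m^-2 d^-1)
    z_eu       euphotic depth (m)
    day_length photoperiod (h)
    """
    return 0.66125 * pb_opt * (e0 / (e0 + 4.1)) * z_eu * chl0 * day_length
```

The tuning described above amounts to replacing the default pb_opt parameterization with one fitted to the SCCS (e.g., as a function of surface chlorophyll and distance from shore), while the in situ profile work replaces the implicit vertical chlorophyll and light structure with measured profiles.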
NASA Astrophysics Data System (ADS)
Tavakkoli, M.; Kharrat, R.; Masihi, M.; Ghazanfari, M. H.; Fadaei, S.
2012-12-01
Thermodynamic modeling is known as a promising tool for phase behavior modeling of asphaltene precipitation under different conditions such as pressure depletion and CO2 injection. In this work, a thermodynamic approach is used for modeling the phase behavior of asphaltene precipitation. The precipitated asphaltene phase is represented by an improved solid model, while the oil and gas phases are modeled with an equation of state. The PR-EOS was used to perform flash calculations. Then, the onset point and the amount of precipitated asphaltene were predicted. A computer code based on the improved solid model has been developed and used for predicting asphaltene precipitation data for one of the Iranian heavy crudes, under pressure depletion and CO2 injection conditions. A significant improvement has been observed in predicting the asphaltene precipitation data under gas injection conditions; in particular, good agreement was observed for the maximum value of asphaltene precipitation and for the trend of the curve after the peak point. For gas injection conditions, comparison of the thermodynamic micellization model and the improved solid model showed that the micellization model cannot predict the maximum of precipitation as well as the improved solid model. The non-isothermal improved solid model has been used for predicting asphaltene precipitation data under pressure depletion conditions. The pressure depletion tests were done at different levels of temperature and pressure, and the parameters of the non-isothermal model were tuned using three onset pressures at three different temperatures for the considered crude. The results showed that the model is highly sensitive to the solid molar volume and to the interaction coefficient between the asphaltene component and the light hydrocarbon components. Using the non-isothermal improved solid model, the asphaltene phase envelope was developed.
At high temperatures, a further increase in temperature results in a lower amount of asphaltene precipitation and causes the lower and upper boundaries of the asphaltene phase envelope to converge. This work illustrates the successful application of a non-isothermal improved solid model for developing the asphaltene phase envelope of a heavy crude, which can be helpful for monitoring and controlling asphaltene precipitation through the wellbore and surface facilities during heavy oil production.
Wandersman, Abraham; Alia, Kassandra Ann; Cook, Brittany; Ramaswamy, Rohit
2015-01-01
While the body of evidence-based healthcare interventions grows, the ability of health systems to deliver these interventions effectively and efficiently lags behind. Quality improvement approaches, such as the model for improvement, have demonstrated some success in healthcare but their impact has been lessened by implementation challenges. To help address these challenges, we describe the empowerment evaluation approach that has been developed by programme evaluators and a method for its application (Getting To Outcomes (GTO)). We then describe how GTO can be used to implement healthcare interventions. An illustrative healthcare quality improvement example that compares the model for improvement and the GTO method for reducing hospital admissions through improved diabetes care is described. We conclude with suggestions for integrating GTO and the model for improvement. PMID:26178332
Effect of Time Varying Gravity on DORIS processing for ITRF2013
NASA Astrophysics Data System (ADS)
Zelensky, N. P.; Lemoine, F. G.; Chinn, D. S.; Beall, J. W.; Melachroinos, S. A.; Beckley, B. D.; Pavlis, D.; Wimert, J.
2013-12-01
Computations are under way to develop a new time series of DORIS SINEX solutions to contribute to the development of the new realization of the terrestrial reference frame (cf. ITRF2013). One of the envisaged improvements is the application of improved models of time-variable gravity (TVG) in the background orbit modeling. At GSFC we have developed a time series of spherical harmonics to degree and order 5 (using the GOCO02S model as a base), based on the processing of SLR and DORIS data from 14 satellites from 1993 to 2013. This is compared with the standard approach used in ITRF2008, based on the static model EIGEN-GL04S1, which included secular variations in only a few select coefficients. Previous work on altimeter satellite POD (cf. TOPEX/Poseidon, Jason-1, Jason-2) has shown that the standard model is not adequate and that orbit improvements are observed with the application of more detailed models of time-variable gravity. In this study, we quantify the impact of TVG modeling on DORIS satellite POD and ascertain the impact on DORIS station positions estimated weekly from 1993 to 2013. The numerous recent improvements to SLR and DORIS processing at GSFC include more complete compliance with IERS2010 standards, improvements to SLR/DORIS measurement modeling, and improved non-conservative force modeling for the DORIS satellites. These improvements will affect gravity coefficient estimates, POD, and the station solutions. Tests evaluate the impact of time-varying gravity on tracking data residuals, station consistency, and the geocenter and scale reference frame parameters.
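A low-degree time-variable gravity coefficient series of the kind described above is commonly represented, per coefficient, as a bias, a secular trend, and annual/semiannual sinusoids estimated by least squares. A minimal sketch of such a fit on synthetic data (not the GSFC series):

```python
import numpy as np

def fit_tvg_coefficient(t_years, values):
    """Least-squares fit of bias, secular trend, and annual + semiannual
    terms to a gravity-coefficient time series (e.g. C20).
    Returns [bias, trend/yr, cos_ann, sin_ann, cos_semi, sin_semi]."""
    w = 2.0 * np.pi  # one cycle per year, t in years
    A = np.column_stack([
        np.ones_like(t_years), t_years,
        np.cos(w * t_years), np.sin(w * t_years),
        np.cos(2 * w * t_years), np.sin(2 * w * t_years),
    ])
    params, *_ = np.linalg.lstsq(A, values, rcond=None)
    return params

# Toy series: a secular trend plus an annual cycle, loosely resembling
# the variability seen in SLR-derived low-degree coefficients.
t = np.linspace(0.0, 20.0, 1043)          # ~weekly solutions over 20 years
c20 = 1e-11 * t + 5e-11 * np.cos(2 * np.pi * t)
params = fit_tvg_coefficient(t, c20)
```

The fitted trend and periodic terms can then be evaluated at any epoch to supply background-model coefficients for the orbit determination.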
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, L.
ITN Energy Systems, Inc., and Global Solar Energy, Inc., with the assistance of NREL's PV Manufacturing R&D program, have continued the advancement of CIGS production technology through the development of trajectory-oriented predictive/control models, fault-tolerance control, control-platform development, in-situ sensors, and process improvements. Modeling activities to date include the development of physics-based and empirical models for CIGS and sputter-deposition processing, implementation of model-based control, and application of predictive models to the construction of new evaporation sources and for control. Model-based control is enabled through implementation of reduced or empirical models into a control platform. Reliability improvement activities include implementation of preventive maintenance schedules; detection of failed sensors/equipment and reconfiguration to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which, in turn, has been enabled by control and reliability improvements due to this PV Manufacturing R&D program. This has resulted in substantial improvements in flexible CIGS PV module performance and efficiency.
Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B
2007-03-01
The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding of, systematically appraise, and identify areas of improvement in a business process. The Unified Modelling Language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, the lack of symmetric communication channels, critical dependencies among processing stages, and a failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage (especially from the pilot phase), parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.
Skill of Predicting Heavy Rainfall Over India: Improvement in Recent Years Using UKMO Global Model
NASA Astrophysics Data System (ADS)
Sharma, Kuldeep; Ashrit, Raghavendra; Bhatla, R.; Mitra, A. K.; Iyengar, G. R.; Rajagopal, E. N.
2017-11-01
The quantitative precipitation forecast (QPF) performance for heavy rains is still a challenge, even for the most advanced state-of-the-art high-resolution Numerical Weather Prediction (NWP) modeling systems. This study evaluates the performance of the UK Met Office Unified Model (UKMO) over India for the prediction of high rainfall amounts (>2 and >5 cm/day) during the monsoon period (JJAS) from 2007 to 2015 in short-range forecasts up to Day 3. Among the various modeling upgrades and improvements in the parameterizations during this period, the model's horizontal resolution improved from 40 km in 2007 to 17 km in 2015. The skill of short-range rainfall forecasts has improved in the UKMO model in recent years, mainly due to increased horizontal and vertical resolution along with improved physics schemes. Categorical verification carried out using four metrics, namely probability of detection (POD), false alarm ratio (FAR), frequency bias (Bias) and Critical Success Index (CSI), indicates that QPF has improved by >29% in POD and >24% in FAR. Additionally, verification scores such as the EDS (Extreme Dependency Score), EDI (Extremal Dependence Index) and SEDI (Symmetric EDI) are used with special emphasis on the verification of extreme and rare rainfall events. These scores also show an improvement of 60% (EDS) and >34% (EDI and SEDI) during the period of study, suggesting an improved skill in predicting heavy rains.
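The categorical scores named above are computed from a 2x2 contingency table of hits, misses and false alarms for a rain/no-rain threshold such as 2 cm/day. The definitions below follow standard forecast-verification practice; the counts are illustrative and not taken from the study.

```python
# Standard categorical verification scores from a 2x2 contingency table.
def categorical_scores(hits, misses, false_alarms):
    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    bias = (hits + false_alarms) / (hits + misses)  # frequency bias
    csi = hits / (hits + misses + false_alarms)     # critical success index
    return pod, far, bias, csi

# Illustrative counts for one threshold and season:
pod, far, bias, csi = categorical_scores(hits=40, misses=10, false_alarms=20)
print(round(pod, 2), round(far, 2), round(bias, 2), round(csi, 2))
# -> 0.8 0.33 1.2 0.57
```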
Macduff, Matt
2017-10-26
The primary purpose of WFIP2 Model Development Team is to improve existing numerical weather prediction models in a manner that leads to improved wind forecasts in regions of complex terrain. Improvements in the models will come through better understanding of the physics associated with the wind flow in and around the wind plant across a range of temporal and spatial scales, which will be gained through WFIP2’s observational field study and analysis.
Automation of energy demand forecasting
NASA Astrophysics Data System (ADS)
Siddique, Sanzad
Automation of energy demand forecasting saves time and effort by automatically searching for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometric models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search-based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
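The model-search idea can be sketched as fitting each candidate on a training window and keeping the candidate with the lowest validation error. The two candidates below (a constant-mean model and a simple linear trend) are illustrative stand-ins, not the econometric models actually considered in the thesis.

```python
# Minimal sketch of automated model search over a candidate space.
def fit_mean(xs, ys):
    m = sum(ys) / len(ys)
    return lambda x: m  # constant forecast

def fit_linear(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return lambda x: intercept + slope * x

def search_models(train, valid, candidates):
    # Fit every candidate on `train`, score on `valid`, keep the best.
    best_name, best_mse = None, float("inf")
    for name, fitter in candidates.items():
        model = fitter([x for x, _ in train], [y for _, y in train])
        mse = sum((model(x) - y) ** 2 for x, y in valid) / len(valid)
        if mse < best_mse:
            best_name, best_mse = name, mse
    return best_name, best_mse

# Demand with a linear trend: the trend model should be selected.
data = [(x, 2.0 + 0.5 * x) for x in range(10)]
name, mse = search_models(data[:7], data[7:],
                          {"mean": fit_mean, "linear": fit_linear})
print(name)  # -> linear
```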
Modelling the impacts of pests and diseases on agricultural systems.
Donatelli, M; Magarey, R D; Bregaglio, S; Willocquet, L; Whish, J P M; Savary, S
2017-07-01
The improvement and application of pest and disease models to analyse and predict yield losses including those due to climate change is still a challenge for the scientific community. Applied modelling of crop diseases and pests has mostly targeted the development of support capabilities to schedule scouting or pesticide applications. There is a need for research to both broaden the scope and evaluate the capabilities of pest and disease models. Key research questions not only involve the assessment of the potential effects of climate change on known pathosystems, but also on new pathogens which could alter the (still incompletely documented) impacts of pests and diseases on agricultural systems. Yield loss data collected in various current environments may no longer represent a adequate reference to develop tactical, decision-oriented, models for plant diseases and pests and their impacts, because of the ongoing changes in climate patterns. Process-based agricultural simulation modelling, on the other hand, appears to represent a viable methodology to estimate the impacts of these potential effects. A new generation of tools based on state-of-the-art knowledge and technologies is needed to allow systems analysis including key processes and their dynamics over appropriate suitable range of environmental variables. This paper offers a brief overview of the current state of development in coupling pest and disease models to crop models, and discusses technical and scientific challenges. We propose a five-stage roadmap to improve the simulation of the impacts caused by plant diseases and pests; i) improve the quality and availability of data for model inputs; ii) improve the quality and availability of data for model evaluation; iii) improve the integration with crop models; iv) improve the processes for model evaluation; and v) develop a community of plant pest and disease modelers.
Davis, Michael J; Janke, Robert
2018-01-04
The effect of limitations in the structural detail available in a network model on contamination warning system (CWS) design was examined in case studies using the original and skeletonized network models for two water distribution systems (WDSs). The skeletonized models were used as proxies for incomplete network models. CWS designs were developed by optimizing sensor placements for worst-case and mean-case contamination events. Designs developed using the skeletonized network models were transplanted into the original network model for evaluation. CWS performance was defined as the number of people who ingest more than some quantity of a contaminant in tap water before the CWS detects the presence of contamination. Lack of structural detail in a network model can result in CWS designs that (1) provide considerably less protection against worst-case contamination events than that obtained when a more complete network model is available and (2) yield substantial underestimates of the consequences associated with a contamination event. Nevertheless, CWSs developed using skeletonized network models can provide useful reductions in consequences for contaminants whose effects are not localized near the injection location. Mean-case designs can yield worst-case performances similar to those for worst-case designs when there is uncertainty in the network model. Improvements in network models for WDSs have the potential to yield significant improvements in CWS designs as well as more realistic evaluations of those designs. Although such improvements would be expected to yield improved CWS performance, the expected improvements in CWS performance have not been quantified previously. The results presented here should be useful to those responsible for the design or implementation of CWSs, particularly managers and engineers in water utilities, and encourage the development of improved network models.
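The sensor-placement optimization at the heart of a CWS design can be illustrated with a greedy heuristic: given, for each contamination scenario, the consequence incurred if the contaminant is first detected at a given node, add sensors one at a time to minimize the mean consequence. This greedy scheme is an assumed stand-in for illustration only; the study's actual design tool is not specified here.

```python
# Mean-case consequence of a sensor set: for each scenario, the consequence
# at the first detecting sensor (minimum over placed sensors), or a large
# penalty if no placed sensor ever sees the contaminant.
def mean_consequence(impact, sensors, penalty):
    total = 0.0
    for row in impact:
        detected = [row[s] for s in sensors if row[s] is not None]
        total += min(detected) if detected else penalty
    return total / len(impact)

def greedy_placement(impact, n_sensors, penalty=1e6):
    # Greedily add the sensor that most reduces the mean consequence.
    n_nodes = len(impact[0])
    chosen = []
    for _ in range(n_sensors):
        best = min((node for node in range(n_nodes) if node not in chosen),
                   key=lambda node: mean_consequence(impact, chosen + [node], penalty))
        chosen.append(best)
    return chosen

# 3 scenarios x 4 candidate nodes; None = this node never sees the contaminant.
impact = [[50, 10, None, 30],
          [None, 40, 20, 30],
          [60, None, 20, 10]]
print(greedy_placement(impact, 2))  # -> [3, 1]
```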
Incorporating groundwater flow into the WEPP model
William Elliot; Erin Brooks; Tim Link; Sue Miller
2010-01-01
The water erosion prediction project (WEPP) model is a physically-based hydrology and erosion model. In recent years, the hydrology prediction within the model has been improved for forest watershed modeling by incorporating shallow lateral flow into watershed runoff prediction. This has greatly improved WEPP's hydrologic performance on small watersheds with...
ERIC Educational Resources Information Center
Grogan, Rita D.
2017-01-01
Purpose: The purpose of this case study was to determine the impact of utilizing predictive modeling to improve successful course completion rates for at-risk students at California community colleges. A secondary purpose of the study was to identify factors of predictive modeling that have the most importance for improving successful course…
Naturalness of Electroweak Symmetry Breaking while Waiting for the LHC
NASA Astrophysics Data System (ADS)
Espinosa, J. R.
2007-06-01
After revisiting the hierarchy problem of the Standard Model and its implications for the scale of New Physics, I consider the fine-tuning problem of electroweak symmetry breaking in several scenarios beyond the Standard Model: SUSY, Little Higgs and "improved naturalness" models. The main conclusions are that: New Physics should appear within the reach of the LHC; some SUSY models can solve the hierarchy problem with acceptable residual tuning; Little Higgs models generically suffer from large tunings, often hidden; and, finally, "improved naturalness" models do not generically improve the naturalness of the SM.
Improving local ozone forecasting by integrated models.
Gradišar, Dejan; Grašič, Boštjan; Božnar, Marija Zlata; Mlakar, Primož; Kocijan, Juš
2016-09-01
This paper discusses the problem of forecasting the maximum ozone concentrations in urban microlocations, where reliable alerting of the local population when thresholds have been surpassed is necessary. To improve the forecast, a methodology of integrated models is proposed. The model is based on multilayer perceptron neural networks that use as inputs all available information from the QualeAria air-quality model, the WRF numerical weather prediction model and on-site measurements of meteorology and air pollution. While air-quality and meteorological models cover a large three-dimensional geographical space, their local resolution is often not satisfactory. On the other hand, empirical methods have the advantage of good local forecasts. In this paper, integrated models are used for improved 1-day-ahead forecasting of the maximum hourly value of ozone within each day for representative locations in Slovenia. The WRF meteorological model is used for forecasting meteorological variables and the QualeAria air-quality model for gas concentrations. Their predictions, together with measurements from ground stations, are used as inputs to a neural network. The model validation results show that integrated models noticeably improve ozone forecasts and provide better alert systems.
Improvements to a global-scale groundwater model to estimate the water table across New Zealand
NASA Astrophysics Data System (ADS)
Westerhoff, Rogier; Miguez-Macho, Gonzalo; White, Paul
2017-04-01
Groundwater models at the global scale have become increasingly important in recent years to assess the effects of climate change and groundwater depletion. However, these global-scale models are typically not used for studies at the catchment scale, because they are simplified and too spatially coarse. In this study, we improved the global-scale Equilibrium Water Table (EWT) model so it could better assess water table depth and water table elevation at the national scale for New Zealand. The resulting National Water Table (NWT) model used improved input data (i.e., national data on terrain, geology, and recharge) and improved model equations (e.g., a hydraulic conductivity-depth relation). The NWT model produced maps of the water table that identified the main alluvial aquifers in fine spatial detail. Two regional case studies at the catchment scale demonstrated excellent correlation between the modelled water table elevation and observations of hydraulic head. The NWT water tables are an improvement over the EWT model's estimates: in the two case studies the NWT model provided a better approximation to the observed water table for deep aquifers, and its improved resolution provided the capability to fill gaps in data-sparse areas. This national model calculates water table depth and elevation across regional jurisdictions; it is therefore relevant where trans-boundary issues, such as source protection and catchment boundary definition, occur. The NWT model also has the potential to constrain the uncertainty of catchment-scale models, particularly where data are sparse. Shortcomings of the NWT model are caused by inaccuracies in the input data and the simplified model properties. Future research should focus on improved estimation of input data (e.g., hydraulic conductivity and terrain). However, more advanced catchment-scale groundwater models should be used where groundwater flow is dominated by confining layers and fractures.
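A hydraulic conductivity-depth relation of the kind mentioned is commonly modelled as an exponential decay with depth. The sketch below assumes that common form with invented parameters; the paper's exact equation and coefficients are not reproduced here.

```python
import math

def conductivity(z, k0=1e-4, d=100.0):
    """Hydraulic conductivity (m/s) at depth z (m), assuming an exponential
    decay from surface value k0 with e-folding depth d (illustrative values)."""
    return k0 * math.exp(-z / d)

# Conductivity should decrease monotonically with depth:
print(conductivity(0.0) > conductivity(50.0) > conductivity(200.0))  # -> True
```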
Model-data integration to improve the LPJmL dynamic global vegetation model
NASA Astrophysics Data System (ADS)
Forkel, Matthias; Thonicke, Kirsten; Schaphoff, Sibyll; Thurner, Martin; von Bloh, Werner; Dorigo, Wouter; Carvalhais, Nuno
2017-04-01
Dynamic global vegetation models show large uncertainties regarding the development of the land carbon balance under future climate change conditions. This uncertainty is partly caused by differences in how vegetation carbon turnover is represented in global vegetation models. Model-data integration approaches might help to systematically assess and improve model performance and thus to reduce the uncertainty in terrestrial vegetation responses under future climate change. Here we present several applications of model-data integration with the LPJmL (Lund-Potsdam-Jena managed Lands) dynamic global vegetation model to systematically improve the representation of processes or to estimate model parameters. In a first application, we used global satellite-derived datasets of FAPAR (fraction of absorbed photosynthetically active radiation), albedo and gross primary production to estimate phenology- and productivity-related model parameters using a genetic optimization algorithm. Thereby we identified major limitations of the phenology module and implemented an alternative empirical phenology model. The new phenology module and optimized model parameters resulted in a better performance of LPJmL in representing global spatial patterns of biomass, tree cover, and the temporal dynamics of atmospheric CO2. In a second application, we therefore additionally used global datasets of biomass and land cover to estimate model parameters that control vegetation establishment and mortality. The results demonstrate the ability to improve simulations of vegetation dynamics but also highlight the need to improve the representation of mortality processes in dynamic global vegetation models. In a third application, we used multiple site-level observations of ecosystem carbon and water exchange, biomass and soil organic carbon to jointly estimate various model parameters that control ecosystem dynamics.
This exercise demonstrates the strong role of individual data streams on the simulated ecosystem dynamics which consequently changed the development of ecosystem carbon stocks and fluxes under future climate and CO2 change. In summary, our results demonstrate challenges and the potential of using model-data integration approaches to improve a dynamic global vegetation model.
NASA Astrophysics Data System (ADS)
Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.
2015-08-01
Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and the benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead time is considered within the day-ahead (Elspot) market of the Nordic exchange market. A complementary modelling framework offers an approach for improving real-time forecasting without needing to modify the pre-existing forecasting model, by instead formulating an independent additive or complementary model that captures the structure the existing operational model may be missing. We present here the application of this principle for issuing improved hourly inflow forecasts into hydropower reservoirs over extended lead times, with the parameter estimation procedure reformulated to deal with bias, persistence and heteroscedasticity. The procedure comprises an error model added on top of an unalterable constant-parameter conceptual model, and is applied in the 207 km2 Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Besides improving the forecast skill of operational models, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that contain suitable information for reducing uncertainty in decision-making processes in hydropower systems operation. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead times up to 17 h. Evaluation of the percentage of observations bracketed in the forecasted 95 % confidence interval indicated that the degree of success in containing 95 % of the observations varies across seasons and hydrologic years.
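The complementary-model idea can be sketched as leaving the conceptual model untouched and fitting a simple error model on its residuals, here a first-order autoregressive (AR(1)) correction for bias and persistence. The real complementary model also handles heteroscedasticity and produces probabilistic forecasts; this sketch covers only the deterministic part, with invented residuals.

```python
def fit_ar1(residuals):
    # Least-squares fit of e_t ~ c + phi * e_{t-1} on the base model's residuals.
    xs, ys = residuals[:-1], residuals[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    phi = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
          sum((x - mx) ** 2 for x in xs)
    c = my - phi * mx
    return c, phi

def corrected_forecast(base_forecast, last_error, c, phi):
    # Add the predicted error of the unalterable base model to its forecast.
    return base_forecast + c + phi * last_error

# Base model with a persistent positive bias in its residuals:
residuals = [1.0, 1.2, 1.1, 1.3, 1.2, 1.25]
c, phi = fit_ar1(residuals)
print(round(corrected_forecast(10.0, 1.2, c, phi), 1))  # -> 11.2
```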
Consistency of internal fluxes in a hydrological model running at multiple time steps
NASA Astrophysics Data System (ADS)
Ficchi, Andrea; Perrin, Charles; Andréassian, Vazken
2016-04-01
Improving hydrological models remains a difficult task and many ways can be explored, among which are the improvement of spatial representation, the search for more robust parametrizations, better formulation of some processes, and the modification of model structures by trial-and-error procedures. Several past works indicate that model parameters and structure can depend on the modelling time step, and there is thus some rationale in investigating how a model behaves across various modelling time steps to find solutions for improvement. Here we analyse the impact of the data time step on the consistency of the internal fluxes of a rainfall-runoff model run at various time steps, using a large data set of 240 catchments. To this end, fine-time-step hydro-climatic information at sub-hourly resolution is used as input to a parsimonious rainfall-runoff model (GR) that is run at eight different model time steps (from 6 minutes to one day). The initial structure of the tested model (i.e. the baseline) corresponds to the daily model GR4J (Perrin et al., 2003), adapted to run at variable sub-daily time steps. The modelled fluxes considered are interception, actual evapotranspiration and intercatchment groundwater flows. Observations of these fluxes are not available, but the comparison of modelled fluxes at multiple time steps gives additional information for model identification. The joint analysis of flow simulation performance and the consistency of internal fluxes at different time steps provides guidance for identifying the model components that should be improved. Our analysis indicates that the baseline model structure should be modified at sub-daily time steps to ensure the consistency and realism of the modelled fluxes. For the baseline model improvement, particular attention is devoted to the interception component, whose output flux showed the strongest sensitivity to the modelling time step.
The dependency of the optimal model complexity on time step is also analysed. References: Perrin, C., Michel, C., Andréassian, V., 2003. Improvement of a parsimonious model for streamflow simulation. Journal of Hydrology, 279(1-4): 275-289. DOI:10.1016/S0022-1694(03)00225-7
Zhou, Miaolei; Wang, Shoubin; Gao, Wei
2013-01-01
As a new type of intelligent material, magnetic shape memory alloy (MSMA) performs well in actuator manufacturing applications. Compared with traditional actuators, the MSMA actuator has the advantages of fast response and large deformation; however, the hysteresis nonlinearity of the MSMA actuator limits further improvement of its control precision. In this paper, an improved Krasnosel'skii-Pokrovskii (KP) model is used to establish the hysteresis model of the MSMA actuator. To identify the weighting parameters of the KP operators, an improved gradient correction algorithm and a variable step-size recursive least squares estimation algorithm are proposed. To demonstrate the validity of the proposed modeling approach, simulation experiments are performed with each identification algorithm. Simulation results of both identification algorithms demonstrate that the proposed modeling approach can establish an effective and accurate hysteresis model for the MSMA actuator, and it provides a foundation for improving the control precision of the MSMA actuator.
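The weight-identification step can be illustrated for any model that is linear in its weights, y = sum_i w_i * phi_i(u). In the sketch below the KP hysteresis operators are replaced by simple polynomial basis functions, and a plain LMS-style gradient correction is used; this shows only the identification idea, not the paper's KP operators or its specific improved algorithms.

```python
def basis(u):
    # Stand-ins for KP operator outputs (illustrative polynomial features).
    return [1.0, u, u * u]

def lms_identify(samples, mu=0.2, epochs=5000):
    # Gradient-correction (LMS) update: w <- w + mu * error * phi(u).
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for u, y in samples:
            phi = basis(u)
            e = y - sum(wi * pi for wi, pi in zip(w, phi))
            w = [wi + mu * e * pi for wi, pi in zip(w, phi)]
    return w

# Data generated by y = 0.5 + 2u - u^2; identification should recover it.
samples = [(u / 10.0, 0.5 + 2 * (u / 10.0) - (u / 10.0) ** 2) for u in range(11)]
w = lms_identify(samples)
mse = sum((y - sum(wi * pi for wi, pi in zip(w, basis(u)))) ** 2
          for u, y in samples) / len(samples)
print(mse < 1e-6)  # -> True
```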
Hysteresis Modeling of Magnetic Shape Memory Alloy Actuator Based on Krasnosel'skii-Pokrovskii Model
Wang, Shoubin; Gao, Wei
2013-01-01
Primary care access improvement: an empowerment-interaction model.
Ledlow, G R; Bradshaw, D M; Shockley, C
2000-05-01
Improving community primary care access is a difficult and dynamic undertaking. To address a recognized need to improve appointment availability, a systematic approach based on measurement, empowerment, and interaction was developed. The model fostered the exchange of information and problem solving between interdependent staff sections within a managed care system. Measuring appointments demanded but not available proved to be a credible customer-focused way to benchmark against set goals. Changing the organizational culture to become more sensitive to changing beneficiary needs was a paramount consideration. Dependent-group t tests were performed to compare the pretreatment and posttreatment effect. The empowerment-interaction model significantly improved the availability of routine and wellness-type appointments. The availability of urgent appointments improved, but not significantly; a better prospective model needs to be developed. In aggregate, the share of appointments demanded but not available was more than 10% before the treatment and less than 3% with the empowerment-interaction model in place.
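The dependent-group (paired) t test used above is the standard t = mean(d) / (sd(d) / sqrt(n)) on the per-unit pre/post differences d. The numbers below are invented for illustration and are not the study's data.

```python
import math

def paired_t(before, after):
    # Dependent-group t statistic on paired differences.
    d = [b - a for b, a in zip(before, after)]
    n = len(d)
    mean_d = sum(d) / n
    sd = math.sqrt(sum((x - mean_d) ** 2 for x in d) / (n - 1))  # sample sd
    return mean_d / (sd / math.sqrt(n))

# Illustrative percentages of demanded-but-unavailable appointments,
# before vs. with the model, for five clinic sections:
before = [12.0, 11.0, 10.5, 13.0, 10.0]
after = [2.5, 3.0, 2.0, 3.5, 2.5]
print(round(paired_t(before, after), 1))  # -> 21.5
```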
NASA Astrophysics Data System (ADS)
Abitew, T. A.; van Griensven, A.; Bauwens, W.
2015-12-01
Evapotranspiration is the main process in hydrology (on average around 60%), though it has not received as much attention in the evaluation and calibration of hydrological models. In this study, Remote Sensing (RS) derived evapotranspiration (ET) is used to improve the spatially distributed representation of ET in SWAT model applications in the upper Mara basin (Kenya) and the Blue Nile basin (Ethiopia). The RS-derived ET data are obtained from recently compiled global datasets (continuous monthly data at 1 km resolution from the MOD16NBI, SSEBop, ALEXI and CMRSET models) and from regionally applied energy balance models (for several cloud-free days). The RS-ET data are used in three ways: (1) to evaluate spatially distributed evapotranspiration model results; (2) to calibrate the evapotranspiration processes in the hydrological model; (3) to bias-correct the evapotranspiration in the hydrological model during simulation after changing the SWAT code. An inter-comparison of the RS-ET products shows that at present there is a significant bias, but at the same time an agreement on the spatial variability of ET. The ensemble mean of the different ET products seems the most realistic estimate and was used further in this study. The results show that: (1) the spatially mapped evapotranspiration of hydrological models shows clear differences when compared to RS-derived evapotranspiration (low correlations); evapotranspiration in forested areas in particular is strongly underestimated compared to other land covers; (2) calibration improves the correlations between the RS and hydrological model results to some extent; (3) bias corrections are efficient in producing (seasonal or annual) evapotranspiration maps from hydrological models which are very similar to the patterns obtained from RS data. Though the bias correction is very efficient, it is advised to improve the model results by better representing the ET processes through improved plant/crop computations, improved agricultural management practices or improved meteorological data.
Augmenting an observation network to facilitate flow and transport model discrimination.
USDA-ARS?s Scientific Manuscript database
Improving understanding of subsurface conditions includes performance comparison for competing models, independently developed or obtained via model abstraction. The model comparison and discrimination can be improved if additional observations are included. The objective of this work was to i...
NASA Astrophysics Data System (ADS)
Abbasi Baharanchi, Ahmadreza
This dissertation focused on the development and utilization of numerical and experimental approaches to improve the CFD modeling of the fluidization flow of cohesive micron-size particles. The specific objectives of this research were: (1) developing a cluster prediction mechanism applicable to Two-Fluid Modeling (TFM) of gas-solid systems; (2) developing more accurate drag models for TFM of gas-solid fluidization flow in the presence of cohesive interparticle forces; (3) using the developed models to explore the improvement of the accuracy of TFM in simulating the fluidization flow of cohesive powders; (4) understanding the causes and influential factors that led to the improvements, and quantifying those improvements; and (5) gathering data from a fast fluidization flow and using these data for benchmark validations. Simulation results with the two developed cluster-aware drag models showed that cluster prediction could effectively influence the results in both models. The improvement in the accuracy of TFM modeling using three versions of the first hybrid model was significant, and the best improvements were obtained with the smallest values of the switch parameter, which captured the smallest chances of cluster prediction. In the case of the second hybrid model, the dependence of the critical model parameter on the Reynolds number alone meant that the improvement in accuracy was significant only in the dense section of the fluidized bed. This finding suggests that a more sophisticated particle-resolved DNS model, which can span a wide range of solid volume fractions, could be used in the formulation of the cluster-aware drag model. The results of experiments using high-speed imaging indicated the presence of particle clusters in the fluidization flow of FCC inside the riser of the FIU-CFB facility. In addition, pressure data were successfully captured along the fluidization column of the facility and used as benchmark validation data for the second hybrid model developed in the present dissertation. It was shown that the second hybrid model could predict the pressure data in the dense section of the fluidization column with better accuracy.
Safaie, Ammar; Wendzel, Aaron; Ge, Zhongfu; Nevers, Meredith; Whitman, Richard L.; Corsi, Steven R.; Phanikumar, Mantha S.
2016-01-01
Statistical and mechanistic models are popular tools for predicting the levels of indicator bacteria at recreational beaches. Researchers tend to use one class of model or the other, and it is difficult to generalize statements about their relative performance due to differences in how the models are developed, tested, and used. We describe a cooperative modeling approach for freshwater beaches impacted by point sources in which insights derived from mechanistic modeling were used to further improve the statistical models and vice versa. The statistical models provided a basis for assessing the mechanistic models which were further improved using probability distributions to generate high-resolution time series data at the source, long-term “tracer” transport modeling based on observed electrical conductivity, better assimilation of meteorological data, and the use of unstructured-grids to better resolve nearshore features. This approach resulted in improved models of comparable performance for both classes including a parsimonious statistical model suitable for real-time predictions based on an easily measurable environmental variable (turbidity). The modeling approach outlined here can be used at other sites impacted by point sources and has the potential to improve water quality predictions resulting in more accurate estimates of beach closures.
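A parsimonious statistical model of the kind described, driven by a single easily measured variable, can be sketched as an ordinary least-squares regression of log10 indicator-bacteria concentration on log10 turbidity. The data values and coefficients below are invented for illustration; the paper's fitted model is not reproduced here.

```python
import math

def fit_loglog(turbidity, ecoli):
    # OLS fit of log10(E. coli) on log10(turbidity).
    xs = [math.log10(t) for t in turbidity]
    ys = [math.log10(e) for e in ecoli]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope  # (intercept, slope) in log10 space

def predict_ecoli(intercept, slope, turbidity):
    return 10 ** (intercept + slope * math.log10(turbidity))

# Synthetic records following E. coli = 50 * turbidity^0.8:
turb = [1.0, 2.0, 5.0, 10.0, 20.0]
a, b = fit_loglog(turb, [50 * t ** 0.8 for t in turb])
print(round(b, 2))  # -> 0.8
```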
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolski, Jeffrey S.; Barlow, David B.; Macek, Robert J.
2011-01-01
Particle ray tracing through simulated 3D magnetic fields was executed to investigate the effective quadrupole strength of the edge focusing of the rectangular bending magnets in the Los Alamos Proton Storage Ring (PSR). The particle rays receive a kick in the edge field of the rectangular dipole. A focal length may be calculated from the particle tracking and related to the fringe field integral (FINT) model parameter. This tech note introduces the baseline lattice model of the PSR and motivates the need for an improvement in the baseline model's vertical tune prediction, which differs from measurement by 0.05. An improved model of the PSR is created by modifying the fringe field integral parameter to the values suggested by the ray tracing investigation. This improved model is then verified against measurement at the nominal PSR operating set point and at set points far from the nominal operating conditions. Lastly, Linear Optics from Closed Orbits (LOCO) is employed in an orbit response matrix method for model improvement to verify the quadrupole strengths of the improved model.
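The FINT adjustment described above can be sketched with the standard thin-lens dipole-edge formula. The convention below (a correction angle psi proportional to FINT, as used in accelerator codes such as MAD) and the magnet numbers are illustrative assumptions, not PSR values:

```python
import math

def vertical_edge_strength(rho, edge_angle, gap, fint):
    """Thin-lens vertical focusing strength 1/f (1/m) of a dipole edge.

    Uses a common fringe-field convention:
        psi  = fint * gap * (1 + sin^2 e) / (rho * cos e)
        1/fy = -tan(e - psi) / rho
    A larger FINT weakens the vertical edge focusing, which is the
    knob used to correct the model's vertical tune.
    """
    psi = fint * gap * (1.0 + math.sin(edge_angle) ** 2) / (rho * math.cos(edge_angle))
    return -math.tan(edge_angle - psi) / rho

# Hypothetical rectangular bend: 36 deg total bend, half at each edge.
rho, e = 2.5, math.radians(18.0)
for fint in (0.0, 0.5):
    print(fint, round(vertical_edge_strength(rho, e, gap=0.1, fint=fint), 4))
```

With FINT = 0 the hard-edge value tan(e)/rho is recovered; raising FINT reduces the magnitude of the vertical focusing, shifting the vertical tune in the lattice model.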
Ogrinc, Greg; Hoffman, Kimberly G.; Stevenson, Katherine M.; Shalaby, Marc; Beard, Albertine S.; Thörne, Karin E.; Coleman, Mary T.; Baum, Karyn D.
2016-01-01
Problem Current models of health care quality improvement do not explicitly describe the role of health professions education. The authors propose the Exemplary Care and Learning Site (ECLS) model as an approach to achieving continual improvement in care and learning in the clinical setting. Approach From 2008 to 2012, an iterative, interactive process was used to develop the ECLS model and its core elements—patients and families informing process changes; trainees engaging both in care and the improvement of care; leaders knowing, valuing, and practicing improvement; data transforming into useful information; and health professionals competently engaging both in care improvement and teaching about care improvement. In 2012–2013, a three-part feasibility test of the model, including a site self-assessment, an independent review of each site’s ratings, and implementation case stories, was conducted at six clinical teaching sites (in the United States and Sweden). Outcomes Site leaders reported the ECLS model provided a systematic approach toward improving patient (and population) outcomes, system performance, and professional development. Most sites found it challenging to incorporate the patients and families element. The trainee element was strong at four sites. The leadership and data elements were self-assessed as the most fully developed. The health professionals element exhibited the greatest variability across sites. Next Steps The next test of the model should be prospective, linked to clinical and educational outcomes, to evaluate whether it helps care delivery teams, educators, and patients and families take action to achieve better patient (and population) outcomes, system performance, and professional development. PMID:26760058
A framework for multi-criteria assessment of model enhancements
NASA Astrophysics Data System (ADS)
Francke, Till; Foerster, Saskia; Brosinsky, Arlena; Delgado, José; Güntner, Andreas; López-Tarazón, José A.; Bronstert, Axel
2016-04-01
Modellers are often faced with unsatisfactory model performance for a specific setup of a hydrological model. In these cases, the modeller may try to improve the setup by addressing selected causes of the model errors (i.e. data errors, structural errors). This leads to adding certain "model enhancements" (MEs), e.g. climate data based on more monitoring stations, improved calibration data, or modifications to process formulations. However, deciding which MEs to implement remains a matter of expert knowledge, guided by some sensitivity analysis at best. When multiple MEs have been implemented, a resulting improvement in model performance is not easily attributed, especially when considering different aspects of this improvement (e.g. better performance dynamics vs. reduced bias). In this study we present an approach for comparing the effect of multiple MEs in the face of multiple improvement aspects. A stepwise selection approach and structured plots help in addressing the multidimensionality of the problem. The approach is applied to a case study, which employs the meso-scale hydrosedimentological model WASA-SED for a sub-humid catchment. The results suggest that the effect of the MEs is quite diverse, with some MEs (e.g. augmented rainfall data) causing improvements in almost all aspects, while the effect of other MEs is restricted to a few aspects or even deteriorates some of them. These specific results may not be generalizable. However, we suggest that studies like this can facilitate identifying the most promising MEs to implement.
USDA-ARS?s Scientific Manuscript database
The coupling of land surface models and hydrological models potentially improves the land surface representation, benefiting both the streamflow prediction capabilities as well as providing improved estimates of water and energy fluxes into the atmosphere. In this study, the simple biosphere model 2...
An Improved Inventory Control Model for the Brazilian Navy Supply System
2001-12-01
Moreira
The Brazilian Navy Inventory Control Point (ICP), the Centro de Controle de Inventario da Marinha, developed an empirical model called SPAADA... (Naval Postgraduate School thesis, Monterey, California; approved for public release, distribution unlimited.)
The use of discrete-event simulation modelling to improve radiation therapy planning processes.
Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven
2009-07-01
The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
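As a toy illustration of the discrete-event simulation idea (not the Arena model itself), the sketch below simulates a single planner serving plans that arrive at random and reports the mean time a plan spends in the system; all rates are hypothetical:

```python
import random

def simulate_planning(n_jobs=1000, mean_interarrival=4.0,
                      mean_service=3.0, seed=42):
    """Minimal single-server planning-queue simulation (M/M/1-style).

    Plans arrive with exponential interarrival times and are processed
    one at a time; returns the mean time in system (hours). This is a
    stand-in for the much richer RT-planning DES described above.
    """
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_jobs):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)

    server_free = 0.0   # time at which the planner next becomes idle
    total_time = 0.0
    for arr in arrivals:
        start = max(arr, server_free)          # wait if planner is busy
        service = rng.expovariate(1.0 / mean_service)
        server_free = start + service
        total_time += server_free - arr        # waiting + service time
    return total_time / n_jobs

print(round(simulate_planning(), 2))
```

Re-running with a shorter or less variable service time shows how reducing oncologist-related delays shortens the overall planning time, the kind of scenario testing the study performs.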
Model-data integration for developing the Cropland Carbon Monitoring System (CCMS)
NASA Astrophysics Data System (ADS)
Jones, C. D.; Bandaru, V.; Pnvr, K.; Jin, H.; Reddy, A.; Sahajpal, R.; Sedano, F.; Skakun, S.; Wagle, P.; Gowda, P. H.; Hurtt, G. C.; Izaurralde, R. C.
2017-12-01
The Cropland Carbon Monitoring System (CCMS) has been initiated to improve regional estimates of carbon fluxes from croplands in the conterminous United States through integration of terrestrial ecosystem modeling, use of remote-sensing products and publicly available datasets, and development of improved landscape and management databases. In order to develop these improved carbon flux estimates, experimental datasets are essential for evaluating the skill of estimates, characterizing their uncertainty, characterizing parameter sensitivities, and calibrating specific modeling components. Experiments were sought that included flux tower measurements of CO2 fluxes under production of major agronomic crops. Currently, data have been collected from 17 experiments comprising 117 site-years from 12 unique locations. Calibration of terrestrial ecosystem model parameters using available crop productivity and net ecosystem exchange (NEE) measurements resulted in improvements in the RMSE of NEE predictions of between 3.78% and 7.67%, while improvements in the RMSE for yield ranged from -1.85% to 14.79%. Model sensitivities were dominated by parameters related to leaf area index (LAI) and spring growth, demonstrating considerable capacity for model improvement through development and integration of remote-sensing products. Subsequent analyses will assess the impact of such integrated approaches on the skill of cropland carbon flux estimates.
ENSO Simulation in Coupled Ocean-Atmosphere Models: Are the Current Models Better?
DOE Office of Scientific and Technical Information (OSTI.GOV)
AchutaRao, K; Sperber, K R
Maintaining a multi-model database over a generation or more of model development provides an important framework for assessing model improvement. Using control integrations, we compare the simulation of the El Nino/Southern Oscillation (ENSO), and its extratropical impact, in models developed for the 2007 Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report with models developed in the late 1990s (the so-called Coupled Model Intercomparison Project-2 [CMIP2] models). The IPCC models tend to be more realistic in representing the frequency with which ENSO occurs, and they are better at locating enhanced temperature variability over the eastern Pacific Ocean. When compared with reanalyses, the IPCC models have larger pattern correlations of tropical surface air temperature than do the CMIP2 models during the boreal winter peak phase of El Nino. However, for sea-level pressure and precipitation rate anomalies, a clear separation in performance between the two vintages of models is not as apparent. The strongest improvement occurs for the modeling groups whose CMIP2 model tended to have the lowest pattern correlations with observations. This has been checked by subsampling the multi-century IPCC simulations in a manner consistent with the single 80-year time segment available from CMIP2. Our results suggest that multi-century integrations may be required to statistically assess model improvement of ENSO. The quality of the El Nino precipitation composite is directly related to the fidelity of the boreal winter precipitation climatology, highlighting the importance of reducing systematic model error. Over North America, distinct improvement of El Nino-forced boreal winter surface air temperature, sea-level pressure, and precipitation rate anomalies in the IPCC models occurs. This improvement is directly proportional to the skill of the tropical El Nino-forced precipitation anomalies.
The agricultural model intercomparison and improvement project (AgMIP): Protocols and pilot studies
USDA-ARS?s Scientific Manuscript database
The Agricultural Model Intercomparison and Improvement Project (AgMIP) is a distributed climate-scenario simulation research activity for historical period model intercomparison and future climate change conditions with participation of multiple crop and agricultural economic model groups around the...
NASA Technical Reports Server (NTRS)
Orme, John S.; Schkolnik, Gerard S.
1995-01-01
Performance Seeking Control (PSC), an onboard, adaptive, real-time optimization algorithm, relies upon an onboard propulsion system model. Flight results illustrated propulsion system performance improvements as calculated by the model. These improvements were subject to uncertainty arising from modeling error. Thus to quantify uncertainty in the PSC performance improvements, modeling accuracy must be assessed. A flight test approach to verify PSC-predicted increases in thrust (FNP) and absolute levels of fan stall margin is developed and applied to flight test data. Application of the excess thrust technique shows that increases of FNP agree to within 3 percent of full-scale measurements for most conditions. Accuracy to these levels is significant because uncertainty bands may now be applied to the performance improvements provided by PSC. Assessment of PSC fan stall margin modeling accuracy was completed with analysis of in-flight stall tests. Results indicate that the model overestimates the stall margin by between 5 to 10 percent. Because PSC achieves performance gains by using available stall margin, this overestimation may represent performance improvements to be recovered with increased modeling accuracy. Assessment of thrust and stall margin modeling accuracy provides a critical piece for a comprehensive understanding of PSC's capabilities and limitations.
PconsFold: improved contact predictions improve protein models.
Michel, Mirco; Hayat, Sikander; Skwark, Marcin J; Sander, Chris; Marks, Debora S; Elofsson, Arne
2014-09-01
Recently it has been shown that the quality of protein contact prediction from evolutionary information can be improved significantly if direct and indirect information is separated. Given sufficiently large protein families, the contact predictions contain sufficient information to predict the structure of many protein families. However, contact prediction methods have improved since those first studies. Here, we ask how much the final models improve if improved contact predictions are used. In a small benchmark of 15 proteins, we show that the TM-scores of top-ranked models are improved by on average 33% using PconsFold compared with the original version of EVfold. In a larger benchmark, we find that the quality is improved by 15-30% when using PconsC in comparison with earlier contact prediction methods. Further, using Rosetta instead of CNS does not significantly improve global model accuracy, but the chemistry of models generated with Rosetta is improved. PconsFold is a fully automated pipeline for ab initio protein structure prediction based on evolutionary information. PconsFold is based on PconsC contact prediction and uses the Rosetta folding protocol. Due to its modularity, the contact prediction tool can be easily exchanged. The source code of PconsFold is available on GitHub at https://www.github.com/ElofssonLab/pcons-fold under the MIT license. PconsC is available from http://c.pcons.net/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
Nearshore Current Model Workshop Summary.
1983-09-01
a. ... dissipation, and wave-current interaction. b. Incorporation into models of wave-breaking. c. Parameterization of turbulence in models. d. Incorporation into models of surf zone energy dissipation. e. Methods to specify waves and currents on the boundaries of the grid. f. Incorporation into models of ... also recommended. Improvements should include nonlinear and irregular wave effects and improved models of wave-breaking and wave energy dissipation in ...
NASA Technical Reports Server (NTRS)
Guenther, D. B.; Demarque, P.; Kim, Y.-C.; Pinsonneault, M. H.
1992-01-01
A set of solar models has been constructed, each based on a single modification to the physics of a reference solar model. In addition, a model combining several of the improvements has been calculated to provide a best solar model. Improvements were made to the nuclear reaction rates, the equation of state, the opacities, and the treatment of the atmosphere. The impact of these improvements on both the structure of the model and the frequencies of its low-l p-modes is discussed. It is found that the combined solar model, which is based on the best physics available (and does not contain any ad hoc assumptions), reproduces the observed oscillation spectrum (for low l) within the errors associated with the uncertainties in the model physics (primarily the opacities).
Process of Continual Improvement in a School of Nursing.
ERIC Educational Resources Information Center
Norman, Linda D.; Lutenbacher, Melanie
1996-01-01
Vanderbilt University School of Nursing used the Batalden model of systems improvement to change its program. The model analyzes services and products, customers, social community need, and customer knowledge to approach improvements in a systematic way. (JOW)
Lee, Seung Yup; Skolnick, Jeffrey
2007-07-01
To improve the accuracy of TASSER models, especially in the limit where threading-provided template alignments are of poor quality, we have developed the TASSER(iter) algorithm, which uses the templates and contact restraints from TASSER-generated models for iterative structure refinement. We apply TASSER(iter) to a large benchmark set of 2,773 nonhomologous single-domain proteins that are ≤200 residues in length and that cover the PDB at the level of 35% pairwise sequence identity. Overall, TASSER(iter) models have a smaller global average RMSD of 5.48 Å compared with the 5.81 Å RMSD of the original TASSER models. Classifying the targets by the level of prediction difficulty (where Easy targets have a good template with a corresponding good threading alignment, Medium targets have a good template but a poor alignment, and Hard targets have an incorrectly identified template), TASSER(iter) (TASSER) models have an average RMSD of 4.15 Å (4.35 Å) for the Easy set and 9.05 Å (9.52 Å) for the Hard set. The largest reduction in average RMSD is for the Medium set, where the TASSER(iter) models have an average global RMSD of 5.67 Å compared with 6.72 Å for the TASSER models. Seventy percent of the Medium set TASSER(iter) models have a smaller RMSD than the TASSER models, while 63% of the Easy and 60% of the Hard TASSER models are improved by TASSER(iter). For the foldable cases, where the targets have an RMSD to the native structure of <6.5 Å, TASSER(iter) shows obvious improvement over TASSER models: for the Medium set, it improves the success rate from 57.0 to 67.2%, followed by the Hard targets, where the success rate improves from 32.0 to 34.8%, with the smallest improvement for the Easy targets, from 82.6 to 84.0%. These results suggest that TASSER(iter) can provide more reliable predictions for targets of Medium difficulty, a range that had resisted improvement in the quality of protein structure predictions. 2007 Wiley-Liss, Inc.
An Improved BeiDou-2 Satellite-Induced Code Bias Estimation Method.
Fu, Jingyang; Li, Guangyun; Wang, Li
2018-04-27
Different from GPS, GLONASS, GALILEO and BeiDou-3, it is confirmed that code multipath biases (CMB), which originate from the satellite end and can exceed 1 m, are commonly found in the code observations of BeiDou-2 (BDS) IGSO and MEO satellites. In order to mitigate their adverse effects on absolute precise applications that use the code measurements, we propose in this paper an improved correction model to estimate the CMB. Unlike the traditional model, which considers the correction values to be orbit-type dependent (estimating two sets of values for IGSO and MEO, respectively) and models the CMB as a piecewise linear function with an elevation node separation of 10°, we estimate the corrections for each BDS IGSO + MEO satellite on one hand, and use a denser elevation node separation of 5° to model the CMB variations on the other. Currently, institutions such as IGS-MGEX operate over 120 stations that provide daily BDS observations. These large amounts of data provide adequate support to refine the CMB estimation satellite by satellite in our improved model. One month of BDS observations from MGEX is used to assess the performance of the improved CMB model by means of precise point positioning (PPP). Experimental results show that for satellites on the same orbit type, obvious differences can be found in the CMB at the same node and frequency. Results show that the new correction model can improve the wide-lane (WL) ambiguity usage rate for WL fractional cycle bias estimation, shorten the WL and narrow-lane (NL) time to first fix (TTFF) in PPP ambiguity resolution (AR), and improve the PPP positioning accuracy. With our improved correction model, the usage of WL ambiguities is increased from 94.1% to 96.0%, and the WL and NL TTFF of PPP AR are shortened from 10.6 to 9.3 min and from 67.9 to 63.3 min, respectively, compared with the traditional correction model.
In addition, both the traditional and improved CMB models perform better in these respects than a model that does not account for the CMB correction.
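The structure of the improved model, a per-satellite, piecewise linear correction on 5° elevation nodes, can be sketched as follows; the node values below are hypothetical, not the paper's estimates:

```python
def cmb_correction(elevation_deg, node_values, node_step=5.0):
    """Interpolate a satellite-specific code multipath bias (CMB)
    correction (metres) from piecewise-linear elevation nodes.

    node_values[i] is the correction at elevation i * node_step degrees;
    values outside the node range are clamped to the end nodes.
    """
    if elevation_deg <= 0.0:
        return node_values[0]
    max_elev = (len(node_values) - 1) * node_step
    if elevation_deg >= max_elev:
        return node_values[-1]
    i = int(elevation_deg // node_step)
    frac = (elevation_deg - i * node_step) / node_step
    return node_values[i] * (1.0 - frac) + node_values[i + 1] * frac

# Hypothetical node values (metres) for one BDS IGSO satellite,
# at elevations 0°, 5°, ..., 90° (19 nodes, 5° separation).
nodes = [0.65, 0.58, 0.49, 0.41, 0.33, 0.26, 0.20, 0.15, 0.10,
         0.06, 0.03, 0.00, -0.02, -0.04, -0.05, -0.06, -0.06, -0.07, -0.07]

# Applying the correction: subtract it from the raw code pseudorange.
corrected = lambda pseudorange, elev: pseudorange - cmb_correction(elev, nodes)
print(cmb_correction(12.5, nodes))  # midway between the 10° and 15° nodes
```

The traditional model would use one such node table per orbit type (IGSO or MEO) with 10° spacing; the refinement is one table per satellite with 5° spacing.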
Dickie, Ben R; Banerji, Anita; Kershaw, Lucy E; McPartlin, Andrew; Choudhury, Ananya; West, Catharine M; Rose, Chris J
2016-10-01
To improve the accuracy and precision of tracer kinetic model parameter estimates for use in dynamic contrast-enhanced (DCE) MRI studies of solid tumors. Quantitative DCE-MRI requires an estimate of precontrast T1, which is obtained prior to fitting a tracer kinetic model. As T1 mapping and tracer kinetic signal models are both a function of precontrast T1, it was hypothesized that joint estimation would improve the accuracy and precision of both precontrast T1 and tracer kinetic model parameters. Accuracy and/or precision of two-compartment exchange model (2CXM) parameters were evaluated for standard and joint fitting methods in well-controlled synthetic data and for 36 bladder cancer patients. Methods were compared under a number of experimental conditions. In synthetic data, joint estimation led to statistically significant improvements in the accuracy of estimated parameters in 30 of 42 conditions (improvements between 1.8% and 49%). Reduced accuracy was observed in 7 of the remaining 12 conditions. Significant improvements in precision were observed in 35 of 42 conditions (between 4.7% and 50%). In clinical data, significant improvements in precision were observed in 18 of 21 conditions (between 4.6% and 38%). Accuracy and precision of DCE-MRI parameter estimates are improved when signal models are fit jointly rather than sequentially. Magn Reson Med 76:1270-1281, 2016. © 2015 Wiley Periodicals, Inc.
Reviews and syntheses: Four decades of modeling methane cycling in terrestrial ecosystems
NASA Astrophysics Data System (ADS)
Xu, Xiaofeng; Yuan, Fengming; Hanson, Paul J.; Wullschleger, Stan D.; Thornton, Peter E.; Riley, William J.; Song, Xia; Graham, David E.; Song, Changchun; Tian, Hanqin
2016-06-01
Over the past 4 decades, a number of numerical models have been developed to quantify the magnitude, investigate the spatial and temporal variations, and understand the underlying mechanisms and environmental controls of methane (CH4) fluxes within terrestrial ecosystems. These CH4 models are also used for integrating multi-scale CH4 data, such as laboratory-based incubation and molecular analysis, field observational experiments, remote sensing, and aircraft-based measurements across a variety of terrestrial ecosystems. Here we summarize 40 terrestrial CH4 models to characterize their strengths and weaknesses and to suggest a roadmap for future model improvement and application. Our key findings are that (1) the focus of CH4 models has shifted from theoretical to site- and regional-level applications over the past 4 decades, (2) large discrepancies exist among models in terms of representing CH4 processes and their environmental controls, and (3) significant data-model and model-model mismatches are partially attributed to different representations of landscape characterization and inundation dynamics. Three areas for future improvements and applications of terrestrial CH4 models are that (1) CH4 models should more explicitly represent the mechanisms underlying land-atmosphere CH4 exchange, with an emphasis on improving and validating individual CH4 processes over depth and horizontal space, (2) models should be developed that are capable of simulating CH4 emissions across highly heterogeneous spatial and temporal scales, particularly hot moments and hotspots, and (3) efforts should be invested to develop model benchmarking frameworks that can easily be used for model improvement, evaluation, and integration with data from molecular to global scales. These improvements in CH4 models would be beneficial for the Earth system models and further simulation of climate-carbon cycle feedbacks.
Improvement on a simplified model for protein folding simulation.
Zhang, Ming; Chen, Changjun; He, Yi; Xiao, Yi
2005-11-01
Improvements were made on a simplified protein model, the Ramachandran model, to achieve better computer simulation of protein folding. To check the validity of these improvements, we chose the ultrafast-folding protein Engrailed Homeodomain as an example and explored several aspects of its folding. The Engrailed Homeodomain is a mainly alpha-helical protein of 61 residues from Drosophila melanogaster. We found that the simplified model of Engrailed Homeodomain can fold into a global minimum state with a tertiary structure in good agreement with its native structure.
An Improved K-Epsilon Model for Near-Wall Turbulence and Comparison with Direct Numerical Simulation
NASA Technical Reports Server (NTRS)
Shih, T. H.
1990-01-01
An improved k-epsilon model for low Reynolds number turbulence near a wall is presented. The near-wall asymptotic behavior of the eddy viscosity and the pressure transport term in the turbulent kinetic energy equation is analyzed. Based on this analysis, a modified eddy viscosity model, having correct near-wall behavior, is suggested, and a model for the pressure transport term in the k-equation is proposed. In addition, a modeled dissipation rate equation is reformulated. Fully developed channel flows were used for model testing. The calculations using various k-epsilon models are compared with direct numerical simulations. The results show that the present k-epsilon model performs well in predicting the behavior of near-wall turbulence. Significant improvement over previous k-epsilon models is obtained.
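A minimal sketch of the kind of near-wall modification discussed above: the high-Reynolds-number eddy viscosity C_mu k^2/epsilon is multiplied by a damping function so that nu_t has the correct behavior at the wall. The Van Driest-style damping used below is a generic illustration, not Shih's actual formulation:

```python
import math

C_MU = 0.09  # standard k-epsilon model constant

def eddy_viscosity(k, eps, y_plus, a_plus=26.0):
    """Low-Reynolds-number eddy viscosity nu_t = C_mu * f_mu * k^2 / eps.

    f_mu is a generic Van Driest-style damping function, used here only
    to illustrate the idea of enforcing correct near-wall asymptotics;
    f_mu -> 0 at the wall and -> 1 far from it, recovering the
    standard high-Re form C_mu * k^2 / eps.
    """
    f_mu = (1.0 - math.exp(-y_plus / a_plus)) ** 2
    return C_MU * f_mu * k * k / eps

# Far from the wall (large y+) the damping is negligible:
print(eddy_viscosity(1.0, 0.5, 1000.0))  # approximately C_MU * 1.0 / 0.5
```

Model testing against fully developed channel flow, as in the paper, amounts to comparing the mean-velocity and turbulence profiles produced with such a nu_t against DNS data.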
NASA Astrophysics Data System (ADS)
Wei, Jiangfeng; Dirmeyer, Paul A.; Yang, Zong-Liang; Chen, Haishan
2017-10-01
Through a series of model simulations with an atmospheric general circulation model coupled to three different land surface models, this study investigates the impacts of land model ensembles and coupled model ensemble on precipitation simulation. It is found that coupling an ensemble of land models to an atmospheric model has a very minor impact on the improvement of precipitation climatology and variability, but a simple ensemble average of the precipitation from three individually coupled land-atmosphere models produces better results, especially for precipitation variability. The generally weak impact of land processes on precipitation should be the main reason that the land model ensembles do not improve precipitation simulation. However, if there are big biases in the land surface model or land surface data set, correcting them could improve the simulated climate, especially for well-constrained regional climate simulations.
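The ensemble-averaging step that produced the better precipitation statistics amounts to a point-wise mean across the individually coupled runs; a minimal sketch with hypothetical values:

```python
def ensemble_mean(series_list):
    """Point-wise ensemble average of precipitation series from several
    coupled land-atmosphere simulations (one list per model run)."""
    return [sum(vals) / len(vals) for vals in zip(*series_list)]

# Hypothetical monthly precipitation (mm/day) at one grid point from
# three coupled runs, each using a different land surface model.
run_a = [2.1, 3.4, 5.0, 4.2]
run_b = [1.8, 3.9, 4.6, 4.8]
run_c = [2.4, 3.1, 5.3, 3.9]
print(ensemble_mean([run_a, run_b, run_c]))
```

Because the three runs have partly independent errors, this simple average tends to damp model-specific noise, which is consistent with the study's finding that it improves the simulated precipitation variability more than coupling a land-model ensemble does.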
An improved Multimodel Approach for Global Sea Surface Temperature Forecasts
NASA Astrophysics Data System (ADS)
Khan, M. Z. K.; Mehrotra, R.; Sharma, A.
2014-12-01
The concept of ensemble combinations for formulating improved climate forecasts has gained popularity in recent years. However, many climate models share similar physics or modeling processes, which may lead to similar (or strongly correlated) forecasts. Recent approaches for combining forecasts that take into consideration differences in model accuracy over space and time have either ignored the similarity of forecasts among the models or followed a pairwise dynamic combination approach. Here we present a basis for combining model predictions, illustrating the improvements that can be achieved if procedures for factoring in inter-model dependence are utilised. The utility of the approach is demonstrated by combining sea surface temperature (SST) forecasts from five climate models over the period 1960-2005. The variable of interest, the monthly global sea surface temperature anomaly (SSTA) on a 5° × 5° latitude-longitude grid, is predicted three months in advance to demonstrate the utility of the proposed algorithm. Results indicate that the proposed approach offers consistent and significant improvements for the majority of grid points compared with the case where the dependence among the models is ignored. Therefore, the proposed approach of combining multiple models by taking into account their interdependence provides an attractive alternative for obtaining improved climate forecasts. In addition, an approach to combine seasonal forecasts from multiple climate models with varying periods of availability is also demonstrated.
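One standard way to factor in inter-model dependence (the paper's exact weighting scheme may differ) is to derive minimum-variance combination weights from the forecast-error covariance across models; a self-contained sketch:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting, for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def dependence_aware_weights(err_cov):
    """Minimum-variance combination weights w = S^-1 1 / (1' S^-1 1),
    where S is the forecast-error covariance matrix across models.
    Off-diagonal terms down-weight models whose errors are correlated
    (e.g. models sharing similar physics)."""
    n = len(err_cov)
    z = solve(err_cov, [1.0] * n)
    s = sum(z)
    return [zi / s for zi in z]

# Hypothetical error covariance for three SST models at one grid point:
# models 1 and 2 share physics (correlated errors); model 3 is independent.
S = [[1.0, 0.8, 0.0],
     [0.8, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
w = dependence_aware_weights(S)
print([round(x, 3) for x in w])  # the independent model gets the largest weight
```

Ignoring dependence (equal weights of 1/3) would over-count the two correlated models; the covariance-aware weights shift weight toward the model carrying independent information.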
Wandersman, Abraham; Alia, Kassandra Ann; Cook, Brittany; Ramaswamy, Rohit
2015-10-01
While the body of evidence-based healthcare interventions grows, the ability of health systems to deliver these interventions effectively and efficiently lags behind. Quality improvement approaches, such as the model for improvement, have demonstrated some success in healthcare but their impact has been lessened by implementation challenges. To help address these challenges, we describe the empowerment evaluation approach that has been developed by programme evaluators and a method for its application (Getting To Outcomes (GTO)). We then describe how GTO can be used to implement healthcare interventions. An illustrative healthcare quality improvement example that compares the model for improvement and the GTO method for reducing hospital admissions through improved diabetes care is described. We conclude with suggestions for integrating GTO and the model for improvement. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
An improved large signal model of InP HEMTs
NASA Astrophysics Data System (ADS)
Li, Tianhao; Li, Wenjun; Liu, Jun
2018-05-01
An improved large-signal model for InP HEMTs is proposed in this paper. The channel current and charge model equations are constructed based on the Angelov model equations. Both the channel current and gate charge model equations are continuous and differentiable to high order, and the proposed gate charge model satisfies charge conservation. To capture the strong leakage-induced barrier reduction effect of InP HEMTs, the Angelov current model equations are improved, so that the channel current model fits the DC performance of the device well. A 2 × 25 μm × 70 nm InP HEMT device is used to demonstrate the extraction and validation of the model, which accurately predicts the DC I-V and C-V characteristics and the bias-dependent S-parameters. Project supported by the National Natural Science Foundation of China (No. 61331006).
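For orientation, the basic (unimproved) Angelov channel-current equation that the paper builds on can be sketched as below. The parameter values here are hypothetical placeholders, not values extracted from a real InP HEMT, and the higher-order power-series terms of the full model are omitted.

```python
import numpy as np

def angelov_ids(vgs, vds, ipk=0.02, vpk=-0.2, p1=2.0, alpha=2.5, lam=0.05):
    """Basic Angelov (Chalmers) drain-current equation:
    Ids = Ipk * (1 + tanh(psi)) * (1 + lambda*Vds) * tanh(alpha*Vds),
    with psi truncated to its first-order term P1*(Vgs - Vpk)."""
    psi = p1 * (vgs - vpk)
    return ipk * (1.0 + np.tanh(psi)) * (1.0 + lam * vds) * np.tanh(alpha * vds)

vds = np.linspace(0.0, 1.2, 7)
ids_on = angelov_ids(0.2, vds)    # gate well above Vpk: channel on
ids_off = angelov_ids(-1.0, vds)  # gate below Vpk: channel nearly off
print(ids_on[-1], ids_off[-1])
```

The tanh construction keeps the current and its derivatives continuous over all bias points, which is the smoothness property the improved model preserves.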
Creating High Reliability in Health Care Organizations
Pronovost, Peter J; Berenholtz, Sean M; Goeschel, Christine A; Needham, Dale M; Sexton, J Bryan; Thompson, David A; Lubomski, Lisa H; Marsteller, Jill A; Makary, Martin A; Hunt, Elizabeth
2006-01-01
Objective The objective of this paper was to present a comprehensive approach to help health care organizations reliably deliver effective interventions. Context Reliability in healthcare translates into using valid rate-based measures. Yet high reliability organizations have proven that the context in which care is delivered, called organizational culture, also has important influences on patient safety. Model for Improvement Our model to improve reliability, which also includes interventions to improve culture, focuses on valid rate-based measures. This model includes (1) identifying evidence-based interventions that improve the outcome, (2) selecting interventions with the most impact on outcomes and converting to behaviors, (3) developing measures to evaluate reliability, (4) measuring baseline performance, and (5) ensuring patients receive the evidence-based interventions. The comprehensive unit-based safety program (CUSP) is used to improve culture and guide organizations in learning from mistakes that are important, but cannot be measured as rates. Conclusions We present how this model was used in over 100 intensive care units in Michigan to improve culture and eliminate catheter-related blood stream infections—both were accomplished. Our model differs from existing models in that it incorporates efforts to improve a vital component for system redesign—culture, it targets 3 important groups—senior leaders, team leaders, and front line staff, and facilitates change management—engage, educate, execute, and evaluate for planned interventions. PMID:16898981
NASA Astrophysics Data System (ADS)
Ahmad, Sabrina; Jalil, Intan Ermahani A.; Ahmad, Sharifah Sakinah Syed
2016-08-01
It is seldom technical issues that impede the process of eliciting software requirements. The involvement of multiple stakeholders usually leads to conflicts, and therefore conflict detection and resolution efforts are crucial. This paper forwards an improved conceptual model to assist the conflict detection and resolution effort, extending the ability of current models and improving overall performance. The significance of the new model is that it enables automated detection of conflicts, and of their severity levels, through rule-based reasoning.
Improving Water Level and Soil Moisture Over Peatlands in a Global Land Modeling System
NASA Technical Reports Server (NTRS)
Bechtold, M.; De Lannoy, G. J. M.; Roose, D.; Reichle, R. H.; Koster, R. D.; Mahanama, S. P.
2017-01-01
A new model structure for peatlands results in improved skill metrics without any parameter calibration. Simulated surface soil moisture is strongly affected by the new model, but reliable soil moisture data are lacking for validation.
Model improvements to simulate charging in SEM
NASA Astrophysics Data System (ADS)
Arat, K. T.; Klimpel, T.; Hagen, C. W.
2018-03-01
Charging of insulators is a complex phenomenon to simulate, since the accuracy of the simulations is very sensitive to the interaction of electrons with matter and electric fields. In this study, we report model improvements for a previously developed Monte Carlo simulator to more accurately simulate samples that charge. The improvements cover both the modelling of low-energy electron scattering and the charging of insulators. The new first-principles scattering models provide a more realistic charge distribution cloud in the material and a better match between non-charging simulations and experimental results. Improvements to the charging models focus mainly on redistribution of the charge carriers in the material through electron-beam-induced conductivity (EBIC) and a breakdown model, leading to a smoother distribution of the charges. Combined with a more accurate tracing of low-energy electrons in the electric field, we managed to reproduce the dynamically changing charging contrast due to an induced positive surface potential.
Adjusting the Stems Regional Forest Growth Model to Improve Local Predictions
W. Brad Smith
1983-01-01
A simple procedure using double sampling is described for adjusting growth in the STEMS regional forest growth model to compensate for subregional variations. Predictive accuracy of the STEMS model (a distance-independent, individual-tree growth model for Lake States forests) was improved by using this procedure.
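A double-sampling adjustment of this kind can be sketched as a ratio estimator: remeasure growth on a small subsample, then scale the regional model's predictions by the observed-to-predicted ratio. The tree counts, bias level, and distributions below are invented for illustration, not taken from the STEMS study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical regional-model growth predictions for 500 trees, where the
# regional model over-predicts local growth by ~15% (a subregional bias).
predicted = rng.gamma(shape=4.0, scale=0.5, size=500)
observed = predicted / 1.15 * rng.normal(1.0, 0.05, size=500)

# Double sampling: growth is only remeasured on a small subsample of trees.
sub = rng.choice(predicted.size, size=60, replace=False)

# Ratio-of-means adjustment factor from the subsample, applied region-wide.
ratio = observed[sub].mean() / predicted[sub].mean()
adjusted = ratio * predicted

bias_before = abs(predicted.mean() - observed.mean())
bias_after = abs(adjusted.mean() - observed.mean())
print(ratio, bias_before, bias_after)
```

The cheap first-phase sample (model predictions for every tree) carries the spatial detail, while the expensive second-phase sample (remeasured trees) corrects the subregional bias.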
Improved Solar-Radiation-Pressure Models for GPS Satellites
NASA Technical Reports Server (NTRS)
Bar-Sever, Yoaz; Kuang, Da
2006-01-01
A report describes a series of computational models conceived as an improvement over prior models for determining effects of solar-radiation pressure on orbits of Global Positioning System (GPS) satellites. These models are based on fitting coefficients of Fourier functions of Sun-spacecraft- Earth angles to observed spacecraft orbital motions.
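The core fitting step described above can be sketched as an ordinary least-squares fit of Fourier coefficients against a Sun-spacecraft-Earth angle. The sampled angle grid, the "true" coefficients, and the noise level below are synthetic assumptions, not GPS data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical along-track SRP acceleration sampled over one revolution of a
# Sun-spacecraft-Earth angle; the underlying signal is a low-order Fourier series.
eps = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
truth = 1.0 + 0.5 * np.cos(eps) - 0.2 * np.sin(2.0 * eps)
accel = truth + 0.01 * rng.normal(size=eps.size)

# Design matrix of Fourier terms up to order 2, fit by linear least squares.
order = 2
cols = [np.ones_like(eps)]
for k in range(1, order + 1):
    cols += [np.cos(k * eps), np.sin(k * eps)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, accel, rcond=None)  # [c0, cos1, sin1, cos2, sin2]
print(np.round(coef, 2))
```

Because the model is linear in the Fourier coefficients, the fit reduces to a single least-squares solve; in the real application the "observations" are residual accelerations inferred from tracked orbital motion rather than direct measurements.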
Business Models for Training and Performance Improvement Departments
ERIC Educational Resources Information Center
Carliner, Saul
2004-01-01
Although typically applied to entire enterprises, the concept of business models applies to training and performance improvement groups. Business models are "the method by which firm[s] build and use [their] resources to offer.. value." Business models affect the types of projects, services offered, skills required, business processes, and type of…
Protein homology model refinement by large-scale energy optimization.
Park, Hahnbeom; Ovchinnikov, Sergey; Kim, David E; DiMaio, Frank; Baker, David
2018-03-20
Proteins fold to their lowest free-energy structures, and hence the most straightforward way to increase the accuracy of a partially incorrect protein structure model is to search for the lowest-energy nearby structure. This direct approach has met with little success for two reasons: first, energy function inaccuracies can lead to false energy minima, resulting in model degradation rather than improvement; and second, even with an accurate energy function, the search problem is formidable because the energy only drops considerably in the immediate vicinity of the global minimum, and there are a very large number of degrees of freedom. Here we describe a large-scale energy optimization-based refinement method that incorporates advances in both search and energy function accuracy that can substantially improve the accuracy of low-resolution homology models. The method refined low-resolution homology models into correct folds for 50 of 84 diverse protein families and generated improved models in recent blind structure prediction experiments. Analyses of the basis for these improvements reveal contributions from both the improvements in conformational sampling techniques and the energy function.
Learning epistatic interactions from sequence-activity data to predict enantioselectivity
NASA Astrophysics Data System (ADS)
Zaugg, Julian; Gumulya, Yosephine; Malde, Alpeshkumar K.; Bodén, Mikael
2017-12-01
Enzymes with a high selectivity are desirable for improving the economics of chemical synthesis of enantiopure compounds. To improve enzyme selectivity, mutations are often introduced near the catalytic active site. In this compact environment, epistatic interactions between residues, where contributions to selectivity are non-additive, play a significant role in determining the degree of selectivity. Using support vector machine regression models, we map mutations to the experimentally characterised enantioselectivities for a set of 136 variants of the epoxide hydrolase from the fungus Aspergillus niger (AnEH). We investigate whether the influence a mutation has on enzyme selectivity can be accurately predicted through linear models, and whether prediction accuracy can be improved using higher-order counterparts. Comparing linear and polynomial (degree = 2) models, mean Pearson coefficients (r) from 50 × 5-fold cross-validation increase from 0.84 to 0.91, respectively. Equivalent models tested on interaction-minimised sequences achieve values of r = 0.90 and r = 0.93. As expected, testing on a simulated control data set with no interactions results in no significant improvements from higher-order models. Additional experimentally derived AnEH mutants are tested with linear and polynomial (degree = 2) models, with values increasing from r = 0.51 to r = 0.87, respectively. The study demonstrates that linear models perform well; however, the representation of epistatic interactions in predictive models improves the identification of selectivity-enhancing mutations. The improvement is attributed to higher-order kernel functions that represent epistatic interactions between residues.
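The linear-versus-degree-2-kernel comparison can be illustrated on synthetic sequence-activity data with a planted pairwise interaction. For a self-contained sketch, kernel ridge regression stands in for the paper's SVM regression (both use the same kernels); the mutation encoding, effect sizes, and noise level are invented for illustration.

```python
import numpy as np

def kernel_ridge_predict(k_train, k_test, y, lam=1e-3):
    """Kernel ridge regression: solve (K + lam*I) alpha = y, predict K_test @ alpha."""
    alpha = np.linalg.solve(k_train + lam * np.eye(len(y)), y)
    return k_test @ alpha

rng = np.random.default_rng(0)

# Hypothetical variants: binary indicators for mutations at 8 positions, with
# additive effects plus one strong epistatic (pairwise) interaction term.
X = rng.integers(0, 2, size=(400, 8)).astype(float)
effects = rng.normal(size=8)
y = X @ effects + 3.0 * X[:, 0] * X[:, 1] + 0.05 * rng.normal(size=400)

Xtr, Xte, ytr, yte = X[:300], X[300:], y[:300], y[300:]

scores = {}
for name, kernel in [("linear", lambda A, B: A @ B.T),
                     ("poly2", lambda A, B: (A @ B.T + 1.0) ** 2)]:
    pred = kernel_ridge_predict(kernel(Xtr, Xtr), kernel(Xte, Xtr), ytr)
    scores[name] = np.corrcoef(pred, yte)[0, 1]
print(scores)
```

The degree-2 kernel spans all pairwise products of mutation indicators, so it can represent the planted epistatic term that the linear kernel cannot, mirroring the r improvement reported in the abstract.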
NASA Technical Reports Server (NTRS)
Kypuros, Javier A.; Colson, Rodrigo; Munoz, Afredo
2004-01-01
This paper describes efforts to improve dynamic temperature estimations of a turbine tip clearance system to facilitate the design of a generalized tip clearance controller. This work builds upon previously presented research and focuses primarily on improving dynamic temperature estimations of the primary components affecting tip clearance (i.e., the rotor, blades, and casing/shroud). The temperature profiles estimated by the previous model iteration, specifically for the rotor and blades, were found to be inaccurate and, more importantly, insufficient to facilitate controller design. Some assumptions made to facilitate the previous results were not valid, and thus improvements are presented here to better match the physical reality. As will be shown, the improved temperature sub-models match a commercially validated model and are sufficiently simplified to aid in controller design.
NASA Technical Reports Server (NTRS)
Quattrochi, D. A.; Lapenta, W. M.; Crosson, W. L.; Estes, M. G., Jr.; Limaye, A.; Kahn, M.
2006-01-01
Local and state agencies are responsible for developing state implementation plans to meet National Ambient Air Quality Standards. Numerical models used for this purpose simulate the transport and transformation of criteria pollutants and their precursors. The specification of land use/land cover (LULC) plays an important role in controlling modeled surface meteorology and emissions. NASA researchers have worked with partners and Atlanta stakeholders to incorporate an improved high-resolution LULC dataset for the Atlanta area within their modeling system and to assess meteorological and air quality impacts of Urban Heat Island (UHI) mitigation strategies. The new LULC dataset provides a more accurate representation of land use, has the potential to improve model accuracy, and facilitates prediction of LULC changes. Use of the new LULC dataset for two summertime episodes improved meteorological forecasts, with an existing daytime cold bias of approximately 3 °C reduced by 30%. Model performance for ozone prediction did not show improvement. In addition, LULC changes due to Atlanta area urbanization were predicted through 2030, for which model simulations predict higher urban air temperatures. The incorporation of UHI mitigation strategies partially offset this warming trend. The data and modeling methods used are generally applicable to other U.S. cities.
Improvement of the model for surface process of tritium release from lithium oxide
NASA Astrophysics Data System (ADS)
Yamaki, Daiju; Iwamoto, Akira; Jitsukawa, Shiro
2000-12-01
Among the various tritium transport processes in lithium ceramics, the importance and the detailed mechanism of surface reactions remain to be elucidated. A dynamic adsorption and desorption model for tritium desorption from lithium ceramics, especially Li2O, was constructed. From the experimental results, it was considered that both H2 and H2O are dissociatively adsorbed on Li2O and generate OH- on the surface. In the first model, developed in 1994, it was assumed that the dissociative adsorption of either H2 or H2O on Li2O generates two OH- on the surface. However, recent calculation results show that the generation of one OH- and one H- is more stable than that of two OH-s for the dissociative adsorption of H2. Therefore, the assumption of H2 adsorption and desorption in the first model was revised, and the tritium release behaviour from the Li2O surface was evaluated again using the improved model. The tritium residence time on the Li2O surface was calculated using the improved model, and the results were compared with the experimental results. The calculation results using the improved model agree better with the experimental results than those using the first model.
NASA Astrophysics Data System (ADS)
Bamberger, Yael M.; Davis, Elizabeth A.
2013-01-01
This paper focuses on students' ability to transfer modelling performances across content areas, taking into consideration their improvement of content knowledge as a result of a model-based instruction. Sixty-five sixth grade students of one science teacher in an urban public school in the Midwestern USA engaged in scientific modelling practices that were incorporated into a curriculum focused on the nature of matter. Concept-process models were embedded in the curriculum, as well as emphasis on meta-modelling knowledge and modelling practices. Pre-post test items that required drawing scientific models of smell, evaporation, and friction were analysed. The level of content understanding was coded and scored, as were the following elements of modelling performance: explanation, comparativeness, abstraction, and labelling. Paired t-tests were conducted to analyse differences in students' pre-post tests scores on content knowledge and on each element of the modelling performances. These are described in terms of the amount of transfer. Students significantly improved in their content knowledge for the smell and the evaporation models, but not for the friction model, which was expected as that topic was not taught during the instruction. However, students significantly improved in some of their modelling performances for all the three models. This improvement serves as evidence that the model-based instruction can help students acquire modelling practices that they can apply in a new content area.
Dynamic physiological modeling for functional diffuse optical tomography
Diamond, Solomon Gilbert; Huppert, Theodore J.; Kolehmainen, Ville; Franceschini, Maria Angela; Kaipio, Jari P.; Arridge, Simon R.; Boas, David A.
2009-01-01
Diffuse optical tomography (DOT) is a noninvasive imaging technology that is sensitive to local concentration changes in oxy- and deoxyhemoglobin. When applied to functional neuroimaging, DOT measures hemodynamics in the scalp and brain that reflect competing metabolic demands and cardiovascular dynamics. The diffuse nature of near-infrared photon migration in tissue and the multitude of physiological systems that affect hemodynamics motivate the use of anatomical and physiological models to improve estimates of the functional hemodynamic response. In this paper, we present a linear state-space model for DOT analysis that models the physiological fluctuations present in the data with either static or dynamic estimation. We demonstrate the approach by using auxiliary measurements of blood pressure variability and heart rate variability as inputs to model the background physiology in DOT data. We evaluate the improvements accorded by modeling this physiology on ten human subjects with simulated functional hemodynamic responses added to the baseline physiology. Adding physiological modeling with a static estimator significantly improved estimates of the simulated functional response, and further significant improvements were achieved with a dynamic Kalman filter estimator (paired t tests, n = 10, P < 0.05). These results suggest that physiological modeling can improve DOT analysis. The further improvement with the Kalman filter encourages continued research into dynamic linear modeling of the physiology present in DOT. Cardiovascular dynamics also affect the blood-oxygen-dependent (BOLD) signal in functional magnetic resonance imaging (fMRI). This state-space approach to DOT analysis could be extended to BOLD fMRI analysis, multimodal studies and real-time analysis. PMID:16242967
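The dynamic estimation idea above, tracking regression coefficients for a functional stimulus alongside auxiliary physiological regressors with a Kalman filter, can be sketched on synthetic data. The regressors, gains, and noise levels below are invented stand-ins, not the paper's DOT measurements.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 500

# Hypothetical known regressors: a boxcar functional stimulus plus an auxiliary
# physiological signal (e.g. blood-pressure variability).
stim = ((np.arange(T) % 100) < 20).astype(float)
physio = np.sin(2 * np.pi * 0.1 * np.arange(T))

true_beta = np.array([1.5, 0.8])   # stimulus and physiology gains
y = true_beta[0] * stim + true_beta[1] * physio + 0.2 * rng.normal(size=T)

# Kalman filter tracking the two regression coefficients as a random walk.
x = np.zeros(2)                    # state: [beta_stim, beta_physio]
P = np.eye(2)                      # state covariance
Q, R = 1e-5 * np.eye(2), 0.2 ** 2  # process and measurement noise variances

for t in range(T):
    P = P + Q                      # predict step (random-walk state model)
    H = np.array([stim[t], physio[t]])
    S = H @ P @ H + R              # innovation variance (scalar)
    K = P @ H / S                  # Kalman gain
    x = x + K * (y[t] - H @ x)     # measurement update of the coefficients
    P = P - np.outer(K, H @ P)

print(np.round(x, 2))
```

With a small process-noise variance Q the filter behaves like a slowly forgetting recursive least-squares estimator, which is what lets it separate the functional response from the rhythmic physiological background.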
Medium- and Long-term Prediction of LOD Change by the Leap-step Autoregressive Model
NASA Astrophysics Data System (ADS)
Wang, Qijie
2015-08-01
The accuracy of medium- and long-term prediction of length-of-day (LOD) change based on the combined least-squares and autoregressive (LS+AR) method deteriorates gradually with lead time. The leap-step autoregressive (LSAR) model can significantly reduce the edge effect of the observation sequence; in particular, it greatly improves the resolution of the signal's low-frequency components, and can therefore improve prediction efficiency. In this work, LSAR is used to forecast LOD change. The LOD series from EOP 08 C04, provided by the IERS, is modeled by both the LSAR and AR models, and the results of the two models are analyzed and compared. When the prediction length is between 10 and 30 days, the accuracy improvement is less than 10%. When the prediction length exceeds 30 days, the accuracy improves markedly, by up to about 19%. The results show that the LSAR model has higher prediction accuracy and stability in medium- and long-term prediction.
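As a baseline for the comparison described above, a conventional AR forecast can be sketched with a least-squares coefficient fit. The series below is a synthetic LOD-like signal (annual plus semi-annual terms, in arbitrary ms units), not the EOP 08 C04 data, and this sketch shows the ordinary AR baseline rather than the leap-step variant.

```python
import numpy as np

def fit_ar(series, order):
    """Least-squares fit of AR(order) coefficients (intercept first)."""
    rows = [series[i:i + order] for i in range(len(series) - order)]
    A = np.column_stack([np.ones(len(rows)), np.array(rows)])
    coef, *_ = np.linalg.lstsq(A, series[order:], rcond=None)
    return coef

def forecast_ar(series, coef, steps):
    """Iterated one-step-ahead AR forecasts, feeding predictions back in."""
    order = len(coef) - 1
    hist = list(series[-order:])
    out = []
    for _ in range(steps):
        nxt = coef[0] + np.dot(coef[1:], hist[-order:])
        out.append(nxt)
        hist.append(nxt)
    return np.array(out)

rng = np.random.default_rng(5)
t = np.arange(1200)
# Hypothetical LOD-like daily series: annual + semi-annual terms plus noise.
lod = 1.0 + 0.35 * np.sin(2 * np.pi * t / 365.25) \
          + 0.15 * np.sin(2 * np.pi * t / 182.6) + 0.02 * rng.normal(size=t.size)

coef = fit_ar(lod[:1170], order=30)
pred = forecast_ar(lod[:1170], coef, steps=30)
mae = np.mean(np.abs(pred - lod[1170:]))
print(round(mae, 3))
```

The LSAR variant differs in that it fits the recursion on leap-step (subsampled) lags, which is what reduces the edge effect for low-frequency components; the fitting and iterated-forecast machinery is otherwise the same shape as here.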
Manothum, Aniruth; Rukijkanpanich, Jittra; Thawesaengskulthai, Damrong; Thampitakkul, Boonwa; Chaikittiporn, Chalermchai; Arphorn, Sara
2009-01-01
The purpose of this study was to evaluate the implementation of an Occupational Health and Safety Management Model for informal sector workers in Thailand. The studied model was characterized by participatory approaches to preliminary assessment, observation of informal business practices, group discussion and participation, and the use of environmental measurements and samples. This model consisted of four processes: capacity building, risk analysis, problem solving, and monitoring and control. The participants consisted of four local labor groups from different regions, including wood carving, hand-weaving, artificial flower making, and batik processing workers. The results demonstrated that, as a result of applying the model, the working conditions of the informal sector workers had improved to meet necessary standards. This model encouraged the use of local networks, which led to cooperation within the groups to create appropriate technologies to solve their problems. The authors suggest that this model could effectively be applied elsewhere to improve informal sector working conditions on a broader scale.
NASA Astrophysics Data System (ADS)
Arumugam, S.; Ramakrishna, P.; Sangavi, S.
2018-02-01
Improvements in heating technology with solar energy are gaining focus, especially solar parabolic collectors. Solar heating in conventional parabolic collectors is achieved by concentrating radiation on receiver tubes. Conventional receiver tubes are open to the atmosphere and lose heat to ambient air currents. In order to reduce the convection losses and also to improve the aperture area, we designed a tube with a cavity. This study compares the performance of the conventional tube and the cavity-model tube. The performance formulae for the cavity model were derived from those of the conventional model. A reduction in the overall heat loss coefficient was observed for the cavity model, though the collector heat removal factor and collector efficiency were nearly the same for both models. An improvement in efficiency was also observed in the cavity model's performance. The cavity-model tube as the receiver tube in solar parabolic collectors thus gave improved results and proved a worthwhile design consideration.
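The effect of a reduced overall heat loss coefficient can be illustrated with the standard Hottel-Whillier-Bliss collector relation. All operating values below (absorbed flux, loss coefficients, temperatures) are hypothetical round numbers for illustration, not the study's measured data.

```python
def useful_gain(area, fr, s_abs, u_loss, t_in, t_amb):
    """Hottel-Whillier-Bliss useful energy gain of a solar collector (W):
    Q_u = F_R * A * (S - U_L * (T_in - T_amb))."""
    return fr * area * (s_abs - u_loss * (t_in - t_amb))

# Hypothetical operating point; only the overall heat loss coefficient U_L
# (W/m^2/K) differs between the conventional and cavity receiver tubes.
area, fr, s_abs = 2.0, 0.9, 700.0        # m^2, heat removal factor, W/m^2 absorbed
t_in, t_amb, irradiance = 80.0, 25.0, 900.0

q_conventional = useful_gain(area, fr, s_abs, u_loss=8.0, t_in=t_in, t_amb=t_amb)
q_cavity = useful_gain(area, fr, s_abs, u_loss=5.0, t_in=t_in, t_amb=t_amb)

eff_conventional = q_conventional / (area * irradiance)
eff_cavity = q_cavity / (area * irradiance)
print(round(eff_conventional, 3), round(eff_cavity, 3))
```

With the heat removal factor held fixed, the efficiency gain comes entirely from the smaller U_L term, which matches the abstract's finding that the cavity mainly reduces the overall heat loss coefficient.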
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao
Business process modeling (BPM) is gaining attention as a means of analysing and improving business processes. BPM analyses the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value, as a TO-BE model. However, research on techniques that seamlessly connect the business process improvement obtained by BPM to the implementation of the information system is rarely reported. If the business model obtained by BPM is converted into UML, and the implementation is carried out with UML techniques, we can expect improved efficiency of information system implementation. In this paper, we describe a method of system development that converts the process model obtained by BPM into UML, and the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, a comparison is made with the case where the system is implemented by the conventional UML technique without going via BPM.
Halliwell, Emma; Dittmar, Helga
2005-09-01
This study investigates the effect of social comparisons with media models on women's body image based on either self-evaluation or self-improvement motives. Ninety-eight women, for whom appearance was a relevant comparison dimension, viewed advertisements that did, or did not, feature idealised models, after being prompted to engage in self-evaluation or self-improvement comparisons. The results indicate that, when focusing on self-evaluation, comparisons with thin models are associated with higher body-focused anxiety than viewing no model advertisements. In contrast, when focusing on self-improvement, comparisons with thin models are not associated with higher body-focused anxiety than viewing no models. Furthermore, women's general tendency to engage in social comparisons moderated the effects of self-evaluative comparisons with models, so that women who did not habitually engage in social comparisons were most strongly affected. It is suggested that motive for social comparison may explain previous inconsistencies in the experimental exposure literature and warrants more careful attention in future research.
An improved interfacial bonding model for material interface modeling
Lin, Liqiang; Wang, Xiaodu; Zeng, Xiaowei
2016-01-01
An improved interfacial bonding model was proposed from a potential-function point of view to investigate interfacial interactions in polycrystalline materials. It characterizes both attractive and repulsive interfacial interactions and can be applied to model different material interfaces. A path-dependence study of the work of separation indicates that the transformation of separation work is smooth in the normal and tangential directions, and that the proposed model guarantees the consistency of the cohesive constitutive model. The improved interfacial bonding model was verified through a simple compression test in a standard hexagonal structure. The error between analytical solutions and numerical results from the proposed model is reasonable in the linear elastic region. Ultimately, we investigated the mechanical behavior of the extrafibrillar matrix in bone, and the simulation results agreed well with experimental observations of bone fracture. PMID:28584343
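A potential-based cohesive model of the general kind described above derives tractions from a smooth potential, so one curve covers both opening (attraction up to a peak, then softening) and interpenetration (repulsion). The exponential law and parameter values below are a generic illustrative choice, not the authors' specific model.

```python
import math

def normal_traction(delta, sigma_max=100.0, delta_c=0.01):
    """Exponential cohesive law T(delta) = sigma_max * e * (delta/delta_c) * exp(-delta/delta_c):
    peak traction sigma_max at delta = delta_c, softening beyond it, and a
    negative (restoring) traction for interpenetration (delta < 0)."""
    return sigma_max * math.e * (delta / delta_c) * math.exp(-delta / delta_c)

peak = normal_traction(0.01)    # at the critical opening: the peak strength
tail = normal_traction(0.05)    # well past the peak: softened, nearly debonded
comp = normal_traction(-0.002)  # interpenetration: traction opposes overlap
print(round(peak, 6), round(tail, 6), round(comp, 6))
```

Because the traction is the derivative of a single smooth potential, the work of separation is path-consistent, which is the consistency property the abstract's work-of-separation study checks.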
New temperature model of the Netherlands from new data and novel modelling methodology
NASA Astrophysics Data System (ADS)
Bonté, Damien; Struijk, Maartje; Békési, Eszter; Cloetingh, Sierd; van Wees, Jan-Diederik
2017-04-01
Deep geothermal energy has grown in interest in Western Europe in recent decades, for direct use but also, as knowledge of the subsurface improves, for electricity generation. In the Netherlands, where the sector took off with the first system in 2005, geothermal energy is seen as a key player for a sustainable future. Knowledge of the subsurface temperature, together with the available flow from the reservoir, is an important factor that can determine the success of a geothermal energy project. To support the development of deep geothermal energy systems in the Netherlands, we made a first assessment of the subsurface temperature based on thermal data but also on geological elements (Bonté et al., 2012). An outcome of this work was ThermoGIS, which uses the temperature model. This work is a revision of the model used in ThermoGIS. The improvements over the first model are multiple: we have improved not only the dataset used for the calibration and the structural model, but also the methodology, through improved software (called b3t). The temperature dataset has been updated by integrating temperatures from newly accessible wells. The sedimentary description of the basin has been improved by using an updated and refined structural model and an improved lithological definition. A major improvement comes from the methodology used to perform the modelling: with b3t the calibration is made not only using the lithospheric parameters but also using the thermal conductivity of the sediments. The result is a much more accurate definition of the parameters for the model and a perfected handling of the calibration process. The outcome is a precise and improved temperature model of the Netherlands. The thermal conductivity variation in the sediments, associated with the geometry of the layers, is an important factor in temperature variations, and the influence of the Zechstein salt in the north of the country is important. In addition, the radiogenic heat production in the crust has a significant impact. The temperature values also identify, in the lower part of the basin, deep convective systems that could be major geothermal energy targets in the future.
Space logistics simulation: Launch-on-time
NASA Technical Reports Server (NTRS)
Nii, Kendall M.
1990-01-01
During 1989-1990 the Center for Space Construction developed the Launch-On-Time (L-O-T) Model to help assess and improve the likelihood of successfully supporting space construction requiring multiple logistics delivery flights. The model chose a reference by which the L-O-T probability, and improvements to it, can be judged. The measure of improvement was chosen as the percent reduction in E(S_N), the total expected amount of unscheduled 'hold' time. We have also previously developed an approach to determining the reduction in E(S_N) by reducing some of the causes of unscheduled holds and increasing the speed at which the problems causing the holds may be 'fixed.' We provided a mathematical (binary linear programming) model for measuring the percent reduction in E(S_N) given such improvements. In this presentation we exercise the model that was developed and draw conclusions about the methods used and the data available and needed, and make suggestions for areas of improvement in 'real world' application of the model.
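The binary decision at the heart of such a model, which improvements to fund to maximize the reduction in expected hold time under a budget, can be sketched as a small 0/1 selection problem. The improvement options, costs, and reduction percentages below are entirely hypothetical, and exhaustive search stands in for a proper binary linear programming solver.

```python
from itertools import combinations

# Hypothetical improvement options: (name, cost, % reduction in E(S_N)).
# Reductions are treated as additive, as in a simple binary linear program.
options = [
    ("faster weather recovery", 4.0, 12.0),
    ("spare avionics unit", 6.0, 18.0),
    ("extra pad crew", 3.0, 7.0),
    ("propellant line redesign", 8.0, 21.0),
]
budget = 10.0

# Brute-force search over all 0/1 selections (fine for a handful of options).
best_value, best_set = 0.0, ()
for r in range(len(options) + 1):
    for combo in combinations(options, r):
        cost = sum(c for _, c, _ in combo)
        value = sum(v for _, _, v in combo)
        if cost <= budget and value > best_value:
            best_value, best_set = value, tuple(n for n, _, _ in combo)

print(best_set, best_value)
```

For realistic option counts, the same objective and budget constraint would be handed to an integer-programming solver instead of enumerated.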
NASA Technical Reports Server (NTRS)
Kim, Y.-C.; Demarque, P.; Guenther, D. B.
1991-01-01
Improvements have been made to the Yale Rotating Stellar Evolution Code (YREC) by incorporating the Mihalas-Hummer-Daeppen equation of state, an improved opacity interpolation routine, and the effects of molecular opacities calculated at Los Alamos. The effect of each of the improvements on the standard solar model has been tested independently by computing the corresponding solar nonradial oscillation frequencies. According to these tests, the Mihalas-Hummer-Daeppen equation of state has very little effect on the model's low-l p-mode oscillation spectrum compared to the model using the existing analytical equation of state implemented in YREC. On the other hand, the molecular opacity does improve the model's oscillation spectrum, and its effect on the computed solar oscillation frequencies is much larger than that of the Mihalas-Hummer-Daeppen equation of state. Together, the two improvements to the physics reduce the discrepancy with observations by 10 microHz for the low-l modes.
Improved Modeling of Three-Point Estimates for Decision Making: Going Beyond the Triangle
2016-03-01
By Daniel W. Mulligan. March 2016. Thesis Advisor: Mark Rhoades. Master's thesis.
Athanasiou, Thanos
2016-01-01
Despite taking advantage of established learning from other industries, quality improvement initiatives in healthcare may struggle to outperform secular trends. The reasons for this are rarely explored in detail, and are often attributed merely to difficulties in engaging clinicians in quality improvement work. In a narrative review of the literature, we argue that this focus on clinicians, at the relative expense of managerial staff, has proven counterproductive. Clinical engagement is not a universal challenge; moreover, there is evidence that managers—particularly middle managers—also have a role to play in quality improvement. Yet managerial participation in quality improvement interventions is often assumed, rather than proven. We identify specific factors that influence the coordination of front-line staff and managers in quality improvement, and integrate these factors into a novel model: the model of alignment. We use this model to explore the implementation of an interdisciplinary intervention in a recent trial, describing different participation incentives and barriers for different staff groups. The extent to which clinical and managerial interests align may be an important determinant of the ultimate success of quality improvement interventions. PMID:26647411
Implementing the Mother-Baby Model of Nursing Care Using Models and Quality Improvement Tools.
Brockman, Vicki
As family-centered care has become the expected standard, many facilities follow the mother-baby model, in which care is provided to both a woman and her newborn in the same room by the same nurse. My facility employed a traditional model of nursing care, which was not evidence-based or financially sustainable. After implementing the mother-baby model, we experienced an increase in exclusive breastfeeding rates at hospital discharge, increased patient satisfaction, improved staff productivity and decreased salary costs, all while the number of births increased. Our change was successful because it was guided by the use of quality improvement tools, change theory and evidence-based practice models. © 2015 AWHONN.
ERIC Educational Resources Information Center
Maij-de Meij, Annette M.; Kelderman, Henk; van der Flier, Henk
2008-01-01
Mixture item response theory (IRT) models aid the interpretation of response behavior on personality tests and may provide possibilities for improving prediction. Heterogeneity in the population is modeled by identifying homogeneous subgroups that conform to different measurement models. In this study, mixture IRT models were applied to the…
Quan, Guo-zheng; Yu, Chun-tang; Liu, Ying-ying; Xia, Yu-feng
2014-01-01
The stress-strain data of 20MnNiMo alloy were collected from a series of hot compressions on a Gleeble-1500 thermal-mechanical simulator in the temperature range of 1173-1473 K and strain rate range of 0.01-10 s(-1). Based on the experimental data, an improved Arrhenius-type constitutive model and an artificial neural network (ANN) model were established to predict the high-temperature flow stress of as-cast 20MnNiMo alloy. The accuracy and reliability of the improved Arrhenius-type model and the trained ANN model were further evaluated in terms of the correlation coefficient (R), the average absolute relative error (AARE), and the relative error (η). For the former, R and AARE were found to be 0.9954 and 5.26%, respectively; for the latter, 0.9997 and 1.02%. The relative errors (η) of the improved Arrhenius-type model and the ANN model were in the ranges of -39.99% to 35.05% and -3.77% to 16.74%, respectively. For the former, only 16.3% of the test data set possesses η-values within ±1%, while for the latter more than 79% does. The results indicate that the ANN model presents a higher predictive ability than the improved Arrhenius-type constitutive model.
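The R and AARE statistics quoted above follow their standard definitions; a small sketch with hypothetical flow-stress values (not the paper's data):

```python
import math

# Hypothetical measured vs. predicted flow-stress values (MPa)
measured  = [120.0, 150.0, 180.0, 210.0, 240.0]
predicted = [118.0, 155.0, 176.0, 214.0, 236.0]

n = len(measured)
mean_m = sum(measured) / n
mean_p = sum(predicted) / n

# Correlation coefficient R between measured and predicted values
num = sum((m - mean_m) * (p - mean_p) for m, p in zip(measured, predicted))
den = math.sqrt(sum((m - mean_m) ** 2 for m in measured) *
                sum((p - mean_p) ** 2 for p in predicted))
R = num / den

# Average absolute relative error (AARE), in percent
aare = 100.0 / n * sum(abs((m - p) / m) for m, p in zip(measured, predicted))

print(round(R, 4), round(aare, 2))
```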
Extended charge banking model of dual path shocks for implantable cardioverter defibrillators
Dosdall, Derek J; Sweeney, James D
2008-01-01
Background Single path defibrillation shock methods have been improved through the use of the Charge Banking Model of defibrillation, which predicts the response of the heart to shocks as a simple resistor-capacitor (RC) circuit. While dual path defibrillation configurations have significantly reduced defibrillation thresholds, improvements to dual path defibrillation techniques have been limited to experimental observations without a practical model to aid in improving dual path defibrillation techniques. Methods The Charge Banking Model has been extended into a new Extended Charge Banking Model of defibrillation that represents small sections of the heart as separate RC circuits, uses a weighting factor based on published defibrillation shock field gradient measures, and implements a critical mass criteria to predict the relative efficacy of single and dual path defibrillation shocks. Results The new model reproduced the results from several published experimental protocols that demonstrated the relative efficacy of dual path defibrillation shocks. The model predicts that time between phases or pulses of dual path defibrillation shock configurations should be minimized to maximize shock efficacy. Discussion Through this approach the Extended Charge Banking Model predictions may be used to improve dual path and multi-pulse defibrillation techniques, which have been shown experimentally to lower defibrillation thresholds substantially. The new model may be a useful tool to help in further improving dual path and multiple pulse defibrillation techniques by predicting optimal pulse durations and shock timing parameters. PMID:18673561
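A minimal sketch of the extended idea described above: each heart section is modeled as an RC circuit scaled by an assumed field-gradient weight, and a shock succeeds when a critical mass of sections is captured. All constants here are hypothetical, not values from the paper:

```python
import math

# Hypothetical parameters for the Extended Charge Banking sketch
tau = 3.0e-3          # assumed membrane time constant (s)
threshold = 0.5       # assumed normalized response needed to capture a section
critical_mass = 0.9   # assumed fraction of sections that must be captured

# Assumed field-gradient weights for ten heart sections
weights = [1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.45, 0.4, 0.35, 0.3]

def captured(shock_strength, duration):
    # RC step response of each section: v(t) = w * E * (1 - exp(-t / tau));
    # returns the fraction of sections whose response reaches the threshold
    responses = [w * shock_strength * (1 - math.exp(-duration / tau)) for w in weights]
    return sum(r >= threshold for r in responses) / len(weights)

frac = captured(1.2, 8e-3)
print(frac, frac >= critical_mass)
```

Dual-path shocks would be represented by giving each section the larger of two weight sets, which is how the model explains their lower thresholds.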
Creating a Test Validated Structural Dynamic Finite Element Model of the X-56A Aircraft
NASA Technical Reports Server (NTRS)
Pak, Chan-Gi; Truong, Samson
2014-01-01
Small modeling errors in the finite element model will eventually induce errors in the structural flexibility and mass, thus propagating into unpredictable errors in the unsteady aerodynamics and the control law design. One of the primary objectives of the Multi Utility Technology Test-bed, X-56A aircraft, is the flight demonstration of active flutter suppression; therefore, in this study, the primary and secondary modes for structural model tuning are identified based on the flutter analysis of the X-56A aircraft. The ground vibration test-validated structural dynamic finite element model of the X-56A aircraft is created in this study. The structural dynamic finite element model of the X-56A aircraft is improved using a model tuning tool. In this study, two different weight configurations of the X-56A aircraft have been improved in a single optimization run. Frequency and the cross-orthogonality (mode shape) matrix were the primary focus for improvement, while other properties such as center of gravity location, total weight, and off-diagonal terms of the mass orthogonality matrix were used as constraints. The end result was an improved and more desirable structural dynamic finite element model configuration for the X-56A aircraft. Improved frequencies and mode shapes in this study increased average flutter speeds of the X-56A aircraft by 7.6% compared to the baseline model.
Creating a Test-Validated Finite-Element Model of the X-56A Aircraft Structure
NASA Technical Reports Server (NTRS)
Pak, Chan-Gi; Truong, Samson
2014-01-01
Small modeling errors in a finite-element model will eventually induce errors in the structural flexibility and mass, thus propagating into unpredictable errors in the unsteady aerodynamics and the control law design. One of the primary objectives of the X-56A Multi-Utility Technology Testbed aircraft is the flight demonstration of active flutter suppression; therefore, in this study, the primary and secondary modes for structural model tuning are identified based on the flutter analysis of the X-56A aircraft. The ground-vibration test-validated structural dynamic finite-element model of the X-56A aircraft is created in this study. The structural dynamic finite-element model of the X-56A aircraft is improved using a model-tuning tool. In this study, two different weight configurations of the X-56A aircraft have been improved in a single optimization run. Frequency and the cross-orthogonality (mode shape) matrix were the primary focus for improvement, whereas other properties such as c.g. location, total weight, and off-diagonal terms of the mass orthogonality matrix were used as constraints. The end result was an improved structural dynamic finite-element model configuration for the X-56A aircraft. Improved frequencies and mode shapes in this study increased average flutter speeds of the X-56A aircraft by 7.6% compared to the baseline model.
Data Assimilation in the Solar Wind: Challenges and First Results
NASA Astrophysics Data System (ADS)
Lang, Matthew; Browne, Phil; van Leeuwen, Peter Jan; Owens, Matt
2017-04-01
Data assimilation (DA) is currently underused in the solar wind field as a way to improve modelled variables using observations. DA has been used in Numerical Weather Prediction (NWP) models with great success, and the improvement of DA methods in NWP modelling has led to improvements in forecasting skill over the past 20-30 years. The state-of-the-art DA methods developed for NWP modelling have never been applied to space weather models, so it is important to carry over the improvements that can be gained from these methods to improve our understanding of the solar wind and how to model it. The ENLIL solar wind model has been coupled to the EMPIRE data assimilation library in order to apply these advanced DA methods to a space weather model. This coupling allows multiple DA methods to be applied to ENLIL with relative ease. I shall discuss twin experiments that apply the LETKF to the ENLIL model when a CME occurs in the observations and when it does not. These experiments show that there is potential in applying advanced data assimilation methods to the solar wind field; however, there is still a long way to go before they can be applied effectively. I shall discuss these issues and suggest potential avenues for future research in this area.
Peng, Yi; Xiong, Xiong; Adhikari, Kabindra; Knadel, Maria; Grunwald, Sabine; Greve, Mogens Humlekrog
2015-01-01
There is a great challenge in combining soil proximal spectra and remote sensing spectra to improve the accuracy of soil organic carbon (SOC) models. This is primarily because mixing of spectral data from different sources and technologies to improve soil models is still in its infancy. The first objective of this study was to integrate information of SOC derived from visible near-infrared reflectance (Vis-NIR) spectra in the laboratory with remote sensing (RS) images to improve predictions of topsoil SOC in the Skjern river catchment, Denmark. The second objective was to improve SOC prediction results by separately modeling uplands and wetlands. A total of 328 topsoil samples were collected and analyzed for SOC. Satellite Pour l'Observation de la Terre (SPOT5), Landsat Data Continuity Mission (Landsat 8) images, laboratory Vis-NIR and other ancillary environmental data including terrain parameters and soil maps were compiled to predict topsoil SOC using Cubist regression and Bayesian kriging. The results showed that the model developed from RS data, ancillary environmental data and laboratory spectral data yielded a lower root mean square error (RMSE) (2.8%) and higher R2 (0.59) than the model developed from only RS data and ancillary environmental data (RMSE: 3.6%, R2: 0.46). Plant-available water (PAW) was the most important predictor for all the models because of its close relationship with soil organic matter content. Moreover, vegetation indices, such as the Normalized Difference Vegetation Index (NDVI) and Enhanced Vegetation Index (EVI), were very important predictors in SOC spatial models. Furthermore, the 'upland model' was able to more accurately predict SOC compared with the 'upland & wetland model'. However, the separately calibrated 'upland and wetland model' did not improve the prediction accuracy for wetland sites, since it was not possible to adequately discriminate the vegetation in the RS summer images. 
We conclude that laboratory Vis-NIR spectroscopy adds critical information that significantly improves the prediction accuracy of SOC compared to using RS data alone. We recommend the incorporation of laboratory spectra with RS data and other environmental data to improve soil spatial modeling and digital soil mapping (DSM).
Locher, Kathrin; Borghardt, Jens M; Frank, Kerstin J; Kloft, Charlotte; Wagner, Karl G
2016-08-01
Biphasic dissolution models are proposed to have good predictive power for the in vivo absorption. The aim of this study was to improve our previously introduced mini-scale dissolution model to mimic in vivo situations more realistically and to increase the robustness of the experimental model. Six dissolved APIs (BCS II) were tested applying the improved mini-scale biphasic dissolution model (miBIdi-pH-II). The influence of experimental model parameters including various excipients, API concentrations, dual paddle and its rotation speed was investigated. The kinetics in the biphasic model was described applying a one- and four-compartment pharmacokinetic (PK) model. The improved biphasic dissolution model was robust related to differing APIs and excipient concentrations. The dual paddle guaranteed homogenous mixing in both phases; the optimal rotation speed was 25 and 75rpm for the aqueous and the octanol phase, respectively. A one-compartment PK model adequately characterised the data of fully dissolved APIs. A four-compartment PK model best quantified dissolution, precipitation, and partitioning also of undissolved amounts due to realistic pH profiles. The improved dissolution model is a powerful tool for investigating the interplay between dissolution, precipitation and partitioning of various poorly soluble APIs (BCS II). In vivo-relevant PK parameters could be estimated applying respective PK models. Copyright © 2016 Elsevier B.V. All rights reserved.
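As a sketch of the one-compartment case for a fully dissolved API: the drug transfers from the aqueous to the octanol phase with first-order kinetics. The rate constant and dose below are hypothetical, not the paper's fitted PK parameters:

```python
import math

# One-compartment sketch of the biphasic dissolution setup:
# dA_oct/dt = k * A_aq, with A_aq = dose - A_oct (all API initially dissolved)
dose = 100.0   # fully dissolved amount (assumed units: mg)
k = 0.05       # assumed first-order partitioning rate constant (1/min)

def octanol_amount(t_min):
    # Analytic solution of the one-compartment model at time t (minutes)
    return dose * (1 - math.exp(-k * t_min))

for t in (0, 30, 60, 120):
    print(t, round(octanol_amount(t), 1))
```

The four-compartment model described above would add dissolution and precipitation steps for the undissolved fraction; this sketch covers only the partitioning leg.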
Cook, David J; Thompson, Jeffrey E; Suri, Rakesh; Prinsen, Sharon K
2014-01-01
The absence of standardization in the surgical care process, exemplified in a "solution shop" model, can lead to unwarranted variation, increased cost, and reduced quality. A comprehensive effort was undertaken to improve quality of care around indwelling bladder catheter use following surgery by creating a "focused factory" model within the cardiac surgical practice. Baseline compliance with Surgical Care Improvement Inf-9, removal of the urinary catheter by the end of surgical postoperative day 2, was determined. Comparison of baseline data to postintervention results showed clinically important reductions in the duration of indwelling bladder catheters as well as a marked reduction in practice variation. Following the intervention, Surgical Care Improvement Inf-9 guidelines were met in 97% of patients. Although the clinical quality improvement was notable, the process used to accomplish it (identification of patients suitable for standardized pathways, protocol application, and electronic systems to support the standardized practice model) has potentially greater relevance than the specific clinical results. © 2013 by the American College of Medical Quality.
Model verification of large structural systems. [space shuttle model response
NASA Technical Reports Server (NTRS)
Lee, L. T.; Hasselman, T. K.
1978-01-01
A computer program for the application of parameter identification to the structural dynamic models of the space shuttle and other large models with hundreds of degrees of freedom is described. Finite element, dynamic, analytic, and modal models are used to represent the structural system. The interface with math models is such that output from any structural analysis program applied to any structural configuration can be used directly. Processed data from either sine-sweep tests or resonant dwell tests are directly usable. The program uses measured modal data to condition the prior analytic model so as to improve the frequency match between model and test. A Bayesian estimator generates an improved analytical model, and a linear estimator is used in an iterative fashion on highly nonlinear equations. Mass and stiffness scaling parameters are generated for an improved finite element model, and the optimum set of parameters is obtained in one step.
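The Bayesian conditioning step can be illustrated in one dimension: a prior (analytic) estimate of a stiffness-scaling parameter is combined with a test-derived estimate, each weighted by the inverse of its variance. The numbers below are hypothetical, not from the program described:

```python
# One-parameter sketch of Bayesian model conditioning (all values hypothetical)
prior, var_prior = 1.00, 0.04   # analytic model's scaling parameter and its variance
meas,  var_meas  = 1.10, 0.01   # value implied by measured modal data, and its variance

# Inverse-variance weighting: the more certain estimate dominates the update
w_prior, w_meas = 1 / var_prior, 1 / var_meas
posterior = (w_prior * prior + w_meas * meas) / (w_prior + w_meas)
var_post = 1 / (w_prior + w_meas)
print(posterior, var_post)
```

Because the test data are assumed four times more certain here, the posterior lands much closer to the measured value, and its variance is smaller than either input's.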
ERIC Educational Resources Information Center
Chen, Chung-Yang; Chang, Huiju; Hsu, Wen-Chin; Sheen, Gwo-Ji
2017-01-01
This paper proposes a training model for raters, with the goal to improve the intra- and inter-consistency of evaluation quality for higher education curricula. The model, termed the learning, behaviour and reaction (LBR) circular training model, is an interdisciplinary application from the business and organisational training domain. The…
USDA-ARS?s Scientific Manuscript database
The progressive improvement of computer science and the development of auto-calibration techniques mean that calibration of simulation models is no longer a major challenge for watershed planning and management. Modelers now increasingly focus on challenges such as improved representation of watershed...
Rhode Island Model Evaluation & Support System: Teacher. Edition III
ERIC Educational Resources Information Center
Rhode Island Department of Education, 2015
2015-01-01
Rhode Island educators believe that implementing a fair, accurate, and meaningful educator evaluation and support system will help improve teaching and learning. The primary purpose of the Rhode Island Model Teacher Evaluation and Support System (Rhode Island Model) is to help all teachers improve. Through the Model, the goal is to help create a…
Guide to Working with Model Providers.
ERIC Educational Resources Information Center
Walter, Katie; Hassel, Bryan C.
Often a central feature of a school's improvement efforts is the adoption of a Comprehensive School Reform (CSR) model, an externally developed research-based design for school improvement. Adopting a model is only the first step in CSR. Another important step is forging partnerships with developers of CSR models. This guide aims to help schools…
Modeling bladder cancer in mice: opportunities and challenges
Kobayashi, Takashi; Owczarek, Tomasz B.; McKiernan, James M.; Abate-Shen, Cory
2015-01-01
The prognosis and treatment of bladder cancer have hardly improved in the last 20 years. Bladder cancer remains a debilitating and often fatal disease, and among the most costly cancers to treat. The generation of informative mouse models has the potential to improve our understanding of bladder cancer progression, as well as impact its diagnosis and treatment. However, relatively few mouse models of bladder cancer have been described and particularly few that develop invasive cancer phenotypes. This review focuses on opportunities for improving the landscape of mouse models of bladder cancer. PMID:25533675
Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio
2018-03-03
A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175-183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Z_ave). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Z_ave. The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. In conclusion, compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Z_ave of a validation point was within model uncertainty of the measured value.
Space, time, and the third dimension (model error)
Moss, Marshall E.
1979-01-01
The space-time tradeoff of hydrologic data collection (the ability to substitute spatial coverage for temporal extension of records or vice versa) is controlled jointly by the statistical properties of the phenomena that are being measured and by the model that is used to meld the information sources. The control exerted on the space-time tradeoff by the model and its accompanying errors has seldom been studied explicitly. The technique, known as Network Analyses for Regional Information (NARI), permits such a study of the regional regression model that is used to relate streamflow parameters to the physical and climatic characteristics of the drainage basin.The NARI technique shows that model improvement is a viable and sometimes necessary means of improving regional data collection systems. Model improvement provides an immediate increase in the accuracy of regional parameter estimation and also increases the information potential of future data collection. Model improvement, which can only be measured in a statistical sense, cannot be quantitatively estimated prior to its achievement; thus an attempt to upgrade a particular model entails a certain degree of risk on the part of the hydrologist.
Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio
2018-05-30
A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175-183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Z ave ). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Z ave . The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. Compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Z ave of a validation point was within model uncertainty of the measured value. Copyright © 2018 Elsevier B.V. All rights reserved.
Kim, Hea-Won; Park, Taekyung; Quiring, Stephanie; Barrett, Diana
2018-01-01
A coalition model is often used to serve victims of human trafficking, but little is known about whether the model adequately meets the needs of the victims. The purpose of this study was to examine an anti-human trafficking collaboration model in terms of its impact and the collaborative experience, including challenges and lessons learned from the service providers' perspective. A mixed-methods study was conducted to evaluate the impact of a citywide anti-trafficking coalition model from the providers' perspectives. A web-based survey was administered to service providers (n = 32), and focus groups were conducted with Core Group members (n = 10). Providers reported that the coalition model has made important impacts in the community by increasing coordination among the key agencies, law enforcement, and service providers and improving the quality of service provision. Providers identified the improved and expanded partnerships among coalition members as the key contributing factor to the success of the coalition model. Several key strategies were suggested to improve the coalition model: improved referral tracking, key partner and protocol development, and information sharing.
Johansson, Michael A; Reich, Nicholas G; Hota, Aditi; Brownstein, John S; Santillana, Mauricio
2016-09-26
Dengue viruses, which infect millions of people per year worldwide, cause large epidemics that strain healthcare systems. Despite diverse efforts to develop forecasting tools including autoregressive time series, climate-driven statistical, and mechanistic biological models, little work has been done to understand the contribution of different components to improved prediction. We developed a framework to assess and compare dengue forecasts produced from different types of models and evaluated the performance of seasonal autoregressive models with and without climate variables for forecasting dengue incidence in Mexico. Climate data did not significantly improve the predictive power of seasonal autoregressive models. Short-term and seasonal autocorrelation were key to improving short-term and long-term forecasts, respectively. Seasonal autoregressive models captured a substantial amount of dengue variability, but better models are needed to improve dengue forecasting. This framework contributes to the sparse literature of infectious disease prediction model evaluation, using state-of-the-art validation techniques such as out-of-sample testing and comparison to an appropriate reference model.
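A toy version of the seasonal autoregressive setup, using a synthetic monthly series rather than the Mexican dengue data: fit y[t] ~ a*y[t-1] + b*y[t-12] by ordinary least squares, then evaluate out-of-sample one-step forecasts, as in the validation approach described above:

```python
import math

# Synthetic monthly "incidence" series: seasonal cycle plus an irregular component
series = [50 + 40 * math.sin(2 * math.pi * t / 12) + 5 * math.sin(t) for t in range(96)]
train = series[:84]

# Regression rows: target y[t], predictors y[t-1] (short-term) and y[t-12] (seasonal)
rows = [(train[t - 1], train[t - 12], train[t]) for t in range(12, len(train))]

# Solve the 2x2 normal equations for the coefficients (a, b)
s11 = sum(x1 * x1 for x1, _, _ in rows); s12 = sum(x1 * x2 for x1, x2, _ in rows)
s22 = sum(x2 * x2 for _, x2, _ in rows)
t1 = sum(x1 * y for x1, _, y in rows); t2 = sum(x2 * y for _, x2, y in rows)
det = s11 * s22 - s12 * s12
a = (s22 * t1 - s12 * t2) / det
b = (s11 * t2 - s12 * t1) / det

# Out-of-sample one-step-ahead forecasts on the held-out last 12 months
errs = [series[t] - (a * series[t - 1] + b * series[t - 12]) for t in range(84, 96)]
rmse = math.sqrt(sum(e * e for e in errs) / len(errs))
print(round(a, 3), round(b, 3), round(rmse, 3))
```

Adding climate covariates would mean extra columns in the regression; the framework in the paper compares such variants against this autoregressive reference.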
Improving the Validity of Activity of Daily Living Dependency Risk Assessment
Clark, Daniel O.; Stump, Timothy E.; Tu, Wanzhu; Miller, Douglas K.
2015-01-01
Objectives Efforts to prevent activity of daily living (ADL) dependency may be improved through models that assess older adults’ dependency risk. We evaluated whether cognition and gait speed measures improve the predictive validity of interview-based models. Method Participants were 8,095 self-respondents in the 2006 Health and Retirement Survey who were aged 65 years or over and independent in five ADLs. Incident ADL dependency was determined from the 2008 interview. Models were developed using random 2/3rd cohorts and validated in the remaining 1/3rd. Results Compared to a c-statistic of 0.79 in the best interview model, the model including cognitive measures had c-statistics of 0.82 and 0.80 while the best fitting gait speed model had c-statistics of 0.83 and 0.79 in the development and validation cohorts, respectively. Conclusion Two relatively brief models, one that requires an in-person assessment and one that does not, had excellent validity for predicting incident ADL dependency but did not significantly improve the predictive validity of the best fitting interview-based models. PMID:24652867
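The c-statistic reported above is equivalent to the probability that the model scores a randomly chosen incident case above a randomly chosen non-case; a small rank-based sketch with hypothetical risk scores:

```python
# Hypothetical model risk scores and observed outcomes (1 = became ADL-dependent)
risk_scores = [0.9, 0.8, 0.75, 0.6, 0.55, 0.4, 0.3, 0.2]
became_dependent = [1, 1, 0, 1, 0, 0, 0, 0]

pos = [s for s, y in zip(risk_scores, became_dependent) if y == 1]
neg = [s for s, y in zip(risk_scores, became_dependent) if y == 0]

# c-statistic: fraction of case/non-case pairs ranked correctly (ties count half)
pairs = [(p, n) for p in pos for n in neg]
c = sum((p > n) + 0.5 * (p == n) for p, n in pairs) / len(pairs)
print(c)
```

A c-statistic of 0.79-0.83, as in the models above, means roughly four out of five such pairs are ordered correctly.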
Harrison, Kenneth W.; Tian, Yudong; Peters-Lidard, Christa D.; Ringerud, Sarah; Kumar, Sujay V.
2018-01-01
Better estimation of land surface microwave emissivity promises to improve over-land precipitation retrievals in the GPM era. Forward models of land microwave emissivity are available but have suffered from poor parameter specification and limited testing. Here, forward models are calibrated and the accompanying change in predictive power is evaluated. With inputs (e.g., soil moisture) from the Noah land surface model and applying MODIS LAI data, two microwave emissivity models are tested, the Community Radiative Transfer Model (CRTM) and Community Microwave Emission Model (CMEM). The calibration is conducted with the NASA Land Information System (LIS) parameter estimation subsystem using AMSR-E based emissivity retrievals for the calibration dataset. The extent of agreement between the modeled and retrieved estimates is evaluated using the AMSR-E retrievals for a separate 7-year validation period. Results indicate that calibration can significantly improve the agreement, simulating emissivity with an across-channel average root-mean-square-difference (RMSD) of about 0.013, or about 20% lower than if relying on daily estimates based on climatology. The results also indicate that calibration of the microwave emissivity model alone, as was done in prior studies, results in as much as 12% higher across-channel average RMSD, as compared to joint calibration of the land surface and microwave emissivity models. It remains as future work to assess the extent to which the improvements in emissivity estimation translate into improvements in precipitation retrieval accuracy. PMID:29795962
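The across-channel average RMSD used as the agreement metric can be computed as below; the channel names and emissivity values are illustrative only, not AMSR-E retrievals:

```python
import math

# Hypothetical modeled vs. retrieved emissivities for a few channels over time
modeled = {
    "10.7V": [0.952, 0.948, 0.955], "10.7H": [0.902, 0.898, 0.905],
    "18.7V": [0.960, 0.957, 0.962], "18.7H": [0.915, 0.910, 0.918],
}
retrieved = {
    "10.7V": [0.940, 0.955, 0.948], "10.7H": [0.890, 0.905, 0.900],
    "18.7V": [0.950, 0.965, 0.958], "18.7H": [0.905, 0.920, 0.910],
}

def rmsd(a, b):
    # Root-mean-square difference between two matched time series
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

per_channel = {ch: rmsd(modeled[ch], retrieved[ch]) for ch in modeled}
avg = sum(per_channel.values()) / len(per_channel)
print({ch: round(v, 4) for ch, v in per_channel.items()}, round(avg, 4))
```

Calibration in the study amounts to adjusting model parameters until this average RMSD over the calibration period is minimized.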
Improvements and validation of the erythropoiesis control model for bed rest simulation
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1977-01-01
The most significant improvement in the model is the explicit formulation of separate elements representing erythropoietin production and red cell production. Other modifications include bone marrow time-delays, capability to shift oxyhemoglobin affinity and an algorithm for entering experimental data as time-varying driving functions. An area of model development is suggested by applying the model to simulating onset, diagnosis and treatment of a hematologic disorder. Recommendations for further improvements in the model and suggestions for experimental application are also discussed. A detailed analysis of the hematologic response to bed rest including simulation of the recent Baylor Medical College bed rest studies is also presented.
Realism of Indian Summer Monsoon Simulation in a Quarter Degree Global Climate Model
NASA Astrophysics Data System (ADS)
Salunke, P.; Mishra, S. K.; Sahany, S.; Gupta, K.
2017-12-01
This study assesses the fidelity of Indian Summer Monsoon (ISM) simulations using a global model at an ultra-high horizontal resolution (UHR) of 0.25°. The model used was the atmospheric component of the Community Earth System Model version 1.2.0 (CESM 1.2.0) developed at the National Center for Atmospheric Research (NCAR). Precipitation and temperature over the Indian region were analyzed for a wide range of space and time scales to evaluate the fidelity of the model under UHR, with special emphasis on the ISM simulations during the June-through-September (JJAS) period. Comparing the UHR simulations with observed data from the India Meteorological Department (IMD) over the Indian landmass, it was found that the 0.25° resolution significantly improved spatial rainfall patterns over many regions, including the Western Ghats and the south-eastern peninsula, as compared to the standard model resolution. Convective and large-scale rainfall components were analyzed using the European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA)-Interim (ERA-I) data, and it was found that at 0.25° resolution there was an overall increase in the large-scale component and an associated decrease in the convective component of rainfall as compared to the standard model resolution. Analysis of the diurnal cycle of rainfall suggests a significant improvement in the phase characteristics simulated by the UHR model as compared to the standard model resolution. Analysis of the annual cycle of rainfall, however, failed to show any significant improvement in the UHR model as compared to the standard version. Surface temperature analysis showed small improvements in the UHR model simulations as compared to the standard version. Thus, one may conclude that there are some significant improvements in the ISM simulations using a 0.25° global model, although there is still plenty of scope for further improvement in certain aspects, such as the annual cycle of rainfall.
Satellite Sounder Data Assimilation for Improving Alaska Region Weather Forecast
NASA Technical Reports Server (NTRS)
Zhu, Jiang; Stevens, E.; Zavodsky, B. T.; Zhang, X.; Heinrichs, T.; Broderson, D.
2014-01-01
Data assimilation has been demonstrated to be very useful in improving both global and regional numerical weather prediction. Alaska has a very sparse network of surface observation sites; on the other hand, it receives many more satellite overpasses than the lower 48 states. How to utilize satellite data to improve numerical prediction is one of the hot topics in the Alaska weather forecasting community. The Geographic Information Network of Alaska (GINA) at the University of Alaska is conducting a study on satellite data assimilation for the WRF model. AIRS/CrIS sounder profile data are assimilated into the initial conditions of the customized regional WRF model (GINA-WRF model). Normalized standard deviation, RMSE, and correlation statistics are applied to one case of 48-hour forecasts and one month of 24-hour forecasts in order to evaluate the improvement of the regional numerical model from data assimilation. The final goal of the research is to provide improved real-time short-term forecasts for the Alaska region.
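The three verification statistics named above (normalized standard deviation, RMSE, and correlation, as used in Taylor-diagram style evaluation) can be sketched in a few lines. The data here are synthetic stand-ins for a forecast and its verifying analysis, not GINA-WRF output.

```python
import numpy as np

# Sketch of forecast verification statistics: normalized standard deviation,
# RMSE, and correlation between a forecast and a reference analysis.
def verify(forecast, reference):
    nsd = forecast.std() / reference.std()          # normalized standard deviation
    rmse = np.sqrt(np.mean((forecast - reference) ** 2))
    corr = np.corrcoef(forecast, reference)[0, 1]
    return nsd, rmse, corr

rng = np.random.default_rng(7)
ref = rng.normal(270, 10, 500)         # e.g. analysis temperatures (K), synthetic
fcst = ref + rng.normal(0, 2, 500)     # forecast with assumed random error
nsd, rmse, corr = verify(fcst, ref)
```

A perfect forecast gives nsd = 1, rmse = 0, corr = 1; data assimilation aims to move all three statistics toward those values.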
NASA Technical Reports Server (NTRS)
Bird, J. F.
1985-01-01
In testing a stochastic variational principle at high frequencies by using a Kirchhoffean trial function in an idealized model for surface scattering - a randomly embossed plane - we have found not only the predicted high-frequency improvement but also an unexpected low-frequency improvement in the calculated scattering amplitudes. To investigate systematically the all-frequency variational behavior, we consider here the deterministic one-boss case - Rayleigh's classic model whose exact solution is available for comparison - over all wavelengths, polarizations, and configurations of incidence and scattering. We examine analytically in particular the long-wave limit of the variational-Kirchhoff amplitudes; the results demonstrate improvements in both wavelength and angle dependence for horizontal (TM) polarization and some variational improvements for vertical (TE) polarization. This low-frequency behavior in tandem with the foreseen high-frequency improvement leads to good variational-Kirchhoff results through the intermediate resonance-frequency regime for this model.
Improved meteorology from an updated WRF/CMAQ modeling system with MODIS vegetation and albedo
Realistic vegetation characteristics and phenology from the Moderate Resolution Imaging Spectroradiometer (MODIS) products improve the simulation for the meteorology and air quality modeling system WRF/CMAQ (Weather Research and Forecasting model and Community Multiscale Air Qual...
ERIC Educational Resources Information Center
Chaseling, Marilyn; Boyd, William Edgar; Smith, Robert; Boyd, Wendy; Shipway, Bradley; Foster, Alan; Lembke, Cathy
2017-01-01
This paper reports on a preliminary Australian adoption and adaptation, in the North Coast region of New South Wales, Australia, of the Townsend and Adams' model of leadership growth for school improvement in Alberta. The Australian adaptation of this Alberta model has been named the North Coast Initiative for School Improvement (NCISI). The…
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to conduct a study that assesses the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps that exist between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
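The Monte Carlo idea described above can be sketched as propagating uncertainty in ACSI-style drivers through a weighted index to obtain a baseline distribution of predicted scores. The weights and input distributions below are illustrative assumptions, not the study's calibrated values.

```python
import random

# Hypothetical Monte Carlo sketch: sample uncertain satisfaction drivers,
# combine them into a weighted index, and summarize the resulting baseline.
random.seed(1)

def simulate_acsi(n=10_000):
    scores = []
    for _ in range(n):
        expectations = random.gauss(75, 5)   # customer expectations (0-100 scale)
        quality = random.gauss(80, 4)        # perceived quality
        value = random.gauss(70, 6)          # perceived value
        scores.append(0.3 * expectations + 0.4 * quality + 0.3 * value)
    return sum(scores) / n, scores

mean, scores = simulate_acsi()
print(round(mean, 1))  # baseline index near the weighted mean of the drivers
```

Sensitivity analysis then follows by perturbing one driver's distribution at a time and observing the shift in the predicted index.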
NASA Astrophysics Data System (ADS)
Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli
2018-01-01
Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is scaled through a polynomial response surface function to capture the characteristics of the high-fidelity model. Secondly, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the information already acquired at the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient for an aircraft are provided to demonstrate the approximation capability of the proposed approach, compared against three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.
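The core idea of hierarchical kriging (a polynomial scaling of the low-fidelity model, corrected to match sparse high-fidelity samples) can be sketched simply. This is not the ASM-IHK implementation: the test functions are hypothetical, and plain linear interpolation replaces the kriging correction on the residual purely to keep the sketch self-contained.

```python
import numpy as np

def f_high(x):   # expensive "truth" model (hypothetical)
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

def f_low(x):    # cheap low-fidelity approximation (hypothetical)
    return 0.5 * f_high(x) + 10 * (x - 0.5) - 5

x_hi = np.linspace(0, 1, 6)                  # a few high-fidelity samples
y_hi, y_lo = f_high(x_hi), f_low(x_hi)

# Polynomial response-surface scaling of the low-fidelity trend.
coef = np.polyfit(y_lo, y_hi, 2)
resid = y_hi - np.polyval(coef, y_lo)        # what the scaled trend misses

def vf_predict(x):
    base = np.polyval(coef, f_low(x))        # scaled low-fidelity prediction
    return base + np.interp(x, x_hi, resid)  # residual correction (kriging in the real method)

print(np.allclose(vf_predict(x_hi), y_hi))   # exact at the high-fidelity samples
```

By construction the VF predictor interpolates the high-fidelity samples exactly; its accuracy between samples depends on how well the scaled low-fidelity model tracks the truth.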
More than Anecdotes: Fishers' Ecological Knowledge Can Fill Gaps for Ecosystem Modeling.
Bevilacqua, Ana Helena V; Carvalho, Adriana R; Angelini, Ronaldo; Christensen, Villy
2016-01-01
Ecosystem modeling applied to fisheries remains hampered by a lack of local information. Fishers' knowledge could fill this gap, improving both participation in and the management of fisheries. The same fishing area was modeled using two approaches: one based on fishers' knowledge and one based on scientific information. For the former, the data were collected by interviews using the Delphi methodology, and for the latter, the data were gathered from the literature. Agreement between the attributes generated by the fishers' knowledge model and the scientific model is discussed and explored, aiming to improve data availability, the ecosystem model, and fisheries management. The ecosystem attributes produced by the fishers' knowledge model were consistent with those produced by the scientific model, which was elaborated using only scientific data from the literature. This study provides evidence that fishers' knowledge may suitably complement scientific data, and may improve the modeling tools for the research and management of fisheries.
Developing and Testing a Model to Predict Outcomes of Organizational Change
Gustafson, David H; Sainfort, François; Eichler, Mary; Adams, Laura; Bisognano, Maureen; Steudel, Harold
2003-01-01
Objective To test the effectiveness of a Bayesian model employing subjective probability estimates for predicting success and failure of health care improvement projects. Data Sources Experts' subjective assessment data for model development and independent retrospective data on 221 healthcare improvement projects in the United States, Canada, and the Netherlands collected between 1996 and 2000 for validation. Methods A panel of theoretical and practical experts and literature in organizational change were used to identify factors predicting the outcome of improvement efforts. A Bayesian model was developed to estimate probability of successful change using subjective estimates of likelihood ratios and prior odds elicited from the panel of experts. A subsequent retrospective empirical analysis of change efforts in 198 health care organizations was performed to validate the model. Logistic regression and ROC analysis were used to evaluate the model's performance using three alternative definitions of success. Data Collection For the model development, experts' subjective assessments were elicited using an integrative group process. For the validation study, a staff person intimately involved in each improvement project responded to a written survey asking questions about model factors and project outcomes. Results Logistic regression chi-square statistics and areas under the ROC curve demonstrated a high level of model performance in predicting success. Chi-square statistics were significant at the 0.001 level and areas under the ROC curve were greater than 0.84. Conclusions A subjective Bayesian model was effective in predicting the outcome of actual improvement projects. Additional prospective evaluations as well as testing the impact of this model as an intervention are warranted. PMID:12785571
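The subjective Bayesian scheme described above amounts to multiplying the prior odds of project success by an expert-elicited likelihood ratio for each observed factor. A minimal sketch, with illustrative numbers rather than the panel's elicited values:

```python
# Minimal sketch of a subjective Bayesian odds update: each observed factor
# contributes an expert-elicited likelihood ratio that multiplies the prior
# odds of success; the posterior probability follows from the posterior odds.
def posterior_success_probability(prior_odds, likelihood_ratios):
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# e.g. prior odds 1:1, three favourable factors and one unfavourable one
p = posterior_success_probability(1.0, [2.0, 1.5, 3.0, 0.5])
print(round(p, 3))  # 0.818
```

A likelihood ratio above 1 (factor more common in successful projects) raises the predicted probability; one below 1 lowers it.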
Improving the realism of hydrologic model through multivariate parameter estimation
NASA Astrophysics Data System (ADS)
Rakovec, Oldrich; Kumar, Rohini; Attinger, Sabine; Samaniego, Luis
2017-04-01
Increased availability and quality of near real-time observations should improve understanding of the predictive skill of hydrological models. Recent studies have shown the limited capability of river discharge data alone to adequately constrain different components of distributed model parameterizations. In this study, the GRACE satellite-based total water storage (TWS) anomaly is used to complement the discharge data with the aim of improving the fidelity of the mesoscale hydrologic model (mHM) through multivariate parameter estimation. The study is conducted in 83 European basins covering a wide range of hydro-climatic regimes. The model parameterization complemented with the TWS anomalies leads to statistically significant improvements in (1) discharge simulations during low-flow periods, and (2) evapotranspiration estimates, which are evaluated against independent (FLUXNET) data. Overall, there is no significant deterioration in model performance for the discharge simulations when complemented by information from the TWS anomalies. However, considerable changes in the partitioning of precipitation into runoff components are noticed with the inclusion/exclusion of TWS during parameter estimation. A cross-validation test carried out to assess the transferability and robustness of the calibrated parameters to other locations further confirms the benefit of the complementary TWS data. In particular, the evapotranspiration estimates show more robust performance when TWS data are incorporated during parameter estimation, in comparison with the benchmark model constrained against discharge only. This study highlights the value of incorporating multiple data sources during parameter estimation to improve the overall realism of the hydrologic model and its applications over large domains. Rakovec, O., Kumar, R., Attinger, S. and Samaniego, L. (2016): Improving the realism of hydrologic model functioning through multivariate parameter estimation. Water Resour. Res., 52, http://dx.doi.org/10.1002/2016WR019430
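A multivariate calibration objective like the one described above can be sketched as a weighted combination of a discharge skill score and a TWS-anomaly skill score, so that a candidate parameter set must fit both data sources. The equal weighting and the Nash-Sutcliffe form are illustrative choices, not necessarily the mHM study's exact formulation.

```python
import numpy as np

# Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean of the data.
def nse(sim, obs):
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Combined objective over two observation types (assumed equal weights).
def multivariate_objective(q_sim, q_obs, tws_sim, tws_obs):
    return 0.5 * nse(q_sim, q_obs) + 0.5 * nse(tws_sim, tws_obs)

rng = np.random.default_rng(3)
q_obs = np.abs(rng.normal(50, 20, 365))          # synthetic daily discharge
tws_obs = np.cumsum(rng.normal(0, 5, 365))       # synthetic storage anomaly
tws_obs -= tws_obs.mean()
score_perfect = multivariate_objective(q_obs, q_obs, tws_obs, tws_obs)
```

An optimizer maximizing this objective trades off discharge fit against storage fit, which is what prevents the TWS-informed calibration from degrading the discharge simulations.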
Towards improved storm surge models in the northern Bay of Bengal
NASA Astrophysics Data System (ADS)
Krien, Y.; Testut, L.; Islam, A. K. M. S.; Bertin, X.; Durand, F.; Mayet, C.; Tazkia, A. R.; Becker, M.; Calmant, S.; Papa, F.; Ballu, V.; Shum, C. K.; Khan, Z. H.
2017-03-01
The northern Bay of Bengal is home to some of the deadliest cyclones recorded during the last decades. Storm surge models developed for this region have improved significantly in recent years, but they still fail to predict patterns of coastal flooding with sufficient accuracy. In the present paper, we make use of a state-of-the-art numerical modeling system with improved bathymetric and topographic data to identify the strengths and weaknesses of current storm surge models in this area and to suggest areas for improvement. The new model is found to perform relatively well in reproducing wave characteristics and maximum water levels for the two extreme cyclones studied here: Phailin (2013) and Sidr (2007). The wave setup turns out to be small compared to the wind-driven surge, although it still plays a significant role in inland flooding. Relatively large tide-surge interactions, mainly due to shallow-water effects, are also evidenced by the model. These findings argue in favor of further efforts to improve the representation of the bathymetry, especially in the nearshore area, and the implementation of models including tides and radiation stresses explicitly. The main limitation of the model is its inability to predict the detailed patterns of coastal flooding satisfactorily. The reason lies mainly in the fact that topographic data also need to be further improved. In particular, good knowledge of embankment characteristics (crest elevation and condition) is found to be of primary importance to represent inland flooding correctly. Public authorities should take urgent action to ensure that better data are available to the scientific community, so that state-of-the-art storm surge models reaching a sufficiently high level of confidence can be used for emergency preparedness and to implement mitigation strategies in the northern Bay of Bengal.
NASA Astrophysics Data System (ADS)
Barik, M. G.; Al-Hamdan, M. Z.; Crosson, W. L.; Yang, C. A.; Coffield, S. R.
2017-12-01
Satellite-derived environmental data, available at a range of spatio-temporal scales, are contributing to the growing use of health impact assessments of air pollution in the public health sector. Models developed by correlating Moderate Resolution Imaging Spectroradiometer (MODIS) Aerosol Optical Depth (AOD) with ground measurements of fine particulate matter smaller than 2.5 microns (PM2.5) are widely applied to measure PM2.5 spatial and temporal variability. In the public health sector, associations of PM2.5 with respiratory and cardiovascular diseases are often investigated to quantify air quality impacts on these health concerns. In order to improve the predictability of PM2.5 estimation using correlation models, we have included meteorological variables, higher-resolution AOD products, and instantaneous PM2.5 observations in the statistical estimation models. Our results showed that incorporation of high-resolution (1-km) Multi-Angle Implementation of Atmospheric Correction (MAIAC)-generated MODIS AOD, meteorological variables, and instantaneous PM2.5 observations improved model performance in various parts of California (CA), USA, where single-variable AOD-based models showed relatively weak performance. In this study, we further asked whether these improved models would actually be more successful for exploring associations of public health outcomes with estimated PM2.5. To answer this question, we geospatially investigated the relationship of model-estimated PM2.5 with respiratory and cardiovascular diseases such as asthma, high blood pressure, coronary heart disease, heart attack, and stroke in CA using health data from the Centers for Disease Control and Prevention (CDC)'s Wide-ranging Online Data for Epidemiologic Research (WONDER) and the Behavioral Risk Factor Surveillance System (BRFSS). PM2.5 estimates from these improved models have the potential to improve our understanding of associations between public health concerns and air quality.
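The multivariable estimation approach described above can be sketched as a least-squares regression of PM2.5 on AOD plus meteorological predictors. The coefficients and synthetic data below are assumptions for demonstration only, not the study's fitted model.

```python
import numpy as np

# Illustrative multiple regression: estimate PM2.5 (ug/m^3) from AOD plus
# meteorological covariates via ordinary least squares on synthetic data.
rng = np.random.default_rng(42)
n = 200
aod = rng.uniform(0.05, 0.6, n)       # aerosol optical depth (unitless)
temp = rng.uniform(5, 35, n)          # air temperature, deg C
rh = rng.uniform(20, 90, n)           # relative humidity, %
pm25 = 4 + 30 * aod + 0.2 * temp + 0.05 * rh + rng.normal(0, 1, n)  # assumed truth

X = np.column_stack([np.ones(n), aod, temp, rh])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, pm25, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((pm25 - pred) ** 2) / np.sum((pm25 - pm25.mean()) ** 2)
```

Adding the meteorological columns is what distinguishes this from a single-variable AOD model; dropping them from `X` reproduces the weaker single-predictor case.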
An EKV-based high voltage MOSFET model with improved mobility and drift model
NASA Astrophysics Data System (ADS)
Chauhan, Yogesh Singh; Gillon, Renaud; Bakeroot, Benoit; Krummenacher, Francois; Declercq, Michel; Ionescu, Adrian Mihai
2007-11-01
An EKV-based high voltage MOSFET model is presented. The intrinsic channel model is derived based on the charge-based EKV formalism. An improved mobility model is used for the modeling of the intrinsic channel to improve the DC characteristics. The model uses a second-order dependence on the gate bias and an extra parameter for the smoothing of the saturation voltage of the intrinsic drain. An improved drift model [Chauhan YS, Anghel C, Krummenacher F, Ionescu AM, Declercq M, Gillon R, et al. A highly scalable high voltage MOSFET model. In: IEEE European solid-state device research conference (ESSDERC), September 2006. p. 270-3; Chauhan YS, Anghel C, Krummenacher F, Maier C, Gillon R, Bakeroot B, et al. Scalable general high voltage MOSFET model including quasi-saturation and self-heating effect. Solid State Electron 2006;50(11-12):1801-13] is used for the modeling of the drift region, which gives a smoother transition in the output characteristics and also models the quasi-saturation region of high voltage MOSFETs well. The model is validated first against numerical device simulation of the VDMOS transistor and then against the measured characteristics of the SOI-LDMOS transistor. The accuracy of the model is better than that of our previous model [Chauhan YS, Anghel C, Krummenacher F, Maier C, Gillon R, Bakeroot B, et al. Scalable general high voltage MOSFET model including quasi-saturation and self-heating effect. Solid State Electron 2006;50(11-12):1801-13], especially in the quasi-saturation region of the output characteristics.
Hu, Chuanpu; Zhou, Honghui
2016-02-01
Improving the quality of exposure-response modeling is important in clinical drug development. The general joint modeling of multiple endpoints is made possible in part by recent progress on latent variable indirect response (IDR) modeling for ordered categorical endpoints. This manuscript aims to investigate, when modeling a continuous and a categorical clinical endpoint, the level of improvement achievable by joint modeling in the latent variable IDR modeling framework through the sharing of model parameters for the individual endpoints, guided by the appropriate representation of drug and placebo mechanism. This was illustrated with data from two phase III clinical trials of intravenously administered mAb X for the treatment of rheumatoid arthritis, in which the 28-joint disease activity score (DAS28) and 20%, 50%, and 70% improvement in the American College of Rheumatology disease severity criteria (ACR20, ACR50, and ACR70) were used as efficacy endpoints. The joint modeling framework led to a parsimonious final model with reasonable performance, evaluated by visual predictive check. The results showed that, compared with the more common approach of separately modeling the endpoints, it is possible for the joint model to be more parsimonious and yet better describe the individual endpoints. In particular, the joint model may better describe one endpoint through subject-specific random effects that would not have been estimable from data of this endpoint alone.
Improving software maintenance through measurement
NASA Technical Reports Server (NTRS)
Rombach, H. Dieter; Ulery, Bradford T.
1989-01-01
A practical approach to improving software maintenance through measurements is presented. This approach is based on general models for measurement and improvement. Both models, their integration, and practical guidelines for transferring them into industrial maintenance settings are presented. Several examples of applications of the approach to real-world maintenance environments are discussed.
Economic analysis of tree improvement: A status report
George F. Dutrow
1974-01-01
Review of current literature establishes that most authors believe that tree improvement expands production, although some point out drawbacks and alternatives. Both softwood and hardwood improvement programs have been analyzed. The authors used various models, economic assumptions, and standards of measurement, but available data were limited. Future models should...
USDA-ARS?s Scientific Manuscript database
Materials and Methods The simulation exercise and model improvement were implemented phase-wise. In the first modelling activities, the model sensitivities were evaluated for given CO2 concentrations varying from 360 to 720 µmol mol-1 at an interval of 90 µmol mol-1 and air temperature increments...
Gravity model development for precise orbit computations for satellite altimetry
NASA Technical Reports Server (NTRS)
Marsh, James G.; Lerch, Francis, J.; Smith, David E.; Klosko, Steven M.; Pavlis, Erricos
1986-01-01
Two preliminary gravity models developed as a first step in reaching the TOPEX/Poseidon modeling goals are discussed. They were obtained by NASA-Goddard from an analysis of exclusively satellite tracking observations. With the new Preliminary Gravity Solution-T2 model, an improved global estimate of the field is achieved with an improved description of the geoid.
Improving Environmental Model Calibration and Prediction
2011-01-18
Final Report: Improving Environmental Model Calibration and Prediction. First, we have continued to develop tools for efficient global optimization of environmental models. Our algorithms are hybrid algorithms that combine evolutionary strategies... toward practical hybrid optimization tools for environmental models.
Kirkham, R; Boyle, J A; Whitbread, C; Dowden, M; Connors, C; Corpus, S; McCarthy, L; Oats, J; McIntyre, H D; Moore, E; O'Dea, K; Brown, A; Maple-Brown, L
2017-08-03
Australian Aboriginal and Torres Strait Islander women have high rates of gestational and pre-existing type 2 diabetes in pregnancy. The Northern Territory (NT) Diabetes in Pregnancy Partnership was established to enhance systems and services to improve health outcomes. It has three arms: a clinical register, developing models of care, and a longitudinal birth cohort. This study used a process evaluation to report on health professionals' perceptions of models of care and related quality improvement activities since the implementation of the Partnership. Changes to models of care were documented according to the goals and aims of the Partnership and reviewed annually by the Partnership steering group. A 'systems assessment tool' was used to guide six focus groups (49 healthcare professionals). Transcripts were coded and analysed according to the pre-identified themes of orientation and guidelines, education, communication, logistics and access, and information technology. Key improvements since implementation of the Partnership include health professional relationships, communication and education, and the integration of quality improvement activities. The focus groups with 49 health professionals provided in-depth information about how these activities have impacted their practice and models of care for diabetes in pregnancy. Co-ordination of care was reported to have improved; however, it was also identified as an opportunity for further development. Recommendations included a central care coordinator, better integration of information technology systems, and ongoing comprehensive quality improvement processes. The Partnership has facilitated quality improvement by supporting the development of improved systems that enhance models of care. Persistent challenges remain in delivering care to a high-risk population; however, improvements in formal processes and structures, as demonstrated in this work thus far, play an important role in efforts to improve health outcomes.
Wykes, Til; Reeder, Clare; Huddy, Vyv; Taylor, Rumina; Wood, Helen; Ghirasim, Natalia; Kontis, Dimitrios; Landau, Sabine
2012-01-01
Background Cognitive remediation therapy (CRT) affects functioning, but the extent and type of cognitive improvements necessary are unknown. Aim To develop and test models of how cognitive improvement transfers to work behaviour using the data from a current service. Method Participants (N = 49) with a support worker and a paid or voluntary job were offered CRT in a Phase 2 single-group design with three assessments: baseline, post therapy and follow-up. Working memory, cognitive flexibility, planning and work outcomes were assessed. Results Three models were tested (mediation — cognitive improvements drive functioning improvement; moderation — post-treatment cognitive level affects the impact of CRT on functioning; moderated mediation — cognition drives functioning improvements only after a certain level is achieved). There was evidence of mediation (planning improvement associated with improved work quality). There was no evidence that cognitive flexibility (total Wisconsin Card Sorting Test errors) and working memory (Wechsler Adult Intelligence Scale III digit span) mediated work functioning despite significant effects. There was some evidence of moderated mediation for planning improvement if participants had poorer memory and/or made fewer WCST errors. The total CRT effect on work quality was d = 0.55, but the indirect (planning-mediated) CRT effect was d = 0.082. Conclusion Planning improvements led to better work quality but only accounted for a small proportion of the total effect on work outcome. Other specific and non-specific effects of CRT and the work programme are likely to account for some of the remaining effect. This is the first time such complex models have been tested, and future Phase 3 studies need to further test the mediation and moderated mediation models. PMID:22503640
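The mediation decomposition reported above splits a total effect into an indirect component (therapy acting through the mediator) and a direct component. A minimal arithmetic sketch: the path coefficients `a` and `b` below are assumed values chosen only to reproduce the reported indirect effect, not the study's estimates.

```python
# Mediation sketch: indirect effect = a * b, where a is the therapy->mediator
# path (CRT -> planning) and b is the mediator->outcome path (planning -> work
# quality); the direct effect is the remainder of the total effect.
a = 0.41        # assumed effect of CRT on planning improvement
b = 0.20        # assumed effect of planning improvement on work quality
total = 0.55    # reported total CRT effect on work quality (d)

indirect = a * b
direct = total - indirect
print(round(indirect, 3), round(direct, 3))  # 0.082 0.468
```

The small indirect share (0.082 of 0.55) is what motivates the paper's conclusion that planning improvement explains only part of the work-quality gain.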
Four decades of modeling methane cycling in terrestrial ecosystems: Where we are heading?
NASA Astrophysics Data System (ADS)
Xu, X.; Yuan, F.; Hanson, P. J.; Wullschleger, S. D.; Thornton, P. E.; Tian, H.; Riley, W. J.; Song, X.; Graham, D. E.; Song, C.
2015-12-01
A modeling approach to methane (CH4) is widely used to quantify the budget, investigate spatial and temporal variabilities, and understand the mechanistic processes and environmental controls on CH4 fluxes across spatial and temporal scales. Moreover, CH4 models are an important tool for integrating CH4 data from multiple sources, such as laboratory-based incubation and molecular analysis, field observational experiments, remote sensing, and aircraft-based measurements across a variety of terrestrial ecosystems. We reviewed 39 terrestrial CH4 models to characterize their strengths and weaknesses and to design a roadmap for future model improvement and application. We found that: (1) the focus of CH4 models have been shifted from theoretical to site- to regional-level application over the past four decades, expressed as dramatic increases in CH4 model development on regional budget quantification; (2) large discrepancies exist among models in terms of representing CH4 processes and their environmental controls; (3) significant data-model and model-model mismatches are partially attributed to different representations of wetland characterization and inundation dynamics. Three efforts should be paid special attention for future improvements and applications of fully mechanistic CH4 models: (1) CH4 models should be improved to represent the mechanisms underlying land-atmosphere CH4 exchange, with emphasis on improving and validating individual CH4 processes over depth and horizontal space; (2) models should be developed that are capable of simulating CH4 fluxes across space and time (particularly hot moments and hot spots); (3) efforts should be invested to develop model benchmarking frameworks that can easily be used for model improvement, evaluation, and integration with data from molecular to global scales. 
A newly developed microbial functional group-based CH4 model (CLM-Microbe) was further used to demonstrate the features of mechanistic representation and integration with multiple source of observational datasets.
An improved gravity model for Mars: Goddard Mars Model 1
NASA Technical Reports Server (NTRS)
Smith, D. E.; Lerch, F. J.; Nerem, R. S.; Zuber, M. T.; Patel, G. B.; Fricke, S. K.; Lemoine, F. G.
1993-01-01
Doppler tracking data of three orbiting spacecraft have been reanalyzed to develop a new gravitational field model for the planet Mars, Goddard Mars Model 1 (GMM-1). This model employs nearly all available data, consisting of approximately 1100 days of S-band tracking data collected by NASA's Deep Space Network from the Mariner 9 and Viking 1 and Viking 2 spacecraft, in seven different orbits, between 1971 and 1979. GMM-1 is complete to spherical harmonic degree and order 50, which corresponds to a half-wavelength spatial resolution of 200-300 km where the data permit. GMM-1 represents satellite orbits with considerably better accuracy than previous Mars gravity models and shows greater resolution of identifiable geological structures. The notable improvement in GMM-1 over previous models is a consequence of several factors: improved computational capabilities, the use of optimum weighting and least squares collocation solution techniques which stabilized the behavior of the solution at high degree and order, and the use of longer satellite arcs than employed in previous solutions, made possible by improved force and measurement models. The inclusion of X-band tracking data from the 379-km altitude, near-polar orbiting Mars Observer spacecraft should provide a significant improvement over GMM-1, particularly at high latitudes where current data poorly resolve the gravitational signature of the planet.
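The quoted resolution follows from the standard rule of thumb that a spherical harmonic expansion complete to degree l_max resolves features down to a half-wavelength of roughly pi * R / l_max. A quick check with an approximate Mars radius:

```python
import math

# Half-wavelength spatial resolution of a degree-l_max spherical harmonic
# field: about pi * R / l_max. Mars radius below is approximate.
R_MARS_KM = 3390.0
l_max = 50
half_wavelength_km = math.pi * R_MARS_KM / l_max
print(round(half_wavelength_km))  # 213
```

The result (~213 km) sits inside the 200-300 km range quoted for GMM-1; the upper end of that range reflects regions where data coverage limits the effective resolution.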
An improved gravity model for Mars: Goddard Mars Model-1 (GMM-1)
NASA Technical Reports Server (NTRS)
Smith, D. E.; Lerch, F. J.; Nerem, R. S.; Zuber, M. T.; Patel, G. B.; Fricke, S. K.; Lemoine, F. G.
1993-01-01
Doppler tracking data of three orbiting spacecraft have been reanalyzed to develop a new gravitational field model for the planet Mars, GMM-1 (Goddard Mars Model-1). This model employs nearly all available data, consisting of approximately 1100 days of S-band tracking data collected by NASA's Deep Space Network from the Mariner 9, Viking 1, and Viking 2 spacecraft, in seven different orbits, between 1971 and 1979. GMM-1 is complete to spherical harmonic degree and order 50, which corresponds to a half-wavelength spatial resolution of 200-300 km where the data permit. GMM-1 represents satellite orbits with considerably better accuracy than previous Mars gravity models and shows greater resolution of identifiable geological structures. The notable improvement in GMM-1 over previous models is a consequence of several factors: improved computational capabilities, the use of optimum weighting and least-squares collocation solution techniques which stabilized the behavior of the solution at high degree and order, and the use of longer satellite arcs than employed in previous solutions that were made possible by improved force and measurement models. The inclusion of X-band tracking data from the 379-km altitude, near-polar orbiting Mars Observer spacecraft should provide a significant improvement over GMM-1, particularly at high latitudes where current data poorly resolve the gravitational signature of the planet.
Hvitfeldt-Forsberg, Helena; Mazzocato, Pamela; Glaser, Daniel; Keller, Christina; Unbeck, Maria
2017-01-01
Objective To explore healthcare staff's and managers' perceptions of how and when discrete event simulation modelling can be used as decision support in improvement efforts. Design Two focus group discussions were performed. Setting Two settings were included: a rheumatology department and an orthopaedic section, both situated in Sweden. Participants Healthcare staff and managers (n=13) from the two settings. Interventions Two workshops were performed, one at each setting. Workshops were initiated by a short introduction to simulation modelling. Results from the respective simulation model were then presented and discussed in the following focus group discussion. Results Categories from the content analysis are presented according to the research questions: how and when can simulation modelling assist healthcare improvement? Regarding how, the participants mentioned that simulation modelling could act as a tool for support and a way to visualise problems, potential solutions and their effects. Regarding when, simulation modelling could be used both locally and by management, as well as a pedagogical tool to develop and test innovative ideas and to involve everyone in the improvement work. Conclusions Its potential as an information and communication tool and as an instrument for pedagogic work within healthcare improvement renders a broader application and value of simulation modelling than previously reported. PMID:28588107
Improved parametrization of the growth index for dark energy and DGP models
NASA Astrophysics Data System (ADS)
Jing, Jiliang; Chen, Songbai
2010-03-01
We propose two improved parameterized forms for the growth index of the linear matter perturbations: (I) γ(z) = γ0 + (γ∞ - γ0) z/(1+z) and (II) γ(z) = γ0 + γ1 z/(1+z) + (γ∞ - γ1 - γ0) [z/(1+z)]^α. With these forms of γ(z), we analyze the accuracy of approximating the growth factor f by Ωm^γ(z) for both the wCDM model and the DGP model. For the first improved parameterized form, we find that the approximation accuracy is enhanced at high redshifts for both kinds of models, but not at low redshifts. For the second improved parameterized form, it is found that Ωm^γ(z) approximates the growth factor f very well at all redshifts. For a suitably chosen α, the relative error is below 0.003% for the ΛCDM model and 0.028% for the DGP model when Ωm=0.27. Thus, the second improved parameterized form of γ(z) should be useful for high-precision constraints on the growth index of different models with observational data. Moreover, we also show that α depends on the equation of state w and the fractional energy density of matter Ωm0, which may help us learn more about dark energy and DGP models.
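As a rough illustration of how such a parameterized growth index is used, the sketch below evaluates both forms of γ(z) and the approximation f ≈ Ωm(z)^γ(z) for a flat ΛCDM-like background; all coefficient values here are illustrative placeholders, not the paper's fitted values.

```python
# Illustrative evaluation of the two parameterized forms of the growth
# index gamma(z); coefficients below are placeholders, not fitted values.
def gamma_I(z, g0, ginf):
    return g0 + (ginf - g0) * z / (1.0 + z)

def gamma_II(z, g0, g1, ginf, alpha):
    x = z / (1.0 + z)
    return g0 + g1 * x + (ginf - g1 - g0) * x ** alpha

def omega_m(z, om0=0.27):
    """Matter fraction Omega_m(z) for a flat LCDM-like background."""
    a = om0 * (1.0 + z) ** 3
    return a / (a + 1.0 - om0)

def growth_factor_approx(z, gamma):
    return omega_m(z) ** gamma

for z in (0.0, 0.5, 2.0):
    g = gamma_II(z, 0.555, 0.02, 0.55, 2.0)   # placeholder coefficients
    print(z, round(growth_factor_approx(z, g), 4))
```

At high redshift Ωm(z) approaches 1, so f approaches 1 regardless of γ, which is why the accuracy of the parametrization matters most at low and intermediate redshifts.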
High Resolution Simulations of Arctic Sea Ice, 1979-1993
2003-01-01
William H. Lipscomb. To evaluate improvements in modelling Arctic sea ice, we compare results from two regional models at 1/12° horizontal resolution. The first is a coupled ice-ocean model of the Arctic Ocean, consisting of an ocean model (adapted from the Parallel Ocean Program, Los Alamos National Laboratory [LANL]) and the "old" sea ice model. The second model uses the same grid but consists of an improved "new" sea ice model (LANL...
WILDLAND FIRE EMISSION MODELING FOR CMAQ: AN UPDATE
This paper summarizes recent efforts to improve the methods used for modeling wildland fire emissions, both for retrospective modeling and real-time forecasting. These improvements focus on the temporal and spatial resolution of the activity data as well as the methods to estimat...
NASA Astrophysics Data System (ADS)
Mukhopadhyay, P.; Phani Murali Krishna, R.; Goswami, Bidyut B.; Abhik, S.; Ganai, Malay; Mahakur, M.; Khairoutdinov, Marat; Dudhia, Jimmy
2016-05-01
In spite of significant improvements in numerical model physics, resolution and numerics, general circulation models (GCMs) find it difficult to simulate realistic seasonal and intraseasonal variability over the global tropics, and particularly over the Indian summer monsoon (ISM) region. The bias is mainly attributed to improper representation of physical processes. Among these, the cloud and convective processes appear to play a major role in modulating model bias. In recent times, the NCEP CFSv2 model has been adopted under the Monsoon Mission for dynamical monsoon forecasting over the Indian region. Analyses of climate free runs of CFSv2 at two resolutions, T126 and T382, show largely similar biases in simulating seasonal rainfall, in capturing intraseasonal variability at different scales over the global tropics, and in capturing tropical waves. Thus, the biases of CFSv2 indicate a deficiency in the model's parameterization of cloud and convective processes. Against this background, and to improve the model's fidelity, two approaches have been adopted. First, in superparameterization, 32 cloud-resolving models, each with a horizontal resolution of 4 km, are embedded in each GCM (CFSv2) grid column and the conventional sub-grid-scale convective parameterization is deactivated. This is done to demonstrate the role of resolving cloud processes which otherwise remain unresolved. The superparameterized CFSv2 (SP-CFS) is developed on a coarser T62 version. The model is integrated for six and a half years in climate free-run mode, initialised from 16 May 2008. The analyses reveal that SP-CFS simulates a significantly improved mean state compared to the default CFS. The systematic biases of deficient rainfall over the Indian land mass and a colder troposphere have been substantially reduced. Most importantly, the convectively coupled equatorial waves and the eastward-propagating MJO are simulated with more fidelity in SP-CFS.
This improvement in the model mean state is attributable to systematic improvements in the moisture field, temperature profile and moist instability. The model also better simulates the cloud-rainfall relation. This initiative demonstrates the role of cloud processes in the mean state of a coupled GCM. As the superparameterization approach is computationally expensive, in another approach the conventional Simplified Arakawa-Schubert (SAS) scheme is replaced by a revised SAS scheme (RSAS), and the older, simplified Zhao-Carr (1997) cloud scheme is replaced by WSM6 in CFSv2 (hereafter CFS-CR). The primary objective of these modifications is to improve the distribution of convective rain through RSAS and of grid-scale (large-scale, non-convective) rain through WSM6. WSM6 computes the tendencies of six classes of hydrometeors (water vapour, cloud water, ice, snow, graupel, rain water) at each model grid point and contributes to the low, middle and high cloud fractions. By incorporating WSM6, for the first time in a global climate model, we show a reasonable simulation of the vertical and spatial distribution of cloud ice and cloud liquid water compared to CloudSat observations. CFS-CR also shows improvement in simulating the annual rainfall cycle and intraseasonal variability over the ISM region. These improvements in CFS-CR are likely associated with improved convective and stratiform rainfall distributions in the model. These initiatives clearly address a long-standing issue of resolving cloud processes in climate models and demonstrate that improved cloud and convective process parameterizations can reduce systematic bias and improve model fidelity.
NASA Trapezoidal Wing Computations Including Transition and Advanced Turbulence Modeling
NASA Technical Reports Server (NTRS)
Rumsey, C. L.; Lee-Rausch, E. M.
2012-01-01
Flow about the NASA Trapezoidal Wing is computed with several turbulence models by using grids from the first High Lift Prediction Workshop in an effort to advance understanding of computational fluid dynamics modeling for this type of flowfield. Transition is accounted for in many of the computations. In particular, a recently-developed 4-equation transition model is utilized and works well overall. Accounting for transition tends to increase lift and decrease moment, which improves the agreement with experiment. Upper surface flap separation is reduced, and agreement with experimental surface pressures and velocity profiles is improved. The predicted shape of wakes from upstream elements is strongly influenced by grid resolution in regions above the main and flap elements. Turbulence model enhancements to account for rotation and curvature have the general effect of increasing lift and improving the resolution of the wing tip vortex as it convects downstream. However, none of the models improve the prediction of surface pressures near the wing tip, where more grid resolution is needed.
Wu, Jiang; Li, Jia; Xu, Zhenming
2009-08-15
Electrostatic separation is an effective and environmentally friendly way to recycle metals and nonmetals from ground waste electrical and electronic equipment (WEEE). For this process, the trajectory of the conductive particles is significant, and several models have been established. However, the results of previous studies are limited by simplifying assumptions that lead to a notable discrepancy between model predictions and experimental results. In the present research, a roll-type corona-electrostatic separator and ground printed circuit board (PCB) wastes were used to investigate the trajectory of the conductive particles. Two factors, the air drag force and the different charging situations, were introduced into the model. Their effects were analyzed and an improved model for the theoretical trajectory of a conductive particle was established. Compared with the previous one, the improved model shows good agreement with the experimental results. It provides positive guidance for separator design and represents progress toward recycling metals and nonmetals from WEEE.
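A minimal sketch of the kind of trajectory computation involved: forward-Euler integration of a detached particle under gravity plus a quadratic air-drag term. All parameter values are illustrative, and the electric-field and image-charge forces of the full separator model are omitted here.

```python
import math

# Illustrative trajectory of a particle after it leaves the rotating roll,
# under gravity and quadratic air drag. Parameters are placeholders.
def trajectory(m=1e-6, k=1e-7, v0=1.5, angle=0.5, dt=1e-4, t_end=0.3):
    g = 9.81
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
    path = [(x, y)]
    t = 0.0
    while t < t_end:
        v = math.hypot(vx, vy)
        # quadratic drag opposing the velocity: F = -k * |v| * v_vec
        ax = -(k / m) * v * vx
        ay = -g - (k / m) * v * vy
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        path.append((x, y))
        t += dt
    return path

path = trajectory()
print(path[-1])  # final (x, y) position after 0.3 s of flight
```

Including the drag term bends the parabola of the drag-free solution, which is one of the two corrections the abstract credits for closing the gap with experiment.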
Improvement of Reynolds-Stress and Triple-Product Lag Models
NASA Technical Reports Server (NTRS)
Olsen, Michael E.; Lillard, Randolph P.
2017-01-01
The Reynolds-stress and triple-product Lag models were created with a normal-stress distribution defined by a 4:3:2 ratio of streamwise, spanwise and wall-normal stresses, and a ratio of r(sub w) = 0.3k in the log-layer region of high-Reynolds-number flat-plate flow, which implies R11+ = 4/((9/2) × 0.3) ≈ 2.96. More recent measurements show a more complex picture of the log-layer region at high Reynolds numbers. A first cut at improving these models, along with directions for future refinement, is described. Comparison with recent high-Reynolds-number data shows areas where further work is needed, but also shows that inclusion of the modeled turbulent-transport terms improves the prediction where they influence the solution. Additional work is needed to make the model better match experiment, but there is significant improvement in many details of the log-layer behavior.
Pan, Qing; Yao, Jialiang; Wang, Ruofan; Cao, Ping; Ning, Gangmin; Fang, Luping
2017-08-01
The vessels of the microcirculation continually adjust their structure to meet the functional requirements of different tissues. A previously developed theoretical model can reproduce this process of vascular structural adaptation, aiding the study of microcirculatory physiology. Until now, however, the model has lacked appropriate methods for setting its parameter values, limiting further applications. This study proposed an improved quantum-behaved particle swarm optimization (QPSO) algorithm for setting the parameter values of this model. The optimization was performed on a real mesenteric microvascular network of the rat. The results showed that the improved QPSO was superior to standard particle swarm optimization, standard QPSO and the previously reported Downhill algorithm. We conclude that the improved QPSO leads to better agreement between mathematical simulation and animal experiment, rendering the model more reliable for future physiological studies.
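For reference, a minimal standard QPSO loop on a toy objective; the paper's improved variant adds refinements beyond this sketch, and the sphere function below merely stands in for the network-adaptation cost.

```python
import math
import random

# Minimal standard QPSO on a toy objective (not the paper's improved variant).
def qpso(f, dim, n=20, iters=200, lo=-5.0, hi=5.0):
    xs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    pbest = [x[:] for x in xs]
    gbest = min(pbest, key=f)[:]
    for it in range(iters):
        beta = 1.0 - 0.5 * it / iters        # contraction-expansion coefficient
        mbest = [sum(p[d] for p in pbest) / n for d in range(dim)]
        for i, x in enumerate(xs):
            for d in range(dim):
                phi = random.random()
                p = phi * pbest[i][d] + (1 - phi) * gbest[d]  # local attractor
                u = random.random() + 1e-12
                step = beta * abs(mbest[d] - x[d]) * math.log(1.0 / u)
                x[d] = p + step if random.random() < 0.5 else p - step
            if f(x) < f(pbest[i]):
                pbest[i] = x[:]
                if f(x) < f(gbest):
                    gbest = x[:]
    return gbest

sphere = lambda v: sum(c * c for c in v)
best = qpso(sphere, dim=2)
print(sphere(best))  # should be near 0 for this convex toy problem
```

The quantum-behaved update draws each particle toward a stochastic attractor between its personal best and the global best, with a step scaled by the distance to the mean best position, which is what distinguishes QPSO from the velocity-based standard PSO.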
Research on artistic gymnastics training guidance model
NASA Astrophysics Data System (ADS)
Luo, Lin; Sun, Xianzhong
2017-04-01
A rhythmic gymnastics training guidance model, taking into consideration the features of artistic gymnastics training, is put forward to help gymnasts identify deficiencies and unskilled technical movements and improve their training effect. The model is built on the foundation of a physical quality indicator model and an artistic gymnastics training indicator model. The physical quality indicator model, composed of a bodily factor, a flexibility-strength factor and a speed-dexterity factor, delivers an objective evaluation with reference to basic sport testing data. The training indicator model, based on the physical fitness indicators, helps analyze technical movements and reveals the impact of each bodily factor on them. The training guidance model, combined with actual training data and compared against the training indicator model, helps identify problems in training and thus improve the training effect. These three models, used in combination and compared with historical model data, can verify the improvement in training effect over a period of time.
Zhu, Wei; Wang, Wei; Yuan, Gannan
2016-06-01
In order to improve tracking accuracy, model-estimation accuracy and responsiveness in multiple-model maneuvering target tracking, an interacting multiple model five-degree cubature Kalman filter (IMM5CKF) is proposed in this paper. In the proposed algorithm, the interacting multiple model (IMM) algorithm processes all models through a Markov chain to enhance the model-tracking accuracy of target tracking. A fifth-degree cubature Kalman filter (5CKF) then evaluates the surface integral with a higher-order but deterministic odd-degree spherical cubature rule, improving the tracking accuracy and the model-switch sensitivity of the IMM algorithm. Finally, simulation results demonstrate that the proposed algorithm switches quickly and smoothly between different maneuver models, and that it also outperforms the interacting multiple model cubature Kalman filter (IMMCKF), the interacting multiple model unscented Kalman filter (IMMUKF), the 5CKF and the optimal mode-transition-matrix IMM (OMTM-IMM).
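The Markov-chain interaction step that the abstract refers to can be sketched independently of the cubature filter itself; the transition matrix and likelihood values below are hypothetical, standing in for each filter's innovation likelihood.

```python
# Sketch of the IMM mixing and model-probability update (hypothetical numbers).
def imm_mix(mu, P):
    """mu: prior model probabilities; P[i][j]: transition prob i -> j.
    Returns predicted probabilities c and mixing weights w[i][j]."""
    r = len(mu)
    c = [sum(P[i][j] * mu[i] for i in range(r)) for j in range(r)]
    w = [[P[i][j] * mu[i] / c[j] for j in range(r)] for i in range(r)]
    return c, w

def imm_update(c, likelihoods):
    """Combine predicted probabilities with each model filter's likelihood."""
    num = [c[j] * likelihoods[j] for j in range(len(c))]
    s = sum(num)
    return [v / s for v in num]

# Two models (e.g. constant velocity vs. coordinated turn), sticky transitions:
c, w = imm_mix([0.9, 0.1], [[0.95, 0.05], [0.05, 0.95]])
mu_new = imm_update(c, [0.2, 1.5])  # model 2 fits the measurement better
print(mu_new)
```

A sharper likelihood from a more accurate per-model filter (here, the 5CKF) shifts these posterior model probabilities faster, which is the mechanism behind the improved model-switch sensitivity claimed above.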
Athens, Jessica K.; Remington, Patrick L.; Gangnon, Ronald E.
2015-01-01
Objectives The University of Wisconsin Population Health Institute has published the County Health Rankings since 2010. These rankings use population-based data to highlight health outcomes and the multiple determinants of these outcomes and to encourage in-depth health assessment for all United States counties. A significant methodological limitation, however, is the uncertainty of rank estimates, particularly for small counties. To address this challenge, we explore the use of longitudinal and pooled outcome data in hierarchical Bayesian models to generate county ranks with greater precision. Methods In our models we used pooled outcome data for three measure groups: (1) Poor physical and poor mental health days; (2) percent of births with low birth weight and fair or poor health prevalence; and (3) age-specific mortality rates for nine age groups. We used the fixed and random effects components of these models to generate posterior samples of rates for each measure. We also used time-series data in longitudinal random effects models for age-specific mortality. Based on the posterior samples from these models, we estimate ranks and rank quartiles for each measure, as well as the probability of a county ranking in its assigned quartile. Rank quartile probabilities for univariate, joint outcome, and/or longitudinal models were compared to assess improvements in rank precision. Results The joint outcome model for poor physical and poor mental health days resulted in improved rank precision, as did the longitudinal model for age-specific mortality rates. Rank precision for low birth weight births and fair/poor health prevalence based on the univariate and joint outcome models were equivalent. Conclusion Incorporating longitudinal or pooled outcome data may improve rank certainty, depending on characteristics of the measures selected. For measures with different determinants, joint modeling neither improved nor degraded rank precision. 
This approach suggests a simple way to use existing information to improve the precision of small-area measures of population health. PMID:26098858
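The rank and rank-quartile machinery described above can be sketched directly from posterior samples; the synthetic draws below are stand-ins for samples from the fitted hierarchical models.

```python
import random
import statistics

# Sketch: from posterior samples of county rates, compute each county's rank
# per draw and the probability it falls in its assigned rank quartile.
random.seed(42)
n_counties, n_draws = 20, 500
true_rates = [random.uniform(5, 15) for _ in range(n_counties)]
samples = [[random.gauss(r, 1.0) for r in true_rates] for _ in range(n_draws)]

def ranks(row):
    order = sorted(range(len(row)), key=lambda i: row[i])
    r = [0] * len(row)
    for rank, i in enumerate(order, start=1):
        r[i] = rank                       # 1 = lowest (best) rate
    return r

rank_draws = [ranks(row) for row in samples]
per_county = list(zip(*rank_draws))        # posterior draws of each county's rank
point_rank = [statistics.median(r) for r in per_county]
q_size = n_counties / 4.0
assigned_q = [min(4, int((pr - 1) // q_size) + 1) for pr in point_rank]
prob_in_q = [sum(1 for r in per_county[i]
                 if int((r - 1) // q_size) + 1 == assigned_q[i]) / n_draws
             for i in range(n_counties)]
print([round(p, 2) for p in prob_in_q])
```

Counties whose rate posteriors are wide (small counties) show low quartile probabilities, which is exactly the precision problem that pooling outcomes or adding longitudinal data is meant to reduce.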
Improving the use of health data for health system strengthening.
Nutley, Tara; Reynolds, Heidi W
2013-02-13
Good quality and timely data from health information systems are the foundation of all health systems. However, too often data sit in reports, on shelves or in databases and are not sufficiently utilised in policy and program development, improvement, strategic planning and advocacy. Without specific interventions aimed at improving the use of data produced by information systems, health systems will never fully meet the needs of the populations they serve. Our objective is to employ a logic model to describe a pathway by which specific activities and interventions can strengthen the use of health data in decision making and ultimately strengthen the health system. A logic model was developed to provide a practical strategy for developing, monitoring and evaluating interventions to strengthen the use of data in decision making. The model draws on the collective strengths and similarities of previous work, and adds to it by making specific recommendations about the interventions and activities that most directly affect the use of data in decision making. The model provides an organizing framework for how interventions and activities work to strengthen the systematic demand, synthesis, review and use of data. The logic model and guidance are presented to facilitate widespread use and to enable improved data-informed decision making in program review and planning, advocacy and policy development. Real-world examples from the literature support the feasible application of the activities outlined in the model. The logic model provides specific and comprehensive guidance to improve data demand and use. It can be used to design, monitor and evaluate interventions, and to improve demand for, and use of, data in decision making. As more interventions are implemented to improve the use of health data, those efforts need to be evaluated.
Using airborne geophysical surveys to improve groundwater resource management models
Abraham, Jared D.; Cannia, James C.; Peterson, Steven M.; Smith, Bruce D.; Minsley, Burke J.; Bedrosian, Paul A.
2010-01-01
Increasingly, groundwater management requires more accurate hydrogeologic frameworks for groundwater models. These complex issues have created the demand for innovative approaches to data collection. In complicated terrains, groundwater modelers benefit from continuous high‐resolution geologic maps and their related hydrogeologic‐parameter estimates. The USGS and its partners have collaborated to use airborne geophysical surveys for near‐continuous coverage of areas of the North Platte River valley in western Nebraska. The survey objectives were to map the aquifers and bedrock topography of the area to help improve the understanding of groundwater‐surface‐water relationships, leading to improved water management decisions. Frequency‐domain heliborne electromagnetic surveys were completed, using a unique survey design to collect resistivity data that can be related to lithologic information to refine groundwater model inputs. To render the geophysical data useful to multidimensional groundwater models, numerical inversion is necessary to convert the measured data into a depth‐dependent subsurface resistivity model. This inverted model, in conjunction with sensitivity analysis, geological ground truth (boreholes and surface geology maps), and geological interpretation, is used to characterize hydrogeologic features. Interpreted two‐ and three‐dimensional data coverage provides the groundwater modeler with a high‐resolution hydrogeologic framework and a quantitative estimate of framework uncertainty. This method of creating hydrogeologic frameworks improved the understanding of flow path orientation by redefining the location of the paleochannels and associated bedrock highs. The improved models reflect actual hydrogeology at a level of accuracy not achievable using previous data sets.
Improving and Evaluating Nested Sampling Algorithm for Marginal Likelihood Estimation
NASA Astrophysics Data System (ADS)
Ye, M.; Zeng, X.; Wu, J.; Wang, D.; Liu, J.
2016-12-01
With the growing impacts of climate change and human activities on the water cycle, an increasing number of studies focus on the quantification of modeling uncertainty. Bayesian model averaging (BMA) provides a popular framework for quantifying conceptual-model and parameter uncertainty. The ensemble prediction is generated by combining each plausible model's prediction, with each model assigned a weight determined by its prior weight and marginal likelihood. Thus, the estimation of a model's marginal likelihood is crucial for reliable and accurate BMA prediction. The nested sampling estimator (NSE) is a newly proposed method for marginal likelihood estimation. NSE proceeds by searching the parameter space gradually from low-likelihood to high-likelihood regions, an evolution carried out iteratively through a local sampling procedure, so the efficiency of NSE is dominated by the strength of that procedure. Currently, the Metropolis-Hastings (M-H) algorithm is often used for local sampling. However, M-H is not an efficient sampling algorithm for high-dimensional or complicated parameter spaces. To improve the efficiency of NSE, we incorporate the robust and efficient DREAMzs sampling algorithm into the local sampling step of NSE. The comparison results demonstrated that the improved NSE increases the efficiency of marginal likelihood estimation significantly. However, both the improved and original NSEs suffer from instability. In addition, the heavy computational cost of the huge number of model executions is overcome by using adaptive sparse-grid surrogates.
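A toy nested-sampling estimate of the marginal likelihood (evidence) Z = ∫ L(θ)p(θ)dθ for a unit Gaussian likelihood and a uniform prior on [-5, 5]; plain rejection from the prior stands in here for the M-H / DREAMzs local sampling step discussed above, and the final live-point contribution is omitted for brevity.

```python
import math
import random

def loglike(theta):
    # unit Gaussian likelihood
    return -0.5 * theta ** 2 - 0.5 * math.log(2 * math.pi)

def logaddexp(a, b):
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log1p(math.exp(min(a, b) - m))

def nested_sampling(n_live=200, n_iter=1200, seed=1):
    random.seed(seed)
    live = [random.uniform(-5.0, 5.0) for _ in range(n_live)]  # prior draws
    logZ, logX_prev = -math.inf, 0.0
    for i in range(1, n_iter + 1):
        worst = min(live, key=loglike)
        logL = loglike(worst)
        logX = -i / n_live                  # E[ln X_i] shrinks by 1/n_live
        logw = logX_prev + math.log1p(-math.exp(logX - logX_prev))
        logZ = logaddexp(logZ, logL + logw) # accumulate L_i * (X_{i-1} - X_i)
        while True:                          # draw a replacement above L_min
            cand = random.uniform(-5.0, 5.0)
            if loglike(cand) > logL:
                live[live.index(worst)] = cand
                break
        logX_prev = logX
    return logZ

print(math.exp(nested_sampling()))  # analytic evidence is ~0.1 (N(0,1) mass / 10)
```

The rejection step is exactly where efficiency collapses in realistic problems, since the constrained-prior acceptance rate shrinks geometrically, which motivates replacing it with an MCMC sampler such as DREAMzs.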
Yu, Chun-tang; Liu, Ying-ying; Xia, Yu-feng
2014-01-01
The stress-strain data of 20MnNiMo alloy were collected from a series of hot compressions on a Gleeble-1500 thermal-mechanical simulator in the temperature range of 1173∼1473 K and strain rate range of 0.01∼10 s−1. Based on the experimental data, an improved Arrhenius-type constitutive model and an artificial neural network (ANN) model were established to predict the high-temperature flow stress of as-cast 20MnNiMo alloy. The accuracy and reliability of the improved Arrhenius-type model and the trained ANN model were further evaluated in terms of the correlation coefficient (R), the average absolute relative error (AARE), and the relative error (η). For the former, R and AARE were found to be 0.9954 and 5.26%, respectively; for the latter, 0.9997 and 1.02%. The relative errors (η) of the improved Arrhenius-type model and the ANN model were, respectively, in the range of −39.99%∼35.05% and −3.77%∼16.74%. For the former, only 16.3% of the test data set possesses η-values within ±1%, while for the latter more than 79% does. The results indicate that the ANN model has a higher predictive ability than the improved Arrhenius-type constitutive model. PMID:24688358
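For context, Arrhenius-type constitutive models of this family typically express flow stress through the Zener-Hollomon parameter and a sine-hyperbolic law. The sketch below shows that structure; the material constants A, α, n and Q are illustrative placeholders, not the fitted values for as-cast 20MnNiMo.

```python
import math

R_GAS = 8.314  # J/(mol K)

# Hedged sketch of a sine-hyperbolic Arrhenius-type flow-stress model;
# A, alpha, n, Q below are placeholder constants, not fitted values.
def flow_stress(strain_rate, T, A=1e13, alpha=0.01, n=5.0, Q=380e3):
    Z = strain_rate * math.exp(Q / (R_GAS * T))   # Zener-Hollomon parameter
    zn = (Z / A) ** (1.0 / n)
    # sigma = (1/alpha) * asinh((Z/A)^(1/n)), written out explicitly
    return (1.0 / alpha) * math.log(zn + math.sqrt(zn * zn + 1.0))

# Faster deformation (higher Z) at fixed temperature raises flow stress:
print(flow_stress(0.1, 1273), flow_stress(10.0, 1273))
```

Strain dependence is usually added by making A, α, n and Q polynomial functions of strain, which is one common sense in which such models are "improved".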
Redesigning inpatient care: Testing the effectiveness of an accountable care team model.
Kara, Areeba; Johnson, Cynthia S; Nicley, Amy; Niemeier, Michael R; Hui, Siu L
2015-12-01
US healthcare underperforms on quality and safety metrics. Inpatient care constitutes an immense opportunity to intervene to improve care. Describe a model of inpatient care and measure its impact. A quantitative assessment of the implementation of a new model of care. The graded implementation of the model allowed us to follow outcomes and measure their association with the dose of the implementation. Inpatient medical and surgical units in a large academic health center. Eight interventions rooted in improving interprofessional collaboration (IPC), enabling data-driven decisions, and providing leadership were implemented. Outcome data from August 2012 to December 2013 were analyzed using generalized linear mixed models for associations with the implementation of the model. Length of stay (LOS) index, case-mix index-adjusted variable direct costs (CMI-adjusted VDC), 30-day readmission rates, overall patient satisfaction scores, and provider satisfaction with the model were measured. The implementation of the model was associated with decreases in LOS index (P < 0.0001) and CMI-adjusted VDC (P = 0.0006). We did not detect improvements in readmission rates or patient satisfaction scores. Most providers (95.8%, n = 92) agreed that the model had improved the quality and safety of the care delivered. Creating an environment and framework in which IPC is fostered, performance data are transparently available, and leadership is provided may improve value on both medical and surgical units. These interventions appear to be well accepted by front-line staff. Readmission rates and patient satisfaction remain challenging. © 2015 Society of Hospital Medicine.
Regional Seismic Travel-Time Prediction, Uncertainty, and Location Improvement in Western Eurasia
NASA Astrophysics Data System (ADS)
Flanagan, M. P.; Myers, S. C.
2004-12-01
We investigate our ability to improve regional travel-time prediction and seismic event location using an a priori, three-dimensional velocity model of Western Eurasia and North Africa: WENA1.0 [Pasyanos et al., 2004]. Our objective is to improve the accuracy of seismic location estimates and calculate representative location uncertainty estimates. As we focus on the geographic region of Western Eurasia, the Middle East, and North Africa, we develop, test, and validate 3D model-based travel-time prediction models for 30 stations in the study region. Three principal results are presented. First, the 3D WENA1.0 velocity model improves travel-time prediction over the iasp91 model, as measured by variance reduction, for regional Pg, Pn, and P phases recorded at the 30 stations. Second, a distance-dependent uncertainty model is developed and tested for the WENA1.0 model. Third, an end-to-end validation test based on 500 event relocations demonstrates improved location performance over the 1-dimensional iasp91 model. Validation of the 3D model is based on a comparison of approximately 11,000 Pg, Pn, and P travel-time predictions and empirical observations from ground truth (GT) events. Ray coverage for the validation dataset is chosen to provide representative, regional-distance sampling across Eurasia and North Africa. The WENA1.0 model markedly improves travel-time predictions for most stations with an average variance reduction of 25% for all ray paths. We find that improvement is station dependent, with some stations benefiting greatly from WENA1.0 predictions (52% at APA, 33% at BKR, and 32% at NIL), some stations showing moderate improvement (12% at KEV, 14% at BOM, and 12% at TAM), some benefiting only slightly (6% at MOX, and 4% at SVE), and some are degraded (-6% at MLR and -18% at QUE). We further test WENA1.0 by comparing location accuracy with results obtained using the iasp91 model. 
Again, relocation of these events is dependent on ray paths that evenly sample WENA1.0 and therefore provide an unbiased assessment of location performance. A statistically significant sample is achieved by generating 500 location realizations based on 5 events with location accuracy between 1 km and 5 km. Each realization is a randomly selected event with location determined by randomly selecting 5 stations from the available network. In 340 cases (68% of the instances), locations are improved, and average mislocation is reduced from 31 km to 26 km. Preliminary tests of uncertainty estimates suggest that our uncertainty model produces location uncertainty ellipses that are representative of location accuracy. These results highlight the importance of accurate GT datasets in assessing regional travel-time models and demonstrate that an a priori 3D model can markedly improve our ability to locate small magnitude events in a regional monitoring context. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48, Contribution UCRL-CONF-206386.
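The per-station improvement figures quoted above are variance reductions of travel-time residuals. A minimal sketch of that metric, with illustrative residuals rather than actual WENA1.0 data:

```python
# Variance reduction of 3D-model residuals relative to 1D (iasp91) residuals.
def variance_reduction(resid_1d, resid_3d):
    def var(r):
        m = sum(r) / len(r)
        return sum((x - m) ** 2 for x in r) / len(r)
    return 100.0 * (1.0 - var(resid_3d) / var(resid_1d))

# Illustrative residuals in seconds (not actual station data):
r1 = [2.1, -1.8, 3.0, -2.5, 1.9, -2.2]
r3 = [1.0, -0.9, 1.6, -1.2, 1.1, -1.0]
print(round(variance_reduction(r1, r3), 1))  # positive => 3D model improves fit
```

A negative value, as at stations MLR and QUE above, means the 3D model's residual scatter is larger than the 1D baseline's.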
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tilmes, Simone; Lamarque, Jean -Francois; Emmons, Louisa K.
The Community Earth System Model (CESM1) CAM4-chem has been used to perform the Chemistry Climate Model Initiative (CCMI) reference and sensitivity simulations. In this model, the Community Atmospheric Model version 4 (CAM4) is fully coupled to tropospheric and stratospheric chemistry. Details and specifics of each configuration, including new developments and improvements are described. CESM1 CAM4-chem is a low-top model that reaches up to approximately 40 km and uses a horizontal resolution of 1.9° latitude and 2.5° longitude. For the specified dynamics experiments, the model is nudged to Modern-Era Retrospective Analysis for Research and Applications (MERRA) reanalysis. We summarize the performance of the three reference simulations suggested by CCMI, with a focus on the last 15 years of the simulation when most observations are available. Comparisons with selected data sets are employed to demonstrate the general performance of the model. We highlight new data sets that are suited for multi-model evaluation studies. Most important improvements of the model are the treatment of stratospheric aerosols and the corresponding adjustments for radiation and optics, the updated chemistry scheme including improved polar chemistry and stratospheric dynamics and improved dry deposition rates. These updates lead to a very good representation of tropospheric ozone within 20 % of values from available observations for most regions. In particular, the trend and magnitude of surface ozone is much improved compared to earlier versions of the model. Furthermore, stratospheric column ozone of the Southern Hemisphere in winter and spring is reasonably well represented. In conclusion, all experiments still underestimate CO most significantly in Northern Hemisphere spring and show a significant underestimation of hydrocarbons based on surface observations.
Tilmes, Simone; Lamarque, Jean -Francois; Emmons, Louisa K.; ...
2016-05-20
The Community Earth System Model (CESM1) CAM4-chem has been used to perform the Chemistry Climate Model Initiative (CCMI) reference and sensitivity simulations. In this model, the Community Atmospheric Model version 4 (CAM4) is fully coupled to tropospheric and stratospheric chemistry. Details and specifics of each configuration, including new developments and improvements are described. CESM1 CAM4-chem is a low-top model that reaches up to approximately 40 km and uses a horizontal resolution of 1.9° latitude and 2.5° longitude. For the specified dynamics experiments, the model is nudged to Modern-Era Retrospective Analysis for Research and Applications (MERRA) reanalysis. We summarize the performance ofmore » the three reference simulations suggested by CCMI, with a focus on the last 15 years of the simulation when most observations are available. Comparisons with selected data sets are employed to demonstrate the general performance of the model. We highlight new data sets that are suited for multi-model evaluation studies. Most important improvements of the model are the treatment of stratospheric aerosols and the corresponding adjustments for radiation and optics, the updated chemistry scheme including improved polar chemistry and stratospheric dynamics and improved dry deposition rates. These updates lead to a very good representation of tropospheric ozone within 20 % of values from available observations for most regions. In particular, the trend and magnitude of surface ozone is much improved compared to earlier versions of the model. Furthermore, stratospheric column ozone of the Southern Hemisphere in winter and spring is reasonably well represented. In conclusion, all experiments still underestimate CO most significantly in Northern Hemisphere spring and show a significant underestimation of hydrocarbons based on surface observations.« less
Update on Bayesian Blocks: Segmented Models for Sequential Data
NASA Technical Reports Server (NTRS)
Scargle, Jeff
2017-01-01
The Bayesian Blocks algorithm, in wide use in astronomy and other areas, has been improved in several ways. The model for block shape has been generalized to include other than a constant signal rate, e.g., linear, exponential, or other parametric models. In addition, the computational efficiency has been improved, so that the basic algorithm is O(N) in most cases instead of O(N**2). Other improvements in the theory and application of segmented representations will be described.
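The segmentation underlying Bayesian Blocks is a dynamic program over candidate change points. The sketch below implements the classic O(N**2) formulation for binned Poisson data with a constant-rate block fitness; the O(N) improvements and non-constant block shapes described in the abstract are not reproduced, and the `ncp_prior` value is an illustrative assumption:

```python
import math

def bayesian_blocks(counts, widths, ncp_prior=4.0):
    """Classic O(N^2) dynamic program for piecewise-constant (Poisson) data.

    counts[i]: events in cell i; widths[i]: length of cell i.
    Returns the start index of each block. ncp_prior penalizes extra blocks.
    """
    n = len(counts)
    best = [0.0] * (n + 1)   # best[k]: max total fitness of the first k cells
    last = [0] * (n + 1)     # last[k]: start index of the final block
    for k in range(1, n + 1):
        best[k] = -math.inf
        nn, tt = 0.0, 0.0
        for r in range(k, 0, -1):          # candidate final block: cells r-1 .. k-1
            nn += counts[r - 1]
            tt += widths[r - 1]
            fit = nn * math.log(nn / tt) if nn > 0 else 0.0
            score = best[r - 1] + fit - ncp_prior
            if score > best[k]:
                best[k], last[k] = score, r - 1
    # Walk back through the recorded change points.
    edges, k = [], n
    while k > 0:
        edges.append(last[k])
        k = last[k]
    return edges[::-1]
```

On a sequence whose rate jumps from 5 to 50 counts per cell, the program recovers a single change point at the jump.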
Improved modeling of photon observables with the event-by-event fission model FREYA
Vogt, R.; Randrup, J.
2017-12-28
The event-by-event fission model FREYA has been improved, in particular to address deficiencies in the calculation of photon observables. In this paper, we discuss the improvements that have been made and introduce several new variables, some detector dependent, that affect the photon observables. We show the sensitivity of FREYA to these variables. Finally, we compare the results to the available photon data from spontaneous and thermal neutron-induced fission.
Optimization of multi-objective micro-grid based on improved particle swarm optimization algorithm
NASA Astrophysics Data System (ADS)
Zhang, Jian; Gan, Yang
2018-04-01
The paper presents a multi-objective optimal configuration model for an independent micro-grid with the twin aims of economy and environmental protection. The Pareto solution set can be obtained by solving the multi-objective optimization configuration model of the micro-grid with the improved particle swarm algorithm. The feasibility of the improved particle swarm optimization algorithm for the multi-objective model is verified, which provides an important reference for multi-objective optimization of independent micro-grids.
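For reference, a minimal single-objective particle swarm optimizer is sketched below; the improved multi-objective variant used in the paper additionally maintains a Pareto archive to produce a whole solution set, which this sketch omits. All hyperparameter values are illustrative assumptions:

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer: minimize f over a box [lo, hi]^dim."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull (pbest) + social pull (gbest)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

On a smooth test function such as the sphere, the swarm converges close to the optimum within a few hundred iterations.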
Parchebafieh, Samaneh; Gholizadeh, Leila; Lakdizaji, Sima; Ghiasvandiyan, Shahrzad; Davoodi, Arefeh
2014-01-01
This study examined the effectiveness of the clinical teaching associate (CTA) model to improve clinical learning outcomes in nursing students. Students were randomly allocated to either the CTA (n = 28) or traditional training group (n = 32), and their clinical knowledge, skills, and satisfaction with the learning experience were assessed and compared. The results showed that the CTA model was equally effective in improving clinical knowledge, skills, and satisfaction of nursing students.
2007-05-01
Organizational Structure; 6.1.3 Funding Model; 6.1.4 Role of Information Technology; 6.2 Considering Process Improvement; 6.2.1 Dimensions of ... to the process definition for resiliency engineering. 6.1.3 Funding Model: Just as organizational structures tend to align across security and ... responsibility. Adopting an enterprise view of operational resiliency and a process improvement approach requires that the funding model evolve to one ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.
Endometrial cancer risk prediction including serum-based biomarkers: results from the EPIC cohort.
Fortner, Renée T; Hüsing, Anika; Kühn, Tilman; Konar, Meric; Overvad, Kim; Tjønneland, Anne; Hansen, Louise; Boutron-Ruault, Marie-Christine; Severi, Gianluca; Fournier, Agnès; Boeing, Heiner; Trichopoulou, Antonia; Benetou, Vasiliki; Orfanos, Philippos; Masala, Giovanna; Agnoli, Claudia; Mattiello, Amalia; Tumino, Rosario; Sacerdote, Carlotta; Bueno-de-Mesquita, H B As; Peeters, Petra H M; Weiderpass, Elisabete; Gram, Inger T; Gavrilyuk, Oxana; Quirós, J Ramón; Maria Huerta, José; Ardanaz, Eva; Larrañaga, Nerea; Lujan-Barroso, Leila; Sánchez-Cantalejo, Emilio; Butt, Salma Tunå; Borgquist, Signe; Idahl, Annika; Lundin, Eva; Khaw, Kay-Tee; Allen, Naomi E; Rinaldi, Sabina; Dossus, Laure; Gunter, Marc; Merritt, Melissa A; Tzoulaki, Ioanna; Riboli, Elio; Kaaks, Rudolf
2017-03-15
Endometrial cancer risk prediction models including lifestyle, anthropometric and reproductive factors have limited discrimination. Adding biomarker data to these models may improve predictive capacity; to our knowledge, this has not been investigated for endometrial cancer. Using a nested case-control study within the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort, we investigated the improvement in discrimination gained by adding serum biomarker concentrations to risk estimates derived from an existing risk prediction model based on epidemiologic factors. Serum concentrations of sex steroid hormones, metabolic markers, growth factors, adipokines and cytokines were evaluated in a step-wise backward selection process; biomarkers were retained at p < 0.157 indicating improvement in the Akaike information criterion (AIC). Improvement in discrimination was assessed using the C-statistic for all biomarkers alone, and change in C-statistic from addition of biomarkers to preexisting absolute risk estimates. We used internal validation with bootstrapping (1000-fold) to adjust for over-fitting. Adiponectin, estrone, interleukin-1 receptor antagonist, tumor necrosis factor-alpha and triglycerides were selected into the model. After accounting for over-fitting, discrimination was improved by 2.0 percentage points when all evaluated biomarkers were included and 1.7 percentage points in the model including the selected biomarkers. Models including etiologic markers on independent pathways and genetic markers may further improve discrimination. © 2016 UICC.
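The step-wise backward selection the authors describe, retaining biomarkers only while removal worsens the AIC (equivalent to the p < 0.157 criterion), can be sketched for an ordinary linear model as follows. The helper names and toy data are assumptions for illustration, not the EPIC analysis code:

```python
import math

def ols_rss(X, y):
    """Residual sum of squares of a least-squares fit via the normal equations."""
    k, n = len(X[0]), len(y)
    A = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    b = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    for col in range(k):                       # Gaussian elimination, partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            m = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= m * A[col][c]
            b[r] -= m * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):             # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return sum((y[i] - sum(X[i][c] * beta[c] for c in range(k))) ** 2 for i in range(n))

def aic(X, y):
    n, k = len(y), len(X[0])
    return n * math.log(ols_rss(X, y) / n) + 2 * k

def backward_select(X, y, names):
    """Drop one predictor at a time as long as doing so lowers the AIC."""
    cols = list(range(len(names)))
    while len(cols) > 1:
        current = aic([[row[c] for c in cols] for row in X], y)
        trials = []
        for j in cols:
            kept = [c for c in cols if c != j]
            trials.append((aic([[row[c] for c in kept] for row in X], y), j))
        best_aic, drop = min(trials)
        if best_aic >= current:
            break
        cols.remove(drop)
    return [names[c] for c in cols]
```

On data generated as y = 1 + 2*x1 plus a small perturbation, an irrelevant column is dropped while the intercept and x1 are retained.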
NASA Astrophysics Data System (ADS)
Ma, Yulong; Liu, Heping
2017-12-01
Atmospheric flow over complex terrain, particularly recirculation flows, greatly influences wind-turbine siting, forest-fire behaviour, and trace-gas and pollutant dispersion. However, there is a large uncertainty in the simulation of flow over complex topography, which is attributable to the type of turbulence model, the subgrid-scale (SGS) turbulence parametrization, terrain-following coordinates, and numerical errors in finite-difference methods. Here, we upgrade the large-eddy simulation module within the Weather Research and Forecasting model by incorporating the immersed-boundary method into the module to improve simulations of the flow and recirculation over complex terrain. Simulations over Bolund Hill show reduced mean absolute speed-up errors with respect to previous studies, as well as an improved simulation of the recirculation zone behind the escarpment of the hill. With regard to the SGS parametrization, the Lagrangian-averaged scale-dependent Smagorinsky model performs better than the classic Smagorinsky model in reproducing both velocity and turbulent kinetic energy. A finer grid resolution also improves the strength of the recirculation in flow simulations, with a higher horizontal grid resolution improving simulations just behind the escarpment, and a higher vertical grid resolution improving results on the lee side of the hill. Our modelling approach has broad applications for the simulation of atmospheric flows over complex topography.
Improved parameter inference in catchment models: 1. Evaluating parameter uncertainty
NASA Astrophysics Data System (ADS)
Kuczera, George
1983-10-01
A Bayesian methodology is developed to evaluate parameter uncertainty in catchment models fitted to a hydrologic response such as runoff, the goal being to improve the chance of successful regionalization. The catchment model is posed as a nonlinear regression model with stochastic errors possibly being both autocorrelated and heteroscedastic. The end result of this methodology, which may use Box-Cox power transformations and ARMA error models, is the posterior distribution, which summarizes what is known about the catchment model parameters. This can be simplified to a multivariate normal provided a linearization in parameter space is acceptable; means of checking and improving this assumption are discussed. The posterior standard deviations give a direct measure of parameter uncertainty, and study of the posterior correlation matrix can indicate what kinds of data are required to improve the precision of poorly determined parameters. Finally, a case study involving a nine-parameter catchment model fitted to monthly runoff and soil moisture data is presented. It is shown that use of ordinary least squares when its underlying error assumptions are violated gives an erroneous description of parameter uncertainty.
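The Box-Cox power transformation mentioned above can be sketched generically as follows, with the exponent lambda chosen by maximizing the profile log-likelihood over a grid. This is a stand-alone illustration under i.i.d. normal errors, not the paper's catchment-model machinery (which embeds the transform in a nonlinear regression with ARMA errors):

```python
import math

def boxcox(y, lam):
    """Box-Cox power transform of positive data y."""
    if abs(lam) < 1e-12:
        return [math.log(v) for v in y]
    return [(v ** lam - 1.0) / lam for v in y]

def boxcox_loglik(y, lam):
    """Profile log-likelihood of lambda, assuming i.i.d. normal errors.

    The (lam - 1) * sum(log y) term is the Jacobian of the transform.
    """
    n = len(y)
    z = boxcox(y, lam)
    mu = sum(z) / n
    var = sum((v - mu) ** 2 for v in z) / n
    return -0.5 * n * math.log(var) + (lam - 1.0) * sum(math.log(v) for v in y)

def best_lambda(y, grid=None):
    """Grid search for the maximum-likelihood lambda."""
    grid = grid or [i / 10.0 for i in range(-20, 21)]
    return max(grid, key=lambda lam: boxcox_loglik(y, lam))
```

For data that are symmetric on a log scale, the selected lambda lands near zero, i.e., the log transform.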
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirman, C.R., E-mail: ckirman@summittoxicology.com
A physiologically based pharmacokinetic (PBPK) model for hexavalent chromium [Cr(VI)] in mice, rats, and humans developed previously (Kirman et al., 2012, 2013) was updated to reflect an improved understanding of the toxicokinetics of the gastrointestinal tract following oral exposures. Improvements were made to: (1) the reduction model, which describes the pH-dependent reduction of Cr(VI) to Cr(III) in the gastrointestinal tract under both fasted and fed states; (2) drinking water pattern simulations, to better describe dosimetry in rodents under the conditions of the NTP cancer bioassay; and (3) model parameterization, to characterize potentially sensitive human populations. Important species differences, sources of non-linear toxicokinetics, and human variation are identified and discussed within the context of human health risk assessment. - Highlights: • An improved version of the PBPK model for Cr(VI) toxicokinetics was developed. • The model incorporates data collected to fill important data gaps. • Model predictions for specific age groups and sensitive subpopulations are provided. • Implications to human health risk assessment are discussed.
Xu, Dong; Zhang, Jian; Roy, Ambrish; Zhang, Yang
2011-01-01
I-TASSER is an automated pipeline for protein tertiary structure prediction using multiple threading alignments and iterative structure assembly simulations. In CASP9 experiments, two new algorithms, QUARK and FG-MD, were added to the I-TASSER pipeline for improving the structural modeling accuracy. QUARK is a de novo structure prediction algorithm used for structure modeling of proteins that lack detectable template structures. For distantly homologous targets, QUARK models are found useful as a reference structure for selecting good threading alignments and guiding the I-TASSER structure assembly simulations. FG-MD is an atomic-level structural refinement program that uses structural fragments collected from PDB structures to guide molecular dynamics simulation and improve the local structure of the predicted model, including hydrogen-bonding networks, torsion angles and steric clashes. Despite considerable progress in both template-based and template-free structure modeling, significant improvements in protein target classification, domain parsing, model selection, and ab initio folding of beta-proteins are still needed to further improve the I-TASSER pipeline. PMID:22069036
Improving Earth/Prediction Models to Improve Network Processing
NASA Astrophysics Data System (ADS)
Wagner, G. S.
2017-12-01
The United States Atomic Energy Detection System (USAEDS) primary seismic network consists of a relatively small number of arrays and three-component stations. The relatively small number of stations in the USAEDS primary network makes it both necessary and feasible to optimize both station and network processing. Station processing improvements include detector tuning efforts that use Receiver Operating Characteristic (ROC) curves to help judiciously set acceptable Type 1 (false alarm) vs. Type 2 (miss) error rates. Other station processing improvements include the use of empirical/historical observations and continuous background noise measurements to compute time-varying, maximum-likelihood probability-of-detection thresholds. The USAEDS network processing software makes extensive use of the azimuth and slowness information provided by frequency-wavenumber analysis at array sites, and polarization analysis at three-component sites. Most of the improvements in USAEDS network processing are due to improvements in the models used to predict azimuth, slowness, and probability of detection. Kriged travel-time, azimuth, and slowness corrections, and their associated uncertainties, are computed using a ground-truth database. Improvements in station processing and the use of improved models for azimuth, slowness, and probability of detection have led to significant improvements in USAEDS network processing.
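Choosing a detector threshold from a ROC curve, trading the Type 1 (false-alarm) rate against the Type 2 (miss) rate, can be sketched as follows; the score lists and threshold grid are illustrative assumptions, not USAEDS detector internals:

```python
def roc_points(noise_scores, signal_scores, thresholds):
    """(threshold, false-alarm rate, detection rate) for each candidate threshold."""
    pts = []
    for t in thresholds:
        fa = sum(s >= t for s in noise_scores) / len(noise_scores)   # Type 1 rate
        pd = sum(s >= t for s in signal_scores) / len(signal_scores) # 1 - Type 2 rate
        pts.append((t, fa, pd))
    return pts

def pick_threshold(noise_scores, signal_scores, thresholds, max_fa):
    """Lowest threshold whose false-alarm rate stays under the cap.

    Because detection rate is monotone non-increasing in the threshold,
    the lowest admissible threshold maximizes the detection rate.
    """
    ok = [p for p in roc_points(noise_scores, signal_scores, thresholds)
          if p[1] <= max_fa]
    return min(ok)[0] if ok else None
```

With well-separated noise and signal score distributions, the threshold lands in the gap between them.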
A comparison of walk-in counselling and the wait list model for delivering counselling services.
Stalker, Carol A; Riemer, Manuel; Cait, Cheryl-Anne; Horton, Susan; Booton, Jocelyn; Josling, Leslie; Bedggood, Joanna; Zaczek, Margaret
2016-10-01
Walk-in counselling has been used to reduce wait times but there are few controlled studies to compare outcomes between walk-in and the traditional model of service delivery. To compare change in psychological distress by clients receiving services from two models of service delivery, a walk-in counselling model and a traditional counselling model involving a wait list. Mixed-methods sequential explanatory design including quantitative comparison of groups with one pre-test and two follow-ups, and qualitative analysis of interviews with a sub-sample. Five-hundred and twenty-four participants ≥16 years were recruited from two Family Counselling Agencies; the General Health Questionnaire-12 assessed change in psychological distress. Hierarchical linear modelling revealed clients of the walk-in model improved faster and were less distressed at the four-week follow-up compared to the traditional service delivery model. Ten weeks later, both groups had improved and were similar. Participants receiving instrumental services prior to baseline improved more slowly. The qualitative data confirmed participants highly valued the accessibility of the walk-in model, and were frustrated by the lengthy waits associated with the traditional model. This study improves methodologically on previous studies of walk-in counselling, an approach to service delivery not conducive to randomized controlled trials.
An Employee-Centered Care Model Responds to the Triple Aim: Improving Employee Health.
Fox, Kelly; McCorkle, Ruth
2018-01-01
Health care expenditures, patient satisfaction, and timely access to care will remain problematic if dramatic changes in health care delivery models are not developed and implemented. To combat this challenge, a Triple Aim approach is essential; innovation in payment and health care delivery models is required. Using the Donabedian framework of structure, process, and outcome, this article describes a nurse-led employee-centered care model designed to improve consumers' health care experiences, improve employee health, and increase access to care while reducing health care costs for employees, age 18 and older, in a corporate environment.
Improved Slip Casting Of Ceramic Models
NASA Technical Reports Server (NTRS)
Buck, Gregory M.; Vasquez, Peter; Hicks, Lana P.
1994-01-01
Improved technique of investment slip casting developed for making precise ceramic wind-tunnel models. Needed in wind-tunnel experiments to verify predictions of aerothermodynamical computer codes. Ceramic materials used because of their low heat conductivities and ability to survive high temperatures. Present improved slip-casting technique enables casting of highly detailed models from aqueous or nonaqueous solutions. Wet shell molds peeled off models to ensure precise and undamaged details. Used at NASA Langley Research Center to form superconducting ceramic components from nonaqueous slip solutions. Technique has many more applications when ceramic materials developed further for such high-strength/ temperature components as engine parts.
The Role of Multimodel Combination in Improving Streamflow Prediction
NASA Astrophysics Data System (ADS)
Arumugam, S.; Li, W.
2008-12-01
Model errors are an inevitable part of any prediction exercise. One approach currently gaining attention is to reduce model errors by optimally combining multiple models to develop improved predictions. The rationale behind this approach lies in the premise that optimal weights can be derived for each model so that the resulting multimodel predictions have improved predictability. In this study, we present a new approach to combining multiple hydrological models by evaluating their predictability contingent on the predictor state. We combine two hydrological models, the 'abcd' model and the Variable Infiltration Capacity (VIC) model, with each model's parameters estimated under two different objective functions, to develop multimodel streamflow predictions. The performance of the multimodel predictions is compared with individual model predictions using correlation, root mean square error, and the Nash-Sutcliffe coefficient. To quantify precisely under what conditions multimodel predictions result in improved predictions, we evaluate the proposed algorithm by testing it against streamflow generated from a known model (the 'abcd' model or the VIC model) with errors being homoscedastic or heteroscedastic. Results from the study show that streamflow simulated from individual models performed better than multimodels under conditions of almost no model error. Under increased model error, the multimodel consistently performed better than the single-model prediction in terms of all performance measures. The study also evaluates the proposed algorithm for streamflow predictions in two humid river basins in North Carolina as well as in two arid basins in Arizona. Through detailed validation at these four sites, the study shows that the multimodel approach better predicts the observed streamflow than the single-model predictions.
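A simple, unconditional version of multimodel combination weights each model by its inverse mean-squared error on a training period; the approach described above additionally conditions the weights on the predictor state, which this sketch omits. Function and variable names are illustrative assumptions:

```python
def combine(preds, obs_hist, pred_hist):
    """Combine model forecasts with inverse-MSE weights.

    preds:     per-model lists of forecasts for the new periods
    obs_hist:  observed values over a training period
    pred_hist: per-model predictions over that same training period
    """
    mses = [sum((p - o) ** 2 for p, o in zip(ph, obs_hist)) / len(obs_hist)
            for ph in pred_hist]
    inv = [1.0 / m for m in mses]          # better models get larger weights
    s = sum(inv)
    w = [v / s for v in inv]
    # Weighted average of the models, period by period.
    return [sum(wi * p for wi, p in zip(w, col)) for col in zip(*preds)]
```

A model with a ten-times-smaller error dominates the combination, so the combined forecast tracks it closely.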
Recent progress in empirical modeling of ion composition in the topside ionosphere
NASA Astrophysics Data System (ADS)
Truhlik, Vladimir; Triskova, Ludmila; Bilitza, Dieter; Kotov, Dmytro; Bogomaz, Oleksandr; Domnin, Igor
2016-07-01
The last deep and prolonged solar minimum revealed shortcomings of existing empirical models, especially of parameter models that depend strongly on solar activity, such as the IRI (International Reference Ionosphere) ion composition model, and that are based on data sets from previous solar cycles. We have improved the TTS-03 ion composition model (Triskova et al., 2003), which has been included in IRI since version 2007. The new model, called AEIKion-13, employs an improved description of the dependence of ion composition on solar activity. We have also developed new global models of the upper transition height based on large data sets of vertical electron density profiles from ISIS, Alouette and COSMIC. The upper transition height is used as an anchor point for adjustment of the AEIKion-13 ion composition model. Additionally, we show progress on improvements of the altitudinal dependence of the ion composition in the AEIKion-13 model. Results of the improved model are compared with data from other types of measurements, including data from the Atmosphere Explorer C and E and C/NOFS satellites, and the Kharkiv and Arecibo incoherent scatter radars. Possible real-time updating of the model by the upper transition height from the real-time COSMIC vertical profiles is discussed. Triskova, L., Truhlik, V., Smilauer, J., 2003. An empirical model of ion composition in the outer ionosphere. Adv. Space Res. 31(3), 653-663.
NASA Astrophysics Data System (ADS)
Guo, Pengbin; Sun, Jian; Hu, Shuling; Xue, Ju
2018-02-01
Pulsar navigation is a promising navigation method for high-altitude orbit space tasks or deep space exploration. At present, an important factor restricting the development of pulsar navigation is that navigation accuracy is limited by the slow update rate of the measurements. In order to improve the accuracy of pulsar navigation, an asynchronous observation model which can improve the update rate of the measurements is proposed on the basis of a satellite constellation, which has broad prospects for development because of its visibility and reliability. The simulation results show that the asynchronous observation model improves the positioning accuracy by 31.48% and the velocity accuracy by 24.75% compared with the synchronous observation model. With the new Doppler-effect compensation method in the asynchronous observation model proposed in this paper, the positioning accuracy is improved by 32.27%, and the velocity accuracy by 34.07%, compared with the traditional method. The simulation results also show that neglecting the clock error results in filter divergence.
NASA Astrophysics Data System (ADS)
Machet, Tania; Lowe, David; Gütl, Christian
2012-12-01
This paper explores the hypothesis that embedding a laboratory activity into a virtual environment can provide a richer experimental context and hence improve the understanding of the relationship between a theoretical model and the real world, particularly in terms of the model's strengths and weaknesses. While an identified learning objective of laboratories is to support the understanding of the relationship between models and reality, the paper illustrates that this understanding is hindered by inherently limited experiments and that there is scope for improvement. Despite the contextualisation of learning activities having been shown to support learning objectives in many fields, there is traditionally little contextual information presented during laboratory experimentation. The paper argues that enhancing the laboratory activity with contextual information affords an opportunity to improve students' understanding of the relationship between the theoretical model and the experiment (which is effectively a proxy for the complex real world), thereby improving their understanding of the relationship between the model and reality. The authors propose that these improvements can be achieved by setting remote laboratories within context-rich virtual worlds.
Performance improvement CME for quality: challenges inherent to the process.
Vakani, Farhan Saeed; O'Beirne, Ronan
2015-01-01
The purpose of this paper is to discuss the practical challenges of a three-staged Performance Improvement Continuing Medical Education (PI-CME) model, an innovative and promising approach for future CME, so as to inform providers to think, prepare and act proactively. In this discussion, the challenges associated with adopting the American Medical Association's three-staged PI-CME model are reported. Few institutions in the USA use a three-staged performance improvement model, customizing it to their own healthcare context for a specific target audience. They integrate traditional CME methods with performance and quality initiatives, linking these with CME credits. Overall, the US health system is interested in a structured PI-CME model with the potential to improve physicians' practice behaviours. There is a dearth of evidence for applying this structured performance improvement methodology to the design of CME activities, and a lack of clarity on the challenges inherent to the process that learners and providers encounter. This paper therefore takes an all-important first step by setting out the challenges for a three-staged PI-CME model.
Improving Air Quality Forecasts with AURA Observations
NASA Technical Reports Server (NTRS)
Newchurch, M. J.; Biazer, A.; Khan, M.; Koshak, W. J.; Nair, U.; Fuller, K.; Wang, L.; Parker, Y.; Williams, R.; Liu, X.
2008-01-01
Past studies have identified model initial and boundary conditions as sources of reducible errors in air-quality simulations. In particular, improving the initial condition improves the accuracy of short-term forecasts as it allows for the impact of local emissions to be realized by the model and improving boundary conditions improves long range transport through the model domain, especially in recirculating anticyclones. During the August 2006 period, we use AURA/OMI ozone measurements along with MODIS and CALIPSO aerosol observations to improve the initial and boundary conditions of ozone and Particulate Matter. Assessment of the model by comparison of the control run and satellite assimilation run to the IONS06 network of ozonesonde observations, which comprise the densest ozone sounding campaign ever conducted in North America, to AURA/TES ozone profile measurements, and to the EPA ground network of ozone and PM measurements will show significant improvement in the CMAQ calculations that use AURA initial and boundary conditions. Further analyses of lightning occurrences from ground and satellite observations and AURA/OMI NO2 column abundances will identify the lightning NOx signal evident in OMI measurements and suggest pathways for incorporating the lightning and NO2 data into the CMAQ simulations.
Improved reference models for middle atmosphere ozone
NASA Technical Reports Server (NTRS)
Keating, G. M.; Pitts, M. C.; Chen, C.
1989-01-01
Improvements are provided for the ozone reference model which is to be incorporated in the COSPAR International Reference Atmosphere (CIRA). The ozone reference model will provide considerable information on the global ozone distribution, including ozone vertical structure as a function of month and latitude from approximately 25 to 90 km, combining data from five recent satellite experiments (Nimbus 7 LIMS, Nimbus 7 SBUV, AE-2 SAGE, Solar Mesosphere Explorer (SME) UVS, and SME IR). The improved models are described and use reprocessed AE-2 SAGE data (sunset) and extend the use of SAGE data from 1981 to the period 1981-1983. Comparisons are shown between the ozone reference model and various nonsatellite measurements at different levels in the middle atmosphere.
The United States Environmental Protection Agency's (EPA) National Exposure Research Laboratory (NERL) has initiated a project to improve the methodology for modeling human exposure to motor vehicle emission. The overall project goal is to develop improved methods for modeling...
COMPUTER PROGRAM DOCUMENTATION FOR THE ENHANCED STREAM WATER QUALITY MODEL QUAL2E
Presented in the manual are recent modifications and improvements to the widely used stream water quality model QUAL-II. Called QUAL2E, the enhanced model incorporates improvements in eight areas: (1) algal, nitrogen, phosphorus, and dissolved oxygen interactions; (2) algal growt...
NASA Astrophysics Data System (ADS)
Johns, Jesse M.; Burkes, Douglas
2017-07-01
In this work, a multilayered perceptron (MLP) network is used to develop predictive isothermal time-temperature-transformation (TTT) models covering a range of U-Mo binary and ternary alloys. The selected ternary alloys for model development are U-Mo-Ru, U-Mo-Nb, U-Mo-Zr, U-Mo-Cr, and U-Mo-Re. These models' ability to predict 'novel' U-Mo alloys is shown to be quite good despite the discrepancies between literature sources for similar alloys, which likely arise from different thermal-mechanical processing conditions. These models are developed with the primary purpose of informing experimental decisions. Additional experimental insight is necessary in order to reduce the number of experiments required to isolate ideal alloys. These models allow test planners to evaluate areas of experimental interest; once initial tests are conducted, the model can be updated to further improve follow-on testing decisions. The model also improves analysis capabilities by reducing the number of data points necessary from any particular test. For example, if one or two isotherms are measured during a test, the model can construct the rest of the TTT curve over a wide range of temperature and time. This modeling capability reduces the cost of experiments while also improving the value of the results from the tests. The reduced costs could result in improved material characterization and therefore improved fundamental understanding of TTT dynamics. As additional understanding of the phenomena driving TTTs is acquired, this type of MLP model can be used to populate unknowns (such as material impurity and other thermal-mechanical properties) from past literature sources.
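A multilayered perceptron of the kind used above can be sketched in miniature as a one-hidden-layer tanh network for scalar regression, trained by full-batch gradient descent. The architecture, hyperparameters, and toy target below are illustrative assumptions, not the authors' TTT model:

```python
import math
import random

def train_mlp(xs, ys, hidden=8, lr=0.05, epochs=2000, seed=0):
    """Fit y ~ f(x) with one tanh hidden layer; returns a predict(x) closure."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]  # input -> hidden weights
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]  # hidden -> output weights
    b2 = 0.0
    n = len(xs)
    for _ in range(epochs):
        gw1 = [0.0] * hidden; gb1 = [0.0] * hidden
        gw2 = [0.0] * hidden; gb2 = 0.0
        for x, y in zip(xs, ys):
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            out = sum(w2[j] * h[j] for j in range(hidden)) + b2
            d = 2.0 * (out - y) / n          # d(mean squared error)/d(out)
            gb2 += d
            for j in range(hidden):
                gw2[j] += d * h[j]
                dh = d * w2[j] * (1.0 - h[j] * h[j])   # tanh derivative
                gw1[j] += dh * x
                gb1[j] += dh
        for j in range(hidden):              # plain gradient-descent step
            w1[j] -= lr * gw1[j]; b1[j] -= lr * gb1[j]; w2[j] -= lr * gw2[j]
        b2 -= lr * gb2
    def predict(x):
        return sum(w2[j] * math.tanh(w1[j] * x + b1[j]) for j in range(hidden)) + b2
    return predict
```

Trained on a handful of points of a smooth curve, the network interpolates the rest, which is the spirit of reconstructing a TTT curve from one or two measured isotherms.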
Hirshman, Brian R; Wilson, Bayard; Ali, Mir Amaan; Proudfoot, James A; Koiso, Takao; Nagano, Osamu; Carter, Bob S; Serizawa, Toru; Yamamoto, Masaaki; Chen, Clark C
2018-04-01
Two intracranial tumor volume variables have been shown to prognosticate survival of stereotactic-radiosurgery-treated brain metastasis patients: the largest intracranial tumor volume (LITV) and the cumulative intracranial tumor volume (CITV). Here we determine whether the prognostic value of the Scored Index for Radiosurgery (SIR) model can be improved by replacing one of its components, LITV, with CITV. We compared LITV and CITV in terms of their survival prognostication using a series of multivariable models that included known components of the SIR: age, Karnofsky Performance Score, status of extracranial disease, and the number of brain metastases. Models were compared using established statistical measures, including the net reclassification improvement (NRI > 0) and integrated discrimination improvement (IDI). The analysis was performed in 2 independent cohorts, each consisting of ∼3000 patients. In both cohorts, CITV was shown to be independently predictive of patient survival. Replacement of LITV with CITV in the SIR model improved the model's ability to predict 1-yr survival. In the first cohort, the CITV model showed an NRI > 0 improvement of 0.2574 (95% confidence interval [CI] 0.1890-0.3257) and an IDI of 0.0088 (95% CI 0.0057-0.0119) relative to the LITV model. In the second cohort, the CITV model showed an NRI > 0 of 0.2604 (95% CI 0.1796-0.3411) and an IDI of 0.0051 (95% CI 0.0029-0.0073) relative to the LITV model. After accounting for covariates within the SIR model, CITV offers superior prognostic value relative to LITV for stereotactic radiosurgery-treated brain metastasis patients.
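The category-free NRI > 0 statistic used to compare the two models rewards events whose predicted risk rises under the new model and non-events whose predicted risk falls. A minimal sketch, not the authors' code:

```python
def nri_gt0(y, p_old, p_new):
    """Category-free net reclassification improvement (NRI > 0).

    y: 0/1 outcomes; p_old, p_new: predicted risks from the two models.
    """
    up_e = down_e = up_n = down_n = 0
    n_events = sum(y)
    n_nonevents = len(y) - n_events
    for yi, po, pn in zip(y, p_old, p_new):
        if pn > po:          # risk moved up under the new model
            if yi: up_e += 1
            else:  up_n += 1
        elif pn < po:        # risk moved down
            if yi: down_e += 1
            else:  down_n += 1
    # Events should move up, non-events down; NRI > 0 sums the two net shares.
    return (up_e - down_e) / n_events + (down_n - up_n) / n_nonevents
```

A new model that raises risk for every event and is neutral on non-events scores the event-side maximum of 1.0.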
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
ERIC Educational Resources Information Center
Neumann, Yoram; Neumann, Edith; Lewis, Shelia
2017-01-01
This study integrated the Spiral Curriculum approach into the Robust Learning Model as part of a continuous improvement process that was designed to improve educational effectiveness and then assessed the differences between the initial and integrated models as well as the predictability of the first course in the integrated learning model on a…
ERIC Educational Resources Information Center
Cleckner, John
The author reviews five cost-effectiveness basic models including log-log correlational, general utility theory, simultaneous equations, nonlinear theoretical, and feedback. Several suggestions are made to improve the models and increase the domain of problems that can be considered by the models. In the second part of the paper, the author…
A Model of Reading Teaching for University EFL Students: Need Analysis and Model Design
ERIC Educational Resources Information Center
Hamra, Arifuddin; Syatriana, Eny
2012-01-01
This study designed a model of teaching reading for university EFL students based on the English curriculum at the Faculty of Languages and Literature and the concept of the team-based learning in order to improve the reading comprehension of the students. What kind of teaching model can help students to improve their reading comprehension? The…
ERIC Educational Resources Information Center
Govender, K. K.
2011-01-01
The objective of this article is to develop a conceptual model aimed at improving the postgraduate research students' experience. Since postgraduate students "vote with their feet" an improved understanding of the postgraduate research service encounter may result in improving the quality of the encounter and so increasing throughput and…
ERIC Educational Resources Information Center
Rossi, Robert D.
2015-01-01
Improving student engagement in STEM (science, technology, engineering, and mathematics) courses generally, and organic chemistry specifically, has long been a goal for educators. Recently educators at all academic levels have been exploring the "inverted classroom" or "flipped classroom" pedagogical model for improving student…
Transonic wing DFVLR-F4 as European test model
NASA Technical Reports Server (NTRS)
Redeker, G.; Schmidt, N.
1980-01-01
A transonic wing, the DFVLR-F4 was designed and tested as a model in European transonic wind tunnels and was found to give performance improvements over conventional wings. One reason for the improvement was the reduction of compression shocks in the transonic region as the result of improved wing design.
Water balance models in one-month-ahead streamflow forecasting
Alley, William M.
1985-01-01
Techniques are tested that incorporate information from water balance models in making 1-month-ahead streamflow forecasts in New Jersey. The results are compared to those based on simple autoregressive time series models. The relative performance of the models is dependent on the month of the year in question. The water balance models are most useful for forecasts of April and May flows. For the stations in northern New Jersey, the April and May forecasts were made in order of decreasing reliability using the water-balance-based approaches, using the historical monthly means, and using simple autoregressive models. The water balance models were useful to a lesser extent for forecasts during the fall months. For the rest of the year the improvements in forecasts over those obtained using the simpler autoregressive models were either very small or the simpler models provided better forecasts. When using the water balance models, monthly corrections for bias are found to improve minimum mean-square-error forecasts as well as to improve estimates of the forecast conditional distributions.
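One hedged way to sketch a one-month-ahead forecast of the kind compared above is an AR(1) step about the monthly mean followed by an additive monthly bias correction; the function and all numbers below are illustrative assumptions, not the paper's calibrated models.

```python
def forecast_next(flow_now, mean_flow, phi, month, monthly_bias):
    """One-month-ahead AR(1) forecast about the long-term mean, then an
    additive correction using that calendar month's mean historical error."""
    raw = mean_flow + phi * (flow_now - mean_flow)
    return raw - monthly_bias.get(month, 0.0)

# Hypothetical values: phi and the April bias would be fitted from history.
april_forecast = forecast_next(flow_now=120.0, mean_flow=100.0, phi=0.6,
                               month='Apr', monthly_bias={'Apr': -5.0})
```

The monthly correction term is the analogue of the bias adjustment the study found to improve minimum mean-square-error forecasts.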
Development of an Improved Simulator for Chemical and Microbial EOR Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, Gary A.; Sepehrnoori, Kamy; Delshad, Mojdeh
2000-09-11
The objective of this research was to extend the capability of an existing simulator (UTCHEM) to improved oil recovery methods that use surfactants, polymers, gels, alkaline chemicals, microorganisms, and foam, as well as various combinations of these, in both conventional and naturally fractured oil reservoirs. Task 1 is the addition of a dual-porosity model for chemical improved-oil-recovery processes in naturally fractured oil reservoirs. Task 2 is the addition of a foam model. Task 3 addresses several numerical and coding enhancements that will greatly improve the versatility and performance of UTCHEM. Task 4 is the enhancement of physical property models.
Kanchense, Jane Handina Murigwa
2006-08-01
The primary health care model of public health has been implemented in many countries around the globe since the Declaration of Alma Ata in 1978, without pilot testing of the model. Therefore, many public health researchers have sought methods of improving primary health care by creating evidence-based models. Many of these researchers recognize the role of behavioral models in public health. These offshoots of primary health care include the ecological, care, central human capabilities, and SPECIES models. Holistic self-management education and support is a capacity-building philosophy that ensures active involvement of consumers of health care in the planning, implementation, and evaluation of health care services. It helps consumers of health care achieve the desired improved quality of health and life by managing and sustaining their health at the grassroots level. The care model addresses the disease-management ideals of the original primary health care model. The SPECIES model addresses those aspects of the original primary health care model that include cultural and social factors, as well as individual health education and support. The ecological model offers an improvement on the socioeconomic ideal in the original primary health care model. Improving the health of individuals will prevent illness, thereby reducing health care costs and lessening the current strain on an overburdened health care system in Zimbabwe. Holistic self-management education and support links health care delivery systems with social processes. It is a best-practices model that could better serve Zimbabwean girls and women by contributing positively to the national challenges in health care, thereby meeting the Zimbabwean primary health care and safe motherhood goals.
It is recommended here that holistic self-management education and support be pilot tested before being adopted as the most appropriate model for ensuring population health.
The standardized live patient and mechanical patient models--their roles in trauma teaching.
Ali, Jameel; Al Ahmadi, Khalid; Williams, Jack Ivan; Cherry, Robert Allen
2009-01-01
We have previously demonstrated improved medical student performance using standardized live patient models in the Trauma Evaluation and Management (TEAM) program. The trauma manikin has also been offered as an option for teaching trauma skills in this program. In this study, we compare performance using both models. Final-year medical students were randomly assigned to three groups: group I (n = 22) with neither model, group II (n = 24) with the patient model, and group III (n = 24) with the mechanical model, using the same clinical scenario. All students completed pre-TEAM and post-TEAM multiple choice question (MCQ) exams and an evaluation questionnaire scoring five items on a scale of 1 to 5, with 5 being the highest. The items were: objectives met, knowledge improved, skills improved, overall satisfaction, and whether the course should be mandatory. Students (groups II and III) then switched models, rating preferences in six categories: more challenging, more interesting, more dynamic, more enjoyable learning, more realistic, and overall better model. Scores were analyzed by ANOVA, with p < 0.05 considered statistically significant. All groups had similar scores (mean % +/- SD) in the pretest (group I - 50.8 +/- 7.4, group II - 51.3 +/- 6.4, group III - 51.1 +/- 6.6). All groups improved their post-test scores, but groups II and III scored higher than group I, with no difference in scores between groups II and III (group I - 77.5 +/- 3.8, group II - 84.8 +/- 3.6, group III - 86.3 +/- 3.2). The percentages of students scoring 5 in the questionnaire are as follows: objectives met - 100% for all groups; knowledge improved: group I - 91%, group II - 96%, group III - 92%; skills improved: group I - 9%, group II - 83%, group III - 96%; overall satisfaction: group I - 91%, group II - 92%, group III - 92%; should be mandatory: group I - 32%, group II - 96%, group III - 100%.
Student preferences (48 students) are as follows: the mechanical model was more challenging (44 of 48), more interesting (40 of 48), more dynamic (46 of 48), more enjoyable (48 of 48), more realistic (32 of 48), and the better overall model (42 of 48). Using the TEAM program, we have demonstrated that improvements in knowledge and skills are equally enhanced by mechanical and patient models in trauma teaching. However, students overwhelmingly preferred the mechanical model.
The United States Environmental Protection Agency's National Exposure Research Laboratory has initiated a project to improve the methodology for modeling human exposure to motor vehicle emissions. The overall project goal is to develop improved methods for modeling the source t...
Design for Success: New Configurations and Governance Models for Catholic Schools
ERIC Educational Resources Information Center
Haney, Regina M.
2010-01-01
The 2008 Selected Programs for Improving Catholic Education (SPICE), a national diffusion network, shares school configurations and related governance models that may improve the sustainability of Catholic schools. This article describes how these model schools are successfully addressing their challenges. The structure and authority of their…
EFFECTS OF VERTICAL-LAYER STRUCTURE AND BOUNDARY CONDITIONS ON CMAQ-V4.5 AND V4.6 MODELS
This work is aimed at determining whether the increased vertical layers in CMAQ provide substantially improved model performance and at assessing whether using the spatially and temporally varying boundary conditions from GEOS-CHEM offers improved model performance as compared to the d...
ERIC Educational Resources Information Center
Mulford, Bill; Silins, Halia
2011-01-01
Purpose: This study aims to present revised models and a reconceptualisation of successful school principalship for improved student outcomes. Design/methodology/approach: The study's approach is qualitative and quantitative, culminating in model building and multi-level statistical analyses. Findings: Principals who promote both capacity building…
76 FR 53137 - Bundled Payments for Care Improvement Initiative: Request for Applications
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-25
... (RFA) will test episode-based payment for acute care and associated post-acute care, using both retrospective and prospective bundled payment methods. The RFA requests applications to test models centered around acute care; these models will inform the design of future models, including care improvement for...
78 FR 29139 - Medicare Program; Bundled Payments for Care Improvement Model 1 Open Period
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-17
... initiative. DATES: Model 1 of the Bundled Payments for Care Improvement Deadline: Interested organizations... initiative. For additional information on this initiative go to the CMS Center for Medicare and Medicaid Innovation Web site at http://innovation.cms.gov/initiatives/BPCI-Model-1/index.html . SUPPLEMENTARY...
Emergent Evidence in Support of a Community Collaboration Model for School Improvement
ERIC Educational Resources Information Center
Anderson-Butcher, Dawn; Lawson, Hal A.; Iachini, Aidyn; Flaspohler, Paul; Bean, Jerry; Wade-Mdivanian, Rebecca
2010-01-01
Community collaboration models expand conventional school improvement planning, which tends to be walled in, building centered, and bracketed by school and district boundaries. These community models enable educators, social workers, and other school professionals to form sustainable, strategic partnerships with families, community agencies,…
Rhode Island Model Evaluation & Support System: Building Administrator. Edition III
ERIC Educational Resources Information Center
Rhode Island Department of Education, 2015
2015-01-01
Rhode Island educators believe that implementing a fair, accurate, and meaningful educator evaluation and support system will help improve teaching, learning, and school leadership. The primary purpose of the Rhode Island Model Building Administrator Evaluation and Support System (Rhode Island Model) is to help all building administrators improve.…
Improving Motor Skills through Listening
ERIC Educational Resources Information Center
Wang, Lin
2004-01-01
In this article, the author discusses how to improve a child's motor skills through listening by using three simple steps--recording the auditory model, determining when to use the auditory model, and considering where to use the auditory model. She points out the importance of using a demonstration technique that helps learners understand the…
A Total Quality Leadership Process Improvement Model
1993-12-01
A Total Quality Leadership Process Improvement Model, by Archester Houston, Ph.D., and Steven L. Dockstader, Ph.D. Final report, 12/93. Performing organization: Total Quality Leadership Office.
Knoedler, Margaret; Feibus, Allison H; Lange, Andrew; Maddox, Michael M; Ledet, Elisa; Thomas, Raju; Silberstein, Jonathan L
2015-06-01
To evaluate the effect of 3-dimensionally (3D) printed physical renal models with enhancing masses on medical trainee characterization, localization, and understanding of renal malignancy. Proprietary software was used to import standard computed tomography (CT) cross-sectional imaging into 3D printers to create physical models of renal units with enhancing renal lesions in situ. Six different models were printed from a transparent plastic resin; the normal parenchyma was printed in a clear, translucent plastic, with a red hue delineating the suspicious renal lesion. Medical students, who had completed their first year of training, were given an overview and tasked with completion of RENAL nephrometry scores, separately using CT imaging and 3D models. Trainees were also asked to complete a questionnaire about their experience. Variability between trainees was assessed by intraclass correlation coefficients (ICCs), and kappa statistics were used to compare the trainee to experts. Overall trainee nephrometry score accuracy was significantly improved with the 3D model vs CT scan (P <.01). Furthermore, 3 of the 4 components of the nephrometry score (radius, nearness to collecting system, and location) showed significant improvement (P <.001) using the models. There was also more consistent agreement among trainees when using the 3D models compared with CT scans to assess the nephrometry score (intraclass correlation coefficient, 0.28 for CT scan vs 0.72 for 3D models). Qualitative evaluation with questionnaires filled out by the trainees further confirmed that the 3D models improved their ability to understand and conceptualize the renal mass. Physical 3D models using readily available printing techniques improve trainees' understanding and characterization of individual patients' enhancing renal lesions. Published by Elsevier Inc.
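The agreement statistic reported above can be sketched as a one-way random-effects intraclass correlation. The abstract does not state which ICC form was used, so the version below (ICC(1,1)) and the toy scores are assumptions for illustration only.

```python
def icc_oneway(scores):
    """One-way random-effects ICC(1,1): agreement among raters (trainees)
    scoring the same targets (renal lesions). scores[i][j] = rater j, target i."""
    n, k = len(scores), len(scores[0])
    grand = sum(x for row in scores for x in row) / (n * k)
    row_means = [sum(row) / k for row in scores]
    ms_between = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for row, m in zip(scores, row_means) for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Invented nephrometry-style scores: rows are renal models, columns are trainees.
agreement = icc_oneway([[9, 7], [5, 3], [7, 8]])
```

Higher values indicate that trainees rank and score the same lesions more consistently, which is the sense in which 0.72 (3D models) exceeds 0.28 (CT scans) above.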
Clarity versus complexity: land-use modeling as a practical tool for decision-makers
Sohl, Terry L.; Claggett, Peter
2013-01-01
The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches, examines the issues caused by the complicated nature of models, and offers suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing the key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.
NASA Astrophysics Data System (ADS)
Marelle, Louis; Raut, Jean-Christophe; Law, Kathy S.; Berg, Larry K.; Fast, Jerome D.; Easter, Richard C.; Shrivastava, Manish; Thomas, Jennie L.
2017-10-01
In this study, the WRF-Chem regional model is updated to improve simulated short-lived pollutants (e.g., aerosols, ozone) in the Arctic. Specifically, we include in WRF-Chem 3.5.1 (with SAPRC-99 gas-phase chemistry and MOSAIC aerosols) (1) a correction to the sedimentation of aerosols, (2) dimethyl sulfide (DMS) oceanic emissions and gas-phase chemistry, (3) an improved representation of the dry deposition of trace gases over seasonal snow, and (4) an UV-albedo dependence on snow and ice cover for photolysis calculations. We also (5) correct the representation of surface temperatures over melting ice in the Noah Land Surface Model and (6) couple and further test the recent KF-CuP (Kain-Fritsch + Cumulus Potential) cumulus parameterization that includes the effect of cumulus clouds on aerosols and trace gases. The updated model is used to perform quasi-hemispheric simulations of aerosols and ozone, which are evaluated against surface measurements of black carbon (BC), sulfate, and ozone as well as airborne measurements of BC in the Arctic. The updated model shows significant improvements in terms of seasonal aerosol cycles at the surface and root mean square errors (RMSEs) for surface ozone, aerosols, and BC aloft, compared to the base version of the model and to previous large-scale evaluations of WRF-Chem in the Arctic. These improvements are mostly due to the inclusion of cumulus effects on aerosols and trace gases in KF-CuP (improved RMSE for surface BC and BC profiles, surface sulfate, and surface ozone), the improved surface temperatures over sea ice (surface ozone, BC, and sulfate), and the updated trace gas deposition and UV albedo over snow and ice (improved RMSE and correlation for surface ozone). DMS emissions and chemistry improve surface sulfate at all Arctic sites except Zeppelin, and correcting aerosol sedimentation has little influence on aerosols except in the upper troposphere.
Improvements to constitutive material model for fabrics
NASA Astrophysics Data System (ADS)
Morea, Mihai I.
2011-12-01
The high strength-to-weight ratio of woven fabric offers a cost-effective solution for use in containment systems for aircraft propulsion engines. Currently, Kevlar is the only Federal Aviation Administration (FAA) approved fabric for use in systems intended to mitigate fan blade-out events. This research builds on an earlier constitutive model of Kevlar 49 fabric developed at Arizona State University (ASU) with the addition of new and improved modeling details. The latest stress-strain experiments provided new and valuable data used to modify the material model's post-peak behavior. These changes reveal an overall improvement in the Finite Element (FE) model's ability to predict experimental results. First, the steel projectile is modeled using the Johnson-Cook material model, which provides more realistic behavior in the FE ballistic models. This is particularly noticeable when comparing FE models with laboratory tests in which large deformations of the projectiles are observed. Second, follow-up analysis of the results obtained through the new picture-frame tests conducted at ASU provides new values for the shear moduli and corresponding strains. The new approach for analyzing data from picture-frame tests combines digital image analysis and a two-level factorial optimization formulation. Finally, an additional improvement in the material model for Kevlar involves checking convergence as the mesh density of the fabric is varied. The study performed and described herein shows the converging trend, thereby validating the FE model.
Evaluating diagnosis-based risk-adjustment methods in a population with spinal cord dysfunction.
Warner, Grace; Hoenig, Helen; Montez, Maria; Wang, Fei; Rosen, Amy
2004-02-01
To examine the performance of models in predicting health care utilization for individuals with spinal cord dysfunction. Regression models compared 2 diagnosis-based risk-adjustment methods, the adjusted clinical groups (ACGs) and diagnostic cost groups (DCGs). To improve prediction, we added to our model: (1) spinal cord dysfunction-specific diagnostic information, (2) limitations in self-care function, and (3) both 1 and 2. Models were replicated in 3 populations. Samples from 3 populations: (1) 40% of veterans using Veterans Health Administration services in fiscal year 1997 (FY97) (N=1,046,803), (2) a veteran sample with spinal cord dysfunction identified by codes from the International Statistical Classification of Diseases, 9th Revision, Clinical Modifications (N=7666), and (3) a veteran sample identified in the Veterans Affairs Spinal Cord Dysfunction Registry (N=5888). Not applicable. Inpatient, outpatient, and total days of care in FY97. The DCG models (R² range, .22-.38) performed better than the ACG models (R² range, .04-.34) for all outcomes. Spinal cord dysfunction-specific diagnostic information improved prediction more in the ACG model than in the DCG model (R² range for ACG, .14-.34; R² range for DCG, .24-.38). Information on self-care function slightly improved performance (R² increases ranged from 0 to .04). The DCG risk-adjustment models predicted health care utilization better than the ACG models. ACG model prediction was improved by adding information.
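The R² values used throughout the comparison above are the ordinary coefficient of determination; a minimal sketch (not the study's code, and with invented utilization numbers) is:

```python
def r_squared(y, y_hat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_bar = sum(y) / len(y)
    ss_res = sum((a - p) ** 2 for a, p in zip(y, y_hat))
    ss_tot = sum((a - y_bar) ** 2 for a in y)
    return 1.0 - ss_res / ss_tot

# Invented example: observed vs. predicted days of care for four patients.
r2 = r_squared([10.0, 2.0, 0.0, 4.0], [8.0, 3.0, 1.0, 4.0])
```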
Grieger, Jessica A; Johnson, Brittany J; Wycherley, Thomas P; Golley, Rebecca K
2017-05-01
Background: Dietary simulation modeling can predict dietary strategies that may improve nutritional or health outcomes. Objectives: The study aims were to undertake a systematic review of simulation studies that model dietary strategies aiming to improve nutritional intake, body weight, and related chronic disease, and to assess the methodologic and reporting quality of these models. Methods: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses guided the search strategy with studies located through electronic searches [Cochrane Library, Ovid (MEDLINE and Embase), EBSCOhost (CINAHL), and Scopus]. Study findings were described and dietary modeling methodology and reporting quality were critiqued by using a set of quality criteria adapted for dietary modeling from general modeling guidelines. Results: Forty-five studies were included and categorized as modeling moderation, substitution, reformulation, or promotion dietary strategies. Moderation and reformulation strategies targeted individual nutrients or foods to theoretically improve one particular nutrient or health outcome, estimating small to modest improvements. Substituting unhealthy foods with healthier choices was estimated to be effective across a range of nutrients, including an estimated reduction in intake of saturated fatty acids, sodium, and added sugar. Promotion of fruits and vegetables predicted marginal changes in intake. Overall, the quality of the studies was moderate to high, with certain features of the quality criteria consistently reported. Conclusions: Based on the results of reviewed simulation dietary modeling studies, targeting a variety of foods rather than individual foods or nutrients theoretically appears most effective in estimating improvements in nutritional intake, particularly reducing intake of nutrients commonly consumed in excess. A combination of strategies could theoretically be used to deliver the best improvement in outcomes. 
Study quality was moderate to high. However, given the lack of dietary simulation reporting guidelines, future work could refine the quality tool to harmonize consistency in the reporting of subsequent dietary modeling studies. © 2017 American Society for Nutrition.
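The "substitution" strategy type reviewed above can be illustrated with a minimal sketch: shift a share of one food's intake to a healthier alternative and recompute nutrient totals. The foods and per-100-g nutrient values below are hypothetical, not taken from the reviewed studies.

```python
# Minimal sketch of a dietary substitution simulation. All food names and
# nutrient values are hypothetical illustrations.

FOODS = {  # per 100 g: (saturated fat g, sodium mg, added sugar g)
    "processed_snack": (6.0, 450.0, 12.0),
    "unsalted_nuts":   (4.5,   2.0,  1.5),
}

def substitute(intake_g, source, target, fraction):
    """Shift `fraction` of `source` grams to `target`; return nutrient totals."""
    intake = dict(intake_g)
    moved = intake[source] * fraction
    intake[source] -= moved
    intake[target] = intake.get(target, 0.0) + moved
    totals = [0.0, 0.0, 0.0]
    for food, grams in intake.items():
        for i, per_100g in enumerate(FOODS[food]):
            totals[i] += per_100g * grams / 100.0
    return totals

baseline = substitute({"processed_snack": 100.0}, "processed_snack", "unsalted_nuts", 0.0)
scenario = substitute({"processed_snack": 100.0}, "processed_snack", "unsalted_nuts", 0.5)
print(baseline)  # [6.0, 450.0, 12.0]
print(scenario)  # [5.25, 226.0, 6.75]: sodium cut roughly in half
```

As in the review's findings, a single substitution improves several nutrients at once, here reducing saturated fat, sodium, and added sugar simultaneously.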
Morris, Ralph E; McNally, Dennis E; Tesche, Thomas W; Tonnesen, Gail; Boylan, James W; Brewer, Patricia
2005-11-01
The Visibility Improvement State and Tribal Association of the Southeast (VISTAS) is one of five Regional Planning Organizations that is charged with the management of haze, visibility, and other regional air quality issues in the United States. The VISTAS Phase I work effort modeled three episodes (January 2002, July 1999, and July 2001) to identify the optimal model configuration(s) to be used for the 2002 annual modeling in Phase II. Using model configurations recommended in the Phase I analysis, 2002 annual meteorological (Mesoscale Meteorological Model [MM5]), emissions (Sparse Matrix Operator Kernel Emissions [SMOKE]), and air quality (Community Multiscale Air Quality [CMAQ]) simulations were performed on a 36-km grid covering the continental United States and a 12-km grid covering the Eastern United States. Model estimates were then compared against observations. This paper presents the results of the preliminary CMAQ model performance evaluation for the initial 2002 annual base case simulation. Model performance is presented for the Eastern United States using speciated fine particle concentration and wet deposition measurements from several monitoring networks. Initial results indicate fairly good performance for sulfate with fractional bias values generally within +/-20%. Nitrate is overestimated in the winter by approximately +50% and underestimated in the summer by more than -100%. Organic carbon exhibits a large summer underestimation bias of approximately -100% with much improved performance seen in the winter with a bias near zero. Performance for elemental carbon is reasonable with fractional bias values within +/- 40%. Other fine particulate (soil) and coarse particulate matter exhibit large (80-150%) overestimation in the winter but improved performance in the summer.
The preliminary 2002 CMAQ runs identified several areas of enhancements to improve model performance, including revised temporal allocation factors for ammonia emissions to improve nitrate performance and addressing missing processes in the secondary organic aerosol module to improve OC performance.
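The fractional bias statistic quoted above is commonly defined in air quality model evaluation as the mean of 2(M - O)/(M + O) over paired model/observation values, expressed in percent; a short sketch (with illustrative values, not VISTAS data) follows.

```python
# Fractional bias as commonly defined in air quality model evaluation.
# Sample values are illustrative, not VISTAS data.

def fractional_bias(model, obs):
    """Mean fractional bias in percent; bounded between -200% and +200%."""
    n = len(model)
    return 100.0 * sum(2.0 * (m - o) / (m + o) for m, o in zip(model, obs)) / n

# A model that doubles every observation: 2*(2-1)/(2+1) = 2/3, i.e. +66.7%.
print(round(fractional_bias([2.0, 4.0], [1.0, 2.0]), 1))  # 66.7
print(fractional_bias([1.0, 3.0], [1.0, 3.0]))            # 0.0
```

The ±200% bounds explain how a summer underestimation can exceed -100% in this metric, which would be impossible for a simple relative bias.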
Duan, Zhenhao; Sun, R.; Zhu, Chen; Chou, I.-Ming
2006-01-01
An improved model is presented for the calculation of the solubility of carbon dioxide in aqueous solutions containing Na+, K+, Ca2+, Mg2+, Cl-, and SO42- in a wide temperature-pressure-ionic strength range (from 273 to 533 K, from 0 to 2000 bar, and from 0 to 4.5 molality of salts) with experimental accuracy. The improvements over the previous model [Duan, Z. and Sun, R., 2003. An improved model calculating CO2 solubility in pure water and aqueous NaCl solutions from 273 to 533 K and from 0 to 2000 bar. Chemical Geology, 193: 257-271] include: (1) by developing a non-iterative equation to replace the original equation of state in the calculation of CO2 fugacity coefficients, the new model is at least twenty times computationally faster and can be easily adapted to numerical reaction-flow simulators for such applications as CO2 sequestration; and (2) by fitting to the new solubility data, the new model improved the accuracy below 288 K from 6% to about 3% of uncertainty but still retains the high accuracy of the original model above 288 K. We comprehensively evaluate all experimental CO2 solubility data. Compared with these data, this model not only reproduces all the reliable data used for the parameterization but also predicts the data that were not used in the parameterization. In order to facilitate the application to CO2 sequestration, we also predicted CO2 solubility in seawater at two-phase coexistence (vapor-liquid or liquid-liquid) and at three-phase coexistence (CO2 hydrate-liquid water-vapor CO2 [or liquid CO2]). The improved model is programmed and can be downloaded from the website http://www.geochem-model.org/programs.htm. © 2005 Elsevier B.V. All rights reserved.
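A structural sketch of the kind of calculation involved: gas solubility from a Henry's-law balance with a Setschenow salting-out correction for dissolved salts. The Henry constant, fugacity coefficient, and Setschenow coefficient below are illustrative placeholders, not the Duan-Sun (2006) parameterization, which instead fits ion-interaction parameters to experimental data over the full T-P-ionic-strength range.

```python
# Structural sketch only: Henry's law plus a Setschenow salting-out term.
# All constants are illustrative placeholders, not the Duan-Sun model.

def co2_molality(p_bar, phi, henry_bar, k_s, ionic_strength):
    """m(CO2) ~ fugacity / Henry constant, reduced by salting out in brine."""
    m_pure = phi * p_bar / henry_bar                 # pure-water solubility
    return m_pure * 10.0 ** (-k_s * ionic_strength)  # Setschenow correction

pure = co2_molality(p_bar=50.0, phi=0.9, henry_bar=30.0, k_s=0.12, ionic_strength=0.0)
brine = co2_molality(p_bar=50.0, phi=0.9, henry_bar=30.0, k_s=0.12, ionic_strength=2.0)
print(pure)          # 1.5 (mol/kg, pure water)
print(brine < pure)  # True: salting out lowers solubility
```

The paper's speed improvement comes from exactly the step this sketch glosses over: replacing an iterative equation of state for the fugacity coefficient (`phi` here) with a non-iterative fit.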
Aerothermal modeling program, phase 2
NASA Technical Reports Server (NTRS)
Mongia, H. C.; Patankar, S. V.; Murthy, S. N. B.; Sullivan, J. P.; Samuelsen, G. S.
1985-01-01
The main objectives of the Aerothermal Modeling Program, Phase 2 are: to develop an improved numerical scheme for incorporation in a 3-D combustor flow model; to conduct a benchmark quality experiment to study the interaction of a primary jet with a confined swirling crossflow and to assess current and advanced turbulence and scalar transport models; and to conduct experimental evaluation of the air swirler interaction with fuel injectors, assessments of current two-phase models, and verification of the improved spray evaporation/dispersion models.
Implementing Model-Check for Employee and Management Satisfaction
NASA Technical Reports Server (NTRS)
Jones, Corey; LaPha, Steven
2013-01-01
This presentation will discuss methods by which ModelCheck can be implemented not only to improve model quality, but also to satisfy both employees and management through different sets of quality checks. This approach allows a standard set of modeling practices to be upheld throughout a company, with minimal interaction required by the end user. The presenter will demonstrate how to create multiple ModelCheck standards, preventing users from evading the system, and how it can improve the quality of drawings and models.
The Effect of ISO 9001 and the EFQM Model on Improving Hospital Performance: A Systematic Review.
Yousefinezhadi, Taraneh; Mohamadi, Efat; Safari Palangi, Hossein; Akbari Sari, Ali
2015-12-01
This study aimed to explore the effect of the International Organization for Standardization (ISO) ISO 9001 standard and the European Foundation for Quality Management (EFQM) model on improving hospital performance. PubMed, Embase and the Cochrane Library databases were searched. In addition, Elsevier and Springer were searched as main publishers in the field of health sciences. We included empirical studies with any design that had used ISO 9001 or the EFQM model to improve the quality of healthcare. Data were collected and tabulated into a data extraction sheet that was specifically designed for this study. The collected data included authors' names, country, year of publication, intervention, improvement aims, setting, length of program, study design, and outcomes. Seven out of the 121 studies that were retrieved met the inclusion criteria. Three studies assessed the EFQM model and four studies assessed the ISO 9001 standard. Use of the EFQM model increased the degree of patient satisfaction and the number of hospital admissions and reduced the average length of stay, the delay on the surgical waiting list, and the number of emergency re-admissions. ISO 9001 also increased the degree of patient satisfaction and patient safety, increased cost-effectiveness, improved the hospital admissions process, and reduced the percentage of unscheduled returns to the hospital. Generally, there is a lack of robust and high-quality empirical evidence regarding the effects of ISO 9001 and the EFQM model on the quality of care provided by hospitals and on hospital performance. However, the limited evidence shows that ISO 9001 and the EFQM model might improve hospital performance.
NASA Astrophysics Data System (ADS)
Li, Mingchao; Han, Shuai; Zhou, Sibao; Zhang, Ye
2018-06-01
Based on a 3D model of a discrete fracture network (DFN) in a rock mass, an improved projective method for computing the 3D mechanical connectivity rate was proposed. The Monte Carlo simulation method, 2D Poisson process and 3D geological modeling technique were integrated into a polyhedral DFN modeling approach, and the simulation results were verified by numerical tests and graphical inspection. Next, the traditional projective approach for calculating the rock mass connectivity rate was improved using the 3D DFN models by (1) using the polyhedral model to replace the Baecher disk model; (2) taking the real cross section of the rock mass, rather than a part of the cross section, as the test plane; and (3) dynamically searching the joint connectivity rates using different dip directions and dip angles at different elevations to calculate the maximum, minimum and average values of the joint connectivity at each elevation. In a case study, the improved method and traditional method were used to compute the mechanical connectivity rate of the slope of a dam abutment. The results of the two methods were further used to compute the cohesive force of the rock masses. Finally, a comparison showed that the cohesive force derived from the traditional method had a higher error, whereas the cohesive force derived from the improved method was consistent with the suggested values. According to the comparison, the effectiveness and validity of the improved method were verified indirectly.
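The counting idea behind a joint connectivity rate can be shown in a much-reduced 1D form: along a scanline through the rock mass, the rate is the fraction of the line covered by joint traces. The paper works with full cross sections of a 3D polyhedral DFN and searches over dip directions and elevations; the intervals below are made up for illustration.

```python
# 1D analogue of a joint connectivity rate: fraction of a scanline covered
# by (possibly overlapping) joint trace intervals. Intervals are illustrative.

def connectivity_rate(trace_intervals, line_length):
    """Fraction of [0, line_length] covered by the union of trace intervals."""
    covered = 0.0
    last_end = 0.0
    for a, b in sorted(trace_intervals):         # merge overlapping traces
        a, b = max(a, last_end), min(b, line_length)
        if b > a:
            covered += b - a
            last_end = b
    return covered / line_length

# Traces covering 4 m of a 10 m line (the first two overlap) -> rate 0.4.
print(connectivity_rate([(1.0, 3.0), (2.0, 4.0), (6.0, 7.0)], 10.0))  # 0.4
```

Merging overlaps before summing mirrors why the paper uses the real cross section rather than disjoint sub-areas: double-counting intersecting joints would inflate the connectivity rate.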
NASA Astrophysics Data System (ADS)
Bahtiar; Rahayu, Y. S.; Wasis
2018-01-01
This research aims to produce the P3E learning model to improve students’ critical thinking skills. The developed model is named P3E, consisting of four stages, namely: organization, inquiry, presentation, and evaluation. This development research refers to the development stages by Kemp. The design of the wide-scale try-out used a pretest-posttest group design. The wide-scale try-out was conducted in grade X of the 2016/2017 academic year. The analysis of the results of this development research includes three aspects, namely: validity, practicality, and effectiveness of the model developed. The research results showed: (1) the P3E learning model was valid according to experts, with an average value of 3.7; (2) the completion of the syntax of the learning model developed obtained 98.09% and 94.39% for two schools based on the assessment of the observers, which shows that the developed model is practical to implement; (3) the developed model is effective for improving students’ critical thinking skills, although the n-gain of the students’ critical thinking skills was 0.54, in the moderate category. Based on these results, it can be concluded that the developed P3E learning model is suitable for improving students’ critical thinking skills.
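The "n-gain" figure reported above follows Hake's normalized gain: the achieved gain divided by the maximum possible gain, with 0.3-0.7 conventionally labelled "moderate". The pre/post scores below are hypothetical, chosen only to reproduce a gain of 0.54.

```python
# Hake's normalized gain; pre/post scores are hypothetical examples.

def normalized_gain(pre, post, max_score=100.0):
    """g = (post - pre) / (max_score - pre)."""
    return (post - pre) / (max_score - pre)

def category(g):
    return "high" if g >= 0.7 else "moderate" if g >= 0.3 else "low"

g = normalized_gain(pre=40.0, post=72.4)
print(round(g, 2), category(g))  # 0.54 moderate
```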
Gravity model development for TOPEX/POSEIDON: Joint gravity models 1 and 2
NASA Technical Reports Server (NTRS)
Nerem, R. S.; Lerch, F. J.; Marshall, J. A.; Pavlis, E. C.; Putney, B. H.; Tapley, B. D.; Eanes, R. J.; Ries, J. C.; Schutz, B. E.; Shum, C. K.
1994-01-01
The TOPEX/POSEIDON (T/P) prelaunch Joint Gravity Model-1 (JGM-1) and the postlaunch JGM-2 Earth gravitational models have been developed to support precision orbit determination for T/P. Each of these models is complete to degree 70 in spherical harmonics and was computed from a combination of satellite tracking data, satellite altimetry, and surface gravimetry. While improved orbit determination accuracies for T/P have driven the improvements in the models, the models are general in application and also provide an improved geoid for oceanographic computations. The postlaunch model, JGM-2, which includes T/P satellite laser ranging (SLR) and Doppler orbitography and radiopositioning integrated by satellite (DORIS) tracking data, introduces radial orbit errors for T/P that are only 2 cm RMS, with the commission errors of the marine geoid for terms to degree 70 being +/- 25 cm. Errors in modeling the nonconservative forces acting on T/P increase the total radial errors to only 3-4 cm root mean square (RMS), a result much better than premission goals. While the orbit accuracy goal for T/P has been far surpassed, geoid errors still prevent the absolute determination of the ocean dynamic topography for wavelengths shorter than about 2500 km. Only a dedicated gravitational field satellite mission will likely provide the necessary improvement in the geoid.
Using aircraft and satellite observations to improve regulatory air quality models
NASA Astrophysics Data System (ADS)
Canty, T. P.; Vinciguerra, T.; Anderson, D. C.; Carpenter, S. F.; Goldberg, D. L.; Hembeck, L.; Montgomery, L.; Liu, X.; Salawitch, R. J.; Dickerson, R. R.
2014-12-01
Federal and state agencies rely on EPA approved models to develop attainment strategies that will bring states into compliance with the National Ambient Air Quality Standards (NAAQS). We will describe modifications to the Community Multi-Scale Air Quality (CMAQ) model and Comprehensive Air Quality Model with Extensions (CAMx) frameworks motivated by analysis of NASA satellite and aircraft measurements. Observations of tropospheric column NO2 from OMI have already led to the identification of an important deficiency in the chemical mechanisms used by models; data collected during the DISCOVER-AQ field campaign have been instrumental in devising an improved representation of the chemistry of nitrogen species. Our recent work has focused on the use of: OMI observations of tropospheric O3 to assess and improve the representation of boundary conditions used by AQ models, OMI NO2 to derive a top-down NOx emission inventory from commercial shipping vessels that affect air quality in the Eastern U.S., and OMI HCHO to assess the C5H8 emission inventories provided by biogenic emissions models. We will describe how these OMI-driven model improvements are being incorporated into the State Implementation Plans (SIPs) being prepared for submission to EPA in summer 2015 and how future modeling efforts may be impacted by our findings.
Making ecological models adequate
Getz, Wayne M.; Marshall, Charles R.; Carlson, Colin J.; Giuggioli, Luca; Ryan, Sadie J.; Romañach, Stephanie; Boettiger, Carl; Chamberlain, Samuel D.; Larsen, Laurel; D'Odorico, Paolo; O'Sullivan, David
2018-01-01
Critical evaluation of the adequacy of ecological models is urgently needed to enhance their utility in developing theory and enabling environmental managers and policymakers to make informed decisions. Poorly supported management can have detrimental, costly or irreversible impacts on the environment and society. Here, we examine common issues in ecological modelling and suggest criteria for improving modelling frameworks. An appropriate level of process description is crucial to constructing the best possible model, given the available data and understanding of ecological structures. Model details unsupported by data typically lead to overparameterisation and poor model performance. Conversely, a lack of mechanistic details may limit a model's ability to predict ecological systems’ responses to management. Ecological studies that employ models should follow a set of model adequacy assessment protocols that include asking a series of critical questions regarding state and control variable selection, the determinacy of data, and the sensitivity and validity of analyses. We also need to improve model elaboration, refinement and coarse-graining procedures to better understand the relevancy and adequacy of our models and the role they play in advancing theory, improving hindcasting and forecasting, and enabling problem solving and management.
NASA Astrophysics Data System (ADS)
da Silva, Felipe das Neves Roque; Alves, José Luis Drummond; Cataldi, Marcio
2018-03-01
This paper aims to validate inflow simulations concerning the present-day climate at Água Vermelha Hydroelectric Plant (AVHP, located on the Grande River Basin) based on the Soil Moisture Accounting Procedure (SMAP) hydrological model. In order to provide rainfall data to the SMAP model, the RegCM regional climate model was also used, working with boundary conditions from the MIROC model. Initially, the present-day climate simulation performed by the RegCM model was analyzed. It was found that, in terms of rainfall, the model was able to simulate the main patterns observed over South America. A bias correction technique was also used and was essential to reduce errors in the rainfall simulation. Comparison between rainfall simulations from RegCM and MIROC showed improvements when the dynamical downscaling was performed. Then SMAP, a rainfall-runoff hydrological model, was used to simulate inflows at Água Vermelha Hydroelectric Plant. After calibration with observed rainfall, SMAP simulations were evaluated in two periods different from the one used in calibration. During calibration, SMAP captured the inflow variability observed at AVHP. During the validation periods, the hydrological model obtained better results and statistics with observed rainfall. Even so, in spite of some discrepancies, the use of simulated rainfall without bias correction captured the interannual flow variability, and applying bias removal to the rainfall simulated by RegCM brought significant improvements to the simulation of natural inflows performed by SMAP. Not only did the curve of simulated inflow become more similar to the observed inflow, but the statistics also improved.
In general, the results obtained so far show that the regional climate model added value to the rainfall simulation compared with the global climate model, and that data from regional models must be bias-corrected in order to improve their results.
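One common way to bias-correct simulated rainfall before feeding a rainfall-runoff model is "linear scaling": rescale the simulated series so its calibration-period mean matches the observed mean. The abstract does not specify which technique was used, so this is only a representative sketch with illustrative numbers.

```python
# Linear-scaling bias correction sketch: match the simulated mean to the
# observed mean over a calibration period, then apply the same factor
# elsewhere. All rainfall values (mm/month) are illustrative.

def linear_scaling(sim_cal, obs_cal, sim_apply):
    """Scale simulated rainfall so its calibration-period mean matches obs."""
    factor = sum(obs_cal) / sum(sim_cal)
    return [r * factor for r in sim_apply]

sim_cal = [80.0, 120.0, 200.0]   # simulated monthly rainfall, calibration period
obs_cal = [100.0, 150.0, 250.0]  # observed monthly rainfall, calibration period
corrected = linear_scaling(sim_cal, obs_cal, [40.0, 160.0])
print(corrected)  # [50.0, 200.0]
```

A multiplicative factor (here 1.25) preserves dry months as dry, which matters for rainfall; more elaborate methods such as quantile mapping also correct the distribution shape.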
Schoppe, Oliver; King, Andrew J.; Schnupp, Jan W.H.; Harper, Nicol S.
2016-01-01
Adaptation to stimulus statistics, such as the mean level and contrast of recently heard sounds, has been demonstrated at various levels of the auditory pathway. It allows the nervous system to operate over the wide range of intensities and contrasts found in the natural world. Yet current standard models of the response properties of auditory neurons do not incorporate such adaptation. Here we present a model of neural responses in the ferret auditory cortex (the IC Adaptation model), which takes into account adaptation to mean sound level at a lower level of processing: the inferior colliculus (IC). The model performs high-pass filtering with frequency-dependent time constants on the sound spectrogram, followed by half-wave rectification, and passes the output to a standard linear–nonlinear (LN) model. We find that the IC Adaptation model consistently predicts cortical responses better than the standard LN model for a range of synthetic and natural stimuli. The IC Adaptation model introduces no extra free parameters, so it improves predictions without sacrificing parsimony. Furthermore, the time constants of adaptation in the IC appear to be matched to the statistics of natural sounds, suggesting that neurons in the auditory midbrain predict the mean level of future sounds and adapt their responses appropriately. SIGNIFICANCE STATEMENT An ability to accurately predict how sensory neurons respond to novel stimuli is critical if we are to fully characterize their response properties. Attempts to model these responses have had a distinguished history, but it has proven difficult to improve their predictive power significantly beyond that of simple, mostly linear receptive field models. Here we show that auditory cortex receptive field models benefit from a nonlinear preprocessing stage that replicates known adaptation properties of the auditory midbrain. 
This improves their predictive power across a wide range of stimuli but keeps model complexity low as it introduces no new free parameters. Incorporating the adaptive coding properties of neurons will likely improve receptive field models in other sensory modalities too. PMID:26758822
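The preprocessing stage described above can be sketched directly: per-frequency-channel high-pass filtering (subtracting a running mean level with a channel-specific time constant) followed by half-wave rectification. The time constants, sampling step, and toy spectrogram below are illustrative, not the paper's fitted values.

```python
# Sketch of mean-level adaptation: leaky-integrator estimate of the mean
# level per frequency channel, subtracted and half-wave rectified.
# Time constants and input are illustrative.

def ic_adaptation(spectrogram, taus, dt=0.005):
    """spectrogram: list of channels, each a list of sound levels over time."""
    out = []
    for channel, tau in zip(spectrogram, taus):
        alpha = dt / tau                # leaky-integrator update weight
        mean = channel[0]
        adapted = []
        for x in channel:
            mean += alpha * (x - mean)          # running (low-pass) mean level
            adapted.append(max(x - mean, 0.0))  # high-pass + half-wave rectify
        out.append(adapted)
    return out

# A step increase in level gives a strong onset response that then decays
# as the channel adapts to the new mean level.
resp = ic_adaptation([[0.0] * 5 + [1.0] * 20], taus=[0.05])
print(resp[0][5] > resp[0][24] > 0.0)  # True: onset transient, then adaptation
```

This output would then feed the standard linear-nonlinear stage; as in the paper, the stage adds no free parameters once the time constants are fixed.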
Effects of tempol and redox-cycling nitroxides in models of oxidative stress
Wilcox, Christopher S.
2010-01-01
Tempol is a redox cycling nitroxide that promotes the metabolism of many reactive oxygen species (ROS) and improves nitric oxide bioavailability. It has been studied extensively in animal models of oxidative stress. Tempol has been shown to preserve mitochondria against oxidative damage and improve tissue oxygenation. Tempol improved insulin responsiveness in models of diabetes mellitus and improved the dyslipidemia, reduced the weight gain and prevented diastolic dysfunction and heart failure in fat-fed models of the metabolic syndrome. Tempol protected many organs, including the heart and brain, from ischemia/reperfusion damage. Tempol prevented podocyte damage, glomerulosclerosis, proteinuria and progressive loss of renal function in models of salt and mineralocorticosteroid excess. It reduced brain or spinal cord damage after ischemia or trauma and exerted a spinal analgesic action. Tempol improved survival in several models of shock. It protected normal cells from radiation while maintaining radiation sensitivity of tumor cells. Its paradoxical pro-oxidant action in tumor cells accounted for a reduction in spontaneous tumor formation. Tempol was effective in some models of neurodegeneration. Thus, tempol has been effective in preventing several of the adverse consequences of oxidative stress and inflammation that underlie radiation damage and many of the diseases associated with aging. Indeed, tempol given from birth prolonged the life span of normal mice. However, presently tempol has been used only in human subjects as a topical agent to prevent radiation-induced alopecia. PMID:20153367
Improving Catastrophe Modeling for Business Interruption Insurance Needs.
Rose, Adam; Huyck, Charles K
2016-10-01
While catastrophe (CAT) modeling of property damage is well developed, modeling of business interruption (BI) lags far behind. One reason is the crude nature of functional relationships in CAT models that translate property damage into BI. Another is that estimating BI losses is more complicated because it depends greatly on public and private decisions during recovery with respect to resilience tactics that dampen losses by using remaining resources more efficiently to maintain business function and to recover more quickly. This article proposes a framework for improving hazard loss estimation for BI insurance needs. Improved data collection that allows for analysis at the level of individual facilities within a company can improve matching the facilities with the effectiveness of individual forms of resilience, such as accessing inventories, relocating operations, and accelerating repair, and can therefore improve estimation accuracy. We then illustrate the difference this can make in a case study example of losses from a hurricane. © 2016 Society for Risk Analysis.
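In the spirit of the framework described above, a facility-level business-interruption estimate can be sketched as revenue at risk over the downtime, dampened by the combined effectiveness of resilience tactics. The facilities, revenues, and resilience fractions below are hypothetical.

```python
# Illustrative facility-level BI loss: resilience tactics (using inventories,
# relocating operations, accelerating repair) recover a fraction of the
# revenue at risk. All numbers are hypothetical.

def bi_loss(daily_revenue, downtime_days, resilience_fraction):
    """Loss = revenue at risk, reduced by the fraction resilience recovers."""
    return daily_revenue * downtime_days * (1.0 - resilience_fraction)

facilities = [
    # (daily revenue $, downtime days, combined resilience fraction)
    (50_000.0, 30, 0.50),
    (20_000.0, 10, 0.25),
]
total = sum(bi_loss(*f) for f in facilities)
print(total)  # 900000.0
```

Keeping the calculation per facility, rather than per company, is exactly the data-resolution improvement the article argues for: resilience effectiveness can then be matched to each facility.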
A model for estimating the impact of changes in children's vaccines.
Simpson, K N; Biddle, A K; Rabinovich, N R
1995-12-01
To assist in strategic planning for the improvement of vaccines and vaccine programs, an economic model was developed and tested that estimates the potential impact of vaccine innovations on health outcomes and costs associated with vaccination and illness. A multistep, iterative process of data extraction/integration was used to develop the model and the scenarios. Parameter replication, sensitivity analysis, and expert review were used to validate the model. The greatest impact on the improvement of health is expected to result from the production of less reactogenic vaccines that require fewer inoculations for immunity. The greatest economic impact is predicted from improvements that decrease the number of inoculations required. Scenario analysis may be useful for integrating health outcomes and economic data into decision making. For childhood infections, this analysis indicates that large cost savings can be achieved in the future if we can improve vaccine efficacy so that the number of required inoculations is reduced. Such an improvement represents a large potential "payback" for the United States and might benefit other countries.
An Updated AP2 Beamline TURTLE Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gormley, M.; O'Day, S.
1991-08-23
This note describes a TURTLE model of the AP2 beamline. This model was created by D. Johnson and improved by J. Hangst. The authors of this note have made additional improvements which reflect recent element and magnet setting changes. The magnet characteristics measurements and survey data compiled to update the model will be presented. A printout of the actual TURTLE deck may be found in appendix A.
Improving the Amazonian Hydrologic Cycle in a Coupled Land-Atmosphere, Single Column Model
NASA Astrophysics Data System (ADS)
Harper, A. B.; Denning, S.; Baker, I.; Prihodko, L.; Branson, M.
2006-12-01
We have coupled a land-surface model, the Simple Biosphere Model (SiB3), to a single column of the Colorado State University General Circulation Model (CSU-GCM) in the Amazon River Basin. This is a preliminary step in the broader goal of improved simulation of Basin-wide hydrology. A previous version of the coupled model (SiB2) showed drought and catastrophic dieback of the Amazon rain forest. SiB3 includes updated soil hydrology and root physiology. Our test area for the coupled single column model is near Santarem, Brazil, where measurements from the km 83 flux tower in the Tapajos National Forest can be used to evaluate model output. The model was run for 2001 using NCEP2 Reanalysis as driver data. Preliminary results show that the updated biosphere model coupled to the GCM produces improved simulations of the seasonal cycle of surface water balance and precipitation. Comparisons of the diurnal and seasonal cycles of surface fluxes are also being made.
More than Anecdotes: Fishers’ Ecological Knowledge Can Fill Gaps for Ecosystem Modeling
Bevilacqua, Ana Helena V.; Carvalho, Adriana R.; Angelini, Ronaldo; Christensen, Villy
2016-01-01
Background Ecosystem modeling applied to fisheries remains hampered by a lack of local information. Fishers’ knowledge could fill this gap, improving participation in and the management of fisheries. Methodology The same fishing area was modeled using two approaches: based on fishers’ knowledge and based on scientific information. For the former, the data was collected by interviews through the Delphi methodology, and for the latter, the data was gathered from the literature. Agreement between the attributes generated by the fishers’ knowledge model and scientific model is discussed and explored, aiming to improve data availability, the ecosystem model, and fisheries management. Principal Findings The ecosystem attributes produced from the fishers’ knowledge model were consistent with the ecosystem attributes produced by the scientific model, and elaborated using only the scientific data from literature. Conclusions/Significance This study provides evidence that fishers’ knowledge may suitably complement scientific data, and may improve the modeling tools for the research and management of fisheries. PMID:27196131
Bauer, Julia; Chen, Wenjing; Nischwitz, Sebastian; Liebl, Jakob; Rieken, Stefan; Welzel, Thomas; Debus, Juergen; Parodi, Katia
2018-04-24
A reliable Monte Carlo prediction of proton-induced brain tissue activation used for comparison to particle therapy positron-emission-tomography (PT-PET) measurements is crucial for in vivo treatment verification. Major limitations to overcome in current approaches include the CT-based patient model and the description of activity washout due to tissue perfusion. Two approaches were studied to improve the activity prediction for brain irradiation: (i) a refined patient model using tissue classification based on MR information and (ii) a PT-PET data-driven refinement of washout model parameters. Improvements of the activity predictions compared to post-treatment PT-PET measurements were assessed in terms of activity profile similarity for six patients treated with a single field or two almost parallel fields delivered by active proton beam scanning. The refined patient model yields a generally higher similarity for most of the patients, except in highly pathological areas leading to tissue misclassification. Using washout model parameters deduced from clinical patient data could considerably improve the activity profile similarity for all patients. Current methods used to predict proton-induced brain tissue activation can be improved with MR-based tissue classification and data-driven washout parameters, thus providing a more reliable basis for PT-PET verification. Copyright © 2018 Elsevier B.V. All rights reserved.
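Washout models of the kind commonly used in PT-PET studies describe the measured activity as physical decay multiplied by a multi-component biological washout (fast, medium, and slow clearance fractions). The fractions and biological half-lives below are illustrative placeholders, not the data-driven parameters fitted in the paper.

```python
import math

# Sketch of a multi-component washout model: physical decay times a sum of
# exponentially clearing biological components. Parameters are illustrative.

def activity(t_s, a0, phys_half_life_s, components):
    """components: (fraction, biological half-life s) pairs; fractions sum to 1."""
    lam_p = math.log(2) / phys_half_life_s
    washout = sum(f * math.exp(-math.log(2) / hl * t_s) for f, hl in components)
    return a0 * math.exp(-lam_p * t_s) * washout

# C-11-like physical half-life (~1223 s) with fast/medium/slow components.
a = activity(600.0, a0=1.0, phys_half_life_s=1223.0,
             components=[(0.3, 10.0), (0.3, 140.0), (0.4, 10_000.0)])
print(round(a, 3))  # roughly 0.28 of the initial activity remains
```

Refining the washout parameters from measured PT-PET data, as the paper does, amounts to refitting the component fractions and half-lives per tissue rather than assuming literature values.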
Air Quality Modeling Using the NASA GEOS-5 Multispecies Data Assimilation System
NASA Technical Reports Server (NTRS)
Keller, Christoph A.; Pawson, Steven; Wargan, Krzysztof; Weir, Brad
2018-01-01
The NASA Goddard Earth Observing System (GEOS) data assimilation system (DAS) has been expanded to include chemically reactive tropospheric trace gases including ozone (O3), nitrogen dioxide (NO2), and carbon monoxide (CO). This system combines model analyses from the GEOS-5 model with detailed atmospheric chemistry and observations from MLS (O3), OMI (O3 and NO2), and MOPITT (CO). We show results from a variety of assimilation test experiments, highlighting the improvements in the representation of model species concentrations by up to 50% compared to an assimilation-free control experiment. Taking into account the rapid chemical cycling of NO2 when applying the assimilation increments greatly improves assimilation skills for NO2 and provides large benefits for model concentrations near the surface. Analysis of the geospatial distribution of the assimilation increments suggest that the free-running model overestimates biomass burning emissions but underestimates lightning NOx emissions by 5-20%. We discuss the capability of the chemical data assimilation system to improve atmospheric composition forecasts through improved initial value and boundary condition inputs, particularly during air pollution events. We find that the current assimilation system meaningfully improves short-term forecasts (1-3 day). For longer-term forecasts more emphasis on updating the emissions instead of initial concentration fields is needed.
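The analysis step behind any such assimilation system can be shown in its simplest scalar form: the model (background) state is nudged toward an observation, weighted by the two error variances. This is the textbook optimal-interpolation update, not the GEOS-5 implementation, and the values are illustrative.

```python
# Scalar optimal-interpolation (Kalman-style) update: the assimilation
# increment pulls the background toward the observation in proportion to
# how uncertain the background is relative to the observation.

def analysis(background, obs, var_background, var_obs):
    gain = var_background / (var_background + var_obs)  # Kalman gain
    increment = gain * (obs - background)               # assimilation increment
    return background + increment, increment

# Background trusts the observation heavily (var ratio 9:1), e.g. an ozone
# concentration in illustrative units.
state, inc = analysis(background=80.0, obs=60.0, var_background=9.0, var_obs=1.0)
print(state, inc)  # 62.0 -18.0
```

The sign and size of accumulated increments are what the abstract uses diagnostically: systematically negative increments over fire regions, for instance, suggest overestimated biomass burning emissions.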
Sunde, Synnøve; Walstad, Rolf Aksel; Bentsen, Signe Berit; Lunde, Solfrid J; Wangen, Eva Marie; Rustøen, Tone; Henriksen, Anne Hildur
2014-09-01
Adherence to guidelines for managing stable chronic obstructive pulmonary disease (COPD) and its exacerbations is inadequate among healthcare workers and patients. An appropriate care model would meet patient needs, enhance their coping with COPD and improve their quality of life (QOL). This study aims to present the 'COPD-Home' as an integrated care model for patients with severe or very severe COPD. One principle of the COPD-Home model is that hospital treatment should lead to follow up in the patient's home. The model also includes education, improved coordination of levels of care, improved accessibility and a management plan. One of the main elements of the COPD-Home model is the clear role of the home-care nurse. Model development is based on earlier research and clinical experience. It comprises: (i) education provided through an education programme for patients and involved nurses, (ii) joint visits and telephone checks, (iii) a call centre for support and communication with a general practitioner and (iv) an individualised self-management plan including home monitoring and a plan for pharmacological and nonpharmacological interventions. The COPD-Home model attempts to cultivate competences and behaviours of patients and community nurses that better accord with guidelines for interventions. The next step in its development will be to evaluate its ability to assist both healthcare workers and planners to improve the management of COPD, reduce exacerbations and improve QOL and coping among patients with COPD. © 2013 Nordic College of Caring Science.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-01
... Market and Planning Efficiency Through Improved Software; Notice of Agenda and Procedures for Staff... planning models and software. The technical conference will be held from 8 a.m. to 5:30 p.m. (EDT) on June.... Agenda for AD10-12 Staff Technical Conference on Planning Models and Software Federal Energy Regulatory...
Improving Students' Critical Thinking Skills through Remap NHT in Biology Classroom
ERIC Educational Resources Information Center
Mahanal, Susriyati; Zubaidah, Siti; Bahri, Arsad; Syahadatud Dinnurriya, Maratusy
2016-01-01
Previous studies in Malang, Indonesia, showed that failures in biology learning were caused not only by students' low prior knowledge but also by a biology learning model that had not yet improved students' critical thinking skills, which in turn lowered cognitive learning outcomes. A learning model is required to improve students'…
ERIC Educational Resources Information Center
Herrmann, Mariesa; Dragoset, Lisa; James-Burdumy, Susanne
2014-01-01
The federal School Improvement Grants (SIG) program aims to improve student achievement by promoting the implementation of four school intervention models: transformation, turnaround, restart, and closure. Previous research provides evidence that low-performing schools adopt some practices promoted by the four models, but little is known about how…
Assessment and Improvement of GOCE based Global Geopotential Models Using Wavelet Decomposition
NASA Astrophysics Data System (ADS)
Erol, Serdar; Erol, Bihter; Serkan Isik, Mustafa
2016-07-01
The contribution of recent Earth gravity field satellite missions, specifically the GOCE mission, has led to significant improvement in the quality of gravity field models in terms of both accuracy and resolution. However, the performance and quality of each released model vary depending not only on the spatial location on the Earth but also on the band of the spectral expansion. Therefore, assessing the performance of the global models through validations with in situ data in varying territories on the Earth is essential for clarifying their exact local performance. Besides this, spectral evaluation and quality assessment of the signal in each part of the spherical harmonic expansion spectrum is essential for reaching a clear decision on the commission error content of a model and for determining the optimal degree that yields the best results. The latter analyses also provide a perspective on, and comparison of, the global behavior of the models, and an opportunity to report their sequential improvement with mission developments and hence the contribution of new mission data. This study reviews spectral assessment results, against terrestrial data in Turkey, for the recently released GOCE-based global geopotential models DIR-R5 and TIM-R5, enhanced using EGM2008 as the reference model. Besides reporting the GOCE mission's contribution to the models in Turkish territory, possible improvement in the spectral quality of these models, via decomposition of the bands that are highly contaminated by noise, is pursued. The motivation in the analyses is to achieve an optimal amount of improvement by conserving the useful component of the GOCE signal as much as possible while fusing the filtered GOCE-based models with EGM2008 in the appropriate spectral bands.
The investigation also contains an assessment of the coherence and correlation between the Earth gravity field parameters (free-air gravity anomalies and geoid undulations) derived from the validated geopotential models and those from terrestrial data (GPS/levelling, terrestrial gravity observations, DTM, etc.), as well as the WGM2012 products. In conclusion, the numerical results clarify the performance of the assessed models in Turkish territory and verify the potential of wavelet decomposition for improving the geopotential models.
Performance Analysis of Several GPS/Galileo Precise Point Positioning Models
Afifi, Akram; El-Rabbany, Ahmed
2015-01-01
This paper examines the performance of several precise point positioning (PPP) models, which combine dual-frequency GPS/Galileo observations in the un-differenced and between-satellite single-difference (BSSD) modes. These include the traditional un-differenced model, the decoupled clock model, the semi-decoupled clock model, and the between-satellite single-difference model. We take advantage of the IGS-MGEX network products to correct for the satellite differential code biases and the orbital and satellite clock errors. Natural Resources Canada’s GPSPace PPP software is modified to handle the various GPS/Galileo PPP models. A total of six data sets of GPS and Galileo observations at six IGS stations are processed to examine the performance of the various PPP models. It is shown that the traditional un-differenced GPS/Galileo PPP model, the GPS decoupled clock model, and the semi-decoupled clock GPS/Galileo PPP model improve the convergence time by about 25% in comparison with the un-differenced GPS-only model. In addition, the semi-decoupled GPS/Galileo PPP model improves the solution precision by about 25% compared to the traditional un-differenced GPS/Galileo PPP model. Moreover, the BSSD GPS/Galileo PPP model improves the solution convergence time by about 50%, in comparison with the un-differenced GPS PPP model, regardless of the type of BSSD combination used. As well, the BSSD model improves the precision of the estimated parameters by about 50% and 25% when the loose and the tight combinations are used, respectively, in comparison with the un-differenced GPS-only model. Comparable results are obtained through the tight combination when either a GPS or a Galileo satellite is selected as a reference. PMID:26102495
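The between-satellite single-difference (BSSD) mode described above removes the common receiver clock bias by differencing observations from two satellites. A minimal numeric sketch, using toy pseudorange values and an idealized error model rather than the paper's full formulation:

```python
# Between-satellite single difference (BSSD) sketch: differencing two
# satellites' observations cancels the common receiver clock bias.
# Toy model: P_i = rho_i + c * dt_rx (satellite clocks and other errors
# assumed already corrected with the precise products; values invented).
c = 299792458.0      # speed of light, m/s
dt_rx = 1.5e-4       # receiver clock bias, s (unknown in a real solution)

rho = {"G01": 21345678.9, "E11": 24567890.1}        # geometric ranges, m
P = {sat: r + c * dt_rx for sat, r in rho.items()}  # observed pseudoranges

# Single difference, with GPS satellite G01 as the reference:
bssd = P["E11"] - P["G01"]
assert abs(bssd - (rho["E11"] - rho["G01"])) < 1e-6  # clock bias cancelled
```

The same cancellation holds whichever satellite is chosen as the reference, which is consistent with the abstract's finding that GPS and Galileo reference satellites give comparable results.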
NASA Technical Reports Server (NTRS)
Andrews, J.
1977-01-01
An optimal decision model of crop production, trade, and storage was developed for use in estimating the economic consequences of improved forecasts and estimates of worldwide crop production. The model extends earlier distribution benefits models to include production effects as well. Application to improved information systems meeting the goals set in the large area crop inventory experiment (LACIE) indicates annual benefits to the United States of $200 to $250 million for wheat, $50 to $100 million for corn, and $6 to $11 million for soybeans, using conservative assumptions on expected LANDSAT system performance.
Monitoring by forward scatter radar techniques: an improved second-order analytical model
NASA Astrophysics Data System (ADS)
Falconi, Marta Tecla; Comite, Davide; Galli, Alessandro; Marzano, Frank S.; Pastina, Debora; Lombardo, Pierfrancesco
2017-10-01
In this work, a second-order phase approximation is introduced to provide an improved analytical model of the signal received in forward scatter radar systems. A typical configuration with a rectangular metallic object illuminated while crossing the baseline, in far- or near-field conditions, is considered. An improved second-order model is compared with a simplified one already proposed by the authors and based on a paraxial approximation. A phase error analysis is carried out to investigate benefits and limitations of the second-order modeling. The results are validated by developing full-wave numerical simulations implementing the relevant scattering problem on a commercial tool.
Li, Tian-Jiao; Li, Sai; Yuan, Yuan; Liu, Yu-Dong; Xu, Chuan-Long; Shuai, Yong; Tan, He-Ping
2017-04-03
Plenoptic cameras are used for capturing flames in studies of high-temperature phenomena. However, simulations of plenoptic camera models can be used prior to the experiment to improve experimental efficiency and reduce cost. In this work, microlens arrays, based on the established light field camera model, are optimized into a hexagonal structure with three types of microlenses. With this improved plenoptic camera model, light field imaging of static objects and of flame is simulated using the calibrated parameters of the Raytrix camera (R29). The optimized models improve the image resolution, imaging-screen utilization, and the shooting range of the depth of field.
NASA Astrophysics Data System (ADS)
Qian, Xiaoshan
2018-01-01
Traditional models of evaporation-process parameters suffer from prediction errors that are continuous and cumulative. Based on an analysis of the process, an adaptive particle swarm neural network forecasting method is proposed, in which an autoregressive moving average (ARMA) error-correction procedure compensates the neural network's predictions to improve prediction accuracy. Validation against production data from an alumina plant evaporation process shows that, compared with the traditional model, the new model's prediction accuracy is greatly improved; it can be used to predict the dynamics of the sodium aluminate solution components during evaporation.
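The ARMA-compensated prediction described above can be illustrated with a simplified correction of a base model's residuals; here a first-order autoregression stands in for the full ARMA procedure, and all signals and parameters are invented:

```python
import numpy as np

# Hypothetical sketch: an AR(1) model fitted to a base model's residuals
# stands in for the ARMA error-correction procedure described above.
rng = np.random.default_rng(0)
n = 200
truth = np.sin(np.linspace(0.0, 6.0, n))        # the "process" being predicted

# Base model whose error is autocorrelated (continuous and cumulative).
err = np.zeros(n)
for t in range(1, n):
    err[t] = 0.8 * err[t - 1] + 0.05 * rng.standard_normal()
base_pred = truth + err
resid = truth - base_pred                       # past residuals of the base model

# Fit AR(1): resid[t] ~ phi * resid[t-1] + c, by least squares.
X = np.column_stack([resid[:-1], np.ones(n - 1)])
phi, c = np.linalg.lstsq(X, resid[1:], rcond=None)[0]

# One-step-ahead compensated forecast = base forecast + predicted residual.
corrected = base_pred[1:] + (phi * resid[:-1] + c)

rmse_base = np.sqrt(np.mean((truth[1:] - base_pred[1:]) ** 2))
rmse_corr = np.sqrt(np.mean((truth[1:] - corrected) ** 2))
assert rmse_corr < rmse_base   # the error-corrected model is more accurate
```

Because the base model's error is autocorrelated, the residual model captures part of it, which is the same mechanism the abstract's ARMA compensation exploits.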
Desai, Neeraj R; French, Kim D; Diamond, Edward; Kovitz, Kevin L
2018-05-31
Value-based care is evolving with a focus on improving efficiency, reducing cost, and enhancing the patient experience. Interventional pulmonology has the opportunity to lead an effective value-based care model. This model is supported by the relatively low cost of pulmonary procedures and has the potential to improve efficiencies in thoracic care. We discuss key strategies to evaluate and improve efficiency in interventional pulmonology practice and describe our experience in developing an interventional pulmonology suite. Such a model can be adapted to other specialty areas and may encourage a more coordinated approach to specialty care. Copyright © 2018. Published by Elsevier Inc.
Improved LTVMPC design for steering control of autonomous vehicle
NASA Astrophysics Data System (ADS)
Velhal, Shridhar; Thomas, Susy
2017-01-01
An improved linear time-varying model predictive control (LTVMPC) scheme for steering control of an autonomous vehicle running on a slippery road is presented. The control strategy is designed so that the vehicle follows a predefined trajectory at the highest possible entry speed. In linear time-varying MPC, the nonlinear vehicle model is successively linearized at each sampling instant. This linear time-varying model is used to design an MPC that predicts over the future horizon. Incorporating the predicted input horizon into each successive linearization improves the controller's effectiveness. Tracking performance using front-wheel steering and four-wheel braking is presented to illustrate the effectiveness of the proposed method.
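The successive linearization at each sampling instant can be sketched as follows, using a kinematic bicycle model with finite-difference Jacobians; the vehicle model and every parameter below are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

# Successive-linearization step for LTV-MPC (sketch, assumed vehicle model).
L, dt = 2.7, 0.05   # wheelbase [m], sampling time [s]

def f(x, u):
    """Discrete kinematic bicycle: x = [X, Y, heading], u = [speed, steering]."""
    X, Y, psi = x
    v, delta = u
    return np.array([X + dt * v * np.cos(psi),
                     Y + dt * v * np.sin(psi),
                     psi + dt * v / L * np.tan(delta)])

def linearize(x0, u0, h=1e-6):
    """Finite-difference Jacobians A = df/dx and B = df/du at (x0, u0)."""
    A = np.column_stack([(f(x0 + h * e, u0) - f(x0, u0)) / h for e in np.eye(3)])
    B = np.column_stack([(f(x0, u0 + h * e) - f(x0, u0)) / h for e in np.eye(2)])
    return A, B

# Linearize about the current operating point; the MPC prediction then uses
# x_next ~ f(x0, u0) + A @ (x - x0) + B @ (u - u0), re-derived at every
# sampling instant as the state evolves along the trajectory.
x0, u0 = np.array([0.0, 0.0, 0.1]), np.array([15.0, 0.02])
A, B = linearize(x0, u0)

dx, du = np.array([0.0, 0.0, 0.01]), np.array([0.0, 0.005])
pred_lin = f(x0, u0) + A @ dx + B @ du
assert np.allclose(pred_lin, f(x0 + dx, u0 + du), atol=1e-3)
```

The final assertion checks that, near the linearization point, the LTV prediction closely matches the nonlinear model, which is what makes re-linearizing at each instant effective.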
Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code
NASA Technical Reports Server (NTRS)
Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William
2006-01-01
The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation will use new and improved low Earth orbit (LEO) environmental models, together with a recently improved International Space Station (ISS) shield model, to validate computational models and procedures against measured data aboard the ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.
NASA Astrophysics Data System (ADS)
Aubé, M.; Simoneau, A.
2018-05-01
Illumina is one of the most physically detailed artificial night sky brightness models to date. It has been in continuous development since 2005 [1]. In 2016-17, many improvements were made to the Illumina code, including an overhead cloud scheme, an improved blocking scheme for subgrid obstacles (trees and buildings), and, most importantly, a full hyperspectral modeling approach. Code optimization significantly reduced execution time, enabling users to run the model on standard personal computers for some applications. After describing the new schemes introduced in the model, we give some examples of applications for a peri-urban and a rural site, both located inside the International Dark Sky Reserve of Mont-Mégantic (QC, Canada).
Decker, Martha M; Buggey, Tom
2014-01-01
The authors compared the effects of video self-modeling and video peer modeling on oral reading fluency of elementary students with learning disabilities. A control group was also included to gauge general improvement due to reading instruction and familiarity with researchers. The results indicated that both interventions resulted in improved fluency. Students in both experimental groups improved their reading fluency. Two students in the self-modeling group made substantial and immediate gains beyond any of the other students. Discussion is included that focuses on the importance that positive imagery can have on student performance and the possible applications of both forms of video modeling with students who have had negative experiences in reading.
NASA Astrophysics Data System (ADS)
Pomeroy, J. W.; Fang, X.
2014-12-01
The vast effort in hydrology devoted to parameter calibration as a means to improve model performance assumes that the models concerned are not fundamentally wrong. By focussing on finding optimal parameter sets and ascribing poor model performance to parameter or data uncertainty, these efforts may fail to consider the need to improve models with more intelligent descriptions of hydrological processes. To test this hypothesis, a flexible physically based hydrological model including a full suite of snow hydrology processes as well as warm season, hillslope and groundwater hydrology was applied to Marmot Creek Research Basin, Canadian Rocky Mountains, where excellent driving meteorology and basin biophysical descriptions exist. Model parameters were set from values found in the basin or from similar environments; no parameters were calibrated. The model was tested against snow surveys and streamflow observations. The model used algorithms that describe snow redistribution, sublimation and forest canopy effects on snowmelt and evaporative processes that are rarely implemented in hydrological models. To investigate the contribution of these processes to model predictive capability, the model was "falsified" by deleting parameterisations for forest canopy snow mass and energy, blowing snow, intercepted rain evaporation, and sublimation. Model falsification by ignoring forest canopy processes contributed to a large increase in SWE errors for forested portions of the research basin, with RMSE increasing from 19 to 55 mm and mean bias (MB) increasing from 0.004 to 0.62. In the alpine tundra portion, removing blowing snow processes resulted in an increase in model SWE MB from 0.04 to 2.55 on north-facing slopes and from -0.006 to -0.48 on south-facing slopes. Eliminating these algorithms degraded streamflow prediction, with the Nash-Sutcliffe efficiency dropping from 0.58 to 0.22 and MB increasing from 0.01 to 0.09.
These results show that including snow redistribution and melt processes associated with wind transport and forest canopies dramatically improves the model. As most hydrological models do not currently include these processes, it is suggested that modellers first improve the realism of model structures before trying to optimise what are inherently inadequate simulations of hydrology.
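The skill scores quoted above (RMSE, mean bias, Nash-Sutcliffe efficiency) can be computed as in this sketch; the study's exact mean-bias normalization is not stated, so a relative bias is assumed, and all data values are invented:

```python
import numpy as np

# Skill metrics of the kind quoted above: RMSE, mean bias (MB) and
# Nash-Sutcliffe efficiency (NSE). The MB normalization here,
# sum(sim - obs) / sum(obs), is an assumption.
def skill(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    mb = np.sum(sim - obs) / np.sum(obs)
    nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    return rmse, mb, nse

obs = np.array([100.0, 150.0, 200.0, 120.0])   # e.g. observed SWE, mm (invented)
sim = np.array([110.0, 140.0, 210.0, 130.0])   # simulated values (invented)
rmse, mb, nse = skill(sim, obs)
assert rmse == 10.0
```

An NSE of 1 is a perfect fit and values near 0 mean the model is no better than the observed mean, which is why the drop from 0.58 to 0.22 in the falsification experiment indicates serious degradation.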
NASA Astrophysics Data System (ADS)
Song, Chi; Zhang, Xuejun; Zhang, Xin; Hu, Haifei; Zeng, Xuefeng
2017-06-01
A rigid conformal (RC) lap can smooth mid-spatial-frequency (MSF) errors, which are naturally smaller than the tool size, while still removing large-scale errors in a short time. However, the RC-lap smoothing efficiency is poorer than expected, and existing smoothing models cannot explicitly specify methods to improve it. We presented an explicit time-dependent smoothing evaluation model that contained specific smoothing parameters directly derived from the parametric smoothing model and the Preston equation. Based on the time-dependent model, we proposed a strategy to improve the RC-lap smoothing efficiency, which incorporated the theoretical model, tool optimization, and efficiency limit determination. Two sets of smoothing experiments were performed to demonstrate the smoothing efficiency achieved using the time-dependent smoothing model. A high, theory-like tool influence function and a limiting tool speed of 300 RPM were obtained.
Model Uncertainty Quantification Methods In Data Assimilation
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.
2017-12-01
Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging; from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is in model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
NASA Astrophysics Data System (ADS)
Dobronets, Boris S.; Popova, Olga A.
2018-05-01
The paper considers a new approach to regression modeling that uses aggregated data presented in the form of density functions. Approaches to improving the reliability of aggregation of empirical data are considered: improving accuracy and estimating errors. We discuss data aggregation procedures as a preprocessing stage for subsequent regression modeling. An important feature of the study is the demonstration of how to represent the aggregated data. It is proposed to use piecewise polynomial models, including spline aggregate functions. We show that the proposed approach to data aggregation can be interpreted as a frequency distribution, whose properties are studied using the density function concept. Various types of mathematical models of data aggregation are discussed. For the construction of regression models, it is proposed to use data representation procedures based on piecewise polynomial models. New approaches to modeling functional dependencies based on spline aggregations are proposed.
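As a rough illustration of aggregating raw observations into a density function before regression, the sketch below uses a normalized histogram, i.e. a piecewise-constant density; the paper's piecewise-polynomial/spline representation would be fitted on top of such an aggregate, and the sample here is invented:

```python
import numpy as np

# Sketch: aggregate raw observations into a density function (here a
# normalized histogram; a spline or piecewise-polynomial model would be
# fitted on this aggregate in the paper's approach).
rng = np.random.default_rng(1)
data = rng.normal(loc=5.0, scale=2.0, size=10_000)   # invented raw sample

counts, edges = np.histogram(data, bins=30)
widths = np.diff(edges)
density = counts / (counts.sum() * widths)           # integrates to 1

# The aggregated density, not the raw sample, is what enters the
# subsequent regression modeling stage.
assert abs(np.sum(density * widths) - 1.0) < 1e-9
```

The normalization check confirms the aggregate is a valid density, which is the property that lets it be interpreted as a frequency distribution.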
Using Ecosystem Experiments to Improve Vegetation Models
Medlyn, Belinda; Zaehle, S; DeKauwe, Martin G.; ...
2015-05-21
Ecosystem responses to rising CO2 concentrations are a major source of uncertainty in climate change projections. Data from ecosystem-scale Free-Air CO2 Enrichment (FACE) experiments provide a unique opportunity to reduce this uncertainty. The recent FACE Model-Data Synthesis project aimed to use the information gathered in two forest FACE experiments to assess and improve land ecosystem models. A new 'assumption-centred' model intercomparison approach was used, in which participating models were evaluated against experimental data based on the ways in which they represent key ecological processes. Identifying and evaluating the main assumptions that caused differences among models, the assumption-centred approach produced a clear roadmap for reducing model uncertainty. We explain this approach and summarize the resulting research agenda. We encourage the application of this approach in other model intercomparison projects to fundamentally improve predictive understanding of the Earth system.
Titan I propulsion system modeling and possible performance improvements
NASA Astrophysics Data System (ADS)
Giusti, Oreste
This thesis features the Titan I propulsion systems and offers data-supported suggestions for improvements to increase performance. The original propulsion systems were modeled both graphically in CAD and via equations. Due to the limited availability of published information, it was necessary to create a more detailed, secondary set of models. Various engineering equations---pertinent to rocket engine design---were implemented in order to generate the desired extra detail. This study describes how these new models were then imported into the ESI CFD Suite. Various parameters are applied to these imported models as inputs that include, for example, bi-propellant combinations, pressure, temperatures, and mass flow rates. The results were then processed with ESI VIEW, which is visualization software. The output files were analyzed for forces in the nozzle, and various results were generated, including sea level thrust and ISP. Experimental data are provided to compare the original engine configuration models to the derivative suggested improvement models.
Gravity model improvement investigation. [improved gravity model for determination of ocean geoid
NASA Technical Reports Server (NTRS)
Siry, J. W.; Kahn, W. D.; Bryan, J. W.; Vonbun, F. F.
1973-01-01
This investigation was undertaken to improve the gravity model and hence the ocean geoid. A specific objective is the determination of the gravity field and geoid with a space resolution of approximately 5 deg and a height resolution of the order of five meters. The concept of the investigation is to utilize both GEOS-C altimeter and satellite-to-satellite tracking data to achieve the gravity model improvement. It is also planned to determine the geoid in selected regions with a space resolution of about a degree and a height resolution of the order of a meter or two. The short term objectives include the study of the gravity field in the GEOS-C calibration area outlined by Goddard, Bermuda, Antigua, and Cape Kennedy, and also in the eastern Pacific area which is viewed by ATS-F.
Lunar Gravity Field Determination Using SELENE Same-Beam Differential VLBI Tracking Data
NASA Technical Reports Server (NTRS)
Goossens, S.; Matsumoto, K.; Liu, Q.; Kikuchi, F.; Sato, K.; Hanada, H.; Ishihara, Y.; Noda, H.; Kawano, N.; Namiki, N.;
2010-01-01
A lunar gravity field model up to degree and order 100 in spherical harmonics, named SGM100i, has been determined from SELENE and historical tracking data, with an emphasis on using same-beam S-band differential VLBI data obtained in the SELENE mission between January 2008 and February 2009. Orbit consistency throughout the entire mission period of SELENE, as determined from orbit overlaps for the two SELENE sub-satellites involved in the VLBI tracking, improved consistently from several hundreds of metres to several tens of metres by including the differential VLBI data. Through the better-determined orbits, the gravity field model is also improved by including these data. Orbit determination performance for the new model shows improvements over earlier 100th degree and order models, especially for edge-on orbits over the deep far side. Lunar Prospector orbit determination shows an improvement in orbit consistency from 1-day predictions for 2-day arcs of 6 m in a total sense, with most of the improvement in the along-track and cross-track directions. The fit to the data types and satellites involved is also improved. Formal errors for the lower degrees are smaller, and the new model also shows increased correlations with topography over the far side. The estimated value of the lunar GM for this model equals 4902.80080 +/- 0.0009 cu km/sq s (10 sigma). The lunar degree-2 potential Love number k2 was also estimated and has a value of 0.0255 +/- 0.0016 (10 sigma as well).
Modelled female sale options demonstrate improved profitability in northern beef herds.
Niethe, G E; Holmes, W E
2008-12-01
To examine the impact of improving the average value of cows sold, the risk of decreasing the number weaned, and total sales on the profitability of northern Australian cattle breeding properties. Gather, model and interpret breeder herd performances and production parameters on properties from six beef-producing regions in northern Australia. Production parameters, prices, costs and herd structure were entered into a herd simulation model for six northern Australian breeding properties that spay females to enhance their marketing options. After the data were validated by management, alternative management strategies were modelled using current market prices and most likely herd outcomes. The model predicted a close relationship between the average sale value of cows, the total herd sales and the gross margin/adult equivalent. Keeping breeders out of the herd to fatten generally improves their sale value, and this can be cost-effective, despite the lower number of progeny produced and the subsequent reduction in total herd sales. Furthermore, if the price of culled cows exceeds the price of culled heifers, provided there are sufficient replacement pregnant heifers available to maintain the breeder herd nucleus, substantial gains in profitability can be obtained by decreasing the age at which cows are culled from the herd. Generalised recommendations on improving reproductive performance are not necessarily the most cost-effective strategy to improve breeder herd profitability. Judicious use of simulation models is essential to help develop the best turnoff strategies for females and to improve station profitability.
Keegan, Ronan M; McNicholas, Stuart J; Thomas, Jens M H; Simpkin, Adam J; Simkovic, Felix; Uski, Ville; Ballard, Charles C; Winn, Martyn D; Wilson, Keith S; Rigden, Daniel J
2018-03-01
Increasing sophistication in molecular-replacement (MR) software and the rapid expansion of the PDB in recent years have allowed the technique to become the dominant method for determining the phases of a target structure in macromolecular X-ray crystallography. In addition, improvements in bioinformatic techniques for finding suitable homologous structures for use as MR search models, combined with developments in refinement and model-building techniques, have pushed the applicability of MR to lower sequence identities and made weak MR solutions more amenable to refinement and improvement. MrBUMP is a CCP4 pipeline which automates all stages of the MR procedure. Its scope covers everything from the sourcing and preparation of suitable search models right through to rebuilding of the positioned search model. Recent improvements to the pipeline include the adoption of more sensitive bioinformatic tools for sourcing search models, enhanced model-preparation techniques including better ensembling of homologues, and the use of phase improvement and model building on the resulting solution. The pipeline has also been deployed as an online service through CCP4 online, which allows its users to exploit large bioinformatic databases and coarse-grained parallelism to speed up the determination of a possible solution. Finally, the molecular-graphics application CCP4mg has been combined with MrBUMP to provide an interactive visual aid to the user during the process of selecting and manipulating search models for use in MR. Here, these developments in MrBUMP are described with a case study to explore how some of the enhancements to the pipeline and to CCP4mg can help to solve a difficult case.
Improvements in mode-based waveform modeling and application to Eurasian velocity structure
NASA Astrophysics Data System (ADS)
Panning, M. P.; Marone, F.; Kim, A.; Capdeville, Y.; Cupillard, P.; Gung, Y.; Romanowicz, B.
2006-12-01
We introduce several recent improvements to mode-based 3D and asymptotic waveform modeling and examine how to integrate them with numerical approaches for an improved model of upper-mantle structure under eastern Eurasia. The first step in our approach is to create a large-scale starting model including shear anisotropy using Nonlinear Asymptotic Coupling Theory (NACT; Li and Romanowicz, 1995), which models the 2D sensitivity of the waveform to the great-circle path between source and receiver. We have recently improved this approach by implementing new crustal corrections which include a non-linear correction for the difference between the average structure of several large regions from the global model with further linear corrections to account for the local structure along the path between source and receiver (Marone and Romanowicz, 2006; Panning and Romanowicz, 2006). This model is further refined using a 3D implementation of Born scattering (Capdeville, 2005). We have made several recent improvements to this method, in particular introducing the ability to represent perturbations to discontinuities. While the approach treats all sensitivity as linear perturbations to the waveform, we have also experimented with a non-linear modification analogous to that used in the development of NACT. This allows us to treat large accumulated phase delays determined from a path-average approximation non-linearly, while still using the full 3D sensitivity of the Born approximation. Further refinement of shallow regions of the model is obtained using broadband forward finite-difference waveform modeling. We are also integrating a regional Spectral Element Method code into our tomographic modeling, allowing us to move beyond many assumptions inherent in the analytic mode-based approaches, while still taking advantage of their computational efficiency. Illustrations of the effects of these increasingly sophisticated steps will be presented.
NASA Astrophysics Data System (ADS)
Yu, Liuqian; Fennel, Katja; Bertino, Laurent; Gharamti, Mohamad El; Thompson, Keith R.
2018-06-01
Effective data assimilation methods for incorporating observations into marine biogeochemical models are required to improve hindcasts, nowcasts and forecasts of the ocean's biogeochemical state. Recent assimilation efforts have shown that updating model physics alone can degrade biogeochemical fields while only updating biogeochemical variables may not improve a model's predictive skill when the physical fields are inaccurate. Here we systematically investigate whether multivariate updates of physical and biogeochemical model states are superior to only updating either physical or biogeochemical variables. We conducted a series of twin experiments in an idealized ocean channel that experiences wind-driven upwelling. The forecast model was forced with biased wind stress and perturbed biogeochemical model parameters compared to the model run representing the "truth". Taking advantage of the multivariate nature of the deterministic Ensemble Kalman Filter (DEnKF), we assimilated different combinations of synthetic physical (sea surface height, sea surface temperature and temperature profiles) and biogeochemical (surface chlorophyll and nitrate profiles) observations. We show that when biogeochemical and physical properties are highly correlated (e.g., thermocline and nutricline), multivariate updates of both are essential for improving model skill and can be accomplished by assimilating either physical (e.g., temperature profiles) or biogeochemical (e.g., nutrient profiles) observations. In our idealized domain, the improvement is largely due to a better representation of nutrient upwelling, which results in a more accurate nutrient input into the euphotic zone. In contrast, assimilating surface chlorophyll improves the model state only slightly, because surface chlorophyll contains little information about the vertical density structure. 
We also show that a degradation of the correlation between observed subsurface temperature and nutrient fields, which has been an issue in several previous assimilation studies, can be reduced by multivariate updates of physical and biogeochemical fields.
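The multivariate update at the heart of these experiments can be sketched with a toy deterministic EnKF step (the Sakov and Oke form): an observation of temperature also corrects the unobserved nutrient state through the ensemble cross-covariance. The ensemble size, correlations and observation value below are illustrative, not from the study.

```python
import numpy as np

# Toy multivariate DEnKF update: a temperature observation also moves
# nutrient via the ensemble cross-covariance. All numbers are illustrative.
rng = np.random.default_rng(0)
ne = 100                                  # ensemble size
temp = 15.0 + rng.normal(0.0, 1.0, ne)    # state 1: temperature
nutr = 5.0 - 0.8 * (temp - 15.0) + rng.normal(0.0, 0.2, ne)  # anticorrelated
X = np.vstack([temp, nutr])               # state ensemble, shape (2, ne)

H = np.array([[1.0, 0.0]])                # observe temperature only
r = 0.25                                  # observation error variance
y = 16.0                                  # synthetic observation

xm = X.mean(axis=1, keepdims=True)
A = X - xm                                # ensemble anomalies
P = A @ A.T / (ne - 1)                    # ensemble covariance
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + r)

xm_a = xm + K @ (y - H @ xm)              # mean updated with the full gain
A_a = A - 0.5 * K @ (H @ A)               # DEnKF: anomalies with half gain
X_a = xm_a + A_a

print(xm_a.ravel())                       # nutrient moves though unobserved
```

Because the prior temperature-nutrient covariance is negative, pulling temperature toward the warmer observation pushes the nutrient mean down, which is exactly the cross-variable transfer the twin experiments rely on.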
Keegan, Ronan M.; McNicholas, Stuart J.; Thomas, Jens M. H.; Simpkin, Adam J.; Uski, Ville; Ballard, Charles C.
2018-01-01
Increasing sophistication in molecular-replacement (MR) software and the rapid expansion of the PDB in recent years have allowed the technique to become the dominant method for determining the phases of a target structure in macromolecular X-ray crystallography. In addition, improvements in bioinformatic techniques for finding suitable homologous structures for use as MR search models, combined with developments in refinement and model-building techniques, have pushed the applicability of MR to lower sequence identities and made weak MR solutions more amenable to refinement and improvement. MrBUMP is a CCP4 pipeline which automates all stages of the MR procedure. Its scope covers everything from the sourcing and preparation of suitable search models right through to rebuilding of the positioned search model. Recent improvements to the pipeline include the adoption of more sensitive bioinformatic tools for sourcing search models, enhanced model-preparation techniques including better ensembling of homologues, and the use of phase improvement and model building on the resulting solution. The pipeline has also been deployed as an online service through CCP4 online, which allows its users to exploit large bioinformatic databases and coarse-grained parallelism to speed up the determination of a possible solution. Finally, the molecular-graphics application CCP4mg has been combined with MrBUMP to provide an interactive visual aid to the user during the process of selecting and manipulating search models for use in MR. Here, these developments in MrBUMP are described with a case study to explore how some of the enhancements to the pipeline and to CCP4mg can help to solve a difficult case. PMID:29533225
NASA Astrophysics Data System (ADS)
Yassin, F.; Anis, M. R.; Razavi, S.; Wheater, H. S.
2017-12-01
Water management through reservoirs, diversions, and irrigation has significantly changed river flow regimes and basin-wide energy and water balance cycles. Failure to represent these effects limits the performance of land surface-hydrology models not only for streamflow prediction but also for the estimation of soil moisture, evapotranspiration, and feedbacks to the atmosphere. Despite recent research to improve the representation of water management in land surface models, there remains a need to develop improved modeling approaches that work in complex and highly regulated basins such as the 406,000 km2 Saskatchewan River Basin (SaskRB). A particular challenge for regional and global application is a lack of local information on reservoir operational management. To this end, we implemented a reservoir operation, water abstraction, and irrigation algorithm in the MESH land surface-hydrology model and tested it over the SaskRB. MESH is Environment Canada's land surface-hydrology modeling system, which couples the Canadian Land Surface Scheme (CLASS) with a hydrological routing model. The implemented reservoir algorithm uses an inflow-outflow relationship that accounts for the physical characteristics of reservoirs (e.g., storage-area-elevation relationships) and includes simplified operational characteristics based on local information (e.g., monthly target volume and release under limited, normal, and flood storage zones). The irrigation algorithm uses the difference between actual and potential evapotranspiration to estimate irrigation water demand. This irrigation demand is supplied from the neighboring reservoirs/diversions in the river system. We calibrated the model enabled with the new reservoir and irrigation modules in a multi-objective optimization setting. 
Results showed that the reservoir and irrigation modules significantly improved the MESH model's performance in simulating streamflow and evapotranspiration across the SaskRB, and that our approach provides a basis for improved large-scale hydrological modelling.
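The irrigation and reservoir logic described above (demand as the gap between potential and actual evapotranspiration, met from a neighboring reservoir subject to available storage) can be sketched as follows. Function names and all numbers are illustrative, not MESH code.

```python
# Hedged sketch of the abstract's irrigation logic: unmet evaporative demand
# becomes an irrigation request, supplied from reservoir storage above a
# dead pool. All values are illustrative.

def irrigation_demand(pet_mm, aet_mm):
    """Unmet evaporative demand (mm) to be met by irrigation."""
    return max(0.0, pet_mm - aet_mm)

def supply_from_reservoir(demand_m3, storage_m3, dead_storage_m3=0.0):
    """Release the demand if storage above the dead pool allows, else less."""
    available = max(0.0, storage_m3 - dead_storage_m3)
    release = min(demand_m3, available)
    return release, storage_m3 - release

d = irrigation_demand(pet_mm=6.2, aet_mm=4.7)      # 1.5 mm deficit
demand_m3 = d / 1000.0 * 2.0e7                     # spread over a 2,000 ha field
release, storage = supply_from_reservoir(demand_m3, storage_m3=5.0e4,
                                         dead_storage_m3=1.0e4)
print(release, storage)
```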
Improving a Spectral Bin Microphysical Scheme Using TRMM Satellite Observations
NASA Technical Reports Server (NTRS)
Li, Xiaowen; Tao, Wei-Kuo; Matsui, Toshihisa; Liu, Chuntao; Masunaga, Hirohiko
2010-01-01
Comparisons between cloud model simulations and observations are crucial in validating model performance and improving the physical processes represented in the model. These modeled physical processes are idealized representations and almost always have large room for improvement. In this study, we use data from two different sensors onboard the TRMM (Tropical Rainfall Measurement Mission) satellite to improve the microphysical scheme in the Goddard Cumulus Ensemble (GCE) model. Mature-stage squall lines observed by TRMM during late spring and early summer in the central US over a 9-year period are compiled and compared with a case simulation by the GCE model. A unique aspect of the GCE model is that it has a state-of-the-art spectral bin microphysical scheme, which uses 33 different bins to represent the particle size distribution of each of the seven hydrometeor species. A forward radiative transfer model calculates TRMM Precipitation Radar (PR) reflectivity and TRMM Microwave Imager (TMI) 85 GHz brightness temperatures from simulated particle size distributions. Comparisons between model outputs and observations reveal that the model overestimates the sizes of snow/aggregates in the stratiform region of the squall line. After adjusting temperature-dependent collection coefficients among ice-phase particles, PR comparisons become good while TMI comparisons worsen. Further investigations show that the partitioning between graupel (a high-density form of aggregate) and snow (a low-density form of aggregate) needs to be adjusted in order to obtain good comparisons in both PR reflectivity and TMI brightness temperature. This study shows that long-term satellite observations, especially those with multiple sensors, can be very useful in constraining model microphysics. It is also the first study to validate and improve a sophisticated spectral bin microphysical scheme against long-term satellite observations.
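The spectral bin idea, a discrete mass grid with radar reflectivity computed as a high moment of the size distribution, can be sketched as follows. The 33 mass-doubling bins match the bin count mentioned above, but the bin bounds, particle density and concentrations are assumed for illustration.

```python
import numpy as np

# Illustrative bin grid: 33 mass-doubling bins, with Rayleigh-regime radar
# reflectivity as the sixth moment of the size distribution, which is why
# simulated PR returns are so sensitive to assumed snow/aggregate sizes.
# Bin bounds, particle density and concentrations are all assumed values.
m0 = 1.0e-12                              # smallest bin mass (kg), assumed
mass = m0 * 2.0 ** np.arange(33)          # each bin doubles in mass
rho = 100.0                               # low-density aggregate (kg m^-3)
diam = (6.0 * mass / (np.pi * rho)) ** (1.0 / 3.0)   # spherical diameter (m)

n = 1.0e3 * np.exp(-diam / 1.0e-3)        # per-bin concentration (m^-3)

# Rayleigh-regime reflectivity factor Z = sum n_i D_i^6 (mm^6 m^-3)
z = np.sum(n * (diam * 1.0e3) ** 6)
print(z)
```

The D^6 weighting is the point: a modest overestimate of aggregate size inflates simulated reflectivity sharply, consistent with the bias diagnosed above.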
Developing the Mathematics Learning Management Model for Improving Creative Thinking in Thailand
ERIC Educational Resources Information Center
Sriwongchai, Arunee; Jantharajit, Nirat; Chookhampaeng, Sumalee
2015-01-01
The study purposes were: 1) To study current states and problems of relevant secondary students in developing mathematics learning management model for improving creative thinking, 2) To evaluate the effectiveness of model about: a) efficiency of learning process, b) comparisons of pretest and posttest on creative thinking and achievement of…
A Model Schedule for a Capital Improvement Program.
ERIC Educational Resources Information Center
Oates, Arnold D.; Burch, A. Lee
The Model Schedule for a Capital Improvement Program described in this paper encourages school leaders to consider a more holistic view of the planning process. It is intended to assist those responsible for educational facility planning, who must assure that all important and relevant tasks are accomplished in a timely manner. The model's six…
Peng, Yi; Xiong, Xiong; Adhikari, Kabindra; Knadel, Maria; Grunwald, Sabine; Greve, Mogens Humlekrog
2015-01-01
There is a great challenge in combining soil proximal spectra and remote sensing spectra to improve the accuracy of soil organic carbon (SOC) models. This is primarily because mixing of spectral data from different sources and technologies to improve soil models is still in its infancy. The first objective of this study was to integrate information of SOC derived from visible near-infrared reflectance (Vis-NIR) spectra in the laboratory with remote sensing (RS) images to improve predictions of topsoil SOC in the Skjern river catchment, Denmark. The second objective was to improve SOC prediction results by separately modeling uplands and wetlands. A total of 328 topsoil samples were collected and analyzed for SOC. Satellite Pour l’Observation de la Terre (SPOT5), Landsat Data Continuity Mission (Landsat 8) images, laboratory Vis-NIR and other ancillary environmental data including terrain parameters and soil maps were compiled to predict topsoil SOC using Cubist regression and Bayesian kriging. The results showed that the model developed from RS data, ancillary environmental data and laboratory spectral data yielded a lower root mean square error (RMSE) (2.8%) and higher R2 (0.59) than the model developed from only RS data and ancillary environmental data (RMSE: 3.6%, R2: 0.46). Plant-available water (PAW) was the most important predictor for all the models because of its close relationship with soil organic matter content. Moreover, vegetation indices, such as the Normalized Difference Vegetation Index (NDVI) and Enhanced Vegetation Index (EVI), were very important predictors in SOC spatial models. Furthermore, the ‘upland model’ was able to more accurately predict SOC compared with the ‘upland & wetland model’. However, the separately calibrated ‘upland and wetland model’ did not improve the prediction accuracy for wetland sites, since it was not possible to adequately discriminate the vegetation in the RS summer images. 
We conclude that laboratory Vis-NIR spectroscopy adds critical information that significantly improves the prediction accuracy of SOC compared to using RS data alone. We recommend the incorporation of laboratory spectra with RS data and other environmental data to improve soil spatial modeling and digital soil mapping (DSM). PMID:26555071
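The skill scores quoted above are the standard definitions; this sketch reproduces the RMSE and R² computations on synthetic SOC values (not the Skjern catchment data).

```python
import numpy as np

# RMSE and R^2 as used to compare the SOC models; values are synthetic.
obs = np.array([2.1, 3.4, 1.8, 4.0, 2.9])    # observed SOC (%)
pred = np.array([2.4, 3.1, 2.0, 3.6, 3.2])   # model predictions (%)

rmse = np.sqrt(np.mean((pred - obs) ** 2))
ss_res = np.sum((obs - pred) ** 2)
ss_tot = np.sum((obs - obs.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(round(rmse, 3), round(r2, 3))
```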
NASA Astrophysics Data System (ADS)
Hersch, Roger David; Crété, Frédérique
2004-12-01
Dot gain is different when dots are printed alone, printed in superposition with one ink or printed in superposition with two inks. In addition, the dot gain may also differ depending on the solid ink on which the considered halftone layer is superposed. In a previous research project, we developed a model for computing the effective surface coverage of a dot according to its superposition conditions. In the present contribution, we improve the Yule-Nielsen modified Neugebauer model by integrating into it our effective dot surface coverage computation model. Calibration of the reproduction curves mapping nominal to effective surface coverages in every superposition condition is carried out by fitting effective dot surfaces which minimize the sum of square differences between the measured reflection density spectra and reflection density spectra predicted according to the Yule-Nielsen modified Neugebauer model. In order to predict the reflection spectrum of a patch, its known nominal surface coverage values are converted into effective coverage values by weighting the contributions from different reproduction curves according to the weights of the contributing superposition conditions. We analyze the colorimetric prediction improvement brought by our extended dot surface coverage model for clustered-dot offset prints, thermal transfer prints and ink-jet prints. The color differences induced by the differences between measured reflection spectra and reflection spectra predicted according to the new dot surface estimation model are quantified on 729 different cyan, magenta, yellow patches covering the full color gamut. As a reference, these differences are also computed for the classical Yule-Nielsen modified spectral Neugebauer model incorporating a single halftone reproduction curve for each ink. Taking into account dot surface coverages according to different superposition conditions considerably improves the predictions of the Yule-Nielsen modified Neugebauer model. 
In the case of offset prints, the mean difference between predictions and measurements expressed in CIE-LAB CIE-94 ΔE94 values is reduced at 100 lpi from 1.54 to 0.90 (accuracy improvement factor: 1.7) and at 150 lpi it is reduced from 1.87 to 1.00 (accuracy improvement factor: 1.8). Similar improvements have been observed for a thermal transfer printer at 600 dpi, at lineatures of 50 and 75 lpi. In the case of an ink-jet printer at 600 dpi, the mean ΔE94 value is reduced at 75 lpi from 3.03 to 0.90 (accuracy improvement factor: 3.4) and at 100 lpi from 3.08 to 0.91 (accuracy improvement factor: 3.4).
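The Yule-Nielsen modified spectral Neugebauer prediction that the paper extends combines the eight Neugebauer primary reflectances as R(λ) = (Σᵢ aᵢ · Rᵢ(λ)^(1/n))ⁿ, with the Demichel equations giving the weights aᵢ from the (here, effective) ink coverages. A sketch, with illustrative primary spectra and n value:

```python
import numpy as np

# Yule-Nielsen modified spectral Neugebauer (YNSN) prediction. The primary
# reflectances and the n value below are illustrative stand-ins for
# measured calibration data.

def demichel(c, m, y):
    """Demichel weights of the 8 primaries for coverages c, m, y."""
    return np.array([
        (1-c)*(1-m)*(1-y), c*(1-m)*(1-y), (1-c)*m*(1-y), (1-c)*(1-m)*y,
        c*m*(1-y),         c*(1-m)*y,     (1-c)*m*y,     c*m*y,
    ])

def ynsn(c, m, y, primaries, n=2.0):
    """primaries: (8, nb_wavelengths) reflectances, ordered as in demichel()."""
    a = demichel(c, m, y)
    return (a @ primaries ** (1.0 / n)) ** n

# two wavelengths only, purely illustrative spectra
primaries = np.array([
    [0.90, 0.90],  # paper white
    [0.10, 0.80],  # cyan
    [0.80, 0.15],  # magenta
    [0.85, 0.85],  # yellow
    [0.08, 0.12],  # blue (c+m)
    [0.09, 0.70],  # green (c+y)
    [0.70, 0.10],  # red (m+y)
    [0.05, 0.08],  # black (c+m+y)
])
print(ynsn(0.5, 0.2, 0.0, primaries))
```

The paper's contribution enters before this step: the nominal coverages are first mapped to effective coverages using superposition-condition-dependent reproduction curves, and those effective values are what `ynsn` receives.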
Genders, Tessa S S; Steyerberg, Ewout W; Nieman, Koen; Galema, Tjebbe W; Mollet, Nico R; de Feyter, Pim J; Krestin, Gabriel P; Alkadhi, Hatem; Leschka, Sebastian; Desbiolles, Lotus; Meijs, Matthijs F L; Cramer, Maarten J; Knuuti, Juhani; Kajander, Sami; Bogaert, Jan; Goetschalckx, Kaatje; Cademartiri, Filippo; Maffei, Erica; Martini, Chiara; Seitun, Sara; Aldrovandi, Annachiara; Wildermuth, Simon; Stinn, Björn; Fornaro, Jürgen; Feuchtner, Gudrun; De Zordo, Tobias; Auer, Thomas; Plank, Fabian; Friedrich, Guy; Pugliese, Francesca; Petersen, Steffen E; Davies, L Ceri; Schoepf, U Joseph; Rowe, Garrett W; van Mieghem, Carlos A G; van Driessche, Luc; Sinitsyn, Valentin; Gopalan, Deepa; Nikolaou, Konstantin; Bamberg, Fabian; Cury, Ricardo C; Battle, Juan; Maurovich-Horvat, Pál; Bartykowszki, Andrea; Merkely, Bela; Becker, Dávid; Hadamitzky, Martin; Hausleiter, Jörg; Dewey, Marc; Zimmermann, Elke; Laule, Michael
2012-01-01
Objectives To develop prediction models that better estimate the pretest probability of coronary artery disease in low prevalence populations. Design Retrospective pooled analysis of individual patient data. Setting 18 hospitals in Europe and the United States. Participants Patients with stable chest pain without evidence for previous coronary artery disease, if they were referred for computed tomography (CT) based coronary angiography or catheter based coronary angiography (indicated as low and high prevalence settings, respectively). Main outcome measures Obstructive coronary artery disease (≥50% diameter stenosis in at least one vessel found on catheter based coronary angiography). Multiple imputation accounted for missing predictors and outcomes, exploiting strong correlation between the two angiography procedures. Predictive models included a basic model (age, sex, symptoms, and setting), clinical model (basic model factors and diabetes, hypertension, dyslipidaemia, and smoking), and extended model (clinical model factors and use of the CT based coronary calcium score). We assessed discrimination (c statistic), calibration, and continuous net reclassification improvement by cross validation for the four largest low prevalence datasets separately and the smaller remaining low prevalence datasets combined. Results We included 5677 patients (3283 men, 2394 women), of whom 1634 had obstructive coronary artery disease found on catheter based coronary angiography. All potential predictors were significantly associated with the presence of disease in univariable and multivariable analyses. The clinical model improved the prediction, compared with the basic model (cross validated c statistic improvement from 0.77 to 0.79, net reclassification improvement 35%); the coronary calcium score in the extended model was a major predictor (0.79 to 0.88, 102%). Calibration for low prevalence datasets was satisfactory. 
Conclusions Updated prediction models including age, sex, symptoms, and cardiovascular risk factors allow for accurate estimation of the pretest probability of coronary artery disease in low prevalence populations. Addition of coronary calcium scores to the prediction models improves the estimates. PMID:22692650
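Such pretest-probability models take the familiar logistic-regression form; the sketch below shows that shape only. The coefficients are invented for illustration and are NOT the published ones.

```python
import math

# Shape of a pretest-probability model (logistic regression) with
# hypothetical, invented coefficients -- not the paper's fitted values.
def pretest_probability(age, male, typical_chest_pain, low_prevalence_setting):
    z = (-7.0 + 0.06 * age + 1.3 * male + 1.6 * typical_chest_pain
         - 0.9 * low_prevalence_setting)
    return 1.0 / (1.0 + math.exp(-z))

p = pretest_probability(age=55, male=1, typical_chest_pain=1,
                        low_prevalence_setting=1)
print(round(p, 3))
```

The extended model of the paper adds the CT-based coronary calcium score as a further predictor in the same linear term, which is what lifts the c statistic from 0.79 to 0.88.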
Counteracting structural errors in ensemble forecast of influenza outbreaks.
Pei, Sen; Shaman, Jeffrey
2017-10-13
For influenza forecasts generated using dynamical models, forecast inaccuracy is partly attributable to the nonlinear growth of error. As a consequence, quantification of the nonlinear error structure in current forecast models is needed so that this growth can be corrected and forecast skill improved. Here, we inspect the error growth of a compartmental influenza model and find that a robust error structure arises naturally from the nonlinear model dynamics. By counteracting these structural errors, diagnosed using error breeding, we develop a new forecast approach that combines dynamical error correction and statistical filtering techniques. In retrospective forecasts of historical influenza outbreaks for 95 US cities from 2003 to 2014, overall forecast accuracy for outbreak peak timing, peak intensity and attack rate is substantially improved for predicted lead times up to 10 weeks. This error growth correction method can be generalized to improve the forecast accuracy of other infectious disease dynamical models. Inaccuracy of influenza forecasts based on dynamical models is partly due to nonlinear error growth. Here the authors address the error structure of a compartmental influenza model, and develop a new improved forecast approach combining dynamical error correction and statistical filtering techniques.
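A minimal compartmental model of the kind such forecasts are built on can be sketched as an SIRS system with Euler stepping. All rates are illustrative, and the error-breeding correction itself is not reproduced here.

```python
# Minimal SIRS influenza model, Euler-stepped; parameters are illustrative.
def sirs_step(s, i, r, beta, gamma, xi, n, dt=1.0):
    new_inf = beta * s * i / n * dt   # new infections
    new_rec = gamma * i * dt          # recoveries
    new_sus = xi * r * dt             # waning immunity
    return (s - new_inf + new_sus,
            i + new_inf - new_rec,
            r + new_rec - new_sus)

n = 1.0e5
s, i, r = n - 100.0, 100.0, 0.0       # seed 100 infections
for _ in range(100):                  # 100 days
    s, i, r = sirs_step(s, i, r, beta=0.6, gamma=0.3, xi=0.01, n=n)
print(round(s + i + r))               # population is conserved
```

In the paper's setting, ensembles of such trajectories are filtered against surveillance data, and the systematic part of the forecast error (diagnosed by breeding) is subtracted before issuing predictions.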
Toward Improved Fidelity of Thermal Explosion Simulations
NASA Astrophysics Data System (ADS)
Nichols, A. L.; Becker, R.; Howard, W. M.; Wemhoff, A.
2009-12-01
We will present results of an effort to improve the thermal/chemical/mechanical modeling of HMX-based explosives like LX04 and LX10 for thermal cook-off. The original HMX model and analysis scheme were developed by Yoh et al. for use in the ALE3D modeling framework. The current results were built to remedy the deficiencies of that original model. We concentrated our efforts in four areas. The first area was the addition of porosity to the chemical material model framework in ALE3D that is used to model the HMX explosive formulation. This is needed to handle the roughly 2% porosity in solid explosives. The second area was the improvement of the HMX reaction network, which included a reactive phase change model based on work by Henson et al. The third area required adding early decomposition gas species to the CHEETAH material database to develop more accurate equations of state for gaseous intermediates and products. Finally, it was necessary to improve the implicit mechanics module in ALE3D to more naturally handle the long time scales associated with thermal cook-off. The application of the resulting framework to the analysis of the Scaled Thermal Explosion (STEX) experiments will be discussed.
Plant water potential improves prediction of empirical stomatal models.
Anderegg, William R L; Wolf, Adam; Arango-Velez, Adriana; Choat, Brendan; Chmura, Daniel J; Jansen, Steven; Kolb, Thomas; Li, Shan; Meinzer, Frederick; Pita, Pilar; Resco de Dios, Víctor; Sperry, John S; Wolfe, Brett T; Pacala, Stephen
2017-01-01
Climate change is expected to lead to increases in drought frequency and severity, with deleterious effects on many ecosystems. Stomatal responses to changing environmental conditions form the backbone of all ecosystem models, but are based on empirical relationships and are not well-tested during drought conditions. Here, we use a dataset of 34 woody plant species spanning global forest biomes to examine the effect of leaf water potential on stomatal conductance and test the predictive accuracy of three major stomatal models and a recently proposed model. We find that current leaf-level empirical models have consistent biases of over-prediction of stomatal conductance during dry conditions, particularly at low soil water potentials. Furthermore, the recently proposed stomatal conductance model yields increases in predictive capability compared to current models, and with particular improvement during drought conditions. Our results reveal that including stomatal sensitivity to declining water potential and consequent impairment of plant water transport will improve predictions during drought conditions and show that many biomes contain a diversity of plant stomatal strategies that range from risky to conservative stomatal regulation during water stress. Such improvements in stomatal simulation are greatly needed to help unravel and predict the response of ecosystems to future climate extremes.
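A minimal sketch in the spirit of the paper's finding: a Ball-Berry-type empirical conductance model downweighted by a sigmoid response to leaf water potential. The g0, g1, psi50 and slope values are illustrative, not the fitted parameters from the study.

```python
import math

# Empirical stomatal conductance (Ball-Berry form) with an illustrative
# water-potential downregulation factor; all parameter values are assumed.
def ball_berry(an, rh, cs, g0=0.01, g1=9.0):
    """Stomatal conductance (mol m-2 s-1) from net assimilation an
    (umol m-2 s-1), relative humidity rh (0-1), leaf-surface CO2 cs (ppm)."""
    return g0 + g1 * an * rh / cs

def psi_factor(psi, psi50=-2.0, slope=2.0):
    """Fractional conductance retained as water potential psi (MPa) declines."""
    return 1.0 / (1.0 + math.exp(slope * (psi50 - psi)))

g_wet = ball_berry(12.0, 0.7, 400.0) * psi_factor(0.0)
g_dry = ball_berry(12.0, 0.7, 400.0) * psi_factor(-3.5)
print(round(g_wet, 3), round(g_dry, 3))   # conductance collapses when dry
```

Without the `psi_factor` term this model would predict nearly the same conductance in both cases, which is the over-prediction bias during drought that the paper documents.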
The report of the Gravity Field Workshop
NASA Astrophysics Data System (ADS)
Smith, D. E.
1982-04-01
A Gravity Field Workshop was convened to review the actions which could be taken prior to a GRAVSAT mission to improve the Earth's gravity field model. This review focused on the potential improvements in the Earth's gravity field which could be obtained using the current satellite and surface gravity data base. In particular, actions to improve the quality of the gravity field determination through refined measurement corrections, selected data augmentation and a more accurate reprocessing of the data were considered. In addition, recommendations were formulated which define actions which NASA should take to develop the necessary theoretical and computation techniques for gravity model determination and to use these approaches to improve the accuracy of the Earth's gravity model.
Improved Cell Culture Method for Growing Contracting Skeletal Muscle Models
NASA Technical Reports Server (NTRS)
Marquette, Michele L.; Sognier, Marguerite A.
2013-01-01
An improved method for culturing immature muscle cells (myoblasts) into a mature skeletal muscle overcomes some of the notable limitations of prior culture methods. The development of the method is a major advance in tissue engineering in that, for the first time, a cell-based model spontaneously fuses and differentiates into masses of highly aligned, contracting myotubes. This method enables (1) the construction of improved two-dimensional (monolayer) skeletal muscle test beds; (2) development of contracting three-dimensional tissue models; and (3) improved transplantable tissues for biomedical and regenerative medicine applications. With adaptation, this method also offers potential application for production of other tissue types (i.e., bone and cardiac) from corresponding precursor cells.
Improvements in GRACE Gravity Field Determination through Stochastic Observation Modeling
NASA Astrophysics Data System (ADS)
McCullough, C.; Bettadpur, S. V.
2016-12-01
Current unconstrained Release 05 GRACE gravity field solutions from the Center for Space Research (CSR RL05) assume random observation errors following an independent multivariate Gaussian distribution. This modeling of observations, a simplifying assumption, fails to account for long period, correlated errors arising from inadequacies in the background force models. Fully modeling the errors inherent in the observation equations, through the use of a full observation covariance (modeling colored noise), enables optimal combination of GPS and inter-satellite range-rate data and obviates the need for estimating kinematic empirical parameters during the solution process. Most importantly, fully modeling the observation errors drastically improves formal error estimates of the spherical harmonic coefficients, potentially enabling improved uncertainty quantification of scientific results derived from GRACE and optimizing combinations of GRACE with independent data sets and a priori constraints.
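Fully modeling the observation errors amounts to generalized least squares with a full covariance: x̂ = (HᵀC⁻¹H)⁻¹HᵀC⁻¹y, with formal covariance (HᵀC⁻¹H)⁻¹. A toy sketch with an illustrative exponential covariance and a single estimated parameter (none of the numbers are GRACE values):

```python
import numpy as np

# Generalized least squares with a full observation covariance C, i.e.
# "modeling colored noise". Toy 1-parameter example with assumed values.
rng = np.random.default_rng(1)
nobs = 50
H = np.ones((nobs, 1))                        # estimate a constant bias

t = np.arange(nobs)
C = 0.1 * np.exp(-np.abs(t[:, None] - t[None, :]) / 10.0)  # correlated errors
L = np.linalg.cholesky(C)
y = 3.0 + L @ rng.standard_normal(nobs)       # truth = 3.0, colored noise

Ci = np.linalg.inv(C)
P_formal = np.linalg.inv(H.T @ Ci @ H)        # formal error variance
x_gls = (P_formal @ H.T @ Ci @ y)[0]
sigma2_formal = P_formal[0, 0]

# a white-noise assumption would report an overconfident 0.1 / nobs
print(x_gls, y.mean(), sigma2_formal)
```

The key point for the abstract above is the last line: with correlated errors, the full-covariance formal variance is substantially larger (more honest) than the independent-Gaussian value, which is what "drastically improves formal error estimates" refers to.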
Passage relevance models for genomics search.
Urbain, Jay; Frieder, Ophir; Goharian, Nazli
2009-03-19
We present a passage relevance model for integrating syntactic and semantic evidence of biomedical concepts and topics using a probabilistic graphical model. Component models of topics, concepts, terms, and document are represented as potential functions within a Markov Random Field. The probability of a passage being relevant to a biologist's information need is represented as the joint distribution across all potential functions. Relevance model feedback of top ranked passages is used to improve distributional estimates of query concepts and topics in context, and a dimensional indexing strategy is used for efficient aggregation of concept and term statistics. By integrating multiple sources of evidence including dependencies between topics, concepts, and terms, we seek to improve genomics literature passage retrieval precision. Using this model, we are able to demonstrate statistically significant improvements in retrieval precision using a large genomics literature corpus.
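The joint distribution over potential functions has the usual log-linear Markov Random Field form, a product of exponentiated weighted features. The cliques, feature values and weights below are invented for illustration, not the paper's model.

```python
import math

# Log-linear MRF-style relevance score: the joint distribution is
# proportional to a product of potential functions exp(w_c * f_c).
# Feature names and weights are illustrative.
def passage_score(features, weights):
    """Unnormalized P(relevance, passage, query) = exp(sum_c w_c * f_c)."""
    return math.exp(sum(weights[c] * features[c] for c in features))

weights = {"term_match": 0.8, "concept_match": 1.2, "topic_match": 0.6}
s1 = passage_score({"term_match": 2.0, "concept_match": 1.0,
                    "topic_match": 0.5}, weights)
s0 = passage_score({"term_match": 1.0, "concept_match": 0.0,
                    "topic_match": 0.5}, weights)
print(s1 > s0)   # passage with concept evidence ranks higher
```

Since only the ranking matters, the normalizing constant of the joint distribution can be dropped, which is why the unnormalized product suffices for retrieval.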
NASA Astrophysics Data System (ADS)
Song, Yanpo; Peng, Xiaoqi; Tang, Ying; Hu, Zhikun
2013-07-01
To improve the operation level of copper converters, an approach to optimal decision-making modeling for the copper matte converting process based on data mining is studied. In view of the characteristics of the process data, such as noise and small sample size, a new robust improved ANN (artificial neural network) modeling method is proposed; taking into account the application purpose of the decision-making model, three new evaluation indexes, named support, confidence and relative confidence, are proposed; using real production data and the above methods, an optimal decision-making model for the blowing time of the S1 period (the first slag-producing period) is developed. Simulation results show that this model can significantly improve the converting quality of the S1 period, increasing the optimal probability from about 70% to about 85%.
Bernath, Katrin; Roschewitz, Anna
2008-11-01
The extension of contingent valuation models with an attitude-behavior based framework has been proposed in order to improve the descriptive and predictive ability of the models. This study examines the potential of the theory of planned behavior to explain willingness to pay (WTP) in a contingent valuation survey of the recreational benefits of the Zurich city forests. Two aspects of WTP responses, protest votes and bid levels, were analyzed separately. In both steps, models with and without the psychological predictors proposed by the theory of planned behavior were compared. Whereas the inclusion of the psychological predictors significantly improved explanations of protest votes, their ability to improve the performance of the model explaining bid levels was limited. The results indicate that the interpretation of bid levels as behavioral intention may not be appropriate and that the potential of the theory of planned behavior to improve contingent valuation models depends on which aspect of WTP responses is examined.
Communication in palliative care: the applicability of the SAGE and THYME model in Singapore.
Martin, Ang Seng Hock; Costello, John; Griffiths, Jane
2017-06-02
The majority of the progress and development in palliative care in the last decade has been in the physical aspects of treatment, namely pain and symptom management. Psychosocial aspects of care have improved, although not enough to meet the needs of many patients and family members. This is evident in many parts of the world, and notably in Singapore, where palliative care is seen as an emerging medical and nursing specialty. This article discusses the implementation of the SAGE and THYME communication model in a palliative care context and examines how its use can improve communication between patients and nurses. It reviews contemporary developments in improving communication in palliative care, including the importance of meeting individual needs, therapeutic relationship building, and advanced communication training within a Singaporean context. The implementation of the SAGE and THYME model can be a useful way of enabling nurses to improve and maintain effective communication in a medically dominated health care system. The review also considers the challenges and constraints of educating and training nurses with limited skills in palliative care, including the cultural and attitudinal constraints specific to Singaporean palliative care.
NASA Astrophysics Data System (ADS)
Pinem, M.; Fauzi, R.
2018-02-01
One technique for ensuring continuity of wireless communication services and keeping transitions smooth on mobile communication networks is soft handover. In the soft handover (SHO) technique, the addition and removal of base stations from the active set are determined by initiation triggers, one of which is based on received signal strength. In this paper we observed the influence of the parameters of large-scale radio propagation models on the performance of mobile communications. The parameters used to characterize the performance of the mobile system are drop call rate, radio link degradation rate, and average size of the active set (AS). The simulated results show that increasing the heights of the base station (BS) and mobile station (MS) antennas improves the received signal power level, which improves radio link quality, increases the average size of the active set, and reduces the average drop call rate. It was also found that Hata’s propagation model contributed significantly greater improvements in system performance parameters than Okumura’s propagation model and Lee’s propagation model.
Improving Crop Productions Using the Irrigation & Crop Production Model Under Drought
NASA Astrophysics Data System (ADS)
Shin, Y.; Lee, T.; Lee, S. H.; Kim, J.; Jang, W.; Park, S.
2017-12-01
We aimed to improve crop production by providing optimal irrigation water amounts (IWAs) for various soils and crops using the Irrigation & Crop Production (ICP) model under various hydro-climatic regions. We selected the Little Washita (LW 13/21) and Bangdong-ri sites, in Oklahoma (United States of America) and Chuncheon (Republic of Korea) respectively, for the synthetic studies. Our results showed that the ICP model performed well in improving crop production by providing optimal IWAs during the study period (2000 to 2016). Crop production was significantly affected by solar radiation and precipitation, whereas the maximum and minimum temperature showed less impact. Considering that weather variables cannot be adjusted by artificial activities, irrigation may be the only solution for improving crop production under drought. Also, the presence of shallow ground water (SGW) table depths highly influences crop production. Although uncertainties exist in the synthetic studies, our results showed the robustness of the ICP model for improving crop production under drought conditions. Thus, the ICP model can contribute to efficient water management plans under drought in regions where water availability is limited.
NASA Astrophysics Data System (ADS)
Lupita, Alessandra; Rangkuti, Sabrina Heriza; Sutopo, Wahyudi; Hisjam, Muh.
2017-11-01
There are significant differences in the quality and price of beef between traditional markets and modern markets in Indonesia. These are caused by very different treatment of the commodity: in the slaughter lines, the transportation from the abattoir to the outlet, the display system, and the control system. If the problem is not solved by the Government, the gap will result in a great loss to consumers regarding quality, and will threaten the sustainability of traditional traders' businesses because of declining interest in purchasing beef in traditional markets. This article aims to improve the quality of beef in traditional markets. This study proposes a supply chain model that involves investment and government incentive schemes for improving the distribution system. The supply chain model can be formulated using Mixed Integer Linear Programming (MILP) and solved using the IBM® ILOG® CPLEX software. The results show that the proposed model can be used to determine the priority of programs for improving the quality and sustainability of traditional beef merchants' businesses. Using the model, the Government can decide which incentives to provide for improving the condition.
Shared or Integrated: Which Type of Integration is More Effective Improves Students’ Creativity?
NASA Astrophysics Data System (ADS)
Mariyam, M.; Kaniawati, I.; Sriyati, S.
2017-09-01
Integrated science learning has various types of integration. This study aims to apply the shared and integrated types of integration with the project-based learning (PjBL) model to improve students’ creativity on a waste recycling theme. The research method used is a quasi-experiment with the matching-only pretest-posttest design. The sample consists of 108 students: 36 students (first experiment class), 35 students (second experiment class) and 37 students (control class) at a junior high school in Tanggamus, Lampung. The results show differences in creativity improvement between the classes taught with the PjBL model with the shared type of integration, with the integrated type of integration, and without any integration on the waste recycling theme. The class taught with the PjBL model with the shared type of integration showed higher creativity improvement than those taught with the integrated type of integration or without any integration. Integrated science learning using the shared type combines only two lessons, so an intact concept results. Thus, the PjBL model with the shared type of integration is more effective at improving students’ creativity than the integrated type.
Care zoning in a psychiatric intensive care unit: strengthening ongoing clinical risk assessment.
Mullen, Antony; Drinkwater, Vincent; Lewin, Terry J
2014-03-01
To implement and evaluate the care zoning model in an eight-bed psychiatric intensive care unit and, specifically, to examine the model's ability to improve the documentation and communication of clinical risk assessment and management. Care zoning guides nurses in assessing clinical risk and planning care within a mental health context. Concerns about the varying quality of clinical risk assessment prompted a trial of the care zoning model in a psychiatric intensive care unit within a regional mental health facility. The care zoning model assigns patients to one of 3 'zones' according to their clinical risk, encouraging nurses to document and implement targeted interventions required to manage those risks. An implementation trial framework was used for this research to refine, implement and evaluate the impact of the model on nurses' clinical practice within the psychiatric intensive care unit, predominantly as a quality improvement initiative. The model was trialled for three months using a pre- and postimplementation staff survey, a pretrial file audit and a weekly file audit. Informal staff feedback was also sought via surveys and regular staff meetings. This trial demonstrated improvement in the quality of mental state documentation, and clinical risk information was identified more accurately. There was limited improvement in the quality of care planning and the documentation of clinical interventions. Nurses' initial concerns over the introduction of the model shifted into overall acceptance and recognition of the benefits. The results of this trial demonstrate that the care zoning model was able to improve the consistency and quality of risk assessment information documented. Care planning and evaluation of associated outcomes showed less improvement. Care zoning remains a highly applicable model for the psychiatric intensive care unit environment and is a useful tool in guiding nurses to carry out routine patient risk assessments. 
© 2013 John Wiley & Sons Ltd.
Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration
NASA Technical Reports Server (NTRS)
Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.
1993-01-01
Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions are examined.
Modeling of turbulent supersonic H2-air combustion with an improved joint beta PDF
NASA Technical Reports Server (NTRS)
Baurle, R. A.; Hassan, H. A.
1991-01-01
Attempts at modeling recent experiments of Cheng et al. indicated that discrepancies between theory and experiment can be a result of the form of assumed probability density function (PDF) and/or the turbulence model employed. Improvements in both the form of the assumed PDF and the turbulence model are presented. The results are again used to compare with measurements. Initial comparisons are encouraging.
Improved Regional Seismic Event Locations Using 3-D Velocity Models
1999-12-15
regional velocity model to estimate event hypocenters. Travel times for the regional phases are calculated using a sophisticated eikonal finite...can greatly improve estimates of event locations. Our algorithm calculates travel times using a finite difference approximation of the eikonal ...such as IASP91 or J-B. 3-D velocity models require more sophisticated travel time modeling routines; thus, we use a 3-D eikonal equation solver
NASA Astrophysics Data System (ADS)
Downey, N.; Begnaud, M. L.; Hipp, J. R.; Ballard, S.; Young, C. S.; Encarnacao, A. V.
2017-12-01
The SALSA3D global 3D velocity model of the Earth was developed to improve the accuracy and precision of seismic travel time predictions for a wide suite of regional and teleseismic phases. Recently, the global SALSA3D model was updated to include additional body wave phases including mantle phases, core phases, reflections off the core-mantle boundary and underside reflections off the surface of the Earth. We show that this update improves travel time predictions and leads directly to significant improvements in the accuracy and precision of seismic event locations as compared to locations computed using standard 1D velocity models like ak135, or 2½D models like RSTT. A key feature of our inversions is that path-specific model uncertainties of travel time predictions are calculated using the full 3D model covariance matrix computed during tomography, which results in more realistic uncertainty ellipses that directly reflect tomographic data coverage. Application of this method can also be done at a regional scale: we present a velocity model with uncertainty obtained using data from the University of Utah Seismograph Stations. These results show a reduction in travel-time residuals for re-located events compared with those obtained using previously published models.
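The location step described above can be illustrated with a toy grid search. The sketch below is hypothetical code, not from the paper: it assumes a uniform velocity and straight rays in place of 3-D travel-time tables, and removes the unknown origin time by demeaning the residuals.

```python
import numpy as np

def locate_event(stations, arrivals, v, grid):
    """Grid-search epicenter: pick the grid point whose predicted straight-ray
    travel times (distance / v) best fit the observed arrival times in a
    least-squares sense, after absorbing the unknown origin time."""
    best, best_misfit = None, np.inf
    for p in grid:
        t_pred = np.linalg.norm(stations - p, axis=1) / v
        r = arrivals - t_pred
        r -= r.mean()                      # demean: absorb the origin time
        misfit = (r ** 2).sum()
        if misfit < best_misfit:
            best, best_misfit = p, misfit
    return best

# synthetic demo: four stations, a source at (40, 60) km, 6 km/s medium
stations = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [80.0, 90.0]])
src = np.array([40.0, 60.0])
arrivals = np.linalg.norm(stations - src, axis=1) / 6.0 + 12.3  # origin 12.3 s
xs = np.arange(0.0, 101.0, 10.0)
grid = np.array([[x, y] for x in xs for y in xs])
loc = locate_event(stations, arrivals, 6.0, grid)
```

With real data, the `distance / v` line would be replaced by lookups into precomputed 3-D travel-time tables, and the misfit would be weighted by the path-specific uncertainties the abstract describes.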
Development of a High Resolution 3D Infant Stomach Model for Surgical Planning
NASA Astrophysics Data System (ADS)
Chaudry, Qaiser; Raza, S. Hussain; Lee, Jeonggyu; Xu, Yan; Wulkan, Mark; Wang, May D.
Medical surgical procedures have not changed much during the past century due to the lack of an accurate low-cost workbench for testing any new improvement. Increasingly cheaper and more powerful computer technologies have made computer-based surgery planning and training feasible. In our work, we have developed an accurate 3D stomach model, which aims to improve the surgical procedure that treats pediatric and neonatal gastro-esophageal reflux disease (GERD). We generate the 3-D infant stomach model based on in vivo computed tomography (CT) scans of an infant. CT is a widely used clinical imaging modality that is cheap but has low spatial resolution. To improve the model accuracy, we use the high resolution Visible Human Project (VHP) in model building. Next, we add soft muscle material properties to make the 3D model deformable. Then we use virtual reality techniques such as haptic devices to make the 3D stomach model deform upon touching force. This accurate 3D stomach model provides a workbench for testing new GERD treatment surgical procedures. It has the potential to reduce or eliminate the extensive cost associated with animal testing when improving any surgical procedure, and ultimately, to reduce the risk associated with infant GERD surgery.
NASA Astrophysics Data System (ADS)
Lee, Soon Hwan; Kim, Ji Sun; Lee, Kang Yeol; Shon, Keon Tae
2017-04-01
Air quality in Korea is worsening due to increasing particulate matter (PM). At present, the PM forecast is issued based on the PM concentration predicted by a numerical air quality prediction model. However, forecast accuracy is not as high as expected due to various uncertainties in the physical and chemical characteristics of PM. The purpose of this study was to develop a numerical-statistical ensemble model to improve the accuracy of PM10 concentration predictions. The numerical models used in this study are the three-dimensional atmospheric Weather Research and Forecasting (WRF) model and the Community Multiscale Air Quality (CMAQ) model. The target areas for the PM forecast are the Seoul, Busan, Daegu, and Daejeon metropolitan areas in Korea. The data used in the model development are PM concentrations and CMAQ predictions, and the data period is 3 months (March 1 - May 31, 2014). The dynamic-statistical technique for reducing the systematic error of the CMAQ predictions was applied to a dynamic linear model (DLM) based on the Bayesian Kalman filter technique. Applying the metrics generated from the dynamic linear model to the forecasting of PM concentrations improved accuracy. In particular, excellent improvement was obtained at high PM concentrations, where the damage is relatively large.
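The bias-correction idea behind such a dynamic linear model can be sketched with a scalar Kalman filter that tracks the systematic error of the raw numerical forecast. This is illustrative code with made-up noise variances, not the study's actual DLM:

```python
import numpy as np

def kalman_bias_correction(forecasts, observations, q=0.5, r=4.0):
    """Correct a forecast series with a scalar Kalman filter that tracks the
    systematic bias of the raw model (q: random-walk process noise on the
    bias, r: observation-error variance; both values here are invented)."""
    bias, p = 0.0, 1.0                    # initial bias estimate and variance
    corrected = []
    for f, obs in zip(forecasts, observations):
        corrected.append(f + bias)        # correct with the current estimate
        p += q                            # predict: bias follows a random walk
        k = p / (p + r)                   # Kalman gain
        bias += k * ((obs - f) - bias)    # update with the latest model error
        p *= 1.0 - k
    return np.array(corrected)

# synthetic demo: the raw model underestimates the truth by about 8 units
rng = np.random.default_rng(0)
truth = 50 + rng.normal(0, 2, 200)
raw = truth - 8 + rng.normal(0, 2, 200)
corr = kalman_bias_correction(raw, truth)
```

After a brief spin-up, the corrected series has removed most of the systematic error; this sequential-updating mechanism is what lets such a post-processor adapt to slowly drifting model biases.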
[The methods of assessment of health risk from exposure to radon and radon daughters].
Demin, V F; Zhukovskiy, M V; Kiselev, S M
2014-01-01
A critical analysis of existing models of the dose-effect relationship (RDE) for the effects of radon exposure on human health has been performed, leading to the conclusion that improving these models is both necessary and possible. A new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, including a method for estimating exposure doses from radon, the improved RDE model, and a proper risk assessment methodology. The methodology is proposed for use in the territory of Russia.
An improved cellular automata model for train operation simulation with dynamic acceleration
NASA Astrophysics Data System (ADS)
Li, Wen-Jun; Nie, Lei
2018-03-01
Urban rail transit plays an important role in the urban public traffic because of its advantages of fast speed, large transport capacity, high safety, reliability and low pollution. This study proposes an improved cellular automaton (CA) model by considering the dynamic characteristic of the train acceleration to analyze the energy consumption and train running time. Constructing an effective model for calculating energy consumption to aid train operation improvement is the basis for studying and analyzing energy-saving measures for urban rail transit system operation.
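A minimal version of the dynamic-acceleration idea, with illustrative (not the paper's) parameter values, is a train whose acceleration weakens as speed grows, accumulating a kinetic-energy proxy for traction energy:

```python
def run_to_station(distance, v_max=20, dt=1):
    """Simulate one train accelerating from rest over `distance` cells with a
    speed-dependent (dynamic) acceleration, returning (running_time,
    energy_proxy). Units are cells and seconds; all values are illustrative."""
    def accel(v):                 # dynamic acceleration: weaker at higher speed
        return 3 if v < 8 else (2 if v < 16 else 1)

    x, v, t, energy = 0, 0, 0, 0.0
    while x < distance:
        v_new = min(v + accel(v) * dt, v_max)
        energy += 0.5 * (v_new ** 2 - v ** 2)   # kinetic-energy increment (unit mass)
        v = v_new
        x += v * dt
        t += dt
    return t, energy

t, energy = run_to_station(1000)
```

A full CA model would additionally encode safe-braking rules against the leading train and station dwell times; this sketch only shows how a speed-dependent acceleration couples running time to an energy tally.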
ERIC Educational Resources Information Center
Ferrão, Maria Eugénia; Couto, Alcino Pinto
2014-01-01
This article focuses on the use of a value-added approach for promoting school improvement. It presents yearly value-added estimates, analyses their stability over time, and discusses the contribution of this methodological approach for promoting school improvement programmes in the Portuguese system of evaluation. The value-added model is applied…
Characterization of structural connections for multicomponent systems
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Huckelbridge, Arthur A.
1988-01-01
This study explores combining Component Mode Synthesis methods for coupling structural components with Parameter Identification procedures for improving the analytical modeling of the connections. Improvements in the connection stiffness and damping properties are computed in terms of physical parameters so that the physical characteristics of the connections can be better understood, in addition to providing improved input for the system model.
ERIC Educational Resources Information Center
Atkinson Duina, Angela
2013-01-01
New regulations attached to ARRA funding of federal School Improvement Fund grants aimed at producing rapid turnaround of low performing schools were highly criticized as unsuitable for rural schools. This mixed-methods study looked at the implementation of the School Improvement Fund Transformation Model in two rural Maine high schools during the…
ERIC Educational Resources Information Center
Spanbauer, Stanley J.
The Measurement and Costing Model (MCM) described in this book was developed and tested at Fox Valley Technical College (FVTC), Wisconsin, to enhance the college's quality improvement process and to serve as a guide to other institutions interested in improving their quality. The book presents a description of the model and outlines seven steps…
Song, Zirui; Rose, Sherri; Chernew, Michael E.; Safran, Dana Gelb
2018-01-01
As population-based payment models become increasingly common, it is crucial to understand how such payment models affect health disparities. We evaluated health care quality and spending among enrollees in areas with lower versus higher socioeconomic status in Massachusetts before and after providers entered into the Alternative Quality Contract, a two-sided population-based payment model with substantial incentives tied to quality. We compared changes in process measures, outcome measures, and spending between enrollees in areas with lower and higher socioeconomic status from 2006 to 2012 (outcome measures were measured after the intervention only). Quality improved for all enrollees in the Alternative Quality Contract after their provider organizations entered the contract. Process measures improved 1.2 percentage points per year more among enrollees in areas with lower socioeconomic status than among those in areas with higher socioeconomic status. Outcome measure improvement was no different between the subgroups; neither were changes in spending. Larger or comparable improvements in quality among enrollees in areas with lower socioeconomic status suggest a potential narrowing of disparities. Strong pay-for-performance incentives within a population-based payment model could encourage providers to focus on improving quality for more disadvantaged populations. PMID:28069849
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritchie, L.T.; Alpert, D.J.; Burke, R.P.
1984-03-01
The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions.
Prediction of stock markets by the evolutionary mix-game model
NASA Astrophysics Data System (ADS)
Chen, Fang; Gou, Chengling; Guo, Xiaoqian; Gao, Jieping
2008-06-01
This paper presents the efforts of using the evolutionary mix-game model, which is a modified form of the agent-based mix-game model, to predict financial time series. Here, we have carried out three methods to improve the original mix-game model by adding the abilities of strategy evolution to agents, and then applying the new model referred to as the evolutionary mix-game model to forecast the Shanghai Stock Exchange Composite Index. The results show that these modifications can improve the accuracy of prediction greatly when proper parameters are chosen.
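The agent-based core that the mix-game modifies is the minority game. A minimal sketch follows (illustrative parameters; the mix-game additionally lets a fraction of agents play a majority rule, and the evolutionary variant lets agents replace poorly scoring strategies):

```python
import numpy as np

def minority_game(n_agents=101, memory=3, n_strategies=2, steps=500, seed=1):
    """Minimal minority game: each agent holds fixed random strategies mapping
    the recent win/lose history to an action in {-1, +1}, plays its
    best-scoring strategy, and the minority side wins each round."""
    rng = np.random.default_rng(seed)
    n_hist = 2 ** memory
    strategies = rng.choice([-1, 1], size=(n_agents, n_strategies, n_hist))
    scores = np.zeros((n_agents, n_strategies))
    history = 0
    attendance = []
    for _ in range(steps):
        best = scores.argmax(axis=1)                       # play best strategy
        actions = strategies[np.arange(n_agents), best, history]
        a = actions.sum()
        attendance.append(a)
        winning = -np.sign(a)                              # minority side wins
        scores += strategies[:, :, history] * winning      # virtual scoring
        history = ((history << 1) | (1 if winning > 0 else 0)) % n_hist
    return np.array(attendance)

att = minority_game()
```

Mapping actions to buy/sell orders turns the attendance series into a toy price signal, which is the bridge the paper exploits when fitting the model to the Shanghai index.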
NASA Astrophysics Data System (ADS)
Viero, Daniele P.
2018-01-01
Citizen science and crowdsourcing are gaining increasing attention among hydrologists. In a recent contribution, Mazzoleni et al. (2017) investigated the integration of crowdsourced data (CSD) into hydrological models to improve the accuracy of real-time flood forecasts. The authors used synthetic CSD (i.e. not actually measured), because real CSD were not available at the time of the study. In their work, which is a proof-of-concept study, Mazzoleni et al. (2017) showed that assimilation of CSD improves the overall model performance; the impact of irregular frequency of available CSD, and that of data uncertainty, were also assessed in depth. However, the use of synthetic CSD in conjunction with (semi-)distributed hydrological models deserves further discussion. As a result of equifinality, poor model identifiability, and deficiencies in model structure, internal states of (semi-)distributed models can hardly mimic the actual states of complex systems away from calibration points. Accordingly, the use of synthetic CSD that are drawn from model internal states under best-fit conditions can lead to overestimation of the effectiveness of CSD assimilation in improving flood prediction. Operational flood forecasting, which results in decisions of high societal value, requires robust knowledge of the model behaviour and an in-depth assessment of both model structure and forcing data. Additional guidelines are given that are useful for the a priori evaluation of CSD for real-time flood forecasting and, hopefully, for planning apt design strategies for both model calibration and collection of CSD.
Accurate and dynamic predictive model for better prediction in medicine and healthcare.
Alanazi, H O; Abdullah, A H; Qureshi, K N; Ismail, A S
2018-05-01
Information and communication technologies (ICTs) have brought new integrated operations and methods to all fields of life. The health sector has also adopted new technologies to improve its systems and provide better services to customers. Predictive models in health care have likewise been influenced by new technologies for predicting disease outcomes. However, existing predictive models still suffer from limitations in predictive performance. To improve predictive performance, this paper proposes a predictive model that classifies disease predictions into different categories. The model is evaluated on traumatic brain injury (TBI) datasets. TBI is one of the most serious diseases worldwide and needs more attention due to its severe impact on human life. The proposed predictive model improves the predictive performance for TBI. The TBI data set was developed and approved by neurologists to set its features. The experimental results show that the proposed model achieves significant results in accuracy, sensitivity, and specificity.
Toward improved simulation of river operations through integration with a hydrologic model
Morway, Eric D.; Niswonger, Richard G.; Triana, Enrique
2016-01-01
Advanced modeling tools are needed for informed water resources planning and management. Two classes of modeling tools are often used to this end: (1) distributed-parameter hydrologic models for quantifying supply and (2) river-operation models for sorting out demands under rule-based systems such as the prior-appropriation doctrine. Within each of these two broad classes of models, there are many software tools that excel at simulating the processes specific to each discipline, but have historically over-simplified, or at worst completely neglected, aspects of the other. As a result, water managers reliant on river-operation models for administering water resources need improved tools for representing spatially and temporally varying groundwater resources in conjunctive-use systems. A new tool is described that improves the representation of groundwater/surface-water (GW-SW) interaction within a river-operations modeling context and, in so doing, advances evaluation of system-wide hydrologic consequences of new or altered management regimes.
Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models
NASA Astrophysics Data System (ADS)
Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.
2017-12-01
Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicate improved prediction accuracies (median of 10-50%) but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially-heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients.
Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream measurements.
The Effect of ISO 9001 and the EFQM Model on Improving Hospital Performance: A Systematic Review
Yousefinezhadi, Taraneh; Mohamadi, Efat; Safari Palangi, Hossein; Akbari Sari, Ali
2015-01-01
Context: This study aimed to explore the effect of the International Organization for Standardization (ISO) ISO 9001 standard and the European foundation for quality management (EFQM) model on improving hospital performance. Evidence Acquisition: PubMed, Embase and the Cochrane Library databases were searched. In addition, Elsevier and Springer were searched as main publishers in the field of health sciences. We included empirical studies with any design that had used ISO 9001 or the EFQM model to improve the quality of healthcare. Data were collected and tabulated into a data extraction sheet that was specifically designed for this study. The collected data included authors’ names, country, year of publication, intervention, improvement aims, setting, length of program, study design, and outcomes. Results: Seven out of the 121 studies that were retrieved met the inclusion criteria. Three studies assessed the EFQM model and four studies assessed the ISO 9001 standard. Use of the EFQM model increased the degree of patient satisfaction and the number of hospital admissions and reduced the average length of stay, the delay on the surgical waiting list, and the number of emergency re-admissions. ISO 9001 also increased the degree of patient satisfaction and patient safety, increased cost-effectiveness, improved the hospital admissions process, and reduced the percentage of unscheduled returns to the hospital. Conclusions: Generally, there is a lack of robust and high quality empirical evidence regarding the effects of ISO 9001 and the EFQM model on the quality care provided by and the performance of hospitals. However, the limited evidence shows that ISO 9001 and the EFQM model might improve hospital performance. PMID:26756012
Parikh, Nisha I.; Jeppson, Rebecca P.; Berger, Jeffrey S.; Eaton, Charles B.; Kroenke, Candyce H.; LeBlanc, Erin S.; Lewis, Cora E.; Loucks, Eric B.; Parker, Donna R.; Rillamas-Sun, Eileen; Ryckman, Kelli K; Waring, Molly E.; Schenken, Robert S.; Johnson, Karen C; Edstedt-Bonamy, Anna-Karin; Allison, Matthew A.; Howard, Barbara V.
2016-01-01
Background Reproductive factors provide an early window into a woman’s coronary heart disease (CHD) risk; however, their contribution to CHD risk stratification is uncertain. Methods and Results In the Women’s Health Initiative Observational Study, we constructed Cox proportional hazards models for CHD including age, pregnancy status, number of live births, age at menarche, menstrual irregularity, age at first birth, stillbirths, miscarriages, infertility ≥ 1 year, infertility cause, and breastfeeding. We next added each candidate reproductive factor to an established CHD risk factor model. A final model was then constructed with significant reproductive factors added to established CHD risk factors. Improvement in C-statistic, net reclassification index (or NRI, with risk categories of <5%, 5–<10%, and ≥10% 10-year risk of CHD) and integrated discriminatory index (IDI) were assessed. Among 72,982 women [n=4607 CHD events, median follow-up=12.0 (IQR=8.3–13.7) years, mean (SD) age 63.2 (7.2) years], an age-adjusted reproductive risk factor model had a C-statistic of 0.675 for CHD. In a model adjusted for established CHD risk factors, younger age at first birth, number of stillbirths, number of miscarriages and lack of breastfeeding were positively associated with CHD. Reproductive factors modestly improved model discrimination (C-statistic increased from 0.726 to 0.730; IDI=0.0013, p-value < 0.0001). Net reclassification for women with events was not improved (NRI events=0.007, p-value=0.18); for women without events it was marginally improved (NRI non-events=0.002, p-value=0.04). Conclusions Key reproductive factors are associated with CHD independently of established CHD risk factors, very modestly improve model discrimination and do not materially improve net reclassification. PMID:27143682
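The categorical NRI reported above can be computed directly from the old and new predicted risks plus the event indicator. A sketch using the paper's risk categories (<5%, 5-<10%, ≥10%), applied to a made-up toy cohort:

```python
import numpy as np

def categorical_nri(p_old, p_new, event, cuts=(0.05, 0.10)):
    """Categorical net reclassification index: among events, the net fraction
    reclassified upward; among non-events, the net fraction reclassified
    downward. `cuts` are the 10-year risk category boundaries."""
    old_cat = np.digitize(p_old, cuts)
    new_cat = np.digitize(p_new, cuts)
    up, down = new_cat > old_cat, new_cat < old_cat
    ev, ne = event == 1, event == 0
    nri_events = (up[ev].sum() - down[ev].sum()) / ev.sum()
    nri_nonevents = (down[ne].sum() - up[ne].sum()) / ne.sum()
    return nri_events, nri_nonevents

# toy cohort: one event moves up a category, one non-event moves down
p_old = np.array([0.04, 0.06, 0.12, 0.06])
p_new = np.array([0.06, 0.06, 0.12, 0.03])
event = np.array([1, 1, 0, 0])
nri_e, nri_ne = categorical_nri(p_old, p_new, event)
```

Both components come out to 0.5 here because half of each subgroup is reclassified in the favourable direction; in the study, both components were near zero, which is why reclassification was judged not materially improved.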
He, Yan-Lin; Xu, Yuan; Geng, Zhi-Qiang; Zhu, Qun-Xiong
2016-03-01
In this paper, a hybrid robust model based on an improved functional link neural network integrating with partial least square (IFLNN-PLS) is proposed. Firstly, an improved functional link neural network with small norm of expanded weights and high input-output correlation (SNEWHIOC-FLNN) was proposed for enhancing the generalization performance of FLNN. Unlike the traditional FLNN, the expanded variables of the original inputs are not directly used as the inputs in the proposed SNEWHIOC-FLNN model. The original inputs are attached to some small norm of expanded weights. As a result, the correlation coefficient between some of the expanded variables and the outputs is enhanced. The larger the correlation coefficient is, the more relevant the expanded variables tend to be. In the end, the expanded variables with larger correlation coefficients are selected as the inputs to improve the performance of the traditional FLNN. In order to test the proposed SNEWHIOC-FLNN model, three UCI (University of California, Irvine) regression datasets named Housing, Concrete Compressive Strength (CCS), and Yacht Hydro Dynamics (YHD) were selected. Then a hybrid model based on the improved FLNN integrating with partial least square (IFLNN-PLS) was built. In the IFLNN-PLS model, the connection weights are calculated using the partial least square method rather than the error back propagation algorithm. Lastly, IFLNN-PLS was developed as an intelligent measurement model for accurately predicting the key variables in the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. Simulation results illustrated that the IFLNN-PLS could significantly improve the prediction performance. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
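The functional-link idea, training a single functionally expanded layer in closed form rather than by back-propagation, can be sketched as follows. A generic trigonometric expansion and plain least squares stand in for the paper's correlation-screened expansion and PLS step:

```python
import numpy as np

def flnn_expand(X):
    """Trigonometric functional expansion common in FLNN work: each input x
    contributes [x, sin(pi x), cos(pi x)]. The paper additionally weights
    and screens expanded terms by input-output correlation."""
    return np.hstack([X, np.sin(np.pi * X), np.cos(np.pi * X)])

def fit_flnn_ls(X, y):
    """Single-layer FLNN whose output weights are found in closed form by
    least squares instead of error back-propagation."""
    Phi = np.hstack([flnn_expand(X), np.ones((len(X), 1))])   # add bias term
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return lambda Xq: np.hstack([flnn_expand(Xq), np.ones((len(Xq), 1))]) @ w

# demo: a nonlinear target that the expanded basis can represent
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.normal(size=200)
model = fit_flnn_ls(X, y)
```

Because the expansion already carries the nonlinearity, the "network" reduces to a linear regression on expanded features, which is why closed-form weight estimation (least squares here, PLS in the paper) is possible at all.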
NASA Technical Reports Server (NTRS)
Youngblut, C.
1984-01-01
Orography and geographically fixed heat sources which force a zonally asymmetric motion field are examined. An extensive space-time spectral analysis of the GLAS climate model (D130) response and observations are compared. An updated version of the model (D150) showed a remarkable improvement in the simulation of the standing waves. The main differences in the model code are an improved boundary layer flux computation and a more realistic specification of the global boundary conditions.
Full velocity difference car-following model considering desired inter-vehicle distance
NASA Astrophysics Data System (ADS)
Xin, Tong; Yi, Liu; Rongjun, Cheng; Hongxia, Ge
Based on the full velocity difference car-following model, an improved car-following model is put forward by considering the driver’s desired inter-vehicle distance. The stability conditions are obtained by applying the control method. The results of theoretical analysis are used to demonstrate the advantages of our model. Numerical simulations are used to show that traffic congestion can be improved as the desired inter-vehicle distance is considered in the full velocity difference car-following model.
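For context, the baseline full velocity difference (FVD) model that this abstract extends can be sketched as follows; the optimal-velocity function and all parameter values are illustrative assumptions, not taken from the paper (which additionally incorporates a desired inter-vehicle distance term):

```python
import math

# Sketch of the full velocity difference (FVD) car-following model.
# The tanh-shaped optimal-velocity function and the parameters v_max, hc,
# kappa, lam are common illustrative choices, not the paper's values.
def optimal_velocity(headway, v_max=33.0, hc=25.0):
    # Desired speed as a function of headway: near zero when close to the
    # leader, saturating at v_max for large headways.
    return (v_max / 2.0) * (math.tanh(headway - hc) + math.tanh(hc))

def fvd_acceleration(v, headway, dv, kappa=0.41, lam=0.5):
    """FVD model: relax toward the optimal velocity for the current headway,
    plus a term proportional to the velocity difference dv
    (leader velocity minus follower velocity)."""
    return kappa * (optimal_velocity(headway) - v) + lam * dv

# A follower slower than its optimal speed while closing on the leader:
a = fvd_acceleration(v=10.0, headway=30.0, dv=-2.0)
```

The stability analysis mentioned in the abstract studies how perturbations grow or decay under such a dynamics when every car follows this rule.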
Consumer preference models: fuzzy theory approach
NASA Astrophysics Data System (ADS)
Turksen, I. B.; Wilson, I. A.
1993-12-01
Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).
Improved meteorology from an updated WRF/CMAQ modeling ...
Realistic vegetation characteristics and phenology from the Moderate Resolution Imaging Spectroradiometer (MODIS) products improve the simulation for the meteorology and air quality modeling system WRF/CMAQ (Weather Research and Forecasting model and Community Multiscale Air Quality model) that employs the Pleim-Xiu land surface model (PX LSM). Recently, PX LSM WRF/CMAQ has been updated in vegetation, soil, and boundary layer processes resulting in improved 2 m temperature (T) and mixing ratio (Q), 10 m wind speed, and surface ozone simulations across the domain compared to the previous version for a period around August 2006. Yearlong meteorology simulations with the updated system demonstrate that MODIS input helps reduce bias of the 2 m Q estimation during the growing season from April to September. Improvements follow the green-up in the southeast from April and move toward the west and north through August. From October to March, MODIS input does not have much influence on the system because vegetation is not as active. The greatest effects of MODIS input include more accurate phenology, better representation of leaf area index (LAI) for various forest ecosystems and agricultural areas, and realistically sparse vegetation coverage in the western drylands. Despite the improved meteorology, MODIS input causes higher bias for the surface O3 simulation in April, August, and October in areas where MODIS LAI is much less than the base LAI. Thus, improvement
A roadmap for improving healthcare service quality.
Kennedy, Denise M; Caselli, Richard J; Berry, Leonard L
2011-01-01
A data-driven, comprehensive model for improving service and creating long-term value was developed and implemented at Mayo Clinic Arizona (MCA). Healthcare organizations can use this model to prepare for value-based purchasing, a payment system in which quality and patient experience measures will influence reimbursement. Surviving and thriving in such a system will require a comprehensive approach to sustaining excellent service performance from physicians and allied health staff (e.g., nurses, technicians, nonclinical staff). The seven prongs in MCA's service quality improvement model are (1) multiple data sources to drive improvement, (2) accountability for service quality, (3) service consultation and improvement tools, (4) service values and behaviors, (5) education and training, (6) ongoing monitoring and control, and (7) recognition and reward. The model was fully implemented and tested in five departments in which patient perception of provider-specific service attributes and/or overall quality of care were below the 90th percentile for patient satisfaction in the vendor's database. Extent of the implementation was at the discretion of department leadership. Perception data rating various service attributes were collected from randomly selected patients and monitored over a 24-month period. The largest increases in patient perception of excellence over the pilot period were realized when all seven prongs of the model were implemented as a comprehensive improvement approach. The results of this pilot may help other healthcare organizations prepare for value-based purchasing.
NASA Technical Reports Server (NTRS)
Hou, Arthur Y.; Einaudi, Franco (Technical Monitor)
2001-01-01
I will discuss the need for accurate rainfall observations to improve our ability to model the earth's climate and improve short-range weather forecasts. I will give an overview of the recent progress in using rainfall data provided by TRMM and other microwave instruments in data assimilation to improve global analyses and diagnose state-dependent systematic errors in physical parameterizations. I will outline the current and future research strategies in preparation for the Global Precipitation Mission.
Developing empirically supported theories of change for housing investment and health
Thomson, Hilary; Thomas, Sian
2015-01-01
The assumption that improving housing conditions can lead to improved health may seem a self-evident hypothesis. Yet evidence from intervention studies suggests small or unclear health improvements, indicating that further thought is required to refine this hypothesis. Articulation of a theory can help avoid a black box approach to research and practice and has been advocated as especially valuable for those evaluating complex social interventions like housing. This paper presents a preliminary theory of housing improvement and health based on a systematic review conducted by the authors. Following extraction of health outcomes, data on all socio-economic impacts were extracted by two independent reviewers from both qualitative and quantitative studies. Health and socio-economic outcome data from the better quality studies (n = 23/34) were mapped onto a one-page logic model by two independent reviewers and a final model reflecting reviewer agreement was prepared. Where there was supporting evidence of links between outcomes these were indicated in the model. Two models of specific improvements (warmth & energy efficiency; and housing-led renewal), and a final overall model were prepared. The models provide a visual map of the best available evidence on the health and socio-economic impacts of housing improvement. The use of a logic model design helps to elucidate the possible pathways between housing improvement and health and as such might be described as an empirically based theory. Changes in housing factors were linked to changes in socio-economic determinants of health. This points to the potential for longer term health impacts which could not be detected within the lifespan of the evaluations. The developed theories are limited by the available data and need to be tested and refined. However, in addition to providing one-page summaries for evidence users, the theory may usefully inform future research on housing and health. PMID:25461878
Experimentally validated modification to Cook-Torrance BRDF model for improved accuracy
NASA Astrophysics Data System (ADS)
Butler, Samuel D.; Ethridge, James A.; Nauyoks, Stephen E.; Marciniak, Michael A.
2017-09-01
The BRDF describes optical scatter off realistic surfaces. The microfacet BRDF model assumes geometric optics but is computationally simple compared to wave optics models. In this work, MERL BRDF data are fitted both to the original Cook-Torrance microfacet model and to a modified Cook-Torrance model that uses the polarization factor in place of the mathematically problematic cross-section conversion and geometric attenuation terms. The results provide experimental evidence that this modified Cook-Torrance model leads to improved fits, particularly for large incident and scattered angles. These results are expected to lead to more accurate BRDF modeling for remote sensing.
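A minimal sketch of the classic Cook-Torrance microfacet BRDF referred to in this abstract, f = D·F·G / (4 cosθi cosθo), evaluated in-plane; the Beckmann roughness distribution, Schlick Fresnel approximation, and all parameter values are illustrative assumptions (the paper's modified model replaces the cross-section and G terms with a polarization factor, which is not reproduced here):

```python
import math

# Classic Cook-Torrance microfacet BRDF, in-plane geometry for simplicity.
# Roughness m and Fresnel normal reflectance f0 are illustrative values.
def beckmann_D(cos_h, m):
    # Beckmann microfacet distribution with roughness m.
    t2 = (1.0 - cos_h**2) / cos_h**2          # tan^2 of the half-vector angle
    return math.exp(-t2 / m**2) / (math.pi * m**2 * cos_h**4)

def schlick_F(cos_d, f0):
    # Schlick approximation to the Fresnel reflectance.
    return f0 + (1.0 - f0) * (1.0 - cos_d)**5

def geometric_G(cos_i, cos_o, cos_h, cos_d):
    # Cook-Torrance shadowing/masking term.
    return min(1.0, 2*cos_h*cos_i/cos_d, 2*cos_h*cos_o/cos_d)

def cook_torrance(theta_i, theta_o, m=0.3, f0=0.04):
    # Surface normal along y; directions on opposite azimuths (reflection plane).
    wi = (math.sin(theta_i), math.cos(theta_i))
    wo = (-math.sin(theta_o), math.cos(theta_o))
    hx, hy = wi[0] + wo[0], wi[1] + wo[1]
    norm = math.hypot(hx, hy)
    h = (hx / norm, hy / norm)                # unit half vector
    cos_h = h[1]                              # n . h
    cos_d = wi[0]*h[0] + wi[1]*h[1]           # wi . h
    cos_i, cos_o = wi[1], wo[1]
    return (beckmann_D(cos_h, m) * schlick_F(cos_d, f0) *
            geometric_G(cos_i, cos_o, cos_h, cos_d) / (4 * cos_i * cos_o))

# The BRDF peaks near the specular (mirror) configuration theta_i == theta_o.
peak = cook_torrance(0.5, 0.5)
off = cook_torrance(0.5, 1.2)
```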
NASA Astrophysics Data System (ADS)
Mao, Y.; Crow, W. T.; Nijssen, B.
2017-12-01
Soil moisture (SM) plays an important role in runoff generation, both by partitioning infiltration and surface runoff during rainfall events and by controlling the rate of subsurface flow during inter-storm periods. Therefore, more accurate SM state estimation in hydrologic models is potentially beneficial for streamflow prediction. Various previous studies have explored the potential of assimilating SM data into hydrologic models for streamflow improvement. These studies have drawn inconsistent conclusions, ranging from significantly improved runoff via SM data assimilation (DA) to limited or degraded runoff. These studies commonly treat the whole assimilation procedure as a black box without separating the contribution of each step in the procedure, making it difficult to attribute the underlying causes of runoff improvement (or the lack thereof). In this study, we decompose the overall DA process into three steps by answering the following questions (3-step framework): 1) how much can assimilation of surface SM measurements improve the surface SM state in a hydrologic model? 2) how much does surface SM improvement propagate to deeper layers? 3) how much does (surface and deeper-layer) SM improvement propagate into runoff improvement? A synthetic twin experiment is carried out in the Arkansas-Red River basin (approximately 600,000 km2), where a synthetic "truth" run, an open-loop run (without DA) and a DA run (where synthetic surface SM measurements are assimilated) are generated. All model runs are performed at 1/8 degree resolution and over a 10-year period using the Variable Infiltration Capacity (VIC) hydrologic model at a 3-hourly time step. For the DA run, the ensemble Kalman filter (EnKF) method is applied. The updated surface and deeper-layer SM states with DA are compared to the open-loop SM to quantitatively evaluate the first two steps in the framework.
To quantify the third step, a set of perfect-state runs are generated where the "true" SM states are directly inserted in the model to assess the maximum possible runoff improvement that can be achieved by improving SM states alone. Our results show that the 3-step framework is able to effectively identify the potential as well as bottleneck of runoff improvement and point out the cases where runoff improvement via assimilation of surface SM is prone to failure.
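The EnKF analysis step mentioned in the abstract can be sketched for a scalar soil-moisture state; the ensemble size, moisture values, and error variances below are synthetic illustrations, not values from the study (which applied the EnKF to the distributed VIC model states):

```python
import random

# Toy perturbed-observation EnKF analysis step for a scalar surface
# soil-moisture state. All numbers are synthetic.
def enkf_update(ensemble, obs, obs_err_var, rng):
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)   # background variance
    gain = var / (var + obs_err_var)                         # Kalman gain, identity obs operator
    # Each member assimilates an independently perturbed observation.
    return [x + gain * (obs + rng.gauss(0.0, obs_err_var ** 0.5) - x)
            for x in ensemble]

rng = random.Random(42)
prior = [rng.gauss(0.30, 0.05) for _ in range(100)]          # open-loop SM ensemble
posterior = enkf_update(prior, obs=0.20, obs_err_var=0.0004, rng=rng)
post_mean = sum(posterior) / len(posterior)
```

The analysis mean is pulled from the open-loop value toward the observation, with a weight set by the ratio of ensemble spread to observation error.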
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finley, Cathy
2014-04-30
This report contains the results from research aimed at improving short-range (0-6 hour) hub-height wind forecasts in the NOAA weather forecast models through additional data assimilation and model physics improvements for use in wind energy forecasting. Additional meteorological observing platforms including wind profilers, sodars, and surface stations were deployed for this study by NOAA and DOE, and additional meteorological data at or near wind turbine hub height were provided by South Dakota State University and WindLogics/NextEra Energy Resources over a large geographical area in the U.S. Northern Plains for assimilation into NOAA research weather forecast models. The resulting improvements in wind energy forecasts based on the research weather forecast models (with the additional data assimilation and model physics improvements) were examined in many different ways and compared with wind energy forecasts based on the current operational weather forecast models to quantify the forecast improvements important to power grid system operators and wind plant owners/operators participating in energy markets. Two operational weather forecast models (OP_RUC, OP_RAP) and two research weather forecast models (ESRL_RAP, HRRR) were used as the base wind forecasts for generating several different wind power forecasts for the NextEra Energy wind plants in the study area. Power forecasts were generated from the wind forecasts in a variety of ways, from very simple to quite sophisticated, as they might be used by a wide range of both general users and commercial wind energy forecast vendors. The error characteristics of each of these types of forecasts were examined and quantified using bulk error statistics for both the local wind plant and the system aggregate forecasts. The wind power forecast accuracy was also evaluated separately for high-impact wind energy ramp events.
The overall bulk error statistics calculated over the first six hours of the forecasts at both the individual wind plant and at the system-wide aggregate level over the one year study period showed that the research weather model-based power forecasts (all types) had lower overall error rates than the current operational weather model-based power forecasts, both at the individual wind plant level and at the system aggregate level. The bulk error statistics of the various model-based power forecasts were also calculated by season and model runtime/forecast hour as power system operations are more sensitive to wind energy forecast errors during certain times of year and certain times of day. The results showed that there were significant differences in seasonal forecast errors between the various model-based power forecasts. The results from the analysis of the various wind power forecast errors by model runtime and forecast hour showed that the forecast errors were largest during the times of day that have increased significance to power system operators (the overnight hours and the morning/evening boundary layer transition periods), but the research weather model-based power forecasts showed improvement over the operational weather model-based power forecasts at these times.
Bachis, Giulia; Maruéjouls, Thibaud; Tik, Sovanna; Amerlinck, Youri; Melcer, Henryk; Nopens, Ingmar; Lessard, Paul; Vanrolleghem, Peter A
2015-01-01
Characterization and modelling of primary settlers have been largely neglected to date. However, whole plant and resource recovery modelling requires primary settler model development, as current models lack detail in describing the dynamics and the diversity of the removal process for different particulate fractions. This paper focuses on the improved modelling and experimental characterization of primary settlers. First, a new modelling concept based on particle settling velocity distribution is proposed, which is then applied to the development of an improved primary settler model as well as to its characterization under addition of chemicals (chemically enhanced primary treatment, CEPT). This model is compared to two existing simple primary settler models (Otterpohl and Freund; Lessard and Beck), showing it to be better than the first and statistically comparable to the second, but with easier calibration thanks to the ease with which wastewater characteristics can be translated into model parameters. Second, the changes in the activated sludge model (ASM)-based chemical oxygen demand fractionation between inlet and outlet induced by primary settling are investigated, showing that typical wastewater fractions are modified by primary treatment. As they clearly impact the downstream processes, both model improvements demonstrate the need for more detailed primary settler models in view of whole plant modelling.
ERIC Educational Resources Information Center
Ilyas, Mohammed
2017-01-01
Today organizations have adopted a corporate university model to meet their training requirements, a model that adds value to the business in terms of revenue and profit, improvement in customer retention, improved employee productivity, cost reduction and retention of talented employees. This paper highlights the radical change and an evolution…
ERIC Educational Resources Information Center
Simon, Sue; Christie, Michael; Graham, Wayne A.; Call, Kairen
2014-01-01
This paper presents a new model of leadership that can improve the knowledge and skills needed by school leaders who undertake Masters of Business and Masters of Education. The model is called PIVOTAL which stands for Partnerships, Innovation and Vitality--Opportunities for Thriving Academic Leadership. The model is compared and contrasted with…
A Maturity Model: Does It Provide a Path for Online Course Design?
ERIC Educational Resources Information Center
Neuhauser, Charlotte
2004-01-01
Maturity models are successfully used by organizations attempting to improve their processes, products, and delivery. As more faculty include online course design and teaching, a maturity model of online course design may serve as a tool in planning and assessing their courses for improvement based on best practices. This article presents such a…
Improved Modeling of Intelligent Tutoring Systems Using Ant Colony Optimization
ERIC Educational Resources Information Center
Rastegarmoghadam, Mahin; Ziarati, Koorush
2017-01-01
Swarm intelligence approaches, such as ant colony optimization (ACO), are used in adaptive e-learning systems and provide an effective method for finding optimal learning paths based on self-organization. The aim of this paper is to develop an improved modeling of adaptive tutoring systems using ACO. In this model, the learning object is…
ERIC Educational Resources Information Center
Kim, Seohyun; Lu, Zhenqiu; Cohen, Allan S.
2018-01-01
Bayesian algorithms have been used successfully in the social and behavioral sciences to analyze dichotomous data particularly with complex structural equation models. In this study, we investigate the use of the Polya-Gamma data augmentation method with Gibbs sampling to improve estimation of structural equation models with dichotomous variables.…
Achieving World-Class Schools: Mastering School Improvement Using a Genetic Model.
ERIC Educational Resources Information Center
Kimmelman, Paul L.; Kroeze, David J.
In providing its program for education reform, this book uses, as an analogy, the genetic model taken from the Human Genome project. In the first part, "Theoretical Underpinnings," the book explains why a genetic model can be used to improve school systems; describes the critical components of a world-class school system; and details the…
ERIC Educational Resources Information Center
McKinnis, David R.; Sloan, Mary Anne; Snow, L. David; Garimella, Suresh V.
2014-01-01
The Purdue Technical Assistance Program (TAP) offers a model of university engagement and service that is achieving technology adoption and performance improvement impacts in healthcare, manufacturing, government, and other sectors. The TAP model focuses on understanding and meeting the changing and challenging needs of those served, always…
ERIC Educational Resources Information Center
Derlina; Sabani; Mihardi, Satria
2015-01-01
Education research in Indonesia has begun to lead to the development of character education and is no longer fixated on the outcomes of cognitive learning. This study aimed to produce a character education based general physics learning model (CEBGP Learning Model), together with valid, effective and practical peripheral devices, to improve character…
Improved brain tumor segmentation by utilizing tumor growth model in longitudinal brain MRI
NASA Astrophysics Data System (ADS)
Pei, Linmin; Reza, Syed M. S.; Li, Wei; Davatzikos, Christos; Iftekharuddin, Khan M.
2017-03-01
In this work, we propose a novel method to improve texture-based tumor segmentation by fusing cell density patterns that are generated from tumor growth modeling. To model tumor growth, we solve the reaction-diffusion equation by using the Lattice-Boltzmann method (LBM). Computational tumor growth modeling obtains the cell density distribution that potentially indicates the predicted tissue locations in the brain over time. The density patterns are then considered as novel features, along with other texture features (such as fractal and multifractal Brownian motion (mBm)) and intensity features in MRI, for improved brain tumor segmentation. We evaluate the proposed method with about one hundred longitudinal MRI scans from five patients obtained from the public BRATS 2015 data set, validated by the ground truth. The result shows significant improvement of complete tumor segmentation using ANOVA analysis for five patients in longitudinal MR images.
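The reaction-diffusion tumor-growth equation the abstract refers to, ∂c/∂t = D∇²c + ρc(1 − c), can be sketched in one dimension; the paper solves it with a Lattice-Boltzmann method, whereas this illustration uses a plain explicit finite-difference scheme with invented D, ρ, grid, and time-step values:

```python
# 1-D explicit finite-difference sketch of the logistic reaction-diffusion
# equation dc/dt = D * d2c/dx2 + rho * c * (1 - c), where c is normalized
# cell density. D, rho, dx, dt, and the domain are illustrative.
def grow(c, D=0.1, rho=0.5, dx=1.0, dt=0.1, steps=200):
    c = list(c)
    for _ in range(steps):
        # Discrete Laplacian with zero-flux (clamped) boundaries.
        lap = [c[max(i - 1, 0)] - 2 * c[i] + c[min(i + 1, len(c) - 1)]
               for i in range(len(c))]
        c = [ci + dt * (D * li / dx**2 + rho * ci * (1 - ci))
             for ci, li in zip(c, lap)]
    return c

# Seed a small cell density in the middle of the domain and let it
# proliferate (logistic term) and infiltrate outward (diffusion term).
c0 = [0.0] * 51
c0[25] = 0.1
c = grow(c0)
```

The resulting density profile over time is the kind of spatial pattern the abstract fuses with texture and intensity features for segmentation.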
An improved car-following model considering headway changes with memory
NASA Astrophysics Data System (ADS)
Yu, Shaowei; Shi, Zhongke
2015-03-01
To describe car-following behaviors in complex situations better, increase roadway traffic mobility and minimize cars' fuel consumption, the linkage between headway changes with memory and car-following behaviors was explored with the field car-following data by using the gray correlation analysis method, and then an improved car-following model considering headway changes with memory on a single lane was proposed based on the full velocity difference model. Some numerical simulations were carried out by employing the improved car-following model to explore how headway changes with memory affected each car's velocity, acceleration, headway and fuel consumption. The research results show that headway changes with memory have significant effects on car-following behaviors and fuel consumption, and that considering headway changes with memory in designing the adaptive cruise control strategy can improve traffic flow stability and minimize cars' fuel consumption.
NASA Astrophysics Data System (ADS)
Bangga, Galih; Kusumadewi, Tri; Hutomo, Go; Sabila, Ahmad; Syawitri, Taurista; Setiadi, Herlambang; Faisal, Muhamad; Wiranegara, Raditya; Hendranata, Yongki; Lastomo, Dwi; Putra, Louis; Kristiadi, Stefanus
2018-03-01
Numerical simulations for relatively thick airfoils are carried out in the present studies. An attempt to improve the accuracy of the numerical predictions is done by adjusting the turbulent viscosity of the eddy-viscosity Menter Shear-Stress-Transport (SST) model. The modification involves the addition of a damping factor on the wall-bounded flows incorporating the ratio of the turbulent kinetic energy to its specific dissipation rate for separation detection. The results are compared with available experimental data and CFD simulations using the original Menter SST model. The present model improves the lift polar prediction even though the stall angle is still overestimated. The improvement is caused by the better prediction of separated flow under a strong adverse pressure gradient. The results show that the Reynolds stresses are damped near the wall causing variation of the logarithmic velocity profiles.
DOT2: Macromolecular Docking With Improved Biophysical Models
Roberts, Victoria A.; Thompson, Elaine E.; Pique, Michael E.; Perez, Martin S.; Eyck, Lynn Ten
2015-01-01
Computational docking is a useful tool for predicting macromolecular complexes, which are often difficult to determine experimentally. Here we present the DOT2 software suite, an updated version of the DOT intermolecular docking program. DOT2 provides straightforward, automated construction of improved biophysical models based on molecular coordinates, offering checkpoints that guide the user to include critical features. DOT has been updated to run more quickly, allow flexibility in grid size and spacing, and generate a complete list of favorable candidate configurations. Output can be filtered by experimental data and rescored by the sum of electrostatic and atomic desolvation energies. We show that this rescoring method improves the ranking of correct complexes for a wide range of macromolecular interactions, and demonstrate that biologically relevant models are essential for biologically relevant results. The flexibility and versatility of DOT2 accommodate realistic models of complex biological systems, improving the likelihood of a successful docking outcome. PMID:23695987
Demonstrating the improvement of predictive maturity of a computational model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hemez, Francois M; Unal, Cetin; Atamturktur, Huriye S
2010-01-01
We demonstrate an improvement of predictive capability brought to a non-linear material model using a combination of test data, sensitivity analysis, uncertainty quantification, and calibration. A model that captures increasingly complicated phenomena, such as plasticity, temperature and strain rate effects, is analyzed. Predictive maturity is defined, here, as the accuracy of the model to predict multiple Hopkinson bar experiments. A statistical discrepancy quantifies the systematic disagreement (bias) between measurements and predictions. Our hypothesis is that improving the predictive capability of a model should translate into better agreement between measurements and predictions. This agreement, in turn, should lead to a smaller discrepancy. We have recently proposed to use discrepancy and coverage, that is, the extent to which the physical experiments used for calibration populate the regime of applicability of the model, as basis to define a Predictive Maturity Index (PMI). It was shown that predictive maturity could be improved when additional physical tests are made available to increase coverage of the regime of applicability. This contribution illustrates how the PMI changes as 'better' physics are implemented in the model. The application is the non-linear Preston-Tonks-Wallace (PTW) strength model applied to Beryllium metal. We demonstrate that our framework tracks the evolution of maturity of the PTW model. Robustness of the PMI with respect to the selection of coefficients needed in its definition is also studied.
Multiobjective constraints for climate model parameter choices: Pragmatic Pareto fronts in CESM1
NASA Astrophysics Data System (ADS)
Langenbrunner, B.; Neelin, J. D.
2017-09-01
Global climate models (GCMs) are examples of high-dimensional input-output systems, where model output is a function of many variables, and an update in model physics commonly improves performance in one objective function (i.e., measure of model performance) at the expense of degrading another. Here concepts from multiobjective optimization in the engineering literature are used to investigate parameter sensitivity and optimization in the face of such trade-offs. A metamodeling technique called cut high-dimensional model representation (cut-HDMR) is leveraged in the context of multiobjective optimization to improve GCM simulation of the tropical Pacific climate, focusing on seasonal precipitation, column water vapor, and skin temperature. An evolutionary algorithm is used to solve for Pareto fronts, which are surfaces in objective function space along which trade-offs in GCM performance occur. This approach allows the modeler to visualize trade-offs quickly and identify the physics at play. In some cases, Pareto fronts are small, implying that trade-offs are minimal, optimal parameter value choices are more straightforward, and the GCM is well-functioning. In all cases considered here, the control run was found not to be Pareto-optimal (i.e., not on the front), highlighting an opportunity for model improvement through objectively informed parameter selection. Taylor diagrams illustrate that these improvements occur primarily in field magnitude, not spatial correlation, and they show that specific parameter updates can improve fields fundamental to tropical moist processes—namely precipitation and skin temperature—without significantly impacting others. These results provide an example of how basic elements of multiobjective optimization can facilitate pragmatic GCM tuning processes.
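Pareto optimality as used in this abstract can be illustrated with a minimal dominance check and front extraction; the candidate points below are hypothetical objective-function values (lower is better), not model output, and the study itself used an evolutionary algorithm rather than this brute-force filter:

```python
# Minimal Pareto-front extraction over candidate parameter settings, each
# represented by a tuple of objective-function values (lower is better).
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # A point is on the front if no other point dominates it.
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical trade-off between two error measures (e.g., precipitation
# error vs. skin-temperature error).
candidates = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0), (3.0, 4.0), (5.0, 5.0)]
front = pareto_front(candidates)
```

Points on the front embody exactly the trade-offs the abstract describes: improving one objective from a front point necessarily degrades another. A control run lying off the front, as found in the study, can be improved in some objective at no cost elsewhere.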
Hvitfeldt-Forsberg, Helena; Mazzocato, Pamela; Glaser, Daniel; Keller, Christina; Unbeck, Maria
2017-06-06
To explore healthcare staff's and managers' perceptions of how and when discrete event simulation modelling can be used as a decision support in improvement efforts. Two focus group discussions were performed. Two settings were included: a rheumatology department and an orthopaedic section, both situated in Sweden. Healthcare staff and managers (n=13) from the two settings. Two workshops were performed, one at each setting. Workshops were initiated by a short introduction to simulation modelling. Results from the respective simulation model were then presented and discussed in the following focus group discussion. Categories from the content analysis are presented according to the following research questions: how and when can simulation modelling assist healthcare improvement? Regarding how, the participants mentioned that simulation modelling could act as a tool for support and a way to visualise problems, potential solutions and their effects. Regarding when, simulation modelling could be used both locally and by management, as well as a pedagogical tool to develop and test innovative ideas and to involve everyone in the improvement work. Its potential as an information and communication tool and as an instrument for pedagogic work within healthcare improvement renders a broader application and value of simulation modelling than previously reported. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Improving Visibility of Stereo-Radiographic Spine Reconstruction with Geometric Inferences.
Kumar, Sampath; Nayak, K Prabhakar; Hareesha, K S
2016-04-01
Complex deformities of the spine, like scoliosis, are evaluated more precisely using stereo-radiographic 3D reconstruction techniques. Primarily, these use six stereo-corresponding points available on the vertebral body for the 3D reconstruction of each vertebra. The wireframe structure obtained in this process has poor visibility and is hence difficult to use for diagnosis. In this paper, a novel method is proposed to improve the visibility of this wireframe structure by deforming a generic spine model in accordance with the 3D-reconstructed corresponding points. Geometric inferences, such as vertebral orientations, are then automatically extracted from the radiographs to improve the visibility of the 3D model. Biplanar radiographs were acquired from five scoliotic subjects on a specifically designed calibration bench. The stereo-corresponding point reconstruction method is used to build six-point wireframe vertebral structures and thus the entire spine model. Using the 3D spine midline and automatically extracted vertebral orientation features, a more realistic 3D spine model is generated. To validate the method, the 3D spine model is back-projected onto the biplanar radiographs and the error difference is computed. Though this difference is within the error limits reported in the literature, the proposed work is simple and economical. The proposed method does not require additional corresponding points or image features to improve the visibility of the model, and hence reduces computational complexity. Expensive 3D digitizers and vertebral CT scan models are also excluded from this study. Thus, the visibility of stereo-corresponding point reconstruction is improved to obtain a low-cost spine model for a better diagnosis of spinal deformities.
Re-engineering pre-employment check-up systems: a model for improving health services.
Rateb, Said Abdel Hakim; El Nouman, Azza Abdel Razek; Rateb, Moshira Abdel Hakim; Asar, Mohamed Naguib; El Amin, Ayman Mohammed; Gad, Saad abdel Aziz; Mohamed, Mohamed Salah Eldin
2011-01-01
The purpose of this paper is to develop a model for improving health services provided by the pre-employment medical fitness check-up system affiliated to Egypt's Health Insurance Organization (HIO). Operations research, notably system re-engineering, is used in six randomly selected centers, and findings before and after re-engineering are compared. The re-engineering model follows a systems approach, focusing on three areas: structure, process and outcome. The model is based on six main components: electronic booking, standardized check-up processes, protected medical documents, advanced archiving through an electronic content management (ECM) system, infrastructure development, and capacity building. The model originates mainly from customer needs and expectations. The centers' monthly customer flow increased significantly after re-engineering. The mean time spent per customer cycle improved after re-engineering: 18.3 +/- 5.5 minutes as compared to 48.8 +/- 14.5 minutes before. Appointment delay was also significantly decreased, from an average of 18 days to 6.2 days. Both beneficiaries and service providers were significantly more satisfied with the services after re-engineering. The model proves that re-engineering program costs are exceeded by the increased revenue. Re-engineering in this study involved multiple structure and process elements. The literature review did not reveal similar re-engineering healthcare packages, so each element was compared separately. This model is highly recommended for improving service effectiveness and efficiency. This research is the first in Egypt to apply the re-engineering approach to public health systems. Developing user-friendly models for service improvement is an added value.
NASA Astrophysics Data System (ADS)
Avianti, R.; Suyatno; Sugiarto, B.
2018-04-01
This study aims to create appropriate learning material based on the CORE (Connecting, Organizing, Reflecting, Extending) model to improve students' learning achievement on the topic of chemical bonding. The study used the 4-D model as its research design and a one-group pretest-posttest design for the material try-out. The subject of the study was teaching material based on the CORE model, tried out on 30 students of a grade-10 science class. Data were collected through validation, observation, tests, and questionnaires. The findings were that: (1) all the contents were valid, and (2) the practicality and the effectiveness of all the contents were good. The conclusion of this research is that the CORE model is appropriate for improving students' learning outcomes in studying chemical bonding.
NASA Astrophysics Data System (ADS)
Sun, Xiao; Chai, Guobei; Liu, Wei; Bao, Wenzhuo; Zhao, Xiaoning; Ming, Delie
2018-02-01
Simple cells in the primary visual cortex are believed to extract local edge information from a visual scene. In this paper, inspired by the different receptive field properties and visual information flow paths of neurons, an improved Combination of Receptive Fields (CORF) model combined with non-classical receptive fields was proposed to simulate the responses of simple cells' receptive fields. Compared to the classical model, the proposed model better imitates a simple cell's physiological structure by taking into account the facilitation and suppression of non-classical receptive fields. On this basis, an edge detection algorithm was proposed as an application of the improved CORF model. Experimental results validate the robustness of the proposed algorithm to noise and background interference.
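The combination of aligned difference-of-Gaussians (DoG) sub-unit responses that defines a CORF-type operator, and the surround suppression contributed by a non-classical receptive field, can be sketched as follows. This is a schematic illustration under assumed forms (a weighted geometric mean and linear surround subtraction), not the paper's exact model; all names and weights are hypothetical.

```python
import numpy as np

def corf_output(dog_responses, weights):
    """Combine aligned DoG sub-unit response maps with a weighted
    geometric mean: the operator fires only where all sub-units fire,
    mimicking a simple cell's orientation-selective receptive field."""
    r = np.ones_like(dog_responses[0])
    for resp, w in zip(dog_responses, weights):
        r *= np.maximum(resp, 0.0) ** w
    return r ** (1.0 / sum(weights))

def surround_suppression(response, surround, lam=0.5):
    """Non-classical receptive field term: subtracting a weighted
    surround response and rectifying suppresses texture and background
    edges while preserving isolated contours."""
    return np.maximum(response - lam * surround, 0.0)

# A pixel where any one sub-unit is silent yields no edge response.
a = np.array([[1.0, 0.0]])
b = np.array([[1.0, 1.0]])
print(corf_output([a, b], [1.0, 1.0]))  # [[1. 0.]]
```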
Fuel thermal conductivity (FTHCON). Status report. [PWR; BWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagrman, D. L.
1979-02-01
An improvement of the fuel thermal conductivity subcode is described which is part of the fuel rod behavior modeling task performed at EG and G Idaho, Inc. The original version was published in the Materials Properties (MATPRO) Handbook, Section A-2 (Fuel Thermal Conductivity). The improved version incorporates data which were not included in the previous work and omits some previously used data which are believed to come from cracked specimens. The models for the effect of porosity on thermal conductivity and for the electronic contribution to thermal conductivity have been completely revised in order to place these models on a more mechanistic basis. As a result of the modeling improvements, the standard error of the model with respect to its data base has been significantly reduced.
Murine models of osteosarcoma: A piece of the translational puzzle.
Walia, Mannu K; Castillo-Tandazo, Wilson; Mutsaers, Anthony J; Martin, Thomas John; Walkley, Carl R
2018-06-01
Osteosarcoma (OS) is the most common cancer of bone in children and young adults. Despite extensive research efforts, there has been no significant improvement in patient outcome for many years. An improved understanding of the biology of this cancer and how genes frequently mutated contribute to OS may help improve outcomes for patients. While our knowledge of the mutational burden of OS is approaching saturation, our understanding of how these mutations contribute to OS initiation and maintenance is less clear. Murine models of OS have now been demonstrated to be highly valid recapitulations of human OS. These models were originally based on the frequent disruption of p53 and Rb in familial OS syndromes, which are also common mutations in sporadic OS. They have been applied to significantly improve our understanding about the functions of recurrently mutated genes in disease. The murine models can be used as a platform for preclinical testing and identifying new therapeutic targets, in addition to testing the role of additional mutations in vivo. Most recently these models have begun to be used for discovery based approaches and screens, which hold significant promise in furthering our understanding of the genetic and therapeutic sensitivities of OS. In this review, we discuss the mouse models of OS that have been reported in the last 3-5 years and newly identified pathways from these studies. Finally, we discuss the preclinical utilization of the mouse models of OS for identifying and validating actionable targets to improve patient outcome. © 2017 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Yeo, I. Y.; Lang, M.; Lee, S.; Huang, C.; Jin, H.; McCarty, G.; Sadeghi, A.
2017-12-01
The wetland ecosystem plays crucial roles in improving hydrological function and ecological integrity for downstream waters and the surrounding landscape. However, the changing behaviour and functioning of wetland ecosystems are poorly understood and extremely difficult to characterize. Improved understanding of the hydrological behaviour of wetlands, considering their interaction with surrounding landscapes and impacts on downstream waters, is an essential first step toward closing the knowledge gap. We present an integrated wetland-catchment modelling study that capitalizes on recently developed inundation maps and other geospatial data. The aim of the data-model integration is to improve spatial prediction of wetland inundation and to evaluate cumulative hydrological benefits at the catchment scale. In this paper, we highlight problems arising from data preparation, parameterization, and process representation in simulating wetlands within a distributed catchment model, and report recent progress on mapping of wetland dynamics (i.e., inundation) using multiple remotely sensed data sets. We demonstrate the value of spatially explicit inundation information for developing site-specific wetland parameters and for evaluating model prediction at multiple spatial and temporal scales. This spatially integrated data-model framework is tested using the Soil and Water Assessment Tool (SWAT) with an improved wetland extension, and applied to an agricultural watershed in the Mid-Atlantic Coastal Plain, USA. The study illustrates the necessity of spatially distributed information and a data-integrated modelling approach for predicting wetland inundation and hydrologic function at the local landscape scale, where monitoring and conservation decision making take place.
Nurse-directed care model in a psychiatric hospital: a model for clinical accountability.
E-Morris, Marlene; Caldwell, Barbara; Mencher, Kathleen J; Grogan, Kimberly; Judge-Gorny, Margaret; Patterson, Zelda; Christopher, Terrian; Smith, Russell C; McQuaide, Teresa
2010-01-01
The focus on recovery for persons with severe and persistent mental illness is leading state psychiatric hospitals to transform their method of care delivery. This article describes a quality improvement project involving a hospital's administration and multidisciplinary state-university affiliation that collaborated in the development and implementation of a nursing care delivery model in a state psychiatric hospital. The quality improvement project team instituted a new model to promote the hospital's vision of wellness and recovery through utilization of the therapeutic relationship and greater clinical accountability. Implementation of the model was accomplished in 2 phases: first, the establishment of a structure to lay the groundwork for accountability and, second, the development of a mechanism to provide a clinical supervision process for staff in their work with clients. Effectiveness of the model was assessed by surveys conducted at baseline and after implementation. Results indicated improvement in clinical practices and client living environment. As a secondary outcome, these improvements appeared to be associated with increased safety on the units evidenced by reduction in incidents of seclusion and restraint. Restructuring of the service delivery system of care so that clients are the center of clinical focus improves safety and can enhance the staff's attention to work with clients on their recovery. The role of the advanced practice nurse can influence the recovery of clients in state psychiatric hospitals. Future research should consider the impact on clients and their perceptions of the new service models.
Development of Novel PEM Membrane and Multiphase CD Modeling of PEM Fuel Cell
DOE Office of Scientific and Technical Information (OSTI.GOV)
K. J. Berry; Susanta Das
2009-12-30
To better understand heat and water management phenomena within an operational proton exchange membrane fuel cell's (PEMFC) conditions, a three-dimensional, two-phase computational fluid dynamic (CFD) flow model has been developed and simulated for a complete PEMFC. Both liquid and gas phases are considered in the model by taking into account gas flow, diffusion, charge transfer, change of phase, electro-osmosis, and electrochemical reactions to understand the overall dynamic behavior of species within an operating PEMFC. The CFD model is solved numerically under different parametric conditions in terms of water management issues in order to improve cell performance. The results obtained from the two-phase CFD flow model simulations show improvement in cell performance as well as water management under PEMFC operational conditions as compared to the results of a single-phase flow model available in the literature. The quantitative information obtained from the two-phase model simulations helped to develop a CFD control algorithm for low temperature PEM fuel cell stacks, which opens up a route to designing improvements in PEMFCs for better operational efficiency and performance.
NASA Technical Reports Server (NTRS)
Arnold, Nathan; Barahona, Donifan; Achuthavarier, Deepthi
2017-01-01
Weather and climate models have long struggled to realistically simulate the Madden-Julian Oscillation (MJO). Here we present a significant improvement in MJO simulation in NASA's GEOS atmospheric model with the implementation of 2-moment microphysics and the UW shallow cumulus parameterization. Comparing ten-year runs (2007-2016) with the old (1mom) and updated (2mom+shlw) model physics, the updated model has increased intra-seasonal variance with increased coherence. Surface fluxes and OLR are found to vary more realistically with precipitation, and a moisture budget suggests that changes in rain reevaporation and the cloud longwave feedback help support heavy precipitation. Preliminary results also show improved MJO hindcast skill.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yang; Leung, L. Ruby; Fan, Jiwen
This is a collaborative project among North Carolina State University, Pacific Northwest National Laboratory, and Scripps Institution of Oceanography, University of California at San Diego to address the critical need for an accurate representation of aerosol indirect effect in climate and Earth system models. In this project, we propose to develop and improve parameterizations of aerosol-cloud-precipitation feedbacks in climate models and apply them to study the effect of aerosols and clouds on radiation and hydrologic cycle. Our overall objective is to develop, improve, and evaluate parameterizations to enable more accurate simulations of these feedbacks in high resolution regional and global climate models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-09-01
This document presents a modeling and control study of the Fluid Bed Gasification (FBG) unit at the Morgantown Energy Technology Center (METC). The work is performed under contract no. DE-FG21-94MC31384. The purpose of this study is to generate a simple FBG model from process data, and then use the model to suggest an improved control scheme which will improve operation of the gasifier. The work first develops a simple linear model of the gasifier, then suggests an improved gasifier pressure and MGCR control configuration, and finally suggests the use of a multivariable control strategy for the gasifier.
An improved task-role-based access control model for G-CSCW applications
NASA Astrophysics Data System (ADS)
He, Chaoying; Chen, Jun; Jiang, Jie; Han, Gang
2005-10-01
Access control is an important and popular security mechanism for multi-user applications. GIS-based Computer Supported Cooperative Work (G-CSCW) application is one of such applications. This paper presents an improved Task-Role-Based Access Control (X-TRBAC) model for G-CSCW applications. The new model inherits the basic concepts of the old ones, such as role and task. Moreover, it has introduced two concepts, i.e. object hierarchy and operation hierarchy, and the corresponding rules to improve the efficiency of permission definition in access control models. The experiments show that the method can simplify the definition of permissions, and it is more applicable for G-CSCW applications.
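One way to read the object- and operation-hierarchy idea in the X-TRBAC abstract above is that a permission granted at a high level of each hierarchy implicitly covers all descendants, so far fewer permission triples need to be defined. The sketch below is a guess at that mechanism from the abstract alone; the class names, roles, and GIS-flavoured objects are all hypothetical.

```python
class Hierarchy:
    """Child-to-parent links; a node's ancestors include itself."""
    def __init__(self, parent_of):
        self.parent_of = parent_of

    def ancestors(self, node):
        while node is not None:
            yield node
            node = self.parent_of.get(node)

class XTrbac:
    """A grant to a role at a high level of the object and operation
    hierarchies covers all descendant objects and operations."""
    def __init__(self, object_hierarchy, operation_hierarchy):
        self.obj_h = object_hierarchy
        self.op_h = operation_hierarchy
        self.grants = set()  # (role, object, operation) triples

    def grant(self, role, obj, op):
        self.grants.add((role, obj, op))

    def allowed(self, role, obj, op):
        # Permitted if any ancestor pair of (obj, op) is granted.
        return any((role, o, p) in self.grants
                   for o in self.obj_h.ancestors(obj)
                   for p in self.op_h.ancestors(op))

# Granting "edit" on the whole map covers every layer and edit subtype.
obj_h = Hierarchy({"roads_layer": "map", "rivers_layer": "map"})
op_h = Hierarchy({"move_vertex": "edit", "delete_feature": "edit"})
ac = XTrbac(obj_h, op_h)
ac.grant("editor", "map", "edit")
print(ac.allowed("editor", "roads_layer", "move_vertex"))  # True
print(ac.allowed("viewer", "roads_layer", "move_vertex"))  # False
```

One triple here replaces the six layer-by-operation triples that a flat permission table would need, which is the efficiency gain the abstract reports.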
Astashkina, Anna; Grainger, David W
2014-04-01
Drug failure due to toxicity indicators remains among the primary reasons for staggering drug attrition rates during clinical studies and post-marketing surveillance. Broader validation and use of improved next-generation 3-D cell culture models are expected to improve the predictive power and effectiveness of drug toxicological predictions. However, after decades of promising research, significant gaps remain in our collective ability to extract quality human toxicity information from in vitro data using 3-D cell and tissue models. Issues, challenges and future directions for the field to improve drug assay predictive power and the reliability of 3-D models are reviewed. Copyright © 2014 Elsevier B.V. All rights reserved.
WAM: an improved algorithm for modelling antibodies on the WEB.
Whitelegg, N R; Rees, A R
2000-12-01
An improved antibody modelling algorithm has been developed which incorporates significant improvements to the earlier versions developed by Martin et al. (1989, 1991), Pedersen et al. (1992) and Rees et al. (1996) and known as AbM (Oxford Molecular). The new algorithm, WAM (for Web Antibody Modelling), has been launched as an online modelling service and is located at URL http://antibody.bath.ac.uk. Here we provide a summary only of the important features of WAM. Readers interested in further details are directed to the website, which gives extensive background information on the methods employed. A brief description of the rationale behind some of the newer methodology (specifically, the knowledge-based screens) is also given.
NASA Astrophysics Data System (ADS)
Khouider, B.; Goswami, B. B.; Majda, A.; Krishna, R. P. M. M.; Mukhopadhyay, P.
2016-12-01
Improvements in the capability of climate models to realistically capture the synoptic and intra-seasonal variability associated with tropical rainfall are conditioned by improvements in the representation of the subgrid variability due to organized convection and the underlying two-way interactions across multiple scales, and thus by breaking with the quasi-equilibrium bottleneck. By design, the stochastic multi-cloud model (SMCM) mimics the life cycle of organized tropical convective systems and the interactions of the associated cloud types with each other and with the large scales, as observed. It is based on a lattice particle interaction model for predefined microscopic (subgrid) sites that make random transitions from one cloud type to another conditional on the large-scale state. In return, the SMCM provides the cloud-type area fractions in the form of a Markov chain model which can be run in parallel with the climate model without any significant computational overhead. The SMCM was previously successfully tested in both reduced-complexity tropical models and an aquaplanet global atmospheric model. Here, we report for the first time the results of its implementation in the fully coupled NCEP climate model (CFSv2) through the use of prescribed vertical profiles of heating and drying obtained from observations. While many known biases in CFSv2 are slightly improved, there is no noticeable degradation in the simulated mean climatology. Nonetheless, comparison with observations shows that the improvements in terms of synoptic and intra-seasonal variability are spectacular, despite the fact that CFSv2 is one of the best models in this regard. In particular, while CFSv2 exaggerates the intra-seasonal variance at the expense of the synoptic contribution, the CFS-SMCM shows a good balance between the two, as in the observations.
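The lattice-with-area-fractions bookkeeping that the SMCM hands back to the climate model can be illustrated with a toy version: each site makes random transitions between cloud types with probabilities that, in the real model, would be conditioned on the large-scale state. The transition rates and cloud-type set below are placeholders, not values from the SMCM.

```python
import random

# Microscopic lattice-site cloud types; 0 denotes clear sky.
CLEAR, CONGESTUS, DEEP, STRATIFORM = range(4)

def step_site(state, rates, rng):
    """One stochastic transition of a single site. In the SMCM the
    probabilities in rates[state] would depend on the large-scale
    state; here they are fixed placeholder values."""
    u, acc = rng.random(), 0.0
    for new_state, prob in rates[state].items():
        acc += prob
        if u < acc:
            return new_state
    return state  # no transition this step

def area_fractions(sites):
    """Coarse-grain the lattice into the cloud-type area fractions
    that are handed back to the climate model."""
    return [sites.count(c) / len(sites)
            for c in (CLEAR, CONGESTUS, DEEP, STRATIFORM)]

rng = random.Random(42)
rates = {CLEAR: {CONGESTUS: 0.3}, CONGESTUS: {DEEP: 0.4, CLEAR: 0.1},
         DEEP: {STRATIFORM: 0.5}, STRATIFORM: {CLEAR: 0.5}}
sites = [CLEAR] * 100
for _ in range(20):
    sites = [step_site(s, rates, rng) for s in sites]
print(area_fractions(sites))
```

Because only the four area fractions reach the host model, the lattice can be run site-by-site in parallel with negligible overhead, as the abstract notes.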
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johns, Jesse M.; Burkes, Douglas
In this work, a multilayered perceptron (MLP) network is used to develop predictive isothermal time-temperature-transformation (TTT) models covering a range of U-Mo binary and ternary alloys. The selected ternary alloys for model development are U-Mo-Ru, U-Mo-Nb, U-Mo-Zr, U-Mo-Cr, and U-Mo-Re. These models' ability to predict 'novel' U-Mo alloys is demonstrated quite well despite the discrepancies between literature sources for similar alloys, which likely arise from different thermal-mechanical processing conditions. These models are developed with the primary purpose of informing experimental decisions. Additional experimental insight is necessary in order to reduce the number of experiments required to isolate ideal alloys. These models allow test planners to evaluate areas of experimental interest; once initial tests are conducted, the model can be updated to further improve follow-on testing decisions. The model also improves analysis capabilities by reducing the number of data points necessary from any particular test. For example, if one or two isotherms are measured during a test, the model can construct the rest of the TTT curve over a wide range of temperature and time. This modeling capability reduces the cost of experiments while also improving the value of the results from the tests. The reduced costs could result in improved material characterization and therefore improved fundamental understanding of TTT dynamics. As additional understanding of the phenomena driving TTTs is acquired, this type of MLP model can be used to populate unknowns (such as material impurity and other thermal-mechanical properties) from past literature sources.
Construction of mathematical model for measuring material concentration by colorimetric method
NASA Astrophysics Data System (ADS)
Liu, Bing; Gao, Lingceng; Yu, Kairong; Tan, Xianghua
2018-06-01
This paper uses multiple linear regression to analyse the data of Problem C of the 2017 mathematical modeling contest. First, regression models were established for the concentrations of five substances, but only the regression model for the urea concentration in milk passed the significance test. The regression model established from the second data set passed the significance test but exhibited serious multicollinearity. We improved the model by principal component analysis. The improved model is used to control the system so that it is possible to measure the concentration of material by the direct colorimetric method.
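Principal component analysis removes multicollinearity by regressing on a few leading components of the predictors instead of the raw, nearly dependent columns. The sketch below shows this principal-component-regression idea on synthetic data; it is a generic illustration, not the paper's model or data.

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: project the predictors onto
    their leading principal components before the least-squares fit,
    removing the near-linear dependence behind multicollinearity."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    # Rows of Vt are principal directions, ordered by variance explained.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components].T
    scores = Xc @ components
    coef, *_ = np.linalg.lstsq(scores, y - y_mean, rcond=None)
    return components, coef, x_mean, y_mean

def pcr_predict(X, model):
    components, coef, x_mean, y_mean = model
    return (X - x_mean) @ components @ coef + y_mean

# Two nearly collinear predictors carrying one shared signal: a single
# component is enough, where ordinary least squares would be unstable.
rng = np.random.default_rng(0)
t = rng.normal(size=(100, 1))
X = np.hstack([t, t + 1e-3 * rng.normal(size=(100, 1))])
y = (2.0 * t).ravel()
model = pcr_fit(X, y, n_components=1)
```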
Hu, Chuanpu; Randazzo, Bruce; Sharma, Amarnath; Zhou, Honghui
2017-10-01
Exposure-response modeling plays an important role in optimizing dose and dosing regimens during clinical drug development. The modeling of multiple endpoints is made possible in part by recent progress in latent variable indirect response (IDR) modeling for ordered categorical endpoints. This manuscript aims to investigate the level of improvement achievable by jointly modeling two such endpoints in the latent variable IDR modeling framework through the sharing of model parameters. This is illustrated with an application to the exposure-response of guselkumab, a human IgG1 monoclonal antibody in clinical development that blocks IL-23. A Phase 2b study was conducted in 238 patients with psoriasis for which disease severity was assessed using Psoriasis Area and Severity Index (PASI) and Physician's Global Assessment (PGA) scores. A latent variable Type I IDR model was developed to evaluate the therapeutic effect of guselkumab dosing on 75, 90 and 100% improvement of PASI scores from baseline and PGA scores, with placebo effect empirically modeled. The results showed that the joint model is able to describe the observed data better with fewer parameters compared with the common approach of separately modeling the endpoints.
The Agricultural Model Intercomparison and Improvement Project (AgMIP): Protocols and Pilot Studies
NASA Technical Reports Server (NTRS)
Rosenzweig, C.; Jones, J. W.; Hatfield, J. L.; Ruane, A. C.; Boote, K. J.; Thorburn, P.; Antle, J. M.; Nelson, G. C.; Porter, C.; Janssen, S.;
2012-01-01
The Agricultural Model Intercomparison and Improvement Project (AgMIP) is a major international effort linking the climate, crop, and economic modeling communities with cutting-edge information technology to produce improved crop and economic models and the next generation of climate impact projections for the agricultural sector. The goals of AgMIP are to improve substantially the characterization of world food security due to climate change and to enhance adaptation capacity in both developing and developed countries. Analyses of the agricultural impacts of climate variability and change require a transdisciplinary effort to consistently link state-of-the-art climate scenarios to crop and economic models. Crop model outputs are aggregated as inputs to regional and global economic models to determine regional vulnerabilities, changes in comparative advantage, price effects, and potential adaptation strategies in the agricultural sector. Climate, Crop Modeling, Economics, and Information Technology Team Protocols are presented to guide coordinated climate, crop modeling, economics, and information technology research activities around the world, along with AgMIP Cross-Cutting Themes that address uncertainty, aggregation and scaling, and the development of Representative Agricultural Pathways (RAPs) to enable testing of climate change adaptations in the context of other regional and global trends. The organization of research activities by geographic region and specific crops is described, along with project milestones. Pilot results demonstrate AgMIP's role in assessing climate impacts with explicit representation of uncertainties in climate scenarios and simulations using crop and economic models. 
An intercomparison of wheat model simulations near Obregón, Mexico reveals inter-model differences in yield sensitivity to [CO2] with model uncertainty holding approximately steady as concentrations rise, while uncertainty related to choice of crop model increases with rising temperatures. Wheat model simulations with midcentury climate scenarios project a slight decline in absolute yields that is more sensitive to selection of crop model than to global climate model, emissions scenario, or climate scenario downscaling method. A comparison of regional and national-scale economic simulations finds a large sensitivity of projected yield changes to the simulations' resolved scales. Finally, a global economic model intercomparison example demonstrates that improvements in the understanding of agriculture futures arise from integration of the range of uncertainty in crop, climate, and economic modeling results in multi-model assessments.
Model improvements and validation of TerraSAR-X precise orbit determination
NASA Astrophysics Data System (ADS)
Hackel, S.; Montenbruck, O.; Steigenberger, P.; Balss, U.; Gisinger, C.; Eineder, M.
2017-05-01
The radar imaging satellite mission TerraSAR-X requires precisely determined satellite orbits for validating geodetic remote sensing techniques. Since the achieved quality of the operationally derived, reduced-dynamic (RD) orbit solutions limits the capabilities of the synthetic aperture radar (SAR) validation, an effort is made to improve the estimated orbit solutions. This paper discusses the benefits of refined dynamical models on orbit accuracy as well as estimated empirical accelerations and compares different dynamic models in a RD orbit determination. Modeling aspects discussed in the paper include the use of a macro-model for drag and radiation pressure computation, the use of high-quality atmospheric density and wind models as well as the benefit of high-fidelity gravity and ocean tide models. The Sun-synchronous dusk-dawn orbit geometry of TerraSAR-X results in a particular high correlation of solar radiation pressure modeling and estimated normal-direction positions. Furthermore, this mission offers a unique suite of independent sensors for orbit validation. Several parameters serve as quality indicators for the estimated satellite orbit solutions. These include the magnitude of the estimated empirical accelerations, satellite laser ranging (SLR) residuals, and SLR-based orbit corrections. Moreover, the radargrammetric distance measurements of the SAR instrument are selected for assessing the quality of the orbit solutions and compared to the SLR analysis. The use of high-fidelity satellite dynamics models in the RD approach is shown to clearly improve the orbit quality compared to simplified models and loosely constrained empirical accelerations. The estimated empirical accelerations are substantially reduced by 30% in tangential direction when working with the refined dynamical models. 
Likewise the SLR residuals are reduced from -3 ± 17 to 2 ± 13 mm, and the SLR-derived normal-direction position corrections are reduced from 15 to 6 mm, obtained from the 2012-2014 period. The radar range bias is reduced from -10.3 to -6.1 mm with the updated orbit solutions, which coincides with the reduced standard deviation of the SLR residuals. The improvements are mainly driven by the satellite macro-model for the purpose of solar radiation pressure modeling, improved atmospheric density models, and the use of state-of-the-art gravity field models.
NASA Astrophysics Data System (ADS)
Rosyidi, C. N.; Jauhari, WA; Suhardi, B.; Hamada, K.
2016-02-01
Quality improvement must be performed in a company to maintain its product competitiveness in the market. The goal of such improvement is to increase customer satisfaction and the profitability of the company. In current practice, a company needs several suppliers to provide the components used in the assembly of a final product, so quality improvement of the final product must involve the suppliers. In this paper, an optimization model to allocate variance reduction is developed; variance reduction is an important element of quality improvement for both the manufacturer and the suppliers. To improve the quality of suppliers' components, the manufacturer must invest an amount of its financial resources in the suppliers' learning processes. The objective function of the model minimizes the total cost, which consists of the investment cost and the quality costs, both internal and external. The learning curve determines how the suppliers' employees respond to the learning processes in reducing the variance of the component.
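The structure of such an objective, an investment cost traded against variance-dependent quality costs through a learning curve, can be illustrated with a toy cost function. The functional forms and parameter values below are hypothetical stand-ins, not those of the paper's model.

```python
def variance_after_learning(initial_variance, learning_rate, investment):
    """Hypothetical learning-curve form: supplier component variance
    falls as the manufacturer invests in the supplier's learning."""
    return initial_variance * (1.0 + investment) ** (-learning_rate)

def total_cost(investment, initial_variance, learning_rate, quality_cost_rate):
    """Objective sketched from the abstract: investment cost plus
    internal/external quality costs, here taken proportional to the
    remaining component variance."""
    variance = variance_after_learning(initial_variance, learning_rate, investment)
    return investment + quality_cost_rate * variance

# Investing 1 unit halves the variance here, cutting quality costs by
# more than the investment costs.
print(total_cost(0.0, 4.0, 1.0, 2.0))  # 8.0
print(total_cost(1.0, 4.0, 1.0, 2.0))  # 5.0
```

Allocating the investment budget across suppliers then amounts to minimizing the sum of such per-supplier costs.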
Model based control of dynamic atomic force microscope.
Lee, Chibum; Salapaka, Srinivasa M
2015-04-01
A model-based robust control approach is proposed that significantly improves imaging bandwidth for dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H(∞) control theory. This design yields a significant improvement over conventional proportional-integral designs, as verified by experiments.
Research on Nonlinear Time Series Forecasting of Time-Delay NN Embedded with Bayesian Regularization
NASA Astrophysics Data System (ADS)
Jiang, Weijin; Xu, Yusheng; Xu, Yuhui; Wang, Jianmin
Based on the idea of nonlinear prediction via phase-space reconstruction, this paper presents a time-delay BP neural network model whose generalization capability is improved by Bayesian regularization. The model is applied to forecast the import and export trade of one industry. The results show that the improved model has excellent generalization capability: it not only learned the historical curve but also efficiently predicted the trend of the business. Comparing with common forecast evaluations, we conclude that nonlinear forecasting can not only focus on data combination and precision improvement; it can also vividly reflect the nonlinear characteristics of the forecast system. While analyzing the forecasting precision of the model, we judge the model by calculating the nonlinear characteristic values of the combined series and the original series, and show that the forecasting model reasonably captures the dynamic characteristics of the nonlinear system that produced the original series.
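The phase-space reconstruction underlying the time-delay network can be illustrated with a standard delay embedding: each input vector to the network is a window of lagged series values. The delay tau and embedding dimension m below are illustrative choices, not values from the paper.

```python
# Time-delay embedding: turn a scalar series into m-dimensional
# delay vectors [x(t), x(t - tau), ..., x(t - (m-1)*tau)].
def delay_embed(series, m, tau):
    """Return the list of m-dimensional delay vectors."""
    start = (m - 1) * tau
    return [
        tuple(series[t - k * tau] for k in range(m))
        for t in range(start, len(series))
    ]

x = list(range(10))
vectors = delay_embed(x, m=3, tau=2)
print(vectors[0])  # (x[4], x[2], x[0])
```

Each such vector would serve as one input pattern to the BP network, with the next series value as the training target.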
NASA Technical Reports Server (NTRS)
Moore, James; Marty, Dave; Cody, Joe
2000-01-01
SRS and NASA/MSFC have developed software with unique capabilities to couple bearing kinematic modeling with high fidelity thermal modeling. The core thermomechanical modeling software was developed by SRS and others in the late 1980's and early 1990's under several contractual efforts. SRS originally developed software that enabled SHABERTH (Shaft Bearing Thermal Model) and SINDA (Systems Improved Numerical Differencing Analyzer) to exchange data autonomously, allowing bearing component temperature effects to propagate into the steady state bearing mechanical model. A separate contract was issued in 1990 to create a personal computer version of the software. At that time SRS performed major improvements to the code. Both SHABERTH and SINDA were independently ported to the PC and compiled. SRS then integrated the two programs into a single program named SINSHA. This was a major code improvement.
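The SHABERTH/SINDA coupling amounts to a fixed-point iteration: the mechanical model yields heat-generation rates, the thermal model yields component temperatures, and the two are exchanged until a self-consistent steady state is reached. The sketch below uses toy stand-in models, not the actual codes.

```python
# Toy coupled thermomechanical iteration in the spirit of SINSHA:
# iterate mechanical (heat) and thermal (temperature) models to convergence.
def bearing_heat(temperature):
    """Toy mechanical model: heat generation falls as lubricant thins."""
    return 100.0 - 0.2 * temperature

def thermal_response(heat):
    """Toy thermal model: temperature rise proportional to heat input."""
    return 20.0 + 0.5 * heat

def couple(tol=1e-6, max_iter=100):
    temp = 20.0
    for _ in range(max_iter):
        new_temp = thermal_response(bearing_heat(temp))
        if abs(new_temp - temp) < tol:
            break
        temp = new_temp
    return temp

print(f"converged bearing temperature: {couple():.2f}")
```

Because the combined mapping is contractive here, the loop converges to the unique self-consistent temperature.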
A novel medical information management and decision model for uncertain demand optimization.
Bi, Ya
2015-01-01
Accurately planning the procurement volume is an effective measure for controlling medicine inventory cost, but under uncertain demand it is difficult to make accurate procurement decisions. For biomedicines whose demand is sensitive to time and season, fitting the uncertain demand with fuzzy mathematics is clearly better than using general random distribution functions. The objective is to establish a novel medical information management and decision model for uncertain demand optimization. A novel optimal management and decision model under uncertain demand is presented, based on fuzzy mathematics and a new comprehensively improved particle swarm algorithm. The model can effectively reduce medicine inventory cost. The proposed improved particle swarm optimization is a simple and effective algorithm that improves the fuzzy inference and hence effectively reduces the computational complexity of the model. The new model can therefore be used for accurate decisions on procurement volume under uncertain demand.
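A minimal sketch of the approach: demand is modeled as a triangular fuzzy number, defuzzified by its centroid, and a basic particle swarm search picks the procurement volume minimizing inventory cost. The cost coefficients and PSO settings are illustrative assumptions, not the paper's algorithm.

```python
# Fuzzy-demand procurement sketch: centroid defuzzification plus a
# plain gbest particle swarm minimizing a holding/shortage cost.
import random

def centroid(a, b, c):
    """Centroid defuzzification of a triangular fuzzy number (a, b, c)."""
    return (a + b + c) / 3.0

def inventory_cost(order_qty, demand, holding=1.0, shortage=4.0):
    """Holding cost for excess stock, shortage cost for unmet demand."""
    if order_qty >= demand:
        return holding * (order_qty - demand)
    return shortage * (demand - order_qty)

def pso_minimize(cost, lo, hi, n_particles=20, iters=100, seed=1):
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    best = list(pos)
    g_best = min(pos, key=cost)
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = 0.7 * vel[i] + 1.5 * r1 * (best[i] - pos[i]) + 1.5 * r2 * (g_best - pos[i])
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
            if cost(pos[i]) < cost(best[i]):
                best[i] = pos[i]
            if cost(pos[i]) < cost(g_best):
                g_best = pos[i]
    return g_best

demand = centroid(80.0, 100.0, 150.0)  # fuzzy seasonal demand
q = pso_minimize(lambda x: inventory_cost(x, demand), lo=0.0, hi=200.0)
print(f"defuzzified demand {demand:.1f}, optimal order {q:.1f}")
```

With this V-shaped cost the optimal order equals the defuzzified demand, which the swarm locates quickly.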
How the national healthcare quality and disparities reports can catalyze quality improvement.
McNeill, Dwight; Kelley, Ed
2005-03-01
The purpose of the National Reports on Healthcare Quality and Disparities is to enhance awareness of quality and health care disparities, track progress, understand variations, and catalyze improvements in health care. The objective of this paper is to propose a model that facilitates a user's progression from knowledge to action and to show how the reports, their data warehouse, associated products, and Agency for Healthcare Research and Quality resources are integrated and focused on a comprehensive campaign to improve health care quality. The paper presents a conceptual model and shows how implementation strategies for the reports fit the model. The authors propose a quality improvement supply chain model to help elucidate the links of the process, the corresponding developmental stages that potential users need to master and progress through, and "just-in-time" supply chain inputs at each of the corresponding stages, and populate the model with examples. The traditional ways of disseminating knowledge derived from science through reports and conferences are inadequate for the vast improvements needed in the US health care system. Our model suggests the need for a wide variety of information, packaged in diverse ways, and delivered just in time and on demand. It encourages the alignment of decision makers and researchers, along with information intermediaries and innovation brokers, to make the information production cycle more efficient and effective. Future iterations of the reports will improve the relevance, meaning, and distribution of information to facilitate its uptake by potential users.
Active surface model improvement by energy function optimization for 3D segmentation.
Azimifar, Zohreh; Mohaddesi, Mahsa
2015-04-01
This paper proposes an optimized and efficient active surface model by improving the energy functions, searching method, neighborhood definition and resampling criterion. Extracting an accurate surface of the desired object from a number of 3D images using active surface and deformable models plays an important role in computer vision especially medical image processing. Different powerful segmentation algorithms have been suggested to address the limitations associated with the model initialization, poor convergence to surface concavities and slow convergence rate. This paper proposes a method to improve one of the strongest and recent segmentation algorithms, namely the Decoupled Active Surface (DAS) method. We consider a gradient of wavelet edge extracted image and local phase coherence as external energy to extract more information from images and we use curvature integral as internal energy to focus on high curvature region extraction. Similarly, we use resampling of points and a line search for point selection to improve the accuracy of the algorithm. We further employ an estimation of the desired object as an initialization for the active surface model. A number of tests and experiments have been done and the results show the improvements with regards to the extracted surface accuracy and computational time of the presented algorithm compared with the best and recent active surface models. Copyright © 2015 Elsevier Ltd. All rights reserved.
State-of-the-art satellite laser range modeling for geodetic and oceanographic applications
NASA Technical Reports Server (NTRS)
Klosko, Steve M.; Smith, David E.
1993-01-01
Significant improvements have been made in the modeling and accuracy of Satellite Laser Ranging (SLR) data since the launch of LAGEOS in 1976. Some of these include improved models of the static geopotential, solid-Earth and ocean tides, more advanced atmospheric drag models, and the adoption of the J2000 reference system with improved nutation and precession. Site positioning using SLR systems currently yields approximately 2 cm static and 5 mm/y kinematic descriptions of the geocentric location of these sites. Incorporation of a large set of observations from advanced SLR tracking systems has directly made major contributions to the gravitational fields and advanced the state of the art in precision orbit determination. SLR is the baseline tracking system for the altimeter-bearing TOPEX/Poseidon and ERS-1 satellites and thus will play an important role in providing the Conventional Terrestrial Reference Frame for instantaneously locating the geocentric position of the ocean surface over time, in providing an unchanging range standard for altimeter range calibration, and in improving the geoid models used to separate gravitational from ocean circulation signals seen in the sea surface. Nevertheless, despite the unprecedented improvements in the accuracy of the models used to support orbit reduction of laser observations, there remain systematic unmodeled effects which limit the full exploitation of modern SLR data.
Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models
Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby
2017-01-01
Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibrations methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
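The sub-model idea can be illustrated in miniature: separate regressions are trained on low- and high-concentration subsets, and their predictions are blended according to a full-range model's initial estimate. The 1-D least-squares fits below stand in for the paper's PLS regressions, and all data are synthetic.

```python
# Sub-model blending sketch: train low-range, high-range, and full-range
# linear models, then blend sub-model predictions using the full model's
# first guess of where the sample lies.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def predict(model, x):
    slope, intercept = model
    return slope * x + intercept

# Synthetic spectral feature vs. concentration, with different
# low- and high-range responses (a "matrix effect" in miniature).
low = [(x, 2.0 * x) for x in (1, 2, 3, 4)]
high = [(x, 1.5 * x + 3.0) for x in (6, 7, 8, 9)]

m_full = fit_line(*zip(*(low + high)))
m_low = fit_line(*zip(*low))
m_high = fit_line(*zip(*high))

def blended_predict(x, split=5.0, width=2.0):
    """Weight the sub-models by where the full model places the sample."""
    first_guess = predict(m_full, x)
    w_high = min(1.0, max(0.0, (first_guess - (split - width)) / (2 * width)))
    return (1 - w_high) * predict(m_low, x) + w_high * predict(m_high, x)

print(blended_predict(2.0), blended_predict(8.0))
```

Samples the full model places firmly in one range are handled entirely by that range's sub-model; samples near the boundary get a smooth mixture, avoiding discontinuities in the final calibration.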
Kempfle, Judith S.; BuSaba, Nicholas Y.; Dobrowski, John M.; Westover, Michael B.; Bianchi, Matt T.
2017-01-01
Objectives/Hypothesis: Nasal surgery has been implicated to improve continuous positive airway pressure (CPAP) compliance in patients with obstructive sleep apnea (OSA) and nasal obstruction. However, the cost-effectiveness of nasal surgery to improve CPAP compliance is not known. We modeled the cost-effectiveness of two types of nasal surgery versus no surgery in patients with OSA and nasal obstruction undergoing CPAP therapy. Study Design: Cost-effectiveness decision tree model. Methods: We built a decision tree model to identify conditions under which nasal surgery would be cost-effective to improve CPAP adherence over the standard of care. We compared turbinate reduction and septoplasty to nonsurgical treatment over varied time horizons from a third-party payer perspective. We included variables for cost of untreated OSA, surgical cost and complications, improved compliance postoperatively, and quality of life. Results: Our study identified nasal surgery as a cost-effective strategy to improve compliance of OSA patients using CPAP across a range of plausible model assumptions regarding the cost of untreated OSA, the probability of adherence improvement, and a chronic time horizon. The relatively lower surgical cost of turbinate reduction made it more cost-effective at earlier time horizons, whereas septoplasty became cost-effective after a longer timespan. Conclusions: Across a range of plausible values in a clinically relevant decision model, nasal surgery is a cost-effective strategy to improve CPAP compliance in OSA patients with nasal obstruction. Our results suggest that OSA patients with nasal obstruction who struggle with CPAP therapy compliance should undergo evaluation for nasal surgery. PMID:27653626
Operational improvements at traffic circles : final report, December 2008.
DOT National Transportation Integrated Search
2008-12-01
This study deals with the development of a credible and valid simulation model of the Collingwood, : Brooklawn, and Asbury traffic circles in New Jersey. These simulation models are used to evaluate : various geometric and operational improvement alt...
Accuracy Analysis of a Box-wing Theoretical SRP Model
NASA Astrophysics Data System (ADS)
Wang, Xiaoya; Hu, Xiaogong; Zhao, Qunhe; Guo, Rui
2016-07-01
For the Beidou satellite navigation system (BDS), a high-accuracy SRP model is necessary for precise applications, especially once the global BDS is established in the future, and the accuracy of the BDS broadcast ephemeris needs to be improved. We therefore established a box-wing theoretical SRP model with a fine structure, adding a conical-shadow factor for the Earth and Moon. We verified this SRP model with the GPS Block IIF satellites, using data from the PRN 1, 24, 25, and 27 satellites. The results show that the physical SRP model has higher accuracy for POD and orbit prediction of the GPS IIF satellites than the Bern empirical model. The 3D RMS of the orbit is about 20 centimeters. The POD accuracy of the two models is similar, but the prediction accuracy with the physical SRP model is more than doubled. We tested 1-day, 3-day, and 7-day orbit predictions; the longer the prediction arc, the more significant the improvement. The orbit prediction accuracies with the physical SRP model for 1-day, 3-day, and 7-day arcs are 0.4 m, 2.0 m, and 10.0 m, respectively, versus 0.9 m, 5.5 m, and 30 m with the Bern empirical model. We applied this method to the BDS, derived an SRP model for the Beidou satellites, and then tested the model with one month of Beidou data. Initial results show that the model is good but needs more data for verification and improvement. The orbit residual RMS is similar to that of our empirical force model, which estimates only the along-track and across-track forces and a y-bias, but the orbit-overlap and SLR evaluations show some improvement. The remaining empirical force is reduced significantly for the present Beidou constellation.
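In a box-wing model, each flat plate of the satellite contributes an SRP acceleration depending on its area, optical properties, and orientation to the Sun. The sketch below uses a standard flat-plate SRP formula with illustrative plate properties; it is a simplified illustration, not the paper's model.

```python
# Box-wing SRP sketch: sum flat-plate radiation-pressure forces over a
# toy "bus + solar wing" geometry. Plate properties are illustrative.
import math

SOLAR_FLUX = 1367.0      # W/m^2 at 1 AU
C_LIGHT = 299792458.0    # m/s

def plate_srp_force(area, cos_theta, spec, diff):
    """|Force| on one plate; spec/diff are specular/diffuse reflectivities."""
    if cos_theta <= 0.0:
        return 0.0  # plate faces away from the Sun
    p = SOLAR_FLUX / C_LIGHT  # radiation pressure
    absorbed = 1.0 - spec - diff
    # Flat-plate model: absorbed plus reflected momentum transfer.
    return p * area * cos_theta * (absorbed + 2.0 * spec * cos_theta + 2.0 / 3.0 * diff)

plates = [
    (2.0, math.cos(math.radians(30)), 0.2, 0.1),  # bus panel
    (2.0, math.cos(math.radians(60)), 0.2, 0.1),  # bus panel
    (10.0, 1.0, 0.1, 0.2),                        # solar wing, Sun-pointing
]
force = sum(plate_srp_force(*p) for p in plates)
mass = 1000.0  # kg, assumed
print(f"SRP acceleration ~ {force / mass:.3e} m/s^2")
```

Eclipse modeling, like the conical Earth/Moon shadow factor in the abstract, would scale each plate's force by the visible fraction of the solar disk.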
Economic analysis of interventions to improve village chicken production in Myanmar.
Henning, J; Morton, J; Pym, R; Hla, T; Sunn, K; Meers, J
2013-07-01
A cost-benefit analysis using deterministic and stochastic modelling was conducted to identify the net benefits for households that adopt (1) vaccination of individual birds against Newcastle disease (ND) or (2) improved management of chick rearing by providing coops for the protection of chicks from predation and chick starter feed inside a creep feeder to support chicks' nutrition in village chicken flocks in Myanmar. Partial budgeting was used to assess the additional costs and benefits associated with each of the two interventions tested relative to neither strategy. In the deterministic model, over the first 3 years after the introduction of the interventions, the cumulative sum of the net differences from neither strategy was 13,189 Kyat for ND vaccination and 77,645 Kyat for improved chick management (effective exchange rate in 2005: 1000 Kyat = 1 US$). Both interventions were also profitable after discounting over a 10-year period; Net Present Values for ND vaccination and improved chick management were 30,791 and 167,825 Kyat, respectively. The Benefit-Cost Ratio for ND vaccination was very high (28.8). This was lower for improved chick management, due to greater costs of the intervention, but still favourable at 4.7. Using both interventions concurrently yielded a Net Present Value of 470,543 Kyat and a Benefit-Cost Ratio of 11.2 over the 10-year period in the deterministic model. Using the stochastic model, for the first 3 years following the introduction of the interventions, the mean cumulative sums of the net difference were similar to those values obtained from the deterministic model. Sensitivity analysis indicated that the cumulative net differences were strongly influenced by grower bird sale income, particularly under improved chick management. The effects of the strategies on odds of households selling and consuming birds after 7 months, and numbers of birds being sold or consumed after this period also influenced profitability. 
Cost variations for equipment used under improved chick management were not markedly associated with profitability. Net Present Values and Benefit-Cost Ratios discounted over a 10-year period were also similar to the deterministic model when mean values obtained through stochastic modelling were used. In summary, the study showed that ND vaccination and improved chick management can improve the viability and profitability of village chicken production in Myanmar. Copyright © 2013 Elsevier B.V. All rights reserved.
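The partial-budgeting summary statistics above follow from standard discounting: yearly net differences are discounted to a Net Present Value (NPV), and a Benefit-Cost Ratio (BCR) is formed from discounted benefits over discounted costs. The cash flows and discount rate below are illustrative, not study data.

```python
# NPV and Benefit-Cost Ratio over a 10-year horizon with constant
# annual benefits and costs. All figures are placeholders in Kyat.
def npv(net_flows, rate):
    """Discount a list of end-of-year cash flows to present value."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(net_flows, start=1))

def benefit_cost_ratio(benefits, costs, rate):
    return npv(benefits, rate) / npv(costs, rate)

years = 10
benefits = [9000.0] * years   # extra income per year, Kyat (assumed)
costs = [500.0] * years       # vaccination costs per year, Kyat (assumed)
rate = 0.08

print(f"NPV = {npv([b - c for b, c in zip(benefits, costs)], rate):,.0f} Kyat")
print(f"BCR = {benefit_cost_ratio(benefits, costs, rate):.1f}")
```

With constant flows the BCR reduces to the undiscounted benefit/cost ratio; with time-varying flows, as in the study, discounting changes both figures.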
NASA Technical Reports Server (NTRS)
Goad, Clyde C.; Chadwell, C. David
1993-01-01
GEODYNII is a conventional batch least-squares differential corrector computer program with deterministic models of the physical environment. Conventional algorithms were used to process differenced phase and pseudorange data to determine eight-day Global Positioning System (GPS) orbits with several-meter accuracy. However, random physical processes drive the errors whose magnitudes prevent improving the GPS orbit accuracy. To improve the orbit accuracy, these random processes should be modeled stochastically. The conventional batch least-squares algorithm cannot accommodate stochastic models; only a stochastic estimation algorithm, such as a sequential filter/smoother, is suitable. Also, GEODYNII cannot currently model the correlation among data values. Differenced pseudorange, and especially differenced phase, are precise data types that can be used to improve the GPS orbit precision. To overcome these limitations and improve the accuracy of GPS orbits computed using GEODYNII, we proposed to develop a sequential stochastic filter/smoother processor by using GEODYNII as a type of trajectory preprocessor. Our proposed processor is now completed. It contains a correlated double difference range processing capability, first order Gauss Markov models for the solar radiation pressure scale coefficient and y-bias acceleration, and a random walk model for the tropospheric refraction correction. The development approach was to interface the standard GEODYNII output files (measurement partials and variationals) with software modules containing the stochastic estimator, the stochastic models, and a double differenced phase range processing routine. Thus, no modifications to the original GEODYNII software were required. A schematic of the development is shown. The observational data are edited in the preprocessor and the data are passed to GEODYNII as one of its standard data types. 
A reference orbit is determined using GEODYNII as a batch least-squares processor and the GEODYNII measurement partial (FTN90) and variational (FTN80, V-matrix) files are generated. These two files along with a control statement file and a satellite identification and mass file are passed to the filter/smoother to estimate time-varying parameter states at each epoch, improved satellite initial elements, and improved estimates of constant parameters.
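The first-order Gauss-Markov model mentioned above, as used for parameters like the solar-radiation-pressure scale coefficient, is an exponentially correlated random process: the state decays toward zero with correlation time tau while driven by white noise. The time step, tau, and noise level below are illustrative values.

```python
# Simulate a first-order Gauss-Markov process:
# x_{k+1} = phi * x_k + w_k, with phi = exp(-dt/tau).
import math
import random

def gauss_markov(n_steps, dt, tau, sigma, seed=0):
    """Return samples with steady-state standard deviation sigma."""
    rng = random.Random(seed)
    phi = math.exp(-dt / tau)
    # Discrete process-noise scale keeping steady-state variance at sigma^2.
    q = sigma * math.sqrt(1.0 - phi * phi)
    x, out = 0.0, []
    for _ in range(n_steps):
        x = phi * x + q * rng.gauss(0.0, 1.0)
        out.append(x)
    return out

states = gauss_markov(n_steps=5000, dt=60.0, tau=1800.0, sigma=1e-9)
sample_rms = (sum(s * s for s in states) / len(states)) ** 0.5
print(f"sample rms ~ {sample_rms:.2e} (target 1e-09)")
```

A random-walk model, as used for the tropospheric correction, is the tau → infinity limit of the same recursion.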
Li, Tingting; Xie, Baohua; Wang, Guocheng; Zhang, Wen; Zhang, Qing; Vesala, Timo; Raivonen, Maarit
2016-07-15
Coastal wetlands are important CH4 sources to the atmosphere. Coastal wetlands account for ~10% of the total area of natural wetlands in China, but the size of this potential CH4 source remains highly uncertain. We introduced the influence of salinity on CH4 production and CH4 diffusion into a biogeophysical model named CH4MODwetland so that it can be used in coastal wetlands. The improved model can generally simulate seasonal CH4 variations from tidal marshes dominated by Phragmites and Scirpus. However, the model underestimated winter CH4 fluxes from tidal marshes in the Yellow River Delta and YanCheng Estuary. It also failed to capture the accurate timing of the CH4 peaks in YanCheng Estuary and ChongMing Island in 2012. The improved model could generally simulate the difference between the annual mean CH4 fluxes from mangrove sites in GuangZhou and HaiKou city under different salinity and water table depth conditions, although fluxes were systematically underestimated in the mangrove site of HaiKou city. Using the improved model, the seasonal CH4 emissions simulated across all of the coastal wetlands ranged from 0.1 to 44.90 g m(-2), with an average value of 7.89 g m(-2), which is in good agreement with the observed values. The improved model significantly decreased the RMSE and RMD from 424% to 14% and 314% to -2%, respectively, and improved the EF from -18.30 to 0.99. Model sensitivity analysis showed that CH4 emissions were most sensitive to Pox in the tidal marshes and salinity in the mangroves. The results show that previous studies may have overestimated CH4 emissions on a regional or global scale by neglecting the influence of salinity. In general, the CH4MODwetland model can simulate seasonal CH4 emissions from different types of coastal wetlands under various conditions. 
Further improvements of CH4MODwetland should include the specific characteristics of CH4 processes in mangroves to decrease the uncertainty in estimating regional or global CH4 emissions from natural wetlands. Copyright © 2016 Elsevier B.V. All rights reserved.
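The evaluation statistics quoted above (RMSE, RMD, and model efficiency EF) can be computed as sketched here; EF is interpreted as the Nash-Sutcliffe efficiency, and the observed/simulated values are synthetic illustrations.

```python
# Model-evaluation metrics: percent RMSE, relative mean deviation (RMD),
# and Nash-Sutcliffe model efficiency (EF). Data below are synthetic.
import math

def rmse_percent(obs, sim):
    mean_obs = sum(obs) / len(obs)
    mse = sum((s - o) ** 2 for o, s in zip(obs, sim)) / len(obs)
    return 100.0 * math.sqrt(mse) / mean_obs

def rmd_percent(obs, sim):
    """Relative mean deviation: average bias relative to the observed total."""
    return 100.0 * sum(s - o for o, s in zip(obs, sim)) / sum(obs)

def efficiency(obs, sim):
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

obs = [2.0, 5.0, 9.0, 14.0, 8.0, 3.0]
sim = [2.5, 4.5, 9.5, 13.0, 8.5, 3.5]
print(f"RMSE={rmse_percent(obs, sim):.0f}%  RMD={rmd_percent(obs, sim):+.0f}%  EF={efficiency(obs, sim):.2f}")
```

An EF of 1 means perfect agreement and values below 0 mean the model is worse than predicting the observed mean, which is why the reported jump from -18.30 to 0.99 is such a large improvement.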
Vanos, J K; Warland, J S; Gillespie, T J; Kenny, N A
2012-11-01
The purpose of this paper is to implement current and novel research techniques in human energy budget estimations to give more accurate and efficient application of models by a variety of users. Using the COMFA model, the conditioning level of an individual is incorporated into overall energy budget predictions, giving more realistic estimations of the metabolism experienced at various fitness levels. Through the use of VO(2) reserve estimates, errors are found when an elite athlete is modelled as an unconditioned or a conditioned individual, giving budgets underpredicted significantly by -173 and -123 W m(-2), respectively. Such underprediction can result in critical errors regarding heat stress, particularly in highly motivated individuals; thus this revision is critical for athletic individuals. A further improvement in the COMFA model involves improved adaptation of clothing insulation (I(cl)), as well as clothing non-uniformity, with changing air temperature (T(a)) and metabolic activity (M(act)). Equivalent T(a) values (for I(cl) estimation) are calculated in order to lower the I(cl) value with increasing M(act) at equal T(a). Furthermore, threshold T(a) values are calculated to predict the point at which an individual will change from a uniform I(cl) to a segmented I(cl) (full ensemble to shorts and a T-shirt). Lastly, improved relative velocity (v(r)) estimates were found with a refined equation accounting for the angle of wind to body movement. Differences between the original and improved v(r) equations increased with higher wind and activity speeds, and as the wind to body angle moved away from 90°. Under moderate microclimate conditions, and wind from behind a person, the convective heat loss and skin temperature estimates were 47 W m(-2) and 1.7°C higher when using the improved v(r) equation. 
These model revisions improve the applicability and usability of the COMFA energy budget model for subjects performing physical activity in outdoor environments. Application is possible for other similar energy budget models, and within various urban and rural environments.
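The improved relative air velocity accounts for the angle between the wind and the body's movement. The vector form sketched below (law of cosines on the wind and walking velocity vectors) is a plausible reconstruction for illustration, not necessarily the paper's exact equation.

```python
# Relative air velocity over a moving person: vector difference of wind
# and body-motion velocities vs. the naive scalar sum.
import math

def relative_velocity(v_wind, v_body, angle_deg):
    """|wind vector - body-motion vector|; angle is the wind-to-body angle."""
    theta = math.radians(angle_deg)
    return math.sqrt(v_wind ** 2 + v_body ** 2 - 2.0 * v_wind * v_body * math.cos(theta))

def simple_relative_velocity(v_wind, v_body):
    """A naive scalar combination, ignoring direction."""
    return v_wind + v_body

# Wind from directly behind (0 deg): the two estimates differ the most.
print(relative_velocity(3.0, 2.0, 0.0), simple_relative_velocity(3.0, 2.0))
```

With a tailwind the vector form gives a much smaller relative velocity than the scalar sum, consistent with the abstract's finding that the largest differences occur away from a 90° wind-to-body angle.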
Improving the Power of GWAS and Avoiding Confounding from Population Stratification with PC-Select
Tucker, George; Price, Alkes L.; Berger, Bonnie
2014-01-01
Using a reduced subset of SNPs in a linear mixed model can improve power for genome-wide association studies, yet this can result in insufficient correction for population stratification. We propose a hybrid approach using principal components that does not inflate statistics in the presence of population stratification and improves power over standard linear mixed models. PMID:24788602
Barrett, Jessica; Pennells, Lisa; Sweeting, Michael; Willeit, Peter; Di Angelantonio, Emanuele; Gudnason, Vilmundur; Nordestgaard, Børge G.; Psaty, Bruce M; Goldbourt, Uri; Best, Lyle G; Assmann, Gerd; Salonen, Jukka T; Nietert, Paul J; Verschuren, W. M. Monique; Brunner, Eric J; Kronmal, Richard A; Salomaa, Veikko; Bakker, Stephan J L; Dagenais, Gilles R; Sato, Shinichi; Jansson, Jan-Håkan; Willeit, Johann; Onat, Altan; de la Cámara, Agustin Gómez; Roussel, Ronan; Völzke, Henry; Dankner, Rachel; Tipping, Robert W; Meade, Tom W; Donfrancesco, Chiara; Kuller, Lewis H; Peters, Annette; Gallacher, John; Kromhout, Daan; Iso, Hiroyasu; Knuiman, Matthew; Casiglia, Edoardo; Kavousi, Maryam; Palmieri, Luigi; Sundström, Johan; Davis, Barry R; Njølstad, Inger; Couper, David; Danesh, John; Thompson, Simon G; Wood, Angela
2017-01-01
The added value of incorporating information from repeated blood pressure and cholesterol measurements to predict cardiovascular disease (CVD) risk has not been rigorously assessed. We used data on 191,445 adults from the Emerging Risk Factors Collaboration (38 cohorts from 17 countries with data encompassing 1962–2014) with more than 1 million measurements of systolic blood pressure, total cholesterol, and high-density lipoprotein cholesterol. Over a median 12 years of follow-up, 21,170 CVD events occurred. Risk prediction models using cumulative mean values of repeated measurements and summary measures from longitudinal modeling of the repeated measurements were compared with models using measurements from a single time point. Risk discrimination (C-index) and net reclassification were calculated, and changes in C-indices were meta-analyzed across studies. Compared with the single-time-point model, the cumulative means and longitudinal models increased the C-index by 0.0040 (95% confidence interval (CI): 0.0023, 0.0057) and 0.0023 (95% CI: 0.0005, 0.0042), respectively. Reclassification was also improved in both models; compared with the single-time-point model, overall net reclassification improvements were 0.0369 (95% CI: 0.0303, 0.0436) for the cumulative-means model and 0.0177 (95% CI: 0.0110, 0.0243) for the longitudinal model. In conclusion, incorporating repeated measurements of blood pressure and cholesterol into CVD risk prediction models slightly improves risk prediction. PMID:28549073
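The discrimination statistic reported above (C-index) is the proportion of comparable subject pairs in which the model assigns the higher risk score to the person who had the event earlier. The sketch below uses a simplified censoring rule and synthetic data.

```python
# Harrell-style concordance index for survival risk scores.
# A pair (i, j) is comparable when i had an observed event before j's time.
def c_index(times, events, scores):
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if scores[i] > scores[j]:
                    concordant += 1
                elif scores[i] == scores[j]:
                    concordant += 0.5
    return concordant / comparable

times = [2, 5, 7, 9, 12]
events = [1, 1, 0, 1, 0]             # 1 = event observed, 0 = censored
scores = [0.9, 0.7, 0.6, 0.4, 0.2]   # perfectly ordered risk scores
print(c_index(times, events, scores))  # 1.0 for perfect discrimination
```

A C-index of 0.5 is no better than chance, so the small but positive increments of 0.0040 and 0.0023 reported above represent modest gains on this scale.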
Look, Clarisse; McCabe, Patricia; Heard, Robert; Madill, Catherine J
2018-02-02
Modeling and instruction are frequent components of both traditional and technology-assisted voice therapy. This study investigated the value of video modeling and instruction in the early acquisition and short-term retention of a complex voice task without external feedback. Thirty participants were randomized to two conditions and trained to produce a vocal siren over 40 trials. One group received a model and verbal instructions, the other group received a model only. Sirens were analyzed for phonation time, vocal intensity, cepstral peak prominence, peak-to-peak time, and root-mean-square error at five time points. The model and instruction group showed significant improvement on more outcome measures than the model-only group. There was an interaction effect for vocal intensity, which showed that instructions facilitated greater improvement when they were first introduced. However, neither group reproduced the model's siren performance across all parameters or retained the skill 1 day later. Providing verbal instruction with a model appears more beneficial than providing a model only in the prepractice phase of acquiring a complex voice skill. Improved performance was observed; however, the higher level of performance was not retained after 40 trials in both conditions. Other prepractice variables may need to be considered. Findings have implications for traditional and technology-assisted voice therapy. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Data envelopment analysis in service quality evaluation: an empirical study
NASA Astrophysics Data System (ADS)
Najafi, Seyedvahid; Saati, Saber; Tavana, Madjid
2015-09-01
Service quality is often conceptualized as the comparison between service expectations and actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. A substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A slack-based measure (SBM) of efficiency with constant inputs is used to calculate the PSQI. In addition, a non-linear programming model based on the SBM is proposed to delineate an improvement guideline and improve service quality. An empirical study is conducted to assess the applicability of the proposed method. A large number of studies have used DEA as a benchmarking tool to measure service quality, but these models do not propose a coherent performance evaluation construct and consequently fail to deliver guidelines for improving service quality. The DEA models proposed in this study are designed to evaluate and improve service quality within a comprehensive framework and without any dependency on external data.
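A slack-based index with constant inputs can be sketched with a small linear program. This is an illustrative, simplified output-oriented formulation (constant unit inputs, convexity constraint), not necessarily the exact PSQI model of the study: each unit's index is 1/(1 + mean output-slack ratio), so frontier units score 1.

```python
import numpy as np
from scipy.optimize import linprog

def psqi(Y, o):
    """Slack-based service-quality index for unit o under constant inputs.

    Y: (n_units, n_outputs) matrix of perceived service-quality scores.
    Solves  max sum_r s_r / y_or   s.t.  Y^T lam - s = y_o,  1^T lam = 1,
    lam, s >= 0, then returns 1 / (1 + mean slack ratio)."""
    n, m = Y.shape
    c = np.concatenate([np.zeros(n), -1.0 / Y[o]])   # maximize slack ratios
    A_eq = np.zeros((m + 1, n + m))
    A_eq[:m, :n] = Y.T                               # reference outputs
    A_eq[:m, n:] = -np.eye(m)                        # minus output slacks
    A_eq[m, :n] = 1.0                                # convexity: sum(lam) = 1
    b_eq = np.concatenate([Y[o], [1.0]])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (n + m))
    return 1.0 / (1.0 + (-res.fun) / m)

# Four units scored on two SERVQUAL-style output dimensions (toy data)
Y = np.array([[4.0, 2.0], [2.0, 4.0], [3.0, 3.0], [1.0, 1.0]])
print(round(psqi(Y, 0), 3))  # 1.0   (unit 0 lies on the frontier)
print(round(psqi(Y, 3), 3))  # 0.333 (unit 3 is dominated)
```

The positive slacks of an inefficient unit double as an improvement guideline: they say by how much each quality dimension would have to rise to reach the frontier.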
MODIS Data Assimilation in the CROPGRO model for improving soybean yield estimations
NASA Astrophysics Data System (ADS)
Richetti, J.; Monsivais-Huertero, A.; Ahmad, I.; Judge, J.
2017-12-01
Soybean is one of the main agricultural commodities in the world, so better estimates of its production are important. Improving soybean crop models in Brazil is crucial for better understanding of the soybean market and for enhancing decision making: Brazil is the second-largest soybean producer in the world, Paraná state accounts for almost 20% of that output, and on its own Paraná would be the fourth-largest soybean producer in the world. Data assimilation techniques provide a method to improve the spatio-temporal continuity of crop estimates through integration of remotely sensed observations and crop growth models. This study aims to use MODIS EVI to improve DSSAT-CROPGRO soybean yield estimations in Paraná state, southern Brazil. The method uses the ensemble Kalman filter, which assimilates the combined MODIS Terra and Aqua products (MOD13Q1 and MYD13Q1) into the CROPGRO model to improve agricultural production estimates by updating light interception data over time. Expected results will be validated against monitored commercial farms during the period 2013-2014.
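The ensemble Kalman filter analysis step at the heart of such assimilation can be sketched generically. This is a standard stochastic EnKF; the state vector, observation operator, and error statistics below are illustrative stand-ins for the CROPGRO/MODIS setup, not values from the study.

```python
import numpy as np

def enkf_update(X, y_obs, H, R, rng):
    """Stochastic ensemble Kalman filter analysis step.

    X: (n_state, n_ens) forecast ensemble; y_obs: (n_obs,) observation;
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs-error cov."""
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
    Pf = A @ A.T / (n_ens - 1)                       # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
    Y = y_obs[:, None] + rng.multivariate_normal(    # perturbed observations
        np.zeros(len(y_obs)), R, size=n_ens).T
    return X + K @ (Y - H @ X)

rng = np.random.default_rng(0)
X = rng.normal(0.5, 0.1, size=(2, 50))   # two crop states, 50 ensemble members
H = np.array([[1.0, 0.0]])               # only the first state is observed
Xa = enkf_update(X, np.array([0.8]), H, np.array([[1e-4]]), rng)
print(abs(Xa[0].mean() - 0.8) < abs(X[0].mean() - 0.8))  # pulled toward the observation
```

The second, unobserved state is also corrected through the sample covariance, which is how an EVI-like observation can update internal model variables such as light interception.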
Interacting multiple model forward filtering and backward smoothing for maneuvering target tracking
NASA Astrophysics Data System (ADS)
Nandakumaran, N.; Sutharsan, S.; Tharmarasa, R.; Lang, Tom; McDonald, Mike; Kirubarajan, T.
2009-08-01
The Interacting Multiple Model (IMM) estimator has been proven to be effective in tracking agile targets. Smoothing or retrodiction, which uses measurements beyond the current estimation time, provides better estimates of target states. Various methods have been proposed for multiple model smoothing in the literature. In this paper, a new smoothing method, which involves forward filtering followed by backward smoothing while maintaining the fundamental spirit of the IMM, is proposed. The forward filtering is performed using the standard IMM recursion, while the backward smoothing is performed using a novel interacting smoothing recursion. This backward recursion mimics the IMM estimator in the backward direction, where each mode-conditioned smoother uses the standard Kalman smoothing recursion. The resulting algorithm provides improved but delayed estimates of target states. Simulation studies demonstrate the improved performance on a maneuvering target scenario. The comparison with existing methods confirms the improved smoothing accuracy. This improvement results from avoiding the augmented state vector used by other algorithms. In addition, the new technique for accounting for model switching during smoothing is key to the improved performance.
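The mode-conditioned Kalman smoothing recursion mentioned above is the standard Rauch-Tung-Striebel backward pass. A minimal sketch follows; the paper's IMM-specific mixing and mode-probability smoothing steps are omitted, and the toy filtered values are invented for illustration.

```python
import numpy as np

def rts_smoother(xs, Ps, F, Q):
    """Rauch-Tung-Striebel backward pass over filtered means xs and covariances
    Ps, for the linear-Gaussian model x[k+1] = F x[k] + w,  w ~ N(0, Q)."""
    n = len(xs)
    xs_s, Ps_s = [None] * n, [None] * n
    xs_s[-1], Ps_s[-1] = xs[-1], Ps[-1]
    for k in range(n - 2, -1, -1):
        x_pred = F @ xs[k]                           # one-step prediction
        P_pred = F @ Ps[k] @ F.T + Q
        G = Ps[k] @ F.T @ np.linalg.inv(P_pred)      # smoother gain
        xs_s[k] = xs[k] + G @ (xs_s[k + 1] - x_pred)
        Ps_s[k] = Ps[k] + G @ (Ps_s[k + 1] - P_pred) @ G.T
    return xs_s, Ps_s

# Toy filtered output from a nearly-constant-velocity model
F = np.array([[1.0, 1.0], [0.0, 1.0]])
Q = 0.01 * np.eye(2)
xs = [np.array([0.0, 1.0]), np.array([1.1, 1.0]), np.array([2.0, 0.9])]
Ps = [0.5 * np.eye(2)] * 3
xs_s, Ps_s = rts_smoother(xs, Ps, F, Q)
print(np.allclose(xs_s[-1], xs[-1]))  # last smoothed state equals the filtered one
```

In the interacting scheme, one such backward pass runs per motion model, and the per-mode results are combined using backward mode probabilities.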
Establishment and outcomes of a model primary care pharmacy service system.
Carmichael, Jannet M; Alvarez, Autumn; Chaput, Ryan; DiMaggio, Jennifer; Magallon, Heather; Mambourg, Scott
2004-03-01
The establishment and outcomes of a model primary care pharmacy service system are described. A primary care pharmacy practice model was established at a government health care facility in March 1996. The original objective was to establish a primary pharmacy practice model that would demonstrate improved patient outcomes and maximize the pharmacist's contributions to drug therapy. Since its inception, many improvements have been realized and supported by advanced computer and automated systems, expanded disease state management practices, and unique practitioner and administrative support. Many outcomes studies have been performed on the pharmacist-initiated and -managed clinics, leading to improved patient care and conveying the quality-conscious and cost-effective role pharmacists can play as independent practitioners in this environment. These activities demonstrate cutting-edge leadership in health-system pharmacy. Redesign has been used to improve consistent access to a medication expert and has significantly improved the quality of patient care while easing physicians' workload without increasing health care costs. A system using pharmacists as independent practitioners to promote primary care has achieved high-quality and cost-effective patient care.
NASA Astrophysics Data System (ADS)
Prahani, B. K.; Suprapto, N.; Suliyanah; Lestari, N. A.; Jauhariyah, M. N. R.; Admoko, S.; Wahyuni, S.
2018-03-01
In previous research, the Collaborative Problem Based Physics Learning (CPBPL) model was developed to improve students' science process skills, collaborative problem solving, and self-confidence in physics learning. This research analyzes the effectiveness of the CPBPL model in improving students' self-confidence in physics learning. A quasi-experimental design was implemented with 140 senior high school students divided into 4 groups. Data were collected through questionnaire, observation, and interview. Self-confidence was measured with a Self-Confidence Evaluation Sheet (SCES). The data were analyzed using the Wilcoxon test, n-gain, and the Kruskal-Wallis test. Results show that: (1) there is a significant improvement in students' self-confidence in physics learning (α=5%), (2) the n-gain of students' self-confidence in physics learning is high, and (3) the average n-gain of students' self-confidence in physics learning was consistent across all groups. It can be concluded that the CPBPL model is effective in improving students' self-confidence in physics learning.
An Update on Improvements to NiCE Support for PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, Andrew; McCaskey, Alexander J.; Billings, Jay Jay
2015-09-01
The Department of Energy Office of Nuclear Energy's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program has supported the development of the NEAMS Integrated Computational Environment (NiCE), a modeling and simulation workflow environment that provides services and plugins to facilitate tasks such as code execution, model input construction, visualization, and data analysis. This report details the development of workflows for the reactor core neutronics application, PROTEUS. This advanced neutronics application (primarily developed at Argonne National Laboratory) aims to improve nuclear reactor design and analysis by providing an extensible and massively parallel, finite-element solver for current and advanced reactor fuel neutronics modeling. The integration of PROTEUS-specific tools into NiCE is intended to make the advanced capabilities that PROTEUS provides more accessible to the nuclear energy research and development community. This report will detail the work done to improve existing PROTEUS workflow support in NiCE. We will demonstrate and discuss these improvements, including the development of flexible IO services, an improved interface for input generation, and the addition of advanced Fortran development tools natively in the platform.
Thirty Years of Improving the NCEP Global Forecast System
NASA Astrophysics Data System (ADS)
White, G. H.; Manikin, G.; Yang, F.
2014-12-01
Current eight-day forecasts by the NCEP Global Forecast System are as accurate as five-day forecasts were 30 years ago. This revolution in weather forecasting reflects increases in computer power; improvements in the assimilation of observations, especially satellite data; improvements in model physics and in the observations themselves; and international cooperation and competition. One important component has been, and remains, the diagnosis, evaluation and reduction of systematic errors. The effect of proposed improvements in the GFS on systematic errors is one component of the thorough testing of such improvements by the Global Climate and Weather Modeling Branch. Examples of reductions in systematic errors in zonal mean temperatures and winds and other fields will be presented. One challenge in evaluating systematic errors is uncertainty about what reality is. Model initial states can be regarded as the best overall depiction of the atmosphere, but can be misleading in areas with few observations or for fields that are not well observed, such as humidity or precipitation over the oceans. Verification of model physics is particularly difficult. The Environmental Modeling Center emphasizes the evaluation of systematic biases against observations. Recently EMC has placed greater emphasis on synoptic evaluation and on precipitation, 2-meter temperatures and dew points, and 10-meter winds. A weekly EMC map discussion reviews the performance of many models over the United States and has helped diagnose and alleviate significant systematic errors in the GFS, including a near-surface summertime evening cold wet bias over the eastern US and a multi-week period when the GFS persistently developed bogus tropical storms off Central America. The GFS exhibits a wet bias for light rain and a dry bias for moderate to heavy rain over the continental United States. Significant changes to the GFS are scheduled to be implemented in the fall of 2014.
These include higher resolution, improved physics and improvements to the assimilation. These changes significantly improve the tropospheric flow and reduce a tropical upper tropospheric warm bias. One important error remaining is the failure of the GFS to maintain deep convection over Indonesia and in the tropical west Pacific. This and other current systematic errors will be presented.
NASA Astrophysics Data System (ADS)
Fijani, Elham; Nadiri, Ata Allah; Asghari Moghaddam, Asghar; Tsai, Frank T.-C.; Dixon, Barnali
2013-10-01
Contamination of wells with nitrate-N (NO3-N) poses various threats to human health. Contamination of groundwater is a complex process, full of uncertainty at the regional scale. Development of an integrative vulnerability assessment methodology can help to effectively manage and protect this valuable freshwater source, including prioritizing limited resources to monitor high-risk areas. This study introduces a supervised committee machine with artificial intelligence (SCMAI) model to improve the DRASTIC method for groundwater vulnerability assessment in the Maragheh-Bonab plain aquifer in Iran. Four different AI models are considered in the SCMAI model, whose input is the DRASTIC parameters. The SCMAI model improves the committee machine with artificial intelligence (CMAI) model by replacing the linear combination in the CMAI with a nonlinear supervised ANN framework. To calibrate the AI models, NO3-N concentration data are divided into two datasets for training and validation purposes. The target value of the AI models in the training step is the corrected vulnerability indices that relate to the first NO3-N concentration dataset. After model training, the AI models are verified against the second NO3-N concentration dataset. The results show that all four AI models are able to improve the DRASTIC method. Since no single AI model's performance is dominant, the SCMAI model is used to combine the advantages of the individual AI models to achieve optimal performance. The SCMAI method re-predicts the groundwater vulnerability based on the different AI models' prediction values. The results show that the SCMAI outperforms the individual AI models and the CMAI model. The SCMAI model ensures that no water well with high NO3-N levels would be classified as low risk and vice versa.
The study concludes that the SCMAI model is an effective model to improve the DRASTIC model and provides a confident estimate of the pollution risk.
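The committee idea can be illustrated with the linear combination step of a CMAI-style ensemble: fit weights that best reproduce the observed target from the individual models' predictions. The SCMAI of the study replaces this least-squares combiner with a supervised ANN; all data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
truth = rng.uniform(0.0, 1.0, size=200)                 # "observed" vulnerability proxy
preds = np.column_stack([truth + rng.normal(0.0, s, 200)
                         for s in (0.05, 0.10, 0.20)])  # three synthetic AI models

# Fit linear committee weights on a training split, evaluate on the holdout
w, *_ = np.linalg.lstsq(preds[:150], truth[:150], rcond=None)
combined = preds[150:] @ w

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

print(round(rmse(combined, truth[150:]), 3),
      [round(rmse(preds[150:, j], truth[150:]), 3) for j in range(3)])
```

The fitted weights act roughly as inverse-variance weights, so the committee tends to beat its weaker members; a nonlinear supervised combiner can additionally exploit regimes where different members are locally best.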
On the Conditioning of Machine-Learning-Assisted Turbulence Modeling
NASA Astrophysics Data System (ADS)
Wu, Jinlong; Sun, Rui; Wang, Qiqi; Xiao, Heng
2017-11-01
Recently, several researchers have demonstrated that machine learning techniques can be used to improve the RANS-modeled Reynolds stress by training on available databases of high-fidelity simulations. However, obtaining an improved mean velocity field remains an unsolved challenge, restricting the predictive capability of current machine-learning-assisted turbulence modeling approaches. In this work we define a condition number to evaluate the model conditioning of data-driven turbulence modeling approaches, and propose a stability-oriented machine learning framework to model Reynolds stress. Two canonical flows, the flow in a square duct and the flow over periodic hills, are investigated. The satisfactory prediction of the mean velocity field for both flows demonstrates the predictive capability of the proposed framework for machine-learning-assisted turbulence modeling. By improving the prediction of the mean flow field, the proposed stability-oriented machine learning framework bridges the gap between existing machine-learning-assisted turbulence modeling approaches and the predictive capability demanded of turbulence models in real applications.
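Why conditioning matters here can be shown with a generic linearized operator mapping a Reynolds-stress-like input to a mean-velocity-like output (a numerical illustration, not the paper's specific operator): an ill-conditioned operator can amplify a small Reynolds-stress error into a large mean-velocity error, up to a factor of the condition number.

```python
import numpy as np

def amplification(A, tau, dtau):
    """Relative output error per unit relative input error for u = A @ tau."""
    u, u_pert = A @ tau, A @ (tau + dtau)
    rel_in = np.linalg.norm(dtau) / np.linalg.norm(tau)
    rel_out = np.linalg.norm(u_pert - u) / np.linalg.norm(u)
    return rel_out / rel_in

# An ill-conditioned linearized "Reynolds stress -> mean velocity" operator
A = np.diag([1.0, 1e-3])
tau = np.array([0.0, 1.0])     # stress along the weakly amplified direction
dtau = np.array([1e-6, 0.0])   # tiny error along the strongly amplified one
print(round(amplification(A, tau, dtau)))  # 1000, i.e. cond(A)
```

A small training error in the learned Reynolds stress is therefore no guarantee of a small mean-velocity error, which motivates evaluating and controlling the condition number of the modeling approach.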
Dynamic one-dimensional modeling of secondary settling tanks and design impacts of sizing decisions.
Li, Ben; Stenstrom, Michael K
2014-03-01
As one of the most significant components of the activated sludge process (ASP), secondary settling tanks (SSTs) can be investigated with mathematical models to optimize design and operation. This paper takes a new look at the one-dimensional (1-D) SST model by analyzing the impacts of numerical problems, especially on process robustness. An improved SST model with the Yee-Roe-Davis technique as the PDE solver is proposed and compared with the widely used Takács model to show its improvement in numerical solution quality. The improved and Takács models are coupled with a bioreactor model to reevaluate the ASP design basis and several popular control strategies for economic plausibility, contaminant removal efficiency, and system robustness. The time to failure due to a rising sludge blanket during overloading, a key robustness indicator, is analyzed to demonstrate the differences caused by numerical issues in SST models. The calculated results indicate that the Takács model significantly underestimates time to failure, thus leading to a conservative design. Copyright © 2013 Elsevier Ltd. All rights reserved.
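The classic Takács scheme that the Yee-Roe-Davis solver improves upon can be sketched for a closed batch column: each cell carries a gravity flux from the double-exponential settling velocity, and the interface flux is limited to the smaller of the two neighboring cell fluxes. Parameter values are typical literature values, assumed here purely for illustration.

```python
import numpy as np

def takacs_velocity(X, v0=474.0, rh=5.76e-4, rp=2.86e-3):
    """Takács double-exponential settling velocity (m/d); typical literature
    parameters, assumed here for illustration."""
    return np.clip(v0 * (np.exp(-rh * X) - np.exp(-rp * X)), 0.0, None)

def step(X, dz, dt):
    """One explicit step with the classic Takács flux limitation: the flux over
    an interface cannot exceed what either neighboring cell can sustain."""
    flux = takacs_velocity(X) * X             # gravity flux in each cell (g/m2/d)
    J = np.minimum(flux[:-1], flux[1:])       # limited interface flux
    J = np.concatenate([[0.0], J, [0.0]])     # closed top and bottom (batch column)
    return X - dt / dz * (J[1:] - J[:-1])

X = np.full(10, 3000.0)                        # g/m3, uniform initial blanket
for _ in range(100):
    X = step(X, dz=0.4, dt=1e-4)               # dt in days
print(round(X.sum(), 3))  # 30000.0 -- total mass is conserved
```

The min-flux limitation adds strong numerical dissipation, which is one reason a higher-resolution shock-capturing solver such as Yee-Roe-Davis can predict sludge-blanket dynamics, and hence time to failure, quite differently.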
Status of Air Quality in Central California and Needs for Further Study
NASA Astrophysics Data System (ADS)
Tanrikulu, S.; Beaver, S.; Soong, S.; Tran, C.; Jia, Y.; Matsuoka, J.; McNider, R. T.; Biazar, A. P.; Palazoglu, A.; Lee, P.; Wang, J.; Kang, D.; Aneja, V. P.
2012-12-01
Ozone and PM2.5 levels frequently exceed the NAAQS in central California (CC), and additional emission reductions are needed to attain and maintain the standards there. Agencies are developing cost-effective emission control strategies along with complementary incentive programs to reduce emissions when exceedances are forecasted. These approaches require accurate modeling and forecasting capabilities. A variety of models have been rigorously applied (MM5, WRF, CMAQ, CAMx) over CC. Despite the vast amount of land-based measurements from special field programs and significant effort, models have historically exhibited marginal performance. Satellite data may improve model performance by: establishing IC/BC over outlying areas of the modeling domain having unknown conditions; enabling FDDA over the Pacific Ocean to characterize important marine inflows and pollutant outflows; and filling in the gaps of the land-based monitoring network. BAAQMD, in collaboration with the NASA AQAST, plans to conduct four studies that include satellite-based data in CC air quality analysis and modeling. The first project enhances and refines the characterization of weather patterns, especially aloft, that impact summer ozone formation. Surface analyses were unable to characterize the strong attenuating and steering effect of the complex terrain on marine winds impinging on the continent. The dense summer clouds and fog over the Pacific Ocean form spatial patterns that can be related to the downstream air flows through polluted areas. The goal of this project is to explore, characterize, and quantify these relationships using cloud cover data. Specifically, cloud agreement statistics will be developed using satellite data and model clouds. Model skin temperature predictions will be compared to both MODIS and GOES skin temperatures. The second project evaluates and improves the initial and simulated fields of meteorological models that provide inputs to air quality models.
The study will attempt to determine whether a cloud dynamical adjustment developed by UAHuntsville can improve model performance for maritime stratus and whether a moisture adjustment scheme in the Pleim-Xiu boundary layer scheme can use satellite data in place of coarse surface air temperature measurements. The goal is to improve meteorological model performance that leads to improved air quality model performance. The third project evaluates and improves forecasting skills of the National Air Quality Forecasting Model in CC by using land-based routine measurements as well as satellite data. Local forecasts are mostly based on surface meteorological and air quality measurements and weather charts provided by NWS. The goal is to improve the average accuracy in forecasting exceedances, which is around 60%. The fourth project uses satellite data for monitoring trends in fine particulate matter (PM2.5) in the San Francisco Bay Area. It evaluates the effectiveness of a rule adopted in 2008 that restricts household wood burning on days forecasted to have high PM2.5 levels. The goal is to complement current analyses based on surface data covering the largest sub-regions and population centers. The overall goal is to use satellite data to overcome limitations of land-based measurements. The outcomes will be further conceptual understanding of pollutant formation, improved regulatory model performance, and better optimized forecasting programs.
NASA Astrophysics Data System (ADS)
Krishnamurthy, Lakshmi; Muñoz, Ángel G.; Vecchi, Gabriel A.; Msadek, Rym; Wittenberg, Andrew T.; Stern, Bill; Gudgel, Rich; Zeng, Fanrong
2018-05-01
The Caribbean low-level jet (CLLJ) is an important component of the atmospheric circulation over the Intra-Americas Sea (IAS) which impacts the weather and climate both locally and remotely. It influences rainfall variability in the Caribbean, Central America, northern South America, the tropical Pacific, and the continental United States through the transport of moisture. We make use of high-resolution coupled and uncoupled models from the Geophysical Fluid Dynamics Laboratory (GFDL) to investigate the simulation of the CLLJ and its teleconnections, and further compare with low-resolution models. The high-resolution coupled model FLOR shows improvements in the simulation of the CLLJ and its teleconnections with rainfall and SST over the IAS compared to the low-resolution coupled model CM2.1. The CLLJ is better represented in uncoupled models (AM2.1 and AM2.5) forced with observed sea-surface temperatures (SSTs), emphasizing the role of SSTs in the simulation of the CLLJ. Further, we determine the forecast skill for observed rainfall using both high- and low-resolution predictions of rainfall and SSTs for the July-August-September season. We determine the role of statistical correction of model biases, coupling, and horizontal resolution on forecast skill. Statistical correction dramatically improves area-averaged forecast skill, but the analysis of the spatial distribution of skill indicates that the improvement after statistical correction is region dependent. Forecast skill is sensitive to coupling in parts of the Caribbean, Central and northern South America, and it is mostly insensitive over North America. Comparison of forecast skill between the high- and low-resolution coupled models does not show any dramatic difference. However, among the uncoupled models, the high-resolution atmospheric model shows improved area-averaged skill compared to the lower-resolution model.
Understanding and improving the forecast skill over the IAS has important implications for highly vulnerable nations in the region.
NASA Astrophysics Data System (ADS)
Subramanian, Aneesh C.; Palmer, Tim N.
2017-06-01
Stochastic schemes to represent model uncertainty in the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system have helped improve its probabilistic forecast skill over the past decade, both by improving its reliability and by reducing the ensemble mean error. The largest uncertainties in the model arise from the model physics parameterizations. In the tropics, the parameterization of moist convection presents a major challenge for the accurate prediction of weather and climate. Superparameterization is a promising alternative strategy for including the effects of moist convection through explicit turbulent fluxes calculated from a cloud-resolving model (CRM) embedded within a global climate model (GCM). In this paper, we compare the impact of initial random perturbations in embedded CRMs, within the ECMWF ensemble prediction system, with the stochastically perturbed physical tendency (SPPT) scheme as a way to represent model uncertainty in medium-range tropical weather forecasts. We especially focus on forecasts of tropical convection and dynamics during the MJO events of October-November 2011. These are well-studied events for MJO dynamics, as they were also heavily observed during the DYNAMO field campaign. We show that a multiscale ensemble modeling approach helps improve forecasts of certain aspects of tropical convection during the MJO events, while it also tends to deteriorate certain large-scale dynamic fields relative to the SPPT approach used operationally at ECMWF.
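The core of the SPPT scheme is simple to sketch: the net physics tendency is multiplied by one plus a bounded, temporally correlated random pattern. This toy version uses a plain AR(1) field and omits the spectral space-time correlation structure of the operational scheme; all numbers are illustrative.

```python
import numpy as np

def ar1_pattern(shape, steps, phi=0.95, sigma=0.1, rng=None, clip=1.0):
    """Toy AR(1) evolution of a perturbation field; the operational SPPT scheme
    uses a spectral pattern with prescribed space-time correlations instead."""
    rng = rng if rng is not None else np.random.default_rng(0)
    r = np.zeros(shape)
    for _ in range(steps):
        r = phi * r + sigma * np.sqrt(1.0 - phi**2) * rng.standard_normal(shape)
    return np.clip(r, -clip, clip)

def sppt_perturb(tendency, r):
    """Multiply the net physics tendency by (1 + r), r a zero-mean bounded pattern."""
    return (1.0 + r) * tendency

r = ar1_pattern((8, 8), steps=50)
tend = np.full((8, 8), 2.0)          # e.g. a convective heating tendency (K/day)
pert = sppt_perturb(tend, r)
print(pert.shape, bool(np.all(np.abs(r) <= 1.0)))
```

Clipping |r| at 1 keeps the perturbed tendency from changing sign, and each ensemble member draws its own pattern, which is what spreads the ensemble.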
TU-CD-BRB-01: Normal Lung CT Texture Features Improve Predictive Models for Radiation Pneumonitis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krafft, S; The University of Texas Graduate School of Biomedical Sciences, Houston, TX; Briere, T
2015-06-15
Purpose: Existing normal tissue complication probability (NTCP) models for radiation pneumonitis (RP) traditionally rely on dosimetric and clinical data but are limited in terms of performance and generalizability. Extraction of pre-treatment image features provides a potential new category of data that can improve NTCP models for RP. We consider quantitative measures of total lung CT intensity and texture in a framework for prediction of RP. Methods: Available clinical and dosimetric data were collected for 198 NSCLC patients treated with definitive radiotherapy. Intensity- and texture-based image features were extracted from the T50 phase of the 4D-CT acquired for treatment planning. A total of 3888 features (15 clinical, 175 dosimetric, and 3698 image features) were gathered and considered candidate predictors for modeling of RP grade≥3. A baseline logistic regression model with mean lung dose (MLD) was first considered. Additionally, a least absolute shrinkage and selection operator (LASSO) logistic regression was applied to the set of clinical and dosimetric features, and subsequently to the full set of clinical, dosimetric, and image features. Model performance was assessed by comparing area under the curve (AUC). Results: A simple logistic fit of MLD was an inadequate model of the data (AUC∼0.5). Including clinical and dosimetric parameters within the framework of the LASSO resulted in improved performance (AUC=0.648). Analysis of the full cohort of clinical, dosimetric, and image features provided further and significant improvement in model performance (AUC=0.727). Conclusions: To achieve significant gains in predictive modeling of RP, new categories of data should be considered in addition to clinical and dosimetric features. We have successfully incorporated CT image features into a framework for modeling RP and have demonstrated improved predictive performance.
Validation and further investigation of CT image features in the context of RP NTCP modeling is warranted. This work was supported by the Rosalie B. Hite Fellowship in Cancer Research awarded to SPK.
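On synthetic data, the workflow described above (a baseline MLD-only logistic fit versus a LASSO logistic regression over a larger candidate set) might look like the following sketch. The feature names, effect sizes, and regularization strength are invented for illustration, not taken from the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 600
mld = rng.normal(20.0, 5.0, n)              # "mean lung dose" (Gy), weak signal
texture = rng.normal(0.0, 1.0, (n, 5))      # stand-in CT texture features
noise = rng.normal(0.0, 1.0, (n, 20))       # uninformative candidates
logit = 0.05 * (mld - 20.0) + 1.5 * texture[:, 0] - 1.0 * texture[:, 1]
y = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-(logit + np.log(0.25))))

X_full = np.column_stack([mld, texture, noise])
train, test = slice(0, 400), slice(400, None)

base = LogisticRegression().fit(mld[train, None], y[train])
lasso = LogisticRegression(penalty="l1", C=0.5, solver="liblinear")
lasso.fit(X_full[train], y[train])

auc_base = roc_auc_score(y[test], base.predict_proba(mld[test, None])[:, 1])
auc_full = roc_auc_score(y[test], lasso.predict_proba(X_full[test])[:, 1])
print(auc_full > auc_base, int(np.sum(lasso.coef_ != 0)), "features kept")
```

The L1 penalty drives many coefficients of uninformative candidates exactly to zero, which is what makes the LASSO workable when the candidate pool (3888 features here) dwarfs the event count.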
Chemistry-Transport Modeling of the Satellite Observed Distribution of Tropical Tropospheric Ozone
NASA Technical Reports Server (NTRS)
Peters, Wouter; Krol, Maarten; Dentener, Frank; Thompson, Anne M.; Leloeveld, Jos; Bhartia, P. K. (Technical Monitor)
2002-01-01
We have compared the 14-year record of satellite derived tropical tropospheric ozone columns (TTOC) from the NIMBUS-7 Total Ozone Mapping Spectrometer (TOMS) to TTOC calculated by a chemistry-transport model (CTM). An objective measure of error, based on the zonal distribution of TTOC in the tropics, is applied to perform this comparison systematically. In addition, the sensitivity of the model to several key processes in the tropics is quantified to select directions for future improvements. The comparisons indicate a widespread, systematic (20%) discrepancy over the tropical Atlantic Ocean, which maximizes during austral Spring. Although independent evidence from ozonesondes shows that some of the disagreement is due to satellite over-estimate of TTOC, the Atlantic mismatch is largely due to a misrepresentation of seasonally recurring processes in the model. Only minor differences between the model and observations over the Pacific occur, mostly due to interannual variability not captured by the model. Although chemical processes determine the TTOC extent, dynamical processes dominate the TTOC distribution, as the use of actual meteorology pertaining to the year of observations always leads to a better agreement with TTOC observations than using a random year or a climatology. The modeled TTOC is remarkably insensitive to many model parameters due to efficient feedbacks in the ozone budget. Nevertheless, the simulations would profit from an improved biomass burning calendar, as well as from an increase in NOX abundances in free tropospheric biomass burning plumes. The model showed the largest response to lightning NOX emissions, but systematic improvements could not be found. The use of multi-year satellite derived tropospheric data to systematically test and improve a CTM is a promising new addition to existing methods of model validation, and is a first step to integrating tropospheric satellite observations into global ozone modeling studies. 
Conversely, the CTM may suggest improvements to evolving satellite retrievals for tropospheric ozone.
NASA Earth Science Research Results for Improved Regional Crop Yield Prediction
NASA Astrophysics Data System (ADS)
Mali, P.; O'Hara, C. G.; Shrestha, B.; Sinclair, T. R.; G de Goncalves, L. G.; Salado Navarro, L. R.
2007-12-01
National agencies such as the USDA Foreign Agricultural Service (FAS) Production Estimates and Crop Assessment Division (PECAD) work specifically to analyze and generate timely crop yield estimates that help define national as well as global food policies. USDA/FAS/PECAD utilizes a decision support system (DSS) called CADRE (Crop Condition and Data Retrieval Evaluation), mainly through an automated database management system that integrates various meteorological datasets, crop and soil models, and remote sensing data, providing a significant contribution to national and international crop production estimates. The "Sinclair" soybean growth model has been used inside the CADRE DSS as one of the crop models. This project uses the Sinclair model (a semi-mechanistic crop growth model) for its potential to be used effectively in a geo-processing environment with remote-sensing-based inputs. The main objective of this work is to verify, validate, and benchmark current and future NASA earth science research results for the benefit of the operational decision making process of the PECAD/CADRE DSS. For this purpose, the NASA South American Land Data Assimilation System (SALDAS) meteorological dataset is tested for its applicability as a surrogate for the Sinclair model's meteorological input requirements. Similarly, products from the NASA MODIS sensor are tested for their applicability in improving crop yield prediction through more precise planting date estimation and plant vigor and growth monitoring. The project also analyzes the simulated Visible/Infrared Imager/Radiometer Suite (VIIRS, a future NASA sensor) vegetation product for its applicability in crop growth prediction, to accelerate the transition of VIIRS research results to operational use in the USDA/FAS/PECAD DSS.
The research results will help in providing improved decision making capacity to the USDA/FAS/PECAD DSS through improved vegetation growth monitoring from high spatial and temporal resolution remote sensing datasets; improved time-series meteorological inputs required for crop growth models; and regional prediction capability through geo-processing-based yield modeling.
Singal, Amit G.; Mukherjee, Ashin; Elmunzer, B. Joseph; Higgins, Peter DR; Lok, Anna S.; Zhu, Ji; Marrero, Jorge A; Waljee, Akbar K
2015-01-01
Background Predictive models for hepatocellular carcinoma (HCC) have been limited by modest accuracy and lack of validation. Machine learning algorithms offer a novel methodology, which may improve HCC risk prognostication among patients with cirrhosis. Our study's aim was to develop and compare predictive models for HCC development among cirrhotic patients, using conventional regression analysis and machine learning algorithms. Methods We enrolled 442 patients with Child A or B cirrhosis at the University of Michigan between January 2004 and September 2006 (UM cohort) and prospectively followed them until HCC development, liver transplantation, death, or study termination. Regression analysis and machine learning algorithms were used to construct predictive models for HCC development, which were tested on an independent validation cohort from the Hepatitis C Antiviral Long-term Treatment against Cirrhosis (HALT-C) Trial. Both models were also compared to the previously published HALT-C model. Discrimination was assessed using receiver operating characteristic curve analysis and diagnostic accuracy was assessed with net reclassification improvement and integrated discrimination improvement statistics. Results After a median follow-up of 3.5 years, 41 patients developed HCC. The UM regression model had a c-statistic of 0.61 (95%CI 0.56-0.67), whereas the machine learning algorithm had a c-statistic of 0.64 (95%CI 0.60–0.69) in the validation cohort. The machine learning algorithm had significantly better diagnostic accuracy as assessed by net reclassification improvement (p<0.001) and integrated discrimination improvement (p=0.04). The HALT-C model had a c-statistic of 0.60 (95%CI 0.50-0.70) in the validation cohort and was outperformed by the machine learning algorithm (p=0.047). Conclusion Machine learning algorithms improve the accuracy of risk stratifying patients with cirrhosis and can be used to accurately identify patients at high-risk for developing HCC. 
PMID:24169273
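The c-statistic reported in this record is the area under the ROC curve: the probability that a randomly chosen patient who developed HCC was assigned a higher predicted risk than a randomly chosen patient who did not. A minimal pure-Python illustration (toy data, not the study's code):

```python
def c_statistic(risks, labels):
    """C-statistic (ROC AUC): P(score_case > score_control), ties count 1/2.
    risks: predicted risks; labels: 1 = event (HCC), 0 = no event."""
    cases = [r for r, y in zip(risks, labels) if y == 1]
    controls = [r for r, y in zip(risks, labels) if y == 0]
    wins = 0.0
    for rc in cases:
        for rn in controls:
            if rc > rn:
                wins += 1.0
            elif rc == rn:
                wins += 0.5
    return wins / (len(cases) * len(controls))

# Toy example: a model that ranks the two events above most non-events.
risks = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 0, 1, 0, 0, 0]
print(c_statistic(risks, labels))  # 0.875
```

A c-statistic of 0.5 corresponds to random ranking, 1.0 to perfect discrimination, which is why the reported 0.61 vs 0.64 difference is modest in absolute terms.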
Clark, Matthew T.; Calland, James Forrest; Enfield, Kyle B.; Voss, John D.; Lake, Douglas E.; Moorman, J. Randall
2017-01-01
Background Charted vital signs and laboratory results represent intermittent samples of a patient’s dynamic physiologic state and have been used to calculate early warning scores to identify patients at risk of clinical deterioration. We hypothesized that the addition of cardiorespiratory dynamics measured from continuous electrocardiography (ECG) monitoring to intermittently sampled data improves the predictive validity of models trained to detect clinical deterioration prior to intensive care unit (ICU) transfer or unanticipated death. Methods and findings We analyzed 63 patient-years of ECG data from 8,105 acute care patient admissions at a tertiary care academic medical center. We developed models to predict deterioration resulting in ICU transfer or unanticipated death within the next 24 hours using either vital signs, laboratory results, or cardiorespiratory dynamics from continuous ECG monitoring and also evaluated models using all available data sources. We calculated the predictive validity (C-statistic), the net reclassification improvement, and the probability of achieving the difference in likelihood ratio χ2 for the additional degrees of freedom. The primary outcome occurred 755 times in 586 admissions (7%). We analyzed 395 clinical deteriorations with continuous ECG data in the 24 hours prior to an event. Using only continuous ECG measures resulted in a C-statistic of 0.65, similar to models using only laboratory results and vital signs (0.63 and 0.69 respectively). Addition of continuous ECG measures to models using conventional measurements improved the C-statistic by 0.01 and 0.07; a model integrating all data sources had a C-statistic of 0.73 with categorical net reclassification improvement of 0.09 for a change of 1 decile in risk. The difference in likelihood ratio χ2 between integrated models with and without cardiorespiratory dynamics was 2158 (p value: <0.001). 
Conclusions Cardiorespiratory dynamics from continuous ECG monitoring detect clinical deterioration in acute care patients and improve performance of conventional models that use only laboratory results and vital signs. PMID:28771487
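The categorical net reclassification improvement used above can be computed directly from risk-category movements between the two models. A hedged pure-Python sketch with toy data (not the authors' implementation):

```python
def categorical_nri(old_cat, new_cat, labels):
    """Categorical net reclassification improvement.
    old_cat/new_cat: risk-category index for each admission under the old and
    new model; labels: 1 = deterioration event, 0 = no event.
    NRI = (P(up|event) - P(down|event)) + (P(down|nonevent) - P(up|nonevent))."""
    up_e = down_e = n_e = 0
    up_n = down_n = n_n = 0
    for o, n, y in zip(old_cat, new_cat, labels):
        if y == 1:
            n_e += 1
            up_e += n > o
            down_e += n < o
        else:
            n_n += 1
            up_n += n > o
            down_n += n < o
    return (up_e - down_e) / n_e + (down_n - up_n) / n_n

# Toy example: events move between categories symmetrically (net 0),
# one non-event is correctly moved down a category.
nri = categorical_nri([0, 1, 2, 2, 1], [1, 1, 1, 1, 1], [1, 1, 1, 0, 0])
print(nri)  # 0.5
```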
Moss, Travis J; Clark, Matthew T; Calland, James Forrest; Enfield, Kyle B; Voss, John D; Lake, Douglas E; Moorman, J Randall
2017-01-01
Charted vital signs and laboratory results represent intermittent samples of a patient's dynamic physiologic state and have been used to calculate early warning scores to identify patients at risk of clinical deterioration. We hypothesized that the addition of cardiorespiratory dynamics measured from continuous electrocardiography (ECG) monitoring to intermittently sampled data improves the predictive validity of models trained to detect clinical deterioration prior to intensive care unit (ICU) transfer or unanticipated death. We analyzed 63 patient-years of ECG data from 8,105 acute care patient admissions at a tertiary care academic medical center. We developed models to predict deterioration resulting in ICU transfer or unanticipated death within the next 24 hours using either vital signs, laboratory results, or cardiorespiratory dynamics from continuous ECG monitoring and also evaluated models using all available data sources. We calculated the predictive validity (C-statistic), the net reclassification improvement, and the probability of achieving the difference in likelihood ratio χ2 for the additional degrees of freedom. The primary outcome occurred 755 times in 586 admissions (7%). We analyzed 395 clinical deteriorations with continuous ECG data in the 24 hours prior to an event. Using only continuous ECG measures resulted in a C-statistic of 0.65, similar to models using only laboratory results and vital signs (0.63 and 0.69 respectively). Addition of continuous ECG measures to models using conventional measurements improved the C-statistic by 0.01 and 0.07; a model integrating all data sources had a C-statistic of 0.73 with categorical net reclassification improvement of 0.09 for a change of 1 decile in risk. The difference in likelihood ratio χ2 between integrated models with and without cardiorespiratory dynamics was 2158 (p value: <0.001). 
Cardiorespiratory dynamics from continuous ECG monitoring detect clinical deterioration in acute care patients and improve performance of conventional models that use only laboratory results and vital signs.
NASA Astrophysics Data System (ADS)
Chaudhari, Rajan; Heim, Andrew J.; Li, Zhijun
2015-05-01
As evidenced by the three rounds of the G-protein coupled receptor (GPCR) Dock competitions, improving homology modeling methods for helical transmembrane proteins, including the GPCRs, based on templates of low sequence identity remains an eminent challenge. Current approaches addressing this challenge adopt the philosophy of "modeling first, refinement next". In the present work, we developed an alternative modeling approach through the novel application of available multiple templates. First, conserved inter-residue interactions are derived from each additional template through conservation analysis of each template-target pairwise alignment. Then, these interactions are converted into distance restraints and incorporated in the homology modeling process. This approach was applied to modeling of the human β2 adrenergic receptor using bovine rhodopsin and the human protease-activated receptor 1 as templates, and improved model quality was demonstrated compared to the homology models generated by standard single-template and multiple-template methods. This method of "refined restraints first, modeling next" provides a fast approach complementary to current modeling methods. It allows rational identification and implementation of additional conserved distance restraints extracted from multiple templates and/or experimental data, and has the potential to be applicable to modeling of all helical transmembrane proteins.
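The core step, converting contacts observed in an additional template into distance restraints for the target, can be sketched roughly as below. The function name, input layout, contact cutoff, and tolerance are illustrative assumptions, not details taken from the paper:

```python
import math

def contact_restraints(template_coords, pairing, contact_cutoff=8.0, tol=1.0):
    """Derive target distance restraints from one extra template (toy sketch).
    template_coords: {template_resnum: (x, y, z)} CA coordinates;
    pairing: {template_resnum: target_resnum} from the pairwise alignment.
    Returns (target_i, target_j, lower, upper) for aligned residue pairs whose
    template CA-CA distance is within contact_cutoff."""
    restraints = []
    aligned = sorted(pairing)
    for a_idx, i in enumerate(aligned):
        for j in aligned[a_idx + 1:]:
            d = math.dist(template_coords[i], template_coords[j])
            if d <= contact_cutoff:
                # Harmonic/flat-bottom restraint centred on the template distance.
                restraints.append((pairing[i], pairing[j], d - tol, d + tol))
    return restraints

# Toy template: residues 1 and 2 are in contact (5 A apart), residue 3 is far.
coords = {1: (0.0, 0.0, 0.0), 2: (3.0, 4.0, 0.0), 3: (30.0, 0.0, 0.0)}
pairing = {1: 10, 2: 20, 3: 30}
print(contact_restraints(coords, pairing))  # [(10, 20, 4.0, 6.0)]
```

In practice such restraints would be emitted only for pairs whose interaction is conserved across the template-target alignment, then fed to the modeling engine.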
Liu, Hua; Wu, Wen
2017-01-01
For improving the tracking accuracy and model switching speed of maneuvering target tracking in nonlinear systems, a new algorithm named the interacting multiple model fifth-degree spherical simplex-radial cubature Kalman filter (IMM5thSSRCKF) is proposed in this paper. The new algorithm is a combination of the interacting multiple model (IMM) filter and the fifth-degree spherical simplex-radial cubature Kalman filter (5thSSRCKF). The proposed algorithm uses a Markov process to describe the switching probability among the models, and uses the 5thSSRCKF to deal with the state estimation of each model. The 5thSSRCKF is an improved filter algorithm, which utilizes the fifth-degree spherical simplex-radial rule to improve the filtering accuracy. Finally, the tracking performance of the IMM5thSSRCKF is evaluated by simulation in a typical maneuvering target tracking scenario. Simulation results show that the proposed algorithm has better tracking performance and quicker model switching when handling maneuvering targets than the interacting multiple model unscented Kalman filter (IMMUKF), the interacting multiple model cubature Kalman filter (IMMCKF) and the interacting multiple model fifth-degree cubature Kalman filter (IMM5thCKF). PMID:28608843
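The IMM model-probability update that drives model switching combines the Markov transition matrix with each per-model filter's measurement likelihood. A simplified sketch (the per-model cubature filters themselves are omitted; all numbers are illustrative):

```python
def imm_model_probs(mu, trans, likelihoods):
    """One IMM model-probability update (pure-Python sketch).
    mu: prior model probabilities; trans[i][j] = P(model j | model i) from the
    Markov chain; likelihoods[j]: measurement likelihood under model j's filter."""
    n = len(mu)
    # Predicted model probabilities after a Markov transition step.
    predicted = [sum(mu[i] * trans[i][j] for i in range(n)) for j in range(n)]
    # Bayes update with each model's measurement likelihood, then normalize.
    unnorm = [likelihoods[j] * predicted[j] for j in range(n)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Toy two-model case: constant-velocity model currently dominant (0.9), but the
# latest measurement is far more likely under the maneuver model.
mu = imm_model_probs([0.9, 0.1],
                     [[0.95, 0.05], [0.05, 0.95]],
                     [0.2, 2.0])
print(mu)  # maneuver model probability now exceeds 0.6
```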
Powathil, Gibin G; Swat, Maciej; Chaplain, Mark A J
2015-02-01
The multiscale complexity of cancer as a disease necessitates a corresponding multiscale modelling approach to produce truly predictive mathematical models capable of improving existing treatment protocols. To capture all the dynamics of solid tumour growth and its progression, mathematical modellers need to couple biological processes occurring at various spatial and temporal scales (from genes to tissues). Because the effectiveness of cancer therapy is considerably affected by intracellular and extracellular heterogeneities as well as by dynamical changes in the tissue microenvironment, any modelling attempt to optimise existing protocols must consider these factors, ultimately leading to improved multimodal treatment regimes. By improving existing and building new mathematical models of cancer, modellers can play an important role in preventing the use of potentially sub-optimal treatment combinations. In this paper, we analyse a multiscale computational mathematical model for cancer growth and spread, incorporating the multiple effects of radiation therapy and chemotherapy in the patient survival probability, and implement the model using two different cell-based modelling techniques. We show that the insights provided by such multiscale modelling approaches can ultimately help in designing optimal patient-specific multi-modality treatment protocols that may increase patients' quality of life. Copyright © 2014 Elsevier Ltd. All rights reserved.
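Radiotherapy response in multiscale tumour models is commonly represented by the linear-quadratic (LQ) cell-survival model; the abstract does not state the exact form used, so the following is an assumed illustration, with illustrative radiosensitivity parameters:

```python
import math

def lq_survival(dose_gy, alpha=0.3, beta=0.03):
    """Linear-quadratic surviving fraction after a single dose (Gy):
    S(d) = exp(-(alpha*d + beta*d^2)). alpha, beta are illustrative values."""
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

def fractionated_survival(n_fractions, dose_per_fraction, alpha=0.3, beta=0.03):
    """Survival after n equal fractions, assuming full repair between fractions."""
    return lq_survival(dose_per_fraction, alpha, beta) ** n_fractions

# A conventional 60 Gy course delivered as 30 fractions of 2 Gy.
surv = fractionated_survival(30, 2.0)
```

In a multiscale setting, each simulated cell would draw its own alpha and beta from its intracellular state, which is exactly the kind of heterogeneity the abstract argues must be captured.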
NASA Astrophysics Data System (ADS)
Jackson-Blake, L.
2014-12-01
Process-based catchment water quality models are increasingly used as tools to inform land management. However, for such models to be reliable they need to be well calibrated and shown to reproduce key catchment processes. Calibration can be challenging for process-based models, which tend to be complex and highly parameterised. Calibrating a large number of parameters generally requires a large amount of monitoring data, but even in well-studied catchments, streams are often only sampled at a fortnightly or monthly frequency. The primary aim of this study was therefore to investigate how the quality and uncertainty of model simulations produced by one process-based catchment model, INCA-P (the INtegrated CAtchment model of Phosphorus dynamics), were improved by calibration to higher frequency water chemistry data. Two model calibrations were carried out for a small rural Scottish catchment: one using 18 months of daily total dissolved phosphorus (TDP) concentration data, another using a fortnightly dataset derived from the daily data. To aid comparability, calibrations were carried out automatically using the MCMC-DREAM algorithm. Using daily rather than fortnightly data resulted in improved simulation of the magnitude of peak TDP concentrations, in turn resulting in improved model performance statistics. Marginal posteriors were better constrained by the higher frequency data, resulting in a large reduction in parameter-related uncertainty in simulated TDP (the 95% credible interval decreased from 26 to 6 μg/l). The number of parameters that could be reliably auto-calibrated was lower for the fortnightly data, leading to the recommendation that parameters should not be varied spatially for models such as INCA-P unless there is solid evidence that this is appropriate, or there is a real need to do so for the model to fulfil its purpose. 
Secondary study aims were to highlight the subjective elements involved in auto-calibration and suggest practical improvements that could make models such as INCA-P more suited to auto-calibration and uncertainty analyses. Two key improvements include model simplification, so that all model parameters can be included in an analysis of this kind, and better documenting of recommended ranges for each parameter, to help in choosing sensible priors.
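The auto-calibration idea above can be illustrated with a plain random-walk Metropolis sampler, a deliberately simplified stand-in for the MCMC-DREAM algorithm actually used, applied to a toy one-parameter posterior:

```python
import math
import random

def metropolis(log_post, theta0, step, n_iter, seed=1):
    """Random-walk Metropolis sampler (simplified stand-in for MCMC-DREAM).
    log_post: log-posterior of the parameter vector; step: proposal std devs."""
    random.seed(seed)
    theta = list(theta0)
    lp = log_post(theta)
    chain = []
    for _ in range(n_iter):
        prop = [t + random.gauss(0.0, s) for t, s in zip(theta, step)]
        lp_prop = log_post(prop)
        # Accept with probability min(1, exp(lp_prop - lp)).
        if math.log(random.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(list(theta))
    return chain

# Toy "calibration": the posterior is Gaussian around the true parameter 3.0,
# standing in for a well-constrained INCA-P parameter under daily data.
chain = metropolis(lambda t: -0.5 * (t[0] - 3.0) ** 2, [0.0], [1.0], 5000)
post = [t[0] for t in chain[1000:]]  # discard burn-in
mean = sum(post) / len(post)
```

The paper's point about data frequency maps onto this sketch directly: more informative data sharpens `log_post`, which narrows the sampled marginal and hence the credible interval.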
Improved model quality assessment using ProQ2.
Ray, Arjun; Lindahl, Erik; Wallner, Björn
2012-09-10
Employing methods to assess the quality of modeled protein structures is now standard practice in bioinformatics. In a broad sense, the techniques can be divided into methods relying on consensus prediction on the one hand, and single-model methods on the other. Consensus methods frequently perform very well when there is a clear consensus, but this is not always the case. In particular, they frequently fail to select the best possible model in the hard cases (lacking consensus) or in the easy cases where models are very similar. In contrast, single-model methods do not suffer from these drawbacks and could potentially be applied to any protein of interest to assess quality or as a scoring function for sampling-based refinement. Here, we present a new single-model method, ProQ2, based on ideas from its predecessor, ProQ. ProQ2 is a model quality assessment algorithm that uses support vector machines to predict local as well as global quality of protein models. Improved performance is obtained by combining previously used features with updated structural and predicted features. The most important contribution can be attributed to the use of profile weighting of the residue-specific features and the use of features averaged over the whole model, even though the prediction is still local. ProQ2 is significantly better than its predecessors at detecting high quality models, improving the sum of Z-scores for the selected first-ranked models by 20% and 32% compared to the second-best single-model method in CASP8 and CASP9, respectively. The absolute quality assessment of the models at both local and global level is also improved. The Pearson's correlation between the correct and locally predicted score is improved from 0.59 to 0.70 on CASP8 and from 0.62 to 0.68 on CASP9; for the global score against the correct GDT_TS, from 0.75 to 0.80 and from 0.77 to 0.80, again compared to the second-best single-model methods in CASP8 and CASP9, respectively.
ProQ2 is available at http://proq2.wallnerlab.org.
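The Pearson correlations quoted between predicted and correct quality scores can be reproduced with a few lines of pure Python (an illustrative helper, not part of ProQ2):

```python
import math

def pearson(xs, ys):
    """Pearson correlation between predicted and true quality scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Perfectly linearly related scores correlate at 1.0.
print(pearson([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))  # 1.0
```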
CHENG, JIANLIN; EICKHOLT, JESSE; WANG, ZHENG; DENG, XIN
2013-01-01
After decades of research, protein structure prediction remains a very challenging problem. In order to address the different levels of complexity of structural modeling, two types of modeling techniques — template-based modeling and template-free modeling — have been developed. Template-based modeling can often generate a moderate- to high-resolution model when a similar, homologous template structure is found for a query protein but fails if no template or only incorrect templates are found. Template-free modeling, such as fragment-based assembly, may generate models of moderate resolution for small proteins of low topological complexity. Seldom have the two techniques been integrated to improve protein modeling. Here we develop a recursive protein modeling approach to selectively and collaboratively apply template-based and template-free modeling methods to model template-covered (i.e. certain) and template-free (i.e. uncertain) regions of a protein. A preliminary implementation of the approach was tested on a number of hard modeling cases during the 9th Critical Assessment of Techniques for Protein Structure Prediction (CASP9) and successfully improved the quality of modeling in most of these cases. Recursive modeling can significantly reduce the complexity of protein structure modeling and integrate template-based and template-free modeling to improve the quality and efficiency of protein structure prediction. PMID:22809379
ERIC Educational Resources Information Center
Herlihy, Corinne M.; Kemple, James J.
2004-01-01
The Talent Development Middle School model was created to make a difference in struggling urban middle schools. The model is part of a trend in school improvement strategies whereby whole-school reform projects aim to improve performance and attendance outcomes for students through the use of major changes in both the organizational structure and…
ISMS: A New Model for Improving Student Motivation and Self-Esteem in Primary Education
ERIC Educational Resources Information Center
Ghilay, Yaron; Ghilay, Ruth
2015-01-01
In this study we introduce a new model for primary education called ISMS: Improving Student Motivation and Self-esteem. Following a two-year study undertaken in a primary school (n = 67), the new model was found to be successful. Students who participated in the research, reported that a course based on ISMS principles was very helpful for…
Joel W. Homan; Charles H. Luce; James P. McNamara; Nancy F. Glenn
2011-01-01
Describing the spatial variability of heterogeneous snowpacks at a watershed or mountain-front scale is important for improvements in large-scale snowmelt modelling. Snowmelt depletion curves, which relate fractional decreases in snow-covered area (SCA) against normalized decreases in snow water equivalent (SWE), are a common approach to scale-up snowmelt models....
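A snowmelt depletion curve maps normalized SWE to fractional SCA. A hypothetical power-law form is sketched below; the functional form and shape parameter are illustrative assumptions, not taken from the study:

```python
def depletion_sca(swe_norm, shape=1.5):
    """Fractional snow-covered area from normalized SWE (SWE / SWE_max),
    using a simple power-law depletion curve. shape > 1 means SCA stays
    high while SWE drops early in the melt season (illustrative parameter)."""
    swe_norm = min(max(swe_norm, 0.0), 1.0)  # clamp to the physical range
    return swe_norm ** (1.0 / shape)

# Midway through melt (half the peak SWE gone), most of the area is still snow-covered.
print(depletion_sca(0.5))  # about 0.63
```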
Lunar Reconnaissance Orbiter Orbit Determination Accuracy Analysis
NASA Technical Reports Server (NTRS)
Slojkowski, Steven E.
2014-01-01
Results from operational orbit determination (OD) produced by the NASA Goddard Flight Dynamics Facility for the LRO nominal and extended mission are presented. During the LRO nominal mission, when LRO flew in a low circular orbit, orbit determination requirements were met nearly 100% of the time. When the extended mission began, LRO returned to a more elliptical frozen orbit where gravity and other modeling errors caused numerous violations of mission accuracy requirements. Prediction accuracy is particularly challenged during periods when LRO is in full-Sun. A series of improvements to LRO orbit determination are presented, including implementation of new lunar gravity models, improved spacecraft solar radiation pressure modeling using a dynamic multi-plate area model, a shorter orbit determination arc length, and a constrained plane method for estimation. The analysis presented in this paper shows that updated lunar gravity models improved accuracy in the frozen orbit, and a multi-plate dynamic area model improves prediction accuracy during full-Sun orbit periods. Implementation of a 36-hour tracking data arc and plane constraints during edge-on orbit geometry also provide benefits. A comparison of the operational solutions to precision orbit determination solutions shows agreement at the 100- to 250-meter level in definitive accuracy.
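A multi-plate solar radiation pressure model sums, plate by plate, the basic radiation-pressure acceleration. The single-plate ("cannonball") magnitude is sketched below with illustrative area, mass, and reflectivity values, not LRO's actual properties:

```python
SOLAR_FLUX = 1361.0    # W/m^2, solar irradiance at 1 AU
C_LIGHT = 299792458.0  # m/s

def srp_accel(area_m2, mass_kg, cr=1.3, flux=SOLAR_FLUX):
    """Cannonball solar-radiation-pressure acceleration magnitude (m/s^2):
    a = Cr * (flux / c) * A / m. A multi-plate model evaluates this per plate
    with each plate's Sun-projected area and reflectivity, then sums vectors."""
    return cr * (flux / C_LIGHT) * area_m2 / mass_kg

# Illustrative spacecraft: 10 m^2 projected area, 1000 kg.
a = srp_accel(10.0, 1000.0)  # on the order of 1e-8 m/s^2
```

Small as this acceleration is, it acts continuously during full-Sun periods, which is consistent with the abstract's observation that SRP mismodeling dominates prediction error in those orbits.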
Bayesian logistic regression approaches to predict incorrect DRG assignment.
Suleiman, Mani; Demirhan, Haydar; Boyd, Leanne; Girosi, Federico; Aksakalli, Vural
2018-05-07
Episodes of care involving similar diagnoses and treatments and requiring similar levels of resource utilisation are grouped to the same Diagnosis-Related Group (DRG). In jurisdictions which implement DRG based payment systems, DRGs are a major determinant of funding for inpatient care. Hence, service providers often dedicate auditing staff to the task of checking that episodes have been coded to the correct DRG. The use of statistical models to estimate an episode's probability of DRG error can significantly improve the efficiency of clinical coding audits. This study implements Bayesian logistic regression models with weakly informative prior distributions to estimate the likelihood that episodes require a DRG revision, comparing these models with each other and to classical maximum likelihood estimates. All Bayesian approaches had more stable model parameters than maximum likelihood. The best performing Bayesian model improved overall classification performance by 6% compared to maximum likelihood, with a 34% gain compared to random classification. We found that the original DRG, the coder and the day of coding all have a significant effect on the likelihood of DRG error. Use of Bayesian approaches has improved model parameter stability and classification accuracy. This method has already led to improved audit efficiency in an operational capacity.
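A weakly informative prior enters as a penalty added to the logistic log-likelihood. The sketch below evaluates the resulting log-posterior; the Normal(0, 2.5) prior scale is a common convention assumed here, not a detail from the paper:

```python
import math

def log_posterior(beta, X, y, prior_sd=2.5):
    """Log-posterior (up to a constant) for logistic regression with
    independent Normal(0, prior_sd) priors on each coefficient, a common
    weakly informative choice. X: rows of predictors; y: 0/1 DRG-revision flags."""
    # Gaussian log-prior: shrinks coefficients toward zero, stabilising estimates.
    lp = sum(-0.5 * (b / prior_sd) ** 2 for b in beta)
    # Bernoulli log-likelihood with logit link.
    for xi, yi in zip(X, y):
        eta = sum(b * x for b, x in zip(beta, xi))
        lp += yi * eta - math.log(1.0 + math.exp(eta))
    return lp

# Toy data: intercept plus one predictor, two episodes.
X = [[1.0, 0.0], [1.0, 1.0]]
y = [0, 1]
print(log_posterior([0.0, 0.0], X, y))  # -2*ln(2), all probabilities 0.5
```

Maximising this function (e.g. by gradient ascent or MCMC) gives the posterior-mode coefficients; the prior term is what produces the parameter stability the abstract reports over plain maximum likelihood.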
Protocols for Molecular Modeling with Rosetta3 and RosettaScripts
2016-01-01
Previously, we published an article providing an overview of the Rosetta suite of biomacromolecular modeling software and a series of step-by-step tutorials [Kaufmann, K. W., et al. (2010) Biochemistry 49, 2987–2998]. The overwhelmingly positive response we received to this publication motivates us to share here the next iteration of these tutorials, featuring de novo folding, comparative modeling, loop construction, protein docking, small molecule docking, and protein design. This updated and expanded set of tutorials is needed, as since 2010 Rosetta has been fully redesigned into an object-oriented protein modeling program, Rosetta3. Notable improvements include a substantially improved energy function, an XML-like language termed “RosettaScripts” for flexibly specifying modeling tasks, new analysis tools, the addition of the TopologyBroker to control conformational sampling, and support for multiple templates in comparative modeling. Rosetta’s ability to model systems with symmetric proteins, membrane proteins, noncanonical amino acids, and RNA has also been greatly expanded and improved. PMID:27490953
Enhanced stability of car-following model upon incorporation of short-term driving memory
NASA Astrophysics Data System (ADS)
Liu, Da-Wei; Shi, Zhong-Ke; Ai, Wen-Huan
2017-06-01
Based on the full velocity difference model, a new car-following model is developed in this paper to investigate the effect of short-term driving memory on traffic flow. Short-term driving memory is introduced as an influence factor of the driver's anticipation behavior. The stability condition of the newly developed model is derived, and the modified Korteweg-de Vries (mKdV) equation is constructed to describe the traffic behavior near the critical point. Via a numerical method, the evolution of a small perturbation is investigated first. The results show that the new car-following model improves traffic stability relative to the previous models. Starting and braking processes of vehicles at a signalized intersection are also investigated. The numerical simulations illustrate that the new model can successfully describe the driver's anticipation behavior, and that the efficiency and safety of the vehicles passing through the signalized intersection are improved by considering short-term driving memory.
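The underlying full velocity difference (FVD) model computes each follower's acceleration from an optimal-velocity term plus a velocity-difference term, a = kappa*[V(gap) - v] + lambda*dv. The sketch below uses a commonly cited optimal-velocity function; parameter values are illustrative, not the paper's calibration, and the memory extension is omitted:

```python
import math

def fvd_accel(gap, v, dv, kappa=0.41, lam=0.5):
    """Full velocity difference model acceleration for one follower.
    gap: headway to the leader (m); v: own speed (m/s);
    dv: leader speed minus own speed (m/s). Parameters are illustrative."""
    # A widely used hyperbolic-tangent optimal-velocity function.
    v_opt = 6.75 + 7.91 * math.tanh(0.13 * (gap - 5.0) - 1.57)
    return kappa * (v_opt - v) + lam * dv

# Stopped car with a large gap ahead accelerates; fast car at a small gap brakes.
print(fvd_accel(100.0, 0.0, 0.0) > 0, fvd_accel(5.0, 20.0, 0.0) < 0)  # True True
```

A simulation would apply this in an Euler step per vehicle (v += a*dt; x += v*dt), which is how the perturbation-evolution and intersection experiments in the abstract are typically run.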
Thermal modelling using discrete vasculature for thermal therapy: a review
Kok, H.P.; Gellermann, J.; van den Berg, C.A.T.; Stauffer, P.R.; Hand, J.W.; Crezee, J.
2013-01-01
Reliable temperature information during clinical hyperthermia and thermal ablation is essential for adequate treatment control, but conventional temperature measurements do not provide 3D temperature information. Treatment planning is a very useful tool to improve treatment quality and substantial progress has been made over the last decade. Thermal modelling is a very important and challenging aspect of hyperthermia treatment planning. Various thermal models have been developed for this purpose, with varying complexity. Since blood perfusion is such an important factor in thermal redistribution of energy in in vivo tissue, thermal simulations are most accurately performed by modelling discrete vasculature. This review describes the progress in thermal modelling with discrete vasculature for the purpose of hyperthermia treatment planning and thermal ablation. There has been significant progress in thermal modelling with discrete vasculature. Recent developments have made real-time simulations possible, which can provide feedback during treatment for improved therapy. Future clinical application of thermal modelling with discrete vasculature in hyperthermia treatment planning is expected to further improve treatment quality. PMID:23738700
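The continuum baseline that discrete-vasculature models refine is the Pennes bioheat equation, rho*c*dT/dt = k*d2T/dx2 + w_b*c_b*(T_a - T) + Q, in which all perfusion is lumped into a single heat-sink term. A one-dimensional explicit finite-difference step is sketched below; all parameter values are illustrative tissue properties, not from the review:

```python
def pennes_step(T, dx, dt, k=0.5, rho_c=3.6e6, w_cb=2000.0, T_art=37.0, q=0.0):
    """One explicit finite-difference step of the 1D Pennes bioheat equation.
    T: node temperatures (deg C); dx: node spacing (m); dt: time step (s);
    w_cb: lumped perfusion term w_b*c_b (W/m^3/K); q: absorbed power (W/m^3).
    Boundary nodes are held fixed. Parameter values are illustrative."""
    new = list(T)
    for i in range(1, len(T) - 1):
        cond = k * (T[i - 1] - 2 * T[i] + T[i + 1]) / dx ** 2  # conduction
        perf = w_cb * (T_art - T[i])  # perfusion heat sink toward arterial temp
        new[i] = T[i] + dt * (cond + perf + q) / rho_c
    return new

# A locally heated node relaxes back toward body temperature.
out = pennes_step([37.0, 45.0, 37.0], 0.001, 1.0)
```

Discrete-vasculature models replace the uniform `perf` term with explicit heat exchange along individual vessel segments, which is why they resolve the local cooling tracks that the Pennes model smears out.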
Terminal Area Productivity Airport Wind Analysis and Chicago O'Hare Model Description
NASA Technical Reports Server (NTRS)
Hemm, Robert; Shapiro, Gerald
1998-01-01
This paper describes two results from a continuing effort to provide accurate cost-benefit analyses of the NASA Terminal Area Productivity (TAP) program technologies. Previous tasks have developed airport capacity and delay models and completed preliminary cost benefit estimates for TAP technologies at 10 U.S. airports. This task covers two improvements to the capacity and delay models. The first improvement is the completion of a detailed model set for the Chicago O'Hare (ORD) airport. Previous analyses used a more general model to estimate the benefits for ORD. This paper contains a description of the model details with results corresponding to current conditions. The second improvement is the development of specific wind speed and direction criteria for use in the delay models to predict when the Aircraft Vortex Spacing System (AVOSS) will allow use of reduced landing separations. This paper includes a description of the criteria and an estimate of AVOSS utility for 10 airports based on analysis of 35 years of weather data.
Improvements in continuum modeling for biomolecular systems
NASA Astrophysics Data System (ADS)
Yu, Qiao; Ben-Zhuo, Lu
2016-01-01
Modeling of biomolecular systems plays an essential role in understanding biological processes, such as ionic flow across channels, protein modification or interaction, and cell signaling. The continuum model described by the Poisson-Boltzmann (PB)/Poisson-Nernst-Planck (PNP) equations has made great contributions towards simulation of these processes. However, the model has shortcomings in its commonly used form and cannot capture (or cannot accurately capture) some important physical properties of the biological systems. Considerable efforts have been made to improve the continuum model to account for discrete particle interactions and to make progress in numerical methods to provide accurate and efficient simulations. This review will summarize recent main improvements in continuum modeling for biomolecular systems, with focus on the size-modified models, the coupling of the classical density functional theory and the PNP equations, the coupling of polar and nonpolar interactions, and numerical progress. Project supported by the National Natural Science Foundation of China (Grant No. 91230106) and the Chinese Academy of Sciences Program for Cross & Cooperative Team of the Science & Technology Innovation.
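In the PB model, equilibrium ion concentrations follow the Boltzmann factor, c_i = c_bulk * exp(-z_i * e * phi / (k_B * T)); the size-modified variants discussed in the review correct exactly this relation at high potentials. A direct numerical illustration of the classical form:

```python
import math

K_B = 1.380649e-23       # Boltzmann constant, J/K
E_CHG = 1.602176634e-19  # elementary charge, C

def boltzmann_conc(c_bulk, z, phi, T=298.15):
    """Local ion concentration from the Boltzmann factor (classical PB model).
    c_bulk: bulk concentration (any unit); z: ion valence;
    phi: local electrostatic potential (V); T: temperature (K)."""
    return c_bulk * math.exp(-z * E_CHG * phi / (K_B * T))

# Near a negative potential of -25 mV, cations are enriched, anions depleted.
cation = boltzmann_conc(0.1, +1, -0.025)  # > 0.1 M
anion = boltzmann_conc(0.1, -1, -0.025)   # < 0.1 M
```

The unbounded exponential is the known flaw this form has at strongly charged surfaces (predicted concentrations exceed steric packing limits), which motivates the size-modified models the review surveys.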
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thorne, P.D.; Chamness, M.A.; Vermeul, V.R.
This report documents work conducted during fiscal year 1994 to develop an improved three-dimensional conceptual model of ground-water flow in the unconfined aquifer system across the Hanford Site, in support of the Ground-Water Surveillance Project, which is managed by Pacific Northwest Laboratory. The main objective of the ongoing effort to develop an improved conceptual model of ground-water flow is to provide the basis for improved numerical models that will be capable of accurately predicting the movement of radioactive and chemical contaminant plumes in the aquifer beneath Hanford. More accurate ground-water flow models will also be useful in assessing the impacts of changes in facilities and operations. For example, decreasing volumes of operational waste-water discharge are resulting in a declining water table in parts of the unconfined aquifer. In addition to supporting numerical modeling, the conceptual model also provides a qualitative understanding of the movement of ground water and contaminants in the aquifer.
NASA Technical Reports Server (NTRS)
Sanchez, Braulio V.
1990-01-01
The Japanese Experimental Geodetic Satellite Ajisai was launched on August 12, 1986. In response to the TOPEX-POSEIDON mission requirements, the GSFC Space Geodesy Branch and its associates are producing improved models of the Earth's gravitational field. With the launch of Ajisai, precise laser data are now available which can be used to test many current gravity models. Testing of the various gravity field models shows improvements of more than 70 percent in the orbital fits when using GEM-T1 and GEM-T2 relative to results obtained with the earlier GEM-10B model. The GEM-T2 orbital fits are at the 13-cm level (RMS). The results of the tests with the various versions of the GEM-T1 model indicate that the addition of satellite altimetry and surface gravity anomalies as additional data types should improve future gravity field models.
Evaluation of airborne lidar data to predict vegetation Presence/Absence
Palaseanu-Lovejoy, M.; Nayegandhi, A.; Brock, J.; Woodman, R.; Wright, C.W.
2009-01-01
This study evaluates the capabilities of the Experimental Advanced Airborne Research Lidar (EAARL) in delineating vegetation assemblages in Jean Lafitte National Park, Louisiana. Five-meter-resolution grids of bare earth, canopy height, canopy-reflection ratio, and height of median energy were derived from EAARL data acquired in September 2006. Ground-truth data were collected along transects to assess species composition, canopy cover, and ground cover. To decide which model is more accurate, comparisons of general linear models and generalized additive models were conducted using conventional evaluation methods (i.e., sensitivity, specificity, Kappa statistics, and area under the curve) and two new indexes, net reclassification improvement and integrated discrimination improvement. Generalized additive models were superior to general linear models in modeling presence/absence in training vegetation categories, but no statistically significant differences between the two models were achieved in determining the classification accuracy at validation locations using conventional evaluation methods, although statistically significant improvements in net reclassifications were observed. © 2009 Coastal Education and Research Foundation.
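The Kappa statistic, one of the conventional evaluation methods mentioned, corrects raw presence/absence agreement for agreement expected by chance. A pure-Python sketch for binary maps:

```python
def cohens_kappa(pred, truth):
    """Cohen's kappa for binary presence/absence predictions:
    kappa = (p_observed - p_chance) / (1 - p_chance)."""
    n = len(pred)
    po = sum(p == t for p, t in zip(pred, truth)) / n  # observed agreement
    p1 = sum(pred) / n   # predicted presence rate
    t1 = sum(truth) / n  # true presence rate
    pe = p1 * t1 + (1 - p1) * (1 - t1)  # chance agreement
    return (po - pe) / (1 - pe)

# Perfect agreement gives 1.0; chance-level agreement gives 0.0.
print(cohens_kappa([1, 0, 1, 0], [1, 0, 1, 0]))  # 1.0
```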
Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments
NASA Technical Reports Server (NTRS)
Manning, Ted A.; Lawrence, Scott L.
2014-01-01
As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.
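Baker-type burst models start from the stored energy of the compressed gas, often estimated with Brode's formula, E = (p1 - p0)V/(gamma - 1), with some fraction assumed to become fragment kinetic energy. A hedged sketch (the kinetic-energy fraction is an illustrative assumption, not a value from the paper):

```python
def brode_energy(p_burst, p_ambient, volume, gamma=1.4):
    """Stored blast energy of a bursting gas vessel via Brode's formula (J).
    p_burst, p_ambient in Pa; volume in m^3; gamma: ratio of specific heats."""
    return (p_burst - p_ambient) * volume / (gamma - 1.0)

def fragment_speed(energy, frag_mass_total, ke_fraction=0.2):
    """Rough upper-bound fragment speed (m/s) if ke_fraction of the stored
    energy becomes fragment kinetic energy: v = sqrt(2*f*E/m).
    ke_fraction is an illustrative assumption."""
    return (2.0 * ke_fraction * energy / frag_mass_total) ** 0.5

# Illustrative tank: 1 m^3 at 20 bar bursting into 1 bar ambient, 100 kg of wall.
E = brode_energy(2.0e6, 1.0e5, 1.0)
v = fragment_speed(E, 100.0)
```

The abstract's finding maps onto this sketch: the simple energy-partition picture works best at high internal-to-external pressure ratio and flat fragments, while an external fluid and fragment curvature demand the corrected terms of the improved model.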
NASA Astrophysics Data System (ADS)
Hogue, T. S.; He, M.; Franz, K. J.; Margulis, S. A.; Vrugt, J. A.
2010-12-01
The current study presents an integrated uncertainty analysis and data assimilation approach to improve streamflow predictions while simultaneously providing meaningful estimates of the associated uncertainty. Study models include the National Weather Service (NWS) operational snow model (SNOW17) and rainfall-runoff model (SAC-SMA). The proposed approach uses the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) to simultaneously estimate uncertainties in model parameters, forcing, and observations. An ensemble Kalman filter (EnKF) is configured with the DREAM-identified uncertainty structure and applied to assimilating snow water equivalent data into the SNOW17 model for improved snowmelt simulations. Snowmelt estimates then serve as input to the SAC-SMA model to provide streamflow predictions at the basin outlet. The robustness and usefulness of the approach are evaluated for a snow-dominated watershed in the northern Sierra Mountains. This presentation describes the implementation of DREAM and EnKF into the coupled SNOW17 and SAC-SMA models and summarizes study results and findings.
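The EnKF analysis step nudges each ensemble member toward a perturbed observation by a gain estimated from ensemble spread. A scalar-state sketch (e.g. basin snow water equivalent), not the study's actual configuration:

```python
import random

def enkf_update(ensemble, obs, obs_err_sd, seed=0):
    """Ensemble Kalman filter analysis step for a scalar state.
    ensemble: forecast state values (e.g. SWE per member); obs: observation;
    obs_err_sd: observation error std dev. Each member is pulled toward its
    own perturbed observation by the Kalman gain."""
    random.seed(seed)
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)  # ensemble variance
    gain = var / (var + obs_err_sd ** 2)  # Kalman gain from spread vs obs error
    return [x + gain * (obs + random.gauss(0.0, obs_err_sd) - x)
            for x in ensemble]

# Forecast ensemble far above an accurate observation gets pulled toward it.
prior = [90.0 + i for i in range(21)]  # mean 100
posterior = enkf_update(prior, 50.0, 1.0)
```

This is where the DREAM output matters: the observation and forcing error variances identified by DREAM set `obs_err_sd` and the forecast spread, so the gain reflects realistic rather than guessed uncertainties.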
How to Begin a Quality Improvement Project.
Silver, Samuel A; Harel, Ziv; McQuillan, Rory; Weizman, Adam V; Thomas, Alison; Chertow, Glenn M; Nesrallah, Gihad; Bell, Chaim M; Chan, Christopher T
2016-05-06
Quality improvement involves a combined effort among health care staff and stakeholders to diagnose and treat problems in the health care system. However, health care professionals often lack training in quality improvement methods, which makes it challenging to participate in improvement efforts. This article familiarizes health care professionals with how to begin a quality improvement project. The initial steps involve forming an improvement team that possesses expertise in the quality of care problem, leadership, and change management. Stakeholder mapping and analysis are useful tools at this stage, and these are reviewed to help identify individuals who might have a vested interest in the project. Physician engagement is a particularly important component of project success, and the knowledge that patients/caregivers can offer as members of a quality improvement team should not be overlooked. After a team is formed, an improvement framework helps to organize the scientific process of system change. Common quality improvement frameworks include Six Sigma, Lean, and the Model for Improvement. These models are contrasted, with a focus on the Model for Improvement, because it is widely used and applicable to a variety of quality of care problems without advanced training. It involves three steps: setting aims to focus improvement, choosing a balanced set of measures to determine if improvement occurs, and testing new ideas to change the current process. These new ideas are evaluated using Plan-Do-Study-Act cycles, where knowledge is gained by testing changes and reflecting on their effect. To show the real-world utility of the quality improvement methods discussed, they are applied to a hypothetical quality improvement initiative that aims to promote home dialysis (home hemodialysis and peritoneal dialysis). This provides an example that kidney health care professionals can use to begin their own quality improvement projects.
Copyright © 2016 by the American Society of Nephrology.
Toward Improved Fidelity of Thermal Explosion Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, A L; Becker, R; Howard, W M
2009-07-17
We will present results of an effort to improve the thermal/chemical/mechanical modeling of HMX-based explosives such as LX04 and LX10 for thermal cook-off. The original HMX model and analysis scheme were developed by Yoh et al. for use in the ALE3D modeling framework. The current results were built to remedy the deficiencies of that original model. We concentrated our efforts in four areas. The first area was the addition of porosity to the chemical material model framework in ALE3D that is used to model the HMX explosive formulation; this is needed to handle the roughly 2% porosity in solid explosives. The second area was the improvement of the HMX reaction network, which included a reactive phase-change model based on work by Henson et al. The third area required adding early decomposition gas species to the CHEETAH material database to develop more accurate equations of state for gaseous intermediates and products. Finally, it was necessary to improve the implicit mechanics module in ALE3D to more naturally handle the long time scales associated with thermal cook-off. The application of the resulting framework to the analysis of the Scaled Thermal Explosion (STEX) experiments will be discussed.
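Cook-off models of this kind advance the reaction network with Arrhenius-rate kinetics. The sketch below is a minimal single-step decomposition, not the actual multi-step ALE3D/HMX network, and the rate parameters are illustrative rather than calibrated HMX values:

```python
import numpy as np

# Single-step Arrhenius decomposition: d(lambda)/dt = (1 - lambda) * Z * exp(-Ea / (R*T))
# Parameter values are illustrative, not calibrated to HMX.
R = 8.314    # J/(mol K), gas constant
Z = 5.0e13   # 1/s, pre-exponential factor
Ea = 1.65e5  # J/mol, activation energy

def react(T, t_end, dt=1e-3, lam0=0.0):
    """Integrate reaction progress lambda at constant temperature T (explicit Euler)."""
    lam = lam0
    k = Z * np.exp(-Ea / (R * T))
    for _ in range(int(t_end / dt)):
        lam += dt * (1.0 - lam) * k
        lam = min(lam, 1.0)
    return lam

# Arrhenius kinetics: modestly higher temperature, much faster decomposition
low = react(T=450.0, t_end=10.0)
high = react(T=520.0, t_end=10.0)
```

The strong temperature sensitivity of the rate is what makes the long-time-scale implicit mechanics mentioned above necessary: the system simmers slowly for hours before running away.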
Scientific analysis of satellite ranging data
NASA Technical Reports Server (NTRS)
Smith, David E.
1994-01-01
A network of satellite laser ranging (SLR) tracking systems with continuously improving accuracies is challenging the modelling capabilities of analysts worldwide. Various data analysis techniques have yielded many advances in the development of orbit, instrument and Earth models. The direct measurement of the distance to the satellite provided by the laser ranges has given us a simple metric which links the results obtained by diverse approaches. Different groups have used SLR data, often in combination with observations from other space geodetic techniques, to improve models of the static geopotential, the solid Earth, ocean tides, and atmospheric drag for low Earth satellites. Radiation pressure models and other non-conservative forces for satellite orbits above the atmosphere have been developed to exploit the full accuracy of the latest SLR instruments. SLR is the baseline tracking system for the altimeter missions TOPEX/Poseidon and ERS-1, and will play an important role in providing the reference frame for locating the geocentric position of the ocean surface, in providing an unchanging range standard for altimeter calibration, and in improving the geoid models to separate gravitational from ocean circulation signals seen in the sea surface. However, even with the many improvements in the models used to support the orbital analysis of laser observations, there remain systematic effects which limit the full exploitation of SLR accuracy today.
Revisiting the pole tide for and from satellite altimetry
NASA Astrophysics Data System (ADS)
Desai, Shailen; Wahr, John; Beckley, Brian
2015-12-01
Satellite altimeter sea surface height observations include the geocentric displacements caused by the pole tide, namely the response of the solid Earth and oceans to polar motion. Most users of these data remove these effects using a model that was developed more than 20 years ago. We describe two improvements to the pole tide model for satellite altimeter measurements. Firstly, we recommend an approach that improves the model for the response of the oceans by including the effects of self-gravitation, loading, and mass conservation. Our recommended approach also specifically includes the previously ignored displacement of the solid Earth due to the load of the ocean response, and includes the effects of geocenter motion. Altogether, this improvement amplifies the modeled geocentric pole tide by 15 %, or up to 2 mm of sea surface height displacement. We validate this improvement using two decades of satellite altimeter measurements. Secondly, we recommend that the altimetry pole tide model exclude geocentric sea surface displacements resulting from the long-term drift in polar motion. The response to this particular component of polar motion requires a more rigorous approach than is used by conventional models. We show that erroneously including the response to this component of polar motion in the pole tide model impacts interpretation of regional sea level rise by ± 0.25 mm/year.
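For reference, the equilibrium pole tide that such models start from can be sketched as below. The degree-2, order-1 centrifugal potential form is standard, but the response factor and constants here are approximate, and operational models add the self-gravitation, loading, and mass-conservation effects discussed above (roughly a 15% amplification):

```python
import numpy as np

OMEGA = 7.292115e-5   # Earth rotation rate, rad/s
A = 6.378137e6        # equatorial radius, m
G = 9.7803            # surface gravity, m/s^2
GAMMA = 0.69          # approx. 1 + k2 - h2, equilibrium response factor

def equilibrium_pole_tide(lat_deg, lon_deg, m1_arcsec, m2_arcsec):
    """Simplified equilibrium pole tide sea surface displacement (metres).

    m1, m2 are the wobble components of polar motion (arcseconds) measured
    relative to the mean pole, so the long-term drift in polar motion is
    excluded, as the abstract recommends.
    """
    arcsec = np.pi / (180.0 * 3600.0)
    theta = np.radians(90.0 - lat_deg)   # colatitude
    lam = np.radians(lon_deg)
    # Degree-2, order-1 perturbation of the centrifugal potential
    pot = -(OMEGA**2 * A**2 / 2.0) * np.sin(2.0 * theta) * (
        m1_arcsec * arcsec * np.cos(lam) + m2_arcsec * arcsec * np.sin(lam))
    return GAMMA * pot / G

h = equilibrium_pole_tide(45.0, 0.0, 0.3, 0.0)   # order 1 cm at mid-latitudes
```

A typical wobble of a few tenths of an arcsecond yields a sea surface signal of roughly a centimetre, consistent with the ~2 mm scale of the 15% amplification described above.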
Improved engineering models for turbulent wall flows
NASA Astrophysics Data System (ADS)
She, Zhen-Su; Chen, Xi; Zou, Hong-Yue; Hussain, Fazle
2015-11-01
We propose a new approach, called structural ensemble dynamics (SED), which introduces new concepts to describe the mean quantities in wall-bounded flows, and apply it to improving existing engineering turbulence models, along with its physical interpretation. First, a revised k - ω model for pipe flows is obtained which accurately predicts, for the first time, both the mean velocity and the (streamwise) kinetic energy over a wide range of Reynolds numbers (Re), validated against Princeton experimental data. In particular, a multiplicative factor is introduced in the dissipation term to model an anomaly in the energy cascade in a meso-layer, predicting the outer peak of the streamwise kinetic energy in agreement with the data. Second, a new one-equation model is obtained for compressible turbulent boundary layers (CTBL), building on a multi-layer formula for the stress length function and a generalized temperature-velocity relation. The former refines the multi-layer description - viscous sublayer, buffer layer, logarithmic layer and a newly defined bulk zone - while the latter characterizes a parabolic relation between the mean velocity and temperature. DNS data show our predictions to have 99% accuracy for several Mach numbers (Ma = 2.25, 4.5), improving on a similar previous one-equation model (Baldwin & Lomax, 1978) by up to 10%. Our results promise notable improvements in engineering models.
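For context, the classical parabolic temperature-velocity relation that the generalized relation refines is the Walz (Crocco-Busemann-type) relation; in standard notation (not the paper's exact generalized form):

```latex
% Walz relation: subscripts e and w denote boundary-layer-edge and wall values
\frac{\bar{T}}{T_e} = \frac{T_w}{T_e}
  + \frac{T_r - T_w}{T_e}\,\frac{\bar{u}}{u_e}
  + \frac{T_e - T_r}{T_e}\left(\frac{\bar{u}}{u_e}\right)^{2},
\qquad
T_r = T_e\left(1 + r\,\frac{\gamma - 1}{2}\,M_e^{2}\right)
```

Here T_r is the recovery temperature with recovery factor r ≈ Pr^{1/3}; the relation is quadratic (parabolic) in ū/u_e, consistent with the parabolic mean velocity-temperature relation described above.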
Impact of Land Use/Land Cover Conditions on WRF Model Evaluation for Heat Island Assessment
NASA Astrophysics Data System (ADS)
Bhati, S.; Mohan, M.
2017-12-01
Urban heat island effect has been assessed using the Weather Research and Forecasting model (WRF v3.5), focusing on air temperature and surface skin temperature in the sub-tropical urban Indian megacity of Delhi. The impact of urbanization-related changes in land use/land cover (LULC) on model outputs has been analyzed. Four simulations have been carried out with different types of LULC data, viz. (1) USGS, (2) MODIS, (3) user-modified USGS and (4) user-modified land use data coupled with an urban canopy model (UCM) for incorporation of canopy features. Heat island intensities have been estimated from these simulations and subsequently compared with those derived from in-situ and satellite observations. There is a significant improvement in model performance with the modification of LULC and the inclusion of the UCM. Overall, the RMSE for near-surface temperature improved from 6.3°C to 3.9°C, and the index of agreement for mean urban heat island intensity (UHI) improved from 0.4 to 0.7 with the modified land use coupled with the UCM. In general, the model captures the magnitude of the UHI as well as the high-UHI zones well. The study highlights the importance of an appropriate and updated representation of land use/land cover and urban canopies for improving the predictive capabilities of mesoscale models.
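The two skill scores quoted above are standard model-evaluation metrics; a minimal sketch of RMSE and Willmott's index of agreement, with hypothetical temperature values:

```python
import numpy as np

def rmse(pred, obs):
    """Root-mean-square error between predictions and observations."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

def index_of_agreement(pred, obs):
    """Willmott's index of agreement d in [0, 1]; 1 means perfect agreement."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    obar = obs.mean()
    num = np.sum((pred - obs) ** 2)
    den = np.sum((np.abs(pred - obar) + np.abs(obs - obar)) ** 2)
    return float(1.0 - num / den)

obs = [30.1, 31.4, 29.8, 33.0, 32.2]    # hypothetical observed 2 m temperatures, deg C
pred = [31.0, 30.9, 30.5, 32.1, 33.0]   # hypothetical modeled values
```

Unlike a correlation coefficient, the index of agreement penalizes both bias and amplitude errors, which is why it is a common companion to RMSE in mesoscale model evaluation.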
NASA Astrophysics Data System (ADS)
Yuan, Chunhua; Wang, Jiang; Yi, Guosheng
2017-03-01
Estimation of ion channel parameters is crucial to the spike initiation of neurons. Biophysical neuron models have numerous ion channel parameters, but only a few of them play key roles in the firing patterns of the models, so we choose three parameters featuring adaptation in the Ermentrout neuron model to estimate. However, the traditional particle swarm optimization (PSO) algorithm easily falls into local optima and exhibits premature convergence on some problems. In this paper, we propose an improved method that mixes a concave function with dynamic logistic chaotic mapping to adjust the inertia weights according to the fitness value, effectively improving the global convergence ability of the algorithm. The accurate prediction of firing trajectories by the model rebuilt with the estimated parameters shows that estimating only a few important ion channel parameters can establish the model well, and that the proposed algorithm is effective. Estimations using two classic PSO algorithms are also compared with the improved PSO to verify that the algorithm proposed in this paper avoids local optima and quickly converges to the optimal value. The results provide important theoretical foundations for building biologically realistic neuron models.
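A minimal sketch of PSO with a concave inertia-weight schedule perturbed by a logistic chaotic map. The schedule, coefficients, and test function here are illustrative and do not reproduce the paper's exact fitness-based weight adjustment:

```python
import numpy as np

def improved_pso(f, bounds, n_particles=30, iters=200, seed=0):
    """Minimize f over a box using PSO with a chaotic concave inertia weight."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pval = np.apply_along_axis(f, 1, x)
    gbest = pbest[pval.argmin()].copy()
    z = 0.61  # logistic-map state (avoid fixed points 0, 0.25, 0.5, 0.75, 1)
    for t in range(iters):
        z = 4.0 * z * (1.0 - z)                 # logistic chaotic map
        frac = t / (iters - 1)
        w = 0.9 - 0.5 * frac**2                 # concave decay from 0.9 toward 0.4
        w += 0.05 * (z - 0.5)                   # small chaotic perturbation
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        better = val < pval
        pbest[better] = x[better]
        pval[better] = val[better]
        gbest = pbest[pval.argmin()].copy()
    return gbest, float(pval.min())

sphere = lambda p: float(np.sum(p ** 2))
best, best_val = improved_pso(sphere, (np.full(3, -5.0), np.full(3, 5.0)))
```

The chaotic perturbation keeps the inertia weight from settling into a fixed schedule, which is the mechanism the paper relies on to escape local optima; in the actual application, f would be the discrepancy between simulated and target firing trajectories.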
Terluin, Berend; Eekhout, Iris; Terwee, Caroline B
2017-03-01
Patients have their individual minimal important changes (iMICs) as personal benchmarks for determining whether a perceived health-related quality of life (HRQOL) change constitutes a (minimally) important change for them. We denote the mean iMIC in a group of patients as the "genuine MIC" (gMIC). The aims of this paper are (1) to examine the relationship between the gMIC and the anchor-based minimal important change (MIC) determined by receiver operating characteristic (ROC) analysis or by predictive modeling; (2) to examine the impact of the proportion of improved patients on these MICs; and (3) to explore the possibility of adjusting the MIC for the influence of the proportion of improved patients. We performed multiple simulations of patient samples involved in anchor-based MIC studies, with different characteristics of HRQOL (change) scores and distributions of iMICs. In addition, a real data set is analyzed for illustration. The ROC-based and predictive modeling MICs equal the gMIC when the proportion of improved patients equals 0.5. The MIC is estimated higher than the gMIC when the proportion improved is greater than 0.5, and lower than the gMIC when the proportion improved is less than 0.5. Using an equation that includes the predictive modeling MIC, the log-odds of improvement, the standard deviation of the HRQOL change score, and the correlation between the HRQOL change score and the anchor yields an adjusted MIC reflecting the gMIC irrespective of the proportion of improved patients. Adjusting the predictive modeling MIC for the proportion of improved patients thus ensures that the adjusted MIC reflects the gMIC. We assumed normal distributions and global perceived change scores that were independent of the follow-up score; additionally, floor and ceiling effects were not taken into account. Copyright © 2017 Elsevier Inc. All rights reserved.
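The ROC-based anchor method referred to above selects the change-score cutpoint that best separates anchor-improved from non-improved patients. A sketch using Youden's index on synthetic data (the distributions and sample size are hypothetical, chosen only to illustrate the estimator):

```python
import numpy as np

def roc_mic(change, improved):
    """ROC-based MIC: the change-score cutpoint maximizing
    sensitivity + specificity - 1 (Youden's index)."""
    change = np.asarray(change, float)
    improved = np.asarray(improved, bool)
    best_cut, best_j = None, -np.inf
    for c in np.unique(change):
        sens = np.mean(change[improved] >= c)
        spec = np.mean(change[~improved] < c)
        j = sens + spec - 1.0
        if j > best_j:
            best_cut, best_j = float(c), j
    return best_cut

rng = np.random.default_rng(1)
# Synthetic sample: ~50% improved with change ~ N(5, 2), others ~ N(0, 2),
# so the optimal cutpoint lies near 2.5.
improved = rng.random(400) < 0.5
change = np.where(improved, rng.normal(5, 2, 400), rng.normal(0, 2, 400))
mic = roc_mic(change, improved)
```

With the proportion improved near 0.5, this cutpoint sits midway between the two groups; skewing that proportion shifts the estimate away from the group-mean benchmark, which is the bias the paper's adjustment corrects.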
Helping agencies improve their planning analysis techniques.
DOT National Transportation Integrated Search
2011-11-18
This report summarizes the results of a peer review of the AZTDM. The peer review was : supported by the Travel Model Improvement Program (TMIP), which is sponsored by FHWA. : The peer review of a travel model can serve multiple purposes, including i...
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, D.L.
1995-11-01
The objective of this work was to develop an improved performance model for modules and systems, valid for all operating conditions, for use in module specifications, system and BOS component design, and system rating or monitoring. The approach taken was to identify and quantify the influence of the dominant factors of solar irradiance, cell temperature, angle of incidence, and solar spectrum; to use outdoor test procedures to separate the effects of electrical, thermal, and optical performance; to use fundamental cell characteristics to improve the analysis; and to combine the factors in a simple model using common variables.
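The listed factors combine multiplicatively in simplified flat-plate performance models. The sketch below is an illustrative reduced form, not the empirically fitted model developed in this work (which uses outdoor-test-derived corrections rather than the crude cosine and linear terms assumed here):

```python
import math

def module_power(g_poa, t_cell, aoi_deg,
                 p_ref=300.0, g_ref=1000.0, t_ref=25.0, gamma=-0.004):
    """Simplified flat-plate module power estimate (watts).

    g_poa   : plane-of-array irradiance, W/m^2
    t_cell  : cell temperature, deg C
    aoi_deg : angle of incidence, degrees
    gamma   : power temperature coefficient, 1/deg C (illustrative value)
    """
    if g_poa <= 0 or aoi_deg >= 90:
        return 0.0
    aoi_loss = math.cos(math.radians(aoi_deg))   # crude optical correction
    p = p_ref * (g_poa / g_ref) * aoi_loss * (1.0 + gamma * (t_cell - t_ref))
    return max(p, 0.0)
```

Each multiplicative term isolates one of the dominant factors named in the abstract (irradiance, temperature, incidence angle), which is what makes the outdoor test procedures able to separate their effects.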