Sample records for "model substantially improves"

  1. EFFECTS OF VERTICAL-LAYER STRUCTURE AND BOUNDARY CONDITIONS ON CMAQ-V4.5 AND V4.6 MODELS

    EPA Science Inventory

    This work is aimed at determining whether the increased number of vertical layers in CMAQ provides substantially improved model performance and at assessing whether using the spatially and temporally varying boundary conditions from GEOS-CHEM offers improved model performance as compared to the d...

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, L.

    ITN Energy Systems, Inc., and Global Solar Energy, Inc., with the assistance of NREL's PV Manufacturing R&D program, have continued the advancement of CIGS production technology through the development of trajectory-oriented predictive/control models, fault-tolerance control, control-platform development, in-situ sensors, and process improvements. Modeling activities to date include the development of physics-based and empirical models for CIGS and sputter-deposition processing, implementation of model-based control, and application of predictive models to the construction of new evaporation sources and for control. Model-based control is enabled through implementation of reduced or empirical models into a control platform. Reliability improvement activities include implementation of preventive maintenance schedules; detection of failed sensors/equipment and reconfiguration to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which, in turn, has been enabled by control and reliability improvements due to this PV Manufacturing R&D program. This has resulted in substantial improvements of flexible CIGS PV module performance and efficiency.

  3. A Wind Tunnel Model to Explore Unsteady Circulation Control for General Aviation Applications

    NASA Technical Reports Server (NTRS)

    Cagle, Christopher M.; Jones, Gregory S.

    2002-01-01

    Circulation Control airfoils have been demonstrated to provide substantial improvements in lift over conventional airfoils. The General Aviation Circulation Control model is an attempt to address some of the concerns associated with this technique. The primary focus is to substantially reduce the amount of air mass flow by implementing unsteady flow. This paper describes a wind tunnel model that implements unsteady circulation control by pulsing internal pneumatic valves and details some preliminary results from the first test entry.

  4. Probabilistic models of eukaryotic evolution: time for integration

    PubMed Central

    Lartillot, Nicolas

    2015-01-01

    In spite of substantial work and recent progress, a global and fully resolved picture of the macroevolutionary history of eukaryotes is still under construction. This concerns not only the phylogenetic relations among major groups, but also the general characteristics of the underlying macroevolutionary processes, including the patterns of gene family evolution associated with endosymbioses, as well as their impact on the sequence evolutionary process. All these questions raise formidable methodological challenges, calling for a more powerful statistical paradigm. In this direction, model-based probabilistic approaches have played an increasingly important role. In particular, improved models of sequence evolution accounting for heterogeneities across sites and across lineages have led to significant, although insufficient, improvement in phylogenetic accuracy. More recently, one main trend has been to move away from simple parametric models and stepwise approaches, towards integrative models explicitly considering the intricate interplay between multiple levels of macroevolutionary processes. Such integrative models are in their infancy, and their application to the phylogeny of eukaryotes still requires substantial improvement of the underlying models, as well as additional computational developments. PMID:26323768

  5. Effect of canard position and wing leading-edge flap deflection on wing buffet at transonic speeds

    NASA Technical Reports Server (NTRS)

    Gloss, B. B.; Henderson, W. P.; Huffman, J. K.

    1974-01-01

    A generalized wind-tunnel model, with canard and wing planform typical of highly maneuverable aircraft, was tested. The addition of a canard above the wing chord plane, for the configuration with leading-edge flaps undeflected, produced substantially higher total configuration lift coefficients before buffet onset than the configuration with the canard off and leading-edge flaps undeflected. The wing buffet intensity was substantially lower for the canard-wing configuration than the wing-alone configuration. The low-canard configuration generally displayed the poorest buffet characteristics. Deflecting the wing leading-edge flaps substantially improved the wing buffet characteristics for canard-off configurations. The addition of the high canard did not appear to substantially improve the wing buffet characteristics of the wing with leading-edge flaps deflected.

  6. Fractional Brownian motion and multivariate-t models for longitudinal biomedical data, with application to CD4 counts in HIV-positive patients.

    PubMed

    Stirrup, Oliver T; Babiker, Abdel G; Carpenter, James R; Copas, Andrew J

    2016-04-30

    Longitudinal data are widely analysed using linear mixed models, with 'random slopes' models particularly common. However, when modelling, for example, longitudinal pre-treatment CD4 cell counts in HIV-positive patients, the incorporation of non-stationary stochastic processes such as Brownian motion has been shown to lead to a more biologically plausible model and a substantial improvement in model fit. In this article, we propose two further extensions. Firstly, we propose the addition of a fractional Brownian motion component, and secondly, we generalise the model to follow a multivariate-t distribution. These extensions are biologically plausible, and each demonstrated substantially improved fit on application to example data from the Concerted Action on SeroConversion to AIDS and Death in Europe study. We also propose novel procedures for residual diagnostic plots that allow such models to be assessed. Cohorts of patients were simulated from the previously reported and newly developed models in order to evaluate differences in predictions made for the timing of treatment initiation under different clinical management strategies. A further simulation study was performed to demonstrate the substantial biases in parameter estimates of the mean slope of CD4 decline with time that can occur when random slopes models are applied in the presence of censoring because of treatment initiation, with the degree of bias found to depend strongly on the treatment initiation rule applied. Our findings indicate that researchers should consider more complex and flexible models for the analysis of longitudinal biomarker data, particularly when there are substantial missing data, and that the parameter estimates from random slopes models must be interpreted with caution. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
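
    As a rough illustration of the kind of covariance structure involved, the sketch below simulates one subject's trajectory under a random-intercept/slope model with an added fractional Brownian motion component, using the standard fBm covariance Cov(B_H(s), B_H(t)) = (sigma^2/2)(s^(2H) + t^(2H) - |s - t|^(2H)). All parameter values are invented, not taken from the study.

```python
# Illustrative sketch: simulate one subject's longitudinal biomarker trajectory
# under a random-intercept/slope model plus a fractional Brownian motion (fBm)
# component. All parameter values are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def fbm_cov(times, hurst, sigma2):
    """fBm covariance: sigma2/2 * (s^2H + t^2H - |s - t|^2H)."""
    s = np.asarray(times, float)[:, None]
    t = np.asarray(times, float)[None, :]
    return 0.5 * sigma2 * (s**(2 * hurst) + t**(2 * hurst) - np.abs(s - t)**(2 * hurst))

times = np.linspace(0.0, 5.0, 11)           # years of follow-up (illustrative)
X = np.column_stack([np.ones_like(times), times])

beta = np.array([25.0, -1.5])               # fixed intercept and slope (invented)
D = np.diag([4.0, 0.25])                    # random intercept/slope covariance
sigma2_fbm, hurst = 2.0, 0.7                # fBm scale and Hurst index
sigma2_eps = 0.5                            # measurement-error variance

# Marginal covariance of the response: X D X' + fBm covariance + measurement error
V = X @ D @ X.T + fbm_cov(times, hurst, sigma2_fbm) + sigma2_eps * np.eye(len(times))
y = rng.multivariate_normal(X @ beta, V)
print(np.round(y, 2))
```

    The multivariate-t extension described in the record can be viewed as scaling such a Gaussian covariance by a latent mixing variable, which fattens the tails of the joint distribution.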

  7. Special issue : safety advancements

    DOT National Transportation Integrated Search

    1999-04-24

    This issue of 'Status Report' focuses on some of the most recent key safety technology improvements. The crash protection in passenger vehicles is improving substantially; advanced frontal airbags will soon be available in a number of models and side...

  8. Using the Madeline Hunter Direct Instruction Model to Improve Outcomes Assessments in Marketing Programs

    ERIC Educational Resources Information Center

    Steward, Michelle D.; Martin, Gregory S.; Burns, Alvin C.; Bush, Ronald F.

    2010-01-01

    This study introduces marketing educators to the Madeline Hunter Direct Instruction Model (HDIM) as an approach to significantly and substantially improve student learning through course-embedded assessment. The effectiveness of the method is illustrated in three different marketing courses taught by three different marketing professors. The…

  9. Increasing the Cryogenic Toughness of Steels

    NASA Technical Reports Server (NTRS)

    Rush, H. F.

    1986-01-01

    Grain-refining heat treatments increase toughness without substantial strength loss. Five alloys, all at or near the technological limit, were selected for study. Results showed clearly that the grain sizes of these alloys were refined by such heat treatments and that grain refinement results in a large improvement in toughness without substantial loss in strength. The best improvements were seen in HP-9-4-20 steel, at the low-strength end of the technological limit, and in Maraging 200, at the high-strength end. These alloys, in the grain-refined condition, are considered for model applications in high-Reynolds-number cryogenic wind tunnels.

  10. Comparison of Individualized Covert Modeling, Self-Control Desensitization, and Study Skills Training for Alleviation of Test Anxiety.

    ERIC Educational Resources Information Center

    Harris, Gina; Johnson, Suzanne Bennett

    1980-01-01

    Individualized covert modeling and self-control desensitization substantially reduced self-reported test anxiety. However, the individualized covert modeling group was the only treatment group that showed significant improvement in academic performance. (Author)

  11. Consumer preference models: fuzzy theory approach

    NASA Astrophysics Data System (ADS)

    Turksen, I. B.; Wilson, I. A.

    1993-12-01

    Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).
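
    A minimal sketch of the idea (not the authors' model): linguistic ratings are represented as triangular fuzzy numbers, combined in a weighted linear form per respondent, and defuzzified to an overall preference score. The attribute names, weights, ratings, and membership functions below are invented for illustration.

```python
# Illustrative sketch of a fuzzy-set preference model: linguistic ratings are
# mapped to triangular fuzzy numbers, combined linearly with attribute weights,
# and defuzzified (centroid) to an overall preference score.
# Attribute names, weights, ratings, and membership functions are invented.

LINGUISTIC = {                     # triangular fuzzy numbers (low, mode, high) on a 0-10 scale
    "poor":      (0.0, 0.0, 3.0),
    "fair":      (2.0, 5.0, 7.0),
    "good":      (5.0, 7.5, 9.0),
    "excellent": (8.0, 10.0, 10.0),
}

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    low, mode, high = tfn
    return (low + mode + high) / 3.0

def fuzzy_preference(ratings, weights):
    """Weighted linear combination of defuzzified ratings for one respondent."""
    total_weight = sum(weights.values())
    score = sum(weights[attr] * centroid(LINGUISTIC[label]) for attr, label in ratings.items())
    return score / total_weight

ratings = {"price": "fair", "design": "excellent", "battery": "good"}   # one respondent
weights = {"price": 0.5, "design": 0.3, "battery": 0.2}
print(round(fuzzy_preference(ratings, weights), 2))
```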

  12. Function modeling improves the efficiency of spatial modeling using big data from remote sensing

    Treesearch

    John Hogland; Nathaniel Anderson

    2017-01-01

    Spatial modeling is an integral component of most geographic information systems (GISs). However, conventional GIS modeling techniques can require substantial processing time and storage space and have limited statistical and machine learning functionality. To address these limitations, many have parallelized spatial models using multiple coding libraries and have...

  13. Trans-dimensional matched-field geoacoustic inversion with hierarchical error models and interacting Markov chains.

    PubMed

    Dettmer, Jan; Dosso, Stan E

    2012-10-01

    This paper develops a trans-dimensional approach to matched-field geoacoustic inversion, including interacting Markov chains to improve efficiency and an autoregressive model to account for correlated errors. The trans-dimensional approach and hierarchical seabed model allow inversion without assuming any particular parametrization by relaxing model specification to a range of plausible seabed models (e.g., in this case, the number of sediment layers is an unknown parameter). Data errors are addressed by sampling statistical error-distribution parameters, including correlated errors (covariance), by applying a hierarchical autoregressive error model. The well-known difficulty of low acceptance rates for trans-dimensional jumps is addressed with interacting Markov chains, resulting in a substantial increase in efficiency. The trans-dimensional seabed model and the hierarchical error model relax the degree of prior assumptions required in the inversion, resulting in substantially improved (more realistic) uncertainty estimates and a more automated algorithm. In particular, the approach gives seabed parameter uncertainty estimates that account for uncertainty due to prior model choice (layering and data error statistics). The approach is applied to data measured on a vertical array in the Mediterranean Sea.
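
    The correlated-error component can be illustrated with a small sketch: the exact Gaussian log-likelihood of a residual vector under a first-order autoregressive, AR(1), error model, which is the kind of hierarchical error treatment the record describes. Parameter names and the toy residuals are generic, not taken from the paper.

```python
# Illustrative sketch: Gaussian log-likelihood of residuals under a first-order
# autoregressive (AR(1)) error model, r_t = phi * r_{t-1} + e_t, e_t ~ N(0, sigma^2).
import numpy as np

def ar1_loglik(residuals, sigma, phi):
    """Exact log-likelihood of an AR(1) residual series (|phi| < 1)."""
    r = np.asarray(residuals, dtype=float)
    var0 = sigma**2 / (1.0 - phi**2)                     # stationary variance of r_0
    ll = -0.5 * (np.log(2 * np.pi * var0) + r[0]**2 / var0)
    innov = r[1:] - phi * r[:-1]                         # one-step-ahead innovations
    ll += -0.5 * np.sum(np.log(2 * np.pi * sigma**2) + innov**2 / sigma**2)
    return ll

# Toy residual vector (e.g., data minus replica-field prediction at one frequency)
rng = np.random.default_rng(1)
resid = rng.normal(size=50)
print(ar1_loglik(resid, sigma=1.0, phi=0.3))
```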

  14. Function modeling: improved raster analysis through delayed reading and function raster datasets

    Treesearch

    John S. Hogland; Nathaniel M. Anderson; J .Greg Jones

    2013-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...

  15. Improving optimal control of grid-connected lithium-ion batteries through more accurate battery and degradation modelling

    NASA Astrophysics Data System (ADS)

    Reniers, Jorn M.; Mulder, Grietus; Ober-Blöbaum, Sina; Howey, David A.

    2018-03-01

    The increased deployment of intermittent renewable energy generators opens up opportunities for grid-connected energy storage. Batteries offer significant flexibility but are relatively expensive at present. Battery lifetime is a key factor in the business case, and it depends on usage, but most techno-economic analyses do not account for this. For the first time, this paper quantifies the annual benefits of grid-connected batteries including realistic physical dynamics and nonlinear electrochemical degradation. Three lithium-ion battery models of increasing realism are formulated, and the predicted degradation of each is compared with a large-scale experimental degradation data set (Mat4Bat). A respective improvement in RMS capacity prediction error from 11% to 5% is found by increasing the model accuracy. The three models are then used within an optimal control algorithm to perform price arbitrage over one year, including degradation. Results show that the revenue can be increased substantially while degradation can be reduced by using more realistic models. The estimated best case profit using a sophisticated model is a 175% improvement compared with the simplest model. This illustrates that using a simplistic battery model in a techno-economic assessment of grid-connected batteries might substantially underestimate the business case and lead to erroneous conclusions.
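
    The arbitrage-with-degradation idea can be sketched with a toy dynamic program over a discretized state of charge, where degradation is priced crudely as a cost per unit of energy throughput; the paper's electrochemical degradation models are far more detailed, and every number below is invented.

```python
# Illustrative sketch: one-day price arbitrage for a small battery, solved by
# dynamic programming over a discretized state of charge (SOC), with a crude
# degradation penalty per MWh of throughput. This stands in for the paper's
# detailed electrochemical degradation models; every number here is invented.
import numpy as np

prices = np.array([30, 28, 25, 24, 26, 35, 60, 80, 70, 55, 45, 40,
                   38, 36, 35, 40, 55, 85, 95, 75, 60, 50, 40, 35], float)  # EUR/MWh
capacity_mwh = 1.0
power_mw = 0.5                    # max charge/discharge energy per hour
eff = 0.95                        # one-way efficiency
deg_cost = 5.0                    # EUR per MWh of throughput, proxy for degradation
soc_levels = np.linspace(0.0, capacity_mwh, 21)

value = np.zeros(len(soc_levels))            # value-to-go at the end of the day
for t in range(len(prices) - 1, -1, -1):     # backward induction over hours
    new_value = np.full(len(soc_levels), -np.inf)
    for i, soc in enumerate(soc_levels):
        for j, soc_next in enumerate(soc_levels):
            delta = soc_next - soc           # >0 charge, <0 discharge
            if abs(delta) > power_mw + 1e-9:
                continue
            if delta >= 0:                   # buy grid energy to charge
                cash = -prices[t] * delta / eff
            else:                            # sell discharged energy
                cash = prices[t] * (-delta) * eff
            cash -= deg_cost * abs(delta)
            new_value[i] = max(new_value[i], cash + value[j])
    value = new_value

print(f"Optimal day-ahead profit starting from an empty battery: {value[0]:.2f} EUR")
```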

  16. Improved analyses using function datasets and statistical modeling

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2014-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...

  17. RELEASE NOTES FOR MODELS-3 VERSION 4.1 PATCH: SMOKE TOOL AND FILE CONVERTER

    EPA Science Inventory

    This software patch to the Models-3 system corrects minor errors in the Models-3 framework, provides substantial improvements in the ASCII to I/O API format conversion of the File Converter utility, and new functionalities for the SMOKE Tool. Version 4.1 of the Models-3 system...

  18. THE USE OF COMPUTER MODELING PACKAGES TO ILLUSTRATE UNCERTAINTY IN RISK ASSESSMENTS: AN EASE OF USE AND INTERPRETATION COMPARISON

    EPA Science Inventory

    Consistent improvements in processor speed and computer access have substantially increased the use of computer modeling by experts and non-experts alike. Several new computer modeling packages operating under graphical operating systems (i.e. Microsoft Windows or Macintosh) m...

  19. Constrained inference in mixed-effects models for longitudinal data with application to hearing loss.

    PubMed

    Davidov, Ori; Rosen, Sophia

    2011-04-01

    In medical studies, endpoints are often measured for each patient longitudinally. The mixed-effects model has been a useful tool for the analysis of such data. There are situations in which the parameters of the model are subject to some restrictions or constraints. For example, in hearing loss studies, we expect hearing to deteriorate with time. This means that hearing thresholds which reflect hearing acuity will, on average, increase over time. Therefore, the regression coefficients associated with the mean effect of time on hearing ability will be constrained. Such constraints should be accounted for in the analysis. We propose maximum likelihood estimation procedures, based on the expectation-conditional maximization either algorithm, to estimate the parameters of the model while accounting for the constraints on them. The proposed methods improve, in terms of mean square error, on the unconstrained estimators. In some settings, the improvement may be substantial. Hypotheses testing procedures that incorporate the constraints are developed. Specifically, likelihood ratio, Wald, and score tests are proposed and investigated. Their empirical significance levels and power are studied using simulations. It is shown that incorporating the constraints improves the mean squared error of the estimates and the power of the tests. These improvements may be substantial. The methodology is used to analyze a hearing loss study.
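
    The general form of such a constrained linear mixed-effects model can be written as below (generic notation, not copied from the paper); the inequality encodes the expectation that mean hearing thresholds do not decrease over time.

```latex
% Linear mixed-effects model with a linear inequality constraint on the fixed
% effects (generic notation). y_{ij}: hearing threshold for subject i at
% occasion j; b_i: subject-level random effects.
\begin{aligned}
y_{ij} &= \mathbf{x}_{ij}^{\top}\boldsymbol{\beta}
        + \mathbf{z}_{ij}^{\top}\mathbf{b}_i + \varepsilon_{ij},
\qquad \mathbf{b}_i \sim \mathcal{N}(\mathbf{0},\mathbf{D}),
\quad \varepsilon_{ij} \sim \mathcal{N}(0,\sigma^{2}),\\
&\text{subject to } \mathbf{A}\boldsymbol{\beta} \ge \mathbf{0}
\quad (\text{e.g., } \beta_{\text{time}} \ge 0).
\end{aligned}
```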

  20. The changing global carbon cycle: Linking plant-soil carbon dynamics to global consequences

    USGS Publications Warehouse

    Chapin, F. S.; McFarland, J.; McGuire, David A.; Euskirchen, E.S.; Ruess, Roger W.; Kielland, K.

    2009-01-01

    Synthesis. Current climate systems models that include only NPP and HR are inadequate under conditions of rapid change. Many of the recent advances in biogeochemical understanding are sufficiently mature to substantially improve representation of ecosystem C dynamics in these models.

  1. IMPROVING CHEMICAL TRANSPORT MODEL PREDICTIONS OF ORGANIC AEROSOL: MEASUREMENT AND SIMULATION OF SEMIVOLATILE ORGANIC EMISSIONS FROM MOBILE AND NON-MOBILE SOURCES

    EPA Science Inventory

    Organic material contributes a significant fraction of PM2.5 mass across all regions of the United States, but state-of-the-art chemical transport models often substantially underpredict measured organic aerosol concentrations. Recent revisions to these models that...

  2. Quantifying surgical complexity with machine learning: looking beyond patient factors to improve surgical models.

    PubMed

    Van Esbroeck, Alexander; Rubinfeld, Ilan; Hall, Bruce; Syed, Zeeshan

    2014-11-01

    To investigate the use of machine learning to empirically determine the risk of individual surgical procedures and to improve surgical models with this information. American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) data from 2005 to 2009 were used to train support vector machine (SVM) classifiers to learn the relationship between textual constructs in current procedural terminology (CPT) descriptions and mortality, morbidity, Clavien 4 complications, and surgical-site infections (SSI) within 30 days of surgery. The procedural risk scores produced by the SVM classifiers were validated on data from 2010 in univariate and multivariate analyses. The procedural risk scores produced by the SVM classifiers achieved moderate-to-high levels of discrimination in univariate analyses (area under receiver operating characteristic curve: 0.871 for mortality, 0.789 for morbidity, 0.791 for SSI, 0.845 for Clavien 4 complications). Addition of these scores also substantially improved multivariate models comprising patient factors and previously proposed correlates of procedural risk (net reclassification improvement and integrated discrimination improvement: 0.54 and 0.001 for mortality, 0.46 and 0.011 for morbidity, 0.68 and 0.022 for SSI, 0.44 and 0.001 for Clavien 4 complications; P < .05 for all comparisons). Similar improvements were noted in discrimination and calibration for other statistical measures, and in subcohorts comprising patients with general or vascular surgery. Machine learning provides clinically useful estimates of surgical risk for individual procedures. This information can be measured in an entirely data-driven manner and substantially improves multifactorial models to predict postoperative complications. Copyright © 2014 Elsevier Inc. All rights reserved.
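
    A minimal sketch of the general approach (not the study's code): TF-IDF features from CPT-style procedure descriptions feed a linear SVM, and the signed distance from its separating hyperplane is used as a continuous procedural risk score. The toy records and labels below are invented; the real study used ACS NSQIP data and a more involved validation.

```python
# Illustrative sketch: bag-of-words features from procedure descriptions feed a
# linear SVM whose decision value serves as a procedural risk score.
# The toy descriptions and outcomes below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.metrics import roc_auc_score

descriptions = [
    "laparoscopic appendectomy",
    "open repair ruptured abdominal aortic aneurysm",
    "carpal tunnel release",
    "esophagectomy with thoracotomy",
    "inguinal hernia repair",
    "emergency exploratory laparotomy",
]
died_30d = [0, 1, 0, 1, 0, 1]                 # toy 30-day mortality labels

vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(descriptions)

svm = LinearSVC(C=1.0)
svm.fit(X, died_30d)

# Continuous procedural risk score = signed distance from the separating hyperplane
risk_score = svm.decision_function(X)
print("In-sample AUC (toy data):", roc_auc_score(died_30d, risk_score))
```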

  3. Utilizing multiple scale models to improve predictions of extra-axial hemorrhage in the immature piglet.

    PubMed

    Scott, Gregory G; Margulies, Susan S; Coats, Brittany

    2016-10-01

    Traumatic brain injury (TBI) is a leading cause of death and disability in the USA. To help understand and better predict TBI, researchers have developed complex finite element (FE) models of the head which incorporate many biological structures such as scalp, skull, meninges, brain (with gray/white matter differentiation), and vasculature. However, most models drastically simplify the membranes and substructures between the pia and arachnoid membranes. We hypothesize that substructures in the pia-arachnoid complex (PAC) contribute substantially to brain deformation following head rotation, and that when included in FE models accuracy of extra-axial hemorrhage prediction improves. To test these hypotheses, microscale FE models of the PAC were developed to span the variability of PAC substructure anatomy and regional density. The constitutive response of these models were then integrated into an existing macroscale FE model of the immature piglet brain to identify changes in cortical stress distribution and predictions of extra-axial hemorrhage (EAH). Incorporating regional variability of PAC substructures substantially altered the distribution of principal stress on the cortical surface of the brain compared to a uniform representation of the PAC. Simulations of 24 non-impact rapid head rotations in an immature piglet animal model resulted in improved accuracy of EAH prediction (to 94 % sensitivity, 100 % specificity), as well as a high accuracy in regional hemorrhage prediction (to 82-100 % sensitivity, 100 % specificity). We conclude that including a biofidelic PAC substructure variability in FE models of the head is essential for improved predictions of hemorrhage at the brain/skull interface.

  4. Role of socioeconomic status measures in long-term mortality risk prediction after myocardial infarction.

    PubMed

    Molshatzki, Noa; Drory, Yaacov; Myers, Vicki; Goldbourt, Uri; Benyamini, Yael; Steinberg, David M; Gerber, Yariv

    2011-07-01

    The relationship of risk factors to outcomes has traditionally been assessed by measures of association such as odds ratio or hazard ratio and their statistical significance from an adjusted model. However, a strong, highly significant association does not guarantee a gain in stratification capacity. Using recently developed model performance indices, we evaluated the incremental discriminatory power of individual and neighborhood socioeconomic status (SES) measures after myocardial infarction (MI). Consecutive patients aged ≤65 years (N=1178) discharged from 8 hospitals in central Israel after incident MI in 1992 to 1993 were followed-up through 2005. A basic model (demographic variables, traditional cardiovascular risk factors, and disease severity indicators) was compared with an extended model including SES measures (education, income, employment, living with a steady partner, and neighborhood SES) in terms of Harrell c statistic, integrated discrimination improvement (IDI), and net reclassification improvement (NRI). During the 13-year follow-up, 326 (28%) patients died. Cox proportional hazards models showed that all SES measures were significantly and independently associated with mortality. Furthermore, compared with the basic model, the extended model yielded substantial gains (all P<0.001) in c statistic (0.723 to 0.757), NRI (15.2%), IDI (5.9%), and relative IDI (32%). Improvement was observed both for sensitivity (classification of events) and specificity (classification of nonevents). This study illustrates the additional insights that can be gained from considering the IDI and NRI measures of model performance and suggests that, among community patients with incident MI, incorporating SES measures into a clinical-based model substantially improves long-term mortality risk prediction.
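
    The two reclassification measures can be sketched directly from their standard definitions, given predicted risks from a basic and an extended model; the risk vectors below are toy numbers, not study data.

```python
# Illustrative sketch: category-free net reclassification improvement (NRI) and
# integrated discrimination improvement (IDI) comparing predicted risks from a
# basic and an extended model. Follows the standard definitions; toy data only.
import numpy as np

def continuous_nri(p_old, p_new, events):
    events = np.asarray(events, bool)
    up, down = p_new > p_old, p_new < p_old
    nri_events = up[events].mean() - down[events].mean()
    nri_nonevents = down[~events].mean() - up[~events].mean()
    return nri_events + nri_nonevents

def idi(p_old, p_new, events):
    events = np.asarray(events, bool)
    return ((p_new[events].mean() - p_old[events].mean())
            - (p_new[~events].mean() - p_old[~events].mean()))

p_basic    = np.array([0.10, 0.20, 0.15, 0.40, 0.30, 0.25])
p_extended = np.array([0.08, 0.35, 0.12, 0.55, 0.20, 0.30])
died       = np.array([0,    1,    0,    1,    0,    1])

print("NRI:", round(continuous_nri(p_basic, p_extended, died), 3))
print("IDI:", round(idi(p_basic, p_extended, died), 3))
```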

  5. Improving LHC searches for dark photons using lepton-jet substructure

    NASA Astrophysics Data System (ADS)

    Barello, G.; Chang, Spencer; Newby, Christopher A.; Ostdiek, Bryan

    2017-03-01

    Collider signals of dark photons are an exciting probe for new gauge forces and are characterized by events with boosted lepton jets. Existing techniques are efficient in searching for muonic lepton jets but due to substantial backgrounds have difficulty constraining lepton jets containing only electrons. This is unfortunate since upcoming intensity frontier experiments are sensitive to dark photon masses which only allow electron decays. Analyzing a recently proposed model of kinetic mixing, with new scalar particles decaying into dark photons, we find that existing techniques for electron jets can be substantially improved. We show that using lepton-jet-substructure variables, in association with a boosted decision tree, improves background rejection, significantly increasing the LHC's reach for dark photons in this region of parameter space.

  6. A fuzzy set preference model for market share analysis

    NASA Technical Reports Server (NTRS)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    Consumer preference models are widely used in new product design, marketing management, pricing, and market segmentation. The success of new products depends on accurate market share prediction and design decisions based on consumer preferences. The vague linguistic nature of consumer preferences and product attributes, combined with the substantial differences between individuals, creates a formidable challenge to marketing models. The most widely used methodology is conjoint analysis. Conjoint models, as currently implemented, represent linguistic preferences as ratio or interval-scaled numbers, use only numeric product attributes, and require aggregation of individuals for estimation purposes. It is not surprising that these models are costly to implement, are inflexible, and have a predictive validity that is not substantially better than chance. This affects the accuracy of market share estimates. A fuzzy set preference model can easily represent linguistic variables either in consumer preferences or product attributes with minimal measurement requirements (ordinal scales), while still estimating overall preferences suitable for market share prediction. This approach results in flexible individual-level conjoint models which can provide more accurate market share estimates from a smaller number of more meaningful consumer ratings. Fuzzy sets can be incorporated within existing preference model structures, such as a linear combination, using the techniques developed for conjoint analysis and market share estimation. The purpose of this article is to develop and fully test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation), and how much to make (market share prediction).

  7. Local control on precipitation in a fully coupled climate-hydrology model.

    PubMed

    Larsen, Morten A D; Christensen, Jens H; Drews, Martin; Butts, Michael B; Refsgaard, Jens C

    2016-03-10

    The ability to simulate regional precipitation realistically by climate models is essential to understand and adapt to climate change. Due to the complexity of associated processes, particularly at unresolved temporal and spatial scales, this continues to be a major challenge. As a result, climate simulations of precipitation often exhibit substantial biases that affect the reliability of future projections. Here we demonstrate how a regional climate model (RCM) coupled to a distributed hydrological catchment model that fully integrates water and energy fluxes between the subsurface, land surface, plant cover and the atmosphere enables a realistic representation of local precipitation. Substantial improvements in simulated precipitation dynamics on seasonal and longer time scales are seen for a simulation period of six years and can be attributed to a more complete treatment of hydrological sub-surface processes, including groundwater and moisture feedback. A high degree of local influence on the atmosphere suggests that coupled climate-hydrology models have a potential for improving climate projections, and the results further indicate a diminished need for bias correction in climate-hydrology impact studies.

  8. Local control on precipitation in a fully coupled climate-hydrology model

    PubMed Central

    Larsen, Morten A. D.; Christensen, Jens H.; Drews, Martin; Butts, Michael B.; Refsgaard, Jens C.

    2016-01-01

    The ability to simulate regional precipitation realistically by climate models is essential to understand and adapt to climate change. Due to the complexity of associated processes, particularly at unresolved temporal and spatial scales, this continues to be a major challenge. As a result, climate simulations of precipitation often exhibit substantial biases that affect the reliability of future projections. Here we demonstrate how a regional climate model (RCM) coupled to a distributed hydrological catchment model that fully integrates water and energy fluxes between the subsurface, land surface, plant cover and the atmosphere enables a realistic representation of local precipitation. Substantial improvements in simulated precipitation dynamics on seasonal and longer time scales are seen for a simulation period of six years and can be attributed to a more complete treatment of hydrological sub-surface processes, including groundwater and moisture feedback. A high degree of local influence on the atmosphere suggests that coupled climate-hydrology models have a potential for improving climate projections, and the results further indicate a diminished need for bias correction in climate-hydrology impact studies. PMID:26960564

  9. Quantifying Reporting Timeliness to Improve Outbreak Control

    PubMed Central

    Swaan, Corien; van Steenbergen, Jim; Kretzschmar, Mirjam

    2015-01-01

    The extent to which reporting delays should be reduced to gain substantial improvement in outbreak control is unclear. We developed a model to quantitatively assess reporting timeliness. Using reporting speed data for 6 infectious diseases in the notification system in the Netherlands, we calculated the proportion of infections produced by index and secondary cases until the index case is reported. We assumed interventions that immediately stop transmission. Reporting delays render useful only those interventions that stop transmission from index and secondary cases. We found that current reporting delays are adequate for hepatitis A and B control. However, reporting delays should be reduced by a few days to improve measles and mumps control, by at least 10 days to improve shigellosis control, and by at least 5 weeks to substantially improve pertussis control. Our method provides quantitative insight into the required reporting delay reductions needed to achieve outbreak control and other transmission prevention goals. PMID:25625374
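
    The quantity behind this measure can be sketched with a small Monte Carlo simulation: the fraction of infections generated by the index and secondary cases that occur only after the index case is reported, and so could still be prevented by an immediate intervention. The generation-interval and reproduction-number values below are illustrative, not the paper's disease-specific estimates.

```python
# Illustrative Monte Carlo sketch: fraction of first- and second-generation
# infections occurring after the index case is reported (preventable by an
# intervention that immediately stops transmission). All parameters invented.
import numpy as np

rng = np.random.default_rng(42)

def preventable_fraction(report_delay_days, r0=2.0, gen_mean=15.0, gen_shape=3.0, n_sim=5000):
    gen_scale = gen_mean / gen_shape
    prevented, total = 0, 0
    for _ in range(n_sim):
        t_report = report_delay_days              # index reported this long after infection
        n1 = rng.poisson(r0)                      # infections caused by the index case
        t1 = rng.gamma(gen_shape, gen_scale, n1)
        total += n1
        prevented += np.sum(t1 > t_report)
        for t_sec in t1:                          # infections caused by each secondary case
            n2 = rng.poisson(r0)
            t2 = t_sec + rng.gamma(gen_shape, gen_scale, n2)
            total += n2
            prevented += np.sum(t2 > t_report)
    return prevented / total

for delay in (3, 7, 14, 28):
    print(f"report delay {delay:2d} d -> preventable fraction {preventable_fraction(delay):.2f}")
```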

  10. Next Generation Waveform Based Three-Dimensional Models and Metrics to Improve Nuclear Explosion Monitoring in the Middle East (Postprint)

    DTIC Science & Technology

    2011-12-30

    improvements also significantly increase anomaly strength while sharpening the anomaly edges to create stronger and more pronounced tectonic structures. The...continental deformation and crustal thickening is occurring, the wave speeds are substantially slower. This Asian north-to-south, fast-to-slow wave speed

  11. Development of cost-effective pavement treatment selection and treatment performance models.

    DOT National Transportation Integrated Search

    2015-09-01

    Louisiana Department of Transportation and Development (DOTD) has spent substantial financial resources on various : rehabilitation and maintenance treatments to minimize pavement distresses and improve pavement life. Such treatments : include, but a...

  12. Using video self- and peer modeling to facilitate reading fluency in children with learning disabilities.

    PubMed

    Decker, Martha M; Buggey, Tom

    2014-01-01

    The authors compared the effects of video self-modeling and video peer modeling on oral reading fluency of elementary students with learning disabilities. A control group was also included to gauge general improvement due to reading instruction and familiarity with researchers. The results indicated that both interventions resulted in improved fluency. Students in both experimental groups improved their reading fluency. Two students in the self-modeling group made substantial and immediate gains beyond any of the other students. Discussion is included that focuses on the importance that positive imagery can have on student performance and the possible applications of both forms of video modeling with students who have had negative experiences in reading.

  13. Modeling the dissipation rate in rotating turbulent flows

    NASA Technical Reports Server (NTRS)

    Speziale, Charles G.; Raj, Rishi; Gatski, Thomas B.

    1990-01-01

    A variety of modifications to the modeled dissipation rate transport equation that have been proposed during the past two decades to account for rotational strains are examined. The models are subjected to two crucial test cases: the decay of isotropic turbulence in a rotating frame and homogeneous shear flow in a rotating frame. It is demonstrated that these modifications do not yield substantially improved predictions for these two test cases and in many instances give rise to unphysical behavior. An alternative proposal, based on the use of the tensor dissipation rate, is made for the development of improved models.
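
    For reference, the baseline modeled dissipation-rate equation for homogeneous turbulence that such rotational modifications attempt to correct is sketched below in generic notation, with the usual closure coefficients; this is the standard textbook form, not any particular modified model from the record.

```latex
% Standard modeled transport equation for the turbulence dissipation rate in
% homogeneous flows (generic notation); rotational modifications typically
% alter the destruction term or its coefficient.
\frac{d\varepsilon}{dt}
  = C_{\varepsilon 1}\,\frac{\varepsilon}{k}\,\mathcal{P}
  \;-\; C_{\varepsilon 2}\,\frac{\varepsilon^{2}}{k},
\qquad C_{\varepsilon 1} \approx 1.44,\qquad C_{\varepsilon 2} \approx 1.92,
```

    where k is the turbulent kinetic energy and P the turbulence production.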

  14. A review of typhoid fever transmission dynamic models and economic evaluations of vaccination.

    PubMed

    Watson, Conall H; Edmunds, W John

    2015-06-19

    Despite a recommendation by the World Health Organization (WHO) that typhoid vaccines be considered for the control of endemic disease and outbreaks, programmatic use remains limited. Transmission models and economic evaluation may be informative in decision making about vaccine programme introductions and their role alongside other control measures. A literature search found few typhoid transmission models or economic evaluations relative to analyses of other infectious diseases of similar or lower health burden. Modelling suggests vaccines alone are unlikely to eliminate endemic disease in the short to medium term without measures to reduce transmission from asymptomatic carriage. The single identified data-fitted transmission model of typhoid vaccination suggests vaccines can reduce disease burden substantially when introduced programmatically but that indirect protection depends on the relative contribution of carriage to transmission in a given setting. This is an important source of epidemiological uncertainty, alongside the extent and nature of natural immunity. Economic evaluations suggest that typhoid vaccination can be cost-saving to health services if incidence is extremely high and cost-effective in other high-incidence situations, when compared to WHO norms. Targeting vaccination to the highest incidence age-groups is likely to improve cost-effectiveness substantially. Economic perspective and vaccine costs substantially affect estimates, with disease incidence, case-fatality rates, and vaccine efficacy over time also important determinants of cost-effectiveness and sources of uncertainty. Static economic models may under-estimate benefits of typhoid vaccination by omitting indirect protection. Typhoid fever transmission models currently require per-setting epidemiological parameterisation to inform their use in economic evaluation, which may limit their generalisability. We found no economic evaluation based on transmission dynamic modelling, and no economic evaluation of typhoid vaccination against interventions such as improvements in sanitation or hygiene. Copyright © 2015. Published by Elsevier Ltd.

  15. Domestic refrigeration appliances in Poland: Potential for improving energy efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyers, S.; Schipper, L.; Lebot, B.

    1993-08-01

    This report is based on information collected from the main Polish manufacturer of refrigeration appliances. We describe their production facilities, and show that the energy consumption of their models for domestic sale is substantially higher than the average for similar models made in W. Europe. Lack of data and uncertainty about future production costs in Poland limits our evaluation of the cost-effective potential to increase energy efficiency, but it appears likely that considerable improvement would be economic from a societal perspective. Many design options are likely to have a simple payback of less than five years. We found that the production facilities are in need of substantial modernization in order to produce higher quality and more efficient appliances. We discuss policy options that could help to build a market for more efficient appliances in Poland and thereby encourage investment to produce such equipment.

  16. Quality Improvement on the Acute Inpatient Psychiatry Unit Using the Model for Improvement

    PubMed Central

    Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean

    2013-01-01

    Background: A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. Methods: We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients—those starting or continuing on standing neuroleptics—with the Abnormal Involuntary Movement Scale (AIMS). Results: After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Conclusion: Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team. PMID:24052768

  17. Quality improvement on the acute inpatient psychiatry unit using the model for improvement.

    PubMed

    Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean

    2013-01-01

    A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients-those starting or continuing on standing neuroleptics-with the Abnormal Involuntary Movement Scale (AIMS). After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team.

  18. High-precision radiometric tracking for planetary approach and encounter in the inner solar system

    NASA Technical Reports Server (NTRS)

    Christensen, C. S.; Thurman, S. W.; Davidson, J. M.; Finger, M. H.; Folkner, W. M.

    1989-01-01

    The benefits of improved radiometric tracking data have been studied for planetary approach within the inner Solar System using the Mars Rover Sample Return trajectory as a model. It was found that the benefit of improved data to approach and encounter navigation was highly dependent on the a priori uncertainties assumed for several non-estimated parameters, including those for frame-tie, Earth orientation, troposphere delay, and station locations. With these errors at their current levels, navigational performance was found to be insensitive to enhancements in data accuracy. However, when expected improvements in these errors are modeled, performance with current-accuracy data significantly improves, with substantial further improvements possible with enhancements in data accuracy.

  19. Retinal image contrast obtained by a model eye with combined correction of chromatic and spherical aberrations

    PubMed Central

    Ohnuma, Kazuhiko; Kayanuma, Hiroyuki; Lawu, Tjundewo; Negishi, Kazuno; Yamaguchi, Takefumi; Noda, Toru

    2011-01-01

    Correcting spherical and chromatic aberrations in vitro in human eyes provides substantial visual acuity and contrast sensitivity improvements. We found the same improvement in the retinal images using a model eye with/without correction of longitudinal chromatic aberrations (LCAs) and spherical aberrations (SAs). The model eye included an intraocular lens (IOL) and an artificial cornea with human ocular LCAs and average human SAs. The optotypes were illuminated using a D65 light source, and the images were obtained using a two-dimensional luminance colorimeter. The contrast improvement from the SA correction was higher than that from the LCA correction, indicating the benefit of an aspheric achromatic IOL. PMID:21698008

  20. Are we in the dark ages of environmental toxicology?

    PubMed

    McCarty, L S

    2013-12-01

    Environmental toxicity is judged to be in a "dark ages" period due to longstanding limitations in the implementation of the simple conceptual model that is the basis of current aquatic toxicity testing protocols. Fortunately, the environmental regulatory revolution of the last half-century is not substantially compromised as development of past regulatory guidance was designed to deal with limited amounts of relatively poor quality toxicity data. However, as regulatory objectives have substantially increased in breadth and depth, aquatic toxicity data derived with old testing methods are no longer adequate. In the near-term explicit model description and routine assumption validation should be mandatory. Updated testing methods could provide some improvements in toxicological data quality. A thorough reevaluation of toxicity testing objectives and methods resulting in substantially revised standard testing methods, plus a comprehensive scheme for classification of modes/mechanisms of toxic action, should be the long-term objective. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. Frailty Models for Familial Risk with Application to Breast Cancer.

    PubMed

    Gorfine, Malka; Hsu, Li; Parmigiani, Giovanni

    2013-12-01

    In evaluating familial risk for disease we have two main statistical tasks: assessing the probability of carrying an inherited genetic mutation conferring higher risk; and predicting the absolute risk of developing diseases over time, for those individuals whose mutation status is known. Despite substantial progress, much remains unknown about the role of genetic and environmental risk factors, about the sources of variation in risk among families that carry high-risk mutations, and about the sources of familial aggregation beyond major Mendelian effects. These sources of heterogeneity contribute substantial variation in risk across families. In this paper we present simple and efficient methods for accounting for this variation in familial risk assessment. Our methods are based on frailty models. We implemented them in the context of generalizing Mendelian models of cancer risk, and compared our approaches to others that do not consider heterogeneity across families. Our extensive simulation study demonstrates that when predicting the risk of developing a disease over time conditional on carrier status, accounting for heterogeneity results in a substantial improvement in the area under the curve of the receiver operating characteristic. On the other hand, the improvement for carriership probability estimation is more limited. We illustrate the utility of the proposed approach through the analysis of BRCA1 and BRCA2 mutation carriers in the Washington Ashkenazi Kin-Cohort Study of Breast Cancer.
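
    A generic shared-frailty proportional-hazards model of the kind this family of methods builds on is sketched below in standard notation; the paper's exact parametrization may differ.

```latex
% Shared (gamma) frailty proportional-hazards model for member j of family i.
% The frailty omega_i captures unexplained familial heterogeneity in risk.
\lambda_{ij}(t \mid \omega_i)
  = \omega_i\,\lambda_0(t)\,\exp\!\big(\boldsymbol{\beta}^{\top}\mathbf{Z}_{ij}\big),
\qquad
\omega_i \sim \mathrm{Gamma}\!\left(\tfrac{1}{\theta},\tfrac{1}{\theta}\right),
\quad \mathbb{E}[\omega_i] = 1,\quad \mathrm{Var}(\omega_i) = \theta.
```

    As theta approaches zero the frailty degenerates to one, recovering a model with no residual familial heterogeneity beyond the measured covariates.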

  2. Modeling Global Ocean Biogeochemistry With Physical Data Assimilation: A Pragmatic Solution to the Equatorial Instability

    NASA Astrophysics Data System (ADS)

    Park, Jong-Yeon; Stock, Charles A.; Yang, Xiaosong; Dunne, John P.; Rosati, Anthony; John, Jasmin; Zhang, Shaoqing

    2018-03-01

    Reliable estimates of historical and current biogeochemistry are essential for understanding past ecosystem variability and predicting future changes. Efforts to translate improved physical ocean state estimates into improved biogeochemical estimates, however, are hindered by high biogeochemical sensitivity to transient momentum imbalances that arise during physical data assimilation. Most notably, the breakdown of geostrophic constraints on data assimilation in equatorial regions can lead to spurious upwelling, resulting in excessive equatorial productivity and biogeochemical fluxes. This hampers efforts to understand and predict the biogeochemical consequences of El Niño and La Niña. We develop a strategy to robustly integrate an ocean biogeochemical model with an ensemble coupled-climate data assimilation system used for seasonal to decadal global climate prediction. Addressing spurious vertical velocities requires two steps. First, we find that tightening constraints on atmospheric data assimilation maintains a better equatorial wind stress and pressure gradient balance. This reduces spurious vertical velocities, but those remaining still produce substantial biogeochemical biases. The remainder is addressed by imposing stricter fidelity to model dynamics over data constraints near the equator. We determine an optimal choice of model-data weights that removed spurious biogeochemical signals while benefitting from off-equatorial constraints that still substantially improve equatorial physical ocean simulations. Compared to the unconstrained control run, the optimally constrained model reduces equatorial biogeochemical biases and markedly improves the equatorial subsurface nitrate concentrations and hypoxic area. The pragmatic approach described herein offers a means of advancing earth system prediction in parallel with continued data assimilation advances aimed at fully considering equatorial data constraints.

  3. Strategies for Energy Efficient Resource Management of Hybrid Programming Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Dong; Supinski, Bronis de; Schulz, Martin

    2013-01-01

    Many scientific applications are programmed using hybrid programming models that use both message-passing and shared-memory, due to the increasing prevalence of large-scale systems with multicore, multisocket nodes. Previous work has shown that energy efficiency can be improved using software-controlled execution schemes that consider both the programming model and the power-aware execution capabilities of the system. However, such approaches have focused on identifying optimal resource utilization for one programming model, either shared-memory or message-passing, in isolation. The potential solution space, thus the challenge, increases substantially when optimizing hybrid models since the possible resource configurations increase exponentially. Nonetheless, with the accelerating adoption of hybrid programming models, we increasingly need improved energy efficiency in hybrid parallel applications on large-scale systems. In this work, we present new software-controlled execution schemes that consider the effects of dynamic concurrency throttling (DCT) and dynamic voltage and frequency scaling (DVFS) in the context of hybrid programming models. Specifically, we present predictive models and novel algorithms based on statistical analysis that anticipate application power and time requirements under different concurrency and frequency configurations. We apply our models and methods to the NPB MZ benchmarks and selected applications from the ASC Sequoia codes. Overall, we achieve substantial energy savings (8.74% on average and up to 13.8%) with some performance gain (up to 7.5%) or negligible performance loss.
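
    The selection step that such schemes rely on can be sketched as follows: given predicted execution time and power for each (thread count, frequency) configuration, choose the one that minimizes predicted energy within an allowed slowdown. The predictor below is a mock stand-in for the paper's statistical models, and all numbers are invented.

```python
# Illustrative sketch of the configuration-selection step for combined dynamic
# concurrency throttling (DCT) and DVFS: pick the (threads, frequency) pair
# minimizing predicted energy subject to a performance-loss bound.
from itertools import product

def predict_time_and_power(threads, freq_ghz):
    """Mock predictor (invented numbers): returns (seconds, watts)."""
    time = 100.0 / (threads ** 0.7) * (2.4 / freq_ghz) ** 0.8
    power = 50.0 + 12.0 * threads * (freq_ghz / 2.4) ** 2.5
    return time, power

def best_configuration(thread_counts, freqs, max_slowdown=1.2):
    # Reference: the nominally fastest configuration (all threads, highest frequency)
    t_ref, _ = predict_time_and_power(max(thread_counts), max(freqs))
    best = None
    for n, f in product(thread_counts, freqs):
        t, p = predict_time_and_power(n, f)
        if t > max_slowdown * t_ref:
            continue                      # violates the performance bound
        energy = t * p
        if best is None or energy < best[0]:
            best = (energy, n, f, t, p)
    return best

energy, n, f, t, p = best_configuration([4, 8, 16], [1.6, 2.0, 2.4])
print(f"pick {n} threads @ {f} GHz: {t:.1f} s, {p:.0f} W, {energy:.0f} J")
```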

  4. The Point of No Return? Interest Groups, School Board Elections and the Sustainment of the Portfolio Management Model in Post-Katrina New Orleans

    ERIC Educational Resources Information Center

    Welsh, Richard; Hall, Michelle

    2018-01-01

    Context: Given the growing popularity of the portfolio management model (PMM) as a method of improving education, it is important to examine how these market-based reforms are sustained over time and how the politics of sustaining this model have substantial policy implications. Purpose of Study: The purpose of this article is to examine important…

  5. Using the SWAT model to improve process descriptions and define hydrologic partitioning in South Korea

    NASA Astrophysics Data System (ADS)

    Shope, C. L.; Maharjan, G. R.; Tenhunen, J.; Seo, B.; Kim, K.; Riley, J.; Arnhold, S.; Koellner, T.; Ok, Y. S.; Peiffer, S.; Kim, B.; Park, J.-H.; Huwe, B.

    2014-02-01

    Watershed-scale modeling can be a valuable tool to aid in quantification of water quality and yield; however, several challenges remain. In many watersheds, it is difficult to adequately quantify hydrologic partitioning. Data scarcity is prevalent, accuracy of spatially distributed meteorology is difficult to quantify, forest encroachment and land use issues are common, and surface water and groundwater abstractions substantially modify watershed-based processes. Our objective is to assess the capability of the Soil and Water Assessment Tool (SWAT) model to capture event-based and long-term monsoonal rainfall-runoff processes in complex mountainous terrain. To accomplish this, we developed a unique quality-control, gap-filling algorithm for interpolation of high-frequency meteorological data. We used a novel multi-location, multi-optimization calibration technique to improve estimations of catchment-wide hydrologic partitioning. The interdisciplinary model was calibrated to a unique combination of statistical, hydrologic, and plant growth metrics. Our results indicate scale-dependent sensitivity of hydrologic partitioning and substantial influence of engineered features. The addition of hydrologic and plant growth objective functions identified the importance of culverts in catchment-wide flow distribution. While this study shows the challenges of applying the SWAT model to complex terrain and extreme environments, it also shows that by incorporating anthropogenic features into modeling scenarios we can enhance our understanding of the hydroecological impact.

  6. Improving assessment and modelling of climate change impacts on global terrestrial biodiversity.

    PubMed

    McMahon, Sean M; Harrison, Sandy P; Armbruster, W Scott; Bartlein, Patrick J; Beale, Colin M; Edwards, Mary E; Kattge, Jens; Midgley, Guy; Morin, Xavier; Prentice, I Colin

    2011-05-01

    Understanding how species and ecosystems respond to climate change has become a major focus of ecology and conservation biology. Modelling approaches provide important tools for making future projections, but current models of the climate-biosphere interface remain overly simplistic, undermining the credibility of projections. We identify five ways in which substantial advances could be made in the next few years: (i) improving the accessibility and efficiency of biodiversity monitoring data, (ii) quantifying the main determinants of the sensitivity of species to climate change, (iii) incorporating community dynamics into projections of biodiversity responses, (iv) accounting for the influence of evolutionary processes on the response of species to climate change, and (v) improving the biophysical rule sets that define functional groupings of species in global models. Published by Elsevier Ltd.

  7. Supply-chain management: exceeding the customer's expectations.

    PubMed

    Ramsay, B

    2000-10-01

    Driven by increasing competition, manufacturers are desperate to cut costs and are looking for increased efficiency and customer service from their supply chains. E-commerce offers a new model of supply and demand, but many companies do not have the processes in place to support this new model. By implementing the techniques discussed here they can achieve substantial improvements in performance.

  8. IMPROVED ALGORITHMS FOR RADAR-BASED RECONSTRUCTION OF ASTEROID SHAPES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenberg, Adam H.; Margot, Jean-Luc

    We describe our implementation of a global-parameter optimizer and Square Root Information Filter into the asteroid-modeling software shape. We compare the performance of our new optimizer with that of the existing sequential optimizer when operating on various forms of simulated data and actual asteroid radar data. In all cases, the new implementation performs substantially better than its predecessor: it converges faster, produces shape models that are more accurate, and solves for spin axis orientations more reliably. We discuss potential future changes to improve shape's fitting speed and accuracy.

  9. The Agricultural Model Intercomparison and Improvement Project: Phase I Activities by a Global Community of Science. Chapter 1

    NASA Technical Reports Server (NTRS)

    Rosenzweig, Cynthia E.; Jones, James W.; Hatfield, Jerry L.; Antle, John M.; Ruane, Alexander C.; Mutter, Carolyn Z.

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) was founded in 2010. Its mission is to improve substantially the characterization of world food security as affected by climate variability and change, and to enhance adaptation capacity in both developing and developed countries. The objectives of AgMIP are to: incorporate state-of-the-art climate, crop/livestock, and agricultural economic model improvements into coordinated multi-model regional and global assessments of future climate impacts, adaptation, and other key aspects of the food system; utilize multiple models, scenarios, locations, crops/livestock, and participants to explore uncertainty and the impact of data and methodological choices; collaborate with regional experts in agronomy, animal sciences, economics, and climate to build a strong basis for model applications, addressing key climate-related questions and sustainable intensification of farming systems; improve scientific and adaptive capacity in modeling for major agricultural regions in the developing and developed world, with a focus on vulnerable regions; improve agricultural data and enhance data-sharing based on their intercomparison and evaluation using best scientific practices; and develop modeling frameworks to identify and evaluate promising adaptation technologies and policies and to prioritize strategies.

  10. Sensitivity analyses of factors influencing CMAQ performance for fine particulate nitrate.

    PubMed

    Shimadera, Hikari; Hayami, Hiroshi; Chatani, Satoru; Morino, Yu; Mori, Yasuaki; Morikawa, Tazuko; Yamaji, Kazuyo; Ohara, Toshimasa

    2014-04-01

    Improvement of air quality models is required so that they can be utilized to design effective control strategies for fine particulate matter (PM2.5). The Community Multiscale Air Quality modeling system was applied to the Greater Tokyo Area of Japan in winter 2010 and summer 2011. The model results were compared with observed concentrations of PM2.5 sulfate (SO4(2-)), nitrate (NO3(-)) and ammonium, and gaseous nitric acid (HNO3) and ammonia (NH3). The model approximately reproduced PM2.5 SO4(2-) concentration, but clearly overestimated PM2.5 NO3(-) concentration, which was attributed to overestimation of production of ammonium nitrate (NH4NO3). This study conducted sensitivity analyses of factors associated with the model performance for PM2.5 NO3(-) concentration, including temperature and relative humidity, emission of nitrogen oxides, seasonal variation of NH3 emission, HNO3 and NH3 dry deposition velocities, and heterogeneous reaction probability of dinitrogen pentoxide. Change in NH3 emission directly affected NH3 concentration, and substantially affected NH4NO3 concentration. Higher dry deposition velocities of HNO3 and NH3 led to substantial reductions of concentrations of the gaseous species and NH4NO3. Because uncertainties in NH3 emission and dry deposition processes are probably large, these processes may be key factors for improvement of the model performance for PM2.5 NO3(-). The Community Multiscale Air Quality modeling system clearly overestimated the concentration of fine particulate nitrate in the Greater Tokyo Area of Japan, which was attributed to overestimation of production of ammonium nitrate. Sensitivity analyses were conducted for factors associated with the model performance for nitrate. Ammonia emission and dry deposition of nitric acid and ammonia may be key factors for improvement of the model performance.

  11. HST image restoration: A comparison of pre- and post-servicing mission results

    NASA Technical Reports Server (NTRS)

    Hanisch, R. J.; Mo, J.

    1992-01-01

    A variety of image restoration techniques (e.g., Wiener filter, Lucy-Richardson, MEM) have been applied quite successfully to the aberrated HST images. The HST servicing mission (scheduled for late 1993 or early 1994) will install a corrective optics system (COSTAR) for the Faint Object Camera and spectrographs and replace the Wide Field/Planetary Camera with a second generation instrument (WF/PC-II) having its own corrective elements. The image quality is expected to be improved substantially with these new instruments. What then is the role of image restoration for the HST in the long term? Through a series of numerical experiments using model point-spread functions for both aberrated and unaberrated optics, we find that substantial improvements in image resolution can be obtained for post-servicing mission data using the same or similar algorithms as being employed now to correct aberrated images. Included in our investigations are studies of the photometric integrity of the restoration algorithms and explicit models for HST pointing errors (spacecraft jitter).

  12. A Thermal Runaway Failure Model for Low-Voltage BME Ceramic Capacitors with Defects

    NASA Technical Reports Server (NTRS)

    Teverovsky, Alexander

    2017-01-01

    Reliability of base metal electrode (BME) multilayer ceramic capacitors (MLCCs), which until recently were used mostly in commercial applications, has been improved substantially by using new materials and processes. Currently, the time to inception of intrinsic wear-out failures in high-quality capacitors is much greater than the mission duration in most high-reliability applications. However, in capacitors with defects, degradation processes might accelerate substantially and cause infant mortality failures. In this work, a physical model that relates the presence of defects to reduction of breakdown voltages and decreasing times to failure has been suggested. The effect of the defect size has been analyzed using a thermal runaway model of failures. The adequacy of highly accelerated life testing (HALT) to predict reliability at normal operating conditions and the limitations of voltage acceleration are considered. The applicability of the model to BME capacitors with cracks is discussed and validated experimentally.

  13. Interactions between Flight Dynamics and Propulsion Systems of Air-Breathing Hypersonic Vehicles

    DTIC Science & Technology

    2013-01-01

    The available excerpt describes the propulsion-flowpath components (an inlet coupled with the combustor; a combustor for subsonic or supersonic combustion; a nozzle that expands the flow for high thrust and may provide lift) and a supersonic solution method used for both the inlet and nozzle components. The supersonic model SAMURI is a substantial improvement over previous models and handles purely supersonic inviscid flow; as a result, the model is also appropriate for other applications, including the nozzle.

  14. The use of acoustically tuned resonators to improve the sound transmission loss of double panel partitions

    NASA Astrophysics Data System (ADS)

    Mason, J. M.; Fahy, F. J.

    1986-10-01

    The effectiveness of tuned Helmholtz resonators connected to the partition cavity of double-leaf partitions, used in situations requiring low-weight structures with high transmission loss, is investigated as a method of improving sound transmission loss. This is demonstrated by a simple theoretical model and then experimentally verified. Results show that substantial improvements may be obtained at and around the mass-air-mass frequency for a total resonator volume of 15 percent of the cavity volume.

  15. Numerical Computation of a Continuous-thrust State Transition Matrix Incorporating Accurate Hardware and Ephemeris Models

    NASA Technical Reports Server (NTRS)

    Ellison, Donald; Conway, Bruce; Englander, Jacob

    2015-01-01

    A significant body of work exists showing that providing a nonlinear programming (NLP) solver with expressions for the problem constraint gradient substantially increases the speed of program execution and can also improve the robustness of convergence, especially for local optimizers. Calculation of these derivatives is often accomplished through computation of the spacecraft's state transition matrix (STM). If the two-body gravitational model is employed, as is often done in the context of preliminary design, closed-form expressions for these derivatives may be provided. If a high-fidelity dynamics model is used, one that might include perturbing forces such as the gravitational effect of multiple third bodies and solar radiation pressure, then these STMs must be computed numerically. We present a method for the power hardware model and a full ephemeris model. An adaptive-step embedded eighth-order Dormand-Prince numerical integrator is discussed, and a method for the computation of the time-of-flight derivatives in this framework is presented. The use of these numerically calculated derivatives offers a substantial improvement over finite differencing in the context of a global optimizer. Specifically, the inclusion of these STMs into the low-thrust mission design tool chain in use at NASA Goddard Space Flight Center allows for an increased preliminary mission design cadence.
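As a rough illustration of numerically propagating an STM with an adaptive-step embedded eighth-order Dormand-Prince integrator, the sketch below augments simple two-body dynamics with the variational equations and integrates them with SciPy's DOP853 method. It is a simplified stand-in, not the tool described in the record: the gravitational parameter, the initial orbit, and the omission of third-body and solar-radiation-pressure perturbations are all assumptions made for brevity.

```python
import numpy as np
from scipy.integrate import solve_ivp

MU = 3.986004418e5  # Earth GM, km^3/s^2 (assumed value for the sketch)

def eom_with_stm(t, z):
    """Two-body dynamics augmented with the state transition matrix.
    z = [r (3), v (3), Phi (36 flattened)]; dPhi/dt = A(t) Phi."""
    r, v = z[:3], z[3:6]
    Phi = z[6:].reshape(6, 6)
    rn = np.linalg.norm(r)
    a = -MU * r / rn**3
    # Gravity gradient d(a)/d(r)
    G = MU * (3.0 * np.outer(r, r) / rn**5 - np.eye(3) / rn**3)
    A = np.zeros((6, 6))
    A[:3, 3:] = np.eye(3)
    A[3:, :3] = G
    dPhi = A @ Phi
    return np.concatenate([v, a, dPhi.ravel()])

# Initial state: roughly a 500 km circular orbit (illustrative numbers only).
r0 = np.array([6878.0, 0.0, 0.0])
v0 = np.array([0.0, np.sqrt(MU / 6878.0), 0.0])
z0 = np.concatenate([r0, v0, np.eye(6).ravel()])

# DOP853 is SciPy's adaptive-step embedded eighth-order Dormand-Prince integrator.
sol = solve_ivp(eom_with_stm, (0.0, 3600.0), z0, method="DOP853", rtol=1e-10, atol=1e-12)
Phi_t = sol.y[6:, -1].reshape(6, 6)
print("STM after one hour:\n", Phi_t)
```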

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, L.; Britt, J.; Birkmire, R.

    ITN Energy Systems, Inc., and Global Solar Energy, Inc., assisted by NREL's PV Manufacturing R&D program, have continued to advance CIGS production technology by developing trajectory-oriented predictive/control models, fault-tolerance control, control-platform development, in-situ sensors, and process improvements. Modeling activities included developing physics-based and empirical models for CIGS and sputter-deposition processing, implementing model-based control, and applying predictive models to the construction of new evaporation sources and for control. Model-based control is enabled by implementing reduced or empirical models into a control platform. Reliability improvement activities include implementing preventive maintenance schedules; detecting failed sensors/equipment and reconfiguring to continue processing; and systematically developing fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which, in turn, has been enabled by control and reliability improvements due to this PV Manufacturing R&D program.

  17. Protein homology model refinement by large-scale energy optimization.

    PubMed

    Park, Hahnbeom; Ovchinnikov, Sergey; Kim, David E; DiMaio, Frank; Baker, David

    2018-03-20

    Proteins fold to their lowest free-energy structures, and hence the most straightforward way to increase the accuracy of a partially incorrect protein structure model is to search for the lowest-energy nearby structure. This direct approach has met with little success for two reasons: first, energy function inaccuracies can lead to false energy minima, resulting in model degradation rather than improvement; and second, even with an accurate energy function, the search problem is formidable because the energy only drops considerably in the immediate vicinity of the global minimum, and there are a very large number of degrees of freedom. Here we describe a large-scale energy optimization-based refinement method that incorporates advances in both search and energy function accuracy that can substantially improve the accuracy of low-resolution homology models. The method refined low-resolution homology models into correct folds for 50 of 84 diverse protein families and generated improved models in recent blind structure prediction experiments. Analyses of the basis for these improvements reveal contributions from both the improvements in conformational sampling techniques and the energy function.

  18. Small angle X-ray scattering and cross-linking for data assisted protein structure prediction in CASP 12 with prospects for improved accuracy.

    PubMed

    Ogorzalek, Tadeusz L; Hura, Greg L; Belsom, Adam; Burnett, Kathryn H; Kryshtafovych, Andriy; Tainer, John A; Rappsilber, Juri; Tsutakawa, Susan E; Fidelis, Krzysztof

    2018-03-01

    Experimental data offers empowering constraints for structure prediction. These constraints can be used to filter equivalently scored models or more powerfully within optimization functions toward prediction. In CASP12, Small Angle X-ray Scattering (SAXS) and Cross-Linking Mass Spectrometry (CLMS) data, measured on an exemplary set of novel fold targets, were provided to the CASP community of protein structure predictors. As solution-based techniques, SAXS and CLMS can efficiently measure states of the full-length sequence in its native solution conformation and assembly. However, this experimental data did not substantially improve prediction accuracy judged by fits to crystallographic models. One issue, beyond intrinsic limitations of the algorithms, was a disconnect between crystal structures and solution-based measurements. Our analyses show that many targets had substantial percentages of disordered regions (up to 40%) or were multimeric or both. Thus, solution measurements of flexibility and assembly support variations that may confound prediction algorithms trained on crystallographic data and expecting globular fully-folded monomeric proteins. Here, we consider the CLMS and SAXS data collected, the information in these solution measurements, and the challenges in incorporating them into computational prediction. As improvement opportunities were only partly realized in CASP12, we provide guidance on how data from the full-length biological unit and the solution state can better aid prediction of the folded monomer or subunit. We furthermore describe strategic integrations of solution measurements with computational prediction programs with the aim of substantially improving foundational knowledge and the accuracy of computational algorithms for biologically-relevant structure predictions for proteins in solution. © 2018 Wiley Periodicals, Inc.

  19. Getting It Right: Designing Principal Preparation Programs That Meet District Needs for Improving Low-Performing Schools. A Technical Report on Innovative Principal Preparation Models

    ERIC Educational Resources Information Center

    Fry-Ahearn, Betty; Collins, David

    2016-01-01

    A grant from the School Leadership Program sponsored by the U.S. Department of Education during 2008-14 provided the opportunities and resources for SREB to bring together its cutting-edge knowledge base, field experience, and substantial bank of publications and training materials in the closely related fields of school improvement and school…

  20. Evaluating the performance of infectious disease forecasts: A comparison of climate-driven and seasonal dengue forecasts for Mexico.

    PubMed

    Johansson, Michael A; Reich, Nicholas G; Hota, Aditi; Brownstein, John S; Santillana, Mauricio

    2016-09-26

    Dengue viruses, which infect millions of people per year worldwide, cause large epidemics that strain healthcare systems. Despite diverse efforts to develop forecasting tools including autoregressive time series, climate-driven statistical, and mechanistic biological models, little work has been done to understand the contribution of different components to improved prediction. We developed a framework to assess and compare dengue forecasts produced from different types of models and evaluated the performance of seasonal autoregressive models with and without climate variables for forecasting dengue incidence in Mexico. Climate data did not significantly improve the predictive power of seasonal autoregressive models. Short-term and seasonal autocorrelation were key to improving short-term and long-term forecasts, respectively. Seasonal autoregressive models captured a substantial amount of dengue variability, but better models are needed to improve dengue forecasting. This framework contributes to the sparse literature of infectious disease prediction model evaluation, using state-of-the-art validation techniques such as out-of-sample testing and comparison to an appropriate reference model.
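A minimal sketch of the comparison described here, under the assumption that "seasonal autoregressive models with and without climate variables" can be represented by a SARIMAX model with an optional exogenous regressor. The monthly series, the climate covariate, and the (1,0,0)(1,0,0,12) orders below are synthetic placeholders, not the Mexican surveillance data or the authors' exact specification.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
n = 144  # 12 years of monthly values (synthetic)
month = np.arange(n)
climate = 1.0 + 0.5 * np.sin(2 * np.pi * month / 12 + 0.5) + 0.1 * rng.normal(size=n)
dengue = 50 + 30 * np.sin(2 * np.pi * month / 12) + 5 * climate + rng.normal(0, 5, size=n)

train, test = slice(0, 120), slice(120, 132)

def forecast_error(use_climate):
    """Fit a seasonal AR model, optionally with the climate covariate,
    and score a 12-month out-of-sample forecast by mean absolute error."""
    exog_tr = climate[train, None] if use_climate else None
    exog_te = climate[test, None] if use_climate else None
    model = SARIMAX(dengue[train], exog=exog_tr,
                    order=(1, 0, 0), seasonal_order=(1, 0, 0, 12))
    res = model.fit(disp=False)
    fc = res.forecast(steps=12, exog=exog_te)
    return np.mean(np.abs(fc - dengue[test]))

print("MAE, seasonal AR only      :", forecast_error(False))
print("MAE, seasonal AR + climate :", forecast_error(True))
```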

  1. Evaluating the performance of infectious disease forecasts: A comparison of climate-driven and seasonal dengue forecasts for Mexico

    PubMed Central

    Johansson, Michael A.; Reich, Nicholas G.; Hota, Aditi; Brownstein, John S.; Santillana, Mauricio

    2016-01-01

    Dengue viruses, which infect millions of people per year worldwide, cause large epidemics that strain healthcare systems. Despite diverse efforts to develop forecasting tools including autoregressive time series, climate-driven statistical, and mechanistic biological models, little work has been done to understand the contribution of different components to improved prediction. We developed a framework to assess and compare dengue forecasts produced from different types of models and evaluated the performance of seasonal autoregressive models with and without climate variables for forecasting dengue incidence in Mexico. Climate data did not significantly improve the predictive power of seasonal autoregressive models. Short-term and seasonal autocorrelation were key to improving short-term and long-term forecasts, respectively. Seasonal autoregressive models captured a substantial amount of dengue variability, but better models are needed to improve dengue forecasting. This framework contributes to the sparse literature of infectious disease prediction model evaluation, using state-of-the-art validation techniques such as out-of-sample testing and comparison to an appropriate reference model. PMID:27665707

  2. Communication: Energy transfer and reaction dynamics for DCl scattering on Au(111): An ab initio molecular dynamics study.

    PubMed

    Kolb, Brian; Guo, Hua

    2016-07-07

    Scattering and dissociative chemisorption of DCl on Au(111) are investigated using ab initio molecular dynamics with a slab model, in which the top two layers of Au are mobile. Substantial kinetic energy loss in the scattered DCl is found, but the amount of energy transfer is notably smaller than that observed in the experiment. On the other hand, the dissociative chemisorption probability reproduces the experimental trend with respect to the initial kinetic energy, but is about one order of magnitude larger than the reported initial sticking probability. While the theory-experiment agreement is significantly improved from the previous rigid surface model, the remaining discrepancies are still substantial, calling for further scrutiny in both theory and experiment.

  3. Image-optimized Coronal Magnetic Field Models

    NASA Astrophysics Data System (ADS)

    Jones, Shaela I.; Uritsky, Vadim; Davila, Joseph M.

    2017-08-01

    We have reported previously on a new method we are developing for using image-based information to improve global coronal magnetic field models. In that work, we presented early tests of the method, which proved its capability to improve global models based on flawed synoptic magnetograms, given excellent constraints on the field in the model volume. In this follow-up paper, we present the results of similar tests given field constraints of a nature that could realistically be obtained from quality white-light coronagraph images of the lower corona. We pay particular attention to difficulties associated with the line-of-sight projection of features outside of the assumed coronagraph image plane and the effect on the outcome of the optimization of errors in the localization of constraints. We find that substantial improvement in the model field can be achieved with these types of constraints, even when magnetic features in the images are located outside of the image plane.

  4. Image-Optimized Coronal Magnetic Field Models

    NASA Technical Reports Server (NTRS)

    Jones, Shaela I.; Uritsky, Vadim; Davila, Joseph M.

    2017-01-01

    We have reported previously on a new method we are developing for using image-based information to improve global coronal magnetic field models. In that work we presented early tests of the method which proved its capability to improve global models based on flawed synoptic magnetograms, given excellent constraints on the field in the model volume. In this follow-up paper we present the results of similar tests given field constraints of a nature that could realistically be obtained from quality white-light coronagraph images of the lower corona. We pay particular attention to difficulties associated with the line-of-sight projection of features outside of the assumed coronagraph image plane, and the effect on the outcome of the optimization of errors in localization of constraints. We find that substantial improvement in the model field can be achieved with this type of constraint, even when magnetic features in the images are located outside of the image plane.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony Leonard; Phillippe Chatelain; Michael Rebel

    Heavy ground vehicles, especially those involved in long-haul freight transportation, consume a significant part of our nation's energy supply. It is therefore of utmost importance to improve their efficiency, both to reduce emissions and to decrease reliance on imported oil. At highway speeds, more than half of the power consumed by a typical semi truck goes into overcoming aerodynamic drag, a fraction which increases with speed and crosswind. Thanks to better tools and increased awareness, recent years have seen substantial aerodynamic improvements by the truck industry, such as tractor/trailer height matching, radiator area reduction, and swept fairings. However, there remains substantial room for improvement as understanding of turbulent fluid dynamics grows. The group's research effort focused on vortex particle methods, a novel approach for computational fluid dynamics (CFD). Where common CFD methods solve or model the Navier-Stokes equations on a grid which stretches from the truck surface outward, vortex particle methods solve the vorticity equation on a Lagrangian basis of smooth particles and do not require a grid. They worked to advance the state of the art in vortex particle methods, improving their ability to handle the complicated, high Reynolds number flow around heavy vehicles. Specific challenges that they have addressed include finding strategies to accurately capture vorticity generation and resultant forces at the truck wall, handling the aerodynamics of spinning bodies such as tires, application of the method to the GTS model, computation time reduction through improved integration methods, a closest-point transform for particle methods in complex geometries, and work on large eddy simulation (LES) turbulence modeling.

  6. Improved parameterization for the vertical flux of dust aerosols emitted by an eroding soil

    USDA-ARS?s Scientific Manuscript database

    The representation of the dust cycle in atmospheric circulation models hinges on an accurate parameterization of the vertical dust flux at emission. However, existing parameterizations of the vertical dust flux vary substantially in their scaling with wind friction velocity, require input parameters...

  7. Predicting dimensions of personality disorder from domains and facets of the Five-Factor Model.

    PubMed

    Reynolds, S K; Clark, L A

    2001-04-01

    We compared the utility of several trait models for describing personality disorder in a heterogeneous clinical sample (N = 94). Participants completed the Schedule for Nonadaptive and Adaptive Personality (SNAP; Clark, 1993b), a self-report measure that assesses traits relevant to personality disorder, and two measures of the Five-Factor Model: the Revised NEO Personality Inventory (NEO-PI-R; Costa and McCrae, 1992) and the Big Five Inventory (BFI; John, Donahue, & Kentle, 1991). Regression analyses indicated substantial overlap between the SNAP scales and the NEO-PI-R facets. In addition, use of the NEO-PI-R facets afforded substantial improvement over the Five-Factor Model domains in predicting interview-based ratings of DSM-IV personality disorder (American Psychiatric Association, 1994), such that the NEO facets and the SNAP scales demonstrated roughly equivalent levels of predictive power. Results support assessment of the full range of NEO-PI-R facets over the Five-Factor Model domains for both research and clinical use.
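As a hedged illustration of the analytic idea, comparing domain-level and facet-level predictors of a criterion by regression, the sketch below cross-validates linear models on synthetic data. The 30 "facet" scores, the derivation of "domain" scores as facet means, and the simulated personality-disorder rating are all assumptions for demonstration; they are not the NEO-PI-R, SNAP, or interview data used in the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 94
facets = rng.normal(size=(n, 30))          # stand-in for 30 facet scores
domains = facets.reshape(n, 5, 6).mean(2)  # 5 domain scores as facet means (simplification)
# Synthetic criterion loosely driven by a few facets, mimicking an interview-based rating.
rating = facets[:, [2, 7, 19]] @ np.array([0.6, -0.4, 0.5]) + rng.normal(0, 1, n)

for name, X in [("domains", domains), ("facets", facets)]:
    r2 = cross_val_score(LinearRegression(), X, rating, cv=5, scoring="r2")
    print(f"{name:7s} mean cross-validated R^2 = {r2.mean():.2f}")
```

In this toy setup the facet-level model recovers more criterion variance than the domain-level model, mirroring the direction of the reported finding without reproducing it.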

  8. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  9. Anisotropic path modeling to assess pedestrian-evacuation potential from Cascadia-related tsunamis in the US Pacific Northwest

    USGS Publications Warehouse

    Wood, Nathan J.; Schmidtlein, Mathew C.

    2012-01-01

    Recent disasters highlight the threat that tsunamis pose to coastal communities. When developing tsunami-education efforts and vertical-evacuation strategies, emergency managers need to understand how much time it could take for a coastal population to reach higher ground before tsunami waves arrive. To improve efforts to model pedestrian evacuations from tsunamis, we examine the sensitivity of least-cost-distance models to variations in modeling approaches, data resolutions, and travel-rate assumptions. We base our observations on the assumption that an anisotropic approach that uses path-distance algorithms and accounts for variations in land cover and directionality in slope is the most realistic representation of an actual evacuation landscape. We focus our efforts on the Long Beach Peninsula in Washington (USA), where a substantial residential and tourist population is threatened by near-field tsunamis related to a potential Cascadia subduction zone earthquake. Results indicate thousands of people are located in areas where evacuations to higher ground will be difficult before arrival of the first tsunami wave. Deviations from anisotropic modeling assumptions substantially influence the amount of time likely needed to reach higher ground. Across the entire study, changes in resolution of elevation data have a greater impact on calculated travel times than changes in land-cover resolution. In particular areas, land-cover resolution had a substantial impact when travel-inhibiting waterways were not reflected in small-scale data. Changes in travel-speed parameters also had a substantial impact, suggesting the importance of public-health campaigns as a tsunami risk-reduction strategy.

  10. Protocols for Molecular Modeling with Rosetta3 and RosettaScripts

    PubMed Central

    2016-01-01

    Previously, we published an article providing an overview of the Rosetta suite of biomacromolecular modeling software and a series of step-by-step tutorials [Kaufmann, K. W., et al. (2010) Biochemistry 49, 2987–2998]. The overwhelmingly positive response to this publication motivates us to share here the next iteration of these tutorials, which feature de novo folding, comparative modeling, loop construction, protein docking, small molecule docking, and protein design. This updated and expanded set of tutorials is needed because, since 2010, Rosetta has been fully redesigned into the object-oriented protein modeling program Rosetta3. Notable improvements include a substantially improved energy function, an XML-like language termed “RosettaScripts” for flexibly specifying modeling tasks, new analysis tools, the addition of the TopologyBroker to control conformational sampling, and support for multiple templates in comparative modeling. Rosetta’s ability to model systems with symmetric proteins, membrane proteins, noncanonical amino acids, and RNA has also been greatly expanded and improved. PMID:27490953

  11. Thermal modelling using discrete vasculature for thermal therapy: a review

    PubMed Central

    Kok, H.P.; Gellermann, J.; van den Berg, C.A.T.; Stauffer, P.R.; Hand, J.W.; Crezee, J.

    2013-01-01

    Reliable temperature information during clinical hyperthermia and thermal ablation is essential for adequate treatment control, but conventional temperature measurements do not provide 3D temperature information. Treatment planning is a very useful tool to improve treatment quality and substantial progress has been made over the last decade. Thermal modelling is a very important and challenging aspect of hyperthermia treatment planning. Various thermal models have been developed for this purpose, with varying complexity. Since blood perfusion is such an important factor in thermal redistribution of energy in in vivo tissue, thermal simulations are most accurately performed by modelling discrete vasculature. This review describes the progress in thermal modelling with discrete vasculature for the purpose of hyperthermia treatment planning and thermal ablation. There has been significant progress in thermal modelling with discrete vasculature. Recent developments have made real-time simulations possible, which can provide feedback during treatment for improved therapy. Future clinical application of thermal modelling with discrete vasculature in hyperthermia treatment planning is expected to further improve treatment quality. PMID:23738700

  12. Data worth and prediction uncertainty for pesticide transport and fate models in Nebraska and Maryland, United States

    USGS Publications Warehouse

    Nolan, Bernard T.; Malone, Robert W.; Doherty, John E.; Barbash, Jack E.; Ma, Liwang; Shaner, Dale L.

    2015-01-01

    CONCLUSIONS: Although the observed data were sparse, they substantially reduced prediction uncertainty in unsampled regions of pesticide breakthrough curves. Nitrate evidently functioned as a surrogate for soil hydraulic data in well-drained loam soils conducive to conservative transport of nitrogen. Pesticide properties and macropore parameters could most benefit from improved characterization further to reduce model misfit and prediction uncertainty.

  13. High-End Climate Science: Development of Modeling and Related Computing Capabilities

    DTIC Science & Technology

    2000-12-01

    The available excerpt indicates that the Program has supported research directed toward strengthening work on key scientific issues, leading to substantial increases in knowledge and improved modeling capabilities. Overall direction and executive oversight of the USGCRP are provided within a framework in which agencies manage and coordinate Federally supported scientific research. High-end climate models are critical for the U.S. Global Change Research Program; such models can be used to look backward to test the consistency of our knowledge of the Earth system.

  14. Histogram equalization with Bayesian estimation for noise robust speech recognition.

    PubMed

    Suh, Youngjoo; Kim, Hoirin

    2018-02-01

    The histogram equalization approach is an efficient feature normalization technique for noise robust automatic speech recognition. However, it suffers from performance degradation when some fundamental conditions are not satisfied in the test environment. To remedy these limitations of the original histogram equalization methods, class-based histogram equalization approach has been proposed. Although this approach showed substantial performance improvement under noise environments, it still suffers from performance degradation due to the overfitting problem when test data are insufficient. To address this issue, the proposed histogram equalization technique employs the Bayesian estimation method in the test cumulative distribution function estimation. It was reported in a previous study conducted on the Aurora-4 task that the proposed approach provided substantial performance gains in speech recognition systems based on the acoustic modeling of the Gaussian mixture model-hidden Markov model. In this work, the proposed approach was examined in speech recognition systems with deep neural network-hidden Markov model (DNN-HMM), the current mainstream speech recognition approach where it also showed meaningful performance improvement over the conventional maximum likelihood estimation-based method. The fusion of the proposed features with the mel-frequency cepstral coefficients provided additional performance gains in DNN-HMM systems, which otherwise suffer from performance degradation in the clean test condition.
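The conventional histogram-equalization step that this work builds on can be sketched as mapping test feature values through their empirical CDF onto the reference (training) distribution. The code below is such a baseline sketch using synthetic Gaussian features; the Bayesian estimation of the test CDF described in the record is not implemented here, and all feature values are invented for illustration.

```python
import numpy as np

def histogram_equalize(test_feat, ref_feat):
    """Map each test feature value through its empirical CDF onto the
    reference (training) distribution, so the normalized test features
    follow the reference histogram."""
    order = np.argsort(test_feat)
    ranks = np.empty_like(order, dtype=float)
    ranks[order] = (np.arange(test_feat.size) + 0.5) / test_feat.size  # empirical CDF in (0, 1)
    ref_sorted = np.sort(ref_feat)
    ref_cdf = (np.arange(ref_feat.size) + 0.5) / ref_feat.size
    return np.interp(ranks, ref_cdf, ref_sorted)  # inverse reference CDF

# Toy example: clean training feature values vs. noisy, shifted test values.
rng = np.random.default_rng(0)
ref = rng.normal(0.0, 1.0, 5000)
test = rng.normal(1.5, 2.0, 800)
eq = histogram_equalize(test, ref)
print("test mean/std before:", test.mean().round(2), test.std().round(2))
print("test mean/std after :", eq.mean().round(2), eq.std().round(2))
```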

  15. Adverse Selection and Inertia in Health Insurance Markets: When Nudging Hurts.

    PubMed

    Handel, Benjamin R

    2013-12-01

    This paper investigates consumer inertia in health insurance markets, where adverse selection is a potential concern. We leverage a major change to insurance provision that occurred at a large firm to identify substantial inertia, and develop and estimate a choice model that also quantifies risk preferences and ex ante health risk. We use these estimates to study the impact of policies that nudge consumers toward better decisions by reducing inertia. When aggregated, these improved individual-level choices substantially exacerbate adverse selection in our setting, leading to an overall reduction in welfare that doubles the existing welfare loss from adverse selection.

  16. Quantitative cardiac SPECT reconstruction with reduced image degradation due to patient anatomy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsui, B.M.W.; Zhao, X.D.; Gregoriou, G.K.

    1994-12-01

    Patient anatomy has complicated effects on cardiac SPECT images. The authors investigated reconstruction methods which substantially reduced these effects for improved image quality. A 3D mathematical cardiac-torso (MCAT) phantom, which models the anatomical structures in the thorax region, was used in the study. The phantom was modified to simulate variations in patient anatomy including regions of natural thinning along the myocardium, body size, diaphragmatic shape, gender, and size and shape of breasts for female patients. Distributions of attenuation coefficients and Tl-201 uptake in different organs in a normal patient were also simulated. Emission projection data were generated from the phantoms, including effects of attenuation and detector response. The authors have observed the attenuation-induced artifacts caused by patient anatomy in the conventional FBP reconstructed images. Accurate attenuation compensation using iterative reconstruction algorithms and attenuation maps substantially reduced the image artifacts and improved quantitative accuracy. They conclude that reconstruction methods which accurately compensate for non-uniform attenuation can substantially reduce image degradation caused by variations in patient anatomy in cardiac SPECT.
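A minimal sketch of the kind of iterative reconstruction with attenuation compensation mentioned here, assuming a generic MLEM update in which the attenuation map has already been folded into the system matrix weights. The random system matrix, phantom, and count levels are toy placeholders, not the MCAT phantom or the authors' reconstruction code.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_proj = 64, 96
# Toy system matrix; attenuation compensation amounts to building these
# weights from the attenuation map so each element holds the attenuated
# contribution of a pixel to a projection bin.
A = rng.random((n_proj, n_pix)) * (rng.random((n_proj, n_pix)) < 0.1)
x_true = rng.random(n_pix)
y = rng.poisson(A @ x_true * 50) / 50.0          # noisy emission projections

x = np.ones(n_pix)                                # uniform initial estimate
sens = A.T @ np.ones(n_proj)                      # sensitivity image
for _ in range(50):                               # MLEM iterations
    proj = A @ x
    ratio = np.divide(y, proj, out=np.zeros_like(y), where=proj > 0)
    x *= (A.T @ ratio) / np.maximum(sens, 1e-12)

print("correlation with truth:", np.corrcoef(x, x_true)[0, 1].round(3))
```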

  17. Improvements and applications of COBRA-TF for stand-alone and coupled LWR safety analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avramova, M.; Cuervo, D.; Ivanov, K.

    2006-07-01

    The advanced thermal-hydraulic subchannel code COBRA-TF has recently been improved and applied for stand-alone and coupled LWR core calculations at the Pennsylvania State Univ. in cooperation with AREVA NP GmbH (Germany) and the Technical Univ. of Madrid. To enable COBRA-TF for academic and industrial applications, including safety-margin evaluations and LWR core design analyses, the code programming, numerics, and basic models were revised and substantially improved. The code has undergone an extensive validation, verification, and qualification program. (authors)

  18. Realization of process improvement at a diagnostic radiology department with aid of simulation modeling.

    PubMed

    Oh, Hong-Choon; Toh, Hong-Guan; Giap Cheong, Eddy Seng

    2011-11-01

    Using the classical process improvement framework of Plan-Do-Study-Act (PDSA), the diagnostic radiology department of a tertiary hospital identified several patient cycle time reduction strategies. Experimentation of these strategies (which included procurement of new machines, hiring of new staff, redesign of queue system, etc.) through pilot scale implementation was impractical because it might incur substantial expenditure or be operationally disruptive. With this in mind, simulation modeling was used to test these strategies via performance of "what if" analyses. Using the output generated by the simulation model, the team was able to identify a cost-free cycle time reduction strategy, which subsequently led to a reduction of patient cycle time and achievement of a management-defined performance target. As healthcare professionals work continually to improve healthcare operational efficiency in response to rising healthcare costs and patient expectation, simulation modeling offers an effective scientific framework that can complement established process improvement framework like PDSA to realize healthcare process enhancement. © 2011 National Association for Healthcare Quality.
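As an illustration of how such "what if" analyses can be run without disrupting operations, the sketch below uses a small discrete-event queue model (built with the third-party simpy package) to compare mean patient waiting time for different numbers of scanners. The arrival rate, service time, and scenarios are hypothetical and are not the department's actual parameters or the model used in the study.

```python
import random
import simpy

def patient(env, scanner, service_time, waits):
    """One patient: wait for a scanner, record the wait, then occupy it."""
    arrive = env.now
    with scanner.request() as req:
        yield req
        waits.append(env.now - arrive)
        yield env.timeout(random.expovariate(1.0 / service_time))

def arrivals(env, scanner, interarrival, service_time, waits):
    """Generate patients with exponential inter-arrival times."""
    while True:
        yield env.timeout(random.expovariate(1.0 / interarrival))
        env.process(patient(env, scanner, service_time, waits))

def run_scenario(n_scanners, interarrival=12.0, service_time=20.0, horizon=8 * 60):
    random.seed(1)
    env = simpy.Environment()
    scanner = simpy.Resource(env, capacity=n_scanners)
    waits = []
    env.process(arrivals(env, scanner, interarrival, service_time, waits))
    env.run(until=horizon)                     # one 8-hour shift, in minutes
    return sum(waits) / len(waits) if waits else 0.0

for k in (1, 2):
    print(f"{k} scanner(s): mean wait {run_scenario(k):.1f} min")
```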

  19. Improvements in the Scalability of the NASA Goddard Multiscale Modeling Framework for Hurricane Climate Studies

    NASA Technical Reports Server (NTRS)

    Shen, Bo-Wen; Tao, Wei-Kuo; Chern, Jiun-Dar

    2007-01-01

    Improving our understanding of hurricane inter-annual variability and the impact of climate change (e.g., doubling CO2 and/or global warming) on hurricanes brings both scientific and computational challenges to researchers. As hurricane dynamics involves multiscale interactions among synoptic-scale flows, mesoscale vortices, and small-scale cloud motions, an ideal numerical model suitable for hurricane studies should demonstrate its capabilities in simulating these interactions. The newly-developed multiscale modeling framework (MMF, Tao et al., 2007) and the substantial computing power provided by the NASA Columbia supercomputer show promise in pursuing the related studies, as the MMF inherits the advantages of two NASA state-of-the-art modeling components: the GEOS4/fvGCM and 2D GCEs. This article focuses on the computational issues and proposes a revised methodology to improve the MMF's performance and scalability. It is shown that this prototype implementation enables 12-fold performance improvements with 364 CPUs, thereby making it more feasible to study hurricane climate.

  20. Markov chain-incorporated and synthetic data-supported conditional artificial neural network models for forecasting monthly precipitation in arid regions

    NASA Astrophysics Data System (ADS)

    Aksoy, Hafzullah; Dahamsheh, Ahmad

    2018-07-01

    For forecasting monthly precipitation in an arid region, feed-forward back-propagation, radial basis function, and generalized regression artificial neural networks (ANNs) are used in this study. The ANN models are improved by incorporating a Markov chain-based algorithm (MC-ANNs) with which the percentage of dry months is forecasted perfectly, thus eliminating the generation of any non-physical negative precipitation. Because recorded precipitation time series are usually shorter than the length needed for a proper calibration of ANN models, synthetic monthly precipitation data are generated by the Thomas-Fiering model to further improve forecasting performance. For case studies from Jordan, it is seen that only slightly better performance is achieved with the use of MC and synthetic data. A conditional statement is therefore established and embedded into the ANN models, after the incorporation of MC and the support of synthetic data, to substantially improve the ability of the models to forecast monthly precipitation in arid regions.
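A minimal sketch of the Markov-chain and conditional-statement idea, assuming a first-order dry/wet chain estimated from the record and a simple rule that zeroes the forecast when a dry month is more likely, which also removes non-physical negative values. The synthetic precipitation series, the 0.5 probability threshold, and the placeholder "ANN amount forecast" argument are assumptions; the actual MC-ANN formulation in the paper is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic monthly precipitation record for an arid site: roughly 60% dry months.
precip = np.where(rng.random(360) < 0.6, 0.0, rng.gamma(2.0, 8.0, 360))
dry = (precip == 0.0).astype(int)

# First-order Markov chain on the dry/wet state: P[i, j] = P(next state j | current state i).
P = np.zeros((2, 2))
for a, b in zip(dry[:-1], dry[1:]):
    P[a, b] += 1
P /= P.sum(axis=1, keepdims=True)

def forecast_next_month(current_is_dry, ann_amount_forecast):
    """Conditional statement wrapping a (here hypothetical) ANN amount forecast:
    if the chain says the coming month is more likely dry, force zero precipitation,
    which also removes any non-physical negative ANN output."""
    p_dry = P[int(current_is_dry), 1]
    if p_dry > 0.5:
        return 0.0
    return max(ann_amount_forecast, 0.0)

print("transition matrix:\n", P.round(2))
print("forecast after a dry month, ANN says 3.2 mm ->", forecast_next_month(True, 3.2))
```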

  1. Atmospheric Ionizing Radiation and Human Exposure

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Mertens, Christopher J.; Goldhagen, Paul; Friedberg, W.; DeAngelis, G.; Clem, J. M.; Copeland, K.; Bidasaria, H. B.

    2005-01-01

    Atmospheric ionizing radiation is of interest, apart from its main concern of aircraft exposures, because it is a principal source of human exposure to radiations with high linear energy transfer (LET). The ionizing radiations of the lower atmosphere near the Earth's surface tend to be dominated by the terrestrial radioisotopes, especially along the coastal plain and interior low lands, and have only minor contributions from neutrons (11 percent). The world average is substantially larger, but the high-altitude cities especially have substantial contributions from neutrons (25 to 45 percent). Understanding the world distribution of neutron exposures requires an improved understanding of the latitudinal, longitudinal, altitude, and spectral distribution that depends on local terrain and time. These issues are being investigated in a combined experimental and theoretical program. This paper will give an overview of human exposures and describe the development of improved environmental models.

  2. Atmospheric Ionizing Radiation and Human Exposure

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Goldhagen, P.; Friedberg, W.; DeAngelis, G.; Clem, J. M.; Copeland, K.; Bidasaria, H. B.

    2004-01-01

    Atmospheric ionizing radiation is of interest, apart from its main concern of aircraft exposures, because it is a principal source of human exposure to radiations with high linear energy transfer (LET). The ionizing radiations of the lower atmosphere near the Earth's surface tend to be dominated by the terrestrial radioisotopes, especially along the coastal plain and interior low lands, and have only minor contributions from neutrons (11 percent). The world average is substantially larger, but the high-altitude cities especially have substantial contributions from neutrons (25 to 45 percent). Understanding the world distribution of neutron exposures requires an improved understanding of the latitudinal, longitudinal, altitude, and spectral distribution that depends on local terrain and time. These issues are being investigated in a combined experimental and theoretical program. This paper will give an overview of human exposures and describe the development of improved environmental models.

  3. GWM-2005 - A Groundwater-Management Process for MODFLOW-2005 with Local Grid Refinement (LGR) Capability

    USGS Publications Warehouse

    Ahlfeld, David P.; Baker, Kristine M.; Barlow, Paul M.

    2009-01-01

    This report describes the Groundwater-Management (GWM) Process for MODFLOW-2005, the 2005 version of the U.S. Geological Survey modular three-dimensional groundwater model. GWM can solve a broad range of groundwater-management problems by combined use of simulation- and optimization-modeling techniques. These problems include limiting groundwater-level declines or streamflow depletions, managing groundwater withdrawals, and conjunctively using groundwater and surface-water resources. GWM was initially released for the 2000 version of MODFLOW. Several modifications and enhancements have been made to GWM since its initial release to increase the scope of the program's capabilities and to improve its operation and reporting of results. The new code, which is called GWM-2005, also was designed to support the local grid refinement capability of MODFLOW-2005. Local grid refinement allows for the simulation of one or more higher resolution local grids (referred to as child models) within a coarser grid parent model. Local grid refinement is often needed to improve simulation accuracy in regions where hydraulic gradients change substantially over short distances or in areas requiring detailed representation of aquifer heterogeneity. GWM-2005 can be used to formulate and solve groundwater-management problems that include components in both parent and child models. Although local grid refinement increases simulation accuracy, it can also substantially increase simulation run times.

  4. About improving efficiency of the P3 M algorithms when computing the inter-particle forces in beam dynamics

    NASA Astrophysics Data System (ADS)

    Kozynchenko, Alexander I.; Kozynchenko, Sergey A.

    2017-03-01

    In the paper, a problem of improving the efficiency of the particle-particle-particle-mesh (P3M) algorithm in computing the inter-particle electrostatic forces is considered. The particle-mesh (PM) part of the algorithm is modified in such a way that the space field equation is solved by the direct method of summation of potentials over the ensemble of particles lying not too close to a reference particle. For this purpose, a specific matrix "pattern" is introduced to describe the spatial field distribution of a single point charge, so the "pattern" contains pre-calculated potential values. This approach makes it possible to reduce the set of arithmetic operations performed in the innermost nested loop to addition and assignment operators and, therefore, to decrease the running time substantially. The simulation model developed in C++ substantiates this view, showing decent accuracy, acceptable in particle beam calculations, together with improved speed performance.
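The pattern idea can be sketched as follows, under the assumption of a two-dimensional mesh and an arbitrary Coulomb-like kernel: a single-charge potential "pattern" is precomputed once, and the particle-mesh step simply adds a shifted copy of it for each particle. The grid size, pattern radius, and kernel form are invented for illustration, and the near-field particle-particle correction of a full P3M solver is omitted.

```python
import numpy as np

# Precomputed "pattern": potential of a unit point charge sampled on mesh offsets.
R = 8                                        # pattern half-width in cells
dx = np.arange(-R, R + 1)
DX, DY = np.meshgrid(dx, dx, indexing="ij")
dist = np.hypot(DX, DY)
pattern = np.where(dist > 0, 1.0 / np.maximum(dist, 1e-12), 0.0)   # Coulomb-like kernel, arbitrary units

def mesh_potential(positions, charges, n_cells=128):
    """PM part of a P3M-style solver: instead of solving a field equation,
    add the precomputed single-charge pattern at each particle's cell, so the
    innermost work reduces to addition and assignment."""
    phi = np.zeros((n_cells, n_cells))
    cells = np.clip(positions.astype(int), 0, n_cells - 1)
    for (i, j), q in zip(cells, charges):
        i0, i1 = max(i - R, 0), min(i + R + 1, n_cells)
        j0, j1 = max(j - R, 0), min(j + R + 1, n_cells)
        phi[i0:i1, j0:j1] += q * pattern[i0 - i + R:i1 - i + R, j0 - j + R:j1 - j + R]
    return phi

rng = np.random.default_rng(0)
pos = rng.uniform(0, 128, size=(1000, 2))
q = np.ones(1000)
print("mesh potential peak:", mesh_potential(pos, q).max().round(2))
```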

  5. Models of Marine Fish Biodiversity: Assessing Predictors from Three Habitat Classification Schemes.

    PubMed

    Yates, Katherine L; Mellin, Camille; Caley, M Julian; Radford, Ben T; Meeuwig, Jessica J

    2016-01-01

    Prioritising biodiversity conservation requires knowledge of where biodiversity occurs. Such knowledge, however, is often lacking. New technologies for collecting biological and physical data coupled with advances in modelling techniques could help address these gaps and facilitate improved management outcomes. Here we examined the utility of environmental data, obtained using different methods, for developing models of both uni- and multivariate biodiversity metrics. We tested which biodiversity metrics could be predicted best and evaluated the performance of predictor variables generated from three types of habitat data: acoustic multibeam sonar imagery, predicted habitat classification, and direct observer habitat classification. We used boosted regression trees (BRT) to model metrics of fish species richness, abundance and biomass, and multivariate regression trees (MRT) to model biomass and abundance of fish functional groups. We compared model performance using different sets of predictors and estimated the relative influence of individual predictors. Models of total species richness and total abundance performed best; those developed for endemic species performed worst. Abundance models performed substantially better than corresponding biomass models. In general, BRT and MRTs developed using predicted habitat classifications performed less well than those using multibeam data. The most influential individual predictor was the abiotic categorical variable from direct observer habitat classification and models that incorporated predictors from direct observer habitat classification consistently outperformed those that did not. Our results show that while remotely sensed data can offer considerable utility for predictive modelling, the addition of direct observer habitat classification data can substantially improve model performance. Thus it appears that there are aspects of marine habitats that are important for modelling metrics of fish biodiversity that are not fully captured by remotely sensed data. As such, the use of remotely sensed data to model biodiversity represents a compromise between model performance and data availability.
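A hedged sketch of the predictor-set comparison described here, using gradient-boosted regression trees (scikit-learn's GradientBoostingRegressor as a stand-in for the BRT implementation) and cross-validated R² on synthetic data. The predictor columns, habitat-class codes, and toy species-richness response are assumptions; the multivariate regression trees for functional groups are not reproduced.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
# Synthetic predictors standing in for the three habitat data types.
multibeam = rng.normal(size=(n, 4))             # e.g. depth, rugosity, slope, backscatter
pred_class = rng.integers(0, 5, size=(n, 1))    # predicted habitat class (categorical code)
obs_class = rng.integers(0, 5, size=(n, 1))     # direct observer habitat class (categorical code)
richness = (2.0 * multibeam[:, 1] + 1.5 * (obs_class[:, 0] == 3)
            + rng.normal(0, 1, n) + 10)         # toy species-richness response

sets = {
    "multibeam only":        multibeam,
    "multibeam + predicted": np.hstack([multibeam, pred_class]),
    "multibeam + observer":  np.hstack([multibeam, obs_class]),
}
for name, X in sets.items():
    brt = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
    score = cross_val_score(brt, X, richness, cv=5, scoring="r2")
    print(f"{name:24s} mean CV R^2 = {score.mean():.2f}")
```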

  6. Models of Marine Fish Biodiversity: Assessing Predictors from Three Habitat Classification Schemes

    PubMed Central

    Yates, Katherine L.; Mellin, Camille; Caley, M. Julian; Radford, Ben T.; Meeuwig, Jessica J.

    2016-01-01

    Prioritising biodiversity conservation requires knowledge of where biodiversity occurs. Such knowledge, however, is often lacking. New technologies for collecting biological and physical data coupled with advances in modelling techniques could help address these gaps and facilitate improved management outcomes. Here we examined the utility of environmental data, obtained using different methods, for developing models of both uni- and multivariate biodiversity metrics. We tested which biodiversity metrics could be predicted best and evaluated the performance of predictor variables generated from three types of habitat data: acoustic multibeam sonar imagery, predicted habitat classification, and direct observer habitat classification. We used boosted regression trees (BRT) to model metrics of fish species richness, abundance and biomass, and multivariate regression trees (MRT) to model biomass and abundance of fish functional groups. We compared model performance using different sets of predictors and estimated the relative influence of individual predictors. Models of total species richness and total abundance performed best; those developed for endemic species performed worst. Abundance models performed substantially better than corresponding biomass models. In general, BRT and MRTs developed using predicted habitat classifications performed less well than those using multibeam data. The most influential individual predictor was the abiotic categorical variable from direct observer habitat classification and models that incorporated predictors from direct observer habitat classification consistently outperformed those that did not. Our results show that while remotely sensed data can offer considerable utility for predictive modelling, the addition of direct observer habitat classification data can substantially improve model performance. Thus it appears that there are aspects of marine habitats that are important for modelling metrics of fish biodiversity that are not fully captured by remotely sensed data. As such, the use of remotely sensed data to model biodiversity represents a compromise between model performance and data availability. PMID:27333202

  7. School Turnarounds: Evidence from the 2009 Stimulus. NBER Working Paper No. 17990

    ERIC Educational Resources Information Center

    Dee, Thomas

    2012-01-01

    The American Recovery and Reinvestment Act of 2009 (ARRA) targeted substantial School Improvement Grants (SIGs) to the nation's "persistently lowest achieving" public schools (i.e., up to $2 million per school annually over 3 years) but required schools accepting these awards to implement a federally prescribed school-reform model.…

  8. IVHS And The Environment, New Models For Federal, State And Local Cooperation In The Application Of Advanced Transportation Systems For Environmental Improvements In Urban Areas, Executive Summary

    DOT National Transportation Integrated Search

    1994-09-01

    Intelligent vehicle highway systems (IVHS) have the potential to substantially change transportation's impact on urban air quality and other environmental aspects. Whether this impact is positive depends on how these technologies are deployed. This s...

  9. Leadership for Learning: Lessons from 40 Years of Empirical Research

    ERIC Educational Resources Information Center

    Hallinger, Philip

    2011-01-01

    Purpose: This paper aims to present a research-based model of leadership for learning. It argues that the field has made substantial progress over the past 40 years in identifying ways in which leadership contributes to learning and school improvement. Four specific dimensions of leading for learning are presented: values and beliefs, leadership…

  10. James-Stein Estimation. Program Statistics Research, Technical Report No. 89-86.

    ERIC Educational Resources Information Center

    Brandwein, Ann Cohen; Strawderman, William E.

    This paper presents an expository development of James-Stein estimation with substantial emphasis on exact results for nonnormal location models. The themes of the paper are: (1) the improvement possible over the best invariant estimator via shrinkage estimation is not surprising but expected from a variety of perspectives; (2) the amount of…

  11. Mutations in gp41 are correlated with coreceptor tropism but do not improve prediction methods substantially.

    PubMed

    Thielen, Alexander; Lengauer, Thomas; Swenson, Luke C; Dong, Winnie W Y; McGovern, Rachel A; Lewis, Marilyn; James, Ian; Heera, Jayvant; Valdez, Hernan; Harrigan, P Richard

    2011-01-01

    The main determinants of HIV-1 coreceptor usage are located in the V3-loop of gp120, although mutations in V2 and gp41 are also known. Incorporation of V2 is known to improve prediction algorithms; however, this has not been confirmed for gp41 mutations. Samples with V3 and gp41 genotypes and Trofile assay (Monogram Biosciences, South San Francisco, CA, USA) results were taken from the HOMER cohort (n=444) and from patients screened for the MOTIVATE studies (n=1,916; 859 with maraviroc outcome data). Correlations of mutations with tropism were assessed using Fisher's exact test and prediction models trained using support vector machines. Models were validated by cross-validation, by testing models from one dataset on the other, and by analysing virological outcome. Several mutations within gp41 were highly significant for CXCR4 usage; most strikingly an insertion occurring in 7.7% of HOMER-R5 and 46.3% of HOMER-X4 samples (MOTIVATE 5.7% and 25.2%, respectively). Models trained on gp41 sequence alone achieved relatively high areas under the receiver-operating characteristic curve (AUCs; HOMER 0.713 and MOTIVATE 0.736) that were almost as good as V3 models (0.773 and 0.884, respectively). However, combining the two regions improved predictions only marginally (0.813 and 0.902, respectively). Similar results were found when models were trained on HOMER and validated on MOTIVATE or vice versa. The difference in median log viral load decrease at week 24 between patients with R5 and X4 virus was 1.65 (HOMER 2.45 and MOTIVATE 0.79) for V3 models, 1.59 for gp41-models (2.42 and 0.83, respectively) and 1.58 for the combined predictor (2.44 and 0.86, respectively). Several mutations within gp41 showed strong correlation with tropism in two independent datasets. However, incorporating gp41 mutations into prediction models is not mandatory because they do not improve substantially on models trained on V3 sequences alone.
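A rough sketch of the genotype-based prediction setup, assuming one-hot-encoded aligned sequence positions fed to a linear support vector machine and evaluated by cross-validated AUC. The toy alphabet, sequence length, synthetic labels, and the single gap column standing in for the gp41 insertion are all invented; the sketch is not the HOMER or MOTIVATE analysis.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
# Hypothetical toy data: 300 aligned sequence fragments of length 30,
# with one synthetic tropism-linked column, standing in for real genotypes.
alphabet = np.array(list("ACDEFGHIKLMNPQRSTVWY-"))
n, length = 300, 30
seqs = rng.choice(alphabet, size=(n, length))
y = rng.integers(0, 2, size=n)                  # 1 = X4-capable, 0 = R5 (synthetic labels)
seqs[y == 1, 15] = "-"                          # crude stand-in for a tropism-linked insertion/gap

X = OneHotEncoder(handle_unknown="ignore").fit_transform(seqs)
clf = SVC(kernel="linear", C=1.0)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("mean cross-validated AUC:", auc.mean())
```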

  12. Data envelopment analysis in service quality evaluation: an empirical study

    NASA Astrophysics Data System (ADS)

    Najafi, Seyedvahid; Saati, Saber; Tavana, Madjid

    2015-09-01

    Service quality is often conceptualized as the comparison between service expectations and the actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. Substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A slack-based measure (SBM) of efficiency with constant inputs is used to calculate the PSQI. In addition, a non-linear programming model based on the SBM is proposed to delineate an improvement guideline and improve service quality. An empirical study is conducted to assess the applicability of the method proposed in this study. A large number of studies have used DEA as a benchmarking tool to measure service quality. These models do not propose a coherent performance evaluation construct and consequently fail to deliver guidelines for improving service quality. The DEA models proposed in this study are designed to evaluate and improve service quality within a comprehensive framework and without any dependency on external data.

  13. Sensitivity of a model projection of near-surface permafrost degradation to soil column depth and representation of soil organic matter.

    Treesearch

    David M. Lawrence; Andrew G. Slater; Vladimir E. Romanovsky; Dmitry J. Nicolsky

    2008-01-01

    The sensitivity of a global land-surface model projection of near-surface permafrost degradation is assessed with respect to explicit accounting of the thermal and hydrologic properties of soil organic matter and to a deepening of the soil column from 3.5 to 50 or more m. Together these modifications result in substantial improvements in the simulation of near-surface...

  14. The effects of changes in physical fitness on academic performance among New York City youth.

    PubMed

    Bezold, Carla P; Konty, Kevin J; Day, Sophia E; Berger, Magdalena; Harr, Lindsey; Larkin, Michael; Napier, Melanie D; Nonas, Cathy; Saha, Subir; Harris, Tiffany G; Stark, James H

    2014-12-01

    To evaluate whether a change in fitness is associated with academic outcomes in New York City (NYC) middle-school students using longitudinal data and to evaluate whether this relationship is modified by student household poverty. This was a longitudinal study of 83,111 New York City middle-school students enrolled between 2006-2007 and 2011-2012. Fitness was measured as a composite percentile based on three fitness tests and categorized based on change from the previous year. The effect of the fitness change level on academic outcomes, measured as a composite percentile based on state standardized mathematics and English Language Arts test scores, was estimated using a multilevel growth model. Models were stratified by sex, and additional models were tested stratified by student household poverty. For both girls and boys, a substantial increase in fitness from the previous year resulted in a greater improvement in academic ranking than was seen in the reference group (girls: .36 greater percentile point improvement, 95% confidence interval: .09-.63; boys: .38 greater percentile point improvement, 95% confidence interval: .09-.66). A substantial decrease in fitness was associated with a decrease in academics in both boys and girls. Effects of fitness on academics were stronger in high-poverty boys and girls than in low-poverty boys and girls. Academic rankings improved for boys and girls who increased their fitness level by >20 percentile points compared to other students. Opportunities for increased physical fitness may be important to support academic performance. Copyright © 2014 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
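The multilevel growth model can be sketched with a random-intercept mixed model, here fit with statsmodels' mixedlm on synthetic student-year data. The variable names, effect sizes, and the simple random-intercept structure are assumptions for illustration; the study's actual model, covariates, and NYC data are not reproduced.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_students, n_years = 500, 3
student = np.repeat(np.arange(n_students), n_years)
year = np.tile(np.arange(n_years), n_students)
fitness_change = rng.normal(0, 20, size=n_students * n_years)        # percentile-point change
poverty = np.repeat(rng.integers(0, 2, size=n_students), n_years)    # household poverty flag
baseline = np.repeat(rng.normal(50, 10, size=n_students), n_years)   # student-level intercept
academic = (baseline + 1.5 * year + 0.02 * fitness_change
            - 5 * poverty + rng.normal(0, 5, size=n_students * n_years))

df = pd.DataFrame(dict(student=student, year=year, academic=academic,
                       fitness_change=fitness_change, poverty=poverty))

# Random-intercept growth model: academic trajectory over years, with
# fitness change and household poverty (and their interaction) as fixed effects.
model = smf.mixedlm("academic ~ year + fitness_change * poverty", df, groups=df["student"])
result = model.fit()
print(result.summary())
```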

  15. The RAPIDD ebola forecasting challenge: Synthesis and lessons learnt.

    PubMed

    Viboud, Cécile; Sun, Kaiyuan; Gaffey, Robert; Ajelli, Marco; Fumanelli, Laura; Merler, Stefano; Zhang, Qian; Chowell, Gerardo; Simonsen, Lone; Vespignani, Alessandro

    2018-03-01

    Infectious disease forecasting is gaining traction in the public health community; however, limited systematic comparisons of model performance exist. Here we present the results of a synthetic forecasting challenge inspired by the West African Ebola crisis in 2014-2015 and involving 16 international academic teams and US government agencies, and compare the predictive performance of 8 independent modeling approaches. Challenge participants were invited to predict 140 epidemiological targets across 5 different time points of 4 synthetic Ebola outbreaks, each involving different levels of interventions and "fog of war" in outbreak data made available for predictions. Prediction targets included 1-4 week-ahead case incidences, outbreak size, peak timing, and several natural history parameters. With respect to weekly case incidence targets, ensemble predictions based on a Bayesian average of the 8 participating models outperformed any individual model and did substantially better than a null auto-regressive model. There was no relationship between model complexity and prediction accuracy; however, the top performing models for short-term weekly incidence were reactive models with few parameters, fitted to a short and recent part of the outbreak. Individual model outputs and ensemble predictions improved with data accuracy and availability; by the second time point, just before the peak of the epidemic, estimates of final size were within 20% of the target. The 4th challenge scenario - mirroring an uncontrolled Ebola outbreak with substantial data reporting noise - was poorly predicted by all modeling teams. Overall, this synthetic forecasting challenge provided a deep understanding of model performance under controlled data and epidemiological conditions. We recommend such "peace time" forecasting challenges as key elements to improve coordination and inspire collaboration between modeling groups ahead of the next pandemic threat, and to assess model forecasting accuracy for a variety of known and hypothetical pathogens. Published by Elsevier B.V.
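
    The exact Bayesian model-averaging scheme is not spelled out in this summary; a minimal weighted-ensemble sketch, with weights proportional to some recent skill score, conveys the idea (all names and numbers below are illustrative).

        import numpy as np

        def ensemble_forecast(forecasts, skill):
            """Skill-weighted average of point forecasts.

            forecasts : (n_models, n_targets) array of model predictions.
            skill     : (n_models,) nonnegative scores, e.g. recent predictive
                        likelihoods; a higher score means more weight.
            """
            w = np.asarray(skill, dtype=float)
            w = w / w.sum()                     # normalise the weights
            return w @ np.asarray(forecasts, dtype=float)

        rng = np.random.default_rng(0)
        preds = rng.poisson(100, size=(8, 4)).astype(float)   # 8 models, 4 targets
        scores = rng.uniform(0.1, 1.0, size=8)
        print(ensemble_forecast(preds, scores))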

  16. Investigation of the potential for direct compaction of a fine ibuprofen powder dry-coated with magnesium stearate.

    PubMed

    Qu, Li; Zhou, Qi Tony; Gengenbach, Thomas; Denman, John A; Stewart, Peter J; Hapgood, Karen P; Gamlen, Michael; Morton, David A V

    2015-05-01

    Intensive dry powder coating (mechanofusion) with tablet lubricants has previously been shown to give substantial powder flow improvement. This study explores whether mechanofusion of magnesium stearate (MgSt) onto a fine drug powder can substantially improve flow without preventing the powder from being directly compacted into tablets. A fine ibuprofen powder, which is both cohesive and low-melting, was dry coated via mechanofusion with between 0.1% and 5% (w/w) MgSt. Traditional low-shear blending was also employed as a comparison. No significant difference in particle size or shape was measured following mechanofusion. For the low-shear blended powders, only marginal improvement in flowability was obtained. However, after mechanofusion, substantial improvements in the flow properties were demonstrated. Both XPS and ToF-SIMS demonstrated a high degree of nano-scale MgSt coating coverage on the particle surfaces after optimized mechanofusion. Robust tablets were produced from the selected mechanofused powders at high drug loading, and tablet tensile strength was further improved by adding a polyvinylpyrrolidone (PVP) binder (10% w/w). The tablets made with the mechanofused powder (with or without PVP) also exhibited significantly lower ejection stress than those made of the raw powder, demonstrating good lubrication. Surprisingly, drug release from the tablets made with the mechanofused powder was not retarded. This is the first study to demonstrate such a single-step dry coating of a model drug with MgSt, with promising flow-aid and lubrication effects, good tabletability, and an uninhibited dissolution rate.

  17. Image-optimized Coronal Magnetic Field Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Shaela I.; Uritsky, Vadim; Davila, Joseph M., E-mail: shaela.i.jones-mecholsky@nasa.gov, E-mail: shaela.i.jonesmecholsky@nasa.gov

    We have reported previously on a new method we are developing for using image-based information to improve global coronal magnetic field models. In that work, we presented early tests of the method, which demonstrated its capability to improve global models based on flawed synoptic magnetograms, given excellent constraints on the field in the model volume. In this follow-up paper, we present the results of similar tests given field constraints of a kind that could realistically be obtained from quality white-light coronagraph images of the lower corona. We pay particular attention to difficulties associated with the line-of-sight projection of features outside of the assumed coronagraph image plane, and to the effect that errors in the localization of constraints have on the outcome of the optimization. We find that substantial improvement in the model field can be achieved with these types of constraints, even when magnetic features in the images are located outside of the image plane.

  18. SPATIO-TEMPORAL MODELING OF AGRICULTURAL YIELD DATA WITH AN APPLICATION TO PRICING CROP INSURANCE CONTRACTS

    PubMed Central

    Ozaki, Vitor A.; Ghosh, Sujit K.; Goodwin, Barry K.; Shirota, Ricardo

    2009-01-01

    This article presents a statistical model of agricultural yield data based on a set of hierarchical Bayesian models that allows joint modeling of temporal and spatial autocorrelation. This method captures a comprehensive range of the various uncertainties involved in predicting crop insurance premium rates as opposed to the more traditional ad hoc, two-stage methods that are typically based on independent estimation and prediction. A panel data set of county-average yield data was analyzed for 290 counties in the State of Paraná (Brazil) for the period of 1990 through 2002. Posterior predictive criteria are used to evaluate different model specifications. This article provides substantial improvements in the statistical and actuarial methods often applied to the calculation of insurance premium rates. These improvements are especially relevant to situations where data are limited. PMID:19890450
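
    A minimal sketch of the hierarchical-Bayesian idea (county intercepts partially pooled toward a state-level mean plus a common trend) is shown below; it deliberately omits the spatial and temporal autocorrelation terms of the full model, and all priors, names, and data are illustrative.

        import numpy as np
        import pymc as pm

        # Toy panel: average yield for 290 counties over 13 seasons.
        n_counties, n_years = 290, 13
        county = np.repeat(np.arange(n_counties), n_years)
        year = np.tile(np.arange(n_years, dtype=float), n_counties)
        y = np.random.default_rng(1).normal(2.5, 0.5, size=n_counties * n_years)

        with pm.Model() as yield_model:
            mu_state = pm.Normal("mu_state", mu=2.5, sigma=1.0)   # state-wide mean yield
            tau = pm.HalfNormal("tau", sigma=1.0)                 # between-county spread
            a = pm.Normal("a", mu=mu_state, sigma=tau, shape=n_counties)
            trend = pm.Normal("trend", mu=0.0, sigma=0.1)         # shared yield trend
            sigma = pm.HalfNormal("sigma", sigma=1.0)
            pm.Normal("obs", mu=a[county] + trend * year, sigma=sigma, observed=y)
            idata = pm.sample(1000, tune=1000, target_accept=0.9)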

  19. Genomic selection models double the accuracy of predicted breeding values for bacterial cold water disease resistance compared to a traditional pedigree-based model in rainbow trout aquaculture.

    PubMed

    Vallejo, Roger L; Leeds, Timothy D; Gao, Guangtu; Parsons, James E; Martin, Kyle E; Evenhuis, Jason P; Fragomeni, Breno O; Wiens, Gregory D; Palti, Yniv

    2017-02-01

    Previously, we have shown that bacterial cold water disease (BCWD) resistance in rainbow trout can be improved using traditional family-based selection, but progress has been limited to exploiting only between-family genetic variation. Genomic selection (GS) is a new alternative that enables exploitation of within-family genetic variation. We compared three GS models [single-step genomic best linear unbiased prediction (ssGBLUP), weighted ssGBLUP (wssGBLUP), and BayesB] to predict genomic-enabled breeding values (GEBV) for BCWD resistance in a commercial rainbow trout population, and compared the accuracy of GEBV to traditional estimates of breeding values (EBV) from a pedigree-based BLUP (P-BLUP) model. We also assessed the impact of sampling design on the accuracy of GEBV predictions. For these comparisons, we used BCWD survival phenotypes recorded on 7893 fish from 102 families, of which 1473 fish from 50 families had genotypes [57K single nucleotide polymorphism (SNP) array]. Naïve siblings of the training fish (n = 930 testing fish) were genotyped to predict their GEBV and mated to produce 138 progeny testing families. In the following generation, 9968 progeny were phenotyped to empirically assess the accuracy of GEBV predictions made on their non-phenotyped parents. The accuracy of the GEBV from all tested GS models was substantially higher than that of the P-BLUP model EBV. The highest increase in accuracy relative to the P-BLUP model was achieved with BayesB (97.2 to 108.8%), followed by wssGBLUP at iteration 2 (94.4 to 97.1%) and 3 (88.9 to 91.2%) and ssGBLUP (83.3 to 85.3%). Reducing the training sample size to n = ~1000 had no negative impact on the accuracy (0.67 to 0.72), but with n = ~500 the accuracy dropped to 0.53 to 0.61 if the training and testing fish were full-sibs, and even substantially lower, to 0.22 to 0.25, when they were not full-sibs. Using progeny performance data, we showed that the accuracy of genomic predictions is substantially higher than that of estimates obtained from the traditional pedigree-based BLUP model for BCWD resistance. Overall, we found that, even with a much smaller training sample size than in comparable livestock studies, GS can substantially improve the selection accuracy and genetic gains for this trait in a commercial rainbow trout breeding population.
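
    A bare-bones GBLUP sketch (VanRaden-style genomic relationship matrix plus a single-trait BLUP solve) illustrates the genomic-prediction machinery; it ignores the pedigree blending of ssGBLUP and the variable selection of BayesB, and the heritability and data below are placeholders.

        import numpy as np

        def vanraden_G(M):
            """Genomic relationship matrix from an (animals x SNPs) 0/1/2 matrix."""
            p = M.mean(axis=0) / 2.0                  # allele frequencies
            Z = M - 2.0 * p                           # centred genotypes
            return (Z @ Z.T) / (2.0 * np.sum(p * (1.0 - p)))

        def gblup(y, M, h2=0.35):
            """Genomic breeding values for one trait; h2 is an assumed heritability."""
            n = len(y)
            G = vanraden_G(M) + 1e-6 * np.eye(n)      # small jitter for invertibility
            lam = (1.0 - h2) / h2                     # residual-to-genetic variance ratio
            # BLUP of the genomic values: g_hat = G (G + lam I)^-1 (y - mean)
            return G @ np.linalg.solve(G + lam * np.eye(n), y - y.mean())

        rng = np.random.default_rng(7)
        M = rng.integers(0, 3, size=(300, 2000)).astype(float)   # toy SNP genotypes
        y = rng.normal(0.0, 1.0, size=300)                       # toy survival phenotype
        print(gblup(y, M)[:5])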

  20. The Value of Information in Decision-Analytic Modeling for Malaria Vector Control in East Africa.

    PubMed

    Kim, Dohyeong; Brown, Zachary; Anderson, Richard; Mutero, Clifford; Miranda, Marie Lynn; Wiener, Jonathan; Kramer, Randall

    2017-02-01

    Decision analysis tools and mathematical modeling are increasingly emphasized in malaria control programs worldwide to improve resource allocation and address ongoing challenges with sustainability. However, such tools require substantial scientific evidence, which is costly to acquire. The value of information (VOI) has been proposed as a metric for gauging the value of reduced model uncertainty. We apply this concept to an evidence-based Malaria Decision Analysis Support Tool (MDAST) designed for application in East Africa. In developing MDAST, substantial gaps in the scientific evidence base were identified regarding insecticide resistance in malaria vector control and the effectiveness of alternative mosquito control approaches, including larviciding. We identify four entomological parameters in the model (two for insecticide resistance and two for larviciding) that involve high levels of uncertainty and to which outputs in MDAST are sensitive. We estimate and compare a VOI for combinations of these parameters in evaluating three policy alternatives relative to a status quo policy. We find that having perfect information on the uncertain parameters could improve program net benefits by 5-21%, with the highest VOI associated with jointly eliminating uncertainty about the reproductive speed of malaria-transmitting mosquitoes and the initial efficacy of larviciding at reducing the emergence of new adult mosquitoes. Future research on parameter uncertainty in decision analysis of malaria control policy should investigate the VOI with respect to other aspects of malaria transmission (such as antimalarial resistance), the costs of reducing uncertainty in these parameters, and the extent to which imperfect information about these parameters can improve payoffs. © 2016 Society for Risk Analysis.
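
    The value-of-information quantity being estimated is, in its simplest "perfect information" form, the gap between deciding after versus before the uncertainty is resolved; the Monte Carlo sketch below uses invented policies, parameters, and payoffs purely to show the calculation.

        import numpy as np

        rng = np.random.default_rng(42)
        n_sims = 10_000

        # Invented uncertain parameters (e.g. resistance growth, larviciding efficacy).
        theta = rng.uniform(0.0, 1.0, size=(n_sims, 2))

        # Invented net benefit of four policy options as functions of the parameters.
        nb = np.column_stack([
            1.00 + 0.30 * theta[:, 0],          # status quo
            1.10 - 0.20 * theta[:, 0],          # indoor spraying scale-up
            1.20 + 0.40 * theta[:, 1],          # larviciding
            0.90 + 0.10 * theta.sum(axis=1),    # combined strategy
        ])

        ev_current = nb.mean(axis=0).max()   # best single policy under uncertainty
        ev_perfect = nb.max(axis=1).mean()   # best policy chosen per realisation
        print("EVPI =", ev_perfect - ev_current)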

  1. Improving Incremental Balance in the GSI 3DVAR Analysis System

    NASA Technical Reports Server (NTRS)

    Errico, Ronald M.; Yang, Runhua; Kleist, Daryl T.; Parrish, David F.; Derber, John C.; Treadon, Russ

    2008-01-01

    The Gridpoint Statistical Interpolation (GSI) analysis system is a unified global/regional 3DVAR analysis code that has been under development for several years at the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center. It has recently been implemented into operations at NCEP in both the global and North American data assimilation systems (GDAS and NDAS). An important aspect of this development has been improving the balance of the analysis produced by GSI. The improved balance between variables has been achieved through the inclusion of a Tangent Linear Normal Mode Constraint (TLNMC). The TLNMC method has proven to be very robust and effective. The TLNMC as part of the global GSI system has resulted in substantial improvement in data assimilation both at NCEP and at the NASA Global Modeling and Assimilation Office (GMAO).

  2. Novelties in pharmacological management of cardiopulmonary resuscitation

    PubMed Central

    Bartos, Jason A.; Yannopoulos, Demetris

    2014-01-01

    Purpose of review The ultimate goal of cardiopulmonary resuscitation is long-term neurologically intact survival. Despite numerous well designed studies, the medications currently used in advanced cardiac life support have not demonstrated success in this regard. This review describes the novel therapeutics under investigation to improve functional recovery and survival. Recent findings Whereas current medications focus on achieving return of spontaneous circulation and improved hemodynamics, novel therapies currently in development are focused on improving cellular survival and function by preventing metabolic derangement, protecting mitochondria, and preventing cell death caused by cardiac arrest. Improved cardiac and neurologic function and survival benefits have been observed using animal models of cardiopulmonary arrest. Summary Although substantial data have shown benefits using robust animal models, further human studies are necessary to investigate the potential long-term benefits of these therapies. PMID:23995130

  3. Counteracting structural errors in ensemble forecast of influenza outbreaks.

    PubMed

    Pei, Sen; Shaman, Jeffrey

    2017-10-13

    For influenza forecasts generated using dynamical models, forecast inaccuracy is partly attributable to the nonlinear growth of error. As a consequence, quantification of the nonlinear error structure in current forecast models is needed so that this growth can be corrected and forecast skill improved. Here, we inspect the error growth of a compartmental influenza model and find that a robust error structure arises naturally from the nonlinear model dynamics. By counteracting these structural errors, diagnosed using error breeding, we develop a new forecast approach that combines dynamical error correction and statistical filtering techniques. In retrospective forecasts of historical influenza outbreaks for 95 US cities from 2003 to 2014, overall forecast accuracy for outbreak peak timing, peak intensity and attack rate is substantially improved for predicted lead times up to 10 weeks. This error growth correction method can be generalized to improve the forecast accuracy of other infectious disease dynamical models. Inaccuracy of influenza forecasts based on dynamical models is partly due to nonlinear error growth. Here the authors address the error structure of a compartmental influenza model, and develop a new improved forecast approach combining dynamical error correction and statistical filtering techniques.
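
    For readers unfamiliar with the underlying dynamical model, a constant-parameter SIRS compartmental model of the kind used in such influenza forecasts can be integrated as below; the parameters are illustrative only, and the approach described above additionally couples such a model to error breeding and statistical filtering.

        import numpy as np
        from scipy.integrate import odeint

        def sirs(state, t, beta, D, L, N):
            """SIRS model: S susceptible, I infectious, immunity wanes over L days."""
            S, I = state
            dS = (N - S - I) / L - beta * S * I / N   # waning immunity minus infection
            dI = beta * S * I / N - I / D             # infection minus recovery
            return [dS, dI]

        N = 1.0e5                          # population size
        beta, D, L = 0.6, 4.0, 365.0       # transmission rate, infectious period, immunity
        t = np.arange(0.0, 200.0)          # days
        traj = odeint(sirs, [0.7 * N, 100.0], t, args=(beta, D, L, N))
        print(traj[:3])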

  4. Understanding the DayCent model: Calibration, sensitivity, and identifiability through inverse modeling

    USGS Publications Warehouse

    Necpálová, Magdalena; Anex, Robert P.; Fienen, Michael N.; Del Grosso, Stephen J.; Castellano, Michael J.; Sawyer, John E.; Iqbal, Javed; Pantoja, Jose L.; Barker, Daniel W.

    2015-01-01

    The ability of biogeochemical ecosystem models to represent agro-ecosystems depends on their correct integration with field observations. We report simultaneous calibration of 67 DayCent model parameters using multiple observation types through inverse modeling using the PEST parameter estimation software. Parameter estimation reduced the total sum of weighted squared residuals by 56% and improved model fit to crop productivity, soil carbon, volumetric soil water content, soil temperature, N2O, and soil NO3− compared to the default simulation. Inverse modeling substantially reduced predictive model error relative to the default model for all model predictions, except for soil NO3− and NH4+. Post-processing analyses provided insights into parameter–observation relationships based on parameter correlations, sensitivity and identifiability. Inverse modeling tools are shown to be a powerful way to systematize and accelerate the process of biogeochemical model interrogation, improving our understanding of model function and the underlying ecosystem biogeochemical processes that they represent.
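
    PEST is an external parameter-estimation package wrapped around the forward model; the weighted least-squares idea it implements can be sketched with scipy and a generic forward model, as below (the forward model, observations, and weights are placeholders, not DayCent).

        import numpy as np
        from scipy.optimize import least_squares

        def forward_model(params, forcing):
            """Placeholder stand-in for a biogeochemical model run."""
            a, b = params
            return a * forcing + b * np.sqrt(forcing)

        obs = np.array([1.2, 2.9, 4.1, 5.8, 7.4])            # mixed observation types
        forcing = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
        weights = 1.0 / np.array([0.1, 0.1, 0.2, 0.2, 0.3])  # inverse observation error

        def weighted_residuals(params):
            return weights * (forward_model(params, forcing) - obs)

        fit = least_squares(weighted_residuals, x0=[1.0, 0.5])
        print(fit.x, np.sum(fit.fun ** 2))                   # parameters, weighted SSR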

  5. The Deep South Clouds & Aerosols project: Improving the modelling of clouds in the Southern Ocean region

    NASA Astrophysics Data System (ADS)

    Morgenstern, Olaf; McDonald, Adrian; Harvey, Mike; Davies, Roger; Katurji, Marwan; Varma, Vidya; Williams, Jonny

    2016-04-01

    Southern-Hemisphere climate projections are subject to persistent climate model biases affecting the large majority of contemporary climate models, which degrade the reliability of these projections, particularly at the regional scale. Southern-Hemisphere specific problems include the fact that satellite-based observations comparisons with model output indicate that cloud occurrence above the Southern Ocean is substantially underestimated, with consequences for the radiation balance, sea surface temperatures, sea ice, and the position of storm tracks. The Southern-Ocean and Antarctic region is generally characterized by an acute paucity of surface-based and airborne observations, further complicating the situation. In recognition of this and other Southern-Hemisphere specific problems with climate modelling, the New Zealand Government has launched the Deep South National Science Challenge, whose purpose is to develop a new Earth System Model which reduces these very large radiative forcing problems associated with erroneous clouds. The plan is to conduct a campaign of targeted observations in the Southern Ocean region, leveraging off international measurement campaigns in this area, and using these and existing measurements of cloud and aerosol properties to improve the representation of clouds in the nascent New Zealand Earth System Model. Observations and model development will target aerosol physics and chemistry, particularly sulphate, sea salt, and non-sulphate organic aerosol, its interactions with clouds, and cloud microphysics. The hypothesis is that the cloud schemes in most GCMs are trained on Northern-Hemisphere data characterized by substantial anthropogenic or terrestrial aerosol-related influences which are almost completely absent in the Deep South.

  6. A physics department's role in preparing physics teachers: The Colorado learning assistant model

    NASA Astrophysics Data System (ADS)

    Otero, Valerie; Pollock, Steven; Finkelstein, Noah

    2010-11-01

    In response to substantial evidence that many U.S. students are inadequately prepared in science and mathematics, we have developed an effective and adaptable model that improves the education of all students in introductory physics and increases the numbers of talented physics majors becoming certified to teach physics. We report on the Colorado Learning Assistant model and discuss its effectiveness at a large research university. Since its inception in 2003, we have increased the pool of well-qualified K-12 physics teachers by a factor of approximately three, engaged scientists significantly in the recruiting and preparation of future teachers, and improved the introductory physics sequence so that students' learning gains are typically double the traditional average.

  7. Using Genotype Abundance to Improve Phylogenetic Inference

    PubMed Central

    Mesin, Luka; Victora, Gabriel D; Minin, Vladimir N; Matsen, Frederick A

    2018-01-01

    Modern biological techniques enable very dense genetic sampling of unfolding evolutionary histories, and thus frequently sample some genotypes multiple times. This motivates strategies to incorporate genotype abundance information in phylogenetic inference. In this article, we synthesize a stochastic process model with standard sequence-based phylogenetic optimality, and show that tree estimation is substantially improved by doing so. Our method is validated with extensive simulations and an experimental single-cell lineage tracing study of germinal center B cell receptor affinity maturation. PMID:29474671

  8. Double-input compartmental modeling and spectral analysis for the quantification of positron emission tomography data in oncology

    NASA Astrophysics Data System (ADS)

    Tomasi, G.; Kimberley, S.; Rosso, L.; Aboagye, E.; Turkheimer, F.

    2012-04-01

    In positron emission tomography (PET) studies involving organs different from the brain, ignoring the metabolite contribution to the tissue time-activity curves (TAC), as in the standard single-input (SI) models, may compromise the accuracy of the estimated parameters. We employed here double-input (DI) compartmental modeling (CM), previously used for [11C]thymidine, and a novel DI spectral analysis (SA) approach on the tracers 5-[18F]fluorouracil (5-[18F]FU) and [18F]fluorothymidine ([18F]FLT). CM and SA were performed initially with a SI approach using the parent plasma TAC as an input function. These methods were then employed using a DI approach with the metabolite plasma TAC as an additional input function. Regions of interest (ROIs) corresponding to healthy liver, kidneys and liver metastases for 5-[18F]FU and to tumor, vertebra and liver for [18F]FLT were analyzed. For 5-[18F]FU, the improvement of the fit quality with the DI approaches was remarkable; in CM, the Akaike information criterion (AIC) always selected the DI over the SI model. Volume of distribution estimates obtained with DI CM and DI SA were in excellent agreement, for both parent 5-[18F]FU (R2 = 0.91) and metabolite [18F]FBAL (R2 = 0.99). For [18F]FLT, the DI methods provided notable improvements but less substantial than for 5-[18F]FU due to the lower rate of metabolism of [18F]FLT. On the basis of the AIC values, agreement between [18F]FLT Ki estimated with the SI and DI models was good (R2 = 0.75) for the ROIs where the metabolite contribution was negligible, indicating that the additional input did not bias the parent tracer only-related estimates. When the AIC suggested a substantial contribution of the metabolite [18F]FLT-glucuronide, on the other hand, the change in the parent tracer only-related parameters was significant (R2 = 0.33 for Ki). Our results indicated that improvements of DI over SI approaches can range from moderate to substantial and are more significant for tracers with a high rate of metabolism. Furthermore, they showed that SA is suitable for DI modeling and can be used effectively in the analysis of PET data.
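
    A minimal numerical sketch of the double-input idea is a tissue curve built from two convolutions, one driven by the parent plasma curve and one by the metabolite curve; the single-compartment structure and all rate constants below are illustrative rather than the configurations fitted in the paper.

        import numpy as np

        def one_tissue(C_in, K1, k2, dt):
            """K1 * convolution of an input curve with exp(-k2 t) on a uniform grid."""
            t = np.arange(len(C_in)) * dt
            irf = K1 * np.exp(-k2 * t)
            return np.convolve(C_in, irf)[: len(C_in)] * dt

        def dual_input_tac(Cp, Cm, K1p, k2p, K1m, k2m, dt):
            """Tissue TAC = parent-driven compartment + metabolite-driven compartment."""
            return one_tissue(Cp, K1p, k2p, dt) + one_tissue(Cm, K1m, k2m, dt)

        dt = 0.1                                   # minutes between samples (toy value)
        t = np.arange(0.0, 60.0, dt)
        Cp = t * np.exp(-0.10 * t)                 # toy parent plasma input
        Cm = 0.4 * t * np.exp(-0.05 * t)           # toy metabolite input
        print(dual_input_tac(Cp, Cm, 0.10, 0.05, 0.02, 0.01, dt)[:5])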

  9. Lower- Versus Higher-Income Populations In The Alternative Quality Contract: Improved Quality And Similar Spending

    PubMed Central

    Song, Zirui; Rose, Sherri; Chernew, Michael E.; Safran, Dana Gelb

    2018-01-01

    As population-based payment models become increasingly common, it is crucial to understand how such payment models affect health disparities. We evaluated health care quality and spending among enrollees in areas with lower versus higher socioeconomic status in Massachusetts before and after providers entered into the Alternative Quality Contract, a two-sided population-based payment model with substantial incentives tied to quality. We compared changes in process measures, outcome measures, and spending between enrollees in areas with lower and higher socioeconomic status from 2006 to 2012 (outcome measures were measured after the intervention only). Quality improved for all enrollees in the Alternative Quality Contract after their provider organizations entered the contract. Process measures improved 1.2 percentage points per year more among enrollees in areas with lower socioeconomic status than among those in areas with higher socioeconomic status. Outcome measure improvement was no different between the subgroups; neither were changes in spending. Larger or comparable improvements in quality among enrollees in areas with lower socioeconomic status suggest a potential narrowing of disparities. Strong pay-for-performance incentives within a population-based payment model could encourage providers to focus on improving quality for more disadvantaged populations. PMID:28069849

  10. Free to Choose? Reform, Choice, and Consideration Sets in the English National Health Service.

    PubMed

    Gaynor, Martin; Propper, Carol; Seiler, Stephan

    2016-11-01

    Choice in public services is controversial. We exploit a reform in the English National Health Service to assess the effect of removing constraints on patient choice. We estimate a demand model that explicitly captures the removal of the choice constraints imposed on patients. We find that, post-removal, patients became more responsive to clinical quality. This led to a modest reduction in mortality and a substantial increase in patient welfare. The elasticity of demand faced by hospitals increased substantially post-reform and we find evidence that hospitals responded to the enhanced incentives by improving quality. This suggests greater choice can raise quality.

  11. Factoring vs linear modeling in rate estimation: a simulation study of relative accuracy.

    PubMed

    Maldonado, G; Greenland, S

    1998-07-01

    A common strategy for modeling dose-response in epidemiology is to transform ordered exposures and covariates into sets of dichotomous indicator variables (that is, to factor the variables). Factoring tends to increase estimation variance, but it also tends to decrease bias and thus may increase or decrease total accuracy. We conducted a simulation study to examine the impact of factoring on the accuracy of rate estimation. Factored and unfactored Poisson regression models were fit to follow-up study datasets that were randomly generated from 37,500 population model forms that ranged from subadditive to supramultiplicative. In the situations we examined, factoring sometimes substantially improved accuracy relative to fitting the corresponding unfactored model, sometimes substantially decreased accuracy, and sometimes made little difference. The difference in accuracy between factored and unfactored models depended in a complicated fashion on the difference between the true and fitted model forms, the strength of exposure and covariate effects in the population, and the study size. It may be difficult in practice to predict when factoring is increasing or decreasing accuracy. We recommend, therefore, that the strategy of factoring variables be supplemented with other strategies for modeling dose-response.
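
    The modelling choice being compared can be made concrete with two statsmodels fits, one treating the ordered exposure as a linear term and one factoring it into indicators; the data frame below is a made-up follow-up table with person-years as the offset.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Made-up follow-up data: cases and person-years by ordered exposure level.
        df = pd.DataFrame({
            "exposure": [0, 1, 2, 3, 4],
            "cases": [10, 14, 22, 30, 45],
            "pyears": [1000, 950, 900, 850, 800],
        })
        offset = np.log(df["pyears"])

        # Unfactored model: the rate is log-linear in the exposure score.
        linear = smf.glm("cases ~ exposure", data=df,
                         family=sm.families.Poisson(), offset=offset).fit()

        # Factored model: one indicator per exposure level.
        factored = smf.glm("cases ~ C(exposure)", data=df,
                           family=sm.families.Poisson(), offset=offset).fit()

        print(linear.params, factored.params, sep="\n\n")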

  12. Glomerular structural-functional relationship models of diabetic nephropathy are robust in type 1 diabetic patients.

    PubMed

    Mauer, Michael; Caramori, Maria Luiza; Fioretto, Paola; Najafian, Behzad

    2015-06-01

    Studies of structural-functional relationships have improved understanding of the natural history of diabetic nephropathy (DN). However, in order to consider structural end points for clinical trials, the robustness of the resultant models needs to be verified. This study examined whether structural-functional relationship models derived from a large cohort of type 1 diabetic (T1D) patients with a wide range of renal function are robust. The predictability of models derived from multiple regression analysis and piecewise linear regression analysis was also compared. T1D patients (n = 161) with research renal biopsies were divided into two equal groups matched for albumin excretion rate (AER). Models to explain AER and glomerular filtration rate (GFR) by classical DN lesions in one group (T1D-model, or T1D-M) were applied to the other group (T1D-test, or T1D-T) and regression analyses were performed. T1D-M-derived models explained 70 and 63% of AER variance and 32 and 21% of GFR variance in T1D-M and T1D-T, respectively, supporting the substantial robustness of the models. Piecewise linear regression analyses substantially improved predictability of the models with 83% of AER variance and 66% of GFR variance explained by classical DN glomerular lesions alone. These studies demonstrate that DN structural-functional relationship models are robust, and if appropriate models are used, glomerular lesions alone explain a major proportion of AER and GFR variance in T1D patients. © The Author 2014. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
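
    The piecewise (segmented) regression idea can be sketched with a single-knot linear spline fitted by OLS, scanning candidate breakpoints for the best fit; the structural predictor, knot range, and simulated data below are purely illustrative.

        import numpy as np
        import statsmodels.api as sm

        def piecewise_fit(x, y, knot):
            """OLS on a one-knot linear spline: intercept, x, and hinge max(x - knot, 0)."""
            X = sm.add_constant(np.column_stack([x, np.clip(x - knot, 0.0, None)]))
            return sm.OLS(y, X).fit()

        rng = np.random.default_rng(3)
        lesion = rng.uniform(300.0, 600.0, size=161)       # toy glomerular lesion measure
        aer = np.where(lesion < 450.0, 20.0 + 0.05 * lesion,
                       -200.5 + 0.54 * lesion) + rng.normal(0.0, 10.0, size=lesion.size)

        knots = np.linspace(350.0, 550.0, 41)
        best = max(knots, key=lambda k: piecewise_fit(lesion, aer, k).rsquared)
        print(best, round(piecewise_fit(lesion, aer, best).rsquared, 3))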

  13. Improving the nutritional value of Golden Rice through increased pro-vitamin A content.

    PubMed

    Paine, Jacqueline A; Shipton, Catherine A; Chaggar, Sunandha; Howells, Rhian M; Kennedy, Mike J; Vernon, Gareth; Wright, Susan Y; Hinchliffe, Edward; Adams, Jessica L; Silverstone, Aron L; Drake, Rachel

    2005-04-01

    "Golden Rice" is a variety of rice engineered to produce beta-carotene (pro-vitamin A) to help combat vitamin A deficiency, and it has been predicted that its contribution to alleviating vitamin A deficiency would be substantially improved through even higher beta-carotene content. We hypothesized that the daffodil gene encoding phytoene synthase (psy), one of the two genes used to develop Golden Rice, was the limiting step in beta-carotene accumulation. Through systematic testing of other plant psys, we identified a psy from maize that substantially increased carotenoid accumulation in a model plant system. We went on to develop "Golden Rice 2" introducing this psy in combination with the Erwinia uredovora carotene desaturase (crtI) used to generate the original Golden Rice. We observed an increase in total carotenoids of up to 23-fold (maximum 37 microg/g) compared to the original Golden Rice and a preferential accumulation of beta-carotene.

  14. An improved model of the Earth's gravitational field: GEM-T1

    NASA Technical Reports Server (NTRS)

    Marsh, J. G.; Lerch, F. J.; Christodoulidis, D. C.; Putney, B. H.; Felsentreger, T. L.; Sanchez, B. V.; Smith, D. E.; Klosko, S. M.; Martin, T. V.; Pavlis, E. C.

    1987-01-01

    Goddard Earth Model T1 (GEM-T1), which was developed from an analysis of direct satellite tracking observations, is the first in a new series of such models. GEM-T1 is complete to degree and order 36. It was developed using consistent reference parameters and extensive earth and ocean tidal models. It was simultaneously solved for gravitational and tidal terms, earth orientation parameters, and the orbital parameters of 580 individual satellite arcs. The solution used only satellite tracking data acquired on 17 different satellites and is predominantly based upon the precise laser data taken by third generation systems. In all, 800,000 observations were used. A major improvement in field accuracy was obtained. For marine geodetic applications, long wavelength geoidal modeling is twice as good as in earlier satellite-only GEM models. Orbit determination accuracy has also been substantially advanced over a wide range of satellites that have been tested.

  15. Ecolabeled paper towels: consumer valuation and expenditure analysis.

    PubMed

    Srinivasan, Arun K; Blomquist, Glenn C

    2009-01-01

    Ecolabeled paper towels are manufactured using post-consumer recycled material and sold with a recycling logo. Environmentally conscious consumers purchase these paper towels and thereby contribute to improving environmental quality. In this paper, we estimate the implicit value placed by consumers on ecolabeled paper towels using a hedonic price function and conduct an expenditure analysis using Heckman's selection model. Using a data set from the Internet-based grocery store Peapod, we find that some consumers recognize ecolabels on paper towels and place a substantial, positive price premium on them. The expenditure analysis indicates that, for the preferred functional form, the demand for ecolabeled paper towels is inelastic for environmentally conscious consumers. The simulated results from the selection model indicate that a small subsidy for ecolabeled paper towels will not substantially change consumers' purchase decisions.
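
    Heckman's selection model as used here can be sketched in its classic two-step form: a probit for whether a household buys the ecolabeled product, followed by an expenditure regression on buyers with the inverse Mills ratio as an extra regressor (standard errors then need the usual correction). All variables below are invented.

        import numpy as np
        import statsmodels.api as sm
        from scipy.stats import norm

        rng = np.random.default_rng(5)
        n = 2000
        green = rng.uniform(0.0, 1.0, size=n)              # environmental-attitude proxy
        income = rng.normal(50.0, 15.0, size=n)            # household income (toy units)
        buy = (0.8 * green + 0.01 * income + rng.normal(0, 1, n) > 1.0).astype(int)
        spend = 5.0 + 3.0 * green + 0.05 * income + rng.normal(0, 1, n)

        # Step 1: probit selection equation, then the inverse Mills ratio.
        Z = sm.add_constant(np.column_stack([green, income]))
        probit = sm.Probit(buy, Z).fit(disp=0)
        xb = Z @ probit.params
        imr = norm.pdf(xb) / norm.cdf(xb)

        # Step 2: expenditure equation on buyers only, augmented with the IMR.
        m = buy == 1
        X = sm.add_constant(np.column_stack([green[m], income[m], imr[m]]))
        print(sm.OLS(spend[m], X).fit().params)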

  16. Substantiation of basic scheme of grain cleaning machine for preparation of agricultural crops seeds

    NASA Astrophysics Data System (ADS)

    Giyevskiy, A. M.; Orobinsky, V. I.; Tarasenko, A. P.; Chernyshov, A. V.; Kurilov, D. O.

    2018-03-01

    The article presents data on the feasibility of a high-efficiency seed cleaner concept with consistent use of the air flow in aspiration and multi-tier placement of the sorting grids in the grating mills. Modeling identified directions for further improvement of air-screen seed cleaning machines: increasing the proportion of sorting grids in the mills to 70-80% and increasing the air flow speed in the pre-cleaning channel to 8.0 m/s. Experiments confirmed that mathematical modeling of airflow in the pneumatic system, using a finite-volume method to solve the hydrodynamic equations, is suitable for substantiating the basic parameters of the pneumatic system.

  17. Laboratory Investigation of Space and Planetary Dust Grains

    NASA Technical Reports Server (NTRS)

    Spann, James

    2005-01-01

    Dust in space is ubiquitous and impacts diverse observed phenomena in various ways. Understanding the dominant mechanisms that control dust grain properties and their impact on surrounding environments is fundamental to improving our understanding of the processes observed at work in space. There is a substantial body of work on the theory and modeling of dust in space and dusty plasmas. To substantiate and validate theory and models, laboratory investigations and spaceborne observations have been conducted. Laboratory investigations are largely confined to an assembly of dust grains immersed in a plasma environment. Frequently, the behavior of these complex dusty plasmas in the laboratory has raised more questions than it has verified theories. Spaceborne observations have helped us characterize planetary environments. The complex behavior of dust grains in space indicates the need to understand the microphysics of individual grains immersed in a plasma or space environment.

  18. What’s Needed from Climate Modeling to Advance Actionable Science for Water Utilities?

    NASA Astrophysics Data System (ADS)

    Barsugli, J. J.; Anderson, C. J.; Smith, J. B.; Vogel, J. M.

    2009-12-01

    “…perfect information on climate change is neither available today nor likely to be available in the future, but … over time, as the threats climate change poses to our systems grow more real, predicting those effects with greater certainty is non-discretionary. We’re not yet at a level at which climate change projections can drive climate change adaptation.” (Testimony of WUCA Staff Chair David Behar to the House Committee on Science and Technology, May 5, 2009) To respond to this challenge, the Water Utility Climate Alliance (WUCA) has sponsored a white paper titled “Options for Improving Climate Modeling to Assist Water Utility Planning for Climate Change.” This report concerns how investments in the science of climate change, and in particular climate modeling and downscaling, can best be directed to help make climate projections more actionable. The meaning of “model improvement” can be very different depending on whether one is talking to a climate model developer or to a water manager trying to incorporate climate projections into planning. We first surveyed the WUCA members on present and potential uses of climate model projections and on climate inputs to their various system models. Based on those surveys and on subsequent discussions, we identified four dimensions along which improvement in modeling would make the science more “actionable”: improved model agreement on change in key parameters; narrowing the range of model projections; providing projections at spatial and temporal scales that match water utilities’ system models; and providing projections that match water utility planning horizons. With these goals in mind, we developed four options for improving global-scale climate modeling and three options for improving downscaling that will be discussed. However, there does not seem to be a single investment (the proverbial “magic bullet”) that will substantially reduce the range of model projections at the scales at which utility planning is conducted. In the near term we feel strongly that water utilities and climate scientists should work together to leverage the upcoming Coupled Model Intercomparison Project, Phase 5 (CMIP5; a coordinated set of climate model experiments that will be used to support the upcoming IPCC Fifth Assessment) to better benefit water utilities. In the longer term, even with model and downscaling improvements, it is very likely that substantial uncertainty about future climate change at the desired spatial and temporal scales will remain. Nonetheless, there is no doubt the climate is changing, and the challenge is to work with what we have, or what we can reasonably expect to have in the coming years, to make the best decisions we can.

  19. Evaluation of Enhanced High Resolution MODIS/AMSR-E SSTs and the Impact on Regional Weather Forecast

    NASA Technical Reports Server (NTRS)

    Schiferl, Luke D.; Fuell, Kevin K.; Case, Jonathan L.; Jedlovec, Gary J.

    2010-01-01

    Over the last few years, the NASA Short-term Prediction Research and Transition (SPoRT) Center has been generating a 1-km sea surface temperature (SST) composite derived from retrievals of the Moderate Resolution Imaging Spectroradiometer (MODIS) for use in operational diagnostics and regional model initialization. With the assumption that the day-to-day variation in the SST is nominal, individual MODIS passes aboard the Earth Observing System (EOS) Aqua and Terra satellites are used to create and update four composite SST products each day at 0400, 0700, 1600, and 1900 UTC, valid over the western Atlantic and Caribbean waters. A six-month study from February to August 2007 over the marine areas surrounding southern Florida was conducted to compare the use of the MODIS SST composite versus the Real-Time Global SST analysis to initialize the Weather Research and Forecasting (WRF) model. Substantial changes in the forecast heat fluxes were seen at times in the marine boundary layer, but relatively little overall improvement was measured in the sensible weather elements. The limited improvement in the WRF model forecasts could be attributed to the diurnal changes in SST seen in the MODIS SST composites but not accounted for by the model. Furthermore, cloud contamination caused extended periods when individual passes of MODIS were unable to update the SSTs, leading to substantial SST latency and a cool bias during the early summer months. In order to alleviate the latency problems, the SPoRT Center recently enhanced its MODIS SST composite by incorporating information from the Advanced Microwave Scanning Radiometer-EOS (AMSR-E) instruments as well as the Operational Sea Surface Temperature and Sea Ice Analysis. These enhancements substantially decreased the latency due to cloud cover and improved the bias and correlation of the composites at available marine point observations. While these enhancements improved upon the modeled cold bias using the original MODIS SSTs, the discernible impacts on the WRF model were still somewhat limited. This paper explores several factors that may have contributed to this result. First, the original methodology to initialize the model used the most recent SST composite available in a hypothetical real-time configuration, often matching the forecast initial time with an SST field that was 5-8 hours offset. To minimize the differences that result from the diurnal variations in SST, the previous day's SST composite is incorporated at a time closest to the model initialization hour (e.g. 1600 UTC composite at 1500 UTC model initialization). Second, the diurnal change seen in the MODIS SST composites was not represented by the WRF model in previous simulations, since the SSTs were held constant throughout the model integration. To address this issue, we explore the use of a water skin-temperature diurnal cycle prediction capability within v3.1 of the WRF model to better represent fluctuations in marine surface forcing. Finally, the verification of the WRF model is limited to very few over-water sites, many of which are located near the coastlines. In order to measure the open ocean improvements from the AMSR-E, we could use an independent 2-dimensional, satellite-derived data set to validate the forecast model by applying an object-based verification method. Such a validation technique could aid in better understanding the benefits of the mesoscale SST spatial structure to regional model applications.

  20. Assimilation of Satellite Data to Improve Cloud Simulation in WRF Model

    NASA Astrophysics Data System (ADS)

    Park, Y. H.; Pour Biazar, A.; McNider, R. T.

    2012-12-01

    A simple approach has been introduced to improve cloud simulation spatially and temporally in a meteorological model. The first step of this approach is to use Geostationary Operational Environmental Satellite (GOES) observations to identify clouds and estimate the cloud structure. Then, by comparing GOES observations to the model cloud field, we identify areas in which the model has under-predicted or over-predicted clouds. Next, by introducing subsidence in areas with over-prediction and lifting in areas with under-prediction, erroneous clouds are removed and new clouds are formed. The technique estimates the vertical velocity needed for the cloud correction and then uses a one-dimensional variational scheme (1D-Var) to calculate the horizontal divergence and the consequent horizontal wind components needed to sustain that vertical velocity. Finally, the new horizontal winds are provided as a nudging field to the model. This nudging provides the dynamical support needed to create or clear clouds in a sustainable manner. The technique was implemented and tested in the Weather Research and Forecasting (WRF) model and resulted in substantial improvement in the model-simulated clouds. Some of the results are presented here.

  1. Using quadratic mean diameter and relative spacing index to enhance height-diameter and crown ratio models fitted to longitudinal data

    Treesearch

    Pradip Saud; Thomas B. Lynch; Anup K. C.; James M. Guldin

    2016-01-01

    The inclusion of quadratic mean diameter (QMD) and relative spacing index (RSI) substantially improved the predictive capacity of height–diameter at breast height (d.b.h.) and crown ratio models (CR), respectively. Data were obtained from 208 permanent plots established in western Arkansas and eastern Oklahoma during 1985–1987 and remeasured for the sixth time (2012–...

  2. Losartan Administration Reduces Fibrosis but Hinders Functional Recovery after Volumetric Muscle Loss Injury

    DTIC Science & Technology

    2014-09-25

    therapy. Previously, losartan has been successfully used to reduce fibrosis and improve both muscle regeneration and function in several models of ... efficacy of losartan has not yet been tested in a VML injury model. VML injury involves a substantial loss of muscle tissue that does not regenerate by ... fibrosis development after VML injury in the rat tibialis anterior (TA) muscle. METHODS: Experimental Design. Male Lewis rats with VML were provided access

  3. Teacher Evaluation and School Improvement: An Analysis of the Evidence

    ERIC Educational Resources Information Center

    Hallinger, Philip; Heck, Ronald H.; Murphy, Joseph

    2014-01-01

    In recent years, substantial investments have been made in reengineering systems of teacher evaluation. The new generation models of teacher evaluation typically adopt a standards-based view of teaching quality and include a value-added measure of growth in student learning. With more than a decade of experience and research, it is timely to…

  4. Social preferences toward energy generation with woody biomass from public forests in Montana, USA

    Treesearch

    Robert M. Campbell; Tyron J. Venn; Nathaniel M. Anderson

    2016-01-01

    In Montana, USA, there are substantial opportunities for mechanized thinning treatments on public forests to reduce the likelihood of severe and damaging wildfires and improve forest health. These treatments produce residues that can be used to generate renewable energy and displace fossil fuels. The choice modeling method is employed to examine the marginal...

  5. The Role of Acquired Shared Mental Models in Improving the Process of Team-Based Learning

    ERIC Educational Resources Information Center

    Johnson, Tristan E.; Khalil, Mohammed K.; Spector, J. Michael, Ed.

    2008-01-01

    Working in teams is an important aspect of learning in various educational settings. Although education has embraced instructional strategies that use multiple learners to facilitate learning, the benefits of team-based learning need to be substantiated. There are limited efforts to evaluate the efficacy of learning processes associated with…

  6. Improving sea level simulation in Mediterranean regional climate models

    NASA Astrophysics Data System (ADS)

    Adloff, Fanny; Jordà, Gabriel; Somot, Samuel; Sevault, Florence; Arsouze, Thomas; Meyssignac, Benoit; Li, Laurent; Planton, Serge

    2017-08-01

    Estimating future sea level change in the Mediterranean remains a challenge. Previous climate modelling attempts to estimate future sea level change in the Mediterranean did not reach a consensus. The low resolution of CMIP-type models prevents an accurate representation of important small-scale processes acting over the Mediterranean region. For this reason, among others, the use of high-resolution regional ocean modelling has been recommended in the literature to address the question of ongoing and future Mediterranean sea level change in response to climate change and greenhouse gas emissions. It has also been shown that east Atlantic sea level variability is the dominant driver of Mediterranean variability at interannual and interdecadal scales. However, up to now, long-term regional simulations of the Mediterranean Sea have not integrated the full sea level information from the Atlantic, which is a substantial shortcoming when analysing the Mediterranean sea level response. In the present study we analyse the different approaches followed by state-of-the-art regional climate models to simulate Mediterranean sea level variability. Additionally, we present a new simulation which incorporates improved information on Atlantic sea level forcing at the lateral boundary. We evaluate the skill of the different simulations in the framework of long-term hindcast simulations spanning 1980 to 2012, analysing sea level variability from seasonal to multidecadal scales. Results from the new simulation show a substantial improvement in the modelled Mediterranean sea level signal. This confirms that Mediterranean mean sea level is strongly influenced by Atlantic conditions, and thus suggests that the quality of the information in the lateral boundary conditions (LBCs) is crucial for good modelling of Mediterranean sea level. We also found that the regional differences inside the basin that are induced by circulation changes are model-dependent and thus not affected by the LBCs. Finally, we argue that a correct configuration of the LBCs in the Atlantic should be used for future Mediterranean simulations, covering not only the hindcast period but also scenario runs.

  7. Goldmann Tonometer Prism with an Optimized Error Correcting Applanation Surface.

    PubMed

    McCafferty, Sean; Lim, Garrett; Duncan, William; Enikov, Eniko; Schwiegerling, Jim

    2016-09-01

    We evaluate solutions for an applanating surface modification to the Goldmann tonometer prism, which substantially negates the errors due to patient variability in biomechanics. A modified Goldmann or correcting applanation tonometry surface (CATS) prism is presented which was optimized to minimize the intraocular pressure (IOP) error due to corneal thickness, stiffness, curvature, and tear film. Mathematical modeling with finite element analysis (FEA) and manometric IOP referenced cadaver eyes were used to optimize and validate the design. Mathematical modeling of the optimized CATS prism indicates an approximate 50% reduction in each of the corneal biomechanical and tear film errors. Manometric IOP referenced pressure in cadaveric eyes demonstrates substantial equivalence to GAT in nominal eyes with the CATS prism as predicted by modeling theory. A CATS modified Goldmann prism is theoretically able to significantly improve the accuracy of IOP measurement without changing Goldmann measurement technique or interpretation. Clinical validation is needed but the analysis indicates a reduction in CCT error alone to less than ±2 mm Hg using the CATS prism in 100% of a standard population compared to only 54% less than ±2 mm Hg error with the present Goldmann prism. This article presents an easily adopted novel approach and critical design parameters to improve the accuracy of a Goldmann applanating tonometer.

  8. Water demand for electricity in deep decarbonisation scenarios: a multi-model assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mouratiadou, I.; Bevione, M.; Bijl, D. L.

    This study assesses the effects of deep electricity decarbonisation and shifts in the choice of power plant cooling technologies on global electricity water demand, using a suite of five integrated assessment models. We find that electricity sector decarbonisation results in co-benefits for water resources primarily due to the phase-out of water-intensive coal-based thermoelectric power generation, although these co-benefits vary substantially across decarbonisation scenarios. Wind and solar photovoltaic power represent a win-win option for both climate and water resources, but further expansion of nuclear or fossil- and biomass-fuelled power plants with carbon capture and storage may result in increased pressures on the water environment. Further to these results, the paper provides insights on the most crucial factors of uncertainty with regards to future estimates of water demand. These estimates varied substantially across models in scenarios where the effects of decarbonisation on the electricity mix were less clear-cut. Future thermal and water efficiency improvements of power generation technologies and demand-side energy efficiency improvements were also identified to be important factors of uncertainty. We conclude that in order to ensure positive effects of decarbonisation on water resources, climate policy should be combined with technology-specific energy and/or water policies.

  9. Aeroelastic modeling of rotor blades with spanwise variable elastic axis offset: Classic issues revisited and new formulations

    NASA Technical Reports Server (NTRS)

    Bielawa, Richard L.

    1988-01-01

    In response to a systematic methodology assessment program directed to the aeroelastic stability of hingeless helicopter rotor blades, improved basic aeroelastic reformulations and new formulations relating to structural sweep were achieved. Correlational results are presented showing the substantially improved performance of the G400 aeroelastic analysis incorporating these new formulations. The formulations pertain partly to sundry solutions to classic problem areas, relating to dynamic inflow with vortex-ring state operation and basic blade kinematics, but mostly to improved physical modeling of elastic axis offset (structural sweep) in the presence of nonlinear structural twist. Specific issues addressed are an alternate modeling of the delta EI torsional excitation due to compound bending using a force integration approach, and the detailed kinematic representation of an elastically deflected point mass of a beam with both structural sweep and nonlinear twist.

  10. Vegetation projections for Wind Cave National Park with three future climate scenarios: Final report in completion of Task Agreement J8W07100052

    USGS Publications Warehouse

    King, David A.; Bachelet, Dominique M.; Symstad, Amy J.

    2013-01-01

    Since the initial application of MC1 to a small portion of WICA (Bachelet et al. 2000), the model has been altered to improve model performance with the inclusion of dynamic fire. Applying this improved version to WICA required substantial recalibration, during which we have made a number of improvements to MC1 that will be incorporated as permanent changes. In this report we document these changes and our calibration procedure following a brief overview of the model. We compare the projections of current vegetation to the current state of the park and present projections of vegetation dynamics under future climates downscaled from three GCMs selected to represent the existing range in available GCM projections. In doing so, we examine the consequences of different management options regarding fire and grazing, major aspects of biotic management at Wind Cave.

  11. Bundled Payments in Total Joint Replacement: Keeping Our Care Affordable and High in Quality.

    PubMed

    McLawhorn, Alexander S; Buller, Leonard T

    2017-09-01

    The purpose of this review was to evaluate the literature regarding bundle payment reimbursement models for total joint arthroplasty (TJA). From an economic standpoint, TJA are cost-effective, but they represent a substantial expense to the Centers for Medicare & Medicaid Services (CMS). Historically, fee-for-service payment models resulted in highly variable cost and quality. CMS introduced Bundled Payments for Care Improvement (BPCI) in 2012 and subsequently the Comprehensive Care for Joint Replacement (CJR) reimbursement model in 2016 to improve the value of TJA from the perspectives of both CMS and patients, by improving quality via cost control. Early results of bundled payments are promising, but preserving access to care for patients with high comorbidity burdens and those requiring more complex care is a lingering concern. Hospitals, regardless of current participation in bundled payments, should develop care pathways for TJA to maximize efficiency and patient safety.

  12. Extended charge banking model of dual path shocks for implantable cardioverter defibrillators

    PubMed Central

    Dosdall, Derek J; Sweeney, James D

    2008-01-01

    Background Single path defibrillation shock methods have been improved through the use of the Charge Banking Model of defibrillation, which predicts the response of the heart to shocks as a simple resistor-capacitor (RC) circuit. While dual path defibrillation configurations have significantly reduced defibrillation thresholds, improvements to dual path defibrillation techniques have been limited to experimental observations without a practical model to aid in improving dual path defibrillation techniques. Methods The Charge Banking Model has been extended into a new Extended Charge Banking Model of defibrillation that represents small sections of the heart as separate RC circuits, uses a weighting factor based on published defibrillation shock field gradient measures, and implements a critical mass criteria to predict the relative efficacy of single and dual path defibrillation shocks. Results The new model reproduced the results from several published experimental protocols that demonstrated the relative efficacy of dual path defibrillation shocks. The model predicts that time between phases or pulses of dual path defibrillation shock configurations should be minimized to maximize shock efficacy. Discussion Through this approach the Extended Charge Banking Model predictions may be used to improve dual path and multi-pulse defibrillation techniques, which have been shown experimentally to lower defibrillation thresholds substantially. The new model may be a useful tool to help in further improving dual path and multiple pulse defibrillation techniques by predicting optimal pulse durations and shock timing parameters. PMID:18673561
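
    The charge-banking view treats each piece of myocardium as an RC circuit driven by the local shock field; a minimal sketch of that response calculation is given below, with an invented biphasic truncated-exponential waveform and an assumed 3 ms membrane time constant. A dual-path extension would evaluate this per tissue patch with pathway-specific field weights, as the Extended Charge Banking Model does.

        import numpy as np

        def rc_response(stimulus, tau, dt):
            """Membrane response of one RC patch: dVm/dt = (stimulus - Vm) / tau."""
            vm = np.zeros_like(stimulus)
            for k in range(1, len(stimulus)):
                vm[k] = vm[k - 1] + dt * (stimulus[k - 1] - vm[k - 1]) / tau
            return vm

        dt, tau = 1.0e-5, 3.0e-3                  # 10 us time step, 3 ms time constant
        t = np.arange(0.0, 20.0e-3, dt)

        # Invented biphasic truncated-exponential shock (normalised local field strength).
        shock = np.where(t < 6.0e-3, np.exp(-t / 7.0e-3),
                         np.where(t < 12.0e-3, -0.5 * np.exp(-(t - 6.0e-3) / 7.0e-3), 0.0))

        vm = rc_response(shock, tau, dt)
        print(vm.max())                           # peak response, a simple efficacy surrogate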

  13. Prospects for improving the representation of coastal and shelf seas in global ocean models

    NASA Astrophysics Data System (ADS)

    Holt, Jason; Hyder, Patrick; Ashworth, Mike; Harle, James; Hewitt, Helene T.; Liu, Hedong; New, Adrian L.; Pickles, Stephen; Porter, Andrew; Popova, Ekaterina; Icarus Allen, J.; Siddorn, John; Wood, Richard

    2017-02-01

    Accurately representing coastal and shelf seas in global ocean models represents one of the grand challenges of Earth system science. They are regions of immense societal importance through the goods and services they provide, hazards they pose and their role in global-scale processes and cycles, e.g. carbon fluxes and dense water formation. However, they are poorly represented in the current generation of global ocean models. In this contribution, we aim to briefly characterise the problem, and then to identify the important physical processes, and their scales, needed to address this issue in the context of the options available to resolve these scales globally and the evolving computational landscape. We find barotropic and topographic scales are well resolved by the current state-of-the-art model resolutions, e.g. nominal 1/12°, and still reasonably well resolved at 1/4°; here, the focus is on process representation. We identify tides, vertical coordinates, river inflows and mixing schemes as four areas where modelling approaches can readily be transferred from regional to global modelling with substantial benefit. In terms of finer-scale processes, we find that a 1/12° global model resolves the first baroclinic Rossby radius for only ˜ 8 % of regions < 500 m deep, but this increases to ˜ 70 % for a 1/72° model, so resolving scales globally requires substantially finer resolution than the current state of the art. We quantify the benefit of improved resolution and process representation using 1/12° global- and basin-scale northern North Atlantic nucleus for a European model of the ocean (NEMO) simulations; the latter includes tides and a k-ɛ vertical mixing scheme. These are compared with global stratification observations and 19 models from CMIP5. In terms of correlation and basin-wide rms error, the high-resolution models outperform all these CMIP5 models. The model with tides shows improved seasonal cycles compared to the high-resolution model without tides. The benefits of resolution are particularly apparent in eastern boundary upwelling zones. To explore the balance between the size of a globally refined model and that of multiscale modelling options (e.g. finite element, finite volume or a two-way nesting approach), we consider a simple scale analysis and a conceptual grid refining approach. We put this analysis in the context of evolving computer systems, discussing model turnaround time, scalability and resource costs. Using a simple cost model compared to a reference configuration (taken to be a 1/4° global model in 2011) and the increasing performance of the UK Research Councils' computer facility, we estimate an unstructured mesh multiscale approach, resolving process scales down to 1.5 km, would use a comparable share of the computer resource by 2021, the two-way nested multiscale approach by 2022, and a 1/72° global model by 2026. However, we also note that a 1/12° global model would not have a comparable computational cost to a 1° global model in 2017 until 2027. Hence, we conclude that for computationally expensive models (e.g. for oceanographic research or operational oceanography), resolving scales to ˜ 1.5 km would be routinely practical in about a decade given substantial effort on numerical and computational development. For complex Earth system models, this extends to about 2 decades, suggesting the focus here needs to be on improved process parameterisation to meet these challenges.
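
    As a rough companion to the scale analysis described above, the sketch below shows one common way such cost estimates are framed: computational cost grows roughly with the cube of the horizontal refinement factor (two horizontal dimensions plus a CFL-limited time step), and the "affordable year" follows from an assumed growth rate in available compute. The cubic scaling, the 40% annual growth figure, and the reference configuration are illustrative assumptions, so the printed numbers will not reproduce the paper's estimates.

```python
import math

def relative_cost(target_deg, reference_deg=0.25):
    """Cost of a global model at resolution target_deg relative to a
    reference configuration, assuming cost ~ (refinement factor)**3
    (two horizontal dimensions plus a CFL-limited time step)."""
    return (reference_deg / target_deg) ** 3

def year_affordable(target_deg, reference_year=2011, annual_growth=1.4):
    """Year when the target resolution would cost the same share of the
    machine as the reference did in reference_year, assuming available
    compute grows by `annual_growth` per year (illustrative value only;
    the estimate is very sensitive to this assumption)."""
    extra_years = math.log(relative_cost(target_deg)) / math.log(annual_growth)
    return reference_year + math.ceil(extra_years)

for deg in (1 / 12, 1 / 36, 1 / 72):
    print(f"1/{round(1 / deg)} degree: ~{relative_cost(deg):,.0f}x the 1/4-degree cost, "
          f"comparable machine share around {year_affordable(deg)} (under these assumptions)")
```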

  14. Remote sensing inputs to landscape models which predict future spatial land use patterns for hydrologic models

    NASA Technical Reports Server (NTRS)

    Miller, L. D.; Tom, C.; Nualchawee, K.

    1977-01-01

    A tropical forest area of Northern Thailand provided a test case of the application of the approach in more natural surroundings. Remote sensing imagery subjected to proper computer analysis has been shown to be a very useful means of collecting spatial data for the science of hydrology. Remote sensing products provide direct input to hydrologic models and practical data bases for planning large and small-scale hydrologic developments. Combining the available remote sensing imagery together with available map information in the landscape model provides a basis for substantial improvements in these applications.

  15. Investigating Dry Deposition of Ozone to Vegetation

    NASA Astrophysics Data System (ADS)

    Silva, Sam J.; Heald, Colette L.

    2018-01-01

    Atmospheric ozone loss through dry deposition to vegetation is a critically important process for both air quality and ecosystem health. The majority of atmospheric chemistry models calculate dry deposition using a resistance-in-series parameterization by Wesely (1989), which is dependent on many environmental variables and lookup table values. The uncertainties contained within this parameterization have not been fully explored, ultimately challenging our ability to understand global scale biosphere-atmosphere interactions. In this work, we evaluate the GEOS-Chem model simulation of ozone dry deposition using a globally distributed suite of observations. We find that simulated daytime deposition velocities generally reproduce the magnitude of observations to within a factor of 1.4. When differences in land class between the observations and the model are correctly accounted for, these biases improve, most substantially over the grasses and shrubs land class. These biases do not impact the global ozone burden substantially; however, they do lead to local absolute changes of up to 4 ppbv and relative changes of 15% in summer surface concentrations. Using MERRA meteorology from 1979 to 2008, we find that the interannual variability in simulated annual mean ozone dry deposition due to model input meteorology is small (generally less than 5% over vegetated surfaces). Sensitivity experiments indicate that the simulation is most sensitive to the stomatal and ground surface resistances, as well as leaf area index. To improve ozone dry deposition models, more measurements are necessary over rainforests and various crop types, alongside constraints on individual depositional pathways and other in-canopy ozone loss processes.
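
    For readers unfamiliar with the Wesely-type scheme mentioned above, the deposition velocity is built from resistances acting in series, with the surface (canopy) resistance itself composed of parallel pathways. The snippet below is a minimal sketch of that structure; the resistance values are illustrative placeholders, not GEOS-Chem parameters.

```python
def deposition_velocity(r_a, r_b, r_c):
    """Dry deposition velocity (m/s) from the classic resistance-in-series
    form used in Wesely-type schemes: Vd = 1 / (Ra + Rb + Rc), where Ra is
    the aerodynamic, Rb the quasi-laminar boundary-layer, and Rc the bulk
    surface (canopy) resistance, all in s/m."""
    return 1.0 / (r_a + r_b + r_c)

# Illustrative daytime values over a vegetated surface (s/m); the surface
# resistance is itself built from stomatal, cuticular, and ground pathways
# acting in parallel: 1/Rc = 1/Rstom + 1/Rcut + 1/Rground.
r_a, r_b = 30.0, 20.0
r_stom, r_cut, r_ground = 200.0, 2000.0, 500.0
r_c = 1.0 / (1.0 / r_stom + 1.0 / r_cut + 1.0 / r_ground)

vd = deposition_velocity(r_a, r_b, r_c)
print(f"Rc = {r_c:.0f} s/m, Vd = {vd * 100:.2f} cm/s")
```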

  16. Array Effects in Large Wind Farms. Cooperative Research and Development Final Report, CRADA Number CRD-09-343

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moriarty, Patrick

    2016-02-23

    The effects of wind turbine wakes within operating wind farms have a substantial impact on the overall energy production from the farm. The current generation of models drastically underpredicts the impact of these wakes, leading to non-conservative estimates of energy capture and financial losses to wind farm operators and developers. To improve these models, detailed research on operating wind farms is necessary. Rebecca Barthelmie of Indiana University is a world leader in wind farm wake effects and would like to partner with NREL to help improve wind farm modeling by gathering additional wind farm data, developing better models, and increasing collaboration with European researchers working in the same area. This is currently an active area of research at NREL, and the capabilities of both parties should mesh nicely.

  17. A comparison of two coaching approaches to enhance implementation of a recovery-oriented service model.

    PubMed

    Deane, Frank P; Andresen, Retta; Crowe, Trevor P; Oades, Lindsay G; Ciarrochi, Joseph; Williams, Virginia

    2014-09-01

    Moving to recovery-oriented service provision in mental health may entail retraining existing staff, as well as training new staff. This represents a substantial burden on organisations, particularly since transfer of training into practice is often poor. Follow-up supervision and/or coaching have been found to improve the implementation and sustainment of new approaches. We compared the effect of two coaching conditions, skills-based and transformational coaching, on the implementation of a recovery-oriented model following training. Training followed by coaching led to significant sustained improvements in the quality of care planning in accordance with the new model over the 12-month study period. No interaction effect was observed between the two conditions. However, post hoc analyses suggest that transformational coaching warrants further exploration. The results support the provision of supervision in the form of coaching in the implementation of a recovery-oriented service model, and suggest the need to better elucidate the mechanisms within different coaching approaches that might contribute to improved care.

  18. Machine Learning Based Multi-Physical-Model Blending for Enhancing Renewable Energy Forecast -- Improvement via Situation Dependent Error Correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Siyuan; Hwang, Youngdeok; Khabibrakhmanov, Ildar

    With increasing penetration of solar and wind energy into the total energy supply mix, the pressing need for accurate energy forecasting has become well-recognized. Here we report the development of a machine-learning based model blending approach for statistically combining multiple meteorological models to improve the accuracy of solar/wind power forecasts. Importantly, we demonstrate that including, in addition to the parameters to be predicted (such as solar irradiance and power), additional atmospheric state parameters that collectively define weather situations as machine learning input provides further enhanced accuracy for the blended result. Functional analysis of variance shows that the error of an individual model has a substantial dependence on the weather situation. The machine-learning approach effectively reduces such situation-dependent error and thus produces more accurate results compared to conventional multi-model ensemble approaches based on simplistic equally or unequally weighted model averaging. Validation results over an extended period of time show more than 30% improvement in solar irradiance/power forecast accuracy compared to forecasts based on the best individual model.
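
    A minimal sketch of the blending idea follows: stack the individual model forecasts together with atmospheric state parameters that define the weather situation, train a regression model on them, and compare against an equal-weight multi-model mean. The data are synthetic, and the gradient-boosting learner is only a stand-in for whatever blending model the authors actually used.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Synthetic "truth" (e.g. solar irradiance) and a weather-situation feature
# (e.g. cloud regime); each model's error depends on the situation.
situation = rng.uniform(0, 1, n)
truth = 800 * (1 - situation) + rng.normal(0, 20, n)
model_a = truth + 60 * situation + rng.normal(0, 30, n)        # biased in cloudy regimes
model_b = truth - 80 * (1 - situation) + rng.normal(0, 30, n)  # biased in clear regimes

X = np.column_stack([model_a, model_b, situation])  # forecasts + state parameter
X_tr, X_te, y_tr, y_te = train_test_split(X, truth, random_state=0)

blend = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)

rmse = lambda y, p: float(np.sqrt(np.mean((y - p) ** 2)))
simple_mean = X_te[:, :2].mean(axis=1)              # equal-weight multi-model ensemble
print("equal-weight ensemble RMSE:", round(rmse(y_te, simple_mean), 1))
print("situation-aware blend RMSE:", round(rmse(y_te, blend.predict(X_te)), 1))
```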

  19. Deployment of e-health services - a business model engineering strategy.

    PubMed

    Kijl, Björn; Nieuwenhuis, Lambert J M; Huis in 't Veld, Rianne M H A; Hermens, Hermie J; Vollenbroek-Hutten, Miriam M R

    2010-01-01

    We designed a business model for deploying a myofeedback-based teletreatment service. An iterative and combined qualitative and quantitative action design approach was used for developing the business model and the related value network. Insights from surveys, desk research, expert interviews, workshops and quantitative modelling were combined to produce the first business model and then to refine it in three design cycles. The business model engineering strategy provided important insights which led to an improved, more viable and feasible business model and related value network design. Based on this experience, we conclude that the process of early stage business model engineering reduces risk and produces substantial savings in costs and resources related to service deployment.

  20. Potential impact of initialization on decadal predictions as assessed for CMIP5 models

    NASA Astrophysics Data System (ADS)

    Branstator, Grant; Teng, Haiyan

    2012-06-01

    To investigate the potential for initialization to improve decadal range predictions, we quantify the initial value predictability of upper 300 m temperature in the two northern ocean basins for 12 models from Coupled Model Intercomparison Project phase 5 (CMIP5), and we contrast it with the forced predictability in Representative Concentration Pathways (RCP) 4.5 climate change projections. We use a recently introduced method that produces predictability estimates from long control runs. Many initial states are considered, and we find on average 1) initialization has the potential to improve skill in the first 5 years in the North Pacific and the first 9 years in the North Atlantic, and 2) the impact from initialization becomes secondary compared to the impact of RCP4.5 forcing after 6 1/2 and 8 years in the two basins, respectively. Model-to-model and spatial variations in these limits are, however, substantial.

  1. Estimating the impact on health of poor reliability of drinking water interventions in developing countries.

    PubMed

    Hunter, Paul R; Zmirou-Navier, Denis; Hartemann, Philippe

    2009-04-01

    Recent evidence suggests that many improved drinking water supplies suffer from poor reliability. This study investigates what impact poor reliability may have on achieving health improvement targets. A Quantitative Microbiological Risk Assessment was conducted of the impact of interruptions in water supplies that forced people to revert to drinking raw water. Data from the literature were used to construct models for three waterborne pathogens common in Africa: Rotavirus, Cryptosporidium and Enterotoxigenic E. coli. Risk of infection by the target pathogens is substantially greater on days that people revert to raw water consumption. Over the course of a few days of raw water consumption, the annual health benefits attributed to consumption of water from an improved supply will be almost all lost. Furthermore, the risk of illness on days spent drinking raw water will fall substantially on very young children, who have the highest risk of death following infection. Agencies responsible for implementing improved drinking water provision will not make meaningful contributions to public health targets if those systems are subject to poor reliability. Funders of water quality interventions in developing countries should put more effort into auditing whether interventions are sustainable and whether the health benefits are being achieved.
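
    The sketch below illustrates the general shape of such a risk calculation, assuming a simple exponential dose-response model and independent daily exposures: a handful of days on raw water dominates the annual infection risk. The dose-response parameter and daily doses are illustrative placeholders, not values from the study.

```python
import numpy as np

def daily_infection_risk(dose, r=0.005):
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose).
    The parameter r and the doses used below are purely illustrative."""
    return 1.0 - np.exp(-r * dose)

def annual_risk(days_raw, dose_treated=0.01, dose_raw=10.0):
    """Annual infection risk when the supply fails on `days_raw` days,
    forcing consumers back to raw water, assuming independent daily risks."""
    p_treated = daily_infection_risk(dose_treated)
    p_raw = daily_infection_risk(dose_raw)
    return 1.0 - (1.0 - p_treated) ** (365 - days_raw) * (1.0 - p_raw) ** days_raw

for days in (0, 2, 7, 30):
    print(f"{days:3d} days on raw water -> annual infection risk {annual_risk(days):.3f}")
```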

  2. Flexible Fabrics with High Thermal Conductivity for Advanced Spacesuits

    NASA Technical Reports Server (NTRS)

    Trevino, Luis A.; Bue, Grant; Orndoff, Evelyne; Kesterson, Matt; Connel, John W.; Smith, Joseph G., Jr.; Southward, Robin E.; Working, Dennis; Watson, Kent A.; Delozier, Donovan M.

    2006-01-01

    This paper describes the effort and accomplishments for developing flexible fabrics with high thermal conductivity (FFHTC) for spacesuits to improve thermal performance, lower weight and reduce complexity. Commercial and additional space exploration applications that require substantial performance enhancements in removal and transport of heat away from equipment as well as from the human body can benefit from this technology. Improvements in thermal conductivity were achieved through the use of modified polymers containing thermally conductive additives. The objective of the FFHTC effort is to significantly improve the thermal conductivity of the liquid cooled ventilation garment by improving the thermal conductivity of the subcomponents (i.e., fabric and plastic tubes). This paper presents the initial system modeling studies, including a detailed liquid cooling garment model incorporated into the Wissler human thermal regulatory model, to quantify the necessary improvements in thermal conductivity and garment geometries needed to affect system performance. In addition, preliminary results of thermal conductivity improvements of the polymer components of the liquid cooled ventilation garment are presented. By improving thermal garment performance, major technology drivers will be addressed for lightweight, high thermal conductivity, flexible materials for spacesuits that are strategic technical challenges of the Exploration

  3. The Impact Of Medicare ACOs On Improving Integration And Coordination Of Physical And Behavioral Health Care.

    PubMed

    Fullerton, Catherine A; Henke, Rachel M; Crable, Erika L; Hohlbauch, Andriana; Cummings, Nicholas

    2016-07-01

    The accountable care organization (ACO) model holds the promise of reducing costs and improving the quality of care by realigning payment incentives to focus on health outcomes instead of service volume. One key to managing the total cost of care is improving care coordination for and treatment of people with behavioral health disorders. We examined qualitative data from ninety organizations participating in Medicare ACO demonstration programs from 2012 through 2015 to determine whether and how they focused on behavioral health care. These ACOs had mixed degrees of engagement in improving behavioral health care for their populations. The biggest challenges included a lack of behavioral health care providers, data availability, and sustainable financing models. Nonetheless, we found substantial interest in integrating behavioral health care into primary care across a majority of the ACOs. Project HOPE—The People-to-People Health Foundation, Inc.

  4. Dielectric Properties of Piezoelectric Polyimides

    NASA Technical Reports Server (NTRS)

    Ounaies, Z.; Young, J. A.; Simpson, J. O.; Farmer, B. L.

    1997-01-01

    Molecular modeling and dielectric measurements are being used to identify mechanisms governing piezoelectric behavior in polyimides, such as dipole orientation during poling, as well as the degree of piezoelectricity achievable. Molecular modeling on polyimides containing pendant, polar nitrile (CN) groups has been completed to determine their remanent polarization. Experimental investigation of their dielectric properties evaluated as a function of temperature and frequency has substantiated the numerical predictions. With this information in hand, we are able to suggest changes in the molecular structures that will improve the piezoelectric response.

  5. Simulation of thin slot spirals and dual circular patch antennas using the finite element method with mixed elements

    NASA Technical Reports Server (NTRS)

    Gong, Jian; Volakis, John L.; Nurnberger, Michael W.

    1995-01-01

    This semi-annual report describes progress up to mid-January 1995. The report contains five sections all dealing with the modeling of spiral and patch antennas recessed in metallic platforms. Of significance is the development of decomposition schemes which separate the different regions of the antenna volume. Substantial effort was devoted to improving the feed model in the context of the finite element method (FEM). Finally, an innovative scheme for truncating finite element meshes is presented.

  6. Advanced propulsion for LEO-Moon transport. 3: Transportation model. M.S. Thesis - California Univ.

    NASA Technical Reports Server (NTRS)

    Henley, Mark W.

    1992-01-01

    A simplified computational model of a low Earth orbit-Moon transportation system has been developed to provide insight into the benefits of new transportation technologies. A reference transportation infrastructure, based upon near-term technology developments, is used as a departure point for assessing other, more advanced alternatives. Comparison of the benefits of technology application, measured in terms of a mass payback ratio, suggests that several of the advanced technology alternatives could substantially improve the efficiency of low Earth orbit-Moon transportation.

  7. Public release of the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009)

    USGS Publications Warehouse

    Storchak, Dmitry A.; Di Giacomo, Domenico; Bondára, István; Engdahl, E. Robert; Harris, James; Lee, William H.K.; Villaseñor, Antonio; Bormann, Peter

    2013-01-01

    The International Seismological Centre–Global Earthquake Model (ISC–GEM) Global Instrumental Earthquake Catalogue (1900–2009) is the result of a special effort to substantially extend and improve currently existing global catalogs to serve the requirements of specific user groups who assess and model seismic hazard and risk. The data from the ISC–GEM Catalogue are expected to be used worldwide, and will prove especially essential in those regions where a high seismicity level strongly correlates with a high population density.

  8. Lessons learned from a colocation model using psychiatrists in urban primary care settings.

    PubMed

    Weiss, Meredith; Schwartz, Bruce J

    2013-07-01

    Comorbid psychiatric illness has been identified as a major driver of health care costs. The colocation of psychiatrists in primary care practices has been proposed as a model to improve mental health and medical care as well as a model to reduce health care costs. Financial models were developed to determine the sustainability of colocation. We found that the population studied had substantial psychiatric and medical burdens, and multiple practice logistical issues were identified. The providers found the experience highly rewarding and colocation was financially sustainable under certain conditions. The colocation model was effective in identifying and treating psychiatric comorbidities.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuypers, Marshall A.; Lambert, Gregory Joseph; Moore, Thomas W.

    Chronic infection with Hepatitis C virus (HCV) results in cirrhosis, liver cancer and death. As the nation's largest provider of care for HCV, the US Veterans Health Administration (VHA) invests extensive resources in the diagnosis and treatment of the disease. This report documents modeling and analysis of HCV treatment dynamics performed for the VHA aimed at improving service delivery efficiency. System dynamics modeling of disease treatment demonstrated the benefits of early detection and the role of comorbidities in disease progression and patient mortality. Preliminary modeling showed that adherence to rigorous treatment protocols is a primary determinant of treatment success. In-depth meta-analysis revealed correlations between adherence and various psycho-social factors. This initial meta-analysis indicates areas where substantial improvement in patient outcomes could result from VA programs that incorporate these factors into their design.
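
    The report itself is not reproduced here, but a stock-and-flow (system dynamics) formulation of a care cascade can be sketched in a few lines: compartments for undiagnosed, diagnosed, in-treatment, cured, and advanced-disease patients, with the detection rate as the lever of interest. All rates below are hypothetical and serve only to illustrate why earlier detection shifts outcomes in such a model.

```python
import numpy as np

def simulate(detection_rate, years=20, dt=0.1,
             treat_rate=0.5, cure_prob=0.9, progress_rate=0.03):
    """Minimal stock-and-flow sketch of an HCV care cascade.
    Stocks: Undiagnosed, Diagnosed, in Treatment, Cured, Advanced disease.
    All rates are per year and purely illustrative."""
    U, D, T, C, A = 100.0, 0.0, 0.0, 0.0, 0.0
    for _ in np.arange(0, years, dt):
        detect = detection_rate * U      # undiagnosed -> diagnosed
        start = treat_rate * D           # diagnosed -> in treatment
        finish = 1.0 * T                 # ~1-year treatment course
        prog_u = progress_rate * U       # untreated disease progression
        prog_d = progress_rate * D
        U += dt * (-detect - prog_u)
        D += dt * (detect - start - prog_d)
        T += dt * (start - finish)
        C += dt * (cure_prob * finish)
        A += dt * (prog_u + prog_d + (1 - cure_prob) * finish)
    return C, A

for rate in (0.05, 0.15, 0.30):
    cured, advanced = simulate(rate)
    print(f"detection {rate:.2f}/yr -> cured {cured:.0f}, advanced disease {advanced:.0f}")
```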

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daily, Michael D.; Olsen, Brett N.; Schlesinger, Paul H.

    In mammalian cells, cholesterol is essential for membrane function, but in excess can be cytotoxic. The cellular response to acute cholesterol loading involves biophysical-based mechanisms that regulate cholesterol levels, through modulation of the “activity” or accessibility of cholesterol to extra-membrane acceptors. Experiments and united atom (UA) simulations show that at high concentrations of cholesterol, lipid bilayers thin significantly and cholesterol availability to external acceptors increases substantially. Such cholesterol activation is critical to its trafficking within cells. Here we aim to reduce the computational cost to enable simulation of large and complex systems involved in cholesterol regulation, such as those including oxysterols and cholesterol-sensing proteins. To accomplish this, we have modified the published MARTINI coarse-grained force field to improve its predictions of cholesterol-induced changes in both macroscopic and microscopic properties of membranes. Most notably, MARTINI fails to capture both the (macroscopic) area condensation and membrane thickening seen at less than 30% cholesterol and the thinning seen above 40% cholesterol. The thinning at high concentration is critical to cholesterol activation. Microscopic properties of interest include cholesterol-cholesterol radial distribution functions (RDFs), tilt angle, and accessible surface area. First, we develop an “angle-corrected” model wherein we modify the coarse-grained bond angle potentials based on atomistic simulations. This modification significantly improves prediction of macroscopic properties, most notably the thickening/thinning behavior, and also slightly improves microscopic property prediction relative to MARTINI. Second, we add to the angle correction a “volume correction” by also adjusting phospholipid bond lengths to achieve a more accurate volume per molecule. The angle + volume correction substantially further improves the quantitative agreement of the macroscopic properties (area per molecule and thickness) with united atom simulations. However, this improvement also reduces the accuracy of microscopic predictions like radial distribution functions and cholesterol tilt below that of either MARTINI or the angle-corrected model. Thus, while both of our forcefield corrections improve MARTINI, the combined angle and volume correction should be used for problems involving sterol effects on the overall structure of the membrane, while our angle-corrected model should be used in cases where the properties of individual lipid and sterol models are critically important.
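
    For orientation, coarse-grained angle corrections of the kind described above amount to changing the force constant and equilibrium angle of the bond-angle potential. The snippet below evaluates the cosine-harmonic angle form commonly used in MARTINI-style models for an "original" and a hypothetical "corrected" parameter set; the numbers are illustrative, not the published force-field values.

```python
import numpy as np

def cosine_harmonic_angle(theta_deg, k, theta0_deg):
    """Cosine-harmonic bond-angle potential of the form commonly used in
    MARTINI-style coarse-grained models:
        V(theta) = 0.5 * k * (cos(theta) - cos(theta0))**2
    with k in kJ/mol and angles in degrees."""
    theta, theta0 = np.radians(theta_deg), np.radians(theta0_deg)
    return 0.5 * k * (np.cos(theta) - np.cos(theta0)) ** 2

theta = np.linspace(60, 180, 7)
original  = cosine_harmonic_angle(theta, k=25.0, theta0_deg=180.0)  # illustrative "published-style" values
corrected = cosine_harmonic_angle(theta, k=45.0, theta0_deg=160.0)  # hypothetical "angle-corrected" values

for t, vo, vc in zip(theta, original, corrected):
    print(f"theta = {t:5.1f} deg   V_orig = {vo:6.2f}   V_corr = {vc:6.2f} kJ/mol")
```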

  11. Product development in large furniture companies: a descriptive model with implications for character-marked products

    Treesearch

    Matt Bumgardner; Robert J. Bush; Cynthia D. West

    2001-01-01

    Previous research has shown that substantial yield improvements are possible when character-marks are not removed from hardwood furniture parts. Attempts to promote increased use of character-marked wood in furniture should be based on an understanding of how design concepts originate and move through the stages of product development. However, very little has...

  12. Beyond Relational: A Database Architecture and Federated Query Optimization in a Multi-Modal Healthcare Environment

    ERIC Educational Resources Information Center

    Hylock, Ray Hales

    2013-01-01

    Over the past thirty years, clinical research has benefited substantially from the adoption of electronic medical record systems. As deployment has increased, so too has the number of researchers seeking to improve the overall analytical environment by way of tools and models. Although much work has been done, there are still many uninvestigated…

  13. Team Up for 21st Century Teaching and Learning: What Research and Practice Reveal about Professional Learning. Condensed Excerpts

    ERIC Educational Resources Information Center

    Carroll, Thomas G., Ed.; Fulton, Kathleen, Ed.; Doerr, Hanna, Ed.

    2010-01-01

    This document contains excerpts from Team Up for 21st Century Teaching & Learning. It includes excerpts of five articles that provide a substantial evidence-based argument for the power of collaborative communities to improve teaching and learning. These articles are: (1) Professional Communities and the Artisan Model of…

  14. Studying the Effectiveness of Physical Education in the Secondary School (by the Example of Kazakhstan)

    ERIC Educational Resources Information Center

    Botagariyev, Tulegen A.; Kubiyeva, Svetlana S.; Baizakova, Venera E.; Mambetov, Nurolla; Tulegenov, Yerkin K.; Aralbayev, Alpysbay S.; Kairgozhin, Dulat U.

    2016-01-01

    The purpose of this study was to determine the effectiveness of the existing model of teaching physical training in secondary schools and to analyze a game-like method introduced to improve the physical fitness of students. The authors substantiated the use of a game-like method during physical training classes, the implementation of which should create…

  15. School Turnarounds: Evidence from the 2009 Stimulus. Program on Education Policy and Governance Working Papers Series. PEPG 12-04

    ERIC Educational Resources Information Center

    Dee, Thomas S.

    2012-01-01

    The American Recovery and Reinvestment Act of 2009 (ARRA) targeted substantial School Improvement Grants (SIGs) to the nation's "persistently lowest achieving" public schools (i.e., up to $2 million per school annually over 3 years) but required schools accepting these awards to implement a federally prescribed school-reform model.…

  16. Optimizing Estimated Loss Reduction for Active Sampling in Rank Learning

    DTIC Science & Technology

    2008-01-01

    active learning framework for SVM-based and boosting-based rank learning. Our approach suggests sampling based on maximizing the estimated loss differential over unlabeled data. Experimental results on two benchmark corpora show that the proposed model substantially reduces the labeling effort, and achieves superior performance rapidly with as much as 30% relative improvement over the margin-based sampling

  17. An alternate pathophysiologic paradigm of sepsis and septic shock

    PubMed Central

    Kumar, Anand

    2014-01-01

    The advent of modern antimicrobial therapy following the discovery of penicillin during the 1940s yielded remarkable improvements in the case fatality rate of serious infections, including septic shock. Since then, pathogens have continuously evolved under selective antimicrobial pressure, resulting in a lack of significant improvement in the clinical effectiveness of antimicrobial therapy for septic shock despite ever more broad-spectrum and potent drugs. In addition, although substantial effort and money have been expended on the development of novel non-antimicrobial therapies for sepsis in the past 30 years, clinical progress in this regard has been limited. This review explores the possibility that the current pathophysiologic paradigm of septic shock fails to appropriately consider the primacy of the microbial burden of infection as the primary driver of septic organ dysfunction. An alternate paradigm is offered that has substantial implications for optimizing antimicrobial therapy in septic shock. This model of disease progression suggests the key to significant improvement in the outcome of septic shock may lie, in great part, with improvements in delivery of existing antimicrobials and other anti-infectious strategies. Recognition of the role of delays in administration of antimicrobial therapy in the poor outcomes of septic shock is central to this effort. However, therapeutic strategies that improve the degree of antimicrobial cidality likely also have a crucial role. PMID:24184742

  18. A comparison of genomic selection models across time in interior spruce (Picea engelmannii × glauca) using unordered SNP imputation methods

    PubMed Central

    Ratcliffe, B; El-Dien, O G; Klápště, J; Porth, I; Chen, C; Jaquish, B; El-Kassaby, Y A

    2015-01-01

    Genomic selection (GS) potentially offers an unparalleled advantage over traditional pedigree-based selection (TS) methods by reducing the time commitment required to carry out a single cycle of tree improvement. This quality is particularly appealing to tree breeders, where lengthy improvement cycles are the norm. We explored the prospect of implementing GS for interior spruce (Picea engelmannii × glauca) utilizing a genotyped population of 769 trees belonging to 25 open-pollinated families. A series of repeated tree height measurements through ages 3–40 years permitted the testing of GS methods temporally. The genotyping-by-sequencing (GBS) platform was used for single nucleotide polymorphism (SNP) discovery in conjunction with three unordered imputation methods applied to a data set with 60% missing information. Further, three diverse GS models were evaluated based on predictive accuracy (PA), and their marker effects. Moderate levels of PA (0.31–0.55) were observed and were of sufficient capacity to deliver improved selection response over TS. Additionally, PA varied substantially through time in accordance with spatial competition among trees. As expected, temporal PA was well correlated with age-age genetic correlation (r=0.99), and decreased substantially with increasing difference in age between the training and validation populations (0.04–0.47). Moreover, our imputation comparisons indicate that k-nearest neighbor and singular value decomposition yielded a greater number of SNPs and gave higher predictive accuracies than imputing with the mean. Furthermore, the ridge regression (rrBLUP) and BayesCπ (BCπ) models yielded equal PA to each other and better PA than the generalized ridge regression heteroscedastic effect model for the traits evaluated. PMID:26126540
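
    A minimal sketch of the rrBLUP-style prediction and cross-validated accuracy metric used in studies like this one is shown below, with synthetic genotypes and phenotypes standing in for the spruce data; ridge regression with heavy shrinkage plays the role of rrBLUP, and predictive accuracy is the correlation between predicted and observed values in held-out folds.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
n_trees, n_snps = 769, 5000              # sizes echoing the study; data are synthetic

X = rng.binomial(2, 0.3, size=(n_trees, n_snps)).astype(float)  # SNP genotypes coded 0/1/2
true_effects = rng.normal(0, 0.05, n_snps)
height = X @ true_effects + rng.normal(0, 1.0, n_trees)         # phenotype = genetics + noise

accuracies = []
for train, test in KFold(n_splits=10, shuffle=True, random_state=1).split(X):
    model = Ridge(alpha=n_snps)          # heavy shrinkage of marker effects, rrBLUP-like
    model.fit(X[train], height[train])
    pred = model.predict(X[test])
    accuracies.append(np.corrcoef(pred, height[test])[0, 1])

print(f"predictive accuracy (mean r over folds): {np.mean(accuracies):.2f}")
```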

  19. A comparison of genomic selection models across time in interior spruce (Picea engelmannii × glauca) using unordered SNP imputation methods.

    PubMed

    Ratcliffe, B; El-Dien, O G; Klápště, J; Porth, I; Chen, C; Jaquish, B; El-Kassaby, Y A

    2015-12-01

    Genomic selection (GS) potentially offers an unparalleled advantage over traditional pedigree-based selection (TS) methods by reducing the time commitment required to carry out a single cycle of tree improvement. This quality is particularly appealing to tree breeders, where lengthy improvement cycles are the norm. We explored the prospect of implementing GS for interior spruce (Picea engelmannii × glauca) utilizing a genotyped population of 769 trees belonging to 25 open-pollinated families. A series of repeated tree height measurements through ages 3-40 years permitted the testing of GS methods temporally. The genotyping-by-sequencing (GBS) platform was used for single nucleotide polymorphism (SNP) discovery in conjunction with three unordered imputation methods applied to a data set with 60% missing information. Further, three diverse GS models were evaluated based on predictive accuracy (PA), and their marker effects. Moderate levels of PA (0.31-0.55) were observed and were of sufficient capacity to deliver improved selection response over TS. Additionally, PA varied substantially through time in accordance with spatial competition among trees. As expected, temporal PA was well correlated with age-age genetic correlation (r=0.99), and decreased substantially with increasing difference in age between the training and validation populations (0.04-0.47). Moreover, our imputation comparisons indicate that k-nearest neighbor and singular value decomposition yielded a greater number of SNPs and gave higher predictive accuracies than imputing with the mean. Furthermore, the ridge regression (rrBLUP) and BayesCπ (BCπ) models yielded equal PA to each other and better PA than the generalized ridge regression heteroscedastic effect model for the traits evaluated.

  20. Joint estimation over multiple individuals improves behavioural state inference from animal movement data.

    PubMed

    Jonsen, Ian

    2016-02-08

    State-space models provide a powerful way to scale up inference of movement behaviours from individuals to populations when the inference is made across multiple individuals. Here, I show how a joint estimation approach that assumes individuals share identical movement parameters can lead to improved inference of behavioural states associated with different movement processes. I use simulated movement paths with known behavioural states to compare estimation error between nonhierarchical and joint estimation formulations of an otherwise identical state-space model. Behavioural state estimation error was strongly affected by the degree of similarity between movement patterns characterising the behavioural states, with less error when movements were strongly dissimilar between states. The joint estimation model improved behavioural state estimation relative to the nonhierarchical model for simulated data with heavy-tailed Argos location errors. When applied to Argos telemetry datasets from 10 Weddell seals, the nonhierarchical model estimated highly uncertain behavioural state switching probabilities for most individuals whereas the joint estimation model yielded substantially less uncertainty. The joint estimation model better resolved the behavioural state sequences across all seals. Hierarchical or joint estimation models should be the preferred choice for estimating behavioural states from animal movement data, especially when location data are error-prone.

  1. Lower- Versus Higher-Income Populations In The Alternative Quality Contract: Improved Quality And Similar Spending.

    PubMed

    Song, Zirui; Rose, Sherri; Chernew, Michael E; Safran, Dana Gelb

    2017-01-01

    As population-based payment models become increasingly common, it is crucial to understand how such payment models affect health disparities. We evaluated health care quality and spending among enrollees in areas with lower versus higher socioeconomic status in Massachusetts before and after providers entered into the Alternative Quality Contract, a two-sided population-based payment model with substantial incentives tied to quality. We compared changes in process measures, outcome measures, and spending between enrollees in areas with lower and higher socioeconomic status from 2006 to 2012 (outcome measures were measured after the intervention only). Quality improved for all enrollees in the Alternative Quality Contract after their provider organizations entered the contract. Process measures improved 1.2 percentage points per year more among enrollees in areas with lower socioeconomic status than among those in areas with higher socioeconomic status. Outcome measure improvement was no different between the subgroups; neither were changes in spending. Larger or comparable improvements in quality among enrollees in areas with lower socioeconomic status suggest a potential narrowing of disparities. Strong pay-for-performance incentives within a population-based payment model could encourage providers to focus on improving quality for more disadvantaged populations. Project HOPE—The People-to-People Health Foundation, Inc.

  2. Toward Realistic Simulation of low-Level Clouds Using a Multiscale Modeling Framework With a Third-Order Turbulence Closure in its Cloud-Resolving Model Component

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man; Cheng, Anning

    2010-01-01

    This study presents preliminary results from a multiscale modeling framework (MMF) with an advanced third-order turbulence closure in its cloud-resolving model (CRM) component. In the original MMF, the Community Atmosphere Model (CAM3.5) is used as the host general circulation model (GCM), and the System for Atmospheric Modeling with a first-order turbulence closure is used as the CRM for representing cloud processes in each grid box of the GCM. The results of annual and seasonal means and diurnal variability are compared between the modified and original MMFs and the CAM3.5. The global distributions of low-level cloud amounts and precipitation and the amounts of low-level clouds in the subtropics and middle-level clouds in mid-latitude storm track regions in the modified MMF show substantial improvement relative to the original MMF when both are compared to observations. Some improvements can also be seen in the diurnal variability of precipitation.

  3. Sensitivity studies and a simple ozone perturbation experiment with a truncated two-dimensional model of the stratosphere

    NASA Technical Reports Server (NTRS)

    Stordal, Frode; Garcia, Rolando R.

    1987-01-01

    The 1-1/2-D model of Holton (1986), which is actually a highly truncated two-dimensional model, describes latitudinal variations of tracer mixing ratios in terms of their projections onto second-order Legendre polynomials. The present study extends the work of Holton by including tracers with photochemical production in the stratosphere (O3 and NOy). It also includes latitudinal variations in the photochemical sources and sinks, improving slightly the calculated global mean profiles for the long-lived tracers studied by Holton and improving substantially the latitudinal behavior of ozone. Sensitivity tests of the dynamical parameters in the model are performed, showing that the response of the model to changes in vertical residual meridional winds and horizontal diffusion coefficients is similar to that of a full two-dimensional model. A simple ozone perturbation experiment shows the model's ability to reproduce large-scale latitudinal variations in total ozone column depletions as well as ozone changes in the chemically controlled upper stratosphere.
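
    To make the truncation concrete, the sketch below projects a hypothetical latitudinal tracer distribution onto Legendre polynomials in mu = sin(latitude) and keeps only the zeroth- and second-order terms, which is the sense in which such a highly truncated model carries latitudinal structure. The example field is invented for illustration and is not taken from the paper.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Latitudinal grid expressed as mu = sin(latitude), the natural coordinate
# for Legendre expansions.
lat = np.linspace(-90, 90, 181)
mu = np.sin(np.radians(lat))

# Hypothetical column-ozone-like distribution: low near the equator, higher
# toward the poles (illustrative shape only).
field = 260 + 120 * mu**2 + 15 * mu**3

# Project onto Legendre polynomials P0..P3 and keep only P0 and P2,
# mimicking a highly truncated latitudinal representation.
coeffs = L.legfit(mu, field, deg=3)
truncated = coeffs.copy()
truncated[[1, 3]] = 0.0                   # drop the odd (antisymmetric) terms

reconstruction = L.legval(mu, truncated)
print("retained coefficients (P0, P2):", coeffs[0].round(1), coeffs[2].round(1))
print("max error of the P0+P2 truncation:", np.abs(reconstruction - field).max().round(1))
```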

  4. Assimilation of GRACE Terrestrial Water Storage Observations into a Land Surface Model for the Assessment of Regional Flood Potential

    NASA Technical Reports Server (NTRS)

    Reager, John T.; Thomas, Alys C.; Sproles, Eric A.; Rodell, Matthew; Beaudoing, Hiroko K.; Li, Bailing; Famiglietti, James S.

    2015-01-01

    We evaluate the performance of the Catchment Land Surface Model (CLSM) under flood conditions after the assimilation of observations of the terrestrial water storage anomaly (TWSA) from NASA's Gravity Recovery and Climate Experiment (GRACE). Assimilation offers three key benefits for the viability of GRACE observations to operational applications: (1) near-real time analysis; (2) a downscaling of GRACE's coarse spatial resolution; and (3) state disaggregation of the vertically-integrated TWSA. We select the 2011 flood event in the Missouri river basin as a case study, and find that assimilation generally made the model wetter in the months preceding the flood. We compare model outputs with observations from 14 USGS groundwater wells to assess improvements after assimilation. Finally, we examine disaggregated water storage information to improve the mechanistic understanding of event generation. Validation establishes that assimilation improved the model skill substantially, increasing regional groundwater anomaly correlation from 0.58 to 0.86. For the 2011 flood event in the Missouri river basin, results show that groundwater and snow water equivalent were contributors to pre-event flood potential, providing spatially-distributed early warning information.
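
    Stripped of the ensemble machinery and spatial disaggregation, the core of such an assimilation step is a variance-weighted blend of the model forecast and the GRACE observation. The scalar Kalman-style update below is a sketch under that simplification; the numbers are illustrative, not values from the study.

```python
def kalman_update(forecast, obs, var_forecast, var_obs):
    """Scalar Kalman-style analysis of a terrestrial water storage anomaly:
    weight the model forecast and the GRACE observation by their error
    variances. Values used below are illustrative (cm equivalent water)."""
    gain = var_forecast / (var_forecast + var_obs)
    analysis = forecast + gain * (obs - forecast)
    var_analysis = (1.0 - gain) * var_forecast
    return analysis, var_analysis

# Model is drier than GRACE ahead of a flood season; the update nudges the
# state wetter, which the full assimilation system then disaggregates among
# groundwater, soil moisture, and snow.
analysis, var_a = kalman_update(forecast=5.0, obs=12.0, var_forecast=9.0, var_obs=4.0)
print(f"analysis TWSA = {analysis:.1f} cm (error variance {var_a:.1f})")
```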

  5. Patient safety in anesthesia: learning from the culture of high-reliability organizations.

    PubMed

    Wright, Suzanne M

    2015-03-01

    There has been an increased awareness of and interest in patient safety and improved outcomes, as well as a growing body of evidence substantiating medical error as a leading cause of death and injury in the United States. According to The Joint Commission, US hospitals demonstrate improvements in health care quality and patient safety. Although this progress is encouraging, much room for improvement remains. High-reliability organizations, industries that deliver reliable performances in the face of complex working environments, can serve as models of safety for our health care system until plausible explanations for patient harm are better understood. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Divergent projections of future land use in the United States arising from different models and scenarios

    USGS Publications Warehouse

    Sohl, Terry L.; Wimberly, Michael; Radeloff, Volker C.; Theobald, David M.; Sleeter, Benjamin M.

    2016-01-01

    A variety of land-use and land-cover (LULC) models operating at scales from local to global have been developed in recent years, including a number of models that provide spatially explicit, multi-class LULC projections for the conterminous United States. This diversity of modeling approaches raises the question: how consistent are their projections of future land use? We compared projections from six LULC modeling applications for the United States and assessed quantitative, spatial, and conceptual inconsistencies. Each set of projections provided multiple scenarios covering a period from roughly 2000 to 2050. Given the unique spatial, thematic, and temporal characteristics of each set of projections, individual projections were aggregated to a common set of basic, generalized LULC classes (i.e., cropland, pasture, forest, range, and urban) and summarized at the county level across the conterminous United States. We found very little agreement in projected future LULC trends and patterns among the different models. Variability among scenarios for a given model was generally lower than variability among different models, in terms of both trends in the amounts of basic LULC classes and their projected spatial patterns. Even when different models assessed the same purported scenario, model projections varied substantially. Projections of agricultural trends were often far above the maximum historical amounts, raising concerns about the realism of the projections. Comparisons among models were hindered by major discrepancies in categorical definitions, and suggest a need for standardization of historical LULC data sources. To capture a broader range of uncertainties, ensemble modeling approaches are also recommended. However, the vast inconsistencies among LULC models raise questions about the theoretical and conceptual underpinnings of current modeling approaches. Given the substantial effects that land-use change can have on ecological and societal processes, there is a need for improvement in LULC theory and modeling capabilities to improve acceptance and use of regional- to national-scale LULC projections for the United States and elsewhere.

  7. On the Representation of Ice Nucleation in Global Climate Models, and its Importance for Simulations of Climate Forcings and Feedbacks

    NASA Astrophysics Data System (ADS)

    Storelvmo, T.

    2015-12-01

    Substantial improvements have been made to the cloud microphysical schemes used in the latest generation of global climate models (GCMs), however, an outstanding weakness of these schemes lies in the arbitrariness of their tuning parameters. Despite the growing effort in improving the cloud microphysical schemes in GCMs, most of this effort has not focused on improving the ability of GCMs to accurately simulate phase partitioning in mixed-phase clouds. Getting the relative proportion of liquid droplets and ice crystals in clouds right in GCMs is critical for the representation of cloud radiative forcings and cloud-climate feedbacks. Here, we first present satellite observations of cloud phase obtained by NASA's CALIOP instrument, and report on robust statistical relationships between cloud phase and several aerosols species that have been demonstrated to act as ice nuclei (IN) in laboratory studies. We then report on results from model intercomparison projects that reveal that GCMs generally underestimate the amount of supercooled liquid in clouds. For a selected GCM (NCAR 's CAM5), we thereafter show that the underestimate can be attributed to two main factors: i) the presence of IN in the mixed-phase temperature range, and ii) the Wegener-Bergeron-Findeisen process, which converts liquid to ice once ice crystals have formed. Finally, we show that adjusting these two processes such that the GCM's cloud phase is in agreement with the observed has a substantial impact on the simulated radiative forcing due to IN perturbations, as well as on the cloud-climate feedbacks and ultimately climate sensitivity simulated by the GCM.

  8. Analyzing extreme sea levels for broad-scale impact and adaptation studies

    NASA Astrophysics Data System (ADS)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A.

    2017-12-01

    Coastal impact and adaptation assessments require detailed knowledge of extreme sea levels (ESL), because increasing damage due to extreme events is one of the major consequences of sea-level rise (SLR) and climate change. Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future SLR; different scenarios were developed with process-based or semi-empirical models and used for coastal impact studies at various temporal and spatial scales to guide coastal management and adaptation efforts. Uncertainties in future SLR are typically accounted for by analyzing the impacts associated with a range of scenarios and model ensembles. ESL distributions are then displaced vertically according to the SLR scenarios under the inherent assumption that we have perfect knowledge of the statistics of extremes. However, there is still a limited understanding of present-day ESL, which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) numerical models that are used to generate long time series of storm surge water levels, and (2) statistical models used for determining present-day ESL exceedance probabilities. There is no universally accepted approach to obtain such values for broad-scale flood risk assessments and while substantial research has explored SLR uncertainties, we quantify, for the first time globally, key uncertainties in ESL estimates. We find that contemporary ESL uncertainties exceed those from SLR projections and, assuming that we meet the Paris agreement, the projected SLR itself by the end of the century. Our results highlight the necessity to further improve our understanding of uncertainties in ESL estimates through (1) continued improvement of numerical and statistical models to simulate and analyze coastal water levels and (2) exploitation of the rich observational database and continued data archeology to obtain longer time series and remove model bias. Finally, ESL uncertainties need to be integrated with SLR uncertainties. Otherwise, important improvements in providing more robust SLR projections are of less benefit for broad-scale impact and adaptation studies and decision processes.
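
    As an illustration of the statistical-model side of the ESL uncertainty discussed above, the sketch below fits a generalized extreme value (GEV) distribution to a synthetic record of annual maximum water levels and uses a bootstrap to show how wide the sampling uncertainty on a 100-year return level can be for a record of typical tide-gauge length. The data and parameters are synthetic and do not come from the study.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)
annual_maxima = genextreme.rvs(c=-0.1, loc=1.8, scale=0.25, size=60,
                               random_state=rng)   # synthetic 60-year record (m)

def return_level(sample, return_period=100):
    """Fit a GEV to annual maxima and return the level exceeded on average
    once per `return_period` years."""
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1.0 - 1.0 / return_period, c, loc=loc, scale=scale)

best = return_level(annual_maxima)
boot = [return_level(rng.choice(annual_maxima, size=annual_maxima.size, replace=True))
        for _ in range(200)]                       # bootstrap resampling of the record
lo, hi = np.percentile(boot, [5, 95])
print(f"100-year level: {best:.2f} m (90% bootstrap range {lo:.2f}-{hi:.2f} m)")
```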

  9. Operational seasonal forecasting of crop performance.

    PubMed

    Stone, Roger C; Meinke, Holger

    2005-11-29

    Integrated, interdisciplinary crop performance forecasting systems, linked with appropriate decision and discussion support tools, could substantially improve operational decision making in agricultural management. Recent developments in connecting numerical weather prediction models and general circulation models with quantitative crop growth models offer the potential for development of integrated systems that incorporate components of long-term climate change. However, operational seasonal forecasting systems have little or no value unless they are able to change key management decisions. Changed decision making through incorporation of seasonal forecasting ultimately has to demonstrate improved long-term performance of the cropping enterprise. Simulation analyses conducted on specific production scenarios are especially useful in improving decisions, particularly if this is done in conjunction with development of decision-support systems and associated facilitated discussion groups. Improved management of the overall crop production system requires an interdisciplinary approach, where climate scientists, agricultural scientists and extension specialists are intimately linked with crop production managers in the development of targeted seasonal forecast systems. The same principle applies in developing improved operational management systems for commodity trading organizations, milling companies and agricultural marketing organizations. Application of seasonal forecast systems across the whole value chain in agricultural production offers considerable benefits in improving overall operational management of agricultural production.

  10. Operational seasonal forecasting of crop performance

    PubMed Central

    Stone, Roger C; Meinke, Holger

    2005-01-01

    Integrated, interdisciplinary crop performance forecasting systems, linked with appropriate decision and discussion support tools, could substantially improve operational decision making in agricultural management. Recent developments in connecting numerical weather prediction models and general circulation models with quantitative crop growth models offer the potential for development of integrated systems that incorporate components of long-term climate change. However, operational seasonal forecasting systems have little or no value unless they are able to change key management decisions. Changed decision making through incorporation of seasonal forecasting ultimately has to demonstrate improved long-term performance of the cropping enterprise. Simulation analyses conducted on specific production scenarios are especially useful in improving decisions, particularly if this is done in conjunction with development of decision-support systems and associated facilitated discussion groups. Improved management of the overall crop production system requires an interdisciplinary approach, where climate scientists, agricultural scientists and extension specialists are intimately linked with crop production managers in the development of targeted seasonal forecast systems. The same principle applies in developing improved operational management systems for commodity trading organizations, milling companies and agricultural marketing organizations. Application of seasonal forecast systems across the whole value chain in agricultural production offers considerable benefits in improving overall operational management of agricultural production. PMID:16433097

  11. Past, Present, and Future Capabilities of the Transonic Dynamics Tunnel from an Aeroelasticity Perspective

    NASA Technical Reports Server (NTRS)

    Cole, Stanley R.; Garcia, Jerry L.

    2000-01-01

    The NASA Langley Transonic Dynamics Tunnel (TDT) has provided a unique capability for aeroelastic testing for forty years. The facility has a rich history of significant contributions to the design of many United States commercial transports, military aircraft, launch vehicles, and spacecraft. The facility has many features that contribute to its uniqueness for aeroelasticity testing, perhaps the most important feature being the use of a heavy gas test medium to achieve higher test densities. Higher test medium densities substantially ease model-building requirements and therefore simplify the fabrication process for building aeroelastically scaled wind tunnel models. Aeroelastic scaling for the heavy gas results in lower model structural frequencies. Lower model frequencies tend to make aeroelastic testing safer. This paper will describe major developments in the testing capabilities at the TDT throughout its history, the current status of the facility, and planned additions and improvements to its capabilities in the near future.

  12. The effect of a loss of model structural detail due to network skeletonization on contamination warning system design: case studies.

    PubMed

    Davis, Michael J; Janke, Robert

    2018-01-04

    The effect of limitations in the structural detail available in a network model on contamination warning system (CWS) design was examined in case studies using the original and skeletonized network models for two water distribution systems (WDSs). The skeletonized models were used as proxies for incomplete network models. CWS designs were developed by optimizing sensor placements for worst-case and mean-case contamination events. Designs developed using the skeletonized network models were transplanted into the original network model for evaluation. CWS performance was defined as the number of people who ingest more than some quantity of a contaminant in tap water before the CWS detects the presence of contamination. Lack of structural detail in a network model can result in CWS designs that (1) provide considerably less protection against worst-case contamination events than that obtained when a more complete network model is available and (2) yield substantial underestimates of the consequences associated with a contamination event. Nevertheless, CWSs developed using skeletonized network models can provide useful reductions in consequences for contaminants whose effects are not localized near the injection location. Mean-case designs can yield worst-case performances similar to those for worst-case designs when there is uncertainty in the network model. Improvements in network models for WDSs have the potential to yield significant improvements in CWS designs as well as more realistic evaluations of those designs. Although such improvements would be expected to yield improved CWS performance, the expected improvements in CWS performance have not been quantified previously. The results presented here should be useful to those responsible for the design or implementation of CWSs, particularly managers and engineers in water utilities, and encourage the development of improved network models.
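
    The sensor-placement optimization described above can be sketched with a simple greedy heuristic: given an impact value for each contamination scenario and each candidate sensor location (with a penalty for non-detection), repeatedly add the sensor that most reduces the mean-case or worst-case impact. The impact matrix below is random and purely illustrative; real CWS designs derive it from hydraulic and water-quality simulations of the network model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_scenarios, n_nodes = 200, 40

# impact[s, j]: people exposed in scenario s if node j is the first sensor
# to detect it; a large penalty stands in for "never detected".
impact = rng.integers(50, 5000, size=(n_scenarios, n_nodes)).astype(float)
impact[rng.random((n_scenarios, n_nodes)) < 0.3] = 50_000.0   # non-detection penalty

def greedy_placement(impact, n_sensors, objective=np.mean):
    """Greedily add the sensor that most reduces the chosen objective
    (mean-case: np.mean, worst-case: np.max) over all scenarios."""
    chosen = []
    for _ in range(n_sensors):
        best_node, best_score = None, np.inf
        for j in range(impact.shape[1]):
            if j in chosen:
                continue
            per_scenario = impact[:, chosen + [j]].min(axis=1)
            score = objective(per_scenario)
            if score < best_score:
                best_node, best_score = j, score
        chosen.append(best_node)
    return chosen, best_score

for name, obj in (("mean-case", np.mean), ("worst-case", np.max)):
    nodes, score = greedy_placement(impact, n_sensors=5, objective=obj)
    print(f"{name} design: sensors at nodes {nodes}, objective = {score:,.0f}")
```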

  13. The effect of a loss of model structural detail due to network skeletonization on contamination warning system design: case studies

    NASA Astrophysics Data System (ADS)

    Davis, Michael J.; Janke, Robert

    2018-05-01

    The effect of limitations in the structural detail available in a network model on contamination warning system (CWS) design was examined in case studies using the original and skeletonized network models for two water distribution systems (WDSs). The skeletonized models were used as proxies for incomplete network models. CWS designs were developed by optimizing sensor placements for worst-case and mean-case contamination events. Designs developed using the skeletonized network models were transplanted into the original network model for evaluation. CWS performance was defined as the number of people who ingest more than some quantity of a contaminant in tap water before the CWS detects the presence of contamination. Lack of structural detail in a network model can result in CWS designs that (1) provide considerably less protection against worst-case contamination events than that obtained when a more complete network model is available and (2) yield substantial underestimates of the consequences associated with a contamination event. Nevertheless, CWSs developed using skeletonized network models can provide useful reductions in consequences for contaminants whose effects are not localized near the injection location. Mean-case designs can yield worst-case performances similar to those for worst-case designs when there is uncertainty in the network model. Improvements in network models for WDSs have the potential to yield significant improvements in CWS designs as well as more realistic evaluations of those designs. Although such improvements would be expected to yield improved CWS performance, the expected improvements in CWS performance have not been quantified previously. The results presented here should be useful to those responsible for the design or implementation of CWSs, particularly managers and engineers in water utilities, and encourage the development of improved network models.

  14. Hybrid Environmental Control System Integrated Modeling Trade Study Analysis for Commercial Aviation

    NASA Astrophysics Data System (ADS)

    Parrilla, Javier

    Current industry trends indicate that aircraft electrification will be part of future platforms in order to achieve higher levels of efficiency in various vehicle-level sub-systems. However, electrification requires a substantial change in aircraft design that is not suitable for the re-winged or re-engined applications some aircraft manufacturers are opting for today. Thermal limits arise as engine cores progressively get smaller and hotter to improve overall engine efficiency, while legacy systems still demand a substantial amount of pneumatic, hydraulic and electric power extraction. The environmental control system (ECS) provides pressurization, ventilation and air conditioning in commercial aircraft, making it the main heat sink for all aircraft loads with the exception of the engine. To mitigate the architecture's thermal limits in an efficient manner, the way in which the ECS interacts with the engine will have to be enhanced so as to reduce the overall energy consumed and achieve an energy-optimized solution. This study presents a tradeoff analysis of an electric ECS using a fully integrated Numerical Propulsion System Simulation (NPSS) model that is capable of studying the interaction between the ECS and the engine cycle deck. It was found that the best solution lies in a hybrid ECS that strikes the correct balance between a traditional pneumatic and a fully electric system. This intermediate architecture offers a substantial improvement in aircraft fuel consumption due to a reduced amount of waste heat and customer bleed, in exchange for partial electrification of the air-conditioning pack, which makes it a viable option for re-winged applications.

  15. Estimating energy expenditure from heart rate in older adults: a case for calibration.

    PubMed

    Schrack, Jennifer A; Zipunnikov, Vadim; Goldsmith, Jeff; Bandeen-Roche, Karen; Crainiceanu, Ciprian M; Ferrucci, Luigi

    2014-01-01

    Accurate measurement of free-living energy expenditure is vital to understanding changes in energy metabolism with aging. The efficacy of heart rate as a surrogate for energy expenditure is rooted in the assumption of a linear function between heart rate and energy expenditure, but its validity and reliability in older adults remain unclear. The aim of this study was to assess the validity and reliability of the linear function between heart rate and energy expenditure in older adults using different levels of calibration. Heart rate and energy expenditure were assessed across five levels of exertion in 290 adults participating in the Baltimore Longitudinal Study of Aging. Correlation and random-effects regression analyses assessed the linearity of the relationship between heart rate and energy expenditure, and cross-validation models assessed predictive performance. Heart rate and energy expenditure were highly correlated (r=0.98) and linear regardless of age or sex. Intra-person variability was low but inter-person variability was high, with substantial heterogeneity of the random intercept (s.d. = 0.372) despite similar slopes. Cross-validation models indicated that individual calibration data substantially improve the accuracy of energy expenditure predictions from heart rate, reducing the potential for considerable measurement bias. Although using five calibration measures provided the greatest reduction in the standard deviation of prediction errors (1.08 kcal/min), substantial improvement was also noted with two (0.75 kcal/min). These findings indicate that standard regression equations may be used to make population-level inferences when estimating energy expenditure from heart rate in older adults, but caution should be exercised when making inferences at the individual level without proper calibration.
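    The calibration argument can be illustrated with a short simulation: a shared heart rate-energy expenditure slope with person-specific intercepts, where a few individual calibration points recover most of the accuracy lost by a purely population-level regression. The sketch below uses synthetic data and hypothetical parameter values, not the BLSA measurements.

```python
import numpy as np

# Minimal sketch (simulated data, not the study's dataset): energy expenditure
# (EE) is linear in heart rate (HR) with a shared slope but person-specific
# intercepts. Compare a pooled population regression against predictions that
# use a few individual calibration points to estimate each person's intercept.

rng = np.random.default_rng(1)
n_subj, n_obs = 50, 20
slope = 0.05                                   # kcal/min per bpm (hypothetical)
intercepts = rng.normal(-1.0, 0.37, n_subj)    # person-specific intercepts

hr = rng.uniform(60, 150, (n_subj, n_obs))
ee = intercepts[:, None] + slope * hr + rng.normal(0, 0.1, (n_subj, n_obs))

# Population-level fit (pooled ordinary least squares).
X = np.column_stack([np.ones(hr.size), hr.ravel()])
beta = np.linalg.lstsq(X, ee.ravel(), rcond=None)[0]

def rmse(pred, truth):
    return float(np.sqrt(np.mean((pred - truth) ** 2)))

test = slice(5, None)                          # hold out later points for evaluation
pred_pop = beta[0] + beta[1] * hr[:, test]
print("population model RMSE:", round(rmse(pred_pop, ee[:, test]), 3))

# Individual calibration: use the first n_cal points per person to re-estimate
# the intercept while keeping the population slope.
for n_cal in (2, 5):
    cal_intercept = (ee[:, :n_cal] - beta[1] * hr[:, :n_cal]).mean(axis=1)
    pred_cal = cal_intercept[:, None] + beta[1] * hr[:, test]
    print(f"calibrated with {n_cal} points, RMSE:", round(rmse(pred_cal, ee[:, test]), 3))
```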

  16. GEOS S2S-2_1: GMAO's New High Resolution Seasonal Prediction System

    NASA Technical Reports Server (NTRS)

    Molod, Andrea; Akella, Santha; Andrews, Lauren; Barahona, Donifan; Borovikov, Anna; Chang, Yehui; Cullather, Richard; Hackert, Eric; Kovach, Robin; Koster, Randal; hide

    2017-01-01

    A new version of the modeling and analysis system used to produce sub-seasonal to seasonal forecasts has just been released by the NASA Goddard Global Modeling and Assimilation Office. The new version runs at higher atmospheric resolution (approximately 1/2 degree globally), contains a substantially improved model description of the cryosphere, and includes additional interactive earth system model components (an aerosol model). In addition, the ocean data assimilation system has been replaced with a Local Ensemble Transform Kalman Filter. Here we describe the new system, along with plans for the future version (GEOS S2S-3_0), which will include a higher-resolution ocean model and more interactive earth system model components (interactive vegetation, biomass burning from fires). We also present results from a free-running coupled simulation with the new system and results from a series of retrospective seasonal forecasts. Results from the retrospective forecasts show significant improvements in surface temperatures over much of the northern hemisphere and a much improved prediction of sea ice extent in both hemispheres. The precipitation forecast skill is comparable to that of previous S2S systems, and the only trade-off is an increased double ITCZ, which is expected as we go to higher atmospheric resolution.

  17. Forecasting Influenza Outbreaks in Boroughs and Neighborhoods of New York City.

    PubMed

    Yang, Wan; Olson, Donald R; Shaman, Jeffrey

    2016-11-01

    The ideal spatial scale, or granularity, at which infectious disease incidence should be monitored and forecast has been little explored. By identifying the optimal granularity for a given disease and host population, and matching surveillance and prediction efforts to this scale, response to emergent and recurrent outbreaks can be improved. Here we explore how granularity and representation of spatial structure affect influenza forecast accuracy within New York City. We develop network models at the borough and neighborhood levels, and use them in conjunction with surveillance data and a data assimilation method to forecast influenza activity. These forecasts are compared to an alternate system that predicts influenza for each borough or neighborhood in isolation. At the borough scale, influenza epidemics are highly synchronous despite substantial differences in intensity, and inclusion of network connectivity among boroughs generally improves forecast accuracy. At the neighborhood scale, we observe much greater spatial heterogeneity among influenza outbreaks including substantial differences in local outbreak timing and structure; however, inclusion of the network model structure generally degrades forecast accuracy. One notable exception is that local outbreak onset, particularly when signal is modest, is better predicted with the network model. These findings suggest that observation and forecast at sub-municipal scales within New York City provides richer, more discriminant information on influenza incidence, particularly at the neighborhood scale where greater heterogeneity exists, and that the spatial spread of influenza among localities can be forecast.
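    The contrast between networked and isolated forecasts can be sketched with a toy metapopulation SIR model in which patches (boroughs or neighborhoods) are coupled through a mixing matrix; setting the coupling to zero recovers the isolated-patch alternative. All parameter values below are hypothetical, and the sketch omits the surveillance data assimilation used in the study.

```python
import numpy as np

# Minimal metapopulation (network) SIR sketch: each patch has its own outbreak,
# coupled through a mixing matrix M. coupling = 0 gives fully isolated patches.

def run_sir(coupling, days=120, beta=0.5, gamma=0.25):
    n_patch = 5
    N = np.full(n_patch, 1e6)
    I = np.array([100.0, 1.0, 1.0, 1.0, 1.0])        # outbreak seeded in patch 0
    S = N - I
    R = np.zeros(n_patch)
    M = np.full((n_patch, n_patch), coupling)         # small between-patch mixing
    np.fill_diagonal(M, 1.0 - coupling * (n_patch - 1))
    incidence = []
    for _ in range(days):
        force = beta * (M @ (I / N))                  # force of infection per patch
        new_inf = force * S
        new_rec = gamma * I
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        incidence.append(new_inf.copy())
    return np.array(incidence)

coupled = run_sir(coupling=0.02)
isolated = run_sir(coupling=0.0)
print("peak day per patch, coupled :", coupled.argmax(axis=0))
print("peak day per patch, isolated:", isolated.argmax(axis=0))
```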

  18. Climate Forcing Datasets for Agricultural Modeling: Merged Products for Gap-Filling and Historical Climate Series Estimation

    NASA Technical Reports Server (NTRS)

    Ruane, Alex C.; Goldberg, Richard; Chryssanthacopoulos, James

    2014-01-01

    The AgMERRA and AgCFSR climate forcing datasets provide daily, high-resolution, continuous meteorological series over the 1980-2010 period designed for applications examining the agricultural impacts of climate variability and climate change. These datasets combine daily resolution data from retrospective analyses (the Modern-Era Retrospective Analysis for Research and Applications, MERRA, and the Climate Forecast System Reanalysis, CFSR) with in situ and remotely-sensed observational datasets for temperature, precipitation, and solar radiation, leading to substantial reductions in bias in comparison to a network of 2324 agricultural-region stations from the Hadley Integrated Surface Dataset (HadISD). Results compare favorably against the original reanalyses as well as the leading climate forcing datasets (Princeton, WFD, WFD-EI, and GRASP), and AgMERRA distinguishes itself with substantially improved representation of daily precipitation distributions and extreme events owing to its use of the MERRA-Land dataset. These datasets also peg relative humidity to the maximum temperature time of day, allowing for more accurate representation of the diurnal cycle of near-surface moisture in agricultural models. AgMERRA and AgCFSR enable a number of ongoing investigations in the Agricultural Model Intercomparison and Improvement Project (AgMIP) and related research networks, and may be used to fill gaps in historical observations as well as to serve as a basis for the generation of future climate scenarios.

  19. 24 CFR 968.105 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... purpose of funding physical and management improvements. Modernization program. A PHA's program for... substantially the same kind does qualify, but reconstruction, substantial improvement in the quality or kind of... resident participation in each of the required program components. PHMAP. The Public Housing Management...

  20. Double-input compartmental modeling and spectral analysis for the quantification of positron emission tomography data in oncology.

    PubMed

    Tomasi, G; Kimberley, S; Rosso, L; Aboagye, E; Turkheimer, F

    2012-04-07

    In positron emission tomography (PET) studies involving organs different from the brain, ignoring the metabolite contribution to the tissue time-activity curves (TAC), as in the standard single-input (SI) models, may compromise the accuracy of the estimated parameters. Here we employed double-input (DI) compartmental modeling (CM), previously used for [¹¹C]thymidine, and a novel DI spectral analysis (SA) approach on the tracers 5-[¹⁸F]fluorouracil (5-[¹⁸F]FU) and [¹⁸F]fluorothymidine ([¹⁸F]FLT). CM and SA were performed initially with an SI approach using the parent plasma TAC as an input function. These methods were then employed using a DI approach with the metabolite plasma TAC as an additional input function. Regions of interest (ROIs) corresponding to healthy liver, kidneys and liver metastases for 5-[¹⁸F]FU and to tumor, vertebra and liver for [¹⁸F]FLT were analyzed. For 5-[¹⁸F]FU, the improvement of the fit quality with the DI approaches was remarkable; in CM, the Akaike information criterion (AIC) always selected the DI over the SI model. Volume of distribution estimates obtained with DI CM and DI SA were in excellent agreement for both parent 5-[¹⁸F]FU (R² = 0.91) and metabolite [¹⁸F]FBAL (R² = 0.99). For [¹⁸F]FLT, the DI methods provided notable improvements, but these were less substantial than for 5-[¹⁸F]FU due to the lower rate of metabolism of [¹⁸F]FLT. On the basis of the AIC values, agreement between [¹⁸F]FLT Ki estimated with the SI and DI models was good (R² = 0.75) for the ROIs where the metabolite contribution was negligible, indicating that the additional input did not bias the parent tracer only-related estimates. When the AIC suggested a substantial contribution of the metabolite [¹⁸F]FLT-glucuronide, on the other hand, the change in the parent tracer only-related parameters was significant (R² = 0.33 for Ki). Our results indicated that improvements of DI over SI approaches can range from moderate to substantial and are more significant for tracers with a high rate of metabolism. Furthermore, they showed that SA is suitable for DI modeling and can be used effectively in the analysis of PET data.
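    A minimal sketch of the model-selection step is given below: a tissue time-activity curve is fitted by spectral analysis with a parent-only (single-input) basis and with a parent-plus-metabolite (double-input) basis, and the two fits are compared by AIC. The input functions, decay-rate grid, and noise level are synthetic stand-ins, not the paper's tracer data.

```python
import numpy as np
from scipy.optimize import nnls

# Minimal sketch (synthetic data): compare single-input (parent only) and
# double-input (parent + metabolite) spectral-analysis fits of a tissue
# time-activity curve, using AIC for model selection.

t = np.linspace(0, 60, 121)                      # minutes
dt = t[1] - t[0]
parent = t * np.exp(-0.15 * t)                   # hypothetical parent input
metab = 0.5 * t * np.exp(-0.05 * t)              # hypothetical metabolite input

def conv(inp, beta):
    """Discrete convolution of an input function with exp(-beta * t)."""
    return np.convolve(inp, np.exp(-beta * t))[:t.size] * dt

betas = np.logspace(-3, 0, 20)                   # spectral grid of decay rates
basis_parent = np.column_stack([conv(parent, b) for b in betas])
basis_metab = np.column_stack([conv(metab, b) for b in betas])

# Simulated tissue curve with contributions from both inputs plus noise.
rng = np.random.default_rng(0)
tissue = 0.8 * conv(parent, 0.1) + 0.4 * conv(metab, 0.02)
tissue = tissue + rng.normal(0, 0.02 * tissue.max(), t.size)

def fit_aic(basis, y):
    coeffs, _ = nnls(basis, y)
    rss = float(np.sum((basis @ coeffs - y) ** 2))
    k = int(np.sum(coeffs > 0))                  # number of active components
    n = y.size
    return n * np.log(rss / n) + 2 * k

print("AIC single-input :", round(fit_aic(basis_parent, tissue), 1))
print("AIC double-input :", round(fit_aic(np.hstack([basis_parent, basis_metab]), tissue), 1))
```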

  1. Effects of transcranial direct current stimulation for treating depression: A modeling study.

    PubMed

    Csifcsák, Gábor; Boayue, Nya Mehnwolo; Puonti, Oula; Thielscher, Axel; Mittner, Matthias

    2018-07-01

    Transcranial direct current stimulation (tDCS) above the left dorsolateral prefrontal cortex (lDLPFC) has been widely used to improve symptoms of major depressive disorder (MDD). However, the effects of different stimulation protocols in the entire frontal lobe have not been investigated in a large sample including patient data. We used 38 head models created from structural magnetic resonance imaging data of 19 healthy adults and 19 MDD patients and applied computational modeling to simulate the spatial distribution of tDCS-induced electric fields (EFs) in 20 frontal regions. We evaluated effects of seven bipolar and two multi-electrode 4 × 1 tDCS protocols. For bipolar montages, EFs were of comparable strength in the lDLPFC and in the medial prefrontal cortex (MPFC). Depending on stimulation parameters, EF cortical maps varied to a considerable degree, but were found to be similar in controls and patients. 4 × 1 montages produced more localized, albeit weaker effects. White matter anisotropy was not modeled. The relationship between EF strength and clinical response to tDCS could not be evaluated. In addition to lDLPFC stimulation, excitability changes in the MPFC should also be considered as a potential mechanism underlying clinical efficacy of bipolar montages. MDD-associated anatomical variations are not likely to substantially influence current flow. Individual modeling of tDCS protocols can substantially improve cortical targeting. We make recommendations for future research to explicitly test the contribution of lDLPFC vs. MPFC stimulation to therapeutic outcomes of tDCS in this disorder. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Hot Dust in Panchromatic SED Fitting: Identification of Active Galactic Nuclei and Improved Galaxy Properties

    NASA Astrophysics Data System (ADS)

    Leja, Joel; Johnson, Benjamin D.; Conroy, Charlie; van Dokkum, Pieter

    2018-02-01

    Forward modeling of the full galaxy SED is a powerful technique, providing self-consistent constraints on stellar ages, dust properties, and metallicities. However, the accuracy of these results is contingent on the accuracy of the model. One significant source of uncertainty is the contribution of obscured AGN, as they are relatively common and can produce substantial mid-IR (MIR) emission. Here we include emission from dusty AGN tori in the Prospector SED-fitting framework, and fit the UV–IR broadband photometry of 129 nearby galaxies. We find that 10% of the fitted galaxies host an AGN contributing >10% of the observed galaxy MIR luminosity. We demonstrate the necessity of this AGN component in the following ways. First, we compare observed spectral features to spectral features predicted from our model fit to the photometry. We find that the AGN component greatly improves predictions for observed Hα and Hβ luminosities, as well as mid-infrared Akari and Spitzer/IRS spectra. Second, we show that inclusion of the AGN component changes stellar ages and SFRs by up to a factor of 10, and dust attenuations by up to a factor of 2.5. Finally, we show that the strength of our model AGN component correlates with independent AGN indicators, suggesting that these galaxies truly host AGN. Notably, only 46% of the SED-detected AGN would be detected with a simple MIR color selection. Based on these results, we conclude that SED models which fit MIR data without AGN components are vulnerable to substantial bias in their derived parameters.

  3. Predicting Urban Elementary Student Success and Passage on Ohio's High-Stakes Achievement Measures Using DIBELS Oral Reading Fluency and Informal Math Concepts and Applications: An Exploratory Study Employing Hierarchical Linear Modeling

    ERIC Educational Resources Information Center

    Merkle, Erich Robert

    2011-01-01

    Contemporary education is experiencing substantial reform across legislative, pedagogical, and assessment dimensions. The increase in school-based accountability systems has brought forth a culture where states, school districts, teachers, and individual students are required to demonstrate their efficacy towards improvement of the educational…

  4. Model tropical Atlantic biases underpin diminished Pacific decadal variability

    NASA Astrophysics Data System (ADS)

    McGregor, Shayne; Stuecker, Malte F.; Kajtar, Jules B.; England, Matthew H.; Collins, Mat

    2018-06-01

    Pacific trade winds have displayed unprecedented strengthening in recent decades1. This strengthening has been associated with east Pacific sea surface cooling2 and the early twenty-first-century slowdown in global surface warming2,3, amongst a host of other substantial impacts4-9. Although some climate models produce the timing of these recently observed trends10, they all fail to produce the trend magnitude2,11,12. This may in part be related to the apparent model underrepresentation of low-frequency Pacific Ocean variability and decadal wind trends2,11-13 or be due to a misrepresentation of a forced response1,14-16 or a combination of both. An increasingly prominent connection between the Pacific and Atlantic basins has been identified as a key driver of this strengthening of the Pacific trade winds12,17-20. Here we use targeted climate model experiments to show that combining the recent Atlantic warming trend with the typical climate model bias leads to a substantially underestimated response for the Pacific Ocean wind and surface temperature. The underestimation largely stems from a reduction and eastward shift of the atmospheric heating response to the tropical Atlantic warming trend. This result suggests that the recent Pacific trends and model decadal variability may be better captured by models with improved mean-state climatologies.

  5. Chemical transport model simulations of organic aerosol in ...

    EPA Pesticide Factsheets

    Gasoline- and diesel-fueled engines are ubiquitous sources of air pollution in urban environments. They emit both primary particulate matter and precursor gases that react to form secondary particulate matter in the atmosphere. In this work, we updated the organic aerosol module and organic emissions inventory of a three-dimensional chemical transport model, the Community Multiscale Air Quality Model (CMAQ), using recent, experimentally derived inputs and parameterizations for mobile sources. The updated model included a revised volatile organic compound (VOC) speciation for mobile sources and secondary organic aerosol (SOA) formation from unspeciated intermediate volatility organic compounds (IVOCs). The updated model was used to simulate air quality in southern California during May and June 2010, when the California Research at the Nexus of Air Quality and Climate Change (CalNex) study was conducted. Compared to the Traditional version of CMAQ, which is commonly used for regulatory applications, the updated model did not significantly alter the predicted organic aerosol (OA) mass concentrations but did substantially improve predictions of OA sources and composition (e.g., POA–SOA split), as well as ambient IVOC concentrations. The updated model, despite substantial differences in emissions and chemistry, performed similarly to a recently released research version of CMAQ (Woody et al., 2016) that did not include the updated VOC and IVOC emissions and SOA data

  6. The Patient-Centered Medical Home: Preparation of the Workforce, More Questions than Answers.

    PubMed

    Reynolds, P Preston; Klink, Kathleen; Gilman, Stuart; Green, Larry A; Phillips, Russell S; Shipman, Scott; Keahey, David; Rugen, Kathryn; Davis, Molly

    2015-07-01

    As American medicine continues to undergo significant transformation, the patient-centered medical home (PCMH) is emerging as an interprofessional primary care model designed to deliver the right care for patients, by the right professional, at the right time, in the right setting, for the right cost. A review of local, state, regional and national initiatives to train professionals in delivering care within the PCMH model reveals some successes, but substantial challenges. Workforce policy recommendations designed to improve PCMH effectiveness and efficiency include 1) adoption of an expanded definition of primary care, 2) fundamental redesign of health professions education, 3) payment reform, 4) responsiveness to local needs assessments, and 5) systems improvement to emphasize quality, population health, and health disparities.

  7. Investigation of nonlinear inviscid and viscous flow effects in the analysis of dynamic stall. [air flow and chordwise pressure distribution on airfoil below stall condition

    NASA Technical Reports Server (NTRS)

    Crimi, P.

    1974-01-01

    A method for analyzing unsteady airfoil stall was refined by including nonlinear effects in the representation of the inviscid flow. Certain other aspects of the potential-flow model were reexamined and the effects of varying Reynolds number on stall characteristics were investigated. Refinement of the formulation improved the representation of the flow and chordwise pressure distribution below stall, but substantial quantitative differences between computed and measured results are still evident for sinusoidal pitching through stall. Agreement is substantially improved by assuming the growth rate of the dead-air region at the onset of leading-edge stall is of the order of the component of the free stream normal to the airfoil chordline. The method predicts the expected increase in the resistance to stalling with increasing Reynolds number. Results indicate that a given airfoil can undergo both trailing-edge and leading-edge stall under unsteady conditions.

  8. Application of a simple cerebellar model to geologic surface mapping

    USGS Publications Warehouse

    Hagens, A.; Doveton, J.H.

    1991-01-01

    Neurophysiological research into the structure and function of the cerebellum has inspired computational models that simulate information processing associated with coordination and motor movement. The cerebellar model arithmetic computer (CMAC) has a design structure which makes it readily applicable as an automated mapping device that "senses" a surface, based on a sample of discrete observations of surface elevation. The model operates as an iterative learning process, where cell weights are continuously modified by feedback to improve surface representation. The storage requirements are substantially less than those of a conventional memory allocation, and the model is extended easily to mapping in multidimensional space, where the memory savings are even greater. © 1991.
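    The iterative learning scheme described above can be sketched as a simple CMAC-style coarse-coded learner: several offset grids each map a location to one cell, the predicted elevation is the average of the active cell weights, and the weights are nudged by the prediction error. The grid sizes, learning rate, and test surface below are illustrative choices, not the published implementation.

```python
import numpy as np

# Minimal sketch of a CMAC-style coarse-coded learner for surface mapping
# (illustrative only): several offset grids ("tilings") each map a location to
# one cell; the predicted elevation is the mean of the active cell weights,
# which are nudged iteratively by the prediction error.

rng = np.random.default_rng(2)
n_tilings, grid = 8, 12
res = 1.0 / grid
weights = np.zeros((n_tilings, grid + 1, grid + 1))
offsets = rng.uniform(0, res, (n_tilings, 2))        # random shift per tiling

def active_cells(x, y):
    ix = ((x + offsets[:, 0]) / res).astype(int).clip(0, grid)
    iy = ((y + offsets[:, 1]) / res).astype(int).clip(0, grid)
    return ix, iy

def predict(x, y):
    ix, iy = active_cells(x, y)
    return weights[np.arange(n_tilings), ix, iy].mean()

def train(x, y, z, lr=0.3):
    ix, iy = active_cells(x, y)
    err = z - weights[np.arange(n_tilings), ix, iy].mean()
    weights[np.arange(n_tilings), ix, iy] += lr * err  # move prediction toward z

# Hypothetical "true" surface sampled at scattered points.
surface = lambda x, y: np.sin(3 * x) * np.cos(2 * y)
xs, ys = rng.uniform(0, 1, 500), rng.uniform(0, 1, 500)
zs = surface(xs, ys)

for _ in range(30):                                   # iterative learning passes
    for x, y, z in zip(xs, ys, zs):
        train(x, y, z)

xt, yt = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)
mae = np.mean([abs(predict(x, y) - surface(x, y)) for x, y in zip(xt, yt)])
print("mean absolute error on held-out points:", round(float(mae), 3))
```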

  9. Improving SWAT for simulating water and carbon fluxes of forest ecosystems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Qichun; Zhang, Xuesong

    2016-11-01

    As a widely used watershed model for assessing impacts of anthropogenic and natural disturbances on water quantity and quality, the Soil and Water Assessment Tool (SWAT) has not been extensively tested in simulating water and carbon fluxes of forest ecosystems. Here, we examine SWAT simulations of evapotranspiration (ET), net primary productivity (NPP), net ecosystem exchange (NEE), and plant biomass at ten AmeriFlux forest sites across the U.S. We identify unrealistic radiation use efficiency (Bio_E), large leaf to biomass fraction (Bio_LEAF), and missing phosphorus supply from parent material weathering as the primary causes for the inadequate performance of the default SWAT model in simulating forest dynamics. By further revising the relevant parameters and processes, SWAT’s performance is substantially improved. Based on the comparison between the improved SWAT simulations and flux tower observations, we discuss future research directions for further enhancing model parameterization and representation of water and carbon cycling for forests.

  10. Modelled female sale options demonstrate improved profitability in northern beef herds.

    PubMed

    Niethe, G E; Holmes, W E

    2008-12-01

    To examine the impact of improving the average value of cows sold, the risk of decreasing the number weaned, and total sales on the profitability of northern Australian cattle breeding properties. Gather, model and interpret breeder herd performances and production parameters on properties from six beef-producing regions in northern Australia. Production parameters, prices, costs and herd structure were entered into a herd simulation model for six northern Australian breeding properties that spay females to enhance their marketing options. After the data were validated by management, alternative management strategies were modelled using current market prices and most likely herd outcomes. The model predicted a close relationship between the average sale value of cows, the total herd sales and the gross margin/adult equivalent. Keeping breeders out of the herd to fatten generally improves their sale value, and this can be cost-effective, despite the lower number of progeny produced and the subsequent reduction in total herd sales. Furthermore, if the price of culled cows exceeds the price of culled heifers, provided there are sufficient replacement pregnant heifers available to maintain the breeder herd nucleus, substantial gains in profitability can be obtained by decreasing the age at which cows are culled from the herd. Generalised recommendations on improving reproductive performance are not necessarily the most cost-effective strategy to improve breeder herd profitability. Judicious use of simulation models is essential to help develop the best turnoff strategies for females and to improve station profitability.

  11. Abstraction of an Affective-Cognitive Decision Making Model Based on Simulated Behaviour and Perception Chains

    NASA Astrophysics Data System (ADS)

    Sharpanskykh, Alexei; Treur, Jan

    Employing rich internal agent models of actors in large-scale socio-technical systems often results in scalability issues. The problem addressed in this paper is how to improve computational properties of a complex internal agent model, while preserving its behavioral properties. The problem is addressed for the case of an existing affective-cognitive decision making model instantiated for an emergency scenario. For this internal decision model an abstracted behavioral agent model is obtained, which ensures a substantial increase of the computational efficiency at the cost of approximately 1% behavioural error. The abstraction technique used can be applied to a wide range of internal agent models with loops, for example, involving mutual affective-cognitive interactions.

  12. Preconditioned augmented Lagrangian formulation for nearly incompressible cardiac mechanics.

    PubMed

    Campos, Joventino Oliveira; Dos Santos, Rodrigo Weber; Sundnes, Joakim; Rocha, Bernardo Martins

    2018-04-01

    Computational modeling of the heart is a subject of substantial medical and scientific interest, which may contribute to increase the understanding of several phenomena associated with cardiac physiological and pathological states. Modeling the mechanics of the heart have led to considerable insights, but it still represents a complex and a demanding computational problem, especially in a strongly coupled electromechanical setting. Passive cardiac tissue is commonly modeled as hyperelastic and is characterized by quasi-incompressible, orthotropic, and nonlinear material behavior. These factors are known to be very challenging for the numerical solution of the model. The near-incompressibility is known to cause numerical issues such as the well-known locking phenomenon and ill-conditioning of the stiffness matrix. In this work, the augmented Lagrangian method is used to handle the nearly incompressible condition. This approach can potentially improve computational performance by reducing the condition number of the stiffness matrix and thereby improving the convergence of iterative solvers. We also improve the performance of iterative solvers by the use of an algebraic multigrid preconditioner. Numerical results of the augmented Lagrangian method combined with a preconditioned iterative solver for a cardiac mechanics benchmark suite are presented to show its improved performance. Copyright © 2017 John Wiley & Sons, Ltd.
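    The augmented Lagrangian idea can be illustrated on a toy constrained minimization: an "energy" is minimized subject to a volume-like constraint by alternating an unconstrained solve with a multiplier update, keeping the penalty parameter moderate. The functions and parameter values below are hypothetical and stand in for the cardiac strain energy and incompressibility constraint.

```python
import numpy as np
from scipy.optimize import minimize

# Toy illustration of the augmented Lagrangian approach (hypothetical functions,
# not the cardiac model): minimize an "energy" f(u) subject to a volume-like
# constraint c(u) = 0 by alternating an unconstrained solve with a multiplier
# update, keeping the penalty parameter moderate instead of very large.

def f(u):                        # stand-in for a strain-energy function
    return (u[0] - 1.0) ** 2 + 2.0 * (u[1] - 0.5) ** 2

def c(u):                        # stand-in for an incompressibility constraint
    return u[0] * u[1] - 1.0     # analogue of "J - 1 = 0"

def augmented_lagrangian(u, lam, kappa):
    return f(u) + lam * c(u) + 0.5 * kappa * c(u) ** 2

u, lam, kappa = np.array([1.5, 1.5]), 0.0, 10.0
for it in range(8):
    u = minimize(augmented_lagrangian, u, args=(lam, kappa), method="BFGS").x
    lam += kappa * c(u)          # multiplier update drives c(u) toward zero
    print(f"iteration {it}: u = {np.round(u, 4)}, constraint residual = {c(u):.2e}")
```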

  13. Validation of Shoulder Response of Human Body Finite-Element Model (GHBMC) Under Whole Body Lateral Impact Condition.

    PubMed

    Park, Gwansik; Kim, Taewung; Panzer, Matthew B; Crandall, Jeff R

    2016-08-01

    In previous shoulder impact studies, the 50th-percentile male GHBMC human body finite-element model was shown to have good biofidelity regarding impact force, but under-predicted shoulder deflection by 80% compared to those observed in the experiment. The goal of this study was to validate the response of the GHBMC M50 model by focusing on three-dimensional shoulder kinematics under a whole-body lateral impact condition. Five modifications, focused on material properties and modeling techniques, were introduced into the model and a supplementary sensitivity analysis was done to determine the influence of each modification to the biomechanical response of the body. The modified model predicted substantially improved shoulder response and peak shoulder deflection within 10% of the observed experimental data, and showed good correlation in the scapula kinematics on sagittal and transverse planes. The improvement in the biofidelity of the shoulder region was mainly due to the modifications of material properties of muscle, the acromioclavicular joint, and the attachment region between the pectoralis major and ribs. Predictions of rib fracture and chest deflection were also improved because of these modifications.

  14. A Combined Kinetic and Volatility Basis Set Approach to Model Secondary Organic Aerosol from Toluene and Diesel Exhaust/Meat Cooking Mixtures

    NASA Astrophysics Data System (ADS)

    Parikh, H. M.; Carlton, A. G.; Zhang, H.; Kamens, R.; Vizuete, W.

    2011-12-01

    Secondary organic aerosol (SOA) is simulated for 6 outdoor smog chamber experiments using a SOA model based on a kinetic chemical mechanism in conjunction with a volatility basis set (VBS) approach. The experiments include toluene, a non-SOA-forming hydrocarbon mixture, diesel exhaust or meat cooking emissions and NOx, and are performed under varying conditions of relative humidity. SOA formation from toluene is modeled using a condensed kinetic aromatic mechanism that includes partitioning of lumped semi-volatile products in particle organic-phase and incorporates particle aqueous-phase chemistry to describe uptake of glyoxal and methylglyoxal. Modeling using the kinetic mechanism alone, along with primary organic aerosol (POA) from diesel exhaust (DE) /meat cooking (MC) fails to simulate the rapid SOA formation at the beginning hours of the experiments. Inclusion of a VBS approach with the kinetic mechanism to characterize the emissions and chemistry of complex mixture of intermediate volatility organic compounds (IVOCs) from DE/MC, substantially improves SOA predictions when compared with observed data. The VBS model includes photochemical aging of IVOCs and evaporation of POA after dilution. The relative contribution of SOA mass from DE/MC is as high as 95% in the morning, but substantially decreases after mid-afternoon. For high humidity experiments, aqueous-phase SOA fraction dominates the total SOA mass at the end of the day (approximately 50%). In summary, the combined kinetic and VBS approach provides a new and improved framework to semi-explicitly model SOA from VOC precursors in conjunction with a VBS approach that can be used on complex emission mixtures comprised with hundreds of individual chemical species.
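    The volatility basis set partitioning underlying the approach can be written in a few lines: each volatility bin condenses with fraction 1/(1 + C*_i/C_OA), and the total organic aerosol mass C_OA is found by fixed-point iteration. The bin values and seed mass below are illustrative, not the chamber-derived parameters used in the study.

```python
import numpy as np

# Minimal sketch of volatility basis set (VBS) partitioning (illustrative
# parameter values): each bin i with saturation concentration C*_i partitions
# to the particle phase with fraction 1 / (1 + C*_i / C_OA), where C_OA is the
# total absorbing organic aerosol mass; C_OA is solved by fixed-point iteration.

c_star = np.array([0.1, 1.0, 10.0, 100.0, 1000.0])    # ug/m3, volatility bins
totals = np.array([0.5, 1.0, 2.0, 4.0, 8.0])           # ug/m3 per bin (hypothetical)
poa_seed = 1.0                                          # pre-existing absorbing mass

c_oa = poa_seed
for _ in range(100):                                    # fixed-point iteration
    frac_particle = 1.0 / (1.0 + c_star / c_oa)
    c_oa_new = poa_seed + np.sum(totals * frac_particle)
    if abs(c_oa_new - c_oa) < 1e-9:
        break
    c_oa = c_oa_new

print("equilibrium OA mass (ug/m3):", round(float(c_oa), 3))
print("particle-phase fraction per bin:", np.round(frac_particle, 3))
```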

  15. Reprint: Good laboratory practice: preventing introduction of bias at the bench

    PubMed Central

    Macleod, Malcolm R; Fisher, Marc; O’Collins, Victoria; Sena, Emily S; Dirnagl, Ulrich; Bath, Philip MW; Buchan, Alistair; van der Worp, H Bart; Traystman, Richard J; Minematsu, Kazuo; Donnan, Geoffrey A; Howells, David W

    2009-01-01

    As a research community, we have failed to show that drugs, which show substantial efficacy in animal models of cerebral ischemia, can also improve outcome in human stroke. Accumulating evidence suggests this may be due, at least in part, to problems in the design, conduct, and reporting of animal experiments which create a systematic bias resulting in the overstatement of neuroprotective efficacy. Here, we set out a series of measures to reduce bias in the design, conduct and reporting of animal experiments modeling human stroke. PMID:18797473

  16. Effect of temperature and precipitation on salmonellosis cases in South-East Queensland, Australia: an observational study

    PubMed Central

    Barnett, Adrian Gerard

    2016-01-01

    Objective Foodborne illnesses in Australia, including salmonellosis, are estimated to cost over $A1.25 billion annually. The weather has been identified as being influential on salmonellosis incidence, as cases increase during summer, however time series modelling of salmonellosis is challenging because outbreaks cause strong autocorrelation. This study assesses whether switching models is an improved method of estimating weather–salmonellosis associations. Design We analysed weather and salmonellosis in South-East Queensland between 2004 and 2013 using 2 common regression models and a switching model, each with 21-day lags for temperature and precipitation. Results The switching model best fit the data, as judged by its substantial improvement in deviance information criterion over the regression models, less autocorrelated residuals and control of seasonality. The switching model estimated a 5°C increase in mean temperature and 10 mm precipitation were associated with increases in salmonellosis cases of 45.4% (95% CrI 40.4%, 50.5%) and 24.1% (95% CrI 17.0%, 31.6%), respectively. Conclusions Switching models improve on traditional time series models in quantifying weather–salmonellosis associations. A better understanding of how temperature and precipitation influence salmonellosis may identify where interventions can be made to lower the health and economic costs of salmonellosis. PMID:26916693

  17. High performance liquid chromatographic hydrocarbon group-type analyses of mid-distillates employing fuel-derived fractions as standards

    NASA Technical Reports Server (NTRS)

    Seng, G. T.; Otterson, D. A.

    1983-01-01

    Two high performance liquid chromatographic (HPLC) methods have been developed for the determination of saturates, olefins and aromatics in petroleum and shale derived mid-distillate fuels. In one method the fuel to be analyzed is reacted with sulfuric acid, to remove a substantial portion of the aromatics, which provides a reacted fuel fraction for use in group type quantitation. The second involves the removal of a substantial portion of the saturates fraction from the HPLC system to permit the determination of olefin concentrations as low as 0.3 volume percent, and to improve the accuracy and precision of olefins determinations. Each method was evaluated using model compound mixtures and real fuel samples.

  18. Final Report, 2011-2014. Forecasting Carbon Storage as Eastern Forests Age. Joining Experimental and Modeling Approaches at the UMBS AmeriFlux Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, Peter; Bohrer, Gil; Gough, Christopher

    2015-03-12

    At the University of Michigan Biological Station (UMBS) AmeriFlux sites (US-UMB and US-UMd), long-term C cycling measurements and a novel ecosystem-scale experiment are revealing physical, biological, and ecological mechanisms driving long-term trajectories of C cycling, providing new data for improving modeling forecasts of C storage in eastern forests. Our findings provide support for previously untested hypotheses that stand-level structural and biological properties constrain long-term trajectories of C storage, and that remotely sensed canopy structural parameters can substantially improve model forecasts of forest C storage. Through the Forest Accelerated Succession ExperimenT (FASET), we are directly testing the hypothesis that forest C storage will increase due to increasing structural and biological complexity of the emerging tree communities. Support from this project, 2011-2014, enabled us to incorporate novel physical and ecological mechanisms into ecological, meteorological, and hydrological models to improve forecasts of future forest C storage in response to disturbance, succession, and current and long-term climate variation.

  19. Improved Speech Coding Based on Open-Loop Parameter Estimation

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Chen, Ya-Chin; Longman, Richard W.

    2000-01-01

    A nonlinear optimization algorithm for linear predictive speech coding was developed earlier that not only optimizes the linear model coefficients for the open-loop predictor, but also performs the optimization including the effects of quantization of the transmitted residual. It also simultaneously optimizes the quantization levels used for each speech segment. In this paper, we present an improved method for initialization of this nonlinear algorithm, and demonstrate substantial improvements in performance. In addition, the new procedure produces monotonically improving speech quality with increasing numbers of bits used in the transmitted error residual. Examples of speech encoding and decoding are given for 8 speech segments, and signal-to-noise levels as high as 47 dB are produced. As in typical linear predictive coding, the optimization is done on the open-loop speech analysis model. Here we demonstrate that minimizing the error of the closed-loop speech reconstruction, instead of the simpler open-loop optimization, is likely to produce negligible improvement in speech quality. The examples suggest that the algorithm here is close to giving the best performance obtainable from a linear model, for the chosen order with the chosen number of bits for the codebook.
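    The open-loop analysis/synthesis structure referred to above can be sketched as follows: fit a linear predictor by least squares, quantize the prediction residual with a uniform codebook, and reconstruct the signal by running the predictor forward on the quantized residual. The synthetic signal, predictor order, and quantizer are illustrative, and the sketch omits the paper's nonlinear joint optimization.

```python
import numpy as np

# Minimal sketch of open-loop linear predictive coding with a quantized
# residual (illustrative, not the paper's jointly optimized algorithm): fit an
# order-p linear predictor by least squares, quantize the prediction residual
# with a uniform codebook, and reconstruct through the synthesis filter.

rng = np.random.default_rng(3)
n, p = 2000, 6
t = np.arange(n)
signal = np.sin(0.05 * t) + 0.5 * np.sin(0.13 * t + 1.0) + 0.2 * rng.normal(size=n)

# Open-loop analysis: s[k] ~ sum_j a[j] * s[k - 1 - j].
X = np.column_stack([signal[p - j - 1:n - j - 1] for j in range(p)])
y = signal[p:]
a = np.linalg.lstsq(X, y, rcond=None)[0]
residual = y - X @ a

def quantize(r, bits):
    """Uniform quantizer with 2**bits levels over the residual range."""
    levels = 2 ** bits
    lo, hi = r.min(), r.max()
    step = (hi - lo) / levels
    return lo + (np.floor((r - lo) / step).clip(0, levels - 1) + 0.5) * step

for bits in (2, 4, 6):
    rq = quantize(residual, bits)
    rec = list(signal[:p])                    # decoder state: first p samples
    for k in range(residual.size):
        pred = sum(a[j] * rec[-1 - j] for j in range(p))
        rec.append(pred + rq[k])              # synthesis filter driven by residual
    err = signal - np.array(rec)
    snr = 10 * np.log10(np.sum(signal ** 2) / np.sum(err ** 2))
    print(f"{bits} bits per residual sample: SNR = {snr:.1f} dB")
```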

  20. Test Activities in the Langley Transonic Dynamics Tunnel and a Summary of Recent Facility Improvements

    NASA Technical Reports Server (NTRS)

    Cole, Stanley R.; Johnson, R. Keith; Piatak, David J.; Florance, Jennifer P.; Rivera, Jose A., Jr.

    2003-01-01

    The Langley Transonic Dynamics Tunnel (TDT) has provided a unique capability for aeroelastic testing for over forty years. The facility has a rich history of significant contributions to the design of many United States commercial transports, military aircraft, launch vehicles, and spacecraft. The facility has many features that contribute to its uniqueness for aeroelasticity testing, perhaps the most important feature being the use of a heavy gas test medium to achieve higher test densities compared to testing in air. Higher test medium densities substantially improve model-building requirements and therefore simplify the fabrication process for building aeroelastically scaled wind tunnel models. This paper describes TDT capabilities that make it particularly suited for aeroelasticity testing. The paper also discusses the nature of recent test activities in the TDT, including summaries of several specific tests. Finally, the paper documents recent facility improvement projects and the continuous statistical quality assessment effort for the TDT.

  1. Effects of recombinant granulocyte-colony stimulating factor administration during Mycobacterium avium infection in mice

    PubMed Central

    Gonçalves, A S; Appelberg, R

    2001-01-01

    Granulocyte colony-stimulating factor (G-CSF) administration in vivo has been shown to improve the defence mechanisms against infection by different microbes. Here we evaluated a possible protective role of this molecule in a mouse model of mycobacterial infection. The administration of recombinant G-CSF promoted an extensive blood neutrophilia but failed to improve the course of Mycobacterium avium infection in C57Bl/6 or beige mice. G-CSF administration also failed to improve the efficacy of a triple chemotherapeutic regimen (clarithromycin + ethambutol + rifabutin). G-CSF treatment did not protect interleukin-10 gene disrupted mice infected with M. avium. Spleen cells from infected mice treated with G-CSF had a decreased priming for antigen-specific production of interferon gamma compared to control infected mice. Our data do not substantiate previous reports on the protective activity of G-CSF in antimycobacterial immunity using mouse models. PMID:11422200

  2. Rapid determination of total protein and wet gluten in commercial wheat flour using siSVR-NIR.

    PubMed

    Chen, Jia; Zhu, Shipin; Zhao, Guohua

    2017-04-15

    The determination of total protein and wet gluten is of critical importance when screening commercial flour for desired processing suitability. To this end, a near-infrared spectroscopy (NIR) method with support vector regression was developed in the present study. The effects of spectral preprocessing and the synergy interval on model performance were investigated. The results showed that the models from raw spectra were not acceptable, but they were substantially improved by properly applying spectral preprocessing methods. Meanwhile, the synergy interval approach was shown to improve the performance of models based on the whole spectrum. The coefficient of determination (R²), the root mean square error of prediction (RMSEP) and the standard deviation ratio (SDR) of the best models for total protein (wet gluten) were 0.906 (0.850), 0.425 (1.024) and 3.065 (2.482), respectively. These two best models have similarly low relative errors (approximately 8.8%), which indicates their feasibility. Copyright © 2016 Elsevier Ltd. All rights reserved.
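    A minimal sketch of the modeling workflow, on synthetic spectra rather than the wheat flour data, is shown below: support vector regression is compared on raw spectra, on standard-normal-variate (SNV) preprocessed spectra, and on a restricted spectral interval, mirroring the preprocessing and synergy-interval comparisons described above. The wavelength grid, band position, and SVR settings are hypothetical.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_predict

# Minimal sketch (synthetic spectra, hypothetical wavelength grid): support
# vector regression of a constituent (e.g., protein) from NIR spectra, with a
# standard-normal-variate (SNV) preprocessing step and a crude "interval"
# selection that restricts the model to one spectral sub-window.

rng = np.random.default_rng(4)
n_samples, n_wavelengths = 120, 200
protein = rng.uniform(9, 15, n_samples)                  # reference values (%)

# Synthetic spectra: variable baseline plus a protein-related band near index 120.
wl = np.arange(n_wavelengths)
band = np.exp(-0.5 * ((wl - 120) / 8.0) ** 2)
spectra = (rng.uniform(0.8, 1.2, (n_samples, 1)) * np.linspace(1, 2, n_wavelengths)
           + 0.02 * protein[:, None] * band
           + 0.005 * rng.normal(size=(n_samples, n_wavelengths)))

def snv(X):
    """Standard normal variate: center and scale each spectrum individually."""
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

def rmsep(X):
    pred = cross_val_predict(SVR(C=10.0, epsilon=0.05), X, protein, cv=5)
    return float(np.sqrt(np.mean((pred - protein) ** 2)))

print("raw spectra, full range      :", round(rmsep(spectra), 3))
print("SNV spectra, full range      :", round(rmsep(snv(spectra)), 3))
print("SNV spectra, interval 100-140:", round(rmsep(snv(spectra)[:, 100:140]), 3))
```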

  3. Inexact hardware for modelling weather & climate

    NASA Astrophysics Data System (ADS)

    Düben, Peter D.; McNamara, Hugh; Palmer, Tim

    2014-05-01

    The use of stochastic processing hardware and low precision arithmetic in atmospheric models is investigated. Stochastic processors allow hardware-induced faults in calculations, sacrificing exact calculations in exchange for improvements in performance and potentially accuracy and a reduction in power consumption. A similar trade-off is achieved using low precision arithmetic, with improvements in computation and communication speed and savings in storage and memory requirements. As high-performance computing becomes more massively parallel and power intensive, these two approaches may be important stepping stones in the pursuit of global cloud-resolving atmospheric modelling. The impact of both hardware-induced faults and low-precision arithmetic is tested in the dynamical core of a global atmosphere model. Our simulations show that both approaches to inexact calculations do not substantially affect the quality of the model simulations, provided they are restricted to act only on smaller scales. This suggests that inexact calculations at the small scale could reduce computation and power costs without adversely affecting the quality of the simulations.
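    Reduced-precision arithmetic can be emulated in software by truncating the significand of every intermediate result, as in the sketch below, which integrates a simple pendulum at several precision levels and reports the drift from the double-precision result. This is an illustrative stand-in for the emulation used in the study, with hypothetical parameter values.

```python
import numpy as np

# Minimal sketch of emulating reduced floating-point precision in software
# (illustrative; real studies use dedicated emulators): round the significand
# of each intermediate value to a given number of bits and compare the result
# of a simple computation, here a leapfrog-style pendulum integration.

def round_significand(x, bits):
    """Keep only `bits` bits of significand in each float."""
    m, e = np.frexp(np.asarray(x, dtype=np.float64))
    return np.ldexp(np.round(m * 2 ** bits) / 2 ** bits, e)

def integrate(bits, steps=2000, dt=0.01):
    theta, omega = 0.5, 0.0
    for _ in range(steps):
        omega = round_significand(omega - dt * np.sin(theta), bits)
        theta = round_significand(theta + dt * omega, bits)
    return float(theta)

reference = integrate(52)                        # double-precision significand
for bits in (10, 16, 23, 52):
    approx = integrate(bits)
    print(f"{bits:2d} significand bits: theta = {approx:+.6f}, "
          f"error = {abs(approx - reference):.2e}")
```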

  4. The GISS sounding temperature impact test

    NASA Technical Reports Server (NTRS)

    Halem, M.; Ghil, M.; Atlas, R.; Susskind, J.; Quirk, W. J.

    1978-01-01

    The impact of DST 5 and DST 6 satellite sounding data on mid-range forecasting was studied. The GISS temperature sounding technique, the GISS time-continuous four-dimensional assimilation procedure based on optimal statistical analysis, the GISS forecast model, and the verification techniques developed, including the impact on local precipitation forecasts, are described. It is found that the impact of sounding data was substantial and beneficial for the winter test period, Jan. 29 - Feb. 21, 1976. Forecasts started from initial states obtained with the aid of satellite data showed a mean improvement of about 4 points in the 48- and 72-hour S1 scores as verified over North America and Europe. This corresponds to an 8 to 12 hour improvement in the forecast range at 48 hours. An automated local precipitation forecast model applied to 128 cities in the United States showed on average a 15% improvement when satellite data were used for numerical forecasts. The improvement was 75% in the Midwest.
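    The S1 skill score referred to above compares horizontal gradients of the forecast and verifying analysis rather than point values; a minimal sketch of the computation on hypothetical height fields is given below (Teweles-Wobus form, assumed here).

```python
import numpy as np

# Minimal sketch of the S1 skill score (Teweles-Wobus form, assumed here) used
# to verify pressure/height forecasts: it compares horizontal gradients of the
# forecast and analysis rather than point values; lower scores are better.

def s1_score(forecast, analysis):
    df_dy, df_dx = np.gradient(forecast)
    da_dy, da_dx = np.gradient(analysis)
    num = np.abs(df_dx - da_dx).sum() + np.abs(df_dy - da_dy).sum()
    den = (np.maximum(np.abs(df_dx), np.abs(da_dx)).sum()
           + np.maximum(np.abs(df_dy), np.abs(da_dy)).sum())
    return 100.0 * num / den

# Hypothetical 500 hPa height fields on a small grid (not the DST analyses).
y, x = np.mgrid[0:20, 0:30]
analysis = 5500 + 80 * np.sin(x / 5.0) + 60 * np.cos(y / 4.0)
noisy_forecast = analysis + np.random.default_rng(5).normal(0, 8, analysis.shape)
print("S1 of a perfect forecast:", round(s1_score(analysis, analysis), 1))
print("S1 of a noisy forecast  :", round(s1_score(noisy_forecast, analysis), 1))
```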

  5. A numerical study of wave-current interaction through surface and bottom stresses: Coastal ocean response to Hurricane Fran of 1996

    NASA Astrophysics Data System (ADS)

    Xie, L.; Pietrafesa, L. J.; Wu, K.

    2003-02-01

    A three-dimensional wave-current coupled modeling system is used to examine the influence of waves on coastal currents and sea level. This coupled modeling system consists of the wave model WAM (Cycle 4) and the Princeton Ocean Model (POM). The results from this study show that it is important to incorporate surface wave effects into coastal storm surge and circulation models. Specifically, we find that (1) storm surge models without coupled surface waves generally underestimate not only the peak surge but also the coastal water level drop, which can also have a substantial impact on the coastal environment, (2) introducing the wave-induced surface stress effect into storm surge models can significantly improve storm surge prediction, (3) incorporating wave-induced bottom stress into the coupled wave-current model further improves storm surge prediction, and (4) calibration of the wave module according to minimum error in significant wave height does not necessarily result in an optimum wave module in a wave-current coupled system for current and storm surge prediction.

  6. Analytical concepts for health management systems of liquid rocket engines

    NASA Technical Reports Server (NTRS)

    Williams, Richard; Tulpule, Sharayu; Hawman, Michael

    1990-01-01

    Substantial improvement in health management systems performance can be realized by implementing advanced analytical methods of processing existing liquid rocket engine sensor data. In this paper, such techniques ranging from time series analysis to multisensor pattern recognition to expert systems to fault isolation models are examined and contrasted. The performance of several of these methods is evaluated using data from test firings of the Space Shuttle main engines.

  7. Surrogate Analysis and Index Developer (SAID) tool

    USGS Publications Warehouse

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The regression models created in SAID can be used in utilities that have been developed to work with the USGS National Water Information System (NWIS) and for the USGS National Real-Time Water Quality (NRTWQ) Web site. The real-time dissemination of predicted SSC and prediction intervals for each time step has substantial potential to improve understanding of sediment-related water quality and associated engineering and ecological management decisions.

  8. Trends in ADL and IADL Disability in Community-Dwelling Older Adults in Shanghai, China, 1998–2008

    PubMed Central

    2013-01-01

    Objectives. We investigated trends in activities of daily living (ADL) and instrumental activities of daily living (IADL) disability from 1998 to 2008 among elder adults in Shanghai, China. Method. Our data came from 4 waves of the Shanghai Longitudinal Survey of Elderly Life and Opinion (1998, 2003, 2005, and 2008). ADL and IADL disabilities were recorded dichotomously (difficulty vs. no difficulty). The major independent variable was survey year. Covariates included demographics, socioeconomic conditions, family and social support, and other health conditions. Nested random-effect models were applied to estimate trends over time, referenced to 1998. Results. In comparison with the baseline year (1998), older adults in 2008 had lower odds of being ADL disabled, though the effect was no longer statistically significant when other health conditions were taken into account. Elders in 2003, 2005, and 2008 were 20%–26%, 17%–38%, and 53%–64% less likely to be IADL disabled than those in 1998, respectively, depending on the set of covariates included in the model. Discussion. Shanghai elders experienced substantial improvements in both ADL and IADL disability prevalence over the past decade. The trend toward improvement in IADL function is more consistent and substantial than that of ADL function. PMID:23525547

  9. Integrated assessment of the health and economic benefits of long-term renewable energy development in China

    NASA Astrophysics Data System (ADS)

    Dai, H.; Xie, Y.; Zhang, Y.

    2017-12-01

    Context/Purpose: Power generation from renewable energy (RE) could substitute for a huge amount of fossil energy in the power sector and have substantial co-benefits for air quality and human health. In 2016, the China National Renewable Energy Center (CNREC) released the China Renewable Energy Outlook, CREO2016 and CREO2017, towards 2030 and 2050, respectively, in which two scenarios are proposed, namely a conservative "Stated Policy" scenario and a more ambitious "High RE" scenario. This study, together with CNREC, aims to quantify the health and economic benefits of developing renewable energy at the provincial level in China up to 2030 and 2050. Methods: For this purpose, we developed an integrated approach that combines a power dispatch model at CNREC, an air pollutant emission projection model using energy consumption data from the Long-range Energy Alternatives Planning System (LEAP) model, an air quality model (GEOS-Chem at Harvard), a health model developed in-house, and a macroeconomic model (a Computable General Equilibrium model). Results: Altogether, we attempt to quantify how developing RE could reduce the concentration of PM2.5 and ozone in 30 provinces of China, how human health could be improved in terms of mortality, morbidity and work-hour loss, and what the economic value of the health improvement is in terms of increased GDP and the value of statistical life lost. The results show that developing RE as stated in CREO2016 could prevent the chronic mortality of 286 thousand people in China in 2030 alone; the value of the saved statistical lives is worth 1,200 billion Yuan, equivalent to 1.2% of GDP. In addition, due to reduced mortality and morbidity, each person could on average work an additional 1.16 hours per year, which could contribute to a 0.1% increase in GDP in 2030. The assessment up to 2050 is still underway. Interpretation: The results imply that when the external benefits of renewable energy are taken into account, RE could be cost-competitive compared with fossil fuel use. In other words, fossil fuel combustion is not as cheap as it appears when its external cost in terms of human health damage is considered. Conclusion: Our study finds that developing renewable energy could bring substantial health and economic benefits for China.

  10. Operational Models Supporting Manned Space Flight

    NASA Astrophysics Data System (ADS)

    Johnson, A. S.; Weyland, M. D.; Lin, T. C.; Zapp, E. N.

    2006-12-01

    The Space Radiation Analysis Group (SRAG) at Johnson Space Center (JSC) has the primary responsibility to provide real-time radiation health operational support for manned space flight. Forecasts from NOAA SEC, real-time space environment data and radiation models are used to infer changes in the radiation environment due to space weather. Unlike current operations in low earth orbit which are afforded substantial protection from the geomagnetic field, exploration missions will have little protection and require improved operational tools for mission support. The current state of operational models and their limitations will be presented as well as an examination of needed tools to support exploration missions.

  11. Collaborative Project: Building improved optimized parameter estimation algorithms to improve methane and nitrogen fluxes in a climate model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahowald, Natalie

Soils in natural and managed ecosystems and wetlands are well known sources of methane, nitrous oxides, and reactive nitrogen gases, but the magnitudes of gas flux to the atmosphere are still poorly constrained. Thus, the reasons for the large increases in atmospheric concentrations of methane and nitrous oxide since the preindustrial time period are not well understood. The low atmospheric concentrations of methane and nitrous oxide, despite being more potent greenhouse gases than carbon dioxide, complicate empirical studies to provide explanations. In addition to climate concerns, the emissions of reactive nitrogen gases from soils are important to the changing nitrogen balance in the earth system, subject to human management, and may change substantially in the future. Thus improved modeling of the emission fluxes of these species from the land surface is important. Currently, there are emission modules for methane and some nitrogen species in the Community Earth System Model’s Community Land Model (CLM-ME/N); however, there are large uncertainties and problems in the simulations, resulting in coarse estimates. In this proposal, we seek to improve these emission modules by combining state-of-the-art process modules for emissions, available data, and new optimization methods. In earth science problems, we often have substantial data and knowledge of processes in disparate systems, and thus we need to combine data and a general process level understanding into a model for projections of future climate that are as accurate as possible. The best methodologies for optimization of parameters in earth system models are still being developed. In this proposal we will develop and apply surrogate algorithms that a) were especially developed for computationally expensive simulations like CLM-ME/N models; b) were (in the earlier surrogate optimization Stochastic RBF) demonstrated to perform very well on computationally expensive complex partial differential equations in earth science with limited numbers of simulations; and, c) will be (as part of the proposed research) significantly improved both by adding asynchronous parallelism, early truncation of unsuccessful simulations, and the improvement of both serial and parallel performance by the use of derivative and sensitivity information from global and local surrogate approximations S(x). The algorithm development and testing will be focused on the CLM-ME/N model application, but the methods are general and are expected to also perform well on optimization for parameter estimation of other climate models and other classes of continuous multimodal optimization problems arising from complex simulation models. In addition, this proposal will compile available datasets of emissions of methane, nitrous oxides and reactive nitrogen species and develop protocols for site level comparisons with the CLM-ME/N. Once the model parameters are optimized against site level data, the model will be simulated at the global level and compared to atmospheric concentration measurements for the current climate, and future emissions will be estimated using climate change as simulated by the CESM. This proposal combines experts in earth system modeling, optimization, computer science, and process level understanding of soil gas emissions in an interdisciplinary team in order to improve the modeling of methane and nitrogen gas emissions.
This proposal thus meets the requirements of the SciDAC RFP by integrating state-of-the-art computer science and earth system science to build an improved earth system model.
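    The surrogate-optimization loop described in the proposal can be sketched in a few lines: fit a radial basis function surrogate to the expensive-model evaluations done so far, score many cheap candidate parameter sets with the surrogate, and spend the next expensive evaluation on the most promising candidate. The objective function and settings below are hypothetical placeholders, and the sketch omits the exploration strategy and parallelism of the actual Stochastic RBF algorithm.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Minimal sketch of surrogate-based optimization for an expensive simulation
# (illustrative; not the project's Stochastic RBF code): fit a radial basis
# function surrogate to the evaluations done so far, then evaluate the true
# model only at the most promising of many cheap candidate points.

rng = np.random.default_rng(6)

def expensive_model(x):
    """Stand-in for a costly land-model run returning a misfit to observations."""
    return float(np.sum((x - 0.3) ** 2) + 0.1 * np.sin(10 * x).sum())

dim, n_init, n_iter = 4, 8, 20
X = rng.uniform(0, 1, (n_init, dim))                  # initial design
y = np.array([expensive_model(x) for x in X])

for _ in range(n_iter):
    surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")
    candidates = rng.uniform(0, 1, (2000, dim))       # cheap to score
    x_next = candidates[np.argmin(surrogate(candidates))]
    X = np.vstack([X, x_next])                        # spend one expensive run
    y = np.append(y, expensive_model(x_next))

best = X[np.argmin(y)]
print("best parameters found:", np.round(best, 3), "misfit:", round(float(y.min()), 4))
```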

  12. Pairing field methods to improve inference in wildlife surveys while accommodating detection covariance

    USGS Publications Warehouse

    Clare, John; McKinney, Shawn T.; DePue, John E.; Loftin, Cynthia S.

    2017-01-01

    It is common to use multiple field sampling methods when implementing wildlife surveys to compare method efficacy or cost efficiency, integrate distinct pieces of information provided by separate methods, or evaluate method-specific biases and misclassification error. Existing models that combine information from multiple field methods or sampling devices permit rigorous comparison of method-specific detection parameters, enable estimation of additional parameters such as false-positive detection probability, and improve occurrence or abundance estimates, but with the assumption that the separate sampling methods produce detections independently of one another. This assumption is tenuous if methods are paired or deployed in close proximity simultaneously, a common practice that reduces the additional effort required to implement multiple methods and reduces the risk that differences between method-specific detection parameters are confounded by other environmental factors. We develop occupancy and spatial capture–recapture models that permit covariance between the detections produced by different methods, use simulation to compare estimator performance of the new models to models assuming independence, and provide an empirical application based on American marten (Martes americana) surveys using paired remote cameras, hair catches, and snow tracking. Simulation results indicate existing models that assume that methods independently detect organisms produce biased parameter estimates and substantially understate estimate uncertainty when this assumption is violated, while our reformulated models are robust to either methodological independence or covariance. Empirical results suggested that remote cameras and snow tracking had comparable probability of detecting present martens, but that snow tracking also produced false-positive marten detections that could potentially substantially bias distribution estimates if not corrected for. Remote cameras detected marten individuals more readily than passive hair catches. Inability to photographically distinguish individual sex did not appear to induce negative bias in camera density estimates; instead, hair catches appeared to produce detection competition between individuals that may have been a source of negative bias. Our model reformulations broaden the range of circumstances in which analyses incorporating multiple sources of information can be robustly used, and our empirical results demonstrate that using multiple field-methods can enhance inferences regarding ecological parameters of interest and improve understanding of how reliably survey methods sample these parameters.

  13. A multidisciplinary-based conceptual model of a fractured sedimentary bedrock aquitard: improved prediction of aquitard integrity

    NASA Astrophysics Data System (ADS)

    Runkel, Anthony C.; Tipping, Robert G.; Meyer, Jessica R.; Steenberg, Julia R.; Retzler, Andrew J.; Parker, Beth L.; Green, Jeff A.; Barry, John D.; Jones, Perry M.

    2018-06-01

    A hydrogeologic conceptual model that improves understanding of variability in aquitard integrity is presented for a fractured sedimentary bedrock unit in the Cambrian-Ordovician aquifer system of midcontinent North America. The model is derived from multiple studies on the siliciclastic St. Lawrence Formation and adjacent strata across a range of scales and geologic conditions. These studies employed multidisciplinary techniques including borehole flowmeter logging, high-resolution depth-discrete multilevel well monitoring, fracture stratigraphy, fluorescent dye tracing, and three-dimensional (3D) distribution of anthropogenic tracers regionally. The paper documents a bulk aquitard that is highly anisotropic because of poor connectivity of vertical fractures across a low-permeability matrix, combined with ubiquitous bed-parallel partings. The partings provide high bulk horizontal hydraulic conductivity, analogous to aquifers in the system, while multiple preferential termination horizons of vertical fractures serve as discrete low vertical hydraulic conductivity intervals inhibiting vertical flow. The aquitard has substantial variability in its ability to protect underlying groundwater from contamination. Across widespread areas where the aquitard is deeply buried by younger bedrock, preferential termination horizons provide for high aquitard integrity (i.e. protection). Protection is diminished close to incised valleys where stress release and weathering have enhanced secondary pore development, including better connection of fractures across these horizons. These conditions, along with higher hydraulic head gradients in the same areas and more complex 3D flow where the aquitard is variably incised, allow for more substantial transport to deeper aquifers. The conceptual model likely applies to other fractured sedimentary bedrock aquitards within and outside of this region.

  14. Multilayer shallow water models with locally variable number of layers and semi-implicit time discretization

    NASA Astrophysics Data System (ADS)

    Bonaventura, Luca; Fernández-Nieto, Enrique D.; Garres-Díaz, José; Narbona-Reina, Gladys

    2018-07-01

    We propose an extension of the discretization approaches for multilayer shallow water models, aimed at making them more flexible and efficient for realistic applications to coastal flows. A novel discretization approach is proposed, in which the number of vertical layers and their distribution are allowed to change in different regions of the computational domain. Furthermore, semi-implicit schemes are employed for the time discretization, leading to a significant efficiency improvement for subcritical regimes. We show that, in the typical regimes in which the application of multilayer shallow water models is justified, the resulting discretization does not introduce any major spurious features and again allows a substantial reduction of the computational cost in areas with complex bathymetry. As an example of the potential of the proposed technique, an application to a sediment transport problem is presented, showing a remarkable improvement with respect to standard discretization approaches.

  15. The role of thermal and lubricant boundary layers in the transient thermal analysis of spur gears

    NASA Technical Reports Server (NTRS)

    El-Bayoumy, L. E.; Akin, L. S.; Townsend, D. P.; Choy, F. C.

    1989-01-01

    An improved convection heat-transfer model has been developed for the prediction of the transient tooth surface temperature of spur gears. The dissipative quality of the lubricating fluid is shown to be limited to the capacity extent of the thermal boundary layer. This phenomenon can be of significance in the determination of the thermal limit of gears accelerating to the point where gear scoring occurs. Steady-state temperature prediction is improved considerably through the use of a variable integration time step that substantially reduces computer time. Computer-generated plots of temperature contours enable the user to animate the propagation of the thermal wave as the gears come into and out of contact, thus contributing to better understanding of this complex problem. This model has a much better capability at predicting gear-tooth temperatures than previous models.

  16. The influence of atmospheric grid resolution in a climate model-forced ice sheet simulation

    NASA Astrophysics Data System (ADS)

    Lofverstrom, Marcus; Liakka, Johan

    2018-04-01

    Coupled climate-ice sheet simulations have been growing in popularity in recent years. Experiments of this type are however challenging as ice sheets evolve over multi-millennial timescales, which is beyond the practical integration limit of most Earth system models. A common method to increase model throughput is to trade resolution for computational efficiency (compromise accuracy for speed). Here we analyze how the resolution of an atmospheric general circulation model (AGCM) influences the simulation quality in a stand-alone ice sheet model. Four identical AGCM simulations of the Last Glacial Maximum (LGM) were run at different horizontal resolutions: T85 (1.4°), T42 (2.8°), T31 (3.8°), and T21 (5.6°). These simulations were subsequently used as forcing for an ice sheet model. While the T85 climate forcing reproduces the LGM ice sheets to a high accuracy, the intermediate resolution cases (T42 and T31) fail to build the Eurasian ice sheet. The T21 case fails in both Eurasia and North America. Sensitivity experiments using different surface mass balance parameterizations improve the simulations of the Eurasian ice sheet in the T42 case, but the compromise is a substantial ice buildup in Siberia. The T31 and T21 cases do not improve in the same way in Eurasia, though the latter simulates the continent-wide Laurentide ice sheet in North America. The difficulty in reproducing the LGM ice sheets in the T21 case is in broad agreement with previous studies using low-resolution atmospheric models, and is caused by a substantial deterioration of the model climate between the T31 and T21 resolutions. It is speculated that this deficiency may demonstrate a fundamental problem with using low-resolution atmospheric models in these types of experiments.

  17. Translational models of lung disease.

    PubMed

    Mercer, Paul F; Abbott-Banner, Katharine; Adcock, Ian M; Knowles, Richard G

    2015-02-01

    The 2nd Cross Company Respiratory Symposium (CCRS), held in Horsham, U.K. in 2012, brought together representatives from across the pharmaceutical industry with expert academics, in the common interest of improving the design and translational predictiveness of in vivo models of respiratory disease. Organized by the respiratory representatives of the European Federation of Pharmaceutical Industries and Associations (EFPIA) group of companies involved in the EU-funded project (U-BIOPRED), the aim of the symposium was to identify state-of-the-art improvements in the utility and design of models of respiratory disease, with a view to improving their translational potential and reducing wasteful animal usage. The respiratory research and development community is responding to the challenge of improving translation in several ways: greater collaboration and open sharing of data, careful selection of the species, complexity and chronicity of the models, improved practices in preclinical research, continued refinement in models of respiratory diseases and their sub-types, greater understanding of the biology underlying human respiratory diseases and their sub-types, and finally greater use of human (and especially disease-relevant) cells, tissues and explants. The present review highlights these initiatives, combining lessons from the symposium and papers published in Clinical Science arising from the symposium, with critiques of the models currently used in the settings of asthma, idiopathic pulmonary fibrosis and COPD. The ultimate hope is that this will contribute to a more rational, efficient and sustainable development of a range of new treatments for respiratory diseases that continue to cause substantial morbidity and mortality across the world.

  18. Job stress and cardiovascular disease: a theoretic critical review.

    PubMed

    Kristensen, T S

    1996-07-01

    During the last 15 years, the research on job stress and cardiovascular diseases has been dominated by the job strain model developed by R. Karasek (1979) and colleagues (R. Karasek & T. Theorell, 1990). In this article the results of this research are briefly summarized, and the theoretical and methodological basis is discussed and criticized. A sociological interpretation of the model emphasizing theories of technological change, qualifications of the workers, and the organization of work is proposed. Furthermore, improvements with regard to measuring the job strain dimensions and to sampling the study base are suggested. Substantial improvements of the job strain research could be achieved if the principle of triangulation were used in the measurements of stressors, stress, and sickness and if occupation-based samples were used instead of large representative samples.

  19. Improving the Spacelab mass memory unit tape layout with a simulation model

    NASA Technical Reports Server (NTRS)

    Noneman, S. R.

    1984-01-01

    A tape drive called the Mass Memory Unit (MMU) stores software used by Spacelab computers. MMU tape motion must be minimized during typical flight operations to avoid a loss of scientific data. A projection of the tape motion is needed for evaluation of candidate tape layouts. A computer simulation of the scheduled and unscheduled MMU tape accesses is developed for this purpose. This simulation permits evaluations of candidate tape layouts by tracking and summarizing tape movements. The factors that affect tape travel are investigated and a heuristic is developed to find a good tape layout. An improved tape layout for Spacelab I is selected after the evaluation of fourteen candidates. The simulation model will provide the ability to determine MMU layouts that substantially decrease the tape travel on future Spacelab flights.

  20. Geostatistical Prediction of Microbial Water Quality Throughout a Stream Network Using Meteorology, Land Cover, and Spatiotemporal Autocorrelation.

    PubMed

    Holcomb, David A; Messier, Kyle P; Serre, Marc L; Rowny, Jakob G; Stewart, Jill R

    2018-06-25

    Predictive modeling is promising as an inexpensive tool to assess water quality. We developed geostatistical predictive models of microbial water quality that empirically modeled spatiotemporal autocorrelation in measured fecal coliform (FC) bacteria concentrations to improve prediction. We compared five geostatistical models featuring different autocorrelation structures, fit to 676 observations from 19 locations in North Carolina's Jordan Lake watershed using meteorological and land cover predictor variables. Though stream distance metrics (with and without flow-weighting) failed to improve prediction over the Euclidean distance metric, incorporating temporal autocorrelation substantially improved prediction over the space-only models. We predicted FC throughout the stream network daily for one year, designating locations "impaired", "unimpaired", or "unassessed" if the probability of exceeding the state standard was ≥90%, ≤10%, or >10% but <90%, respectively. We could assign impairment status to more of the stream network on days any FC were measured, suggesting frequent sample-based monitoring remains necessary, though implementing spatiotemporal predictive models may reduce the number of concurrent sampling locations required to adequately assess water quality. Together, these results suggest that prioritizing sampling at different times and conditions using geographically sparse monitoring networks is adequate to build robust and informative geostatistical models of water quality impairment.
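
    The impairment-status rule described above reduces to a simple thresholding of the predicted exceedance probability. The sketch below restates that rule; the probabilities in the example array are invented, not model output from the Jordan Lake watershed.

```python
# Restatement of the impairment-status rule using illustrative probabilities.
import numpy as np

def assign_status(p_exceed, hi=0.90, lo=0.10):
    """Classify a location from its probability of exceeding the FC standard."""
    if p_exceed >= hi:
        return "impaired"
    if p_exceed <= lo:
        return "unimpaired"
    return "unassessed"

p_exceed = np.array([0.96, 0.42, 0.07, 0.88, 0.11])   # hypothetical per-location values
print([assign_status(p) for p in p_exceed])
```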

  1. Root architecture simulation improves the inference from seedling root phenotyping towards mature root systems

    PubMed Central

    Zhao, Jiangsan; Rewald, Boris; Leitner, Daniel; Nagel, Kerstin A.; Nakhforoosh, Alireza

    2017-01-01

    Root phenotyping provides trait information for plant breeding. A shortcoming of high-throughput root phenotyping is the limitation to seedling plants and failure to make inferences on mature root systems. We suggest root system architecture (RSA) models to predict mature root traits and overcome the inference problem. Sixteen pea genotypes were phenotyped in (i) seedling (Petri dishes) and (ii) mature (sand-filled columns) root phenotyping platforms. The RSA model RootBox was parameterized with seedling traits to simulate the fully developed root systems. Measured and modelled root length, first-order lateral number, and root distribution were compared to determine key traits for model-based prediction. No direct relationship in root traits (tap, lateral length, interbranch distance) was evident between phenotyping systems. RootBox significantly improved the inference over phenotyping platforms. Seedling plant tap and lateral root elongation rates and interbranch distance were sufficient model parameters to predict genotype ranking in total root length with a Spearman rank correlation of 0.83. Parameterization including uneven lateral spacing via a scaling function substantially improved the prediction of architectures underlying the differently sized root systems. We conclude that RSA models can solve the inference problem of seedling root phenotyping. RSA models should be included in the phenotyping pipeline to provide reliable information on mature root systems to breeding research. PMID:28168270
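
    The genotype-ranking comparison reported above is, in essence, a Spearman rank correlation between measured and model-predicted total root length across genotypes. A minimal sketch follows; the length values are invented placeholders, not the pea data.

```python
# Spearman rank correlation between measured and model-predicted total root
# length across genotypes (values are invented).
import numpy as np
from scipy.stats import spearmanr

measured = np.array([310.0, 254.0, 402.0, 188.0, 275.0, 330.0, 290.0, 221.0])   # cm
predicted = np.array([295.0, 240.0, 380.0, 200.0, 260.0, 345.0, 300.0, 215.0])  # cm

rho, pval = spearmanr(measured, predicted)
print(f"Spearman rank correlation = {rho:.2f} (p = {pval:.3f})")
```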

  2. Improvements, testing and development of the ADM-τ sub-grid surface tension model for two-phase LES

    NASA Astrophysics Data System (ADS)

    Aniszewski, Wojciech

    2016-12-01

    In this paper, a specific subgrid term occurring in Large Eddy Simulation (LES) of two-phase flows is investigated. This and other subgrid terms are presented; we subsequently elaborate on the existing models for them and re-formulate the ADM-τ model for sub-grid surface tension previously published by these authors. This paper presents a substantial, conceptual simplification over the original model version, accompanied by a decrease in its computational cost. At the same time, it addresses the issues the original model version faced, e.g. introduces non-isotropic applicability criteria based on the resolved interface's principal curvature radii. Additionally, this paper introduces more thorough testing of the ADM-τ, in both simple and complex flows.

  3. Lung function parameters improve prediction of VO2peak in an elderly population: The Generation 100 study.

    PubMed

    Hassel, Erlend; Stensvold, Dorthe; Halvorsen, Thomas; Wisløff, Ulrik; Langhammer, Arnulf; Steinshamn, Sigurd

    2017-01-01

    Peak oxygen uptake (VO2peak) is an indicator of cardiovascular health and a useful tool for risk stratification. Direct measurement of VO2peak is resource-demanding and may be contraindicated. There exist several non-exercise models to estimate VO2peak that utilize easily obtainable health parameters, but none of them includes lung function measures or hemoglobin concentrations. We aimed to test whether addition of these parameters could improve prediction of VO2peak compared to an established model that includes age, waist circumference, self-reported physical activity and resting heart rate. We included 1431 subjects aged 69-77 years that completed a laboratory test of VO2peak, spirometry, and a gas diffusion test. Prediction models for VO2peak were developed with multiple linear regression, and goodness of fit was evaluated. Forced expiratory volume in one second (FEV1), diffusing capacity of the lung for carbon monoxide and blood hemoglobin concentration significantly improved the ability of the established model to predict VO2peak. The explained variance of the model increased from 31% to 48% for men and from 32% to 38% for women (p<0.001). FEV1, diffusing capacity of the lungs for carbon monoxide and hemoglobin concentration substantially improved the accuracy of VO2peak prediction when added to an established model in an elderly population.
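
    The model comparison above amounts to checking how much the explained variance of a nested linear regression improves when lung function and hemoglobin are added to the base predictors. A minimal sketch with simulated (not Generation 100) data:

```python
# Nested linear models: base predictors vs base + FEV1, DLCO and hemoglobin.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 500
age = rng.uniform(69, 77, n)
waist = rng.normal(95, 10, n)
activity = rng.integers(0, 4, n).astype(float)   # self-reported activity category
rhr = rng.normal(65, 8, n)                       # resting heart rate
fev1 = rng.normal(2.8, 0.6, n)
dlco = rng.normal(7.5, 1.5, n)
hb = rng.normal(14.5, 1.2, n)
vo2peak = (40 - 0.2 * (age - 69) - 0.1 * (waist - 95) + 1.5 * activity
           - 0.1 * (rhr - 65) + 2.0 * (fev1 - 2.8) + 0.8 * (dlco - 7.5)
           + 1.0 * (hb - 14.5) + rng.normal(0, 4, n))

X_base = np.column_stack([age, waist, activity, rhr])
X_full = np.column_stack([age, waist, activity, rhr, fev1, dlco, hb])

r2_base = LinearRegression().fit(X_base, vo2peak).score(X_base, vo2peak)
r2_full = LinearRegression().fit(X_full, vo2peak).score(X_full, vo2peak)
print(f"R^2 base = {r2_base:.2f}, R^2 with lung function + Hb = {r2_full:.2f}")
```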

  4. Clarity versus complexity: land-use modeling as a practical tool for decision-makers

    USGS Publications Warehouse

    Sohl, Terry L.; Claggett, Peter

    2013-01-01

    The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.

  5. Forecasting Influenza Outbreaks in Boroughs and Neighborhoods of New York City

    PubMed Central

    2016-01-01

    The ideal spatial scale, or granularity, at which infectious disease incidence should be monitored and forecast has been little explored. By identifying the optimal granularity for a given disease and host population, and matching surveillance and prediction efforts to this scale, response to emergent and recurrent outbreaks can be improved. Here we explore how granularity and representation of spatial structure affect influenza forecast accuracy within New York City. We develop network models at the borough and neighborhood levels, and use them in conjunction with surveillance data and a data assimilation method to forecast influenza activity. These forecasts are compared to an alternate system that predicts influenza for each borough or neighborhood in isolation. At the borough scale, influenza epidemics are highly synchronous despite substantial differences in intensity, and inclusion of network connectivity among boroughs generally improves forecast accuracy. At the neighborhood scale, we observe much greater spatial heterogeneity among influenza outbreaks including substantial differences in local outbreak timing and structure; however, inclusion of the network model structure generally degrades forecast accuracy. One notable exception is that local outbreak onset, particularly when signal is modest, is better predicted with the network model. These findings suggest that observation and forecast at sub-municipal scales within New York City provides richer, more discriminant information on influenza incidence, particularly at the neighborhood scale where greater heterogeneity exists, and that the spatial spread of influenza among localities can be forecast. PMID:27855155

  6. A method for accounting for maintenance costs in flux balance analysis improves the prediction of plant cell metabolic phenotypes under stress conditions.

    PubMed

    Cheung, C Y Maurice; Williams, Thomas C R; Poolman, Mark G; Fell, David A; Ratcliffe, R George; Sweetlove, Lee J

    2013-09-01

    Flux balance models of metabolism generally utilize synthesis of biomass as the main determinant of intracellular fluxes. However, the biomass constraint alone is not sufficient to predict realistic fluxes in central heterotrophic metabolism of plant cells because of the major demand on the energy budget due to transport costs and cell maintenance. This major limitation can be addressed by incorporating transport steps into the metabolic model and by implementing a procedure that uses Pareto optimality analysis to explore the trade-off between ATP and NADPH production for maintenance. This leads to a method for predicting cell maintenance costs on the basis of the measured flux ratio between the oxidative steps of the oxidative pentose phosphate pathway and glycolysis. We show that accounting for transport and maintenance costs substantially improves the accuracy of fluxes predicted from a flux balance model of heterotrophic Arabidopsis cells in culture, irrespective of the objective function used in the analysis. Moreover, when the new method was applied to cells under control, elevated temperature and hyper-osmotic conditions, only elevated temperature led to a substantial increase in cell maintenance costs. It is concluded that the hyper-osmotic conditions tested did not impose a metabolic stress, inasmuch as the metabolic network is not forced to devote more resources to cell maintenance.

  7. An observational radiative constraint on hydrologic cycle intensification.

    PubMed

    DeAngelis, Anthony M; Qu, Xin; Zelinka, Mark D; Hall, Alex

    2015-12-10

    Intensification of the hydrologic cycle is a key dimension of climate change, with substantial impacts on human and natural systems. A basic measure of hydrologic cycle intensification is the increase in global-mean precipitation per unit surface warming, which varies by a factor of three in current-generation climate models (about 1-3 per cent per kelvin). Part of the uncertainty may originate from atmosphere-radiation interactions. As the climate warms, increases in shortwave absorption from atmospheric moistening will suppress the precipitation increase. This occurs through a reduction of the latent heating increase required to maintain a balanced atmospheric energy budget. Using an ensemble of climate models, here we show that such models tend to underestimate the sensitivity of solar absorption to variations in atmospheric water vapour, leading to an underestimation in the shortwave absorption increase and an overestimation in the precipitation increase. This sensitivity also varies considerably among models due to differences in radiative transfer parameterizations, explaining a substantial portion of model spread in the precipitation response. Consequently, attaining accurate shortwave absorption responses through improvements to the radiative transfer schemes could reduce the spread in the predicted global precipitation increase per degree warming for the end of the twenty-first century by about 35 per cent, and reduce the estimated ensemble-mean increase in this quantity by almost 40 per cent.

  8. Enhancement of mechanical properties of 123 superconductors

    DOEpatents

    Balachandran, Uthamalingam

    1995-01-01

    A composition and method of preparing YBa2Cu3O7-x superconductor. Addition of tin oxide containing compounds to YBCO superconductors results in substantial improvement of fracture toughness and other mechanical properties without effect on Tc. About 5-20% additions give rise to substantially improved mechanical properties.

  9. 78 FR 25501 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of a Longer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-69450; File No. SR-NASDAQ-2013-031] Self... Member Organization To Attest That ``Substantially All'' Orders Submitted to the Retail Price Improvement... ``substantially all,'' rather than all, orders submitted to the Retail Price Improvement Program qualify as...

  10. Revised Parameters for the AMOEBA Polarizable Atomic Multipole Water Model.

    PubMed

    Laury, Marie L; Wang, Lee-Ping; Pande, Vijay S; Head-Gordon, Teresa; Ponder, Jay W

    2015-07-23

    A set of improved parameters for the AMOEBA polarizable atomic multipole water model is developed. An automated procedure, ForceBalance, is used to adjust model parameters to enforce agreement with ab initio-derived results for water clusters and experimental data for a variety of liquid phase properties across a broad temperature range. The values reported here for the new AMOEBA14 water model represent a substantial improvement over the previous AMOEBA03 model. The AMOEBA14 model accurately predicts the temperature of maximum density and qualitatively matches the experimental density curve across temperatures from 249 to 373 K. Excellent agreement is observed for the AMOEBA14 model in comparison to experimental properties as a function of temperature, including the second virial coefficient, enthalpy of vaporization, isothermal compressibility, thermal expansion coefficient, and dielectric constant. The viscosity, self-diffusion constant, and surface tension are also well reproduced. In comparison to high-level ab initio results for clusters of 2-20 water molecules, the AMOEBA14 model yields results similar to AMOEBA03 and the direct polarization iAMOEBA models. With advances in computing power, calibration data, and optimization techniques, we recommend the use of the AMOEBA14 water model for future studies employing a polarizable water model.

  11. Model fitting for small skin permeability data sets: hyperparameter optimisation in Gaussian Process Regression.

    PubMed

    Ashrafi, Parivash; Sun, Yi; Davey, Neil; Adams, Roderick G; Wilkinson, Simon C; Moss, Gary Patrick

    2018-03-01

    The aim of this study was to investigate how to improve predictions from Gaussian Process models by optimising the model hyperparameters. Optimisation methods, including Grid Search, Conjugate Gradient, Random Search, Evolutionary Algorithm and Hyper-prior, were evaluated and applied to previously published data. Data sets were also altered in a structured manner to reduce their size, which retained the range, or 'chemical space' of the key descriptors to assess the effect of the data range on model quality. The Hyper-prior Smoothbox kernel results in the best models for the majority of data sets, and they exhibited significantly better performance than benchmark quantitative structure-permeability relationship (QSPR) models. When the data sets were systematically reduced in size, the different optimisation methods generally retained their statistical quality, whereas benchmark QSPR models performed poorly. The design of the data set, and possibly also the approach to validation of the model, is critical in the development of improved models. The size of the data set, if carefully controlled, was not generally a significant factor for these models, and models of excellent statistical quality could be produced from substantially smaller data sets.
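
    As a rough illustration of hyperparameter optimisation in Gaussian Process Regression, the sketch below compares a simple grid search over an RBF length-scale against scikit-learn's built-in marginal-likelihood optimiser; the data are synthetic and the kernel choice is an assumption, not the Hyper-prior Smoothbox kernel used in the study.

```python
# Grid search over an RBF length-scale vs scikit-learn's built-in optimiser,
# both judged by the log marginal likelihood (LML). Synthetic 1-D data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

best_ls, best_lml = None, -np.inf
for ls in np.logspace(-2, 1, 20):                      # simple grid search
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=ls) + WhiteKernel(0.01),
                                  optimizer=None).fit(X, y)
    if gp.log_marginal_likelihood_value_ > best_lml:
        best_ls, best_lml = ls, gp.log_marginal_likelihood_value_

gp_opt = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.01),
                                  n_restarts_optimizer=5).fit(X, y)   # gradient-based

print(f"grid search:  length-scale = {best_ls:.3f}, LML = {best_lml:.1f}")
print(f"optimiser:    kernel = {gp_opt.kernel_}, "
      f"LML = {gp_opt.log_marginal_likelihood_value_:.1f}")
```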

  12. Stratospheric Aerosol--Observations, Processes, and Impact on Climate

    NASA Technical Reports Server (NTRS)

    Kremser, Stefanie; Thomason, Larry W.; von Hobe, Marc; Hermann, Markus; Deshler, Terry; Timmreck, Claudia; Toohey, Matthew; Stenke, Andrea; Schwarz, Joshua P.; Weigel, Ralf; et al.

    2016-01-01

    Interest in stratospheric aerosol and its role in climate has increased over the last decade due to the observed increase in stratospheric aerosol since 2000 and the potential for changes in the sulfur cycle induced by climate change. This review provides an overview of the advances in stratospheric aerosol research since the last comprehensive assessment of stratospheric aerosol was published in 2006. A crucial development since 2006 is the substantial improvement in the agreement between in situ and space-based inferences of stratospheric aerosol properties during volcanically quiescent periods. Furthermore, new measurement systems and techniques, both in situ and space based, have been developed for measuring physical aerosol properties with greater accuracy and for characterizing aerosol composition. However, these changes induce challenges to constructing a long-term stratospheric aerosol climatology. Currently, changes in stratospheric aerosol levels less than 20% cannot be confidently quantified. The volcanic signals tend to mask any nonvolcanically driven change, making them difficult to understand. While the role of carbonyl sulfide as a substantial and relatively constant source of stratospheric sulfur has been confirmed by new observations and model simulations, large uncertainties remain with respect to the contribution from anthropogenic sulfur dioxide emissions. New evidence has been provided that stratospheric aerosol can also contain small amounts of nonsulfate matter such as black carbon and organics. Chemistry-climate models have substantially increased in quantity and sophistication. In many models the implementation of stratospheric aerosol processes is coupled to radiation and/or stratospheric chemistry modules to account for relevant feedback processes.

  13. Opportunities and constraints of presently used thermal manikins for thermo-physiological simulation of the human body.

    PubMed

    Psikuta, Agnes; Kuklane, Kalev; Bogdan, Anna; Havenith, George; Annaheim, Simon; Rossi, René M

    2016-03-01

    Combining the strengths of an advanced mathematical model of human physiology and a thermal manikin is a new paradigm for simulating thermal behaviour of humans. However, the forerunners of such adaptive manikins showed some substantial limitations. This project aimed to determine the opportunities and constraints of the existing thermal manikins when dynamically controlled by a mathematical model of human thermal physiology. Four thermal manikins were selected and evaluated for their heat flux measurement uncertainty including lateral heat flows between manikin body parts and the response of each sector to the frequent change of the set-point temperature typical when using a physiological model for control. In general, all evaluated manikins are suitable for coupling with a physiological model with some recommendations for further improvement of manikin dynamic performance. The proposed methodology is useful to improve the performance of the adaptive manikins and help to provide a reliable and versatile tool for the broad research and development domain of clothing, automotive and building engineering.

  14. Can Cultural Competency Reduce Racial And Ethnic Health Disparities? A Review And Conceptual Model

    PubMed Central

    Brach, Cindy; Fraser, Irene

    2016-01-01

    This article develops a conceptual model of cultural competency’s potential to reduce racial and ethnic health disparities, using the cultural competency and disparities literature to lay the foundation for the model and inform assessments of its validity. The authors identify nine major cultural competency techniques: interpreter services, recruitment and retention policies, training, coordinating with traditional healers, use of community health workers, culturally competent health promotion, including family/community members, immersion into another culture, and administrative and organizational accommodations. The conceptual model shows how these techniques could theoretically improve the ability of health systems and their clinicians to deliver appropriate services to diverse populations, thereby improving outcomes and reducing disparities. The authors conclude that while there is substantial research evidence to suggest that cultural competency should in fact work, health systems have little evidence about which cultural competency techniques are effective and less evidence on when and how to implement them properly. PMID:11092163

  15. 24 CFR 881.201 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... from gutting and extensive reconstruction to the cure of substantial accumulation of deferred maintenance. Cosmetic improvements alone do not qualify as substantial rehabilitation under this definition...

  16. 24 CFR 881.201 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... from gutting and extensive reconstruction to the cure of substantial accumulation of deferred maintenance. Cosmetic improvements alone do not qualify as substantial rehabilitation under this definition...

  17. 24 CFR 881.201 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... from gutting and extensive reconstruction to the cure of substantial accumulation of deferred maintenance. Cosmetic improvements alone do not qualify as substantial rehabilitation under this definition...

  18. 24 CFR 881.201 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... from gutting and extensive reconstruction to the cure of substantial accumulation of deferred maintenance. Cosmetic improvements alone do not qualify as substantial rehabilitation under this definition...

  19. Atmospheric Effects of Subsonic Aircraft: Interim Assessment Report of the Advanced Subsonic Technology Program

    NASA Technical Reports Server (NTRS)

    Friedl, Randall R. (Editor)

    1997-01-01

    This first interim assessment of the subsonic assessment (SASS) project attempts to summarize concisely the status of our knowledge concerning the impacts of present and future subsonic aircraft fleets. It also highlights the major areas of scientific uncertainty, through review of existing databases and model-based sensitivity studies. In view of the need for substantial improvements in both model formulations and experimental databases, this interim assessment cannot provide confident numerical predictions of aviation impacts. However, a number of quantitative estimates are presented, which provide some guidance to policy makers.

  20. Improved Electrostatic Embedding for Fragment-Based Chemical Shift Calculations in Molecular Crystals.

    PubMed

    Hartman, Joshua D; Balaji, Ashwin; Beran, Gregory J O

    2017-12-12

    Fragment-based methods predict nuclear magnetic resonance (NMR) chemical shielding tensors in molecular crystals with high accuracy and computational efficiency. Such methods typically employ electrostatic embedding to mimic the crystalline environment, and the quality of the results can be sensitive to the embedding treatment. To improve the quality of this embedding environment for fragment-based molecular crystal property calculations, we borrow ideas from the embedded ion method to incorporate self-consistently polarized Madelung field effects. The self-consistent reproduction of the Madelung potential (SCRMP) model developed here constructs an array of point charges that incorporates self-consistent lattice polarization and which reproduces the Madelung potential at all atomic sites involved in the quantum mechanical region of the system. The performance of fragment- and cluster-based 1H, 13C, 14N, and 17O chemical shift predictions using SCRMP and density functionals like PBE and PBE0 is assessed. The improved embedding model results in substantial improvements in the predicted 17O chemical shifts and modest improvements in the 15N ones. Finally, the performance of the model is demonstrated by examining the assignment of the two oxygen chemical shifts in the challenging γ-polymorph of glycine. Overall, the SCRMP-embedded NMR chemical shift predictions are on par with or more accurate than those obtained with the widely used gauge-including projector augmented wave (GIPAW) model.

  1. Assessment of simulated water balance from Noah, Noah-MP, CLM, and VIC over CONUS using the NLDAS test bed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Xitian; Yang, Zong-Liang; Xia, Youlong

    2014-12-27

    This study assesses the hydrologic performance of four land surface models (LSMs) for the conterminous United States using the North American Land Data Assimilation System (NLDAS) test bed. The four LSMs are the baseline community Noah LSM (Noah, version 2.8), the Variable Infiltration Capacity (VIC, version 4.0.5) model, the substantially augmented Noah LSM with multiparameterization options (hence Noah-MP), and the Community Land Model version 4 (CLM4). All four models are driven by the same NLDAS-2 atmospheric forcing. Modeled terrestrial water storage (TWS), streamflow, evapotranspiration (ET), and soil moisture are compared with each other and evaluated against the identical observations. Relative to Noah, the other three models offer significant improvements in simulating TWS and streamflow and moderate improvements in simulating ET and soil moisture. Noah-MP provides the best performance in simulating soil moisture and is among the best in simulating TWS, CLM4 shows the best performance in simulating ET, and VIC ranks the highest in performing the streamflow simulations. Despite these improvements, CLM4, Noah-MP, and VIC exhibit deficiencies, such as the low variability of soil moisture in CLM4, the fast growth of spring ET in Noah-MP, and the constant overestimation of ET in VIC.
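
    Benchmarking several models against the same observations, as above, typically comes down to a few skill metrics per model. A minimal sketch (with invented series standing in for NLDAS observations and model output):

```python
# Simple skill metrics (bias, RMSE, correlation) for several model series
# evaluated against one observation series; all series are invented.
import numpy as np

def skill(obs, sim):
    bias = np.mean(sim - obs)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    corr = np.corrcoef(obs, sim)[0, 1]
    return bias, rmse, corr

rng = np.random.default_rng(4)
t = np.arange(120)                                        # e.g. monthly values
obs = 50 + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, t.size)
models = {
    "Noah":    obs + rng.normal(5, 8, t.size),            # biased and noisy
    "Noah-MP": obs + rng.normal(1, 4, t.size),
    "CLM4":    obs + rng.normal(-2, 5, t.size),
    "VIC":     obs + rng.normal(3, 6, t.size),
}
for name, sim in models.items():
    b, r, c = skill(obs, sim)
    print(f"{name:8s} bias={b:6.2f}  rmse={r:5.2f}  corr={c:.2f}")
```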

  2. Adaptive estimation of the log fluctuating conductivity from tracer data at the Cape Cod Site

    USGS Publications Warehouse

    Deng, F.W.; Cushman, J.H.; Delleur, J.W.

    1993-01-01

    An adaptive estimation scheme is used to obtain the integral scale and variance of the log-fluctuating conductivity at the Cape Cod site based on the fast Fourier transform/stochastic model of Deng et al. (1993) and a Kalman-like filter. The filter incorporates prior estimates of the unknown parameters with tracer moment data to adaptively obtain improved estimates as the tracer evolves. The results show that significant improvement in the prior estimates of the conductivity can lead to substantial improvement in the ability to predict plume movement. The structure of the covariance function of the log-fluctuating conductivity can be identified from the robustness of the estimation. Both the longitudinal and transverse spatial moment data are important to the estimation.

  3. 36 CFR 51.55 - What must a concessioner do after substantial completion of the capital improvement?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... together with, if requested by the Director, a written certification from a certified public accountant... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false What must a concessioner do after substantial completion of the capital improvement? 51.55 Section 51.55 Parks, Forests, and Public...

  4. Enhancement of mechanical properties of 123 superconductors

    DOEpatents

    Balachandran, U.

    1995-04-25

    A composition and method are disclosed of preparing YBa2Cu3O7-x superconductor. Addition of tin oxide containing compounds to YBCO superconductors results in substantial improvement of fracture toughness and other mechanical properties without effect on Tc. About 5-20% additions give rise to substantially improved mechanical properties.

  5. Reciprocal peer review for quality improvement: an ethnographic case study of the Improving Lung Cancer Outcomes Project.

    PubMed

    Aveling, Emma-Louise; Martin, Graham; Jiménez García, Senai; Martin, Lisa; Herbert, Georgia; Armstrong, Natalie; Dixon-Woods, Mary; Woolhouse, Ian

    2012-12-01

    Peer review offers a promising way of promoting improvement in health systems, but the optimal model is not yet clear. We aimed to describe a specific peer review model, reciprocal peer-to-peer review (RP2PR), to identify the features that appeared to support optimal functioning. We conducted an ethnographic study involving observations, interviews and documentary analysis of the Improving Lung Cancer Outcomes Project, which involved 30 paired multidisciplinary lung cancer teams participating in facilitated reciprocal site visits. Analysis was based on the constant comparative method. Fundamental features of the model include multidisciplinary participation; a focus on discussion and observation of teams in action rather than paperwork; facilitated reflection and discussion on data and observations; and support to develop focused improvement plans. Five key features were identified as important in optimising this model: peers and pairing methods; minimising logistic burden; structure of visits; independent facilitation; and credibility of the process. Facilitated RP2PR was generally a positive experience for participants, but implementing improvement plans was challenging and required substantial support. RP2PR appears to be optimised when it is well organised; a safe environment for learning is created; credibility is maximised; implementation and impact are supported. RP2PR is seen as credible and legitimate by lung cancer teams and can act as a powerful stimulus to produce focused quality improvement plans and to support implementation. Our findings have identified how RP2PR functioned and may be optimised to provide a constructive, open space for identifying opportunities for improvement and solutions.

  6. Improved Satellite-based Crop Yield Mapping by Spatially Explicit Parameterization of Crop Phenology

    NASA Astrophysics Data System (ADS)

    Jin, Z.; Azzari, G.; Lobell, D. B.

    2016-12-01

    Field-scale mapping of crop yields with satellite data often relies on the use of crop simulation models. However, these approaches can be hampered by inaccuracies in the simulation of crop phenology. Here we present and test an approach that uses dense time series of Landsat 7 and 8 acquisitions to calibrate various parameters related to crop phenology simulation, such as leaf number and leaf appearance rates. These parameters are then mapped across the Midwestern United States for maize and soybean, and for two different simulation models. We then implement our recently developed Scalable satellite-based Crop Yield Mapper (SCYM) with simulations reflecting the improved phenology parameterizations, and compare to prior estimates based on default phenology routines. Our preliminary results show that the proposed method can effectively alleviate the underestimation of early-season LAI by the default Agricultural Production Systems sIMulator (APSIM), and that spatially explicit parameterization for the phenology model substantially improves the SCYM performance in capturing the spatiotemporal variation in maize and soybean yield. The scheme presented in our study thus preserves the scalability of SCYM, while significantly reducing its uncertainty.

  7. Ensemble Learning of QTL Models Improves Prediction of Complex Traits

    PubMed Central

    Bian, Yang; Holland, James B.

    2015-01-01

    Quantitative trait locus (QTL) models can provide useful insights into trait genetic architecture because of their straightforward interpretability but are less useful for genetic prediction because of the difficulty in including the effects of numerous small effect loci without overfitting. Tight linkage between markers introduces near collinearity among marker genotypes, complicating the detection of QTL and estimation of QTL effects in linkage mapping, and this problem is exacerbated by very high density linkage maps. Here we developed a thinning and aggregating (TAGGING) method as a new ensemble learning approach to QTL mapping. TAGGING reduces collinearity problems by thinning dense linkage maps, maintains aspects of marker selection that characterize standard QTL mapping, and by ensembling, incorporates information from many more marker-trait associations than traditional QTL mapping. The objective of TAGGING was to improve prediction power compared with QTL mapping while also providing more specific insights into genetic architecture than genome-wide prediction models. TAGGING was compared with standard QTL mapping using cross validation of empirical data from the maize (Zea mays L.) nested association mapping population. TAGGING-assisted QTL mapping substantially improved prediction ability for both biparental and multifamily populations by reducing both the variance and bias in prediction. Furthermore, an ensemble model combining predictions from TAGGING-assisted QTL and infinitesimal models improved prediction abilities over the component models, indicating some complementarity between model assumptions and suggesting that some trait genetic architectures involve a mixture of a few major QTL and polygenic effects. PMID:26276383
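
    A rough sketch of the thinning-and-aggregating idea: repeatedly thin the marker map, fit a sparse model on each thinned subset, and average predictions across the ensemble. The simulated genotypes and the use of LASSO as the per-subset model are stand-ins for illustration, not the TAGGING implementation.

```python
# Thin-and-aggregate ensemble: fit a sparse model on each thinned marker subset
# and average predictions. Simulated genotypes; LASSO is only a stand-in.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)
n, p = 300, 400
X = rng.binomial(2, 0.5, size=(n, p)).astype(float)           # marker genotypes 0/1/2
beta = np.zeros(p)
beta[[20, 150, 390]] = [1.2, -0.8, 0.6]                        # a few "true" QTL
y = X @ beta + rng.normal(0, 1.0, n)

train, test = np.arange(200), np.arange(200, 300)
preds = []
for _ in range(25):                                            # ensemble members
    keep = np.sort(rng.choice(p, size=p // 5, replace=False))  # thin to 20% of markers
    model = Lasso(alpha=0.1).fit(X[np.ix_(train, keep)], y[train])
    preds.append(model.predict(X[np.ix_(test, keep)]))

ensemble = np.mean(preds, axis=0)
print("correlation of ensemble prediction with trait:",
      round(float(np.corrcoef(ensemble, y[test])[0, 1]), 2))
```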

  8. The regionalization of national-scale SPARROW models for stream nutrients

    USGS Publications Warehouse

    Schwarz, Gregory E.; Alexander, Richard B.; Smith, Richard A.; Preston, Stephen D.

    2011-01-01

    This analysis modifies the parsimonious specification of recently published total nitrogen (TN) and total phosphorus (TP) national-scale SPAtially Referenced Regressions On Watershed attributes models to allow each model coefficient to vary geographically among three major river basins of the conterminous United States. Regionalization of the national models reduces the standard errors in the prediction of TN and TP loads, expressed as a percentage of the predicted load, by about 6 and 7%. We develop and apply a method for combining national-scale and regional-scale information to estimate a hybrid model that imposes cross-region constraints that limit regional variation in model coefficients, effectively reducing the number of free model parameters as compared to a collection of independent regional models. The hybrid TN and TP regional models have improved model fit relative to the respective national models, reducing the standard error in the prediction of loads, expressed as a percentage of load, by about 5 and 4%. Only 19% of the TN hybrid model coefficients and just 2% of the TP hybrid model coefficients show evidence of substantial regional specificity (more than ±100% deviation from the national model estimate). The hybrid models have much greater precision in the estimated coefficients than do the unconstrained regional models, demonstrating the efficacy of pooling information across regions to improve regional models.

  9. Reducing respiratory motion artifacts in positron emission tomography through retrospective stacking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thorndyke, Brian; Schreibmann, Eduard; Koong, Albert

    Respiratory motion artifacts in positron emission tomography (PET) imaging can alter lesion intensity profiles, and result in substantially reduced activity and contrast-to-noise ratios (CNRs). We propose a corrective algorithm, coined 'retrospective stacking' (RS), to restore image quality without requiring additional scan time. Retrospective stacking uses b-spline deformable image registration to combine amplitude-binned PET data along the entire respiratory cycle into a single respiratory end point. We applied the method to a phantom model consisting of a small, hot vial oscillating within a warm background, as well as to 18FDG-PET images of a pancreatic and a liver patient. Comparisons were made using cross-section visualizations, activity profiles, and CNRs within the region of interest. Retrospective stacking was found to properly restore the lesion location and intensity profile in all cases. In addition, RS provided CNR improvements up to three-fold over gated images, and up to five-fold over ungated data. These phantom and patient studies demonstrate that RS can correct for lesion motion and deformation, while substantially improving tumor visibility and background noise.

  10. Self-report measure of financial exploitation of older adults.

    PubMed

    Conrad, Kendon J; Iris, Madelyn; Ridings, John W; Langley, Kate; Wilber, Kathleen H

    2010-12-01

    This study was designed to improve the measurement of financial exploitation (FE) by testing psychometric properties of the older adult financial exploitation measure (OAFEM), a client self-report instrument. Rasch item response theory and traditional validation approaches were used. Questionnaires were administered by 22 adult protective services investigators from 7 agencies in Illinois to 227 substantiated abuse clients. Analyses included tests for dimensionality, model fit, and additional construct validation. Results from the OAFEM were also compared with the substantiation decision of abuse and with investigators' assessments of FE using a staff report version. Hypotheses were generated to test hypothesized relationships. The OAFEM, including the original 79-, 54-, and 30-item measures, met stringent Rasch analysis fit and unidimensionality criteria and had high internal consistency and item reliability. The validation results were supportive, while leading to reconsideration of aspects of the hypothesized theoretical hierarchy. Thresholds were suggested to demonstrate levels of severity. The measure is now available to aid in the assessment of FE of older adults by both clinicians and researchers. Theoretical refinements developed using the empirically generated item hierarchy may help to improve assessment and intervention.

  11. The effects of speech production and vocabulary training on different components of spoken language performance.

    PubMed

    Paatsch, Louise E; Blamey, Peter J; Sarant, Julia Z; Bow, Catherine P

    2006-01-01

    A group of 21 hard-of-hearing and deaf children attending primary school were trained by their teachers on the production of selected consonants and on the meanings of selected words. Speech production, vocabulary knowledge, reading aloud, and speech perception measures were obtained before and after each type of training. The speech production training produced a small but significant improvement in the percentage of consonants correctly produced in words. The vocabulary training improved knowledge of word meanings substantially. Performance on speech perception and reading aloud were significantly improved by both types of training. These results were in accord with the predictions of a mathematical model put forward to describe the relationships between speech perception, speech production, and language measures in children (Paatsch, Blamey, Sarant, Martin, & Bow, 2004). These training data demonstrate that the relationships between the measures are causal. In other words, improvements in speech production and vocabulary performance produced by training will carry over into predictable improvements in speech perception and reading scores. Furthermore, the model will help educators identify the most effective methods of improving receptive and expressive spoken language for individual children who are deaf or hard of hearing.

  12. The migraine ACE model: evaluating the impact on time lost and medical resource use.

    PubMed

    Caro, J J; Caro, G; Getsios, D; Raggio, G; Burrows, M; Black, L

    2000-04-01

    To describe the Migraine Adaptive Cost-Effectiveness Model in the context of an analysis of a simulated population of Canadian patients with migraine. The high prevalence of migraine and its substantial impact on patients' ability to function normally present a significant economic burden to society. In light of the recent availability of improved pharmaceutical treatments, a model was developed to assess their economic impact. The Migraine Adaptive Cost-Effectiveness Model incorporates the costs of time lost from both work and nonwork activities, as well as medical resource and medication use. Using Monte Carlo techniques, the model simulates the experience of a population of patients with migraine over the course of 1 year. As an example, analyses of a Canadian population were carried out using data from a multinational trial, surveys, national statistics, and the available literature. Using customary therapy, mean productivity losses (amounting to 84 hours of paid work time, 48 hours of unpaid work time, and 113 hours of leisure time lost) were estimated to cost $1949 (in 1997 Canadian dollars) per patient, with medical expenditures adding an average of $280 to the cost of illness. With customary treatment patterns, the costs of migraine associated with reduced functional capacity are substantial. The migraine model represents a flexible tool for the economic evaluation of different migraine treatments in various populations.
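
    The model described above is, at its core, a Monte Carlo simulation that converts sampled attack frequencies and hours lost into annual costs. The sketch below shows that structure only; every rate and dollar figure is an invented placeholder, not an input from the Canadian analysis.

```python
# Monte Carlo structure only: sample attacks and hours lost per patient-year,
# value the time, and add medical costs. All numbers are placeholders.
import numpy as np

rng = np.random.default_rng(6)
n_patients = 10_000
wage, leisure_value = 20.0, 10.0                       # $/hour, hypothetical

attacks = rng.poisson(18, n_patients)                  # attacks per patient-year
paid_hours = rng.gamma(2.0, 2.5, n_patients) * attacks / 18
unpaid_hours = rng.gamma(2.0, 1.5, n_patients) * attacks / 18
medical_cost = rng.gamma(2.0, 140.0, n_patients)       # visits, medication, etc.

productivity = wage * paid_hours + leisure_value * unpaid_hours
total = productivity + medical_cost
print(f"mean annual cost per patient: ${total.mean():,.0f} "
      f"(time lost ${productivity.mean():,.0f}, medical ${medical_cost.mean():,.0f})")
```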

  13. Improving financial performance by modeling and analysis of radiology procedure scheduling at a large community hospital.

    PubMed

    Lu, Lingbo; Li, Jingshan; Gisler, Paula

    2011-06-01

    Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payments by insurance companies due to lack of pre-approvals, inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in radiology outpatient test scheduling process to reduce financial losses associated with process error. This method of analysis is also applicable to other departments in the hospital.

  14. Consistency of QSAR models: Correct split of training and test sets, ranking of models and performance parameters.

    PubMed

    Rácz, A; Bajusz, D; Héberger, K

    2015-01-01

    Recent implementations of QSAR modelling software provide the user with numerous models and a wealth of information. In this work, we provide some guidance on how one should interpret the results of QSAR modelling, compare and assess the resulting models, and select the best and most consistent ones. Two QSAR datasets are applied as case studies for the comparison of model performance parameters and model selection methods. We demonstrate the capabilities of sum of ranking differences (SRD) in model selection and ranking, and identify the best performance indicators and models. While the exchange of the original training and (external) test sets does not affect the ranking of performance parameters, it provides improved models in certain cases (despite the lower number of molecules in the training set). Performance parameters for external validation are substantially separated from the other merits in SRD analyses, highlighting their value in data fusion.
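
    Sum of ranking differences (SRD) compares each column's ranking of the objects with a reference ranking (commonly the row average): the smaller the SRD, the closer that column is to the consensus. A minimal sketch (invented values, and omitting the randomization-based validation step usually paired with SRD):

```python
# Sum of ranking differences: compare each column's ranking of the objects
# (rows) with the ranking implied by the row-average reference.
import numpy as np
from scipy.stats import rankdata

# rows = models being ranked, columns = performance parameters (invented values)
values = np.array([
    [0.81, 0.78, 0.83, 0.60],
    [0.74, 0.75, 0.70, 0.72],
    [0.90, 0.88, 0.91, 0.65],
    [0.66, 0.70, 0.64, 0.71],
    [0.85, 0.84, 0.86, 0.63],
])
ref_rank = rankdata(values.mean(axis=1))          # consensus (reference) ranking

for j in range(values.shape[1]):
    srd = np.abs(rankdata(values[:, j]) - ref_rank).sum()
    print(f"performance parameter {j}: SRD = {srd:.1f}")
```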

  15. Flexible multiply towpreg and method of production therefor

    NASA Technical Reports Server (NTRS)

    Muzzy, John D. (Inventor); Varughese, Babu (Inventor)

    1992-01-01

    This invention relates to an improved flexible towpreg and a method of production therefor. The improved flexible towpreg comprises a plurality of towpreg plies which comprise reinforcing filaments and matrix forming material; the reinforcing filaments being substantially wetout by the matrix forming material such that the towpreg plies are substantially void-free composite articles, and the towpreg plies having an average thickness less than about 100 microns. The method of production for the improved flexible towpreg comprises the steps of spreading the reinforcing filaments to expose individually substantially all of the reinforcing filaments; coating the reinforcing filaments with the matrix forming material in a manner causing interfacial adhesion of the matrix forming material to the reinforcing filaments; forming the towpreg plies by heating the matrix forming material contacting the reinforcing filaments until the matrix forming material liquefies and coats the reinforcing filaments; and cooling the towpreg plies in a manner such that substantial cohesion between neighboring towpreg plies is prevented until the matrix forming material solidifies.

  16. Flexible multiply towpreg

    NASA Technical Reports Server (NTRS)

    Muzzy, John D. (Inventor); Varughese, Babu (Inventor)

    1992-01-01

    This invention relates to an improved flexible towpreg and a method of production therefor. The improved flexible towpreg comprises a plurality of towpreg plies which comprise reinforcing filaments and matrix forming material; the reinforcing filaments being substantially wetout by the matrix forming material such that the towpreg plies are substantially void-free composite articles, and the towpreg plies having an average thickness less than about 100 microns. The method of production for the improved flexible towpreg comprises the steps of spreading the reinforcing filaments to expose individually substantially all of the reinforcing filaments; coating the reinforcing filaments with the matrix forming material in a manner causing interfacial adhesion of the matrix forming material to the reinforcing filaments; forming the towpreg plies by heating the matrix forming material contacting the reinforcing filaments until the matrix forming material liquefies and coats the reinforcing filaments; and cooling the towpreg plies in a manner such that substantial cohesion between neighboring towpreg plies is prevented until the matrix forming material solidifies.

  17. Integrating modeling, monitoring, and management to reduce critical uncertainties in water resource decision making.

    PubMed

    Peterson, James T; Freeman, Mary C

    2016-12-01

    Stream ecosystems provide multiple, valued services to society, including water supply, waste assimilation, recreation, and habitat for diverse and productive biological communities. Managers striving to sustain these services in the face of changing climate, land uses, and water demands need tools to assess the potential effectiveness of alternative management actions, and often, the resulting tradeoffs between competing objectives. Integrating predictive modeling with monitoring data in an adaptive management framework provides a process by which managers can reduce model uncertainties and thus improve the scientific bases for subsequent decisions. We demonstrate an integration of monitoring data with a dynamic, metapopulation model developed to assess effects of streamflow alteration on fish occupancy in a southeastern US stream system. Although not extensive (collected over three years at nine sites), the monitoring data allowed us to assess and update support for alternative population dynamic models using model probabilities and Bayes rule. We then use the updated model weights to estimate the effects of water withdrawal on stream fish communities and demonstrate how feedback in the form of monitoring data can be used to improve water resource decision making. We conclude that investment in more strategic monitoring, guided by a priori model predictions under alternative hypotheses and an adaptive sampling design, could substantially improve the information available to guide decision-making and management for ecosystem services from lotic systems. Published by Elsevier Ltd.
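
    The Bayes-rule model-weight update at the heart of this adaptive framework can be illustrated with a minimal sketch; the prior weights and data likelihoods below are invented placeholders rather than values from the metapopulation analysis.

```python
# Minimal sketch of updating competing model weights with Bayes' rule once
# monitoring data arrive. Priors and likelihoods are illustrative only.
prior = {"model_A": 0.5, "model_B": 0.3, "model_C": 0.2}

# Likelihood of the observed occupancy data under each candidate model
# (in practice obtained by fitting each population dynamic model).
likelihood = {"model_A": 0.010, "model_B": 0.025, "model_C": 0.004}

evidence = sum(prior[m] * likelihood[m] for m in prior)
posterior = {m: prior[m] * likelihood[m] / evidence for m in prior}

print(posterior)  # updated weights used to inform the next management decision
```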

  18. The Logic of Collective Rating

    NASA Astrophysics Data System (ADS)

    Nax, Heinrich

    2016-05-01

    The introduction of participatory rating mechanisms on online sales platforms has had a substantial impact on firms' sales and profits. In this note, we develop a dynamic model of consumer influences on ratings and of rating influences on consumers, focusing on standard 5-star mechanisms as implemented by many platforms. The key components of our social influence model are consumer trust in the `wisdom of crowds' during the purchase phase and indirect reciprocity during the rating decision. Our model provides an overarching explanation for well-corroborated empirical regularities. We quantify the performance of the voluntary rating mechanism in terms of realized consumer surplus relative to the no-mechanism and full-information benchmarks, and identify how it could be improved.

  19. Weather models as virtual sensors to data-driven rainfall predictions in urban watersheds

    NASA Astrophysics Data System (ADS)

    Cozzi, Lorenzo; Galelli, Stefano; Pascal, Samuel Jolivet De Marc; Castelletti, Andrea

    2013-04-01

    Weather and climate predictions are a key element of urban hydrology, where they are used to inform water management and assist in flood warning delivery. Indeed, the modelling of the very fast dynamics of urbanized catchments can be substantially improved by the use of weather/rainfall predictions. For example, in the Singapore Marina Reservoir catchment, runoff processes have a very short time of concentration (roughly one hour); observational data are thus nearly useless for runoff predictions, and weather predictions are required. Unfortunately, radar nowcasting methods do not allow long-term weather predictions to be carried out, whereas numerical models are limited by their coarse spatial scale. Moreover, numerical models are often unreliable because of the fast motion and limited spatial extension of rainfall events. In this study we investigate the combined use of data-driven modelling techniques and weather variables observed/simulated with a numerical model as a way to improve rainfall prediction accuracy and lead time in the Singapore metropolitan area. To explore the feasibility of the approach, we use a Weather Research and Forecasting (WRF) model as a virtual sensor network for the input variables (the states of the WRF model) to a machine learning rainfall prediction model. More precisely, we combine an input variable selection method and a non-parametric tree-based model to characterize the empirical relation between the rainfall measured at the catchment level and all candidate weather input variables provided by the WRF model. We explore different lead times to evaluate model reliability for longer-term predictions, as well as different time lags to see how past information can improve results. Results show that the proposed approach yields a significant improvement in prediction accuracy over the WRF model for the Singapore urban area.
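
    A hedged sketch of this workflow is given below: treat simulated weather fields as candidate inputs, fit a tree-based model to observed rainfall, and keep only the most informative inputs. The variables, lags, and data are placeholders, not the actual WRF states or catchment observations used in the study.

```python
# Sketch of the general workflow: simulated weather fields act as "virtual
# sensors" (candidate inputs) and a tree-based model is fit to observed
# rainfall; feature importances drive the input variable selection.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 6))   # e.g. humidity, CAPE, winds at several lags (placeholders)
y = 2.0 * X[:, 1] - 1.5 * X[:, 4] + rng.normal(scale=0.5, size=n)  # rainfall proxy

model = ExtraTreesRegressor(n_estimators=300, random_state=0).fit(X, y)

# Rank candidate inputs; keep only the most informative ones for the final model
importance = model.feature_importances_
selected = np.argsort(importance)[::-1][:3]
print("selected input columns:", selected)
```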

  20. Generalized squeezing rotating-wave approximation to the isotropic and anisotropic Rabi model in the ultrastrong-coupling regime

    NASA Astrophysics Data System (ADS)

    Zhang, Yu-Yu

    2016-12-01

    A generalized squeezing rotating-wave approximation (GSRWA) is proposed by employing both displacement and squeezing transformations. The Hamiltonian is reformulated into a solvable form analogous to the ordinary RWA. For experiments with a qubit coupled to an oscillator, a well-defined Schrödinger-cat-like entangled state is given by the displaced-squeezed oscillator state instead of the original displaced state. For the isotropic Rabi case, the mean photon number and the ground-state energy are expressed analytically with additional squeezing terms, exhibiting a substantial improvement over previous approximations; the ground-state energy in the anisotropic Rabi model likewise confirms the effectiveness of the GSRWA. Owing to the squeezing effect, the GSRWA improves on previous methods that use only the displacement transformation over a wide range of coupling strengths, even for large atom frequency.
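
    For orientation only, the following schematic equations show the standard quantum Rabi Hamiltonian and the combined displacement-squeezing transformation that the GSRWA builds on; sign and parameter conventions vary between papers, and the exact transformation used in this work may differ in detail.

```latex
% Schematic only: standard isotropic quantum Rabi Hamiltonian and the
% combined displacement-squeezing transformation (conventions vary).
\begin{align}
  H_{\mathrm{Rabi}} &= \omega\, a^{\dagger}a + \frac{\Delta}{2}\,\sigma_z
                       + g\,\sigma_x\,(a^{\dagger} + a),\\
  D(\alpha) &= e^{\alpha\,(a^{\dagger}-a)}, \qquad
  S(r) = e^{\frac{r}{2}\,(a^{2}-a^{\dagger 2})}, \qquad \alpha, r \in \mathbb{R},\\
  H' &= S^{\dagger}(r)\,D^{\dagger}(\alpha)\,H_{\mathrm{Rabi}}\,D(\alpha)\,S(r)
  \quad\text{(then cast into an RWA-like solvable form).}
\end{align}
```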

  1. Effects of Condensation on Peri-implant Bone Density and Remodeling

    PubMed Central

    Wang, L.; Wu, Y.; Perez, K.C.; Hyman, S.; Brunski, J.B.; Tulu, U.; Bao, C.; Salmon, B.; Helms, J.A.

    2017-01-01

    Bone condensation is thought to densify interfacial bone and thus improve implant primary stability, but scant data substantiate either claim. We developed a murine oral implant model to test these hypotheses. Osteotomies were created in healed maxillary extraction sites 1) by drilling or 2) by drilling followed by stepwise condensation with tapered osteotomes. Condensation increased interfacial bone density, as measured by a significant change in bone volume/total volume and trabecular spacing, but it simultaneously damaged the bone. On postimplant day 1, the condensed bone interface exhibited microfractures and osteoclast activity. Finite element modeling, mechanical testing, and immunohistochemical analyses at multiple time points throughout the osseointegration period demonstrated that condensation caused very high interfacial strains, marginal bone resorption, and no improvement in implant stability. Collectively, these multiscale analyses demonstrate that condensation does not positively contribute to implant stability. PMID:28048963

  2. Effects of Condensation on Peri-implant Bone Density and Remodeling.

    PubMed

    Wang, L; Wu, Y; Perez, K C; Hyman, S; Brunski, J B; Tulu, U; Bao, C; Salmon, B; Helms, J A

    2017-04-01

    Bone condensation is thought to densify interfacial bone and thus improve implant primary stability, but scant data substantiate either claim. We developed a murine oral implant model to test these hypotheses. Osteotomies were created in healed maxillary extraction sites 1) by drilling or 2) by drilling followed by stepwise condensation with tapered osteotomes. Condensation increased interfacial bone density, as measured by a significant change in bone volume/total volume and trabecular spacing, but it simultaneously damaged the bone. On postimplant day 1, the condensed bone interface exhibited microfractures and osteoclast activity. Finite element modeling, mechanical testing, and immunohistochemical analyses at multiple time points throughout the osseointegration period demonstrated that condensation caused very high interfacial strains, marginal bone resorption, and no improvement in implant stability. Collectively, these multiscale analyses demonstrate that condensation does not positively contribute to implant stability.

  3. Scale-adaptive compressive tracking with feature integration

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Li, Jicheng; Chen, Xiao; Li, Shuxin

    2016-05-01

    Numerous tracking-by-detection methods have been proposed for robust visual tracking, among which compressive tracking (CT) has obtained promising results. A scale-adaptive CT method based on multifeature integration is presented to improve the robustness and accuracy of CT. We introduce a keypoint-based model to achieve accurate scale estimation, which additionally gives a prior location of the target. Furthermore, exploiting the efficiency of a data-independent random projection matrix, multiple features are integrated into an effective appearance model used to construct a naïve Bayes classifier. Finally, an adaptive update scheme is proposed to update the classifier conservatively. Experiments on various challenging sequences demonstrate substantial improvements by the proposed tracker over CT and other state-of-the-art trackers in dealing with scale variation, abrupt motion, deformation, and illumination changes.
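
    The two ingredients named above, a sparse data-independent random projection and a naïve Bayes score over the compressed features, can be sketched as follows; the dimensions and Gaussian parameters are placeholders for illustration and do not reproduce the published tracker.

```python
# Minimal sketch: compress high-dimensional patch features with a sparse,
# data-independent random projection, then score them with a naive Bayes
# log-likelihood ratio (per-feature independence assumption).
import numpy as np

rng = np.random.default_rng(1)
d_high, d_low = 10_000, 50

# Very sparse random projection matrix with entries in {-1, 0, +1}
R = rng.choice([-1.0, 0.0, 1.0], size=(d_low, d_high), p=[0.05, 0.90, 0.05])

def compress(x):
    return R @ x                      # low-dimensional compressive features

def naive_bayes_score(v, mu_pos, sig_pos, mu_neg, sig_neg):
    """Sum of per-feature Gaussian log-likelihood ratios (target vs background)."""
    def log_gauss(z, mu, sig):
        return -0.5 * np.log(2 * np.pi * sig**2) - (z - mu) ** 2 / (2 * sig**2)
    return np.sum(log_gauss(v, mu_pos, sig_pos) - log_gauss(v, mu_neg, sig_neg))

x_patch = rng.normal(size=d_high)     # stand-in for a candidate patch's features
v = compress(x_patch)
score = naive_bayes_score(v, mu_pos=0.5, sig_pos=1.0, mu_neg=0.0, sig_neg=1.0)
print(score)                          # higher score -> more likely the target
```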

  4. Multi-Wheat-Model Ensemble Responses to Interannual Climate Variability

    NASA Technical Reports Server (NTRS)

    Ruane, Alex C.; Hudson, Nicholas I.; Asseng, Senthold; Camarrano, Davide; Ewert, Frank; Martre, Pierre; Boote, Kenneth J.; Thorburn, Peter J.; Aggarwal, Pramod K.; Angulo, Carlos

    2016-01-01

    We compare 27 wheat models' yield responses to interannual climate variability, analyzed at locations in Argentina, Australia, India, and The Netherlands as part of the Agricultural Model Intercomparison and Improvement Project (AgMIP) Wheat Pilot. Each model simulated 1981–2010 grain yield, and we evaluate results against the interannual variability of growing season temperature, precipitation, and solar radiation. The amount of information used for calibration has only a minor effect on most models' climate response, and even small multi-model ensembles prove beneficial. Wheat model clusters reveal common characteristics of yield response to climate; however, models rarely share the same cluster at all four sites, indicating substantial independence. Only a weak relationship (R² = 0.24) was found between the models' sensitivities to interannual temperature variability and their response to long-term warming, suggesting that additional processes differentiate climate change impacts from observed climate variability analogs and motivating continuing analysis and model development efforts.

  5. The Density of Mid-sized Kuiper Belt Objects from ALMA Thermal Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Michael E.; Butler, Bryan J.

    The densities of mid-sized Kuiper Belt objects (KBOs) are a key constraint in understanding the assembly of objects in the outer solar system. These objects are critical for understanding the currently unexplained transition from the smallest KBOs with densities lower than that of water, to the largest objects with significant rock content. Mapping this transition is made difficult by the uncertainties in the diameters of these objects, which maps into an even larger uncertainty in volume and thus density. The substantial collecting area of the Atacama Large Millimeter Array allows significantly more precise measurements of thermal emission from outer solar system objects and could potentially greatly improve the density measurements. Here we use new thermal observations of four objects with satellites to explore the improvements possible with millimeter data. We find that effects due to effective emissivity at millimeter wavelengths make it difficult to use the millimeter data directly to find diameters and thus volumes for these bodies. In addition, we find that when including the effects of model uncertainty, the true uncertainties on the sizes of outer solar system objects measured with radiometry are likely larger than those previously published. Substantial improvement in object sizes will likely require precise occultation measurements.

  6. Macroscopic Crosslinked Neat Carbon Nanotube Materials and CNT/Carbon Fiber Hybrid Composites: Supermolecular Structure and New Failure Mode Study

    DTIC Science & Technology

    2015-10-01

    Materials; CRC Press, 1997. (70) Zhang, Y.; Zheng, L.; Sun, G.; Zhan, Z.; Liao, K. Failure Mechanisms of Carbon Nanotube Fibers under Different...Buehler, M. J. Mesoscale Modeling of Mechanics of Carbon Nanotubes: Self-Assembly, Self-Folding, and Fracture. J. Mater. Res. 2006, 21 (11), 2855–2869...close surface contact between CNTs to substantially improve the load transfer and mechanical properties. We also revealed that extremely low

  7. Diffractive Higgs boson production at the Fermilab Tevatron and the CERN Large Hadron Collider.

    PubMed

    Enberg, R; Ingelman, G; Kissavos, A; Tîmneanu, N

    2002-08-19

    Improved possibilities to find the Higgs boson in diffractive events, having less hadronic activity, depend on whether the cross section is large enough. Based on the soft color interaction models that successfully describe diffractive hard scattering at DESY HERA and the Fermilab Tevatron, we find that only a few diffractive Higgs events may be produced at the Tevatron, but we predict a substantial rate at the CERN Large Hadron Collider.

  8. Integrated pyrolucite fluidized bed-membrane hybrid process for improved iron and manganese control in drinking water.

    PubMed

    Dashtban Kenari, Seyedeh Laleh; Barbeau, Benoit

    2017-04-15

    Newly developed ceramic membrane technologies offer numerous advantages over conventional polymeric membranes. This work proposes a new configuration, an integrated pyrolucite fluidized bed (PFB)-ceramic MF/UF hybrid process, for improved iron and manganese control in drinking water. A pilot-scale study was undertaken to evaluate the performance of this process with respect to iron and manganese control as well as membrane fouling. In addition, the fouling of commercially available ceramic membranes in a conventional preoxidation-MF/UF process was compared with the hybrid process configuration. In this regard, a series of experiments was conducted under different influent water quality and operating conditions. Fouling mechanisms and reversibility were analyzed using blocking law and resistance-in-series models. The results showed that the flux rate and the concentration of calcium and humic acids in the feed water have a substantial impact on the filtration behavior of both membranes. The model for constant-flux compressible cake formation described the rise in transmembrane pressure well. The compressibility of the filter cake increased substantially in the presence of 2 mg/L humic acids. The presence of calcium ions caused significant aggregation of manganese dioxide and humic acid, which severely impacted the extent of membrane fouling. The PFB pretreatment effectively alleviated membrane fouling by removing more than 75% and 95% of iron and manganese, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Future disability projections could be improved by connecting to the theory of a dynamic equilibrium.

    PubMed

    Klijs, Bart; Mackenbach, Johan P; Kunst, Anton E

    2011-04-01

    Projections of future trends in the burden of disability could be guided by models linking disability to life expectancy, such as the dynamic equilibrium theory. This article tests the key assumption of this theory that severe disability is associated with proximity to death, whereas mild disability is not. Using data from the GLOBE study (Gezondheid en Levensomstandigheden Bevolking Eindhoven en omstreken), the association of three levels of self-reported disabilities in activities of daily living with age and proximity to death was studied using logistic regression models. Regression estimates were used to estimate the number of life years with disability for life spans of 75 and 85 years. Odds ratios of 0.976 (not significant) for mild disability, 1.137 for moderate disability, and 1.231 for severe disability showed a stronger effect of proximity to death for more severe levels of disability. A 10-year increase of life span was estimated to result in a substantial expansion of mild disability (4.6 years) compared with a small expansion of moderate (0.7 years) and severe (0.9 years) disability. These findings support the theory of a dynamic equilibrium. Projections of the future burden of disability could be substantially improved by connecting to this theory and incorporating information on proximity to death. Copyright © 2011 Elsevier Inc. All rights reserved.
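
    The core of this analysis is a logistic regression of a disability indicator on age and time to death; the hedged sketch below fits such a model to simulated data, with coefficients and sample size chosen for illustration rather than taken from the GLOBE study.

```python
# Illustrative logistic regression of a severe-disability indicator on age
# and proximity to death. Data and coefficients are simulated placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
age = rng.uniform(55, 90, n)
years_to_death = rng.uniform(0, 15, n)

# Severe disability depends on proximity to death; mild disability would not.
logit = -8.0 + 0.08 * age - 0.20 * years_to_death
p = 1 / (1 + np.exp(-logit))
severe = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([age, years_to_death]))
fit = sm.Logit(severe, X).fit(disp=0)
print(np.exp(fit.params))   # odds ratios for age and years-to-death
```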

  10. The business case for patient safety.

    PubMed

    Hwang, Raymond W; Herndon, James H

    2007-04-01

    Recent trends have focused attention on improving patient safety in the United States healthcare system. Lapses in patient safety create undue, often preventable, morbidity. These include adverse drug events, adverse surgical events and nosocomial infections. From an organizational perspective, these events are both inefficient and expensive. Many safe practices and quality enhancing improvements, such as computer provider order entry, proper infection surveillance, telemedicine intensive care, and registered nurse staffing are in fact cost-effective. However, in order to fully achieve higher quality, better adverse event reporting and a culture of safety must first be developed. Increased provider recognition, models of success, public awareness and consumer demand are propelling improvements. As we will outline in this review of the current literature, the business case for patient safety is a compelling one, offering substantial economic incentives for achieving the necessary goal of improved patient outcomes.

  11. An improved prognostic model for stage T1a and T1b prostate cancer by assessments of cancer extent

    PubMed Central

    Rajab, Ramzi; Fisher, Gabrielle; Kattan, Michael W; Foster, Christopher S; Møller, Henrik; Oliver, Tim; Reuter, Victor; Scardino, Peter T; Cuzick, Jack; Berney, Daniel M

    2013-01-01

    Treatment decisions on prostate cancer diagnosed by trans-urethral resection of the prostate (TURP) are difficult. The current TNM staging system for pT1 prostate cancer has not been re-evaluated for 25 years. Our objective was to optimise the predictive power of tumor extent measurements in TURP specimens. A total of 914 patients diagnosed by TURP between 1990 and 1996 and managed conservatively were identified. The clinical end point was death from prostate cancer. Diagnostic serum prostate-specific antigen (PSA) and contemporary Gleason grading were available. Cancer extent was measured as the percentage of chips infiltrated by cancer. Death rates were compared by univariate and multivariate proportional hazards models, including baseline PSA and Gleason score. The percentage of positive chips was highly predictive of prostate cancer death when assessed as a continuous variable or as a grouped variable based on quintile, quartile, tertile, and median groupings. In the univariate model, the most informative variable was a four-group split (≤10%, >10–25%, >25–75%, and >75%) (HR = 2.08, 95% CI = 1.8–2.4, P < 0.0001). The same was true in a multivariate model (Δχ² (1 d.f.) = 15.0, P = 0.0001). The current cutoff used by TNM (≤5%) was sub-optimal (Δχ² (1 d.f.) = 4.8, P = 0.023). The current TNM staging therefore results in substantial loss of information. Staging by a four-group subdivision would substantially improve prognostication in patients with early stage disease and may also help refine management decisions in patients who would do well with conservative treatment. PMID:20834240
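
    As an illustration of the grouped analysis described above, the sketch below bins a percentage-of-positive-chips variable into the same four bands and fits a proportional hazards model with the lifelines library; all data are simulated placeholders, not the study cohort.

```python
# Sketch: group percentage-of-positive-chips into four bands and fit a
# Cox proportional hazards model. Data are simulated for illustration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 500
pct_chips = rng.uniform(0, 100, n)

# Four-group split: <=10%, >10-25%, >25-75%, >75%
group = pd.cut(pct_chips, bins=[0, 10, 25, 75, 100],
               labels=[0, 1, 2, 3], include_lowest=True).astype(int)

# Higher group -> shorter simulated survival time
time = rng.exponential(scale=10.0 / (1.0 + group.to_numpy()), size=n)
event = rng.binomial(1, 0.7, n)

df = pd.DataFrame({"group": group, "psa": rng.lognormal(1.5, 0.5, n),
                   "time": time, "event": event})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary()   # hazard ratios per extent group and per unit PSA
```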

  12. Global-scale combustion sources of organic aerosols: sensitivity to formation and removal mechanisms

    NASA Astrophysics Data System (ADS)

    Tsimpidi, Alexandra P.; Karydis, Vlassis A.; Pandis, Spyros N.; Lelieveld, Jos

    2017-06-01

    Organic compounds from combustion sources such as biomass burning and fossil fuel use are major contributors to the global atmospheric load of aerosols. We analyzed the sensitivity of model-predicted global-scale organic aerosols (OA) to parameters that control primary emissions, photochemical aging, and the scavenging efficiency of organic vapors. We used a computationally efficient module for the description of OA composition and evolution in the atmosphere (ORACLE) of the global chemistry-climate model EMAC (ECHAM/MESSy Atmospheric Chemistry). A global dataset of aerosol mass spectrometer (AMS) measurements was used to evaluate simulated primary (POA) and secondary (SOA) OA concentrations. Model results are sensitive to the emission rates of intermediate-volatility organic compounds (IVOCs) and POA. Assuming enhanced reactivity of semi-volatile organic compounds (SVOCs) and IVOCs with OH substantially improved the model performance for SOA. The use of a hybrid approach for the parameterization of the aging of IVOCs had a small effect on predicted SOA levels. The model performance improved by assuming that freshly emitted organic compounds are relatively hydrophobic and become increasingly hygroscopic due to oxidation.

  13. LiDAR based prediction of forest biomass using hierarchical models with spatially varying coefficients

    USGS Publications Warehouse

    Babcock, Chad; Finley, Andrew O.; Bradford, John B.; Kolka, Randall K.; Birdsey, Richard A.; Ryan, Michael G.

    2015-01-01

    Many studies and production inventory systems have shown the utility of coupling covariates derived from Light Detection and Ranging (LiDAR) data with forest variables measured on georeferenced inventory plots through regression models. The objective of this study was to propose and assess the use of a Bayesian hierarchical modeling framework that accommodates both residual spatial dependence and non-stationarity of model covariates through the introduction of spatial random effects. We explored this objective using four forest inventory datasets that are part of the North American Carbon Program, each comprising point-referenced measures of above-ground forest biomass and discrete LiDAR. For each dataset, we considered at least five regression model specifications of varying complexity. Models were assessed based on goodness of fit criteria and predictive performance using a 10-fold cross-validation procedure. Results showed that the addition of spatial random effects to the regression model intercept improved fit and predictive performance in the presence of substantial residual spatial dependence. Additionally, in some cases, allowing either some or all regression slope parameters to vary spatially, via the addition of spatial random effects, further improved model fit and predictive performance. In other instances, models showed improved fit but decreased predictive performance—indicating over-fitting and underscoring the need for cross-validation to assess predictive ability. The proposed Bayesian modeling framework provided access to pixel-level posterior predictive distributions that were useful for uncertainty mapping, diagnosing spatial extrapolation issues, revealing missing model covariates, and discovering locally significant parameters.

  14. Conformational dynamics of a crystalline protein from microsecond-scale molecular dynamics simulations and diffuse X-ray scattering.

    PubMed

    Wall, Michael E; Van Benschoten, Andrew H; Sauter, Nicholas K; Adams, Paul D; Fraser, James S; Terwilliger, Thomas C

    2014-12-16

    X-ray diffraction from protein crystals includes both sharply peaked Bragg reflections and diffuse intensity between the peaks. The information in Bragg scattering is limited to what is available in the mean electron density. The diffuse scattering arises from correlations in the electron density variations and therefore contains information about collective motions in proteins. Previous studies using molecular-dynamics (MD) simulations to model diffuse scattering have been hindered by insufficient sampling of the conformational ensemble. To overcome this issue, we have performed a 1.1-μs MD simulation of crystalline staphylococcal nuclease, providing 100-fold more sampling than previous studies. This simulation enables reproducible calculations of the diffuse intensity and predicts functionally important motions, including transitions among at least eight metastable states with different active-site geometries. The total diffuse intensity calculated using the MD model is highly correlated with the experimental data. In particular, there is excellent agreement for the isotropic component of the diffuse intensity, and substantial but weaker agreement for the anisotropic component. Decomposition of the MD model into protein and solvent components indicates that protein-solvent interactions contribute substantially to the overall diffuse intensity. We conclude that diffuse scattering can be used to validate predictions from MD simulations and can provide information to improve MD models of protein motions.

  15. Correcting for deformation in skin-based marker systems.

    PubMed

    Alexander, E J; Andriacchi, T P

    2001-03-01

    A new technique is described that reduces error due to skin movement artifact in the opto-electronic measurement of in vivo skeletal motion. This work builds on a previously described point cluster technique marker set and estimation algorithm by extending the transformation equations to the general deformation case using a set of activity-dependent deformation models. Skin deformation during activities of daily living is modeled as consisting of a functional form defined over the observation interval (the deformation model) plus additive noise (modeling error). The method is described as an interval deformation technique. The method was tested using simulation trials with systematic and random components of deformation error introduced into marker position vectors. The technique was found to substantially outperform methods that require rigid-body assumptions. The method was tested in vivo on a patient fitted with an external fixation device (Ilizarov). Simultaneous measurements from markers placed on the Ilizarov device (fixed to bone) were compared to measurements derived from skin-based markers. The interval deformation technique reduced the errors in limb segment pose estimates by 33% and 25% compared to the classic rigid-body technique for position and orientation, respectively. This newly developed method has demonstrated that by accounting for the changing shape of the limb segment, a substantial improvement in the estimates of in vivo skeletal movement can be achieved.

  16. Fundamental reform of payment for adult primary care: comprehensive payment for comprehensive care.

    PubMed

    Goroll, Allan H; Berenson, Robert A; Schoenbaum, Stephen C; Gardner, Laurence B

    2007-03-01

    Primary care is essential to the effective and efficient functioning of health care delivery systems, yet there is an impending crisis in the field due in part to a dysfunctional payment system. We present a fundamentally new model of payment for primary care, replacing encounter-based reimbursement with comprehensive payment for comprehensive care. Unlike former iterations of primary care capitation (which simply bundled inadequate fee-for-service payments), our comprehensive payment model represents new investment in adult primary care, with substantial increases in payment over current levels. The comprehensive payment is directed to practices to include support for the modern systems and teams essential to the delivery of comprehensive, coordinated care. Income to primary physicians is increased commensurate with the high level of responsibility expected. To ensure optimal allocation of resources and the rewarding of desired outcomes, the comprehensive payment is needs/risk-adjusted and performance-based. Our model establishes a new social contract with the primary care community, substantially increasing payment in return for achieving important societal health system goals, including improved accessibility, quality, safety, and efficiency. Attainment of these goals should help offset and justify the costs of the investment. Field tests of this and other new models of payment for primary care are urgently needed.

  17. Fundamental Reform of Payment for Adult Primary Care: Comprehensive Payment for Comprehensive Care

    PubMed Central

    Berenson, Robert A.; Schoenbaum, Stephen C.; Gardner, Laurence B.

    2007-01-01

    Primary care is essential to the effective and efficient functioning of health care delivery systems, yet there is an impending crisis in the field due in part to a dysfunctional payment system. We present a fundamentally new model of payment for primary care, replacing encounter-based reimbursement with comprehensive payment for comprehensive care. Unlike former iterations of primary care capitation (which simply bundled inadequate fee-for-service payments), our comprehensive payment model represents new investment in adult primary care, with substantial increases in payment over current levels. The comprehensive payment is directed to practices to include support for the modern systems and teams essential to the delivery of comprehensive, coordinated care. Income to primary physicians is increased commensurate with the high level of responsibility expected. To ensure optimal allocation of resources and the rewarding of desired outcomes, the comprehensive payment is needs/risk-adjusted and performance-based. Our model establishes a new social contract with the primary care community, substantially increasing payment in return for achieving important societal health system goals, including improved accessibility, quality, safety, and efficiency. Attainment of these goals should help offset and justify the costs of the investment. Field tests of this and other new models of payment for primary care are urgently needed. PMID:17356977

  18. The Wind Forecast Improvement Project (WFIP). A Public-Private Partnership Addressing Wind Energy Forecast Needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilczak, James M.; Finley, Cathy; Freedman, Jeff

    The Wind Forecast Improvement Project (WFIP) is a public-private research program, the goals of which are to improve the accuracy of short-term (0-6 hr) wind power forecasts for the wind energy industry and then to quantify the economic savings that accrue from more efficient integration of wind energy into the electrical grid. WFIP was sponsored by the U.S. Department of Energy (DOE), with partners that include the National Oceanic and Atmospheric Administration (NOAA), private forecasting companies (WindLogics and AWS Truepower), DOE national laboratories, grid operators, and universities. WFIP employed two avenues for improving wind power forecasts: first, through the collection of special observations to be assimilated into forecast models to improve model initial conditions; and second, by upgrading NWP forecast models and ensembles. The new observations were collected during concurrent year-long field campaigns in two high wind energy resource areas of the U.S. (the upper Great Plains, and Texas), and included 12 wind profiling radars, 12 sodars, 184 instrumented tall towers and over 400 nacelle anemometers (provided by private industry), lidar, and several surface flux stations. Results demonstrate that a substantial improvement of up to 14% relative reduction in power root mean square error (RMSE) was achieved from the combination of improved NOAA numerical weather prediction (NWP) models and assimilation of the new observations. Data denial experiments run over select periods of time demonstrate that up to a 6% relative improvement came from the new observations. The use of ensemble forecasts produced even larger forecast improvements. Based on the success of WFIP, DOE is planning follow-on field programs.

  19. Advanced Hydrogen Liquefaction Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Joseph; Kromer, Brian; Neu, Ben

    2011-09-28

    The project identified and quantified ways to reduce the cost of hydrogen liquefaction, and reduce the cost of hydrogen distribution. The goal was to reduce the power consumption by 20% and then to reduce the capital cost. Optimizing the process, improving process equipment, and improving ortho-para conversion significantly reduced the power consumption of liquefaction, but by less than 20%. Because the efficiency improvement was less than the target, the program was stopped before the capital cost was addressed. These efficiency improvements could provide a benefit to the public to improve the design of future hydrogen liquefiers. The project increased the understanding of hydrogen liquefaction by modeling different processes and thoroughly examining ortho-para separation and conversion. The process modeling provided a benefit to the public because the project incorporated para hydrogen into the process modeling software, so liquefaction processes can be modeled more accurately than using only normal hydrogen. Adding catalyst to the first heat exchanger, a simple method to reduce liquefaction power, was identified, analyzed, and quantified. The demonstrated performance of ortho-para separation is sufficient for at least one identified process concept to show reduced power cost when compared to hydrogen liquefaction processes using conventional ortho-para conversion. The impact of improved ortho-para conversion can be significant because ortho-para conversion uses about 20-25% of the total liquefaction power, but performance improvement is necessary to realize a substantial benefit. Most of the energy used in liquefaction is for gas compression. Improvements in hydrogen compression will have a significant impact on overall liquefier efficiency. Improvements to turbines, heat exchangers, and other process equipment will have less impact.

  20. The importance of illumination in a non-contact photoplethysmography imaging system for burn wound assessment

    NASA Astrophysics Data System (ADS)

    Mo, Weirong; Mohan, Rachit; Li, Weizhi; Zhang, Xu; Sellke, Eric W.; Fan, Wensheng; DiMaio, J. Michael; Thatcher, Jeffery E.

    2015-02-01

    We present a non-contact, reflective photoplethysmogram (PPG) imaging method and a prototype system for identifying the presence of dermal burn wounds during burn debridement surgery. This system aims to assist clinicians and surgeons in dermal wound management and wound triage decisions. We examined the system variables of illumination uniformity and intensity and present our findings. An LED array, a tungsten light source, and eventually high-power LED emitters were studied as illumination methods for our PPG imaging device. These three illumination sources were tested in a controlled tissue phantom model and an animal burn model. We found that the low heat and even illumination pattern of the high-power LED emitters provided a substantial improvement in the collected PPG signal in our animal burn model. These improvements allow the PPG signals from different pixels to be compared in both the time domain and the frequency domain, reduce the complexity of the illumination subsystem, and remove the need for high-dynamic-range cameras. Comparison of model outputs, such as blood volume, between the animal burn data and the controlled tissue phantom model shows that these optical improvements yield more clinically applicable images to aid in burn assessment.

  1. Root architecture simulation improves the inference from seedling root phenotyping towards mature root systems.

    PubMed

    Zhao, Jiangsan; Bodner, Gernot; Rewald, Boris; Leitner, Daniel; Nagel, Kerstin A; Nakhforoosh, Alireza

    2017-02-01

    Root phenotyping provides trait information for plant breeding. A shortcoming of high-throughput root phenotyping is the limitation to seedling plants and failure to make inferences on mature root systems. We suggest root system architecture (RSA) models to predict mature root traits and overcome the inference problem. Sixteen pea genotypes were phenotyped in (i) seedling (Petri dishes) and (ii) mature (sand-filled columns) root phenotyping platforms. The RSA model RootBox was parameterized with seedling traits to simulate the fully developed root systems. Measured and modelled root length, first-order lateral number, and root distribution were compared to determine key traits for model-based prediction. No direct relationship in root traits (tap, lateral length, interbranch distance) was evident between phenotyping systems. RootBox significantly improved the inference over phenotyping platforms. Seedling plant tap and lateral root elongation rates and interbranch distance were sufficient model parameters to predict genotype ranking in total root length with a Spearman rank correlation of 0.83. Parameterization including uneven lateral spacing via a scaling function substantially improved the prediction of architectures underlying the differently sized root systems. We conclude that RSA models can solve the inference problem of seedling root phenotyping. RSA models should be included in the phenotyping pipeline to provide reliable information on mature root systems to breeding research. © The Author 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology.
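
    The forward-simulation idea, running seedling-derived parameters out to a mature root system, can be illustrated with a toy calculation; the function below is a highly simplified stand-in for RootBox, and all parameter values are invented.

```python
# Toy forward simulation: predict mature total root length from seedling
# parameters (tap and lateral elongation rates, interbranch distance).
# This is not RootBox; it only illustrates the inference step.
def total_root_length(days, tap_rate_cm_d, lat_rate_cm_d, interbranch_cm, lat_delay_d=2.0):
    tap_length = tap_rate_cm_d * days
    n_laterals = int(tap_length / interbranch_cm)      # one lateral per branching site
    lateral_length = 0.0
    for k in range(n_laterals):
        # a lateral starts growing once the tap root reaches its branching site
        birth_day = (k + 1) * interbranch_cm / tap_rate_cm_d + lat_delay_d
        grow_days = max(0.0, days - birth_day)
        lateral_length += lat_rate_cm_d * grow_days
    return tap_length + lateral_length

# e.g. rank genotypes by predicted mature total root length
print(total_root_length(days=40, tap_rate_cm_d=1.2, lat_rate_cm_d=0.4, interbranch_cm=0.8))
```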

  2. A whole school approach: collaborative development of school health policies, processes, and practices.

    PubMed

    Hunt, Pete; Barrios, Lisa; Telljohann, Susan K; Mazyck, Donna

    2015-11-01

    The Whole School, Whole Community, Whole Child (WSCC) model shows the interrelationship between health and learning and the potential for improving educational outcomes by improving health outcomes. However, current descriptions do not explain how to implement the model. The existing literature, including scientific articles, programmatic guidance, and publications by national agencies and organizations, was reviewed and synthesized to provide an overview of the interrelatedness of learning and health and the 10 components of the WSCC model. The literature suggests potential benefits of applying the WSCC model at the district and school level. However, the model lacks specific guidance on how it can be made actionable. A collaborative approach to health and learning is suggested, including a 10-step systematic process to help schools and districts develop an action plan for improving health and education outcomes. Essential preliminary actions are suggested to minimize the impact of the challenges that commonly derail systematic planning processes and program implementation, such as lack of readiness, personnel shortages, insufficient resources, and competing priorities. All new models require testing and evidence to confirm their value. Districts and schools will need to test this model and put plans into action to show that significant, substantial, and sustainable health and academic outcomes can be achieved. © 2015 The Authors. Journal of School Health published by Wiley Periodicals, Inc. on behalf of American School Health Association.

  3. Methods for using groundwater model predictions to guide hydrogeologic data collection, with application to the Death Valley regional groundwater flow system

    USGS Publications Warehouse

    Tiedeman, C.R.; Hill, M.C.; D'Agnese, F. A.; Faunt, C.C.

    2003-01-01

    Calibrated models of groundwater systems can provide substantial information for guiding data collection. This work considers using such models to guide hydrogeologic data collection for improving model predictions by identifying model parameters that are most important to the predictions. Identification of these important parameters can help guide collection of field data about parameter values and associated flow system features and can lead to improved predictions. Methods for identifying parameters important to predictions include prediction scaled sensitivities (PSS), which account for uncertainty on individual parameters as well as prediction sensitivity to parameters, and a new "value of improved information" (VOII) method presented here, which includes the effects of parameter correlation in addition to individual parameter uncertainty and prediction sensitivity. In this work, the PSS and VOII methods are demonstrated and evaluated using a model of the Death Valley regional groundwater flow system. The predictions of interest are advective transport paths originating at sites of past underground nuclear testing. Results show that for two paths evaluated the most important parameters include a subset of five or six of the 23 defined model parameters. Some of the parameters identified as most important are associated with flow system attributes that do not lie in the immediate vicinity of the paths. Results also indicate that the PSS and VOII methods can identify different important parameters. Because the methods emphasize somewhat different criteria for parameter importance, it is suggested that parameters identified by both methods be carefully considered in subsequent data collection efforts aimed at improving model predictions.
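
    One way to make the PSS idea concrete is a finite-difference calculation: scale each prediction sensitivity by the parameter's standard deviation and express it as a percentage of the prediction. The sketch below uses a toy prediction function and invented parameter values; it follows one common scaling and is not the exact formulation applied to the Death Valley model.

```python
# Finite-difference sketch of prediction scaled sensitivities (PSS):
# percentage change in a prediction for a one-standard-deviation change
# in each parameter. The "model" here is a toy stand-in.
import numpy as np

def predict(params):
    """Toy stand-in for an advective-transport prediction from a flow model."""
    k_horiz, k_vert, recharge = params
    return 1000.0 / (k_horiz * 0.8 + k_vert * 0.1 + recharge * 2.0)

params = np.array([5.0, 0.5, 1.2])    # calibrated parameter values (illustrative)
sd     = np.array([1.0, 0.2, 0.3])    # parameter standard deviations (illustrative)

base = predict(params)
pss = np.empty_like(params)
for j in range(len(params)):
    perturbed = params.copy()
    perturbed[j] += 0.01 * params[j]                      # small relative perturbation
    dy_dp = (predict(perturbed) - base) / (0.01 * params[j])
    pss[j] = 100.0 * abs(dy_dp) * sd[j] / base            # % change per 1-sd parameter change

print(dict(zip(["k_horiz", "k_vert", "recharge"], pss.round(2))))
```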

  4. Rhombohedral cubic semiconductor materials on trigonal substrate with single crystal properties and devices based on such materials

    NASA Technical Reports Server (NTRS)

    Park, Yeonjoon (Inventor); Choi, Sang Hyouk (Inventor); King, Glen C. (Inventor); Elliott, James R. (Inventor)

    2012-01-01

    Growth conditions are developed, based on a temperature-dependent alignment model, to enable formation of cubic group IV, group III-V, and group II-VI crystals in the [111] orientation on the basal (0001) plane of trigonal crystal substrates, controlled such that the volume percentage of the primary twin crystal is reduced from about 40% to about 0.3% relative to the majority single crystal. The control of stacking faults in this and other embodiments can yield single-crystalline semiconductors based on these materials that are substantially defect-free, or improved thermoelectric materials with twinned crystals for phonon scattering while maintaining electrical integrity. These methods can selectively yield a cubic-on-trigonal epitaxial semiconductor material in which the cubic layer is either substantially aligned with, or rotated 60 degrees from, the underlying trigonal material.

  5. Time-series analysis of ruminant foetal wastage at a slaughterhouse in North Central Nigeria between 2001 and 2012.

    PubMed

    Alhaji, Nma B; Odetokun, Ismail A; Shittu, Aminu; Onyango, Joshua; Chafe, Umar M; Abubakar, Muhammed S; Muraina, Issa A; Fasina, Folorunso O; Lee, Hu Suk

    2015-12-15

    In developing countries, foetal wastage from slaughtered ruminants and the associated economic losses appear to be substantial. However, only a limited number of studies have comprehensively evaluated these trends. In the current study, secondary (retrospective) and primary data were collected and evaluated to estimate the prevalence of foetal wastage from cattle, sheep and goats slaughtered at an abattoir in Minna, Nigeria, over a 12-year period (January 2001-December 2012). Time-series modelling revealed substantial differences in the rate of foetal wastage amongst the slaughtered species, with more lambs having been wasted than calves or kids. Seasonal effects seem to influence rates of foetal wastage and certain months in the year appear to be associated with higher odds of foetal wastage. Improved management systems are suggested to reduce the risk of foetal losses.

  6. NASA Stennis Space Center integrated system health management test bed and development capabilities

    NASA Astrophysics Data System (ADS)

    Figueroa, Fernando; Holland, Randy; Coote, David

    2006-05-01

    Integrated System Health Management (ISHM) capability for rocket propulsion testing is rapidly evolving and promises substantial reductions in the time and cost of propulsion system development, with substantially reduced operational costs and evolutionary improvements in launch system operational robustness. NASA Stennis Space Center (SSC), along with partners from NASA, contractors, and academia, is investigating and developing technologies to enable ISHM capability in SSC's rocket engine test stands (RETS). This will enable validation and experience capture over a broad range of rocket propulsion systems of varying complexity. This paper describes key components needed to implement credible ISHM capability in RETS, other NASA ground test and operations facilities, and ultimately spacecraft, space platforms, and systems: (1) core technologies for ISHM, (2) RETS as ISHM testbeds, and (3) RETS systems models.

  7. The Economic Impacts of Climate Change on Agriculture: New Damage Functions from a Meta-Analysis and the GGCMI

    NASA Astrophysics Data System (ADS)

    Moore, F. C.; Baldos, U. L. C.; Hertel, T. W.; Diaz, D.

    2016-12-01

    Substantial advances have been made in recent years in understanding the effects of climate change on agriculture, but this is not currently represented in economic models used to quantify the benefits of reducing greenhouse gas emissions. In fact, the science regarding climate change impacts on agriculture in these models dates to the early 1990s or before. In this paper we derive new economic damage functions for the agricultural sector based on two methods for aggregating current scientific understanding of the impacts of warming on yields. We first present a new meta-analysis based on a review of the agronomic literature performed for the IPCC 5th Assessment Report and compare results from this approach with findings from the AgMIP Global Gridded Crop Model Intercomparison (GGCMI). We find yield impacts implied by the meta-analysis are generally more negative than those from the GGCMI, particularly at higher latitudes, but show substantial agreement in many areas. We then use both yield products as input to the Global Trade Analysis Project (GTAP) computable general equilibrium (CGE) model in order to estimate the welfare consequences of these yield shocks and to produce two new economic damage functions. These damage functions are consistently more negative than the current representation of agricultural damages in Integrated Assessment Models (IAMs), in some cases substantially so. Replacing the existing damage functions with those based on more recent science increases the social cost of carbon (SCC) by between 43% (GGCMI) and 143% (Meta-Analysis). In addition to presenting a new multi-crop, multi-model gridded yield impact product that complements the GGCMI, this is also the first end-to-end study that directly links the biophysical impacts of climate change to the SCC, something we believe essential to improving the integrity of IAMs going forward.

  8. Effects of New Funding Models for Patient-Centered Medical Homes on Primary Care Practice Finances and Services: Results of a Microsimulation Model.

    PubMed

    Basu, Sanjay; Phillips, Russell S; Song, Zirui; Landon, Bruce E; Bitton, Asaf

    2016-09-01

    We assess the financial implications for primary care practices of participating in patient-centered medical home (PCMH) funding initiatives. We estimated practices' changes in net revenue under 3 PCMH funding initiatives: increased fee-for-service (FFS) payments, traditional FFS with additional per-member-per-month (PMPM) payments, or traditional FFS with PMPM and pay-for-performance (P4P) payments. Net revenue estimates were based on a validated microsimulation model utilizing national practice surveys. Simulated practices reflecting the national range of practice size, location, and patient population were examined under several potential changes in clinical services: investments in patient tracking, communications, and quality improvement; increased support staff; altered visit templates to accommodate longer visits, telephone visits or electronic visits; and extended service delivery hours. Under the status quo of traditional FFS payments, clinics operate near their maximum estimated possible net revenue levels, suggesting they respond strongly to existing financial incentives. Practices gained substantial additional net annual revenue per full-time physician under PMPM or PMPM plus P4P payments ($113,300 per year, 95% CI, $28,500 to $198,200) but not under increased FFS payments (-$53,500, 95% CI, -$69,700 to -$37,200), after accounting for costs of meeting PCMH funding requirements. Expanding services beyond minimum required levels decreased net revenue, because traditional FFS revenues decreased. PCMH funding through PMPM payments could substantially improve practice finances but will not offer sufficient financial incentives to expand services beyond minimum requirements for PCMH funding. © 2016 Annals of Family Medicine, Inc.
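
    The payment comparison can be illustrated with back-of-the-envelope arithmetic; the panel size, visit margin, PMPM rate, and PCMH cost below are invented assumptions, not outputs of the validated microsimulation model.

```python
# Back-of-the-envelope sketch: net revenue per physician under traditional
# FFS versus FFS plus a per-member-per-month (PMPM) payment, after
# PCMH-related costs. All figures are illustrative assumptions.
panel_size = 2000                 # patients per full-time physician
visits_per_patient = 2.5
ffs_margin_per_visit = 40.0       # net revenue per visit under traditional FFS
pmpm_payment = 6.0                # dollars per member per month
pcmh_annual_cost = 60_000.0       # staffing/IT cost of meeting PCMH requirements

ffs_only = panel_size * visits_per_patient * ffs_margin_per_visit
with_pmpm = ffs_only + panel_size * pmpm_payment * 12 - pcmh_annual_cost

print(f"FFS only:   ${ffs_only:,.0f}")
print(f"FFS + PMPM: ${with_pmpm:,.0f}")
print(f"Difference: ${with_pmpm - ffs_only:,.0f}")
```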

  9. Effects of New Funding Models for Patient-Centered Medical Homes on Primary Care Practice Finances and Services: Results of a Microsimulation Model

    PubMed Central

    Basu, Sanjay; Phillips, Russell S.; Song, Zirui; Landon, Bruce E.; Bitton, Asaf

    2016-01-01

    PURPOSE We assess the financial implications for primary care practices of participating in patient-centered medical home (PCMH) funding initiatives. METHODS We estimated practices’ changes in net revenue under 3 PCMH funding initiatives: increased fee-for-service (FFS) payments, traditional FFS with additional per-member-per-month (PMPM) payments, or traditional FFS with PMPM and pay-for-performance (P4P) payments. Net revenue estimates were based on a validated microsimulation model utilizing national practice surveys. Simulated practices reflecting the national range of practice size, location, and patient population were examined under several potential changes in clinical services: investments in patient tracking, communications, and quality improvement; increased support staff; altered visit templates to accommodate longer visits, telephone visits or electronic visits; and extended service delivery hours. RESULTS Under the status quo of traditional FFS payments, clinics operate near their maximum estimated possible net revenue levels, suggesting they respond strongly to existing financial incentives. Practices gained substantial additional net annual revenue per full-time physician under PMPM or PMPM plus P4P payments ($113,300 per year, 95% CI, $28,500 to $198,200) but not under increased FFS payments (−$53,500, 95% CI, −$69,700 to −$37,200), after accounting for costs of meeting PCMH funding requirements. Expanding services beyond minimum required levels decreased net revenue, because traditional FFS revenues decreased. CONCLUSIONS PCMH funding through PMPM payments could substantially improve practice finances but will not offer sufficient financial incentives to expand services beyond minimum requirements for PCMH funding. PMID:27621156

  10. Adaptive estimation of state of charge and capacity with online identified battery model for vanadium redox flow battery

    NASA Astrophysics Data System (ADS)

    Wei, Zhongbao; Tseng, King Jet; Wai, Nyunt; Lim, Tuti Mariana; Skyllas-Kazacos, Maria

    2016-11-01

    Reliable state estimation depends largely on an accurate battery model. However, the parameters of the battery model vary over time with operating conditions and battery aging. Existing co-estimation methods address this model uncertainty by integrating online model identification with state estimation and have shown improved accuracy. However, cross interference may arise from the integrated framework and compromise numerical stability and accuracy. This paper therefore proposes decoupling model identification from state estimation to eliminate the possibility of cross interference. The model parameters are adapted online with the recursive least squares (RLS) method, and a novel joint estimator based on the extended Kalman filter (EKF) is formulated to estimate the state of charge (SOC) and capacity concurrently. The proposed joint estimator effectively compresses the filter order, which leads to substantial improvements in computational efficiency and numerical stability. Lab-scale experiments on a vanadium redox flow battery show that the proposed method is accurate and robust to varying operating conditions and battery aging. The proposed method is further compared with existing methods and shown to be superior in terms of accuracy, convergence speed, and computational cost.
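
    The online identification step can be sketched with a generic recursive least squares update for a first-order equivalent-circuit model; the regressor layout, forgetting factor, and measurement values below are common choices for illustration, not the exact formulation or data from the paper.

```python
# Generic recursive least squares (RLS) sketch for online identification of
# a discrete first-order battery model v_k = a*v_{k-1} + b0*i_k + b1*i_{k-1}.
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One RLS step: theta = parameters, P = covariance, phi = regressor, y = measurement."""
    phi = phi.reshape(-1, 1)
    K = P @ phi / (lam + phi.T @ P @ phi)        # gain vector
    e = y - (phi.T @ theta).item()               # prediction error
    theta = theta + K * e
    P = (P - K @ phi.T @ P) / lam                # covariance update with forgetting
    return theta, P

theta = np.zeros((3, 1))          # [a, b0, b1]
P = np.eye(3) * 1e3
v_prev, i_prev = 3.6, 0.0         # illustrative voltage (V) and current (A) samples
for v_k, i_k in [(3.59, 1.0), (3.58, 1.0), (3.57, 1.0), (3.60, 0.0)]:
    phi = np.array([v_prev, i_k, i_prev])
    theta, P = rls_update(theta, P, phi, v_k)
    v_prev, i_prev = v_k, i_k

print(theta.ravel())   # updated model parameters passed to the EKF-based joint estimator
```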

  11. Use of empirical likelihood to calibrate auxiliary information in partly linear monotone regression models.

    PubMed

    Chen, Baojiang; Qin, Jing

    2014-05-10

    In statistical analysis, a regression model is needed when one is interested in the relationship between a response variable and covariates. The response may depend on a covariate through an unknown function; if the functional form is unknown but expected to be monotonically increasing or decreasing, the isotonic regression model is preferable. Estimation of parameters for isotonic regression models is based on the pool-adjacent-violators algorithm (PAVA), in which the monotonicity constraints are built in. With missing data, the augmented estimating method is often employed to improve estimation efficiency by incorporating auxiliary information through a working regression model. However, under the framework of the isotonic regression model, the PAVA does not work because the monotonicity constraints are violated. In this paper, we develop an empirical likelihood-based method for the isotonic regression model to incorporate the auxiliary information. Because the monotonicity constraints still hold, the PAVA can be used for parameter estimation. Simulation studies demonstrate that the proposed method can yield more efficient estimates, and in some situations the efficiency improvement is substantial. We apply this method to a dementia study. Copyright © 2013 John Wiley & Sons, Ltd.
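
    For readers unfamiliar with PAVA, a compact, equal-weight sketch of the algorithm is given below; it is a generic implementation for illustration, not the estimating-equation version developed in the paper.

```python
# Compact pool-adjacent-violators algorithm (PAVA) for isotonic
# (monotone non-decreasing) regression with equal weights.
def pava(y):
    """Return the non-decreasing fit of y by pooling adjacent violators."""
    blocks = [[v, 1] for v in y]          # [block mean, block size]
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] > blocks[i + 1][0]:          # violator found
            m1, n1 = blocks[i]
            m2, n2 = blocks[i + 1]
            blocks[i] = [(m1 * n1 + m2 * n2) / (n1 + n2), n1 + n2]
            del blocks[i + 1]
            i = max(i - 1, 0)             # the pooled block may violate its left neighbour
        else:
            i += 1
    fit = []
    for mean, size in blocks:
        fit.extend([mean] * size)
    return fit

print(pava([1.0, 3.0, 2.0, 2.0, 5.0, 4.0]))   # -> non-decreasing fitted values
```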

  12. Paleosecular variation and time-averaged field analysis over the last 10 Ma from a new global dataset (PSV10)

    NASA Astrophysics Data System (ADS)

    Cromwell, G.; Johnson, C. L.; Tauxe, L.; Constable, C.; Jarboe, N.

    2015-12-01

    Previous paleosecular variation (PSV) and time-averaged field (TAF) models draw on compilations of paleodirectional data that lack equatorial and high latitude sites and use latitudinal virtual geomagnetic pole (VGP) cutoffs designed to remove transitional field directions. We present a new selected global dataset (PSV10) of paleodirectional data spanning the last 10 Ma. We include all results calculated with modern laboratory methods, regardless of site VGP colatitude, that meet statistically derived selection criteria. We exclude studies that target transitional field states or identify significant tectonic effects, and correct for any bias from serial correlation by averaging directions from sequential lava flows. PSV10 has an improved global distribution compared with previous compilations, comprising 1519 sites from 71 studies. VGP dispersion in PSV10 varies with latitude, exhibiting substantially higher values in the southern hemisphere than at corresponding northern latitudes. Inclination anomaly estimates at many latitudes are within error of an expected GAD field, but significant negative anomalies are found at equatorial and mid-northern latitudes. Current PSV models, Model G and TK03, do not fit the observed PSV or TAF latitudinal behavior in PSV10, or in subsets of normal and reverse polarity data, particularly for southern hemisphere sites. Attempts to fit these observations with simple modifications to TK03 yielded slight statistical improvements, but the misfits still exceed acceptable errors. The root-mean-square misfit of TK03 (and subsequent iterations) is substantially lower for the normal polarity subset of PSV10 than for the reverse polarity data. Two-thirds of the data in PSV10 are of normal polarity, most of which are from the last 5 Ma, so we develop a new TAF model using this subset of data. We use the resulting TAF model to explore whether new statistical PSV models can better describe our new global compilation.

  13. Impact of cleaning and other interventions on the reduction of hospital-acquired Clostridium difficile infections in two hospitals in England assessed using a breakpoint model.

    PubMed

    Hughes, G J; Nickerson, E; Enoch, D A; Ahluwalia, J; Wilkinson, C; Ayers, R; Brown, N M

    2013-07-01

    Clostridium difficile infection remains a major challenge for hospitals. Although targeted infection control initiatives have been shown to be effective in reducing the incidence of hospital-acquired C. difficile infection, there is little evidence available to assess the effectiveness of specific interventions. To use statistical modelling to detect substantial reductions in the incidence of C. difficile from time series data from two hospitals in England, and relate these time points to infection control interventions. A statistical breakpoints model was fitted to likely hospital-acquired C. difficile infection incidence data from a teaching hospital (2002-2009) and a district general hospital (2005-2009) in England. Models with increasing complexity (i.e. increasing the number of breakpoints) were tested for an improved fit to the data. Partitions estimated from breakpoint models were tested for individual stability using statistical process control charts. Major infection control interventions from both hospitals during this time were grouped according to their primary target (antibiotics, cleaning, isolation, other) and mapped to the model-suggested breakpoints. For both hospitals, breakpoints coincided with enhancements to cleaning protocols. Statistical models enabled formal assessment of the impact of different interventions, and showed that enhancements to deep cleaning programmes are the interventions that have most likely led to substantial reductions in hospital-acquired C. difficile infections at the two hospitals studied. Copyright © 2013 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
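
    As a concrete illustration of the breakpoint idea (not the authors' exact model, which allowed multiple breakpoints and checked partition stability with control charts), a single change point in a monthly count series can be located by maximizing a piecewise-constant Poisson likelihood over candidate split points; the data below are simulated.

```python
# Illustrative single-breakpoint search for a count time series. The series is assumed
# to be piecewise Poisson with a constant mean before and after an unknown change point.
import math
import random

def poisson_loglik(counts):
    lam = sum(counts) / len(counts)
    if lam == 0:
        return 0.0
    return sum(c * math.log(lam) - lam - math.lgamma(c + 1) for c in counts)

def best_breakpoint(counts, min_seg=6):
    """Scan candidate split points and keep the one maximizing the two-segment likelihood."""
    best_t, best_ll = None, -math.inf
    for t in range(min_seg, len(counts) - min_seg):
        ll = poisson_loglik(counts[:t]) + poisson_loglik(counts[t:])
        if ll > best_ll:
            best_t, best_ll = t, ll
    return best_t, best_ll

# synthetic monthly counts: incidence drops after month 30
random.seed(1)
series = [random.randint(8, 16) for _ in range(30)] + [random.randint(2, 8) for _ in range(30)]
t_hat, _ = best_breakpoint(series)
print("estimated breakpoint at month", t_hat)
```

    Models with additional breakpoints can be compared against this one in the same way, penalizing the extra parameters (e.g., with an information criterion) before accepting a more complex partition.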

  14. Community of inquiry model: advancing distance learning in nurse anesthesia education.

    PubMed

    Pecka, Shannon L; Kotcherlakota, Suhasini; Berger, Ann M

    2014-06-01

    The number of distance education courses offered by nurse anesthesia programs has increased substantially. Emerging distance learning trends must be researched to ensure high-quality education for student registered nurse anesthetists. However, research to examine distance learning has been hampered by a lack of theoretical models. This article introduces the Community of Inquiry model for use in nurse anesthesia education. This model has been used for more than a decade to guide and research distance learning in higher education. A major strength of this model is its applicability to distance learning. However, it lacks applicability to the development of higher order thinking for student registered nurse anesthetists. Thus, a new derived Community of Inquiry model was designed to improve these students' higher order thinking in distance learning. The derived model integrates Bloom's revised taxonomy into the original Community of Inquiry model and provides a means to design, evaluate, and research higher order thinking in nurse anesthesia distance education courses.

  15. Approach to Modeling Boundary Layer Ingestion Using a Fully Coupled Propulsion-RANS Model

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Mader, Charles A.; Kenway, Gaetan K. W.; Martins, Joaquim R. R. A.

    2017-01-01

    Airframe-propulsion integration concepts that use boundary layer ingestion have the potential to reduce aircraft fuel burn. One recently explored concept is NASA's Starc-ABL aircraft configuration, which offers the potential for a 12% mission fuel burn reduction by using a turbo-electric propulsion system with an aft-mounted, electrically driven boundary layer ingestion propulsor. This large potential for improved performance motivates a more detailed study of the boundary layer ingestion propulsor design, but to date, analyses of boundary layer ingestion have used uncoupled methods. These methods account for only aerodynamic effects on the propulsion system or propulsion system effects on the aerodynamics, but not both simultaneously. This work presents a new approach for building fully coupled propulsive-aerodynamic models of boundary layer ingestion propulsion systems. A 1D thermodynamic cycle analysis is coupled to a RANS simulation to model the Starc-ABL aft propulsor at a cruise condition, and the effects of variation in propulsor design on performance are examined. The results indicate that propulsion and aerodynamic effects contribute equally to the overall performance and that the fully coupled model yields substantially different results than the uncoupled approach. The most significant finding is that boundary layer ingestion, while offering substantial fuel burn savings, introduces throttle-dependent aerodynamic effects that need to be accounted for. This work represents a first step toward the multidisciplinary design optimization of boundary layer ingestion propulsion systems.
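
    The coupling strategy can be pictured as a fixed-point iteration between the two disciplines: the cycle analysis supplies thrust for a given inflow, and the aerodynamic solution returns the inflow the propulsor actually sees. The sketch below illustrates only that pattern with toy placeholder functions; the function names, numbers, and response shapes are invented and are not NASA's implementation.

```python
# Illustrative sketch of propulsion-aerodynamics coupling for a BLI propulsor.
# Both disciplinary "solvers" below are toy placeholders; only the coupling pattern matters.

def cycle_model(inflow_velocity_ratio, shaft_power=2.5e6):
    """Toy 1-D propulsor model: thrust produced for a given ingested-inflow velocity ratio."""
    eta = 0.85 * inflow_velocity_ratio          # hypothetical propulsive efficiency trend
    v_jet = 230.0 * inflow_velocity_ratio + 40  # hypothetical jet velocity, m/s
    return eta * shaft_power / v_jet            # thrust, N

def aero_model(thrust):
    """Toy aerodynamic surrogate: more thrust thins the ingested boundary layer,
    raising the average inflow velocity ratio seen by the propulsor."""
    return 0.75 + 1.5e-6 * thrust               # hypothetical response

# fixed-point (successive substitution) coupling with under-relaxation
ratio, relax = 0.8, 0.5
for it in range(50):
    thrust = cycle_model(ratio)
    new_ratio = aero_model(thrust)
    if abs(new_ratio - ratio) < 1e-8:
        break
    ratio += relax * (new_ratio - ratio)

print(f"converged after {it} iterations: thrust = {thrust/1e3:.1f} kN, inflow ratio = {ratio:.3f}")
```

    An uncoupled analysis corresponds to evaluating each function once with assumed inputs; the coupled loop instead converges both quantities simultaneously, which is why the two approaches can disagree.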

  16. An observational radiative constraint on hydrologic cycle intensification

    DOE PAGES

    DeAngelis, Anthony M.; Qu, Xin; Zelinka, Mark D.; ...

    2015-12-09

    We report that intensification of the hydrologic cycle is a key dimension of climate change, with substantial impacts on human and natural systems. A basic measure of hydrologic cycle intensification is the increase in global-mean precipitation per unit surface warming, which varies by a factor of three in current-generation climate models (about 1–3 per cent per kelvin). Part of the uncertainty may originate from atmosphere–radiation interactions. As the climate warms, increases in shortwave absorption from atmospheric moistening will suppress the precipitation increase. This occurs through a reduction of the latent heating increase required to maintain a balanced atmospheric energy budget. Using an ensemble of climate models, here we show that such models tend to underestimate the sensitivity of solar absorption to variations in atmospheric water vapour, leading to an underestimation in the shortwave absorption increase and an overestimation in the precipitation increase. This sensitivity also varies considerably among models due to differences in radiative transfer parameterizations, explaining a substantial portion of model spread in the precipitation response. Consequently, attaining accurate shortwave absorption responses through improvements to the radiative transfer schemes could reduce the spread in the predicted global precipitation increase per degree warming for the end of the twenty-first century by about 35 per cent, and reduce the estimated ensemble-mean increase in this quantity by almost 40 per cent.

  17. Simulating carbon and water cycles of larch forests in East Asia by the BIOME-BGC model with AsiaFlux data

    NASA Astrophysics Data System (ADS)

    Ueyama, M.; Ichii, K.; Hirata, R.; Takagi, K.; Asanuma, J.; Machimura, T.; Nakai, Y.; Ohta, T.; Saigusa, N.; Takahashi, Y.; Hirano, T.

    2010-03-01

    Larch forests are widely distributed across many cool-temperate and boreal regions, and they are expected to play an important role in global carbon and water cycles. Model parameterizations for larch forests still contain large uncertainties owing to a lack of validation. In this study, a process-based terrestrial biosphere model, BIOME-BGC, was tested for larch forests at six AsiaFlux sites and used to identify important environmental factors that affect the carbon and water cycles at both temporal and spatial scales. The model simulation performed with the default deciduous conifer parameters produced results that differed substantially from the observed net ecosystem exchange (NEE), gross primary productivity (GPP), ecosystem respiration (RE), and evapotranspiration (ET). Therefore, we adjusted several model parameters in order to reproduce the observed rates of carbon and water cycle processes. This model calibration, performed using the AsiaFlux data, substantially improved the model performance. The simulated annual GPP, RE, NEE, and ET from the calibrated model were highly consistent with observed values. The observed and simulated GPP and RE across the six sites were positively correlated with the annual mean air temperature and annual total precipitation. On the other hand, the simulated carbon budget was partly explained by the stand disturbance history in addition to the climate. The sensitivity study indicated that spring warming enhanced the carbon sink, whereas summer warming decreased it across the larch forests. Summer radiation was the most important factor controlling the carbon fluxes at the temperate site, whereas VPD and water conditions were the limiting factors at the boreal sites. One model parameter, the allocation ratio of carbon between belowground and aboveground, was site-specific and was negatively correlated with annual mean air temperature and annual total precipitation. Although this study substantially improved the model performance, the remaining uncertainties in the sensitivity to water conditions should be examined with ongoing and long-term observations.

  18. Improving satellite-driven PM2.5 models with Moderate Resolution Imaging Spectroradiometer fire counts in the southeastern U.S.

    PubMed

    Hu, Xuefei; Waller, Lance A; Lyapustin, Alexei; Wang, Yujie; Liu, Yang

    2014-10-16

    Multiple studies have developed surface PM2.5 (particle size less than 2.5 µm in aerodynamic diameter) prediction models using satellite-derived aerosol optical depth as the primary predictor and meteorological and land use variables as secondary variables. To our knowledge, satellite-retrieved fire information has not been used for PM2.5 concentration prediction in statistical models. Fire data could be a useful predictor since fires are significant contributors of PM2.5. In this paper, we examined whether remotely sensed fire count data could improve PM2.5 prediction accuracy in the southeastern U.S. in a spatial statistical model setting. A sensitivity analysis showed that when the radius of the buffer zone centered at each PM2.5 monitoring site reached 75 km, fire count data generally have the greatest predictive power of PM2.5 across the models considered. Cross validation (CV) generated an R2 of 0.69, a mean prediction error of 2.75 µg/m3, and root-mean-square prediction errors (RMSPEs) of 4.29 µg/m3, indicating a good fit between the dependent and predictor variables. A comparison showed that the prediction accuracy was improved more substantially from the nonfire model to the fire model at sites with higher fire counts. With increasing fire counts, CV RMSPE decreased by values up to 1.5 µg/m3, exhibiting a maximum improvement of 13.4% in prediction accuracy. Fire count data were shown to have better performance in southern Georgia and in the spring season due to higher fire occurrence. Our findings indicate that fire count data provide a measurable improvement in PM2.5 concentration estimation, especially in areas and seasons prone to fire events.
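
    The buffer-radius sensitivity analysis described above can be mimicked in a few lines: count fire detections within a given radius of each monitor, add the count as a predictor, and compare cross-validated errors with and without it. Everything below (domain, coordinates, coefficients) is simulated for illustration and is not the study's data or model.

```python
# Toy buffer-radius sensitivity check for a fire-count predictor of PM2.5.
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_fires = 40, 300
sites = rng.uniform(0, 500, size=(n_sites, 2))   # monitor coordinates, km (toy domain)
fires = rng.uniform(0, 500, size=(n_fires, 2))   # fire detections, km
aod = rng.uniform(0.1, 0.8, n_sites)

def fire_counts(radius_km):
    d = np.linalg.norm(sites[:, None, :] - fires[None, :, :], axis=2)
    return (d < radius_km).sum(axis=1)

# toy "truth": PM2.5 responds to AOD and to fires within ~75 km
pm25 = 5 + 20 * aod + 0.15 * fire_counts(75) + rng.normal(0, 1.5, n_sites)

def loo_rmspe(X, y):
    """Leave-one-out root-mean-square prediction error of an ordinary least-squares fit."""
    errs = []
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        errs.append(y[i] - X[i] @ beta)
    return float(np.sqrt(np.mean(np.square(errs))))

X0 = np.column_stack([np.ones(n_sites), aod])
for r in (25, 50, 75, 100):
    X1 = np.column_stack([X0, fire_counts(r)])
    print(f"radius {r:>3} km: RMSPE without fires {loo_rmspe(X0, pm25):.2f}, "
          f"with fires {loo_rmspe(X1, pm25):.2f}")
```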

  19. Population health improvement: a community health business model that engages partners in all sectors.

    PubMed

    Kindig, David A; Isham, George

    2014-01-01

    Because population health improvement requires action on multiple determinants--including medical care, health behaviors, and the social and physical environments--no single entity can be held accountable for achieving improved outcomes. Medical organizations, government, schools, businesses, and community organizations all need to make substantial changes in how they approach health and how they allocate resources. To this end, we suggest the development of multisectoral community health business partnership models. Such collaborative efforts are needed by sectors and actors not accustomed to working together. Healthcare executives can play important leadership roles in fostering or supporting such partnerships in local and national arenas where they have influence. In this article, we develop the following components of this argument: defining a community health business model; defining population health and the Triple Aim concept; reaching beyond core mission to help create the model; discussing the shift for care delivery beyond healthcare organizations to other community sectors; examining who should lead in developing the community business model; discussing where the resources for a community business model might come from; identifying that better evidence is needed to inform where to make cost-effective investments; and proposing some next steps. The approach we have outlined is a departure from much current policy and management practice. But new models are needed as a road map to drive action--not just thinking--to address the enormous challenge of improving population health. While we applaud continuing calls to improve health and reduce disparities, progress will require more robust incentives, strategies, and action than have been in practice to date. Our hope is that ideas presented here will help to catalyze a collective, multisectoral response to this critical social and economic challenge.

  20. Improving satellite-driven PM2.5 models with Moderate Resolution Imaging Spectroradiometer fire counts in the southeastern U.S

    PubMed Central

    Hu, Xuefei; Waller, Lance A.; Lyapustin, Alexei; Wang, Yujie; Liu, Yang

    2017-01-01

    Multiple studies have developed surface PM2.5 (particle size less than 2.5 µm in aerodynamic diameter) prediction models using satellite-derived aerosol optical depth as the primary predictor and meteorological and land use variables as secondary variables. To our knowledge, satellite-retrieved fire information has not been used for PM2.5 concentration prediction in statistical models. Fire data could be a useful predictor since fires are significant contributors of PM2.5. In this paper, we examined whether remotely sensed fire count data could improve PM2.5 prediction accuracy in the southeastern U.S. in a spatial statistical model setting. A sensitivity analysis showed that when the radius of the buffer zone centered at each PM2.5 monitoring site reached 75 km, fire count data generally have the greatest predictive power of PM2.5 across the models considered. Cross validation (CV) generated an R2 of 0.69, a mean prediction error of 2.75 µg/m3, and root-mean-square prediction errors (RMSPEs) of 4.29 µg/m3, indicating a good fit between the dependent and predictor variables. A comparison showed that the prediction accuracy was improved more substantially from the nonfire model to the fire model at sites with higher fire counts. With increasing fire counts, CV RMSPE decreased by values up to 1.5 µg/m3, exhibiting a maximum improvement of 13.4% in prediction accuracy. Fire count data were shown to have better performance in southern Georgia and in the spring season due to higher fire occurrence. Our findings indicate that fire count data provide a measurable improvement in PM2.5 concentration estimation, especially in areas and seasons prone to fire events. PMID:28967648

  1. Revised Parameters for the AMOEBA Polarizable Atomic Multipole Water Model

    PubMed Central

    Pande, Vijay S.; Head-Gordon, Teresa; Ponder, Jay W.

    2016-01-01

    A set of improved parameters for the AMOEBA polarizable atomic multipole water model is developed. The protocol uses an automated procedure, ForceBalance, to adjust model parameters to enforce agreement with ab initio-derived results for water clusters and experimentally obtained data for a variety of liquid phase properties across a broad temperature range. The values reported here for the new AMOEBA14 water model represent a substantial improvement over the previous AMOEBA03 model. The new AMOEBA14 water model accurately predicts the temperature of maximum density and qualitatively matches the experimental density curve across temperatures ranging from 249 K to 373 K. Excellent agreement is observed for the AMOEBA14 model in comparison to a variety of experimental properties as a function of temperature, including the 2nd virial coefficient, enthalpy of vaporization, isothermal compressibility, thermal expansion coefficient and dielectric constant. The viscosity, self-diffusion constant and surface tension are also well reproduced. In comparison to high-level ab initio results for clusters of 2 to 20 water molecules, the AMOEBA14 model yields results similar to the AMOEBA03 and the direct polarization iAMOEBA models. With advances in computing power, calibration data, and optimization techniques, we recommend the use of the AMOEBA14 water model for future studies employing a polarizable water model. PMID:25683601

  2. Benefits of an improved wheat crop information system

    NASA Technical Reports Server (NTRS)

    Kinne, I. L.

    1976-01-01

    The ECON work and the results of the independent reviews are summarized. Attempts are made to put this information into layman's terms and to present the benefits that can realistically be expected from a LANDSAT-type remote sensing system. Further, the mechanisms by which these benefits can be expected to accrue are presented. The benefits are described, including the nature of the expected information improvements, how and why they can lead to benefits to society, and the estimated magnitude of the expected benefits. A brief description is presented of the ECON models, how they work, their results, and a summary of the pertinent aspects of each review. The ECON analyses show that substantial benefits will accrue from implementation of an improved wheat crop information system based on remote sensing.

  3. The use of acoustically tuned resonators to improve the sound transmission loss of double-panel partitions

    NASA Astrophysics Data System (ADS)

    Mason, J. M.; Fahy, F. J.

    1988-07-01

    Double-leaf partitions are often utilized in situations requiring low weight structures with high transmission loss, an example of current interest being the fuselage walls of propeller-driven aircraft. In this case, acoustic excitation is periodic and, if one of the frequencies of excitation lies in the region of the fundamental mass-air-mass frequency of the partition, insulation performance is considerably less than desired. The potential effectiveness of tuned Helmholtz resonators connected to the partition cavity is investigated as a method of improving transmission loss. This is demonstrated by a simple theoretical model and then experimentally verified. Results show that substantial improvements may be obtained at and around the mass-air-mass frequency for a total resonator volume 15 percent of the cavity volume.
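
    The mass-air-mass resonance referred to above is, for a double-leaf partition, commonly estimated with the standard textbook expression coded below; the formula and the example panel values are generic and are not taken from the paper.

```python
# Textbook estimate (not from the paper) of the mass-air-mass resonance of a double-leaf
# partition, the frequency around which the cavity resonators in the study are tuned.
import math

def mass_air_mass_freq(m1, m2, gap, rho0=1.21, c0=343.0):
    """m1, m2: panel surface densities (kg/m^2); gap: cavity depth (m)."""
    return math.sqrt(rho0 * c0**2 * (1.0 / m1 + 1.0 / m2) / gap) / (2.0 * math.pi)

# e.g. two 5 kg/m^2 leaves separated by a 0.1 m cavity
print(f"{mass_air_mass_freq(5.0, 5.0, 0.1):.0f} Hz")
```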

  4. Modeling canopy-induced turbulence in the Earth system: a unified parameterization of turbulent exchange within plant canopies and the roughness sublayer (CLM-ml v0)

    NASA Astrophysics Data System (ADS)

    Bonan, Gordon B.; Patton, Edward G.; Harman, Ian N.; Oleson, Keith W.; Finnigan, John J.; Lu, Yaqiong; Burakowski, Elizabeth A.

    2018-04-01

    Land surface models used in climate models neglect the roughness sublayer and parameterize within-canopy turbulence in an ad hoc manner. We implemented a roughness sublayer turbulence parameterization in a multilayer canopy model (CLM-ml v0) to test if this theory provides a tractable parameterization extending from the ground through the canopy and the roughness sublayer. We compared the canopy model with the Community Land Model (CLM4.5) at seven forest, two grassland, and three cropland AmeriFlux sites over a range of canopy heights, leaf area indexes, and climates. CLM4.5 has pronounced biases during summer months at forest sites in midday latent heat flux, sensible heat flux, gross primary production, nighttime friction velocity, and the radiative temperature diurnal range. The new canopy model reduces these biases by introducing new physics. Advances in modeling stomatal conductance and canopy physiology beyond what is in CLM4.5 substantially improve model performance at the forest sites. The signature of the roughness sublayer is most evident in nighttime friction velocity and the diurnal cycle of radiative temperature, but is also seen in sensible heat flux. Within-canopy temperature profiles are markedly different compared with profiles obtained using Monin-Obukhov similarity theory, and the roughness sublayer produces cooler daytime and warmer nighttime temperatures. The herbaceous sites also show model improvements, but the improvements are related less systematically to the roughness sublayer parameterization in these canopies. The multilayer canopy with the roughness sublayer turbulence improves simulations compared with CLM4.5 while also advancing the theoretical basis for surface flux parameterizations.

  5. Dendritic network models: Improving isoscapes and quantifying influence of landscape and in-stream processes on strontium isotopes in rivers

    USGS Publications Warehouse

    Brennan, Sean R.; Torgersen, Christian E.; Hollenbeck, Jeff P.; Fernandez, Diego P.; Jensen, Carrie K; Schindler, Daniel E.

    2016-01-01

    A critical challenge for the Earth sciences is to trace the transport and flux of matter within and among aquatic, terrestrial, and atmospheric systems. Robust descriptions of isotopic patterns across space and time, called “isoscapes,” form the basis of a rapidly growing and wide-ranging body of research aimed at quantifying connectivity within and among Earth's systems. However, isoscapes of rivers have been limited by conventional Euclidean approaches in geostatistics and the lack of a quantitative framework to apportion the influence of processes driven by landscape features versus in-stream phenomena. Here we demonstrate how dendritic network models substantially improve the accuracy of isoscapes of strontium isotopes and partition the influence of hydrologic transport versus local geologic features on strontium isotope ratios in a large Alaska river. This work illustrates the analytical power of dendritic network models for the field of isotope biogeochemistry, particularly for provenance studies of modern and ancient animals.

  6. Preventing the Androgen Receptor N/C Interaction Delays Disease Onset in a Mouse Model of SBMA.

    PubMed

    Zboray, Lori; Pluciennik, Anna; Curtis, Dana; Liu, Yuhong; Berman-Booty, Lisa D; Orr, Christopher; Kesler, Cristina T; Berger, Tamar; Gioeli, Daniel; Paschal, Bryce M; Merry, Diane E

    2015-12-15

    Spinal and bulbar muscular atrophy (SBMA) is a neurodegenerative disease caused by a polyglutamine expansion in the androgen receptor (AR) and is associated with misfolding and aggregation of the mutant AR. We investigated the role of an interdomain interaction between the amino (N)-terminal FxxLF motif and carboxyl (C)-terminal AF-2 domain in a mouse model of SBMA. Male transgenic mice expressing polyQ-expanded AR with a mutation in the FxxLF motif (F23A) to prevent the N/C interaction displayed substantially improved motor function compared with N/C-intact AR-expressing mice and showed reduced pathological features of SBMA. Serine 16 phosphorylation was substantially enhanced by the F23A mutation; moreover, the protective effect of AR F23A was dependent on this phosphorylation. These results reveal an important role for the N/C interaction on disease onset in mice and suggest that targeting AR conformation could be a therapeutic strategy for patients with SBMA. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Validity and usefulness of the Line Drill test for adolescent basketball players: a Bayesian multilevel analysis.

    PubMed

    Carvalho, Humberto M; Gonçalves, Carlos E; Grosgeorge, Bernard; Paes, Roberto R

    2017-01-01

    The study examined the validity of the Line Drill test (LD) in male adolescent basketball players (10-15 years). Sensitivity of the LD to changes in performance across a training and competition season (4 months) was also considered. Age, maturation, body size and LD were measured (n = 57). Sensitivity of the LD was examined pre- and post-competitive season in a sub-sample (n = 44). The time at each of the four shuttle sprints of the LD (i.e. four stages) was modelled with Bayesian multilevel models. We observed a very large correlation of performance at stage 4 (full LD protocol) with stage 3, but lower correlations with the early LD stages. Players' performance by somatic maturity differed substantially only when considering full LD protocol performance. Substantial improvements in all stages of the protocol were observed across the 4-month competitive season. The LD protocol could therefore be shortened by omitting the last full-court shuttle sprint while remaining sensitive to training exposure and independent of maturity status and body size.

  8. The Oncology Care Model: Perspectives From the Centers for Medicare & Medicaid Services and Participating Oncology Practices in Academia and the Community.

    PubMed

    Kline, Ron; Adelson, Kerin; Kirshner, Jeffrey J; Strawbridge, Larissa M; Devita, Marsha; Sinanis, Naralys; Conway, Patrick H; Basch, Ethan

    2017-01-01

    Cancer care delivery in the United States is often fragmented and inefficient, imposing substantial burdens on patients. Costs of cancer care are rising more rapidly than in other specialties, with substantial regional differences in quality and cost. The Centers for Medicare & Medicaid Services (CMS) Innovation Center (CMMI) recently launched the Oncology Care Model (OCM), which uses payment incentives and practice redesign requirements toward the goal of improving quality while controlling costs. As of March 2017, 190 practices were participating, with approximately 3,200 oncologists providing care for approximately 150,000 unique beneficiaries per year (approximately 20% of the Medicare Fee-for-Service population receiving chemotherapy for cancer). This article provides an overview of the program from the CMS perspective, as well as perspectives from two practices implementing OCM: an academic health system (Yale Cancer Center) and a community practice (Hematology Oncology Associates of Central New York). Requirements of OCM, as well as implementation successes, challenges, financial implications, impact on quality, and future visions, are provided from each perspective.

  9. Effect of temperature and precipitation on salmonellosis cases in South-East Queensland, Australia: an observational study.

    PubMed

    Stephen, Dimity Maree; Barnett, Adrian Gerard

    2016-02-25

    Foodborne illnesses in Australia, including salmonellosis, are estimated to cost over $A1.25 billion annually. The weather has been identified as being influential on salmonellosis incidence, as cases increase during summer; however, time series modelling of salmonellosis is challenging because outbreaks cause strong autocorrelation. This study assesses whether switching models are an improved method of estimating weather-salmonellosis associations. We analysed weather and salmonellosis in South-East Queensland between 2004 and 2013 using two common regression models and a switching model, each with 21-day lags for temperature and precipitation. The switching model best fit the data, as judged by its substantial improvement in deviance information criterion over the regression models, less autocorrelated residuals and control of seasonality. The switching model estimated that a 5 °C increase in mean temperature and a 10 mm increase in precipitation were associated with increases in salmonellosis cases of 45.4% (95% CrI 40.4%, 50.5%) and 24.1% (95% CrI 17.0%, 31.6%), respectively. Switching models improve on traditional time series models in quantifying weather-salmonellosis associations. A better understanding of how temperature and precipitation influence salmonellosis may identify where interventions can be made to lower the health and economic costs of salmonellosis. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
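
    The reported percentage effects are the kind of quantities obtained by exponentiating the coefficients of a log-linear count model. The short calculation below simply back-solves the quoted central estimates into per-unit coefficients and converts them back, as an illustration of that interpretation; it does not reproduce the paper's switching model.

```python
# Back-of-envelope illustration of how log-linear regression coefficients translate into
# the percentage increases quoted above; the coefficients are implied by the reported
# central estimates, not taken from the paper's model output.
import math

def pct_increase(beta_per_unit, delta):
    """Percentage change in expected cases for a `delta`-unit covariate increase."""
    return 100.0 * (math.exp(beta_per_unit * delta) - 1.0)

beta_temp = math.log(1.454) / 5.0    # per deg C, implied by +45.4% per 5 C
beta_rain = math.log(1.241) / 10.0   # per mm, implied by +24.1% per 10 mm

print(f"+5 C   -> {pct_increase(beta_temp, 5):.1f}% more cases")
print(f"+10 mm -> {pct_increase(beta_rain, 10):.1f}% more cases")
```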

  10. Constraints on Smoke Injection Height, Source Strength, and Transports from MISR and MODIS

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph A.; Petrenko, Mariya; Val Martin, Maria; Chin, Mian

    2014-01-01

    The AeroCom BB (Biomass Burning) Experiment AOD (Aerosol Optical Depth) motivation: We have a substantial set of satellite wildfire plume AOD snapshots and injection heights to help calibrate model/inventory performance. We are 1) adding more fire source-strength cases, 2) using MISR to improve the AOD constraints, and 3) adding 2008 global injection heights. We selected GFED3-daily due to good overall source-strength performance, but any inventory can be tested. This is a joint effort to test multiple global models, to draw robust BB injection height and emission strength conclusions. We provide satellite-based injection height and smoke plume AOD climatologies.

  11. Pairing field methods to improve inference in wildlife surveys while accommodating detection covariance.

    PubMed

    Clare, John; McKinney, Shawn T; DePue, John E; Loftin, Cynthia S

    2017-10-01

    It is common to use multiple field sampling methods when implementing wildlife surveys to compare method efficacy or cost efficiency, integrate distinct pieces of information provided by separate methods, or evaluate method-specific biases and misclassification error. Existing models that combine information from multiple field methods or sampling devices permit rigorous comparison of method-specific detection parameters, enable estimation of additional parameters such as false-positive detection probability, and improve occurrence or abundance estimates, but with the assumption that the separate sampling methods produce detections independently of one another. This assumption is tenuous if methods are paired or deployed in close proximity simultaneously, a common practice that reduces the additional effort required to implement multiple methods and reduces the risk that differences between method-specific detection parameters are confounded by other environmental factors. We develop occupancy and spatial capture-recapture models that permit covariance between the detections produced by different methods, use simulation to compare estimator performance of the new models to models assuming independence, and provide an empirical application based on American marten (Martes americana) surveys using paired remote cameras, hair catches, and snow tracking. Simulation results indicate existing models that assume that methods independently detect organisms produce biased parameter estimates and substantially understate estimate uncertainty when this assumption is violated, while our reformulated models are robust to either methodological independence or covariance. Empirical results suggested that remote cameras and snow tracking had comparable probability of detecting present martens, but that snow tracking also produced false-positive marten detections that could potentially substantially bias distribution estimates if not corrected for. Remote cameras detected marten individuals more readily than passive hair catches. Inability to photographically distinguish individual sex did not appear to induce negative bias in camera density estimates; instead, hair catches appeared to produce detection competition between individuals that may have been a source of negative bias. Our model reformulations broaden the range of circumstances in which analyses incorporating multiple sources of information can be robustly used, and our empirical results demonstrate that using multiple field-methods can enhance inferences regarding ecological parameters of interest and improve understanding of how reliably survey methods sample these parameters. © 2017 by the Ecological Society of America.

  12. Leverage effect, economic policy uncertainty and realized volatility with regime switching

    NASA Astrophysics Data System (ADS)

    Duan, Yinying; Chen, Wang; Zeng, Qing; Liu, Zhicao

    2018-03-01

    In this study, we first investigate the impacts of leverage effect and economic policy uncertainty (EPU) on future volatility in the framework of regime switching. Out-of-sample results show that the HAR-RV including the leverage effect and economic policy uncertainty with regimes can achieve higher forecast accuracy than RV-type and GARCH-class models. Our robustness results further imply that these factors in the framework of regime switching can substantially improve the HAR-RV's forecast performance.
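
    For readers unfamiliar with the model class, a single-regime HAR-RV regression augmented with leverage and EPU terms can be written in a few lines; the sketch below uses simulated placeholder data and omits the regime-switching layer that is central to the paper.

```python
# Sketch of a HAR-RV-type forecasting regression extended with leverage and EPU terms
# (single-regime version; data are simulated placeholders, not market data).
import numpy as np

rng = np.random.default_rng(42)
T = 1000
rv = np.abs(rng.normal(1.0, 0.3, T))        # daily realized variance (toy)
ret = rng.normal(0, 1, T)                   # daily returns, for the leverage term
epu = np.abs(rng.normal(100, 30, T))        # economic policy uncertainty index (toy)

def trailing_mean(x, w):
    return np.array([x[max(0, t - w + 1):t + 1].mean() for t in range(len(x))])

rv_d, rv_w, rv_m = rv, trailing_mean(rv, 5), trailing_mean(rv, 22)
neg_ret = np.minimum(ret, 0.0)              # leverage: only negative returns enter

# regress RV_{t+1} on daily/weekly/monthly RV, leverage and EPU observed at time t
X = np.column_stack([np.ones(T - 1), rv_d[:-1], rv_w[:-1], rv_m[:-1],
                     neg_ret[:-1], epu[:-1]])
y = rv[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["const", "rv_d", "rv_w", "rv_m", "leverage", "epu"], beta.round(4))))
```

    In the paper's framework, a separate set of these coefficients is estimated for each latent regime; the sketch above fits only one.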

  13. Linear-time general decoding algorithm for the surface code

    NASA Astrophysics Data System (ADS)

    Darmawan, Andrew S.; Poulin, David

    2018-05-01

    A quantum error correcting protocol can be substantially improved by taking into account features of the physical noise process. We present an efficient decoder for the surface code which can account for general noise features, including coherences and correlations. We demonstrate that the decoder significantly outperforms the conventional matching algorithm on a variety of noise models, including non-Pauli noise and spatially correlated noise. The algorithm is based on an approximate calculation of the logical channel using a tensor-network description of the noisy state.

  14. Internal dosimetry monitoring equipment: Present and future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Selby, J.; Carbaugh, E.H.; Lynch, T.P.

    1993-09-01

    We have attempted to characterize the current and future status of in vivo and in vitro measurement programs coupled with the associated radioanalytical methods and workplace monitoring. Developments in these areas must be carefully integrated by internal dosimetrists, radiochemists and field health physicists. Their goal should be uniform improvement rather than to focus on one specific area (e.g., dose modeling) to the neglect of other areas where the measurement capabilities are substantially less sophisticated and, therefore, the potential source of error is greatest.

  15. Meaningful improvement in gait speed in hip fracture recovery.

    PubMed

    Alley, Dawn E; Hicks, Gregory E; Shardell, Michelle; Hawkes, William; Miller, Ram; Craik, Rebecca L; Mangione, Kathleen K; Orwig, Denise; Hochberg, Marc; Resnick, Barbara; Magaziner, Jay

    2011-09-01

    To estimate meaningful improvements in gait speed observed during recovery from hip fracture and to evaluate the sensitivity and specificity of gait speed changes in detecting change in self-reported mobility. Secondary longitudinal data analysis from two randomized controlled trials Twelve hospitals in the Baltimore, Maryland, area. Two hundred seventeen women admitted with hip fracture. Usual gait speed and self-reported mobility (ability to walk 1 block and climb 1 flight of stairs) measured 2 and 12 months after fracture. Effect size-based estimates of meaningful differences were 0.03 for small differences and 0.09 for substantial differences. Depending on the anchor (stairs vs walking) and method (mean difference vs regression), anchor-based estimates ranged from 0.10 to 0.17 m/s for small meaningful improvements and 0.17 to 0.26 m/s for substantial meaningful improvement. Optimal gait speed cutpoints yielded low sensitivity (0.39-0.62) and specificity (0.57-0.76) for improvements in self-reported mobility. Results from this sample of women recovering from hip fracture provide only limited support for the 0.10-m/s cut point for substantial meaningful change previously identified in community-dwelling older adults experiencing declines in walking abilities. Anchor-based estimates and cut points derived from receiver operating characteristic curve analysis suggest that greater improvements in gait speed may be required for substantial perceived mobility improvement in female hip fracture patients. Furthermore, gait speed change performed poorly in discriminating change in self-reported mobility. Estimates of meaningful change in gait speed may differ based on the direction of change (improvement vs decline) or between patient populations. © 2011, Copyright the Authors. Journal compilation © 2011, The American Geriatrics Society.
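
    The sensitivity/specificity figures above come from evaluating candidate gait-speed-change cut points against the self-reported mobility anchor. The sketch below illustrates that calculation, including a Youden-style optimal cut point, on simulated data; the numbers are invented and do not reproduce the study's results.

```python
# Toy illustration of cut-point evaluation for gait-speed change against a binary
# self-reported mobility anchor (simulated data only).
import numpy as np

rng = np.random.default_rng(7)
n = 217
improved = rng.random(n) < 0.5                       # self-reported mobility improvement
delta_speed = np.where(improved,
                       rng.normal(0.15, 0.12, n),    # change in gait speed, m/s
                       rng.normal(0.05, 0.12, n))

def sens_spec(cut):
    pred = delta_speed >= cut
    sens = (pred & improved).sum() / improved.sum()
    spec = (~pred & ~improved).sum() / (~improved).sum()
    return sens, spec

for cut in (0.05, 0.10, 0.17, 0.26):
    s, p = sens_spec(cut)
    print(f"cut {cut:.2f} m/s: sensitivity {s:.2f}, specificity {p:.2f}")

# Youden's J picks the cut point maximizing sensitivity + specificity - 1
grid = np.linspace(-0.1, 0.4, 101)
best = max(grid, key=lambda c: sum(sens_spec(c)) - 1)
print(f"ROC-optimal cut point (toy data): {best:.2f} m/s")
```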

  16. Meaningful Improvement in Gait Speed in Hip Fracture Recovery

    PubMed Central

    Alley, Dawn E.; Hicks, Gregory E.; Shardell, Michelle; Hawkes, William; Miller, Ram; Craik, Rebecca L.; Mangione, Kathleen K.; Orwig, Denise; Hochberg, Marc; Resnick, Barbara; Magaziner, Jay

    2011-01-01

    OBJECTIVES To estimate meaningful improvements in gait speed observed during recovery from hip fracture and to evaluate the sensitivity and specificity of gait speed changes in detecting change in self-reported mobility. DESIGN Secondary longitudinal data analysis from two randomized controlled trials SETTING Twelve hospitals in the Baltimore, Maryland, area. PARTICIPANTS Two hundred seventeen women admitted with hip fracture. MEASUREMENTS Usual gait speed and self-reported mobility (ability to walk 1 block and climb 1 flight of stairs) measured 2 and 12 months after fracture. RESULTS Effect size–based estimates of meaningful differences were 0.03 for small differences and 0.09 for substantial differences. Depending on the anchor (stairs vs walking) and method (mean difference vs regression), anchor-based estimates ranged from 0.10 to 0.17 m/s for small meaningful improvements and 0.17 to 0.26 m/s for substantial meaningful improvement. Optimal gait speed cut-points yielded low sensitivity (0.39–0.62) and specificity (0.57–0.76) for improvements in self-reported mobility. CONCLUSION Results from this sample of women recovering from hip fracture provide only limited support for the 0.10-m/s cut point for substantial meaningful change previously identified in community-dwelling older adults experiencing declines in walking abilities. Anchor-based estimates and cut points derived from receiver operating characteristic curve analysis suggest that greater improvements in gait speed may be required for substantial perceived mobility improvement in female hip fracture patients. Furthermore, gait speed change performed poorly in discriminating change in self-reported mobility. Estimates of meaningful change in gait speed may differ based on the direction of change (improvement vs decline) or between patient populations. PMID:21883109

  17. System integration of wind and solar power in integrated assessment models: A cross-model evaluation of new approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pietzcker, Robert C.; Ueckerdt, Falko; Carrara, Samuel

    Mitigation-Process Integrated Assessment Models (MP-IAMs) are used to analyze long-term transformation pathways of the energy system required to achieve stringent climate change mitigation targets. Due to their substantial temporal and spatial aggregation, IAMs cannot explicitly represent all detailed challenges of integrating the variable renewable energies (VRE) wind and solar in power systems, but rather rely on parameterized modeling approaches. In the ADVANCE project, six international modeling teams have developed new approaches to improve the representation of power sector dynamics and VRE integration in IAMs. In this study, we qualitatively and quantitatively evaluate the last years' modeling progress and study the impact of VRE integration modeling on VRE deployment in IAM scenarios. For a comprehensive and transparent qualitative evaluation, we first develop a framework of 18 features of power sector dynamics and VRE integration. We then apply this framework to the newly-developed modeling approaches to derive a detailed map of strengths and limitations of the different approaches. For the quantitative evaluation, we compare the IAMs to the detailed hourly-resolution power sector model REMIX. We find that the new modeling approaches manage to represent a large number of features of the power sector, and the numerical results are in reasonable agreement with those derived from the detailed power sector model. Updating the power sector representation and the cost and resources of wind and solar substantially increased wind and solar shares across models: Under a carbon price of 30$/tCO2 in 2020 (increasing by 5% per year), the model-average cost-minimizing VRE share over the period 2050-2100 is 62% of electricity generation, 24%-points higher than with the old model version.

  18. Extravehicular Mobility Unit Penetration Probability from Micrometeoroids and Orbital Debris: Revised Analytical Model and Potential Space Suit Improvements

    NASA Technical Reports Server (NTRS)

    Chase, Thomas D.; Splawn, Keith; Christiansen, Eric L.

    2007-01-01

    The NASA Extravehicular Mobility Unit (EMU) micrometeoroid and orbital debris protection ability has recently been assessed against an updated, higher threat space environment model. The new environment was analyzed in conjunction with a revised EMU solid model using a NASA computer code. Results showed that the EMU exceeds the required mathematical Probability of having No Penetrations (PNP) of any suit pressure bladder over the remaining life of the program (2,700 projected hours of 2-person spacewalks). The success probability was calculated to be 0.94, versus a requirement of >0.91, for the current spacesuit's outer protective garment. In parallel to the probability assessment, potential improvements to the current spacesuit's outer protective garment were built and impact tested. A NASA light gas gun was used to launch projectiles at test items, at speeds of approximately 7 km per second. Test results showed that substantial garment improvements could be made, with mild material enhancements and moderate assembly development. The spacesuit's PNP would improve marginally with the tested enhancements, if they were available for immediate incorporation. This paper discusses the results of the model assessment process and test program. These findings add confidence to the continued use of the existing NASA EMU during International Space Station (ISS) assembly and Shuttle Operations. They provide a viable avenue for improved hypervelocity impact protection for the EMU, or for future space suits.

  19. An Online Prediction Platform to Support the Environmental ...

    EPA Pesticide Factsheets

    Historical QSAR models are currently utilized across a broad range of applications within the U.S. Environmental Protection Agency (EPA). These models predict basic physicochemical properties (e.g., logP, aqueous solubility, vapor pressure), which are then incorporated into exposure, fate and transport models. Whereas the classical manner of publishing results in peer-reviewed journals remains appropriate, there are substantial benefits to be gained by providing enhanced, open access to the training data sets and resulting models. Benefits include improved transparency, more flexibility to expand training sets and improve model algorithms, and greater ability to independently characterize model performance both globally and in local areas of chemistry. We have developed a web-based prediction platform that uses open-source descriptors and modeling algorithms, employs modern cheminformatics technologies, and is tailored for ease of use by the toxicology and environmental regulatory community. This tool also provides web services to support both EPA's projects and the broader modeling community. The platform hosts models developed within EPA's National Center for Computational Toxicology, as well as those developed by other EPA scientists and the outside scientific community. Recognizing that there are other on-line QSAR model platforms currently available which have additional capabilities, we connect to such services, where possible, to produce an integrated

  20. Modelled interglacial carbon cycle dynamics during the Holocene, the Eemian and Marine Isotope Stage (MIS) 11

    NASA Astrophysics Data System (ADS)

    Kleinen, Thomas; Brovkin, Victor; Munhoven, Guy

    2016-11-01

    Trends in the atmospheric concentration of CO2 during three recent interglacials - the Holocene, the Eemian and Marine Isotope Stage (MIS) 11 - are investigated using an earth system model of intermediate complexity, which we extended with process-based modules to consider two slow carbon cycle processes - peat accumulation and shallow-water CaCO3 sedimentation (coral reef formation). For all three interglacials, model simulations considering peat accumulation and shallow-water CaCO3 sedimentation substantially improve the agreement between model results and ice core CO2 reconstructions in comparison to a carbon cycle set-up neglecting these processes. This enables us to model the trends in atmospheric CO2, with modelled trends similar to the ice core data, forcing the model only with orbital and sea level changes. During the Holocene, anthropogenic CO2 emissions are required to match the observed rise in atmospheric CO2 after 3 ka BP but are not relevant before this time. Our model experiments show a considerable improvement in the modelled CO2 trends by the inclusion of the slow carbon cycle processes, allowing us to explain the CO2 evolution during the Holocene and two recent interglacials consistently using an identical model set-up.

  1. Cloud Feedbacks on Greenhouse Warming in a Multi-Scale Modeling Framework with a Higher-Order Turbulence Closure

    NASA Technical Reports Server (NTRS)

    Cheng, Anning; Xu, Kuan-Man

    2015-01-01

    Five-year simulation experiments with a multi-scale modeling framework (MMF) with an advanced intermediately prognostic higher-order turbulence closure (IPHOC) in its cloud-resolving model (CRM) component, also known as SPCAM-IPHOC (super-parameterized Community Atmosphere Model), are performed to understand the fast tropical (30S-30N) cloud response to an instantaneous doubling of CO2 concentration with SST held fixed at present-day values. SPCAM-IPHOC has a substantially improved representation of low-level clouds compared with SPCAM, so the cloud responses to greenhouse warming in SPCAM-IPHOC are expected to be more realistic. The changes in rising motion, surface precipitation, cloud cover, and shortwave and longwave cloud radiative forcing in SPCAM-IPHOC under greenhouse warming are presented.

  2. Genome Editing and Its Applications in Model Organisms.

    PubMed

    Ma, Dongyuan; Liu, Feng

    2015-12-01

    Technological advances are important for innovative biological research. Development of molecular tools for DNA manipulation, such as zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the clustered regularly-interspaced short palindromic repeat (CRISPR)/CRISPR-associated (Cas) system, has revolutionized genome editing. These approaches can be used to develop potential therapeutic strategies to effectively treat heritable diseases. In the last few years, substantial progress has been made in CRISPR/Cas technology, including technical improvements and wide application in many model systems. This review describes recent advancements in genome editing with a particular focus on CRISPR/Cas, covering the underlying principles, technological optimization, and its application in zebrafish and other model organisms, disease modeling, and gene therapy used for personalized medicine. Copyright © 2016 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.

  3. Improving Adolescent Judgment and Decision Making

    PubMed Central

    Dansereau, Donald F.; Knight, Danica K.; Flynn, Patrick M.

    2013-01-01

    Human judgment and decision making (JDM) has substantial room for improvement, especially among adolescents. Increased technological and social complexity "ups the ante" for developing impactful JDM interventions and aids. Current explanatory advances in this field emphasize dual processing models that incorporate both experiential and analytic processing systems. According to these models, judgment and decisions based on the experiential system are rapid and stem from automatic reference to previously stored episodes. Those based on the analytic system are viewed as slower and consciously developed. These models also hypothesize that metacognitive (self-monitoring) activities embedded in the analytic system influence how and when the two systems are used. What is not included in these models is the development of an intersection between the two systems. Because such an intersection is strongly suggested by memory and educational research as the basis of wisdom/expertise, the present paper describes an Integrated Judgment and Decision-Making Model (IJDM) that incorporates this component. Wisdom/expertise is hypothesized to contain a collection of schematic structures that can emerge from the accumulation of similar episodes or repeated analytic practice. As will be argued, in comparison to dual-system models, the addition of this component provides a broader basis for selecting and designing interventions to improve adolescent JDM. Its development also has implications for generally enhancing cognitive interventions by adopting principles from athletic training to create automated, expert behaviors. PMID:24391350

  4. A Model of High-Frequency Self-Mixing in Double-Barrier Rectifier

    NASA Astrophysics Data System (ADS)

    Palma, Fabrizio; Rao, R.

    2018-03-01

    In this paper, a new model of the frequency dependence of the double-barrier THz rectifier is presented. The new structure is of interest because it can be realized by CMOS image sensor technology. Its application in a complex field such as that of THz receivers requires the availability of an analytical model which is reliable and able to highlight the dependence on the parameters of the physical structure. The model is based on the hydrodynamic semiconductor equations, solved in the small-signal approximation. The model depicts the mechanisms of the THz modulation of the charge in the depleted regions of the double-barrier device and explains the self-mixing process, the frequency dependence, and the detection capability of the structure. The model thus substantially improves on the analytical models of THz rectification available in the literature, which are mainly based on lumped equivalent circuits.

  5. RANS modeling of scalar dispersion from localized sources within a simplified urban-area model

    NASA Astrophysics Data System (ADS)

    Rossi, Riccardo; Capra, Stefano; Iaccarino, Gianluca

    2011-11-01

    The dispersion of a passive scalar downstream of a localized source within a simplified urban-like geometry is examined by means of RANS scalar flux models. The computations are conducted under conditions of neutral stability and for three different incoming wind directions (0°, 45°, 90°) at a roughness Reynolds number of Ret = 391. A Reynolds stress transport model is used to close the flow governing equations, whereas both the standard eddy-diffusivity closure and algebraic flux models are employed to close the transport equation for the passive scalar. The comparison with a DNS database shows that the algebraic scalar flux models are more reliable in predicting both the mean concentration and the plume structure. Since algebraic flux models do not substantially increase the computational effort, the results indicate that the use of tensorial diffusivity can be a promising tool for dispersion simulations in the urban environment.
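
    For reference, the two closure levels being compared are commonly written as follows; these are standard textbook forms (the paper's algebraic flux models may include additional production terms):

```latex
% Standard closure forms for the turbulent scalar flux (textbook notation, not from the paper)
\begin{align}
  \overline{u_i' c'} &= -\frac{\nu_t}{Sc_t}\,\frac{\partial \overline{C}}{\partial x_i}
    && \text{(standard eddy-diffusivity closure)} \\
  \overline{u_i' c'} &= -C_\theta\,\frac{k}{\varepsilon}\,\overline{u_i' u_j'}\,
    \frac{\partial \overline{C}}{\partial x_j}
    && \text{(generalized gradient diffusion, with tensorial diffusivity)}
\end{align}
```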

  6. Hidden Markov induced Dynamic Bayesian Network for recovering time evolving gene regulatory networks

    NASA Astrophysics Data System (ADS)

    Zhu, Shijia; Wang, Yadong

    2015-12-01

    Dynamic Bayesian Networks (DBN) have been widely used to recover gene regulatory relationships from time-series data in computational systems biology. Its standard assumption is ‘stationarity’, and therefore, several research efforts have been recently proposed to relax this restriction. However, those methods suffer from three challenges: long running time, low accuracy and reliance on parameter settings. To address these problems, we propose a novel non-stationary DBN model by extending each hidden node of Hidden Markov Model into a DBN (called HMDBN), which properly handles the underlying time-evolving networks. Correspondingly, an improved structural EM algorithm is proposed to learn the HMDBN. It dramatically reduces searching space, thereby substantially improving computational efficiency. Additionally, we derived a novel generalized Bayesian Information Criterion under the non-stationary assumption (called BWBIC), which can help significantly improve the reconstruction accuracy and largely reduce over-fitting. Moreover, the re-estimation formulas for all parameters of our model are derived, enabling us to avoid reliance on parameter settings. Compared to the state-of-the-art methods, the experimental evaluation of our proposed method on both synthetic and real biological data demonstrates more stably high prediction accuracy and significantly improved computation efficiency, even with no prior knowledge and parameter settings.

  7. O2 A Band Studies for Cloud Detection and Algorithm Improvement

    NASA Technical Reports Server (NTRS)

    Chance, K. V.

    1996-01-01

    Detection of cloud parameters from space-based spectrometers can employ the vibrational bands of O2 in the b¹Σ⁺_g → X³Σ⁻_g spin-forbidden electronic transition manifold, particularly the Δν = 0 A band. The GOME instrument uses the A band in the Initial Cloud Fitting Algorithm (ICFA). The work reported here consists of making substantial improvements in the line-by-line spectral database for the A band, testing whether an additional correction to the line shape function is necessary in order to correctly model the atmospheric transmission in this band, and calculating prototype cloud and ground template spectra for comparison with satellite measurements.

  8. Sequence determinants of improved CRISPR sgRNA design.

    PubMed

    Xu, Han; Xiao, Tengfei; Chen, Chen-Hao; Li, Wei; Meyer, Clifford A; Wu, Qiu; Wu, Di; Cong, Le; Zhang, Feng; Liu, Jun S; Brown, Myles; Liu, X Shirley

    2015-08-01

    The CRISPR/Cas9 system has revolutionized mammalian somatic cell genetics. Genome-wide functional screens using CRISPR/Cas9-mediated knockout or dCas9 fusion-mediated inhibition/activation (CRISPRi/a) are powerful techniques for discovering phenotype-associated gene function. We systematically assessed the DNA sequence features that contribute to single guide RNA (sgRNA) efficiency in CRISPR-based screens. Leveraging the information from multiple designs, we derived a new sequence model for predicting sgRNA efficiency in CRISPR/Cas9 knockout experiments. Our model confirmed known features and suggested new features including a preference for cytosine at the cleavage site. The model was experimentally validated for sgRNA-mediated mutation rate and protein knockout efficiency. Tested on independent data sets, the model achieved significant results in both positive and negative selection conditions and outperformed existing models. We also found that the sequence preference for CRISPRi/a is substantially different from that for CRISPR/Cas9 knockout and propose a new model for predicting sgRNA efficiency in CRISPRi/a experiments. These results facilitate the genome-wide design of improved sgRNA for both knockout and CRISPRi/a studies. © 2015 Xu et al.; Published by Cold Spring Harbor Laboratory Press.
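
    As a minimal illustration of this kind of sequence model (not the authors' published predictor), sgRNA spacers can be one-hot encoded by position and nucleotide and fed to a penalized logistic regression, after which the largest weights indicate position-specific nucleotide preferences such as the cytosine effect noted above. The data and labels below are simulated.

```python
# Minimal sketch of a position-specific sequence-feature model for sgRNA efficiency.
import numpy as np
from sklearn.linear_model import LogisticRegression

BASES = "ACGT"

def one_hot(seq):
    x = np.zeros(len(seq) * 4)
    for i, b in enumerate(seq):
        x[i * 4 + BASES.index(b)] = 1.0
    return x

rng = np.random.default_rng(3)
seqs = ["".join(rng.choice(list(BASES), 20)) for _ in range(500)]   # toy 20-nt spacers
# toy labels: spacers with C at position 17 (near the cut site) are more often "efficient"
labels = np.array([1 if (s[16] == "C") ^ (rng.random() < 0.25) else 0 for s in seqs])

X = np.vstack([one_hot(s) for s in seqs])
model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X, labels)

# the largest positive weights point at the nucleotide/position features driving predictions
top = np.argsort(model.coef_[0])[-3:][::-1]
for idx in top:
    print(f"position {idx // 4 + 1}, base {BASES[idx % 4]}, weight {model.coef_[0][idx]:.2f}")
```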

  9. Tradeoffs between income, air pollution and life expectancy: Brief report on the US experience, 1980-2000.

    PubMed

    Pope, C Arden; Ezzati, Majid; Dockery, Douglas W

    2015-10-01

    During the period of 1980-2000, the US obtained substantial reductions in air pollution and improvements in life expectancy (LE). Multiple factors contributed to improved health. This report explores and illustrates trade-offs between income, air pollution, and LE. Both improved air quality and income growth contributed to LE gains - without evidence of substantial negative tradeoffs between air pollution and income. Cleaner air may be considered an "economic good" with contributions to health, wellbeing, and human capital. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Cavitation-resistant inducer

    DOEpatents

    Dunn, Charlton; Subbaraman, Maria R.

    1989-01-01

    An improvement in an inducer for a pump wherein the inducer includes a hub, a plurality of radially extending substantially helical blades and a wall member extending about and encompassing an outer periphery of the blades. The improvement comprises forming adjacent pairs of blades and the hub to provide a substantially rectangular cross-sectional flow area which cross-sectional flow area decreases from the inlet end of the inducer to a discharge end of the inducer, resulting in increased inducer efficiency, improved suction performance, reduced susceptibility to cavitation, reduced susceptibility to hub separation and reduced fabrication costs.

  11. Mechanical Performance of Asphalt Mortar Containing Hydrated Lime and EAFSS at Low and High Temperatures.

    PubMed

    Moon, Ki Hoon; Falchetto, Augusto Cannone; Wang, Di; Riccardi, Chiara; Wistuba, Michael P

    2017-07-03

    In this paper, the possibility of improving the global response of asphalt materials for pavement applications through the use of hydrated lime and Electric Arc-Furnace Steel Slag (EAFSS) was investigated. For this purpose, a set of asphalt mortars was prepared by mixing two different asphalt binders with fine granite aggregate together with hydrated lime or EAFSS at three different percentages. Bending Beam Rheometer (BBR) creep tests and Dynamic Shear Rheometer (DSR) complex modulus tests were performed to evaluate the material response both at low and high temperature. Then, the rheological Huet model was fitted to the BBR creep results for estimating the impact of filler content on the model parameters. It was found that an addition of hydrated lime and EAFSS up to 10% and 5%, respectively, results in satisfactory low-temperature performance with a substantial improvement of the high-temperature behavior.

  12. Applied genetic evaluations for production and functional traits in dairy cattle.

    PubMed

    Mark, T

    2004-08-01

    The objective of this study was to review the current status of genetic evaluation systems for production and functional traits as practiced in different Interbull member countries and to discuss that status in relation to research results and potential improvements. Thirty-one countries provided information. Substantial variation was evident for number of traits considered per country, trait definition, genetic evaluation procedure within trait, effects included, and how these were treated in genetic evaluation models. All countries lacked genetic evaluations for one or more economically important traits. Improvement in the genetic evaluation models, especially for many functional traits, could be achieved by closing the gaps between research and practice. More detailed and up to date information about national genetic evaluation systems for traits in different countries is available at www.interbull.org. Female fertility and workability traits were considered in many countries and could be next in line for international genetic evaluations.

  13. Mechanical Performance of Asphalt Mortar Containing Hydrated Lime and EAFSS at Low and High Temperatures

    PubMed Central

    Moon, Ki Hoon; Wang, Di; Riccardi, Chiara; Wistuba, Michael P.

    2017-01-01

    In this paper, the possibility of improving the global response of asphalt materials for pavement applications through the use of hydrated lime and Electric Arc-Furnace Steel Slag (EAFSS) was investigated. For this purpose, a set of asphalt mortars was prepared by mixing two different asphalt binders with fine granite aggregate together with hydrated lime or EAFSS at three different percentages. Bending Beam Rheometer (BBR) creep tests and Dynamic Shear Rheometer (DSR) complex modulus tests were performed to evaluate the material response both at low and high temperature. Then, the rheological Huet model was fitted to the BBR creep results for estimating the impact of filler content on the model parameters. It was found that an addition of hydrated lime and EAFSS up to 10% and 5%, respectively, results in satisfactory low-temperature performance with a substantial improvement of the high-temperature behavior. PMID:28773100

  14. Attitudes of nursing staff toward interprofessional in-patient-centered rounding.

    PubMed

    Sharma, Umesh; Klocke, David

    2014-09-01

    Historically, medicine and nursing have had a hierarchical and patriarchal relationship, with physicians holding a monopoly over knowledge-based practice of medical care, thus impeding interprofessional collaboration. This power gradient prevents nurses from demanding cooperative patient rounding. We surveyed attitudes of nursing staff at our tertiary care community hospital, before and after implementation of a patient-centered interprofessional (hospitalist-nurse) rounding process for patients. There was a substantial improvement in nursing staff satisfaction related to the improved communication (7%-54%, p < 0.001) and rounding (3%-49%, p < 0.001) by hospitalist providers. Patient-centered rounding also positively impacted nursing workflow (5%-56%, p < 0.001), nurses' perceptions of value as a team member (26%-56%, p = 0.018) and their job satisfaction (43%-59%, p = 0.010). Patient-centered rounding positively contributed to transforming the hospitalist-nurse hierarchical model to a team-based collaborative model, thus enhancing interprofessional relationships.

  15. Increasing EHR system usability through standards: Conformance criteria in the HL7 EHR-system functional model.

    PubMed

    Meehan, Rebecca A; Mon, Donald T; Kelly, Kandace M; Rocca, Mitra; Dickinson, Gary; Ritter, John; Johnson, Constance M

    2016-10-01

    Though substantial work has been done on the usability of health information technology, improvements in electronic health record system (EHR) usability have been slow, creating frustration, distrust of EHRs and the use of potentially unsafe work-arounds. Usability standards could be part of the solution for improving EHR usability. EHR system functional requirements and standards have been used successfully in the past to specify system behavior, the criteria of which have been gradually implemented in EHR systems through certification programs and other national health IT strategies. Similarly, functional requirements and standards for usability can help address the multitude of sequelae associated with poor usability. This paper describes the evidence-based functional requirements for usability contained in the Health Level Seven (HL7) EHR System Functional Model, and the benefits of open and voluntary EHR system usability standards. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Experience Transitioning Models and Data at the NOAA Space Weather Prediction Center

    NASA Astrophysics Data System (ADS)

    Berger, Thomas

    2016-07-01

    The NOAA Space Weather Prediction Center has a long history of transitioning research data and models into operations, along with the validation activities this requires. The first stage in this process involves demonstrating that the capability has sufficient value to customers to justify the cost needed to transition it and to run it continuously and reliably in operations. Once the overall value is demonstrated, a substantial effort is then required to develop the operational software from the research codes. The next stage is to implement and test the software and product generation on the operational computers. Finally, effort must be devoted to establishing long-term measures of performance, maintaining the software, and working with forecasters, customers, and researchers to improve the operational capabilities over time. This multi-stage process of identifying, transitioning, and improving operational space weather capabilities will be discussed using recent examples. Plans for future activities will also be described.

  17. Gravity model improvement using GEOS-3 (GEM 9 and 10)

    NASA Technical Reports Server (NTRS)

    Lerch, F. J.; Klosko, S. M.; Laubscher, R. E.; Wagner, C. A.

    1977-01-01

    The use of collocation permitted GEM 9 to be a larger field than previously derived satellite models, GEM 9 having harmonics complete to 20 x 20 with selected higher degree terms. The satellite data set has approximately 840,000 observations, of which 200,000 are laser ranges taken on 9 satellites equipped with retroreflectors. GEM 10 is complete to 22 x 22 with selected higher degree terms out to degree and order 30, amounting to a total of 592 coefficients. Comparisons with surface gravity and altimeter data indicate a substantial improvement in GEM 9 over previous satellite solutions; GEM 9 is in even closer agreement with surface data than the previously published GEM 6 solution, which contained surface gravity. In particular, the free-air gravity anomalies calculated from GEM 9 and a surface gravity solution are in excellent agreement for the high degree terms.

  18. Quantitative properties of clustering within modern microscopic nuclear models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Volya, A.; Tchuvil’sky, Yu. M., E-mail: tchuvl@nucl-th.sinp.msu.ru

    2016-09-15

    A method for studying cluster spectroscopic properties of nuclear fragmentation, such as spectroscopic amplitudes, cluster form factors, and spectroscopic factors, is developed on the basis of modern precision nuclear models that take into account the mixing of large-scale shell-model configurations. Alpha-cluster channels are considered as an example. A mathematical proof of the need for taking into account the channel-wave-function renormalization generated by exchange terms of the antisymmetrization operator (Fliessbach effect) is given. Examples where this effect is confirmed by a high quality of the description of experimental data are presented. By and large, the method in question substantially extends the possibilities for studying clustering phenomena in nuclei and for improving the quality of their description.

  19. Quantifying the economic risks of climate change

    NASA Astrophysics Data System (ADS)

    Diaz, Delavane; Moore, Frances

    2017-11-01

    Understanding the value of reducing greenhouse-gas emissions matters for policy decisions and climate risk management, but quantification is challenging because of the complex interactions and uncertainties in the Earth and human systems, as well as normative ethical considerations. Current modelling approaches use damage functions to parameterize a simplified relationship between climate variables, such as temperature change, and economic losses. Here we review and synthesize the limitations of these damage functions and describe how incorporating impacts, adaptation and vulnerability research advances and empirical findings could substantially improve damage modelling and the robustness of social cost of carbon values produced. We discuss the opportunities and challenges associated with integrating these research advances into cost-benefit integrated assessment models, with guidance for future work.

  20. Selecting among competing models of electro-optic, infrared camera system range performance

    USGS Publications Warehouse

    Nichols, Jonathan M.; Hines, James E.; Nichols, James D.

    2013-01-01

    Range performance is often the key requirement around which electro-optical and infrared camera systems are designed. This work presents an objective framework for evaluating competing range performance models. Model selection based on the Akaike’s Information Criterion (AIC) is presented for the type of data collected during a typical human observer and target identification experiment. These methods are then demonstrated on observer responses to both visible and infrared imagery in which one of three maritime targets was placed at various ranges. We compare the performance of a number of different models, including those appearing previously in the literature. We conclude that our model-based approach offers substantial improvements over the traditional approach to inference, including increased precision and the ability to make predictions for some distances other than the specific set for which experimental trials were conducted.
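
    A hedged sketch of the AIC-based comparison described above: each candidate range-performance model is scored from its maximized log-likelihood and parameter count, and Akaike weights express relative support. The log-likelihood values and parameter counts below are hypothetical, not taken from the paper.

```python
import numpy as np

def aic(log_likelihood: float, n_params: int) -> float:
    """Akaike's Information Criterion: lower indicates a better fit/complexity trade-off."""
    return 2.0 * n_params - 2.0 * log_likelihood

# Hypothetical fits of three competing range-performance models
fits = {"model_A": (-412.3, 3), "model_B": (-405.9, 5), "model_C": (-404.8, 9)}
scores = {name: aic(ll, k) for name, (ll, k) in fits.items()}

# Akaike weights: relative support for each model given the data
deltas = np.array([s - min(scores.values()) for s in scores.values()])
weights = np.exp(-0.5 * deltas) / np.exp(-0.5 * deltas).sum()
for name, w in zip(scores, weights):
    print(name, round(scores[name], 1), round(float(w), 3))
```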

  1. Modules for Experiments in Stellar Astrophysics (MESA): Planets, Oscillations, Rotation, and Massive Stars

    NASA Astrophysics Data System (ADS)

    Paxton, Bill; Cantiello, Matteo; Arras, Phil; Bildsten, Lars; Brown, Edward F.; Dotter, Aaron; Mankovich, Christopher; Montgomery, M. H.; Stello, Dennis; Timmes, F. X.; Townsend, Richard

    2013-09-01

    We substantially update the capabilities of the open source software package Modules for Experiments in Stellar Astrophysics (MESA), and its one-dimensional stellar evolution module, MESA star. Improvements in MESA star's ability to model the evolution of giant planets now extend its applicability down to masses as low as one-tenth that of Jupiter. The dramatic improvement in asteroseismology enabled by the space-based Kepler and CoRoT missions motivates our full coupling of the ADIPLS adiabatic pulsation code with MESA star. This also motivates a numerical recasting of the Ledoux criterion that is more easily implemented when many nuclei are present at non-negligible abundances. This impacts the way in which MESA star calculates semi-convective and thermohaline mixing. We exhibit the evolution of 3-8 M⊙ stars through the end of core He burning, the onset of He thermal pulses, and arrival on the white dwarf cooling sequence. We implement diffusion of angular momentum and chemical abundances that enable calculations of rotating-star models, which we compare thoroughly with earlier work. We introduce a new treatment of radiation-dominated envelopes that allows the uninterrupted evolution of massive stars to core collapse. This enables the generation of new sets of supernova, long gamma-ray burst, and pair-instability progenitor models. We substantially modify the way in which MESA star solves the fully coupled stellar structure and composition equations, and we show how this has improved the scaling of MESA's calculational speed on multi-core processors. Updates to the modules for equation of state, opacity, nuclear reaction rates, and atmospheric boundary conditions are also provided. We describe the MESA Software Development Kit that packages all the required components needed to form a unified, maintained, and well-validated build environment for MESA. We also highlight a few tools developed by the community for rapid visualization of MESA star results.

  2. Improving risk classification of critical illness with biomarkers: a simulation study

    PubMed Central

    Seymour, Christopher W.; Cooke, Colin R.; Wang, Zheyu; Kerr, Kathleen F.; Yealy, Donald M.; Angus, Derek C.; Rea, Thomas D.; Kahn, Jeremy M.; Pepe, Margaret S.

    2012-01-01

    Purpose Optimal triage of patients at risk of critical illness requires accurate risk prediction, yet little data exists on the performance criteria required of a potential biomarker to be clinically useful. Materials and Methods We studied an adult cohort of non-arrest, non-trauma emergency medical services encounters transported to a hospital from 2002–2006. We simulated hypothetical biomarkers increasingly associated with critical illness during hospitalization, and determined the biomarker strength and sample size necessary to improve risk classification beyond a best clinical model. Results Of 57,647 encounters, 3,121 (5.4%) were hospitalized with critical illness and 54,526 (94.6%) without critical illness. The addition of a moderate strength biomarker (odds ratio=3.0 for critical illness) to a clinical model improved discrimination (c-statistic 0.85 vs. 0.8, p<0.01), reclassification (net reclassification improvement=0.15, 95% CI: 0.13, 0.18), and increased the proportion of cases in the highest risk category by +8.6% (95% CI: 7.5, 10.8%). Introducing correlation between the biomarker and physiological variables in the clinical risk score did not modify the results. Statistically significant changes in net reclassification required a sample size of at least 1000 subjects. Conclusions Clinical models for triage of critical illness could be significantly improved by incorporating biomarkers, yet substantial sample sizes and biomarker strength may be required. PMID:23566734
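
    For readers unfamiliar with the net reclassification improvement (NRI) reported above, the sketch below computes a category-based NRI from risk-category assignments under a baseline and an augmented model. The toy arrays are invented and are not the study's cohort.

```python
import numpy as np

def net_reclassification_improvement(old_cat, new_cat, event):
    """Category-based NRI for events and non-events.

    old_cat, new_cat: integer risk categories from the baseline and
    augmented models; event: 1 if critical illness occurred, else 0.
    """
    old_cat, new_cat, event = map(np.asarray, (old_cat, new_cat, event))
    up, down = new_cat > old_cat, new_cat < old_cat
    ev, nonev = event == 1, event == 0
    nri_events = (up & ev).sum() / ev.sum() - (down & ev).sum() / ev.sum()
    nri_nonevents = (down & nonev).sum() / nonev.sum() - (up & nonev).sum() / nonev.sum()
    return nri_events + nri_nonevents

# Hypothetical toy data (not the study's cohort)
old = [0, 1, 1, 2, 0, 2, 1, 0]
new = [1, 1, 2, 2, 0, 1, 2, 0]
evt = [1, 0, 1, 1, 0, 0, 1, 0]
print(net_reclassification_improvement(old, new, evt))
```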

  3. Historical background and design evolution of the transonic aircraft technology supercritical wing

    NASA Technical Reports Server (NTRS)

    Ayers, T. G.; Hallissy, J. B.

    1981-01-01

    Two dimensional wind tunnel test results obtained for supercritical airfoils indicated that substantial improvements in aircraft performance at high subsonic speeds could be achieved by shaping the airfoil to improve the supercritical flow above the upper surface. Significant increases in the drag divergence Mach number, the maximum lift coefficient for buffet onset, and the Mach number for buffet onset at a given lift coefficient were demonstrated for the supercritical airfoil, as compared with a NACA 6 series airfoil of comparable thickness. These trends were corroborated by results from three dimensional wind tunnel and flight tests. Because these indicated extensions of the buffet boundaries could provide significant improvements in the maneuverability of a fighter airplane, an exploratory wind tunnel investigation was initiated which demonstrated that significant aerodynamic improvements could be achieved from the direct substitution of a supercritical airfoil on a variable wing sweep multimission airplane model.

  4. Upgrades for the CMS simulation

    DOE PAGES

    Lange, D. J.; Hildreth, M.; Ivantchenko, V. N.; ...

    2015-05-22

    Over the past several years, the CMS experiment has made significant changes to its detector simulation application. The geometry has been generalized to include modifications being made to the CMS detector for 2015 operations, as well as model improvements to the simulation geometry of the current CMS detector and the implementation of a number of approved and possible future detector configurations. These include both completely new tracker and calorimetry systems. We have completed the transition to Geant4 version 10, and we have made significant progress in reducing the CPU resources required to run our Geant4 simulation. These gains have been achieved through both technical improvements and numerical techniques. Substantial speed improvements have been achieved without changing the physics validation benchmarks that the experiment uses to validate our simulation application for use in production. We discuss the methods that we implemented and the corresponding demonstrated performance improvements deployed for our 2015 simulation application.

  5. Multitrait, Random Regression, or Simple Repeatability Model in High-Throughput Phenotyping Data Improve Genomic Prediction for Wheat Grain Yield.

    PubMed

    Sun, Jin; Rutkoski, Jessica E; Poland, Jesse A; Crossa, José; Jannink, Jean-Luc; Sorrells, Mark E

    2017-07-01

    High-throughput phenotyping (HTP) platforms can be used to measure traits that are genetically correlated with wheat (Triticum aestivum L.) grain yield across time. Incorporating such secondary traits in the multivariate pedigree and genomic prediction models would be desirable to improve indirect selection for grain yield. In this study, we evaluated three statistical models, simple repeatability (SR), multitrait (MT), and random regression (RR), for the longitudinal data of secondary traits and compared the impact of the proposed models for secondary traits on their predictive abilities for grain yield. Grain yield and secondary traits, canopy temperature (CT) and normalized difference vegetation index (NDVI), were collected in five diverse environments for 557 wheat lines with available pedigree and genomic information. A two-stage analysis was applied for pedigree and genomic selection (GS). First, secondary traits were fitted by SR, MT, or RR models, separately, within each environment. Then, best linear unbiased predictions (BLUPs) of secondary traits from the above models were used in the multivariate prediction models to compare predictive abilities for grain yield. Predictive ability was substantially improved by 70%, on average, from multivariate pedigree and genomic models when including secondary traits in both training and test populations. Additionally, (i) predictive abilities slightly varied for MT, RR, or SR models in this data set, (ii) results indicated that including BLUPs of secondary traits from the MT model was the best in severe drought, and (iii) the RR model was slightly better than SR and MT models under drought environment. Copyright © 2017 Crop Science Society of America.
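
    As a loose illustration of the second prediction stage only (not the authors' pipeline), the sketch below uses ridge regression as a simple stand-in for genomic prediction, comparing a marker-only predictor with one augmented by a BLUP of a secondary trait such as NDVI. The marker matrix, trait values, and effect sizes are simulated.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_lines, n_markers = 200, 500

markers = rng.choice([0.0, 1.0, 2.0], size=(n_lines, n_markers))   # genotypes
ndvi_blup = rng.normal(size=(n_lines, 1))                          # secondary-trait BLUP
yield_phenotype = markers[:, :20].sum(axis=1) + 2.0 * ndvi_blup[:, 0] + rng.normal(size=n_lines)

# Compare predictive ability with and without the secondary trait in the predictor matrix
for X in (markers, np.hstack([markers, ndvi_blup])):
    r = cross_val_score(Ridge(alpha=10.0), X, yield_phenotype, cv=5, scoring="r2")
    print(X.shape[1], "predictors, mean CV r2 =", round(float(r.mean()), 3))
```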

  6. Evaluating Parametrization Protocols for Hydration Free Energy Calculations with the AMOEBA Polarizable Force Field.

    PubMed

    Bradshaw, Richard T; Essex, Jonathan W

    2016-08-09

    Hydration free energy (HFE) calculations are often used to assess the performance of biomolecular force fields and the quality of assigned parameters. The AMOEBA polarizable force field moves beyond traditional pairwise additive models of electrostatics and may be expected to improve upon predictions of thermodynamic quantities such as HFEs over and above fixed-point-charge models. The recent SAMPL4 challenge evaluated the AMOEBA polarizable force field in this regard but showed substantially worse results than those using the fixed-point-charge GAFF model. Starting with a set of automatically generated AMOEBA parameters for the SAMPL4 data set, we evaluate the cumulative effects of a series of incremental improvements in parametrization protocol, including both solute and solvent model changes. Ultimately, the optimized AMOEBA parameters give a set of results that are not statistically significantly different from those of GAFF in terms of signed and unsigned error metrics. This allows us to propose a number of guidelines for new molecule parameter derivation with AMOEBA, which we expect to have benefits for a range of biomolecular simulation applications such as protein-ligand binding studies.

  7. Transitions to improved confinement regimes induced by changes in heating in zero-dimensional models for tokamak plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, H.; Chapman, S. C.; Max Planck Institute for the Physics of Complex Systems, Dresden

    2014-06-15

    It is shown that rapid substantial changes in heating rate can induce transitions to improved energy confinement regimes in zero-dimensional models for tokamak plasma phenomenology. We examine for the first time the effect of step changes in heating rate in the models of Kim and Diamond [Phys. Rev. Lett. 90, 185006 (2003)] and Malkov and Diamond [Phys. Plasmas 16, 012504 (2009)], which nonlinearly couple the evolving temperature gradient, micro-turbulence, and a mesoscale flow; and in the extension of Zhu et al. [Phys. Plasmas 20, 042302 (2013)], which couples to a second mesoscale flow component. The temperature gradient rises, as does the confinement time defined by analogy with the fusion context, while micro-turbulence is suppressed. This outcome is robust against variation of heating rise time and against introduction of an additional variable into the model. It is also demonstrated that oscillating changes in heating rate can drive the level of micro-turbulence through a period-doubling path to chaos, where the amplitude of the oscillatory component of the heating rate is the control parameter.

  8. Cognitive Performance in Older Adults with Stable Heart Failure: Longitudinal Evidence for Stability and Improvement

    PubMed Central

    Alosco, Michael L.; Garcia, Sarah; Spitznagel, Mary Beth; van Dulmen, Manfred; Cohen, Ronald; Sweet, Lawrence H.; Josephson, Richard; Hughes, Joel; Rosneck, Jim; Gunstad, John

    2013-01-01

    Cognitive impairment is prevalent in heart failure (HF), though substantial variability in the pattern of cognitive impairment is found across studies. To clarify the nature of cognitive impairment in HF, we examined longitudinal trajectories across multiple domains of cognition in HF patients using latent growth class modeling. 115 HF patients completed a neuropsychological battery at baseline, 3 months and 12 months. Participants also completed the Beck Depression Inventory-II (BDI-II). Latent class growth analyses revealed a three-class model for attention/executive function, a four-class model for memory, and a three-class model for language. The slope for attention/executive function and language remained stable, while improvements were noted in memory performance. Education and BDI-II significantly predicted the intercept for attention/executive function and language abilities. The BDI-II also predicted baseline memory. The current findings suggest that multiple performance-based classes of neuropsychological test performance exist within cognitive domains, though case-controlled prospective studies with extended follow-ups are needed to fully elucidate changes and predictors of cognitive function in HF. PMID:23906182

  9. Enzyme replacement therapy on hypophosphatasia mouse model

    PubMed Central

    Montaño, Adriana M.; Shimada, Tsutomu; Sly, William S.

    2013-01-01

    Hypophosphatasia (HPP) is an inborn error of metabolism caused by deficiency of the tissue-nonspecific alkaline phosphatase (TNSALP), resulting in a defect of bone mineralization. Natural substrates for this ectoenzyme accumulate extracellularly, including inorganic pyrophosphate (PPi), an inhibitor of mineralization, and pyridoxal 5-phosphate (PLP), a co-factor form of vitamin B6. Enzyme replacement therapy (ERT) for HPP with functional TNSALP is one of the therapeutic options. The C-terminal-anchorless human recombinant TNSALP derived from Chinese hamster ovary cell lines was purified. TNSALP-null mice (Akp2−/−), an infantile model of HPP, were treated from birth using TNSALP and a vitamin B6 diet. Long-term efficacy studies of ERT consisted of subcutaneous or intravenous injections every 3 days until 28 days of age (dose 20 U/g) and subsequently intravenous injections every 3 days for 6 months (dose 10 U/g). We assessed the therapeutic effect by growth and survival rates, fertility, skeletal manifestations, and radiographic and pathological findings. Treated Akp2−/− mice grew normally until 4 weeks and appeared well with minimal skeletal abnormality as well as absence of epilepsy, compared with untreated mice, which died by 3 weeks of age. The prognosis of TNSALP-treated Akp2−/− mice was improved substantially: 1) prolonged life span over 6 months, 2) improved growth, and 3) normal fertility. After 6 months of treatment, we found moderate hypomineralization with abnormal proliferative chondrocytes in the growth plate and articular cartilage. In conclusion, ERT with human native TNSALP substantially improves clinical manifestations in Akp2−/− mice, suggesting that ERT with anchorless TNSALP is also a potential therapy for HPP. PMID:23978959

  10. Orexin gene therapy restores the timing and maintenance of wakefulness in narcoleptic mice.

    PubMed

    Kantor, Sandor; Mochizuki, Takatoshi; Lops, Stefan N; Ko, Brian; Clain, Elizabeth; Clark, Erika; Yamamoto, Mihoko; Scammell, Thomas E

    2013-08-01

    Narcolepsy is caused by selective loss of the orexin/hypocretin-producing neurons of the hypothalamus. For patients with narcolepsy, chronic sleepiness is often the most disabling symptom, but current therapies rarely normalize alertness and do not address the underlying orexin deficiency. We hypothesized that the sleepiness of narcolepsy would substantially improve if orexin signaling were restored in specific brain regions at appropriate times of day. We used gene therapy to restore orexin signaling in a mouse model of narcolepsy. In these Atx mice, expression of a toxic protein (ataxin-3) selectively kills the orexin neurons. To induce ectopic expression of the orexin neuropeptides, we microinjected an adeno-associated viral vector coding for prepro-orexin plus a red fluorescence protein (AAV-orexin) into the mediobasal hypothalamus of Atx and wild-type mice. Control mice received an AAV coding only for red fluorescence protein. Two weeks later, we recorded sleep/wake behavior, locomotor activity, and body temperature and examined the patterns of orexin expression. Atx mice rescued with AAV-orexin produced long bouts of wakefulness and had a normal diurnal pattern of arousal, with the longest bouts of wake and the highest amounts of locomotor activity in the first hours of the night. In addition, AAV-orexin improved the timing of rapid eye movement sleep and the consolidation of nonrapid eye movement sleep in Atx mice. These substantial improvements in sleepiness and other symptoms of narcolepsy demonstrate the effectiveness of orexin gene therapy in a mouse model of narcolepsy. Additional work is needed to optimize this approach, but in time, AAV-orexin could become a useful therapeutic option for patients with narcolepsy.

  11. A simple lightning assimilation technique for improving ...

    EPA Pesticide Factsheets

    Convective rainfall is often a large source of error in retrospective modeling applications. In particular, positive rainfall biases commonly exist during summer months due to overactive convective parameterizations. In this study, lightning assimilation was applied in the Kain-Fritsch (KF) convective scheme to improve retrospective simulations using the Weather Research and Forecasting (WRF) model. The assimilation method has a straightforward approach: force KF deep convection where lightning is observed and, optionally, suppress deep convection where lightning is absent. WRF simulations were made with and without lightning assimilation over the continental United States for July 2012, July 2013, and January 2013. The simulations were evaluated against NCEP stage-IV precipitation data and MADIS near-surface meteorological observations. In general, the use of lightning assimilation considerably improves the simulation of summertime rainfall. For example, the July 2012 monthly averaged bias of 6 h accumulated rainfall is reduced from 0.54 to 0.07 mm and the spatial correlation is increased from 0.21 to 0.43 when lightning assimilation is used. Statistical measures of near-surface meteorological variables also are improved. Consistent improvements also are seen for the July 2013 case. These results suggest that this lightning assimilation technique has the potential to substantially improve simulation of warm-season rainfall in retrospective WRF applications.

  12. A Simple Lightning Assimilation Technique For Improving ...

    EPA Pesticide Factsheets

    Convective rainfall is often a large source of error in retrospective modeling applications. In particular, positive rainfall biases commonly exist during summer months due to overactive convective parameterizations. In this study, lightning assimilation was applied in the Kain-Fritsch (KF) convective scheme to improve retrospective simulations using the Weather Research and Forecasting (WRF) model. The assimilation method has a straightforward approach: Force KF deep convection where lightning is observed and, optionally, suppress deep convection where lightning is absent. WRF simulations were made with and without lightning assimilation over the continental United States for July 2012, July 2013, and January 2013. The simulations were evaluated against NCEP stage-IV precipitation data and MADIS near-surface meteorological observations. In general, the use of lightning assimilation considerably improves the simulation of summertime rainfall. For example, the July 2012 monthly-averaged bias of 6-h accumulated rainfall is reduced from 0.54 mm to 0.07 mm and the spatial correlation is increased from 0.21 to 0.43 when lightning assimilation is used. Statistical measures of near-surface meteorological variables also are improved. Consistent improvements also are seen for the July 2013 case. These results suggest that this lightning assimilation technique has the potential to substantially improve simulation of warm-season rainfall in retrospective WRF applications.
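
    The assimilation rule described in these two records amounts to a per-grid-cell switch on the convective trigger. The sketch below is only a schematic of that logic on a toy grid; the function, variable names, and threshold are assumptions for illustration, not WRF or KF code.

```python
import numpy as np

def convection_trigger(lightning_flashes: np.ndarray,
                       kf_would_trigger: np.ndarray,
                       suppress_where_no_lightning: bool = True) -> np.ndarray:
    """Schematic lightning-assimilation switch for a convective scheme.

    Force deep convection wherever lightning is observed; optionally
    suppress it where no lightning was observed.
    """
    lightning = lightning_flashes > 0
    trigger = kf_would_trigger | lightning          # force where lightning observed
    if suppress_where_no_lightning:
        trigger = trigger & lightning               # suppress elsewhere
    return trigger

# Toy 3x3 grid of flash counts and the scheme's own trigger decision
flashes = np.array([[0, 2, 0], [0, 0, 5], [1, 0, 0]])
kf_trigger = np.array([[True, False, True], [False, False, False], [False, True, False]])
print(convection_trigger(flashes, kf_trigger))
```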

  13. Improved Land Use and Leaf Area Index Enhances WRF-3DVAR Satellite Radiance Assimilation: A Case Study Focusing on Rainfall Simulation in the Shule River Basin during July 2013

    NASA Astrophysics Data System (ADS)

    Yang, Junhua; Ji, Zhenming; Chen, Deliang; Kang, Shichang; Fu, Congshen; Duan, Keqin; Shen, Miaogen

    2018-06-01

    The application of satellite radiance assimilation can improve the simulation of precipitation by numerical weather prediction models. However, substantial quantities of satellite data, especially those derived from low-level (surface-sensitive) channels, are rejected for use because of the difficulty in realistically modeling land surface emissivity and energy budgets. Here, we used an improved land use and leaf area index (LAI) dataset in the WRF-3DVAR assimilation system to explore the benefit of higher-quality land surface information for rainfall simulation over the Shule River Basin in the northeastern Tibetan Plateau as a case study. The results for July 2013 show that, for low-level channels (e.g., channel 3), the underestimation of brightness temperature in the original simulation was largely removed by more realistic land surface information. In addition, more satellite radiance data could be used in the assimilation, because the more realistic land use and LAI data allowed additional observations to pass the deviation test, which resulted in improved initial driving fields and better simulation of temperature, relative humidity, vertical convection, and cumulative precipitation.

  14. Can a virtual reality surgical simulation training provide a self-driven and mentor-free skills learning? Investigation of the practical influence of the performance metrics from the virtual reality robotic surgery simulator on the skill learning and associated cognitive workloads.

    PubMed

    Lee, Gyusung I; Lee, Mija R

    2018-01-01

    While it is often claimed that a virtual reality (VR) training system can offer self-directed and mentor-free skill learning using the system's performance metrics (PM), no studies have yet provided evidence-based confirmation. This experimental study investigated the extent to which trainees achieved self-directed learning with a current VR simulator and whether additional mentoring improved skill learning and skill transfer and affected cognitive workloads in robotic surgery simulation training. Thirty-two surgical trainees were randomly assigned to either the Control-Group (CG) or Experiment-Group (EG). While the CG participants reviewed the PM at their discretion, the EG participants had explanations about PM and instructions on how to improve scores. Each subject completed a 5-week training using four simulation tasks. Pre- and post-training data were collected using both a simulator and robot. Peri-training data were collected after each session. Skill learning, time spent on PM (TPM), and cognitive workloads were compared between groups. After the simulation training, the CG showed substantially lower simulation task scores (82.9 ± 6.0) than the EG (93.2 ± 4.8). Both groups demonstrated improved performance on the physical model tasks with the actual robot, but the EG had a greater improvement in two tasks. The EG exhibited lower global mental workload/distress, higher engagement, and a better understanding regarding using PM to improve performance. The EG's TPM was initially long but substantially shortened as the group became familiar with PM. Our study demonstrated that the current VR simulator offered limited support for self-directed skill learning and that additional mentoring still played an important role in improving robotic surgery simulation training.

  15. Advanced relativistic VLBI model for geodesy

    NASA Astrophysics Data System (ADS)

    Soffel, Michael; Kopeikin, Sergei; Han, Wen-Biao

    2017-07-01

    Our present relativistic part of the geodetic VLBI model for Earthbound antennas is a consensus model which is considered as a standard for processing high-precision VLBI observations. It was created as a compromise between a variety of relativistic VLBI models proposed by different authors as documented in the IERS Conventions 2010. The accuracy of the consensus model is in the picosecond range for the group delay, but this is not sufficient for current geodetic purposes. This paper provides a fully documented derivation of a new relativistic model having an accuracy substantially higher than one picosecond and based upon a well accepted formalism of relativistic celestial mechanics, astrometry and geodesy. Our new model fully confirms the consensus model at the picosecond level and in several respects goes well beyond it. More specifically, terms related to the acceleration of the geocenter are considered and kept in the model, the gravitational time-delay due to a massive body (planet, Sun, etc.) with arbitrary mass and spin-multipole moments is derived taking into account the motion of the body, and a new formalism for the time-delay problem of radio sources located at finite distance from VLBI stations is presented. Thus, the paper presents a substantially elaborated theoretical justification of the consensus model and its significant extension that allows researchers to make concrete estimates of the magnitude of residual terms of this model for any conceivable configuration of the source of light, massive bodies, and VLBI stations. The largest terms in the relativistic time delay which can affect the current VLBI observations are from the quadrupole and the angular momentum of the gravitating bodies that are known from the literature. These terms should be included in the new geodetic VLBI model for improving its consistency.

  16. Promoting cancer screening within the patient centered medical home.

    PubMed

    Sarfaty, Mona; Wender, Richard; Smith, Robert

    2011-01-01

    While consensus has grown that primary care is the essential access point in a high-performing health care system, the current model of primary care underperforms in both chronic disease management and prevention. The Patient Centered Medical Home model (PCMH) is at the center of efforts to reinvent primary care practice, and is regarded as the most promising approach to addressing the burden of chronic disease, improving health outcomes, and reducing health spending. However, the potential for the medical home to improve the delivery of cancer screening (and preventive services in general) has received limited attention in both conceptualization and practice. Medical home demonstrations to date have included few evidence-based preventive services in their outcome measures, and few have evaluated the effect of different payment models. Decreasing use of hospitals and emergency rooms and an emphasis on improving chronic care represent improvements in effective delivery of healthcare, but leave opportunities for reducing the burden of cancer untouched. Data confirm that what does or does not happen in the primary care setting has a substantial impact on cancer outcomes. Insofar as cancer is the leading cause of death before age 80, the PCMH model must prioritize adherence to cancer screening according to recommended guidelines, and systems, financial incentives, and reimbursements must be aligned to achieve that goal. This article explores capacities that are needed in the medical home model to facilitate the integration of cancer screening and other preventive services. These capacities include improved patient access and communication, health risk assessments, periodic preventive health exams, use of registries that store cancer risk information and screening history, ability to track and follow up on tests and referrals, feedback on performance, and payment models that reward cancer screening. Copyright © 2011 American Cancer Society, Inc.

  17. Changes in tropical precipitation cluster size distributions under global warming

    NASA Astrophysics Data System (ADS)

    Neelin, J. D.; Quinn, K. M.

    2016-12-01

    The total amount of precipitation integrated across a tropical storm or other precipitation feature (contiguous clusters of precipitation exceeding a minimum rain rate) is a useful measure of the aggregate size of the disturbance. To establish baseline behavior in current climate, the probability distributions of cluster sizes from multiple satellite retrievals and National Centers for Environmental Prediction (NCEP) reanalysis are compared to those from Coupled Model Intercomparison Project (CMIP5) models and the Geophysical Fluid Dynamics Laboratory high-resolution atmospheric model (HIRAM-360 and -180). With the caveat that a minimum rain rate threshold is important in the models (which tend to overproduce low rain rates), the models agree well with observations in leading properties. In particular, scale-free power law ranges in which the probability drops slowly with increasing cluster size are well modeled, followed by a rapid drop in probability of the largest clusters above a cutoff scale. Under the RCP 8.5 global warming scenario, the models indicate substantial increases in probability (up to an order of magnitude) of the largest clusters by the end of century. For models with continuous time series of high resolution output, there is substantial spread in when these probability increases for the largest precipitation clusters should be detectable, ranging from detectable within the observational period to statistically significant trends emerging only in the second half of the century. Examination of NCEP reanalysis and the SSMI/SSMIS series of satellite retrievals from 1979 to present does not yield reliable evidence of trends at this time. The results suggest improvements in inter-satellite calibration of the SSMI/SSMIS retrievals could aid future detection.
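
    A minimal sketch of the cluster-size measure defined above: contiguous grid cells exceeding a minimum rain rate are grouped with scipy.ndimage.label, and the precipitation integrated over each cluster is returned. The rain-rate field and threshold here are made up for illustration.

```python
import numpy as np
from scipy import ndimage

def cluster_sizes(rain_rate: np.ndarray, min_rate: float = 0.5) -> np.ndarray:
    """Total precipitation integrated over each contiguous cluster
    of grid cells whose rain rate exceeds min_rate."""
    mask = rain_rate > min_rate
    labels, n_clusters = ndimage.label(mask)
    if n_clusters == 0:
        return np.array([])
    return ndimage.sum(rain_rate, labels, index=np.arange(1, n_clusters + 1))

# Hypothetical rain-rate field (mm/h) on a small grid
rng = np.random.default_rng(1)
field = rng.gamma(shape=0.5, scale=2.0, size=(50, 50))
sizes = cluster_sizes(field)
print(len(sizes), "clusters; largest integrated size:", round(float(sizes.max()), 1))
```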

  18. Impediments to predicting site response: Seismic property estimation and modeling simplifications

    USGS Publications Warehouse

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Guzina, B.B.

    2009-01-01

    We compare estimates of the empirical transfer function (ETF) to the plane SH-wave theoretical transfer function (TTF) within a laterally constant medium for invasive and noninvasive estimates of the seismic shear-wave slownesses at 13 Kiban-Kyoshin network stations throughout Japan. The difference between the ETF and either of the TTFs is substantially larger than the difference between the two TTFs computed from different estimates of the seismic properties. We show that the plane SH-wave TTF through a laterally homogeneous medium at vertical incidence inadequately models observed amplifications at most sites for both slowness estimates, obtained via downhole measurements and the spectral analysis of surface waves. Strategies to improve the predictions can be separated into two broad categories: improving the measurement of soil properties and improving the theory that maps the 1D soil profile onto spectral amplification. Using an example site where the 1D plane SH-wave formulation poorly predicts the ETF, we find a more satisfactory fit to the ETF by modeling the full wavefield and incorporating spatially correlated variability of the seismic properties. We conclude that our ability to model the observed site response transfer function is limited largely by the assumptions of the theoretical formulation rather than the uncertainty of the soil property estimates.

  19. Development of a Melanoma Risk Prediction Model Incorporating MC1R Genotype and Indoor Tanning Exposure: Impact of Mole Phenotype on Model Performance

    PubMed Central

    Penn, Lauren A.; Qian, Meng; Zhang, Enhan; Ng, Elise; Shao, Yongzhao; Berwick, Marianne; Lazovich, DeAnn; Polsky, David

    2014-01-01

    Background Identifying individuals at increased risk for melanoma could potentially improve public health through targeted surveillance and early detection. Studies have separately demonstrated significant associations between melanoma risk, melanocortin-1 receptor (MC1R) polymorphisms, and indoor ultraviolet (UV) light exposure. Existing melanoma risk prediction models do not include these factors; therefore, we investigated their potential to improve the performance of a risk model. Methods Using 875 melanoma cases and 765 controls from the population-based Minnesota Skin Health Study we compared the predictive ability of a clinical melanoma risk model (Model A) to an enhanced model (Model F) using receiver operating characteristic (ROC) curves. Model A used self-reported conventional risk factors including mole phenotype categorized as “none”, “few”, “some” or “many” moles. Model F added MC1R genotype and measures of indoor and outdoor UV exposure to Model A. We also assessed the predictive ability of these models in subgroups stratified by mole phenotype (e.g. nevus-resistant (“none” and “few” moles) and nevus-prone (“some” and “many” moles)). Results Model A (the reference model) yielded an area under the ROC curve (AUC) of 0.72 (95% CI = 0.69, 0.74). Model F was improved with an AUC = 0.74 (95% CI = 0.71–0.76, p<0.01). We also observed substantial variations in the AUCs of Models A & F when examined in the nevus-prone and nevus-resistant subgroups. Conclusions These results demonstrate that adding genotypic information and environmental exposure data can increase the predictive ability of a clinical melanoma risk model, especially among nevus-prone individuals. PMID:25003831
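
    A hedged sketch of the ROC comparison reported above: a baseline logistic model and one with an added covariate (standing in for MC1R genotype and indoor UV exposure) are compared by cross-validated AUC. The data, effect sizes, and variable names are simulated, not the study's.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 1500
moles = rng.integers(0, 4, n)              # mole phenotype category (0-3), simulated
sun_sensitivity = rng.normal(size=n)       # conventional risk factor, simulated
extra = rng.normal(size=n)                 # stand-in for MC1R / indoor UV, simulated

logit = -2.0 + 0.6 * moles + 0.5 * sun_sensitivity + 0.7 * extra
melanoma = rng.random(n) < 1 / (1 + np.exp(-logit))

X_base = np.column_stack([moles, sun_sensitivity])
X_full = np.column_stack([moles, sun_sensitivity, extra])

for name, X in [("baseline", X_base), ("enhanced", X_full)]:
    prob = cross_val_predict(LogisticRegression(max_iter=1000), X, melanoma,
                             cv=5, method="predict_proba")[:, 1]
    print(name, "AUC =", round(roc_auc_score(melanoma, prob), 3))
```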

  20. Robust Combining of Disparate Classifiers Through Order Statistics

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Ghosh, Joydeep

    2001-01-01

    Integrating the outputs of multiple classifiers via combiners or meta-learners has led to substantial improvements in several difficult pattern recognition problems. In this article we investigate a family of combiners based on order statistics, for robust handling of situations where there are large discrepancies in performance of individual classifiers. Based on a mathematical modeling of how the decision boundaries are affected by order statistic combiners, we derive expressions for the reductions in error expected when simple output combination methods based on the the median, the maximum and in general, the ith order statistic, are used. Furthermore, we analyze the trim and spread combiners, both based on linear combinations of the ordered classifier outputs, and show that in the presence of uneven classifier performance, they often provide substantial gains over both linear and simple order statistics combiners. Experimental results on both real world data and standard public domain data sets corroborate these findings.
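
    A minimal sketch of the combiners analyzed above: given the class-posterior outputs of several classifiers, the median, maximum, i-th order statistic, and a trimmed (linear order-statistic) combination are computed per class. The example posteriors are invented, with one deliberately miscalibrated classifier to illustrate the robustness argument.

```python
import numpy as np

def order_statistic_combiner(outputs: np.ndarray, i: int) -> np.ndarray:
    """Combine classifier outputs by the i-th order statistic (1-based).

    outputs has shape (n_classifiers, n_classes); the result has shape (n_classes,).
    """
    return np.sort(outputs, axis=0)[i - 1]

def trimmed_mean_combiner(outputs: np.ndarray, trim: int) -> np.ndarray:
    """Drop the `trim` smallest and largest outputs per class, then average."""
    ordered = np.sort(outputs, axis=0)
    return ordered[trim:ordered.shape[0] - trim].mean(axis=0)

# Hypothetical posteriors from 5 classifiers over 3 classes
P = np.array([[0.70, 0.20, 0.10],
              [0.60, 0.30, 0.10],
              [0.10, 0.80, 0.10],   # one badly miscalibrated classifier
              [0.65, 0.25, 0.10],
              [0.55, 0.35, 0.10]])

print("median :", np.median(P, axis=0))
print("max    :", P.max(axis=0))
print("3rd OS :", order_statistic_combiner(P, 3))
print("trimmed:", trimmed_mean_combiner(P, trim=1))
```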

  1. Photostability can be significantly modulated by molecular packing in glasses

    DOE Data Explorer

    Ediger, Mark [University of Wisconsin-Madison; de Pablo, Juan [University of Chicago; Anthony, Lucas [University of Chicago; Qiu, Yue [University of Chicago

    2016-04-10

    While previous work has demonstrated that molecular packing in organic crystals can strongly influence photochemical stability, efforts to tune photostability in amorphous materials have shown much smaller effects. Here we show that physical vapor deposition can substantially improve the photostability of organic glasses. Disperse Orange 37 (DO37), an azobenzene derivative, is studied as a model system. Photostability is assessed through changes in the density and molecular orientation of glassy thin films during light irradiation. By optimizing the substrate temperature used for deposition, we can increase photostability by a factor of 50 relative to the liquid-cooled glass. Photostability correlates with glass density, with density increases of up to 1.3%. Coarse-grained molecular simulations, which mimic glass preparation and the photoisomerization reaction, also indicate that glasses with higher density have substantially increased photostability. These results provide insights that may assist in the design of organic photovoltaics and light emission devices with longer lifetimes.

  2. JACIE accreditation for blood and marrow transplantation: past, present and future directions of an international model for healthcare quality improvement

    PubMed Central

    Snowden, J A; McGrath, E; Duarte, R F; Saccardi, R; Orchard, K; Worel, N; Kuball, J; Chabannon, C; Mohty, M

    2017-01-01

    Blood and marrow transplantation (BMT) is a complex and evolving medical speciality that makes substantial demands on healthcare resources. To meet a professional responsibility to both patients and public health services, the European Society for Blood and Marrow Transplantation (EBMT) initiated and developed the Joint Accreditation Committee of the International Society for Cellular Therapy and EBMT—better known by the acronym, JACIE. Since its inception, JACIE has performed over 530 voluntary accreditation inspections (62% first time; 38% reaccreditation) in 25 countries, representing 40% of transplant centres in Europe. As well as widespread professional acceptance, JACIE has become incorporated into the regulatory framework for delivery of BMT and other haematopoietic cellular therapies in several countries. In recent years, JACIE has been validated using the EBMT registry as an effective means of quality improvement with a substantial positive impact on survival outcomes. Future directions include development of Europe-wide risk-adjusted outcome benchmarking through the EBMT registry and further extension beyond Europe, including goals to facilitate access for BMT programmes in low- and middle-income economies (LMIEs) via a ‘first-step’ process. PMID:28346416

  3. The role of ion exchange in the passivation of In(Zn)P nanocrystals with ZnS

    PubMed Central

    Cho, Deok-Yong; Xi, Lifei; Boothroyd, Chris; Kardynal, Beata; Lam, Yeng Ming

    2016-01-01

    We have investigated the chemical state of In(Zn)P/ZnS core/shell nanocrystals (NCs) for color conversion applications using hard X-ray absorption spectroscopy (XAS) and photoluminescence excitation (PLE). Analyses of the edge energies as well as the X-ray absorption fine structure (XAFS) reveal that the Zn2+ ions from ZnS remain in the shell while the S2− ions penetrate into the core at an early stage of the ZnS deposition. It is further demonstrated that for short growth times, the ZnS shell coverage on the core was incomplete, whereas the coverage improved gradually as the shell deposition time increased. Together with evidence from PLE spectra, where there is a strong indication of the presence of P vacancies, this suggests that the core-shell interface in the In(Zn)P/ZnS NCs is subject to substantial atomic exchange and that detailed models for the shell structure beyond simple layer coverage are needed. This substantial atomic exchange is very likely the reason for the improved photoluminescence behavior of the core-shell particles compared to In(Zn)P-only NCs, as S can passivate the NC surfaces. PMID:26972936

  4. Progress and Challenges in Subseasonal Prediction

    NASA Technical Reports Server (NTRS)

    Schubert, Siegfried

    2003-01-01

    While substantial advances have occurred over the last few decades in both weather and seasonal prediction, progress in improving predictions on subseasonal time scales (approximately 2 weeks to 2 months) has been slow. In this talk I will highlight some of the recent progress that has been made to improve forecasts on subseasonal time scales and outline the challenges that we face both from an observational and modeling perspective. The talk will be based primarily on the results and conclusions of a recent NASA-sponsored workshop that focused on the subseasonal prediction problem. One of the key conclusions of that workshop was that there is compelling evidence for predictability at forecast lead times substantially longer than two weeks, and that much of that predictability is currently untapped. Tropical diabatic heating and soil wetness were singled out as particularly important processes affecting predictability on these time scales. Predictability was also linked to various low-frequency atmospheric phenomena such as the annular modes in high latitudes (including their connections to the stratosphere), the Pacific/North American pattern, and the Madden-Julian Oscillation. I will end the talk by summarizing the recommendations and plans that have been put forward for accelerating progress on the subseasonal prediction problem.

  5. Application of an empirical saturation rule to TGLF to unify low-k and high-k turbulence dominated regimes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jian, Xiang; Chan, Vincent S.; Chen, Jiale

    Here, we propose a phenomenological turbulence saturation model and apply it to the TGLF turbulence transport model, which captures the physics of interaction between low-k and high-k turbulence consistent with the multi-scale gyro-kinetic simulation result. The new model, TGLF-VX, is tested with three discharges from the DIII-D and EAST tokamaks, which cover both low-k and high-k turbulence dominated regimes. It is found that the profile match can be substantially improved over previous models when evolving Te, Ti and ne simultaneously. Good agreement for all three discharges is obtained with one fixed parameter in the model when taking experimental uncertainties into consideration. Finally, TGLF-VX is applied to explore the sensitivity of the predicted CFETR steady-state performance to different transport models. Our result shows that a scenario using only RF auxiliary heating could be significantly affected.

  6. Application of an empirical saturation rule to TGLF to unify low-k and high-k turbulence dominated regimes

    DOE PAGES

    Jian, Xiang; Chan, Vincent S.; Chen, Jiale; ...

    2017-09-28

    Here, we propose a phenomenological turbulence saturation model and apply it to the TGLF turbulence transport model, which captures the physics of interaction between low-k and high-k turbulence consistent with the multi-scale gyro-kinetic simulation result. The new model, TGLF-VX, is tested with three discharges from the DIII-D and EAST tokamaks, which cover both low-k and high-k turbulence dominated regimes. It is found that the profile match can be substantially improved over previous models when evolving Te, Ti and ne simultaneously. Good agreement for all three discharges is obtained with one fixed parameter in the model when taking experimental uncertainties into consideration. Finally, TGLF-VX is applied to explore the sensitivity of the predicted CFETR steady-state performance to different transport models. Our result shows that a scenario using only RF auxiliary heating could be significantly affected.

  7. Milestone-specific, Observed data points for evaluating levels of performance (MODEL) assessment strategy for anesthesiology residency programs.

    PubMed

    Nagy, Christopher J; Fitzgerald, Brian M; Kraus, Gregory P

    2014-01-01

    Anesthesiology residency programs will be expected to have Milestones-based evaluation systems in place by July 2014 as part of the Next Accreditation System. The San Antonio Uniformed Services Health Education Consortium (SAUSHEC) anesthesiology residency program developed and implemented a Milestones-based feedback and evaluation system a year ahead of schedule. It has been named the Milestone-specific, Observed Data points for Evaluating Levels of performance (MODEL) assessment strategy. The "MODEL Menu" and the "MODEL Blueprint" are tools that other anesthesiology residency programs can use in developing their own Milestones-based feedback and evaluation systems prior to ACGME-required implementation. Data from our early experience with the streamlined MODEL blueprint assessment strategy showed substantially improved faculty compliance with reporting requirements. The MODEL assessment strategy provides programs with a workable assessment method for residents, and important Milestones data points to programs for ACGME reporting.

  8. Prioritization of in silico models and molecular descriptors for the assessment of ready biodegradability.

    PubMed

    Fernández, Alberto; Rallo, Robert; Giralt, Francesc

    2015-10-01

    Ready biodegradability is a key property for evaluating the long-term effects of chemicals on the environment and human health. As such, it is used as a screening test for the assessment of persistent, bioaccumulative and toxic substances. Regulators encourage the use of non-testing methods, such as in silico models, to save money and time. A dataset of 757 chemicals was collected to assess the performance of four freely available in silico models that predict ready biodegradability. They were applied to develop a new consensus method that prioritizes the use of each individual model according to its performance on chemical subsets driven by the presence or absence of different molecular descriptors. This consensus method was capable of almost eliminating unpredictable chemicals, while the performance of combined models was substantially improved with respect to that of the individual models. Copyright © 2015 Elsevier Inc. All rights reserved.
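
    The consensus scheme described above chooses which base model to trust for a given chemical according to the presence or absence of molecular descriptors, falling back to a vote otherwise. The sketch below is only a schematic of that idea; the descriptor names, rules, and base predictors are made up and are not the published models.

```python
from typing import Callable, Dict, Set

# Hypothetical base models: each maps a descriptor set to a
# ready-biodegradability prediction (True = readily biodegradable).
def model_a(desc: Set[str]) -> bool: return "aromatic_ring" not in desc
def model_b(desc: Set[str]) -> bool: return "long_alkyl_chain" in desc
def model_c(desc: Set[str]) -> bool: return "halogen" not in desc

# Priority rules: which model to consult first when a descriptor is present.
PRIORITY: Dict[str, Callable] = {
    "halogen": model_c,
    "aromatic_ring": model_a,
    "long_alkyl_chain": model_b,
}
FALLBACK = [model_a, model_b, model_c]

def consensus_predict(descriptors: Set[str]) -> bool:
    """Prioritize base models by descriptor-driven rules, then fall back to a vote."""
    for desc, model in PRIORITY.items():
        if desc in descriptors:
            return model(descriptors)
    # No prioritized descriptor present: simple majority vote of all models
    votes = [m(descriptors) for m in FALLBACK]
    return sum(votes) > len(votes) / 2

print(consensus_predict({"halogen", "aromatic_ring"}))   # handled by model_c
print(consensus_predict({"ester_group"}))                # majority vote
```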

  9. Benchmarking novel approaches for modelling species range dynamics

    PubMed Central

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H.; Moore, Kara A.; Zimmermann, Niklaus E.

    2016-01-01

    Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species’ range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results affirm the clear merit of using dynamic approaches for modelling species’ response to climate change but also emphasise several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches operational for large numbers of species. PMID:26872305

  10. Benchmarking novel approaches for modelling species range dynamics.

    PubMed

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E

    2016-08-01

    Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results affirm the clear merit of using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches operational for large numbers of species. © 2016 John Wiley & Sons Ltd.

  11. Detection and Attribution of Regional Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bala, G; Mirin, A

    2007-01-19

    We developed a high-resolution global coupled modeling capability to perform breakthrough studies of regional climate change. The atmospheric component in our simulation uses a 1° latitude × 1.25° longitude grid, which is the finest resolution ever used for the NCAR coupled climate model CCSM3. Substantial testing and slight retuning were required to obtain an acceptable control simulation. The major accomplishment is the validation of this new high-resolution configuration of CCSM3. There are major improvements in our simulation of the surface wind stress and sea ice thickness distribution in the Arctic. Surface wind stress and ocean circulation in the Antarctic Circumpolar Current are also improved. Our results demonstrate that the FV version of the CCSM coupled model is a state-of-the-art climate model whose simulation capabilities are in the class of those used for IPCC assessments. We have also provided 1000 years of model data to Scripps Institution of Oceanography to estimate the natural variability of stream flow in California. In the future, our global model simulations will provide boundary data to a high-resolution mesoscale model that will be used at LLNL. The mesoscale model will dynamically downscale the GCM climate to the regional scale on climate time scales.

  12. Assessment of Gravity Field and Steady State Ocean Circulation Explorer (GOCE) geoid model using GPS levelling over Sabah and Sarawak

    NASA Astrophysics Data System (ADS)

    Othman, A. H.; Omar, K. M.; Din, A. H. M.; Som, Z. A. M.; Yahaya, N. A. Z.; Pa'suya, M. F.

    2016-06-01

    The GOCE satellite mission has significantly contributed to various applications such as solid earth physics, oceanography and geodesy. Substantial applications in geodesy include improving knowledge of the gravity field and precise geoid modelling towards realising global height unification. This paper aims to evaluate GOCE geoid models, based on recent GOCE Global Geopotential Models (GGMs) as well as EGM2008, using GPS levelling data over East Malaysia, i.e. Sabah and Sarawak. The satellite GGMs selected in this study are the GOCE models GOCE04S, TIM_R5 and SPW_R4, together with the EGM2008 model. To assess these models, the geoid heights from the GGMs are compared to the local geometric geoid heights. The GGM geoid heights were derived using the EGMLAB1 software, and the geometric geoid heights were computed from available GPS levelling information obtained from the Department of Survey and Mapping Malaysia. Generally, the GOCE models performed better than EGM2008 over East Malaysia, and the best-fit GOCE model for this region is TIM_R5, which demonstrated the lowest RMS of ±16.5 cm over Sarawak. For further improvement, this model should be combined with local gravity data for optimum geoid modelling over East Malaysia.
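
    The comparison rests on the identity N ≈ h − H between GPS ellipsoidal heights and levelled orthometric heights; a minimal sketch of the bias/RMS evaluation, with made-up numbers standing in for the East Malaysia benchmarks:

      import numpy as np

      # Illustrative sketch only: the benchmark values are placeholders, not the
      # Department of Survey and Mapping Malaysia data used in the study.
      h_gps = np.array([52.341, 48.907, 61.220])   # GPS ellipsoidal heights h (m)
      H_lev = np.array([ 4.120,  1.553, 14.872])   # levelled orthometric heights H (m)
      N_ggm = np.array([48.150, 47.280, 46.310])   # geoid heights from a GGM (m)

      N_geom = h_gps - H_lev                       # geometric geoid height N = h - H
      diff = N_ggm - N_geom
      bias = diff.mean()
      rms = np.sqrt(np.mean((diff - bias) ** 2))   # RMS about the mean offset
      print(f"bias = {bias:.3f} m, RMS = {rms:.3f} m")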

  13. ELECTROCHEMISTRY AND ON-CELL REFORMATION MODELING FOR SOLID OXIDE FUEL CELL STACKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Recknagle, Kurtis P.; Jarboe, Daniel T.; Johnson, Kenneth I.

    2007-01-16

    Providing adequate and efficient cooling schemes for solid-oxide-fuel-cell (SOFC) stacks continues to be a challenge coincident with the development of larger, more powerful stacks. The endothermic steam-methane reformation reaction can provide cooling and improved system efficiency when performed directly on the electrochemically active anode. Rapid kinetics of the endothermic reaction typically causes a localized temperature depression on the anode near the fuel inlet. It is desirable to extend the endothermic effect over more of the cell area and mitigate the associated differences in temperature on the cell to alleviate subsequent thermal stresses. In this study, modeling tools validated for the prediction of fuel use, on-cell methane reforming, and the distribution of temperature within SOFC stacks are employed to provide direction for modifying the catalytic activity of anode materials to control the methane conversion rate. Improvements in thermal management that can be achieved through on-cell reforming are predicted and discussed. Two operating scenarios are considered: one in which the methane fuel is fully pre-reformed, and another in which a substantial percentage of the methane is reformed on-cell. For the latter, a range of catalytic activity is considered and the predicted thermal effects on the cell are presented. Simulations of the cell electrochemical and thermal performance with and without on-cell reforming, including structural analyses, show a substantial decrease in thermal stresses for an on-cell reforming case with slowed methane conversion.
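
    As a rough illustration of why on-cell reforming cools the stack: the steam-methane reformation reaction CH4 + H2O -> CO + 3H2 is strongly endothermic (roughly +206 kJ per mol of CH4 at standard conditions), so the available heat sink scales with the on-cell conversion rate. The flow rate and conversion fraction below are illustrative placeholders, not values from the study.

      # Back-of-envelope heat sink from on-cell steam-methane reforming (sketch only).
      DH_SMR = 206e3          # J per mol CH4 reformed (endothermic, approximate)
      ch4_flow = 2.0e-3       # mol/s of methane fed to the anode (placeholder)
      on_cell_fraction = 0.6  # fraction of the methane reformed on-cell (placeholder)

      cooling_W = DH_SMR * ch4_flow * on_cell_fraction
      print(f"endothermic heat sink ~ {cooling_W:.0f} W")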

  14. Avoided electricity subsidy payments can finance substantial appliance efficiency incentive programs: Case study of Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leventis, Greg; Gopal, Anand; Rue du Can, Stephane de la

    Numerous countries use taxpayer funds to subsidize residential electricity for a variety of socioeconomic objectives. These subsidies lower the value of energy efficiency to the consumer while raising it for the government. Further, while it would be especially helpful to have stringent Minimum Energy Performance Standards (MEPS) for appliances and buildings in this environment, they are hard to strengthen without imposing a cost on ratepayers. In this second-best world, where the presence of subsidies limits the government's ability to strengthen standards, we find that avoided subsidies are a readily available source of financing for energy efficiency incentive programs. Here, we introduce the LBNL Energy Efficiency Revenue Analysis (LEERA) model to estimate the appliance efficiency improvements that can be achieved in Mexico by the revenue-neutral financing of incentive programs from avoided subsidy payments. LEERA uses the detailed techno-economic analysis developed by LBNL for the Super-efficient Equipment and Appliance Deployment (SEAD) Initiative to calculate the incremental costs of appliance efficiency improvements. We analyze Mexico's tariff structures and the long-run marginal cost of supply to calculate the marginal savings to the government from appliance efficiency. We find that avoided subsidy payments alone can finance incentive programs that cover the full incremental cost of refrigerators that are 27% more efficient and TVs that are 32% more efficient than baseline models. We find less substantial market transformation potential for room ACs, primarily because AC energy savings occur at less subsidized tariffs.
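
    A schematic of the revenue-neutrality test described above, with placeholder numbers rather than LEERA's actual tariff and cost inputs: the incentive can cover the full incremental cost whenever the present value of avoided subsidy payments over the appliance lifetime is at least as large.

      # Illustrative revenue-neutrality check; every input below is a placeholder.
      def present_value(annual_amount, years, discount_rate):
          return sum(annual_amount / (1 + discount_rate) ** t for t in range(1, years + 1))

      annual_kwh_saved = 150.0    # electricity saved per appliance per year (kWh)
      subsidy_per_kwh = 0.06      # subsidy embedded in the residential tariff ($/kWh)
      appliance_lifetime = 10     # years
      incremental_cost = 55.0     # extra cost of the efficient model ($)

      avoided_subsidy = present_value(annual_kwh_saved * subsidy_per_kwh,
                                      appliance_lifetime, discount_rate=0.05)
      print(f"avoided subsidy PV = ${avoided_subsidy:.2f}")
      print("incentive is revenue neutral:", avoided_subsidy >= incremental_cost)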

  15. ATF3 expression improves motor function in the ALS mouse model by promoting motor neuron survival and retaining muscle innervation.

    PubMed

    Seijffers, Rhona; Zhang, Jiangwen; Matthews, Jonathan C; Chen, Adam; Tamrazian, Eric; Babaniyi, Olusegun; Selig, Martin; Hynynen, Meri; Woolf, Clifford J; Brown, Robert H

    2014-01-28

    ALS is a fatal neurodegenerative disease characterized by a progressive loss of motor neurons and atrophy of distal axon terminals in muscle, resulting in loss of motor function. Motor end plates denervated by axonal retraction of dying motor neurons are partially reinnervated by remaining viable motor neurons; however, this axonal sprouting is insufficient to compensate for motor neuron loss. Activating transcription factor 3 (ATF3) promotes neuronal survival and axonal growth. Here, we reveal that forced expression of ATF3 in motor neurons of transgenic SOD1(G93A) ALS mice delays neuromuscular junction denervation by inducing axonal sprouting and enhancing motor neuron viability. Maintenance of neuromuscular junction innervation during the course of the disease in ATF3/SOD1(G93A) mice is associated with a substantial delay in muscle atrophy and improved motor performance. Although disease onset and mortality are delayed, disease duration is not affected. This study shows that adaptive axonal growth-promoting mechanisms can substantially improve motor function in ALS and importantly, that augmenting viability of the motor neuron soma and maintaining functional neuromuscular junction connections are both essential elements in therapy for motor neuron disease in the SOD1(G93A) mice. Accordingly, effective protection of optimal motor neuron function requires restitution of multiple dysregulated cellular pathways.

  16. The Atlanta Urban Heat Island Mitigation and Air Quality Modeling Project: How High-Resolution Remote Sensing Data Can Improve Air Quality Models

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Estes, Maurice G., Jr.; Crosson, William L.; Khan, Maudood N.

    2006-01-01

    The Atlanta Urban Heat Island and Air Quality Project had its genesis in Project ATLANTA (ATlanta Land use Analysis: Temperature and Air quality) that began in 1996. Project ATLANTA examined how high-spatial-resolution thermal remote sensing data could be used to derive better measurements of the Urban Heat Island effect over Atlanta. We have explored how these thermal remote sensing data, as well as other imaged datasets, can be used to better characterize the urban landscape for improved air quality modeling over the Atlanta area. For the air quality modeling project, the National Land Cover Dataset and the local-scale Landpro99 dataset, both at 30 m spatial resolution, have been used to derive land use/land cover characteristics for input into the MM5 mesoscale meteorological model, which is one of the foundations of the Community Multiscale Air Quality (CMAQ) model, to assess how these data can improve output from CMAQ. Additionally, land use changes to 2030 have been predicted using a Spatial Growth Model (SGM). SGM simulates growth around a region using population, employment and travel demand forecasts. Air quality modeling simulations were conducted using both current and future land cover. Meteorological modeling simulations indicate a 0.5 °C increase in daily maximum air temperatures by 2030. Air quality modeling simulations show substantial differences in the relative contributions of individual atmospheric pollutant constituents as a result of land cover change. Enhanced boundary layer mixing over the city tends to offset the increase in ozone concentration expected due to higher surface temperatures as a result of urbanization.

  17. High-fertility phenotypes: two outbred mouse models exhibit substantially different molecular and physiological strategies warranting improved fertility.

    PubMed

    Langhammer, Martina; Michaelis, Marten; Hoeflich, Andreas; Sobczak, Alexander; Schoen, Jennifer; Weitzel, Joachim M

    2014-01-01

    Animal models are valuable tools in fertility research. Worldwide, there are more than 400 transgenic or knockout mouse models available showing a reproductive phenotype; almost all of them exhibit an infertile or at least subfertile phenotype. By contrast, animal models revealing an improved fertility phenotype are barely described. This article summarizes data on two outbred mouse models exhibiting a 'high-fertility' phenotype. These mouse lines were generated via selection over a time period of more than 40 years and 161 generations. During this selection period, the number of offspring per litter and the total birth weight of the entire litter nearly doubled. Concomitant with the increased fertility phenotype, several endocrine parameters (e.g. serum testosterone concentrations in male animals), physiological parameters (e.g. body weight, accelerated puberty, and life expectancy), and behavioral parameters (e.g. behavior in an open field and endurance fitness on a treadmill) were altered. We demonstrate that the two independently bred high-fertility mouse lines achieved their improved fertility phenotype using different molecular and physiological strategies. The fertility lines display female- as well as male-specific characteristics. These genetically heterogeneous mouse models provide new insights into molecular and cellular mechanisms that enhance fertility. In view of decreasing fertility in men, these models will therefore be a precious information source for human reproductive medicine. A German translation of the abstract is freely available at http://www.reproduction-online.org/content/147/4/427/suppl/DC1.

  18. Conformational dynamics of a crystalline protein from microsecond-scale molecular dynamics simulations and diffuse X-ray scattering

    DOE PAGES

    Wall, Michael E.; Van Benschoten, Andrew H.; Sauter, Nicholas K.; ...

    2014-12-01

    X-ray diffraction from protein crystals includes both sharply peaked Bragg reflections and diffuse intensity between the peaks. The information in Bragg scattering is limited to what is available in the mean electron density. The diffuse scattering arises from correlations in the electron density variations and therefore contains information about collective motions in proteins. Previous studies using molecular-dynamics (MD) simulations to model diffuse scattering have been hindered by insufficient sampling of the conformational ensemble. To overcome this issue, we have performed a 1.1-μs MD simulation of crystalline staphylococcal nuclease, providing 100-fold more sampling than previous studies. This simulation enables reproducible calculations of the diffuse intensity and predicts functionally important motions, including transitions among at least eight metastable states with different active-site geometries. The total diffuse intensity calculated using the MD model is highly correlated with the experimental data. In particular, there is excellent agreement for the isotropic component of the diffuse intensity, and substantial but weaker agreement for the anisotropic component. The decomposition of the MD model into protein and solvent components indicates that protein–solvent interactions contribute substantially to the overall diffuse intensity. In conclusion, diffuse scattering can be used to validate predictions from MD simulations and can provide information to improve MD models of protein motions.
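
    The quantity being compared is, in essence, the ensemble variance of the structure factor: Bragg scattering probes the mean electron density while diffuse scattering probes its fluctuations. A minimal numpy sketch of that bookkeeping, with random placeholder structure factors rather than the actual staphylococcal nuclease trajectory:

      import numpy as np

      # Sketch: I_diffuse(q) = <|F(q)|^2> - |<F(q)>|^2 over an ensemble of snapshots.
      # `structure_factors` is a placeholder (n_snapshots x n_reflections, complex).
      rng = np.random.default_rng(0)
      structure_factors = rng.normal(size=(500, 64)) + 1j * rng.normal(size=(500, 64))

      mean_F = structure_factors.mean(axis=0)
      I_total = (np.abs(structure_factors) ** 2).mean(axis=0)
      I_bragg = np.abs(mean_F) ** 2
      I_diffuse = I_total - I_bragg

      # Agreement with experiment can then be scored with a linear correlation.
      I_exp = I_diffuse + 0.1 * rng.normal(size=I_diffuse.shape)   # placeholder "data"
      print(np.corrcoef(I_diffuse, I_exp)[0, 1])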

  19. Conformational dynamics of a crystalline protein from microsecond-scale molecular dynamics simulations and diffuse X-ray scattering

    PubMed Central

    Wall, Michael E.; Van Benschoten, Andrew H.; Sauter, Nicholas K.; Adams, Paul D.; Fraser, James S.; Terwilliger, Thomas C.

    2014-01-01

    X-ray diffraction from protein crystals includes both sharply peaked Bragg reflections and diffuse intensity between the peaks. The information in Bragg scattering is limited to what is available in the mean electron density. The diffuse scattering arises from correlations in the electron density variations and therefore contains information about collective motions in proteins. Previous studies using molecular-dynamics (MD) simulations to model diffuse scattering have been hindered by insufficient sampling of the conformational ensemble. To overcome this issue, we have performed a 1.1-μs MD simulation of crystalline staphylococcal nuclease, providing 100-fold more sampling than previous studies. This simulation enables reproducible calculations of the diffuse intensity and predicts functionally important motions, including transitions among at least eight metastable states with different active-site geometries. The total diffuse intensity calculated using the MD model is highly correlated with the experimental data. In particular, there is excellent agreement for the isotropic component of the diffuse intensity, and substantial but weaker agreement for the anisotropic component. Decomposition of the MD model into protein and solvent components indicates that protein–solvent interactions contribute substantially to the overall diffuse intensity. We conclude that diffuse scattering can be used to validate predictions from MD simulations and can provide information to improve MD models of protein motions. PMID:25453071

  20. A stochastic estimation procedure for intermittently-observed semi-Markov multistate models with back transitions.

    PubMed

    Aralis, Hilary; Brookmeyer, Ron

    2017-01-01

    Multistate models provide an important method for analyzing a wide range of life history processes including disease progression and patient recovery following medical intervention. Panel data consisting of the states occupied by an individual at a series of discrete time points are often used to estimate transition intensities of the underlying continuous-time process. When transition intensities depend on the time elapsed in the current state and back transitions between states are possible, this intermittent observation process presents difficulties in estimation due to intractability of the likelihood function. In this manuscript, we present an iterative stochastic expectation-maximization algorithm that relies on a simulation-based approximation to the likelihood function and implement this algorithm using rejection sampling. In a simulation study, we demonstrate the feasibility and performance of the proposed procedure. We then demonstrate application of the algorithm to a study of dementia, the Nun Study, consisting of intermittently-observed elderly subjects in one of four possible states corresponding to intact cognition, impaired cognition, dementia, and death. We show that the proposed stochastic expectation-maximization algorithm substantially reduces bias in model parameter estimates compared to an alternative approach used in the literature, minimal path estimation. We conclude that in estimating intermittently observed semi-Markov models, the proposed approach is a computationally feasible and accurate estimation procedure that leads to substantial improvements in back transition estimates.
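
    A compact sketch of the simulation step described above: between two panel observations, candidate semi-Markov paths are simulated forward and accepted only if they land in the observed state, yielding latent trajectories for the subsequent maximization step. The three-state model, gamma sojourn times and parameter values here are hypothetical placeholders, not the Nun Study specification.

      import numpy as np

      rng = np.random.default_rng(1)

      # Placeholder semi-Markov model: gamma sojourn times per state and an embedded
      # jump chain that allows a back transition (state 1 -> 0).
      SOJOURN = {0: (2.0, 1.0), 1: (1.5, 1.0), 2: (2.5, 1.0)}   # (shape, scale)
      JUMP = {0: ([1], [1.0]),
              1: ([0, 2], [0.4, 0.6]),
              2: ([1], [1.0])}

      def simulate_path(state, t0, t1):
          """Simulate one semi-Markov path from (state, t0) and report the state at t1."""
          path, t = [(t0, state)], t0
          while True:
              shape, scale = SOJOURN[state]
              t += rng.gamma(shape, scale)
              if t >= t1:
                  return path, state          # still in `state` when t1 is reached
              choices, probs = JUMP[state]
              state = rng.choice(choices, p=probs)
              path.append((t, state))

      def rejection_sample(state0, t0, state1, t1, n_accept=100, max_tries=100_000):
          """Keep simulated paths that agree with the next panel observation."""
          accepted = []
          for _ in range(max_tries):
              path, end_state = simulate_path(state0, t0, t1)
              if end_state == state1:
                  accepted.append(path)
                  if len(accepted) == n_accept:
                      break
          return accepted

      paths = rejection_sample(state0=0, t0=0.0, state1=2, t1=5.0)
      print(len(paths), "accepted latent trajectories")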

  1. Simulation of Longwave Enhancement beneath Montane and Boreal Forests in CLM4.5

    NASA Astrophysics Data System (ADS)

    Todt, M.; Rutter, N.; Fletcher, C. G.; Wake, L. M.; Loranty, M. M.

    2017-12-01

    CMIP5 models have been shown to underestimate both the trend and the variability in Northern Hemisphere spring snow cover extent. A substantial fraction of this area is covered by boreal forests, in which the snow energy balance is dominated by radiation. Forest coverage impacts the surface radiation budget by shading the ground and enhancing longwave radiation. Longwave enhancement in boreal forests is a potential mechanism that contributes to uncertainty in snowmelt modelling; however, its impact on snowmelt in global land models has not yet been analysed. This study assesses the simulation of sub-canopy longwave radiation and longwave enhancement by CLM4.5, the land component of the NCAR Community Earth System Model, in which boreal forests are represented by three plant functional types (PFTs): evergreen needleleaf trees (ENT), deciduous needleleaf trees (DNT), and deciduous broadleaf trees (DBT). Simulation of sub-canopy longwave enhancement is evaluated at boreal forest sites covering the three boreal PFTs in CLM4.5 to assess the dependence of simulation errors on meteorological forcing, vegetation type and vegetation density. ENT are evaluated over a total of six snowmelt seasons in Swiss alpine and subalpine forests, as well as a single season at a Finnish Arctic site with varying vegetation density. A Swedish Arctic site features varying vegetation density for DBT for a single winter, and two sites in Eastern Siberia are included, covering a total of four snowmelt seasons in DNT forests. CLM4.5 overestimates the diurnal range of sub-canopy longwave radiation and consequently longwave enhancement, overestimating daytime values and underestimating nighttime values. Simulation errors result mainly from clear-sky conditions, due to high absorption of shortwave radiation during daytime and radiative cooling during nighttime. Using recent improvements to the canopy parameterisations of SNOWPACK as a guideline, CLM4.5 simulations of sub-canopy longwave radiation improved through the implementation of a heat mass parameterisation, i.e. including thermal inertia due to biomass. However, this improvement does not substantially reduce the amplitude of the diurnal cycle, a result also found during the development of SNOWPACK.
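
    A common illustrative form for sub-canopy longwave radiation, used here only as a sketch and not as the exact CLM4.5 parameterisation, treats the canopy as a grey emitter filling the fraction of the sky not seen by the snow surface; the longwave enhancement is then the ratio of sub-canopy to above-canopy incoming longwave.

      # Illustrative form (not the exact CLM4.5 scheme); inputs are placeholder values.
      SIGMA = 5.670374419e-8      # Stefan-Boltzmann constant, W m-2 K-4

      def subcanopy_longwave(lw_atm, t_canopy, sky_view, emissivity=0.98):
          """lw_atm in W m-2, t_canopy in K, sky_view fraction in [0, 1]."""
          lw_canopy = emissivity * SIGMA * t_canopy ** 4
          return sky_view * lw_atm + (1.0 - sky_view) * lw_canopy

      lw_atm, t_can, view = 220.0, 270.0, 0.4     # clear-sky winter placeholders
      lw_sub = subcanopy_longwave(lw_atm, t_can, view)
      print(f"sub-canopy LW = {lw_sub:.1f} W m-2, enhancement = {lw_sub / lw_atm:.2f}")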

  2. Dynamic model-based N management reduces surplus nitrogen and improves the environmental performance of corn production

    NASA Astrophysics Data System (ADS)

    Sela, S.; Woodbury, P. B.; van Es, H. M.

    2018-05-01

    The US Midwest is the largest and most intensive corn (Zea mays, L.) production region in the world. However, N losses from corn systems cause serious environmental impacts including dead zones in coastal waters, groundwater pollution, particulate air pollution, and global warming. New approaches to reducing N losses are urgently needed. N surplus is gaining attention as such an approach for multiple cropping systems. We combined experimental data from 127 on-farm field trials conducted in seven US states during the 2011–2016 growing seasons with biochemical simulations using the PNM model to quantify the benefits of a dynamic location-adapted management approach to reduce N surplus. We found that this approach allowed large reductions in N rate (32%) and N surplus (36%) compared to existing static approaches, without reducing yield and substantially reducing yield-scaled N losses (11%). Across all sites, yield-scaled N losses increased linearly with N surplus values above ~48 kg ha‑1. Using the dynamic model-based N management approach enabled growers to get much closer to this target than using existing static methods, while maintaining yield. Therefore, this approach can substantially reduce N surplus and N pollution potential compared to static N management.
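
    The N surplus referred to above is simple bookkeeping: nitrogen applied (plus other inputs) minus nitrogen removed in the harvested grain. A toy sketch of that calculation with placeholder values, not the on-farm trial data:

      # Toy N-surplus bookkeeping; all inputs are placeholders.
      n_fertilizer = 180.0     # kg N/ha applied
      n_other_inputs = 10.0    # kg N/ha (e.g. starter fertilizer, deposition)
      grain_yield = 11.0       # Mg grain/ha
      grain_n_conc = 0.012     # kg N per kg grain (about 1.2%)

      n_removed = grain_yield * 1000.0 * grain_n_conc      # kg N/ha in harvested grain
      n_surplus = n_fertilizer + n_other_inputs - n_removed
      print(f"N surplus = {n_surplus:.0f} kg N/ha")        # compare with the ~48 kg/ha threshold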

  3. Automatic lumen and outer wall segmentation of the carotid artery using deformable three-dimensional models in MR angiography and vessel wall images.

    PubMed

    van 't Klooster, Ronald; de Koning, Patrick J H; Dehnavi, Reza Alizadeh; Tamsma, Jouke T; de Roos, Albert; Reiber, Johan H C; van der Geest, Rob J

    2012-01-01

    The aim of this study was to develop and validate an automated segmentation technique for the detection of the lumen and outer wall boundaries in MR vessel wall studies of the common carotid artery. A new segmentation method was developed using a three-dimensional (3D) deformable vessel model requiring only a single user interaction, by combining 3D MR angiography (MRA) and 2D vessel wall images. This vessel model is a 3D cylindrical Non-Uniform Rational B-Spline (NURBS) surface which can be deformed to fit the underlying image data. Image data from 45 subjects were used to validate the method by comparing manual and automatic segmentations. Vessel wall thickness and volume measurements obtained by both methods were compared. Substantial agreement was observed between manual and automatic segmentation; over 85% of the vessel wall contours were segmented successfully. The interclass correlation was 0.690 for the vessel wall thickness and 0.793 for the vessel wall volume. Compared with manual image analysis, the automated method demonstrated improved interobserver agreement and inter-scan reproducibility. Additionally, the proposed automated image analysis approach was substantially faster. This new automated method can reduce analysis time and enhance reproducibility of the quantification of vessel wall dimensions in clinical studies. Copyright © 2011 Wiley Periodicals, Inc.

  4. Assessing high shares of renewable energies in district heating systems - a case study for the city of Herten

    NASA Astrophysics Data System (ADS)

    Aydemir, Ali; Popovski, Eftim; Bellstädt, Daniel; Fleiter, Tobias; Büchele, Richard

    2017-11-01

    Many earlier studies have assessed the district heating (DH) generation mix without explicitly taking into account future changes in the building stock and heat demand. The approach of this study consists of three steps that combine stock modeling, energy demand forecasting, and simulation of different energy technologies. First, a detailed residential building stock model for Herten is constructed by using remote sensing together with a typology for the German building stock. Second, a bottom-up simulation model is used which calculates the thermal energy demand based on energy-related investments in buildings in order to forecast the thermal demand up to 2050. Third, solar thermal fields in combination with large-scale heat pumps are sized as an alternative to the current coal-fired CHPs. We finally assess the cost of heat and CO2 reduction for these units for two scenarios which differ with regard to the DH expansion. It can be concluded that a substantial reduction in buildings' heat demand is expected up to 2030 and 2050 due to improved building insulation. The falling heat demand in the DH network substantially reduces the economic feasibility of new RES generation capacity. This reduction might be compensated by continuously connecting apartment buildings to the DH network until 2050.

  5. Evaluation of Cloud Microphysics in JMA-NHM Simulations Using Bin or Bulk Microphysical Schemes through Comparison with Cloud Radar Observations

    NASA Technical Reports Server (NTRS)

    Iguchi, Takamichi; Nakajima, Teruyuki; Khain, Alexander P.; Saito, Kazuo; Takemura, Toshihiko; Okamoto, Hajime; Nishizawa, Tomoaki; Tao, Wei-Kuo

    2012-01-01

    Numerical weather prediction (NWP) simulations using the Japan Meteorological Agency Nonhydrostatic Model (JMA-NHM) are conducted for three precipitation events observed by shipborne or spaceborne W-band cloud radars. Spectral bin and single-moment bulk cloud microphysics schemes are employed separately for an intercomparative study. A radar product simulator that is compatible with both microphysics schemes is developed to enable a direct comparison between simulation and observation with respect to the equivalent radar reflectivity factor Ze, Doppler velocity (DV), and path-integrated attenuation (PIA). In general, the bin model simulation shows better agreement with the observed data than the bulk model simulation. The correction of the terminal fall velocities of snowflakes using those of hail further improves the result of the bin model simulation. The results indicate that there are substantial uncertainties in the mass-size and size-terminal fall velocity relations of snowflakes or in the calculation of terminal fall velocity of snow aloft. For the bulk microphysics, the overestimation of Ze is observed as a result of a significant predominance of snow over cloud ice due to substantial deposition growth directly to snow. The DV comparison shows that a correction for the fall velocity of hydrometeors considering a change of particle size should be introduced even in single-moment bulk cloud microphysics.

  6. Filling the Gaps: The Synergistic Application of Satellite Data for the Volcanic Ash Threat to Aviation

    NASA Technical Reports Server (NTRS)

    Murray, John; Vernier, Jean-Paul; Fairlie, T. Duncan; Pavolonis, Michael; Krotkov, Nickolay A.; Lindsay, Francis; Haynes, John

    2013-01-01

    Although significant progress has been made in recent years, estimating volcanic ash concentration for the full extent of the airspace affected by volcanic ash remains a challenge. No single satellite, airborne or ground observing system currently exists which can sufficiently inform dispersion models to provide the degree of accuracy required to use them with a high degree of confidence for routing aircraft in and near volcanic ash. Toward this end, the detection and characterization of volcanic ash in the atmosphere may be substantially improved by integrating a wider array of observing systems with advancements in trajectory and dispersion modeling. The qualitative aspect of this effort has advanced significantly in the past decade due to the increase of highly complementary observational and model data currently available. Satellite observations, especially when coupled with trajectory and dispersion models, can provide a very accurate picture of the 3-dimensional location of ash clouds. Accurate estimation of the mass loading at various locations throughout the entire plume, however, while improving, remains elusive. This paper examines the capabilities of various satellite observation systems and postulates that model-based volcanic ash concentration maps and forecasts might be significantly improved if the various extant satellite capabilities are used together with independent, accurate mass loading data from other observing systems to calibrate (tune) ash concentration retrievals from the satellite systems.

  7. Advances in nowcasting influenza-like illness rates using search query logs

    NASA Astrophysics Data System (ADS)

    Lampos, Vasileios; Miller, Andrew C.; Crossan, Steve; Stefansen, Christian

    2015-08-01

    User-generated content can assist epidemiological surveillance in the early detection and prevalence estimation of infectious diseases, such as influenza. Google Flu Trends embodies the first public platform for transforming search queries to indications about the current state of flu in various places all over the world. However, the original model significantly mispredicted influenza-like illness rates in the US during the 2012-13 flu season. In this work, we build on the previous modeling attempt, proposing substantial improvements. Firstly, we investigate the performance of a widely used linear regularized regression solver, known as the Elastic Net. Then, we expand on this model by incorporating the queries selected by the Elastic Net into a nonlinear regression framework, based on a composite Gaussian Process. Finally, we augment the query-only predictions with an autoregressive model, injecting prior knowledge about the disease. We assess predictive performance using five consecutive flu seasons spanning from 2008 to 2013 and qualitatively explain certain shortcomings of the previous approach. Our results indicate that a nonlinear query modeling approach delivers the lowest cumulative nowcasting error, and also suggest that query information significantly improves autoregressive inferences, obtaining state-of-the-art performance.
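
    A minimal sketch of the first stage described above, query selection with an Elastic Net, using scikit-learn and synthetic data in place of the real search-query time series (the nonlinear Gaussian Process and autoregressive stages are not reproduced here):

      import numpy as np
      from sklearn.linear_model import ElasticNetCV

      # Synthetic stand-in for weekly query frequencies (columns) and ILI rates (target).
      rng = np.random.default_rng(0)
      n_weeks, n_queries = 260, 500
      X = rng.random((n_weeks, n_queries))
      true_w = np.zeros(n_queries)
      true_w[:10] = rng.normal(size=10)                 # only a few queries are informative
      y = X @ true_w + 0.05 * rng.normal(size=n_weeks)  # placeholder ILI signal

      # Elastic Net: the L1 part selects a sparse set of queries, the L2 part
      # stabilises groups of correlated queries; l1_ratio controls the mix.
      model = ElasticNetCV(l1_ratio=[0.5, 0.9, 1.0], cv=5).fit(X[:200], y[:200])
      selected = np.flatnonzero(model.coef_)
      print(f"{selected.size} queries selected; "
            f"held-out R^2 = {model.score(X[200:], y[200:]):.2f}")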

  8. Motorcycle Drag Reduction using a Streamlined Object Ahead of the Rider

    NASA Astrophysics Data System (ADS)

    Selvamuthu, Thirukumaran; Thangadurai, Murugan

    2018-05-01

    Aerodynamic design of various components plays a significant role in reducing the overall drag of a vehicle and thereby improving its fuel efficiency. In the present study, the effects of a semi-ellipsoidal structure placed ahead of a rider on the HONDA CBR 600 RR bike have been studied in detail for Reynolds numbers varying from 1.24 to 3.72 million. Three-dimensional numerical simulations were performed by solving the Reynolds-averaged Navier-Stokes equations with the SST k-ω turbulence model. The numerical results were validated with wind tunnel testing performed on a 1:12 scale model using an external pyramidal balance. It has been observed that the wake pattern behind the vehicle and the pressure and velocity distributions over the vehicle were modified remarkably by the inclusion of the semi-ellipsoidal structure compared to the model with the rider. The drag coefficient of the bike increased by about 16% when a dummy rider was placed on the vehicle. However, it decreased substantially and returned close to the base-model value when the semi-ellipsoidal structure was placed ahead of the rider. Further, the inclusion of the semi-ellipsoidal structure produced negative lift, which improves traction on the road compared to the base model.
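
    For reference, the drag coefficient quoted in such comparisons follows from the measured drag force as C_d = 2 F_d / (rho V^2 A); a small sketch with placeholder numbers, not the actual wind tunnel measurements:

      # Drag coefficient from a measured drag force; all numbers are placeholders.
      rho = 1.204     # air density, kg/m^3
      V = 30.0        # freestream velocity, m/s
      A = 0.65        # frontal (projected) area, m^2
      F_d = 190.0     # measured drag force, N

      C_d = 2.0 * F_d / (rho * V ** 2 * A)
      print(f"C_d = {C_d:.3f}")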

  9. Advances in nowcasting influenza-like illness rates using search query logs.

    PubMed

    Lampos, Vasileios; Miller, Andrew C; Crossan, Steve; Stefansen, Christian

    2015-08-03

    User-generated content can assist epidemiological surveillance in the early detection and prevalence estimation of infectious diseases, such as influenza. Google Flu Trends embodies the first public platform for transforming search queries to indications about the current state of flu in various places all over the world. However, the original model significantly mispredicted influenza-like illness rates in the US during the 2012-13 flu season. In this work, we build on the previous modeling attempt, proposing substantial improvements. Firstly, we investigate the performance of a widely used linear regularized regression solver, known as the Elastic Net. Then, we expand on this model by incorporating the queries selected by the Elastic Net into a nonlinear regression framework, based on a composite Gaussian Process. Finally, we augment the query-only predictions with an autoregressive model, injecting prior knowledge about the disease. We assess predictive performance using five consecutive flu seasons spanning from 2008 to 2013 and qualitatively explain certain shortcomings of the previous approach. Our results indicate that a nonlinear query modeling approach delivers the lowest cumulative nowcasting error, and also suggest that query information significantly improves autoregressive inferences, obtaining state-of-the-art performance.

  10. Modelling stream aquifer seepage in an alluvial aquifer: an improved loosing-stream package for MODFLOW

    NASA Astrophysics Data System (ADS)

    Osman, Yassin Z.; Bruen, Michael P.

    2002-07-01

    Seepage from a stream, which partially penetrates an unconfined alluvial aquifer, is studied for the case when the water table falls below the streambed level. Inadequacies are identified in current modelling approaches to this situation. A simple and improved method of incorporating such seepage into groundwater models is presented. This method considers the effect on seepage flow of suction in the unsaturated part of the aquifer below a disconnected stream and allows for the variation of seepage with water table fluctuations. The suggested technique is incorporated into the saturated code MODFLOW and is tested by comparing its predictions with those of a widely used variably saturated model, SWMS_2D, which simulates water flow and solute transport in two-dimensional variably saturated media. Comparisons are made of both seepage flows and local mounding of the water table. The suggested technique compares very well with the results of the variably saturated model simulations. Most currently used approaches are shown to underestimate the seepage and associated local water table mounding, sometimes substantially. The proposed method is simple, easy to implement and requires only a small amount of additional data about the aquifer hydraulic properties.
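
    A conceptual sketch of the seepage logic at issue, not the published package code: when the aquifer head is above the streambed bottom the stream and aquifer are connected and seepage follows the usual head difference; when the water table drops below the streambed the stream is disconnected, and adding a (negative) pressure head representing suction in the unsaturated zone increases the driving gradient, which is why neglecting it underestimates seepage.

      # Conceptual sketch only; parameter names and values are illustrative.
      def stream_seepage(h_stream, h_aquifer, z_bed_bottom, conductance, psi_unsat=0.0):
          """Seepage from stream to aquifer (positive = losing stream).

          conductance : streambed conductance (K * length * width / bed thickness)
          psi_unsat   : pressure head (negative) below a disconnected stream;
                        0 recovers the conventional head-difference approach.
          """
          if h_aquifer >= z_bed_bottom:                 # stream and aquifer connected
              return conductance * (h_stream - h_aquifer)
          return conductance * (h_stream - (z_bed_bottom + psi_unsat))   # disconnected

      print(stream_seepage(10.5, 7.0, 9.0, conductance=50.0, psi_unsat=-0.8))  # 115.0
      print(stream_seepage(10.5, 7.0, 9.0, conductance=50.0))                  # 75.0, an underestimate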

  11. Global change modeling for Northern Eurasia: a review and strategies to move forward

    NASA Astrophysics Data System (ADS)

    Monier, E.; Kicklighter, D. W.; Sokolov, A. P.; Zhuang, Q.; Sokolik, I. N.; Lawford, R. G.; Kappas, M.; Paltsev, S.; Groisman, P. Y.

    2017-12-01

    Northern Eurasia is made up of a complex and diverse set of physical, ecological, climatic and human systems, which provide important ecosystem services including the storage of substantial stocks of carbon in its terrestrial ecosystems. At the same time, the region has experienced dramatic climate change, natural disturbances and changes in land management practices over the past century. For these reasons, Northern Eurasia is both a critical region to understand and a complex system with substantial challenges for the modeling community. This review is designed to highlight the state of past and ongoing efforts of the research community to understand and model these environmental, socioeconomic, and climatic changes. We further aim to provide perspectives on the future direction of global change modeling to improve our understanding of the role of Northern Eurasia in the coupled human-Earth system. Modeling efforts have shown that environmental and socioeconomic changes in Northern Eurasia can have major impacts on biodiversity, ecosystems services, environmental sustainability, and the carbon cycle of the region, and beyond. These impacts have the potential to feedback onto and alter the global Earth system. We find that past and ongoing studies have largely focused on specific components of Earth system dynamics and have not systematically examined their feedbacks to the global Earth system and to society. We identify the crucial role of Earth system models in advancing our understanding of feedbacks within the region and with the global system. We further argue for the need for integrated assessment models (IAMs), a suite of models that couple human activity models to Earth system models, which are key to addressing many emerging issues that require a representation of the coupled human-Earth system.

  12. A review of and perspectives on global change modeling for Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Monier, Erwan; Kicklighter, David W.; Sokolov, Andrei P.; Zhuang, Qianlai; Sokolik, Irina N.; Lawford, Richard; Kappas, Martin; Paltsev, Sergey V.; Groisman, Pavel Ya

    2017-08-01

    Northern Eurasia is made up of a complex and diverse set of physical, ecological, climatic and human systems, which provide important ecosystem services including the storage of substantial stocks of carbon in its terrestrial ecosystems. At the same time, the region has experienced dramatic climate change, natural disturbances and changes in land management practices over the past century. For these reasons, Northern Eurasia is both a critical region to understand and a complex system with substantial challenges for the modeling community. This review is designed to highlight the state of past and ongoing efforts of the research community to understand and model these environmental, socioeconomic, and climatic changes. We further aim to provide perspectives on the future direction of global change modeling to improve our understanding of the role of Northern Eurasia in the coupled human-Earth system. Modeling efforts have shown that environmental and socioeconomic changes in Northern Eurasia can have major impacts on biodiversity, ecosystems services, environmental sustainability, and the carbon cycle of the region, and beyond. These impacts have the potential to feedback onto and alter the global Earth system. We find that past and ongoing studies have largely focused on specific components of Earth system dynamics and have not systematically examined their feedbacks to the global Earth system and to society. We identify the crucial role of Earth system models in advancing our understanding of feedbacks within the region and with the global system. We further argue for the need for integrated assessment models (IAMs), a suite of models that couple human activity models to Earth system models, which are key to addressing many emerging issues that require a representation of the coupled human-Earth system.

  13. Surface tension prevails over solute effect in organic-influenced cloud droplet activation.

    PubMed

    Ovadnevaite, Jurgita; Zuend, Andreas; Laaksonen, Ari; Sanchez, Kevin J; Roberts, Greg; Ceburnis, Darius; Decesari, Stefano; Rinaldi, Matteo; Hodas, Natasha; Facchini, Maria Cristina; Seinfeld, John H; O'Dowd, Colin

    2017-06-29

    The spontaneous growth of cloud condensation nuclei (CCN) into cloud droplets under supersaturated water vapour conditions is described by classic Köhler theory. This spontaneous activation of CCN depends on the interplay between the Raoult effect, whereby activation potential increases with decreasing water activity or increasing solute concentration, and the Kelvin effect, whereby activation potential decreases with decreasing droplet size or increases with decreasing surface tension, which is sensitive to surfactants. Surface tension lowering caused by organic surfactants, which diminishes the Kelvin effect, is expected to be negated by a concomitant reduction in the Raoult effect, driven by the displacement of surfactant molecules from the droplet bulk to the droplet-vapour interface. Here we present observational and theoretical evidence illustrating that, in ambient air, surface tension lowering can prevail over the reduction in the Raoult effect, leading to substantial increases in cloud droplet concentrations. We suggest that consideration of liquid-liquid phase separation, leading to complete or partial engulfing of a hygroscopic particle core by a hydrophobic organic-rich phase, can explain the lack of concomitant reduction of the Raoult effect, while maintaining substantial lowering of surface tension, even for partial surface coverage. Apart from the importance of particle size and composition in droplet activation, we show by observation and modelling that incorporation of phase-separation effects into activation thermodynamics can lead to a CCN number concentration that is up to ten times what is predicted by climate models, changing the properties of clouds. An adequate representation of the CCN activation process is essential to the prediction of clouds in climate models, and given the effect of clouds on the Earth's energy balance, improved prediction of aerosol-cloud-climate interactions is likely to result in improved assessments of future climate change.
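
    For reference, classic Köhler theory writes the equilibrium saturation ratio over a droplet of radius r as the product of the Raoult (water-activity) and Kelvin (curvature) terms, which is why surfactant-driven lowering of the surface tension σ can raise the activation potential unless the water activity a_w is reduced in step (a standard textbook form, not a result specific to this study):

      S(r) = a_w \exp\!\left(\frac{2\,\sigma\,M_w}{\rho_w R\,T\,r}\right)
           \approx 1 + \frac{A}{r} - \frac{B}{r^{3}},
      \qquad A = \frac{2\,\sigma\,M_w}{\rho_w R\,T}, \qquad B \propto n_s ,

    where M_w and ρ_w are the molar mass and density of water, R is the gas constant, T the temperature, and n_s the moles of dissolved solute in the droplet.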

  14. Successful arrest of photoreceptor and vision loss expands the therapeutic window of retinal gene therapy to later stages of disease

    PubMed Central

    Beltran, William A.; Cideciyan, Artur V.; Iwabe, Simone; Swider, Malgorzata; Kosyk, Mychajlo S.; McDaid, Kendra; Martynyuk, Inna; Ying, Gui-Shuang; Shaffer, James; Deng, Wen-Tao; Boye, Sanford L.; Lewin, Alfred S.; Hauswirth, William W.; Jacobson, Samuel G.; Aguirre, Gustavo D.

    2015-01-01

    Inherited retinal degenerations cause progressive loss of photoreceptor neurons with eventual blindness. Corrective or neuroprotective gene therapies under development could be delivered at a predegeneration stage to prevent the onset of disease, as well as at intermediate-degeneration stages to slow the rate of progression. Most preclinical gene therapy successes to date have been as predegeneration interventions. In many animal models, as well as in human studies, to date, retinal gene therapy administered well after the onset of degeneration was not able to modify the rate of progression even when successfully reversing dysfunction. We evaluated consequences of gene therapy delivered at intermediate stages of disease in a canine model of X-linked retinitis pigmentosa (XLRP) caused by a mutation in the Retinitis Pigmentosa GTPase Regulator (RPGR) gene. Spatiotemporal natural history of disease was defined and therapeutic dose selected based on predegeneration results. Then interventions were timed at earlier and later phases of intermediate-stage disease, and photoreceptor degeneration monitored with noninvasive imaging, electrophysiological function, and visual behavior for more than 2 y. All parameters showed substantial and significant arrest of the progressive time course of disease with treatment, which resulted in long-term improved retinal function and visual behavior compared with control eyes. Histology confirmed that the human RPGR transgene was stably expressed in photoreceptors and associated with improved structural preservation of rods, cones, and ON bipolar cells together with correction of opsin mislocalization. These findings in a clinically relevant large animal model demonstrate the long-term efficacy of RPGR gene augmentation and substantially broaden the therapeutic window for intervention in patients with RPGR-XLRP. PMID:26460017

  15. Potential benefits of magnetic suspension and balance systems

    NASA Technical Reports Server (NTRS)

    Lawing, Pierce L.; Dress, David A.; Kilgore, Robert A.

    1987-01-01

    The potential of Magnetic Suspension and Balance Systems (MSBS) to improve conventional wind tunnel testing techniques is discussed. Topics include: elimination of model geometry distortion and support interference to improve the measurement accuracy of aerodynamic coefficients; removal of testing restrictions due to supports; improved dynamic stability data; and stores separation testing. Substantial increases in wind tunnel productivity are anticipated due to the coalescence of these improvements. Specific improvements in testing methods for missiles, helicopters, fighter aircraft, twin-fuselage transports and bombers, store separation, water tunnels, and automobiles are also forecast. In a more speculative vein, new wind tunnel test techniques are envisioned as a result of applying MSBS, including free-flight computer trajectories in the test section, pilot-in-the-loop and designer-in-the-loop testing, shipboard missile launch simulation, and optimization of hybrid hypersonic configurations. Also addressed are potential applications of MSBS to such diverse technologies as medical research and practice, industrial robotics, space weaponry, and ore processing in space.

  16. Technical Note: On the Use of Nudging for Aerosol-Climate Model Intercomparison Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Kai; Wan, Hui; Liu, Xiaohong

    2014-08-26

    Nudging is an assimilation technique widely used in the development and evaluation of climate models. Constraining the simulated wind and temperature fields using global weather reanalysis facilitates more straightforward comparison between simulation and observation, and reduces uncertainties associated with natural variabilities of the large-scale circulation. On the other hand, the artificial forcing introduced by nudging can be strong enough to change the basic characteristics of the model climate. In this paper we show that for the Community Atmosphere Model version 5, due to the systematic temperature bias in the standard model and the relatively strong sensitivity of homogeneous ice nucleation to aerosol concentration, nudging towards reanalysis results in substantial reductions in the ice cloud amount and in the impact of anthropogenic aerosols on longwave cloud forcing. In order to reduce discrepancies between the nudged and unconstrained simulations while retaining the advantages of nudging, two alternative experimentation methods are evaluated. The first one constrains only the horizontal winds. The second method nudges both winds and temperature, but replaces the long-term climatology of the reanalysis by that of the model. Results show that both methods lead to substantially improved agreement with the free-running model in terms of the top-of-atmosphere radiation budget and cloud ice amount. The wind-only nudging is more convenient to apply, and provides higher correlations of the wind fields, geopotential height and specific humidity between simulation and reanalysis. This suggests that nudging the horizontal winds but not temperature is a good strategy, especially for studies that involve both warm and cold clouds.
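
    The nudging discussed here amounts to adding a relaxation term that pulls a prognostic field toward the reanalysis (or climatology-adjusted reanalysis) value on a chosen timescale; a minimal sketch of that tendency, with the field values and the 6-hour relaxation timescale as illustrative placeholders:

      import numpy as np

      # Minimal nudging sketch: relax a model field x toward a reference x_ref with
      # timescale tau. Wind-only nudging would apply this to u and v but not to T.
      def nudging_tendency(x, x_ref, tau_seconds):
          return (x_ref - x) / tau_seconds

      x = np.array([250.0, 255.0, 260.0])      # model field (placeholder values)
      x_ref = np.array([252.0, 254.0, 263.0])  # reanalysis target (placeholder)
      dt = 1800.0                              # model time step, s
      tau = 6 * 3600.0                         # 6-hour relaxation timescale (a common choice)

      x_new = x + dt * nudging_tendency(x, x_ref, tau)
      print(x_new)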

  17. Biogeochemical modeling of CO2 and CH4 production in anoxic Arctic soil microcosms

    NASA Astrophysics Data System (ADS)

    Tang, Guoping; Zheng, Jianqiu; Xu, Xiaofeng; Yang, Ziming; Graham, David E.; Gu, Baohua; Painter, Scott L.; Thornton, Peter E.

    2016-09-01

    Soil organic carbon turnover to CO2 and CH4 is sensitive to soil redox potential and pH conditions. However, land surface models do not consider redox and pH in the aqueous phase explicitly, thereby limiting their use for making predictions in anoxic environments. Using recent data from incubations of Arctic soils, we extend the Community Land Model with coupled carbon and nitrogen (CLM-CN) decomposition cascade to include simple organic substrate turnover, fermentation, Fe(III) reduction, and methanogenesis reactions, and assess the efficacy of various temperature and pH response functions. Incorporating the Windermere Humic Aqueous Model (WHAM) enables us to approximately describe the observed pH evolution without additional parameterization. Although Fe(III) reduction is normally assumed to compete with methanogenesis, the model predicts that Fe(III) reduction raises the pH from acidic to neutral, thereby reducing environmental stress to methanogens and accelerating methane production when substrates are not limiting. The equilibrium speciation predicts a substantial increase in CO2 solubility as pH increases, and taking into account CO2 adsorption to surface sites of metal oxides further decreases the predicted headspace gas-phase fraction at low pH. Without adequate representation of these speciation reactions, as well as the impacts of pH, temperature, and pressure, the CO2 production from closed microcosms can be substantially underestimated based on headspace CO2 measurements only. Our results demonstrate the efficacy of geochemical models for simulating soil biogeochemistry and provide predictive understanding and mechanistic representations that can be incorporated into land surface models to improve climate predictions.
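
    As an illustration of the kind of temperature and pH response functions whose efficacy the study assesses, the sketch below uses generic Q10 and pH-optimum forms with illustrative parameters; these are not necessarily the functions adopted in the extended CLM-CN reaction network.

      import numpy as np

      # Generic rate modifiers often used in decomposition/methanogenesis models.
      def f_temperature(T_celsius, T_ref=25.0, Q10=2.0):
          """Q10 temperature response relative to a reference temperature."""
          return Q10 ** ((T_celsius - T_ref) / 10.0)

      def f_ph(pH, pH_opt=7.0, width=1.5):
          """Bell-shaped pH response, equal to 1 at the optimum."""
          return np.exp(-((pH - pH_opt) / width) ** 2)

      base_rate = 1.0e-6                                  # placeholder turnover rate, 1/s
      rate = base_rate * f_temperature(4.0) * f_ph(5.5)   # cold, acidic Arctic soil
      print(f"modified rate = {rate:.2e} 1/s")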

  18. Stochastic parameterization for light absorption by internally mixed BC/dust in snow grains for application to climate models

    NASA Astrophysics Data System (ADS)

    Liou, K. N.; Takano, Y.; He, C.; Yang, P.; Leung, L. R.; Gu, Y.; Lee, W. L.

    2014-06-01

    A stochastic approach has been developed to model the positions of BC (black carbon)/dust internally mixed with two snow grain types: hexagonal plate/column (convex) and Koch snowflake (concave). Light absorption and scattering analysis can then be carried out by means of an improved geometric-optics approach coupled with Monte Carlo photon tracing to determine BC/dust single-scattering properties. For a given shape (plate, Koch snowflake, spheroid, or sphere), internal mixing absorbs substantially more light than external mixing. The snow grain shape effect on absorption is relatively small, but its effect on the asymmetry factor is substantial. Due to a greater probability of intercepting photons, multiple inclusions of BC/dust exhibit a larger absorption than an equal-volume single inclusion. The spectral absorption (0.2-5 µm) for snow grains internally mixed with BC/dust is confined to wavelengths shorter than about 1.4 µm, beyond which ice absorption predominates. Based on the single-scattering properties determined from the stochastic and light absorption parameterizations and using the adding/doubling method for spectral radiative transfer, we find that internal mixing reduces snow albedo substantially more than external mixing and that the snow grain shape plays a critical role in snow albedo calculations through its forward scattering strength. Also, multiple inclusion of BC/dust significantly reduces snow albedo as compared to an equal-volume single sphere. For application to land/snow models, we propose a two-layer spectral snow parameterization involving contaminated fresh snow on top of old snow for investigating and understanding the climatic impact of multiple BC/dust internal mixing associated with snow grain metamorphism, particularly over mountain/snow topography.

  19. Cavitation-resistant inducer

    DOEpatents

    Dunn, C.; Subbaraman, M.R.

    1989-06-13

    An improvement in an inducer for a pump is disclosed wherein the inducer includes a hub, a plurality of radially extending substantially helical blades and a wall member extending about and encompassing an outer periphery of the blades. The improvement comprises forming adjacent pairs of blades and the hub to provide a substantially rectangular cross-sectional flow area, which decreases from the inlet end of the inducer to the discharge end of the inducer, resulting in increased inducer efficiency, improved suction performance, reduced susceptibility to cavitation, reduced susceptibility to hub separation and reduced fabrication costs. 11 figs.

  20. Improving the environmental profile of wood panels via co-production of ethanol and acetic acid.

    PubMed

    Earles, J Mason; Halog, Anthony; Shaler, Stephen

    2011-11-15

    The oriented strand board (OSB) biorefinery is an emerging technology that could improve the building, transportation, and chemical sectors' environmental profiles. By adding a hot water extraction stage to conventional OSB panel manufacturing, hemicellulose polysaccharides can be extracted from wood strands and converted to renewably sourced ethanol and acetic acid. Replacing fossil-based gasoline and acetic acid has the potential to reduce greenhouse gas (GHG) emissions, among other possible impacts. At the same time, hemicellulose extraction could improve the environmental profile of OSB panels by reducing the level of volatile organic compounds (VOCs) emitted during manufacturing. In this study, the life cycle significance of such GHG, VOC, and other emission reductions was investigated. A process model was developed based on a mix of laboratory and industrial-level mass and energy flow data. Using these data a life cycle assessment (LCA) model was built. Sensitive process parameters were identified and used to develop a target production scenario for the OSB biorefinery. The findings suggest that the OSB biorefinery's deployment could substantially improve human and ecosystem health via reduction of select VOCs compared to conventionally produced OSB, gasoline, and acetic acid. Technological advancements are needed, however, to achieve desirable GHG reductions.

  1. Global Health Impacts of Future Aviation Emissions Under Alternative Control Scenarios

    PubMed Central

    2015-01-01

    There is strong evidence of an association between fine particulate matter less than 2.5 μm (PM2.5) in aerodynamic diameter and adverse health outcomes. This study analyzes the global excess mortality attributable to the aviation sector in the present (2006) and in the future (three 2050 scenarios) using the integrated exposure response model that was also used in the 2010 Global Burden of Disease assessment. The PM2.5 concentrations for the present and future scenarios were calculated using aviation emission inventories developed by the Volpe National Transportation Systems Center and a global chemistry-climate model. We found that while excess mortality due to the aviation sector emissions is greater in 2050 compared to 2006, improved fuel policies (technology and operations improvements yielding smaller increases in fuel burn compared to 2006, and conversion to fully sustainable fuels) in 2050 could lead to 72% fewer deaths for adults 25 years and older than a 2050 scenario with no fuel improvements. Among the four health outcomes examined, ischemic heart disease was the greatest cause of death. Our results suggest that implementation of improved fuel policies can have substantial human health benefits. PMID:25412200

  2. Global health impacts of future aviation emissions under alternative control scenarios.

    PubMed

    Morita, Haruka; Yang, Suijia; Unger, Nadine; Kinney, Patrick L

    2014-12-16

    There is strong evidence of an association between fine particulate matter less than 2.5 μm (PM2.5) in aerodynamic diameter and adverse health outcomes. This study analyzes the global excess mortality attributable to the aviation sector in the present (2006) and in the future (three 2050 scenarios) using the integrated exposure response model that was also used in the 2010 Global Burden of Disease assessment. The PM2.5 concentrations for the present and future scenarios were calculated using aviation emission inventories developed by the Volpe National Transportation Systems Center and a global chemistry-climate model. We found that while excess mortality due to the aviation sector emissions is greater in 2050 compared to 2006, improved fuel policies (technology and operations improvements yielding smaller increases in fuel burn compared to 2006, and conversion to fully sustainable fuels) in 2050 could lead to 72% fewer deaths for adults 25 years and older than a 2050 scenario with no fuel improvements. Among the four health outcomes examined, ischemic heart disease was the greatest cause of death. Our results suggest that implementation of improved fuel policies can have substantial human health benefits.
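
    A hedged sketch of the generic calculation behind such estimates follows: a GBD-style integrated exposure-response (IER) relative-risk curve converted to excess deaths through the population attributable fraction. The coefficient values, concentration, and baseline deaths are placeholders, and the study's actual attribution compares scenario pairs rather than applying a single attributable fraction.

```python
import numpy as np

def ier_relative_risk(c, alpha, gamma, delta, c0):
    """GBD-style integrated exposure-response (IER) relative risk at an
    ambient PM2.5 concentration c (ug/m3). The coefficients are
    placeholders, not the fitted GBD values for any specific cause."""
    excess = max(c - c0, 0.0)          # no excess risk below the counterfactual c0
    return 1.0 + alpha * (1.0 - np.exp(-gamma * excess ** delta))

def excess_deaths(rr, baseline_deaths):
    """Deaths attributable to the exposure via the population attributable fraction."""
    paf = (rr - 1.0) / rr
    return paf * baseline_deaths

# Hypothetical numbers for illustration only
rr = ier_relative_risk(c=12.0, alpha=1.6, gamma=0.01, delta=0.5, c0=7.0)
print(f"RR = {rr:.3f}, excess deaths = {excess_deaths(rr, baseline_deaths=1e5):.0f}")
```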

  3. Improving dynamic phytoplankton reserve-utilization models with an indirect proxy for internal nitrogen.

    PubMed

    Malerba, Martino E; Heimann, Kirsten; Connolly, Sean R

    2016-09-07

    Ecologists have often used indirect proxies to represent variables that are difficult or impossible to measure directly. In phytoplankton, the internal concentration of the most limiting nutrient in a cell determines its growth rate. However, directly measuring the concentration of nutrients within cells is inaccurate, expensive, destructive, and time-consuming, substantially impairing our ability to model growth rates in nutrient-limited phytoplankton populations. The red chlorophyll autofluorescence (hereafter "red fluorescence") signal emitted by a cell is highly correlated with nitrogen quota in nitrogen-limited phytoplankton species. The aim of this study was to evaluate the reliability of including flow cytometric red fluorescence as a proxy for internal nitrogen status to model phytoplankton growth rates. To this end, we used the classic Quota model and designed three approaches to calibrate its parameters to data: using empirical observations of cell internal nitrogen quota to fit the model ("Nitrogen-Quota approach"), inferring quota dynamics only from changes in medium nutrient depletion and population density ("Virtual-Quota approach"), or using the red fluorescence emission of a cell as an indirect proxy for its internal nitrogen quota ("Fluorescence-Quota approach"). Two separate analyses were carried out. In the first analysis, stochastic model simulations were parameterized from published empirical relationships and used to generate dynamics of phytoplankton communities reared under nitrogen-limited conditions. Quota models were fitted to the dynamics of each simulated species with the three different approaches and the performance of each model was compared. In the second analysis, we fitted Quota models to laboratory time series and calculated the ability of each calibration approach to describe the observed trajectories of internal nitrogen quota in the culture. Both analyses showed that the Fluorescence-Quota approach, which includes per-cell red fluorescence as a proxy for internal nitrogen, substantially improved the ability of Quota models to describe phytoplankton dynamics, while still accounting for the biologically important process of cell nitrogen storage. More broadly, many population models in ecology implicitly recognize the importance of accounting for storage mechanisms to describe the dynamics of individual organisms. Hence, the approach documented here with phytoplankton dynamics may also be useful for evaluating the potential of indirect proxies in other ecological systems. Copyright © 2016 Elsevier Ltd. All rights reserved.
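
    All three calibration approaches feed the same underlying state equations. A minimal sketch of the classic Quota (Droop) model is given below with illustrative parameter values; in a Fluorescence-Quota setting, the observed quota would be replaced by a calibrated mapping from per-cell red fluorescence (e.g., a hypothetical linear proxy Q ≈ a + b·F).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Classic Droop "Quota" model; parameter names and values are illustrative only.
mu_inf, Q_min = 1.2, 0.05      # 1/day, internal quota at zero growth
rho_max, K_s  = 0.03, 0.5      # max uptake per cell and half-saturation constant

def quota_model(t, y):
    N, Q, S = y                              # cells, internal quota, dissolved N
    rho = rho_max * S / (K_s + S)            # nutrient uptake per cell
    mu  = mu_inf * max(0.0, 1.0 - Q_min / Q) # Droop growth rate from the quota
    dN = mu * N
    dQ = rho - mu * Q                        # quota gained by uptake, diluted by growth
    dS = -rho * N
    return [dN, dQ, dS]

sol = solve_ivp(quota_model, (0.0, 20.0), [1e4, 0.1, 10.0])
print("final cell density:", sol.y[0, -1])
```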

  4. Effect of primary and secondary parameters on analytical estimation of effective thermal conductivity of two phase materials using unit cell approach

    NASA Astrophysics Data System (ADS)

    S, Chidambara Raja; P, Karthikeyan; Kumaraswamidhas, L. A.; M, Ramu

    2018-05-01

    Most thermal design systems involve two-phase materials, and analysis of such systems requires a detailed understanding of the thermal characteristics of the two-phase material. This article aims to develop a geometry-dependent unit cell approach model that considers the effects of all primary parameters (conductivity ratio and concentration) and secondary parameters (geometry, contact resistance, natural convection, Knudsen effects, and radiation) for estimating the effective thermal conductivity of two-phase materials. The analytical equations are formulated based on an isotherm approach for 2-D and 3-D spatially periodic media. The developed models are validated against standard models and are suited to all kinds of operating conditions. The results show substantial improvement over existing models and are in good agreement with experimental data.

  5. Precision electroweak physics at LEP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mannelli, M.

    1994-12-01

    Copious event statistics, a precise understanding of the LEP energy scale, and a favorable experimental situation at the Z0 resonance have allowed the LEP experiments both to provide dramatic confirmation of the Standard Model of strong and electroweak interactions and to place substantially improved constraints on the parameters of the model. The author concentrates on those measurements relevant to the electroweak sector. It will be seen that the precision of these measurements sensitively probes the structure of the Standard Model at the one-loop level, where the calculation of the observables measured at LEP is affected by the value chosen for the top quark mass. One finds that the LEP measurements are consistent with the Standard Model, but only if the mass of the top quark lies within a restricted range of about 20 GeV.

  6. The spectra of ten galactic X-ray sources in the southern sky

    NASA Technical Reports Server (NTRS)

    Cruddace, R.; Bowyer, S.; Lampton, M.; Mack, J. E., Jr.; Margon, B.

    1971-01-01

    Data on ten galactic X-ray sources were obtained during a rocket flight from Brazil in June 1969. Detailed spectra of these sources have been compared with bremsstrahlung, black body, and power law models, each including interstellar absorption. Six of the sources were fitted well by one or more of these models. In only one case were the data sufficient to distinguish the best model. Three of the sources were not fitted by any of the models, which suggests that more complex emission mechanisms are applicable. A comparison of our results with those of previous investigations provides evidence that five of the sources vary in intensity by a factor of 2 or more, and that three have variable spectra. New or substantially improved positions have been derived for four of the sources observed.

  7. Model improvements and validation of TerraSAR-X precise orbit determination

    NASA Astrophysics Data System (ADS)

    Hackel, S.; Montenbruck, O.; Steigenberger, P.; Balss, U.; Gisinger, C.; Eineder, M.

    2017-05-01

    The radar imaging satellite mission TerraSAR-X requires precisely determined satellite orbits for validating geodetic remote sensing techniques. Since the achieved quality of the operationally derived, reduced-dynamic (RD) orbit solutions limits the capabilities of the synthetic aperture radar (SAR) validation, an effort is made to improve the estimated orbit solutions. This paper discusses the benefits of refined dynamical models on orbit accuracy as well as estimated empirical accelerations and compares different dynamic models in an RD orbit determination. Modeling aspects discussed in the paper include the use of a macro-model for drag and radiation pressure computation, the use of high-quality atmospheric density and wind models as well as the benefit of high-fidelity gravity and ocean tide models. The Sun-synchronous dusk-dawn orbit geometry of TerraSAR-X results in a particularly high correlation of solar radiation pressure modeling and estimated normal-direction positions. Furthermore, this mission offers a unique suite of independent sensors for orbit validation. Several parameters serve as quality indicators for the estimated satellite orbit solutions. These include the magnitude of the estimated empirical accelerations, satellite laser ranging (SLR) residuals, and SLR-based orbit corrections. Moreover, the radargrammetric distance measurements of the SAR instrument are selected for assessing the quality of the orbit solutions and compared to the SLR analysis. The use of high-fidelity satellite dynamics models in the RD approach is shown to clearly improve the orbit quality compared to simplified models and loosely constrained empirical accelerations. The estimated empirical accelerations are substantially reduced by 30% in tangential direction when working with the refined dynamical models. Likewise the SLR residuals are reduced from -3 ± 17 to 2 ± 13 mm, and the SLR-derived normal-direction position corrections are reduced from 15 to 6 mm, obtained from the 2012-2014 period. The radar range bias is reduced from -10.3 to -6.1 mm with the updated orbit solutions, which coincides with the reduced standard deviation of the SLR residuals. The improvements are mainly driven by the satellite macro-model for the purpose of solar radiation pressure modeling, improved atmospheric density models, and the use of state-of-the-art gravity field models.

  8. Perceptions of Barriers and Facilitators During Implementation of a Complex Model of Group Prenatal Care in Six Urban Sites

    PubMed Central

    Novick, Gina; Womack, Julie A.; Lewis, Jessica; Stasko, Emily C.; Rising, Sharon S.; Sadler, Lois S.; Cunningham, Shayna C.; Tobin, Jonathan N.; Ickovics, Jeannette R.

    2016-01-01

    Group prenatal care improves perinatal outcomes, but implementing this complex model places substantial demands on settings designed for individual care. To describe perceived barriers and facilitators to implementing and sustaining Centering Pregnancy Plus (CP+) group prenatal care, 24 in-depth interviews were conducted with 22 clinicians, staff, administrators, and study personnel in six of the 14 sites of a randomized trial of the model. All sites served low-income, minority women. Sites for the present evaluation were selected for variation in location, study arm, and initial implementation response. Implementing CP+ was challenging in all sites, requiring substantial adaptations of clinical systems. All sites had barriers to meeting the model’s demands, but how sites responded to these barriers affected whether implementation thrived or struggled. Thriving sites had organizational cultures that supported innovation, champions who advocated for CP+, and staff who viewed logistical demands as manageable hurdles. Struggling sites had bureaucratic organizational structures and lacked buy-in and financial resources, and staff were overwhelmed by the model’s challenges. Findings suggested that implementing and sustaining health care innovation requires new practices and different ways of thinking, and health systems may not fully recognize the magnitude of change required. Consequently, evidence-based practices are modified or discontinued, and outcomes may differ from those in the original controlled studies. Before implementing new models of care, clinical settings should anticipate model demands and assess capacity for adapting to the disruptions of innovation. PMID:26340483

  9. Growth Modeling with Non-Ignorable Dropout: Alternative Analyses of the STAR*D Antidepressant Trial

    PubMed Central

    Muthén, Bengt; Asparouhov, Tihomir; Hunter, Aimee; Leuchter, Andrew

    2011-01-01

    This paper uses a general latent variable framework to study a series of models for non-ignorable missingness due to dropout. Non-ignorable missing data modeling acknowledges that missingness may depend on not only covariates and observed outcomes at previous time points as with the standard missing at random (MAR) assumption, but also on latent variables such as values that would have been observed (missing outcomes), developmental trends (growth factors), and qualitatively different types of development (latent trajectory classes). These alternative predictors of missing data can be explored in a general latent variable framework using the Mplus program. A flexible new model uses an extended pattern-mixture approach where missingness is a function of latent dropout classes in combination with growth mixture modeling using latent trajectory classes. A new selection model allows not only an influence of the outcomes on missingness, but allows this influence to vary across latent trajectory classes. Recommendations are given for choosing models. The missing data models are applied to longitudinal data from STAR*D, the largest antidepressant clinical trial in the U.S. to date. Despite the importance of this trial, STAR*D growth model analyses using non-ignorable missing data techniques have not been explored until now. The STAR*D data are shown to feature distinct trajectory classes, including a low class corresponding to substantial improvement in depression, a minority class with a U-shaped curve corresponding to transient improvement, and a high class corresponding to no improvement. The analyses provide a new way to assess drug efficiency in the presence of dropout. PMID:21381817

  10. Zika Virus: Recent Advances towards the Development of Vaccines and Therapeutics.

    PubMed

    McArthur, Monica A

    2017-06-13

    Zika is a rapidly emerging public health threat. Although clinical infection is frequently mild, significant neurological manifestations have been demonstrated in infants born to Zika virus (ZIKV)-infected mothers. Due to the substantial ramifications of intrauterine infection, effective countermeasures are urgently needed. In order to develop effective anti-ZIKV vaccines and therapeutics, improved animal models and a better understanding of the immunological correlates of protection against ZIKV are required. This review summarizes what is currently known about ZIKV, the clinical manifestations and epidemiology of Zika, as well as the development of animal models to study ZIKV infection, host immune responses against ZIKV, and the current state of development of vaccines and therapeutics against ZIKV.

  11. On estimating the basin-scale ocean circulation from satellite altimetry. Part 1: Straightforward spherical harmonic expansion

    NASA Technical Reports Server (NTRS)

    Tai, Chang-Kou

    1988-01-01

    Direct estimation of the absolute dynamic topography from satellite altimetry has been confined to the largest scales (basically the basin scale) because the signal-to-noise ratio is less favorable everywhere else. But even for the largest scales, the results are contaminated by the orbit error and geoid uncertainties. Recently a more accurate Earth gravity model (GEM-T1) became available, providing the opportunity to examine the whole question of direct estimation in a more critical light. It is found that our knowledge of the Earth's gravity field has indeed improved a great deal. However, it is not yet possible to claim definitively that our knowledge of the ocean circulation has improved through direct estimation. Yet, the improvement in the gravity model has come to the point that it is no longer possible to attribute the basin-scale discrepancy between altimetric and hydrographic results mostly to geoid uncertainties. A substantial part of the difference must be due to other factors, i.e., the orbit error or the uncertainty of the hydrographically derived dynamic topography.

  12. Estimating Evapotranspiration with Land Data Assimilation Systems

    NASA Technical Reports Server (NTRS)

    Peters-Lidard, C. D.; Kumar, S. V.; Mocko, D. M.; Tian, Y.

    2011-01-01

    Advancements in both land surface models (LSM) and land surface data assimilation, especially over the last decade, have substantially advanced the ability of land data assimilation systems (LDAS) to estimate evapotranspiration (ET). This article provides a historical perspective on international LSM intercomparison efforts and the development of LDAS systems, both of which have improved LSM ET skill. In addition, an assessment of ET estimates for current LDAS systems is provided along with current research that demonstrates improvement in LSM ET estimates due to assimilating satellite-based soil moisture products. Using the Ensemble Kalman Filter in the Land Information System, we assimilate both NASA and Land Parameter Retrieval Model (LPRM) soil moisture products into the Noah LSM Version 3.2 with the North American LDAS phase 2 (NLDAS-2) forcing to mimic the NLDAS-2 configuration. Through comparisons with two global reference ET products, one based on interpolated flux tower data and one from a new satellite ET algorithm, over the NLDAS2 domain, we demonstrate improvement in ET estimates only when assimilating the LPRM soil moisture product.

  13. Improved simulation of Antarctic sea ice due to the radiative effects of falling snow

    NASA Astrophysics Data System (ADS)

    Li, J.-L. F.; Richardson, Mark; Hong, Yulan; Lee, Wei-Liang; Wang, Yi-Hui; Yu, Jia-Yuh; Fetzer, Eric; Stephens, Graeme; Liu, Yinghui

    2017-08-01

    Southern Ocean sea-ice cover exerts critical control on local albedo and Antarctic precipitation, but simulated Antarctic sea-ice concentration commonly disagrees with observations. Here we show that the radiative effects of precipitating ice (falling snow) contribute substantially to this discrepancy. Many models exclude these radiative effects, so they underestimate both shortwave albedo and downward longwave radiation. Using two simulations with the climate model CESM1, we show that including falling-snow radiative effects improves the simulations relative to cloud properties from CloudSat-CALIPSO, radiation from CERES-EBAF and sea-ice concentration from passive microwave sensors. From 50-70°S, the simulated sea-ice-area bias is reduced by 2.12 × 106 km2 (55%) in winter and by 1.17 × 106 km2 (39%) in summer, mainly because increased wintertime longwave heating restricts sea-ice growth and so reduces summer albedo. Improved Antarctic sea-ice simulations will increase confidence in projected Antarctic sea level contributions and changes in global warming driven by long-term changes in Southern Ocean feedbacks.

  14. Substantially oxygen-free contact tube

    NASA Technical Reports Server (NTRS)

    Pike, James F. (Inventor)

    1993-01-01

    A device for arc welding is provided in which a continuously-fed electrode wire is in electrical contact with a contact tube. The contact tube is improved by using a substantially oxygen-free conductive alloy in order to reduce the amount of electrical erosion.

  15. Substantially Oxygen-Free Contact Tube

    NASA Technical Reports Server (NTRS)

    Pike, James F. (Inventor)

    1991-01-01

    A device for arc welding is provided in which a continuously-fed electrode wire is in electrical contact with a contact tube. The contact tube is improved by using a substantially oxygen-free conductive alloy in order to reduce the amount of electrical erosion.

  16. Bayesian Estimation of Small Effects in Exercise and Sports Science.

    PubMed

    Mengersen, Kerrie L; Drovandi, Christopher C; Robert, Christian P; Pyne, David B; Gore, Christopher J

    2016-01-01

    The aim of this paper is to provide a Bayesian formulation of the so-called magnitude-based inference approach to quantifying and interpreting effects, and, in a case study example, to provide accurate probabilistic statements that correspond to the intended magnitude-based inferences. The model is described in the context of a published small-scale athlete study which employed a magnitude-based inference approach to compare the effect of two altitude training regimens (live high-train low (LHTL), and intermittent hypoxic exposure (IHE)) on running performance and blood measurements of elite triathletes. The posterior distributions, and corresponding point and interval estimates, for the parameters and associated effects and comparisons of interest, were estimated using Markov chain Monte Carlo simulations. The Bayesian analysis was shown to provide more direct probabilistic comparisons of treatments and to identify small effects of interest. The approach avoided asymptotic assumptions and overcame issues such as multiple testing. Bayesian analysis of unscaled effects showed a probability of 0.96 that LHTL yields a substantially greater increase in hemoglobin mass than IHE, a 0.93 probability of a substantially greater improvement in running economy, and a greater than 0.96 probability that both IHE and LHTL yield a substantially greater improvement in maximum blood lactate concentration compared to a placebo. The conclusions are consistent with those obtained using a 'magnitude-based inference' approach that has been promoted in the field. The paper demonstrates that a fully Bayesian analysis is a simple and effective way of analysing small effects, providing a rich set of results that are straightforward to interpret in terms of probabilistic statements.
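
    The probabilistic statements quoted above are straightforward to compute once posterior draws of an effect are available. The sketch below uses simulated draws in place of real MCMC output and an assumed smallest-worthwhile-change threshold; it illustrates the calculation only, not the paper's model or data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for MCMC draws of a treatment contrast (e.g., LHTL minus IHE);
# in practice these come from the fitted model's posterior.
posterior_diff = rng.normal(loc=2.0, scale=1.0, size=20000)

swc = 0.4  # smallest worthwhile change, in the same units (assumed threshold)

p_increase = np.mean(posterior_diff > swc)           # substantially positive effect
p_trivial  = np.mean(np.abs(posterior_diff) <= swc)  # trivial effect
p_decrease = np.mean(posterior_diff < -swc)          # substantially negative effect

print(f"P(substantial increase) = {p_increase:.3f}")
print(f"P(trivial effect)       = {p_trivial:.3f}")
print(f"P(substantial decrease) = {p_decrease:.3f}")
```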

  17. Genome typing of nonhuman primate models: implications for biomedical research.

    PubMed

    Haus, Tanja; Ferguson, Betsy; Rogers, Jeffrey; Doxiadis, Gaby; Certa, Ulrich; Rose, Nicola J; Teepe, Robert; Weinbauer, Gerhard F; Roos, Christian

    2014-11-01

    The success of personalized medicine rests on understanding the genetic variation between individuals. Thus, as medical practice evolves and variation among individuals becomes a fundamental aspect of clinical medicine, a thorough consideration of the genetic and genomic information concerning the animals used as models in biomedical research also becomes critical. In particular, nonhuman primates (NHPs) offer great promise as models for many aspects of human health and disease. These are outbred species exhibiting substantial levels of genetic variation; however, understanding of the contribution of this variation to phenotypes is lagging behind in NHP species. Thus, there is a pivotal need to address this gap and define strategies for characterizing both genomic content and variability within primate models of human disease. Here, we discuss the current state of genomics of NHP models and offer guidelines for future work to ensure continued improvement and utility of this line of biomedical research. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Unsteady Computational Tests of a Non-Equilibrium Turbulence Model

    NASA Astrophysics Data System (ADS)

    Jirasek, Adam; Hamlington, Peter; Lofthouse, Andrew; Usafa Collaboration; Cu Boulder Collaboration

    2017-11-01

    A non-equilibrium turbulence model is assessed on simulations of three practically relevant unsteady test cases: oscillating channel flow, transonic flow around an oscillating airfoil, and transonic flow around the Benchmark Super-Critical Wing. The first case is related to piston-driven flows, while the remaining cases are relevant to unsteady aerodynamics at high angles of attack and transonic speeds. Non-equilibrium turbulence effects arise in each of these cases in the form of a lag between the mean strain rate and the Reynolds stresses, resulting in reduced kinetic energy production compared to classical equilibrium turbulence models that are based on the gradient transport (or Boussinesq) hypothesis. As a result of the improved representation of unsteady flow effects, the non-equilibrium model provides substantially better agreement with available experimental data than do classical equilibrium turbulence models. This suggests that the non-equilibrium model may be ideally suited for simulations of modern high-speed, high angle of attack aerodynamics problems.

  19. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  20. Distributed Assimilation of Satellite-based Snow Extent for Improving Simulated Streamflow in Mountainous, Dense Forests: An Example Over the DMIP2 Western Basins

    NASA Technical Reports Server (NTRS)

    Yatheendradas, Soni; Peters-Lidard, Christa D.; Koren, Victor; Cosgrove, Brian A.; DeGoncalves, Luis G. D.; Smith, Michael; Geiger, James; Cui, Zhengtao; Borak, Jordan; Kumar, Sujay V.

    2012-01-01

    Snow cover area affects snowmelt, soil moisture, evapotranspiration, and ultimately streamflow. For the Distributed Model Intercomparison Project - Phase 2 Western basins, we assimilate satellite-based fractional snow cover area (fSCA) from the Moderate Resolution Imaging Spectroradiometer, or MODIS, into the National Weather Service (NWS) SNOW-17 model. This model is coupled with the NWS Sacramento Heat Transfer (SAC-HT) model inside the National Aeronautics and Space Administration's (NASA) Land Information System. SNOW-17 computes fSCA from snow water equivalent (SWE) values using an areal depletion curve. Using a direct insertion, we assimilate fSCAs in two fully distributed ways: 1) we update the curve by attempting SWE preservation, and 2) we reconstruct SWEs using the curve. The preceding are refinements of an existing simple, conceptually-guided NWS algorithm. Satellite fSCA over dense forests inadequately accounts for below-canopy snow, degrading simulated streamflow upon assimilation during snowmelt. Accordingly, we implement a below-canopy allowance during assimilation. This simplistic allowance and direct insertion are found to be inadequate for improving calibrated results, still degrading them as mentioned above. However, for streamflow volume for the uncalibrated runs, we obtain: (1) substantial to major improvements (64-81 %) as a percentage of the control run residuals (or distance from observations), and (2) minor improvements (16-22 %) as a percentage of observed values. We highlight the need for detailed representations of canopy-snow optical radiative transfer processes in mountainous, dense forest regions if assimilation-based improvements are to be seen in calibrated runs over these areas.

  1. Modelling regional climate change and urban planning scenarios and their impacts on the urban environment in two cities with WRF-ACASA

    NASA Astrophysics Data System (ADS)

    Falk, M.; Pyles, R. D.; Marras, S.; Spano, D.; Paw U, K. T.

    2011-12-01

    The number of urban metabolism studies has increased in recent years, due to the important impact that energy, water and carbon exchange over urban areas have on climate change. Urban modeling is therefore crucial in the future design and management of cities. This study presents the ACASA model coupled to the Weather Research and Forecasting (WRF-ARW) mesoscale model to simulate urban fluxes at a horizontal resolution of 200 meters for urban areas of roughly 100 km^2. As part of the European Project "BRIDGE", these regional simulations were used in combination with remotely sensed data to provide constraints on the land surface types and the exchange of carbon and energy fluxes from urban centers. Surface-atmosphere exchanges of mass and energy were simulated using the Advanced Canopy Atmosphere Soil Algorithm (ACASA). ACASA is a multi-layer high-order closure model, recently modified to work over natural, agricultural as well as urban environments. In particular, improvements were made to account for the anthropogenic contribution to heat and carbon production. For each city, four climate change and four urban planning scenarios were simulated. The climate change scenarios include a base scenario (Sc0: 2008 Commit in IPCC), a medium emission scenario (Sc1: IPCC A2), a worst-case emission scenario (Sc2: IPCC A1FI), and a best-case emission scenario (Sc3: IPCC B1). The urban planning scenarios include different development scenarios such as smart growth. The two cities are a high-latitude city, Helsinki (Finland), and a historic city, Florence (Italy). Helsinki is characterized by recent, rapid urbanization that requires a substantial amount of energy for heating, while Florence is representative of cities in lower latitudes, with substantial cultural heritage and a comparatively constant architectural footprint over time. In general, simulated fluxes matched the point observations well and showed consistent improvement in the energy partitioning over urban regions. We present comparisons of eddy covariance (EC) tower flux observations from the Florence (Ximeniano) site for 1-9 April, 2008 with results from two sets of high-resolution simulations: the first using dynamically-downscaled input/boundary conditions (Model-0) and the second using fully nested WRF-ACASA (Model-1). In each simulation the model physics are the same; only the WRF domain configuration differs. Preliminary results (Figure 1) indicate a degree of parity (and a slight statistical improvement) in the performance of Model-1 versus Model-0 with respect to observations. Figure 1 shows observed air temperature values alongside both model estimates. Additional results indicate that care must be taken in configuring the WRF domain, as performance appears to be sensitive to model configuration.

  2. Modeling biophysical/biogeochemical/ecological/ocean/atmosphere two-way interactions using NCEP CFS/SSiB5/TRIFFID/DAYCENT: challenge and promising

    NASA Astrophysics Data System (ADS)

    Xue, Y.; Liu, Y.; Cox, P. M.; De Sales, F.; Lee, J.; Marx, L.; Hartman, M. D.; Yang, R.; Parton, W. J.; Qiu, B.; Ek, M. B.

    2016-12-01

    Evaluations of several dynamic vegetation models' (DVMs) performance in offline experiments and in the CMIP5 simulations suggest that most of the DVMs substantially overestimate leaf area index (LAI) and the length of the growing season, which contributes to overestimation of precipitation in their coupled models. These results suggest important deficiencies in today's DVMs but also show the importance of proper ecological processes in Earth system modeling. We have developed a water-carbon-energy balance-based ecosystem model (SSiB4/TRIFFID) and verified it with field and satellite measurements at seasonal to decadal and longer scales. In the global offline tests, the model was integrated from 1950 to 2010 driven by observed meteorological forcing. The simulated trend and decadal variabilities in surface ecosystem conditions (e.g., plant functional types, LAI, GPP) and surface water and energy balances are analyzed; further experiments and analyses are carried out to isolate the contributions of elevated atmospheric carbon concentration, global warming, soil moisture, and climate variability. The role of nitrogen processes simulated by the DayCent model is also assessed. The vegetation model has further been coupled with the NCEP Climate Forecast System (CFS) model, which has consistently shown improvements in simulated atmospheric and ocean conditions compared with runs using specified vegetation conditions. In one experiment, two parameterizations that calculate the mean water potential in soil layers, which affects transpiration and plant mortality, are tested. These two methods have a substantial impact on the global decadal variability of precipitation and surface temperature, with even opposite signs over some regions of the world. These results show the uncertainty in DVM modeling, with significant implications for future prediction. It is imperative to evaluate DVMs with comprehensive observational data.

  3. The Importance of Distance to Resources in the Spatial Modelling of Bat Foraging Habitat

    PubMed Central

    Rainho, Ana; Palmeirim, Jorge M.

    2011-01-01

    Many bats are threatened by habitat loss, but opportunities to manage their habitats are now increasing. Success of management depends greatly on the capacity to determine where and how interventions should take place, so models predicting how animals use landscapes are important to plan them. Bats are quite distinctive in the way they use space for foraging because (i) most are colonial central-place foragers and (ii) exploit scattered and distant resources, although this increases flying costs. To evaluate how important distances to resources are in modelling foraging bat habitat suitability, we radio-tracked two cave-dwelling species of conservation concern (Rhinolophus mehelyi and Miniopterus schreibersii) in a Mediterranean landscape. Habitat and distance variables were evaluated using logistic regression modelling. Distance variables greatly increased the performance of models, and distance to roost and to drinking water could alone explain 86 and 73% of the use of space by M. schreibersii and R. mehelyi, respectively. Land-cover and soil productivity also provided a significant contribution to the final models. Habitat suitability maps generated by models with and without distance variables differed substantially, confirming the shortcomings of maps generated without distance variables. Indeed, areas shown as highly suitable in maps generated without distance variables proved poorly suitable when distance variables were also considered. We concluded that distances to resources are determinant in the way bats forage across the landscape, and that using distance variables substantially improves the accuracy of suitability maps generated with spatially explicit models. Consequently, modelling with these variables is important to guide habitat management in bats and similarly mobile animals, particularly if they are central-place foragers or depend on spatially scarce resources. PMID:21547076
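
    The modelling comparison reduces to fitting use/availability data with and without distance covariates and comparing discrimination. The sketch below does this on synthetic data with hypothetical predictor names and effect sizes; it is not the study's dataset or final model specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000

# Synthetic stand-ins for the predictors used in the study (units arbitrary)
dist_roost = rng.uniform(0, 20, n)   # km to colony roost
dist_water = rng.uniform(0, 10, n)   # km to nearest drinking water
land_cover = rng.integers(0, 2, n)   # 1 = favourable cover, 0 = other

# Simulated "used vs. available" response with strong distance effects
logit = 1.5 - 0.25 * dist_roost - 0.35 * dist_water + 0.8 * land_cover
used = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X_habitat = land_cover.reshape(-1, 1).astype(float)
X_full = np.column_stack([dist_roost, dist_water, land_cover])

for name, X in [("habitat only", X_habitat), ("habitat + distances", X_full)]:
    model = LogisticRegression().fit(X, used)
    auc = roc_auc_score(used, model.predict_proba(X)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```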

  4. Evaluation and inversion of a net ecosystem carbon exchange model for grasslands and croplands

    NASA Astrophysics Data System (ADS)

    Herbst, M.; Klosterhalfen, A.; Weihermueller, L.; Graf, A.; Schmidt, M.; Huisman, J. A.; Vereecken, H.

    2017-12-01

    A one-dimensional soil water, heat, and CO2 flux model (SOILCO2), a pool concept of soil carbon turnover (RothC), and a crop growth module (SUCROS) were coupled to predict the net ecosystem exchange (NEE) of carbon. This model, further referred to as AgroC, was extended with routines for managed grassland as well as for root exudation and root decay. In a first step, the coupled model was applied to two winter wheat sites and one upland grassland site in Germany. The model was calibrated based on soil water content, soil temperature, biometric, and soil respiration measurements for each site, and validated in terms of hourly NEE measured with the eddy covariance technique. The overall model performance of AgroC was acceptable with a model efficiency >0.78 for NEE. In a second step, AgroC was optimized with the eddy covariance NEE measurements to examine the effect of various objective functions, constraints, and data transformations on estimated NEE, which showed a distinct sensitivity to the choice of objective function and the inclusion of soil respiration data in the optimization process. Both daytime and nighttime fluxes were found to be sensitive to the selected optimization strategy. Additional consideration of soil respiration measurements improved the simulation of small positive fluxes remarkably. Even though the model performance of the selected optimization strategies did not diverge substantially, the resulting annual NEE differed considerably. We conclude that data transformations, the definition of objective functions, and data sources have to be considered cautiously when using a terrestrial ecosystem model to determine carbon balances by means of eddy covariance measurements.

  5. Resonance-induced sensitivity enhancement method for conductivity sensors

    NASA Technical Reports Server (NTRS)

    Tai, Yu-Chong (Inventor); Shih, Chi-yuan (Inventor); Li, Wei (Inventor); Zheng, Siyang (Inventor)

    2009-01-01

    Methods and systems for improving the sensitivity of a variety of conductivity sensing devices, in particular capacitively-coupled contactless conductivity detectors. A parallel inductor is added to the conductivity sensor. The sensor with the parallel inductor is operated at a resonant frequency of the equivalent circuit model. At the resonant frequency, parasitic capacitances that are either in series or in parallel with the conductance (and possibly a series resistance) are substantially removed from the equivalent circuit, leaving a purely resistive impedance. An appreciably higher sensor sensitivity results. Experimental verification shows that sensitivity improvements of the order of 10,000-fold are possible. Examples of detecting particulates with high precision by application of the apparatus and methods of operation are described.
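
    The core idea can be reproduced with a lumped-element calculation: choose the parallel inductor so that it resonates with the parasitic capacitance, leaving a nearly purely resistive impedance of 1/G at the resonant frequency. Component values below are illustrative, not those of the patented detector.

```python
import numpy as np

# Simplified equivalent circuit: cell conductance G shunted by a parasitic
# capacitance Cp; a parallel inductor L is chosen to resonate Cp out.
G  = 1e-6      # siemens (solution conductance to be measured)
Cp = 10e-12    # farads (parasitic capacitance)
L  = 1.0e-3    # henries (added parallel inductor)

f0 = 1.0 / (2 * np.pi * np.sqrt(L * Cp))   # resonant frequency

def impedance(f):
    w = 2 * np.pi * f
    Y = G + 1j * w * Cp + 1.0 / (1j * w * L)   # parallel admittances add
    return 1.0 / Y

print(f"resonant frequency ~ {f0 / 1e6:.2f} MHz")
print(f"|Z| at 0.3*f0 : {abs(impedance(0.3 * f0)):.3e} ohm")
print(f"|Z| at f0     : {abs(impedance(f0)):.3e} ohm  (~ 1/G = {1/G:.1e} ohm)")
```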

  6. Improved Prediction of Quasi-Global Vegetation Conditions Using Remotely-Sensed Surface Soil Moisture

    NASA Technical Reports Server (NTRS)

    Bolten, John; Crow, Wade

    2012-01-01

    The added value of satellite-based surface soil moisture retrievals for agricultural drought monitoring is assessed by calculating the lagged rank correlation between remotely-sensed vegetation indices (VI) and soil moisture estimates obtained both before and after the assimilation of surface soil moisture retrievals derived from the Advanced Microwave Scanning Radiometer-EOS (AMSR-E) into a soil water balance model. Higher soil moisture/VI lag correlations imply an enhanced ability to predict future vegetation conditions using estimates of current soil moisture. Results demonstrate that the assimilation of AMSR-E surface soil moisture retrievals substantially improve the performance of a global drought monitoring system - particularly in sparsely-instrumented areas of the world where high-quality rainfall observations are unavailable.
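
    The skill metric described here is simply a rank correlation between soil moisture at time t and a vegetation index at time t + lag. A minimal sketch with toy series follows; the variable names and the synthetic two-step lag are assumptions for illustration.

```python
import numpy as np
from scipy.stats import spearmanr

def lagged_rank_correlation(soil_moisture, vi, lag):
    """Spearman rank correlation between soil moisture at time t and a
    vegetation index at time t + lag (1-D arrays on a common time step)."""
    if lag > 0:
        sm, v = soil_moisture[:-lag], vi[lag:]
    else:
        sm, v = soil_moisture, vi
    rho, _ = spearmanr(sm, v)
    return rho

# Toy series standing in for soil moisture and NDVI anomalies
rng = np.random.default_rng(3)
sm = rng.normal(size=120)
ndvi = 0.6 * np.roll(sm, 2) + 0.4 * rng.normal(size=120)  # VI lags moisture by ~2 steps

for lag in range(5):
    print(f"lag {lag}: rho = {lagged_rank_correlation(sm, ndvi, lag):.2f}")
```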

  7. A 15,000-hour cyclic endurance test of an 8-centimeter-diameter electron bombardment mercury ion thruster

    NASA Technical Reports Server (NTRS)

    Nakanishi, S.

    1976-01-01

    A laboratory model 8 cm thruster with improvements to minimize ion chamber erosion and peeling of sputtered metal was subjected to a cyclic endurance test for 15,040 hours and 460 restarts. A charted history of several thruster operating variables and off-normal events is shown in 600-hour segments at three points in the test. The transient behavior of these variables during a typical start-stop cycle is presented. Findings of the post-test inspection confirmed most of the expected results. Charge exchange ions caused normal accelerator grid erosion. The workability of the various design features was substantiated, and attainable improvements in propellant utilization efficiency should significantly reduce accelerator erosion.

  8. The importance of surface recombination and energy-bandgap narrowing in p-n-junction silicon solar cells

    NASA Technical Reports Server (NTRS)

    Fossum, J. G.; Lindholm, F. A.; Shibib, M. A.

    1979-01-01

    Experimental data demonstrating the sensitivity of open-circuit voltage to front-surface conditions are presented for a variety of p-n-junction silicon solar cells. Analytical models accounting for the data are defined and supported by additional experiments. The models and the data imply that a) surface recombination significantly limits the open-circuit voltage (and the short-circuit current) of typical silicon cells, and b) energy-bandgap narrowing is important in the manifestation of these limitations. The models suggest modifications in both the structural design and the fabrication processing of the cells that would result in substantial improvements in cell performance. The benefits of one such modification - the addition of a thin thermal silicon-dioxide layer on the front surface - are indicated experimentally.

  9. Editorial

    NASA Astrophysics Data System (ADS)

    Bijeljic, Branko; Icardi, Matteo; Prodanović, Maša

    2018-05-01

    Substantial progress has been made over the last few decades on understanding the physics of multiphase flow and reactive transport phenomena in subsurface porous media. The confluence of advances in experimental techniques (including micromodels, X-ray microtomography, and Nuclear Magnetic Resonance (NMR)) and in computational power has made it possible to observe static and dynamic multi-scale flow, transport and reactive processes, thus stimulating the development of a new generation of modelling tools from pore to field scale. One of the key challenges is to make experiments and models as complementary as possible, with continuously improving experimental methods, in order to increase the predictive capabilities of theoretical models across scales. This creates a need to establish rigorous benchmark studies of flow, transport and reaction in porous media, which can then serve as the basis for introducing more complex phenomena in future developments.

  10. The pros and cons of code validation

    NASA Technical Reports Server (NTRS)

    Bobbitt, Percy J.

    1988-01-01

    Computational and wind tunnel error sources are examined and quantified using specific calculations of experimental data, and a substantial comparison of theoretical and experimental results, or a code validation, is discussed. Wind tunnel error sources considered include wall interference, sting effects, Reynolds number effects, flow quality and transition, and instrumentation such as strain gage balances, electronically scanned pressure systems, hot film gages, hot wire anemometers, and laser velocimeters. Computational error sources include math model equation sets, the solution algorithm, artificial viscosity/dissipation, boundary conditions, the uniqueness of solutions, grid resolution, turbulence modeling, and Reynolds number effects. It is concluded that, although improvements in theory are being made more quickly than in experiments, wind tunnel research has the advantage of the 'right' turbulence model and a more realistic transition process in a free-transition test.

  11. The Regionalization of National-Scale SPARROW Models for Stream Nutrients

    USGS Publications Warehouse

    Schwarz, G.E.; Alexander, R.B.; Smith, R.A.; Preston, S.D.

    2011-01-01

    This analysis modifies the parsimonious specification of recently published total nitrogen (TN) and total phosphorus (TP) national-scale SPAtially Referenced Regressions On Watershed attributes models to allow each model coefficient to vary geographically among three major river basins of the conterminous United States. Regionalization of the national models reduces the standard errors in the prediction of TN and TP loads, expressed as a percentage of the predicted load, by about 6 and 7%. We develop and apply a method for combining national-scale and regional-scale information to estimate a hybrid model that imposes cross-region constraints that limit regional variation in model coefficients, effectively reducing the number of free model parameters as compared to a collection of independent regional models. The hybrid TN and TP regional models have improved model fit relative to the respective national models, reducing the standard error in the prediction of loads, expressed as a percentage of load, by about 5 and 4%. Only 19% of the TN hybrid model coefficients and just 2% of the TP hybrid model coefficients show evidence of substantial regional specificity (more than ±100% deviation from the national model estimate). The hybrid models have much greater precision in the estimated coefficients than do the unconstrained regional models, demonstrating the efficacy of pooling information across regions to improve regional models. © 2011 American Water Resources Association. This article is a U.S. Government work and is in the public domain in the USA.
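
    One simple way to express the hybrid idea of constraining regional coefficients toward national ones is a ridge-type penalty on their deviation; the closed-form sketch below is a schematic stand-in under that assumption, not the published SPARROW estimator.

```python
import numpy as np

def hybrid_regional_fit(X, y, beta_national, lam):
    """Regional least squares shrunk toward the national coefficients:
        minimize ||y - X b||^2 + lam * ||b - b_national||^2
    Closed form: b = (X'X + lam I)^(-1) (X'y + lam b_national).
    lam = 0 gives an independent regional model; large lam returns the
    national coefficients."""
    p = X.shape[1]
    A = X.T @ X + lam * np.eye(p)
    rhs = X.T @ y + lam * beta_national
    return np.linalg.solve(A, rhs)

# Toy illustration with hypothetical watershed attributes
rng = np.random.default_rng(7)
beta_national = np.array([0.8, 0.3, -0.5])
X = rng.normal(size=(40, 3))                                   # one region's predictors
y = X @ np.array([1.0, 0.1, -0.4]) + rng.normal(scale=0.5, size=40)

for lam in (0.0, 5.0, 50.0):
    print(lam, np.round(hybrid_regional_fit(X, y, beta_national, lam), 2))
```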

  12. Improving Water Balance Estimation in the Nile by Combining Remote Sensing and Hydrological Modelling: a Template for Ungauged Basins

    NASA Astrophysics Data System (ADS)

    Gleason, C. J.; Wada, Y.; Wang, J.

    2017-12-01

    Declining gauging infrastructure and fractious water politics have decreased available information about river flows globally, especially in international river basins. Remote sensing and water balance modelling are frequently cited as potential solutions, but these techniques largely rely on the same declining gauge data to constrain or parameterize discharge estimates, thus creating a circular approach to estimating discharge that is inapplicable to ungauged basins. To address this, we here combine a discontinued gauge, remotely sensed discharge estimates made via at-many-stations hydraulic geometry (AMHG) and Landsat data, and the PCR-GLOBWB hydrological model to estimate discharge for an ungauged time period for the Lower Nile (1978-present). Specifically, we first estimate initial discharges from 86 Landsat images and AMHG (1984-2015), and then use these flow estimates to tune the hydrologic model. Our tuning methodology is purposefully simple and can be easily applied to any model without the need for calibration/parameterization. The resulting tuned modelled hydrograph shows a large improvement in flow magnitude over previous modelled hydrographs, and validation of tuned monthly model output flows against the historical gauge yields an RMSE of 343 m3/s (33.7%). By contrast, the original simulation had an order-of-magnitude flow error. This improvement is substantial but not perfect: modelled flows have a one- to two-month wet-season lag and a negative bias. More sophisticated model calibration and training (e.g., data assimilation) is needed to improve upon our results; however, the results achieved by coupling physical models and remote sensing are a promising first step and proof of concept toward future modelling of ungauged flows. This is especially true as massive cloud computing via Google Earth Engine makes our method easily applicable to any basin without current gauges. Finally, we purposefully do not offer prescriptive solutions for Nile management, and rather hope that the methods demonstrated herein can prove useful to river stakeholders in managing their own water.
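
    The validation statistic and the spirit of the "purposefully simple" tuning can be sketched as follows, using toy monthly flows rather than the paper's data; the scaling-to-remote-sensing step is an assumed stand-in for the actual tuning procedure.

```python
import numpy as np

def rmse(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def percent_rmse(sim, obs):
    return 100.0 * rmse(sim, obs) / float(np.mean(obs))

def scale_to_remote_sensing(sim, q_remote_mean):
    """Deliberately simple tuning: scale the simulated hydrograph so its mean
    matches the mean of the remotely sensed (AMHG-style) discharge estimates."""
    return np.asarray(sim, float) * (q_remote_mean / np.mean(sim))

# Hypothetical monthly flows (m3/s) for illustration only
obs   = np.array([900, 1100, 1400, 2500, 2100, 1200])
sim   = np.array([300,  400,  500,  900,  800,  450])   # strong low bias
tuned = scale_to_remote_sensing(sim, q_remote_mean=1500.0)

print("raw   RMSE:", round(rmse(sim, obs)),   f"({percent_rmse(sim, obs):.1f}%)")
print("tuned RMSE:", round(rmse(tuned, obs)), f"({percent_rmse(tuned, obs):.1f}%)")
```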

  13. A Test of the ARCC© Model Improves Implementation of Evidence-Based Practice, Healthcare Culture, and Patient Outcomes.

    PubMed

    Melnyk, Bernadette Mazurek; Fineout-Overholt, Ellen; Giggleman, Martha; Choy, Katie

    2017-02-01

    Although several models of evidence-based practice (EBP) exist, there is a paucity of studies that have been conducted to evaluate their implementation in healthcare settings. The purpose of this study was to examine the impact of the Advancing Research and Clinical practice through close Collaboration (ARCC) Model on organizational culture, clinicians' EBP beliefs and EBP implementation, and patient outcomes at one healthcare system in the western United States. A pre-test, post-test longitudinal pre-experimental study was conducted with follow-up immediately following full implementation of the ARCC Model. The study was conducted at a 341-bed acute care hospital in the western region of the United States. The sample consisted of 58 interprofessional healthcare professionals. The ARCC Model was implemented in a sequential format over 12 months with the key strategy of preparing a critical mass of EBP mentors for the healthcare system. Healthcare professionals' EBP beliefs, EBP implementation, and organizational culture were measured with valid and reliable instruments. Patient outcomes were collected in aggregate from the hospital's medical records. Findings indicated significant increases in clinicians' EBP beliefs and EBP implementation along with positive movement toward an organizational EBP culture. Study findings also indicated substantial improvements in several patient outcomes. Implementation of the ARCC Model in healthcare systems can enhance clinicians' beliefs and implementation of evidence-based care, improve patient outcomes, and move organizational culture toward EBP. © 2016 Sigma Theta Tau International.

  14. Development of the BIOME-BGC model for the simulation of managed Moso bamboo forest ecosystems.

    PubMed

    Mao, Fangjie; Li, Pingheng; Zhou, Guomo; Du, Huaqiang; Xu, Xiaojun; Shi, Yongjun; Mo, Lufeng; Zhou, Yufeng; Tu, Guoqing

    2016-05-01

    Numerical models are the most appropriate instrument for the analysis of the carbon balance of terrestrial ecosystems and their interactions with changing environmental conditions. The process-based model BIOME-BGC is widely used in simulation of carbon balance within vegetation, litter and soil of unmanaged ecosystems. For Moso bamboo forests, however, simulations with BIOME-BGC are inaccurate in terms of the growing season and the carbon allocation, due to the oversimplified representation of phenology. Our aim was to improve the applicability of BIOME-BGC for managed Moso bamboo forest ecosystems by implementing several new modules, including phenology, carbon allocation, and management. Instead of the simple phenology and carbon allocation representations in the original version, a periodic Moso bamboo phenology and carbon allocation module was implemented, which can handle the processes of Moso bamboo shooting and high growth during "on-year" and "off-year". Four management modules (digging bamboo shoots, selective cutting, obtruncation, fertilization) were integrated in order to quantify the functioning of managed ecosystems. The improved model was calibrated and validated using eddy covariance measurement data collected at a managed Moso bamboo forest site (Anji) during 2011-2013. As a result of these developments and calibrations, the performance of the model was substantially improved. Regarding the measured and modeled fluxes (gross primary production, total ecosystem respiration, net ecosystem exchange), relative errors were decreased by 42.23%, 103.02% and 18.67%, respectively. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Conditioning laboratory cats to handling and transport.

    PubMed

    Gruen, Margaret E; Thomson, Andrea E; Clary, Gillian P; Hamilton, Alexandra K; Hudson, Lola C; Meeker, Rick B; Sherman, Barbara L

    2013-10-01

    As research subjects, cats have contributed substantially to our understanding of biological systems, from the development of mammalian visual pathways to the pathophysiology of feline immunodeficiency virus as a model for human immunodeficiency virus. Few studies have evaluated humane methods for managing cats in laboratory animal facilities, however, in order to reduce fear responses and improve their welfare. The authors describe a behavioral protocol used in their laboratory to condition cats to handling and transport. Such behavioral conditioning benefits the welfare of the cats, the safety of animal technicians and the quality of feline research data.

  16. Physically Consistent Eddy-resolving State Estimation and Prediction of the Coupled Pan-Arctic Climate System at Daily to Interannual Time Scales Using the Regional Arctic Climate Model (RACM)

    DTIC Science & Technology

    2014-09-30

    large biases aloft manifest themselves as large circulation biases at the surface (Fig. 3). Wintertime sea level pressure (SLP) contours align closely ... extends Arctic, and the Icelandic low is very weak and shifted eastward from its proper location. Summer SLP biases in RASM_nonudg are smaller than winter SLP biases, but are still substantial, and are again greatly improved in RASM_nudg. Although the magnitude of SLP biases is somewhat smaller ...

  17. Imidazopyridine CB2 agonists: optimization of CB2/CB1 selectivity and implications for in vivo analgesic efficacy.

    PubMed

    Trotter, B Wesley; Nanda, Kausik K; Burgey, Christopher S; Potteiger, Craig M; Deng, James Z; Green, Ahren I; Hartnett, John C; Kett, Nathan R; Wu, Zhicai; Henze, Darrell A; Della Penna, Kimberly; Desai, Reshma; Leitl, Michael D; Lemaire, Wei; White, Rebecca B; Yeh, Suzie; Urban, Mark O; Kane, Stefanie A; Hartman, George D; Bilodeau, Mark T

    2011-04-15

    A new series of imidazopyridine CB2 agonists is described. Structural optimization improved CB2/CB1 selectivity in this series and conferred physical properties that facilitated high in vivo exposure, both centrally and peripherally. Administration of a highly selective CB2 agonist in a rat model of analgesia was ineffective despite substantial CNS exposure, while administration of a moderately selective CB2/CB1 agonist exhibited significant analgesic effects. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Communication: An efficient and accurate perturbative correction to initiator full configuration interaction quantum Monte Carlo

    NASA Astrophysics Data System (ADS)

    Blunt, Nick S.

    2018-06-01

    We present a perturbative correction within initiator full configuration interaction quantum Monte Carlo (i-FCIQMC). In the existing i-FCIQMC algorithm, a significant number of spawned walkers are discarded due to the initiator criteria. Here we show that these discarded walkers have a form that allows the calculation of a second-order Epstein-Nesbet correction, which may be accumulated in a trivial and inexpensive manner, yet substantially improves i-FCIQMC results. The correction is applied to the Hubbard model, the uniform electron gas, and molecular systems.
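
    For orientation, the textbook second-order Epstein-Nesbet expression has the form below, written for determinants D_a outside a reference (initiator) space I. This is only the generic form, not necessarily the exact stochastic estimator accumulated in the i-FCIQMC algorithm described above.

    \Delta E_{2} \;=\; \sum_{a \notin \mathcal{I}}
      \frac{\bigl|\langle D_a \,|\, \hat{H} \,|\, \Psi_{\mathcal{I}} \rangle\bigr|^{2}}
           {E_{\mathcal{I}} \;-\; \langle D_a \,|\, \hat{H} \,|\, D_a \rangle}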

  19. Mount Sinai Hospital's approach to Ontario's Health System Funding Reform.

    PubMed

    Chalk, Tyler; Lau, Davina; Morgan, Matthew; Dietrich, Sandra; Beduz, Mary Agnes; Bell, Chaim M

    2014-01-01

    In April 2012, the Ontario government introduced Health System Funding Reform (HSFR), a transformational shift in how hospitals are funded. Mount Sinai Hospital recognized that moving from global funding to a "patient-based" model would have substantial operational and clinical implications. Adjusting to the new funding environment was set as a top corporate priority, serving as the strategic basis for re-examining and redesigning operations to further improve both quality and efficiency. Two years into HSFR, this article outlines Mount Sinai Hospital's approach and highlights key lessons learned. Copyright © 2014 Longwoods Publishing.

  20. A tunable refractive index matching medium for live imaging cells, tissues and model organisms

    PubMed Central

    Boothe, Tobias; Hilbert, Lennart; Heide, Michael; Berninger, Lea; Huttner, Wieland B; Zaburdaev, Vasily; Vastenhouw, Nadine L; Myers, Eugene W; Drechsel, David N; Rink, Jochen C

    2017-01-01

    In light microscopy, refractive index mismatches between media and sample cause spherical aberrations that often limit penetration depth and resolution. Optical clearing techniques can alleviate these mismatches, but they are so far limited to fixed samples. We present Iodixanol as a non-toxic medium supplement that allows refractive index matching in live specimens and thus substantially improves image quality in live-imaged primary cell cultures, planarians, zebrafish and human cerebral organoids. DOI: http://dx.doi.org/10.7554/eLife.27240.001 PMID:28708059

  1. Effective-field renormalization-group method for Ising systems

    NASA Astrophysics Data System (ADS)

    Fittipaldi, I. P.; De Albuquerque, D. F.

    1992-02-01

    A new, readily applicable effective-field renormalization-group (EFRG) scheme for computing critical properties of Ising spin systems is proposed and used to study the phase diagrams of a quenched bond-mixed spin Ising model on square and Kagomé lattices. The present EFRG approach yields results that improve substantially on those obtained from the standard mean-field renormalization-group (MFRG) method. In particular, it is shown that the EFRG scheme correctly distinguishes the geometry of the lattice structure even when working with the smallest possible clusters, namely N'=1 and N=2.
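
    As background, phenomenological renormalization schemes of this family typically equate the linear response of the order parameter of two clusters of different sizes to a small symmetry-breaking field b; the fixed point of that relation gives the estimate of the critical coupling. The generic condition is sketched below in LaTeX; the specific effective-field expressions for the N'=1 and N=2 clusters used in the paper are not reproduced here.

    f_{N}(K) \;\equiv\; \left.\frac{\partial m_{N}(K,b)}{\partial b}\right|_{b=0},
    \qquad
    f_{N}(K^{*}) \;=\; f_{N'}(K^{*}) \;\;\Rightarrow\;\; K^{*} \approx K_{c}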

  2. Reducing hospital readmission through team-based primary care: A 7-week pilot study integrating behavioral health and pharmacy.

    PubMed

    DeCaporale-Ryan, Lauren N; Ahmed-Sarwar, Nabila; Upham, Robbyn; Mahler, Karen; Lashway, Katie

    2017-06-01

    A team-based service delivery model was applied to provide patients with biopsychosocial care following hospital discharge to reduce hospital readmission. Most previous interventions focused on transitions of care have taken place in the inpatient setting, with attention to predischarge strategies. These interventions have not considered psychosocial stressors, and few have explored management in primary care settings. A 7-week team-based service delivery model was implemented in a family medicine practice emphasizing a biopsychosocial approach. A physician, psychologist, pharmacist, care managers, and interdisciplinary trainees worked with 17 patients following hospital discharge. This comprehensive evaluation assessed patients' mood, cognitive abilities, and self-management of health behaviors. Modifications were made to improve ease of access to outpatient care and to improve patient understanding of the therapeutic plan. This pilot study was conducted to determine the utility of the model. Of the 17 patients, 15 avoided readmission at the 30- and 90-day intervals. Other substantial benefits were noted, including reduced polypharmacy, engagement in specialty care, and reduction of environmental stressors to improve access to care. The clinic in which this was implemented is currently making efforts to maintain this model of care based on observed success. Although this work only represents a small sample, the results are encouraging. This model can be replicated in other primary care settings with specialty clinicians on site. Specifically, approaches that promote team-based delivery in a primary care setting may support improved patient outcomes and reduced overall system costs. Recommendations for research in a clinical setting are also offered. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. The PDB_REDO server for macromolecular structure model optimization.

    PubMed

    Joosten, Robbie P; Long, Fei; Murshudov, Garib N; Perrakis, Anastassis

    2014-07-01

    The refinement and validation of a crystallographic structure model is the last step before the coordinates and the associated data are submitted to the Protein Data Bank (PDB). The success of the refinement procedure is typically assessed by validating the models against geometrical criteria and the diffraction data, and is an important step in ensuring the quality of the PDB public archive [Read et al. (2011), Structure, 19, 1395-1412]. The PDB_REDO procedure aims for 'constructive validation', aspiring to consistent and optimal refinement parameterization and pro-active model rebuilding, not only correcting errors but striving for optimal interpretation of the electron density. A web server for PDB_REDO has been implemented, allowing thorough, consistent and fully automated optimization of the refinement procedure in REFMAC and partial model rebuilding. The goal of the web server is to help practicing crystallographers to improve their model prior to submission to the PDB. For this, additional steps were implemented in the PDB_REDO pipeline, both in the refinement procedure, e.g. testing of resolution limits and k-fold cross-validation for small test sets, and as new validation criteria, e.g. the density-fit metrics implemented in EDSTATS and ligand validation as implemented in YASARA. Innovative ways to present the refinement and validation results to the user are also described, which together with auto-generated Coot scripts can guide users to subsequent model inspection and improvement. It is demonstrated that using the server can lead to substantial improvement of structure models before they are submitted to the PDB.

  4. The PDB_REDO server for macromolecular structure model optimization

    PubMed Central

    Joosten, Robbie P.; Long, Fei; Murshudov, Garib N.; Perrakis, Anastassis

    2014-01-01

    The refinement and validation of a crystallographic structure model is the last step before the coordinates and the associated data are submitted to the Protein Data Bank (PDB). The success of the refinement procedure is typically assessed by validating the models against geometrical criteria and the diffraction data, and is an important step in ensuring the quality of the PDB public archive [Read et al. (2011), Structure, 19, 1395–1412]. The PDB_REDO procedure aims for ‘constructive validation’, aspiring to consistent and optimal refinement parameterization and pro-active model rebuilding, not only correcting errors but striving for optimal interpretation of the electron density. A web server for PDB_REDO has been implemented, allowing thorough, consistent and fully automated optimization of the refinement procedure in REFMAC and partial model rebuilding. The goal of the web server is to help practicing crystallographers to improve their model prior to submission to the PDB. For this, additional steps were implemented in the PDB_REDO pipeline, both in the refinement procedure, e.g. testing of resolution limits and k-fold cross-validation for small test sets, and as new validation criteria, e.g. the density-fit metrics implemented in EDSTATS and ligand validation as implemented in YASARA. Innovative ways to present the refinement and validation results to the user are also described, which together with auto-generated Coot scripts can guide users to subsequent model inspection and improvement. It is demonstrated that using the server can lead to substantial improvement of structure models before they are submitted to the PDB. PMID:25075342

  5. Evaluation of Convective Transport in the GEOS-5 Chemistry and Climate Model

    NASA Technical Reports Server (NTRS)

    Pickering, Kenneth E.; Ott, Lesley E.; Shi, Jainn J.; Tao, Wei-Kuo; Mari, Celine; Schlager, Hans

    2011-01-01

    The NASA Goddard Earth Observing System (GEOS-5) Chemistry and Climate Model (CCM) consists of a global atmospheric general circulation model and the combined stratospheric and tropospheric chemistry package from the NASA Global Modeling Initiative (GMI) chemical transport model. The subgrid process of convective tracer transport is represented through the Relaxed Arakawa-Schubert parameterization in the GEOS-5 CCM. However, substantial uncertainty for tracer transport is associated with this parameterization, as is the case with all global and regional models. We have designed a project to comprehensively evaluate this parameterization from the point of view of tracer transport, and determine the most appropriate improvements that can be made to the GEOS-5 convection algorithm, allowing improvement in our understanding of the role of convective processes in determining atmospheric composition. We first simulate tracer transport in individual observed convective events with a cloud-resolving model (WRF). Initial condition tracer profiles (CO, CO2, O3) are constructed from aircraft data collected in undisturbed air, and the simulations are evaluated using aircraft data taken in the convective anvils. A single-column (SCM) version of the GEOS-5 GCM with online tracers is then run for the same convective events. SCM output is evaluated based on averaged tracer fields from the cloud-resolving model. Sensitivity simulations with adjusted parameters will be run in the SCM to determine improvements in the representation of convective transport. The focus of the work to date is on tropical continental convective events from the African Monsoon Multidisciplinary Analyses (AMMA) field mission in August 2006 that were extensively sampled by multiple research aircraft.

  6. The Agricultural Model Intercomparison and Improvement Project (AgMIP): Progress and Preliminary Results

    NASA Astrophysics Data System (ADS)

    Rosenzweig, C.

    2011-12-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) is a distributed climate-scenario simulation exercise for historical model intercomparison and future climate change conditions with participation of multiple crop and agricultural trade modeling groups around the world. The goals of AgMIP are to improve substantially the characterization of risk of hunger and world food security due to climate change and to enhance adaptation capacity in both developing and developed countries. Recent progress and the current status of AgMIP will be presented, highlighting three areas of activity: preliminary results from crop pilot studies, outcomes from regional workshops, and emerging scientific challenges. AgMIP crop modeling efforts are being led by pilot studies, which have been established for wheat, maize, rice, and sugarcane. These crop-specific initiatives have proven instrumental in testing and contributing to AgMIP protocols, as well as creating preliminary results for aggregation and input to agricultural trade models. Regional workshops are being held to encourage collaborations and set research activities in motion for key agricultural areas. The first of these workshops was hosted by Embrapa and UNICAMP and held in Campinas, Brazil. Outcomes from this meeting have informed crop modeling research activities within South America, AgMIP protocols, and future regional workshops. Several scientific challenges have emerged and are currently being addressed by AgMIP researchers. Areas of particular interest include geospatial weather generation, ensemble methods for climate scenarios and crop models, spatial aggregation of field-scale yields to regional and global production, and characterization of future changes in climate variability.

  7. A Caveat Note on Tuning in the Development of Coupled Climate Models

    NASA Astrophysics Data System (ADS)

    Dommenget, Dietmar; Rezny, Michael

    2018-01-01

    State-of-the-art coupled general circulation models (CGCMs) have substantial errors in their simulations of climate. In particular, these errors can lead to large uncertainties in the simulated climate response (both globally and regionally) to a doubling of CO2. Currently, tuning of the parameterization schemes in CGCMs is a significant part of model development. It is not clear whether such tuning actually improves models. The tuning process is (in general) neither documented nor reproducible. Alternative methods such as flux correction are not used, nor is it clear whether such methods would perform better. In this study, ensembles of perturbed physics experiments are performed with the Globally Resolved Energy Balance (GREB) model to test the impact of tuning. The work illustrates that tuning has, on average, limited skill given the complexity of the system, the limited computing resources, and the limited observations available to optimize parameters. While tuning may improve model performance (such as reproducing observed past climate), it will not get closer to the "true" physics, nor will it significantly improve future climate change projections. Tuning will introduce artificial compensating error interactions between submodels that will hamper further model development. In turn, flux corrections do perform well in most, but not all, aspects. A main advantage of flux correction is that it is much cheaper, simpler, and more transparent, and it does not introduce artificial error interactions between submodels. These GREB model experiments should be considered as a pilot study to motivate further CGCM studies that address the issues of model tuning.

  8. Estimating West Nile virus transmission period in Pennsylvania using an optimized degree-day model.

    PubMed

    Chen, Shi; Blanford, Justine I; Fleischer, Shelby J; Hutchinson, Michael; Saunders, Michael C; Thomas, Matthew B

    2013-07-01

    We provide calibrated degree-day models to predict potential West Nile virus (WNV) transmission periods in Pennsylvania. We begin by following the standard approach of treating the degree-days necessary for the virus to complete the extrinsic incubation period (EIP), and mosquito longevity, as constants. This approach failed to adequately explain virus transmission periods based on mosquito surveillance data from 4 locations (Harrisburg, Philadelphia, Pittsburgh, and Williamsport) in Pennsylvania from 2002 to 2008. Allowing the EIP and adult longevity to vary across time and space improved model fit substantially. The calibrated models increase the ability to successfully predict the WNV transmission period in Pennsylvania to 70-80%, compared to less than 30% for the uncalibrated model. Model validation showed the optimized models to be robust in 3 of the locations, although still showing errors for Philadelphia. These models and methods could provide useful tools to predict the WNV transmission period from surveillance datasets, assess potential WNV risk, and inform mosquito surveillance strategies.
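
    The core of a degree-day approach is a simple cumulative heat sum above a developmental threshold. The Python sketch below illustrates the idea for the EIP; the threshold temperature and degree-day requirement used here are illustrative placeholder values of the kind reported in the WNV literature, not the calibrated Pennsylvania parameters from this study.

    def daily_degree_days(tmin, tmax, t_base):
        """Simple average-method degree-days for one day (no upper threshold)."""
        return max(0.0, (tmin + tmax) / 2.0 - t_base)

    def eip_completion_day(tmin_series, tmax_series, t_base=14.3, eip_dd=109.0):
        """Return the first day index on which cumulative degree-days reach the
        degree-day requirement for the extrinsic incubation period (EIP).

        t_base and eip_dd are placeholder values, not the calibrated parameters
        from the study above.
        """
        total = 0.0
        for day, (tn, tx) in enumerate(zip(tmin_series, tmax_series)):
            total += daily_degree_days(tn, tx, t_base)
            if total >= eip_dd:
                return day
        return None  # EIP not completed within the temperature record

    # Hypothetical two weeks of daily minimum/maximum temperatures (deg C).
    tmin = [15, 16, 17, 18, 19, 20, 21, 20, 19, 18, 20, 21, 22, 21]
    tmax = [25, 26, 27, 29, 30, 31, 32, 31, 29, 28, 30, 32, 33, 31]
    print("EIP completed on day:", eip_completion_day(tmin, tmax))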

  9. Risk of bias reporting in the recent animal focal cerebral ischaemia literature.

    PubMed

    Bahor, Zsanett; Liao, Jing; Macleod, Malcolm R; Bannach-Brown, Alexandra; McCann, Sarah K; Wever, Kimberley E; Thomas, James; Ottavi, Thomas; Howells, David W; Rice, Andrew; Ananiadou, Sophia; Sena, Emily

    2017-10-15

    Findings from in vivo research may be less reliable where studies do not report measures to reduce risks of bias. The experimental stroke community has been at the forefront of implementing changes to improve reporting, but it is not known whether these efforts are associated with continuous improvements. Our aims here were firstly to validate an automated tool to assess risks of bias in published works, and secondly to assess the reporting of measures taken to reduce the risk of bias within recent literature for two experimental models of stroke. We developed and used text analytic approaches to automatically ascertain reporting of measures to reduce risk of bias from full-text articles describing animal experiments inducing middle cerebral artery occlusion (MCAO) or modelling lacunar stroke. Compared with previous assessments, there were improvements in the reporting of measures taken to reduce risks of bias in the MCAO literature but not in the lacunar stroke literature. Accuracy of automated annotation of risk of bias in the MCAO literature was 86% (randomization), 94% (blinding) and 100% (sample size calculation); and in the lacunar stroke literature accuracy was 67% (randomization), 91% (blinding) and 96% (sample size calculation). There remains substantial opportunity for improvement in the reporting of animal research modelling stroke, particularly in the lacunar stroke literature. Further, automated tools perform sufficiently well to identify whether studies report blinded assessment of outcome, but improvements are required in the tools to ascertain whether randomization and a sample size calculation were reported. © 2017 The Author(s).

  10. Molecular simulation of water and hydration effects in different environments: challenges and developments for DFTB based models.

    PubMed

    Goyal, Puja; Qian, Hu-Jun; Irle, Stephan; Lu, Xiya; Roston, Daniel; Mori, Toshifumi; Elstner, Marcus; Cui, Qiang

    2014-09-25

    We discuss the description of water and hydration effects that employs an approximate density functional theory, DFTB3, in either a full QM or QM/MM framework. The goal is to explore, with the current formulation of DFTB3, the performance of this method for treating water in different chemical environments, the magnitude and nature of changes required to improve its performance, and factors that dictate its applicability to reactions in the condensed phase in a QM/MM framework. A relatively minor change (on the scale of kBT) in the O-H repulsive potential is observed to substantially improve the structural properties of bulk water under ambient conditions; modest improvements are also seen in dynamic properties of bulk water. This simple change also improves the description of protonated water clusters, a solvated proton, and to a more limited degree, a solvated hydroxide. By comparing results from DFTB3 models that differ in the description of water, we confirm that proton transfer energetics are adequately described by the standard DFTB3/3OB model for meaningful mechanistic analyses. For QM/MM applications, a robust parametrization of QM-MM interactions requires an explicit consideration of condensed phase properties, for which an efficient sampling technique was developed recently and is reviewed here. The discussions help make clear the value and limitations of DFTB3 based simulations, as well as the developments needed to further improve the accuracy and transferability of the methodology.

  11. Significant inconsistency of vegetation carbon density in CMIP5 Earth system models against observational data

    NASA Astrophysics Data System (ADS)

    Song, Xia; Hoffman, Forrest M.; Iversen, Colleen M.; Yin, Yunhe; Kumar, Jitendra; Ma, Chun; Xu, Xiaofeng

    2017-09-01

    Earth system models (ESMs) have been widely used for projecting global vegetation carbon dynamics, yet how well ESMs perform in simulating vegetation carbon density remains untested. We compiled observational data of vegetation carbon density from the literature and existing data sets to evaluate nine ESMs at site, biome, latitudinal, and global scales. Three variables were chosen for ESM evaluation: root carbon density (including fine and coarse roots), total vegetation carbon density, and the root:total vegetation carbon ratio (R/T ratio). ESMs performed well in simulating the spatial distribution of carbon densities in root (r = 0.71) and total vegetation (r = 0.62). However, ESMs had significant biases in simulating absolute carbon densities in root and total vegetation biomass across the majority of land ecosystems, especially in tropical and arctic ecosystems. In particular, ESMs significantly overestimated carbon density in root (183%) and total vegetation biomass (167%) in the climate zone of 10°S-10°N. Substantial discrepancies between modeled and observed R/T ratios were found: the R/T ratios from ESMs were relatively constant, approximately 0.2 across all ecosystems, along latitudinal gradients, and in tropical, temperate, and arctic climatic zones, which was significantly different from the large variations observed in the R/T ratios (0.1-0.8). These substantial inconsistencies between ESM-derived carbon density in root and total vegetation biomass and the R/T ratio at multiple scales indicate urgent needs for model improvements to carbon allocation algorithms and for more intensive field campaigns targeting carbon density in all key vegetation components.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Guoping; Zheng, Jianqiu; Xu, Xiaofeng

    Soil organic carbon turnover to CO2 and CH4 is sensitive to soil redox potential and pH conditions. However, land surface models do not consider redox and pH in the aqueous phase explicitly, thereby limiting their use for making predictions in anoxic environments. Using recent data from incubations of Arctic soils, we extend the Community Land Model with coupled carbon and nitrogen (CLM-CN) decomposition cascade to include simple organic substrate turnover, fermentation, Fe(III) reduction, and methanogenesis reactions, and assess the efficacy of various temperature and pH response functions. Incorporating the Windermere Humic Aqueous Model (WHAM) enables us to approximately describe the observed pH evolution without additional parameterization. Though Fe(III) reduction is normally assumed to compete with methanogenesis, the model predicts that Fe(III) reduction raises the pH from acidic to neutral, thereby reducing environmental stress to methanogens and accelerating methane production when substrates are not limiting. Furthermore, the equilibrium speciation predicts a substantial increase in CO2 solubility as pH increases, and taking into account CO2 adsorption to surface sites of metal oxides further decreases the predicted headspace gas-phase fraction at low pH. Without adequate representation of these speciation reactions, as well as the impacts of pH, temperature, and pressure, the CO2 production from closed microcosms can be substantially underestimated based on headspace CO2 measurements only. Our results demonstrate the efficacy of geochemical models for simulating soil biogeochemistry and provide predictive understanding and mechanistic representations that can be incorporated into land surface models to improve climate predictions.
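
    The pH dependence of dissolved CO2 partitioning mentioned above follows from carbonate equilibrium chemistry. The Python sketch below computes the equilibrium fractions of dissolved inorganic carbon for a range of pH values using approximate freshwater dissociation constants at 25 deg C; it is an idealized illustration and omits the ionic-strength, adsorption, temperature, and pressure effects treated in the study.

    # Carbonic acid dissociation constants at 25 deg C (approximate freshwater values).
    K1, K2 = 10**-6.35, 10**-10.33

    def dic_fractions(pH):
        """Equilibrium fractions of dissolved inorganic carbon present as
        CO2(aq)+H2CO3, HCO3-, and CO3^2- at a given pH (ideal-solution sketch)."""
        h = 10.0 ** (-pH)
        denom = h * h + K1 * h + K1 * K2
        return h * h / denom, K1 * h / denom, K1 * K2 / denom

    for pH in (4.5, 5.5, 6.5, 7.5):
        f_co2, f_hco3, f_co3 = dic_fractions(pH)
        print(f"pH {pH}: CO2(aq) {f_co2:.2f}, HCO3- {f_hco3:.2f}, CO3-- {f_co3:.2f}")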

  13. A gamma-ray constraint on the nature of dark matter

    NASA Technical Reports Server (NTRS)

    Silk, Joseph; Bloemen, Hans

    1987-01-01

    If even a small component of the Galactic spheroid consists of the weakly interacting Majorana fermions that are cold-dark-matter candidate particles for the Galactic halo, there should be a substantial flux of annihilation gamma rays from a source of about 1-deg extent at the Galactic center. COS-B observations already constrain the halo cold-dark-matter (CDM) content entrained in the inner spheroid to be less than about 10 percent. A somewhat weaker constraint applies to the CDM believed to be present in the Galactic disk, but still only about 15 percent can be in such particles. Monochromatic line photons of energy 3-10 GeV are also predicted, and future experiments may be capable of improving these limits. Since both theoretical models of galaxy formation in a CDM-dominated universe and mass models for the rotation curve in the inner Galaxy suggest that a substantial fraction of the spheroid component should be nonluminous and incorporate entrained halo CDM, the hypothesis that the halo CDM consists predominantly of weakly interacting fermions such as photinos or heavy Majorana-mass neutrinos or higgsinos may already be subject to observational test.

  14. A new look at the decomposition of agricultural productivity growth incorporating weather effects.

    PubMed

    Njuki, Eric; Bravo-Ureta, Boris E; O'Donnell, Christopher J

    2018-01-01

    Random fluctuations in temperature and precipitation have substantial impacts on agricultural output. However, the contribution of these changing configurations in weather to total factor productivity (TFP) growth has not been addressed explicitly in econometric analyses. Thus, the key objective of this study is to quantify and to investigate the role of changing weather patterns in explaining yearly fluctuations in TFP. For this purpose, we define TFP to be a measure of total output divided by a measure of total input. We estimate a stochastic production frontier model using U.S. state-level agricultural data incorporating growing season temperature and precipitation, and intra-annual standard deviations of temperature and precipitation for the period 1960-2004. We use the estimated parameters of the model to compute a TFP index that has good axiomatic properties. We then decompose TFP growth in each state into weather effects, technological progress, technical efficiency, and scale-mix efficiency changes. This approach improves our understanding of the role of different components of TFP in agricultural productivity growth. We find that annual TFP growth averaged 1.56% between 1960 and 2004. Moreover, we observe substantial heterogeneity in weather effects across states and over time.

  15. A new look at the decomposition of agricultural productivity growth incorporating weather effects

    PubMed Central

    Bravo-Ureta, Boris E.; O’Donnell, Christopher J.

    2018-01-01

    Random fluctuations in temperature and precipitation have substantial impacts on agricultural output. However, the contribution of these changing configurations in weather to total factor productivity (TFP) growth has not been addressed explicitly in econometric analyses. Thus, the key objective of this study is to quantify and to investigate the role of changing weather patterns in explaining yearly fluctuations in TFP. For this purpose, we define TFP to be a measure of total output divided by a measure of total input. We estimate a stochastic production frontier model using U.S. state-level agricultural data incorporating growing season temperature and precipitation, and intra-annual standard deviations of temperature and precipitation for the period 1960–2004. We use the estimated parameters of the model to compute a TFP index that has good axiomatic properties. We then decompose TFP growth in each state into weather effects, technological progress, technical efficiency, and scale-mix efficiency changes. This approach improves our understanding of the role of different components of TFP in agricultural productivity growth. We find that annual TFP growth averaged 1.56% between 1960 and 2004. Moreover, we observe substantial heterogeneity in weather effects across states and over time. PMID:29466461
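
    As a generic illustration of a total factor productivity index of the output-over-input type described above, the Python sketch below computes Tornqvist-style TFP growth between two periods. The data and shares are hypothetical, and the sketch does not reproduce the stochastic-frontier decomposition estimated in the study.

    import numpy as np

    def tornqvist_tfp_growth(q0, q1, x0, x1, rev_share, cost_share):
        """Log TFP growth between two periods as a Tornqvist output index minus
        a Tornqvist input index. A generic productivity-index sketch, not the
        specific stochastic-frontier decomposition estimated in the study above.

        q0, q1: output quantities; x0, x1: input quantities;
        rev_share, cost_share: average revenue / cost shares of outputs / inputs.
        """
        dln_q = np.sum(rev_share * (np.log(q1) - np.log(q0)))
        dln_x = np.sum(cost_share * (np.log(x1) - np.log(x0)))
        return dln_q - dln_x

    # Hypothetical two-output, three-input example.
    q0, q1 = np.array([100.0, 50.0]), np.array([104.0, 52.0])
    x0, x1 = np.array([30.0, 20.0, 10.0]), np.array([30.5, 20.2, 10.1])
    print("TFP growth (%):",
          100 * tornqvist_tfp_growth(q0, q1, x0, x1,
                                     rev_share=np.array([0.7, 0.3]),
                                     cost_share=np.array([0.5, 0.3, 0.2])))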

  16. Impact of SST on heavy rainfall events on eastern Adriatic during SOP1 of HyMeX

    NASA Astrophysics Data System (ADS)

    Ivatek-Šahdan, Stjepan; Stanešić, Antonio; Tudor, Martina; Odak Plenković, Iris; Janeković, Ivica

    2018-02-01

    The season of late summer and autumn is favourable for intensive precipitation events (IPE) in the central Mediterranean. During that period the sea surface is warm and contributes to warming and moistening of the lowest portion of the atmosphere, particularly the planetary boundary layer (PBL). Adriatic sea is surrounded by mountains and the area often receives substantial amounts of precipitation in short time (24 h). The IPEs are a consequence of convection triggered by topography acting on the southerly flow that has brought the unstable air to the coastline. Improvement in prediction of high impact weather events is one of the goals of The Hydrological cycle in the Mediterranean eXperiment (HyMeX). This study examines how precipitation patterns change in response to different SST forcing. We focus on the IPEs that occurred on the eastern Adriatic coast during the first HyMeX Special observing period (SOP1, 6 September to 5 November 2012). The operational forecast model ALADIN uses the same SST as the global meteorological model (ARPEGE from Meteo France), as well as the forecast lateral boundary conditions (LBCs). First we assess the SST used by the operational atmospheric model ALADIN and compare it to the in situ measurements, ROMS ocean model, OSTIA and MUR analyses. Results of this assessment show that SST in the eastern Adriatic was overestimated by up to 10 K during HyMeX SOP1 period. Then we examine the sensitivity of 8 km and 2 km resolution forecasts of IPEs to the changes in the SST during whole SOP1 with special attention to the intensive precipitation event in Rijeka. Forecast runs in both resolutions are performed for the whole SOP1 using different SST fields prescribed at initial time and kept constant during the model forecast. Categorical verification of 24 h accumulated precipitation did not show substantial improvement in verification scores when more realistic SST was used. Furthermore, the results show that the impact of introducing improved SST in the analysis on the precipitation forecast varies for different cases. There is generally a larger sensitivity to the SST in high resolution than in the lower one, although the forecast period of the latter is longer.

  17. Influence of physiological phenology on the seasonal pattern of ecosystem respiration in deciduous forests.

    PubMed

    Migliavacca, Mirco; Reichstein, Markus; Richardson, Andrew D; Mahecha, Miguel D; Cremonese, Edoardo; Delpierre, Nicolas; Galvagno, Marta; Law, Beverly E; Wohlfahrt, Georg; Black, T Andrew; Carvalhais, Nuno; Ceccherini, Guido; Chen, Jiquan; Gobron, Nadine; Koffi, Ernest; Munger, J William; Perez-Priego, Oscar; Robustelli, Monica; Tomelleri, Enrico; Cescatti, Alessandro

    2015-01-01

    Understanding the environmental and biotic drivers of respiration at the ecosystem level is a prerequisite to further improve scenarios of the global carbon cycle. In this study we investigated the relevance of physiological phenology, defined as seasonal changes in plant physiological properties, for explaining the temporal dynamics of ecosystem respiration (RECO) in deciduous forests. Previous studies showed that empirical RECO models can be substantially improved by considering the biotic dependency of RECO on the short-term productivity (e.g., daily gross primary production, GPP) in addition to the well-known environmental controls of temperature and water availability. Here, we use a model-data integration approach to investigate the added value of physiological phenology, represented by the first temporal derivative of GPP, or alternatively of the fraction of absorbed photosynthetically active radiation, for modeling RECO at 19 deciduous broadleaved forests in the FLUXNET La Thuile database. The new data-oriented semiempirical model leads to an 8% decrease in root mean square error (RMSE) and a 6% increase in the modeling efficiency (EF) of modeled RECO when compared to a version of the model that does not consider the physiological phenology. The reduction of the model-observation bias occurred mainly at the monthly time scale, and in spring and summer, while a smaller reduction was observed at the annual time scale. The proposed approach did not improve the model performance at several sites, and we identified as potential causes the plant canopy heterogeneity and the use of air temperature as a driver of ecosystem respiration instead of soil temperature. However, in the majority of sites the model-error remained unchanged regardless of the driving temperature. Overall, our results point toward the potential for improving current approaches for modeling RECO in deciduous forests by including the phenological cycle of the canopy. © 2014 John Wiley & Sons Ltd.
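
    To make the modeling idea concrete, the Python sketch below shows one possible semiempirical form in which ecosystem respiration depends on a baseline term, current GPP, the first temporal derivative of GPP (a crude proxy for physiological phenology), and a Q10 temperature response. This is an illustrative toy form with made-up parameters, not the model structure or parameterization used in the study above.

    import numpy as np

    def reco_illustrative(gpp, t_air, r_base=1.0, a=0.25, b=0.5, q10=2.0, t_ref=15.0):
        """Illustrative semiempirical ecosystem respiration: baseline plus terms in
        GPP and its first temporal derivative, modulated by a Q10 temperature response."""
        gpp = np.asarray(gpp, float)
        dgpp = np.gradient(gpp)                     # first temporal derivative of GPP
        f_temp = q10 ** ((np.asarray(t_air, float) - t_ref) / 10.0)
        return (r_base + a * gpp + b * dgpp) * f_temp

    # Hypothetical daily GPP (gC m-2 d-1) and air temperature (deg C) series.
    gpp = np.array([2.0, 3.0, 4.5, 6.0, 7.0, 7.5, 7.2])
    t_air = np.array([10.0, 12.0, 15.0, 18.0, 20.0, 21.0, 20.0])
    print(np.round(reco_illustrative(gpp, t_air), 2))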

  18. 26 CFR 1.1237-1 - Real property subdivided for sale.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... improvements. Other changes in the market price of the lot, not arising from improvements made by the taxpayer... roads, including gravel roads where required by the climate, are not substantial improvements. (5...

  19. 26 CFR 1.1237-1 - Real property subdivided for sale.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... improvements. Other changes in the market price of the lot, not arising from improvements made by the taxpayer... roads, including gravel roads where required by the climate, are not substantial improvements. (5...

  20. 26 CFR 1.1237-1 - Real property subdivided for sale.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... improvements. Other changes in the market price of the lot, not arising from improvements made by the taxpayer... roads, including gravel roads where required by the climate, are not substantial improvements. (5...

  1. 26 CFR 1.1237-1 - Real property subdivided for sale.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... improvements. Other changes in the market price of the lot, not arising from improvements made by the taxpayer... roads, including gravel roads where required by the climate, are not substantial improvements. (5...

  2. 26 CFR 1.1237-1 - Real property subdivided for sale.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... improvements. Other changes in the market price of the lot, not arising from improvements made by the taxpayer... roads, including gravel roads where required by the climate, are not substantial improvements. (5...

  3. A Dynamic Model of Mercury's Magnetospheric Magnetic Field

    PubMed Central

    Johnson, Catherine L.; Philpott, Lydia; Tsyganenko, Nikolai A.; Anderson, Brian J.

    2017-01-01

    Mercury's solar wind and interplanetary magnetic field environment is highly dynamic, and variations in these external conditions directly control the current systems and magnetic fields inside the planetary magnetosphere. We update our previous static model of Mercury's magnetic field by incorporating variations in the magnetospheric current systems, parameterized as functions of Mercury's heliocentric distance and magnetic activity. The new, dynamic model reproduces the location of the magnetopause current system as a function of systematic pressure variations encountered during Mercury's eccentric orbit, as well as the increase in the cross‐tail current intensity with increasing magnetic activity. Despite the enhancements in the external field parameterization, the residuals between the observed and modeled magnetic field inside the magnetosphere indicate that the dynamic model achieves only a modest overall improvement over the previous static model. The spatial distribution of the residuals in the magnetic field components shows substantial improvement of the model accuracy near the dayside magnetopause. Elsewhere, the large‐scale distribution of the residuals is similar to those of the static model. This result implies either that magnetic activity varies much faster than can be determined from the spacecraft's passage through the magnetosphere or that the residual fields are due to additional external current systems not represented in the model or both. Birkeland currents flowing along magnetic field lines between the magnetosphere and planetary high‐latitude regions have been identified as one such contribution. PMID:29263560

  4. Technical Note: On the use of nudging for aerosol-climate model intercomparison studies

    DOE PAGES

    Zhang, K.; Wan, H.; Liu, X.; ...

    2014-04-24

    Nudging is an assimilation technique widely used in the development and evaluation of climate models. Constraining the simulated wind and temperature fields using global weather reanalysis facilitates more straightforward comparison between simulation and observation, and reduces uncertainties associated with natural variabilities of the large-scale circulation. On the other hand, the forcing introduced by nudging can be strong enough to change the basic characteristics of the model climate. In this paper we show that for the Community Atmosphere Model version 5, due to the systematic temperature bias in the standard model and the sensitivity of simulated ice formation to anthropogenic aerosol concentration, nudging towards reanalysis results in substantial reductions in the ice cloud amount and the impact of anthropogenic aerosols on longwave cloud forcing. In order to reduce discrepancies between the nudged and unconstrained simulations and meanwhile take advantage of nudging, two alternative experimentation methods are evaluated. The first one constrains only the horizontal winds. The second method nudges both winds and temperature, but replaces the long-term climatology of the reanalysis by that of the model. Results show that both methods lead to substantially improved agreement with the free-running model in terms of the top-of-atmosphere radiation budget and cloud ice amount. The wind-only nudging is more convenient to apply, and provides higher correlations of the wind fields, geopotential height and specific humidity between simulation and reanalysis. This suggests nudging the horizontal winds but not temperature is a good strategy for the investigation of aerosol indirect effects through ice clouds, since it provides well-constrained meteorology without strongly perturbing the model's mean climate.

  5. Technical Note: On the use of nudging for aerosol-climate model intercomparison studies

    NASA Astrophysics Data System (ADS)

    Zhang, K.; Wan, H.; Liu, X.; Ghan, S. J.; Kooperman, G. J.; Ma, P.-L.; Rasch, P. J.

    2014-04-01

    Nudging is an assimilation technique widely used in the development and evaluation of climate models. Constraining the simulated wind and temperature fields using global weather reanalysis facilitates more straightforward comparison between simulation and observation, and reduces uncertainties associated with natural variabilities of the large-scale circulation. On the other hand, the forcing introduced by nudging can be strong enough to change the basic characteristics of the model climate. In this paper we show that for the Community Atmosphere Model version 5, due to the systematic temperature bias in the standard model and the sensitivity of simulated ice formation to anthropogenic aerosol concentration, nudging towards reanalysis results in substantial reductions in the ice cloud amount and the impact of anthropogenic aerosols on longwave cloud forcing. In order to reduce discrepancies between the nudged and unconstrained simulations and meanwhile take advantage of nudging, two alternative experimentation methods are evaluated. The first one constrains only the horizontal winds. The second method nudges both winds and temperature, but replaces the long-term climatology of the reanalysis by that of the model. Results show that both methods lead to substantially improved agreement with the free-running model in terms of the top-of-atmosphere radiation budget and cloud ice amount. The wind-only nudging is more convenient to apply, and provides higher correlations of the wind fields, geopotential height and specific humidity between simulation and reanalysis. This suggests nudging the horizontal winds but not temperature is a good strategy for the investigation of aerosol indirect effects through ice clouds, since it provides well-constrained meteorology without strongly perturbing the model's mean climate.
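
    Conceptually, nudging adds a Newtonian relaxation term that pulls a prognostic variable towards a reference (reanalysis) field on a chosen timescale. The Python sketch below shows the idea for a single scalar with an explicit time step; it is a schematic illustration, not the CAM5 implementation, and all numbers are placeholders.

    def step_with_nudging(x, tendency, x_ref, dt, tau):
        """One explicit time step of a prognostic variable x with Newtonian
        relaxation (nudging) towards a reference field x_ref on timescale tau."""
        return x + dt * (tendency(x) + (x_ref - x) / tau)

    # Toy example: nudge a scalar "wind" towards a reanalysis value.
    tendency = lambda u: -u / 86400.0         # stand-in for the model's own dynamics
    u, u_reanalysis = 5.0, 12.0
    dt, tau = 1800.0, 6 * 3600.0              # 30-min step, 6-h relaxation timescale
    for _ in range(48):                        # one day of integration
        u = step_with_nudging(u, tendency, u_reanalysis, dt, tau)
    print(round(u, 2))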

  6. Integrating sensorimotor systems in a robot model of cricket behavior

    NASA Astrophysics Data System (ADS)

    Webb, Barbara H.; Harrison, Reid R.

    2000-10-01

    The mechanisms by which animals manage sensorimotor integration and coordination of different behaviors can be investigated in robot models. In previous work the first author has built a robot that localizes sound, based on close modeling of the auditory and neural system of the cricket. It is known that the cricket combines its response to sound with other sensorimotor activities such as an optomotor reflex and reactions to mechanical stimulation of the antennae and cerci. Behavioral evidence suggests some ways these behaviors may be integrated. We have tested the addition of an optomotor response, using an analog VLSI circuit developed by the second author, to the sound localizing behavior and have shown that it can, as in the cricket, improve the directness of the robot's path to sound. In particular, it substantially improves behavior when the robot is subject to a motor disturbance. Our aim is to better understand how the insect brain functions in controlling complex combinations of behavior, with the hope that this will also suggest novel mechanisms for sensory integration on robots.

  7. Bayesian modeling of consumer behavior in the presence of anonymous visits

    NASA Astrophysics Data System (ADS)

    Novak, Julie Esther

    Tailoring content to consumers has become a hallmark of marketing and digital media, particularly as it has become easier to identify customers across usage or purchase occasions. However, across a wide variety of contexts, companies find that customers do not consistently identify themselves, leaving a substantial fraction of anonymous visits. We develop a Bayesian hierarchical model that allows us to probabilistically assign anonymous sessions to users. These probabilistic assignments take into account a customer's demographic information, frequency of visitation, activities taken when visiting, and times of arrival. We present two studies, one with synthetic and one with real data, where we demonstrate improved performance over two popular practices (nearest-neighbor matching and deleting the anonymous visits) due to increased efficiency and reduced bias driven by the non-ignorability of which types of events are more likely to be anonymous. Using our proposed model, we avoid potential bias in understanding the effect of a firm's marketing on its customers, improve inference about the total number of customers in the dataset, and provide more precise targeted marketing to both previously observed and unobserved customers.
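
    In the simplest terms, assigning an anonymous session to a known user is a Bayes update: a prior over users (e.g., from visit frequency) multiplied by likelihoods of the observed session features. The Python sketch below illustrates this with made-up numbers; the actual hierarchical model described above is considerably richer.

    import numpy as np

    users = ["u1", "u2", "u3"]
    prior = np.array([0.5, 0.3, 0.2])          # e.g. proportional to visit frequency
    p_hour = np.array([0.10, 0.02, 0.05])      # P(observed arrival hour | user)
    p_activity = np.array([0.30, 0.60, 0.10])  # P(observed activity type | user)

    posterior = prior * p_hour * p_activity
    posterior /= posterior.sum()
    for u, p in zip(users, posterior):
        print(f"P({u} | anonymous session) = {p:.2f}")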

  8. Processable conductive graphene/polyethylene nanocomposites: Effects of graphene dispersion and polyethylene blending with oxidized polyethylene on rheology and microstructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iqbal, Muhammad Z.; Abdala, Ahmed A.; Mittal, Vikas

    Poor dispersion of graphene in non-polar polymer matrices creates composites with limited applications. A method to improve the dispersion of graphene in polyethylene (PE) by blending PE with oxidized PE (OPE) is examined. Graphene was produced by simultaneous thermal exfoliation and reduction of graphite oxide. Nanocomposites of graphene with PE as well as graphene with PE/OPE blends were prepared by solvent blending. Improved dispersion of graphene in PE/OPE blends substantially decreases the rheological (0.3 vol%) and electrical (0.13 vol%) percolation thresholds compared to neat PE nanocomposites (1 and 0.29 vol%, respectively). A universal Brownian dispersion of graphene in polymers, similar to that of nanotubes and following the Doi-Edwards theory, was concluded. Micromechanical models, such as the Mori-Tanaka and Halpin-Tsai models, were used to describe the mechanical properties of the nanocomposites. The nanocomposite microstructure, studied by small angle x-ray scattering, confirmed better dispersion of graphene at lower loadings and the formation of surface fractals in the blend/graphene nanocomposites, whereas only mass fractals were observed in neat PE/graphene nanocomposites.
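
    For reference, the Halpin-Tsai relation mentioned above estimates a composite modulus from the matrix and filler moduli, the filler volume fraction, and a shape factor. The Python sketch below uses illustrative numbers only; it is not the fitted micromechanical model of this study.

    def halpin_tsai_modulus(e_matrix, e_filler, phi, zeta):
        """Halpin-Tsai estimate of a composite modulus from the matrix modulus,
        filler modulus, filler volume fraction phi and shape factor zeta
        (zeta is often taken roughly proportional to the filler aspect ratio)."""
        ratio = e_filler / e_matrix
        eta = (ratio - 1.0) / (ratio + zeta)
        return e_matrix * (1.0 + zeta * eta * phi) / (1.0 - eta * phi)

    # Illustrative values: PE matrix ~0.8 GPa, effective graphene ~250 GPa, 1 vol%.
    print(round(halpin_tsai_modulus(e_matrix=0.8, e_filler=250.0, phi=0.01, zeta=60.0), 3))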

  9. Assessing administrative costs of mental health and substance abuse services.

    PubMed

    Broyles, Robert W; Narine, Lutchmie; Robertson, Madeline J

    2004-05-01

    Increasing competition in the market for mental health and substance abuse (MHSA) services and the potential to realize significant administrative savings have created an imperative to monitor, evaluate, and control spending on administrative functions. This paper develops a generic model that evaluates spending on administrative personnel by a group of providers. The precision of the model is demonstrated by examining a set of data assembled from five MHSA service providers. The model examines a differential cost construction derived from inter-facility comparisons of administrative expenses. After controlling for the scale of operations, the results enable MHSA programs to control the efficiency of administrative personnel and related rates of compensation. The results indicate that the efficiency of using the administrative complement and the scale of operations represent the lion's share of the total differential cost. The analysis also indicates that a modest improvement in the use of administrative personnel results in substantial cost savings, an increase in the net cash flow derived from operations, an improvement in the fiscal performance of the provider, and a decline in opportunity costs that assume the form of foregone direct patient care.

  10. Loss reduction in axial-flow compressors through low-speed model testing

    NASA Technical Reports Server (NTRS)

    Wisler, D. C.

    1984-01-01

    A systematic procedure for reducing losses in axial-flow compressors is presented. In this procedure, a large, low-speed aerodynamic model of a high-speed core compressor is designed and fabricated based on aerodynamic similarity principles. This model is then tested at low speed, where high-loss regions associated with three-dimensional endwall boundary layers, flow separation, leakage, and secondary flows can be located, detailed measurements made, and loss mechanisms determined with much greater accuracy and much lower cost and risk than is possible in small, high-speed compressors. Design modifications are made by using custom-tailored airfoils and vector diagrams, airfoil endbends, and modified wall geometries in the high-loss regions. The design improvements resulting in reduced loss or increased stall margin are then scaled to high speed. This paper describes the procedure and presents experimental results to show that in some cases endwall loss has been reduced by as much as 10 percent, flow separation has been reduced or eliminated, and stall margin has been substantially improved by using these techniques.

  11. Evaluating Health Co-Benefits of Climate Change Mitigation in Urban Mobility

    PubMed Central

    Wolkinger, Brigitte; Weisz, Ulli; Hutter, Hans-Peter; Delcour, Jennifer; Griebler, Robert; Mittelbach, Bernhard; Maier, Philipp; Reifeltshammer, Raphael

    2018-01-01

    There is growing recognition that implementation of low-carbon policies in urban passenger transport has near-term health co-benefits through increased physical activity and improved air quality. Nevertheless, co-benefits and related cost reductions are often not taken into account in decision processes, likely because they are not easy to capture. In an interdisciplinary multi-model approach we address this gap, investigating the co-benefits resulting from increased physical activity and improved air quality due to climate mitigation policies for three urban areas. Additionally we take a (macro-)economic perspective, since that is the ultimate interest of policy-makers. Methodologically, we link a transport modelling tool, a transport emission model, an emission dispersion model, a health model and a macroeconomic Computable General Equilibrium (CGE) model to analyze three climate change mitigation scenarios. We show that higher levels of physical exercise and reduced exposure to pollutants due to mitigation measures substantially decrease morbidity and mortality. Expenditures are mainly borne by the public sector but are mostly offset by the emerging co-benefits. Our macroeconomic results indicate a strong positive welfare effect, yet with slightly negative GDP and employment effects. We conclude that considering economic co-benefits of climate change mitigation policies in urban mobility can be put forward as a forceful argument for policy makers to take action. PMID:29710784

  12. A constructive Indian country response to the evidence-based program mandate.

    PubMed

    Walker, R Dale; Bigelow, Douglas A

    2011-01-01

    Over the last 20 years, governmental mandates for preferentially funding evidence-based "model" practices and programs have become doctrine in some legislative bodies, federal agencies, and state agencies. It was assumed that what works in small-sample, controlled settings would work in all community settings, substantially improving safety, effectiveness, and value for money. The evidence-based "model" programs mandate has imposed immutable "core components," fidelity testing, alien programming and program developers, loss of familiar programs, and resource capacity requirements upon tribes, while infringing upon their tribal sovereignty and consultation rights. The tribal response in one state (Oregon) went through three phases: shock and rejection; proposing an alternative approach using criteria of cultural appropriateness, aspiring to evaluability; and adopting logic modeling. The state heard and accepted the argument that the tribal way of knowing is different and valid. Currently, a state-authorized tribal logic model and a review panel process are used to approve tribal best practices for state funding. This constructive response to the evidence-based program mandate elevates tribal practices in the funding and regulatory world and facilitates continuing quality improvement and evaluation, while ensuring that practices and programs remain based on local community context and culture. This article provides details of a model that could well serve tribes facing evidence-based model program mandates throughout the country.

  13. Plant microRNA-Target Interaction Identification Model Based on the Integration of Prediction Tools and Support Vector Machine

    PubMed Central

    Meng, Jun; Shi, Lin; Luan, Yushi

    2014-01-01

    Background: Confident identification of microRNA-target interactions is significant for studying the function of microRNA (miRNA). Although some computational miRNA target prediction methods have been proposed for plants, results of the various methods tend to be inconsistent and usually lead to more false positives. To address these issues, we developed an integrated model for identifying plant miRNA-target interactions. Results: Three online miRNA target prediction toolkits and machine learning algorithms were integrated to identify and analyze Arabidopsis thaliana miRNA-target interactions. Principal component analysis (PCA) feature extraction and self-training technology were introduced to improve the performance. Results showed that the proposed model outperformed the previously existing methods. The results were validated by using degradome sequencing supported Arabidopsis thaliana miRNA-target interactions. The proposed model constructed on Arabidopsis thaliana was run over Oryza sativa and Vitis vinifera to demonstrate that our model is effective for other plant species. Conclusions: The integrated model of online predictors and a local PCA-SVM classifier gained credible and high-quality miRNA-target interactions. The supervised learning algorithm of the PCA-SVM classifier was employed in plant miRNA target identification for the first time. Its performance can be substantially improved if more experimentally proven training samples are provided. PMID:25051153
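
    A minimal sketch of a PCA-plus-SVM classifier of the kind described above is shown below using scikit-learn; the feature matrix and labels are synthetic placeholders, and the self-training step reported in the paper is omitted for brevity.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.pipeline import Pipeline
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 30))          # e.g. scores/features from several predictors
    y = rng.integers(0, 2, size=200)        # 1 = true miRNA-target interaction, 0 = not

    clf = Pipeline([
        ("pca", PCA(n_components=10)),      # feature extraction
        ("svm", SVC(kernel="rbf", C=1.0, probability=True)),
    ])
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())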

  14. Protective effects of Cinnamomum cassia (Lamaceae) against gout and septic responses via attenuation of inflammasome activation in experimental models.

    PubMed

    Shin, Woo-Young; Shim, Do-Wan; Kim, Myong-Ki; Sun, Xiao; Koppula, Sushruta; Yu, Sang-Hyeun; Kim, Han-Bi; Kim, Tack-Joong; Kang, Tae-Bong; Lee, Kwang-Ho

    2017-06-09

    Cinnamomum cassia (C. cassia, Lauraceae family), commonly used for treating dyspepsia, gastritis, and inflammatory diseases and for improving blood circulation, is considered one of the 50 fundamental herbs in traditional Chinese medicine. The anti-inflammatory action of an ethanol extract of C. cassia (CA) and its underlying mechanisms were explored in both in vitro cellular and in vivo murine models. Bone marrow-derived macrophages (BMDMs) were used to study the regulatory effect of CA on inflammasome activation. A lipopolysaccharide (LPS)-induced sepsis mouse model and a monosodium urate (MSU)-induced gout model were employed to study the in vivo efficacy of CA. CA improved the survival rate in the LPS-induced septic shock mouse model and inhibited activation of the NLRP3, NLRC4, and AIM2 inflammasomes, leading to suppression of interleukin-1β secretion. Further, ASC oligomerization and speck formation in the cytosol were attenuated by CA treatment. Moreover, CA improved outcomes in both the LPS-induced septic shock and the MSU-induced gout murine models. CA treatment significantly attenuated danger signal-induced inflammatory responses via regulation of inflammasome activation, substantiating the traditional claims of its use in the treatment of inflammation-related disorders. Copyright © 2017. Published by Elsevier B.V.

  15. Biotic and abiotic factors predicting the global distribution and population density of an invasive large mammal

    PubMed Central

    Lewis, Jesse S.; Farnsworth, Matthew L.; Burdett, Chris L.; Theobald, David M.; Gray, Miranda; Miller, Ryan S.

    2017-01-01

    Biotic and abiotic factors are increasingly acknowledged to synergistically shape broad-scale species distributions. However, the relative importance of biotic and abiotic factors in predicting species distributions is unclear. In particular, biotic factors, such as predation and vegetation, including those resulting from anthropogenic land-use change, are underrepresented in species distribution modeling, but could improve model predictions. Using generalized linear models and model selection techniques, we analyzed 129 estimates of population density of wild pigs (Sus scrofa) from 5 continents to evaluate the relative importance, magnitude, and direction of biotic and abiotic factors in predicting the population density of an invasive large mammal with a global distribution. Incorporating diverse biotic factors, including agriculture, vegetation cover, and large carnivore richness, into species distribution modeling substantially improved model fit and predictions. Abiotic factors, including precipitation and potential evapotranspiration, were also important predictors. The predictive map of population density revealed wide-ranging potential for an invasive large mammal to expand its distribution globally. This information can be used to proactively create conservation/management plans to control future invasions. Our study demonstrates that the ongoing paradigm shift, which recognizes that both biotic and abiotic factors shape species distributions across broad scales, can be advanced by incorporating diverse biotic factors. PMID:28276519
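
    The Python sketch below illustrates the generalized-linear-model and model-selection workflow described above in a simplified Gaussian form (ordinary least squares on log-transformed density) with hypothetical covariate names and simulated data; it is not the fitted model from the study.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    df = pd.DataFrame({
        "log_density": rng.normal(0.0, 1.0, 129),       # simulated stand-in data
        "agriculture": rng.uniform(0, 1, 129),
        "veg_cover": rng.uniform(0, 1, 129),
        "carnivore_richness": rng.integers(0, 6, 129),
        "precipitation": rng.normal(800, 200, 129),
    })

    # Compare an abiotic-only model with a biotic-plus-abiotic model by AIC.
    abiotic = smf.ols("log_density ~ precipitation", data=df).fit()
    full = smf.ols("log_density ~ precipitation + agriculture + veg_cover + carnivore_richness",
                   data=df).fit()
    print("AIC abiotic-only:", round(abiotic.aic, 1), " AIC biotic+abiotic:", round(full.aic, 1))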

  16. Evaluating Health Co-Benefits of Climate Change Mitigation in Urban Mobility.

    PubMed

    Wolkinger, Brigitte; Haas, Willi; Bachner, Gabriel; Weisz, Ulli; Steininger, Karl; Hutter, Hans-Peter; Delcour, Jennifer; Griebler, Robert; Mittelbach, Bernhard; Maier, Philipp; Reifeltshammer, Raphael

    2018-04-28

    There is growing recognition that implementation of low-carbon policies in urban passenger transport has near-term health co-benefits through increased physical activity and improved air quality. Nevertheless, co-benefits and related cost reductions are often not taken into account in decision processes, likely because they are not easy to capture. In an interdisciplinary multi-model approach, we address this gap, investigating the co-benefits resulting from increased physical activity and improved air quality due to climate mitigation policies for three urban areas. Additionally, we take a (macro-)economic perspective, since that is the ultimate interest of policy-makers. Methodologically, we link a transport modelling tool, a transport emission model, an emission dispersion model, a health model and a macroeconomic Computable General Equilibrium (CGE) model to analyze three climate change mitigation scenarios. We show that higher levels of physical exercise and reduced exposure to pollutants due to mitigation measures substantially decrease morbidity and mortality. Expenditures are mainly borne by the public sector but are mostly offset by the emerging co-benefits. Our macroeconomic results indicate a strong positive welfare effect, yet with slightly negative GDP and employment effects. We conclude that the economic co-benefits of climate change mitigation policies in urban mobility can be put forward as a forceful argument for policy makers to take action.

  17. 75 FR 27504 - Substantial Product Hazard List: Hand-Held Hair Dryers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-17

    ...The Consumer Product Safety Improvement Act of 2008 (``CPSIA'') authorizes the United States Consumer Product Safety Commission (``Commission'') to specify, by rule, for any consumer product or class of consumer products, characteristics whose existence or absence shall be deemed a substantial product hazard under certain circumstances. In this document, the Commission is proposing a rule to determine that any hand-held hair dryer without integral immersion protection presents a substantial product hazard.

  18. Minimum and Maximum Potential Contributions to Future Sea Level Rise from Polar Ice Sheets

    NASA Astrophysics Data System (ADS)

    Deconto, R. M.; Pollard, D.

    2017-12-01

    New climate and ice-sheet modeling, calibrated to past changes in sea level, is painting a stark picture of the future fate of the great polar ice sheets if greenhouse gas emissions continue unabated. This is especially true for Antarctica, where a substantial fraction of the ice sheet rests on bedrock more than 500 meters below sea level. Here, we explore the sensitivity of the polar ice sheets to a warming atmosphere and ocean under a range of future greenhouse gas emissions scenarios. The ice sheet-climate-ocean model used here considers time-evolving changes in surface mass balance and sub-ice oceanic melting, ice deformation, grounding line retreat on reverse-sloped bedrock (Marine Ice Sheet Instability), and newly added processes including hydrofracturing of ice shelves in response to surface meltwater and rain, and structural collapse of thick, marine-terminating ice margins with tall ice-cliff faces (Marine Ice Cliff Instability). The simulations improve on previous work by using 1) improved atmospheric forcing from a Regional Climate Model and 2) a much wider range of model physical parameters within the bounds of modern observations of ice dynamical processes (particularly calving rates) and paleo constraints on past ice-sheet response to warming. Approaches to more precisely define the climatic thresholds capable of triggering rapid and potentially irreversible ice-sheet retreat are also discussed, as is the potential for aggressive mitigation strategies like those discussed at the 2015 Paris Climate Conference (COP21) to substantially reduce the risk of extreme sea-level rise. These results, including physics that consider both ice deformation (creep) and calving (mechanical failure of marine-terminating ice), expand on previously estimated limits on maximum rates of future sea-level rise based solely on kinematic constraints of glacier flow. At the high end, the new results show the potential for more than 2 m of global mean sea-level rise by 2100, implying that physically plausible upper limits on future sea-level rise may need to be reconsidered.

  19. Simulation of seasonal US precipitation and temperature by the nested CWRF-ECHAM system

    NASA Astrophysics Data System (ADS)

    Chen, Ligang; Liang, Xin-Zhong; DeWitt, David; Samel, Arthur N.; Wang, Julian X. L.

    2016-02-01

    This study investigates the refined simulation skill obtained when the regional Climate extension of the Weather Research and Forecasting (CWRF) model is nested in the ECMWF Hamburg version 4.5 (ECHAM) atmospheric general circulation model over the United States during 1980-2009, with observed sea surface temperatures used in both models. Over the contiguous US, for each of the four seasons from winter to fall, CWRF reduces the root mean square error of the ECHAM seasonal mean surface air temperature simulation by 0.19, 0.82, 2.02 and 1.85 °C, and increases the equitable threat score of seasonal mean precipitation by 0.18, 0.11, 0.09 and 0.12. CWRF also simulates daily precipitation frequency and heavy precipitation events much more realistically, notably over the Central Great Plains, Cascade Mountains and Gulf Coast States. These CWRF skill enhancements are attributed to the increased spatial resolution and to physics refinements in representing orographic effects, terrestrial hydrology, convection, and cloud-aerosol-radiation effects and their interactions. Empirical orthogonal function analysis of seasonal mean precipitation and surface air temperature interannual variability shows that, in general, CWRF substantially improves the spatial distribution of both quantities, while the temporal evolution (i.e. interannual variability) of the first three primary patterns is highly correlated with that of the driving ECHAM (except for summer precipitation), and both have low temporal correlations with observations. During winter, when large-scale forcing dominates, both models also respond similarly to strong ENSO signals: they successfully capture observed precipitation composite anomalies but substantially fail to reproduce surface air temperature anomalies. When driven by the ECMWF Interim Reanalysis, CWRF produces a very realistic interannual evolution of large-scale precipitation and surface air temperature patterns, with significant temporal correlations against observations. These results indicate that CWRF can greatly improve mesoscale regional climate structures but cannot change interannual variations of the large-scale patterns, which are determined by the driving lateral boundary conditions.
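
    For reference, the two verification measures quoted above can be computed as follows. This is a generic sketch operating on hypothetical fields on a common grid, not the study's actual evaluation code.

        import numpy as np

        def rmse(sim, obs):
            """Root-mean-square error between simulated and observed fields."""
            sim, obs = np.asarray(sim, float), np.asarray(obs, float)
            return float(np.sqrt(np.mean((sim - obs) ** 2)))

        def equitable_threat_score(sim, obs, threshold):
            """Equitable threat score (Gilbert skill score) for exceeding a threshold."""
            sim_yes = np.asarray(sim) >= threshold
            obs_yes = np.asarray(obs) >= threshold
            hits         = np.sum(sim_yes & obs_yes)
            false_alarms = np.sum(sim_yes & ~obs_yes)
            misses       = np.sum(~sim_yes & obs_yes)
            total        = sim_yes.size
            hits_random  = (hits + false_alarms) * (hits + misses) / total
            return float((hits - hits_random) /
                         (hits + false_alarms + misses - hits_random))

        # Hypothetical seasonal-mean precipitation fields on a common grid [mm/day]
        obs = np.random.gamma(2.0, 1.5, size=(60, 120))
        sim = obs + np.random.normal(0.0, 0.8, size=obs.shape)
        print(rmse(sim, obs), equitable_threat_score(sim, obs, threshold=2.0))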

  20. Representation of Precipitation in a Decade-long Continental-Scale Convection-Resolving Climate Simulation

    NASA Astrophysics Data System (ADS)

    Leutwyler, D.; Fuhrer, O.; Ban, N.; Lapillonne, X.; Lüthi, D.; Schär, C.

    2017-12-01

    The representation of moist convection in climate models is a major challenge due to the small scales involved. Regional climate simulations with horizontal resolutions of O(1 km) allow deep convection to be resolved explicitly, leading to an improved representation of the water cycle. However, due to their extremely demanding computational requirements, they have so far been limited to short simulations and/or small computational domains. A new version of the Consortium for Small-Scale Modeling weather and climate model (COSMO) is capable of exploiting new supercomputer architectures employing GPU accelerators, and allows convection-resolving climate simulations on computational domains spanning continents and time periods up to one decade. We present results from a decade-long, convection-resolving climate simulation on a European-scale computational domain. The simulation has a grid spacing of 2.2 km, 1536x1536x60 grid points, covers the period 1999-2008, and is driven by the ERA-Interim reanalysis. Specifically, we present an evaluation of hourly rainfall using a wide range of data sets, including several rain-gauge networks and a remotely-sensed lightning data set. Substantial improvements are found in terms of the diurnal cycles of precipitation amount, wet-hour frequency and the all-hour 99th percentile. However, the results also reveal substantial differences between regions with and without strong orographic forcing. Furthermore, we present an index for deep-convective activity based on the statistics of vertical motion. Comparison of the index with lightning data shows that the convection-resolving climate simulations are able to reproduce important features of the annual cycle of deep convection in Europe. Leutwyler D., D. Lüthi, N. Ban, O. Fuhrer, and C. Schär (2017): Evaluation of the Convection-Resolving Climate Modeling Approach on Continental Scales, J. Geophys. Res. Atmos., 122, doi:10.1002/2016JD026013.
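
    The precipitation diagnostics named above can be illustrated with a minimal sketch, assuming an hourly precipitation series at a single location with no gaps; the 0.1 mm/h wet-hour threshold is an assumption, not the study's definition.

        import numpy as np

        def precip_statistics(hourly_mm, wet_threshold=0.1):
            """Illustrative diagnostics for an hourly precipitation series [mm/h]."""
            hourly_mm = np.asarray(hourly_mm, float)
            wet_hour_frequency = np.mean(hourly_mm >= wet_threshold)
            p99_all_hours = np.percentile(hourly_mm, 99.0)
            # Mean diurnal cycle: average amount for each hour of the day (0..23),
            # assuming the series starts at hour 00 with no missing values.
            n = hourly_mm.size - hourly_mm.size % 24
            diurnal_cycle = hourly_mm[:n].reshape(-1, 24).mean(axis=0)
            return wet_hour_frequency, p99_all_hours, diurnal_cycle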

  1. Demand assessment and price-elasticity estimation of quality-improved primary health care in Palestine: a contribution from the contingent valuation method.

    PubMed

    Mataria, Awad; Luchini, Stéphane; Daoud, Yousef; Moatti, Jean-Paul

    2007-10-01

    This paper proposes a new methodology to assess demand and price elasticity for health care, based on patients' stated willingness to pay (WTP) values for certain aspects of health care quality improvements. A conceptual analysis of how respondents consider contingent valuation (CV) questions allowed us to specify a probability density function of stated WTP values and, consequently, to model a demand function for quality-improved health care using a parametric survival approach. The model was empirically estimated using a CV study intended to assess patients' values for improving the quality of primary health care (PHC) services in Palestine. A random sample of 499 individuals was interviewed following medical consultation in four PHC centers. Quality was assessed using a multi-attribute approach, and respondents valued seven specific quality improvements using a decomposed valuation scenario and a payment card elicitation technique. Our results suggest an inelastic demand at low user-fee levels and when the price increase is accompanied by substantial quality improvements. Nevertheless, demand becomes more and more elastic as user fees continue to rise. On the other hand, patients' reactions to price increases turn out to depend on their level of income. Our results can be used to design successful health care financing strategies that include a consideration of patients' preferences and financial capacities. John Wiley & Sons, Ltd.
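
    The link between a survival-type fit to stated WTP values and a demand curve can be sketched as follows. This is a simplified illustration only: it ignores the interval censoring inherent in payment-card responses, assumes a Weibull distribution rather than the paper's specification, and uses synthetic WTP values.

        import numpy as np
        from scipy import stats

        # Hypothetical stated WTP values (one per respondent).
        wtp = np.random.lognormal(mean=2.0, sigma=0.6, size=499)

        # Fit a Weibull "survival" model to the WTP values (location fixed at zero).
        shape, loc, scale = stats.weibull_min.fit(wtp, floc=0)

        def demand(price):
            """Predicted share of patients still demanding care at a given user fee."""
            return stats.weibull_min.sf(price, shape, loc=loc, scale=scale)

        def price_elasticity(price, dp=1e-3):
            """Point price elasticity e = (dQ/dP) * (P/Q), by finite differences."""
            q = demand(price)
            dq = (demand(price + dp) - demand(price - dp)) / (2 * dp)
            return dq * price / q

        for p in (2.0, 5.0, 10.0, 20.0):
            print(f"fee={p:5.1f}  demand={demand(p):.2f}  elasticity={price_elasticity(p):+.2f}")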

  2. 30 CFR 700.5 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... substantial physical harm to persons, property, or the environment and to which persons or improvements on... substantially the quality of the environment, prevent or damage the beneficial use of land or water resources.... Reclamation activity means the reclamation, abatement, control, or prevention of adverse effects of past...

  3. 30 CFR 700.5 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... substantial physical harm to persons, property, or the environment and to which persons or improvements on... substantially the quality of the environment, prevent or damage the beneficial use of land or water resources.... Reclamation activity means the reclamation, abatement, control, or prevention of adverse effects of past...

  4. Validation of a novel animal model for sciatic nerve repair with an adipose-derived stem cell loaded fibrin conduit.

    PubMed

    Saller, Maximilian M; Huettl, Rosa-Eva; Mayer, Julius M; Feuchtinger, Annette; Krug, Christian; Holzbach, Thomas; Volkmer, Elias

    2018-05-01

    Despite the regenerative capabilities of peripheral nerves, severe injuries or neuronal trauma of critical size impose immense hurdles for proper restoration of neuro-muscular circuitry. Autologous nerve grafts improve re-establishment of connectivity, but also entail substantial donor-site morbidity. We developed a rat model which allows the testing of different cell applications, i.e., mesenchymal stem cells, to improve nerve regeneration in vivo. To mimic inaccurate alignment of autologous nerve grafts with the injured nerve, a 20 mm portion of the sciatic nerve was excised and sutured back in place in reversed direction. To validate the feasibility of our novel model, a fibrin gel conduit containing autologous undifferentiated adipose-derived stem cells was applied around the coaptation sites and compared to autologous nerve grafts. After evaluating sciatic nerve function for 16 weeks postoperatively, animals were sacrificed, and gastrocnemius muscle weight was determined along with morphological parameters (g-ratio, axon density and diameter) of regenerating axons. Interestingly, the addition of undifferentiated adipose-derived stem cells resulted in significantly improved re-myelination, axon ingrowth and functional outcome when compared to animals without a cell-seeded conduit. The presented model thus displays several intriguing features: it imitates a certain mismatch in size, distribution and orientation of axons within the nerve coaptation site. The fibrin conduit itself allows for an easy application of cells and, as a true critical-size defect model, any observed improvement relates directly to the performed intervention. Since fibrin and adipose-derived stem cells have been approved for human applications, the technique can theoretically be performed on humans. Thus, we suggest that the model is a powerful tool to investigate cell-mediated assistance of peripheral nerve regeneration.

  5. Improving substance abuse screening and intervention in a primary care clinic.

    PubMed

    Neushotz, Lori A; Fitzpatrick, Joyce J

    2008-04-01

    Despite recent efforts to educate primary care providers in the identification and management of patients presenting with substance abuse problems, many opportunities to identify and intervene with these patients are overlooked. This project was designed to identify factors that interfere with rates of screening and brief intervention (SBI) of substance abuse problems in a primary care clinic in a major academic medical center in New York City. Six informants representing the disciplines of medicine, nursing, and social work in the primary care clinic provided information regarding SBI. Analysis was focused on substantiation of the need for enhanced diffusion of knowledge related to screening for substance abuse problems to improve rates of SBI in primary care. Recommendations for improvement included continued promotion of SBI by influential role models and opinion leaders, improvement in primary care providers' perceptions of the perceived characteristics of SBI to improve rates of adoption, implementation of interdisciplinary educational initiatives toward the goal of improving rates of SBI in the primary care clinic, and initiation of translational research at the clinic supporting SBI in primary care.

  6. Vinblastine 20' Amides: Synthetic Analogues That Maintain or Improve Potency and Simultaneously Overcome Pgp-Derived Efflux and Resistance.

    PubMed

    Lukesh, John C; Carney, Daniel W; Dong, Huijun; Cross, R Matthew; Shukla, Vyom; Duncan, Katharine K; Yang, Shouliang; Brody, Daniel M; Brütsch, Manuela M; Radakovic, Aleksandar; Boger, Dale L

    2017-09-14

    A series of 180 vinblastine 20' amides was prepared in three steps from commercially available starting materials, systematically exploring a typically inaccessible site in the molecule by enlisting a powerful functionalization strategy. Clear structure-activity relationships and a structural model were developed in these studies, which provided many such 20' amides that exhibit substantial and in some cases remarkable enhancements in potency, many that exhibit further improvements in activity against a Pgp-overexpressing resistant cancer cell line, and an important subset of the vinblastine analogues that display little or no differential in activity against a matched pair of vinblastine-sensitive and -resistant (Pgp-overexpressing) cell lines. The improvements in potency directly correlated with target tubulin binding affinity, and the reduction in differential functional activity against the sensitive and Pgp-overexpressing resistant cell lines was found to correlate directly with an impact on Pgp-derived efflux.

  7. Properties of quantum systems via diagonalization of transition amplitudes. II. Systematic improvements of short-time propagation

    NASA Astrophysics Data System (ADS)

    Vidanović, Ivana; Bogojević, Aleksandar; Balaž, Antun; Belić, Aleksandar

    2009-12-01

    In this paper, building on a previous analysis [I. Vidanović, A. Bogojević, and A. Belić, preceding paper, Phys. Rev. E 80, 066705 (2009)] of exact diagonalization of the space-discretized evolution operator for the study of properties of nonrelativistic quantum systems, we present a substantial improvement to this method. We apply the recently introduced effective-action approach, which yields short-time expansions of the propagator up to very high orders, to calculate the matrix elements of the space-discretized evolution operator. This improves previously used approximations for the discretized matrix elements by many orders of magnitude and allows us to obtain large numbers of accurate energy eigenvalues and eigenstates by numerical diagonalization. We illustrate this approach on several one- and two-dimensional models. The quality of numerically calculated higher-order eigenstates is assessed by comparison with the semiclassical cumulative density of states.
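
    The underlying idea, diagonalizing a space-discretized short-time evolution operator to read off energy levels, can be illustrated with the lowest-order (naive) short-time kernel; the paper's contribution is to replace this kernel with a high-order effective-action expansion, which is not reproduced here. Units hbar = m = 1, with the harmonic oscillator as a test potential.

        import numpy as np

        def evolution_matrix(V, L=10.0, N=400, eps=0.05):
            """Space-discretized matrix of exp(-eps*H) using the naive short-time kernel."""
            x = np.linspace(-L / 2, L / 2, N)
            dx = x[1] - x[0]
            xi, xj = np.meshgrid(x, x, indexing="ij")
            kernel = np.exp(-(xi - xj) ** 2 / (2 * eps)
                            - eps * (V(xi) + V(xj)) / 2) / np.sqrt(2 * np.pi * eps)
            return dx * kernel, eps

        M, eps = evolution_matrix(lambda x: 0.5 * x ** 2)
        lam = np.linalg.eigvalsh(M)[::-1]          # largest eigenvalues <-> lowest energies
        energies = -np.log(lam[:5]) / eps
        print(energies)                             # should approach 0.5, 1.5, 2.5, ...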

  8. An improved dehazing algorithm of aerial high-definition image

    NASA Astrophysics Data System (ADS)

    Jiang, Wentao; Ji, Ming; Huang, Xiying; Wang, Chao; Yang, Yizhou; Li, Tao; Wang, Jiaoying; Zhang, Ying

    2016-01-01

    For unmanned aerial vehicle (UAV) images, the sensor cannot capture high-quality images in fog and haze. To solve this problem, an improved dehazing algorithm for aerial high-definition images is proposed. Based on the dark channel prior model, the new algorithm first extracts the edges from the crude estimated transmission map and expands the extracted edges. Then, according to the expanded edges, the algorithm sets a threshold value to divide the crude estimated transmission map into different areas and applies different guided filters to the different areas to compute the optimized transmission map. The experimental results demonstrate that the dehazing performance of the proposed algorithm is substantially the same as that of the algorithm based on the dark channel prior and guided filter, while its average computation time is around 40% of that algorithm's and the detection ability for UAV images in fog and haze is effectively improved.
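
    The dark-channel-prior ingredients referred to above (dark channel, atmospheric light, crude transmission map) can be sketched as follows; the edge-based area splitting and per-area guided filtering of the proposed algorithm are not reproduced, and the parameter values are illustrative.

        import numpy as np
        from scipy.ndimage import minimum_filter

        def dark_channel(img, patch=15):
            """Dark channel: per-pixel minimum over color channels and a local patch."""
            return minimum_filter(img.min(axis=2), size=patch)

        def estimate_transmission(img, patch=15, omega=0.95):
            """Crude transmission map t(x) = 1 - omega * dark_channel(I / A)."""
            dark = dark_channel(img, patch)
            # Atmospheric light A: mean color of the brightest 0.1% dark-channel pixels.
            n = max(1, int(0.001 * dark.size))
            idx = np.unravel_index(np.argsort(dark, axis=None)[-n:], dark.shape)
            A = img[idx].mean(axis=0)
            return 1.0 - omega * dark_channel(img / A, patch)

        # img: H x W x 3 float array in [0, 1]
        img = np.random.rand(120, 160, 3)
        t = estimate_transmission(img)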

  9. Satellite techniques for determining the geopotential for sea-surface elevations

    NASA Technical Reports Server (NTRS)

    Pisacane, V. L.

    1984-01-01

    Spaceborne altimetry, with measurement accuracies of a few centimeters, has the potential to determine the sea-surface elevations necessary to compute accurate three-dimensional geostrophic currents from traditional hydrographic observations; this approach is discussed. The limitation of this approach is the uncertainty in knowledge of the global and ocean geopotentials, which produces satellite and height uncertainties about an order of magnitude larger than the goal of about 10 cm. The quantitative effects of geopotential uncertainties on processing altimetry data are described. Potential near-term improvements, not requiring additional spacecraft, are discussed; even though there are substantial improvements at the longer wavelengths, the oceanographic goal will not be achieved. The geopotential research mission (GRM) is described, which should produce geopotential models capable of defining the ocean geoid to 10 cm and near-Earth satellite positions. The state of the art and the potential of spaceborne gravimetry are described as an alternative approach to improve our knowledge of the geopotential.

  10. Simplified curve fits for the thermodynamic properties of equilibrium air

    NASA Technical Reports Server (NTRS)

    Srinivasan, S.; Tannehill, J. C.; Weilmuenster, K. J.

    1987-01-01

    New, improved curve fits for the thermodynamic properties of equilibrium air have been developed. The curve fits are for pressure, speed of sound, temperature, entropy, enthalpy, density, and internal energy. These curve fits can be readily incorporated into new or existing computational fluid dynamics codes if real gas effects are desired. The curve fits are constructed from Grabau-type transition functions to model the thermodynamic surfaces in a piecewise manner. The accuracies and continuity of these curve fits are substantially improved over those of previous curve fits. These improvements are due to the incorporation of a small number of additional terms in the approximating polynomials and careful choices of the transition functions. The ranges of validity of the new curve fits are temperatures up to 25 000 K and densities from 10^-7 to 10^3 amagats.
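
    The general idea of joining local fits with smooth transition functions can be illustrated generically. The logistic blend, polynomial branches, and coefficients below are placeholders in the spirit of Grabau-type transitions, not the published curve fits or their variables and ranges.

        import numpy as np

        def transition(x, x0, k):
            """Smooth switch from 0 to 1 around x0; k controls the transition width."""
            return 1.0 / (1.0 + np.exp(-k * (x - x0)))

        def piecewise_fit(x, poly_low, poly_high, x0, k=25.0):
            """Blend two polynomial branches so the fitted surface stays continuous."""
            w = transition(x, x0, k)
            return (1.0 - w) * np.polyval(poly_low, x) + w * np.polyval(poly_high, x)

        # Hypothetical branches for some thermodynamic quantity vs. log-density:
        x = np.linspace(-7.0, 3.0, 200)                 # e.g., log10(density / amagat)
        y = piecewise_fit(x, poly_low=[0.2, 1.0, 3.0], poly_high=[-0.1, 0.8, 3.5], x0=-1.0)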

  11. GCSS/WGNE Pacific Cross-section Intercomparison: Tropical and Subtropical Cloud Transitions

    NASA Astrophysics Data System (ADS)

    Teixeira, J.

    2008-12-01

    In this presentation I will discuss the role of the GEWEX Cloud System Study (GCSS) working groups in paving the way for substantial improvements in cloud parameterization in weather and climate models. The GCSS/WGNE Pacific Cross-section Intercomparison (GPCI) is an extension of GCSS and is a different type of model evaluation, in which climate models are analyzed along a Pacific Ocean transect from California to the equator. This approach aims at complementing the more traditional efforts in GCSS by providing a simple framework for the evaluation of models that encompasses several fundamental cloud regimes such as stratocumulus, shallow cumulus and deep cumulus, as well as the transitions between them. Currently, twenty-four climate and weather prediction models are participating in GPCI. We will present results of the comparison between models and recent satellite data. In particular, we will explore in detail the potential of the Atmospheric Infrared Sounder (AIRS) and CloudSat data for the evaluation of the representation of clouds and convection in climate models.

  12. Simulation of blood flow in deformable vessels using subject-specific geometry and spatially varying wall properties

    PubMed Central

    Xiong, Guanglei; Figueroa, C. Alberto; Xiao, Nan; Taylor, Charles A.

    2011-01-01

    Simulation of blood flow using image-based models and computational fluid dynamics has found widespread application to quantifying hemodynamic factors relevant to the initiation and progression of cardiovascular diseases and for planning interventions. Methods for creating subject-specific geometric models from medical imaging data have improved substantially in the last decade but, for many problems, still require significant user interaction. In addition, while fluid–structure interaction methods are being employed to model blood flow and vessel wall dynamics, tissue properties are often assumed to be uniform. In this paper, we propose a novel workflow for simulating blood flow using subject-specific geometry and spatially varying wall properties. The geometric model construction is based on 3D segmentation and geometric processing. Variable wall properties are assigned to the model by combining centerline-based and surface-based methods. We finally demonstrate these new methods using an idealized cylindrical model and two subject-specific vascular models with thoracic and cerebral aneurysms. PMID:21765984

  13. Long short-term memory for speaker generalization in supervised speech separation

    PubMed Central

    Chen, Jitong; Wang, DeLiang

    2017-01-01

    Speech separation can be formulated as learning to estimate a time-frequency mask from acoustic features extracted from noisy speech. For supervised speech separation, generalization to unseen noises and unseen speakers is a critical issue. Although deep neural networks (DNNs) have been successful in noise-independent speech separation, DNNs are limited in modeling a large number of speakers. To improve speaker generalization, a separation model based on long short-term memory (LSTM) is proposed, which naturally accounts for temporal dynamics of speech. Systematic evaluation shows that the proposed model substantially outperforms a DNN-based model on unseen speakers and unseen noises in terms of objective speech intelligibility. Analyzing LSTM internal representations reveals that LSTM captures long-term speech contexts. It is also found that the LSTM model is more advantageous for low-latency speech separation: even without future frames, it performs better than the DNN model with future frames. The proposed model represents an effective approach for speaker- and noise-independent speech separation. PMID:28679261
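
    A minimal sketch of an LSTM time-frequency mask estimator of the kind described, written in PyTorch. The feature dimension, layer sizes, and the ideal-ratio-mask-style training target are assumptions for illustration, not the paper's exact configuration.

        import torch
        import torch.nn as nn

        class LSTMMaskEstimator(nn.Module):
            def __init__(self, n_features=161, hidden=512, layers=2):
                super().__init__()
                # A unidirectional LSTM keeps the model causal (low-latency separation).
                self.lstm = nn.LSTM(n_features, hidden, num_layers=layers, batch_first=True)
                self.out = nn.Linear(hidden, n_features)

            def forward(self, noisy_features):
                # noisy_features: (batch, time, frequency)
                h, _ = self.lstm(noisy_features)
                return torch.sigmoid(self.out(h))      # mask values in [0, 1]

        model = LSTMMaskEstimator()
        mask = model(torch.randn(4, 100, 161))         # applied later to the noisy spectrogram
        loss = nn.functional.mse_loss(mask, torch.rand(4, 100, 161))  # placeholder mask target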

  14. Stochastic Parameterization: Toward a New View of Weather and Climate Models

    DOE PAGES

    Berner, Judith; Achatz, Ulrich; Batté, Lauriane; ...

    2017-03-31

    The last decade has seen the success of stochastic parameterizations in short-term, medium-range, and seasonal forecasts: operational weather centers now routinely use stochastic parameterization schemes to represent model inadequacy better and to improve the quantification of forecast uncertainty. Developed initially for numerical weather prediction, the inclusion of stochastic parameterizations not only provides better estimates of uncertainty, but it is also extremely promising for reducing long-standing climate biases and is relevant for determining the climate response to external forcing. This article highlights recent developments from different research groups that show that the stochastic representation of unresolved processes in the atmosphere, oceans, land surface, and cryosphere of comprehensive weather and climate models 1) gives rise to more reliable probabilistic forecasts of weather and climate and 2) reduces systematic model bias. We make a case that the use of mathematically stringent methods for the derivation of stochastic dynamic equations will lead to substantial improvements in our ability to accurately simulate weather and climate at all scales. Recent work in mathematics, statistical mechanics, and turbulence is reviewed; its relevance for the climate problem is demonstrated; and future research directions are outlined.
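
    As a toy illustration of the general idea, perturbing a parameterized tendency with a temporally correlated, mean-zero random factor, consider the following sketch. The toy tendencies, AR(1) parameters, and time step are invented for illustration and do not correspond to any operational scheme.

        import numpy as np

        rng = np.random.default_rng(0)

        def ar1_perturbation(n_steps, dt, tau=6.0 * 3600, sigma=0.3):
            """Mean-zero AR(1) series with decorrelation time tau and standard deviation sigma."""
            phi = np.exp(-dt / tau)
            r = np.zeros(n_steps)
            for k in range(1, n_steps):
                r[k] = phi * r[k - 1] + sigma * np.sqrt(1 - phi ** 2) * rng.standard_normal()
            return r

        def step(state, dt, r):
            """One forward-Euler step of a toy model with a stochastically perturbed physics tendency."""
            resolved_tendency = -0.1 * state            # stand-in for resolved dynamics
            parameterized_tendency = 2.0 - 0.5 * state  # stand-in for a physics scheme
            return state + dt * (resolved_tendency + (1.0 + r) * parameterized_tendency)

        dt, n = 900.0, 960                              # 15-minute steps, 10 days
        r = ar1_perturbation(n, dt)
        x = 1.0
        for k in range(n):
            x = step(x, dt, r[k])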

  15. Deep Visual Attention Prediction

    NASA Astrophysics Data System (ADS)

    Wang, Wenguan; Shen, Jianbing

    2018-05-01

    In this work, we aim to predict human eye fixation with view-free scenes based on an end-to-end deep learning architecture. Although convolutional neural networks (CNNs) have brought substantial improvement to human attention prediction, CNN-based attention models can still be improved by efficiently leveraging multi-scale features. Our visual attention network is designed to capture hierarchical saliency information, from deep, coarse layers with global saliency information to shallow, fine layers with local saliency responses. The model is based on a skip-layer network structure, which predicts human attention from multiple convolutional layers with various receptive fields. The final saliency prediction is achieved via the cooperation of these global and local predictions. The model is trained with deep supervision, where supervision is fed directly into multiple intermediate layers, rather than being provided only at the output layer and propagated back to earlier layers. The model thus incorporates multi-level saliency predictions within a single network, which significantly reduces the redundancy of earlier approaches that learn multiple network streams with different input scales. Extensive experimental analysis on various challenging benchmark datasets demonstrates that our method yields state-of-the-art performance with competitive inference time.
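
    The skip-layer, deeply supervised design can be sketched as follows; the tiny backbone, channel counts, and fusion layer are illustrative stand-ins, not the paper's architecture.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class SkipLayerSaliency(nn.Module):
            def __init__(self):
                super().__init__()
                self.stage1 = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
                self.stage2 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
                self.stage3 = nn.Sequential(nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
                self.heads = nn.ModuleList([nn.Conv2d(c, 1, 1) for c in (32, 64, 128)])
                self.fuse = nn.Conv2d(3, 1, 1)

            def forward(self, x):
                size = x.shape[-2:]
                feats, preds = x, []
                # One saliency prediction per stage (different receptive fields), upsampled to input size.
                for stage, head in zip((self.stage1, self.stage2, self.stage3), self.heads):
                    feats = stage(feats)
                    preds.append(F.interpolate(head(feats), size=size, mode="bilinear",
                                               align_corners=False))
                fused = self.fuse(torch.cat(preds, dim=1))
                return [torch.sigmoid(p) for p in preds], torch.sigmoid(fused)

        model = SkipLayerSaliency()
        side_maps, saliency = model(torch.randn(2, 3, 96, 96))
        # Deep supervision: the training loss would sum a term for every side output plus the fused map.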

  16. A model for evaluating academic research centers: Case study of the Asian/Pacific Islander Youth Violence Prevention Center.

    PubMed

    Nishimura, Stephanie T; Hishinuma, Earl S; Goebert, Deborah A; Onoye, Jane M M; Sugimoto-Matsuda, Jeanelle J

    2018-02-01

    To provide one model for evaluating academic research centers, given their vital role in addressing public health issues. A theoretical framework is described for a comprehensive evaluation plan for research centers. This framework is applied to one specific center by describing the center's Logic Model and Evaluation Plan, including a sample of the center's activities. Formative and summative evaluation information is summarized. In addition, a summary of outcomes is provided: improved practice and policy; reduction of risk factors and increase in protective factors; reduction of interpersonal youth violence in the community; and a national prototype for prevention of interpersonal youth violence. Research centers are important mechanisms to advance science and improve people's quality of life. Because of their more infrastructure-intensive and comprehensive approach, they require substantial resources for success and, thus, careful accountability. It is therefore important to comprehensively evaluate these centers. As provided herein, a more systematic and structured approach utilizing logic models, an evaluation plan, and successful processes can provide research centers with a functionally useful method for their evaluation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Advantages of soft versus hard constraints in self-modeling curve resolution problems. Alternating least squares with penalty functions.

    PubMed

    Gemperline, Paul J; Cash, Eric

    2003-08-15

    A new algorithm for self-modeling curve resolution (SMCR) that yields improved results by incorporating soft constraints is described. The method uses least squares penalty functions to implement constraints in an alternating least squares algorithm, including nonnegativity, unimodality, equality, and closure constraints. By using least squares penalty functions, soft constraints are formulated rather than hard constraints. Significant benefits are obtained using soft constraints, especially in the form of fewer distortions due to noise in resolved profiles. Soft equality constraints can also be used to introduce incomplete or partial reference information into SMCR solutions. Four different examples demonstrating application of the new method are presented, including resolution of overlapped HPLC-DAD peaks, flow injection analysis data, and batch reaction data measured by UV/visible and near-infrared (NIR) spectroscopy. Each example was selected to show one aspect of the significant advantages of soft constraints over traditionally used hard constraints. The introduction of incomplete or partial reference information into self-modeling curve resolution models is also described. The method offers a substantial improvement in the ability to resolve time-dependent concentration profiles from mixture spectra recorded as a function of time.
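
    A minimal sketch of the soft-constraint idea for the nonnegativity case: negative entries are discouraged through a quadratic penalty inside each alternating least squares half-step (solved here by a few fixed-point iterations over the active set of negative entries), rather than being hard-clipped. Matrix shapes, the penalty weight, and iteration counts are illustrative; unimodality, equality, and closure penalties are not shown.

        import numpy as np

        def penalized_ls(A, B, lam=10.0, n_iter=10):
            """Approximately solve min_X ||B - A X||^2 + lam * ||min(X, 0)||^2, column by column."""
            X = np.linalg.lstsq(A, B, rcond=None)[0]
            for _ in range(n_iter):
                for j in range(X.shape[1]):
                    W = np.diag(lam * (X[:, j] < 0).astype(float))   # penalize only negative entries
                    X[:, j] = np.linalg.solve(A.T @ A + W, A.T @ B[:, j])
            return X

        def soft_als(D, n_components, lam=10.0, n_iter=50, seed=0):
            """Resolve D (samples x wavelengths) into C (concentrations) and S (spectra)."""
            rng = np.random.default_rng(seed)
            C = rng.random((D.shape[0], n_components))
            for _ in range(n_iter):
                S = penalized_ls(C, D, lam).T            # spectra:        wavelengths x components
                C = penalized_ls(S, D.T, lam).T          # concentrations: samples x components
            return C, S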

  18. Advances in HYDRA and its application to simulations of Inertial Confinement Fusion targets

    NASA Astrophysics Data System (ADS)

    Marinak, M. M.; Kerbel, G. D.; Koning, J. M.; Patel, M. V.; Sepke, S. M.; Brown, P. N.; Chang, B.; Procassini, R.; Veitzer, S. A.

    2008-11-01

    We will outline new capabilities added to the HYDRA 2D/3D multiphysics ICF simulation code. These include a new SN multigroup radiation transport package (1D), constitutive models for elastic-plastic (strength) effects, and a mix model. A Monte Carlo burn package is being incorporated to model diagnostic signatures of neutrons, gamma rays and charged particles. A 3D MHD package that treats resistive MHD is available. Improvements to HYDRA's implicit Monte Carlo photonics package, including the addition of angular biasing, now enable integrated hohlraum simulations to complete in substantially shorter time. The heavy ion beam deposition package now includes a new model for ion stopping power developed by the Tech-X Corporation, with improved accuracy below the Bragg peak. Examples will illustrate HYDRA's enhanced capabilities to simulate various aspects of inertial confinement fusion targets. This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344. The work of Tech-X personnel was funded by the Department of Energy under Small Business Innovation Research Contract No. DE-FG02-03ER83797.

  19. A Robust Bayesian Random Effects Model for Nonlinear Calibration Problems

    PubMed Central

    Fong, Y.; Wakefield, J.; De Rosa, S.; Frahm, N.

    2013-01-01

    In the context of a bioassay or an immunoassay, calibration means fitting a curve, usually nonlinear, through the observations collected on a set of samples containing known concentrations of a target substance, and then using the fitted curve and observations collected on samples of interest to predict the concentrations of the target substance in these samples. Recent technological advances have greatly improved our ability to quantify minute amounts of substance from a tiny volume of biological sample. This has in turn led to a need to improve statistical methods for calibration. In this paper, we focus on developing calibration methods robust to dependent outliers. We introduce a novel normal mixture model with dependent error terms to model the experimental noise. In addition, we propose a re-parameterization of the five parameter logistic nonlinear regression model that allows us to better incorporate prior information. We examine the performance of our methods with simulation studies and show that they lead to a substantial increase in performance measured in terms of mean squared error of estimation and a measure of the average prediction accuracy. A real data example from the HIV Vaccine Trials Network Laboratory is used to illustrate the methods. PMID:22551415
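
    For context, the standard five-parameter logistic (5PL) calibration and back-calculation step that such models build on looks like the following sketch. The data, starting values, and use of scipy's curve_fit are illustrative only; the paper's robust Bayesian random-effects formulation and re-parameterization are not reproduced.

        import numpy as np
        from scipy.optimize import curve_fit

        def five_pl(conc, a, b, c, d, g):
            """5PL curve: response as a function of concentration."""
            return d + (a - d) / (1.0 + (conc / c) ** b) ** g

        def inverse_five_pl(y, a, b, c, d, g):
            """Back-calculate concentration from an observed response."""
            return c * (((a - d) / (y - d)) ** (1.0 / g) - 1.0) ** (1.0 / b)

        # Hypothetical standard curve: known concentrations and measured responses.
        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
        resp = np.array([2.0, 2.4, 3.9, 7.8, 14.5, 19.0, 21.0])

        params, _ = curve_fit(five_pl, conc, resp, p0=[2.0, 1.0, 5.0, 22.0, 1.0], maxfev=10000)
        unknown_response = 10.0
        print(inverse_five_pl(unknown_response, *params))   # estimated concentration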

  20. Random-walk mobility analysis of Lisbon's plans for the post-1755 reconstruction

    NASA Astrophysics Data System (ADS)

    de Sampayo, Mafalda Teixeira; Sousa-Rodrigues, David

    2016-11-01

    The different options for the reconstruction of the city of Lisbon in the aftermath of the 1755 earthquake are studied with an agent-based model based on random walks. This method gives a comparative quantitative measure of mobility of the circulation spaces within the city. The plans proposed for the city of Lisbon signified a departure from the medieval mobility model of the city. The intricacy of the old city's circulation spaces is greatly reduced in the new plans, and mobility between different areas is substantially improved. The simulation results of the random-walk model show that the plans keeping the main force lines of the old city presented less improvement in terms of mobility. The plans that had greater design freedom were, by contrast, easier to navigate. Lisbon's reconstruction followed a plan that included a shift in the traditional notions of mobility. This affected the daily lives of its citizens by facilitating easy access to the waterfront and simplifying orientation and navigability. Using the random-walk model, it is shown how to quantitatively measure the potential that synthetic plans have in terms of the permeability and navigability of different city public spaces.
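
    The random-walk mobility measure can be illustrated on toy graphs: agents walk at random over a graph of circulation spaces, and the mean number of steps needed to reach a destination is compared across layouts. The two small adjacency matrices below are invented examples, not the historical Lisbon plans.

        import numpy as np

        rng = np.random.default_rng(1)

        def mean_hitting_steps(adjacency, start, target, n_walks=2000, max_steps=10000):
            """Average number of random-walk steps from `start` until `target` is reached."""
            steps = []
            for _ in range(n_walks):
                node, k = start, 0
                while node != target and k < max_steps:
                    neighbours = np.flatnonzero(adjacency[node])
                    node = rng.choice(neighbours)
                    k += 1
                steps.append(k)
            return float(np.mean(steps))

        # "Intricate" layout: a simple chain of spaces vs. a plan with extra cross connections.
        chain = np.zeros((6, 6), int)
        for i in range(5):
            chain[i, i + 1] = chain[i + 1, i] = 1

        grid = chain.copy()
        grid[0, 3] = grid[3, 0] = 1      # extra connections shorten typical walks
        grid[1, 4] = grid[4, 1] = 1

        print(mean_hitting_steps(chain, 0, 5), mean_hitting_steps(grid, 0, 5))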
