Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-21
... 2006 Decision Memorandum) at "Benchmarks for Short-Term Financing." B. Benchmark for Long-Term Loans.... Subsidies Valuation Information A. Benchmarks for Short-Term Financing For those programs requiring the application of a won-denominated, short-term interest rate benchmark, in accordance with 19 CFR 351.505(a)(2...
Time and frequency structure of causal correlation networks in the China bond market
NASA Astrophysics Data System (ADS)
Wang, Zhongxing; Yan, Yan; Chen, Xiaosong
2017-07-01
There are more than eight hundred interest rates published in the China bond market every day. Identifying the benchmark interest rates that have broad influences on most other interest rates is a major concern for economists. In this paper, a multi-variable Granger causality test is developed and applied to construct a directed network of interest rates, whose important nodes, regarded as key interest rates, are evaluated with CheiRank scores. The results indicate that repo rates are the benchmark of short-term rates, the central bank bill rates are in the core position of mid-term interest rates network, and treasury bond rates lead the long-term bond rates. The evolution of benchmark interest rates from 2008 to 2014 is also studied, and it is found that SHIBOR has generally become the benchmark interest rate in China. In the frequency domain we identify the properties of information flows between interest rates, and the result confirms the existence of market segmentation in the China bond market.
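A minimal sketch of the network-construction idea described above, under simplifying assumptions: it uses plain pairwise Granger-causality tests from statsmodels (not the authors' multi-variable test) on a hypothetical DataFrame of daily rate series, and approximates the CheiRank score as PageRank computed on the edge-reversed graph, so that rates which drive many others rank highly.

```python
# Sketch: directed causality network between interest-rate series with a
# CheiRank-style ranking. Pairwise tests only; column names, the lag order and
# the 5% significance threshold are illustrative assumptions.
import networkx as nx
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

def causality_network(rates: pd.DataFrame, maxlag: int = 5, alpha: float = 0.05) -> nx.DiGraph:
    g = nx.DiGraph()
    g.add_nodes_from(rates.columns)
    for src in rates.columns:
        for dst in rates.columns:
            if src == dst:
                continue
            # Null hypothesis: src does NOT Granger-cause dst.
            res = grangercausalitytests(rates[[dst, src]].dropna(), maxlag=maxlag)
            p = min(r[0]["ssr_ftest"][1] for r in res.values())
            if p < alpha:
                g.add_edge(src, dst)
    return g

def cheirank_scores(g: nx.DiGraph) -> dict:
    # CheiRank ~ PageRank on the graph with all edges reversed.
    return nx.pagerank(g.reverse(copy=True))
```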
A Benchmark Study of Large Contract Supplier Monitoring Within DOD and Private Industry
1994-03-01
Excerpt (table of contents and report documentation page): contents include Long-Term Supplier Relationships, Global Sourcing, and Refocusing on Customer Quality. Private-industry initiatives (supplier monitoring and recognition, a reduced number of suppliers, global sourcing, and long-term contractor relationships) were then compared to DCMC practices. Subject terms: Benchmark Study of Large Contract Supplier Monitoring. Number of pages: 108.
Janssen, David; Jongen, Wesley; Schröder-Bäck, Peter
2016-08-01
In this case study, European quality benchmarks were used to explore the contemporary quality of the long-term care provision for older people in the Belgian region of Flanders and the Netherlands following recent policy reforms. Semi-structured qualitative interviews were conducted with various experts on the long-term care provision. The results show that in the wake of the economic crisis and the reforms that followed, certain vulnerable groups of older people in Belgium and the Netherlands are at risk of being deprived of long-term care that is available, affordable and person-centred. Various suggestions were provided on how to improve the quality of the long-term care provision. The main conclusion drawn in this study is that while national and regional governments set the stage through regulatory frameworks and financing mechanisms, it is subsequently up to long-term care organisations, local social networks and informal caregivers to give substance to a high quality long-term care provision. An increased reliance on social networks and informal caregivers is seen as vital to ensure the sustainability of the long-term care systems in Belgium and in the Netherlands, although this simultaneously introduces new predicaments and difficulties. Structural governmental measures have to be introduced to support and protect informal caregivers and informal care networks. Copyright © 2016 Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-05
... to the short- and medium-term rates to convert them to long- term rates using Bloomberg U.S... derivation of the benchmark and discount rates used to value these subsidies is discussed below. Short-Term... inflation-adjusted short-term benchmark rate, we have also excluded any countries with aberrational or...
NASA Technical Reports Server (NTRS)
Halford, Gary R.; Shah, Ashwin; Arya, Vinod K.; Krause, David L.; Bartolotta, Paul A.
2002-01-01
Deep-space missions require onboard electric power systems with reliable design lifetimes of up to 10 yr and beyond. A high-efficiency Stirling radioisotope power system is a likely candidate for future deep-space missions and Mars rover applications. To ensure ample durability, the structurally critical heater head of the Stirling power convertor has undergone extensive computational analyses of operating temperatures (up to 650 C), stresses, and creep resistance of the thin-walled Inconel 718 bill of material. Durability predictions are presented in terms of the probability of survival. A benchmark structural testing program has commenced to support the analyses. This report presents the current status of durability assessments.
Multi-Complementary Model for Long-Term Tracking
Zhang, Deng; Zhang, Junchang; Xia, Chenyang
2018-01-01
In recent years, video target tracking algorithms have been widely used. However, many tracking algorithms do not achieve satisfactory performance, especially when dealing with problems such as object occlusions, background clutters, motion blur, low illumination color images, and sudden illumination changes in real scenes. In this paper, we incorporate an object model based on contour information into a Staple tracker that combines the correlation filter model and color model to greatly improve the tracking robustness. Since each model is responsible for tracking specific features, the three complementary models combine for more robust tracking. In addition, we propose an efficient object detection model with contour and color histogram features, which has good detection performance and better detection efficiency compared to the traditional target detection algorithm. Finally, we optimize the traditional scale calculation, which greatly improves the tracking execution speed. We evaluate our tracker on the Object Tracking Benchmarks 2013 (OTB-13) and Object Tracking Benchmarks 2015 (OTB-15) benchmark datasets. With the OTB-13 benchmark datasets, our algorithm is improved by 4.8%, 9.6%, and 10.9% on the success plots of OPE, TRE and SRE, respectively, in contrast to another classic LCT (Long-term Correlation Tracking) algorithm. On the OTB-15 benchmark datasets, when compared with the LCT algorithm, our algorithm achieves 10.4%, 12.5%, and 16.1% improvement on the success plots of OPE, TRE, and SRE, respectively. At the same time, it needs to be emphasized that, due to the high computational efficiency of the color model and the object detection model using efficient data structures, and the speed advantage of the correlation filters, our tracking algorithm could still achieve good tracking speed. PMID:29425170
Winning Strategy: Set Benchmarks of Early Success to Build Momentum for the Long Term
ERIC Educational Resources Information Center
Spiro, Jody
2012-01-01
Change is a highly personal experience. Everyone participating in the effort has different reactions to change, different concerns, and different motivations for being involved. The smart change leader sets benchmarks along the way so there are guideposts and pause points instead of an endless change process. "Early wins"--a term used to describe…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-01
... of the loan (e.g., short-term v. long-term), and the currency in which the loan is denominated...'s subsidies to the sales of Tomasello only. Benchmarks for Long-Term Loans and Discount Rates Loan..., the long-term interest rate calculated according to the methodology described above for the year in...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-08
... rate versus variable interest rate), the maturity of the loan (e.g., short-term versus long-term), and... were received. Benchmarks for Long-Term Loans and Discount Rates Pursuant to 19 CFR 351.505(a), the... using the national average cost of long-term, fixed-rate loans pursuant to 19 CFR 351.524(d)(3)(B...
Maximal Unbiased Benchmarking Data Sets for Human Chemokine Receptors and Comparative Analysis.
Xia, Jie; Reid, Terry-Elinor; Wu, Song; Zhang, Liangren; Wang, Xiang Simon
2018-05-29
Chemokine receptors (CRs) have long been druggable targets for the treatment of inflammatory diseases and HIV-1 infection. As a powerful technique, virtual screening (VS) has been widely applied to identifying small molecule leads for modern drug targets including CRs. For rational selection of a wide variety of VS approaches, ligand enrichment assessment based on a benchmarking data set has become an indispensable practice. However, the lack of versatile benchmarking sets for the whole CR family that are able to unbiasedly evaluate every single approach, including both structure- and ligand-based VS, somewhat hinders modern drug discovery efforts. To address this issue, we constructed Maximal Unbiased Benchmarking Data sets for human Chemokine Receptors (MUBD-hCRs) using our recently developed tool MUBD-DecoyMaker. MUBD-hCRs encompasses 13 out of 20 chemokine receptor subtypes, is composed of 404 ligands and 15,756 decoys so far, and is readily expandable in the future. It has been thoroughly validated that the MUBD-hCRs ligands are chemically diverse while its decoys are maximally unbiased in terms of "artificial enrichment" and "analogue bias". In addition, we studied the performance of MUBD-hCRs, in particular the CXCR4 and CCR5 data sets, in ligand enrichment assessments of both structure- and ligand-based VS approaches in comparison with other benchmarking data sets available in the public domain, and demonstrated that MUBD-hCRs is very capable of designating the optimal VS approach. MUBD-hCRs is a unique and maximally unbiased benchmarking set that covers major CR subtypes so far.
Sulmann, D
2011-02-01
All citizens have the right to dignified and respectful social care and assistance. The state and society as a whole have the responsibility to guarantee the realization of these rights. However, the question arises: what constitutes dignified and respectful long-term care and assistance for the individual? One possible answer is given by the German Charter of Rights for people in need of long-term care and assistance. The charter summarizes existing legal sources such as the German Federal Constitution and the European Social Charter and translates them into the specific context of long-term care. It is written in a language easily understood by everyone and reflects the typical situation of people in need of long-term care and assistance. It sets an explicit benchmark for health and social care in Germany. The Charter was developed in 2005 at the round table for long-term care, hosted by the Federal Ministry for Family Affairs, Senior Citizens, Women and Youth in collaboration with the Federal Ministry of Health and Social Security. The round table consisted of representatives of users, consumer groups and other stakeholders, but also of care providers and health and care insurance funds in Germany. Many institutions, such as residential homes and health care services, have now successfully applied the Charter in their daily work, and it has found its way into several pieces of legislation at national and regional levels. The following article gives an overview of the structure, content and intention of the Charter and also highlights examples of implementation and its effects on the care structure and daily work with people in need of long-term care.
NASA Astrophysics Data System (ADS)
Nguyen, Theanh; Chan, Tommy H. T.; Thambiratnam, David P.; King, Les
2015-12-01
In the structural health monitoring (SHM) field, long-term continuous vibration-based monitoring is becoming increasingly popular as this could keep track of the health status of structures during their service lives. However, implementing such a system is not always feasible due to on-going conflicts between budget constraints and the need of sophisticated systems to monitor real-world structures under their demanding in-service conditions. To address this problem, this paper presents a comprehensive development of a cost-effective and flexible vibration DAQ system for long-term continuous SHM of a newly constructed institutional complex with a special focus on the main building. First, selections of sensor type and sensor positions are scrutinized to overcome adversities such as low-frequency and low-level vibration measurements. In order to economically tackle the sparse measurement problem, a cost-optimized Ethernet-based peripheral DAQ model is first adopted to form the system skeleton. A combination of a high-resolution timing coordination method based on the TCP/IP command communication medium and a periodic system resynchronization strategy is then proposed to synchronize data from multiple distributed DAQ units. The results of both experimental evaluations and experimental-numerical verifications show that the proposed DAQ system in general and the data synchronization solution in particular work well and they can provide a promising cost-effective and flexible alternative for use in real-world SHM projects. Finally, the paper demonstrates simple but effective ways to make use of the developed monitoring system for long-term continuous structural health evaluation as well as to use the instrumented building herein as a multi-purpose benchmark structure for studying not only practical SHM problems but also synchronization related issues.
WWTP dynamic disturbance modelling--an essential module for long-term benchmarking development.
Gernaey, K V; Rosen, C; Jeppsson, U
2006-01-01
Intensive use of the benchmark simulation model No. 1 (BSM1), a protocol for objective comparison of the effectiveness of control strategies in biological nitrogen removal activated sludge plants, has also revealed a number of limitations. Preliminary definitions of the long-term benchmark simulation model No. 1 (BSM1_LT) and the benchmark simulation model No. 2 (BSM2) have been made to extend BSM1 for evaluation of process monitoring methods and plant-wide control strategies, respectively. Influent-related disturbances for BSM1_LT/BSM2 are to be generated with a model, and this paper provides a general overview of the modelling methods used. Typical influent dynamic phenomena generated with the BSM1_LT/BSM2 influent disturbance model, including diurnal, weekend, seasonal and holiday effects, as well as rainfall, are illustrated with simulation results. As a result of the work described in this paper, a proposed influent model/file has been released to the benchmark developers for evaluation purposes. Pending this evaluation, a final BSM1_LT/BSM2 influent disturbance model definition is foreseen. Preliminary simulations with dynamic influent data generated by the influent disturbance model indicate that default BSM1 activated sludge plant control strategies will need extensions for BSM1_LT/BSM2 to efficiently handle 1 year of influent dynamics.
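As an illustration of the kind of phenomenological influent generator described above, the sketch below superimposes diurnal, weekend, seasonal and stochastic rain effects on a base flow; it is not the BSM1_LT/BSM2 influent model itself, and the base flow, amplitudes, time step and rain statistics are all illustrative assumptions.

```python
# Sketch of a dynamic influent-flow generator in the spirit of the BSM1_LT/BSM2
# disturbance model; every number here is an illustrative assumption.
import numpy as np

def influent_flow(days=364, dt_h=0.25, base=20000.0, seed=0):
    """Return (time in hours, flow in m3/d) for a synthetic dry/wet weather year."""
    rng = np.random.default_rng(seed)
    t = np.arange(0, days * 24, dt_h)                                  # hours
    diurnal = 1.0 + 0.15 * np.sin(2 * np.pi * (t - 6) / 24)            # daily cycle
    weekend = np.where((t // 24) % 7 >= 5, 0.9, 1.0)                   # lower weekend load
    seasonal = 1.0 + 0.10 * np.sin(2 * np.pi * t / (24 * days))        # infiltration season
    rain = np.zeros_like(t)
    for s in rng.choice(t.size, size=30, replace=False):               # ~30 rain events/year
        rain[s:s + rng.integers(4, 48)] += rng.uniform(0.2, 1.5)       # 1-12 h storm surcharge
    noise = rng.normal(1.0, 0.02, t.size)
    return t, base * (diurnal * weekend * seasonal * noise + rain)
```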
ERIC Educational Resources Information Center
Ka'opua, Lana Sue I.; Gotay, Carolyn C.; Hannum, Meghan; Bunghanoy, Grace
2005-01-01
Increasingly evident is the important role of partners in patients' adaptation to diagnosis, treatment, and recovery. Yet, little is known about partners' adaptation when patients reach the benchmark known as long-term survival. This study describes elderly wives of prostate cancer survivors' perspectives of adaptation to the enduring challenges…
Correlation of Noncancer Benchmark Doses in Short- and Long-Term Rodent Bioassays.
Kratchman, Jessica; Wang, Bing; Fox, John; Gray, George
2018-05-01
This study investigated whether, in the absence of chronic noncancer toxicity data, short-term noncancer toxicity data can be used to predict chronic toxicity effect levels by focusing on the dose-response relationship instead of a critical effect. Data from National Toxicology Program (NTP) technical reports have been extracted and modeled using the Environmental Protection Agency's Benchmark Dose Software. Best-fit, minimum benchmark dose (BMD), and benchmark dose lower limits (BMDLs) have been modeled for all NTP pathologist identified significant nonneoplastic lesions, final mean body weight, and mean organ weight of 41 chemicals tested by NTP between 2000 and 2012. Models were then developed at the chemical level using orthogonal regression techniques to predict chronic (two years) noncancer health effect levels using the results of the short-term (three months) toxicity data. The findings indicate that short-term animal studies may reasonably provide a quantitative estimate of a chronic BMD or BMDL. This can allow for faster development of human health toxicity values for risk assessment for chemicals that lack chronic toxicity data. © 2017 Society for Risk Analysis.
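To make the regression step concrete, here is a minimal sketch of an orthogonal (total least squares) fit of chronic on subchronic benchmark doses on a log10 scale; the dose values are invented for illustration and are not NTP results, and the authors' actual models may differ in detail.

```python
# Sketch: orthogonal regression of chronic vs. subchronic benchmark doses on a
# log10 scale. Values are hypothetical, not extracted NTP data.
import numpy as np

def orthogonal_fit(x, y):
    """Fit y = a + b*x by total least squares (first principal axis of centred data)."""
    pts = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, vt = np.linalg.svd(pts, full_matrices=False)
    b = vt[0, 1] / vt[0, 0]
    a = y.mean() - b * x.mean()
    return a, b

log_short = np.log10([3.0, 10.0, 35.0, 120.0, 400.0])     # hypothetical 3-month BMDLs (mg/kg-day)
log_chronic = np.log10([1.0, 4.0, 12.0, 45.0, 150.0])     # hypothetical 2-year BMDLs (mg/kg-day)
a, b = orthogonal_fit(log_short, log_chronic)
predicted_chronic = 10 ** (a + b * np.log10(50.0))         # chronic estimate for a 50 mg/kg-day short-term BMDL
```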
NASA Astrophysics Data System (ADS)
Keppel-Aleks, G.; Hoffman, F. M.
2014-12-01
Feedbacks between the global carbon cycle and climate represent one of the largest uncertainties in climate prediction. A promising method for reducing uncertainty in predictions of carbon-climate feedbacks is based on identifying an "emergent constraint" that leverages correlations between mechanistically linked long-term feedbacks and short-term variations within the model ensemble. By applying contemporary observations to evaluate model skill in simulating short-term variations, we may be able to better assess the probability of simulated long-term feedbacks. We probed the constraint on long-term terrestrial carbon stocks provided by climate-driven fluctuations in the atmospheric CO2 growth rate at contemporary timescales. We considered the impact of both temperature and precipitation anomalies on terrestrial ecosystem exchange and further separated the direct influence of fire where possible. When we explicitly considered the role of atmospheric transport in smoothing the imprint of climate-driven flux anomalies on atmospheric CO2 patterns, we found that the extent of temporal averaging of both the observations and ESM output leads to estimates for the long-term climate sensitivity of tropical land carbon storage that are different by a factor of two. In the context of these results, we discuss strategies for applying emergent constraints for benchmarking biogeochemical feedbacks in ESMs. Specifically, our results underscore the importance of selecting appropriate observational benchmarks and, for future model intercomparison projects, outputting fields that most closely correspond to available observational datasets.
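A minimal sketch of how such an emergent constraint is typically applied, under the assumption of a linear relationship across the ensemble: regress the long-term quantity on the short-term observable, then evaluate the regression at the observed value while propagating observational and regression uncertainty. The ensemble and observational numbers below are hypothetical.

```python
# Sketch of an emergent-constraint calculation; all numbers are hypothetical.
import numpy as np

def emergent_constraint(short_term, long_term, obs_mean, obs_sigma, n=100_000, seed=1):
    slope, intercept = np.polyfit(short_term, long_term, 1)
    resid_sd = np.std(long_term - (slope * short_term + intercept), ddof=2)
    rng = np.random.default_rng(seed)
    obs = rng.normal(obs_mean, obs_sigma, n)                     # observational uncertainty
    constrained = slope * obs + intercept + rng.normal(0.0, resid_sd, n)
    return constrained.mean(), constrained.std()

# Hypothetical ensemble: interannual CO2 growth-rate sensitivity to tropical
# temperature (ppm/yr/K) vs. long-term tropical land carbon change (PgC/K).
gamma_iav = np.array([3.1, 4.0, 4.8, 5.5, 6.3, 7.0])
gamma_lt = np.array([-30.0, -44.0, -52.0, -60.0, -75.0, -88.0])
mean_lt, sd_lt = emergent_constraint(gamma_iav, gamma_lt, obs_mean=4.4, obs_sigma=0.9)
```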
Cepoiu-Martin, Monica; Bischak, Diane P
2018-02-01
The increase in the incidence of dementia in the aging population and the decrease in the availability of informal caregivers put pressure on continuing care systems to care for a growing number of people with disabilities. Policy changes in the continuing care system need to address this shift in the population structure. One of the most effective tools for assessing policies in complex systems is system dynamics. Nevertheless, this method is underused in continuing care capacity planning. A system dynamics model of the Alberta Continuing Care System was developed using stylized data. Sensitivity analyses and policy evaluations were conducted to demonstrate the use of system dynamics modelling in this area of public health planning. We focused our policy exploration on introducing staff/resident benchmarks in both supportive living and long-term care (LTC). The sensitivity analyses presented in this paper help identify leverage points in the system that need to be acknowledged when policy decisions are made. Our policy explorations showed that the deficits of staff increase dramatically when benchmarks are introduced, as expected, but at the end of the simulation period the deficits of both nurses and health care aides are similar between the two scenarios tested. Modifying the benchmarks in LTC only versus in both supportive living and LTC has similar effects on staff deficits in the long term, under the assumptions of this particular model. The continuing care system dynamics model can be used to test various policy scenarios, allowing decision makers to visualize the effect of a certain policy choice on different system variables and to compare different policy options. Our exploration illustrates the use of system dynamics models for policy making in complex health care systems. © 2017 John Wiley & Sons, Ltd.
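For readers unfamiliar with the technique, the sketch below is a deliberately small stock-and-flow model of supportive living and LTC populations with a staff/resident benchmark, stepped forward with Euler integration; all parameters are stylized assumptions and not values from the Alberta model.

```python
# Sketch: a toy system dynamics (stock-and-flow) model of a continuing care
# system with a staff/resident benchmark. All parameters are stylized.
import numpy as np

def staffing_deficit(years=20, dt=0.1,
                     demand_growth=0.03,       # yearly growth in admissions
                     sl_admissions=800.0,      # admissions per year to supportive living (SL)
                     sl_to_ltc=0.15,           # fraction of SL residents moving to LTC per year
                     sl_exit=0.10,             # other SL exits per year
                     ltc_exit=0.25,            # LTC discharges/deaths per year
                     staff_per_resident=0.35,  # benchmark ratio (full-time staff per resident)
                     staff_supply=900.0):      # available staff, held constant here
    steps = int(years / dt)
    sl, ltc = 2000.0, 1500.0                   # initial residents (stocks)
    deficit = np.zeros(steps)
    for i in range(steps):
        admissions = sl_admissions * (1.0 + demand_growth) ** (i * dt)
        transfers = sl_to_ltc * sl
        sl += dt * (admissions - transfers - sl_exit * sl)
        ltc += dt * (transfers - ltc_exit * ltc)
        deficit[i] = max(0.0, staff_per_resident * (sl + ltc) - staff_supply)
    return deficit
```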
Toward Scalable Benchmarks for Mass Storage Systems
NASA Technical Reports Server (NTRS)
Miller, Ethan L.
1996-01-01
This paper presents guidelines for the design of a mass storage system benchmark suite, along with preliminary suggestions for programs to be included. The benchmarks will measure both peak and sustained performance of the system as well as predicting both short- and long-term behavior. These benchmarks should be both portable and scalable so they may be used on storage systems from tens of gigabytes to petabytes or more. By developing a standard set of benchmarks that reflect real user workload, we hope to encourage system designers and users to publish performance figures that can be compared with those of other systems. This will allow users to choose the system that best meets their needs and give designers a tool with which they can measure the performance effects of improvements to their systems.
ERIC Educational Resources Information Center
Radunzel, Justine; Noble, Julie
2012-01-01
This study compared the effectiveness of ACT[R] Composite score and high school grade point average (HSGPA) for predicting long-term college success. Outcomes included annual progress towards a degree (based on cumulative credit-bearing hours earned), degree completion, and cumulative grade point average (GPA) at 150% of normal time to degree…
Versari, Cristian; Stoma, Szymon; Batmanov, Kirill; Llamosi, Artémis; Mroz, Filip; Kaczmarek, Adam; Deyell, Matt; Lhoussaine, Cédric; Hersen, Pascal; Batt, Gregory
2017-02-01
With the continuous expansion of single cell biology, the observation of the behaviour of individual cells over extended durations and with high accuracy has become a problem of central importance. Surprisingly, even for yeast cells that have relatively regular shapes, no solution has been proposed that reaches the high quality required for long-term experiments for segmentation and tracking (S&T) based on brightfield images. Here, we present CellStar, a tool chain designed to achieve good performance in long-term experiments. The key features are the use of a new variant of parametrized active rays for segmentation, a neighbourhood-preserving criterion for tracking, and the use of an iterative approach that incrementally improves S&T quality. A graphical user interface enables manual corrections of S&T errors and their use for the automated correction of other, related errors and for parameter learning. We created a benchmark dataset with manually analysed images and compared CellStar with six other tools, showing its high performance, notably in long-term tracking. As a community effort, we set up a website, the Yeast Image Toolkit, with the benchmark and the Evaluation Platform to gather this and additional information provided by others. © 2017 The Authors.
A solid reactor core thermal model for nuclear thermal rockets
NASA Astrophysics Data System (ADS)
Rider, William J.; Cappiello, Michael W.; Liles, Dennis R.
1991-01-01
A Helium/Hydrogen Cooled Reactor Analysis (HERA) computer code has been developed. HERA has the ability to model arbitrary geometries in three dimensions, which allows the user to easily analyze reactor cores constructed of prismatic graphite elements. The code accounts for heat generation in the fuel, control rods, and other structures; conduction and radiation across gaps; convection to the coolant; and a variety of boundary conditions. The numerical solution scheme has been optimized for vector computers, making long transient analyses economical. Time integration is either explicit or implicit, which allows the use of the model to accurately calculate both short- or long-term transients with an efficient use of computer time. Both the basic spatial and temporal integration schemes have been benchmarked against analytical solutions.
Benchmarking of Typical Meteorological Year datasets dedicated to Concentrated-PV systems
NASA Astrophysics Data System (ADS)
Realpe, Ana Maria; Vernay, Christophe; Pitaval, Sébastien; Blanc, Philippe; Wald, Lucien; Lenoir, Camille
2016-04-01
Accurate analysis of meteorological and pyranometric data for long-term analysis is the basis of decision-making for banks and investors regarding solar energy conversion systems. This has led to the development of methodologies for the generation of Typical Meteorological Years (TMY) datasets. The most widely used method for solar energy conversion systems was proposed in 1978 by the Sandia Laboratory (Hall et al., 1978), considering a specific weighted combination of different meteorological variables, notably global, diffuse horizontal and direct normal irradiances, air temperature, wind speed and relative humidity. In 2012, a new approach was proposed in the framework of the European project FP7 ENDORSE. It introduced the concept of a "driver", defined by the user as an explicit function of the relevant pyranometric and meteorological variables, to improve the representativeness of the TMY datasets with respect to the specific solar energy conversion system of interest. The present study aims at comparing and benchmarking different TMY datasets considering a specific Concentrated-PV (CPV) system as the solar energy conversion system of interest. Using long-term (15+ years) time series of high-quality meteorological and pyranometric ground measurements, three types of TMY datasets were generated by the following methods: the Sandia method, a simplified driver with DNI as the only representative variable, and a more sophisticated driver. The latter takes into account the sensitivities of the CPV system with respect to the spectral distribution of the solar irradiance and wind speed. Different TMY datasets from the three methods have been generated considering different numbers of years in the historical dataset, ranging from 5 to 15 years. The comparisons and benchmarking of these TMY datasets are conducted considering the long-term time series of simulated CPV electric production as a reference. The results of this benchmarking clearly show that the Sandia method is not suitable for CPV systems. For these systems, the TMY datasets obtained using dedicated drivers (DNI only, or the more precise one) are more representative when deriving TMY datasets from a limited long-term meteorological dataset.
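To illustrate the driver idea in its simplest form, the sketch below selects, for each calendar month, the historical year whose daily cumulative distribution of a single driver variable (e.g., DNI) is closest to the long-term distribution, using a Finkelstein-Schafer-type statistic; real TMY methods, including the Sandia method and the ENDORSE drivers, weight several variables, so this single-variable version is an illustrative simplification.

```python
# Sketch: single-variable TMY month selection with a Finkelstein-Schafer (FS)
# statistic. The choice of DNI as the only driver is a simplification.
import numpy as np
import pandas as pd

def fs_statistic(sample, long_term):
    """Mean absolute difference between the sample and long-term empirical CDFs."""
    if sample.size == 0:
        return np.inf
    grid = np.sort(long_term)
    cdf_lt = np.searchsorted(grid, grid, side="right") / long_term.size
    cdf_s = np.searchsorted(np.sort(sample), grid, side="right") / sample.size
    return float(np.mean(np.abs(cdf_lt - cdf_s)))

def select_typical_months(daily_driver: pd.Series) -> dict:
    """daily_driver: daily DNI (or other driver) indexed by a DatetimeIndex over many years."""
    chosen = {}
    for month in range(1, 13):
        month_data = daily_driver[daily_driver.index.month == month]
        scores = {year: fs_statistic(month_data[month_data.index.year == year].to_numpy(),
                                     month_data.to_numpy())
                  for year in np.unique(month_data.index.year)}
        chosen[month] = min(scores, key=scores.get)   # year whose CDF best matches the long term
    return chosen
```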
Long-term fish monitoring in large rivers: Utility of “benchmarking” across basins
Ward, David L.; Casper, Andrew F.; Counihan, Timothy D.; Bayer, Jennifer M.; Waite, Ian R.; Kosovich, John J.; Chapman, Colin; Irwin, Elise R.; Sauer, Jennifer S.; Ickes, Brian; McKerrow, Alexa
2017-01-01
In business, benchmarking is a widely used practice of comparing your own business processes to those of other comparable companies and incorporating identified best practices to improve performance. Biologists and resource managers designing and conducting monitoring programs for fish in large river systems tend to focus on single river basins or segments of large rivers, missing opportunities to learn from those conducting fish monitoring in other rivers. We briefly examine five long-term fish monitoring programs in large rivers in the United States (Colorado, Columbia, Mississippi, Illinois, and Tallapoosa rivers) and identify opportunities for learning across programs by detailing best monitoring practices and why these practices were chosen. Although monitoring objectives, methods, and program maturity differ between each river system, examples from these five case studies illustrate the important role that long-term monitoring programs play in interpreting temporal and spatial shifts in fish populations for both established objectives and newly emerging questions. We suggest that deliberate efforts to develop a broader collaborative network through benchmarking will facilitate sharing of ideas and development of more effective monitoring programs.
Benchmark analysis of forecasted seasonal temperature over different climatic areas
NASA Astrophysics Data System (ADS)
Giunta, G.; Salerno, R.; Ceppi, A.; Ercolani, G.; Mancini, M.
2015-12-01
From a long-term perspective, an improvement of seasonal forecasting, which is often exclusively based on climatology, could provide a new capability for the management of energy resources in a time scale of just a few months. This paper regards a benchmark analysis in relation to long-term temperature forecasts over Italy in the year 2010, comparing the eni-kassandra meteo forecast (e-kmf®) model, the Climate Forecast System-National Centers for Environmental Prediction (CFS-NCEP) model, and the climatological reference (based on 25-year data) with observations. Statistical indexes are used to understand the reliability of the prediction of 2-m monthly air temperatures with a perspective of 12 weeks ahead. The results show how the best performance is achieved by the e-kmf® system which improves the reliability for long-term forecasts compared to climatology and the CFS-NCEP model. By using the reliable high-performance forecast system, it is possible to optimize the natural gas portfolio and management operations, thereby obtaining a competitive advantage in the European energy market.
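The verification indexes used in this kind of benchmark are standard; a minimal sketch follows, computing bias, RMSE, a mean-square-error skill score against climatology, and the anomaly correlation for monthly 2-m temperature forecasts. Input arrays are assumed to be aligned by month and station and are purely illustrative.

```python
# Sketch: standard verification scores for monthly 2-m temperature forecasts
# benchmarked against a climatological reference.
import numpy as np

def verification_scores(forecast, observed, climatology):
    forecast, observed, climatology = map(np.asarray, (forecast, observed, climatology))
    err = forecast - observed
    rmse = np.sqrt(np.mean(err ** 2))
    rmse_clim = np.sqrt(np.mean((climatology - observed) ** 2))
    fa, oa = forecast - climatology, observed - climatology       # anomalies
    acc = np.sum(fa * oa) / np.sqrt(np.sum(fa ** 2) * np.sum(oa ** 2))
    return {
        "bias": float(np.mean(err)),
        "rmse": float(rmse),
        "msess": float(1.0 - (rmse / rmse_clim) ** 2),            # skill vs. climatology
        "acc": float(acc),                                        # anomaly correlation
    }
```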
Evaluation of control strategies using an oxidation ditch benchmark.
Abusam, A; Keesman, K J; Spanjers, H; van Straten, G; Meinema, K
2002-01-01
This paper presents validation and implementation results of a benchmark developed for a specific full-scale oxidation ditch wastewater treatment plant. A benchmark is a standard simulation procedure that can be used as a tool in evaluating various control strategies proposed for wastewater treatment plants. It is based on model and performance criteria development. Testing of this benchmark, by comparing benchmark predictions to real measurements of the electrical energy consumptions and amounts of disposed sludge for a specific oxidation ditch WWTP, has shown that it can (reasonably) be used for evaluating the performance of this WWTP. Subsequently, the validated benchmark was then used in evaluating some basic and advanced control strategies. Some of the interesting results obtained are the following: (i) influent flow splitting ratio, between the first and the fourth aerated compartments of the ditch, has no significant effect on the TN concentrations in the effluent, and (ii) for evaluation of long-term control strategies, future benchmarks need to be able to assess settlers' performance.
40 CFR 265.310 - Closure and post-closure care.
Code of Federal Regulations, 2010 CFR
2010-07-01
... designed and constructed to: (1) Provide long-term minimization of migration of liquids through the closed... from eroding or otherwise damaging the final cover; and (5) Protect and maintain surveyed benchmarks...
40 CFR 265.310 - Closure and post-closure care.
Code of Federal Regulations, 2011 CFR
2011-07-01
... designed and constructed to: (1) Provide long-term minimization of migration of liquids through the closed... from eroding or otherwise damaging the final cover; and (5) Protect and maintain surveyed benchmarks...
Protein Models Docking Benchmark 2
Anishchenko, Ivan; Kundrotas, Petras J.; Tuzikov, Alexander V.; Vakser, Ilya A.
2015-01-01
Structural characterization of protein-protein interactions is essential for our ability to understand life processes. However, only a fraction of known proteins have experimentally determined structures. Such structures provide templates for modeling of a large part of the proteome, where individual proteins can be docked by template-free or template-based techniques. Still, the sensitivity of the docking methods to the inherent inaccuracies of protein models, as opposed to the experimentally determined high-resolution structures, remains largely untested, primarily due to the absence of appropriate benchmark set(s). Structures in such a set should have pre-defined inaccuracy levels and, at the same time, resemble actual protein models in terms of structural motifs/packing. The set should also be large enough to ensure statistical reliability of the benchmarking results. We present a major update of the previously developed benchmark set of protein models. For each interactor, six models were generated with the model-to-native Cα RMSD in the 1 to 6 Å range. The models in the set were generated by a new approach, which corresponds to the actual modeling of new protein structures in the “real case scenario,” as opposed to the previous set, where a significant number of structures were model-like only. In addition, the larger number of complexes (165 vs. 63 in the previous set) increases the statistical reliability of the benchmarking. We estimated the highest accuracy of the predicted complexes (according to CAPRI criteria), which can be attained using the benchmark structures. The set is available at http://dockground.bioinformatics.ku.edu. PMID:25712716
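The inaccuracy levels above are defined by model-to-native Cα RMSD; for reference, a minimal sketch of that measure after optimal superposition (Kabsch algorithm) is given below, assuming the model and native Cα coordinates are already matched residue by residue.

```python
# Sketch: C-alpha RMSD between a model and the native structure after optimal
# superposition (Kabsch). Inputs are matched N x 3 coordinate arrays.
import numpy as np

def ca_rmsd(model: np.ndarray, native: np.ndarray) -> float:
    p = model - model.mean(axis=0)
    q = native - native.mean(axis=0)
    u, _, vt = np.linalg.svd(p.T @ q)
    d = np.sign(np.linalg.det(vt.T @ u.T))              # guard against improper rotation
    rot = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    diff = (rot @ p.T).T - q
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))
```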
The essential value of long-term experimental data for hydrology and water management
NASA Astrophysics Data System (ADS)
Tetzlaff, Doerthe; Carey, Sean K.; McNamara, James P.; Laudon, Hjalmar; Soulsby, Chris
2017-04-01
Observations and data from long-term experimental watersheds are the foundation of hydrology as a geoscience. They allow us to benchmark process understanding, observe trends and natural cycles, and are prerequisites for testing predictive models. Long-term experimental watersheds also are places where new measurement technologies are developed. These studies offer a crucial evidence base for understanding and managing the provision of clean water supplies, predicting and mitigating the effects of floods, and protecting ecosystem services provided by rivers and wetlands. They also show how to manage land and water in an integrated, sustainable way that reduces environmental and economic costs.
Benchmarking Discount Rate in Natural Resource Damage Assessment with Risk Aversion.
Wu, Desheng; Chen, Shuzhen
2017-08-01
Benchmarking a credible discount rate is of crucial importance in natural resource damage assessment (NRDA) and restoration evaluation. This article integrates a holistic framework of NRDA with prevailing low discount rate theory, and proposes a discount rate benchmarking decision support system based on service-specific risk aversion. The proposed approach has the flexibility of choosing appropriate discount rates for gauging long-term services, as opposed to decisions based simply on duration. It improves injury identification in NRDA since potential damages and side-effects to ecosystem services are revealed within the service-specific framework. A real embankment case study demonstrates valid implementation of the method. © 2017 Society for Risk Analysis.
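To see why the choice matters for long-lived services, a small sketch of the present-value arithmetic is given below; the rates, horizons and benefit stream are illustrative only and are not taken from the article's decision support system.

```python
# Sketch: present value of a constant annual benefit under different discount
# rates, illustrating the sensitivity of long-horizon services to the rate.
def present_value(annual_benefit: float, years: int, rate: float) -> float:
    return sum(annual_benefit / (1.0 + rate) ** t for t in range(1, years + 1))

pv_short_service = present_value(100_000, years=10, rate=0.07)    # short-lived service, conventional rate
pv_long_low_rate = present_value(100_000, years=100, rate=0.01)   # long-lived service, risk-averse low rate
pv_long_std_rate = present_value(100_000, years=100, rate=0.07)   # same service, conventional rate
```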
Benchmarking facilities providing care: An international overview of initiatives
Thonon, Frédérique; Watson, Jonathan; Saghatchian, Mahasti
2015-01-01
We performed a literature review of existing benchmarking projects of health facilities to explore (1) the rationales for those projects, (2) the motivation for health facilities to participate, (3) the indicators used and (4) the success and threat factors linked to those projects. We studied both peer-reviewed and grey literature. We examined 23 benchmarking projects of different medical specialities. The majority of projects used a mix of structure, process and outcome indicators. For some projects, participants had a direct or indirect financial incentive to participate (such as reimbursement by Medicaid/Medicare or litigation costs related to quality of care). A positive impact was reported for most projects, mainly in terms of improvement of practice and adoption of guidelines and, to a lesser extent, improvement in communication. Only 1 project reported positive impact in terms of clinical outcomes. Success factors and threats are linked to both the benchmarking process (such as organisation of meetings, link with existing projects) and indicators used (such as adjustment for diagnostic-related groups). The results of this review will help coordinators of a benchmarking project to set it up successfully. PMID:26770800
NASA Technical Reports Server (NTRS)
Stewart, H. E.; Blom, R.; Abrams, M.; Daily, M.
1980-01-01
Satellite synthetic aperture radar (SAR) imagery is evaluated in terms of its geologic applications. The benchmark to which the SAR images are compared is LANDSAT, used for both structural and lithologic interpretations.
Schaub, Michael T.; Delvenne, Jean-Charles; Yaliraki, Sophia N.; Barahona, Mauricio
2012-01-01
In recent years, there has been a surge of interest in community detection algorithms for complex networks. A variety of computational heuristics, some with a long history, have been proposed for the identification of communities or, alternatively, of good graph partitions. In most cases, the algorithms maximize a particular objective function, thereby finding the ‘right’ split into communities. Although a thorough comparison of algorithms is still lacking, there has been an effort to design benchmarks, i.e., random graph models with known community structure against which algorithms can be evaluated. However, popular community detection methods and benchmarks normally assume an implicit notion of community based on clique-like subgraphs, a form of community structure that is not always characteristic of real networks. Specifically, networks that emerge from geometric constraints can have natural non clique-like substructures with large effective diameters, which can be interpreted as long-range communities. In this work, we show that long-range communities escape detection by popular methods, which are blinded by a restricted ‘field-of-view’ limit, an intrinsic upper scale on the communities they can detect. The field-of-view limit means that long-range communities tend to be overpartitioned. We show how by adopting a dynamical perspective towards community detection [1], [2], in which the evolution of a Markov process on the graph is used as a zooming lens over the structure of the network at all scales, one can detect both clique- or non clique-like communities without imposing an upper scale to the detection. Consequently, the performance of algorithms on inherently low-diameter, clique-like benchmarks may not always be indicative of equally good results in real networks with local, sparser connectivity. We illustrate our ideas with constructive examples and through the analysis of real-world networks from imaging, protein structures and the power grid, where a multiscale structure of non clique-like communities is revealed. PMID:22384178
Benchmarking Methods and Data Sets for Ligand Enrichment Assessment in Virtual Screening
Xia, Jie; Tilahun, Ermias Lemma; Reid, Terry-Elinor; Zhang, Liangren; Wang, Xiang Simon
2014-01-01
Retrospective small-scale virtual screening (VS) based on benchmarking data sets has been widely used to estimate ligand enrichments of VS approaches in the prospective (i.e. real-world) efforts. However, the intrinsic differences of benchmarking sets to the real screening chemical libraries can cause biased assessment. Herein, we summarize the history of benchmarking methods as well as data sets and highlight three main types of biases found in benchmarking sets, i.e. “analogue bias”, “artificial enrichment” and “false negative”. In addition, we introduced our recent algorithm to build maximum-unbiased benchmarking sets applicable to both ligand-based and structure-based VS approaches, and its implementations to three important human histone deacetylase (HDAC) isoforms, i.e. HDAC1, HDAC6 and HDAC8. The Leave-One-Out Cross-Validation (LOO CV) demonstrates that the benchmarking sets built by our algorithm are maximum-unbiased in terms of property matching, ROC curves and AUCs. PMID:25481478
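For completeness, a minimal sketch of the enrichment metrics referred to above (ROC AUC and an early enrichment factor) is shown below; the score and label arrays are hypothetical stand-ins for a ranking produced by a VS method over a benchmarking set.

```python
# Sketch: ligand-enrichment metrics from a virtual-screening ranking over a
# benchmarking set of actives and decoys. Inputs are hypothetical.
import numpy as np

def roc_auc(scores, is_active):
    labels = np.asarray(is_active, dtype=bool)[np.argsort(-np.asarray(scores))]
    tpr = np.cumsum(labels) / labels.sum()
    fpr = np.cumsum(~labels) / (~labels).sum()
    return float(np.trapz(tpr, fpr))

def enrichment_factor(scores, is_active, fraction=0.01):
    labels = np.asarray(is_active, dtype=bool)[np.argsort(-np.asarray(scores))]
    n_top = max(1, int(round(fraction * labels.size)))
    return float(labels[:n_top].mean() / labels.mean())
```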
The Filament Sensor for Near Real-Time Detection of Cytoskeletal Fiber Structures
Eltzner, Benjamin; Wollnik, Carina; Gottschlich, Carsten; Huckemann, Stephan; Rehfeldt, Florian
2015-01-01
A reliable extraction of filament data from microscopic images is of high interest in the analysis of acto-myosin structures as early morphological markers in mechanically guided differentiation of human mesenchymal stem cells and the understanding of the underlying fiber arrangement processes. In this paper, we propose the filament sensor (FS), a fast and robust processing sequence which detects and records location, orientation, length, and width for each single filament of an image, and thus allows for the above described analysis. The extraction of these features has previously not been possible with existing methods. We evaluate the performance of the proposed FS in terms of accuracy and speed in comparison to three existing methods with respect to their limited output. Further, we provide a benchmark dataset of real cell images along with filaments manually marked by a human expert as well as simulated benchmark images. The FS clearly outperforms existing methods in terms of computational runtime and filament extraction accuracy. The implementation of the FS and the benchmark database are available as open source. PMID:25996921
Long-term integrating samplers for indoor air and sub slab soil gas at VI sites
Vapor intrusion (VI) site assessments are plagued by substantial spatial and temporal variability that makes exposure and risk assessment difficult. Most risk-based decision making for volatile organic compound (VOC) exposure in the indoor environment is based on health benchmark...
A Pervaporation Study of Ammonia Solutions Using Molecular Sieve Silica Membranes
Yang, Xing; Fraser, Thomas; Myat, Darli; Smart, Simon; Zhang, Jianhua; Diniz da Costa, João C.; Liubinas, Audra; Duke, Mikel
2014-01-01
An innovative concept is proposed to recover ammonia from industrial wastewater using a molecular sieve silica membrane in pervaporation (PV), benchmarked against vacuum membrane distillation (VMD). Cobalt and iron doped molecular sieve silica-based ceramic membranes were evaluated based on the ammonia concentration factor downstream and long-term performance. A modified low-temperature membrane evaluation system was utilized, featuring the ability to capture and measure ammonia in the permeate. It was found that the silica membrane with confirmed molecular sieving features had higher water selectivity over ammonia. This was due to a size selectivity mechanism that favoured water, but blocked ammonia. However, a cobalt doped silica membrane previously treated with high temperature water solutions demonstrated extraordinary preference towards ammonia by achieving up to a 50,000 mg/L ammonia concentration (a reusable concentration level) measured in the permeate when fed with 800 mg/L of ammonia solution. This exceeded the concentration factor expected by the benchmark VMD process by four-fold, suspected to be due to the competitive adsorption of ammonia over water into the silica structure with pores now large enough to accommodate ammonia. However, this membrane showed a gradual decline in selectivity, suspected to be due to the degradation of the silica material/pore structure after several hours of operation. PMID:24957120
Towards accurate localization: long- and short-term correlation filters for tracking
NASA Astrophysics Data System (ADS)
Li, Minglangjun; Tian, Chunna
2018-04-01
Visual tracking is a challenging problem, especially using a single model. In this paper, we propose a discriminative correlation filter (DCF) based tracking approach that exploits both the long-term and short-term information of the target, named LSTDCF, to improve the tracking performance. In addition to a long-term filter learned through the whole sequence, a short-term filter is trained using only features extracted from most recent frames. The long-term filter tends to capture more semantics of the target as more frames are used for training. However, since the target may undergo large appearance changes, features extracted around the target in non-recent frames prevent the long-term filter from locating the target in the current frame accurately. In contrast, the short-term filter learns more spatial details of the target from recent frames but gets over-fitting easily. Thus the short-term filter is less robust to handle cluttered background and prone to drift. We take the advantage of both filters and fuse their response maps to make the final estimation. We evaluate our approach on a widely-used benchmark with 100 image sequences and achieve state-of-the-art results.
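A minimal sketch of the two-timescale idea follows, using standard single-channel correlation-filter algebra: one filter template is updated with a small learning rate (long-term memory) and one with a large rate (short-term memory), and their response maps are fused before locating the target. The learning rates, regularization and fusion weight are illustrative and do not reproduce the paper's LSTDCF.

```python
# Sketch: a two-timescale discriminative correlation filter with fused responses.
# Single-channel, MOSSE-style algebra; parameters are illustrative.
import numpy as np

class TwoTimescaleDCF:
    def __init__(self, first_patch, gauss_label, lam=1e-3, lr_long=0.005, lr_short=0.08):
        self.y_f = np.fft.fft2(gauss_label)            # desired Gaussian response
        self.lam = lam
        self.lr = {"long": lr_long, "short": lr_short}
        x_f = np.fft.fft2(first_patch)
        self.num = {k: self.y_f * np.conj(x_f) for k in self.lr}
        self.den = {k: x_f * np.conj(x_f) + lam for k in self.lr}

    def respond(self, patch, w_long=0.6):
        z_f = np.fft.fft2(patch)
        maps = {k: np.real(np.fft.ifft2(self.num[k] / self.den[k] * z_f)) for k in self.lr}
        return w_long * maps["long"] + (1.0 - w_long) * maps["short"]

    def update(self, patch):
        x_f = np.fft.fft2(patch)
        for k, lr in self.lr.items():
            self.num[k] = (1 - lr) * self.num[k] + lr * self.y_f * np.conj(x_f)
            self.den[k] = (1 - lr) * self.den[k] + lr * (x_f * np.conj(x_f) + self.lam)
```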
2017-01-01
The authors use four criteria to examine a novel community detection algorithm: (a) effectiveness in terms of producing high values of normalized mutual information (NMI) and modularity, using well-known social networks for testing; (b) examination, meaning the ability to examine mitigating resolution limit problems using NMI values and synthetic networks; (c) correctness, meaning the ability to identify useful community structure results in terms of NMI values and Lancichinetti-Fortunato-Radicchi (LFR) benchmark networks; and (d) scalability, or the ability to produce comparable modularity values with fast execution times when working with large-scale real-world networks. In addition to describing a simple hierarchical arc-merging (HAM) algorithm that uses network topology information, we introduce rule-based arc-merging strategies for identifying community structures. Five well-studied social network datasets and eight sets of LFR benchmark networks were employed to validate the correctness of a ground-truth community, eight large-scale real-world complex networks were used to measure its efficiency, and two synthetic networks were used to determine its susceptibility to two resolution limit problems. Our experimental results indicate that the proposed HAM algorithm exhibited satisfactory performance efficiency, and that HAM-identified and ground-truth communities were comparable in terms of social and LFR benchmark networks, while mitigating resolution limit problems. PMID:29121100
Assessment of soil health in the central claypan region, Missouri
USDA-ARS?s Scientific Manuscript database
Assessment of soil health involves determining how well a soil is performing its biological, chemical, and physical functions relative to its inherent potential. Within the Central Claypan Region of Missouri, the Salt River Basin was selected as a benchmark watershed to assess long-term effects of c...
Super Energy Efficiency Design (S.E.E.D.) Home Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
German, A.; Dakin, B.; Backman, C.
This report describes the results of evaluation by the Alliance for Residential Building Innovation (ARBI) Building America team of the 'Super Energy Efficient Design' (S.E.E.D) home, a 1,935 sq. ft., single-story spec home located in Tucson, AZ. This prototype design was developed with the goal of providing an exceptionally energy efficient yet affordable home and includes numerous aggressive energy features intended to significantly reduce heating and cooling loads such as structural insulated panel (SIP) walls and roof, high performance windows, an ERV, an air-to-water heat pump with mixed-mode radiant and forced air delivery, solar water heating, and rooftop PV. Source energy savings are estimated at 45% over the Building America B10 Benchmark. System commissioning, short-term testing, long-term monitoring and detailed analysis of results were conducted to identify the performance attributes and cost effectiveness of the whole-house measure package.
Use of outsourced nurses in long-term acute care hospitals: outcomes and leadership preferences.
Alvarez, M Raymond; Kerr, Bernard J; Burtner, Joan; Ledlow, Gerald; Fulton, Larry V
2011-02-01
When staffing effectiveness is not maintained over time, the likelihood of negative outcomes increases. This challenge is particularly problematic in long-term acute care hospitals (LTACHs) where use of outsourced temporary nurses is common when providing safe, sufficient care to medically complex patients who require longer hospital stays than normally would occur. To assess this issue, the authors discuss the outcomes of their survey of LTACH chief nursing officers that demonstrated LTACH quality indicators and overall patient satisfaction were within nationally accepted benchmarks even with higher levels of outsourced nurses used in this post-acute care setting.
High hardness and superlative oxidation resistance in a pseudo-icosahedral Cr-Al binary
NASA Astrophysics Data System (ADS)
Simonson, J. W.; Rosa, R.; Antonacci, A. K.; He, H.; Bender, A. D.; Pabla, J.; Adrip, W.; McNally, D. E.; Zebro, A.; Kamenov, P.; Geschwind, G.; Ghose, S.; Dooryhee, E.; Ibrahim, A.; Aronson, M. C.
Improving the efficiency of fossil fuel plants is a practical option for decreasing carbon dioxide emissions from electrical power generation. Present limits on the operating temperatures of exposed steel components, however, restrict steam temperatures and therefore energy efficiency. Even as a new generation of creep-resistant, high strength steels retain long term structural stability to temperatures as high as ~ 973 K, the low Cr-content of these alloys hinders their oxidation resistance, necessitating the development of new corrosion resistant coatings. We report here the nearly ideal properties of potential coating material Cr55Al229, which exhibits high hardness at room temperature as well as low thermal conductivity and superlative oxidation resistance at 973 K, with an oxidation rate at least three times smaller than those of benchmark materials. These properties originate from a pseudo-icosahedral crystal structure, suggesting new criteria for future research.
U.S. EPA Superfund Program's Policy for Risk and Dose Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Stuart
2008-01-15
The Environmental Protection Agency (EPA) Office of Superfund Remediation and Technology Innovation (OSRTI) has primary responsibility for implementing the long-term (non-emergency) portion of a key U.S. law regulating cleanup: the Comprehensive Environmental Response, Compensation and Liability Act, CERCLA, nicknamed 'Superfund'. The purpose of the Superfund program is to protect human health and the environment over the long term from releases or potential releases of hazardous substances from abandoned or uncontrolled hazardous waste sites. The focus of this paper is on risk and dose assessment policies and tools for addressing radioactively contaminated sites by the Superfund program. EPA has almost completed two risk assessment tools that are particularly relevant to decommissioning activities conducted under CERCLA authority. These are: 1. Building Preliminary Remediation Goals for Radionuclides (BPRG) electronic calculator, and 2. Radionuclide Outdoor Surfaces Preliminary Remediation Goals (SPRG) electronic calculator. EPA developed the BPRG calculator to help standardize the evaluation and cleanup of radiologically contaminated buildings at which risk is being assessed for occupancy. BPRGs are radionuclide concentrations in dust, air and building materials that correspond to a specified level of human cancer risk. The intent of the SPRG calculator is to address hard outside surfaces such as building slabs, outside building walls, sidewalks and roads. SPRGs are radionuclide concentrations in dust and hard outside surface materials. EPA is also developing the 'Radionuclide Ecological Benchmark' calculator. This calculator provides biota concentration guides (BCGs), also known as ecological screening benchmarks, for use in ecological risk assessments at CERCLA sites. This calculator is intended to develop ecological benchmarks as part of the EPA guidance 'Ecological Risk Assessment Guidance for Superfund: Process for Designing and Conducting Ecological Risk Assessments'. The calculator develops ecological benchmarks for ionizing radiation based on cell death only.
Validation project. This report describes the procedure used to generate the noise models' output dataset, and then it compares that dataset to the...benchmark, the Engineer Research and Development Center's Long-Range Sound Propagation dataset. It was found that the models consistently underpredict the
For more than three decades chronic studies in rodents have been the benchmark for assessing the potential long-term toxicity, and particularly the carcinogenicity, of chemicals. With doses typically administered for about 2 years (18 months to lifetime), the rodent bioassay has ...
The Role of Deformation Energetics in Long-Term Tectonic Modeling
NASA Astrophysics Data System (ADS)
Ahamed, S.; Choi, E.
2017-12-01
The deformation-related energy budget is usually considered in the simplest form or even entirely omitted from the energy balance equation. We derive a full energy balance equation that accounts not only for heat energy but also for mechanical (elastic, plastic and viscous) work. The derived equation is implemented in DES3D, an unstructured finite element solver for long-term tectonic deformation. We verify the implementation by comparing numerical solutions to the corresponding semi-analytic solutions in three benchmarks extended from the classical oedometer test. We also investigate the long-term effects of deformation energetics on the evolution of large offset normal faults. We find that the models considering the full energy balance equation tend to produce more secondary faults and an elongated core complex. Our results for the normal fault system confirm that persistent inelastic deformation has a significant impact on the long-term evolution of faults, motivating further exploration of the role of the full energy balance equation in other geodynamic systems.
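For orientation, a commonly used form of such a balance is sketched below (a generic statement, not necessarily the exact equation derived in the work above): temperature T evolves through conduction, radiogenic heating H, and shear heating from the dissipative plastic and viscous parts of the work, with a Taylor-Quinney-type factor chi controlling how much plastic work is converted to heat rather than stored.

```latex
\rho c_p \frac{DT}{Dt}
  = \nabla \cdot \left( k \, \nabla T \right)
  + \rho H
  + \chi \, \boldsymbol{\sigma} : \dot{\boldsymbol{\varepsilon}}^{\mathrm{pl}}
  + \boldsymbol{\sigma} : \dot{\boldsymbol{\varepsilon}}^{\mathrm{vis}}
```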
The essential value of long-term experimental data for hydrology and water management
NASA Astrophysics Data System (ADS)
Tetzlaff, D.; Carey, S. K.; McNamara, J. P.; Laudon, H.; Soulsby, C.
2017-12-01
Observations and data from long-term experimental watersheds are the foundation of hydrology as a geoscience. They allow us to benchmark process understanding, observe trends and natural cycles, and are pre-requisites for testing predictive models. Long-term experimental watersheds also are places where new measurement technologies are developed. These studies offer a crucial evidence base for understanding and managing the provision of clean water supplies; predicting and mitigating the effects of floods, and protecting ecosystem services provided by rivers and wetlands. They also show how to manage land and water in an integrated, sustainable way that reduces environmental and economic costs. We present a number of compelling examples illustrating how hydrologic process understanding has been generated through comparing hypotheses to data, and how this understanding has been essential for managing water supplies, floods, and ecosystem services today.
Protein remote homology detection based on bidirectional long short-term memory.
Li, Shumin; Chen, Junjie; Liu, Bin
2017-10-10
Protein remote homology detection plays a vital role in studies of protein structures and functions. Almost all of the traditional machine learning methods require fixed-length features to represent the protein sequences. However, it is never an easy task to extract the discriminative features with limited knowledge of proteins. On the other hand, deep learning techniques have demonstrated their advantage in automatically learning representations. It is therefore worthwhile to explore the applications of deep learning techniques to protein remote homology detection. In this study, we employ the Bidirectional Long Short-Term Memory (BLSTM) to learn effective features from pseudo proteins and propose a predictor called ProDec-BLSTM, which includes an input layer, a bidirectional LSTM, a time-distributed dense layer and an output layer. This neural network can automatically extract the discriminative features by using the bidirectional LSTM and the time-distributed dense layer. Experimental results on a widely-used benchmark dataset show that ProDec-BLSTM outperforms other related methods in terms of both the mean ROC and mean ROC50 scores. This promising result shows that ProDec-BLSTM is a useful tool for protein remote homology detection. Furthermore, the hidden patterns learnt by ProDec-BLSTM can be interpreted and visualized, and therefore, additional useful information can be obtained.
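As a rough illustration of the layer stack named above (input, bidirectional LSTM, time-distributed dense, output), a minimal Keras sketch is given below; the sequence length, feature encoding, layer sizes and the final pooling-plus-softmax head are assumptions made here for illustration and are not taken from the paper.

import tensorflow as tf
from tensorflow.keras import layers, models

def build_blstm_detector(seq_len=400, n_features=20, n_hidden=64, n_classes=2):
    # Hypothetical sketch: encoded pseudo-protein sequences in, homolog/non-homolog label out.
    model = models.Sequential([
        layers.Input(shape=(seq_len, n_features)),
        layers.Bidirectional(layers.LSTM(n_hidden, return_sequences=True)),
        layers.TimeDistributed(layers.Dense(32, activation="relu")),
        layers.GlobalAveragePooling1D(),  # collapse per-position features to one vector (assumption)
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
    return model

model = build_blstm_detector()
model.summary()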
Vreck, D; Gernaey, K V; Rosen, C; Jeppsson, U
2006-01-01
In this paper, implementation of the Benchmark Simulation Model No 2 (BSM2) within Matlab-Simulink is presented. The BSM2 is developed for plant-wide WWTP control strategy evaluation on a long-term basis. It consists of a pre-treatment process, an activated sludge process and sludge treatment processes. Extended evaluation criteria are proposed for plant-wide control strategy assessment. Default open-loop and closed-loop strategies are also proposed to be used as references with which to compare other control strategies. Simulations indicate that the BSM2 is an appropriate tool for plant-wide control strategy evaluation.
Framewise phoneme classification with bidirectional LSTM and other neural network architectures.
Graves, Alex; Schmidhuber, Jürgen
2005-01-01
In this paper, we present bidirectional Long Short Term Memory (LSTM) networks, and a modified, full gradient version of the LSTM learning algorithm. We evaluate Bidirectional LSTM (BLSTM) and several other network architectures on the benchmark task of framewise phoneme classification, using the TIMIT database. Our main findings are that bidirectional networks outperform unidirectional ones, and Long Short Term Memory (LSTM) is much faster and also more accurate than both standard Recurrent Neural Nets (RNNs) and time-windowed Multilayer Perceptrons (MLPs). Our results support the view that contextual information is crucial to speech processing, and suggest that BLSTM is an effective architecture with which to exploit it.
Model evaluation using a community benchmarking system for land surface models
NASA Astrophysics Data System (ADS)
Mu, M.; Hoffman, F. M.; Lawrence, D. M.; Riley, W. J.; Keppel-Aleks, G.; Kluzek, E. B.; Koven, C. D.; Randerson, J. T.
2014-12-01
Evaluation of atmosphere, ocean, sea ice, and land surface models is an important step in identifying deficiencies in Earth system models and developing improved estimates of future change. For the land surface and carbon cycle, the design of an open-source system has been an important objective of the International Land Model Benchmarking (ILAMB) project. Here we evaluated CMIP5 and CLM models using a benchmarking system that enables users to specify models, data sets, and scoring systems so that results can be tailored to specific model intercomparison projects. Our scoring system used information from four different aspects of global datasets, including climatological mean spatial patterns, seasonal cycle dynamics, interannual variability, and long-term trends. Variable-to-variable comparisons enable investigation of the mechanistic underpinnings of model behavior, and allow for some control of biases in model drivers. Graphics modules allow users to evaluate model performance at local, regional, and global scales. Use of modular structures makes it relatively easy for users to add new variables, diagnostic metrics, benchmarking datasets, or model simulations. Diagnostic results are automatically organized into HTML files, so users can conveniently share results with colleagues. We used this system to evaluate atmospheric carbon dioxide, burned area, global biomass and soil carbon stocks, net ecosystem exchange, gross primary production, ecosystem respiration, terrestrial water storage, evapotranspiration, and surface radiation from CMIP5 historical and ESM historical simulations. We found that the multi-model mean often performed better than many of the individual models for most variables. We plan to publicly release a stable version of the software during fall of 2014 that has land surface, carbon cycle, hydrology, radiation and energy cycle components.
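To make the four-aspect scoring idea concrete, the toy sketch below combines per-aspect scores into a single model score; the 1 minus normalized-RMSE scoring and the equal weights are assumptions for illustration, not the actual ILAMB metrics.

import numpy as np

def aspect_score(model_field, obs_field):
    # Score one aspect as 1 - RMSE normalized by the observed variability, clipped to [0, 1].
    rmse = np.sqrt(np.nanmean((np.asarray(model_field) - np.asarray(obs_field)) ** 2))
    scale = np.nanstd(obs_field)
    return float(np.clip(1.0 - rmse / (scale + 1e-12), 0.0, 1.0))

def overall_score(model, obs, weights=(0.25, 0.25, 0.25, 0.25)):
    # model and obs are dicts keyed by the four aspects named in the text.
    aspects = ("mean_state", "seasonal_cycle", "interannual_variability", "trend")
    scores = [aspect_score(model[a], obs[a]) for a in aspects]
    return float(np.dot(weights, scores)), dict(zip(aspects, scores))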
When Does Length Cause the Word Length Effect?
ERIC Educational Resources Information Center
Jalbert, Annie; Neath, Ian; Bireta, Tamra J.; Surprenant, Aimee M.
2011-01-01
The word length effect, the finding that lists of short words are better recalled than lists of long words, has been termed one of the benchmark findings that any theory of immediate memory must account for. Indeed, the effect led directly to the development of working memory and the phonological loop, and it is viewed as the best remaining…
Using HFire for spatial modeling of fire in shrublands
Seth H. Peterson; Marco E. Morais; Jean M. Carlson; Philip E. Dennison; Dar A. Roberts; Max A. Moritz; David R. Weise
2009-01-01
An efficient raster fire-spread model named HFire is introduced. HFire can simulate single-fire events or long-term fire regimes, using the same fire-spread algorithm. This paper describes the HFire algorithm, benchmarks the model using a standard set of tests developed for FARSITE, and compares historical and predicted fire spread perimeters for three southern...
Unified Deep Learning Architecture for Modeling Biology Sequence.
Wu, Hongjie; Cao, Chengyuan; Xia, Xiaoyan; Lu, Qiang
2017-10-09
Prediction of the spatial structure or function of biological macromolecules based on their sequence remains an important challenge in bioinformatics. When modeling biological sequences using traditional sequencing models, characteristics, such as long-range interactions between basic units, the complicated and variable output of labeled structures, and the variable length of biological sequences, usually lead to different solutions on a case-by-case basis. This study proposed the use of bidirectional recurrent neural networks based on long short-term memory or a gated recurrent unit to capture long-range interactions by designing the optional reshape operator to adapt to the diversity of the output labels and implementing a training algorithm to support the training of sequence models capable of processing variable-length sequences. Additionally, the merge and pooling operators enhanced the ability to capture short-range interactions between basic units of biological sequences. The proposed deep-learning model and its training algorithm might be capable of solving currently known biological sequence-modeling problems through the use of a unified framework. We validated our model on one of the most difficult biological sequence-modeling problems currently known, with our results indicating the ability of the model to obtain predictions of protein residue interactions that exceeded the accuracy of current popular approaches by 10% based on multiple benchmarks.
Value of the distant future: Model-independent results
NASA Astrophysics Data System (ADS)
Katz, Yuri A.
2017-01-01
This paper shows that the model-independent account of correlations in an interest rate process or a log-consumption growth process leads to declining long-term tails of discount curves. Under the assumption of an exponentially decaying memory in fluctuations of risk-free real interest rates, I derive the analytical expression for an apt value of the long run discount factor and provide a detailed comparison of the obtained result with the outcome of the benchmark risk-free interest rate models. Utilizing the standard consumption-based model with an isoelastic power utility of the representative economic agent, I derive the non-Markovian generalization of the Ramsey discounting formula. The obtained analytical results, which allow simple calibration, may augment the rigorous cost-benefit and regulatory impact analysis of long-term environmental and infrastructure projects.
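For orientation, the familiar Markovian benchmark that this result generalizes is the Ramsey discounting rule with isoelastic (power) utility; in its standard extended form for lognormal consumption growth it reads

$$r \;=\; \delta \;+\; \eta\, g \;-\; \tfrac{1}{2}\,\eta^{2}\sigma^{2},$$

where $\delta$ is the rate of pure time preference, $\eta$ the elasticity of marginal utility, and $g$ and $\sigma^{2}$ the mean and variance of log-consumption growth. Persistent (long-memory) fluctuations in $g$ or in the interest rate are what make the effective long-run rate decline with horizon; the non-Markovian expression derived in the paper is not reproduced here.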
Structural Mechanics and Dynamics Branch
NASA Technical Reports Server (NTRS)
Stefko, George
2003-01-01
The 2002 annual report of the Structural Mechanics and Dynamics Branch reflects the majority of the work performed by the branch staff during the 2002 calendar year. Its purpose is to give a brief review of the branch's technical accomplishments. The Structural Mechanics and Dynamics Branch develops innovative computational tools, benchmark experimental data, and solutions to long-term barrier problems in the areas of propulsion aeroelasticity, active and passive damping, engine vibration control, rotor dynamics, magnetic suspension, structural mechanics, probabilistics, smart structures, engine system dynamics, and engine containment. Furthermore, the branch is developing a compact, nonpolluting, bearingless electric machine with electric power supplied by fuel cells for future "more electric" aircraft. An ultra-high-power-density machine that can generate projected power densities of 50 hp/lb or more, in comparison to conventional electric machines, which usually generate 0.2 hp/lb, is under development for application to electric drives for propulsive fans or propellers. In the future, propulsion and power systems will need to be lighter, to operate at higher temperatures, and to be more reliable in order to achieve higher performance and economic viability. The Structural Mechanics and Dynamics Branch is working to achieve these complex, challenging goals.
Wildenhain, Jan; Spitzer, Michaela; Dolma, Sonam; Jarvik, Nick; White, Rachel; Roy, Marcia; Griffiths, Emma; Bellows, David S.; Wright, Gerard D.; Tyers, Mike
2016-01-01
The network structure of biological systems suggests that effective therapeutic intervention may require combinations of agents that act synergistically. However, a dearth of systematic chemical combination datasets has limited the development of predictive algorithms for chemical synergism. Here, we report two large datasets of linked chemical-genetic and chemical-chemical interactions in the budding yeast Saccharomyces cerevisiae. We screened 5,518 unique compounds against 242 diverse yeast gene deletion strains to generate an extended chemical-genetic matrix (CGM) of 492,126 chemical-gene interaction measurements. This CGM dataset contained 1,434 genotype-specific inhibitors, termed cryptagens. We selected 128 structurally diverse cryptagens and tested all pairwise combinations to generate a benchmark dataset of 8,128 pairwise chemical-chemical interaction tests for synergy prediction, termed the cryptagen matrix (CM). An accompanying database resource called ChemGRID was developed to enable analysis, visualisation and downloads of all data. The CGM and CM datasets will facilitate the benchmarking of computational approaches for synergy prediction, as well as chemical structure-activity relationship models for anti-fungal drug discovery. PMID:27874849
NASA Astrophysics Data System (ADS)
Dokka, R. K.
2005-05-01
It has long been recognized that the south-central United States of America bordering the Gulf of Mexico (GOM) is actively subsiding, resulting in a slow, yet unrelenting inundation of the coast from south Texas to southwestern Alabama. Today's motions are but the latest chapter in the subsidence history of the GOM, a region that has accommodated the deposition of over 20 km of deltaic and continental margin sediments since mid Mesozoic time. Understanding the recent history of displacements and the processes responsible for subsidence is especially critical for near-term planning for coastal protection and restoration activities. Documentation of the true magnitude and geography of vertical motions of the surface through time has been hampered because previous measurement schemes did not employ reference datums of sufficient spatial and temporal precision. This situation has recently been improved through the analysis of National Geodetic Survey (NGS) 1st order leveling data from >2710 benchmarks in the region by Shinkle and Dokka (NOAA Technical Report 50 [2004]). That paper used original observations (not adjusted) and computed displacements and velocities related to NAVD88 for benchmarks visited during various leveling surveys from 1920 through 1995. Several important characteristics were observed and are summarized below. First, the data show that subsidence is not limited to areas of recent sediment accumulation such as the wetland areas of the modern delta (MRD) of the Mississippi River or its upstream alluvial valley (MAV), as supposed by most current syntheses. The entire coastal zone, as well as inland areas several hundred km from the shore, has subsided over the period of measurement. Regionally, vertical velocities range from less than -52 mm/yr in Louisiana to over +15 mm/yr in peripheral areas of eastern Mississippi-Alabama. The mean rate is ~-11 mm/yr in most coastal parishes of Louisiana. In the Mississippi River deltaic plain, subsidence was 2-3 times higher than estimates based on long-term geologic measurements. The data also indicate that adjacent alluvial ridges where the population is concentrated have been similarly affected. In the Chenier plain of southwest Louisiana, a region previously thought to be subsiding slowly, rates of sinking are similar to those of the deltaic plain. Second, spatial patterns suggest that motions at most locations may have both long (10-100 km) and short (<5 km) wavelength components. Gross aspects of some long wavelength motions can be explained by flexure produced by late Quaternary sediment loads such as the MRD and the MAV. Short wavelength spikes in motions correlate well with areas of fluid withdrawal, faults, and salt structures. Third, motions at many benchmarks have not been linear through time. For example, subsidence in ~10-30 km wide zones surrounding some active normal faults of south Louisiana declined as faulting has slowed (and vice versa). Subsidence in these areas reached a peak in 1970 and declined thereafter. Some local changes also correlate with changes in human-related activities (e.g., reduced groundwater pumping and slower subsidence in the Lake Charles area beginning in the late 1980s).
Benchmarking on Tsunami Currents with ComMIT
NASA Astrophysics Data System (ADS)
Sharghi vand, N.; Kanoglu, U.
2015-12-01
There were no standards for the validation and verification of tsunami numerical models before the 2004 Indian Ocean tsunami. Even so, a number of numerical models had been used for inundation mapping efforts, evaluation of critical structures, etc., without validation and verification. After 2004, the NOAA Center for Tsunami Research (NCTR) established standards for the validation and verification of tsunami numerical models (Synolakis et al. 2008 Pure Appl. Geophys. 165, 2197-2228), which will be used for the evaluation of critical structures such as nuclear power plants against tsunami attack. NCTR presented analytical, experimental and field benchmark problems aimed at estimating maximum runup, which have been widely accepted by the community. Recently, benchmark problems were suggested by the US National Tsunami Hazard Mitigation Program Mapping & Modeling Benchmarking Workshop: Tsunami Currents on February 9-10, 2015 at Portland, Oregon, USA (http://nws.weather.gov/nthmp/index.html). These benchmark problems concentrate on the validation and verification of tsunami numerical models with respect to tsunami currents. Three of the benchmark problems were: current measurement of the Japan 2011 tsunami in Hilo Harbor, Hawaii, USA and in Tauranga Harbor, New Zealand, and a single long-period wave propagating onto a small-scale experimental model of the town of Seaside, Oregon, USA. These benchmark problems were implemented in the Community Modeling Interface for Tsunamis (ComMIT) (Titov et al. 2011 Pure Appl. Geophys. 168, 2121-2131), which is a user-friendly interface to the validated and verified Method of Splitting Tsunami (MOST) (Titov and Synolakis 1995 J. Waterw. Port Coastal Ocean Eng. 121, 308-316) model and is developed by NCTR. The modeling results are compared with the required benchmark data, providing good agreement, and the results are discussed. Acknowledgment: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 603839 (Project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe)
Long-term monitoring of river basins: strengths and weaknesses, opportunities and threats
NASA Astrophysics Data System (ADS)
Howden, N. J. K.; Burt, T. P.
2016-12-01
In a world where equilibrium is more and more uncommon, monitoring is an essential way to discover whether undesirable change is taking place. Monitoring requires a deliberate plan of action: the regular collection and processing of information. Long-term data reveal important patterns, allowing trends, cycles, and rare events to be identified. This is particularly important for complex systems where signals may be subtle and slow to emerge. Moreover, very long data sets are essential to test hypotheses undreamt of at the time the monitoring was started. This overview includes long time series from UK river basins showing how hydrology and water quality have changed over time - and continue to change. An important conclusion is the long time frame of system recovery, well beyond the normal lifetime of individual governments or research grants. At a time of increasing hydroclimatic variability, long time series remain crucially important; in particular, continuity of observations is vital at key benchmark sites.
Pro-sustainability choices and child deaths averted: from project experience to investment strategy.
Sarriot, Eric G; Swedberg, Eric A; Ricca, James G
2011-05-01
The pursuit of the Millennium Development Goals and advancing the 'global health agenda' demand the achievement of health impact at scale through efficient investments. We have previously offered that sustainability-a necessary condition for successful expansion of programmes-can be addressed in practical terms. Based on benchmarks from actual child survival projects, we assess the expected impact of translating pro-sustainability choices into investment strategies. We review the experience of Save the Children US in Guinea in terms of investment, approach to sustainability and impact. It offers three benchmarks for impact: Entry project (21 lives saved of children under age five per US$100 000), Expansion project (37 LS/US$100k), and Continuation project (100 LS/US$100k). Extrapolating this experience, we model the impact of a traditional investment scenario against a pro-sustainability scenario and compare the deaths averted per dollar spent over five project cycles. The impact per dollar spent on a pro-sustainability strategy is 3.4 times that of a traditional one over the long run (range from 2.2 to 5.7 times in a sensitivity analysis). This large efficiency differential between two investment approaches offers a testable hypothesis for large-scale/long-term studies. The 'bang for the buck' of health programmes could be greatly increased by following a pro-sustainability investment strategy.
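A back-of-envelope sketch of the comparison is given below. The per-phase benchmark values (lives saved per US$100,000) are those quoted above; how each scenario sequences project phases over five cycles is an assumption made here for illustration, although it happens to reproduce a ratio close to the reported 3.4.

# Lives saved per US$100,000, per project phase (values quoted in the abstract).
LS_PER_100K = {"entry": 21, "expansion": 37, "continuation": 100}

def lives_saved(sequence, budget_100k_per_cycle=1.0):
    # Sum lives saved over the project cycles for a given sequencing of phases.
    return sum(LS_PER_100K[phase] * budget_100k_per_cycle for phase in sequence)

traditional = ["entry"] * 5                                          # re-enter new areas each cycle (assumption)
pro_sustainability = ["entry", "expansion"] + ["continuation"] * 3   # build on prior investment (assumption)

t, p = lives_saved(traditional), lives_saved(pro_sustainability)
print(f"traditional: {t:.0f}, pro-sustainability: {p:.0f}, ratio: {p / t:.1f}x")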
Alan A. Ager; Andrew J. McMahan; James J. Barrett; Charles W. McHugh
2007-01-01
We simulated long-term forest management activities on a 16,000-ha wildland-urban interface in the Blue Mountains near La Grande, Oregon. The study area is targeted for thinning and fuels treatments on both private and Federally managed lands to address forest health and sustainability concerns and reduce the risk of severe wildfire. We modeled a number of benchmark...
National Center for Advanced Manufacturing Overview
NASA Technical Reports Server (NTRS)
Vickers, J.
2001-01-01
The National Center for Advanced Manufacturing (NCAM) is a strategy, organization, and partnership focused on long-term technology development. The NCAM initially will be a regional partnership; however, the intent is national in scope. Benchmarking is needed to carry the concept through to the finished project, rather than relying on trial and error. Significant progress has been made to date, and NCAM is setting the vision for the future.
Zha, Hao; Latina, Andrea; Grudiev, Alexej; ...
2016-01-20
The baseline design of CLIC (Compact Linear Collider) uses X-band accelerating structures for its main linacs. In order to maintain beam stability in multibunch operation, long-range transverse wakefields must be suppressed by 2 orders of magnitude between successive bunches, which are separated in time by 0.5 ns. Such strong wakefield suppression is achieved by equipping every accelerating structure cell with four damping waveguides terminated with individual rf loads. A beam-based experiment to directly measure the effectiveness of this long-range transverse wakefield suppression and to benchmark simulations was made in the FACET test facility at SLAC using a prototype CLIC accelerating structure. The experiment showed good agreement with the simulations and a strong suppression of the wakefields, with an unprecedented minimum resolution of 0.1 V/(pC mm m).
Design and development of a community carbon cycle benchmarking system for CMIP5 models
NASA Astrophysics Data System (ADS)
Mu, M.; Hoffman, F. M.; Lawrence, D. M.; Riley, W. J.; Keppel-Aleks, G.; Randerson, J. T.
2013-12-01
Benchmarking has been widely used to assess the ability of atmosphere, ocean, sea ice, and land surface models to capture the spatial and temporal variability of observations during the historical period. For the carbon cycle and terrestrial ecosystems, the design and development of an open-source community platform has been an important goal as part of the International Land Model Benchmarking (ILAMB) project. Here we designed and developed a software system that enables the user to specify the models, benchmarks, and scoring systems so that results can be tailored to specific model intercomparison projects. We used this system to evaluate the performance of CMIP5 Earth system models (ESMs). Our scoring system used information from four different aspects of climate, including the climatological mean spatial pattern of gridded surface variables, seasonal cycle dynamics, the amplitude of interannual variability, and long-term decadal trends. We used this system to evaluate burned area, global biomass stocks, net ecosystem exchange, gross primary production, and ecosystem respiration from CMIP5 historical simulations. Initial results indicated that the multi-model mean often performed better than many of the individual models for most of the observational constraints.
Benchmark Simulation Model No 2: finalisation of plant layout and default control strategy.
Nopens, I; Benedetti, L; Jeppsson, U; Pons, M-N; Alex, J; Copp, J B; Gernaey, K V; Rosen, C; Steyer, J-P; Vanrolleghem, P A
2010-01-01
The COST/IWA Benchmark Simulation Model No 1 (BSM1) has been available for almost a decade. Its primary purpose has been to create a platform for control strategy benchmarking of activated sludge processes. The fact that the research work related to the benchmark simulation models has resulted in more than 300 publications worldwide demonstrates the interest in and need of such tools within the research community. Recent efforts within the IWA Task Group on "Benchmarking of control strategies for WWTPs" have focused on an extension of the benchmark simulation model. This extension aims at facilitating control strategy development and performance evaluation at a plant-wide level and, consequently, includes both pretreatment of wastewater as well as the processes describing sludge treatment. The motivation for the extension is the increasing interest and need to operate and control wastewater treatment systems not only at an individual process level but also on a plant-wide basis. To facilitate the changes, the evaluation period has been extended to one year. A prolonged evaluation period allows for long-term control strategies to be assessed and enables the use of control handles that cannot be evaluated in a realistic fashion in the one week BSM1 evaluation period. In this paper, the finalised plant layout is summarised and, as was done for BSM1, a default control strategy is proposed. A demonstration of how BSM2 can be used to evaluate control strategies is also given.
NASA Astrophysics Data System (ADS)
Carafa, Michele M. C.; Bird, Peter
2016-07-01
The lithosphere of Italy is exposed to a number of different short-term strain transients, including but not limited to landslides, postseismic relaxation, and volcanic inflation/deflation. These transients affect GPS velocities and complicate the assessment of the long-term tectonic component of the surface deformation. In a companion paper we present a method for anticipating the principal patterns of nontectonic, short-term strains and building this information into the covariance matrix of the geodetic velocities. In this work we apply this method to Italian GPS velocities to build an augmented covariance matrix that characterizes all expected discrepancies between short- and long-term velocities. We find that formal uncertainties usually reported for GPS measurements are smaller than the variability of the same benchmarks across a geologic time span. Furthermore, we include in our modeling the azimuths of most compressive horizontal principal stresses (SHmax) because GPS data cannot resolve the active kinematics of coastal and offshore areas. We find that the final tectonic model can be made relatively insensitive to short-term interfering processes if the augmented covariance matrix and SHmax data records are used in the objective function. This results in a preferred neotectonic model that is also in closer agreement with independent geologic and seismological constraints and has the advantage of reducing short-term biases in forecasts of long-term seismicity.
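The abstract does not spell out the objective function; schematically, a joint misfit of the kind described, in which the augmented covariance matrix weights the geodetic term and the SHmax azimuths enter as an additional constraint, might be written as

$$\Phi(\mathbf{m}) \;=\; \left(\mathbf{v}_{\mathrm{GPS}}-\mathbf{v}(\mathbf{m})\right)^{\mathsf T}\mathbf{C}_{\mathrm{aug}}^{-1}\left(\mathbf{v}_{\mathrm{GPS}}-\mathbf{v}(\mathbf{m})\right) \;+\; \sum_{k}\frac{\left(\theta_{k}^{\mathrm{obs}}-\theta_{k}(\mathbf{m})\right)^{2}}{\sigma_{\theta,k}^{2}},$$

where $\mathbf{v}(\mathbf{m})$ and $\theta_{k}(\mathbf{m})$ are the model-predicted velocities and most-compressive-stress azimuths; this form is illustrative only and is not taken from the paper.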
Benditz, A; Drescher, J; Greimel, F; Zeman, F; Grifka, J; Meißner, W; Völlner, F
2016-12-05
Perioperative pain reduction, particularly during the first two days, is highly important for patients after total knee arthroplasty (TKA). Problems are not only caused by medical issues but by organization and hospital structure. The present study shows how the quality of pain management can be increased by implementing a standardized pain concept and simple, consistent benchmarking. All patients included in the study had undergone total knee arthroplasty. Outcome parameters were analyzed by means of a questionnaire on the first postoperative day. A multidisciplinary team implemented a regular procedure of data analysis and external benchmarking by participating in a nationwide quality improvement project. At the beginning of the study, our hospital ranked 16th in terms of activity-related pain and 9th in patient satisfaction among 47 anonymized hospitals participating in the benchmarking project. At the end of the study, we had improved to 1st in activity-related pain and to 2nd in patient satisfaction. Although benchmarking started and finished with the same standardized pain management concept, results were initially poor. Besides pharmacological treatment, interdisciplinary teamwork and benchmarking with direct feedback mechanisms are also very important for decreasing postoperative pain and for increasing patient satisfaction after TKA.
Ferrada, Evandro; Vergara, Ismael A; Melo, Francisco
2007-01-01
The correct discrimination between native and near-native protein conformations is essential for achieving accurate computer-based protein structure prediction. However, this has proven to be a difficult task, since currently available physical energy functions, empirical potentials and statistical scoring functions are still limited in achieving this goal consistently. In this work, we assess and compare the ability of different full atom knowledge-based potentials to discriminate between native protein structures and near-native protein conformations generated by comparative modeling. Using a benchmark of 152 near-native protein models and their corresponding native structures that encompass several different folds, we demonstrate that the incorporation of close non-bonded pairwise atom terms improves the discriminating power of the empirical potentials. Since the direct and unbiased derivation of close non-bonded terms from current experimental data is not possible, we obtained and used those terms from the corresponding pseudo-energy functions of a non-local knowledge-based potential. It is shown that this methodology significantly improves the discrimination between native and near-native protein conformations, suggesting that a proper description of close non-bonded terms is important to achieve a more complete and accurate description of native protein conformations. Some external knowledge-based energy functions that are widely used in model assessment performed poorly, indicating that the benchmark of models and the specific discrimination task tested in this work constitutes a difficult challenge.
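For context, full-atom knowledge-based potentials of the kind assessed here are typically built as inverse-Boltzmann sums over atom pairs; in a generic (not paper-specific) form,

$$E(\text{model}) \;=\; -\,k_{B}T \sum_{i<j}\, \ln\!\frac{p_{\mathrm{obs}}\!\left(d_{ij}\mid a_{i},a_{j}\right)}{p_{\mathrm{ref}}\!\left(d_{ij}\right)},$$

where $d_{ij}$ is the distance between atoms $i$ and $j$ of types $a_{i}$ and $a_{j}$, $p_{\mathrm{obs}}$ is estimated from known structures and $p_{\mathrm{ref}}$ is a reference state. The close non-bonded terms discussed above correspond to the short-separation part of this sum, which, as the authors note, cannot be derived directly and unbiasedly from current experimental data.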
Stress Testing of Organic Light- Emitting Diode Panels and Luminaires
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Lynn; Rountree, Kelley; Mills, Karmann
This report builds on previous DOE efforts with OLED technology by updating information on a previously benchmarked OLED product (the Chalina luminaire from Acuity Brands) and provides new benchmarks on the performance of Brite 2 and Brite Amber OLED panels from OLEDWorks. During the tests described here, samples of these devices were subjected to continuous operation in stress tests at elevated ambient temperature environments of 35°C or 45°C. In addition, samples were also operated continuously at room temperature in a room temperature operational life test (RTOL). One goal of this study was to investigate whether these test conditions can accelerate failure of OLED panels, either through panel shorting or an open circuit in the panel. These stress tests are shown to provide meaningful acceleration of OLED failure modes, and an acceleration factor of 2.6 was calculated at 45°C for some test conditions. In addition, changes in the photometric properties of the emitted light (e.g., luminous flux and chromaticity maintenance) were also evaluated for insights into the long-term stability of these products compared to earlier generations. Because OLEDs are a lighting system, electrical testing was also performed on the panel-driver pairs to provide insights into the impact of the driver on long-term panel performance.
Deng, Lei; Wu, Hongjie; Liu, Chuyao; Zhan, Weihua; Zhang, Jingpu
2018-06-01
Long non-coding RNAs (lncRNAs) are involved in many biological processes, such as immune response, development, differentiation and gene imprinting, and are associated with diseases and cancers. However, the functions of the vast majority of lncRNAs are still unknown. Predicting the biological functions of lncRNAs is one of the key challenges in the post-genomic era. In our work, we first build a global network including a lncRNA similarity network, a lncRNA-protein association network and a protein-protein interaction network according to the expressions and interactions, and then extract the topological feature vectors of the global network. Using these features, we present an SVM-based machine learning approach, PLNRGO, to annotate human lncRNAs. In PLNRGO, we construct a training data set according to the proteins with GO annotations and train a binary classifier for each GO term. We assess the performance of PLNRGO on our manually annotated lncRNA benchmark and a protein-coding gene benchmark with known functional annotations. As a result, the performance of our method is significantly better than that of other state-of-the-art methods in terms of maximum F-measure and coverage. Copyright © 2018 Elsevier Ltd. All rights reserved.
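A minimal sketch of the "one binary classifier per GO term" step is shown below using scikit-learn; the feature matrix is assumed to hold the topological feature vectors described above, and the kernel and parameter choices are illustrative, not those of PLNRGO.

import numpy as np
from sklearn.svm import SVC

def train_per_term_classifiers(X, Y, go_terms, C=1.0):
    # X: (n_lncRNAs, n_topological_features); Y: (n_lncRNAs, n_go_terms) binary label matrix.
    classifiers = {}
    for j, term in enumerate(go_terms):
        clf = SVC(kernel="rbf", C=C, probability=True)
        clf.fit(X, Y[:, j])                    # one binary SVM per GO term
        classifiers[term] = clf
    return classifiers

def predict_annotations(classifiers, X_new, threshold=0.5):
    # Return a dict of boolean arrays: which new lncRNAs are predicted to carry each GO term.
    return {term: clf.predict_proba(X_new)[:, 1] >= threshold
            for term, clf in classifiers.items()}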
Benchmark Testing of the Largest Titanium Aluminide Sheet Subelement Conducted
NASA Technical Reports Server (NTRS)
Bartolotta, Paul A.; Krause, David L.
2000-01-01
To evaluate wrought titanium aluminide (gamma TiAl) as a viable candidate material for the High-Speed Civil Transport (HSCT) exhaust nozzle, an international team led by the NASA Glenn Research Center at Lewis Field successfully fabricated and tested the largest gamma TiAl sheet structure ever manufactured. The gamma TiAl sheet structure, a 56-percent subscale divergent flap subelement, was fabricated for benchmark testing in three-point bending. Overall, the subelement was 84-cm (33-in.) long by 13-cm (5-in.) wide by 8-cm (3-in.) deep. Incorporated into the subelement were features that might be used in the fabrication of a full-scale divergent flap. These features include the use of: (1) gamma TiAl shear clips to join together sections of corrugations, (2) multiple gamma TiAl face sheets, (3) double hot-formed gamma TiAl corrugations, and (4) brazed joints. The structural integrity of the gamma TiAl sheet subelement was evaluated by conducting a room-temperature three-point static bend test.
NASA Technical Reports Server (NTRS)
Chromiak, J. A.; Shansky, J.; Perrone, C.; Vandenburgh, H. H.
1998-01-01
Three-dimensional skeletal muscle organ-like structures (organoids) formed in tissue culture by fusion of proliferating myoblasts into parallel networks of long, unbranched myofibers provide an in vivo-like model for examining the effects of growth factors, tension, and space flight on muscle cell growth and metabolism. To determine the feasibility of maintaining either avian or mammalian muscle organoids in a commercial perfusion bioreactor system, we measured metabolism, protein turnover, and autocrine/paracrine growth factor release rates. Medium glucose was metabolized at a constant rate in both low-serum- and serum-free media for up to 30 d. Total organoid noncollagenous protein and DNA content decreased approximately 22-28% (P < 0.05) over a 13-d period. Total protein synthesis rates could be determined accurately in the bioreactors for up to 30 h and total protein degradation rates could be measured for up to 3 wk. Special fixation and storage conditions necessary for space flight studies were validated as part of the studies. For example, the anabolic autocrine/paracrine skeletal muscle growth factors prostaglandin F2alpha (PGF2alpha) and insulin-like growth factor-1 (IGF-1) could be measured accurately in collected media fractions, even after storage at 37 degrees C for up to 10 d. In contrast, creatine kinase activity (a marker of cell damage) in collected media fractions was unreliable. These results provide initial benchmarks for long-term ex vivo studies of tissue-engineered skeletal muscle.
The public gets what the public wants: experiences of public reporting in long-term care in Europe.
Rodrigues, Ricardo; Trigg, Lisa; Schmidt, Andrea E; Leichsenring, Kai
2014-05-01
Public reporting of quality in long-term care is advocated on the basis of allowing providers to improve their performance by benchmarking and supporting users to choose the best providers. Both mechanisms are intended to drive improvements in quality. However, there is relatively scarce comparative research on the experiences and impact of public reporting on quality in long-term care in Europe. Using information gathered from key informants by means of a structured questionnaire and country profiles, this paper discusses experiences with public reporting mechanisms in seven European countries and available information on their impact on quality in long-term care. Countries surveyed included a variety of public reporting schemes, ranging from pilot programmes to statutory mechanisms. Public reporting mechanisms more often focus on institutional care. Inspections carried out as part of a legal quality assurance framework are the main source of information gathering, supplemented by provider self-assessments in the context of internal quality management and user satisfaction surveys. Information on quality goes well beyond structural indicators to also include indicators on quality of life of users. Information is displayed using numerical scores (percentages), but also measures such as ratings (similar to school grades) and ticks and crosses. Only one country corrects for case-mix. The internet is the preferred medium for displaying information. There was little evidence to show whether public reporting has a significant impact on driving users' choices of provider. Studies reported low awareness of quality indicators among potential end users, and information was not always displayed in a convenient format, e.g. through complicated numerical scores. There is scarce evidence of public reporting directly causing improved quality, although the relative youth and the pilot characteristics of some of the schemes covered here could also have contributed to downplaying their impact. The establishment of public reporting mechanisms did however contribute to shaping the discussion on quality measurement in several of the countries surveyed. The findings presented in this paper highlight the need to consider some factors in the discussion of the impact of public reporting in long-term care, namely, the organisation of care markets, frequently characterised by limited competition; the circumstances under which user choice takes place, often made under conditions of duress; and the leadership conditions needed to bring about improvements in quality in different care settings. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Krause, David L.; Brewer, Ethan J.; Pawlik, Ralph
2013-01-01
This report provides test methodology details and qualitative results for the first structural benchmark creep test of an Advanced Stirling Convertor (ASC) heater head of ASC-E2 design heritage. The test article was recovered from a flight-like Microcast MarM-247 heater head specimen previously used in helium permeability testing. The test article was utilized for benchmark creep test rig preparation, wall thickness and diametral laser scan hardware metrological developments, and induction heater custom coil experiments. In addition, a benchmark creep test was performed, terminated after one week when through-thickness cracks propagated at thermocouple weld locations. Following this, it was used to develop a unique temperature measurement methodology using contact thermocouples, thereby enabling future benchmark testing to be performed without the use of conventional welded thermocouples, proven problematic for the alloy. This report includes an overview of heater head structural benchmark creep testing, the origin of this particular test article, test configuration developments accomplished using the test article, creep predictions for its benchmark creep test, qualitative structural benchmark creep test results, and a short summary.
Multilayer Optimization of Heterogeneous Networks Using Grammatical Genetic Programming.
Fenton, Michael; Lynch, David; Kucera, Stepan; Claussen, Holger; O'Neill, Michael
2017-09-01
Heterogeneous cellular networks are composed of macro cells (MCs) and small cells (SCs) in which all cells occupy the same bandwidth. Provision has been made under the Third Generation Partnership Project Long Term Evolution framework for enhanced intercell interference coordination (eICIC) between cell tiers. Expanding on previous works, this paper uses grammatical genetic programming to evolve control heuristics for heterogeneous networks. Three aspects of the eICIC framework are addressed, including setting SC powers and selection biases, MC duty cycles, and scheduling of user equipments (UEs) at SCs. The evolved heuristics yield minimum downlink rates three times higher than a baseline method, and twice that of a state-of-the-art benchmark. Furthermore, a greater number of UEs receive transmissions under the proposed scheme than in either the baseline or benchmark cases.
Benchmarks for target tracking
NASA Astrophysics Data System (ADS)
Dunham, Darin T.; West, Philip D.
2011-09-01
The term benchmark originates from the chiseled horizontal marks that surveyors made, into which an angle-iron could be placed to bracket ("bench") a leveling rod, thus ensuring that the leveling rod can be repositioned in exactly the same place in the future. A benchmark in computer terms is the result of running a computer program, or a set of programs, in order to assess the relative performance of an object by running a number of standard tests and trials against it. This paper will discuss the history of simulation benchmarks that are being used by multiple branches of the military and agencies of the US government. These benchmarks range from missile defense applications to chemical biological situations. Typically, a benchmark is used with Monte Carlo runs in order to tease out how algorithms deal with variability and the range of possible inputs. We will also describe problems that can be solved by a benchmark.
Denost, Quentin; Saillour, Florence; Masya, Lindy; Martinaud, Helene Maillou; Guillon, Stephanie; Kret, Marion; Rullier, Eric; Quintard, Bruno; Solomon, Michael
2016-04-04
Among patients with rectal cancer, 5-10% have a primary rectal cancer beyond the total mesorectal excision plane (PRC-bTME) and 10% recur locally following primary surgery (LRRC). In both cases, patients' care remains challenging, with a significant worldwide variation in practice regarding overall management and criteria for operative intervention. These variations in practice can be explained by structural and organizational differences, as well as cultural dissimilarities. However, surgical resection of PRC-bTME and LRRC provides the best chance of long-term survival after complete resection (R0). With regards to the organization of the healthcare system and the operative criteria for these patients, France and Australia seem to be highly different. A benchmarking-type analysis between French and Australian clinical practice, with regards to the care and management of PRC-bTME and LRRC, would allow understanding of patients' care and management structures as well as individual and collective mechanisms of operative decision-making, in order to ensure equitable practice and improve survival for these patients. The current study is an international benchmarking trial comparing two cohorts of 120 consecutive patients with non-metastatic PRC-bTME and LRRC. Patients with curative and palliative treatment intent are included. The study design has three main parts: (1) French and Australian cohorts including clinical, radiological and surgical data, quality of life (MOS SF36, FACT-C) and distress level (Distress Thermometer) at inclusion, 6 and 12 months; (2) experimental analyses consisting of a blinded inter-country reading of pelvic MRI to assess operative decisions; (3) qualitative analyses based on MDT meeting observation, semi-structured interviews and focus groups of health professional attendees, conducted by a research psychologist in both countries using the same guides. The primary endpoint will be the clinical resection rate. Secondary endpoints will be the concordance rate between French and Australian operative decisions based on the inter-country MRI reading, post-operative mortality and morbidity rates, oncological outcomes based on resection status and one-year overall and disease-free survival, patients' quality of life and distress level. Qualitative analysis will compare obstacles and facilitators of operative decision-making between both countries. Benchmarking can be defined as a comparison and learning process which, in the context of PRC-bTME and LRRC, will allow the whole process of managing these patients to be understood and shared between France and Australia. NCT02551471 (date of registration: 09/14/2015).
How do I know if my forecasts are better? Using benchmarks in hydrological ensemble prediction
NASA Astrophysics Data System (ADS)
Pappenberger, F.; Ramos, M. H.; Cloke, H. L.; Wetterhall, F.; Alfieri, L.; Bogner, K.; Mueller, A.; Salamon, P.
2015-03-01
The skill of a forecast can be assessed by comparing the relative proximity of both the forecast and a benchmark to the observations. Example benchmarks include climatology or a naïve forecast. Hydrological ensemble prediction systems (HEPS) are currently transforming the hydrological forecasting environment but in this new field there is little information to guide researchers and operational forecasters on how benchmarks can be best used to evaluate their probabilistic forecasts. In this study, it is identified that the forecast skill calculated can vary depending on the benchmark selected and that the selection of a benchmark for determining forecasting system skill is sensitive to a number of hydrological and system factors. A benchmark intercomparison experiment is then undertaken using the continuous ranked probability score (CRPS), a reference forecasting system and a suite of 23 different methods to derive benchmarks. The benchmarks are assessed within the operational set-up of the European Flood Awareness System (EFAS) to determine those that are 'toughest to beat' and so give the most robust discrimination of forecast skill, particularly for the spatial average fields that EFAS relies upon. Evaluating against an observed discharge proxy the benchmark that has most utility for EFAS and avoids the most naïve skill across different hydrological situations is found to be meteorological persistency. This benchmark uses the latest meteorological observations of precipitation and temperature to drive the hydrological model. Hydrological long term average benchmarks, which are currently used in EFAS, are very easily beaten by the forecasting system and the use of these produces much naïve skill. When decomposed into seasons, the advanced meteorological benchmarks, which make use of meteorological observations from the past 20 years at the same calendar date, have the most skill discrimination. They are also good at discriminating skill in low flows and for all catchment sizes. Simpler meteorological benchmarks are particularly useful for high flows. Recommendations for EFAS are to move to routine use of meteorological persistency, an advanced meteorological benchmark and a simple meteorological benchmark in order to provide a robust evaluation of forecast skill. This work provides the first comprehensive evidence on how benchmarks can be used in evaluation of skill in probabilistic hydrological forecasts and which benchmarks are most useful for skill discrimination and avoidance of naïve skill in a large scale HEPS. It is recommended that all HEPS use the evidence and methodology provided here to evaluate which benchmarks to employ; so forecasters can have trust in their skill evaluation and will have confidence that their forecasts are indeed better.
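The usual way a benchmark enters the evaluation is through a skill score relative to that reference; a minimal sketch of the CRPS-based version is below, with the CRPS values assumed to have been computed elsewhere for the same events (this is illustrative scaffolding, not EFAS code).

import numpy as np

def crps_skill_score(crps_forecast, crps_benchmark):
    # CRPSS = 1 - mean(CRPS_forecast) / mean(CRPS_benchmark); values > 0 beat the benchmark.
    crps_fc = np.nanmean(np.asarray(crps_forecast, dtype=float))
    crps_bm = np.nanmean(np.asarray(crps_benchmark, dtype=float))
    return 1.0 - crps_fc / crps_bm

# A tough benchmark (e.g. meteorological persistency) has a lower CRPS of its own, which lowers
# the apparent skill and guards against the naive skill discussed above.
print(crps_skill_score([0.8, 1.2, 0.9], [1.5, 1.4, 1.6]))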
MARC calculations for the second WIPP structural benchmark problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morgan, H.S.
1981-05-01
This report describes calculations made with the MARC structural finite element code for the second WIPP structural benchmark problem. Specific aspects of problem implementation such as element choice, slip line modeling, creep law implementation, and thermal-mechanical coupling are discussed in detail. Also included are the computational results specified in the benchmark problem formulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greiner, Miles
Radial hydride formation in high-burnup used fuel cladding has the potential to radically reduce its ductility and suitability for long-term storage and eventual transport. To avoid this formation, the maximum post-reactor temperature must remain sufficiently low to limit the cladding hoop stress, and so that hydrogen from the existing circumferential hydrides will not dissolve and become available to re-precipitate into radial hydrides under the slow cooling conditions during drying, transfer and early dry-cask storage. The objective of this research is to develop and experimentally benchmark computational fluid dynamics simulations of heat transfer in post-pool-storage drying operations, when high-burnup fuel cladding is likely to experience its highest temperature. These benchmarked tools can play a key role in evaluating dry cask storage systems for extended storage of high-burnup fuels and post-storage transportation, including fuel retrievability. The benchmarked tools will be used to aid the design of efficient drying processes, as well as estimate variations of surface temperatures as a means of inferring helium integrity inside the canister or cask. This work will be conducted effectively because the principal investigator has experience developing these types of simulations, and has constructed a test facility that can be used to benchmark them.
Efficient Online Learning Algorithms Based on LSTM Neural Networks.
Ergen, Tolga; Kozat, Suleyman Serdar
2017-09-13
We investigate online nonlinear regression and introduce novel regression structures based on the long short term memory (LSTM) networks. For the introduced structures, we also provide highly efficient and effective online training methods. To train these novel LSTM-based structures, we put the underlying architecture in a state space form and introduce highly efficient and effective particle filtering (PF)-based updates. We also provide stochastic gradient descent and extended Kalman filter-based updates. Our PF-based training method guarantees convergence to the optimal parameter estimation in the mean square error sense provided that we have a sufficient number of particles and satisfy certain technical conditions. More importantly, we achieve this performance with a computational complexity in the order of the first-order gradient-based methods by controlling the number of particles. Since our approach is generic, we also introduce a gated recurrent unit (GRU)-based approach by directly replacing the LSTM architecture with the GRU architecture, where we demonstrate the superiority of our LSTM-based approach in the sequential prediction task via different real life data sets. In addition, the experimental results illustrate significant performance improvements achieved by the introduced algorithms with respect to the conventional methods over several different benchmark real life data sets.
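One standard way to cast network training in the state-space form mentioned above is to treat the weights as a slowly varying latent state; schematically (this generic formulation is an assumption, and the paper's exact formulation may differ),

$$\boldsymbol{\theta}_{t} \;=\; \boldsymbol{\theta}_{t-1} + \mathbf{w}_{t}, \qquad y_{t} \;=\; f_{\mathrm{LSTM}}\!\left(x_{1:t};\,\boldsymbol{\theta}_{t}\right) + v_{t},$$

where $\mathbf{w}_{t}$ and $v_{t}$ are process and observation noise. The particle filter then approximates the posterior $p(\boldsymbol{\theta}_{t}\mid y_{1:t})$ with a weighted particle set, and the number of particles trades estimation quality against the first-order computational cost noted above.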
Zhang, Yu; Biggs, Jason D.; Govind, Niranjan; ...
2014-10-09
Long-range electron transfer (ET) plays a key role in many biological energy conversion and synthesis processes. We show that nonlinear spectroscopy with attosecond X-ray pulses provides a real-time movie of the evolving oxidation states and electron densities around atoms, and can probe these processes with high spatial and temporal resolution. This is demonstrated in a simulation study of the stimulated X-ray Raman (SXRS) signals in Re-modified azurin, which has long served as a benchmark for long-range ET in proteins. Nonlinear SXRS signals are sensitive to the local electronic structure and should offer a novel window for long-range ET.
Vertical Seafloor Geodesy at Two Mid-ocean Ridge Sites: Recent Results and Lessons Learned (Invited)
NASA Astrophysics Data System (ADS)
Nooner, S. L.; Chadwick, B.; Webb, S. C.
2013-12-01
Precise measurements of ambient seawater pressure can be used as a proxy for seafloor depth and can be used to track vertical movements of the seafloor with time. We have employed two measurement techniques simultaneously to track both episodic and long-term deformation signals at active volcanic sites on mid-ocean ridges. The first technique is through the use of Bottom Pressure Recorders (BPRs), which are instruments that sit on the seafloor recording pressure continuously for 1-3 years until they are recovered for data download and battery replacement. BPRs are essential for measuring episodic events but suffer from slow instrument drift that is indistinguishable from long-term deformation. To track the long-term deformation signals and quantify drift in the BPRs, we developed a technique using ROV-deployed Mobile Pressure Recorders (MPRs). In 2000 we began making MPR measurements on top of seafloor benchmarks at Axial Seamount on the Juan de Fuca Ridge after its 1998 eruption. The combined BPR and MPR measurements have allowed us to observe and quantify an entire eruption cycle at the intermediate spreading Axial Seamount. From 2008-2011 we established another geodetic network at the fast spreading East Pacific Rise (EPR) at the site of a 2005/2006 eruption near 9°50' N. Here we show the results to date from both Axial Seamount and the EPR, and discuss lessons learned during the last 14 years. Measurements at Axial Seamount were all made using ROVs, while measurements at the EPR were made using the manned submersible Alvin in 2008 and 2009 and the Jason ROV in 2011. Our observations at Axial Seamount have enabled us to characterize its eruption cycle into 4 distinct phases: 1.) pre-eruption short-term rapid inflation, 2.) co-eruption deflation, 3.) rapid post-eruption reinflation, and 4.) subsequent long-term steady inflation. The transition between phases 3 and 4 was not captured after the 1998 eruption and is an important impetus for continued observations at Axial since its most recent eruption in 2011. Our observations at the EPR show that at least 12 cm of inflation accumulated between December 2009 and October 2011. Unfortunately, the data from 2008 were not of sufficient quality to act as a baseline. The primary lessons learned from these studies are in minimizing measurement errors and properly quantifying and dealing with the tilt and temperature sensitivities of the pressure gauges. For example, we are now using heavy concrete benchmarks with an indentation that makes it easy to reliably place the instrument in precisely the same location and orientation during each visit to a benchmark. To deal with the temperature sensitivities, we have found that it is best to allow plenty of time for temperature equilibration to occur at the beginning of each dive, and to limit the number of recoveries to the ship. Recoveries can be adequately dealt with, however, as long as each dive is constructed as a self-contained survey loop with repeated measurements at several stations.
Long term real-time GB_InSAR monitoring of a large rock slide
NASA Astrophysics Data System (ADS)
Crosta, G. B.; Agliardi, F.; Sosio, R.; Rivolta, C.; Mannucci, G.
2011-12-01
We analyze a long-term monitoring dataset collected for a deep-seated rockslide (Ruinon, Lombardy, Italy). The rockslide has been actively monitored since 1997 by means of an in situ monitoring network (topographic benchmarks, GPS, wire extensometers) and since 2006 by a ground-based radar. Monitoring data have been used to set up and update the geological model, to identify rockslide extent and geometry, and to analyse the sensitivity to seasonal changes and their impact on the reliability and early-warning potential of monitoring data. GB-InSAR data allowed us to identify sectors characterized by different behaviours and associated with outcropping bedrock, thick debris cover, and major structures. GB-InSAR data have also been used to set up a "virtual monitoring network" by a posteriori selection of critical locations. Displacement time series extracted from GB-InSAR data provide a large amount of information even in debris-covered areas, where ground-based instrumentation fails. Such spatially distributed, improved information, validated by selected ground-based measurements, allowed us to establish new velocity and displacement thresholds for early-warning purposes. The data are analysed to verify the dependency of the observed displacements on the line-of-sight orientation as well as on that of the framed resolution cell. Relationships with rainfall and morphological slope characteristics have been analysed to verify the sensitivity to rain intensity and amount and to distinguish among the different possible mechanisms.
Moyle, Wendy; Fetherstonhaugh, Deirdre; Greben, Melissa; Beattie, Elizabeth
2015-04-23
Over half of the residents in long-term care have a diagnosis of dementia. Maintaining quality of life is important, as there is no cure for dementia. Quality of life may be used as a benchmark for caregiving, and can help to enhance respect for the person with dementia and to improve care provision. The purpose of this study was to describe quality of life as reported by people living with dementia in long-term care, in terms of both the factors that influence it and the strategies needed to improve it. A descriptive exploratory approach was used. A subsample of twelve residents across two Australian states from a national quantitative study on quality of life was interviewed. Data were analysed thematically from a realist perspective. The approach to the thematic analysis was inductive and data-driven. Three themes emerged in relation to influencers and strategies related to quality of life: (a) maintaining independence, (b) having something to do, and (c) the importance of social interaction. The findings highlight the importance of understanding individual resident needs and of considering the complexity of living in large group-living situations, particularly with regard to resident decision-making.
Complex network structure influences processing in long-term and short-term memory.
Vitevitch, Michael S; Chan, Kit Ying; Roodenrys, Steven
2012-07-01
Complex networks describe how entities in systems interact; the structure of such networks is argued to influence processing. One measure of network structure, clustering coefficient, C, measures the extent to which neighbors of a node are also neighbors of each other. Previous psycholinguistic experiments found that the C of phonological word-forms influenced retrieval from the mental lexicon (that portion of long-term memory dedicated to language) during the on-line recognition and production of spoken words. In the present study we examined how network structure influences other retrieval processes in long- and short-term memory. In a false-memory task-examining long-term memory-participants falsely recognized more words with low- than high-C. In a recognition memory task-examining veridical memories in long-term memory-participants correctly recognized more words with low- than high-C. However, participants in a serial recall task-examining redintegration in short-term memory-recalled lists comprised of high-C words more accurately than lists comprised of low-C words. These results demonstrate that network structure influences cognitive processes associated with several forms of memory including lexical, long-term, and short-term.
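For readers unfamiliar with the clustering coefficient C referred to above, a minimal Python sketch follows; the toy word network and the use of networkx are illustrative assumptions, not materials from the study:

# Minimal sketch: clustering coefficient C for nodes in a small phonological-style network.
# The toy network below is invented for illustration only.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("cat", "bat"), ("cat", "can"), ("cat", "cut"),
    ("bat", "can"),                      # two of cat's neighbors are themselves linked
    ("dog", "dot"), ("dog", "log"),      # none of dog's neighbors are linked
])

# C_i = 2 * e_i / (k_i * (k_i - 1)), where e_i counts edges among node i's neighbors
print(nx.clustering(G, "cat"))   # 1/3: one edge among three neighbors
print(nx.clustering(G, "dog"))   # 0.0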
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-31
... description of the merchandise is dispositive. Subsidies Valuation Information A. Benchmarks for Short-Term Financing For those programs requiring the application of a won-denominated, short-term interest rate... Issues and Decision Memorandum (CORE from Korea 2006 Decision Memorandum) at ``Benchmarks for Short-Term...
Exterior view of west wall of Long-Term Oxidizer Silo (T-28B) ...
Exterior view of west wall of Long-Term Oxidizer Silo (T-28B) at left (taller structure) and adjacent Oxidizer Conditioning Structure (T-28D) at right (lower structure) - Air Force Plant PJKS, Systems Integration Laboratory, Long-Term Oxidizer Silo, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO
Grindon, Christina; Harris, Sarah; Evans, Tom; Novik, Keir; Coveney, Peter; Laughton, Charles
2004-07-15
Molecular modelling played a central role in the discovery of the structure of DNA by Watson and Crick. Today, such modelling is done on computers: the more powerful these computers are, the more detailed and extensive can be the study of the dynamics of such biological macromolecules. To fully harness the power of modern massively parallel computers, however, we need to develop and deploy algorithms which can exploit the structure of such hardware. The Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) is a scalable molecular dynamics code including long-range Coulomb interactions, which has been specifically designed to function efficiently on parallel platforms. Here we describe the implementation of the AMBER98 force field in LAMMPS and its validation for molecular dynamics investigations of DNA structure and flexibility against the benchmark of results obtained with the long-established code AMBER6 (Assisted Model Building with Energy Refinement, version 6). Extended molecular dynamics simulations on the hydrated DNA dodecamer d(CTTTTGCAAAAG)(2), which has previously been the subject of extensive dynamical analysis using AMBER6, show that it is possible to obtain excellent agreement in terms of static, dynamic and thermodynamic parameters between AMBER6 and LAMMPS. In comparison with AMBER6, LAMMPS shows greatly improved scalability in massively parallel environments, opening up the possibility of efficient simulations of order-of-magnitude larger systems and/or for order-of-magnitude greater simulation times.
[The Use of Jumbo Cups in Revision Total Hip Arthroplasty].
von Roth, Philipp; Wassilew, Georgi I
2017-10-01
Extra-large uncemented jumbo cups are among the most common methods of acetabular revision. Jumbo cups do not contribute to bone stock restoration, and in the case of a subsequent revision, an even larger bone defect is to be expected. Thus, understanding long-term survival is essential. The present article discusses the literature relevant to this topic and addresses technical and implant-specific characteristics of jumbo cups. In summary, jumbo cups show an acceptable long-term survival rate, with aseptic loosening as the most common reason for revision and dislocation being the most common complication. Through the development of alternative revision systems, jumbo cups have lost their importance in today's practice. However, they can serve as a benchmark for studies of newer technologies in revision total hip arthroplasty.
Brucker, Sara Y; Schumacher, Claudia; Sohn, Christoph; Rezai, Mahdi; Bamberg, Michael; Wallwiener, Diethelm
2008-01-01
Background The main study objectives were: to establish a nationwide voluntary collaborative network of breast centres with independent data analysis; to define suitable quality indicators (QIs) for benchmarking the quality of breast cancer (BC) care; to demonstrate existing differences in BC care quality; and to show that BC care quality improved with benchmarking from 2003 to 2007. Methods BC centres participated voluntarily in a scientific benchmarking procedure. A generic XML-based data set was developed and used for data collection. Nine guideline-based quality targets serving as rate-based QIs were initially defined, reviewed annually and modified or expanded accordingly. QI changes over time were analysed descriptively. Results During 2003–2007, respective increases in participating breast centres and postoperatively confirmed BCs were from 59 to 220 and from 5,994 to 31,656 (> 60% of new BCs/year in Germany). Starting from 9 process QIs, 12 QIs were developed by 2007 as surrogates for long-term outcome. Results for most QIs increased. From 2003 to 2007, the most notable increases seen were for preoperative histological confirmation of diagnosis (58% (in 2003) to 88% (in 2007)), appropriate endocrine therapy in hormone receptor-positive patients (27 to 93%), appropriate radiotherapy after breast-conserving therapy (20 to 79%) and appropriate radiotherapy after mastectomy (8 to 65%). Conclusion Nationwide external benchmarking of BC care is feasible and successful. The benchmarking system described allows both comparisons among participating institutions as well as the tracking of changes in average quality of care over time for the network as a whole. Marked QI increases indicate improved quality of BC care. PMID:19055735
NASA Astrophysics Data System (ADS)
Scaioni, M.; Corti, M.; Diolaiuti, G.; Fugazza, D.; Cernuschi, M.
2017-09-01
Experts from the University of Milan have been investigating Forni Glacier in the Italian Alps for decades, resulting in a large archive of observational data. While the analysis of archive maps, medium-resolution satellite images and DEMs may provide an overview of the long-term processes, the application of close-range sensing techniques offers the unprecedented opportunity to operate a 4D reconstruction of the glacier geometry at both global and local levels. In recent years the availability of high-resolution DEMs from stereo-photogrammetry (2007) and UAV photogrammetry (2014 and 2016) has allowed an improved analysis of the glacier ice-mass balance over time. During summer 2016 a methodology to record the local disruption processes was investigated. The presence of vertical and sub-vertical surfaces motivated the use of Structure-from-Motion photogrammetry from ground-based stations, which yielded results comparable to those achieved using a long-range terrestrial laser scanner. This technique may be taken as a benchmark for accuracy assessment, but is more difficult to operate in high-mountain areas. Nevertheless, the measurement of GCPs for the terrestrial photogrammetric project proved to be a complex task, requiring both a total station and GNSS. The effect of network geometry on the final output was also investigated for SfM photogrammetry, considering the severe limitations imposed by the Alpine environment.
myBrain: a novel EEG embedded system for epilepsy monitoring.
Pinho, Francisco; Cerqueira, João; Correia, José; Sousa, Nuno; Dias, Nuno
2017-10-01
The World Health Organisation has pointed out that successful health care delivery requires effective medical devices as tools for prevention, diagnosis, treatment and rehabilitation. Several studies have concluded that longer monitoring periods and outpatient settings might increase diagnosis accuracy and the success rate of treatment selection. The long-term monitoring of epileptic patients through electroencephalography (EEG) has been considered a powerful tool to improve the diagnosis, disease classification, and treatment of patients with this condition. This work presents the development of a wireless and wearable EEG acquisition platform suitable for both long-term and short-term monitoring in inpatient and outpatient settings. The developed platform features 32 passive dry electrodes, analogue-to-digital signal conversion with 24-bit resolution and a variable sampling frequency from 250 Hz to 1000 Hz per channel, embedded in a stand-alone module. A computer-on-module embedded system runs a Linux® operating system that rules the interface between two software frameworks, which interact to satisfy the real-time constraints of signal acquisition as well as parallel recording, processing and wireless data transmission. A textile structure was developed to accommodate all components. Platform performance was evaluated in terms of hardware, software and signal quality. The electrodes were characterised through electrochemical impedance spectroscopy, and the operating system performance running an epileptic discrimination algorithm was evaluated. Signal quality was thoroughly assessed in two different approaches: playback of EEG reference signals and benchmarking against a clinical-grade EEG system in alpha-wave replacement and steady-state visual evoked potential paradigms. The proposed platform seems to efficiently monitor epileptic patients in both inpatient and outpatient settings and paves the way to new ambulatory clinical regimens as well as non-clinical EEG applications.
Validating the Usefulness of Combined Japanese GMS Data For Long-Term Global Change Studies
NASA Technical Reports Server (NTRS)
Simpson, James J.; Dodge, James C. (Technical Monitor)
2001-01-01
The primary objectives of the Geostationary Meteorological Satellite (GMS)-5 Pathfinder Project were the following: (1) to evaluate GMS-5 data for sources of error and develop methods for minimizing any such errors in GMS-5 data; (2) to prepare a GMS-5 Pathfinder data set for the GMS-5 Pathfinder Benchmark Period (1 July 95 - 30 June 96); and (3) to show the usefulness of the improved Pathfinder data set in at least one geophysical application. All objectives were met.
National audit of continence care: laying the foundation.
Mian, Sarah; Wagg, Adrian; Irwin, Penny; Lowe, Derek; Potter, Jonathan; Pearson, Michael
2005-12-01
National audit provides a basis for establishing performance against national standards, benchmarking against other service providers and improving standards of care. For effective audit, clinical indicators are required that are valid, feasible to apply and reliable. This study describes the methods used to develop and test clinical indicators of continence care, with regard to validity, feasibility and reliability, in preparation for a national audit. A multidisciplinary working group developed clinical indicators that measured the structure, process and outcome of care as well as case-mix variables. Literature searching, consensus workshops and a Delphi process were used to develop the indicators. The indicators were tested in 15 secondary care sites, 15 primary care sites and 15 long-term care settings. The process of development produced indicators that received a high degree of consensus within the Delphi process. Testing of the indicators demonstrated an internal reliability of 0.7 and an external reliability of 0.6. Data collection required significant investment in terms of staff time and training. The method used produced indicators that achieved a high degree of acceptance from health care professionals. The reliability of data collection was high for this audit and was similar to the level seen in other successful national audits. Data collection for the indicators was feasible; however, issues of time and staffing were identified as limitations. The study has described a systematic method for developing clinical indicators for national audit. The indicators proved robust and reliable in primary and secondary care as well as long-term care settings.
NASA Astrophysics Data System (ADS)
Caron, David A.; Connell, Paige E.; Schaffner, Rebecca A.; Schnetzer, Astrid; Fuhrman, Jed A.; Countway, Peter D.; Kim, Diane Y.
2017-03-01
Biogeochemistry in marine plankton communities is strongly influenced by the activities of microbial species. Understanding the composition and dynamics of these assemblages is essential for modeling emergent community-level processes, yet few studies have examined all of the biological assemblages present in the plankton, and benchmark data of this sort from time-series studies are rare. Abundance and biomass of the entire microbial assemblage and mesozooplankton (>200 μm) were determined vertically, monthly and seasonally over a 3-year period at a coastal time-series station in the San Pedro Basin off the southwestern coast of the USA. All compartments of the planktonic community were enumerated (viruses in the femtoplankton size range [0.02-0.2 μm], bacteria + archaea and cyanobacteria in the picoplankton size range [0.2-2.0 μm], phototrophic and heterotrophic protists in the nanoplanktonic [2-20 μm] and microplanktonic [20-200 μm] size ranges, and mesozooplankton [>200 μm]). Carbon biomass of each category was estimated using standard conversion factors. Plankton abundances varied over seven orders of magnitude across all categories, and total carbon biomass averaged approximately 60 μg C l-1 in surface waters of the 890 m water column over the study period. Bacteria + archaea comprised the single largest component of biomass (>1/3 of the total), with the sum of phototrophic protistan biomass making up a similar proportion. Temporal variability at this subtropical station was not dramatic. Monthly depth-specific and depth-integrated biomass varied 2-fold at the station, while seasonal variances were generally <50%. This study provides benchmark information for investigating long-term environmental forcing on the composition and dynamics of the microbes that dominate food web structure and function at this coastal observatory.
Puton, Tomasz; Kozlowski, Lukasz P.; Rother, Kristian M.; Bujnicki, Janusz M.
2013-01-01
We present a continuous benchmarking approach for the assessment of RNA secondary structure prediction methods implemented in the CompaRNA web server. As of 3 October 2012, the performance of 28 single-sequence and 13 comparative methods has been evaluated on RNA sequences/structures released weekly by the Protein Data Bank. We also provide a static benchmark generated on RNA 2D structures derived from the RNAstrand database. Benchmarks on both data sets offer insight into the relative performance of RNA secondary structure prediction methods on RNAs of different size and with respect to different types of structure. According to our tests, on the average, the most accurate predictions obtained by a comparative approach are generated by CentroidAlifold, MXScarna, RNAalifold and TurboFold. On the average, the most accurate predictions obtained by single-sequence analyses are generated by CentroidFold, ContextFold and IPknot. The best comparative methods typically outperform the best single-sequence methods if an alignment of homologous RNA sequences is available. This article presents the results of our benchmarks as of 3 October 2012, whereas the rankings presented online are continuously updated. We will gladly include new prediction methods and new measures of accuracy in the new editions of CompaRNA benchmarks. PMID:23435231
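For context on how such benchmarks typically score a predicted RNA secondary structure against a reference, a minimal Python sketch follows; the base-pair sensitivity/PPV measures, the MCC approximation and the toy pair sets are generic illustrations, not the exact CompaRNA metrics:

# Minimal sketch: base-pair sensitivity, PPV and an MCC approximation for a predicted
# RNA secondary structure against a reference. Toy pair sets are invented.
def pairs_metrics(reference, predicted):
    tp = len(reference & predicted)
    sens = tp / len(reference) if reference else 0.0
    ppv = tp / len(predicted) if predicted else 0.0
    mcc = (sens * ppv) ** 0.5          # common approximation when TN is ill-defined
    return sens, ppv, mcc

ref = {(1, 20), (2, 19), (3, 18), (4, 17)}
pred = {(1, 20), (2, 19), (5, 16)}
print(pairs_metrics(ref, pred))        # (0.5, 0.666..., 0.577...)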
Uncertainty in stormwater drainage adaptation: what matters and how much is too much?
NASA Astrophysics Data System (ADS)
Stack, L. J.; Simpson, M. H.; Moore, T.; Gulliver, J. S.; Roseen, R.; Eberhart, L.; Smith, J. B.; Gruber, J.; Yetka, L.; Wood, R.; Lawson, C.
2014-12-01
Published research continues to report that long-term, local-scale precipitation forecasts are too uncertain to support local-scale adaptation. Numerous studies quantify the range of uncertainty in downscaled model output; compare this with uncertainty from other sources such as hydrological modeling; and propose circumventing uncertainty via "soft" or "low regret" actions, or adaptive management. Yet non-structural adaptations alone are likely insufficient. Structural adaptation requires quantified engineering design specifications. However, the literature does not define a tolerable level of uncertainty. Without such a benchmark, how can we determine whether the climate-change-cognizant design specifications that we are capable of, for example the climate change factors increasingly utilized in European practice, are viable? The presentation will explore this question, in the context of reporting results and observations from an ongoing ten-year program assessing local-scale stormwater drainage system vulnerabilities, required capacities, and adaptation options and costs. This program has studied stormwater systems of varying complexity in a variety of regions, topographies, and levels of urbanization, in northern-New England and the upper-Midwestern United States. These studies demonstrate the feasibility of local-scale design specifications, and provide tangible information on risk to enable valid cost/benefit decisions. The research program has found that stormwater planners and engineers have routinely accepted, in the normal course of professional practice, a level of uncertainty in hydrological modeling comparable to that in long-term precipitation projections. Moreover, the ability to quantify required capacity and related construction costs for specific climate change scenarios, the insensitivity of capacity and costs to uncertainty, and the percentage of pipes and culverts that never require upsizing, all serve to limit the impact of uncertainty inherent in climate change projections.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thrower, A.W.; Patric, J.; Keister, M.
2008-07-01
The purpose of the Office of Civilian Radioactive Waste Management's (OCRWM) Logistics Benchmarking Project is to identify established government and industry practices for the safe transportation of hazardous materials which can serve as a yardstick for design and operation of OCRWM's national transportation system for shipping spent nuclear fuel and high-level radioactive waste to the proposed repository at Yucca Mountain, Nevada. The project will present logistics and transportation practices and develop implementation recommendations for adaptation by the national transportation system. This paper will describe the process used to perform the initial benchmarking study, highlight interim findings, and explain how these findings are being implemented. It will also provide an overview of the next phase of benchmarking studies. The benchmarking effort will remain a high-priority activity throughout the planning and operational phases of the transportation system. The initial phase of the project focused on government transportation programs to identify those practices which are most clearly applicable to OCRWM. These Federal programs have decades of safe transportation experience, strive for excellence in operations, and implement effective stakeholder involvement, all of which parallel OCRWM's transportation mission and vision. The initial benchmarking project focused on four business processes that are critical to OCRWM's mission success, and can be incorporated into OCRWM planning and preparation in the near term. The processes examined were: transportation business model, contract management/out-sourcing, stakeholder relations, and contingency planning. More recently, OCRWM examined logistics operations of AREVA NC's Business Unit Logistics in France. The next phase of benchmarking will focus on integrated domestic and international commercial radioactive logistic operations. The prospective companies represent large scale shippers and have vast experience in safely and efficiently shipping spent nuclear fuel and other radioactive materials. Additional business processes may be examined in this phase. The findings of these benchmarking efforts will help determine the organizational structure and requirements of the national transportation system. (authors)
PDB_REDO: automated re-refinement of X-ray structure models in the PDB.
Joosten, Robbie P; Salzemann, Jean; Bloch, Vincent; Stockinger, Heinz; Berglund, Ann-Charlott; Blanchet, Christophe; Bongcam-Rudloff, Erik; Combet, Christophe; Da Costa, Ana L; Deleage, Gilbert; Diarena, Matteo; Fabbretti, Roberto; Fettahi, Géraldine; Flegel, Volker; Gisel, Andreas; Kasam, Vinod; Kervinen, Timo; Korpelainen, Eija; Mattila, Kimmo; Pagni, Marco; Reichstadt, Matthieu; Breton, Vincent; Tickle, Ian J; Vriend, Gert
2009-06-01
Structural biology, homology modelling and rational drug design require accurate three-dimensional macromolecular coordinates. However, the coordinates in the Protein Data Bank (PDB) have not all been obtained using the latest experimental and computational methods. In this study a method is presented for automated re-refinement of existing structure models in the PDB. A large-scale benchmark with 16 807 PDB entries showed that they can be improved in terms of fit to the deposited experimental X-ray data as well as in terms of geometric quality. The re-refinement protocol uses TLS models to describe concerted atom movement. The resulting structure models are made available through the PDB_REDO databank (http://www.cmbi.ru.nl/pdb_redo/). Grid computing techniques were used to overcome the computational requirements of this endeavour.
Structural Benchmark Creep Testing for the Advanced Stirling Convertor Heater Head
NASA Technical Reports Server (NTRS)
Krause, David L.; Kalluri, Sreeramesh; Bowman, Randy R.; Shah, Ashwin R.
2008-01-01
The National Aeronautics and Space Administration (NASA) has identified the high-efficiency Advanced Stirling Radioisotope Generator (ASRG) as a candidate power source for use on long-duration science missions such as lunar applications, Mars rovers, and deep space missions. For the inherently long lifetimes required, a structurally significant design limit for the heater head component of the ASRG Advanced Stirling Convertor (ASC) is creep deformation induced at low stress levels and high temperatures. Demonstrating proof of adequate margins on creep deformation and rupture for the operating conditions and the MarM-247 material of construction is a challenge that the NASA Glenn Research Center is addressing. The combined analytical and experimental program ensures integrity and high reliability of the heater head for its 17-year design life. The life assessment approach starts with an extensive series of uniaxial creep tests on thin MarM-247 specimens that comprise the same chemistry, microstructure, and heat treatment processing as the heater head itself. This effort addresses a scarcity of openly available creep properties for the material as well as the virtual absence of understanding of the effect on creep properties due to very thin walls, fine grains, low stress levels, and high-temperature fabrication steps. The approach continues with a considerable analytical effort, both deterministically, to evaluate the median creep life using nonlinear finite element analysis, and probabilistically, to calculate the heater head's reliability to a higher degree. Finally, the approach includes a substantial structural benchmark creep testing activity to calibrate and validate the analytical work. This last element provides high-fidelity testing of prototypical heater head test articles; the testing includes the relevant material issues and the essential multiaxial stress state, and applies prototypical and accelerated temperature profiles for timely results in a highly controlled laboratory environment. This paper focuses on the last element and presents a preliminary methodology for creep rate prediction, the experimental methods, test challenges, and results from benchmark testing of a trial MarM-247 heater head test article. The results compare favorably with the analytical strain predictions. A description of other test findings is provided, and recommendations for future test procedures are suggested. The manuscript concludes by describing the potential impact of the heater head creep life assessment and benchmark testing effort on the ASC program.
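As background on the kind of creep relation such assessments typically start from, a minimal Python sketch of a Norton-type power law follows; all constants are placeholders, not measured MarM-247 properties:

# Minimal sketch of a Norton-type steady-state creep law:
#   strain_rate = A * sigma**n * exp(-Q / (R * T))
# The constants below are placeholders, NOT measured MarM-247 properties.
import math

A = 1.0e-20      # pre-exponential factor (placeholder units)
n = 5.0          # stress exponent (placeholder)
Q = 300e3        # activation energy, J/mol (placeholder)
R = 8.314        # gas constant, J/(mol K)

def creep_rate(sigma_mpa, temp_k):
    return A * sigma_mpa**n * math.exp(-Q / (R * temp_k))

# Hypothetical operating point: low stress, high temperature
rate = creep_rate(30.0, 1120.0)
print(f"steady-state creep rate ~ {rate:.3e} per hour (illustrative only)")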
Ring, Matthias; Eskofier, Bjoern M.
2015-01-01
Long-term studies in rodents are the benchmark method to assess carcinogenicity of single substances, mixtures, and multi-compounds. In such a study, mice and rats are exposed to a test agent at different dose levels for a period of two years and the incidence of neoplastic lesions is observed. However, this two-year study is also expensive, time-consuming, and burdensome to the experimental animals. Consequently, various alternatives have been proposed in the literature to assess carcinogenicity on basis of short-term studies. In this paper, we investigated if effects on the rodents’ liver weight in short-term studies can be exploited to predict the incidence of liver tumors in long-term studies. A set of 138 paired short- and long-term studies was compiled from the database of the U.S. National Toxicology Program (NTP), more precisely, from (long-term) two-year carcinogenicity studies and their preceding (short-term) dose finding studies. In this set, data mining methods revealed patterns that can predict the incidence of liver tumors with accuracies of over 80%. However, the results simultaneously indicated a potential bias regarding liver tumors in two-year NTP studies. The incidence of liver tumors does not only depend on the test agent but also on other confounding factors in the study design, e.g., species, sex, type of substance. We recommend considering this bias if the hazard or risk of a test agent is assessed on basis of a NTP carcinogenicity study. PMID:25658102
Ka'opua, Lana Sue I; Gotay, Carolyn C; Hannum, Meghan; Bunghanoy, Grace
2005-05-01
Increasingly evident is the important role of partners in patients' adaptation to diagnosis, treatment, and recovery. Yet little is known about partners' adaptation when patients reach the benchmark known as long-term survival. This study describes the perspectives of elderly wives of prostate cancer survivors on adaptation to the enduring challenges of prostate cancer survival, and considers their experience in the context of ethnicity. Content analysis and grounded theory methods guided data collection and analysis of two waves of in-depth interviews with 26 elderly Asian/Pacific Islanders (Chinese, Filipino, Japanese, Native Hawaiian) living in Hawai'i. Continuous learning was the most common phenomenon, as reflected in four types of adaptive work: involvement in the husband's health, affirmation of the marital bond, normalization of adversity, and participation in personally meaningful acts. Issues are highlighted for consideration in developing culturally relevant, age-appropriate, and strengths-based interventions.
Carbon nanocages: a new support material for Pt catalyst with remarkably high durability.
Wang, Xiao Xia; Tan, Zhe Hua; Zeng, Min; Wang, Jian Nong
2014-03-24
Low durability is the major challenge hindering the large-scale implementation of proton exchange membrane fuel cell (PEMFC) technology, and corrosion of carbon support materials of current catalysts is the main cause. Here, we describe the finding of remarkably high durability with the use of a novel support material. This material is based on hollow carbon nanocages developed with a high degree of graphitization and concurrent nitrogen doping for oxidation resistance enhancement, uniform deposition of fine Pt particles, and strong Pt-support interaction. Accelerated degradation testing shows that such designed catalyst possesses a superior electrochemical activity and long-term stability for both hydrogen oxidation and oxygen reduction relative to industry benchmarks of current catalysts. Further testing under conditions of practical fuel cell operation reveals almost no degradation over long-term cycling. Such a catalyst of high activity, particularly, high durability, opens the door for the next-generation PEMFC for "real world" application.
NASA Astrophysics Data System (ADS)
Zhuo, La; Mekonnen, Mesfin M.; Hoekstra, Arjen Y.
2016-11-01
Meeting growing food demands while simultaneously shrinking the water footprint (WF) of agricultural production is one of the greatest societal challenges. Benchmarks for the WF of crop production can serve as a reference and be helpful in setting WF reduction targets. The consumptive WF of crops, the consumption of rainwater stored in the soil (green WF), and the consumption of irrigation water (blue WF) over the crop growing period varies spatially and temporally depending on environmental factors like climate and soil. The study explores which environmental factors should be distinguished when determining benchmark levels for the consumptive WF of crops. Hereto we determine benchmark levels for the consumptive WF of winter wheat production in China for all separate years in the period 1961-2008, for rain-fed vs. irrigated croplands, for wet vs. dry years, for warm vs. cold years, for four different soil classes, and for two different climate zones. We simulate consumptive WFs of winter wheat production with the crop water productivity model AquaCrop at a 5 by 5 arcmin resolution, accounting for water stress only. The results show that (i) benchmark levels determined for individual years for the country as a whole remain within a range of ±20 % around long-term mean levels over 1961-2008, (ii) the WF benchmarks for irrigated winter wheat are 8-10 % larger than those for rain-fed winter wheat, (iii) WF benchmarks for wet years are 1-3 % smaller than for dry years, (iv) WF benchmarks for warm years are 7-8 % smaller than for cold years, (v) WF benchmarks differ by about 10-12 % across different soil texture classes, and (vi) WF benchmarks for the humid zone are 26-31 % smaller than for the arid zone, which has relatively higher reference evapotranspiration in general and lower yields in rain-fed fields. We conclude that when determining benchmark levels for the consumptive WF of a crop, it is useful to primarily distinguish between different climate zones. If actual consumptive WFs of winter wheat throughout China were reduced to the benchmark levels set by the best 25 % of Chinese winter wheat production (1224 m3 t-1 for arid areas and 841 m3 t-1 for humid areas), the water saving in an average year would be 53 % of the current water consumption at winter wheat fields in China. The majority of the yield increase and associated improvement in water productivity can be achieved in southern China.
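As a numerical illustration of the water footprint arithmetic described above (consumptive WF as green plus blue crop water use divided by yield, and a benchmark set by the best 25% of production), a minimal Python sketch with invented numbers follows:

# Minimal sketch: consumptive water footprint (m3/t) per grid cell and a benchmark
# defined by the best 25% of production. All numbers are invented for illustration.
import numpy as np

green_cwu = np.array([3000.0, 2600.0, 3400.0, 2900.0])   # m3/ha per season
blue_cwu  = np.array([ 800.0,    0.0, 1200.0,  500.0])   # m3/ha per season
yield_t   = np.array([   4.5,    3.0,    3.8,    5.2])   # t/ha
prod_t    = np.array([ 900.0,  400.0,  700.0, 1200.0])   # t produced per cell

wf = (green_cwu + blue_cwu) / yield_t                     # m3/t in each cell

# Benchmark: WF level at which the best-performing 25% of production is achieved
order = np.argsort(wf)
cum_share = np.cumsum(prod_t[order]) / prod_t.sum()
benchmark = wf[order][np.searchsorted(cum_share, 0.25)]
print(wf.round(0), benchmark)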
Zheng, Heping; Shabalin, Ivan G.; Handing, Katarzyna B.; Bujnicki, Janusz M.; Minor, Wladek
2015-01-01
The ubiquitous presence of magnesium ions in RNA has long been recognized as a key factor governing RNA folding, and is crucial for many diverse functions of RNA molecules. In this work, Mg2+-binding architectures in RNA were systematically studied using a database of RNA crystal structures from the Protein Data Bank (PDB). Due to the abundance of poorly modeled or incorrectly identified Mg2+ ions, the set of all sites was comprehensively validated and filtered to identify a benchmark dataset of 15 334 ‘reliable’ RNA-bound Mg2+ sites. The normalized frequencies by which specific RNA atoms coordinate Mg2+ were derived for both the inner and outer coordination spheres. A hierarchical classification system of Mg2+ sites in RNA structures was designed and applied to the benchmark dataset, yielding a set of 41 types of inner-sphere and 95 types of outer-sphere coordinating patterns. This classification system has also been applied to describe six previously reported Mg2+-binding motifs and detect them in new RNA structures. Investigation of the most populous site types resulted in the identification of seven novel Mg2+-binding motifs, and all RNA structures in the PDB were screened for the presence of these motifs. PMID:25800744
Marking Closely or on the Bench?: An Australian's Benchmark Statement.
ERIC Educational Resources Information Center
Jones, Roy
2000-01-01
Reviews the benchmark statements of the Quality Assurance Agency for Higher Education in the United Kingdom. Examines the various sections within the benchmark. States that in terms of emphasizing the positive attributes of the geography discipline the statements have wide utility and applicability. (CMK)
An overview of the ENEA activities in the field of coupled codes NPP simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parisi, C.; Negrenti, E.; Sepielli, M.
2012-07-01
In the framework of the nuclear research activities in the fields of safety, training and education, ENEA (the Italian National Agency for New Technologies, Energy and the Sustainable Development) is in charge of defining and pursuing all the necessary steps for the development of a NPP engineering simulator at the 'Casaccia' Research Center near Rome. A summary of the activities in the field of the nuclear power plants simulation by coupled codes is here presented with the long term strategy for the engineering simulator development. Specifically, results from the participation in international benchmarking activities like the OECD/NEA 'Kalinin-3' benchmark and the 'AER-DYN-002' benchmark, together with simulations of relevant events like the Fukushima accident, are here reported. The ultimate goal of such activities performed using state-of-the-art technology is the re-establishment of top level competencies in the NPP simulation field in order to facilitate the development of Enhanced Engineering Simulators and to upgrade competencies for supporting national energy strategy decisions, the nuclear national safety authority, and the R and D activities on NPP designs. (authors)
Nutrient cycle benchmarks for earth system land model
NASA Astrophysics Data System (ADS)
Zhu, Q.; Riley, W. J.; Tang, J.; Zhao, L.
2017-12-01
Projecting future biosphere-climate feedbacks using Earth system models (ESMs) relies heavily on robust modeling of land surface carbon dynamics. More importantly, soil nutrient (particularly, nitrogen (N) and phosphorus (P)) dynamics strongly modulate carbon dynamics, such as plant sequestration of atmospheric CO2. Prevailing ESM land models all consider nitrogen as a potentially limiting nutrient, and several consider phosphorus. However, including nutrient cycle processes in ESM land models potentially introduces large uncertainties that could be identified and addressed by improved observational constraints. We describe the development of two nutrient cycle benchmarks for ESM land models: (1) nutrient partitioning between plants and soil microbes inferred from 15N and 33P tracers studies and (2) nutrient limitation effects on carbon cycle informed by long-term fertilization experiments. We used these benchmarks to evaluate critical hypotheses regarding nutrient cycling and their representation in ESMs. We found that a mechanistic representation of plant-microbe nutrient competition based on relevant functional traits best reproduced observed plant-microbe nutrient partitioning. We also found that for multiple-nutrient models (i.e., N and P), application of Liebig's law of the minimum is often inaccurate. Rather, the Multiple Nutrient Limitation (MNL) concept better reproduces observed carbon-nutrient interactions.
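To make the contrast between Liebig's law of the minimum and multiple-nutrient limitation concrete, a minimal Python sketch follows; the functional forms and numbers are generic illustrations, not the ESM land-model implementation:

# Minimal sketch: Liebig's law of the minimum vs. a multiplicative multiple-nutrient
# limitation (MNL) form. Scalars f_n, f_p in [0, 1] are generic limitation factors.
def liebig(f_n, f_p):
    return min(f_n, f_p)            # growth limited only by the scarcest nutrient

def multiplicative(f_n, f_p):
    return f_n * f_p                # both nutrients down-regulate growth jointly

gpp_potential = 10.0                # potential carbon uptake (arbitrary units)
f_n, f_p = 0.8, 0.6
print(gpp_potential * liebig(f_n, f_p))          # 6.0
print(gpp_potential * multiplicative(f_n, f_p))  # 4.8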
1. Exterior view of Long-Term Hydrazine Silo (T-28E), looking southeast. ...
1. Exterior view of Long-Term Hydrazine Silo (T-28E), looking southeast. The structure was designed to assess long-term environmental impacts on storage of the Titan's fuel (hydrazine). The low-lying building to the immediate right of the silo is the Fuel Purification Structure (T-28E), constructed during the late 1960s to purify hydrazine for long-term hardware requirements for satellites and space expedition vehicles associated with the Titan III. - Air Force Plant PJKS, Systems Integration Laboratory, Long-Term Hydrazine Silo, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO
Benchmarking specialty hospitals, a scoping review on theory and practice.
Wind, A; van Harten, W H
2017-04-04
Although benchmarking may improve hospital processes, research on this subject is limited. The aim of this study was to provide an overview of publications on benchmarking in specialty hospitals and a description of study characteristics. We searched PubMed and EMBASE for articles published in English in the last 10 years. Eligible articles described a project stating benchmarking as its objective and involving a specialty hospital or specific patient category; or those dealing with the methodology or evaluation of benchmarking. Of 1,817 articles identified in total, 24 were included in the study. Articles were categorized into: pathway benchmarking, institutional benchmarking, articles on benchmark methodology or evaluation, and benchmarking using a patient registry. There was a large degree of variability: (1) study designs were mostly descriptive and retrospective; (2) not all studies generated and showed data in sufficient detail; and (3) there was variety in whether a benchmarking model was just described or if quality improvement as a consequence of the benchmark was reported upon. Most of the studies that described a benchmark model described the use of benchmarking partners from the same industry category, sometimes from all over the world. Benchmarking seems to be more developed in eye hospitals, emergency departments and oncology specialty hospitals. Some studies showed promising improvement effects. However, the majority of the articles lacked a structured design, and did not report on benchmark outcomes. In order to evaluate the effectiveness of benchmarking to improve quality in specialty hospitals, robust and structured designs are needed, including a follow-up to check whether the benchmark study has led to improvements.
Palm oil price forecasting model: An autoregressive distributed lag (ARDL) approach
NASA Astrophysics Data System (ADS)
Hamid, Mohd Fahmi Abdul; Shabri, Ani
2017-05-01
Palm oil prices fluctuated without any clear trend or cyclical pattern in the last few decades. The instability of food commodity prices causes them to change rapidly over time. This paper attempts to develop an Autoregressive Distributed Lag (ARDL) model for modeling and forecasting the price of palm oil. In order to use ARDL as a forecasting model, this paper modifies the data structure so that only lagged explanatory variables are considered to explain the variation in the palm oil price. We then compare the performance of this ARDL model with a benchmark model, namely ARIMA, in terms of their comparative forecasting accuracy. This paper also utilizes the ARDL bounds testing approach to cointegration in examining the short-run and long-run relationships between the palm oil price and its determinants: production, stock, the price of soybean as a substitute for palm oil, and the price of crude oil. The comparative forecasting accuracy suggests that the ARDL model has better forecasting accuracy than ARIMA.
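A minimal Python sketch of the modeling idea described above (an ARDL specification restricted to lagged explanatory variables, compared against an ARIMA benchmark) follows; the synthetic data and the use of statsmodels are assumptions, not the paper's data or software:

# Minimal sketch (hypothetical synthetic data): an ARDL model with only *lagged*
# explanatory variables, compared in-sample against an ARIMA benchmark.
# Requires statsmodels >= 0.13 for statsmodels.tsa.ardl.ARDL.
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ARDL
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
n = 200
soybean = pd.Series(np.cumsum(rng.normal(0, 1, n)), name="soybean")
crude = pd.Series(np.cumsum(rng.normal(0, 1, n)), name="crude_oil")
palm = 0.6 * soybean.shift(1) + 0.3 * crude.shift(1) + rng.normal(0, 0.5, n)
data = pd.concat([palm.rename("palm"), soybean, crude], axis=1).dropna()

# ARDL with 2 autoregressive lags and lags 1-2 of each explanatory variable
# (no contemporaneous exogenous term)
ardl_res = ARDL(data["palm"], lags=2, exog=data[["soybean", "crude_oil"]],
                order=[1, 2], trend="c").fit()
arima_res = ARIMA(data["palm"], order=(2, 1, 1)).fit()

def rmse(resid):
    return float(np.sqrt(np.mean(np.asarray(resid) ** 2)))

print("ARDL in-sample RMSE :", rmse(ardl_res.resid))
print("ARIMA in-sample RMSE:", rmse(arima_res.resid))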
III-Vs at Scale: A PV Manufacturing Cost Analysis of the Thin Film Vapor-Liquid-Solid Growth Mode
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Maxwell; Horowitz, Kelsey; Woodhouse, Michael
The authors present a manufacturing cost analysis for producing thin-film indium phosphide modules by combining a novel thin-film vapor-liquid-solid (TF-VLS) growth process with a standard monolithic module platform. The example cell structure is ITO/n-TiO2/p-InP/Mo. For a benchmark scenario of 12% efficient modules, the module cost is estimated to be $0.66/W(DC) and the module cost is calculated to be around $0.36/W(DC) at a long-term potential efficiency of 24%. The manufacturing cost for the TF-VLS growth portion is estimated to be ~$23/m2, a significant reduction compared with traditional metalorganic chemical vapor deposition. The analysis here suggests the TF-VLS growth mode could enable lower-cost, high-efficiency III-V photovoltaics compared with manufacturing methods used today and open up possibilities for other optoelectronic applications as well.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Royer, Michael P.; McCullough, Jeffrey J.; Tucker, Joseph C.
The lumen depreciation and color shift of 17 different A lamps (15 LED, 1 CFL, 1 halogen) were monitored in the automated long-term test apparatus (ALTA) for more than 7,500 hours. Ten samples of each lamp model were tested, with measurements recorded on a weekly basis. The lamps were operated continuously at an ambient temperature of 45°C (±1°C). Importantly, the steady-state test conditions were not optimized for inducing catastrophic failure for any of the lamp technologies—to which thermal cycling is a strong contributor—and are not typical of normal use patterns—which usually include off periods where the lamp cools down. Further, the test conditions differ from those used in standardized long-term test methods (i.e., IES LM-80, IES LM-84), so the results should not be directly compared. On the other hand, the test conditions are similar to those used by ENERGY STAR (when elevated temperature testing is called for). Likewise, the conditions and assumptions used by manufacturers to generate lifetime claims may vary; the CALiPER long-term data is informative, but cannot necessarily be used to discredit manufacturer claims. The test method used for this investigation should be interpreted as one more focused on the long-term effects of elevated temperature operation, at an ambient temperature that is not uncommon in luminaires. On average, the lumen maintenance of the LED lamps monitored in the ALTA was better than benchmark lamps, but there was considerable variation from lamp model to lamp model. While three lamp models had average lumen maintenance above 99% at the end of the study period, two products had average lumen maintenance below 65%, constituting a parametric failure. These two products, along with a third, also exhibited substantial color shift, another form of parametric failure. While none of the LED lamps exhibited catastrophic failure—and all of the benchmarks did—the early degradation of performance is concerning, especially with a new technology trying to build a reputation with consumers. Beyond the observed parametric failures, nearly half of the products failed to meet early-life thresholds for lumen maintenance, which were borrowed from ENERGY STAR specifications. That is, the lumen maintenance was sufficiently low at 6,000 hours that seven of the products are unlikely to have lumen maintenance above 70% at their rated lifetime (which was usually 25,000 hours). Given the methods used for this investigation—most notably continuous operation—the results should not be interpreted as indicative of a lamp's performance in a typical environment. Likewise, these results are not directly relatable to manufacturer lifetime claims. This report is best used to understand the variation in LED product performance, compare the robustness of LED lamps and benchmark conventional lamps, and understand the characteristics of lumen and chromaticity change. A key takeaway is that the long-term performance of LED lamps can vary greatly from model to model (i.e., the technology is not homogenous), although the lamp-to-lamp consistency within a given model is relatively good. Further, operation of LED lamps in an enclosed luminaire (or otherwise in high ambient temperatures) can induce parametric failure of LEDs much earlier than their rated lifetime; manufacturer warnings about such conditions should be followed if performance degradation is unacceptable.
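For readers wanting to reproduce the kind of early-life screening described above (projecting whether lumen maintenance will remain above 70% at rated life from the first few thousand hours of data), a minimal Python sketch follows; the single-exponential model in the spirit of TM-21-style extrapolation and the data points are assumptions, not CALiPER measurements:

# Minimal sketch: fit an exponential lumen-maintenance model L(t) = B * exp(-a * t)
# to early-life data and project maintenance at rated life. Data are invented.
import numpy as np
from scipy.optimize import curve_fit

hours = np.array([0, 1000, 2000, 3000, 4000, 5000, 6000], dtype=float)
lumen_maint = np.array([1.00, 0.985, 0.972, 0.960, 0.949, 0.938, 0.928])

def model(t, b, a):
    return b * np.exp(-a * t)

(b, a), _ = curve_fit(model, hours, lumen_maint, p0=(1.0, 1e-5))

rated_life = 25000.0
projection = model(rated_life, b, a)
print(f"projected lumen maintenance at {rated_life:.0f} h: {projection:.2f}")
print("meets L70 threshold" if projection >= 0.70 else "unlikely to meet L70")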
Student Satisfaction Surveys: The Value in Taking an Historical Perspective
ERIC Educational Resources Information Center
Kane, David; Williams, James; Cappuccini-Ansfield, Gillian
2008-01-01
Benchmarking satisfaction over time can be extremely valuable where a consistent feedback cycle is employed. However, the value of benchmarking over a long period of time has not been analysed in depth. What is the value of benchmarking this type of data over time? What does it tell us about a feedback and action cycle? What impact does a study of…
A Methodology for Benchmarking Relational Database Machines,
1984-01-01
user benchmarks is to compare the multiple users to the best-case performance. The data for each query classification coll and the performance...called a benchmark. The term benchmark originates from the markers used by surveyors in establishing common reference points for their measure...formatted databases. In order to further simplify the problem, we restrict our study to those DBMs which support the relational model. A survey...
Human Thermal Model Evaluation Using the JSC Human Thermal Database
NASA Technical Reports Server (NTRS)
Bue, Grant; Makinen, Janice; Cognata, Thomas
2012-01-01
Human thermal modeling has considerable long-term utility to human space flight. Such models provide a tool to predict crew survivability in support of vehicle design and to evaluate crew response in untested space environments. It is to the benefit of any such model not only to collect relevant experimental data to correlate against, but also to maintain an experimental standard or benchmark for future development in a readily and rapidly searchable, software-accessible format. The human thermal database project is intended to do just that: to collect relevant data from literature and experimentation and to store the data in a database structure for immediate and future use as a benchmark against which to judge human thermal models, to identify model strengths and weaknesses, to support model development and improve correlation, and to statistically quantify a model's predictive quality. The human thermal database developed at the Johnson Space Center (JSC) is intended to evaluate a set of widely used human thermal models. This set includes the Wissler human thermal model, a model that has been widely used to predict the human thermoregulatory response to a variety of cold and hot environments. These models are statistically compared to the current database, which contains experiments on human subjects primarily in air, drawn from a literature survey ranging between 1953 and 2004 and from a suited experiment recently performed by the authors, for a quantitative study of the relative strength and predictive quality of the models.
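A minimal Python sketch of the kind of statistical comparison such a database enables (bias and RMSE of model predictions against observations) follows; the values are placeholders, not entries from the JSC database or Wissler model output:

# Minimal sketch: score a thermal model's predicted core temperatures against
# database observations with bias and RMSE. All values are placeholders.
import numpy as np

observed_core_c  = np.array([36.9, 37.1, 37.4, 37.8, 38.1])   # database subjects
predicted_core_c = np.array([36.8, 37.0, 37.5, 38.0, 38.4])   # model output

error = predicted_core_c - observed_core_c
bias = error.mean()
rmse = np.sqrt((error ** 2).mean())
print(f"bias = {bias:+.2f} C, RMSE = {rmse:.2f} C")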
NASA Astrophysics Data System (ADS)
Shi, Haichen; Worden, Keith; Cross, Elizabeth J.
2018-03-01
Cointegration is now extensively used to model the long-term common trends among economic variables in the field of econometrics. Recently, cointegration has been successfully implemented in the context of structural health monitoring (SHM), where it has been used to remove the confounding influences of environmental and operational variations (EOVs) that can often mask the signature of structural damage. However, restrained by its linear nature, the conventional cointegration approach has limited power in modelling systems where measurands are nonlinearly related; this occurs, for example, in the benchmark study of the Z24 Bridge, where nonlinear relationships between natural frequencies were induced during a period of very cold temperatures. To allow the removal of EOVs from SHM data with nonlinear relationships like this, this paper extends the well-established cointegration method to a nonlinear context by allowing a breakpoint in the cointegrating vector. In a novel approach, the augmented Dickey-Fuller (ADF) statistic is used to find which position is most appropriate for inserting a breakpoint; the Johansen procedure is then utilised for the estimation of the cointegrating vectors. The proposed approach is examined with a simulated case and real SHM data from the Z24 Bridge, demonstrating that the EOVs can be neatly eliminated.
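A minimal Python sketch of the breakpoint idea described above follows; it scans candidate breakpoints, fits the cointegrating regression separately on the two segments, and scores the stitched residual with the ADF statistic. The synthetic data and the OLS-per-segment simplification are assumptions; the paper itself uses the Johansen procedure for the final estimation of the cointegrating vectors:

# Minimal sketch of a breakpoint search for a piecewise cointegrating relation:
# for each candidate breakpoint, fit the cointegrating regression on each segment
# and score the stitched residual with the ADF statistic (most negative wins).
# Synthetic data; OLS per segment is a simplification of the Johansen procedure.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
n, brk = 400, 250
x = np.cumsum(rng.normal(size=n))                       # common stochastic trend
beta = np.where(np.arange(n) < brk, 1.0, 1.8)           # relation changes at brk
y = beta * x + rng.normal(scale=0.3, size=n)

def stitched_residual(k):
    resid = np.empty(n)
    for sl in (slice(0, k), slice(k, n)):
        fit = sm.OLS(y[sl], sm.add_constant(x[sl])).fit()
        resid[sl] = fit.resid
    return resid

candidates = range(50, n - 50, 10)
scores = {k: adfuller(stitched_residual(k))[0] for k in candidates}
best = min(scores, key=scores.get)
print("estimated breakpoint:", best)                    # near the true value of 250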
Hwang, Bosun; You, Jiwoo; Vaessen, Thomas; Myin-Germeys, Inez; Park, Cheolsoo; Zhang, Byoung-Tak
2018-02-08
Stress recognition using electrocardiogram (ECG) signals requires the intractable long-term heart rate variability (HRV) parameter extraction process. This study proposes a novel deep learning framework to recognize the stressful states, the Deep ECGNet, using ultra short-term raw ECG signals without any feature engineering methods. The Deep ECGNet was developed through various experiments and analysis of ECG waveforms. We proposed the optimal recurrent and convolutional neural networks architecture, and also the optimal convolution filter length (related to the P, Q, R, S, and T wave durations of ECG) and pooling length (related to the heart beat period) based on the optimization experiments and analysis on the waveform characteristics of ECG signals. The experiments were also conducted with conventional methods using HRV parameters and frequency features as a benchmark test. The data used in this study were obtained from Kwangwoon University in Korea (13 subjects, Case 1) and KU Leuven University in Belgium (9 subjects, Case 2). Experiments were designed according to various experimental protocols to elicit stressful conditions. The proposed framework to recognize stress conditions, the Deep ECGNet, outperformed the conventional approaches with the highest accuracy of 87.39% for Case 1 and 73.96% for Case 2, respectively, that is, 16.22% and 10.98% improvements compared with those of the conventional HRV method. We proposed an optimal deep learning architecture and its parameters for stress recognition, and the theoretical consideration on how to design the deep learning structure based on the periodic patterns of the raw ECG data. Experimental results in this study have proved that the proposed deep learning model, the Deep ECGNet, is an optimal structure to recognize the stress conditions using ultra short-term ECG data.
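A minimal Python (PyTorch) sketch in the spirit of the architecture described above follows; it ties the convolution length to an assumed ECG wave duration and the pooling length to an assumed beat period. The sampling rate, durations and layer sizes are guesses, not the authors' published configuration:

# Minimal sketch (not the authors' published architecture): a 1D CNN + GRU classifier
# whose convolution length is tied to an assumed ECG wave duration and whose pooling
# length is tied to an assumed beat period.
import torch
import torch.nn as nn

fs = 250                                  # sampling rate, Hz (assumed)
conv_len = int(0.10 * fs)                 # ~100 ms, order of a QRS-complex duration
pool_len = int(0.80 * fs)                 # ~0.8 s, order of one heart-beat period

class EcgStressNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv1d(1, 16, kernel_size=conv_len, padding=conv_len // 2)
        self.pool = nn.MaxPool1d(pool_len)
        self.gru = nn.GRU(16, 32, batch_first=True)
        self.head = nn.Linear(32, 2)      # stressed vs. not stressed

    def forward(self, x):                 # x: (batch, 1, samples)
        h = torch.relu(self.conv(x))
        h = self.pool(h)                  # (batch, 16, roughly one step per beat)
        h = h.transpose(1, 2)             # (batch, steps, 16) for the GRU
        _, last = self.gru(h)
        return self.head(last.squeeze(0))

x = torch.randn(4, 1, 10 * fs)            # four 10-second raw ECG segments
print(EcgStressNet()(x).shape)            # torch.Size([4, 2])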
Very High Specific Energy, Medium Power Li/CFx Primary Battery for Launchers and Space Probes
NASA Astrophysics Data System (ADS)
Brochard, Paul; Godillot, Gerome; Peres, Jean Paul; Corbin, Julien; Espinosa, Amaya
2014-08-01
Benchmarking against existing technologies shows the advantages of the lithium-fluorinated carbon (Li/CFx) technology for use aboard future launchers in terms of a low Total Cost of Ownership (TCO), especially for high-energy-demanding missions such as re-ignitable upper stages for long GTO+ missions and probes for deep space exploration. This paper presents the new results obtained on this chemistry in terms of electrical and climatic performances, abuse tests and life tests. Studies, co-financed by CNES and Saft, looked at a pure CFx version with a specific energy up to 500 Wh/kg along with a medium power of 80 to 100 W/kg.
Li, Yang; Yang, Jianyi
2017-04-24
The prediction of protein-ligand binding affinity has recently been improved remarkably by machine-learning-based scoring functions. For example, using a set of simple descriptors representing atomic distance counts, RF-Score improves the Pearson correlation coefficient to about 0.8 on the core set of the PDBbind 2007 database, significantly higher than the performance of any conventional scoring function on the same benchmark. A few studies have discussed the performance of machine-learning-based methods, but the reason for this improvement remains unclear. In this study, by systematically controlling the structural and sequence similarity between the training and test proteins of the PDBbind benchmark, we demonstrate that protein structural and sequence similarity has a significant impact on machine-learning-based methods. After removal of training proteins that are highly similar to the test proteins, as identified by structure alignment and sequence alignment, machine-learning-based methods trained on the new training sets no longer outperform the conventional scoring functions. In contrast, the performance of conventional functions such as X-Score is relatively stable regardless of which training data are used to fit the weights of their energy terms.
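The RF-Score recipe summarized above is essentially a random forest regression on atom-pair count descriptors, evaluated by the Pearson correlation between predicted and measured affinities. The sketch below shows that pipeline with random placeholder features (36 columns, the size of the original RF-Score descriptor) rather than real PDBbind complexes.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
# Placeholder atom-pair count descriptors and affinities; real use would
# derive these from protein-ligand complex structures.
X_train, y_train = rng.random((1000, 36)), rng.random(1000)
X_test, y_test = rng.random((195, 36)), rng.random(195)   # core-set-sized test split

model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_train, y_train)
r, _ = pearsonr(model.predict(X_test), y_test)
print(f"Pearson r on the held-out set: {r:.2f}")
```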
Does neighborhood size really cause the word length effect?
Guitard, Dominic; Saint-Aubin, Jean; Tehan, Gerald; Tolan, Anne
2018-02-01
In short-term serial recall, it is well-known that short words are remembered better than long words. This word length effect has been the cornerstone of the working memory model and a benchmark effect that all models of immediate memory should account for. Currently, there is no consensus as to what determines the word length effect. Jalbert and colleagues (Jalbert, Neath, Bireta, & Surprenant, 2011a; Jalbert, Neath, & Surprenant, 2011b) suggested that neighborhood size is one causal factor. In six experiments we systematically examined their suggestion. In Experiment 1, with an immediate serial recall task, multiple word lengths, and a large pool of words controlled for neighborhood size, the typical word length effect was present. In Experiments 2 and 3, with an order reconstruction task and words with either many or few neighbors, we observed the typical word length effect. In Experiment 4 we tested the hypothesis that the previous abolition of the word length effect when neighborhood size was controlled was due to a confounded factor: frequency of orthographic structure. As predicted, we reversed the word length effect when using short words with less frequent orthographic structures than the long words, as was done in both of Jalbert et al.'s studies. In Experiments 5 and 6, we again observed the typical word length effect, even if we controlled for neighborhood size and frequency of orthographic structure. Overall, the results were not consistent with the predictions of Jalbert et al. and clearly showed a large and reliable word length effect after controlling for neighborhood size.
Anelastic and Compressible Simulation of Moist Dynamics at Planetary Scales
NASA Astrophysics Data System (ADS)
Kurowski, M.; Smolarkiewicz, P. K.; Grabowski, W.
2015-12-01
Moist anelastic and compressible numerical solutions to the planetary baroclinic instability and climate benchmarks are compared. The solutions are obtained applying a consistent numerical framework for discrete integrations of the various nonhydrostatic flow equations. Moist extension of the baroclinic instability benchmark is formulated as an analog of the dry case. Flow patterns, surface vertical vorticity and pressure, total kinetic energy, power spectra, and total amount of condensed water are analyzed. The climate benchmark extends the baroclinic instability study by addressing long-term statistics of an idealized planetary equilibrium and associated meridional transports. Short-term deterministic anelastic and compressible solutions differ significantly. In particular, anelastic baroclinic eddies propagate faster and develop slower owing to, respectively, modified dispersion relation and abbreviated baroclinic vorticity production. These eddies also carry less kinetic energy, and the onset of their rapid growth occurs later than for the compressible solutions. The observed differences between the two solutions are sensitive to initial conditions as they diminish for large-amplitude excitations of the instability. In particular, on the climatic time scales, the anelastic and compressible solutions evince similar zonally averaged flow patterns with the matching meridional transports of entropy, momentum, and moisture.
How Benchmarking and Higher Education Came Together
ERIC Educational Resources Information Center
Levy, Gary D.; Ronco, Sharron L.
2012-01-01
This chapter introduces the concept of benchmarking and how higher education institutions began to use benchmarking for a variety of purposes. Here, benchmarking is defined as a strategic and structured approach whereby an organization compares aspects of its processes and/or outcomes to those of another organization or set of organizations to…
van Hartevelt, Tim J; Cabral, Joana; Møller, Arne; FitzGerald, James J; Green, Alexander L; Aziz, Tipu Z; Deco, Gustavo; Kringelbach, Morten L
2015-01-01
It is unclear whether Hebbian-like learning occurs at the level of long-range white matter connections in humans, i.e., where measurable changes in structural connectivity (SC) are correlated with changes in functional connectivity. However, the behavioral changes observed after deep brain stimulation (DBS) suggest the existence of such Hebbian-like mechanisms occurring at the structural level with functional consequences. In this rare case study, we obtained the full network of white matter connections of one patient with Parkinson's disease (PD) before and after long-term DBS and combined it with a computational model of ongoing activity to investigate the effects of DBS-induced long-term structural changes. The results show that the long-term effects of DBS on resting-state functional connectivity is best obtained in the computational model by changing the structural weights from the subthalamic nucleus (STN) to the putamen and the thalamus in a Hebbian-like manner. Moreover, long-term DBS also significantly changed the SC towards normality in terms of model-based measures of segregation and integration of information processing, two key concepts of brain organization. This novel approach using computational models to model the effects of Hebbian-like changes in SC allowed us to causally identify the possible underlying neural mechanisms of long-term DBS using rare case study data. In time, this could help predict the efficacy of individual DBS targeting and identify novel DBS targets.
Structural Benchmark Testing for Stirling Convertor Heater Heads
NASA Technical Reports Server (NTRS)
Krause, David L.; Kalluri, Sreeramesh; Bowman, Randy R.
2007-01-01
The National Aeronautics and Space Administration (NASA) has identified high-efficiency Stirling technology for potential use on long-duration Space Science missions such as Mars rovers, deep space missions, and lunar applications. For the long lifetimes required, a structurally significant design limit for the Stirling convertor heater head is creep deformation, induced even under relatively low stress levels at high material temperatures. Conventional investigations of creep behavior adequately rely on experimental results from uniaxial creep specimens, and much creep data is available for the proposed Inconel-718 (IN-718) and MarM-247 nickel-based superalloy materials of construction. However, very little experimental creep information is available that directly applies to the atypical thin walls, the specific microstructures, and the low stress levels. In addition, the geometry and loading conditions impose multiaxial stress states on the heater head components, far from the conditions of uniaxial testing. For these reasons, experimental benchmark testing is underway to aid in accurately assessing the durability of Stirling heater heads. The investigation supplements uniaxial creep testing with pneumatic testing of heater head test articles at elevated temperatures and with stress levels ranging from one to seven times design stresses. This paper presents experimental methods, results, post-test microstructural analyses, and conclusions for both accelerated and non-accelerated tests. The Stirling projects use the results to calibrate deterministic and probabilistic analytical creep models of the heater heads to predict their lifetimes.
Brandenburg, Jan Gerit; Grimme, Stefan
2014-01-01
We present and evaluate dispersion corrected Hartree-Fock (HF) and Density Functional Theory (DFT) based quantum chemical methods for organic crystal structure prediction. The necessity of correcting for missing long-range electron correlation, also known as van der Waals (vdW) interaction, is pointed out and some methodological issues such as inclusion of three-body dispersion terms are discussed. One of the most efficient and widely used methods is the semi-classical dispersion correction D3. Its applicability for the calculation of sublimation energies is investigated for the benchmark set X23 consisting of 23 small organic crystals. For PBE-D3 the mean absolute deviation (MAD) is below the estimated experimental uncertainty of 1.3 kcal/mol. For two larger π-systems, the equilibrium crystal geometry is investigated and very good agreement with experimental data is found. Since these calculations are carried out with huge plane-wave basis sets they are rather time consuming and routinely applicable only to systems with less than about 200 atoms in the unit cell. Aiming at crystal structure prediction, which involves screening of many structures, a pre-sorting with faster methods is mandatory. Small, atom-centered basis sets can speed up the computation significantly but they suffer greatly from basis set errors. We present the recently developed geometrical counterpoise correction gCP. It is a fast semi-empirical method which corrects for most of the inter- and intramolecular basis set superposition error. For HF calculations with nearly minimal basis sets, we additionally correct for short-range basis incompleteness. We combine all three terms in the HF-3c denoted scheme which performs very well for the X23 sublimation energies with an MAD of only 1.5 kcal/mol, which is close to the huge basis set DFT-D3 result.
NASA Astrophysics Data System (ADS)
Cook, M. J.; Sasagawa, G. S.; Roland, E. C.; Schmidt, D. A.; Wilcock, W. S. D.; Zumberge, M. A.
2017-12-01
Seawater pressure can be used to measure vertical seafloor deformation since small seafloor height changes produce measurable pressure changes. However, resolving secular vertical deformation near subduction zones can be difficult due to pressure gauge drift. A typical gauge drift rate of about 10 cm/year exceeds the expected secular rate of 1 cm/year or less in Cascadia. The absolute self-calibrating pressure recorder (ASCPR) was developed to solve the issue of gauge drift by using a deadweight calibrator to make campaign-style measurements of the absolute seawater pressure. Pressure gauges alternate between observing the ambient seawater pressure and the deadweight calibrator pressure, which is an accurately known reference value, every 10-20 minutes for several hours. The difference between the known reference pressure and the observed seafloor pressure allows offsets and transients to be corrected to determine the true, absolute seafloor pressure. Absolute seafloor pressure measurements provide a great utility for geodetic deformation studies. The measurements provide instrument-independent, benchmark values that can be used far into the future as epoch points in long-term time series or as important calibration points for other continuous pressure records. The ASCPR was first deployed in Cascadia in 2014 and 2015, when seven concrete seafloor benchmarks were placed along a trench-perpendicular profile extending from 20 km to 105 km off the central Oregon coast. Two benchmarks have ASCPR measurements that span three years, one benchmark spans two years, and four benchmarks span one year. Measurement repeatability is currently 3 to 4 cm, but we anticipate accuracy on the order of 1 cm with improvements to the instrument metrology and processing tidal and non-tidal oceanographic signals.
Sawyer S. Scherer; Anthony W. D' Amato; Christel C. Kern; Brian J. Palik; Matthew B. Russell
2016-01-01
Prescribed fire is increasingly being viewed as a valuable tool for mitigating the ecological consequences of long-term fire suppression within fire-adapted forest ecosystems. While the use of burning treatments in northern temperate conifer forests has at times received considerable attention, the long-term (>10 years) effects on forest structure and...
Stirling engine - Approach for long-term durability assessment
NASA Technical Reports Server (NTRS)
Tong, Michael T.; Bartolotta, Paul A.; Halford, Gary R.; Freed, Alan D.
1992-01-01
The approach employed by NASA Lewis for the long-term durability assessment of the Stirling engine hot-section components is summarized. The approach consists of: preliminary structural assessment; development of a viscoplastic constitutive model to accurately determine material behavior under high-temperature thermomechanical loads; an experimental program to characterize material constants for the viscoplastic constitutive model; finite-element thermal analysis and structural analysis using a viscoplastic constitutive model to obtain stress/strain/temperature at the critical location of the hot-section components for life assessment; and development of a life prediction model applicable for long-term durability assessment at high temperatures. The approach should aid in the provision of long-term structural durability and reliability of Stirling engines.
The Concepts "Benchmarks and Benchmarking" Used in Education Planning: Teacher Education as Example
ERIC Educational Resources Information Center
Steyn, H. J.
2015-01-01
Planning in education is a structured activity that includes several phases and steps that take into account several kinds of information (Steyn, Steyn, De Waal & Wolhuter, 2002: 146). One of the sets of information that are usually considered is the (so-called) "benchmarks" and "benchmarking" regarding the focus of a…
Validation of electronic structure methods for isomerization reactions of large organic molecules.
Luo, Sijie; Zhao, Yan; Truhlar, Donald G
2011-08-14
In this work the ISOL24 database of isomerization energies of large organic molecules presented by Huenerbein et al. [Phys. Chem. Chem. Phys., 2010, 12, 6940] is updated, resulting in the new benchmark database called ISOL24/11, and this database is used to test 50 electronic model chemistries. To accomplish the update, the very expensive and highly accurate CCSD(T)-F12a/aug-cc-pVDZ method is first exploited to investigate a six-reaction subset of the 24 reactions, and by comparison of various methods with the benchmark, MCQCISD-MPW is confirmed to be of high accuracy. The final ISOL24/11 database is composed of six reaction energies calculated by CCSD(T)-F12a/aug-cc-pVDZ and 18 calculated by MCQCISD-MPW. We then tested 40 single-component density functionals (both local and hybrid), eight doubly hybrid functionals, and two other methods against ISOL24/11. It is found that the SCS-MP3/CBS method, which was used as the benchmark for the original ISOL24, has an MUE of 1.68 kcal mol(-1), which is close to or larger than the MUEs of some of the best DFT methods tested. Using the new benchmark, we find ωB97X-D and MC3MPWB to be the best single-component and doubly hybrid functionals, respectively, with PBE0-D3 and MC3MPW performing almost as well. The best single-component density functionals without molecular mechanics dispersion-like terms are M08-SO, M08-HX, M05-2X, and M06-2X. The best single-component density functionals without Hartree-Fock exchange are M06-L-D3 when MM terms are included and M06-L when they are not.
Bhat, Virunya S; Hester, Susan D; Nesnow, Stephen; Eastmond, David A
2013-11-01
The ability to anchor chemical class-based gene expression changes to phenotypic lesions and to describe these changes as a function of dose and time informs mode-of-action determinations and improves quantitative risk assessments. Previous global expression profiling identified a 330-probe cluster differentially expressed and commonly responsive to 3 hepatotumorigenic conazoles (cyproconazole, epoxiconazole, and propiconazole) at 30 days. Extended to 2 more conazoles (triadimefon and myclobutanil), the present assessment encompasses 4 tumorigenic and 1 nontumorigenic conazole. Transcriptional benchmark dose levels (BMDL(T)) were estimated for a subset of the cluster with dose-responsive behavior and a ≥ 5-fold increase or decrease in signal intensity at the highest dose. These genes primarily encompassed CAR/RXR activation, P450 metabolism, liver hypertrophy/glutathione depletion, LPS/IL-1-mediated inhibition of RXR, and NRF2-mediated oxidative stress pathways. Median BMDL(T) estimates from the subset were concordant (within a factor of 2.4) with apical benchmark doses (BMDL(A)) for increased liver weight at 30 days for the 5 conazoles. The 30-day median BMDL(T) estimates were within one-half order of magnitude of the chronic BMDL(A) for hepatocellular tumors. Potency differences seen in the dose-responsive transcription of certain phase II metabolism, bile acid detoxification, and lipid oxidation genes mirrored each conazole's tumorigenic potency. The 30-day BMDL(T) corresponded to tumorigenic potency on a milligram per kilogram day basis with cyproconazole > epoxiconazole > propiconazole > triadimefon > myclobutanil (nontumorigenic). These results support the utility of measuring short-term gene expression changes to inform quantitative risk assessments from long-term exposures.
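For readers unfamiliar with the benchmark-dose idea underlying the BMDL(T) and BMDL(A) values above, the sketch below fits a generic Hill-type dose-response curve and solves for the dose at a 10% benchmark response. The model form, the benchmark response level, and all numbers are illustrative assumptions; this is not the dose-response software or data used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit, brentq

def hill(dose, bottom, top, ec50, n):
    return bottom + (top - bottom) * dose**n / (ec50**n + dose**n)

doses = np.array([0.0, 1.0, 3.0, 10.0, 30.0, 100.0])   # mg/kg-day (made up)
response = np.array([1.0, 1.05, 1.2, 1.6, 2.1, 2.4])   # fold change (made up)

params, _ = curve_fit(hill, doses, response, p0=[1.0, 2.5, 10.0, 1.0], maxfev=10000)
bottom = params[0]
bmr = bottom * 1.10   # benchmark response: 10% above control (assumption)

# Benchmark dose: dose at which the fitted curve reaches the benchmark response.
bmd = brentq(lambda d: hill(d, *params) - bmr, 1e-6, doses.max())
print(f"BMD at a 10% benchmark response: {bmd:.2f} mg/kg-day")
```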
Dearing, James W; Beacom, Amanda M; Chamberlain, Stephanie A; Meng, Jingbo; Berta, Whitney B; Keefe, Janice M; Squires, Janet E; Doupe, Malcolm B; Taylor, Deanne; Reid, Robert Colin; Cook, Heather; Cummings, Greta G; Baumbusch, Jennifer L; Knopp-Sihota, Jennifer; Norton, Peter G; Estabrooks, Carole A
2017-02-03
Initiatives to accelerate the adoption and implementation of evidence-based practices benefit from an association with influential individuals and organizations. When opinion leaders advocate or adopt a best practice, others adopt too, resulting in diffusion. We sought to identify existing influence throughout Canada's long-term care sector and the extent to which informal advice-seeking relationships tie the sector together as a network. We conducted a sociometric survey of senior leaders in 958 long-term care facilities operating in 11 of Canada's 13 provinces and territories. We used an integrated knowledge translation approach to involve knowledge users in planning and administering the survey and in analyzing and interpreting the results. Responses from 482 senior leaders generated the names of 794 individuals and 587 organizations as sources of advice for improving resident care in long-term care facilities. A single advice-seeking network appears to span the nation. Proximity exhibits a strong effect on network structure, with provincial inter-organizational networks having more connections and thus a denser structure than interpersonal networks. We found credible individuals and organizations within groups (opinion leaders and opinion-leading organizations) and individuals and organizations that function as weak ties across groups (boundary spanners and bridges) for all studied provinces and territories. A good deal of influence in the Canadian long-term care sector rests with professionals such as provincial health administrators not employed in long-term care facilities. The Canadian long-term care sector is tied together through informal advice-seeking relationships that have given rise to an emergent network structure. Knowledge of this structure and engagement with its opinion leaders and boundary spanners may provide a route for stimulating the adoption and effective implementation of best practices, improving resident care and strengthening the long-term care advice network. We conclude that informal relational pathways hold promise for helping to transform the Canadian long-term care sector.
Benchmarking hypercube hardware and software
NASA Technical Reports Server (NTRS)
Grunwald, Dirk C.; Reed, Daniel A.
1986-01-01
It was long a truism in computer systems design that balanced systems achieve the best performance. Message passing parallel processors are no different. To quantify the balance of a hypercube design, an experimental methodology was developed and the associated suite of benchmarks was applied to several existing hypercubes. The benchmark suite includes tests of both processor speed in the absence of internode communication and message transmission speed as a function of communication patterns.
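A present-day analogue of the message-transmission part of such a benchmark suite is a simple MPI ping-pong loop that times one-way message latency as a function of message size, as sketched below (run with, e.g., mpiexec -n 2). The sizes and repetition count are arbitrary; this is not the original hypercube benchmark code.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
reps = 100

for size in (1, 1024, 65536, 1048576):          # message sizes in bytes
    buf = np.zeros(size, dtype=np.uint8)
    comm.Barrier()
    t0 = MPI.Wtime()
    for _ in range(reps):
        if rank == 0:
            comm.Send(buf, dest=1, tag=0)
            comm.Recv(buf, source=1, tag=1)
        elif rank == 1:
            comm.Recv(buf, source=0, tag=0)
            comm.Send(buf, dest=0, tag=1)
    dt = (MPI.Wtime() - t0) / (2 * reps)        # one-way time per message
    if rank == 0:
        print(f"{size:8d} bytes: {dt * 1e6:8.1f} us one-way")
```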
A role for high frequency hydrochemical sampling in long term ecosystem studies
NASA Astrophysics Data System (ADS)
Sebestyen, S. D.; Shanley, J. B.; Boyer, E. W.; Kendall, C.
2007-12-01
Monitoring of surface waters for major chemical constituents is needed to assess long-term trends and responses to ecological disturbance. However, the typical fixed-interval (weekly, monthly, or quarterly) sampling schemes of most long-term ecosystem studies may not capture the full range of stream chemical variation and do not always provide enough information to discern the landscape processes that control surface water chemistry and solute loadings. To expand upon traditional hydrochemical monitoring, we collected high frequency event-based surface water samples at an upland, forested basin of the Sleepers River Research Watershed (Vermont, USA), one of five intensively studied sites in the Water, Energy, and Biogeochemical Budgets (WEBB) program of the US Geological Survey. We present several examples that highlight the importance of linking long-term weekly data with intensive, high frequency sampling. We used end-member mixing analysis and isotopic approaches to trace sources of stream nutrients (e.g. nitrate, dissolved organic carbon) and quantified how atmospheric pollutants (e.g. nitrogen, sulfate, and mercury) affect stream chemistry. High frequency sampling generates large numbers of samples and is both labor and resource intensive but yields insights into ecosystem functions that are not readily discerned from less-frequent sampling. As the ecological community contemplates the scope and foci of environmental observatories as benchmarks for deciphering the effects of natural and anthropogenic change, incorporating high frequency hydrochemical sampling will further our understanding of ecosystem functions across a range of ecosystem types and disturbance effects.
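End-member mixing analysis, mentioned above as one of the tools applied to the high-frequency samples, reduces to solving a small linear system: tracer concentrations of candidate sources plus a mass-balance row, against the observed stream sample. The sketch below uses made-up tracer values for three hypothetical end-members, not the Sleepers River data.

```python
import numpy as np

# Rows: tracer 1, tracer 2, and the mass-balance constraint (fractions sum to 1).
end_members = np.array([[120.0, 10.0, 35.0],   # tracer 1 in groundwater, soil water, event water
                        [0.8, 2.5, 0.1],       # tracer 2 in the same three sources
                        [1.0, 1.0, 1.0]])
stream_sample = np.array([60.0, 1.4, 1.0])

fractions, *_ = np.linalg.lstsq(end_members, stream_sample, rcond=None)
print("groundwater, soil water, event water fractions:", np.round(fractions, 2))
```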
Radiation Coupling with the FUN3D Unstructured-Grid CFD Code
NASA Technical Reports Server (NTRS)
Wood, William A.
2012-01-01
The HARA radiation code is fully coupled to the FUN3D unstructured-grid CFD code for the purpose of simulating high-energy hypersonic flows. The radiation energy source terms and surface heat transfer, under the tangent slab approximation, are included within the fluid dynamic flow solver. The Fire II flight test, at the Mach-31 1643-second trajectory point, is used as a demonstration case. Comparisons are made with an existing structured-grid capability, the LAURA/HARA coupling. The radiative surface heat transfer rates from the present approach match the benchmark values within 6%. Although radiation coupling is the focus of the present work, convective surface heat transfer rates are also reported, and are seen to vary depending upon the choice of mesh connectivity and FUN3D flux reconstruction algorithm. On a tetrahedral-element mesh the convective heating matches the benchmark at the stagnation point, but under-predicts by 15% on the Fire II shoulder. Conversely, on a mixed-element mesh the convective heating over-predicts at the stagnation point by 20%, but matches the benchmark away from the stagnation region.
Unified constitutive models for high-temperature structural applications
NASA Technical Reports Server (NTRS)
Lindholm, U. S.; Chan, K. S.; Bodner, S. R.; Weber, R. M.; Walker, K. P.
1988-01-01
Unified constitutive models are characterized by the use of a single inelastic strain rate term for treating all aspects of inelastic deformation, including plasticity, creep, and stress relaxation under monotonic or cyclic loading. The structure of this class of constitutive theory pertinent for high temperature structural applications is first outlined and discussed. The effectiveness of the unified approach for representing high temperature deformation of Ni-base alloys is then evaluated by extensive comparison of experimental data and predictions of the Bodner-Partom and the Walker models. The use of the unified approach for hot section structural component analyses is demonstrated by applying the Walker model in finite element analyses of a benchmark notch problem and a turbine blade problem.
A bio-inspired memory model for structural health monitoring
NASA Astrophysics Data System (ADS)
Zheng, Wei; Zhu, Yong
2009-04-01
Long-term structural health monitoring (SHM) systems need intelligent management of the monitoring data. By analogy with the way the human brain processes memories, we present a bio-inspired memory model (BIMM) that does not require prior knowledge of the structure parameters. The model contains three time-domain areas: a sensory memory area, a short-term memory area and a long-term memory area. First, the initial parameters of the structural state are specified to establish safety criteria. Then the large amount of monitoring data that falls within the safety limits is filtered while the data outside the safety limits are captured instantly in the sensory memory area. Second, disturbance signals are distinguished from danger signals in the short-term memory area. Finally, the stable data of the structural balance state are preserved in the long-term memory area. A strategy for priority scheduling via fuzzy c-means for the proposed model is then introduced. An experiment on bridge tower deformation demonstrates that the proposed model can be applied for real-time acquisition, limited-space storage and intelligent mining of the monitoring data in a long-term SHM system.
Comparing the performance of two CBIRS indexing schemes
NASA Astrophysics Data System (ADS)
Mueller, Wolfgang; Robbert, Guenter; Henrich, Andreas
2003-01-01
Content-based image retrieval (CBIR) as it is known today has to deal with a number of challenges. Quickly summarized, the main challenges are, firstly, to bridge the semantic gap between high-level concepts and low-level features using feedback and, secondly, to provide performance under adverse conditions. High-dimensional spaces, as well as a demanding machine learning task, make the right way of indexing an important issue. When indexing multimedia data, most groups opt for extraction of high-dimensional feature vectors from the data, followed by dimensionality reduction such as PCA (Principal Components Analysis) or LSI (Latent Semantic Indexing). The resulting vectors are indexed using spatial indexing structures such as kd-trees or R-trees, for example. Other projects, such as MARS and Viper, propose the adaptation of text indexing techniques, notably the inverted file. Here, the Viper system is the most direct adaptation of text retrieval techniques to quantized vectors. However, while the Viper query engine provides decent performance together with impressive user-feedback behavior, as well as the possibility for easy integration of long-term learning algorithms and support for potentially infinite feature vectors, there has been no comparison of vector-based methods and inverted-file-based methods under similar conditions. In this publication, we compare a CBIR query engine that uses inverted files (Bothrops, a rewrite of the Viper query engine based on a relational database) and a CBIR query engine based on LSD (Local Split Decision) trees for spatial indexing, using the same feature sets. The Benchathlon initiative works on providing a set of images and ground truth for simulating image queries by example and corresponding user feedback. When performing the Benchathlon benchmark on a CBIR system (the System Under Test, SUT), a benchmarking harness connects over the internet to the SUT, performing a number of queries using an agreed-upon protocol, the Multimedia Retrieval Markup Language (MRML). Using this benchmark one can measure the quality of retrieval, as well as the overall (speed) performance of the benchmarked system. Our benchmarks will draw on the Benchathlon's work for documenting the retrieval performance of both inverted-file-based and LSD-tree-based techniques. However, in addition to these results, we will present statistics that can be obtained only inside the system under test. These statistics will include the number of complex mathematical operations, as well as the amount of data that has to be read from disk during the processing of a query.
Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.
Mørk, Søren; Holmes, Ian
2012-03-01
Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders and two novel model structures. We test standard versions as well as ADPH length modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two currently most used model structures is the best performing in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.
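The statistical information criteria used above to rank HMM structures are straightforward to compute once a model's log-likelihood and parameter count are known; a minimal sketch follows, with placeholder numbers standing in for fitted gene-finder models.

```python
import numpy as np

def aic(loglik, n_params):
    # Akaike information criterion: lower is better.
    return 2 * n_params - 2 * loglik

def bic(loglik, n_params, n_obs):
    # Bayesian information criterion: penalizes parameters by log(sample size).
    return n_params * np.log(n_obs) - 2 * loglik

# Two hypothetical candidate structures scored on the same genome.
print(aic(loglik=-1.23e6, n_params=190), bic(loglik=-1.23e6, n_params=190, n_obs=4_600_000))
print(aic(loglik=-1.21e6, n_params=610), bic(loglik=-1.21e6, n_params=610, n_obs=4_600_000))
```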
Ivbijaro, Go; Kolkiewicz, LA; McGee, Lsf; Gikunoo, M
2008-03-01
Objectives: This audit aims to evaluate the effectiveness of delivering an equivalent primary care service to a long-term forensic psychiatric inpatient population, using the UK primary care national Quality and Outcomes Framework (QOF). Method: The audit compares the targets met by the general practitioner with special interest (GPwSI) service, using local and national QOF benchmarks (2005-2006), and determines the prevalence of chronic disease in a long-term inpatient forensic psychiatry population. Results: The audit results show that the UK national QOF is a useful tool for assessment and evaluation of physical healthcare needs in a non-community based population. It shows an increased prevalence of all QOF-assessed long-term physical conditions when compared to the local East London population and national UK population, confirming previously reported elevated levels of physical healthcare need in psychiatric populations. Conclusions: This audit shows that the UK General Practice QOF can be used as a standardised instrument for commissioning and monitoring the delivery of physical health services to in-patient psychiatric populations, and for the evaluation of the effectiveness of clinical interventions in long-term physical conditions. The audit also demonstrates the effectiveness of using a GPwSI in healthcare delivery in non-community based settings. We suggest that the findings may be generalisable to other long-term inpatient psychiatric and prison populations in order to further the objective of delivering an equivalent primary care service to all populations. The QOF is a set of national primary care audit standards and is freely available on the British Medical Association website or the UK Department of Health website. We suggest that primary care workers in health economies who have not yet developed their own national primary care standards can access and adapt these standards in order to improve the clinical standards of care given to the primary care populations that they serve.
Environmental modeling and recognition for an autonomous land vehicle
NASA Technical Reports Server (NTRS)
Lawton, D. T.; Levitt, T. S.; Mcconnell, C. C.; Nelson, P. C.
1987-01-01
An architecture for object modeling and recognition for an autonomous land vehicle is presented. Examples of objects of interest include terrain features, fields, roads, horizon features, trees, etc. The architecture is organized around a set of databases for generic object models and perceptual structures, temporary memory for the instantiation of object and relational hypotheses, and a long-term memory for storing stable hypotheses that are affixed to the terrain representation. Multiple inference processes operate over these databases. The researchers describe these particular components: the perceptual structure database, the grouping processes that operate over it, schemas, and the long-term terrain database. A processing example is given that matches predictions from the long-term terrain model to imagery, extracts significant perceptual structures for consideration as potential landmarks, and extracts a relational structure to update the long-term terrain database.
Educational network comparative analysis of small groups: Short- and long-term communications
NASA Astrophysics Data System (ADS)
Berg, D. B.; Zvereva, O. M.; Nazarova, Yu. Yu.; Chepurov, E. G.; Kokovin, A. V.; Ranyuk, S. V.
2017-11-01
The present study is devoted to the discussion of small-group communication network structures. These communications were observed in student groups, where actors were united by a regular educational activity. The comparative analysis was carried out for networks of short-term (1 hour) and long-term (4 weeks) communications; it was based on seven structural parameters and consisted of two stages. At the first stage, differences between the network graphs were examined, and the corresponding random Bernoulli graphs were built. At the second stage, the revealed differences were compared. Calculations were performed using the UCINET software framework. It was found that networks of long-term and short-term communications are quite different: the structure of a short-term communication network is close to a random one, whereas most of the long-term communication network parameters differ from the corresponding random ones by more than 30%. This difference can be explained by the strong "noisiness" of a short-term communication network and the lack of social structure in it.
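A rough sketch of the two-stage comparison described above, using networkx instead of UCINET: structural parameters of an observed graph are compared against a Bernoulli (Erdős-Rényi) random graph of matching size and density. The edge list and the three parameters shown are placeholders for the study's seven.

```python
import networkx as nx

# Placeholder communication network (who talked to whom).
observed = nx.Graph([(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5), (2, 5)])
n, p = observed.number_of_nodes(), nx.density(observed)

# Bernoulli random graph with the same number of nodes and the same density.
random_ref = nx.gnp_random_graph(n, p, seed=1)

for name, fn in [("density", nx.density),
                 ("transitivity", nx.transitivity),
                 ("avg clustering", nx.average_clustering)]:
    print(f"{name:15s} observed={fn(observed):.3f} random={fn(random_ref):.3f}")
```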
How to benchmark methods for structure-based virtual screening of large compound libraries.
Christofferson, Andrew J; Huang, Niu
2012-01-01
Structure-based virtual screening is a useful computational technique for ligand discovery. To systematically evaluate different docking approaches, it is important to have a consistent benchmarking protocol that is both relevant and unbiased. Here, we describe the design of a benchmarking data set for docking screen assessment, a standard docking screening process, and the analysis and presentation of the enrichment of annotated ligands among a background decoy database.
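One common way to present the enrichment of annotated ligands over decoys, as described above, is the enrichment factor in the top-ranked fraction of the database. The sketch below computes it from docking scores, using randomly generated scores in place of a real screen.

```python
import numpy as np

def enrichment_factor(scores, is_active, top_fraction=0.01):
    # Rank by score (lower, i.e. more negative, is better), then compare the
    # hit rate in the top fraction to the hit rate in the whole database.
    order = np.argsort(scores)
    n_top = max(1, int(top_fraction * len(scores)))
    hits_top = is_active[order][:n_top].sum()
    return (hits_top / n_top) / (is_active.mean() or 1e-12)

rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(-8, 1, 50), rng.normal(-6, 1, 5000)])  # actives, decoys
labels = np.concatenate([np.ones(50, bool), np.zeros(5000, bool)])
print(f"EF(1%) = {enrichment_factor(scores, labels):.1f}")
```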
Hung, Linda; Bruneval, Fabien; Baishya, Kopinjol; ...
2017-04-07
Energies from the GW approximation and the Bethe–Salpeter equation (BSE) are benchmarked against the excitation energies of transition-metal (Cu, Zn, Ag, and Cd) single atoms and monoxide anions. We demonstrate that best estimates of GW quasiparticle energies at the complete basis set limit should be obtained via extrapolation or closure relations, while numerically converged GW-BSE eigenvalues can be obtained on a finite basis set. Calculations using real-space wave functions and pseudopotentials are shown to give best-estimate GW energies that agree (up to the extrapolation error) with calculations using all-electron Gaussian basis sets. We benchmark the effects of a vertex approximation (ΓLDA) and the mean-field starting point in GW and the BSE, performing computations using a real-space, transition-space basis and scalar-relativistic pseudopotentials. Here, while no variant of GW improves on perturbative G0W0 at predicting ionization energies, G0W0ΓLDA-BSE computations give excellent agreement with experimental absorption spectra as long as off-diagonal self-energy terms are included. We also present G0W0 quasiparticle energies for the CuO–, ZnO–, AgO–, and CdO– anions, in comparison to available anion photoelectron spectra.
Reviews and syntheses: Field data to benchmark the carbon cycle models for tropical forests
Clark, Deborah A.; Asao, Shinichi; Fisher, Rosie A.; Reed, Sasha C.; Reich, Peter B.; Ryan, Michael G.; Wood, Tana E.; Yang, Xiaojuan
2017-01-01
For more accurate projections of both the global carbon (C) cycle and the changing climate, a critical current need is to improve the representation of tropical forests in Earth system models. Tropical forests exchange more C, energy, and water with the atmosphere than any other class of land ecosystems. Further, tropical-forest C cycling is likely responding to the rapid global warming, intensifying water stress, and increasing atmospheric CO2 levels. Projections of the future C balance of the tropics vary widely among global models. A current effort of the modeling community, the ILAMB (International Land Model Benchmarking) project, is to compile robust observations that can be used to improve the accuracy and realism of the land models for all major biomes. Our goal with this paper is to identify field observations of tropical-forest ecosystem C stocks and fluxes, and of their long-term trends and climatic and CO2 sensitivities, that can serve this effort. We propose criteria for reference-level field data from this biome and present a set of documented examples from old-growth lowland tropical forests. We offer these as a starting point towards the goal of a regularly updated consensus set of benchmark field observations of C cycling in tropical forests.
Baumung, Claudia; Rehm, Jürgen; Franke, Heike; Lachenmeier, Dirk W.
2016-01-01
Nicotine was not included in previous efforts to identify the most important toxicants of tobacco smoke. A health risk assessment of nicotine for smokers of cigarettes was conducted using the margin of exposure (MOE) approach and the results were compared to literature MOEs of various other tobacco toxicants. The MOE is defined as the ratio between a toxicological threshold (benchmark dose) and the estimated human intake. Dose-response modelling of human and animal data was used to derive the benchmark dose. The MOE was calculated using probabilistic Monte Carlo simulations for daily cigarette smokers. Benchmark dose values ranged from 0.004 mg/kg bodyweight for symptoms of intoxication in children to 3 mg/kg bodyweight for mortality in animals; MOEs ranged from below 1 up to 7.6, indicating a considerable consumer risk. The dimension of the MOEs is similar to those of other tobacco toxicants with high concerns relating to adverse health effects, such as acrolein or formaldehyde. Owing to the lack of toxicological data, in particular relating to cancer, long-term animal testing studies for nicotine are urgently necessary. There is an immediate need for action concerning the risk of nicotine, also with regard to electronic cigarettes and smokeless tobacco.
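A minimal sketch of the MOE calculation described above: the ratio of a benchmark dose to estimated daily intake, propagated by Monte Carlo over uncertain inputs. All distribution parameters below are illustrative assumptions, not the values fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

bmd = rng.lognormal(mean=np.log(0.8), sigma=0.5, size=n)      # mg/kg bw (assumed)
cigarettes = rng.normal(15, 5, size=n).clip(1, 60)            # cigarettes per day
nicotine_per_cig = rng.normal(1.2, 0.3, size=n).clip(0.3, 3)  # mg absorbed per cigarette
body_weight = rng.normal(70, 12, size=n).clip(40, 150)        # kg

intake = cigarettes * nicotine_per_cig / body_weight          # mg/kg bw per day
moe = bmd / intake
print(f"median MOE: {np.median(moe):.1f}; 5th percentile: {np.percentile(moe, 5):.1f}")
```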
ERIC Educational Resources Information Center
Crossland, John
2011-01-01
The English National Curriculum Programmes of Study emphasise the importance of knowledge, understanding and skills, and teachers are well versed in structuring learning in those terms. Research outcomes into how long-term memory is stored and retrieved provide support for structuring learning in this way. Four further messages are added to the…
Advanced moisture modeling of polymer composites.
DOT National Transportation Integrated Search
2014-04-01
Long-term moisture exposure has been shown to affect the mechanical performance of polymeric composite structures. This reduction in mechanical performance must be considered during product design in order to ensure long-term structure survival. In...
A statistical summary of data from the U.S. Geological Survey's national water quality networks
Smith, R.A.; Alexander, R.B.
1983-01-01
The U.S. Geological Survey operates two nationwide networks to monitor water quality, the National Hydrologic Bench-Mark Network and the National Stream Quality Accounting Network (NASQAN). The Bench-Mark network is composed of 51 stations in small drainage basins which are as close as possible to their natural state, with no human influence and little likelihood of future development. Stations in the NASQAN program are located to monitor flow from accounting units (subregional drainage basins) which collectively encompass the entire land surface of the nation. Data collected at both networks include streamflow, concentrations of major inorganic constituents, nutrients, and trace metals. The goals of the two water quality sampling programs include the determination of mean constituent concentrations and transport rates as well as the analysis of long-term trends in those variables. This report presents a station-by-station statistical summary of data from the two networks for the period 1974 through 1981. (Author's abstract)
Exterior view of Long-Term Oxidizer Silo (T-28D) in left background ...
Exterior view of Long-Term Oxidizer Silo (T-28D) in left background (taller structure) and adjacent Oxidizer Conditioning Structure (T-28B) at extreme left background, looking south. At far right in foreground is a nitrogen tank in a concrete truck well - Air Force Plant PJKS, Systems Integration Laboratory, Long-Term Oxidizer Silo, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO
Ozgul, Arpat; Oli, Madan K; Armitage, Kenneth B; Blumstein, Daniel T; Van Vuren, Dirk H
2009-04-01
Despite recent advances in biodemography and metapopulation ecology, we still have limited understanding of how local demographic parameters influence short- and long-term metapopulation dynamics. We used long-term data from 17 local populations, along with the recently developed methods of matrix metapopulation modeling and transient sensitivity analysis, to investigate the influence of local demography on long-term (asymptotic) versus short-term (transient) dynamics of a yellow-bellied marmot metapopulation in Colorado. Both long- and short-term dynamics depended primarily on a few colony sites and were highly sensitive to changes in demography at these sites, particularly in survival of reproductive adult females. Interestingly, the relative importance of sites differed between long- and short-term dynamics; the spatial structure and local population sizes, while insignificant for asymptotic dynamics, were influential on transient dynamics. However, considering the spatial structure was uninformative about the relative influence of local demography on metapopulation dynamics. The vital rates that were the most influential on local dynamics were also the most influential on both long- and short-term metapopulation dynamics. Our results show that an explicit consideration of local demography is essential for a complete understanding of the dynamics and persistence of spatially structured populations.
Non Covalent Interactions in Large Diamondoid Dimers in the Gas Phase - a Microwave Study
NASA Astrophysics Data System (ADS)
Perez, Cristobal; Sekutor, Marina; Fokin, Andrey A.; Blomeyer, Sebastian; Vishnevskiy, Yury V.; Mitzel, Norbert W.; Schreiner, Peter R.; Schnell, Melanie
2017-06-01
Accurate structure determination of large molecules still represents an ambitious challenge. Interesting benchmark systems for structure determination are large diamondoid dimers, whose structures are governed by strong intramolecular interactions. Recently, diamondoid dimers with unusually long central C-C bonds (up to 1.71 Å) were synthesized. This long central C-C bond was rationalized by numerous CH...HC-type dispersion attractions between the two halves of the molecule. The thermodynamic stabilization of molecules equipped with bulky groups has provided a conceptually new rationale, since until then it had been assumed that such molecules are highly unstable. We performed a broadband CP-FTMW spectroscopy study in the 2-8 GHz frequency range on oxygen-substituted diamondoid dimers (C_{26}H_{34}O_2, 28 heavy atoms) as well as diadamantyl ether to provide further insight into their structures. The experimental data are compared with results from quantum-chemical calculations and gas-phase electron diffraction. For the ether, we even obtained ^{13}C and ^{18}O isotopologues to generate the full heavy-atom substitution structure.
NASA Astrophysics Data System (ADS)
de Lautour, Oliver R.; Omenzetter, Piotr
2010-07-01
Developed for studying long sequences of regularly sampled data, time series analysis methods are being increasingly investigated for use in Structural Health Monitoring (SHM). In this research, Autoregressive (AR) models were used to fit the acceleration time histories obtained from two experimental structures: a 3-storey bookshelf structure and the ASCE Phase II Experimental SHM Benchmark Structure, in the undamaged state and a limited number of damaged states. The coefficients of the AR models were considered to be damage-sensitive features and used as input to an Artificial Neural Network (ANN). The ANN was trained to classify damage cases or estimate remaining structural stiffness. The results showed that the combination of AR models and ANNs is an efficient tool for damage classification and estimation, and performs well using a small number of damage-sensitive features and a limited number of sensors.
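The feature pipeline described above, AR coefficients as damage-sensitive features feeding a small neural network, can be sketched as follows with synthetic acceleration records standing in for the bookshelf and ASCE benchmark data; the AR order, network size, and simulated "damage" are assumptions.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from sklearn.neural_network import MLPClassifier

def ar_features(signal, order=10):
    # Damage-sensitive features: fitted AR coefficients (the constant term is dropped).
    return AutoReg(signal, lags=order).fit().params[1:]

rng = np.random.default_rng(0)
X, y = [], []
for label, shift in [(0, 0.0), (1, 0.3)]:      # 0 = undamaged, 1 = simulated "damaged" state
    for _ in range(40):
        sig = rng.normal(size=2000) + shift * np.sin(np.linspace(0, 50, 2000))
        X.append(ar_features(sig))
        y.append(label)

clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```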
Implementation of Programmatic Quality and the Impact on Safety
NASA Technical Reports Server (NTRS)
Huls, Dale Thomas; Meehan, Kevin
2005-01-01
The purpose of this paper is to discuss the implementation of a programmatic quality assurance discipline within the International Space Station Program and the resulting impact on safety. NASA culture has continued to stress safety at the expense of quality when both are extremely important and both can equally influence the success or failure of a Program or Mission. Although safety was heavily criticized in the media after Columbia, a strong case can be made that it was the failure of quality processes and quality assurance in all processes that eventually led to the Columbia accident. Consequently, it is possible to have good quality processes without safety, but it is impossible to have good safety processes without quality. The ISS Program quality assurance function was analyzed as representative of the long-term manned missions that are consistent with the President's Vision for Space Exploration. Background topics are as follows: The quality assurance organizational structure within the ISS Program and the interrelationships between various internal and external organizations. ISS Program quality roles and responsibilities with respect to internal Program Offices and other external organizations such as the Shuttle Program, JSC Directorates, NASA Headquarters, NASA Contractors, other NASA Centers, and International Partners/participants will be addressed. A detailed analysis of implemented quality assurance responsibilities and functions with respect to NASA Headquarters, the JSC S&MA Directorate, and the ISS Program will be presented. Discussion topics are as follows: A comparison of quality and safety resources in terms of staffing, training, experience, and certifications. A benchmark assessment of the lessons learned from the Columbia Accident Investigation Board (CAIB) Report (and follow-up reports and assessments), NASA Benchmarking, and traditional quality assurance activities against ISS quality procedures and practices. The lack of a coherent operational and sustaining quality assurance strategy for long-term manned space flight. An analysis of the ISS waiver processes and the Problem Reporting and Corrective Action (PRACA) process implemented as quality functions. Impact of current ISS Program procedures and practices with regard to operational safety and risk. A discussion regarding a "defense-in-depth" approach to quality functions will be provided to address the issue of "integration vs independence" with respect to the roles of Programs, NASA Centers, and NASA Headquarters. Generic recommendations are offered to address the inadequacies identified in the implementation of ISS quality assurance. A reassessment by the NASA community regarding the importance of a "quality culture" as a component within a larger "safety culture" will generate a more effective and value-added functionality that will ultimately enhance safety.
Carbon nanocages: A new support material for Pt catalyst with remarkably high durability
Wang, Xiao Xia; Tan, Zhe Hua; Zeng, Min; Wang, Jian Nong
2014-01-01
Low durability is the major challenge hindering the large-scale implementation of proton exchange membrane fuel cell (PEMFC) technology, and corrosion of the carbon support materials of current catalysts is the main cause. Here, we describe the finding of remarkably high durability with the use of a novel support material. This material is based on hollow carbon nanocages developed with a high degree of graphitization and concurrent nitrogen doping for oxidation resistance enhancement, uniform deposition of fine Pt particles, and strong Pt-support interaction. Accelerated degradation testing shows that the designed catalyst possesses superior electrochemical activity and long-term stability for both hydrogen oxidation and oxygen reduction relative to industry benchmarks of current catalysts. Further testing under conditions of practical fuel cell operation reveals almost no degradation over long-term cycling. Such a catalyst of high activity and, particularly, high durability opens the door for the next-generation PEMFC for “real world” application. PMID:24658614
Long-term scale adaptive tracking with kernel correlation filters
NASA Astrophysics Data System (ADS)
Wang, Yueren; Zhang, Hong; Zhang, Lei; Yang, Yifan; Sun, Mingui
2018-04-01
Object tracking in video sequences has broad applications in both military and civilian domains. However, as the length of the input video sequence increases, a number of problems arise, such as severe object occlusion, object appearance variation, and object out-of-view (some portion or the entire object leaves the image space). To deal with these problems and identify the object being tracked against a cluttered background, we present a robust appearance model using Speeded Up Robust Features (SURF) and advanced integrated features consisting of Felzenszwalb's Histogram of Oriented Gradients (FHOG) and color attributes. Since re-detection is essential in long-term tracking, we develop an effective object re-detection strategy based on moving area detection. We employ the popular kernel correlation filters in our algorithm design, which facilitates high-speed object tracking. Our evaluation using the CVPR2013 Object Tracking Benchmark (OTB2013) dataset illustrates that the proposed algorithm outperforms reference state-of-the-art trackers in various challenging scenarios.
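As a rough illustration of the correlation-filter machinery such trackers build on, the sketch below trains a single-channel linear correlation filter in the Fourier domain (MOSSE-style); the SURF/FHOG features, kernelization, scale handling, and re-detection described in the abstract are omitted, and all sizes and constants are assumptions.

```python
import numpy as np

def train_filter(patch, target_response, lam=1e-3):
    """Closed-form linear correlation filter in the Fourier domain."""
    X, Y = np.fft.fft2(patch), np.fft.fft2(target_response)
    return (Y * np.conj(X)) / (X * np.conj(X) + lam)

def detect(filter_hat, patch):
    """Correlate the filter with a patch and return the peak (row, col)."""
    response = np.real(np.fft.ifft2(filter_hat * np.fft.fft2(patch)))
    return np.unravel_index(np.argmax(response), response.shape)

size = 64
yy, xx = np.mgrid[:size, :size]
# Desired response: a narrow Gaussian centred on the object position.
target = np.exp(-((yy - size // 2) ** 2 + (xx - size // 2) ** 2) / (2 * 3.0 ** 2))
patch = np.random.default_rng(0).standard_normal((size, size))

H = train_filter(patch, target)
print(detect(H, patch))   # peaks near the centre for the training patch
```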
A probabilistic framework for the cover effect in bedrock erosion
NASA Astrophysics Data System (ADS)
Turowski, Jens M.; Hodge, Rebecca
2017-06-01
The cover effect in fluvial bedrock erosion is a major control on bedrock channel morphology and long-term channel dynamics. Here, we suggest a probabilistic framework for the description of the cover effect that can be applied to field, laboratory, and modelling data and thus allows the comparison of results from different sources. The framework describes the formation of sediment cover as a function of the probability of sediment being deposited on already alluviated areas of the bed. We define benchmark cases and suggest physical interpretations of deviations from these benchmarks. Furthermore, we develop a reach-scale model for sediment transfer in a bedrock channel and use it to clarify the relations between the sediment mass residing on the bed, the exposed bedrock fraction, and the transport stage. We derive system timescales and investigate cover response to cyclic perturbations. The model predicts that bedrock channels can achieve grade in steady state by adjusting bed cover. Thus, bedrock channels have at least two characteristic timescales of response. Over short timescales, the degree of bed cover is adjusted such that the supplied sediment load can just be transported, while over long timescales, channel morphology evolves such that the bedrock incision rate matches the tectonic uplift or base-level lowering rate.
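One natural benchmark case such a framework can be compared against is purely random deposition, in which every grain is equally likely to land anywhere on the bed regardless of existing cover; the short simulation below contrasts the resulting exposed bedrock fraction with its analytical expectation. The grid size and grain count are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells = 10_000     # bed discretised into equal-area cells
n_grains = 5_000     # grains supplied to the reach

# Random-deposition benchmark: each grain lands on a uniformly chosen cell.
covered = np.zeros(n_cells, dtype=bool)
covered[rng.integers(0, n_cells, size=n_grains)] = True

exposed_sim = 1.0 - covered.mean()
exposed_theory = (1.0 - 1.0 / n_cells) ** n_grains   # probability a cell is never hit
print(f"simulated exposed fraction: {exposed_sim:.3f}")
print(f"analytical expectation:     {exposed_theory:.3f}")
```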
Abiotic and biotic determinants of leaf carbon exchange capacity from tropical to high boreal biomes
NASA Astrophysics Data System (ADS)
Smith, N. G.; Dukes, J. S.
2016-12-01
Photosynthesis and respiration on land represent the two largest fluxes of carbon dioxide between the atmosphere and the Earth's surface. As such, the Earth System Models that are used to project climate change are highly sensitive to these processes. Studies have found that much of the uncertainty in these models is due to the formulation and parameterization of plant photosynthetic and respiratory capacity. Here, we quantified the abiotic and biotic factors that determine photosynthetic and respiratory capacity at large spatial scales. Specifically, we measured the maximum rate of Rubisco carboxylation (Vcmax), the maximum rate of Ribulose-1,5-bisphosphate regeneration (Jmax), and leaf dark respiration (Rd) in >600 individuals of 98 plant species from the tropical to high boreal biomes of Northern and Central America. We also measured a bevy of covariates including plant functional type, leaf nitrogen content, short- and long-term climate, leaf water potential, plant size, and leaf mass per area. We found that plant functional type and leaf nitrogen content were the primary determinants of Vcmax, Jmax, and Rd. Mean annual temperature and mean annual precipitation were not significant predictors of these rates. However, short-term climatic variables, specifically soil moisture and air temperature over the previous 25 days, were significant predictors and indicated that heat and soil moisture deficits combine to reduce photosynthetic capacity and increase respiratory capacity. Finally, these data were used as a model benchmarking tool for the Community Land Model version 4.5 (CLM 4.5). The benchmarking analyses revealed errors in the leaf nitrogen allocation scheme of CLM 4.5. Under high leaf nitrogen levels within a plant type, the model overestimated Vcmax and Jmax. This result suggested that plants were altering their nitrogen allocation patterns when leaf nitrogen levels were high, an effect that was not being captured by the model. These data, taken with models in mind, provide paths forward for improving model structure and parameterization of leaf carbon exchange at large spatial scales.
NASA Technical Reports Server (NTRS)
VanderWijngaart, Rob; Frumkin, Michael; Biegel, Bryan A. (Technical Monitor)
2002-01-01
We provide a paper-and-pencil specification of a benchmark suite for computational grids. It is based on the NAS (NASA Advanced Supercomputing) Parallel Benchmarks (NPB) and is called the NAS Grid Benchmarks (NGB). NGB problems are presented as data flow graphs encapsulating an instance of a slightly modified NPB task in each graph node, which communicates with other nodes by sending/receiving initialization data. Like NPB, NGB specifies several different classes (problem sizes). In this report we describe classes S, W, and A, and provide verification values for each. The implementor has the freedom to choose any language, grid environment, security model, fault tolerance/error correction mechanism, etc., as long as the resulting implementation passes the verification test and reports the turnaround time of the benchmark.
The non-random walk of stock prices: the long-term correlation between signs and sizes
NASA Astrophysics Data System (ADS)
La Spada, G.; Farmer, J. D.; Lillo, F.
2008-08-01
We investigate the random walk of prices by developing a simple model relating the properties of the signs and absolute values of individual price changes to the diffusion rate (volatility) of prices at longer time scales. We show that this benchmark model is unable to reproduce the diffusion properties of real prices. Specifically, we find that for one hour intervals this model consistently over-predicts the volatility of real price series by about 70%, and that this effect becomes stronger as the length of the intervals increases. By selectively shuffling some components of the data while preserving others we are able to show that this discrepancy is caused by a subtle but long-range non-contemporaneous correlation between the signs and sizes of individual returns. We conjecture that this is related to the long-memory of transaction signs and the need to enforce market efficiency.
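The shuffling experiment described here can be reproduced in outline: randomly permute the signs of the returns while keeping their absolute values fixed, which destroys any sign-size correlation, and compare the aggregated volatility before and after. The synthetic heavy-tailed series below is a stand-in for real transaction-level returns.

```python
import numpy as np

rng = np.random.default_rng(42)
returns = rng.standard_t(df=4, size=100_000) * 0.001   # placeholder tick-level returns

def aggregated_vol(r, window=3600):
    """Standard deviation of returns summed over non-overlapping windows."""
    n = len(r) // window * window
    return r[:n].reshape(-1, window).sum(axis=1).std()

# Destroy sign-size correlations while preserving both marginal distributions.
signs = np.sign(returns)
rng.shuffle(signs)
shuffled = signs * np.abs(returns)

print("original hourly-scale vol:", aggregated_vol(returns))
print("shuffled hourly-scale vol:", aggregated_vol(shuffled))
```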
Laser beam self-focusing in turbulent dissipative media.
Hafizi, B; Peñano, J R; Palastro, J P; Fischer, R P; DiComo, G
2017-01-15
A high-power laser beam propagating through a dielectric in the presence of fluctuations is subject to diffraction, dissipation, and optical Kerr nonlinearity. A method of moments was applied to a stochastic, nonlinear enveloped wave equation to analyze the evolution of the long-term spot radius. For propagation in atmospheric turbulence described by a Kolmogorov-von Kármán spectral density, the analysis was benchmarked against field experiments in the low-power limit and compared with simulation results in the high-power regime. Dissipation reduced the effect of self-focusing and led to chromatic aberration.
2010-01-01
Changes to the glycosylation profile on HIV gp120 can influence viral pathogenesis and alter AIDS disease progression. The characterization of glycosylation differences at the sequence level is inadequate as the placement of carbohydrates is structurally complex. However, no structural framework is available to date for the study of HIV disease progression. In this study, we propose a novel machine-learning based framework for the prediction of AIDS disease progression in three stages (RP, SP, and LTNP) using the HIV structural gp120 profile. This new intelligent framework proves to be accurate and provides an important benchmark for predicting AIDS disease progression computationally. The model is trained using a novel HIV gp120 glycosylation structural profile to detect possible stages of AIDS disease progression for the target sequences of HIV+ individuals. The performance of the proposed model was compared to seven different existing machine-learning models on the newly proposed gp120-Benchmark_1 dataset in terms of error-rate (MSE), accuracy (CCI), stability (STD), and complexity (TBM). The novel framework showed better predictive performance with 67.82% CCI, 30.21 MSE, 0.8 STD, and 2.62 TBM on the three stages of AIDS disease progression of 50 HIV+ individuals. This framework is an invaluable bioinformatics tool that will be useful to the clinical assessment of viral pathogenesis. PMID:21143806
Single molecule sequencing-guided scaffolding and correction of draft assemblies.
Zhu, Shenglong; Chen, Danny Z; Emrich, Scott J
2017-12-06
Although single molecule sequencing is still improving, the long lengths of the generated sequences are a clear advantage in genome assembly. Prior work that utilizes long reads to conduct genome assembly has mostly focused on correcting sequencing errors and improving contiguity of de novo assemblies. We propose a disassembling-reassembling approach for both correcting structural errors in the draft assembly and scaffolding a target assembly based on error-corrected single molecule sequences. To achieve this goal, we formulate a maximum alternating path cover problem. We prove that this problem is NP-hard, and solve it by a 2-approximation algorithm. Our experimental results show that our approach can improve the structural correctness of target assemblies at the cost of some contiguity, even with smaller amounts of long reads. In addition, our reassembling process can also serve as a competitive scaffolder relative to well-established assembly benchmarks.
Economics of carbon dioxide capture and utilization-a supply and demand perspective.
Naims, Henriette
2016-11-01
Lately, the technical research on carbon dioxide capture and utilization (CCU) has achieved important breakthroughs. While single CO2-based innovations are entering the markets, the possible economic effects of a large-scale CO2 utilization still remain unclear to policy makers and the public. Hence, this paper reviews the literature on CCU and provides insights on the motivations and potential of making use of recovered CO2 emissions as a commodity in the industrial production of materials and fuels. By analyzing data on current global CO2 supply from industrial sources, best practice benchmark capture costs and the demand potential of CO2 utilization and storage scenarios with comparative statics, conclusions can be drawn on the role of different CO2 sources. For near-term scenarios, the demand for the commodity CO2 can be covered from industrial processes that emit CO2 at a high purity and low benchmark capture cost of approximately 33 €/t. In the long term, with synthetic fuel production and large-scale CO2 utilization, CO2 is likely to be available from a variety of processes at benchmark costs of approx. 65 €/t. Even if fossil-fired power generation is phased out, the CO2 emissions of current industrial processes would suffice for ambitious CCU demand scenarios. At current economic conditions, the business case for CO2 utilization is technology specific and depends on whether efficiency gains or substitution of volatile priced raw materials can be achieved. Overall, it is argued that CCU should be advanced complementary to mitigation technologies and can unfold its potential in creating local circular economy solutions.
Huang, Shuai; Li, Jing; Ye, Jieping; Fleisher, Adam; Chen, Kewei; Wu, Teresa; Reiman, Eric
2013-06-01
Structure learning of Bayesian Networks (BNs) is an important topic in machine learning. Driven by modern applications in genetics and brain sciences, accurate and efficient learning of large-scale BN structures from high-dimensional data becomes a challenging problem. To tackle this challenge, we propose a Sparse Bayesian Network (SBN) structure learning algorithm that employs a novel formulation involving one L1-norm penalty term to impose sparsity and another penalty term to ensure that the learned BN is a Directed Acyclic Graph--a required property of BNs. Through both theoretical analysis and extensive experiments on 11 moderate and large benchmark networks with various sample sizes, we show that SBN leads to improved learning accuracy, scalability, and efficiency as compared with 10 existing popular BN learning algorithms. We apply SBN to a real-world application of brain connectivity modeling for Alzheimer's disease (AD) and reveal findings that could lead to advancements in AD research.
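The SBN algorithm itself is not reproduced here; as a rough illustration of how an L1 penalty and an acyclicity requirement interact, the sketch below learns sparse parent sets with lasso regressions under a fixed variable ordering, which guarantees a DAG by construction. The fixed-ordering assumption is a simplification not made in the paper, and the data and penalty strength are placeholders.

```python
import numpy as np
from sklearn.linear_model import Lasso

def ordered_sparse_dag(data, alpha=0.1):
    """Learn a sparse weighted DAG given a fixed topological order of the columns."""
    n_vars = data.shape[1]
    adjacency = np.zeros((n_vars, n_vars))
    for j in range(1, n_vars):
        # L1-penalised regression of variable j on its predecessors only,
        # so the resulting graph can never contain a cycle.
        model = Lasso(alpha=alpha).fit(data[:, :j], data[:, j])
        adjacency[:j, j] = model.coef_
    return adjacency

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))
X[:, 3] += 2.0 * X[:, 1]            # planted dependency: node 1 -> node 3
print(np.round(ordered_sparse_dag(X), 2))
```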
Memory-Intensive Benchmarks: IRAM vs. Cache-Based Machines
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Gaeke, Brian R.; Husbands, Parry; Li, Xiaoye S.; Oliker, Leonid; Yelick, Katherine A.; Biegel, Bryan (Technical Monitor)
2002-01-01
The increasing gap between processor and memory performance has led to new architectural models for memory-intensive applications. In this paper, we explore the performance of a set of memory-intensive benchmarks and use them to compare the performance of conventional cache-based microprocessors to a mixed logic and DRAM processor called VIRAM. The benchmarks are based on problem statements, rather than specific implementations, and in each case we explore the fundamental hardware requirements of the problem, as well as alternative algorithms and data structures that can help expose fine-grained parallelism or simplify memory access patterns. The benchmarks are characterized by their memory access patterns, their basic control structures, and the ratio of computation to memory operation.
NASA Astrophysics Data System (ADS)
Liu, Bing; Sun, Li Guo
2018-06-01
This paper takes the Nanjing-Hangzhou high-speed overbridge, a self-anchored suspension bridge, as the research target and identifies the dynamic characteristic parameters of the bridge by using the peak-picking method to analyze the velocity response data collected under ambient excitation by 7 vibration pickup sensors set on the bridge deck. ABAQUS is used to set up a three-dimensional finite element model of the full bridge, which is then updated based on the identified modal parameters and the suspender forces measured by the PDV100 laser vibrometer. The study shows that the modal parameters can be reliably identified by analyzing the bridge vibration velocities collected at the 7 survey points. The identified modal parameters and measured suspender forces can be used as the basis for updating the finite element model of the suspension bridge. The updated model can faithfully reflect the structural physical features and can also serve as the benchmark model for the long-term health monitoring and condition assessment of the bridge.
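Peak-picking identifies natural frequencies as local maxima of the power spectral density of the ambient response; a minimal sketch with an assumed sampling rate and a synthetic two-mode signal is shown below (the bridge records, sensor layout, and ABAQUS model are of course not reproduced here).

```python
import numpy as np
from scipy.signal import welch, find_peaks

fs = 100.0                                   # assumed sampling rate [Hz]
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic stand-in for an ambient velocity record with modes near 1.2 Hz and 3.4 Hz.
signal = (np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 3.4 * t)
          + 0.3 * rng.standard_normal(t.size))

freqs, psd = welch(signal, fs=fs, nperseg=4096)
peaks, _ = find_peaks(psd, prominence=0.1 * psd.max())
print("picked natural frequencies [Hz]:", freqs[peaks])
```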
NASA Astrophysics Data System (ADS)
Peppa, M. V.; Mills, J. P.; Fieber, K. D.; Haynes, I.; Turner, S.; Turner, A.; Douglas, M.; Bryan, P. G.
2018-05-01
Understanding and protecting cultural heritage involves the detection and long-term documentation of archaeological remains alongside the spatio-temporal analysis of their landscape evolution. Archive aerial photography can illuminate traces of ancient features which typically appear with different brightness values from their surrounding environment, but are not always well defined. This research investigates the implementation of the Structure-from-Motion - Multi-View Stereo image matching approach with an image enhancement algorithm to derive three epochs of orthomosaics and digital surface models from visible and near infrared historic aerial photography. The enhancement algorithm uses decorrelation stretching to improve the contrast of the orthomosaics so that archaeological features are better detected. Results include 2D / 3D locations of detected archaeological traces stored in a geodatabase for further archaeological interpretation and correlation with benchmark observations. The study also discusses the merits and difficulties of the process involved. This research is based on a European-wide project, entitled "Cultural Heritage Through Time", and the case study research was carried out as a component of the project in the UK.
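Decorrelation stretching rotates the image bands onto their principal components, equalises their variances, and rotates back, which exaggerates subtle spectral differences such as crop and soil marks; a minimal per-pixel sketch is given below, with the band count and rescaling treated as assumptions.

```python
import numpy as np

def decorrelation_stretch(image):
    """Decorrelation stretch of a (rows, cols, bands) image."""
    rows, cols, bands = image.shape
    flat = image.reshape(-1, bands).astype(float)
    mean = flat.mean(axis=0)
    cov = np.cov(flat - mean, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Whiten along the principal axes, rotate back, then restore overall contrast and mean.
    transform = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + 1e-12)) @ eigvecs.T
    stretched = (flat - mean) @ transform * flat.std() + mean
    return stretched.reshape(rows, cols, bands)

demo = np.random.default_rng(0).random((64, 64, 3))   # placeholder 3-band orthomosaic tile
print(decorrelation_stretch(demo).shape)
```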
Co-creating meaningful structures within long-term psychotherapy group culture.
Gayle, Robin G
2009-07-01
Meaningful group structures are co-created within the long-term outpatient psychotherapy group through a hermeneutical interaction between structure and immediate experience of structure by individuals embedded in personal and collective contexts. Co-created meanings expand original group- and self-understandings and further evolve structures that are stable yet do not exist independently of the narratives and affects of the members who interact with them. Group structures do not reduce, expand, or dissolve but change in connection to the experiences and meaning attributions within the group. This intersubjective process mediates the emphasis within group theory on leader responsibility for culture building that risks overpromoting certain psychotherapeutic cultural intentions over others. Three examples of intersubjective hermeneutical interaction within long-term psychotherapy groups lend insight into global, cultural, and societal groups.
Internal Benchmarking for Institutional Effectiveness
ERIC Educational Resources Information Center
Ronco, Sharron L.
2012-01-01
Internal benchmarking is an established practice in business and industry for identifying best in-house practices and disseminating the knowledge about those practices to other groups in the organization. Internal benchmarking can be done with structures, processes, outcomes, or even individuals. In colleges or universities with multicampuses or a…
van Lent, Wineke A M; de Beer, Relinde D; van Harten, Wim H
2010-08-31
Benchmarking is one of the methods used in business that is applied to hospitals to improve the management of their operations. International comparison between hospitals can explain performance differences. As there is a trend towards specialization of hospitals, this study examines the benchmarking process and the success factors of benchmarking in international specialized cancer centres. Three independent international benchmarking studies on operations management in cancer centres were conducted. The first study included three comprehensive cancer centres (CCC), three chemotherapy day units (CDU) were involved in the second study, and four radiotherapy departments were included in the final study. For each multiple case study, a research protocol was used to structure the benchmarking process. After reviewing the multiple case studies, the resulting description was used to study the research objectives. We adapted and evaluated existing benchmarking processes through formalizing stakeholder involvement and verifying the comparability of the partners. We also devised a framework to structure the indicators to produce a coherent indicator set and better improvement suggestions. Evaluating the feasibility of benchmarking as a tool to improve hospital processes led to mixed results. Case study 1 resulted in general recommendations for the organizations involved. In case study 2, the combination of benchmarking and lean management led in one CDU to a 24% increase in bed utilization and a 12% increase in productivity. Three radiotherapy departments in case study 3 were considering implementing the recommendations. Additionally, success factors were found, such as a well-defined and small project scope, partner selection based on clear criteria, stakeholder involvement, simple and well-structured indicators, analysis of both the process and its results, and adaptation of the identified better working methods to one's own setting. The improved benchmarking process and the success factors can produce relevant input to improve the operations management of specialty hospitals.
The effect of long-term changes in plant inputs on soil carbon stocks
NASA Astrophysics Data System (ADS)
Georgiou, K.; Li, Z.; Torn, M. S.
2017-12-01
Soil organic carbon (SOC) is the largest actively-cycling terrestrial reservoir of C and an integral component of thriving natural and managed ecosystems. C input interventions (e.g., litter removal or organic amendments) are common in managed landscapes and present an important decision for maintaining healthy soils in sustainable agriculture and forestry. Furthermore, climate and land-cover change can also affect the amount of plant C inputs that enter the soil through changes in plant productivity, allocation, and rooting depth. Yet, the processes that dictate the response of SOC to such changes in C inputs are poorly understood and inadequately represented in predictive models. Long-term litter manipulations are an invaluable resource for exploring key controls of SOC storage and validating model representations. Here we explore the response of SOC to long-term changes in plant C inputs across a range of biomes and soil types. We synthesize and analyze data from long-term litter manipulation field experiments, and focus our meta-analysis on changes to total SOC stocks, microbial biomass carbon, and mineral-associated ('protected') carbon pools and explore the relative contribution of above- versus below-ground C inputs. Our cross-site data comparison reveals that divergent SOC responses are observed between forest sites, particularly for treatments that increase C inputs to the soil. We explore trends among key variables (e.g., microbial biomass to SOC ratios) that inform soil C model representations. The assembled dataset is an important benchmark for evaluating process-based hypotheses and validating divergent model formulations.
ERIC Educational Resources Information Center
Mortenson, Lee E.; Berdes, Celia M.
This document, one in a series developed to provide technical assistance to 22 Long-Term Care Gerontology Centers, describes the current administrative and structural phenomenon of these centers. Precedents useful in assessing both the current climate and actual prospects for development of long term care centers are cited. The first section…
Jimenez-Del-Toro, Oscar; Muller, Henning; Krenn, Markus; Gruenberg, Katharina; Taha, Abdel Aziz; Winterstein, Marianne; Eggel, Ivan; Foncubierta-Rodriguez, Antonio; Goksel, Orcun; Jakab, Andras; Kontokotsios, Georgios; Langs, Georg; Menze, Bjoern H; Salas Fernandez, Tomas; Schaer, Roger; Walleyo, Anna; Weber, Marc-Andre; Dicente Cid, Yashin; Gass, Tobias; Heinrich, Mattias; Jia, Fucang; Kahl, Fredrik; Kechichian, Razmig; Mai, Dominic; Spanier, Assaf B; Vincent, Graham; Wang, Chunliang; Wyeth, Daniel; Hanbury, Allan
2016-11-01
Variations in the shape and appearance of anatomical structures in medical images are often relevant radiological signs of disease. Automatic tools can help automate parts of this otherwise manual assessment process. A cloud-based evaluation framework is presented in this paper, including results of benchmarking current state-of-the-art medical imaging algorithms for anatomical structure segmentation and landmark detection: the VISCERAL Anatomy benchmarks. The algorithms are implemented in virtual machines in the cloud, where participants can only access the training data, and are run privately by the benchmark administrators to objectively compare their performance on an unseen common test set. Overall, 120 computed tomography and magnetic resonance patient volumes were manually annotated to create a standard Gold Corpus containing a total of 1295 structures and 1760 landmarks. Ten participants contributed automatic algorithms for the organ segmentation task, and three for the landmark localization task. Different algorithms obtained the best scores in the four available imaging modalities and for subsets of anatomical structures. The annotation framework, resulting data set, evaluation setup, results and performance analysis from the three VISCERAL Anatomy benchmarks are presented in this article. Both the VISCERAL data set and the Silver Corpus generated with the fusion of the participant algorithms on a larger set of non-manually-annotated medical images are available to the research community.
Highway-railway at-grade crossing structures : long term settlement measurements and assessments.
DOT National Transportation Integrated Search
2009-05-01
The purpose of this research is to evaluate the long-term settlements for a wide variety of at-grade crossings. Twenty-four highway crossings were monitored to determine the effects of enhanced support on minimizing long-term settlements of the crossing...
Peter Caldwell; Chelcy Ford Miniat; Steven Brantley; Katherine Elliott; Stephanie Laseter; Wayne Swank
2016-01-01
In forested watersheds, changes in climate and forest structure or age can affect water yield; yet few long-term observational records from such watersheds exist that allow an assessment of these impacts over time. In this study, we used long-term (~80 yrs) observational records of climate and water yield in six reference watersheds at the Coweeta Hydrologic Laboratory...
Sustained Innovation Through Shared Capitalism and Democratic Governance
NASA Astrophysics Data System (ADS)
Beyster, M. A.; Blasi, J.; Sibilia, J.; Zebuchen, T.; Bowman, A.
The Foundation for Enterprise Development (FED) explores application of democratic representative governance models and structures for long-term interdisciplinary research, development and education to the concept of an organization that can sustain activity in support of interstellar travel in the 100-year timeframe, as outlined by the 100 Year Starship™. This paper, titled Sustained Innovation through Shared Capitalism and Democratic Governance, explores the roots of representative structures and organizations as long-lived success stories throughout history. Research, innovation, organizational structures and associated issues are explored to address the long-term focus required for development, both material and human. Impact investing vehicles are also explored as potential investment structures addressing the long-term horizon required by the organization. This paper provides an illustration, description and philosophical approach of this model as developed by the FED and our collaborators.
NASA Astrophysics Data System (ADS)
Gao, C.; Liu, Y.; Jin, J.; Wei, T.
2015-12-01
East and south coastal China contribute about 30% and 8% of CO2 emissions in China and the world, respectively, and therefore play a critical role in achieving the national goal of emission reduction to mitigate global warming. The region also serves as a benchmark for the less developed regions of China in terms of achieving the developed world's human development standard under lower per capita emissions. We analyze the driving forces of emissions in this region and their provincial characteristics by applying the Logarithmic Mean Divisia Index method. Our findings show that emissions doubled during the period from 2000 to 2012, along with threefold and twofold increases in economic output and energy consumption, respectively. This suggests a persistent lock between economic growth and emissions, even in this socioeconomically advanced region of China. Provincial differences in annual emission growth reveal three distinct low-carbon developmental stages, owing mainly to the effectiveness of energy efficiency in reducing emission growth. This may explain why previous climate policies have aimed to reduce carbon intensity. These results indicate that targeted measures on enhancing energy efficiency in the short term and de-carbonization of both the economic and energy structure in the long term can lower emission growth more effectively and efficiently. They also suggest that factor-driven emission reduction strategies and policies are needed in geographically and socioeconomically similar regions.
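For readers unfamiliar with the method, the additive Logarithmic Mean Divisia Index (LMDI-I) attributes the change in emissions between two years to the factors of a multiplicative identity; the sketch below uses a simple Kaya-style identity (emissions = activity × energy intensity × carbon intensity) with made-up numbers rather than the provincial data analysed in the study.

```python
import math

def log_mean(a, b):
    """Logarithmic mean used by the additive LMDI-I decomposition."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

# Illustrative (base year, end year) values for each factor.
gdp = (100.0, 200.0)
energy_intensity = (0.80, 0.60)
carbon_intensity = (2.50, 2.30)

c0 = gdp[0] * energy_intensity[0] * carbon_intensity[0]
c1 = gdp[1] * energy_intensity[1] * carbon_intensity[1]
L = log_mean(c1, c0)

effects = {name: L * math.log(x1 / x0)
           for name, (x0, x1) in {"activity": gdp,
                                  "energy intensity": energy_intensity,
                                  "carbon intensity": carbon_intensity}.items()}
# The factor contributions sum exactly to the total emission change.
print(effects)
print("sum of effects:", round(sum(effects.values()), 2), " total change:", round(c1 - c0, 2))
```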
Edwards, C Blake; Jordan, David L; Owen, Michael Dk; Dixon, Philip M; Young, Bryan G; Wilson, Robert G; Weller, Steven C; Shaw, David R
2014-12-01
Since the introduction of glyphosate-resistant (GR) crops, growers have often relied on glyphosate-only weed control programs. As a result, multiple weeds have evolved resistance to glyphosate. A 5 year study including 156 growers from Illinois, Iowa, Indiana, Nebraska, North Carolina and Mississippi in the United States was conducted to compare crop yields and net returns between grower standard weed management programs (SPs) and programs containing best management practices (BMPs) recommended by university weed scientists. The BMPs were designed to prevent or mitigate/manage evolved herbicide resistance. Weed management costs were greater for the BMP approach in most situations, but crop yields often increased sufficiently for net returns similar to those of the less expensive SPs. This response was similar across all years, geographical regions, states, crops and tillage systems. Herbicide use strategies that include a diversity of herbicide mechanisms of action will increase the long-term sustainability of glyphosate-based weed management strategies. Growers can adopt herbicide resistance BMPs with confidence that net returns will not be negatively affected in the short term and contribute to resistance management in the long term. © 2014 Society of Chemical Industry.
Centennial impacts of fragmentation on the canopy structure of tropical montane forest
Nicholas Vaughn; Greg Asner; Christian Giardina
2014-01-01
Fragmentation poses one of the greatest threats to tropical forests with short-term changes to the structure of forest canopies affecting microclimate, tree mortality, and growth. Yet the long-term effects of fragmentation are poorly understood because (1) most effects require many decades to materialize, but long-term studies are very rare, (2) the effects of edges on...
Volf, Martin; Redmond, Conor; Albert, Ágnes J; Le Bagousse-Pinguet, Yoann; Biella, Paolo; Götzenberger, Lars; Hrázský, Záboj; Janeček, Štěpán; Klimešová, Jitka; Lepš, Jan; Šebelíková, Lenka; Vlasatá, Tereza; de Bello, Francesco
2016-04-01
The functional structures of communities respond to environmental changes by both species replacement (turnover) and within-species variation (intraspecific trait variability; ITV). Evidence is lacking on the relative importance of these two components, particularly in response to both short- and long-term environmental disturbance. We hypothesized that such short- and long-term perturbations would induce changes in community functional structure primarily via ITV and turnover, respectively. To test this we applied an experimental design across long-term mown and abandoned meadows, with each plot containing a further level of short-term management treatments: mowing, grazing and abandonment. Within each plot, species composition and trait values [height, shoot biomass, and specific leaf area (SLA)] were recorded on up to five individuals per species. Positive covariations between the contribution of species turnover and ITV occurred for height and shoot biomass in response to both short- and long-term management, indicating that species turnover and intraspecific adjustments selected for similar trait values. Positive covariations also occurred for SLA, but only in response to long-term management. The contributions of turnover and ITV changed depending on both the trait and management trajectory. As expected, communities responded to short-term disturbances mostly through changes in intraspecific trait variability, particularly for height and biomass. Interestingly, for SLA they responded to long-term disturbances by both species turnover and intraspecific adjustments. These findings highlight the importance of both ITV and species turnover in adjusting grassland functional trait response to environmental perturbation, and show that the response is trait specific and affected by disturbance regime history.
PMLB: a large benchmark suite for machine learning evaluation and comparison.
Olson, Randal S; La Cava, William; Orzechowski, Patryk; Urbanowicz, Ryan J; Moore, Jason H
2017-01-01
The selection, development, or comparison of machine learning methods in data mining can be a difficult task based on the target problem and goals of a particular study. Numerous publicly available real-world and simulated benchmark datasets have emerged from different sources, but their organization and adoption as standards have been inconsistent. As such, selecting and curating specific benchmarks remains an unnecessary burden on machine learning practitioners and data scientists. The present study introduces an accessible, curated, and developing public benchmark resource to facilitate identification of the strengths and weaknesses of different machine learning methodologies. We compare meta-features among the current set of benchmark datasets in this resource to characterize the diversity of available data. Finally, we apply a number of established machine learning methods to the entire benchmark suite and analyze how datasets and algorithms cluster in terms of performance. From this study, we find that existing benchmarks lack the diversity to properly benchmark machine learning algorithms, and there are several gaps in benchmarking problems that still need to be considered. This work represents another important step towards understanding the limitations of popular benchmarking suites and developing a resource that connects existing benchmarking standards to more diverse and efficient standards in the future.
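PMLB is distributed with a small Python helper package; assuming the commonly documented fetch_data interface, a minimal usage sketch for scoring one classifier on one of its datasets looks like the following (the dataset name and model choice are arbitrary examples).

```python
from pmlb import fetch_data                      # assumes the pmlb package is installed
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Pull one benchmark dataset by name and score a baseline model on it.
X, y = fetch_data("mushroom", return_X_y=True)
scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)
print(f"mushroom: mean accuracy {scores.mean():.3f}")
```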
DOT National Transportation Integrated Search
2009-06-01
This report describes the investigation of the long term structural performance of a hybrid FRP-concrete (HFRPC) bridge deck on steel girders. The study aimed at assessing three long term aspects pertaining to the HFRPC bridge deck: (1) creep c...
2013-01-01
Background: While a large body of work exists on comparing and benchmarking descriptors of molecular structures, a similar comparison of protein descriptor sets is lacking. Hence, in the current work a total of 13 amino acid descriptor sets have been benchmarked with respect to their ability of establishing bioactivity models. The descriptor sets included in the study are Z-scales (3 variants), VHSE, T-scales, ST-scales, MS-WHIM, FASGAI, BLOSUM, a novel protein descriptor set (termed ProtFP (4 variants)), and in addition we created and benchmarked three pairs of descriptor combinations. Prediction performance was evaluated in seven structure-activity benchmarks which comprise Angiotensin Converting Enzyme (ACE) dipeptidic inhibitor data, and three proteochemometric data sets, namely (1) GPCR ligands modeled against a GPCR panel, (2) enzyme inhibitors (NNRTIs) with associated bioactivities against a set of HIV enzyme mutants, and (3) enzyme inhibitors (PIs) with associated bioactivities on a large set of HIV enzyme mutants. Results: The amino acid descriptor sets compared here show similar performance (<0.1 log units RMSE difference and <0.1 difference in MCC), while errors for individual proteins were in some cases found to be larger than those resulting from descriptor set differences (>0.3 log units RMSE difference and >0.7 difference in MCC). Combining different descriptor sets generally leads to better modeling performance than utilizing individual sets. The best performers were Z-scales (3) combined with ProtFP (Feature), or Z-scales (3) combined with an average Z-scale value for each target, while ProtFP (PCA8), ST-scales, and ProtFP (Feature) rank last. Conclusions: While amino acid descriptor sets capture different aspects of amino acids, their ability to be used for bioactivity modeling is still – on average – surprisingly similar. Still, combining sets describing complementary information consistently leads to a small but consistent improvement in modeling performance (average MCC 0.01 better, average RMSE 0.01 log units lower). Finally, performance differences exist between the targets compared, thereby underlining that choosing an appropriate descriptor set is of fundamental importance for bioactivity modeling, both from the ligand as well as the protein side. PMID:24059743
Long-term real-time structural health monitoring using wireless smart sensor
NASA Astrophysics Data System (ADS)
Jang, Shinae; Mensah-Bonsu, Priscilla O.; Li, Jingcheng; Dahal, Sushil
2013-04-01
Improving the safety and security of civil infrastructure has been a critical issue for decades since it plays a central role in the economics and politics of a modern society. Structural health monitoring of civil infrastructure using wireless smart sensor networks has recently emerged as a promising solution to increase structural reliability, enhance inspection quality, and reduce maintenance costs. Though hardware and software frameworks are well developed for wireless smart sensors, long-term real-time health monitoring strategies are still not available due to the lack of a systematic interface. In this paper, the Imote2 smart sensor platform is employed, and a graphical user interface for long-term real-time structural health monitoring has been developed in Matlab for the Imote2 platform. This computer-aided engineering platform enables control, visualization of measured data, and a safety alarm feature based on modal property fluctuation. A new decision-making strategy to check safety is also developed and integrated in this software. Laboratory validation of the computer-aided engineering platform for the Imote2 on a truss bridge and a building structure has shown the potential of the interface for long-term real-time structural health monitoring.
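The decision-making element described here, raising an alarm when identified modal properties drift from their healthy-state baseline, can be illustrated with a simple threshold check; the frequencies and the 5% threshold below are assumptions for illustration only.

```python
baseline_freqs = [1.21, 3.38, 5.02]   # natural frequencies identified in the healthy state [Hz]
current_freqs = [1.19, 3.11, 4.98]    # latest identified frequencies [Hz]
threshold = 0.05                      # alarm if any mode shifts by more than 5 %

def safety_alarm(baseline, current, tol):
    """Return the indices of modes whose relative frequency shift exceeds tol."""
    return [i for i, (b, c) in enumerate(zip(baseline, current))
            if abs(c - b) / b > tol]

flagged = safety_alarm(baseline_freqs, current_freqs, threshold)
print("alarm" if flagged else "ok", "- modes flagged:", flagged)
```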
Protein kinase M ζ and the maintenance of long-term memory.
Zhang, Yang; Zong, Wei; Zhang, Lei; Ma, Yuanye; Wang, Jianhong
2016-10-01
Although various molecules have been found to mediate the processes of memory acquisition and consolidation, the molecular mechanism to maintain memory still remains elusive. In recent years, a molecular pathway focusing on protein kinase Mζ (PKMζ) has become of interest to researchers because of its potential role in long-term memory maintenance. PKMζ is an isoform of protein kinase C (PKC) and has a related structure that influences its function in maintaining memory. Considerable evidence has been gathered on PKMζ activity, including loss of function studies using PKMζ inhibitors, such as PKMζ inhibitory peptide (ZIP), suggesting PKMζ plays an important role in long-term memory maintenance. This review provides an overview of the role of PKMζ in long-term memory and outlines the molecular structure of PKMζ, the molecular mechanism of PKMζ in long-term memory maintenance and future directions of PKMζ research. Copyright © 2016 Elsevier Ltd. All rights reserved.
Mu, John C.; Tootoonchi Afshar, Pegah; Mohiyuddin, Marghoob; Chen, Xi; Li, Jian; Bani Asadi, Narges; Gerstein, Mark B.; Wong, Wing H.; Lam, Hugo Y. K.
2015-01-01
A high-confidence, comprehensive human variant set is critical in assessing accuracy of sequencing algorithms, which are crucial in precision medicine based on high-throughput sequencing. Although recent works have attempted to provide such a resource, they still do not encompass all major types of variants including structural variants (SVs). Thus, we leveraged the massive high-quality Sanger sequences from the HuRef genome to construct by far the most comprehensive gold set of a single individual, which was cross validated with deep Illumina sequencing, population datasets, and well-established algorithms. It was a necessary effort to completely reanalyze the HuRef genome as its previously published variants were mostly reported five years ago, suffering from compatibility, organization, and accuracy issues that prevent their direct use in benchmarking. Our extensive analysis and validation resulted in a gold set with high specificity and sensitivity. In contrast to the current gold sets of the NA12878 or HS1011 genomes, our gold set is the first that includes small variants, deletion SVs and insertion SVs up to a hundred thousand base-pairs. We demonstrate the utility of our HuRef gold set to benchmark several published SV detection tools. PMID:26412485
Zhu, Bi; Chen, Chuansheng; Loftus, Elizabeth F; He, Qinghua; Lei, Xuemei; Dong, Qi; Lin, Chongde
2016-11-01
There is a keen interest in identifying specific brain regions that are related to individual differences in true and false memories. Previous functional neuroimaging studies showed that activities in the hippocampus, right fusiform gyrus, and parahippocampal gyrus were associated with true and false memories, but no study thus far has examined whether the structures of these brain regions are associated with short-term and long-term true and false memories. To address that question, the current study analyzed data from 205 healthy young adults, who had valid data from both structural brain imaging and a misinformation task. In the misinformation task, subjects saw the crime scenarios, received misinformation, and took memory tests about the crimes an hour later and again after 1.5 years. Results showed that bilateral hippocampal volume was associated with short-term true and false memories, whereas right fusiform gyrus volume and surface area were associated with long-term true and false memories. This study provides the first evidence for the structural neural bases of individual differences in short-term and long-term true and false memories.
NASA Astrophysics Data System (ADS)
Pulkkinen, A. A.; Bernabeu, E.; Weigel, R. S.; Kelbert, A.; Rigler, E. J.; Bedrosian, P.; Love, J. J.
2017-12-01
Development of realistic storm scenarios that can be played through the exposed systems is one of the key requirements for carrying out quantitative space weather hazards assessments. In the geomagnetically induced currents (GIC) and power grids context, these scenarios have to quantify the spatiotemporal evolution of the geoelectric field that drives the potentially hazardous currents in the system. In response to the Federal Energy Regulatory Commission (FERC) order 779, a team of scientists and engineers that worked under the auspices of the North American Electric Reliability Corporation (NERC) has developed extreme geomagnetic storm and geoelectric field benchmark(s) that use various scaling factors that account for geomagnetic latitude and ground structure of the locations of interest. These benchmarks, together with the information generated in the National Space Weather Action Plan, are the foundation for the hazards assessments that the industry will be carrying out in response to the FERC order and under the auspices of the National Science and Technology Council. While the scaling factors developed in the past work were based on the best available information, there is now significant new information available for parts of the U.S. pertaining to the ground response to external geomagnetic field excitation. The significant new information includes the results of magnetotelluric surveys that have been conducted over the past few years across the contiguous U.S. and results from previous surveys that have been made available in a combined online database. In this paper, we distill this new information in the framework of the NERC benchmark and in terms of updated ground response scaling factors, thereby allowing straightforward utilization in the hazard assessments. We also outline the path forward for improving the overall extreme event benchmark scenario(s), including generalization of the storm waveforms and geoelectric field spatial patterns.
Spring onset variations and long-term trends from new hemispheric-scale products and remote sensing
NASA Astrophysics Data System (ADS)
Dye, D. G.; Li, X.; Ault, T.; Zurita-Milla, R.; Schwartz, M. D.
2015-12-01
Spring onset is commonly characterized by plant phenophase changes among a variety of biophysical transitions and has important implications for natural and man-managed ecosystems. Here, we present a new integrated analysis of variability in gridded Northern Hemisphere spring onset metrics. We developed a set of hemispheric temperature-based spring indices spanning 1920-2013. As these were derived solely from meteorological data, they are used as a benchmark for isolating the climate system's role in modulating spring "green up" estimated from the annual cycle of normalized difference vegetation index (NDVI). Spatial patterns of interannual variations, teleconnections, and long-term trends were also analyzed in all metrics. At mid-to-high latitudes, all indices exhibit larger variability at interannual to decadal time scales than at spatial scales of a few kilometers. Trends of spring onset vary across space and time. However, compared to long-term trend, interannual to decadal variability generally accounts for a larger portion of the total variance in spring onset timing. Therefore, spring onset trends identified from short existing records may be aliased by decadal climate variations due to their limited temporal depth, even when these records span the entire satellite era. Based on our findings, we also demonstrated that our indices have skill in representing ecosystem-level spring phenology and may have important implications in understanding relationships between phenology, atmosphere dynamics and climate variability.
NASA Astrophysics Data System (ADS)
Velioǧlu, Deniz; Cevdet Yalçıner, Ahmet; Zaytsev, Andrey
2016-04-01
Tsunamis are huge waves with long wave periods and wave lengths that can cause great devastation and loss of life when they strike a coast. The interest in experimental and numerical modeling of tsunami propagation and inundation increased considerably after the 2011 Great East Japan earthquake. In this study, two numerical codes, FLOW 3D and NAMI DANCE, that analyze tsunami propagation and inundation patterns are considered. FLOW 3D simulates linear and nonlinear propagating surface waves as well as long waves by solving three-dimensional Navier-Stokes (3D-NS) equations. NAMI DANCE uses a finite difference computational method to solve 2D depth-averaged linear and nonlinear forms of shallow water equations (NSWE) in long wave problems, specifically tsunamis. In order to validate these two codes and analyze the differences between 3D-NS and 2D depth-averaged NSWE equations, two benchmark problems are applied. One benchmark problem investigates the runup of long waves over a complex 3D beach. The experimental setup is a 1:400 scale model of Monai Valley located on the west coast of Okushiri Island, Japan. The other benchmark problem was discussed at the 2015 National Tsunami Hazard Mitigation Program (NTHMP) annual meeting in Portland, USA. It is a field dataset recording the Japan 2011 tsunami in Hilo Harbor, Hawaii. The computed water surface elevation and velocity data are compared with the measured data. The comparisons showed that both codes are in fairly good agreement with each other and the benchmark data. The differences between 3D-NS and 2D depth-averaged NSWE equations are highlighted. All results are presented with discussions and comparisons. Acknowledgements: Partial support by Japan-Turkey Joint Research Project by JICA on earthquakes and tsunamis in Marmara Region (JICA SATREPS - MarDiM Project), 603839 ASTARTE Project of EU, UDAP-C-12-14 project of AFAD Turkey, 108Y227, 113M556 and 213M534 projects of TUBITAK Turkey, RAPSODI (CONCERT_Dis-021) of CONCERT-Japan Joint Call and Istanbul Metropolitan Municipality are all acknowledged.
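For reference, the depth-averaged nonlinear shallow water equations of the kind NAMI DANCE solves can be written (here without bottom friction or Coriolis terms) as below, where η is the free-surface elevation, d the still-water depth, h = η + d the total water depth, (u, v) the depth-averaged velocities, and g the gravitational acceleration.

```latex
\begin{aligned}
\frac{\partial \eta}{\partial t} + \frac{\partial (hu)}{\partial x} + \frac{\partial (hv)}{\partial y} &= 0,\\
\frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x} + v\,\frac{\partial u}{\partial y} + g\,\frac{\partial \eta}{\partial x} &= 0,\\
\frac{\partial v}{\partial t} + u\,\frac{\partial v}{\partial x} + v\,\frac{\partial v}{\partial y} + g\,\frac{\partial \eta}{\partial y} &= 0.
\end{aligned}
```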
Cancer Detection in Microarray Data Using a Modified Cat Swarm Optimization Clustering Approach
M, Pandi; R, Balamurugan; N, Sadhasivam
2017-12-29
Objective: A better understanding of functional genomics can be obtained by extracting patterns hidden in gene expression data. This could have paramount implications for cancer diagnosis, gene treatments and other domains. Clustering may reveal natural structures and identify interesting patterns in underlying data. The main objective of this research was to derive a heuristic approach to detection of highly co-expressed genes related to cancer from gene expression data with minimum Mean Squared Error (MSE). Methods: A modified CSO algorithm using Harmony Search (MCSO-HS) for clustering cancer gene expression data was applied. Experimental results were analyzed using two cancer gene expression benchmark datasets, namely for leukaemia and for breast cancer. Results: The results indicated MCSO-HS to be better than HS and CSO by 13% and 9%, respectively, with the leukaemia dataset. For the breast cancer dataset, the improvements were 22% and 17%, respectively, in terms of MSE. Conclusion: The results showed MCSO-HS to outperform HS and CSO with both benchmark datasets. To validate the clustering results, this work was tested with internal and external cluster validation indices. This work also points to biological validation of clusters with gene ontology in terms of function, process and component.
Adaptive unified continuum FEM modeling of a 3D FSI benchmark problem.
Jansson, Johan; Degirmenci, Niyazi Cem; Hoffman, Johan
2017-09-01
In this paper, we address a 3D fluid-structure interaction benchmark problem that represents important characteristics of biomedical modeling. We present a goal-oriented adaptive finite element methodology for incompressible fluid-structure interaction based on a streamline diffusion-type stabilization of the balance equations for mass and momentum for the entire continuum in the domain, which is implemented in the Unicorn/FEniCS software framework. A phase marker function and its corresponding transport equation are introduced to select the constitutive law, where the mesh tracks the discontinuous fluid-structure interface. This results in a unified simulation method for fluids and structures. We present detailed results for the benchmark problem compared with experiments, together with a mesh convergence study. Copyright © 2016 John Wiley & Sons, Ltd.
Lagarde, Nathalie; Zagury, Jean-François; Montes, Matthieu
2015-07-27
Virtual screening methods are commonly used nowadays in drug discovery processes. However, to ensure their reliability, they have to be carefully evaluated. The evaluation of these methods is often realized in a retrospective way, notably by studying the enrichment of benchmarking data sets. To this purpose, numerous benchmarking data sets were developed over the years, and the resulting improvements led to the availability of high quality benchmarking data sets. However, some points still have to be considered in the selection of the active compounds, decoys, and protein structures to obtain optimal benchmarking data sets.
DOT National Transportation Integrated Search
2014-08-01
This report proposes a set of specifications for bridge structural health monitoring that has resulted from the experiences gained during the installation and monitoring of six permanent long-term bridge monitoring systems in Connecticut. As expe...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, Trevor; Pelletier, Steve; Giovanni, Matt
This report summarizes results of a long-term regional acoustic survey of bat activity at remote islands, offshore structures, and coastal sites in the Gulf of Maine, Great Lakes, and mid-Atlantic coast.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Will, M.E.; Suter, G.W. II
1994-09-01
One of the initial stages in ecological risk assessment for hazardous waste sites is screening contaminants to determine which of them are worthy of further consideration as contaminants of potential concern. This process is termed contaminant screening. It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to plants. This report presents a standard method for deriving benchmarks for this purpose (phytotoxicity benchmarks), a set of data concerning effects of chemicals in soil or soil solution on plants, and a set of phytotoxicity benchmarks for 38 chemicals potentially associated with United States Department of Energy (DOE) sites. In addition, background information on the phytotoxicity and occurrence of the chemicals in soils is presented, and literature describing the experiments from which data were drawn for benchmark derivation is reviewed. Chemicals that are found in soil at concentrations exceeding both the phytotoxicity benchmark and the background concentration for the soil type should be considered contaminants of potential concern.
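The screening rule stated in this report, that a chemical is flagged only when its measured soil concentration exceeds both the phytotoxicity benchmark and the background concentration, is simple to express directly; the concentrations below are placeholders, not values from the report.

```python
# Measured ambient concentration, phytotoxicity benchmark, and background (all mg/kg, illustrative).
soil_data = {
    "zinc":    {"measured": 310.0, "benchmark": 50.0, "background": 60.0},
    "arsenic": {"measured": 8.0,   "benchmark": 10.0, "background": 7.0},
}

contaminants_of_potential_concern = [
    chem for chem, c in soil_data.items()
    if c["measured"] > c["benchmark"] and c["measured"] > c["background"]
]
print(contaminants_of_potential_concern)   # -> ['zinc']
```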
Paulovich, Amanda G.; Billheimer, Dean; Ham, Amy-Joan L.; Vega-Montoto, Lorenzo; Rudnick, Paul A.; Tabb, David L.; Wang, Pei; Blackman, Ronald K.; Bunk, David M.; Cardasis, Helene L.; Clauser, Karl R.; Kinsinger, Christopher R.; Schilling, Birgit; Tegeler, Tony J.; Variyath, Asokan Mulayath; Wang, Mu; Whiteaker, Jeffrey R.; Zimmerman, Lisa J.; Fenyo, David; Carr, Steven A.; Fisher, Susan J.; Gibson, Bradford W.; Mesri, Mehdi; Neubert, Thomas A.; Regnier, Fred E.; Rodriguez, Henry; Spiegelman, Cliff; Stein, Stephen E.; Tempst, Paul; Liebler, Daniel C.
2010-01-01
Optimal performance of LC-MS/MS platforms is critical to generating high quality proteomics data. Although individual laboratories have developed quality control samples, there is no widely available performance standard of biological complexity (and associated reference data sets) for benchmarking of platform performance for analysis of complex biological proteomes across different laboratories in the community. Individual preparations of the yeast Saccharomyces cerevisiae proteome have been used extensively by laboratories in the proteomics community to characterize LC-MS platform performance. The yeast proteome is uniquely attractive as a performance standard because it is the most extensively characterized complex biological proteome and the only one associated with several large scale studies estimating the abundance of all detectable proteins. In this study, we describe a standard operating protocol for large scale production of the yeast performance standard and offer aliquots to the community through the National Institute of Standards and Technology where the yeast proteome is under development as a certified reference material to meet the long term needs of the community. Using a series of metrics that characterize LC-MS performance, we provide a reference data set demonstrating typical performance of commonly used ion trap instrument platforms in expert laboratories; the results provide a basis for laboratories to benchmark their own performance, to improve upon current methods, and to evaluate new technologies. Additionally, we demonstrate how the yeast reference, spiked with human proteins, can be used to benchmark the power of proteomics platforms for detection of differentially expressed proteins at different levels of concentration in a complex matrix, thereby providing a metric to evaluate and minimize preanalytical and analytical variation in comparative proteomics experiments. PMID:19858499
Brucker, Sara Y; Wallwiener, Markus; Kreienberg, Rolf; Jonat, Walter; Beckmann, Matthias W; Bamberg, Michael; Wallwiener, Diethelm; Souchon, Rainer
2011-02-01
A voluntary, external, science-based benchmarking program was established in Germany in 2003 to analyze and improve the quality of breast cancer (BC) care. Based on recent data from 2009, we aim to show that such analyses can also be performed for individual interdisciplinary specialties, such as radiation oncology (RO). Breast centers were invited to participate in the benchmarking program. Nine guideline-based quality indicators (QIs) were initially defined, reviewed annually, and modified, expanded, or abandoned accordingly. QI changes over time were analyzed descriptively, with particular emphasis on relevance to radiation oncology. During the 2003-2009 study period, there were marked increases in breast center participation and postoperatively confirmed primary BCs. Starting from 9 process QIs, 15 QIs were developed by 2009 as surrogate indicators of long-term outcome. During 2003-2009, 2/7 RO-relevant QIs (radiotherapy after breast-conserving surgery or after mastectomy) showed considerable increases (from 20 to 85% and 8 to 70%, respectively). Another three, initially high QIs practically reached the required levels. The current data confirm proof-of-concept for the established benchmarking program, which allows participating institutions to be compared and changes in quality of BC care to be tracked over time. Overall, marked QI increases suggest that BC care in Germany improved from 2003-2009. Moreover, it has become possible for the first time to demonstrate improvements in the quality of BC care longitudinally for individual breast centers. In addition, subgroups of relevant QIs can be used to demonstrate the progress achieved, but also the need for further improvement, in specific interdisciplinary specialties.
Scaling of Multimillion-Atom Biological Molecular Dynamics Simulation on a Petascale Supercomputer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schulz, Roland; Lindner, Benjamin; Petridis, Loukas
2009-01-01
A strategy is described for a fast all-atom molecular dynamics simulation of multimillion-atom biological systems on massively parallel supercomputers. The strategy is developed using benchmark systems of particular interest to bioenergy research, comprising models of cellulose and lignocellulosic biomass in an aqueous solution. The approach involves using the reaction field (RF) method for the computation of long-range electrostatic interactions, which permits efficient scaling on many thousands of cores. Although the range of applicability of the RF method for biomolecular systems remains to be demonstrated, for the benchmark systems the use of the RF produces molecular dipole moments, Kirkwood G factors, other structural properties, and mean-square fluctuations in excellent agreement with those obtained with the commonly used Particle Mesh Ewald method. With RF, three million- and five million-atom biological systems scale well up to 30k cores, producing 30 ns/day. Atomistic simulations of very large systems for time scales approaching the microsecond would, therefore, appear now to be within reach.
Scaling of Multimillion-Atom Biological Molecular Dynamics Simulation on a Petascale Supercomputer.
Schulz, Roland; Lindner, Benjamin; Petridis, Loukas; Smith, Jeremy C
2009-10-13
A strategy is described for a fast all-atom molecular dynamics simulation of multimillion-atom biological systems on massively parallel supercomputers. The strategy is developed using benchmark systems of particular interest to bioenergy research, comprising models of cellulose and lignocellulosic biomass in an aqueous solution. The approach involves using the reaction field (RF) method for the computation of long-range electrostatic interactions, which permits efficient scaling on many thousands of cores. Although the range of applicability of the RF method for biomolecular systems remains to be demonstrated, for the benchmark systems the use of the RF produces molecular dipole moments, Kirkwood G factors, other structural properties, and mean-square fluctuations in excellent agreement with those obtained with the commonly used Particle Mesh Ewald method. With RF, three million- and five million-atom biological systems scale well up to ∼30k cores, producing ∼30 ns/day. Atomistic simulations of very large systems for time scales approaching the microsecond would, therefore, appear now to be within reach.
McIlrath, Carole; Keeney, Sinead; McKenna, Hugh; McLaughlin, Derek
2010-02-01
This paper is a report of a study conducted to identify and gain consensus on appropriate benchmarks for effective primary care-based nursing services for adults with depression. Worldwide evidence suggests that between 5% and 16% of the population have a diagnosis of depression. Most of their care and treatment takes place in primary care. In recent years, primary care nurses, including community mental health nurses, have become more involved in the identification and management of patients with depression; however, there are no appropriate benchmarks to guide, develop and support their practice. In 2006, a three-round electronic Delphi survey was completed by a United Kingdom multi-professional expert panel (n = 67). Round 1 generated 1216 statements relating to structures (such as training and protocols), processes (such as access and screening) and outcomes (such as patient satisfaction and treatments). Content analysis was used to collapse statements into 140 benchmarks. Seventy-three benchmarks achieved consensus during subsequent rounds. Of these, 45 (61%) were related to structures, 18 (25%) to processes and 10 (14%) to outcomes. Multi-professional primary care staff have similar views about the appropriate benchmarks for care of adults with depression. These benchmarks could serve as a foundation for depression improvement initiatives in primary care and ongoing research into depression management by nurses.
Effects of physical aging on long-term creep of polymers and polymer matrix composites
NASA Technical Reports Server (NTRS)
Brinson, L. Catherine; Gates, Thomas S.
1994-01-01
For many polymeric materials in use below the glass transition temperature, the long term viscoelastic behavior is greatly affected by physical aging. To use polymer matrix composites as critical structural components in existing and novel technological applications, this long term behavior of the material system must be understood. Towards that end, this study applied the concepts governing the mechanics of physical aging in a consistent manner to the study of laminated composite systems. Even in fiber-dominated lay-ups the effects of physical aging are found to be important in the long-term behavior of the composite. The basic concepts describing physical aging of polymers are discussed. Several aspects of physical aging which have not been previously documented are also explored in this study, namely the effects of aging into equilibrium and a relationship to the time-temperature shift factor. The physical aging theory is then extended to develop the long-term compliance/modulus of a single lamina with varying fiber orientation. The latter is then built into classical lamination theory to predict long-time response of general oriented lamina and laminates. It is illustrated that the long term response can be counterintuitive, stressing the need for consistent modeling efforts to make long term predictions of laminates to be used in structural situations.
AmeriFlux US-ADR Amargosa Desert Research Site (ADRS)
Moreo, Michael [U.S. Geological Survey
2018-01-01
This is the AmeriFlux version of the carbon flux data for the site US-ADR Amargosa Desert Research Site (ADRS). Site Description - This tower is located at the Amargosa Desert Research Site (ADRS). The U.S. Geological Survey (USGS) began studies of unsaturated zone hydrology at ADRS in 1976. Over the years, USGS investigations at ADRS have provided long-term "benchmark" information about the hydraulic characteristics and soil-water movement for both natural-site conditions and simulated waste-site conditions in an arid environment. The ADRS is located in a creosote-bush community adjacent to disposal trenches for low-level radioactive waste.
Assessing streamflow sensitivity to variations in glacier mass balance
O'Neel, Shad; Hood, Eran; Arendt, Anthony; Sass, Louis
2014-01-01
The purpose of this paper is to evaluate relationships among seasonal and annual glacier mass balances, glacier runoff and streamflow in two glacierized basins in different climate settings. We use long-term glacier mass balance and streamflow datasets from the United States Geological Survey (USGS) Alaska Benchmark Glacier Program to compare and contrast glacier-streamflow interactions in a maritime climate (Wolverine Glacier) with those in a continental climate (Gulkana Glacier). Our overall goal is to improve our understanding of how glacier mass balance processes impact streamflow, ultimately improving our conceptual understanding of the future evolution of glacier runoff in continental and maritime climates.
Loftus, Kelli; Tilley, Terry; Hoffman, Jason; Bradburn, Eric; Harvey, Ellen
2015-01-01
The creation of a consistent culture of safety and quality in an intensive care unit is challenging. We applied the Six Sigma Define-Measure-Analyze-Improve-Control (DMAIC) model for quality improvement (QI) to develop a long-term solution to improve outcomes in a high-risk neurotrauma intensive care unit. We sought to reduce central line utilization as a cornerstone in preventing central line-associated bloodstream infections (CLABSIs). This study describes the successful application of the DMAIC model in the creation and implementation of evidence-based quality improvement designed to reduce CLABSIs to below national benchmarks.
Insurer Competition In Federally Run Marketplaces Is Associated With Lower Premiums.
Jacobs, Paul D; Banthin, Jessica S; Trachtman, Samuel
2015-12-01
Federal subsidies for health insurance premiums sold through the Marketplaces are tied to the cost of the benchmark plan, the second-lowest-cost silver plan. According to economic theory, the presence of more competitors should lead to lower premiums, implying smaller federal outlays for premium subsidies. The long-term impact of the Affordable Care Act on government spending will depend on the cost of these premium subsidies over time, with insurer participation and the level of competition likely to influence those costs. We studied insurer participation and premiums during the first two years of the Marketplaces. We found that the addition of a single insurer in a county was associated with a 1.2 percent lower premium for the average silver plan and a 3.5 percent lower premium for the benchmark plan in the federally run Marketplaces. We found that the effect of insurer entry was muted after two or three additional entrants. These findings suggest that increased insurer participation in the federally run Marketplaces reduces federal payments for premium subsidies. Project HOPE—The People-to-People Health Foundation, Inc.
ERIC Educational Resources Information Center
Bendick, Marc, Jr.
Federal initiatives should be undertaken to reduce long-term structural unemployment in the United States. Long-term structural unemployment has risen during the 1970s and 1980s but is still primarily a problem of disadvantaged workers, not dislocated ones. The impact of technological change on occupations is felt mainly by the employed who are…
A shortest-path graph kernel for estimating gene product semantic similarity.
Alvarez, Marco A; Qi, Xiaojun; Yan, Changhui
2011-07-29
Existing methods for calculating semantic similarity between gene products using the Gene Ontology (GO) often rely on external resources, which are not part of the ontology. Consequently, changes in these external resources like biased term distribution caused by shifting of hot research topics, will affect the calculation of semantic similarity. One way to avoid this problem is to use semantic methods that are "intrinsic" to the ontology, i.e. independent of external knowledge. We present a shortest-path graph kernel (spgk) method that relies exclusively on the GO and its structure. In spgk, a gene product is represented by an induced subgraph of the GO, which consists of all the GO terms annotating it. Then a shortest-path graph kernel is used to compute the similarity between two graphs. In a comprehensive evaluation using a benchmark dataset, spgk compares favorably with other methods that depend on external resources. Compared with simUI, a method that is also intrinsic to GO, spgk achieves slightly better results on the benchmark dataset. Statistical tests show that the improvement is significant when the resolution and EC similarity correlation coefficient are used to measure the performance, but is insignificant when the Pfam similarity correlation coefficient is used. Spgk uses a graph kernel method in polynomial time to exploit the structure of the GO to calculate semantic similarity between gene products. It provides an alternative to both methods that use external resources and "intrinsic" methods with comparable performance.
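A schematic re-implementation of the idea described above using the networkx library; this is not the published spgk code, node labels are ignored for brevity (the published method does not ignore them), and the GO fragment and annotations below are hypothetical.

```python
# Shortest-path graph kernel over Gene Ontology subgraphs (simplified sketch).
from collections import Counter
import networkx as nx

def induced_subgraph(go_dag, terms):
    """GO subgraph induced by a gene product's annotating terms plus ancestors.
    Assumes go_dag edges point from child term to parent term, so every node
    reachable from a term is one of its ancestors."""
    nodes = set(terms)
    for t in terms:
        nodes |= nx.descendants(go_dag, t)
    return go_dag.subgraph(nodes).to_undirected()

def sp_length_histogram(graph):
    """Multiset of shortest-path lengths over all unordered node pairs."""
    hist = Counter()
    for src, dists in nx.all_pairs_shortest_path_length(graph):
        for dst, d in dists.items():
            if src < dst:                    # count each unordered pair once
                hist[d] += 1
    return hist

def spgk(g1, g2):
    """Delta kernel on shortest-path lengths, normalised to [0, 1]."""
    h1, h2 = sp_length_histogram(g1), sp_length_histogram(g2)
    k12 = sum(h1[d] * h2[d] for d in h1)
    k11 = sum(v * v for v in h1.values())
    k22 = sum(v * v for v in h2.values())
    return k12 / (k11 * k22) ** 0.5 if k11 and k22 else 0.0

# Toy usage with a hypothetical six-term GO fragment (edges child -> parent).
go = nx.DiGraph([("GO:a", "GO:root"), ("GO:b", "GO:root"),
                 ("GO:c", "GO:a"), ("GO:d", "GO:a"), ("GO:e", "GO:b")])
gp1 = induced_subgraph(go, {"GO:c", "GO:e"})
gp2 = induced_subgraph(go, {"GO:d", "GO:e"})
print(f"spgk similarity: {spgk(gp1, gp2):.3f}")
```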
Human Thermal Model Evaluation Using the JSC Human Thermal Database
NASA Technical Reports Server (NTRS)
Cognata, T.; Bue, G.; Makinen, J.
2011-01-01
The human thermal database developed at the Johnson Space Center (JSC) is used to evaluate a set of widely used human thermal models. This database will facilitate a more accurate evaluation of human thermoregulatory response in a variety of situations, including those that might otherwise prove too dangerous for actual testing--such as extreme hot or cold splashdown conditions. This set includes the Wissler human thermal model, which has been widely used to predict the human thermoregulatory response to a variety of cold and hot environments. These models are statistically compared to the current database, which contains experiments on human subjects primarily in air from a literature survey spanning 1953 to 2004 and from a suited experiment recently performed by the authors, for a quantitative study of the relative strength and predictive quality of the models. Human thermal modeling has considerable long-term utility to human space flight. Such models provide a tool to predict crew survivability in support of vehicle design and to evaluate crew response in untested environments. It is to the benefit of any such model not only to collect relevant experimental data to correlate it against, but also to maintain an experimental standard or benchmark for future development in a readily and rapidly searchable, software-accessible format. The human thermal database project is intended to do just that: to collect relevant data from literature and experimentation and to store them in a database structure for immediate and future use as a benchmark against which human thermal models can be judged, in order to identify model strengths and weaknesses, to support model development and improve correlation, and to statistically quantify a model's predictive quality.
A Global Drought and Flood Catalogue for the past 100 years
NASA Astrophysics Data System (ADS)
Sheffield, J.; He, X.; Peng, L.; Pan, M.; Fisher, C. K.; Wood, E. F.
2017-12-01
Extreme hydrological events cause the most impacts of natural hazards globally, impacting on a wide range of sectors including, most prominently, agriculture, food security and water availability and quality, but also on energy production, forestry, health, transportation and fisheries. Understanding how floods and droughts intersect, and have changed in the past provides the basis for understanding current risk and how it may change in the future. To do this requires an understanding of the mechanisms associated with events and therefore their predictability, attribution of long-term changes in risk, and quantification of projections of changes in the future. Of key importance are long-term records of relevant variables so that risk can be quantified more accurately, given the growing acknowledgement that risk is not stationary under long-term climate variability and climate change. To address this, we develop a catalogue of drought and flood events based on land surface and hydrodynamic modeling, forced by a hybrid meteorological dataset that draws from the continuity and coverage of reanalysis, and satellite datasets, merged with global gauge databases. The meteorological dataset is corrected for temporal inhomogeneities, spurious trends and variable inter-dependencies to ensure long-term consistency, as well as realistic representation of short-term variability and extremes. The VIC land surface model is run for the past 100 years at 0.25-degree resolution for global land areas. The VIC runoff is then used to drive the CaMa-Flood hydrodynamic model to obtain information on flood inundation risk. The model outputs are compared to satellite based estimates of flood and drought conditions and the observational flood record. The data are analyzed in terms of the spatio-temporal characteristics of large-scale flood and drought events with a particular focus on characterizing the long-term variability in risk. Significant changes in risk occur on multi-decadal time scales and are mostly associated with variability in the North Atlantic and Pacific. The catalogue can be used for analysis of extreme events, risk assessment, and as a benchmark for model evaluation.
Cation exchange properties of zeolites in hyper alkaline aqueous media.
Van Tendeloo, Leen; de Blochouse, Benny; Dom, Dirk; Vancluysen, Jacqueline; Snellings, Ruben; Martens, Johan A; Kirschhock, Christine E A; Maes, André; Breynaert, Eric
2015-02-03
Construction of multibarrier concrete based waste disposal sites and management of alkaline mine drainage water requires cation exchangers combining excellent sorption properties with a high stability and predictable performance in hyper alkaline media. Though highly selective organic cation exchange resins have been developed for most pollutants, they can serve as a growth medium for bacterial proliferation, impairing their long-term stability and introducing unpredictable parameters into the evolution of the system. Zeolites represent a family of inorganic cation exchangers, which naturally occur in hyper alkaline conditions and cannot serve as an electron donor or carbon source for microbial proliferation. Despite their successful application as industrial cation exchangers under near neutral conditions, their performance in hyper alkaline, saline water remains highly undocumented. Using Cs(+) as a benchmark element, this study aims to assess the long-term cation exchange performance of zeolites in concrete derived aqueous solutions. Comparison of their exchange properties in alkaline media with data obtained in near neutral solutions demonstrated that the cation exchange selectivity remains unaffected by the increased hydroxyl concentration; the cation exchange capacity did however show an unexpected increase in hyper alkaline media.
Marshall, Michael S; Issa, Yazan; Jakubauskas, Benas; Stoskute, Monika; Elackattu, Vince; Marshall, Jeffrey N; Bogue, Wil; Nguyen, Duc; Hauck, Zane; Rue, Emily; Karumuthil-Melethil, Subha; Zaric, Violeta; Bosland, Maarten; van Breemen, Richard B; Givogri, Maria I; Gray, Steven J; Crocker, Stephen J; Bongarzone, Ernesto R
2018-03-07
We report a global adeno-associated virus (AAV)9-based gene therapy protocol to deliver therapeutic galactosylceramidase (GALC), a lysosomal enzyme that is deficient in Krabbe's disease. When globally administered via intrathecal, intracranial, and intravenous injections to newborn mice affected with GALC deficiency (twitcher mice), this approach largely surpassed prior published benchmarks of survival and metabolic correction, showing long-term protection against demyelination and neuroinflammation and preservation of motor function. Bone marrow transplantation, performed in this protocol without immunosuppressive preconditioning, added minimal benefit to the AAV9 gene therapy. In contrast with other proposed pre-clinical therapies, these results demonstrate that achieving nearly complete correction of GALC's metabolic deficiencies across the entire nervous system via gene therapy can yield significant improvements in behavioral deficits, pathophysiological changes, and survival. These results are an important consideration for determining the safest and most effective manner of adapting gene therapy to treat this leukodystrophy in the clinic. Copyright © 2018 The American Society of Gene and Cell Therapy. Published by Elsevier Inc. All rights reserved.
The Long-Term Agro-Ecosystem Research (LTAR) Network: A New In-Situ Data Network For Agriculture
NASA Astrophysics Data System (ADS)
Walbridge, M. R.
2014-12-01
Agriculture in the 21st Century faces significant challenges due to increases in the demand for agricultural products from a global population expected to reach 9.5 billion by 2050, changes in land use that are reducing the area of arable land worldwide, and the uncertainties associated with increasing climate variability and change. There is broad agreement that meeting these challenges will require significant changes in agro-ecosystem management at the landscape scale. In 2012, the USDA/ARS announced the reorganization of 10 existing benchmark watersheds, experimental ranges, and research farms into a Long-Term Agro-ecosystem Research (LTAR) network. Earlier this year, the LTAR network expanded to 18 sites, including 3 led by land grant universities and/or private foundations. The central question addressed by the LTAR network is, "How do we sustain or enhance productivity, profitability, and ecosystem services in agro-ecosystems and agricultural landscapes"? All 18 LTAR sites possess rich historical databases that extend up to 100 years into the past. However as LTAR moves forward, the focus is on collecting a core set of common measurements over the next 30-50 years that can be used to draw inferences regarding the nature of agricultural sustainability and how it varies across regional and continental-scale gradients. As such, LTAR is part long-term research network and part observatory network. Rather than focusing on a single site, each LTAR has developed regional partnerships that allow it to address agro-ecosystem function in the large basins and eco-climatic zones that underpin regional food production systems. Partners include other long-term in-situ data networks (e.g., Ameriflux, CZO, GRACEnet, LTER, NEON). 'Next steps' include designing and implementing a cross-site experiment addressing LTAR's central question.
Gimeno-Blanes, Francisco J.; Blanco-Velasco, Manuel; Barquero-Pérez, Óscar; García-Alberola, Arcadi; Rojo-Álvarez, José L.
2016-01-01
Great effort has been devoted in recent years to the development of sudden cardiac risk predictors based on electric cardiac signals, mainly obtained from electrocardiogram (ECG) analysis. However, these prediction techniques are still seldom used in clinical practice, partly because of their limited diagnostic accuracy and the lack of consensus about the appropriate computational signal processing implementation. This paper takes a three-fold approach, based on ECG indices, to structure this review on sudden cardiac risk stratification. First, it surveys the computational techniques that have been widely proposed for obtaining these indices in the technical literature. Second, it examines the scientific evidence, which, although supported by observational clinical studies, is not always sufficiently representative. And third, it considers the limited technology transfer of academy-accepted algorithms, which requires further reflection for future systems. We focus on three families of ECG-derived indices, tackled from the aforementioned viewpoints: heart rate turbulence (HRT), heart rate variability (HRV), and T-wave alternans. In terms of computational algorithms, we still need clearer scientific evidence, standardization, and benchmarking, resting on advanced algorithms applied to large and representative datasets. New scenarios such as electronic health records, big data, long-term monitoring, and cloud databases will eventually open new frameworks in which suitable new paradigms can be foreseen in the near future. PMID:27014083
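As a minimal sketch (not taken from the reviewed papers), two standard time-domain heart rate variability indices can be computed directly from a series of RR intervals: SDNN, the standard deviation of the intervals, and RMSSD, the root mean square of successive differences. The RR series below is made up.

```python
# Time-domain HRV indices from RR intervals in milliseconds (hypothetical data).
import math

def sdnn(rr_ms):
    """Standard deviation of the RR (NN) intervals."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (len(rr_ms) - 1))

def rmssd(rr_ms):
    """Root mean square of successive RR differences."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 845, 860, 798, 805, 830, 815, 842, 801]  # ms, made-up beats
print(f"SDNN = {sdnn(rr):.1f} ms, RMSSD = {rmssd(rr):.1f} ms")
```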
Is Long-Term Structural Priming Affected by Patterns of Experience with Individual Verbs?
ERIC Educational Resources Information Center
Kaschak, Michael P.; Borreggine, Kristin L.
2008-01-01
Several recent papers have reported long-term structural priming effects in experiments where previous patterns of experience with the double object and prepositional object constructions are shown to affect later patterns of language production for those constructions. The experiments reported in this paper address the extent to which these…
Working Memory, Long-Term Memory, and Medial Temporal Lobe Function
ERIC Educational Resources Information Center
Jeneson, Annette; Squire, Larry R.
2012-01-01
Early studies of memory-impaired patients with medial temporal lobe (MTL) damage led to the view that the hippocampus and related MTL structures are involved in the formation of long-term memory and that immediate memory and working memory are independent of these structures. This traditional idea has recently been revisited. Impaired performance…
Some Thoughts on Stability in Nonlinear Periodic Focusing Systems
DOE R&D Accomplishments Database
McMillan, E. M.
1967-09-05
A brief discussion is given of the long-term stability of particle motions through periodic focusing structures containing lumped nonlinear elements. A method is presented whereby one can specify the nonlinear elements in such a way as to generate a variety of structures in which the motion has long-term stability.
Benchmark Design and Installation: A synthesis of Existing Information.
1987-07-01
casings (15 ft deep) drilled to rock and filled with concrete. Disks: 1. set on vertically stable structures (e.g., dam monoliths); 2. set in rock... Structural movement survey: 1. rock outcrops (first choice) -- chiseled square on high point; 2. massive concrete structure (second choice) -- cut square on... bolt marker (type 2). Table C1. Recommended benchmarks: type of condition or terrain vs. type of marker (bedrock, rock outcrops...).
Chemical Stockpile Disposal Program. Emergency Response Concept Plan.
1987-07-01
the long term response will likely be managed by... response, the primary management considerations for the secondary response will include emergency medical care, long term health considerations... site, to establish the legal basis for the management structure for the secondary response. The long term response to
4. Exterior view of Long-Term Hydrazine Silo (T-28E), looking west. ...
4. Exterior view of Long-Term Hydrazine Silo (T-28E), looking west. The low-lying building to the immediate left of the silo is the Fuel Purification Structure (T-28E). - Air Force Plant PJKS, Systems Integration Laboratory, Long-Term Hydrazine Silo, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO
2. Exterior view of Long-Term Hydrazine Silo (T-28E), looking east. ...
2. Exterior view of Long-Term Hydrazine Silo (T-28E), looking east. The low-lying building to the immediate right of the silo is the Fuel Purification Structure (T-28E). - Air Force Plant PJKS, Systems Integration Laboratory, Long-Term Hydrazine Silo, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO
3. Exterior view of Long-Term Hydrazine Silo (T-28E), looking southwest. ...
3. Exterior view of Long-Term Hydrazine Silo (T-28E), looking southwest. The low-lying building to the immediate left of the silo is the Fuel Purification Structure (T-28E). - Air Force Plant PJKS, Systems Integration Laboratory, Long-Term Hydrazine Silo, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO
Mortazavi, Majid; Brandenburg, Jan Gerit; Maurer, Reinhard J; Tkatchenko, Alexandre
2018-01-18
Accurate prediction of structure and stability of molecular crystals is crucial in materials science and requires reliable modeling of long-range dispersion interactions. Semiempirical electronic structure methods are computationally more efficient than their ab initio counterparts, allowing structure sampling with significant speedups. We combine the Tkatchenko-Scheffler van der Waals method (TS) and the many-body dispersion method (MBD) with third-order density functional tight-binding (DFTB3) via a charge population-based method. We find an overall good performance for the X23 benchmark database of molecular crystals, despite an underestimation of crystal volume that can be traced to the DFTB parametrization. We achieve accurate lattice energy predictions with DFT+MBD energetics on top of vdW-inclusive DFTB3 structures, resulting in a speedup of up to 3000 times compared with a full DFT treatment. This suggests that vdW-inclusive DFTB3 can serve as a viable structural prescreening tool in crystal structure prediction.
NASA Astrophysics Data System (ADS)
Lee, Yi-Kang
2017-09-01
Nuclear decommissioning takes place in several stages because of the radioactivity in the reactor structural materials. A good estimate of the neutron activation products distributed in the reactor structural materials has an obvious impact on decommissioning planning and on low-level radioactive waste management. The continuous-energy Monte Carlo radiation transport code TRIPOLI-4 has been applied to radiation protection and shielding analyses. To extend the application of TRIPOLI-4 to nuclear decommissioning activities, both experimental and computational benchmarks are being performed. To calculate the neutron activation of the shielding and structural materials of nuclear facilities, the 3D neutron flux map and energy spectra must first be determined. To perform this type of deep-penetration neutron calculation with a Monte Carlo transport code, variance reduction techniques are necessary in order to reduce the uncertainty of the neutron activation estimate. In this study, variance reduction options of the TRIPOLI-4 code were used on the NAIADE 1 light water shielding benchmark. This benchmark document is available from the OECD/NEA SINBAD shielding benchmark database. From this benchmark database, a simplified NAIADE 1 water shielding model was first proposed in this work in order to make the code validation easier. Fission neutron transport was determined in light water for penetration up to 50 cm for fast neutrons and up to about 180 cm for thermal neutrons. Measurement and calculation results were benchmarked, and the variance reduction options and their performance were discussed and compared.
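As a toy illustration of why variance reduction matters for deep-penetration problems (this is not the TRIPOLI-4 implementation, and the cross sections and slab thickness below are made up), the sketch transports mono-energetic neutrons through a 1-D slab using implicit capture (survival biasing) and Russian roulette, so that histories reaching large depths keep contributing to the transmission tally with reduced statistical weight instead of being absorbed.

```python
# Toy 1-D deep-penetration transmission estimate with two classic variance
# reduction devices: implicit capture and Russian roulette (hypothetical data).
import math
import random

SIG_T, SIG_A = 1.0, 0.3        # total / absorption macroscopic XS (1/cm), hypothetical
THICKNESS = 10.0               # slab thickness (cm), i.e. 10 mean free paths
W_CUT, W_SURV = 1e-3, 1e-2     # roulette threshold and post-roulette weight

def transmission(n_histories, rng):
    score = 0.0
    for _ in range(n_histories):
        x, mu, w = 0.0, 1.0, 1.0                         # start on the left face, moving forward
        while True:
            x += mu * (-math.log(rng.random()) / SIG_T)  # sample distance to next collision
            if x < 0.0:
                break                                    # leaked out of the left face
            if x >= THICKNESS:
                score += w                               # transmitted: tally the carried weight
                break
            w *= 1.0 - SIG_A / SIG_T                     # implicit capture: absorb weight, not the history
            if w < W_CUT:                                # Russian roulette on low-weight histories
                if rng.random() < w / W_SURV:
                    w = W_SURV
                else:
                    break
            mu = 2.0 * rng.random() - 1.0                # isotropic scattering (toy approximation)
    return score / n_histories

print(f"estimated transmission: {transmission(20000, random.Random(1)):.2e}")
```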
Exterior view of north and east exterior walls of Long-Term ...
Exterior view of north and east exterior walls of Long-Term Oxidizer Silo (T-28B), looking south. Silo was designed to assess long-term environmental impacts on storage of the Titan II's oxidizer (nitrogen tetroxide). The shorter Oxidizer Conditioning Structure (T-28D) is located behind and to the immediate left of T-28B - Air Force Plant PJKS, Systems Integration Laboratory, Long-Term Oxidizer Silo, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO
Volatility, house edge and prize structure of gambling games.
Turner, Nigel E
2011-12-01
This study used simulations to examine the effect of prize structure on the outcome volatility and the number of winners of various game configurations. The two most common prize structures found in gambling games are even-money payoff games (bet $1; win $2), found on most table games, and multilevel prize structures, found in gambling machine games. Simulations were set up to examine the effect of prize structure on the long-term outcomes of these games. Eight different prize structures were compared in terms of the number of winners and volatility. It was found that the standard table game and commercial gambling machines produced fairly high numbers of short-term winners (1 h) but few long-term winners (50 h). It was found that the typical even-money game setup produced the lowest level of volatility. Of the multilevel prize structures examined, the three simulations based on commercial gambling machines were the least volatile. The results are examined in terms of the pragmatics of game design.
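A rough simulation in the same spirit (the paytables below are made up and correspond to neither the study's eight configurations nor any commercial game): both games carry the same 5% house edge but differ in prize structure, so they produce different shares of players who are ahead after a short session versus a long one.

```python
# Compare an even-money game with a multilevel prize structure at equal house edge.
import random

def play_session(n_bets, paytable, rng):
    """paytable: list of (probability, payout per 1-unit bet); returns net result."""
    net = 0.0
    for _ in range(n_bets):
        r, cum, win = rng.random(), 0.0, 0.0
        for p, pay in paytable:
            cum += p
            if r < cum:
                win = pay
                break
        net += win - 1.0                      # one unit staked per bet
    return net

even_money = [(0.475, 2.0)]                                               # EV = 0.95 per unit bet
multilevel = [(0.0001, 5000.0), (0.01, 20.0), (0.08, 2.0), (0.20, 0.45)]  # EV = 0.95 per unit bet

rng = random.Random(42)
n_players = 200
for label, game in (("even-money", even_money), ("multilevel", multilevel)):
    for n_bets in (500, 10000):               # short session vs. long session
        ahead = sum(play_session(n_bets, game, rng) > 0 for _ in range(n_players))
        print(f"{label:10s} {n_bets:6d} bets: {ahead / n_players:.1%} of players ahead")
```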
High fidelity 3-dimensional models of beam-electron cloud interactions in circular accelerators
NASA Astrophysics Data System (ADS)
Feiz Zarrin Ghalam, Ali
Electron cloud is a low-density electron profile created inside the vacuum chamber of circular machines with positively charged beams. The electron cloud limits the peak current of the beam and degrades the beam's quality through luminosity degradation, emittance growth, and head-tail or bunch-to-bunch instability. The adverse effects of the electron cloud on long-term beam dynamics become more and more important as beams go to higher and higher energies. This problem has become a major concern in the design of many future circular machines, such as the Large Hadron Collider (LHC) under construction at the European Center for Nuclear Research (CERN). Because of the importance of the problem, several simulation models have been developed to model long-term beam-electron cloud interaction. These models are based on a "single kick approximation", in which the electron cloud is assumed to be concentrated in one thin slab around the ring. While this model is efficient in terms of computational cost, it does not reflect the real physical situation, because the forces exerted by the electron cloud on the beam are non-linear, contrary to the model's assumption. To address this limitation of existing codes, a new model is developed in this thesis to model the beam-electron cloud interaction continuously around the ring. The code is derived from a 3-D parallel Particle-In-Cell (PIC) model (QuickPIC) originally used for plasma wakefield acceleration research. To adapt the original model to the circular-machine environment, betatron and synchrotron equations of motion have been added to the code, and the effects of chromaticity and lattice structure have been included. QuickPIC is then benchmarked against one of the codes developed on the basis of the single kick approximation (HEAD-TAIL) for the transverse spot size of the beam in the CERN LHC. The growth predicted by QuickPIC is less than that predicted by HEAD-TAIL. The code is then used to investigate the effect of electron cloud image charges on long-term beam dynamics, particularly on the transverse tune shift of the beam at the CERN Super Proton Synchrotron (SPS) ring. The force from the electron cloud image charges on the beam cancels the force due to the cloud compression formed on the beam axis, and therefore the tune shift is mainly due to the uniform electron cloud density. (Abstract shortened by UMI.)
Composite structural materials
NASA Technical Reports Server (NTRS)
Loewy, R.; Wiberley, S. E.
1986-01-01
Overall emphasis is on basic long-term research in the following categories: constituent materials, composite materials, generic structural elements, processing science technology; and maintaining long-term structural integrity. Research in basic composition, characteristics, and processing science of composite materials and their constituents is balanced against the mechanics, conceptual design, fabrication, and testing of generic structural elements typical of aerospace vehicles so as to encourage the discovery of unusual solutions to present and future problems. Detailed descriptions of the progress achieved in the various component parts of this comprehensive program are presented.
Towards three-dimensional continuum models of self-consistent along-strike megathrust segmentation
NASA Astrophysics Data System (ADS)
Pranger, Casper; van Dinther, Ylona; May, Dave; Le Pourhiet, Laetitia; Gerya, Taras
2016-04-01
At subduction megathrusts, propagation of large ruptures may be confined between the up-dip and down-dip limits of the seismogenic zone. This opens a primary role for lateral rupture dimensions to control the magnitude and severity of megathrust earthquakes. The goal of this study is to improve our understanding of the ways in which the inherent variability of the subduction interface may influence the degree of interseismic locking, and the propensity of a rupture to propagate over regions of variable slip potential. The global absence of a historic record sufficiently long to base risk assessment on, makes us rely on numerical modelling as a way to extend our understanding of the spatio-temporal occurrence of earthquakes. However, the complex interaction of the subduction stress environment, the variability of the subduction interface, and the structure and deformation of the crustal wedge has made it very difficult to construct comprehensive numerical models of megathrust segmentation. We develop and exploit the power of a plastic 3D continuum representation of the subduction megathrust, as well as off-megathrust faulting to model the long-term tectonic build-up of stresses, and their sudden seismic release. The sheer size of the 3D problem, and the time scales covering those of tectonics as well as seismology, force us to explore efficient and accurate physical and numerical techniques. We thus focused our efforts on developing a staggered grid finite difference code that makes use of the PETSc library for massively parallel computing. The code incorporates a newly developed automatic discretization algorithm, which enables it to handle a wide variety of equations with relative ease. The different physical and numerical ingredients - like attenuating visco-elasto-plastic materials, frictional weakening and inertially driven seismic release, and adaptive time marching schemes - most of which have been implemented and benchmarked individually - are now combined into one algorithm. We are working towards presenting the first benchmarked 3D dynamic rupture models as an important step towards seismic cycle modelling of megathrust segmentation in a three-dimensional subduction setting with slow tectonic loading, self consistent fault development, and spontaneous seismicity.
Modelling seagrass growth and development to evaluate transplanting strategies for restoration.
Renton, Michael; Airey, Michael; Cambridge, Marion L; Kendrick, Gary A
2011-10-01
Seagrasses are important marine plants that are under threat globally. Restoration by transplanting vegetative fragments or seedlings into areas where seagrasses have been lost is possible, but long-term trial data are limited. The goal of this study is to use available short-term data to predict long-term outcomes of transplanting seagrass. A functional-structural plant model of seagrass growth that integrates data collected from short-term trials and experiments is presented. The model was parameterized for the species Posidonia australis, a limited validation of the model against independent data and a sensitivity analysis were conducted and the model was used to conduct a preliminary evaluation of different transplanting strategies. The limited validation was successful, and reasonable long-term outcomes could be predicted, based only on short-term data. This approach for modelling seagrass growth and development enables long-term predictions of the outcomes to be made from different strategies for transplanting seagrass, even when empirical long-term data are difficult or impossible to collect. More validation is required to improve confidence in the model's predictions, and inclusion of more mechanism will extend the model's usefulness. Marine restoration represents a novel application of functional-structural plant modelling.
Weismer, Susan Ellis
2015-01-01
Purpose: Spoken language benchmarks proposed by Tager-Flusberg et al. (2009) were used to characterize communication profiles of toddlers with autism spectrum disorders and to investigate if there were differences in variables hypothesized to influence language development at different benchmark levels. Method: The communication abilities of a large sample of toddlers with autism spectrum disorders (N = 105) were characterized in terms of spoken language benchmarks. The toddlers were grouped according to these benchmarks to investigate whether there were differences in selected variables across benchmark groups at a mean age of 2.5 years. Results: The majority of children in the sample presented with uneven communication profiles with relative strengths in phonology and significant weaknesses in pragmatics. When children were grouped according to one expressive language domain, across-group differences were observed in response to joint attention and gestures but not cognition or restricted and repetitive behaviors. Conclusion: The spoken language benchmarks are useful for characterizing early communication profiles and investigating features that influence expressive language growth. PMID:26254475
DOE Office of Scientific and Technical Information (OSTI.GOV)
Will, M.E.; Suter, G.W. II
1994-09-01
One of the initial stages in ecological risk assessments for hazardous waste sites is the screening of contaminants to determine which of them are worthy of further consideration as "contaminants of potential concern." This process is termed "contaminant screening." It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to soil- and litter-dwelling invertebrates, including earthworms, other micro- and macroinvertebrates, or heterotrophic bacteria and fungi. This report presents a standard method for deriving benchmarks for this purpose, sets of data concerning effects of chemicals in soil on invertebrates and soil microbial processes, and benchmarks for chemicals potentially associated with United States Department of Energy sites. In addition, literature describing the experiments from which data were drawn for benchmark derivation is reviewed. Chemicals that are found in soil at concentrations exceeding both the benchmarks and the background concentration for the soil type should be considered contaminants of potential concern.
Structural composite panel performance under long-term load
Theodore L. Laufenberg
1988-01-01
Information on the performance of wood-based structural composite panels under long-term load is currently needed to permit their use in engineered assemblies and systems. A broad assessment of the time-dependent properties of panels is critical for creating databases and models of the creep-rupture phenomenon that lead to reliability-based design procedures. This...
ERIC Educational Resources Information Center
Sarver, Dustin E.; Rapport, Mark D.; Kofler, Michael J.; Scanlan, Sean W.; Raiker, Joseph S.; Altro, Thomas A.; Bolden, Jennifer
2012-01-01
The current study examined individual differences in children's phonological and visuospatial short-term memory as potential mediators of the relationship among attention problems and near- and long-term scholastic achievement. Nested structural equation models revealed that teacher-reported attention problems were associated negatively with…
Pseudo-icosahedral Cr55Al232-δ as a high-temperature protective material
NASA Astrophysics Data System (ADS)
Rosa, R.; Bhattacharya, S.; Pabla, J.; He, H.; Misuraca, J.; Nakajima, Y.; Bender, A. D.; Antonacci, A. K.; Adrip, W.; McNally, D. E.; Zebro, A.; Kamenov, P.; Geschwind, G.; Ghose, S.; Dooryhee, E.; Ibrahim, A.; Tritt, T. M.; Aronson, M. C.; Simonson, J. W.
2018-03-01
We report here a course of basic research into the potential suitability of a pseudo-icosahedral Cr aluminide as a material for high-temperature protective coatings. Cr55Al232-δ [δ = 2.70(6)] exhibits high hardness at room temperature as well as low thermal conductivity and excellent oxidation resistance at 973 K, with an oxidation rate comparable to those of softer, denser benchmark materials. The origin of these promising properties can be traced to competing long-range and short-range symmetries within the pseudo-icosahedral crystal structure, suggesting new criteria for future materials research.
Integrated control/structure optimization by multilevel decomposition
NASA Technical Reports Server (NTRS)
Zeiler, Thomas A.; Gilbert, Michael G.
1990-01-01
A method for integrated control/structure optimization by multilevel decomposition is presented. It is shown that several previously reported methods were actually partial decompositions wherein only the control was decomposed into a subsystem design. One of these partially decomposed problems was selected as a benchmark example for comparison. The system is fully decomposed into structural and control subsystem designs and an improved design is produced. Theory, implementation, and results for the method are presented and compared with the benchmark example.
Short-Term Field Study Programs: A Holistic and Experiential Approach to Learning
ERIC Educational Resources Information Center
Long, Mary M.; Sandler, Dennis M.; Topol, Martin T.
2017-01-01
For business schools, AACSB and Middle States' call for more experiential learning is one reason to provide study abroad programs. Universities must attend to the demand for continuous improvement and employ metrics to benchmark and evaluate their relative standing among peer institutions. One such benchmark is the National Survey of Student…
Taking Aims: New CASE Study Benchmarks Advancement Investments and Returns
ERIC Educational Resources Information Center
Goldsmith, Rae
2012-01-01
Advancement professionals have always been thirsty for information that will help them understand how their programs compare with those of their peers. But in recent years the demand for benchmarking data has exploded as budgets have become leaner, leaders have become more business minded, and terms like "performance metrics and return on…
Friedman, Michael A.; Bailey, Alyssa M.; Rondon, Matthew J.; McNerny, Erin M.; Sahar, Nadder D.; Kohn, David H.
2016-01-01
Exercise has long-lasting benefits to bone health that may help prevent fractures by increasing bone mass, bone strength, and tissue quality. Long-term exercise of 6–12 weeks in rodents increases bone mass and bone strength. However, in growing mice, a short-term exercise program of 3 weeks can limit increases in bone mass and structural strength, compared to non-exercised controls. Short-term exercise can, however, increase tissue strength, suggesting that exercise may create competition for minerals that favors initially improving tissue-level properties over structural-level properties. It was therefore hypothesized that adding calcium and phosphorus supplements to the diet may prevent decreases in bone mass and structural strength during a short-term exercise program, while leading to greater bone mass and structural strength than exercise alone after a long-term exercise program. A short-term exercise experiment was done for 3 weeks, and a long-term exercise experiment was done for 8 weeks. For each experiment, male 16-week old C57BL/6 mice were assigned to 4 weight-matched groups–exercise and non-exercise groups fed a control or mineral-supplemented diet. Exercise consisted of treadmill running at 12 m/min, 30 min/day for 7 days/week. After 3 weeks, exercised mice fed the supplemented diet had significantly increased tibial tissue mineral content (TMC) and cross-sectional area over exercised mice fed the control diet. After 8 weeks, tibial TMC, cross-sectional area, yield force, and ultimate force were greater from the combined treatments than from either exercise or supplemented diet alone. Serum markers of bone formation (PINP) and resorption (CTX) were both decreased by exercise on day 2. In exercised mice, day 2 PINP was significantly positively correlated with day 2 serum Ca, a correlation that was weaker and negative in non-exercised mice. Increasing dietary mineral consumption during an exercise program increases bone mass after 3 weeks and increases structural strength after 8 weeks, making bones best able to resist fracture. PMID:27008546
NASA Astrophysics Data System (ADS)
Lutz, Jesse J.; Duan, Xiaofeng F.; Ranasinghe, Duminda S.; Jin, Yifan; Margraf, Johannes T.; Perera, Ajith; Burggraf, Larry W.; Bartlett, Rodney J.
2018-05-01
Accurate optical characterization of the closo-Si12C12 molecule is important to guide experimental efforts toward the synthesis of nano-wires, cyclic nano-arrays, and related array structures, which are anticipated to be robust and efficient exciton materials for opto-electronic devices. Working toward calibrated methods for the description of closo-Si12C12 oligomers, various electronic structure approaches are evaluated for their ability to reproduce measured optical transitions of the SiC2, Si2Cn (n = 1-3), and Si3Cn (n = 1, 2) clusters reported earlier by Steglich and Maier [Astrophys. J. 801, 119 (2015)]. Complete-basis-limit equation-of-motion coupled-cluster (EOMCC) results are presented and a comparison is made between perturbative and renormalized non-iterative triples corrections. The effect of adding a renormalized correction for quadruples is also tested. Benchmark test sets derived from both measurement and high-level EOMCC calculations are then used to evaluate the performance of a variety of density functionals within the time-dependent density functional theory (TD-DFT) framework. The best-performing functionals are subsequently applied to predict valence TD-DFT excitation energies for the lowest-energy isomers of SinC and Sin-1C7-n (n = 4-6). TD-DFT approaches are then applied to the SinCn (n = 4-12) clusters and unique spectroscopic signatures of closo-Si12C12 are discussed. Finally, various long-range corrected density functionals, including those from the CAM-QTP family, are applied to a charge-transfer excitation in a cyclic (Si4C4)4 oligomer. Approaches for gauging the extent of charge-transfer character are also tested and EOMCC results are used to benchmark functionals and make recommendations.
Benchmarking the SPHINX and CTH shock physics codes for three problems in ballistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, L.T.; Hertel, E.; Schwalbe, L.
1998-02-01
The CTH Eulerian hydrocode and the SPHINX smooth particle hydrodynamics (SPH) code were used to model a shock tube, two long rod penetrations into semi-infinite steel targets, and a long rod penetration into a spaced plate array. The results were then compared to experimental data. Both SPHINX and CTH modeled the one-dimensional shock tube problem well. Both codes did a reasonable job in modeling the outcome of the axisymmetric rod impact problem. Neither code correctly reproduced the depth of penetration in both experiments. In the 3-D problem, both codes reasonably replicated the penetration of the rod through the first plate. After this, however, the predictions of both codes began to diverge from the results seen in the experiment. In terms of computer resources, the run times are problem dependent, and are discussed in the text.
NASA Astrophysics Data System (ADS)
Rönnby, Johan
2007-12-01
The interaction between humans and the maritime coastal landscape must be one of the central theoretical questions for maritime archaeology. How should an academic discipline, which is defined by its studies in a certain physical milieu, avoid the trap of environmental determinism and still be able to argue for the special influence of the maritime factor? And how should this long-term relation to the sea be interpreted and described? In this article, based mainly on material from the central Swedish Baltic Sea coast, three examples of long-term structures in the relationship between people and the sea are discussed. These structures, here called "maritime durees", which almost all coastal inhabitants in the analyzed area seem to have had in common, are linked to the exploitation of marine resources, communication over water, and the mental presence of the sea. In conclusion, the actual meaning of these long-term structures for everyday life and for cultural and social change is discussed in comparison with more short-term structures: the changing historical circumstances and the possibilities for people to choose different strategies.
NASA Astrophysics Data System (ADS)
Rimov, A. A.; Chukanova, T. I.; Trofimov, Yu. V.
2016-12-01
Data on the variants of comparative quality analysis (benchmarking) of power installations applied in the power industry are systematized. It is shown that the most efficient way of implementing the benchmarking technique is to analyze the statistical distributions of the indicators within a composed homogeneous group of similar power installations. Building on this approach, a benchmarking technique is developed that is aimed at revealing the available reserves for improving the reliability and heat-efficiency indicators of the power installations of thermal power plants. The technique makes it possible to reliably compare the quality of power installations within a homogeneous group of limited size and to adopt an adequate decision on improving particular technical characteristics of a given power installation. The technique structures the list of comparison indicators and the internal factors affecting them, represented in accordance with the requirements of the sectoral standards and taking into account the price-formation characteristics of the Russian power industry. This structuring ensures traceability of the reasons for deviations of the internal influencing factors from their specified values. The starting point for further detailed analysis of the lag of a certain power installation behind the best practice, expressed in specific monetary terms, is the positioning of this power installation on the distribution of the key indicator, which is a convolution of the comparison indicators. The distribution of the key indicator is simulated by the Monte Carlo method after obtaining the actual distributions of the comparison indicators: the specific lost profit due to undersupply of electric energy and power, the specific cost of losses due to non-optimal expenditures on repairs, and the specific cost of excess fuel-equivalent consumption. Quality loss indicators are developed to facilitate the analysis of the benchmarking results; they represent the quality loss of a given power installation as the difference between the actual value of the key indicator or a comparison indicator and the best quartile of the existing distribution. The uncertainty of the obtained values of the quality loss indicators was evaluated by transforming the standard uncertainties of the input values into the expanded uncertainties of the output values at a confidence level of 95%. The efficiency of the technique is demonstrated by benchmarking the main thermal and mechanical equipment of the T-250 extraction power-generating units and power installations of thermal power plants with a main steam pressure of 130 atm.
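A highly simplified sketch of the arithmetic just described (the unit names, indicator means, and spreads are hypothetical, not taken from the paper): the key indicator of each installation is obtained as a Monte Carlo "convolution" (here, a sum of independently sampled comparison indicators), and the quality loss is the excess of a unit's key indicator over the best quartile of the pooled group distribution.

```python
# Benchmarking a homogeneous group of power installations (hypothetical numbers).
import random
import statistics

rng = random.Random(7)
group = {   # homogeneous group of units: (mean, std) of each specific-cost indicator
    "unit_A": {"lost_profit": (1.8, 0.4), "repair_overspend": (0.9, 0.3), "excess_fuel": (1.2, 0.2)},
    "unit_B": {"lost_profit": (2.6, 0.5), "repair_overspend": (1.4, 0.4), "excess_fuel": (1.1, 0.2)},
    "unit_C": {"lost_profit": (1.2, 0.3), "repair_overspend": (0.7, 0.2), "excess_fuel": (0.9, 0.1)},
}

def key_indicator_samples(indicators, n=10000):
    """Monte Carlo convolution: sum of independently sampled comparison indicators."""
    return [sum(max(0.0, rng.gauss(m, s)) for m, s in indicators.values()) for _ in range(n)]

samples = {unit: key_indicator_samples(ind) for unit, ind in group.items()}
pooled = sorted(x for s in samples.values() for x in s)
best_quartile = pooled[len(pooled) // 4]                 # 25th percentile of the group

for unit, s in samples.items():
    actual = statistics.mean(s)
    loss = max(0.0, actual - best_quartile)
    print(f"{unit}: key indicator {actual:.2f}, quality loss vs best quartile {loss:.2f}")
```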
Homogenising time series: beliefs, dogmas and facts
NASA Astrophysics Data System (ADS)
Domonkos, P.
2011-06-01
In recent decades various homogenisation methods have been developed, but the real effects of their application on time series are still not sufficiently known. The ongoing COST action HOME (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics closely approximate those of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. Empirical results show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities often have statistical characteristics similar to natural changes caused by climatic variability; thus a pure application of the classic theory, in which change-points of observed time series are found and corrected one by one, is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in the raw time series. Some problems of detecting multiple structures of inhomogeneities, as well as of time-series comparisons within homogenisation procedures, are discussed briefly in the study.
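The difficulty of finding small change-points one by one can be illustrated with a minimal sketch (not part of the COST HOME benchmark): a synthetic series contains a single small shift hidden in climatic-scale noise, and a simplified SNHT-like detector searches for the break position by maximizing a two-sample t statistic. The series length, shift size, and noise level are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic annual anomaly series with one small inhomogeneity:
# a +0.4 unit shift after year 60, buried in natural variability (sigma = 1.0).
n, break_year, shift = 100, 60, 0.4
x = rng.normal(0.0, 1.0, n)
x[break_year:] += shift

# Classic single change-point search: maximize the two-sample t statistic
# over all candidate break positions (a simplified SNHT-like detector).
def best_break(series):
    best_k, best_t = None, 0.0
    for k in range(10, len(series) - 10):          # avoid tiny segments
        a, b = series[:k], series[k:]
        pooled = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        t = abs(a.mean() - b.mean()) / pooled
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t

k, t = best_break(x)
print(f"detected break at year {k} (true: {break_year}), t = {t:.2f}")
# With a shift this small relative to the noise, the detected position is often
# wrong or the signal insignificant -- illustrating why one-by-one correction fails.
```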
Liao, Peilin; Carter, Emily A
2011-09-07
Quantitative characterization of low-lying excited electronic states in materials is critical for the development of solar energy conversion materials. The many-body Green's function method known as the GW approximation (GWA) directly probes states corresponding to photoemission and inverse photoemission experiments, thereby determining the associated band structure. Several versions of the GW approximation with different levels of self-consistency exist in the field. While the GWA based on density functional theory (DFT) works well for conventional semiconductors, less is known about its reliability for strongly correlated semiconducting materials. Here we present a systematic study of the GWA using hematite (α-Fe(2)O(3)) as the benchmark material. We analyze its performance in terms of the calculated photoemission/inverse photoemission band gaps, densities of states, and dielectric functions. Overall, a non-self-consistent G(0)W(0) using input from DFT+U theory produces physical observables in best agreement with experiments. This journal is © the Owner Societies 2011
The effect of patterning options on embedded memory cells in logic technologies at iN10 and iN7
NASA Astrophysics Data System (ADS)
Appeltans, Raf; Weckx, Pieter; Raghavan, Praveen; Kim, Ryoung-Han; Kar, Gouri Sankar; Furnémont, Arnaud; Van der Perre, Liesbet; Dehaene, Wim
2017-03-01
Static Random Access Memory (SRAM) cells are used together with logic standard cells as the benchmark to develop the process flow for new logic technologies. In order to achieve successful integration of Spin-Transfer Torque Magnetic Random Access Memory (STT-MRAM) as an area-efficient higher-level embedded cache, it also needs to be included as a benchmark. The simple cell structure of STT-MRAM brings extra patterning challenges to achieve high density. The two memory types are compared in terms of minimum area and critical design rules in both the iN10 and iN7 nodes, with an extra focus on patterning options in iN7. Both Self-Aligned Quadruple Patterning (SAQP) mandrel and spacer engineering, as well as multi-level vias, are explored. These patterning options result in large area gains for the STT-MRAM cell and, moreover, determine which cell variant is the smallest.
Glasgow Coma Scale and Outcomes after Structural Traumatic Head Injury in Early Childhood
Heather, Natasha L.; Derraik, José G. B.; Beca, John; Hofman, Paul L.; Dansey, Rangi; Hamill, James; Cutfield, Wayne S.
2013-01-01
Objective To assess the association of the Glasgow Coma Scale (GCS) with radiological evidence of head injury (the Abbreviated Injury Scale for the head region, AIS-HR) in young children hospitalized with traumatic head injury (THI), and the predictive value of GCS and AIS-HR scores for long-term impairment. Methods Our study involved a 10-year retrospective review of a database encompassing all patients admitted to Starship Children’s Hospital (Auckland, New Zealand, 2000–2010) with THI. Results We studied 619 children aged <5 years at the time of THI, with long-term outcome data available for 161 subjects. Both GCS and AIS-HR scores were predictive of length of intensive care unit and hospital stay (all p<0.001). GCS was correlated with AIS-HR (ρ=-0.46; p<0.001), although mild GCS scores (13–15) commonly under-estimated the severity of radiological injury: 42% of children with mild GCS scores had serious–critical THI (AIS-HR 3–5). Increasingly severe GCS or AIS-HR scores were both associated with a greater likelihood of long-term impairment (neurological disability, residual problems, and educational support). However, long-term impairment was also relatively common in children with mild GCS scores paired with structural THI more severe than a simple linear skull fracture. Conclusion Severe GCS scores will identify most cases of severe radiological injury in early childhood, and are good predictors of poor long-term outcome. However, young children admitted to hospital with structural THI and mild GCS scores have an appreciable risk of long-term disability, and also warrant long-term follow-up. PMID:24312648
Bauer, Matthias R; Ibrahim, Tamer M; Vogel, Simon M; Boeckler, Frank M
2013-06-24
The application of molecular benchmarking sets helps to assess the actual performance of virtual screening (VS) workflows. To improve the efficiency of structure-based VS approaches, the selection and optimization of various parameters can be guided by benchmarking. With the DEKOIS 2.0 library, we aim to further extend and complement the collection of publicly available decoy sets. Based on BindingDB bioactivity data, we provide 81 new and structurally diverse benchmark sets for a wide variety of different target classes. To ensure a meaningful selection of ligands, we address several issues that can be found in bioactivity data. We have improved our previously introduced DEKOIS methodology with enhanced physicochemical matching, now including the consideration of molecular charges, as well as a more sophisticated elimination of latent actives in the decoy set (LADS). We evaluate the docking performance of Glide, GOLD, and AutoDock Vina with our data sets and highlight existing challenges for VS tools. All DEKOIS 2.0 benchmark sets will be made accessible at http://www.dekois.com.
Customer assessment of long-term care pharmacy provider services.
Clark, Thomas R
2008-09-01
Assess performance of long-term care pharmacy providers on key services offered to nursing facilities. Cross-sectional; nursing facility team. Random phone survey of nursing facility team members. 485 nursing facility team members (practicing in nursing facilities, interacting with ≥1 consultant pharmacist); 46 members excluded, unable to identify facility's pharmacy provider. Directors of nursing, medical directors, and administrators were asked to rate long-term care pharmacy provider performance of eight commonly offered pharmacy services. All groups evaluated pharmacy provider performance of these services using a five-point scale. Results are broken down by employer type. Average rating for eight pharmacy services was 3.64. Top two services: "Labeling medications accurately" ranked in top 1-2 services for all groups (combined rating of 3.97) and "Provides medication administration system" ranked in top 1-3 services for all groups (combined rating of 3.95). One service, "Provides educational inservices," ranked lowest for all groups (combined rating of 3.54). In general, when looking at the eight services in combination for all providers, all services were ranked between Good and Very Good (average score of 3.64). Therefore, while the pharmacy provider is performing above average for these services, there is room for improvement in all of these services. These results can be used as a benchmark. Detailed data results and sample surveys are available online at www.ascp.com/supplements. These surveys can be used by the pharmacy provider to solicit assessments from their own facilities on these services.
Benchmarking an Unstructured-Grid Model for Tsunami Current Modeling
NASA Astrophysics Data System (ADS)
Zhang, Yinglong J.; Priest, George; Allan, Jonathan; Stimely, Laura
2016-12-01
We present model results derived from a tsunami current benchmarking workshop held by the NTHMP (National Tsunami Hazard Mitigation Program) in February 2015. Modeling was undertaken using our own 3D unstructured-grid model that has been previously certified by the NTHMP for tsunami inundation. Results for two benchmark tests are described here, including: (1) vortex structure in the wake of a submerged shoal and (2) impact of tsunami waves on Hilo Harbor in the 2011 Tohoku event. The modeled current velocities are compared with available lab and field data. We demonstrate that the model is able to accurately capture the velocity field in the two benchmark tests; in particular, the 3D model gives a much more accurate wake structure than the 2D model for the first test, with the root-mean-square error and mean bias no more than 2 cm s⁻¹ and 8 mm s⁻¹, respectively, for the modeled velocity.
NASA Astrophysics Data System (ADS)
Asadollahi, Parisa; Li, Jian
2016-04-01
Understanding the dynamic behavior of complex structures such as long-span bridges requires dense deployment of sensors. Traditional wired sensor systems are generally expensive and time-consuming to install due to cabling. With wireless communication and on-board computation capabilities, wireless smart sensor networks have the advantages of being low cost, easy to deploy and maintain and therefore facilitate dense instrumentation for structural health monitoring. A long-term monitoring project was recently carried out for a cable-stayed bridge in South Korea with a dense array of 113 smart sensors, which feature the world's largest wireless smart sensor network for civil structural monitoring. This paper presents a comprehensive statistical analysis of the modal properties including natural frequencies, damping ratios and mode shapes of the monitored cable-stayed bridge. Data analyzed in this paper is composed of structural vibration signals monitored during a 12-month period under ambient excitations. The correlation between environmental temperature and the modal frequencies is also investigated. The results showed the long-term statistical structural behavior of the bridge, which serves as the basis for Bayesian statistical updating for the numerical model.
Engineering and Design: Structural Deformation Surveying
2002-06-01
loading deformations. Long-term measurements are far more common and somewhat more complex given their external nature. Long-term monitoring of a...fitting of structural elements, environmental protection, and development of mitigative measures in the case of natural disasters (landslides, earthquakes...of additional localized monitoring points (i.e., points not intended for routine observation) to determine the nature and extent of large displacements
ERIC Educational Resources Information Center
Wu, Ying-Tien; Tsai, Chin-Chung
2005-01-01
The main purpose of this study was to explore the effects of long-term constructivist-oriented science instruction on elementary school students' process of constructing cognitive structures. Furthermore, such effects on different science achievers were also investigated. The subjects of this study were 69 fifth graders in Taiwan, while they were…
InfAcrOnt: calculating cross-ontology term similarities using information flow by a random walk.
Cheng, Liang; Jiang, Yue; Ju, Hong; Sun, Jie; Peng, Jiajie; Zhou, Meng; Hu, Yang
2018-01-19
Since the establishment of the first biomedical ontology, the Gene Ontology (GO), the number of biomedical ontologies has increased dramatically. Nowadays over 300 ontologies have been built, including the extensively used Disease Ontology (DO) and Human Phenotype Ontology (HPO). Because of its value for identifying novel relationships between terms, calculating similarity between ontology terms is one of the major tasks in this research area. Though similarities between terms within each ontology have been studied with in silico methods, term similarities across different ontologies have not been investigated as deeply. The latest method took advantage of the gene functional interaction network (GFIN) to explore such inter-ontology similarities of terms. However, it used only gene interactions and failed to make full use of the connectivity among gene nodes of the network. In addition, all existing methods are designed specifically for GO, and their performance on the wider ontology community remains unknown. We propose a method, InfAcrOnt, to infer similarities between terms across ontologies utilizing the entire GFIN. InfAcrOnt builds a term-gene-gene network comprising ontology annotations and the GFIN, and obtains similarities between terms across ontologies by modeling the information flow within the network with a random walk. In our benchmark experiments on sub-ontologies of GO, InfAcrOnt achieves a high average area under the receiver operating characteristic curve (AUC) (0.9322 and 0.9309) and low standard deviations (1.8746e-6 and 3.0977e-6) in both human and yeast benchmark datasets, exhibiting superior performance. Meanwhile, comparisons of InfAcrOnt results and prior knowledge on pair-wise DO-HPO terms and pair-wise DO-GO terms show high correlations. The experiment results show that InfAcrOnt significantly improves the performance of inferring similarities between terms across ontologies in the benchmark sets.
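As a rough illustration of the information-flow idea (not the authors' implementation), the sketch below builds a tiny term-gene-gene network, column-normalizes it into a transition matrix, runs a random walk with restart from one ontology term, and reads the visit probability at a term from another ontology as a toy cross-ontology similarity. The network, restart probability, and node names are all invented.

```python
import numpy as np

# Toy term-gene-gene network: two ontology terms (from different ontologies)
# annotated to genes, plus gene-gene functional interactions (GFIN).
# Node order (assumed for illustration): [termA, termB, g1, g2, g3, g4]
nodes = ["termA", "termB", "g1", "g2", "g3", "g4"]
A = np.array([
    # termA termB g1 g2 g3 g4
    [0,     0,    1, 1, 0, 0],   # termA annotated to g1, g2
    [0,     0,    0, 1, 1, 0],   # termB annotated to g2, g3
    [1,     0,    0, 1, 0, 0],   # g1 interacts with g2
    [1,     1,    1, 0, 1, 0],   # g2 interacts with g1, g3
    [0,     1,    0, 1, 0, 1],   # g3 interacts with g2, g4
    [0,     0,    0, 0, 1, 0],   # g4 interacts with g3
], dtype=float)

W = A / A.sum(axis=0, keepdims=True)   # column-normalized transition matrix

def rwr(seed_index, restart=0.5, tol=1e-10, max_iter=1000):
    """Random walk with restart from one node; returns stationary visit probabilities."""
    e = np.zeros(len(nodes)); e[seed_index] = 1.0
    p = e.copy()
    for _ in range(max_iter):
        p_new = (1 - restart) * W @ p + restart * e
        if np.abs(p_new - p).sum() < tol:
            break
        p = p_new
    return p

# Information flow from termA; the probability arriving at termB serves as a
# (toy) cross-ontology similarity score.
p_from_A = rwr(nodes.index("termA"))
print("similarity(termA, termB) ~", p_from_A[nodes.index("termB")])
```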
Integrated control/structure optimization by multilevel decomposition
NASA Technical Reports Server (NTRS)
Zeiler, Thomas A.; Gilbert, Michael G.
1990-01-01
A method for integrated control/structure optimization by multilevel decomposition is presented. It is shown that several previously reported methods were actually partial decompositions wherein only the control was decomposed into a subsystem design. One of these partially decomposed problems was selected as a benchmark example for comparison. The present paper fully decomposes the system into structural and control subsystem designs and produces an improved design. Theory, implementation, and results for the method are presented and compared with the benchmark example.
Effect of stresses on the structural changes in high-chromium steel upon creep
NASA Astrophysics Data System (ADS)
Fedoseeva, A. E.; Dudova, N. R.; Kaibyshev, R. O.
2017-06-01
The effect of stresses on the microstructure and dispersed particles in a heat-resistant Fe-0.12C-0.06Si-0.04Ni-0.2Mn-9.5Cr-3.2Co-0.45Mo-3.1W-0.2V-0.06Nb-0.005B-0.05N (wt %) steel has been studied in long-term strength tests at T = 650°C under initial applied stresses ranging from 220 down to 100 MPa in steps of 20 MPa. Under an applied stress of 160 MPa, which corresponds to a time to fracture of 1703 h, a transition from short-term to long-term creep takes place. It has been shown that alloying with 3% Co and an increase in the W content to 3% significantly increase the short-term creep resistance and slightly increase the long-term strength in tests longer than 10⁴ h. The transition from short-term to long-term creep is accompanied by substantial changes in the microstructure of the steel. Under long-term creep, the solid solution becomes depleted of tungsten and molybdenum down to the thermodynamically equilibrium content of these elements, which leads to the precipitation of a large amount of fine particles of the Laves phase at the boundaries of laths and prior austenitic grains. At times to fracture of more than 4 × 10³ h, coalescence of the M23C6 carbides and Laves-phase particles occurs, which transforms the fine tempered-martensite lath structure into a subgrain structure.
Rclick: a web server for comparison of RNA 3D structures.
Nguyen, Minh N; Verma, Chandra
2015-03-15
RNA molecules play important roles in key biological processes in the cell and are becoming attractive for developing therapeutic applications. Since the function of RNA depends on its structure and dynamics, comparing and classifying the RNA 3D structures is of crucial importance to molecular biology. In this study, we have developed Rclick, a web server that is capable of superimposing RNA 3D structures by using clique matching and 3D least-squares fitting. Our server Rclick has been benchmarked and compared with other popular servers and methods for RNA structural alignments. In most cases, Rclick alignments were better in terms of structure overlap. Our server also recognizes conformational changes between structures. For this purpose, the server produces complementary alignments to maximize the extent of detectable similarity. Various examples showcase the utility of our web server for comparison of RNA, RNA-protein complexes and RNA-ligand structures. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Agoraphobia: an outreach treatment programme.
Croft, Alison; Hackmann, Ann
2013-05-01
Agoraphobia is disabling and clients find it hard to access effective treatment. This paper describes the development of an inexpensive service, delivered by trained volunteers in or near the client's own home, including the selection, training and supervision of volunteers. Outcomes were evaluated over 5 years and compared with those available from the local psychology service. Effect sizes on all measures were high. Benchmarking indicated that results on comparable measures were not significantly different from the local psychology service. As in many previous studies, the drop-out rate was fairly high. This model worked well, and was inexpensive and effective. Further research on long-term outcome and methods of enhancing engagement is needed.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., the activity causing the authorized impacts. (b) Sustainability. Compensatory mitigation projects... sustainability. Where active long-term management and maintenance are necessary to ensure long-term sustainability (e.g., prescribed burning, invasive species control, maintenance of water control structures...
Code of Federal Regulations, 2014 CFR
2014-07-01
..., the activity causing the authorized impacts. (b) Sustainability. Compensatory mitigation projects... sustainability. Where active long-term management and maintenance are necessary to ensure long-term sustainability (e.g., prescribed burning, invasive species control, maintenance of water control structures...
Code of Federal Regulations, 2011 CFR
2011-07-01
..., the activity causing the authorized impacts. (b) Sustainability. Compensatory mitigation projects... sustainability. Where active long-term management and maintenance are necessary to ensure long-term sustainability (e.g., prescribed burning, invasive species control, maintenance of water control structures...
Code of Federal Regulations, 2013 CFR
2013-07-01
..., the activity causing the authorized impacts. (b) Sustainability. Compensatory mitigation projects... sustainability. Where active long-term management and maintenance are necessary to ensure long-term sustainability (e.g., prescribed burning, invasive species control, maintenance of water control structures...
ERIC Educational Resources Information Center
Stern, Luli; Ahlgren, Andrew
2002-01-01
Project 2061 of the American Association for the Advancement of Science (AAAS) developed and field-tested a procedure for analyzing curriculum materials, including assessments, in terms of contribution to the attainment of benchmarks and standards. Using this procedure, Project 2061 produced a database of reports on nine science middle school…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Will, M.E.
1994-01-01
This report presents a standard method for deriving benchmarks for the purpose of "contaminant screening," performed by comparing measured ambient concentrations of chemicals with the benchmarks. The work was performed under Work Breakdown Structure 1.4.12.2.3.04.07.02 (Activity Data Sheet 8304). In addition, this report presents sets of data concerning the effects of chemicals in soil on invertebrates and soil microbial processes, benchmarks for chemicals potentially associated with United States Department of Energy sites, and literature describing the experiments from which data were drawn for benchmark derivation.
Esposito, Fabrizio; Formisano, Elia; Seifritz, Erich; Goebel, Rainer; Morrone, Renato; Tedeschi, Gioacchino; Di Salle, Francesco
2002-07-01
Independent component analysis (ICA) has been successfully employed to decompose functional MRI (fMRI) time-series into sets of activation maps and associated time-courses. Several ICA algorithms have been proposed in the neural network literature. Applied to fMRI, these algorithms might lead to different spatial or temporal readouts of brain activation. We compared the two ICA algorithms that have been used so far for spatial ICA (sICA) of fMRI time-series: the Infomax (Bell and Sejnowski [1995]: Neural Comput 7:1004-1034) and the Fixed-Point (Hyvärinen [1999]: Adv Neural Inf Proc Syst 10:273-279) algorithms. We evaluated the Infomax- and Fixed-Point-based sICA decompositions of simulated motor, and real motor and visual activation fMRI time-series using an ensemble of measures. Log-likelihood (McKeown et al. [1998]: Hum Brain Mapp 6:160-188) was used as a measure of how significantly the estimated independent sources fit the statistical structure of the data; receiver operating characteristics (ROC) and linear correlation analyses were used to evaluate the algorithms' accuracy of estimating the spatial layout and the temporal dynamics of simulated and real activations; cluster sizing calculations and an estimation of a residual Gaussian noise term within the components were used to examine the anatomic structure of ICA components and for the assessment of noise reduction capabilities. Whereas both algorithms produced highly accurate results, the Fixed-Point outperformed the Infomax in terms of spatial and temporal accuracy as long as inferential statistics were employed as benchmarks. Conversely, the Infomax sICA was superior in terms of global estimation of the ICA model and noise reduction capabilities. Because of its adaptive nature, the Infomax approach appears to be better suited to investigate activation phenomena that are not predictable or adequately modelled by inferential techniques. Copyright 2002 Wiley-Liss, Inc.
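For readers who want to reproduce the flavor of such a comparison, the following is a minimal sketch of spatial ICA on a synthetic fMRI-like matrix using scikit-learn's FastICA (a fixed-point algorithm); it is not the authors' pipeline, and the data dimensions, source maps, and noise level are assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Synthetic fMRI-like data: 120 time points x 2000 voxels, built from two
# spatially independent "activation" maps with associated time courses plus noise.
n_time, n_vox = 120, 2000
maps = np.zeros((2, n_vox))
maps[0, 100:200] = 1.0          # first spatial source
maps[1, 1500:1600] = 1.0        # second spatial source
timecourses = rng.standard_normal((n_time, 2))
X = timecourses @ maps + 0.1 * rng.standard_normal((n_time, n_vox))

# Spatial ICA: treat voxels as samples, so decompose the transposed matrix.
ica = FastICA(n_components=2, random_state=0, max_iter=1000)
spatial_components = ica.fit_transform(X.T).T   # (n_components, n_voxels) maps
associated_tcs = ica.mixing_                    # (n_time, n_components) time courses

print(spatial_components.shape, associated_tcs.shape)
```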
NASA Technical Reports Server (NTRS)
Tuttle, M. E.; Brinson, H. F.
1986-01-01
The impact of slight errors in measured viscoelastic parameters on subsequent long-term viscoelastic predictions is numerically evaluated using the Schapery nonlinear viscoelastic model. Of the seven Schapery parameters, the results indicated that long-term predictions were most sensitive to errors in the power law parameter n. Although errors in the other parameters were significant as well, errors in n dominated all other factors at long times. The process of selecting an appropriate short-term test cycle so as to ensure an accurate long-term prediction was considered, and a short-term test cycle was selected using material properties typical for T300/5208 graphite-epoxy at 149 C. The process of selection is described, and its individual steps are itemized.
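A small numerical sketch (not the authors' analysis) shows why the power-law exponent dominates at long times: for a reduced power-law creep compliance D(t) = D0 + D1*t^n with assumed parameter values, a 5% error in n grows with time, whereas the same relative error in D1 gives a roughly time-independent offset.

```python
import numpy as np

# Assumed illustrative parameters for a power-law creep compliance D(t) = D0 + D1 * t**n
D0, D1, n = 0.5e-6, 0.05e-6, 0.2        # 1/MPa, 1/MPa, dimensionless
t_short, t_long = 1e3, 1e8              # seconds: short-term test vs long-term prediction

def compliance(t, D0, D1, n):
    return D0 + D1 * t**n

for label, t in [("short-term", t_short), ("long-term", t_long)]:
    base = compliance(t, D0, D1, n)
    err_n  = compliance(t, D0, D1, n * 1.05) / base - 1.0   # +5% error in n
    err_D1 = compliance(t, D0, D1 * 1.05, n) / base - 1.0   # +5% error in D1
    print(f"{label}: +5% in n -> {err_n:+.1%} in D(t); +5% in D1 -> {err_D1:+.1%} in D(t)")
```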
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hedegård, Erik Donovan, E-mail: erik.hedegard@phys.chem.ethz.ch; Department of Physics, Chemistry and Pharmacy, University of Southern Denmark, Campusvej 55, DK-5230 Odense; Olsen, Jógvan Magnus Haugaard
2015-03-21
We present here the coupling of a polarizable embedding (PE) model to the recently developed multiconfiguration short-range density functional theory method (MC-srDFT), which can treat multiconfigurational systems with a simultaneous account for dynamical and static correlation effects. PE-MC-srDFT is designed to combine efficient treatment of complicated electronic structures with inclusion of effects from the surrounding environment. The environmental effects encompass classical electrostatic interactions as well as polarization of both the quantum region and the environment. Using response theory, molecular properties such as excitation energies and oscillator strengths can be obtained. The PE-MC-srDFT method and the additional terms required for linear response have been implemented in a development version of DALTON. To benchmark the PE-MC-srDFT approach against the literature data, we have investigated the low-lying electronic excitations of acetone and uracil, both immersed in water solution. The PE-MC-srDFT results are consistent and accurate, both in terms of the calculated solvent shift and, unlike regular PE-MCSCF, also with respect to the individual absolute excitation energies. To demonstrate the capabilities of PE-MC-srDFT, we also investigated the retinylidene Schiff base chromophore embedded in the channelrhodopsin protein. While using a much more compact reference wave function in terms of active space, our PE-MC-srDFT approach yields excitation energies comparable in quality to CASSCF/CASPT2 benchmarks.
Modelling seagrass growth and development to evaluate transplanting strategies for restoration
Renton, Michael; Airey, Michael; Cambridge, Marion L.; Kendrick, Gary A.
2011-01-01
Background and Aims Seagrasses are important marine plants that are under threat globally. Restoration by transplanting vegetative fragments or seedlings into areas where seagrasses have been lost is possible, but long-term trial data are limited. The goal of this study is to use available short-term data to predict long-term outcomes of transplanting seagrass. Methods A functional–structural plant model of seagrass growth that integrates data collected from short-term trials and experiments is presented. The model was parameterized for the species Posidonia australis, a limited validation of the model against independent data and a sensitivity analysis were conducted and the model was used to conduct a preliminary evaluation of different transplanting strategies. Key Results The limited validation was successful, and reasonable long-term outcomes could be predicted, based only on short-term data. Conclusions This approach for modelling seagrass growth and development enables long-term predictions of the outcomes to be made from different strategies for transplanting seagrass, even when empirical long-term data are difficult or impossible to collect. More validation is required to improve confidence in the model's predictions, and inclusion of more mechanism will extend the model's usefulness. Marine restoration represents a novel application of functional–structural plant modelling. PMID:21821624
A Seafloor Benchmark for 3-dimensional Geodesy
NASA Astrophysics Data System (ADS)
Chadwell, C. D.; Webb, S. C.; Nooner, S. L.
2014-12-01
We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone. Using a ROV to place and remove sensors on the benchmarks will significantly reduce the number of sensors required by the community to monitor offshore strain in subduction zones.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, Peter; Bohrer, Gil; Gough, Christopher
2015-03-12
At the University of Michigan Biological Station (UMBS) AmeriFlux sites (US-UMB and US-UMd), long-term C cycling measurements and a novel ecosystem-scale experiment are revealing physical, biological, and ecological mechanisms driving long-term trajectories of C cycling, providing new data for improving modeling forecasts of C storage in eastern forests. Our findings provide support for previously untested hypotheses that stand-level structural and biological properties constrain long-term trajectories of C storage, and that remotely sensed canopy structural parameters can substantially improve model forecasts of forest C storage. Through the Forest Accelerated Succession ExperimenT (FASET), we are directly testing the hypothesis that forest C storage will increase due to increasing structural and biological complexity of the emerging tree communities. Support from this project, 2011-2014, enabled us to incorporate novel physical and ecological mechanisms into ecological, meteorological, and hydrological models to improve forecasts of future forest C storage in response to disturbance, succession, and current and long-term climate variation.
Long-term bridge performance high priority bridge performance issues.
DOT National Transportation Integrated Search
2014-10-01
Bridge performance is a multifaceted issue involving performance of materials and protective systems, : performance of individual components of the bridge, and performance of the structural system as a whole. The : Long-Term Bridge Performance (LTBP)...
Benchmarking methods and data sets for ligand enrichment assessment in virtual screening.
Xia, Jie; Tilahun, Ermias Lemma; Reid, Terry-Elinor; Zhang, Liangren; Wang, Xiang Simon
2015-01-01
Retrospective small-scale virtual screening (VS) based on benchmarking data sets has been widely used to estimate ligand enrichments of VS approaches in the prospective (i.e. real-world) efforts. However, the intrinsic differences of benchmarking sets to the real screening chemical libraries can cause biased assessment. Herein, we summarize the history of benchmarking methods as well as data sets and highlight three main types of biases found in benchmarking sets, i.e. "analogue bias", "artificial enrichment" and "false negative". In addition, we introduce our recent algorithm to build maximum-unbiased benchmarking sets applicable to both ligand-based and structure-based VS approaches, and its implementations to three important human histone deacetylases (HDACs) isoforms, i.e. HDAC1, HDAC6 and HDAC8. The leave-one-out cross-validation (LOO CV) demonstrates that the benchmarking sets built by our algorithm are maximum-unbiased as measured by property matching, ROC curves and AUCs. Copyright © 2014 Elsevier Inc. All rights reserved.
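For illustration, the sketch below shows how ligand enrichment on such a benchmarking set is commonly summarized with an ROC curve and its AUC (here via scikit-learn); the docking scores and the numbers of actives and decoys are invented and do not correspond to any DEKOIS or HDAC data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)

# Hypothetical virtual-screening scores: higher = predicted more active.
# Actives are drawn from a slightly better-scoring distribution than decoys.
scores_actives = rng.normal(loc=1.0, scale=1.0, size=40)
scores_decoys  = rng.normal(loc=0.0, scale=1.0, size=1200)

y_true  = np.concatenate([np.ones_like(scores_actives), np.zeros_like(scores_decoys)])
y_score = np.concatenate([scores_actives, scores_decoys])

auc = roc_auc_score(y_true, y_score)
fpr, tpr, _ = roc_curve(y_true, y_score)
print(f"ROC AUC on this benchmark set: {auc:.3f}")   # 0.5 = random, 1.0 = perfect enrichment
```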
A Review of Flood Loss Models as Basis for Harmonization and Benchmarking.
Gerl, Tina; Kreibich, Heidi; Franco, Guillermo; Marechal, David; Schröter, Kai
2016-01-01
Risk-based approaches have been increasingly accepted and operationalized in flood risk management during recent decades. For instance, commercial flood risk models are used by the insurance industry to assess potential losses, establish the pricing of policies and determine reinsurance needs. Despite considerable progress in the development of loss estimation tools since the 1980s, loss estimates still reflect high uncertainties and disparities that often lead to questioning their quality. This requires an assessment of the validity and robustness of loss models as it affects prioritization and investment decisions in flood risk management as well as regulatory requirements and business decisions in the insurance industry. Hence, more effort is needed to quantify uncertainties and undertake validations. Due to a lack of detailed and reliable flood loss data, first-order validations are difficult to accomplish, so that model comparisons in terms of benchmarking are essential. It is checked whether the models are informed by existing data and knowledge and whether the assumptions made in the models are aligned with the existing knowledge. When this alignment is confirmed through validation or benchmarking exercises, the user gains confidence in the models. Before these benchmarking exercises are feasible, however, a cohesive survey of existing knowledge needs to be undertaken. With that aim, this work presents a review of flood loss (or flood vulnerability) relationships collected from the public domain and some professional sources. Our survey analyses 61 sources consisting of publications or software packages, of which 47 are reviewed in detail. This exercise results in probably the most complete review of flood loss models to date, containing nearly a thousand vulnerability functions. These functions are highly heterogeneous and only about half of the loss models are found to be accompanied by explicit validation at the time of their proposal. As an example, this paper presents an approach for a quantitative comparison of disparate models via reduction to the joint input variables of all models. Harmonization of models for benchmarking and comparison requires profound insight into the model structures, mechanisms and underlying assumptions. Possibilities and challenges are discussed that exist in model harmonization and the application of the inventory in a benchmarking framework.
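As a minimal illustration of comparing disparate models on their joint input variable, the sketch below evaluates two hypothetical depth-damage (vulnerability) functions over a common range of water depths and reports the spread between their damage ratios; both curves are invented and merely stand in for entries of the reviewed inventory.

```python
import numpy as np

# Two hypothetical depth-damage (vulnerability) functions mapping water depth [m]
# to a relative damage ratio in [0, 1]; real models would come from the inventory.
def model_a(depth_m):
    return np.clip(0.25 * np.sqrt(np.maximum(depth_m, 0.0)), 0.0, 1.0)

def model_b(depth_m):
    return np.clip(1.0 - np.exp(-0.5 * np.maximum(depth_m, 0.0)), 0.0, 1.0)

# Reduce both models to the joint input variable (water depth) and compare.
depths = np.linspace(0.0, 4.0, 9)
for d, a, b in zip(depths, model_a(depths), model_b(depths)):
    print(f"depth {d:3.1f} m: model A {a:.2f}, model B {b:.2f}, spread {abs(a - b):.2f}")
```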
A review on the benchmarking concept in Malaysian construction safety performance
NASA Astrophysics Data System (ADS)
Ishak, Nurfadzillah; Azizan, Muhammad Azizi
2018-02-01
The construction industry is one of the major industries propelling Malaysia's economy and contributes strongly to the nation's GDP growth, yet high fatality rates on construction sites have caused concern among safety practitioners and stakeholders. Hence, there is a need for benchmarking the performance of Malaysia's construction industry, especially in terms of safety. This concept can create fertile ground for ideas, but only in a receptive environment: organizations that share good practices and compare their safety performance against others benefit most in establishing an improved safety culture. This research was conducted to study the awareness and importance of benchmarking, to evaluate current practice and improvements, and to identify the constraints on implementing benchmarking of safety performance in the industry. In addition, interviews with construction professionals yielded different views on this concept. A comparison was made to show the different understandings of the benchmarking approach and of how safety performance can be benchmarked; these views nevertheless converge on one mission, namely to evaluate the objectives identified through benchmarking so as to improve an organization's safety performance. Finally, the expected result of this research is to help Malaysia's construction industry implement best practice in safety performance management through the concept of benchmarking.
ERIC Educational Resources Information Center
Conner, Tom; Prokhorov, Artem; Page, Connie; Fang, Yu; Xiao, Yimin; Post, Lori A.
2011-01-01
Elder abuse in long-term care has become a very important public health concern. Recent estimates of elder abuse prevalence are in the range of 2% to 10% (Lachs & Pillemer, 2004), and current changes in population structure indicate a potential for an upward trend in prevalence (Malley-Morrison, Nolido, & Chawla, 2006; Post et al., 2006).…
Some Thoughts on Stability in Nonlinear Periodic Focusing Systems [Addendum
DOE R&D Accomplishments Database
McMillan, Edwin M.
1968-03-29
Addendum to September 5, 1967 report with the same title and with the abstract: A brief discussion is given of the long-term stability of particle motions through periodic focusing structures containing lumped nonlinear elements. A method is presented whereby one can specify the nonlinear elements in such a way as to generate a variety of structures in which the motion has long-term stability.
A benchmarking method to measure dietary absorption efficiency of chemicals by fish.
Xiao, Ruiyang; Adolfsson-Erici, Margaretha; Åkerman, Gun; McLachlan, Michael S; MacLeod, Matthew
2013-12-01
Understanding the dietary absorption efficiency of chemicals in the gastrointestinal tract of fish is important from both a scientific and a regulatory point of view. However, reported fish absorption efficiencies for well-studied chemicals are highly variable. In the present study, the authors developed and exploited an internal chemical benchmarking method that has the potential to reduce uncertainty and variability and, thus, to improve the precision of measurements of fish absorption efficiency. The authors applied the benchmarking method to measure the gross absorption efficiency for 15 chemicals with a wide range of physicochemical properties and structures. They selected 2,2',5,6'-tetrachlorobiphenyl (PCB53) and decabromodiphenyl ethane as absorbable and nonabsorbable benchmarks, respectively. Quantities of chemicals determined in fish were benchmarked to the fraction of PCB53 recovered in fish, and quantities of chemicals determined in feces were benchmarked to the fraction of decabromodiphenyl ethane recovered in feces. The performance of the benchmarking procedure was evaluated based on the recovery of the test chemicals and precision of absorption efficiency from repeated tests. Benchmarking did not improve the precision of the measurements; after benchmarking, however, the median recovery for 15 chemicals was 106%, and variability of recoveries was reduced compared with before benchmarking, suggesting that benchmarking could account for incomplete extraction of chemical in fish and incomplete collection of feces from different tests. © 2013 SETAC.
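The following worked sketch, with invented numbers, shows one plausible reading of the benchmarking correction (not the authors' exact equations): amounts of a test chemical recovered in fish are scaled by the recovery of the absorbable benchmark (PCB53), amounts in feces by the recovery of the non-absorbable benchmark (decabromodiphenyl ethane), and a gross absorption efficiency is then formed from the corrected fractions.

```python
# Invented example values (fraction of the administered dose recovered).
pcb53_recovery_in_fish = 0.80      # absorbable benchmark recovered in fish
dbdpe_recovery_in_feces = 0.70     # non-absorbable benchmark recovered in feces

# Measured fractions of a test chemical's dose found in fish tissue and in feces.
test_in_fish  = 0.36
test_in_feces = 0.28

# Benchmark the measurements to correct for incomplete extraction/collection.
fish_corrected  = test_in_fish / pcb53_recovery_in_fish      # = 0.45
feces_corrected = test_in_feces / dbdpe_recovery_in_feces    # = 0.40

# Gross absorption efficiency: absorbed fraction over total accounted-for dose
# (an assumed, simplified reading of the method).
absorption_efficiency = fish_corrected / (fish_corrected + feces_corrected)
print(f"benchmarked gross absorption efficiency: {absorption_efficiency:.2f}")  # ~0.53
```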
Exterior view of north wall of Long-Term Oxidizer Silo (T-28B) ...
Exterior view of north wall of Long-Term Oxidizer Silo (T-28B) and Oxidizer Conditioning Structure (T-28D) behind and to its immediate left, looking south. A nitrogen line, used to prepare the Titan II's nitrogen-tetroxide oxidizer, is in the right foreground - Air Force Plant PJKS, Systems Integration Laboratory, Long-Term Oxidizer Silo, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO
5. Exterior view of Long-Term Hydrazine Silo (T-28E), looking west. ...
5. Exterior view of Long-Term Hydrazine Silo (T-28E), looking west. The low-lying building to the left of the silo is the Fuel Purification Structure (T-28E). A hydrazine tank is in the concrete truck well in the immediate foreground. - Air Force Plant PJKS, Systems Integration Laboratory, Long-Term Hydrazine Silo, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO
A suite of benchmark and challenge problems for enhanced geothermal systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Mark; Fu, Pengcheng; McClure, Mark
A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study were those representing U.S. national laboratories, universities, and industries, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study, benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems involved two phases of research: stimulation, development, and circulation in two separate reservoirs. The challenge problems had specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners. We present the suite of benchmark and challenge problems developed for the GTO-CCS, providing problem descriptions and sample solutions.
Benditz, Achim; Greimel, Felix; Auer, Patrick; Zeman, Florian; Göttermann, Antje; Grifka, Joachim; Meissner, Winfried; von Kunow, Frederik
2016-01-01
Background: The number of total hip replacement surgeries has steadily increased over recent years. Reduction in postoperative pain increases patient satisfaction and enables better mobilization. Thus, pain management needs to be continuously improved. Problems are often caused not only by medical issues but also by organization and hospital structure. The present study shows how the quality of pain management can be increased by implementing a standardized pain concept and simple, consistent benchmarking. Methods: All patients included in the study had undergone total hip arthroplasty (THA). Outcome parameters were analyzed 24 hours after surgery by means of the questionnaires from the German-wide project "Quality Improvement in Postoperative Pain Management" (QUIPS). A pain nurse interviewed patients and continuously assessed outcome quality parameters. A multidisciplinary team of anesthetists, orthopedic surgeons, and nurses implemented a regular procedure of data analysis and internal benchmarking. The health care team was informed of any results, and suggested improvements. Every staff member involved in pain management participated in educational lessons, and a special pain nurse was trained in each ward. Results: From 2014 to 2015, 367 patients were included. The mean maximal pain score 24 hours after surgery was 4.0 (±3.0) on an 11-point numeric rating scale, and patient satisfaction was 9.0 (±1.2). Over time, the maximum pain score decreased (mean 3.0, ±2.0), whereas patient satisfaction significantly increased (mean 9.8, ±0.4; p<0.05). Among 49 anonymized hospitals, our clinic stayed in first rank in terms of lowest maximum pain and patient satisfaction over the period. Conclusion: Results were already acceptable at the beginning of benchmarking a standardized pain management concept. But regular benchmarking, implementation of feedback mechanisms, and staff education made the pain management concept even more successful. Multidisciplinary teamwork and flexibility in adapting processes seem to be highly important for successful pain management. PMID:28031727
Extensive sequencing of seven human genomes to characterize benchmark reference materials
Zook, Justin M.; Catoe, David; McDaniel, Jennifer; Vang, Lindsay; Spies, Noah; Sidow, Arend; Weng, Ziming; Liu, Yuling; Mason, Christopher E.; Alexander, Noah; Henaff, Elizabeth; McIntyre, Alexa B.R.; Chandramohan, Dhruva; Chen, Feng; Jaeger, Erich; Moshrefi, Ali; Pham, Khoa; Stedman, William; Liang, Tiffany; Saghbini, Michael; Dzakula, Zeljko; Hastie, Alex; Cao, Han; Deikus, Gintaras; Schadt, Eric; Sebra, Robert; Bashir, Ali; Truty, Rebecca M.; Chang, Christopher C.; Gulbahce, Natali; Zhao, Keyan; Ghosh, Srinka; Hyland, Fiona; Fu, Yutao; Chaisson, Mark; Xiao, Chunlin; Trow, Jonathan; Sherry, Stephen T.; Zaranek, Alexander W.; Ball, Madeleine; Bobe, Jason; Estep, Preston; Church, George M.; Marks, Patrick; Kyriazopoulou-Panagiotopoulou, Sofia; Zheng, Grace X.Y.; Schnall-Levin, Michael; Ordonez, Heather S.; Mudivarti, Patrice A.; Giorda, Kristina; Sheng, Ying; Rypdal, Karoline Bjarnesdatter; Salit, Marc
2016-01-01
The Genome in a Bottle Consortium, hosted by the National Institute of Standards and Technology (NIST) is creating reference materials and data for human genome sequencing, as well as methods for genome comparison and benchmarking. Here, we describe a large, diverse set of sequencing data for seven human genomes; five are current or candidate NIST Reference Materials. The pilot genome, NA12878, has been released as NIST RM 8398. We also describe data from two Personal Genome Project trios, one of Ashkenazim Jewish ancestry and one of Chinese ancestry. The data come from 12 technologies: BioNano Genomics, Complete Genomics paired-end and LFR, Ion Proton exome, Oxford Nanopore, Pacific Biosciences, SOLiD, 10X Genomics GemCode WGS, and Illumina exome and WGS paired-end, mate-pair, and synthetic long reads. Cell lines, DNA, and data from these individuals are publicly available. Therefore, we expect these data to be useful for revealing novel information about the human genome and improving sequencing technologies, SNP, indel, and structural variant calling, and de novo assembly. PMID:27271295
Developments in lithium-ion battery technology in the People's Republic of China.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patil, P. G.; Energy Systems
2008-02-28
Argonne National Laboratory prepared this report, under the sponsorship of the Office of Vehicle Technologies (OVT) of the U.S. Department of Energy's (DOE's) Office of Energy Efficiency and Renewable Energy, for the Vehicles Technologies Team. The information in the report is based on the author's visit to Beijing, Tianjin, and Shanghai, China, to meet with representatives from several organizations (listed in Appendix A) developing and manufacturing lithium-ion battery technology for cell phones and electronics, electric bikes, and electric and hybrid vehicle applications. The purpose of the visit was to assess the status of lithium-ion battery technology in China and to determine if lithium-ion batteries produced in China are available for benchmarking in the United States. With benchmarking, DOE and the U.S. battery development industry would be able to understand the status of the battery technology, which would enable the industry to formulate a long-term research and development program. This report also describes the state of lithium-ion battery technology in the United States, provides information on joint ventures, and includes information on government incentives and policies in the People's Republic of China (PRC).
A Diagnostic Assessment of Evolutionary Multiobjective Optimization for Water Resources Systems
NASA Astrophysics Data System (ADS)
Reed, P.; Hadka, D.; Herman, J.; Kasprzyk, J.; Kollat, J.
2012-04-01
This study contributes a rigorous diagnostic assessment of state-of-the-art multiobjective evolutionary algorithms (MOEAs) and highlights key advances that the water resources field can exploit to better discover the critical tradeoffs constraining our systems. This study provides the most comprehensive diagnostic assessment of MOEAs for water resources to date, exploiting more than 100,000 MOEA runs and trillions of design evaluations. The diagnostic assessment measures the effectiveness, efficiency, reliability, and controllability of ten benchmark MOEAs for a representative suite of water resources applications addressing rainfall-runoff calibration, long-term groundwater monitoring (LTM), and risk-based water supply portfolio planning. The suite of problems encompasses a range of challenging problem properties including (1) many-objective formulations with 4 or more objectives, (2) multi-modality (or false optima), (3) nonlinearity, (4) discreteness, (5) severe constraints, (6) stochastic objectives, and (7) non-separability (also called epistasis). The applications are representative of the dominant problem classes that have shaped the history of MOEAs in water resources and that will be dominant foci in the future. Recommendations are provided for which modern MOEAs should serve as tools and benchmarks in the future water resources literature.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leclaire, Nicolas; Le Dauphin, Francois-Xavier; Duhamel, Isabelle
2014-11-04
The MIRTE (Materials in Interacting and Reflecting configurations, all Thicknesses) program was established to answer the needs of criticality safety practitioners in terms of experimental validation of structural materials and to possibly contribute to nuclear data improvement, which ultimately supports reactor safety analysis as well. MIRTE took the shape of a collaboration between the French industrial partners AREVA and ANDRA and a noncommercial international funding partner, the U.S. Department of Energy. The aim of this paper is to present the configurations of the MIRTE 1 and MIRTE 2 programs and to highlight the results of the titanium experiments recently published in the International Handbook of Evaluated Criticality Safety Benchmark Experiments.
Altered neural processing of emotional faces in remitted Cushing's disease.
Bas-Hoogendam, Janna Marie; Andela, Cornelie D; van der Werff, Steven J A; Pannekoek, J Nienke; van Steenbergen, Henk; Meijer, Onno C; van Buchem, Mark A; Rombouts, Serge A R B; van der Mast, Roos C; Biermasz, Nienke R; van der Wee, Nic J A; Pereira, Alberto M
2015-09-01
Patients with long-term remission of Cushing's disease (CD) demonstrate residual psychological complaints. At present, it is not known how previous exposure to hypercortisolism affects psychological functioning in the long-term. Earlier magnetic resonance imaging (MRI) studies demonstrated abnormalities of brain structure and resting-state connectivity in patients with long-term remission of CD, but no data are available on functional alterations in the brain during the performance of emotional or cognitive tasks in these patients. We performed a cross-sectional functional MRI study, investigating brain activation during emotion processing in patients with long-term remission of CD. Processing of emotional faces versus a non-emotional control condition was examined in 21 patients and 21 matched healthy controls. Analyses focused on activation and connectivity of two a priori determined regions of interest: the amygdala and the medial prefrontal-orbitofrontal cortex (mPFC-OFC). We also assessed psychological functioning, cognitive failure, and clinical disease severity. Patients showed less mPFC activation during processing of emotional faces compared to controls, whereas no differences were found in amygdala activation. An exploratory psychophysiological interaction analysis demonstrated decreased functional coupling between the ventromedial PFC and posterior cingulate cortex (a region structurally connected to the PFC) in CD-patients. The present study is the first to show alterations in brain function and task-related functional coupling in patients with long-term remission of CD relative to matched healthy controls. These alterations may, together with abnormalities in brain structure, be related to the persisting psychological morbidity in patients with CD after long-term remission. Copyright © 2015 Elsevier Ltd. All rights reserved.
Integrated modeling of long-term vegetation and hydrologic dynamics in Rocky Mountain watersheds
Robert Steven Ahl
2007-01-01
Changes in forest structure resulting from natural disturbances, or managed treatments, can have negative and long lasting impacts on water resources. To facilitate integrated management of forest and water resources, a System for Long-Term Integrated Management Modeling (SLIMM) was developed. By combining two spatially explicit, continuous time models, vegetation...
Dellve, Lotta; Skagert, Katrin; Eklöf, Mats
2008-09-01
Despite several years of conducting formalized systematic occupational health and safety management (SOHSM), as required by law in Sweden and most other industrialized countries, there is still little evidence on how SOHSM should be approached to have an impact on employees' health. The aim of this study was to investigate the importance of SOHSM, considering structured routines and participation processes, for the incidence of occupational disorders and the prevalence of long-term work attendance among home care workers (HCWs). Municipal human service organizations were compared concerning (a) their structured routines and participation processes for SOHSM and (b) employee health, i.e. the municipal five-year incidence of occupational disorders and prevalence of work attendance among HCWs. National register-based data from the whole population of HCWs (n=154 773) were linked to register-data of occupational disorders and prevalence of long-term work attendance. The top managers and safety representatives in selected high- and low-incidence organizations (n=60) answered a questionnaire about structure and participation process of SOHSM. The results showed that prevalence of long-term work attendance was higher where structure and routines for SOHSM (policy, goals and plans for action) were well organized. Highly structured SOHSM and human resource management were also related to high organizational incidence of reported occupational disorders. Allocated budget and routines related to HCWs' influence in decisions concerning performance of care were also related to long-term work attendance. The participation processes had a weak effect on occupational disorders and work attendance among HCWs. Reporting occupational disorders may be a functional tool to stimulate the development of effective SOHSM, to improve the work environment and sustainable work ability.
Benchmarking of relative permeability
NASA Astrophysics Data System (ADS)
DiCarlo, D. A.
2017-12-01
Relative permeability is the key relation for describing multi-phase flow through porous media. There are hundreds of published relative permeability curves for various media, some classic (Oak 90 and 91), some contradictory. This can lead to a confusing situation when one is trying to benchmark simulation results against "experimental data". Coming from the experimental side, I have found that modelers place too much trust in relative permeability data sets. In this talk, I will discuss reasons for discrepancies within and between data sets, and give guidance on which portions of the data sets are most reliable for matching with models.
Overview of developing desired conditions: Short-term actions, long-term objectives
J. D. Chew; K. O'Hara; J. G. Jones
2001-01-01
A number of modeling tools are required to go from short-term treatments to long-term objectives expressed as desired future conditions. Three models are used in an example that starts with determining desired stand level structure and ends with the implementation of treatments over time at a landscape scale. The Multi-Aged Stocking Assessment Model (MASAM) is used for...
NASA Astrophysics Data System (ADS)
Hori, M.; Sugiura, K.; Kobayashi, K.; Aoki, T.; Tanikawa, T.; Niwano, M.; Enomoto, H.
2017-12-01
A long-term Northern Hemisphere (NH) snow cover extent (SCE) product (JASMES SCE) was developed by applying a consistent, objective snow cover mapping algorithm to satellite-borne optical sensors (NOAA/AVHRR and NASA's optical sensor MODIS) from 1982 to the present. We estimated NH SCE from weekly composited snow cover maps and evaluated the accuracy of snow cover detection using in-situ snow data. As a benchmark SCE product, we also evaluated the accuracy of SCE maps from the National Oceanic and Atmospheric Administration Climate Data Record (NOAA-CDR) product. The evaluation showed that JASMES SCE has more temporally stable accuracy. Seasonally averaged SCE derived from JASMES exhibited negative slopes in all seasons, which is opposite to the NOAA-CDR SCE trends in the fall and winter seasons. The spatial pattern of annual snow cover duration (SCD) trends exhibited a noticeable asymmetry between continents, with the largest negative trends seen over western Eurasia. The NH SCE product will be connected to data from the Japanese Earth-observing satellite "Global Change Observation Mission for Climate (GCOM-C)", to be launched in late 2017.
NASA Astrophysics Data System (ADS)
Foolad, Foad; Franz, Trenton E.; Wang, Tiejun; Gibson, Justin; Kilic, Ayse; Allen, Richard G.; Suyker, Andrew
2017-03-01
In this study, the feasibility of using inverse vadose zone modeling to estimate field-scale actual evapotranspiration (ETa) was explored at a long-term agricultural monitoring site in eastern Nebraska. Data from both point-scale soil water content (SWC) sensors and the area-average technique of cosmic-ray neutron probes were evaluated against independent ETa estimates from a co-located eddy covariance tower. While this methodology has been successfully used for estimates of groundwater recharge, it was essential to assess its performance for other components of the water balance such as ETa. In light of recent evaluations of land surface models (LSMs), independent estimates of hydrologic state variables and fluxes are critically needed benchmarks. The results here indicate reasonable estimates of daily and annual ETa from the point sensors, but with highly varied soil hydraulic parameterizations due to local soil texture variability. The finding that multiple soil hydraulic parameterizations lead to equally good ETa estimates is consistent with the hydrological principle of equifinality. While this study focused on one particular site, the framework can be easily applied to other SWC monitoring networks across the globe. The value-added products of groundwater recharge and ETa flux from SWC monitoring networks will provide additional and more robust benchmarks for the validation of LSMs as they continue to improve their forecast skill. In addition, the value-added products of groundwater recharge and ETa often have more direct impacts on societal decision-making than SWC alone. Water flux informs human decision-making from policies on the long-term management of groundwater resources (recharge), to yield forecasts (ETa), and to optimal irrigation scheduling (ETa). Illustrating the societal benefits of SWC monitoring is critical to ensure the continued operation and expansion of these public datasets.
NASA Astrophysics Data System (ADS)
Novak, A.; Honzik, P.; Bruneau, M.
2017-08-01
Miniaturized vibrating MEMS devices, whether active (receivers or emitters) or passive, and their use either for new applications (hearing, meta-materials, consumer devices, etc.) or for metrological purposes under non-standard conditions, are involved today in several acoustic domains. Characterisations more in-depth than the classical ones available until now are needed. In this context, the paper presents analytical and numerical approaches for describing the behaviour of three kinds of planar micro-beams of rectangular shape (suspended rigid or clamped elastic planar beams) loaded by a backing cavity or a fluid gap, surrounded by very thin slits, and excited by an incident acoustic field. The analytical approach accounts for the coupling between the vibrating structure and the acoustic field in the backing cavity, the thermal and viscous diffusion processes in the boundary layers in the slits and the cavity, the modal behaviour of the vibrating structure, and the non-uniformity of the acoustic field in the backing cavity, which is modelled using an integral formulation with a suitable Green's function. Benchmark solutions are proposed in terms of beam motion (from which the sensitivity, input impedance, and pressure transfer function can be calculated). A numerical implementation (FEM) is also provided, against which the analytical results are tested.
García-García, Raquel; Cruz-Gómez, Álvaro Javier; Urios, Amparo; Mangas-Losada, Alba; Forn, Cristina; Escudero-García, Desamparados; Kosenko, Elena; Torregrosa, Isidro; Tosca, Joan; Giner-Durán, Remedios; Serra, Miguel Angel; Avila, César; Belloch, Vicente; Felipo, Vicente; Montoliu, Carmina
2018-06-25
Patients with minimal hepatic encephalopathy (MHE) show mild cognitive impairment associated with alterations in attentional and executive networks. There are no studies evaluating the relationship between memory in MHE and structural and functional connectivity (FC) changes in the hippocampal system. This study aimed to evaluate verbal learning and long-term memory in cirrhotic patients with (C-MHE) and without MHE (C-NMHE) and healthy controls. We assessed the relationship between alterations in memory and the structural integrity and FC of the hippocampal system. C-MHE patients showed impairments in learning, long-term memory, and recognition, compared to C-NMHE patients and controls. Cirrhotic patients showed reduced fimbria volume compared to controls. Larger volumes in hippocampus subfields were related to better memory performance in C-NMHE patients and controls. C-MHE patients presented lower FC between the L-presubiculum and L-precuneus than C-NMHE patients. Compared to controls, C-MHE patients had reduced FC between L-presubiculum and subiculum seeds and bilateral precuneus, which correlated with cognitive impairment and memory performance. Alterations in the FC of the hippocampal system could contribute to learning and long-term memory impairments in C-MHE patients. This study demonstrates the association between alterations in learning and long-term memory and structural and FC disturbances in hippocampal structures in cirrhotic patients.
Falibene, Agustina; Roces, Flavio; Rössler, Wolfgang
2015-01-01
Long-term behavioral changes related to learning and experience have been shown to be associated with structural remodeling in the brain. Leaf-cutting ants learn to avoid previously preferred plants after they have proved harmful for their symbiotic fungus, a process that involves long-term olfactory memory. We studied the dynamics of brain microarchitectural changes after long-term olfactory memory formation following avoidance learning in Acromyrmex ambiguus. After performing experiments to control for possible neuronal changes related to age and body size, we quantified synaptic complexes (microglomeruli, MG) in olfactory regions of the mushroom bodies (MBs) at different times after learning. Long-term avoidance memory formation was associated with a transient change in MG densities. Two days after learning, MG density was higher than before learning. At days 4 and 15 after learning—when ants still showed plant avoidance—MG densities had decreased to the initial state. The structural reorganization of MG triggered by long-term avoidance memory formation clearly differed from changes promoted by pure exposure to and collection of novel plants with distinct odors. Sensory exposure by the simultaneous collection of several, instead of one, non-harmful plant species resulted in a decrease in MG densities in the olfactory lip. We hypothesize that while sensory exposure leads to MG pruning in the MB olfactory lip, the formation of long-term avoidance memory involves an initial growth of new MG followed by subsequent pruning. PMID:25904854
Sahm, Maik; Otto, Ronny; Pross, Matthias; Mantke, Rene
2018-06-25
Approximately 90,000 thyroid operations are performed in Germany each year. Minimally invasive video-assisted thyroidectomy (MIVAT) accounts for 5-10% of these operations. There are few data that compare long-term cosmetic results after MIVAT to those after conventional surgery. Current systematic reviews show no advantage for MIVAT. The goal of this study was to analyse the long-term postoperative results of both procedures and to evaluate relevant factors. The analysis of the long-term results is based on follow-up examinations using a validated method for scar appraisal (POSAS). Cohort analysis was performed on MIVAT operations in our hospital between 2004 and 2011 and conventional thyroid operations in 2011. Follow-up examination data were analysed from 117 patients from the MIVAT group and 102 patients from the conventional group. The follow-up examination was performed at a mean of 23.1 vs. 23.6 months postoperatively (MIVAT vs. conventional). The Friedman test showed that scar pigmentation (mean rank 4.79) and scar surface structure (mean rank 3.62) were the deciding factors influencing the long-term cosmetic results. Both MIVAT and conventional surgery gave very good long-term cosmetic results. From the patient's perspective, there is no significant advantage with conventional surgery. The evaluation of the long-term results largely depends on factors such as scar pigmentation and surface structure that can only be influenced to a limited extent by the surgical procedure. Georg Thieme Verlag KG Stuttgart · New York.
Acute coronary syndrome in the Asia-Pacific region.
Chan, Mark Y; Du, Xin; Eccleston, David; Ma, Changsheng; Mohanan, Padinhare P; Ogita, Manabu; Shyu, Kou-Gi; Yan, Bryan P; Jeong, Young-Hoon
2016-01-01
More than 4.2 billion inhabitants populate the Asia-Pacific region. Acute coronary syndrome (ACS) is now a major cause of death and disability in this region with in-hospital mortality typically exceeding 5%. Yet, the region still lacks consensus on the best approach to overcoming its specific challenges in reducing mortality from ACS. The Asia-Pacific Real world evIdenCe on Outcome and Treatment of ACS (APRICOT) project reviewed current published and unpublished registry data, unmet needs in ACS management and possible approaches towards improving ACS-related mortality in the region. There was striking heterogeneity in the use of invasive procedures, pharmacologic practice (hospitalization/post-discharge), and in short- and long-term clinical outcomes across healthcare systems; this heterogeneity was perceived to be far greater than in Western Europe or the United States. 'Benchmark' short-term clinical outcomes are preferred over long-term outcomes due to difficulties in follow-up, recording and maintenance of medication adherence in a geographically large and culturally diverse region. Key 'barriers' towards improving outcomes include patient education (pain awareness, consequences of missing medication and secondary prevention), geographical landscape (urban vs. metropolitan), limited long-term adherence to guideline-based management and widespread adoption of cost-based rather than value-based healthcare systems. Initiatives to overcome these barriers should include implementation of pre-hospital management strategies, toolkits to aid in-hospital treatment, greater community outreach with online patient/physician education and telemedicine, sustainable economic models to improve accessibility to effective pharmacotherapies and the acquisition of high-quality 'real-world' regional data to tailor secondary prevention initiatives that meet the unique needs of countries in this region. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
Optimization of the computational load of a hypercube supercomputer onboard a mobile robot.
Barhen, J; Toomarian, N; Protopopescu, V
1987-12-01
A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.
Optimization of the computational load of a hypercube supercomputer onboard a mobile robot
NASA Technical Reports Server (NTRS)
Barhen, Jacob; Toomarian, N.; Protopopescu, V.
1987-01-01
A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.
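As a generic illustration of the kind of single-neuron perturbation expression the abstract refers to, the sketch below writes a standard Hopfield-type configuration energy and the energy change caused by flipping one neuron; the notation and the specific energy form are our assumptions, not necessarily the paper's actual formulation.

```latex
% Standard Hopfield-type configuration energy for binary states s_i, symmetric
% weights w_ij (w_ii = 0), and biases theta_i; Delta E_k is the energy change
% when the single neuron k is flipped from s_k to s_k'.
\[
E(\mathbf{s}) = -\tfrac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j - \sum_i \theta_i s_i,
\qquad
\Delta E_k = -\,(s_k' - s_k)\Bigl(\sum_{j \neq k} w_{kj}\, s_j + \theta_k\Bigr).
\]
% In simulated annealing, the single-neuron move is accepted with probability
% min{1, exp(-Delta E_k / T)} at temperature T.
```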
2016-11-01
[Report front matter and table of contents: List of Figures; 1. Introduction; 2. Background; 3. Yahoo! Cloud Serving Benchmark (YCSB); 3.1 Data Loading and Performance Testing Framework.] When originally setting out to perform their testing, the authors identified a data loading and performance testing framework, the Yahoo! Cloud Serving Benchmark (YCSB). This framework is freely available.
Pseudo-icosahedral Cr55Al232-δ as a high-temperature protective material
Rosa, R.; Bhattacharya, S.; Pabla, J.; ...
2018-03-19
We report here a course of basic research into the potential suitability of a pseudo-icosahedral Cr aluminide as a material for high-temperature protective coatings. Cr55Al232-δ [δ = 2.70(6)] exhibits high hardness at room temperature as well as low thermal conductivity and excellent oxidation resistance at 973 K, with an oxidation rate comparable to those of softer, denser benchmark materials. The origin of these promising properties can be traced to competing long-range and short-range symmetries within the pseudo-icosahedral crystal structure, suggesting new criteria for future materials research.
ERIC Educational Resources Information Center
Songhua, Tan; Wang, Catherine Yan
2012-01-01
The "National Medium- and Long-Term Educational Reform and Development Guideline (2010-20)" (hereafter abbreviated as the "Guideline") posits that the development of education must be driven by reform and innovation. It devotes six chapters to mapping out the targets, tasks, and major policy measures for reforming the…
Johnstone, Victoria P A; Wright, David K; Wong, Kendrew; O'Brien, Terence J; Rajan, Ramesh; Shultz, Sandy R
2015-09-01
Traumatic brain injury (TBI) is a leading cause of death worldwide. In recent studies, we have shown that experimental TBI caused an immediate (24-h post) suppression of neuronal processing, especially in supragranular cortical layers. We now examine the long-term effects of experimental TBI on the sensory cortex and how these changes may contribute to a range of TBI morbidities. Adult male Sprague-Dawley rats received either a moderate lateral fluid percussion injury (n=14) or a sham surgery (n=12) and 12 weeks of recovery before behavioral assessment, magnetic resonance imaging, and electrophysiological recordings from the barrel cortex. TBI rats demonstrated sensorimotor deficits, cognitive impairments, and anxiety-like behavior, and this was associated with significant atrophy of the barrel cortex and other brain structures. Extracellular recordings from ipsilateral barrel cortex revealed normal neuronal responsiveness and diffusion tensor MRI showed increased fractional anisotropy, axial diffusivity, and tract density within this region. These findings suggest that long-term recovery of neuronal responsiveness is owing to structural reorganization within this region. Therefore, it is likely that long-term structural and functional changes within sensory cortex post-TBI may allow for recovery of neuronal responsiveness, but that this recovery does not remediate all behavioral deficits.
Using a health promotion model to promote benchmarking.
Welby, Jane
2006-07-01
The North East (England) Neonatal Benchmarking Group has been established for almost a decade and has researched and developed a substantial number of evidence-based benchmarks. With no firm evidence that these were being used or that there was any standardisation of neonatal care throughout the region, the group embarked on a programme to review the benchmarks and determine what evidence-based guidelines were needed to support standardisation. A health promotion planning model was used by one subgroup to structure the programme; it enabled all members of the subgroup to engage in the review process and provided the motivation and supporting documentation for implementation of changes in practice. The need for a regional guideline development group to complement the activity of the benchmarking group is being addressed.
Kohn-Sham Band Structure Benchmark Including Spin-Orbit Coupling for 2D and 3D Solids
NASA Astrophysics Data System (ADS)
Huhn, William; Blum, Volker
2015-03-01
Accurate electronic band structures serve as a primary indicator of the suitability of a material for a given application, e.g., as electronic or catalytic materials. Computed band structures, however, are subject to a host of approximations, some of which are more obvious (e.g., the treatment of exchange-correlation or the self-energy) and others less obvious (e.g., the treatment of core, semicore, or valence electrons, the handling of relativistic effects, or the accuracy of the underlying basis set used). We here provide a set of accurate Kohn-Sham band structure benchmarks, using the numeric atom-centered all-electron electronic structure code FHI-aims combined with the "traditional" PBE functional and the hybrid HSE functional, to calculate core, valence, and low-lying conduction bands of a set of 2D and 3D materials. Benchmarks are provided with and without effects of spin-orbit coupling, using quasi-degenerate perturbation theory to predict spin-orbit splittings. This work is funded by Fritz-Haber-Institut der Max-Planck-Gesellschaft.
Evolutionary Optimization of a Geometrically Refined Truss
NASA Technical Reports Server (NTRS)
Hull, P. V.; Tinker, M. L.; Dozier, G. V.
2007-01-01
Structural optimization is a field of research that has experienced noteworthy growth for many years. Researchers in this area have developed optimization tools to successfully design and model structures, typically minimizing mass while maintaining certain deflection and stress constraints. Numerous optimization studies have been performed to minimize mass, deflection, and stress on a benchmark cantilever truss problem. Predominantly traditional optimization theory is applied to this problem. The cross-sectional area of each member is optimized to minimize the aforementioned objectives. This Technical Publication (TP) presents a structural optimization technique that has been previously applied to compliant mechanism design. This technique demonstrates a method that combines topology optimization, geometric refinement, finite element analysis, and two forms of evolutionary computation: genetic algorithms and differential evolution to successfully optimize a benchmark structural optimization problem. A nontraditional solution to the benchmark problem is presented in this TP, specifically a geometrically refined topological solution. The design process begins with an alternate control mesh formulation, multilevel geometric smoothing operation, and an elastostatic structural analysis. The design process is wrapped in an evolutionary computing optimization toolset.
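As an illustration of the differential evolution component named above, the sketch below applies a standard DE/rand/1/bin variation step with greedy selection to a toy member-sizing problem (minimize total area subject to a crude flexibility limit); the objective, penalty, and constants are assumptions and not the Technical Publication's actual truss model.

```python
# Minimal differential evolution (DE) sketch for a toy sizing problem: minimize the
# total "mass" (sum of member areas) while keeping a crude flexibility proxy
# (sum of 1/area) below a limit. All constants are illustrative assumptions.
import random

N_MEMBERS, POP, GENS, F, CR = 10, 40, 200, 0.8, 0.9
A_MIN, A_MAX, FLEX_LIMIT = 0.1, 5.0, 30.0

def cost(areas):
    mass = sum(areas)
    flexibility = sum(1.0 / a for a in areas)
    penalty = 1e3 * max(0.0, flexibility - FLEX_LIMIT)   # constraint-violation penalty
    return mass + penalty

random.seed(1)
pop = [[random.uniform(A_MIN, A_MAX) for _ in range(N_MEMBERS)] for _ in range(POP)]
for _ in range(GENS):
    for i in range(POP):
        r1, r2, r3 = random.sample([j for j in range(POP) if j != i], 3)
        j_rand = random.randrange(N_MEMBERS)
        trial = []
        for j in range(N_MEMBERS):
            if random.random() < CR or j == j_rand:
                v = pop[r1][j] + F * (pop[r2][j] - pop[r3][j])   # DE/rand/1 mutation
            else:
                v = pop[i][j]                                     # inherit from parent
            trial.append(min(max(v, A_MIN), A_MAX))               # clamp to bounds
        if cost(trial) <= cost(pop[i]):
            pop[i] = trial                                        # greedy selection

print("Best mass found:", round(min(cost(ind) for ind in pop), 3))
```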
A call for benchmarking transposable element annotation methods.
Hoen, Douglas R; Hickey, Glenn; Bourque, Guillaume; Casacuberta, Josep; Cordaux, Richard; Feschotte, Cédric; Fiston-Lavier, Anna-Sophie; Hua-Van, Aurélie; Hubley, Robert; Kapusta, Aurélie; Lerat, Emmanuelle; Maumus, Florian; Pollock, David D; Quesneville, Hadi; Smit, Arian; Wheeler, Travis J; Bureau, Thomas E; Blanchette, Mathieu
2015-01-01
DNA derived from transposable elements (TEs) constitutes large parts of the genomes of complex eukaryotes, with major impacts not only on genomic research but also on how organisms evolve and function. Although a variety of methods and tools have been developed to detect and annotate TEs, there are as yet no standard benchmarks-that is, no standard way to measure or compare their accuracy. This lack of accuracy assessment calls into question conclusions from a wide range of research that depends explicitly or implicitly on TE annotation. In the absence of standard benchmarks, toolmakers are impeded in improving their tools, annotators cannot properly assess which tools might best suit their needs, and downstream researchers cannot judge how accuracy limitations might impact their studies. We therefore propose that the TE research community create and adopt standard TE annotation benchmarks, and we call for other researchers to join the authors in making this long-overdue effort a success.
DOT National Transportation Integrated Search
2011-06-01
In this project a description of the maintenance of the sensor monitoring systems installed on three California highway bridges is presented. The monitoring systems consist of accelerometers, strain gauges, pressure sensors, and displacement sens...
Sensitivity of soil permafrost to winter warming: Modeled impacts of climate change.
NASA Astrophysics Data System (ADS)
Bouskill, N.; Riley, W. J.; Mekonnen, Z. A.; Grant, R.
2016-12-01
High-latitude tundra soils are warming at nearly twice the rate of temperate ecosystems. Changes in temperature and soil moisture can feedback on the processes controlling the carbon balance of tundra soils by altering plant community composition and productivity and microbial decomposition rates. Recent field manipulation experiments have shown that elevated soil and air temperatures can stimulate both gross primary productivity and ecosystem respiration. However, the observed soil carbon gains following summer time stimulation of plant productivity have been more than offset by elevated decomposition rates during the rest of the year, and particularly over winter. A critical uncertainty is whether these short-term responses also represent the long-term trajectory of tundra ecosystems under chronic disturbance. Herein we employ a mechanistic land-model (ecosys) that represents many of the key above- and belowground processes regulating the carbon balance of tundra soils to simulate a winter warming experiment at Eight Mile Lake, Alaska. Using this model we examined the short-term (5 - 10 year) influence of soil warming through the wintertime by mimicking the accumulation of a deeper snow pack. This deeper snow pack was removed to a height equal to that of the snow pack over control plots prior to snow melt. We benchmarked the model using physical and biological measurements made over the course of a six-year experiment at the site. The model accurately represented the effect of the experimental manipulation on thaw depth, N mineralization, winter respiration, and ecosystem gross and net primary production. After establishing confidence in the modeled short-term responses, we extend the same chronic disturbance to 2050 to examine the long-term response of the plant and microbial communities to warming. We discuss our results in reference to the long-term trajectory of the carbon and nutrient cycles of high-latitude permafrost regions.
Benchmark matrix and guide: Part II.
1991-01-01
In the last issue of the Journal of Quality Assurance (September/October 1991, Volume 13, Number 5, pp. 14-19), the benchmark matrix developed by Headquarters Air Force Logistics Command was published. Five horizontal levels on the matrix delineate progress in TQM: business as usual, initiation, implementation, expansion, and integration. The six vertical categories that are critical to the success of TQM are leadership, structure, training, recognition, process improvement, and customer focus. In this issue, "Benchmark Matrix and Guide: Part II" will show specifically how to apply the categories of leadership, structure, and training to the benchmark matrix progress levels. At the intersection of each category and level, specific behavior objectives are listed with supporting behaviors and guidelines. Some categories will have objectives that are relatively easy to accomplish, allowing quick progress from one level to the next. Other categories will take considerable time and effort to complete. In the next issue, Part III of this series will focus on recognition, process improvement, and customer focus.
Regional restoration benchmarks for Acropora cervicornis
NASA Astrophysics Data System (ADS)
Schopmeyer, Stephanie A.; Lirman, Diego; Bartels, Erich; Gilliam, David S.; Goergen, Elizabeth A.; Griffin, Sean P.; Johnson, Meaghan E.; Lustic, Caitlin; Maxwell, Kerry; Walter, Cory S.
2017-12-01
Coral gardening plays an important role in the recovery of depleted populations of threatened Acropora cervicornis in the Caribbean. Over the past decade, high survival coupled with fast growth of in situ nursery corals have allowed practitioners to create healthy and genotypically diverse nursery stocks. Currently, thousands of corals are propagated and outplanted onto degraded reefs on a yearly basis, representing a substantial increase in the abundance, biomass, and overall footprint of A. cervicornis. Here, we combined an extensive dataset collected by restoration practitioners to document early (1-2 yr) restoration success metrics in Florida and Puerto Rico, USA. By reporting region-specific data on the impacts of fragment collection on donor colonies, survivorship and productivity of nursery corals, and survivorship and productivity of outplanted corals during normal conditions, we provide the basis for a stop-light indicator framework for new or existing restoration programs to evaluate their performance. We show that current restoration methods are very effective, that no excess damage is caused to donor colonies, and that once outplanted, corals behave just as wild colonies. We also provide science-based benchmarks that can be used by programs to evaluate successes and challenges of their efforts, and to make modifications where needed. We propose that up to 10% of the biomass can be collected from healthy, large A. cervicornis donor colonies for nursery propagation. We also propose the following benchmarks for the first year of activities for A. cervicornis restoration: (1) >75% live tissue cover on donor colonies; (2) >80% survivorship of nursery corals; and (3) >70% survivorship of outplanted corals. Finally, we report productivity means of 4.4 cm yr-1 for nursery corals and 4.8 cm yr-1 for outplants as a frame of reference for ranking performance within programs. Such benchmarks, and potential subsequent adaptive actions, are needed to fully assess the long-term success of coral restoration and species recovery programs.
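As an illustration of how the proposed first-year benchmarks could drive a stop-light indicator, the sketch below encodes the three thresholds from the abstract in Python; the yellow margin and the example program values are illustrative assumptions.

```python
# Illustrative stop-light evaluation of the first-year A. cervicornis benchmarks
# proposed in the abstract. The green thresholds come from the text; treating
# values within 10 percentage points of a threshold as "yellow" is our assumption.

BENCHMARKS = {
    "donor_live_tissue_cover": 75.0,   # >75% live tissue cover on donor colonies
    "nursery_survivorship": 80.0,      # >80% survivorship of nursery corals
    "outplant_survivorship": 70.0,     # >70% survivorship of outplanted corals
}

def stoplight(metric: str, observed_pct: float, yellow_margin: float = 10.0) -> str:
    """Return 'green', 'yellow', or 'red' for an observed percentage."""
    target = BENCHMARKS[metric]
    if observed_pct > target:
        return "green"
    if observed_pct > target - yellow_margin:
        return "yellow"
    return "red"

if __name__ == "__main__":
    program = {"donor_live_tissue_cover": 82.0,   # hypothetical program results
               "nursery_survivorship": 76.5,
               "outplant_survivorship": 58.0}
    for metric, value in program.items():
        print(f"{metric}: {value:.1f}% -> {stoplight(metric, value)}")
```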
Incremental cost effectiveness evaluation in clinical research.
Krummenauer, Frank; Landwehr, I
2005-01-28
The health economic evaluation of therapeutic and diagnostic strategies is of increasing importance in clinical research. Clinical trialists therefore have to address health economic aspects more frequently. However, whereas they are quite familiar with classical effect measures in clinical trials, they are still far less familiar with the corresponding parameters in the health economic evaluation of therapeutic and diagnostic procedures. The concepts of incremental cost effectiveness ratios (ICERs) and incremental net health benefit (INHB) are illustrated and contrasted using the cost effectiveness evaluation of cataract surgery with monofocal and multifocal intraocular lenses. ICERs relate the costs of a treatment to its clinical benefit in terms of a ratio expression (indexed as Euro per clinical benefit unit). ICERs can therefore be directly compared to a pre-specified willingness to pay (WTP) benchmark, which represents the maximum cost health insurers would invest to achieve one clinical benefit unit. INHBs estimate a treatment's net clinical benefit after accounting for its cost increase versus an established therapeutic standard. Resource allocation rules can be formulated by means of both effect measures. Both the ICER and the INHB approach enable the definition of directional resource allocation rules. The allocation decisions arising from these rules are identical, as long as the willingness to pay benchmark is fixed in advance. Therefore both strategies crucially call for a priori determination of both the underlying clinical benefit endpoint (such as gain in vision lines after cataract surgery or gain in quality-adjusted life years) and the corresponding willingness to pay benchmark. The use of incremental cost effectiveness and net health benefit estimates provides a rationale for health economic allocation discussions and funding decisions. It implies the same requirements on trial protocols as already established for clinical trials, that is, the a priori definition of primary hypotheses (formulated as an allocation rule involving a pre-specified willingness to pay benchmark) and of the primary clinical benefit endpoint (as the rationale for effectiveness evaluation).
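As a compact illustration of the two effect measures and of why they yield identical allocation decisions under a fixed willingness-to-pay benchmark, the sketch below uses our own notation (λ for the WTP benchmark, ΔC and ΔE for the incremental cost and clinical benefit of the new treatment versus the standard); the abstract itself does not fix symbols.

```latex
% ICER and INHB for a new treatment (subscript 1) versus a standard (subscript 0),
% with Delta C = C_1 - C_0, Delta E = E_1 - E_0, and lambda the pre-specified
% willingness-to-pay benchmark (cost per clinical benefit unit).
\[
  \mathrm{ICER} = \frac{\Delta C}{\Delta E},
  \qquad
  \mathrm{INHB} = \Delta E - \frac{\Delta C}{\lambda}.
\]
% For Delta E > 0 and lambda > 0, both decision rules allocate resources identically:
\[
  \frac{\Delta C}{\Delta E} < \lambda
  \;\Longleftrightarrow\;
  \Delta E - \frac{\Delta C}{\lambda} > 0 .
\]
```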
NASA Astrophysics Data System (ADS)
Ito, Akihiko; Nishina, Kazuya; Reyer, Christopher P. O.; François, Louis; Henrot, Alexandra-Jane; Munhoven, Guy; Jacquemin, Ingrid; Tian, Hanqin; Yang, Jia; Pan, Shufen; Morfopoulos, Catherine; Betts, Richard; Hickler, Thomas; Steinkamp, Jörg; Ostberg, Sebastian; Schaphoff, Sibyll; Ciais, Philippe; Chang, Jinfeng; Rafique, Rashid; Zeng, Ning; Zhao, Fang
2017-08-01
Simulating vegetation photosynthetic productivity (or gross primary production, GPP) is a critical feature of the biome models used for impact assessments of climate change. We conducted a benchmarking of global GPP simulated by eight biome models participating in the second phase of the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP2a) with four meteorological forcing datasets (30 simulations), using independent GPP estimates and recent satellite data of solar-induced chlorophyll fluorescence as a proxy of GPP. The simulated global terrestrial GPP ranged from 98 to 141 Pg C yr-1 (1981-2000 mean); considerable inter-model and inter-data differences were found. Major features of spatial distribution and seasonal change of GPP were captured by each model, showing good agreement with the benchmarking data. All simulations showed incremental trends of annual GPP, seasonal-cycle amplitude, radiation-use efficiency, and water-use efficiency, mainly caused by the CO2 fertilization effect. The incremental slopes were higher than those obtained by remote sensing studies, but comparable with those by recent atmospheric observation. Apparent differences were found in the relationship between GPP and incoming solar radiation, for which forcing data differed considerably. The simulated GPP trends co-varied with a vegetation structural parameter, leaf area index, at model-dependent strengths, implying the importance of constraining canopy properties. In terms of extreme events, GPP anomalies associated with a historical El Niño event and large volcanic eruption were not consistently simulated in the model experiments due to deficiencies in both forcing data and parameterized environmental responsiveness. Although the benchmarking demonstrated the overall advancement of contemporary biome models, further refinements are required, for example, for solar radiation data and vegetation canopy schemes.
Silver, Emily J.; D'Amato, Anthony W.; Fraver, Shawn; Palik, Brian J.; Bradford, John B.
2013-01-01
The structure and developmental dynamics of old-growth forests often serve as important baselines for restoration prescriptions aimed at promoting more complex structural conditions in managed forest landscapes. Nonetheless, long-term information on natural patterns of development is rare for many commercially important and ecologically widespread forest types. Moreover, the effectiveness of approaches recommended for restoring old-growth structural conditions to managed forests, such as the application of extended rotation forestry, has been little studied. This study uses several long-term datasets from old growth, extended rotation, and unmanaged second growth Pinus resinosa (red pine) forests in northern Minnesota, USA, to quantify the range of variation in structural conditions for this forest type and to evaluate the effectiveness of extended rotation forestry at promoting the development of late-successional structural conditions. Long-term tree population data from permanent plots for one of the old-growth stands and the extended rotation stands (87 and 61 years, respectively) also allowed for an examination of the long-term structural dynamics of these systems. Old-growth forests were more structurally complex than unmanaged second-growth and extended rotation red pine stands, due in large part to the significantly higher volumes of coarse woody debris (70.7 vs. 11.5 and 4.7 m3/ha, respectively) and higher snag basal area (6.9 vs. 2.9 and 0.5 m2/ha, respectively). In addition, old-growth forests, although red pine-dominated, contained a greater abundance of other species, including Pinus strobus, Abies balsamea, and Picea glauca relative to the other stand types examined. These differences between stand types largely reflect historic gap-scale disturbances within the old-growth systems and their corresponding structural and compositional legacies. Nonetheless, extended rotation thinning treatments, by accelerating advancement to larger tree diameter classes, generated diameter distributions more closely approximating those found in old growth within a shorter time frame than depicted in long-term examinations of old-growth structural development. These results suggest that extended rotation treatments may accelerate the development of old-growth structural characteristics, provided that coarse woody debris and snags are deliberately retained and created on site. These and other developmental characteristics of old-growth systems can inform forest management when objectives include the restoration of structural conditions found in late-successional forests.
Comparison of Long-Term Outcomes in Adolescents with Anorexia Nervosa Treated with Family Therapy
ERIC Educational Resources Information Center
Lock, James; Couturier, Jennifer; Agras, W. Stewart
2006-01-01
Objective: To describe the relative effectiveness of a short versus long course of family-based therapy (FBT) for adolescent anorexia nervosa at long-term follow-up. Method: This study used clinical and structured interviews to assess psychological and psychosocial outcomes of adolescents (ages 12-18 years at baseline) who were previously treated…
Family Structure and Long-Term Care Insurance Purchase
Van Houtven, Courtney Harold; Coe, Norma B.; Konetzka, R. Tamara
2015-01-01
While it has long been assumed that family structure and potential sources of informal care play a large role in the purchase decisions for long-term care insurance (LTCI), current empirical evidence is inconclusive. Our study examines the relationship between family structure and LTCI purchase and addresses several major limitations of the prior literature by using a long panel of data and considering modern family relationships, such as presence of stepchildren. We find that family structure characteristics from one’s own generation, particularly about one’s spouse, are associated with purchase, but that few family structure attributes from the younger generation have an influence. Family factors that may indicate future caregiver supply are negatively associated with purchase: having a coresidential child, signaling close proximity, and having a currently working spouse, signaling a healthy and able spouse, that LTC planning has not occurred yet, or that there is less need for asset protection afforded by LTCI. Dynamic factors, such as increasing wealth or turning 65, are associated with higher likelihood of LTCI purchase. PMID:25760583
Benchmarking: applications to transfusion medicine.
Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M
2012-10-01
Benchmarking is a structured, continuous, collaborative process in which comparisons for selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations to move the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking, focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institution-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal. Copyright © 2012 Elsevier Inc. All rights reserved.
Janssens, Heidi; Braeckman, Lutgart; De Clercq, Bart; Casini, Annalisa; De Bacquer, Dirk; Kittel, France; Clays, Els
2016-08-22
In this longitudinal study, the complex interplay between job strain and bullying in relation to sickness absence was investigated. Following the "work environment hypothesis", which establishes several work characteristics as antecedents of bullying, we assumed that job strain, conceptualized by the Job-Demand-Control model, has an indirect relation with long-term sickness absence through bullying. The sample consisted of 2983 Belgian workers, aged 30 to 55 years, who participated in the Belstress III study. They completed a survey, including the Job Content Questionnaire and a bullying inventory, at baseline. Their sickness absence figures were registered during 1 year of follow-up. Long-term sickness absence was defined as at least 15 consecutive days. A mediation analysis, using structural equation modeling, was performed to examine the indirect association of job strain through bullying with long-term sickness absence. The full structural model was adjusted for several possible confounders: age, gender, occupational group, educational level, company, smoking habits, alcohol use, body mass index, self-rated health, baseline long-term sickness absence and neuroticism. The results support the hypothesis: a significant indirect association of job strain with long-term sickness absence through bullying was observed, suggesting that bullying is an intermediate variable between job strain and long-term sickness absence. No evidence for the reversed pathway of an indirect association of bullying through job strain was found. Bullying was observed as a mediating variable in the relation between job strain and sickness absence. The results suggest that exposure to job strain may create circumstances in which a worker risks becoming a target of bullying. Our findings are generally in line with the work environment hypothesis, which emphasizes the importance of organizational work factors in the origin of bullying. This study highlights that remodeling jobs to reduce job strain may be important in the prevention of bullying and subsequent sickness absence.
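For readers unfamiliar with the mediation logic, the sketch below shows the product-of-coefficients idea on synthetic data (path a: strain to bullying; path b: bullying to absence, adjusting for strain); the study itself used structural equation modeling with a binary absence outcome and many covariates, so this is only a toy illustration with invented numbers.

```python
# Toy product-of-coefficients mediation sketch on synthetic continuous data:
# indirect effect = a * b, where a is the exposure->mediator slope and b is the
# mediator->outcome slope adjusting for the exposure. Not the study's actual model.
import numpy as np

rng = np.random.default_rng(0)
n = 2983
strain = rng.normal(size=n)                                     # exposure
bullying = 0.4 * strain + rng.normal(size=n)                    # mediator (true a = 0.4)
absence = 0.3 * bullying + 0.05 * strain + rng.normal(size=n)   # outcome (true b = 0.3)

def ols_slope(y, regressors):
    """OLS coefficient on the first regressor, with an intercept included."""
    X = np.column_stack([np.ones_like(y)] + list(regressors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

a = ols_slope(bullying, [strain])             # exposure -> mediator
b = ols_slope(absence, [bullying, strain])    # mediator -> outcome, adjusted for exposure
print(f"Estimated indirect effect a*b = {a * b:.3f}")
```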
Standardised Benchmarking in the Quest for Orthologs
Altenhoff, Adrian M.; Boeckmann, Brigitte; Capella-Gutierrez, Salvador; Dalquen, Daniel A.; DeLuca, Todd; Forslund, Kristoffer; Huerta-Cepas, Jaime; Linard, Benjamin; Pereira, Cécile; Pryszcz, Leszek P.; Schreiber, Fabian; Sousa da Silva, Alan; Szklarczyk, Damian; Train, Clément-Marie; Bork, Peer; Lecompte, Odile; von Mering, Christian; Xenarios, Ioannis; Sjölander, Kimmen; Juhl Jensen, Lars; Martin, Maria J.; Muffato, Matthieu; Gabaldón, Toni; Lewis, Suzanna E.; Thomas, Paul D.; Sonnhammer, Erik; Dessimoz, Christophe
2016-01-01
The identification of evolutionarily related genes across different species—orthologs in particular—forms the backbone of many comparative, evolutionary, and functional genomic analyses. Achieving high accuracy in orthology inference is thus essential. Yet the true evolutionary history of genes, required to ascertain orthology, is generally unknown. Furthermore, orthologs are used for very different applications across different phyla, with different requirements in terms of the precision-recall trade-off. As a result, assessing the performance of orthology inference methods remains difficult for both users and method developers. Here, we present a community effort to establish standards in orthology benchmarking and facilitate orthology benchmarking through an automated web-based service (http://orthology.benchmarkservice.org). Using this new service, we characterise the performance of 15 well-established orthology inference methods and resources on a battery of 20 different benchmarks. Standardised benchmarking provides a way for users to identify the most effective methods for the problem at hand, sets a minimal requirement for new tools and resources, and guides the development of more accurate orthology inference methods. PMID:27043882
Subjective and Objective Parameters Determining "Salience" in Long-Term Dialect Accommodation.
ERIC Educational Resources Information Center
Auer, Peter; Barden, Birgit; Grosskopf, Beate
1998-01-01
Presents results of a longitudinal study on long-term dialect accommodation in a German dialect setting. An important model of explaining which linguistic structures undergo such convergence and which do not makes use of the notion of "salience." (Author/VWL)
US-23 aggregate test road long-term performance evaluation : final report.
DOT National Transportation Integrated Search
2017-03-24
The US-23 Aggregate Test Road was constructed in 1992 with the main purpose to determine the influence of coarse aggregate of varying frost susceptibility on long-term concrete durability. The pavement structure for the entire Test Road consists ...
Graham, S.A.; Craft, C.B.; McCormick, P.V.; Aldous, A.
2005-01-01
Forms, amounts, and accumulation of soil phosphorus (P) were measured in natural and recently restored marshes surrounding Upper Klamath Lake, located in south-central Oregon, USA, to determine rates of P accumulation in natural marshes and to assess changes in P pools caused by long-term drainage in recently restored marshes. Soil cores were collected from three natural marshes and radiometrically dated to determine recent (137Cs-based) and long-term (210Pb-based) rates of peat accretion and P accumulation. A second set of soil cores collected from the three natural marshes and from three recently restored marshes was analyzed using a modification of the Hedley procedure to determine the forms and amounts of soil P. Total P in the recently restored marshes (222 to 311 µg cm-3) was 2-3 times greater than in the natural marshes (103 to 117 µg cm-3), primarily due to greater bulk density caused by soil subsidence, a consequence of long-term marsh drainage. Occluded Fe- and Al-bound Pi, calcium-bound Pi and residual P were 4 times, 22 times, and 5 times greater, respectively, in the recently restored marshes. More than 67% of the P pool in both the natural and recently restored marshes was present in recalcitrant forms (humic-acid Po and residual P) that provide long-term P storage in peat. Phosphorus accumulation in the natural marshes averaged 0.45 g m-2 yr-1 (137Cs) and 0.40 g m-2 yr-1 (210Pb), providing a benchmark for optimizing P sequestration in the recently restored marshes. Effective P sequestration in the recently restored marshes, however, will depend on re-establishing equilibrium between the P-enriched soils and the P concentration of floodwaters and a hydrologic regime similar to the natural marshes. © 2005, The Society of Wetland Scientists.
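The reported accumulation rates follow from the dated cores by a simple unit conversion; the sketch below shows the arithmetic, with an assumed accretion rate chosen only to illustrate that a value near the reported range results (it is not a figure from the study).

```python
# Back-of-the-envelope check: P accumulation (g m^-2 yr^-1) equals the peat
# accretion rate (cm yr^-1) times the total P concentration (ug cm^-3) times 0.01,
# where 0.01 converts ug cm^-2 yr^-1 to g m^-2 yr^-1 (x 1e4 cm^2/m^2 x 1e-6 g/ug).
def p_accumulation(accretion_cm_per_yr: float, total_p_ug_per_cm3: float) -> float:
    return accretion_cm_per_yr * total_p_ug_per_cm3 * 0.01

# Assumed accretion rate of 0.4 cm/yr with ~110 ug cm^-3 total P gives ~0.44 g m^-2 yr^-1,
# in line with the 0.40-0.45 g m^-2 yr^-1 range reported in the abstract.
print(p_accumulation(0.4, 110.0))
```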
ERIC Educational Resources Information Center
Martinez, Michael E.
2010-01-01
The human mind has two types of memory: short-term and long-term. In all types of learning, it is best to use that structure rather than to fight against it. One way to do that is to ensure that learners can fit new information into patterns that can be stored in and more easily retrieved from long-term memory.
Survey of long-term durability of fiberglass reinforced plastic structures
NASA Technical Reports Server (NTRS)
Lieblein, S.
1981-01-01
Included are fluid containment vessels, marine structures, and aircraft components with up to 19 years of service. Correlations were obtained for the variation of static fatigue strength, cyclic fatigue strength, and residual burst strength for pressure vessels. In addition, data are presented for the effects of moisture on strength retention. Data variations were analyzed, and relationships and implications for testing are discussed. Change in strength properties for complete structures was examined for indications of the effects of environmental conditions such as moisture and outdoor exposure (ultraviolet radiation, weathering) on long term durability.
Assembly of hard spheres in a cylinder: a computational and experimental study.
Fu, Lin; Bian, Ce; Shields, C Wyatt; Cruz, Daniela F; López, Gabriel P; Charbonneau, Patrick
2017-05-14
Hard spheres are an important benchmark of our understanding of natural and synthetic systems. In this work, colloidal experiments and Monte Carlo simulations examine the equilibrium and out-of-equilibrium assembly of hard spheres of diameter σ within cylinders of diameter σ≤D≤ 2.82σ. Although phase transitions formally do not exist in such systems, marked structural crossovers can nonetheless be observed. Over this range of D, we find in simulations that structural crossovers echo the structural changes in the sequence of densest packings. We also observe that the out-of-equilibrium self-assembly depends on the compression rate. Slow compression approximates equilibrium results, while fast compression can skip intermediate structures. Crossovers for which no continuous line-slip exists are found to be dynamically unfavorable, which is the main source of this difference. Results from colloidal sedimentation experiments at low diffusion rate are found to be consistent with the results of fast compressions, as long as appropriate boundary conditions are used.
Issues in Benchmarking and Assessing Institutional Engagement
ERIC Educational Resources Information Center
Furco, Andrew; Miller, William
2009-01-01
The process of assessing and benchmarking community engagement can take many forms. To date, more than two dozen assessment tools for measuring community engagement institutionalization have been published. These tools vary substantially in purpose, level of complexity, scope, process, structure, and focus. While some instruments are designed to…
Research on computer systems benchmarking
NASA Technical Reports Server (NTRS)
Smith, Alan Jay (Principal Investigator)
1996-01-01
This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance. The performance impact of optimization in the context of our methodology for CPU performance characterization was based on the abstract machine model. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the afore-mentioned accomplishments are more specifically summarized in this report, as well as those smaller in magnitude supported by this grant.
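As a sketch of the execution-time estimation idea described above, the snippet below combines hypothetical per-operation times of an abstract machine with hypothetical dynamic operation counts of a program; the operation names and all numbers are illustrative assumptions, not the grant's actual characterizer.

```python
# Minimal sketch of abstract-machine benchmarking: a machine is characterized by
# per-operation times for an abstract (Fortran-like) machine, a program by its
# dynamic counts of those operations, and runtime is predicted by combining the two.

machine_op_times_us = {        # hypothetical per-operation times (microseconds)
    "fp_add": 0.02,
    "fp_mul": 0.03,
    "mem_load": 0.05,
    "branch": 0.01,
}

program_op_counts = {          # hypothetical dynamic operation counts for one program
    "fp_add": 4_000_000,
    "fp_mul": 3_500_000,
    "mem_load": 9_000_000,
    "branch": 1_200_000,
}

def predicted_runtime_s(op_times_us: dict, op_counts: dict) -> float:
    """Estimate runtime by summing count * per-operation time over all operations."""
    return sum(op_counts[op] * op_times_us[op] for op in op_counts) / 1e6

print(f"Predicted runtime: {predicted_runtime_s(machine_op_times_us, program_op_counts):.2f} s")
```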
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1986-01-01
An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.
Working memory, long-term memory, and medial temporal lobe function
Jeneson, Annette; Squire, Larry R.
2012-01-01
Early studies of memory-impaired patients with medial temporal lobe (MTL) damage led to the view that the hippocampus and related MTL structures are involved in the formation of long-term memory and that immediate memory and working memory are independent of these structures. This traditional idea has recently been revisited. Impaired performance in patients with MTL lesions on tasks with short retention intervals, or no retention interval, and neuroimaging findings with similar tasks have been interpreted to mean that the MTL is sometimes needed for working memory and possibly even for visual perception itself. We present a reappraisal of this interpretation. Our main conclusion is that, if the material to be learned exceeds working memory capacity, if the material is difficult to rehearse, or if attention is diverted, performance depends on long-term memory even when the retention interval is brief. This fundamental notion is better captured by the terms subspan memory and supraspan memory than by the terms short-term memory and long-term memory. We propose methods for determining when performance on short-delay tasks must depend on long-term (supraspan) memory and suggest that MTL lesions impair performance only when immediate memory and working memory are insufficient to support performance. In neuroimaging studies, MTL activity during encoding is influenced by the memory load and correlates positively with long-term retention of the material that was presented. The most parsimonious and consistent interpretation of all the data is that subspan memoranda are supported by immediate memory and working memory and are independent of the MTL. PMID:22180053
Meyerand, M.E.; Sutula, T.
2015-01-01
Neural activity promotes circuit formation in developing systems and during critical periods permanently modifies circuit organization and functional properties. These observations suggest that excessive neural activity, as occurs during seizures, might influence developing neural circuitry with long-term outcomes that depend on age at the time of seizures. We systematically examined long-term structural and functional consequences of seizures induced in rats by kainic acid, pentylenetetrazol, and hyperthermia across postnatal ages from birth through postnatal day 90 in adulthood (P90). Magnetic resonance imaging (MRI), diffusion tensor imaging (DTI), and electrophysiological methods at ≥P95 following seizures induced from P1 to P90 demonstrated consistent patterns of gross atrophy, microstructural abnormalities in the corpus callosum and hippocampus, and functional alterations in hippocampal circuitry at ≥P95 that were independent of the method of seizure induction and varied systematically as a function of age at the time of seizures. Three distinct epochs were observed in which seizures resulted in distinct long-term structural and functional outcomes at ≥P95. Seizures prior to P20 resulted in DTI abnormalities in corpus callosum and hippocampus in the absence of gross cerebral atrophy, and increased paired pulse inhibition (PPI) in the dentate gyrus at ≥P95. Seizures after P30 induced a different pattern of DTI abnormalities in the fimbria and hippocampus accompanied by gross cerebral atrophy with increases in lateral ventricular volume, as well as increased PPI in the dentate gyrus at ≥P95. In contrast, seizures between P20-P30 did not result in cerebral atrophy or significant imaging abnormalities in the hippocampus or white matter, but irreversibly decreased PPI in the dentate gyrus compared to normal adult controls. These age-specific long-term structural and functional outcomes identify P20-P30 as a potential critical period in hippocampal development defined by distinctive long-term structural and functional properties in adult hippocampal circuitry, including loss of capacity for seizure-induced plasticity in adulthood that could influence epileptogenesis and other hippocampal – dependent behaviors and functional properties. PMID:25555928
Meta-Cresol Purple Reference Material® (RM) for Seawater pH Measurements
NASA Astrophysics Data System (ADS)
Easley, R. A.; Waters, J. F.; Place, B. J.; Pratt, K. W.
2016-02-01
The pH of seawater is a fundamental quantity that governs the carbon dioxide - carbonate system in the world's oceans. High quality pH measurements for long-term monitoring, shipboard studies, and shorter-term biological studies (mesocosm and field experiments) can be ensured through a reference material (RM) that is compatible with existing procedures and which is traceable to primary pH measurement metrology. High-precision spectrophotometric measurements of seawater pH using an indicator dye such as meta-cresol purple (mCP) are well established. However, traceability of these measurements to the International System of Units (SI) additionally requires characterizing the spectrophotometric pH response of the dye in multiple artificial seawater buffers that themselves are benchmarked via primary pH (Harned cell) measurements at a range of pH, salinity, and temperature. NIST is currently developing such a mCP pH RM using this approach. This material will also incorporate new procedures developed at NIST for assessing the purity and homogeneity of the mCP reagent itself. The resulting mCP will provide long-term (years) stability and ease of shipment compared to artificial seawater pH buffers. These efforts will provide the oceanographic user community with a NIST issued mCP (RM), characterized as to its molar absorptivity values and acid dissociation constants (pKa), with uncertainties that comply with the Guide to the Expression of Uncertainty in Measurement (GUM).
3-D ICs as a Platform for IoT Devices
2017-03-01
small footprint, integrate disparate technologies, and require long-term sustainability (extremely low power or self-powered). The 3-D structure exhibits...proof, corrosion may slowly degrade the structure of the device, affecting long-term reliability. Mechanical stress can also ruin the device due to...substrate vias (TSVs). The TSVs are short vertical interconnections (typically 20 µm in length and 2 to 4 µm in diameter [3]) that carry a variety of
Study of energy parameters of machine parts of water-ice jet cleaning applications
NASA Astrophysics Data System (ADS)
Prezhbilov, A. N.; Burnashov, M. A.
2018-03-01
This paper gives the reader a benchmark understanding of cleaning to remove contaminants from machine elements by means of a cryogenic water-ice jet with particles prepared beforehand. It presents a classification of the most common contaminants appearing on the surfaces of machine elements after long-term service. The conceptual contribution of the paper is a thermo-physical model of contaminant removal by means of a water-ice jet. In conclusion, the study establishes the dependencies between the friction force between an ice particle and an obstacle (the contamination), the dimensional change of an ice particle during the cleaning process, and the quantity of heat transmitted to an ice particle.
An agricultural survey for more than 9,500 African households
Waha, Katharina; Zipf, Birgit; Kurukulasuriya, Pradeep; Hassan, Rashid M.
2016-01-01
Surveys for more than 9,500 households were conducted in the growing seasons 2002/2003 or 2003/2004 in eleven African countries: Burkina Faso, Cameroon, Ghana, Niger and Senegal in western Africa; Egypt in northern Africa; Ethiopia and Kenya in eastern Africa; South Africa, Zambia and Zimbabwe in southern Africa. Households were chosen randomly in districts that are representative of key agro-climatic zones and farming systems. The data set specifies farming system characteristics that can help indicate the importance of each system for a country’s agricultural production and its ability to cope with short- and long-term climate changes or extreme weather events. Further, it informs about the location of smallholders and vulnerable systems and permits benchmarking of agricultural system characteristics. PMID:27218890
Vrijens, France; Renard, Françoise; Jonckheer, Pascale; Van den Heede, Koen; Desomer, Anja; Van de Voorde, Carine; Walckiers, Denise; Dubois, Cécile; Camberlin, Cécile; Vlayen, Joan; Van Oyen, Herman; Léonard, Christian; Meeus, Pascal
2013-09-01
Following the commitments of the Tallinn Charter, Belgium publishes the second report on the performance of its health system. A set of 74 measurable indicators is analysed, and results are interpreted following the five dimensions of the conceptual framework: accessibility, quality of care, efficiency, sustainability and equity. All domains of care are covered (preventive, curative, long-term and end-of-life care), as well as health status and health promotion. For all indicators, national/regional values are presented with their evolution over time. Benchmarking to results of other EU-15 countries is also systematic. The policy recommendations represent the most important output of the report. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
NASA Astrophysics Data System (ADS)
Lahmiri, Salim; Uddin, Gazi Salah; Bekiros, Stelios
2017-11-01
We propose a general framework for measuring short- and long-term dynamics in asset classes based on a wavelet representation of clustering analysis. The empirical results show strong evidence of instability of the financial system in the aftermath of the global financial crisis. Indeed, both short- and long-term dynamics changed significantly after the global financial crisis. This study provides interesting insights into the complex structure of the global financial and economic system.
NASA Technical Reports Server (NTRS)
Tien, John K.
1990-01-01
The long-term interdiffusional stability of tungsten-fiber-reinforced niobium alloy composites is addressed. The matrix alloy that is most promising for use as a high-temperature structural material for reliable long-term space power generation is Nb-1Zr. As an ancillary project to this program, efforts were made to assess the nature and kinetics of the interphase reaction between selected beryllide intermetallics and nickel and iron aluminides.
Technical Ramifications of Inclusion of Toxins in the Chemical Weapons Convention (CWC), Supplement
1993-08-01
delaying closure of the ductus arteriosus in infants born with certain cardiac abnormalities, and PGI2 has been used in cardiopulmonary bypass operations...development of "novel" (and therapeutic) compounds by the industrial medicinal chemists is the necessity for structural novelty in the patent sense (176). (This...such as long-term changes in numbers of receptors, long-term closure of certain ion channels, and possibly even long-term changes in number of
Knotty: Efficient and Accurate Prediction of Complex RNA Pseudoknot Structures.
Jabbari, Hosna; Wark, Ian; Montemagno, Carlo; Will, Sebastian
2018-06-01
The computational prediction of RNA secondary structure by free energy minimization has become an important tool in RNA research. However, in practice, energy minimization is mostly limited to pseudoknot-free structures or rather simple pseudoknots, not covering many biologically important structures such as kissing hairpins. Algorithms capable of predicting sufficiently complex pseudoknots (for sequences of length n) used to have extreme complexities, e.g. Pknots (Rivas and Eddy, 1999) has O(n⁶) time and O(n⁴) space complexity. The algorithm CCJ (Chen et al., 2009) dramatically improves the asymptotic run time for predicting complex pseudoknots (handling almost all relevant pseudoknots, while being slightly less general than Pknots), but this came at the cost of large constant factors in space and time, which strongly limited its practical application (∼200 bases already require 256 GB of space). We present a CCJ-type algorithm, Knotty, that handles the same comprehensive pseudoknot class of structures as CCJ with improved space complexity of Θ(n³ + Z); due to the applied technique of sparsification, the number of "candidates", Z, appears to grow significantly slower than n⁴ on our benchmark set (which includes pseudoknotted RNAs up to 400 nucleotides). In terms of run time over this benchmark, Knotty clearly outperforms Pknots and the original CCJ implementation, CCJ 1.0; Knotty's space consumption fundamentally improves over CCJ 1.0, being on a par with the space-economic Pknots. By comparing to CCJ 2.0, our unsparsified Knotty variant, we demonstrate the isolated effect of sparsification. Moreover, Knotty employs the state-of-the-art energy model of "HotKnots DP09", which results in superior prediction accuracy over Pknots. Our software is available at https://github.com/HosnaJabbari/Knotty. Contact: will@tbi.univie.ac.at. Supplementary data are available at Bioinformatics online.
XWeB: The XML Warehouse Benchmark
NASA Astrophysics Data System (ADS)
Mahboubi, Hadj; Darmont, Jérôme
With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.
Child-Resistant Packaging for E-Liquid: A Review of US State Legislation.
Frey, Leslie T; Tilburg, William C
2016-02-01
A growing number of states have introduced or enacted legislation requiring child-resistant packaging for e-liquid containers; however, these laws involve varying terms, packaging standards, and enforcement provisions, raising concerns about their effectiveness. We evaluated bills against 4 benchmarks: broad product definitions that contemplate future developments in the market, citations to a specific packaging standard, stated penalties for violations, and express grants of authority to a state entity to enforce the packaging requirements. Our findings showed that 3 states meet all 4 benchmarks in their enacted legislation. We encourage states to consider these benchmarks when revising statutes or drafting future legislation.
Child-Resistant Packaging for E-Liquid: A Review of US State Legislation
Tilburg, William C.
2016-01-01
A growing number of states have introduced or enacted legislation requiring child-resistant packaging for e-liquid containers; however, these laws involve varying terms, packaging standards, and enforcement provisions, raising concerns about their effectiveness. We evaluated bills against 4 benchmarks: broad product definitions that contemplate future developments in the market, citations to a specific packaging standard, stated penalties for violations, and express grants of authority to a state entity to enforce the packaging requirements. Our findings showed that 3 states meet all 4 benchmarks in their enacted legislation. We encourage states to consider these benchmarks when revising statutes or drafting future legislation. PMID:26691114
Ultra-low power wireless sensing for long-term structural health monitoring
NASA Astrophysics Data System (ADS)
Bilbao, Argenis; Hoover, Davis; Rice, Jennifer; Chapman, Jamie
2011-04-01
Researchers have made significant progress in recent years towards realizing long-term structural health monitoring (SHM) utilizing wireless smart sensor networks (WSSNs). These efforts have focused on improving the performance and robustness of such networks to achieve high quality data acquisition and in-network processing. One of the primary challenges still facing the use of smart sensors for long-term monitoring deployments is their limited power resources. Periodically accessing the sensor nodes to change batteries is not feasible or economical in many deployment cases. While energy harvesting techniques show promise for prolonging unattended network life, low-power design and operation are still critically important. This research presents a new, fully integrated ultra-low power wireless smart sensor node and a flexible base station, both designed for long-term SHM applications. The power consumption of the sensor nodes and base station has been minimized through careful hardware selection and the implementation of power-aware network software, without sacrificing flexibility and functionality.
Smith, Joanna; Cheater, Francine; Bekker, Hilary
2015-08-01
Living with a child with a long-term condition can result in challenges beyond usual parenting because of illness-specific demands. A critical evaluation of research exploring parents' experiences of living with a child with a long-term condition is timely because international health policy advocates that patients with long-term conditions become active collaborators in care decisions. A rapid structured review was undertaken (January 1999-December 2009) in accordance with the United Kingdom Centre for Reviews and Dissemination guidance. Three databases (MEDLINE, CINAHL, PSYCINFO) were searched, and the Journal of Advanced Nursing and Child: Care, Health and Development were also hand searched. Primary research studies written in the English language describing parents' experiences of living with a child with a long-term condition were included. Thematic analysis underpinned data synthesis. Quality appraisal involved assessing each study against predetermined criteria. Thirty-four studies met the inclusion criteria. The impact of living with a child with a long-term condition related to dealing with immediate concerns following the child's diagnosis and responding to the challenges of integrating the child's needs into family life. Parents perceived that they are not always supported in their quest for information, and forming effective relationships with health-care professionals can be stressful. Although having ultimate responsibility for their child's health can be overwhelming, parents developed considerable expertise in managing their child's condition. Parents' accounts suggest they are not always supported in their role as manager of their child's long-term condition, and that their expertise and contribution to care are not always valued. © 2013 John Wiley & Sons Ltd.
A benchmark testing ground for integrating homology modeling and protein docking.
Bohnuud, Tanggis; Luo, Lingqi; Wodak, Shoshana J; Bonvin, Alexandre M J J; Weng, Zhiping; Vajda, Sandor; Schueler-Furman, Ora; Kozakov, Dima
2017-01-01
Protein docking procedures carry out the task of predicting the structure of a protein-protein complex starting from the known structures of the individual protein components. More often than not, however, the structure of one or both components is not known, but can be derived by homology modeling on the basis of known structures of related proteins deposited in the Protein Data Bank (PDB). Thus, the problem is to develop methods that optimally integrate homology modeling and docking with the goal of predicting the structure of a complex directly from the amino acid sequences of its component proteins. One possibility is to use the best available homology modeling and docking methods. However, the models built for the individual subunits often differ to a significant degree from the bound conformation in the complex, often much more so than the differences observed between free and bound structures of the same protein, and therefore additional conformational adjustments, both at the backbone and side chain levels need to be modeled to achieve an accurate docking prediction. In particular, even homology models of overall good accuracy frequently include localized errors that unfavorably impact docking results. The predicted reliability of the different regions in the model can also serve as a useful input for the docking calculations. Here we present a benchmark dataset that should help to explore and solve combined modeling and docking problems. This dataset comprises a subset of the experimentally solved 'target' complexes from the widely used Docking Benchmark from the Weng Lab (excluding antibody-antigen complexes). This subset is extended to include the structures from the PDB related to those of the individual components of each complex, and hence represent potential templates for investigating and benchmarking integrated homology modeling and docking approaches. Template sets can be dynamically customized by specifying ranges in sequence similarity and in PDB release dates, or using other filtering options, such as excluding sets of specific structures from the template list. Multiple sequence alignments, as well as structural alignments of the templates to their corresponding subunits in the target are also provided. The resource is accessible online or can be downloaded at http://cluspro.org/benchmark, and is updated on a weekly basis in synchrony with new PDB releases. Proteins 2016; 85:10-16. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.
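The template-customization step described above can be pictured with a small filter like the following; the record fields, thresholds, and function name are hypothetical and only illustrate the idea of restricting templates by sequence identity, release date, and an exclusion list, not the actual download format of the resource.

```python
from datetime import date

def filter_templates(templates, min_seq_id=0.0, max_seq_id=0.9,
                     released_before=date(2010, 1, 1), exclude_ids=frozenset()):
    """Illustrative template filter. Each record is assumed to be a dict with
    'pdb_id', 'seq_identity' (fraction of sequence identity to the target
    subunit) and 'release_date' (a datetime.date); these field names are
    hypothetical and do not describe the resource's actual download format."""
    kept = []
    for t in templates:
        if t['pdb_id'] in exclude_ids:
            continue                                    # explicit exclusion list
        if not (min_seq_id <= t['seq_identity'] <= max_seq_id):
            continue                                    # sequence-similarity range
        if t['release_date'] >= released_before:
            continue                                    # keep only earlier PDB releases
        kept.append(t)
    return kept
```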
NASA Astrophysics Data System (ADS)
Lamy, T.; Galzin, R.; Kulbicki, M.; Lison de Loma, T.; Claudet, J.
2016-03-01
Coral reefs are increasingly being altered by a myriad of anthropogenic activities and natural disturbances. Long-term studies offer unique opportunities to understand how multiple and recurrent disturbances can influence coral reef resilience and long-term dynamics. While the long-term dynamics of coral assemblages have been extensively documented, the long-term dynamics of coral reef fish assemblages have received less attention. Here, we describe the changes in fish assemblages on Tiahura reef, Moorea, from 1979 to 2011. During this 33-yr period, Tiahura was exposed to multiple disturbances (crown-of-thorns seastar outbreaks and cyclones) that caused recurrent declines and recoveries of coral cover and changes in the dominant coral genera. These shifts in coral composition were associated with long-term cascading effects on fish assemblages. The composition and trophic structure of fish assemblages continuously shifted without returning to their initial composition, whereas fish species richness remained stable, albeit with a small increase over time. We detected nonlinear responses of fish density when corals were most degraded. When coral cover dropped below 10 % following a severe crown-of-thorns sea star outbreak, the density of most fish trophic groups sharply decreased. Our study shows that historical contingency may potentially be an important but largely underestimated factor explaining the contemporary structure of reef fish assemblages and suggests that temporal stability in their structure and function should not necessarily be the target of management strategies that aim at increasing or maintaining coral reef resilience.
A large-scale benchmark of gene prioritization methods.
Guala, Dimitri; Sonnhammer, Erik L L
2017-04-21
In order to maximize the use of results from high-throughput experimental studies, e.g. GWAS, for identification and diagnostics of new disease-associated genes, it is important to have properly analyzed and benchmarked gene prioritization tools. While prospective benchmarks are underpowered to provide statistically significant results in their attempt to differentiate the performance of gene prioritization tools, a strategy for retrospective benchmarking has been missing, and new tools usually only provide internal validations. The Gene Ontology (GO) contains genes clustered around annotation terms. This intrinsic property of GO can be utilized to construct robust benchmarks that are objective with respect to the problem domain. We demonstrate how this can be achieved for network-based gene prioritization tools, utilizing the FunCoup network. We use cross-validation and a set of appropriate performance measures to compare state-of-the-art gene prioritization algorithms: three based on network diffusion, NetRank and two implementations of Random Walk with Restart, and MaxLink that utilizes network neighborhood. Our benchmark suite provides a systematic and objective way to compare the multitude of available and future gene prioritization tools, enabling researchers to select the best gene prioritization tool for the task at hand, and helping to guide the development of more accurate methods.
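A leave-one-out benchmark of the kind described can be sketched as follows; the function names and the assumption that a prioritization tool returns a score for every candidate gene are illustrative, not the paper's implementation.

```python
from sklearn.metrics import roc_auc_score

def loo_term_auc(genes_in_term, all_genes, prioritize):
    """Leave-one-out benchmark over one GO-term gene set. prioritize(seeds)
    is assumed to return a dict mapping every gene to a score (an invented
    interface standing in for any network-based prioritization tool). Each
    member of the term is held out in turn and ranked against all non-member
    genes; the pooled ranking is summarised as an ROC AUC."""
    labels, scores = [], []
    for held_out in genes_in_term:
        seeds = [g for g in genes_in_term if g != held_out]
        ranking = prioritize(seeds)
        for g in all_genes:
            if g in seeds:
                continue                       # seed genes are not scored
            labels.append(1 if g == held_out else 0)
            scores.append(ranking.get(g, 0.0))
    return roc_auc_score(labels, scores)
```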
The Long-Term Conditions Questionnaire: conceptual framework and item development.
Peters, Michele; Potter, Caroline M; Kelly, Laura; Hunter, Cheryl; Gibbons, Elizabeth; Jenkinson, Crispin; Coulter, Angela; Forder, Julien; Towers, Ann-Marie; A'Court, Christine; Fitzpatrick, Ray
2016-01-01
To identify the main issues of importance when living with long-term conditions to refine a conceptual framework for informing the item development of a patient-reported outcome measure for long-term conditions. Semi-structured qualitative interviews (n=48) were conducted with people living with at least one long-term condition. Participants were recruited through primary care. The interviews were transcribed verbatim and analyzed by thematic analysis. The analysis served to refine the conceptual framework, based on reviews of the literature and stakeholder consultations, for developing candidate items for a new measure for long-term conditions. Three main organizing concepts were identified: impact of long-term conditions, experience of services and support, and self-care. The findings helped to refine a conceptual framework, leading to the development of 23 items that represent issues of importance in long-term conditions. The 23 candidate items formed the first draft of the measure, currently named the Long-Term Conditions Questionnaire. The aim of this study was to refine the conceptual framework and develop items for a patient-reported outcome measure for long-term conditions, including single and multiple morbidities and physical and mental health conditions. Qualitative interviews identified the key themes for assessing outcomes in long-term conditions, and these underpinned the development of the initial draft of the measure. These initial items will undergo cognitive testing to refine the items prior to further validation in a survey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suter, G.W., II
1993-01-01
One of the initial stages in ecological risk assessment of hazardous waste sites is the screening of contaminants to determine which, if any, of them are worthy of further consideration; this process is termed contaminant screening. Screening is performed by comparing concentrations in ambient media to benchmark concentrations that are either indicative of a high likelihood of significant effects (upper screening benchmarks) or of a very low likelihood of significant effects (lower screening benchmarks). Exceedance of an upper screening benchmark indicates that the chemical in question is clearly of concern and remedial actions are likely to be needed. Exceedance of a lower screening benchmark indicates that a contaminant is of concern unless other information indicates that the data are unreliable or the comparison is inappropriate. Chemicals with concentrations below the lower benchmark are not of concern if the ambient data are judged to be adequate. This report presents potential screening benchmarks for protection of aquatic life from contaminants in water. Because there is no guidance for screening benchmarks, a set of alternative benchmarks is presented herein. The alternative benchmarks are based on different conceptual approaches to estimating concentrations causing significant effects. For the upper screening benchmark, there are the acute National Ambient Water Quality Criteria (NAWQC) and the Secondary Acute Values (SAV). The SAV concentrations are values estimated with 80% confidence not to exceed the unknown acute NAWQC for those chemicals with no NAWQC. The alternative chronic benchmarks are the chronic NAWQC, the Secondary Chronic Value (SCV), the lowest chronic values for fish and daphnids, the lowest EC20 for fish and daphnids from chronic toxicity tests, the estimated EC20 for a sensitive species, and the concentration estimated to cause a 20% reduction in the recruit abundance of largemouth bass. It is recommended that ambient chemical concentrations be compared to all of these benchmarks. If NAWQC are exceeded, the chemicals must be contaminants of concern because the NAWQC are applicable or relevant and appropriate requirements (ARARs). If NAWQC are not exceeded, but other benchmarks are, contaminants should be selected on the basis of the number of benchmarks exceeded and the conservatism of the particular benchmark values, as discussed in the text. To the extent that toxicity data are available, this report presents the alternative benchmarks for chemicals that have been detected on the Oak Ridge Reservation. It also presents the data used to calculate the benchmarks and the sources of the data. It compares the benchmarks and discusses their relative conservatism and utility. This report supersedes a prior aquatic benchmarks report (Suter and Mabrey 1994). It adds two new types of benchmarks. It also updates the benchmark values where appropriate, adds some new benchmark values, replaces secondary sources with primary sources, and provides more complete documentation of the sources and derivation of all values.
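The screening logic described in this report reduces to a simple comparison per chemical; the sketch below is a generic illustration of that decision rule, with the benchmark values themselves to be taken from the report's tables (e.g. NAWQC, SAV, SCV).

```python
def screen_contaminant(ambient_conc, lower_benchmark, upper_benchmark):
    """Classify a chemical by comparing its ambient concentration with a lower
    and an upper screening benchmark, following the decision rule described
    above. The benchmark values themselves (e.g. NAWQC, SAV, SCV) must be
    supplied from the report's tables; all numbers here are user inputs."""
    if upper_benchmark is not None and ambient_conc >= upper_benchmark:
        return "contaminant of concern (upper screening benchmark exceeded)"
    if lower_benchmark is not None and ambient_conc >= lower_benchmark:
        return "of concern unless other information discounts it (lower benchmark exceeded)"
    return "not of concern, provided the ambient data are judged adequate"
```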
2015-01-01
Benchmarking data sets have become common in recent years for the purpose of virtual screening, though the main focus had been placed on the structure-based virtual screening (SBVS) approaches. Due to the lack of crystal structures, there is great need for unbiased benchmarking sets to evaluate various ligand-based virtual screening (LBVS) methods for important drug targets such as G protein-coupled receptors (GPCRs). To date these ready-to-apply data sets for LBVS are fairly limited, and the direct usage of benchmarking sets designed for SBVS could bring the biases to the evaluation of LBVS. Herein, we propose an unbiased method to build benchmarking sets for LBVS and validate it on a multitude of GPCRs targets. To be more specific, our methods can (1) ensure chemical diversity of ligands, (2) maintain the physicochemical similarity between ligands and decoys, (3) make the decoys dissimilar in chemical topology to all ligands to avoid false negatives, and (4) maximize spatial random distribution of ligands and decoys. We evaluated the quality of our Unbiased Ligand Set (ULS) and Unbiased Decoy Set (UDS) using three common LBVS approaches, with Leave-One-Out (LOO) Cross-Validation (CV) and a metric of average AUC of the ROC curves. Our method has greatly reduced the “artificial enrichment” and “analogue bias” of a published GPCRs benchmarking set, i.e., GPCR Ligand Library (GLL)/GPCR Decoy Database (GDD). In addition, we addressed an important issue about the ratio of decoys per ligand and found that for a range of 30 to 100 it does not affect the quality of the benchmarking set, so we kept the original ratio of 39 from the GLL/GDD. PMID:24749745
Benchmarking in pathology: development of an activity-based costing model.
Burnett, Leslie; Wilson, Roger; Pfeffer, Sally; Lowry, John
2012-12-01
Benchmarking in Pathology (BiP) allows pathology laboratories to determine the unit cost of all laboratory tests and procedures, and also provides organisational productivity indices allowing comparisons of performance with other BiP participants. We describe 14 years of progressive enhancement to a BiP program, including the implementation of 'avoidable costs' as the accounting basis for allocation of costs rather than previous approaches using 'total costs'. A hierarchical tree-structured activity-based costing model distributes 'avoidable costs' attributable to the pathology activities component of a pathology laboratory operation. The hierarchical tree model permits costs to be allocated across multiple laboratory sites and organisational structures. This has enabled benchmarking on a number of levels, including test profiles and non-testing related workload activities. The development of methods for dealing with variable cost inputs, allocation of indirect costs using imputation techniques, panels of tests, and blood-bank record keeping, have been successfully integrated into the costing model. A variety of laboratory management reports are produced, including the 'cost per test' of each pathology 'test' output. Benchmarking comparisons may be undertaken at any and all of the 'cost per test' and 'cost per Benchmarking Complexity Unit' level, 'discipline/department' (sub-specialty) level, or overall laboratory/site and organisational levels. We have completed development of a national BiP program. An activity-based costing methodology based on avoidable costs overcomes many problems of previous benchmarking studies based on total costs. The use of benchmarking complexity adjustment permits correction for varying test-mix and diagnostic complexity between laboratories. Use of iterative communication strategies with program participants can overcome many obstacles and lead to innovations.
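The hierarchical tree-structured allocation can be illustrated with a small recursive routine; the data model (field names, and the rule of splitting a parent's avoidable cost in proportion to the direct cost beneath each child) is an assumption made for this sketch, not the BiP program's actual implementation.

```python
def allocate(node, inherited=0.0):
    """Top-down allocation of avoidable costs through a hierarchical cost tree.
    Each node is a dict with its own 'direct_cost' and optional 'children';
    leaves also carry 'test_volume'. A node's own cost plus anything inherited
    from ancestors is split among its children in proportion to the direct
    cost sitting below each child, and leaves derive a 'cost_per_test'.
    Field names and the proportional rule are assumptions for this sketch."""
    children = node.get('children', [])
    if not children:
        node['total_cost'] = node['direct_cost'] + inherited
        node['cost_per_test'] = node['total_cost'] / max(node.get('test_volume', 1), 1)
        return node['total_cost']

    def subtree_direct(n):
        return n['direct_cost'] + sum(subtree_direct(c) for c in n.get('children', []))

    pool = node['direct_cost'] + inherited          # indirect with respect to the children
    weights = [subtree_direct(c) for c in children]
    total_w = sum(weights) or 1.0
    node['total_cost'] = sum(
        allocate(c, inherited=pool * w / total_w) for c, w in zip(children, weights)
    )
    return node['total_cost']
```

Calling such a routine on the root of a laboratory's cost tree leaves every leaf annotated with a unit cost, which is the kind of 'cost per test' output that benchmarking comparisons can then be built on.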
Xia, Jie; Jin, Hongwei; Liu, Zhenming; Zhang, Liangren; Wang, Xiang Simon
2014-05-27
Benchmarking data sets have become common in recent years for the purpose of virtual screening, though the main focus had been placed on the structure-based virtual screening (SBVS) approaches. Due to the lack of crystal structures, there is great need for unbiased benchmarking sets to evaluate various ligand-based virtual screening (LBVS) methods for important drug targets such as G protein-coupled receptors (GPCRs). To date these ready-to-apply data sets for LBVS are fairly limited, and the direct usage of benchmarking sets designed for SBVS could bring the biases to the evaluation of LBVS. Herein, we propose an unbiased method to build benchmarking sets for LBVS and validate it on a multitude of GPCRs targets. To be more specific, our methods can (1) ensure chemical diversity of ligands, (2) maintain the physicochemical similarity between ligands and decoys, (3) make the decoys dissimilar in chemical topology to all ligands to avoid false negatives, and (4) maximize spatial random distribution of ligands and decoys. We evaluated the quality of our Unbiased Ligand Set (ULS) and Unbiased Decoy Set (UDS) using three common LBVS approaches, with Leave-One-Out (LOO) Cross-Validation (CV) and a metric of average AUC of the ROC curves. Our method has greatly reduced the "artificial enrichment" and "analogue bias" of a published GPCRs benchmarking set, i.e., GPCR Ligand Library (GLL)/GPCR Decoy Database (GDD). In addition, we addressed an important issue about the ratio of decoys per ligand and found that for a range of 30 to 100 it does not affect the quality of the benchmarking set, so we kept the original ratio of 39 from the GLL/GDD.
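Criteria (2) and (3) above amount to a property-matched, topology-dissimilar decoy filter; the sketch below illustrates that filter under the assumption that descriptors and fingerprint bit sets are precomputed, with thresholds and field names chosen purely for illustration.

```python
def pick_decoys(ligand, candidates, ligand_fps, n_decoys=39,
                prop_tol=0.15, max_tanimoto=0.4):
    """Sketch of the decoy-selection logic: keep candidates whose
    physicochemical properties match the query ligand within a tolerance but
    whose 2D fingerprints are dissimilar to every known ligand of the target.
    Each ligand/candidate is assumed to be a dict holding a 'props' vector and
    a 'fp' set of fingerprint bits; thresholds and field names are illustrative."""
    def tanimoto(a, b):
        return len(a & b) / float(len(a | b) or 1)

    def props_match(p, q):
        return all(abs(pi - qi) <= prop_tol * max(abs(qi), 1e-9)
                   for pi, qi in zip(p, q))

    decoys = []
    for cand in candidates:
        if not props_match(cand['props'], ligand['props']):
            continue                                   # physicochemical matching
        if any(tanimoto(cand['fp'], fp) > max_tanimoto for fp in ligand_fps):
            continue                                   # topological dissimilarity to all ligands
        decoys.append(cand)
        if len(decoys) == n_decoys:                    # e.g. the 39 decoys per ligand noted above
            break
    return decoys
```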
Accurate multiple sequence-structure alignment of RNA sequences using combinatorial optimization.
Bauer, Markus; Klau, Gunnar W; Reinert, Knut
2007-07-27
The discovery of functional non-coding RNA sequences has led to an increasing interest in algorithms related to RNA analysis. Traditional sequence alignment algorithms, however, fail at computing reliable alignments of low-homology RNA sequences. The spatial conformation of RNA sequences largely determines their function, and therefore RNA alignment algorithms have to take structural information into account. We present a graph-based representation for sequence-structure alignments, which we model as an integer linear program (ILP). We sketch how we compute an optimal or near-optimal solution to the ILP using methods from combinatorial optimization, and present results on a recently published benchmark set for RNA alignments. The implementation of our algorithm yields better alignments in terms of two published scores than the other programs that we tested: This is especially the case with an increasing number of input sequences. Our program LARA is freely available for academic purposes from http://www.planet-lisa.net.
Salient object detection: manifold-based similarity adaptation approach
NASA Astrophysics Data System (ADS)
Zhou, Jingbo; Ren, Yongfeng; Yan, Yunyang; Gao, Shangbing
2014-11-01
A saliency detection algorithm based on manifold-based similarity adaptation is proposed. The proposed algorithm is divided into three steps. First, we segment an input image into superpixels, which are represented as the nodes in a graph. Second, a new similarity measurement is used: the weight matrix of the graph, which encodes the similarities between the nodes, also captures the manifold structure of the image patches, with the graph edges determined in a data-adaptive manner in terms of both similarity and manifold structure. Then, we use a local reconstruction method as the diffusion step to obtain the saliency maps. The objective function in the proposed method is based on local reconstruction, through which the estimated weights capture the manifold structure. Experiments on four benchmark databases demonstrate the accuracy and robustness of the proposed method.
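A generic stand-in for the graph construction and diffusion steps is sketched below; it uses a plain Gaussian-weighted adjacency graph and propagation with restart rather than the paper's manifold-adaptive similarity and local-reconstruction objective, so it should be read as an illustration of the pipeline, not the published algorithm.

```python
import numpy as np

def diffuse_saliency(features, adjacency, seed_scores, sigma=0.1, alpha=0.9, n_iter=50):
    """Graph-based saliency diffusion over superpixel nodes. features is an
    (N, d) descriptor per superpixel, adjacency an (N, N) 0/1 matrix marking
    which superpixels touch, and seed_scores an (N,) initial saliency prior.
    Uses a Gaussian-weighted graph and propagation with restart as a generic
    stand-in for the manifold-adaptive similarity and local-reconstruction
    diffusion described above."""
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2)) * adjacency          # weights only between neighbours
    P = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)   # row-stochastic transition matrix
    s = seed_scores.astype(float).copy()
    for _ in range(n_iter):
        s = alpha * P @ s + (1.0 - alpha) * seed_scores       # diffusion with restart to the seeds
    return (s - s.min()) / (s.max() - s.min() + 1e-12)        # normalised saliency per superpixel
```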
Effect of physical activity on frailty and associated negative outcomes: the LIFE randomized trial
USDA-ARS?s Scientific Manuscript database
Background: Limited evidence suggests that physical activity may prevent frailty and associated negative outcomes in older adults. Definitive data from large, long-term, randomized trials are lacking. Objective: To determine whether a long-term structured moderate-intensity physical activity (PA) p...
24 CFR 886.107 - Approval of applications.
Code of Federal Regulations, 2011 CFR
2011-04-01
... turnover, and provides a reasonable assurance of long-term project viability. A determination of long-term viability shall be based upon the following considerations: (1) The project is not subject to any serious problems that are non-economic in nature. Examples of such problems are poor location, structural...
24 CFR 886.107 - Approval of applications.
Code of Federal Regulations, 2013 CFR
2013-04-01
... turnover, and provides a reasonable assurance of long-term project viability. A determination of long-term viability shall be based upon the following considerations: (1) The project is not subject to any serious problems that are non-economic in nature. Examples of such problems are poor location, structural...
24 CFR 886.107 - Approval of applications.
Code of Federal Regulations, 2012 CFR
2012-04-01
... turnover, and provides a reasonable assurance of long-term project viability. A determination of long-term viability shall be based upon the following considerations: (1) The project is not subject to any serious problems that are non-economic in nature. Examples of such problems are poor location, structural...
24 CFR 886.107 - Approval of applications.
Code of Federal Regulations, 2010 CFR
2010-04-01
... turnover, and provides a reasonable assurance of long-term project viability. A determination of long-term viability shall be based upon the following considerations: (1) The project is not subject to any serious problems that are non-economic in nature. Examples of such problems are poor location, structural...
24 CFR 886.107 - Approval of applications.
Code of Federal Regulations, 2014 CFR
2014-04-01
... turnover, and provides a reasonable assurance of long-term project viability. A determination of long-term viability shall be based upon the following considerations: (1) The project is not subject to any serious problems that are non-economic in nature. Examples of such problems are poor location, structural...
NASA Astrophysics Data System (ADS)
Boldina, Inna; Beninger, Peter G.; Le Coz, Maïwen
2014-01-01
Situated at the interface of the microbial and macrofaunal compartments, soft-bottom meiofauna accomplish important ecological functions. However, little is known of their spatial distribution in the benthic environment. To assess the effects of long-term mechanical disturbance on soft-bottom meiofaunal spatial distribution, we compared a site subjected to long-term clam digging to a nearby site untouched by such activities, in Bourgneuf Bay, on the Atlantic coast of France. Six patterned replicate samples were taken at 3, 6, 9, 12, 15, 18, 21 and 24 cm lags, all sampling stations being separated by 5 m. A combined correlogram-variogram approach was used to enhance interpretation of the meiofaunal spatial distribution; in particular, the definition of autocorrelation strength and its statistical significance, as well as the detailed characteristics of the periodic spatial structure of nematode assemblages, and the determination of the maximum distance of their spatial autocorrelation. At both sites, nematodes and copepods clearly exhibited aggregated spatial structure at the meso scale; this structure was attenuated at the impacted site. The nematode spatial distribution showed periodicity at the non-impacted site, but not at the impacted site. This is the first explicit report of a periodic process in meiofaunal spatial distribution. No such cyclic spatial process was observed for the more motile copepods at either site. This first study to indicate the impacts of long-term anthropogenic mechanical perturbation on meiofaunal spatial structure opens the door to a new dimension of mudflat ecology. Since macrofaunal predator search behaviour is known to be strongly influenced by prey spatial structure, the alteration of this structure may have important consequences for ecosystem functioning.
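The variogram half of the combined correlogram-variogram approach can be illustrated with a small empirical semivariogram routine over one spatial dimension; the lag binning and argument names here are assumptions made for the sketch, not the study's geostatistical software.

```python
import numpy as np

def empirical_semivariogram(positions, counts, lags, tol):
    """Empirical semivariogram for abundance counts along a transect.
    positions: (N,) sample positions (e.g. cm), counts: (N,) abundances,
    lags: array of lag distances, tol: half-width of each lag bin.
    Returns one semivariance estimate per lag (NaN where no pairs fall in the bin)."""
    gamma = []
    for h in lags:
        sq_diffs = []
        for i in range(len(positions)):
            for j in range(i + 1, len(positions)):
                if abs(abs(positions[i] - positions[j]) - h) <= tol:
                    sq_diffs.append((counts[i] - counts[j]) ** 2)
        gamma.append(0.5 * np.mean(sq_diffs) if sq_diffs else np.nan)
    return np.array(gamma)
```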
Ó Conchúir, Shane; Barlow, Kyle A; Pache, Roland A; Ollikainen, Noah; Kundert, Kale; O'Meara, Matthew J; Smith, Colin A; Kortemme, Tanja
2015-01-01
The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks) to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a "best practice" set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts, and tutorials for running the benchmark. The captures may be run with any suitable modeling method; we supply command lines for running the benchmarks using the Rosetta software suite. We have compiled initial benchmarks for the resource spanning three key areas: prediction of energetic effects of mutations, protein design, and protein structure prediction, each with associated state-of-the-art modeling protocols. With the help of the wider macromolecular modeling community, we hope to expand the variety of benchmarks included on the website and continue to evaluate new iterations of current methods as they become available.
Monday, Hannah R; Younts, Thomas J; Castillo, Pablo E
2018-04-25
Long-lasting changes of brain function in response to experience rely on diverse forms of activity-dependent synaptic plasticity. Chief among them are long-term potentiation and long-term depression of neurotransmitter release, which are widely expressed by excitatory and inhibitory synapses throughout the central nervous system and can dynamically regulate information flow in neural circuits. This review article explores recent advances in presynaptic long-term plasticity mechanisms and contributions to circuit function. Growing evidence indicates that presynaptic plasticity may involve structural changes, presynaptic protein synthesis, and transsynaptic signaling. Presynaptic long-term plasticity can alter the short-term dynamics of neurotransmitter release, thereby contributing to circuit computations such as novelty detection, modifications of the excitatory/inhibitory balance, and sensory adaptation. In addition, presynaptic long-term plasticity underlies forms of learning and its dysregulation participates in several neuropsychiatric conditions, including schizophrenia, autism, intellectual disabilities, neurodegenerative diseases, and drug abuse. Expected final online publication date for the Annual Review of Neuroscience Volume 41 is July 8, 2018. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
Patient empowerment in long-term conditions: development and preliminary testing of a new measure
2013-01-01
Background Patient empowerment is viewed by policy makers and health care practitioners as a mechanism to help patients with long-term conditions better manage their health and achieve better outcomes. However, assessing the role of empowerment is dependent on effective measures of empowerment. Although many measures of empowerment exist, no measure has been developed specifically for patients with long-term conditions in the primary care setting. This study presents preliminary data on the development and validation of such a measure. Methods We conducted two empirical studies. Study one was an interview study to understand empowerment from the perspective of patients living with long-term conditions. Qualitative analysis identified dimensions of empowerment, and the qualitative data were used to generate items relating to these dimensions. Study two was a cross-sectional postal study involving patients with different types of long-term conditions recruited from general practices. The survey was conducted to test and validate our new measure of empowerment. Factor analysis and regression were performed to test scale structure, internal consistency and construct validity. Results Sixteen predominately elderly patients with different types of long-term conditions described empowerment in terms of 5 dimensions (identity, knowledge and understanding, personal control, personal decision-making, and enabling other patients). One hundred and ninety seven survey responses were received from mainly older white females, with relatively low levels of formal education, with the majority retired from paid work. Almost half of the sample reported cardiovascular, joint or diabetes long-term conditions. Factor analysis identified a three factor solution (positive attitude and sense of control, knowledge and confidence in decision making and enabling others), although the structure lacked clarity. A total empowerment score across all items showed acceptable levels of internal consistency and relationships with other measures were generally supportive of its construct validity. Conclusion Initial analyses suggest that the new empowerment measure meets basic psychometric criteria. Reasons concerning the failure to confirm the hypothesized factor structure are discussed alongside further developments of the scale. PMID:23835131
Complex Network Structure Influences Processing in Long-Term and Short-Term Memory
ERIC Educational Resources Information Center
Vitevitch, Michael S.; Chan, Kit Ying; Roodenrys, Steven
2012-01-01
Complex networks describe how entities in systems interact; the structure of such networks is argued to influence processing. One measure of network structure, clustering coefficient, C, measures the extent to which neighbors of a node are also neighbors of each other. Previous psycholinguistic experiments found that the C of phonological…
Development and Testing of an Inflatable, Rigidizable Space Structure Experiment
2006-03-01
successful, including physical dimension, weight, and cost. Inflatable structures have the potential to achieve greater efficiency in all of these...potential for low cost, high mechanical packaging efficiency, deployment reliability and low weight (13). The term inflatable structure indicates that a...back-up inflation gas a necessity for long-term success. This addition can be very costly in terms of volume, weight, and expense due to added or
A Comparison of Mental Health Care Systems in Northern and Southern Europe: A Service Mapping Study.
Sadeniemi, Minna; Almeda, Nerea; Salinas-Pérez, Jose A; Gutiérrez-Colosía, Mencía R; García-Alonso, Carlos; Ala-Nikkola, Taina; Joffe, Grigori; Pirkola, Sami; Wahlbeck, Kristian; Cid, Jordi; Salvador-Carulla, Luis
2018-05-31
Mental health services (MHS) have gone through vast changes during the last decades, shifting from hospital to community-based care. Developing the optimal balance and use of resources requires standard comparisons of mental health care systems across countries. This study aimed to compare the structure, personnel resource allocation, and the productivity of the MHS in two benchmark health districts in a Nordic welfare state and a southern European, family-centered country. The study is part of the REFINEMENT (Research on Financing Systems' Effect on the Quality of Mental Health Care) project. The study areas were the Helsinki and Uusimaa region in Finland and the Girona region in Spain. The MHS were mapped by using the DESDE-LTC (Description and Evaluation of Services and Directories for Long Term Care) tool. There were 6.7 times more personnel resources in the MHS in Helsinki and Uusimaa than in Girona. The resource allocation was more residential-service-oriented in Helsinki and Uusimaa. The difference in mental health personnel resources is not explained by the respective differences in the need for MHS among the population. It is important to make a standard comparison of the MHS for supporting policymaking and to ensure equal access to care across European countries.
Time-frequency featured co-movement between the stock and prices of crude oil and gold
NASA Astrophysics Data System (ADS)
Huang, Shupei; An, Haizhong; Gao, Xiangyun; Huang, Xuan
2016-02-01
The nonlinear relationships among variables caused by hidden frequency information complicate time series analysis. To shed more light on this nonlinear issue, we examine their relationships in the joint time-frequency domain within a multivariate framework, with analyses in the time domain and frequency domain serving as comparisons. The daily Brent oil price, London gold fixing price and Shanghai Composite index from January 1991 to September 2014 are adopted as examples. First, they have a long-term cointegration relationship in the time domain from a holistic perspective. Second, the Granger causality tests in different frequency bands are heterogeneous. Finally, the comparison between results from wavelet coherence and multiple wavelet coherence in the joint time-frequency domain indicates that in the high (1-14 days) and medium (14-128 days) frequency bands, the combination of Brent and gold prices has a stronger correlation with the stock index. In the low frequency band (256-512 days), 2003 marks a structural break point, before which crude oil and gold were an ideal choice for hedging the risk of the stock market. Thus, this paper offers more detail on the relationships between the Chinese stock market and the commodity markets for crude oil and gold, suggesting that decisions for different time horizons and frequencies should consider the corresponding benchmark information.
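A rough way to reproduce the band-by-band view (though not wavelet coherence itself) is to reconstruct each detail level of a discrete wavelet decomposition and correlate the bands; the sketch below assumes the PyWavelets package is available and that the two input series are equally long daily records.

```python
import numpy as np
import pywt

def band_correlations(x, y, wavelet='db4', level=6):
    """Per-frequency-band correlation between two equally long daily series.
    Detail level j of a discrete wavelet decomposition roughly captures
    fluctuations on scales of 2**j to 2**(j+1) days; each band is
    reconstructed on its own and then correlated. A rough stand-in only --
    the study itself uses (multiple) wavelet coherence, not band correlations."""
    def band_series(z, j):
        coeffs = pywt.wavedec(z, wavelet, level=level)
        kept = [c if i == len(coeffs) - j else np.zeros_like(c)
                for i, c in enumerate(coeffs)]          # keep only detail level j
        return pywt.waverec(kept, wavelet)[:len(z)]

    return {f'~{2 ** j}-{2 ** (j + 1)} days': np.corrcoef(band_series(x, j),
                                                          band_series(y, j))[0, 1]
            for j in range(1, level + 1)}
```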
A Benchmark Problem for Development of Autonomous Structural Modal Identification
NASA Technical Reports Server (NTRS)
Pappa, Richard S.; Woodard, Stanley E.; Juang, Jer-Nan
1996-01-01
This paper summarizes modal identification results obtained using an autonomous version of the Eigensystem Realization Algorithm on a dynamically complex, laboratory structure. The benchmark problem uses 48 of 768 free-decay responses measured in a complete modal survey test. The true modal parameters of the structure are well known from two previous, independent investigations. Without user involvement, the autonomous data analysis identified 24 to 33 structural modes with good to excellent accuracy in 62 seconds of CPU time (on a DEC Alpha 4000 computer). The modal identification technique described in the paper is the baseline algorithm for NASA's Autonomous Dynamics Determination (ADD) experiment scheduled to fly on International Space Station assembly flights in 1997-1999.
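The core realization step of ERA (block Hankel matrices, SVD, and eigenvalue extraction) can be sketched as follows for a single free-decay record; this is a minimal textbook-style illustration with invented argument names, not the autonomous ERA variant or the flight software discussed in the paper.

```python
import numpy as np

def era_modes(Y, dt, order):
    """Minimal Eigensystem Realization Algorithm sketch. Y is an
    (n_samples, n_outputs) array of free-decay responses sampled every dt
    seconds (single initial condition); `order` is the retained state-space
    order (roughly twice the number of modes). Returns natural frequencies
    in Hz and damping ratios, each mode appearing as a conjugate pair."""
    n_s, n_out = Y.shape
    p = (n_s - 2) // 2                       # block rows = block columns
    # Block Hankel matrices assembled from the sampled responses.
    H0 = np.vstack([np.hstack([Y[i + j + 1][:, None] for j in range(p)])
                    for i in range(p)])
    H1 = np.vstack([np.hstack([Y[i + j + 2][:, None] for j in range(p)])
                    for i in range(p)])
    U, s, Vt = np.linalg.svd(H0, full_matrices=False)
    U, s, Vt = U[:, :order], s[:order], Vt[:order]
    S_inv_half = np.diag(1.0 / np.sqrt(s))
    A = S_inv_half @ U.T @ H1 @ Vt.T @ S_inv_half    # realized discrete-time state matrix
    lam = np.linalg.eigvals(A).astype(complex)
    s_cont = np.log(lam) / dt                        # continuous-time poles
    freqs = np.abs(s_cont) / (2.0 * np.pi)           # undamped natural frequencies (Hz)
    damping = -np.real(s_cont) / np.abs(s_cont)      # modal damping ratios
    return freqs, damping
```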
Phillip J. Van Mantgem; Nathan L. Stephenson; Eric Knapp; John Barrles; Jon E. Keeley
2011-01-01
The capacity of prescribed fire to restore forest conditions is often judged by changes in forest structure within a few years following burning. However, prescribed fire might have longer-term effects on forest structure, potentially changing treatment assessments. We examined annual changes in forest structure in five 1 ha old-growth plots immediately before...
The introduction of Dreissena to the Great lakes has profoundly impacted benthic ecosystems, resulting in the decline of native species and dramatic community restructuring. In Lake Ontario, long-term monitoring has yielded a wealth of detailed information regarding both the exp...
1990-08-01
evidence for a surprising degree of long-term skill retention. We formulated a theoretical framework, focusing on the importance of procedural reinstatement...considerable forgetting over even relatively short retention intervals. We have been able to place these studies in the same general theoretical framework developed
Recommendations for Benchmarking Web Site Usage among Academic Libraries.
ERIC Educational Resources Information Center
Hightower, Christy; Sih, Julie; Tilghman, Adam
1998-01-01
To help library directors and Web developers create a benchmarking program to compare statistics of academic Web sites, the authors analyzed the Web server log files of 14 university science and engineering libraries. Recommends a centralized voluntary reporting structure coordinated by the Association of Research Libraries (ARL) and a method for…
Academic Achievement and Extracurricular School Activities of At-Risk High School Students
ERIC Educational Resources Information Center
Marchetti, Ryan; Wilson, Randal H.; Dunham, Mardis
2016-01-01
This study compared the employment, extracurricular participation, and family structure status of students from low socioeconomic families that achieved state-approved benchmarks on ACT reading and mathematics tests to those that did not achieve the benchmarks. Free and reduced lunch eligibility was used to determine SES. Participants included 211…
Benchmarking Alumni Relations in Community Colleges: Findings from a 2015 CASE Survey
ERIC Educational Resources Information Center
Paradise, Andrew
2016-01-01
The Benchmarking Alumni Relations in Community Colleges white paper features key data on alumni relations programs at community colleges across the United States. The paper compares results from 2015 and 2012 across such areas as the structure, operations and budget for alumni relations, alumni data collection and management, alumni communications…
Richer, Stuart; Patel, Shana; Sockanathan, Shivani; Ulanski, Lawrence J.; Miller, Luke; Podella, Carla
2014-01-01
Background: Longevinex® (L/RV) is a low dose hormetic over-the-counter (OTC) oral resveratrol (RV) based matrix of red wine solids, vitamin D3 and inositol hexaphosphate (IP6) with established bioavailability, safety, and short-term efficacy against the earliest signs of human atherosclerosis, murine cardiac reperfusion injury, clinical retinal neovascularization, and stem cell survival. We previously reported our short-term findings for dry and wet age-related macular degeneration (AMD) patients. Today we report long term (two to three year) clinical efficacy. Methods: We treated three patients including a patient with an AMD treatment resistant variant (polypoidal retinal vasculature disease). We evaluated two clinical measures of ocular structure (fundus autofluorescent imaging and spectral domain optical coherence extended depth choroidal imaging) and qualitatively appraised changes in macular pigment volume. We further evaluated three clinical measures of visual function (Snellen visual acuity, contrast sensitivity, and glare recovery to a cone photo-stress stimulus). Results: We observed broad bilateral improvements in ocular structure and function over a long time period, opposite to what might be expected due to aging and the natural progression of the patient’s pathophysiology. No side effects were observed. Conclusions: These three cases demonstrate that application of epigenetics has long-term efficacy against AMD retinal disease, when the retinal specialist has exhausted other therapeutic modalities. PMID:25329968
Richer, Stuart; Patel, Shana; Sockanathan, Shivani; Ulanski, Lawrence J; Miller, Luke; Podella, Carla
2014-10-17
Longevinex® (L/RV) is a low dose hormetic over-the-counter (OTC) oral resveratrol (RV) based matrix of red wine solids, vitamin D3 and inositol hexaphosphate (IP6) with established bioavailability, safety, and short-term efficacy against the earliest signs of human atherosclerosis, murine cardiac reperfusion injury, clinical retinal neovascularization, and stem cell survival. We previously reported our short-term findings for dry and wet age-related macular degeneration (AMD) patients. Today we report long term (two to three year) clinical efficacy. We treated three patients including a patient with an AMD treatment resistant variant (polypoidal retinal vasculature disease). We evaluated two clinical measures of ocular structure (fundus autofluorescent imaging and spectral domain optical coherence extended depth choroidal imaging) and qualitatively appraised changes in macular pigment volume. We further evaluated three clinical measures of visual function (Snellen visual acuity, contrast sensitivity, and glare recovery to a cone photo-stress stimulus). We observed broad bilateral improvements in ocular structure and function over a long time period, opposite to what might be expected due to aging and the natural progression of the patient's pathophysiology. No side effects were observed. These three cases demonstrate that application of epigenetics has long-term efficacy against AMD retinal disease, when the retinal specialist has exhausted other therapeutic modalities.
Li, Juan; Suo, Jinping; Zou, Peng; Jia, Lintao; Wang, Shifang
2010-01-01
The data for long-term drug-delivery systems are scarce compared to the short-term systems because the required research efforts are more time-consuming. In this study, we report a novel cross-linked composite based on poly(vinyl alcohol) (PVA) containing cupric ions for long-term delivery, which is helpful for contraception and trace element balance in the human body. The composition, corrosion products, crystal structure, chemical structure and mechanical stability of the composite, after being immersed in simulated body fluid (SBF) for one year, were studied by X-ray fluorescence spectroscopy (XRF), X-ray diffraction (XRD), differential scanning calorimetry (DSC), Fourier-transform infrared spectroscopy (FT-IR) and mechanical testing. The results show that no other new elements, such as P, Cl and Ca, appear on the surface of the composite and no Cu(2)O was formed after immersion in SBF for one year. The effectiveness of copper can be greatly improved and the side-effects caused by these compounds might also be eliminated. Furthermore, this novel composite exhibits long-term mechanical stability in SBF. The present in vitro long-term data suggest that this novel copper-containing composite may serve as a substitute for conventional materials of copper-containing intrauterine devices (Cu-IUDs) and as a carrier for controlled-release material in a variety of other applications.
The Histone Deacetylase HDAC4 Regulates Long-Term Memory in Drosophila
Fitzsimons, Helen L.; Schwartz, Silvia; Given, Fiona M.; Scott, Maxwell J.
2013-01-01
A growing body of research indicates that pharmacological inhibition of histone deacetylases (HDACs) correlates with enhancement of long-term memory, and current research is concentrated on determining the roles that individual HDACs play in cognitive function. Here, we investigate the role of HDAC4 in long-term memory formation in Drosophila. We show that overexpression of HDAC4 in the adult mushroom body, an important structure for memory formation, resulted in a specific impairment in long-term courtship memory, but had no effect on short-term memory. Overexpression of an HDAC4 catalytic mutant also abolished LTM, suggesting a mode of action independent of catalytic activity. We found that overexpression of HDAC4 resulted in a redistribution of the transcription factor MEF2 from a relatively uniform distribution through the nucleus into punctate nuclear bodies, where it colocalized with HDAC4. As MEF2 has also been implicated in regulation of long-term memory, these data suggest that the repressive effects of HDAC4 on long-term memory may be through interaction with MEF2. In the same genetic background, we also found that RNAi-mediated knockdown of HDAC4 impairs long-term memory; therefore, we demonstrate that HDAC4 is not only a repressor of long-term memory, but also modulates normal memory formation. PMID:24349558
Lutz, Jesse J; Duan, Xiaofeng F; Ranasinghe, Duminda S; Jin, Yifan; Margraf, Johannes T; Perera, Ajith; Burggraf, Larry W; Bartlett, Rodney J
2018-05-07
Accurate optical characterization of the closo-Si12C12 molecule is important to guide experimental efforts toward the synthesis of nano-wires, cyclic nano-arrays, and related array structures, which are anticipated to be robust and efficient exciton materials for opto-electronic devices. Working toward calibrated methods for the description of closo-Si12C12 oligomers, various electronic structure approaches are evaluated for their ability to reproduce measured optical transitions of the SiC2, Si2Cn (n = 1-3), and Si3Cn (n = 1, 2) clusters reported earlier by Steglich and Maier [Astrophys. J. 801, 119 (2015)]. Complete-basis-limit equation-of-motion coupled-cluster (EOMCC) results are presented and a comparison is made between perturbative and renormalized non-iterative triples corrections. The effect of adding a renormalized correction for quadruples is also tested. Benchmark test sets derived from both measurement and high-level EOMCC calculations are then used to evaluate the performance of a variety of density functionals within the time-dependent density functional theory (TD-DFT) framework. The best-performing functionals are subsequently applied to predict valence TD-DFT excitation energies for the lowest-energy isomers of SinC and Sin-1C7-n (n = 4-6). TD-DFT approaches are then applied to the SinCn (n = 4-12) clusters and unique spectroscopic signatures of closo-Si12C12 are discussed. Finally, various long-range corrected density functionals, including those from the CAM-QTP family, are applied to a charge-transfer excitation in a cyclic (Si4C4)4 oligomer. Approaches for gauging the extent of charge-transfer character are also tested and EOMCC results are used to benchmark functionals and make recommendations.
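A note on how such functional benchmarking is typically summarized: once reference excitation energies are fixed, each functional is scored by simple error statistics over the test set. The sketch below is a generic, hypothetical illustration; the energies, functional names, and set size are invented for the example and are not values from the paper.

```python
import numpy as np

# Hypothetical vertical excitation energies (eV) for a small benchmark set.
reference = np.array([2.85, 3.10, 3.45, 4.02])          # stand-in for CCSD(T)/CBS-quality values
tddft = {"B3LYP":   np.array([2.61, 2.95, 3.20, 3.80]),
         "CAM-QTP": np.array([2.90, 3.15, 3.50, 4.10])}

for functional, values in tddft.items():
    errors = values - reference
    # mean signed error (MSE) and mean absolute error (MAE) over the set
    print(f"{functional:8s}  MSE = {errors.mean():+.3f} eV   MAE = {np.abs(errors).mean():.3f} eV")
```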
Rahman, Sajjad; Salameh, Khalil; Al-Rifai, Hilal; Masoud, Ahmed; Lutfi, Samawal; Salama, Husam; Abdoh, Ghassan; Omar, Fahmi; Bener, Abdulbari
2011-09-01
To analyze and compare the current gestational age-specific neonatal survival rates between Qatar and international benchmarks. An analytical comparative study. Women's Hospital, Hamad Medical Corporation, Doha, Qatar, from 2003-2008. Six years' (2003-2008) gestational age-specific neonatal mortality data was stratified for each completed week of gestation at birth from 24 weeks till term. The data from World Health Statistics by WHO (2010), Vermont Oxford Network (VON, 2007) and National Statistics United Kingdom (2006) were used as international benchmarks for comparative analysis. A total of 82,002 babies were born during the study period. Qatar's neonatal mortality rate (NMR) dropped from 6/1000 in 2003 to 4.3/1000 in 2008 (p < 0.05). The overall and gestational age-specific neonatal mortality rates of Qatar were comparable with international benchmarks. The survival of < 27 weeks and term babies was better in Qatar (p=0.01 and p < 0.001 respectively) as compared to VON. The survival of > 32 weeks babies was better in UK (p=0.01) as compared to Qatar. The relative risk (RR) of death decreased with increasing gestational age (p < 0.0001). Preterm babies (45%) followed by lethal chromosomal and congenital anomalies (26.5%) were the two leading causes of neonatal deaths in Qatar. The current total and gestational age-specific neonatal survival rates in the State of Qatar are comparable with international benchmarks. In Qatar, persistently high rates of low birth weight and lethal chromosomal and congenital anomalies significantly contribute towards neonatal mortality.
Benchmarking initiatives in the water industry.
Parena, R; Smeets, E
2001-01-01
Customer satisfaction and service care push professionals in the water industry every day to improve their performance, lowering costs and raising the level of service provided. Process Benchmarking is generally recognised as a systematic mechanism of comparing one's own utility with other utilities or businesses with the intent of self-improvement by adopting structures or methods used elsewhere. The IWA Task Force on Benchmarking, operating inside the Statistics and Economics Committee, has been committed to developing a generally accepted concept of Process Benchmarking to support water decision-makers in addressing issues of efficiency. In a first step, the Task Force disseminated among the Committee members a questionnaire focused on gathering suggestions about the kind, the degree of evolution, and the main concepts of Benchmarking adopted in the represented countries. A comparison among the guidelines adopted in The Netherlands and Scandinavia has recently challenged the Task Force in drafting a methodology for worldwide process benchmarking in the water industry. The paper provides a framework of the most interesting benchmarking experiences in the water sector and describes in detail both the final results of the survey and the methodology focused on identification of possible improvement areas.
Supercomputer simulations of structure formation in the Universe
NASA Astrophysics Data System (ADS)
Ishiyama, Tomoaki
2017-06-01
We describe the implementation and performance results of our massively parallel MPI/OpenMP hybrid TreePM code for large-scale cosmological N-body simulations. For domain decomposition, a recursive multi-section algorithm is used, and the sizes of the domains are automatically set so that the total calculation time is the same for all processes. We developed a highly tuned gravity kernel for short-range forces and a novel communication algorithm for long-range forces. For a two-trillion-particle benchmark simulation, the average performance on the full system of the K computer (82,944 nodes, 663,552 cores in total) is 5.8 Pflops, which corresponds to 55% of the peak speed.
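The cost-balanced recursive multi-section idea described above can be illustrated with a toy sketch. The code below is a simplified, hypothetical one-dimensional version, not the authors' implementation: it recursively cuts an array of per-particle cost estimates so that each requested domain carries roughly the same total work.

```python
import numpy as np

def multisection(costs, n_domains):
    """Recursively split indices 0..len(costs) into n_domains contiguous slices
    with approximately equal total cost (a toy stand-in for the cost-balanced
    recursive multi-section decomposition described above)."""
    if n_domains == 1:
        return [(0, len(costs))]
    left_domains = n_domains // 2                 # split the domain count at this level
    right_domains = n_domains - left_domains
    cum = np.cumsum(costs)
    target = cum[-1] * left_domains / n_domains   # cost share owed to the left half
    cut = int(np.searchsorted(cum, target))       # index where the left part reaches its share
    left = multisection(costs[:cut], left_domains)
    right = [(a + cut, b + cut) for a, b in multisection(costs[cut:], right_domains)]
    return left + right

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # per-particle cost estimates, e.g. measured force-calculation time
    costs = rng.exponential(size=100_000)
    domains = multisection(costs, 8)
    per_domain = [costs[a:b].sum() for a, b in domains]
    print([round(w, 1) for w in per_domain])      # roughly equal work per process
```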
QUASAR--scoring and ranking of sequence-structure alignments.
Birzele, Fabian; Gewehr, Jan E; Zimmer, Ralf
2005-12-15
Sequence-structure alignments are a common means for protein structure prediction in the fields of fold recognition and homology modeling, and there is a broad variety of programs that provide such alignments based on sequence similarity, secondary structure or contact potentials. Nevertheless, finding the best sequence-structure alignment in a pool of alignments remains a difficult problem. QUASAR (quality of sequence-structure alignments ranking) provides a unifying framework for scoring sequence-structure alignments that aids finding well-performing combinations of well-known and custom-made scoring schemes. Those scoring functions can be benchmarked against widely accepted quality scores like MaxSub, TMScore, Touch and APDB, thus enabling users to test their own alignment scores against 'standard-of-truth' structure-based scores. Furthermore, individual score combinations can be optimized with respect to benchmark sets based on known structural relationships using QUASAR's in-built optimization routines.
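The core idea of combining several alignment scores into one ranking can be sketched generically. The snippet below is a hypothetical illustration rather than QUASAR's actual code: it normalizes per-alignment scores from different schemes, forms a weighted combination, and ranks candidate alignments by the combined score; the score names, weights, and numbers are invented.

```python
import numpy as np

def rank_alignments(scores, weights):
    """scores: dict name -> array of per-alignment scores (higher is better).
    weights: dict name -> weight in the combined score.
    Returns alignment indices ordered from best to worst."""
    names = sorted(scores)
    # z-score normalization so differently scaled schemes can be combined
    z = {n: (scores[n] - scores[n].mean()) / (scores[n].std() + 1e-12) for n in names}
    combined = sum(weights[n] * z[n] for n in names)
    return np.argsort(-combined)

# toy example: three candidate alignments scored by two schemes
scores = {"secondary_structure": np.array([0.7, 0.5, 0.9]),
          "contact_potential":   np.array([-3.1, -2.0, -2.8])}
weights = {"secondary_structure": 1.0, "contact_potential": 0.5}
print(rank_alignments(scores, weights))  # -> [2 0 1]
```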
Modernization of the migrant women in Dhaka, Bangladesh: analysis of some demographic variables.
Huq-hussain, S
1995-04-01
This study examines demographic factors as signs of change among female rural migrants who settled in the slums of Dhaka, Bangladesh, in 1988. Data are obtained from structured and open-ended interviews among 399 migrant women working in selected clusters of Dhaka in 1988. Short-term migrants lived in Dhaka city for 6 months to 5 years. Long-term migrants lived longer than 5 years in Dhaka city. Findings indicate that 81% of all migrant families were nuclear, 18% were joint, and only 1% were extended. A greater proportion of recent migrants had nuclear families (89%) compared to long-term migrants (78%). 90% of recent migrants and 81% of long-term migrants favored nuclear families. Migrant women indicated that small nuclear families were preferred due to economic hardship and the shortage of housing. Interpersonal quarrels were another reason for their preference for nuclear families. Women who desired joint families found their domestic burdens relieved with more people to share responsibilities. 17% of all migrant households were headed by women (15% among recent migrants and 18% among long-term migrants). Female heads reported that their family structure was due to the death of or divorce from their spouse and the preference for independence from family dependency. 43% of families had 4 members, 36% had 5-7 members, and 21% had over 7 members. 52% of recent migrants and 40% of long-term migrants had small families under 4 members. 39% of long-term migrants and 26% of recent migrants had families with 5-7 members. Recent female migrants were characterized as primarily aged 11-30 years compared to long-term female migrants who were aged mainly under 10 years. A distinctive feature of marriage among migrant women was the larger proportion of women marrying at early ages. 43% were married at ages 11-14 years. 29% of recent migrants and only 11% of long-term migrants supported early age at marriage, but over 50% supported a marriage age of 18-20 years. It is argued that the long-term migrants' support for lower fertility and modern values was an adjustment to urban life.
Page, Robert L; Ghushchyan, Vahram; Gifford, Brian; Read, Richard Allen; Raut, Monika; Bookhart, Brahim K; Naim, Ahmad B; Damaraju, C V; Nair, Kavita V
2014-09-01
To determine productivity loss and indirect costs associated with deep vein thrombosis (DVT) and pulmonary embolism (PE). Medical and pharmacy claims with short-term disability (STD) and long-term disability (LTD) claims from 2007 to 2010 were analyzed from the Integrated Benefits Institute's Health and Productivity Benchmarking (IBI-HPB) database (STD and LTD claims) and IMS LifeLink™ data (medical and pharmacy claims), which were indirectly linked using a weighting approach matched to the IBI-HPB patients' demographic distribution. A total of 5442 DVT and 6199 PE claims were identified. Employees with DVT lost 57 STD and 440 LTD days per disability incident. The average per-claim productivity loss from STD and LTD was $7,414 and $58,181, respectively. Employees with PE lost 56 STD and 364 LTD days per disability incident. The average per-claim productivity loss from STD and LTD was $7,605 and $48,751, respectively. Deep vein thrombosis and PE impose substantial economic burdens.
Uzarski, Diane; Burke, James; Turner, Barbara; Vroom, James; Short, Nancy
2015-10-01
Researcher-initiated biobanks based at academic institutions contribute valuable biomarker and translational research advances to medicine. With many legacy banks once supported by federal funding, reductions in fiscal support threaten the future of existing and new biobanks. When the Brain Bank at Duke University's Bryan Alzheimer's Disease Center (ADRC) faced a funding crisis, a collaborative, multidisciplinary team embarked on a 2-year biobank sustainability project utilizing a comprehensive business strategy, dedicated project management, and a systems approach involving many Duke University entities. By synthesizing and applying existing knowledge, Duke Translational Medicine Institute created and launched a business model that can be adjusted and applied to legacy and start-up academic biobanks. This model provides a path to identify new funding mechanisms, while also emphasizing improved communication, business development, and a focus on collaborating with industry to improve access to biospecimens. Benchmarks for short-term Brain Bank stabilization have been successfully attained, and the evaluation of long-term sustainability metrics is ongoing. © 2015 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Feng, Juan; Li, Jianping; Zhu, Jianlei; Li, Yang; Li, Fei
2018-02-01
The response of the Hadley circulation (HC) to the sea surface temperature (SST) is determined by the meridional structure of SST and varies according to the changing nature of this meridional structure. The models from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are evaluated for their capability to represent the contrasting responses of the HC to different meridional SST structures. To evaluate the responses, the variations of HC and SST were linearly decomposed into two components: the equatorially asymmetric (HEA for HC, and SEA for SST) and equatorially symmetric (HES for HC, and SES for SST) components. The result shows that the climatological features of HC and tropical SST (including the spatial structures and amplitude) are reasonably simulated in all the models. However, the contrasting response of HC to different SST meridional structures shows uncertainties among models. This may be due to the fact that the long-term temporal variabilities of HEA, HES, and SEA are only partially reproduced in the models, although the spatial structures of their long-term variabilities are relatively reasonably simulated. These results indicate that the performance of the CMIP5 models in simulating the long-term temporal variability of different meridional SST structures and the related HC variations plays a fundamental role in the successful reproduction of the response of HC to different meridional SST structures.
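The symmetric/asymmetric split mentioned above (SES/SEA for SST, HES/HEA for the HC) amounts to mirroring a zonal-mean field about the equator: sym(lat) = [f(lat) + f(-lat)]/2 and asym(lat) = [f(lat) - f(-lat)]/2. The sketch below is a minimal illustration of that decomposition under the assumption of a latitude grid symmetric about the equator; the toy SST profile and variable names are assumptions for the example, not data from the paper.

```python
import numpy as np

def equatorial_decomposition(field, lats):
    """Split a zonal-mean field f(lat) into equatorially symmetric and
    asymmetric parts: sym(lat) = [f(lat) + f(-lat)] / 2,
                      asym(lat) = [f(lat) - f(-lat)] / 2."""
    field = np.asarray(field, dtype=float)
    lats = np.asarray(lats, dtype=float)
    mirrored = np.interp(-lats, lats, field)   # field value at the mirrored latitude -lat
    sym = 0.5 * (field + mirrored)
    asym = 0.5 * (field - mirrored)
    return sym, asym

lats = np.linspace(-90, 90, 181)
sst = 27.0 - 0.004 * lats**2 + 0.02 * lats     # toy meridional SST profile (deg C)
ses, sea = equatorial_decomposition(sst, lats) # symmetric and asymmetric components
```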
Terms, Trends, and Insights: PV Project Finance in the United States, 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feldman, David J; Schwabe, Paul D
This brief is a compilation of data points and market insights that reflect the state of the project finance market for solar photovoltaic (PV) assets in the United States as of the third quarter of 2017. This information can generally be used as a simplified benchmark of the costs associated with securing financing for solar PV as well as the cost of the financing itself (i.e., the cost of capital). This work represents the second DOE-sponsored effort to benchmark financing costs across the residential, commercial, and utility-scale PV markets, as part of its larger effort to benchmark the components of PV system costs.
Nishino, Tatsuya
2017-12-01
As the Asian country with the most aged population, Japan has been modifying its social welfare system. In 2000, the Japanese social care vision turned towards meeting the elderly's care needs in their own homes with proper formal care services. This study aims to understand the quantitative properties of the macro supply and demand structure for facilities for the elderly who require support or long-term care throughout Japan and present them as index values. Additionally, this study compares the targets for establishing long-term care facilities set by Japan's Ministry of Health, Labor and Welfare for 2025. In 2014, approximately 90% of all the people who were certified as requiring support and long-term care and those receiving preventive long-term care or long-term care services were 75 years or older. The target increases in the number of established facilities by 2025 (for the 75-years-or-older population) were calculated to be 3.3% for nursing homes; 2.71% for long-term-care health facilities; 1.7% for group living facilities; and 1.84% for community-based multi-care facilities. It was revealed that the establishment targets for 2025 also increase over current projections with the expected increase of the absolute number of users of group living facilities and community-based multi-care facilities. On the other hand, the establishment target for nursing homes remains almost the same as the current projection, whereas that for long-term-care health facilities decreases. These changes of facility ratios reveal that the Japanese social care system is shifting to realize 'Ageing in Place'. When considering households' tendencies, the target ratios for established facilities are expected to be applicable to other countries in Asia.
Nishino, Tatsuya
2017-01-01
As the Asian country with the most aged population, Japan has been modifying its social welfare system. In 2000, the Japanese social care vision turned towards meeting the elderly’s care needs in their own homes with proper formal care services. This study aims to understand the quantitative properties of the macro supply and demand structure for facilities for the elderly who require support or long-term care throughout Japan and present them as index values. Additionally, this study compares the targets for establishing long-term care facilities set by Japan’s Ministry of Health, Labor and Welfare for 2025. In 2014, approximately 90% of all the people who were certified as requiring support and long-term care and those receiving preventive long-term care or long-term care services were 75 years or older. The target increases in the number of established facilities by 2025 (for the 75-years-or-older population) were calculated to be 3.3% for nursing homes; 2.71% for long-term-care health facilities; 1.7% for group living facilities; and 1.84% for community-based multi-care facilities. It was revealed that the establishment targets for 2025 also increase over current projections with the expected increase of the absolute number of users of group living facilities and community-based multi-care facilities. On the other hand, the establishment target for nursing homes remains almost the same as the current projection, whereas that for long-term-care health facilities decreases. These changes of facility ratios reveal that the Japanese social care system is shifting to realize ‘Ageing in Place’. When considering households’ tendencies, the target ratios for established facilities are expected to be applicable to other countries in Asia. PMID:29194405
Dee, C R; Rankin, J A; Burns, C A
1998-07-01
Journal usage studies, which are useful for budget management and for evaluating collection performance relative to library use, have generally described a single library or subject discipline. The Southern Chapter/Medical Library Association (SC/MLA) study has examined journal usage at the aggregate data level with the long-term goal of developing hospital library benchmarks for journal use. Thirty-six SC/MLA hospital libraries, categorized for the study by size as small, medium, or large, reported current journal title use centrally for a one-year period following standardized data collection procedures. Institutional and aggregate data were analyzed for the average annual frequency of use, average costs per use and non-use, and average percent of non-used titles. Permutation F-type tests were used to measure difference among the three hospital groups. Averages were reported for each data set analysis. Statistical tests indicated no significant differences between the hospital groups, suggesting that benchmarks can be derived applying to all types of hospital libraries. The unanticipated lack of commonality among heavily used titles pointed to a need for uniquely tailored collections. Although the small sample size precluded definitive results, the study's findings constituted a baseline of data that can be compared against future studies.
Dee, C R; Rankin, J A; Burns, C A
1998-01-01
BACKGROUND: Journal usage studies, which are useful for budget management and for evaluating collection performance relative to library use, have generally described a single library or subject discipline. The Southern Chapter/Medical Library Association (SC/MLA) study has examined journal usage at the aggregate data level with the long-term goal of developing hospital library benchmarks for journal use. METHODS: Thirty-six SC/MLA hospital libraries, categorized for the study by size as small, medium, or large, reported current journal title use centrally for a one-year period following standardized data collection procedures. Institutional and aggregate data were analyzed for the average annual frequency of use, average costs per use and non-use, and average percent of non-used titles. Permutation F-type tests were used to measure difference among the three hospital groups. RESULTS: Averages were reported for each data set analysis. Statistical tests indicated no significant differences between the hospital groups, suggesting that benchmarks can be derived applying to all types of hospital libraries. The unanticipated lack of commonality among heavily used titles pointed to a need for uniquely tailored collections. CONCLUSION: Although the small sample size precluded definitive results, the study's findings constituted a baseline of data that can be compared against future studies. PMID:9681164
Oliveira, Augusto F; Philipsen, Pier; Heine, Thomas
2015-11-10
In the first part of this series, we presented a parametrization strategy to obtain high-quality electronic band structures on the basis of density-functional-based tight-binding (DFTB) calculations and published a parameter set called QUASINANO2013.1. Here, we extend our parametrization effort to include the remaining terms that are needed to compute the total energy and its gradient, commonly referred to as repulsive potential. Instead of parametrizing these terms as a two-body potential, we calculate them explicitly from the DFTB analogues of the Kohn-Sham total energy expression. This strategy requires only two further numerical parameters per element. Thus, the atomic configuration and four real numbers per element are sufficient to define the DFTB model at this level of parametrization. The QUASINANO2015 parameter set allows the calculation of energy, structure, and electronic structure of all systems composed of elements ranging from H to Ca. Extensive benchmarks show that the overall accuracy of QUASINANO2015 is comparable to that of well-established methods, including PM7 and hand-tuned DFTB parameter sets, while coverage of a much larger range of chemical systems is available.
Ab initio calculations, structure, NBO and NCI analyses of X-H⋯π interactions
NASA Astrophysics Data System (ADS)
Wu, Qiyang; Su, He; Wang, Hongyan; Wang, Hui
2018-02-01
The performance of ab initio methods (MP2, DFT/B3LYP, random-phase approximation (RPA), CCSD(T) and QCISD(T)) in predicting the interaction energy of X-H⋯π (X-H = HCCH, HCl, HF; π = C2H2, C2H4, C6H6) hydrogen complexes is assessed systematically. The CCSD(T)/CBS benchmarks of interaction energy are reported. It is found that RPA agrees well with the CCSD(T)/CBS benchmarks and experimental results. CCSD(T) and QCISD(T) perform the best only when compared with the CCSD(T)/CBS benchmarks, while MP2 performs well only for experimental data. B3LYP provides the worst accuracy. Additionally, the equilibrium structures and interaction types of X-H⋯π hydrogen complexes are investigated using natural bond orbital (NBO) and non-covalent interaction index (NCI) analyses.
Antibody-protein interactions: benchmark datasets and prediction tools evaluation
Ponomarenko, Julia V; Bourne, Philip E
2007-01-01
Background The ability to predict antibody binding sites (aka antigenic determinants or B-cell epitopes) for a given protein is a precursor to new vaccine design and diagnostics. Among the various methods of B-cell epitope identification, X-ray crystallography is one of the most reliable. Computational methods for B-cell epitope prediction have been developed using these experimental data. As the number of structures of antibody-protein complexes grows, further interest in prediction methods using 3D structure is anticipated. This work aims to establish a benchmark for 3D structure-based epitope prediction methods. Results Two B-cell epitope benchmark datasets inferred from the 3D structures of antibody-protein complexes were defined. The first is a dataset of 62 representative 3D structures of protein antigens with inferred structural epitopes. The second is a dataset of 82 structures of antibody-protein complexes containing different structural epitopes. Using these datasets, eight web-servers developed for antibody and protein binding site prediction have been evaluated. No method exceeded 40% precision and 46% recall. The values of the area under the receiver operating characteristic curve for the evaluated methods were about 0.6 for the ConSurf, DiscoTope, and PPI-PRED methods and above 0.65 but not exceeding 0.70 for protein-protein docking methods when the best of the top ten models for the bound docking were considered; the remaining methods performed close to random. The benchmark datasets are included as a supplement to this paper. Conclusion It may be possible to improve epitope prediction methods through training on datasets which include only immune epitopes and through utilizing more features characterizing epitopes, for example, the evolutionary conservation score. Notwithstanding, overall poor performance may reflect the generality of antigenicity and hence the inability to decipher B-cell epitopes as an intrinsic feature of the protein. It is an open question as to whether ultimately discriminatory features can be found. PMID:17910770
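The quoted performance figures rest on standard per-residue classification metrics. The sketch below is a generic, hypothetical illustration of how precision, recall at a score cutoff, and the area under the ROC curve could be computed for one antigen; the labels, scores, and threshold are invented for the example.

```python
import numpy as np

def precision_recall(y_true, y_score, threshold):
    """Per-residue precision and recall at a score cutoff.
    y_true: 1 for epitope residues, 0 otherwise; y_score: prediction scores."""
    pred = y_score >= threshold
    tp = np.sum(pred & (y_true == 1))
    fp = np.sum(pred & (y_true == 0))
    fn = np.sum(~pred & (y_true == 1))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def roc_auc(y_true, y_score):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation."""
    order = np.argsort(y_score)
    ranks = np.empty(len(y_score))
    ranks[order] = np.arange(1, len(y_score) + 1)
    pos = y_true == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

y_true = np.array([1, 0, 0, 1, 0, 1, 0, 0])                  # toy epitope labels
y_score = np.array([0.9, 0.3, 0.6, 0.7, 0.2, 0.4, 0.8, 0.1])  # toy prediction scores
print(precision_recall(y_true, y_score, 0.5), roc_auc(y_true, y_score))
```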
The Long-Term Conditions Questionnaire: conceptual framework and item development
Peters, Michele; Potter, Caroline M; Kelly, Laura; Hunter, Cheryl; Gibbons, Elizabeth; Jenkinson, Crispin; Coulter, Angela; Forder, Julien; Towers, Ann-Marie; A’Court, Christine; Fitzpatrick, Ray
2016-01-01
Purpose To identify the main issues of importance when living with long-term conditions to refine a conceptual framework for informing the item development of a patient-reported outcome measure for long-term conditions. Materials and methods Semi-structured qualitative interviews (n=48) were conducted with people living with at least one long-term condition. Participants were recruited through primary care. The interviews were transcribed verbatim and analyzed by thematic analysis. The analysis served to refine the conceptual framework, based on reviews of the literature and stakeholder consultations, for developing candidate items for a new measure for long-term conditions. Results Three main organizing concepts were identified: impact of long-term conditions, experience of services and support, and self-care. The findings helped to refine a conceptual framework, leading to the development of 23 items that represent issues of importance in long-term conditions. The 23 candidate items formed the first draft of the measure, currently named the Long-Term Conditions Questionnaire. Conclusion The aim of this study was to refine the conceptual framework and develop items for a patient-reported outcome measure for long-term conditions, including single and multiple morbidities and physical and mental health conditions. Qualitative interviews identified the key themes for assessing outcomes in long-term conditions, and these underpinned the development of the initial draft of the measure. These initial items will undergo cognitive testing to refine the items prior to further validation in a survey. PMID:27621678
Haettig, Jakob; Stefanko, Daniel P.; Multani, Monica L.; Figueroa, Dario X.; McQuown, Susan C.; Wood, Marcelo A.
2011-01-01
Transcription of genes required for long-term memory not only involves transcription factors, but also enzymatic protein complexes that modify chromatin structure. Chromatin-modifying enzymes, such as the histone acetyltransferase (HAT) CREB (cyclic-AMP response element binding) binding protein (CBP), are pivotal for the transcriptional regulation required for long-term memory. Several studies have shown that CBP and histone acetylation are necessary for hippocampus-dependent long-term memory and hippocampal long-term potentiation (LTP). Importantly, every genetically modified Cbp mutant mouse exhibits long-term memory impairments in object recognition. However, the role of the hippocampus in object recognition is controversial. To better understand how chromatin-modifying enzymes modulate long-term memory for object recognition, we first examined the role of the hippocampus in retrieval of long-term memory for object recognition or object location. Muscimol inactivation of the dorsal hippocampus prior to retrieval had no effect on long-term memory for object recognition, but completely blocked long-term memory for object location. This was consistent with experiments showing that muscimol inactivation of the hippocampus had no effect on long-term memory for the object itself, supporting the idea that the hippocampus encodes spatial information about an object (such as location or context), whereas cortical areas (such as the perirhinal or insular cortex) encode information about the object itself. Using location-dependent object recognition tasks that engage the hippocampus, we demonstrate that CBP is essential for the modulation of long-term memory via HDAC inhibition. Together, these results indicate that HDAC inhibition modulates memory in the hippocampus via CBP and that different brain regions utilize different chromatin-modifying enzymes to regulate learning and memory. PMID:21224411
Test Cases for Modeling and Validation of Structures with Piezoelectric Actuators
NASA Technical Reports Server (NTRS)
Reaves, Mercedes C.; Horta, Lucas G.
2001-01-01
A set of benchmark test articles were developed to validate techniques for modeling structures containing piezoelectric actuators using commercially available finite element analysis packages. The paper presents the development, modeling, and testing of two structures: an aluminum plate with surface mounted patch actuators and a composite box beam with surface mounted actuators. Three approaches for modeling structures containing piezoelectric actuators using the commercially available packages: MSC/NASTRAN and ANSYS are presented. The approaches, applications, and limitations are discussed. Data for both test articles are compared in terms of frequency response functions from deflection and strain data to input voltage to the actuator. Frequency response function results using the three different analysis approaches provided comparable test/analysis results. It is shown that global versus local behavior of the analytical model and test article must be considered when comparing different approaches. Also, improper bonding of actuators greatly reduces the electrical to mechanical effectiveness of the actuators producing anti-resonance errors.
An Object-Oriented Serial DSMC Simulation Package
NASA Astrophysics Data System (ADS)
Liu, Hongli; Cai, Chunpei
2011-05-01
A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. This package utilizes the concept of a simulation engine, many C++ features, and software design patterns. The package has an open architecture which can benefit further development and maintenance of the code. In order to reduce the engineering time for three-dimensional models, a hybrid grid scheme, combined with a flexible data structure implemented in C++, is used in this package. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. This data structure allows the DSMC algorithm to be very efficiently parallelized with domain decomposition, and it provides much flexibility in terms of grid types. This package can utilize traditional structured, unstructured or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that this package has satisfactory accuracy for complex rarefied gas flows.
Kiranyaz, Serkan; Ince, Turker; Pulkkinen, Jenni; Gabbouj, Moncef
2010-01-01
In this paper, we address dynamic clustering in high dimensional data or feature spaces as an optimization problem where multi-dimensional particle swarm optimization (MD PSO) is used to find out the true number of clusters, while fractional global best formation (FGBF) is applied to avoid local optima. Based on these techniques we then present a novel and personalized long-term ECG classification system, which addresses the problem of labeling the beats within a long-term ECG signal, known as Holter register, recorded from an individual patient. Due to the massive amount of ECG beats in a Holter register, visual inspection is quite difficult and cumbersome, if not impossible. Therefore the proposed system helps professionals to quickly and accurately diagnose any latent heart disease by examining only the representative beats (the so called master key-beats) each of which is representing a cluster of homogeneous (similar) beats. We tested the system on a benchmark database where the beats of each Holter register have been manually labeled by cardiologists. The selection of the right master key-beats is the key factor for achieving a highly accurate classification and the proposed systematic approach produced results that were consistent with the manual labels with 99.5% average accuracy, which basically shows the efficiency of the system.
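To make the role of the master key-beats concrete, the sketch below illustrates only the final labeling step: every beat inherits the expert's label of its nearest representative beat. Finding the master key-beats themselves is the dynamic clustering problem the paper solves with MD PSO and FGBF; the feature dimensions, label set, and data here are invented for illustration.

```python
import numpy as np

def label_by_master_beats(beats, master_beats, master_labels):
    """Assign to every beat the expert's label of its nearest master key-beat
    (Euclidean distance in feature space). This is only the labeling step;
    the master key-beats are assumed to come from the clustering stage."""
    # pairwise distances, shape (n_beats, n_masters)
    d = np.linalg.norm(beats[:, None, :] - master_beats[None, :, :], axis=2)
    return master_labels[np.argmin(d, axis=1)]

rng = np.random.default_rng(1)
beats = rng.normal(size=(1000, 16))                   # toy feature vectors for Holter beats
master_beats = rng.normal(size=(5, 16))               # one representative beat per cluster
master_labels = np.array(["N", "N", "V", "S", "N"])   # labels given by the cardiologist
print(label_by_master_beats(beats, master_beats, master_labels)[:10])
```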
Arroll, Bruce; Bennett, Merran; Dalbeth, Nicola; Hettiarachchi, Dilanka; Ben, Cribben; Shelling, Ginnie
2009-12-01
To establish a benchmark for gout control using the proportion of patients with serum uric acid (SUA) < 0.36 mmol/L, assess patients' understanding of their preventive medication and trial a mail and phone intervention to improve gout control. Patients clinically diagnosed with gout, and their baseline SUAs, were identified in two South Auckland practices. A mail and phone intervention aimed at improving the control of gout was introduced. Intervention #1 took place in one practice over three months. Intervention #2 occurred in the other practice four to 16 months following baseline. There was no significant change in SUA from intervention #1 after three months. The second intervention by mail and phone resulted in improvement in SUA levels, with a greater proportion of patients with SUA < 0.36 mmol/L, and the difference in means was statistically significant (p = 0.039, two-tailed paired t-test). The benchmark for usual care was established at 38-43% of patients with SUA < 0.36 mmol/L. It was possible to increase this from 38% to 50%. Issues relating to gout that were identified included a lack of understanding of the need for long-term allopurinol, and diagnosis and management for patients for whom English is not their first language. 1. Community workers who speak Pacific languages may assist GPs in communicating with non-English speaking patients. 2. Alternative diagnoses should be considered in symptomatic patients with prolonged normouricaemia. 3. GPs should gradually introduce allopurinol after acute gout attacks, emphasising the importance of prophylaxis. 4. A campaign to inform patients about the benefits of allopurinol should be considered. 5. A simple one-keystroke audit is needed for gout audit and benchmarking. 6. GP guidelines for gout diagnosis and management should be available.
Indicators of AEI applied to the Delaware Estuary.
Barnthouse, Lawrence W; Heimbuch, Douglas G; Anthony, Vaughn C; Hilborn, Ray W; Myers, Ransom A
2002-05-18
We evaluated the impacts of entrainment and impingement at the Salem Generating Station on fish populations and communities in the Delaware Estuary. In the absence of an agreed-upon regulatory definition of "adverse environmental impact" (AEI), we developed three independent benchmarks of AEI based on observed or predicted changes that could threaten the sustainability of a population or the integrity of a community. Our benchmarks of AEI included: (1) disruption of the balanced indigenous community of fish in the vicinity of Salem (the "BIC" analysis); (2) a continued downward trend in the abundance of one or more susceptible fish species (the "Trends" analysis); and (3) occurrence of entrainment/impingement mortality sufficient, in combination with fishing mortality, to jeopardize the future sustainability of one or more populations (the "Stock Jeopardy" analysis). The BIC analysis utilized nearly 30 years of species presence/absence data collected in the immediate vicinity of Salem. The Trends analysis examined three independent data sets that document trends in the abundance of juvenile fish throughout the estuary over the past 20 years. The Stock Jeopardy analysis used two different assessment models to quantify potential long-term impacts of entrainment and impingement on susceptible fish populations. For one of these models, the compensatory capacities of the modeled species were quantified through meta-analysis of spawner-recruit data available for several hundred fish stocks. All three analyses indicated that the fish populations and communities of the Delaware Estuary are healthy and show no evidence of an adverse impact due to Salem. Although the specific models and analyses used at Salem are not applicable to every facility, we believe that a weight of evidence approach that evaluates multiple benchmarks of AEI using both retrospective and predictive methods is the best approach for assessing entrainment and impingement impacts at existing facilities.
Liu, Haiyang; Li, Jia; Zhao, Yan; Xie, Kexin; Tang, Xianjin; Wang, Shaoxian; Li, Zhongpei; Liao, Yulin; Xu, Jianming; Di, Hongjie; Li, Yong
2018-08-15
Nitrification plays an important role in the soil nitrogen (N) cycle, and fertilizer application may influence soil nitrifiers' abundance and composition. However, the effect of long-term manure application in paddy soils on nitrifying populations is poorly understood. We chose four long-term manure experimental fields in the south of China to study how the abundance and community structure of nitrifiers would change in response to long-term manure application using quantitative PCR and Miseq sequencing analyses. Our results showed that manure application significantly increased ammonia oxidizing archaea (AOA) abundance at the ChangSha (CS) and NanChang (NC) sites, while the abundance of ammonia oxidizing bacteria (AOB) represented 4.8- and 12.8-fold increases at the JiaXing (JX) and YingTan (YT) sites, respectively. Miseq sequencing of 16S rRNA genes indicated that manure application altered the community structure of nitrifying populations, especially at the NC and YT sites. The application of manure significantly changed AOA and nitrite oxidizing bacteria (NOB) community structures but not those of AOB, suggesting that AOA and NOB may be more sensitive to manures. Variation partitioning analysis (VPA) and redundancy analysis (RDA) indicated that soil pH, TN, NO3⁻-N and water content were the main factors in shaping nitrifying communities. These findings suggest that nitrifiers respond diversely to manure application, and soil physiochemical properties play an important role in determining nitrifiers' abundance and communities with long-term manure addition. Copyright © 2018 Elsevier B.V. All rights reserved.
Lean, Lyn Li; Hong, Ryan Yee Shiun; Ti, Lian Kah
2017-08-01
Communication of feedback during teaching of practical procedures is a fine balance of structure and timing. We investigate whether continuous in-task (IT) or end-task (ET) feedback is more effective in teaching spinal anaesthesia to medical students. End-task feedback was hypothesized to improve both short-term and long-term procedural learning retention, as experiential learning promotes active learning after encountering errors during practice. Upon exposure to a 5-min instructional video, students randomized to IT or ET feedback were trained using a spinal simulator mannequin. A blinded expert tested the students using a spinal anaesthesia checklist in the short term (immediately) and the long term (average 4 months). Sixty-five students completed the training and testing. There were no differences in demographics of age or gender within the IT or ET distributions. For both short-term and long-term learning retention of spinal anaesthesia, ET feedback proved to be better (P < 0.01) than IT feedback. The time taken for ET students was shorter at long-term testing. End-task feedback improves both short-term and long-term procedural learning retention.
Elevation trends and shrink-swell response of wetland soils to flooding and drying
Cahoon, Donald R.; Perez, Brian C.; Segura, Bradley D.; Lynch, James C.
2011-01-01
Given the potential for a projected acceleration in sea-level rise to impact wetland sustainability over the next century, a better understanding is needed of climate-related drivers that influence the processes controlling wetland elevation. Changes in local hydrology and groundwater conditions can cause short-term perturbations to marsh elevation trends through shrink-swell of marsh soils. To better understand the magnitude of these perturbations and their impacts on marsh elevation trends, we measured vertical accretion and elevation dynamics in microtidal marshes in Texas and Louisiana during and after the extreme drought conditions that existed there from 1998 to 2000. In a Louisiana marsh, elevation was controlled by subsurface hydrologic fluxes occurring below the root zone but above the 4 m depth (i.e., the base of the surface elevation table benchmark) that were related to regional drought and local meteorological conditions, with marsh elevation tracking water level variations closely. In Texas, a rapid decline in marsh elevation was related to severe drought conditions, which lowered local groundwater levels. Unfragmented marshes experienced smaller water level drawdowns and more rapid marsh elevation recovery than fragmented marshes. It appears that extended drawdowns lead to increased substrate consolidation, making it less resilient to respond to future favorable conditions. Overall, changes in water storage lead to rapid and large short-term impacts on marsh elevation that are as much as five times greater than the long-term elevation trend, indicating the importance of long-term, high-resolution elevation data sets to understand the prolonged effects of water deficits on marsh elevation change.
Groussard, Mathilde; La Joie, Renaud; Rauchs, Géraldine; Landeau, Brigitte; Chételat, Gaël; Viader, Fausto; Desgranges, Béatrice; Eustache, Francis; Platel, Hervé
2010-10-05
The development of musical skills by musicians results in specific structural and functional modifications in the brain. Surprisingly, no functional magnetic resonance imaging (fMRI) study has investigated the impact of musical training on brain function during long-term memory retrieval, a faculty particularly important in music. Thus, using fMRI, we examined for the first time this process during a musical familiarity task (i.e., semantic memory for music). Musical expertise induced supplementary activations in the hippocampus, medial frontal gyrus, and superior temporal areas on both sides, suggesting a constant interaction between episodic and semantic memory during this task in musicians. In addition, a voxel-based morphometry (VBM) investigation was performed within these areas and revealed that gray matter density of the hippocampus was higher in musicians than in nonmusicians. Our data indicate that musical expertise critically modifies long-term memory processes and induces structural and functional plasticity in the hippocampus.
Park, Jeongkyu; Yoon, Seokwon; Moon, Sung Seek; Lee, Kyoung Hag; Park, Jueun
2017-01-01
A large and growing population of elderly Koreans with chronic conditions necessitates an increase in long-term care. This study is aimed at investigating the effects of occupational stress, work-centrality, self-efficacy, and job satisfaction on intent to leave among long-term care workers in Korea. We tested the hypothesized structural equation model predicting the intention to quit among long-term care workers in Korea. Survey data were collected from 532 long-term care workers in Seoul, Korea. Results showed that occupational stress was positively associated with intention to leave the job. The study also identified several possible mediators (self-efficacy, work-centrality, job satisfaction) in the relationship between stress and intent to quit. Evidence-based stress management interventions are suggested to help the workers better cope with stressors. Mentoring programs should also be considered for new workers.
ERIC Educational Resources Information Center
Haettig, Jakob; Stefanko, Daniel P.; Multani, Monica L.; Figueroa, Dario X.; McQuown, Susan C.; Wood, Marcelo A.
2011-01-01
Transcription of genes required for long-term memory not only involves transcription factors, but also enzymatic protein complexes that modify chromatin structure. Chromatin-modifying enzymes, such as the histone acetyltransferase (HAT) CREB (cyclic-AMP response element binding) binding protein (CBP), are pivotal for the transcriptional regulation…
USDA-ARS?s Scientific Manuscript database
Soil microorganisms play essential roles in soil organic matter dynamics and nutrient cycling in agroecosystems and have been used as soil quality indicators. The response of soil microbial communities to land management is complex and the long-term impacts of cropping systems on soil microbes is l...
To fully understand the potential long-term ecological impacts a pollutant has on a species, population-level effects must be estimated. Since long-term field experiments are typically not feasible, vital rates such as survival, growth, and reproduction of individual organisms ar...
Necessarily Cumbersome, Messy, and Slow: Community Collaborative Work within Art Institutions
ERIC Educational Resources Information Center
Filipovic, Yaël
2013-01-01
Building relationships and community collaborations--especially on an institutional level--is a slow and long-term process. These types of innovative, experimental, and long-term collaborations with community organizations and groups often lead art institutions to reflect on the value and place of their institutional structures when engaging in…
An Optimization Study of Hot Stamping Operation
NASA Astrophysics Data System (ADS)
Ghoo, Bonyoung; Umezu, Yasuyoshi; Watanabe, Yuko; Ma, Ninshu; Averill, Ron
2010-06-01
In the present study, 3-dimensional finite element analyses of hot-stamping processes for an Audi B-pillar product are conducted using JSTAMP/NV and HEEDS. Special attention is paid to the optimization of simulation technology coupled with thermal-mechanical formulations. Numerical simulation based on FEM technology and design optimization using the hybrid adaptive SHERPA algorithm are applied to the hot stamping operation to improve productivity. The robustness of the SHERPA algorithm is demonstrated by the results of the benchmark example. The SHERPA algorithm is shown to be far superior to the GA (Genetic Algorithm) in terms of efficiency, requiring about 7 times less calculation time than the GA. The SHERPA algorithm could show high performance in a large-scale problem with a complicated design space and long calculation time.
NASA Technical Reports Server (NTRS)
Stefanescu, D. M.; Catalina, A. V.; Juretzko, Frank R.; Sen, Subhayu; Curreri, P. A.
2003-01-01
The objectives of the work on Particle Engulfment and Pushing by Solidifying Interfaces (PEP) include: 1) to obtain fundamental understanding of the physics of particle pushing and engulfment, 2) to develop mathematical models to describe the phenomenon, and 3) to perform critical experiments in the microgravity environment of space to provide benchmark data for model validation. Successful completion of this project will yield vital information relevant to a diverse range of terrestrial applications. With PEP being a long-term research effort, this report will focus on advances in the theoretical treatment of the solid/liquid interface interaction with an approaching particle, experimental validation of some aspects of the developed models, and the experimental design aspects of future experiments to be performed on board the International Space Station.
A Look at the Impact of High-End Computing Technologies on NASA Missions
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Dunbar, Jill; Hardman, John; Bailey, F. Ron; Wheeler, Lorien; Rogers, Stuart
2012-01-01
From its bold start nearly 30 years ago and continuing today, the NASA Advanced Supercomputing (NAS) facility at Ames Research Center has enabled remarkable breakthroughs in the space agency's science and engineering missions. Throughout this time, NAS experts have influenced the state-of-the-art in high-performance computing (HPC) and related technologies such as scientific visualization, system benchmarking, batch scheduling, and grid environments. We highlight the pioneering achievements and innovations originating from and made possible by NAS resources and know-how, from early supercomputing environment design and software development, to long-term simulation and analyses critical to designing safe Space Shuttle operations and associated spinoff technologies, to the highly successful Kepler Mission's discovery of new planets now capturing the world's imagination.
Learning to forget: continual prediction with LSTM.
Gers, F A; Schmidhuber, J; Cummins, F
2000-10-01
Long short-term memory (LSTM; Hochreiter & Schmidhuber, 1997) can solve numerous tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams that are not a priori segmented into subsequences with explicitly marked ends at which the network's internal state could be reset. Without resets, the state may grow indefinitely and eventually cause the network to break down. Our remedy is a novel, adaptive "forget gate" that enables an LSTM cell to learn to reset itself at appropriate times, thus releasing internal resources. We review illustrative benchmark problems on which standard LSTM outperforms other RNN algorithms. All algorithms (including LSTM) fail to solve continual versions of these problems. LSTM with forget gates, however, easily solves them, and in an elegant way.
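For reference, the forget gate enters the cell update as c_t = f_t * c_{t-1} + i_t * g_t, so a gate value near zero effectively resets the state. The sketch below is a minimal, textbook-style single LSTM step with a forget gate; the dimensions and random weights are illustrative, and it is not the authors' original code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step with input (i), forget (f), output (o) gates and a
    candidate cell value (g). W, U, b hold the four gates stacked:
    shapes (4H, D), (4H, H), and (4H,), where H is the hidden size."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])
    f = sigmoid(z[H:2*H])        # forget gate: values near 0 reset the cell state
    o = sigmoid(z[2*H:3*H])
    g = np.tanh(z[3*H:4*H])
    c = f * c_prev + i * g       # old state scaled by the forget gate
    h = o * np.tanh(c)
    return h, c

rng = np.random.default_rng(0)
D, H = 8, 4
W, U, b = rng.normal(size=(4*H, D)), rng.normal(size=(4*H, H)), np.zeros(4*H)
h = c = np.zeros(H)
for x in rng.normal(size=(20, D)):   # run over a short input stream
    h, c = lstm_step(x, h, c, W, U, b)
print(h)
```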
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Chen; Gupta, Vipul; Huang, Shenyan
The goal of this project is to model long-term creep performance for nickel-base superalloy weldments in high temperature power generation systems. The project uses physics-based modeling methodologies and algorithms for predicting alloy properties in heterogeneous material structures. The modeling methodology will be demonstrated on a gas turbine combustor liner weldment of Haynes 282 precipitate-strengthened nickel-base superalloy. The major developments are: (1) microstructure-property relationships under creep conditions and microstructure characterization, (2) modeling the inhomogeneous microstructure in the superalloy weld, (3) modeling mesoscale plastic deformation in the superalloy weld, and (4) a constitutive creep model that accounts for weld and base metal microstructure and their long-term evolution. The developed modeling technology is aimed at providing a more efficient and accurate assessment of a material's long-term performance compared with current testing and extrapolation methods. This modeling technology will also accelerate development and qualification of new materials in advanced power generation systems. This document is a final technical report for the project, covering efforts conducted from October 2014 to December 2016.
Long-term Care Insurance and Carers' Labor Supply - A Structural Model.
Geyer, Johannes; Korfhage, Thorben
2015-09-01
In Germany, individuals in need of long-term care receive support through benefits of the long-term care insurance. A central goal of the insurance is to support informal care provided by family members. Care recipients can choose between benefits in kind (formal home care services) and benefits in cash. From a budgetary perspective, family care is often considered a cost-saving alternative to formal home care and to stationary nursing care. However, the opportunity costs resulting from reduced labor supply of the carer are often overlooked. We focus on the labor supply decision of family carers and the incentives set by the long-term care insurance. We estimate a structural model of labor supply and the choice of benefits of family carers. We find that benefits in kind have small positive effects on labor supply. Labor supply elasticities of cash benefits are larger and negative. If both types of benefits increase, negative labor supply effects are offset to a large extent. However, the average effect is significantly negative. Copyright © 2015 John Wiley & Sons, Ltd.
Benchmark Analysis of Career and Technical Education in Lenawee County. Final Report.
ERIC Educational Resources Information Center
Hollenbeck, Kevin
The career and technical education (CTE) provided in grades K-12 in the county's vocational-technical center and 12 local public school districts of Lenawee County, Michigan, was benchmarked with respect to its attention to career development. Data were collected from the following sources: structured interviews with a number of key respondents…
NASA Astrophysics Data System (ADS)
Rawlins, M. A.; Adam, J. C.; Vorosmarty, C. J.; Serreze, M. C.; Hinzman, L. D.; Holland, M.; Shiklomanov, A.
2007-12-01
It is expected that a warming climate will be attended by an intensification of the global hydrological cycle. While there are signs of positive trends in several hydrological quantities emerging at the global scale, the scope, character, and quantitative significance of these changes are not well established. In particular, long-term increases in river discharge across Arctic Eurasia are assumed to represent such an intensification and have received considerable attention. Yet no change in long-term annual precipitation across the region can be related to the discharge trend. Given linkages and feedbacks between the arctic and global climate systems, a more complete understanding of observed changes across northern high latitudes is needed. We present a working definition of an accelerated or intensified hydrological cycle and a synthesis of long-term (nominally 50 years) trends in observed freshwater stocks and fluxes across the arctic land-atmosphere-ocean system. Trend and significance measures from observed data are described alongside expectations of intensification based on GCM simulations of contemporary and future climate. Our domain of interest includes the terrestrial arctic drainage (including all of Alaska and drainage to Hudson Bay), the Arctic Ocean, and the atmosphere over the land and ocean domains. For the terrestrial Arctic, time series of spatial averages derived from station data and atmospheric reanalysis are available. Reconstructed data sets are used for quantities such as Arctic Ocean ice and liquid freshwater transports. Study goals include a comprehensive survey of past changes in freshwater across the pan-arctic, a set of benchmarks for expected changes based on an ensemble of GCM simulations, and identification of potential mechanistic linkages that may be examined with contemporary remote sensing data sets.
Wound-healing outcomes using standardized assessment and care in clinical practice.
Bolton, Laura; McNees, Patrick; van Rijswijk, Lia; de Leon, Jean; Lyder, Courtney; Kobza, Laura; Edman, Kelly; Scheurich, Anne; Shannon, Ron; Toth, Michelle
2004-01-01
Wound-healing outcomes applying standardized protocols have typically been measured within controlled clinical trials, not natural settings. Standardized protocols of wound care have been validated for clinical use, creating an opportunity to measure the resulting outcomes. Wound-healing outcomes were explored during clinical use of standardized validated protocols of care based on patient and wound assessments. This was a prospective multicenter study of wound-healing outcomes management in real-world clinical practice. Healing outcomes from March 26 to October 31, 2001, were recorded on patients in 3 long-term care facilities, 1 long-term acute care hospital, and 12 home care agencies for wounds selected by staff to receive care based on computer-generated validated wound care algorithms. After diagnosis, wound dimensions and status were assessed using a tool adapted from the Pressure Sore Status Tool for use on all wounds. Wound, ostomy, and continence nursing professionals accessed consistent protocols of care, via telemedicine in home care or paper forms in long-term care. A physician entered assessments into a desktop computer in the wound clinic. Based on evidence that healing proceeds faster with fewer infections in environments without gauze, the protocols generally avoided gauze dressings. Most of the 767 wounds selected to receive the standardized protocols of care were stage III-IV pressure ulcers (n = 373; mean healing time 62 days) or full-thickness venous ulcers (n = 124; mean healing time 57 days). Partial-thickness wounds healed faster than same-etiology full-thickness wounds. These results provide benchmarks for natural-setting healing outcomes and help to define and address wound care challenges. Outcomes primarily using nongauze protocols of care matched or surpassed the best previously published results on similar wounds using gauze-based protocols of care, including protocols applying gauze impregnated with growth factors or other agents.
Vlayen, Annemie; Hellings, Johan; Claes, Neree; Peleman, Hilde; Schrooten, Ward
2012-09-01
To measure patient safety culture in Belgian hospitals and to examine the homogeneous grouping of underlying safety culture dimensions. The Hospital Survey on Patient Safety Culture was distributed organisation-wide in 180 Belgian hospitals participating in the federal program on quality and safety between 2007 and 2009. Participating hospitals were invited to submit their data to a comparative database. Homogeneous groups of underlying safety culture dimensions were sought by hierarchical cluster analysis. 90 acute, 42 psychiatric and 11 long-term care hospitals submitted their data for comparison to other hospitals. The benchmark database included 55 225 completed questionnaires (53.7% response rate). Overall dimensional scores were low, although scores were found to be higher for psychiatric and long-term care hospitals than for acute hospitals. The overall perception of patient safety was lower in French-speaking hospitals. Hierarchical clustering of dimensions resulted in two distinct clusters. Cluster I grouped supervisor/manager expectations and actions promoting safety, organisational learning-continuous improvement, teamwork within units and communication openness, while Cluster II included feedback and communication about error, overall perceptions of patient safety, non-punitive response to error, frequency of events reported, teamwork across units, handoffs and transitions, staffing and management support for patient safety. The nationwide safety culture assessment confirms the need for a long-term national initiative to improve patient safety culture and provides each hospital with a baseline patient safety culture profile to direct an intervention plan. The identification of clusters of safety culture dimensions indicates the need for a different approach and context towards the implementation of interventions aimed at improving the safety culture. Certain clusters require unit level improvements, whereas others demand a hospital-wide policy.
ERIC Educational Resources Information Center
Moilanen, Kristin L.
2007-01-01
This manuscript presents a study in which the factor structure and validity of the Adolescent Self-Regulatory Inventory (ASRI) were examined. The ASRI is a theoretically-based questionnaire that taps two temporal aspects of self-regulation (regulation in the short- and long-term). 169 students in the 6th, 8th, and 10th grades of a small,…
Unified concept of effective one component plasma for hot dense plasmas
Clerouin, Jean; Arnault, Philippe; Ticknor, Christopher; ...
2016-03-17
Orbital-free molecular dynamics simulations are used to benchmark two popular models for hot dense plasmas: the one component plasma (OCP) and the Yukawa model. A unified concept emerges where an effective OCP (EOCP) is constructed from the short-range structure of the plasma. An unambiguous ionization and the screening length can be defined and used for a Yukawa system, which reproduces the long-range structure with finite compressibility. Similarly, the dispersion relation of longitudinal waves is consistent with the screened model at vanishing wave number but merges with the OCP at high wave number. Additionally, the EOCP reproduces the overall relaxation time scales of the correlation functions associated with ionic motion. Lastly, in the hot dense regime, this unified concept of EOCP can be fruitfully applied to deduce properties such as the equation of state, ionic transport coefficients, and the ion feature in x-ray Thomson scattering experiments.
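As a rough illustration of the two pair potentials being compared in this abstract, the sketch below contrasts the bare Coulomb form of the OCP with the exponentially screened Yukawa form in reduced units. The coupling and screening values are hypothetical placeholders, not results from the paper.

```python
import numpy as np

def ocp_potential(r, gamma):
    """Bare Coulomb pair potential in reduced units (r in units of the
    Wigner-Seitz radius a, energy in units of kT): u(r)/kT = Gamma / r."""
    return gamma / r

def yukawa_potential(r, gamma, kappa):
    """Screened (Yukawa) pair potential: u(r)/kT = Gamma * exp(-kappa * r) / r,
    with kappa = a / lambda the inverse screening length in units of 1/a."""
    return gamma * np.exp(-kappa * r) / r

r = np.linspace(0.5, 5.0, 200)        # pair separations in units of a
gamma, kappa = 10.0, 2.0              # illustrative coupling and screening values
u_ocp = ocp_potential(r, gamma)
u_yuk = yukawa_potential(r, gamma, kappa)
# At small r the two curves nearly coincide (short-range structure), while the
# Yukawa form decays much faster at large r, mirroring the EOCP picture above.
```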
Variability at the edge: highly accreting objects in Taurus
NASA Astrophysics Data System (ADS)
Abraham, Peter; Kospal, Agnes; Szabo, Robert
2017-04-01
In Kepler K2, Campaign 13, we will obtain 80-day-long optical light curves of seven highly accreting T Tauri stars in the benchmark Taurus star forming region. Here we propose to monitor our sample simultaneously with Kepler and Spitzer, to be able to separate variability patterns related to different physical processes. By monitoring our targets with Spitzer during the final 11 days of the K2 campaign, we will clean the light curves of non-accretion effects (rotating stellar spots, dips due to passing dust structures) and construct, for the first time, a variability curve that reflects time-dependent accretion only. We will then study and understand how time-dependent mass accretion affects the density and temperature structure of the protoplanetary disk, which sets the initial conditions for planet formation. The proposed work cannot be done without the unparalleled precision of Kepler and Spitzer. This unique and one-time opportunity motivated our DDT proposal.
Adverse event reporting in Czech long-term care facilities.
Hřib, Zdeněk; Vychytil, Pavel; Marx, David
2013-04-01
To describe adverse event reporting processes in long-term care facilities in the Czech Republic. Prospective cohort study involving a written questionnaire followed by in-person structured interviews with selected respondents. Long-term care facilities located in the Czech Republic. Staff of 111 long-term care facilities (87% of long-term care facilities in the Czech Republic). None. Sixty-three percent of long-term health-care facilities in the Czech Republic have adverse event-reporting processes already established, but these were frequently very immature programs, sometimes consisting only of paper recording of incidents. In-person interview responses tended to only partially confirm the results of the written questionnaire. Twenty-one facilities (33%) had at most 1 unconfirmed response, 31 facilities (49%) had 2 or 3 unconfirmed responses, and the remaining 11 facilities (17%) had 4 or more unconfirmed responses. The in-person interviews suggest that use of a written questionnaire to assess the adverse event-reporting process may have limited validity. Staff of the facilities we studied expressed an understanding of the importance of adverse event reporting and prevention, but the interviews also suggested a lack of the knowledge necessary for establishing a good institutional reporting system in long-term care.
Dutta, Pritha; Basu, Subhadip; Kundu, Mahantapas
2017-03-31
The semantic similarity between two interacting proteins can be estimated by combining the similarity scores of the GO terms associated with the proteins. A greater number of similar GO annotations between two proteins indicates greater interaction affinity. Existing semantic similarity measures make use of the GO graph structure, the information content of GO terms, or a combination of both. In this paper, we present a hybrid approach which utilizes both the topological features of the GO graph and the information content of the GO terms. More specifically, we 1) consider a fuzzy clustering of the GO graph based on the level of association of the GO terms, 2) estimate the GO term memberships to each cluster center based on the respective shortest path lengths, and 3) assign weights to GO term pairs on the basis of their dissimilarity with respect to the cluster centers. We test the performance of our semantic similarity measure against seven other previously published similarity measures using benchmark protein-protein interaction datasets of Homo sapiens and Saccharomyces cerevisiae, based on sequence similarity, Pfam similarity, area under the ROC curve, and the F1 measure.
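To make the "combining term scores into a protein-pair score" step concrete, here is a minimal sketch using the common best-match-average scheme. This is a generic aggregation, not necessarily the paper's exact formulation; the term-level similarity function and the GO identifiers in the example are hypothetical.

```python
import numpy as np

def protein_similarity(terms_a, terms_b, term_sim):
    """Combine pairwise GO-term similarities into a protein-pair score via
    best-match-average. `term_sim(t1, t2)` is any term-level similarity
    (e.g. one weighted by cluster-centre dissimilarity, as in the paper)."""
    sims = np.array([[term_sim(a, b) for b in terms_b] for a in terms_a])
    best_a = sims.max(axis=1).mean()   # best match for each term of protein A
    best_b = sims.max(axis=0).mean()   # best match for each term of protein B
    return 0.5 * (best_a + best_b)

# Toy term-level similarity on hypothetical GO identifiers
toy_sim = {("GO:1", "GO:2"): 0.8, ("GO:1", "GO:3"): 0.2,
           ("GO:4", "GO:2"): 0.1, ("GO:4", "GO:3"): 0.9}
sim_fn = lambda a, b: 1.0 if a == b else toy_sim.get((a, b), toy_sim.get((b, a), 0.0))
score = protein_similarity(["GO:1", "GO:4"], ["GO:2", "GO:3"], sim_fn)  # 0.85 here
```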
Accurate quantum chemical calculations
NASA Technical Reports Server (NTRS)
Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.
1989-01-01
An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics is discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed; these are then applied to a number of chemical and spectroscopic problems, to transition metals, and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.
U.S. Solar Photovoltaic System Cost Benchmark: Q1 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Ran; Feldman, David; Margolis, Robert
This report benchmarks U.S. solar photovoltaic (PV) system installed costs as of the first quarter of 2017 (Q1 2017). We use a bottom-up methodology, accounting for all system and project-development costs incurred during the installation, to model the costs for residential, commercial, and utility-scale systems. In general, we attempt to model the typical installation techniques and business operations from an installed-cost perspective. Costs are represented from the perspective of the developer/installer; thus, all hardware costs represent the price at which components are purchased by the developer/installer, not accounting for preexisting supply agreements or other contracts. Importantly, the benchmark also represents the sales price paid to the installer; therefore, it includes profit in the cost of the hardware, along with the profit the installer/developer receives, as a separate cost category. However, it does not include any additional net profit, such as a developer fee or price gross-up, which is common in the marketplace. We adopt this approach owing to the wide variation in developer profits in all three sectors, where project pricing is highly dependent on region and project specifics such as local retail electricity rate structures, local rebate and incentive structures, competitive environment, and overall project or deal structures. Finally, our benchmarks are national averages weighted by state installed capacities.
Source-term development for a contaminant plume for use by multimedia risk assessment models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.
1999-12-01
Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments for use at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.
NASA Astrophysics Data System (ADS)
Richter, Martin; Fingerhut, Benjamin P.
2017-06-01
The description of non-Markovian effects imposed by low frequency bath modes poses a persistent challenge for path integral based approaches like the iterative quasi-adiabatic propagator path integral (iQUAPI) method. We present a novel approximate method, termed mask assisted coarse graining of influence coefficients (MACGIC)-iQUAPI, that offers appealing computational savings due to substantial reduction of considered path segments for propagation. The method relies on an efficient path segment merging procedure via an intermediate coarse grained representation of Feynman-Vernon influence coefficients that exploits physical properties of system decoherence. The MACGIC-iQUAPI method allows us to access the regime of biological significant long-time bath memory on the order of hundred propagation time steps while retaining convergence to iQUAPI results. Numerical performance is demonstrated for a set of benchmark problems that cover bath assisted long range electron transfer, the transition from coherent to incoherent dynamics in a prototypical molecular dimer and excitation energy transfer in a 24-state model of the Fenna-Matthews-Olson trimer complex where in all cases excellent agreement with numerically exact reference data is obtained.
Schulkind, M D
1999-09-01
In three experiments, long-term memory for temporal structure was examined by having participants identify both well-known (e.g., "I've Been Working on the Railroad") and novel songs. The target songs were subjected to a number of rhythmic alterations, to assess the importance of four critical features of identification performance. The four critical features were meter, phrasing, rhythmic contour (ordinal scaling of note durations), and the ratio of successive durations. In contrast with previous work, the unaltered version of each song was identified significantly better than any altered version. This indicates that rhythm is stored in long-term memory. Furthermore, the results demonstrated that all four critical features play a role in the identification of songs. These results held for both well-known and novel tunes.
Accelerated Test Method for Corrosion Protective Coatings Project
NASA Technical Reports Server (NTRS)
Falker, John; Zeitlin, Nancy; Calle, Luz
2015-01-01
This project seeks to develop a new accelerated corrosion test method that predicts the long-term corrosion protection performance of spaceport structure coatings as accurately and reliably as current long-term atmospheric exposure tests. This new accelerated test method will shorten the time needed to evaluate the corrosion protection performance of coatings for NASA's critical ground support structures. Lifetime prediction for spaceport structure coatings has a 5-year qualification cycle using atmospheric exposure. Current accelerated corrosion tests often provide false positives and negatives for coating performance, do not correlate to atmospheric corrosion exposure results, and do not correlate with atmospheric exposure timescales for lifetime prediction.
Petukh, Marharyta; Li, Minghui; Alexov, Emil
2015-07-01
A new methodology termed Single Amino Acid Mutation based change in Binding free Energy (SAAMBE) was developed to predict the changes of the binding free energy caused by mutations. The method utilizes 3D structures of the corresponding protein-protein complexes and takes advantage of both approaches: sequence- and structure-based methods. The method has two components: a MM/PBSA-based component, and an additional set of statistical terms derived from statistical investigation of physico-chemical properties of protein complexes. While the approach is a rigid-body approach and does not explicitly consider plausible conformational changes caused by the binding, the effect of conformational changes, including changes away from the binding interface, on electrostatics is mimicked with amino-acid-specific dielectric constants. This provides a significant improvement of SAAMBE predictions, as indicated by a better match against experimentally determined binding free energy changes over 1300 mutations in 43 proteins. The final benchmarking resulted in a very good agreement with experimental data (correlation coefficient 0.624), while the algorithm is fast enough to allow for large-scale calculations (the average time is less than a minute per mutation).
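The benchmarking statistic quoted above (a correlation coefficient against experimental binding free energy changes) can be reproduced with a few lines of code. The sketch below shows only that evaluation step, not the SAAMBE energy function itself, and the ddG values are hypothetical placeholders.

```python
import numpy as np

def pearson_r(pred, exp):
    """Pearson correlation coefficient between predicted and experimental
    binding free energy changes (ddG, kcal/mol)."""
    pred, exp = np.asarray(pred, dtype=float), np.asarray(exp, dtype=float)
    return np.corrcoef(pred, exp)[0, 1]

# Hypothetical ddG values for a handful of mutations (kcal/mol)
ddg_pred = [0.4, 1.2, -0.3, 2.1, 0.0]
ddg_exp  = [0.6, 1.0, -0.5, 1.8, 0.2]
r = pearson_r(ddg_pred, ddg_exp)   # the paper reports r ~ 0.62 over ~1300 mutations
```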
Hardware friendly probabilistic spiking neural network with long-term and short-term plasticity.
Hsieh, Hung-Yi; Tang, Kea-Tiong
2013-12-01
This paper proposes a probabilistic spiking neural network (PSNN) with unimodal weight distribution, possessing long- and short-term plasticity. The proposed algorithm is derived from both arithmetic gradient descent calculation and bioinspired algorithms. The algorithm is benchmarked on the Iris and Wisconsin breast cancer (WBC) data sets. The network features fast convergence speed and high accuracy. In the experiments, the PSNN took no more than 40 epochs to converge. The average testing accuracy for the Iris and WBC data is 96.7% and 97.2%, respectively. To test the usefulness of the PSNN for real-world applications, the PSNN was also tested with odor data collected by our self-developed electronic nose (e-nose). Compared with the algorithm (K-nearest neighbor) that has the highest classification accuracy in the e-nose for the same odor data, the classification accuracy of the PSNN is only 1.3% lower, but the memory requirement can be reduced by at least 40%. All the experiments suggest that the PSNN is hardware friendly. First, it requires only nine-bit weight resolution for training and testing. Second, the PSNN can learn complex data sets with a small number of neurons, which in turn reduces the cost of VLSI implementation. In addition, the algorithm is insensitive to synaptic noise and to the parameter variation induced by VLSI fabrication. Therefore, the algorithm can be implemented in either software or hardware, making it suitable for wider application.
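To illustrate what limited weight resolution means in practice for such hardware-friendly networks, here is a small sketch of uniform nine-bit weight quantization. It is a generic illustration under assumed ranges, not the PSNN learning rule or its hardware mapping.

```python
import numpy as np

def quantize_weights(w, n_bits=9, w_max=1.0):
    """Uniformly quantize weights to signed n_bits levels in [-w_max, w_max],
    mimicking the limited weight resolution of a hardware synapse."""
    levels = 2 ** (n_bits - 1) - 1          # 255 positive levels for 9 bits
    step = w_max / levels
    return np.clip(np.round(w / step), -levels, levels) * step

rng = np.random.default_rng(1)
w = rng.uniform(-1, 1, size=1000)
w_q = quantize_weights(w, n_bits=9)
max_err = np.abs(w - w_q).max()             # bounded by half a quantization step
```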
Nagy, Balázs; Setyawan, Juliana; Coghill, David; Soroncz-Szabó, Tamás; Kaló, Zoltán; Doshi, Jalpa A
2017-06-01
Models incorporating long-term outcomes (LTOs) are not available to assess the health economic impact of attention-deficit/hyperactivity disorder (ADHD). The aim was to develop a conceptual modelling framework capable of assessing the long-term economic impact of ADHD therapies. The literature was reviewed, and a conceptual structure for the long-term model was outlined with attention to disease characteristics and the potential impact of treatment strategies. The proposed model has four layers: i) a multi-state short-term framework to differentiate between ADHD treatments; ii) multiple states merged into three core health states associated with LTOs; iii) a series of sub-models in which particular LTOs are depicted; iv) outcomes collected to be either used directly for economic analyses or translated into other relevant measures. This conceptual model provides a framework to assess relationships between short- and long-term outcomes of the disease and its treatment, and to estimate the economic impact of ADHD treatments throughout the course of the disease.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-05
... does not include a rate for either the ``Korea Trade Insurance Corporation (K-SURE)-- Short-Term Export.... C. Benchmark Interest Rate for Short-Term Loans Section 771(5)(E)(ii) of the Act states that the... Development Bank (KDB)/IBK Short-Term Discounted Loans for Export Receivables'' program, an analysis of any...
Opening the Door: The Experience of Chronic Critical Illness in a Long-Term Acute Care Hospital.
Lamas, Daniela J; Owens, Robert L; Nace, R Nicholas; Massaro, Anthony F; Pertsch, Nathan J; Gass, Jonathon; Bernacki, Rachelle E; Block, Susan D
2017-04-01
Chronically critically ill patients have recurrent infections and organ dysfunction, and at least half die within 1 year. They are frequently cared for in long-term acute care hospitals, yet little is known about their experience in this setting. Our objective was to explore the understanding, expectations, and goals of these patients and their surrogates. We conducted semi-structured interviews with chronically critically ill long-term acute care hospital patients or surrogates. Conversations were recorded, transcribed, and analyzed. One long-term acute care hospital. Chronically critically ill patients, defined by tracheotomy for prolonged mechanical ventilation, or surrogates. Semi-structured conversation about quality of life, expectations, and planning for setbacks. A total of 50 subjects (30 patients and 20 surrogates) were enrolled. Thematic analyses demonstrated: 1) poor quality of life for patients; 2) surrogate stress and anxiety; 3) optimistic health expectations; 4) poor planning for medical setbacks; and 5) disruptive care transitions. Nearly 80% of patients and their surrogate decision makers identified going home as a goal; 38% were at home at 1 year. Our study describes the experience of chronically critically ill patients and surrogates in a long-term acute care hospital and the feasibility of patient-focused research in this setting. Our findings indicate overly optimistic expectations about return home and unmet palliative care needs, suggesting the need for integration of palliative care within the long-term acute care hospital. Further research is also needed to more fully understand the challenges of this growing population of ICU survivors.
A Comparison of Classical Force-Fields for Molecular Dynamics Simulations of Lubricants
Ewen, James P.; Gattinoni, Chiara; Thakkar, Foram M.; Morgan, Neal; Spikes, Hugh A.; Dini, Daniele
2016-01-01
For the successful development and application of lubricants, a full understanding of their complex nanoscale behavior under a wide range of external conditions is required, but this is difficult to obtain experimentally. Nonequilibrium molecular dynamics (NEMD) simulations can be used to yield unique insights into the atomic-scale structure and friction of lubricants and additives; however, the accuracy of the results depends on the chosen force-field. In this study, we demonstrate that the use of an accurate, all-atom force-field is critical in order to (i) accurately predict important properties of long-chain, linear molecules and (ii) reproduce the experimental friction behavior of multi-component tribological systems. In particular, we focus on n-hexadecane, an important model lubricant with a wide range of industrial applications. Moreover, simulating conditions common in tribological systems, i.e., high temperatures and pressures (HTHP), allows the limits of the selected force-fields to be tested. In the first section, a large number of united-atom and all-atom force-fields are benchmarked in terms of their density and viscosity prediction accuracy for n-hexadecane using equilibrium molecular dynamics (EMD) simulations at ambient and HTHP conditions. Whilst united-atom force-fields accurately reproduce the experimental density, the viscosity is significantly under-predicted compared to all-atom force-fields and experiments. Moreover, some all-atom force-fields yield elevated melting points, leading to significant overestimation of both the density and viscosity. In the second section, the most accurate united-atom and all-atom force-fields are compared in confined NEMD simulations which probe the structure and friction of stearic acid adsorbed on iron oxide and separated by a thin layer of n-hexadecane. The united-atom force-field provides an accurate representation of the structure of the confined stearic acid film; however, friction coefficients are consistently under-predicted and the friction-coverage and friction-velocity behavior deviates from that observed using all-atom force-fields and experimentally. This has important implications regarding force-field selection for NEMD simulations of systems containing long-chain, linear molecules; specifically, it is recommended that accurate all-atom potentials, such as L-OPLS-AA, are employed. PMID:28773773
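The first-section benchmarking described above amounts to comparing force-field predictions of density and viscosity against experimental reference values. The sketch below shows that comparison step only; the force-field names and the numerical values are placeholders, not results from the paper.

```python
def percent_deviation(predicted, reference):
    """Signed percent deviation of a force-field prediction from experiment."""
    return 100.0 * (predicted - reference) / reference

# Placeholder n-hexadecane values at ambient conditions (density in kg/m^3,
# viscosity in mPa s); real numbers come from the EMD simulations and experiment.
experiment = {"density": 770.0, "viscosity": 3.0}
force_fields = {
    "united-atom A": {"density": 768.0, "viscosity": 1.9},
    "all-atom B":    {"density": 775.0, "viscosity": 2.9},
}
for name, pred in force_fields.items():
    d_rho = percent_deviation(pred["density"], experiment["density"])
    d_eta = percent_deviation(pred["viscosity"], experiment["viscosity"])
    print(f"{name}: density {d_rho:+.1f}%, viscosity {d_eta:+.1f}%")
```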
Long-term Culture of Human iPS Cell-derived Telencephalic Neuron Aggregates on Collagen Gel.
Oyama, Hiroshi; Takahashi, Koji; Tanaka, Yoshikazu; Takemoto, Hiroshi; Haga, Hisashi
2018-01-01
It takes several months to form the 3-dimensional morphology of the human embryonic brain. Therefore, establishing a long-term culture method for neuronal tissues derived from human induced pluripotent stem (iPS) cells is very important for studying human brain development. However, it is difficult to keep primary neurons alive for more than 3 weeks in culture. Moreover, long-term adherent culture to maintain the morphology of telencephalic neuron aggregates induced from human iPS cells is also difficult. Although collagen gel has been widely used to support long-term culture of cells, it is not clear whether human iPS cell-derived neuron aggregates can be cultured for long periods on this substrate. In the present study, we differentiated human iPS cells to telencephalic neuron aggregates and examined long-term culture of these aggregates on collagen gel. The results indicated that these aggregates could be cultured for over 3 months by adhering tightly onto collagen gel. Furthermore, telencephalic neuronal precursors within these aggregates matured over time and formed layered structures. Thus, long-term culture of telencephalic neuron aggregates derived from human iPS cells on collagen gel would be useful for studying human cerebral cortex development. Key words: Induced pluripotent stem cell, forebrain neuron, collagen gel, long-term culture.
Long-wavelength microinstabilities in toroidal plasmas*
NASA Astrophysics Data System (ADS)
Tang, W. M.; Rewoldt, G.
1993-07-01
Realistic kinetic toroidal eigenmode calculations have been carried out to support a proper assessment of the influence of long-wavelength microturbulence on transport in tokamak plasmas. In order to efficiently evaluate large-scale kinetic behavior extending over many rational surfaces, significant improvements have been made to a toroidal finite element code used to analyze the fully two-dimensional (r,θ) mode structures of trapped-ion and toroidal ion temperature gradient (ITG) instabilities. It is found that even at very long wavelengths, these eigenmodes exhibit a strong ballooning character with the associated radial structure relatively insensitive to ion Landau damping at the rational surfaces. In contrast to the long-accepted picture that the radial extent of trapped-ion instabilities is characterized by the ion-gyroradius-scale associated with strong localization between adjacent rational surfaces, present results demonstrate that under realistic conditions, the actual scale is governed by the large-scale variations in the equilibrium gradients. Applications to recent measurements of fluctuation properties in Tokamak Fusion Test Reactor (TFTR) [Plasma Phys. Controlled Nucl. Fusion Res. (International Atomic Energy Agency, Vienna, 1985), Vol. 1, p. 29] L-mode plasmas indicate that the theoretical trends appear consistent with spectral characteristics as well as rough heuristic estimates of the transport level. Benchmarking calculations in support of the development of a three-dimensional toroidal gyrokinetic code indicate reasonable agreement with respect to both the properties of the eigenfunctions and the magnitude of the eigenvalues during the linear phase of the simulations of toroidal ITG instabilities.
Source-term development for a contaminant plume for use by multimedia risk assessment models
NASA Astrophysics Data System (ADS)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.
2000-02-01
Multimedia modelers from the US Environmental Protection Agency (EPA) and US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools that are used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world, 90Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. Source characteristics and a release mechanism are developed and described; also described is a typical process and procedure that an analyst would follow in developing a source term for using this class of analytical tool in a preliminary assessment.
Noise Maps for Quantitative and Clinical Severity Towards Long-Term ECG Monitoring.
Everss-Villalba, Estrella; Melgarejo-Meseguer, Francisco Manuel; Blanco-Velasco, Manuel; Gimeno-Blanes, Francisco Javier; Sala-Pla, Salvador; Rojo-Álvarez, José Luis; García-Alberola, Arcadi
2017-10-25
Noise and artifacts are inherent contaminating components and are particularly present in Holter electrocardiogram (ECG) monitoring. The presence of noise is even more significant in long-term monitoring (LTM) recordings, as these are collected for several days from patients following their daily activities; hence, strong artifact components can temporarily impair the clinical measurements from the LTM recordings. Traditionally, the presence of noise has been treated as a problem of removing non-desirable components by means of quantitative signal metrics such as the signal-to-noise ratio (SNR), but current systems do not provide any information about the true impact of noise on the clinical evaluation of the ECG. As a first step towards an alternative to classical approaches, this work assesses ECG quality under the assumption that an ECG has good quality when it is clinically interpretable. Therefore, our hypotheses are that it is possible (a) to create a clinical severity score for the effect of the noise on the ECG, (b) to characterize its consistency in terms of its temporal and statistical distribution, and (c) to use it for signal quality evaluation in LTM scenarios. For this purpose, a database of external event recorder (EER) signals was assembled and labeled from a clinical point of view for use as the gold standard of noise severity categorization. These devices are assumed to capture the signal segments most prone to be corrupted by noise during long-term periods. The ECG noise is then characterized by comparing these clinical severity criteria with conventional quantitative metrics taken from traditional noise-removal approaches, and noise maps are proposed as a novel representation tool to achieve this comparison. Our results showed that neither of the benchmarked quantitative noise measurement criteria represents an accurate enough estimate of the clinical severity of the noise. A case study of long-term ECG is reported, showing the statistical and temporal correspondences and properties with respect to the EER signals used to create the gold standard for clinical noise. The proposed noise maps, together with the statistically consistent characterization of clinical noise severity, pave the way towards systems that provide noise maps of clinical severity, allowing the user to process different ECG segments with different techniques and in terms of different measured clinical parameters.
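For orientation, the sketch below shows the kind of conventional per-segment SNR metric that the paper benchmarks against its clinical severity labels, together with a mapping from SNR to a coarse severity category. The signal, the thresholds, and the category names are hypothetical illustrations, not the paper's gold standard.

```python
import numpy as np

def segment_snr_db(clean, noisy):
    """Classical SNR estimate (dB) for one ECG segment, given a reference
    clean signal and the observed noisy signal."""
    noise = noisy - clean
    return 10 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

def severity_from_snr(snr_db):
    """Hypothetical mapping from SNR to a coarse severity label; the paper
    argues such purely quantitative criteria are insufficient on their own."""
    if snr_db > 20:
        return "clean"
    if snr_db > 10:
        return "mild"
    if snr_db > 0:
        return "moderate"
    return "severe"

fs = 250
t = np.arange(0, 10, 1 / fs)                           # 10 s segment at 250 Hz
clean = np.sin(2 * np.pi * 1.2 * t)                    # toy ECG-like waveform
noisy = clean + 0.3 * np.random.default_rng(2).normal(size=t.size)
label = severity_from_snr(segment_snr_db(clean, noisy))
```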
Closer to home (or home alone?) The British Columbia long-term care system in transition.
Brody, B L; Simon, H J; Stadler, K L
1997-01-01
Finding ways to organize and deliver long-term care that provides for quality of life at an affordable price is of increasing importance as the population ages, family size decreases, and women enter the workforce. For the past 2 decades, British Columbia has provided a model system that has apparently avoided disruptive conflicts. Although formal users' complaints are rare, this study--based on focus groups and interviews with users, their families, and advocates--identified problems users encountered in resolving concerns about the structure, process, and outcome of long-term care. We present these findings in the context of British Columbia's current devolution from provincial to regional control, which aims to save costs and keep disabled elderly persons in the community. British Columbia may be continuing to lead the way in meeting the needs of its burgeoning elderly population for long-term care. Study findings have implications for the development of US long-term care policy by pointing to the value of obtaining users' views of long-term care to identify both obvious and more subtle trouble spots. PMID:9392982
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 3 2010-07-01 2010-07-01 false Bonding requirements for underground coal mines and long-term coal-related surface facilities and structures. 800.17 Section 800.17 Mineral Resources... REQUIREMENTS FOR SURFACE COAL MINING AND RECLAMATION OPERATIONS BOND AND INSURANCE REQUIREMENTS FOR SURFACE...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 3 2012-07-01 2012-07-01 false Bonding requirements for underground coal mines and long-term coal-related surface facilities and structures. 800.17 Section 800.17 Mineral Resources... REQUIREMENTS FOR SURFACE COAL MINING AND RECLAMATION OPERATIONS BOND AND INSURANCE REQUIREMENTS FOR SURFACE...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 30 Mineral Resources 3 2013-07-01 2013-07-01 false Bonding requirements for underground coal mines and long-term coal-related surface facilities and structures. 800.17 Section 800.17 Mineral Resources... REQUIREMENTS FOR SURFACE COAL MINING AND RECLAMATION OPERATIONS BOND AND INSURANCE REQUIREMENTS FOR SURFACE...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 30 Mineral Resources 3 2014-07-01 2014-07-01 false Bonding requirements for underground coal mines and long-term coal-related surface facilities and structures. 800.17 Section 800.17 Mineral Resources... REQUIREMENTS FOR SURFACE COAL MINING AND RECLAMATION OPERATIONS BOND AND INSURANCE REQUIREMENTS FOR SURFACE...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 3 2011-07-01 2011-07-01 false Bonding requirements for underground coal mines and long-term coal-related surface facilities and structures. 800.17 Section 800.17 Mineral Resources... REQUIREMENTS FOR SURFACE COAL MINING AND RECLAMATION OPERATIONS BOND AND INSURANCE REQUIREMENTS FOR SURFACE...
Long-term Follow-up with AlloDerm in Breast Reconstruction
2013-01-01
Summary: Little is known about the long-term fate of acellular dermal matrices in breast implant surgery. A 12-year follow-up case with tissue analysis of AlloDerm in revision breast reconstruction reveals retention of graft volume and integration with an organized collagen structure, minimal capsule formation, and little or no indication of inflammation. PMID:25289211
Long-term Follow-up with AlloDerm in Breast Reconstruction.
Baxter, Richard A
2013-05-01
Little is known about the long-term fate of acellular dermal matrices in breast implant surgery. A 12-year follow-up case with tissue analysis of AlloDerm in revision breast reconstruction reveals retention of graft volume and integration with an organized collagen structure, minimal capsule formation, and little or no indication of inflammation.
Profile extrusion and mechanical properties of crosslinked wood–thermoplastic composites
Magnus Bengtsson; Kristiina Oksman; Stark Nicole M.
2006-01-01
Challenges for wood-thermoplastic composites to be utilized in structural applications are to lower product weight and to improve the long-term load performance. Silane crosslinking of the composites is one way to reduce the creep during long-term loading and to improve the mechanical properties. In this study, silane crosslinked wood-polyethylene composites were...
The Dynamics and Inequality of Italian Men's Earnings: Long-Term Changes or Transitory Fluctuations?
ERIC Educational Resources Information Center
Cappellari, Lorenzo
2004-01-01
This paper provides a longitudinal perspective on changes in Italian men's earnings inequality since the late 1970s by decomposing the earnings autocovariance structure into its long-term and transitory parts. Cross-sectional earnings differentials grew over the period and the longitudinal analysis shows that such growth was determined by the…
Growth, yield, and structure of extended rotation Pinus resinosa stands in Minnesota, USA
Anthony W. D' Amato; Brian J. Palik; Christel C. Kern
2010-01-01
Extended rotations are increasingly used to meet ecological objectives on forestland; however, information about long-term growth and yield of these systems is lacking for most forests in North America. Additionally, long-term growth responses to repeated thinnings in older stands have received little attention. We addressed these needs by examining the growth and...
ERIC Educational Resources Information Center
McGrath, Dennis, Ed.
1998-01-01
This volume offers a variety of examples of long-term collaborative efforts within schools that began with external funding. Articles include: (1) "Lessons from a Long-Term Collaboration," (Lindsay M. Wright and Rona Middleberg); (2) "Creating Structural Change: Best Practices," (Janet E. Lieberman); (3) "An Urban Intervention That Works: The…
GABA-Mediated Presynaptic Inhibition Is Required for Precision of Long-Term Memory
ERIC Educational Resources Information Center
Cullen, Patrick K.; Dulka, Brooke N.; Ortiz, Samantha; Riccio, David C.; Jasnow, Aaron M.
2014-01-01
Though much attention has been given to the neural structures that underlie the long-term consolidation of contextual memories, little is known about the mechanisms responsible for the maintenance of memory precision. Here, we demonstrate a rapid time-dependent decline in memory precision in GABA [subscript B(1a)] receptor knockout mice. First, we…
Long-Term Memory for Music: Infants Remember Tempo and Timbre
ERIC Educational Resources Information Center
Trainor, Laurel J.; Wu, Luann; Tsang, Christine D.
2004-01-01
We show that infants' long-term memory representations for melodies are not just reduced to the structural features of relative pitches and durations, but contain surface or performance tempo- and timbre-specific information. Using a head turn preference procedure, we found that after a one week exposure to an old English folk song, infants…
ERIC Educational Resources Information Center
O'Rourke, Norm; Chappell, Neena L.; Caspar, Sienna
2009-01-01
Purpose: Motivating and enabling formal caregivers to provide individualized resident care has become an increasingly important objective in long-term care (LTC) facilities. The current study set out to examine the structure of responses to the individualized care inventory (ICI). Design and Methods: Samples of 242 registered nurses (RNs)/licensed…
ERIC Educational Resources Information Center
Pardini, Matteo; Elia, Maurizio; Garaci, Francesco G.; Guida, Silvia; Coniglione, Filadelfo; Krueger, Frank; Benassi, Francesca; Gialloreti, Leonardo Emberti
2012-01-01
Recent evidence points to white-matter abnormalities as a key factor in autism physiopathology. Using Diffusion Tensor Imaging, we studied white-matter structural properties in a convenience sample of twenty-two subjects with low-functioning autism exposed to long-term augmentative and alternative communication, combined with sessions of cognitive…
Endogenous BDNF Is Required for Long-Term Memory Formation in the Rat Parietal Cortex
ERIC Educational Resources Information Center
Alonso, Mariana; Bekinschtein, Pedro; Cammarota, Martin; Vianna, Monica R. M.; Izquierdo, Ivan; Medina, Jorge H.
2005-01-01
Information storage in the brain is a temporally graded process involving different memory phases as well as different structures in the mammalian brain. Cortical plasticity seems to be essential to store stable long-term memories, although little information is available at the moment regarding molecular and cellular events supporting memory…
Long-term effects of child punishment on Mexican women: a structural model.
Frias-Armenta, Martha
2002-04-01
The aim of this study was to investigate long-term effects of parental use of physical and verbal punishment on Mexican women. To study both direct and indirect effects of these phenomena, a structural model was developed and tested. One hundred and fifty Mexican women were interviewed with regard to their history of child abuse, their level of depression, alcohol use, antisocial behavior, and punishment of their own children. Factors representing such constructs were specified within a structural equation model and their inter-relations were estimated. Women's history of abuse was considered as an exogenous latent variable directly affecting three other factors: mothers' antisocial behavior, their alcohol consumption, and their levels of depression or anxiety. These factors, in turn, were specified as influencing mothers' harsh discipline of their own children. Data supported this model, indicating that a history of abuse has long-term effects on women's behavior and psychological functioning, which in turn cause women's punitive behavior against their children. These results are discussed in terms of the theoretical framework of intergenerational transmission of violence. The direct consequences (depression, anxiety, alcohol consumption, and antisocial behavior) of child punishment act as risk factors for the next generation of child abuse.
ICASE/LaRC Workshop on Benchmark Problems in Computational Aeroacoustics (CAA)
NASA Technical Reports Server (NTRS)
Hardin, Jay C. (Editor); Ristorcelli, J. Ray (Editor); Tam, Christopher K. W. (Editor)
1995-01-01
The proceedings of the Benchmark Problems in Computational Aeroacoustics Workshop held at NASA Langley Research Center are the subject of this report. The purpose of the Workshop was to assess the utility of a number of numerical schemes in the context of the unusual requirements of aeroacoustical calculations. The schemes were assessed from the viewpoint of dispersion and dissipation -- issues important to long-time integration and long-distance propagation in aeroacoustics. Also investigated was the effect of implementing different boundary conditions. The Workshop included a forum in which practical engineering problems related to computational aeroacoustics were discussed. This discussion took the form of a dialogue between an industrial panel and the workshop participants and was an effort to suggest the direction of evolution of this field in the context of current engineering needs.
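As a generic illustration of the dispersion assessment mentioned above, the sketch below computes the modified wavenumber of a standard second-order central difference and its deviation from the exact wavenumber. This is a textbook diagnostic offered for context, not one of the workshop's benchmark schemes.

```python
import numpy as np

def modified_wavenumber_central2(k, dx):
    """Modified wavenumber of the 2nd-order central difference approximation
    of d/dx: k* = sin(k*dx)/dx; the gap to the exact k quantifies dispersion error."""
    return np.sin(k * dx) / dx

dx = 1.0
k = np.linspace(0.0, np.pi / dx, 100)        # resolvable wavenumbers on the grid
k_star = modified_wavenumber_central2(k, dx)
dispersion_error = np.abs(k_star - k)        # grows sharply for poorly resolved waves
```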
The Influence of Hurricane Winds on Caribbean Dry Forest Structure and Nutrient Pools
Skip J. Van Bloem; Peter G. Murphy; Ariel E. Lugo; Rebecca Ostertag; Maria Rivera Costa; Ivelisse Ruiz Bernard; Sandra Molina Colon; Miguel Canals Mora
2005-01-01
In 1998, we measured the effects of Hurricane Georges after it passed over long-term research sites in Puerto Rican dry forest. Our primary objectives were to quantify hurricane effects on forest structure, to compare effects in a large tract of forest versus a series of nearby forest fragments, to evaluate short-term response to hurricane disturbance in terms of...
Clemens, Timo; Michelsen, Kai; Commers, Matt; Garel, Pascal; Dowdeswell, Barrie; Brand, Helmut
2014-07-01
Hospitals have become a focal point for health care reform strategies in many European countries during the current financial crisis. There have been calls both for short-term reforms to reduce costs and for long-term changes to improve performance in the long run. On the basis of a literature and document analysis, this study analyses how EU member states align short-term and long-term pressures for hospital reforms in times of the financial crisis and assesses the EU's influence on the national reform agenda. The results reveal that there has been an emphasis on cost-containment measures rather than on structural redesign of the hospital sector and its position within the broader health care system. The EU influences hospital reform efforts through its enhanced economic framework governance, which determines key aspects of the financial context for hospitals in some countries. In addition, the EU health policy agenda, which increasingly addresses health system questions, stimulates the process of structural hospital reforms by knowledge generation, policy advice and financial incentives. We conclude that successful reforms in such a period would arguably need to address both the organisational and financing sides of hospital care. Moreover, critical to structural reform is a widely held acknowledgement of shortfalls in the current system and a belief that new models of hospital care can deliver solutions to overcome these deficits. Advancing the structural redesign of the hospital sector while under pressure to contain costs in the short term is not an easy task, and such redesign is only slowly emerging in Europe. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Adsorption structures and energetics of molecules on metal surfaces: Bridging experiment and theory
NASA Astrophysics Data System (ADS)
Maurer, Reinhard J.; Ruiz, Victor G.; Camarillo-Cisneros, Javier; Liu, Wei; Ferri, Nicola; Reuter, Karsten; Tkatchenko, Alexandre
2016-05-01
Adsorption geometry and stability of organic molecules on surfaces are key parameters that determine the observable properties and functions of hybrid inorganic/organic systems (HIOSs). Despite many recent advances in precise experimental characterization and improvements in first-principles electronic structure methods, reliable databases of structures and energetics for large adsorbed molecules are largely missing. In this review, we present such a database for a range of molecules adsorbed on metal single-crystal surfaces. The systems we analyze include noble-gas atoms, conjugated aromatic molecules, carbon nanostructures, and heteroaromatic compounds adsorbed on five different metal surfaces. The overall objective is to establish a diverse benchmark dataset that enables an assessment of current and future electronic structure methods, and motivates further experimental studies that provide ever more reliable data. Specifically, the benchmark structures and energetics from experiment are here compared with the recently developed van der Waals (vdW) inclusive density-functional theory (DFT) method, DFT + vdWsurf. In comparison to 23 adsorption heights and 17 adsorption energies from experiment, we find a mean average deviation of 0.06 Å and 0.16 eV, respectively. This confirms the DFT + vdWsurf method as an accurate and efficient approach to treat HIOSs. A detailed discussion identifies remaining challenges to be addressed in future development of electronic structure methods, for which the benchmark database presented here may serve as an important reference.
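The deviation statistic quoted in this abstract is a simple mean absolute deviation between calculated and experimental values. The sketch below reproduces that calculation with placeholder adsorption heights; the numbers are illustrative, not entries from the benchmark database.

```python
import numpy as np

def mean_absolute_deviation(calc, ref):
    """Mean absolute deviation between calculated and reference values,
    the statistic quoted for adsorption heights (Angstrom) and energies (eV)."""
    return float(np.mean(np.abs(np.asarray(calc) - np.asarray(ref))))

# Placeholder adsorption heights (Angstrom); real values come from the benchmark set
heights_dft = [2.95, 3.25, 2.30, 3.40]
heights_exp = [3.01, 3.20, 2.25, 3.34]
mad_height = mean_absolute_deviation(heights_dft, heights_exp)   # ~0.055 A here
```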
Classification and assessment tools for structural motif discovery algorithms.
Badr, Ghada; Al-Turaiki, Isra; Mathkour, Hassan
2013-01-01
Motif discovery is the problem of finding recurring patterns in biological data. Patterns can be sequential, mainly when discovered in DNA sequences. They can also be structural (e.g. when discovering RNA motifs). Finding common structural patterns helps to gain a better understanding of the mechanism of action (e.g. post-transcriptional regulation). Unlike DNA motifs, which are sequentially conserved, RNA motifs exhibit conservation in structure, which may be common even if the sequences are different. Over the past few years, hundreds of algorithms have been developed to solve the sequential motif discovery problem, while less work has been done for the structural case. In this paper, we survey, classify, and compare different algorithms that solve the structural motif discovery problem, where the underlying sequences may be different. We highlight their strengths and weaknesses. We start by proposing a benchmark dataset and a measurement tool that can be used to evaluate different motif discovery approaches. Then, we proceed by proposing our experimental setup. Finally, results are obtained using the proposed benchmark to compare available tools. To the best of our knowledge, this is the first attempt to compare tools solely designed for structural motif discovery. Results show that the accuracy of discovered motifs is relatively low. The results also suggest a complementary behavior among tools where some tools perform well on simple structures, while other tools are better for complex structures. We have classified and evaluated the performance of available structural motif discovery tools. In addition, we have proposed a benchmark dataset with tools that can be used to evaluate newly developed tools.
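Since the abstract proposes a benchmark dataset and a measurement tool for comparing motif discovery algorithms, here is a minimal sketch of one common way such comparisons are scored: precision, recall, and the F1 measure over predicted motif positions. The position-set representation is a simplifying assumption, not necessarily the paper's exact measurement tool.

```python
def evaluate_predictions(predicted, reference):
    """Score predicted motif occurrences against a benchmark annotation using
    precision, recall and the F1 measure; inputs are sets of (sequence_id, position)."""
    tp = len(predicted & reference)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(reference) if reference else 0.0
    denom = precision + recall
    f1 = 2 * precision * recall / denom if denom else 0.0
    return precision, recall, f1

# Toy example: positions where a structural motif was reported vs annotated
predicted = {("seq1", 10), ("seq1", 42), ("seq2", 7)}
reference = {("seq1", 10), ("seq2", 7), ("seq2", 55)}
p, r, f1 = evaluate_predictions(predicted, reference)   # 0.67, 0.67, 0.67
```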
Site-specific to local-scale shallow landslides triggering zones assessment using TRIGRS
NASA Astrophysics Data System (ADS)
Bordoni, M.; Meisina, C.; Valentino, R.; Bittelli, M.; Chersich, S.
2015-05-01
Rainfall-induced shallow landslides are common phenomena in many parts of the world, affecting cultivation and infrastructure and sometimes causing human losses. Assessing the triggering zones of shallow landslides is fundamental for land planning at different scales. This work defines a reliable methodology to extend a slope stability analysis from the site-specific to the local scale by using a well-established physically based model (TRIGRS-unsaturated). The model is initially applied to a sample slope and then to the surrounding 13.4 km² area in Oltrepo Pavese (northern Italy). To obtain more reliable input data for the model, long-term hydro-meteorological monitoring has been carried out at the sample slope, which has been assumed to be representative of the study area. Field measurements identified the triggering mechanism of shallow failures and were used to verify that the model reproduces pore water pressure trends consistent with those measured during the monitoring activity. In this way, more reliable trends have been modelled for past landslide events, such as the April 2009 event, which was taken as a benchmark. The assessment of shallow landslide triggering zones obtained using TRIGRS-unsaturated for the benchmark event appears good for both the monitored slope and the whole study area, with better results when a pedological rather than a geological zoning is used at the regional scale. Sensitivity analyses of the soil input data show that the mean values of the soil properties give the best results in terms of the ratio between the true positive and false positive rates. The scheme followed in this work reduces the overestimation of unstable zones in comparison with other distributed models applied in the past.
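The true positive and false positive rates referred to above can be computed by overlaying the modelled unstable cells (for example, cells with a simulated factor of safety below 1 for the benchmark event) on the mapped landslide inventory. A minimal sketch with a toy grid rather than actual TRIGRS output:

```python
import numpy as np

def confusion_rates(predicted_unstable, observed_landslide):
    """True/false positive rates for a gridded stability map.

    predicted_unstable, observed_landslide: boolean arrays of equal shape,
    e.g. modelled factor-of-safety < 1 cells vs. mapped landslide cells.
    """
    tp = np.sum(predicted_unstable & observed_landslide)
    fp = np.sum(predicted_unstable & ~observed_landslide)
    fn = np.sum(~predicted_unstable & observed_landslide)
    tn = np.sum(~predicted_unstable & ~observed_landslide)
    tpr = tp / (tp + fn) if (tp + fn) else 0.0
    fpr = fp / (fp + tn) if (fp + tn) else 0.0
    return tpr, fpr

# Toy 1D "map": 10 cells, 3 mapped landslide cells, 4 modelled unstable cells.
pred = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 0], dtype=bool)
obs  = np.array([1, 0, 0, 1, 0, 0, 1, 0, 0, 0], dtype=bool)
print(confusion_rates(pred, obs))
```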
U.S. Solar Photovoltaic System Cost Benchmark: Q1 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Ran; Feldman, David J.; Margolis, Robert M.
NREL has been modeling U.S. photovoltaic (PV) system costs since 2009. This year, our report benchmarks costs of U.S. solar PV for residential, commercial, and utility-scale systems built in the first quarter of 2017 (Q1 2017). Costs are represented from the perspective of the developer/installer, thus all hardware costs represent the price at which components are purchased by the developer/installer, not accounting for preexisting supply agreements or other contracts. Importantly, the benchmark this year (2017) also represents the sales price paid to the installer; therefore, it includes profit in the cost of the hardware, along with the profit the installer/developer receives, as a separate cost category. However, it does not include any additional net profit, such as a developer fee or price gross-up, which are common in the marketplace. We adopt this approach owing to the wide variation in developer profits in all three sectors, where project pricing is highly dependent on region and project specifics such as local retail electricity rate structures, local rebate and incentive structures, competitive environment, and overall project or deal structures.
Benchmarking Ensemble Streamflow Prediction Skill in the UK
NASA Astrophysics Data System (ADS)
Harrigan, Shaun; Smith, Katie; Parry, Simon; Tanguy, Maliko; Prudhomme, Christel
2017-04-01
Skilful hydrological forecasts at weekly to seasonal lead times would be extremely beneficial for decision-making in operational water management, especially during drought conditions. Hydro-meteorological ensemble forecasting systems are an attractive approach as they use two sources of streamflow predictability: (i) initial hydrologic conditions (IHCs), where soil moisture, groundwater and snow storage states can provide an estimate of future streamflow situations, and (ii) atmospheric predictability, where skilful forecasts of weather and climate variables can be used to force hydrological models. In the UK, prediction of rainfall at long lead times, and for summer months in particular, is notoriously difficult given the large degree of natural climate variability in ocean-influenced mid-latitude regions, but recent research has uncovered exciting prospects for improved rainfall skill at seasonal lead times due to improved prediction of the North Atlantic Oscillation. However, before we fully understand what this improved atmospheric predictability might mean in terms of improved hydrological forecasts, we must first evaluate how much skill can be gained from IHCs alone. Ensemble Streamflow Prediction (ESP) is a well-established method for generating an ensemble of streamflow forecasts in the absence of skilful future meteorological predictions. The aim of this study is therefore to benchmark when (lead time/forecast initialisation month) and where (spatial pattern/catchment characteristics) ESP is skilful across a diverse set of catchments in the UK. Forecast skill was evaluated seamlessly from lead times of 1 day to 12 months, and forecasts were initialised on the first of each month over the 1965-2015 hindcast period. This ESP output also provides a robust benchmark against which to assess how much improvement in skill can be achieved when meteorological forecasts are incorporated (next steps). To provide a 'tough to beat' benchmark, several variants of ESP with increasing complexity were produced, including better model representation of hydrological processes and sub-sampling of historic climate sequences (e.g. NAO+/NAO- years). This work is part of the Improving Predictions of Drought for User Decision Making (IMPETUS) project and provides insight into where advances in atmospheric predictability are most needed in the UK in the context of water management.
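In essence, ESP pairs the current initial hydrologic conditions with each historic year's meteorology, runs the hydrological model once per historic sequence, and measures skill against a benchmark that ignores the IHCs. The sketch below uses a deliberately trivial stand-in model and hypothetical numbers purely to show the mechanics; it is not the forecasting chain used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forcing archive: 50 historic seasonal rainfall totals (mm).
historic_rainfall = rng.gamma(shape=2.0, scale=40.0, size=50)

def toy_model(storage, rainfall):
    """Stand-in rainfall-runoff relation, not a real hydrological model."""
    return 0.3 * storage + 0.5 * rainfall

initial_storage = 120.0        # today's initial hydrologic conditions (IHCs), mm
climatological_storage = 80.0  # long-term mean storage, ignoring the IHCs

# ESP ensemble: one member per historic sequence, all sharing today's IHCs.
esp = np.array([toy_model(initial_storage, r) for r in historic_rainfall])
# Benchmark ensemble built the same way but without knowledge of the IHCs.
clim = np.array([toy_model(climatological_storage, r) for r in historic_rainfall])

observed = 95.0  # hypothetical observed flow for the verification period
skill = 1.0 - np.mean((esp - observed) ** 2) / np.mean((clim - observed) ** 2)
print(f"ESP ensemble mean {esp.mean():.1f} mm, skill vs climatology {skill:.2f}")
```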
TRUST. I. A 3D externally illuminated slab benchmark for dust radiative transfer
NASA Astrophysics Data System (ADS)
Gordon, K. D.; Baes, M.; Bianchi, S.; Camps, P.; Juvela, M.; Kuiper, R.; Lunttila, T.; Misselt, K. A.; Natale, G.; Robitaille, T.; Steinacker, J.
2017-07-01
Context. The radiative transport of photons through arbitrary three-dimensional (3D) structures of dust is a challenging problem due to the anisotropic scattering of dust grains and strong coupling between different spatial regions. The radiative transfer problem in 3D is solved using Monte Carlo or Ray Tracing techniques, as no full analytic solution exists for true 3D structures. Aims: We provide the first 3D dust radiative transfer benchmark composed of a slab of dust with uniform density externally illuminated by a star. This simple 3D benchmark is explicitly formulated to provide tests of the different components of the radiative transfer problem, including dust absorption, scattering, and emission. Methods: The details of the external star, the slab itself, and the dust properties are provided. This benchmark includes models with a range of dust optical depths, fully probing cases from optically thin at all wavelengths to optically thick at most wavelengths. The dust properties adopted are characteristic of the diffuse Milky Way interstellar medium. This benchmark includes solutions for the full dust emission including single photon (stochastic) heating, as well as two simplifying approximations: one where all grains are considered in equilibrium with the radiation field and one where the emission is from a single effective grain with size-distribution-averaged properties. A total of six Monte Carlo codes and one Ray Tracing code provide solutions to this benchmark. Results: The solution to this benchmark is given as global spectral energy distributions (SEDs) and images at select diagnostic wavelengths from the ultraviolet through the infrared. Comparison of the results revealed that the global SEDs are consistent on average to within a few percent for all but the scattered stellar flux at very high optical depths. The image results are consistent within 10%, again except for the stellar scattered flux at very high optical depths. The lack of agreement between different codes on the scattered flux at high optical depths is quantified for the first time. Convergence tests using one of the Monte Carlo codes illustrate the sensitivity of the solutions to various model parameters. Conclusions: We provide the first 3D dust radiative transfer benchmark and validate the accuracy of this benchmark through comparisons between multiple independent codes and detailed convergence tests.
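The core of the Monte Carlo approach shared by the participating codes is a random walk of photon packets through the dust: free paths are sampled from the optical-depth distribution, and at each interaction a packet is scattered with probability equal to the albedo or absorbed otherwise. The sketch below is a heavily simplified one-dimensional version (isotropic scattering, no dust emission, hypothetical parameters), not any of the benchmark codes.

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_slab(tau_max, albedo, n_photons=20_000):
    """Monte Carlo random walk of photons through a plane-parallel dust slab.

    Photons enter at tau = 0 travelling inward; at each interaction they are
    scattered isotropically with probability `albedo`, otherwise absorbed.
    Returns the transmitted and reflected fractions.
    """
    transmitted = reflected = 0
    for _ in range(n_photons):
        tau, mu = 0.0, 1.0  # optical-depth position and direction cosine
        while True:
            tau += -np.log(1.0 - rng.random()) * mu  # sampled free path
            if tau >= tau_max:
                transmitted += 1
                break
            if tau <= 0.0:
                reflected += 1
                break
            if rng.random() > albedo:
                break  # absorbed
            mu = 2.0 * rng.random() - 1.0  # isotropic scattering
    return transmitted / n_photons, reflected / n_photons

print(mc_slab(tau_max=1.0, albedo=0.6))
```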
Structural Brain Imaging of Long-Term Anabolic-Androgenic Steroid Users and Nonusing Weightlifters.
Bjørnebekk, Astrid; Walhovd, Kristine B; Jørstad, Marie L; Due-Tønnessen, Paulina; Hullstein, Ingunn R; Fjell, Anders M
2017-08-15
Prolonged high-dose anabolic-androgenic steroid (AAS) use has been associated with psychiatric symptoms and cognitive deficits, yet we have almost no knowledge of the long-term consequences of AAS use on the brain. The purpose of this study is to investigate the association between long-term AAS exposure and brain morphometry, including subcortical neuroanatomical volumes and regional cortical thickness. Male AAS users and weightlifters with no experience with AASs or any other equivalent doping substances underwent structural magnetic resonance imaging scans of the brain. The current paper is based on high-resolution structural T1-weighted images from 82 current or past AAS users exceeding 1 year of cumulative AAS use and 68 non-AAS-using weightlifters. Images were processed with the FreeSurfer software to compare neuroanatomical volumes and cerebral cortical thickness between the groups. Compared to non-AAS-using weightlifters, the AAS group had thinner cortex in widespread regions and significantly smaller neuroanatomical volumes, including total gray matter, cerebral cortex, and putamen. Both volumetric and thickness effects remained relatively stable across different AAS subsamples comprising various degrees of exposure to AASs, and also when excluding participants with previous and current non-AAS drug abuse. The effects could not be explained by differences in verbal IQ, intracranial volume, anxiety/depression, or attention or behavioral problems. This large-scale systematic investigation of the relationship between AAS use and brain structure shows negative correlations between AAS use and both brain volume and cortical thickness. Although the findings are correlational, they may serve to raise concern about the long-term consequences of AAS use on structural features of the brain. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
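The group comparison adjusted for nuisance variables can be thought of as a covariate-controlled linear model fitted to each FreeSurfer-derived measure. The sketch below is a generic illustration of that idea on simulated placeholder data; the group sizes match the abstract, but the covariates, effect sizes, and model are assumptions rather than the study's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: mean cortical thickness (mm) for 82 AAS users (group = 1)
# and 68 non-using weightlifters (group = 0), with age and standardised
# intracranial volume (ICV) as hypothetical nuisance covariates.
n = 150
group = np.r_[np.ones(82), np.zeros(68)]
age = rng.normal(35, 8, n)
icv = rng.normal(0, 1, n)
thickness = 2.6 - 0.05 * group - 0.004 * (age - 35) + 0.02 * icv + rng.normal(0, 0.08, n)

# Ordinary least squares: thickness ~ intercept + group + age + ICV.
X = np.column_stack([np.ones(n), group, age, icv])
beta, *_ = np.linalg.lstsq(X, thickness, rcond=None)
print(f"estimated covariate-adjusted group effect (AAS vs. control): {beta[1]:.3f} mm")
```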
Goodkind, Daniel; Lollock, Lisa; Choi, Yoonjoung; McDevitt, Thomas; West, Loraine
2018-01-01
Meeting demand for family planning can facilitate progress towards all major themes of the United Nations Sustainable Development Goals (SDGs): people, planet, prosperity, peace, and partnership. Many policymakers have embraced a benchmark goal that at least 75% of the demand for family planning in all countries be satisfied with modern contraceptive methods by the year 2030. This study examines the demographic impact (and development implications) of achieving the 75% benchmark in 13 developing countries that are expected to be the furthest from achieving that benchmark. Estimation of the demographic impact of achieving the 75% benchmark requires three steps in each country: 1) translate contraceptive prevalence assumptions (with and without intervention) into future fertility levels based on biometric models, 2) incorporate each pair of fertility assumptions into separate population projections, and 3) compare the demographic differences between the two population projections. Data are drawn from the United Nations, the US Census Bureau, and Demographic and Health Surveys. The demographic impact of meeting the 75% benchmark is examined via projected differences in fertility rates (average expected births per woman's reproductive lifetime), total population, growth rates, age structure, and youth dependency. On average, meeting the benchmark would imply a 16 percentage point increase in modern contraceptive prevalence by 2030 and a 20% decline in youth dependency, which portends a potential demographic dividend to spur economic growth. Improvements in meeting the demand for family planning with modern contraceptive methods can bring substantial benefits to developing countries. To our knowledge, this is the first study to show formally how such improvements can alter population size and age structure. Declines in youth dependency portend a demographic dividend, an added bonus to the already well-known benefits of meeting existing demands for family planning.
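To make the first step concrete, one widely used biometric relation (not necessarily the one applied in the study) is the Bongaarts proximate-determinants index of contraception, Cc = 1 - 1.08·u·e, where u is contraceptive prevalence and e is average method effectiveness; holding the other proximate determinants fixed, fertility scales with Cc. The sketch below uses hypothetical numbers only.

```python
def tfr_after_cpr_change(tfr0, u0, u1, e=0.95):
    """Scale a baseline TFR by the change in the Bongaarts contraception index."""
    cc0 = 1 - 1.08 * u0 * e
    cc1 = 1 - 1.08 * u1 * e
    return tfr0 * cc1 / cc0

# Hypothetical country: baseline TFR of 5.2 and 20% modern contraceptive prevalence.
baseline_tfr = 5.2
tfr_no_intervention = tfr_after_cpr_change(baseline_tfr, u0=0.20, u1=0.25)
tfr_benchmark = tfr_after_cpr_change(baseline_tfr, u0=0.20, u1=0.41)  # +16 pp prevalence

print(f"TFR without intervention: {tfr_no_intervention:.2f}")
print(f"TFR when meeting the 75% benchmark: {tfr_benchmark:.2f}")
```

Each pair of fertility paths would then feed a separate cohort-component population projection, and the difference between the two projections gives the demographic impact described above.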
Multisource Estimation of Long-term Global Terrestrial Surface Radiation
NASA Astrophysics Data System (ADS)
Peng, L.; Sheffield, J.
2017-12-01
Land surface net radiation is the essential energy source at the Earth's surface. It determines the surface energy budget and its partitioning, drives the hydrological cycle by providing available energy, and supplies heat, light, and energy for biological processes. Individual components of net radiation have changed historically due to natural and anthropogenic climate change and land use change. Decadal variations in radiation, such as global dimming or brightening, have important implications for the hydrological and carbon cycles. In order to assess the trends and variability of net radiation and evapotranspiration, there is a need for accurate estimates of long-term terrestrial surface radiation. While substantial progress has been made in measuring the top-of-atmosphere energy budget, large discrepancies remain among ground observations, satellite retrievals, and reanalysis fields of surface radiation, owing to sparse observational networks, the difficulty of measuring from space, and uncertainty in algorithm parameters. To overcome the weaknesses of single-source datasets, we propose a multi-source merging approach that combines multiple datasets for each radiation component separately, as they are complementary in space and time. First, we conduct diagnostic analyses of multiple satellite and reanalysis datasets based on in-situ measurements such as the Global Energy Balance Archive (GEBA), existing validation studies, and other information such as network density and consistency with other meteorological variables. Then, we calculate the optimal weighted average of the multiple datasets by minimizing the variance of the error between in-situ measurements and other observations. Finally, we quantify the uncertainties in the estimates of surface net radiation and apply physical constraints based on the surface energy balance to reduce these uncertainties. The final dataset is evaluated in terms of its long-term variability and the attribution of that variability to changes in individual components. The goal of this study is to provide a merged observational benchmark for large-scale diagnostic analyses, remote sensing and land surface modeling.
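The "optimal weighted average ... minimizing the variance of error" step corresponds to inverse-variance weighting of the individual products, with error variances estimated against the in-situ networks. A minimal sketch with hypothetical numbers:

```python
import numpy as np

def inverse_variance_merge(estimates, error_variances):
    """Minimum-variance weighted average of independent, unbiased estimates.

    Weights w_i = (1/sigma_i^2) / sum_j (1/sigma_j^2) minimise the variance of
    the merged estimate when the errors are independent.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(error_variances, dtype=float)
    w = 1.0 / variances
    w /= w.sum()
    merged = np.sum(w * estimates)
    merged_var = 1.0 / np.sum(1.0 / variances)
    return merged, merged_var

# Hypothetical net-radiation estimates (W m^-2) for one grid cell from a
# satellite product, a reanalysis, and a station-calibrated product, with
# error variances taken from comparison against in-situ (e.g. GEBA) data.
print(inverse_variance_merge([112.0, 105.0, 118.0], [25.0, 64.0, 36.0]))
```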