The Misattribution of Summers in Teacher Value-Added
ERIC Educational Resources Information Center
Atteberry, Allison
2012-01-01
This paper investigates the extent to which spring-to-spring testing timelines bias teacher value-added as a result of conflating summer and school-year learning. Using a unique dataset that contains both fall and spring standardized test scores, the author examines the patterns in school-year versus summer learning. She estimates value-added…
[Value-Added--Adding Economic Value in the Food Industry].
ERIC Educational Resources Information Center
Welch, Mary A., Ed.
1989-01-01
This booklet focuses on the economic concept of "value added" to goods and services. A student activity worksheet illustrates how the steps involved in processing food are examples of the concept of value added. The booklet further links food processing to the idea of value added to the Gross National Product (GNP). Discussion questions,…
Selection-Fusion Approach for Classification of Datasets with Missing Values
Ghannad-Rezaie, Mostafa; Soltanian-Zadeh, Hamid; Ying, Hao; Dong, Ming
2010-01-01
This paper proposes a new approach based on missing value pattern discovery for classifying incomplete data. This approach is particularly designed for classification of datasets with a small number of samples and a high percentage of missing values where available missing value treatment approaches do not usually work well. Based on the pattern of the missing values, the proposed approach finds subsets of samples for which most of the features are available and trains a classifier for each subset. Then, it combines the outputs of the classifiers. Subset selection is translated into a clustering problem, allowing derivation of a mathematical framework for it. A trade-off is established between the computational complexity (number of subsets) and the accuracy of the overall classifier. To deal with this trade-off, a numerical criterion is proposed for the prediction of the overall performance. The proposed method is applied to seven datasets from the popular University of California, Irvine data mining archive and an epilepsy dataset from Henry Ford Hospital, Detroit, Michigan (total of eight datasets). Experimental results show that the classification accuracy of the proposed method is superior to that of the widely used multiple-imputation method and four other methods. They also show that the level of superiority depends on the pattern and percentage of missing values. PMID:20212921
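As a rough illustration of the pattern-based strategy this abstract describes (grouping samples by which features are missing, training one classifier per group, then combining outputs), here is a minimal Python sketch; the function names and the voting combiner are illustrative assumptions, not the authors' implementation:

```python
from collections import defaultdict

def pattern_key(row):
    """Binary mask of which features are observed (True) vs missing (None)."""
    return tuple(v is not None for v in row)

def group_by_missing_pattern(rows):
    """Cluster samples by their missing-value pattern; each group can then
    train a classifier on just the features observed for that pattern."""
    groups = defaultdict(list)
    for row, label in rows:
        groups[pattern_key(row)].append((row, label))
    return groups

def majority_vote(predictions):
    """Combine per-subset classifier outputs by simple voting."""
    return max(set(predictions), key=predictions.count)

# toy data: None marks a missing value
data = [
    ([1.0, None, 3.0], "a"),
    ([2.0, None, 1.0], "a"),
    ([None, 5.0, 2.0], "b"),
]
groups = group_by_missing_pattern(data)
```

In the paper the subset selection is cast as a clustering problem with a criterion balancing the number of subsets against accuracy; the grouping above is the degenerate case of one subset per exact pattern.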
School system evaluation by value added analysis under endogeneity.
Manzi, Jorge; San Martín, Ernesto; Van Bellegem, Sébastien
2014-01-01
Value added is a common tool in educational research on effectiveness. It is often modeled as a (prediction of a) random effect in a specific hierarchical linear model. This paper shows that this modeling strategy is not valid when endogeneity is present. Endogeneity stems, for instance, from a correlation between the random effect in the hierarchical model and some of its covariates. This paper shows that this phenomenon is far from exceptional and can even be a generic problem when the covariates contain the prior score attainments, a typical situation in value added modeling. Starting from a general, model-free definition of value added, the paper derives an explicit expression of the value added in an endogenous hierarchical linear Gaussian model. Inference on value added is proposed using an instrumental variable approach. The impact of endogeneity on the value added and the estimated value added is calculated accurately. This is also illustrated on a large data set of individual scores of about 200,000 students in Chile.
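A minimal sketch of the setup this abstract describes may help; the symbols below are generic choices, not the paper's notation. Value added is read from the school-level random effect, and endogeneity means that effect is correlated with a covariate such as prior attainment:

```latex
% Student i in school j, with prior attainment x_{ij} as covariate:
y_{ij} = \beta_0 + \beta_1 x_{ij} + u_j + \varepsilon_{ij},
\qquad u_j \sim N(0, \tau^2), \quad \varepsilon_{ij} \sim N(0, \sigma^2)
% Value added is based on the prediction of the school effect u_j.
% Endogeneity arises when
\operatorname{Cov}(u_j, x_{ij}) \neq 0 ,
% in which case the usual random-effect prediction of u_j is biased
% and an instrumental-variable approach is needed.
```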
ERIC Educational Resources Information Center
UCLA IDEA, 2012
2012-01-01
Value-added measures (VAM) use changes in student test scores to determine how much "value" an individual teacher has "added" to student growth during the school year. Some policymakers, school districts, and educational advocates have applauded VAM as a straightforward measure of teacher effectiveness: the better a teacher,…
What's the Value in Value-Added?
ERIC Educational Resources Information Center
Duffrin, Elizabeth
2011-01-01
A growing number of school districts are adopting "value-added" measures of teaching quality to award bonuses or even tenure. And two competitive federal grants are spurring them on. Districts using value-added data are encouraged by the results. But researchers who support value-added measures advise caution. The ratings, which use a…
PDF added value of a high resolution climate simulation for precipitation
NASA Astrophysics Data System (ADS)
Soares, Pedro M. M.; Cardoso, Rita M.
2015-04-01
General Circulation Models (GCMs) are models suitable for studying the global atmospheric system, its evolution and its response to changes in external forcing, namely increasing emissions of CO2. However, the resolution of GCMs, of the order of 1°, is not sufficient to reproduce finer-scale features of the atmospheric flow related to complex topography, coastal processes and boundary-layer processes, and higher-resolution models are needed to describe observed weather and climate. The latter are known as Regional Climate Models (RCMs); they are widely used to downscale GCM results for many regions of the globe and are able to capture physically consistent regional and local circulations. Most RCM evaluations rely on comparing their results with observations, either from weather-station networks or from regular gridded datasets, revealing the ability of RCMs to describe local climatic properties, and usually assuming their higher performance in comparison with the forcing GCMs. The additional climatic detail given by RCMs relative to the results of the driving models is usually called added value, and its evaluation is still scarce and controversial in the literature. Recently, some studies have proposed different methodologies, for different applications and processes, to characterize the added value of specific RCMs. A number of examples reveal that some RCMs do add value to GCMs in some properties or regions, and also the opposite, showing that RCMs may add value to GCM results but that improvements depend on the type of application, model setup, atmospheric property and location. Precipitation can be characterized by histograms of daily precipitation, also known as probability density functions (PDFs). There are different strategies to evaluate the quality of both GCMs and RCMs in describing precipitation PDFs when compared to observations. Here, we present a new method to measure the PDF added value obtained from
NASA Technical Reports Server (NTRS)
Moody, Eric G.; King, Michael D.; Platnick, Steven; Schaaf, Crystal B.; Gao, Feng
2004-01-01
Land surface albedo is an important parameter in describing the radiative properties of the earth's surface, as it represents the amount of incoming solar radiation that is reflected from the surface. The amount and type of vegetation on the surface dramatically alters the amount of radiation that is reflected; for example, croplands that contain leafy vegetation will reflect radiation very differently than blacktop associated with urban areas. In addition, since vegetation goes through a growth, or phenological, cycle, the amount of radiation that is reflected changes over the course of a year. As a result, albedo is both temporally and spatially dependent upon global location, as there is a distribution of vegetated surface types and growing conditions. Land surface albedo is critical for a wide variety of earth system research projects, including but not restricted to remote sensing of atmospheric aerosol and cloud properties from space, ground-based analysis of aerosol optical properties from surface-based sun/sky radiometers, biophysically based land surface modeling of the exchange of energy, water, momentum, and carbon for various land use categories, and surface energy balance studies. These projects require proper representation of the surface albedo's spatial, spectral, and temporal variations; however, these representations are often lacking in datasets prior to the latest generation of land surface albedo products.
NASA Astrophysics Data System (ADS)
Soares, P. M. M.; Cardoso, R. M.
2017-12-01
Regional climate models (RCMs) are used at increasingly fine resolutions in pursuit of a better representation of regional- to local-scale atmospheric phenomena. The EURO-CORDEX simulations at 0.11° and simulations exploiting finer grid spacings approaching the convection-permitting regime are representative examples. These climate runs are computationally very demanding and do not always show improvements; these depend on the region, variable and object of study. The gains or losses associated with the use of higher resolution relative to the forcing model (global climate model or reanalysis), or to RCM simulations at different resolutions, are known as added value. Its characterization is a long-standing issue, and many different added-value measures have been proposed. In the current paper, a new method is proposed to assess the added value of finer-resolution simulations in comparison with their forcing data or coarser-resolution counterparts. This approach builds on a probability density function (PDF) matching score, giving a normalised measure of the difference between PDFs at diverse resolutions, mediated by the observational ones. The distribution added value (DAV) is an objective added-value measure that can be applied to any variable, region or temporal scale, from hindcast or historical (non-synchronous) simulations. The DAV metric and an application to the EURO-CORDEX simulations, for daily temperatures and precipitation, are presented here. The EURO-CORDEX simulations at both resolutions (0.44°, 0.11°) display a clear added value relative to ERA-Interim, with values around 30% in summer and 20% in the intermediate seasons for precipitation. When the two RCM resolutions are compared directly, the added value is limited. The regions with the larger precipitation DAVs are areas where convection is relevant, e.g. the Alps and Iberia. When looking at the extreme precipitation PDF tail, the improvement of the higher resolution is generally greater than that of the lower resolution for seasons
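The kind of PDF matching score and relative added-value measure described here can be sketched as follows; the overlap score and the exact normalisation below are assumptions in the spirit of the abstract, not necessarily the paper's precise definitions:

```python
def pdf_score(model_pdf, obs_pdf):
    """Perkins-type PDF matching score: overlap of two binned,
    normalised histograms (1 = identical distributions)."""
    return sum(min(m, o) for m, o in zip(model_pdf, obs_pdf))

def dav(score_hires, score_lores):
    """Distribution added value: relative gain of the high-resolution
    score over the low-resolution (or driving-model) score."""
    return (score_hires - score_lores) / score_lores

# toy binned daily-precipitation PDFs (invented numbers)
obs = [0.1, 0.4, 0.3, 0.2]
rcm = [0.1, 0.35, 0.35, 0.2]   # high resolution: closer to obs
gcm = [0.3, 0.3, 0.2, 0.2]     # driving model: coarser fit
```

With these toy histograms the RCM overlaps the observed PDF more than the driving model does, so the DAV is positive.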
Measuring Teacher Effectiveness with the Pennsylvania Value-Added Assessment System
ERIC Educational Resources Information Center
Bowen, Naomi
2017-01-01
The purpose of this research was to determine if the Pennsylvania Value-Added Assessment System Average Growth Index (PVAAS AGI) scores, derived from standardized tests and calculated for Pennsylvania schools, provide a valid and reliable assessment of teacher effectiveness, as these scores are currently used to derive 15% of the annual…
Value Added in English Schools
ERIC Educational Resources Information Center
Ray, Andrew; McCormack, Tanya; Evans, Helen
2009-01-01
Value-added indicators are now a central part of school accountability in England, and value-added information is routinely used in school improvement at both the national and the local levels. This article describes the value-added models that are being used in the academic year 2007-8 by schools, parents, school inspectors, and other…
Myths & Facts about Value-Added Analysis
ERIC Educational Resources Information Center
TNTP, 2011
2011-01-01
This paper presents myths as well as facts about value-added analysis. These myths include: (1) "Value-added isn't fair to teachers who work in high-need schools, where students tend to lag far behind academically"; (2) "Value-added scores are too volatile from year-to-year to be trusted"; (3) "There's no research behind value-added"; (4) "Using…
ERIC Educational Resources Information Center
Richards, Andrew
2015-01-01
Two quantitative measures of school performance are currently used, the average points score (APS) at Key Stage 2 and value-added (VA), which measures the rate of academic improvement between Key Stage 1 and 2. These figures are used by parents and the Office for Standards in Education to make judgements and comparisons. However, simple…
Value-Added Models and the Measurement of Teacher Productivity. CALDER Working Paper No. 54
ERIC Educational Resources Information Center
Harris, Douglas; Sass, Tim; Semykina, Anastasia
2010-01-01
Research on teacher productivity, and recently developed accountability systems for teachers, rely on value-added models to estimate the impact of teachers on student performance. The authors test many of the central assumptions required to derive value-added models from an underlying structural cumulative achievement model and reject nearly all…
Kelder, Johannes C; Cowie, Martin R; McDonagh, Theresa A; Hardman, Suzanna M C; Grobbee, Diederick E; Cost, Bernard; Hoes, Arno W
2011-06-01
Diagnosing early stages of heart failure with mild symptoms is difficult. B-type natriuretic peptide (BNP) has promising biochemical test characteristics, but its diagnostic yield on top of readily available diagnostic knowledge has not been sufficiently quantified in early stages of heart failure. To quantify the added diagnostic value of BNP for the diagnosis of heart failure in a population relevant to GPs, and to validate the findings in an independent primary care patient population. Individual patient data meta-analysis followed by external validation. The additional diagnostic yield of BNP above standard clinical information was compared with ECG and chest x-ray results. Derivation was performed on two existing datasets from Hillingdon (n=127) and Rotterdam (n=149), while the UK Natriuretic Peptide Study (n=306) served as the validation dataset. Included were patients with suspected heart failure referred to a rapid-access diagnostic outpatient clinic. Case definition was according to the ESC guideline. Logistic regression was used to assess discrimination (with the c-statistic) and calibration. Of the 276 patients in the derivation set, 30.8% had heart failure. The clinical model (encompassing age, gender, known coronary artery disease, diabetes, orthopnoea, elevated jugular venous pressure, crackles, pitting oedema and S3 gallop) had a c-statistic of 0.79. Adding, respectively, chest x-ray results, ECG results or BNP to the clinical model increased the c-statistic to 0.84, 0.85 and 0.92. Neither ECG nor chest x-ray added significantly to the 'clinical plus BNP' model. All models had adequate calibration. The 'clinical plus BNP' diagnostic model performed well in an independent cohort with comparable inclusion criteria (c-statistic=0.91 and adequate calibration). Using separate cut-off values for 'ruling in' (typically implying referral for echocardiography) and for 'ruling out' heart failure--creating a grey zone--resulted in insufficient proportions of patients
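The c-statistic used to compare the nested diagnostic models can be computed directly as the proportion of concordant case/non-case pairs; this toy sketch (with made-up scores, not the study's data) shows why adding an informative marker raises it:

```python
def c_statistic(labels, scores):
    """Concordance (c-)statistic: probability that a randomly chosen
    case (label 1) receives a higher model score than a randomly
    chosen non-case (label 0); ties count as 0.5."""
    pairs = concordant = 0.0
    for li, si in zip(labels, scores):
        for lj, sj in zip(labels, scores):
            if li == 1 and lj == 0:
                pairs += 1
                if si > sj:
                    concordant += 1
                elif si == sj:
                    concordant += 0.5
    return concordant / pairs

# hypothetical predicted probabilities from a "clinical" model
# and a "clinical plus marker" model for the same five patients
labels = [1, 1, 0, 0, 0]
clinical = [0.6, 0.4, 0.5, 0.3, 0.2]
with_bnp = [0.8, 0.7, 0.5, 0.3, 0.2]
```

Here the marker-augmented scores separate cases from non-cases perfectly, so the c-statistic rises, mirroring the 0.79 to 0.92 improvement reported above.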
National Hydrography Dataset Plus (NHDPlus)
The NHDPlus Version 1.0 is an integrated suite of application-ready geospatial datasets that incorporate many of the best features of the National Hydrography Dataset (NHD) and the National Elevation Dataset (NED). The NHDPlus includes a stream network (based on the 1:100,000-scale NHD), improved networking, naming, and value-added attributes (VAAs). NHDPlus also includes elevation-derived catchments (drainage areas) produced using a drainage-enforcement technique first broadly applied in New England, and thus dubbed the New England Method. This technique involves burning in the 1:100,000-scale NHD and, when available, building walls using the national Watershed Boundary Dataset (WBD). The resulting modified digital elevation model (HydroDEM) is used to produce hydrologic derivatives that agree with the NHD and WBD. An interdisciplinary team from the U.S. Geological Survey (USGS), U.S. Environmental Protection Agency (USEPA), and contractors has, over the last two years, found this method to produce the best quality NHD catchments using an automated process. The VAAs include greatly enhanced capabilities for upstream and downstream navigation, analysis and modeling. Examples include: retrieve all flowlines (predominantly confluence-to-confluence stream segments) and catchments upstream of a given flowline using queries rather than by slower flowline-by-flowline navigation; retrieve flowlines by stream order; subset a stream level path sorted in hydrologic order for st
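The upstream-navigation idea can be illustrated with a toy traversal; the VAAs mentioned above effectively precompute this walk so that the same question becomes a simple attribute query (the network below is invented for illustration):

```python
def upstream_flowlines(to_from, start):
    """Collect all flowlines upstream of `start` by walking the
    downstream-to-upstream adjacency (flowline-by-flowline navigation;
    NHDPlus VAAs let the same query run as a set lookup instead)."""
    found, stack = set(), [start]
    while stack:
        fl = stack.pop()
        for up in to_from.get(fl, []):
            if up not in found:
                found.add(up)
                stack.append(up)
    return found

# toy network: flowline -> flowlines immediately upstream of it
network = {"outlet": ["a", "b"], "a": ["c"], "b": [], "c": []}
```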
Outlier Removal in Model-Based Missing Value Imputation for Medical Datasets.
Huang, Min-Wei; Lin, Wei-Chao; Tsai, Chih-Fong
2018-01-01
Many real-world medical datasets contain some proportion of missing (attribute) values. In general, missing value imputation can be performed to solve this problem, which is to provide estimations for the missing values by a reasoning process based on the (complete) observed data. However, if the observed data contain some noisy information or outliers, the estimations of the missing values may not be reliable or may even be quite different from the real values. The aim of this paper is to examine whether a combination of instance selection from the observed data and missing value imputation offers better performance than performing missing value imputation alone. In particular, three instance selection algorithms, DROP3, GA, and IB3, and three imputation algorithms, KNNI, MLP, and SVM, are used in order to identify the best combination. The experimental results show that performing instance selection can have a positive impact on missing value imputation over the numerical data type of medical datasets, and specific combinations of instance selection and imputation methods can improve the imputation results over the mixed data type of medical datasets. However, instance selection does not have a definitely positive impact on the imputation result for categorical medical datasets.
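Of the imputation methods named, KNNI is the simplest to sketch; the following minimal version (with an invented toy dataset) also hints at why instance selection matters, since an outlier left in the observed data can distort the neighbour pool:

```python
import math

def euclidean(a, b, idxs):
    """Distance between two rows over the given feature indices."""
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in idxs))

def knn_impute(row, complete_rows, k=2):
    """Fill missing entries (None) with the mean of the k nearest
    complete neighbours, measured on the observed features only."""
    observed = [i for i, v in enumerate(row) if v is not None]
    missing = [i for i, v in enumerate(row) if v is None]
    neighbours = sorted(
        complete_rows, key=lambda r: euclidean(row, r, observed))[:k]
    filled = list(row)
    for i in missing:
        filled[i] = sum(r[i] for r in neighbours) / k
    return filled

complete = [[1.0, 2.0], [1.2, 2.2], [9.0, 9.0]]  # last row: an outlier
# instance selection would drop [9.0, 9.0] before imputation; here the
# outlier is simply too far away to enter the k=2 neighbour pool
row = [1.1, None]
```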
Chen, Yougui; Thiyam-Hollander, Usha; Barthet, Veronique J; Aachary, Ayyappan A
2014-10-08
Valuable phenolic antioxidants are lost during oil refining, but evaluation of their occurrence in refining byproducts is lacking. Rapeseed and canola oils are both rich sources of sinapic acid derivatives and tocopherols. The retention and loss of sinapic acid derivatives and tocopherols in commercially produced expeller-pressed canola oils subjected to various refining steps and the respective byproducts were investigated. Losses of canolol (3) and tocopherols were observed during bleaching (84.9%) and deodorization (37.6%), respectively. Sinapic acid (2) (42.9 μg/g), sinapine (1) (199 μg/g), and canolol (344 μg/g) were found in the refining byproducts, namely, soap stock, spent bleaching clay, and wash water, for the first time. Tocopherols (3.75 mg/g) and other nonidentified phenolic compounds (2.7 mg sinapic acid equivalent/g) were found in deodistillates, a byproduct of deodorization. DPPH radical scavenging confirmed the antioxidant potential of the byproducts. This study confirms the value-added potential of byproducts of refining as sources of endogenous phenolics.
Microbial production of value-added nutraceuticals.
Wang, Jian; Guleria, Sanjay; Koffas, Mattheos Ag; Yan, Yajun
2016-02-01
Nutraceuticals are important natural bioactive compounds that confer health-promoting and medical benefits to humans. Globally growing demands for value-added nutraceuticals for prevention and treatment of human diseases have rendered nutraceuticals a multi-billion dollar market. However, supply limitations and extraction difficulties from natural sources such as plants, animals or fungi restrict the large-scale use of nutraceuticals. Metabolic engineering via microbial production platforms has been advanced as an eco-friendly alternative approach for production of value-added nutraceuticals from simple carbon sources. Microbial platforms like the most widely used Escherichia coli and Saccharomyces cerevisiae have been engineered as versatile cell factories for production of diverse and complex value-added chemicals such as phytochemicals, prebiotics, polysaccharides and poly-amino acids. This review highlights the recent progress in biological production of value-added nutraceuticals via metabolic engineering approaches. Copyright © 2015 Elsevier Ltd. All rights reserved.
Georeferencing UAS Derivatives Through Point Cloud Registration with Archived Lidar Datasets
NASA Astrophysics Data System (ADS)
Magtalas, M. S. L. Y.; Aves, J. C. L.; Blanco, A. C.
2016-10-01
Georeferencing gathered images is a common step before performing spatial analysis and other processes on datasets acquired using unmanned aerial systems (UAS). Methods of applying spatial information to aerial images or their derivatives are onboard GPS (Global Positioning System) geotagging, or tying models to GCPs (Ground Control Points) acquired in the field. Currently, UAS derivatives are limited to meter levels of accuracy when their generation is unaided by points of known position on the ground. The use of ground control points established using survey-grade GPS or GNSS receivers can greatly reduce model errors to centimeter levels. However, this comes with additional costs, not only in instrument acquisition and survey operations but also in actual time spent in the field. This study uses a workflow for cloud-based post-processing of UAS data in combination with already existing LiDAR data. The georeferencing of the UAV point cloud is executed using the Iterative Closest Point (ICP) algorithm. It is applied through the open-source CloudCompare software (Girardeau-Montaut, 2006) on a 'skeleton point cloud'. This skeleton point cloud consists of manually extracted features consistent in both the LiDAR and UAV data. For this cloud, roads and buildings with minimal deviations given their differing dates of acquisition are considered consistent. Transformation parameters are computed for the skeleton cloud, which can then be applied to the whole UAS dataset. In addition, a separate cloud consisting of non-vegetation features automatically derived using the CANUPO classification algorithm (Brodu and Lague, 2012) was used to generate a separate set of parameters. Ground survey was done to validate the transformed cloud. An RMSE value of around 16 centimeters was found when comparing validation data to the models georeferenced using the CANUPO cloud and the manual skeleton cloud. Cloud-to-cloud distance computations of
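Once ICP has estimated transformation parameters on the skeleton cloud, applying them to the full UAS cloud is a plain rigid-body transform; this 2-D sketch (with hypothetical parameter values, not the study's) illustrates the idea, together with the RMSE check used for validation:

```python
import math

def apply_rigid_transform(points, angle_rad, translation):
    """Apply a 2-D rigid-body transform (rotation + translation),
    e.g. as estimated by ICP on a skeleton cloud, to a point cloud."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    tx, ty = translation
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

def rmse(cloud_a, cloud_b):
    """Root-mean-square distance between corresponding points."""
    sq = [(ax - bx) ** 2 + (ay - by) ** 2
          for (ax, ay), (bx, by) in zip(cloud_a, cloud_b)]
    return math.sqrt(sum(sq) / len(sq))

uav = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
# hypothetical parameters "estimated" on the skeleton cloud
shifted = apply_rigid_transform(uav, 0.0, (2.0, 3.0))
```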
Evans, Nick
2016-07-01
Essential facts Leading Change, Adding Value is NHS England's new nursing and midwifery framework. It builds on Compassion in Practice (CiP), which set out the 6Cs. While CiP established the values of nursing and midwifery, the new framework explains how staff can help transform the health and care sectors to meet the aims of NHS England's Five Year Forward View.
Analysis of plant-derived miRNAs in animal small RNA datasets
2012-01-01
Background: Plants contain significant quantities of small RNAs (sRNAs) derived from various sRNA biogenesis pathways. Many of these sRNAs play regulatory roles in plants. Previous analysis revealed that numerous sRNAs in corn, rice and soybean seeds have high sequence similarity to animal genes. However, exogenous RNA is considered to be unstable within the gastrointestinal tract of many animals, thus limiting potential for any adverse effects from consumption of dietary RNA. A recent paper reported that putative plant miRNAs were detected in animal plasma and serum, presumably acquired through ingestion, and may have a functional impact in the consuming organisms. Results: To address the question of how common this phenomenon could be, we searched for plant miRNA sequences in public sRNA datasets from various tissues of mammals, chicken and insects. Our analyses revealed that plant miRNAs were present in the animal sRNA datasets, and, significantly, miR168 was extremely over-represented. Furthermore, all or nearly all (>96%) miR168 sequences were monocot derived for most datasets, including datasets for two insects reared on dicot plants in their respective experiments. To investigate if plant-derived miRNAs, including miR168, could accumulate and move systemically in insects, we conducted insect feeding studies for three insects including corn rootworm, which has been shown to be responsive to plant-produced long double-stranded RNAs. Conclusions: Our analyses suggest that the observed plant miRNAs in animal sRNA datasets can originate in the process of sequencing, and that accumulation of plant miRNAs via dietary exposure is not universal in animals. PMID:22873950
Evaluating Teachers: The Important Role of Value-Added
ERIC Educational Resources Information Center
Glazerman, Steven; Loeb, Susanna; Goldhaber, Dan; Staiger, Douglas; Raudenbush, Stephen; Whitehurst, Grover
2010-01-01
The evaluation of teachers based on the contribution they make to the learning of their students, "value-added", is an increasingly popular but controversial education reform policy. In this report, the authors highlight and try to clarify four areas of confusion about value-added. The first is between value-added information and the…
Selecting Value-Added Models for Postsecondary Institutional Assessment
ERIC Educational Resources Information Center
Steedle, Jeffrey T.
2012-01-01
Value-added scores from tests of college learning indicate how score gains compare to those expected from students of similar entering academic ability. Unfortunately, the choice of value-added model can impact results, and this makes it difficult to determine which results to trust. The research presented here demonstrates how value-added models…
NASA Astrophysics Data System (ADS)
di Luca, Alejandro; de Elía, Ramón; Laprise, René
2012-03-01
Regional Climate Models (RCMs) constitute the most often used method to perform affordable high-resolution regional climate simulations. The key issue in the evaluation of nested regional models is to determine whether RCM simulations improve the representation of climatic statistics compared to the driving data, that is, whether RCMs add value. In this study we examine a necessary condition that some climate statistics derived from the precipitation field must satisfy in order that the RCM technique can generate some added value: we focus on whether the climate statistics of interest contain some fine spatial-scale variability that would be absent on a coarser grid. The presence and magnitude of fine-scale precipitation variance required to adequately describe a given climate statistics will then be used to quantify the potential added value (PAV) of RCMs. Our results show that the PAV of RCMs is much higher for short temporal scales (e.g., 3-hourly data) than for long temporal scales (16-day average data) due to the filtering resulting from the time-averaging process. PAV is higher in warm season compared to cold season due to the higher proportion of precipitation falling from small-scale weather systems in the warm season. In regions of complex topography, the orographic forcing induces an extra component of PAV, no matter the season or the temporal scale considered. The PAV is also estimated using high-resolution datasets based on observations allowing the evaluation of the sensitivity of changing resolution in the real climate system. The results show that RCMs tend to reproduce relatively well the PAV compared to observations although showing an overestimation of the PAV in warm season and mountainous regions.
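The potential-added-value idea, i.e. the share of a statistic's variance carried by scales absent from the coarse grid, can be caricatured in one dimension by block-averaging a field and measuring the variance lost; this is a simplified stand-in for the authors' scale decomposition, not their method:

```python
def block_average(field, k):
    """Coarse-grain a 1-D field by averaging non-overlapping blocks of k."""
    return [sum(field[i:i + k]) / k for i in range(0, len(field), k)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def potential_added_value(field, k):
    """Share of the field's variance carried by scales finer than the
    coarse grid: the variance removed by block averaging, as a fraction
    of the total variance."""
    coarse = block_average(field, k)
    upsampled = [v for v in coarse for _ in range(k)]
    return (variance(field) - variance(upsampled)) / variance(field)

# toy precipitation field dominated by fine-scale showers
precip = [0.0, 8.0, 1.0, 7.0, 0.0, 6.0, 2.0, 8.0]
```

A field dominated by grid-scale showers loses most of its variance under coarse-graining (high PAV), while a smooth large-scale field loses none, which is the sense in which short-timescale, convective precipitation offers RCMs the most potential added value.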
Patel, Samir
2015-03-01
Health care is in a state of transition, shifting from volume-based success to value-based success. Hospital executives and referring physicians often do not understand the total value a radiology group provides. A template for easy, cost-effective implementation in clinical practice for most radiology groups to demonstrate the value they provide to their clients (patients, physicians, health care executives) has not been well described. A value management program was developed to document all of the value-added activities performed by on-site radiologists, quantify them in terms of time spent on each activity (investment), and present the benefits to internal and external stakeholders (outcomes). The radiology value-added matrix is the platform from which value-added activities are categorized and synthesized into a template for defining investments and outcomes. The value management program was first implemented systemwide in 2013. Across all serviced locations, 9,931.75 hours were invested. An annual executive summary report template demonstrating outcomes is given to clients. The mean and median individual value-added hours per radiologist were 134.52 and 113.33, respectively. If this program were extrapolated to the entire field of radiology, approximately 30,000 radiologists, this would have resulted in 10,641,161 uncompensated value-added hours documented in 2013, with an estimated economic value of $2.21 billion. Copyright © 2015 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Northern Hemisphere winter storm track trends since 1959 derived from multiple reanalysis datasets
NASA Astrophysics Data System (ADS)
Chang, Edmund K. M.; Yau, Albert M. W.
2016-09-01
In this study, a comprehensive comparison of Northern Hemisphere winter storm track trends since 1959 derived from multiple reanalysis datasets and rawinsonde observations has been conducted. In addition, trends in terms of variance and cyclone track statistics have been compared. Previous studies, based largely on the National Centers for Environmental Prediction-National Center for Atmospheric Research Reanalysis (NNR), have suggested that both the Pacific and Atlantic storm tracks significantly intensified between the 1950s and 1990s. Comparison with trends derived from rawinsonde observations suggests that the trends derived from NNR are significantly biased high, while those from the European Centre for Medium-Range Weather Forecasts 40-year Reanalysis and the Japanese 55-year Reanalysis are much less biased but still too high. Those from the two twentieth century reanalysis datasets are most consistent with observations but may exhibit slight biases of opposite signs. Between 1959 and 2010, Pacific storm track activity has likely increased by 10 % or more, while Atlantic storm track activity has likely increased by <10 %. Our analysis suggests that trends in Pacific and Atlantic basin wide storm track activity prior to the 1950s derived from the two twentieth century reanalysis datasets are unlikely to be reliable due to changes in density of surface observations. Nevertheless, these datasets may provide useful information on interannual variability, especially over the Atlantic.
Beyond Test Scores: Adding Value to Assessment
ERIC Educational Resources Information Center
Rothman, Robert
2010-01-01
At a time when teacher quality has emerged as a key factor in student learning, a statistical technique that determines the "value added" that teachers bring to student achievement is getting new scrutiny. Value-added measures compare students' growth in achievement to their expected growth, based on prior achievement and demographic…
Vereecken, H; Vanderborght, J; Kasteel, R; Spiteller, M; Schäffer, A; Close, M
2011-01-01
In this study, we analyzed sorption parameters for pesticides that were derived from batch and column or batch and field experiments. The batch experiments analyzed in this study were run with the same pesticide and soil as in the column and field experiments. We analyzed the relationship between the pore water velocity of the column and field experiments, solute residence times, and sorption parameters, such as the organic carbon normalized distribution coefficient (Koc) and the mass exchange coefficient in kinetic models, as well as the predictability of sorption parameters from basic soil properties. The batch/column analysis included 38 studies with a total of 139 observations. The batch/field analysis included five studies, resulting in a dataset of 24 observations. For the batch/column data, power-law relationships between pore water velocity, residence time, and sorption constants were derived. The unexplained variability in these equations was reduced by taking into account the saturation status and the packing status (disturbed-undisturbed) of the soil sample. A new regression equation was derived that allows estimating the Koc values derived from column experiments using organic matter and bulk density, with an R2 value of 0.56. Regression analysis of the batch/column data showed that the relationship between batch- and column-derived Koc values depends on the saturation status and packing of the soil column. Analysis of the batch/field data showed that as the batch-derived Koc value becomes larger, field-derived Koc values tend to be lower than the corresponding batch-derived values, and vice versa. The present dataset also showed that the variability in the ratio of batch- to column-derived Koc values increases with increasing pore water velocity, with a maximum value approaching 3.5. American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America.
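The power-law relationships mentioned above are typically fitted by ordinary least squares in log-log space; this sketch (on synthetic data, not the study's) shows the standard recipe:

```python
import math

def fit_power_law(x, y):
    """Fit y = a * x**b by ordinary least squares in log-log space,
    the usual form for velocity/residence-time vs. sorption relations."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / \
        sum((u - mx) ** 2 for u in lx)
    a = math.exp(my - b * mx)
    return a, b

# synthetic data following y = 2 * x**0.5 exactly
xs = [1.0, 4.0, 9.0, 16.0]
ys = [2.0 * v ** 0.5 for v in xs]
a, b = fit_power_law(xs, ys)
```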
ERIC Educational Resources Information Center
Imberman, Scott; Lovenheim, Michael F.
2015-01-01
Value-added data have become an increasingly common evaluation tool for schools and teachers. Many school districts have begun to adopt these methods and have released results publicly. In this paper, we use the unique public release of value-added data in Los Angeles to identify how this measure of school quality is capitalized into housing…
Value added medicines: what value repurposed medicines might bring to society?
Toumi, Mondher; Rémuzat, Cécile
2017-01-01
Background & objectives : Despite the wide interest surrounding drug repurposing, no common terminology has yet been agreed for these products, and their full potential value is not always recognised and rewarded, creating a disincentive for further development. The objectives of the present study were to assess from a wide perspective what value drug repurposing might bring to society, but also to identify key obstacles to adoption of these medicines and to discuss policy recommendations. Methods : A preliminary comprehensive search was conducted to assess how the concept of drug repurposing was described in the literature. Following completion of the literature review, primary research was conducted to obtain the perspectives of various stakeholders across EU member states on drug repurposing (healthcare professionals, regulatory authorities and Health Technology Assessment (HTA) bodies/payers, patients, and representatives of the pharmaceutical industry developing medicines in this field). An ad hoc literature review was performed to illustrate, when appropriate, statements of the various stakeholders. Results : Various nomenclatures have been used to describe the concept of drug repurposing in the literature, with more or less broad definitions either based on outcomes, processes, or a mix of both. In this context, Medicines for Europe (http://www.medicinesforeurope.com/value-added-medicines/) established a single terminology for these medicines, known as value added medicines, defined as 'medicines based on known molecules that address healthcare needs and deliver relevant improvements for patients, healthcare professionals and/or payers'. Stakeholder interviews highlighted three main potential benefits for value added medicines: (1) to address a number of medicine-related healthcare inefficiencies related to irrational use of medicines, non-availability of appropriate treatment options, shortage of mature products, geographical inequity in medicine access
Evans, Nick
2016-09-12
Essential facts Leading Change, Adding Value is NHS England's new nursing and midwifery framework. It is designed to build on Compassion in Practice (CiP), which was published three years earlier and set out the 6Cs: compassion, care, commitment, courage, competence and communication. CiP established the values at the heart of nursing and midwifery, while the new framework sets out how staff can help transform the health and care sectors to meet the aims of NHS England's Five Year Forward View.
NASA Astrophysics Data System (ADS)
Tsontos, V. M.; Huang, T.; Holt, B.
2015-12-01
The earth science enterprise increasingly relies on the integration and synthesis of multivariate datasets from diverse observational platforms. NASA's ocean salinity missions, that include Aquarius/SAC-D and the SPURS (Salinity Processes in the Upper Ocean Regional Study) field campaign, illustrate the value of integrated observations in support of studies on ocean circulation, the water cycle, and climate. However, the inherent heterogeneity of resulting data and the disparate, distributed systems that serve them complicates their effective utilization for both earth science research and applications. Key technical interoperability challenges include adherence to metadata and data format standards that are particularly acute for in-situ data and the lack of a unified metadata model facilitating archival and integration of both satellite and oceanographic field datasets. Here we report on efforts at the PO.DAAC, NASA's physical oceanographic data center, to extend our data management and distribution support capabilities for field campaign datasets such as those from SPURS. We also discuss value-added services, based on the integration of satellite and in-situ datasets, which are under development with a particular focus on DOMS. The distributed oceanographic matchup service (DOMS) implements a portable technical infrastructure and associated web services that will be broadly accessible via the PO.DAAC for the dynamic collocation of satellite and in-situ data, hosted by distributed data providers, in support of mission cal/val, science and operational applications.
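The core of a matchup service such as DOMS is a spatio-temporal collocation search: for each in-situ observation, find the nearest satellite pixel within a tolerance radius. A minimal sketch (synthetic points and a naive brute-force search, not DOMS's actual distributed infrastructure):

```python
import math

# Toy in-situ observations and satellite pixels: (lat, lon, salinity).
insitu = [(10.0, 20.0, 35.1), (10.5, 21.0, 35.4)]
satellite = [(10.02, 20.01, 35.0), (10.51, 20.99, 35.5), (12.0, 25.0, 34.0)]

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def matchups(insitu, satellite, radius_km=10.0):
    """Pair each in-situ point with the nearest satellite pixel in range."""
    pairs = []
    for obs in insitu:
        best = min(satellite, key=lambda px: haversine_km(obs, px))
        if haversine_km(obs, best) <= radius_km:
            pairs.append((obs, best))
    return pairs

pairs = matchups(insitu, satellite)
print(len(pairs))
```

A production service would add a time window, spatial indexing, and web-service endpoints on top of this basic pairing logic.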
2 CFR 200.470 - Taxes (including Value Added Tax).
Code of Federal Regulations, 2014 CFR
2014-01-01
... 2 Grants and Agreements 1 2014-01-01 2014-01-01 false Taxes (including Value Added Tax). 200.470... Cost § 200.470 Taxes (including Value Added Tax). (a) For states, local governments and Indian tribes... Federal government for the taxes, interest, and penalties. (c) Value Added Tax (VAT) Foreign taxes charged...
Scrubchem: Building Bioactivity Datasets from Pubchem ...
The PubChem Bioassay database is a non-curated public repository with data from 64 sources, including: ChEMBL, BindingDb, DrugBank, EPA Tox21, NIH Molecular Libraries Screening Program, and various other academic, government, and industrial contributors. Methods for extracting this public data into quality datasets, usable for analytical research, present several big-data challenges for which we have designed manageable solutions. According to our preliminary work, there are approximately 549 million bioactivity values and related meta-data within PubChem that can be mapped to over 10,000 biological targets. However, this data is not ready for use in data-driven research, mainly due to lack of structured annotations. We used a pragmatic approach that provides increasing access to bioactivity values in the PubChem Bioassay database. This included restructuring of individual PubChem Bioassay files into a relational database (ScrubChem). ScrubChem contains all primary PubChem Bioassay data that was: reparsed; error-corrected (when applicable); enriched with additional data links from other NCBI databases; and improved by adding key biological and assay annotations derived from logic-based language processing rules. The utility of ScrubChem and the curation process were illustrated using an example bioactivity dataset for the androgen receptor protein. This initial work serves as a trial ground for establishing the technical framework for accessing, integrating, cu
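The restructuring step described above — per-assay files loaded into a relational store so that target-specific datasets can be pulled with a query — can be sketched in miniature with sqlite3. The table and column names here are hypothetical, not ScrubChem's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE bioactivity (
    aid INTEGER, cid INTEGER, target TEXT, value REAL, outcome TEXT)""")

# Toy records standing in for reparsed PubChem Bioassay rows.
rows = [
    (1, 101, "androgen receptor", 0.5, "Active"),
    (1, 102, "androgen receptor", 30.0, "Inactive"),
    (2, 101, "estrogen receptor", 1.2, "Active"),
]
conn.executemany("INSERT INTO bioactivity VALUES (?, ?, ?, ?, ?)", rows)

# Typical dataset-building query: all actives for one biological target.
actives = conn.execute(
    "SELECT cid, value FROM bioactivity "
    "WHERE target = ? AND outcome = 'Active'",
    ("androgen receptor",)).fetchall()
print(actives)
```

Once the data are relational, the annotation and error-correction passes the abstract describes become ordinary UPDATE and JOIN operations rather than file-by-file parsing.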
Conducting high-value secondary dataset analysis: an introductory guide and resources.
Smith, Alexander K; Ayanian, John Z; Covinsky, Kenneth E; Landon, Bruce E; McCarthy, Ellen P; Wee, Christina C; Steinman, Michael A
2011-08-01
Secondary analyses of large datasets provide a mechanism for researchers to address high impact questions that would otherwise be prohibitively expensive and time-consuming to study. This paper presents a guide to assist investigators interested in conducting secondary data analysis, including advice on the process of successful secondary data analysis as well as a brief summary of high-value datasets and online resources for researchers, including the SGIM dataset compendium ( www.sgim.org/go/datasets ). The same basic research principles that apply to primary data analysis apply to secondary data analysis, including the development of a clear and clinically relevant research question, study sample, appropriate measures, and a thoughtful analytic approach. A real-world case description illustrates key steps: (1) define your research topic and question; (2) select a dataset; (3) get to know your dataset; and (4) structure your analysis and presentation of findings in a way that is clinically meaningful. Secondary dataset analysis is a well-established methodology. Secondary analysis is particularly valuable for junior investigators, who have limited time and resources to demonstrate expertise and productivity.
ERIC Educational Resources Information Center
Koedel, Cory; Betts, Julian
2009-01-01
Value-added measures of teacher quality may be sensitive to the quantitative properties of the student tests upon which they are based. This paper focuses on the sensitivity of value-added to test-score-ceiling effects. Test-score ceilings are increasingly common in testing instruments across the country as education policy continues to emphasize…
Can Value Added Add Value to Teacher Evaluation?
ERIC Educational Resources Information Center
Darling-Hammond, Linda
2015-01-01
The five thoughtful papers included in this issue of "Educational Researcher" ("ER") raise new questions about the use of value-added methods (VAMs) to estimate teachers' contributions to students' learning as part of personnel evaluation. The papers address both technical and implementation concerns, considering potential…
Value-Added Models for the Pittsburgh Public Schools
ERIC Educational Resources Information Center
Johnson, Matthew; Lipscomb, Stephen; Gill, Brian; Booker, Kevin; Bruch, Julie
2012-01-01
At the request of Pittsburgh Public Schools (PPS) and the Pittsburgh Federation of Teachers (PFT), Mathematica has developed value-added models (VAMs) that aim to estimate the contributions of individual teachers, teams of teachers, and schools to the achievement growth of their students. The authors' work in estimating value-added in Pittsburgh…
Western hardwoods : value-added research and demonstration program
D. W. Green; W. W. Von Segen; S. A. Willits
1995-01-01
Research results from the value-added research and demonstration program for western hardwoods are summarized in this report. The intent of the program was to enhance the economy of the Pacific Northwest by helping local communities and forest industries produce wood products more efficiently. Emphasis was given to value-added products and barriers to increased...
On the added value of WUDAPT for Urban Climate Modelling
NASA Astrophysics Data System (ADS)
Brousse, Oscar; Martilli, Alberto; Mills, Gerald; Bechtel, Benjamin; Hammerberg, Kris; Demuzere, Matthias; Wouters, Hendrik; Van Lipzig, Nicole; Ren, Chao; Feddema, Johannes J.; Masson, Valéry; Ching, Jason
2017-04-01
Over half of the planet's population now lives in cities, a share expected to grow to 65% by 2050 (United Nations, 2014), with most of that growth occurring in the emerging cities of the global South. Cities' impact on climate is known to be a key driver of environmental change (IPCC, 2014) and has been studied for decades (Howard, 1875). Still, very little is known about the structure of cities around the world, preventing urban climate simulations from being run and hence guidance from being provided for mitigation. Addressing the need to bridge this urban knowledge gap for urban climate modelling, the World Urban Database and Access Portal Tool - WUDAPT - project (Ching et al., 2015; Mills et al., 2015) developed an innovative technique to map cities rapidly, globally, and freely. The framework established by Bechtel and Daneke (2012) derives Local Climate Zone (Stewart and Oke, 2012) city maps from LANDSAT 8 OLI-TIRS imagery (Bechtel et al., 2015) through supervised classification with a Random Forest algorithm (Breiman, 2001). The first attempt to implement Local Climate Zones (LCZ) from the WUDAPT product within a major climate model was carried out by Brousse et al. (2016) over Madrid, Spain. This study proved the applicability of LCZs as an enhanced urban parameterization within the WRF model (Chen et al. 2011) employing the urban canopy model BEP-BEM (Martilli, 2002; Salamanca et al., 2010), using the averaged values of the morphological and physical parameter ranges proposed by Stewart and Oke (2012). Other studies have since used Local Climate Zones for urban climate modelling purposes (Alexander et al., 2016; Wouters et al. 2016; Hammerberg et al., 2017; Brousse et al., 2017) and demonstrated the added value of the WUDAPT dataset. As urban data accessibility is one of the major challenges for simulations in emerging countries, this presentation will show results of simulations using LCZs and the capacity of the WUDAPT framework to be
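The classification step of the WUDAPT workflow — a Random Forest trained on spectral features to predict LCZ labels — can be sketched as follows. The features and labels are synthetic stand-ins for LANDSAT 8 band values and LCZ classes, not real imagery:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic "pixels": 6 spectral features per pixel, 3 LCZ-like classes
# whose feature means differ (stand-ins for real LANDSAT 8 bands).
n_per_class, n_bands = 100, 6
X = np.vstack([rng.normal(loc=mu, scale=0.5, size=(n_per_class, n_bands))
               for mu in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], n_per_class)   # hypothetical LCZ labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)                 # in WUDAPT, training pixels come from
acc = clf.score(X, y)         # manually digitised training areas
print(acc)
```

In the actual workflow, accuracy would be assessed on held-out validation polygons rather than the training pixels used here.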
Robustness of Value-Added Analysis of School Effectiveness. Research Report. ETS RR-08-22
ERIC Educational Resources Information Center
Braun, Henry; Qu, Yanxuan
2008-01-01
This paper reports on a study conducted to investigate the consistency of the results between 2 approaches to estimating school effectiveness through value-added modeling. Estimates of school effects from the layered model employing item response theory (IRT) scaled data are compared to estimates derived from a discrete growth model based on the…
The Disaggregation of Value-Added Test Scores to Assess Learning Outcomes in Economics Courses
ERIC Educational Resources Information Center
Walstad, William B.; Wagner, Jamie
2016-01-01
This study disaggregates posttest, pretest, and value-added or difference scores in economics into four types of economic learning: positive, retained, negative, and zero. The types are derived from patterns of student responses to individual items on a multiple-choice test. The micro and macro data from the "Test of Understanding in College…
AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku
2014-05-27
The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets, which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators. The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting
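The pre-processing choices the abstract describes (dropping variables absent in >50% of records, then complete-case analysis versus mean-based imputation) can be illustrated in a few lines. The data below are synthetic; only the 50% threshold follows the abstract:

```python
import numpy as np

# Synthetic records with missing values encoded as NaN.
data = np.array([
    [1.0, 2.0,    np.nan],
    [2.0, np.nan, np.nan],
    [3.0, 4.0,    np.nan],
    [4.0, 5.0,    1.0],
])

# Step 1: drop variables absent in >50% of records.
missing_frac = np.isnan(data).mean(axis=0)
kept = data[:, missing_frac <= 0.5]        # third column is dropped

# Step 2a: complete-case analysis -- keep only fully observed records.
complete_cases = kept[~np.isnan(kept).any(axis=1)]

# Step 2b: mean-based imputation -- fill gaps with the column mean.
imputed = kept.copy()
col_means = np.nanmean(imputed, axis=0)
idx = np.where(np.isnan(imputed))
imputed[idx] = col_means[idx[1]]

print(kept.shape, complete_cases.shape, imputed.shape)
```

Model development would then proceed on each adjusted dataset separately, so the two missing-value strategies can be compared on predictive performance.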
Adding value to laboratory medicine: a professional responsibility.
Beastall, Graham H
2013-01-01
Laboratory medicine is a medical specialty at the centre of healthcare. When used optimally laboratory medicine generates knowledge that can facilitate patient safety, improve patient outcomes, shorten patient journeys and lead to more cost-effective healthcare. Optimal use of laboratory medicine relies on dynamic and authoritative leadership outside as well as inside the laboratory. The first responsibility of the head of a clinical laboratory is to ensure the provision of a high quality service across a wide range of parameters culminating in laboratory accreditation against an international standard, such as ISO 15189. From that essential baseline the leadership of laboratory medicine at local, national and international level needs to 'add value' to ensure the optimal delivery, use, development and evaluation of the services provided for individuals and for groups of patients. A convenient tool to illustrate added value is use of the mnemonic 'SCIENCE'. This tool allows added value to be considered in seven domains: standardisation and harmonisation; clinical effectiveness; innovation; evidence-based practice; novel applications; cost-effectiveness; and education of others. The assessment of added value in laboratory medicine may be considered against a framework that comprises three dimensions: operational efficiency; patient management; and patient behaviours. The profession and the patient will benefit from sharing examples of adding value to laboratory medicine.
Penicillium roqueforti: a multifunctional cell factory of high value-added molecules.
Mioso, R; Toledo Marante, F J; Herrera Bravo de Laguna, I
2015-04-01
This is a comprehensive review, with 114 references, of the chemical diversity found in the fungus Penicillium roqueforti. Secondary metabolites of an alkaloidal nature are described, for example, ergot alkaloids such as festuclavine, isofumigaclavines A and B, and diketopiperazine alkaloids such as roquefortines A-D, which are derived from imidazole. Other metabolites are marcfortines A-C, PR-toxin, eremofortines A-E, mycophenolic and penicillic acids, and some γ-lactones. Also, recent developments related to the structural characteristics of botryodiplodin and andrastin are studied-the latter has anticancer properties. Finally, we discuss the enzymes of P. roqueforti, which can participate in the biotechnological production of high value-added molecules, as well as the use of secondary metabolite profiles for taxonomic purposes. © 2014 The Society for Applied Microbiology.
What's the Value of VAM (Value-Added Modeling)?
ERIC Educational Resources Information Center
Scherrer, Jimmy
2012-01-01
The use of value-added modeling (VAM) in school accountability is expanding, but deciding how to embrace VAM is difficult. Various experts say it's too unreliable, causes more harm than good, and has a big margin for error. Others assert VAM is imperfect but useful, and provides valuable feedback. A closer look at the models, and their use,…
Adding Value to Indiana's Commodities.
ERIC Educational Resources Information Center
Welch, Mary A., Ed.
1995-01-01
Food processing plants are adding value to bulk and intermediate products to sell overseas. The Asian Pacific Rim economies constituted the largest market for consumer food products in 1993. This shift toward consumer food imports in this area is due to more women working outside the home, the internationalization of populations, and dramatic…
DOE Office of Scientific and Technical Information (OSTI.GOV)
KL Gaustad; DD Turner
2007-09-30
This report provides a short description of the Atmospheric Radiation Measurement (ARM) microwave radiometer (MWR) RETrieval (MWRRET) Value-Added Product (VAP) algorithm. This algorithm utilizes complementary physical and statistical retrieval methods and applies brightness temperature offsets to reduce spurious liquid water path (LWP) bias in clear skies, resulting in significantly improved precipitable water vapor (PWV) and LWP retrievals. We present a general overview of the technique, input parameters, and output products, and describe data quality checks. A more complete discussion of the theory and results is given in Turner et al. (2007b).
ERIC Educational Resources Information Center
Rodgers, Timothy
2007-01-01
The 2003 UK higher education White Paper suggested that the sector needed to re-examine the potential of the value added concept. This paper describes a possible methodology for developing a performance indicator based on the economic value added to graduates. The paper examines how an entry-quality-adjusted measure of a graduate's…
ERIC Educational Resources Information Center
Loeb, Susanna
2013-01-01
The question for this brief is whether education leaders can use value-added measures as tools for improving schooling and, if so, how to do this. Districts, states, and schools can, at least in theory, generate gains in educational outcomes for students using value-added measures in three ways: creating information on effective programs, making…
Patient-centered care as value-added service by compounding pharmacists.
McPherson, Timothy B; Fontane, Patrick E; Day, Jonathan R
2013-01-01
The term "value-added" is widely used to describe business and professional services that complement a product or service or that differentiate it from competing products and services. The objective of this study was to determine compounding pharmacists' self-perceptions of the value-added services they provide. A web-based survey method was used. Respondents' perceptions of their most important value-added service frequently fell into one of two categories: (1) enhanced pharmacist contribution to developing and implementing patient therapeutic plans and (2) providing customized medications of high pharmaceutical quality. The results were consistent with a hybrid community clinical practice model for compounding pharmacists wherein personalization of the professional relationship is the value-added characteristic.
ERIC Educational Resources Information Center
Harris, Douglas N.; Anderson, Andrew
2013-01-01
There is a growing body of research on the validity and reliability of value-added measures, but most of this research has focused on elementary grades. Driven by several federal initiatives such as Race to the Top, Teacher Incentive Fund, and ESEA waivers, however, many states have incorporated value-added measures into the evaluations not only…
Implementing Value-Added Measures of School Effectiveness: Getting the Incentives Right.
ERIC Educational Resources Information Center
Ladd, Helen F.; Walsh, Randall P.
2002-01-01
Evaluates value-added approach to measuring school effectiveness in North and South Carolina. Finds that value-added approach favors high-achievement schools, with large percentage of students from high-SES backgrounds. Discusses statistical problems in measuring value added. Concludes teachers' and administrators' avoidance of low-achievement,…
Methodological Concerns about the Education Value-Added Assessment System
ERIC Educational Resources Information Center
Amrein-Beardsley, Audrey
2008-01-01
Value-added models help to evaluate the knowledge that school districts, schools, and teachers add to student learning as students progress through school. In this article, the well-known Education Value-Added Assessment System (EVAAS) is examined. The author presents a practical investigation of the methodological issues associated with the…
Nurse Value-Added and Patient Outcomes in Acute Care
Yakusheva, Olga; Lindrooth, Richard; Weiss, Marianne
2014-01-01
Objective The aims of the study were to (1) estimate the relative nurse effectiveness, or individual nurse value-added (NVA), to patients’ clinical condition change during hospitalization; (2) examine nurse characteristics contributing to NVA; and (3) estimate the contribution of value-added nursing care to patient outcomes. Data Sources/Study Setting Electronic data on 1,203 staff nurses matched with 7,318 adult medical–surgical patients discharged between July 1, 2011 and December 31, 2011 from an urban Magnet-designated, 854-bed teaching hospital. Study Design Retrospective observational longitudinal analysis using a covariate-adjustment value-added model with nurse fixed effects. Data Collection/Extraction Methods Data were extracted from the study hospital's electronic patient records and human resources databases. Principal Findings Nurse effects were jointly significant and explained 7.9 percent of variance in patient clinical condition change during hospitalization. NVA was positively associated with having a baccalaureate degree or higher (0.55, p = .04) and expertise level (0.66, p = .03). NVA contributed to patient outcomes of shorter length of stay and lower costs. Conclusions Nurses differ in their value-added to patient outcomes. The ability to measure individual nurse relative value-added opens the possibility for development of performance metrics, performance-based rankings, and merit-based salary schemes to improve patient outcomes and reduce costs. PMID:25256089
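A covariate-adjustment value-added model with provider fixed effects, in miniature: regress the outcome on patient covariates plus one indicator per nurse, and read the centred nurse coefficients as relative value-added. The data and effect sizes below are synthetic, not from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
nurse = rng.integers(0, 3, n)               # 3 hypothetical nurses
severity = rng.normal(0, 1, n)              # patient covariate (e.g., acuity)
nurse_effect = np.array([0.0, 0.5, -0.5])   # true relative value-added

# Outcome: clinical condition change = covariate effect + nurse effect + noise.
y = -0.8 * severity + nurse_effect[nurse] + rng.normal(0, 0.3, n)

# Design matrix: one dummy per nurse (no intercept) plus the covariate.
X = np.column_stack([(nurse == k).astype(float) for k in range(3)] + [severity])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Estimated nurse fixed effects, centred so they are relative to the mean.
nva = beta[:3] - beta[:3].mean()
print(np.round(nva, 2), round(beta[3], 2))
```

The recovered `nva` values approximate the true relative effects, which is what makes rankings and performance metrics based on them possible in principle.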
48 CFR 252.229-7006 - Value added tax exclusion (United Kingdom).
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Value added tax exclusion... CLAUSES Text of Provisions And Clauses 252.229-7006 Value added tax exclusion (United Kingdom). As prescribed in 229.402-70(f), use the following clause: Value Added Tax Exclusion (United Kingdom) (JUN 1997...
48 CFR 252.229-7006 - Value added tax exclusion (United Kingdom).
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Value added tax exclusion... CLAUSES Text of Provisions And Clauses 252.229-7006 Value added tax exclusion (United Kingdom). As prescribed in 229.402-70(f), use the following clause: Value Added Tax Exclusion (United Kingdom) (JUN 1997...
48 CFR 252.229-7006 - Value Added Tax Exclusion (United Kingdom)
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Value Added Tax Exclusion... CLAUSES Text of Provisions And Clauses 252.229-7006 Value Added Tax Exclusion (United Kingdom) As prescribed in 229.402-70(f), use the follow clause: Value Added Tax Exclusion (United Kingdom) (DEC 2011) The...
48 CFR 252.229-7006 - Value Added Tax Exclusion (United Kingdom)
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Value Added Tax Exclusion... CLAUSES Text of Provisions And Clauses 252.229-7006 Value Added Tax Exclusion (United Kingdom) As prescribed in 229.402-70(f), use the follow clause: Value Added Tax Exclusion (United Kingdom) (DEC 2011) The...
48 CFR 252.229-7006 - Value Added Tax Exclusion (United Kingdom)
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Value Added Tax Exclusion... CLAUSES Text of Provisions And Clauses 252.229-7006 Value Added Tax Exclusion (United Kingdom) As prescribed in 229.402-70(f), use the follow clause: Value Added Tax Exclusion (United Kingdom) (DEC 2011) The...
Value-Added Results for Public Virtual Schools in California
ERIC Educational Resources Information Center
Ford, Richard; Rice, Kerry
2015-01-01
The objective of this paper is to present value-added calculation methods that were applied to determine whether online schools performed at the same or different levels relative to standardized testing. This study includes information on how we approached our value added model development and the results for 32 online public high schools in…
Exogenous Variables and Value-Added Assessments: A Fatal Flaw
ERIC Educational Resources Information Center
Berliner, David C.
2014-01-01
Background: There has been rapid growth in value-added assessment of teachers to meet the widely supported policy goal of identifying the most effective and the most ineffective teachers in a school system. The former group is to be rewarded while the latter group is to be helped or fired for their poor performance. But, value-added approaches to…
Using School Lotteries to Evaluate the Value-Added Model
ERIC Educational Resources Information Center
Deutsch, Jonah
2013-01-01
There has been an active debate in the literature over the validity of value-added models. In this study, the author tests the central assumption of value-added models that school assignment is random relative to expected test scores conditional on prior test scores, demographic variables, and other controls. He uses a Chicago charter school's…
Exploring Value-Added Options - Opportunities in Mouldings and Millwork
Bob Smith; Philip A. Araman
1997-01-01
The millwork industry, which includes manufacture of doors, windows, stair parts, blinds, mouldings, picture frame material, and assorted trim, can be a lucrative value-added opportunity for sawmills. Those entering the value-added millwork market often find that it is a great opportunity to generate greater profits from upper grades and utility species, such as yellow...
NASA Astrophysics Data System (ADS)
Turco, M.; Milelli, M.
2009-09-01
skill scores of two competing forecasts. It is important to underline that the conclusions refer to the analysis of the Piemonte operational alert system, so they cannot be taken as universally true. But we think that some of the main lessons derived from this study could be useful for the meteorological community. In detail, the main conclusions are the following: - despite the overall improvement at global scale and the fact that the resolution of limited area models has increased considerably over recent years, the QPF produced by the meteorological models involved in this study has not improved enough to allow its direct use; that is, the subjective HQPF continues to offer the best performance; - in the forecast process, the step where humans add the largest value with respect to mathematical models is communication. In fact, the human characterisation and communication of forecast uncertainty to end users cannot be replaced by any computer code; - eventually, although there is no novelty in this study, we would like to show that the correct application of appropriate statistical techniques permits a better definition and quantification of the errors and, most importantly, allows correct (unbiased) communication between forecasters and decision makers.
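Comparing two competing precipitation forecasts typically comes down to contingency-table skill scores. A minimal example computing the Equitable Threat Score (ETS), a standard QPF verification metric, from synthetic hit/miss/false-alarm counts (the counts are illustrative, not from the Piemonte study):

```python
def equitable_threat_score(hits, misses, false_alarms, correct_negatives):
    """ETS: threat score adjusted for the hits expected by chance."""
    total = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

# Two hypothetical forecast systems verified over the same events.
ets_model = equitable_threat_score(30, 20, 25, 125)   # direct model QPF
ets_human = equitable_threat_score(38, 12, 20, 130)   # subjective HQPF
print(round(ets_model, 3), round(ets_human, 3))
```

An ETS of 0 means no skill beyond chance and 1 means a perfect forecast, so a higher ETS for the subjective forecast would quantify the human added value the passage describes.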
Consumer preferences and willingness to pay for value-added chicken product attributes.
Martínez Michel, Lorelei; Anders, Sven; Wismer, Wendy V
2011-10-01
A growing demand for convenient and ready-to-eat products has increased poultry processors' interest in developing consumer-oriented value-added chicken products. In this study, a conjoint analysis survey of 276 chicken consumers in Edmonton was conducted during the summer of 2009 to assess the importance of the chicken part, production method, processing method, storage method, the presence of added flavor, and cooking method on consumer preferences for different value-added chicken product attributes. Estimates of consumer willingness to pay (WTP) premium prices for different combinations of value-added chicken attributes were also determined. Participants' "ideal" chicken product was a refrigerated product made with free-range chicken breast, produced with no additives or preservatives and no added flavor, which could be oven heated or pan heated. Half of all participants on average were willing to pay 30% more for a value-added chicken product over the price of a conventional product. Overall, young consumers, individuals who shop at Farmers' Markets and those who prefer free-range or organic products were more likely to pay a premium for value-added chicken products. As expected, consumers' WTP was affected negatively by product price. Combined knowledge of consumer product attribute preferences and consumer WTP for value-added chicken products can help the poultry industry design innovative value-added chicken products. Practical Application: An optimum combination of product attributes desired by consumers for the development of a new value-added chicken product, as well as the WTP for this product, have been identified in this study. This information is relevant to the poultry industry to enhance consumer satisfaction of future value-added chicken products and provide the tools for future profit growth. © 2011 Institute of Food Technologists®
Getting Value out of Value-Added: Report of a Workshop
ERIC Educational Resources Information Center
Braun, Henry, Ed.; Chudowsky, Naomi, Ed.; Koenig, Judith, Ed.
2010-01-01
Value-added methods refer to efforts to estimate the relative contributions of specific teachers, schools, or programs to student test performance. In recent years, these methods have attracted considerable attention because of their potential applicability for educational accountability, teacher pay-for-performance systems, school and teacher…
NASA Astrophysics Data System (ADS)
Zambrano, Francisco; Wardlow, Brian; Tadesse, Tsegaye; Lillo-Saavedra, Mario; Lagos, Octavio
2017-04-01
Precipitation is a key parameter for the study of climate change and variability and for the detection and monitoring of natural disasters such as drought. Precipitation datasets that accurately capture the amount and spatial variability of rainfall are critical for drought monitoring and a wide range of other climate applications. This is challenging in many parts of the world, which often have a limited number of weather stations and/or limited historical data records. Satellite-derived precipitation products offer a viable alternative, with several remotely sensed precipitation datasets now available with long historical data records (30+ years), including the Climate Hazards Group InfraRed Precipitation with Station (CHIRPS) and Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Climate Data Record (PERSIANN-CDR) datasets. This study presents a comparative analysis of three historical satellite-based precipitation datasets, Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) 3B43 version 7 (1998-2015), PERSIANN-CDR (1983-2015) and CHIRPS 2.0 (1981-2015), over Chile to assess their performance across the country and, for the two long-term products, their applicability to agricultural drought monitoring through the calculation of a commonly used drought indicator, the Standardized Precipitation Index (SPI). In this analysis, 278 weather stations with in situ rainfall measurements across Chile were initially compared to the satellite data. The study area (Chile) was divided into five latitudinal zones: North, North-Central, Central, South-Central and South to determine whether there were regional differences among these satellite products, and nine statistics were used to evaluate their performance in estimating the amount and spatial distribution of historical rainfall across Chile. Hierarchical cluster analysis, k-means and singular value decomposition were used to analyze
Flight-determined aerodynamic derivatives of the AD-1 oblique-wing research airplane
NASA Technical Reports Server (NTRS)
Sim, A. G.; Curry, R. E.
1984-01-01
The AD-1 is a variable-sweep oblique-wing research airplane that exhibits unconventional stability and control characteristics. In this report, flight-determined and predicted stability and control derivatives for the AD-1 airplane are compared. The predictions are based on both wind tunnel and computational results. A final best estimate of derivatives is presented.
Waste valorization by biotechnological conversion into added value products.
Liguori, Rossana; Amore, Antonella; Faraco, Vincenza
2013-07-01
Fossil fuel reserves depletion, global warming, unrelenting population growth, and costly and problematic waste recycling call for renewable resources of energy and consumer products. As an alternative to the 100% oil economy, production processes based on biomass can be developed. Huge amounts of lignocellulosic wastes are produced yearly all around the world. They include agricultural residues, food farming wastes, "green-grocer's wastes," tree pruning residues, and the organic and paper fractions of urban solid wastes. The disposal methods currently adopted for these wastes present environmental and economic disadvantages. As an alternative, processes that add value to wastes by producing high added-value products should be developed; this is the upgrading concept: adding value to wastes by production of a product with desired, reproducible properties and with economic and ecological advantages. A wide range of high added-value products, such as enzymes, biofuels, organic acids, biopolymers, bioelectricity, and molecules for the food and pharmaceutical industries, can be obtained by upgrading solid wastes. The most recent advancements in their production by biotechnological processes are overviewed in this manuscript.
Value Added Based on Educational Positions in Dutch Secondary Education
ERIC Educational Resources Information Center
Timmermans, Anneke C.; Bosker, Roel J.; de Wolf, Inge F.; Doolaard, Simone; van der Werf, Margaretha P. C.
2014-01-01
Estimating added value as an indicator of school effectiveness in the context of educational accountability often occurs using test or examination scores of students. This study investigates the possibilities for using scores of educational positions as an alternative indicator. A number of advantages of a value added indicator based on…
Metrix Matrix: A Cloud-Based System for Tracking Non-Relative Value Unit Value-Added Work Metrics.
Kovacs, Mark D; Sheafor, Douglas H; Thacker, Paul G; Hardie, Andrew D; Costello, Philip
2018-03-01
In the era of value-based medicine, it will become increasingly important for radiologists to provide metrics that demonstrate their value beyond clinical productivity. In this article the authors describe their institution's development of an easy-to-use system for tracking value-added but non-relative value unit (RVU)-based activities. Metrix Matrix is an efficient cloud-based system for tracking value-added work. A password-protected home page contains links to web-based forms created using Google Forms, with collected data populating Google Sheets spreadsheets. Value-added work metrics selected for tracking included interdisciplinary conferences, hospital committee meetings, consulting on nonbilled outside studies, and practice-based quality improvement. Over a period of 4 months, value-added work data were collected for all clinical attending faculty members in a university-based radiology department (n = 39). Time required for data entry was analyzed for 2 faculty members over the same time period. Thirty-nine faculty members (equivalent to 36.4 full-time equivalents) reported a total of 1,223.5 hours of value-added work time (VAWT). A formula was used to calculate "value-added RVUs" (vRVUs) from VAWT. VAWT amounted to 5,793.6 vRVUs or 6.0% of total work performed (vRVUs plus work RVUs [wRVUs]). Were vRVUs considered equivalent to wRVUs for staffing purposes, this would require an additional 2.3 full-time equivalents, on the basis of average wRVU calculations. Mean data entry time was 56.1 seconds per day per faculty member. As health care reimbursement evolves with an emphasis on value-based medicine, it is imperative that radiologists demonstrate the value they add to patient care beyond wRVUs. This free and easy-to-use cloud-based system allows the efficient quantification of value-added work activities. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
"Value Added" Gauge of Teaching Probed
ERIC Educational Resources Information Center
Viadero, Debra
2009-01-01
A new study by a public and labor economist suggests that "value added" methods for determining the effectiveness of classroom teachers are built on some shaky assumptions and may be misleading. The study, due to be published in February in the "Quarterly Journal of Economics," is the first of a handful of papers now in the…
ARM KAZR-ARSCL Value Added Product
Jensen, Michael
2012-09-28
The Ka-band ARM Zenith Radars (KAZRs) have replaced the long-serving Millimeter Cloud Radars, or MMCRs. Accordingly, the primary MMCR Value Added Product (VAP), the Active Remote Sensing of CLouds (ARSCL) product, is being replaced by a KAZR-based version, the KAZR-ARSCL VAP. KAZR-ARSCL provides cloud boundaries and best-estimate time-height fields of radar moments.
Markham, Wolfgang A.; Young, Robert; Sweeting, Helen; West, Patrick; Aveyard, Paul
2012-01-01
Previous studies found lower substance use in schools achieving better examination and truancy results than expected, given their pupil populations (high value-added schools). This study examines whether these findings are replicated in West Scotland and whether school ethos indicators focussing on pupils' perceptions of schooling (environment, involvement, engagement and teacher–pupil relations) mediate the associations. Teenagers from forty-one schools (S2, aged 13, n = 2268; S4, aged 15, n = 2096) previously surveyed in primary school (aged 11, n = 2482) were surveyed in the late 1990s. School value-added scores were derived from standardised residuals of two regression equations separately predicting from pupils' socio-demographic characteristics (1) proportions of pupils passing five Scottish Standard Grade Examinations, and (2) half-day truancy loss. Outcomes were current smoking, monthly drinking, ever illicit drug use. Random effects logistic regression models adjusted for potential pupil-level confounders were used to assess (1) associations between substance use and school-level value-added scores and (2) whether these associations were mediated by pupils' perceptions of schooling or other school-level factors (school roll, religious denomination and mean aggregated school-level ethos scores). Against expectations, value-added education was positively associated with smoking (Odds Ratios [95% confidence intervals] for one standard deviation increase in value-added scores were 1.28 [1.02–1.61] in S2 and 1.13 [1.00–1.27] in S4) and positively but weakly and non-significantly associated with drinking and drug use. Engagement and positive teacher–pupil relations were strongly and negatively associated with all substance use outcomes at both ages. Other school-level factors appeared weakly and largely non-significantly related to substance use. Value-added scores were unrelated to school ethos measures and no ethos measure mediated associations
Markham, Wolfgang A; Young, Robert; Sweeting, Helen; West, Patrick; Aveyard, Paul
2012-07-01
Previous studies found lower substance use in schools achieving better examination and truancy results than expected, given their pupil populations (high value-added schools). This study examines whether these findings are replicated in West Scotland and whether school ethos indicators focussing on pupils' perceptions of schooling (environment, involvement, engagement and teacher-pupil relations) mediate the associations. Teenagers from forty-one schools (S2, aged 13, n = 2268; S4, aged 15, n = 2096) previously surveyed in primary school (aged 11, n = 2482) were surveyed in the late 1990s. School value-added scores were derived from standardised residuals of two regression equations separately predicting from pupils' socio-demographic characteristics (1) proportions of pupils passing five Scottish Standard Grade Examinations, and (2) half-day truancy loss. Outcomes were current smoking, monthly drinking, ever illicit drug use. Random effects logistic regression models adjusted for potential pupil-level confounders were used to assess (1) associations between substance use and school-level value-added scores and (2) whether these associations were mediated by pupils' perceptions of schooling or other school-level factors (school roll, religious denomination and mean aggregated school-level ethos scores). Against expectations, value-added education was positively associated with smoking (Odds Ratios [95% confidence intervals] for one standard deviation increase in value-added scores were 1.28 [1.02-1.61] in S2 and 1.13 [1.00-1.27] in S4) and positively but weakly and non-significantly associated with drinking and drug use. Engagement and positive teacher-pupil relations were strongly and negatively associated with all substance use outcomes at both ages. Other school-level factors appeared weakly and largely non-significantly related to substance use. 
Value-added scores were unrelated to school ethos measures and no ethos measure mediated associations between value-added
Cloud Type Classification (cldtype) Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flynn, Donna; Shi, Yan; Lim, K-S
The Cloud Type (cldtype) value-added product (VAP) provides an automated cloud type classification based on macrophysical quantities derived from vertically pointing lidar and radar. Up to 10 layers of clouds are classified into seven cloud types based on predetermined and site-specific thresholds of cloud top, base and thickness. Examples of thresholds for selected U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility sites are provided in Tables 1 and 2. Inputs for the cldtype VAP include lidar and radar cloud boundaries obtained from the Active Remotely Sensed Cloud Location (ARSCL) and Surface Meteorological Systems (MET) data. Rain rates from MET are used to determine when radar signal attenuation precludes accurate cloud detection. Temporal resolution and vertical resolution for cldtype are 1 minute and 30 m, respectively, and match the resolution of ARSCL. The cldtype classification is an initial step for further categorization of clouds. It was developed for use by the Shallow Cumulus VAP to identify potential periods of interest to the LASSO model and is intended to find clouds of interest for a variety of users.
ERIC Educational Resources Information Center
Ye, Yincheng; Singh, Kusum
2017-01-01
The purpose of this study is to better understand how math teachers' effectiveness, as measured by value-added scores and student satisfaction with teaching, is influenced by schools' working conditions. The data for the study were derived from the 2009-2010 Teacher Working Conditions Survey and the Student Perception Survey in the Measures of Effective…
WRF added value to capture the spatio-temporal drought variability
NASA Astrophysics Data System (ADS)
García-Valdecasas Ojeda, Matilde; Quishpe-Vásquez, César; Raquel Gámiz-Fortis, Sonia; Castro-Díez, Yolanda; Jesús Esteban-Parra, María
2017-04-01
Regional Climate Models (RCMs) have been widely used as a tool to produce high-resolution climate fields in areas with high climate variability, such as Spain. However, the outputs provided by downscaling techniques carry many sources of uncertainty related to different aspects of the modelling. In this study, the ability of the Weather Research and Forecasting (WRF) model to capture drought conditions has been analyzed. The WRF simulation was carried out for a period spanning 1980 to 2010 over a domain centered on the Iberian Peninsula with a spatial resolution of 0.088°, nested in the coarser EURO-CORDEX domain (0.44° spatial resolution). To investigate the spatiotemporal drought variability, the Standardized Precipitation Index (SPI) and the Standardized Precipitation Evapotranspiration Index (SPEI) have been computed at two timescales, 3 and 12 months, due to their suitability for studying agricultural and hydrological droughts. The drought indices computed from WRF outputs were compared with those obtained from the observational (MOTEDAS and MOPREDAS) datasets. In order to assess the added value provided by the downscaled fields, these indices were also computed from the ERA-Interim Re-Analysis database, which provides the lateral and boundary conditions of the WRF simulations. Results from this study indicate that WRF provides a noticeable benefit with respect to ERA-Interim for many regions in Spain in terms of drought indices, greater for SPI than for SPEI. The improvement offered by WRF depends on the region, index and timescale analyzed, being greater at longer timescales. These findings demonstrate the reliability of the downscaled fields for detecting drought events and therefore provide a valuable source of knowledge for suitable decision making related to water-resource management. Keywords: Drought, added value, Regional Climate Models, WRF, SPEI, SPI. Acknowledgements: This work has been financed by the projects P11-RNM-7941 (Junta de Andalucía-Spain) and
Developing a new global network of river reaches from merged satellite-derived datasets
NASA Astrophysics Data System (ADS)
Lion, C.; Allen, G. H.; Beighley, E.; Pavelsky, T.
2015-12-01
In 2020, the Surface Water and Ocean Topography satellite (SWOT), a joint mission of NASA/CNES/CSA/UK will be launched. One of its major products will be the measurements of continental water extent, including the width, height, and slope of rivers and the surface area and elevations of lakes. The mission will improve the monitoring of continental water and also our understanding of the interactions between different hydrologic reservoirs. For rivers, SWOT measurements of slope must be carried out over predefined river reaches. As such, an a priori dataset for rivers is needed in order to facilitate analysis of the raw SWOT data. The information required to produce this dataset includes measurements of river width, elevation, slope, planform, river network topology, and flow accumulation. To produce this product, we have linked two existing global datasets: the Global River Widths from Landsat (GRWL) database, which contains river centerline locations, widths, and a braiding index derived from Landsat imagery, and a modified version of the HydroSHEDS hydrologically corrected digital elevation product, which contains heights and flow accumulation measurements for streams at 3 arcsecond spatial resolution. Merging these two datasets requires considerable care. The difficulties, among others, lie in the difference of resolution: 30 m versus 3 arcseconds, and the age of the datasets: 2000 versus ~2010 (some rivers have moved, the braided sections are different). As such, we have developed custom software to merge the two datasets, taking into account the spatial proximity of river channels in the two datasets and ensuring that flow accumulation in the final dataset always increases downstream. Here, we present our preliminary results for a portion of South America and demonstrate the strengths and weaknesses of the method.
Value-added biotransformation of cellulosic sugars by engineered Saccharomyces cerevisiae.
Lane, Stephan; Dong, Jia; Jin, Yong-Su
2018-07-01
The substantial research efforts into lignocellulosic biofuels have generated an abundance of valuable knowledge and technologies for metabolic engineering. In particular, these investments have led to a vast growth in proficiency of engineering the yeast Saccharomyces cerevisiae for consuming lignocellulosic sugars, enabling the simultaneous assimilation of multiple carbon sources, and producing a large variety of value-added products by introduction of heterologous metabolic pathways. While microbial conversion of cellulosic sugars into large-volume low-value biofuels is not currently economically feasible, there may still be opportunities to produce other value-added chemicals as regulation of cellulosic sugar metabolism is quite different from glucose metabolism. This review summarizes these recent advances with an emphasis on employing engineered yeast for the bioconversion of lignocellulosic sugars into a variety of non-ethanol value-added products. Copyright © 2018 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaustad, KL; Turner, DD
2009-05-30
This report provides a short description of the Atmospheric Radiation Measurement (ARM) Climate Research Facility (ACRF) microwave radiometer (MWR) RETrieval (MWRRET) value-added product (VAP) algorithm. This algorithm utilizes a complementary physical retrieval method and applies brightness temperature offsets to reduce spurious liquid water path (LWP) bias in clear skies, resulting in significantly improved precipitable water vapor (PWV) and LWP retrievals. We present a general overview of the technique, input parameters, and output products, and describe data quality checks. A more complete discussion of the theory and results is given in Turner et al. (2007b).
Hagedorn Temperature of AdS5/CFT4 via Integrability
NASA Astrophysics Data System (ADS)
Harmark, Troels; Wilhelm, Matthias
2018-02-01
We establish a framework for calculating the Hagedorn temperature of AdS5/CFT4 via integrability. Concretely, we derive the thermodynamic Bethe ansatz equations that yield the Hagedorn temperature of planar N =4 super Yang-Mills theory at any value of the 't Hooft coupling. We solve these equations perturbatively at weak coupling via the associated Y system, confirming the known results at tree level and one-loop order as well as deriving the previously unknown two-loop Hagedorn temperature. Finally, we comment on solving the equations at finite coupling.
The forecaster's added value in QPF
NASA Astrophysics Data System (ADS)
Turco, M.; Milelli, M.
2010-03-01
To the authors' knowledge there are relatively few studies that try to answer this question: "Are humans able to add value to computer-generated forecasts and warnings?". Moreover, the answers are not always positive; in particular, some postprocessing methods are competitive with or superior to human forecasts. Within the alert system of ARPA Piemonte it is possible to study in an objective manner whether the human forecaster is able to add value with respect to computer-generated forecasts. Every day the meteorology group of the Centro Funzionale of Regione Piemonte produces the HQPF (Human Quantitative Precipitation Forecast) in terms of an areal average and a maximum value for each of the 13 warning areas, which have been created according to meteo-hydrological criteria. This allows the decision makers to produce an evaluation of the expected effects by comparing these HQPFs with predefined rainfall thresholds. Another important ingredient in this study is the very dense non-GTS (Global Telecommunication System) network of rain gauges available, which makes possible a high-resolution verification. In this work we compare the performances over the last three years of QPF derived from the meteorological models COSMO-I7 (the Italian version of the COSMO Model, a mesoscale model developed in the framework of the COSMO Consortium) and IFS (the ECMWF global model) with the HQPF. In this analysis it is possible to introduce the hypothesis test developed by Hamill (1999), in which a confidence interval is calculated with the bootstrap method in order to establish the real difference between the skill scores of two competitive forecasts. It is important to underline that the conclusions refer to the analysis of the Piemonte operational alert system, so they cannot be directly taken as universally true. But we think that some of the main lessons that can be derived from this study could be useful for the meteorological community. In details, the main conclusions are the following
NASA Astrophysics Data System (ADS)
Zambrano, Francisco; Wardlow, Brian; Tadesse, Tsegaye
2016-10-01
Precipitation is a key parameter for the study of climate change and variability and for the detection and monitoring of natural disasters such as drought. Precipitation datasets that accurately capture the amount and spatial variability of rainfall are critical for drought monitoring and a wide range of other climate applications. This is challenging in many parts of the world, which often have a limited number of weather stations and/or limited historical data records. Satellite-derived precipitation products offer a viable alternative, with several remotely sensed precipitation datasets now available with long historical data records (30+ years), including the Climate Hazards Group InfraRed Precipitation with Station (CHIRPS) and Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Climate Data Record (PERSIANN-CDR) datasets. This study presents a comparative analysis of three historical satellite-based precipitation datasets, Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) 3B43 version 7 (1998-2015), PERSIANN-CDR (1983-2015) and CHIRPS 2.0 (1981-2015), over Chile to assess their performance across the country and evaluate their applicability to agricultural drought evaluation through the calculation of a commonly used drought indicator, the Standardized Precipitation Index (SPI). In this analysis, 278 weather stations with in-situ rainfall measurements across Chile were initially compared to the satellite-based precipitation estimates. The study area (Chile) was divided into five latitudinal zones: North, North-Central, Central, South-Central and South to determine whether there were regional differences among these satellite-based estimates. Nine statistics were used to evaluate the performance of the satellite products in estimating the amount and spatial distribution of historical rainfall across Chile. Hierarchical cluster analysis, k-means and singular value decomposition were used to
Value-Added Tax -- Can Schools Use It?
ERIC Educational Resources Information Center
Salmon, Richard G.
1973-01-01
Defines the value-added tax and examines it in light of equity, economic effects, cost of administration, and stability and yield. Compares the tax with the property tax and suggests alternative ways in which States and the Federal Government may participate in the financing of education. (DN)
Steenstra, Ivan A; Franche, Renée-Louise; Furlan, Andrea D; Amick, Ben; Hogg-Johnson, Sheilah
2016-06-01
Objectives Some injured workers with work-related, compensated back pain experience a troubling course in return to work. A prediction tool was developed in an earlier study, using administrative data only. This study explored the added value of worker-reported data in identifying those workers with back pain at higher risk of being on benefits for a longer period of time. Methods This was a cohort study of workers with compensated back pain in 2005 in Ontario. Workplace Safety and Insurance Board (WSIB) data was used. As well, we examined the added value of patient-reported prognostic factors obtained from a prospective cohort study. Improvement of model fit was determined by comparing area under the curve (AUC) statistics. The outcome measure was time on benefits during a first workers' compensation claim for back pain. Follow-up was 2 years. Results Among 1442 workers with WSIB data still on full benefits at 4 weeks, 113 were also part of the prospective cohort study. Model fit of an established rule in the smaller dataset of 113 workers was comparable to the fit previously established in the larger dataset. Adding worker rating of pain at baseline improved the rule substantially (AUC = 0.80, 95 % CI 0.68, 0.91 compared to benefit status at 180 days; AUC = 0.88, 95 % CI 0.74, 1.00 compared to benefit status at 360 days). Conclusion Although data routinely collected by workers' compensation boards show some ability to predict prolonged time on benefits, adding information on experienced pain reported by the worker improves the predictive ability of the model from 'fairly good' to 'good'. In this study, a combination of prognostic factors, reported by multiple stakeholders, including the worker, could identify those at high risk of extended duration on disability benefits and potentially in need of additional support at the individual level.
English Value-Added Measures: Examining the Limitations of School Performance Measurement
ERIC Educational Resources Information Center
Perry, Thomas
2016-01-01
Value-added "Progress" measures are to be introduced for all English schools in 2016 as "headline" measures of school performance. This move comes despite research highlighting high levels of instability in value-added measures and concerns about the omission of contextual variables in the planned measure. This article studies…
The Potential Consequence of Using Value-Added Models to Evaluate Teachers
ERIC Educational Resources Information Center
Shen, Zuchao; Simon, Carlee Escue; Kelcey, Ben
2016-01-01
Value-added models try to separate the contribution of individual teachers or schools to students' learning growth measured by standardized test scores. There is a policy trend to use value-added modeling to evaluate teachers because of its face validity and superficial objectiveness. This article investigates the potential long term consequences…
Evaluating Special Educator Effectiveness: Addressing Issues Inherent to Value-Added Modeling
ERIC Educational Resources Information Center
Steinbrecher, Trisha D.; Selig, James P.; Cosbey, Joanna; Thorstensen, Beata I.
2014-01-01
States are increasingly using value-added approaches to evaluate teacher effectiveness. There is much debate regarding whether these methods should be employed and, if employed, what role such methods should play in comprehensive teacher evaluation systems. In this article, we consider the use of value-added modeling (VAM) to evaluate special…
Engineering microbial factories for synthesis of value-added products
Du, Jing; Shao, Zengyi; Zhao, Huimin
2011-01-01
Microorganisms have become an increasingly important platform for the production of drugs, chemicals, and biofuels from renewable resources. Advances in protein engineering, metabolic engineering, and synthetic biology enable redesigning microbial cellular networks and fine-tuning physiological capabilities, thus generating industrially viable strains for the production of natural and unnatural value-added compounds. In this review, we describe the recent progress on engineering microbial factories for synthesis of value-added products including alkaloids, terpenoids, flavonoids, polyketides, non-ribosomal peptides, biofuels, and chemicals. Related topics on lignocellulose degradation, sugar utilization, and microbial tolerance improvement will also be discussed. PMID:21526386
NASA Astrophysics Data System (ADS)
Zittis, G.; Bruggeman, A.; Camera, C.; Hadjinicolaou, P.; Lelieveld, J.
2017-07-01
Climate change is expected to substantially influence precipitation amounts and distribution. To improve simulations of extreme rainfall events, we analyzed the performance of different convection and microphysics parameterizations of the WRF (Weather Research and Forecasting) model at very high horizontal resolutions (12, 4 and 1 km). Our study focused on the eastern Mediterranean climate change hot-spot. Five extreme rainfall events over Cyprus were identified from observations and were dynamically downscaled from the ERA-Interim (EI) dataset with WRF. We applied an objective ranking scheme, using a 1-km gridded observational dataset over Cyprus and six different performance metrics, to investigate the skill of the WRF configurations. We evaluated the rainfall timing and amounts for the different resolutions, and discussed the observational uncertainty over the particular extreme events by comparing three gridded precipitation datasets (E-OBS, APHRODITE and CHIRPS). Simulations with WRF capture rainfall over the eastern Mediterranean reasonably well for three of the five selected extreme events. For these three cases, the WRF simulations improved on the ERA-Interim data, which strongly underestimate the rainfall extremes over Cyprus. The best model performance is obtained for the January 1989 event, simulated with an average bias of 4% and a modified Nash-Sutcliffe efficiency of 0.72 for the 5-member ensemble of the 1-km simulations. We found overall added value for the convection-permitting simulations, especially over regions of high elevation. Interestingly, for some cases the intermediate 4-km nest was found to outperform the 1-km simulations for low-elevation coastal parts of Cyprus. Finally, we identified significant and inconsistent discrepancies between the three state-of-the-art gridded precipitation datasets for the tested events, highlighting the observational uncertainty in the region.
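The bias and Nash-Sutcliffe figures quoted in the abstract follow standard verification definitions. A minimal sketch of those metrics (the abstract's "modified" Nash-Sutcliffe variant is not specified, so the classical formulas are used here; the rainfall values are illustrative only):

```python
import numpy as np

def percent_bias(sim, obs):
    """Relative bias of simulated vs. observed totals, in percent."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 100.0 * (sim - obs).sum() / obs.sum()

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect match; 0 means the
    simulation is no better than predicting the observed mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - ((sim - obs) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()

# Illustrative event rainfall totals (mm) at five stations:
obs = np.array([5.0, 20.0, 80.0, 35.0, 10.0])
sim = np.array([6.0, 18.0, 75.0, 40.0, 12.0])
```

With these hypothetical values, `percent_bias(sim, obs)` is well under 1% and `nash_sutcliffe(sim, obs)` is above 0.98, i.e. a very good simulation by both measures.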
How to Use Value-Added Measures Right
ERIC Educational Resources Information Center
Di Carlo, Matthew
2012-01-01
Value-added models are a specific type of "growth model," a diverse group of statistical techniques to isolate a teacher's impact on his or her students' testing progress while controlling for other measurable factors, such as student and school characteristics, that are outside that teacher's control. Opponents, including many teachers, argue…
How One School Implements and Experiences Ohio's Value-Added Model: A Case Study
ERIC Educational Resources Information Center
Quattrochi, David
2009-01-01
Ohio enacted value-added legislation in 2003 and incorporated value-added assessment into its operating standards for teachers and administrators in 2006. Value-added data are used to determine whether students are making a year's growth at the end of each school year. Schools and districts receive a rating of "Below Growth, Met Growth, or Above Growth" on…
Teacher Effects, Value-Added Models, and Accountability
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2014-01-01
Background: In the last decade, the effects of teachers on student performance (typically manifested as state-wide standardized tests) have been re-examined using statistical models that are known as value-added models. These statistical models aim to compute the unique contribution of the teachers in promoting student achievement gains from grade…
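The value-added models discussed in the surrounding entries typically regress a student's current test score on prior scores plus teacher indicators, reading the teacher coefficients as effect estimates. A minimal sketch on synthetic data (the teacher effects, score scale, and coefficients below are hypothetical, not any state's operational model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 300 students assigned to 3 teachers with assumed effects.
n = 300
teacher = rng.integers(0, 3, size=n)
true_effect = np.array([0.0, 0.3, -0.2])            # hypothetical, in score points
prior = rng.normal(50, 10, size=n)                  # prior-year score
score = 5 + 0.9 * prior + true_effect[teacher] + rng.normal(0, 1, size=n)

# Design matrix: intercept, prior score, and dummies for teachers 1 and 2
# (teacher 0 is the reference category).
X = np.column_stack([
    np.ones(n),
    prior,
    (teacher == 1).astype(float),
    (teacher == 2).astype(float),
])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
# beta[2] and beta[3] estimate teachers 1 and 2 relative to teacher 0.
```

Real value-added models add student and school covariates, multiple cohorts, and shrinkage of the teacher effects; the reliability and sensitivity concerns raised in several abstracts above apply to exactly those estimated coefficients.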
Higher Education Value Added Using Multiple Outcomes
ERIC Educational Resources Information Center
Milla, Joniada; Martín, Ernesto San; Van Bellegem, Sébastien
2016-01-01
In this article we develop a methodology for the joint value added analysis of multiple outcomes that takes into account the inherent correlation between them. This is especially crucial in the analysis of higher education institutions. We use a unique Colombian database on universities, which contains scores in five domains tested in a…
Online Visualization and Value Added Services of MERRA-2 Data at GES DISC
NASA Technical Reports Server (NTRS)
Shen, Suhung; Ostrenga, Dana M.; Vollmer, Bruce E.; Hegde, Mahabaleshwa S.; Wei, Jennifer C.; Bosilovich, Michael G.
2017-01-01
NASA climate reanalysis datasets from MERRA-2, distributed at the Goddard Earth Sciences Data and Information Services Center (GES DISC), have been used in broad research areas, such as climate variations, extreme weather, agriculture, renewable energy, and air quality. The datasets contain numerous variables for atmosphere, land, and ocean, grouped into 95 products. The total archived volume was approximately 337 TB (approximately 562K files) at the end of October 2017. Due to the large number of products and files, and the large data volumes, it may be a challenge for a user to find and download the data of interest. The support team at GES DISC, working closely with the MERRA-2 science team, has created and is continuing to work on value-added data services to best meet the needs of a broad user community. This presentation, using aerosols over the Asian Monsoon as an example, provides an overview of the MERRA-2 data services at GES DISC, including: How to find the data? How many data access methods are provided? Which data access methods are best for me? How to download subsetted (parameter, spatial, temporal) data and save it in a preferred spatial resolution and data format? How to visualize and explore the data online? In addition, we introduce a future online analytic tool designed to support application research, focusing on long-term hourly time-series data access and analysis.
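The parameter/spatial/temporal subsetting the abstract describes can be illustrated on a plain gridded array. The grid, variable, and bounding box below are hypothetical and do not use the actual GES DISC services; real MERRA-2 files are NetCDF and would typically be read with a library such as netCDF4 or xarray:

```python
import numpy as np

# Hypothetical hourly field on a (time, lat, lon) grid.
times = np.arange(24)                       # hours of one day
lats = np.arange(-90, 90.5, 0.5)            # 0.5-degree latitude grid
lons = np.arange(-180, 180, 0.625)          # 0.625-degree longitude grid
data = np.random.default_rng(1).random((times.size, lats.size, lons.size))

# Spatial/temporal subset: a box over South Asia, hours 06-18 (all illustrative).
t_sel = (times >= 6) & (times <= 18)
lat_sel = (lats >= 5) & (lats <= 35)
lon_sel = (lons >= 60) & (lons <= 100)
subset = data[np.ix_(np.where(t_sel)[0],
                     np.where(lat_sel)[0],
                     np.where(lon_sel)[0])]
# subset now holds only the requested hours and bounding box.
```

Parameter subsetting is the analogous step of selecting only the needed variables from a multi-variable file before download, which is what makes server-side subsetting services valuable for a 337 TB archive.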
Martínez-Santiago, O; Marrero-Ponce, Y; Vivas-Reyes, R; Rivera-Borroto, O M; Hurtado, E; Treto-Suarez, M A; Ramos, Y; Vergara-Murillo, F; Orozco-Ugarriza, M E; Martínez-López, Y
2017-05-01
Graph derivative indices (GDIs) have recently been defined over N-atoms (N = 2, 3 and 4) simultaneously, which are based on the concept of derivatives in discrete mathematics (finite difference), analogous to the derivative concept in classical mathematical analysis. These molecular descriptors (MDs) codify topo-chemical and topo-structural information based on the concept of the derivative of a molecular graph with respect to a given event (S) over duplex, triplex and quadruplex relations of atoms (vertices). These GDIs have been successfully applied in the description of physicochemical properties like reactivity, solubility and chemical shift, among others, and in several comparative quantitative structure activity/property relationship (QSAR/QSPR) studies. Although satisfactory results have been obtained in previous modelling studies with the aforementioned indices, it is necessary to develop new, more rigorous analyses to assess the true predictive performance of the novel structure codification. So, in the present paper, an assessment and statistical validation of the performance of these novel approaches in QSAR studies are executed, as well as a comparison with those of other QSAR procedures reported in the literature. To achieve the main aim of this research, QSARs were developed on eight chemical datasets widely used as benchmarks in the evaluation/validation of several QSAR methods and/or many different MDs (fundamentally 3D MDs). Three- to seven-variable QSAR models were built for each chemical dataset, according to the original dissection into training/test sets. The models were developed by using multiple linear regression (MLR) coupled with a genetic algorithm as the feature wrapper selection technique in the MobyDigs software. Each family of GDIs (for duplex, triplex and quadruplex) behaves similarly in all modelling, although there were some exceptions. However, when all families were used in combination, the results achieved were quantitatively
ERIC Educational Resources Information Center
Tymms, Peter
This is the fourth in a series of technical reports that have dealt with issues surrounding the possibility of national value-added systems for primary schools in England. The main focus has been on the relative progress made by students between the ends of Key Stage 1 (KS1) and Key Stage 2 (KS2). The analysis has indicated that the strength of…
Dollar$ & $en$e. Part V: What is your added value?
Wilkinson, I
2001-01-01
In Part I of this series, I introduced the concept of memes (1). Memes are ideas or concepts--the information world equivalent of genes. The goal of this series of articles is to infect you with memes, so that you will assimilate, translate, and express them. No matter what our area of expertise or "-ology," we all are in the information business. Our goal is to be in the wisdom business. In the previous papers in this series, I showed that when we convert raw data into wisdom we are moving along a value chain. Each step in the chain adds a different amount of value to the final product: timely, relevant, accurate, and precise knowledge that can be applied to create the ultimate product in the value chain: wisdom. In Part II of this series, I introduced a set of memes for measuring the cost of adding value (2). In Part III of this series, I presented a new set of memes for measuring the added value of knowledge, i.e., intellectual capital (3). In Part IV of this series, I discussed practical knowledge management tools for measuring the value of people, structural, and customer capital (4). In Part V of this series, I will apply intellectual capital and knowledge management concepts at the individual level, to help answer a fundamental question: What is my added value?
"Value Added" Proves Beneficial to Teacher Prep
ERIC Educational Resources Information Center
Sawchuk, Stephen
2012-01-01
The use of "value added" information appears poised to expand into the nation's teacher colleges, with more than a dozen states planning to use the technique to analyze how graduates of training programs fare in classrooms. Supporters say the data could help determine which teacher education pathways produce teachers who are at least as…
Value-Added Models: What the Experts Say
ERIC Educational Resources Information Center
Amrein-Beardsley, Audrey; Pivovarova, Margarita; Geiger, Tray J.
2016-01-01
Being an expert involves explaining how things are supposed to work, and, perhaps more important, why things might not work as supposed. In this study, researchers surveyed scholars with expertise in value-added models (VAMs) to solicit their opinions about the uses and potential of VAMs for teacher-level accountability purposes (for example, in…
Added Value of Assessing Adnexal Masses with Advanced MRI Techniques
Thomassin-Naggara, I.; Balvay, D.; Rockall, A.; Carette, M. F.; Ballester, M.; Darai, E.; Bazot, M.
2015-01-01
This review will present the added value of perfusion and diffusion MR sequences to characterize adnexal masses. These two functional MR techniques are readily available in routine clinical practice. We will describe the acquisition parameters and a method of analysis to optimize their added value compared with conventional images. We will then propose a model of interpretation that combines the anatomical and morphological information from conventional MRI sequences with the functional information provided by perfusion and diffusion weighted sequences. PMID:26413542
Loops in AdS from conformal field theory
Aharony, Ofer; Alday, Luis F.; Bissi, Agnese; ...
2017-07-10
We propose and demonstrate a new use for conformal field theory (CFT) crossing equations in the context of AdS/CFT: the computation of loop amplitudes in AdS, dual to non-planar correlators in holographic CFTs. Loops in AdS are largely unexplored, mostly due to technical difficulties in direct calculations. We revisit this problem, and the dual 1/N expansion of CFTs, in two independent ways. The first is to show how to explicitly solve the crossing equations to the first subleading order in 1/N^2, given a leading order solution. This is done as a systematic expansion in inverse powers of the spin, to all orders. These expansions can be resummed, leading to the CFT data for finite values of the spin. Our second approach involves Mellin space. We show how the polar part of the four-point, loop-level Mellin amplitudes can be fully reconstructed from the leading-order data. The anomalous dimensions computed with both methods agree. In the case of φ^4 theory in AdS, our crossing solution reproduces a previous computation of the one-loop bubble diagram. We can go further, deriving the four-point scalar triangle diagram in AdS, which had never been computed. In the process, we show how to analytically derive anomalous dimensions from Mellin amplitudes with an infinite series of poles, and discuss applications to more complicated cases such as the N = 4 super-Yang-Mills theory.
Loops in AdS from conformal field theory
NASA Astrophysics Data System (ADS)
Aharony, Ofer; Alday, Luis F.; Bissi, Agnese; Perlmutter, Eric
2017-07-01
We propose and demonstrate a new use for conformal field theory (CFT) crossing equations in the context of AdS/CFT: the computation of loop amplitudes in AdS, dual to non-planar correlators in holographic CFTs. Loops in AdS are largely unexplored, mostly due to technical difficulties in direct calculations. We revisit this problem, and the dual 1/N expansion of CFTs, in two independent ways. The first is to show how to explicitly solve the crossing equations to the first subleading order in 1/N^2, given a leading order solution. This is done as a systematic expansion in inverse powers of the spin, to all orders. These expansions can be resummed, leading to the CFT data for finite values of the spin. Our second approach involves Mellin space. We show how the polar part of the four-point, loop-level Mellin amplitudes can be fully reconstructed from the leading-order data. The anomalous dimensions computed with both methods agree. In the case of ϕ^4 theory in AdS, our crossing solution reproduces a previous computation of the one-loop bubble diagram. We can go further, deriving the four-point scalar triangle diagram in AdS, which had never been computed. In the process, we show how to analytically derive anomalous dimensions from Mellin amplitudes with an infinite series of poles, and discuss applications to more complicated cases such as the N = 4 super-Yang-Mills theory.
Generalized derivation of the added-mass and circulatory forces for viscous flows
NASA Astrophysics Data System (ADS)
Limacher, Eric; Morton, Chris; Wood, David
2018-01-01
The concept of added mass arises from potential flow analysis and is associated with the acceleration of a body in an inviscid irrotational fluid. When shed vorticity is modeled as vortex singularities embedded in this irrotational flow, the associated force can be superimposed onto the added-mass force due to the linearity of the governing Laplace equation. This decomposition of force into added-mass and circulatory components remains common in modern aerodynamic models, but its applicability to viscous separated flows remains unclear. The present work addresses this knowledge gap by presenting a generalized derivation of the added-mass and circulatory force decomposition which is valid for a body of arbitrary shape in an unbounded, incompressible fluid domain, in both two and three dimensions, undergoing arbitrary motions amid continuous distributions of vorticity. From the general expression, the classical added-mass force is rederived for well-known canonical cases and is seen to be additive to the circulatory force for any flow. The formulation is shown to be equivalent to existing theoretical work under the specific conditions and assumptions of previous studies. It is also validated using a numerical simulation of a pitching plate in a steady freestream flow, conducted by Wang and Eldredge [Theor. Comput. Fluid Dyn. 27, 577 (2013), 10.1007/s00162-012-0279-5]. In response to persistent confusion in the literature, a discussion of the most appropriate physical interpretation of added mass is included, informed by inspection of the derived equations. The added-mass force is seen to account for the dynamic effect of near-body vorticity and is not (as is commonly claimed) associated with the acceleration of near-body fluid which "must" somehow move with the body. Various other consequences of the derivation are discussed, including a concept which has been labeled the conservation of image-vorticity impulse.
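The "classical added-mass force" that the abstract re-derives for canonical cases refers to standard potential-flow results such as the ones below (circular cylinder per unit length, sphere); the density and acceleration values are illustrative only:

```python
import math

def cylinder_added_mass(rho, radius):
    """Classical potential-flow added mass per unit length of a circular
    cylinder accelerating in unbounded fluid: m_a = rho * pi * a**2
    (equal to the displaced fluid mass per unit length)."""
    return rho * math.pi * radius ** 2

def sphere_added_mass(rho, radius):
    """Classical added mass of a sphere: half the displaced fluid mass,
    m_a = (2/3) * rho * pi * a**3."""
    return (2.0 / 3.0) * rho * math.pi * radius ** 3

# Added-mass force opposes the body's acceleration, F = -m_a * dU/dt.
rho_water = 1000.0                                  # kg/m^3
force_per_m = -cylinder_added_mass(rho_water, 0.05) * 2.0   # N/m at 2 m/s^2
```

The paper's point is that this inviscid decomposition remains additive to the circulatory force even in viscous separated flows, with the added-mass term accounting for the dynamic effect of near-body vorticity.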
Small-diameter log evaluation for value-added structural applications
Ronald Wolfe; Cassandra Moseley
2000-01-01
Three species of small-diameter logs from the Klamath/Siskiyou Mountains and the Cascade Range in southwest Oregon were tested for their potential for value-added structural applications. The logs were tested in bending and compression parallel to the grain. Strength and stiffness values were correlated to possible nondestructive evaluation grading parameters and...
Methodology for adding glycemic index and glycemic load values to 24-hour dietary recall database.
Olendzki, Barbara C; Ma, Yunsheng; Culver, Annie L; Ockene, Ira S; Griffith, Jennifer A; Hafner, Andrea R; Hebert, James R
2006-01-01
We describe a method of adding the glycemic index (GI) and glycemic load (GL) values to the nutrient database of the 24-hour dietary recall interview (24HR), a widely used dietary assessment. We also calculated daily GI and GL values from the 24HR. Subjects were 641 healthy adults from central Massachusetts who completed 9067 24HRs. The 24HR-derived food data were matched to the International Table of Glycemic Index and Glycemic Load Values. The GI values for specific foods not in the table were estimated against similar foods according to physical and chemical factors that determine GI. Mixed foods were disaggregated into individual ingredients. Of 1261 carbohydrate-containing foods in the database, GI values of 602 foods were obtained from a direct match (47.7%), accounting for 22.36% of dietary carbohydrate. GI values from 656 foods (52.1%) were estimated, contributing to 77.64% of dietary carbohydrate. The GI values from three unknown foods (0.2%) could not be assigned. The average daily GI was 84 (SD 5.1, white bread as referent) and the average GL was 196 (SD 63). Using this methodology for adding GI and GL values to nutrient databases, it is possible to assess associations between GI and/or GL and body weight and chronic disease outcomes (diabetes, cancer, heart disease). This method can be used in clinical and survey research settings where 24HRs are a practical means for assessing diet. The implications for using this methodology compel a broader evaluation of diet with disease outcomes.
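The daily GL figure reported above follows the conventional definition GL = Σ GI_i × carb_i / 100, with daily GI computed as the carbohydrate-weighted mean of per-food GI values. A minimal sketch with illustrative foods (the GI and carbohydrate numbers below are hypothetical, not entries from the International Table):

```python
# Each food contributes GI * available carbohydrate (g) / 100 to daily GL.
foods = [
    {"name": "white bread", "gi": 100, "carb_g": 30},  # white-bread referent scale
    {"name": "apple",       "gi": 52,  "carb_g": 15},
    {"name": "lentils",     "gi": 41,  "carb_g": 20},
]

daily_gl = sum(f["gi"] * f["carb_g"] for f in foods) / 100
daily_gi = (sum(f["gi"] * f["carb_g"] for f in foods)
            / sum(f["carb_g"] for f in foods))   # carbohydrate-weighted mean
```

Applied per 24HR, this yields the per-day GI and GL values that the study then averages across recalls.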
The Reliability, Impact, and Cost-Effectiveness of Value-Added Teacher Assessment Methods
ERIC Educational Resources Information Center
Yeh, Stuart S.
2012-01-01
This article reviews evidence regarding the intertemporal reliability of teacher rankings based on value-added methods. Value-added methods exhibit low reliability, yet are broadly supported by prominent educational researchers and are increasingly being used to evaluate and fire teachers. The article then presents a cost-effectiveness analysis…
Sensitivity of Teacher Value-Added Estimates to Student and Peer Control Variables
ERIC Educational Resources Information Center
Johnson, Matthew T.; Lipscomb, Stephen; Gill, Brian
2015-01-01
Teacher value-added models (VAMs) must isolate teachers' contributions to student achievement to be valid. Well-known VAMs use different specifications, however, leaving policymakers with little clear guidance for constructing a valid model. We examine the sensitivity of teacher value-added estimates under different models based on whether they…
Stability of Teacher Value-Added Rankings across Measurement Model and Scaling Conditions
ERIC Educational Resources Information Center
Hawley, Leslie R.; Bovaird, James A.; Wu, ChaoRong
2017-01-01
Value-added assessment methods have been criticized by researchers and policy makers for a number of reasons. One issue includes the sensitivity of model results across different outcome measures. This study examined the utility of incorporating multivariate latent variable approaches within a traditional value-added framework. We evaluated the…
Adding Value by Hospital Real Estate: An Exploration of Dutch Practice.
van der Zwart, Johan; van der Voordt, Theo J M
2016-01-01
This study explores how hospital real estate can add value to the healthcare organization, which values are prioritized in practice, and why. Dutch healthcare organizations are themselves responsible for the costs and benefits of their accommodation. Meanwhile, a lively debate is going on about possible added values of corporate and public real estate in the fields of corporate real estate management and facility management. This article connects both worlds and compares insights from literature with experiences from practice. Added values extracted from literature have been discussed with 15 chief executive officers and project leaders of recently newly built hospitals in the Netherlands. Interviewees were asked (1) which values are included in the design and management of their hospital and why, (2) to prioritize the most important values from a list of nine predefined values, and (3) to explain how the chosen real estate decisions are supposed to support organizational objectives. Stimulating innovation, user satisfaction, and improving organizational culture are most highly valued, followed by improving productivity, reducing building costs, and creating building flexibility. Image, risk control, and financing possibilities got lower rankings. The findings have been used to develop a value-impact matrix that connects nine values to various stakeholders and possible interventions. The findings and the value-impact matrix can make different stakeholders aware of many possible added values of hospital real estate, of potential synergy and conflicts between different values, and of how to steer on added value in different phases of the life cycle. © The Author(s) 2015.
Rethinking Teacher Evaluation: A Conversation about Statistical Inferences and Value-Added Models
ERIC Educational Resources Information Center
Callister Everson, Kimberlee; Feinauer, Erika; Sudweeks, Richard R.
2013-01-01
In this article, the authors provide a methodological critique of the current standard of value-added modeling forwarded in educational policy contexts as a means of measuring teacher effectiveness. Conventional value-added estimates of teacher quality are attempts to determine to what degree a teacher would theoretically contribute, on average,…
Fuel cell added value for early market applications
NASA Astrophysics Data System (ADS)
Hardman, Scott; Chandan, Amrit; Steinberger-Wilckens, Robert
2015-08-01
Fuel Cells are often considered in the market place as just power providers. Whilst fuel cells do provide power, there are additional beneficial characteristics that should be highlighted to consumers. Due to the high price premiums associated with fuel cells, added value features need to be exploited in order to make them more appealing and increase unit sales and market penetration. This paper looks at the approach taken by two companies to sell high value fuel cells to niche markets. The first, SFC Energy, has a proven track record selling fuel cell power providers. The second, Bloom Energy, is making significant progress in the US by having sold its Energy Server to more than 40 corporations including Wal-Mart, Staples, Google, eBay and Apple. Further to these current markets, two prospective added value applications for fuel cells are discussed. These are fuel cells for aircraft APUs and fuel cells for fire prevention. These two existing markets and two future markets highlight that fuel cells are not just power providers. Rather, they can be used as solutions to many needs, thus being more cost effective by replacing a number of incumbent systems at the same time.
NASA Astrophysics Data System (ADS)
Kawano, N.; Varquez, A. C. G.; Dong, Y.; Kanda, M.
2016-12-01
Numerical models such as the Weather Research and Forecasting model coupled with a single-layer Urban Canopy Model (WRF-UCM) are powerful tools to investigate the urban heat island. Urban parameters such as average building height (Have), plan area index (λp) and frontal area index (λf) are necessary inputs for the model. In general, these parameters are assumed uniform in WRF-UCM, but this leads to an unrealistic urban representation. Distributed urban parameters can also be incorporated into WRF-UCM to capture detailed urban effects. The problem is that distributed building information is not readily available for most megacities, especially in developing countries. Furthermore, acquiring real building parameters often requires a huge amount of time and money. In this study, we investigated the potential of using globally available satellite-captured datasets for the estimation of the parameters Have, λp, and λf. Global datasets comprised a high spatial resolution population dataset (LandScan by Oak Ridge National Laboratory), nighttime lights (NOAA), and vegetation fraction (NASA). True samples of Have, λp, and λf were acquired from actual building footprints from satellite images and 3D building databases of Tokyo, New York, Paris, Melbourne, Istanbul, Jakarta and so on. Regression equations were then derived from the block-averaging of spatial pairs of real parameters and global datasets. Results show that two regression curves to estimate Have and λf from the combination of population and nightlight are necessary, depending on the city's level of development. An index which can be used to decide which equation to use for a city is the Gross Domestic Product (GDP). On the other hand, λp has less dependence on GDP but indicated a negative relationship to vegetation fraction. Finally, a simplified but precise approximation of urban parameters through readily-available, high-resolution global datasets and our derived regressions can be utilized to estimate a
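The regression step the abstract describes (fitting Have or λf against block-averaged population and nightlight) can be sketched with ordinary least squares; the data, units, and coefficients below are synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic block-averaged predictors: population density and nighttime-light
# intensity, with an assumed linear relation to mean building height Have.
pop = rng.uniform(1e3, 5e4, size=200)        # hypothetical persons per block
light = rng.uniform(0, 63, size=200)         # hypothetical DN values
have = 2.0 + 3e-4 * pop + 0.1 * light + rng.normal(0, 0.5, size=200)

# Ordinary least squares: Have ~ intercept + pop + light.
X = np.column_stack([np.ones_like(pop), pop, light])
coef, *_ = np.linalg.lstsq(X, have, rcond=None)
predicted = X @ coef                          # distributed Have estimates
```

In the study itself, separate curves of this kind are kept for high- and low-GDP cities, and the fitted equations are then applied globally wherever the satellite predictors exist.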
The forecaster's added value in QPF
NASA Astrophysics Data System (ADS)
Turco, M.; Milelli, M.
2009-04-01
skill scores of two competitive forecasts. It is important to underline that the conclusions refer to the analysis of the Piemonte operational alert system, so they cannot be directly taken as universally true. But we think that some of the main lessons that can be derived from this study could be useful for the meteorological community. In detail, the main conclusions are the following: · despite the overall improvement at the global scale and the fact that the resolution of the limited area models has increased considerably over recent years, the QPF produced by the meteorological models involved in this study has not improved enough to allow its direct use: the subjective HQPF continues to offer the best performance; · in the forecast process, the step where humans have the largest added value with respect to mathematical models is communication. In fact the human characterisation and communication of the forecast uncertainty to end users cannot be replaced by any computer code; · QPF verification is one of the most important activities of a Centro Funzionale because it allows a better understanding of the model behaviour in the different meteorological configurations, highlights the systematic characteristics, and helps in evaluating the reliability, in average or extreme values, over the long term or in current situations; · eventually, although there is no novelty in this study, we would like to show that the correct application of appropriate statistical techniques permits a better definition and quantification of the errors and, most importantly, allows a correct (unbiased) communication between forecasters and decision makers.
Bayesian Methods for Scalable Multivariate Value-Added Assessment
ERIC Educational Resources Information Center
Lockwood, J. R.; McCaffrey, Daniel F.; Mariano, Louis T.; Setodji, Claude
2007-01-01
There is increased interest in value-added models relying on longitudinal student-level test score data to isolate teachers' contributions to student achievement. The complex linkage of students to teachers as students progress through grades poses both substantive and computational challenges. This article introduces a multivariate Bayesian…
The Principal Axis Approach to Value-Added Calculation
ERIC Educational Resources Information Center
He, Qingping; Tymms, Peter
2014-01-01
The assessment of the achievement of students and the quality of schools has drawn increasing attention from educational researchers, policy makers, and practitioners. Various test-based accountability and feedback systems involving the use of value-added techniques have been developed for evaluating the effectiveness of individual teaching…
Value-Added Modeling and Educational Accountability: Are We Answering the Real Questions?
ERIC Educational Resources Information Center
Everson, Kimberlee C.
2017-01-01
Value-added estimates of teacher or school quality are increasingly used for both high- and low-stakes accountability purposes, making understanding of their limitations critical. A review of the recent value-added literature suggests three concerns with the state of the research. First, the issues receiving the most research attention have not…
Value-added care: a paradigm shift in patient care delivery.
Upenieks, Valda V; Akhavan, Jaleh; Kotlerman, Jenny
2008-01-01
Spiraling costs in health care have placed hospitals in a constant state of transition. As a result, nursing practice is now influenced by numerous factors and has remained in a continuous state of flux. Multiple changes within the last 2 decades in nurse/patient ratio and the blend of front-line nurses are examples of this transition. To reframe nursing practice into an economic equation that captures cost, quality, and service, a paradigm shift in thinking is needed in order to assess work redesign. Nursing productivity must be evaluated in terms of value-added care, a vision that goes beyond direct care activities and includes team collaboration, physician rounding, increased RN-to-aide communication, and patient centeredness, all of which are crucial to the nurse's role and the patient's well-being. The science of appropriate staffing depends on assessment and implementation of systematic changes best illustrated through a "systems theory" framework. A throughput transformation is required to create process changes with input elements (number of front-line nurses) in order to increase time spent in value-added care and to decrease waste activities, with an improvement in efficiency, quality, and service. The purpose of this pilot study was two-fold: (a) to gain an understanding of how much time RNs spent in value-added care, and (b) to determine whether increasing the combined level of RNs and unlicensed assistive personnel increased the amount of time spent in value-added care compared to time spent in necessary tasks and waste.
Analysis of Added Value of Subscores with Respect to Classification
ERIC Educational Resources Information Center
Sinharay, Sandip
2014-01-01
Brennan noted that users of test scores often want (indeed, demand) that subscores be reported, along with total test scores, for diagnostic purposes. Haberman suggested a method based on classical test theory (CTT) to determine if subscores have added value over the total score. One way to interpret the method is that a subscore has added value…
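Under simplifying classical-test-theory assumptions, Haberman's check reduces to comparing the proportional reduction in mean squared error (PRMSE) when predicting an examinee's true subscore from the observed subscore versus from the total score. The closed-form expressions below are a simplified reading of that idea, not the full procedure from the paper:

```python
def subscore_has_added_value(subscore_reliability, corr_subscore_total):
    """Simplified Haberman-style check: a subscore has added value when
    it predicts the true subscore better than the total score does.

    Under these simplifying assumptions, the PRMSE of predicting the
    true subscore from the observed subscore is the subscore's
    reliability, while predicting it from the total score gives
    corr(subscore, total)**2 / reliability.
    """
    prmse_from_subscore = subscore_reliability
    prmse_from_total = corr_subscore_total ** 2 / subscore_reliability
    return prmse_from_subscore > prmse_from_total
```

Equivalently, under these assumptions a subscore has added value exactly when its reliability exceeds its observed correlation with the total score, which is why highly reliable but distinct subscores are the ones worth reporting.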
Diagnostic and Value-Added Assessment of Business Writing
ERIC Educational Resources Information Center
Fraser, Linda; Harich, Katrin; Norby, Joni; Brzovic, Kathy; Rizkallah, Teeanna; Loewy, Dana
2005-01-01
To assess students' business writing abilities upon entry into the business program and exit from the capstone course, a multitiered assessment package was developed that measures students' achievement of specific learning outcomes and provides "value-added" scores. The online segment of the test measures five competencies across three process…
Impediments to the Estimation of Teacher Value Added
ERIC Educational Resources Information Center
Ishii, Jun; Rivkin, Steven G.
2009-01-01
This article considers potential impediments to the estimation of teacher quality caused primarily by the purposeful behavior of families, administrators, and teachers. The discussion highlights the benefits of accounting for student and school differences through a value-added modeling approach that incorporates a student's history of family,…
Reflections on the added value of using mixed methods in the SCAPE study.
Murphy, Kathy; Casey, Dympna; Devane, Declan; Meskell, Pauline; Higgins, Agnes; Elliot, Naomi; Lalor, Joan; Begley, Cecily
2014-03-01
To reflect on the added value that a mixed method design gave in a large national evaluation study of specialist and advanced practice (SCAPE), and to propose a reporting guide that could help make explicit the added value of mixed methods in other studies. Recently, researchers have focused on how to carry out mixed methods research (MMR) rigorously. The value-added claims for MMR include the capacity to exploit the strengths and compensate for the weaknesses inherent in single designs, generate comprehensive descriptions of phenomena, produce more convincing results for funders or policy-makers, and build methodological expertise. Data illustrating these value-added claims were drawn from the SCAPE study. Studies about the purpose of mixed methods were identified from a search of the literature. The authors explain why and how they undertook components of the study, and propose a guideline to facilitate such studies. If MMR is to become the third methodological paradigm, then articulation of what extra benefit MMR adds to a study is essential. The authors conclude that MMR has added value and found the guideline useful as a way of making value claims explicit. The clear articulation of the procedural aspects of mixed-methods research, and the identification of a guideline to facilitate such research, will enable researchers to learn more effectively from each other.
Value-added uses for crude glycerol--a byproduct of biodiesel production
2012-01-01
Biodiesel is a promising alternative and renewable fuel. As its production increases, so does production of the principal co-product, crude glycerol. The effective utilization of crude glycerol will contribute to the viability of biodiesel. In this review, the composition and quality factors of crude glycerol are discussed, and its value-added utilization opportunities are reviewed. The majority of crude glycerol is used as feedstock for the production of other value-added chemicals, followed by animal feeds. PMID:22413907
Tobler, Amy L; Komro, Kelli A; Dabroski, Alexis; Aveyard, Paul; Markham, Wolfgang A
2011-06-01
We examined whether schools achieving better than expected educational outcomes for their students influence the risk of drug use and delinquency among urban, racial/ethnic minority youth. Adolescents (n = 2,621), who were primarily African American and Hispanic and enrolled in Chicago public schools (n = 61), completed surveys in 6th (aged 12) and 8th (aged 14) grades. Value-added education was derived from standardized residuals of regression equations predicting school-level academic achievement and attendance from students' sociodemographic profiles and defined as having higher academic achievement and attendance than that expected given the sociodemographic profile of the schools' student composition. Multilevel logistic regression estimated the effects of value-added education on students' drug use and delinquency. After considering initial risk behavior, value-added education was associated with lower incidence of alcohol, cigarette and marijuana use; stealing; and participating in a group-against-group fight. Significant beneficial effects of value-added education remained for cigarette and marijuana use, stealing and participating in a group-against-group fight after adjustment for individual- and school-level covariates. Alcohol use (past month and heavy episodic) showed marginally significant trends in the hypothesized direction after these adjustments. Inner-city schools may break the links between social disadvantage, drug use and delinquency. Identifying the processes related to value-added education in order to improve school environments is warranted given the high costs associated with individual-level interventions.
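The value-added construct described above, standardized residuals from a regression of school-level outcomes on sociodemographic profiles, can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the function and variable names are hypothetical:

```python
import numpy as np

def value_added(school_outcomes, sociodemographics):
    """Standardized residuals from an OLS regression of school-level
    outcomes (e.g., mean achievement or attendance) on the schools'
    sociodemographic profiles. Positive values indicate a school doing
    better than its student composition would predict."""
    X = np.column_stack([np.ones(len(school_outcomes)), sociodemographics])
    beta, *_ = np.linalg.lstsq(X, school_outcomes, rcond=None)
    residuals = school_outcomes - X @ beta
    return residuals / residuals.std(ddof=1)
```

Because the regression includes an intercept, the residuals are mean-zero by construction; dividing by their standard deviation yields the standardized value-added scores used as the school-level predictor.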
Understanding Achievement Differences between Schools in Ireland--Can Existing Data-Sets Help?
ERIC Educational Resources Information Center
Gilleece, Lorraine
2014-01-01
Recent years have seen an increased focus on school accountability in Ireland and calls for greater use to be made of student achievement data for monitoring student outcomes. In this paper, it is argued that existing data-sets in Ireland offer limited potential for the value-added modelling approaches used for accountability purposes in many…
Value-Added Analysis and Education Policy. Brief 1
ERIC Educational Resources Information Center
Rivkin, Steven G.
2007-01-01
This brief describes estimation and measurement issues relevant to estimating the quality of instruction in the context of a cumulative model of learning. It also discusses implications for the use of value-added estimates in personnel and compensation matters. The discussion highlights the importance of accounting for student differences and the…
Zeng, Lili; Wang, Dongxiao; Chen, Ju; Wang, Weiqiang; Chen, Rongyu
2016-04-26
In addition to the oceanographic data available for the South China Sea (SCS) from the World Ocean Database (WOD) and Array for Real-time Geostrophic Oceanography (Argo) floats, a suite of observations has been made by the South China Sea Institute of Oceanology (SCSIO) starting from the 1970s. Here, we assemble a SCS Physical Oceanographic Dataset (SCSPOD14) based on 51,392 validated temperature and salinity profiles collected from these three datasets for the period 1919-2014. A gridded dataset of climatological monthly mean temperature, salinity, and mixed and isothermal layer depth derived from an objective analysis of profiles is also presented. Comparisons with the World Ocean Atlas (WOA) and IFREMER/LOS Mixed Layer Depth Climatology confirm the reliability of the new dataset. This unique dataset offers an invaluable baseline perspective on the thermodynamic processes, spatial and temporal variability of water masses, and basin-scale and mesoscale oceanic structures in the SCS. We anticipate improvements and regular updates to this product as more observations become available from existing and future in situ networks.
Relative added value: what are the tools to evaluate it?
Le Jeunne, Claire; Woronoff-Lemsi, Marie-Christine; David, Nadine; de Sahb, Rima
2008-01-01
The relative added value of a drug is currently evaluated in France by the Transparency Commission (TC) of the National Health Authority (HAS), by assigning a level of Improvement in Actual Benefit (IAB). IAB is based on two parameters, the efficacy and safety of the product, in a defined target population, either as compared to one or more other drugs with similar indications, or within the therapeutic strategy. The items used for evaluation, including the level of clinical effect, the relevance of the comparator, the choice of comparison criteria and the methodology used (indirect comparison, non-inferiority studies, etc.), have been reviewed by the working group in Giens with regard to an analysis of the opinions issued by the TC between 2004 and 2007 in several therapeutic areas. First of all, this attempt at rationalisation based on the criteria used to assess the relative added value demonstrated the rareness of direct comparative data, and was followed by a discussion on the possible broadening of the evaluation criteria. The group discussed taking into account the Public Health Impact (PHI), which has now been incorporated into the assessment of Actual Benefit (AB). The group believes that PHI seems to be more related to the notion of IAB than to that of AB. Indeed, it is frequently the relative added value of a new drug that produces an impact in public health. Conversely, considering the comparative evaluation criteria of PHI, which are not systematically taken into account in IMSR (such as improvement in the health of the population, meeting a public health need or impact on the healthcare system), PHI could legitimately be included in the assessment of the relative added value of a drug. Other parameters such as compliance or impact on professional practice have been considered. Thus, the notion of relative added value, evaluated at initial registration, could be based on an expected improvement in medical service. The notion of expected medical service leads to the
Liguori, Rossana; Ventorino, Valeria; Pepe, Olimpia; Faraco, Vincenza
2016-01-01
Lignocellulosic biomasses derived from dedicated crops and agro-industrial residual materials are promising renewable resources for the production of fuels and other added-value bioproducts. Due to their tolerance of a wide range of environments, the dedicated crops can be cultivated on marginal lands, avoiding conflict with food production and having beneficial effects on the environment. Besides, the agro-industrial residual materials represent an abundant, available, and cheap source of bioproducts that completely cuts out the economic and environmental issues related to the cultivation of energy crops. Different processing steps such as pretreatment, hydrolysis and microbial fermentation are needed to convert biomass into added-value bioproducts. The reactor configuration, the operative conditions, and the operation mode of the conversion processes are crucial parameters for a high yield and productivity of the biomass bioconversion process. This review summarizes the latest progress in the bioreactor field, with particular attention to new configurations and agitation systems, for the conversion of dedicated energy crops (Arundo donax) and residual materials (corn stover, wheat straw, mesquite wood, agave bagasse, fruit and citrus peel wastes, sunflower seed hull, switchgrass, poplar sawdust, cogon grass, sugarcane bagasse, and poplar wood) into sugars and ethanol. The main novelty of this review is its focus on reactor components and properties.
Value-added service in health care institutions.
Umiker, W
1996-12-01
In today's highly competitive atmosphere, the survival of health care institutions depends largely on the ability to provide value-added services (VAS) at the lowest possible cost. Managers must identify their customers and delineate the needs and expectations of those customers. A strategy for satisfying these needs and expectations is essential. While technical advances and reasonable charges are important, a successful "high-tech," "high-touch" approach demands the combination of process reengineering and employee training in customer relations.
A new global 1-km dataset of percentage tree cover derived from remote sensing
DeFries, R.S.; Hansen, M.C.; Townshend, J.R.G.; Janetos, A.C.; Loveland, Thomas R.
2000-01-01
Accurate assessment of the spatial extent of forest cover is a crucial requirement for quantifying the sources and sinks of carbon from the terrestrial biosphere. In the more immediate context of the United Nations Framework Convention on Climate Change, implementation of the Kyoto Protocol calls for estimates of carbon stocks for a baseline year as well as for subsequent years. Data sources from country level statistics and other ground-based information are based on varying definitions of 'forest' and are consequently problematic for obtaining spatially and temporally consistent carbon stock estimates. By combining two datasets previously derived from the Advanced Very High Resolution Radiometer (AVHRR) at 1 km spatial resolution, we have generated a prototype global map depicting percentage tree cover and associated proportions of trees with different leaf longevity (evergreen and deciduous) and leaf type (broadleaf and needleleaf). The product is intended for use in terrestrial carbon cycle models, in conjunction with other spatial datasets such as climate and soil type, to obtain more consistent and reliable estimates of carbon stocks. The percentage tree cover dataset is available through the Global Land Cover Facility at the University of Maryland at http://glcf.umiacs.umd.edu.
NASA Technical Reports Server (NTRS)
Berard, Peter R.
1993-01-01
Researchers in the Molecular Sciences Research Center (MSRC) of Pacific Northwest Laboratory (PNL) currently generate massive amounts of scientific data. The amount of data that will need to be managed by the turn of the century is expected to increase significantly. Automated tools that support the management, maintenance, and sharing of this data are minimal. Researchers typically manage their own data by physically moving datasets to and from long term storage devices and recording a dataset's historical information in a laboratory notebook. Even though it is not the most efficient use of resources, researchers have tolerated the process. The solution to this problem will evolve over the next three years in three phases. PNL plans to add sophistication to existing multilevel file system (MLFS) software by integrating it with an object database management system (ODBMS). The first phase in the evolution is currently underway. A prototype system of limited scale is being used to gather information that will feed into the next two phases. This paper describes the prototype system, identifies the successes and problems/complications experienced to date, and outlines PNL's long term goals and objectives in providing a permanent solution.
Chapter 17: Adding Value to the Biorefinery with Lignin: An Engineer's Perspective
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biddy, Mary J
There is a long-standing belief that 'you can make anything out of lignin...except money.' This chapter serves to highlight that opportunities for making money from biomass-derived lignin exist both with current technology, in the production of steam and power, and in emerging areas of R&D focused on value-added chemical and material coproducts from lignin. To understand and quantify the economic potential for lignin valorization, the techno-economic analysis methodology is first described in detail. As demonstrated in the provided case study, these types of economic evaluations serve not only to estimate the economic impacts that lignin conversion could have for an integrated biorefinery and outline drivers for further cost reduction, but also to identify data gaps and R&D needs for improving the design basis and reducing the risk for process scale-up.
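Techno-economic analyses of the kind described typically discount a project's yearly cash flows to decide whether a lignin coproduct improves biorefinery economics. A minimal, generic net-present-value sketch; the numbers are hypothetical and not taken from the chapter's case study:

```python
def npv(cash_flows, discount_rate):
    """Net present value of yearly cash flows, with year 0
    undiscounted (typically the upfront capital outlay)."""
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(cash_flows))

# Hypothetical coproduct project: -100 capital in year 0,
# +30 per year of net revenue for 5 years, 10% discount rate.
project = [-100] + [30] * 5
```

A positive NPV at the chosen discount rate is the usual go/no-go signal; full TEA studies layer minimum selling price, sensitivity, and risk analysis on top of this core calculation.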
NASA Astrophysics Data System (ADS)
Sembiring, M. T.; Wahyuni, D.; Sinaga, T. S.; Silaban, A.
2018-02-01
Cost allocation in manufacturing industry, particularly in palm oil mills, is still widely practiced based on estimation, which leads to cost distortion. Besides, the processing time determined by the company is not in accordance with the actual processing time at each work station. Hence, the purpose of this study is to eliminate non-value-added activities so that processing time can be shortened and production cost reduced. The Activity Based Costing method is used in this research to calculate production cost with value-added and non-value-added activities taken into consideration. The results show processing time decreases of 35.75% at the Weighting Bridge Station, 29.77% at the Sorting Station, 5.05% at the Loading Ramp Station, and 0.79% at the Sterilizer Station. The cost of manufacturing for Crude Palm Oil is IDR 5.236,81/kg calculated by the Traditional Method, IDR 4.583,37/kg calculated by the Activity Based Costing method before implementation of Activity Improvement, and IDR 4.581,71/kg after implementation of Activity Improvement. Meanwhile, the cost of manufacturing for Palm Kernel is IDR 2.159,50/kg calculated by the Traditional Method, IDR 4.584,63/kg calculated by the Activity Based Costing method before implementation of Activity Improvement, and IDR 4.582,97/kg after implementation of Activity Improvement.
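Activity-Based Costing assigns overhead to products through activity cost-driver rates rather than a single volume-based rate, which is what removes the distortion the study describes. A minimal sketch with hypothetical activities and numbers (not the study's data):

```python
def abc_unit_cost(cost_pools, driver_use, units_produced):
    """Activity-Based Costing allocation.

    cost_pools: {activity: (pool_cost, total_driver_quantity)}
    driver_use: {activity: driver quantity consumed by this product}
    Returns the overhead cost allocated per unit of product."""
    allocated = sum(pool_cost / total_qty * driver_use.get(activity, 0)
                    for activity, (pool_cost, total_qty) in cost_pools.items())
    return allocated / units_produced

# Hypothetical: a sterilizer cost pool of 1000 spread over 100
# machine-hours; the product consumes 50 hours and yields 10 units.
```

Eliminating non-value-added activities shrinks the cost pools (or the driver quantities consumed), which is how shorter processing times flow through to a lower cost per kilogram.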
Rathore, Kusum; Cekanova, Maria
2015-01-01
Doxorubicin (DOX) is one of the most commonly used chemotherapeutic treatments for a wide range of cancers. N-benzyladriamycin-14-valerate (AD198) is a lipophilic anthracycline that has been shown to target conventional and novel isoforms of protein kinase C (PKC) in the cytoplasm of cells. Because of the adverse effects of DOX, including hair loss, nausea, vomiting, liver dysfunction, and cardiotoxicity, novel derivatives of DOX have been synthesized and validated. In this study, we evaluated the effects of DOX and its derivative, AD198, on cell viability of three canine transitional cell carcinoma (K9TCC) (K9TCC#1-Lillie, K9TCC#2-Dakota, K9TCC#4-Molly) and three canine osteosarcoma (K9OSA) (K9OSA#1-Zoe, K9OSA#2-Nashville, K9OSA#3-JJ) primary cancer cell lines. DOX and AD198 significantly inhibited cell proliferation in all tested K9TCC and K9OSA cell lines in a dose-dependent manner. AD198 inhibited cell viability of the tested K9TCC and K9OSA cell lines more efficiently than DOX at the same concentration using the MTS (3-(4,5-dimethylthiazol-2-yl)-5-(3-carboxymethoxyphenyl)-2-(4-sulfophenyl)-2H-tetrazolium) assay. AD198 had lower IC50 values than DOX for all tested K9TCC and K9OSA cell lines. In addition, AD198 increased apoptosis in all tested K9TCC and K9OSA cell lines. AD198 increased caspase activity in the tested K9TCC and K9OSA cell lines, which was confirmed by caspase-3/7 assay, and cleavage of poly (ADP-ribose) polymerase (PARP) was confirmed by Western blotting analysis. In addition, AD198 cleaved PKC-δ, which subsequently activated the p38 signaling pathway, resulting in apoptosis of the tested K9TCC and K9OSA cell lines. Inhibition of the p38 signaling pathway by SB203580 rescued DOX- and AD198-induced apoptosis in the tested K9TCC and K9OSA cell lines. Our in vitro results suggest that AD198 might be considered as a new treatment option for K9TCC and K9OSA cancers in vivo.
Publicly Releasing a Large Simulation Dataset with NDS Labs
NASA Astrophysics Data System (ADS)
Goldbaum, Nathan
2016-03-01
Optimally, all publicly funded research should be accompanied by the tools, code, and data necessary to fully reproduce the analysis performed in journal articles describing the research. This ideal can be difficult to attain, particularly when dealing with large (>10 TB) simulation datasets. In this lightning talk, we describe the process of publicly releasing a large simulation dataset to accompany the submission of a journal article. The simulation was performed using Enzo, an open source, community-developed N-body/hydrodynamics code and was analyzed using a wide range of community-developed tools in the scientific Python ecosystem. Although the simulation was performed and analyzed using an ecosystem of sustainably developed tools, we enable sustainable science using our data by making it publicly available. Combining the data release with the NDS Labs infrastructure allows a substantial amount of added value, including web-based access to analysis and visualization using the yt analysis package through an IPython notebook interface. In addition, we are able to accompany the paper submission to the arXiv preprint server with links to the raw simulation data as well as interactive real-time data visualizations that readers can explore on their own or share with colleagues during journal club discussions. It is our hope that the value added by these services will substantially increase the impact and readership of the paper.
Conversion of sweet sorghum bagasse into value-added biochar
USDA-ARS?s Scientific Manuscript database
Sweet sorghum bagasse is an untapped resourceful carbon-rich material that can be thermochemically converted into value-added biochars. These biochars can be applied to the field as soil amendment for soil health enhancement, improved soil carbon content, water holding capacity, soil drainage and a...
Costs and added value of haemodialysis and peritoneal dialysis outsourcing agreements.
Lamas Barreiro, J M; Alonso Suárez, M; Saavedra Alonso, J A; Gándara Martínez, A
2011-01-01
Despite the discrepancy in results from Spanish studies on the costs of dialysis, it is assumed that peritoneal dialysis (PD) is more efficient than haemodialysis (HD). To analyse the costs and added value of HD and PD outsourcing agreements in Galicia, the medical transport for HD, and the relationship between the cost of the agreement and the cost of consumables used in continuous ambulatory peritoneal dialysis (CAPD) with bicarbonate. The cost of the outsourcing agreements and the staff was obtained from official publications. The costs of PD and medical transport were calculated using health service data for one month and extrapolating to one year. The cost of CAPD consumables was provided by the suppliers. The added value was calculated from the investments generated for each agreement treating 40 patients. Expressed per patient per year, the mean costs for treatment were €21,595 and €25,664 in HD and PD, respectively. Medical transport varied between €3,323 and €6,338, while the CAPD agreement and consumables cost €19,268 and €12,057, respectively. The added value was greater with the HD agreement, especially considering the jobs created. One cannot generalise that the cost of PD, which is significantly influenced by prescriptions, is lower than that of HD. It would be appropriate to review the additional cost relative to consumables in the CAPD agreement. The added value generated by dialysis agreements should be considered in future studies and in health planning. More controlled studies are needed to better understand this issue.
Dhamankar, Himanshu; Prather, Kristala L J
2011-08-01
The dwindling nature of petroleum and other fossil reserves has provided impetus towards microbial synthesis of fuels and value added chemicals from biomass-derived sugars as a renewable resource. Microbes have naturally evolved enzymes and pathways that can convert biomass into hundreds of unique chemical structures, a property that can be effectively exploited for their engineering into Microbial Chemical Factories (MCFs). De novo pathway engineering facilitates expansion of the repertoire of microbially synthesized compounds beyond natural products. In this review, we visit some recent successes in such novel pathway engineering and optimization, with particular emphasis on the selection and engineering of pathway enzymes and balancing of their accessory cofactors. Copyright © 2011 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
McCaffrey, Daniel F.
2012-01-01
Value-added models have caught the interest of policymakers because, unlike using student tests scores for other means of accountability, they purport to "level the playing field." That is, they supposedly reflect only a teacher's effectiveness, not whether she teaches high- or low-income students, for instance, or students in accelerated or…
Using Value-Added Measures of Teacher Quality. Brief 9
ERIC Educational Resources Information Center
Hanushek, Eric A.; Rivkin, Steven G.
2010-01-01
Extensive education research on the contribution of teachers to student achievement produces two generally accepted results. First, teacher quality varies substantially as measured by the value added to student achievement or future academic attainment or earnings. Second, variables often used to determine entry into the profession and…
The Student Mathematics Portfolio: Value Added to Student Preparation?
ERIC Educational Resources Information Center
Burks, Robert
2010-01-01
This article describes key elements for educators to successfully implement a student mathematics portfolio in an undergraduate mathematics course. This article offers practical advice for implementing a student mathematics portfolio in a freshman precalculus course. We look at the potential value added to student class preparation and compare our…
Can Value-Added Measures of Teacher Performance Be Trusted?
ERIC Educational Resources Information Center
Guarino, Cassandra M.; Reckase, Mark D.; Wooldridge, Jeffrey M.
2015-01-01
We investigate whether commonly used value-added estimation strategies produce accurate estimates of teacher effects under a variety of scenarios. We estimate teacher effects in simulated student achievement data sets that mimic plausible types of student grouping and teacher assignment scenarios. We find that no one method accurately captures…
Using a 'value-added' approach for contextual design of geographic information.
May, Andrew J
2013-11-01
The aim of this article is to demonstrate how a 'value-added' approach can be used for user-centred design of geographic information. An information science perspective was used, with value being the difference in outcomes arising from alternative information sets. Sixteen drivers navigated a complex, unfamiliar urban route, using visual and verbal instructions representing the distance-to-turn and junction layout information presented by typical satellite navigation systems. Data measuring driving errors, navigation errors and driver confidence were collected throughout the trial. The results show how driver performance varied considerably according to the geographic context at specific locations, and that there are specific opportunities to add value with enhanced geographical information. The conclusions are that a value-added approach facilitates a more explicit focus on 'desired' (and feasible) levels of end user performance with different information sets, and is a potentially effective approach to user-centred design of geographic information. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
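The article's definition of value, the difference in outcomes arising from alternative information sets, can be computed directly once an outcome measure (for example, navigation errors per drive) is chosen. A minimal sketch; the function and variable names are hypothetical:

```python
from statistics import mean

def information_value_added(outcomes_with, outcomes_without):
    """Value added by an enhanced information set: the difference in
    mean outcome between users given the enhanced set and users given
    only the baseline set. Outcomes here are error counts, so the sign
    is flipped to make 'higher is better'."""
    return mean(outcomes_without) - mean(outcomes_with)

# Hypothetical navigation-error counts per drive for two driver groups
errors_enhanced = [1, 0, 2, 1]
errors_baseline = [3, 2, 4, 3]
```

In a full study the difference would of course be tested for significance and broken down by geographic context, as the trial above does location by location.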
MicroRNA array normalization: an evaluation using a randomized dataset as the benchmark.
Qin, Li-Xuan; Zhou, Qin
2014-01-01
MicroRNA arrays possess a number of unique data features that challenge the assumptions key to many normalization methods. We assessed the performance of existing normalization methods using two microRNA array datasets derived from the same set of tumor samples: one dataset was generated using a blocked randomization design when assigning arrays to samples and hence was free of confounding array effects; the second dataset was generated without blocking or randomization and exhibited array effects. The randomized dataset was assessed for differential expression between two tumor groups and treated as the benchmark. The non-randomized dataset was assessed for differential expression after normalization and compared against the benchmark. Normalization improved the true positive rate significantly in the non-randomized data, but the false discovery rate remained as high as 50%. Adding a batch adjustment step before normalization further reduced the number of false positive markers while maintaining a similar number of true positive markers, which resulted in a false discovery rate of 32% to 48%, depending on the specific normalization method. We conclude with some insights on possible causes of false discoveries, to shed light on how to improve normalization for microRNA arrays.
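Given the randomized benchmark's list of truly differential markers, the true positive rate and false discovery rate quoted above are simple set operations. A sketch (names hypothetical):

```python
def benchmark_rates(called_markers, true_markers):
    """Compare markers called significant in the non-randomized dataset
    against the randomized-benchmark truth set.

    Returns (FDR, TPR): FDR is the fraction of called markers that are
    false positives; TPR is the fraction of true markers recovered."""
    called, truth = set(called_markers), set(true_markers)
    true_pos = len(called & truth)
    fdr = (len(called) - true_pos) / len(called) if called else 0.0
    tpr = true_pos / len(truth) if truth else 0.0
    return fdr, tpr
```

Evaluating each normalization pipeline with this pair of rates is what lets the study rank methods and quantify the gain from the pre-normalization batch adjustment.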
Gururaj, Anupama E.; Chen, Xiaoling; Pournejati, Saeid; Alter, George; Hersh, William R.; Demner-Fushman, Dina; Ohno-Machado, Lucila
2017-01-01
The rapid proliferation of publicly available biomedical datasets has provided abundant resources that are potentially of value as a means to reproduce prior experiments, and to generate and explore novel hypotheses. However, there are a number of barriers to the re-use of such datasets, which are distributed across a broad array of dataset repositories, focusing on different data types and indexed using different terminologies. New methods are needed to enable biomedical researchers to locate datasets of interest within this rapidly expanding information ecosystem, and new resources are needed for the formal evaluation of these methods as they emerge. In this paper, we describe the design and generation of a benchmark for information retrieval of biomedical datasets, which was developed and used for the 2016 bioCADDIE Dataset Retrieval Challenge. In the tradition of the seminal Cranfield experiments, and as exemplified by the Text Retrieval Conference (TREC), this benchmark includes a corpus (biomedical datasets), a set of queries, and relevance judgments relating these queries to elements of the corpus. This paper describes the process through which each of these elements was derived, with a focus on those aspects that distinguish this benchmark from typical information retrieval reference sets. Specifically, we discuss the origin of our queries in the context of a larger collaborative effort, the biomedical and healthCAre Data Discovery Index Ecosystem (bioCADDIE) consortium, and the distinguishing features of biomedical dataset retrieval as a task. The resulting benchmark set has been made publicly available to advance research in the area of biomedical dataset retrieval. Database URL: https://biocaddie.org/benchmark-data PMID:29220453
ERIC Educational Resources Information Center
Lincove, Jane Arnold; Osborne, Cynthia; Dillon, Amanda; Mills, Nicholas
2014-01-01
Despite questions about validity and reliability, the use of value-added estimation methods has moved beyond academic research into state accountability systems for teachers, schools, and teacher preparation programs (TPPs). Prior studies of value-added measurement for TPPs test the validity of researcher-designed models and find that measuring…
Leiva-Candia, D E; Tsakona, S; Kopsahelis, N; García, I L; Papanikolaou, S; Dorado, M P; Koutinas, A A
2015-08-01
This study focuses on the valorisation of crude glycerol and sunflower meal (SFM) from conventional biodiesel production plants for the separation of value-added co-products (antioxidant-rich extracts and protein isolate) and for enhancing biodiesel production through microbial oil synthesis. Microbial oil production was evaluated using three oleaginous yeast strains (Rhodosporidium toruloides, Lipomyces starkeyi and Cryptococcus curvatus) cultivated on crude glycerol and nutrient-rich hydrolysates derived from either whole SFM or SFM fractions that remained after separation of value-added co-products. Fed-batch bioreactor cultures with R. toruloides led to the production of 37.4 g L⁻¹ of total dry weight with a microbial oil content of 51.3% (w/w) when a biorefinery concept based on SFM fractionation was employed. The estimated biodiesel properties conformed to the limits set by the EN 14214 and ASTM D 6751 standards. The estimated cold filter plugging point (7.3-8.6 °C) of the lipids produced by R. toruloides is close to that of biodiesel derived from palm oil. Copyright © 2015 Elsevier Ltd. All rights reserved.
A Multilayer Dataset of SSM/I-Derived Global Ocean Surface Turbulent Fluxes
NASA Technical Reports Server (NTRS)
Chou, Shu-Hsien; Shie, Chung-Lin; Atlas, Robert M.; Ardizzone, Joe; Nelkin, Eric; Einaud, Franco (Technical Monitor)
2001-01-01
A dataset of daily- and monthly-mean turbulent fluxes (momentum, latent heat, and sensible heat) and relevant parameters over the global oceans, derived from Special Sensor Microwave/Imager (SSM/I) data for the period July 1987-December 1994, together with the 1988-94 annual and monthly-mean climatologies of the same variables, has been created. It has a spatial resolution of 2.0° × 2.5° latitude-longitude. The retrieved surface air humidity is found to be generally accurate compared with collocated radiosonde observations over the global oceans. The retrieved wind stress and latent heat flux show useful accuracy when verified against research-quality ship and buoy measurements in the western equatorial Pacific. The 1988-94 seasonal-mean wind stress and latent heat flux show reasonable patterns related to seasonal variations of the atmospheric general circulation. The patterns of 1990-93 annual-mean turbulent fluxes and input variables are generally in good agreement with one of the best global analyzed flux datasets, which is based on COADS (Comprehensive Ocean-Atmosphere Data Set) with corrections to wind speeds and covers the same period. The retrieved wind speed is generally within ±1 m/s of the COADS-based values, but is stronger by approximately 1-2 m/s in the northern extratropical oceans. This discrepancy is suggested to be mainly due to higher COADS-modified wind speeds resulting from underestimation of anemometer heights. Compared with the COADS-based values, the retrieved latent heat flux and sea-air humidity difference are generally larger, with significant differences in the trade wind zones and the ocean south of 40°S (up to approximately 40-60 W/m² and approximately 1-1.5 g/kg). This discrepancy is believed to be mainly caused by higher COADS-based surface air humidity arising from the overestimation of dew point temperatures and from the extrapolation of observed high humidity southward into data-void regions south of 40°S. The retrieved sensible heat flux is generally within ±5
A sustainable biorefinery to convert agricultural residues into value-added chemicals.
Liu, Zhiguo; Liao, Wei; Liu, Yan
2016-01-01
Animal wastes are of particular environmental concern due to greenhouse gases emissions, odor problem, and potential water contamination. Anaerobic digestion (AD) is an effective and widely used technology to treat them for bioenergy production. However, the sustainability of AD is compromised by two by-products of the nutrient-rich liquid digestate and the fiber-rich solid digestate. To overcome these limitations, this paper demonstrates a biorefinery concept to fully utilize animal wastes and create a new value-added route for animal waste management. The studied biorefinery includes an AD, electrocoagulation (EC) treatment of the liquid digestate, and fungal conversion of the solid fiber into a fine chemical-chitin. Animal wastes were first treated by an AD to produce methane gas for energy generation to power the entire biorefinery. The resulting liquid digestate was treated by EC to reclaim water. Enzymatic hydrolysis and fungal fermentation were then applied on the cellulose-rich solid digestate to produce chitin. EC water was used as the processing water for the fungal fermentation. The results indicate that the studied biorefinery converts 1 kg dry animal wastes into 17 g fungal biomass containing 12 % of chitin (10 % of glucosamine), and generates 1.7 MJ renewable energy and 8.5 kg irrigation water. This study demonstrates an energy positive and freshwater-free biorefinery to simultaneously treat animal wastes and produce a fine chemical-chitin. The sustainable biorefinery concept provides a win-win solution for agricultural waste management and value-added chemical production.
Rathore, Kusum; Cekanova, Maria
2015-01-01
Doxorubicin (DOX) is one of the most commonly used chemotherapeutic treatments for a wide range of cancers. N-benzyladriamycin-14-valerate (AD198) is a lipophilic anthracycline that has been shown to target conventional and novel isoforms of protein kinase C (PKC) in the cytoplasm of cells. Because of the adverse effects of DOX, including hair loss, nausea, vomiting, liver dysfunction, and cardiotoxicity, novel derivatives of DOX have been synthesized and validated. In this study, we evaluated the effects of DOX and its derivative, AD198, on cell viability of three canine transitional cell carcinoma (K9TCC) (K9TCC#1-Lillie, K9TCC#2-Dakota, K9TCC#4-Molly) and three canine osteosarcoma (K9OSA) (K9OSA#1-Zoe, K9OSA#2-Nashville, K9OSA#3-JJ) primary cancer cell lines. DOX and AD198 significantly inhibited cell proliferation in all tested K9TCC and K9OSA cell lines in a dose-dependent manner. AD198 inhibited cell viability of the tested K9TCC and K9OSA cell lines more efficiently than DOX at the same concentration using the MTS (3-(4,5-dimethylthiazol-2-yl)-5-(3-carboxymethoxyphenyl)-2-(4-sulfophenyl)-2H-tetrazolium) assay. AD198 had lower IC50 values than DOX for all tested K9TCC and K9OSA cell lines. In addition, AD198 increased apoptosis in all tested K9TCC and K9OSA cell lines. AD198 increased caspase activity in the tested K9TCC and K9OSA cell lines, which was confirmed by caspase-3/7 assay, and cleavage of poly (ADP-ribose) polymerase (PARP) was confirmed by Western blotting analysis. In addition, AD198 cleaved PKC-δ, which subsequently activated the p38 signaling pathway, resulting in apoptosis of the tested K9TCC and K9OSA cell lines. Inhibition of the p38 signaling pathway by SB203580 rescued the tested K9TCC and K9OSA cell lines from DOX- and AD198-induced apoptosis. Our in vitro results suggest that AD198 might be considered as a new treatment option for K9TCC and K9OSA cancers in vivo. PMID:26451087
A Brief History of Educational "Value-Added": How Did We Get to Where We Are?
ERIC Educational Resources Information Center
Saunders, Lesley
1999-01-01
Explains how and why the economics concept "value added" came to be used in an educational context, focusing on early usage in the United Kingdom. The term has been developed, used, and defined in various, conflicting ways. Some ambiguities cannot be eliminated. Value-added effectiveness measures involve value judgments. (44 references)…
Is there potential added value in COSMO-CLM forced by ERA reanalysis data?
NASA Astrophysics Data System (ADS)
Lenz, Claus-Jürgen; Früh, Barbara; Adalatpanah, Fatemeh Davary
2017-12-01
The potential added value (PAV) concept suggested by Di Luca et al. (Clim Dyn 40:443-464, 2013a) is applied to ERA-Interim-driven runs of the regional climate model COSMO-CLM. The runs are performed for the period 1979-2013 over the EURO-CORDEX domain at horizontal grid resolutions of 0.11°, 0.22°, and 0.44°, such that each higher-resolution model grid fits into the next coarser grid. The PAV concept is applied to annual, seasonal, and monthly means of the 2 m air temperature. Results show the highest potential added value for the run with the finest grid, and PAV generally increases with increasing resolution. The potential added value depends strongly on the season as well as on the region of consideration. The gain in PAV is larger when refining the resolution from 0.44° to 0.22° than from 0.22° to 0.11°. For grid aggregations to 0.88° and 1.76°, the differences in PAV between the COSMO-CLM runs at the above grid resolutions are largest; they nearly vanish for aggregations to even coarser grids. In all cases at least 80% of the PAV is contributed by its stationary part.
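The PAV comparison above rests on aggregating fine-grid model output onto successively coarser grids and asking how much variability the coarse grid cannot represent. A minimal sketch of that aggregation step, assuming simple block averaging; the variance of the sub-grid residual is used here only as a crude stand-in for the PAV statistic (Di Luca et al. define PAV differently):

```python
import numpy as np

def block_aggregate(field, k):
    """Average k x k blocks of a 2-D field onto a k-times-coarser grid."""
    n, m = field.shape
    f = field[: n - n % k, : m - m % k]           # trim to a multiple of k
    return f.reshape(n // k, k, m // k, k).mean(axis=(1, 3))

def pav_proxy(fine, k):
    """Variance carried by scales the coarse grid cannot represent --
    an illustrative proxy only, not the published PAV definition."""
    coarse = block_aggregate(fine, k)
    upsampled = np.kron(coarse, np.ones((k, k)))  # repeat each coarse cell
    n, m = upsampled.shape
    return float(np.var(fine[:n, :m] - upsampled))

smooth = np.ones((8, 8))                          # no sub-grid variability
noisy = smooth + np.random.default_rng(0).normal(size=(8, 8))
pav_smooth = pav_proxy(smooth, 2)                 # exactly 0: nothing sub-grid
pav_noisy = pav_proxy(noisy, 2)                   # positive: fine scales present
```

A field with no sub-grid structure yields zero, while added fine-scale noise yields a positive value, mirroring how finer model grids can only add value where sub-grid variability exists.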
Gregory, Simon; Patterson, Fiona; Baron, Helen; Knight, Alec; Walsh, Kieran; Irish, Bill; Thomas, Sally
2016-10-01
Increasing pressure is being placed on external accountability and cost efficiency in medical education and training internationally. We present an illustrative data analysis of the value added by postgraduate medical education. We analysed historical selection (entry) and licensure (exit) examination results for trainees sitting the UK Membership of the Royal College of General Practitioners (MRCGP) licensing examination (N = 2291). Selection data comprised: a clinical problem solving test (CPST); a situational judgement test (SJT); and a selection centre (SC). The exit measure was the applied knowledge test (AKT) from the MRCGP. Ordinary least squares (OLS) regression analyses were used to model differences in attainment on the AKT based on performance at selection (the value-added score). Results were aggregated to the regional level for comparison. We found significant differences in the value-added score between regional training providers. While three training providers conferred significant added value, one training provider scored significantly lower than would be predicted from the attainment of its trainees at selection. Value-added analysis in postgraduate medical education potentially offers useful information, although the methodology is complex and controversial and has significant limitations. Developing such models further could offer important insights to support continuous improvement in medical education in future.
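The OLS value-added approach described in this abstract can be sketched in a few lines: regress the exit score on the selection scores, take each trainee's residual (observed minus predicted exit score), and average residuals within each training provider. The data below are synthetic and the variable names are illustrative stand-ins, not the MRCGP dataset:

```python
import numpy as np

def value_added_scores(X, y, groups):
    """Residual-based value-added: regress exit scores y on selection
    scores X, then average residuals within each group."""
    X1 = np.column_stack([np.ones(len(y)), X])     # design matrix with intercept
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # OLS fit
    resid = y - X1 @ beta                          # observed minus predicted
    return {g: float(resid[groups == g].mean()) for g in np.unique(groups)}

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                      # stand-ins for CPST, SJT, SC
effect = np.repeat([0.5, -0.5], 100)               # provider A adds value, B subtracts
y = X @ np.array([1.0, 0.5, 0.3]) + effect + rng.normal(scale=0.1, size=200)
groups = np.repeat(np.array(["A", "B"]), 100)
scores = value_added_scores(X, y, groups)          # recovers A > 0 > B
```

Averaging residuals by provider is the simplest version of the idea; the paper's caveats about complexity and limitations apply, since residual differences can also reflect unmeasured trainee characteristics rather than provider quality.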
Value-Added Dairy Products from Grass-Based Dairy Farms: A Case Study in Vermont
ERIC Educational Resources Information Center
Wang, Qingbin; Parsons, Robert; Colby, Jennifer; Castle, Jeffrey
2016-01-01
On-farm processing of value-added dairy products can be a way for small dairy farms to diversify production and increase revenue. This article examines characteristics of three groups of Vermont farmers who have grass-based dairy farms--those producing value-added dairy products, those interested in such products, and those not interested in such…
The effect of added dimensionality on perceived image value
NASA Astrophysics Data System (ADS)
Farnand, Susan
2008-01-01
Texture is an important element of the world around us. It can convey information about the object at hand. Although embossing has been used in a limited way, to enhance the appearance of greeting cards and book covers for example, texture is something that printed material traditionally lacks. Recently, techniques have been developed that allow the incorporation of texture in printed material. Prints made using such processes are similar to traditional 2D prints but have added texture such that a reproduction of an oil painting can have the texture of oil paint on canvas or a picture of a lizard can actually have the texture of lizard skin. It seems intuitive that the added dimensionality would add to the perceived quality of the image, but to what degree? To examine the question of the impact of a third dimension on the perceived quality of printed images, a survey was conducted asking participants to determine the relative worth of sets of print products. Pairs of print products were created, where one print of each pair was 2D and the other was the same image with added texture. Using these print pairs, thirty people from the Rochester Institute of Technology community were surveyed. The participants were shown seven pairs of print products and asked to rate the relative value of each pair by apportioning a specified amount of money between the two items according to their perception of what each item was worth. The results indicated that the addition of a third dimension or texture to the printed images gave a clear boost to the perceived worth of the printed products. The rating results were 50% higher for the 3D products than the 2D products, with the participants apportioning approximately 60% of each dollar to the 3D product and 40% to the 2D product. About 80% of the time participants felt that the 3D items had at least some added value over their 2D counterparts, about 15% of the time, they felt the products were essentially equivalent in value and 4% of
NASA Astrophysics Data System (ADS)
Prat, O. P.; Nelson, B. R.
2013-12-01
We use a suite of quantitative precipitation estimates (QPEs) derived from satellite, radar, surface observations, and models to derive precipitation characteristics over CONUS for the period 2002-2012. This comparison effort includes the satellite multi-sensor datasets TMPA 3B42, CMORPH, and PERSIANN. The satellite-based QPEs are compared over the concurrent period with the NCEP Stage IV product, a near-real-time product providing precipitation data at the hourly temporal scale gridded at a nominal 4-km spatial resolution. In addition, the remotely sensed precipitation datasets are compared with surface observations from the Global Historical Climatology Network (GHCN-Daily) and from PRISM (Parameter-elevation Regressions on Independent Slopes Model), which provides gridded precipitation estimates used as a baseline for the multi-sensor QPE comparison. The comparisons are performed at the annual, seasonal, monthly, and daily scales, with focus on selected river basins (Southeastern US, Pacific Northwest, Great Plains). While unconditional annual rain rates show satisfactory agreement among all products, the results suggest that the satellite QPE datasets exhibit important biases, particularly at higher rain rates (≥4 mm/day). Conversely, on seasonal scales, differences between remotely sensed data and ground surface observations can be greater than 50%, and up to 90% for low daily accumulations (≤1 mm/day), such as in the Western US (summer) and Central US (winter). The conditional analysis performed using different daily rainfall accumulation thresholds (from low rainfall intensity to intense precipitation) shows that while intense events measured at the ground are infrequent (around 2% for daily accumulations above 2 inches/day), the remotely sensed products displayed differences of 20-50% and up to 90-100%. A discussion on the impact of differing spatial and temporal resolutions with respect to the datasets' ability to capture extreme
Value-added beef products (Productos Carnicos con Valor Agregado)
Mac Donaldson; Will Holder; Jan Holder
2006-01-01
I'm speaking for Will and Jan Holder, who couldn't be here. I happen to be familiar with Will and Jan's company, Ervin's Natural Beef, and its program because I've sold them cattle. Will and Jan's value-added beef program is based on their family ranch in the area known as The Blue, in the mountains of eastern Arizona.
Multi-Angle Snowflake Camera Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shkurko, Konstantin; Garrett, T.; Gaustad, K
The Multi-Angle Snowflake Camera (MASC) addresses a need for high-resolution multi-angle imaging of hydrometeors in freefall with simultaneous measurement of fallspeed. As illustrated in Figure 1, the MASC consists of three cameras, separated by 36°, each pointing at an identical focal point approximately 10 cm away. Located immediately above each camera, a light aims directly at the center of depth of field for its corresponding camera. The focal point at which the cameras are aimed lies within a ring through which hydrometeors fall. The ring houses a system of near-infrared emitter-detector pairs, arranged in two arrays separated vertically by 32 mm. When hydrometeors pass through the lower array, they simultaneously trigger all cameras and lights. Fallspeed is calculated from the time it takes to traverse the distance between the upper and lower triggering arrays. The trigger electronics filter out ambient light fluctuations associated with varying sunlight and shadows. The microprocessor onboard the MASC controls the camera system and communicates with the personal computer (PC). The image data is sent via FireWire 800 line, and fallspeed (and camera control) is sent via a Universal Serial Bus (USB) line that relies on RS232-over-USB serial conversion. See Table 1 for specific details on the MASC located at the Oliktok Point Mobile Facility on the North Slope of Alaska. The value-added product (VAP) detailed in this documentation analyzes the raw data (Section 2.0) using Python: images rely on the OpenCV image processing library and derived aggregated statistics rely on some clever averaging. See Sections 4.1 and 4.2 for more details on what variables are computed.
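The fallspeed calculation described above is simply distance over time across the 32 mm vertical separation between the trigger arrays. A minimal sketch (the function name and interface are illustrative, not taken from the VAP's actual code):

```python
def fallspeed_m_per_s(dt_seconds, array_separation_mm=32.0):
    """Fallspeed of a hydrometeor that covers the vertical separation
    between the MASC's upper and lower trigger arrays in dt_seconds."""
    if dt_seconds <= 0:
        raise ValueError("time between trigger events must be positive")
    return (array_separation_mm / 1000.0) / dt_seconds  # mm -> m, then m/s

v = fallspeed_m_per_s(0.032)   # 32 mm traversed in 32 ms -> 1.0 m/s
```

In practice the VAP must also reject spurious triggers (e.g. two different particles hitting the arrays), which is why the trigger electronics filter ambient light fluctuations.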
78 FR 70260 - Inviting Applications for Value-Added Producer Grants
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-25
... end goals. All proposals must demonstrate economic viability and sustainability in order to compete... development of a defined program of economic planning activities to determine the viability of a potential... enter into value-added activities. Awards may be made for either economic planning or working capital...
Assessing the Added Value of Dynamical Downscaling Using the Standardized Precipitation Index
In this study, the Standardized Precipitation Index (SPI) is used to ascertain the added value of dynamical downscaling over the contiguous United States. WRF is used as a regional climate model (RCM) to dynamically downscale reanalysis fields to compare values of SPI over drough...
MicroRNA Array Normalization: An Evaluation Using a Randomized Dataset as the Benchmark
Qin, Li-Xuan; Zhou, Qin
2014-01-01
MicroRNA arrays possess a number of unique data features that challenge the assumption key to many normalization methods. We assessed the performance of existing normalization methods using two microRNA array datasets derived from the same set of tumor samples: one dataset was generated using a blocked randomization design when assigning arrays to samples and hence was free of confounding array effects; the second dataset was generated without blocking or randomization and exhibited array effects. The randomized dataset was assessed for differential expression between two tumor groups and treated as the benchmark. The non-randomized dataset was assessed for differential expression after normalization and compared against the benchmark. Normalization improved the true positive rate significantly in the non-randomized data but still possessed a false discovery rate as high as 50%. Adding a batch adjustment step before normalization further reduced the number of false positive markers while maintaining a similar number of true positive markers, which resulted in a false discovery rate of 32% to 48%, depending on the specific normalization method. We concluded the paper with some insights on possible causes of false discoveries to shed light on how to improve normalization for microRNA arrays. PMID:24905456
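The abstract's key finding is that a batch adjustment step before per-array normalization reduces false discoveries. A toy sketch of that two-step ordering, assuming the simplest possible versions of each step (per-batch mean removal, then per-array median centering); real pipelines use methods such as quantile normalization or ComBat, which are not reproduced here:

```python
import numpy as np

def batch_then_normalize(expr, batch):
    """Toy pre-processing: (1) remove per-batch mean shifts,
    (2) median-center each array (column). Illustrative only."""
    out = expr.astype(float).copy()
    for b in np.unique(batch):
        cols = batch == b
        out[:, cols] -= out[:, cols].mean()          # batch adjustment
    out -= np.median(out, axis=0, keepdims=True)     # per-array normalization
    return out

rng = np.random.default_rng(0)
expr = rng.normal(size=(50, 6))                      # 50 microRNAs x 6 arrays
batch = np.array([0, 0, 0, 1, 1, 1])                 # arrays assigned in two batches
expr[:, batch == 1] += 3.0                           # simulated batch effect
out = batch_then_normalize(expr, batch)
```

Reversing the two steps would let the batch shift leak into the per-array normalization, which is the kind of confounding the randomized benchmark dataset in the paper was designed to expose.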
Inuloxins A-D and derivatives as antileishmanial agents: structure-activity relationship study
USDA-ARS?s Scientific Manuscript database
Inuloxins A-D (1-4) and α-costic acid (5), the phytotoxic compounds previously isolated from Inula viscosa, as well as synthetic derivatives of inuloxin A (compounds 6-10), inuloxin C (compound 11) and inuloxin D (compound 12) were tested in vitro for their activity against Leishmania donovani, the ...
The added clinical and economic value of diagnostic testing for epilepsy surgery.
Hinde, Sebastian; Soares, Marta; Burch, Jane; Marson, Anthony; Woolacott, Nerys; Palmer, Stephen
2014-05-01
The costs, benefits and risks associated with diagnostic imaging investigations for epilepsy surgery necessitate the identification of an optimal pathway in the pre-surgical workup. In order to assess the added value of additional investigations a full cost-effectiveness evaluation should be conducted, taking into account all of the life-time costs and benefits associated with undertaking additional investigations. This paper considers and applies the appropriate framework against which a full evaluation should be assessed. We conducted a systematic review to evaluate the progression of the literature through this framework, finding that only isolated elements of added value have been appropriately evaluated. The results from applying the full added value framework are also presented, identifying an optimal strategy for pre-surgical evaluation for temporal lobe epilepsy surgery. Our results suggest that additional FDG-PET and invasive EEG investigations after an initially discordant MRI and video-EEG appears cost-effective, and that the value of subsequent invasive-EEGs is closely linked to the maintenance of longer-term benefits after surgery. It is integral to the evaluation of imaging technologies in the work-up for epilepsy surgery that the impact of the use of these technologies on clinical decision-making, and on further treatment decisions, is considered fully when informing cost-effectiveness. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
What's the Difference? A Model for Measuring the Value Added by Higher Education in Australia
ERIC Educational Resources Information Center
Coates, Hamish
2009-01-01
Measures of student learning are playing an increasingly significant role in determining the quality and productivity of higher education. This paper evaluates approaches for estimating the value added by university education, and proposes a methodology for use by institutions and systems. The paper argues that value-added measures of learning are…
All Sizzle and No Steak: Value-Added Model Doesn't Add Value in Houston
ERIC Educational Resources Information Center
Amrein-Beardsley, Audrey; Geiger, Tray
2017-01-01
Houston's experience with the Educational Value-Added Assessment System (R) (EVAAS) raises questions that other districts should consider before buying the software and using it for high-stakes decisions. Researchers found that teachers in Houston, all of whom were under the EVAAS gun, but who taught relatively more racial minority students,…
Are Abeta and its derivatives causative agents or innocent bystanders in AD?
Robakis, Nikolaos K
2010-01-01
Alzheimer's disease (AD) is characterized by neurodegeneration in neocortical regions of the brain. Currently, Abeta-based theories, including amyloid depositions and soluble Abeta, form the basis of most therapeutic approaches to AD. It remains unclear, however, whether Abeta and its derivatives are the primary causative agents of neuronal loss in AD. Reported studies show no significant correlations between brain amyloid depositions and either degree of dementia or loss of neurons, and brain amyloid loads similar to AD are often found in normal individuals. Furthermore, behavioral abnormalities in animal models overexpressing amyloid precursor protein seem independent of amyloid depositions. Soluble Abeta theories propose toxic Abeta42 or its oligomers as the agents that promote cell death in AD. Abeta peptides, however, are normal components of human serum and CSF, and it is unclear under what conditions these peptides become toxic. Presently, there is little evidence of disease-associated abnormalities in soluble Abeta and no toxic oligomers specific to AD have been found. That familial AD mutations of amyloid precursor protein, PS1 and PS2 promote neurodegeneration suggests the biological functions of these proteins play critical roles in neuronal survival. Evidence shows that the PS/gamma-secretase system promotes production of peptides involved in cell surface-to-nucleus signaling and gene expression, providing support for the hypothesis that familial AD mutations may contribute to neurodegeneration by inhibiting PS-dependent signaling pathways. Copyright 2010 S. Karger AG, Basel.
ERIC Educational Resources Information Center
Karl, Andrew T.; Yang, Yan; Lohr, Sharon L.
2013-01-01
Value-added models have been widely used to assess the contributions of individual teachers and schools to students' academic growth based on longitudinal student achievement outcomes. There is concern, however, that ignoring the presence of missing values, which are common in longitudinal studies, can bias teachers' value-added scores.…
Using a Value-Added Approach to Assess the Sociology Major
ERIC Educational Resources Information Center
Pedersen, Daphne E.; White, Frank
2011-01-01
Universities across the nation have been called upon to provide evidence of student learning through direct means of assessment. Value-added assessment, which aims to document the development of student learning from the beginning of the university experience to the end, has been called "accountability's new frontier" by the American…
ERIC Educational Resources Information Center
Harris, Douglas N.
2012-01-01
In the recent drive to revamp teacher evaluation and accountability, measures of a teacher's value added have played the starring role. But the star of the show is not always the best actor, nor can the star succeed without a strong supporting cast. In assessing teacher performance, observations of classroom practice, portfolios of teachers' work,…
On the added value of forensic science and grand innovation challenges for the forensic community.
van Asten, Arian C
2014-03-01
In this paper the insights and results are presented of a long term and ongoing improvement effort within the Netherlands Forensic Institute (NFI) to establish a valuable innovation programme. From the overall perspective of the role and use of forensic science in the criminal justice system, the concepts of Forensic Information Value Added (FIVA) and Forensic Information Value Efficiency (FIVE) are introduced. From these concepts the key factors determining the added value of forensic investigations are discussed; Evidential Value, Relevance, Quality, Speed and Cost. By unravelling the added value of forensic science and combining this with the future needs and scientific and technological developments, six forensic grand challenges are introduced: i) Molecular Photo-fitting; ii) chemical imaging, profiling and age estimation of finger marks; iii) Advancing Forensic Medicine; iv) Objective Forensic Evaluation; v) the Digital Forensic Service Centre and vi) Real time In-Situ Chemical Identification. Finally, models for forensic innovation are presented that could lead to major international breakthroughs on all these six themes within a five year time span. This could cause a step change in the added value of forensic science and would make forensic investigative methods even more valuable than they already are today. © 2013. Published by Elsevier Ireland Ltd on behalf of Forensic Science Society. All rights reserved.
Non-minimal derivative coupling gravity in cosmology
NASA Astrophysics Data System (ADS)
Gumjudpai, Burin; Rangdee, Phongsaphat
2015-11-01
We give a brief review of the non-minimal derivative coupling (NMDC) scalar field theory, in which the scalar field derivative term couples non-minimally to the Einstein tensor. We assume that the expansion is of power-law or super-acceleration type at small redshift. The Lagrangian includes the NMDC term, a free kinetic term, a cosmological constant term and a barotropic matter term. For a value of the coupling constant that is compatible with inflation, we use the combined WMAP9 (WMAP9 + eCMB + BAO + H_0) dataset, the PLANCK + WP dataset, and the PLANCK TT, TE, EE + lowP + Lensing + ext datasets to find the value of the cosmological constant in the model. Modeling the expansion with a power law gives a negative cosmological constant, while the phantom power-law (super-acceleration) expansion gives a positive cosmological constant with a large error bar. The value obtained is of the same order as in the ΛCDM model, since at late times the NMDC effect is tiny due to the small curvature.
Adding Value to Scholarly Journals through a Citation Indexing System
ERIC Educational Resources Information Center
Zainab, A. N.; Abrizah, A.; Raj, R. G.
2013-01-01
Purpose: The purpose of this paper is to relate the problems identified about scholarly journal publishing in Malaysia to establish motivation for the system development; to describe the design of MyCite, a Malaysian citation indexing system and to highlight the added value to journals and articles indexed through the generation of bibliometrics…
Government Workers Adding Societal Value: The Ohio Workforce Development Program
ERIC Educational Resources Information Center
Guerra, Ingrid; Bernardez, Mariano; Jones, Michael; Zidan, Suhail
2005-01-01
This case study illustrates the application of Mega--adding measurable value for all stakeholders including society--as the central and ultimate focus for needs assessment. In this case, two needs assessment studies were conducted within a five-year period (1999-2003) with the State of Ohio's Workforce Development (WD) program. An initial needs…
Kaitaniemi, Pekka
2008-04-09
Allometric equations are widely used in many branches of biological science. The potential information content of the normalization constant b in allometric equations of the form Y = bX^a has, however, remained largely neglected. To demonstrate the potential for utilizing this information, I generated a large number of artificial datasets that resembled those frequently encountered in biological studies, i.e., relatively small samples including measurement error or uncontrolled variation. The value of X was allowed to vary randomly within limits describing different data ranges, and a was set to a fixed theoretical value. The constant b was set to a range of values describing the effect of a continuous environmental variable. In addition, a normally distributed random error was added to the values of both X and Y. Two different approaches were then used to model the data. The traditional approach estimated both a and b using a regression model, whereas an alternative approach set the exponent a at its theoretical value and estimated only the value of b. Both approaches produced virtually the same model fit, with less than 0.3% difference in the coefficient of determination. Only the alternative approach was able to precisely reproduce the effect of the environmental variable, which was largely lost among noise variation when using the traditional approach. The results show how the value of b can be used as a source of valuable biological information if an appropriate regression model is selected.
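The alternative approach described above has a one-line estimator: on the log scale Y = bX^a becomes log Y = log b + a log X, so with a fixed at its theoretical value, log b is just the mean of (log Y - a log X). A small sketch with synthetic data (the exponent 0.75 and noise level are illustrative choices, not from the paper):

```python
import numpy as np

def fit_b_fixed_exponent(X, Y, a):
    """Estimate b in Y = b * X**a with the exponent a held fixed:
    on the log scale, log b = mean(log Y - a * log X)."""
    return float(np.exp(np.mean(np.log(Y) - a * np.log(X))))

rng = np.random.default_rng(1)
X = rng.uniform(1.0, 10.0, size=200)          # predictor over a limited range
a_theory, b_true = 0.75, 2.0                  # fixed theoretical exponent
Y = b_true * X**a_theory * np.exp(rng.normal(scale=0.05, size=200))
b_hat = fit_b_fixed_exponent(X, Y, a_theory)  # close to b_true
```

Because only one parameter is estimated, b is not free to trade off against a during fitting, which is why the paper finds that the environmental signal carried by b survives noise that swamps the two-parameter regression.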
A New Interpretation of Augmented Subscores and Their Added Value in Terms of Parallel Forms
ERIC Educational Resources Information Center
Sinharay, Sandip
2018-01-01
The value-added method of Haberman is arguably one of the most popular methods to evaluate the quality of subscores. The method is based on the classical test theory and deems a subscore to be of added value if the subscore predicts the corresponding true subscore better than does the total score. Sinharay provided an interpretation of the added…
Korea Institute for Advanced Study Value-Added Galaxy Catalog
NASA Astrophysics Data System (ADS)
Choi, Yun-Young; Han, Du-Hwan; Kim, Sungsoo S.
2010-12-01
We present the Korea Institute for Advanced Study Value-Added Galaxy Catalog (KIAS VAGC), a catalog of galaxies based on the Large Scale Structure (LSS) sample of the New York University Value-Added Galaxy Catalog (NYU VAGC) Data Release 7. Our catalog supplements redshifts of 10,497 galaxies with 10 < r_{P} ≤ 17.6 (1455 with 10 < r_{P} ≤ 14.5) to the NYU VAGC LSS sample. Redshifts from various existing catalogs such as the Updated Zwicky Catalog, the IRAS Point Source Catalog Redshift Survey, the Third Reference Catalogue of Bright Galaxies, and the Two Degree Field Galaxy Redshift Survey have been added to the NYU VAGC photometric catalog. Our supplementation significantly improves spectroscopic completeness: the area covered by the spectroscopic sample with completeness higher than 95% increases from 1.737 to 2.119 sr. Our catalog also provides morphological types of all galaxies, determined by the automated morphology classification scheme of Park & Choi (2005), and related parameters, together with fundamental photometry parameters supplied by the NYU VAGC. Our catalog contains matches to objects in the Max Planck Institute for Astrophysics (MPA) & Johns Hopkins University (JHU) spectrum measurements (Data Release 7). This new catalog, the KIAS VAGC, is complementary to the NYU VAGC and MPA-JHU catalogs.
NASA Astrophysics Data System (ADS)
Löwe, P.; Hammitzsch, M.; Babeyko, A.; Wächter, J.
2012-04-01
The development of new Tsunami Early Warning Systems (TEWS) requires modelling the spatio-temporal spreading of tsunami waves, both for recorded past events and for hypothetical future cases. The model results are maintained in digital repositories for use in TEWS command and control units for situation assessment once a real tsunami occurs. The simulation results must therefore be absolutely trustworthy, in the sense that the quality of these datasets is assured. This is a prerequisite, as solid decision making during a crisis event and the dissemination of dependable warning messages to communities at risk will be based on them. This requires data format validity, but even more the integrity and information value of the content, a value-added product derived from raw tsunami model output. Quality checking of simulation result products can be done in multiple ways, yet the visual verification of both temporal and spatial spreading characteristics for each simulation remains important. The eye of the human observer still remains an unmatched tool for the detection of irregularities. This requires the availability of convenient, human-accessible mappings of each simulation. The improvement of tsunami models necessitates changes to many variables, including simulation end-parameters. Whenever new improved iterations of the general models or underlying spatial data are evaluated, hundreds to thousands of tsunami model results must be generated for each model iteration, each one having distinct initial parameter settings. The use of a Compute Cluster Environment (CCE) of sufficient size allows the automated generation of all tsunami results within a model iteration in little time, a significant improvement over linear processing on dedicated desktop machines or servers. This allows for accelerated visual quality checking iterations, which in turn feed back positively into the overall model improvement. An approach to set
NASA Technical Reports Server (NTRS)
Moody, E. G.; King, M. D.; Platnick, S.; Schaaf, C. B.; Gao, F.
2004-01-01
Spectral land surface albedo is an important parameter for describing the radiative properties of the Earth. Accordingly, it reflects the consequences of natural and human interactions, such as anthropogenic, meteorological, and phenological effects, on global and local climatological trends. Consequently, albedos are integral parts in a variety of research areas, such as general circulation models (GCMs), energy balance studies, modeling of land use and land use change, and biophysical, oceanographic, and meteorological studies. The availability of global albedo data over a large range of spectral channels and at high spatial resolution has dramatically improved with the launch of the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard NASA's Earth Observing System (EOS) Terra spacecraft in December 1999. However, lack of spatial and temporal coverage due to cloud and snow effects can preclude utilization of official products in production and research studies. We report on a technique used to fill incomplete MOD43 albedo data sets with the intention of providing complete value-added maps. The technique is influenced by the phenological concept that within a certain area, a pixel's ecosystem class should exhibit similar growth cycle events over the same time period. The shape of an area's phenological temporal curve can be imposed upon existing pixel-level data to fill missing temporal points. The methodology will be reviewed by showcasing 2001 global and regional results of complete albedo and NDVI data sets.
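The phenological gap-filling idea can be sketched in a few lines: pixels of one ecosystem class share a seasonal curve shape, so a pixel's missing time steps are filled by scaling the class-mean curve to the pixel's valid observations. This is a toy illustration under invented numbers, not the MOD43 filling algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical seasonal data: 100 pixels of one ecosystem class observed at
# 23 composite time steps over a year; each pixel scales a shared curve shape.
n_pixels, n_steps = 100, 23
shape = np.sin(np.linspace(0, np.pi, n_steps)) + 0.2   # class seasonal shape
truth = rng.uniform(0.5, 1.5, n_pixels)[:, None] * shape[None, :]

data = truth.copy()
mask = rng.random(data.shape) < 0.3    # ~30% missing (cloud/snow gaps)
data[mask] = np.nan

# Area-level phenological curve from the available observations.
class_curve = np.nanmean(data, axis=0)

# Impose the curve's shape on each pixel, scaled to its valid data.
for i in range(n_pixels):
    valid = ~mask[i]
    scale = data[i, valid].mean() / class_curve[valid].mean()
    data[i, mask[i]] = scale * class_curve[mask[i]]

rmse = np.sqrt(np.mean((data[mask] - truth[mask]) ** 2))
print(f"fill RMSE: {rmse:.3f}")
```

Because the fill inherits the class curve's temporal shape, gaps are reconstructed consistently with the pixel's own growth cycle rather than interpolated blindly across seasons.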
Evaluating Specification Tests in the Context of Value-Added Estimation
ERIC Educational Resources Information Center
Guarino, Cassandra M.; Reckase, Mark D.; Stacy, Brian W.; Wooldridge, Jeffrey M.
2015-01-01
We study the properties of two specification tests that have been applied to a variety of estimators in the context of value-added measures (VAMs) of teacher and school quality: the Hausman test for choosing between student-level random and fixed effects, and a test for feedback (sometimes called a "falsification test"). We discuss…
Large Dataset of Acute Oral Toxicity Data Created for Testing ...
Acute toxicity data are a common requirement for substance registration in the US. Currently only data derived from animal tests are accepted by regulatory agencies, and the standard in vivo tests use lethality as the endpoint. Non-animal alternatives such as in silico models are being developed due to animal welfare and resource considerations. We compiled a large dataset of oral rat LD50 values to assess the predictive performance of currently available in silico models. Our dataset combines LD50 values from five different sources: literature data provided by The Dow Chemical Company, REACH data from eChemportal, HSDB (Hazardous Substances Data Bank), RTECS data from Leadscope, and the training set underpinning TEST (Toxicity Estimation Software Tool). Combined, these data sources yield 33848 chemical-LD50 pairs (data points), with 23475 unique data points covering 16439 compounds. The entire dataset was loaded into a chemical properties database. All of the compounds were registered in DSSTox and 59.5% have publicly available structures. Compounds without a structure in DSSTox are currently having their structures registered. The structural data will be used to evaluate the predictive performance and applicable chemical domains of three QSAR models (TIMES, PROTOX, and TEST). Future work will combine the dataset with information from ToxCast assays, and using random forest modeling, assess whether ToxCast assays are useful in predicting acute oral toxicity. Pre
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polo, J.; Wilbert, S.; Ruiz-Arias, J. A.
2016-07-01
At any site, the bankability of a projected solar power plant largely depends on the accuracy and general quality of the solar radiation data generated during the solar resource assessment phase. The term 'site adaptation' has recently started to be used in the framework of solar energy projects to refer to the improvement that can be achieved in satellite-derived solar irradiance and model data when short-term local ground measurements are used to correct systematic errors and bias in the original dataset. This contribution presents a preliminary survey of different possible techniques that can improve long-term satellite-derived and model-derived solar radiation data through the use of short-term on-site ground measurements. The possible approaches that are reported here may be applied in different ways, depending on the origin and characteristics of the uncertainties in the modeled data. This work, which is the first step of a forthcoming in-depth assessment of methodologies for site adaptation, has been done within the framework of the International Energy Agency Solar Heating and Cooling Programme Task 46 'Solar Resource Assessment and Forecasting.'
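One of the simplest site-adaptation techniques the survey alludes to is a linear correction fitted on the overlap period between the satellite record and a short ground campaign. The sketch below uses entirely synthetic irradiance numbers (bias, noise, and campaign length are all assumptions) and is not taken from the Task 46 report.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical daily GHI (W/m^2): a 10-year satellite-derived series with a
# systematic multiplicative and additive bias, plus one year of overlapping
# ground measurements at the site.
ground_truth = rng.uniform(100, 800, 3650)              # "true" irradiance
satellite = 1.08 * ground_truth + 25 + rng.normal(0, 20, 3650)

overlap = slice(0, 365)                                 # short-term campaign
ground = ground_truth[overlap]

# Fit a linear correction on the overlap and apply it to the full record.
slope, intercept = np.polyfit(satellite[overlap], ground, 1)
adapted = slope * satellite + intercept

bias_before = np.mean(satellite - ground_truth)
bias_after = np.mean(adapted - ground_truth)
print(f"mean bias before: {bias_before:+.1f}, after: {bias_after:+.1f} W/m^2")
```

In practice the correction family (linear regression, quantile mapping, feature-dependent corrections) is chosen according to where the satellite model's errors come from, which is exactly the point the abstract makes.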
Fast randomization of large genomic datasets while preserving alteration counts.
Gobbi, Andrea; Iorio, Francesco; Dawson, Kevin J; Wedge, David C; Tamborero, David; Alexandrov, Ludmil B; Lopez-Bigas, Nuria; Garnett, Mathew J; Jurman, Giuseppe; Saez-Rodriguez, Julio
2014-09-01
Studying combinatorial patterns in cancer genomic datasets has recently emerged as a tool for identifying novel cancer driver networks. Approaches have been devised to quantify, for example, the tendency of a set of genes to be mutated in a 'mutually exclusive' manner. The significance of the proposed metrics is usually evaluated by computing P-values under appropriate null models. To this end, a Monte Carlo method (the switching-algorithm) is used to sample simulated datasets under a null model that preserves patient- and gene-wise mutation rates. In this method, a genomic dataset is represented as a bipartite network, to which Markov chain updates (switching-steps) are applied. These steps modify the network topology, and a minimal number of them must be executed to draw simulated datasets independently under the null model. This number has previously been deduced empirically to be a linear function of the total number of variants, making this process computationally expensive. We present a novel approximate lower bound for the number of switching-steps, derived analytically. Additionally, we have developed the R package BiRewire, including new efficient implementations of the switching-algorithm. We illustrate the performance of BiRewire by applying it to large real cancer genomics datasets. We report vast reductions in time requirements with respect to existing implementations/bounds for equivalent P-value computations. Thus, we propose BiRewire to study statistical properties in genomic datasets, and other data that can be modeled as bipartite networks. BiRewire is available on BioConductor at http://www.bioconductor.org/packages/2.13/bioc/html/BiRewire.html. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
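A single switching-step on the bipartite representation can be sketched directly on the binary patient × gene matrix. This is a minimal illustration of the degree-preserving swap idea, not the BiRewire implementation; matrix size and density are arbitrary.

```python
import numpy as np

def switching_step(M, rng):
    """Attempt one degree-preserving swap on a binary mutation matrix M
    (rows = patients, cols = genes). Pick two 1-entries (r1,c1),(r2,c2);
    if the checkerboard cells (r1,c2),(r2,c1) are 0, swap the pattern.
    Row and column sums (patient- and gene-wise mutation counts) never change."""
    ones = np.argwhere(M == 1)
    (r1, c1), (r2, c2) = ones[rng.choice(len(ones), 2, replace=False)]
    if r1 != r2 and c1 != c2 and M[r1, c2] == 0 and M[r2, c1] == 0:
        M[r1, c1] = M[r2, c2] = 0
        M[r1, c2] = M[r2, c1] = 1
        return True
    return False

rng = np.random.default_rng(1)
M = (rng.random((20, 30)) < 0.2).astype(int)    # toy mutation matrix
row_sums, col_sums = M.sum(axis=1).copy(), M.sum(axis=0).copy()

for _ in range(500):    # many steps to approach the null distribution
    switching_step(M, rng)

print("degrees preserved:",
      (M.sum(axis=1) == row_sums).all() and (M.sum(axis=0) == col_sums).all())
```

The paper's contribution is the analytic lower bound on how many such steps are needed before successive sampled matrices are effectively independent; the step itself is this cheap.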
Adding Value to Total Joint Arthroplasty Care in an Academic Environment: The Utah Experience.
Pelt, Christopher E; Anderson, Mike B; Erickson, Jill A; Gililland, Jeremy M; Peters, Christopher L
2018-06-01
Adding value in a university-based academic health care system provides unique challenges when compared to other health care delivery models. Herein, we describe our experience in adding value to joint arthroplasty care at the University of Utah, where the concept of value-based health care reform has become an embraced and driving force. To improve value, new resources were needed for care redesign, physician leadership, and engagement in alternative payment models. The changes that occurred at our institution are described. Real-time data and knowledgeable personnel working behind the scenes, while physicians provide clinical care, help move clinical pathway redesigns forward. Engaged physicians are essential to the successful implementation of value creation and care pathway redesign that can lead to improvements in value. An investment of money and resources toward added infrastructure and personnel is often needed to realize large-scale improvements. Alignment of providers, payers, and hospital administration, including by means of gainsharing programs, can lead to improvements. Although significant care pathway redesign efforts may realize substantial initial cost savings, savings may be asymptotic in nature, which calls into question the likely sustainability of programs that incentivize or penalize payments based on historical targets. Copyright © 2018 Elsevier Inc. All rights reserved.
Tan, Christine L.; Hassali, Mohamed A.; Saleem, Fahad; Shafie, Asrul A.; Aljadhey, Hisham; Gan, Vincent B.
2015-01-01
Objective: (i) To develop the Pharmacy Value-Added Services Questionnaire (PVASQ) using emerging themes generated from interviews. (ii) To establish reliability and validity of the questionnaire instrument. Methods: Using an extended Theory of Planned Behavior as the theoretical model, face-to-face interviews generated salient beliefs of pharmacy value-added services. The PVASQ was constructed initially in English incorporating important themes and later translated into the Malay language with forward and backward translation. Intention (INT) to adopt pharmacy value-added services is predicted by attitudes (ATT), subjective norms (SN), perceived behavioral control (PBC), knowledge and expectations. Using a 7-point Likert-type scale and a dichotomous scale, test-retest reliability (N=25) was assessed by administering the questionnaire instrument twice at an interval of one week. Internal consistency was measured by Cronbach's alpha, and construct validity between two administrations was assessed using the kappa statistic and the intraclass correlation coefficient (ICC). Confirmatory Factor Analysis, CFA (N=410), was conducted to assess construct validity of the PVASQ. Results: The kappa coefficients indicate a moderate to almost perfect strength of agreement between test and retest. The ICC for all scales tested for intra-rater (test-retest) reliability was good. The overall Cronbach's alpha (N=25) is 0.912 and 0.908 for the two time points. The result of CFA (N=410) showed most items loaded strongly and correctly into corresponding factors. Only one item was eliminated. Conclusions: This study is the first to develop and establish the reliability and validity of the Pharmacy Value-Added Services Questionnaire instrument using the Theory of Planned Behavior as the theoretical model. The translated Malay language version of PVASQ is reliable and valid to predict Malaysian patients’ intention to adopt pharmacy value-added services to collect partial medicine
Tan, Christine L; Hassali, Mohamed A; Saleem, Fahad; Shafie, Asrul A; Aljadhey, Hisham; Gan, Vincent B
2015-01-01
(i) To develop the Pharmacy Value-Added Services Questionnaire (PVASQ) using emerging themes generated from interviews. (ii) To establish reliability and validity of the questionnaire instrument. Using an extended Theory of Planned Behavior as the theoretical model, face-to-face interviews generated salient beliefs of pharmacy value-added services. The PVASQ was constructed initially in English incorporating important themes and later translated into the Malay language with forward and backward translation. Intention (INT) to adopt pharmacy value-added services is predicted by attitudes (ATT), subjective norms (SN), perceived behavioral control (PBC), knowledge and expectations. Using a 7-point Likert-type scale and a dichotomous scale, test-retest reliability (N=25) was assessed by administering the questionnaire instrument twice at an interval of one week. Internal consistency was measured by Cronbach's alpha, and construct validity between two administrations was assessed using the kappa statistic and the intraclass correlation coefficient (ICC). Confirmatory Factor Analysis, CFA (N=410), was conducted to assess construct validity of the PVASQ. The kappa coefficients indicate a moderate to almost perfect strength of agreement between test and retest. The ICC for all scales tested for intra-rater (test-retest) reliability was good. The overall Cronbach's alpha (N=25) is 0.912 and 0.908 for the two time points. The result of CFA (N=410) showed most items loaded strongly and correctly into corresponding factors. Only one item was eliminated. This study is the first to develop and establish the reliability and validity of the Pharmacy Value-Added Services Questionnaire instrument using the Theory of Planned Behavior as the theoretical model. The translated Malay language version of PVASQ is reliable and valid to predict Malaysian patients' intention to adopt pharmacy value-added services to collect partial medicine supply.
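The internal-consistency statistic both PVASQ abstracts report has a compact closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A small self-contained sketch with made-up Likert responses (not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 7-point Likert responses: 5 respondents, 4 items.
scores = [[5, 6, 5, 6],
          [3, 3, 4, 3],
          [7, 6, 7, 7],
          [2, 3, 2, 2],
          [6, 5, 6, 6]]
print(round(cronbach_alpha(scores), 3))  # → 0.977
```

Values near the study's reported 0.91 indicate that item scores move together strongly across respondents, i.e. the scale is internally consistent.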
An efficient annotation and gene-expression derivation tool for Illumina Solexa datasets.
Hosseini, Parsa; Tremblay, Arianne; Matthews, Benjamin F; Alkharouf, Nadim W
2010-07-02
An Illumina flow cell with all eight lanes occupied produces well over a terabyte worth of images, with gigabytes of reads following sequence alignment. The ability to translate such reads into meaningful annotation is therefore of great concern and importance. Very easily, one can get flooded with such a great volume of textual, unannotated data irrespective of read quality or size. CASAVA, an optional analysis tool for Illumina sequencing experiments, provides INDEL detection, SNP information, and allele calling. Extracting from such analysis not only a measure of gene expression in the form of tag-counts but also annotations of the sequenced reads is therefore of significant value. We developed TASE (Tag counting and Analysis of Solexa Experiments), a rapid tag-counting and annotation software tool specifically designed for Illumina CASAVA sequencing datasets. Developed in Java and deployed using the jTDS JDBC driver and a SQL Server backend, TASE provides an extremely fast means of calculating gene expression through tag-counts while annotating sequenced reads with the gene's presumed function, from any given CASAVA-build. Such a build is generated for both DNA and RNA sequencing. Analysis is broken into two distinct components: DNA sequence or read concatenation, followed by tag-counting and annotation. The end result produces output containing the homology-based functional annotation and the respective gene expression measure signifying how many times sequenced reads were found within the genomic ranges of functional annotations. TASE is a powerful tool to facilitate the process of annotating a given Illumina Solexa sequencing dataset. Our results indicate that both homology-based annotation and tag-count analysis are achieved in very efficient times, allowing researchers to delve deeply into a given CASAVA-build and maximize information extraction from a sequencing dataset. TASE is specially designed to translate sequence data
An efficient annotation and gene-expression derivation tool for Illumina Solexa datasets
2010-01-01
Background An Illumina flow cell with all eight lanes occupied produces well over a terabyte worth of images, with gigabytes of reads following sequence alignment. The ability to translate such reads into meaningful annotation is therefore of great concern and importance. Very easily, one can get flooded with such a great volume of textual, unannotated data irrespective of read quality or size. CASAVA, an optional analysis tool for Illumina sequencing experiments, provides INDEL detection, SNP information, and allele calling. Extracting from such analysis not only a measure of gene expression in the form of tag-counts but also annotations of the sequenced reads is therefore of significant value. Findings We developed TASE (Tag counting and Analysis of Solexa Experiments), a rapid tag-counting and annotation software tool specifically designed for Illumina CASAVA sequencing datasets. Developed in Java and deployed using the jTDS JDBC driver and a SQL Server backend, TASE provides an extremely fast means of calculating gene expression through tag-counts while annotating sequenced reads with the gene's presumed function, from any given CASAVA-build. Such a build is generated for both DNA and RNA sequencing. Analysis is broken into two distinct components: DNA sequence or read concatenation, followed by tag-counting and annotation. The end result produces output containing the homology-based functional annotation and the respective gene expression measure signifying how many times sequenced reads were found within the genomic ranges of functional annotations. Conclusions TASE is a powerful tool to facilitate the process of annotating a given Illumina Solexa sequencing dataset. Our results indicate that both homology-based annotation and tag-count analysis are achieved in very efficient times, allowing researchers to delve deeply into a given CASAVA-build and maximize information extraction from a sequencing dataset. TASE is specially
Dual functional bioactive-peptide, AIMP1-derived peptide (AdP), for anti-aging.
Kim, Jina; Kang, Sujin; Kwon, HanJin; Moon, HoSang; Park, Min Chul
2018-06-19
Human skin aging is caused by several factors, such as UV irradiation, stress, hormones, and pollution. Wrinkle formation and skin pigmentation are representative features of skin aging. Although EGF and arbutin are used as anti-wrinkle and skin whitening agents, respectively, they have adverse effects on skin. When more cosmeceutical ingredients are added to a cosmetic product, adverse effects also accumulate. For these reasons, multifunctional and safe cosmetic ingredients are in demand. The aim of the present study is to investigate a novel anti-aging agent, AIMP1-derived peptide (AdP, INCI name: sh-oligopeptide-5/sh-oligopeptide SP), for cosmetic products. To assess the anti-wrinkle effect of AdP, collagen type I synthesis and fibroblast proliferation were determined on human fibroblasts. The anti-wrinkle effect of AdP was examined by ELISA and cell titer glo assay. To assess whitening, melanin content and tyrosinase activity were determined on melanocytes. The whitening effect of AdP was examined by melanin measurement and enzyme activity assay. The safety of AdP was determined by cytotoxicity and immunogenicity, with CCK-8 and TNF-α ELISA assays, respectively. AdP treatment induced collagen type I synthesis and fibroblast proliferation. Also, AdP treatment inhibited melanin synthesis by regulating tyrosinase activity. The anti-aging effect of AdP is more potent than EGF and arbutin. AdP did not show adverse effects. These results show that AdP can be a dual-functional and safe cosmeceutical agent to prevent skin aging. © 2018 Wiley Periodicals, Inc.
76 FR 37774 - Announcement of Value-Added Producer Grant Application Deadlines
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-28
...-Based Business Ventures develop strategies to create marketing opportunities and to help develop Business Plans for viable marketing opportunities regarding production of bio-based products from... Capital Grants directly related to the processing and/or marketing of value-added products. In order to...
Economic value added: can it apply to an S corporation medical practice?
Shapiro, Michael D
2007-08-01
Typically, owners of medical practices use financial formulas such as ROI and net present value to evaluate the financial benefit of new projects. However, economic value added, a concept used by many large corporations to define and maximize return, may add greater benefit in helping medical practice owners realize a reasonable return on their core business.
Added value measures in education show genetic as well as environmental influence.
Haworth, Claire M A; Asbury, Kathryn; Dale, Philip S; Plomin, Robert
2011-02-02
Does achievement independent of ability or previous attainment provide a purer measure of the added value of school? In a study of 4000 pairs of 12-year-old twins in the UK, we measured achievement with year-long teacher assessments as well as tests. Raw achievement shows moderate heritability (about 50%) and modest shared environmental influences (25%). Unexpectedly, we show that for indices of the added value of school, genetic influences remain moderate (around 50%), and the shared (school) environment is less important (about 12%). The pervasiveness of genetic influence in how and how much children learn is compatible with an active view of learning in which children create their own educational experiences in part on the basis of their genetic propensities.
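The A/C/E split the abstract reports (about 50% genetic, 25% or 12% shared environment) follows the classic twin design, where Falconer's formulas back the components out of monozygotic and dizygotic twin correlations. The arithmetic below uses hypothetical correlations chosen only to reproduce the approximate raw-achievement split; they are not the study's estimates.

```python
# Falconer's estimates from twin correlations (illustrative numbers).
r_mz, r_dz = 0.75, 0.50      # hypothetical MZ and DZ twin correlations

A = 2 * (r_mz - r_dz)        # additive genetic influence (heritability)
C = r_mz - A                 # shared (e.g. school/family) environment
E = 1 - r_mz                 # non-shared environment plus measurement error

print(A, C, E)               # → 0.5 0.25 0.25
```

The study's point is that moving from raw achievement to added-value indices shrinks C (the shared school environment) while A stays near 50%.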
ERIC Educational Resources Information Center
Kersting, Nicole B.; Chen, Mei-kuang; Stigler, James W.
2013-01-01
If teacher value-added estimates (VAEs) are to be used as indicators of individual teacher performance in teacher evaluation and accountability systems, it is important to understand how much VAEs are affected by the data and model specifications used to estimate them. In this study we explored the effects of three conditions on the stability of…
Added-values of high spatiotemporal remote sensing data in crop yield estimation
NASA Astrophysics Data System (ADS)
Gao, F.; Anderson, M. C.
2017-12-01
Timely and accurate estimation of crop yield before harvest is critical for food markets and administrative planning. Remote sensing derived parameters have been used for estimating crop yield with either empirical or crop growth models. The use of remote sensing vegetation indices (VI) in crop yield modeling has typically been evaluated at regional and country scales using coarse spatial resolution (a few hundred meters to kilometers) data, or assessed over a small region at field level using moderate spatial resolution data (10-100 m). Both data sources have shown great potential in capturing spatial and temporal variability in crop yield. However, the added value of data with both high spatial and temporal resolution has not been evaluated due to the lack of such a data source with routine, global coverage. In recent years, more moderate resolution data have become freely available, and data fusion approaches that combine data acquired at different spatial and temporal resolutions have been developed. These make monitoring crop condition and estimating crop yield at field scale possible. Here we investigate the added value of high spatial and temporal resolution VI for describing variability in crop yield. The explanatory ability of crop yield based on high spatial and temporal resolution remote sensing data was evaluated in a rain-fed agricultural area in the U.S. Corn Belt. Results show that the fused Landsat-MODIS (high spatial and temporal resolution) VI explains yield variability better than either single data source (Landsat or MODIS alone), with EVI2 performing slightly better than NDVI. The maximum VI describes yield variability better than cumulative VI. Even though VI is effective in explaining yield variability within a season, the inter-annual variability is more complex and needs additional information (e.g. weather, water use and management). Our findings underscore the importance of high spatiotemporal remote sensing data and support new moderate
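The comparison between maximum and cumulative VI as yield predictors can be sketched on synthetic data. Everything below (field count, EVI2-like curve shape, yield model) is invented for illustration; it shows only how the two predictors are derived from a fused time series and scored by R².

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical fused Landsat-MODIS EVI2 time series: 50 fields observed at
# 20 dates across a season, with yield driven mainly by peak canopy.
n_fields, n_dates = 50, 20
peak = rng.uniform(0.4, 0.9, n_fields)
season = np.sin(np.linspace(0, np.pi, n_dates))   # green-up/senescence shape
vi = peak[:, None] * season[None, :] + rng.normal(0, 0.02, (n_fields, n_dates))
yield_t_ha = 6 + 8 * peak + rng.normal(0, 0.3, n_fields)  # toy yield model

# Two candidate predictors from the per-field VI time series.
max_vi = vi.max(axis=1)     # seasonal maximum VI
cum_vi = vi.sum(axis=1)     # cumulative (season-integrated) VI

def r_squared(x, y):
    return np.corrcoef(x, y)[0, 1] ** 2

print(f"R^2 (max VI):        {r_squared(max_vi, yield_t_ha):.2f}")
print(f"R^2 (cumulative VI): {r_squared(cum_vi, yield_t_ha):.2f}")
```

In the study's real data the seasonal maximum outperformed the cumulative VI; in a toy model this ranking depends on how yield is generated, which is why the synthetic numbers here only demonstrate the mechanics of the comparison.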
NASA Astrophysics Data System (ADS)
Schnepp, Elisabeth; Lanos, Philippe; Chauvin, Annick
2009-08-01
Geomagnetic paleointensities have been determined from a single archaeological site in Lübeck, Germany, where a sequence of 25 bread oven floors has been preserved in a bakery from medieval times until today. Age dating confines the time interval from about 1300 A.D. to about 1750 A.D. Paleomagnetic directions have been published from each oven floor and are updated here. The specimens have very stable directions and no or only weak secondary components. The oven floor material was characterized rock magnetically using Thellier viscosity indices, median destructive field values, Curie point determinations, and hysteresis measurements. Magnetic carriers are mixtures of SD, PSD, and minor MD magnetite and/or maghemite together with small amounts of hematite. Paleointensity was measured from selected specimens with the double-heating Thellier method including pTRM checks and determination of TRM anisotropy tensors. Corrections for anisotropy as well as for cooling rate turned out to be unnecessary. Ninety-two percent of the Thellier experiments passed the assigned acceptance criteria and provided four to six reliable paleointensity estimates per oven floor. Mean paleointensity values derived from 22 oven floors show maxima in the 15th and early 17th centuries A.D., followed by a decrease of paleointensity of about 20% until 1750 A.D. Together with the directions the record represents about 450 years of full vector secular variation. The results compare well with historical models of the Earth's magnetic field as well as with a selected high-quality paleointensity data set for western and central Europe.
Teichgräber, Ulf K; de Bucourt, Maximilian
2012-01-01
OBJECTIVES: To eliminate non-value-adding (NVA) waste in the procurement of endovascular stents in interventional radiology services by applying value stream mapping (VSM). The Lean manufacturing technique was used to analyze the process of material and information flow currently required to direct endovascular stents from external suppliers to patients. Based on a decision point analysis for the procurement of stents in the hospital, a present state VSM was drawn. After assessment of the current state VSM and progressive elimination of unnecessary NVA waste, a future state VSM was drawn. The current state VSM demonstrated that of the 13 processes for the procurement of stents, only 2 were value-adding. Of the NVA processes, 5 were unnecessary NVA activities that could be eliminated. The decision point analysis demonstrated that the procurement of stents was mainly a forecast-driven push system. The future state VSM applies a pull inventory control system to trigger the movement of a unit after withdrawal by using a consignment stock. VSM is a visualization tool for the supply chain and value stream, based on the Toyota Production System, and greatly assists in successfully implementing a Lean system. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Satellite-derived pan-Arctic melt onset dataset, 2000-2009
NASA Astrophysics Data System (ADS)
Wang, L.; Derksen, C.; Howell, S.; Wolken, G. J.; Sharp, M. J.; Markus, T.
2009-12-01
The SeaWinds Scatterometer on QuikSCAT (QS) has been in orbit for over a decade since its launch in June 1999. Due to its high sensitivity to the appearance of liquid water in snow and day/night all weather capability, QS data have been successfully used to detect melt onset and melt duration for various elements of the cryosphere. These melt datasets are especially useful in the polar regions where the application of imagery from optical sensors is hindered by polar nights and frequent cloud cover. In this study, we generate a pan-Arctic, pan-cryosphere melt onset dataset by combining estimates from previously published algorithms optimized for individual cryospheric elements and applied to QS and Special Sensor Microwave Imager (SSM/I) data for the northern high latitude land surface, ice caps, large lakes, and sea ice. Comparisons of melt onset along the boundaries between different components of the cryosphere show that in general the integrated dataset provides consistent and spatially coherent melt onset estimates across the pan-Arctic. We present the climatology and the anomaly patterns in melt onset during 2000-2009, and identify synoptic-scale linkages between atmospheric conditions and the observed patterns. We also investigate the possible trends in melt onset in the pan-Arctic during the 10-year period.
Will Courts Shape Value-Added Methods for Teacher Evaluation? ACT Working Paper Series. WP-2014-2
ERIC Educational Resources Information Center
Croft, Michelle; Buddin, Richard
2014-01-01
As more states begin to adopt teacher evaluation systems based on value-added measures, legal challenges have been filed both seeking to limit the use of value-added measures ("Cook v. Stewart") and others seeking to require more robust evaluation systems ("Vergara v. California"). This study reviews existing teacher evaluation…
Ionic liquid solutions as extractive solvents for value-added compounds from biomass
Passos, Helena; Freire, Mara G.; Coutinho, João A. P.
2014-01-01
In the past few years, the number of studies regarding the application of ionic liquids (ILs) as alternative solvents to extract value-added compounds from biomass has been growing. Based on an extended compilation and analysis of the data hitherto reported, the main objective of this review is to provide an overview on the use of ILs and their mixtures with molecular solvents for the extraction of value-added compounds present in natural sources. The ILs (or IL solutions) investigated as solvents for the extraction of natural compounds, such as alkaloids, flavonoids, terpenoids, lipids, among others, are outlined. The extraction techniques employed, namely solid–liquid extraction, and microwave-assisted and ultrasound-assisted extractions, are emphasized and discussed in terms of extraction yields and purification factors. Furthermore, the evaluation of the IL chemical structure and the optimization of the process conditions (IL concentration, temperature, biomass–solvent ratio, etc.) are critically addressed. Major conclusions on the role of the ILs towards the extraction mechanisms and improved extraction yields are additionally provided. The isolation and recovery procedures of the value-added compounds are ascertained as well as some scattered strategies already reported for the IL solvent recovery and reusability. Finally, a critical analysis on the economic impact versus the extraction performance of IL-based methodologies was also carried out and is here presented and discussed. PMID:25516718
Liang, Xueying; Schnetz-Boutaud, Nathalie; Bartlett, Jackie; Allen, Melissa J; Gwirtsman, Harry; Schmechel, Don E; Carney, Regina M; Gilbert, John R; Pericak-Vance, Margaret A; Haines, Jonathan L
2008-01-01
SNP rs498055 in the predicted gene LOC439999 on chromosome 10 was recently identified as being strongly associated with late-onset Alzheimer disease (LOAD). This SNP falls within a chromosomal region of continued interest, generated by both preliminary genetic linkage and candidate gene studies. To independently evaluate this candidate SNP, we examined four independent datasets, three family-based and one case-control. All cases were late-onset AD Caucasian patients with a minimum age at onset ≥ 60 years. None of the three family samples, nor the combined family-based dataset, showed association in either allelic or genotypic family-based association tests at p < 0.05. Both original and OSA two-point LOD scores were calculated; there was no evidence of linkage regardless of the covariates applied (the highest LOD score was 0.82). The case-control dataset did not demonstrate any association between this SNP and AD (all p-values > 0.52). Our results do not confirm the previous association, but are consistent with a more recent negative result that used family-based association tests to examine the effect of this SNP in two family datasets. We therefore conclude that rs498055 is not associated with an increased risk of LOAD.
Reduced prefrontal and temporal processing and recall of high "sensation value" ads.
Langleben, Daniel D; Loughead, James W; Ruparel, Kosha; Hakun, Jonathan G; Busch-Winokur, Samantha; Holloway, Matthew B; Strasser, Andrew A; Cappella, Joseph N; Lerman, Caryn
2009-05-15
Public service announcements (PSAs) are non-commercial broadcast ads that are an important part of televised public health campaigns. "Message sensation value" (MSV), a measure of the sensory intensity of the audio, visual, and content features of an ad, is an important factor in PSA impact. Some communication theories propose that a higher message sensation value brings increased attention and cognitive processing, leading to higher ad impact. Others argue that the attention-intensive format could compete with the ad's message for cognitive resources, resulting in reduced processing of PSA content and reduced overall effectiveness. Brain imaging during PSA viewing provides a quantitative surrogate measure of PSA impact and addresses questions of PSA evaluation and design not accessible with traditional subjective and epidemiological methods. We used Blood Oxygenation Level Dependent (BOLD) functional Magnetic Resonance Imaging (fMRI) and recognition memory measures to compare high and low MSV anti-tobacco PSAs and neutral videos (NV). In a short-delay, forced-choice memory test, frames extracted from PSAs were recognized more accurately than frames extracted from the NV. Frames from the low MSV PSAs were better recognized than frames from the high MSV PSAs. The accuracy of recognition of PSA frames was positively correlated with prefrontal and temporal activation, and negatively correlated with occipital cortex activation. The low MSV PSAs were associated with greater prefrontal and temporal activation than the high MSV PSAs. The high MSV PSAs produced greater activation primarily in the occipital cortex. These findings support the "dual processing" and "limited capacity" theories of communication, which postulate a competition between an ad's content and format for the viewers' cognitive resources, and suggest that the "attention-grabbing" high MSV format could impede the learning and retention of an ad. These findings demonstrate the potential of using neuroimaging in the design and…
Dataset on predictive compressive strength model for self-compacting concrete.
Ofuyatan, O M; Edeki, S O
2018-04-01
The determination of compressive strength is affected by many variables, such as the water-cement (WC) ratio, the superplasticizer (SP), the aggregate combination, and the binder combination. In this dataset article, 7-, 28-, and 90-day compressive strength models are derived using statistical analysis. The response surface methodology is used to investigate the effect of the parameters (varying percentages of ash, cement, WC, and SP) on the hardened property of compressive strength at 7, 28, and 90 days. The levels of the independent parameters are determined based on preliminary experiments. The experimental values for compressive strength at 7, 28, and 90 days and for modulus of elasticity under different treatment conditions are also discussed and presented. This dataset can effectively be used for modelling and prediction in concrete production settings.
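The kind of strength model this record describes can be sketched as an ordinary least-squares fit of a minimal response-surface form. All mix-design numbers, the quadratic-in-WC model form, and the `predict` helper below are illustrative assumptions, not the published models.

```python
import numpy as np

# Hypothetical mix-design data: [water-cement ratio, superplasticizer %],
# and illustrative 28-day strengths in MPa (not the published dataset).
X_raw = np.array([[0.35, 1.0], [0.40, 1.2], [0.45, 0.8], [0.50, 1.5],
                  [0.38, 0.9], [0.42, 1.1], [0.48, 1.3], [0.36, 1.4]])
y = np.array([62.0, 55.1, 48.3, 41.7, 58.9, 52.4, 44.8, 60.5])

wc, sp = X_raw[:, 0], X_raw[:, 1]
# Design matrix: intercept, WC, SP, WC^2 — a minimal response-surface form.
A = np.column_stack([np.ones_like(wc), wc, sp, wc**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(wc, sp):
    """Predicted 28-day compressive strength (MPa) from the fitted surface."""
    return coef[0] + coef[1] * wc + coef[2] * sp + coef[3] * wc**2

print(round(predict(0.40, 1.0), 1))  # predicted 28-day strength, MPa
```

The published work fits separate 7-, 28-, and 90-day models; the same design-matrix pattern extends to them by swapping the response vector.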
A rapid approach for automated comparison of independently derived stream networks
Stanislawski, Larry V.; Buttenfield, Barbara P.; Doumbouya, Ariel T.
2015-01-01
This paper presents an improved coefficient of line correspondence (CLC) metric for automatically assessing the similarity of two different sets of linear features. Elevation-derived channels at 1:24,000 scale (24K) are generated from a weighted flow-accumulation model and compared to 24K National Hydrography Dataset (NHD) flowlines. The CLC process conflates two vector datasets through a raster line-density differencing approach that is faster and more reliable than earlier methods. Methods are tested on 30 subbasins distributed across different terrain and climate conditions of the conterminous United States. CLC values for the 30 subbasins indicate 44–83% of the features match between the two datasets, with the majority of the mismatching features comprised of first-order features. Relatively lower CLC values result from subbasins with less than about 1.5 degrees of slope. The primary difference between the two datasets may be explained by different data capture criteria. First-order, headwater tributaries derived from the flow-accumulation model are captured more comprehensively through drainage area and terrain conditions, whereas capture of headwater features in the NHD is cartographically constrained by tributary length. The addition of missing headwaters to the NHD, as guided by the elevation-derived channels, can substantially improve the scientific value of the NHD.
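The headline statistic of this record, the share of feature length that matches between two datasets, can be sketched in a few lines. The function name and the subbasin lengths are illustrative assumptions; the hard part of the published method (deriving the matched lengths via raster line-density differencing) is taken as given here.

```python
def coefficient_of_line_correspondence(len_match_a, len_match_b,
                                       len_total_a, len_total_b):
    """Simplified CLC: the fraction of total feature length, pooled over
    both datasets, that finds a match in the other dataset. The matched
    lengths would normally come from raster line-density differencing."""
    return (len_match_a + len_match_b) / (len_total_a + len_total_b)

# Hypothetical subbasin: 830 km of elevation-derived channels and 790 km of
# NHD flowlines, of which 610 km and 595 km respectively match.
clc = coefficient_of_line_correspondence(610.0, 595.0, 830.0, 790.0)
print(round(clc, 2))  # → 0.74
```

A value near the top of the 44-83% range reported for the 30 subbasins.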
Value-added processing of crude glycerol into chemicals and polymers.
Luo, Xiaolan; Ge, Xumeng; Cui, Shaoqing; Li, Yebo
2016-09-01
Crude glycerol is a low-value byproduct which is primarily obtained from the biodiesel production process. Its composition is significantly different from that of pure glycerol. Crude glycerol usually contains various impurities, such as water, methanol, soap, fatty acids, and fatty acid methyl esters. Considerable efforts have been devoted to finding applications for converting crude glycerol into high-value products, such as biofuels, chemicals, polymers, and animal feed, to improve the economic viability of the biodiesel industry and overcome environmental challenges associated with crude glycerol disposal. This article reviews recent advances of biological and chemical technologies for value-added processing of crude glycerol into chemicals and polymers, and provides strategies for addressing production challenges. Copyright © 2016 Elsevier Ltd. All rights reserved.
Chen, Min; Shen, Nan-Xing; Chen, Zhi-Qi; Zhang, Feng-Min; Chen, Yang
2017-04-28
Four new azaphilones, penicilones A-D (1-4), were isolated from the mangrove rhizosphere soil-derived fungus Penicillium janthinellum HK1-6. Their planar structures and absolute configurations were determined by extensive analysis of NMR spectroscopic data, ECD spectra, the modified Mosher's method, and chemical conversions. Interestingly, 1 and 2 had the opposite configuration at C-7 compared to the closely related chloro analogues 3 and 4. Ester hydrolysis of 2 and 4 afforded their parental azaphilones, named penicilones E (5) and F (6). Compounds 1-6 were evaluated for their antibacterial activities in vitro. Penicilones B-D (2-4) showed potent anti-MRSA (Staphylococcus aureus ATCC 43300, ATCC 33591) activities with MIC values ranging from 3.13 to 6.25 μg/mL.
Code of Federal Regulations, 2014 CFR
2014-10-01
... and value added tax on fuel (passenger vehicles) (United Kingdom). 252.229-7009 Section 252.229-7009... Relief from customs duty and value added tax on fuel (passenger vehicles) (United Kingdom). As prescribed in 229.402-70(i), use the following clause: Relief from Customs Duty and Value Added Tax on Fuel...
Code of Federal Regulations, 2010 CFR
2010-10-01
... and value added tax on fuel (passenger vehicles) (United Kingdom). 252.229-7009 Section 252.229-7009... Relief from customs duty and value added tax on fuel (passenger vehicles) (United Kingdom). As prescribed in 229.402-70(i), use the following clause: Relief from Customs Duty and Value Added Tax on Fuel...
Code of Federal Regulations, 2013 CFR
2013-10-01
... and value added tax on fuel (passenger vehicles) (United Kingdom). 252.229-7009 Section 252.229-7009... Relief from customs duty and value added tax on fuel (passenger vehicles) (United Kingdom). As prescribed in 229.402-70(i), use the following clause: Relief from Customs Duty and Value Added Tax on Fuel...
Code of Federal Regulations, 2012 CFR
2012-10-01
... and value added tax on fuel (passenger vehicles) (United Kingdom). 252.229-7009 Section 252.229-7009... Relief from customs duty and value added tax on fuel (passenger vehicles) (United Kingdom). As prescribed in 229.402-70(i), use the following clause: Relief from Customs Duty and Value Added Tax on Fuel...
Code of Federal Regulations, 2011 CFR
2011-10-01
... and value added tax on fuel (passenger vehicles) (United Kingdom). 252.229-7009 Section 252.229-7009... Relief from customs duty and value added tax on fuel (passenger vehicles) (United Kingdom). As prescribed in 229.402-70(i), use the following clause: Relief from Customs Duty and Value Added Tax on Fuel...
Downscaled and debiased climate simulations for North America from 21,000 years ago to 2100AD
Lorenz, David J.; Nieto-Lugilde, Diego; Blois, Jessica L.; Fitzpatrick, Matthew C.; Williams, John W.
2016-01-01
Increasingly, ecological modellers are integrating paleodata with future projections to understand climate-driven biodiversity dynamics from the past through the current century. Climate simulations from earth system models are necessary to this effort, but must be debiased and downscaled before they can be used by ecological models. Downscaling methods and observational baselines vary among researchers, which produces confounding biases among downscaled climate simulations. We present unified datasets of debiased and downscaled climate simulations for North America from 21 ka BP to 2100AD, at 0.5° spatial resolution. Temporal resolution is decadal averages of monthly data until 1950AD, average climates for 1950–2005 AD, and monthly data from 2010 to 2100AD, with decadal averages also provided. This downscaling includes two transient paleoclimatic simulations and 12 climate models for the IPCC AR5 (CMIP5) historical (1850–2005), RCP4.5, and RCP8.5 21st-century scenarios. Climate variables include primary variables and derived bioclimatic variables. These datasets provide a common set of climate simulations suitable for seamlessly modelling the effects of past and future climate change on species distributions and diversity. PMID:27377537
Autonomously Propelled Motors for Value-Added Product Synthesis and Purification.
Srivastava, Sarvesh K; Schmidt, Oliver G
2016-06-27
A proof-of-concept design for autonomous, self-propelling motors for value-added product synthesis and separation is presented. The hybrid motor design consists of two distinct functional blocks. The first, a sodium borohydride (NaBH4) granule, serves both as a reaction prerequisite for the reduction of vanillin and as a localized solid-state fuel in the reaction mixture. The second, capping functional block, consisting of a graphene-polymer composite, serves as a hydrophobic matrix to attract the reaction product vanillyl alcohol (VA), resulting in facile separation of this edible value-added product. These autonomously propelled motors were fabricated at length scales down to 400 μm and, once introduced into the reaction environment, showed rapid bubble propulsion followed by high-purity separation of the reaction product (VA), by virtue of the graphene-polymer cap acting as a mesoporous sponge. The concept has excellent potential for the synthesis/isolation of industrially important compounds, affinity-based product separation, pollutant remediation (such as heavy metal chelation/adsorption), as well as localized fuel gradients as an alternative to external fuel dependency. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Value-Added Electricity Services: New Roles for Utilities and Third-Party Providers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blansfield, J.; Wood, L.; Katofsky, R.
New energy generation, storage, delivery, and end-use technologies support a broad range of value-added electricity services for retail electricity customers. Sophisticated energy management services, distributed generation coupled with storage, and electric vehicle charging are just a few examples of emerging offerings. Who should provide value-added services: utilities, third parties, or both, and under what conditions? What policy and regulatory changes may be needed to promote competition and innovation, to account for utility costs to enable these services, and to protect consumers? The report approaches the issues from three perspectives (utilities, third-party service providers, and consumers), with contributions from Jonathan Blansfield and Lisa Wood, Institute for Electric Innovation; Ryan Katofsky, Benjamin Stafford, and Danny Waggoner, Advanced Energy Economy; and the National Association of State Utility Consumer Advocates.
Liberating the potential: the role of non-nurses in adding value to nurse education.
Dickinson, Julie
2006-01-01
In this paper, I explore the role of non-nurse lecturers in adding value to nurse education programmes. In measuring "added value" in higher education, I take a comprehensive approach that includes the views of experts (the nurse and non-nurse lecturers themselves) and of various United Kingdom stakeholders, such as the Government, the Nursing and Midwifery Council, and the Quality Assurance Agency. The students' views are also taken into account when considering both the content of the programmes and how they are delivered. The complexity of "objective measurement" is considered, along with the requirements of a "good" teaching experience. The potential areas for adding value include: health and social care policy priorities that encourage partnership working, the blurring of professional boundaries, and inter-professional working; professional-specific changes embracing extended and enhanced roles and the concepts of specialist and assistant practitioners; and Higher Education agendas, including transferable skills and adult and student-centred learning. I conclude by discussing the latest policy changes and suggest that the role of the non-nurse lecturer needs more exploration to provide the best value for all.
Conceptual and Empirical Differences among Various Value-Added Models for Accountability
ERIC Educational Resources Information Center
Timmermans, Anneke C.; Doolaard, Simone; de Wolf, Inge
2011-01-01
Accountability systems in education generally include indicators of student performance. However, these indicators often differ considerably among the various systems. More and more countries try to include value-added measures, mainly because they do not want to hold schools accountable for differences in their initial intake of students. This…
ERIC Educational Resources Information Center
Harris, Douglas N.
2010-01-01
In this policy brief, the author explores the problems with attainment measures when it comes to evaluating performance at the school level, and explores the best uses of value-added measures. These value-added measures, the author writes, are useful for sorting out-of-school influences from school influences or from teacher performance, giving…
Intelligent Noninvasive Diagnosis of Aneuploidy: Raw Values and Highly Imbalanced Dataset.
Neocleous, Andreas C; Nicolaides, Kypros H; Schizas, Christos N
2017-09-01
The objective of this paper is to introduce a noninvasive diagnosis procedure for aneuploidy and to minimize the social and financial cost of the prenatal diagnosis tests that are performed for fetal aneuploidies in an early stage of pregnancy. We propose a method using artificial neural networks trained with data from singleton pregnancy cases undergoing first trimester screening. Three different datasets with a total of 122,362 euploid and 967 aneuploid cases were used in this study. The data for each case contained markers collected from the mother and the fetus. This study, unlike previous studies published by the authors for a similar problem, differs in three basic principles: 1) the artificial neural networks are trained using the markers' values in their raw (unprocessed) form; 2) a balanced training dataset is created and used by selecting only a representative number of euploids for the training phase; and 3) emphasis is given to the financial aspects, suggesting a hierarchy and necessity of the available tests. The proposed artificial neural network models were optimized in the sense of reaching a minimum false positive rate while securing a 100% detection rate for Trisomy 21. These systems correctly identify other aneuploidies (Trisomies 13 and 18, Turner, and Triploid syndromes) at a detection rate greater than 80%. In conclusion, we demonstrate that artificial neural network systems can contribute to providing noninvasive, effective early screening for fetal aneuploidies, with results that compare favorably to other existing methods.
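The balanced-training-set idea in principle 2 above amounts to undersampling the majority (euploid) class before training. A minimal sketch follows; the function name, the toy marker vectors, and the 1:1 sampling ratio are illustrative assumptions, not the paper's exact protocol.

```python
import random

def balanced_training_set(euploid, aneuploid, ratio=1, seed=0):
    """Undersample the majority (euploid) class so the training set holds
    `ratio` euploids per aneuploid case. Marker values are kept raw,
    mirroring the abstract's 'unprocessed markers' principle."""
    rng = random.Random(seed)
    k = min(len(euploid), ratio * len(aneuploid))
    sampled = rng.sample(euploid, k)              # representative euploid subset
    data = [(x, 0) for x in sampled] + [(x, 1) for x in aneuploid]
    rng.shuffle(data)                             # mix classes for training
    return data

# Toy marker vectors: the real data are ~126:1 imbalanced; here 20:2.
euploid = [[0.9 + 0.01 * i] for i in range(20)]
aneuploid = [[2.5], [2.7]]
train = balanced_training_set(euploid, aneuploid)
print(len(train), sum(label for _, label in train))  # → 4 2
```

Evaluation would still use the full, imbalanced test set, since the false positive rate is the quantity being minimized.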
ERIC Educational Resources Information Center
Goldhaber, Dan
2015-01-01
The past decade has seen a tremendous amount of research on the use of value-added modeling to assess individual teachers, and a significant number of states and districts are now using, or plan to use, value added as a component of a teacher's summative performance evaluation. In this article, I explore the various mechanisms through which the…
Miller, Robert; Stalder, Tobias; Jarczok, Marc; Almeida, David M; Badrick, Ellena; Bartels, Meike; Boomsma, Dorret I; Coe, Christopher L; Dekker, Marieke C J; Donzella, Bonny; Fischer, Joachim E; Gunnar, Megan R; Kumari, Meena; Lederbogen, Florian; Power, Christine; Ryff, Carol D; Subramanian, S V; Tiemeier, Henning; Watamura, Sarah E; Kirschbaum, Clemens
2016-11-01
Diurnal salivary cortisol profiles are valuable indicators of adrenocortical functioning in epidemiological research and clinical practice. However, normative reference values derived from a large number of participants and across a wide age range are still missing. To fill this gap, data were compiled from 15 independently conducted field studies with a total of 104,623 salivary cortisol samples obtained from 18,698 unselected individuals (mean age: 48.3 years, age range: 0.5-98.5 years, 39% females). Besides providing a descriptive analysis of the complete dataset, we also performed mixed-effects growth curve modeling of diurnal salivary cortisol (i.e., 1-16 h after awakening). Cortisol decreased significantly across the day and was influenced by both age and sex. Intriguingly, we also found a pronounced impact of sampling season, with elevated diurnal cortisol in spring and decreased levels in autumn. However, the majority of variance was accounted for by between-participant and between-study variance components. Based on these analyses, reference ranges (LC/MS-MS calibrated) for cortisol concentrations in saliva were derived for different times across the day, with more specific reference ranges generated for males and females in different age categories. This integrative summary provides important reference values on salivary cortisol to aid basic scientists and clinicians in interpreting deviations from the normal diurnal cycle. Copyright © 2016 Elsevier Ltd. All rights reserved.
Miller, Robert; Stalder, Tobias; Jarczok, Marc; Almeida, David M.; Badrick, Ellena; Bartels, Meike; Boomsma, Dorret I.; Coe, Christopher L.; Dekker, Marieke C. J.; Donzella, Bonny; Fischer, Joachim E.; Gunnar, Megan R.; Kumari, Meena; Lederbogen, Florian; Oldehinkel, Albertine J.; Power, Christine; Rosmalen, Judith G.; Ryff, Carol D.; Subramanian, S V; Tiemeier, Henning; Watamura, Sarah E.; Kirschbaum, Clemens
2016-01-01
Diurnal salivary cortisol profiles are valuable indicators of adrenocortical functioning in epidemiological research and clinical practice. However, normative reference values derived from a large number of participants and across a wide age range are still missing. To fill this gap, data were compiled from 15 independently conducted field studies with a total of 104,623 salivary cortisol samples obtained from 18,698 unselected individuals (mean age: 48.3 years, age range: 0.5 to 98.5 years, 39% females). Besides providing a descriptive analysis of the complete dataset, we also performed mixed-effects growth curve modeling of diurnal salivary cortisol (i.e., 1 to 16 hours after awakening). Cortisol decreased significantly across the day and was influenced by both age and sex. Intriguingly, we also found a pronounced impact of sampling season, with elevated diurnal cortisol in spring and decreased levels in autumn. However, the majority of variance was accounted for by between-participant and between-study variance components. Based on these analyses, reference ranges (LC/MS-MS calibrated) for cortisol concentrations in saliva were derived for different times across the day, with more specific reference ranges generated for males and females in different age categories. This integrative summary provides important reference values on salivary cortisol to aid basic scientists and clinicians in interpreting deviations from the normal diurnal cycle. PMID:27448524
Value-Added Model (VAM) Research for Educational Policy: Framing the Issue
ERIC Educational Resources Information Center
Amrein-Beardsley, Audrey; Collins, Clarin; Polasky, Sarah A.; Sloat, Edward F.
2013-01-01
In this manuscript, the guest editors of the EPAA Special Issue on "Value-Added Model (VAM) Research for Educational Policy" (1) introduce the background and policy context surrounding the increased use of VAMs for teacher evaluation and accountability purposes across the United States; (2) summarize the five research papers and one…
NASA Technical Reports Server (NTRS)
Stefanov, William L.
2017-01-01
The NASA Earth observations dataset obtained by humans in orbit using handheld film and digital cameras is freely accessible to the global community through the online searchable database at https://eol.jsc.nasa.gov, and offers a useful complement to traditional ground-commanded sensor data. The dataset includes imagery from the NASA Mercury program (1961) through the present-day International Space Station (ISS) program, and currently totals over 2.6 million individual frames. Geographic coverage of the dataset includes land and ocean areas between approximately 52 degrees North and South latitude, but is spatially and temporally discontinuous. The photographic dataset includes some significant impediments to immediate research, applied, and educational use: commercial RGB films and camera systems with overlapping bandpasses; use of different focal length lenses, unconstrained look angles, and variable spacecraft altitudes; and no native geolocation information. Such factors led to this dataset being underutilized by the community, but recent advances in automated and semi-automated image geolocation, image feature classification, and web-based services are adding new value to the astronaut-acquired imagery. A coupled ground software and on-orbit hardware system for the ISS is in development for planned deployment in mid-2017; this system will capture camera pose information for each astronaut photograph to allow automated, full georegistration of the data. The ground system component is currently in use to fully georeference imagery collected in response to International Disaster Charter activations, and the auto-registration procedures are being applied to the extensive historical database of imagery to add value for research and educational purposes. In parallel, machine learning techniques are being applied to automate feature identification and classification throughout the dataset, in order to build descriptive metadata that will improve search…
NASA Astrophysics Data System (ADS)
Prat, O. P.; Nelson, B. R.
2014-10-01
We use a suite of quantitative precipitation estimates (QPEs) derived from satellite, radar, and surface observations to derive precipitation characteristics over CONUS for the period 2002-2012. This comparison effort includes satellite multi-sensor datasets (bias-adjusted TMPA 3B42, near-real-time 3B42RT), radar estimates (NCEP Stage IV), and rain gauge observations. Remotely sensed precipitation datasets are compared with surface observations from the Global Historical Climatology Network (GHCN-Daily) and from PRISM (Parameter-elevation Regressions on Independent Slopes Model). The comparisons are performed at the annual, seasonal, and daily scales over the River Forecast Centers (RFCs) for CONUS. Annual average rain rates show satisfying agreement with GHCN-D for all products over CONUS (±6%). However, differences at the RFC scale are larger, in particular for the near-real-time 3B42RT precipitation estimates (-33 to +49%). At annual and seasonal scales, the bias-adjusted 3B42 showed substantial improvement over its near-real-time counterpart 3B42RT. However, large biases remained for 3B42 over the western US at higher average accumulations (≥5 mm day⁻¹) with respect to GHCN-D surface observations. At the daily scale, 3B42RT performed poorly in capturing extreme daily precipitation (>4 in. day⁻¹) over the Northwest. Furthermore, the conditional and contingency analyses conducted illustrate the challenge of retrieving extreme precipitation from remote sensing estimates.
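The percent differences quoted in this record (±6% over CONUS, -33 to +49% at the RFC scale) are relative biases of a QPE against a gauge-based reference. A trivial sketch, with all rain-rate numbers illustrative rather than taken from the study:

```python
def relative_bias_percent(estimate, reference):
    """Percent difference of a QPE annual rain rate against a gauge-based
    reference (e.g., 3B42 or 3B42RT vs GHCN-Daily). Inputs in mm/day."""
    return 100.0 * (estimate - reference) / reference

# Hypothetical RFC-mean annual rain rates (mm/day).
print(round(relative_bias_percent(2.6, 2.5), 1))  # → 4.0   (within ±6%)
print(round(relative_bias_percent(1.7, 2.5), 1))  # → -32.0 (3B42RT-like underestimate)
```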
A robust post-processing workflow for datasets with motion artifacts in diffusion kurtosis imaging.
Li, Xianjun; Yang, Jian; Gao, Jie; Luo, Xue; Zhou, Zhenyu; Hu, Yajie; Wu, Ed X; Wan, Mingxi
2014-01-01
The aim of this study was to develop a robust post-processing workflow for motion-corrupted datasets in diffusion kurtosis imaging (DKI). The proposed workflow consisted of brain extraction, rigid registration, distortion correction, artifact rejection, spatial smoothing and tensor estimation. Rigid registration was utilized to correct misalignments. Motion artifacts were rejected by using the local Pearson correlation coefficient (LPCC). The performance of LPCC in characterizing relative differences between artifacts and artifact-free images was compared with that of the conventional correlation coefficient in 10 randomly selected DKI datasets. The influence of the rejected artifacts (including their gradient directions and b values) on parameter estimation was investigated using the mean square error (MSE), with the variance of the noise as the criterion for the MSEs. The clinical practicality of the proposed workflow was evaluated by the image quality and measurements in regions of interest on 36 DKI datasets, including 18 artifact-free datasets (18 pediatric subjects) and 18 motion-corrupted datasets (15 pediatric subjects and 3 essential tremor patients). The relative difference between artifacts and artifact-free images calculated by LPCC was larger than that of the conventional correlation coefficient (p<0.05), indicating that LPCC is more sensitive in detecting motion artifacts. The MSEs of all derived parameters from the data retained after artifact rejection were smaller than the variance of the noise, suggesting that the influence of the rejected artifacts on the precision of the derived parameters was less than that of the noise. The proposed workflow improved the image quality and reduced the measurement biases significantly on motion-corrupted datasets (p<0.05). The proposed post-processing workflow reliably improved the image quality and the measurement precision of the derived parameters on motion-corrupted DKI datasets. The workflow provided an effective post
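The artifact-rejection step can be illustrated with a simplified patchwise Pearson correlation. This is only a sketch under the assumption that each candidate image is compared patch-by-patch against an artifact-free reference; the patch size, threshold, and function names are hypothetical and this is not the published LPCC implementation:

```python
import numpy as np

def local_pearson(img_a, img_b, patch=4):
    """Mean Pearson correlation over non-overlapping patches, a simplified
    stand-in for a 'local' correlation statistic."""
    h, w = img_a.shape
    coeffs = []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            a = img_a[i:i + patch, j:j + patch].ravel()
            b = img_b[i:i + patch, j:j + patch].ravel()
            if a.std() == 0 or b.std() == 0:
                continue  # correlation undefined on flat patches
            coeffs.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(coeffs))

def reject_artifacts(volumes, reference, threshold=0.5):
    """Keep only volumes whose local correlation with the artifact-free
    reference exceeds a (hypothetical) cutoff."""
    return [v for v in volumes if local_pearson(v, reference) >= threshold]
```

Because the statistic is computed locally, a volume corrupted in only part of the field of view pulls the mean down more sharply than a single global correlation would.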
Integrative missing value estimation for microarray data.
Hu, Jianjun; Li, Haifeng; Waterman, Michael S; Zhou, Xianghong Jasmine
2006-10-12
Missing value estimation is an important preprocessing step in microarray analysis. Although several methods have been developed to solve this problem, their performance is unsatisfactory for datasets with high rates of missing data, high measurement noise, or limited numbers of samples. In fact, more than 80% of the time-series datasets in the Stanford Microarray Database contain fewer than eight samples. We present the integrative Missing Value Estimation method (iMISS), which incorporates information from multiple reference microarray datasets to improve missing value estimation. For each gene with missing data, we derive a consistent neighbor-gene list by taking the reference datasets into consideration. To determine whether the given reference datasets are sufficiently informative for integration, we use a submatrix imputation approach. Our experiments showed that iMISS can significantly and consistently improve the accuracy of the state-of-the-art Local Least Squares (LLS) imputation algorithm by up to 15% in our benchmark tests. We demonstrated that the order-statistics-based integrative imputation algorithms can achieve significant improvements over state-of-the-art missing value estimation approaches such as LLS, and are especially good for imputing microarray datasets with a limited number of samples, high rates of missing data, or very noisy measurements. With the rapid accumulation of microarray datasets, the performance of our approach can be further improved by incorporating larger and more appropriate reference datasets.
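A minimal neighbor-gene imputation sketch conveys the underlying idea: fill each gene's missing entries from its most-correlated complete rows. This is a plain k-nearest-rows scheme under stated assumptions, not the published iMISS algorithm (which additionally integrates multiple reference datasets):

```python
import numpy as np

def impute_knn_rows(data, k=2):
    """Fill missing entries (NaN) in each gene row with the mean of the
    k complete rows most correlated with it over the observed columns.
    A simplified neighbor-gene scheme for illustration only."""
    data = data.astype(float).copy()
    complete = [r for r in range(data.shape[0]) if not np.isnan(data[r]).any()]
    for r in range(data.shape[0]):
        miss = np.isnan(data[r])
        if not miss.any():
            continue
        obs = ~miss
        # rank complete rows by |correlation| with this row's observed values
        neighbors = sorted(
            complete,
            key=lambda c: -abs(np.corrcoef(data[r, obs], data[c, obs])[0, 1]),
        )
        data[r, miss] = data[neighbors[:k]][:, miss].mean(axis=0)
    return data
```

With few samples such a scheme is fragile, which is exactly the regime where pooling extra reference datasets, as the abstract describes, pays off.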
Improved Correction of IR Loss in Diffuse Shortwave Measurements: An ARM Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younkin, K; Long, CN
Simple single black detector pyranometers, such as the Eppley Precision Spectral Pyranometer (PSP) used by the Atmospheric Radiation Measurement (ARM) Program, are known to lose energy via infrared (IR) emission to the sky. This is especially a problem when making clear-sky diffuse shortwave (SW) measurements, which are inherently of low magnitude and suffer the greatest IR loss. Dutton et al. (2001) proposed a technique using information from collocated pyrgeometers to help compensate for this IR loss. The technique uses an empirically derived relationship between the pyrgeometer detector data (and alternatively the detector data plus the difference between the pyrgeometer case and dome temperatures) and the nighttime pyranometer IR loss data. This relationship is then used to apply a correction to the diffuse SW data during daylight hours. We developed an ARM value-added product (VAP) called the SW DIFF CORR 1DUTT VAP to apply the Dutton et al. correction technique to ARM PSP diffuse SW measurements.
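The correction idea (fit the nighttime relationship, then apply it in daytime) can be sketched as a simple linear fit. The variable names and synthetic coefficients below are illustrative assumptions, not the operational VAP regression:

```python
import numpy as np

def fit_ir_loss(night_pir_net, night_psp):
    """At night the true diffuse SW is zero, so any nonzero pyranometer
    (PSP) reading is IR loss; fit it linearly against the collocated
    pyrgeometer net-IR signal."""
    slope, intercept = np.polyfit(night_pir_net, night_psp, 1)
    return slope, intercept

def correct_diffuse_sw(day_psp, day_pir_net, slope, intercept):
    """Remove the modeled IR loss from daytime diffuse SW readings."""
    day_psp = np.asarray(day_psp, dtype=float)
    day_pir_net = np.asarray(day_pir_net, dtype=float)
    return day_psp - (slope * day_pir_net + intercept)
```

Since IR loss makes the pyranometer read low, the modeled (negative) loss is subtracted, raising the corrected daytime diffuse SW values.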
Sustainable multipurpose biorefineries for third-generation biofuels and value-added co-products
USDA-ARS?s Scientific Manuscript database
Modern biorefinery facilities conduct many types of processes, including those producing advanced biofuels, commodity chemicals, biodiesel, and value-added co-products such as sweeteners and bioinsecticides, with many more co-products, chemicals and biofuels on the horizon. Most of these processes ...
How Often Do Subscores Have Added Value? Results from Operational and Simulated Data
ERIC Educational Resources Information Center
Sinharay, Sandip
2010-01-01
Recently, there has been an increasing level of interest in subscores for their potential diagnostic value. Haberman suggested a method based on classical test theory to determine whether subscores have added value over total scores. In this article I first provide a rich collection of results regarding when subscores were found to have added…
ERIC Educational Resources Information Center
Welch, Matt
2004-01-01
This article profiles retiring values teacher Gene Doxey and describes his foundational contributions to the students of California's Ramona Unified School District. Every one of the Ramona Unified School District's 7,200 students is eventually funneled through Doxey's Contemporary Issues class, a required rite of passage between elementary school…
A Value-Added Estimate of Higher Education Quality of US States
ERIC Educational Resources Information Center
Zhang, Lei
2009-01-01
States differ substantially in higher education policies. Little is known about the effects of state policies on the performance of public colleges and universities, largely because no clear measures of college quality exist. In this paper, I estimate the average quality of public colleges of US states based on the value-added to individuals'…
Production facility site selection factors for Texas value-added wood producers
Judd H. Michael; Joanna Teitel; James E. Granskog
1998-01-01
Value-added wood products manufacturers serve an important role in the economies of many U.S. regions and are therefore sought after by entities such as economic development agencies. The reasons why certain locations for a prospective production facility would be more attractive to secondary wood industry producers are not clearly understood. Therefore, this research...
Value-Added Measures in Education: What Every Educator Needs to Know
ERIC Educational Resources Information Center
Harris, Douglas N.
2011-01-01
In "Value-Added Measures in Education", Douglas N. Harris takes on one of the most hotly debated topics in education. Drawing on his extensive work with schools and districts, he sets out to help educators and policymakers understand this innovative approach to assessment and the issues associated with its use. Written in straightforward language…
Rea, Alan; Skinner, Kenneth D.
2012-01-01
The U.S. Geological Survey Hawaii StreamStats application uses an integrated suite of raster and vector geospatial datasets to delineate and characterize watersheds. The geospatial datasets used to delineate and characterize watersheds on the StreamStats website, and the methods used to develop the datasets are described in this report. The datasets for Hawaii were derived primarily from 10 meter resolution National Elevation Dataset (NED) elevation models, and the National Hydrography Dataset (NHD), using a set of procedures designed to enforce the drainage pattern from the NHD into the NED, resulting in an integrated suite of elevation-derived datasets. Additional sources of data used for computing basin characteristics include precipitation, land cover, soil permeability, and elevation-derivative datasets. The report also includes links for metadata and downloads of the geospatial datasets.
Photocatalytic conversion of CO2 into value-added and renewable fuels
NASA Astrophysics Data System (ADS)
Yuan, Lan; Xu, Yi-Jun
2015-07-01
The increasing energy crisis and the worsening global climate caused by the excessive utilization of fossil fuels have boosted tremendous research activity in CO2 capture, storage and utilization. Artificial photosynthesis, which uses solar energy to convert CO2 into value-added and renewable fuels such as methane or methanol, has been drawing consistently increasing attention. This approach can not only reduce the greenhouse effect caused by CO2 emissions but also produce value-added chemicals as alternative energy supplies. This review provides a brief introduction to the basic principles of artificial photosynthesis of CO2 and the progress made in exploring more efficient photocatalysts, from the viewpoints of light harvesting and boosting photogenerated charge carriers. Moreover, the underlying mechanisms of CO2 photoreduction are discussed with selected examples, in terms of adsorption of reactants, CO2 activation and the possible reaction pathways. Finally, perspectives on future research directions and open issues in CO2 photoreduction are outlined.
ERIC Educational Resources Information Center
Ferrão, Maria Eugénia; Couto, Alcino Pinto
2014-01-01
This article focuses on the use of a value-added approach for promoting school improvement. It presents yearly value-added estimates, analyses their stability over time, and discusses the contribution of this methodological approach for promoting school improvement programmes in the Portuguese system of evaluation. The value-added model is applied…
Li, Xiaohu; Angelidaki, Irini; Zhang, Yifeng
2018-06-14
Biological conversion of CO2 to value-added chemicals and biofuels has emerged as an attractive strategy to address the energy and environmental concerns caused by over-reliance on fossil fuels. In this study, an innovative microbial reverse-electrodialysis electrolysis cell (MREC), which combines the strengths of the reverse electrodialysis (RED) and microbial electrosynthesis technology platforms, was developed to achieve efficient CO2-to-value-added-chemical bioconversion using salinity gradient energy as the driving energy source. In the MREC, maximum acetate and ethanol concentrations of 477.5 ± 33.2 and 46.2 ± 8.2 mg L-1 were obtained at the cathode, catalyzed by Sporomusa ovata, with production rates of 165.79 ± 11.52 and 25.11 ± 4.46 mmol m-2 d-1, respectively. Electron balance analysis indicates that 94.4 ± 3.9% of the electrons derived from wastewater and the salinity gradient were recovered in acetate and ethanol. This work demonstrates for the first time the potential of the MREC configuration as an efficient technology platform for simultaneous CO2 capture and electrosynthesis of valuable chemicals. Copyright © 2018 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kojo, Matti; Richardson, Phil
In some countries nuclear waste facility siting programs include social and economic benefits, compensation, local empowerment and motivation measures and other incentives for the potential host community. This can generally be referred to as an 'added value approach'. Demonstration of the safety of a repository is seen as a precondition of an added value approach. Recently much focus has been placed on studying and developing public participation approaches but less on the use of such incentive and community benefit packages, although they are becoming a more common element in many site selection strategies for nuclear waste management facilities. The primary objective of this paper is to report on an ongoing study of stakeholders' opinions of the use of an added value approach in siting a radioactive waste facility in the Czech Republic, Poland and Slovenia. The paper argues that an added value approach should adapt to the interests and needs of stakeholders during different stages of a siting process. The main question posed in the study is as follows: What are the measures which should be included in 'added value approach' according to the stakeholders? The research data consists of stakeholders' responses to a survey focusing on the use of added value (community benefits) and incentives in siting nuclear waste management facilities. The survey involved use of a questionnaire developed as part of the EU-funded IPPA* project in three countries: the Czech Republic, Poland and Slovenia. (* Implementing Public Participation Approaches in Radioactive Waste Disposal, FP7 Contract Number: 269849). The target audiences for the questionnaires were the stakeholders represented in the national stakeholder groups established to discuss site selection for a nuclear waste repository in their country. A total of 105 questionnaires were sent to the stakeholders between November 2011 and January 2012. 44 questionnaires were returned, resulting in a total response rate
The Value Simulation-Based Learning Added to Machining Technology in Singapore
ERIC Educational Resources Information Center
Fang, Linda; Tan, Hock Soon; Thwin, Mya Mya; Tan, Kim Cheng; Koh, Caroline
2011-01-01
This study seeks to understand the value simulation-based learning (SBL) added to the learning of Machining Technology in a 15-week core subject course offered to university students. The research questions were: (1) How did SBL enhance classroom learning? (2) How did SBL help participants in their test? (3) How did SBL prepare participants for…
Garraín, Daniel; Fazio, Simone; de la Rúa, Cristina; Recchioni, Marco; Lechón, Yolanda; Mathieux, Fabrice
2015-01-01
The aim of this study is to identify areas of potential improvement of the European Reference Life Cycle Database (ELCD) fuel datasets. The revision is based on the data quality indicators described by the ILCD Handbook, applied on a sectorial basis. These indicators evaluate the technological, geographical and time-related representativeness of the dataset and its appropriateness in terms of completeness, precision and methodology. Results show that the ELCD fuel datasets have very good quality in general terms; nevertheless, some findings and recommendations for improving the quality of the Life Cycle Inventories have been derived. Moreover, these results assure any LCA practitioner of the quality of the fuel-related datasets, and provide insights into the limitations and assumptions underlying the dataset modelling. Given this information, the LCA practitioner will be able to decide whether the use of the ELCD fuel datasets is appropriate to the goal and scope of the analysis to be conducted. The methodological approach would also be useful for dataset developers and reviewers seeking to improve the overall DQR of databases.
Derivation of guideline values for gold (III) ion toxicity limits to protect aquatic ecosystems.
Nam, Sun-Hwa; Lee, Woo-Mi; Shin, Yu-Jin; Yoon, Sung-Ji; Kim, Shin Woong; Kwak, Jin Il; An, Youn-Joo
2014-01-01
This study focused on estimating the toxicity values of various aquatic organisms exposed to the gold (III) ion (Au(3+)), and proposes maximum guideline values for Au(3+) toxicity that protect the aquatic ecosystem. A comparative assessment of the methods developed in Australia and New Zealand versus the European Community (EC) was conducted. The test species used in this study included two bacteria (Escherichia coli and Bacillus subtilis), one alga (Pseudokirchneriella subcapitata), one euglena (Euglena gracilis), three cladocerans (Daphnia magna, Moina macrocopa, and Simocephalus mixtus), and two fish (Danio rerio and Oryzias latipes). Au(3+) induced growth inhibition, mortality, immobilization, and/or developmental malformations in all test species, with responses being concentration-dependent. According to the moderate-reliability method of Australia and New Zealand, guideline values for Au(3+) of 0.006 and 0.075 mg/L were obtained by dividing the HC5 and HC50 values of 0.33 and 4.46 mg/L from the species sensitivity distribution (SSD) by a Final Acute to Chronic Ratio (FACR) of 59.09. In contrast, the EC method uses an assessment factor (AF): the 0.0006 mg/L guideline value for Au(3+) was obtained by dividing the 48-h EC50 value of 0.60 mg/L (the lowest toxicity value obtained from the short-term results) by an AF of 1000. The Au(3+) guideline value derived using an AF was more stringent than that from the SSD. We recommend that more toxicity data using various bioassays are required to develop more accurate ecological risk assessments. More chronic/long-term exposure studies on sensitive endpoints using additional fish species and invertebrates not included in the current dataset will be needed to use other derivation methods (e.g., US EPA and Canadian Type A) or the "High Reliability Method" from Australia/New Zealand. Such research would facilitate the establishment of guideline values for various pollutants that reflect their universal effects in aquatic ecosystems. To
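The two derivation routes reported above reduce to simple arithmetic, reproduced here as a sketch with the values taken directly from the abstract:

```python
# SSD route (Australia/New Zealand moderate-reliability method)
hc5, hc50 = 0.33, 4.46         # mg/L, from the species sensitivity distribution
facr = 59.09                   # Final Acute to Chronic Ratio
guideline_low = hc5 / facr     # ~0.006 mg/L
guideline_high = hc50 / facr   # ~0.075 mg/L

# Assessment-factor route (EC method)
ec50 = 0.60                    # mg/L, lowest short-term value (48-h EC50)
af = 1000                      # assessment factor
guideline_af = ec50 / af       # 0.0006 mg/L, an order of magnitude stricter
```

The comparison makes the abstract's conclusion concrete: dividing the lowest acute value by a fixed factor of 1000 yields a guideline ten times more stringent than the SSD-derived HC5/FACR value.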
ERIC Educational Resources Information Center
Goldhaber, Dan; Quince, Vanessa; Theobald, Roddy
2016-01-01
This policy brief reviews evidence about the extent to which disadvantaged students are taught by teachers with lower value-added estimates of performance, and seeks to reconcile differences in findings from different studies. We demonstrate that much of the inequity in teacher value added in Washington state is due to differences across different…
Vivek, Narisetty; Sindhu, Raveendran; Madhavan, Aravind; Anju, Alphonsa Jose; Castro, Eulogio; Faraco, Vincenza; Pandey, Ashok; Binod, Parameswaran
2017-09-01
One of the major ecological concerns associated with biodiesel production is the generation of waste/crude glycerol during the trans-esterification process. Purification of this crude glycerol is not economically viable. In this context, an efficient and economically viable strategy would be the development of biotransformation reactions converting the biodiesel-derived crude glycerol into value-added chemicals. Such a process ensures sustainability and waste management in the biodiesel industry, paving a path to integrated biorefineries. This review addresses a waste-to-wealth approach for the utilization of crude glycerol in the production of value-added chemicals, describing current trends, challenges, future perspectives, metabolic approaches and the genetic tools developed for improved synthesis over wild-type microorganisms. Copyright © 2017 Elsevier Ltd. All rights reserved.
Added value of high-resolution regional climate model over the Bohai Sea and Yellow Sea areas
NASA Astrophysics Data System (ADS)
Li, Delei; von Storch, Hans; Geyer, Beate
2016-04-01
Added value from dynamical downscaling has long been a crucial and debated issue in regional climate studies. A 34-year (1979-2012) high-resolution (7 km grid) atmospheric hindcast over the Bohai Sea and the Yellow Sea (BYS) has been performed using COSMO-CLM (CCLM) forced by ERA-Interim reanalysis data (ERA-I). The accuracy of CCLM in reproducing surface wind and the added value of dynamical downscaling relative to ERA-I have been investigated through comparisons with satellite data (including QuikSCAT Level 2B 12.5 km version 3 (L2B12v3) swath data and MODIS images) and in situ observations, using quantitative metrics and qualitative assessment methods. The results revealed that CCLM reliably reproduces the regional wind characteristics over the BYS areas. Over marine areas, added value relative to ERA-I has been detected in coastal areas with complex coastlines and orography. CCLM better represented light and moderate winds, and added even more value for strong winds relative to ERA-I. Over land areas, the high-resolution CCLM hindcast adds value to ERA-I in reproducing wind intensities and directions, wind probability distributions and extreme winds, mainly in mountain areas. With respect to atmospheric processes, CCLM outperforms ERA-I in resolving detailed temporal and spatial structures of a typhoon and of a coastal atmospheric front; CCLM generates some orography-related phenomena, such as a vortex street, which are not captured by ERA-I. These added values demonstrate the utility of the 7-km-resolution CCLM for regional and local climate studies and applications. The simulation was constrained using the spectral nudging method; results may differ for simulations that are not so constrained.
A 7.5-Year Dataset of SSM/I-Derived Surface Turbulent Fluxes Over Global Oceans
NASA Technical Reports Server (NTRS)
Chou, Shu-Hsien; Shie, Chung-Lin; Atlas, Robert M.; Ardizzone, Joe; Nelkin, Eric; Einaudi, Franco (Technical Monitor)
2001-01-01
The surface turbulent fluxes of momentum, latent heat, and sensible heat over the global oceans are essential to weather, climate and ocean problems. Wind stress is the major forcing driving the oceanic circulation, while evaporation is a key component of the hydrological cycle and surface heat budget. We have produced a 7.5-year (July 1987-December 1994) dataset of daily, individual monthly-mean and climatological (1988-94) monthly-mean surface turbulent fluxes over the global oceans from measurements of the Special Sensor Microwave/Imager (SSM/I) on board the US Defense Meteorological Satellite Program F8, F10, and F11 satellites. It has a spatial resolution of 2.0 x 2.5 degrees latitude-longitude. Daily turbulent fluxes are derived from daily data of SSM/I surface winds and specific humidity, National Centers for Environmental Prediction (NCEP) sea surface temperatures, and European Centre for Medium-Range Weather Forecasts (ECMWF) air-sea temperature differences, using a stability-dependent bulk scheme. The retrieved instantaneous surface air humidity (with a 25-km resolution) is found to be generally accurate as compared to the collocated radiosonde observations over the global oceans. The surface wind speed and specific humidity (latent heat flux) derived from the F10 SSM/I are found to be generally smaller (larger) than those retrieved from the F11 SSM/I. The F11 SSM/I appears to have slightly better retrieval accuracy for surface wind speed and humidity as compared to the F10 SSM/I. This difference may be due to the orbital drift of the F10 satellite. The daily wind stresses and latent heat fluxes retrieved from the F10 and F11 SSM/Is show useful accuracy as verified against the research-quality in situ measurements (IMET buoy, RV Moana Wave, and RV Wecoma) in the western Pacific warm pool during the TOGA COARE Intensive Observing Period (November 1992-February 1993). The 1988-94 seasonal-mean turbulent fluxes and input variables derived from the F8 and F11 SSM/Is show reasonable
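A hedged sketch of a bulk aerodynamic flux estimate helps fix ideas. The constant transfer coefficient below is an illustrative neutral-stability simplification, not the stability-dependent scheme the abstract describes:

```python
RHO_AIR = 1.2   # kg m^-3, near-surface air density (assumed constant)
L_V = 2.5e6     # J kg^-1, latent heat of vaporization
C_E = 1.2e-3    # dimensionless bulk transfer coefficient (illustrative value)

def latent_heat_flux(wind_speed, q_sea, q_air):
    """Neutral-stability bulk estimate of latent heat flux (W m^-2):
    LH = rho * L_v * C_E * U * (q_sea - q_air), with specific humidities
    in kg/kg and wind speed in m/s. A sketch only; the operational scheme
    adjusts C_E for atmospheric stability."""
    return RHO_AIR * L_V * C_E * wind_speed * (q_sea - q_air)
```

For a 7 m/s wind and a 5 g/kg sea-air humidity difference this gives roughly 126 W m^-2, a plausible warm-pool magnitude.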
Learning to recognize rat social behavior: Novel dataset and cross-dataset application.
Lorbach, Malte; Kyriakou, Elisavet I; Poppe, Ronald; van Dam, Elsbeth A; Noldus, Lucas P J J; Veltkamp, Remco C
2018-04-15
Social behavior is an important aspect of rodent models. Automated measuring tools that make use of video analysis and machine learning are an increasingly attractive alternative to manual annotation. Because machine learning-based methods need to be trained, it is important that they are validated using data from different experiment settings. To develop and validate automated measuring tools, there is a need for annotated rodent interaction datasets. Currently, the availability of such datasets is limited to two mouse datasets. We introduce the first, publicly available rat social interaction dataset, RatSI. We demonstrate the practical value of the novel dataset by using it as the training set for a rat interaction recognition method. We show that behavior variations induced by the experiment setting can lead to reduced performance, which illustrates the importance of cross-dataset validation. Consequently, we add a simple adaptation step to our method and improve the recognition performance. Most existing methods are trained and evaluated in one experimental setting, which limits the predictive power of the evaluation to that particular setting. We demonstrate that cross-dataset experiments provide more insight in the performance of classifiers. With our novel, public dataset we encourage the development and validation of automated recognition methods. We are convinced that cross-dataset validation enhances our understanding of rodent interactions and facilitates the development of more sophisticated recognition methods. Combining them with adaptation techniques may enable us to apply automated recognition methods to a variety of animals and experiment settings. Copyright © 2017 Elsevier B.V. All rights reserved.
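Cross-dataset validation, as advocated above, simply means fitting a classifier on one dataset and scoring it on another collected under a different setting. A toy nearest-centroid sketch (the classifier choice and data are hypothetical, not the authors' method) illustrates the protocol:

```python
import numpy as np

def fit_nearest_centroid(X, y):
    """Compute one centroid per class from the training dataset."""
    labels = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in labels])
    return labels, centroids

def predict_nearest_centroid(X, labels, centroids):
    """Assign each sample to the class of its nearest centroid."""
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return labels[dists.argmin(axis=1)]

def cross_dataset_accuracy(train, test):
    """Train on dataset A, evaluate on dataset B: the cross-dataset protocol."""
    labels, centroids = fit_nearest_centroid(*train)
    preds = predict_nearest_centroid(test[0], labels, centroids)
    return float((preds == test[1]).mean())
```

In practice the test dataset's feature distribution shifts with the experimental setting, which is why the abstract reports reduced performance and adds an adaptation step.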
The Problem with Big Data: Operating on Smaller Datasets to Bridge the Implementation Gap.
Mann, Richard P; Mushtaq, Faisal; White, Alan D; Mata-Cervantes, Gabriel; Pike, Tom; Coker, Dalton; Murdoch, Stuart; Hiles, Tim; Smith, Clare; Berridge, David; Hinchliffe, Suzanne; Hall, Geoff; Smye, Stephen; Wilkie, Richard M; Lodge, J Peter A; Mon-Williams, Mark
2016-01-01
Big datasets have the potential to revolutionize public health. However, there is a mismatch between the political and scientific optimism surrounding big data and the public's perception of its benefit. We suggest a systematic and concerted emphasis on developing models derived from smaller datasets to illustrate to the public how big data can produce tangible benefits in the long term. In order to highlight the immediate value of a small data approach, we produced a proof-of-concept model predicting hospital length of stay. The results demonstrate that existing small datasets can be used to create models that generate a reasonable prediction, facilitating health-care delivery. We propose that greater attention (and funding) needs to be directed toward the utilization of existing information resources in parallel with current efforts to create and exploit "big data."
The value added by sawmilling in the Appalachian hill country of Ohio and Kentucky
Orris D. McCauley; James C. Whittaker
1967-01-01
The difference between log costs and lumber values at 40 sawmills in the Appalachian hill country of Ohio and Kentucky provides an estimate of the value added by sawmill production. Based on these estimates, sawmilling contributed about $12.8 million to the region's economy in 1962.
Method for conversion of carbohydrate polymers to value-added chemical products
Zhang, Zongchao C [Norwood, NJ; Brown, Heather M [Kennewick, WA; Su, Yu [Richland, WA
2012-02-07
Methods are described for conversion of carbohydrate polymers in ionic liquids, including cellulose, that yield value-added chemicals including, e.g., glucose and 5-hydroxylmethylfurfural (HMF) at temperatures below 120.degree. C. Catalyst compositions that include various mixed metal halides are described that are selective for specified products with yields, e.g., of up to about 56% in a single step process.
Feasibility of producing value-added wood products from reclaimed hemlock lumber
John J. Janowiak; Robert H. Falk; Jeffery D. Kimmel
2007-01-01
This study evaluated the feasibility of producing value-added wood products from hemlock lumber salvaged from building deconstruction. About 6,000 board feet of lumber, ranging in size from 3 in. by 8 in. to 3 in. by 12 in., was remilled into four products including log cabin siding, V-groove paneling, beadboard (wainscoting), and tongue and groove flooring. The...
Targeted and efficient transfer of multiple value-added genes into wheat varieties
USDA-ARS?s Scientific Manuscript database
With an objective to optimize an approach to transfer multiple value added genes to a wheat variety while maintaining and improving agronomic performance, two alleles with mutations in the acetolactate synthase (ALS) gene located on wheat chromosomes 6B and 6D providing tolerance to imidazolinone (I...
NASA Astrophysics Data System (ADS)
Skok, Gregor; Žagar, Nedjeljka; Honzak, Luka; Žabkar, Rahela; Rakovec, Jože; Ceglar, Andrej
2016-01-01
The study presents a precipitation intercomparison based on two satellite-derived datasets (TRMM 3B42, CMORPH), four raingauge-based datasets (GPCC, E-OBS, Willmott & Matsuura, CRU), ERA Interim reanalysis (ERAInt), and a single climate simulation using the WRF model. The comparison was performed for a domain encompassing parts of Europe and the North Atlantic over the 11-year period of 2000-2010. The four raingauge-based datasets are similar to the TRMM dataset with biases over Europe ranging from -7 % to +4 %. The spread among the raingauge-based datasets is relatively small over most of Europe, although areas with greater uncertainty (more than 30 %) exist, especially near the Alps and other mountainous regions. There are distinct differences between the datasets over the European land area and the Atlantic Ocean in comparison to the TRMM dataset. ERAInt has a small dry bias over the land; the WRF simulation has a large wet bias (+30 %), whereas CMORPH is characterized by a large and spatially consistent dry bias (-21 %). Over the ocean, both ERAInt and CMORPH have a small wet bias (+8 %) while the wet bias in WRF is significantly larger (+47 %). ERAInt has the highest frequency of low-intensity precipitation while the frequency of high-intensity precipitation is the lowest due to its lower native resolution. Both satellite-derived datasets have more low-intensity precipitation over the ocean than over the land, while the frequency of higher-intensity precipitation is similar or larger over the land. This result is likely related to orography, which triggers more intense convective precipitation, while the Atlantic Ocean is characterized by more homogenous large-scale precipitation systems which are associated with larger areas of lower intensity precipitation. However, this is not observed in ERAInt and WRF, indicating the insufficient representation of convective processes in the models. Finally, the Fraction Skill Score confirmed that both models perform
Added Value of Reliability to a Microgrid: Simulations of Three California Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marnay, Chris; Lai, Judy; Stadler, Michael
The Distributed Energy Resources Customer Adoption Model is used to estimate the value an Oakland nursing home, a Riverside high school, and a Sunnyvale data center would need to place on higher electricity service reliability for them to adopt a Consortium for Electric Reliability Technology Solutions Microgrid (CM) based on economics alone. A fraction of each building's load is deemed critical based on its mission, and the added cost of CM capability to meet it is added to on-site generation options. The three sites are analyzed with various resources available as microgrid components. Results show that the value placed on higher reliability often does not have to be significant for CM to appear attractive, about 25 $/kW·a and up, but the carbon footprint consequences are mixed because storage is often used to shift cheaper off-peak electricity to use during afternoon hours in competition with the solar sources.
ERIC Educational Resources Information Center
Raudenbush, Stephen
2013-01-01
This brief considers the problem of using value-added scores to compare teachers who work in different schools. The author focuses on whether such comparisons can be regarded as fair, or, in statistical language, "unbiased." An unbiased measure does not systematically favor teachers because of the backgrounds of the students they are…
NASA Astrophysics Data System (ADS)
Allard, Jason; Thompson, Clint; Keim, Barry D.
2015-04-01
The National Climatic Data Center's climate divisional dataset (CDD) is commonly used in climate change analyses. This dataset is a spatially continuous dataset for the conterminous USA from 1895 to the present. The CDD since 1931 is computed by averaging all available representative cooperative weather station data into a single monthly value for each of the 344 climate divisions of the conterminous USA, while pre-1931 data for climate divisions are derived from statewide averages using regression equations. This study examines the veracity of these pre-1931 data. All available Cooperative Observer Program (COOP) stations within each climate division in Georgia and Louisiana were averaged into a single monthly value for each month and each climate division from 1897 to 1930 to generate a divisional dataset (COOP DD), using similar methods to those used by the National Climatic Data Center to generate the post-1931 CDD. The reliability of the official CDD (derived from statewide averages) to produce temperature and precipitation means and trends prior to 1931 is then evaluated by comparing that dataset with the COOP DD with difference-of-means tests, correlations, and linear regression techniques. The CDD and the COOP DD are also compared to a divisional dataset derived from the United States Historical Climatology Network (USHCN) data (USHCN DD), with difference-of-means and correlation techniques, to demonstrate potential impacts of inhomogeneities within the CDD and the COOP DD. The statistical results, taken as a whole, not only indicate broad similarities between the CDD and COOP DD but also show that the CDD does not adequately portray pre-1931 temperature and precipitation in certain climate divisions within Georgia and Louisiana. In comparison with the USHCN DD, both the CDD and the COOP DD appear to be subject to biases that probably result from changing stations within climate divisions. As such, the CDD should be used judiciously for long-term studies.
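The divisional value for a month is a simple average over all reporting stations in the division, which can then be compared against the official series. A minimal sketch, with hypothetical station names, temperatures, and CDD values:

```python
from math import sqrt
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))

# Hypothetical monthly mean temperatures (deg F) from COOP stations in one division
stations = {
    "station_a": [45.1, 52.3, 61.0],
    "station_b": [44.2, 51.8, 60.2],
    "station_c": [46.0, 53.1, 61.9],
}

# Divisional value for each month = average of all available stations
coop_dd = [mean(month) for month in zip(*stations.values())]

# Compare against an official (here fabricated) regression-derived CDD series
cdd = [45.0, 52.0, 61.5]
print([round(v, 2) for v in coop_dd], round(pearson(coop_dd, cdd), 3))
```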
MICROREFINING OF WASTE GLYCEROL FOR THE PRODUCTION OF A VALUE-ADDED PRODUCT
As a result of Phase I, a process to refine crude glycerin waste to value-added products was designed. An economic analysis was performed to determine the capital and operating costs for a commercial facility that implements this design. Using the estimated 1,800 gallons of ra...
Can Value-Added Measures of Teacher Performance Be Trusted? Working Paper #18
ERIC Educational Resources Information Center
Guarino, Cassandra M.; Reckase, Mark D.; Woolridge, Jeffrey M.
2012-01-01
We investigate whether commonly used value-added estimation strategies can produce accurate estimates of teacher effects. We estimate teacher effects in simulated student achievement data sets that mimic plausible types of student grouping and teacher assignment scenarios. No one method accurately captures true teacher effects in all scenarios,…
Ichihara, Kiyoshi; Ozarda, Yesim; Barth, Julian H; Klee, George; Qiu, Ling; Erasmus, Rajiv; Borai, Anwar; Evgina, Svetlana; Ashavaid, Tester; Khan, Dilshad; Schreier, Laura; Rolle, Reynan; Shimizu, Yoshihisa; Kimura, Shogo; Kawano, Reo; Armbruster, David; Mori, Kazuo; Yadav, Binod K
2017-04-01
The IFCC Committee on Reference Intervals and Decision Limits coordinated a global multicenter study on reference values (RVs) to explore rational and harmonizable procedures for derivation of reference intervals (RIs) and investigate the feasibility of sharing RIs through evaluation of sources of variation of RVs on a global scale. For the common protocol, rather lenient criteria for reference individuals were adopted to facilitate harmonized recruitment with planned use of the latent abnormal values exclusion (LAVE) method. As of July 2015, 12 countries had completed their study with total recruitment of 13,386 healthy adults. 25 analytes were measured chemically and 25 immunologically. A serum panel with assigned values was measured by all laboratories. RIs were derived by parametric and nonparametric methods. The effect of LAVE methods is prominent in analytes which reflect nutritional status, inflammation and muscular exertion, indicating that inappropriate results are frequent in any country. The validity of the parametric method was confirmed by the presence of analyte-specific distribution patterns and successful Gaussian transformation using the modified Box-Cox formula in all countries. After successful alignment of RVs based on the panel test results, nearly half the analytes showed variable degrees of between-country differences. This finding, however, requires confirmation after adjusting for BMI and other sources of variation. The results are reported in the second part of this paper. The collaborative study enabled us to evaluate rational methods for deriving RIs and comparing the RVs based on real-world datasets obtained in a harmonized manner. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
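The parametric method transforms the reference values toward a Gaussian shape, takes the central 95 % interval as mean ± 1.96 SD, and back-transforms the limits. A minimal sketch using the plain log transform (the λ = 0 case of Box-Cox) as a stand-in for the study's modified Box-Cox formula, with hypothetical analyte values:

```python
import math
from statistics import mean, stdev

def parametric_ri(values):
    """Reference interval via a log (Box-Cox, lambda=0) transform:
    mean +/- 1.96 SD in transformed space, limits back-transformed."""
    logs = [math.log(v) for v in values]
    m, s = mean(logs), stdev(logs)
    return math.exp(m - 1.96 * s), math.exp(m + 1.96 * s)

# Hypothetical right-skewed analyte values (e.g. an enzyme activity, U/L)
rvs = [12, 15, 18, 20, 22, 25, 28, 33, 40, 55]
lo, hi = parametric_ri(rvs)
print(round(lo, 1), round(hi, 1))
```

A real implementation would fit λ (and a shift parameter) by maximum likelihood rather than fixing λ = 0.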
Garraín, Daniel; Fazio, Simone; de la Rúa, Cristina; Recchioni, Marco; Lechón, Yolanda; Mathieux, Fabrice
2015-01-01
The aim of this paper is to identify areas of potential improvement of the European Reference Life Cycle Database (ELCD) electricity datasets. The revision is based on the data quality indicators described by the International Life Cycle Data system (ILCD) Handbook, applied on a sectorial basis. These indicators evaluate the technological, geographical and time-related representativeness of the dataset and its appropriateness in terms of completeness, precision and methodology. Results show that the ELCD electricity datasets have very good quality in general terms; nevertheless, some findings and recommendations to improve the quality of Life-Cycle Inventories have been derived. Moreover, these results assure any LCA practitioner of the quality of the electricity-related datasets, and provide insights into the limitations and assumptions underlying the dataset modelling. Given this information, the LCA practitioner will be able to decide whether use of the ELCD electricity datasets is appropriate to the goal and scope of the analysis to be conducted. The methodological approach would also be useful for dataset developers and reviewers, in order to improve the overall Data Quality Requirements of databases.
O'Cathain, Alicia; Goode, Jackie; Drabble, Sarah J; Thomas, Kate J; Rudolph, Anne; Hewison, Jenny
2014-06-09
Qualitative research is undertaken with randomized controlled trials of health interventions. Our aim was to explore the perceptions of researchers with experience of this endeavour to understand the added value of qualitative research to the trial in practice. A telephone semi-structured interview study with 18 researchers with experience of undertaking the trial and/or the qualitative research. Interviewees described the added value of qualitative research for the trial, explaining how it solved problems at the pretrial stage, explained findings, and helped to increase the utility of the evidence generated by the trial. From the interviews, we identified three models of relationship of the qualitative research to the trial. In 'the peripheral' model, the trial was an opportunity to undertake qualitative research, with no intention that it would add value to the trial. In 'the add-on' model, the qualitative researcher understood the potential value of the qualitative research but it was viewed as a separate and complementary endeavour by the trial lead investigator and wider team. Interviewees described how this could limit the value of the qualitative research to the trial. Finally 'the integral' model played out in two ways. In 'integral-in-theory' studies, the lead investigator viewed the qualitative research as essential to the trial. However, in practice the qualitative research was under-resourced relative to the trial, potentially limiting its ability to add value to the trial. In 'integral-in-practice' studies, interviewees described how the qualitative research was planned from the beginning of the study, senior qualitative expertise was on the team from beginning to end, and staff and time were dedicated to the qualitative research. In these studies interviewees described the qualitative research adding value to the trial although this value was not necessarily visible beyond the original research team due to the challenges of publishing this research
Ning, Yawei; Li, Qiang; Chen, Feng; Yang, Na; Jin, Zhengyu; Xu, Xueming
2012-01-01
The effects of medium composition and culture conditions on the production of (6)G-fructofuranosidase with value-added astaxanthin were investigated to reduce the capital cost of neo-fructooligosaccharides (neo-FOS) production by Xanthophyllomyces dendrorhous. The sucrose and corn steep liquor (CSL) were found to be the optimal carbon source and nitrogen source, respectively. CSL and initial pH were selected as the critical factors using Plackett-Burman design. Maximum (6)G-fructofuranosidase 242.57 U/mL with 5.23 mg/L value-added astaxanthin was obtained at CSL 52.5 mL/L and pH 7.89 by central composite design. Neo-FOS yield could reach 238.12 g/L under the optimized medium conditions. Cost analysis suggested 66.3% of substrate cost was reduced compared with that before optimization. These results demonstrated that the optimized medium and culture conditions could significantly enhance the production of (6)G-fructofuranosidase with value-added astaxanthin and remarkably decrease the substrate cost, which opened up possibilities to produce neo-FOS industrially. Copyright © 2011 Elsevier Ltd. All rights reserved.
Microalgal cultivation for value-added products: a critical enviro-economical assessment.
Kothari, Richa; Pandey, Arya; Ahmad, Shamshad; Kumar, Ashwani; Pathak, Vinayak V; Tyagi, V V
2017-08-01
The present review focuses on the cultivation of algal biomass for generating value-added products (VAP) and to assess their economic benefits and harmful environmental impact. Additionally, the impact of bioreactor designs on the yield of microalgal biomass for VAP is also considered. All these factors are discussed in relation to the impact of microalgae production on the bio-economy sector of commercial biotechnology.
NASA Astrophysics Data System (ADS)
Nlandu Kamavuako, Ernest; Scheme, Erik Justin; Englehart, Kevin Brian
2016-08-01
Objective. For over two decades, Hudgins’ set of time domain features have extensively been applied for classification of hand motions. The calculation of slope sign change and zero crossing features uses a threshold to attenuate the effect of background noise. However, there is no consensus on the optimum threshold value. In this study, we investigate for the first time the effect of threshold selection on the feature space and classification accuracy using multiple datasets. Approach. In the first part, four datasets were used, and classification error (CE), separability index, scatter matrix separability criterion, and cardinality of the features were used as performance measures. In the second part, data from eight classes were collected during two separate days with two days in between from eight able-bodied subjects. The threshold for each feature was computed as a factor (R = 0:0.01:4) times the average root mean square of data during rest. For each day, we quantified CE for R = 0 (CEr0) and minimum error (CEbest). Moreover, a cross day threshold validation was applied where, for example, CE of day two (CEodt) is computed based on optimum threshold from day one and vice versa. Finally, we quantified the effect of the threshold when using training data from one day and test data of the other. Main results. All performance metrics generally degraded with increasing threshold values. On average, CEbest (5.26 ± 2.42%) was significantly better than CEr0 (7.51 ± 2.41%, P = 0.018), and CEodt (7.50 ± 2.50%, P = 0.021). During the two-fold validation between days, CEbest performed similar to CEr0. Interestingly, when using the threshold values optimized per subject from day one and day two respectively, on the cross-days classification, the performance decreased. Significance. We have demonstrated that threshold value has a strong impact on the feature space and that an optimum threshold can be quantified. However, this optimum threshold is highly data and
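The zero-crossing and slope-sign-change features only count a crossing or turn when the amplitude change exceeds the threshold, with the threshold set as a factor R times the resting RMS as in the study. A sketch on a hypothetical toy signal:

```python
import math

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

def zero_crossings(x, thr):
    """Count sign changes whose amplitude step exceeds the threshold."""
    return sum(1 for a, b in zip(x, x[1:]) if a * b < 0 and abs(a - b) >= thr)

def slope_sign_changes(x, thr):
    """Count slope reversals where an adjacent step exceeds the threshold."""
    return sum(
        1
        for a, b, c in zip(x, x[1:], x[2:])
        if (b - a) * (b - c) > 0 and max(abs(b - a), abs(b - c)) >= thr
    )

rest = [0.01, -0.02, 0.015, -0.01]         # hypothetical resting-baseline samples
signal = [0.5, -0.4, 0.3, 0.35, -0.6, 0.2]  # hypothetical EMG window
R = 2.0
thr = R * rms(rest)                         # threshold = R x resting RMS
print(zero_crossings(signal, thr), slope_sign_changes(signal, thr))  # 4 3
```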
NASA Astrophysics Data System (ADS)
Song, Wei; Anninos, Dionysios; Li, Wei; Padi, Megha; Strominger, Andrew
2009-03-01
Three dimensional topologically massive gravity (TMG) with a negative cosmological constant −ℓ⁻² and positive Newton constant G admits an AdS3 vacuum solution for any value of the graviton mass μ. These are all known to be perturbatively unstable except at the recently explored chiral point μℓ = 1. However we show herein that for every value of μℓ ≠ 3 there are two other (potentially stable) vacuum solutions given by SL(2,ℝ) × U(1)-invariant warped AdS3 geometries, with a timelike or spacelike U(1) isometry. Critical behavior occurs at μℓ = 3, where the warping transitions from a stretching to a squashing, and there are a pair of warped solutions with a null U(1) isometry. For μℓ > 3, there are known warped black hole solutions which are asymptotic to warped AdS3. We show that these black holes are discrete quotients of warped AdS3 just as BTZ black holes are discrete quotients of ordinary AdS3. Moreover new solutions of this type, relevant to any theory with warped AdS3 solutions, are exhibited. Finally we note that the black hole thermodynamics is consistent with the hypothesis that, for μℓ > 3, the warped AdS3 ground state of TMG is holographically dual to a 2D boundary CFT with central charges c_R and c_L.
Research on Performance Measurement Based on an Economic Value-Added Comprehensive Scorecard
NASA Astrophysics Data System (ADS)
Chen, Qin; Zhang, Xiaomei
With the development of the economy, traditional performance measurement, which relies mainly on financial indicators, can no longer satisfy the needs of the work. In order to make performance measurement best serve business goals, this paper proposes an Economic Value-Added Comprehensive Scorecard based on a study of the shortcomings and advantages of EVA and BSC. We used the Analytic Hierarchy Process to build judgment matrices and solve for the weightings of the EVA Comprehensive Scorecard. From the weightings, we can identify the factors with the greatest influence on enterprise value.
Examining the Relationship between Value-Added Results and Elements of Teacher Effectiveness
ERIC Educational Resources Information Center
Holloway, Carla Euniece
2014-01-01
The purpose of this quantitative, correlational study was to determine if student growth as measured by value-added measure of fourth grade 2011-2012 reading test scores was correlated with teacher observation ratings on the Teach and Cultivate Learning Environment domains of the Teaching and Learning Framework Rubric. A second purpose of the…
A Robust Post-Processing Workflow for Datasets with Motion Artifacts in Diffusion Kurtosis Imaging
Li, Xianjun; Yang, Jian; Gao, Jie; Luo, Xue; Zhou, Zhenyu; Hu, Yajie; Wu, Ed X.; Wan, Mingxi
2014-01-01
Purpose: The aim of this study was to develop a robust post-processing workflow for motion-corrupted datasets in diffusion kurtosis imaging (DKI). Materials and methods: The proposed workflow consisted of brain extraction, rigid registration, distortion correction, artifacts rejection, spatial smoothing and tensor estimation. Rigid registration was utilized to correct misalignments. Motion artifacts were rejected by using the local Pearson correlation coefficient (LPCC). The performance of LPCC in characterizing relative differences between artifacts and artifact-free images was compared with that of the conventional correlation coefficient in 10 randomly selected DKI datasets. The influence of rejected artifacts with information of gradient directions and b values for the parameter estimation was investigated by using mean square error (MSE). The variance of noise was used as the criterion for MSEs. The clinical practicality of the proposed workflow was evaluated by the image quality and measurements in regions of interest on 36 DKI datasets, including 18 artifact-free (18 pediatric subjects) and 18 motion-corrupted datasets (15 pediatric subjects and 3 essential tremor patients). Results: The relative difference between artifacts and artifact-free images calculated by LPCC was larger than that of the conventional correlation coefficient (p<0.05). It indicated that LPCC was more sensitive in detecting motion artifacts. MSEs of all derived parameters from the reserved data after the artifacts rejection were smaller than the variance of the noise. It suggested that the influence of rejected artifacts was less than the influence of noise on the precision of derived parameters. The proposed workflow improved the image quality and reduced the measurement biases significantly on motion-corrupted datasets (p<0.05). Conclusion: The proposed post-processing workflow was reliable to improve the image quality and the measurement precision of the derived parameters on motion-corrupted DKI
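Correlation-based artifact rejection can be sketched as flagging any image whose Pearson correlation with a reference falls below a cutoff. This simplification uses a global (not local, neighborhood-wise) correlation, and the reference, cutoff, and intensity values are all hypothetical:

```python
from math import sqrt
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))

reference = [10, 12, 14, 16, 18, 20]   # artifact-free image, flattened to 1-D
clean     = [11, 12, 15, 15, 19, 20]   # small noise only
corrupted = [20, 9, 17, 10, 22, 11]    # motion-scrambled intensities

def is_artifact(image, ref, cutoff=0.9):
    return pearson(image, ref) < cutoff

print(is_artifact(clean, reference), is_artifact(corrupted, reference))  # False True
```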
Expanding the biomass derived chemical space
Brun, Nicolas; Hesemann, Peter
2017-01-01
Biorefinery aims at the conversion of biomass and renewable feedstocks into fuels and platform chemicals, in analogy to conventional oil refinery. In the past years, the scientific community has defined a number of primary building blocks that can be obtained by direct biomass decomposition. However, the large potential of this “renewable chemical space” to contribute to the generation of value added bio-active compounds and materials still remains unexplored. In general, biomass derived building blocks feature a diverse range of chemical functionalities. In order to be integrated into value-added compounds, they require additional functionalization and/or covalent modification thereby generating secondary building blocks. The latter can be thus regarded as functional components of bio-active molecules or materials and represent an expansion of the renewable chemical space. This perspective highlights the most recent developments and opportunities for the synthesis of secondary biomass derived building blocks and their application to the preparation of value added products. PMID:28959397
Aquacultural and socio-economic aspects of processing carps into some value-added products.
Sehgal, H S; Sehgal, G K
2002-05-01
Carps are the mainstay of Indian aquaculture, contributing over 90% to the total fish production, which was estimated to be 1.77 million metric tonnes in 1996. Carp culture has a great potential for waste utilization and thus for pollution abatement. Many wastes such as cow, poultry, pig, duck, goat, and sheep excreta, biogas slurry, effluents from different kinds of factories/industries have been efficiently used for enhancing the productivity of natural food of carps and related species. Besides, several organic wastes/byproducts such as plant products, wastes from animal husbandry, and industrial by-products have been used as carp feed ingredients to lower the cost of supplementary feeding. However, to ensure the continued expansion of fish ponds and the pollution control, there must be a market for the fish (carps) produced in these ponds. The carps have, however, a low market value due to the presence of intra-muscular bones, which reduces their consumer acceptability. Thus, a need was felt to develop some boneless convenience products for enhancing the consumer acceptability of the carps. Efforts were made to prepare three value-added fish products, namely fish patty, fish finger and fish salad from carp flesh and were compared with a reference product ('fish pakoura'). Sensory evaluation of these products gave highly encouraging results. The methods of preparation of these products were transferred to some progressive farmers of the region who prepared and sold these products at very attractive prices. Carp processing has a great potential for the establishment of a fish ancillary industry and thus for boosting the production of these species. In Punjab alone, there is a potential of consuming 32,448 metric tonnes per annum of such value-added products (which would require 54,080 metric tonnes of raw fish). The development of value-added products has a significant role in raising the socio-economic status of the people associated with carp culture. The
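The projected Punjab figures imply a fixed product yield from raw fish, which a quick check confirms is 60 %:

```python
product_tonnes = 32_448    # value-added products, metric tonnes per annum
raw_fish_tonnes = 54_080   # raw fish required, metric tonnes per annum

yield_fraction = product_tonnes / raw_fish_tonnes
print(yield_fraction)   # 0.6
```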
Added clinical value of the inferior temporal EEG electrode chain.
Bach Justesen, Anders; Eskelund Johansen, Ann Berit; Martinussen, Noomi Ida; Wasserman, Danielle; Terney, Daniella; Meritam, Pirgit; Gardella, Elena; Beniczky, Sándor
2018-01-01
To investigate the diagnostic added value of supplementing the 10-20 EEG array with six electrodes in the inferior temporal chain. EEGs were recorded with 25 electrodes: 19 positions of the 10-20 system, and six additional electrodes in the inferior temporal chain (F9/10, T9/10, P9/10). Five-hundred consecutive standard and sleep EEG recordings were reviewed using the 10-20 array and the extended array. We identified the recordings with EEG abnormalities that had peak negativities at the inferior temporal electrodes, and those that only were visible at the inferior temporal electrodes. From the 286 abnormal recordings, the peak negativity was at the inferior temporal electrodes in 81 cases (28.3%) and only visible at the inferior temporal electrodes in eight cases (2.8%). In the sub-group of patients with temporal abnormalities (n = 134), these represented 59% (peak in the inferior chain) and 6% (only seen at the inferior chain). Adding six electrodes in the inferior temporal electrode chain to the 10-20 array improves the localization and identification of EEG abnormalities, especially those located in the temporal region. Our results suggest that inferior temporal electrodes should be added to the EEG array, to increase the diagnostic yield of the recordings. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
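The quoted percentages follow directly from the reported counts; a quick check:

```python
abnormal_recordings = 286
peak_at_inferior = 81    # peak negativity at an inferior temporal electrode
only_at_inferior = 8     # only visible at an inferior temporal electrode

print(round(100 * peak_at_inferior / abnormal_recordings, 1))  # 28.3
print(round(100 * only_at_inferior / abnormal_recordings, 1))  # 2.8
```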
Johnstone, Daniel; Milward, Elizabeth A.; Berretta, Regina; Moscato, Pablo
2012-01-01
Background: Recent Alzheimer's disease (AD) research has focused on finding biomarkers to identify disease at the pre-clinical stage of mild cognitive impairment (MCI), allowing treatment to be initiated before irreversible damage occurs. Many studies have examined brain imaging or cerebrospinal fluid but there is also growing interest in blood biomarkers. The Alzheimer's Disease Neuroimaging Initiative (ADNI) has generated data on 190 plasma analytes in 566 individuals with MCI, AD or normal cognition. We conducted independent analyses of this dataset to identify plasma protein signatures predicting pre-clinical AD. Methods and Findings: We focused on identifying signatures that discriminate cognitively normal controls (n = 54) from individuals with MCI who subsequently progress to AD (n = 163). Based on p value, apolipoprotein E (APOE) showed the strongest difference between these groups (p = 2.3×10⁻¹³). We applied a multivariate approach based on combinatorial optimization ((α,β)-k Feature Set Selection), which retains information about individual participants and maintains the context of interrelationships between different analytes, to identify the optimal set of analytes (signature) to discriminate these two groups. We identified 11-analyte signatures achieving values of sensitivity and specificity between 65% and 86% for both MCI and AD groups, depending on whether APOE was included and other factors. Classification accuracy was improved by considering “meta-features,” representing the difference in relative abundance of two analytes, with an 8-meta-feature signature consistently achieving sensitivity and specificity both over 85%. Generating signatures based on longitudinal rather than cross-sectional data further improved classification accuracy, returning sensitivities and specificities of approximately 90%. Conclusions: Applying these novel analysis approaches to the powerful and well-characterized ADNI dataset has identified sets of
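A "meta-feature" as described (the difference in relative abundance of two analytes) can be sketched as below; the analyte names and plasma levels are hypothetical:

```python
from itertools import combinations

# Hypothetical plasma analyte levels for one participant
analytes = {"apoe": 30.0, "analyte_x": 12.0, "analyte_y": 6.0}

total = sum(analytes.values())
relative = {k: v / total for k, v in analytes.items()}   # relative abundance

# Meta-feature: difference in relative abundance for each analyte pair
meta = {
    (a, b): relative[a] - relative[b]
    for a, b in combinations(sorted(analytes), 2)
}
print({k: round(v, 3) for k, v in meta.items()})
```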
ERIC Educational Resources Information Center
Collins, Clarin
2014-01-01
This study examined the SAS Education Value-Added Assessment System (EVAAS®) in practice, as perceived and experienced by teachers in the Southwest School District (SSD). To evaluate teacher effectiveness, SSD is using SAS EVAAS® for high-stakes consequences more than any other district or state in the country. A mixed-method design including a…
A Hybrid Neuro-Fuzzy Model For Integrating Large Earth-Science Datasets
NASA Astrophysics Data System (ADS)
Porwal, A.; Carranza, J.; Hale, M.
2004-12-01
A GIS-based hybrid neuro-fuzzy approach to integration of large earth-science datasets for mineral prospectivity mapping is described. It implements a Takagi-Sugeno type fuzzy inference system in the framework of a four-layered feed-forward adaptive neural network. Each unique combination of the datasets is considered a feature vector whose components are derived by knowledge-based ordinal encoding of the constituent datasets. A subset of feature vectors with a known output target vector (i.e., unique conditions known to be associated with either a mineralized or a barren location) is used for the training of an adaptive neuro-fuzzy inference system. Training involves iterative adjustment of parameters of the adaptive neuro-fuzzy inference system using a hybrid learning procedure for mapping each training vector to its output target vector with minimum sum of squared error. The trained adaptive neuro-fuzzy inference system is used to process all feature vectors. The output for each feature vector is a value that indicates the extent to which a feature vector belongs to the mineralized class or the barren class. These values are used to generate a prospectivity map. The procedure is demonstrated by an application to regional-scale base metal prospectivity mapping in a study area located in the Aravalli metallogenic province (western India). A comparison of the hybrid neuro-fuzzy approach with pure knowledge-driven fuzzy and pure data-driven neural network approaches indicates that the former offers a superior method for integrating large earth-science datasets for predictive spatial mathematical modelling.
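The knowledge-based ordinal encoding can be sketched as mapping each evidential dataset's classes to ranked scores and forming a feature vector for each unique combination; the layer names and ranks below are hypothetical:

```python
# Hypothetical ordinal encodings: higher rank = stronger inferred association
# with mineralization, assigned from expert knowledge
encodings = {
    "lithology": {"granite": 1, "schist": 2, "carbonate": 3},
    "fault_distance": {"far": 1, "near": 2},
    "geochem_anomaly": {"low": 1, "high": 2},
}

def feature_vector(location):
    """Encode one location's class in each dataset as an ordinal vector."""
    return [encodings[layer][cls] for layer, cls in location.items()]

loc = {"lithology": "carbonate", "fault_distance": "near", "geochem_anomaly": "high"}
print(feature_vector(loc))   # [3, 2, 2]
```

These vectors would then be fed to the adaptive neuro-fuzzy inference system for training.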
Handbook for Local Coordinators: Value-Added, Compact Disk, Union Catalog Test Phase.
ERIC Educational Resources Information Center
Townley, Charles
In 1988, the Associated College Libraries of Central Pennsylvania received a grant to create a value-added, compact disk, union catalog from the U.S. Department of Education's College Library Technology and Cooperative Grants Program, Title II of the Higher Education Act. Designed to contain, in time, 2,000,830 records from 17 member library…
Investigating the Added Value of Interactivity and Serious Gaming for Educational TV
ERIC Educational Resources Information Center
Bellotti, F.; Berta, R.; De Gloria, A.; Ozolina, A.
2011-01-01
TV is a medium with high penetration rates and has long been used to deliver informal education. Interactive TV may therefore play a significant role in meeting current lifelong-learning challenges, provided that meaningful applications are implemented. In this research work, we have explored the added value of interactivity…
ERIC Educational Resources Information Center
Rothstein, Jesse
2009-01-01
Non-random assignment of students to teachers can bias value-added estimates of teachers' causal effects. Rothstein (2008a, b) shows that typical value-added models indicate large counterfactual effects of 5th grade teachers on students' 4th grade learning, indicating that classroom assignments are far from random. This paper quantifies the…
Mushroom cultivation, processing and value added products: a patent based review.
Singhal, Somya; Rasane, Prasad; Kaur, Sawinder; Garba, Umar; Singh, Jyoti; Raj, Nishant; Gupta, Neeru
2018-06-03
Edible mushrooms are an abundant source of carbohydrates, proteins, and multiple antioxidants and phytonutrients. This paper presents a general overview of edible fungi, describing inventions in cultivation, processing equipment, and value-added products, with the aim of reviewing the innovations and nutraceutical benefits of mushrooms and of developing interest in them. The information provided in this review is based on available research investigations and patents. Mushrooms are an edible source of a wide variety of antioxidants and phytonutrients with a number of nutraceutical properties, including anti-tumor and anti-carcinogenic activity. Several investigations have therefore been made into cultivation and into improving yield through better growth substrates and processing equipment. Mushrooms have been processed into various products to increase their consumption, providing health and nutritional benefits. This paper summarizes mushroom cultivation practices, processing equipment, methods of preservation, value-added products, and nutraceutical properties. The review also highlights the various scientific feats achieved in terms of patents and research publications promoting mushroom as a wholesome food. Copyright© Bentham Science Publishers.
A Reanalysis of the Effects of Teacher Replacement Using Value-Added Modeling
ERIC Educational Resources Information Center
Yeh, Stuart S.
2013-01-01
Background: In principle, value-added modeling (VAM) might be justified if it can be shown to be a more reliable indicator of teacher quality than existing indicators for existing low-stakes decisions that are already being made, such as the award of small merit bonuses. However, a growing number of researchers now advocate the use of VAM to…
Methods for Accounting for Co-Teaching in Value-Added Models. Working Paper
ERIC Educational Resources Information Center
Hock, Heinrich; Isenberg, Eric
2012-01-01
Isolating the effect of a given teacher on student achievement (value-added modeling) is complicated when the student is taught the same subject by more than one teacher. We consider three methods, which we call the Partial Credit Method, Teacher Team Method, and Full Roster Method, for estimating teacher effects in the presence of co-teaching.…
An Evaluation of Empirical Bayes's Estimation of Value-Added Teacher Performance Measures
ERIC Educational Resources Information Center
Guarino, Cassandra M.; Maxfield, Michelle; Reckase, Mark D.; Thompson, Paul N.; Wooldridge, Jeffrey M.
2015-01-01
Empirical Bayes's (EB) estimation has become a popular procedure used to calculate teacher value added, often as a way to make imprecise estimates more reliable. In this article, we review the theory of EB estimation and use simulated and real student achievement data to study the ability of EB estimators to properly rank teachers. We compare the…
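The shrinkage idea behind EB estimation can be sketched generically. This is a textbook-style illustration with invented numbers, not the authors' estimator: each noisy value-added estimate is pulled toward the grand mean in proportion to how noisy it is.

```python
def eb_shrink(raw_effects, noise_vars):
    """Empirical Bayes shrinkage of noisy teacher value-added estimates.

    Each raw estimate is pulled toward the grand mean in proportion to its
    noisiness: shrunk_j = mean + r_j * (raw_j - mean), with reliability
    r_j = var_teacher / (var_teacher + noise_var_j).
    """
    n = len(raw_effects)
    mean = sum(raw_effects) / n
    total_var = sum((x - mean) ** 2 for x in raw_effects) / (n - 1)
    # Method of moments: observed variance = signal variance + average noise.
    var_teacher = max(total_var - sum(noise_vars) / n, 0.0)
    return [
        mean + var_teacher / (var_teacher + v) * (x - mean)
        for x, v in zip(raw_effects, noise_vars)
    ]

# Invented estimates: the two noisier ones (larger sampling variance)
# are shrunk harder toward the mean.
shrunk = eb_shrink([0.30, -0.20, 0.10, -0.20], [0.01, 0.01, 0.09, 0.09])
```

Because the reliability weight depends on each teacher's sampling variance, two teachers with the same raw estimate can receive different shrunk estimates, which is one way EB estimation can reorder a ranking.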
The 'added value' GPs bring to commissioning: a qualitative study in primary care.
Perkins, Neil; Coleman, Anna; Wright, Michael; Gadsby, Erica; McDermott, Imelda; Petsoulas, Christina; Checkland, Kath
2014-11-01
The 2012 Health and Social Care Act in England replaced primary care trusts with clinical commissioning groups (CCGs) as the main purchasing organisations. These new organisations are GP-led, and it was claimed that this increased clinical input would significantly improve commissioning practice. To explore some of the key assumptions underpinning CCGs, and to examine the claim that GPs bring 'added value' to commissioning. In-depth interviews with clinicians and managers across seven CCGs in England between April and September 2013. A total of 40 clinicians and managers were interviewed. Interviews focused on the perceived 'added value' that GPs bring to commissioning. Claims to GP 'added value' centred on their intimate knowledge of their patients. It was argued that this detailed and concrete knowledge improves service design and that a close working relationship between GPs and managers strengthens the ability of managers to negotiate. However, responders also expressed concerns about the large workload that they face and about the difficulty in engaging with the wider body of GPs. GPs have been involved in commissioning in many ways since fundholding in the 1990s, and claims such as these are not new. The key question is whether these new organisations better support and enable the effective use of this knowledge. Furthermore, emphasis on experiential knowledge brings with it concerns about representativeness and the extent to which other voices are heard. Finally, the implicit privileging of GPs' personal knowledge ahead of systematic public health intelligence also requires exploration. © British Journal of General Practice 2014.
MGH-USC Human Connectome Project Datasets with Ultra-High b-Value Diffusion MRI
Fan, Qiuyun; Witzel, Thomas; Nummenmaa, Aapo; Van Dijk, Koene R.A.; Van Horn, John D.; Drews, Michelle K.; Somerville, Leah H.; Sheridan, Margaret A.; Santillana, Rosario M.; Snyder, Jenna; Hedden, Trey; Shaw, Emily E.; Hollinshead, Marisa O.; Renvall, Ville; Zanzonico, Roberta; Keil, Boris; Cauley, Stephen; Polimeni, Jonathan R.; Tisdall, Dylan; Buckner, Randy L.; Wedeen, Van J.; Wald, Lawrence L.; Toga, Arthur W.; Rosen, Bruce R.
2015-01-01
The MGH-USC CONNECTOM MRI scanner housed at the Massachusetts General Hospital (MGH) is a major hardware innovation of the Human Connectome Project (HCP). The 3T CONNECTOM scanner is capable of producing a magnetic field gradient of up to 300 mT/m strength for in vivo human brain imaging, which greatly shortens the time spent on diffusion encoding, and decreases the signal loss due to T2 decay. To demonstrate the capability of the novel gradient system, data of healthy adult participants were acquired for this MGH-USC Adult Diffusion Dataset (N=35), minimally preprocessed, and shared through the Laboratory of Neuro Imaging Image Data Archive (LONI IDA) and the WU-Minn Connectome Database (ConnectomeDB). Another purpose of sharing the data is to facilitate methodological studies of diffusion MRI (dMRI) analyses utilizing high diffusion contrast, which may not be easily feasible with standard MR gradient systems. In addition, acquisition of the MGH-Harvard-USC Lifespan Dataset is currently underway to include 120 healthy participants ranging from 8 to 90 years old, which will also be shared through LONI IDA and ConnectomeDB. Here we describe the efforts of the MGH-USC HCP consortium in acquiring and sharing the ultra-high b-value diffusion MRI data and provide a report on data preprocessing and access. We conclude with a demonstration of the example data, along with results of standard diffusion analyses, including q-ball Orientation Distribution Function (ODF) reconstruction and tractography. PMID:26364861
Advanced Manufacturing and Value-added Products from US Agriculture
NASA Technical Reports Server (NTRS)
Villet, Ruxton H.; Child, Dennis R.; Acock, Basil
1992-01-01
An objective of the US Department of Agriculture (USDA) Agriculture Research Service (ARS) is to develop technology leading to a broad portfolio of value-added marketable products. Modern scientific disciplines such as chemical engineering are brought into play to develop processes for converting bulk commodities into high-margin products. To accomplish this, the extremely sophisticated processing devices which form the basis of modern biotechnology, namely, genes and enzymes, can be tailored to perform the required functions. The USDA/ARS is a leader in the development of intelligent processing equipment (IPE) for agriculture in the broadest sense. Applications of IPE are found in the production, processing, grading, and marketing aspects of agriculture. Various biotechnology applications of IPE are discussed.
Govindhan, R; Karthikeyan, B
2017-10-01
The data presented in this article relate to research on UV-A-stable nanotubes. The nanotubes were prepared from a 3,5-bis(trifluoromethyl)benzylamine derivative of tyrosine (BTTP). XRD data reveal the size of the nanotubes. The as-synthesized nanotubes (BTTPNTs) are characterized by UV-vis optical absorption studies [1] and photophysical degradation kinetics. The resulting dataset is made available to enable critical or extended analyses of BTTPNTs as light-resistant materials.
Added Value of Early Literacy Screening in Preschool Children.
Iyer, Sai Nandini; Dawson, M Zachary; Sawyer, Mark I; Abdullah, Neelab; Saju, Leya; Needlman, Robert D
2017-09-01
The Early Literacy Screener (ELS) is a brief screen for emergent literacy delays in 4- and 5-year-olds. Standard developmental screens may also flag these children. What is the value of adding the ELS? Parents of children aged 4 (n = 45) and 5 (n = 26) years completed the Ages and Stages Questionnaire-3 (ASQ-3), the Survey of Well-Being in Young Children (SWYC), and the ELS. Rates of positive agreement (PA), negative agreement (NA), and overall agreement (Cohen's κ) across the various screening tools were calculated. Early literacy delays were detected in 51% of those who passed the ASQ and 38% of those who passed the SWYC. For ELS versus ASQ, κ = 0.18, PA = 0.36 (95% CI = 0.23-0.51), and NA = 0.83 (95% CI = 0.66-0.92). For ELS versus SWYC, κ = 0.42, PA = 0.61 (95% CI = 0.45-0.75), and NA = 0.82 (95% CI = 0.65-0.92). The ELS adds value by flagging early literacy delays in many children who pass either the ASQ-3 or SWYC.
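The agreement statistics reported here (positive agreement, negative agreement, and Cohen's kappa) are computed from a 2x2 cross-classification of the two screens. A minimal sketch with invented counts, not the study's data:

```python
def agreement_stats(a, b, c, d):
    """Agreement between two binary screens from a 2x2 table.

    a: both positive, b: first-only positive, c: second-only positive,
    d: both negative. Returns (positive agreement, negative agreement,
    Cohen's kappa).
    """
    n = a + b + c + d
    pa = 2 * a / (2 * a + b + c)           # positive agreement
    na = 2 * d / (2 * d + b + c)           # negative agreement
    po = (a + d) / n                        # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return pa, na, kappa

# Hypothetical counts for illustration only.
pa, na, kappa = agreement_stats(a=10, b=15, c=3, d=43)
```

Note that kappa can look modest even when negative agreement is high, as in this study, because most children pass both screens and chance agreement is correspondingly large.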
Dataset definition for CMS operations and physics analyses
NASA Astrophysics Data System (ADS)
Franzoni, Giovanni; Compact Muon Solenoid Collaboration
2016-04-01
Data recorded at the CMS experiment are funnelled into streams, integrated in the HLT menu, and further organised in a hierarchical structure of primary datasets and secondary datasets/dedicated skims. Datasets are defined according to the final-state particles reconstructed by the high level trigger, the data format and the use case (physics analysis, alignment and calibration, performance studies). During the first LHC run, new workflows have been added to this canonical scheme to best exploit the flexibility of the CMS trigger and data acquisition systems. The concepts of data parking and data scouting have been introduced to extend the physics reach of CMS, offering the opportunity of defining physics triggers with extremely loose selections (e.g., a dijet resonance trigger collecting data at a rate of 1 kHz). In this presentation, we review the evolution of the dataset definition during LHC run I, and we discuss the plans for run II.
The impact of the resolution of meteorological datasets on catchment-scale drought studies
NASA Astrophysics Data System (ADS)
Hellwig, Jost; Stahl, Kerstin
2017-04-01
Gridded meteorological datasets provide the basis for studying drought at a range of scales, including catchment-scale drought studies in hydrology. They are readily available for studying past weather conditions and often serve real-time monitoring as well. Because these datasets differ in spatial/temporal coverage and spatial/temporal resolution, most studies face a tradeoff among these features. Our investigation examines whether biases occur when studying drought at the catchment scale with low-resolution input data. For that, a comparison among the datasets HYRAS (covering Central Europe, 1x1 km grid, daily data, 1951-2005), E-OBS (Europe, 0.25° grid, daily data, 1950-2015) and GPCC (global, 0.5° grid, monthly data, 1901-2013) is carried out. Generally, biases in precipitation increase with decreasing resolution, and the most important variations are found during summer. In the low mountain ranges of Central Europe, the coarser-resolution datasets (E-OBS, GPCC) overestimate dry days and underestimate total precipitation, since they cannot capture high spatial variability. However, relative measures like the correlation coefficient reveal good consistency of dry and wet periods, both for absolute precipitation values and for standardized indices like the Standardized Precipitation Index (SPI) and the Standardized Precipitation Evapotranspiration Index (SPEI). In particular, the most severe droughts derived from the different datasets match very well. These results indicate that absolute values from coarse-resolution datasets applied at the catchment scale may be problematic for assessing hydrological drought, whereas relative measures for identifying drought periods are more trustworthy. Studies on drought that downscale meteorological data should therefore carefully consider their data needs and focus on relative measures for dry periods where these suffice for the task.
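A standardized index such as the SPI, which supports the relative comparisons above, can be sketched as follows. This is a simplified illustration on synthetic data, assuming scipy is available; operational SPI calculations fit the distribution per calendar month and treat zero-precipitation months explicitly.

```python
import numpy as np
from scipy import stats

def spi(precip, scale=3):
    """Standardized Precipitation Index for a monthly precipitation series.

    Accumulates precipitation over `scale` months, fits a gamma
    distribution, and maps each accumulated value through the fitted CDF
    to a standard-normal quantile. Simplified: a single fit for all
    calendar months and no zero-precipitation mixture term.
    """
    p = np.convolve(precip, np.ones(scale), mode="valid")   # rolling sums
    shape, loc, sc = stats.gamma.fit(p, floc=0)             # gamma fit, loc fixed at 0
    cdf = stats.gamma.cdf(p, shape, loc=loc, scale=sc)
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))     # map to z-scores

rng = np.random.default_rng(0)
monthly_precip = rng.gamma(2.0, 30.0, size=240)  # 20 years of synthetic data
index = spi(monthly_precip, scale=3)             # SPI-3; negative values => dry
```

Because the index is a rank-like transform of each dataset against its own climatology, coarse- and fine-resolution datasets can agree on SPI even when their absolute precipitation totals differ.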
ERIC Educational Resources Information Center
Chetty, Raj; Friedman, John N.; Rockoff, Jonah E.
2011-01-01
Are teachers' impacts on students' test scores ("value-added") a good measure of their quality? This question has sparked debate largely because of disagreement about (1) whether value-added (VA) provides unbiased estimates of teachers' impacts on student achievement and (2) whether high-VA teachers improve students' long-term outcomes.…
The Implications of Summer Learning Loss for Value-Added Estimates of Teacher Effectiveness
ERIC Educational Resources Information Center
Gershenson, Seth; Hayes, Michael S.
2018-01-01
School districts across the United States increasingly use value-added models (VAMs) to evaluate teachers. In practice, VAMs typically rely on lagged test scores from the previous academic year, which necessarily conflate summer with school-year learning and potentially bias estimates of teacher effectiveness. We investigate the practical…
Viking Seismometer PDS Archive Dataset
NASA Astrophysics Data System (ADS)
Lorenz, R. D.
2016-12-01
The Viking Lander 2 seismometer operated successfully for over 500 Sols on the Martian surface, recording at least one likely candidate Marsquake. The Viking mission, in an era when data-handling hardware (both on board and on the ground) was limited in capability, predated modern planetary data archiving; the ad-hoc repositories of the data, and the very low-level record at NSSDC, were neither convenient to process nor well known. In an effort supported by the NASA Mars Data Analysis Program, we have converted the bulk of the Viking dataset (namely the 49,000 and 270,000 records made in High- and Event-modes at 20 and 1 Hz, respectively) into a simple ASCII table format. Additionally, since wind-generated lander motion is a major component of the signal, contemporaneous meteorological data are included in summary records to facilitate correlation. These datasets are being archived at the PDS Geosciences Node. In addition to brief instrument and dataset descriptions, the archive includes code snippets in the freely available language 'R' to demonstrate plotting and analysis. Further, we present examples of lander-generated noise associated with the sampler arm, instrument dumps, and other mechanical operations.
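The kind of correlation the summary records are meant to facilitate, between seismometer amplitude and contemporaneous wind, can be sketched with a plain Pearson coefficient. The record layout and values below are invented for illustration; the actual PDS archive documents its own column format.

```python
# Invented (sol, mean amplitude in counts, wind speed in m/s) records,
# standing in for the archive's summary records.
rows = [
    (200, 3.1, 2.0),
    (201, 4.8, 4.5),
    (202, 2.5, 1.2),
    (203, 6.0, 6.1),
]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

amp = [r[1] for r in rows]
wind = [r[2] for r in rows]
r = pearson(amp, wind)  # strong positive r would suggest wind-driven lander motion
```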
The Added Value of Water Footprint Assessment for National Water Policy: A Case Study for Morocco
Schyns, Joep F.; Hoekstra, Arjen Y.
2014-01-01
A Water Footprint Assessment is carried out for Morocco, mapping the water footprint of different activities at river basin and monthly scale, distinguishing between surface- and groundwater. The paper aims to demonstrate the added value of detailed analysis of the human water footprint within a country and thorough assessment of the virtual water flows leaving and entering a country for formulating national water policy. Green, blue and grey water footprint estimates and virtual water flows are mainly derived from a previous grid-based (5×5 arc minute) global study for the period 1996–2005. These estimates are placed in the context of monthly natural runoff and waste assimilation capacity per river basin derived from Moroccan data sources. The study finds that: (i) evaporation from storage reservoirs is the second largest form of blue water consumption in Morocco, after irrigated crop production; (ii) Morocco’s water and land resources are mainly used to produce relatively low-value (in US$/m3 and US$/ha) crops such as cereals, olives and almonds; (iii) most of the virtual water export from Morocco relates to the export of products with a relatively low economic water productivity (in US$/m3); (iv) blue water scarcity on a monthly scale is severe in all river basins and pressure on groundwater resources by abstractions and nitrate pollution is considerable in most basins; (v) the estimated potential water savings by partial relocation of crops to basins where they consume less water and by reducing water footprints of crops down to benchmark levels are significant compared to demand reducing and supply increasing measures considered in Morocco’s national water strategy. PMID:24919194
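The monthly blue water scarcity behind finding (iv) is conventionally computed as the blue water footprint divided by blue water availability, with availability often taken as natural runoff minus a presumptive environmental flow requirement of 80% of runoff. A sketch with invented monthly values, not the Moroccan data:

```python
def monthly_blue_water_scarcity(blue_wf, runoff, efr_fraction=0.8):
    """Blue water scarcity per month.

    Scarcity = blue water footprint / availability, where availability is
    natural runoff minus an environmental flow requirement (here a
    presumptive standard of 80% of runoff). Values above 1.0 indicate
    severe scarcity.
    """
    return [
        wf / (r * (1 - efr_fraction)) if r > 0 else float("inf")
        for wf, r in zip(blue_wf, runoff)
    ]

# Invented monthly values (million m3) for illustration.
scarcity = monthly_blue_water_scarcity(
    blue_wf=[40, 35, 30, 25], runoff=[120, 100, 60, 30]
)
```

Because availability shrinks faster than the footprint in dry months, scarcity peaks late in the invented series even though the footprint itself declines.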
a European Global Navigation Satellite System — the German Market and Value Adding Chain Effects
NASA Astrophysics Data System (ADS)
Vollerthun, A.; Wieser, M.
2002-03-01
Since Europe is considering establishing a "market-driven" European Global Navigation Satellite System, the German Aerospace Center initiated market research to justify a German investment in such a European project. The market research covered the following segments: aviation, railway, road traffic, shipping, surveying, farming, military, space applications, leisure, and sport. In these market segments, the aforementioned inputs were determined for satellite navigation hardware (receivers) as well as satellite navigation services. The forecast period was 2007 to 2017. For this period, the market amounts to a total of DM 83.0 billion (approx. US $50 billion), of which the satellite navigation equipment market makes up DM 39.8 billion and charges for value-added services amount to DM 43.2 billion. On closer examination, road traffic can be identified as the dominant market share, both in the receiver market and the service market: with a share of 96% for receivers and 73% for services, the significance of the road traffic segment is obvious. The second part of this paper investigates the effects of this market potential on the value-adding chain. All participants in the value-adding chain are identified; using industrial cost-structure models, the employment effect is analyzed and possible tax revenues for the state are examined.
ARM Climate Research Facility Quarterly Value-Added Product Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sivaraman, C.
The purpose of this report is to provide a concise status update for Value-Added Products (VAPs) implemented by the Atmospheric Radiation Measurement (ARM) Climate Research Facility. The report is divided into the following sections: (1) new VAPs for which development has begun; (2) progress on existing VAPs; (3) future VAPs that have been recently approved; (4) other work that leads to a VAP; (5) top requested VAPs from the ARM Data Archive; and (6) a summary of VAP and data releases to production and evaluation. New information is highlighted in blue text. New information about data processed by the developer is highlighted in red text. Upcoming milestones and dates are highlighted in green.
Niemantsverdriet, Ellis; Feyen, Bart F E; Le Bastard, Nathalie; Martin, Jean-Jacques; Goeman, Johan; De Deyn, Peter Paul; Bjerke, Maria; Engelborghs, Sebastiaan
2018-01-01
Differential dementia diagnosis remains a challenge due to overlapping clinical profiles, which often results in diagnostic doubt. We determined the added diagnostic value of cerebrospinal fluid (CSF) biomarkers for differential dementia diagnosis as compared to autopsy-confirmed diagnosis. Seventy-one dementia patients with autopsy-confirmed diagnoses were included in this study. All neuropathological diagnoses were established according to standard neuropathological criteria and consisted of Alzheimer's disease (AD) or other dementias (NONAD). CSF levels of Aβ1-42, T-tau, and P-tau181 were determined and interpreted based on the IWG-2 and NIA-AA criteria separately. A panel of three neurologists experienced with dementia made clinical consensus dementia diagnoses. Clinical and CSF biomarker diagnoses were compared to the autopsy-confirmed diagnoses. Forty-two patients (59%) had autopsy-confirmed AD, whereas 29 patients (41%) had autopsy-confirmed NONAD. Of the 24 patients with an ambiguous clinical dementia diagnosis, a correct diagnosis would have been established in 67% of cases by applying CSF biomarkers in the context of the IWG-2 or the NIA-AA criteria. AD CSF biomarkers thus have added diagnostic value in differential dementia diagnosis and can help establish a correct diagnosis in cases of ambiguous clinical dementia diagnosis.
Reducing added sugar intake increases the relative reinforcing value of high-sugar foods
USDA-ARS?s Scientific Manuscript database
Objective: To determine whether reducing added sugar intake to <10% of calories for 1 week changes the relative reinforcing value (RRV) of foods high in sugar and to test whether changes in RRV of high-sugar foods differed between non-overweight and obese adults. Background: The 2015-2020 DGA focu...
ERIC Educational Resources Information Center
Isenberg, Eric; Hock, Heinrich
2011-01-01
This report presents the value-added models that will be used to measure school and teacher effectiveness in the District of Columbia Public Schools (DCPS) in the 2010-2011 school year. It updates the earlier technical report, "Measuring Value Added for IMPACT and TEAM in DC Public Schools." The earlier report described the methods used…
Assessing the Added Value of Dynamical Downscaling Using ...
In this study, the Standardized Precipitation Index (SPI) is used to ascertain the added value of dynamical downscaling over the contiguous United States. WRF is used as a regional climate model (RCM) to dynamically downscale reanalysis fields to compare values of SPI over drought timescales that have implications for agriculture and water resources planning. The regional climate generated by WRF has the largest improvement over reanalysis for SPI correlation with observations as the drought timescale increases. This suggests that dynamically downscaled fields may be more reliable than larger-scale fields for water resource applications (e.g., water storage within reservoirs). WRF improves the timing and intensity of moderate to extreme wet and dry periods, even in regions with homogenous terrain. This study also examines changes in SPI from the extreme drought of 1988 and three “drought busting” tropical storms. Each of those events illustrates the importance of using downscaling to resolve the spatial extent of droughts. The analysis of the “drought busting” tropical storms demonstrates that while the impact of these storms on ending prolonged droughts is improved by the RCM relative to the reanalysis, it remains underestimated. These results illustrate the importance and some limitations of using RCMs to project drought. The National Exposure Research Laboratory’s Atmospheric Modeling Division (AMAD) conducts research in support of EPA’s mission t
Ten years of medical education registrars: Value added?
Brazil, Victoria; Davin, Lorna
2018-05-22
There is a paucity of long-term follow-up of trainees' career pathways or of organisational outcomes from medical education registrar posts in emergency medicine training. We report on the experience of a selected group of medical education trainees during and subsequent to their post and reflect on the value added to emergency medical education at three institutions. We conducted an online survey study, examining quantitative outcomes and qualitative reflections, of emergency physicians who had previously undertaken a medical education registrar post. Descriptive statistics were used to summarise responses to Likert items. The authors independently analysed and interpreted the reflective responses to identify key themes and sub-themes. Nineteen of 21 surveys were completed. Most respondents were in formal educational roles, in addition to clinical practice. The thematic analysis revealed that the medical education registrar experience, and the subsequent contribution of these trainees to medical education, is significantly shaped by external factors. These include the extent of faculty support, and the value placed on medical education by hospitals/departments/leaders. Acquisition of knowledge and skills in medical education was only part of a broader developmental journey and transitioning of identity for the trainees. Our findings suggest that medical education trainees in emergency medicine progress to educational roles, and most respondents attribute their career progression to the medical education training experience. We recommend that medical education registrar programmes need to be valued within the clinical service, supported by faculty and a 'community of practice', to support trainees' transition to clinician educator leadership roles. © 2018 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
Go Digital! Making Physical Samples a Valued Part of the Online Record of Science
NASA Astrophysics Data System (ADS)
Klump, J. F.; Lehnert, K.
2016-12-01
Physical samples, at first glance, seem to be the opposite of the virtual world of the internet. Yet, like anything not natively digital, physical samples can have a digital representation that is accessible through the internet. Most museums and other institutions have many more objects in their collections than they could ever put on display, and many samples exist outside of formal curation workflows. Nevertheless, these objects can be of importance to science, maybe because a particular fossil is a holotype that defines an extinct animal species, or a mineral sample was used to derive a reference optical reflectance spectrum that is used in the interpretation of remote sensing data from satellites. As these examples show, the value of a scientific collection lies not only in its objects but also in how these objects are integrated into the record of science. Fundamental to this are, of course, catalogues of the samples held in a collection. Significant value can be added to a collection if its catalogue is web accessible, and even better if its catalogue can be harvested into disciplinary portals to aid the discovery of samples. Sample curation in the digital age, however, must go beyond simply labeling and cataloguing. In the same way that publications and datasets can now be identified and accessed over the web, steps are now being made to do the same for physical samples. Globally unique, resolvable identifiers of samples, datasets and literature can serve as nodes to link these resources together and, in this way, cross-link between scientific interpretation in the literature, the data interpreted in these works, and the samples from which these data were derived. These linkages must not only be recorded in the metadata but must also be machine actionable to allow integration of these digital assets into the ever-growing body and richness of the scientific record. This presentation will discuss cyberinfrastructures for samples and sample curation.
Reformers, Batting Averages, and Malpractice: The Case for Caution in Value-Added Use
ERIC Educational Resources Information Center
Gleason, Daniel
2014-01-01
The essay considers two analogies that help to reveal the limitations of value-added modeling: the first, a comparison with batting averages, shows that the model's reliability is quite limited even though year-to-year correlation figures may seem impressive; the second, a comparison between medical malpractice and so-called educational…
Evaluating Specification Tests in the Context of Value-Added Estimation. Working Paper #38
ERIC Educational Resources Information Center
Guarino, Cassandra M.; Reckase, Mark D.; Stacy, Brian W.; Wooldridge, Jeffrey M.
2014-01-01
We study the properties of two specification tests that have been applied to a variety of estimators in the context of value-added measures (VAMs) of teacher and school quality: the Hausman test for choosing between random and fixed effects and a test for feedback (sometimes called a "falsification test"). We discuss theoretical…
Fermionic currents in AdS spacetime with compact dimensions
NASA Astrophysics Data System (ADS)
Bellucci, S.; Saharian, A. A.; Vardanyan, V.
2017-09-01
We derive a closed expression for the vacuum expectation value (VEV) of the fermionic current density in a (D+1)-dimensional locally AdS spacetime with an arbitrary number of toroidally compactified Poincaré spatial dimensions and in the presence of a constant gauge field. The latter can be formally interpreted in terms of a magnetic flux threading the compact dimensions. In the compact subspace, the field operator obeys quasiperiodicity conditions with arbitrary phases. The VEV of the charge density is zero and the current density has nonzero components along the compact dimensions only. They are periodic functions of the magnetic flux with period equal to the flux quantum and tend to zero on the AdS boundary. Near the horizon, the effect of the background gravitational field is small and the leading term in the corresponding asymptotic expansion coincides with the VEV for a massless field in the locally Minkowski bulk. Unlike the Minkowskian case, in a system consisting of an equal number of fermionic and scalar degrees of freedom, with the same masses, charges and phases in the periodicity conditions, the total current density does not vanish. In these systems, the leading divergences in the scalar and fermionic contributions on the horizon cancel and, as a consequence, the charge flux, integrated over the coordinate perpendicular to the AdS boundary, becomes finite. We show that in odd spacetime dimensions the fermionic fields realizing the two inequivalent representations of the Clifford algebra and having equal phases in the periodicity conditions give the same contribution to the VEV of the current density. Combining the contributions from these fields, the current density in odd-dimensional C-, P- and T-symmetric models is obtained. As an application, we consider the ground-state current density in curved carbon nanotubes described in terms of a (2+1)-dimensional effective Dirac model.
ERIC Educational Resources Information Center
What Works Clearinghouse, 2012
2012-01-01
This study examined whether being taught by a teacher with a high "value-added" improves a student's long-term outcomes. The study analyzed more than 20 years of data for nearly one million fourth- through eighth-grade students in a large urban school district. The study reported that having a teacher with a higher level of value-added was…
da Silva, Teresa Lopes; Gouveia, Luísa; Reis, Alberto
2014-02-01
The production of microbial biofuels is currently under investigation, as they are alternatives to fossil fuels, which are diminishing and whose use has a negative impact on the environment. However, so far, biofuels derived from microbes are not economically competitive. One way to overcome this bottleneck is to use microorganisms to transform substrates into biofuels and high value-added products, while simultaneously taking advantage of the various microbial biomass components to produce other products of interest, as an integrated process. In this way, it is possible to maximize the economic value of the whole process, with the desired reduction of the waste streams produced. It is expected that such an integrated system will make biofuel production economically sustainable and competitive in the near future. This review describes the investigation of integrated microbial processes (based on bacteria, yeast, and microalgal cultivations) that have been experimentally developed, highlighting the importance of this approach as a way to optimize the microbial biofuel production process.
A cross-country Exchange Market Pressure (EMP) dataset.
Desai, Mohit; Patnaik, Ila; Felman, Joshua; Shah, Ajay
2017-06-01
The data presented in this article are related to the research article titled "An exchange market pressure measure for cross country analysis" (Patnaik et al. [1]). In this article, we present the dataset of Exchange Market Pressure (EMP) values for 139 countries along with their conversion factors, ρ (rho). Exchange Market Pressure, expressed as a percentage change in the exchange rate, measures the change in the exchange rate that would have taken place had the central bank not intervened. The conversion factor ρ can be interpreted as the change in the exchange rate associated with $1 billion of intervention. Estimates of the conversion factor ρ allow us to calculate a monthly time series of EMP for 139 countries. Additionally, the dataset contains the 68% confidence interval (high and low values) for the point estimates of the ρ's. Using the standard errors of the estimates of the ρ's, we obtain one-sigma intervals around the mean estimates of the EMP values. These values are also reported in the dataset.
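As a hedged sketch (not the authors' code), the construction described in this abstract can be illustrated as follows. The formulation EMP_t = %Δe_t + ρ·I_t, and all variable names, are assumptions for illustration only.

```python
# Illustrative sketch of an Exchange Market Pressure (EMP) calculation.
# Assumption: EMP_t = pct_change_exchange_rate_t + rho * intervention_t,
# where rho converts $1bn of central-bank intervention into an
# exchange-rate change (in percent).
def emp_series(pct_dE, intervention_bn, rho):
    """Monthly EMP values, in percent, for one country."""
    return [de + rho * i for de, i in zip(pct_dE, intervention_bn)]

def emp_interval(pct_dE, intervention_bn, rho, rho_se):
    """One-sigma band on EMP derived from the standard error of rho."""
    lo = emp_series(pct_dE, intervention_bn, rho - rho_se)
    hi = emp_series(pct_dE, intervention_bn, rho + rho_se)
    return lo, hi
```

Propagating the standard error of ρ into the band mirrors how the dataset reports one-sigma intervals around the mean EMP estimates.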
ERIC Educational Resources Information Center
Ready, Douglas David
2013-01-01
Accountability systems that measure student learning rather than student achievement have the potential to more accurately evaluate school quality. However, one methodological concern has remained surprisingly absent from discussions of value-added modeling. Standardized assessments that exhibit either positive or negative correlations between…
NASA Astrophysics Data System (ADS)
Batzias, Dimitris F.
2012-12-01
In this work, we present an analytic estimation of recycled products' added value in order to provide a means for determining the degree of recycling that maximizes profit, taking into account the social interest by including the subsidy of the corresponding investment. A methodology has been developed based on the Life Cycle Product (LCP) concept, with emphasis on the added values H, R as fractions of production and recycling cost, respectively (H, R > 1, since profit is included), which decrease at the corresponding rates h, r over the recycling course, due to deterioration of quality. At the macrolevel, the claim that "an increase of exergy price, as a result of available cheap energy sources becoming more scarce, leads to less recovered quantity of any recyclable material" is proved by means of the tradeoff between the partial benefits due to material saving and resources degradation/consumption (assessed in monetary terms).
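A toy numeric illustration, not the author's model: if the added-value fraction R decays at a constant rate r per recycling cycle, recycling stops paying off once the fraction drops below 1. The specific numbers below are assumptions.

```python
# Toy illustration (not the paper's methodology): added-value fractions
# H and R decay at rates h and r per recycling cycle due to quality loss.
def added_value_after_cycles(initial_fraction, decay_rate, n_cycles):
    """Added-value fraction (>1 means profit) after n recycling cycles."""
    return initial_fraction * (1.0 - decay_rate) ** n_cycles

# Hypothetical parameters: R = 1.4 with r = 0.15 per cycle.
values = [added_value_after_cycles(1.4, 0.15, n) for n in range(4)]
# Count how many cycles remain profitable (fraction still above 1.0).
profitable_cycles = sum(1 for v in values if v > 1.0)
```

Under these assumed numbers, quality deterioration ends profitability after the third cycle, which is the kind of cut-off the paper's profit-maximizing degree of recycling formalizes.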
ERIC Educational Resources Information Center
Sinharay, Sandip
2010-01-01
Recently, there has been an increasing level of interest in subscores for their potential diagnostic value. Haberman (2008) suggested a method based on classical test theory to determine whether subscores have added value over total scores. This paper provides a literature review and reports when subscores were found to have added value for…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cvetic, Mirjam; Papadimitriou, Ioannis
2016-12-02
Here, we construct the holographic dictionary for both running and constant dilaton solutions of the two dimensional Einstein-Maxwell-Dilaton theory that is obtained by a circle reduction from Einstein-Hilbert gravity with negative cosmological constant in three dimensions. This specific model ensures that the dual theory has a well defined ultraviolet completion in terms of a two dimensional conformal field theory, but our results apply qualitatively to a wider class of two dimensional dilaton gravity theories. For each type of solution we perform holographic renormalization, compute the exact renormalized one-point functions in the presence of arbitrary sources, and derive the asymptotic symmetries and the corresponding conserved charges. In both cases we find that the scalar operator dual to the dilaton plays a crucial role in the description of the dynamics. Its source gives rise to a matter conformal anomaly for the running dilaton solutions, while its expectation value is the only non trivial observable for constant dilaton solutions. The role of this operator has been largely overlooked in the literature. We further show that the only non trivial conserved charges for running dilaton solutions are the mass and the electric charge, while for constant dilaton solutions only the electric charge is non zero. However, by uplifting the solutions to three dimensions we show that constant dilaton solutions can support non trivial extended symmetry algebras, including the one found by Compère, Song and Strominger, in agreement with the results of Castro and Song. Finally, we demonstrate that any solution of this specific dilaton gravity model can be uplifted to a family of asymptotically AdS2 × S2 or conformally AdS2 × S2 solutions of the STU model in four dimensions, including non extremal black holes. As a result, the four dimensional solutions obtained by uplifting the running dilaton solutions coincide with the so called ‘subtracted geometries
Sentinels Guarding the Grail: Value-Added Measurement and the Quest for Education Reform
ERIC Educational Resources Information Center
Gabriel, Rachael; Lester, Jessica Nina
2013-01-01
Since the beginning of the federal Race To The Top grant competition, Value-Added Measurement (VAM) has captured the attention of the American public through high-profile media representations of the tool and the controversy that surrounds it. In this paper, we build upon investigations of constructions of VAM in the media and present a discourse…
Quality and the Rise of Value-Added in Education: The Case of Ireland
ERIC Educational Resources Information Center
Brown, Martin; McNamara, Gerry; O'Hara, Joe
2016-01-01
This paper examines the rise of value-added as a measure of quality in education. As a point of departure, the paper begins with an analysis of the rise of the concept of quality in education and discusses how, at times, various contradictory determinants of quality have managed to influence the evaluation and assessment frameworks of most…
The Effect of Summer on Value-Added Assessments of Teacher and School Performance
ERIC Educational Resources Information Center
Palardy, Gregory J.; Peng, Luyao
2015-01-01
This study examines the effects of including the summer period on value-added assessments (VAA) of teacher and school performance at the early grades. The results indicate that 40-62% of the variance in VAA estimates originates from the summer period, depending on the outcome (i.e., reading or math achievement gains). Furthermore, when summer is…
An Exploratory Study of Value-Added and Academic Optimism of Urban Reading Teachers
ERIC Educational Resources Information Center
Huff-Franklin, Clairie L.
2017-01-01
The purpose of this study is to explore the correlation between state-recorded value-added (VA) scores and academic optimism (AO) scores, which measure teacher self-efficacy, trust, and academic emphasis. The sample for this study is 87 third through eighth grade Reading teachers, from fifty-five schools, in an urban school district in Ohio who…
Non-redundant patent sequence databases with value-added annotations at two levels
Li, Weizhong; McWilliam, Hamish; de la Torre, Ana Richart; Grodowski, Adam; Benediktovich, Irina; Goujon, Mickael; Nauche, Stephane; Lopez, Rodrigo
2010-01-01
The European Bioinformatics Institute (EMBL-EBI) provides public access to patent data, including abstracts, chemical compounds and sequences. Sequences can appear multiple times due to the filing of the same invention with multiple patent offices, or the use of the same sequence by different inventors in different contexts. Information relating to the source invention may be incomplete, and biological information available in patent documents elsewhere may not be reflected in the annotation of the sequence. Search and analysis of these data have become increasingly challenging for both the scientific and intellectual-property communities. Here, we report a collection of non-redundant patent sequence databases, which cover the EMBL-Bank nucleotide patent class and the patent protein databases and contain value-added annotations from patent documents. The databases were created at two levels by the use of sequence MD5 checksums. Sequences within a level-1 cluster are 100% identical over their whole length. Level-2 clusters were defined by sub-grouping level-1 clusters based on patent family information. Value-added annotations, such as publication number corrections, earliest publication dates and feature collations, significantly enhance the quality of the data, allowing for better tracking and cross-referencing. The databases are available at: http://www.ebi.ac.uk/patentdata/nr/. PMID:19884134
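The two-level clustering described above can be sketched in a few lines. This is a hedged illustration, not EMBL-EBI's pipeline: the record layout, the case-folding of sequences, and the function name are assumptions.

```python
import hashlib
from collections import defaultdict

# Sketch of two-level redundancy clustering by MD5 checksum.
# Level 1: sequences with identical MD5 digests (100% identical over
# their whole length) share a cluster.
# Level 2: each level-1 cluster is sub-grouped by patent family.
def cluster_sequences(records):
    """records: iterable of (sequence, patent_family) pairs."""
    level1 = defaultdict(list)
    for seq, family in records:
        digest = hashlib.md5(seq.upper().encode()).hexdigest()
        level1[digest].append((seq, family))

    level2 = {}
    for digest, members in level1.items():
        by_family = defaultdict(list)
        for seq, family in members:
            by_family[family].append(seq)
        level2[digest] = dict(by_family)
    return level1, level2
```

Hashing a normalized copy of the sequence makes exact-duplicate detection O(1) per record, which is what makes this approach viable at patent-database scale.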
Hara, Michikazu; Nakajima, Kiyotaka; Kamata, Keigo
2015-06-01
In recent decades, the substitution of non-renewable fossil resources by renewable biomass as a sustainable feedstock has been extensively investigated for the manufacture of high value-added products such as biofuels, commodity chemicals, and new bio-based materials such as bioplastics. Numerous solid catalyst systems for the effective conversion of biomass feedstocks into value-added chemicals and fuels have been developed. Solid catalysts are classified into four main groups with respect to their structures and substrate activation properties: (a) micro- and mesoporous materials, (b) metal oxides, (c) supported metal catalysts, and (d) sulfonated polymers. This review article focuses on the activation of substrates and/or reagents on the basis of groups (a)-(d), and the corresponding reaction mechanisms. In addition, recent progress in chemocatalytic processes for the production of five industrially important products (5-hydroxymethylfurfural, lactic acid, glyceraldehyde, 1,3-dihydroxyacetone, and furan-2,5-dicarboxylic acid) as bio-based plastic monomers and their intermediates is comprehensively summarized.
Exploring Valued-Added Options - Edge-Glued Panels and Blanks Offer Value-Added Opportunities
Bob Smith; Philip A. Araman
1997-01-01
As sawmills search for new opportunities to add value to rough sawn lumber, many consider producing dimension parts as one solution. Assembling dimension parts into edge-glued panels or standard blanks can add even further value. Blanks are defined as pieces of solid wood (which may be edge-glued) that are manufactured to a predetermined size. This article discusses...
Methods for conversion of carbohydrates in ionic liquids to value-added chemicals
Zhao, Haibo [The Woodlands, TX; Holladay, Johnathan E [Kennewick, WA; Zhang, Zongchao C [Norwood, NJ
2011-05-10
Methods are described for converting carbohydrates including, e.g., monosaccharides, disaccharides, and polysaccharides in ionic liquids to value-added chemicals including furans, useful as chemical intermediates and/or feedstocks. Fructose is converted to 5-hydroxymethylfurfural (HMF) in the presence of metal halide and acid catalysts. Glucose is effectively converted to HMF in the presence of chromium chloride catalysts. Yields of up to about 70% are achieved with low levels of impurities such as levulinic acid.
Handling value added tax (VAT) in economic evaluations: should prices include VAT?
Bech, Mickael; Christiansen, Terkel; Gyrd-Hansen, Dorte
2006-01-01
In health economic evaluations, value added tax is commonly treated as a transfer payment. Following this argument, resources are valued equal to their net-of-tax prices in economic evaluations applying a societal perspective. In this article we argue that if there is the possibility that a new healthcare intervention may expand the healthcare budget, the social cost of input factors should be the gross-of-tax prices and not the net-of-tax prices. The rising interest in cost-benefit analysis and the use of absolute thresholds, net benefit estimates and acceptability curves in cost-effectiveness analysis makes this argument highly relevant for an appropriate use of these tools in prioritisation.
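The net-of-tax versus gross-of-tax distinction above is plain arithmetic; the following toy numbers (VAT rate, price, quantity) are assumptions for illustration only.

```python
# Toy illustration of net-of-tax vs gross-of-tax valuation of an input
# factor (rate and prices are hypothetical).
vat_rate = 0.25
net_price = 100.0                          # net-of-tax price
gross_price = net_price * (1 + vat_rate)   # gross-of-tax price

# If a new intervention expands the healthcare budget, the article argues
# the social cost is the gross-of-tax figure, not the net-of-tax one:
units_used = 40
cost_net = units_used * net_price      # transfer-payment view
cost_gross = units_used * gross_price  # budget-expansion view
```

The gap between the two totals is exactly the VAT revenue at stake, which is why the choice of perspective can move a cost-effectiveness result across an absolute threshold.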
Topic modeling for cluster analysis of large biological and medical datasets
Zhao, Weizhong; Zou, Wen; Chen, James J
2014-01-01
Background: The big data moniker is nowhere better deserved than to describe the ever-increasing prodigiousness and complexity of biological and medical datasets. New methods are needed to generate and test hypotheses, foster biological interpretation, and build validated predictors. Although multivariate techniques such as cluster analysis may allow researchers to identify groups, or clusters, of related variables, the accuracies and effectiveness of traditional clustering methods diminish for large and hyper dimensional datasets. Topic modeling is an active research field in machine learning and has been mainly used as an analytical tool to structure large textual corpora for data mining. Its ability to reduce high dimensionality to a small number of latent variables makes it suitable as a means for clustering or overcoming clustering difficulties in large biological and medical datasets. Results: In this study, three topic model-derived clustering methods, highest probable topic assignment, feature selection and feature extraction, are proposed and tested on the cluster analysis of three large datasets: Salmonella pulsed-field gel electrophoresis (PFGE) dataset, lung cancer dataset, and breast cancer dataset, which represent various types of large biological or medical datasets. All three methods are shown to improve the efficacy/effectiveness of clustering results on the three datasets in comparison to traditional methods. A preferable cluster analysis method emerged for each of the three datasets on the basis of replicating known biological truths. Conclusion: Topic modeling could be advantageously applied to the large datasets of biological or medical research. The three proposed topic model-derived clustering methods, highest probable topic assignment, feature selection and feature extraction, yield clustering improvements for the three different data types. Clusters more efficaciously represent truthful groupings and subgroupings in the data than traditional methods, suggesting
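The simplest of the three methods, highest probable topic assignment, reduces to an argmax over each sample's document-topic distribution. A hedged sketch follows; the matrix below is illustrative, not data from the study, and in practice it would come from a fitted topic model such as LDA.

```python
import numpy as np

# doc_topic[i, k] = P(topic k | sample i), e.g. from a fitted LDA model.
# "Highest probable topic assignment" clusters each sample by the topic
# with the largest posterior probability.
doc_topic = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.2, 0.3, 0.5],
    [0.6, 0.3, 0.1],
])
clusters = doc_topic.argmax(axis=1)  # one cluster label per sample
```

The other two methods keep more of the distribution: feature selection picks informative topics as features, and feature extraction feeds the full low-dimensional topic vectors to a conventional clustering algorithm.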
Internal Consistency of the NVAP Water Vapor Dataset
NASA Technical Reports Server (NTRS)
Suggs, Ronnie J.; Jedlovec, Gary J.; Arnold, James E. (Technical Monitor)
2001-01-01
The NVAP (NASA Water Vapor Project) dataset is a global dataset at 1 x 1 degree spatial resolution consisting of daily, pentad, and monthly atmospheric precipitable water (PW) products. The analysis blends measurements from the Television and Infrared Operational Satellite (TIROS) Operational Vertical Sounder (TOVS), the Special Sensor Microwave/Imager (SSM/I), and radiosonde observations into a daily collage of PW. The original dataset consisted of five years of data from 1988 to 1992. Recent updates have added three additional years (1993-1995) and incorporated procedural and algorithm changes from the original methodology. Since no single PW source (TOVS, SSM/I, or radiosonde) provides global coverage, the sources complement one another by providing spatial coverage over regions and during times where the others are not available. For this type of spatial and temporal blending to be successful, each of the source components should have similar or compatible accuracies. If this is not the case, regional and time-varying biases may be manifested in the NVAP dataset. This study examines the consistency of the NVAP source data by comparing daily collocated TOVS and SSM/I PW retrievals with collocated radiosonde PW observations. The daily PW intercomparisons are performed over the time period of the dataset and for various regions.
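A gap-filling blend of complementary sources can be sketched as follows. This is a hedged illustration of the idea, not the NVAP production algorithm: the precedence order, grids, and NaN-for-no-coverage convention are assumptions.

```python
import numpy as np

# Hedged sketch: blend PW grids from three sources into one daily field.
# NaN marks grid cells a source does not cover.
tovs  = np.array([[20.0, np.nan], [np.nan, 31.0]])
ssmi  = np.array([[np.nan, 25.0], [np.nan, np.nan]])
sonde = np.array([[19.0, np.nan], [28.0, np.nan]])

blended = tovs.copy()
for source in (ssmi, sonde):       # fill remaining gaps in precedence order
    gap = np.isnan(blended)
    blended[gap] = source[gap]
```

Because each cell's value can come from a different instrument, any systematic bias between the sources shows up as regional and time-varying structure in the blended field, which is exactly the consistency concern the study examines.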
Analysis of illicit drugs in wastewater - Is there an added value for law enforcement?
Been, F; Esseiva, P; Delémont, O
2016-09-01
Assessing illicit drug use through the analysis of wastewater is progressively being integrated into existing methods used to monitor the epidemiology of drug use. However, the approach's potential to deliver pertinent information for law enforcement has received only limited discussion. Thus, this work focuses on evaluating the added value of the approach from the perspective of law enforcement. Results from wastewater analysis carried out in two cities in Switzerland were scrutinised, taking into account intelligence derived from the work of drug enforcement in the area. Focus was set on three substances, namely cocaine, heroin and methamphetamine. Findings show that results from wastewater analysis can be used by law enforcement to assess the market share held by criminal groups. Combined with intelligence resulting from police work (e.g., investigations and informants), wastewater analysis can contribute to deciphering the structure of drug markets, as well as the local organisation of trafficking networks. The results presented here constitute valuable pieces of information, which can be used by law enforcement to guide decisions at strategic and/or operational levels. Furthermore, intelligence gathered through investigations and surveillance constitutes an alternative viewpoint to evaluate results of wastewater analysis.
Gauge boson exchange in AdS d+1
NASA Astrophysics Data System (ADS)
D'Hoker, Eric; Freedman, Daniel Z.
1999-04-01
We study the amplitude for exchange of massless gauge bosons between pairs of massive scalar fields in anti-de Sitter space. In the AdS/CFT correspondence this amplitude describes the contribution of conserved flavor symmetry currents to 4-point functions of scalar operators in the boundary conformal theory. A concise, covariant, Y2K compatible derivation of the gauge boson propagator in AdS d + 1 is given. Techniques are developed to calculate the two bulk integrals over AdS space leading to explicit expressions or convenient, simple integral representations for the amplitude. The amplitude contains leading power and sub-leading logarithmic singularities in the gauge boson channel and leading logarithms in the crossed channel. The new methods of this paper are expected to have other applications in the study of the Maldacena conjecture.
A Dataset from TIMSS to Examine the Relationship between Computer Use and Mathematics Achievement
ERIC Educational Resources Information Center
Kadijevich, Djordje M.
2015-01-01
Because the relationship between computer use and achievement is still puzzling, there is a need to prepare and analyze good quality datasets on computer use and achievement. Such a dataset can be derived from TIMSS data. This paper describes how this dataset can be prepared. It also gives an example of how the dataset may be analyzed. The…
The Value-Added Achievement Gains of NBPTS-Certified Teachers in Tennessee: A Brief Report.
ERIC Educational Resources Information Center
Stone, J. E.
This study investigated whether National Board for Professional Teaching Standards (NBPTS)-certified teachers in Tennessee were exceptionally effective in bringing about objectively measured student achievement gains. Tennessee has over 40 NBPTS-certified teachers, 16 of whom teach in grades 3-8 and have value-added teacher reports in the state…
What Are Error Rates for Classifying Teacher and School Performance Using Value-Added Models?
ERIC Educational Resources Information Center
Schochet, Peter Z.; Chiang, Hanley S.
2013-01-01
This article addresses likely error rates for measuring teacher and school performance in the upper elementary grades using value-added models applied to student test score gain data. Using a realistic performance measurement system scheme based on hypothesis testing, the authors develop error rate formulas based on ordinary least squares and…
ERIC Educational Resources Information Center
Epstein, Diana; Miller, Raegen T.
2011-01-01
In August 2010 the "Los Angeles Times" published a special report on their website featuring performance ratings for nearly 6,000 Los Angeles Unified School District teachers. The move was controversial because the ratings were based on so-called value-added estimates of teachers' contributions to student learning. As with most…
McKenzie, Grant; Janowicz, Krzysztof
2017-01-01
Gaining access to inexpensive, high-resolution, up-to-date, three-dimensional road network data is a top priority beyond research, as such data would fuel applications in industry, governments, and the broader public alike. Road network data are openly available via user-generated content such as OpenStreetMap (OSM) but lack the resolution required for many tasks, e.g., emergency management. More importantly, however, few publicly available data offer information on elevation and slope. For most parts of the world, up-to-date digital elevation products with a resolution of less than 10 meters are a distant dream and, if available, those datasets have to be matched to the road network through an error-prone process. In this paper we present a radically different approach by deriving road network elevation data from massive amounts of in-situ observations extracted from user-contributed data from an online social fitness tracking application. While each individual observation may be of low quality in terms of resolution and accuracy, taken together they form an accurate, high-resolution, up-to-date, three-dimensional road network that excels where other technologies such as LiDAR fail, e.g., in the case of overpasses, overhangs, and so forth. In fact, the 1 m spatial resolution dataset created in this research based on 350 million individual 3D location fixes has an RMSE of approximately 3.11 m compared to a LiDAR-based ground truth and can be used to enhance existing road network datasets where individual elevation fixes differ by up to 60 m. In contrast, using interpolated data from the National Elevation Dataset (NED) results in a 4.75 m RMSE compared to the baseline. We utilize Linked Data technologies to integrate the proposed high-resolution dataset with OpenStreetMap road geometries without requiring any changes to the OSM data model.
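The aggregation idea behind this abstract can be sketched minimally as follows, assuming the noisy 3D fixes have already been map-matched to road segments. The segment ids, the choice of the median as a robust per-segment estimator, and the toy data are illustrative assumptions, not the paper's actual pipeline:

```python
import math
from statistics import median

def segment_elevations(fixes):
    """Aggregate noisy crowdsourced elevations into one estimate per
    road segment. fixes: dict mapping segment id -> list of observed
    elevations (m). The median is robust to GPS outliers that may be
    off by tens of meters."""
    return {seg: median(obs) for seg, obs in fixes.items()}

def rmse(estimates, truth):
    """Root-mean-square error of the estimates against a ground-truth
    dict (e.g., LiDAR-derived elevations keyed by segment id)."""
    errs = [(estimates[s] - truth[s]) ** 2 for s in truth]
    return math.sqrt(sum(errs) / len(errs))
```

With many fixes per segment, the median suppresses even large individual errors, which is why "low-quality" observations can yield a high-quality aggregate.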
Estimating parameters for probabilistic linkage of privacy-preserved datasets.
Brown, Adrian P; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Boyd, James H
2017-07-10
Probabilistic record linkage is a process used to bring together person-based records from within the same dataset (de-duplication) or from disparate datasets using pairwise comparisons and matching probabilities. The linkage strategy and associated match probabilities are often estimated through investigations into data quality and manual inspection. However, as privacy-preserved datasets comprise encrypted data, such methods are not possible. In this paper, we present a method for estimating the probabilities and threshold values for probabilistic privacy-preserved record linkage using Bloom filters. Our method was tested through a simulation study using synthetic data, followed by an application using real-world administrative data. Synthetic datasets were generated with error rates from zero to 20% error. Our method was used to estimate parameters (probabilities and thresholds) for de-duplication linkages. Linkage quality was determined by F-measure. Each dataset was privacy-preserved using separate Bloom filters for each field. Match probabilities were estimated using the expectation-maximisation (EM) algorithm on the privacy-preserved data. Threshold cut-off values were determined by an extension to the EM algorithm allowing linkage quality to be estimated for each possible threshold. De-duplication linkages of each privacy-preserved dataset were performed using both estimated and calculated probabilities. Linkage quality using the F-measure at the estimated threshold values was also compared to the highest F-measure. Three large administrative datasets were used to demonstrate the applicability of the probability and threshold estimation technique on real-world data. Linkage of the synthetic datasets using the estimated probabilities produced an F-measure that was comparable to the F-measure using calculated probabilities, even with up to 20% error. Linkage of the administrative datasets using estimated probabilities produced an F-measure that was higher
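The expectation-maximisation step described above can be sketched in a Fellegi-Sunter style: EM estimates, for each comparison field, the probability of agreement among true matches (m) and among non-matches (u), plus the match prior. This is an illustrative sketch over plain binary agreement patterns, not the authors' Bloom-filter implementation; all names are mine:

```python
import math

def em_match_probs(patterns, counts, n_iter=50, p_init=0.1):
    """EM for Fellegi-Sunter match parameters over binary
    field-agreement patterns (1 = fields agree, 0 = disagree).
    patterns: list of tuples of 0/1; counts: frequency of each pattern."""
    k = len(patterns[0])
    m = [0.9] * k          # P(field agrees | true match)
    u = [0.1] * k          # P(field agrees | non-match)
    p = p_init             # prior P(record pair is a match)
    for _ in range(n_iter):
        # E-step: posterior probability that each pattern is a true match
        g = []
        for pat in patterns:
            pm = p * math.prod(m[i] if a else 1 - m[i] for i, a in enumerate(pat))
            pu = (1 - p) * math.prod(u[i] if a else 1 - u[i] for i, a in enumerate(pat))
            g.append(pm / (pm + pu))
        # M-step: re-estimate parameters from posterior-weighted counts
        tot = sum(counts)
        wm = sum(gi * c for gi, c in zip(g, counts))
        p = wm / tot
        for i in range(k):
            m[i] = sum(gi * c for gi, c, pat in zip(g, counts, patterns) if pat[i]) / wm
            u[i] = sum((1 - gi) * c for gi, c, pat in zip(g, counts, patterns) if pat[i]) / (tot - wm)
    return m, u, p
```

The log-ratio log(m/u) per field then yields the match weights whose sum is compared against the threshold the paper's extension estimates.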
ERIC Educational Resources Information Center
Pride, Bryce L.
2012-01-01
The Adequate Yearly Progress (AYP) Model has been used to make many high-stakes decisions concerning schools, though it does not provide a complete assessment of student academic achievement and school effectiveness. To provide a clearer perspective, many states have implemented various Growth and Value Added Models, in addition to AYP. The…
ERIC Educational Resources Information Center
Lipscomb, Stephen; Gill, Brian; Booker, Kevin; Johnson, Matthew
2010-01-01
At the request of Pittsburgh Public Schools (PPS) and the Pittsburgh Federation of Teachers (PFT), Mathematica is developing value-added models (VAMs) that aim to estimate the contributions of individual teachers, teams of teachers, and schools to the achievement growth of their students. The analyses described in this report are intended as an…
NASA Astrophysics Data System (ADS)
Rudenko, I.; Bekchanov, M.; Djanibekov, U.; Lamers, J. P. A.
2013-11-01
Since independence from the former Soviet Union in 1991, Uzbekistan has been challenged to consolidate its efforts and to identify and introduce suitable agricultural policies that ease the threat of advancing land, water and ecosystem deterioration. On the one hand, irrigated cotton production provides income, food and energy sources for a large part of the rural households, which account for about 70% of the total population. On the other hand, this sector is considered a major driver of the on-going environmental degradation. Due to this dual nature, an integrated approach is needed that allows analysis of the cotton sector at different stages and, consequently, the derivation of comprehensive options for action. The findings of the economically oriented value chain analysis and the ecologically oriented water footprint analysis at the regional level were complemented with the findings of an input-output model at the national level. This combination gave added value for better-informed decision-making to reach land, water and ecosystem sustainability, compared to the individual results of each approach. The synergy of approaches pointed to various options for action, such as to (i) promote the shift of water use from the high water consuming agricultural sector to a less water consuming cotton processing sector, (ii) increase overall water use efficiency by expanding the highly water productive industrial sectors and concurrently decreasing sectors with inefficient water use, and (iii) reduce agricultural water use by improving irrigation and conveyance efficiencies. The findings showed that increasing water use efficiency, manufacturing products with higher value added and raising water users' awareness of the real value of water are essential for providing water security in Uzbekistan.
EXACT S-MATRICES FOR AdS3/CFT2
NASA Astrophysics Data System (ADS)
Ahn, Changrim; Bombardelli, Diego
2013-12-01
We propose exact S-matrices for the AdS3/CFT2 duality between type IIB strings on AdS3×S3×M4 with M4 = S3×S1 or T4 and the corresponding two-dimensional conformal field theories. We fix the two-particle S-matrices on the basis of the symmetries su(1|1) and su(1|1)×su(1|1). A crucial justification comes from the derivation of the all-loop Bethe ansatz matching exactly the recent conjecture proposed by Babichenko et al. [J. High Energy Phys. 1003, 058 (2010), arXiv:0912.1723 [hep-th]].
Adding intelligence to mobile asset management in hospitals: the true value of RFID.
Castro, Linda; Lefebvre, Elisabeth; Lefebvre, Louis A
2013-10-01
RFID (Radio Frequency Identification) technology is expected to play a vital role in the healthcare arena, especially in times when cost containment is at the top of the priorities of healthcare management authorities. Medical equipment represents a significant share of yearly healthcare operational costs; hence, ensuring effective and efficient management of such key assets is critical to promptly and reliably deliver a diversity of clinical services at the patient bedside. Empirical evidence from a phased RFID implementation in one European hospital demonstrates that RFID has the potential to transform asset management by improving inventory management, enhancing asset utilization, increasing staff productivity, improving care services, enhancing maintenance compliance, and increasing information visibility. Most importantly, RFID allows the emergence of intelligent asset management processes, which is, undoubtedly, the most important benefit that could be derived from the RFID system. Results show that the added intelligence can be rather basic (auto-status change) or a bit more advanced (personalized automatic triggers). More importantly, adding intelligence improves planning and decision-making processes.
Xiao, Jian; Hu, Jia-Yao; Sun, Hao-Dong; Zhao, Xiang; Zhong, Wan-Tong; Duan, Dong-Zhu; Wang, Le; Wang, Xiao-Ling
2017-11-28
Four new diphenyl ether derivatives, sinopestalotiollides A-D (1-4), one new natural α-pyrone product (11), as well as twelve known compounds (5-17), were obtained from the ethyl acetate extract of the endophytic fungus Pestalotiopsis palmarum isolated from the leaves of the medicinal plant Sinomenium acutum (Thunb.) Rehd et Wils. The structures were elucidated from HR-ESI-MS and NMR spectroscopic data. Bioassay experiments revealed that compounds 1-4 and 11 exhibited strong to weak cytotoxicities against three human tumor cell lines, HeLa, HCT116 and A549. Copyright © 2017 Elsevier Ltd. All rights reserved.
Deryabin, Dmitry G; Efremova, Ludmila V; Vasilchenko, Alexey S; Saidakova, Evgeniya V; Sizova, Elena A; Troshin, Pavel A; Zhilenkov, Alexander V; Khakina, Ekaterina A; Khakina, Ekaterina E
2015-08-08
The cause-effect relationships between physicochemical properties of amphiphilic [60]fullerene derivatives and their toxicity against bacterial cells have not yet been clarified. In this study, we report how the differences in the chemical structure of organic addends in 10 originally synthesized penta-substituted [60]fullerene derivatives modulate their zeta potential and aggregate size in salt-free and salt-added aqueous suspensions, as well as how these physicochemical characteristics affect the bioenergetics of freshwater Escherichia coli and marine Photobacterium phosphoreum bacteria. Dynamic light scattering, laser Doppler micro-electrophoresis, agarose gel electrophoresis, atomic force microscopy, and bioluminescence inhibition assay were used to characterize the fullerene aggregation behavior in aqueous solution and their interaction with the bacterial cell surface, following zeta potential changes and toxic effects. Dynamic light scattering results indicated the formation of self-assembled [60]fullerene aggregates in aqueous suspensions. The measurement of the zeta potential of the particles revealed that they have different surface charges. The relationship between these physicochemical characteristics was presented as an exponential regression that correctly described the dependence of the aggregate size of penta-substituted [60]fullerene derivatives in salt-free aqueous suspension on the zeta potential value. The prevalence of DLVO-related effects was shown in salt-added aqueous suspension, which decreased zeta potential values and affected the aggregation of [60]fullerene derivatives differently for individual compounds. A bioluminescence inhibition assay demonstrated that the toxic effect of [60]fullerene derivatives against E. coli cells was strictly determined by their positive zeta potential value, the effect being weakened against P. phosphoreum cells in an aquatic system of high salinity. Atomic force microscopy data suggested that the
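An exponential regression of aggregate size on zeta potential, as described above, can be fit by ordinary least squares on the logarithm of size. This is an illustrative sketch under that standard log-linear assumption, not the paper's actual fit or data:

```python
import math

def fit_exponential(zeta, size):
    """Fit size = a * exp(b * zeta) by least squares on log(size).
    zeta: zeta potential values (mV); size: aggregate sizes (nm)."""
    n = len(zeta)
    y = [math.log(s) for s in size]          # linearize: ln(size) = ln(a) + b*zeta
    mx, my = sum(zeta) / n, sum(y) / n
    b = sum((x - v_my) * 0 for x, v_my in []) if False else \
        sum((x - mx) * (v - my) for x, v in zip(zeta, y)) / \
        sum((x - mx) ** 2 for x in zeta)
    a = math.exp(my - b * mx)
    return a, b
```

On noiseless synthetic data generated from known a and b, the fit recovers the parameters exactly (up to floating-point rounding).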
Value-Added and Observational Measures Used in the Teacher Evaluation Process: A Validation Study
ERIC Educational Resources Information Center
Guerere, Claudia
2013-01-01
Scores from value-added models (VAMs), as used for educational accountability, represent the educational effect teachers have on their students. The use of these scores in teacher evaluations for high-stakes decision making is new for the State of Florida. Validity evidence that supports or questions the use of these scores is critically needed.…
Evaluation of the Precision of Satellite-Derived Sea Surface Temperature Fields
NASA Astrophysics Data System (ADS)
Wu, F.; Cornillon, P. C.; Guan, L.
2016-02-01
A great deal of attention has been focused on the temporal accuracy of satellite-derived sea surface temperature (SST) fields with little attention being given to their spatial precision. Specifically, the primary measure of the quality of SST fields has been the bias and variance of selected values minus co-located (in space and time) in situ values. Contributing values, determined by the location of the in situ values and the necessity that the satellite-derived values be cloud free, are generally widely separated in space and time and hence provide little information related to the pixel-to-pixel uncertainty in the retrievals. But the main contribution to the uncertainty in satellite-derived SST retrievals relates to atmospheric contamination, and because the spatial scales of atmospheric features are, in general, large compared with the pixel separation of modern infra-red sensors, the pixel-to-pixel uncertainty is often smaller than the accuracy determined from in situ match-ups. This makes selection of satellite-derived datasets for the study of submesoscale processes, for which the spatial structure of the upper ocean is significant, problematic. In this presentation we describe a methodology to characterize the spatial precision of satellite-derived SST fields. The method is based on an examination of the high wavenumber tail of the 2-D spectrum of SST fields in the Sargasso Sea, a low energy region of the ocean close to the track of the MV Oleander, a container ship making weekly roundtrips between New York and Bermuda, with engine intake temperatures sampled every 75 m along track. Important spectral characteristics are the point at which the satellite-derived spectra separate from the Oleander spectra and the spectral slope following separation. Finally, a number of high-resolution (375 m to 10 km) SST datasets are evaluated based on this approach.
A Multiyear Dataset of SSM/I-Derived Global Ocean Surface Turbulent Fluxes
NASA Technical Reports Server (NTRS)
Chou, Shu-Hsien; Shie, Chung-Lin; Atlas, Robert M.; Ardizzone, Joe; Nelkin, Eric; Einaudi, Franco (Technical Monitor)
2001-01-01
The surface turbulent fluxes of momentum, latent heat, and sensible heat over global oceans are essential to weather, climate and ocean problems. Evaporation is a key component of the hydrological cycle and the surface heat budget, while the wind stress is the major forcing for driving the oceanic circulation. The global air-sea fluxes of momentum, latent and sensible heat, radiation, and freshwater (precipitation-evaporation) are the forcing for driving oceanic circulation and, hence, are essential for understanding the general circulation of global oceans. The global air-sea fluxes are required for driving ocean models and validating coupled ocean-atmosphere global models. We have produced a 7.5-year (July 1987-December 1994) dataset of daily surface turbulent fluxes over the global oceans from the Special Sensor Microwave/Imager (SSM/I) data. Daily turbulent fluxes were derived from daily data of SSM/I surface winds and specific humidity, National Centers for Environmental Prediction (NCEP) sea surface temperatures, and European Centre for Medium-Range Weather Forecasts (ECMWF) air-sea temperature differences, using a stability-dependent bulk scheme. The retrieved instantaneous surface air humidity (with a 25-km resolution) validated well against the collocated radiosonde observations over the global oceans. Furthermore, the retrieved daily wind stresses and latent heat fluxes were found to agree well with those of the in situ measurements (IMET buoy, RV Moana Wave, and RV Wecoma) in the western Pacific warm pool during the TOGA COARE intensive observing period (November 1992-February 1993). The global distributions of 1988-94 seasonal-mean turbulent fluxes will be presented. In addition, the global distributions of 1990-93 annual-mean turbulent fluxes and input variables will be compared with those of UWM/COADS covering the same period. The latter is based on the COADS (comprehensive ocean-atmosphere data set) and is recognized to be one of the best
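The stability-dependent bulk scheme itself is not specified in the abstract; as a sketch of the generic bulk-aerodynamic form it builds on, here is a latent heat flux with nominal, constant coefficients (the constants and function name are illustrative assumptions; the actual product uses a stability-dependent transfer coefficient):

```python
def latent_heat_flux(u10, q_s, q_a, rho=1.2, lv=2.5e6, ce=1.2e-3):
    """Bulk-aerodynamic latent heat flux in W/m^2.
    u10: 10-m wind speed (m/s); q_s, q_a: sea-surface and near-surface
    air specific humidity (kg/kg). rho = air density (kg/m^3),
    lv = latent heat of vaporization (J/kg), ce = moisture transfer
    coefficient; all three are nominal constants here."""
    return rho * lv * ce * u10 * (q_s - q_a)
```

For typical warm-pool values (8 m/s wind, 5 g/kg humidity difference) this gives a flux on the order of 150 W/m^2, consistent with the magnitudes such datasets report.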
NASA Astrophysics Data System (ADS)
Agustinus, E. T. S.
2018-02-01
Indonesia's position on the ring of fire makes it rich in mineral resources. In the past, however, the exploitation of Indonesian mineral resources was uncontrolled, resulting in environmental degradation and marginal reserves; such excessive exploitation is very detrimental to the state. In light of this, the management and utilization of Indonesia's mineral resources must follow good mining practice. The problem is how to utilize marginal mineral reserves effectively and efficiently: they require new technologies and processing methods because the old processing methods are inadequate. This paper presents results of the Multi Blending Technology (MBT) method. The underlying concept is not extraction or refinement but processing through the formulation of raw materials, adding an additive to produce a new material called a functional material. Summarizing the application of this method in a scientific book is important so that the information, otherwise spread across multiple print media, becomes focused and accessible. The book is expected to serve as a reference for stakeholders, providing added value to environmentally marginal reserves in Indonesia. The conclusions are that the Multi Blending Technology (MBT) method can be used as a strategy to add value effectively and efficiently to marginal mineral reserves, and that it has been applied to forsterite, Atapulgite Synthesis, Zeoceramic, GEM, MPMO, SMAC and Geomaterial.
Arafiles, Kim Hazel V; Iwasaka, Hiroaki; Eramoto, Yuri; Okamura, Yoshiko; Tajima, Takahisa; Matsumura, Yukihiko; Nakashimada, Yutaka; Aki, Tsunehiro
2014-11-01
Thraustochytrid production of polyunsaturated fatty acids and xanthophylls have been generally sourced from crop-derived substrates, making the exploration of alternative feedstocks attractive since they promise increased sustainability and lower production costs. In this study, a distinct two-stage fermentation system was conceptualized for the first time, using the brown seaweed sugar mannitol as substrate for the intermediary biocatalyst Gluconobacter oxydans, an acetic acid bacterium, along with the marine thraustochytrid Aurantiochytrium sp. to produce the value-added lipids and xanthophylls. Jar fermenter culture resulted in seaweed mannitol conversion to fructose with an efficiency of 83 % by G. oxydans and, after bacteriostasis with sea salts, production of astaxanthin and docosahexaenoic acid by Aurantiochytrium sp. KH105. Astaxanthin productivity was high at 3.60 mg/L/day. This new system, therefore, widens possibilities of obtaining more varieties of industrially valuable products including foods, cosmetics, pharmaceuticals, and biofuel precursor lipids from seaweed fermentation upon the use of suitable thraustochytrid strains.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Velazquez, E Rios; Narayan, V; Grossmann, P
2015-06-15
Purpose: To compare the complementary prognostic value of automated Radiomic features to that of radiologist-annotated VASARI features in the TCGA-GBM MRI dataset. Methods: For 96 GBM patients, pre-operative MRI images were obtained from The Cancer Imaging Archive. The abnormal tumor bulks were manually defined on post-contrast T1w images. The contrast-enhancing and necrotic regions were segmented using FAST. From these sub-volumes and the total abnormal tumor bulk, a set of Radiomic features quantifying phenotypic differences based on tumor intensity, shape and texture was extracted from the post-contrast T1w images. Minimum-redundancy-maximum-relevance (MRMR) was used to identify the most informative Radiomic, VASARI and combined Radiomic-VASARI features in 70% of the dataset (training set). Multivariate Cox proportional hazards models were evaluated in 30% of the dataset (validation set) using the C-index for OS. A bootstrap procedure was used to assess significance while comparing the C-indices of the different models. Results: Overall, the Radiomic features showed a moderate correlation with the radiologist-annotated VASARI features (r = −0.37 – 0.49); however, that correlation was stronger for the Tumor Diameter and Proportion of Necrosis VASARI features (r = −0.71 – 0.69). After MRMR feature selection, the best-performing Radiomic, VASARI, and Radiomic-VASARI Cox-PH models showed a validation C-index of 0.56 (p = NS), 0.58 (p = NS) and 0.65 (p = 0.01), respectively. The combined Radiomic-VASARI model C-index was significantly higher than that obtained from either the Radiomic or VASARI model alone (p < 0.001). Conclusion: Quantitative volumetric and textural Radiomic features complement the qualitative and semi-quantitative annotated VASARI feature set. The prognostic value of informative qualitative VASARI features such as Eloquent Brain and Multifocality is increased with the addition of quantitative volumetric and textural features from
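The validation C-index reported above is Harrell's concordance index: the fraction of comparable patient pairs that a model's risk scores order correctly. A plain-Python sketch (not the authors' code; censoring handling is the textbook convention):

```python
from itertools import combinations

def c_index(times, events, risk_scores):
    """Harrell's concordance index for survival predictions.
    times: observed survival times; events: 1 = event observed,
    0 = censored; risk_scores: higher score = predicted shorter survival."""
    concordant = ties = comparable = 0
    for i, j in combinations(range(len(times)), 2):
        if times[i] == times[j]:
            continue
        # a pair is comparable only if the earlier time ends in an event
        short, lng = (i, j) if times[i] < times[j] else (j, i)
        if not events[short]:
            continue
        comparable += 1
        if risk_scores[short] > risk_scores[lng]:
            concordant += 1
        elif risk_scores[short] == risk_scores[lng]:
            ties += 1
    return (concordant + 0.5 * ties) / comparable
```

A value of 0.5 corresponds to random ordering and 1.0 to perfect ordering, which is why the reported 0.56 and 0.58 are not significant while 0.65 is.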
Custom auroral electrojet indices calculated by using MANGO value-added services
NASA Astrophysics Data System (ADS)
Bargatze, L. F.; Moore, W. B.; King, T. A.
2009-12-01
A set of computational routines called MANGO, Magnetogram Analysis for the Network of Geophysical Observatories, is utilized to calculate customized versions of the auroral electrojet indices AE, AL, and AU. MANGO is part of an effort to enhance data services available to users of the Heliophysics VxOs, specifically the Virtual Magnetospheric Observatory (VMO). The MANGO value-added service package is composed of a set of IDL routines that decompose ground magnetic field observations to isolate secular, diurnal, and disturbance variations of the field, station by station. Each MANGO subroutine has been written in modular fashion to allow "plug and play"-style flexibility, and each has been designed to account for failure modes and noisy data so that the programs run to completion, producing as much derived data as possible. The capabilities of the MANGO service package will be demonstrated through their application to the study of auroral electrojet current flow during magnetic substorms. Traditionally, the AE indices are calculated using data from about twelve ground stations located at northern auroral zone latitudes, spread longitudinally around the world. Magnetogram data are corrected for secular variation prior to calculating the standard version of the indices, but the data are not corrected for diurnal variations. A custom version of the AE indices will be created using the MANGO routines, including a step to subtract diurnal curves from the magnetic field data at each station. The custom AE indices provide more accurate measures of auroral electrojet activity due to isolation of the substorm electrojet magnetic field signature. The improvements in the accuracy of the custom indices over the traditional indices are largest during northern hemisphere summer, when the range of diurnal variation reaches its maximum.
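The standard AE/AL/AU construction the abstract refers to can be sketched as follows: after baseline removal, AU and AL are the upper and lower envelopes of the H-component disturbances across stations, and AE is their difference. The station keys and input layout are assumptions for illustration; this is not the MANGO IDL code:

```python
def auroral_indices(station_h):
    """AE, AL, AU from baseline-corrected H-component traces.
    station_h: dict station -> list of H disturbances (nT) on a common
    time grid, with secular (and, in the custom version, diurnal)
    variation already removed."""
    n = len(next(iter(station_h.values())))
    au, al = [], []
    for t in range(n):
        vals = [trace[t] for trace in station_h.values()]
        au.append(max(vals))   # upper envelope: eastward electrojet
        al.append(min(vals))   # lower envelope: westward electrojet
    ae = [u - l for u, l in zip(au, al)]
    return ae, al, au
```

Subtracting a diurnal curve before taking these envelopes is exactly the step that distinguishes the custom indices from the traditional ones.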
NASA Technical Reports Server (NTRS)
Jansen, Michael
2005-01-01
Earned value management [EVM] ... either you swear by it, or swear at it. Either way, there's no getting around the fact that EVM can be one of the most efficient and insightful methods of synthesizing cost, schedule, and technical status information into a single set of program health metrics. Is there a way of implementing EVM that allows a program to reap its early warning benefits while avoiding the pitfalls that make it infamous to its detractors? That's the question recently faced by the International Space Station [ISS] program.
Tradeoffs in the Use of Value-Added Estimates of Teacher Effectiveness by School Districts
ERIC Educational Resources Information Center
Baxter, Andrew David
2011-01-01
A new capacity to track the inputs and outcomes of individual students' education production function has spurred a growing number of school districts to attempt to measure the productivity of their teachers in terms of student outcomes. The use of these value-added measures of teacher effectiveness is at the center of current education reform.…
Measurement Error and Bias in Value-Added Models. Research Report. ETS RR-17-25
ERIC Educational Resources Information Center
Kane, Michael T.
2017-01-01
By aggregating residual gain scores (the differences between each student's current score and a predicted score based on prior performance) for a school or a teacher, value-added models (VAMs) can be used to generate estimates of school or teacher effects. It is known that random errors in the prior scores will introduce bias into predictions of…
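A toy sketch of the aggregation just described, assuming a simple linear prediction of the current score from the prior score (the regression coefficients are supplied externally and the record layout is mine; this is not ETS's model):

```python
from collections import defaultdict

def teacher_effects(records, slope, intercept):
    """Average residual gain per teacher. Each record is
    (teacher, prior_score, current_score); the predicted current score
    is intercept + slope * prior. Measurement error in the prior
    scores biases these predictions, which is the bias the report
    analyzes."""
    sums = defaultdict(lambda: [0.0, 0])
    for teacher, prior, current in records:
        resid = current - (intercept + slope * prior)
        sums[teacher][0] += resid
        sums[teacher][1] += 1
    return {t: s / n for t, (s, n) in sums.items()}
```

If priors are measured with error, the estimated slope is attenuated, so teachers of high-prior students accumulate systematically biased residuals even in this simple setup.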
E-Books in the Early Literacy Environment: Is There Added Value for Vocabulary Development?
ERIC Educational Resources Information Center
Roskos, Kathleen A.; Sullivan, Shannon; Simpson, Danielle; Zuzolo, Nicole
2016-01-01
Using a theory of affordances, this study examines the introduction of e-books into the early literacy environment as resources that can increase children's opportunity for learning vocabulary. Added value was observed under conditions of (1) book browsing, (2) instruction, and (3) a print-only condition. A total of 33 4-year-olds (18 boys, 15…
Accounting for Co-Teaching: A Guide for Policymakers and Developers of Value-Added Models
ERIC Educational Resources Information Center
Isenberg, Eric; Walsh, Elias
2015-01-01
We outline the options available to policymakers for addressing co-teaching in a value-added model. Building on earlier work, we propose an improvement to a method of accounting for co-teaching that treats co-teachers as teams, with each teacher receiving equal credit for co-taught students. Hock and Isenberg (2012) described a method known as the…
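One simple convention for crediting co-taught students can be sketched as below. How credit is assigned across co-teachers is precisely the design choice the report discusses, and the even split implemented here is only one of the options (the data layout is an assumption for illustration):

```python
def dosage_weights(rosters):
    """Student-teacher weights for a value-added model.
    rosters: dict student -> list of that student's teachers.
    Each student's total weight of 1.0 is split evenly among
    co-teachers; a solo-taught student contributes full weight."""
    weights = {}
    for student, teachers in rosters.items():
        share = 1.0 / len(teachers)
        for t in teachers:
            weights[(t, student)] = share
    return weights
```

An alternative, also discussed in this literature, is to give each co-teacher full credit for shared students; that changes only the `share` line.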
Simplified and age-appropriate recommendations for added sugars in children.
Goran, M I; Riemer, S L; Alderete, T L
2018-04-01
Excess sugar intake increases risk for obesity and related comorbidities among children. The World Health Organization (WHO), American Heart Association (AHA) and the 2015 USDA dietary recommendations have proposed guidelines for added sugar intake to reduce risk for disease. WHO and USDA recommendations are presented as a percentage of daily calories from added sugar. This approach is not easily understood or translated to children, where energy needs increase with age. The AHA recommendation is based on a fixed value of 25 g of added sugar for all children 2-19 years of age. This approach does not take into account the different levels of intake across this wide age range. Due to these limitations, we adapted current recommendations for added sugars based on daily energy needs of children 2-19 years. We used those values to derive simple regression equations to predict grams or teaspoons of added sugars per day based on age that would be equivalent to 10% of daily energy needs. This proposed approach aligns with the changing nutritional needs of children and adolescents during growth. © 2017 World Obesity Federation.
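The core conversion (10% of daily energy, expressed as grams and teaspoons of added sugar) can be sketched as follows. The 4 kcal per gram of sugar and roughly 4.2 g per teaspoon are standard approximations, and this function takes daily energy needs directly rather than reproducing the paper's age-based regression equations:

```python
def added_sugar_limit(daily_kcal):
    """Grams and teaspoons of added sugar equivalent to 10% of daily
    energy needs. Assumes 4 kcal per gram of sugar and ~4.2 g per
    teaspoon (nominal conversion factors)."""
    grams = 0.10 * daily_kcal / 4.0
    teaspoons = grams / 4.2
    return grams, teaspoons
```

Because children's energy needs rise with age, a fixed-percentage rule like this yields an age-increasing gram limit, unlike the AHA's flat 25 g value.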
Transformations of asymptotically AdS hyperbolic initial data and associated geometric inequalities
NASA Astrophysics Data System (ADS)
Cha, Ye Sle; Khuri, Marcus
2018-01-01
We construct transformations which take asymptotically AdS hyperbolic initial data into asymptotically flat initial data, and which preserve relevant physical quantities. This is used to derive geometric inequalities in the asymptotically AdS hyperbolic setting from counterparts in the asymptotically flat realm, whenever a geometrically motivated system of elliptic equations admits a solution. The inequalities treated here relate mass, angular momentum, charge, and horizon area. Furthermore, new mass-angular momentum inequalities in this setting are conjectured and discussed.
Ka-Band ARM Zenith Radar Corrections Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Karen; Toto, Tami; Giangrande, Scott
The KAZRCOR Value-Added Product (VAP) performs several corrections to the ingested KAZR moments and also creates a significant detection mask for each radar mode. The VAP computes gaseous attenuation as a function of time and radial distance from the radar antenna, based on ambient meteorological observations, and corrects observed reflectivities for that effect. KAZRCOR also dealiases mean Doppler velocities to correct velocities whose magnitudes exceed the radar’s Nyquist velocity. Input KAZR data fields are passed through into the KAZRCOR output files, in their native time and range coordinates. Complementary corrected reflectivity and velocity fields are provided, along with a mask of significant detections and a number of data quality flags. This report covers the KAZRCOR VAP as applied to the original KAZR radars and the upgraded KAZR2 radars. Currently there are two separate code bases for the different radar versions, but once KAZR and KAZR2 data formats are harmonized, only a single code base will be required.
Continuous-spin mixed-symmetry fields in AdS(5)
NASA Astrophysics Data System (ADS)
Metsaev, R. R.
2018-05-01
Free mixed-symmetry continuous-spin fields propagating in AdS(5) space and flat R(4,1) space are studied. In the framework of a light-cone gauge formulation of relativistic dynamics, we build simple actions for such fields. The realization of relativistic symmetries on the space of light-cone gauge mixed-symmetry continuous-spin fields is also found. Interrelations between constant parameters entering the light-cone gauge actions and eigenvalues of the Casimir operators of space-time symmetry algebras are obtained. Using these interrelations and requiring that the field dynamics in AdS(5) be irreducible and classically unitary, we derive restrictions on the constant parameters and eigenvalues of the second-order Casimir operator of the algebra.
A dataset on tail risk of commodities markets.
Powell, Robert J; Vo, Duc H; Pham, Thach N; Singh, Abhay K
2017-12-01
This article contains the datasets related to the research article "The long and short of commodity tails and their relationship to Asian equity markets" (Powell et al., 2017) [1]. The datasets contain the daily prices (and price movements) of 24 different commodities decomposed from the S&P GSCI index and the daily prices (and price movements) of three share market indices covering the World, Asia, and South East Asia for the period 2004-2015. The dataset is then divided into annual periods, showing the worst 5% of price movements for each year. The datasets are convenient for examining the tail risk of different commodities as measured by Conditional Value at Risk (CVaR), as well as its changes over time. The datasets can also be used to investigate the association between commodity markets and share markets.
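A minimal sketch of the CVaR measure used with these datasets, assuming CVaR is computed as the mean of the worst 5% of daily price movements (the exact estimator in the source article may differ):

```python
def cvar(returns, alpha=0.05):
    """Historical Conditional Value at Risk: the mean of the worst
    alpha fraction of daily price movements, reported as a positive
    loss figure."""
    tail_n = max(1, int(len(returns) * alpha))
    worst = sorted(returns)[:tail_n]   # most negative movements
    return -sum(worst) / tail_n
```

Unlike plain Value at Risk (the alpha-quantile itself), CVaR averages over the whole tail, so it is sensitive to how extreme the worst days are, not just how many.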
Stability of warped AdS3 vacua of topologically massive gravity
NASA Astrophysics Data System (ADS)
Anninos, Dionysios; Esole, Mboyo; Guica, Monica
2009-10-01
AdS3 vacua of topologically massive gravity (TMG) have been shown to be perturbatively unstable for all values of the coupling constant except the chiral point μl = 1. We study the possibility that the warped vacua of TMG, which exist for all values of μ, are stable under linearized perturbations. In this paper, we show that spacelike warped AdS3 vacua with Compère-Detournay boundary conditions are indeed stable in the range μl>3. This is precisely the range in which black hole solutions arise as discrete identifications of the warped AdS3 vacuum. The situation somewhat resembles chiral gravity: although negative energy modes do exist, they are all excluded by the boundary conditions, and the perturbative spectrum solely consists of boundary (pure large gauge) gravitons.
2002-01-01
The National Elevation Dataset (NED) is a new raster product assembled by the U.S. Geological Survey. NED is designed to provide national elevation data in a seamless form with a consistent datum, elevation unit, and projection. Data corrections were made in the NED assembly process to minimize artifacts, perform edge matching, and fill sliver areas of missing data. NED has a resolution of one arc-second (approximately 30 meters) for the conterminous United States, Hawaii, Puerto Rico, and the island territories, and a resolution of two arc-seconds for Alaska. NED data sources have a variety of elevation units, horizontal datums, and map projections. In the NED assembly process the elevation values are converted to decimal meters as a consistent unit of measure, NAD83 is consistently used as the horizontal datum, and all the data are recast in a geographic projection. Older DEMs produced by methods that are now obsolete have been filtered during the NED assembly process to minimize artifacts that are commonly found in data produced by these methods. Artifact removal greatly improves the quality of the slope, shaded-relief, and synthetic drainage information that can be derived from the elevation data. Figure 2 illustrates the results of this artifact removal filtering. NED processing also includes steps to adjust values where adjacent DEMs do not match well, and to fill sliver areas of missing data between DEMs. These processing steps ensure that NED has no void areas and that artificial discontinuities have been minimized. The artifact removal filtering process does not eliminate all of the artifacts. In areas where the only available DEM is produced by older methods, "striping" may still occur.
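The unit-conversion step described above can be shown in miniature (a hypothetical sketch; the USGS production code and its rounding conventions are not specified in the abstract):

```python
# Hypothetical mini-example of one NED assembly step: converting source DEM
# elevations recorded in US survey feet to decimal meters, NED's consistent unit.
US_SURVEY_FOOT_IN_METERS = 1200.0 / 3937.0  # exact definition of the US survey foot

def to_decimal_meters(elevations_ft):
    """Convert elevations in US survey feet to meters, rounded to decimeter
    precision as an illustration of a 'decimal meters' unit."""
    return [round(e * US_SURVEY_FOOT_IN_METERS, 1) for e in elevations_ft]

print(to_decimal_meters([1000.0, 5280.0]))
```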
Interactive visualization and analysis of multimodal datasets for surgical applications.
Kirmizibayrak, Can; Yim, Yeny; Wakid, Mike; Hahn, James
2012-12-01
Surgeons use information from multiple sources when making surgical decisions. These include volumetric datasets (such as CT, PET, MRI, and their variants), 2D datasets (such as endoscopic videos), and vector-valued datasets (such as computer simulations). Presenting all the information to the user in an effective manner is a challenging problem. In this paper, we present a visualization approach that displays the information from various sources in a single coherent view. The system allows the user to explore and manipulate volumetric datasets, display analysis of dataset values in local regions, combine 2D and 3D imaging modalities and display results of vector-based computer simulations. Several interaction methods are discussed: in addition to traditional interfaces including mouse and trackers, gesture-based natural interaction methods are shown to control these visualizations with real-time performance. An example of a medical application (medialization laryngoplasty) is presented to demonstrate how the combination of different modalities can be used in a surgical setting with our approach.
Tensionless string spectra on AdS3
NASA Astrophysics Data System (ADS)
Gaberdiel, Matthias R.; Gopakumar, Rajesh
2018-05-01
The spectrum of superstrings on AdS3 × S3 × M4 with pure NS-NS flux is analysed for the background where the radius of the AdS space takes the minimal value (k = 1). Both for M4 = S3 × S1 and M4 = T4 we show that there is a special set of physical states, coming from the bottom of the spectrally flowed continuous representations, which agree in precise detail with the single-particle spectrum of a free symmetric product orbifold. For the case of AdS3 × S3 × T4 this relies on making sense of the world-sheet theory at k = 1, for which we make a concrete proposal. We also comment on the implications of this striking result.
Mebane, Christopher A.
2006-01-01
In 2001, the U.S. Environmental Protection Agency (EPA) released updated aquatic life criteria for cadmium. Since then, additional data on the effects of cadmium to aquatic life have become available from studies supported by the EPA, Idaho Department of Environmental Quality (IDEQ), and the U.S. Geological Survey, among other sources. Updated data on the effects of cadmium to aquatic life were compiled and reviewed and low-effect concentrations were estimated. Low-effect values were calculated using EPA's guidelines for deriving numerical national water-quality criteria for the protection of aquatic organisms and their uses. Data on the short-term (acute) effects of cadmium on North American freshwater species that were suitable for criteria derivation were located for 69 species representing 57 genera and 33 families. For longer-term (chronic) effects of cadmium on North American freshwater species, suitable data were located for 28 species representing 21 genera and 17 families. Both the acute and chronic toxicity of cadmium were dependent on the hardness of the test water. Hardness-toxicity regressions were developed for both acute and chronic datasets so that effects data from different tests could be adjusted to a common water hardness. Hardness-adjusted effects values were pooled to obtain species and genus mean acute and chronic values, which then were ranked by their sensitivity to cadmium. The four most sensitive genera to acute exposures were, in order of increasing cadmium resistance, Oncorhynchus (Pacific trout and salmon), Salvelinus ('char' trout), Salmo (Atlantic trout and salmon), and Cottus (sculpin). The four most sensitive genera to chronic exposures were Hyalella (amphipod), Cottus, Gammarus (amphipod), and Salvelinus. Using the updated datasets, hardness dependent criteria equations were calculated for acute and chronic exposures to cadmium. At a hardness of 50 mg/L as calcium carbonate, the criterion maximum concentration (CMC, or 'acute
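The hardness adjustment underlying the pooled species values can be sketched as an ln-ln regression correction (illustrative only; the slope and concentrations below are placeholders, not the report's cadmium coefficients):

```python
import math

def hardness_adjusted_value(effect_conc, test_hardness, target_hardness, slope):
    """Normalize a toxicity effect concentration measured at one water hardness
    to a common target hardness, using a pooled ln-ln regression slope.

    Sketch of an EPA-style hardness adjustment; the slope is a placeholder,
    not the actual coefficient derived in the report.
    """
    return effect_conc * math.exp(
        slope * (math.log(target_hardness) - math.log(test_hardness))
    )

# A hypothetical LC50 of 2.0 ug/L measured at hardness 100 mg/L CaCO3,
# adjusted to the common hardness of 50 mg/L with a placeholder slope of 1.0:
print(hardness_adjusted_value(2.0, 100.0, 50.0, 1.0))
```

Adjusting all effect concentrations to one hardness in this way is what allows results from different tests to be pooled into species and genus mean values.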
Roelen, Corné A M; Bültmann, Ute; Groothoff, Johan W; Twisk, Jos W R; Heymans, Martijn W
2015-11-01
Prognostic models including age, self-rated health and prior sickness absence (SA) have been found to predict high (≥ 30) SA days and high (≥ 3) SA episodes during 1-year follow-up. More predictors of high SA are needed to improve these SA prognostic models. The purpose of this study was to investigate fatigue as a new predictor in SA prognostic models by using risk reclassification methods and measures. This was a prospective cohort study with 1-year follow-up of 1,137 office workers. Fatigue was measured at baseline with the 20-item Checklist Individual Strength and added to the existing SA prognostic models. SA days and episodes during 1-year follow-up were retrieved from an occupational health service register. The added value of fatigue was investigated with Net Reclassification Index (NRI) and Integrated Discrimination Improvement (IDI) measures. In total, 579 (51 %) office workers had complete data for analysis. Fatigue was prospectively associated with both high SA days and episodes. The NRI revealed that adding fatigue to the SA days model correctly reclassified workers with high SA days, but incorrectly reclassified workers without high SA days. The IDI indicated no improvement in risk discrimination by the SA days model. Both NRI and IDI showed that the prognostic model predicting high SA episodes did not improve when fatigue was added as a predictor variable. In the present study, fatigue increased false-positive rates, which may reduce the cost-effectiveness of interventions for preventing SA.
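The categorical NRI used to judge the added predictor can be sketched as follows (a minimal illustration with made-up risk categories, not the study's implementation):

```python
def net_reclassification_index(old_cat, new_cat, outcome):
    """Categorical Net Reclassification Index.

    old_cat/new_cat are ordinal risk categories from the old and new model;
    outcome is 1 for subjects with the event (e.g., high SA), 0 otherwise.
    NRI = P(up|event) - P(down|event) + P(down|nonevent) - P(up|nonevent).
    """
    up_e = down_e = up_n = down_n = n_event = n_nonevent = 0
    for o, n, y in zip(old_cat, new_cat, outcome):
        if y:
            n_event += 1
            up_e += n > o    # event correctly moved to a higher risk category
            down_e += n < o
        else:
            n_nonevent += 1
            up_n += n > o
            down_n += n < o  # non-event correctly moved to a lower category
    return (up_e - down_e) / n_event + (down_n - up_n) / n_nonevent

# Hypothetical reclassification: the new model moves one event up and one
# non-event down in risk category
print(net_reclassification_index([0, 0, 1, 1], [1, 0, 0, 1], [1, 1, 0, 0]))
```

A positive NRI means reclassification helped on balance; the study's finding of increased false positives corresponds to a large "up" count among non-events.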
A reanalysis dataset of the South China Sea.
Zeng, Xuezhi; Peng, Shiqiu; Li, Zhijin; Qi, Yiquan; Chen, Rongyu
2014-01-01
Ocean reanalysis provides a temporally continuous and spatially gridded four-dimensional estimate of the ocean state for a better understanding of the ocean dynamics and its spatial/temporal variability. Here we present a 19-year (1992-2010) high-resolution ocean reanalysis dataset of the upper ocean in the South China Sea (SCS) produced from an ocean data assimilation system. A wide variety of observations, including in-situ temperature/salinity profiles, ship-measured and satellite-derived sea surface temperatures, and sea surface height anomalies from satellite altimetry, are assimilated into the outputs of an ocean general circulation model using a multi-scale incremental three-dimensional variational data assimilation scheme, yielding a daily high-resolution reanalysis dataset of the SCS. Comparisons between the reanalysis and independent observations support the reliability of the dataset. The presented dataset provides the research community of the SCS an important data source for studying the thermodynamic processes of the ocean circulation and meso-scale features in the SCS, including their spatial and temporal variability.
ERIC Educational Resources Information Center
Goldring, Ellen; Grissom, Jason A.; Rubin, Mollie; Neumerski, Christine M.; Cannata, Marisa; Drake, Timothy; Schuermann, Patrick
2015-01-01
Increasingly, states and districts are combining student growth measures with rigorous, rubric-aligned teacher observations in constructing teacher evaluation measures. Although the student growth or value-added components of these measures have received much research and policy attention, the results of this study suggest that the data generated…
ERIC Educational Resources Information Center
Brady, Michael P.; Heiser, Lawrence A.; McCormick, Jazarae K.; Forgan, James
2016-01-01
High-stakes standardized student assessments are increasingly used in value-added evaluation models to connect teacher performance to P-12 student learning. These assessments are also being used to evaluate teacher preparation programs, despite validity and reliability threats. A more rational model linking student performance to candidates who…
Bob Smith; Philip A. Araman
1997-01-01
This paper looks at various opportunities for sawmills to add value to rough sawn lumber. The paper discusses edge-glued panels and blanks, mouldings and millwork, niche market opportunities, and timber bridge members.
Boucaud-Maitre, Denis; Altman, Jean-Jacques
2016-10-01
The Food and Drug Administration (FDA) and the European Medicines Agency (EMA) have both implemented procedures to shorten review time for marketing authorizations with potential therapeutic added value, called priority review and accelerated assessment procedure, respectively. The aim of this study is to compare the new molecular entities (NME) assessed in shorter review time by both agencies and to investigate whether granting a shortened review time status subsequently predicts the therapeutic value attributed by a health technology assessment agency, the French Haute Autorité de Santé (HAS). All NME approved by the EMA and the FDA with a therapeutic added value between 2007 and June 30, 2015 were extracted. We assessed the sensitivity, the positive predictive value, and the EMA review time. One hundred seventy-eight NME were approved by the FDA and the EMA, and a therapeutic value was available for 160 NME. Eighty-eight (55.0 %) NME were on FDA priority review, 24 (15.0 %) on EMA accelerated procedure, and 43 (26.9 %) were considered of high clinical added value. The sensitivity was 86.0 % for the FDA and 30.2 % for the EMA. The positive predictive value was, respectively, 42.0 and 54.2 %. Twenty-five NME on FDA priority review and of high therapeutic added value were not on the EMA accelerated assessment procedure, leading to a supplementary mean EMA review time of 146 days. The EMA was restrictive in granting shortened review time status to products with therapeutic interest during the study period.
The Dubious Utility of the Value-Added Concept in Higher Education: The Case of Accounting
ERIC Educational Resources Information Center
Yunker, J.A.
2005-01-01
Using data on CPA exam pass rates and various institutional variables, this research examines the potential usefulness of the value-added concept in accounting higher education. For a sample of 548 US colleges and universities, predicted pass rates were computed from regression equations relating observed pass rates to institutional variables. The…
ERIC Educational Resources Information Center
Troncoso, Patricio; Pampaka, Maria; Olsen, Wendy
2016-01-01
School value-added studies have largely demonstrated the effects of socioeconomic and demographic characteristics of the schools and the pupils on performance in standardised tests. Traditionally, these studies have assessed the variation coming only from the schools and the pupils. However, recent studies have shown that the analysis of academic…
Schwach, Pierre; Pan, Xiulian; Bao, Xinhe
2017-07-12
The quest for an efficient process to convert methane to fuels and high value-added chemicals such as olefins and aromatics is motivated by their increasing demand and by recently discovered large reserves and resources of methane. Direct conversion to these chemicals can be realized either oxidatively via oxidative coupling of methane (OCM) or nonoxidatively via methane dehydroaromatization (MDA), which have been under intensive investigation for decades. While industrial applications are still limited by low yields (selectivity) and stability issues, innovations in new catalysts and concepts are needed. The newly emerging strategy using iron single sites to catalyze methane conversion to olefins, aromatics, and hydrogen (MTOAH) attracted much attention when it was reported. Because the challenge lies in controlled dehydrogenation of the highly stable CH4 and selective C-C coupling, we focus mainly on the fundamentals of C-H activation and analyze the reaction pathways toward selective routes of OCM, MDA, and MTOAH. With this, we intend to provide some insights into their reaction mechanisms and implications for future development of highly selective catalysts for direct conversion of methane to high value-added chemicals.
Sohail, Muhammad; Wang, Liangmin
2018-01-01
Today, the IoT integrates thousands of internetworks and sensing devices, e.g., vehicular networks, which are challenging due to their high speed and network dynamics. The goal of future vehicular networks is to improve road safety, promote commercial or infotainment products, and reduce traffic accidents. All these applications are based on information exchange among nodes, so not only reliable data delivery but also the authenticity and credibility of the data itself are prerequisites. To cope with this problem, trust management has emerged as a promising candidate for managing nodes' transactions and interactions, which requires the cooperation of distributed mobile nodes to achieve the design goals. In this paper, we propose a trust-based routing protocol, 3VSR (Three-Valued Secure Routing), which extends the widely used AODV (Ad hoc On-demand Distance Vector) routing protocol and employs a sensing-logic-based trust model to enhance the security of VANETs (Vehicular Ad Hoc Networks). Existing routing protocols are mostly based on key- or signature-based schemes, which of course increase computation overhead. In our proposed 3VSR, trust among entities is updated frequently by means of opinions derived from sensing logic, owing to vehicles' random topologies. In 3VSR the theoretical capabilities are based on the Dirichlet distribution, considering the prior and posterior uncertainty of the event in question. Also, by using trust-recommendation message exchange, nodes are able to reduce computation and routing overhead. The simulation results show that the proposed scheme is secure and practical. PMID:29538314
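The flavor of a Dirichlet-based three-valued trust opinion can be sketched as in subjective logic (an illustration of the general idea only; 3VSR's exact update rules are not reproduced here):

```python
def trust_opinion(positive, negative, prior_weight=2.0):
    """Map Dirichlet evidence counts to a (belief, disbelief, uncertainty)
    opinion, as in subjective logic.

    positive/negative are counts of good/bad interactions with a node;
    prior_weight is the non-informative prior weight (2 for a binary event).
    The three components always sum to 1.
    """
    total = positive + negative + prior_weight
    return (positive / total, negative / total, prior_weight / total)

# Many successful interactions -> high belief, low uncertainty
b, d, u = trust_opinion(8, 0)
print(b, d, u)
```

As evidence accumulates, the uncertainty component shrinks, which is the mechanism that lets nodes weigh fresh opinions against sparse ones.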
ERIC Educational Resources Information Center
Finney, Sara J.; Sundre, Donna L.; Swain, Matthew S.; Williams, Laura M.
2016-01-01
Accountability mandates often prompt assessment of student learning gains (e.g., value-added estimates) via achievement tests. The validity of these estimates have been questioned when performance on tests is low stakes for students. To assess the effects of motivation on value-added estimates, we assigned students to one of three test consequence…
NASA Astrophysics Data System (ADS)
Lag-Brotons, Alfonso; Marshall, Rachel; Herbert, Ben; Hurst, Lois; Ostle, Nick; Dodd, Ian; Quinton, John; Surridge, Ben; Aiouache, Farid; Semple, Kirk T.
2017-04-01
Resource recovery from waste plays a central role in strategies tackling current worldwide sustainability problems. In this sense, two waste streams derived from bioenergy production (anaerobic digestion and incineration), digestate [D] and biomass ash [A], may be especially valuable within agriculture. These materials offer complementary plant nutrient profiles for alternative fertiliser production (i.e., nitrogen [N] from D and phosphorus [P] from A). In addition, incorporating these materials into the soil could impact upon several soil/plant characteristics, and have positive effects on ecosystem services (e.g., nutrient cycling). Therefore, the present work assessed the effects of A/D blends on the soil-plant system under controlled conditions (glasshouse). The overarching aim of the "Adding Value to Ash and Digestate [AVAnD]" project is to identify novel nutrient-recycling pathways to maximise soil quality and crop productivity utilising waste streams derived from bioenergy production. Two pot experiments of 6 weeks duration were carried out [Exp. A and Exp. B] using contrasting agricultural soils (neutral loam and sandy acidic soil) and wheat as the crop. A factorial randomised block design was selected, with fertilisation treatment and soil condition (planted/unplanted) as factors. Fertilisation treatments (n=13) were applied at a rate of 63/60 kg N/P2O5 per ha and comprised: control ([C], no fertilisation), urea [U], urea+superphosphate [U+P], fly ash [A1], bottom ash [A2], U+A1; U+A2, anaerobic digestates [D1, D2] and ash/digestate blends [D1A1, D1A2, D2A1, D2A2]. Each block (n=5) contained 8 planted and 5 unplanted pots (104 planted + 65 unplanted experimental units). At the end of the experiment, all the plants were assessed for morphometric traits, while for tissue elemental analyses the total number of replicates per treatment was randomly reduced (n=5/treatment). Soil physico-chemical properties (i.e. available nitrogen, pH) were assessed in unplanted and
New massive gravity and AdS(4) counterterms.
Jatkar, Dileep P; Sinha, Aninda
2011-04-29
We show that the recently proposed Dirac-Born-Infeld extension of new massive gravity emerges naturally as a counterterm in four-dimensional anti-de Sitter space (AdS(4)). The resulting on-shell Euclidean action is independent of the cutoff at zero temperature. We also find that the same choice of counterterm gives the usual area law for the AdS(4) Schwarzschild black hole entropy in a cutoff-independent manner. The parameter values of the resulting counterterm action correspond to a c=0 theory in the context of the duality between AdS(3) gravity and two-dimensional conformal field theory. We rewrite this theory in terms of the gauge field that is used to recast 3D gravity as a Chern-Simons theory.
A high-resolution European dataset for hydrologic modeling
NASA Astrophysics Data System (ADS)
Ntegeka, Victor; Salamon, Peter; Gomes, Goncalo; Sint, Hadewij; Lorini, Valerio; Thielen, Jutta
2013-04-01
inputs to the hydrological calibration and validation of EFAS as well as for establishing long-term discharge "proxy" climatologies which can then in turn be used for statistical analysis to derive return periods or other time series derivatives. In addition, this dataset will be used to assess climatological trends in Europe. Unfortunately, to date no baseline dataset at the European scale exists to test the quality of the herein presented data. Hence, a comparison against other existing datasets can therefore only be an indication of data quality. Due to availability, a comparison was made for precipitation and temperature only, arguably the most important meteorological drivers for hydrologic models. A variety of analyses was undertaken at country scale against data reported to EUROSTAT and E-OBS datasets. The comparison revealed that while the datasets showed overall similar temporal and spatial patterns, there were some differences in magnitudes especially for precipitation. It is not straightforward to define the specific cause for these differences. However, in most cases the comparatively low observation station density appears to be the principal reason for the differences in magnitude.
Added value of double reading in diagnostic radiology: a systematic review.
Geijer, Håkan; Geijer, Mats
2018-06-01
Double reading in diagnostic radiology can find discrepancies in the original report, but a systematic program of double reading is resource consuming. There are conflicting opinions on the value of double reading. The purpose of the current study was to perform a systematic review on the value of double reading. A systematic review was performed to find studies calculating the rate of misses and overcalls with the aim of establishing the added value of double reading by human observers. The literature search resulted in 1610 hits. After abstract and full-text reading, 46 articles were selected for analysis. The rate of discrepancy varied from 0.4 to 22% depending on study setting. Double reading by a sub-specialist, in general, led to high rates of changed reports. The systematic review found rather low discrepancy rates. The benefit of double reading must be balanced against the considerable number of working hours a systematic double-reading scheme requires. A more profitable scheme might be to use systematic double reading for selected, high-risk examination types. A second conclusion is that there seems to be value in sub-specialisation for increased report quality. A consequent implementation of this would have far-reaching organisational effects. • In double reading, two or more radiologists read the same images. • A systematic literature review was performed. • The discrepancy rates varied from 0.4 to 22% in various studies. • Double reading by sub-specialists found high discrepancy rates.
Value Added Models and the Implementation of the National Standards of K-12 Physical Education
ERIC Educational Resources Information Center
Seymour, Clancy M.; Garrison, Mark J.
2017-01-01
The implementation of value-added models of teacher evaluation continue to expand in public education, but the effects of using student test scores to evaluate K-12 physical educators necessitates further discussion. Using the five National Standards for K-12 Physical Education from the Society of Health and Physical Educators America (SHAPE),…
ERIC Educational Resources Information Center
Khaled, Anne; Gulikers, Judith; Biemans, Harm; van der Wel, Marjan; Mulder, Martin
2014-01-01
The intentions with which hands-on simulations are used in vocational education are not always clear. Also, pedagogical-didactic approaches in hands-on simulations are not well conceptualised from a learning theory perspective. This makes it difficult to pinpoint the added value that hands-on simulations can have in an innovative vocational…
Transforming Care at the Bedside (TCAB): enhancing direct care and value-added care.
Dearmon, Valorie; Roussel, Linda; Buckner, Ellen B; Mulekar, Madhuri; Pomrenke, Becky; Salas, Sheri; Mosley, Aimee; Brown, Stephanie; Brown, Ann
2013-05-01
The purpose of this study was to examine the effectiveness of a Transforming Care at the Bedside initiative from a unit perspective. Improving patient outcomes and nurses' work environments are the goals of Transforming Care at the Bedside. Transforming Care at the Bedside creates programs of change originating at the point of care and directly promoting engagement of nurses to transform work processes and quality of care on medical-surgical units. This descriptive comparative study draws on multiple data sources from two nursing units: a Transforming Care at the Bedside unit where staff tested, adopted and implemented improvement ideas, and a control unit where staff continued traditional practices. Change theory provided the framework for the study. Direct care and value-added care increased on Transforming Care at the Bedside unit compared with the control unit. Transforming Care at the Bedside unit decreased in incidental overtime. Nurses reported that the process challenged old ways of thinking and increased nursing innovations. Hourly rounding, bedside reporting and the use of pain boards were seen as positive innovations. Evidence supported the value-added dimension of the Transforming Care at the Bedside process at the unit level. Nurses recognized the significance of their input into processes of change. Transformational leadership and frontline projects provide a vehicle for innovation through application of human capital. © 2012 Blackwell Publishing Ltd.
Goossens, Joery; Bjerke, Maria; Struyfs, Hanne; Niemantsverdriet, Ellis; Somers, Charisse; Van den Bossche, Tobi; Van Mossevelde, Sara; De Vil, Bart; Sieben, Anne; Martin, Jean-Jacques; Cras, Patrick; Goeman, Johan; De Deyn, Peter Paul; Van Broeckhoven, Christine; van der Zee, Julie; Engelborghs, Sebastiaan
2017-07-14
The Alzheimer's disease (AD) cerebrospinal fluid (CSF) biomarkers Aβ1-42, t-tau, and p-tau181 overlap with other diseases. New tau modifications or epitopes, such as the non-phosphorylated tau fraction (p-tau(rel)), may improve differential dementia diagnosis. The goal of this study is to investigate if p-tau(rel) can improve the diagnostic performance of the AD CSF biomarker panel for differential dementia diagnosis. The study population consisted of 45 AD, 45 frontotemporal lobar degeneration (FTLD), 45 dementia with Lewy bodies (DLB), and 21 Creutzfeldt-Jakob disease (CJD) patients, and 20 cognitively healthy controls. A substantial subset of the patients was pathology-confirmed. CSF levels of Aβ1-42, t-tau, p-tau181, and p-tau(rel) were determined with commercially available single-analyte enzyme-linked immunosorbent assay (ELISA) kits. Diagnostic performance was evaluated by receiver operating characteristic (ROC) curve analyses, and area under the curve (AUC) values were compared using DeLong tests. The diagnostic performance of single markers as well as biomarker ratios was determined for each pairwise comparison of different dementia groups and controls. The addition of p-tau(rel) to the AD biomarker panel decreased its diagnostic performance when discriminating non-AD, FTLD, and DLB from AD. As a single marker, p-tau(rel) increased the diagnostic performance for CJD. No significant difference was found in AUC values with the addition of p-tau(rel) when differentiating between AD or non-AD dementias and controls. The addition of p-tau(rel) to the AD CSF biomarker panel failed to improve differentiation between AD and non-AD dementias.
QCD Condensates and Holographic Wilson Loops for Asymptotically AdS Spaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quevedo, R. Carcasses; Goity, Jose L.; Trinchero, Roberto C.
2014-02-01
The minimization of the Nambu-Goto (NG) action for a surface whose contour defines a circular Wilson loop of radius a placed at a finite value of the coordinate orthogonal to the border is considered. This is done for asymptotically AdS spaces. The condensates of dimension n = 2, 4, 6, 8, and 10 are calculated in terms of the coefficients in the expansion in powers of the radius a of the on-shell subtracted NG action for small a → 0. The subtraction employed is such that it presents no conflict with conformal invariance in the AdS case and need not introduce an additional infrared scale for the case of confining geometries. It is shown that the UV value of the gluon condensates is universal in the sense that it only depends on the first coefficients of the difference with the AdS case.
Self-consistent geodesic equation and quantum tunneling from charged AdS black holes
NASA Astrophysics Data System (ADS)
Deng, Gao-Ming
2017-12-01
Some pressing shortcomings in previous derivations of geodesic equations are remedied in this paper. In contrast to the unnatural and awkward treatment in previous works, here we derive the geodesic equations of massive and massless particles in a unified and self-consistent manner. Furthermore, we extend the analysis to investigate the Hawking radiation via tunneling from charged black holes in the context of AdS spacetime. Of special interest, the application of the first law of black hole thermodynamics in the tunneling integration manifestly simplifies the calculation.
Interpolated Sounding and Gridded Sounding Value-Added Products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toto, T.; Jensen, M.
Standard Atmospheric Radiation Measurement (ARM) Climate Research Facility sounding files provide atmospheric state data in one dimension of increasing time and height per sonde launch. Many applications require a quick estimate of the atmospheric state at higher time resolution. The INTERPOLATEDSONDE (i.e., Interpolated Sounding) Value-Added Product (VAP) transforms sounding data into continuous daily files on a fixed time-height grid, at 1-minute time resolution, on 332 levels, from the surface up to a limit of approximately 40 km. The grid extends that high so the full height of soundings can be captured; however, most soundings terminate at an altitude between 25 and 30 km, above which no data is provided. Between soundings, the VAP linearly interpolates atmospheric state variables in time for each height level. In addition, INTERPOLATEDSONDE provides relative humidity scaled to microwave radiometer (MWR) observations. The INTERPOLATEDSONDE VAP, a continuous time-height grid of relative humidity-corrected sounding data, is intended to provide input to higher-order products, such as the Merged Soundings (MERGESONDE; Troyan 2012) VAP, which extends INTERPOLATEDSONDE by incorporating model data. The INTERPOLATEDSONDE VAP also is used to correct gaseous attenuation of radar reflectivity in products such as the KAZRCOR VAP.
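The per-level linear time interpolation between sonde launches can be sketched as follows (hypothetical values and grid; the operational VAP's inputs, levels, and MWR scaling are more involved):

```python
import numpy as np

# Two hypothetical soundings of temperature (deg C) on a shared height grid,
# launched at t = 0 min and t = 360 min; a VAP-style linear interpolation
# fills a 1-minute time grid between them, level by level.
heights = np.array([0.0, 1000.0, 2000.0])   # m above surface
t_sondes = np.array([0.0, 360.0])           # launch times, minutes
temp = np.array([[20.0, 12.0, 4.0],         # sounding at t = 0
                 [26.0, 14.0, 6.0]])        # sounding at t = 360

minutes = np.arange(0.0, 361.0, 1.0)        # fixed 1-minute time grid
interp = np.empty((minutes.size, heights.size))
for j in range(heights.size):               # interpolate each height level in time
    interp[:, j] = np.interp(minutes, t_sondes, temp[:, j])

print(interp[180])  # atmospheric state halfway between the two launches
```

The same loop applied to every state variable yields the continuous daily time-height grid the VAP provides.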
Gonzalo, Jed D; Graaf, Deanna; Ahluwalia, Amarpreet; Wolpaw, Dan R; Thompson, Britta M
2018-03-21
After emphasizing biomedical and clinical sciences for over a century, US medical schools are expanding experiential roles that allow students to learn about health care delivery while also adding value to patient care. After developing a program where all 1st-year medical students are integrated into interprofessional care teams to contribute to patient care, authors use a diffusion of innovations framework to explore and identify barriers, facilitators, and best practices for implementing value-added clinical systems learning roles. In 2016, authors conducted 32 clinical-site observations, 29 1:1 interviews with mentors, and four student focus-group interviews. Data were transcribed verbatim, and a thematic analysis was used to identify themes. Authors discussed drafts of the categorization scheme, and agreed upon results and quotations. Of 36 sites implementing the program, 17 (47%) remained, 8 (22%) significantly modified, and 11 (31%) withdrew from the program. Identified strategies for implementing value-added roles included: student education, patient characteristics, patient selection methods, activities performed, and resources. Six themes influencing program implementation and maintenance included: (1) educational benefit, (2) value added to patient care from student work, (3) mentor time and site capacity, (4) student engagement, (5) working relationship between school, site, and students, and, (6) students' continuity at the site. Health systems science is an emerging focus for medical schools, and educators are challenged to design practice-based roles that enhance education and add value to patient care. Health professions' schools implementing value-added roles will need to invest resources and strategize about best-practice strategies to guide efforts.
Adding Value to the Health Care System: Identifying Value-Added Systems Roles for Medical Students.
Gonzalo, Jed D; Graaf, Deanna; Johannes, Bobbie; Blatt, Barbara; Wolpaw, Daniel R
To catalyze learning in Health Systems Science and add value to health systems, education programs are seeking to incorporate students into systems roles, which are not well described. The authors sought to identify authentic roles for students within a range of clinical sites and explore site leaders' perceptions of the value of students performing these roles. From 2013 to 2015, site visits and interviews with leadership from an array of clinical sites (n = 30) were conducted. Thematic analysis was used to identify tasks and benefits of integrating students into interprofessional care teams. Types of systems roles included direct patient benefit activities, including monitoring patient progress with care plans and facilitating access to resources, and clinic benefit activities, including facilitating coordination and improving clinical processes. Perceived benefits included improved value of the clinical mission and enhanced student education. These results elucidate a framework for student roles that enhance learning and add value to health systems.
Castanea sativa by-products: a review on added value and sustainable application.
Braga, Nair; Rodrigues, Francisca; Oliveira, M Beatriz P P
2015-01-01
Castanea sativa Mill. is a species of the family Fagaceae abundant in southern Europe and Asia. The fruits (chestnuts) are an added-value resource in producing countries. The economic value of chestnut is increasing, not only for its nutritional qualities but also for the beneficial health effects associated with its consumption. During chestnut processing, a large amount of waste material is generated, namely inner shell, outer shell, and leaves. Studies on chestnut by-products revealed a good profile of bioactive compounds with antioxidant, anticarcinogenic, and cardioprotective properties. These agro-industrial wastes, after valorisation, can be used by other industries, such as pharmaceutical, food, or cosmetics, generating more profits, reducing pollution costs, and improving social, economic, and environmental sustainability. The purpose of this review is to provide knowledge about the types of chestnut by-products produced, the studies concerning their chemical composition and biological activity, and other possible applications of these materials.
Barlow, J H; Bancroft, G V; Turner, A P
2005-04-01
Chronic disease is a public health issue that could be addressed, in part, by increasing the ability of individuals to better manage their condition and its consequences on a day-to-day basis. One intervention designed to facilitate this is the Chronic Disease Self Management Course (CDSMC) that is delivered by volunteer, lay tutors who themselves have a chronic disease. Although there is growing evidence of course effectiveness for participants, the experiences of tutors have been neglected. This study aims to address this omission. Telephone interviews were conducted with 11 (six male) tutors: all interviews were transcribed and thematically analysed. Being a volunteer lay-tutor was perceived to be an enjoyable and valuable experience despite the challenges associated with course delivery, such as organizational demands and managing the diverse needs of mixed groups of chronic disease participants that led to a tension between disease-specific needs and the generic approach of the course. Being valued and adding value to the lives of others were key benefits of being a volunteer tutor, along with increased confidence that they were doing something positive for others. Course delivery prompted the initiation and maintenance of tutors' own self-management behaviours.
ERIC Educational Resources Information Center
Amrein-Beardsley, Audrey; Collins, Clarin
2012-01-01
The SAS Educational Value-Added Assessment System (SAS[R] EVAAS[R]) is the most widely used value-added system in the country. It is also self-proclaimed as "the most robust and reliable" system available, with its greatest benefit being to help educators improve their teaching practices. This study critically examined the effects of SAS[R] EVAAS[R] as…
A Value-Added Study of Teacher Spillover Effects across Four Core Subjects in Middle Schools
ERIC Educational Resources Information Center
Yuan, Kun
2015-01-01
This study examined the existence, magnitude, and impact of teacher spillover effects (TSEs) across teachers of four subject areas (i.e., mathematics, English language arts [ELA], science, and social studies) on student achievement in each of the four subjects at the middle school level. The author conducted a series of value-added (VA) analyses,…
Condensing Massive Satellite Datasets For Rapid Interactive Analysis
NASA Astrophysics Data System (ADS)
Grant, G.; Gallaher, D. W.; Lv, Q.; Campbell, G. G.; Fowler, C.; LIU, Q.; Chen, C.; Klucik, R.; McAllister, R. A.
2015-12-01
Our goal is to enable users to interactively analyze massive satellite datasets, identifying anomalous data or values that fall outside of thresholds. To achieve this, the project seeks to create a derived database containing only the most relevant information, accelerating the analysis process. The database is designed to be an ancillary tool for the researcher, not an archival database to replace the original data. This approach is aimed at improving performance by reducing the overall size by way of condensing the data. The primary challenges of the project include:
- The nature of the research question(s) may not be known ahead of time.
- The thresholds for determining anomalies may be uncertain.
- Problems associated with processing cloudy, missing, or noisy satellite imagery.
- The contents and method of creation of the condensed dataset must be easily explainable to users.
The architecture of the database will reorganize spatially-oriented satellite imagery into temporally-oriented columns of data (a.k.a. "data rods") to facilitate time-series analysis. The database itself is an open-source parallel database, designed to make full use of clustered server technologies. A demonstration of the system capabilities will be shown. Applications for this technology include quick-look views of the data, as well as the potential for on-board satellite processing of essential information, with the goal of reducing data latency.
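The "data rods" pivot described above can be sketched as follows, with illustrative names rather than the project's actual schema: each per-time-step grid is unpacked into one time series per pixel, so a time-series query becomes a single contiguous read.

```python
# Pivot spatially-oriented imagery (one 2-D grid per time step) into
# temporally-oriented "data rods" (one time series per pixel).
# Names and data layout are illustrative only.

def build_data_rods(frames):
    """frames: list of 2-D grids (list of rows), one per time step.
    Returns {(row, col): [value at t0, value at t1, ...]}."""
    rods = {}
    for grid in frames:
        for r, row in enumerate(grid):
            for c, value in enumerate(row):
                rods.setdefault((r, c), []).append(value)
    return rods

# Two 2x2 "images" at successive times:
frames = [[[1, 2], [3, 4]],
          [[5, 6], [7, 8]]]
rods = build_data_rods(frames)
```

A parallel column-store database applies the same idea at scale, storing each rod contiguously so anomaly scans over time touch only the pixels of interest.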
NASA Astrophysics Data System (ADS)
D'Amore, D. V.; Biles, F. E.
2016-12-01
The flow of water is often highlighted as a priority in land management planning and assessments related to climate change. Improved measurement and modeling of soil moisture is required to develop predictive estimates for plant distributions, soil moisture, and snowpack, which all play important roles in ecosystem planning in the face of climate change. Drainage indexes are commonly derived from GIS tools with digital elevation models. Soil moisture classes derived from these tools are useful digital proxies for ecosystem functions associated with the concentration of water on the landscape. We developed a spatially explicit, topographically derived soil wetness index (TWI) across the perhumid coastal temperate rainforest (PCTR) of Alaska and British Columbia. Developing applicable drainage indexes in complex terrain and across broad areas required careful application of the appropriate DEM, caution with artifacts in GIS coverages, and mapping of realistic wetland zones with the indicator. The large spatial extent of the model has facilitated the mapping of forest habitat and the development of water table depth mapping in the region. A key element of the TWI is the merging of elevation datasets across the US-Canada border, where major rivers transect the international boundary. The unified TWI allows for seamless mapping across the international border and unified ecological applications. A python program combined with the unified DEM allows end users to quickly apply the TWI to all areas of the PCTR. This common platform can facilitate model comparison and improvements to local soil moisture conditions, generation of streamflow, and ecological site conditions. In this presentation we highlight the application of the TWI for mapping risk factors related to forest decline and the development of a regional water table depth map. Improved soil moisture maps are critical for deriving spatial models of changes in soil moisture for both plant growth and streamflow across
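The index in question is the standard topographic wetness index, TWI = ln(a / tan β), where a is the specific catchment area and β the local slope. A minimal per-cell sketch of that formula (not the authors' PCTR implementation, whose DEM merging and artifact handling are the hard part) looks like:

```python
import math

# Per-cell topographic wetness index: TWI = ln(a / tan(beta)).
# Generic textbook form; inputs would normally come from flow-accumulation
# and slope rasters derived from a DEM.

def twi(specific_catchment_area, slope_radians, min_slope=1e-4):
    """Flat cells are clamped to min_slope to avoid division by zero."""
    tan_b = max(math.tan(slope_radians), min_slope)
    return math.log(specific_catchment_area / tan_b)

# A gentle, convergent cell scores wetter (higher TWI) than a steep,
# divergent one:
wet = twi(500.0, math.radians(2.0))
dry = twi(50.0, math.radians(30.0))
```

Higher values concentrate in valley bottoms and flats, which is why the index works as a digital proxy for wetland zones.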
Assessment of the lumber drying industry and current potential for value-added processing in Alaska.
David L. Nicholls; Kenneth A. Kilborn
2001-01-01
An assessment was done of the lumber drying industry in Alaska. Part 1 of the assessment included an evaluation of kiln capacity, kiln type, and species dried, by geographic region of the state. Part 2 of the assessment considered the value-added potential associated with lumber drying. Various costs related to lumber drying were evaluated in an Excel spreadsheet....
White, Peter S
2013-12-01
Conservation ethics have been based on 2 philosophical value systems: extrinsic value (defined broadly to include all values that derive from something external to the thing valued) and intrinsic value. Valuing biological diversity on the basis of an extrinsic value system is problematic because measurement is often difficult; extrinsic value changes as spatial or temporal scales change; extrinsic value differs on the basis of external factors; some species have trivial or negative extrinsic values; and extrinsic value varies across human cultures and societies and with such factors as socioeconomic conditions, individual experiences, and educational backgrounds. Valuing biological diversity on the basis of an intrinsic value system also poses challenges because intrinsic value can be seen as a disguised form of human extrinsic value; intrinsic value is initially ambiguous as to which objects or characteristics of biological diversity are to be valued; all aspects of biological diversity (e.g., species and ecosystems) are transitory; species and ecosystems are not static concrete entities; and the intrinsic value of one species is often in conflict with the intrinsic value of other species. Extrinsic and intrinsic value systems share a common origin, such that extrinsic values are always derived from intrinsic value, and life mutely expresses both intrinsic and extrinsic values; these are derived from and are products of biological evolution. Probing the values that underlie conservation helps the community clearly articulate its aims. Derivación de los Valores Extrínsecos de la Biodiversidad a Partir de sus Valores Intrínsecos y de Ambos a Partir de los Primeros Principios de la Evolución. © 2013 Society for Conservation Biology.
Improving the discoverability, accessibility, and citability of omics datasets: a case report.
Darlington, Yolanda F; Naumov, Alexey; McOwiti, Apollo; Kankanamge, Wasula H; Becnel, Lauren B; McKenna, Neil J
2017-03-01
Although omics datasets represent valuable assets for hypothesis generation, model testing, and data validation, the infrastructure supporting their reuse lacks organization and consistency. Using nuclear receptor signaling transcriptomic datasets as proof of principle, we developed a model to improve the discoverability, accessibility, and citability of published omics datasets. Primary datasets were retrieved from archives, processed to extract data points, then subjected to metadata enrichment and gap filling. The resulting secondary datasets were exposed on responsive web pages to support mining of gene lists, discovery of related datasets, and single-click citation integration with popular reference managers. Automated processes were established to embed digital object identifier-driven links to the secondary datasets in associated journal articles, small molecule and gene-centric databases, and a dataset search engine. Our model creates multiple points of access to reprocessed and reannotated derivative datasets across the digital biomedical research ecosystem, promoting their visibility and usability across disparate research communities.
A reanalysis dataset of the South China Sea
Zeng, Xuezhi; Peng, Shiqiu; Li, Zhijin; Qi, Yiquan; Chen, Rongyu
2014-01-01
Ocean reanalysis provides a temporally continuous and spatially gridded four-dimensional estimate of the ocean state for a better understanding of the ocean dynamics and its spatial/temporal variability. Here we present a 19-year (1992–2010) high-resolution ocean reanalysis dataset of the upper ocean in the South China Sea (SCS) produced from an ocean data assimilation system. A wide variety of observations, including in-situ temperature/salinity profiles, ship-measured and satellite-derived sea surface temperatures, and sea surface height anomalies from satellite altimetry, are assimilated into the outputs of an ocean general circulation model using a multi-scale incremental three-dimensional variational data assimilation scheme, yielding a daily high-resolution reanalysis dataset of the SCS. Comparisons between the reanalysis and independent observations support the reliability of the dataset. The presented dataset provides the research community of the SCS an important data source for studying the thermodynamic processes of the ocean circulation and meso-scale features in the SCS, including their spatial and temporal variability. PMID:25977803
ERIC Educational Resources Information Center
Perry, Thomas
2017-01-01
Value-added (VA) measures are currently the predominant approach used to compare the effectiveness of schools. Recent educational effectiveness research, however, has developed alternative approaches including the regression discontinuity (RD) design, which also allows estimation of absolute school effects. Initial research suggests RD is a viable…
ERIC Educational Resources Information Center
Spencer, Bryden
2016-01-01
Value-added models are a class of growth models used in education to assign responsibility for student growth to teachers or schools. For value-added models to be used fairly, sufficient statistical precision is necessary for accurate teacher classification. Previous research indicated precision below practical limits. An alternative approach has…
NASA Astrophysics Data System (ADS)
Norton, P. A., II; Haj, A. E., Jr.
2014-12-01
The United States Geological Survey is currently developing a National Hydrologic Model (NHM) to support and facilitate coordinated and consistent hydrologic modeling efforts at the scale of the continental United States. As part of this effort, the Geospatial Fabric (GF) for the NHM was created. The GF is a database that contains parameters derived from datasets that characterize the physical features of watersheds. The GF was used to aggregate catchments and flowlines defined in the National Hydrography Dataset Plus dataset for more than 100,000 hydrologic response units (HRUs), and to establish initial parameter values for input to the Precipitation-Runoff Modeling System (PRMS). Many parameter values are adjusted in PRMS using an automated calibration process. Using these adjusted parameter values, the PRMS model estimated variables such as evapotranspiration (ET), potential evapotranspiration (PET), snow-covered area (SCA), and snow water equivalent (SWE). In order to evaluate the effectiveness of parameter calibration, and model performance in general, several satellite-based Moderate Resolution Imaging Spectroradiometer (MODIS) and Snow Data Assimilation System (SNODAS) gridded datasets including ET, PET, SCA, and SWE were compared to PRMS-simulated values. The MODIS and SNODAS data were spatially averaged for each HRU, and compared to PRMS-simulated ET, PET, SCA, and SWE values for each HRU in the Upper Missouri River watershed. Default initial GF parameter values and PRMS calibration ranges were evaluated. Evaluation results, and the use of MODIS and SNODAS datasets to update GF parameter values and PRMS calibration ranges, are presented and discussed.
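The HRU-level comparison described above amounts to averaging each gridded dataset over the cells of every hydrologic response unit and differencing against the simulated value for that unit. A minimal sketch with made-up SWE numbers and HRU indices (the actual evaluation uses MODIS/SNODAS rasters and PRMS output):

```python
# Spatially average a gridded dataset over each HRU, then compare the
# HRU means against model-simulated values. Grids and IDs are illustrative.

def hru_means(grid, hru_ids):
    """grid and hru_ids are parallel 2-D lists; returns {hru: mean}."""
    sums, counts = {}, {}
    for grow, hrow in zip(grid, hru_ids):
        for v, h in zip(grow, hrow):
            sums[h] = sums.get(h, 0.0) + v
            counts[h] = counts.get(h, 0) + 1
    return {h: sums[h] / counts[h] for h in sums}

# Gridded SWE (mm) and an HRU index grid with two units:
swe = [[10.0, 12.0],
       [20.0, 22.0]]
hrus = [[1, 1],
        [2, 2]]
obs = hru_means(swe, hrus)
sim = {1: 13.0, 2: 20.0}                     # PRMS-simulated (made up)
bias = {h: sim[h] - obs[h] for h in obs}     # per-HRU model bias
```

Systematic per-HRU biases of this kind are what motivate revisiting the Geospatial Fabric parameter values and PRMS calibration ranges.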
Lepper-Blilie, A N; Berg, E P; Germolus, A J; Buchanan, D S; Berg, P T
2014-01-01
The objectives of this study were to educate consumers about value-added beef cuts and to evaluate their palatability responses to a value cut and three traditional cuts. Three hundred twenty-two individuals participated in the beef value cut education seminar series presented by trained beef industry educators. Seminar participants evaluated tenderness, juiciness, flavor, and overall liking of four samples (bottom round, top sirloin, ribeye, and a value cut, Delmonico or Denver) on a 9-point scale. The ribeye and the value cut were found to be similar in all four attributes and differed from the top sirloin and bottom round. Correlation and regression analyses found that flavor was the largest influencing factor on overall liking for the ribeye, value cut, and top sirloin. The value cut is comparable to the ribeye and can be a less expensive replacement.
Hara, Michikazu; Nakajima, Kiyotaka; Kamata, Keigo
2015-01-01
In recent decades, the substitution of non-renewable fossil resources by renewable biomass as a sustainable feedstock has been extensively investigated for the manufacture of high value-added products such as biofuels, commodity chemicals, and new bio-based materials such as bioplastics. Numerous solid catalyst systems for the effective conversion of biomass feedstocks into value-added chemicals and fuels have been developed. Solid catalysts are classified into four main groups with respect to their structures and substrate activation properties: (a) micro- and mesoporous materials, (b) metal oxides, (c) supported metal catalysts, and (d) sulfonated polymers. This review article focuses on the activation of substrates and/or reagents on the basis of groups (a)–(d), and the corresponding reaction mechanisms. In addition, recent progress in chemocatalytic processes for the production of five industrially important products (5-hydroxymethylfurfural, lactic acid, glyceraldehyde, 1,3-dihydroxyacetone, and furan-2,5-dicarboxylic acid) as bio-based plastic monomers and their intermediates is comprehensively summarized. PMID:27877800
Validating Variational Bayes Linear Regression Method With Multi-Central Datasets.
Murata, Hiroshi; Zangwill, Linda M; Fujino, Yuri; Matsuura, Masato; Miki, Atsuya; Hirasawa, Kazunori; Tanito, Masaki; Mizoue, Shiro; Mori, Kazuhiko; Suzuki, Katsuyoshi; Yamashita, Takehiro; Kashiwagi, Kenji; Shoji, Nobuyuki; Asaoka, Ryo
2018-04-01
To validate the prediction accuracy of variational Bayes linear regression (VBLR) with two datasets external to the training dataset. The training dataset consisted of 7268 eyes of 4278 subjects from the University of Tokyo Hospital. The Japanese Archive of Multicentral Databases in Glaucoma (JAMDIG) dataset consisted of 271 eyes of 177 patients, and the Diagnostic Innovations in Glaucoma Study (DIGS) dataset includes 248 eyes of 173 patients; these were used for validation. Prediction accuracy was compared between VBLR and ordinary least squares linear regression (OLSLR). First, OLSLR and VBLR were carried out using total deviation (TD) values at each of the 52 test points, from the series of the 2nd to 4th visual fields (VF2-4) up to the 2nd to 10th VFs (VF2-10), for each patient in the JAMDIG and DIGS datasets, and the TD values of the 11th VF test were predicted each time. The predictive accuracy of each method was compared through the root mean squared error (RMSE) statistic. OLSLR RMSEs with the JAMDIG and DIGS datasets were between 31 and 4.3 dB and between 19.5 and 3.9 dB, respectively. On the other hand, VBLR RMSEs with the JAMDIG and DIGS datasets were between 5.0 and 3.7 dB and between 4.6 and 3.6 dB, respectively. There was a statistically significant difference between VBLR and OLSLR for both datasets at every series (VF2-4 to VF2-10) (P < 0.01 for all tests). However, there was no statistically significant difference in VBLR RMSEs between the JAMDIG and DIGS datasets at any series of VFs (VF2-4 to VF2-10) (P > 0.05). VBLR outperformed OLSLR in predicting future VF progression, and VBLR has the potential to be a helpful tool in clinical settings.
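The OLSLR baseline amounts to fitting a straight line to a test point's TD series over visits and extrapolating to the next visit, scored by RMSE. The sketch below illustrates that baseline with made-up TD values; VBLR itself additionally places a Bayesian prior on the regression weights, which is not shown.

```python
import math

# Ordinary least squares trend extrapolation for one visual-field test
# point, plus the RMSE statistic used to score predictions.
# TD values and visit numbers are illustrative.

def ols_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def rmse(pred, actual):
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual))
                     / len(pred))

# TD (dB) at one test point over visits 2..6, predicting visit 7:
visits, td = [2, 3, 4, 5, 6], [-1.0, -1.5, -2.0, -2.5, -3.0]
slope, intercept = ols_fit(visits, td)
predicted = slope * 7 + intercept
error = rmse([predicted], [-3.6])   # score against the observed 7th visit
```

In the study this is repeated for all 52 test points and for progressively longer series (VF2-4 through VF2-10), which is how the per-series RMSE ranges above are obtained.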
Converting citrus wastes into value-added products: economic and environmentally friendly approaches.
Sharma, Kavita; Mahato, Neelima; Cho, Moo Hwan; Lee, Yong Rok
2017-02-01
Citrus fruits, including oranges, grapefruits, lemons, limes, tangerines, and mandarins, are among the most widely cultivated fruits around the globe. Their production is increasing every year due to rising consumer demand. Citrus-processing industries generate huge amounts of wastes every year; citrus peel waste alone accounts for almost 50% of the wet fruit mass. Citrus waste is of immense economic value, as it contains an abundance of various flavonoids, carotenoids, dietary fiber, sugars, polyphenols, essential oils, and ascorbic acid, as well as considerable amounts of some trace elements. Citrus waste also contains high levels of sugars suitable for fermentation for bioethanol production; however, compounds such as D-limonene must be removed for efficient bioethanol production. The aim of the present article was to review the latest advances in popular extraction methods for obtaining value-added products from citrus waste/byproducts and their potential utility as a source of various functional compounds.
ERIC Educational Resources Information Center
Viar, Meagan Alexis
2016-01-01
Educational leaders are struggling with the issue of academic reform as it pertains to accountability for student achievement. With increasing pressures to improve student achievement, many states have adopted value-added measures to monitor student growth and teacher effectiveness. This study undertook a quantitative approach to examine the…
ERIC Educational Resources Information Center
Timmermans, Anneke C.; Thomas, Sally M.
2015-01-01
In many countries, policy makers struggle with the development of value-added indicators of school performance for educational accountability purposes and in particular with the choice whether school context measured in the form of student composition variables should be included. This study investigates differences between 7 empirical studies…
ERIC Educational Resources Information Center
Lenkeit, Jenny
2013-01-01
Educational effectiveness research often appeals to "value-added models (VAM)" to gauge the impact of schooling on student learning net of the effect of student background variables. A huge amount of cross-sectional studies do not, however, meet VAM's requirement for longitudinal data. "Contextualised attainment models (CAM)"…
The Use of National Data Sets to Baseline Science Education Reform: Exploring Value-Added Approaches
ERIC Educational Resources Information Center
Homer, Matt; Ryder, Jim; Donnelly, Jim
2011-01-01
This paper uses data from the National Pupil Database to investigate the differences in "performance" across the range of science courses available following the 2006 Key Stage 4 (KS4) science reforms in England. This is a value-added exploration (from Key Stage 3 [KS3] to KS4) aimed not at the student or the school level, but rather at…
Green Net Value Added as a Sustainability Metric Based on ...
Sustainability measurement in economics involves evaluation of environmental and economic impact in an integrated manner. In this study, system level economic data are combined with environmental impact from a life cycle assessment (LCA) of a common product. We are exploring a costing approach that captures traditional costs but also incorporates externality costs to provide a convenient, easily interpretable metric. Green Net Value Added (GNVA) is a type of full cost accounting that incorporates total revenue, the cost of materials and services, depreciation, and environmental externalities. Two, but not all, of the potential environmental impacts calculated by the standard LCIA method (TRACI) could be converted to externality cost values. We compute externality costs disaggregated by upstream sectors, full cost, and GNVA to evaluate the relative sustainability of Bounty® paper towels manufactured at two production facilities. We found that the longer running, more established line had a higher GNVA than the newer line. The dominant factors contributing to externality costs are calculated to come from the stationary sources in the supply chain: electricity generation (27-35%), refineries (20-21%), pulp and paper making (15-23%). Health related externalities from Particulate Matter (PM2.5) and Carbon Dioxide equivalent (CO2e) emissions appear largely driven by electricity usage and emissions by the facilities, followed by pulp processing and transport. Supply
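As defined above, GNVA subtracts the cost of materials and services, depreciation, and monetized environmental externalities from total revenue. A minimal sketch of that accounting with illustrative figures (not the study's actual data):

```python
# Green Net Value Added: revenue minus materials/services, depreciation,
# and monetized externality costs. All numbers below are made up.

def green_net_value_added(revenue, materials_and_services,
                          depreciation, externality_costs):
    """externality_costs: {upstream sector: monetized cost}."""
    return (revenue - materials_and_services - depreciation
            - sum(externality_costs.values()))

ext = {"electricity generation": 3.0,   # $M, illustrative
       "refineries": 2.0,
       "pulp and paper making": 2.5}
gnva = green_net_value_added(revenue=100.0,
                             materials_and_services=60.0,
                             depreciation=10.0,
                             externality_costs=ext)
```

Because the externality terms are disaggregated by upstream sector, the same bookkeeping directly identifies which stationary sources dominate the externality cost, as in the electricity/refinery/pulp breakdown reported above.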
CoINcIDE: A framework for discovery of patient subtypes across multiple datasets.
Planey, Catherine R; Gevaert, Olivier
2016-03-09
Patient disease subtypes have the potential to transform personalized medicine. However, many patient subtypes derived from unsupervised clustering analyses on high-dimensional datasets are not replicable across multiple datasets, limiting their clinical utility. We present CoINcIDE, a novel methodological framework for the discovery of patient subtypes across multiple datasets that requires no between-dataset transformations. We also present a high-quality database collection, curatedBreastData, with over 2,500 breast cancer gene expression samples. We use CoINcIDE to discover novel breast and ovarian cancer subtypes with prognostic significance and novel hypothesized ovarian therapeutic targets across multiple datasets. CoINcIDE and curatedBreastData are available as R packages.
Two ultraviolet radiation datasets that cover China
NASA Astrophysics Data System (ADS)
Liu, Hui; Hu, Bo; Wang, Yuesi; Liu, Guangren; Tang, Liqin; Ji, Dongsheng; Bai, Yongfei; Bao, Weikai; Chen, Xin; Chen, Yunming; Ding, Weixin; Han, Xiaozeng; He, Fei; Huang, Hui; Huang, Zhenying; Li, Xinrong; Li, Yan; Liu, Wenzhao; Lin, Luxiang; Ouyang, Zhu; Qin, Boqiang; Shen, Weijun; Shen, Yanjun; Su, Hongxin; Song, Changchun; Sun, Bo; Sun, Song; Wang, Anzhi; Wang, Genxu; Wang, Huimin; Wang, Silong; Wang, Youshao; Wei, Wenxue; Xie, Ping; Xie, Zongqiang; Yan, Xiaoyuan; Zeng, Fanjiang; Zhang, Fawei; Zhang, Yangjian; Zhang, Yiping; Zhao, Chengyi; Zhao, Wenzhi; Zhao, Xueyong; Zhou, Guoyi; Zhu, Bo
2017-07-01
Ultraviolet (UV) radiation has significant effects on ecosystems, environments, and human health, as well as atmospheric processes and climate change. Two ultraviolet radiation datasets are described in this paper. One contains hourly observations of UV radiation measured at 40 Chinese Ecosystem Research Network stations from 2005 to 2015. CUV3 broadband radiometers were used to observe the UV radiation, with an accuracy of 5%, which meets the World Meteorological Organization's measurement standards. The extremum method was used to control the quality of the measured datasets. The other dataset contains daily cumulative UV radiation estimates that were calculated using an all-sky estimation model combined with a hybrid model. The reconstructed daily UV radiation data span from 1961 to 2014. The mean absolute bias error and root-mean-square error are smaller than 30% at most stations, and most of the mean bias error values are negative, which indicates underestimation of the UV radiation intensity. These datasets can improve our basic knowledge of the spatial and temporal variations in UV radiation. Additionally, these datasets can be used in studies of potential ozone formation and atmospheric oxidation, as well as simulations of ecological processes.
A universal counting of black hole microstates in AdS4
NASA Astrophysics Data System (ADS)
Azzurli, Francesco; Bobev, Nikolay; Crichigno, P. Marcos; Min, Vincent S.; Zaffaroni, Alberto
2018-02-01
Many three-dimensional N=2 SCFTs admit a universal partial topological twist when placed on hyperbolic Riemann surfaces. We exploit this fact to derive a universal formula which relates the planar limit of the topologically twisted index of these SCFTs and their three-sphere partition function. We then utilize this to account for the entropy of a large class of supersymmetric asymptotically AdS4 magnetically charged black holes in M-theory and massive type IIA string theory. In this context we also discuss novel AdS2 solutions of eleven-dimensional supergravity which describe the near horizon region of large new families of supersymmetric black holes arising from M2-branes wrapping Riemann surfaces.
Velghe, Inge; Carleer, Robert; Yperman, Jan; Schreurs, Sonja
2013-04-01
Slow and fast pyrolysis of sludge and a sludge/disposal filter cake (FC) mix were performed to investigate the liquid and solid products for their use as value-added products. The liquid products obtained from slow pyrolysis separate into an oil fraction, a water-rich fraction, and a valuable crystalline solid, 5,5-dimethylhydantoin. During fast pyrolysis, mainly an oil fraction is formed. Aliphatic acids and amides present in the water-rich fractions can be considered value-added products and could be purified. The oil fractions have properties that make them promising as fuel (25-35 MJ/kg, 14-20 wt% water content, 0.2-0.6 O/C value), but upgrading is necessary. Sludge/FC oils have a lower calorific value due to evaporation of alcohols present in the FC. ICP-AES analyses reveal that almost none of the metals present in sludge or sludge/FC are transferred to the liquid fractions; the metals are enriched in the solid fractions.
ERIC Educational Resources Information Center
Yeh, Stuart S.; Ritter, Joseph
2009-01-01
A cost-effectiveness analysis was conducted of Gordon, Kane, and Staiger's (2006) proposal to raise student achievement by identifying and replacing the bottom quartile of novice teachers, using value-added assessment of teacher performance. The cost effectiveness of this proposal was compared to the cost effectiveness of voucher programs, charter…
Converting Static Image Datasets to Spiking Neuromorphic Datasets Using Saccades.
Orchard, Garrick; Jayawant, Ajinkya; Cohen, Gregory K; Thakor, Nitish
2015-01-01
Creating datasets for Neuromorphic Vision is a challenging task. A lack of available recordings from Neuromorphic Vision sensors means that data must typically be recorded specifically for dataset creation rather than collecting and labeling existing data. The task is further complicated by a desire to simultaneously provide traditional frame-based recordings to allow for direct comparison with traditional Computer Vision algorithms. Here we propose a method for converting existing Computer Vision static image datasets into Neuromorphic Vision datasets using an actuated pan-tilt camera platform. Moving the sensor rather than the scene or image is a more biologically realistic approach to sensing and eliminates timing artifacts introduced by monitor updates when simulating motion on a computer monitor. We present conversion of two popular image datasets (MNIST and Caltech101) which have played important roles in the development of Computer Vision, and we provide performance metrics on these datasets using spike-based recognition algorithms. This work contributes datasets for future use in the field, as well as results from spike-based algorithms against which future works can compare. Furthermore, by converting datasets already popular in Computer Vision, we enable more direct comparison with frame-based approaches.
16 CFR 460.18 - Insulation ads.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Insulation ads. 460.18 Section 460.18... INSULATION § 460.18 Insulation ads. (a) If your ad gives an R-value, you must give the type of insulation and... the R-value, the greater the insulating power. Ask your seller for the fact sheet on R-values.” (b) If...
Lanzafame, S; Giannelli, M; Garaci, F; Floris, R; Duggento, A; Guerrisi, M; Toschi, N
2016-05-01
An increasing number of studies have aimed to compare diffusion tensor imaging (DTI)-related parameters [e.g., mean diffusivity (MD), fractional anisotropy (FA), radial diffusivity (RD), and axial diffusivity (AD)] with complementary newer indexes [e.g., mean kurtosis (MK), radial kurtosis (RK), and axial kurtosis (AK)] derived through diffusion kurtosis imaging (DKI), in terms of their potential to discriminate disease-related microstructural alterations in tissue. Given that the DTI and DKI models provide conceptually and quantitatively different estimates of the diffusion tensor, which can also depend on the fitting routine, the aim of this study was to investigate model- and algorithm-dependent differences in MD/FA/RD/AD and anisotropy mode (MO) estimates in diffusion-weighted imaging of human brain white matter. The authors employed (a) data collected from 33 healthy subjects (20-59 yr, F: 15, M: 18) within the Human Connectome Project (HCP) on a customized 3 T scanner, and (b) data from 34 healthy subjects (26-61 yr, F: 5, M: 29) acquired on a clinical 3 T scanner. The DTI model was fitted to b = 0 and b = 1000 s/mm(2) data, while the DKI model was fitted to data comprising b = 0, 1000, and 3000/2500 s/mm(2) [for datasets (a)/(b), respectively], through nonlinear and weighted linear least-squares algorithms. In addition to MK/RK/AK maps, MD/FA/MO/RD/AD maps were estimated from both models and both algorithms. Using tract-based spatial statistics, the authors tested the null hypothesis of zero difference between the two MD/FA/MO/RD/AD estimates in brain white matter for both datasets and both algorithms. DKI-derived MD/FA/RD/AD and MO estimates were significantly higher and lower, respectively, than the corresponding DTI-derived estimates. All voxelwise differences extended over most of the white matter skeleton. Fractional differences between the two estimates [(DKI - DTI)/DTI] of most invariants were seen to vary with the invariant value itself as well as with MK
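The weighted linear least-squares DTI fit mentioned in this abstract can be sketched for a single voxel as follows. This is an illustrative reconstruction, not the authors' code: `fit_dti_wls` and all variable names are invented, and weighting each equation by the squared signal is one common WLS choice.

```python
import numpy as np

def fit_dti_wls(signals, bvals, bvecs):
    """Weighted linear least-squares fit of the diffusion tensor (one voxel).

    signals : (N,) measured signal for N acquisitions
    bvals   : (N,) b-values in s/mm^2
    bvecs   : (N, 3) unit gradient directions
    Returns (MD, FA) for the voxel.
    """
    gx, gy, gz = bvecs[:, 0], bvecs[:, 1], bvecs[:, 2]
    # Design matrix for ln S = ln S0 - b * g^T D g (7 unknowns)
    X = np.column_stack([
        np.ones_like(bvals),
        -bvals * gx**2, -bvals * gy**2, -bvals * gz**2,
        -2 * bvals * gx * gy, -2 * bvals * gx * gz, -2 * bvals * gy * gz,
    ])
    y = np.log(signals)
    # Weight each equation by the squared signal (a standard WLS choice)
    W = signals**2
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * y))
    _, dxx, dyy, dzz, dxy, dxz, dyz = beta
    D = np.array([[dxx, dxy, dxz], [dxy, dyy, dyz], [dxz, dyz, dzz]])
    ev = np.linalg.eigvalsh(D)          # tensor eigenvalues
    md = ev.mean()                      # mean diffusivity
    fa = np.sqrt(1.5 * np.sum((ev - md)**2) / np.sum(ev**2))
    return md, fa
```

Fitting the DKI model additionally requires the b = 2500-3000 s/mm(2) shells and 15 more kurtosis-tensor unknowns, which is why the two models can yield systematically different MD/FA estimates from the same data.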
ERIC Educational Resources Information Center
Fagioli, Loris P.
2014-01-01
This study compared a value-added approach to school accountability to the currently used metrics of accountability in California of Adequate Yearly Progress (AYP) and Academic Performance Index (API). Five-year student panel data (N = 53,733) from 29 elementary schools in a large California school district were used to address the research…
Gesch, Dean B.; Oimoen, Michael J.; Evans, Gayla A.
2014-01-01
The National Elevation Dataset (NED) is the primary elevation data product produced and distributed by the U.S. Geological Survey. The NED provides seamless raster elevation data of the conterminous United States, Alaska, Hawaii, U.S. island territories, Mexico, and Canada. The NED is derived from diverse source datasets that are processed to a specification with consistent resolutions, coordinate system, elevation units, and horizontal and vertical datums. The NED serves as the elevation layer of The National Map, and it provides basic elevation information for earth science studies and mapping applications in the United States and most of North America. An important part of supporting scientific and operational use of the NED is provision of thorough dataset documentation including data quality and accuracy metrics. The focus of this report is on the vertical accuracy of the NED and on comparison of the NED with other similar large-area elevation datasets, namely data from the Shuttle Radar Topography Mission (SRTM) and the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER).
Li, Wen-Xing; Dai, Shao-Xing; Liu, Jia-Qian; Wang, Qian; Li, Gong-Hua; Huang, Jing-Fei
2016-01-01
Alzheimer's disease (AD) and schizophrenia (SZ) are both accompanied by impaired learning and memory functions. This study aims to compare the expression profiles of learning- and memory-related genes between AD and SZ. We downloaded 10 AD and 10 SZ datasets from GEO-NCBI for integrated analysis. These datasets were processed using the RMA algorithm and a global renormalization across all studies. An Empirical Bayes algorithm was then used to find the differentially expressed genes between patients and controls. The results showed that most of the differentially expressed genes were related to AD, whereas the gene expression profile was little affected in SZ. Furthermore, the expression of learning- or memory-related genes differed greatly between AD and SZ in the number of differentially expressed genes, their fold changes, and the brain regions affected. In AD, CALB1, GABRA5, and TAC1 were significantly downregulated in the whole brain, frontal lobe, temporal lobe, and hippocampus. In SZ, however, only two genes, CRHBP and CX3CR1, were downregulated, in the hippocampus; other brain regions were not affected. The effect of these genes on learning or memory impairment has been widely studied, and it is suggested that they may play a crucial role in AD or SZ pathogenesis. The different gene expression patterns between AD and SZ across brain regions revealed in our study may help in understanding the differing mechanisms of the two diseases.
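The Empirical Bayes differential-expression step described above can be approximated in a few lines. The sketch below is a simplified limma-style moderated t-test that shrinks per-gene variances toward a common prior; it is a stand-in under stated assumptions, not the study's actual pipeline, and `moderated_t`, the `d0` prior, and the input layout are all invented for illustration.

```python
import numpy as np
from scipy import stats

def moderated_t(patients, controls, d0=4.0):
    """Simplified empirical-Bayes moderated t-test per gene.

    patients, controls : (genes, samples) matrices of normalised log2
    expression (e.g. RMA output). Per-gene variances are shrunk toward
    the mean variance across genes with d0 prior degrees of freedom --
    a crude stand-in for the full limma empirical Bayes procedure.
    Returns (log fold change, moderated t, p-value) arrays.
    """
    n1, n2 = patients.shape[1], controls.shape[1]
    lfc = patients.mean(axis=1) - controls.mean(axis=1)
    d = n1 + n2 - 2                     # residual degrees of freedom
    s2 = (patients.var(axis=1, ddof=1) * (n1 - 1)
          + controls.var(axis=1, ddof=1) * (n2 - 1)) / d
    s2_prior = s2.mean()                # crude prior variance estimate
    s2_mod = (d0 * s2_prior + d * s2) / (d0 + d)   # shrunken variance
    t = lfc / np.sqrt(s2_mod * (1 / n1 + 1 / n2))
    p = 2 * stats.t.sf(np.abs(t), df=d0 + d)       # two-sided p-value
    return lfc, t, p
```

The shrinkage stabilises variance estimates when each dataset contributes few samples, which is the situation that motivates empirical Bayes methods in this kind of integrated analysis.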
Dataset for forensic analysis of B-tree file system.
Wani, Mohamad Ahtisham; Bhat, Wasim Ahmad
2018-06-01
Since the B-tree file system (Btrfs) is set to become the de facto standard file system on Linux (and Linux-based) operating systems, a Btrfs dataset for forensic analysis is of great interest and immense value to the forensic community. This article presents a novel dataset for forensic analysis of Btrfs that was collected using a proposed data-recovery procedure. The dataset identifies various generalized and common file system layouts and operations, the specific node-balancing mechanisms triggered, logical addresses of various data structures, on-disk records, recovered data (directory entries and extent data from leaf and internal nodes), and the percentage of data recovered.
NASA Astrophysics Data System (ADS)
Copas, K.; Legind, J. K.; Hahn, A.; Braak, K.; Høftt, M.; Noesgaard, D.; Robertson, T.; Méndez Hernández, F.; Schigel, D.; Ko, C.
2017-12-01
GBIF—the Global Biodiversity Information Facility—has recently demonstrated a system that tracks publications back to individual datasets, giving data providers demonstrable evidence of the benefit and utility of sharing data to support an array of scholarly topics and practical applications. GBIF is an open-data network and research infrastructure funded by the world's governments. Its community consists of more than 90 formal participants and almost 1,000 data-publishing institutions, which currently make tens of thousands of datasets containing nearly 800 million species occurrence records freely and publicly available for discovery, use and reuse across a wide range of biodiversity-related research and policy investigations. Starting in 2015 with the help of DataONE, GBIF introduced DOIs as persistent identifiers for the datasets shared through its network. This enhancement soon extended to the assignment of DOIs to user downloads from GBIF.org, which typically filter the available records with a variety of taxonomic, geographic, temporal and other search terms. Despite the lack of widely accepted standards for citing data among researchers and publications, this technical infrastructure is beginning to take hold and support open, transparent, persistent and repeatable use and reuse of species occurrence data. These 'download DOIs' provide canonical references for the search results researchers process and use in peer-reviewed articles—a practice GBIF encourages by confirming new DOIs with each download and offering guidelines on citation. GBIF has recently started linking these citation results back to dataset and publisher pages, offering more consistent, traceable evidence of the value of sharing data to support others' research. GBIF's experience may be a useful model for other repositories to follow.
Roelen, Corné A M; Stapelfeldt, Christina M; Heymans, Martijn W; van Rhenen, Willem; Labriola, Merete; Nielsen, Claus V; Bültmann, Ute; Jensen, Chris
2015-06-01
To validate Dutch prognostic models including age, self-rated health and prior sickness absence (SA) for their ability to predict high SA in Danish eldercare. The added value of work environment variables to the models' risk discrimination was also investigated. 2,562 municipal eldercare workers (95% women) participated in the Working in Eldercare Survey. Predictor variables were measured by questionnaire at baseline in 2005. Prognostic models were validated for predictions of high (≥30) SA days and high (≥3) SA episodes retrieved from employer records during 1-year follow-up. The accuracy of predictions was assessed by calibration graphs, and the ability of the models to discriminate between high- and low-risk workers was investigated by ROC analysis. The added value of work environment variables was measured with the Integrated Discrimination Improvement (IDI). 1,930 workers had complete data for analysis. The models underestimated the risk of high SA in eldercare workers, and the SA episodes model had to be re-calibrated to the Danish data. Discrimination was practically useful for the re-calibrated SA episodes model, but not the SA days model. Physical workload improved the SA days model (IDI = 0.40; 95% CI 0.19-0.60), and psychosocial work factors, particularly the quality of leadership (IDI = 0.70; 95% CI 0.53-0.86), improved the SA episodes model. The prognostic model predicting high SA days showed poor performance even after physical workload was added. The prognostic model predicting high SA episodes could be used to identify high-risk workers, especially when psychosocial work factors are added as predictor variables.
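The Integrated Discrimination Improvement used above to quantify the added value of work-environment variables has a simple closed form (Pencina's formulation): the mean predicted-risk gain among cases minus the mean gain among non-cases. A minimal sketch, with an invented function name and toy inputs:

```python
import numpy as np

def integrated_discrimination_improvement(y, p_old, p_new):
    """IDI for comparing two risk models.

    y     : (N,) binary outcome (e.g. 1 = high sickness absence)
    p_old : predicted risks from the base prognostic model
    p_new : predicted risks after adding extra predictor variables
    IDI = (mean risk gain among cases) - (mean risk gain among non-cases),
    so a positive IDI means the new model pushes cases' risks up and
    non-cases' risks down on average.
    """
    y = np.asarray(y, dtype=bool)
    gain = np.asarray(p_new) - np.asarray(p_old)
    return gain[y].mean() - gain[~y].mean()
```

With this definition, an IDI of 0.70 per 100 (as reported for quality of leadership in this study, scaled per the authors' convention) reflects a net separation gain between high- and low-risk workers.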
Recycling of hazardous waste from tertiary aluminium industry in a value-added material.
Gonzalo-Delgado, Laura; López-Delgado, Aurora; López, Félix Antonio; Alguacil, Francisco José; López-Andrés, Sol
2011-02-01
The recent European Directive on waste, 2008/98/EC, seeks to reduce the exploitation of natural resources through the use of secondary resource management. Thus the main objective of this study was to explore how a waste could cease to be considered as waste and could be utilized for a specific purpose. In this way, a hazardous waste from the tertiary aluminium industry was studied for its use as a raw material in the synthesis of an added-value product, boehmite. This waste is classified as a hazardous residue, principally because in the presence of water or humidity, it releases toxic gases such as hydrogen, ammonia, methane and hydrogen sulfide. The low temperature hydrothermal method developed permits the recovery of 90% of the aluminium content in the residue in the form of a high purity (96%) AlOOH (boehmite). The method of synthesis consists of an initial HCl digestion followed by a gel precipitation. In the first stage a 10% HCl solution is used to yield a 12.63 g L(-1) Al(3+) solution. In the second stage boehmite is precipitated in the form of a gel by increasing the pH of the acid Al(3+) solution by adding 1 mol L(-1) NaOH solution. Several pH values were tested and boehmite was obtained as the only crystalline phase at pH 8. Boehmite was completely characterized by X-ray diffraction, Fourier transform infrared and scanning electron microscopy. A study of its thermal behaviour was also carried out by thermogravimetric/differential thermal analysis.
A global dataset of crowdsourced land cover and land use reference data.
Fritz, Steffen; See, Linda; Perger, Christoph; McCallum, Ian; Schill, Christian; Schepaschenko, Dmitry; Duerauer, Martina; Karner, Mathias; Dresel, Christopher; Laso-Bayas, Juan-Carlos; Lesiv, Myroslava; Moorthy, Inian; Salk, Carl F; Danylo, Olha; Sturn, Tobias; Albrecht, Franziska; You, Liangzhi; Kraxner, Florian; Obersteiner, Michael
2017-06-13
Global land cover is an essential climate variable and a key biophysical driver for earth system models. While remote sensing technology, particularly satellites, has played a key role in providing land cover datasets, large discrepancies have been noted among the available products. Global land use is typically more difficult to map and in many cases cannot be remotely sensed. In-situ or ground-based data and high-resolution imagery are thus an important requirement for producing accurate land cover and land use datasets, and this is precisely what is lacking. Here we describe the global land cover and land use reference data derived from the Geo-Wiki crowdsourcing platform via four campaigns. These global datasets provide information on human impact, land cover disagreement, wilderness, and land cover and land use. Hence, they are relevant for the scientific community that requires reference data for global satellite-derived products, as well as those interested in monitoring global terrestrial ecosystems in general.
Xin, Fengxue; Dong, Weiliang; Jiang, Yujia; Ma, Jiangfeng; Zhang, Wenming; Wu, Hao; Zhang, Min; Jiang, Min
2018-06-01
Butanol is an important bulk chemical and has been regarded as an advanced biofuel. Large-scale production of butanol has been practiced for more than 100 years, but its production through the acetone-butanol-ethanol (ABE) fermentation process by solventogenic Clostridium species is still not economically viable due to the low butanol titer and yield caused by the toxicity of butanol and of by-products such as acetone. Renewed interest in biobutanol as a biofuel has spurred technological advances in strain modification and fermentation process design. In particular, with the development of interdisciplinary processes, butanol alone or even the whole ABE mixture produced through the fermentation process can be further used as a platform chemical for the production of high-value-added products through enzymatic or chemical catalysis. This review aims to comprehensively summarize the most recent advances in the conversion of acetone, butanol and the ABE mixture into various products, such as isopropanol, butyl butyrate and higher-molecular-mass alkanes. Additionally, co-production of other value-added products with ABE is also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Yuting; Rosenberg, Julian N.; Bohutskyi, Pavlo
2015-11-16
In this study, the prospects of biofuel production from microalgal carbohydrates and lipids coupled with greenhouse gas mitigation due to photosynthetic assimilation of CO2 have ushered in a renewed interest in algal feedstock. Furthermore, microalgae (including cyanobacteria) have become established as commercial sources of value-added biochemicals such as polyunsaturated fatty acids and carotenoid pigments used as antioxidants in nutritional supplements and cosmetics. This article presents a comprehensive synopsis of the metabolic basis for accumulating lipids as well as applicable methods of lipid and cellulose bioconversion and final applications of these natural or refined products from microalgal biomass. For lipids, one-step in situ transesterification offers a new and more accurate approach to quantify oil content. As a complement to microalgal oil fractions, the utilization of cellulosic biomass from microalgae to produce bioethanol by fermentation, biogas by anaerobic digestion, and bio-oil by hydrothermal liquefaction are discussed. Collectively, a compendium of information spanning green renewable fuels and value-added nutritional compounds is provided.
Value-added benefits and utilization of pathologists' assistants.
Vitale, John; Brooks, Reed; Sovocool, Michael; Rader, W Rae
2012-12-01
The role of pathologists' assistants (PAs) in terms of surgical and autopsy prosection has been well established; however, the role of PAs in areas beyond surgical and autopsy pathology, such as laboratory administration and management, education, and research, is not so well understood. To determine the scope and extent of ancillary duties (value-added benefits) performed by PAs. A self-administered, electronic survey was disseminated to all members of the American Association of Pathologists' Assistants with fellowship status to analyze the ancillary duties PAs provide in laboratory administration and management, education, and research. Respondents were from 44 states and most had 6 or more years of experience in various work settings: community hospitals (50%), academic hospitals (30%), private pathology laboratories (15%), and "other" settings (5%). Most were involved in quality assurance programs (64.0%), laboratory accreditation inspections (56.2%), and a large percentage (44.4%) also had direct supervisory experience. Roughly 36% of respondents reported training residents in prosection skills in a clinical setting, while a small percentage reported teaching for-credit courses in a classroom setting (4.9%). The primary research responsibility was fresh tissue procurement for tumor banking (52.7%). Pathologists' assistants currently are involved in ancillary duties beyond surgical and autopsy prosection. Our findings indicate that PAs have a desire to become more involved in these duties, and there is opportunity for pathologists to benefit further by using PAs to the full extent of their knowledge, skills, and interests.
A dataset of human decision-making in teamwork management.
Yu, Han; Shen, Zhiqi; Miao, Chunyan; Leung, Cyril; Chen, Yiqiang; Fauvel, Simon; Lin, Jun; Cui, Lizhen; Pan, Zhengxiang; Yang, Qiang
2017-01-17
Today, most endeavours require teamwork by people with diverse skills and characteristics. In managing teamwork, decisions are often made under uncertainty and resource constraints. The strategies and the effectiveness of the strategies different people adopt to manage teamwork under different situations have not yet been fully explored, partially due to a lack of detailed large-scale data. In this paper, we describe a multi-faceted large-scale dataset to bridge this gap. It is derived from a game simulating complex project management processes. It presents the participants with different conditions in terms of team members' capabilities and task characteristics for them to exhibit their decision-making strategies. The dataset contains detailed data reflecting the decision situations, decision strategies, decision outcomes, and the emotional responses of 1,144 participants from diverse backgrounds. To our knowledge, this is the first dataset simultaneously covering these four facets of decision-making. With repeated measurements, the dataset may help establish baseline variability of decision-making in teamwork management, leading to more realistic decision theoretic models and more effective decision support approaches.
ERIC Educational Resources Information Center
Street, Nathan Lee
2017-01-01
Teacher value-added measures (VAM) are designed to provide information regarding teachers' causal impact on the academic growth of students while controlling for exogenous variables. While some researchers contend VAMs successfully and authentically measure teacher causality on learning, others suggest VAMs cannot adequately control for exogenous…
KCMP Minnesota Tall Tower Nitrous Oxide Inverse Modeling Dataset 2010-2015
Griffis, Timothy J. [University of Minnesota; Baker, John; Millet, Dylan; Chen, Zichong; Wood, Jeff; Erickson, Matt; Lee, Xuhui
2017-01-01
This dataset contains nitrous oxide mixing ratios and supporting information measured at a tall tower (KCMP, 244 m) site near St. Paul, Minnesota, USA. The data include nitrous oxide and carbon dioxide mixing ratios measured at the 100 m level. Turbulence and wind data were measured using a sonic anemometer at the 185 m level. Also included in this dataset are estimates of the "background" nitrous oxide mixing ratios and monthly concentration source footprints derived from WRF-STILT modeling.
An Empirical Method for deriving RBE values associated with Electrons, Photons and Radionuclides
Bellamy, Michael B; Puskin, J.; Eckerman, Keith F.; ...
2015-01-01
There is substantial evidence to justify using relative biological effectiveness (RBE) values greater than one for low-energy electrons and photons. But, in the field of radiation protection, radiation associated with low linear energy transfer (LET) has been assigned a radiation weighting factor (wR) of one. This value may be suitable for radiation protection but, for risk considerations, it is important to evaluate the potential elevated biological effectiveness of radiation to improve the quality of risk estimates. RBE values between 2 and 3 for tritium are implied by several experimental measurements. Additionally, elevated RBE values have been found for other similar low-energy radiation sources. In this work, RBE values are derived for electrons based upon the fractional deposition of absorbed dose at energies less than a few keV. Using this empirical method, RBE values were also derived for monoenergetic photons and 1070 radionuclides from ICRP Publication 107 for which photons and electrons are the primary emissions.
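As a rough illustration of the dose-fraction idea in this abstract (one plausible reading, not the paper's actual formula): if a fraction of the absorbed dose is deposited by electrons below the low-energy cutoff, an effective RBE can be formed as a dose-weighted mixture of an elevated low-energy RBE and a unit high-energy RBE. The function name, the linear mixing rule, and the 2.5 default (loosely motivated by the 2-3 range reported for tritium) are all assumptions.

```python
def empirical_rbe(dose_fraction_low, rbe_low=2.5):
    """Hypothetical dose-fraction weighting sketch.

    dose_fraction_low : fraction of absorbed dose deposited by electrons
                        below the low-energy cutoff (a few keV)
    rbe_low           : RBE assigned to that low-energy dose component
                        (default 2.5 is an illustrative assumption)
    The remainder of the dose is assigned an RBE of one, so the overall
    RBE is a dose-weighted average of the two components.
    """
    return rbe_low * dose_fraction_low + 1.0 * (1.0 - dose_fraction_low)
```

Under this reading, a radionuclide whose emissions deposit more of their dose below the cutoff would receive a proportionally higher derived RBE.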
Value of Osteoblast-Derived Exosomes in Bone Diseases.
Ge, Min; Wu, Yingzhi; Ke, Ronghu; Cai, Tianyi; Yang, Junyi; Mu, Xiongzheng
2017-06-01
The authors' purpose is to reveal the value of osteoblast-derived exosomes in bone diseases. Microvesicles from supernatants of mouse Mc3t3 cells were isolated by ultracentrifugation, and the protein profile was characterized by proteomics analysis. The authors detected a total of 1536 proteins by mass spectrometry and found 172 proteins that overlap with a bone database. Ingenuity Pathway Analysis shows a network of "Skeletal and Muscular System Development and Function, Developmental Disorder, Hereditary Disorder" and a pathway related to osteogenesis. EFNB1 and transforming growth factor beta receptor 3 in the network, and LRP6, bone morphogenetic protein receptor type-1, and SMURF1 in the pathway, appear to be valuable in exosome research on related bone diseases. The authors' study unveiled the content of osteoblast-derived exosomes and discussed the valuable proteins within, which might provide a novel perspective for bone disease research.
Srirangan, Kajan; Bruder, Mark; Akawi, Lamees; Miscevic, Dragan; Kilpatrick, Shane; Moo-Young, Murray; Chou, C Perry
2017-09-01
Diminishing fossil fuel reserves and mounting environmental concerns associated with petrochemical manufacturing practices have generated significant interest in developing whole-cell biocatalytic systems for the production of value-added chemicals and biofuels. Although acetyl-CoA is a common natural biogenic precursor for the biosynthesis of numerous metabolites, propionyl-CoA is uncommon and non-native to most organisms. Nevertheless, with its C3-acyl moiety as a discrete building block, propionyl-CoA can serve as another key biogenic precursor to several biological products of industrial importance. As a result, engineering propionyl-CoA metabolism, particularly in genetically tractable hosts with the use of inexpensive feedstocks, has paved an avenue for novel biomanufacturing. Herein, we present a systematic review of the manipulation of propionyl-CoA metabolism as well as relevant genetic and metabolic engineering strategies for microbial production of value-added chemicals and biofuels, including odd-chain alcohols and organic acids, bio(co)polymers and polyketides.
Abu-Jamous, Basel; Fa, Rui; Roberts, David J; Nandi, Asoke K
2015-06-04
Collective analysis of the increasingly numerous gene expression datasets being published is required. The recently proposed binarisation of consensus partition matrices (Bi-CoPaM) method can combine clustering results from multiple datasets to identify the subsets of genes that are consistently co-expressed in all of the provided datasets in a tuneable manner. However, results validation and parameter setting are issues that complicate the design of such methods. Moreover, although it is common practice to test methods by applying them to synthetic datasets, the mathematical models used to synthesise such datasets are usually based on approximations which may not always be sufficiently representative of real datasets. Here, we propose an unsupervised method for the unification of clustering results from multiple datasets using external specifications (UNCLES). This method has the ability to identify the subsets of genes consistently co-expressed in a subset of datasets while being poorly co-expressed in another subset of datasets, and to identify the subsets of genes consistently co-expressed in all given datasets. We also propose the M-N scatter plots validation technique and adopt it to set the parameters of UNCLES, such as the number of clusters, automatically. Additionally, we propose an approach for the synthesis of gene expression datasets using real data profiles in a way which combines the ground-truth knowledge of synthetic data with the realistic expression values of real data, and therefore overcomes the problem of the faithfulness of synthetic expression data modelling. By application to those datasets, we validate UNCLES while comparing it with other conventional clustering methods and, of particular relevance, biclustering methods. We further validate UNCLES by application to a set of 14 real genome-wide yeast datasets, as it produces focused clusters that conform well to known biological facts. Furthermore, in-silico-based hypotheses regarding the function of a few
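The consensus step at the heart of Bi-CoPaM/UNCLES-style methods can be sketched as follows. This simplified illustration assumes cluster labels have already been aligned across datasets (the real methods perform a relabelling step first) and uses a unanimity threshold as an 'intersection'-style binarisation; `consensus_binarise` is an invented name.

```python
import numpy as np

def consensus_binarise(partitions, n_clusters, threshold=1.0):
    """Simplified consensus of clustering results from multiple datasets.

    partitions : list of (genes,) integer label arrays, one per dataset,
                 assumed already relabelled so that cluster k means the
                 same thing in every dataset.
    Averages the one-hot memberships into a fuzzy consensus partition
    matrix, then binarises: with threshold=1.0 a gene is assigned to a
    cluster only if every dataset agrees, leaving disagreeing genes
    unassigned (all-zero rows).
    Returns (fuzzy consensus matrix, binary membership matrix).
    """
    n_genes = len(partitions[0])
    cpm = np.zeros((n_genes, n_clusters))
    for labels in partitions:
        cpm[np.arange(n_genes), labels] += 1.0   # accumulate one-hot votes
    cpm /= len(partitions)                       # fuzzy memberships in [0, 1]
    binary = (cpm >= threshold).astype(int)      # unanimous genes only
    return cpm, binary
```

Lowering the threshold relaxes the consensus from strict intersection toward majority voting, which is the kind of tuneability the abstract refers to.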
The value of benefit data in direct-to-consumer drug ads.
Woloshin, Steven; Schwartz, Lisa M; Welch, H Gilbert
2004-01-01
Direct-to-consumer (DTC) pharmaceutical ads typically describe drug benefits in qualitative terms; they rarely provide data on how well the drug works. We describe an evaluation of a "prescription drug benefit box"-data from the main randomized trials on the chances of various outcomes with and without the drug. Most participants rated the information as "very important" or "important"; almost all found the data easy to understand. Perceptions of drug effectiveness were much lower for ads that incorporated the benefit box than for ads that did not. Most people we interviewed want benefit data in drug ads, can understand these data, and are influenced by them.
Arevalo-Gallegos, Alejandra; Ahmad, Zanib; Asgher, Muhammad; Parra-Saldivar, Roberto; Iqbal, Hafiz M N
2017-06-01
A novel green-technology route for integrating the extraction and transformation of biomass-based carbohydrates, lignin, oils and other materials into a wider spectrum of marketable, value-added products with a zero-waste approach is reviewed. With ever-increasing scientific knowledge, worldwide economic and environmental consciousness, demands of legislative authorities, and the manufacture, use and disposal of petrochemical-based by-products, the last decade has seen increasing research interest in adding or restoring value to lignocellulose-based materials. Characteristics such as natural abundance, renewability, recyclability, and year-round accessibility around the globe all make residual biomass an eco-attractive, petro-alternative candidate. In this context, many significant research efforts have been made to replace the petroleum-based economy with a bio-based economy, with the aim of developing a comprehensively sustainable, socially acceptable, and eco-friendly society. The present review focuses on various aspects of the bio-refinery as a sustainable technology for processing lignocellulosic materials into value-added products. Innovations in the bio-refinery world are providing a portfolio of sustainable and eco-efficient products to compete in a market presently dominated by petroleum-based products, and the field is therefore currently a subject of intensive research. Copyright © 2017 Elsevier B.V. All rights reserved.
SOA-based model for value-added ITS services delivery.
Herrera-Quintero, Luis Felipe; Maciá-Pérez, Francisco; Marcos-Jorquera, Diego; Gilart-Iglesias, Virgilio
2014-01-01
Integration is currently a key factor in intelligent transportation systems (ITS), especially because of the ever increasing service demands originating from the ITS industry and ITS users. The current ITS landscape is made up of multiple technologies that are tightly coupled, and its interoperability is extremely low, which limits ITS services generation. Given this fact, novel information technologies (IT) based on the service-oriented architecture (SOA) paradigm have begun to introduce new ways to address this problem. The SOA paradigm allows the construction of loosely coupled distributed systems that can help to integrate the heterogeneous systems that are part of ITS. In this paper, we focus on developing an SOA-based model for integrating information technologies (IT) into ITS to achieve ITS service delivery. To develop our model, the ITS technologies and services involved were identified, catalogued, and decoupled. In doing so, we applied our SOA-based model to integrate all of the ITS technologies and services, ranging from the lowest-level technical components, such as roadside unit as a service (RSUAAS), to the most abstract ITS services that will be offered to ITS users (value-added services). To validate our model, a functionality case study that included all of the components of our model was designed.
Relations between elliptic multiple zeta values and a special derivation algebra
NASA Astrophysics Data System (ADS)
Broedel, Johannes; Matthes, Nils; Schlotterer, Oliver
2016-04-01
We investigate relations between elliptic multiple zeta values (eMZVs) and describe a method to derive the number of indecomposable elements of given weight and length. Our method is based on representing eMZVs as iterated integrals over Eisenstein series and exploiting the connection with a special derivation algebra. Its commutator relations give rise to constraints on the iterated integrals over Eisenstein series relevant for eMZVs and thereby allow us to count the indecomposable representatives. Conversely, the above connection suggests apparently new relations in the derivation algebra. At https://tools.aei.mpg.de/emzv we provide relations for eMZVs over a wide range of weights and lengths.
Classification of subsurface objects using singular values derived from signal frames
Chambers, David H; Paglieroni, David W
2014-05-06
The classification system represents a detected object with a feature vector derived from the return signals acquired by an array of N transceivers operating in multistatic mode. The classification system generates the feature vector by transforming the real-valued return signals into complex-valued spectra, using, for example, a Fast Fourier Transform. The classification system then generates a feature vector of singular values for each user-designated spectral sub-band by applying a singular value decomposition (SVD) to the N×N square complex-valued matrix formed from sub-band samples associated with all possible transmitter-receiver pairs. The resulting feature vector of singular values may be transformed into a feature vector of singular value likelihoods and then subjected to a multi-category linear or neural network classifier for object classification.
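The pipeline this record describes (FFT of real return signals, sub-band selection, SVD of the square pair matrix) can be sketched as below. How the sub-band bins are collapsed into one N×N matrix is an assumption of this sketch (a plain average), as are all names and shapes; the patent itself does not specify them here.

```python
import numpy as np

def singular_value_features(returns, band):
    """Feature vector of singular values for one spectral sub-band.

    returns : (N, N, T) real array of return signals for every
              transmitter-receiver pair of an N-element multistatic array.
    band    : slice choosing the user-designated sub-band of FFT bins.
    """
    # Transform the real-valued return signals into complex-valued spectra.
    spectra = np.fft.rfft(returns, axis=-1)            # (N, N, F) complex
    # Collapse the sub-band samples into one N x N complex matrix
    # (a plain average over bins -- an assumption of this sketch).
    sub = spectra[:, :, band].mean(axis=-1)            # (N, N)
    # Singular values (non-negative, descending) form the feature vector.
    return np.linalg.svd(sub, compute_uv=False)

rng = np.random.default_rng(0)
feats = singular_value_features(rng.standard_normal((4, 4, 64)), slice(2, 10))
```

The resulting length-N vector could then be mapped to likelihoods and passed to a linear or neural-network classifier, as the abstract goes on to describe.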
Dust Optical Properties Over North Africa and Arabian Peninsula Derived from the AERONET Dataset
NASA Technical Reports Server (NTRS)
Kim, D.; Chin, M.; Yu, H.; Eck, T. F.; Sinyuk, A.; Smirnov, A.; Holben, B. N.
2011-01-01
Dust optical properties over North Africa and the Arabian Peninsula are extracted from the quality-assured multi-year datasets obtained at 14 sites of the Aerosol Robotic Network (AERONET). We select the data with (a) large aerosol optical depth (AOD >= 0.4 at 440 nm) and (b) small Angstrom exponent (A(sub ext) <= 0.2) to retain high accuracy and reduce interference from non-dust aerosols. The result indicates that the major fraction of high-aerosol-optical-depth days is dominated by dust over these sites, although this varies with location and time. We find that the annual means and standard deviations of the single scattering albedo, asymmetry parameter, real refractive index, and imaginary refractive index for Saharan and Arabian desert dust are 0.944 +/- 0.005, 0.752 +/- 0.014, 1.498 +/- 0.032, and 0.0024 +/- 0.0034, respectively, at the 550 nm wavelength. Dust aerosol selected by this method is less absorbing than previously reported values over these sites. The weaker absorption of dust found in this study is consistent with studies using satellite remote sensing techniques. These results can help to constrain uncertainties in estimating global dust shortwave radiative forcing.
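The screening criteria quoted above (AOD >= 0.4 at 440 nm, Angstrom exponent <= 0.2) reduce to a simple mask over the observations. A minimal sketch, with made-up sample values and illustrative names:

```python
import numpy as np

def dust_stats(aod440, angstrom, ssa550):
    """Mean and standard deviation of SSA at 550 nm over dust-dominated days.

    Selection follows the abstract's screening: AOD(440 nm) >= 0.4 and
    Angstrom exponent <= 0.2. Variable names here are illustrative only.
    """
    aod440, angstrom, ssa550 = map(np.asarray, (aod440, angstrom, ssa550))
    dust = (aod440 >= 0.4) & (angstrom <= 0.2)   # optically thick, coarse-mode
    sel = ssa550[dust]
    return float(sel.mean()), float(sel.std())

# Two of the four fictitious days pass the dust screen.
mean_ssa, std_ssa = dust_stats([0.5, 0.6, 0.2, 0.9],
                               [0.10, 0.15, 0.10, 0.50],
                               [0.94, 0.95, 0.90, 0.80])
```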
ERIC Educational Resources Information Center
Anthony, Jason L.; Williams, Jeffrey M.; Zhang, Zhoe; Landry, Susan H.; Dunkelberger, Martha J.
2014-01-01
Research Findings: In an effort toward developing a comprehensive, effective, scalable, and sustainable early childhood education program for at-risk populations, we conducted an experimental evaluation of the value added by 2 family involvement programs to the Texas Early Education Model (TEEM). A total of 91 preschool classrooms that served…
ERIC Educational Resources Information Center
Liew, Chern Li; Chennupati, K. R.; Foo, Schubert
2001-01-01
Explores the potential and impact of an innovative information environment in enhancing user activities in using electronic documents for various tasks, and to support the value-adding of these e-documents. Discusses the conceptual design and prototyping of a proposed environment, PROPIE. Presents an empirical and formative evaluation of the…
Vacuum currents in braneworlds on AdS bulk with compact dimensions
NASA Astrophysics Data System (ADS)
Bellucci, S.; Saharian, A. A.; Vardanyan, V.
2015-11-01
The two-point function and the vacuum expectation value (VEV) of the current density are investigated for a massive charged scalar field with arbitrary curvature coupling in the geometry of a brane on the background of AdS spacetime with partial toroidal compactification. The presence of a gauge field flux, enclosed by compact dimensions, is assumed. On the brane the field obeys Robin boundary condition and along compact dimensions periodicity conditions with general phases are imposed. There is a range in the space of the values for the coefficient in the boundary condition where the Poincaré vacuum is unstable. This range depends on the location of the brane and is different for the regions between the brane and AdS boundary and between the brane and the horizon. In models with compact dimensions the stability condition is less restrictive than that for the AdS bulk with trivial topology. The vacuum charge density and the components of the current along non-compact dimensions vanish. The VEV of the current density along compact dimensions is a periodic function of the gauge field flux with the period equal to the flux quantum. It is decomposed into the boundary-free and brane-induced contributions. The asymptotic behavior of the latter is investigated near the brane, near the AdS boundary and near the horizon. It is shown that, in contrast to the VEVs of the field squared and energy-momentum tensor, the current density is finite on the brane and vanishes for the special case of Dirichlet boundary condition. Both the boundary-free and brane-induced contributions vanish on the AdS boundary. The brane-induced contribution vanishes on the horizon and for points near the horizon the current is dominated by the boundary-free part. In the near-horizon limit, the latter is connected to the corresponding quantity for a massless field in the Minkowski bulk by a simple conformal relation. Depending on the value of the Robin coefficient, the presence of the brane can either
Opportunity for high value-added chemicals from food supply chain wastes.
Matharu, Avtar S; de Melo, Eduardo M; Houghton, Joseph A
2016-09-01
With approximately 1.3 billion tonnes of food wasted per annum, food supply chain wastes (FSCWs) may be viewed as a contemporary Periodic Table of biobased feedstock chemicals (platform molecules) and functional materials. Herein, the global drivers and the case for food waste valorisation are discussed within the context of global sustainability, the sustainable development goals and the bioeconomy. The emerging potential of high value-added chemicals from certain tropical FSCWs is considered, as these are grown in three major geographical areas (Brazil, India and China) and are likely to increase in volume. FSCW in the context of biorefineries is discussed and two case studies are reported: waste potato and orange peel waste. Interestingly, both waste feedstocks, like many others, produce proteins, and with the global demand for vegetable proteins on the rise, proteins from FSCW may become a dominant area. Copyright © 2016 Elsevier Ltd. All rights reserved.
Thermodynamic Volume in AdS/CFT
NASA Astrophysics Data System (ADS)
Kim, Kyung Kiu; Ahn, Byoungjoon
2018-01-01
In this note, we study the extended thermodynamics of AdS black holes obtained by varying the cosmological constant. We find and discuss the pressure and volume of both the bulk and the boundary physics through the AdS/CFT correspondence. In particular, we derive the relation between the thermodynamic volume and a chemical potential for M2-branes dual to four-dimensional AdS space. In addition, we show that the thermodynamic volume of the hyperbolic black hole is related to an `entanglement pressure' coming from a generalized first law of entanglement entropy.
Speck, Olga; Speck, David; Horn, Rafael; Gantner, Johannes; Sedlbauer, Klaus Peter
2017-01-24
Over the last few decades, the systematic approach of knowledge transfer from biological concept generators to technical applications has received increasing attention, particularly because marketable bio-derived developments are often described as sustainable. The objective of this paper is to rationalize and refine the discussion about bio-derived developments also with respect to sustainability by taking descriptive, normative and emotional aspects into consideration. In the framework of supervised learning, a dataset of 70 biology-derived and technology-derived developments characterised by 9 different attributes together with their respective values and assigned to one of 17 classes was created. On the basis of the dataset a decision tree was generated which can be used as a straightforward classification tool to identify biology-derived and technology-derived developments. The validation of the applied learning procedure achieved an average accuracy of 90.0%. Additional extraordinary qualities of technical applications are generally discussed by means of selected biology-derived and technology-derived examples with reference to normative (contribution to sustainability) and emotional aspects (aesthetics and symbolic character). In the context of a case study from the building sector, all aspects are critically discussed.
Statistical tests and identifiability conditions for pooling and analyzing multisite datasets
Zhou, Hao Henry; Singh, Vikas; Johnson, Sterling C.; Wahba, Grace
2018-01-01
When sample sizes are small, the ability to identify weak (but scientifically interesting) associations between a set of predictors and a response may be enhanced by pooling existing datasets. However, variations in acquisition methods and the distribution of participants or observations between datasets, especially due to the distributional shifts in some predictors, may obfuscate real effects when datasets are combined. We present a rigorous statistical treatment of this problem and identify conditions where we can correct the distributional shift. We also provide an algorithm for the situation where the correction is identifiable. We analyze various properties of the framework for testing model fit, constructing confidence intervals, and evaluating consistency characteristics. Our technical development is motivated by Alzheimer’s disease (AD) studies, and we present empirical results showing that our framework enables harmonizing of protein biomarkers, even when the assays across sites differ. Our contribution may, in part, mitigate a bottleneck that researchers face in clinical research when pooling smaller sized datasets and may offer benefits when the subjects of interest are difficult to recruit or when resources prohibit large single-site studies. PMID:29386387
Statistical tests and identifiability conditions for pooling and analyzing multisite datasets.
Zhou, Hao Henry; Singh, Vikas; Johnson, Sterling C; Wahba, Grace
2018-02-13
When sample sizes are small, the ability to identify weak (but scientifically interesting) associations between a set of predictors and a response may be enhanced by pooling existing datasets. However, variations in acquisition methods and the distribution of participants or observations between datasets, especially due to the distributional shifts in some predictors, may obfuscate real effects when datasets are combined. We present a rigorous statistical treatment of this problem and identify conditions where we can correct the distributional shift. We also provide an algorithm for the situation where the correction is identifiable. We analyze various properties of the framework for testing model fit, constructing confidence intervals, and evaluating consistency characteristics. Our technical development is motivated by Alzheimer's disease (AD) studies, and we present empirical results showing that our framework enables harmonizing of protein biomarkers, even when the assays across sites differ. Our contribution may, in part, mitigate a bottleneck that researchers face in clinical research when pooling smaller sized datasets and may offer benefits when the subjects of interest are difficult to recruit or when resources prohibit large single-site studies. Copyright © 2018 the Author(s). Published by PNAS.
NASA Astrophysics Data System (ADS)
Weber, Mark; Coldewey-Egbers, Melanie; Fioletov, Vitali E.; Frith, Stacey M.; Wild, Jeannette D.; Burrows, John P.; Long, Craig S.; Loyola, Diego
2018-02-01
We report on updated trends using different merged datasets from satellite and ground-based observations for the period from 1979 to 2016. Trends were determined by applying a multiple linear regression (MLR) to annual mean zonal mean data. Merged datasets used here include NASA MOD v8.6 and National Oceanic and Atmospheric Administration (NOAA) merge v8.6, both based on data from the series of Solar Backscatter UltraViolet (SBUV) and SBUV-2 satellite instruments (1978-present), as well as the Global Ozone Monitoring Experiment (GOME)-type Total Ozone (GTO) and GOME-SCIAMACHY-GOME-2 (GSG) merged datasets (1995-present), mainly comprising satellite data from GOME, the Scanning Imaging Absorption Spectrometer for Atmospheric Chartography (SCIAMACHY), and GOME-2A. The fifth dataset consists of the monthly mean zonal mean data from ground-based measurements collected at the World Ozone and UV Data Center (WOUDC). The addition of four more years of data since the last World Meteorological Organization (WMO) ozone assessment (2013-2016) shows that for most datasets and regions the trends since stratospheric halogen loading reached its maximum (˜ 1996 globally and ˜ 2000 in polar regions) are mostly not significantly different from zero. However, for some latitudes, in particular the Southern Hemisphere extratropics and Northern Hemisphere subtropics, several datasets show small positive trends of slightly below +1 % decade⁻¹ that are barely statistically significant at the 2σ uncertainty level. In the tropics, only two datasets show significant trends of +0.5 to +0.8 % decade⁻¹, while the others show near-zero trends. Positive trends since 2000 have been observed over Antarctica in September, but near-zero trends are found in October as well as in March over the Arctic. Uncertainties due to possible drifts between the datasets, from the merging procedure used to combine satellite datasets and related to the low sampling of ground-based data, are not accounted for in the trend
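The core of the trend estimation described here, an MLR fit to an annual-mean series with the trend expressed per decade, can be sketched as follows. A real assessment regression includes solar, QBO and other proxy terms; this sketch only gestures at those via an optional `proxies` argument, and the numbers are synthetic.

```python
import numpy as np

def mlr_trend(years, ozone, proxies=()):
    """Least-squares trend per decade of an annual-mean series.

    Fits ozone ~ intercept + slope * t (+ proxy terms); `proxies` is a
    stand-in for the solar/QBO/aerosol indices a full MLR would include.
    """
    t = (np.asarray(years, float) - np.mean(years)) / 10.0   # time in decades
    X = np.column_stack([np.ones_like(t), t, *proxies])
    coef, *_ = np.linalg.lstsq(X, np.asarray(ozone, float), rcond=None)
    return float(coef[1])   # trend per decade, in the data's units

# A synthetic series with a built-in +1.5-per-decade trend is recovered:
yrs = np.arange(1979, 2017)
trend = mlr_trend(yrs, 300.0 + 0.15 * (yrs - 1979))
```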
NASA Astrophysics Data System (ADS)
Butler, P. G.; Scourse, J. D.; Richardson, C. A.; Wanamaker, A. D., Jr.
2009-04-01
Determinations of the local correction (ΔR) to the globally averaged marine radiocarbon reservoir age are often isolated in space and time, derived from heterogeneous sources and constrained by significant uncertainties. Although time series of ΔR at single sites can be obtained from sediment cores, these are subject to multiple uncertainties related to sedimentation rates, bioturbation and interspecific variations in the source of radiocarbon in the analysed samples. Coral records provide better resolution, but these are available only for tropical locations. It is shown here that it is possible to use the shell of the long-lived bivalve mollusc Arctica islandica as a source of high-resolution time series of absolutely dated marine radiocarbon determinations for the shelf seas surrounding the North Atlantic Ocean. Annual growth increments in the shell can be crossdated and chronologies can be constructed in precise analogy with the use of tree-rings. Because the calendar dates of the samples are known, ΔR can be determined with high precision and accuracy, and because all the samples are from the same species, the time series of ΔR values possesses a high degree of internal consistency. Presented here is a multi-centennial (AD 1593 - AD 1933) time series of 31 ΔR values for a site in the Irish Sea close to the Isle of Man. The mean value of ΔR (−62 ¹⁴C yr) does not change significantly during this period but increased variability is apparent before AD 1750.
Balachandran, U.; Dusek, J.T.; Kleefisch, M.S.; Kobylinski, T.P.
1996-11-12
A functionally gradient material for a membrane reactor for converting methane gas into value-added-products includes an outer tube of perovskite, which contacts air; an inner tube which contacts methane gas, of zirconium oxide, and a bonding layer between the perovskite and zirconium oxide layers. The bonding layer has one or more layers of a mixture of perovskite and zirconium oxide, with the layers transitioning from an excess of perovskite to an excess of zirconium oxide. The transition layers match thermal expansion coefficients and other physical properties between the two different materials. 7 figs.
Balachandran, Uthamalingam; Dusek, Joseph T.; Kleefisch, Mark S.; Kobylinski, Thadeus P.
1996-01-01
A functionally gradient material for a membrane reactor for converting methane gas into value-added-products includes an outer tube of perovskite, which contacts air; an inner tube which contacts methane gas, of zirconium oxide, and a bonding layer between the perovskite and zirconium oxide layers. The bonding layer has one or more layers of a mixture of perovskite and zirconium oxide, with the layers transitioning from an excess of perovskite to an excess of zirconium oxide. The transition layers match thermal expansion coefficients and other physical properties between the two different materials.
Realistic computer network simulation for network intrusion detection dataset generation
NASA Astrophysics Data System (ADS)
Payer, Garrett
2015-05-01
The KDD-99 Cup dataset is dead. While it can continue to be used as a toy example, the age of this dataset makes it all but useless for intrusion detection research and data mining. Many of the attacks used within the dataset are obsolete and do not reflect the features important for intrusion detection in today's networks. Creating a new dataset encompassing a large cross-section of the attacks found on the Internet today could be useful, but it would eventually fall prey to the same problem as the KDD-99 Cup: its usefulness would diminish after a period of time. To continue research into intrusion detection, the generation of new datasets needs to be as dynamic and as quick as the attacker. Simply examining existing network traffic and using domain experts such as intrusion analysts to label traffic is inefficient, expensive, and not scalable. The only viable methodology is simulation using technologies including virtualization, attack toolsets such as Metasploit and Armitage, and sophisticated emulation of threat and user behavior. Simulating actual user behavior and network intrusion events dynamically not only allows researchers to vary scenarios quickly, but enables online testing of intrusion detection mechanisms by interacting with data as it is generated. As new threat behaviors are identified, they can be added to the simulation to make quicker determinations as to the effectiveness of existing and ongoing network intrusion technology, methodology and models.
Logarithmic corrections to entropy of magnetically charged AdS4 black holes
NASA Astrophysics Data System (ADS)
Jeon, Imtak; Lal, Shailesh
2017-11-01
Logarithmic terms are quantum corrections to black hole entropy determined completely from classical data, thus providing a strong check for candidate theories of quantum gravity purely from physics in the infrared. We compute these terms in the entropy associated to the horizon of a magnetically charged extremal black hole in AdS4×S7 using the quantum entropy function and discuss the possibility of matching against recently derived microscopic expressions.
Deans, Rachel; Wade, Shawna
2011-01-01
Growing demand from clients waiting to access vital services in a healthcare sector under economic constraint, coupled with the pressure for ongoing improvement within a multi-faceted organization, can have a significant impact on the front-line staff, who are essential to the successful implementation of any quality improvement initiative. The Lean methodology is a management system for continuous improvement based on the Toyota Production System; it focuses on two main themes: respect for people and the elimination of waste or non-value-added activities. Within the Lean process, value-added is used to describe any activity that contributes directly to satisfying the needs of the client, and non-value-added refers to any activity that takes time, space or resources but does not contribute directly to satisfying client needs. Through the revision of existing models of service delivery, the authors' organization has made an impact on increasing access to care and has supported successful engagement of staff in the process, while ensuring that the focus remains on the central needs of clients and families accessing services. While the performance metrics continue to exhibit respectable results for this strategic priority, further gains are expected over the next 18-24 months.
On piecewise interpolation techniques for estimating solar radiation missing values in Kedah
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saaban, Azizan; Zainudin, Lutfi; Bakar, Mohd Nazari Abu
2014-12-04
This paper discusses the use of a piecewise interpolation method based on cubic Ball and Bézier curve representations to estimate missing solar radiation values in Kedah. An hourly solar radiation dataset was collected at the Alor Setar Meteorology Station and obtained from the Malaysian Meteorology Department. The piecewise cubic Ball and Bézier functions that interpolate the data points are defined on each hourly interval of solar radiation measurement and are obtained by prescribing first-order derivatives at the starts and ends of the intervals. We compare the performance of our proposed method with existing methods using Root Mean Squared Error (RMSE) and Coefficient of Determination (CoD), based on simulated missing-value datasets. The results show that our method outperforms the previous methods.
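A piecewise cubic interpolant with prescribed first derivatives at the interval ends can be sketched with the Hermite basis, together with the RMSE used for validation. The paper uses cubic Ball and Bézier representations; the basis below differs, but the endpoint-derivative idea is the same, and all names are illustrative.

```python
import numpy as np

def hermite_segment(t0, t1, y0, y1, d0, d1, tq):
    """Cubic on [t0, t1] matching end values y0, y1 and end slopes d0, d1."""
    h = t1 - t0
    s = (np.asarray(tq, float) - t0) / h              # normalised position
    h00 = 2*s**3 - 3*s**2 + 1                         # Hermite basis functions
    h10 = s**3 - 2*s**2 + s
    h01 = -2*s**3 + 3*s**2
    h11 = s**3 - s**2
    return h00*y0 + h10*h*d0 + h01*y1 + h11*h*d1

def rmse(estimated, truth):
    """Root mean squared error between estimates and reference values."""
    e = np.asarray(estimated, float) - np.asarray(truth, float)
    return float(np.sqrt(np.mean(e**2)))

# The segment reproduces f(x) = x**3 exactly (end values 0, 1; slopes 0, 3):
filled = hermite_segment(0.0, 1.0, 0.0, 1.0, 0.0, 3.0, [0.25, 0.5, 0.75])
```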
ERIC Educational Resources Information Center
Nakamura, Yugo
2013-01-01
Value-added models (VAMs) have received considerable attention as a tool to transform our public education system. However, VAMs are studied by researchers from a broad range of academic disciplines who remain divided over the best methods for analyzing the models, and stakeholders without extensive statistical backgrounds have been excluded…
An empirical method for deriving RBE values associated with electrons, photons and radionuclides.
Bellamy, M; Puskin, J; Hertel, N; Eckerman, K
2015-12-01
There is substantial evidence to justify using relative biological effectiveness (RBE) values of >1 for low-energy electrons and photons. But, in the field of radiation protection, radiation associated with low linear energy transfer has been assigned a radiation weighting factor wR of 1. This value may be suitable for radiation protection but, for risk considerations, it is important to evaluate the potential elevated biological effectiveness of radiation to improve the quality of risk estimates. RBE values between 2 and 3 for tritium are implied by several experimental measurements. Additionally, elevated RBE values have been found for other similar low-energy radiation sources. In this work, RBE values are derived for electrons based upon the fractional deposition of absorbed dose of energies less than a few kiloelectron volts. Using this empirical method, RBE values were also derived for monoenergetic photons and 1070 radionuclides from ICRP Publication 107 for which photons and electrons are the primary emissions. Published by Oxford University Press 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Loftus, Stacie K
2018-05-01
The number of melanocyte- and melanoma-derived next generation sequence genome-scale datasets have rapidly expanded over the past several years. This resource guide provides a summary of publicly available sources of melanocyte cell derived whole genome, exome, mRNA and miRNA transcriptome, chromatin accessibility and epigenetic datasets. Also highlighted are bioinformatic resources and tools for visualization and data queries which allow researchers a genome-scale view of the melanocyte. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.
Biochemical transformation of lignin for deriving valued commodities from lignocellulose
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gall, Daniel L.; Ralph, John; Donohue, Timothy J.
The biochemical properties of lignin present major obstacles to deriving societally beneficial entities from lignocellulosic biomass, an abundant and renewable feedstock. Similar to other biopolymers such as polysaccharides, polypeptides, and ribonucleic acids, lignin polymers are derived from multiple types of monomeric units. However, lignin’s renowned recalcitrance is largely attributable to its racemic nature and the variety of covalent inter-unit linkages through which its aromatic monomers are linked. Indeed, unlike other biopolymers whose monomers are consistently inter-linked by a single type of covalent bond, the monomeric units in lignin are linked via non-enzymatic, combinatorial radical coupling reactions that give rise to a variety of inter-unit covalent bonds in mildly branched racemic polymers. Yet, despite the chemical complexity and stability of lignin, significant strides have been made in recent years to identify routes through which valued commodities can be derived from it. This paper discusses emerging biological and biochemical means through which degradation of lignin to aromatic monomers can lead to the derivation of commercially valuable products.
Biochemical transformation of lignin for deriving valued commodities from lignocellulose
Gall, Daniel L.; Ralph, John; Donohue, Timothy J.; ...
2017-03-24
The biochemical properties of lignin present major obstacles to deriving societally beneficial entities from lignocellulosic biomass, an abundant and renewable feedstock. Similar to other biopolymers such as polysaccharides, polypeptides, and ribonucleic acids, lignin polymers are derived from multiple types of monomeric units. However, lignin’s renowned recalcitrance is largely attributable to its racemic nature and the variety of covalent inter-unit linkages through which its aromatic monomers are linked. Indeed, unlike other biopolymers whose monomers are consistently inter-linked by a single type of covalent bond, the monomeric units in lignin are linked via non-enzymatic, combinatorial radical coupling reactions that give rise to a variety of inter-unit covalent bonds in mildly branched racemic polymers. Yet, despite the chemical complexity and stability of lignin, significant strides have been made in recent years to identify routes through which valued commodities can be derived from it. This paper discusses emerging biological and biochemical means through which degradation of lignin to aromatic monomers can lead to the derivation of commercially valuable products.
An innovative privacy preserving technique for incremental datasets on cloud computing.
Aldeen, Yousra Abdul Alsahib S; Salleh, Mazleena; Aljeroudi, Yazan
2016-08-01
Cloud computing (CC) is a magnificent service-based delivery model with gigantic computer processing power and data storage across connected communication channels. It has imparted overwhelming technological impetus to the internet-mediated IT industry, where users can easily share private data for further analysis and mining. Furthermore, user-friendly CC services make it economical to deploy sundry applications. Meanwhile, simple data sharing has impelled various phishing attacks and malware-assisted security threats. Some privacy-sensitive applications, such as health services on the cloud, that are built with several economic and operational benefits necessitate enhanced security. Thus, absolute cyberspace security and mitigation against phishing attacks became mandatory to protect overall data privacy. Typically, diverse application datasets are anonymized with better privacy for owners without providing all secrecy requirements for newly added records. Some proposed techniques have addressed this issue by re-anonymizing the datasets from scratch. The utmost privacy protection over incremental datasets on CC is far from being achieved. Certainly, the distribution of huge dataset volumes across multiple storage nodes limits privacy preservation. In this view, we propose a new anonymization technique to attain better privacy protection with high data utility over distributed and incremental datasets on CC. The proficiency of data privacy preservation and improved confidentiality requirements is demonstrated through performance evaluation. Copyright © 2016 Elsevier Inc. All rights reserved.
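The privacy property this abstract targets can be illustrated with a k-anonymity check over quasi-identifiers. This is a generic baseline criterion, not the paper's own algorithm, and all field names below are made up:

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True if every combination of quasi-identifier values is shared by
    at least k records (a standard baseline privacy criterion)."""
    groups = Counter(tuple(rec[q] for q in quasi_ids) for rec in records)
    return all(count >= k for count in groups.values())

# Adding one record with a unique quasi-identifier tuple breaks 2-anonymity:
cohort = [{"zip": "537**", "age": "30-39"},
          {"zip": "537**", "age": "30-39"},
          {"zip": "537**", "age": "40-49"}]
```

Incremental releases are exactly where such a check degrades: each newly added record must be re-checked against the already published groups, which motivates the re-anonymization cost the abstract describes.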
USDA-ARS's Scientific Manuscript database
Value added applications are needed for peanut meal, which is the high protein byproduct of commercial peanut oil production. Peanut meal dispersions were hydrolyzed with alcalase, flavourzyme and pepsin in an effort to improve functional and nutritional properties of the resulting water soluble ex...
Statistical Reference Datasets
National Institute of Standards and Technology Data Gateway
Statistical Reference Datasets (Web, free access) The Statistical Reference Datasets project is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
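StRD certifies results so that a package can be scored by how many significant digits it reproduces, conventionally via the log relative error (LRE). A minimal sketch of that scoring idea; the four-point dataset and its "certified" slope are hypothetical stand-ins for a real StRD problem:

```python
import math

def lre(computed, certified):
    """Log relative error: roughly the number of correct significant
    digits the software reproduces (capped at double precision)."""
    if computed == certified:
        return 15.0
    return min(15.0, -math.log10(abs(computed - certified) / abs(certified)))

# Hypothetical dataset whose least-squares slope is exactly 2
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]
n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

certified_slope = 2.0
print(lre(slope, certified_slope))  # high LRE = software agrees with certification
```

A real evaluation would run the software under test against each StRD dataset and report the minimum LRE over its certified quantities.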
Wani, K.A.; Mamta; Rao, R.J.
2013-01-01
Solid waste management is a worldwide problem, and it is becoming more complicated day by day due to rising population, industrialization and changes in our lifestyle. Transformation of industrial sludges into vermicompost is of double interest: on the one hand, a waste is converted into a value added product, and, on the other, it controls a pollutant that is a consequence of increasing industrialization. Garden waste, kitchen waste and cow dung were recycled through vermicomposting using the epigeic earthworm Eisenia fetida under field conditions. The pH, moisture content, total organic carbon, humus, nitrogen, phosphorus and potassium in the vermicompost were analysed. Moisture content, total organic carbon, humus, nitrogen, phosphorus and potassium were highest in cow dung, followed by kitchen waste and garden waste. This study clearly indicates that vermicomposting of garden waste, kitchen waste and cow dung can not only produce a value added product (vermicompost) but at the same time reduce the quantity of waste. PMID:23961230
NASA Technical Reports Server (NTRS)
Armstrong, Edward; Tauer, Eric
2013-01-01
The presentation focused on describing a new dataset lifecycle policy that the NASA Physical Oceanography DAAC (PO.DAAC) has implemented for its new and current datasets to foster improved stewardship and consistency across its archive. The overarching goal is to implement this dataset lifecycle policy for all new GHRSST GDS2 datasets and bridge the mission statements from the GHRSST Project Office and PO.DAAC to provide the best quality SST data in a cost-effective, efficient manner, preserving its integrity so that it will be available and usable to a wide audience.
Faber, Irene R; Pion, Johan; Munivrana, Goran; Faber, Niels R; Nijhuis-Van der Sanden, Maria W G
2017-04-18
Talent detection intends to support lifelong sports participation, reduce dropouts and stimulate sports at the elite level. For this purpose it is important to reveal the specific profile which directs children to the sports that connect to their strengths and preferences. This study evaluated a perceptuomotor skills assessment as part of talent detection for table tennis, a sport in which perceptuomotor skills are considered essential to cope with the difficult technical aspects. Primary school children (n = 121) and gifted young table tennis players (n = 146) were assessed using the Dutch perceptuomotor skills assessment measuring "ball control" and "gross motor function". A discriminant function analysis confirmed the added value by identifying primary school children fitting the table tennis perceptuomotor profile of the young gifted table tennis players (28%). General linear model analyses for the assessment's individual test items showed that the table tennis players outperformed their primary school peers on all "ball control" items (P < 0.001). In conclusion, the assessment appears to be of added value for talent detection in table tennis at this young age. Longitudinal studies need to reveal the predictive value for sports participation and elite sports.
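The discriminant function analysis mentioned above was presumably run in a statistics package; a pure-Python sketch of the underlying idea, Fisher's linear discriminant on two features standing in for "ball control" and "gross motor function" scores, might look like this. All scores below are invented for illustration:

```python
def fisher_direction(class_a, class_b):
    """Fisher discriminant direction w = Sw^-1 (mu_a - mu_b) for 2-D features."""
    def mean(rows):
        n = len(rows)
        return [sum(r[i] for r in rows) / n for i in (0, 1)]

    def scatter(rows, mu):
        s = [[0.0, 0.0], [0.0, 0.0]]
        for r in rows:
            d = [r[0] - mu[0], r[1] - mu[1]]
            for i in (0, 1):
                for j in (0, 1):
                    s[i][j] += d[i] * d[j]
        return s

    ma, mb = mean(class_a), mean(class_b)
    sa, sb = scatter(class_a, ma), scatter(class_b, mb)
    sw = [[sa[i][j] + sb[i][j] for j in (0, 1)] for i in (0, 1)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    diff = [ma[0] - mb[0], ma[1] - mb[1]]
    # Invert the 2x2 within-class scatter matrix analytically
    w = [(sw[1][1] * diff[0] - sw[0][1] * diff[1]) / det,
         (-sw[1][0] * diff[0] + sw[0][0] * diff[1]) / det]
    return w, ma, mb

# Hypothetical (ball control, gross motor) scores
players = [(8.0, 7.0), (9.0, 6.5), (8.5, 7.5)]
school = [(5.0, 6.8), (4.5, 7.2), (5.5, 6.0)]
w, ma, mb = fisher_direction(players, school)
# A new child is classified by projecting onto w and comparing with
# the midpoint of the projected class means.
```

In the study's terms, children whose projection falls on the "gifted players" side of the boundary would be flagged as fitting the table tennis perceptuomotor profile.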
AIP1OGREN: Aerosol Observing Station Intensive Properties Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koontz, Annette; Flynn, Connor
The aip1ogren value-added product (VAP) computes several aerosol intensive properties. It requires as input calibrated, corrected, aerosol extensive properties (scattering and absorption coefficients, primarily) from the Aerosol Observing Station (AOS). Aerosol extensive properties depend on both the nature of the aerosol and the amount of the aerosol. We compute several properties as relationships between the various extensive properties. These intensive properties are independent of aerosol amount and instead relate to intrinsic properties of the aerosol itself. Along with the original extensive properties we report aerosol single-scattering albedo, hemispheric backscatter fraction, asymmetry parameter, and Ångström exponent for scattering and absorption with one-minute averaging. An hourly averaged file is produced from the 1-minute files that includes all extensive and intensive properties as well as submicron scattering and submicron absorption fractions. Finally, in both the minutely and hourly files the aerosol radiative forcing efficiency is provided.
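The intensive properties named above are ratios of the extensive inputs; a sketch using the standard textbook formulas (the numeric inputs are hypothetical, not actual AOS data, and this is not the VAP's implementation):

```python
import math

def single_scattering_albedo(scat, absorb):
    """SSA = scattering / (scattering + absorption): amount-independent."""
    return scat / (scat + absorb)

def backscatter_fraction(backscat, scat):
    """Hemispheric backscatter fraction b = backscatter / total scattering."""
    return backscat / scat

def angstrom_exponent(sigma1, sigma2, wl1, wl2):
    """Angstrom exponent from coefficients at two wavelengths (nm)."""
    return -math.log(sigma1 / sigma2) / math.log(wl1 / wl2)

# Hypothetical extensive inputs (Mm^-1) at 450 and 700 nm
ssa = single_scattering_albedo(scat=30.0, absorb=3.0)
b = backscatter_fraction(backscat=3.0, scat=30.0)
ae = angstrom_exponent(60.0, 30.0, 450.0, 700.0)
print(round(ssa, 3), b, round(ae, 2))
```

Because each quantity is a ratio, doubling the aerosol amount doubles every extensive coefficient and leaves these outputs unchanged, which is exactly why they characterize the aerosol itself.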
NASA Technical Reports Server (NTRS)
Weber, Mark; Coldewey-Egbers, Melanie; Fioletov, Vitali E.; Frith, Stacey M.; Wild, Jeannette D.; Burrows, John P.; Loyola, Diego
2018-01-01
We report on updated trends using different merged datasets from satellite and ground-based observations for the period from 1979 to 2016. Trends were determined by applying a multiple linear regression (MLR) to annual mean zonal mean data. Merged datasets used here include NASA MOD v8.6 and National Oceanic and Atmospheric Administration (NOAA) merge v8.6, both based on data from the series of Solar Backscatter UltraViolet (SBUV) and SBUV-2 satellite instruments (1978–present), as well as the Global Ozone Monitoring Experiment (GOME)-type Total Ozone (GTO) and GOME-SCIAMACHY-GOME-2 (GSG) merged datasets (1995–present), mainly comprising satellite data from GOME, the Scanning Imaging Absorption Spectrometer for Atmospheric Chartography (SCIAMACHY), and GOME-2A. The fifth dataset consists of the monthly mean zonal mean data from ground-based measurements collected at the World Ozone and UV Data Center (WOUDC). The addition of four more years of data since the last World Meteorological Organization (WMO) ozone assessment (2013-2016) shows that for most datasets and regions the trends since the stratospheric halogen reached its maximum (approximately 1996 globally and approximately 2000 in polar regions) are mostly not significantly different from zero. However, for some latitudes, in particular the Southern Hemisphere extratropics and Northern Hemisphere subtropics, several datasets show small positive trends of slightly below +1 percent decade(exp. -1) that are barely statistically significant at the 2-sigma uncertainty level. In the tropics, only two datasets show significant trends of +0.5 to +0.8 percent decade(exp. -1), while the others show near-zero trends. Positive trends since 2000 have been observed over Antarctica in September, but near-zero trends are found in October as well as in March over the Arctic. Uncertainties due to possible drifts between the datasets, from the merging procedure used to combine satellite datasets and related to the low sampling of
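The paper's MLR includes multiple explanatory terms; a reduced sketch of the core step, fitting a linear trend to annual means and flagging significance at the 2-sigma level, is shown below. The ozone series is synthetic, purely for illustration:

```python
def linear_trend(years, values):
    """OLS slope and its standard error for an annual time series.
    A trend is flagged 'significant' here if |slope| > 2 * stderr."""
    n = len(years)
    mx = sum(years) / n
    my = sum(values) / n
    sxx = sum((x - mx) ** 2 for x in years)
    slope = sum((x - mx) * (y - my) for x, y in zip(years, values)) / sxx
    intercept = my - slope * mx
    residuals = [y - (intercept + slope * x) for x, y in zip(years, values)]
    se = (sum(r * r for r in residuals) / (n - 2) / sxx) ** 0.5
    return slope, se

# Synthetic annual-mean series: +0.2 DU/yr trend plus alternating noise
years = list(range(2000, 2017))
ozone = [300 + 0.2 * (y - 2000) + ((-1) ** y) * 1.5 for y in years]
slope, se = linear_trend(years, ozone)
significant = abs(slope) > 2 * se
```

A full MLR as in the paper would add regressors for known drivers (e.g. solar cycle, QBO) before judging the residual trend, which generally shrinks the standard error of the trend term.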
Starch--value addition by modification.
Tharanathan, Rudrapatnam N
2005-01-01
Starch is one of the most important but flexible food ingredients, possessing value added attributes for innumerable industrial applications. Its various chemically modified derivatives offer great scope of high technological value in both food and non-food industries. Modified starches are designed to overcome one or more of the shortcomings of native starches, such as loss of viscosity and thickening power upon cooking and storage (particularly at low pH), retrogradation characteristics, and syneresis. Oxidation, esterification, hydroxyalkylation, dextrinization, and cross-linking are some of the modifications commonly employed to prepare starch derivatives. In a way, starch modification provides desirable functional attributes as well as offering an economic alternative to other hydrocolloid ingredients, such as gums and mucilages, which are unreliable in quality and availability. Resistant starch, a highly retrograded starch fraction formed upon food processing, is another useful starch derivative. It exhibits beneficial physiological effects of therapeutic and nutritional value akin to dietary fiber. There remains considerable opportunity for future developments, especially for tailor-made starch derivatives with multiple modifications and with the desired functional and nutritional properties, although the problem of obtaining legislative approval for the use of novel starch derivatives in processed food formulations is still under debate. Nevertheless, it can be predicted that new ventures in starch modifications and their diverse applications will continue to be of great interest in applied research.
High value added lipids produced by microorganisms: a potential use of sugarcane vinasse.
Fernandes, Bruna Soares; Vieira, João Paulo Fernandes; Contesini, Fabiano Jares; Mantelatto, Paulo Eduardo; Zaiat, Marcelo; Pradella, José Geraldo da Cruz
2017-12-01
This review aims to present an innovative concept of high value added lipids produced by heterotrophic microorganisms, bacteria and fungi, using carbon sources, such as sugars, acids and alcohols, that could come from sugarcane vinasse, the main byproduct of ethanol production released in the distillation step. Vinasse is a rich carbon source and low-cost feedstock produced in large amounts from ethanol production. For 2019, the Brazilian Ministry of Agriculture, Livestock and Food Supply estimates that domestic ethanol consumption will grow to 58.8 billion liters, more than double the 2008 amount. This represents the annual production of more than 588 billion liters of vinasse, which is currently used as a fertilizer in the sugarcane crop due to its high concentration of minerals, mainly potassium. However, studies indicate some disadvantages, such as the generation of greenhouse gas emissions during vinasse distribution in the crop, as well as the possibility of contaminating groundwater and soil. Therefore, the development of programs for sustainable use of vinasse is a priority. One profitable alternative is the fermentation of vinasse, followed by an anaerobic digester, in order to obtain biomaterials such as lipids, other byproducts, and methane. Promising high value added lipids, for instance carotenoids and polyunsaturated fatty acids (PUFAs), with a predicted market of millions of US$, could be produced using vinasse as a carbon source, guiding an innovative concept for sustainable production. Examples of lipids obtained from the fermentation of compounds present in vinasse are vitamin D, produced by yeast fermentation of sucrose, and omega-3 fatty acids, which can be obtained by bacterial and fungal fermentation. Additionally, several other compounds present in vinasse can be used for this purpose, including sucrose, ethanol, lactate, pyruvate, acetate and other carbon sources. Finally, this paper illustrates the potential market and
ERIC Educational Resources Information Center
Reinvention Center, 2004
2004-01-01
This document presents the proceedings of the Reinvention Center's second major conference, "Integrating Research into Undergraduate Education: The Value Added," co-sponsored by the National Science Foundation and the Woodrow Wilson National Fellowship Foundation. The goal of the conference was to distill the distinct characteristics of the…
NASA Technical Reports Server (NTRS)
Cornford, S.; Gibbel, M.
1997-01-01
NASA's Code QT Test Effectiveness Program is funding a series of applied research activities focused on utilizing the principles of the physics and engineering of failure, and those of engineering economics, to assess and improve the value added by the various validation and verification activities in organizations.
ERIC Educational Resources Information Center
Guarino, Cassandra M.; Maxfield, Michelle; Reckase, Mark D.; Thompson, Paul; Wooldridge, Jeffrey M.
2014-01-01
Empirical Bayes' (EB) estimation is a widely used procedure to calculate teacher value-added. It is primarily viewed as a way to make imprecise estimates more reliable. In this paper we review the theory of EB estimation and use simulated data to study its ability to properly rank teachers. We compare the performance of EB estimators with that of…
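The shrinkage step at the heart of EB estimation can be sketched in a few lines: each raw teacher effect is pulled toward the grand mean in proportion to its estimation noise. The raw estimates and variance components below are hypothetical:

```python
def eb_shrink(raw_effects, error_vars, true_var):
    """Empirical Bayes shrinkage of noisy teacher value-added estimates.
    reliability = true variance / (true variance + sampling error variance);
    a noisier estimate (larger error variance) is shrunk more."""
    grand = sum(raw_effects) / len(raw_effects)
    shrunk = []
    for raw, ev in zip(raw_effects, error_vars):
        reliability = true_var / (true_var + ev)  # in (0, 1)
        shrunk.append(grand + reliability * (raw - grand))
    return shrunk

# Two hypothetical teachers with equal-magnitude raw effects but
# different precision: the second is measured much more noisily.
raw = [0.4, -0.4]
shrunk = eb_shrink(raw, error_vars=[0.01, 0.09], true_var=0.03)
```

This asymmetric shrinkage is precisely why EB can re-rank teachers relative to their raw estimates, which is the property the paper's simulations examine.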
Relative Error Evaluation of Typical Open Global DEM Datasets in Shanxi Plateau of China
NASA Astrophysics Data System (ADS)
Zhao, S.; Zhang, S.; Cheng, W.
2018-04-01
Produced from radar data or stereo remote sensing image pairs, global DEM datasets are one of the most important types of DEM data. Relative error reflects the quality of the surface represented by DEM data, so it matters for geomorphological and hydrological applications of DEM data. Taking the Shanxi Plateau of China as the study area, this research evaluated the relative error of typical open global DEM datasets including Shuttle Radar Topography Mission (SRTM) data with 1 arc second resolution (SRTM1), SRTM data with 3 arc second resolution (SRTM3), ASTER global DEM data in the second version (GDEM-v2) and ALOS world 3D-30m (AW3D) data. Through processing and selection, more than 300,000 ICESat/GLA14 points were used as GCP data, and the vertical error was computed and compared among the four typical global DEM datasets. Then, more than 2,600,000 ICESat/GLA14 point pairs were acquired using a distance threshold between 100 m and 500 m. Meanwhile, the horizontal distance between every point pair was computed, so the relative error was derived as slope values based on the vertical error difference and the horizontal distance of each point pair. Finally, a false slope ratio (FSR) index was computed by analyzing the difference between DEM and ICESat/GLA14 values for every point pair. Both the relative error and the FSR index were compared by category for the four DEM datasets under different slope classes. Research results show: overall, AW3D has the lowest relative error values in mean error, mean absolute error, root mean square error and standard deviation error; then the SRTM1 data, whose values are a little higher than those of AW3D; the SRTM3 and GDEM-v2 data have the highest relative error values, and the values for these two datasets are similar. Considering different slope conditions, all four DEM datasets perform better in flat areas and worse in sloping regions; AW3D has the best performance in all slope classes, a little better than SRTM1; with slope increasing
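The error metrics described above reduce to a few arithmetic steps: a per-point vertical error against the ICESat/GLA14 reference, then a slope-like relative error for point pairs inside the 100-500 m distance window. A minimal sketch, where the coordinates, DEM heights, and reference heights are hypothetical:

```python
import math

def vertical_errors(dem_heights, ref_heights):
    """Per-point vertical error: DEM value minus ICESat/GLA14 reference."""
    return [d - g for d, g in zip(dem_heights, ref_heights)]

def relative_error_slopes(points, dmin=100.0, dmax=500.0):
    """For each point pair within the distance window, express relative
    error as a slope: |vertical-error difference| / horizontal distance.
    points: list of (x, y, dem_height, ref_height) tuples in metres."""
    slopes = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            x1, y1, d1, r1 = points[i]
            x2, y2, d2, r2 = points[j]
            dist = math.hypot(x2 - x1, y2 - y1)
            if dmin <= dist <= dmax:
                dv = (d1 - r1) - (d2 - r2)
                slopes.append(abs(dv) / dist)
    return slopes

# Two hypothetical points 300 m apart
pts = [(0.0, 0.0, 105.0, 103.0), (300.0, 0.0, 98.0, 99.0)]
print(relative_error_slopes(pts))
```

Summary statistics (mean, mean absolute, RMSE, standard deviation) over these slope values, binned by terrain slope class, then give the dataset comparison reported in the abstract.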
Biotransformation of lignocellulosic materials into value-added products-A review.
Bilal, Muhammad; Asgher, Muhammad; Iqbal, Hafiz M N; Hu, Hongbo; Zhang, Xuehong
2017-05-01
In the past decade, with key biotechnological advancements, lignocellulosic materials have gained particular importance. In view of pressing global economic, environmental and energy issues, research scientists have been redirecting their interest toward (re)-valorizing naturally occurring lignocellulosic materials. In this context, lignin-modifying enzymes (LMEs) have gained considerable attention in numerous industrial and biotechnological processes. However, their lower catalytic efficiencies and operational stabilities limit their practical and multipurpose applications in various sectors. Therefore, significant progress in enzyme biotechnology has been made to expand the range of natural industrial biocatalysts such as LMEs. Owing to the abundant availability of lignocellulose along with LMEs, in combination with scientific advances of the biotechnological era, solid-phase biocatalysts can be economically tailored on a large scale. This review article first briefly outlines lignocellulosic materials as a potential source for biotransformation into value-added products, including composites, fine chemicals, nutraceuticals, and enzymes, as well as delignification processes. Comprehensive information is also given on the purification and characterization of LMEs to present their potential for the industrial and biotechnological sector. Copyright © 2017 Elsevier B.V. All rights reserved.
2013-10-01
Keywords: exchange (COBie), Building Information Modeling (BIM), value-added analysis, business processes, project management. The innovative aspect of Building Information Modeling (BIM) is that it creates a computable building description. The ability to use a … interoperability. In order for the building information to be interoperable, it must also conform to a common data model, or schema, that defines the class
Derived crop management data for the LandCarbon Project
Schmidt, Gail; Liu, Shu-Guang; Oeding, Jennifer
2011-01-01
products are used as input to the LandCarbon models to represent the historic and the future scenario management data. The overall algorithm to generate each of the gridded management products is based on the land cover and the derived crop type. For each year in the land cover dataset, the algorithm loops through each 250-meter pixel in the ecoregion. If the current pixel in the land cover dataset is an agriculture pixel, then the crop type is determined. Once the crop type is derived, then the crop harvest, manure, fertilizer, tillage, and cover crop values are derived independently for that crop type. The following is the overall algorithm used for the set of derived grids. The specific algorithm to generate each management dataset is discussed in the respective section for that dataset, along with special data handling and a description of the output product.
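The per-pixel derivation loop described above can be sketched as follows. The crop names, lookup tables, and pixel keys are hypothetical; the real algorithm operates on 250-meter gridded land cover per ecoregion and per year:

```python
def derive_management(land_cover, crop_type_of, lookups):
    """For each agriculture pixel, derive the crop type, then look up the
    harvest, manure, fertilizer, tillage, and cover-crop values for it."""
    managed = {}
    for pixel, cover in land_cover.items():
        if cover != "agriculture":
            continue  # non-agriculture pixels get no management values
        crop = crop_type_of[pixel]
        managed[pixel] = {name: table[crop] for name, table in lookups.items()}
    return managed

# Hypothetical 250-m pixels (row, col) and per-crop lookup tables
land_cover = {(0, 0): "agriculture", (0, 1): "forest"}
crop_type_of = {(0, 0): "corn"}
lookups = {
    "harvest": {"corn": 0.5},      # illustrative fraction removed
    "fertilizer": {"corn": 150.0}, # illustrative rate, e.g. kg N/ha
}
out = derive_management(land_cover, crop_type_of, lookups)
```

Each lookup table here stands in for one of the independently derived management grids (harvest, manure, fertilizer, tillage, cover crop) that the text says are produced per crop type.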