LONGITUDINAL COHORT METHODS STUDIES
Accurate exposure classification tools are required to link exposure with health effects in epidemiological studies. Exposure classification for occupational studies is relatively easy compared to predicting residential childhood exposures. Recent NHEXAS (Maryland) study articl...
Accurate exposure classification tools are required to link exposure with health effects in epidemiological studies. Long-term, time-integrated exposure measures would be desirable to address the problem of developing appropriate residential childhood exposure classifications. ...
METHODS STUDIES FOR THE NATIONAL CHILDREN'S STUDY: SEMIPERMEABLE MEMBRANE DEVICE (SPMD)
Accurate exposure classification tools are required to link exposure with health effects in epidemiological studies. Although long-term integrated exposure measurements are a critical component of exposure assessment, the ability to include these measurements into epidemiologic...
METHODS STUDIES FOR THE NATIONAL CHILDREN'S STUDY: MOLECULARLY IMPRINTED POLYMERS
Accurate exposure classification tools are required to link exposure with health effects in epidemiological studies. Although long-term integrated exposure measurements are a critical component of exposure assessment, the ability to include these measurements into epidemiologic...
EXPOSURE ASSESSMENT METHODS DEVELOPMENT PILOTS FOR THE NATIONAL CHILDREN'S STUDY
Accurate exposure classification tools are needed to link exposure with health effects. EPA began methods development pilot studies in 2000 to address general questions about exposures and outcome measures. Selected pilot studies are highlighted in this poster. The “Literature Re...
Final Ecosystem Goods and Services Classification System (FEGS-CS)
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools, presenting comprehensive step-by-step guidance and links to relevant exposure assessment databases.
RELIABILITY OF BIOMARKERS OF PESTICIDE EXPOSURE AMONG CHILDREN AND ADULTS IN CTEPP OHIO
Urinary biomarkers offer the potential for providing an efficient tool for exposure classification by reflecting the aggregate of all exposure routes. Substantial variability observed in urinary pesticide metabolite concentrations over short periods of time, however, has cast so...
Ko, Yi-An; Mukherjee, Bhramar; Smith, Jennifer A; Kardia, Sharon L R; Allison, Matthew; Diez Roux, Ana V
2016-11-01
There has been increased interest in identifying gene-environment interactions (G × E) in the context of multiple environmental exposures. Most G × E studies analyze one exposure at a time, yet in reality we are exposed to many environmental factors simultaneously. Efficient analysis strategies for modeling complex G × E with multiple environmental factors in a single model are still lacking. Using data from the Multiethnic Study of Atherosclerosis, we illustrate a two-step approach for modeling G × E with multiple environmental factors. First, we utilize common clustering and classification strategies (e.g., k-means, latent class analysis, classification and regression trees, Bayesian clustering using the Dirichlet process) to define subgroups corresponding to distinct environmental exposure profiles. Second, we illustrate the use of an additive main effects and multiplicative interaction model, instead of the conventional saturated interaction model using product terms of factors, to study G × E with the data-driven exposure subgroups defined in the first step. We demonstrate useful analytical approaches to translate multiple environmental exposures into one summary class. These tools not only allow researchers to consider several environmental exposures in G × E analysis but also provide some insight into how genes modify the effect of a comprehensive exposure profile, instead of examining effect modification for each exposure in isolation.
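A minimal sketch of this two-step approach, on simulated data with hypothetical variable names (pm25, noise, green, genotype, outcome are illustrative, not MESA variables). Step 1 clusters standardized exposures with k-means; step 2 fits a conventional product-term G × E model against the derived subgroups (the AMMI alternative proposed in the abstract is not shown):

```python
# Two-step G x E sketch: cluster exposures, then test genotype-by-profile interaction.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "pm25": rng.normal(10, 3, n),       # hypothetical exposures
    "noise": rng.normal(55, 8, n),
    "green": rng.normal(0.4, 0.1, n),
    "genotype": rng.integers(0, 3, n),  # additive coding 0/1/2
})
df["outcome"] = rng.normal(0, 1, n)

# Step 1: summarize multiple exposures into profile subgroups.
X = StandardScaler().fit_transform(df[["pm25", "noise", "green"]])
df["profile"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Step 2: G x E with the data-driven exposure subgroup as the "E".
model = smf.ols("outcome ~ genotype * C(profile)", data=df).fit()
print(model.summary())
```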
Hartling, Lisa; Bond, Kenneth; Santaguida, P Lina; Viswanathan, Meera; Dryden, Donna M
2011-08-01
To develop and test a study design classification tool. We contacted relevant organizations and individuals to identify tools used to classify study designs and ranked these using predefined criteria. The highest ranked tool was a design algorithm developed, but no longer advocated, by the Cochrane Non-Randomized Studies Methods Group; this was modified to include additional study designs and decision points. We developed a reference classification for 30 studies; 6 testers applied the tool to these studies. Interrater reliability (Fleiss' κ) and accuracy against the reference classification were assessed. The tool was further revised and retested. Initial reliability was fair among the testers (κ=0.26) and the reference standard raters (κ=0.33). Testing after revisions showed improved reliability (κ=0.45, moderate agreement) with improved, but still low, accuracy. The most common disagreements were whether the study design was experimental (5 of 15 studies) and whether there was a comparison of any kind (4 of 15 studies). Agreement was higher among testers who had completed graduate-level training than among those who had not. The moderate reliability and low accuracy may be due to lack of clarity and comprehensiveness of the tool, inadequate reporting of the studies, and variability in tester characteristics. The results may not be generalizable to all published studies, as the test studies were selected because they had posed challenges for previous reviewers with respect to their design classification. Application of such a tool should be accompanied by training, pilot testing, and context-specific decision rules. Copyright © 2011 Elsevier Inc. All rights reserved.
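The interrater reliability statistic used here, Fleiss' kappa for multiple raters, is available in statsmodels; the ratings matrix below is invented for illustration:

```python
# Fleiss' kappa across raters assigning study-design categories.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# rows = studies, columns = raters, values = design category (0..3)
ratings = np.array([
    [0, 0, 1, 0, 0, 1],
    [2, 2, 2, 3, 2, 2],
    [1, 1, 0, 1, 1, 1],
    [3, 3, 3, 3, 2, 3],
])
table, _ = aggregate_raters(ratings)  # per-study counts of each category
print("Fleiss' kappa:", fleiss_kappa(table))
```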
Radiographic readings for asbestosis: misuse of science--validation of the ILO classification.
Miller, Albert
2007-01-01
Radiographic readings for pneumoconiosis (both asbestosis and silicosis), even those using the International Labour Office (ILO) Classification, have received widespread negative coverage in the media and strong judicial rebuke. The medical literature over the past 90 years was reviewed for the relationships between radiographic severity (standardized as the ILO profusion score) and indices of exposure to silica or asbestos, tissue burden of silica particles or asbestos fibers, histologic fibrosis, various measurements of pulmonary function and mortality. Evidence from many different disciplines has demonstrated that the ILO profusion score correlates with occupational exposure, dust burden in the lung, histologic fibrosis and, more recently, with physiologic impairment and mortality. The ILO Classification has therefore been validated as a scientific tool. Its fraudulent misuse by "hired-gun" physicians, attorneys and elements of the compensation system to falsify claims of asbestosis and/or silicosis (often in the same claimant) must be condemned. (c) 2006 Wiley-Liss, Inc.
Evaluation of air quality zone classification methods based on ambient air concentration exposure.
Freeman, Brian; McBean, Ed; Gharabaghi, Bahram; Thé, Jesse
2017-05-01
Air quality zones are used by regulatory authorities to implement ambient air standards in order to protect human health. Air quality measurements at discrete air monitoring stations are critical tools to determine whether an air quality zone complies with local air quality standards or is noncompliant. This study presents a novel approach for evaluation of air quality zone classification methods by breaking the concentration distribution of a pollutant measured at an air monitoring station into compliance and exceedance probability density functions (PDFs) and then using Monte Carlo analysis with the Central Limit Theorem to estimate long-term exposure. The purpose of this paper is to compare the risk associated with selecting one ambient air classification approach over another by testing the possible exposure an individual living within a zone may face. The chronic daily intake (CDI) is utilized to compare different pollutant exposures over the classification duration of 3 years between two classification methods. Historical data collected from air monitoring stations in Kuwait are used to build representative models of 1-hr NO2 and 8-hr O3 within a zone that meets the compliance requirements of each method. The first method, the "3 Strike" method, is a conservative approach based on a winner-take-all approach common with most compliance classification methods, while the second, the 99% Rule method, allows for more robust analyses and incorporates long-term trends. A Monte Carlo analysis is used to model the CDI for each pollutant and each method with the zone at a single station and with multiple stations. The model assumes that the zone is already in compliance with air quality standards over the 3 years under the different classification methodologies. The model shows that while the CDI of the two methods differs by only 2.7% over the exposure period for the single-station case, the large number of samples taken over the period increases the sensitivity of the statistical tests, causing the null hypothesis of equal exposures to be rejected. Local air quality managers can use either methodology to classify the compliance of an air zone, but must accept that the 99% Rule method may allow exposures that are statistically significantly higher than those under the 3 Strike method. A novel method using the Central Limit Theorem and Monte Carlo analysis is used to directly compare different air standard compliance classification methods by estimating the chronic daily intake of pollutants. This method allows air quality managers to rapidly see how individual classification methods may impact individual population groups, as well as to evaluate different pollutants based on dosage and exposure when complete health impacts are not known.
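A hedged sketch of the Monte Carlo CDI comparison described here. The compliance/exceedance mixture, the intake-equation parameters, and all numbers are illustrative assumptions, not the paper's fitted Kuwait models:

```python
# Monte Carlo CDI from a mixture of compliance and exceedance lognormal PDFs.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

def sample_concentration(p_exceed, lo_params, hi_params):
    """Mix a compliance PDF and an exceedance PDF (both lognormal here)."""
    exceed = rng.random(n) < p_exceed
    c = rng.lognormal(*lo_params, n)
    c[exceed] = rng.lognormal(*hi_params, n)[exceed]
    return c

# CDI = concentration * intake rate * exposure days / (body weight * averaging days)
def cdi(conc_ugm3, ir_m3_day=20.0, bw_kg=70.0, ef_days=3 * 365, at_days=3 * 365):
    return conc_ugm3 * ir_m3_day * ef_days / (bw_kg * at_days)

cdi_3strike = cdi(sample_concentration(0.01, (3.0, 0.4), (4.0, 0.3)))
cdi_99rule = cdi(sample_concentration(0.03, (3.0, 0.4), (4.0, 0.3)))
print(f"mean CDI, 3 Strike: {cdi_3strike.mean():.1f} ug/kg-day")
print(f"mean CDI, 99% Rule: {cdi_99rule.mean():.1f} ug/kg-day")
print(f"relative difference: {abs(cdi_99rule.mean() / cdi_3strike.mean() - 1):.1%}")
```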
Matgéné: a program to develop job-exposure matrices in the general population in France.
Févotte, Joëlle; Dananché, Brigitte; Delabre, Laurène; Ducamp, Stephane; Garras, Loïc; Houot, Marie; Luce, Danièle; Orlowski, Ewa; Pilorget, Corinne; Lacourt, Aude; Brochard, Patrick; Goldberg, Marcel; Imbernon, Ellen
2011-10-01
Matgéné is a program to develop job-exposure matrices (JEMs) adapted to the general population in France for the period since 1950. The aim is to create retrospective exposure assessment tools for estimating the prevalence of occupational exposure to various agents that can then be correlated with health-related parameters. JEMs were drawn up by a team of six industrial hygienists who based their assessments on available occupational measurement data, economic and statistical data, and several thousand job descriptions from epidemiological studies performed in France since 1984. Each JEM is specific to one agent, assessing exposure for a set of homogeneous combinations (occupation × activity × period) according to two occupational classifications (ISCO 1968 and PCS 1994) and one economic activities classification (NAF 2000). The cells of the JEM carry an estimate of the probability and level of exposure. Level is estimated from the duration and intensity of exposure-linked tasks, or from a description of the tasks when exposure measurement data are lacking for the agent in question. The JEMs were applied to a representative sample of the French population in 2007, and prevalence for each exposure was estimated in various population groups. All documents and data are available on a dedicated website. By the end of 2010, 18 JEMs had been developed and eight were under development, concerning a variety of chemical agents: organic and mineral dust, mineral fibers, and solvents. Through implementation in the French population, exposure prevalences were calculated at different dates and for complete careers, and attributable risk fractions were estimated for certain pathologies. Some of these results were validated by comparison with those of other programs. Initial Matgéné JEM results are in agreement with the French and international literature, thus validating the methodology. The precision of exposure estimates, however, varies between agents and with the amount of exposure measurement data available. These JEMs are important epidemiological tools, and improving their quality will require investment in occupational health data harvesting, especially in the case of low-level exposures.
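Applying a JEM of this kind reduces to a table join on occupation code and period. A toy sketch in pandas, with invented ISCO-1968-style codes and cell values:

```python
# JEM lookup: join job-history records to JEM cells keyed on occupation x period.
import pandas as pd

jem = pd.DataFrame({
    "isco": ["7-54.30", "7-54.30", "9-59.20"],
    "period": ["1950-1969", "1970-1989", "1950-1969"],
    "probability": [0.7, 0.4, 0.2],  # proportion of workers exposed
    "intensity": [2, 1, 1],          # ordinal exposure level
})

jobs = pd.DataFrame({
    "subject": [101, 101, 102],
    "isco": ["7-54.30", "9-59.20", "7-54.30"],
    "period": ["1950-1969", "1950-1969", "1970-1989"],
})

merged = jobs.merge(jem, on=["isco", "period"], how="left")
print(merged)
# Crude prevalence estimate: share of jobs with a high exposure probability.
print("prevalence:", (merged["probability"].fillna(0) > 0.5).mean())
```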
Machine learning algorithms for mode-of-action classification in toxicity assessment.
Zhang, Yile; Wong, Yau Shu; Deng, Jian; Anton, Cristina; Gabos, Stephan; Zhang, Weiping; Huang, Dorothy Yu; Jin, Can
2016-01-01
Real Time Cell Analysis (RTCA) technology is used to monitor cellular changes continuously over the entire exposure period. Combined with different testing concentrations, the resulting profiles have potential for probing the mode of action (MOA) of the tested substances. In this paper, we present machine learning approaches for MOA assessment. Computational tools based on artificial neural networks (ANN) and support vector machines (SVM) are developed to analyze the time-concentration response curves (TCRCs) of human cell lines responding to tested chemicals. The techniques are capable of learning from TCRCs with known MOA information and then making MOA classifications for unknown toxicants. A novel data processing step based on the wavelet transform is introduced to extract important features from the original TCRC data. From the dose response curves, the time interval leading to a higher classification success rate can be selected as input to enhance the performance of the machine learning algorithm. This is particularly helpful when handling cases with limited and imbalanced data. The validation of the proposed method is demonstrated by applying the supervised learning algorithm to the exposure data of the HepG2 cell line to 63 chemicals with 11 concentrations in each test case. Classification success rates in the range of 85 to 95% are obtained using SVM for MOA classification with two to four clusters. The wavelet transform is capable of capturing important features of TCRCs for MOA classification. The proposed SVM scheme incorporating the wavelet transform has great potential for large-scale MOA classification and high-throughput chemical screening.
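A sketch of this wavelet-feature/SVM pipeline on simulated TCRCs; the wavelet family (db4), the decomposition level, and the curve shapes are assumptions, not the paper's settings:

```python
# Extract DWT approximation coefficients from each curve, then classify with SVM.
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

def simulate_tcrc(moa, n_points=96):
    t = np.linspace(0, 72, n_points)  # hours
    base = np.exp(-t / 30) if moa == 0 else 1 - np.exp(-t / 30)
    return base + rng.normal(0, 0.05, n_points)

curves = [simulate_tcrc(moa) for moa in (0, 1) for _ in range(40)]
labels = [moa for moa in (0, 1) for _ in range(40)]

# Features: level-3 approximation coefficients of a Daubechies-4 DWT.
features = np.array([pywt.wavedec(c, "db4", level=3)[0] for c in curves])

clf = SVC(kernel="rbf", C=1.0)
print("CV accuracy:", cross_val_score(clf, features, labels, cv=5).mean())
```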
NASA Astrophysics Data System (ADS)
Ramos, M. Rosário; Carolino, E.; Viegas, Carla; Viegas, Sandra
2016-06-01
Health effects associated with occupational exposure to particulate matter have been studied by several authors. For this study, six industries from five different areas were selected: Cork company 1, Cork company 2, poultry, slaughterhouse for cattle, riding arena, and production of animal feed. The measurement tool was a portable direct-reading device, which provides the particle number concentration for six different diameters, namely 0.3 µm, 0.5 µm, 1 µm, 2.5 µm, 5 µm and 10 µm. The focus on these features is because they might be more closely related to adverse health effects. The aim is to identify the particle sizes that best discriminate the industries, with the ultimate goal of classifying industries regarding potential negative effects on workers' health. Several methods of discriminant analysis were applied to the data on occupational exposure to particulate matter and compared with respect to classification accuracy. The selected methods were linear discriminant analysis (LDA); quadratic discriminant analysis (QDA); robust linear discriminant analysis with selected estimators (MLE (Maximum Likelihood Estimators), MVE (Minimum Volume Ellipsoid), "t", MCD (Minimum Covariance Determinant), MCD-A, MCD-B); multinomial logistic regression; and artificial neural networks (ANN). The predictive accuracy of the methods was assessed through a simulation study. ANN yielded the highest rate of classification accuracy on the data set under study. Results indicate that the particle number concentration at the 0.5 µm diameter is the parameter that best discriminates the industries.
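The model comparison can be reproduced in outline with scikit-learn; the six-channel particle counts below are simulated stand-ins for the industry data, and the robust discriminant variants (MVE, MCD) are omitted:

```python
# Compare classifiers on simulated six-channel particle-count features.
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
# 6 industries x 30 samples x 6 particle-diameter channels (0.3..10 um)
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(30, 6)) for i in range(6)])
y = np.repeat(np.arange(6), 30)

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "multinomial logit": LogisticRegression(max_iter=1000),
    "ANN": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
}
for name, model in models.items():
    print(name, cross_val_score(model, X, y, cv=5).mean().round(3))
```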
Exposure to traffic pollution: comparison between measurements and a model.
Alili, F; Momas, I; Callais, F; Le Moullec, Y; Sacre, C; Chiron, M; Flori, J P
2001-01-01
French researchers from the Building Scientific and Technical Center have produced a traffic-exposure index. To achieve this, they used an air pollution dispersion model that enabled them to calculate automobile pollutant concentrations in front of subjects' residences and places of work. Researchers tested this model at 27 Paris canyon street sites, comparing nitrogen oxides measurements obtained with passive samplers during a 6-wk period against calculations derived from the model. There was a highly significant correlation (r = .83) between the 2 series of values; their mean concentrations were not significantly different. The results suggested that the aforementioned model could be a useful epidemiological tool for the classification of city dwellers by present or even cumulative exposure to automobile air pollution.
Friesen, Melissa C.; Locke, Sarah J.; Tornow, Carina; Chen, Yu-Cheng; Koh, Dong-Hee; Stewart, Patricia A.; Purdue, Mark; Colt, Joanne S.
2014-01-01
Objectives: Lifetime occupational history (OH) questionnaires often use open-ended questions to capture detailed information about study participants’ jobs. Exposure assessors use this information, along with responses to job- and industry-specific questionnaires, to assign exposure estimates on a job-by-job basis. An alternative approach is to use information from the OH responses and the job- and industry-specific questionnaires to develop programmable decision rules for assigning exposures. As a first step in this process, we developed a systematic approach to extract the free-text OH responses and convert them into standardized variables that represented exposure scenarios. Methods: Our study population comprised 2408 subjects, reporting 11991 jobs, from a case–control study of renal cell carcinoma. Each subject completed a lifetime OH questionnaire that included verbatim responses, for each job, to open-ended questions including job title, main tasks and activities (task), tools and equipment used (tools), and chemicals and materials handled (chemicals). Based on a review of the literature, we identified exposure scenarios (occupations, industries, tasks/tools/chemicals) expected to involve possible exposure to chlorinated solvents, trichloroethylene (TCE) in particular, lead, and cadmium. We then used a SAS macro to review the information reported by study participants to identify jobs associated with each exposure scenario; this was done using previously coded standardized occupation and industry classification codes, and a priori lists of associated key words and phrases related to possibly exposed tasks, tools, and chemicals. Exposure variables representing the occupation, industry, and task/tool/chemicals exposure scenarios were added to the work history records of the study respondents. Our identification of possibly TCE-exposed scenarios in the OH responses was compared to an expert’s independently assigned probability ratings to evaluate whether we missed identifying possibly exposed jobs. Results: Our process added exposure variables for 52 occupation groups, 43 industry groups, and 46 task/tool/chemical scenarios to the data set of OH responses. Across all four agents, we identified possibly exposed task/tool/chemical exposure scenarios in 44–51% of the jobs in possibly exposed occupations. Possibly exposed task/tool/chemical exposure scenarios were found in a nontrivial 9–14% of the jobs not in possibly exposed occupations, suggesting that our process identified important information that would not be captured using occupation alone. Our extraction process was sensitive: for jobs where our extraction of OH responses identified no exposure scenarios and for which the sole source of information was the OH responses, only 0.1% were assessed as possibly exposed to TCE by the expert. Conclusions: Our systematic extraction of OH information found useful information in the task/chemicals/tools responses that was relatively easy to extract and that was not available from the occupational or industry information. The extracted variables can be used as inputs in the development of decision rules, especially for jobs where no additional information, such as job- and industry-specific questionnaires, is available. PMID:24590110
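The extraction step (a SAS macro in the study) is essentially keyword matching over the free-text task/tool/chemical responses. A minimal Python analogue, with an invented keyword list for TCE-related scenarios:

```python
# Flag jobs whose free-text responses match keywords for a possibly
# TCE-exposed scenario; keywords and records are illustrative only.
import re
import pandas as pd

tce_keywords = [r"\bdegreas\w*", r"\btrichlor\w*", r"\bmetal clean\w*", r"\bsolvent\w*"]
pattern = re.compile("|".join(tce_keywords), flags=re.IGNORECASE)

jobs = pd.DataFrame({
    "job_id": [1, 2, 3],
    "tasks": ["degreasing metal parts", "typing and filing", "painting walls"],
    "chemicals": ["trichloroethylene", "", "latex paint"],
})

free_text = jobs["tasks"].str.cat(jobs["chemicals"], sep=" ")
jobs["tce_scenario"] = free_text.str.contains(pattern)
print(jobs[["job_id", "tce_scenario"]])
```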
This map service displays all air-related layers used in the USEPA Community/Tribal-Focused Exposure and Risk Screening Tool (C/T-FERST) mapping application (https://www.epa.gov/c-ferst). The following data sources (and layers) are contained in this service:USEPA's 2005 National-Scale Air Toxic Assessment (NATA) data. Data are shown at the census tract level (2000 census tract boundaries, US Census Bureau) for Cumulative Cancer and Non-Cancer risks (Neurological and Respiratory) from 139 air toxics. In addition, individual pollutant estimates of Ambient Concentration, Exposure Concentration, Cancer, and Non-Cancer risks (Neurological and Respiratory) are provided for: Acetaldehyde, Acrolein, Arsenic, Benzene, 1,3-Butadiene, Chromium, Diesel PM, Formaldehyde, Lead, Naphthalene, and Polycyclic Aromatic Hydrocarbon (PAH). The original Access tables were downloaded from USEPA's Office of Air and Radiation (OAR) https://www.epa.gov/national-air-toxics-assessment/2005-national-air-toxics-assessment. The data classification (defined interval) for this map service was developed for USEPA's Office of Research and Development's (ORD) Community-Focused Exposure and Risk Screening Tool (C-FERST) per guidance provided by OAR.The 2005 NATA provides information on 177 of the 187 Clean Air Act air toxics (https://www.epa.gov/sites/production/files/2015-10/documents/2005-nata-pollutants.pdf) plus diesel particulate matter (diesel PM was assessed for non-cancer only). For addit
Evaluating terrain based criteria for snow avalanche exposure ratings using GIS
NASA Astrophysics Data System (ADS)
Delparte, Donna; Jamieson, Bruce; Waters, Nigel
2010-05-01
Snow avalanche terrain in backcountry regions of Canada is increasingly being assessed based upon the Avalanche Terrain Exposure Scale (ATES). ATES is a terrain based classification introduced in 2004 by Parks Canada to identify "simple", "challenging" and "complex" backcountry areas. The ATES rating system has been applied to well over 200 backcountry routes, has been used in guidebooks, trailhead signs and maps and is part of the trip planning component of the AVALUATOR™, a simple decision-support tool for backcountry users. Geographic Information Systems (GIS) offers a means to model and visualize terrain based criteria through the use of digital elevation model (DEM) and land cover data. Primary topographic variables such as slope, aspect and curvature are easily derived from a DEM and are compatible with the equivalent evaluation criteria in ATES. Other components of the ATES classification are difficult to extract from a DEM as they are not strictly terrain based. An overview is provided of the terrain variables that can be generated from DEM and land cover data; criteria from ATES which are not clearly terrain based are identified for further study or revision. The second component of this investigation was the development of an algorithm for inputting suitable ATES criteria into a GIS, thereby mimicking the process avalanche experts use when applying the ATES classification to snow avalanche terrain. GIS based classifications were compared to existing expert assessments for validity. The advantage of automating the ATES classification process through GIS is to assist avalanche experts with categorizing and mapping remote backcountry terrain.
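For the DEM-derived terrain variables mentioned above, a sketch of the slope-based step; the synthetic DEM and the slope cut-offs are illustrative placeholders, not the published ATES criteria:

```python
# Derive slope from a DEM grid and bin it into three terrain classes.
import numpy as np

x, y = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
dem = 1500 + 800 * np.sin(3 * x) * np.cos(2 * y)  # smooth synthetic DEM, metres
cell = 25.0                                        # grid spacing, metres

dzdy, dzdx = np.gradient(dem, cell)
slope_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# 0 = "simple", 1 = "challenging", 2 = "complex" (placeholder thresholds)
ates = np.digitize(slope_deg, bins=[25.0, 35.0])
print("cells per class:", np.bincount(ates.ravel(), minlength=3))
```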
Kapellusch, Jay M; Bao, Stephen S; Silverstein, Barbara A; Merryweather, Andrew S; Thiese, Mathew S; Hegmann, Kurt T; Garg, Arun
2017-12-01
The Strain Index (SI) and the American Conference of Governmental Industrial Hygienists (ACGIH) Threshold Limit Value for Hand Activity Level (TLV for HAL) use different constituent variables to quantify task physical exposures. Similarly, time-weighted-average (TWA), Peak, and Typical exposure techniques to quantify physical exposure from multi-task jobs make different assumptions about each task's contribution to the whole job exposure. Thus, task and job physical exposure classifications differ depending upon which model and technique are used for quantification. This study examines exposure classification agreement, disagreement, correlation, and magnitude of classification differences between these models and techniques. Data from 710 multi-task job workers performing 3,647 tasks were analyzed using the SI and TLV for HAL models, as well as with the TWA, Typical and Peak job exposure techniques. Physical exposures were classified as low, medium, and high using each model's recommended, or a priori limits. Exposure classification agreement and disagreement between models (SI, TLV for HAL) and between job exposure techniques (TWA, Typical, Peak) were described and analyzed. Regardless of technique, the SI classified more tasks as high exposure than the TLV for HAL, and the TLV for HAL classified more tasks as low exposure. The models agreed on 48.5% of task classifications (kappa = 0.28) with 15.5% of disagreement between low and high exposure categories. Between-technique (i.e., TWA, Typical, Peak) agreement ranged from 61-93% (kappa: 0.16-0.92) depending on whether the SI or TLV for HAL was used. There was disagreement between the SI and TLV for HAL and between the TWA, Typical and Peak techniques. Disagreement creates uncertainty for job design, job analysis, risk assessments, and developing interventions. Task exposure classifications from the SI and TLV for HAL might complement each other. However, TWA, Typical, and Peak job exposure techniques all have limitations. Part II of this article examines whether the observed differences between these models and techniques produce different exposure-response relationships for predicting prevalence of carpal tunnel syndrome.
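The TWA and Peak techniques differ only in how per-task scores are aggregated before banding. A toy comparison with invented scores and cut points (not the SI or TLV for HAL a priori limits):

```python
# Classify multi-task jobs by TWA vs Peak aggregation, then compare agreement.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(4)
jobs = [rng.uniform(0, 10, rng.integers(2, 6)) for _ in range(200)]  # task scores

def classify(score, cuts=(3.0, 7.0)):
    """Map a continuous exposure score to 0=low, 1=medium, 2=high."""
    return int(score >= cuts[0]) + int(score >= cuts[1])

twa, peak = [], []
for scores in jobs:
    weights = rng.dirichlet(np.ones(len(scores)))  # task time fractions
    twa.append(classify(float(np.dot(scores, weights))))
    peak.append(classify(scores.max()))

print("agreement:", np.mean(np.array(twa) == np.array(peak)))
print("kappa:", cohen_kappa_score(twa, peak))
```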
Muscatiello, Neil; Wilson, Lloyd; Dziewulski, David
2016-01-01
We identified hospital visits with reported exposure to harmful algal blooms, an emerging public health concern because of toxicity and increased incidence. We used the World Health Organization’s International Classification of Disease (ICD) medical code specifying environmental exposure to harmful algal blooms to extract hospital visit records in New York State from 2008 to 2014. Using the ICD code, we identified 228 hospital visits with reported exposure to harmful algal blooms. They occurred all year long and had multiple principal diagnoses. Of all hospital visits, 94.7% were managed in the emergency department and 5.3% were hospitalizations. As harmful algal bloom surveillance increases, the ICD code will be a beneficial tool to public health only if used properly. PMID:26794161
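Record extraction of this kind reduces to filtering discharge data on one external-cause code. A minimal pandas sketch; the records are invented, and E928.6 is used here as the ICD-9-CM code for harmful-algal-bloom exposure:

```python
# Filter hospital-visit records on an ICD external-cause code.
import pandas as pd

visits = pd.DataFrame({
    "visit_id": [1, 2, 3, 4],
    "setting": ["ED", "ED", "inpatient", "ED"],
    "ecodes": [["E928.6"], ["E816.0"], ["E928.6"], []],
})

hab = visits[visits["ecodes"].apply(lambda codes: "E928.6" in codes)]
print(len(hab), "HAB-related visits")
print((hab["setting"] == "ED").mean(), "proportion managed in the ED")
```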
Friesen, Melissa C; Locke, Sarah J; Tornow, Carina; Chen, Yu-Cheng; Koh, Dong-Hee; Stewart, Patricia A; Purdue, Mark; Colt, Joanne S
2014-06-01
Lifetime occupational history (OH) questionnaires often use open-ended questions to capture detailed information about study participants' jobs. Exposure assessors use this information, along with responses to job- and industry-specific questionnaires, to assign exposure estimates on a job-by-job basis. An alternative approach is to use information from the OH responses and the job- and industry-specific questionnaires to develop programmable decision rules for assigning exposures. As a first step in this process, we developed a systematic approach to extract the free-text OH responses and convert them into standardized variables that represented exposure scenarios. Our study population comprised 2408 subjects, reporting 11991 jobs, from a case-control study of renal cell carcinoma. Each subject completed a lifetime OH questionnaire that included verbatim responses, for each job, to open-ended questions including job title, main tasks and activities (task), tools and equipment used (tools), and chemicals and materials handled (chemicals). Based on a review of the literature, we identified exposure scenarios (occupations, industries, tasks/tools/chemicals) expected to involve possible exposure to chlorinated solvents, trichloroethylene (TCE) in particular, lead, and cadmium. We then used a SAS macro to review the information reported by study participants to identify jobs associated with each exposure scenario; this was done using previously coded standardized occupation and industry classification codes, and a priori lists of associated key words and phrases related to possibly exposed tasks, tools, and chemicals. Exposure variables representing the occupation, industry, and task/tool/chemicals exposure scenarios were added to the work history records of the study respondents. Our identification of possibly TCE-exposed scenarios in the OH responses was compared to an expert's independently assigned probability ratings to evaluate whether we missed identifying possibly exposed jobs. Our process added exposure variables for 52 occupation groups, 43 industry groups, and 46 task/tool/chemical scenarios to the data set of OH responses. Across all four agents, we identified possibly exposed task/tool/chemical exposure scenarios in 44-51% of the jobs in possibly exposed occupations. Possibly exposed task/tool/chemical exposure scenarios were found in a nontrivial 9-14% of the jobs not in possibly exposed occupations, suggesting that our process identified important information that would not be captured using occupation alone. Our extraction process was sensitive: for jobs where our extraction of OH responses identified no exposure scenarios and for which the sole source of information was the OH responses, only 0.1% were assessed as possibly exposed to TCE by the expert. Our systematic extraction of OH information found useful information in the task/chemicals/tools responses that was relatively easy to extract and that was not available from the occupational or industry information. The extracted variables can be used as inputs in the development of decision rules, especially for jobs where no additional information, such as job- and industry-specific questionnaires, is available. Published by Oxford University Press on behalf of the British Occupational Hygiene Society 2014.
Corvaro, M; Gehen, S; Andrews, K; Chatfield, R; Arasti, C; Mehta, J
2016-12-01
Acute systemic (oral, dermal, inhalation) toxicity testing of agrochemical formulations (end-use products) is mainly needed for Classification and Labelling (C&L) and definition of personal protection equipment (PPE). A retrospective analysis of 225 formulations with available in vivo data showed that: A) LD50/LC50 values were above limit doses in <20.2% of cases via the oral route but only in <1% and <2.4% of cases via the dermal and inhalation routes, respectively; B) for each formulation the acute oral toxicity is always equal to or greater than the Acute Toxicity Estimate (ATE) via the other two routes; C) the GHS (Globally Harmonised System) computational method based on ATE, currently of limited acceptance, has very high accuracy and specificity for prediction of agrochemical mixture toxicity according to the internationally established classification thresholds. By integrating this evidence, an exposure- and data-based waiving strategy is proposed to determine classification and adequate PPE and to ensure only triggered animal testing is used. Safety characterisation above 2000 mg/kg body weight or 1.0 mg/L air should not be recommended, based on the agrochemical exposure scenarios. The global implementation of these tools would allow a remarkable reduction (up to 95%) in in vivo testing, often inducing lethality and/or severe toxicity, for agrochemical formulations. Copyright © 2016. Published by Elsevier Inc.
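The GHS computational method referred to in point C estimates a mixture's acute toxicity from ingredient ATEs with the additivity formula 100/ATEmix = Σ(Ci/ATEi), where Ci are ingredient concentrations in percent. A direct transcription, with invented ingredient values:

```python
# GHS additivity formula for the acute toxicity estimate of a mixture.
def ate_mix(ingredients):
    """ingredients: list of (concentration_percent, ate_mg_per_kg).
    Ingredients with unknown ATE are skipped in this toy version; the GHS
    has specific provisions for them."""
    return 100.0 / sum(c / ate for c, ate in ingredients if ate is not None)

# e.g. 10% of an ingredient with oral ATE 300 mg/kg, 5% with ATE 50 mg/kg
print(f"{ate_mix([(10, 300), (5, 50)]):.0f} mg/kg")  # -> 750 mg/kg
```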
Cunningham, Barbara Jane; Hidecker, Mary Jo Cooley; Thomas-Stonell, Nancy; Rosenbaum, Peter
2018-05-01
In this paper, we present our experiences - both successes and challenges - in implementing evidence-based classification tools into clinical practice. We also make recommendations for others wanting to promote the uptake and application of new research-based assessment tools. We first describe classification systems and the benefits of using them in both research and practice. We then present a theoretical framework from Implementation Science to report strategies we have used to implement two research-based classification tools into practice. We also illustrate some of the challenges we have encountered by reporting results from an online survey investigating 58 speech-language pathologists' knowledge and use of the Communication Function Classification System (CFCS), a new tool to classify children's functional communication skills. We offer recommendations for researchers wanting to promote the uptake of new tools in clinical practice. Specifically, we identify structural, organizational, innovation, practitioner, and patient-related factors that we recommend researchers address in the design of implementation interventions. Roles and responsibilities of both researchers and clinicians in making implementation science a success are presented. Implications for rehabilitation: Promoting uptake of new and evidence-based tools into clinical practice is challenging. Implementation science can help researchers to close the knowledge-to-practice gap. Using concrete examples, we discuss our experiences in implementing evidence-based classification tools into practice within a theoretical framework. Recommendations are provided for researchers wanting to implement new tools in clinical practice. Implications for researchers and clinicians are presented.
ERIC Educational Resources Information Center
Funk, Kerri L.; Tseng, M. S.
Two groups of 32 educable mentally retarded children (ages 7 to 14 years) were compared as to their arithmetic and classification performances attributable to the presence or absence of a 4 1/2 week exposure to classification tasks. The randomized block pretest-posttest design was used. The experimental group and the control group were matched on…
Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel
2014-11-01
With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. With Monte Carlo simulation we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's Kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping, antenna- and site location. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output. Copyright © 2014 Elsevier Inc. All rights reserved.
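The validation metrics used above translate directly into scipy/scikit-learn calls; here the "model" is just a noisy transform of simulated measurements, standing in for the RF-EMF propagation model:

```python
# Spearman correlation and weighted kappa on exposure tertiles,
# modelled vs measured values (both simulated for illustration).
import numpy as np
import pandas as pd
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(5)
measured = rng.lognormal(mean=-1.0, sigma=1.0, size=252)
modelled = measured * rng.lognormal(0.0, 1.5, 252)  # large input uncertainty

rho, _ = spearmanr(measured, modelled)
tert_meas = pd.qcut(measured, 3, labels=False)
tert_mod = pd.qcut(modelled, 3, labels=False)
kappa = cohen_kappa_score(tert_meas, tert_mod, weights="linear")
print(f"Spearman rho = {rho:.2f}, weighted kappa = {kappa:.2f}")
```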
Schyllert, Christian; Andersson, Martin; Hedman, Linnea; Ekström, Magnus; Backman, Helena; Lindberg, Anne; Rönmark, Eva
2018-01-01
Objectives : To evaluate the ability of three different job title classification systems to identify subjects at risk for respiratory symptoms and asthma by also taking the effect of exposure to vapours, gas, dust, and fumes (VGDF) into account. Background : Respiratory symptoms and asthma may be caused by occupational factors. There are different ways to classify occupational exposure. In this study, self-reported occupational exposure to vapours, gas, dust and fumes was used as well as job titles classifed into occupational and socioeconomic Groups according to three different systems. Design: This was a large population-based study of adults aged 30-69 years in Northern Sweden ( n = 9,992, 50% women). Information on job titles, VGDF-exposure, smoking habits, asthma and respiratory symptoms was collected by a postal survey. Job titles were used for classification into socioeconomic and occupational groups based on three classification systems; Socioeconomic classification (SEI), the Nordic Occupations Classification 1983 (NYK), and the Swedish Standard Classification of Occupations 2012 (SSYK). Associations were analysed by multivariable logistic regression. Results : Occupational exposure to VGDF was a risk factor for all respiratory symptoms and asthma (odds ratios (ORs) 1.3-2.4). Productive cough was associated with the socioeconomic groups of manual workers (ORs 1.5-2.1) and non-manual employees (ORs 1.6-1.9). These groups include occupations such as construction and transportation workers, service workers, nurses, teachers and administration clerks which by the SSYK classification were associated with productive cough (ORs 2.4-3.7). Recurrent wheeze was significantly associated with the SEI group manual workers (ORs 1.5-1.7). After adjustment for also VGDF, productive cough remained significantly associated with the SEI groups manual workers in service and non-manual employees, and the SSYK-occupational groups administration, service, and elementary occupations. Conclusions : In this cross-sectional study, two of the three different classification systems, SSYK and SEI gave similar results and identified groups with increased risk for respiratory symptoms while NYK did not give conclusive results. Furthermore, several associations were independent of exposure to VGDF indicating that also other job-related factors than VGDF are of importance.
U.S. Geological Survey ArcMap Sediment Classification tool
O'Malley, John
2007-01-01
The U.S. Geological Survey (USGS) ArcMap Sediment Classification tool is a custom toolbar that extends the Environmental Systems Research Institute, Inc. (ESRI) ArcGIS 9.2 Desktop application to aid in the analysis of seabed sediment classification. The tool takes as input either a point data layer with field attributes containing the percentage of gravel, sand, silt, and clay, or four raster data layers each representing a sediment percentage (0-100%) for the grain-size classes sand, gravel, silt, and clay. The tool is designed to analyze the percentages of sediment at a given location and classify the sediments according to either the Folk (1954, 1974) or the Shepard (1954) classification scheme as modified by Schlee (1973). The sediment analysis tool is based upon the USGS SEDCLASS program (Poppe et al., 2004).
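A much-simplified illustration of the kind of rule such a tool encodes, a Shepard-style ternary lookup from component percentages; the real Folk and Shepard boundaries are more detailed than these illustrative thresholds:

```python
# Toy Shepard-style sediment classification from percent sand/silt/clay.
def shepard_class(sand: float, silt: float, clay: float) -> str:
    total = sand + silt + clay
    sand, silt, clay = (100 * x / total for x in (sand, silt, clay))
    if sand >= 75: return "sand"
    if silt >= 75: return "silt"
    if clay >= 75: return "clay"
    parts = sorted([(sand, "sand"), (silt, "silt"), (clay, "clay")], reverse=True)
    if parts[2][0] < 20:               # essentially a two-component mixture
        return f"{parts[1][1]}y {parts[0][1]}"
    return "sand-silt-clay"            # central ternary field

print(shepard_class(80, 15, 5))   # -> sand
print(shepard_class(60, 30, 10))  # -> silty sand
print(shepard_class(40, 40, 20))  # -> sand-silt-clay
```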
Assessing Risk for Future Firearms Violence in Young People Who Present to ED.
2017-06-01
A new clinical index tool designed specifically for the emergency environment predicts the risk for future firearms violence in young people 14-24 years of age. The approach employs a brief, 10-point instrument that can be administered in one to two minutes, according to investigators. They also note that while the tool is based on data from a single ED in Flint, MI, it should be applicable to urban EDs in regions with similar characteristics. To create the tool, investigators used data from the Flint Youth Injury Study, an investigation of a group of patients 14-24 years of age who reported using drugs in the previous six months and accessed care at a Level I trauma center. Using a machine learning classification approach, investigators combed through the data, finding that the most predictive factors for firearm violence could be categorized into four domains: peer and partner violence victimization, community violence exposure, peer/family influences, and fighting. Investigators note that, ideally, the tool will be employed along with interventions targeted toward patients at high risk for future firearms violence.
NASA Astrophysics Data System (ADS)
Craig, Paul; Kennedy, Jessie
2008-01-01
An increasingly common approach being taken by taxonomists to define the relationships between taxa in alternative hierarchical classifications is to use a set-based notation which states relationship between two taxa from alternative classifications. Textual recording of these relationships is cumbersome and difficult for taxonomists to manage. While text based GUI tools are beginning to appear which ease the process, these have several limitations. Interactive visual tools offer greater potential to allow taxonomists to explore the taxa in these hierarchies and specify such relationships. This paper describes the Concept Relationship Editor, an interactive visualisation tool designed to support the assertion of relationships between taxonomic classifications. The tool operates using an interactive space-filling adjacency layout which allows users to expand multiple lists of taxa with common parents so they can explore and assert relationships between two classifications.
On the Strength and Validity of Hazard Banding.
Scheffers, Theo; Doornaert, Blandine; Berne, Nathalie; van Breukelen, Gerard; Leplay, Antoine; van Miert, Erik
2016-11-01
Hazard Banding (HB) is a process of allocating chemical substances to bands of increasing health hazard based on their hazard classifications. Recent Control Banding (CB) tools use the classifications of the United Nations Globally Harmonized System (UN GHS) or the European Union Classification, Labelling and Packaging (EU CLP) regulation, grouped over 5 HBs. The use of CB is growing worldwide for the risk control of substances without an Occupational Exposure Limit Value (OELV). Well-known CB tools like HSE-COSHH Essentials, BAuA-Einfaches Maßnahmenkonzept Gefahrstoffe (EMKG), and the DGUV-IFA-Spaltenmodell (IFA) use, however, different GHS/CLP groupings, which may lead to dissimilar HBs and control regimes for individual substances. And as the choice of a CB tool seems to be determined by geography and/or local status, these differences may hamper a global, aligned HSE approach. Therefore, the HB engines of the three public CBs and an in-company (Solvay) CB called 'Occupational Exposure Banding' (S-OEB) were compared mutually and ranked in their relation to the OELV as the 'de facto' standard. This was investigated graphically and with a statistical method based on five strength indicators. A data set of 229 substances with high-quality GHS/CLP classifications and OELVs was used. HB concentration ranges, as linked to S-OEB and COSHH, were validated against the corresponding OELV distributions. The four HB engines allocate between 23 and 64% of the 229 substances to the same bands. The remaining substances differ by at least one band, with IFA placing more substances in a higher hazard band, EMKG doing the opposite, and COSHH and S-OEB in between. The overall strength scores of the S-OEB, IFA, and EMKG HB engines are higher than COSHH's, with S-OEB having the highest overall strength score. The lower ends of the concentration ranges defined for the 3 'highest' hazard bands of S-OEB were in good agreement with the 10th percentiles of the corresponding OELV distributions obtained from the substance data set. The lower ends of the COSHH concentration ranges comply with the 10th percentiles of the COSHH OELV distributions for dust/aerosol but not for vapour/gas substances. Both the S-OEB and COSHH concentration ranges underestimate the overall width of the OELV distributions, which can span 2-3 orders of magnitude. As the performance of the S-OEB HB engine meets our criteria of being at least as good as the public engines, it will be used as a standard within Solvay's global operations. In addition, the method described here to evaluate the strength of HB engines and the validity of their corresponding concentration ranges is a useful tool enabling further developments and worldwide alignment of HB. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
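At its core, an HB engine maps a substance's hazard statements to the most severe band they trigger. A toy engine; the statement-to-band assignments below are invented for illustration and do not reproduce the COSHH, EMKG, IFA, or S-OEB groupings:

```python
# Toy hazard-banding engine: GHS H-codes -> most severe band.
H_TO_BAND = {
    "H303": 1, "H336": 2, "H373": 3, "H351": 4, "H350": 5,  # invented grouping
}

def hazard_band(h_codes):
    """Allocate a substance to the highest band triggered by its H-codes;
    unknown codes default to the lowest band here."""
    return max((H_TO_BAND.get(h, 1) for h in h_codes), default=1)

print(hazard_band(["H336", "H373"]))  # -> 3
print(hazard_band(["H350"]))          # -> 5
```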
Content Classification: Leveraging New Tools and Librarians' Expertise.
ERIC Educational Resources Information Center
Starr, Jennie
1999-01-01
Presents factors for librarians to consider when decision-making about information retrieval. Discusses indexing theory; thesauri aids; controlled vocabulary or thesauri to increase access; humans versus machines; automated tools; product evaluations and evaluation criteria; automated classification tools; content server products; and document…
Implicit structured sequence learning: an fMRI study of the structural mere-exposure effect
Folia, Vasiliki; Petersson, Karl Magnus
2014-01-01
In this event-related fMRI study we investigated the effect of 5 days of implicit acquisition on preference classification by means of an artificial grammar learning (AGL) paradigm based on the structural mere-exposure effect and preference classification using a simple right-linear unification grammar. This allowed us to investigate implicit AGL in a proper learning design by including baseline measurements prior to grammar exposure. After 5 days of implicit acquisition, the fMRI results showed activations in a network of brain regions including the inferior frontal (centered on BA 44/45) and the medial prefrontal regions (centered on BA 8/32). Importantly, and central to this study, the inclusion of a naive preference fMRI baseline measurement allowed us to conclude that these fMRI findings were the intrinsic outcomes of the learning process itself and not a reflection of a preexisting functionality recruited during classification, independent of acquisition. Support for the implicit nature of the knowledge utilized during preference classification on day 5 come from the fact that the basal ganglia, associated with implicit procedural learning, were activated during classification, while the medial temporal lobe system, associated with explicit declarative memory, was consistently deactivated. Thus, preference classification in combination with structural mere-exposure can be used to investigate structural sequence processing (syntax) in unsupervised AGL paradigms with proper learning designs. PMID:24550865
Implicit structured sequence learning: an fMRI study of the structural mere-exposure effect.
Folia, Vasiliki; Petersson, Karl Magnus
2014-01-01
In this event-related fMRI study we investigated the effect of 5 days of implicit acquisition on preference classification by means of an artificial grammar learning (AGL) paradigm based on the structural mere-exposure effect and preference classification using a simple right-linear unification grammar. This allowed us to investigate implicit AGL in a proper learning design by including baseline measurements prior to grammar exposure. After 5 days of implicit acquisition, the fMRI results showed activations in a network of brain regions including the inferior frontal (centered on BA 44/45) and the medial prefrontal regions (centered on BA 8/32). Importantly, and central to this study, the inclusion of a naive preference fMRI baseline measurement allowed us to conclude that these fMRI findings were the intrinsic outcomes of the learning process itself and not a reflection of a preexisting functionality recruited during classification, independent of acquisition. Support for the implicit nature of the knowledge utilized during preference classification on day 5 come from the fact that the basal ganglia, associated with implicit procedural learning, were activated during classification, while the medial temporal lobe system, associated with explicit declarative memory, was consistently deactivated. Thus, preference classification in combination with structural mere-exposure can be used to investigate structural sequence processing (syntax) in unsupervised AGL paradigms with proper learning designs.
Downs, Nathan J; Harrison, Simone L; Chavez, Daniel R Garzon; Parisi, Alfio V
2016-05-01
Classroom teachers located in Queensland, Australia are exposed to high levels of ambient solar ultraviolet radiation as part of the occupational requirement to provide supervision of children during lunch and break times. We investigated the relationship between periods of outdoor occupational radiant exposure and available ambient solar radiation across different teaching classifications and schools, relative to the daily occupational solar ultraviolet radiation (HICNIRP) protection standard of 30 J/m². Self-reported daily sun exposure habits (n=480) and personal radiant exposures were monitored using calibrated polysulphone dosimeters (n=474) in 57 teaching staff from 6 different schools located in tropical north and southern Queensland. Daily radiant exposure patterns among teaching groups were compared to the ambient UV-Index. Personal sun exposures were stratified among teaching classifications, school location, school ownership (government vs non-government), and type (primary vs secondary). Median daily radiant exposures were 15 J/m² and 5 J/m² HICNIRP for schools located in northern and southern Queensland, respectively. Of the 474 analyzed dosimeter-days, 23.0% were found to exceed the solar radiation protection standard, with the highest prevalence found among physical education teachers (57.4% of dosimeter-days), followed by teacher aides (22.6%) and classroom teachers (18.1%). In Queensland, peak outdoor exposure times of teaching staff correspond with periods of extreme UV-Index. The daily occupational HICNIRP radiant exposure standard was exceeded in all schools and in all teaching classifications. Copyright © 2016 Elsevier B.V. All rights reserved.
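The exceedance summary reported above amounts to the fraction of dosimeter-days over the 30 J/m² standard per teaching group; a quick sketch with simulated doses, not the study data:

```python
# Fraction of dosimeter-days exceeding the daily radiant exposure standard.
import numpy as np

rng = np.random.default_rng(6)
groups = {
    "PE teachers": rng.lognormal(3.4, 0.8, 150),
    "teacher aides": rng.lognormal(2.3, 0.9, 150),
    "classroom teachers": rng.lognormal(2.1, 0.9, 174),
}
LIMIT = 30.0  # J/m^2 per day, HICNIRP-based standard
for name, doses in groups.items():
    print(f"{name}: {np.mean(doses > LIMIT):.1%} of dosimeter-days exceed")
```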
Burstyn, Igor; Slutsky, Anton; Lee, Derrick G; Singer, Alison B; An, Yuan; Michael, Yvonne L
2014-05-01
Epidemiologists typically collect narrative descriptions of occupational histories because these are less prone than self-reported exposures to recall bias regarding exposure to a specific hazard. However, the task of coding these narratives can be daunting and prohibitively time-consuming in some settings. The aim of this manuscript is to evaluate, in an epidemiological context, the performance of a computer algorithm that translates narrative job descriptions into codes from a standard occupational classification (2010 Standard Occupational Classification). The fundamental question we address is whether exposure assignment resulting from manual (presumed gold standard) coding of the narratives is materially different from that arising from the application of automated coding. We pursued our work through three motivating examples: assessment of physical demands in the Women's Health Initiative observational study, evaluation of predictors of exposure to coal tar pitch volatiles in the US Occupational Safety and Health Administration's (OSHA) Integrated Management Information System, and assessment of exposure to agents known to cause occupational asthma in a pregnancy cohort. In these diverse settings, we demonstrate that automated coding of occupations results in assignment of exposures that are in reasonable agreement with results that can be obtained through manual coding. The correlation between physical demand scores based on manual and automated job classification schemes was reasonable (r = 0.5). The agreement between predictive probability of exceeding OSHA's permissible exposure level for polycyclic aromatic hydrocarbons, using coal tar pitch volatiles as a surrogate, based on manual and automated coding of jobs was modest (Kendall rank correlation = 0.29). In the case of binary assignment of exposure to asthmagens, we observed that fair to excellent agreement in classifications can be reached, depending on the presence of ambiguity in the assigned job classification (κ = 0.5-0.8). Thus, the success of automated coding appears to depend on the setting and the type of exposure being assessed. Our overall recommendation is that automated translation of short narrative descriptions of jobs for exposure assessment is feasible in some settings and essential for large cohorts, especially if combined with manual coding to both assess reliability of coding and further refine the coding algorithm.
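A stand-in for such an automated coder: TF-IDF nearest-match of a narrative against SOC reference titles. The three-entry SOC snippet is illustrative, and production coders (like the one evaluated above) are considerably more sophisticated:

```python
# Nearest-match occupation coding via TF-IDF cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

soc = {
    "47-2031": "carpenter construction wood framing",
    "29-1141": "registered nurse patient care hospital",
    "53-3032": "heavy truck driver tractor trailer freight",
}
codes, texts = list(soc), list(soc.values())

vec = TfidfVectorizer().fit(texts)
ref = vec.transform(texts)

def code_job(narrative: str) -> str:
    sims = cosine_similarity(vec.transform([narrative]), ref)
    return codes[sims.argmax()]

print(code_job("drove a freight truck between warehouses"))  # -> 53-3032
```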
Oliveira, Gisele Augusto Rodrigues; Ducas, Rafael do Nascimento; Teixeira, Gabriel Campos; Batista, Aline Carvalho; Oliveira, Danielle Palma; Valadares, Marize Campos
2015-09-01
Eye irritation evaluation is mandatory for predicting health risks to consumers exposed to textile dyes. The two dyes Reactive Orange 16 (RO16) and Reactive Green 19 (RG19) are classified as Category 2A (irritating to eyes) based on the UN Globally Harmonized System of classification (UN GHS), according to the Draize test. On the other hand, animal welfare considerations and the enforcement of a new regulation in the EU are drawing much attention to reducing or replacing animal experiments with alternative methods. This study evaluated the eye irritation potential of the two dyes RO16 and RG19 by combining the Short Time Exposure (STE) and the Bovine Corneal Opacity and Permeability (BCOP) assays and then comparing the results with in vivo data from the GHS classification. The STE test (first-level screening) categorized both dyes as GHS Category 1 (severe irritant). In the BCOP, dye RG19 was also classified as GHS Category 1, while for dye RO16 no GHS prediction could be made. Both dyes caused damage to the corneal tissue, as confirmed by histopathological analysis. Our findings demonstrated that the STE test did not contribute to a better conclusion about the eye irritation potential of the dyes when used in conjunction with the BCOP test. Adding histopathology to the BCOP test could be an appropriate tool for a more meaningful prediction of the eye irritation potential of dyes. Copyright © 2015 Elsevier Ltd. All rights reserved.
Integrating multisource imagery and GIS analysis for mapping Bermuda's benthic habitats
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vierros, M.K.
1997-06-01
Bermuda is a group of isolated oceanic islands situated in the northwest Atlantic Ocean and surrounded by the Sargasso Sea. Bermuda possesses the northernmost coral reefs and mangroves in the Atlantic Ocean, and because of its high population density, both the terrestrial and marine environments are under intense human pressure. Although a long record of scientific research exists, this study is the first attempt to comprehensively map the area's benthic habitats, despite the need for such a map for resource assessment and management purposes. Multi-source and multi-date imagery were used for producing the habitat map due to the lack of a complete, up-to-date image. Classifications were performed with SPOT data, and the results verified from recent aerial photography and current aerial video, along with extensive ground truthing. Stratification of the image into regions prior to classification reduced the confusing effects of varying water depth. Classification accuracy in shallow areas was increased by derivation of a texture pseudo-channel, while bathymetry was used as a classification tool in deeper areas, where local patterns of zonation were well known. Because of seasonal variation in the extent of seagrasses, a classification scheme based on density could not be used. Instead, a set of classes based on the seagrass area's exposure to the open ocean was developed. The resulting habitat map is currently being assessed for accuracy with promising preliminary results, indicating its usefulness as a basis for future resource assessment studies.
The P600 in Implicit Artificial Grammar Learning.
Silva, Susana; Folia, Vasiliki; Hagoort, Peter; Petersson, Karl Magnus
2017-01-01
The suitability of the artificial grammar learning (AGL) paradigm to capture relevant aspects of the acquisition of linguistic structures has been empirically tested in a number of EEG studies. Some have shown a syntax-related P600 component, but it has not been ruled out that the AGL P600 effect is a response to surface features (e.g., subsequence familiarity) rather than the underlying syntax structure. Therefore, in this study, we controlled for the surface characteristics of the test sequences (associative chunk strength) and recorded the EEG before (baseline preference classification) and after (preference and grammaticality classification) exposure to a grammar. After exposure, a typical, centroparietal P600 effect was elicited by grammatical violations and not by unfamiliar subsequences, suggesting that the AGL P600 effect signals a response to structural irregularities. Moreover, preference and grammaticality classification showed a qualitatively similar ERP profile, strengthening the idea that the implicit structural mere-exposure paradigm in combination with preference classification is a suitable alternative to the traditional grammaticality classification test. Copyright © 2016 Cognitive Science Society, Inc.
LaKind, Judy S; Sobus, Jon R; Goodman, Michael; Barr, Dana Boyd; Fürst, Peter; Albertini, Richard J; Arbuckle, Tye E; Schoeters, Greet; Tan, Yu-Mei; Teeguarden, Justin; Tornero-Velez, Rogelio; Weisel, Clifford P
2014-12-01
The quality of exposure assessment is a major determinant of the overall quality of any environmental epidemiology study. The use of biomonitoring as a tool for assessing exposure to ubiquitous chemicals with short physiologic half-lives began relatively recently. These chemicals present several challenges, including their presence in analytical laboratories and sampling equipment, difficulty in establishing temporal order in cross-sectional studies, short- and long-term variability in exposures and biomarker concentrations, and a paucity of information on the number of measurements required for proper exposure classification. To date, the scientific community has not developed a set of systematic guidelines for designing, implementing and interpreting studies of short-lived chemicals that use biomonitoring as the exposure metric, or for evaluating the quality of this type of research for weight-of-evidence (WOE) assessments or for peer review of grants or publications. We describe key issues that affect epidemiology studies using biomonitoring data on short-lived chemicals and propose a systematic instrument--the Biomonitoring, Environmental Epidemiology, and Short-lived Chemicals (BEES-C) instrument--for evaluating the quality of research proposals and studies that incorporate biomonitoring data on short-lived chemicals. Quality criteria for three areas considered fundamental to the evaluation of epidemiology studies that include biological measurements of short-lived chemicals are described: 1) biomarker selection and measurement, 2) study design and execution, and 3) general epidemiological study design considerations. We recognize that the development of an evaluative tool such as BEES-C is neither simple nor non-controversial. We hope and anticipate that the instrument will initiate further discussion/debate on this topic. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Maier, Andrew; Vincent, Melissa J; Parker, Ann; Gadagbui, Bernard K; Jayjock, Michael
2015-12-01
Asthma is a complex syndrome with significant consequences for those affected. The number of individuals affected is growing, although the reasons for the increase are uncertain. Ensuring the effective management of potential exposures follows from substantial evidence that exposure to some chemicals can increase the likelihood of asthma responses. We have developed a safety assessment approach tailored to the screening of asthma risks from residential consumer product ingredients as a proactive risk management tool. Several key features of the proposed approach advance the assessment resources often used for asthma issues. First, a quantitative health benchmark for asthma or related endpoints (irritation and sensitization) is provided that extends qualitative hazard classification methods. Second, a parallel structure is employed to include dose-response methods for asthma endpoints and methods for scenario specific exposure estimation. The two parallel tracks are integrated in a risk characterization step. Third, a tiered assessment structure is provided to accommodate different amounts of data for both the dose-response assessment (i.e., use of existing benchmarks, hazard banding, or the threshold of toxicological concern) and exposure estimation (i.e., use of empirical data, model estimates, or exposure categories). Tools building from traditional methods and resources have been adapted to address specific issues pertinent to asthma toxicology (e.g., mode-of-action and dose-response features) and the nature of residential consumer product use scenarios (e.g., product use patterns and exposure durations). A case study for acetic acid as used in various sentinel products and residential cleaning scenarios was developed to test the safety assessment methodology. In particular, the results were used to refine and verify relationships among tiered approaches such that each lower data tier in the approach provides a similar or greater margin of safety for a given scenario. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Enzyme-Activated Fluorogenic Probes for Live-Cell and in Vivo Imaging.
Chyan, Wen; Raines, Ronald T
2018-06-20
Fluorogenic probes, small-molecule sensors that unmask brilliant fluorescence upon exposure to specific stimuli, are powerful tools for chemical biology. Those probes that respond to enzymatic activity illuminate the complex dynamics of biological processes at a level of spatiotemporal detail and sensitivity unmatched by other techniques. Here, we review recent advances in enzyme-activated fluorogenic probes for biological imaging. We organize our survey by enzyme classification, with emphasis on fluorophore masking strategies, modes of enzymatic activation, and the breadth of current and future applications. Key challenges such as probe selectivity and spectroscopic requirements are described alongside therapeutic, diagnostic, and theranostic opportunities.
Predicting Drug-induced Hepatotoxicity Using QSAR and Toxicogenomics Approaches
Low, Yen; Uehara, Takeki; Minowa, Yohsuke; Yamada, Hiroshi; Ohno, Yasuo; Urushidani, Tetsuro; Sedykh, Alexander; Muratov, Eugene; Fourches, Denis; Zhu, Hao; Rusyn, Ivan; Tropsha, Alexander
2014-01-01
Quantitative Structure-Activity Relationship (QSAR) modeling and toxicogenomics are used independently as predictive tools in toxicology. In this study, we evaluated the power of several statistical models for predicting drug hepatotoxicity in rats using different descriptors of drug molecules, namely their chemical descriptors and toxicogenomic profiles. The records were taken from the Toxicogenomics Project rat liver microarray database containing information on 127 drugs (http://toxico.nibio.go.jp/datalist.html). The model endpoint was hepatotoxicity in the rat following 28 days of exposure, established by liver histopathology and serum chemistry. First, we developed multiple conventional QSAR classification models using a comprehensive set of chemical descriptors and several classification methods (k nearest neighbor, support vector machines, random forests, and distance weighted discrimination). With chemical descriptors alone, external predictivity (Correct Classification Rate, CCR) from 5-fold external cross-validation was 61%. Next, the same classification methods were employed to build models using only toxicogenomic data (24h after a single exposure) treated as biological descriptors. The optimized models used only 85 selected toxicogenomic descriptors and had CCR as high as 76%. Finally, hybrid models combining both chemical descriptors and transcripts were developed; their CCRs were between 68 and 77%. Although the accuracy of hybrid models did not exceed that of the models based on toxicogenomic data alone, the use of both chemical and biological descriptors enriched the interpretation of the models. In addition to finding 85 transcripts that were predictive and highly relevant to the mechanisms of drug-induced liver injury, chemical structural alerts for hepatotoxicity were also identified. These results suggest that concurrent exploration of the chemical features and acute treatment-induced changes in transcript levels will both enrich the mechanistic understanding of sub-chronic liver injury and afford models capable of accurate prediction of hepatotoxicity from chemical structure and short-term assay results. PMID:21699217
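The external predictivity metric used above, CCR, is the mean of per-class correct classification rates, i.e., balanced accuracy. A minimal sketch of 5-fold cross-validated CCR estimation, using synthetic stand-ins for the descriptor matrix and hepatotoxicity labels (not the study's data):

```python
# Minimal sketch: 5-fold cross-validated CCR (balanced accuracy).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(127, 50))    # stand-in for chemical/toxicogenomic descriptors
y = rng.integers(0, 2, size=127)  # 1 = hepatotoxic, 0 = not (synthetic labels)

ccr = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                      X, y, cv=5, scoring="balanced_accuracy")
print(f"5-fold CCR: {ccr.mean():.2f}")
```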
Roger D. Ottmar; David V. Sandberg; Cynthia L. Riccardi; Susan J. Prichard
2007-01-01
We present an overview of the Fuel Characteristic Classification System (FCCS), a tool that enables land managers, regulators, and scientists to create and catalog fuelbeds and to classify those fuelbeds for their capacity to support fire and consume fuels. The fuelbed characteristics and fire classification from this tool will provide inputs for current and future...
Presence of an epigenetic signature of prenatal cigarette smoke exposure in childhood☆
Ladd-Acosta, Christine; Shu, Chang; Lee, Brian K.; Gidaya, Nicole; Singer, Alison; Schieve, Laura A.; Schendel, Diana E.; Jones, Nicole; Daniels, Julie L.; Windham, Gayle C.; Newschaffer, Craig J.; Croen, Lisa A.; Feinberg, Andrew P.; Fallin, M. Daniele
2016-01-01
Prenatal exposure to tobacco smoke has lifelong health consequences. Epigenetic signatures such as differences in DNA methylation (DNAm) may be a biomarker of exposure and, further, might have functional significance for how in utero tobacco exposure may influence disease risk. Differences in infant DNAm associated with maternal smoking during pregnancy have been identified. Here we assessed whether these infant DNAm patterns are detectible in early childhood, whether they are specific to smoking, and whether childhood DNAm can classify prenatal smoke exposure status. Using the Infinium 450 K array, we measured methylation at 26 CpG loci that were previously associated with prenatal smoking in infant cord blood from 572 children, aged 3–5, with differing prenatal exposure to cigarette smoke in the Study to Explore Early Development (SEED). Striking concordance was found between the pattern of prenatal smoking associated DNAm among preschool aged children in SEED and those observed at birth in other studies. These DNAm changes appear to be tobacco-specific. Support vector machine classification models and 10-fold cross-validation were applied to show classification accuracy for childhood DNAm at these 26 sites as a biomarker of prenatal smoking exposure. Classification models showed prenatal exposure to smoking can be assigned with 81% accuracy using childhood DNAm patterns at these 26 loci. These findings support the potential for blood-derived DNAm measurements to serve as biomarkers for prenatal exposure. PMID:26610292
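A sketch of the classification step, assuming a 572 x 26 matrix of methylation beta values and binary exposure labels (both synthetic here; the authors' exact SVM configuration is not reproduced):

```python
# Minimal sketch: 10-fold cross-validated SVM on DNAm features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
methylation = rng.uniform(0, 1, size=(572, 26))  # beta values at 26 CpG loci
exposed = rng.integers(0, 2, size=572)           # prenatal smoke exposure status

acc = cross_val_score(SVC(kernel="linear"), methylation, exposed, cv=10)
print(f"10-fold accuracy: {acc.mean():.2f}")
```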
Rabban, J; Adler, J; Rosen, C; Blair, J; Sheridan, R
1997-09-01
Railway and subway-associated electrical trauma is rare and typically involves high-voltage (>20,000 V) arc injuries. Not all rail systems utilize such high voltage. We report 16 cases of electrical trauma due to 600 V direct contact with subway 'third' rails. A case series of injured patients presenting to Shriners Burns Institute, Boston, or Massachusetts General Hospital between 1970 and 1995 was retrospectively analyzed. A total of 16 cases were identified. Among seven subway workers, the mechanism of rail contact was unintentional, by a tool, a hand, or by falling; no deaths occurred. Among nine non-occupational victims, injuries involved suicide attempts, unintentional falls, or risk-taking behavior. This group suffered greater burn severity, more operative procedures, and more complications; three deaths occurred. This is the largest reported series of direct electrical trauma from a subway third rail. The high morbidity and mortality from this 600 V contact suggest that the traditional classification of low-voltage (<1000 V) exposure should be subdivided to reflect the serious and lethal potential of intermediate-range exposures compared with household-range exposures (0-220 V).
Okokon, Enembe Oku; Roivainen, Päivi; Kheifets, Leeka; Mezei, Gabor; Juutilainen, Jukka
2014-01-01
Previous studies have shown that populations of multiapartment buildings with indoor transformer stations may serve as a basis for improved epidemiological studies on the relationship between childhood leukaemia and extremely-low-frequency (ELF) magnetic fields (MFs). This study investigated whether classification based on structural characteristics of the transformer stations would improve ELF MF exposure assessment. The data included MF measurements in apartments directly above transformer stations ("exposed" apartments) in 30 buildings in Finland, and reference apartments in the same buildings. Transformer structural characteristics (type and location of low-voltage conductors) were used to classify exposed apartments into high-exposure (HE) and intermediate-exposure (IE) categories. An exposure gradient was observed: both the time-average MF and the time above a threshold (0.4 μT) were highest in the HE apartments and lowest in the reference apartments, showing a statistically significant trend. The differences between HE and IE apartments, however, were not statistically significant. A simulation exercise showed that the three-category classification did not perform better than a two-category classification (exposed and reference apartments) in detecting the existence of an increased risk. However, data on the structural characteristics of transformers are potentially useful for evaluating the exposure-response relationship.
Seo, Hyun-Ju; Kim, Soo Young; Lee, Yoon Jae; Jang, Bo-Hyoung; Park, Ji-Eun; Sheen, Seung-Soo; Hahn, Seo Kyung
2016-02-01
To develop a study Design Algorithm for Medical Literature on Intervention (DAMI) and test its interrater reliability, construct validity, and ease of use. We developed and then revised the DAMI to include detailed instructions. To test the DAMI's reliability, we used a purposive sample of 134 primary, mainly nonrandomized studies. We then compared the study designs as classified by the original authors and through the DAMI. Unweighted kappa statistics were computed to test interrater reliability and construct validity based on the level of agreement between the original and DAMI classifications. Assessment time was also recorded to evaluate ease of use. The DAMI includes 13 study designs, covering experimental and observational studies of interventions and exposure. Both the interrater reliability (unweighted kappa = 0.67; 95% CI [0.64-0.75]) and construct validity (unweighted kappa = 0.63, 95% CI [0.52-0.67]) were substantial. Mean classification time using the DAMI was 4.08 ± 2.44 minutes (range, 0.51-10.92). The DAMI showed substantial interrater reliability and construct validity. Furthermore, given its ease of use, it could be used to accurately classify medical literature for systematic reviews of interventions while minimizing disagreement between authors of such reviews. Copyright © 2016 Elsevier Inc. All rights reserved.
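Unweighted kappa, the agreement statistic reported above, is straightforward to compute; a minimal sketch with hypothetical design labels:

```python
# Minimal sketch: unweighted (Cohen's) kappa between two sets of labels.
from sklearn.metrics import cohen_kappa_score

original = ["RCT", "cohort", "case-control", "cohort", "cross-sectional"]
dami     = ["RCT", "cohort", "case-control", "case-control", "cross-sectional"]
print(cohen_kappa_score(original, dami))  # 1.0 = perfect agreement, 0 = chance
```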
Classification Algorithms for Big Data Analysis, a Map Reduce Approach
NASA Astrophysics Data System (ADS)
Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.
2015-03-01
For many years, the scientific community has been concerned with increasing the accuracy of different classification methods, and major achievements have been made so far. Beyond this issue, the increasing amount of data generated every day by remote sensors raises further challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
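The MapReduce pattern underlying the package can be illustrated without a Hadoop cluster: mappers classify independent chunks of data and emit partial results, and a reducer merges them. A single-machine sketch (plain Python standing in for distributed workers; the data and classifier are synthetic, not ICP code):

```python
# Minimal sketch of the map/reduce classification pattern.
from collections import Counter
from functools import reduce

import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(2)
X_train = rng.normal(size=(200, 4))
y_train = rng.integers(0, 3, size=200)
clf = LinearSVC(max_iter=5000).fit(X_train, y_train)  # model shared with mappers

# Pretend each split is a chunk of pixels held on a different worker.
splits = [rng.normal(size=(1000, 4)) for _ in range(4)]

def mapper(chunk):
    # map: classify one chunk, emit per-class counts
    return Counter(clf.predict(chunk).tolist())

def reducer(c1, c2):
    # reduce: merge partial counts from two mappers
    return c1 + c2

print(reduce(reducer, map(mapper, splits)))
```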
Algorithmic Classification of Five Characteristic Types of Paraphasias.
Fergadiotis, Gerasimos; Gorman, Kyle; Bedrick, Steven
2016-12-01
This study was intended to evaluate a series of algorithms developed to perform automatic classification of paraphasic errors (formal, semantic, mixed, neologistic, and unrelated errors). We analyzed 7,111 paraphasias from the Moss Aphasia Psycholinguistics Project Database (Mirman et al., 2010) and evaluated the classification accuracy of 3 automated tools. First, we used frequency norms from the SUBTLEXus database (Brysbaert & New, 2009) to differentiate nonword errors and real-word productions. Then we implemented a phonological-similarity algorithm to identify phonologically related real-word errors. Last, we assessed the performance of a semantic-similarity criterion that was based on word2vec (Mikolov, Yih, & Zweig, 2013). Overall, the algorithmic classification replicated human scoring for the major categories of paraphasias studied with high accuracy. The tool that was based on the SUBTLEXus frequency norms was more than 97% accurate in making lexicality judgments. The phonological-similarity criterion was approximately 91% accurate, and the overall classification accuracy of the semantic classifier ranged from 86% to 90%. Overall, the results highlight the potential of tools from the field of natural language processing for the development of highly reliable, cost-effective diagnostic tools suitable for collecting high-quality measurement data for research and clinical purposes.
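Two of the three components, the lexicality judgment against frequency norms and the phonological-similarity criterion, can be sketched compactly. The snippet below is illustrative only: it uses character overlap as a crude stand-in for phonological similarity, whereas the study compared phonological forms, and the norms set is a placeholder for SUBTLEXus:

```python
# Minimal sketch of two paraphasia-classification steps.
import difflib

def is_real_word(token, frequency_norms):
    # lexicality judgment: real word iff present in the frequency norms
    return token in frequency_norms

def phonologically_related(target, error, threshold=0.5):
    # crude character-overlap proxy; a real system compares phoneme strings
    return difflib.SequenceMatcher(None, target, error).ratio() >= threshold

norms = {"cat", "dog", "hat"}               # placeholder for SUBTLEXus entries
print(is_real_word("hat", norms))           # True -> real-word error
print(phonologically_related("cat", "hat")) # True -> formal paraphasia candidate
```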
Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven
2006-01-01
Applying the Hydroecological Integrity Assessment Process involves four steps: (1) a hydrologic classification of relatively unmodified streams in a geographic area using long-term gage records and 171 ecologically relevant indices; (2) the identification of statistically significant, nonredundant, hydroecologically relevant indices associated with the five major flow components for each stream class; (3) the development of a stream-classification tool; and (4) the development of a hydrologic assessment tool. Four computer software tools have been developed.
USDA-ARS?s Scientific Manuscript database
The iPhyClassifier is an Internet-based research tool for quick identification and classification of diverse phytoplasmas. The iPhyClassifier simulates laboratory restriction enzyme digestions and subsequent gel electrophoresis and generates virtual restriction fragment length polymorphism (RFLP) p...
Barnard, Juliana; Rose, Cecile; Newman, Lee; Canner, Martha; Martyny, John; McCammon, Chuck; Bresnitz, Eddy; Rossman, Milt; Thompson, Bruce; Rybicki, Benjamin; Weinberger, Steven E; Moller, David R; McLennan, Geoffrey; Hunninghake, Gary; DePalo, Louis; Baughman, Robert P; Iannuzzi, Michael C; Judson, Marc A; Knatterud, Genell L; Teirstein, Alvin S; Yeager, Henry; Johns, Carol J; Rabin, David L; Cherniack, Reuben
2005-03-01
To determine whether specific occupations and industries may be associated with sarcoidosis. A Case Control Etiologic Study of Sarcoidosis (ACCESS) obtained occupational and environmental histories on 706 newly diagnosed sarcoidosis cases and matched controls. We used Standard Industrial Classification (SIC) and Standard Occupational Classification (SOC) to assess occupational contributions to sarcoidosis risk. Univariable analysis identified elevated risk of sarcoidosis for workers with industrial organic dust exposures, especially in Caucasian workers. Workers for suppliers of building materials, hardware, and gardening materials were at an increased risk of sarcoidosis as were educators. Work providing childcare was negatively associated with sarcoidosis risk. Jobs with metal dust or metal fume exposures were negatively associated with sarcoidosis risk, especially in Caucasian workers. In this study, we found that exposures in particular occupational settings may contribute to sarcoidosis risk.
Cluster categorization of urban roads to optimize their noise monitoring.
Zambon, G; Benocci, R; Brambilla, G
2016-01-01
Road traffic in urban areas is closely tied to urban mobility and public health, and it is often the main source of noise pollution. Noise maps have lately been considered a powerful tool to estimate population exposure to environmental noise, but they need to be validated with measured noise data. The project Dynamic Acoustic Mapping (DYNAMAP), co-funded in the framework of the LIFE 2013 program, aims to develop a statistically based method to optimize the choice and number of monitoring sites and to automate noise-map updates using data retrieved from a low-cost monitoring network. The first objective should improve on spatial sampling based on the legislative road classification, which mainly reflects the geometrical characteristics of a road rather than its noise emission. The present paper describes the statistical approach of the methodology under development and the results of its preliminary application to a limited sample of roads in the city of Milan. The resulting categorization of roads, based on clustering the 24-h hourly LAeq,h values, looks promising for optimizing the spatial sampling of noise monitoring toward a more efficient description of the noise pollution produced by complex urban road networks than that based on the legislative road classification.
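Clustering 24-h hourly noise profiles is, at heart, a k-means problem over 24-dimensional vectors. A minimal sketch with synthetic LAeq,h data (the DYNAMAP methodology involves further steps not shown):

```python
# Minimal sketch: clustering roads by their 24-h hourly noise profiles.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
laeq = 55 + 10 * rng.random(size=(100, 24))  # hourly LAeq,h for 100 road segments

profiles = StandardScaler().fit_transform(laeq)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(profiles)
print(np.bincount(labels))  # number of roads per noise category
```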
Mi, Huaiyu; Huang, Xiaosong; Muruganujan, Anushya; Tang, Haiming; Mills, Caitlin; Kang, Diane; Thomas, Paul D
2017-01-04
The PANTHER database (Protein ANalysis THrough Evolutionary Relationships, http://pantherdb.org) contains comprehensive information on the evolution and function of protein-coding genes from 104 completely sequenced genomes. PANTHER software tools allow users to classify new protein sequences, and to analyze gene lists obtained from large-scale genomics experiments. In the past year, major improvements include a large expansion of classification information available in PANTHER, as well as significant enhancements to the analysis tools. Protein subfamily functional classifications have more than doubled due to progress of the Gene Ontology Phylogenetic Annotation Project. For human genes (as well as a few other organisms), PANTHER now also supports enrichment analysis using pathway classifications from the Reactome resource. The gene list enrichment tools include a new 'hierarchical view' of results, enabling users to leverage the structure of the classifications/ontologies; the tools also allow users to upload genetic variant data directly, rather than requiring prior conversion to a gene list. The updated coding single-nucleotide polymorphisms (SNP) scoring tool uses an improved algorithm. The hidden Markov model (HMM) search tools now use HMMER3, dramatically reducing search times and improving accuracy of E-value statistics. Finally, the PANTHER Tree-Attribute Viewer has been implemented in JavaScript, with new views for exploring protein sequence evolution. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
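Gene-list enrichment of the kind PANTHER performs is commonly based on the hypergeometric (one-sided Fisher) test; a minimal sketch with illustrative numbers, not PANTHER's implementation:

```python
# Minimal sketch: hypergeometric enrichment test for one category.
from scipy.stats import hypergeom

# N genes total, K annotated to the category, n genes in the user's list,
# k of which carry the annotation; p-value for seeing >= k by chance.
N, K, n, k = 20000, 150, 300, 12
p = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value: {p:.3g}")
```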
van Tongeren, Martie; Lamb, Judith; Cherrie, John W; MacCalman, Laura; Basinas, Ioannis; Hesse, Susanne
2017-10-01
Tier 1 exposure tools recommended for use under REACH are designed to easily identify situations that may pose a risk to health through conservative exposure predictions. However, no comprehensive evaluation of the performance of the lower tier tools has previously been carried out. The ETEAM project aimed to evaluate several lower tier exposure tools (ECETOC TRA, MEASE, and EMKG-EXPO-TOOL) as well as one higher tier tool (STOFFENMANAGER®). This paper describes the results of the external validation of tool estimates using measurement data. Measurement data were collected from a range of providers, both in Europe and the United States, together with contextual information. Individual and aggregated measurement data were obtained. The contextual information was coded into the tools to obtain exposure estimates. Results were expressed as the percentage of measurements exceeding the tool estimates and presented by exposure category (non-volatile liquid, volatile liquid, metal abrasion, metal processing, and powder handling). We also explored tool performance for different process activities as well as different scenario conditions and exposure levels. In total, results from nearly 4000 measurements were obtained, with the majority for the use of volatile liquids and powder handling. The comparisons of measurement results with tool estimates suggest that the tools are generally conservative. However, the tools were more conservative when estimating exposure from powder handling compared with volatile liquids and other exposure categories. In addition, results suggested that tool performance varies between process activities and scenario conditions. For example, tools were less conservative when estimating exposure during activities involving tabletting, compression, extrusion, pelletisation, and granulation (common process activity PROC14) and transfer of substance or mixture (charging and discharging) at non-dedicated facilities (PROC8a; powder handling only). With the exception of STOFFENMANAGER® (for estimating exposure during powder handling), the tools were less conservative for scenarios with lower estimated exposure levels. This is the most comprehensive evaluation of the performance of REACH exposure tools carried out to date. The results show that, although generally conservative, the tools may not always achieve the performance specified in the REACH guidance, i.e. using the 75th or 90th percentile of the exposure distribution for the risk characterisation. Ongoing development, adjustment, and recalibration of the tools with new measurement data are essential to ensure adequate characterisation and control of worker exposure to hazardous substances. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
NASA Astrophysics Data System (ADS)
Mohamed, Abdul Aziz; Hasan, Abu Bakar; Ghazali, Abu Bakar Mhd.
2017-01-01
Classification of large datasets into their respective classes or groups can be carried out with the help of artificial intelligence (AI) tools readily available on the market. To obtain optimal results, optimization tools can be applied to those data. Classification and optimization have been used by researchers throughout their work, and the outcomes have been very encouraging. Here, the authors share what they have experienced in three different areas of applied research.
Mycofier: a new machine learning-based classifier for fungal ITS sequences.
Delgado-Serrano, Luisa; Restrepo, Silvia; Bustos, Jose Ricardo; Zambrano, Maria Mercedes; Anzola, Juan Manuel
2016-08-11
The taxonomic and phylogenetic classification based on sequence analysis of the ITS1 genomic region has become a crucial component of fungal ecology and diversity studies. Nowadays, there is no accurate alignment-free classification tool for fungal ITS1 sequences for large environmental surveys. This study describes the development of a machine learning-based classifier for the taxonomical assignment of fungal ITS1 sequences at the genus level. A fungal ITS1 sequence database was built using curated data, and training and test sets were generated from it. A Naïve Bayesian classifier was built using features from the primary sequence, with an accuracy of 87% in classification at the genus level. The final model was based on a Naïve Bayes algorithm using ITS1 sequences from 510 fungal genera. This classifier, denoted Mycofier, provides classification accuracy similar to BLASTN, but the database used for the classification contains curated data, and the tool, being alignment-independent, is more efficient and contributes to the field, given the lack of an accurate classification tool for large volumes of fungal ITS1 sequence data. The software and source code for Mycofier are freely available at https://github.com/ldelgado-serrano/mycofier.git.
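An alignment-free Naïve Bayes classifier over sequence-derived features can be sketched as k-mer counting followed by multinomial Naïve Bayes; the toy sequences and genus labels below are hypothetical, and Mycofier's actual feature set differs:

```python
# Minimal sketch: k-mer Naive Bayes classification of ITS1 sequences.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

its1 = ["ACGTACGTTGCA", "TTGCAACGTACG", "GGCCTTAAGGCC"]  # toy sequences
genus = ["Fusarium", "Fusarium", "Aspergillus"]          # toy labels

clf = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(4, 4)),  # 4-mer counts
    MultinomialNB())
clf.fit(its1, genus)
print(clf.predict(["ACGTACGTTGCC"]))
```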
Kimber, Melissa; Adham, Sami; Gill, Sana; McTavish, Jill; MacMillan, Harriet L
2018-02-01
Increasingly recognized as a distinct form of childhood maltreatment, children's exposure to intimate partner violence (IPV) has been shown to be associated with an array of negative psychosocial outcomes, including elevated risk for additional violence over the life course. Although studies have identified child exposure to IPV as a predictor of IPV perpetration in adulthood, no review has critically evaluated the methodology of this quantitative work. The present study examines the association between childhood exposure to IPV and the perpetration of IPV in adulthood based on a systematic review of the literature from inception to January 4, 2016. Databases searched included Medline, Embase, PsycINFO, CINAHL, Cochrane Database of Systematic Reviews, Sociological Abstracts and ERIC. Database searches were complemented with backward and forward citation chaining. Studies were critically appraised using the Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies. Of 5601 articles identified by the search, 19 studies were included for data extraction. Sixteen of these studies found that child exposure to IPV was significantly and positively associated with adult IPV perpetration; three studies reported null findings. The methodological quality of the studies was low. Work thus far has tended to focus on child exposure to physical IPV and the perpetration of physical IPV within heterosexual contexts. In addition, measures of child exposure to IPV vary in their classification of what exposure entails. We critically discuss the strengths and limitations of the existing evidence and the theoretical frameworks informing this work. Copyright © 2017 Elsevier Ltd. All rights reserved.
Darwich, Adam S; Henderson, Kathryn; Burgin, Angela; Ward, Nicola; Whittam, Janet; Ammori, Basil J; Ashcroft, Darren M; Rostami-Hodjegan, Amin
2012-11-01
Changes to oral drug bioavailability have been observed post bariatric surgery. However, the magnitude and the direction of changes have not been assessed systematically to provide insights into the parameters governing the observed trends. Understanding these can help with dose adjustments. Analysis of drug characteristics based on a biopharmaceutical classification system is not adequate to explain observed trends in altered oral drug bioavailability following bariatric surgery, although the findings suggest solubility to play an important role. To identify the most commonly prescribed drugs in a bariatric surgery population and to assess existing evidence regarding trends in oral drug bioavailability post bariatric surgery. A retrospective audit was undertaken to document commonly prescribed drugs amongst patients undergoing bariatric surgery in an NHS hospital in the UK and to assess practice for drug administration following bariatric surgery. The available literature was examined for trends relating to drug permeability and solubility with regards to the Biopharmaceutics Classification System (BCS) and main route of elimination. No significant difference in the 'post/pre surgery oral drug exposure ratio' (ppR) was apparent between BCS class I to IV drugs, with regards to dose number (Do) or main route of elimination. Drugs classified as 'solubility limited' displayed an overall reduction as compared with 'freely soluble' compounds, as well as an unaltered and increased ppR. Clinical studies establishing guidelines for commonly prescribed drugs, and the monitoring of drugs exhibiting a narrow therapeutic window or without a readily assessed clinical endpoint, are warranted. Using mechanistically based pharmacokinetic modelling for simulating the multivariate nature of changes in drug exposure may serve as a useful tool in the further understanding of postoperative trends in oral drug exposure and in developing practical clinical guidance. © 2012 The Authors. British Journal of Clinical Pharmacology © 2012 The British Pharmacological Society.
Acute Mountain Sickness Symptoms Depend on Normobaric versus Hypobaric Hypoxia
Strangman, Gary E.; Harris, N. Stuart; Muza, Stephen R.
2016-01-01
Acute mountain sickness (AMS), characterized by headache, nausea, fatigue, and dizziness when unacclimatized individuals rapidly ascend to high altitude, is exacerbated by exercise and can be disabling. Although AMS is observed in both normobaric (NH) and hypobaric hypoxia (HH), recent evidence suggests that NH and HH produce different physiological responses. We evaluated whether AMS symptoms were different in NH and HH during the initial stages of exposure and if the assessment tool mattered. Seventy-two 8 h exposures to normobaric normoxia (NN), NH, or HH were experienced by 36 subjects. The Environmental Symptoms Questionnaire (ESQ) and Lake Louise Self-report (LLS) were administered, resulting in a total of 360 assessments, with each subject answering the questionnaire 5 times during each of their 2 exposure days. Classification tree analysis indicated that symptoms contributing most to AMS were different in NH (namely, feeling sick and shortness of breath) compared to HH (characterized most by feeling faint, appetite loss, light headedness, and dim vision). However, the differences were not detected using the LLS. These results suggest that during the initial hours of exposure (1) AMS in HH may be a qualitatively different experience than in NH and (2) NH and HH may not be interchangeable environments. PMID:27847819
NASA Astrophysics Data System (ADS)
Wang, Sheng-Wei; Majeed, Mohammed A.; Chu, Pei-Ling; Lin, Hui-Chih
Socioeconomic and demographic factors have been found to significantly affect time-activity patterns in population cohorts, which can subsequently influence personal exposures to air pollutants. This study investigates relationships between personal exposures to eight VOCs (benzene, toluene, ethylbenzene, o-xylene, m-,p-xylene, chloroform, 1,4-dichlorobenzene, and tetrachloroethene) and socioeconomic, demographic, and time-activity pattern factors using data collected from the 1999-2000 National Health and Nutrition Examination Survey (NHANES) VOC study. Socio-demographic factors (such as race/ethnicity and family income) were generally found to significantly influence personal exposures to the three chlorinated compounds. This was mainly due to the associations paired by race/ethnicity and urban residence, race/ethnicity and use of air freshener in the car, and family income and use of a dry-cleaner, which can in turn affect exposures to chloroform, 1,4-dichlorobenzene, and tetrachloroethene, respectively. For BTEX, the traffic-related compounds, housing characteristics (leaving home windows open and having an attached garage) and personal activities related to the use of fuels or solvent-related products played more significant roles in influencing exposures. Significant differences in BTEX exposures were also commonly found in relation to gender, due to associated significant differences in time spent at work/school and outdoors. The coupling of Classification and Regression Tree (CART) and Bootstrap Aggregating (Bagging) techniques was used as an effective tool for characterizing robust sets of significant VOC exposure factors presented above, which conventional statistical approaches could not accomplish. Identification of these significant VOC exposure factors can be used to generate hypotheses for future investigations of possible significant VOC exposure sources and pathways in the general U.S. population.
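The CART-plus-Bagging strategy can be sketched as bagged decision trees whose averaged variable importances flag robust exposure factors; the variables below are synthetic stand-ins for NHANES covariates:

```python
# Minimal sketch: bagged CART for identifying robust exposure factors.
import numpy as np
import pandas as pd
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(4)
X = pd.DataFrame({"income_band": rng.integers(1, 6, size=500),
                  "windows_open": rng.integers(0, 2, size=500),
                  "attached_garage": rng.integers(0, 2, size=500)})
high_exposure = rng.integers(0, 2, size=500)  # above-median benzene (synthetic)

bagged = BaggingClassifier(DecisionTreeClassifier(max_depth=3),
                           n_estimators=100, random_state=0).fit(X, high_exposure)
# average impurity-based importances across the bootstrap trees
importances = np.mean([t.feature_importances_ for t in bagged.estimators_], axis=0)
print(dict(zip(X.columns, importances.round(2))))
```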
Using self-organizing maps to develop ambient air quality classifications: a time series example
2014-01-01
Background: Development of exposure metrics that capture features of the multipollutant environment is needed to investigate health effects of pollutant mixtures. This is a complex problem that requires development of new methodologies. Objective: To present a self-organizing map (SOM) framework for creating ambient air quality classifications that group days with similar multipollutant profiles. Methods: Eight years of day-level data from Atlanta, GA, for ten ambient air pollutants collected at a central monitor location were classified using SOM into a set of day types based on their day-level multipollutant profiles. We present strategies for using SOM to develop a multipollutant metric of air quality and compare results with more traditional techniques. Results: Our analysis found that 16 types of days reasonably describe the day-level multipollutant combinations that appear most frequently in our data. Multipollutant day types ranged from conditions when all pollutants measured low to days exhibiting relatively high concentrations for either primary or secondary pollutants or both. The temporal nature of class assignments indicated substantial heterogeneity in day type frequency distributions (~1%-14%), relatively short-term durations (<2 day persistence), and long-term and seasonal trends. Meteorological summaries revealed strong day type weather dependencies, and pollutant concentration summaries provided interesting scenarios for further investigation. Comparison with traditional methods found that SOM produced similar classifications with added insight regarding between-class relationships. Conclusion: We find SOM to be an attractive framework for developing ambient air quality classifications because the approach eases interpretation of results by allowing users to visualize classifications on an organized map. The presented approach provides an appealing tool for developing multipollutant metrics of air quality that can be used to support multipollutant health studies. PMID:24990361
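A minimal SOM sketch using the third-party minisom package (assumed installable via pip; the grid size and training settings are illustrative, not the study's configuration):

```python
# Minimal sketch: mapping day-level multipollutant profiles onto a SOM grid.
import numpy as np
from minisom import MiniSom  # third-party package: pip install minisom

rng = np.random.default_rng(5)
days = rng.random((2900, 10))  # ~8 years of day-level profiles, 10 pollutants

som = MiniSom(4, 4, 10, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(days, 5000)                # unsupervised training
day_types = [som.winner(d) for d in days]   # each day assigned to a grid cell
print(len(set(day_types)), "day types in use")
```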
nRC: non-coding RNA Classifier based on structural features.
Fiannaca, Antonino; La Rosa, Massimo; La Paglia, Laura; Rizzo, Riccardo; Urso, Alfonso
2017-01-01
Non-coding RNAs (ncRNAs) are small non-coding sequences involved in gene expression regulation of many biological processes and diseases. The recent discovery of a large set of different ncRNAs with biologically relevant roles has opened the way to developing methods able to discriminate between the different ncRNA classes. Moreover, the lack of knowledge about the complete mechanisms in regulative processes, together with the development of high-throughput technologies, has required bioinformatics tools to provide biologists and clinicians with a deeper comprehension of the functional roles of ncRNAs. In this work, we introduce a new ncRNA classification tool, nRC (non-coding RNA Classifier). Our approach is based on feature extraction from the ncRNA secondary structure together with a supervised classification algorithm implementing a deep learning architecture based on convolutional neural networks. We tested our approach on the classification of 13 different ncRNA classes and obtained classification scores using the most common statistical measures. In particular, we reach an accuracy and sensitivity score of about 74%. The proposed method outperforms other similar classification methods based on secondary structure features and machine learning algorithms, including the RNAcon tool that, to date, is the reference classifier. The nRC tool is freely available as a docker image at https://hub.docker.com/r/tblab/nrc/. The source code of the nRC tool is also available at https://github.com/IcarPA-TBlab/nrc.
A Neural-Network-Based Semi-Automated Geospatial Classification Tool
NASA Astrophysics Data System (ADS)
Hale, R. G.; Herzfeld, U. C.
2014-12-01
North America's largest glacier system, the Bering Bagley Glacier System (BBGS) in Alaska, surged in 2011-2013, as shown by rapid mass transfer, elevation change, and heavy crevassing. Little is known about the physics controlling surge glaciers' semi-cyclic patterns; therefore, it is crucial to collect and analyze as much data as possible so that predictive models can be made. In addition, physical signs frozen in ice in the form of crevasses may help serve as a warning for future surges. The BBGS surge provided an opportunity to develop an automated tool for crevasse classification based on imagery collected from small aircraft. The classification allows one to link image classification to geophysical processes associated with ice deformation. The tool uses an approach that employs geostatistical functions and a feed-forward perceptron with error back-propagation. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the Neural Network (NN) can recognize. In an application performing analysis on airborne videographic data from the surge of the BBGS, an NN was able to distinguish 18 different crevasse classes with 95 percent or higher accuracy for over 3,000 images. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect the appearance in imagery, we designed the tool's semi-automated pre-training algorithm to be adaptable. The tool can be optimized to specific settings and variables of image analysis (airborne and satellite imagery, different camera types, observation altitude, number and types of classes, and resolution). The generalization of the classification tool brings three important advantages: (1) multiple types of problems in geophysics can be studied, (2) the training process is sufficiently formalized to allow non-experts in neural nets to perform the training process, and (3) the time required to manually pre-sort imagery into classes is greatly reduced.
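The geostatistical parameterization step can be sketched as a directional experimental variogram computed along one image axis; its values form the kind of feature vector a neural network can take as input. A minimal version, with a random array standing in for a crevasse image:

```python
# Minimal sketch: directional experimental variogram of an image.
import numpy as np

def directional_variogram(image, max_lag, axis=1):
    # gamma(h) = 0.5 * mean((z(x) - z(x + h))^2) along the chosen axis
    gammas = []
    for h in range(1, max_lag + 1):
        a = np.take(image, range(image.shape[axis] - h), axis=axis)
        b = np.take(image, range(h, image.shape[axis]), axis=axis)
        gammas.append(0.5 * np.mean((a - b) ** 2))
    return np.array(gammas)

img = np.random.default_rng(6).random((64, 64))  # stand-in for a crevasse image
print(directional_variogram(img, max_lag=8))     # feature vector for the NN
```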
Limitations and implications of stream classification
Juracek, K.E.; Fitzpatrick, F.A.
2003-01-01
Stream classifications that are based on channel form, such as the Rosgen Level II classification, are useful tools for the physical description and grouping of streams and for providing a means of communication for stream studies involving scientists and (or) managers with different backgrounds. The Level II classification also is used as a tool to assess stream stability, infer geomorphic processes, predict future geomorphic response, and guide stream restoration or rehabilitation activities. The use of the Level II classification for these additional purposes is evaluated in this paper. Several examples are described to illustrate the limitations and management implications of the Level II classification. Limitations include: (1) time dependence, (2) uncertain applicability across physical environments, (3) difficulty in identification of a true equilibrium condition, (4) potential for incorrect determination of bankfull elevation, and (5) uncertain process significance of classification criteria. Implications of using stream classifications based on channel form, such as Rosgen's, include: (1) acceptance of the limitations, (2) acceptance of the risk of classifying streams incorrectly, and (3) classification results may be used inappropriately. It is concluded that use of the Level II classification for purposes beyond description and communication is not appropriate. Research needs are identified that, if addressed, may help improve the usefulness of the Level II classification.
Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F
2014-06-01
To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools that enable such a task to be performed. An essential step in managing urban areas from a sound standpoint is the evaluation of the area's soundscape. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step in evaluating it, providing a basis for designing or adapting it to match people's expectations as well. Accordingly, this work proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria. This classification model is intended to be used as a tool for comprehensive urban soundscape evaluation. Because of the great complexity associated with the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented in developing the classification model. The results indicate that the SMO-trained model outperforms the standard SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified). © 2013 Elsevier B.V. All rights reserved.
Promoting consistent use of the communication function classification system (CFCS).
Cunningham, Barbara Jane; Rosenbaum, Peter; Hidecker, Mary Jo Cooley
2016-01-01
We developed a Knowledge Translation (KT) intervention to standardize the way speech-language pathologists working in Ontario, Canada's Preschool Speech and Language Program (PSLP) used the Communication Function Classification System (CFCS). This tool was being used as part of a provincial program evaluation, and standardizing its use was critical for establishing reliability and validity within the provincial dataset. Two theoretical foundations - Diffusion of Innovations and the Communication Persuasion Matrix - were used to develop and disseminate the intervention to standardize use of the CFCS among a cohort of speech-language pathologists. A descriptive pre-test/post-test study was used to evaluate the intervention. Fifty-two participants completed an electronic pre-test survey, reviewed intervention materials online, and then immediately completed an electronic post-test survey. The intervention improved clinicians' understanding of how the CFCS should be used, their intentions to use the tool in the standardized way, and their abilities to make correct classifications using the tool. Findings from this work will be shared with representatives of the Ontario PSLP, and the intervention may be disseminated to all speech-language pathologists working in the program. This study can be used as a model for developing and disseminating KT interventions for clinicians in paediatric rehabilitation. The Communication Function Classification System (CFCS) is a new tool that allows speech-language pathologists to classify children's skills into five meaningful levels of function. There is uncertainty and inconsistent practice in the field about the methods for using this tool. This study combined two theoretical frameworks to develop an intervention to standardize use of the CFCS among a cohort of speech-language pathologists. The intervention effectively increased clinicians' understanding of the methods for using the CFCS, their ability to make correct classifications, and their intention to use the tool in the standardized way in the future.
Classification accuracy for stratification with remotely sensed data
Raymond L. Czaplewski; Paul L. Patterson
2003-01-01
Tools are developed that help specify the classification accuracy required from remotely sensed data. These tools are applied during the planning stage of a sample survey that will use poststratification, prestratification with proportional allocation, or double sampling for stratification. Accuracy standards are developed in terms of an "error matrix," which is...
Workplace screening for hand dermatitis: a pilot study.
Nichol, K; Copes, R; Spielmann, S; Kersey, K; Eriksson, J; Holness, D L
2016-01-01
Health care workers (HCWs) are at increased risk for developing occupational skin disease (OSD) such as dermatitis primarily due to exposure to wet work. Identification of risk factors and workplace screening can help early detection of OSD to avoid the condition becoming chronic. To determine risk factors and clinical findings for hand dermatitis using a workplace screening tool. Employees at a large teaching hospital in Toronto, Canada, were invited to complete a two-part hand dermatitis screening tool. Part 1 inquired about hand hygiene practices and Part 2 comprised a visual assessment of participants' hands by a health professional and classification as (i) normal, (ii) mild dermatitis or (iii) moderate/severe dermatitis. Risk factors were determined using chi-square and Cochran-Armitage analysis on a dichotomous variable, where Yes represented either a mild or moderate/severe disease classification. There were 183 participants out of 643 eligible employees; response rate 28%. Mild or moderate/severe dermatitis was present in 72% of participants. These employees were more likely to work directly with patients, have worked longer in a health care setting, wash hands and change gloves more frequently, wear gloves for more hours per day, have a history of eczema or dermatitis and report a current rash on the hands or rash in the past 12 months. There was a high percentage of HCWs with dermatitis and risk factors for dermatitis. These findings argue for increased attention to prevention and early identification of hand dermatitis and support further testing of the workplace screening tool. © The Author 2015. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
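The Cochran-Armitage trend test used in the analysis can be implemented in a few lines; the case counts below are illustrative, not the study's data:

```python
# Minimal sketch: Cochran-Armitage test for trend in proportions.
import numpy as np
from scipy.stats import norm

def cochran_armitage(cases, totals, scores):
    cases, totals, scores = map(np.asarray, (cases, totals, scores))
    p = cases.sum() / totals.sum()                       # pooled proportion
    t = np.sum(scores * (cases - totals * p))            # trend statistic
    var = p * (1 - p) * (np.sum(totals * scores ** 2)
                         - np.sum(totals * scores) ** 2 / totals.sum())
    z = t / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))                        # two-sided p-value

# dermatitis cases / participants by ordered exposure group (illustrative)
print(cochran_armitage(cases=[10, 25, 40], totals=[60, 60, 63], scores=[0, 1, 2]))
```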
Global Dynamic Exposure and the OpenBuildingMap - Communicating Risk and Involving Communities
NASA Astrophysics Data System (ADS)
Schorlemmer, Danijel; Beutin, Thomas; Hirata, Naoshi; Hao, Ken; Wyss, Max; Cotton, Fabrice; Prehn, Karsten
2017-04-01
Detailed understanding of local risk factors regarding natural catastrophes requires in-depth characterization of the local exposure. Current exposure capture techniques have to find the balance between resolution and coverage. We aim at bridging this gap by employing a crowd-sourced approach to exposure capturing, focusing on risk related to earthquake hazard. OpenStreetMap (OSM), the rich and constantly growing geographical database, is an ideal foundation for this task. More than 3.5 billion geographical nodes, more than 200 million building footprints (growing by 100'000 per day), and a plethora of information about school, hospital, and other critical facilities allows us to exploit this dataset for risk-related computations. We are combining the strengths of crowd-sourced data collection with the knowledge of experts in extracting the most information from these data. Besides relying on the very active OpenStreetMap community and the Humanitarian OpenStreetMap Team, which are collecting building information at high pace, we are providing a tailored building capture tool for mobile devices. This tool is facilitating simple and fast building property capturing for OpenStreetMap by any person or interested community. With our OpenBuildingMap system, we are harvesting this dataset by processing every building in near-realtime. We are collecting exposure and vulnerability indicators from explicitly provided data (e.g. hospital locations), implicitly provided data (e.g. building shapes and positions), and semantically derived data, i.e. interpretation applying expert knowledge. The expert knowledge is needed to translate the simple building properties as captured by OpenStreetMap users into vulnerability and exposure indicators and subsequently into building classifications as defined in the Building Taxonomy 2.0 developed by the Global Earthquake Model (GEM) and the European Macroseismic Scale (EMS98). With this approach, we increase the resolution of existing exposure models from aggregated exposure information to building-by-building vulnerability. We report on our method, on the software development for the mobile application and the server-side analysis system, and on the OpenBuildingMap (www.openbuildingmap.org), our global Tile Map Service focusing on building properties. The free/open framework we provide can be used on commodity hardware for local to regional exposure capturing, for stakeholders in disaster management and mitigation for communicating risk, and for communities to understand their risk.
A New Tool for Climatic Analysis Using the Koppen Climate Classification
ERIC Educational Resources Information Center
Larson, Paul R.; Lohrengel, C. Frederick, II
2011-01-01
The purpose of climate classification is to help make order of the seemingly endless spatial distribution of climates. The Koppen classification system in a modified format is the most widely applied system in use today. This system may not be the best nor most complete climate classification that can be conceived, but it has gained widespread…
77 FR 1633 - Bacillus Subtilis Strain CX-9060; Exemption From the Requirement of a Tolerance
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-11
... Classification System (NAICS) codes have been provided to assist you and others in determining whether this... other exposures for which there is reliable information.'' This includes exposure through drinking water... exposure to the pesticide through food, drinking water, and through other exposures that occur as a result...
Hydrologic Landscape Classification to Estimate Bristol Bay Watershed Hydrology
The use of hydrologic landscapes has proven to be a useful tool for broad scale assessment and classification of landscapes across the United States. These classification systems help organize larger geographical areas into areas of similar hydrologic characteristics based on cl...
Douglas, P; Tyrrel, S F; Kinnersley, R P; Whelan, M; Longhurst, P J; Walsh, K; Pollard, S J T; Drew, G H
2016-12-15
Bioaerosols are released in elevated quantities from composting facilities and are associated with negative health effects, although dose-response relationships are not well understood, and require improved exposure classification. Dispersion modelling has great potential to improve exposure classification, but has not yet been extensively used or validated in this context. We present a sensitivity analysis of the ADMS dispersion model specific to input parameter ranges relevant to bioaerosol emissions from open windrow composting. This analysis provides an aid for model calibration by prioritising parameter adjustment and targeting independent parameter estimation. Results showed that predicted exposure was most sensitive to the wet and dry deposition modules and the majority of parameters relating to emission source characteristics, including pollutant emission velocity, source geometry and source height. This research improves understanding of the accuracy of model input data required to provide more reliable exposure predictions. Copyright © 2016. Published by Elsevier Ltd.
Assessment and classification of cancer breakthrough pain: a systematic literature review.
Haugen, Dagny Faksvåg; Hjermstad, Marianne Jensen; Hagen, Neil; Caraceni, Augusto; Kaasa, Stein
2010-06-01
Temporal variations in cancer pain intensity are highly prevalent and are often difficult to manage. However, the phenomenon is not well understood: several definitions and approaches to classification and bedside assessment of cancer breakthrough pain (BTP) have been described. The present study is a systematic review of published literature on cancer BTP to answer the following questions: which terms and definitions have been used; are there validated assessment tools; which domains of BTP do the tools delineate, and which items do they contain; how have assessment tools been applied within clinical studies; and are there validated classification systems for BTP? A systematic search of the peer-reviewed literature was performed using five major databases. Of 375 titles and abstracts initially identified, 51 articles were examined in detail. Analysis of these publications indicates that a range of overlapping but distinct definitions has been used to characterize BTP; 42 of the included papers presented one or more ways of classifying BTP; and while 10 tools to assess patients' experience of BTP were identified, only 2 have been partially validated. We conclude that there is no widely accepted definition, classification system or well-validated assessment tool for cancer-related breakthrough pain, but there is strong concurrence on most of its key attributes. With further work in this area, an internationally agreed-upon definition and classification system for cancer-related breakthrough pain, and a standard approach to measuring it, hold the promise of improving patient care and supporting research into this poor-prognosis cancer pain syndrome.
Alecu, C S; Jitaru, E; Moisil, I
2000-01-01
This paper presents some tools designed and implemented for learning-related purposes; these tools can be downloaded or run on the TeleNurse web site. Among other facilities, the TeleNurse web site now hosts version 1.2 of SysTerN (a terminology system for nursing), which can be downloaded on request, as well as the "Evaluation of Translation" form, which was designed to improve the Romanian translation of the ICNP (the International Classification of Nursing Practice). SysTerN was developed within the framework of the TeleNurse ID--ENTITY Telematics for Health EU project. This version uses the beta version of the ICNP, containing the Phenomena and Actions classifications. This classification is intended to facilitate documentation of nursing practice by providing a terminology, or vocabulary, for use in describing the nursing process. The TeleNurse site is bilingual, Romanian-English, in order to enlarge the discussion forum with members from other CEE (or non-CEE) countries.
Kapellusch, Jay M; Silverstein, Barbara A; Bao, Stephen S; Thiese, Mathew S; Merryweather, Andrew S; Hegmann, Kurt T; Garg, Arun
2018-02-01
The Strain Index (SI) and the American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit value for hand activity level (TLV for HAL) have been shown to be associated with prevalence of distal upper-limb musculoskeletal disorders such as carpal tunnel syndrome (CTS). The SI and TLV for HAL disagree on more than half of task exposure classifications. Similarly, time-weighted average (TWA), peak, and typical exposure techniques used to quantify physical exposure from multi-task jobs have shown between-technique agreement ranging from 61% to 93%, depending upon whether the SI or TLV for HAL model was used. This study compared exposure-response relationships between each model-technique combination and prevalence of CTS. Physical exposure data from 1,834 workers (710 with multi-task jobs) were analyzed using the SI and TLV for HAL and the TWA, typical, and peak multi-task job exposure techniques. Additionally, exposure classifications from the SI and TLV for HAL were combined into a single measure and evaluated. Prevalent CTS cases were identified using symptoms and nerve-conduction studies. Mixed effects logistic regression was used to quantify exposure-response relationships between categorized (i.e., low, medium, and high) physical exposure and CTS prevalence for all model-technique combinations, and for multi-task workers, mono-task workers, and all workers combined. Except for the TWA TLV for HAL, all model-technique combinations showed monotonic increases in risk of CTS with increased physical exposure. The combined-models approach showed stronger association than the SI or TLV for HAL alone for multi-task workers. Despite differences in exposure classifications, nearly all model-technique combinations showed exposure-response relationships with prevalence of CTS for the combined sample of mono-task and multi-task workers. Both the TLV for HAL and the SI, with the TWA or typical techniques, appear useful for epidemiological studies and surveillance. However, the utility of TWA, typical, and peak techniques for job design and intervention is dubious.
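For readers unfamiliar with the three aggregation techniques compared above, the sketch below shows common working definitions for multi-task jobs: a duration-weighted average (TWA), the highest-exposure task (peak), and the task occupying the most time (typical). The per-task scores and durations are illustrative assumptions, and published implementations differ in details.

    # Hedged sketch of TWA, peak, and typical exposure aggregation for a
    # multi-task job; scores could be, e.g., task-level Strain Index values.
    def twa(scores, minutes):
        return sum(s * m for s, m in zip(scores, minutes)) / sum(minutes)

    def peak(scores, minutes):
        return max(scores)

    def typical(scores, minutes):
        return scores[minutes.index(max(minutes))]  # longest task's exposure

    scores, minutes = [3.0, 9.0, 13.5], [240, 150, 90]
    print(twa(scores, minutes), peak(scores, minutes), typical(scores, minutes))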
Patient classification tool in home health care.
Pavasaris, B
1989-01-01
Medicare's system of diagnosis-related groups (DRGs) for health care cost reimbursement is inadequate for the special requirements of home health care. A visiting nurses association's patient classification tool correlates a meticulous record of professional time spent per patient with patient diagnosis and level of care, aimed at helping policymakers develop a more equitable DRG-based prospective payment formula for home care costs.
[Difficulties of the methods for studying environmental exposure and neural tube defects].
Borja-Aburto, V H; Bermúdez-Castro, O; Lacasaña-Navarro, M; Kuri, P; Bustamante-Montes, P; Torres-Meza, V
1999-01-01
To discuss approaches to the assessment of environmental exposures as risk factors associated with neural tube defects, and to present the main risk factors studied to date. Environmental exposures have been suggested to have a role in the genesis of birth defects. However, studies conducted in human populations have had difficulty demonstrating such an association for neural tube defects (anencephaly, spina bifida and encephalocele) because of problems arising from: a) the frequency measures used to compare time trends and communities; b) the classification of heterogeneous malformations; c) the inclusion of maternal, paternal and fetal factors as an integrated process; and d) the assessment of environmental exposures. Hypothetically, both maternal and paternal environmental exposures can produce damage before and after conception by direct action on the embryo and the fetus-placenta complex. Therefore, the assessment of environmental exposures needs to take into account: a) both paternal and maternal exposures; b) the critical exposure period, three months before conception for paternal exposures and one month around the conceptional period for maternal exposures; c) quantitative evaluation of environmental exposures when possible, avoiding a dichotomous classification; and d) the use of biological markers of exposure, which is highly recommended, as well as markers of genetic susceptibility.
Robert E. Keane; Jason M. Herynk; Chris Toney; Shawn P. Urbanski; Duncan C. Lutes; Roger D. Ottmar
2015-01-01
Fuel classifications are integral tools in fire management and planning because they are used as inputs to fire behavior and effects simulation models. Fuel Loading Models (FLMs) and Fuel Characteristic Classification System (FCCS) fuelbeds are the most popular classifications used throughout wildland fire science and management, but they have yet to be thoroughly...
The Reliability of Galaxy Classifications by Citizen Scientists
NASA Astrophysics Data System (ADS)
Francis, Lennox; Kautsch, Stefan J.; Bizyaev, Dmitry
2017-01-01
Citizen scientists are becoming more and more important in helping professionals work through big data. An example in astronomy is crowdsourced galaxy classification. But how reliable are these classifications for studies of galaxy evolution? We present a tool for investigating such morphological classifications and test it on a diverse population on our campus. We observe a slight offset towards earlier Hubble types in the crowdsourced morphologies when compared to professional classifications.
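Agreement between crowdsourced and professional labels of the kind discussed here is commonly summarized with chance-corrected statistics. The sketch below computes Cohen's kappa on toy Hubble-type labels; the label values are illustrative, not data from the study.

    # Hedged sketch: chance-corrected agreement between two sets of
    # morphological classifications (toy labels).
    from sklearn.metrics import cohen_kappa_score

    crowd        = ["E", "S0", "Sa", "Sb", "Sc", "Sb", "E"]
    professional = ["E", "Sa", "Sa", "Sc", "Sc", "Sb", "S0"]
    print(cohen_kappa_score(crowd, professional))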
Stehle, Sebastian; Bub, Sascha; Schulz, Ralf
2018-10-15
The decades-long agricultural use of insecticides has resulted in frequent contamination of surface waters globally, regularly posing high risks for aquatic biodiversity. However, the concentration levels of individual insecticide compounds have so far not been compiled and reported using global-scale data, hampering our knowledge of the insecticide exposure of aquatic ecosystems. Here, we specify measured insecticide concentrations (MICs, comprising in total 11,300 water and sediment concentrations taken from a previous publication) for 28 important insecticide compounds covering four major insecticide classes. Results show that organochlorine and organophosphate insecticides, which dominated the global insecticide market for decades, have been detected most often and at the highest concentration levels in surface waters globally. In comparison, MICs of the more recent pyrethroids and neonicotinoids were less often reported and generally at lower concentrations as a result of their later market introduction and lower application rates. An online insecticide classification calculator (ICC; available at: https://static.magic.eco/icc/v1) is provided to enable the comparison and classification of prospective MICs against available global insecticide concentrations. Spatial analyses of existing data show that most MICs were reported for surface waters in North America, Asia and Europe, whereas the highest concentration levels were detected in Africa, Asia and South America. An evaluation of water and sediment MICs showed that theoretical organic carbon-water partition coefficients (KOC) determined in the laboratory overestimated KOC values based on actual field concentrations by up to a factor of more than 20, with the highest deviations found for highly sorptive pyrethroids. Overall, the comprehensive compilation of insecticide field concentrations presented here is a valuable tool for the classification of future surface water monitoring results and serves as important input data for more field-relevant toxicity testing approaches and pesticide exposure and risk assessment schemes.
Saxena, Prem Narain
2013-01-01
Despite recent advances in understanding mechanisms of toxicity, the development of biomarkers (biochemicals that vary significantly with exposure) for pesticides and environmental contaminants is still a challenging task. Carbofuran is one of the most commonly used pesticides in agriculture and is considered the most toxic carbamate pesticide. It is necessary to identify the biochemicals that vary significantly after carbofuran exposure in earthworms, which will help to assess soil ecotoxicity. Initially, we optimized extraction conditions suitable for high-throughput gas chromatography-mass spectrometry (GC-MS) based metabolomics of tissue from the earthworm Metaphire posthuma. Upon evaluation of five different extraction solvent systems, 80% methanol was found to have good extraction efficiency based on the yields of metabolites, multivariate analysis, total number of peaks and reproducibility of metabolites. Toxicity evaluation was then performed to characterize the tissue-specific metabolomic perturbation of Metaphire posthuma after exposure to carbofuran at three concentration levels (0.15, 0.3 and 0.6 mg/kg of soil). Seventeen metabolites contributing most to the classification of earthworms exposed to the highest carbofuran dose versus healthy controls were identified. This study suggests that the GC-MS based metabolomic approach is precise and sensitive enough to measure earthworm responses to carbofuran exposure in soil, and can be used as a promising tool for environmental ecotoxicological studies.
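The multivariate step in studies like this one often starts with autoscaling a peak table and projecting it with PCA, then reading candidate marker metabolites off the loadings. The sketch below shows that generic workflow on synthetic data; the matrix shape and the choice of 17 top loadings are illustrative assumptions, not the paper's actual analysis.

    # Hedged sketch of a generic GC-MS metabolomics workflow: autoscale,
    # project with PCA, and rank metabolites by |loading| on PC1.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    X = rng.lognormal(size=(24, 120))        # samples x metabolite peaks (toy)

    pca = PCA(n_components=2)
    scores = pca.fit_transform(StandardScaler().fit_transform(X))
    top17 = np.argsort(np.abs(pca.components_[0]))[::-1][:17]
    print(scores.shape, top17)               # candidate marker metabolites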
Darwich, Adam S; Henderson, Kathryn; Burgin, Angela; Ward, Nicola; Whittam, Janet; Ammori, Basil J; Ashcroft, Darren M; Rostami-Hodjegan, Amin
2012-01-01
AIMS To identify the most commonly prescribed drugs in a bariatric surgery population and to assess existing evidence regarding trends in oral drug bioavailability post bariatric surgery. METHODS A retrospective audit was undertaken to document commonly prescribed drugs amongst patients undergoing bariatric surgery in an NHS hospital in the UK and to assess practice for drug administration following bariatric surgery. The available literature was examined for trends relating to drug permeability and solubility with regards to the Biopharmaceutics Classification System (BCS) and main route of elimination. RESULTS No significant difference in the ‘post/pre surgery oral drug exposure ratio’ (ppR) was apparent between BCS class I to IV drugs, with regards to dose number (Do) or main route of elimination. Drugs classified as ‘solubility limited’ displayed an overall reduction as compared with ‘freely soluble’ compounds, as well as an unaltered and increased ppR. CONCLUSION Clinical studies establishing guidelines for commonly prescribed drugs, and the monitoring of drugs exhibiting a narrow therapeutic window or without a readily assessed clinical endpoint, are warranted. Using mechanistically based pharmacokinetic modelling for simulating the multivariate nature of changes in drug exposure may serve as a useful tool in the further understanding of postoperative trends in oral drug exposure and in developing practical clinical guidance.
Automatic classification of blank substrate defects
NASA Astrophysics Data System (ADS)
Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati
2014-10-01
Mask preparation stages are crucial in mask manufacturing, since the mask later acts as a template for a considerable number of dies on the wafer. Defects on the initial blank substrate, and on subsequent cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, and the separation of defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment inherent in the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask Technology Center (MPMask). The Calibre ADC tool was qualified on production mask blanks against manual classification. The classification accuracy of ADC is greater than 95% for critical defects, with an overall accuracy of 90%. Sensitivity to weak defect signals and locating the defect in the images are challenges we are resolving. The performance of the tool has been demonstrated on multiple mask types and is ready for deployment in the full-volume mask manufacturing production flow. Implementation of Calibre ADC is estimated to reduce the misclassification of critical defects by 60-80%.
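The decision-tree step described above can be pictured with a few lines of scikit-learn. In the sketch below, each defect is a feature vector (size, polarity in transmitted/reflected images, signal-to-noise ratio) mapped to a classification code; all feature values and labels are synthetic placeholders, not Calibre ADC's internal model.

    # Hedged sketch: a decision tree mapping defect features to class codes.
    from sklearn.tree import DecisionTreeClassifier

    # columns: size_nm, polarity_transmitted (+1/-1), polarity_reflected, snr
    X = [[120, +1, +1, 9.0], [35, -1, +1, 2.1], [260, -1, -1, 12.5],
         [40, +1, -1, 1.8], [180, +1, +1, 7.7], [30, -1, -1, 1.2]]
    y = ["pit", "false", "particle", "false", "pit", "false"]  # toy codes

    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(tree.predict([[200, -1, -1, 10.0]]))  # classify a new defect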
New tools for evaluating LQAS survey designs.
Hund, Lauren
2014-02-15
Lot Quality Assurance Sampling (LQAS) surveys have become increasingly popular in global health care applications. Incorporating Bayesian ideas into LQAS survey design, such as using reasonable prior beliefs about the distribution of an indicator, can improve the selection of design parameters and decision rules. In this paper, a joint frequentist and Bayesian framework is proposed for evaluating LQAS classification accuracy and informing survey design parameters. Simple software tools are provided for calculating the positive and negative predictive value of a design with respect to an underlying coverage distribution and the selected design parameters. These tools are illustrated using a data example from two consecutive LQAS surveys measuring Oral Rehydration Solution (ORS) preparation. Using the survey tools, the dependence of classification accuracy on benchmark selection and the width of the 'grey region' are clarified in the context of ORS preparation across seven supervision areas. Following the completion of an LQAS survey, estimation of the distribution of coverage across areas facilitates quantifying classification accuracy and can help guide intervention decisions.
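The positive and negative predictive value calculations described can be reproduced in a few lines by simulation: draw true coverage for each area from a prior, simulate the LQAS sample, apply the decision rule, and tabulate how often the classification matches the benchmark. All design parameters and the prior below are illustrative assumptions, not those of the ORS application.

    # Hedged sketch: PPV/NPV of an LQAS design under an assumed coverage prior.
    import numpy as np

    rng = np.random.default_rng(0)
    n, d, benchmark = 19, 13, 0.70            # sample size, decision rule, target
    coverage = rng.beta(6, 3, size=100_000)   # assumed prior over true coverage
    passes = rng.binomial(n, coverage) >= d   # areas classified as 'acceptable'

    ppv = np.mean(coverage[passes] >= benchmark)
    npv = np.mean(coverage[~passes] < benchmark)
    print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")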
DATA-MEAns: an open source tool for the classification and management of neural ensemble recordings.
Bonomini, María P; Ferrandez, José M; Bolea, Jose Angel; Fernandez, Eduardo
2005-10-30
The number of laboratories using techniques that allow simultaneous recording of as many units as possible is increasing considerably. However, the development of tools used to analyse this multi-neuronal activity generally lags behind the development of the tools used to acquire the data. Moreover, data exchange between research groups using different multielectrode acquisition systems is hindered by commercial constraints such as proprietary file structures, high-priced licenses and strict intellectual-property policies. This paper presents free open-source software for the classification and management of neural ensemble data. The main goal is to provide a graphical user interface that links the experimental data to a basic set of routines for analysis, visualization and classification in a consistent framework. To facilitate adaptation and extension, as well as the addition of new routines, tools and algorithms for data analysis, the source code and documentation are freely available.
Kilańska, D; Gaworska-Krzemińska, A; Grabowska, H; Gorzkowicz, B
2016-09-01
The development of nursing practice, improvements in nurses' autonomy, and increased professional and personal responsibility for the medical services provided all require professional documentation with records of health status assessments, decisions undertaken, actions and their outcomes for each patient. The International Classification for Nursing Practice is a tool that meets all of these needs, and although it requires continuous evaluation, it supports professional documentation and communication in the practitioner and researcher community. The aim of this paper is to present a theoretical critique of an issue related to policy and experience of the current situation in Polish nursing - especially of the efforts to standardize nursing practice through the introduction and development of the Classification in Poland. Despite extensive promotion and training by International Council of Nurses members worldwide, there are still many countries where the Classification has not been implemented as a standard tool in healthcare facilities. Recently, a number of initiatives were undertaken in cooperation with the local and state authorities to disseminate the Classification in healthcare facilities. Thanks to intense efforts by the Polish Nurses Association and the International Council of Nurses Accredited Center for ICNP® Research & Development at the Medical University of Łódź, the Classification is known in Poland and has been tested at several centres. Nevertheless, an actual implementation that would allow for national and international interoperability requires strategic governmental decisions and close cooperation with information technology companies operating in the country. Discussing the barriers to the implementation of the Classification can improve understanding of it and its use. At a policy level, decision makers need to understand that using the Classification in eHealth services and tools is necessary to achieve interoperability.
Addressing multi-label imbalance problem of surgical tool detection using CNN.
Sahu, Manish; Mukhopadhyay, Anirban; Szengel, Angelika; Zachow, Stefan
2017-06-01
A fully automated surgical tool detection framework is proposed for endoscopic video streams. State-of-the-art surgical tool detection methods rely on supervised one-vs-all or multi-class classification techniques, completely ignoring the co-occurrence relationship of the tools and the associated class imbalance. In this paper, we formulate tool detection as a multi-label classification task in which tool co-occurrences are treated as separate classes. In addition, imbalance in tool co-occurrences is analyzed and stratification techniques are employed to address this imbalance during convolutional neural network (CNN) training. Moreover, temporal smoothing is introduced as an online post-processing step to enhance runtime prediction. Quantitative analysis is performed on the M2CAI16 tool detection dataset to highlight the importance of stratification, temporal smoothing and the overall framework for tool detection. The analysis of tool imbalance, backed by the empirical results, indicates the need for, and the superiority of, the proposed framework over state-of-the-art techniques.
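The label-set recoding at the heart of this formulation is simple to state in code: every distinct combination of simultaneously visible tools becomes one class, and inverse-frequency weights can then counter the resulting imbalance during training. The sketch below shows that recoding on toy per-frame annotations; the tool names and weighting rule are illustrative, not the paper's exact scheme.

    # Hedged sketch: recode per-frame tool sets as co-occurrence classes and
    # derive inverse-frequency class weights for imbalance-aware training.
    from collections import Counter

    frames = [("grasper",), ("grasper", "hook"), ("hook",),
              ("grasper", "hook"), ("grasper",), ("grasper",)]
    classes = sorted(set(frames))
    y = [classes.index(f) for f in frames]

    counts = Counter(y)
    weights = {c: len(y) / (len(counts) * k) for c, k in counts.items()}
    print(classes)   # each tuple of co-occurring tools is one class
    print(weights)   # rarer co-occurrences get larger training weights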
Binetti, R; Costamagna, F M; Marcello, I
2001-01-01
International, national and regulatory classification, evaluation, guidelines and occupational exposure values regarding vinyl chloride and 1,2-dichloroethane, carried out by European Union (EU). Environmental Protection Agency (US EPA), International Agency for Research on Cancer (IARC), Italian National Advisory Toxicological Committee (CCTN), Occupational Safety and Health Administration (OSHA), World Health Organization (WHO), National Institute for Occupational Safety and Health (NIOSH), American Conference of Governmental Industrial Hygienists (ACGIH) and other institutions, have been considered with particular reference to the carcinogenic effects. Moreover information is reported in support of classification and evaluation and a short historical review since early 1970s, when first evidence that occupational exposure to VC could lead to angiosarcoma was published.
Gross, Douglas P; Zhang, Jing; Steenstra, Ivan; Barnsley, Susan; Haws, Calvin; Amell, Tyler; McIntosh, Greg; Cooper, Juliette; Zaiane, Osmar
2013-12-01
To develop a classification algorithm and accompanying computer-based clinical decision support tool to help categorize injured workers toward optimal rehabilitation interventions based on unique worker characteristics. Population-based historical cohort design. Data were extracted from a Canadian provincial workers' compensation database on all claimants undergoing work assessment between December 2009 and January 2011. Data were available on: (1) numerous personal, clinical, occupational, and social variables; (2) type of rehabilitation undertaken; and (3) outcomes following rehabilitation (receiving time loss benefits or undergoing repeat programs). Machine learning, concerned with the design of algorithms to discriminate between classes based on empirical data, was the foundation of our approach to build a classification system with multiple independent and dependent variables. The population included 8,611 unique claimants. Subjects were predominantly employed (85 %) males (64 %) with diagnoses of sprain/strain (44 %). Baseline clinician classification accuracy was high (ROC = 0.86) for selecting programs that lead to successful return-to-work. Classification performance for machine learning techniques outperformed the clinician baseline classification (ROC = 0.94). The final classifiers were multifactorial and included the variables: injury duration, occupation, job attachment status, work status, modified work availability, pain intensity rating, self-rated occupational disability, and 9 items from the SF-36 Health Survey. The use of machine learning classification techniques appears to have resulted in classification performance better than clinician decision-making. The final algorithm has been integrated into a computer-based clinical decision support tool that requires additional validation in a clinical sample.
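A minimal version of the model-vs-baseline comparison reported above is a held-out ROC analysis: fit a classifier on worker features and compare its area under the ROC curve with a reference. The sketch below uses synthetic data and logistic regression purely for illustration; the study itself used a range of machine learning classifiers and data structures not reproduced here.

    # Hedged sketch: ROC-AUC evaluation of a classifier on synthetic data.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=2000, n_features=15, random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
    print(roc_auc_score(yte, model.predict_proba(Xte)[:, 1]))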
Applications of Support Vector Machines In Chemo And Bioinformatics
NASA Astrophysics Data System (ADS)
Jayaraman, V. K.; Sundararajan, V.
2010-10-01
Conventional linear and nonlinear tools for classification, regression and data-driven modeling are being replaced at a rapid pace by newer techniques and tools based on artificial intelligence and machine learning. While linear techniques are not applicable to inherently nonlinear problems, the newer methods serve as attractive alternatives for solving real-life problems. Support Vector Machine (SVM) classifiers are a set of universal feed-forward-network-based classification algorithms formulated from statistical learning theory and the structural risk minimization principle. SVM regression closely follows the classification methodology. In this work, recent applications of SVM in chemo- and bioinformatics are described with suitable illustrative examples.
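For concreteness, the sketch below trains and cross-validates an RBF-kernel SVM on a synthetic feature matrix of the sort used in chemo- and bioinformatics (compounds or sequences as rows, descriptors as columns). The data and hyperparameters are illustrative assumptions.

    # Hedged sketch: cross-validated SVM classification on synthetic descriptors.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, n_features=20, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    print(cross_val_score(clf, X, y, cv=5).mean())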
Gemovic, Branislava; Perovic, Vladimir; Glisic, Sanja; Veljkovic, Nevena
2013-01-01
There are more than 500 amino acid substitutions in each human genome, and bioinformatics tools contribute irreplaceably to the determination of their functional effects. We have developed a feature-based algorithm for the detection of mutations outside conserved functional domains (CFDs) and compared its classification efficacy with the most commonly used phylogeny-based tools, PolyPhen-2 and SIFT. The new algorithm is based on the informational spectrum method (ISM), a feature-based technique, and statistical analysis. Our dataset contained neutral polymorphisms and mutations associated with myeloid malignancies from the epigenetic regulators ASXL1, DNMT3A, EZH2, and TET2. PolyPhen-2 and SIFT had significantly lower accuracies in predicting the effects of amino acid substitutions outside CFDs than expected, with especially low sensitivity. On the other hand, only the ISM algorithm showed statistically significant classification of these sequences. It outperformed PolyPhen-2 and SIFT by 15% and 13%, respectively. These results suggest that feature-based methods, like ISM, are more suitable than phylogeny-based tools for the classification of amino acid substitutions outside CFDs.
Cox, Kyley J; Porucznik, Christina A; Anderson, David J; Brozek, Eric M; Szczotka, Kathryn M; Bailey, Nicole M; Wilkins, Diana G; Stanford, Joseph B
2016-04-01
Bisphenol A (BPA) is an endocrine disruptor and potential reproductive toxicant, but results of epidemiologic studies have been mixed and have been criticized for inadequate exposure assessment that often relies on a single measurement. Our goal was to describe the distribution of BPA concentrations in serial urinary specimens, assess temporal variability, and provide estimates of exposure classification when randomly selected samples are used to predict average exposure. We collected and analyzed 2,614 urine specimens from 83 Utah couples beginning in 2012. Female participants collected daily first-morning urine specimens during one to two menstrual cycles and male partners collected specimens during the woman's fertile window for each cycle. We measured urinary BPA concentrations and calculated geometric means (GM) for each cycle, characterized the distribution of observed values and temporal variability using intraclass correlation coefficients, and performed surrogate category analyses to determine how well repeat samples could classify exposure. The GM urine BPA concentration was 2.78 ng/mL among males and 2.44 ng/mL among females. BPA had a high degree of variability among both males (ICC = 0.18; 95% CI: 0.11, 0.26) and females (ICC = 0.11; 95% CI: 0.08, 0.16). Based on our more stringent surrogate category analysis, to reach proportions ≥ 0.80 for sensitivity, specificity, and positive predictive value (PPV) among females, 6 and 10 repeat samples for the high and low tertiles, respectively, were required. For the medium tertile, specificity reached 0.87 with 10 repeat samples, but even with 11 samples, sensitivity and PPV did not exceed 0.36. Five repeat samples, among males, yielded sensitivity and PPV values ≥ 0.75 for the high and low tertiles, but, similar to females, classification for the medium tertile was less accurate. Repeated urinary specimens are required to characterize typical BPA exposure.
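The intraclass correlation coefficients reported above come from a one-way random-effects decomposition: between-subject variance divided by total variance. The sketch below computes ICC(1) from a balanced subjects-by-repeats matrix; the simulated variance components are illustrative, chosen to give a low ICC as observed for urinary BPA.

    # Hedged sketch: one-way random-effects ICC from a balanced design.
    import numpy as np

    def icc_oneway(x):
        """x: subjects x repeated measurements (balanced)."""
        n, k = x.shape
        msb = k * ((x.mean(axis=1) - x.mean()) ** 2).sum() / (n - 1)
        msw = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
        return (msb - msw) / (msb + (k - 1) * msw)

    rng = np.random.default_rng(1)
    subject_means = rng.normal(0.0, 0.4, size=(50, 1))       # between-subject
    x = subject_means + rng.normal(0.0, 1.0, size=(50, 6))   # within-subject
    print(icc_oneway(x))  # low ICC when within-subject variability dominates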
Real-Time Fault Classification for Plasma Processes
Yang, Ryan; Chen, Rongshun
2011-01-01
Plasma process tools, which usually cost several million US dollars, are often used in the semiconductor fabrication etching process. If the plasma process is halted due to a process fault, productivity is reduced and cost increases. In order to maximize product/wafer yield and tool productivity, timely and effective fault detection is required in a plasma reactor. The classification of fault events can help users quickly identify faulty processes, and thus save downtime of the plasma tool. In this work, optical emission spectroscopy (OES) is employed as the metrology sensor for in-situ process monitoring. Split into twelve different match rates by spectrum band, the matching rate indicator from our previous work (Yang, R.; Chen, R.S. Sensors 2010, 10, 5703–5723) is used to detect faulty processes. Based on the match data, real-time classification of plasma faults is achieved by a novel method developed in this study. Experiments were conducted to validate the novel fault classification. From the experimental results, we conclude that the proposed method is feasible, as the overall classification accuracy for fault event shifts is 27 out of 28, or about 96.4%.
The Population Life-course Exposure to Health Effects Modeling (PLETHEM) platform being developed provides a tool that links results from emerging toxicity testing tools to exposure estimates for humans as defined by the USEPA. A reverse dosimetry case study using phthalates was ...
SoFoCles: feature filtering for microarray classification based on gene ontology.
Papachristoudis, Georgios; Diplaris, Sotiris; Mitkas, Pericles A
2010-02-01
Marker gene selection has been an important research topic in the classification analysis of gene expression data. Current methods try to reduce the "curse of dimensionality" by using statistical intra-feature-set calculations, or classifiers that are based on the given dataset. In this paper, we present SoFoCles, an interactive tool that enables semantic feature filtering in microarray classification problems with the use of external, well-defined knowledge retrieved from the Gene Ontology. The notion of semantic similarity is used to derive genes that are involved in the same biological pathway during the microarray experiment, by enriching a feature set that has been initially produced with legacy methods. Among its other functionalities, SoFoCles offers a large repository of semantic similarity methods that are used to derive feature sets and marker genes. The structure and functionality of the tool are discussed in detail, as well as its ability to improve classification accuracy. Through experimental evaluation, SoFoCles is shown to outperform other classification schemes in terms of classification accuracy in two real datasets using different semantic similarity computation approaches.
European solvent industry group generic exposure scenario risk and exposure tool.
Zaleski, Rosemary T; Qian, Hua; Zelenka, Michael P; George-Ares, Anita; Money, Chris
2014-01-01
The European Solvents Industry Group (ESIG) Generic Exposure Scenario (GES) Risk and Exposure Tool (EGRET) was developed to facilitate the safety evaluation of consumer uses of solvents, as required by the European Union Registration, Evaluation and Authorization of Chemicals (REACH) Regulation. This exposure-based risk assessment tool provides estimates of both exposure and risk characterization ratios for consumer uses. It builds upon the consumer portion of the European Center for Ecotoxicology and Toxicology of Chemicals (ECETOC) Targeted Risk Assessment (TRA) tool by implementing refinements described in ECETOC TR107. Technical enhancements included the use of additional data to refine scenario defaults and the ability to include additional parameters in exposure calculations. Scenarios were also added to cover all frequently encountered consumer uses of solvents. The TRA tool structure was modified to automatically determine conditions necessary for safe use. EGRET reports results using specific standard phrases in a format consistent with REACH exposure scenario guidance, in order that the outputs can be readily assimilated within safety data sheets and similar information technology systems. Evaluation of tool predictions for a range of commonly encountered consumer uses of solvents found it provides reasonable yet still conservative exposure estimates.
Semi-supervised classification tool for DubaiSat-2 multispectral imagery
NASA Astrophysics Data System (ADS)
Al-Mansoori, Saeed
2015-10-01
This paper addresses a semi-supervised classification tool based on a pixel-based approach for multispectral satellite imagery. There are not many studies demonstrating such an algorithm for multispectral images, especially when the image consists of four bands (red, green, blue and near-infrared), as in DubaiSat-2 satellite images. The proposed approach utilizes unsupervised and supervised classification schemes sequentially to identify four classes in the image, namely water bodies, vegetation, land (developed and undeveloped areas) and paved areas (i.e., roads). The unsupervised classification concept is applied to identify two classes, water bodies and vegetation, based on a well-known index that uses the distinct wavelengths of visible and near-infrared sunlight absorbed and reflected by plants; this index is the Normalized Difference Vegetation Index (NDVI). Afterward, supervised classification is performed by selecting homogeneous training samples for roads and land areas. Here, precise selection of training samples plays a vital role in the classification accuracy. Post-classification processing is finally performed to enhance the classification accuracy, where the classified image is sieved, clumped and filtered before producing the final output. Overall, the supervised classification approach produced higher accuracy than the unsupervised method. This paper shows current preliminary research results, which point to the effectiveness of the proposed technique.
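The NDVI step described above is a one-liner per pixel: NDVI = (NIR - Red) / (NIR + Red), followed by thresholding. The sketch below applies it to band arrays with numpy; the threshold values are illustrative assumptions (negative NDVI is typical of water, values above roughly 0.3 of vegetation), not the tool's tuned parameters.

    # Hedged sketch: NDVI computation and simple thresholding into classes.
    import numpy as np

    def ndvi_classify(red, nir, water_max=0.0, veg_min=0.3):
        ndvi = (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero
        classes = np.full(ndvi.shape, "other", dtype=object)
        classes[ndvi < water_max] = "water"
        classes[ndvi > veg_min] = "vegetation"
        return ndvi, classes

    red = np.array([[0.30, 0.08], [0.10, 0.25]])
    nir = np.array([[0.28, 0.04], [0.45, 0.30]])
    print(ndvi_classify(red, nir)[1])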
Between-User Reliability of Tier 1 Exposure Assessment Tools Used Under REACH.
Lamb, Judith; Galea, Karen S; Miller, Brian G; Hesse, Susanne; Van Tongeren, Martie
2017-10-01
When applying simple screening (Tier 1) tools to estimate exposure to chemicals in a given exposure situation under the Registration, Evaluation, Authorisation and restriction of CHemicals Regulation 2006 (REACH), users must select from several possible input parameters. Previous studies have suggested that results from exposure assessments using expert judgement and from the use of modelling tools can vary considerably between assessors. This study aimed to investigate the between-user reliability of Tier 1 tools. A remote-completion exercise and an in-person workshop were used to identify and evaluate tool parameters and factors, such as user demographics, potentially associated with between-user variability. Participants (N = 146) generated dermal and inhalation exposure estimates (N = 4066) from specified workplace descriptions ('exposure situations') and Tier 1 tool combinations (N = 20). Interactions between users, tools, and situations were investigated and described. Systematic variation associated with individual users was minor compared with random between-user variation. Although variation was observed between choices made for the majority of input parameters, differing choices of Process Category ('PROC') code/activity descriptor and dustiness level had the greatest impact on the resultant exposure estimates. Exposure estimates ranging over several orders of magnitude were generated for the same exposure situation by different tool users. Such unpredictable between-user variation will reduce consistency within REACH processes and could result in under-estimation or overestimation of exposure, risking worker ill-health or the implementation of unnecessary risk controls, respectively. Implementation of additional support and quality control systems for all tool users is needed to reduce between-assessor variation and so ensure both the protection of worker health and the avoidance of unnecessary business risk management expenditure.
Forming Tool Use Representations: A Neurophysiological Investigation into Tool Exposure
ERIC Educational Resources Information Center
Mizelle, John Christopher; Tang, Teresa; Pirouz, Nikta; Wheaton, Lewis A.
2011-01-01
Prior work has identified a common left parietofrontal network for storage of tool-related information for various tasks. How these representations become established within this network on the basis of different modes of exposure is unclear. Here, healthy subjects engaged in physical practice (direct exposure) with familiar and unfamiliar tools.…
Thompson, Bryony A.; Greenblatt, Marc S.; Vallee, Maxime P.; Herkert, Johanna C.; Tessereau, Chloe; Young, Erin L.; Adzhubey, Ivan A.; Li, Biao; Bell, Russell; Feng, Bingjian; Mooney, Sean D.; Radivojac, Predrag; Sunyaev, Shamil R.; Frebourg, Thierry; Hofstra, Robert M.W.; Sijmons, Rolf H.; Boucher, Ken; Thomas, Alun; Goldgar, David E.; Spurdle, Amanda B.; Tavtigian, Sean V.
2015-01-01
Classification of rare missense substitutions observed during genetic testing for patient management is a considerable problem in clinical genetics. The Bayesian integrated evaluation of unclassified variants is a solution originally developed for BRCA1/2. Here, we take a step toward an analogous system for the mismatch repair (MMR) genes (MLH1, MSH2, MSH6, and PMS2) that confer colon cancer susceptibility in Lynch syndrome by calibrating in silico tools to estimate prior probabilities of pathogenicity for MMR gene missense substitutions. A qualitative five-class classification system was developed and applied to 143 MMR missense variants. This identified 74 missense substitutions suitable for calibration. These substitutions were scored using six different in silico tools (Align-Grantham Variation Grantham Deviation, multivariate analysis of protein polymorphisms [MAPP], MutPred, PolyPhen-2.1, Sorting Intolerant From Tolerant, and Xvar), using curated MMR multiple sequence alignments where possible. The output from each tool was calibrated by regression against the classifications of the 74 missense substitutions; these calibrated outputs are interpretable as prior probabilities of pathogenicity. MAPP was the most accurate tool, and MAPP + PolyPhen-2.1 provided the best combined model (R2 = 0.62 and area under receiver operating characteristic = 0.93). The MAPP + PolyPhen-2.1 output is sufficiently predictive to feed as a continuous variable into the quantitative Bayesian integrated evaluation for clinical classification of MMR gene missense substitutions.
Neumann, H G; Thielmann, H W; Filser, J G; Gelbke, H P; Greim, H; Kappus, H; Norpoth, K H; Reuter, U; Vamvakas, S; Wardenbach, P; Wichmann, H E
1998-01-01
Carcinogenic chemicals in the work area were previously classified into three categories in section III of the German List of MAK and BAT values (the list of values on maximum workplace concentrations and biological tolerance for occupational exposures). This classification was based on qualitative criteria and reflected essentially the weight of evidence available for judging the carcinogenic potential of the chemicals. In the new classification scheme the former sections IIIA1, IIIA2, and IIIB are retained as categories 1, 2, and 3, to correspond with European Union regulations. On the basis of our advancing knowledge of reaction mechanisms and the potency of carcinogens, these three categories are supplemented with two additional categories. The essential feature of substances classified in the new categories is that exposure to these chemicals does not contribute significantly to the risk of cancer to man, provided that an appropriate exposure limit (MAK value) is observed. Chemicals known to act typically by non-genotoxic mechanisms, and for which information is available that allows evaluation of the effects of low-dose exposures, are classified in category 4. Genotoxic chemicals for which low carcinogenic potency can be expected on the basis of dose/response relationships and toxicokinetics and for which risk at low doses can be assessed are classified in category 5. The basis for a better differentiation of carcinogens is discussed, the new categories are defined, and possible criteria for classification are described. Examples for category 4 (1,4-dioxane) and category 5 (styrene) are presented.
NASA Astrophysics Data System (ADS)
Barton, Sinead J.; Kerr, Laura T.; Domijan, Katarina; Hennelly, Bryan M.
2016-04-01
Raman micro-spectroscopy is an optoelectronic technique that can be used to evaluate the chemical composition of biological samples and has been shown to be a powerful diagnostic tool for the investigation of various cancer-related diseases including bladder, breast, and cervical cancer. Raman scattering is an inherently weak process, with approximately 1 in 10^7 photons undergoing scattering, and for this reason noise from the recording system can have a significant impact on the quality of the signal and its suitability for diagnostic classification. The main sources of noise in the recorded signal are shot noise, CCD dark current, and CCD readout noise. Shot noise results from the low signal photon count, while dark current results from thermally generated electrons in the semiconductor pixels. Both of these noise sources are time dependent; readout noise is time independent but is inherent in each individual recording and results in the fundamental limit of measurement, arising from the internal electronics of the camera. In this paper, each of the aforementioned noise sources is analysed in isolation and used to experimentally validate a mathematical model. This model is then used to simulate spectra that might be acquired under various experimental conditions, including the use of different cameras, source wavelengths, powers, etc. Simulated noisy datasets of T24 and RT112 cell line spectra are generated based on true cell Raman spectrum irradiance values (recorded using very long exposure times) and the addition of simulated noise. These datasets are then input to multivariate classification using Principal Components Analysis and Linear Discriminant Analysis. This method enables an investigation into the effect of noise on the sensitivity and specificity of Raman-based classification under various experimental conditions and using different equipment.
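A noise model of the kind described is compact to write down: for a given exposure time, signal and dark-current electrons follow Poisson statistics (time-dependent), and readout adds zero-mean Gaussian noise (time-independent). The sketch below corrupts a toy clean spectrum accordingly; all rates and the synthetic peak are illustrative placeholders, not the paper's measured camera parameters.

    # Hedged sketch: shot noise + dark current + readout noise on a toy spectrum.
    import numpy as np

    rng = np.random.default_rng(42)

    def simulate_noisy_spectrum(clean_rate, t_exp, dark_rate=0.05, read_sigma=6.0):
        """clean_rate: photo-electrons/s per pixel; t_exp: exposure time (s)."""
        expected = (clean_rate + dark_rate) * t_exp        # mean electron count
        shot_and_dark = rng.poisson(expected)              # time-dependent noise
        readout = rng.normal(0.0, read_sigma, clean_rate.shape)  # per readout
        return shot_and_dark + readout

    wavenumbers = np.linspace(400, 1800, 1024)
    clean = 5.0 + 40.0 * np.exp(-((wavenumbers - 1004.0) / 8.0) ** 2)  # toy peak
    noisy = simulate_noisy_spectrum(clean, t_exp=1.0)
    print(noisy.shape)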
Evaluation of thyroid eye disease: quality-of-life questionnaire (TED-QOL) in Korean patients.
Son, Byeong Jae; Lee, Sang Yeul; Yoon, Jin Sook
2014-04-01
To assess impaired quality of life (QOL) in Korean patients with thyroid eye disease (TED) using the TED-QOL questionnaire, to evaluate the adaptability of the questionnaire, and to assess the correlation between TED-QOL and scales of disease severity. Prospective, cross-sectional study. A total of 90 consecutive adult patients with TED and Graves' disease were included in this study. TED-QOL was translated into Korean and administered to the patients. The results were compared with clinical severity scores (clinical activity score, VISA (vision loss (optic neuropathy); inflammation; strabismus/motility; appearance/exposure) classification, modified NOSPECS (no signs or symptoms; only signs; soft tissue; proptosis; extraocular muscle; cornea; sight loss) score, Gorman diplopia scale, and the European Group on Graves' Orbitopathy classification). Clinical scores indicating inflammation and strabismus in patients with TED were positively correlated with overall and visual-function-related QOL (Spearman coefficient 0.21-0.38, p < 0.05). Clinical scores associated with appearance were positively correlated with appearance-related QOL (Spearman coefficient 0.26-0.27, p < 0.05). In multivariate analysis, age, soft-tissue inflammation, and the motility disorder components of the modified NOSPECS and VISA classifications had a positive correlation with overall and function-related QOL. Sex, soft-tissue inflammation, the proptosis component of the modified NOSPECS, and the appearance component of the VISA classification correlated with appearance-related QOL. In addition, the validity of TED-QOL was shown to be sufficient based on the outcomes of patient interviews and the correlation between the subscales of TED-QOL. TED-QOL showed significant correlations with various objective clinical parameters of TED. TED-QOL was a simple and useful tool for rapid evaluation of QOL in daily outpatient clinics, and could be readily translated into different languages to be widely applicable to various populations.
An important challenge for an integrative approach to developmental systems toxicology is associating putative molecular initiating events (MIEs), cell signaling pathways, cell function and modeled fetal exposure kinetics. We have developed a chemical classification model based o...
Bào, Yīmíng; Kuhn, Jens H
2018-01-01
During the last decade, genome sequence-based classification of viruses has become increasingly prominent. Viruses can even be classified based on coding-complete genome sequence data alone. Nevertheless, classification remains arduous, as experts are required to establish phylogenetic trees depicting the evolutionary relationships of such sequences for preliminary taxonomic placement. Pairwise sequence comparison (PASC) of genomes is one of several novel methods for establishing relationships among viruses. This method, provided by the US National Center for Biotechnology Information as an open-access tool, circumvents phylogenetics, and yet PASC results are often in agreement with those of phylogenetic analyses. Computationally inexpensive, PASC can be easily performed by non-taxonomists. Here we describe how to use the PASC tool for the preliminary classification of novel viral hemorrhagic fever-causing viruses.
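The quantity underlying a PASC-style comparison is pairwise genome identity. The sketch below computes percent identity over aligned positions for two toy, pre-aligned sequences; the real PASC tool aligns complete genomes first and compares each new sequence against whole taxa, which this simplification omits.

    # Hedged sketch: percent identity between two pre-aligned sequences
    # ('-' marks alignment gaps). Identity definitions vary between tools.
    def percent_identity(a, b):
        matches = sum(x == y and x != "-" for x, y in zip(a, b))
        aligned = sum(x != "-" and y != "-" for x, y in zip(a, b))
        return 100.0 * matches / aligned

    seq1 = "ATGC-GATTACA"
    seq2 = "ATGCCGATT-CG"
    print(percent_identity(seq1, seq2))  # -> 90.0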
Image-based deep learning for classification of noise transients in gravitational wave detectors
NASA Astrophysics Data System (ADS)
Razzano, Massimiliano; Cuoco, Elena
2018-05-01
The detection of gravitational waves has inaugurated the era of gravitational astronomy and opened new avenues for the multimessenger study of cosmic sources. Thanks to their sensitivity, the Advanced LIGO and Advanced Virgo interferometers will probe a much larger volume of space and expand the capability of discovering new gravitational wave emitters. The characterization of these detectors is a primary task in order to recognize the main sources of noise and optimize the sensitivity of interferometers. Glitches are transient noise events that can impact the data quality of the interferometers and their classification is an important task for detector characterization. Deep learning techniques are a promising tool for the recognition and classification of glitches. We present a classification pipeline that exploits convolutional neural networks to classify glitches starting from their time-frequency evolution represented as images. We evaluated the classification accuracy on simulated glitches, showing that the proposed algorithm can automatically classify glitches on very fast timescales and with high accuracy, thus providing a promising tool for online detector characterization.
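A minimal version of the image-based classifier described can be written in a few lines of PyTorch: two convolution/pooling stages over a time-frequency image followed by a linear layer over glitch classes. The input size, channel counts, and number of classes below are illustrative assumptions, not the pipeline's actual architecture.

    # Hedged sketch: a small CNN over time-frequency images of glitches.
    import torch
    import torch.nn as nn

    class GlitchCNN(nn.Module):
        def __init__(self, n_classes=6):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 16 * 16, n_classes)

        def forward(self, x):                  # x: (batch, 1, 64, 64)
            return self.classifier(self.features(x).flatten(1))

    logits = GlitchCNN()(torch.randn(4, 1, 64, 64))
    print(logits.shape)                        # torch.Size([4, 6])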
Designing a training tool for imaging mental models
NASA Technical Reports Server (NTRS)
Dede, Christopher J.; Jayaram, Geetha
1990-01-01
The training process can be conceptualized as the student acquiring an evolutionary sequence of classification-problem-solving mental models. For example, a physician learns (1) classification systems for patient symptoms, diagnostic procedures, diseases, and therapeutic interventions and (2) interrelationships among these classifications (e.g., how to use diagnostic procedures to collect data about a patient's symptoms in order to identify the disease so that therapeutic measures can be taken). This project developed functional specifications for a computer-based tool, Mental Link, that allows the evaluative imaging of such mental models. The fundamental design approach underlying this representational medium is traversal of virtual cognition space. Typically intangible cognitive entities and the links among them become visible as a three-dimensional web that represents a knowledge structure. The tool has a high degree of flexibility and customizability to allow extension to other types of uses, such as a front-end to an intelligent tutoring system, knowledge base, hypermedia system, or semantic network.
EPA EcoBox Tools by Exposure Pathways - Exposure Pathways In ERA
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Choi, Sangjun; Kang, Dongmug; Park, Donguk; Lee, Hyunhee; Choi, Bongkyoo
2017-03-01
The goal of this study is to develop a general population job-exposure matrix (GPJEM) for asbestos to estimate occupational asbestos exposure levels in the Republic of Korea. Three Korean domestic quantitative exposure datasets collected from 1984 to 2008 were used to build the GPJEM. Exposure groups in the collected data were reclassified based on the current Korean Standard Industrial Classification (9th edition) and the Korean Standard Classification of Occupations (6th edition), which are in accordance with international standards. All exposure levels were expressed as weighted arithmetic means (WAMs) with minimum and maximum concentrations. Based on the established GPJEM, the 112 exposure groups could be reclassified into 86 industries and 74 occupations. In the 1980s, the highest exposure levels were estimated in "knitting and weaving machine operators", with a WAM concentration of 7.48 fibers/mL (f/mL); in the 1990s, "plastic products production machine operators", with 5.12 f/mL; and in the 2000s, "detergents production machine operators" handling talc containing asbestos, with 2.45 f/mL. Of the 112 exposure groups, 44 had higher WAM concentrations than the Korean occupational exposure limit of 0.1 f/mL. The newly constructed GPJEM, generated from actual domestic quantitative exposure data, could be useful in evaluating historical exposure levels to asbestos and could contribute to improved prediction of asbestos-related diseases among Koreans.
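A job-exposure matrix cell of the kind described reduces to a weighted arithmetic mean per reclassified group. The sketch below computes WAMs with pandas, weighting each survey record by its sample size; the occupation codes, concentrations, and weights are illustrative placeholders, not the Korean datasets.

    # Hedged sketch: weighted arithmetic mean (WAM) per exposure group.
    import pandas as pd

    df = pd.DataFrame({
        "occupation": ["weaver", "weaver", "plastics", "plastics"],
        "fibers_per_ml": [7.9, 6.8, 5.5, 4.6],
        "n_samples": [12, 30, 8, 20],
    })

    def wam(group):
        return ((group["fibers_per_ml"] * group["n_samples"]).sum()
                / group["n_samples"].sum())

    print(df.groupby("occupation").apply(wam))  # one WAM per exposure group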
Arnone, Mario; Koppisch, Dorothea; Smola, Thomas; Gabriel, Stefan; Verbist, Koen; Visser, Remco
2015-10-01
Many control banding tools use hazard banding in risk assessments for the occupational handling of hazardous substances. The outcome of these assessments can be combined with advice on the required risk management measures (RMMs). The Globally Harmonised System of Classification and Labelling of Chemicals (GHS) has resulted in a change in the hazard communication elements, i.e., Hazard (H) statements instead of Risk (R) phrases. Hazard banding schemes that depend on the old form of safety information have to be adapted to the new rules. The purpose of this publication is to outline the rationales for the assignment of hazard bands to H statements under the GHS. Based on this, this publication proposes a hazard banding scheme that uses the information from safety data sheets as the basis for assignment. The assignment of hazard bands tiered according to the severity of the underlying hazards supports the important principle of substitution. Additionally, the set of assignment rules permits an exposure-route-specific assignment of hazard bands, which is necessary for the proposed route-specific RMMs. Ideally, all control banding tools should apply the same assignment rules. This GHS-compliant hazard banding scheme can hopefully help to establish a unified hazard banding strategy in the various control banding tools.
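Operationally, such a scheme is a lookup from a substance's H statements to the most severe applicable band. The sketch below shows the pattern with a handful of real GHS H statement codes; the band letters and their assignments are illustrative placeholders, not the scheme proposed in the paper.

    # Hedged sketch: map H statements to hazard bands and take the most severe.
    # Band assignments here are assumed for illustration only.
    HAZARD_BANDS = {
        "H315": "A",  # causes skin irritation (assumed lowest band)
        "H319": "A",  # causes serious eye irritation (assumed)
        "H331": "C",  # toxic if inhaled (assumed)
        "H350": "E",  # may cause cancer (assumed highest band)
    }

    def hazard_band(h_statements, default="A"):
        """Most severe band among a substance's H statements (A < ... < E)."""
        return max((HAZARD_BANDS.get(h, default) for h in h_statements),
                   default=default)

    print(hazard_band(["H315", "H331"]))  # -> "C"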
Hyland, Philip; Murphy, Jamie; Shevlin, Mark; Vallières, Frédérique; McElroy, Eoin; Elklit, Ask; Christoffersen, Mogens; Cloitre, Marylène
2017-06-01
The World Health Organization's 11th revision of the International Classification of Diseases (ICD-11) will differentiate between two stress-related disorders: PTSD and Complex PTSD (CPTSD). ICD-11 proposals suggest that trauma exposure that is prolonged and/or repeated, or consists of multiple forms, and that occurs under circumstances in which escape from the trauma is difficult or impossible (e.g., childhood abuse), will confer greater risk for CPTSD as compared with PTSD. The primary objective of the current study was to provide an empirical assessment of this proposal. A stratified, random probability sample of a Danish birth cohort (aged 24) was interviewed by the Danish National Centre for Social Research (N = 2980) in 2008-2009. Data from this interview were used to generate an ICD-11 symptom-based classification of PTSD and CPTSD. The majority of the sample (87.1%) had experienced at least one of eight traumatic events spanning childhood and early adulthood. There was some indication that being female increased the risk for both PTSD and CPTSD classification. Multinomial logistic regression results found that childhood sexual abuse (OR = 4.98) and unemployment status (OR = 4.20) significantly increased the risk of CPTSD classification as compared with PTSD. A dose-response relationship was observed between exposure to multiple forms of childhood interpersonal trauma and risk of CPTSD classification, as compared with PTSD. The results provide empirical support for the ICD-11 proposal that childhood interpersonal trauma exposure increases the risk of CPTSD symptom development.
Occupational Noise Exposure of Employees at Locally-Owned Restaurants in a College Town
Green, Deirdre R.; Anthony, T. Renée
2016-01-01
While many restaurant employees work in loud environments, in both dining and food preparation areas, little is known about worker exposures to noise. The risk of hearing loss to millions of food service workers around the country is unknown. This study evaluated full-shift noise exposure of workers at six locally-owned restaurants to examine risk factors associated with noise exposures during the day shift. Participants included cooks, counter attendants, bartenders, and waiters at full-service restaurants with bar service and at limited-service restaurants that provided counter service only. Assessments were made on weekdays and weekends, both during the summer and the fall (with a local university in session), to examine whether the time of week or year affects noise exposures to this population in a college town. In addition, the relationships between noise exposures and the type of restaurant and job classification were assessed. One hundred eighty full-shift time-weighted average (TWA) exposures were assessed, using both Occupational Safety and Health Administration (OSHA) and National Institute for Occupational Safety and Health (NIOSH) criteria. No TWA measurement exceeded the 90 dBA OSHA 8-hr permissible exposure limit, although six projected TWAs exceeded the 85 dBA OSHA hearing conservation action limit. Using NIOSH criteria, TWAs ranged from 69–90 dBA with a mean of 80 dBA (SD = 4 dBA). Nearly 8% (14) of the exposures exceeded the NIOSH 8-hr 85 dBA recommended exposure limit. Full-shift exposures were higher for all workers in full-service restaurants (p < 0.001) and for cooks (p = 0.003), regardless of restaurant type. Fall semester (p = 0.003) and weekend (p = 0.048) exposures were louder than summer and weekday exposures. Multiple linear regression analysis suggested that the combination of restaurant type, job classification, and season had a significant effect on restaurant worker noise exposures (p < 0.001) in this college town. Although evening/night shift exposures, where noise may be anticipated to be louder, were not assessed, this study identified that restaurant type, job classification, time of week, and season significantly affected the noise exposures of day-shift workers. Intervention studies to prevent noise-induced hearing loss (NIHL) should consider these variables. PMID:25738733
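The OSHA and NIOSH criteria referenced above differ in criterion level (90 vs. 85 dBA) and exchange rate (5 vs. 3 dB). A minimal Python sketch of the standard noise dose and 8-hr TWA calculations follows; the shift profile is a hypothetical example, not study data.

```python
import math

def allowed_hours(level_dba, criterion_dba, exchange_rate_db):
    """Permissible duration (hours) at a given sound level."""
    return 8.0 / (2.0 ** ((level_dba - criterion_dba) / exchange_rate_db))

def dose_and_twa(samples, criterion_dba, exchange_rate_db, twa_coeff):
    """samples: list of (level_dBA, duration_hours).
    Returns (dose in percent, 8-hr TWA in dBA)."""
    dose = 100.0 * sum(t / allowed_hours(l, criterion_dba, exchange_rate_db)
                       for l, t in samples)
    twa = twa_coeff * math.log10(dose / 100.0) + criterion_dba
    return dose, twa

shift = [(82.0, 4.0), (88.0, 2.0), (75.0, 2.0)]  # hypothetical day shift
print(dose_and_twa(shift, 90.0, 5.0, 16.61))     # OSHA PEL criteria
print(dose_and_twa(shift, 85.0, 3.0, 10.0))      # NIOSH REL criteria
```

Because NIOSH uses a lower criterion and a 3-dB exchange rate, the same shift yields a higher dose under NIOSH than under OSHA, which is consistent with the pattern of exceedances reported above.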
This page provides access to a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
A decision-support tool for the control of urban noise pollution.
Suriano, Marcia Thais; de Souza, Léa Cristina Lucas; da Silva, Antonio Nelson Rodrigues
2015-07-01
Improving the quality of life is increasingly seen as an important urban planning goal. In order to reach it, various tools are being developed to mitigate the negative impacts of human activities on society. This paper develops a methodology for quantifying the population's exposure to noise by proposing a classification of urban blocks. Taking into account the vehicular flow and traffic composition in the surroundings of the urban blocks, we generated a noise map by applying a computational simulation. The urban blocks were classified according to their noise range, and the population of each urban block was then estimated by a process based on the census tract and the constructed area of the blocks. The acoustical classes of urban blocks and the number of inhabitants per block were compared, so that the population exposed to noise levels above 65 dB(A), the highest limit established by legislation, could be estimated. As a result, we developed a map of the study area with which urban blocks that should be priority targets for noise mitigation actions can be quickly identified.
Poisoning by Herbs and Plants: Rapid Toxidromic Classification and Diagnosis.
Diaz, James H
2016-03-01
The American Association of Poison Control Centers has continued to report approximately 50,000 telephone calls or 8% of incoming calls annually related to plant exposures, mostly in children. Although the frequency of plant ingestions in children is related to the presence of popular species in households, adolescents may experiment with hallucinogenic plants; and trekkers and foragers may misidentify poisonous plants as edible. Since plant exposures have continued at a constant rate, the objectives of this review were (1) to review the epidemiology of plant poisonings; and (2) to propose a rapid toxidromic classification system for highly toxic plant ingestions for field use by first responders in comparison to current classification systems. Internet search engines were queried to identify and select peer-reviewed articles on plant poisonings using the key words in order to classify plant poisonings into four specific toxidromes: cardiotoxic, neurotoxic, cytotoxic, and gastrointestinal-hepatotoxic. A simple toxidromic classification system of plant poisonings may permit rapid diagnoses of highly toxic versus less toxic and nontoxic plant ingestions both in households and outdoors; direct earlier management of potentially serious poisonings; and reduce costly inpatient evaluations for inconsequential plant ingestions. The current textbook classification schemes for plant poisonings were complex in comparison to the rapid classification system; and were based on chemical nomenclatures and pharmacological effects, and not on clearly presenting toxidromes. Validation of the rapid toxidromic classification system as compared to existing chemical classification systems for plant poisonings will require future adoption and implementation of the toxidromic system by its intended users. Copyright © 2016 Wilderness Medical Society. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Sanhouse-García, Antonio J.; Rangel-Peraza, Jesús Gabriel; Bustos-Terrones, Yaneth; García-Ferrer, Alfonso; Mesas-Carrascosa, Francisco J.
2016-02-01
Land cover classification is often based on different characteristics between classes but great homogeneity within each of them. This cover information is obtained through field work or by means of processing satellite images. Field work involves high costs; therefore, digital image processing techniques have become an important alternative for performing this task. However, in some developing countries, and particularly in the Casacoima municipality in Venezuela, there is a lack of geographic information systems due to the lack of updated information and the high costs of software license acquisition. This research proposes a low-cost methodology to develop thematic mapping of local land use and types of coverage in areas with scarce resources. Thematic mapping was developed from CBERS-2 images and spatial information available on the network using open source tools. Supervised classification was applied on both a per-pixel and a per-region basis, using different classification algorithms and comparing them with one another. Per-pixel classification was based on the Maxver (maximum likelihood) and Euclidean distance (minimum distance) algorithms, while per-region classification was based on the Bhattacharya algorithm. Satisfactory results were obtained from per-region classification, with an overall reliability of 83.93% and a kappa index of 0.81. The Maxver algorithm showed a reliability value of 73.36% and a kappa index of 0.69, while Euclidean distance obtained values of 67.17% and 0.61 for reliability and kappa index, respectively. The proposed methodology proved very useful for cartographic processing and updating, which in turn supports the development of management and land-use plans. Hence, open source tools showed themselves to be an economically viable alternative not only for forestry organizations but for the general public, allowing them to develop projects in economically depressed and/or environmentally threatened areas.
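As a worked illustration of the minimum-distance idea used in the per-pixel approach, here is a short Python sketch that assigns each pixel to the class whose mean spectral signature is nearest in Euclidean distance; the band means and pixel values are hypothetical.

```python
import numpy as np

def minimum_distance_classify(pixels, class_means):
    """Assign each pixel (n_pixels x n_bands) to the class whose mean
    spectral signature is nearest in Euclidean distance."""
    # Pairwise distances: shape (n_pixels, n_classes)
    d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return np.argmin(d, axis=1)

# Hypothetical 3-band training means for water, forest, and urban classes
means = np.array([[30.0, 40.0, 20.0],
                  [45.0, 80.0, 35.0],
                  [90.0, 85.0, 88.0]])
pixels = np.array([[32.0, 42.0, 22.0],
                   [88.0, 82.0, 90.0]])
print(minimum_distance_classify(pixels, means))  # -> [0 2]
```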
A Visual mining based framework for classification accuracy estimation
NASA Astrophysics Data System (ADS)
Arun, Pattathal Vijayakumar
2013-12-01
Classification techniques have been widely used in different remote sensing applications, and correct classification of mixed pixels is a tedious task. Traditional approaches adopt various statistical parameters but do not facilitate effective visualisation. Data mining tools are proving very helpful in the classification process. We propose a visual mining based framework for accuracy assessment of classification techniques using open source tools such as WEKA and PREFUSE. In integration, these tools provide an efficient approach for obtaining information about improvements in classification accuracy and help in refining the training data set. We have illustrated the framework by investigating the effects of various resampling methods on classification accuracy and found that bilinear (BL) resampling is best suited for preserving radiometric characteristics. We have also investigated the optimal number of folds required for effective analysis of LISS-IV images.
BRCA1/2 missense mutations and the value of in-silico analyses.
Sadowski, Carolin E; Kohlstedt, Daniela; Meisel, Cornelia; Keller, Katja; Becker, Kerstin; Mackenroth, Luisa; Rump, Andreas; Schröck, Evelin; Wimberger, Pauline; Kast, Karin
2017-11-01
The clinical implications of genetic variants in BRCA1/2 in healthy and affected individuals are considerable. Variant interpretation, however, is especially challenging for missense variants, the majority of which are classified as variants of unknown clinical significance (VUS). Computational (in-silico) predictive programs are easy to access but represent only one tool out of a wide range of complementary approaches to classify VUS. With this single-center study, we aimed to evaluate the impact of in-silico analyses on a spectrum of different BRCA1/2 missense variants. We conducted mutation analysis of BRCA1/2 in 523 index patients with suspected hereditary breast and ovarian cancer (HBOC). Classification of the genetic variants was performed according to the German Consortium (GC)-HBOC database. Additionally, all missense variants were classified by the following three in-silico prediction tools: SIFT, Mutation Taster (MT2), and PolyPhen2 (PPH2). Overall, 201 different variants, 68 of which were missense variants, were ranked as pathogenic, neutral, or unknown. The classification of missense variants by in-silico tools resulted in a higher proportion of pathogenic mutations (25% vs. 13.2%) compared with the GC-HBOC classification. Altogether, more than fifty percent (38/68, 55.9%) of missense variants were ranked differently. Sensitivity of the in-silico tools for mutation prediction was 88.9% (PPH2), 100% (SIFT), and 100% (MT2). We found a relevant discrepancy in variant classification using in-silico prediction tools, resulting in potential overestimation and/or underestimation of cancer risk. More reliable, notably gene-specific, prediction tools and functional tests are needed to improve clinical counseling. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
A study of the mortality of Cornish tin miners.
Fox, A J; Goldblatt, P; Kinlen, L J
1981-01-01
Increased mortality from cancer of the lung has been found in several studies of miners exposed to high levels of radioactivity in underground air. In view of their exposure to raised levels of radiation, we have studied the mortality of a group of men recorded as Cornish tin miners in 1939. Using occupational description, a crude classification of exposure was derived for these miners. The meaningfulness of this classification was supported by differences in mortality from silicosis and silicotuberculosis. A twofold excess of cancer of the lung was found for underground miners, while for other categories mortality from this cause was less than expected. This supports the findings of previous studies on exposure to radon and its daughters. An excess of cancer of the stomach was also observed among underground miners. PMID:7317301
78 FR 33744 - Sedaxane; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-05
.... The following list of North American Industrial Classification System (NAICS) codes is not intended to... the data supporting the petition, EPA has corrected commodity definitions and recommended additional... exposure through drinking water and in residential settings, but does not include occupational exposure...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-11
.../Exposure Analysis Modeling System and Screening Concentration in Ground Water (SCI-GROW) models, the... Classification System (NAICS) codes have been provided to assist you and others in determining whether this... reliable information.'' This includes exposure through drinking water and in residential settings, but does...
75 FR 40745 - Cyazofamid; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-14
... Model/Exposure Analysis Modeling System (PRZM/EXAMS) model for surface water and the Screening... listed in this unit could also be affected. The North American Industrial Classification System (NAICS... there is reliable information.'' This includes exposure through drinking water and in residential...
HIV classification using the coalescent theory
Bulla, Ingo; Schultz, Anne-Kathrin; Schreiber, Fabian; Zhang, Ming; Leitner, Thomas; Korber, Bette; Morgenstern, Burkhard; Stanke, Mario
2010-01-01
Motivation: Existing coalescent models and phylogenetic tools based on them are not designed for studying the genealogy of sequences like those of HIV, since HIV recombinants with multiple cross-over points between the parental strains frequently arise. Hence, ambiguous cases in the classification of HIV sequences into subtypes and circulating recombinant forms (CRFs) have been treated with ad hoc methods, for lack of tools based on a comprehensive coalescent model accounting for complex recombination patterns. Results: We developed the program ARGUS, which scores classifications of sequences into subtypes and recombinant forms. It reconstructs ancestral recombination graphs (ARGs) that reflect the genealogy of the input sequences given a classification hypothesis. An ARG with maximal probability is approximated using a Markov chain Monte Carlo approach. ARGUS was able to distinguish the correct classification from plausible alternative classifications with a low error rate in simulation studies with realistic parameters. We applied our algorithm to decide between two recently debated alternatives in the classification of CRF02 of HIV-1 and find that CRF02 is indeed a recombinant of subtypes A and G. Availability: ARGUS is implemented in C++ and the source code is available at http://gobics.de/software Contact: ibulla@uni-goettingen.de Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:20400454
Robust tissue classification for reproducible wound assessment in telemedicine environments
NASA Astrophysics Data System (ADS)
Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves
2010-04-01
In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple free-handled digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labeling, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under lighting condition, viewpoint, and camera changes, to achieve accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% against 69.1% for a single expert, after mapping onto the medical reference developed from image labeling by a college of experts.
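A minimal sketch of segmentation-driven SVM classification in Python (scikit-learn), assuming color features have already been extracted per segmented region; the feature values, class labels, and parameters below are hypothetical stand-ins, not the paper's data or model.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical mean color-corrected RGB per segmented wound region
X = np.array([[180, 60, 55], [190, 70, 60], [230, 200, 180],
              [70, 50, 40], [220, 195, 175], [75, 55, 45]], dtype=float)
y = ["granulation", "granulation", "fibrin",
     "necrosis", "fibrin", "necrosis"]

# Standardize features, then fit an RBF-kernel SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
print(clf.predict([[185.0, 65.0, 58.0]]))  # -> ['granulation']
```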
Exposure Assessment Tools by Media - Air
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Routes - Inhalation
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Chemical Classes
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Routes
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Media - Food
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Media
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Routes - Ingestion
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Approaches
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Routes - Dermal
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Exposure Pathways
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
1996-10-01
Frank et al. (1993) compared DDE and PCB residues in the general diet with blood levels of Ontario residents. Blood samples were obtained from... sources of PCBs and HCB in this geographical region. In a similar study, Kashyap et al. (1994) monitored DDT levels in duplicate diet samples and...
NASA Astrophysics Data System (ADS)
Meliker, Jaymie R.; Slotnick, Melissa J.; Avruskin, Gillian A.; Kaufmann, Andrew; Jacquez, Geoffrey M.; Nriagu, Jerome O.
2005-05-01
A thorough assessment of human exposure to environmental agents should incorporate mobility patterns and temporal changes in human behaviors and concentrations of contaminants; yet the temporal dimension is often under-emphasized in exposure assessment endeavors, due in part to insufficient tools for visualizing and examining temporal datasets. Spatio-temporal visualization tools are valuable for integrating a temporal component, thus allowing for examination of continuous exposure histories in environmental epidemiologic investigations. An application of these tools to a bladder cancer case-control study in Michigan illustrates continuous exposure life-lines and maps that display smooth, continuous changes over time. Preliminary results suggest increased risk of bladder cancer from combined exposure to arsenic in drinking water (>25 μg/day) and heavy smoking (>30 cigarettes/day) in the 1970s and 1980s, and a possible cancer cluster around automotive, paint, and organic chemical industries in the early 1970s. These tools have broad application for examining spatially- and temporally-specific relationships between exposures to environmental risk factors and disease.
Koua, Dominique; Kuhn-Nentwig, Lucia
2017-01-01
Spider venoms are rich cocktails of bioactive peptides, proteins, and enzymes that have been intensively investigated over the years. In order to provide a better comprehension of that richness, we propose a three-level family classification system for spider venom components. This classification is supported by an exhaustive set of 219 new profile hidden Markov models (HMMs) able to attribute a given peptide to its precise peptide type, family, and group. The proposed classification has the advantage of being totally independent of variable spider taxonomic names and can easily evolve. In addition to the new classifiers, we introduce and demonstrate the efficiency of hmmcompete, a new standalone tool that monitors HMM-based family classification and, after post-processing the results, reports the best classifier when multiple models produce significant scores for given peptide queries. The combined use of hmmcompete and the new spider venom component-specific classifiers demonstrated 96% sensitivity in correctly classifying all known spider toxins from the UniProtKB database. These tools are timely given the important classification needs caused by the increasing number of peptides and proteins generated by transcriptomic projects. PMID:28786958
Exposure assessment in health assessments for hand-arm vibration syndrome.
Mason, H J; Poole, K; Young, C
2011-08-01
Assessing past cumulative vibration exposure is part of assessing the risk of hand-arm vibration syndrome (HAVS) in workers exposed to hand-arm vibration and invariably forms part of a medical assessment of such workers. We investigated the strength of relationships between the presence and severity of HAVS and different cumulative exposure metrics obtained from a self-reporting questionnaire. Cumulative exposure metrics were constructed from a tool-based questionnaire applied in a group of HAVS referrals and workplace field studies. These metrics included simple years of vibration exposure, cumulative total hours of all tool use, and differing combinations of acceleration magnitudes for specific tools and their daily use, including the current frequency-weighting method contained in ISO 5349-1:2001. Use of simple years of exposure is a weak predictor of HAVS or its increasing severity. The calculation of cumulative hours across all vibrating tools used is a more powerful predictor. More complex calculations based on likely acceleration data for specific classes of tools, whether frequency weighted or not, did not offer a clear further advantage in this dataset. This may be due to the uncertainty associated with workers' recall of their past tool usage or the variability between tools in the magnitude of their vibration emission. Assessing years of exposure or 'latency' in a worker should be replaced by cumulative hours of tool use. This can be readily obtained using a tool-pictogram-based self-reporting questionnaire and a simple spreadsheet calculation.
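The "simple spreadsheet calculation" mentioned above is easy to sketch. The Python snippet below computes cumulative hours of tool use and, as an illustrative alternative, an acceleration-squared-weighted accumulation in the spirit of energy-equivalent metrics; the tool list, magnitudes, and weighting are assumptions for illustration, not the study's exact formulas.

```python
# Hypothetical questionnaire log:
# (tool_name, frequency_weighted_magnitude_m_s2, hours_per_day, days_of_use)
tool_log = [
    ("road_breaker", 12.0, 1.0, 1500),
    ("angle_grinder", 4.0, 2.0, 3000),
]

def cumulative_hours(log):
    """Total hours of vibrating tool use across all tools."""
    return sum(hours * days for _, _, hours, days in log)

def acceleration_weighted_hours(log):
    """Illustrative energy-style metric: sum of a^2 * exposure hours."""
    return sum((a ** 2) * hours * days for _, a, hours, days in log)

print(cumulative_hours(tool_log))             # simple, stronger predictor
print(acceleration_weighted_hours(tool_log))  # more complex alternative
```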
Exposure Assessment Tools by Chemical Classes - Nanomaterials
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Chemical Classes - Other Organics
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Lifestages and Populations - Lifestages
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Lifestages and Populations
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Media - Consumer Products
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Media - Water and Sediment
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Tiers and Types
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Media - Soil and Dust
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Chemical Classes - Pesticides
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Media - Aquatic Biota
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Exposure Pathways - Soil
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Exposure Pathways - Food Chains
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Exposure Pathways - References
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Exposure Pathways - Air
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
CLARIPED: a new tool for risk classification in pediatric emergencies.
Magalhães-Barbosa, Maria Clara de; Prata-Barbosa, Arnaldo; Alves da Cunha, Antonio José Ledo; Lopes, Cláudia de Souza
2016-09-01
To present a new pediatric risk classification tool, CLARIPED, and describe its development steps: (i) first round of discussion among experts, producing the first prototype; (ii) pre-test of reliability with 36 hypothetical cases; (iii) second round of discussion to make adjustments; (iv) team training; (v) pre-test with patients in real time; (vi) third round of discussion to make further adjustments; (vii) final pre-test of validity (20% of patient visits over five days). CLARIPED features five urgency categories: Red (emergency), Orange (very urgent), Yellow (urgent), Green (little urgent), and Blue (not urgent). The first classification step includes the measurement of four vital signs (Vipe score); the second step consists of an urgency discriminator assessment. Each step assigns a color, and the more urgent of the two is selected as the final classification. Each color corresponds to a maximum waiting time for medical care and referral to the physical area most appropriate for the patient's clinical condition. The interobserver agreement was substantial (kappa = 0.79), and the final pre-test, with 82 patient visits, showed good correlation between the proportion of patients in each urgency category and the number of resources used (p < 0.001). CLARIPED is an objective and easy-to-use tool for pediatric risk classification whose pre-tests suggest good reliability and validity. Larger-scale studies of its validity and reliability in different health contexts are ongoing and can contribute to the implementation of a nationwide pediatric risk classification system. Copyright © 2016 Sociedade de Pediatria de São Paulo. Publicado por Elsevier Editora Ltda. All rights reserved.
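A hypothetical Python sketch of the two-step logic described above: each step yields a color and the more urgent of the two becomes the final classification. The Vipe-score thresholds here are invented for illustration and are not the tool's actual cut-points.

```python
# Urgency categories from least to most urgent
URGENCY_ORDER = ["Blue", "Green", "Yellow", "Orange", "Red"]

def color_from_vital_signs(vipe_score):
    """Step 1: map a vital-signs (Vipe) score to a color.
    Thresholds below are hypothetical."""
    if vipe_score >= 8:
        return "Red"
    if vipe_score >= 5:
        return "Orange"
    if vipe_score >= 3:
        return "Yellow"
    if vipe_score >= 1:
        return "Green"
    return "Blue"

def classify(vipe_score, discriminator_color):
    """Final classification: the more urgent of the two step colors."""
    step1 = color_from_vital_signs(vipe_score)
    return max(step1, discriminator_color, key=URGENCY_ORDER.index)

print(classify(2, "Orange"))  # -> 'Orange'
```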
76 FR 70890 - Fenamidone; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
.../models/water/index.htm . Based on the Pesticide Root Zone Model/Exposure Analysis Modeling System (PRZM... listed in this unit could also be affected. The North American Industrial Classification System (NAICS... there is reliable information.'' This includes exposure through drinking water and in residential...
Malnutrition risk in hospitalized children: use of 3 screening tools in a large European population.
Chourdakis, Michael; Hecht, Christina; Gerasimidis, Konstantinos; Joosten, Koen Fm; Karagiozoglou-Lampoudi, Thomais; Koetse, Harma A; Ksiazyk, Janusz; Lazea, Cecilia; Shamir, Raanan; Szajewska, Hania; Koletzko, Berthold; Hulst, Jessie M
2016-05-01
Several malnutrition screening tools have been advocated for use in pediatric inpatients. We evaluated how 3 popular pediatric nutrition screening tools [i.e., the Pediatric Yorkhill Malnutrition Score (PYMS), the Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP), and the Screening Tool for Risk of Impaired Nutritional Status and Growth (STRONGKIDS)] compared with and were related to anthropometric measures, body composition, and clinical variables in patients admitted to tertiary hospitals across Europe. The 3 screening tools were applied to 2567 inpatients at 14 hospitals across 12 European countries. The classification of patients into different nutritional risk groups was compared between tools and related to anthropometric measures and clinical variables [e.g., length of hospital stay (LOS) and infection rates]. Similar completion rates were achieved for each tool (PYMS: 86%; STAMP: 84%; and STRONGKIDS: 81%). Risk classification differed markedly by tool, with an overall agreement of 41% between tools. Children categorized as high risk (PYMS: 25%; STAMP: 23%; and STRONGKIDS: 10%) had a longer LOS than children at low risk (1.4, 1.4, and 1.8 d longer, respectively; P < 0.001). Of the high-risk patients identified with the PYMS, 22% had low (<-2) body mass index (BMI) SD-scores (SDSs), and 8% had low height-for-age SDSs. For the STAMP, the percentages were 19% and 14%, respectively, and for the STRONGKIDS, 23% and 19%, respectively. The identification and classification of malnutrition risk varied across the pediatric tools used. A considerable portion of children with subnormal anthropometric measures were not identified with all of the tools. The data obtained do not allow recommending the use of any of these screening tools for clinical practice. This study was registered at clinicaltrials.gov as NCT01132742. © 2016 American Society for Nutrition.
NASA Astrophysics Data System (ADS)
Navares, Ricardo; Aznarte, José Luis
2017-04-01
In this paper, we approach the problem of predicting the concentrations of Poaceae pollen, which define the main pollination season in the city of Madrid. A classification-based approach built on a computational intelligence model (random forests) is applied to forecast the dates on which risk concentration levels are to be observed. Unlike previous works, the proposal extends the range of forecasting horizons up to 6 months ahead. Furthermore, the proposed model allows the most influential factors to be determined for each horizon, making no assumptions about the significance of the weather features. The performance of the proposed model proves it a useful tool for allergy patients in preventing and minimizing exposure to risky pollen concentrations, and for researchers in gaining deeper insight into the factors driving the pollination season.
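A minimal Python sketch of this kind of random-forest risk classification, using scikit-learn; the weather features, the toy "risk day" rule, and all values are hypothetical stand-ins for the paper's data, and the feature-importance readout mirrors the idea of identifying the most influential factors per horizon.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Hypothetical features: [mean_temp_C, rainfall_mm, relative_humidity_%]
X = rng.uniform([5, 0, 30], [30, 20, 90], size=(200, 3))
# Toy label rule standing in for observed risk-level days
y = (X[:, 0] > 18) & (X[:, 1] < 5)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(model.predict([[22.0, 1.0, 45.0]]))  # -> [ True ] (a risk day)
print(model.feature_importances_)          # most influential factors
```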
VoPham, Trang; Wilson, John P; Ruddell, Darren; Rashed, Tarek; Brooks, Maria M; Yuan, Jian-Min; Talbott, Evelyn O; Chang, Chung-Chou H; Weissfeld, Joel L
2015-08-01
Accurate pesticide exposure estimation is integral to epidemiologic studies elucidating the role of pesticides in human health. Humans can be exposed to pesticides via residential proximity to agricultural pesticide applications (drift). We present an improved geographic information system (GIS) and remote sensing method, the Landsat method, to estimate agricultural pesticide exposure through matching pesticide applications to crops classified from temporally concurrent Landsat satellite remote sensing images in California. The image classification method utilizes Normalized Difference Vegetation Index (NDVI) values in a combined maximum likelihood classification and per-field (using segments) approach. Pesticide exposure is estimated according to pesticide-treated crop fields intersecting 500 m buffers around geocoded locations (e.g., residences) in a GIS. Study results demonstrate that the Landsat method can improve GIS-based pesticide exposure estimation by matching more pesticide applications to crops (especially temporary crops) classified using temporally concurrent Landsat images compared to the standard method that relies on infrequently updated land use survey (LUS) crop data. The Landsat method can be used in epidemiologic studies to reconstruct past individual-level exposure to specific pesticides according to where individuals are located.
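Two computational pieces of the Landsat method are easy to illustrate: the NDVI calculation used in crop classification and the 500 m buffer intersection used to assign exposure. The Python sketch below uses numpy and shapely under assumed projected coordinates (meters); all geometries and values are hypothetical.

```python
import numpy as np
from shapely.geometry import Point, Polygon

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red bands."""
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

# Example NDVI for two pixels (reflectance values are hypothetical)
print(ndvi(np.array([0.45, 0.30]), np.array([0.10, 0.25])))

# Hypothetical geocoded residence with a 500 m exposure buffer
residence_buffer = Point(500000.0, 4200000.0).buffer(500.0)
# Hypothetical pesticide-treated crop field polygon
field = Polygon([(500300, 4200100), (500900, 4200100),
                 (500900, 4200600), (500300, 4200600)])

if residence_buffer.intersects(field):
    overlap_m2 = residence_buffer.intersection(field).area
    print(f"Treated field overlaps buffer: {overlap_m2:.0f} m^2")
```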
Exposure Assessment Tools by Approaches - Indirect Estimation (Scenario Evaluation)
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Lifestages and Populations - General Population
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Lifestages and Populations - Occupational Workers
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Lifestages and Populations - Residential Consumers
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Tiers and Types - Aggregate and Cumulative
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Chemical Classes - Inorganics and Fibers
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Exposure Pathways - Water and Sediment
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Machine Learning Algorithms for Automatic Classification of Marmoset Vocalizations
Ribeiro, Sidarta; Pereira, Danillo R.; Papa, João P.; de Albuquerque, Victor Hugo C.
2016-01-01
Automatic classification of vocalization type could potentially become a useful tool for the acoustic monitoring of captive colonies of highly vocal primates. However, for classification to be useful in practice, a reliable algorithm that can be successfully trained on small datasets is necessary. In this work, we consider seven different classification algorithms with the goal of finding a robust classifier that can be successfully trained on small datasets. We found good classification performance (accuracy > 0.83 and F1-score > 0.84) using the Optimum Path Forest classifier. The dataset and algorithms are made publicly available. PMID:27654941
Criteria for solvent-induced chronic toxic encephalopathy: a systematic review.
van der Hoek, J A; Verberk, M M; Hageman, G
2000-08-01
In 1985, a WHO Working Group presented diagnostic criteria and a classification for solvent-induced chronic toxic encephalopathy (CTE). In the same year, the "Workshop on neurobehavioral effects of solvents" in Raleigh, N.C., USA introduced a somewhat different classification for CTE. The objective of this review is to study the diagnostic procedures that are used to establish the diagnosis of CTE, and the extent to which the diagnostic criteria and classification of the WHO, and the classification of the Raleigh Working Group, are applied. A systematic search of studies on CTE was performed, and the diagnostic criteria and use of the WHO and Raleigh classifications were listed. We retrieved 30 original articles published in English from 1985 to 1998, in which CTE was diagnosed. Only two articles did not report the duration of solvent exposure. The type of solvent(s) involved was described in detail in four articles, poorly in 17 articles, and not at all in nine articles. Tests of general intelligence were used in 19 articles, and tests of both attention and mental flexibility and of learning and memory were used in 18 articles. Exclusion, by interview, of potentially confounding conditions, such as somatic diseases with central nervous effects and psychiatric diseases, was reported in 21 and 16 articles, respectively. In only six of the articles were both the WHO diagnostic criteria and the WHO or Raleigh classifications used. In the future, parameters of exposure, psychological test results, and use of medication that possibly affects psychological test results should always be described. We list some advantages and disadvantages of the Raleigh and WHO classifications. To aid inter-study comparisons, the diagnosis of CTE should be categorized and reported according to an internationally accepted classification.
The foodscape: classification and field validation of secondary data sources.
Lake, Amelia A; Burgoine, Thomas; Greenhalgh, Fiona; Stamp, Elaine; Tyrrell, Rachel
2010-07-01
The aims were to develop a food environment classification tool and to test the acceptability and validity of three secondary sources of food environment data within a defined urban area of Newcastle-Upon-Tyne, using a field validation method. A 21-point classification tool (with 77 sub-categories) was developed. The fieldwork recorded 617 establishments selling food and/or food products. In the sensitivity analysis of the secondary sources against fieldwork, the Newcastle City Council data performed well (83.6%), while Yell.com and the Yellow Pages performed poorly (51.2% and 50.9%, respectively). To improve the quality of secondary data, multiple sources should be used in order to achieve a realistic picture of the foodscape. 2010 Elsevier Ltd. All rights reserved.
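The sensitivity figures above follow from a simple proportion: outlets matched by a secondary source divided by the fieldwork total. A toy Python sketch, where the matched count of 516 is back-calculated for illustration so that 516/617 reproduces the reported 83.6%:

```python
def sensitivity(matched, field_total):
    """Proportion of field-recorded outlets also listed by a source."""
    return matched / field_total

field_total = 617          # outlets recorded in fieldwork
matched_council = 516      # illustrative count consistent with 83.6%
print(f"{sensitivity(matched_council, field_total):.1%}")  # -> 83.6%
```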
The Language Exposure Assessment Tool: Quantifying Language Exposure in Infants and Children
ERIC Educational Resources Information Center
DeAnda, Stephanie; Bosch, Laura; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret
2016-01-01
Purpose: The aim of this study was to develop the Language Exposure Assessment Tool (LEAT) and to examine its cross-linguistic validity, reliability, and utility. The LEAT is a computerized interview-style assessment that requests parents to estimate language exposure. The LEAT yields an automatic calculation of relative language exposure and…
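A minimal sketch of the relative-exposure calculation the LEAT automates: each language's estimated exposure hours divided by the total. The language names and hours below are hypothetical interview responses.

```python
# Hypothetical parent-estimated weekly exposure hours per language
exposure_hours = {"English": 30.0, "Spanish": 20.0}

total = sum(exposure_hours.values())
relative = {lang: hours / total for lang, hours in exposure_hours.items()}
print(relative)  # -> {'English': 0.6, 'Spanish': 0.4}
```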
77 FR 58045 - Clopyralid; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-19
... Classification System (NAICS) codes have been provided to assist you and others in determining whether this... data supporting the petition, EPA has determined that the proposed tolerance on rapeseed subgroup 20A... exposure through drinking water and in residential settings, but does not include occupational exposure...
77 FR 10962 - Flazasulfuron; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-24
.../water/index.htm . Based on the Pesticide Root Zone Model/Exposure Analysis Modeling System (PRZM/EXAMS... Classification System (NAICS) codes have been provided to assist you and others in determining whether this... reliable information.'' This includes exposure through drinking water and in residential settings, but does...
75 FR 17566 - Flutolanil; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-07
... affected. The North American Industrial Classification System (NAICS) codes have been provided to assist...) benzamide and calculated as flutolanil.'' Based on review of the data supporting the petition, EPA has also... exposures for which there is reliable information.'' This includes exposure through drinking water and in...
Overview on association of different types of leukemias with radiation exposure.
Gluzman, D F; Sklyarenko, L M; Zavelevich, M P; Koval, S V; Ivanivska, T S; Rodionova, N K
2015-06-01
Exposure to ionizing radiation is associated with an increased risk of various types of hematological malignancies. The results of major studies on the association of leukemias with radiation exposure of large populations in Japan and in Ukraine are analyzed. The patterns of different types of leukemia in 295 Chernobyl clean-up workers, diagnosed according to the criteria of the current World Health Organization classification within 10-25 years following the Chernobyl catastrophe, are summarized. In fact, a broad spectrum of radiation-related hematological malignancies has been revealed both in the Life Span Study in Japan and in the study of Chernobyl clean-up workers in Ukraine. The importance of precise diagnosis of tumors of hematopoietic and lymphoid tissues according to up-to-date classifications for elucidating the role of radiation as a causative factor in leukemias is emphasized. Such studies are of high importance since, according to recent findings, radiation-associated excess risks of several types of leukemias seem to persist throughout the follow-up period, up to 55 years after the radiation exposure.
Exposure Assessment Tools by Tiers and Types - Screening-Level and Refined
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Approaches - Direct Measurement (Point-of-Contact Measurement)
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Tiers and Types - Deterministic and Probabilistic Assessments
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Study of Software Tools to Support Systems Engineering Management
2015-06-01
A signature dissimilarity measure for trabecular bone texture in knee radiographs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woloszynski, T.; Podsiadlo, P.; Stachowiak, G. W.
Purpose: The purpose of this study is to develop a dissimilarity measure for the classification of trabecular bone (TB) texture in knee radiographs. Problems associated with the traditional extraction and selection of texture features, and with invariance to imaging conditions such as image size, anisotropy, noise, blur, exposure, magnification, and projection angle, were addressed. Methods: In the method developed, called a signature dissimilarity measure (SDM), a sum of earth mover's distances calculated for roughness and orientation signatures is used to quantify dissimilarities between textures. Scale-space theory was used to ensure scale and rotation invariance. The effects of image size, anisotropy, noise, and blur on the SDM were studied using computer-generated fractal texture images. The invariance of the measure to image exposure, magnification, and projection angle was studied using x-ray images of the human tibia head. For these studies, Mann-Whitney tests with a significance level of 0.01 were used. A comparison study was conducted between the performance of an SDM-based classification system and two other systems in the classification of Brodatz textures and the detection of knee osteoarthritis (OA). The other systems are based on weighted neighbor distance using compound hierarchy of algorithms representing morphology (WND-CHARM) and local binary patterns (LBP). Results: The results indicate that the SDM is invariant to image exposure (2.5-30 mA s), magnification (x1.00-x1.35), noise associated with film graininess and quantum mottle (<25%), blur generated by a sharp film screen, and image size (>64x64 pixels). However, the measure is sensitive to changes in projection angle (>5 deg.), image anisotropy (>30 deg.), and blur generated by a regular film screen. For the classification of Brodatz textures, the SDM-based system produced results comparable to the LBP system. For the detection of knee OA, the SDM-based system achieved 78.8% classification accuracy and outperformed the WND-CHARM system (64.2%). Conclusions: The SDM is well suited for the classification of TB texture images in knee OA detection and may be useful for the texture classification of medical images in general.
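A sketch of the SDM's core combination step in Python, using scipy's one-dimensional Wasserstein (earth mover's) distance as a stand-in; signature extraction and scale-space processing are not shown, and the signature arrays are random placeholders rather than real roughness or orientation signatures.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def sdm(roughness_a, roughness_b, orient_a, orient_b):
    """Dissimilarity = EMD(roughness signatures) + EMD(orientation
    signatures), following the sum-of-earth-mover's-distances idea."""
    return (wasserstein_distance(roughness_a, roughness_b)
            + wasserstein_distance(orient_a, orient_b))

rng = np.random.default_rng(1)
r1, r2 = rng.random(64), rng.random(64)   # placeholder roughness signatures
o1, o2 = rng.random(36), rng.random(36)   # placeholder orientation signatures
print(sdm(r1, r2, o1, o2))
```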
Employment discrimination, segregation, and health.
Darity, William A
2003-02-01
The author examines available evidence on the effects of exposure to joblessness on emotional well-being according to race and sex. The impact of racism on general health outcomes also is considered, particularly racism in the specific form of wage discrimination. Perceptions of racism and measured exposures to racism may be distinct triggers for adverse health outcomes. Whether the effects of racism are best evaluated on the basis of self-classification or social classification of racial identity is unclear. Some research sorts between the effects of race and socioeconomic status on health. The development of a new longitudinal database will facilitate more accurate identification of connections between racism and negative health effects.
NASA Astrophysics Data System (ADS)
Sliney, David H.
1990-07-01
Historically, many different agencies and standards organizations have proposed laser occupational exposure limits (ELs) or maximum permissible exposure (MPE) levels. Although some safety standards have been limited in scope to manufacturer system safety performance standards or to codes of practice, most have included occupational ELs. Initially, in the 1960s, attention was drawn to setting ELs; however, as greater experience accumulated in the use of lasers and some accident experience had been gained, safety procedures were developed. It became clear by 1971, after the first decade of laser use, that detailed hazard evaluation of each laser environment was too complex for most users, and a scheme of hazard classification evolved. Today most countries follow a scheme of four major hazard classifications as defined in Document 825 of the International Electrotechnical Commission (IEC). The classifications and the associated accessible emission limits (AELs) were based upon the ELs. The EL and AEL values today are in surprisingly good agreement worldwide. There exists a greater range of safety requirements for the user for each class of laser. The current MPEs (i.e., ELs) and their basis are highlighted in this presentation.
A Data-Driven Framework for Incorporating New Tools for ...
This talk was given during the “Exposure-Based Toxicity Testing” session at the annual meeting of the International Society for Exposure Science. It provided an update on the state of the science and tools that may be employed in risk-based prioritization efforts. It outlined knowledge gained from the data provided using these high-throughput tools to assess chemical bioactivity and to predict chemical exposures and also identified future needs. It provided an opportunity to showcase ongoing research efforts within the National Exposure Research Laboratory and the National Center for Computational Toxicology within the Office of Research and Development to an international audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Butcher, Jason T.; Stewart, Paul M.; Simon, Thomas P.
2003-01-01
Ninety-four sites were used to analyze the effects of two different classification strategies on the Benthic Community Index (BCI). The first, a priori classification, reflected the wetland status of the streams; the second, a posteriori classification, used a bio-environmental analysis to select classification variables. Both classifications were examined by measuring classification strength and testing differences in metric values with respect to group membership. The a priori (wetland) classification strength (83.3%) was greater than the a posteriori (bio-environmental) classification strength (76.8%). Both classifications found one metric that had significant differences between groups. The original index was modified to reflect the wetland classification by re-calibrating the scoring criteria for percent Crustacea and Mollusca. A proposed refinement to the original Benthic Community Index is suggested. This study shows the importance of using hypothesis-driven classifications, as well as exploratory statistical analysis, to evaluate alternative ways to reveal environmental variability in biological assessment tools.
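A minimal sketch of one common way to score classification strength for a site-by-taxa matrix: mean between-group dissimilarity minus mean within-group dissimilarity (as in Van Sickle-style analyses). The Bray-Curtis metric, this exact statistic, and the toy data are assumptions, not necessarily what the study used:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def classification_strength(X, labels):
    """Mean between-group dissimilarity minus mean within-group
    dissimilarity; larger values indicate a stronger classification."""
    D = squareform(pdist(X, metric="braycurtis"))
    labels = np.asarray(labels)
    iu = np.triu_indices_from(D, k=1)
    same = (labels[:, None] == labels[None, :])[iu]
    return D[iu][~same].mean() - D[iu][same].mean()

# Toy usage: 10 sites x 6 taxa abundances under an a priori 2-group scheme.
rng = np.random.default_rng(0)
sites = rng.poisson(lam=5, size=(10, 6)).astype(float)
print(classification_strength(sites, [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]))
```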
Ivanov, Iliya V; Leitritz, Martin A; Norrenberg, Lars A; Völker, Michael; Dynowski, Marek; Ueffing, Marius; Dietter, Johannes
2016-02-01
Abnormalities of blood vessel anatomy, morphology, and ratio can serve as important diagnostic markers for retinal diseases such as AMD or diabetic retinopathy. Large cohort studies demand automated and quantitative image analysis of vascular abnormalities. Therefore, we developed an analytical software tool to enable automated, standardized classification of blood vessels supporting clinical reading. A dataset of 61 images was collected from a total of 33 women and 8 men with a median age of 38 years. The pupils were not dilated, and images were taken after dark adaptation. In contrast to current methods in which classification is based on vessel profile intensity averages, and similar to human vision, local color contrast was chosen as a discriminator to allow artery-vein discrimination and arterial-venous ratio (AVR) calculation without vessel tracking. With 83% ± 1 standard error of the mean for our dataset, we achieved the best classification for weighted lightness information from a combination of the red, green, and blue channels. Tested on an independent dataset, our method reached 89% correct classification, which, when benchmarked against conventional ophthalmologic classification, shows significantly improved classification scores. Our study demonstrates that vessel classification based on local color contrast can cope with inter- or intraimage lightness variability and allows consistent AVR calculation. We offer an open-source implementation of this method upon request, which can be integrated into existing tool sets and applied to general diagnostic exams.
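The classification idea lends itself to a compact sketch: compute a weighted lightness from the R, G, B channels and compare a vessel's value with its local surround. The weights, threshold, and sign convention below are placeholders, not the values the study found optimal:

```python
import numpy as np

def weighted_lightness(rgb, w=(0.2, 0.6, 0.2)):
    """Weighted combination of the R, G, B channels; these weights are
    placeholders, not the combination the study found optimal."""
    return np.asarray(rgb, dtype=float) @ np.asarray(w)

def classify_vessel(vessel_px, surround_px, threshold=0.0):
    """Label a vessel by its local contrast against the surrounding
    background; the sign convention (arteries lighter in this channel
    mix) is an assumption for illustration."""
    contrast = (weighted_lightness(vessel_px).mean()
                - weighted_lightness(surround_px).mean())
    return "artery" if contrast > threshold else "vein"

# Toy usage with made-up pixel samples (rows = pixels, columns = R, G, B).
vessel = [[180, 90, 80], [175, 95, 85]]
surround = [[150, 110, 100], [155, 105, 95]]
print(classify_vessel(vessel, surround))
```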
NASA Astrophysics Data System (ADS)
Srinivasan, Yeshwanth; Hernes, Dana; Tulpule, Bhakti; Yang, Shuyu; Guo, Jiangling; Mitra, Sunanda; Yagneswaran, Sriraja; Nutter, Brian; Jeronimo, Jose; Phillips, Benny; Long, Rodney; Ferris, Daron
2005-04-01
Automated segmentation and classification of diagnostic markers in medical imagery are challenging tasks. Numerous algorithms for segmentation and classification based on statistical approaches of varying complexity are found in the literature. However, the design of an efficient and automated algorithm for precise classification of desired diagnostic markers is extremely image-specific. The National Library of Medicine (NLM), in collaboration with the National Cancer Institute (NCI), is creating an archive of 60,000 digitized color images of the uterine cervix. NLM is developing tools for the analysis and dissemination of these images over the Web for the study of visual features correlated with precancerous neoplasia and cancer. To enable indexing of images of the cervix, it is essential to develop algorithms for the segmentation of regions of interest, such as acetowhitened regions, and automatic identification and classification of regions exhibiting mosaicism and punctation. The success of such algorithms depends primarily on the selection of relevant features representing the region of interest. We present color- and geometric-feature-based statistical classification and segmentation algorithms yielding excellent identification of the regions of interest. The distinct classification of the mosaic regions from the non-mosaic ones has been obtained by clustering multiple geometric and color features of the segmented sections using various morphological and statistical approaches. Such automated classification methodologies will facilitate content-based image retrieval from the digital archive of the uterine cervix and have the potential of developing an image-based screening tool for cervical cancer.
Reboiro-Jato, Miguel; Arrais, Joel P; Oliveira, José Luis; Fdez-Riverola, Florentino
2014-01-30
The diagnosis and prognosis of several diseases can be accelerated through the use of different large-scale genome experiments. In this context, microarrays can generate expression data for a huge set of genes. However, to obtain solid statistical evidence from the resulting data, it is necessary to train and to validate many classification techniques in order to find the best discriminative method. This is a time-consuming process that normally depends on intricate statistical tools. geneCommittee is a web-based interactive tool for routinely evaluating the discriminative classification power of custom hypotheses in the form of biologically relevant gene sets. While the user can work with different gene set collections and several microarray data files to configure specific classification experiments, the tool is able to run several tests in parallel. Provided with a straightforward and intuitive interface, geneCommittee is able to render valuable information for diagnostic analyses and clinical management decisions based on systematically evaluating custom hypotheses over different data sets using complementary classifiers, a key aspect in clinical research. geneCommittee allows the enrichment of raw microarray data with gene functional annotations, producing integrated datasets that simplify the construction of better discriminative hypotheses, and allows the creation of a set of complementary classifiers. The trained committees can then be used for clinical research and diagnosis. Full documentation including common use cases and guided analysis workflows is freely available at http://sing.ei.uvigo.es/GC/.
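A sketch of the committee-of-classifiers idea on synthetic data, using scikit-learn as a stand-in for geneCommittee's own machinery (the classifier mix, soft voting, and simulated expression matrix are assumptions for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Simulated stand-in for one gene set's expression matrix (samples x genes).
X, y = make_classification(n_samples=120, n_features=50, n_informative=10,
                           random_state=0)

# A committee of complementary classifiers, evaluated by cross-validation.
committee = VotingClassifier([
    ("nb", GaussianNB()),
    ("lr", LogisticRegression(max_iter=1000)),
    ("svm", SVC(probability=True)),
], voting="soft")
print(cross_val_score(committee, X, y, cv=5).mean())
```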
Review of the literature on benzene exposure and leukemia subtypes.
Schnatter, A Robert; Rosamilia, Kim; Wojcik, Nancy C
2005-05-30
The epidemiologic literature on benzene exposure and leukemia in the MEDLINE and TOXNET databases was examined through October 2004 using the keywords "benzene", "leukemia" and "adverse health effects". This search was complemented by reviewing the reference lists from extant literature reviews and criteria documents on benzene. Published studies were characterized according to the type of industry studied and design, exposure assessment, disease classification, and control for confounding variables. Study design consisted of either cohort studies or case-control studies, which were further categorized into population-based and nested case-control studies. Disease classification considered the source of diagnostic information, whether there was clinical confirmation from medical records or histopathological, morphological and/or cytogenetic reviews, and as to whether the International Classification of Diseases (ICD) or the French-American-British (FAB) schemes were used (no studies used the Revised European-American Lymphoma (REAL) classification scheme). Nine cohort and 13 case-control studies met inclusion criteria for this review. High and significant acute myeloid leukemia risks with positive dose response relationships were identified across study designs, particularly in the "well-conducted" cohort studies and especially in more highly exposed workers in rubber, shoe, and paint industries. Risks for chronic lymphocytic leukemia (CLL) tended to show elevations in nested case-control studies, with possible dose response relationships in at least two of the three studies. However, cohort studies on CLL show no such risks. Data for chronic myeloid leukemia and acute lymphocytic leukemia are sparse and inconclusive.
The present report describes a strategy to refine the current Cramer classification of the TTC concept using a broad database (DB) termed TTC RepDose. Cramer classes 1-3 overlap to some extent, indicating a need for a better separation of structural classes likely to be toxic, mo...
A New Item Selection Procedure for Mixed Item Type in Computerized Classification Testing.
ERIC Educational Resources Information Center
Lau, C. Allen; Wang, Tianyou
This paper proposes a new Information-Time index as the basis for item selection in computerized classification testing (CCT) and investigates how this new item selection algorithm can help improve test efficiency for item pools with mixed item types. It also investigates how practical constraints such as item exposure rate control, test…
77 FR 4248 - Cyazofamid; Pesticide Tolerances for Emergency Exemptions
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-27
.../water/index.htm . Based on the Pesticide Root Zone Model/Exposure Analysis Modeling System (PRZM/EXAMS... Classification System (NAICS) codes have been provided to assist you and others in determining whether this... reliable information.'' This includes exposure through drinking water and in residential settings, but does...
Acuity systems dialogue and patient classification system essentials.
Harper, Kelle; McCully, Crystal
2007-01-01
Obtaining resources for quality patient care is a major responsibility of nurse leaders and requires accurate information in the political world of budgeting. Patient classification systems (PCS) assist nurse managers in controlling cost and improving patient care while appropriately using financial resources. This paper reviews the development, background, flaws, and components of acuity systems and discusses a few tools currently available. It also describes the development of a new acuity tool, the Patient Classification System. The PCS tool, developed in a small rural hospital, uses 5 broad concepts: (1) medications, (2) complicated procedures, (3) education, (4) psychosocial issues, and (5) complicated intravenous medications. These concepts are scored on a 4-tiered scale that differentiates significant patient characteristics and assists staffing measures aimed at equitable patient assignments and improved quality of care and performance. Data obtained through use of the PCS can be used by nurse leaders to effectively and objectively lobby for appropriate patient care resources. Two questionnaires distributed to registered nurses on a medical-surgical unit evaluated the nurses' opinion of the 5 concepts and their importance for establishing patient acuity for in-patient care. Interrater reliability among nurses was 87% with the author's acuity tool.
Can surgical simulation be used to train detection and classification of neural networks?
Zisimopoulos, Odysseas; Flouty, Evangello; Stacey, Mark; Muscroft, Sam; Giataganas, Petros; Nehme, Jean; Chow, Andre; Stoyanov, Danail
2017-10-01
Computer-assisted interventions (CAI) aim to increase the effectiveness, precision and repeatability of procedures to improve surgical outcomes. The presence and motion of surgical tools is a key information input for CAI surgical phase recognition algorithms. Vision-based tool detection and recognition approaches are an attractive solution and can be designed to take advantage of the powerful deep learning paradigm that is rapidly advancing image recognition and classification. The challenge for such algorithms is the availability and quality of labelled data used for training. In this Letter, surgical simulation is used to train tool detection and segmentation based on deep convolutional neural networks and generative adversarial networks. The authors experiment with two network architectures for image segmentation in tool classes commonly encountered during cataract surgery. A commercially-available simulator is used to create a simulated cataract dataset for training models prior to performing transfer learning on real surgical data. To the best of authors' knowledge, this is the first attempt to train deep learning models for surgical instrument detection on simulated data while demonstrating promising results to generalise on real data. Results indicate that simulated data does have some potential for training advanced classification methods for CAI systems.
Jaiswara, Ranjana; Nandi, Diptarup; Balakrishnan, Rohini
2013-01-01
Traditional taxonomy based on morphology has often failed in accurate species identification owing to the occurrence of cryptic species, which are reproductively isolated but morphologically identical. Molecular data have thus been used to complement morphology in species identification. The sexual advertisement calls in several groups of acoustically communicating animals are species-specific and can thus complement molecular data as non-invasive tools for identification. Several statistical tools and automated identifier algorithms have been used to investigate the efficiency of acoustic signals in species identification. Despite a plethora of such methods, there is a general lack of knowledge regarding the appropriate usage of these methods in specific taxa. In this study, we investigated the performance of two commonly used statistical methods, discriminant function analysis (DFA) and cluster analysis, in identification and classification based on acoustic signals of field cricket species belonging to the subfamily Gryllinae. Using a comparative approach we evaluated the optimal number of species and calling song characteristics for both the methods that lead to most accurate classification and identification. The accuracy of classification using DFA was high and was not affected by the number of taxa used. However, a constraint in using discriminant function analysis is the need for a priori classification of songs. Accuracy of classification using cluster analysis, which does not require a priori knowledge, was maximum for 6-7 taxa and decreased significantly when more than ten taxa were analysed together. We also investigated the efficacy of two novel derived acoustic features in improving the accuracy of identification. Our results show that DFA is a reliable statistical tool for species identification using acoustic signals. Our results also show that cluster analysis of acoustic signals in crickets works effectively for species classification and identification.
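A compact sketch of the two approaches compared above on synthetic call features: cross-validated DFA (which needs a priori species labels) next to hierarchical cluster analysis (which does not). The data, feature names, and linkage choice are hypothetical:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical call features (rows = songs; columns = e.g. syllable rate,
# carrier frequency, chirp duration, duty cycle) for three species.
rng = np.random.default_rng(1)
calls = np.vstack([rng.normal(loc=i, scale=0.5, size=(20, 4)) for i in range(3)])
species = np.repeat([0, 1, 2], 20)

# DFA requires the a priori species labels ...
print(cross_val_score(LinearDiscriminantAnalysis(), calls, species, cv=5).mean())

# ... cluster analysis does not: cut the dendrogram into three groups.
clusters = fcluster(linkage(calls, method="average"), t=3, criterion="maxclust")
```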
Use of machine learning methods to classify Universities based on the income structure
NASA Astrophysics Data System (ADS)
Terlyga, Alexandra; Balk, Igor
2017-10-01
In this paper we discuss the use of machine learning methods such as self-organizing maps, k-means, and Ward's clustering to classify universities based on their income. This classification allows us to quantify the categorization of universities as teaching, research, or entrepreneurial institutions, which is an important tool for governments, corporations, and the general public alike in setting expectations and selecting universities to achieve different goals.
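A minimal sketch of two of the named methods on hypothetical income-share data (self-organizing maps are omitted here only because they require a third-party library; the three-column income breakdown is an assumption):

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical matrix: one row per university, columns = income shares
# (e.g. tuition, research grants, commercial contracts).
rng = np.random.default_rng(0)
income = rng.dirichlet(alpha=[2, 2, 1], size=100)

Xs = StandardScaler().fit_transform(income)
km_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xs)
ward_labels = AgglomerativeClustering(n_clusters=3, linkage="ward").fit_predict(Xs)
```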
Automating document classification for the Immune Epitope Database
Wang, Peng; Morgan, Alexander A; Zhang, Qing; Sette, Alessandro; Peters, Bjoern
2007-01-01
Background The Immune Epitope Database contains information on immune epitopes curated manually from the scientific literature. Like similar projects in other knowledge domains, significant effort is spent on identifying which articles are relevant for this purpose. Results We here report our experience in automating this process using Naïve Bayes classifiers trained on 20,910 abstracts classified by domain experts. Improvements on the basic classifier performance were made by a) utilizing information stored in PubMed beyond the abstract itself, b) applying standard feature selection criteria, and c) extracting domain-specific feature patterns that, e.g., identify peptide sequences. We have incorporated the classifier into the curation process, determining whether abstracts are clearly relevant, clearly irrelevant, or whether no certain classification can be made, in which case the abstracts are classified manually. Testing this classification scheme on an independent dataset, we achieved 95% sensitivity and specificity in the 51.1% of abstracts that were automatically classified. Conclusion By implementing text classification, we have sped up the reference selection process without sacrificing the sensitivity or specificity of the human expert classification. This study provides both practical recommendations for users of text classification tools and a large dataset which can serve as a benchmark for tool developers. PMID:17655769
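The three-way decision rule described above can be sketched as follows; the toy abstracts and the probability thresholds are placeholders, not the values tuned for the Immune Epitope Database:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy training corpus: 1 = relevant to epitope curation, 0 = irrelevant.
abstracts = ["epitope mapping of viral peptide", "crop yield under drought",
             "T cell epitope prediction", "bridge vibration analysis"]
relevant = [1, 0, 1, 0]

clf = make_pipeline(TfidfVectorizer(), MultinomialNB()).fit(abstracts, relevant)

def triage(abstract, lo=0.2, hi=0.8):
    """Auto-accept, auto-reject, or defer to a human curator when the
    classifier is unsure (thresholds are illustrative)."""
    p = clf.predict_proba([abstract])[0, 1]
    if p >= hi:
        return "relevant"
    if p <= lo:
        return "irrelevant"
    return "manual review"

print(triage("peptide epitope binding assay"))
```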
Carroll, Kristen L; Murray, Kathleen A; MacLeod, Lynne M; Hennessey, Theresa A; Woiczik, Marcella R; Roach, James W
2011-06-01
Numerous studies underscore the poor intraobserver and interobserver reliability of both the center edge angle (CEA) and the Severin classification using plain film measurements. In this study, experienced observers applied a computer-assisted measurement program to determine the CEA in digital pelvic radiographs of adults who had been previously treated for developmental dysplasia of the hip (DDH). Using a teaching aid/algorithm for the Severin classification, the observers then assigned a Severin rating to these hips. Intraobserver and interobserver errors were then calculated on both the CEA measurements and the Severin classifications. Four pediatric orthopaedic surgeons and 1 pediatric radiologist calculated the CEAs using the OrthoView™ planning system and then determined the Severin classification on 41 blinded digital pelvic radiographs. The radiographs were evaluated by each examiner twice, with evaluations separated by 2 months. All examiners reviewed a Severin classification algorithm before making their Severin assignments. The intraobserver and interobserver reliability for both the CEA and the Severin classification were calculated using intraclass correlation coefficients and Cohen and Fleiss κ scores, respectively. The intraobserver and interobserver reliability for CEA measurement was moderate to almost perfect. When we separated the Severin classification into 3 clinically relevant groups of good (Severin I and II), dysplastic (Severin III), and poor (Severin IV and above), our interobserver reliability neared almost perfect. The Severin classification is an extremely useful and oft-used radiographic measure of the success of DDH treatment. Our research found that digital radiography, computer-aided measurement tools, the use of a Severin algorithm, and separating the Severin classification into 3 clinically relevant groups significantly increased the intraobserver and interobserver reliability of both the CEA and the Severin classification. This finding will assist future studies using the CEA and Severin classification in the radiographic assessment of DDH treatment outcomes.
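For the agreement statistics mentioned, a minimal example of scoring interobserver reliability on the three clinically relevant Severin groups with Cohen's kappa (the readings below are invented; intraobserver reliability can be scored the same way on a reader's two sessions):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical Severin assignments (grouped good/dysplastic/poor) by two
# observers on the same radiographs.
obs1 = ["good", "good", "dysplastic", "poor", "good", "dysplastic"]
obs2 = ["good", "dysplastic", "dysplastic", "poor", "good", "good"]
print(cohen_kappa_score(obs1, obs2))
```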
Garcia-Chimeno, Yolanda; Garcia-Zapirain, Begonya
2015-01-01
The classification of subjects' pathologies enables rigor to be applied to the treatment of certain pathologies, as doctors at times juggle so many variables that they can end up confusing some illnesses with others. Thanks to machine learning techniques applied to a health-record database, it is possible to make such classifications using our algorithm, hClass. hClass performs non-linear classification of either a supervised, unsupervised or semi-supervised type. The machine is configured using other techniques such as validation of the set to be classified (cross-validation), feature reduction (PCA) and committees for assessing the various classifiers. The tool is easy to use: the sample matrix and features that one wishes to classify, the number of iterations and the subjects who are going to be used to train the machine all need to be introduced as inputs. As a result, the success rate is shown either via a single classifier or via a committee if one has been formed. A 90% success rate is obtained with the AdaBoost classifier and 89.7% in the case of a committee (comprising three classifiers) when PCA is applied. This tool can be expanded to allow the user to fully characterise the classifiers by adjusting them to each classification use.
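A sketch of the PCA-plus-committee configuration on synthetic data, using scikit-learn as a stand-in for hClass (the three committee members and the simulated health-record matrix are assumptions):

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Simulated stand-in for a health-record feature matrix.
X, y = make_classification(n_samples=200, n_features=40, random_state=0)

# Feature reduction (PCA) feeding a three-classifier committee,
# scored by cross-validation.
committee = VotingClassifier([
    ("ada", AdaBoostClassifier(random_state=0)),
    ("knn", KNeighborsClassifier()),
    ("tree", DecisionTreeClassifier(random_state=0)),
])
model = make_pipeline(PCA(n_components=10), committee)
print(cross_val_score(model, X, y, cv=5).mean())
```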
Wiegmann, D A; Shappell, S A
2001-11-01
The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based on Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. The HFACS framework was used to analyze human error data associated with aircrew-related commercial aviation accidents that occurred between January 1990 and December 1996 using database records maintained by the NTSB and the FAA. Investigators were able to reliably accommodate all the human causal factors associated with the commercial aviation accidents examined in this study using the HFACS system. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena. However, additional research is needed to examine its applicability to areas outside the flight deck, such as aircraft maintenance and air traffic control domains.
Young Kim, Eun; Johnson, Hans J
2013-01-01
A robust multi-modal tool for automated registration, bias correction, and tissue classification has been implemented for large-scale heterogeneous, multi-site, longitudinal MR data analysis. This work focused on improving an iterative optimization framework between bias correction, registration, and tissue classification inspired by previous work. The primary contributions are robustness improvements from the incorporation of the following four elements: (1) utilizing multi-modal and repeated scans, (2) incorporating highly deformable registration, (3) using an extended set of tissue definitions, and (4) using multi-modal-aware intensity-context priors. The benefits of these enhancements were investigated in a series of experiments with both a simulated brain data set (BrainWeb) and highly heterogeneous data from a 32-site imaging study, with quality assessed through expert visual inspection. The implementation of this tool is tailored for, but not limited to, large-scale data processing with great data variation, with a flexible interface. In this paper, we describe enhancements to a joint registration, bias correction, and tissue classification pipeline that improve the generalizability and robustness for processing multi-modal longitudinal MR scans collected at multiple sites. The tool was evaluated using both simulated and human-subject MRI images. With these enhancements, the results showed improved robustness for large-scale heterogeneous MRI processing.
Methods and potentials for using satellite image classification in school lessons
NASA Astrophysics Data System (ADS)
Voss, Kerstin; Goetzke, Roland; Hodam, Henryk
2011-11-01
The FIS project - FIS stands for Fernerkundung in Schulen (Remote Sensing in Schools) - aims at a better integration of the topic "satellite remote sensing" in school lessons. According to this, the overarching objective is to teach pupils basic knowledge and fields of application of remote sensing. Despite the growing significance of digital geomedia, the topic "remote sensing" is not broadly supported in schools. Often, the topic is reduced to a short reflection on satellite images and used only for additional illustration of issues relevant for the curriculum. Without addressing the issue of image data, this can hardly contribute to the improvement of the pupils' methodical competences. Because remote sensing covers more than simple, visual interpretation of satellite images, it is necessary to integrate remote sensing methods like preprocessing, classification and change detection. Dealing with these topics often fails because of confusing background information and the lack of easy-to-use software. Based on these insights, the FIS project created different simple analysis tools for remote sensing in school lessons, which enable teachers as well as pupils to be introduced to the topic in a structured way. This functionality as well as the fields of application of these analysis tools will be presented in detail with the help of three different classification tools for satellite image classification.
Screening and Mitigation of Layperson Anxiety in Aerospace Environments.
Mulcahy, Robert A; Blue, Rebecca S; Vardiman, Johnené L; Castleberry, Tarah L; Vanderploeg, James M
Anxiety may present challenges for commercial spaceflight operations, as little is known regarding the psychological effects of spaceflight on laypersons. A recent investigation evaluated measures of anxiety during centrifuge-simulated suborbital commercial spaceflight, highlighting the potential for severe anxiousness to interrupt spaceflight operations. To pave the way for future research, an extensive literature review identified existing knowledge that may contribute to formation of interventions for anxiety in commercial spaceflight. Useful literature was identified regarding anxiety from a variety of fields, including centrifugation, fear of flying, motion sickness, and military operations. Fear of flying is the most extensively studied area, with some supportive evidence from centrifugation studies. Virtual reality exposure (VRE) is as effective as actual training flight exposure (or analog exposure) in mitigation of flight-related anxiety. The addition of other modalities, such as cognitive behavioral therapy or biofeedback, to VRE improves desensitization compared to VRE alone. Motion sickness-susceptible individuals demonstrate higher trait anxiety than nonsusceptible individuals; for this reason, motion sickness susceptibility questionnaires may be useful measures to identify at-risk individuals. Some military studies indicate that psychiatric history and personality classification may have predictive value in future research. Medication countermeasures consisting of benzodiazepines may quell in-flight anxiety, but do not likely improve anxiety on repeat exposure. The scarce available literature addressing anxiety in unique environments indicates that training/repeated exposure may mitigate anxiety. Anxiety and personality indices may be helpful screening tools, while pharmaceuticals may be useful countermeasures when needed. Mulcahy RA, Blue RS, Vardiman JL, Castleberry TL, Vanderploeg JM. Screening and mitigation of layperson anxiety in aerospace environments. Aerosp Med Hum Perform. 2016; 87(10):882-889.
78 FR 24094 - Azoxystrobin; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-24
... Classification System (NAICS) codes is not intended to be exhaustive, but rather provides a guide to help readers... response to the notice of filing. Based upon review of the data supporting the petition, EPA is... exposures for which there is reliable information.'' This includes exposure through drinking water and in...
Canizo, Brenda V; Escudero, Leticia B; Pérez, María B; Pellerano, Roberto G; Wuilloud, Rodolfo G
2018-03-01
The feasibility of the application of chemometric techniques associated with multi-element analysis for the classification of grape seeds according to their provenance vineyard soil was investigated. Grape seed samples from different localities of Mendoza province (Argentina) were evaluated. Inductively coupled plasma mass spectrometry (ICP-MS) was used for the determination of twenty-nine elements (Ag, As, Ce, Co, Cs, Cu, Eu, Fe, Ga, Gd, La, Lu, Mn, Mo, Nb, Nd, Ni, Pr, Rb, Sm, Te, Ti, Tl, Tm, U, V, Y, Zn and Zr). Once the analytical data were collected, supervised pattern recognition techniques such as linear discriminant analysis (LDA), partial least squares discriminant analysis (PLS-DA), k-nearest neighbors (k-NN), support vector machine (SVM) and Random Forest (RF) were applied to construct classification/discrimination rules. The results indicated that the nonlinear methods, RF and SVM, perform best, with up to 98% and 93% accuracy rates, respectively, and are therefore excellent tools for the classification of grape seeds. Copyright © 2017 Elsevier Ltd. All rights reserved.
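As a sketch of the chemometric workflow, the snippet below cross-validates a random forest on a hypothetical samples-by-elements matrix; the data are simulated stand-ins, not the Mendoza measurements:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical data: rows = grape seed samples, columns = the 29 element
# concentrations; labels = vineyard locality.
rng = np.random.default_rng(0)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(90, 29))
y = rng.integers(0, 3, size=90)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
print(cross_val_score(rf, X, y, cv=5).mean())
```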
Visual Impairment/Intracranial Pressure Risk Clinical Care Data Tools
NASA Technical Reports Server (NTRS)
Van Baalen, Mary; Mason, Sara S.; Taiym, Wafa; Wear, Mary L.; Moynihan, Shannan; Alexander, David; Hart, Steve; Tarver, William
2014-01-01
Prior to 2010, several ISS crewmembers returned from spaceflight with changes to their vision, ranging from a mild hyperopic shift to frank disc edema. As a result, NASA expanded clinical vision testing to include more comprehensive medical imaging, including Optical Coherence Tomography and 3 Tesla Brain and Orbit MRIs. The Space and Clinical Operations (SCO) Division developed a clinical practice guideline that classified individuals based on their symptoms and diagnoses to facilitate clinical care. For the purposes of clinical surveillance, this classification was applied retrospectively to all crewmembers who had sufficient testing for classification. This classification is also a tool that has been leveraged by researchers to identify potential risk factors. In March 2014, driven in part by a more comprehensive understanding of the imaging data and increased imaging capability on orbit, the SCO Division revised their clinical care guidance to outline in-flight care and increase post-flight follow-up. The new clinical guidance does not include a classification scheme.
Historical limitations of determinant based exposure groupings in the rubber manufacturing industry
Vermeulen, R; Kromhout, H
2005-01-01
Aims: To study the validity of using a cross-sectional industry-wide exposure survey to develop exposure groupings for epidemiological purposes that extend beyond the time period in which the exposure data were collected. Methods: Exposure determinants were used to group workers into high, medium, and low exposure groups. The contrast of this grouping and of other commonly used grouping schemes based on plant and department within this exposure survey and a previously conducted survey within the same industry (and factories) was estimated and compared. Results: Grouping of inhalable and dermal exposure based on exposure determinants resulted in the highest, but still modest, contrast (ε ∼ 0.3). Classifying subjects based on a combination of plant and department resulted in a slightly lower contrast (ε ∼ 0.2). If the determinant-based grouping derived from the 1997 exposure survey was used to classify workers in the 1988 survey, the average contrast decreased significantly for both exposures (ε ∼ 0.1). Conversely, the exposure classification based on plant and department increased in contrast (from ε ∼ 0.2 to ε ∼ 0.3) and retained its relative ranking over time. Conclusions: Although determinant-based groupings seem to result in more efficient groupings within a cross-sectional survey, they have to be used with caution, as they might yield significantly less contrast beyond the studied population or time period. It is concluded that a classification based on plant and department might be more desirable for retrospective studies in the rubber manufacturing industry, as it seems to have more historical relevance and is most likely more accurately recorded historically than information on exposure determinants in a particular industry. PMID:16234406
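The contrast statistic ε used above can be sketched as the between-group share of total (log) exposure variance; the paper's exact estimator (e.g., a random-effects model fit) may differ:

```python
import numpy as np

def contrast(log_exposures, groups):
    """Between-group variance divided by total variance of (log) exposure,
    the usual definition of grouping contrast; simple plug-in estimator
    for illustration only."""
    x = np.asarray(log_exposures, dtype=float)
    groups = np.asarray(groups)
    grand = x.mean()
    means = {g: x[groups == g].mean() for g in np.unique(groups)}
    between = np.mean([(means[g] - grand) ** 2 for g in groups])
    within = np.mean([(xi - means[g]) ** 2 for xi, g in zip(x, groups)])
    return between / (between + within)

# Toy usage: log exposures for workers in three determinant-based groups.
print(contrast([0.1, 0.3, 1.0, 1.2, 2.0, 2.2], ["lo", "lo", "med", "med", "hi", "hi"]))
```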
New decision support tool for acute lymphoblastic leukemia classification
NASA Astrophysics Data System (ADS)
Madhukar, Monica; Agaian, Sos; Chronopoulos, Anthony T.
2012-03-01
In this paper, we build a new decision support tool to improve treatment intensity choice in childhood ALL. The developed system includes different methods to accurately measure cell properties in microscope blood film images. The blood images undergo a series of pre-processing steps, which include color correlation and contrast enhancement. By performing K-means clustering on the resultant images, the nuclei of the cells under consideration are obtained. Shape features and texture features are then extracted for classification. The system is further tested on the classification of spectra measured from the cell nuclei in blood samples in order to distinguish normal cells from those affected by acute lymphoblastic leukemia. The results show that the proposed system robustly segments and classifies acute lymphoblastic leukemia based on complete microscopic blood images.
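A minimal sketch of the K-means segmentation step on an RGB blood-film image; the "darkest cluster = stained nuclei" heuristic and the cluster count are assumptions for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans

def segment_nuclei(image_rgb, n_clusters=3):
    """Cluster pixels in RGB space and return a mask of the darkest
    cluster, taken here as the nuclei."""
    h, w, _ = image_rgb.shape
    pixels = image_rgb.reshape(-1, 3).astype(float)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(pixels)
    darkest = np.argmin(km.cluster_centers_.sum(axis=1))
    return (km.labels_ == darkest).reshape(h, w)

# Toy usage on a random image; real use would pass a stained blood film.
rng = np.random.default_rng(0)
mask = segment_nuclei(rng.integers(0, 256, size=(32, 32, 3)))
```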
A software tool for automatic classification and segmentation of 2D/3D medical images
NASA Astrophysics Data System (ADS)
Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur
2013-02-01
Modern medical diagnosis utilizes techniques of visualization of human internal organs (CT, MRI) or of its metabolism (PET). However, evaluation of acquired images made by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of partial volume effect in PET images, acquired with PET/MR scanners. This article presents briefly a MaZda software package, which supports 2D and 3D medical image analysis aiming at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided.
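In the spirit of MaZda's texture quantification (though MaZda computes a far larger attribute set), a short example of gray-level co-occurrence features with scikit-image (these function names require scikit-image ≥ 0.19; the random ROI is a placeholder):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Placeholder 2D region of interest; real use would crop one from MR data.
rng = np.random.default_rng(0)
roi = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
features = {p: graycoprops(glcm, p).mean()
            for p in ("contrast", "homogeneity", "energy", "correlation")}
print(features)
```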
Rotorcraft Conceptual Design Environment
2009-10-01
systems engineering design tool sets. The DaVinci Project vision is to develop software architecture and tools specifically for acquisition system... enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.
48 CFR 225.7001 - Definitions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Mooring Chain. (c) “End product” is defined in the clause at 252.225-7012, Preference for Certain Domestic Commodities. (d) Hand or measuring tools means those tools listed in Federal supply classifications 51 and 52...
48 CFR 225.7001 - Definitions.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Mooring Chain. (c) “End product” is defined in the clause at 252.225-7012, Preference for Certain Domestic Commodities. (d) Hand or measuring tools means those tools listed in Federal supply classifications 51 and 52...
48 CFR 225.7001 - Definitions.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Mooring Chain. (c) “End product” is defined in the clause at 252.225-7012, Preference for Certain Domestic Commodities. (d) Hand or measuring tools means those tools listed in Federal supply classifications 51 and 52...
48 CFR 225.7001 - Definitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Mooring Chain. (c) “End product” is defined in the clause at 252.225-7012, Preference for Certain Domestic Commodities. (d) Hand or measuring tools means those tools listed in Federal supply classifications 51 and 52...
48 CFR 225.7001 - Definitions.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Mooring Chain. (c) “End product” is defined in the clause at 252.225-7012, Preference for Certain Domestic Commodities. (d) Hand or measuring tools means those tools listed in Federal supply classifications 51 and 52...
EPA EcoBox Tools by Stressors - Biological
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Stressors - Physical
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Effects - Aquatic
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Effects - References
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Stressors - References
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Stressors - Chemical
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Receptors - Biota
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Receptors - References
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Effects - Terrestrial
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
ERIC Educational Resources Information Center
Schutter, Linda S.; Brinker, Richard P.
1992-01-01
A review of the literature on biological and environmental effects of cocaine use suggests that the classification of infants and young children as prenatally cocaine exposed is neither descriptive nor predictive of behavior. The classification of behavior rather than labeling of the child is encouraged, as are partnerships with families of…
DOT National Transportation Integrated Search
2001-02-01
The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based upon ...
Providing the Missing Link: the Exposure Science Ontology ExO
Although knowledge-discovery tools are new to the exposure science community, these tools are critical for leveraging exposure information to design health studies and interpret results for improved public health decisions. Standardized ontologies define relationships, allow for ...
EPA EcoBox Tools by Receptors - Habitats and Ecosystems
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Effects - Effects In ERA
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Stressors - Stressors in ERA
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Receptors - Receptors in ERA
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA ExpoBox: Submit Tool Information
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases, mode
Classification of Radiological Changes in Burst Fractures
Şentürk, Salim; Öğrenci, Ahmet; Gürçay, Ahmet Gürhan; Abdioğlu, Ahmet Atilla; Yaman, Onur; Özer, Ali Fahir
2018-01-01
AIM: Burst fractures can present with different radiological appearances after high-energy trauma. We aimed to simplify the radiological staging of burst fractures. METHODS: Eighty patients who had sustained spinal trauma and had burst fractures were evaluated with respect to age, sex, fracture segment, neurological deficit, secondary organ injury, and the radiological changes that occurred. RESULTS: We propose a new radiological classification of burst fractures. CONCLUSIONS: According to this classification system, secondary organ injury and neurological deficit can be indicators of the energy of exposure. If the energy is high, the clinical status will be worse. Thus, we can get an idea about the likelihood of neurological deficit and secondary organ injuries. This classification simplifies the radiological staging of burst fractures and gives a very accurate idea of the neurological condition. PMID:29531604
Control banding approaches for nanomaterials.
Brouwer, Derk H
2012-07-01
Control banding (CB) has been developed as a pragmatic tool to manage the risk resulting from exposure to a wide variety of potentially hazardous substances in the absence of firm toxicological and exposure information. Currently, the CB approach is applied for emerging risks such as nanoparticles, by the development of various CB-based tools. Six of these are compared. Despite their similarity, i.e. combining hazard and exposure into control or risk bands, the structure, the applicability domains, and the assignment of the hazard and exposure bands, show differences that may affect the consistency of the resulting outcome amongst the various CB tools. The value of the currently available CB tools for nanomaterials can be enhanced by transparently elucidating these differences for user consideration during the selection of a tool for a specific scenario of application.
Challenges of interoperability using HL7 v3 in Czech healthcare.
Nagy, Miroslav; Preckova, Petra; Seidl, Libor; Zvarova, Jana
2010-01-01
The paper describes several classification systems that could improve patient safety through semantic interoperability among contemporary electronic health record systems (EHR-Ss) with support of the HL7 v3 standard. We describe a proposal and a pilot implementation of a semantic interoperability platform (SIP) interconnecting current EHR-Ss by using HL7 v3 messages and concept mappings onto the most widely used classification systems. The increasing number of classification systems and nomenclatures requires the design of various conversion tools for transfer between the main classification systems. We present the so-called LIM filler module and the HL7 broker, which are parts of the SIP, playing the role of such conversion tools. The analysis of the suitability and usability of individual terminological thesauri began with the mapping of the clinical content of the Minimal Data Model for Cardiology (MDMC) to various terminological classification systems. A nation-wide implementation of the SIP would include adopting and translating international coding systems and nomenclatures, and developing implementation guidelines facilitating the migration from national standards to international ones. Our research showed that the creation of such a platform is feasible; however, it will require a huge effort to fully adapt the Czech healthcare system to the European environment.
Muench, Eugene V.
1971-01-01
A computerized English/Spanish correlation index to five biomedical library classification schemes and computerized English/Spanish and Spanish/English listings of MeSH are described. The index was accomplished by assigning the appropriate classification numbers of five classification schemes (National Library of Medicine; Library of Congress; Dewey Decimal; Cunningham; Boston Medical) to MeSH and to a Spanish translation of MeSH. The data were keypunched, merged on magnetic tape, and sorted by computer alphabetically by English and Spanish subject headings and sequentially by classification number. Some benefits and uses of the index are: a complete index to classification schemes based on MeSH terms; a tool for conversion of classification numbers when reclassifying collections; a Spanish index and a crude Spanish translation of five classification schemes; and a data base for future applications, e.g., automatic classification. Other classification schemes, such as the UDC, and translations of MeSH into other languages can be added. PMID:5172471
Pyroglyphid mites as a source of work-related allergens.
Macan, Jelena; Kanceljak-Macan, Božica; Milković-Kraus, Sanja
2012-01-01
Pyroglyphid mites are primarily associated with allergen exposure at home; hence the name house dust mites. However, we have found numerous studies reporting pyroglyphid mite levels in public and occupational settings. This review presents the findings of house dust mite allergens (family Pyroglyphidae, species Dermatophagoides) as potential work-related risk factors and proposes occupations at risk of house dust mite-related diseases. Pyroglyphid mites or their allergens are found in various workplaces, but clinically relevant exposures have been observed in hotels, cinemas, schools, day-care centres, libraries, public transportation (buses, trains, taxis, and airplanes), fishing-boats, submarines, poultry farms, and churches. Here we propose a classification of occupational risk as low (occasional exposure to mite allergen levels up to 2 μg g(-1)), moderate (exposure between 2 μg g(-1) and 10 μg g(-1)), and high (exposure >10 μg g(-1)). The classification of risk should include factors relevant for indoor mite population (climate, building characteristics, and cleaning schedule). To avoid development or aggravation of allergies associated with exposure to house dust mites at work, occupational physicians should assess exposure risk at work, propose proper protection, provide vocational guidance to persons at risk and conduct pre-employment and periodic examinations to diagnose new allergy cases. Protection at work should aim to control dust mite levels at work. Measures may include proper interior design and regular cleaning and building maintenance.
Röösli, Martin; Jenni, Daniela; Kheifets, Leeka; Mezei, Gabor
2011-08-15
The aim of this study was to evaluate an exposure assessment method that classifies apartments in three exposure categories of extremely low frequency magnetic fields (ELF-MF) based on the location of the apartment relative to the transformer room. We completed measurements in 39 apartments in 18 buildings. In each room of the apartments ELF-MF was concurrently measured with 5 to 6 EMDEX II meters for 10 min. Measured arithmetic mean ELF-MF was 0.59 μT in 8 apartments that were fully adjacent to a transformer room, either directly above the transformer or touching the transformer room wall-to-wall. In apartments that only partly touched the transformer room at corners or edges, average ELF-MF level was 0.14 μT. Average exposure in the remaining apartments was 0.10 μT. Kappa coefficient for exposure classification was 0.64 (95%-CI: 0.45-0.82) if only fully adjacent apartments were considered as highly exposed (>0.4 μT). We found a distinct ELF-MF exposure gradient in buildings with transformer. Exposure classification based on the location of the apartment relative to the transformer room appears feasible. Such an approach considerably reduces effort for exposure assessment and may be used to eliminate selection bias in future epidemiologic studies. Copyright © 2011 Elsevier B.V. All rights reserved.
Segmentation of bone and soft tissue regions in digital radiographic images of extremities
NASA Astrophysics Data System (ADS)
Pakin, S. Kubilay; Gaborski, Roger S.; Barski, Lori L.; Foos, David H.; Parker, Kevin J.
2001-07-01
This paper presents an algorithm for segmentation of computed radiography (CR) images of extremities into bone and soft tissue regions. The algorithm is a region-based one in which the regions are constructed using a growing procedure with two different statistical tests. Following the growing process, tissue classification procedure is employed. The purpose of the classification is to label each region as either bone or soft tissue. This binary classification goal is achieved by using a voting procedure that consists of clustering of regions in each neighborhood system into two classes. The voting procedure provides a crucial compromise between local and global analysis of the image, which is necessary due to strong exposure variations seen on the imaging plate. Also, the existence of regions whose size is large enough such that exposure variations can be observed through them makes it necessary to use overlapping blocks during the classification. After the classification step, resulting bone and soft tissue regions are refined by fitting a 2nd order surface to each tissue, and reevaluating the label of each region according to the distance between the region and surfaces. The performance of the algorithm is tested on a variety of extremity images using manually segmented images as gold standard. The experiments showed that our algorithm provided a bone boundary with an average area overlap of 90% compared to the gold standard.
Tamura, Taro; Suganuma, Narufumi; Hering, Kurt G; Vehmas, Tapio; Itoh, Harumi; Akira, Masanori; Takashima, Yoshihiro; Hirano, Harukazu; Kusaka, Yukinori
2015-01-01
The International Classification of High-resolution Computed Tomography (HRCT) for Occupational and Environmental Respiratory Diseases (ICOERD) has been developed for the screening, diagnosis, and epidemiological reporting of respiratory diseases caused by occupational hazards. This study aimed to establish a correlation between readings of HRCT (according to the ICOERD) and those of chest radiography (CXR) pneumoconiotic parenchymal opacities (according to the International Labor Organization Classification/International Classification of Radiographs of Pneumoconioses [ILO/ICRP]). Forty-six patients with and 28 controls without mineral dust exposure underwent posterior-anterior CXR and HRCT. We recorded all subjects' exposure and smoking history. Experts independently read CXRs (using ILO/ICRP). Experts independently assessed HRCT using the ICOERD parenchymal abnormalities grades for well-defined rounded opacities (RO), linear and/or irregular opacities (IR), and emphysema (EM). The correlation between the ICOERD summed grades and ILO/ICRP profusions was evaluated using Spearman's rank-order correlation. Twenty-three patients had small opacities on CXR. HRCT showed that 21 patients had RO; 20 patients, IR opacities; and 23 patients, EM. The correlation between ILO/ICRP profusions and the ICOERD grades was 0.844 for rounded opacities (p<0.01). ICOERD readings from HRCT scans correlated well with previously validated ILO/ICRP criteria. The ICOERD adequately detects pneumoconiotic micronodules and can be used for the interpretation of pneumoconiosis.
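The correlation analysis reduces to a one-liner; the paired readings below are invented placeholders, not the study's data:

```python
from scipy.stats import spearmanr

# Hypothetical paired readings: ILO/ICRP profusion (ordinal) on CXR and the
# summed ICOERD grade for rounded opacities on HRCT for the same subjects.
ilo_profusion = [0, 0, 1, 1, 2, 2, 3, 3, 1, 2]
icoerd_ro_sum = [0, 1, 2, 3, 5, 4, 8, 7, 2, 6]
rho, p = spearmanr(ilo_profusion, icoerd_ro_sum)
print(rho, p)
```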
Faber-Langendoen, D.; Aaseng, N.; Hop, K.; Lew-Smith, M.; Drake, J.
2007-01-01
Question: How can the U.S. National Vegetation Classification (USNVC) serve as an effective tool for classifying and mapping vegetation, and inform assessments and monitoring? Location: Voyageurs National Park, northern Minnesota, U.S.A., and environs. The park contains 54 243 ha of terrestrial habitat in the sub-boreal region of North America. Methods: We classified and mapped the natural vegetation using the USNVC, with 'alliance' and 'association' as base units. We compiled 259 classification plots and 1251 accuracy assessment test plots. Both plot and type ordinations were used to analyse vegetation and environmental patterns. Color infrared aerial photography (1:15840 scale) was used for mapping. Polygons were manually drawn, then transferred into digital form. Classification and mapping products are stored in publicly available databases. Past fire and logging events were used to assess distribution of forest types. Results and Discussion: Ordination and cluster analyses confirmed 49 associations and 42 alliances, with three associations ranked as globally vulnerable to extirpation. Ordination provided a useful summary of vegetation and ecological gradients. Overall map accuracy was 82.4%. Pinus banksiana - Picea mariana forests were less frequent in areas unburned since the 1930s. Conclusion: The USNVC provides a consistent ecological tool for summarizing and mapping vegetation. The products provide a baseline for assessing forests and wetlands, including fire management. The standardized classification and map units provide local to continental perspectives on park resources through linkages to state, provincial, and national classifications in the U.S. and Canada, and to NatureServe's Ecological Systems classification. © IAVS; Opulus Press.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yi-Xin; Zeng, Qiang; Wang, Le
Urinary haloacetic acids (HAAs), such as dichloroacetic acid (DCAA) and trichloroacetic acid (TCAA), have been suggested as potential biomarkers of exposure to drinking water disinfection byproducts (DBPs). However, variable exposure to and the short elimination half-lives of these biomarkers can result in considerable variability in urinary measurements, leading to exposure misclassification. Here we examined the variability of DCAA and TCAA levels in the urine among eleven men who provided urine samples on 8 days over 3 months. The urinary concentrations of DCAA and TCAA were measured by gas chromatography coupled with electron capture detection. We calculated the intraclass correlation coefficients (ICCs) to characterize the within-person and between-person variances and computed the sensitivity and specificity to assess how well single or multiple urine collections accurately determined personal 3-month average DCAA and TCAA levels. The within-person variance was much higher than the between-person variance for all three sample types (spot, first morning, and 24-h urine samples) for DCAA (ICC=0.08–0.37) and TCAA (ICC=0.09–0.23), regardless of the sampling interval. A single-spot urinary sample predicted high (top 33%) 3-month average DCAA and TCAA levels with high specificity (0.79 and 0.78, respectively) but relatively low sensitivity (0.47 and 0.50, respectively). Collecting two or three urine samples from each participant improved the classification. The poor reproducibility of the measured urinary DCAA and TCAA concentrations indicates that a single measurement may not accurately reflect individual long-term exposure. Collection of multiple urine samples from one person is an option for reducing exposure classification errors in studies exploring the effects of DBP exposure on reproductive health. Highlights: • We evaluated the variability of DCAA and TCAA levels in the urine among men. • Urinary DCAA and TCAA levels varied greatly over a 3-month period. • A single measurement may not accurately reflect personal long-term exposure levels. • Collecting multiple samples from one person improved the exposure classification.
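A minimal sketch of the variance decomposition behind the reported ICCs, assuming a one-way random-effects model; the data matrix is hypothetical.

```python
# Minimal sketch: one-way random-effects ICC(1,1) partitioning within- vs
# between-person variance in repeated urinary biomarker measurements.
# The data values below are hypothetical log-concentrations.
import numpy as np

# rows = subjects, columns = repeated urine samples
data = np.array([
    [1.2, 0.4, 1.9, 0.8],
    [0.9, 2.1, 0.3, 1.5],
    [1.7, 0.6, 1.1, 2.0],
])
n, k = data.shape
grand_mean = data.mean()
ms_between = k * ((data.mean(axis=1) - grand_mean) ** 2).sum() / (n - 1)
ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))

# ICC(1,1): share of total variance that is between-person;
# low values indicate poor within-person reproducibility
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC = {icc:.2f}")
```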
Morfeld, Peter; Bruch, Joachim; Levy, Len; Ngiewih, Yufanyi; Chaudhuri, Ishrat; Muranko, Henry J; Myerson, Ross; McCunney, Robert J
2015-04-23
We analyze the scientific basis and methodology used by the German MAK Commission in their recommendations for exposure limits and carcinogen classification of "granular biopersistent particles without known specific toxicity" (GBS). These recommendations are under review at the European Union level. We examine the scientific assumptions in an attempt to reproduce the results. MAK's human equivalent concentrations (HECs) are based on particle mass and on a volumetric model in which results from rat inhalation studies are translated to derive occupational exposure limits (OELs) and a carcinogen classification. We followed the methods as proposed by the MAK Commission and Pauluhn 2011. We also examined key assumptions in the metrics, such as the surface area of the human lung, deposition fractions of inhaled dusts, human clearance rates, and the risk of lung cancer among workers presumed to have some potential for lung overload, the physiological condition in rats associated with an increase in lung cancer risk. The MAK recommendations on exposure limits for GBS rest on numerous incorrect assumptions that adversely affect the final results. The procedures to derive the respirable occupational exposure limit (OEL) could not be reproduced, a finding that raises considerable scientific uncertainty about the reliability of the recommendations. Moreover, the scientific basis of using the rat model is confounded by the fact that rats and humans show different cellular responses to inhaled particles, as demonstrated by bronchoalveolar lavage (BAL) studies in both species. Classifying all GBS as carcinogenic to humans based on rat inhalation studies, in which lung overload leads to chronic inflammation and cancer, is inappropriate. Studies of workers who have been exposed to relevant levels of dust have not indicated an increase in lung cancer risk. Using the methods proposed by the MAK, we were unable to reproduce the OEL for GBS recommended by the Commission, but identified substantial errors in the models. Considerable shortcomings in the values used for lung surface area, clearance rates, and deposition fractions, as well as the use of mass and volumetric metrics rather than a particle surface area metric, limit the scientific reliability of the proposed GBS OEL and carcinogen classification.
NASA Astrophysics Data System (ADS)
García-Flores, Agustín.; Paz-Gallardo, Abel; Plaza, Antonio; Li, Jun
2016-10-01
This paper describes Hypergim, a new web platform dedicated to the classification of satellite images. The current implementation of the platform enables users to classify satellite images from any part of the world, thanks to the worldwide maps provided by Google Maps. To perform this classification, Hypergim uses unsupervised algorithms such as Isodata and K-means. Here, we present an extension of the original platform that adapts Hypergim to use supervised algorithms in order to improve the classification results. This involves a significant modification of the user interface, providing the user with a way to obtain samples of the classes present in the images for use in the training phase of the classification process. Another main goal of this development is to improve the runtime of the image classification process. To achieve this goal, we use a parallel implementation of the Random Forest classification algorithm, a modification of the well-known CURFIL software package. The use of this type of algorithm for image classification is widespread today thanks to its precision and ease of training. Our implementation of Random Forest was developed on the CUDA platform, which enables us to exploit several models of NVIDIA graphics processing units (GPUs) for general-purpose computing tasks such as image classification algorithms. In addition to CUDA, we use other parallel libraries, such as Intel Boost, to take advantage of the multithreading capabilities of modern CPUs. To ensure the best possible results, the platform is deployed on a cluster of commodity GPUs, so that multiple users can use the tool concurrently. The experimental results indicate that the new supervised algorithm substantially outperforms the unsupervised algorithms previously implemented in Hypergim, in both runtime and classification precision.
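To illustrate the supervised step described above, here is a minimal scikit-learn sketch (not the paper's CUDA/CURFIL implementation); the image, training coordinates, and labels are hypothetical.

```python
# Minimal sketch: train a Random Forest on user-labelled pixel samples and
# classify every remaining pixel by its spectral values.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
image = rng.random((100, 100, 3))            # stand-in for an RGB satellite tile

# Hypothetical training samples collected through the user interface:
# (row, col) coordinates and their class labels (0 = water, 1 = vegetation)
train_rc = [(5, 5), (10, 80), (50, 50), (90, 20)]
train_labels = [0, 1, 1, 0]
X_train = np.array([image[r, c] for r, c in train_rc])

clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
clf.fit(X_train, train_labels)

classified = clf.predict(image.reshape(-1, 3)).reshape(100, 100)
```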
Faust, Kevin; Xie, Quin; Han, Dominick; Goyle, Kartikay; Volynskaya, Zoya; Djuric, Ugljesa; Diamandis, Phedias
2018-05-16
There is growing interest in utilizing artificial intelligence, and particularly deep learning, for computer vision in histopathology. While accumulating studies highlight expert-level performance of convolutional neural networks (CNNs) on focused classification tasks, most studies rely on probability distribution scores with empirically defined cutoff values based on post-hoc analysis. More generalizable tools that allow humans to visualize histology-based deep learning inferences and decision making are scarce. Here, we leverage t-distributed Stochastic Neighbor Embedding (t-SNE) to reduce dimensionality and depict how CNNs organize histomorphologic information. Unique to our workflow, we develop a quantitative and transparent approach to visualizing classification decisions prior to softmax compression. By discretizing the relationships between classes on the t-SNE plot, we show we can superimpose randomly sampled regions of test images and use their distribution to render statistically driven classifications. Therefore, in addition to providing intuitive outputs for human review, this visual approach can carry out automated and objective multi-class classifications similar to more traditional and less transparent categorical probability distribution scores. Importantly, this novel classification approach is driven by a priori statistically defined cutoffs. It therefore serves as a generalizable classification and anomaly detection tool less reliant on post-hoc tuning. Routine incorporation of this convenient approach for quantitative visualization and error reduction in histopathology aims to accelerate early adoption of CNNs into generalized real-world applications where unanticipated and previously untrained classes are often encountered.
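A minimal sketch of the dimensionality-reduction step, assuming the CNN's penultimate-layer activations are available as a feature matrix; the features and labels here are synthetic stand-ins.

```python
# Minimal sketch: project CNN feature vectors (taken before the softmax layer)
# with t-SNE to inspect how the network organizes classes.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(1)
features = rng.normal(size=(300, 512))   # e.g., penultimate-layer activations
labels = rng.integers(0, 3, size=300)    # hypothetical tissue classes

embedding = TSNE(n_components=2, perplexity=30, random_state=1).fit_transform(features)
# `embedding` is an (n_samples, 2) array; plotting it colored by `labels`
# shows how the network separates histomorphologic classes.
```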
Description of Updates for MCCEM Version 1.2 (February 2001)
The EPA Office of Pollution Prevention and Toxics, Economics, Exposure, and Technology Division has developed several exposure assessment tools and models. A description of the models and tools and the definition of exposure are given in separate web page
NASA Astrophysics Data System (ADS)
Rana, Narender; Chien, Chester
2018-03-01
A key sensor element in a hard disk drive (HDD) is the read-write head device. The device has a complex 3D shape, and its fabrication requires over a thousand process steps, many of which are various types of image inspection and critical dimension (CD) metrology steps. In order to achieve a high yield of devices across a wafer, very tight inspection and metrology specifications are implemented. Many images are collected on a wafer and inspected for various types of defects, and in CD metrology the quality of the image impacts the CD measurements. Metrology noise needs to be minimized in CD metrology to obtain a better estimate of process-related variations and to implement robust process controls. Specialized tools are available for defect inspection and review that allow classification and statistics; however, when such advanced tools are unavailable, or for other reasons, images often need to be inspected manually. SEM image inspection and CD-SEM metrology tools are separate tools, differing in software and purpose. There have been cases where a significant number of CD-SEM images are blurred or contain some artefact, so image inspection is needed along with the CD measurement. The tool may not report a practical metric highlighting the quality of the image, and not filtering out CDs measured from blurred images adds metrology noise to the CD measurement. An image classifier can be helpful here for filtering such data. This paper presents the use of artificial intelligence in classifying SEM images. Deep machine learning is used to train a neural network, which is then used to classify new images as blurred or not blurred. Figure 1 shows the image blur artefact and the contingency table of classification results from the trained deep neural network. A prediction accuracy of 94.9% was achieved with the first model. The paper covers other applications of deep neural networks in image classification for inspection, review, and metrology.
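A minimal PyTorch sketch of a binary blurred/not-blurred image classifier in the spirit of the approach described; the architecture, input size, and data are illustrative assumptions, not the authors' network.

```python
# Minimal sketch: a small CNN that classifies grayscale image crops as
# sharp (0) or blurred (1). Input size and layers are assumptions.
import torch
import torch.nn as nn

class BlurClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 2),   # assumes 64x64 grayscale inputs
        )

    def forward(self, x):
        return self.net(x)

model = BlurClassifier()
images = torch.randn(8, 1, 64, 64)   # a hypothetical batch of SEM crops
labels = torch.randint(0, 2, (8,))   # 0 = sharp, 1 = blurred
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()                       # compute gradients (optimizer step omitted)
```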
Diagnosis of streamflow prediction skills in Oregon using Hydrologic Landscape Classification
A complete understanding of why rainfall-runoff models provide good streamflow predictions at catchments in some regions, but fail to do so in other regions, has still not been achieved. Here, we argue that a hydrologic classification system is a robust conceptual tool that is w...
Where and why do models fail? Perspectives from Oregon Hydrologic Landscape classification
A complete understanding of why rainfall-runoff models provide good streamflow predictions at catchments in some regions, but fail to do so in other regions, has still not been achieved. Here, we argue that a hydrologic classification system is a robust conceptual tool that is w...
Use of Classification Agreement Analyses to Evaluate RTI Implementation
ERIC Educational Resources Information Center
VanDerHeyden, Amanda
2010-01-01
RTI as a framework for decision making has implications for the diagnosis of specific learning disabilities. Any diagnostic tool must meet certain standards to demonstrate that its use leads to predictable decisions with minimal risk. Classification agreement analyses are described as optimal for demonstrating the technical adequacy of RTI…
Conceptual Change through Changing the Process of Comparison
ERIC Educational Resources Information Center
Wasmann-Frahm, Astrid
2009-01-01
Classification can serve as a tool for conceptualising ideas about vertebrates. Training enhances classification skills as well as sharpening concepts. The method described in this paper is based on the "hybrid-model" of comparison that proposes two independently working processes: associative and theory-based. The two interact during a…
Model Bloodborne Pathogens: Exposure Control Plan for Wisconsin Public Schools. Bulletin No. 93311.
ERIC Educational Resources Information Center
Wisconsin State Dept. of Public Instruction, Madison.
This document is intended to assist local school districts in complying with the Wisconsin Department of Industry, Labor and Human Relations (DILHR) Health and Safety Standard. Following an overview of the plan, the guide is organized into six chapters: (1) "Exposure Determination" discusses job classifications, tasks, and procedures;…
Analyses of rear-end crashes based on classification tree models.
Yan, Xuedong; Radwan, Essam
2006-09-01
Signalized intersections are accident-prone areas, especially for rear-end crashes, because the diversity of drivers' braking behaviors increases during the signal change. The objective of this article is to improve knowledge of the relationship between rear-end crashes occurring at signalized intersections and a series of potential traffic risk factors classified by driver characteristics, environments, and vehicle types. Based on the 2001 Florida crash database, the classification tree method and the quasi-induced exposure concept were used to perform the statistical analysis. Two binary classification tree models were developed in this study. One was used for the comparison between rear-end and non-rear-end crashes to identify specific trends of the rear-end crashes. The other was constructed for the comparison between striking vehicles/drivers (at fault) and struck vehicles/drivers (not at fault) to find more complex crash patterns associated with the traffic attributes of driver, vehicle, and environment. The modeling results showed that rear-end crashes are over-represented at higher speed limits (45-55 mph); that the rear-end crash propensity for daytime is considerably larger than for nighttime; and that the reduction of braking capacity due to wet and slippery road surface conditions contributes to rear-end crashes, especially at intersections with higher speed limits. The tree model segmented drivers into four homogeneous age groups: <21 years, 21-31 years, 32-75 years, and >75 years. The youngest driver group shows the largest crash propensity; in the 21-31 age group, male drivers are over-involved in rear-end crashes under adverse weather conditions; and drivers aged 32-75 years driving large vehicles have a larger crash propensity than those driving passenger vehicles. Combined with the quasi-induced exposure concept, the classification tree method is a proper statistical tool for traffic-safety analysis of crash propensity. Compared to logistic regression models, tree models have advantages in handling continuous independent variables and in explaining complex interaction effects among more than two independent variables. This research recommends that, at signalized intersections with higher speed limits, reducing the speed limit to 40 mph could contribute to a lower accident rate. Alcohol use may increase not only rear-end crash risk but also driver injury severity. Education and enforcement countermeasures should focus on the driver group younger than 21 years. Further studies are suggested to compare crash risk distributions of driver age for other main crash types and to seek corresponding traffic countermeasures.
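A minimal sketch of a binary classification tree on crash records, using scikit-learn; the features, encoding, and synthetic data are illustrative, not the Florida database.

```python
# Minimal sketch: a classification tree separating rear-end from non-rear-end
# crashes using a few categorical risk factors. Data are synthetic stand-ins.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(2)
# columns: speed limit (mph), daytime (1/0), wet surface (1/0), driver age
X = np.column_stack([
    rng.choice([30, 40, 45, 55], 500),
    rng.integers(0, 2, 500),
    rng.integers(0, 2, 500),
    rng.integers(16, 90, 500),
])
y = rng.integers(0, 2, 500)  # 1 = rear-end crash, 0 = other crash type

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["speed_limit", "daytime", "wet", "age"]))
```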
Rivals, Florent; Prignano, Luce; Semprebon, Gina M.; Lozano, Sergi
2015-01-01
The seasonality of human occupations in archaeological sites is highly significant for the study of hominin behavioural ecology, in particular the hunting strategies for their main prey, ungulates. We propose a new tool to quantify such seasonality from tooth microwear patterns in a dataset of ten large samples of extant ungulates resulting from well-known mass mortality events. The tool is based on the combination of two measures of variability of scratch density, namely the standard deviation and the coefficient of variation. The integration of these two measurements of variability permits the classification of each case into one of the following three categories: (1) short events, (2) long-continued events, and (3) two separated short events. The tool is tested on a selection of eleven fossil samples from five Palaeolithic localities in Western Europe, which show a consistent classification into the three categories. The tool proposed here opens new doors to investigating seasonal patterns of ungulate accumulations in archaeological sites using non-destructive sampling. PMID:26616864
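A minimal sketch of the two-measure rule described above; the cutoff values and the mapping of SD/CV combinations to the three categories are hypothetical placeholders for the thresholds derived in the paper.

```python
# Minimal sketch: classify an ungulate accumulation by combining the standard
# deviation (SD) and coefficient of variation (CV) of scratch density.
# Cutoffs and the quadrant-to-category mapping are illustrative assumptions.
import numpy as np

def classify_event(scratch_counts, sd_cut=4.0, cv_cut=0.25):
    sd = np.std(scratch_counts, ddof=1)
    cv = sd / np.mean(scratch_counts)
    if sd < sd_cut and cv < cv_cut:
        return "short event"
    if sd >= sd_cut and cv < cv_cut:
        return "long-continued event"
    return "two separated short events"

sample = [14, 16, 15, 22, 23, 21, 15, 22]   # hypothetical scratch densities
print(classify_event(sample))
```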
NASA Astrophysics Data System (ADS)
Prasetyo, T.; Amar, S.; Arendra, A.; Zam Zami, M. K.
2018-01-01
This study develops an on-line detection system to predict the wear of a DCMT070204 tool tip during cutting of the workpiece. The machine used in this research is a CNC ProTurn 9000 cutting ST42 steel cylinders. The audio signal was captured using a microphone placed on the tool post and recorded in Matlab. The signal was recorded at a sampling rate of 44.1 kHz with a frame size of 1024 samples. The recorded dataset comprises 110 samples derived from the audio signal while cutting with a normal chisel and with a worn chisel. Signal features were then extracted in the frequency domain using the Fast Fourier Transform, and feature selection was performed based on correlation analysis. Tool wear classification was carried out using an artificial neural network with the 33 selected input features, trained with the backpropagation method. Classification performance testing yields an accuracy of 74%.
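A minimal sketch of the frequency-domain feature extraction and correlation-based selection described above; the signals and labels are synthetic stand-ins.

```python
# Minimal sketch: FFT magnitudes of 1024-sample audio frames, followed by a
# simple correlation-based ranking to pick 33 features for a classifier.
import numpy as np

fs, frame = 44_100, 1024
rng = np.random.default_rng(3)
signals = rng.normal(size=(110, frame))   # 110 recorded cutting frames (synthetic)
labels = rng.integers(0, 2, 110)          # 0 = normal tool, 1 = worn tool

spectra = np.abs(np.fft.rfft(signals, axis=1))  # magnitude spectrum per frame

# Rank each spectral bin by absolute correlation with the wear label
corrs = np.array([abs(np.corrcoef(spectra[:, i], labels)[0, 1])
                  for i in range(spectra.shape[1])])
top33 = np.argsort(corrs)[::-1][:33]            # the paper selects 33 features
features = spectra[:, top33]                    # input to the neural network
```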
Hybrid approach for robust diagnostics of cutting tools
NASA Astrophysics Data System (ADS)
Ramamurthi, K.; Hough, C. L., Jr.
1994-03-01
A new multisensor based hybrid technique has been developed for robust diagnosis of cutting tools. The technique combines the concepts of pattern classification and real-time knowledge based systems (RTKBS) and draws upon their strengths; learning facility in the case of pattern classification and a higher level of reasoning in the case of RTKBS. It eliminates some of their major drawbacks: false alarms or delayed/lack of diagnosis in case of pattern classification and tedious knowledge base generation in case of RTKBS. It utilizes a dynamic distance classifier, developed upon a new separability criterion and a new definition of robust diagnosis for achieving these benefits. The promise of this technique has been proven concretely through an on-line diagnosis of drill wear. Its suitability for practical implementation is substantiated by the use of practical, inexpensive, machine-mounted sensors and low-cost delivery systems.
NASA Astrophysics Data System (ADS)
Wu, Shulian; Peng, Yuanyuan; Hu, Liangjun; Zhang, Xiaoman; Li, Hui
2016-01-01
Second harmonic generation microscopy (SHGM) was used to monitor the process of chronological skin aging in vivo. Collagen structures of mice of different ages were imaged using SHGM. Texture features (contrast, correlation, and entropy) were then extracted and analysed using the grey level co-occurrence matrix. Finally, the neural network tool of Matlab was applied to train a classifier on the collagen textures at different states of the aging process, and classification of mouse collagen textures was carried out. The results indicated that the classification accuracy reached 85%. The results demonstrate that the proposed approach effectively detects the target object in collagen texture images during the chronological aging process, and that the neural-network-based classification and feature extraction method is feasible for skin analysis.
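A minimal sketch of GLCM texture extraction using scikit-image (the study used Matlab); entropy is computed manually since it is not a built-in graycoprops property, and the input image is synthetic.

```python
# Minimal sketch: grey-level co-occurrence matrix (GLCM) features
# (contrast, correlation, entropy) from a stand-in collagen image.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(4)
img = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)  # synthetic SHG image

glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)
contrast = graycoprops(glcm, "contrast")[0, 0]
correlation = graycoprops(glcm, "correlation")[0, 0]
p = glcm[:, :, 0, 0]
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))  # not built into graycoprops
print(contrast, correlation, entropy)
```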
Advancing Exposure Characterization for Chemical Evaluation and Risk Assessment
A new generation of scientific tools has emerged to rapidly measure signals from cells, tissues, and organisms following exposure to chemicals. High-visibility efforts to apply these tools for efficient toxicity testing raise important research questions in exposure science. As v...
EPA EcoBox Tools by Receptors - Endangered, Threatened or Other Species of Concern
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
NASA Astrophysics Data System (ADS)
Wood, N. J.; Spielman, S.
2012-12-01
Near-field tsunami hazards are credible threats to many coastal communities throughout the world. Along the U.S. Pacific Northwest coast, low-lying areas could be inundated by a series of catastrophic tsunamis that begin to arrive in a matter of minutes following a major Cascadia subduction zone (CSZ) earthquake. Previous research has documented the residents, employees, tourists at public venues, customers at local businesses, and vulnerable populations at dependent-care facilities that are in CSZ-related tsunami-prone areas of northern California, Oregon, and the open-ocean coast of Washington. Community inventories of demographic attributes and other characteristics of the at-risk population have helped emergency managers to develop preparedness and outreach efforts. Although useful for distinct risk-reduction issues, these data can be difficult to fully appreciate holistically given the large number of community attributes. This presentation summarizes analytical efforts to classify communities with similar characteristics of community exposure to tsunami hazards. This work builds on past State-focused inventories of community exposure to CSZ-related tsunami hazards in northern California, Oregon, and Washington. Attributes used in the classification, or cluster analysis, fall into several categories, including demography of residents, spatial extent of the developed footprint based on mid-resolution land cover data, distribution of the local workforce, and the number and type of public venues, dependent-care facilities, and community-support businesses. As we were unsure of the number of different types of communities, we used an unsupervised-model-based clustering algorithm and a v-fold, cross-validation procedure (v=50) to identify the appropriate number of community types. Ultimately we selected class solutions that provided the appropriate balance between parsimony and model fit. The goal of the exposure classification is to provide emergency managers with a general sense of the types of communities in tsunami hazard zones based on similar exposure characteristics instead of only providing an exhaustive list of attributes for individual communities. This community-exposure classification scheme can then be used to target and prioritize risk-reduction efforts that address common issues across multiple communities, instead of community-specific efforts. Examples include risk-reduction efforts that focus on similar demographic attributes of the at-risk population or on the type of service populations that dominate tsunami-prone areas. The presentation will include a discussion of the utility of proposed place classifications to support regional preparedness and outreach efforts.
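A minimal sketch of unsupervised model-based clustering of community exposure profiles; it selects the number of clusters by BIC as a common stand-in for the v-fold cross-validation procedure used in the study, and the attribute matrix is synthetic.

```python
# Minimal sketch: Gaussian mixtures fit for a range of cluster counts, with
# BIC used to pick the number of community types.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
# rows = communities; columns = standardized exposure attributes (residents,
# employees, public venues, developed footprint, ...) -- hypothetical values
X = rng.normal(size=(200, 6))

bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(1, 10)}
best_k = min(bics, key=bics.get)
clusters = GaussianMixture(n_components=best_k, random_state=0).fit_predict(X)
```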
Lo, Benjamin W Y; Fukuda, Hitoshi; Angle, Mark; Teitelbaum, Jeanne; Macdonald, R Loch; Farrokhyar, Forough; Thabane, Lehana; Levine, Mitchell A H
2016-01-01
Classification and regression tree analysis involves the creation of a decision tree by recursive partitioning of a dataset into more homogeneous subgroups. Thus far, there is scarce literature on using this technique to create clinical prediction tools for aneurysmal subarachnoid hemorrhage (SAH). The classification and regression tree analysis technique was applied to the multicenter Tirilazad database (3551 patients) in order to create the decision-making algorithm. In order to elucidate prognostic subgroups in aneurysmal SAH, neurologic, systemic, and demographic factors were taken into account. The dependent variable used for analysis was the dichotomized Glasgow Outcome Score at 3 months. Classification and regression tree analysis revealed seven prognostic subgroups. Neurological grade, occurrence of post-admission stroke, occurrence of post-admission fever, and age represented the explanatory nodes of this decision tree. Split sample validation revealed classification accuracy of 79% for the training dataset and 77% for the testing dataset. In addition, the occurrence of fever at 1-week post-aneurysmal SAH is associated with increased odds of post-admission stroke (odds ratio: 1.83, 95% confidence interval: 1.56-2.45, P < 0.01). A clinically useful classification tree was generated, which serves as a prediction tool to guide bedside prognostication and clinical treatment decision making. This prognostic decision-making algorithm also shed light on the complex interactions between a number of risk factors in determining outcome after aneurysmal SAH.
Hoppe, Christian; Obermeier, Patrick; Muehlhans, Susann; Alchikh, Maren; Seeber, Lea; Tief, Franziska; Karsch, Katharina; Chen, Xi; Boettcher, Sindy; Diedrich, Sabine; Conrad, Tim; Kisler, Bron; Rath, Barbara
2016-10-01
Regulatory authorities often receive poorly structured safety reports requiring considerable effort to investigate potential adverse events post hoc. Automated question-and-answer systems may help to improve the overall quality of safety information transmitted to pharmacovigilance agencies. This paper explores the use of the VACC-Tool (ViVI Automated Case Classification Tool) 2.0, a mobile application enabling physicians to classify clinical cases according to 14 pre-defined case definitions for neuroinflammatory adverse events (NIAE) and in full compliance with data standards issued by the Clinical Data Interchange Standards Consortium. The validation of the VACC-Tool 2.0 (beta-version) was conducted in the context of a unique quality management program for children with suspected NIAE in collaboration with the Robert Koch Institute in Berlin, Germany. The VACC-Tool was used for instant case classification and for longitudinal follow-up throughout the course of hospitalization. Results were compared to International Classification of Diseases, Tenth Revision (ICD-10) codes assigned in the emergency department (ED). From 07/2013 to 10/2014, a total of 34,368 patients were seen in the ED, and 5243 patients were hospitalized; 243 of these were admitted for suspected NIAE (mean age: 8.5 years), thus participating in the quality management program. Using the VACC-Tool in the ED, 209 cases were classified successfully, 69% of which had been missed or miscoded in the ED reports. Longitudinal follow-up with the VACC-Tool identified additional NIAE. Mobile applications are taking data standards to the point of care, enabling clinicians to ascertain potential adverse events in the ED setting and during inpatient follow-up. Compliance with Clinical Data Interchange Standards Consortium (CDISC) data standards facilitates data interoperability according to regulatory requirements.
Krause, Fabian G; Di Silvestro, Matthew; Penner, Murray J; Wing, Kevin J; Glazebrook, Mark A; Daniels, Timothy R; Lau, Johnny T C; Younger, Alastair S E
2012-02-01
End-stage ankle arthritis is operatively treated with numerous designs of total ankle replacement and different techniques for ankle fusion. For superior comparison of these procedures, outcome research requires a classification system to stratify patients appropriately. A postoperative 4-type classification system was designed by 6 fellowship-trained foot and ankle surgeons. Four surgeons reviewed blinded patient profiles and radiographs on 2 occasions to determine the interobserver and intraobserver reliability of the classification. Excellent interobserver reliability (κ = .89) and intraobserver reproducibility (κ = .87) were demonstrated for the postoperative classification system. In conclusion, the postoperative Canadian Orthopaedic Foot and Ankle Society (COFAS) end-stage ankle arthritis classification system appears to be a valid tool to evaluate the outcome of patients operated for end-stage ankle arthritis.
Kanaan, Mona; Gilbody, Simon; Hanratty, Barbara
2016-01-01
Objectives We present a novel way of classifying and comparing measures of social relationships to help readers interpret the growing literature on loneliness and social isolation and to provide researchers with a starting point to guide their choice of measuring tool. Methods Measures of social relationships used in epidemiological studies were identified from two systematic reviews—one review on the association between social relationships and health and social care service use, and a second review on the association between social relationships and health. Questions from each measure were retrieved and tabulated to derive a classification of social relationship measures. Results We present a classification of measures according to two dimensions: (1) whether instruments cover structural or functional aspects of social relationships and (2) the degree of subjectivity asked of respondents. We explain how this classification can be used to clarify the remit of the many questionnaires used in the literature and to compare them. Conclusions Different dimensions of social relationships are likely to have different implications for health. Our classification of social relationship measures transcends disciplinary and conceptual boundaries, allowing researchers to compare tools that developed from different theoretical perspectives. Careful choice of measures is essential to further our understanding of the links between social relationships and health, to identify people in need of help and to design appropriate prevention and intervention strategies. PMID:27091822
Introduction to the Community-Focused Exposure and Risk Screening Tool (C-FERST)
EPA scientists are working partners to design and test the Community-Focused Exposure and Risk Screening Tool (C-FERST), a community mapping, information access, and assessment tool to help assess risk and assist in decision making with communities
An Upper Bound for Population Exposure Variability (SOT)
Tools for the rapid assessment of exposure potential are needed in order to put the results of rapidly-applied tools for assessing biological activity, such as ToxCast® and other high throughput methodologies, into a quantitative exposure context. The ExpoCast models (Wambaugh et...
De Matteis, Sara; Jarvis, Deborah; Young, Heather; Young, Alan; Allen, Naomi; Potts, James; Darnton, Andrew; Rushton, Lesley; Cullinan, Paul
2017-03-01
Objectives The standard approach to the assessment of occupational exposures is through the manual collection and coding of job histories. This method is time-consuming and costly and makes it potentially unfeasible to perform high quality analyses on occupational exposures in large population-based studies. Our aim was to develop a novel, efficient web-based tool to collect and code lifetime job histories in the UK Biobank, a population-based cohort of over 500 000 participants. Methods We developed OSCAR (occupations self-coding automatic recording) based on the hierarchical structure of the UK Standard Occupational Classification (SOC) 2000, which allows individuals to collect and automatically code their lifetime job histories via a simple decision-tree model. Participants were asked to find each of their jobs by selecting appropriate job categories until they identified their job title, which was linked to a hidden 4-digit SOC code. For each occupation a job title in free text was also collected to estimate Cohen's kappa (κ) inter-rater agreement between SOC codes assigned by OSCAR and an expert manual coder. Results OSCAR was administered to 324 653 UK Biobank participants with an existing email address between June and September 2015. Complete 4-digit SOC-coded lifetime job histories were collected for 108 784 participants (response rate: 34%). Agreement between the 4-digit SOC codes assigned by OSCAR and the manual coder for a random sample of 400 job titles was moderately good [κ=0.45, 95% confidence interval (95% CI) 0.42-0.49], and improved when broader job categories were considered (κ=0.64, 95% CI 0.61-0.69 at a 1-digit SOC-code level). Conclusions OSCAR is a novel, efficient, and reasonably reliable web-based tool for collecting and automatically coding lifetime job histories in large population-based studies. Further application in other research projects for external validation purposes is warranted.
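A minimal sketch of the agreement analysis, computing Cohen's kappa at the 4-digit and 1-digit SOC levels; the code pairs are hypothetical examples.

```python
# Minimal sketch: inter-rater agreement between OSCAR-assigned and
# expert-assigned SOC codes, at full and truncated code levels.
from sklearn.metrics import cohen_kappa_score

oscar_codes = ["2315", "5319", "9233", "2315", "3537", "5319"]   # hypothetical
expert_codes = ["2315", "5312", "9233", "2317", "3537", "5319"]  # hypothetical

kappa_4digit = cohen_kappa_score(oscar_codes, expert_codes)
kappa_1digit = cohen_kappa_score([c[0] for c in oscar_codes],
                                 [c[0] for c in expert_codes])
print(kappa_4digit, kappa_1digit)  # agreement improves at broader levels
```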
Design and acceptability of the Aviation Laser Exposure Self-Assessment (ALESA).
Waggel, Stephanie E; Hutchison, Ewan J
2013-03-01
There is an increase in reports of aircraft exposed to lasers directed from the ground; approximately 2300 cases were reported to the United Kingdom Civil Aviation Authority (UK CAA) in 2011. While the likelihood of injuries directly resulting from such exposure is currently low, this may increase with the increasing power of lasers. A project was undertaken to develop a rapid self-assessment tool to assist pilots in determining whether permanent injury has occurred after eye exposure to a laser and whether professional assessment should be sought. Laser incidents reported to the UK CAA were analyzed. With the aid of expert advice, the Aviation Laser Exposure Self-Assessment (ALESA) tool was produced using key aspects of illumination needed to determine the risk of harm. Twenty-five pilots and flight crewmembers who had experienced aviation laser exposure were given the tool, and their responses were assessed with a questionnaire. There was a favorable response to five of the six measured aspects of the ALESA. The ALESA is considered a useful tool for pilots following laser exposure during flight. The UK CAA is making the tool available through its website.
Heat stress disorders and headache: a case of new daily persistent headache secondary to heat stroke
Di Lorenzo, C; Ambrosini, A; Coppola, G; Pierelli, F
2009-01-01
Headache is considered as a common symptom of heat stress disorders (HSD), but no forms of secondary headache from heat exposure are reported in the International Classification of Headache Disorders-2 Edition (ICHD-II). Heat-stroke (HS) is the HSD most severe condition, it may be divided into two forms: classic (due to a long period environmental heat exposure) and exertional (a severe condition caused by strenuous physical exercises in heat environmental conditions). Here we report the case of a patient who developed a headache clinical picture fulfilling the diagnostic criteria for new daily persistent headache (NDPH), after an exertional HS, and discuss about possible pathophysiological mechanisms and classification aspects of headache induced by heat conditions. PMID:21686677
Perspectives on the causes of childhood leukemia.
Wiemels, Joseph
2012-04-05
Acute leukemia is the most common cancer in children but the causes of the disease in the majority of cases are not known. About 80% are precursor-B cell in origin (CD19+, CD10+), and this immunophenotype has increased in incidence over the past several decades in the Western world. Part of this increase may be due to the introduction of new chemical exposures into the child's environment including parental smoking, pesticides, traffic fumes, paint and household chemicals. However, much of the increase in leukemia rates is likely linked to altered patterns of infection during early childhood development, mirroring causal pathways responsible for a similarly increased incidence of other childhood-diagnosed immune-related illnesses including allergy, asthma, and type 1 diabetes. Factors linked to childhood leukemia that are likely surrogates for immune stimulation include exposure to childcare settings, parity status and birth order, vaccination history, and population mixing. In case-control studies, acute lymphoblastic leukemia (ALL) is consistently inversely associated with greater exposure to infections, via daycare and later birth order. New evidence suggests also that children who contract leukemia may harbor a congenital defect in immune responder status, as indicated by lower levels of the immunosuppressive cytokine IL-10 at birth in children who grow up to contract leukemia, as well as higher need for clinical care for infections within the first year of life despite having lower levels of exposure to infections. One manifestation of this phenomenon may be leukemia clusters which tend to appear as a leukemia "outbreak" among populations with low herd immunity to a new infection. Critical answers to the etiology of childhood leukemia will require incorporating new tools into traditional epidemiologic approaches - including the classification of leukemia at a molecular scale, better exposure assessments at all points in a child's life, a comprehensive understanding of genetic risk factors, and an appraisal of the interplay between infectious exposures and the status of immune response in individuals. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Habitat classification modelling with incomplete data: Pushing the habitat envelope
Phoebe L. Zarnetske; Thomas C. Edwards; Gretchen G. Moisen
2007-01-01
Habitat classification models (HCMs) are invaluable tools for species conservation, land-use planning, reserve design, and metapopulation assessments, particularly at broad spatial scales. However, species occurrence data are often lacking and typically limited to presence points at broad scales. This lack of absence data precludes the use of many statistical...
Use of DAVID algorithms for gene functional classification in a non-model organism, rainbow trout
USDA-ARS?s Scientific Manuscript database
Gene functional clustering is essential in transcriptome data analysis but software programs are not always suitable for use with non-model species. The DAVID Gene Functional Classification Tool has been widely used for soft clustering in model species, but requires adaptations for use in non-model ...
A job-exposure matrix for use in population based studies in England and Wales.
Pannett, B; Coggon, D; Acheson, E D
1985-01-01
The job-exposure matrix described has been developed for use in population based studies of occupational morbidity and mortality in England and Wales. The job axis of the matrix is based on the Registrar General's 1966 classification of occupations and 1968 classification of industries, and comprises 669 job categories. The exposure axis is made up of 49 chemical, physical, and biological agents, most of which are known or suspected causes of occupational disease. In the body of the matrix, associations between jobs and exposures are graded to four levels. The matrix has been applied to data from a case-control study of lung cancer in which occupational histories were elicited by means of a postal questionnaire. Estimates of exposure to five known or suspected carcinogens (asbestos, chromates, cutting oils, formaldehyde, and inhaled polycyclic aromatic hydrocarbons) were compared with those obtained by detailed review of individual occupational histories. When the matrix was used, exposures were attributed to jobs more frequently than on the basis of individual histories. Lung cancer was significantly more common among subjects classed by the matrix as having potential exposure to chromates, but neither method of assigning exposures produced statistically significant associations with asbestos or polycyclic aromatic hydrocarbons. Possible explanations for the failure to show a clear effect of these known carcinogens are discussed. The greater accuracy of exposures inferred directly from individual histories was reflected in steeper dose response curves for asbestos, chromates, and polycyclic aromatic hydrocarbons. The improvement over results obtained with the matrix, however, was not great. For occupational data of the type examined in this study, direct exposure estimates offer little advantage over those provided at lower cost by a matrix. PMID:4063222
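A minimal sketch of how a job-exposure matrix is applied to subjects' job histories; the job categories, agents, and grades are hypothetical, not the published matrix.

```python
# Minimal sketch: look up each subject's job code in a job-exposure matrix
# and read off graded exposures for selected agents.
import pandas as pd

# rows = job categories (job axis), columns = agents (exposure axis),
# values = exposure grade 0-3 (all hypothetical)
jem = pd.DataFrame(
    {"asbestos": [3, 0, 1], "chromates": [0, 2, 0], "cutting_oils": [1, 0, 3]},
    index=["laggers", "platers", "machinists"],
)

subjects = pd.DataFrame({"id": [1, 2, 3], "job": ["laggers", "machinists", "platers"]})
exposures = subjects.join(jem, on="job")   # attribute exposures from job history
print(exposures)
```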
Classifier dependent feature preprocessing methods
NASA Astrophysics Data System (ADS)
Rodriguez, Benjamin M., II; Peterson, Gilbert L.
2008-04-01
In mobile applications, computational complexity is an issue that prevents sophisticated algorithms from being implemented on these devices. This paper provides an initial solution to applying pattern recognition systems on mobile devices by combining existing preprocessing algorithms for recognition. In pattern recognition systems, it is essential to properly apply feature preprocessing tools prior to training classification models, in an attempt to reduce computational complexity and improve the overall classification accuracy. The feature preprocessing tools extended for the mobile environment are feature ranking, feature extraction, data preparation, and outlier removal. Most desktop systems today are capable of running the majority of available classification algorithms without concern for processing time, but the same is not true of mobile platforms. As an application of pattern recognition for mobile devices, the recognition system targets the problem of steganalysis: determining whether an image contains hidden information. The measure of performance shows that feature preprocessing increases the overall steganalysis classification accuracy by an average of 22%. The methods in this paper are tested on a workstation and a Nokia 6620 (Symbian operating system) camera phone, with similar results.
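A minimal sketch of a preprocessing chain of the kind described (outlier removal, data preparation, feature ranking) using scikit-learn; the dataset and parameter choices are hypothetical.

```python
# Minimal sketch: outlier removal, scaling, and filter-based feature selection
# before training a lightweight classifier.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
X, y = rng.normal(size=(400, 60)), rng.integers(0, 2, 400)

keep = IsolationForest(random_state=0).fit_predict(X) == 1  # drop outliers
X, y = X[keep], y[keep]
X = StandardScaler().fit_transform(X)                       # data preparation
X = SelectKBest(f_classif, k=20).fit_transform(X, y)        # feature ranking
# X is now a reduced, cleaned feature matrix suited to a low-complexity model
```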
Gao, Xiang; Lin, Huaiying; Revanna, Kashi; Dong, Qunfeng
2017-05-10
Species-level classification for 16S rRNA gene sequences remains a serious challenge for microbiome researchers, because existing taxonomic classification tools for 16S rRNA gene sequences either do not provide species-level classification, or their classification results are unreliable. The unreliable results are due to limitations in the existing methods, which either lack solid probabilistic criteria to evaluate the confidence of their taxonomic assignments, or use nucleotide k-mer frequency as the proxy for sequence similarity measurement. We have developed a method that shows significantly improved species-level classification results over existing methods. Our method calculates true sequence similarity between query sequences and database hits using pairwise sequence alignment. Taxonomic classifications are assigned from the species to the phylum levels based on the lowest common ancestors of multiple database hits for each query sequence, and further classification reliabilities are evaluated by bootstrap confidence scores. The novelty of our method is that the contribution of each database hit to the taxonomic assignment of the query sequence is weighted by a Bayesian posterior probability based upon the degree of sequence similarity of the database hit to the query sequence. Our method does not need any training datasets specific for different taxonomic groups. Instead only a reference database is required for aligning to the query sequences, making our method easily applicable for different regions of the 16S rRNA gene or other phylogenetic marker genes. Reliable species-level classification for 16S rRNA or other phylogenetic marker genes is critical for microbiome research. Our software shows significantly higher classification accuracy than the existing tools, and we provide probabilistic confidence scores to evaluate the reliability of our taxonomic classification assignments based on multiple database matches to query sequences. Despite its higher computational costs, our method is still suitable for analyzing large-scale microbiome datasets for practical purposes. Furthermore, our method can be applied for taxonomic classification of any phylogenetic marker gene sequences. Our software, called BLCA, is freely available at https://github.com/qunfengdong/BLCA .
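A heavily simplified sketch of the similarity-weighted, multi-hit taxonomic assignment idea; it is an illustration under assumed inputs, not the BLCA algorithm itself, which uses pairwise alignment and bootstrap confidence scores.

```python
# Minimal sketch: each database hit votes for its lineage, weighted by its
# similarity to the query; the taxonomy is assigned level by level.
from collections import defaultdict

hits = [  # (similarity to query, lineage from phylum to species) -- hypothetical
    (0.99, ("Firmicutes", "Bacilli", "Lactobacillus", "L. gasseri")),
    (0.97, ("Firmicutes", "Bacilli", "Lactobacillus", "L. johnsonii")),
    (0.90, ("Firmicutes", "Bacilli", "Streptococcus", "S. mitis")),
]

assignment = []
for level in range(4):
    votes = defaultdict(float)
    for sim, lineage in hits:
        votes[lineage[level]] += sim          # similarity-weighted vote
    taxon, weight = max(votes.items(), key=lambda kv: kv[1])
    confidence = weight / sum(votes.values())
    assignment.append((taxon, round(confidence, 2)))
print(assignment)
```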
NASA Astrophysics Data System (ADS)
Ohnuma, Hidetoshi; Kawahira, Hiroichi
1998-09-01
An automatic alternating phase shift mask (PSM) pattern layout tool has been newly developed. The tool is dedicated to embedded DRAM in logic devices, shrinking gate line width while improving line width controllability in lithography processes with design rules below 0.18 micrometers using KrF excimer laser exposure. The tool can create Levenson-type PSMs to be used coupled with a binary mask in a double exposure method for positive photoresist. Using graphs, the tool automatically creates alternating PSM patterns, and it does not give rise to any phase conflicts. By applying it to actual embedded DRAM in logic cells, we have produced 0.16 micrometer gate resist patterns in both random logic and DRAM areas. The patterns were fabricated using two masks with the double exposure method. Gate line width has been well controlled within a practical exposure-focus window.
Hormonal Regulation of Fluid and Electrolytes: Effects of Heat Exposure and Exercise in the Heat,
1988-02-01
Report record from the U.S. Army Research Institute of Environmental Medicine, Natick, MA (accession AD-A192 655). Only OCR fragments of the record survive, including one recoverable cited reference: F.N. Craig, Effect of potassium depletion on response to acute heat exposure in unacclimatized man, Am. J. Physiol. 211:117-124, 1966.
Neumann, H G; Vamvakas, S; Thielmann, H W; Gelbke, H P; Filser, J G; Reuter, U; Greim, H; Kappus, H; Norpoth, K H; Wardenbach, P; Wichmann, H E
1998-11-01
Carcinogenic chemicals in the work area are currently classified into three categories in section III of the German List of MAK and BAT Values (list of values on maximum workplace concentrations and biological tolerance for occupational exposures). This classification is based on qualitative criteria and reflects essentially the weight of evidence available for judging the carcinogenic potential of the chemicals. It is proposed that these categories - IIIA1, IIIA2, IIIB - be retained as Categories 1, 2, and 3, to correspond with European Union regulations. On the basis of our advancing knowledge of reaction mechanisms and the potency of carcinogens, these three categories are supplemented with two additional categories. The essential feature of substances classified in the new categories is that exposure to these chemicals does not contribute significantly to risk of cancer to man, provided that an appropriate exposure limit (MAK value) is observed. Chemicals known to act typically by nongenotoxic mechanisms and for which information is available that allows evaluation of the effects of low-dose exposures, are classified in Category 4. Genotoxic chemicals for which low carcinogenic potency can be expected on the basis of dose-response relationships and toxicokinetics, and for which risk at low doses can be assessed are classified in Category 5. The basis for a better differentiation of carcinogens is discussed, the new categories are defined, and possible criteria for classification are described. Examples for Category 4 (1,4-dioxane) and Category 5 (styrene) are presented.
Thomas, Kai; Resseler, Herbert; Spatz, Robert; Hendley, Paul; Sweeney, Paul; Urban, Martin; Kubiak, Roland
2016-11-01
The objective was to refine the standard regulatory exposure scenario used in plant protection product authorisations by developing a more realistic landscape-related GIS-based exposure assessment for terrestrial non-target arthropods. We quantified the proportion of adjacent off-target area in agricultural landscapes potentially exposed to insecticide drift from applications of the active substance fenoxycarb. High-resolution imagery, landscape classification and subsequent stepwise analysis of a whole landscape using drift and interception functions were applied to selected areas in representative fruit-producing regions in Germany. Even under worst-case assumptions regarding treated area, use rate and drift, less than 12% of the non-agricultural habitat area would potentially be exposed to fenoxycarb drift above regulatory acceptable concentrations. Additionally, if the filtering effect of tall vegetation were taken into account, this number would decrease to 6.6%. Further refinements to landscape elements and application conditions indicate that less than 5% of the habitat area might be exposed above regulatory acceptable concentrations, meaning that 95% of the non-agricultural habitat area will be unimpacted (i.e. no unacceptable effects) and can serve as refuge for recolonisation. Approaches and tools are proposed for standardisable and transparent refinements in regulatory risk assessments on the landscape level. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
Required risk mitigation measures for soil fumigants protect handlers, applicators, and bystanders from pesticide exposure. Measures include buffer zones, sign posting, good agricultural practices, restricted use pesticide classification, and FMPs.
A Data-Driven Framework for Incorporating New Tools for Toxicity, Exposure, and Risk Assessment
This talk was given during the “Exposure-Based Toxicity Testing” session at the annual meeting of the International Society for Exposure Science. It provided an update on the state of the science and tools that may be employed in risk-based prioritization efforts. It ...
DIAGNOSTIC TOOL DEVELOPMENT AND APPLICATION THROUGH REGIONAL CASE STUDIES
Case studies are a useful vehicle for developing and testing conceptual models, classification systems, diagnostic tools and models, and stressor-response relationships. Furthermore, case studies focused on specific places or issues of interest to the Agency provide an excellent ...
Feature selection for the classification of traced neurons.
López-Cabrera, José D; Lorenzo-Ginori, Juan V
2018-06-01
The great availability of computational tools to calculate the properties of traced neurons has produced many descriptors that allow the automated classification of neurons from these reconstructions. This situation creates the need to eliminate irrelevant features and to select the most appropriate among them, in order to improve the quality of the classification obtained. The dataset used contains a total of 318 traced neurons, classified by human experts into 192 GABAergic interneurons and 126 pyramidal cells. The features were extracted by means of the L-measure software, one of the most widely used computational tools in neuroinformatics for quantifying traced neurons. We review current feature selection techniques of the filter, wrapper, embedded, and ensemble types, and measure the stability of the feature selection methods. For the ensemble methods, several aggregation methods based on different metrics were applied to combine the subsets obtained during the feature selection process. The subsets obtained by applying feature selection methods were evaluated using supervised classifiers, with Random Forest, C4.5, SVM, Naïve Bayes, kNN, Decision Table, and logistic classifiers used as the classification algorithms. Feature selection methods of the filter, embedded, wrapper, and ensemble types were compared, and the subsets returned were tested in classification tasks with the different classification algorithms. The L-measure features EucDistanceSD, PathDistanceSD, Branch_pathlengthAve, Branch_pathlengthSD, and EucDistanceAve were present in more than 60% of the selected subsets, which provides evidence of their importance in the classification of these neurons. Copyright © 2018 Elsevier B.V. All rights reserved.
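A minimal sketch of one wrapper-style selection step from the comparison (recursive feature elimination with a Random Forest, evaluated by cross-validation); the feature matrix is a synthetic stand-in for the L-measure descriptors.

```python
# Minimal sketch: recursive feature elimination (RFE) with a Random Forest,
# then cross-validated evaluation of the selected subset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.normal(size=(318, 40))        # 318 traced neurons, 40 stand-in features
y = np.array([0] * 192 + [1] * 126)   # 192 interneurons, 126 pyramidal cells

selector = RFE(RandomForestClassifier(random_state=0), n_features_to_select=5).fit(X, y)
X_sel = X[:, selector.support_]
print(cross_val_score(RandomForestClassifier(random_state=0), X_sel, y, cv=5).mean())
```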
Taylor, William J
2016-03-01
Conjoint analysis of choice or preference data has been used in marketing for over 40 years but has appeared in healthcare settings much more recently. It may be a useful technique for applications within the rheumatology field. Conjoint analysis in rheumatology contexts has mainly used the approaches implemented by 1000Minds (1000Minds Ltd, Dunedin, New Zealand) and Sawtooth Software (Orem, UT, USA). Examples include classification criteria, composite response criteria, service prioritization tools, and utilities assessment. Limitations imposed by very many attributes can be managed using new techniques. Conjoint analysis studies of classification and response criteria suggest that the assumption of equal weighting of attributes cannot be met, which challenges traditional approaches to the construction of composite criteria. Weights elicited through choice experiments with experts can derive more accurate classification criteria than unweighted criteria. Studies that find significant variation in attribute weights for composite response criteria for gout make the construction of such criteria problematic. Better understanding of various multiattribute phenomena is likely to increase with increased use of conjoint analysis, especially when the attributes concern individual perceptions or opinions. In addition to classification criteria, emerging applications for conjoint analysis in rheumatology include prioritization tools, remission criteria, and utilities for life areas.
Sawanyawisuth, Kittisak; Furuya, Sugio; Park, Eun-Kee; Myong, Jun-Pyo; Ramos-Bonilla, Juan Pablo; Chimed Ochir, Odgerel; Takahashi, Ken
2017-07-27
Background: Asbestos-related diseases (ARD) are occupational hazards with high mortality rates. Identifying asbestos exposure from previous occupations is the main issue in ARD compensation for workers. This study aimed to identify risk groups by applying standard classifications of industries and occupations to a national database of compensated ARD victims in Japan. Methods: We identified occupations that carry a risk of asbestos exposure according to the International Standard Industrial Classification of All Economic Activities (ISIC). ARD compensation data from Japan between 2006 and 2013 were retrieved. Each compensated worker was classified by job section and group according to the ISIC code. Risk ratios for compensation were calculated according to the percentage of workers compensated because of ARD in each ISIC category. Results: In total, 6,916 workers with ARD received compensation in Japan between 2008 and 2013. ISIC section F (construction) had the highest compensated risk ratio, at 6.3. Section C (manufacturing) and section F (construction) had the largest numbers of compensated workers (2,868 and 3,463, respectively). In manufacturing (section C), 9 of 13 divisions had a risk ratio above 1. Among ISIC divisions in the construction section, construction of buildings (division 41) had the highest number of workers registering claims (2,504). Conclusion: ISIC classification of occupations at risk of developing ARD can be used to identify the actual risk of workers' compensation at the national level.
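One plausible reading of the risk-ratio calculation named in the Methods is sketched below with pandas: a section's share of compensated ARD cases relative to its share of the workforce. Only the compensated counts come from the abstract; the workforce denominators and the "other" category are invented for illustration.

import pandas as pd

df = pd.DataFrame({
    "section":     ["C (manufacturing)", "F (construction)", "other"],
    "compensated": [2868, 3463, 585],                 # from the abstract
    "workforce":   [10_000_000, 4_500_000, 40_000_000],  # hypothetical sizes
})

share_cases = df["compensated"] / df["compensated"].sum()
share_workers = df["workforce"] / df["workforce"].sum()
df["risk_ratio"] = (share_cases / share_workers).round(2)
print(df[["section", "risk_ratio"]])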
Jaiswara, Ranjana; Nandi, Diptarup; Balakrishnan, Rohini
2013-01-01
Traditional taxonomy based on morphology has often failed in accurate species identification owing to the occurrence of cryptic species, which are reproductively isolated but morphologically identical. Molecular data have thus been used to complement morphology in species identification. The sexual advertisement calls of several groups of acoustically communicating animals are species-specific and can thus complement molecular data as non-invasive tools for identification. Several statistical tools and automated identifier algorithms have been used to investigate the efficiency of acoustic signals in species identification. Despite a plethora of such methods, there is a general lack of knowledge regarding their appropriate usage in specific taxa. In this study, we investigated the performance of two commonly used statistical methods, discriminant function analysis (DFA) and cluster analysis, in identification and classification based on acoustic signals of field cricket species belonging to the subfamily Gryllinae. Using a comparative approach, we evaluated, for both methods, the optimal number of species and the calling song characteristics that lead to the most accurate classification and identification. The accuracy of classification using DFA was high and was not affected by the number of taxa used. However, a constraint in using discriminant function analysis is the need for a priori classification of songs. Accuracy of classification using cluster analysis, which does not require a priori knowledge, was maximal for 6–7 taxa and decreased significantly when more than ten taxa were analysed together. We also investigated the efficacy of two novel derived acoustic features in improving the accuracy of identification. Our results show that DFA is a reliable statistical tool for species identification using acoustic signals. Our results also show that cluster analysis of acoustic signals in crickets works effectively for species classification and identification. PMID:24086666
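To make the contrast between the two methods concrete, the following sketch (assuming scikit-learn and SciPy, with synthetic blobs standing in for calling-song features) runs a cross-validated linear discriminant analysis, which needs a priori species labels, alongside an unsupervised hierarchical clustering, which does not.

import numpy as np
from sklearn.datasets import make_blobs
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic "songs": 200 recordings, 5 acoustic features, 6 species.
X, species = make_blobs(n_samples=200, centers=6, n_features=5, random_state=1)

# DFA (supervised): requires the a priori species labels.
dfa_acc = cross_val_score(LinearDiscriminantAnalysis(), X, species, cv=5).mean()
print("DFA cross-validated accuracy:", round(dfa_acc, 3))

# Cluster analysis (unsupervised): no labels required.
tree = linkage(X, method="ward")
clusters = fcluster(tree, t=6, criterion="maxclust")
print("cluster sizes:", np.bincount(clusters)[1:])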
Boulton, Elisabeth; Hawley-Hague, Helen; Vereijken, Beatrix; Clifford, Amanda; Guldemond, Nick; Pfeiffer, Klaus; Hall, Alex; Chesani, Federico; Mellone, Sabato; Bourke, Alan; Todd, Chris
2016-06-01
Recent Cochrane reviews on falls and fall prevention have shown that it is possible to prevent falls in older adults living in the community and in care facilities. Technologies aimed at fall detection, assessment, prediction and prevention are emerging, yet there has been no consistency in describing or reporting on interventions using technologies. With the growth of eHealth and data-driven interventions, a common language and classification is required. The FARSEEING Taxonomy of Technologies was developed as a tool for those in the field of biomedical informatics to classify and characterise components of studies and interventions. The Taxonomy Development Group (TDG) comprised experts from across Europe. Through face-to-face meetings and contributions via email, five domains were developed, modified and agreed: Approach; Base; Components of outcome measures; Descriptors of technologies; and Evaluation. Each domain included sub-domains and categories with accompanying definitions. The classification system was tested against published papers and further amendments undertaken, including development of an online tool. Six papers were classified by the TDG, with levels of consensus recorded. Testing the taxonomy with papers highlighted difficulties in definitions across international healthcare systems, together with differences in TDG members' backgrounds. Definitions were clarified and amended accordingly, but some difficulties remained. The taxonomy and manual were large documents, leading to a lengthy classification process. The development of the online application enabled a much simpler classification process, as categories and definitions appeared only when relevant. Overall consensus for the classified papers was 70.66%. Consensus scores increased as modifications were made to the taxonomy. The FARSEEING Taxonomy of Technologies presents a common language, which should now be adopted in the field of biomedical informatics. In developing the taxonomy as an online tool, it has become possible to continue to develop and modify the classification system to incorporate new technologies and interventions. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Thermographic image analysis as a pre-screening tool for the detection of canine bone cancer
NASA Astrophysics Data System (ADS)
Subedi, Samrat; Umbaugh, Scott E.; Fu, Jiyuan; Marino, Dominic J.; Loughin, Catherine A.; Sackman, Joseph
2014-09-01
Canine bone cancer is a common type of cancer that grows fast and may be fatal. It usually appears in the limbs, where it is called "appendicular bone cancer." Diagnostic imaging methods such as X-rays, computed tomography (CT), and magnetic resonance imaging (MRI) are more common in bone cancer detection than invasive physical examinations such as biopsy. These imaging methods have some disadvantages, including high expense, high doses of radiation, and the need to keep the patient (canine) motionless during the imaging procedures. This study investigates the possibility of using thermographic images as a pre-screening tool for diagnosis of bone cancer in dogs. Experiments were performed with thermographic images from 40 dogs exhibiting bone cancer, with color normalization using temperature data provided by the Long Island Veterinary Specialists. The images were first divided into four groups according to body part (Elbow/Knee, Full Limb, Shoulder/Hip and Wrist). Each group was then further divided into three sub-groups according to view (Anterior, Lateral and Posterior). Thermographic patterns of normal and abnormal dogs were analyzed using feature extraction and pattern classification tools. Texture, spectral, and histogram features were extracted from the thermograms and used for pattern classification. The best classification success rate in canine bone cancer detection was 90%, with sensitivity of 100% and specificity of 80%, produced by the anterior view of the full-limb region with the nearest neighbor classification method and the normRGB-lum color normalization method. Our results show that it is possible to use thermographic imaging as a pre-screening tool for detection of canine bone cancer.
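A minimal sketch of a texture-feature-plus-nearest-neighbour pipeline of the kind described, assuming a recent scikit-image and scikit-learn; random arrays stand in for the thermograms, which are not public, and the feature set is reduced to three grey-level co-occurrence properties.

import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def texture_features(img):
    # Contrast/homogeneity/energy from a grey-level co-occurrence matrix.
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0]
            for p in ("contrast", "homogeneity", "energy")]

# 40 fake 64x64 8-bit "thermograms", half labelled abnormal.
images = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)
labels = np.array([0] * 20 + [1] * 20)

X = np.array([texture_features(im) for im in images])
clf = KNeighborsClassifier(n_neighbors=1).fit(X[:30], labels[:30])
print("held-out predictions:", clf.predict(X[30:]))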
Introducing a design exigency to promote student learning through assessment: A case study.
Grealish, Laurie A; Shaw, Julie M
2018-02-01
Assessment technologies are often used to classify student and newly qualified nurse performance as 'pass' or 'fail', with little attention to how these decisions are achieved. Examining the design exigencies of classification technologies, such as performance assessment technologies, provides opportunities to explore flexibility and change in the process of using those technologies. Evaluate an established assessment technology for nursing performance as a classification system. A case study analysis that is focused on the assessment approach and a priori design exigencies of performance assessment technology, in this case the Australian Nursing Standards Assessment Tool 2016. Nurse assessors are required to draw upon their expertise to judge performance, but that judgement is described as a source of bias, creating confusion. The definition of satisfactory performance is 'ready to enter practice'. To pass, the performance on each criterion must be at least satisfactory, indicating to the student that no further improvement is required. The Australian Nursing Standards Assessment Tool 2016 does not have a third 'other' category, which is usually found in classification systems. Introducing a 'not yet competent' category and creating a two-part, mixed methods assessment process can improve the Australian Nursing Standards Assessment Tool 2016 assessment technology. Using a standards approach in the first part, judgement is valued and can generate learning opportunities across a program. Using a measurement approach in the second part, student performance can be 'not yet competent' but still meet criteria for year level performance and a graded pass. Subjecting the Australian Nursing Standards Assessment Tool 2016 assessment technology to analysis as a classification system provides opportunities for innovation in design. This design innovation has the potential to support students who move between programs and clinicians who assess students from different universities. Copyright © 2017 Elsevier Ltd. All rights reserved.
Bag of Visual Words Model with Deep Spatial Features for Geographical Scene Classification
Wu, Lin
2017-01-01
With the popular use of geotagged images, more and more research effort has been placed on geographical scene classification. In geographical scene classification, valid spatial feature selection can significantly boost the final performance. Bag of visual words (BoVW) can do well at selecting features in geographical scene classification; nevertheless, it works effectively only if the provided feature extractor is well-matched. In this paper, we use convolutional neural networks (CNNs) to optimize the proposed feature extractor, so that it can learn more suitable visual vocabularies from the geotagged images. Our approach achieves better performance than standard BoVW as a tool for geographical scene classification on three datasets that contain a variety of scene categories. PMID:28706534
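The bag-of-visual-words encoding step can be sketched as follows, assuming scikit-learn; random vectors stand in for the local descriptors (which the paper derives from CNN activations), and the vocabulary size is arbitrary.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(5000, 128))  # local descriptors from training images

k = 64                                      # vocabulary size (hypothetical)
vocab = KMeans(n_clusters=k, n_init=4, random_state=0).fit(descriptors)

def bovw_histogram(image_descriptors):
    # Assign each descriptor to its nearest visual word and count occurrences.
    words = vocab.predict(image_descriptors)
    hist = np.bincount(words, minlength=k).astype(float)
    return hist / hist.sum()                # L1-normalised histogram per image

print(bovw_histogram(rng.normal(size=(300, 128)))[:8])

The resulting fixed-length histograms are what a downstream classifier consumes, regardless of how many local descriptors each image yields.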
Betta, M; Laurino, M; Gemignani, A; Landi, A; Menicucci, D
2015-01-01
Rapid eye movements (REMs) are a peculiar and intriguing aspect of REM sleep, even if their physiological function still remains unclear. In this work, a new automatic tool was developed, aimed at a complete description of REM activity during the night, both in terms of timing of occurrence and in terms of directional properties. A stage classifying each detected movement according to its main direction was added to our procedure for REM detection and ocular artifact removal. A supervised classifier was constructed, using EOG data recorded during voluntary saccades of five healthy volunteers as training and validation sets. Different classification methods were tested and compared. The additional information about REM directional characteristics provided by the procedure would represent a valuable tool for a deeper investigation into the physiological origin and functional meaning of REMs.
An evidence-based diagnostic classification system for low back pain
Vining, Robert; Potocki, Eric; Seidman, Michael; Morgenthal, A. Paige
2013-01-01
Introduction: While clinicians generally accept that musculoskeletal low back pain (LBP) can arise from specific tissues, it remains difficult to confirm specific sources. Methods: Based on evidence from diagnostic utility studies, doctors of chiropractic functioning as members of a research clinic created a diagnostic classification system with a corresponding exam and checklist, based on strength of evidence and in-office efficiency. Results: The diagnostic classification system contains one screening category, two pain categories (nociceptive and neuropathic), one functional evaluation category, and one category for unknown or poorly defined diagnoses. The nociceptive and neuropathic pain categories are each divided into four subcategories. Conclusion: This article describes and discusses the strength of evidence surrounding diagnostic categories for an in-office clinical exam and checklist tool for LBP diagnosis. The use of a standardized tool for diagnosing low back pain in clinical and research settings is encouraged. PMID:23997245
Global Dynamic Exposure and the OpenBuildingMap
NASA Astrophysics Data System (ADS)
Schorlemmer, D.; Beutin, T.; Hirata, N.; Hao, K. X.; Wyss, M.; Cotton, F.; Prehn, K.
2015-12-01
Detailed understanding of local risk factors regarding natural catastrophes requires in-depth characterization of the local exposure. Current exposure capture techniques have to balance resolution against coverage. We aim at bridging this gap by employing a crowd-sourced approach to exposure capturing, focusing on risk related to earthquake hazard. OpenStreetMap (OSM), the rich and constantly growing geographical database, is an ideal foundation for us. More than 2.5 billion geographical nodes, more than 150 million building footprints (growing by ~100,000 per day), and a plethora of information about school, hospital, and other critical facility locations allow us to exploit this dataset for risk-related computations. We will harvest this dataset by collecting exposure and vulnerability indicators from explicitly provided data (e.g. hospital locations), implicitly provided data (e.g. building shapes and positions), and semantically derived data, i.e. interpretation applying expert knowledge. With this approach, we can increase the resolution of existing exposure models from fragility-class distributions via block-by-block specifications to building-by-building vulnerability. To increase coverage, we will provide a framework for collecting building data by any person or community. We will implement a double crowd-sourced approach to bring together the interest and enthusiasm of communities with the knowledge of earthquake and engineering experts. The first crowd-sourced approach aims at collecting building properties in a community by local people and activists, supported by tailored building-capture tools for mobile devices for simple and fast capturing of building properties. The second crowd-sourced approach involves local experts in estimating building vulnerability, who will provide building classification rules that translate building properties into vulnerability and exposure indicators as defined in the Building Taxonomy 2.0 developed by the Global Earthquake Model (GEM). These indicators will then be combined with a hazard model using the GEM OpenQuake engine to compute a risk model. The free/open framework we will provide can be used on commodity hardware for local to regional exposure capturing and by communities to understand their earthquake risk.
Gold-standard for computer-assisted morphological sperm analysis.
Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen
2017-04-01
Published algorithms for classification of human sperm heads are based on relatively small image databases that are not open to the public, so no direct comparison is available for competing methods. We describe a gold standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert classification labels in one of the following classes: normal, tapered, pyriform, small, or amorphous. This gold standard is for evaluating and comparing known techniques and future improvements to present approaches for classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments comparing common sperm head description and classification techniques. This classification baseline is intended to be used as a reference for future improvements to present approaches for human sperm head classification. The gold standard provides a label for each sperm head, achieved by majority voting among experts. The classification baseline compares four supervised learning methods (1-nearest neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' kappa coefficient to evaluate inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is best suited for tackling the problem of sperm head classification. We found that the correct classification rate was highly variable when trying to discriminate among non-normal sperm heads. By using the Fourier descriptor and SVM, we achieved the best mean correct classification rate: only 49%. We conclude that SCIAN-MorphoSpermGS will provide a standard tool for evaluation of characterization and classification approaches for human sperm heads. Indeed, there is a clear need for a specific shape-based descriptor for human sperm heads and a specific classification approach to tackle the problem of high variability within subcategories of abnormal sperm cells. Copyright © 2017 Elsevier Ltd. All rights reserved.
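The inter-expert agreement statistic used here, Fleiss' kappa, can be computed as in the hedged sketch below (assuming statsmodels); the label matrix is invented, with rows as sperm heads, columns as experts, and values coding the five classes.

import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

rng = np.random.default_rng(0)
labels = rng.integers(0, 5, size=(30, 3))   # 30 cells rated by 3 experts

# aggregate_raters turns the (subject x rater) labels into per-subject
# category counts, the input format fleiss_kappa expects.
table, _ = aggregate_raters(labels)
print("Fleiss' kappa:", round(fleiss_kappa(table), 3))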
Donada, Marc; Della Mea, Vincenzo; Cumerlato, Megan; Rankin, Nicole; Madden, Richard
2018-01-01
The International Classification of Health Interventions (ICHI) is a member of the WHO Family of International Classifications, being developed to provide a common tool for reporting and analysing health interventions for statistical purposes. A web-based platform for classification development and update has been specifically developed to support the initial development step and then, after final approval, the continuous revision and update of the classification. The platform provides features for classification editing, versioning, comment management and URI identifiers. During the last 12 months it has been used for developing the ICHI Beta version, replacing the previous process based on the exchange of Excel files. As of November 2017, 90 users have provided input to the development of the classification, resulting in 2913 comments and 2971 changes in the classification since June 2017. Further work includes the development of a URI API for machine-to-machine communication, following the model established for ICD-11.
Stoffenmanager exposure model: company-specific exposure assessments using a Bayesian methodology.
van de Ven, Peter; Fransman, Wouter; Schinkel, Jody; Rubingh, Carina; Warren, Nicholas; Tielemans, Erik
2010-04-01
The web-based tool "Stoffenmanager" was initially developed to assist small- and medium-sized enterprises in the Netherlands in making qualitative risk assessments and providing advice on control at the workplace. The tool uses a mechanistic model to arrive at a "Stoffenmanager score" for exposure. A recent study showed that variability in exposure measurements given a certain Stoffenmanager score is still substantial. This article discusses an extension to the tool that uses a Bayesian methodology for quantitative workplace/scenario-specific exposure assessment. This methodology allows real exposure data observed in the company of interest to be combined with the prior estimate (based on the Stoffenmanager model). The output of the tool is a company-specific assessment of exposure levels for a scenario for which data are available. The Bayesian approach provides a transparent way of synthesizing different types of information and is especially preferred in situations where available data are sparse, as is often the case in small- and medium-sized enterprises. Real-world examples as well as simulation studies were used to assess how parameters such as sample size, difference between prior and data, uncertainty in the prior, and variance in the data affect the eventual posterior distribution of a Bayesian exposure assessment.
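The core Bayesian idea, combining a model-based prior with company-specific measurements, can be sketched as a conjugate normal update on log-transformed exposure. This is a minimal sketch, not the tool's actual model (which is richer, e.g. with variance components); the prior values, measurements and known measurement variance below are all assumptions.

import numpy as np

prior_mu, prior_sd = np.log(0.5), 0.8  # prior on log exposure (mg/m3), hypothetical
data = np.log([0.9, 1.4, 0.7])         # company-specific measurements, hypothetical
data_sd = 0.6                          # assumed known log-scale measurement SD

# Conjugate normal update: precisions add, means are precision-weighted.
n = len(data)
post_var = 1 / (1 / prior_sd**2 + n / data_sd**2)
post_mu = post_var * (prior_mu / prior_sd**2 + data.sum() / data_sd**2)

print("posterior median exposure:", round(float(np.exp(post_mu)), 2), "mg/m3")

Note how three measurements already pull the posterior median well above the prior median of 0.5 mg/m3, illustrating why sparse company data can still usefully refine a model-based estimate.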
Sauvé, Jean-François; Siemiatycki, Jack; Labrèche, France; Richardson, Lesley; Pintos, Javier; Sylvestre, Marie-Pierre; Gérin, Michel; Bégin, Denis; Lacourt, Aude; Kirkham, Tracy L; Rémen, Thomas; Pasquet, Romain; Goldberg, Mark S; Rousseau, Marie-Claude; Parent, Marie-Élise; Lavoué, Jérôme
2018-06-12
We developed a job-exposure matrix called CANJEM using data generated in population-based case-control studies of cancer. This article describes some of the decisions made in developing CANJEM and some of its performance characteristics. CANJEM is built from exposure information on 31,673 jobs held by study subjects included in our past case-control studies. For each job, experts had evaluated the intensity, frequency, and likelihood of exposure to a predefined list of agents based on job histories and descriptions of tasks and workplaces. The creation of CANJEM involved a host of decisions regarding its structure and operational decisions regarding which parameters to present, with the goal of producing an instrument that provides great flexibility to the user. In addition to describing these decisions, we conducted analyses to assess how well CANJEM covered the range of occupations found in Canada. Even at quite a high level of resolution of the occupation classifications and time periods, over 90% of the recent Canadian working population would be covered by CANJEM. Prevalence of exposure to specific agents in specific occupations ranges from 0% to nearly 100%, providing the user with basic information to discriminate exposed from unexposed workers. Furthermore, among exposed workers there is information that can be used to discriminate those with high exposure from those with low exposure. CANJEM provides good coverage of the Canadian working population and possibly that of several other countries. Available in several occupation classification systems and including 258 agents, CANJEM can be used to support exposure assessment efforts in epidemiology and prevention of occupational diseases.
NASA Astrophysics Data System (ADS)
Tao, C.-S.; Chen, S.-W.; Li, Y.-Z.; Xiao, S.-P.
2017-09-01
Land cover classification is an important application of polarimetric synthetic aperture radar (PolSAR) data. Roll-invariant polarimetric features such as H/Ani/α/Span are commonly adopted in PolSAR land cover classification. However, the target orientation diversity effect makes PolSAR image understanding and interpretation difficult. Using only the roll-invariant polarimetric features may introduce ambiguity in the interpretation of targets' scattering mechanisms and limit the subsequent classification accuracy. To address this problem, this work first focuses on mining hidden polarimetric features in the rotation domain along the radar line of sight, using the recently reported uniform polarimetric matrix rotation theory and the visualization and characterization tool of polarimetric coherence patterns. The former rotates the acquired polarimetric matrix along the radar line of sight and fully describes the rotation characteristics of each entry of the matrix; sets of new polarimetric features are derived to describe the hidden scattering information of the target in the rotation domain. The latter extends the traditional polarimetric coherence at a given rotation angle to the full rotation domain for complete interpretation, and a visualization and characterization tool is established to derive new polarimetric features for hidden information exploration. A classification scheme is then developed combining the selected new hidden polarimetric features in the rotation domain and the commonly used roll-invariant polarimetric features with a support vector machine (SVM) classifier. Comparison experiments based on AIRSAR and multi-temporal UAVSAR data demonstrate that, compared with the conventional classification scheme that uses only roll-invariant polarimetric features, the proposed scheme achieves both higher classification accuracy and better robustness. For AIRSAR data, the overall classification accuracy with the proposed scheme is 94.91%, versus 93.70% with the conventional scheme. Moreover, for multi-temporal UAVSAR data, the averaged overall classification accuracy with the proposed scheme is up to 97.08%, much higher than the 87.79% of the conventional scheme, and the proposed scheme also achieves better robustness. These comparison studies clearly demonstrate that mining and utilizing hidden polarimetric features and information in the rotation domain can provide added benefits for PolSAR land cover classification and a new vision for PolSAR image interpretation and application.
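The rotation-domain idea can be sketched in a few lines: the coherency matrix is rotated about the radar line of sight and one entry is tracked across rotation angles. The rotation matrix below is the standard T3 rotation about the line of sight; the matrix values are illustrative, and the paper's derived features are richer than this single trace.

import numpy as np

def rotate_T(T, theta):
    # Rotate a 3x3 coherency matrix by angle theta about the line of sight.
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    R = np.array([[1, 0, 0],
                  [0, c, s],
                  [0, -s, c]], dtype=complex)
    return R @ T @ R.conj().T

T = np.array([[2.0, 0.3 + 0.1j, 0.0],
              [0.3 - 0.1j, 1.0, 0.2j],
              [0.0, -0.2j, 0.5]])          # illustrative coherency matrix

angles = np.linspace(-np.pi / 2, np.pi / 2, 181)
t22 = np.array([rotate_T(T, a)[1, 1].real for a in angles])
print("T22 max/min over rotation domain:", t22.max().round(3), t22.min().round(3))

Summary statistics of such traces (extrema, angles at which they occur, widths) are the kind of rotation-domain features that can then be fed, together with roll-invariant features, to an SVM.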
NASA Astrophysics Data System (ADS)
Wu, Jie; Besnehard, Quentin; Marchessoux, Cédric
2011-03-01
Clinical studies for the validation of new medical imaging devices require hundreds of images. An important step in creating and tuning the study protocol is the classification of images into "difficult" and "easy" cases. This consists of classifying each image based on features like the complexity of the background and the visibility of the disease (lesions). An automatic background classification tool for mammograms would therefore help with such clinical studies. This classification tool is based on a multi-content analysis (MCA) framework, first developed to recognize the image content of computer screen shots. With the implementation of new texture features and a defined breast density scale, the MCA framework is able to automatically classify digital mammograms with satisfactory accuracy. The BI-RADS (Breast Imaging Reporting and Data System) density scale is used for grouping the mammograms; it standardizes mammography reporting terminology and assessment and recommendation categories. Selected features are input into a decision tree classification scheme in the MCA framework, forming a so-called "weak classifier" (any classifier with a global error rate below 50%). With the AdaBoost iteration algorithm, these weak classifiers are combined into a "strong classifier" (a classifier with a low global error rate) for classifying one category. The classification results for one strong classifier show good accuracy with high true positive rates. For the four categories, the results are: TP=90.38%, TN=67.88%, FP=32.12% and FN=9.62%.
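A hedged sketch of the weak-to-strong boosting step, assuming a recent scikit-learn (which uses the estimator= keyword): depth-1 decision stumps, each only slightly better than chance on their own, are combined by AdaBoost into a strong classifier. Synthetic data stand in for the mammogram features.

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=20, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

weak = DecisionTreeClassifier(max_depth=1)  # a stump: a "weak classifier"
strong = AdaBoostClassifier(estimator=weak, n_estimators=100, random_state=0)
strong.fit(Xtr, ytr)
print("strong-classifier accuracy:", round(strong.score(Xte, yte), 3))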
A few different exposure prediction tools were evaluated for use in the new in vitro-based safety assessment paradigm using di-2-ethylhexyl phthalate (DEHP) and dibutyl phthalate (DnBP) as case compounds. Daily intake of each phthalate was estimated using both high-throughput (HT...
ERIC Educational Resources Information Center
Strobl, Carolin; Malley, James; Tutz, Gerhard
2009-01-01
Recursive partitioning methods have become popular and widely used tools for nonparametric regression and classification in many scientific fields. Especially random forests, which can deal with large numbers of predictor variables even in the presence of complex interactions, have been applied successfully in genetics, clinical medicine, and…
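A minimal sketch of the use case this abstract describes, assuming scikit-learn: a random forest fitted to synthetic data with many predictors and few informative ones, followed by permutation-based variable importance to surface the relevant predictors.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# 30 predictors, only 5 informative; stands in for genetic/clinical variables.
X, y = make_classification(n_samples=500, n_features=30, n_informative=5,
                           random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

imp = permutation_importance(rf, X, y, n_repeats=10, random_state=0)
top = imp.importances_mean.argsort()[::-1][:5]
print("top predictors by permutation importance:", top)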
Cox, Emily; Martin, Bradley C; Van Staa, Tjeerd; Garbe, Edeltraut; Siebert, Uwe; Johnson, Michael L
2009-01-01
The goal of comparative effectiveness analysis is to examine the relationship between two variables: treatment or exposure, and effectiveness or outcome. Unlike with data obtained through randomized controlled trials, researchers face greater challenges with causal inference in observational studies. Recognizing these challenges, a task force was formed to develop a guidance document on methodological approaches to address these biases. The task force was commissioned and a Chair was selected by the International Society for Pharmacoeconomics and Outcomes Research Board of Directors in October 2007. This report, the second of three reported in this issue of the Journal, discusses the inherent biases in using secondary data sources for comparative effectiveness analysis and provides methodological recommendations to help mitigate these biases. The task force report provides recommendations and tools for researchers to mitigate threats to validity from bias and confounding in the measurement of exposure and outcome. Recommendations on study design included: the need for a data analysis plan with causal diagrams; detailed attention to classification bias in the definition of exposure and clinical outcome; careful and appropriate use of restriction; and extreme care to identify and control for confounding factors, including time-dependent confounding. The design of nonrandomized studies of comparative effectiveness faces several daunting issues, including measurement of exposure and outcome challenged by misclassification and confounding. Use of causal diagrams and restriction are two techniques that can improve the theoretical basis for analyzing treatment effects in study populations of greater homogeneity, with reduced loss of generalizability.
The Community-Focused Exposure and Risk Screening Tool (C-FERST) is an online tool which provides access to resources that can help communities learn more about their environmental issues, and explore exposure and risk reduction options.
Available Tools and Challenges Classifying Cutting-Edge and Historical Astronomical Documents
NASA Astrophysics Data System (ADS)
Lagerstrom, Jill
2015-08-01
The STScI Library assists the Science Policies Division in evaluating and choosing scientific keywords and categories for proposals for the Hubble Space Telescope mission and the upcoming James Webb Space Telescope mission. In addition, we are often faced with the question "what is the shape of the astronomical literature?" However, subject classification in astronomy has not been cultivated in recent times. This talk will address the available tools and challenges of classifying cutting-edge as well as historical astronomical documents. In the process, we will give an overview of current and upcoming practices of subject classification in astronomy.
Reading the lesson: eliciting requirements for a mammography training application
NASA Astrophysics Data System (ADS)
Hartswood, M.; Blot, L.; Taylor, P.; Anderson, S.; Procter, R.; Wilkinson, L.; Smart, L.
2009-02-01
Demonstrations of a prototype training tool were used to elicit requirements for an intelligent training system for screening mammography. The prototype allowed senior radiologists (mentors) to select cases from a distributed database of images to meet the specific training requirements of junior colleagues (trainees) and then provided automated feedback in response to trainees' attempts at interpretation. The tool was demonstrated to radiologists and radiographers working in the breast screening service at four evaluation sessions. Participants highlighted ease of selecting cases that can deliver specific learning objectives as important for delivering effective training. To usefully structure a large data set of training images, we undertook a classification exercise of mentor-authored free-text 'learning points' attached to training cases obtained from two screening centres (n=333 and n=129, respectively). We were able to adduce a hierarchy of abstract categories representing classes of lesson that groups of cases were intended to convey (e.g. temporal change, misleading juxtapositions, position of lesion, typical/atypical presentation, and so on). In this paper we present the method used to devise this classification, the classification scheme itself, initial user feedback, and our plans to incorporate it into a software tool to aid case selection.
Ta, Goh Choo; Mokhtar, Mazlin Bin; Mohd Mokhtar, Hj Anuar Bin; Ismail, Azmir Bin; Abu Yazid, Mohd Fadhil Bin Hj
2010-01-01
Chemical classification and labelling systems may be roughly similar from one country to another, but there are significant differences too. In order to harmonize various chemical classification systems and ultimately provide consistent chemical hazard communication tools worldwide, the Globally Harmonized System of Classification and Labelling of Chemicals (GHS) was endorsed by the United Nations Economic and Social Council (ECOSOC). Several countries, including Japan, Taiwan, Korea and Malaysia, are now in the process of implementing GHS. It is essential to ascertain the comprehensibility of the chemical hazard communication tools described in the GHS documents, namely chemical labels and Safety Data Sheets (SDS). Comprehensibility Testing (CT) was carried out with a mixed group of industrial workers in Malaysia (n=150), and factors that influence comprehensibility were analysed using one-way ANOVA. The ability of the respondents to retrieve information from the SDS was also tested in this study. The findings show that almost all the GHS pictograms meet the ISO comprehension criteria. It is concluded that training and education are the underlying core elements that enhance comprehension of GHS pictograms and that they are also essential in developing persons competent in the use of SDS.
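The one-way ANOVA named above can be reproduced in a couple of lines with SciPy; the comprehension scores and the grouping variable (education level) below are invented for illustration.

from scipy.stats import f_oneway

# Hypothetical pictogram comprehension scores by education level.
primary   = [55, 60, 48, 62, 58]
secondary = [70, 66, 74, 69, 72]
tertiary  = [80, 78, 85, 76, 82]

F, p = f_oneway(primary, secondary, tertiary)
print(f"F = {F:.2f}, p = {p:.4f}")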
Toward a Molecular Understanding of Noise-Induced Hearing Loss
2017-10-01
[Report documentation page; full abstract not recoverable. Keywords: permanent threshold shift, temporary threshold shift, noise-induced hearing loss, Ribotag, RNA-seq, hair cell, supporting cell, SAHA, heat shock, sex differences. The recoverable fragments note that effects were also sex-specific and list TTS-inducing noise exposure tasks: crosses, calibration, validation cytocochleograms, noise exposure, tissue harvesting, polysome IP.]
Noise Exposure Questionnaire (NEQ): A Tool for Quantifying Annual Noise Exposure
Johnson, Tiffany A.; Cooper, Susan; Stamper, Greta C.; Chertoff, Mark
2017-01-01
Background Exposure to both occupational and non-occupational noise is recognized as a risk factor for noise-induced hearing loss (NIHL). Although audiologists routinely inquire regarding history of noise exposure, there are limited tools available for quantifying this history or for identifying those individuals who are at highest risk for NIHL. Identifying those at highest risk would allow hearing conservation activities to be focused on those individuals. Purpose To develop a detailed, task-based questionnaire for quantifying an individual’s annual noise exposure arising from both occupational and non-occupational sources (aim 1) and to develop a short screening tool that could be used to identify individuals at high risk of NIHL (aim 2). Research Design Review of relevant literature for questionnaire development followed by a cross-sectional descriptive and correlational investigation of the newly developed questionnaire and screening tool. Study Sample One hundred fourteen college freshmen completed the detailed questionnaire for estimating annual noise exposure (aim 1) and answered the potential screening questions (aim 2). An additional 59 adults participated in data collection where the accuracy of the screening tool was evaluated (aim 2). Data Collection and Analysis In study aim 1, all subjects completed the detailed questionnaire and the potential screening questions. Descriptive statistics were used to quantify subject participation in various noisy activities and their associated annual noise exposure estimates. In study aim 2, linear regression techniques were used to identify screening questions that could be used to predict a subject’s estimated annual noise exposure. Clinical decision theory was then used to assess the accuracy with which the screening tool predicted high and low risk of NIHL in a new group of subjects. Results Responses on the detailed questionnaire indicated that our sample of college freshmen reported high rates of participation in a variety of occupational and non-occupational activities associated with high sound levels. Although participation rates were high, annual noise exposure estimates were below highest-risk levels for many subjects because the frequency of participation in these activities was low in many cases. These data illustrate how the Noise Exposure Questionnaire (NEQ) could be used to provide detailed and specific information regarding an individual’s exposure to noise. The results of aim 2 suggest that the screening tool, the 1-Minute Noise Screen, can be used to identify those subjects with high- and low-risk noise exposure, allowing more in-depth assessment of noise exposure history to be targeted at those most at risk. Conclusions The NEQ can be used to estimate an individual’s annual noise exposure and the 1-Minute Noise Screen can be used to identify those subjects at highest risk of NIHL. These tools allow audiologists to focus hearing conservation efforts on those individuals who are most in need of those services. PMID:28054909
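One standard way to pool activity-level exposures into an annual equivalent level is the equal-energy rule sketched below. This is an illustrative assumption: the NEQ's actual scoring procedure is not detailed in the abstract, and the activity levels and hours are hypothetical.

import math

activities = [  # (activity, typical A-weighted level in dB, hours per year)
    ("power tools",   95.0, 40),
    ("concerts/bars", 100.0, 30),
    ("lawn mowing",   90.0, 20),
]

total_hours = 365 * 24
# Equal-energy rule: sum the acoustic energy contributed by each activity.
energy = sum(h * 10 ** (L / 10) for _, L, h in activities)
# Remaining hours are assumed quiet enough to ignore.
annual_leq = 10 * math.log10(energy / total_hours)
print("annual equivalent level: %.1f dBA" % annual_leq)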
The purpose of this report is to assess the application of tools to community-level assessments of exposure, health and the environment. Various tools and datasets provided different types of information, such as on health effects, chemical types and volumes, facility locations a...
Magalhaes, Sandra; Banwell, Brenda; Bar-Or, Amit; Fortier, Isabel; Hanwell, Heather E; Lim, Ming; Matt, Georg E; Neuteboom, Rinze F; O'Riordan, David L; Schneider, Paul K; Pugliatti, Maura; Shatenstein, Bryna; Tansey, Catherine M; Wassmer, Evangeline; Wolfson, Christina
2018-06-01
While studying the etiology of multiple sclerosis (MS) in children has several methodological advantages over studying etiology in adults, studies are limited by small sample sizes. Using a rigorous methodological process, we developed the Pediatric MS Tool-Kit, a measurement framework that includes a minimal set of core variables to assess etiological risk factors. We solicited input from the International Pediatric MS Study Group to select three risk factors: environmental tobacco smoke (ETS) exposure, sun exposure, and vitamin D intake. To develop the Tool-Kit, we used a Delphi study involving a working group of epidemiologists, neurologists, and content experts from North America and Europe. The Tool-Kit includes six core variables to measure ETS, six to measure sun exposure, and six to measure vitamin D intake. The Tool-Kit can be accessed online (www.maelstrom-research.org/mica/network/tool-kit). The goals of the Tool-Kit are to enhance exposure measurement in newly designed pediatric MS studies and the comparability of results across studies, and in the longer term to facilitate harmonization of studies, a methodological approach that can be used to circumvent issues of small sample sizes. We believe the Tool-Kit will prove to be a valuable resource to guide pediatric MS researchers in developing study-specific questionnaires.
Cluster Method Analysis of K. S. C. Image
NASA Technical Reports Server (NTRS)
Rodriguez, Joe, Jr.; Desai, M.
1997-01-01
Information obtained from satellite-based systems has moved to the forefront as a method for identifying many land cover types. Identification of different land features through remote sensing is an effective tool for regional and global assessment of geometric characteristics. Classification data acquired from remote sensing images have a wide variety of applications. In particular, analysis of remote sensing images has special applications in the classification of various types of vegetation. Results obtained from classification studies of a particular area or region serve toward a greater understanding of what parameters (ecological, temporal, etc.) affect the region being analyzed. In this paper, we distinguish between the two types of classification approaches, although focus is given to the unsupervised classification method, using 1987 Thematic Mapper (TM) images of the Kennedy Space Center.
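A minimal sketch of the unsupervised approach, assuming scikit-learn: pixel spectra of a multiband scene are clustered with k-means and the labels reshaped into a class map. A random array stands in for the TM scene, and the number of clusters is arbitrary.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
scene = rng.random((100, 100, 6))           # rows x cols x 6 TM-like bands

pixels = scene.reshape(-1, scene.shape[-1])  # one spectrum per pixel
labels = KMeans(n_clusters=5, n_init=4, random_state=0).fit_predict(pixels)
class_map = labels.reshape(scene.shape[:2])  # per-pixel land-cover classes
print("pixels per class:", np.bincount(labels))

An analyst would then assign land-cover meanings (water, vegetation types, built-up, etc.) to the spectral clusters after the fact, which is what distinguishes the unsupervised method from its supervised counterpart.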
A Taxonomy of Introductory Physics Concepts.
NASA Astrophysics Data System (ADS)
Mokaya, Fridah; Savkar, Amit; Valente, Diego
We have designed and implemented a hierarchical taxonomic classification of physics concepts for our introductory physics for engineers course sequence taught at the University of Connecticut. This classification can be used to provide a mechanism for measuring student progress in learning at the level of individual concepts or clusters of concepts, and also as part of a tool to measure the effectiveness of teaching pedagogy. We examine our pre- and post-test FCI results broken down by topic using Hestenes et al.'s taxonomy classification for the FCI, and compare these results with those found using our own taxonomy classification. In addition, we expand this taxonomic classification to measure performance in our other course exams, investigating possible correlations in results achieved across different assessments at the individual topic level. UCONN CLAS (College of Liberal Arts and Sciences).
TREXMO: A Translation Tool to Support the Use of Regulatory Occupational Exposure Models.
Savic, Nenad; Racordon, Dimitri; Buchs, Didier; Gasic, Bojan; Vernez, David
2016-10-01
Occupational exposure models vary significantly in their complexity, purpose, and the level of expertise required from the user. Different parameters in the same model may lead to different exposure estimates for the same exposure situation. This paper presents a tool developed to deal with this concern: TREXMO, or TRanslation of EXposure MOdels. TREXMO integrates six commonly used occupational exposure models, namely ART v.1.5, STOFFENMANAGER(®) v.5.1, ECETOC TRA v.3, MEASE v.1.02.01, EMKG-EXPO-TOOL, and EASE v.2.0. By enabling a semi-automatic translation between the parameters of these six models, TREXMO facilitates their simultaneous use. For a given exposure situation, defined by a set of parameters in one of the models, TREXMO provides the user with the most appropriate parameters to use in the other exposure models. Results showed that, once an exposure situation and parameters were set in ART, TREXMO reduced the number of possible outcomes in the other models by 1-4 orders of magnitude. The tool should reduce uncertainty in the entry or selection of parameters in the six models, improve between-user reliability, and reduce the time required to run several models for a given exposure situation. In addition to these advantages, registrants of chemicals and authorities should benefit from more reliable exposure estimates for the risk characterization of dangerous chemicals under the Registration, Evaluation, Authorisation and restriction of CHemicals (REACH) regulation. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
Ferris, Laura K; Farberg, Aaron S; Middlebrook, Brooke; Johnson, Clare E; Lassen, Natalie; Oelschlager, Kristen M; Maetzold, Derek J; Cook, Robert W; Rigel, Darrell S; Gerami, Pedram
2017-05-01
A significant proportion of patients with American Joint Committee on Cancer (AJCC)-defined early-stage cutaneous melanoma have disease recurrence and die. A 31-gene expression profile (GEP) that accurately assesses metastatic risk associated with primary cutaneous melanomas has been described. We sought to compare accuracy of the GEP in combination with risk determined using the web-based AJCC Individualized Melanoma Patient Outcome Prediction Tool. GEP results from 205 stage I/II cutaneous melanomas with sufficient clinical data for prognostication using the AJCC tool were classified as low (class 1) or high (class 2) risk. Two 5-year overall survival cutoffs (AJCC 79% and 68%), reflecting survival for patients with stage IIA or IIB disease, respectively, were assigned for binary AJCC risk. Cox univariate analysis revealed significant risk classification of distant metastasis-free and overall survival (hazard ratio range 3.2-9.4, P < .001) for both tools. In all, 43 (21%) cases had discordant GEP and AJCC classification (using 79% cutoff). Eleven of 13 (85%) deaths in that group were predicted as high risk by GEP but low risk by AJCC. Specimens reflect tertiary care center referrals; more effective therapies have been approved for clinical use after accrual. The GEP provides valuable prognostic information and improves identification of high-risk melanomas when used together with the AJCC online prediction tool. Copyright © 2016 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
Using exposure bands for rapid decision making in the ...
The ILSI Health and Environmental Sciences Institute (HESI) Risk Assessment in the 21st Century (RISK21) project was initiated to address and catalyze improvements in human health risk assessment. RISK21 is a problem formulation-based conceptual roadmap and risk matrix visualization tool, facilitating transparent evaluation of both hazard and exposure components. The RISK21 roadmap is exposure-driven, i.e. exposure is used as the second step (after problem formulation) to define and focus the assessment. This paper describes the exposure tiers of the RISK21 matrix and the approaches to adapt readily available information to more quickly inform exposure at a screening level. In particular, exposure look-up tables developed from available exposure tools (European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) Targeted Risk Assessment (TRA) for worker exposure, European Solvents Industry Group (ESIG) Generic Exposure Scenario (GES) Risk and Exposure Tool (EGRET) for consumer exposure, and USEtox for indirect exposure to humans via the environment) were tested in a hypothetical mosquito bed netting case study. A detailed WHO risk assessment for a similar mosquito net use served as a benchmark for the performance of the RISK21 approach. The case study demonstrated that the screening methodologies provided suitably conservative exposure estimates for risk assessment. The results of this effort showed that the RISK21 approach is useful f
Zartarian, Valerie G; Schultz, Bradley D; Barzyk, Timothy M; Smuts, Marybeth; Hammond, Davyda M; Medina-Vera, Myriam; Geller, Andrew M
2011-12-01
Our primary objective was to provide higher quality, more accessible science to address challenges of characterizing local-scale exposures and risks for enhanced community-based assessments and environmental decision-making. After identifying community needs, priority environmental issues, and current tools, we designed and populated the Community-Focused Exposure and Risk Screening Tool (C-FERST) in collaboration with stakeholders, following a set of defined principles, and considered it in the context of environmental justice. C-FERST is a geographic information system and resource access Web tool under development for supporting multimedia community assessments. Community-level exposure and risk research is being conducted to address specific local issues through case studies. C-FERST can be applied to support environmental justice efforts. It incorporates research to develop community-level data and modeled estimates for priority environmental issues, and other relevant information identified by communities. Initial case studies are under way to refine and test the tool to expand its applicability and transferability. Opportunities exist for scientists to address the many research needs in characterizing local cumulative exposures and risks and for community partners to apply and refine C-FERST.
DEMONSTRATION OF HUMAN EXPOSURE TOOLS
The Human Exposure and Atmospheric Sciences Division (HEASD) of the National Exposure Research Laboratory (NERL) conducts research on exposure measurements, human activity patterns, exposure and dose models, and cumulative exposures critical for the Agency to make scientificall...
I-CAN: the classification and prediction of support needs.
Arnold, Samuel R C; Riches, Vivienne C; Stancliffe, Roger J
2014-03-01
Since 1992, the diagnosis and classification of intellectual disability has been dependent upon three constructs: intelligence, adaptive behaviour and support needs (Luckasson et al. 1992. Mental Retardation: Definition, Classification and Systems of Support. American Association on Intellectual and Developmental Disability, Washington, DC). While the methods and instruments to measure intelligence and adaptive behaviour are well established and generally accepted, the measurement and classification of support needs is still in its infancy. This article explores the measurement and classification of support needs. A study is presented comparing scores on the ICF (WHO, 2001) based I-CAN v4.2 support needs assessment and planning tool with expert clinical judgment using a proposed classification of support needs. A logical classification algorithm was developed and validated on a separate sample. Good internal consistency (range 0.73-0.91, N = 186) and criterion validity (κ = 0.94, n = 49) were found. Further advances in our understanding and measurement of support needs could change the way we assess, describe and classify disability. © 2013 John Wiley & Sons Ltd.
Franson, J.C.; Hohman, W.L.; Moore, J.L.; Smith, M.R.
1996-01-01
We used 363 blood samples collected from wild canvasback ducks (Aythya valisineria) at Catahoula Lake, Louisiana, U.S.A. to evaluate the effect of sample storage time on the efficacy of erythrocytic protoporphyrin as an indicator of lead exposure. The protoporphyrin concentration of each sample was determined by hematofluorometry within 5 min of blood collection and after refrigeration at 4 °C for 24 and 48 h. All samples were analyzed for lead by atomic absorption spectrophotometry. Taking a blood lead concentration of ≥0.2 ppm wet weight as positive evidence of lead exposure, the protoporphyrin technique resulted in overall error rates of 29%, 20%, and 19% and false negative error rates of 47%, 29%, and 25% when hematofluorometric determinations were made on blood at 5 min, 24 h, and 48 h, respectively. False positive error rates were less than 10% for all three measurement times. The accuracy of the 24-h erythrocytic protoporphyrin classification of blood samples as positive or negative for lead exposure was significantly greater than that of the 5-min classification, but no further improvement in accuracy was gained when samples were tested at 48 h. The false negative errors were probably due, at least in part, to the lag time between lead exposure and the increase of blood protoporphyrin concentrations. False negatives resulted in an underestimation of the true number of canvasbacks exposed to lead, indicating that hematofluorometry provides a conservative estimate of lead exposure.
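The error rates reported above can be derived from paired screening and reference calls as in the sketch below; the two boolean arrays are invented stand-ins for the hematofluorometry results and the ≥0.2 ppm blood-lead reference.

import numpy as np

# Hypothetical paired calls for ten birds (True = positive).
lead_exposed = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0], bool)        # reference
protoporphyrin_pos = np.array([1, 0, 1, 0, 0, 1, 1, 0, 0, 0], bool)  # screen

overall_error = np.mean(protoporphyrin_pos != lead_exposed)
false_neg_rate = np.mean(~protoporphyrin_pos[lead_exposed])   # missed exposed birds
false_pos_rate = np.mean(protoporphyrin_pos[~lead_exposed])
print(overall_error, false_neg_rate, false_pos_rate)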
An Addendum to "A New Tool for Climatic Analysis Using Köppen Climate Classification"
ERIC Educational Resources Information Center
Larson, Paul R.; Lohrengel, C. Frederick, II
2014-01-01
The Köppen climatic classification system in a modified format is the most widely applied system in use today. Mapping and analysis of hundreds of arid and semiarid climate stations has made the use of the additional fourth letter in BW/BS climates essential. The addition of "s," "w," or "f" to the standard…
Zhao, Xin; Kuipers, Oscar P
2016-11-07
Gram-positive bacteria of the Bacillales are important producers of antimicrobial compounds that might be utilized for medical, food or agricultural applications. Thanks to the wide availability of whole genome sequence data and the development of specific genome mining tools, novel antimicrobial compounds, either ribosomally- or non-ribosomally produced, of various Bacillales species can be predicted and classified. Here, we provide a classification scheme of known and putative antimicrobial compounds in the specific context of Bacillales species. We identify and describe known and putative bacteriocins, non-ribosomally synthesized peptides (NRPs), polyketides (PKs) and other antimicrobials from 328 whole-genome sequenced strains of 57 species of Bacillales by using web based genome-mining prediction tools. We provide a classification scheme for these bacteriocins, update the findings of NRPs and PKs and investigate their characteristics and suitability for biocontrol by describing per class their genetic organization and structure. Moreover, we highlight the potential of several known and novel antimicrobials from various species of Bacillales. Our extended classification of antimicrobial compounds demonstrates that Bacillales provide a rich source of novel antimicrobials that can now readily be tapped experimentally, since many new gene clusters are identified.
Cattaneo, Ruggero; Marci, Maria Chiara; Pietropaoli, Davide; Ortu, Eleonora
2017-01-01
Dysregulation of the Autonomic Nervous System (ANS) and central pain pathways in temporomandibular disorders (TMD) is supported by growing evidence. Authors include some forms of TMD among central sensitization syndromes (CSS), a group of pathologies characterized by central morphofunctional alterations. The Central Sensitization Inventory (CSI) is useful for clinical diagnosis. Clinical examination and the CSI cannot identify the central site(s) affected in these diseases. Ultralow frequency transcutaneous electrical nerve stimulation (ULFTENS) is extensively used in TMD and in dental clinical practice because of its effects on descending pain modulation pathways. The Diagnostic Criteria for TMD (DC/TMD) are the most accurate tool for diagnosis and classification of TMD; however, it includes the CSI to investigate central aspects of TMD. Preliminary data on sensory ULFTENS show that it is a reliable tool for the study of central and autonomic pathways in TMD. An alternative classification based on the presence of central sensitization and on individual response to sensory ULFTENS is proposed. TMD may be classified into four groups: (a) TMD with central sensitization, ULFTENS responders; (b) TMD with central sensitization, ULFTENS nonresponders; (c) TMD without central sensitization, ULFTENS responders; (d) TMD without central sensitization, ULFTENS nonresponders. This pathogenic classification of TMD may help to differentiate therapy and aetiology. PMID:28932132
A Higher Level Classification of All Living Organisms
Ruggiero, Michael A.; Gordon, Dennis P.; Orrell, Thomas M.; Bailly, Nicolas; Bourgoin, Thierry; Brusca, Richard C.; Cavalier-Smith, Thomas; Guiry, Michael D.; Kirk, Paul M.
2015-01-01
We present a consensus classification of life to embrace the more than 1.6 million species already provided by more than 3,000 taxonomists’ expert opinions in a unified and coherent, hierarchically ranked system known as the Catalogue of Life (CoL). The intent of this collaborative effort is to provide a hierarchical classification serving not only the needs of the CoL’s database providers but also the diverse public-domain user community, most of whom are familiar with the Linnaean conceptual system of ordering taxon relationships. This classification is neither phylogenetic nor evolutionary but instead represents a consensus view that accommodates taxonomic choices and practical compromises among diverse expert opinions, public usages, and conflicting evidence about the boundaries between taxa and the ranks of major taxa, including kingdoms. Certain key issues, some not fully resolved, are addressed in particular. Beyond its immediate use as a management tool for the CoL and ITIS (Integrated Taxonomic Information System), it is immediately valuable as a reference for taxonomic and biodiversity research, as a tool for societal communication, and as a classificatory “backbone” for biodiversity databases, museum collections, libraries, and textbooks. Such a modern comprehensive hierarchy has not previously existed at this level of specificity. PMID:25923521
Chandonia, John-Marc; Fox, Naomi K; Brenner, Steven E
2017-02-03
SCOPe (Structural Classification of Proteins-extended, http://scop.berkeley.edu) is a database of relationships between protein structures that extends the Structural Classification of Proteins (SCOP) database. SCOP is an expert-curated ordering of domains from the majority of proteins of known structure in a hierarchy according to structural and evolutionary relationships. SCOPe classifies the majority of protein structures released since SCOP development concluded in 2009, using a combination of manual curation and highly precise automated tools, aiming to have the same accuracy as fully hand-curated SCOP releases. SCOPe also incorporates and updates the ASTRAL compendium, which provides several databases and tools to aid in the analysis of the sequences and structures of proteins classified in SCOPe. SCOPe continues high-quality manual classification of new superfamilies, a key feature of SCOP. Artifacts such as expression tags are now separated into their own class, in order to distinguish them from the homology-based annotations in the remainder of the SCOPe hierarchy. SCOPe 2.06 contains 77,439 Protein Data Bank entries, double the 38,221 structures classified in SCOP. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
Retrospective assessment of solvent exposure in paint manufacturing.
Glass, D C; Spurgeon, A; Calvert, I A; Clark, J L; Harrington, J M
1994-01-01
This paper describes how exposure to solvents at two large paint-making sites was assessed in a study carried out to investigate the possibility of neuropsychological effects resulting from long-term exposure to organic solvents. A job-exposure matrix was constructed by building and year. A detailed plant history was taken and used to identify uniform exposure periods during which workers' exposure to solvents was not thought to have changed significantly. Exposure monitoring data, collected by the company before the study, were then used to characterise exposure within each uniform exposure period. Estimates were made for periods during which no air monitoring was available. Detailed individual job histories were collected for subjects and controls and were used, together with the job-exposure matrix, to estimate exposure on an individual basis. Exposure was expressed as duration, cumulative dose, and intensity of exposure. Classification of exposure by duration alone was found to result in misclassification of subjects. PMID:7951794
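To make the exposure metrics concrete, the sketch below shows how a job-exposure matrix of this kind can be combined with an individual job history to yield duration, cumulative dose, and mean intensity. The building names, periods, and intensity values are invented for illustration; the study's own matrix is not reproduced here.

```python
# Hypothetical sketch of combining a job-exposure matrix with an individual
# job history; all field names and values below are invented.

# JEM: (building, uniform exposure period) -> mean solvent intensity (ppm)
jem = {
    ("resin_plant", "1970-1979"): 45.0,
    ("resin_plant", "1980-1989"): 20.0,
    ("filling_line", "1970-1979"): 12.0,
    ("filling_line", "1980-1989"): 6.0,
}

# One worker's history: (building, period, years worked in that period)
job_history = [
    ("resin_plant", "1970-1979", 4),
    ("filling_line", "1980-1989", 10),
]

duration = sum(years for _, _, years in job_history)
cumulative_dose = sum(jem[(b, p)] * years for b, p, years in job_history)
mean_intensity = cumulative_dose / duration

print(f"duration={duration} y, cumulative dose={cumulative_dose} ppm-years, "
      f"mean intensity={mean_intensity:.1f} ppm")
```

Classifying this worker by duration alone (14 years) would hide the fact that most of the cumulative dose came from the short, high-intensity spell, which is the misclassification problem the abstract describes.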
Observation versus classification in supervised category learning.
Levering, Kimery R; Kurtz, Kenneth J
2015-02-01
The traditional supervised classification paradigm encourages learners to acquire only the knowledge needed to predict category membership (a discriminative approach). An alternative that aligns with important aspects of real-world concept formation is learning with a broader focus to acquire knowledge of the internal structure of each category (a generative approach). Our work addresses the impact of a particular component of the traditional classification task: the guess-and-correct cycle. We compare classification learning to a supervised observational learning task in which learners are shown labeled examples but make no classification response. The goals of this work sit at two levels: (1) testing for differences in the nature of the category representations that arise from two basic learning modes; and (2) evaluating the generative/discriminative continuum as a theoretical tool for understanding learning modes and their outcomes. Specifically, we view the guess-and-correct cycle as consistent with a more discriminative approach and therefore expected it to lead to narrower category knowledge. Across two experiments, the observational mode led to greater sensitivity to distributional properties of features and correlations between features. We conclude that a relatively subtle procedural difference in supervised category learning substantially impacts what learners come to know about the categories. The results demonstrate the value of the generative/discriminative continuum as a tool for advancing the psychology of category learning and also provide a valuable constraint for formal models and associated theories.
Acceleration of Advanced CN Antidote Agents for Mass Exposure Treatments: DMTS
2014-12-01
Keywords: intraosseous injection; inhalational delivery. ... exposure models. We have administered antidotes via intramuscular injection, inhalation, and intraosseous routes. These animal models are all available for ongoing testing of the novel candidate antidotes.
ERIC Educational Resources Information Center
Greve, Kevin W.; Springer, Steven; Bianchini, Kevin J.; Black, F. William; Heinly, Matthew T.; Love, Jeffrey M.; Swift, Douglas A.; Ciota, Megan A.
2007-01-01
This study examined the sensitivity and false-positive error rate of reliable digit span (RDS) and the WAIS-III Digit Span (DS) scaled score in persons alleging toxic exposure and determined whether error rates differed from published rates in traumatic brain injury (TBI) and chronic pain (CP). Data were obtained from the files of 123 persons…
1988-10-01
Overview of the complexity analysis tool (CAT), an automated tool which will analyze mission critical computer resources (MCCR) software. CAT automates the metric for BASIC (HP-71), ATLAS (EQUATE), and Ada (subset, UNIX 5.2). CAT analyzes source code and computes complexity on a module basis. CAT also generates graphic representations of the logic flow paths.
EUVL mask dual pods to be used for mask shipping and handling in exposure tools
NASA Astrophysics Data System (ADS)
Gomei, Yoshio; Ota, Kazuya; Lystad, John; Halbmair, Dave; He, Long
2007-03-01
The concept of Extreme Ultra-Violet Lithography (EUVL) mask dual pods is proposed for use in both mask shipping and handling in exposure tools. The inner pod was specially designed to protect masks from particle contamination during shipping from mask houses to wafer factories. It can be installed in a load-lock chamber of exposure tools and evacuated while holding the mask inside. The inner pod's upper cover is removed just before the mask is installed on the mask stage. Prototypes were manufactured and tested for shipping and for vacuum cycling. Particle adders were counted through these operations at a detection limit of 54 nm and above. The adder count was close to zero; that is, the result is within the noise level of our present evaluation environment. This indicates that the present concept is highly feasible for EUVL mask shipping and handling in exposure tools.
Alvarez-Casado, Enrique; Hernandez-Soto, Aquiles; Tello, Sandoval; Gual, Rosa
2012-01-01
Occupational musculoskeletal disorders of the upper limbs, and their impact and prevalence in the workforce, are the subject of many investigations in almost all production fields. However, exposure to this kind of risk factor among urban gardeners has not been well studied so far. The plant varieties used in the parks, the tools the gardeners use, and the actions necessary for park maintenance all contribute to biomechanical overload of the upper limbs. Moreover, analysing exposure to biomechanical overload of the upper limbs in gardening work is complex, mainly because the activity is highly variable and follows an annual cycle. For this reason an analytical model for evaluating risk exposure is necessary. In this research the work activity of 29 gardeners in 3 urban parks of Barcelona was analyzed. Each park has a specific action plan related to the quantity and typology of plant species, their classification and the season of the year. On-site observation and video-recording sessions were conducted. Video recordings were made of workers without any prior musculoskeletal disorder and with a minimum work experience of 5 years. In addition, saturation time, defined as the ratio of repetitive working hours to hours of effective work, was analysed. From the video-recorded tasks, the biomechanical overload of the upper limbs was analyzed using the OCRA Checklist method. A methodological procedure to analyse risk exposure over an annual working cycle is proposed. The results provide information that can support task assignment and staff training, as well as recommendations for urban landscape design, all with the goal of decreasing the risk of developing work-related musculoskeletal disorders.
Use of job-exposure matrices to estimate occupational exposure to pesticides: A review.
Carles, Camille; Bouvier, Ghislaine; Lebailly, Pierre; Baldi, Isabelle
2017-03-01
The health effects of pesticides have been extensively studied in epidemiology, mainly in agricultural populations. However, pesticide exposure assessment remains a key methodological issue for epidemiological studies. Besides self-reported information, expert assessment or metrology, job-exposure matrices still appear to be an interesting tool. We reviewed all existing matrices assessing occupational exposure to pesticides in epidemiological studies and described the exposure parameters they included. We identified two types of matrices, (i) generic ones that are generally used in case-control studies and document broad categories of pesticides in a large range of jobs, and (ii) specific matrices, developed for use in agricultural cohorts, that generally provide exposure metrics at the active ingredient level. The various applications of these matrices in epidemiological studies have proven that they are valuable tools to assess pesticide exposure. Specific matrices are particularly promising for use in agricultural cohorts. However, results obtained with matrices have rarely been compared with those obtained with other tools. In addition, the external validity of the given estimates has not been adequately discussed. Yet, matrices would help in reducing misclassification and in quantifying cumulated exposures, to improve knowledge about the chronic health effects of pesticides.
What is the place of pre-exposure prophylaxis in HIV prevention?
De Man, Jeroen; Colebunders, Robert; Florence, Eric; Laga, Marie; Kenyon, Christopher
2013-01-01
New tools are needed to bring down ongoing high HIV incidence. This review aims to evaluate the place of one of these new tools (pre-exposure prophylaxis) in a comprehensive prevention strategy. Several trials have demonstrated the safety and the efficacy of pre-exposure prophylaxis in HIV prevention. Two large trials have, however, failed to show such efficacy. This was likely due to poor adherence in these trials. New forms of long-acting pre-exposure prophylaxis currently in trials may deal with these problems of low adherence. Pre-exposure prophylaxis has been demonstrated to be cost-effective within certain settings. The introduction of pre-exposure prophylaxis into prevention programs needs to be carefully thought through. For example, pre-exposure prophylaxis-induced risk compensation, at both an individual and population level, could undermine other aspects of a comprehensive HIV prevention program. In conclusion, pre-exposure prophylaxis could be a useful additional tool for the prevention of HIV in specific high-risk groups. It should be implemented in a way that deals with issues such as ensuring high adherence and ensuring that pre-exposure prophylaxis does not detract from, but complements, other more fundamental elements of HIV prevention programs.
Influence of nuclei segmentation on breast cancer malignancy classification
NASA Astrophysics Data System (ADS)
Jelen, Lukasz; Fevens, Thomas; Krzyzak, Adam
2009-02-01
Breast cancer is one of the most deadly cancers affecting middle-aged women. Accurate diagnosis and prognosis are crucial to reduce the high death rate. Nowadays there are numerous diagnostic tools for breast cancer diagnosis. In this paper we discuss the role of nuclear segmentation from fine needle aspiration biopsy (FNA) slides and its influence on malignancy classification. Classification of malignancy plays a very important role during the diagnosis process of breast cancer. Of all cancer diagnostic tools, FNA slides provide the most valuable information about the cancer malignancy grade, which helps to choose an appropriate treatment. This process involves assessing numerous nuclear features, so precise segmentation of nuclei is very important. In this work we compare three powerful segmentation approaches and test their impact on the classification of breast cancer malignancy. The studied approaches are level set segmentation, fuzzy c-means segmentation and textural segmentation based on the co-occurrence matrix. Segmented nuclei were used to extract nuclear features for malignancy classification. For classification purposes four different classifiers were trained and tested with the extracted features: Multilayer Perceptron (MLP), Self-Organizing Maps (SOM), Principal Component-based Neural Network (PCA) and Support Vector Machines (SVM). The presented results show that level set segmentation yields the best results of the three compared approaches, leading to good feature extraction with the lowest average error rate of 6.51% over the four classifiers. The single best performance was recorded for the multilayer perceptron, with an error rate of 3.07% using fuzzy c-means segmentation.
Using vegetation indices as input into random forest for soybean and weed classification
USDA-ARS?s Scientific Manuscript database
Weed management is a major component of a soybean (Glycine max L.) production system; thus, managers need tools to help them distinguish soybean from weeds. Vegetation indices derived from light reflectance properties of plants have shown promise as tools to enhance differences among plants. The o...
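As a sketch of the approach described (the study's own data and index choices are not reproduced, and the abstract is truncated), the following assumes per-plant band reflectances are available and feeds two common vegetation indices to a random forest; the indices, labels, and data below are illustrative only.

```python
# A minimal sketch, assuming red/NIR/green reflectances per plant or pixel;
# labels and data are synthetic, not the USDA-ARS study's measurements.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
red, nir, green = rng.random((3, 200))        # synthetic band reflectances
labels = rng.integers(0, 2, 200)              # 0 = soybean, 1 = weed (toy)

ndvi = (nir - red) / (nir + red + 1e-9)       # normalized difference vegetation index
gndvi = (nir - green) / (nir + green + 1e-9)  # green NDVI
X = np.column_stack([ndvi, gndvi])

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
print(clf.feature_importances_)               # which index best separates classes
```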
Batterman, Stuart; Burke, Janet; Isakov, Vlad; Lewis, Toby; Mukherjee, Bhramar; Robins, Thomas
2014-01-01
Vehicles are major sources of air pollutant emissions, and individuals living near large roads endure high exposures and health risks associated with traffic-related air pollutants. Air pollution epidemiology, health risk, environmental justice, and transportation planning studies would all benefit from an improved understanding of the key information and metrics needed to assess exposures, as well as the strengths and limitations of alternate exposure metrics. This study develops and evaluates several metrics for characterizing exposure to traffic-related air pollutants for the 218 residential locations of participants in the NEXUS epidemiology study conducted in Detroit (MI, USA). Exposure metrics included proximity to major roads, traffic volume, vehicle mix, traffic density, vehicle exhaust emissions density, and pollutant concentrations predicted by dispersion models. Results presented for each metric include comparisons of exposure distributions, spatial variability, intraclass correlation, concordance and discordance rates, and overall strengths and limitations. While showing some agreement, the simple categorical and proximity classifications (e.g., high diesel/low diesel traffic roads and distance from major roads) do not reflect the range and overlap of exposures seen in the other metrics. Information provided by the traffic density metric, defined as the number of kilometers traveled (VKT) per day within a 300 m buffer around each home, was reasonably consistent with the more sophisticated metrics. Dispersion modeling provided spatially- and temporally-resolved concentrations, along with apportionments that separated concentrations due to traffic emissions and other sources. While several of the exposure metrics showed broad agreement, including traffic density, emissions density and modeled concentrations, these alternatives still produced exposure classifications that differed for a substantial fraction of study participants, e.g., from 20% to 50% of homes, depending on the metric, would be incorrectly classified into “low”, “medium” or “high” traffic exposure classes. These and other results suggest the potential for exposure misclassification and the need for refined and validated exposure metrics. While data and computational demands for dispersion modeling of traffic emissions are non-trivial concerns, once established, dispersion modeling systems can provide exposure information for both on- and near-road environments that would benefit future traffic-related assessments. PMID:25226412
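The traffic density metric described above lends itself to a compact illustration. The sketch below sums vehicle-kilometres travelled (VKT) per day over road segments whose midpoints fall within a 300 m buffer around a home; the coordinates, volumes, and lengths are invented, and a real implementation would clip segment geometry to the buffer with GIS tools rather than test midpoints.

```python
# Sketch of the VKT-within-300-m traffic density metric; data are made up.
from math import hypot

# road segments: (x, y) midpoint in metres, daily traffic volume, length in km
segments = [((100, 50), 12000, 0.4), ((420, 0), 30000, 0.6), ((90, -80), 800, 0.3)]
home = (0, 0)
BUFFER_M = 300

vkt = sum(volume * length_km
          for (x, y), volume, length_km in segments
          if hypot(x - home[0], y - home[1]) <= BUFFER_M)
print(f"traffic density: {vkt:.0f} vehicle-km travelled per day within 300 m")
```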
NASA Astrophysics Data System (ADS)
Kaur, Parneet; Singh, Sukhwinder; Garg, Sushil; Harmanpreet
2010-11-01
In this paper we study classification algorithms for a farm decision support system (DSS). Classification algorithms (Limited Search, ID3, CHAID, C4.5, Improved C4.5 and One-vs-All Decision Tree) were applied to a common crop data set with a specified class attribute, and results were obtained. The tool used to derive the results is SPINA. The graphical results obtained from the tool are compared in order to suggest the best technique for developing a farm DSS. This analysis should help researchers design effective and fast DSSs that support farmers' decisions for enhancing yield.
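SPINA, the tool named in the abstract, is not publicly documented here, so as a stand-in the sketch below compares tree-style classifiers with scikit-learn on a toy data set. Note that scikit-learn's trees are CART-based; the entropy criterion only approximates the ID3/C4.5 family, and CHAID is not implemented there at all.

```python
# Cross-validated comparison of decision-tree variants; a stand-in for the
# abstract's SPINA workflow, using a built-in data set in place of crop data.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)             # stand-in for a crop data set
candidates = {
    "entropy tree (ID3/C4.5-like)": DecisionTreeClassifier(criterion="entropy"),
    "gini tree (CART)": DecisionTreeClassifier(criterion="gini"),
}
for name, clf in candidates.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```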
NASA Astrophysics Data System (ADS)
Judycka, U.; Jagiello, K.; Bober, L.; Błażejowski, J.; Puzyn, T.
2018-06-01
Chemometric tools were applied to investigate the biological behaviour of ampholytic substances in relation to their physicochemical and spectral properties. Results of the Principal Component Analysis suggest that molecular size together with electronic and spectral characteristics are the key properties required to predict the therapeutic relevance of the compounds examined. These properties were used to develop a structure-activity classification model. The model allows the therapeutic behaviour of ampholytic substances to be assessed solely from descriptor values that can be obtained computationally. The prediction is thus possible without time-consuming and expensive laboratory tests, which is its main advantage.
Kazaryan, Airazat M.; Røsok, Bård I.; Edwin, Bjørn
2013-01-01
Background. Morbidity is a cornerstone of the assessment of surgical treatment; nevertheless, surgeons have not reached broad consensus on this problem. Methods and Findings. Clavien, Dindo, and Strasberg with coauthors (1992, 2004, 2009, and 2010) made significant efforts toward the standardization of surgical morbidity (the Clavien-Dindo-Strasberg classification; last revision, the Accordion classification). However, this classification includes only postoperative complications and has two principal shortcomings: disregard of intraoperative events and confusing terminology. Postoperative events have a major impact on patient well-being. However, intraoperative events should also be recorded and reported even if they do not evidently affect the patient's postoperative well-being. The term surgical complication as applied in the Clavien-Dindo-Strasberg classification may be read as an incident resulting in a complication caused by the technical failure of surgery, in contrast to so-called medical complications. The term surgical complication therefore invites misinterpretation of perioperative morbidity. The term perioperative adverse events, comprising both intraoperative unfavourable incidents and postoperative complications, could be regarded as a better alternative. In 2005, Satava suggested a simple grading to evaluate intraoperative surgical errors. Based on that approach, we have elaborated a 3-grade classification of intraoperative incidents that can be used to grade intraoperative events in any type of surgery. Refinements have also been made to the Accordion classification of postoperative complications. Interpretation. The proposed systematization of perioperative adverse events, combining two appraisal tools, namely the elaborated classification of intraoperative incidents based on the Satava approach to surgical error evaluation together with the modified Accordion classification of postoperative complications, appears to be an effective tool for the comprehensive assessment of surgical outcomes. This concept was validated for various surgical procedures. Broad implementation of this approach will promote the development of surgical science and practice. PMID:23762627
Portnuff, Cory D F; Kleindienst, Samantha; Bogle, Jamie M
2017-09-01
Vestibular-evoked myogenic potentials (VEMPs) are commonly used clinical assessments for patients with complaints of dizziness. However, relatively high-level air-conducted stimuli are required to elicit the VEMP, and these may ultimately exceed safe noise exposure limits. Recently, research has reported the potential for noise-induced hearing loss (NIHL) from VEMP stimulus exposure, through studies of reduced otoacoustic emission levels after VEMP testing as well as a recent case study showing permanent sensorineural hearing loss associated with VEMP exposure. The purpose of this report is to review the potential for hazardous noise exposure from VEMP stimuli and to suggest clinical parameters for safe VEMP testing. Literature review with presentation of clinical guidelines and a clinical tool for estimating noise exposure. The literature surrounding VEMP stimulus-induced hearing loss is reviewed, including several cases of overexposure. The article then presents a clinical calculation tool for estimating a patient's safe noise exposure from VEMP stimuli, considering stimulus parameters, and discusses how varying stimulus parameters affects a patient's noise exposure. Finally, recommendations are provided for recognizing and managing specific patient populations who may be at higher risk of NIHL from VEMP stimulus exposure. A sample protocol is provided that allows for safe noise exposure. VEMP stimuli have the potential to cause NIHL due to high sound exposure levels. However, with proper safety protocols in place, clinicians may reduce or eliminate this risk to their patients. Use of the tools provided, including the noise exposure calculation tool and sample protocols, may help clinicians to understand and ensure safe use of VEMP stimuli. American Academy of Audiology
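The authors' calculation tool itself is not reproduced in the abstract. The sketch below shows the generic NIOSH-style equal-energy dose arithmetic on which such a calculator could rest (85 dBA criterion, 3 dB exchange rate); the stimulus levels and durations are invented placeholders, not clinical recommendations.

```python
# Not the authors' tool: a generic equal-energy noise-dose calculation that a
# VEMP safety calculator could rest on. Levels and durations are invented.
CRITERION_DB = 85.0      # dBA criterion level
CRITERION_MIN = 480.0    # 8-hour reference duration in minutes
EXCHANGE_DB = 3.0        # allowed time halves for every +3 dB

def allowed_minutes(level_db: float) -> float:
    return CRITERION_MIN * 2 ** ((CRITERION_DB - level_db) / EXCHANGE_DB)

# (effective A-weighted level in dB, exposure duration in minutes)
exposures = [(123.0, 0.5), (120.0, 0.5)]   # hypothetical VEMP runs
dose_pct = 100 * sum(minutes / allowed_minutes(db) for db, minutes in exposures)
print(f"daily noise dose: {dose_pct:.0f}% (>100% exceeds the recommended limit)")
```

Even these short hypothetical runs blow well past 100% dose, which is precisely the concern the report raises about high-level VEMP stimuli.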
MATline: a job-exposure matrix for carcinogenic chemicals.
Gilardi, Luisella; Falcone, Umberto; Santoro, Silvano; Coffano, Elena
2008-01-01
MATline is a tool that can be used to predict which industrial processes can be expected to involve the use of a substance that is considered carcinogenic as documented in the literature. The database includes agents carrying risk phrases R45, R49 and R40 according to the method of classification adopted by the EU and/or agents in categories 1, 2A and 2B as classified by the International Agency for Research on Cancer (IARC). Each agent is associated with a list of industrial processes coded according to the tariff headings used by the National Institute of Insurance against Occupational Injuries and Diseases (Istituto Nazionale per l'Assicurazione contro gli Infortuni sul Lavoro, INAIL). The main sources of information are the IARC Monographs and databases available through the National Library of Medicine's TOXNET portal. The matrix currently includes 600 carcinogenic agents, 23 classes of agents and some 7000 links between agents and industrial processes. MATline can be viewed on the www.dors.it website.
Sinclair, Lisa Bundara; Fox, Michael H.; Betts, Donald R.
2015-01-01
This article describes use of the International Classification of Functioning, Disability and Health (ICF) as a tool for strategic planning. The ICF is the international classification system for factors that influence health, including Body Structures, Body Functions, Activities and Participation and Environmental Factors. An overview of strategic planning and the ICF are provided. Selected ICF concepts and nomenclature are used to demonstrate its utility in helping develop a classic planning framework, objectives, measures and actions. Some issues and resolutions for applying the ICF are described. Applying the ICF for strategic health planning is an innovative approach that fosters the inclusion of social ecological health determinants and broad populations. If employed from the onset of planning, the ICF can help public health organizations systematically conceptualize, organize and communicate a strategic health plan. This article is a US Government work and is in the public domain in the USA. PMID:23147247
Classification in childhood disability: focusing on function in the 21st century.
Rosenbaum, Peter; Eliasson, Ann-Christin; Hidecker, Mary Jo Cooley; Palisano, Robert J
2014-08-01
Classification systems in health care are usually based on current understanding of the condition. They are often derived empirically and adopted applying sound principles of measurement science to assess whether they are reliable (consistent) and valid (true) for the purposes to which they are applied. In the past 15 years, the authors have developed and validated classification systems for specific aspects of everyday function in people with cerebral palsy--gross motor function, manual abilities, and communicative function. This article describes the approaches used to conceptualize each aspect of function, develop the tools, and assess their reliability and validity. We report on the utility of each system with respect to clinical applicability, use of these tools for research, and the uptake and impact that they have had around the world. We hope that readers will find these accounts interesting, relevant, and applicable to their daily work with children and youth with disabilities. © The Author(s) 2014.
A new epileptic seizure classification based exclusively on ictal semiology.
Lüders, H; Acharya, J; Baumgartner, C; Benbadis, S; Bleasel, A; Burgess, R; Dinner, D S; Ebner, A; Foldvary, N; Geller, E; Hamer, H; Holthausen, H; Kotagal, P; Morris, H; Meencke, H J; Noachtar, S; Rosenow, F; Sakamoto, A; Steinhoff, B J; Tuxhorn, I; Wyllie, E
1999-03-01
Historically, seizure semiology was the main feature in the differential diagnosis of epileptic syndromes. With the development of clinical EEG, the definition of electroclinical complexes became an essential tool to define epileptic syndromes, particularly focal epileptic syndromes. Modern advances in diagnostic technology, particularly in neuroimaging and molecular biology, now permit better definitions of epileptic syndromes. At the same time detailed studies showed that there does not necessarily exist a one-to-one relationship between epileptic seizures or electroclinical complexes and epileptic syndromes. These developments call for the reintroduction of an epileptic seizure classification based exclusively on clinical semiology, similar to the seizure classifications which were used by neurologists before the introduction of the modern diagnostic methods. This classification of epileptic seizures should always be complemented by an epileptic syndrome classification based on all the available clinical information (clinical history, neurological exam, ictal semiology, EEG, anatomical and functional neuroimaging, etc.). Such an approach is more consistent with mainstream clinical neurology and would avoid the current confusion between the classification of epileptic seizures (which in the International Seizure Classification is actually a classification of electroclinical complexes) and the classification of epileptic syndromes.
Kropat, Georg; Bochud, Francois; Jaboyedoff, Michel; Laedermann, Jean-Pascal; Murith, Christophe; Palacios Gruson, Martha; Baechler, Sébastien
2015-09-01
According to estimates, around 230 people die as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics and to develop mapping and predictive tools in order to improve local radon prediction. About 240 000 indoor radon concentration (IRC) measurements in about 150 000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pairwise Kolmogorov distances between the IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). The automated classification groups lithological units well in terms of their IRC characteristics. In particular, the IRC differences in metamorphic rocks such as gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional differences in IRCs in Switzerland and improve spatial detail compared with existing approaches. We could explain 33% of the variation in IRC data with random forests. Additionally, variable importance evaluated with random forests shows that building characteristics are less important predictors of IRC than spatial/geological influences. BART could explain 29% of IRC variability and produced maps that indicate the prediction uncertainty. Ensemble regression trees are a powerful tool for modelling and understanding the multidimensional influences on IRCs. Automatic clustering of lithological units complements this method by facilitating the interpretation of the radon properties of rock types. This study provides an important element for radon risk communication. Future approaches should consider further variables such as soil-gas radon measurements as well as more detailed geological information. Copyright © 2015 Elsevier Ltd. All rights reserved.
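A minimal sketch of the clustering step described above: pairwise two-sample Kolmogorov-Smirnov distances between per-unit radon samples, followed by a naive k-medoids pass. The data are synthetic and the implementation is deliberately simplified (the paper's code is not reproduced, and the toy k-medoids assumes no cluster empties during updates).

```python
# Synthetic illustration: KS distance matrix between lithological units'
# radon samples, then naive k-medoids clustering on that matrix.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
# synthetic IRC samples for a few lithological units (lognormal toy data)
units = {f"unit{i}": rng.lognormal(mean=m, sigma=0.5, size=300)
         for i, m in enumerate([4.0, 4.1, 5.0, 5.1, 6.0])}
names = list(units)
n = len(names)

D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = ks_2samp(units[names[i]], units[names[j]]).statistic

def k_medoids(D, k, iters=20):
    medoids = list(range(k))                    # naive initialisation
    for _ in range(iters):
        labels = D[:, medoids].argmin(axis=1)   # assign to nearest medoid
        medoids = [np.flatnonzero(labels == c)[
            D[np.ix_(labels == c, labels == c)].sum(axis=1).argmin()]
            for c in range(k)]                  # most central member per cluster
    return labels

print(dict(zip(names, k_medoids(D, k=2))))
```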
Bruno, Rossella; Alì, Greta; Fontanini, Gabriella
2018-01-01
Malignant pleural mesothelioma (MPM) is an aggressive tumor associated with asbestos exposure. Histopathological analysis of pleural tissues is the gold standard for diagnosis; however, it can be difficult to differentiate malignant from benign pleural lesions. The purpose of this review is to describe the most important biomarkers and new diagnostic tools suggested for this differential diagnosis. There are many studies concerning the separation between MPM and benign pleural proliferations in both pleural tissues and effusions; most of them are based on the evaluation of one or a few biomarkers by immunohistochemistry (IHC) or enzyme-linked immunosorbent assays (ELISAs), whereas others focused on the identification of MPM signatures given by microRNA (miRNA) or gene expression profiles as well as on the combination of molecular data and classification algorithms. None of the reported biomarkers showed adequate diagnostic accuracy, except for p16 [evaluated by fluorescent in situ hybridization (FISH)] and BAP1 (evaluated by IHC); both biomarkers are recommended by the International Mesothelioma Interest Group guidelines for histological and cytological diagnosis. BAP1 and p16 showed a specificity of 100% in discerning malignant from benign lesions because they are exclusively unexpressed or deleted in MPM. However, their sensitivity, even when used together, is not completely sufficient, and the absence of their alterations cannot confirm the benign nature of the lesion. Recently, the availability of new techniques and increasing knowledge regarding MPM genetics led to the definition of some molecular panels, including genes or miRNAs specifically deregulated in MPM, that are extremely valuable for differential diagnosis. Moreover, the development of classification algorithms is facilitating the application of molecular data in clinical practice. Data regarding new diagnostic tools and MPM signatures are promising; however, before their application in clinical practice, a prospective validation is necessary, as these approaches could surely improve the differential diagnosis between malignant and benign pleural lesions.
Tagiyeva, Nara; Semple, Sean; Devereux, Graham; Sherriff, Andrea; Henderson, John; Elias, Peter; Ayres, Jon G
2011-06-01
Most of the evidence on agreement between self- and proxy-reported occupational data comes from interview-based studies. The authors aimed to examine agreement between women's reports of their partner's occupation and their partner's own description, using questionnaire-based data collected as part of the prospective, population-based Avon Longitudinal Study of Parents and Children. Information on present occupation was self-reported by women's partners and proxy-reported by the women through questionnaires administered at 8 and 21 months after the birth of a child. Job titles were coded to the Standard Occupational Classification (SOC2000) using software developed by the University of Warwick (Computer-Assisted Structured Coding Tool). The accuracy of proxy report was expressed as percentage agreement and kappa coefficients for four-, three- and two-digit SOC2000 codes obtained in automatic and semiautomatic (manually improved) coding modes. Data from 6016 couples at 8 months and 5232 couples at 21 months postnatally were included in the analyses. Agreement between men's self-reported occupation and women's report of their partner's occupation in fully automatic coding mode at the four-, three- and two-digit code level was 65%, 71% and 77% at 8 months and 68%, 73% and 76% at 21 months. Agreement was slightly improved by semiautomatic coding of occupations: 73%/73%, 78%/77% and 83%/80% at 8/21 months, respectively. While this suggests that women's descriptions of their partners' occupations can be a valuable tool in epidemiological research where data from partners are not available, women and their partners disagreed at the two-digit level of SOC2000 coding in approximately one in five cases. Proxy reporting of occupation introduces a statistically significant degree of classification error. The effects of occupational misclassification by proxy reporting in retrospective occupational epidemiological studies based on questionnaire data should be considered.
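For readers unfamiliar with the agreement statistics used here, the short sketch below computes percentage agreement and Cohen's kappa on made-up two-digit occupation codes; kappa corrects the raw agreement for the agreement expected by chance.

```python
# Worked example of percentage agreement and Cohen's kappa on invented
# two-digit SOC-style codes (not the study's data).
from sklearn.metrics import cohen_kappa_score

self_report  = ["11", "11", "52", "81", "52", "11", "35", "81"]
proxy_report = ["11", "52", "52", "81", "52", "11", "35", "11"]

agreement = sum(a == b for a, b in zip(self_report, proxy_report)) / len(self_report)
kappa = cohen_kappa_score(self_report, proxy_report)
print(f"agreement={agreement:.0%}, kappa={kappa:.2f}")
```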
A simulation study to quantify the impacts of exposure ...
A simulation study to quantify the impacts of exposure measurement error on air pollution health risk estimates in copollutant time-series models. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Valtorta, Nicole K; Kanaan, Mona; Gilbody, Simon; Hanratty, Barbara
2016-04-18
We present a novel way of classifying and comparing measures of social relationships, to help readers interpret the growing literature on loneliness and social isolation and to provide researchers with a starting point to guide their choice of measuring tool. Measures of social relationships used in epidemiological studies were identified from two systematic reviews: one on the association between social relationships and health and social care service use, and a second on the association between social relationships and health. Questions from each measure were retrieved and tabulated to derive a classification of social relationship measures. We present a classification of measures according to two dimensions: (1) whether instruments cover structural or functional aspects of social relationships and (2) the degree of subjectivity asked of respondents. We explain how this classification can be used to clarify the remit of the many questionnaires used in the literature and to compare them. Different dimensions of social relationships are likely to have different implications for health. Our classification of social relationship measures transcends disciplinary and conceptual boundaries, allowing researchers to compare tools that developed from different theoretical perspectives. Careful choice of measures is essential to further our understanding of the links between social relationships and health, to identify people in need of help and to design appropriate prevention and intervention strategies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Moretti, Marta; Alves, Ines; Maxwell, Gregor
2012-02-01
This article presents the outcome of a systematic literature review exploring the applicability of the International Classification of Functioning, Disability, and Health (ICF) and its Children and Youth version (ICF-CY) at various levels and in various processes within the education systems of different countries. A systematic database search using selected search terms was conducted. The selection of studies was then refined using four protocols: inclusion and exclusion protocols at the abstract, full-text and extraction levels, along with a quality protocol. Studies exploring the direct relationship between education and the ICF/ICF-CY were sought. As expected, the results show a strong presence of studies from English-speaking countries, namely in Europe and North America. The articles were mainly published in noneducational journals. The most used ICF/ICF-CY components are activity and participation, participation, and environmental factors. The analysis of the included papers shows that the ICF/ICF-CY is currently used as a research tool, as a theoretical framework, and as a tool for implementing educational processes. The ICF/ICF-CY can provide a useful common language for the education field, where there is currently considerable disparity in theory, praxis, and research. Although the systematic literature review does not report a high incidence of use of the ICF/ICF-CY in education, the results show that the ICF/ICF-CY model and classification have the potential to be applied in education systems.
Wood Dust in Joineries and Furniture Manufacturing: An Exposure Determinant and Intervention Study.
Douwes, Jeroen; Cheung, Kerry; Prezant, Bradley; Sharp, Mark; Corbin, Marine; McLean, Dave; 't Mannetje, Andrea; Schlunssen, Vivi; Sigsgaard, Torben; Kromhout, Hans; LaMontagne, Anthony D; Pearce, Neil; McGlothlin, James D
2017-05-01
To assess wood dust exposures and determinants in joineries and furniture manufacturing and to evaluate the efficacy of specific interventions on dust emissions under laboratory conditions. Also, in a subsequent follow-up study in a small sample of joinery workshops, we aimed to develop, implement, and evaluate a cost-effective and practicable intervention to reduce dust exposures. Personal inhalable dust (n = 201) was measured in 99 workers from 10 joineries and 3 furniture-making factories. To assess exposure determinants, full-shift video exposure monitoring (VEM) was conducted in 19 workers and task-based VEM in 32 workers (in 7 joineries and 3 furniture factories). We assessed the efficacy of vacuum extraction on hand tools and the use of vacuum cleaners instead of sweeping and dry wiping under laboratory conditions. These measures were subsequently implemented in three joinery workshops with 'high' (>4 mg m-3) and one with 'low' (<2 mg m-3) baseline exposures. We also included two control workshops (one 'low' and one 'high' exposure workshop) in which no interventions were implemented. Exposures were measured 4 months prior and 4 months following the intervention. Average (geometric means) exposures in joinery and furniture making were 2.5 mg m-3 [geometric standard deviations (GSD) 2.5] and 0.6 mg m-3 (GSD 2.3), respectively. In joinery workers cleaning was associated with a 3.0-fold higher (P < 0.001) dust concentration compared to low exposure tasks (e.g. gluing), while the use of hand tools showed 3.0- to 11.0-fold higher (P < 0.001) exposures. In furniture makers, we found a 5.4-fold higher exposure (P < 0.001) with using a table/circular saw. Laboratory efficiency experiments showed a 10-fold decrease in exposure (P < 0.001) when using a vacuum cleaner. Vacuum extraction on hand tools combined with a downdraft table reduced exposures by 42.5% for routing (P < 0.1) and 85.5% for orbital sanding (P < 0.001). Following intervention measures in joineries, a borderline statistically significant (P < 0.10) reduction in exposure of 30% was found in workshops with 'high' baseline exposures, but no reduction was shown in the workshop with 'low' baseline exposures. Wood dust exposure is high in joinery workers and (to a lesser extent) furniture makers with frequent use of hand tools and cleaning being key drivers of exposure. Vacuum extraction on hand tools and alternative cleaning methods reduced workplace exposures substantially, but may be insufficient to achieve compliance with current occupational exposure limits. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
Englert, H; Champion, D; Wu, J C; Giallussi, J; McGrath, M; Manolios, N
2011-02-01
In a patient with early topoisomerase antibody-positive scleroderma, antinuclear antibody positivity was fortuitously observed to predate nailfold capillaroscopy changes. Using this case as a template, the prediagnostic phase of the presumed multifactorial disease may be divided into five temporal phases: phase 1, conception and the intrauterine environment; phase 2, the extrauterine environment predating environmental exposure; phase 3, the early post-environmental-exposure interval with no detectable perturbation of body status; phase 4, the post-environmental-exposure interval characterized by autoantibody production and microvascular changes; and phase 5, the symptomatic clinical prediagnostic interval (Raynaud's, skin, musculoskeletal, gastrointestinal, cardiorespiratory) prompting scleroderma diagnosis. Temporal classification of prescleroderma aids in both the understanding and the definition of scleroderma 'onset'. If altered nailfold capillaries and autoantibodies develop at comparable rates, and if the findings from this case (that autoantibody changes precede microvascular changes) are truly representative of the preclinical disease phase, then these findings argue that the disease evolves from within the vessel outwards, rather than vice versa. © 2011 The Authors. Internal Medicine Journal © 2011 Royal Australasian College of Physicians.
The development of the WHO Recommended Classification of Pesticides by Hazard*
Copplestone, J. F.
1988-01-01
The WHO Recommended Classification of Pesticides by Hazard, which was originally intended as a tool to aid international harmonization of pesticide registration, is already 13 years old. Over the years, it has been refined, and it is now accepted by many countries and international organizations. The story of its development illustrates well an international approach to problems as they have arisen. PMID:3264763
ERIC Educational Resources Information Center
Markey, Karen; Demeyer, Anh N.
This research project focuses on the implementation and testing of the Dewey Decimal Classification (DDC) system as an online searcher's tool for subject access, browsing, and display in an online catalog. The research project comprises 12 activities. The three interim reports in this document cover the first seven of these activities: (1) obtain…
Multiuser Transmit Beamforming for Maximum Sum Capacity in Tactical Wireless Multicast Networks
2006-08-01
...commonly used extended Kalman filter. See [2, 5, 6] for recent tutorial overviews. In particle filtering, continuous distributions are approximated by ... signals (using and developing associated particle filtering tools). Our work on these topics has been reported in seven (IEEE, SIAM) journal papers and ... Keywords: multidimensional scaling, tracking, intercept, particle filters.
Machine Learning for Biological Trajectory Classification Applications
NASA Technical Reports Server (NTRS)
Sbalzarini, Ivo F.; Theriot, Julie; Koumoutsakos, Petros
2002-01-01
Machine-learning techniques, including clustering algorithms, support vector machines and hidden Markov models, are applied to the task of classifying trajectories of moving keratocyte cells. The different algorithms are compared to each other as well as to expert and non-expert test persons, using concepts from signal-detection theory. The algorithms performed very well compared to humans, suggesting a robust tool for trajectory classification in biological applications.
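One of the pipelines the abstract alludes to can be sketched as hand-crafted trajectory features fed to a support vector machine. The trajectories, features, and labels below are synthetic stand-ins, not the keratocyte data, and the feature set (mean speed, speed variability, straightness) is a common but illustrative choice.

```python
# Toy trajectory classification: summary features per track, then an SVM.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

def features(traj):
    steps = np.diff(traj, axis=0)
    speed = np.linalg.norm(steps, axis=1)
    net = np.linalg.norm(traj[-1] - traj[0])
    # mean speed, speed variability, straightness (net / path length)
    return [speed.mean(), speed.std(), net / (speed.sum() + 1e-9)]

# two synthetic motion regimes: directed drift vs. pure diffusion
directed = [np.cumsum(rng.normal([1.0, 0.0], 0.3, (50, 2)), axis=0) for _ in range(40)]
diffusive = [np.cumsum(rng.normal(0.0, 0.5, (50, 2)), axis=0) for _ in range(40)]
X = np.array([features(t) for t in directed + diffusive])
y = np.array([0] * 40 + [1] * 40)

print(cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())
```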
Structure-based classification and ontology in chemistry
2012-01-01
Background: Recent years have seen an explosion in the availability of data in the chemistry domain. With this information explosion, however, retrieving relevant results from the available information, and organising those results, become even harder problems. Computational processing is essential to filter and organise the available resources so as to better facilitate the work of scientists. Ontologies encode expert domain knowledge in a hierarchically organised machine-processable format. One such ontology for the chemical domain is ChEBI. ChEBI provides a classification of chemicals based on their structural features and a role or activity-based classification. An example of a structure-based class is 'pentacyclic compound' (compounds containing five-ring structures), while an example of a role-based class is 'analgesic', since many different chemicals can act as analgesics without sharing structural features. Structure-based classification in chemistry exploits elegant regularities and symmetries in the underlying chemical domain. As yet, there has been neither a systematic analysis of the types of structural classification in use in chemistry nor a comparison to the capabilities of available technologies. Results: We analyze the different categories of structural classes in chemistry, presenting a list of patterns for features found in class definitions. We compare these patterns of class definition to tools which allow for automation of hierarchy construction within cheminformatics and within logic-based ontology technology, going into detail in the latter case with respect to the expressive capabilities of the Web Ontology Language and recent extensions for modelling structured objects. Finally we discuss the relationships and interactions between cheminformatics approaches and logic-based approaches. Conclusion: Systems that perform intelligent reasoning tasks on chemistry data require a diverse set of underlying computational utilities including algorithmic, statistical and logic-based tools. For the task of automatic structure-based classification of chemical entities, essential to managing the vast swathes of chemical data being brought online, systems which are capable of hybrid reasoning combining several different approaches are crucial. We provide a thorough review of the available tools and methodologies, and identify areas of open research. PMID:22480202
Drug safety: Pregnancy rating classifications and controversies.
Wilmer, Erin; Chai, Sandy; Kroumpouzos, George
2016-01-01
This contribution consolidates data on international pregnancy rating classifications, including the former US Food and Drug Administration (FDA), Swedish, and Australian classification systems, as well as the evidence-based medicine system, and discusses discrepancies among them. It reviews the new Pregnancy and Lactation Labeling Rule (PLLR) that replaced the former FDA labeling system with narrative-based labeling requirements. The PLLR emphasizes human data and highlights pregnancy exposure registry information. In this context, the review discusses important data on the safety of most medications used in the management of skin disease in pregnancy. There are also discussions of controversies relevant to the safety of certain dermatologic medications during gestation. Copyright © 2016 Elsevier Inc. All rights reserved.
Toward an Attention-Based Diagnostic Tool for Patients With Locked-in Syndrome.
Lesenfants, Damien; Habbal, Dina; Chatelle, Camille; Soddu, Andrea; Laureys, Steven; Noirhomme, Quentin
2018-03-01
Electroencephalography (EEG) has been proposed as a supplemental tool for reducing clinical misdiagnosis in severely brain-injured populations, helping to distinguish conscious from unconscious patients. We studied the use of spectral entropy as a measure of focal attention in order to develop a motor-independent, portable, and objective diagnostic tool for patients with locked-in syndrome (LIS), addressing the issues of accuracy and training requirements. Data from 20 healthy volunteers, 6 LIS patients, and 10 patients with vegetative state/unresponsive wakefulness syndrome (VS/UWS) were included. Spectral entropy was computed during a gaze-independent two-class (attention vs rest) paradigm and compared with classification based on EEG rhythms (delta, theta, alpha, and beta). Spectral entropy classification during the attention-rest paradigm showed 93% and 91% accuracy in healthy volunteers and LIS patients, respectively. VS/UWS patients were at chance level. EEG rhythm classification reached a lower accuracy than spectral entropy. Resting-state EEG spectral entropy could not distinguish individual VS/UWS patients from LIS patients. The present study provides evidence that an EEG-based measure of attention could detect command-following in patients with severe motor disabilities. The entropy system detected a response to command in all healthy subjects and LIS patients, while none of the VS/UWS patients showed a response to command using this system.
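Spectral entropy, the measure at the heart of this approach, is straightforward to compute: normalise a power spectrum to a probability distribution and take its Shannon entropy. The sketch below does this for a single synthetic channel; the sampling rate, frequency band, and signals are placeholders, not the study's settings.

```python
# A minimal sketch of normalised spectral entropy on one EEG-like channel.
import numpy as np
from scipy.signal import welch

def spectral_entropy(x, fs, band=(1.0, 45.0)):
    f, psd = welch(x, fs=fs, nperseg=fs * 2)
    keep = (f >= band[0]) & (f <= band[1])
    p = psd[keep] / psd[keep].sum()                   # normalise to a distribution
    return -np.sum(p * np.log2(p)) / np.log2(p.size)  # scaled to 0..1

fs = 250
t = np.arange(0, 10, 1 / fs)
rest = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)  # alpha-like, narrowband
attend = np.random.randn(t.size)                                   # broadband
print(spectral_entropy(rest, fs), spectral_entropy(attend, fs))
```

A narrowband (rhythm-dominated) spectrum yields low entropy, a broadband one high entropy, which is the contrast a two-class attention-vs-rest classifier can exploit.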
Research on Remote Sensing Geological Information Extraction Based on Object Oriented Classification
NASA Astrophysics Data System (ADS)
Gao, Hui
2018-04-01
Northern Tibet lies in the sub-cold arid climate zone of the plateau. It is rarely visited, and geological working conditions are very poor; however, stratum exposures are good and human interference is minimal. Research on the automatic classification and extraction of remote sensing geological information there is therefore both representative and of good application prospect. Using object-oriented classification of Worldview2 high-resolution remote sensing data for northern Tibet, combined with tectonic information and image enhancement, the lithological spectral features, shape features, spatial locations and topological relations of various geological units were extracted. With thresholds set within a hierarchical classification, eight kinds of geological information were classified and extracted. Comparison with existing geological maps shows an overall accuracy of 87.8561%, indicating that the object-oriented method is effective and feasible for this study area and provides a new approach for the automatic extraction of remote sensing geological information.
A Critical Review of Options for Tool and Workpiece Sensing
1989-06-02
..."Tool Temperature Control." International Machine Tool Design Res., Vol. 7, pp. 465-75, 1967. 5. Cook, N. H., Subramanian, K., and Basile, S. A. ... Keywords: detectors, sensor characteristics, control equipment, process control. The report will provide conceptual designs and recommend a system.
Heat Measurements in Electrolytic Metal-Deuteride Experiments
2015-10-16
...zirconia, and zeolites) prepared by Dr. D. Kidwell at NRL, we attempted to measure excess energy and He production. After operating tens of experiments ... we have found that D2 exposure to Pd-filled zeolites and PdNiZrOx catalysts leads to higher temperatures than does H2 exposure. However, we have not ... Keywords: reactions, SuperWave™, electrolysis, deuterium, zeolite, silica, yttria-stabilized zirconia, palladium.
Fine pattern replication on 10 x 10-mm exposure area using ETS-1 laboratory tool in HIT
NASA Astrophysics Data System (ADS)
Hamamoto, K.; Watanabe, Takeo; Hada, Hideo; Komano, Hiroshi; Kishimura, Shinji; Okazaki, Shinji; Kinoshita, Hiroo
2002-07-01
Using the ETS-1 laboratory tool at the Himeji Institute of Technology (HIT), fine patterns were replicated with a Cr mask in static exposure over an area of 10 mm by 2 mm: line-and-space patterns with a width of 60 nm, isolated lines with a width of 40 nm, and hole patterns with a width of 150 nm. Synchronous scanning of the mask and wafer with the EUVL laboratory tool, whose reduction optics consist of three aspherical mirrors, at the NewSUBARU facility succeeded in forming 60 nm line-and-space patterns over an exposure region of 10 mm by 10 mm. Comparison of the exposure characteristics of positive-tone resists for KrF and EB shows that the KrF chemically amplified resist has better characteristics than the EB chemically amplified resist.
Belotti, Francesco; Doglietto, Francesco; Schreiber, Alberto; Ravanelli, Marco; Ferrari, Marco; Lancini, Davide; Rampinelli, Vittorio; Hirtler, Lena; Buffoli, Barbara; Bolzoni Villaret, Andrea; Maroldi, Roberto; Rodella, Luigi Fabrizio; Nicolai, Piero; Fontanella, Marco Maria
2018-01-01
Endoscopic visualization does not necessarily correspond to an adequate working space. The need for balancing invasiveness and adequacy of sellar tumor exposure has recently led to the description of multiple endoscopic endonasal transsphenoidal approaches. Comparative anatomic data on these variants are lacking. We sought to quantitatively compare endoscopic endonasal transsphenoidal approaches to the sella and parasellar region, using the concept of "surgical pyramid." Four endoscopic transsphenoidal approaches were performed in 10 injected specimens: 1) hemisphenoidotomy; 2) transrostral; 3) extended transrostral (with superior turbinectomy); and 4) extended transrostral with posterior ethmoidectomy. ApproachViewer software (part of GTx-Eyes II, University Health Network, Toronto, Canada) with a dedicated navigation system was used to quantify the surgical pyramid volume, as well as exposure of sellar and parasellar areas. Statistical analyses were performed with Friedman's tests and Nemenyi's procedure. Hemisphenoidotomy provided limited exposure of the sellar area and a small working volume. A transrostral approach was necessary to expose the entire sella. Exposure of lateral parasellar areas required superior turbinectomy or posterior ethmoidectomy. The differences between each of the modules was statistically significant. The present study validates, from an anatomic point of view, a modular classification of endoscopic endonasal transsphenoidal approaches to the sellar region. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Tarando, Sebastian Roberto; Fetita, Catalin; Brillet, Pierre-Yves
2017-03-01
Infiltrative lung diseases are a class of irreversible, non-neoplastic lung pathologies requiring regular follow-up with CT imaging. Quantifying the evolution of the patient status imposes the development of automated classification tools for lung texture. Traditionally, such classification relies on a two-dimensional analysis of axial CT images. This paper proposes a cascade of existing CNN-based CAD systems, specifically tuned for this task. The advantage of using a deep learning approach is a better regularization of the classification output. In a preliminary evaluation, the combined approach was tested on a 13-patient database of various lung pathologies, showing an increase of 10% in true positive rate (TPR) with respect to the best-suited state-of-the-art CNN for this task.
Link prediction boosted psychiatry disorder classification for functional connectivity network
NASA Astrophysics Data System (ADS)
Li, Weiwei; Mei, Xue; Wang, Hao; Zhou, Yu; Huang, Jiashuang
2017-02-01
Functional connectivity network (FCN) is an effective tool for psychiatric disorder classification, and represents the cross-correlation of regional blood oxygenation level dependent signals. However, an FCN is often incomplete, suffering from missing and spurious edges. To accurately classify psychiatric disorders and healthy controls with incomplete FCNs, we first "repair" the FCN with link prediction, and then extract the clustering coefficients as features to build a weak classifier for every FCN. Finally, we apply a boosting algorithm to combine these weak classifiers to improve classification accuracy. Our method was tested on three psychiatric disorder datasets, covering Alzheimer's disease, schizophrenia, and attention deficit hyperactivity disorder. The experimental results show that our method not only significantly improves classification accuracy, but also efficiently reconstructs the incomplete FCN.
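The pipeline lends itself to a compact sketch. The Python fragment below is an illustration on synthetic graphs, not the authors' code: the common-neighbour link predictor, toy edge densities, and labels are all assumptions standing in for the paper's method and fMRI data.

```python
# Sketch: repair an incomplete FCN with a common-neighbour link predictor,
# extract clustering coefficients as features, combine weak learners by boosting.
import numpy as np
import networkx as nx
from sklearn.ensemble import AdaBoostClassifier

def repair_fcn(G, n_new_edges=5):
    """Add the highest-scoring missing edges by common-neighbour count."""
    scores = [(len(list(nx.common_neighbors(G, u, v))), u, v)
              for u in G for v in G if u < v and not G.has_edge(u, v)]
    for _, u, v in sorted(scores, reverse=True)[:n_new_edges]:
        G.add_edge(u, v)
    return G

def fcn_features(G):
    """Node-wise clustering coefficients as a fixed-length feature vector."""
    cc = nx.clustering(G)
    return np.array([cc[n] for n in sorted(G.nodes)])

n_nodes, n_subjects = 30, 40
X, y = [], []
for i in range(n_subjects):
    label = i % 2                     # 0 = control, 1 = patient (toy labels)
    p = 0.15 if label == 0 else 0.25  # toy group difference in edge density
    G = nx.gnp_random_graph(n_nodes, p, seed=i)
    X.append(fcn_features(repair_fcn(G)))
    y.append(label)

# AdaBoost over decision stumps plays the role of the boosted weak classifiers.
clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(np.array(X), y)
print("training accuracy:", clf.score(np.array(X), y))
```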
Invariant approach to the character classification
NASA Astrophysics Data System (ADS)
Šariri, Kristina; Demoli, Nazif
2008-04-01
Image moment analysis is a very useful tool that allows image description invariant to translation, rotation, scale change, and some types of image distortion. The aim of this work was the development of a simple method for fast and reliable classification of characters using Hu's and affine moment invariants. Euclidean distance was used as the discrimination feature, with statistical parameters estimated. The method was tested on classification of Times New Roman letters as well as sets of handwritten characters. It is shown that using all of Hu's and three affine invariants as the discrimination set improves the recognition rate by 30%.
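For concreteness, here is a minimal sketch of moment-invariant matching, assuming OpenCV's cv2.moments/cv2.HuMoments and toy glyph images in place of scanned characters (the affine invariants used in the paper are not reproduced).

```python
# Sketch: classify a shape by Euclidean distance over log-scaled Hu invariants.
import cv2
import numpy as np

def hu_features(img):
    """Log-scaled Hu moment invariants of a binary character image."""
    hu = cv2.HuMoments(cv2.moments(img)).flatten()
    # Log transform compresses the large dynamic range of the invariants.
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

# Toy "characters": a filled square and a filled circle (stand-ins for glyphs).
square = np.zeros((64, 64), np.uint8); square[16:48, 16:48] = 255
circle = np.zeros((64, 64), np.uint8); cv2.circle(circle, (32, 32), 16, 255, -1)
templates = {"square": hu_features(square), "circle": hu_features(circle)}

# A shifted square should still match: Hu moments are translation invariant.
test = np.zeros((64, 64), np.uint8); test[8:40, 20:52] = 255
d = {k: np.linalg.norm(hu_features(test) - v) for k, v in templates.items()}
print(min(d, key=d.get))  # expected: "square"
```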
Automated structural classification of lipids by machine learning.
Taylor, Ryan; Miller, Ryan H; Miller, Ryan D; Porter, Michael; Dalgleish, James; Prince, John T
2015-03-01
Modern lipidomics is largely dependent upon structural ontologies because of the great diversity exhibited in the lipidome, but no automated lipid classification exists to facilitate this partitioning. The size of the putative lipidome far exceeds the number currently classified, despite a decade of work. Automated classification would benefit ongoing classification efforts by decreasing the time needed and increasing the accuracy of classification while providing classifications for mass spectral identification algorithms. We introduce a tool that automates classification into the LIPID MAPS ontology of known lipids with >95% accuracy and novel lipids with 63% accuracy. The classification is based upon simple chemical characteristics and modern machine learning algorithms. The decision trees produced are intelligible and can be used to clarify implicit assumptions about the current LIPID MAPS classification scheme. These characteristics and decision trees are made available to facilitate alternative implementations. We also discovered many hundreds of lipids that are currently misclassified in the LIPID MAPS database, strongly underscoring the need for automated classification. Source code and chemical characteristic lists as SMARTS search strings are available under an open-source license at https://www.github.com/princelab/lipid_classifier. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
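The described feature scheme, substructure presence feeding a decision tree, can be sketched as below; the SMARTS patterns, SMILES strings, and labels here are illustrative stand-ins, not the published characteristic lists (which are in the authors' repository).

```python
# Sketch: SMARTS-based binary features + a decision tree, per the paper's scheme.
from rdkit import Chem
from sklearn.tree import DecisionTreeClassifier

# A few hypothetical structural characteristics encoded as SMARTS patterns.
smarts = {"carboxyl": "C(=O)O", "ester": "C(=O)OC", "phosphate": "P(=O)(O)O"}
patterns = {k: Chem.MolFromSmarts(v) for k, v in smarts.items()}

def featurize(smiles):
    mol = Chem.MolFromSmiles(smiles)
    return [int(mol.HasSubstructMatch(p)) for p in patterns.values()]

# Toy training set: two fatty acids vs. two esters (labels are illustrative).
train = [("CCCCCCCCCCCCCCCC(=O)O", "fatty acid"),
         ("CCCCCCCCCC(=O)O", "fatty acid"),
         ("CCCCCCCC(=O)OC", "ester"),
         ("CCCC(=O)OCC", "ester")]
X = [featurize(s) for s, _ in train]
y = [label for _, label in train]

tree = DecisionTreeClassifier().fit(X, y)
print(tree.predict([featurize("CCCCCC(=O)OC")]))  # expected: ['ester']
```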
The feasibility of adapting a population-based asthma-specific job exposure matrix (JEM) to NHANES.
McHugh, Michelle K; Symanski, Elaine; Pompeii, Lisa A; Delclos, George L
2010-12-01
To determine the feasibility of applying a job exposure matrix (JEM) for classifying exposures to 18 asthmagens in the National Health and Nutrition Examination Survey (NHANES), 1999-2004, we cross-referenced 490 National Center for Health Statistics job codes used to develop the 40 NHANES occupation groups with 506 JEM job titles and assessed homogeneity in asthmagen exposure across job codes within each occupation group. In total, 399 job codes corresponded to one JEM job title, 32 to more than one job title, and 59 were not in the JEM. Three occupation groups had the same asthmagen exposure across job codes, 11 had no asthmagen exposure, and 26 groups had heterogeneous exposures across job codes. The NHANES classification of occupations limits the use of the JEM to evaluate the association between workplace exposures and asthma, and more refined occupational data are needed to enhance work-related injury/illness surveillance efforts.
Fei, Baowei; Yang, Xiaofeng; Nye, Jonathon A.; Aarsvold, John N.; Raghunath, Nivedita; Cervo, Morgan; Stark, Rebecca; Meltzer, Carolyn C.; Votaw, John R.
2012-01-01
Purpose: Combined MR/PET is a relatively new, hybrid imaging modality. A human MR/PET prototype system consisting of a Siemens 3T Trio MR and a brain PET insert was installed and tested at our institution. Its present design does not offer measured attenuation correction (AC) using traditional transmission imaging. This study describes the development of quantification tools, including MR-based AC, for quantification in combined MR/PET brain imaging. Methods: The developed quantification tools include image registration, segmentation, classification, and MR-based AC. These components were integrated into a single scheme for processing MR/PET data. The segmentation method is multiscale and based on the Radon transform of brain MR images. It was developed to segment the skull on T1-weighted MR images. A modified fuzzy C-means classification scheme was developed to classify brain tissue into gray matter, white matter, and cerebrospinal fluid. Classified tissue is assigned an attenuation coefficient so that AC factors can be generated. PET emission data are then reconstructed using a three-dimensional ordered sets expectation maximization method with the MR-based AC map. Ten subjects had separate MR and PET scans. The PET data with [11C]PIB were acquired using a high-resolution research tomograph (HRRT). MR-based AC was compared with transmission (TX)-based AC on the HRRT. Seventeen volumes of interest were drawn manually on each subject image to compare the PET activities between the MR-based and TX-based AC methods. Results: For skull segmentation, the overlap ratio between our segmented results and the ground truth is 85.2 ± 2.6%. Attenuation correction results from the ten subjects show that the difference between the MR- and TX-based methods was <6.5%. Conclusions: MR-based AC compared favorably with conventional transmission-based AC. Quantitative tools including registration, segmentation, classification, and MR-based AC have been developed for use in combined MR/PET. PMID:23039679
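The tissue-classification step can be illustrated with a generic (unmodified) fuzzy C-means, sketched below on toy 1-D intensities; the paper's modified scheme and actual MR data are not reproduced here.

```python
# Sketch: standard fuzzy C-means, the basis of the tissue-classification step.
import numpy as np

def fuzzy_c_means(x, c=3, m=2.0, n_iter=100, seed=0):
    """x: (N, d) data; returns memberships (N, c) and centroids (c, d)."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)            # memberships sum to 1
    for _ in range(n_iter):
        w = u ** m
        centers = (w.T @ x) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / d ** (2 / (m - 1))             # standard FCM membership update
        u /= u.sum(axis=1, keepdims=True)
    return u, centers

# Toy 1-D "intensities" drawn from three tissue-like modes (CSF/GM/WM stand-ins).
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(mu, 5, 200) for mu in (40, 110, 180)])[:, None]
u, centers = fuzzy_c_means(x)
print(np.sort(centers.ravel()))   # should recover roughly 40, 110, 180
```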
Farran, Bassam; Channanath, Arshad Mohamed; Behbehani, Kazem; Thanaraj, Thangavel Alphonse
2013-05-14
We build classification models and risk assessment tools for diabetes, hypertension and comorbidity using machine-learning algorithms on data from Kuwait. We model the increased proneness of diabetic patients to develop hypertension and vice versa. We ascertain the importance of ethnicity (natives vs expatriate migrants) and of using regional data in risk assessment. Retrospective cohort study. Four machine-learning techniques were used: logistic regression, k-nearest neighbours (k-NN), multifactor dimensionality reduction and support vector machines. The study uses fivefold cross-validation to obtain generalisation accuracies and errors. Kuwait Health Network (KHN), which integrates data from primary health centres and hospitals in Kuwait. 270 172 hospital visitors (of whom 89 858 are diabetic, 58 745 hypertensive and 30 522 comorbid), comprising Kuwaiti natives and Asian and Arab expatriates. Incident type 2 diabetes, hypertension and comorbidity. Classification accuracies of >85% (for diabetes) and >90% (for hypertension) are achieved using only simple non-laboratory-based parameters. Risk assessment tools based on k-NN classification models are able to assign 'high' risk to 75% of diabetic patients and to 94% of hypertensive patients; only 5% of diabetic patients are assigned 'low' risk. Asian-specific models and assessments perform even better. Pathological conditions of diabetes in the general population or in the hypertensive population, and those of hypertension, are modelled. Two-stage aggregate classification models and risk assessment tools, built by combining the component models on diabetes (or on hypertension), perform better than the individual models. Data on diabetes, hypertension and comorbidity from the cosmopolitan State of Kuwait are available for the first time, which enabled us to apply four different case-control models to assess risks. These tools aid in preliminary non-intrusive assessment of the population. Ethnicity is significant to the predictive models, and risk assessments need to be developed using regional data, as we demonstrate by applying the American Diabetes Association online calculator to data from Kuwait.
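A hedged sketch of the k-NN modelling-plus-risk-banding idea follows, on synthetic stand-in data; the features, thresholds, and risk bands here are illustrative assumptions, not those of the KHN study.

```python
# Sketch: k-NN classifier with fivefold CV, then risk bands from class fractions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 1000
# Simple non-laboratory parameters: age, BMI, ethnicity code (toy data).
X = np.column_stack([rng.normal(45, 12, n), rng.normal(28, 5, n),
                     rng.integers(0, 3, n)])
y = (0.05 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 1, n) > 5.5).astype(int)

model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=15))
print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

# Risk assessment: convert neighbourhood class fractions into risk bands.
model.fit(X, y)
proba = model.predict_proba(X)[:, 1]
risk = np.where(proba >= 0.6, "high", np.where(proba <= 0.2, "low", "medium"))
print(dict(zip(*np.unique(risk, return_counts=True))))
```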
Towards linking patients and clinical information: detecting UMLS concepts in e-mail.
Brennan, Patricia Flatley; Aronson, Alan R
2003-01-01
The purpose of this project is to explore the feasibility of detecting terms within the electronic messages of patients that could be used to effectively search electronic knowledge resources and bring health information resources into the hands of patients. Our team is exploring the application of the natural language processing (NLP) tools built within the Lister Hill Center at the National Library of Medicine (NLM) to the challenge of detecting relevant concepts from the Unified Medical Language System (UMLS) within the free text of lay people's electronic messages (e-mail). We obtained a sample of electronic messages sent by patients participating in a randomized field evaluation of an internet-based home care support service to the project nurse, and we subjected elements of these messages to a series of analyses using several vocabularies from the UMLS Metathesaurus and the selected NLP tools. The nursing vocabularies provide an excellent starting point for this exercise because their domain encompasses patient's responses to health challenges. In successive runs we augmented six nursing vocabularies (NANDA Nursing Diagnosis, Nursing Interventions Classification, Nursing Outcomes Classification, Home Health Classification, Omaha System, and the Patient Care Data Set) with selected sets of clinical terminologies (International Classification of Primary Care; International Classification of Primary Care- American English; Micromedex DRUGDEX; National Drug Data File; Thesaurus of Psychological Terms; WHO Adverse Drug Reaction Terminology) and then additionally with either Medical Subject Heading (MeSH) or SNOMED International terms. The best performance was obtained when the nursing vocabularies were complemented with selected clinical terminologies. These findings have implications not only for facilitating lay people's access to electronic knowledge resources but may also be of assistance in developing new tools to aid in linking free text (e.g., clinical notes) to lexically complex knowledge resources such as those emerging from the Human Genome Project.
Evans, Melissa; Hocking, Clare; Kersten, Paula
2017-12-01
The aim of this study was to evaluate whether the Extended International Classification of Functioning, Disability and Health Core Set for Stroke (EICSS) captured the interventions of a community stroke rehabilitation team situated in a large city in New Zealand. It was proposed that the results would identify the contribution of each discipline, and the gaps and differences in service provision to Māori and non-Māori. Applying the EICSS in this way would also inform whether this core set should be adopted in New Zealand. Interventions were retrospectively extracted from 18 medical records and linked to the International Classification of Functioning, Disability and Health and the EICSS. The frequencies of linked interventions and the health discipline providing each intervention were calculated. Analysis revealed that 98.8% of interventions provided by the rehabilitation team could be linked to the EICSS, with more interventions for body function and structure than for activities and participation, no interventions for emotional concerns, and limited interventions for community, social and civic life. The results support previous recommendations for additions to the EICSS, support its use in New Zealand, and demonstrate its use as a quality assurance tool that can evaluate the scope and practice of a rehabilitation service. Implications for Rehabilitation: The EICSS appears to represent the stroke interventions of a community stroke rehabilitation team in New Zealand. As a result, researchers and clinicians may have increased confidence in using this core set in research and clinical practice. The EICSS can be used as a quality assurance tool to establish whether a community stroke rehabilitation team is meeting the functional needs of its stroke population.
Community-Focused Exposure and Risk Screening Tool (C-FERST): Introduction and Demonstration
Public Need: Communities and decision makers are concerned about where they live, work, and play. C-FERST is a user-friendly tool that helps: Identify environmental issues in communities; Learn about these issues; Explore exposure and risk reduction options.
INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT
A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...
To address this need, new tools have been created for characterizing, simulating, and evaluating chemical biokinetics. Physiologically-based pharmacokinetic (PBPK) models provide estimates of chemical exposures that produce potentially hazardous tissue concentrations, while tissu...
ERIC Educational Resources Information Center
Ford, Jeremy W.; Missall, Kristen N.; Hosp, John L.; Kuhle, Jennifer L.
2016-01-01
Advances in maze selection curriculum-based measurement have led to several published tools with technical information for interpretation (e.g., norms, benchmarks, cut-scores, classification accuracy) that have increased their usefulness for universal screening. A range of scoring practices have emerged for evaluating student performance on maze…
Williams, Bradley S; D'Amico, Ellen; Kastens, Jude H; Thorp, James H; Flotemersch, Joseph E; Thoms, Martin C
2013-09-01
River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchical classification procedures aid in HP identification by grouping sections of river based on their hydrogeomorphic character; however, collecting data required for such procedures with field-based methods is often impractical. We developed a set of GIS-based tools that facilitate rapid, low cost riverine landscape characterization and FPZ classification. Our tools, termed RESonate, consist of a custom toolbox designed for ESRI ArcGIS®. RESonate automatically extracts 13 hydrogeomorphic variables from readily available geospatial datasets and datasets derived from modeling procedures. An advanced 2D flood model, FLDPLN, designed for MATLAB® is used to determine valley morphology by systematically flooding river networks. When used in conjunction with other modeling procedures, RESonate and FLDPLN can assess the character of large river networks quickly and at very low costs. Here we describe tool and model functions in addition to their benefits, limitations, and applications.
How a national vegetation classification can help ecological research and management
Franklin, Scott; Comer, Patrick; Evens, Julie; Ezcurra, Exequiel; Faber-Langendoen, Don; Franklin, Janet; Jennings, Michael; Josse, Carmen; Lea, Chris; Loucks, Orie; Muldavin, Esteban; Peet, Robert K.; Ponomarenko, Serguei; Roberts, David G.; Solomeshch, Ayzik; Keeler-Wolf, Todd; Van Kley, James; Weakley, Alan; McKerrow, Alexa; Burke, Marianne; Spurrier, Carol
2015-01-01
The elegance of classification lies in its ability to compile and systematize various terminological conventions and masses of information that are unattainable during typical research projects. Imagine a discipline without standards for collection, analysis, and interpretation; unfortunately, that describes much of 20th-century vegetation ecology. With differing methods, how do we assess community dynamics over decades, much less centuries? How do we compare plant communities from different areas? The need for a widely applied vegetation classification has long been clear. Now imagine a multi-decade effort to assimilate hundreds of disparate vegetation classifications into one common classification for the US. In this letter, we introduce the US National Vegetation Classification (USNVC; www.usnvc.org) as a powerful tool for research and conservation, analogous to the argument made by Schimel and Chadwick (2013) for soils. The USNVC provides a national framework to classify and describe vegetation; here we describe the USNVC and offer brief examples of its efficacy.
Biomonitoring - An Exposure Science Tool for Exposure and Risk Assessment
Biomonitoring studies of environmental stressors are useful for confirming exposures, estimating dose levels, and evaluating human health risks. However, the complexities of exposure-biomarker and biomarker-response relationships have limited the use of biomarkers in exposure sc...
Can poison control data be used for pharmaceutical poisoning surveillance?
Naun, Christopher A; Olsen, Cody S; Dean, J Michael; Olson, Lenora M; Cook, Lawrence J; Keenan, Heather T
2011-05-01
To determine the association between the frequencies of pharmaceutical exposures reported to a poison control center (PCC) and those seen in the emergency department (ED), we performed a statewide population-based retrospective comparison of frequencies of ED pharmaceutical poisonings with frequencies of pharmaceutical exposures reported to a regional PCC. ED poisonings, identified by International Classification of Diseases, Version 9 (ICD-9) codes, were grouped into substance categories. Using a reproducible algorithm facilitated by probabilistic linkage, codes from the PCC classification system were mapped into the same categories. A readily identifiable subset of PCC calls was selected for comparison. Correlations between frequencies of quarterly exposures by substance categories were calculated using Pearson correlation coefficients and partial correlation coefficients with adjustment for seasonality. PCC-reported exposures correlated with ED poisonings in nine of 10 categories. Partial correlation coefficients (r(p)) indicated strong associations (r(p) > 0.8) for three substance categories that underwent large changes in their incidences (opiates, benzodiazepines, and muscle relaxants). Six substance categories were moderately correlated (r(p) > 0.6). One category, salicylates, showed no association. Limitations: Imperfect overlap between ICD-9 and PCC codes may have led to miscategorization, and substances without changes in exposure frequency have inadequate variability to detect association using this method. PCC data are able to effectively identify trends in poisonings seen in EDs and may be useful as part of a pharmaceutical poisoning surveillance system. The authors developed an algorithm-driven technique for mapping American Association of Poison Control Centers codes to ICD-9 codes and identified a useful subset of poison control exposures for analysis.
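The core comparison reduces to correlating two quarterly count series per substance category, as in this toy sketch (the numbers are invented for illustration, not taken from the study).

```python
# Sketch: correlate quarterly PCC-call counts with quarterly ED-poisoning counts
# for one substance category.
import numpy as np
from scipy.stats import pearsonr

pcc_calls = np.array([120, 135, 150, 170, 185, 190, 210, 230])  # per quarter
ed_visits = np.array([ 80,  95, 100, 115, 130, 128, 150, 160])

r, p = pearsonr(pcc_calls, ed_visits)
print(f"r = {r:.2f}, p = {p:.4f}")
# The paper additionally adjusts for seasonality via partial correlations,
# e.g. by residualizing both series on quarter-of-year indicators first.
```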
NASA Astrophysics Data System (ADS)
Zink, Frank Edward
The detection and classification of pulmonary nodules is of great interest in chest radiography. Nodules are often indicative of primary cancer, and their detection is particularly important in asymptomatic patients. The ability to classify nodules as calcified or non-calcified is important because calcification is a positive indicator that the nodule is benign. Dual-energy methods offer the potential to improve both the detection and classification of nodules by allowing the formation of material-selective images. Tissue-selective images can improve detection by virtue of the elimination of obscuring rib structure. Bone-selective images are essentially calcium images, allowing classification of the nodule. A dual-energy technique is introduced which uses a computed radiography system to acquire dual-energy chest radiographs in a single exposure. All aspects of the dual-energy technique are described, with particular emphasis on scatter-correction, beam-hardening correction, and noise-reduction algorithms. The adaptive noise-reduction algorithm employed improves material-selective signal-to-noise ratio by up to a factor of seven with minimal sacrifice in selectivity. A clinical comparison study is described, undertaken to compare the dual-energy technique to conventional chest radiography for the tasks of nodule detection and classification. Observer performance data were collected using the Free Response Observer Characteristic (FROC) method and the bi-normal Alternative FROC (AFROC) performance model. Results of the comparison study, analyzed using two common multiple-observer statistical models, showed that the dual-energy technique was superior to conventional chest radiography for detection of nodules at a statistically significant level (p < .05). Discussion of the comparison study emphasizes the unique combination of data collection and analysis techniques employed, as well as the limitations of comparison techniques in the larger context of technology assessment.
NASA Astrophysics Data System (ADS)
Anitha, J.; Vijila, C. Kezi Selva; Hemanth, D. Jude
2010-02-01
Diabetic retinopathy (DR) is a chronic eye disease for which early detection is essential to avoid irreversible outcomes. Image processing of retinal images has emerged as a feasible tool for this early diagnosis. Digital image processing techniques involve image classification, which is a significant technique for detecting abnormality in the eye. Various automated classification systems have been developed in recent years, but most of them lack high classification accuracy. Artificial neural networks are the widely preferred artificial intelligence technique, since they yield superior results in terms of classification accuracy. In this work, a radial basis function (RBF) neural network based bi-level classification system is proposed to differentiate abnormal DR images from normal retinal images. The results are analyzed in terms of classification accuracy, sensitivity and specificity. A comparative analysis is performed with the results of a probabilistic classifier, namely the Bayesian classifier, to show the superior nature of the neural classifier. Experimental results are promising for the neural classifier in terms of the performance measures.
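A minimal RBF-network-style classifier can be sketched as follows: k-means picks the RBF centres, Gaussian activations form the hidden layer, and a logistic readout separates the two classes. This is an assumption-laden stand-in on synthetic features, not the authors' system.

```python
# Sketch: RBF hidden layer (k-means centres) + logistic readout, with the
# paper's reported metrics (accuracy, sensitivity, specificity).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

def rbf_layer(X, centers, gamma=1.0):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
# Toy 4-D "retinal image features" for two classes (illustrative only).
X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(2, 1, (100, 4))])
y = np.array([0] * 100 + [1] * 100)   # 0 = normal, 1 = DR

centers = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_
H = rbf_layer(X, centers)
pred = LogisticRegression(max_iter=1000).fit(H, y).predict(H)

tp = ((pred == 1) & (y == 1)).sum(); tn = ((pred == 0) & (y == 0)).sum()
print("accuracy:", (pred == y).mean(),
      "sensitivity:", tp / (y == 1).sum(), "specificity:", tn / (y == 0).sum())
```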
Perroca, Marcia Galan; Ek, Anna-Christina
2007-07-01
Although patient classification tools have been used in Sweden since the 1980s, few studies have examined how they are utilized and monitored. This paper investigates the patient classification systems implemented in hospitals in the country, as well as the level of satisfaction of nurses with the implemented instrument. A postal survey method was used in which a total of 128 questionnaires were sent to nurse managers. Twenty-three hospitals were identified with patient classification systems currently in operation. The Zebra and Beakta systems are the most commonly used instruments. Nurse managers appear to be satisfied with the patient classification systems in use on their wards as a whole, except for their inability to measure the quality of care provided, the time required to use the instruments, and the fact that the administration does not estimate nursing staff requirements using the system.
Yang, Wen; Zhu, Jin-Yong; Lu, Kai-Hong; Wan, Li; Mao, Xiao-Hua
2014-06-01
Appropriate schemes for the classification of freshwater phytoplankton are prerequisites and important tools for revealing phytoplankton succession and studying freshwater ecosystems. An alternative approach, the functional group of freshwater phytoplankton, has been proposed and developed owing to the deficiencies of Linnaean and molecular identification in ecological applications. The functional group of phytoplankton is a classification scheme based on autecology. In this study, the theoretical basis and classification criteria of the functional group (FG), morpho-functional group (MFG) and morphology-based functional group (MBFG) approaches are summarized, along with their merits and demerits. FG was considered the optimal classification approach for aquatic ecology research and aquatic environment evaluation. The application status of FG is introduced, and the evaluation standards and problems of two FG-based approaches to assessing water quality, the Q and QR index methods, are briefly discussed.
A systematic literature review of automated clinical coding and classification systems.
Stanfill, Mary H; Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R
2010-01-01
Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome.
Boosting CNN performance for lung texture classification using connected filtering
NASA Astrophysics Data System (ADS)
Tarando, Sebastián Roberto; Fetita, Catalin; Kim, Young-Wouk; Cho, Hyoun; Brillet, Pierre-Yves
2018-02-01
Infiltrative lung diseases describe a large group of irreversible lung disorders requiring regular follow-up with CT imaging. Quantifying the evolution of the patient status imposes the development of automated classification tools for lung texture. This paper presents an original image pre-processing framework based on locally connected filtering applied in multiresolution, which helps improve the learning process and boosts the performance of CNNs for lung texture classification. By removing the dense vascular network from the images used by the CNN for lung classification, locally connected filters provide better discrimination between different lung patterns and help regularize the classification output. The approach was tested in a preliminary evaluation on a 10-patient database of various lung pathologies, showing an increase of 10% in true positive rate (on average over all cases) with respect to the state-of-the-art cascade of CNNs for this task.
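One plausible stand-in for such vessel-suppressing pre-processing is grayscale opening-by-reconstruction, sketched below with scikit-image on a toy image; the paper's actual locally connected filters may differ, so this is an assumption, not a reproduction of the method.

```python
# Sketch: suppress thin bright (vessel-like) structures at several scales
# before feeding patches to a CNN, via opening-by-reconstruction.
import numpy as np
from skimage.morphology import erosion, disk, reconstruction

def suppress_bright_structures(img, radius):
    """Remove bright objects thinner than `radius` while preserving texture."""
    seed = erosion(img, disk(radius))            # seed <= img everywhere
    return reconstruction(seed, img, method='dilation')

rng = np.random.default_rng(0)
img = rng.normal(0.3, 0.05, (128, 128)).clip(0, 1)   # toy lung "texture"
img[60:62, :] = 0.9                                  # a thin bright "vessel"

multiscale = [suppress_bright_structures(img, r) for r in (2, 4, 8)]
print([m.max() for m in multiscale])   # the thin bright line is flattened
```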
Koh, Dong-Hee; Locke, Sarah J.; Chen, Yu-Cheng; Purdue, Mark P.; Friesen, Melissa C.
2016-01-01
Background: Retrospective exposure assessment of occupational lead exposure in population-based studies requires historical exposure information from many occupations and industries. Methods: We reviewed published US exposure monitoring studies to identify lead exposure measurement data. We developed an occupational lead exposure database from the 175 identified papers containing 1,111 sets of lead concentration summary statistics (21% area air, 47% personal air, 32% blood). We also extracted ancillary exposure-related information, including job, industry, task/location, year collected, sampling strategy, control measures in place, and sampling and analytical methods. Results: Measurements were published between 1940 and 2010 and represented 27 2-digit standardized industry classification codes. The majority of the measurements were related to lead-based paint work, joining or cutting metal using heat, primary and secondary metal manufacturing, and lead acid battery manufacturing. Conclusions: This database can be used in future statistical analyses to characterize differences in lead exposure across time, jobs, and industries. PMID:25968240
ERIC Educational Resources Information Center
Markey, Karen; Demeyer, Anh N.
In this research project, subject terms from the Dewey Decimal Classification (DDC) Schedules and Relative Index were incorporated into an online catalog as searcher's tools for subject access, browsing, and display. Four features of the DDC were employed to help searchers browse for and match their own subject terms with the online catalog's…
Peres, João; Mendes, Karine Laura Cortellazzi; Wada, Ronaldo Seichi; Sousa, Maria da Luz Rosario de
2017-06-01
Oral health teams can work both with information about people in their family context and with individual epidemiological information through risk ratings, considering equity and service organization. The purpose of our study was to evaluate the association between tools that classify individual and family risk. The study group consisted of students aged 5-6 years and 11-12 years, who were classified regarding caries risk and whether their parents had periodontal disease, in addition to family risk. There was an association between the risk rating for decay in children (n = 128) and the family risk classification (Coef C = 0.338, p = 0.01), indicating that the higher the family risk, the higher the risk of caries. Similarly, the association between the risk classification for periodontal disease in parents and the family risk classification (Coef C = 0.5503, p = 0.03) indicated that the higher the family risk, the higher the risk of periodontal disease. It can be concluded that the family risk rating tool can be used to order the actions of the dental service, organizing demand with greater equity at this point of access.
Anterior Chamber Angle Shape Analysis and Classification of Glaucoma in SS-OCT Images.
Ni Ni, Soe; Tian, J; Marziliano, Pina; Wong, Hong-Tym
2014-01-01
Optical coherence tomography is a high resolution, rapid, and noninvasive diagnostic tool for angle closure glaucoma. In this paper, we present a new strategy for the classification of the angle closure glaucoma using morphological shape analysis of the iridocorneal angle. The angle structure configuration is quantified by the following six features: (1) mean of the continuous measurement of the angle opening distance; (2) area of the trapezoidal profile of the iridocorneal angle centered at Schwalbe's line; (3) mean of the iris curvature from the extracted iris image; (4) complex shape descriptor, fractal dimension, to quantify the complexity, or changes of iridocorneal angle; (5) ellipticity moment shape descriptor; and (6) triangularity moment shape descriptor. Then, the fuzzy k nearest neighbor (fkNN) classifier is utilized for classification of angle closure glaucoma. Two hundred and sixty-four swept source optical coherence tomography (SS-OCT) images from 148 patients were analyzed in this study. From the experimental results, the fkNN reveals the best classification accuracy (99.11 ± 0.76%) and AUC (0.98 ± 0.012) with the combination of fractal dimension and biometric parameters. It showed that the proposed approach has promising potential to become a computer aided diagnostic tool for angle closure glaucoma (ACG) disease.
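The fkNN classifier itself is compact enough to sketch. Below is a Keller-style fuzzy k-NN on toy 6-D feature vectors; the data, class labels, and parameter values are illustrative, not SS-OCT measurements.

```python
# Sketch: fuzzy k-nearest-neighbour classification with inverse-distance
# weighted class memberships (Keller-style fkNN).
import numpy as np

def fknn_predict(X_train, y_train, x, k=5, m=2.0):
    d = np.linalg.norm(X_train - x, axis=1) + 1e-12
    idx = np.argsort(d)[:k]
    w = 1.0 / d[idx] ** (2 / (m - 1))        # fuzzy inverse-distance weights
    classes = np.unique(y_train)
    # Crisp training memberships: 1 for a sample's own class, 0 otherwise.
    u = np.array([(y_train[idx] == c).astype(float) @ w for c in classes])
    u /= w.sum()                             # memberships sum to 1
    return classes[np.argmax(u)], u

rng = np.random.default_rng(0)
# Two toy classes in a 6-D feature space (mimicking the six angle features).
X = np.vstack([rng.normal(0, 1, (50, 6)), rng.normal(1.5, 1, (50, 6))])
y = np.array(["open"] * 50 + ["closed"] * 50)

label, u = fknn_predict(X, y, x=np.full(6, 1.4))
print(label, u)   # expected: "closed" with a high membership value
```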
Parametric Time-Frequency Analysis and Its Applications in Music Classification
NASA Astrophysics Data System (ADS)
Shen, Ying; Li, Xiaoli; Ma, Ngok-Wah; Krishnan, Sridhar
2010-12-01
Analysis of nonstationary signals, such as music signals, is a challenging task. The purpose of this study is to explore an efficient and powerful technique to analyze and classify music signals in a higher frequency range (44.1 kHz). The pursuit methods are good tools for this purpose, but they aim at representing the signals rather than classifying them, as in Y. Paragakin et al., 2009. Among the pursuit methods, matching pursuit (MP), an adaptive, truly nonstationary time-frequency signal analysis tool, is applied for music classification. First, MP decomposes the sample signals into time-frequency functions, or atoms. Atom parameters are then analyzed and manipulated, and discriminant features are extracted from the atom parameters. Besides the parameters obtained using MP, an additional feature, central energy, is also derived. Linear discriminant analysis and the leave-one-out method are used to evaluate the classification accuracy rate for different feature sets. The study is one of very few works that analyze atoms statistically and extract discriminant features directly from the parameters. From our experiments, it is evident that the MP algorithm with the Gabor dictionary decomposes nonstationary signals, such as music signals, into atoms whose parameters contain strong discriminant information sufficient for accurate and efficient signal classification.
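The MP decomposition step can be sketched with a small Gabor dictionary; the dictionary size and atom parameters below are toy choices (real dictionaries are far larger and multiscale).

```python
# Sketch: matching pursuit over a toy Gabor dictionary; the selected atom
# parameters (frequency, position, scale) are the features used downstream.
import numpy as np

def gabor_atom(n, f, t0, s):
    t = np.arange(n)
    g = np.exp(-0.5 * ((t - t0) / s) ** 2) * np.cos(2 * np.pi * f * t)
    return g / np.linalg.norm(g)

n = 512
dictionary = [(f, t0, s, gabor_atom(n, f, t0, s))
              for f in (0.01, 0.05, 0.1)
              for t0 in (128, 256, 384)
              for s in (16, 64)]

rng = np.random.default_rng(0)
signal = 3 * gabor_atom(n, 0.05, 256, 64) + 0.1 * rng.normal(size=n)

residual, atoms = signal.copy(), []
for _ in range(3):                     # extract the 3 strongest atoms
    coeffs = [atom @ residual for *_, atom in dictionary]
    best = int(np.argmax(np.abs(coeffs)))
    f, t0, s, atom = dictionary[best]
    atoms.append((f, t0, s, coeffs[best]))   # atom parameters = features
    residual = residual - coeffs[best] * atom

print(atoms[0][:3])   # expected to recover (0.05, 256, 64)
```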
The EPA's human exposure research program for assessing cumulative risk in communities.
Zartarian, Valerie G; Schultz, Bradley D
2010-06-01
Communities are faced with challenges in identifying and prioritizing environmental issues, taking actions to reduce their exposures, and determining their effectiveness for reducing human health risks. Additional challenges include determining what scientific tools are available and most relevant, and understanding how to use those tools; given these barriers, community groups tend to rely more on risk perception than science. The U.S. Environmental Protection Agency's Office of Research and Development, National Exposure Research Laboratory (NERL) and collaborators are developing and applying tools (models, data, methods) for enhancing cumulative risk assessments. The NERL's "Cumulative Communities Research Program" focuses on key science questions: (1) How to systematically identify and prioritize key chemical stressors within a given community?; (2) How to develop estimates of exposure to multiple stressors for individuals in epidemiologic studies?; and (3) What tools can be used to assess community-level distributions of exposures for the development and evaluation of the effectiveness of risk reduction strategies? This paper provides community partners and scientific researchers with an understanding of the NERL research program and other efforts to address cumulative community risks; and key research needs and opportunities. Some initial findings include the following: (1) Many useful tools exist for components of risk assessment, but need to be developed collaboratively with end users and made more comprehensive and user-friendly for practical application; (2) Tools for quantifying cumulative risks and impact of community risk reduction activities are also needed; (3) More data are needed to assess community- and individual-level exposures, and to link exposure-related information with health effects; and (4) Additional research is needed to incorporate risk-modifying factors ("non-chemical stressors") into cumulative risk assessments. The products of this research program will advance the science for cumulative risk assessments and empower communities with information so that they can make informed, cost-effective decisions to improve public health.
An Integrated Web-Based Assessment Tool for Assessing Pesticide Exposure and Risks
Background/Question/Methods We have created an integrated web-based tool designed to estimate exposure doses and ecological risks under the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Species Act. This involved combining a number of disparat...
Diagnostics for Confounding of Time-varying and Other Joint Exposures.
Jackson, John W
2016-11-01
The effects of joint exposures (or exposure regimes) include those of adhering to assigned treatment versus placebo in a randomized controlled trial, duration of exposure in a cohort study, interactions between exposures, and direct effects of exposure, among others. Unlike the setting of a single point exposure (e.g., propensity score matching), there are few tools to describe confounding for joint exposures or how well a method resolves it. Investigators need tools that describe confounding in ways that are conceptually grounded and intuitive for those who read, review, and use applied research to guide policy. We revisit the implications of exchangeability conditions that hold in sequentially randomized trials, and the bias structure that motivates the use of g-methods, such as marginal structural models. From these, we develop covariate balance diagnostics for joint exposures that can (1) describe time-varying confounding, (2) assess whether covariates are predicted by prior exposures given their past, the indication for g-methods, and (3) describe residual confounding after inverse probability weighting. For each diagnostic, we present time-specific metrics that encompass a wide class of joint exposures, including regimes of multivariate time-varying exposures in censored data, with multivariate point exposures as a special case. We outline how to estimate these directly or with regression and how to average them over person-time. Using a simulated example, we show how these metrics can be presented graphically. This conceptually grounded framework can potentially aid the transparent design, analysis, and reporting of studies that examine joint exposures. We provide easy-to-use tools to implement it.
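One of the simplest such diagnostics, a standardized mean difference before and after inverse probability weighting, can be sketched on simulated data; the single confounder, point exposure, and variable names below are illustrative, not the paper's full time-varying framework.

```python
# Sketch: covariate balance (standardized mean difference) before vs. after
# inverse probability weighting for a point exposure.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
L = rng.normal(size=n)                          # confounder
A = rng.binomial(1, 1 / (1 + np.exp(-L)))       # exposure depends on L

def smd(x, a, w=None):
    w = np.ones_like(x) if w is None else w
    m1 = np.average(x[a == 1], weights=w[a == 1])
    m0 = np.average(x[a == 0], weights=w[a == 0])
    s = np.sqrt((x[a == 1].var() + x[a == 0].var()) / 2)
    return (m1 - m0) / s

ps = LogisticRegression().fit(L[:, None], A).predict_proba(L[:, None])[:, 1]
w = np.where(A == 1, 1 / ps, 1 / (1 - ps))      # inverse probability weights

print("SMD before weighting:", round(smd(L, A), 3))
print("SMD after weighting: ", round(smd(L, A, w), 3))   # should shrink toward 0
```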
Object classification and outliers analysis in the forthcoming Gaia mission
NASA Astrophysics Data System (ADS)
Ordóñez-Blanco, D.; Arcay, B.; Dafonte, C.; Manteiga, M.; Ulla, A.
2010-12-01
Astrophysics is evolving towards the rational optimization of costly observational material by the intelligent exploitation of large astronomical databases from both terrestrial telescopes and spatial mission archives. However, there has been relatively little advance in the development of highly scalable data exploitation and analysis tools needed to generate the scientific returns from these large and expensively obtained datasets. Among the upcoming projects of astronomical instrumentation, Gaia is the next cornerstone ESA mission. The Gaia survey foresees the creation of a data archive and its future exploitation with automated or semi-automated analysis tools. This work reviews some of the work that is being developed by the Gaia Data Processing and Analysis Consortium for the object classification and analysis of outliers in the forthcoming mission.
Modeling of tool path for the CNC sheet cutting machines
NASA Astrophysics Data System (ADS)
Petunin, Aleksandr A.
2015-11-01
In this paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of cutting techniques is offered, and we also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one non-standard cutting technique (Segment Continuous Cutting Problem, SCCP) are formalized. We show that these optimization tasks can be interpreted as discrete optimization problems (a generalized travelling salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. To solve the GTSP, we propose using the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and dynamic programming.
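The GTSP view can be illustrated with a toy brute-force solver: choose one pierce point per contour ("megalopolis") and an order of contours minimizing total travel. Real instances require the dynamic programming machinery cited above; the coordinates below are invented for illustration.

```python
# Sketch: brute-force GTSP for a tiny tool-path instance.
import itertools
import numpy as np

# Three contours, each with two candidate pierce points (x, y) - toy data.
megalopolises = [[(0, 0), (0, 2)], [(5, 1), (5, 3)], [(2, 6), (4, 6)]]

def tour_length(points):
    """Total travel along the chosen pierce points, visited in order."""
    pts = np.array(points)
    return np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()

best = min(
    (tour_length(choice), order, choice)
    for order in itertools.permutations(range(len(megalopolises)))
    for choice in itertools.product(*(megalopolises[i] for i in order))
)
print("minimal travel:", round(best[0], 2), "visit order:", best[1])
```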
Jordan, Alan; Rees, Tony; Gowlett-Holmes, Karen
2015-01-01
Imagery collected by still and video cameras is an increasingly important tool for minimal impact, repeatable observations in the marine environment. Data generated from imagery includes identification, annotation and quantification of biological subjects and environmental features within an image. To be long-lived and useful beyond their project-specific initial purpose, and to maximize their utility across studies and disciplines, marine imagery data should use a standardised vocabulary of defined terms. This would enable the compilation of regional, national and/or global data sets from multiple sources, contributing to broad-scale management studies and development of automated annotation algorithms. The classification scheme developed under the Collaborative and Automated Tools for Analysis of Marine Imagery (CATAMI) project provides such a vocabulary. The CATAMI classification scheme introduces Australian-wide acknowledged, standardised terminology for annotating benthic substrates and biota in marine imagery. It combines coarse-level taxonomy and morphology, and is a flexible, hierarchical classification that bridges the gap between habitat/biotope characterisation and taxonomy, acknowledging limitations when describing biological taxa through imagery. It is fully described, documented, and maintained through curated online databases, and can be applied across benthic image collection methods, annotation platforms and scoring methods. Following release in 2013, the CATAMI classification scheme was taken up by a wide variety of users, including government, academia and industry. This rapid acceptance highlights the scheme’s utility and the potential to facilitate broad-scale multidisciplinary studies of marine ecosystems when applied globally. Here we present the CATAMI classification scheme, describe its conception and features, and discuss its utility and the opportunities as well as challenges arising from its use. PMID:26509918
Development of the Rice Convection Model as a Space Weather Tool
2015-05-31
coupled to the ionosphere that is suitable for both scientific studies as well as a prediction tool. We are able to run the model faster than "real...of work by finding ways to fund a more systematic effort in making the RCM a space weather prediction tool for magnetospheric and ionospheric studies... Subject terms: convection electric field, total electron content, TEC, ionospheric convection, plasmasphere.
Comparing the Advanced REACH Tool's (ART) Estimates With Switzerland's Occupational Exposure Data.
Savic, Nenad; Gasic, Bojan; Schinkel, Jody; Vernez, David
2017-10-01
The Advanced REACH Tool (ART) is the most sophisticated tool used for evaluating exposure levels under the European Union's Registration, Evaluation, Authorisation and restriction of CHemicals (REACH) regulations. ART provides estimates at different percentiles of exposure and within different confidence intervals (CIs). However, its performance has only been tested on a limited number of exposure data. The present study compares ART's estimates with exposure measurements collected over many years in Switzerland. Measurements from 584 cases of exposure to vapours, mists, powders, and abrasive dusts (wood/stone and metal) were extracted from a Swiss database. The corresponding exposures at the 50th and 90th percentiles were calculated in ART. To characterize the model's performance, the 90% CI of the estimates was considered. ART's performance at the 50th percentile was found to be insufficiently conservative only with regard to exposure to wood/stone dusts, whereas the 90th percentile showed sufficient conservatism for all the types of exposure processed. However, a trend was observed in the residuals: ART overestimated lower exposures and underestimated higher ones. The median was more precise, however, and the majority (≥60%) of real-world measurements were within a factor of 10 of ART's estimates. We provide recommendations based on the results and suggest further, more comprehensive investigations. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
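The two headline checks, conservatism at a percentile and the factor-of-10 criterion, reduce to simple computations on paired estimates and measurements, sketched here on simulated values (not the Swiss data).

```python
# Sketch: conservatism and factor-of-10 agreement between model estimates
# and measured exposures, on toy lognormal data.
import numpy as np

rng = np.random.default_rng(0)
measured = rng.lognormal(mean=0.0, sigma=1.0, size=584)             # toy exposures
art_p50 = measured * rng.lognormal(mean=0.1, sigma=0.8, size=584)   # toy estimates

residual = np.log10(art_p50) - np.log10(measured)
print("share of measurements exceeding the ART P50 estimate:",
      (measured > art_p50).mean())
print("within a factor of 10 of the estimate:",
      (np.abs(residual) < 1).mean())
# A negative slope of `residual` against log10(measured) would reproduce the
# reported trend: overestimation at low exposures, underestimation at high ones.
```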
Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W
2008-05-28
The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management. We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu.
Tluczkiewicz, I; Kühne, R; Ebert, R-U; Batke, M; Schüürmann, G; Mangelsdorf, I; Escher, S E
2016-07-01
The present publication describes an integrative grouping concept to derive threshold values for inhalation exposure. The classification scheme starts with differences in toxicological potency and develops criteria to group compounds into two potency classes, namely toxic (T-group) or low toxic (L-group). The TTC concept for inhalation exposure is based on the TTC RepDose data set, consisting of 296 organic compounds with 608 repeated-dose inhalation studies. Initially, 21 structural features (SFs) were identified as being characteristic for compounds of either high or low NOEC values (Schüürmann et al., 2016). In subsequent analyses these SF groups were further refined by taking into account structural homogeneity, type of toxicological effect observed, differences in absorption, metabolism and mechanism of action (MoA), to better define their structural and toxicological boundaries. Differentiation of a local or systemic mode of action did not improve the classification scheme. Finally, 28 groups were discriminated: 19 T-groups and 9 L-groups. Clearly distinct thresholds were derived for the T- and L-toxicity groups, being 2 × 10^-5 ppm (2 μg/person/day) and 0.05 ppm (4260 μg/person/day), respectively. The derived thresholds and the classification are compared to the initial, mainly structure-driven grouping (Schüürmann et al., 2016) and to the Cramer classification. Copyright © 2016 Elsevier Inc. All rights reserved.
Classifying diseases and remedies in ethnomedicine and ethnopharmacology.
Staub, Peter O; Geck, Matthias S; Weckerle, Caroline S; Casu, Laura; Leonti, Marco
2015-11-04
Ethnopharmacology focuses on the understanding of local and indigenous use of medicines and therefore an emic approach is inevitable. Often, however, standard biomedical disease classifications are used to describe and analyse local diseases and remedies. Standard classifications might be a valid tool for cross-cultural comparisons and bioprospecting purposes but are not suitable to understand the local perception of disease and use of remedies. Different standard disease classification systems exist but their suitability for cross-cultural comparisons of ethnomedical data has never been assessed. Depending on the research focus, (I) ethnomedical, (II) cross-cultural, and (III) bioprospecting, we provide suggestions for the use of specific classification systems. We analyse three different standard biomedical classification systems (the International Classification of Diseases (ICD); the Economic Botany Data Collection Standard (EBDCS); and the International Classification of Primary Care (ICPC)), and discuss their value for categorizing diseases of ethnomedical systems and their suitability for cross-cultural research in ethnopharmacology. Moreover, based on the biomedical uses of all approved plant derived biomedical drugs, we propose a biomedical therapy-based classification system as a guide for the discovery of drugs from ethnopharmacological sources. Widely used standards, such as the International Classification of Diseases (ICD) by the WHO and the Economic Botany Data Collection Standard (EBDCS) are either technically challenging due to a categorisation system based on clinical examinations, which are usually not possible during field research (ICD) or lack clear biomedical criteria combining disorders and medical effects in an imprecise and confusing way (EBDCS). The International Classification of Primary Care (ICPC), also accepted by the WHO, has more in common with ethnomedical reality than the ICD or the EBDCS, as the categories are designed according to patient's perceptions and are less influenced by clinical medicine. Since diagnostic tools are not required, medical ethnobotanists and ethnopharmacologists can easily classify reported symptoms and complaints with the ICPC in one of the "chapters" based on 17 body systems, psychological and social problems. Also the biomedical uses of plant-derived drugs are classifiable into 17 broad organ- and therapy-based use-categories but can easily be divided into more specific subcategories. Depending on the research focus (I-III) we propose the following classification systems: I. Ethnomedicine: Ethnomedicine is culture-bound and local classifications have to be understood from an emic perspective. Consequently, the application of prefabricated, "one-size fits all" biomedical classification schemes is of limited value. II. Cross-cultural analysis: The ICPC is a suitable standard that can be applied but modified as required. III. Bioprospecting: We suggest a biomedical therapy-driven classification system with currently 17 use-categories based on biomedical uses of all approved plant derived natural product drugs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Molecular impact of juvenile hormone agonists on neonatal Daphnia magna.
Toyota, Kenji; Kato, Yasuhiko; Miyakawa, Hitoshi; Yatsu, Ryohei; Mizutani, Takeshi; Ogino, Yukiko; Miyagawa, Shinichi; Watanabe, Hajime; Nishide, Hiroyo; Uchiyama, Ikuo; Tatarazako, Norihisa; Iguchi, Taisen
2014-05-01
Daphnia magna has been used extensively to evaluate organism- and population-level responses to pollutants in acute toxicity and reproductive toxicity tests. We have previously reported that exposure to juvenile hormone (JH) agonists results in a reduction of reproductive function and production of male offspring in D. magna, a cyclic parthenogen. Recent advances in molecular techniques have provided tools to better understand the responses to pollutants in aquatic organisms, including D. magna. A DNA microarray was used to evaluate gene expression profiles of neonatal daphnids exposed to the JH agonists methoprene (125, 250 and 500 ppb), fenoxycarb (0.5, 1 and 2 ppb) and epofenonane (50, 100 and 200 ppb). Exposure to these JH analogs resulted in chemical-specific patterns of gene expression. Heat map analyses based on hierarchical clustering revealed a similar pattern between treatments with a high dose of methoprene and with epofenonane. In contrast, treatment with low to middle doses of methoprene resulted in profiles similar to those of the fenoxycarb treatments. Hemoglobin and JH epoxide hydrolase genes were clustered as JH-responsive genes. These data suggest that fenoxycarb has high activity as a JH agonist, methoprene shows high toxicity, and epofenonane works through a different mechanism than the other JH analogs, in agreement with previously reported toxicity test data. In conclusion, the D. magna DNA microarray is useful for the classification of JH analogs and identification of JH-responsive genes. Copyright © 2013 John Wiley & Sons, Ltd.
Fetal and perinatal exposure to drugs and chemicals: novel biomarkers of risk.
Etwel, Fatma; Hutson, Janine R; Madadi, Parvaz; Gareri, Joey; Koren, Gideon
2014-01-01
Pregnant women are almost always excluded from randomized controlled clinical trials, as the risks to the fetus posed by most new chemical entities or approved drugs cannot be sufficiently ruled out. Hence, a major scientific challenge in this field is to discover and validate alternative tools that will fill the knowledge gap created by the lack of participation in gold-standard randomized trials. This review focuses on novel tools that allow estimation of fetal risks after exposure to therapeutic agents, such as placental perfusion studies, biomarkers of fetal exposure, and novel epidemiological and pharmacogenetic tools, all of which have been tested successfully in recent years.
2013-10-01
Subject terms: spinal cord injury, immunogenetics, chronic pain, opioids. Data collection is underway. The prototypic opioid, morphine, is capable of TLR4-mediated proinflammation. As such, exposure to morphine at the time of injury may result in... fashion to the spinal cord injury and/or to experience inflammation in response to opioid exposure. Critically, this genetic variability may...
2014-10-01
Subject terms: spinal cord injury, immunogenetics, chronic pain, opioids. Data collection is underway. The prototypic opioid, morphine, is capable of TLR4-mediated proinflammation. As such, exposure to morphine at the time of injury may result in... proinflammatory fashion to the spinal cord injury, and/or to experience inflammation in response to opioid exposure. Critically, this genetic variability...
Dose-response patterns for vibration-induced white finger
Griffin, M; Bovenzi, M; Nelson, C
2003-01-01
Aims: To investigate alternative relations between cumulative exposures to hand-transmitted vibration (taking account of vibration magnitude, lifetime exposure duration, and frequency of vibration) and the development of white finger (Raynaud's phenomenon). Methods: Three previous studies were combined to provide a group of 1557 users of powered vibratory tools in seven occupational subgroups: stone grinders, stone carvers, quarry drillers, dockyard caulkers, dockyard boilermakers, dockyard painters, and forest workers. The estimated total operating duration in hours was obtained for each subject, for each tool, and for all tools combined. From the vibration magnitudes and exposure durations, seven alternative measures of cumulative exposure were calculated for each subject, using expressions of the form dose = Σᵢ aᵢᵐ tᵢ, where aᵢ is the acceleration magnitude on tool i, tᵢ is the lifetime exposure duration for tool i, and m = 0, 1, 2, or 4. Results: For all seven alternative dose measures, an increase in dose was associated with a significant increase in the occurrence of vibration-induced white finger, after adjustment for age and smoking. However, dose measures with high powers of acceleration (m > 1) fared less well than measures in which the weighted or unweighted acceleration and the lifetime exposure duration were given equal weight (m = 1). Dose determined solely by the lifetime exposure duration (without consideration of the vibration magnitude) gave better predictions than measures with m greater than unity. All measures of dose calculated from the unweighted acceleration gave better predictions than the equivalent dose measures using acceleration frequency-weighted according to current standards. Conclusions: Since the total duration of exposure does not discriminate between exposures accumulated over the day and those accumulated over years, a linear relation between vibration magnitude and exposure duration seems appropriate for predicting the occurrence of vibration-induced white finger. Poorer predictions were obtained when the currently recommended frequency weighting was employed than when accelerations at all frequencies were given equal weight. Findings suggest that improvements are possible to both the frequency weighting and the time dependency used to predict the development of vibration-induced white finger in current standards. PMID:12499452
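The cumulative dose expressions above are simple to compute. Below is a minimal Python sketch of how the alternative dose measures compare for the exponents the study examined; the tool accelerations and durations are purely illustrative, not values from the study. Note that m = 0 reduces the dose to total lifetime exposure duration.

    # Alternative cumulative vibration-dose measures, dose = sum_i a_i^m * t_i.
    # Tool accelerations (m/s^2) and lifetime durations (h) are illustrative.
    def vibration_dose(exposures, m):
        return sum(a ** m * t for a, t in exposures)

    tools = [(5.0, 2000.0), (12.5, 350.0)]  # hypothetical grinder, drill
    for m in (0, 1, 2, 4):
        print(f"m = {m}: dose = {vibration_dose(tools, m):.3g}")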
Measurement systems and indices of miners' exposure to radon daughter products in the air of mines.
Domański, T
1990-01-01
This paper presents a classification of measurement systems that may be used for the assessment of miners' exposure to radiation in mines. The following systems are described and characterized: the Air Sampling System (ASS), the Environmental Control System (ECS), the Individual Dosimetry System (IDS), the Stream Monitoring System (SMS) and the Exhaust Monitoring System (EMS). Indices for the evaluation of miners' working environments, and for the assessment of individual or collective miners' exposure, were selected and determined. These are: average expected concentration (CAE), average observed concentration (CAO), average expected exposure cumulation rate (EEXP), average observed exposure cumulation rate (EOBS), and average effective exposure cumulation rate (EEFF). Mathematical formulae for determining all these indicators, according to the type of measurement system used in particular mines, are presented. The reliability of the assessment of miners' exposure in particular measurement systems, as well as the role of a possible reference system, are discussed.
Schijven, Jack; Bouwknegt, Martijn; de Roda Husman, Ana Maria; Rutjes, Saskia; Sudre, Bertrand; Suk, Jonathan E; Semenza, Jan C
2013-12-01
Climate change may impact waterborne and foodborne infectious disease, but to what extent is uncertain. Estimating climate-change-associated relative infection risks from exposure to viruses, bacteria, or parasites in water or food is critical for guiding adaptation measures. We present a computational tool for strategic decision making that describes the behavior of pathogens using location-specific input data under current and projected climate conditions. Pathogen-pathway combinations are available for exposure to norovirus, Campylobacter, Cryptosporidium, and noncholera Vibrio species via drinking water, bathing water, oysters, or chicken fillets. Infection risk outcomes generated by the tool under current climate conditions correspond with those published in the literature. The tool demonstrates that increasing temperatures lead to increasing risks for infection with Campylobacter from consuming raw/undercooked chicken fillet and for Vibrio from water exposure. Increasing frequencies of drought generally lead to an elevated infection risk of exposure to persistent pathogens such as norovirus and Cryptosporidium, but decreasing risk of exposure to rapidly inactivating pathogens, like Campylobacter. The opposite is the case with increasing annual precipitation; an upsurge of heavy rainfall events leads to more peaks in infection risks in all cases. The interdisciplinary tool presented here can be used to guide climate change adaptation strategies focused on infectious diseases. © 2013 Society for Risk Analysis.
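The abstract does not give the tool's internal equations, but quantitative microbial risk assessment of this kind is often built on a dose-response model. A minimal sketch, assuming a standard exponential dose-response form (an assumption for illustration, not the tool's published model), of how a climate-driven change in pathogen concentration translates into a relative infection risk:

    import math

    def infection_risk(dose, r):
        # Exponential dose-response: P(infection) = 1 - exp(-r * dose),
        # where r is a pathogen-specific infectivity parameter.
        return 1.0 - math.exp(-r * dose)

    volume_l = 0.05                      # hypothetical bathing-water intake, litres
    r = 0.004                            # illustrative infectivity parameter
    risk_now = infection_risk(0.20 * volume_l, r)     # current concentration
    risk_future = infection_risk(0.35 * volume_l, r)  # projected concentration
    print(f"relative risk: {risk_future / risk_now:.2f}")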
NASA Astrophysics Data System (ADS)
Verma, Sneha K.; Chun, Sophia; Liu, Brent J.
2014-03-01
Pain is a common complication after spinal cord injury, with prevalence estimates ranging from 77% to 81%, and it strongly affects a patient's lifestyle and well-being. In the current clinical setting, paper-based forms are used to classify pain correctly; however, the accuracy of diagnoses and optimal management of pain largely depend on expert review, which in many cases is not possible because there are very few experts in this field. The need for a clinical decision support system that can be used by expert and non-expert clinicians has been cited in the literature, but such a system has not been developed. We have designed and developed a stand-alone tool for correctly classifying pain type in spinal cord injury (SCI) patients, using Bayesian decision theory. Various machine learning simulation methods were used to verify the algorithm on a pilot study data set of 48 patients, consisting of the paper-based forms collected at the Long Beach VA clinic with pain classification performed by an expert in the field. Using WEKA as the machine learning tool, we tested on the 48-patient dataset the hypothesis that the attributes collected on the forms and the pain location marked by patients have a very significant impact on pain type classification. This tool will be integrated with an imaging informatics system to support a clinical study that will test the effectiveness of using proton beam radiotherapy for treating SCI-related neuropathic pain as an alternative to invasive surgical lesioning.
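As a rough illustration of the Bayesian classification step, the sketch below trains a naive Bayes classifier on form-derived attributes, in the spirit of the tool described above; the feature matrix, labels, and scikit-learn pipeline are synthetic stand-ins, not the study's actual WEKA configuration.

    import numpy as np
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.random((48, 6))              # 48 patients x 6 form attributes (synthetic)
    y = rng.integers(0, 3, size=48)      # hypothetical pain-type labels
    scores = cross_val_score(GaussianNB(), X, y, cv=5)
    print(f"mean CV accuracy: {scores.mean():.2f}")  # meaningless on random data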
Xu, Xiayu; Ding, Wenxiang; Abràmoff, Michael D; Cao, Ruofan
2017-04-01
Retinal artery and vein classification is an important task for the automatic computer-aided diagnosis of various eye diseases and systemic diseases. This paper presents an improved supervised artery and vein classification method for retinal images. Intra-image regularization and inter-subject normalization are applied to reduce the differences in feature space. Novel features, including first-order and second-order texture features, are utilized to capture the discriminating characteristics of arteries and veins. The proposed method was tested on the DRIVE dataset and achieved an overall accuracy of 0.923. This retinal artery and vein classification algorithm serves as a potentially important tool for the early diagnosis of various diseases, including diabetic retinopathy and cardiovascular diseases. Copyright © 2017 Elsevier B.V. All rights reserved.
Lean waste classification model to support the sustainable operational practice
NASA Astrophysics Data System (ADS)
Sutrisno, A.; Vanany, I.; Gunawan, I.; Asjad, M.
2018-04-01
Driven by growing pressure for more sustainable operational practice, improved classification of non-value-added activity (waste) is one of the prerequisites for realizing the sustainability of a firm. The seven waste types of the Ohno model have become a versatile tool to reveal lean waste occurrence; in many recent investigations, however, the Seven Waste model of Ohno has proven insufficient to cope with the types of waste occurring in industrial practice at various application levels. To narrow down this limitation, this paper presents an improved waste classification model based on a survey of recent studies discussing waste at various operational stages. Implications of the waste classification model for the body of knowledge and for industrial practice are provided.
Alsalem, M A; Zaidan, A A; Zaidan, B B; Hashim, M; Madhloom, H T; Azeez, N D; Alsyisuf, S
2018-05-01
Acute leukaemia diagnosis is a field requiring automated solutions, tools and methods and the ability to facilitate early detection and even prediction. Many studies have focused on the automatic detection and classification of acute leukaemia and its subtypes to enable highly accurate diagnosis. This study aimed to review and analyse the literature related to the detection and classification of acute leukaemia. To improve understanding of the field's various contextual aspects, the characteristics considered in published studies were motivation, the open challenges that confronted researchers, and the recommendations presented to researchers to enhance this vital research area. We systematically searched all articles about the classification and detection of acute leukaemia, as well as their evaluation and benchmarking, in three main databases: ScienceDirect, Web of Science and IEEE Xplore, from 2007 to 2017. These indices were considered sufficiently extensive to encompass our field of literature. Based on our inclusion and exclusion criteria, 89 articles were selected. Most studies (58/89) focused on the methods or algorithms of acute leukaemia classification, a number of papers (22/89) covered developed systems for the detection or diagnosis of acute leukaemia, and a few papers (5/89) presented evaluation and comparative studies. The smallest portion (4/89) of articles comprised reviews and surveys. Acute leukaemia diagnosis, a field requiring automated solutions, tools and methods, entails the ability to facilitate early detection or even prediction. Many studies have been performed on the automatic detection and classification of acute leukaemia and its subtypes to promote accurate diagnosis. Research areas on medical-image classification vary, but they are all equally vital. We expect this systematic review to help emphasise current research opportunities and thus extend and create additional research fields. Copyright © 2018 Elsevier B.V. All rights reserved.
Cancer classification using the Immunoscore: a worldwide task force.
Galon, Jérôme; Pagès, Franck; Marincola, Francesco M; Angell, Helen K; Thurin, Magdalena; Lugli, Alessandro; Zlobec, Inti; Berger, Anne; Bifulco, Carlo; Botti, Gerardo; Tatangelo, Fabiana; Britten, Cedrik M; Kreiter, Sebastian; Chouchane, Lotfi; Delrio, Paolo; Arndt, Hartmann; Asslaber, Martin; Maio, Michele; Masucci, Giuseppe V; Mihm, Martin; Vidal-Vanaclocha, Fernando; Allison, James P; Gnjatic, Sacha; Hakansson, Leif; Huber, Christoph; Singh-Jasuja, Harpreet; Ottensmeier, Christian; Zwierzina, Heinz; Laghi, Luigi; Grizzi, Fabio; Ohashi, Pamela S; Shaw, Patricia A; Clarke, Blaise A; Wouters, Bradly G; Kawakami, Yutaka; Hazama, Shoichi; Okuno, Kiyotaka; Wang, Ena; O'Donnell-Tormey, Jill; Lagorce, Christine; Pawelec, Graham; Nishimura, Michael I; Hawkins, Robert; Lapointe, Réjean; Lundqvist, Andreas; Khleif, Samir N; Ogino, Shuji; Gibbs, Peter; Waring, Paul; Sato, Noriyuki; Torigoe, Toshihiko; Itoh, Kyogo; Patel, Prabhu S; Shukla, Shilin N; Palmqvist, Richard; Nagtegaal, Iris D; Wang, Yili; D'Arrigo, Corrado; Kopetz, Scott; Sinicrope, Frank A; Trinchieri, Giorgio; Gajewski, Thomas F; Ascierto, Paolo A; Fox, Bernard A
2012-10-03
Prediction of clinical outcome in cancer is usually achieved by histopathological evaluation of tissue samples obtained during surgical resection of the primary tumor. Traditional tumor staging (AJCC/UICC-TNM classification) summarizes data on tumor burden (T), presence of cancer cells in draining and regional lymph nodes (N) and evidence for metastases (M). However, it is now recognized that clinical outcome can vary significantly among patients within the same stage. The current classification provides limited prognostic information and does not predict response to therapy. Recent literature has alluded to the importance of the host immune system in controlling tumor progression. Thus, evidence supports the notion of including immunological biomarkers, implemented as a tool for the prediction of prognosis and response to therapy. Accumulating data, collected from large cohorts of human cancers, have demonstrated the impact of immune classification, which has a prognostic value that may add to the significance of the AJCC/UICC TNM classification. It is therefore imperative to begin to incorporate the 'Immunoscore' into traditional classification, thus providing an essential prognostic and potentially predictive tool. Introduction of this parameter as a biomarker to classify cancers, as part of routine diagnostic and prognostic assessment of tumors, will facilitate clinical decision-making, including rational stratification of patient treatment. Equally, the inherent complexity of quantitative immunohistochemistry, in conjunction with protocol variation across laboratories, analysis of different immune cell types, inconsistent region selection criteria, and variable ways to quantify immune infiltration, all underline the urgent requirement to reach assay harmonization. In an effort to promote the Immunoscore in routine clinical settings, an international task force was initiated. This review represents a follow-up of the announcement of this initiative, and of the J Transl Med editorial from January 2012. Immunophenotyping of tumors may provide crucial novel prognostic information. The results of this international validation may result in the implementation of the Immunoscore as a new component for the classification of cancer, designated TNM-I (TNM-Immune).
Steinbiss, Sascha; Kastens, Sascha; Kurtz, Stefan
2012-11-07
Long terminal repeat (LTR) retrotransposons are a class of eukaryotic mobile elements characterized by a distinctive sequence similarity-based structure. Hence they are well suited for computational identification. Current software allows for a comprehensive genome-wide de novo detection of such elements. The obvious next step is the classification of newly detected candidates resulting in (super-)families. Such a de novo classification approach based on sequence-based clustering of transposon features has been proposed before, resulting in a preliminary assignment of candidates to families as a basis for subsequent manual refinement. However, such a classification workflow is typically split across a heterogeneous set of glue scripts and generic software (for example, spreadsheets), making it tedious for a human expert to inspect, curate and export the putative families produced by the workflow. We have developed LTRsift, an interactive graphical software tool for semi-automatic postprocessing of de novo predicted LTR retrotransposon annotations. Its user-friendly interface offers customizable filtering and classification functionality, displaying the putative candidate groups, their members and their internal structure in a hierarchical fashion. To ease manual work, it also supports graphical user interface-driven reassignment, splitting and further annotation of candidates. Export of grouped candidate sets in standard formats is possible. In two case studies, we demonstrate how LTRsift can be employed in the context of a genome-wide LTR retrotransposon survey effort. LTRsift is a useful and convenient tool for semi-automated classification of newly detected LTR retrotransposons based on their internal features. Its efficient implementation allows for convenient and seamless filtering and classification in an integrated environment. Developed for life scientists, it is helpful in postprocessing and refining the output of software for predicting LTR retrotransposons up to the stage of preparing full-length reference sequence libraries. The LTRsift software is freely available at http://www.zbh.uni-hamburg.de/LTRsift under an open-source license.
Classification of time series patterns from complex dynamic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, J.C.; Rao, N.
1998-07-01
An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.
Assessment of Occupational Noise Exposure among Groundskeepers in North Carolina Public Universities
Balanay, Jo Anne G.; Kearney, Gregory D.; Mannarino, Adam J.
2016-01-01
Groundskeepers may have an increased risk of noise-induced hearing loss due to the performance of excessively noisy tasks. This study assessed the exposure of groundskeepers to noise at multiple universities and determined the association between noise exposure and several variables (i.e., university, month, tool used). Personal noise exposures were monitored during the work shift using noise dosimetry. A sound level meter was used to measure the maximum sound pressure levels from groundskeeping equipment. The mean Occupational Safety and Health Administration (OSHA) and National Institute for Occupational Safety and Health (NIOSH) time-weighted average (TWA) noise exposures were 83.0 ± 9.6 and 88.0 ± 6.7 dBA, respectively. About 52% of the OSHA TWAs and 77% of the NIOSH TWAs exceeded 85 dBA. Riding mower use was associated with high TWA noise exposures and with having OSHA TWAs exceeding 85 and 90 dBA. The maximum sound pressure levels of the equipment and tools measured ranged from 76 to 109 dBA, 82% of which were >85 dBA. These findings support the conclusion that groundskeepers have excessive noise exposures, which may be effectively reduced through careful scheduling of the use of noisy equipment and tools. PMID:27330303
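The OSHA and NIOSH TWAs reported above differ because the two bodies use different criterion levels and exchange rates. A minimal sketch of the standard conversion from a dosimeter dose reading to an 8-hour TWA, assuming the usual parameters (OSHA: 90 dBA criterion, 5-dB exchange rate; NIOSH: 85 dBA criterion, 3-dB exchange rate); the dose value is hypothetical.

    import math

    def twa_from_dose(dose_percent, criterion_dba, exchange_db):
        # TWA = criterion + (exchange / log10(2)) * log10(dose / 100)
        k = exchange_db / math.log10(2.0)
        return criterion_dba + k * math.log10(dose_percent / 100.0)

    dose = 160.0  # hypothetical dosimeter reading, percent of allowable dose
    print(f"OSHA TWA:  {twa_from_dose(dose, 90.0, 5.0):.1f} dBA")
    print(f"NIOSH TWA: {twa_from_dose(dose, 85.0, 3.0):.1f} dBA")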
A solution for exposure tool optimization at the 65-nm node and beyond
NASA Astrophysics Data System (ADS)
Itai, Daisuke
2007-03-01
As device geometries shrink, tolerances for critical dimension, focus, and overlay control decrease. For the stable manufacture of semiconductor devices at (and beyond) the 65 nm node, performance variability and drift in exposure tools are no longer negligible factors. With EES (Equipment Engineering System) as a guidepost, expectations of improving the productivity of semiconductor manufacturing are growing. We are developing a system, EESP (Equipment Engineering Support Program), based on the concept of EES. The EESP system collects and stores large volumes of detailed data generated by Canon lithographic equipment while product is being manufactured. It uses those data to monitor both equipment characteristics and process characteristics, which cannot be examined without such a system. The goal of EESP is to maximize equipment capabilities by feeding the results back to APC/FDC and the equipment maintenance list. This was a collaborative study of the system's effectiveness at the device maker's factories. We analyzed the performance variability of exposure tools using focus residual data, and we attempted to optimize tool performance using the analyzed results. The EESP system can make the optimum performance of exposure tools available to the device maker.
Proposed changes in the classification of carcinogenic chemicals in the work area.
Neumann, H G; Thielmann, H W; Filser, J G; Gelbke, H P; Greim, H; Kappus, H; Norpoth, K H; Reuter, U; Vamvakas, S; Wardenbach, P; Wichmann, H E
1997-12-01
Carcinogenic chemicals in the work area are currently classified into three categories in Section III of the German List of MAK and BAT Values. This classification is based on qualitative criteria and reflects essentially the weight of evidence available for judging the carcinogenic potential of the chemicals. It is proposed that these Categories--IIIA1, IIIA2, and IIIB--be retained as Categories 1, 2, and 3, to conform with EU regulations. On the basis of our advancing knowledge of reaction mechanisms and the potency of carcinogens, it is now proposed that these three categories be supplemented with two additional categories. The essential feature of substances classified in the new categories is that exposure to these chemicals does not convey a significant risk of cancer to man, provided that an appropriate exposure limit (MAK value) is observed. It is proposed that chemicals known to act typically by nongenotoxic mechanisms and for which information is available that allows evaluation of the effects of low-dose exposures be classified in Category 4. Genotoxic chemicals for which low carcinogenic potency can be expected on the basis of dose-response relationships and toxicokinetics and for which risk at low doses can be assessed will be classified in Category 5. The basis for a better differentiation of carcinogens is discussed, the new categories are defined, and possible criteria for classification are described. Examples for Category 4 (1,4-dioxane) and Category 5 (styrene) are presented. The proposed changes in classifying carcinogenic chemicals in the work area are presented for further discussion.
ERIC Educational Resources Information Center
Moffitt, Kevin Christopher
2011-01-01
The three objectives of this dissertation were to develop a question type model for predicting linguistic features of responses to interview questions, create a tool for linguistic analysis of documents, and use lexical bundle analysis to identify linguistic differences between fraudulent and non-fraudulent financial reports. First, The Moffitt…
Molecular diagnostics of inflammatory disease: New tools and perspectives.
Garzorz-Stark, Natalie; Lauffer, Felix
2017-08-01
This essay reviews current approaches to establish novel molecular diagnostic tools for inflammatory skin diseases. Moreover, it highlights the importance of stratifying patients according to molecular signatures and revising current outdated disease classification systems to eventually reach the goal of personalized medicine. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Improving Environmental Model Calibration and Prediction
2011-01-18
Final Report: Improving Environmental Model Calibration and Prediction. First, we have continued to... develop tools for efficient global optimization of environmental models. Our algorithms are hybrid algorithms that combine evolutionary strategies... toward practical hybrid optimization tools for environmental models.
A New Computational Tool for Understanding Light-Matter Interactions
2016-02-11
Final Report: A New Computational Tool for Understanding Light-Matter Interactions. Plasmonic resonance of a metallic nanostructure results from coherent motion of its conduction electrons driven by... Subject terms: plasmonics, light-matter interaction, time-dependent density functional theory, modeling and...
Lee, Yun Jin; Kim, Jung Yoon
2016-03-01
The objective of this study was to evaluate the effect of pressure ulcer classification system education on clinical nurses' knowledge and visual differential diagnostic ability regarding pressure ulcer (PU) classification and incontinence-associated dermatitis (IAD). A one-group pre- and post-test design was used. A convenience sample of 407 nurses participating in a PU classification education programme offered as continuing education was enrolled. The education programme comprised a 50-minute lecture on PU classification and case studies. The PU classification system and IAD knowledge test (PUCS-KT) and a visual differential diagnostic ability tool (VDDAT), consisting of 21 photographs with accompanying clinical information, were used. Paired t-tests were performed using SPSS/WIN 20.0. The overall mean scores on both the PUCS-KT (t = -11·437, P < 0·001) and the VDDAT (t = -21·113, P < 0·001) increased significantly after PU classification education. Overall understanding of the six PU classifications and IAD increased after the education programme, but visual differential diagnostic ability remained lacking for Stage III PU, suspected deep tissue injury (SDTI), and unstageable ulcers. Continuous, differentiated education based on clinical practice is needed to improve knowledge and visual differential diagnostic ability for PU classification, and a comparative experimental study is required to examine the effects of education programmes. © 2016 Medicalhelplines.com Inc and John Wiley & Sons Ltd.
SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE
The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package designated Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...
Exposure Modeling Tools and Databases for Consideration for Relevance to the Amended TSCA (ISES)
The Agency’s Office of Research and Development (ORD) has a number of exposure modeling tools and databases in ongoing development. These efforts are anticipated to be useful in supporting ongoing implementation of the amended Toxic Substances Control Act (TSCA). Under ORD’s Chemic...
Malware distributed collection and pre-classification system using honeypot technology
NASA Astrophysics Data System (ADS)
Grégio, André R. A.; Oliveira, Isabela L.; Santos, Rafael D. C.; Cansian, Adriano M.; de Geus, Paulo L.
2009-04-01
Malware has become a major threat in recent years due to the ease of spread through the Internet. Malware detection has become difficult with the use of compression, polymorphic methods and techniques to detect and disable security software. These and other obfuscation techniques pose a problem for detection and classification schemes that analyze malware behavior. In this paper we propose a distributed architecture to improve malware collection using different honeypot technologies to increase the variety of malware collected. We also present a daemon tool developed to grab malware distributed through spam and a pre-classification technique that uses antivirus technology to separate malware into generic classes.
Patnode, K.A.; White, D.H.
1991-01-01
A prototypic experimental design was used to assess sublethal effects of multiple and varied organophosphates and carbamates on reproduction in birds. The design allowed for classification of pesticide exposure according to the toxicity of applied compounds and the type and frequency of applications. Daily survival rates (DSRs) of nests, eggs, and nestlings were determined for northern mockingbirds (Mimus polyglottos), brown thrashers (Toxostoma rufum), and northern cardinals (Cardinalis cardinalis) nesting along edges of pecan orchards and row crops in southern Georgia [USA]. Egg and nestling DSRs for all species combined varied inversely (P < 0.05) among the three exposure levels. Brain cholinesterase activities were age-dependent and substantiated adult, but not nestling, exposure. Results suggest that increasing exposure to pesticides may reduce songbird productivity.
78 FR 36093 - Fenpyroximate; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-17
... pesticide manufacturer. The following list of North American Industrial Classification System (NAICS) codes... there is reliable information.'' This includes exposure through drinking water and in residential... the available scientific data and other relevant information in support of this action. EPA has...
NASA Astrophysics Data System (ADS)
Bellón, Beatriz; Bégué, Agnès; Lo Seen, Danny; Lebourgeois, Valentine; Evangelista, Balbino Antônio; Simões, Margareth; Demonte Ferraz, Rodrigo Peçanha
2018-06-01
Fine-scale maps of cropping systems over large areas provide key information for agricultural production and environmental impact assessments, and thus represent a valuable tool for effective land-use planning. There is, therefore, a growing interest in mapping cropping systems in an operational manner over large areas, and remote sensing approaches based on vegetation index time series analysis have proven to be an efficient tool. However, supervised pixel-based approaches are commonly adopted, requiring resource-consuming field campaigns to gather training data. In this paper, we present a new object-based unsupervised classification approach tested on an annual MODIS 16-day composite Normalized Difference Vegetation Index time series and a Landsat 8 mosaic of the State of Tocantins, Brazil, for the 2014-2015 growing season. Two variants of the approach are compared: a hyperclustering approach, and a landscape-clustering approach involving a prior stratification of the study area into landscape units on which the clustering is then performed. The main cropping systems of Tocantins, characterized by crop types and cropping patterns, were efficiently mapped with the landscape-clustering approach. Results show that stratification prior to clustering significantly improves the classification accuracies for underrepresented and sparsely distributed cropping systems. This study illustrates the potential of unsupervised classification for large-area cropping system mapping and contributes to the development of generic tools for supporting large-scale agricultural monitoring across regions.
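A minimal sketch of the unsupervised step described above, using k-means on per-object NDVI time series; the data, cluster counts, and stratum labels are synthetic placeholders rather than the study's actual processing chain.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    ndvi = rng.random((500, 23))          # 500 objects x 23 annual NDVI composites
    hyper = KMeans(n_clusters=8, n_init=10).fit_predict(ndvi)  # hyperclustering

    # Landscape-clustering variant: stratify first, then cluster per stratum.
    strata = rng.integers(0, 3, size=500) # hypothetical landscape units
    for s in np.unique(strata):
        members = strata == s
        labels = KMeans(n_clusters=4, n_init=10).fit_predict(ndvi[members])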
ESKAPE/CF: A Knowledge Acquisition Tool for Expert Systems Using Cognitive Feedback
1991-03-01
Naval Postgraduate School, Monterey, California. Thesis: ESKAPE/CF: A Knowledge Acquisition Tool for Expert Systems Using Cognitive Feedback. ...a knowledge acquisition tool using cognitive feedback (ESKAPE/CF), based on Lens model techniques which have demonstrated effectiveness in capturing policy knowledge.
NASA Astrophysics Data System (ADS)
Briones, J. C.; Heras, V.; Abril, C.; Sinchi, E.
2017-08-01
The proper control of built heritage entails many challenges related to the complexity of heritage elements and the extent of the area to be managed, for which the available resources must be used efficiently. In this scenario, the preventive conservation approach, based on the concept that prevention is better than cure, emerges as a strategy to avoid the progressive and imminent loss of monuments and heritage sites. Regular monitoring appears to be a key tool for the timely identification of changes in heritage assets. This research demonstrates that a supervised learning model (Support Vector Machines, SVM) is an ideal tool to support the monitoring process by detecting visible elements in aerial images, such as roof structures, vegetation and pavements. The linear, Gaussian and polynomial kernel functions were tested; the linear function provided better results than the other functions. Because of the high level of segmentation generated by the classification procedure, it was necessary to apply a generalization process through opening, a mathematical morphological operation, which simplified the over-classification of the monitored elements.
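A minimal sketch of the kernel comparison described above, with synthetic stand-ins for the per-pixel image features; it is not the study's actual training data or parameter settings.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.random((300, 3))             # e.g. RGB values of training pixels
    y = rng.integers(0, 3, size=300)     # roof / vegetation / pavement labels
    for kernel in ("linear", "rbf", "poly"):   # rbf = Gaussian kernel
        acc = cross_val_score(SVC(kernel=kernel), X, y, cv=5).mean()
        print(f"{kernel}: {acc:.3f}")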
Bispectral infrared forest fire detection and analysis using classification techniques
NASA Astrophysics Data System (ADS)
Aranda, Jose M.; Melendez, Juan; de Castro, Antonio J.; Lopez, Fernando
2004-01-01
Infrared cameras are well established as a useful tool for fire detection, but their use for quantitative forest fire measurements faces difficulties due to the complex spatial and spectral structure of fires. In this work it is shown that some of these difficulties can be overcome by applying classification techniques, a standard tool for the analysis of multispectral satellite images, to bispectral images of fires. Images were acquired by two cameras that operate in the medium infrared (MIR) and thermal infrared (TIR) bands; they provide simultaneous, co-registered images calibrated in brightness temperatures. The MIR-TIR scatterplot of these images can be used to classify the scene into different fire regions (background, ashes, and several ember and flame regions). It is shown that classification makes it possible to obtain quantitative measurements of physical fire parameters such as rate of spread, ember temperature, and radiated power in the MIR and TIR bands. An estimation of total radiated power and heat release per unit area is also made and compared with values derived from heat of combustion and fuel consumption.
Natural resources inventory and land evaluation in Switzerland
NASA Technical Reports Server (NTRS)
Haefner, H. (Principal Investigator)
1976-01-01
The author has identified the following significant results. Using MSS channels 5 and 7 and a supervised classification system with a PPD classification algorithm, it was possible to map the exact areal extent of the snow cover and of the transition zone, with melting snow patches and snow-free parts of various sizes, over a large area under different aspects such as relief, exposure, and shadows. A correlation of the data from ground control, aerial underflights, and earth resources satellites provided a very accurate interpretation of the melting of snow in high mountains.
Pesticide-Related Hospitalizations Among Children and Teenagers in Texas, 2004-2013.
Trueblood, Amber B; Shipp, Eva; Han, Daikwon; Ross, Jennifer; Cizmas, Leslie H
2016-01-01
Acute exposure to pesticides is associated with nausea, headaches, rashes, eye irritation, seizures, and, in severe cases, death. We characterized pesticide-related hospitalizations in Texas among children and teenagers for 2004-2013 to better understand exposures in this population, which is less well understood than pesticide exposure among adults. We abstracted information on pesticide-related hospitalizations from hospitalization data using pesticide-related International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes and E-codes. We calculated the prevalence of pesticide-related hospitalizations among children and teenagers aged ≤19 years for all hospitalizations, unintentional exposures, intentional exposures, pesticide classifications, and illness severity. We also calculated age- and sex-specific prevalence of pesticide-related hospitalizations among children. The prevalence of pesticide-related hospitalizations among children and teenagers was 2.1 per 100,000 population. The prevalence of pesticide-related hospitalizations per 100,000 population was 2.7 for boys and 1.5 for girls. The age-specific prevalence per 100,000 population was 5.3 for children aged 0-4 years, 0.3 for children and teenagers aged 5-14 years, and 2.3 for teenagers aged 15-19 years. Children aged 0-4 years had the highest prevalence of unintentional exposures, whereas teenagers aged 15-19 years had the highest prevalence of intentional exposures. Commonly reported pesticide categories were organophosphates/carbamates, disinfectants, rodenticides, and other pesticides (e.g., pyrethrins, pyrethroids). Of the 158 pesticide-related hospitalizations, most were coded as having minor (n=86) or moderate (n=40) illness severity. Characterizing the prevalence of pesticide-related hospitalizations among children and teenagers leads to a better understanding of the burden of pesticide exposures, including the types of pesticides involved and the severity of potential health effects. This study found differences in the frequency of pesticide-related hospitalizations by sex, age, and intent (e.g., unintentional vs. intentional).
Rosen, Lisa M.; Liu, Tao; Merchant, Roland C.
2016-01-01
BACKGROUND Blood and body fluid exposures are frequently evaluated in emergency departments (EDs). However, efficient and effective methods for estimating their incidence are not yet established. OBJECTIVE Evaluate the efficiency and accuracy of estimating statewide ED visits for blood or body fluid exposures using International Classification of Diseases, Ninth Revision (ICD-9), code searches. DESIGN Secondary analysis of a database of ED visits for blood or body fluid exposure. SETTING EDs of 11 civilian hospitals throughout Rhode Island from January 1, 1995, through June 30, 2001. PATIENTS Patients presenting to the ED for possible blood or body fluid exposure were included, as determined by prespecified ICD-9 codes. METHODS Positive predictive values (PPVs) were estimated to determine the ability of 10 ICD-9 codes to distinguish ED visits for blood or body fluid exposure from ED visits that were not for blood or body fluid exposure. Recursive partitioning was used to identify an optimal subset of ICD-9 codes for this purpose. Random-effects logistic regression modeling was used to examine variations in ICD-9 coding practices and styles across hospitals. Cluster analysis was used to assess whether the choice of ICD-9 codes was similar across hospitals. RESULTS The PPV for the original 10 ICD-9 codes was 74.4% (95% confidence interval [CI], 73.2%–75.7%), whereas the recursive partitioning analysis identified a subset of 5 ICD-9 codes with a PPV of 89.9% (95% CI, 88.9%–90.8%) and a misclassification rate of 10.1%. The ability, efficiency, and use of the ICD-9 codes to distinguish types of ED visits varied across hospitals. CONCLUSIONS Although an accurate subset of ICD-9 codes could be identified, variations across hospitals related to hospital coding style, efficiency, and accuracy greatly affected estimates of the number of ED visits for blood or body fluid exposure. PMID:22561713
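The central metric in the study above is the positive predictive value (PPV) of a candidate ICD-9 code subset. A minimal sketch of that computation; the visit records and codes below are hypothetical placeholders, not the study's data.

    def ppv(visits, code_subset):
        # visits: list of (set_of_icd9_codes, is_true_exposure) pairs.
        flagged = [truth for codes, truth in visits if codes & code_subset]
        return sum(flagged) / len(flagged) if flagged else float("nan")

    visits = [({"E920.5"}, True), ({"V15.85"}, True), ({"879.8"}, False)]
    print(ppv(visits, {"E920.5", "V15.85"}))  # 1.0 on this toy example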
Notched audiograms and noise exposure history in older adults.
Nondahl, David M; Shi, Xiaoyu; Cruickshanks, Karen J; Dalton, Dayna S; Tweed, Ted S; Wiley, Terry L; Carmichael, Lakeesha L
2009-12-01
Using data from a population-based cohort study, we compared four published algorithms for identifying notched audiograms and compared their resulting classifications with noise exposure history. Four published algorithms (referred to here as methods 1, 2, 3, and 4) were used to identify notched audiograms. Audiometric evaluations were collected as part of the 10-yr follow-up examinations of the Epidemiology of Hearing Loss Study in Beaver Dam, WI (2003-2005, N = 2395). Detailed noise exposure histories were collected by interview at the baseline examination (1993-1995) and updated at subsequent visits. An extensive history of occupational noise exposure, participation in noisy hobbies, and firearm usage was used to evaluate the consistency of the notch classifications with the history of noise exposure. The prevalence of notched audiograms varied greatly by definition (31.7, 25.9, 47.2, and 11.7% for methods 1, 2, 3, and 4, respectively). In this cohort, a history of noise exposure was common (56.2% for occupational noise, 71.7% for noisy hobbies, 13.4% for firearms, and 81.2% for any of these three sources). Among participants with a notched audiogram, almost one-third did not have a history of occupational noise exposure (31.4, 33.0, 32.5, and 28.1% for methods 1, 2, 3, and 4, respectively), and approximately 11% did not have a history of exposure to any of the three sources of noise (11.5, 13.6, 10.3, and 7.6%). Discordance was greater in women than in men. These results suggest that there is poor agreement across existing algorithms for audiometric notches. In addition, notches can occur in the absence of a positive noise history. In the absence of an objective consensus definition of a notched audiogram, and in light of the degree of discordance in women between noise history and notches by each of these algorithms, researchers should be cautious about classifying noise-induced hearing loss by notched audiograms.
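For illustration, a generic notch rule can be written in a few lines. The sketch below is not one of the four published algorithms compared in the study; the 10-dB depth and 5-dB recovery criteria are illustrative assumptions.

    # Thresholds in dB HL, keyed by frequency in kHz.
    def has_notch(audiogram, depth_db=10.0, recovery_db=5.0):
        base = min(audiogram[1.0], audiogram[2.0])     # best low-frequency threshold
        notch = max(audiogram[3.0], audiogram[4.0], audiogram[6.0])
        return notch - base >= depth_db and notch - audiogram[8.0] >= recovery_db

    ear = {1.0: 10, 2.0: 15, 3.0: 30, 4.0: 45, 6.0: 40, 8.0: 25}
    print(has_notch(ear))  # True for this example ear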
Caetano dos Santos, Florentino Luciano; Skottman, Heli; Juuti-Uusitalo, Kati; Hyttinen, Jari
2016-01-01
Aims A fast, non-invasive and observer-independent method to analyze the homogeneity and maturity of human pluripotent stem cell (hPSC) derived retinal pigment epithelial (RPE) cells is warranted to assess the suitability of hPSC-RPE cells for implantation or in vitro use. The aim of this work was to develop and validate methods to create ensembles of state-of-the-art texture descriptors and to provide a robust classification tool to separate three different maturation stages of RPE cells by using phase contrast microscopy images. The same methods were also validated on a wide variety of biological image classification problems, such as histological or virus image classification. Methods For image classification we used different texture descriptors, descriptor ensembles and preprocessing techniques. Also, three new methods were tested. The first approach was an ensemble of preprocessing methods, to create an additional set of images. The second was the region-based approach, where saliency detection and wavelet decomposition divide each image in two different regions, from which features were extracted through different descriptors. The third method was an ensemble of Binarized Statistical Image Features, based on different sizes and thresholds. A Support Vector Machine (SVM) was trained for each descriptor histogram and the set of SVMs combined by sum rule. The accuracy of the computer vision tool was verified in classifying the hPSC-RPE cell maturation level. Dataset and Results The RPE dataset contains 1862 subwindows from 195 phase contrast images. The final descriptor ensemble outperformed the most recent stand-alone texture descriptors, obtaining, for the RPE dataset, an area under ROC curve (AUC) of 86.49% with the 10-fold cross validation and 91.98% with the leave-one-image-out protocol. The generality of the three proposed approaches was ascertained with 10 more biological image datasets, obtaining an average AUC greater than 97%. Conclusions Here we showed that the developed ensembles of texture descriptors are able to classify the RPE cell maturation stage. Moreover, we proved that preprocessing and region-based decomposition improves many descriptors’ accuracy in biological dataset classification. Finally, we built the first public dataset of stem cell-derived RPE cells, which is publicly available to the scientific community for classification studies. The proposed tool is available at https://www.dei.unipd.it/node/2357 and the RPE dataset at http://www.biomeditech.fi/data/RPE_dataset/. Both are available at https://figshare.com/s/d6fb591f1beb4f8efa6f. PMID:26895509
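A minimal sketch of the sum-rule fusion described above: one SVM per descriptor histogram, with per-class probabilities summed across descriptors. The two random feature blocks stand in for real texture descriptors (e.g., histograms extracted from the phase contrast subwindows); they are not the study's data.

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(2)
    y = rng.integers(0, 3, size=200)                # three maturation stages
    train = [rng.random((200, 64)), rng.random((200, 128))]   # two descriptors
    test = [rng.random((20, 64)), rng.random((20, 128))]

    score_sum = np.zeros((20, 3))
    for Xtr, Xte in zip(train, test):
        clf = SVC(probability=True).fit(Xtr, y)
        score_sum += clf.predict_proba(Xte)         # sum rule over SVM outputs
    predictions = score_sum.argmax(axis=1)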
2010-01-01
Comparison of extracellular striatal acetylcholine and brain seizure activity following acute exposure to the nerve agents cyclosarin and tabun in freely moving guinea pigs. Subject terms: lethality; nerve agents; organophosphorus compounds; seizure activity; tabun.
An automated approach to mapping corn from Landsat imagery
Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.
2004-01-01
Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.
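A minimal sketch of the areal-estimate idea described above: rank pixels by a corn-likelihood score and place class boundaries so that the mapped area tracks the reported county corn acreage. The scores, acreage, and the 20% buffer for the 'likely corn' class are illustrative assumptions, not the paper's calibration.

    import numpy as np

    rng = np.random.default_rng(3)
    scores = rng.random(10000)            # per-pixel corn-likelihood scores
    pixel_ha = 0.09                       # area of one 30 m Landsat pixel, ha
    reported_ha = 450.0                   # hypothetical county corn acreage
    n_corn = int(reported_ha / pixel_ha)

    order = np.argsort(scores)[::-1]      # most corn-like pixels first
    labels = np.full(scores.size, "unlikely corn", dtype=object)
    labels[order[:n_corn]] = "highly likely corn"
    labels[order[n_corn:int(1.2 * n_corn)]] = "likely corn"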
A scheme for a flexible classification of dietary and health biomarkers.
Gao, Qian; Praticò, Giulia; Scalbert, Augustin; Vergères, Guy; Kolehmainen, Marjukka; Manach, Claudine; Brennan, Lorraine; Afman, Lydia A; Wishart, David S; Andres-Lacueva, Cristina; Garcia-Aloy, Mar; Verhagen, Hans; Feskens, Edith J M; Dragsted, Lars O
2017-01-01
Biomarkers are an efficient means to examine intakes or exposures and their biological effects and to assess system susceptibility. Aided by novel profiling technologies, the biomarker research field is undergoing rapid development, and new putative biomarkers continuously emerge in the scientific literature. However, the existing concepts for the classification of biomarkers in the dietary and health area may be ambiguous, leading to uncertainty about their application. In order to better understand the potential of biomarkers and to communicate their use and application, it is imperative to have a solid scheme for biomarker classification that will provide a well-defined ontology for the field. In this manuscript, we provide an improved scheme for biomarker classification based on their intended use rather than the technology or outcomes (six subclasses are suggested: food compound intake biomarkers (FCIBs), food or food component intake biomarkers (FIBs), dietary pattern biomarkers (DPBs), food compound status biomarkers (FCSBs), effect biomarkers, and physiological or health state biomarkers). The application of this scheme is described in detail for the dietary and health area and is compared with previous biomarker classifications for this field of research.
Developing Decontamination Tools and Approaches to ...
Developing Decontamination Tools and Approaches to Address Indoor Pesticide Contamination from Improper Bed Bug Treatments. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA's mission to protect human health and the environment. The HEASD research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces methods, measurements, and models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for the EPA.
Hyperspectral analysis of Columbia spotted frog habitat
Shive, J.P.; Pilliod, D.S.; Peterson, C.R.
2010-01-01
Wildlife managers increasingly are using remotely sensed imagery to improve habitat delineations and sampling strategies. Advances in remote sensing technology, such as hyperspectral imagery, provide more information than previously was available with multispectral sensors. We evaluated accuracy of high-resolution hyperspectral image classifications to identify wetlands and wetland habitat features important for Columbia spotted frogs (Rana luteiventris) and compared the results to multispectral image classification and United States Geological Survey topographic maps. The study area spanned 3 lake basins in the Salmon River Mountains, Idaho, USA. Hyperspectral data were collected with an airborne sensor on 30 June 2002 and on 8 July 2006. A 12-year comprehensive ground survey of the study area for Columbia spotted frog reproduction served as validation for image classifications. Hyperspectral image classification accuracy of wetlands was high, with a producer's accuracy of 96% (44 wetlands) correctly classified with the 2002 data and 89% (41 wetlands) correctly classified with the 2006 data. We applied habitat-based rules to delineate breeding habitat from other wetlands, and successfully predicted 74% (14 wetlands) of known breeding wetlands for the Columbia spotted frog. Emergent sedge microhabitat classification showed promise for directly predicting Columbia spotted frog egg mass locations within a wetland by correctly identifying 72% (23 of 32) of known locations. Our study indicates hyperspectral imagery can be an effective tool for mapping spotted frog breeding habitat in the selected mountain basins. We conclude that this technique has potential for improving site selection for inventory and monitoring programs conducted across similar wetland habitat and can be a useful tool for delineating wildlife habitats. © 2010 The Wildlife Society.
Classification and authentication of unknown water samples using machine learning algorithms.
Kundu, Palash K; Panchariya, P C; Kundu, Madhusree
2011-07-01
This paper proposes a machine-learning-based approach for real-life water sample classification and authentication. The proposed techniques use experimental measurements from a pulse voltammetry method built on an electronic tongue (E-tongue) instrumentation system with silver and platinum electrodes. An E-tongue comprises arrays of solid-state ion sensors, transducers (possibly of different types), data collectors, and data analysis tools, all oriented to the classification of liquid samples and the authentication of unknown liquid samples. The time series signal and the corresponding raw data represent the measurement from a multi-sensor system. The E-tongue system, implemented in a laboratory environment for six different ISI (Bureau of Indian Standards) certified water samples (Aquafina, Bisleri, Kingfisher, Oasis, Dolphin, and McDowell), was the data source for developing two types of machine learning algorithms: classification and regression. A water data set consisting of six sample classes, each described by 4402 features, was considered. A PCA (principal component analysis) based classification and authentication tool was developed in this study as the machine learning component of the E-tongue system. A dedicated partial least squares (PLS) based classifier, used to authenticate a specific category of water sample, evolved as an integral part of the E-tongue instrumentation system. The developed PCA- and PLS-based E-tongue system achieved encouraging overall authentication accuracy, performing well for the aforesaid categories of water samples. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
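A minimal sketch of a comparable PCA-based classification step and a PLS-DA-style authentication step, assuming synthetic stand-in data of the same shape (scikit-learn components; not the authors' code):
```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler, label_binarize

# Synthetic stand-in for the voltammetric data: 120 samples x 4402 features,
# six brand classes (the real study used E-tongue pulse voltammetry signals).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 4402))
y = rng.integers(0, 6, size=120)

# PCA-based classification: compress the E-tongue signal, then classify in PC space.
clf = make_pipeline(StandardScaler(), PCA(n_components=10), KNeighborsClassifier())
clf.fit(X, y)

# PLS-DA-style authentication: regress one-hot class indicators on the signals;
# a sample is "authenticated" as the class with the largest predicted score.
Xs = StandardScaler().fit_transform(X)
Y = label_binarize(y, classes=list(range(6)))
pls = PLSRegression(n_components=10).fit(Xs, Y)
pred = pls.predict(Xs).argmax(axis=1)
```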
Breaking Barriers and Building Bridges: Using EJ SCREEN ...
Communities across the United States are faced with concerns about environmental risks and exposures, including air contaminants near roadways, proximity to hazardous waste sites, and children's environmental health. These concerns are compounded by complicated data, limited opportunities for collaboration, and resource-based restrictions such as funding. This workshop will introduce innovative approaches for combining the capacity of EPA science tools: EJ SCREEN and the recently released Community Focused Exposure and Risk Screening Tool (C-FERST). Following a nationally applicable case study, participants will learn how these tools can be used sequentially to (1) identify community environmental health 'hotspots'; (2) take a closer look at local-scale sources of exposure; and (3) use new features of the tools to target potential partners and resources across the country. By exploring the power of GIS mapping and crowdsourced data, participants will leave with simple, user-defined approaches for using state-of-the-science tools to advance their community and environmental health projects. Presentation using EJ SCREEN and C-FERST.
NASA Astrophysics Data System (ADS)
Fujita, Yusuke; Mitani, Yoshihiro; Hamamoto, Yoshihiko; Segawa, Makoto; Terai, Shuji; Sakaida, Isao
2017-03-01
Ultrasound imaging is a popular, non-invasive tool used in the diagnosis of liver disease. Cirrhosis is a chronic liver disease that can advance to liver cancer; early detection and appropriate treatment are crucial to prevent this progression. However, ultrasound image analysis is very challenging because of the low signal-to-noise ratio of ultrasound images. The selection of training regions of interest (ROIs) strongly affects classification accuracy. The purpose of our study is high-accuracy cirrhosis detection using liver ultrasound images. In our previous work, training ROI selection by MILBoost and multiple-ROI classification based on the product rule were proposed to achieve high classification performance. In this article, we propose a self-training method to select training ROIs effectively. Experiments were performed to evaluate the effect of self-training using both manually and automatically selected ROIs. The results show that self-training on manually selected ROIs achieved higher classification performance than the other approaches, including our conventional methods. Manual ROI definition and careful sample selection are thus important for improving classification accuracy in cirrhosis detection using ultrasound images.
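A minimal sketch of self-training for ROI selection, using scikit-learn's SelfTrainingClassifier on synthetic stand-in features (the authors' actual feature extraction and classifier are not reproduced here):
```python
import numpy as np
from sklearn.semi_supervised import SelfTrainingClassifier
from sklearn.svm import SVC

# Synthetic stand-in: texture features for candidate ultrasound ROIs;
# label -1 marks ROIs left unlabeled, per scikit-learn's convention.
rng = np.random.default_rng(4)
X = rng.normal(size=(300, 20))
y = np.where(rng.random(300) < 0.2, rng.integers(0, 2, size=300), -1)

# Self-training: fit on the labeled ROIs, then repeatedly pseudo-label the most
# confident unlabeled ROIs and refit, growing the training set automatically.
base = SVC(kernel="rbf", probability=True)
model = SelfTrainingClassifier(base, threshold=0.9).fit(X, y)
```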
Contribution of non-negative matrix factorization to the classification of remote sensing images
NASA Astrophysics Data System (ADS)
Karoui, M. S.; Deville, Y.; Hosseini, S.; Ouamri, A.; Ducrot, D.
2008-10-01
Remote sensing has become an unavoidable tool for better managing our environment, generally by producing maps of land cover using classification techniques. The classification process requires some pre-processing, especially for data size reduction. The most common technique is Principal Component Analysis. Another approach consists in regarding each pixel of the multispectral image as a mixture of pure elements contained in the observed area. Using Blind Source Separation (BSS) methods, one can hope to unmix each pixel and to recognize the classes constituting the observed scene. Our contribution consists in using Non-negative Matrix Factorization (NMF) combined with sparse coding as a solution to BSS, in order to generate new images (which are at least partly separated) from HRV SPOT images of the Oran area, Algeria. These images are then used as inputs to a supervised classifier integrating textural information. The classification results for these "separated" images show a clear improvement (correct pixel classification rate improved by more than 20%) compared to classification of the initial (i.e. non-separated) images. These results show the contribution of NMF as an attractive pre-processing step for classification of multispectral remote sensing imagery.
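A minimal sketch of NMF-based unmixing as a pre-processing step before supervised classification, on synthetic nonnegative pixel data (scikit-learn's plain NMF; the paper's sparse-coding variant is not reproduced):
```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for a multispectral scene: pixels x bands, nonnegative values.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(5000, 4))
labels = rng.integers(0, 5, size=5000)      # ground-truth classes for the supervised step

# NMF "unmixes" each pixel into nonnegative abundances of a few pure endmembers.
nmf = NMF(n_components=3, init="nndsvd", max_iter=500, random_state=0)
abundances = nmf.fit_transform(X)           # per-pixel source abundances ("separated" images)
endmembers = nmf.components_                # spectral signatures of the sources

# The abundance images then feed a conventional supervised classifier.
clf = RandomForestClassifier(n_estimators=100).fit(abundances, labels)
```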
77 FR 66715 - Fluridone; Pesticide Tolerances for Emergency Exemptions
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-07
Predicting impaired extinction of traumatic memory and elevated startle.
Nalloor, Rebecca; Bunting, Kristopher; Vazdarjanova, Almira
2011-01-01
Emotionally traumatic experiences can lead to debilitating anxiety disorders, such as phobias and Post-Traumatic Stress Disorder (PTSD). Exposure to such experiences, however, is not sufficient to induce pathology, as only up to one quarter of people exposed to such events develop PTSD. These statistics, combined with findings that smaller hippocampal size prior to the trauma is associated with higher risk of developing PTSD, suggest that there are pre-disposing factors for such pathology. Because prospective studies in humans are limited and costly, investigating such pre-dispositions, and thus advancing understanding of the genesis of such pathologies, requires the use of animal models where predispositions are identified before the emotional trauma. Most existing animal models are retrospective: they classify subjects as those with or without a PTSD-like phenotype long after experiencing a traumatic event. Attempts to create prospective animal models have been largely unsuccessful. Here we report that individual predispositions to a PTSD-like phenotype, consisting of impaired rate and magnitude of extinction of an emotionally traumatic event coupled with long-lasting elevation of acoustic startle responses, can be revealed following exposure to a mild stressor, but before experiencing emotional trauma. We compare, in rats, the utility of several classification criteria and report that a combination of criteria based on acoustic startle responses and behavior in an anxiogenic environment is a reliable predictor of a PTSD-like phenotype. There are individual predispositions to developing impaired extinction and elevated acoustic startle that can be identified after exposure to a mildly stressful event, which by itself does not induce such a behavioral phenotype. The model presented here is a valuable tool for studying the etiology and pathophysiology of anxiety disorders and provides a platform for testing behavioral and pharmacological interventions that can reduce the probability of developing pathologic behaviors associated with such disorders.
Gan, Heng-Hui; Soukoulis, Christos; Fisk, Ian
2014-03-01
In the present work, we have evaluated for the first time the feasibility of APCI-MS volatile compound fingerprinting in conjunction with chemometrics (PLS-DA) as a new strategy for rapid and non-destructive food classification. For this purpose, 202 clarified monovarietal juices extracted from apples differing in their botanical and geographical origin were used to evaluate the performance of APCI-MS as a classification tool. For an independent test set, PLS-DA analyses of pre-treated spectral data gave 100% and 94.2% correct classification rates for classification by cultivar and geographical origin, respectively. Moreover, PLS-DA analysis of APCI-MS in conjunction with GC-MS data revealed that masses within the APCI-MS spectral data set were related to parent ions or fragments of alkyl esters, carbonyl compounds (hexanal, trans-2-hexenal) and alcohols (1-hexanol, 1-butanol, cis-3-hexenol) and had significant discriminating power both in terms of cultivar and geographical origin. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
Noise reduction techniques in the design of a pneumatic-driven hand held power tool
NASA Astrophysics Data System (ADS)
Skinner, Christian M.
2005-09-01
Pneumatic-driven hand-held power tools generate noise in the workplace. Current legislation in Europe and the USA aims at protecting workers against noise exposure. In the United States, the Occupational Safety and Health Administration (OSHA) requires that employers create a hearing conservation program if the noise exposure exceeds 85 dB(A). In the European Community under the Directive 2003/10/EC, employers are required to provide hearing protection if the noise exposure within the working environment exceeds 80 dB(A) and must require hearing protection to be worn if the noise exposure exceeds 85 dB(A). This paper examines the sources of noise which contribute to the overall noise from a hand-held power tool. A test plan was developed to identify these individual sources of noise and to determine if structure-borne noise or airborne noise is the dominant source relative to the overall noise level. The measurements were performed per International Standards Organization (ISO) 15744. This paper will describe the methodology used to identify the noise sources and reduce the overall noise of a hand-held power tool.
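For context, the exposure limits above follow the 90 dB(A) criterion level and 5 dB exchange rate of OSHA's 29 CFR 1910.95; a small sketch of the standard dose and TWA arithmetic (the exposure levels and durations below are illustrative):
```python
import math

def osha_permissible_hours(level_dba: float) -> float:
    """OSHA permissible exposure duration (hours) at a given A-weighted level,
    using the 90 dB(A) criterion level and 5 dB exchange rate."""
    return 8.0 / 2.0 ** ((level_dba - 90.0) / 5.0)

def noise_dose(exposures) -> float:
    """exposures: list of (level_dBA, hours) pairs. Returns dose in percent."""
    return 100.0 * sum(h / osha_permissible_hours(l) for l, h in exposures)

def twa(dose_percent: float) -> float:
    """8-hour time-weighted average equivalent of a noise dose."""
    return 16.61 * math.log10(dose_percent / 100.0) + 90.0

# Example: 4 h of tool operation at 95 dB(A) plus 4 h of background at 85 dB(A).
d = noise_dose([(95.0, 4.0), (85.0, 4.0)])
print(f"dose = {d:.0f}%, TWA = {twa(d):.1f} dB(A)")  # TWA >= 85 dB(A) triggers a hearing conservation program
```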
The 2015 WHO Classification of Tumors of the Thymus: Continuity and Changes
Marx, Alexander; Chan, John K.C.; Coindre, Jean-Michel; Detterbeck, Frank; Girard, Nicolas; Harris, Nancy L.; Jaffe, Elaine S.; Kurrer, Michael O.; Marom, Edith M.; Moreira, Andre L.; Mukai, Kiyoshi; Orazi, Attilio; Ströbel, Philipp
2015-01-01
This overview of the 4th edition of the WHO Classification of thymic tumors has two aims. First, to comprehensively list the established and new tumour entities and variants that are described in the new WHO Classification of thymic epithelial tumors, germ cell tumors, lymphomas, dendritic cell and myeloid neoplasms, and soft tissue tumors of the thymus and mediastinum; second, to highlight major differences in the new WHO Classification that result from the progress that has been made since the 3rd edition in 2004 at immunohistochemical, genetic and conceptual levels. Refined diagnostic criteria for type A, AB, B1–B3 thymomas and thymic squamous cell carcinoma are given and will hopefully improve the reproducibility of the classification and its clinical relevance. The clinical perspective of the classification has been strengthened by involving experts from radiology, thoracic surgery and oncology; by incorporating state-of-the-art PET/CT images; and by depicting prototypic cytological specimens. This makes the thymus section of the new WHO Classification of Tumours of the Lung, Pleura, Thymus and Heart a valuable tool for pathologists, cytologists and clinicians alike. The impact of the new WHO Classification on therapeutic decisions is exemplified in this overview for thymic epithelial tumors and mediastinal lymphomas, and future perspectives and challenges are discussed. PMID:26295375
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auerbach, Scott S; Shah, Ruchir R; Mav, Deepak; Smith, Cynthia S; Walker, Nigel J; Vallant, Molly K; Boorman, Gary A; Irwin, Richard D
2010-03-15
Identification of carcinogenic activity is the primary goal of the 2-year bioassay. The expense of these studies limits the number of chemicals that can be studied and therefore chemicals need to be prioritized based on a variety of parameters. We have developed an ensemble of support vector machine classification models based on male F344 rat liver gene expression following 2, 14 or 90 days of exposure to a collection of hepatocarcinogens (aflatoxin B1, 1-amino-2,4-dibromoanthraquinone, N-nitrosodimethylamine, methyleugenol) and non-hepatocarcinogens (acetaminophen, ascorbic acid, tryptophan). Seven models were generated based on individual exposure durations (2, 14 or 90 days) or a combination of exposures (2+14, 2+90, 14+90 and 2+14+90 days). All sets of data, with the exception of one, yielded models with 0% cross-validation error. Independent validation of the models was performed using expression data from the liver of rats exposed at 2 dose levels to a collection of alkenylbenzene flavoring agents. Depending on the model used and the exposure duration of the test data, independent validation error rates ranged from 47% to 10%. The variable with the most notable effect on independent validation accuracy was exposure duration of the alkenylbenzene test data. All models generally exhibited improved performance as the exposure duration of the alkenylbenzene data increased. The models differentiated between hepatocarcinogenic (estragole and safrole) and non-hepatocarcinogenic (anethole, eugenol and isoeugenol) alkenylbenzenes previously studied in a carcinogenicity bioassay. In the case of safrole the models correctly differentiated between carcinogenic and non-carcinogenic dose levels. The models predict that two alkenylbenzenes not previously assessed in a carcinogenicity bioassay, myristicin and isosafrole, would be weakly hepatocarcinogenic if studied at a dose level of 2 mmol/kg bw/day for 2 years in male F344 rats; therefore suggesting that these chemicals should be a higher priority relative to other untested alkenylbenzenes for evaluation in the carcinogenicity bioassay. The results of the study indicate that gene expression-based predictive models are an effective tool for identifying hepatocarcinogens. Furthermore, we find that exposure duration is a critical variable in the success or failure of such an approach, particularly when evaluating chemicals with unknown carcinogenic potency. Published by Elsevier Inc.
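A minimal sketch of one such duration-specific model, cross-validated support vector classification on synthetic stand-in expression data (not the authors' ensemble or data):
```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in: rows = treated rat livers, columns = gene expression values,
# labels = 1 for hepatocarcinogen exposure, 0 for non-hepatocarcinogen.
rng = np.random.default_rng(42)
X = rng.normal(size=(60, 2000))
y = rng.integers(0, 2, size=60)

# One SVM per exposure duration; the study builds seven such models
# (2, 14, 90 days and their combinations) and reports cross-validation error.
model = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
cv_error = 1.0 - cross_val_score(model, X, y, cv=5).mean()
print(f"cross-validation error: {cv_error:.2%}")
```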
The Classification and Evaluation of Computer-Aided Software Engineering Tools
1990-09-01
…a rapid series of new approaches have been adopted in recent years, including information engineering, entity-relationship modeling, and automatic code generation… support true information sharing among tools and automated consistency checking. Moreover, the repository must record and manage the relationships…
de Barros, Alba Lúcia; Fakih, Flávio Trevisani; Michel, Jeanne Liliane
2002-01-01
This article reports the pathway used to build a prototype of a computerized clinical decision-making support system for nurses, using the NANDA, NIC, and NOC classifications, as an auxiliary tool for entering nursing data into the computerized patient record of Hospital São Paulo/UNIFESP.
ERIC Educational Resources Information Center
Rose, Carolyn; Wang, Yi-Chia; Cui, Yue; Arguello, Jaime; Stegmann, Karsten; Weinberger, Armin; Fischer, Frank
2008-01-01
In this article we describe the emerging area of text classification research focused on the problem of collaborative learning process analysis both from a broad perspective and more specifically in terms of a publicly available tool set called TagHelper tools. Analyzing the variety of pedagogically valuable facets of learners' interactions is a…
Multivariate Density Estimation and Remote Sensing
NASA Technical Reports Server (NTRS)
Scott, D. W.
1983-01-01
Current efforts to develop methods and computer algorithms to effectively represent multivariate data commonly encountered in remote sensing applications are described. While this may involve scatter diagrams, multivariate representations of nonparametric probability density estimates are emphasized. The density function provides a useful graphical tool for looking at data and a useful theoretical tool for classification. The approach is illustrated with a thunderstorm data analysis.
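A minimal sketch of a multivariate nonparametric density estimate of the kind described, using SciPy's Gaussian KDE on synthetic two-channel data:
```python
import numpy as np
from scipy.stats import gaussian_kde

# Two synthetic "channels" of remote-sensing measurements (500 observations).
rng = np.random.default_rng(6)
data = rng.normal(size=(2, 500))            # gaussian_kde expects shape (dims, n_obs)

kde = gaussian_kde(data)                    # nonparametric multivariate density estimate
grid = np.array([[0.0, 1.0], [0.0, -1.0]])  # two query points, one per column
print(kde(grid))                            # estimated density at the query points
```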
ERIC Educational Resources Information Center
Darot, Mireille
1983-01-01
The usefulness of classifications within and comparisons among languages as a means of discovering the commonalities of human language is discussed. Metalinguistics offers not only the potential for analyzing the specifics of each language, but also the tools for teaching across languages. (MSE)
A new tool for post-AGB SED classification
NASA Astrophysics Data System (ADS)
Bendjoya, P.; Suarez, O.; Galluccio, L.; Michel, O.
We present the results of an unsupervised classification method applied to a set of 344 spectral energy distributions (SEDs) of post-AGB stars extracted from the Torun catalogue of Galactic post-AGB stars. This work aims to provide a new, unbiased approach to post-AGB star classification based on the information contained in the IR region of the SED (fluxes, IR excess, colours). We used data from the IRAS and MSX satellites and from the 2MASS survey. The classification is based on the construction of a minimal spanning tree (MST) over the dataset using Prim's algorithm. In order to build this tree, different metrics were tested on both fluxes and colour indices. Our method classifies the set of 344 post-AGB stars into 9 distinct groups according to their SEDs.
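A minimal sketch of MST-based clustering: build the tree over SED features and cut the longest edges to obtain the desired number of groups (SciPy's MST routine on synthetic features; the paper's metrics and Prim implementation differ):
```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

# Rows = stars, columns = SED features (IR fluxes, colours); synthetic stand-in.
rng = np.random.default_rng(3)
features = rng.normal(size=(344, 6))

# Build the MST over the full pairwise-distance graph.
dist = squareform(pdist(features, metric="euclidean"))
mst = minimum_spanning_tree(dist).toarray()

# Cutting the k-1 longest MST edges leaves k connected components = k classes.
k = 9
cut = np.sort(mst[mst > 0])[-(k - 1)]
mst[mst >= cut] = 0.0
n_groups, group = connected_components(mst, directed=False)
print(n_groups)  # 9 clusters of SEDs
```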
NASA Scope and Subject Category Guide
NASA Technical Reports Server (NTRS)
2011-01-01
This guide provides a simple, effective tool to assist aerospace information analysts and database builders in the high-level subject classification of technical materials. Each of the 76 subject categories comprising the classification scheme is presented with a description of category scope, a listing of subtopics, cross references, and an indication of particular areas of NASA interest. The guide also includes an index of nearly 3,000 specific research topics cross referenced to the subject categories. The portable document format (PDF) version of the guide contains links in the index from each input subject to its corresponding categories. In addition to subject classification, the guide can serve as an aid to searching databases that use the classification scheme, and is also an excellent selection guide for those involved in the acquisition of aerospace literature. The CD-ROM contains both HTML and PDF versions.
Valiante, D J; Richards, T B; Kinsley, K B
1992-01-01
To identify workplaces in New Jersey with potential for silica exposure, the New Jersey Department of Health compared four-digit Standard Industrial Classifications (SICs) identified by three different data sources: the National Occupational Exposure Survey (NOES), a New Jersey silicosis case registry, and regulatory agency compliance inspections in New Jersey. In total, the three data sources identified 204 SICs in New Jersey with potential for silica exposure. Forty-five percent of these SICs were identified by NOES only, 16% by registry cases only, 6% by compliance inspections only, and 33% by two or more sources. Since different surveillance sources implicate different SICs, this type of analysis is a useful first step in planning programs for prevention of silicosis.
[Clinical forms of major depressive states observed in the Ivory Coast. Classification trial].
Megglé, D; Série, E; Veillon, F; Delafosse, J; Hazera, M
1989-12-01
After a previous analysis of African depressions in studies based on DSM-III as a preliminary tool, the authors now seek to understand more directly the different ways in which depressed Ivorians express lowered self-esteem, as well as the various meanings of the agitation observed among them. From this material the authors derive an attempt at a nosographic classification closely linked to local reality.
ERIC Educational Resources Information Center
French, Russell L.; And Others
The Annehurst Curriculum Classification System (ACCS), a tool for matching individual learners with appropriate curriculum materials, was used with a group of fifty-nine students (Air National Guard officer candidates) and their four instructor-advisors to examine two issues: (1) the applicability of the ACCS in a highly structured,…
Shameem, K M Muhammed; Choudhari, Khoobaram S; Bankapur, Aseefhali; Kulkarni, Suresh D; Unnikrishnan, V K; George, Sajan D; Kartha, V B; Santhosh, C
2017-05-01
Classification of plastics is of great importance in the recycling industry, as the volume of littered plastic waste grows steadily with its extensive use. In this paper, we demonstrate the efficacy of a combined laser-induced breakdown spectroscopy (LIBS)-Raman system for the rapid identification and classification of post-consumer plastics. The atomic and molecular information of polyethylene terephthalate, polyethylene, polypropylene, and polystyrene were studied using plasma emission spectra and scattered signal obtained in the LIBS and Raman techniques, respectively. The collected spectral features of the samples were analyzed using statistical tools (principal component analysis, Mahalanobis distance) to categorize the plastics. The analyses of the data clearly show that the elemental and molecular information obtained from these techniques is efficient for classification of plastics. In addition, the molecular information collected via Raman spectroscopy exhibits clearly distinct features for the transparent plastics (100% discrimination), whereas the LIBS technique shows better spectral feature differences for the colored samples. The study shows that the information obtained from these complementary techniques allows the complete classification of the plastic samples, irrespective of color or additives. This work further shows that the potential limitations of either technique for sample identification can be overcome by the complementarity of the two.
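A minimal sketch of the PCA-plus-Mahalanobis-distance classification step on synthetic stand-in spectra (the actual LIBS/Raman features and class structure are not reproduced):
```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in: spectra (rows) for four polymer classes (PET, PE, PP, PS).
rng = np.random.default_rng(7)
X = rng.normal(size=(200, 500))
y = rng.integers(0, 4, size=200)

# Reduce the LIBS/Raman spectra to a few principal components first.
scores = PCA(n_components=5).fit_transform(X)

def mahalanobis_label(z, scores, y):
    """Assign z to the class whose centroid is nearest in Mahalanobis distance."""
    best, label = np.inf, -1
    for c in np.unique(y):
        S = scores[y == c]
        mu = S.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(S, rowvar=False))  # pinv guards against singular covariances
        d = float((z - mu) @ cov_inv @ (z - mu))
        if d < best:
            best, label = d, c
    return label

print(mahalanobis_label(scores[0], scores, y))
```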
[Evaluation of new and emerging health technologies. Proposal for classification].
Prados-Torres, J D; Vidal-España, F; Barnestein-Fonseca, P; Gallo-García, C; Irastorza-Aldasoro, A; Leiva-Fernández, F
2011-01-01
Review and develop a proposal for the classification of health technologies (HT) evaluated by the Health Technology Assessment Agencies (HTAA). Peer review of AETS of the previous proposed classification of HT. Analysis of their input and suggestions for amendments. Construction of a new classification. Pilot study with physicians. Andalusian Public Health System. Spanish HTAA. Experts from HTAA. Tutors of family medicine residents. HT Update classification previously made by the research team. Peer review by Spanish HTAA. Qualitative and quantitative analysis of responses. Construction of a new and pilot study based on 12 evaluation reports of the HTAA. We obtained 11 thematic categories that are classified into 6 major head groups: 1, prevention technology; 2, diagnostic technology; 3, therapeutic technologies; 4, diagnostic and therapeutic technologies; 5, organizational technology, and 6, knowledge management and quality of care. In the pilot there was a good concordance in the classification of 8 of the 12 reports reviewed by physicians. Experts agree on 11 thematic categories of HT. A new classification of HT with double entry (Nature and purpose of HT) is proposed. APPLICABILITY: According to experts, the classification of the work of the HTAA may represent a useful tool to transfer and manage knowledge. Moreover, an adequate classification of the HTAA reports would help clinicians and other potential users to locate them and this can facilitate their dissemination. Copyright © 2010 SECA. Published by Elsevier Espana. All rights reserved.
Scott, Laura L F; Maldonado, George
2015-10-15
The purpose of this analysis was to quantify and adjust for disease misclassification from loss to follow-up in a historical cohort mortality study of workers where exposure was categorized as a multi-level variable. Disease classification parameters were defined using 2008 mortality data for the New Zealand population and the proportions of known deaths observed for the cohort. The probability distributions for each classification parameter were constructed to account for potential differences in mortality due to exposure status, gender, and ethnicity. Probabilistic uncertainty analysis (bias analysis), which uses Monte Carlo techniques, was then used to sample each parameter distribution 50,000 times, calculating adjusted odds ratios (OR_DM-LTF) that compared the mortality of workers with the highest cumulative exposure to those considered never-exposed. The geometric mean OR_DM-LTF ranged between 1.65 (certainty interval (CI): 0.50-3.88) and 3.33 (CI: 1.21-10.48), and the geometric mean of the disease-misclassification error factor (ε_DM-LTF), which is the ratio of the observed odds ratio to the adjusted odds ratio, had a range of 0.91 (CI: 0.29-2.52) to 1.85 (CI: 0.78-6.07). Only when workers in the highest exposure category were more likely than those never-exposed to be misclassified as non-cases did the OR_DM-LTF frequency distributions shift further away from the null. The application of uncertainty analysis to historical cohort mortality studies with multi-level exposures can provide valuable insight into the magnitude and direction of study error resulting from losses to follow-up.
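A minimal sketch of this kind of probabilistic bias analysis, with invented observed counts and assumed beta distributions for sensitivity and specificity (illustrative only, not the study's parameters; the same draws are applied to both groups here for brevity):
```python
import numpy as np

rng = np.random.default_rng(0)
n_iter = 50_000

# Hypothetical observed data (not from the paper): (deaths, workers) per group.
obs = {"high": (30, 400), "never": (20, 500)}

def corrected_cases(a, n, se, sp):
    """Back-correct an observed case count for imperfect classification."""
    return (a - (1.0 - sp) * n) / (se + sp - 1.0)

ors = []
for _ in range(n_iter):
    se = rng.beta(80, 20)   # assumed sensitivity of death ascertainment
    sp = rng.beta(99, 1)    # assumed specificity
    a1 = corrected_cases(*obs["high"], se, sp)
    a0 = corrected_cases(*obs["never"], se, sp)
    if 0 < a1 < obs["high"][1] and 0 < a0 < obs["never"][1]:
        ors.append((a1 / (obs["high"][1] - a1)) / (a0 / (obs["never"][1] - a0)))

ors = np.array(ors)
gm = np.exp(np.log(ors).mean())
lo, hi = np.percentile(ors, [2.5, 97.5])
print(f"geometric-mean adjusted OR = {gm:.2f} (CI {lo:.2f}-{hi:.2f})")
```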
Exposure to smoking in movies among British adolescents 2001-2006.
Anderson, Stacey J; Millett, Christopher; Polansky, Jonathan R; Glantz, Stanton A
2010-06-01
To estimate youth exposure to smoking in movies in the UK and compare the likely effect with the USA. We collected tobacco occurrence data for 572 top-grossing films screened in the UK from 2001 to 2006 and estimated the number of on-screen tobacco impressions delivered to British youths in this time period. 91% of films in our sample that contained smoking were youth-rated films (British Board of Film Classification rating '15' and lower), delivering at least 1.10 billion tobacco impressions to British youths during theatrical release. British youths were exposed to 28% more smoking impressions in UK youth-rated movies than American youth-rated movies, because 79% of movies rated for adults in the USA ('R') are classified as suitable for youths in the UK ('15' or '12A'). Because there is a dose-response relation between the amount of on-screen exposure to smoking and the likelihood that adolescents will begin smoking, the substantially higher exposure to smoking in youth-rated films in the UK suggests that the fraction of all youth smoking attributable to films is probably larger in the UK than in the USA. Other countries with ratings systems that are less conservative (in terms of language and sexuality) than the USA's will also be likely to deliver more on-screen tobacco impressions to youths. Assigning an '18' classification to movies that contain smoking would substantially reduce youth exposure to on-screen smoking and, hence, smoking initiation among British youths.
Singularity and Nonnormality in the Classification of Compositional Data
Bohling, Geoffrey C.; Davis, J.C.; Olea, R.A.; Harff, Jan
1998-01-01
Geologists may want to classify compositional data and express the classification as a map. Regionalized classification is a tool that can be used for this purpose, but it incorporates discriminant analysis, which requires the computation and inversion of a covariance matrix. Covariance matrices of compositional data always will be singular (noninvertible) because of the unit-sum constraint. Fortunately, discriminant analyses can be calculated using a pseudo-inverse of the singular covariance matrix; this is done automatically by some statistical packages such as SAS. Granulometric data from the Darss Sill region of the Baltic Sea is used to explore how the pseudo-inversion procedure influences discriminant analysis results, comparing the algorithm used by SAS to the more conventional Moore-Penrose algorithm. Logratio transforms have been recommended to overcome problems associated with analysis of compositional data, including singularity. A regionalized classification of the Darss Sill data after logratio transformation differs only slightly from one based on raw granulometric data, suggesting that closure problems do not severely influence regionalized classification of compositional data.
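A minimal sketch of the singularity problem and the two workarounds discussed, on synthetic compositional data:
```python
import numpy as np

# Synthetic grain-size compositions: each row sums to 1 (the unit-sum constraint).
rng = np.random.default_rng(5)
raw = rng.dirichlet(alpha=[4.0, 3.0, 2.0, 1.0], size=100)

cov = np.cov(raw, rowvar=False)
print(np.linalg.eigvalsh(cov)[0])     # ~0: the unit-sum constraint makes cov singular

# Option 1: use a Moore-Penrose pseudo-inverse in the discriminant calculation.
cov_pinv = np.linalg.pinv(cov)

# Option 2: an additive logratio (alr) transform yields a full-rank covariance.
alr = np.log(raw[:, :-1] / raw[:, -1:])
print(np.linalg.matrix_rank(np.cov(alr, rowvar=False)))  # 3 of 3: invertible
```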
Adaptive phase k-means algorithm for waveform classification
NASA Astrophysics Data System (ADS)
Song, Chengyun; Liu, Zhining; Wang, Yaojun; Xu, Feng; Li, Xingming; Hu, Guangmin
2018-01-01
Waveform classification is a powerful technique for seismic facies analysis that describes the heterogeneity and compartments within a reservoir. Horizon interpretation is a critical step in waveform classification. However, the horizon often produces inconsistent waveform phase, and thus results in an unsatisfactory classification. To alleviate this problem, an adaptive phase waveform classification method called the adaptive phase k-means is introduced in this paper. Our method improves the traditional k-means algorithm using an adaptive phase distance as the waveform similarity measure. The proposed distance allows the phase to vary as it moves from sample to sample along the traces. Model traces are also updated with the best phase interference in the iterative process. Therefore, our method is robust to phase variations caused by the interpretation horizon. We tested the effectiveness of our algorithm by applying it to synthetic and real data. The satisfactory results reveal that the proposed method tolerates a certain amount of waveform phase variation and is a good tool for seismic facies analysis.
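A minimal sketch of a phase-adaptive k-means, using a simple grid search over constant phase rotations as the phase-insensitive distance (the paper's per-sample variable-phase distance is more refined than this):
```python
import numpy as np
from scipy.signal import hilbert

def phase_rotate(trace, phi):
    """Rotate a trace's phase by phi radians via its analytic signal."""
    return np.real(hilbert(trace) * np.exp(1j * phi))

def adaptive_phase_distance(x, centroid, phis=np.linspace(-np.pi, np.pi, 36)):
    """Distance insensitive to a constant phase shift (simple grid search)."""
    return min(np.linalg.norm(x - phase_rotate(centroid, p)) for p in phis)

def adaptive_phase_kmeans(traces, k=4, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    centroids = traces[rng.choice(len(traces), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each trace to the nearest centroid under the phase-adaptive distance.
        d = np.array([[adaptive_phase_distance(t, c) for c in centroids] for t in traces])
        labels = d.argmin(axis=1)
        # Update centroids as member means (keep the old centroid if a cluster empties).
        centroids = np.array([traces[labels == j].mean(axis=0) if np.any(labels == j)
                              else centroids[j] for j in range(k)])
    return labels, centroids

# Demo: two wavelet classes under random phase rotations and noise.
rng = np.random.default_rng(8)
t = np.linspace(-1.0, 1.0, 64)
w1 = (1 - 2 * (np.pi * t) ** 2) * np.exp(-(np.pi * t) ** 2)   # Ricker-like wavelet
w2 = np.gradient(w1)                                          # a second waveform class
traces = np.array([phase_rotate((w1, w2)[i % 2], rng.uniform(-1, 1)) +
                   0.05 * rng.normal(size=64) for i in range(100)])
labels, _ = adaptive_phase_kmeans(traces, k=2)
```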
Li, Hongkun; Zhang, Xuefeng; Xu, Fujian
2013-09-18
Centrifugal compressors are a key piece of equipment for modern production. Among the components of the centrifugal compressor, the impeller is a pivotal part, as it is used to transform kinetic energy into pressure energy. Blade crack condition monitoring and classification has been broadly investigated in industry and academia. In this research, a pressure pulsation (PP) sensor arranged in close vicinity to the crack area and the corresponding casing vibration signals are used to monitor blade crack information. As these signals cannot directly reveal the blade crack, the method employed in this research is based on the extraction of weak signal characteristics induced by blade cracking. A method for blade crack classification based on the monitored signals, using a squared envelope spectrum (SES), is presented. Experimental investigations on blade crack classification are carried out to verify the effectiveness of this method. The results show that it is an effective tool for blade crack classification in centrifugal compressors.
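A minimal sketch of a squared envelope spectrum, which exposes weak periodic amplitude modulations of the kind crack faults induce (the signal, sampling rate, and frequencies below are invented):
```python
import numpy as np
from scipy.signal import hilbert

def squared_envelope_spectrum(x, fs):
    """Squared envelope spectrum: FFT of the squared magnitude of the analytic signal.
    Weak periodic modulations appear as discrete spectral lines."""
    env2 = np.abs(hilbert(x)) ** 2
    env2 = env2 - env2.mean()                # remove the DC component
    spec = np.abs(np.fft.rfft(env2)) / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, spec

# Synthetic casing-vibration signal: a 900 Hz carrier amplitude-modulated at 25 Hz.
fs = 5000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
x = (1 + 0.3 * np.sin(2 * np.pi * 25 * t)) * np.sin(2 * np.pi * 900 * t)
f, s = squared_envelope_spectrum(x, fs)
print(f[s.argmax()])  # ~25 Hz: the modulation line stands out
```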
Kilavuz, Ahmet Erdem; Songu, Murat; İmre, Abdulkadir; Arslanoğlu, Secil; Özkul, Yilmaz; Pinar, Ercan; Ateş, Düzgün
2018-05-01
The accuracy of fine-needle aspiration biopsy (FNAB) in parotid tumors is controversial. We aimed to compare FNAB results with the final histopathological diagnosis and to apply the "Sal classification" to our data, discussing its results and its place in parotid gland cytology. The FNAB cytological findings and final histological diagnoses were assessed retrospectively in 2 different scenarios based on the distribution of nondefinitive cytology, and we applied the Sal classification and determined the malignancy rate, sensitivity, and specificity for each category. In the 2 scenarios, FNAB sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were found to be 81%, 87%, 54.7%, and 96.1%; and 65.3%, 100%, 100%, and 96.1%, respectively. The malignancy rates, sensitivity, and specificity were also calculated and discussed for each Sal category. We believe that the Sal classification has great potential to be a useful tool in the classification of parotid gland cytology. © 2018 Wiley Periodicals, Inc.
Ecoregions as a level of ecological analysis
Wright, R.G.; Murray, M.P.; Merrill, T.
1998-01-01
There have been many attempts to classify geographic areas into zones of similar characteristics. Recent focus has been on ecoregions. We examined how well the boundaries of the most commonly used ecoregion classifications for the US matched the boundaries of existing vegetation cover mapped at three levels of classification: fine, mid-, and coarse scale. We analyzed ecoregions in Idaho, Oregon, and Washington. The results were similar between the two ecoregion classifications. For both ecoregion delineations and all three vegetation classifications, the patterns of existing vegetation did not correspond well with the patterns of ecoregions. Most vegetation types had a small proportion of their total area in a given ecoregion. There was also no dominance by one or more vegetation types in any ecoregion and, contrary to our hypothesis, the level of congruence of vegetation patterns with ecoregion boundaries decreased as the level of classification became more general. The implications of these findings for the use of ecoregions as a planning tool and in the development of land conservation efforts are discussed.
Fuzzy support vector machine: an efficient rule-based classification technique for microarrays.
Hajiloo, Mohsen; Rabiee, Hamid R; Anooshahpour, Mahdi
2013-01-01
The abundance of gene expression microarray data has led to the development of machine learning algorithms for tackling disease diagnosis, disease prognosis, and treatment selection problems. However, these algorithms often produce classifiers with weaknesses in terms of accuracy, robustness, and interpretability. This paper introduces the fuzzy support vector machine, a learning algorithm based on the combination of fuzzy classifiers and kernel machines, for microarray classification. Experimental results on public leukemia, prostate, and colon cancer datasets show that the fuzzy support vector machine, applied in combination with filter or wrapper feature selection methods, develops a more robust model with higher accuracy than conventional microarray classification models such as the support vector machine, artificial neural network, decision trees, k nearest neighbors, and diagonal linear discriminant analysis. Furthermore, the interpretable rule base inferred from the fuzzy support vector machine helps extract biological knowledge from microarray data. The fuzzy support vector machine, as a new classification model with high generalization power, robustness, and good interpretability, seems to be a promising tool for gene expression microarray classification.
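scikit-learn has no fuzzy SVM; a minimal sketch that approximates one by converting fuzzy memberships into per-sample weights for a standard SVC (the membership rule here, distance to the class centroid, is one common choice and not necessarily the paper's):
```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(11)
# Synthetic microarray-like data: 100 samples x 50 selected genes, binary diagnosis.
X = rng.normal(size=(100, 50))
y = rng.integers(0, 2, size=100)

# Fuzzy memberships: down-weight samples far from their own class centroid,
# so likely-noisy arrays contribute less to the decision boundary.
membership = np.empty(len(y))
for c in (0, 1):
    d = np.linalg.norm(X[y == c] - X[y == c].mean(axis=0), axis=1)
    membership[y == c] = 1.0 - 0.9 * d / d.max()

clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y, sample_weight=membership)   # memberships act as per-sample costs
```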
Ernst, Corinna; Hahnen, Eric; Engel, Christoph; Nothnagel, Michael; Weber, Jonas; Schmutzler, Rita K; Hauke, Jan
2018-03-27
The use of next-generation sequencing approaches in clinical diagnostics has led to a tremendous increase in data and a vast number of variants of uncertain significance that require interpretation. Therefore, prediction of the effects of missense mutations using in silico tools has become a frequently used approach. The aim of this study was to assess the reliability of in silico prediction as a basis for clinical decision making in the context of hereditary breast and/or ovarian cancer. We tested the performance of four prediction tools (Align-GVGD, SIFT, PolyPhen-2, MutationTaster2) using a set of 236 BRCA1/2 missense variants that had previously been classified by expert committees. However, a major pitfall in the creation of a reliable evaluation set for our purpose is the generally accepted classification of BRCA1/2 missense variants using the multifactorial likelihood model, which is partially based on Align-GVGD results. To overcome this drawback we identified 161 variants whose classification is independent of any previous in silico prediction. In addition to their performance as stand-alone tools, we examined the sensitivity, specificity, accuracy and Matthews correlation coefficient (MCC) of combined approaches. PolyPhen-2 achieved the lowest sensitivity (0.67), specificity (0.67), accuracy (0.67) and MCC (0.39). Align-GVGD achieved the highest values of specificity (0.92), accuracy (0.92) and MCC (0.73), but was outperformed regarding its sensitivity (0.90) by SIFT (1.00) and MutationTaster2 (1.00). All tools suffered from poor specificities, resulting in an unacceptable proportion of false positive results in a clinical setting. This shortcoming could not be bypassed by combination of these tools. In the best case scenario, 138 families would be affected by the misclassification of neutral variants within the cohort of patients of the German Consortium for Hereditary Breast and Ovarian Cancer. We show that due to low specificities, state-of-the-art in silico prediction tools are not suitable for predicting the pathogenicity of variants of uncertain significance in BRCA1/2. Thus, clinical consequences should never be based solely on in silico forecasts. However, our data suggest that SIFT and MutationTaster2 could be suitable to predict benignity, as neither tool produced false negative predictions in our analysis.
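For reference, the four reported metrics follow directly from a confusion matrix; a minimal sketch with invented counts (not the study's data):
```python
import math

def binary_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, accuracy, and Matthews correlation coefficient."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fp + tn + fn)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return sens, spec, acc, mcc

# Hypothetical confusion matrix for one prediction tool (illustrative numbers only).
print(binary_metrics(tp=27, fp=12, tn=110, fn=3))
```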
Nanoparticle exposure biomonitoring: exposure/effect indicator development approaches
NASA Astrophysics Data System (ADS)
Marie-Desvergne, C.; Dubosson, M.; Lacombe, M.; Brun, V.; Mossuz, V.
2015-05-01
The use of engineered nanoparticles (NP) is increasingly widespread across industrial sectors. The inhalation route of exposure is a particular concern, given the known adverse effects of ultrafine-particle air pollution and asbestos. No NP biomonitoring recommendations or standards are available so far. The LBM laboratory is currently studying several approaches to developing bioindicators for occupational health applications. As regards exposure indicators, new tools are being implemented to assess potentially inhaled NP in non-invasive respiratory samples (nasal sampling and exhaled breath condensates (EBC)). Diverse NP analytical characterization methods are used (ICP-MS, dynamic light scattering, and electron microscopy coupled to energy-dispersive X-ray analysis). As regards effect indicators, a methodology has been developed to assess a panel of 29 cytokines in EBCs (potential respiratory inflammation due to NP exposure). Secondly, collaboration between the LBM laboratory and the EDyp team has allowed the EBC proteome to be characterized by means of an LC-MS/MS workflow. These projects are expected to facilitate the development of individual NP exposure biomonitoring tools and the analysis of early potential impacts on health. Innovative techniques such as field-flow fractionation combined with ICP-MS and single-particle ICP-MS are currently being explored. These tools are directly intended to assist occupational physicians in the identification of exposure situations.
Inferring Population Exposure from Biomonitoring Data on Urinary Concentrations (SOT)
Biomonitoring studies such as the National Health and Nutrition Examination Survey (NHANES) are valuable to exposure assessment both as sources of data to evaluate exposure models and as training sets to develop heuristics for rapid-exposure-assessment tools. However, linking in...
Computational Approaches and Tools for Exposure Prioritization and Biomonitoring Data Interpretation
The ability to describe the source-environment-exposure-dose-response continuum is essential for identifying exposures of greater concern to prioritize chemicals for toxicity testing or risk assessment, as well as for interpreting biomarker data for better assessment of exposure ...
GEAS Spectroscopy Tools for Authentic Research Investigations in the Classroom
NASA Astrophysics Data System (ADS)
Rector, Travis A.; Vogt, Nicole P.
2018-06-01
Spectroscopy is one of the most powerful tools that astronomers use to study the universe. However, relatively few resources are available that enable undergraduates to explore astronomical spectra interactively. We present web-based applications which guide students through the analysis of real spectra of stars, galaxies, and quasars. The tools are written in HTML5 and function in all modern web browsers on computers and tablets. No software needs to be installed, nor do any datasets need to be downloaded, enabling students to use the tools in or outside of class (e.g., for online classes). Approachable GUIs allow students to analyze spectra in the same manner as professional astronomers. The stellar spectroscopy tool can fit a continuum with a blackbody and identify spectral features, as well as fit line profiles and determine equivalent widths. The galaxy and AGN tools can also measure redshifts and calcium break strengths. The tools provide access to an archive of hundreds of spectra obtained with the optical telescopes at Kitt Peak National Observatory. It is also possible to load your own spectra or to query the Sloan Digital Sky Survey (SDSS) database. We have also developed curricula to investigate these topics: spectral classification, variable stars, redshift, and AGN classification. We will present the functionality of the tools and describe the associated curriculum. The tools are part of the General Education Astronomy Source (GEAS) project based at New Mexico State University, with support from the National Science Foundation (NSF, AST-0349155) and the National Aeronautics and Space Administration (NASA, NNX09AV36G). Curriculum development was supported by the NSF (DUE-0618849 and DUE-0920293).
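A minimal sketch of the blackbody continuum fit that the stellar tool performs, fitting a scaled Planck function to a synthetic spectrum with SciPy (the temperature, wavelength range, and noise level are invented):
```python
import numpy as np
from scipy.optimize import curve_fit

h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

def blackbody(wl_m, T, scale):
    """Planck spectral radiance times a free scale factor (continuum model)."""
    return scale * (2 * h * c**2 / wl_m**5) / np.expm1(h * c / (wl_m * k * T))

# Synthetic "observed" continuum of a ~6000 K star over 400-700 nm, with 2% noise.
wl = np.linspace(400e-9, 700e-9, 200)
flux = blackbody(wl, 6000.0, 1e-12) * (1 + 0.02 * np.random.default_rng(2).normal(size=wl.size))

popt, _ = curve_fit(blackbody, wl, flux, p0=(5000.0, 1e-12))
print(f"fitted T = {popt[0]:.0f} K")
```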