Barty, Anton; Kirian, Richard A.; Maia, Filipe R. N. C.; Hantke, Max; Yoon, Chun Hong; White, Thomas A.; Chapman, Henry
2014-01-01
The emerging technique of serial X-ray diffraction, in which diffraction data are collected from samples flowing across a pulsed X-ray source at repetition rates of 100 Hz or higher, has necessitated the development of new software in order to handle the large data volumes produced. Sorting of data according to different criteria and rapid filtering of events to retain only diffraction patterns of interest result in significant reductions in data volume, thereby simplifying subsequent data analysis and management tasks. Meanwhile, the generation of reduced data in the form of virtual powder patterns, radial stacks, histograms and other metadata creates data set summaries for analysis and overall experiment evaluation. Rapid data reduction early in the analysis pipeline is proving to be an essential first step in serial imaging experiments, prompting the authors to make the tool described in this article available to the general community. Originally developed for experiments at X-ray free-electron lasers, the software is based on a modular facility-independent library to promote portability between different experiments and is available under version 3 or later of the GNU General Public License. PMID:24904246
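A schematic sketch (ours, not the published software's API) of the kind of event filtering and reduced-data generation described above: keep only frames whose number of bright Bragg-candidate pixels exceeds a hit threshold, and accumulate a virtual powder pattern from the accepted frames. Thresholds and array shapes are illustrative assumptions.

```python
import numpy as np

def is_hit(frame, pixel_threshold=500.0, min_peak_pixels=30):
    """Crude hit finder: a frame is a 'hit' if enough pixels are bright."""
    return int(np.count_nonzero(frame > pixel_threshold)) >= min_peak_pixels

def reduce_run(frames, pixel_threshold=500.0):
    """Filter a run of detector frames and build simple reduced data."""
    powder = None                 # virtual powder pattern (sum of accepted frames)
    n_hits = 0
    bright_counts = []            # per-frame bright-pixel counts (for histograms)
    for frame in frames:
        bright_counts.append(int(np.count_nonzero(frame > pixel_threshold)))
        if is_hit(frame, pixel_threshold):
            n_hits += 1
            powder = frame.copy() if powder is None else powder + frame
    return n_hits, powder, np.asarray(bright_counts)

# Toy run: 100 mostly-empty frames, a few with injected bright "peaks".
rng = np.random.default_rng(0)
frames = rng.normal(10.0, 3.0, size=(100, 128, 128))
for i in rng.choice(100, size=7, replace=False):
    frames[i, rng.integers(0, 128, 40), rng.integers(0, 128, 40)] = 1000.0

n_hits, powder, counts = reduce_run(frames)
print(f"hits kept: {n_hits} of {len(frames)}")
```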
Graph embedding and extensions: a general framework for dimensionality reduction.
Yan, Shuicheng; Xu, Dong; Zhang, Benyu; Zhang, Hong-Jiang; Yang, Qiang; Lin, Stephen
2007-01-01
Over the past few decades, a large family of algorithms - supervised or unsupervised; stemming from statistics or geometry theory - has been designed to provide different solutions to the problem of dimensionality reduction. Despite the different motivations of these algorithms, we present in this paper a general formulation known as graph embedding to unify them within a common framework. In graph embedding, each algorithm can be considered as the direct graph embedding or its linear/kernel/tensor extension of a specific intrinsic graph that describes certain desired statistical or geometric properties of a data set, with constraints from scale normalization or a penalty graph that characterizes a statistical or geometric property that should be avoided. Furthermore, the graph embedding framework can be used as a general platform for developing new dimensionality reduction algorithms. By utilizing this framework as a tool, we propose a new supervised dimensionality reduction algorithm called Marginal Fisher Analysis (MFA), in which the intrinsic graph characterizes the intraclass compactness and connects each data point with its neighboring points of the same class, while the penalty graph connects the marginal points and characterizes the interclass separability. We show that MFA effectively overcomes the limitations of the traditional Linear Discriminant Analysis (LDA) algorithm due to data distribution assumptions and available projection directions. Real face recognition experiments show the superiority of our proposed MFA in comparison to LDA, as well as for the corresponding kernel and tensor extensions.
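For reference, the unified objective at the heart of the graph-embedding framework described above can be written in its standard form, with W the intrinsic-graph weight matrix, B a constraint matrix coming from scale normalization or the penalty graph, L = D - W the graph Laplacian, and d a constant:

\[
  y^{*} \;=\; \arg\min_{y^{\top} B y = d} \sum_{i \neq j} \lVert y_i - y_j \rVert^{2} W_{ij}
        \;=\; \arg\min_{y^{\top} B y = d} \, y^{\top} L y .
\]

The linear, kernel and tensor extensions mentioned in the abstract constrain y to the corresponding parametric forms (e.g. \(y = X^{\top} w\) in the linear case), which is how LDA and MFA emerge as special cases with particular intrinsic and penalty graphs.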
Youpi: YOUr processing PIpeline
NASA Astrophysics Data System (ADS)
Monnerville, Mathias; Sémah, Gregory
2012-03-01
Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.
MFCompress: a compression tool for FASTA and multi-FASTA data.
Pinho, Armando J; Pratas, Diogo
2014-01-01
The data deluge phenomenon is becoming a serious problem in most genomic centers. To alleviate it, general purpose tools, such as gzip, are used to compress the data. However, although pervasive and easy to use, these tools fall short when the intention is to reduce as much as possible the data, for example, for medium- and long-term storage. A number of algorithms have been proposed for the compression of genomics data, but unfortunately only a few of them have been made available as usable and reliable compression tools. In this article, we describe one such tool, MFCompress, specially designed for the compression of FASTA and multi-FASTA files. In comparison to gzip and applied to multi-FASTA files, MFCompress can provide additional average compression gains of almost 50%, i.e. it potentially doubles the available storage, although at the cost of some more computation time. On highly redundant datasets, and in comparison with gzip, 8-fold size reductions have been obtained. Both source code and binaries for several operating systems are freely available for non-commercial use at http://bioinformatics.ua.pt/software/mfcompress/.
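As a quick sanity check of the storage claim (simple arithmetic, not additional measured data): if gzip yields a compressed size \(S_{\mathrm{gz}}\) and MFCompress gains a further ~50% on top of that, then

\[
  S_{\mathrm{MFC}} \approx 0.5\, S_{\mathrm{gz}}
  \quad\Longrightarrow\quad
  \frac{S_{\mathrm{gz}}}{S_{\mathrm{MFC}}} \approx 2 ,
\]

so roughly twice as many compressed files fit in the same space, which is what "potentially doubles the available storage" refers to.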
Harmonize Pipeline and Archiving System: PESSTO@IA2 Use Case
NASA Astrophysics Data System (ADS)
Smareglia, R.; Knapic, C.; Molinaro, M.; Young, D.; Valenti, S.
2013-10-01
The Italian Astronomical Archives Center (IA2) is a research infrastructure project that aims to coordinate different national and international initiatives to improve the quality of astrophysical data services. IA2 is now also involved in the PESSTO (Public ESO Spectroscopic Survey of Transient Objects) collaboration, developing a complete archiving system to store calibrated post-processed data (including sensitive intermediate products), a user interface to access private data, and Virtual Observatory (VO) compliant web services to access public fast-reduction data via VO tools. The archive system relies on the PESSTO Marshall to provide file data and the associated metadata output by the PESSTO data-reduction pipeline. To harmonize the object repository, data handling and archiving system, new tools are under development. These systems must interact closely without increasing the complexity of any single task, in order to improve the performance of the whole system, and must have robust logic so that all operations are performed in coordination with the other PESSTO tools. MySQL replication technology and triggers are used to synchronize new data in an efficient, fault-tolerant manner. A general-purpose library is under development to manage data from raw observations to final calibrated products, open to the overriding of different sources, formats, management fields, storage and publication policies. Configurations for all the systems are stored in a dedicated schema (no configuration files) and can be easily updated by a planned Archiving System Configuration Interface (ASCI).
New Tools for Managing Agricultural P
NASA Astrophysics Data System (ADS)
Nieber, J. L.; Baker, L. A.; Peterson, H. M.; Ulrich, J.
2014-12-01
Best management practices (BMPs) generally focus on retaining nutrients (especially P) after they enter the watershed. This approach is expensive, unsustainable, and has not led to reductions of P pollution at large scales (e.g., Mississippi River). Although source reduction, which results in reducing inputs of nutrients to a watershed, has long been cited as a preferred approach, we have not had tools to guide source reduction efforts at the watershed level. To augment conventional TMDL tools, we developed an "actionable" watershed P balance approach, based largely on watershed-specific information, yet simple enough to be utilized as a practical tool. Interviews with farmers were used to obtain detailed farm management data, data from livestock permits were adjusted based on site visits, stream P fluxes were calculated from 3 years of monitoring data, and expert knowledge was used to model P fluxes through animal operations. The overall P use efficiency (Puse) was calculated as the sum of deliberate exports (P in animals, milk, eggs, and crops) divided by the sum of deliberate inputs (P in fertilizer, feed, and nursery animals), multiplied by 100. The crop P use efficiency was 1.7, meaning that more P was exported as products than was deliberately imported; we estimate that this mining would have resulted in a loss of 6 mg P/kg across the watershed. Despite the negative P balance, the equivalent of 5% of watershed input was lost via stream export. Tile drainage, the presence of buffer strips, and relatively flat topography result in dominance of P loads by ortho-P (66%) and low particulate P. This, together with ongoing geochemical analysis, suggests that biological processes may be at least as important as sediment transport in controlling P loads. We have developed a P balance calculator tool to enable watershed management organizations to develop watershed P balances and identify opportunities for improving the efficiency of P utilization.
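Restating the balance just described as an explicit formula (our notation; the quantities are the ones listed in the abstract):

\[
  P_{\mathrm{use}} \;=\;
  \frac{P_{\mathrm{animals}} + P_{\mathrm{milk}} + P_{\mathrm{eggs}} + P_{\mathrm{crops}}}
       {P_{\mathrm{fertilizer}} + P_{\mathrm{feed}} + P_{\mathrm{nursery}}} \times 100\% .
\]

A ratio above 100% (the crop-level ratio of 1.7 reported above) means that more P leaves in products than is deliberately imported, i.e. the watershed is mining soil P.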
The SCUBA Data Reduction Pipeline: ORAC-DR at the JCMT
NASA Astrophysics Data System (ADS)
Jenness, Tim; Economou, Frossie
The ORAC data reduction pipeline, developed for UKIRT, has been designed to be a completely general approach to writing data reduction pipelines. This generality has enabled the JCMT to adapt the system for use with SCUBA with minimal development time using the existing SCUBA data reduction algorithms (Surf).
Phylogenetic Tools for Generalized HIV-1 Epidemics: Findings from the PANGEA-HIV Methods Comparison
Ratmann, Oliver; Hodcroft, Emma B.; Pickles, Michael; Cori, Anne; Hall, Matthew; Lycett, Samantha; Colijn, Caroline; Dearlove, Bethany; Didelot, Xavier; Frost, Simon; Hossain, A.S. Md Mukarram; Joy, Jeffrey B.; Kendall, Michelle; Kühnert, Denise; Leventhal, Gabriel E.; Liang, Richard; Plazzotta, Giacomo; Poon, Art F.Y.; Rasmussen, David A.; Stadler, Tanja; Volz, Erik; Weis, Caroline; Leigh Brown, Andrew J.; Fraser, Christophe
2017-01-01
Viral phylogenetic methods contribute to understanding how HIV spreads in populations, and thereby help guide the design of prevention interventions. So far, most analyses have been applied to well-sampled concentrated HIV-1 epidemics in wealthy countries. To direct the use of phylogenetic tools to where the impact of HIV-1 is greatest, the Phylogenetics And Networks for Generalized HIV Epidemics in Africa (PANGEA-HIV) consortium generates full-genome viral sequences from across sub-Saharan Africa. Analyzing these data presents new challenges, since epidemics are principally driven by heterosexual transmission and a smaller fraction of cases is sampled. Here, we show that viral phylogenetic tools can be adapted and used to estimate epidemiological quantities of central importance to HIV-1 prevention in sub-Saharan Africa. We used a community-wide methods comparison exercise on simulated data, where participants were blinded to the true dynamics they were inferring. Two distinct simulations captured generalized HIV-1 epidemics, before and after a large community-level intervention that reduced infection levels. Five research groups participated. Structured coalescent modeling approaches were most successful: phylogenetic estimates of HIV-1 incidence, incidence reductions, and the proportion of transmissions from individuals in their first 3 months of infection correlated with the true values (Pearson correlation > 90%), with small bias. However, on some simulations, true values were markedly outside reported confidence or credibility intervals. The blinded comparison revealed current limits and strengths in using HIV phylogenetics in challenging settings, provided benchmarks for future methods’ development, and supports using the latest generation of phylogenetic tools to advance HIV surveillance and prevention. PMID:28053012
NASA Astrophysics Data System (ADS)
Rose, K.; Bauer, J. R.; Baker, D. V.
2015-12-01
As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating the uncertainty in, the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate the team's probabilistic VGM approach with ESRI ArcMap. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's Geostatistical Analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach to implementing data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach applied to a range of subsurface geospatial studies (e.g., induced seismicity risk).
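A minimal illustration (not the authors' VGM-Hadoop code) of the core idea: aggregate sparse point samples into non-overlapping grid cells whose attributes carry both an estimate and simple uncertainty proxies such as sample density and variance. Function and variable names are ours.

```python
from collections import defaultdict
from statistics import mean, pvariance

def variable_grid(points, cell_size):
    """Aggregate (x, y, value) samples into square cells of a given size.

    Each cell records an estimate (mean of its samples) together with simple
    uncertainty proxies (sample count and variance), mirroring the idea of
    pairing spatial estimates with their underlying uncertainty.
    """
    cells = defaultdict(list)
    for x, y, v in points:
        key = (int(x // cell_size), int(y // cell_size))
        cells[key].append(v)
    summary = {}
    for key, values in cells.items():
        summary[key] = {
            "estimate": mean(values),
            "n_samples": len(values),          # density-based uncertainty proxy
            "variance": pvariance(values),     # spread-based uncertainty proxy
        }
    return summary

# A coarser resolution could be produced for low-density regions simply by
# calling variable_grid() again with a larger cell_size.
samples = [(0.2, 0.1, 10.0), (0.4, 0.3, 12.0), (5.1, 4.9, 3.0)]
print(variable_grid(samples, cell_size=1.0))
```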
ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures
NASA Astrophysics Data System (ADS)
Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.
2008-08-01
This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular the following archives: the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), including both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectral analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to line databases and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users with the ability to create fully configurable XMM-Newton data analysis workflows and to deploy and run them on the ESAC Grid. RISA makes full use of the interoperability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can be used directly by general VO tools.
Dong, J; Hayakawa, Y; Kober, C
2014-01-01
When metallic prosthetic appliances and dental fillings are present in the oral cavity, the appearance of metal-induced streak artefacts in CT images is unavoidable. The aim of this study was to develop a method for artefact reduction using statistical reconstruction on multidetector row CT images. Adjacent CT images often depict similar anatomical structures. Therefore, reconstruction of images with weak artefacts was attempted using projection data from an artefact-free image in a neighbouring thin slice. Images with moderate and strong artefacts were then processed in sequence by successive iterative restoration, where the projection data were generated from the adjacent reconstructed slice. First, the basic maximum likelihood-expectation maximization algorithm was applied. Next, the ordered subset-expectation maximization algorithm was examined. Alternatively, a small region-of-interest setting was designated. Finally, a general-purpose graphics processing unit was applied in both situations. The algorithms reduced the metal-induced streak artefacts on multidetector row CT images when the sequential processing method was applied. Ordered subset-expectation maximization and the small region of interest reduced the processing duration without apparent detriment. A general-purpose graphics processing unit realized high performance. A statistical reconstruction method was applied for streak artefact reduction. The alternative algorithms applied were effective. Both software and hardware tools, such as ordered subset-expectation maximization, a small region of interest and a general-purpose graphics processing unit, achieved fast artefact correction.
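For reference, the iterative update at the core of the maximum likelihood-expectation maximization (ML-EM) reconstruction mentioned above has the standard textbook form (not quoted from this paper), with \(y_i\) the measured projection data, \(a_{ij}\) the system matrix and \(\lambda_j\) the current image estimate:

\[
  \lambda_j^{(k+1)} \;=\; \frac{\lambda_j^{(k)}}{\sum_i a_{ij}}
  \sum_i a_{ij}\, \frac{y_i}{\sum_{j'} a_{ij'}\, \lambda_{j'}^{(k)}} .
\]

The ordered subset (OS-EM) variant applies this update over disjoint subsets of projections in turn, which is what shortens the processing time reported above.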
C3, A Command-line Catalog Cross-match Tool for Large Astrophysical Catalogs
NASA Astrophysics Data System (ADS)
Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio
2017-02-01
Modern astrophysics is based on multi-wavelength data organized into large and heterogeneous catalogs. Hence, the need for efficient, reliable and scalable catalog cross-matching methods plays a crucial role in the petabyte-scale era. Furthermore, multi-band data often have very different angular resolutions, requiring the highest generality of cross-matching features, mainly in terms of region shape and resolution. In this work we present C3 (Command-line Catalog Cross-match), a multi-platform application designed to efficiently cross-match massive catalogs. It is based on a multi-core parallel processing paradigm and conceived to be executed as a stand-alone command-line process or integrated within any generic data reduction/analysis pipeline, providing maximum flexibility to the end user in terms of portability, parameter configuration, catalog formats, angular resolution, region shapes, coordinate units and cross-matching types. Using real data extracted from public surveys, we discuss the cross-matching capabilities and computing-time efficiency, also through a direct comparison with some publicly available tools, chosen among the most used within the community and representative of different interface paradigms. We verified that the C3 tool has excellent capabilities for performing efficient and reliable cross-matching between large data sets. Although the elliptical cross-match and the parametric handling of angular orientation and offset are known concepts in the astrophysical context, their availability in the presented command-line tool makes C3 competitive in the context of public astronomical tools.
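This is not C3 itself, which is a stand-alone multi-core application; as a point of reference, the sketch below shows a baseline nearest-neighbour sky cross-match of the kind such tools are compared against, using astropy. Catalog values and the matching radius are illustrative.

```python
from astropy import units as u
from astropy.coordinates import SkyCoord

# Two toy catalogs (RA, Dec in degrees); in practice these would be read
# from FITS or CSV catalog files.
cat1 = SkyCoord(ra=[10.01, 10.50, 11.20] * u.deg, dec=[-5.00, -5.10, -4.90] * u.deg)
cat2 = SkyCoord(ra=[10.012, 11.198, 50.0] * u.deg, dec=[-5.001, -4.901, 20.0] * u.deg)

# Nearest-neighbour match of every cat1 source against cat2.
idx, sep2d, _ = cat1.match_to_catalog_sky(cat2)

# Keep only pairs closer than a chosen matching radius (e.g. 2 arcsec).
radius = 2.0 * u.arcsec
for i, (j, sep) in enumerate(zip(idx, sep2d.to(u.arcsec))):
    if sep < radius:
        print(f"cat1[{i}] <-> cat2[{j}]  separation = {sep:.3f}")
```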
NASA Astrophysics Data System (ADS)
Graham, Thomas; Wheeler, Raymond
2016-06-01
The objective of this study was to evaluate root restriction as a tool to increase volume utilization efficiency in spaceflight crop production systems. Bell pepper plants (Capsicum annuum cv. California Wonder) were grown under restricted rooting volume conditions in controlled environment chambers. The rooting volume was restricted to 500 ml and 60 ml in a preliminary trial, and 1500 ml (large), 500 ml (medium), and 250 ml (small) for a full fruiting trial. To reduce the possible confounding effects of water and nutrient restrictions, care was taken to ensure an even and consistent soil moisture throughout the study, with plants being watered/fertilized several times daily with a low concentration soluble fertilizer solution. Root restriction resulted in a general reduction in biomass production, height, leaf area, and transpiration rate; however, the fruit production was not significantly reduced in the root restricted plants under the employed environmental and horticultural conditions. There was a 21% reduction in total height and a 23% reduction in overall crown diameter between the large and small pot size in the fruiting study. Data from the fruiting trial were used to estimate potential volume utilization efficiency improvements for edible biomass in a fixed production volume. For fixed lighting and rooting hardware situations, the majority of improvement from root restriction was in the reduction of canopy area per plant, while height reductions could also improve volume utilization efficiency in high stacked or vertical agricultural systems.
Park, Youn Shik; Engel, Bernie A; Kim, Jonggun; Theller, Larry; Chaubey, Indrajeet; Merwade, Venkatesh; Lim, Kyoung Jae
2015-03-01
A Total Maximum Daily Load (TMDL) is a water quality standard used to regulate the water quality of streams, rivers and lakes. A wide range of approaches is currently used to develop TMDLs for impaired streams and rivers. Flow and load duration curves (FDC and LDC) have been used in many states, along with other models and approaches, to evaluate the relationship between flow and pollutant loading. A web-based LDC tool was developed to facilitate development of FDCs and LDCs as well as to support other hydrologic analyses. In this study, the FDC and LDC tool was enhanced to allow collection of water quality data via the web and to assist in establishing cost-effective Best Management Practice (BMP) implementations. The enhanced web-based tool provides access to water quality data not only from the US Geological Survey but also from the Water Quality Portal for the U.S. Moreover, the web-based tool identifies the pollutant reductions required to meet standard loads and suggests a BMP scenario based on the ability of BMPs to reduce pollutant loads and on BMP establishment and maintenance costs. In the study, flow and water quality data were collected via web access to develop the LDC and to identify the required reduction. The suggested BMP scenario from the web-based tool was evaluated using the EPA Spreadsheet Tool for the Estimation of Pollutant Load model to attain the required pollutant reduction at the least cost. Copyright © 2014 Elsevier Ltd. All rights reserved.
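A minimal sketch (ours, not the web tool's code) of how a flow duration curve and the corresponding load duration curve are typically constructed: rank daily flows, compute exceedance probabilities, and multiply the flow by a water-quality standard concentration (with a unit conversion) to obtain the allowable-load curve. Function names, the example flows and the standard concentration are illustrative assumptions.

```python
import numpy as np

def flow_duration_curve(daily_flows_cfs):
    """Return (exceedance_percent, sorted_flows) for a record of daily flows."""
    flows = np.sort(np.asarray(daily_flows_cfs, dtype=float))[::-1]  # descending
    n = flows.size
    ranks = np.arange(1, n + 1)
    exceedance = 100.0 * ranks / (n + 1)     # Weibull plotting position
    return exceedance, flows

def load_duration_curve(sorted_flows_cfs, standard_mg_per_l):
    """Allowable daily load (kg/day) at the standard concentration.

    1 cfs = 28.3168 L/s, so cfs -> L/day is about 2.4466e6; the 1e-6 factor
    converts mg -> kg.
    """
    litres_per_day = sorted_flows_cfs * 28.3168 * 86400.0
    return litres_per_day * standard_mg_per_l * 1e-6

flows = [120.0, 95.0, 300.0, 40.0, 60.0, 210.0, 15.0]
exceed, fdc = flow_duration_curve(flows)
ldc = load_duration_curve(fdc, standard_mg_per_l=0.3)   # e.g. a nutrient standard
for p, q, load in zip(exceed, fdc, ldc):
    print(f"{p:5.1f}%  flow={q:6.1f} cfs  allowable load={load:8.1f} kg/day")
```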
A reduction package for cross-dispersed echelle spectrograph data in IDL
NASA Astrophysics Data System (ADS)
Hall, Jeffrey C.; Neff, James E.
1992-12-01
We have written in IDL a data reduction package that performs reduction and extraction of cross-dispersed echelle spectrograph data. The present package includes a complete set of tools for extracting data from any number of spectral orders with arbitrary tilt and curvature. Essential elements include debiasing and flatfielding of the raw CCD image, removal of scattered light background, either nonoptimal or optimal extraction of data, and wavelength calibration and continuum normalization of the extracted orders. A growing set of support routines permits examination of the frame being processed to provide continuing checks on the statistical properties of the data and on the accuracy of the extraction. We will display some sample reductions and discuss the algorithms used. The inherent simplicity and user-friendliness of the IDL interface make this package a useful tool for spectroscopists. We will provide an email distribution list for those interested in receiving the package, and further documentation will be distributed at the meeting.
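For context, the "optimal extraction" step named above conventionally refers to a variance-weighted profile sum of the Horne type (a standard relation; the notation is ours, not from this abstract):

\[
  \hat{f}_{\lambda} \;=\;
  \frac{\sum_x P_{x\lambda}\,\bigl(D_{x\lambda}-S_{x\lambda}\bigr)/V_{x\lambda}}
       {\sum_x P_{x\lambda}^{2}/V_{x\lambda}} ,
\]

where \(D\) is the debiased, flat-fielded pixel value, \(S\) the scattered-light/sky background, \(P\) the normalized spatial profile and \(V\) the pixel variance; a "nonoptimal" extraction simply sums \(D-S\) over the aperture.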
Granularity refined by knowledge: contingency tables and rough sets as tools of discovery
NASA Astrophysics Data System (ADS)
Zytkow, Jan M.
2000-04-01
Contingency tables represent data in a granular way and are a well-established tool for inductive generalization of knowledge from data. We show that the basic concepts of rough sets, such as concept approximation, indiscernibility, and reducts, can be expressed in the language of contingency tables. We further demonstrate the relevance to rough sets theory of additional probabilistic information available in contingency tables, in particular of statistical tests of significance and predictive strength applied to contingency tables. Tests of both types can help the evaluation mechanisms used in inductive generalization based on rough sets. The granularity of attributes can be improved in feedback with knowledge discovered in the data. We demonstrate how 49er's facilities for (1) contingency table refinement, (2) column and row grouping based on correspondence analysis, and (3) the search for equivalence relations between attributes improve both the granularization of attributes and the quality of knowledge. Finally, we demonstrate the limitations of knowledge viewed as concept approximation, which is the focus of rough sets. Transcending that focus and reorienting towards predictive knowledge and towards the related distinction between possible and impossible (or statistically improbable) situations will be very useful in expanding the rough sets approach to more expressive forms of knowledge.
Development of an expert data reduction assistant
NASA Technical Reports Server (NTRS)
Miller, Glenn E.; Johnston, Mark D.; Hanisch, Robert J.
1992-01-01
We propose the development of an expert system tool for the management and reduction of complex data sets. The proposed work is an extension of a successful prototype system for the calibration of CCD images developed by Dr. Johnston in 1987. The reduction of complex multi-parameter data sets presents severe challenges to a scientist. Not only must a particular data analysis system be mastered (e.g., IRAF/SDAS/MIDAS), but large amounts of data can require many days of tedious work and supervision by the scientist for even the most straightforward reductions. The proposed Expert Data Reduction Assistant will help the scientist overcome these obstacles by developing a reduction plan based on the data at hand and producing a script for the reduction of the data in a target common language.
O'Connor, Elodie; Farrow, Maree; Hatherly, Chris
2014-01-01
Background: Encouraging middle-aged adults to maintain their physical and cognitive health may have a significant impact on reducing the prevalence of dementia in the future. Mobile phone apps and interactive websites may be one effective way to target this age group. However, to date there has been little research investigating the user experience of dementia risk reduction tools delivered in this way. Objective: The aim of this study was to explore participant engagement and evaluations of three different targeted smartphone and Web-based dementia risk reduction tools following a four-week intervention. Methods: Participants completed a Web-based screening questionnaire to collect eligibility information. Eligible participants were asked to complete a Web-based baseline questionnaire and were then randomly assigned to use one of the three dementia risk reduction tools for a period of four weeks: (1) a mobile phone application; (2) an information-based website; and (3) an interactive website. User evaluations were obtained via a Web-based follow-up questionnaire after completion of the intervention. Results: Of 415 eligible participants, 370 (89.16%) completed the baseline questionnaire and were assigned to an intervention group; 200 (54.05%) completed the post-intervention questionnaire. The average age of participants was 52 years, and 149 (75%) were female. Findings indicated that participants from all three intervention groups reported a generally positive impression of the tools across a range of domains. Participants using the information-based website reported higher ratings of their overall impression of the tool, F2,191=4.12, P=.02; how interesting the information was, F2,189=3.53, P=.03; how helpful the information was, F2,192=4.15, P=.02; and how much they learned, F2,188=3.86, P=.02. Group differences were significant between the mobile phone app and information-based website users, but not between the interactive website users and the other two groups. Additionally, participants using the information-based website reported significantly higher scores on their ratings of the ease of navigation, F2,190=4.20, P=.02, than those using the mobile phone app and the interactive website. There were no significant differences between groups on ratings of ease of understanding the information, F2,188=0.27, P=.76. Most participants from each of the three intervention groups indicated that they intended to keep using the dementia risk reduction eHealth tool. Conclusions: Overall, results indicated that while participants across all three intervention groups reported a generally positive experience with the targeted dementia risk reduction tools, participants using the information-based website provided a more favorable evaluation across a range of areas than participants using the mobile phone app. Further research is required to investigate whether targeted dementia risk reduction tools, in the form of interactive websites and mobile apps, can be improved to provide benefits above those gained by providing static information alone. PMID:26543904
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vixie, Kevin R.
This is the final report for the project "Geometric Analysis for Data Reduction and Structure Discovery", in which insights and tools from geometric analysis were developed and exploited for their potential to address large-scale data challenges.
NASA Astrophysics Data System (ADS)
Kuckein, C.; Denker, C.; Verma, M.; Balthasar, H.; González Manrique, S. J.; Louis, R. E.; Diercke, A.
2017-10-01
A huge amount of data has been acquired with the GREGOR Fabry-Pérot Interferometer (GFPI), large-format facility cameras, and, since 2016, with the High-resolution Fast Imager (HiFI). These data are processed in standardized procedures with the aim of providing science-ready data for the solar physics community. For this purpose, we have developed a user-friendly data reduction pipeline called "sTools", based on the Interactive Data Language (IDL) and licensed under a Creative Commons license. The pipeline delivers reduced and image-reconstructed data with a minimum of user interaction. Furthermore, quick-look data are generated as well as a webpage with an overview of the observations and their statistics. All the processed data are stored online at the GREGOR GFPI and HiFI data archive of the Leibniz Institute for Astrophysics Potsdam (AIP). The principles of the pipeline are presented together with selected high-resolution spectral scans and images processed with sTools.
Consequent use of IT tools as a driver for cost reduction and quality improvements
NASA Astrophysics Data System (ADS)
Hein, Stefan; Rapp, Roberto; Feustel, Andreas
2013-10-01
The semiconductor industry puts a lot of effort into cost reduction and quality improvement. The consequent use of IT tools is one way to support these goals. With the extension of its 150 mm fab to 200 mm, Robert Bosch increased the systematic use of data analysis and Advanced Process Control (APC).
Supervised learning of tools for content-based search of image databases
NASA Astrophysics Data System (ADS)
Delanoy, Richard L.
1996-03-01
A computer environment, called the Toolkit for Image Mining (TIM), is being developed with the goal of enabling users with diverse interests and varied computer skills to create search tools for content-based image retrieval and other pattern matching tasks. Search tools are generated using a simple paradigm of supervised learning that is based on the user pointing at mistakes of classification made by the current search tool. As mistakes are identified, a learning algorithm uses the identified mistakes to build up a model of the user's intentions, construct a new search tool, apply the search tool to a test image, display the match results as feedback to the user, and accept new inputs from the user. Search tools are constructed in the form of functional templates, which are generalized matched filters capable of knowledge-based image processing. The ability of this system to learn the user's intentions from experience contrasts with other existing approaches to content-based image retrieval that base searches on the characteristics of a single input example or on a predefined and semantically constrained textual query. Currently, TIM is capable of learning spectral and textural patterns, but should be adaptable to the learning of shapes, as well. Possible applications of TIM include not only content-based image retrieval, but also quantitative image analysis, the generation of metadata for annotating images, data prioritization or data reduction in bandwidth-limited situations, and the construction of components for larger, more complex computer vision algorithms.
Whitmore, Leanne S.; Davis, Ryan W.; McCormick, Robert L.; ...
2016-09-15
Screening a large number of biologically derived molecules for potential fuel compounds without recourse to experimental testing is important in identifying understudied yet valuable molecules. Experimental testing, although a valuable standard for measuring fuel properties, has several major limitations, including the requirement of testably high quantities, considerable expense, and a large amount of time. This paper discusses the development of a general-purpose fuel property tool, using machine learning, whose outcome is to screen molecules for desirable fuel properties. BioCompoundML adopts a general methodology, requiring as input only a list of training compounds (with identifiers and measured values) and a list of testing compounds (with identifiers). For the training data, BioCompoundML collects open data from the National Center for Biotechnology Information, incorporates user-provided features, imputes missing values, performs feature reduction, builds a classifier, and clusters compounds. BioCompoundML then collects data for the testing compounds, predicts class membership, and determines whether compounds are found in the range of variability of the training data set. We demonstrate this tool using three different fuel properties: research octane number (RON), threshold soot index (TSI), and melting point (MP). Here we provide measures of its success with these properties using randomized train/test measurements: average accuracy is 88% in RON, 85% in TSI, and 94% in MP; average precision is 88% in RON, 88% in TSI, and 95% in MP; and average recall is 88% in RON, 82% in TSI, and 97% in MP. The receiver operator characteristics (area under the curve) were estimated at 0.88 in RON, 0.86 in TSI, and 0.87 in MP. We also measured the success of BioCompoundML by sending 16 compounds for direct RON determination. Finally, we provide a screen of 1977 hydrocarbons/oxygenates within the 8696 compounds in MetaCyc, identifying compounds with high predictive strength for high or low RON.
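A schematic sketch (ours, not BioCompoundML's actual code or API) of the kind of workflow the abstract describes: impute missing descriptor values, reduce features, train a classifier on compounds with measured property classes, and report accuracy, precision, recall and ROC AUC on held-out compounds. The toy data and thresholds are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import VarianceThreshold
from sklearn.impute import SimpleImputer
from sklearn.metrics import accuracy_score, precision_score, recall_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Toy data: rows = compounds, columns = molecular descriptors, with a binary
# class such as "RON above / below a chosen cut-off".
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
X[rng.random(X.shape) < 0.05] = np.nan           # simulate missing descriptor values

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = make_pipeline(
    SimpleImputer(strategy="median"),            # impute missing values
    VarianceThreshold(threshold=0.0),            # drop uninformative features
    RandomForestClassifier(n_estimators=200, random_state=0),
)
model.fit(X_train, y_train)

pred = model.predict(X_test)
proba = model.predict_proba(X_test)[:, 1]
print("accuracy :", accuracy_score(y_test, pred))
print("precision:", precision_score(y_test, pred))
print("recall   :", recall_score(y_test, pred))
print("ROC AUC  :", roc_auc_score(y_test, proba))
```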
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drotar, Alexander P.; Quinn, Erin E.; Sutherland, Landon D.
2012-07-30
The project description is: (1) build a high-performance computer; and (2) create a tool to monitor node applications in the Component Based Tool Framework (CBTF) using code from the Lightweight Data Metric Service (LDMS). The importance of this project is that: (1) there is a need for a scalable, parallel tool to monitor nodes on clusters; and (2) new LDMS plugins need to be easily added to the tool. CBTF stands for Component Based Tool Framework. It is scalable and adjusts to different topologies automatically. It uses the MRNet (Multicast/Reduction Network) mechanism for information transport. CBTF is flexible and general enough to be used for any tool that needs to perform a task on many nodes. Its components are reusable and easily added to a new tool. There are three levels of CBTF: (1) the frontend node, which interacts with users; (2) filter nodes, which filter or concatenate information from backend nodes; and (3) backend nodes, where the actual work of the tool is done. LDMS stands for Lightweight Data Metric Service; it is a tool used for monitoring nodes. Ltool is the name of the tool we derived from LDMS. It is dynamically linked and includes the following components: Vmstat, Meminfo, Procinterrupts and more. It works as follows: the Ltool command is run on the frontend node; Ltool collects information from the backend nodes; the backend nodes send information to the filter nodes; and the filter nodes concatenate the information and send it to a database on the frontend node. Ltool is useful for monitoring nodes on a cluster because the overhead involved in running it is not particularly high and it automatically scales to any cluster size.
The impact of smart metal artefact reduction algorithm for use in radiotherapy treatment planning.
Guilfoile, Connor; Rampant, Peter; House, Michael
2017-06-01
The presence of metal artefacts in computed tomography (CT) creates issues in radiation oncology. The loss of anatomical information and incorrect Hounsfield unit (HU) values produce inaccuracies in dose calculations, leading to suboptimal patient treatment. Metal artefact reduction (MAR) algorithms were developed to combat these problems. This study provides a qualitative and quantitative analysis of the "Smart MAR" software (General Electric Healthcare, Chicago, IL, USA), determining its usefulness in a clinical setting. A detailed analysis was conducted using both patient and phantom data, noting any improvements in HU values and dosimetry with the GE-MAR enabled. This study indicates qualitative improvements in the severity of the streak artefacts produced by metals, allowing for easier patient contouring. Furthermore, the GE-MAR managed to recover previously lost anatomical information. Additionally, phantom data showed an improvement in HU values with GE-MAR correction, producing more accurate point dose calculations in the treatment planning system. Overall, the GE-MAR is a useful tool and is suitable for clinical environments.
Spectacle and SpecViz: New Spectral Analysis and Visualization Tools
NASA Astrophysics Data System (ADS)
Earl, Nicholas; Peeples, Molly; JDADF Developers
2018-01-01
A new era of spectroscopic exploration of our universe is being ushered in with advances in instrumentation and next-generation space telescopes. The advent of new spectroscopic instruments has highlighted a pressing need for tools scientists can use to analyze and explore these new data. We have developed Spectacle, a software package for analyzing both synthetic spectra from hydrodynamic simulations and real COS data, with the aim of characterizing the behavior of the circumgalactic medium. It allows easy reduction of spectral data and provides analytic line generation capabilities. Currently, the package is focused on automatic determination of absorption regions and line identification with custom line list support, simultaneous line fitting using Voigt profiles via least-squares or MCMC methods, and multi-component modeling of blended features. Non-parametric measurements, such as equivalent widths, delta v90, and full-width at half-maximum, are available. Spectacle also provides the ability to compose compound models used to generate synthetic spectra, allowing the user to define various LSF kernels and uncertainties and to specify sampling. We also present updates to the visualization tool SpecViz, developed in conjunction with the JWST data analysis tools development team, to aid in the exploration of spectral data. SpecViz is an open source, Python-based spectral 1-D interactive visualization and analysis application built around high-performance interactive plotting. It supports handling general and instrument-specific data and includes advanced tool-sets for filtering and detrending one-dimensional data, along with the ability to isolate absorption regions using slicing and manipulate spectral features via spectral arithmetic. Multi-component modeling is also possible using a flexible model fitting tool-set that supports custom models to be used with various fitting routines. It also features robust user extensions such as custom data loaders and support for user-created plugins that add new functionality. This work was supported in part by HST AR #13919, HST GO #14268, and HST AR #14560.
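The sketch below is not Spectacle's API; it is a generic illustration, using astropy.modeling, of the kind of Voigt-profile line fitting and equivalent-width measurement mentioned above. The wavelengths, noise level and starting parameters are arbitrary.

```python
import numpy as np
from astropy.modeling import models, fitting

# Synthetic normalized absorption spectrum containing a single line.
rng = np.random.default_rng(42)
wave = np.linspace(1215.0, 1219.0, 400)                       # Angstrom
true_line = models.Voigt1D(x_0=1216.7, amplitude_L=0.6, fwhm_L=0.15, fwhm_G=0.25)
flux = 1.0 - true_line(wave) + rng.normal(0.0, 0.01, wave.size)

# Model: flat continuum minus a Voigt absorption component.
absorber = models.Const1D(amplitude=1.0) - models.Voigt1D(
    x_0=1216.5, amplitude_L=0.5, fwhm_L=0.1, fwhm_G=0.2)
fit = fitting.LevMarLSQFitter()(absorber, wave, flux)

# A simple non-parametric measurement (equivalent width) from the fitted model.
model_flux = fit(wave)
ew = float(np.sum((1.0 - model_flux[:-1]) * np.diff(wave)))
print(fit)
print(f"equivalent width ~ {ew:.3f} Angstrom")
```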
NASA Astrophysics Data System (ADS)
Crawford, S. M.; Crause, Lisa; Depagne, Éric; Ilkiewicz, Krystian; Schroeder, Anja; Kuhn, Rudolph; Hettlage, Christian; Romero Colmenaro, Encarni; Kniazev, Alexei; Väisänen, Petri
2016-08-01
The High Resolution Spectrograph (HRS) on the Southern African Large Telescope (SALT) is a dual beam, fiber-fed echelle spectrograph providing high resolution capabilities to the SALT observing community. We describe the available data reduction tools and the procedures put in place for regular monitoring of the data quality from the spectrograph. Data reductions are carried out through the pyhrs package. The data characteristics and instrument stability are reported as part of the SALT Dashboard to help monitor the performance of the instrument.
Reduction of capsule endoscopy reading times by unsupervised image mining.
Iakovidis, D K; Tsevas, S; Polydorou, A
2010-09-01
The screening of the small intestine has become painless and easy with wireless capsule endoscopy (WCE), a revolutionary, relatively non-invasive imaging technique performed by a wireless, swallowable endoscopic capsule transmitting thousands of video frames per examination. The average time required for the visual inspection of a full 8-h WCE video ranges from 45 to 120 min, depending on the experience of the examiner. In this paper, we propose a novel approach to WCE reading time reduction by unsupervised mining of video frames. The proposed methodology is based on a data reduction algorithm which is applied according to a novel scheme for the extraction of representative video frames from a full-length WCE video. It can be used either as a video summarization or as a video bookmarking tool, providing the comparative advantage of being general and unbounded by the finiteness of a training set. The number of frames extracted is controlled by a parameter that can be tuned automatically. Comprehensive experiments on real WCE videos indicate that a significant reduction in the reading times is feasible. In the case of the WCE videos used, this reduction reached 85% without any loss of abnormalities.
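A generic sketch of unsupervised selection of representative frames (ours; it does not reproduce the paper's specific data reduction algorithm): cluster simple per-frame feature vectors and keep the frame nearest each cluster centre, so that the retained fraction is controlled by the number of clusters.

```python
import numpy as np
from sklearn.cluster import KMeans

def representative_frames(features, n_keep):
    """Pick n_keep representative frames from an array of per-frame features.

    features: array of shape (n_frames, n_features), e.g. colour histograms.
    Returns the indices of the frames closest to each cluster centre.
    """
    km = KMeans(n_clusters=n_keep, n_init=10, random_state=0).fit(features)
    keep = []
    for c in range(n_keep):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(features[members] - km.cluster_centers_[c], axis=1)
        keep.append(int(members[np.argmin(dists)]))
    return sorted(keep)

# Toy example: 1000 "frames" with 32-bin feature vectors; keeping 150 frames
# corresponds to an 85% reduction in frames to be inspected.
rng = np.random.default_rng(1)
feats = rng.random((1000, 32))
print(representative_frames(feats, n_keep=150))
```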
NASA Astrophysics Data System (ADS)
Jenness, Tim; Economou, Frossie; Cavanagh, Brad
ORAC-DR is a general purpose automatic data reduction pipeline environment. This document describes how to modify data reduction recipes and how to add new instruments. For a general overview of ORAC-DR see SUN/230. For specific information on how to reduce the data for a particular instrument, please consult the appropriate ORAC-DR instrument guide.
Consequences of data reduction in the FIA database: a case study with southern yellow pine
Anita K. Rose; James F. Rosson Jr.; Helen Beresford
2015-01-01
The Forest Inventory and Analysis Program strives to make its data publicly available in a format that is easy to use and understand, most commonly accessed through online tools such as EVALIDator and Forest Inventory Data Online. This requires a certain amount of data reduction. Using a common data request concerning the resource of southern yellow pine (SYP), we...
Development of an expert data reduction assistant
NASA Technical Reports Server (NTRS)
Miller, Glenn E.; Johnston, Mark D.; Hanisch, Robert J.
1993-01-01
We propose the development of an expert system tool for the management and reduction of complex datasets. The proposed work is an extension of a successful prototype system for the calibration of CCD (charge-coupled device) images developed by Dr. Johnston in 1987 (ref.: Proceedings of the Goddard Conference on Space Applications of Artificial Intelligence). The reduction of complex multi-parameter data sets presents severe challenges to a scientist. Not only must a particular data analysis system be mastered (e.g., IRAF/SDAS/MIDAS), but large amounts of data can require many days of tedious work and supervision by the scientist for even the most straightforward reductions. The proposed Expert Data Reduction Assistant will help the scientist overcome these obstacles by developing a reduction plan based on the data at hand and producing a script for the reduction of the data in a target common language.
Validation of a general practice audit and data extraction tool.
Peiris, David; Agaliotis, Maria; Patel, Bindu; Patel, Anushka
2013-11-01
We assessed how accurately a common general practitioner (GP) audit tool extracts data from two software systems. First, pathology test codes were audited at 33 practices covering nine companies. Second, a manual audit of chronic disease data from 200 random patient records at two practices was compared with audit tool data. Pathology review: all companies assigned correct codes for cholesterol, creatinine and glycated haemoglobin; four companies assigned incorrect codes for albuminuria tests, precluding accurate detection with the audit tool. Case record review: there was strong agreement between the manual audit and the tool for all variables except chronic kidney disease diagnoses, which was due to a tool-related programming error. The audit tool accurately detected most chronic disease data in two GP record systems. The one exception, however, highlights the importance of surveillance systems to promptly identify errors. This will maximise potential for audit tools to improve healthcare quality.
Python and HPC for High Energy Physics Data Analyses
Sehrish, S.; Kowalkowski, J.; Paterno, M.; ...
2017-01-01
High level abstractions in Python that can utilize computing hardware well seem to be an attractive option for writing data reduction and analysis tasks. In this paper, we explore the features available in Python which are useful and efficient for end user analysis in High Energy Physics (HEP). A typical vertical slice of an HEP data analysis is somewhat fragmented: the state of the reduction/analysis process must be saved at certain stages to allow for selective reprocessing of only parts of a generally time-consuming workflow. Also, algorithms tend to be modular because of the heterogeneous nature of most detectors and the need to analyze different parts of the detector separately before combining the information. This fragmentation causes difficulties for interactive data analysis, and as data sets increase in size and complexity (O(10) TiB for a "small" neutrino experiment to the O(10) PiB currently held by the CMS experiment at the LHC), data analysis methods traditional to the field must evolve to make optimum use of emerging HPC technologies and platforms. Mainstream big data tools, while suggesting a direction in terms of what can be done if an entire data set can be available across a system and analysed with high-level programming abstractions, are not designed with either scientific computing generally, or modern HPC platform features in particular, such as data caching levels, in mind. Our example HPC use case is a search for a new elementary particle which might explain the phenomenon known as "Dark Matter". Here, using data from the CMS detector, we will use HDF5 as our input data format, and MPI with Python to implement our use case.
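A minimal sketch (not the authors' analysis code) of the HDF5-plus-MPI pattern the abstract describes: each MPI rank reads its own slice of a dataset with h5py, applies a local selection, and the partial results are combined with a reduction. The file name, dataset path and cut value are placeholders.

```python
# Run with, e.g.:  mpiexec -n 4 python reduce_events.py
import h5py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Placeholder file/dataset; in practice this would be a skimmed event file.
with h5py.File("events.h5", "r") as f:
    dset = f["events/missing_et"]             # 1-D array of per-event values
    n = dset.shape[0]
    lo = rank * n // size                     # contiguous slice for this rank
    hi = (rank + 1) * n // size
    chunk = dset[lo:hi]

# Local selection (a stand-in for a real analysis cut), then a global reduce.
local_count = int(np.count_nonzero(chunk > 200.0))
total = comm.reduce(local_count, op=MPI.SUM, root=0)

if rank == 0:
    print(f"events passing the cut: {total}")
```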
AstroImageJ: Image Processing and Photometric Extraction for Ultra-precise Astronomical Light Curves
NASA Astrophysics Data System (ADS)
Collins, Karen A.; Kielkopf, John F.; Stassun, Keivan G.; Hessman, Frederic V.
2017-02-01
ImageJ is a graphical user interface (GUI) driven, public domain, Java-based, software package for general image processing traditionally used mainly in life sciences fields. The image processing capabilities of ImageJ are useful and extendable to other scientific fields. Here we present AstroImageJ (AIJ), which provides an astronomy specific image display environment and tools for astronomy specific image calibration and data reduction. Although AIJ maintains the general purpose image processing capabilities of ImageJ, AIJ is streamlined for time-series differential photometry, light curve detrending and fitting, and light curve plotting, especially for applications requiring ultra-precise light curves (e.g., exoplanet transits). AIJ reads and writes standard Flexible Image Transport System (FITS) files, as well as other common image formats, provides FITS header viewing and editing, and is World Coordinate System aware, including an automated interface to the astrometry.net web portal for plate solving images. AIJ provides research grade image calibration and analysis tools with a GUI driven approach, and easily installed cross-platform compatibility. It enables new users, even at the level of undergraduate student, high school student, or amateur astronomer, to quickly start processing, modeling, and plotting astronomical image data with one tightly integrated software package.
Data are from Mars, Tools are from Venus
NASA Technical Reports Server (NTRS)
Lee, H. Joe
2017-01-01
During the data production phase, data producers usually ensure that the products can be easily used by the specific power users the products serve. However, most data products are also posted for the general public to use, and it is not straightforward for data producers to anticipate which tools these general end users are likely to adopt. In this talk, we will try to help fill in the gap by going over various tools related to Earth Science, how they work with the existing NASA HDF (Hierarchical Data Format) data products, and the reasons why some products cannot be visualized or analyzed by existing tools. One goal is to give data producers insights into how to make their data products more interoperable. On the other hand, we also provide some hints for end users on how to make tools work with existing HDF data products. (Tool category list: check the comments.) HDF-EOS tools: HDFView with HDF-EOS plugin, HEG, h4tonccf, hdf-eos2 dumper, NCL, MATLAB, IDL, etc.; netCDF-Java tools: Panoply, IDV, toolsUI, NcML, etc.; netCDF-C tools: ArcGIS Desktop, GrADS, NCL, NCO, etc.; GDAL tools: ArcGIS Desktop, QGIS, Google Earth, etc.; CSV tools: ArcGIS Online, MS Excel, Tableau, etc.
Lateral conduction effects on heat-transfer data obtained with the phase-change paint technique
NASA Technical Reports Server (NTRS)
Maise, G.; Rossi, M. J.
1974-01-01
A computerized tool, CAPE (Conduction Analysis Program using Eigenvalues), has been developed to account for lateral heat conduction in wind tunnel models in the data reduction of the phase-change paint technique. The tool also accounts for the effects of finite thickness (thin wings) and surface curvature. A special reduction procedure using just one time of melt is also possible on leading edges. A novel iterative numerical scheme was used, with discretized spatial coordinates but analytic integration in time, to solve the inverse conduction problem involved in the data reduction. A yes-no chart is provided which tells the test engineer when various corrections are large enough so that CAPE should be used. The accuracy of the phase-change paint technique in the presence of finite thickness and lateral conduction is also investigated.
The Research Tools of the Virtual Astronomical Observatory
NASA Astrophysics Data System (ADS)
Hanisch, Robert J.; Berriman, G. B.; Lazio, T. J.; Project, VAO
2013-01-01
Astronomy is being transformed by the vast quantities of data, models, and simulations that are becoming available to astronomers at an ever-accelerating rate. The U.S. Virtual Astronomical Observatory (VAO) has been funded to provide an operational facility that is intended to be a resource for discovery and access of data, and to provide science services that use these data. Over the course of the past year, the VAO has been developing and releasing for community use five science tools: 1) "Iris", for dynamically building and analyzing spectral energy distributions, 2) a web-based data discovery tool that allows astronomers to identify and retrieve catalog, image, and spectral data on sources of interest, 3) a scalable cross-comparison service that allows astronomers to conduct pair-wise positional matches between very large catalogs stored remotely as well as between remote and local catalogs, 4) time series tools that allow astronomers to compute periodograms of the public data held at the NASA Star and Exoplanet Database (NStED) and the Harvard Time Series Center, and 5) a VO-aware release of the Image Reduction and Analysis Facility (IRAF) that provides transparent access to VO-available data collections and is SAMP-enabled, so that IRAF users can easily use tools such as Aladin and Topcat in conjunction with IRAF tasks. Additional VAO services will be built to make it easy for researchers to provide access to their data in VO-compliant ways, to build VO-enabled custom applications in Python, and to respond generally to the growing size and complexity of astronomy data. Acknowledgements: The Virtual Astronomical Observatory (VAO) is managed by the VAO, LLC, a non-profit company established as a partnership of the Associated Universities, Inc. and the Association of Universities for Research in Astronomy, Inc. The VAO is sponsored by the National Science Foundation and the National Aeronautics and Space Administration.
MYRaf: A new Approach with IRAF for Astronomical Photometric Reduction
NASA Astrophysics Data System (ADS)
Kilic, Y.; Shameoni Niaei, M.; Özeren, F. F.; Yesilyaprak, C.
2016-12-01
In this study, the design and development of the MYRaf software for astronomical photometric reduction are presented. MYRaf is an easy-to-use, reliable, and fast IRAF aperture photometry GUI tool. It is an important step towards automated software for robotic telescopes, and it uses IRAF, PyRAF, matplotlib, ginga, alipy, and SExtractor together with the general-purpose, high-level programming language Python and the Qt framework.
Active Subspace Methods for Data-Intensive Inverse Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qiqi
2017-04-27
The project has developed theory and computational tools to exploit active subspaces to reduce the dimension in statistical calibration problems. This dimension reduction enables MCMC methods to calibrate otherwise intractable models. The same theoretical and computational tools can also reduce the measurement dimension for calibration problems that use large stores of data.
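A minimal sketch of the core active-subspace computation, assuming gradients of the model output are available at sampled parameter values; this is a generic illustration, not the project's code, and the test function is invented.

    import numpy as np

    def active_subspace(grads, k):
        """Leading k directions of the gradient outer-product matrix."""
        C = grads.T @ grads / grads.shape[0]      # Monte Carlo estimate of E[grad grad^T]
        eigvals, eigvecs = np.linalg.eigh(C)
        order = np.argsort(eigvals)[::-1]         # sort eigenvalues descending
        return eigvals[order], eigvecs[:, order[:k]]

    # Toy check: a function that only varies along one direction w.
    rng = np.random.default_rng(0)
    w = np.array([1.0, 0.5, 0.0, 0.0])
    x = rng.standard_normal((500, 4))
    grads = np.cos(x @ w)[:, None] * w            # gradient of sin(w . x)
    vals, W1 = active_subspace(grads, k=1)
    print(vals, W1.ravel())                       # one dominant eigenvalue, W1 parallel to w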
Development of an Uncertainty Model for the National Transonic Facility
NASA Technical Reports Server (NTRS)
Walter, Joel A.; Lawrence, William R.; Elder, David W.; Treece, Michael D.
2010-01-01
This paper introduces an uncertainty model being developed for the National Transonic Facility (NTF). The model uses a Monte Carlo technique to propagate standard uncertainties of measured values through the NTF data reduction equations to calculate the combined uncertainties of the key aerodynamic force and moment coefficients and freestream properties. The uncertainty propagation approach to assessing data variability is compared with ongoing data quality assessment activities at the NTF, notably check standard testing using statistical process control (SPC) techniques. It is shown that the two approaches are complementary and both are necessary tools for data quality assessment and improvement activities. The SPC approach is the final arbiter of variability in a facility. Its result encompasses variation due to people, processes, test equipment, and test article. The uncertainty propagation approach is limited mainly to the data reduction process. However, it is useful because it helps to assess the causes of variability seen in the data and consequently provides a basis for improvement. For example, it is shown that Mach number random uncertainty is dominated by static pressure variation over most of the dynamic pressure range tested. However, the random uncertainty in the drag coefficient is generally dominated by axial and normal force uncertainty with much less contribution from freestream conditions.
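The following sketch illustrates the Monte Carlo propagation idea with the isentropic Mach number relation standing in for a data reduction equation; the nominal pressures and standard uncertainties below are invented for the example and are not facility values.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000
    gamma = 1.4

    # Illustrative measured inputs: mean value and standard uncertainty.
    p0 = rng.normal(150.0, 0.05, N)   # total pressure, kPa
    p = rng.normal(100.0, 0.05, N)    # static pressure, kPa

    # Data reduction equation: isentropic Mach number from the pressure ratio.
    mach = np.sqrt(2.0 / (gamma - 1.0) * ((p0 / p) ** ((gamma - 1.0) / gamma) - 1.0))
    print("Mach = %.4f +/- %.4f" % (mach.mean(), mach.std(ddof=1)))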
duVerle, David A; Yotsukura, Sohiya; Nomura, Seitaro; Aburatani, Hiroyuki; Tsuda, Koji
2016-09-13
Single-cell RNA sequencing is fast becoming one of the standard methods for gene expression measurement, providing unique insights into cellular processes. A number of methods, based on general dimensionality reduction techniques, have been suggested to help infer and visualise the underlying structure of cell populations from single-cell expression levels, yet their models generally lack proper biological grounding and struggle to identify complex differentiation paths. Here we introduce cellTree: an R/Bioconductor package that uses a novel statistical approach, based on document analysis techniques, to produce tree structures outlining the hierarchical relationship between single-cell samples, while identifying latent groups of genes that can provide biological insights. With cellTree, we provide experimentalists with an easy-to-use tool, based on statistically and biologically sound algorithms, to efficiently explore and visualise single-cell RNA data. The cellTree package is publicly available in the online Bioconductor repository at: http://bioconductor.org/packages/cellTree/ .
Automated and Scalable Data Reduction in the SOFIA Data Processing System
NASA Astrophysics Data System (ADS)
Krzaczek, R.; Shuping, R.; Charcos-Llorens, M.; Alles, R.; Vacca, W.
2015-09-01
In order to provide suitable data products to general investigators and other end users in a timely manner, the Stratospheric Observatory for Infrared Astronomy (SOFIA) has developed a framework supporting the automated execution of data processing pipelines for the various instruments, called the Data Processing System (DPS) (see Shuping et al. (2014) for an overview). The primary requirement is to process all data collected from a flight within eight hours, allowing data quality assessments and inspections to be made the following day. The raw data collected during a flight requires processing by a number of different software packages and tools unique to each combination of instrument and mode of operation, much of it developed in-house, in order to create data products for use by investigators and other end-users. The requirement to deliver these data products in a consistent, predictable, and performant manner presents a significant challenge for the observatory. Herein we present aspects of the DPS that help to achieve these goals. We discuss how it supports data reduction software written in a variety of languages and environments, its support for new versions and live upgrades to that software and other necessary resources (e.g., calibrations), its accommodation of sudden processing loads through the addition (and eventual removal) of computing resources, and close with an observation of the performance achieved in the first two observing cycles of SOFIA.
Real-time data reduction capabilities at the Langley 7 by 10 foot high speed tunnel
NASA Technical Reports Server (NTRS)
Fox, C. H., Jr.
1980-01-01
The 7 by 10 foot high speed tunnel performs a wide range of tests employing a variety of model installation methods. To support the reduction of static data from this facility, a generalized wind tunnel data reduction program had been developed for use on the Langley central computer complex. The capabilities of a version of this generalized program adapted for real time use on a dedicated on-site computer are discussed. The input specifications, instructions for the console operator, and full descriptions of the algorithms are included.
Dawn: A Simulation Model for Evaluating Costs and Tradeoffs of Big Data Science Architectures
NASA Astrophysics Data System (ADS)
Cinquini, L.; Crichton, D. J.; Braverman, A. J.; Kyo, L.; Fuchs, T.; Turmon, M.
2014-12-01
In many scientific disciplines, scientists and data managers are bracing for an upcoming deluge of big data volumes, which will increase the size of current data archives by a factor of 10-100 times. For example, the next Climate Model Inter-comparison Project (CMIP6) will generate a global archive of model output of approximately 10-20 Peta-bytes, while the upcoming next generation of NASA decadal Earth Observing instruments are expected to collect tens of Giga-bytes/day. In radio-astronomy, the Square Kilometre Array (SKA) will collect data in the Exa-bytes/day range, of which (after reduction and processing) around 1.5 Exa-bytes/year will be stored. The effective and timely processing of these enormous data streams will require the design of new data reduction and processing algorithms, new system architectures, and new techniques for evaluating computation uncertainty. Yet at present no general software tool or framework exists that will allow system architects to model their expected data processing workflow, and determine the network, computational and storage resources needed to prepare their data for scientific analysis. In order to fill this gap, at NASA/JPL we have been developing a preliminary model named DAWN (Distributed Analytics, Workflows and Numerics) for simulating arbitrary complex workflows composed of any number of data processing and movement tasks. The model can be configured with a representation of the problem at hand (the data volumes, the processing algorithms, the available computing and network resources), and is able to evaluate tradeoffs between different possible workflows based on several estimators: overall elapsed time, separate computation and transfer times, resulting uncertainty, and others. So far, we have been applying DAWN to analyze architectural solutions for 4 different use cases from distinct science disciplines: climate science, astronomy, hydrology and a generic cloud computing use case. This talk will present preliminary results and discuss how DAWN can be evolved into a powerful tool for designing system architectures for data intensive science.
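A toy sketch, under strong simplifying assumptions (purely sequential stages, constant processing rates and bandwidths), of the kind of elapsed-time estimate such a simulator produces; the stage names, volumes and rates are illustrative and are not DAWN's actual model.

    # Each stage: (input volume in TB, processing rate in TB/hour, outbound bandwidth in TB/hour).
    def elapsed_hours(stages):
        total = 0.0
        for volume_tb, proc_rate, bandwidth in stages:
            total += volume_tb / proc_rate      # computation time
            total += volume_tb / bandwidth      # moving the data onward
        return total

    workflow = [
        (500.0, 50.0, 100.0),   # ingest and calibrate
        (100.0, 25.0, 200.0),   # reduce
        (10.0, 10.0, 400.0),    # produce analysis-ready products
    ]
    print("estimated elapsed time: %.1f hours" % elapsed_hours(workflow))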
MDAS: an integrated system for metabonomic data analysis.
Liu, Juan; Li, Bo; Xiong, Jiang-Hui
2009-03-01
Metabonomics, the latest 'omics' research field, shows great promise as a tool in biomarker discovery, drug efficacy and toxicity analysis, disease diagnosis and prognosis. One of the major challenges now facing researchers is how to process this data to yield useful information about a biological system, e.g., the mechanism of diseases. Traditional methods employed in metabonomic data analysis use multivariate analysis methods developed independently in chemometrics research. Additionally, with the development of machine learning approaches, some methods such as SVMs also show promise for use in metabonomic data analysis. Aside from the application of general multivariate analysis and machine learning methods to this problem, there is also a need for an integrated tool customized for metabonomic data analysis which can be easily used by biologists to reveal interesting patterns in metabonomic data. In this paper, we present a novel software tool MDAS (Metabonomic Data Analysis System) for metabonomic data analysis which integrates traditional chemometrics methods and newly introduced machine learning approaches. MDAS contains a suite of functional models for metabonomic data analysis and optimizes the flow of data analysis. Several file formats can be accepted as input. The input data can be optionally preprocessed and can then be processed with operations such as feature analysis and dimensionality reduction. The data with reduced dimensionalities can be used for training or testing through machine learning models. The system supplies proper visualization for data preprocessing, feature analysis, and classification which can be a powerful function for users to extract knowledge from the data. MDAS is an integrated platform for metabonomic data analysis, which transforms a complex analysis procedure into a more formalized and simplified one. The software package can be obtained from the authors.
SpecTracer: A Python-Based Interactive Solution for Echelle Spectra Reduction
NASA Astrophysics Data System (ADS)
Romero Matamala, Oscar Fernando; Petit, Véronique; Caballero-Nieves, Saida Maria
2018-01-01
SpecTracer is a newly developed interactive solution to reduce cross dispersed echelle spectra. The use of widgets saves the user the steep learning curves of currently available reduction software. SpecTracer uses well established image processing techniques based on IRAF to successfully extract the stellar spectra. Comparisons with other reduction software, like IRAF, show comparable results, with the added advantages of ease of use, platform independence and portability. This tool can obtain meaningful scientific data and also serve as a training tool, especially for undergraduates doing research, in the procedure for spectroscopic analysis.
NASA Astrophysics Data System (ADS)
Ruljigaljig, T.; Huang, M. L.
2015-12-01
This study develops a mobile application (App) interface that uses cloud technology, Web 2.0 and online community technologies to build the Environmental-Geological Disaster Network (EDN). The interactive App platform linking expert knowledge with the community is developed as a teaching tool based on the open data released by the Central Geological Survey. Through augmented reality, the App can show potential hazard locations through the camera lens, overlaid on the real-world environment. Interaction with experts in the community improves the general public's awareness of disasters and trains people to record precursors of geological disasters, thereby awakening their consciousness of, and attention to, natural hazards. General users obtain real-time information during travel, mountaineering and teaching, and use the App platform to upload and present the environmental-geological disaster data they collect themselves. It is expected that, with public participation, the open platform can accumulate environmental-geological disaster data effectively, quickly, extensively and correctly. The most important aim of this study is to root the concepts of disaster prevention, reduction and avoidance in the public through participation.
Extreme learning machine for reduced order modeling of turbulent geophysical flows.
San, Omer; Maulik, Romit
2018-04-01
We investigate the application of artificial neural networks to stabilize proper orthogonal decomposition-based reduced order models for quasistationary geophysical turbulent flows. An extreme learning machine concept is introduced for computing an eddy-viscosity closure dynamically to incorporate the effects of the truncated modes. We consider a four-gyre wind-driven ocean circulation problem as our prototype setting to assess the performance of the proposed data-driven approach. Our framework provides a significant reduction in computational time and effectively retains the dynamics of the full-order model during the forward simulation period beyond the training data set. Furthermore, we show that the method is robust for larger choices of time steps and can be used as an efficient and reliable tool for long time integration of general circulation models.
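A compact sketch of the extreme learning machine idea referenced above: hidden-layer weights are fixed at random and only the output weights are fitted by least squares. The closure target here is a made-up function of resolved mode amplitudes, not the paper's eddy-viscosity model.

    import numpy as np

    def elm_fit(X, y, n_hidden=50, seed=0):
        rng = np.random.default_rng(seed)
        W = rng.standard_normal((X.shape[1], n_hidden))   # random, never trained
        b = rng.standard_normal(n_hidden)
        H = np.tanh(X @ W + b)                            # hidden-layer response
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)      # only output weights fitted
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 4))                     # stand-in for resolved POD amplitudes
    y = 0.1 * np.abs(X).sum(axis=1)                       # invented closure target
    W, b, beta = elm_fit(X, y)
    print(np.abs(elm_predict(X, W, b, beta) - y).max())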
Scientific visualization of volumetric radar cross section data
NASA Astrophysics Data System (ADS)
Wojszynski, Thomas G.
1992-12-01
For aircraft design and mission planning, designers, threat analysts, mission planners, and pilots require a Radar Cross Section (RCS) central tendency with its associated distribution about a specified aspect and its relation to a known threat. Historically, RCS data sets have been statistically analyzed to evaluate a profile. However, Scientific Visualization, the application of computer graphics techniques to produce pictures of complex physical phenomena, appears to be a more promising tool to interpret these data. This work describes data reduction techniques and a surface rendering algorithm to construct and display a complex polyhedron from adjacent contours of RCS data. Data reduction is accomplished by sectorizing the data and characterizing its statistical properties. Color, lighting, and orientation cues are added to complete the visualization system. The tool may be useful for synthesis, design, and analysis of complex, low observable air vehicles.
ORAC-DR -- integral field spectroscopy data reduction
NASA Astrophysics Data System (ADS)
Todd, Stephen
ORAC-DR is a general-purpose automatic data-reduction pipeline environment. This document describes its use to reduce integral field unit (IFU) data collected at the United Kingdom Infrared Telescope (UKIRT) with the UIST instrument.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tenney, J.L.
SARS is a data acquisition system designed to gather and process radar data from aircraft flights. A database of flight trajectories has been developed for Albuquerque, NM, and Amarillo, TX. The data is used for safety analysis and risk assessment reports. To support this database effort, Sandia developed a collection of hardware and software tools to collect and post process the aircraft radar data. This document describes the data reduction tools which comprise the SARS, and maintenance procedures for the hardware and software system.
ORAC-DR: Overview and General Introduction
NASA Astrophysics Data System (ADS)
Economou, Frossie; Jenness, Tim; Currie, Malcolm J.; Adamson, Andy; Allan, Alasdair; Cavanagh, Brad
ORAC-DR is a general purpose automatic data reduction pipeline environment. It currently supports data reduction for the United Kingdom Infrared Telescope (UKIRT) instruments UFTI, IRCAM, UIST and CGS4, for the James Clerk Maxwell Telescope (JCMT) instrument SCUBA, for the William Herschel Telescope (WHT) instrument INGRID, for the European Southern Observatory (ESO) instrument ISAAC and for the Anglo-Australian Telescope (AAT) instrument IRIS-2. This document describes the general pipeline environment. For specific information on how to reduce the data for a particular instrument, please consult the appropriate ORAC-DR instrument guide.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
Clean Cities Alternative Fuels and Advanced Vehicles Data Center (AFDC) features a wide range of Web-based tools to help vehicle fleets and individual consumers reduce their petroleum use. This brochure lists and describes Clean Cities online tools related to vehicles, alternative fueling stations, electric vehicle charging stations, fuel conservation, emissions reduction, fuel economy, and more.
On the repeated measures designs and sample sizes for randomized controlled trials.
Tango, Toshiro
2016-04-01
For the analysis of longitudinal or repeated measures data, generalized linear mixed-effects models provide a flexible and powerful tool to deal with heterogeneity among subject response profiles. However, the typical statistical design adopted in usual randomized controlled trials is an analysis of covariance type analysis using a pre-defined pair of "pre-post" data, in which pre-(baseline) data are used as a covariate for adjustment together with other covariates. Then, the major design issue is to calculate the sample size or the number of subjects allocated to each treatment group. In this paper, we propose a new repeated measures design and sample size calculations combined with generalized linear mixed-effects models that depend not only on the number of subjects but on the number of repeated measures before and after randomization per subject used for the analysis. The main advantages of the proposed design combined with the generalized linear mixed-effects models are (1) it can easily handle missing data by applying the likelihood-based ignorable analyses under the missing at random assumption and (2) it may lead to a reduction in sample size, compared with the simple pre-post design. The proposed designs and the sample size calculations are illustrated with real data arising from randomized controlled trials.
Fornito, A; Yücel, M; Patti, J; Wood, S J; Pantelis, C
2009-03-01
Voxel-based morphometry (VBM) is a popular tool for mapping neuroanatomical changes in schizophrenia patients. Several recent meta-analyses have identified the brain regions in which patients most consistently show grey matter reductions, although they have not examined whether such changes reflect differences in grey matter concentration (GMC) or grey matter volume (GMV). These measures assess different aspects of grey matter integrity, and may therefore reflect different pathological processes. In this study, we used the Anatomical Likelihood Estimation procedure to analyse significant differences reported in 37 VBM studies of schizophrenia patients, incorporating data from 1646 patients and 1690 controls, and compared the findings of studies using either GMC or GMV to index grey matter differences. Analysis of all studies combined indicated that grey matter reductions in a network of frontal, temporal, thalamic and striatal regions are among the most frequently reported in literature. GMC reductions were generally larger and more consistent than GMV reductions, and were more frequent in the insula, medial prefrontal, medial temporal and striatal regions. GMV reductions were more frequent in dorso-medial frontal cortex, and lateral and orbital frontal areas. These findings support the primacy of frontal, limbic, and subcortical dysfunction in the pathophysiology of schizophrenia, and suggest that the grey matter changes observed with MRI may not necessarily result from a unitary pathological process.
Automation of Ocean Product Metrics
2008-09-30
Presented in: Ocean Sciences 2008 Conf., 5 Mar 2008. Shriver, J., J. D. Dykes, and J. Fabre: Automation of Operational Ocean Product Metrics. Presented in 2008 EGU General Assembly, 14 April 2008. ... processing (multiple data cuts per day) and multiple-nested models. Routines for generating automated evaluations of model forecast statistics will be ... developed and pre-existing tools will be collected to create a generalized tool set, which will include user-interface tools to the metrics data
Alternative Fuels Data Center: Biodiesel Vehicle Emissions
The VIDA Framework as an Education Tool: Leveraging Volcanology Data for Educational Purposes
NASA Astrophysics Data System (ADS)
Faied, D.; Sanchez, A. (on behalf of the SSP08 VAPOR Project Team)
2009-04-01
While numerous global initiatives exist to address the potential hazards posed by volcanic eruption events and assess impacts from a civil security viewpoint, there does not yet exist a single, unified, international system of early warning and hazard tracking for eruptions. Numerous gaps exist in the risk reduction cycle, from data collection, to data processing, and finally dissemination of salient information to relevant parties. As part of the 2008 International Space University's Space Studies Program, a detailed gap analysis of the state of volcano disaster risk reduction was undertaken, and this paper presents the principal results. This gap analysis considered current sensor technologies, data processing algorithms, and utilization of data products by various international organizations. Recommendations for strategies to minimize or eliminate certain gaps are also provided. In the effort to address the gaps, a framework evolved at system level. This framework, known as VIDA, is a tool to develop user requirements for civil security in hazardous contexts, and a candidate system concept for a detailed design phase. While the basic intention of VIDA is to support disaster risk reduction efforts, there are several methods of leveraging raw science data to support education across a wide demographic. Basic geophysical data could be used to educate school children about the characteristics of volcanoes, satellite mappings could support informed growth and development of societies in at-risk areas, and raw sensor data could contribute to a wide range of university-level research projects. Satellite maps, basic geophysical data, and raw sensor data are combined and accessible in a way that allows the relationships between these data types to be explored and used in a training environment. Such a resource naturally lends itself to research efforts in the subject but also research in operational tools, system architecture, and human/machine interaction in civil protection or emergency scenarios.
Full-field digital mammography image data storage reduction using a crop tool.
Kang, Bong Joo; Kim, Sung Hun; An, Yeong Yi; Choi, Byung Gil
2015-05-01
The storage requirements for full-field digital mammography (FFDM) in a picture archiving and communication system are significant, so methods to reduce the data set size are needed. A FFDM crop tool for this purpose was designed, implemented, and tested. A total of 1,651 screening mammography cases with bilateral FFDMs were included in this study. The images were cropped using a DICOM editor while maintaining image quality. The cases were evaluated according to the breast volume (1/4, 2/4, 3/4, and 4/4) in the craniocaudal view. The image sizes between the cropped image group and the uncropped image group were compared. The overall image quality and reader's preference were independently evaluated by the consensus of two radiologists. Digital storage requirements for sets of four uncropped to cropped FFDM images were reduced by 3.8 to 82.9 %. The mean reduction rates according to the 1/4-4/4 breast volumes were 74.7, 61.1, 38, and 24 %, indicating that the lower the breast volume, the smaller the size of the cropped data set. The total image data set size was reduced from 87 to 36.7 GB, or a 57.7 % reduction. The overall image quality and the reader's preference for the cropped images were higher than those of the uncropped images. FFDM mammography data storage requirements can be significantly reduced using a crop tool.
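A minimal numpy sketch of the cropping idea, assuming the breast region can be isolated with a simple intensity threshold; the real tool works on DICOM FFDM images and preserves image quality and header handling, which this toy example ignores.

    import numpy as np

    def crop_to_content(image, threshold=0):
        """Crop a 2-D image to the bounding box of pixels above threshold."""
        rows = np.any(image > threshold, axis=1)
        cols = np.any(image > threshold, axis=0)
        r0, r1 = np.where(rows)[0][[0, -1]]
        c0, c1 = np.where(cols)[0][[0, -1]]
        return image[r0:r1 + 1, c0:c1 + 1]

    # Toy example: a mostly empty "mammogram" with signal in one corner.
    img = np.zeros((4000, 3000), dtype=np.uint16)
    img[:1500, :1000] = 500
    cropped = crop_to_content(img)
    print("storage reduction: %.1f%%" % (100.0 * (1 - cropped.size / img.size)))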
The Speckle Toolbox: A Powerful Data Reduction Tool for CCD Astrometry
NASA Astrophysics Data System (ADS)
Harshaw, Richard; Rowe, David; Genet, Russell
2017-01-01
Recent advances in high-speed, low-noise CCD and CMOS cameras, coupled with breakthroughs in data reduction software that runs on desktop PCs, have opened the domain of speckle interferometry and high-accuracy CCD measurements of double stars to amateurs, allowing them to do useful science of high quality. This paper describes how to use a speckle interferometry reduction program, the Speckle Tool Box (STB), to achieve this level of result. For over a year the author (Harshaw) has been using STB (and its predecessor, Plate Solve 3) to obtain measurements of double stars based on CCD camera technology for pairs that are either too wide (the stars not sharing the same isoplanatic patch, roughly 5 arc-seconds in diameter) or too faint to image in the coherence time required for speckle (usually under 40 ms). This same approach - using speckle reduction software to measure CCD pairs with greater accuracy than possible with lucky imaging - has been used, it turns out, for several years by the U. S. Naval Observatory.
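A short sketch of the classical speckle processing that such tools automate: average the power spectra of many short exposures and inspect the resulting autocorrelation, whose secondary peaks encode the separation and position angle of a binary. The random frames below are stand-ins for real camera data.

    import numpy as np

    def mean_power_spectrum(frames):
        """Average |FFT|^2 over a cube of short-exposure frames."""
        ps = np.zeros(frames.shape[1:])
        for frame in frames:
            ps += np.abs(np.fft.fft2(frame)) ** 2
        return ps / len(frames)

    rng = np.random.default_rng(0)
    cube = rng.random((100, 128, 128))                # placeholder speckle frames
    ps = mean_power_spectrum(cube)
    acf = np.fft.fftshift(np.abs(np.fft.ifft2(ps)))   # autocorrelation of the image
    print(acf.shape)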
ClinVar Miner: Demonstrating utility of a web-based tool for viewing and filtering ClinVar data.
Henrie, Alex; Hemphill, Sarah E; Ruiz-Schultz, Nicole; Cushman, Brandon; DiStefano, Marina T; Azzariti, Danielle; Harrison, Steven M; Rehm, Heidi L; Eilbeck, Karen
2018-05-23
ClinVar Miner is a web-based suite that utilizes the data held in the National Center for Biotechnology Information's ClinVar archive. The goal is to render the data more accessible to processes pertaining to conflict resolution of variant interpretation as well as tracking details of data submission and data management for detailed variant curation. Here we establish the use of these tools to address three separate use-cases and to perform analyses across submissions. We demonstrate that the ClinVar Miner tools are an effective means to browse and consolidate data for variant submitters, curation groups, and general oversight. These tools are also relevant to the variant interpretation community in general.
Alternative Fuels Data Center: Tools
Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project
NASA Astrophysics Data System (ADS)
van Eck, T.; Giardini, D.
2010-12-01
The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive European integrated RI for earthquake seismological data that is scalable and sustainable. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented and produced advanced analysis tools and software packages. A single seismic data portal provides a single access point and overview for European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include, among others: - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from > 53 observatories. This data is continuously monitored, quality controlled and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core comprehensive European acceleration database. Standardized parameter analysis and the actual software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000 - 1963, M ≥5.8), including analysis tools. - Data from three one-year OBS deployments at three sites (Atlantic, Ionian and Ligurian Sea) in the general SEED format, thus creating the core integrated database for ocean, sea and land based seismological observatories. Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models, including a standardized model description with several visualisation tools, currently being adapted on a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community accepted forecasting testing and model validation approach, and the core hazard portal developed along the same technologies as the NERIES data portal. - Homogeneous shakemap estimation tools implemented at several large European observatories and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization with relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange and information management in seismology as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010 - 2014), is funded and will integrate the seismological and earthquake engineering infrastructures. NERIES further provided the proof of concept for the ESFRI2008 initiative: the European Plate Observing System (EPOS). Its preparatory phase (2010 - 2014) is also funded by the EC.
Towards a generalized energy prediction model for machine tools
Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H.; Dornfeld, David A.; Helu, Moneer; Rachuri, Sudarsan
2017-01-01
Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process. PMID:28652687
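A hedged sketch of the Gaussian Process regression step using scikit-learn, with invented process-parameter features and a synthetic energy response standing in for measured machine-tool data; kernel choices and units are illustrative.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    # Features: feed rate, spindle speed, depth of cut (illustrative units).
    X = rng.uniform([50, 1000, 0.5], [500, 5000, 3.0], size=(80, 3))
    # Synthetic energy response with noise, standing in for measurements.
    y = 0.02 * X[:, 0] + 0.001 * X[:, 1] + 5.0 * X[:, 2] + rng.normal(0, 0.5, 80)

    kernel = RBF(length_scale=[100.0, 1000.0, 1.0]) + WhiteKernel()
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    mean, std = gp.predict(np.array([[200.0, 3000.0, 1.5]]), return_std=True)
    print("predicted energy: %.2f +/- %.2f" % (mean[0], 2 * std[0]))   # ~95% interval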
Towards a generalized energy prediction model for machine tools.
Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H; Dornfeld, David A; Helu, Moneer; Rachuri, Sudarsan
2017-04-01
Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process.
Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M; Subbarao, Italo
2015-06-29
This article describes a novel triangulation methodological approach for identifying twitter activity of regional active twitter users during the 2013 Hattiesburg EF-4 Tornado. A data extraction and geographically centered filtration approach was utilized to generate Twitter data for 48 hrs pre- and post-Tornado. The data was further validated using six sigma approach utilizing GPS data. The regional analysis revealed a total of 81,441 tweets, 10,646 Twitter users, 27,309 retweets and 2637 tweets with GPS coordinates. Twitter tweet activity increased 5 fold during the response to the Hattiesburg Tornado. Retweeting activity increased 2.2 fold. Tweets with a hashtag increased 1.4 fold. Twitter was an effective disaster risk reduction tool for the Hattiesburg EF-4 Tornado 2013.
Complexities in Subsetting Satellite Level 2 Data
NASA Astrophysics Data System (ADS)
Huwe, P.; Wei, J.; Albayrak, A.; Silberstein, D. S.; Alfred, J.; Savtchenko, A. K.; Johnson, J. E.; Hearty, T.; Meyer, D. J.
2017-12-01
Satellite Level 2 data presents unique challenges for tools and services. From nonlinear spatial geometry to inhomogeneous file data structure to inconsistent temporal variables to complex data variable dimensionality to multiple file formats, there are many difficulties in creating general tools for Level 2 data support. At NASA Goddard Earth Sciences Data and Information Services Center (GES DISC), we are implementing a general Level 2 Subsetting service for Level 2 data. In this presentation, we will unravel some of the challenges faced in creating this service and the strategies we used to surmount them.
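A simplified sketch of spatial subsetting for swath data, assuming 2-D latitude/longitude arrays and a bounding box that does not cross the antimeridian; a production service must also cope with the file-structure, temporal and format issues listed above.

    import numpy as np

    def subset_swath(lat, lon, data, bbox):
        """Mask swath pixels outside a (lat_min, lat_max, lon_min, lon_max) box."""
        lat_min, lat_max, lon_min, lon_max = bbox
        inside = ((lat >= lat_min) & (lat <= lat_max) &
                  (lon >= lon_min) & (lon <= lon_max))
        return np.where(inside, data, np.nan), inside

    # Synthetic swath standing in for a Level 2 granule.
    lat = np.linspace(-30, 30, 200)[:, None] + np.zeros((1, 90))
    lon = np.zeros((200, 1)) + np.linspace(100, 130, 90)[None, :]
    data = np.random.default_rng(0).random((200, 90))
    subset, mask = subset_swath(lat, lon, data, (-10, 10, 110, 120))
    print("pixels kept:", int(mask.sum()))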
Consistent Pauli reduction on group manifolds
Baguet, A.; Pope, Christopher N.; Samtleben, H.
2016-01-01
We prove an old conjecture by Duff, Nilsson, Pope and Warner asserting that the NSNS sector of supergravity (and more generally the bosonic string) allows for a consistent Pauli reduction on any d-dimensional group manifold G, keeping the full set of gauge bosons of the G×G isometry group of the bi-invariant metric on G. The main tool of the construction is a particular generalised Scherk–Schwarz reduction ansatz in double field theory which we explicitly construct in terms of the group's Killing vectors. Examples include the consistent reduction from ten dimensions on S3×S3 and on similar product spaces. The construction is another example of globally geometric non-toroidal compactifications inducing non-geometric fluxes.
Farran, Bassam; Channanath, Arshad Mohamed; Behbehani, Kazem; Thanaraj, Thangavel Alphonse
2013-05-14
We build classification models and risk assessment tools for diabetes, hypertension and comorbidity using machine-learning algorithms on data from Kuwait. We model the increased proneness of diabetic patients to develop hypertension and vice versa, and ascertain the importance of ethnicity (natives vs expatriate migrants) and of using regional data in risk assessment. The design is a retrospective cohort study. Four machine-learning techniques were used: logistic regression, k-nearest neighbours (k-NN), multifactor dimensionality reduction and support vector machines. The study uses fivefold cross-validation to obtain generalisation accuracies and errors. The setting is the Kuwait Health Network (KHN), which integrates data from primary health centres and hospitals in Kuwait. The participants are 270 172 hospital visitors (of which 89 858 are diabetic, 58 745 hypertensive and 30 522 comorbid), comprising Kuwaiti natives and Asian and Arab expatriates; the outcomes are incident type 2 diabetes, hypertension and comorbidity. Classification accuracies of >85% (for diabetes) and >90% (for hypertension) are achieved using only simple non-laboratory-based parameters. Risk assessment tools based on k-NN classification models are able to assign 'high' risk to 75% of diabetic patients and to 94% of hypertensive patients. Only 5% of diabetic patients are assigned 'low' risk. Asian-specific models and assessments perform even better. Pathological conditions of diabetes in the general population or in the hypertensive population, and those of hypertension, are modelled. Two-stage aggregate classification models and risk assessment tools, built by combining both component models on diabetes (or on hypertension), perform better than the individual models. Data on diabetes, hypertension and comorbidity from the cosmopolitan State of Kuwait are available for the first time, which enabled us to apply four different case-control models to assess risks. These tools aid in the preliminary non-intrusive assessment of the population. Ethnicity is seen to be significant in the predictive models, and risk assessments need to be developed using regional data, as we demonstrate by applying the American Diabetes Association online calculator to data from Kuwait.
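A generic sketch of the k-NN classification and fivefold cross-validation workflow described above, using synthetic stand-ins for simple non-laboratory parameters; it is not the study's model or data.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n = 1000
    # Invented features: age, BMI, waist circumference.
    X = np.column_stack([rng.normal(45, 12, n), rng.normal(29, 5, n), rng.normal(95, 12, n)])
    risk = 0.05 * (X[:, 0] - 45) + 0.2 * (X[:, 1] - 29) + 0.05 * (X[:, 2] - 95)
    y = (risk + rng.normal(0, 1.5, n) > 0).astype(int)   # 1 = develops the condition

    model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=15))
    scores = cross_val_score(model, X, y, cv=5)          # fivefold cross-validation
    print("accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))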
Big Data, Models and Tools | Transportation Research | NREL
CMS Analysis and Data Reduction with Apache Spark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutsche, Oliver; Canali, Luca; Cremer, Illia
Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies, have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover, these new tools are typically actively developed by large communities, often profiting from industry resources, and under open source licensing. These factors result in a boost for adoption and maturity of the tools and for the communities supporting them, at the same time helping to reduce the cost of ownership for the end-users. In this talk, we present studies of using Apache Spark for end user data analysis. We study the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets and the end-analysis up to the publication plot. For the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility. The goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We present the progress of this 2-year project with first results of scaling up Spark-based HEP analysis. For the second thrust, we present studies on using Apache Spark for a CMS Dark Matter physics search, comparing Spark's feasibility, usability and performance to the ROOT-based analysis.
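A minimal PySpark sketch of the filter-and-project style of reduction discussed above; the input path, column names and selection cuts are illustrative assumptions, not the actual CMS schema or facility configuration.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("reduction-sketch").getOrCreate()

    # Hypothetical input: events already converted to a columnar format.
    events = spark.read.parquet("/data/cms/events.parquet")

    reduced = (events
               .filter(F.col("met") > 200.0)       # made-up selection
               .filter(F.col("njets") >= 2)
               .select("run", "lumi", "event", "met", "jet_pt"))

    reduced.write.mode("overwrite").parquet("/data/user/reduced_ntuple.parquet")
    print(reduced.count())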
ExoMars Entry Demonstrator Module Dynamic Stability
NASA Astrophysics Data System (ADS)
Dormieux, Marc; Gulhan, Ali; Berner, Claude
2011-05-01
In the frame of ExoMars DM aerodynamics characterization, pitch damping derivatives determination is required as it drives the parachute deployment conditions. Series of free-flight and free-oscillation tests (captive model) have been conducted with particular attention to data reduction. Six-Degrees-of-Freedom (DoF) analysis tools require the knowledge of local damping derivatives. In general, ground tests do not provide them directly but only effective damping derivatives. Free-flight (ballistic range) tests with full oscillations around trim angle have been performed at ISL for 0.5
ESO Reflex: a graphical workflow engine for data reduction
NASA Astrophysics Data System (ADS)
Hook, Richard; Ullgrén, Marko; Romaniello, Martino; Maisala, Sami; Oittinen, Tero; Solin, Otto; Savolainen, Ville; Järveläinen, Pekka; Tyynelä, Jani; Péron, Michèle; Ballester, Pascal; Gabasch, Armin; Izzo, Carlo
ESO Reflex is a prototype software tool that provides a novel approach to astronomical data reduction by integrating a modern graphical workflow system (Taverna) with existing legacy data reduction algorithms. Most of the raw data produced by instruments at the ESO Very Large Telescope (VLT) in Chile are reduced using recipes. These are compiled C applications following an ESO standard and utilising routines provided by the Common Pipeline Library (CPL). Currently these are run in batch mode as part of the data flow system to generate the input to the ESO/VLT quality control process and are also exported for use offline. ESO Reflex can invoke CPL-based recipes in a flexible way through a general purpose graphical interface. ESO Reflex is based on the Taverna system that was originally developed within the UK life-sciences community. Workflows have been created so far for three VLT/VLTI instruments, and the GUI allows the user to make changes to these or create workflows of their own. Python scripts or IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available. Taverna is intended for use with web services and experiments using ESO Reflex to access Virtual Observatory web services have been successfully performed. ESO Reflex is the main product developed by Sampo, a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal was to look into the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Sampo concluded early in 2008. This contribution will describe ESO Reflex and show several examples of its use both locally and using Virtual Observatory remote web services. ESO Reflex is expected to be released to the community in early 2009.
Engine dynamic analysis with general nonlinear finite element codes
NASA Technical Reports Server (NTRS)
Adams, M. L.; Padovan, J.; Fertis, D. G.
1991-01-01
A general engine dynamic analysis as a standard design study computational tool is described for the prediction and understanding of complex engine dynamic behavior. Improved definition of engine dynamic response provides valuable information and insights leading to reduced maintenance and overhaul costs on existing engine configurations. Application of advanced engine dynamic simulation methods provides a considerable cost reduction in the development of new engine designs by eliminating some of the trial and error process done with engine hardware development.
C3: A Command-line Catalogue Cross-matching tool for modern astrophysical survey data
NASA Astrophysics Data System (ADS)
Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio
2017-06-01
In the current data-driven science era, data analysis techniques must evolve quickly to cope with data whose size has grown to the Petabyte scale. In particular, since modern astrophysics is based on multi-wavelength data organized into large catalogues, it is crucial that astronomical catalogue cross-matching methods, which depend strongly on catalogue size, ensure efficiency, reliability and scalability. Furthermore, multi-band data are archived and reduced in different ways, so that the resulting catalogues may differ from each other in format, resolution, data structure, etc., thus requiring the highest generality of cross-matching features. We present C3 (Command-line Catalogue Cross-match), a multi-platform application designed to efficiently cross-match massive catalogues from modern surveys. Conceived as a stand-alone command-line process or a module within a generic data reduction/analysis pipeline, it provides maximum flexibility in terms of portability, configuration, coordinates and cross-matching types, and ensures high performance by using a multi-core parallel processing paradigm and a sky partitioning algorithm.
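A small illustration of positional cross-matching with astropy (not the C3 implementation itself, which adds multi-core sky partitioning and format handling): each source in one catalogue is matched to its nearest neighbour on the sky in the other, then filtered by a separation threshold.

    import numpy as np
    from astropy import units as u
    from astropy.coordinates import SkyCoord

    rng = np.random.default_rng(0)
    ra1, dec1 = rng.uniform(0, 1, 1000), rng.uniform(0, 1, 1000)   # degrees
    ra2 = ra1 + rng.normal(0, 1e-4, 1000)                          # perturbed copy
    dec2 = dec1 + rng.normal(0, 1e-4, 1000)

    cat1 = SkyCoord(ra=ra1 * u.deg, dec=dec1 * u.deg)
    cat2 = SkyCoord(ra=ra2 * u.deg, dec=dec2 * u.deg)

    idx, sep2d, _ = cat1.match_to_catalog_sky(cat2)   # nearest neighbour on the sky
    matched = sep2d < 1.0 * u.arcsec                  # keep pairs within 1 arcsec
    print("matched sources:", int(matched.sum()))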
Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M.; Subbarao, Italo
2015-01-01
Background: This article describes a novel triangulation methodological approach for identifying twitter activity of regional active twitter users during the 2013 Hattiesburg EF-4 Tornado. Methodology: A data extraction and geographically centered filtration approach was utilized to generate Twitter data for 48 hrs pre- and post-Tornado. The data was further validated using six sigma approach utilizing GPS data. Results: The regional analysis revealed a total of 81,441 tweets, 10,646 Twitter users, 27,309 retweets and 2637 tweets with GPS coordinates. Conclusions: Twitter tweet activity increased 5 fold during the response to the Hattiesburg Tornado. Retweeting activity increased 2.2 fold. Tweets with a hashtag increased 1.4 fold. Twitter was an effective disaster risk reduction tool for the Hattiesburg EF-4 Tornado 2013. PMID:26203396
A MapReduce approach to diminish imbalance parameters for big deoxyribonucleic acid dataset.
Kamal, Sarwar; Ripon, Shamim Hasnat; Dey, Nilanjan; Ashour, Amira S; Santhi, V
2016-07-01
In the age of the information superhighway, big data play a significant role in information processing, extraction, retrieval and management. In computational biology, the continuous challenge is to manage the biological data. Data mining techniques are sometimes imperfect for new space and time requirements, so it is critical to process massive amounts of data to retrieve knowledge. The existing software and automated tools to handle big data sets are not sufficient. As a result, an expandable mining technique that enfolds the large storage and processing capability of distributed or parallel processing platforms is essential. In this analysis, a contemporary distributed clustering methodology for imbalance data reduction using a k-nearest neighbor (K-NN) classification approach has been introduced. The pivotal objective of this work is to represent real training data sets with a reduced number of elements or instances. These reduced data sets ensure faster data classification and standard storage management with less sensitivity. However, general data reduction methods cannot manage very big data sets. To minimize these difficulties, a MapReduce-oriented framework is designed using various clusters of automated contents, comprising multiple algorithmic approaches. To test the proposed approach, a real DNA (deoxyribonucleic acid) dataset that consists of 90 million pairs has been used. The proposed model reduces the imbalanced data sets derived from large-scale data sets without loss of accuracy. The obtained results show that the MapReduce-based K-NN classifier provides accurate results for big DNA data.
Digital Signal Processing For Low Bit Rate TV Image Codecs
NASA Astrophysics Data System (ADS)
Rao, K. R.
1987-06-01
In view of the 56 KBPS digital switched network services and the ISDN, low bit rate codecs for providing real-time full-motion color video are under various stages of development. Some companies have already brought such codecs to market; they are being used by industry and some Federal agencies for video teleconferencing. In general, these codecs have various features such as multiplexing of audio and data, high resolution graphics, encryption, error detection and correction, self diagnostics, freeze-frame, split video, text overlay, etc. To transmit the original color video over a 56 KBPS network requires a bit rate reduction of the order of 1400:1. Such large-scale bandwidth compression can be realized only by implementing a number of sophisticated digital signal processing techniques. This paper provides an overview of such techniques and outlines the newer concepts that are being investigated. Before resorting to data compression techniques, various preprocessing operations such as noise filtering, composite-to-component transformation and horizontal and vertical blanking interval removal are to be implemented. Invariably, spatio-temporal subsampling is achieved by appropriate filtering. Transform and/or prediction coding coupled with motion estimation and strengthened by adaptive features are some of the tools in the arsenal of data reduction methods. Other essential blocks in the system are the quantizer, bit allocation, buffer, multiplexer, channel coding, etc.
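As a rough sanity check of the quoted order-of-magnitude reduction, the snippet below estimates the source bit rate under assumed digitization parameters (which are not taken from the paper) and divides by the 56 kbps channel rate.

```python
# Back-of-envelope compression-ratio estimate under assumed source parameters.
pixels_per_frame = 512 * 480      # assumed active luminance resolution
frames_per_second = 30
bits_per_pixel = 12               # assumed 8-bit luminance with 4:2:0 chroma subsampling

source_rate = pixels_per_frame * frames_per_second * bits_per_pixel   # bits/s
channel_rate = 56_000                                                 # 56 kbps channel

# Lands in the same ballpark as the ~1400:1 figure quoted in the abstract.
print(f"source ~{source_rate / 1e6:.1f} Mbps, "
      f"required reduction ~{source_rate / channel_rate:.0f}:1")
```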
Complexities in Subsetting Level 2 Data
NASA Technical Reports Server (NTRS)
Huwe, Paul; Wei, Jennifer; Meyer, David; Silberstein, David S.; Alfred, Jerome; Savtchenko, Andrey K.; Johnson, James E.; Albayrak, Arif; Hearty, Thomas
2017-01-01
Satellite Level 2 data present unique challenges for tools and services. From nonlinear spatial geometry to inhomogeneous file data structures, inconsistent temporal variables, complex data variable dimensionality and multiple file formats, there are many difficulties in creating general tools for Level 2 data support. At the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC), we are implementing a general subsetting service that reduces Level 2 data to a user-specified spatio-temporal region of interest (ROI). In this presentation, we will unravel some of the challenges faced in creating this service and the strategies we used to surmount them.
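A minimal sketch of the core subsetting operation is shown below: build a boolean mask over the swath's per-pixel geolocation and per-scan time arrays and apply it to a data variable. The synthetic swath, variable layout and ROI bounds are hypothetical, not the GES DISC service itself.

```python
# Spatio-temporal ROI subsetting sketch for Level 2 (swath) data.
import numpy as np

# Synthetic stand-in for a swath: per-pixel geolocation plus per-scan time.
n_scan, n_pix = 400, 90
rng = np.random.default_rng(1)
lat = np.linspace(20, 60, n_scan)[:, None] + rng.normal(scale=0.2, size=(n_scan, n_pix))
lon = np.linspace(-110, -70, n_scan)[:, None] + rng.normal(scale=0.2, size=(n_scan, n_pix))
time = np.linspace(0, 5400, n_scan)                               # seconds since granule start
data = rng.normal(loc=0.3, scale=0.05, size=(n_scan, n_pix))      # e.g. aerosol optical depth

# User-specified ROI: bounding box plus a time window.
in_box = (lat >= 30) & (lat <= 45) & (lon >= -100) & (lon <= -80)
in_window = (time >= 600) & (time <= 3000)
mask = in_box & in_window[:, None]        # broadcast per-scan time across the swath

subset = data[mask]                       # irregular selection flattens to 1-D
print(f"kept {subset.size} of {data.size} pixels; mean value {subset.mean():.3f}")
```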
Information technology and medical missteps: evidence from a randomized trial.
Javitt, Jonathan C; Rebitzer, James B; Reisman, Lonny
2008-05-01
We analyze the effect of a decision support tool designed to help physicians detect and correct medical "missteps". The data comes from a randomized trial of the technology on a population of commercial HMO patients. The key findings are that the new information technology lowers average charges by 6% relative to the control group. This reduction in resource utilization was the result of reduced in-patient charges (and associated professional charges) for the most costly patients. The rate at which identified issues were resolved was generally higher in the study group than in the control group, suggesting the possibility of improvements in care quality along measured dimensions and enhanced diffusion of new protocols based on new clinical evidence.
A Java-based tool for the design of classification microarrays.
Meng, Da; Broschat, Shira L; Call, Douglas R
2008-08-04
Classification microarrays are used for purposes such as identifying strains of bacteria and determining genetic relationships to understand the epidemiology of an infectious disease. For these cases, mixed microarrays, which are composed of DNA from more than one organism, are more effective than conventional microarrays composed of DNA from a single organism. Selection of probes is a key factor in designing successful mixed microarrays because redundant sequences are inefficient and limited representation of diversity can restrict application of the microarray. We have developed a Java-based software tool, called PLASMID, for use in selecting the minimum set of probe sequences needed to classify different groups of plasmids or bacteria. The software program was successfully applied to several different sets of data. The utility of PLASMID was illustrated using existing mixed-plasmid microarray data as well as data from a virtual mixed-genome microarray constructed from different strains of Streptococcus. Moreover, use of data from expression microarray experiments demonstrated the generality of PLASMID. In this paper we describe a new software tool for selecting a set of probes for a classification microarray. While the tool was developed for the design of mixed microarrays (and mixed-plasmid microarrays in particular), it can also be used to design expression arrays. The user can choose from several clustering methods (including hierarchical, non-hierarchical, and a model-based genetic algorithm), several probe ranking methods, and several different display methods. A novel approach is used for probe redundancy reduction, and probe selection is accomplished via stepwise discriminant analysis. Data can be entered in different formats (including Excel and comma-delimited text), and dendrogram, heat map, and scatter plot images can be saved in several different formats (including jpeg and tiff). Weights generated using stepwise discriminant analysis can be stored for analysis of subsequent experimental data. Additionally, PLASMID can be used to construct virtual microarrays with genomes from public databases, which can then be used to identify an optimal set of probes.
On Undecidability Aspects of Resilient Computations and Implications to Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S
2014-01-01
Future Exascale computing systems with a large number of processors, memory elements and interconnection links are expected to experience multiple, complex faults, which affect both applications and operating-runtime systems. A variety of algorithms, frameworks and tools are being proposed to realize and/or verify the resilience properties of computations that guarantee correct results on failure-prone computing systems. We analytically show that certain resilient computation problems in the presence of general classes of faults are undecidable, that is, no algorithms exist for solving them. We first show that the membership verification in a generic set of resilient computations is undecidable. We describe classes of faults that can create infinite loops or non-halting computations, whose detection in general is undecidable. We then show certain resilient computation problems to be undecidable by using reductions from the loop detection and halting problems under two formulations, namely, an abstract programming language and Turing machines, respectively. These two reductions highlight different failure effects: the former represents program and data corruption, and the latter illustrates incorrect program execution. These results call for broad-based, well-characterized resilience approaches that complement purely computational solutions using methods such as hardware monitors, co-designs, and system- and application-specific diagnosis codes.
Automatic welding detection by an intelligent tool pipe inspection
NASA Astrophysics Data System (ADS)
Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.
2015-07-01
This work provides a model based on machine learning techniques for weld recognition, using signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets and the performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
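A hedged sketch of such a pipeline is given below, with synthetic features standing in for attributes of the "smart pig" signal windows; the scaler, univariate attribute selection and RBF SVM are illustrative choices, not necessarily those used by the authors.

```python
# Noise-robust classification pipeline sketch: scale, select attributes, train an SVM,
# and score with cross-validated ROC AUC.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 40))          # stand-in features from inspection signal windows
y = (X[:, :5].sum(axis=1) + rng.normal(scale=2.0, size=600) > 0).astype(int)  # 1 = weld

model = Pipeline([
    ("scale", StandardScaler()),                 # part of the pre-processing stage
    ("select", SelectKBest(f_classif, k=10)),    # attribute selection
    ("svm", SVC(kernel="rbf", probability=True)),
])

auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated ROC AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```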
Users guide to E859 phoswich analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costales, J.B.
1992-11-30
In this memo the authors describe the analysis path used to transform the phoswich data from raw data banks into cross sections suitable for publication. The primary purpose of this memo is not to document each analysis step in great detail but rather to point the reader to the Fortran code used and to point out the essential features of the analysis path. A flow chart which summarizes the various steps performed to massage the data from beginning to end is given. In general, each step corresponds to a Fortran program which was written to perform that particular task. The automation of the data analysis has been kept purposefully minimal in order to ensure the highest quality of the final product. However, tools have been developed which ease the non-automated steps. There are two major parallel routes for the data analysis: data reduction and acceptance determination using detailed GEANT Monte Carlo simulations. In this memo, the authors will first describe the data reduction up to the point where PHAD banks (Pass 1-like banks) are created. They will then describe the steps taken in the GEANT Monte Carlo route. Note that a detailed memo describing the methodology of the acceptance corrections has already been written. Therefore the discussion of the acceptance determination will be kept to a minimum and the reader will be referred to the other memo for further details. Finally, they will describe the cross section formation process and how final spectra are extracted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baguet, A.; Pope, Christopher N.; Samtleben, H.
We prove an old conjecture by Duff, Nilsson, Pope and Warner asserting that the NSNS sector of supergravity (and more generally the bosonic string) allows for a consistent Pauli reduction on any d-dimensional group manifold G, keeping the full set of gauge bosons of the G×G isometry group of the bi-invariant metric on G. The main tool of the construction is a particular generalised Scherk–Schwarz reduction ansatz in double field theory which we explicitly construct in terms of the group's Killing vectors. Examples include the consistent reduction from ten dimensions on S3×S3 and on similar product spaces. The construction is another example of globally geometric non-toroidal compactifications inducing non-geometric fluxes.
Cornelissen, Frans; Cik, Miroslav; Gustin, Emmanuel
2012-04-01
High-content screening has brought new dimensions to cellular assays by generating rich data sets that characterize cell populations in great detail and detect subtle phenotypes. To derive relevant, reliable conclusions from these complex data, it is crucial to have informatics tools supporting quality control, data reduction, and data mining. These tools must reconcile the complexity of advanced analysis methods with the user-friendliness demanded by the user community. After review of existing applications, we realized the possibility of adding innovative new analysis options. Phaedra was developed to support workflows for drug screening and target discovery, interact with several laboratory information management systems, and process data generated by a range of techniques including high-content imaging, multicolor flow cytometry, and traditional high-throughput screening assays. The application is modular and flexible, with an interface that can be tuned to specific user roles. It offers user-friendly data visualization and reduction tools for HCS but also integrates Matlab for custom image analysis and the Konstanz Information Miner (KNIME) framework for data mining. Phaedra features efficient JPEG2000 compression and full drill-down functionality from dose-response curves down to individual cells, with exclusion and annotation options, cell classification, statistical quality controls, and reporting.
Small Engine Technology (SET). Task 33: Airframe, Integration, and Community Noise Study
NASA Technical Reports Server (NTRS)
Lieber, Lys S.; Elkins, Daniel; Golub, Robert A. (Technical Monitor)
2002-01-01
Task Order 33 had four primary objectives as follows: (1) Identify and prioritize the airframe noise reduction technologies needed to accomplish the NASA Pillar goals for business and regional aircraft. (2) Develop a model to estimate the effect of jet shear layer refraction and attenuation of internally generated source noise of a turbofan engine on the aircraft system noise. (3) Determine the effect on community noise of source noise changes of a generic turbofan engine operating from sea level to 15,000 feet. (4) Support lateral attenuation experiments conducted by NASA Langley at Wallops Island, VA, by coordinating opportunities for Contractor Aircraft to participate as a noise source during the noise measurements. Noise data and noise prediction tools, including airframe noise codes, from the NASA Advanced Subsonic Technology (AST) program were applied to assess the current status of noise reduction technologies relative to the NASA pillar goals for regional and small business jet aircraft. In addition, the noise prediction tools were applied to evaluate the effectiveness of airframe-related noise reduction concepts developed in the AST program on reducing the aircraft system noise. The AST noise data and acoustic prediction tools used in this study were furnished by NASA.
Model diagnostics in reduced-rank estimation
Chen, Kun
2016-01-01
Reduced-rank methods are very popular in high-dimensional multivariate analysis for conducting simultaneous dimension reduction and model estimation. However, the commonly-used reduced-rank methods are not robust, as the underlying reduced-rank structure can be easily distorted by only a few data outliers. Anomalies are bound to exist in big data problems, and in some applications they themselves could be of the primary interest. While naive residual analysis is often inadequate for outlier detection due to potential masking and swamping, robust reduced-rank estimation approaches could be computationally demanding. Under Stein's unbiased risk estimation framework, we propose a set of tools, including leverage score and generalized information score, to perform model diagnostics and outlier detection in large-scale reduced-rank estimation. The leverage scores give an exact decomposition of the so-called model degrees of freedom to the observation level, which lead to exact decomposition of many commonly-used information criteria; the resulting quantities are thus named information scores of the observations. The proposed information score approach provides a principled way of combining the residuals and leverage scores for anomaly detection. Simulation studies confirm that the proposed diagnostic tools work well. A pattern recognition example with hand-writing digital images and a time series analysis example with monthly U.S. macroeconomic data further demonstrate the efficacy of the proposed approaches. PMID:28003860
Security of Continuous-Variable Quantum Key Distribution via a Gaussian de Finetti Reduction.
Leverrier, Anthony
2017-05-19
Establishing the security of continuous-variable quantum key distribution against general attacks in a realistic finite-size regime is an outstanding open problem in the field of theoretical quantum cryptography if we restrict our attention to protocols that rely on the exchange of coherent states. Indeed, techniques based on the uncertainty principle are not known to work for such protocols, and the usual tools based on de Finetti reductions only provide security for unrealistically large block lengths. We address this problem here by considering a new type of Gaussian de Finetti reduction, that exploits the invariance of some continuous-variable protocols under the action of the unitary group U(n) (instead of the symmetric group S_{n} as in usual de Finetti theorems), and by introducing generalized SU(2,2) coherent states. Crucially, combined with an energy test, this allows us to truncate the Hilbert space globally instead of at the single-mode level as in previous approaches that failed to provide security in realistic conditions. Our reduction shows that it is sufficient to prove the security of these protocols against Gaussian collective attacks in order to obtain security against general attacks, thereby confirming rigorously the widely held belief that Gaussian attacks are indeed optimal against such protocols.
Measuring infrastructure: A key step in program evaluation and planning.
Schmitt, Carol L; Glasgow, LaShawn; Lavinghouze, S Rene; Rieker, Patricia P; Fulmer, Erika; McAleer, Kelly; Rogers, Todd
2016-06-01
State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General's call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model's utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs. Copyright © 2016 Elsevier Ltd. All rights reserved.
VizieR Online Data Catalog: James Clerk Maxwell Telescope Science Archive (CADC, 2003)
NASA Astrophysics Data System (ADS)
Canadian Astronomy Data Centre
2018-01-01
The JCMT Science Archive (JSA), a collaboration between the CADC and EOA, is the official distribution site for observational data obtained with the James Clerk Maxwell Telescope (JCMT) on Mauna Kea, Hawaii. The JSA search interface is provided by the CADC Search tool, which provides generic access to the complete set of telescopic data archived at the CADC. Help on the use of this tool is provided via tooltips. For additional information on instrument capabilities and data reduction, please consult the SCUBA-2 and ACSIS instrument pages provided on the JAC maintained JCMT pages. JCMT-specific help related to the use of the CADC AdvancedSearch tool is available from the JAC. (1 data file).
Interactive Graphics Tools for Analysis of MOLA and Other Data
NASA Technical Reports Server (NTRS)
Frey, H.; Roark, J.; Sakimoto, S.
2000-01-01
We have developed several interactive analysis tools based on the IDL programming language for the analysis of Mars Orbiter Laser Altimeter (MOLA) profile and gridded data which are available to the general community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tim, U.S.; Jolly, R.
1994-01-01
Considerable progress has been made in developing physically based, distributed parameter, hydrologic/water quality (HIWQ) models for planning and control of nonpoint-source pollution. The widespread use of these models is often constrained by the excessive and time-consuming input data demands and the lack of computing efficiencies necessary for iterative simulation of alternative management strategies. Recent developments in geographic information systems (GIS) provide techniques for handling large amounts of spatial data for modeling nonpoint-source pollution problems. Because a GIS can be used to combine information from several sources to form an array of model input data and to examine any combination of spatial input/output data, it represents a highly effective tool for HIWQ modeling. This paper describes the integration of a distributed-parameter model (AGNPS) with a GIS (ARC/INFO) to examine nonpoint sources of pollution in an agricultural watershed. The ARC/INFO GIS provided the tools to generate and spatially organize the disparate data to support modeling, while the AGNPS model was used to predict several water quality variables including soil erosion and sedimentation within a watershed. The integrated system was used to evaluate the effectiveness of several alternative management strategies in reducing sediment pollution in a 417-ha watershed located in southern Iowa. The implementation of vegetative filter strips and contour buffer (grass) strips resulted in 41 and 47% reductions in sediment yield at the watershed outlet, respectively. In addition, the combination of the above management strategies resulted in a 71% reduction in sediment yield. In general, the study demonstrated the utility of integrating a simulation model with GIS for nonpoint-source pollution control and planning. Such techniques can help characterize the diffuse sources of pollution at the landscape level. 52 refs., 6 figs., 1 tab.
On-Orbit Performance Degradation of the International Space Station P6 Photovoltaic Arrays
NASA Technical Reports Server (NTRS)
Kerslake, Thomas W.; Gustafson, Eric D.
2003-01-01
This paper discusses the on-orbit performance and performance degradation of the International Space Station P6 solar array wings (SAWs) from the period of December 2000 through February 2003. Data selection considerations and data reduction methods are reviewed along with the approach for calculating array performance degradation based on measured string shunt current levels. Measured degradation rates are compared with those predicted by the computational tool SPACE and prior degradation rates measured with the same SAW technology on the Mir space station. Initial results show that the measured SAW short-circuit current is degrading 0.2 to 0.5 percent per year. This degradation rate is below the predicted rate of 0.8 percent per year and is well within the 3 percent estimated uncertainty in measured SAW current levels. General contributors to SAW degradation are briefly discussed.
NASA Astrophysics Data System (ADS)
Pradhan, Bikram; Delchambre, Ludovic; Hickson, Paul; Akhunov, Talat; Bartczak, Przemyslaw; Kumar, Brajesh; Surdej, Jean
2018-04-01
The 4-m International Liquid Mirror Telescope (ILMT), located at the ARIES Observatory (Devasthal, India), has been designed to scan, at a latitude of +29° 22' 26", a band of sky about half a degree wide in the Time Delayed Integration (TDI) mode. A dedicated data-reduction and analysis pipeline is therefore needed to process online the large amount of optical data produced. This requirement has led to the development of the 4-m ILMT data reduction pipeline, a new software package built in Python to simplify the large number of tasks involved in reducing the acquired TDI images. This software provides astronomers with specially designed data reduction functions and astrometry and photometry calibration tools. In this paper we discuss the various reduction and calibration steps followed to reduce TDI images obtained in May 2015 with the Devasthal 1.3-m telescope. We report the detection and characterization of nine space debris objects present in the TDI frames.
Genome-Enabled Molecular Tools for Reductive Dehalogenation
2011-11-01
Genome-Enabled Molecular Tools for Reductive Dehalogenation: A Shift in Paradigm for Bioremediation. Alfred M. Spormann, Stanford University.
Page, Andrew; Atkinson, Jo-An; Heffernan, Mark; McDonnell, Geoff; Hickie, Ian
2017-04-27
Dynamic simulation modelling is increasingly being recognised as a valuable decision-support tool to help guide investments and actions to address complex public health issues such as suicide. In particular, participatory system dynamics (SD) modelling provides a useful tool for asking high-level 'what if' questions, and testing the likely impacts of different combinations of policies and interventions at an aggregate level before they are implemented in the real world. We developed an SD model for suicide prevention in Australia, and investigated the hypothesised impacts over the next 10 years (2015-2025) of a combination of current intervention strategies proposed for population interventions in Australia: 1) general practitioner (GP) training, 2) coordinated aftercare in those who have attempted suicide, 3) school-based mental health literacy programs, 4) brief-contact interventions in hospital settings, and 5) psychosocial treatment approaches. Findings suggest that the largest reductions in suicide were associated with GP training (6%) and coordinated aftercare approaches (4%), with total reductions of 12% for all interventions combined. This paper highlights the value of dynamic modelling methods for managing complexity and uncertainty, and demonstrates their potential use as a decision-support tool for policy makers and program planners for community suicide prevention actions.
40 CFR 51.5 - What tools are available to help prepare and report emissions data?
Code of Federal Regulations, 2014 CFR
2014-07-01
... PLANS Air Emissions Reporting Requirements General Information for Inventory Preparers § 51.5 What tools... 40 Protection of Environment 2 2014-07-01 2014-07-01 false What tools are available to help prepare and report emissions data? 51.5 Section 51.5 Protection of Environment ENVIRONMENTAL PROTECTION...
40 CFR 51.5 - What tools are available to help prepare and report emissions data?
Code of Federal Regulations, 2012 CFR
2012-07-01
... PLANS Air Emissions Reporting Requirements General Information for Inventory Preparers § 51.5 What tools... 40 Protection of Environment 2 2012-07-01 2012-07-01 false What tools are available to help prepare and report emissions data? 51.5 Section 51.5 Protection of Environment ENVIRONMENTAL PROTECTION...
40 CFR 51.5 - What tools are available to help prepare and report emissions data?
Code of Federal Regulations, 2013 CFR
2013-07-01
... PLANS Air Emissions Reporting Requirements General Information for Inventory Preparers § 51.5 What tools... 40 Protection of Environment 2 2013-07-01 2013-07-01 false What tools are available to help prepare and report emissions data? 51.5 Section 51.5 Protection of Environment ENVIRONMENTAL PROTECTION...
Di Lonardo, Anna; Donfrancesco, Chiara; Palmieri, Luigi; Vanuzzo, Diego; Giampaoli, Simona
2017-06-01
High blood pressure (BP) is a major risk factor for cardiovascular disease. The urgency of the problem was underlined by the World Health Organization (WHO) Global Action Plan for the prevention and control of noncommunicable diseases, which recommends a 25% relative reduction in the prevalence of raised BP by 2020. A surveillance system represents a useful tool to monitor BP in the general population. Since the 1980s, the National Institute of Health has conducted several surveys of the adult general population, measuring cardiovascular risk factors by standardized procedures and methods. The aim of this study was to describe mean BP levels and the prevalence of high BP from 1978 to 2012 by sex and five-year age group. Data were derived from the following three studies: (i) Risk Factors and Life Expectancy (RIFLE), conducted between 1978 and 2002 in 13 Italian regions (>70,000 persons); (ii) Osservatorio Epidemiologico Cardiovascolare (OEC), conducted between 1998 and 2002 in the general population from all Italian regions (>9000 persons); and (iii) Osservatorio Epidemiologico Cardiovascolare/Health Examination Survey (OEC/HES), conducted between 2008 and 2012 in the general population from all Italian regions (>9000 persons). A significant decrease in mean systolic and diastolic BP levels and in the prevalence of high BP from 1978 to 2012 was observed in both men and women. BP and high BP prevalence increased with age in all periods considered. BP awareness and control also improved. Our data suggest that BP control could be achieved by 2020, as recommended by the WHO.
Disparity : scalable anomaly detection for clusters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desai, N.; Bradshaw, R.; Lusk, E.
2008-01-01
In this paper, we describe disparity, a tool that does parallel, scalable anomaly detection for clusters. Disparity uses basic statistical methods and scalable reduction operations to perform data reduction on client nodes and uses these results to locate node anomalies. We discuss the implementation of disparity and present results of its use on a SiCortex SC5832 system.
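The sketch below illustrates the general pattern (not disparity's actual implementation): reduce each node's data to a few summary statistics, then flag nodes whose statistics deviate strongly from robust cluster-wide baselines.

```python
# Cluster anomaly detection sketch: per-node summaries plus a robust global reduction.
import numpy as np

rng = np.random.default_rng(2)
n_nodes = 512
# Per-node summaries (e.g. mean CPU temperature, memory bandwidth, link error rate);
# metric names and values are illustrative assumptions.
summaries = rng.normal(loc=[60.0, 12.0, 0.01], scale=[1.0, 0.3, 0.002], size=(n_nodes, 3))
summaries[17] += [8.0, -3.0, 0.05]          # inject one anomalous node

# Robust per-metric reduction across the cluster: median and MAD.
med = np.median(summaries, axis=0)
mad = np.median(np.abs(summaries - med), axis=0)
scores = np.abs(summaries - med) / (1.4826 * mad)   # robust z-scores

anomalous = np.where(scores.max(axis=1) > 5.0)[0]
print("anomalous nodes:", anomalous)
```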
Lommen, Arjen
2009-04-15
Hyphenated full-scan MS technology creates large amounts of data. A versatile, easy-to-use automation tool aiding in the data analysis is very important in handling such a data stream. The MetAlign software, as described in this manuscript, handles a broad range of accurate mass and nominal mass GC/MS and LC/MS data. It is capable of automatic format conversions, accurate mass calculations, baseline corrections, peak picking, saturation and mass-peak artifact filtering, as well as alignment of up to 1000 data sets. A 100- to 1000-fold data reduction is achieved. MetAlign software output is compatible with most multivariate statistics programs.
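Two of the listed steps, baseline correction and peak picking, can be illustrated on a synthetic trace as below; MetAlign's actual algorithms are not reproduced, and the median-filter baseline and the find_peaks thresholds are assumptions for the example.

```python
# Baseline correction and peak picking sketch on a synthetic chromatogram trace.
import numpy as np
from scipy.signal import medfilt, find_peaks

rng = np.random.default_rng(3)
t = np.arange(3000)
baseline = 50 + 0.01 * t                                   # slow baseline drift
peaks_true = 500 * np.exp(-0.5 * ((t[:, None] - [600, 1500, 2200]) / 8.0) ** 2).sum(axis=1)
signal = baseline + peaks_true + rng.normal(scale=5.0, size=t.size)

corrected = signal - medfilt(signal, kernel_size=301)      # crude baseline removal
idx, props = find_peaks(corrected, height=50, prominence=30)
print("picked peak positions:", idx)
```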
NASA Astrophysics Data System (ADS)
Xu, Zhe; Peng, M. G.; Tu, Lin Hsin; Lee, Cedric; Lin, J. K.; Jan, Jian Feng; Yin, Alb; Wang, Pei
2006-10-01
Nowadays, most foundries pay increasing attention to reducing CD width. Although lithography technologies have advanced drastically, mask data accuracy remains a bigger challenge than before. In addition, mask (reticle) prices have risen sharply, so data accuracy requires special treatment. We have developed a system called eFDMS to guarantee mask data accuracy. eFDMS performs automatic back-checking of the mask tooling database and handles the data transmission for mask tooling. We integrate eFDMS with the standard mask tooling system K2 so that the processes upstream and downstream of the K2 mask tooling core run smoothly and correctly, as anticipated. Competition in the IC marketplace is gradually shifting from high-tech processes to lower prices, so controlling product cost plays an increasingly significant role for foundries. Before this intense competition arrives, the cost task should be prepared ahead of time.
CASCADE IMPACTOR DATA REDUCTION WITH SR-52 AND TI-59 PROGRAMMABLE CALCULATORS
The report provides useful tools for obtaining particle size distributions and graded penetration data from cascade impactor measurements. The programs calculate impactor aerodynamic cut points, total mass collected by the impactor, cumulative mass fraction less than for each sta...
NASA Astrophysics Data System (ADS)
Tao, Chunhui; Chen, Sheng; Baker, Edward T.; Li, Huaiming; Liang, Jin; Liao, Shili; Chen, Yongshun John; Deng, Xianming; Zhang, Guoyin; Gu, Chunhua; Wu, Jialin
2017-06-01
Seafloor hydrothermal polymetallic sulfide deposits are a new type of resource with great potential economic value and good development prospects. This paper discusses turbidity, oxidation-reduction potential, and temperature anomalies of hydrothermal plumes from the Zouyu-1 and Zouyu-2 hydrothermal fields on the southern Mid-Atlantic Ridge. We use the known location of these vent fields and plume data collected in multiple years (2009, 2011, 2013) to demonstrate how real-time plume exploration can be used to locate active vent fields, and thus associated sulfide deposits. Turbidity anomalies can be detected tens of kilometers from an active source, but the location precision is no better than a few kilometers because fine-grained particles are quasi-conservative over periods of many days. Temperature and oxidation-reduction potential anomalies provide location precision of a few hundred meters. Temperature anomalies are generally weak and difficult to reliably detect, except by chance encounters with a buoyant plume. Oxidation-reduction potential is highly sensitive (responding to nmol concentrations of reduced hydrothermal chemicals) to discharges of all temperatures and responds immediately to a plume encounter. Real-time surveys using continuous tows of turbidity and oxidation-reduction potential sensors offer the most efficient and precise surface-ship exploration presently possible.
NASA Technical Reports Server (NTRS)
Salikuddin, M.; Martens, S.; Shin, H.; Majjigi, R. K.; Krejsa, Gene (Technical Monitor)
2002-01-01
The objective of this task was to develop a design methodology and noise reduction concepts for high bypass exhaust systems which could be applied to both existing production and new advanced engine designs. Special emphasis was given to engine cycles with bypass ratios in the range of 4:1 to 7:1, where jet mixing noise was a primary noise source at full power takeoff conditions. The goal of this effort was to develop the design methodology for mixed-flow exhaust systems and other novel noise reduction concepts that would yield 3 EPNdB noise reduction relative to 1992 baseline technology. Two multi-lobed mixers, a 22-lobed axisymmetric mixer and a 21-lobed mixer with a unique lobe, were designed. These mixers, along with a confluent mixer, were tested with several fan nozzles of different lengths, with and without acoustic treatment, in GEAE's Cell 41 under the current subtask (Subtask C). In addition to the acoustic and LDA tests of the model mixer exhaust systems, a semi-empirical noise prediction method for mixer exhaust systems was developed. Effort was also made to use flowfield data for noise prediction by utilizing the MGB code. In general, this study established an aero and acoustic diagnostic database to calibrate and refine current aero and acoustic prediction tools.
Waste reduction possibilities for manufacturing systems in the industry 4.0
NASA Astrophysics Data System (ADS)
Tamás, P.; Illés, B.; Dobos, P.
2016-11-01
Industry 4.0 creates new possibilities for manufacturing companies' waste reduction, for example through the appearance of cyber-physical systems and the big data concept and the spread of the Internet of Things (IoT). This paper presents in detail the more important achievements and tools of the fourth industrial revolution. In addition, numerous new research directions related to the waste reduction possibilities of manufacturing systems are outlined.
Code of Federal Regulations, 2010 CFR
2010-07-01
... of Plastic Parts and Products Pt. 63, Subpt. PPPP, Table 2 Table 2 to Subpart PPPP of Part 63...) Data Reduction No Sections 63.4567 and 63.4568 specify monitoring data reduction. § 63.9(a)-(d...
Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard
2011-06-01
Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.
Changing Conspiracy Beliefs through Rationality and Ridiculing.
Orosz, Gábor; Krekó, Péter; Paskuj, Benedek; Tóth-Király, István; Bőthe, Beáta; Roland-Lévy, Christine
2016-01-01
Conspiracy theory (CT) beliefs can be harmful. How is it possible to reduce them effectively? Three reduction strategies were tested in an online experiment using general and well-known CT beliefs on a comprehensive, randomly assigned Hungarian sample (N = 813): exposing participants to rational counter-CT arguments, ridiculing those who hold CT beliefs, and eliciting empathy with the targets of CT beliefs. Several relevant individual differences were measured. Rational and ridiculing arguments were effective in reducing CT beliefs, whereas empathizing with the targets of CTs had no effect. Individual differences played no role in CT reduction, but the perceived intelligence and competence of the individual who conveyed the CT belief-reduction information contributed to the success of the reduction. Rational arguments targeting the link between the object of belief and its characteristics appear to be an effective tool in fighting conspiracy theory beliefs.
Integrating Information Technologies Into Large Organizations
NASA Technical Reports Server (NTRS)
Gottlich, Gretchen; Meyer, John M.; Nelson, Michael L.; Bianco, David J.
1997-01-01
NASA Langley Research Center's product is aerospace research information. To this end, Langley uses information technology tools in three distinct ways. First, information technology tools are used in the production of information via computation, analysis, data collection and reduction. Second, information technology tools assist in streamlining business processes, particularly those that are primarily communication based. By applying these information tools to administrative activities, Langley spends fewer resources on managing itself and can allocate more resources for research. Third, Langley uses information technology tools to disseminate its aerospace research information, resulting in faster turn around time from the laboratory to the end-customer.
Speckle Interferometry with the OCA Kuhn 22" Telescope
NASA Astrophysics Data System (ADS)
Wasson, Rick
2018-04-01
Speckle interferometry measurements of double stars were made in 2015 and 2016, using the Kuhn 22-inch classical Cassegrain telescope of the Orange County Astronomers, a Point Grey Blackfly CMOS camera, and three interference filters. 272 observations are reported for 177 systems, with separations ranging from 0.29" to 2.9". Data reduction was by means of the REDUC and Speckle Tool Box programs. Equipment, observing procedures, calibration, data reduction, and analysis are described, and unusual results for 11 stars are discussed in detail.
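For readers unfamiliar with the reduction principle behind such measurements, the sketch below shows the classical Labeyrie-style step of averaging frame power spectra and inspecting the resulting autocorrelation; the frames are crude synthetic stand-ins, and the REDUC and Speckle Tool Box programs are not reproduced.

```python
# Speckle power-spectrum averaging sketch (Labeyrie-style reduction idea).
import numpy as np

rng = np.random.default_rng(4)
n_frames, size = 200, 128
mean_power = np.zeros((size, size))

for _ in range(n_frames):
    frame = rng.poisson(lam=2.0, size=(size, size)).astype(float)  # synthetic noisy frame
    frame[60:68, 60:68] += 40          # stand-in for the primary's speckle cloud
    frame[60:68, 70:78] += 25          # fainter companion at a fixed offset
    mean_power += np.abs(np.fft.fft2(frame)) ** 2                  # accumulate power spectra

mean_power /= n_frames
# The inverse transform of the mean power spectrum is the mean autocorrelation;
# a binary shows up as symmetric side peaks about the central zero-lag peak.
autocorr = np.fft.fftshift(np.abs(np.fft.ifft2(mean_power)))
peak = np.unravel_index(np.argmax(autocorr), autocorr.shape)
print("autocorrelation zero-lag peak at", peak)
```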
Origins of Line Defects in Self-Reacting Friction Stir Welds and Their Impact on Weld Quality
NASA Technical Reports Server (NTRS)
Schneider, Judy; Nunes, Arthur C., Jr.
2016-01-01
Friction stir welding (FSW) is a solid state joining technique which reduces the occurrence of the typical defects formed in fusion welds, especially of highly alloyed metals. Although the process is robust for aluminum alloys, occasional reductions in the strength of FSWs have been observed. Shortly after NASA-MSFC implemented a variation of FSW called self-reacting FSW (SR-FSW), low strength properties were observed. This reduction in strength was attributed to a line defect, and the limited data available at the time suggested that the defect was related to the accumulation of native oxides that form on the weld lands and faying surfaces. Through a series of improved cleaning methods, tool redesign, and process parameter modifications, the reduction in the strength of the SR-FSWs was eliminated. As more data have been collected, occasional reductions in the strength of SR-FSWs still occur, indicating a need to reexamine the underlying causes. This study builds on a series of SR-FSWs made in three different thicknesses of AA2219 panels (0.95, 1.27 and 1.56 cm) at two different weld pitches. A bead-on-plate SR-FSW was also made in the 1.56 cm thick panel to understand the contribution of the former faying surfaces. Copper tracer studies were used to understand the flow lines associated with the weld tool used. The quality of the SR-FSWs was evaluated by tensile testing at room temperature. Reductions in tensile strength were observed in some weldments, primarily at higher weld pitch or tool rotation. This study explores possible correlations between line defects and the reduction of strength in SR-FSWs. Results from this study will assist in better understanding the mechanisms responsible for reduced tensile strength and provide methodology for minimizing their occurrence.
Utilization of a CRT display light pen in the design of feedback control systems
NASA Technical Reports Server (NTRS)
Thompson, J. G.; Young, K. R.
1972-01-01
A hierarchical structure of the interlinked programs was developed to provide a flexible computer-aided design tool. A graphical input technique and a data structure are considered which provide the capability of entering the control system model description into the computer in block diagram form. An information storage and retrieval system was developed to keep track of the system description, and analysis and simulation results, and to provide them to the correct routines for further manipulation or display. Error analysis and diagnostic capabilities are discussed, and a technique was developed to reduce a transfer function to a set of nested integrals suitable for digital simulation. A general, automated block diagram reduction procedure was set up to prepare the system description for the analysis routines.
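The general idea of reducing a transfer function to an integrator chain for digital simulation can be illustrated with modern tools, as in the hedged sketch below; it uses SciPy's state-space conversion rather than the nested-integral procedure developed in the paper, and the example coefficients are arbitrary.

```python
# Transfer function -> state-space (integrator chain) sketch for digital simulation.
from scipy import signal

# Example transfer function G(s) = (s + 2) / (s^2 + 3s + 5)  (illustrative coefficients).
num = [1.0, 2.0]
den = [1.0, 3.0, 5.0]

# Controllable canonical form: a chain of integrators with feedback of the state.
A, B, C, D = signal.tf2ss(num, den)
print("A =", A)
print("B =", B.ravel(), "C =", C.ravel(), "D =", D.ravel())

# The state-space form can then be stepped numerically, e.g. with a unit step input.
T = [0.01 * i for i in range(200)]
t, y, x = signal.lsim((A, B, C, D), U=[1.0] * 200, T=T)
print("step response final value ~", y[-1])
```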
DOT National Transportation Integrated Search
2016-01-01
In this study, we use existing modeling tools and data from the San Francisco Bay Area (California) to understand the potential market demand for a first mile transit access service and possible reductions in vehicle miles traveled (VMT) (a...
Doing accelerator physics using SDDS, UNIX, and EPICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borland, M.; Emery, L.; Sereno, N.
1995-12-31
The use of the SDDS (Self-Describing Data Sets) file protocol, together with the UNIX operating system and EPICS (Experimental Physics and Industrial Control System), has proved powerful during the commissioning of the APS (Advanced Photon Source) accelerator complex. The SDDS file protocol has permitted a tool-oriented approach to developing applications, wherein generic programs are written that function as part of multiple applications. While EPICS-specific tools were written for data collection, automated experiment execution, closed-loop control, and so forth, data processing and display are done with the SDDS Toolkit. Experiments and data reduction are implemented as UNIX shell scripts that coordinate the execution of EPICS-specific tools and SDDS tools. Because of the power and generic nature of the individual tools and of the UNIX shell environment, automated experiments can be prepared and executed rapidly in response to unanticipated needs or new ideas. Examples are given of the application of this methodology to beam motion characterization, beam-position-monitor offset measurements, and klystron characterization.
Robles, Guillermo; Fresno, José Manuel; Martínez-Tarifa, Juan Manuel; Ardila-Rey, Jorge Alfredo; Parrado-Hernández, Emilio
2018-03-01
The measurement of partial discharge (PD) signals in the radio frequency (RF) range has gained popularity among utilities and specialized monitoring companies in recent years. Unfortunately, on most occasions the data are hidden by noise and coupled interference that hinder their interpretation and render them useless, especially in acquisition systems in the ultra high frequency (UHF) band where the signals of interest are weak. This paper is focused on a method that uses a selective spectral signal characterization to feature each signal, type of partial discharge or interference/noise, with the power contained in the most representative frequency bands. The technique can be considered a dimensionality reduction problem in which all the energy information contained in the frequency components is condensed into a reduced number of UHF or high frequency (HF) and very high frequency (VHF) bands. In general, dimensionality reduction methods make the interpretation of results a difficult task because the inherent physical nature of the signal is lost in the process. The proposed selective spectral characterization is a preprocessing tool that facilitates further main processing. The starting point is a clustering of signals that could form the core of a PD monitoring system. Therefore, the dimensionality reduction technique should discover the best frequency bands to enhance the affinity between signals in the same cluster and the differences between signals in different clusters. This is done by maximizing the minimum Mahalanobis distance between clusters using particle swarm optimization (PSO). The tool is tested with three sets of experimental signals to demonstrate its capabilities in separating noise and PDs with low signal-to-noise ratio and in separating different types of partial discharges measured in the UHF and HF/VHF bands.
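A compact numpy-only sketch of this band-selection idea follows; the synthetic clusters, PSO hyperparameters and band-weighting scheme are assumptions made for illustration, not the authors' implementation.

```python
# PSO over frequency-band weights, maximizing the minimum pairwise Mahalanobis
# distance between cluster centroids in the weighted band-power space.
import numpy as np

rng = np.random.default_rng(5)
n_bands, n_clusters = 12, 3
# Synthetic band-power vectors for three signal classes (e.g. PD, noise, interference).
centers = rng.normal(size=(n_clusters, n_bands)) * 3
clusters = [c + rng.normal(size=(200, n_bands)) for c in centers]

def fitness(w):
    """Minimum pairwise Mahalanobis distance between cluster means under band weights w."""
    scaled = [c * w for c in clusters]
    pooled_cov = sum(np.cov(s, rowvar=False) for s in scaled) / n_clusters
    inv_cov = np.linalg.pinv(pooled_cov)
    means = [s.mean(axis=0) for s in scaled]
    dists = [np.sqrt((means[i] - means[j]) @ inv_cov @ (means[i] - means[j]))
             for i in range(n_clusters) for j in range(i + 1, n_clusters)]
    return min(dists)

# Plain particle swarm over band weights in [0, 1].
n_particles, n_iter = 30, 100
pos = rng.uniform(0, 1, size=(n_particles, n_bands))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.uniform(size=pos.shape), rng.uniform(size=pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print("best band weights:", np.round(gbest, 2))
```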
NASA Technical Reports Server (NTRS)
Vicente, Gilberto
2005-01-01
Several commercial applications of remote sensing data, such as water resources management, environmental monitoring, climate prediction, agriculture, forestry, and preparation for and mitigation of extreme weather events, require access to vast amounts of archived high quality data, software tools and services for data manipulation and information extraction. These, in turn, require a detailed understanding of the data's internal structure and of the physical implementation of data reduction, combination and data product production. This time-consuming task must be undertaken before the core investigation can begin and is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets of different formats, structures, and resolutions.
ARX - A Comprehensive Tool for Anonymizing Biomedical Data
Prasser, Fabian; Kohlmayer, Florian; Lautenschläger, Ronald; Kuhn, Klaus A.
2014-01-01
Collaboration and data sharing have become core elements of biomedical research. Especially when sensitive data from distributed sources are linked, privacy threats have to be considered. Statistical disclosure control allows the protection of sensitive data by introducing fuzziness. Reduction of data quality, however, needs to be balanced against gains in protection. Therefore, tools are needed which provide a good overview of the anonymization process to those responsible for data sharing. These tools require graphical interfaces and the use of intuitive and replicable methods. In addition, extensive testing, documentation and openness to reviews by the community are important. Existing publicly available software is limited in functionality, and often active support is lacking. We present ARX, an anonymization tool that i) implements a wide variety of privacy methods in a highly efficient manner, ii) provides an intuitive cross-platform graphical interface, iii) offers a programming interface for integration into other software systems, and iv) is well documented and actively supported. PMID:25954407
The Global Earthquake Model and Disaster Risk Reduction
NASA Astrophysics Data System (ADS)
Smolka, A. J.
2015-12-01
Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples for how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua-New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake-engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all projects are undertaken with strong involvement of local scientific and risk reduction communities. Open-source software and careful documentation of the methodologies create full transparency of the modelling process, so that results can be reproduced any time by third parties.
Semi-automated camera trap image processing for the detection of ungulate fence crossing events.
Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija
2017-09-27
Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring substantial investments of time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence that a particular sequence of images contains a fence crossing event. This resulted in a reduction of 54.8% of images that required further human operator characterization while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
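The sketch below illustrates the underlying idea for still images (difference against an empty reference frame plus a simple histogram rule); the synthetic images, thresholds and category labels are assumptions, not the program's actual values.

```python
# Background subtraction and histogram-rule categorization sketch for still images.
import cv2
import numpy as np

rng = np.random.default_rng(2)
# Synthetic stand-ins: an "empty scene" reference and a capture with a bright object.
reference = rng.integers(60, 120, size=(480, 640), dtype=np.uint8)
image = reference.copy()
image[180:300, 280:420] = 220           # stand-in for an animal near the fence

diff = cv2.absdiff(cv2.GaussianBlur(image, (5, 5), 0),
                   cv2.GaussianBlur(reference, (5, 5), 0))
_, fg = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
fg_fraction = np.count_nonzero(fg) / fg.size

# Simple histogram rule: a global brightness shift (sun/shadow) lowers confidence.
hist_sim = cv2.compareHist(cv2.calcHist([image], [0], None, [64], [0, 256]),
                           cv2.calcHist([reference], [0], None, [64], [0, 256]),
                           cv2.HISTCMP_CORREL)

if fg_fraction > 0.02 and hist_sim > 0.8:
    category = "high confidence event"
elif fg_fraction > 0.02:
    category = "possible event (lighting change)"
else:
    category = "likely empty"
print(category, round(fg_fraction, 4), round(hist_sim, 3))
```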
Identification of peptide features in precursor spectra using Hardklör and Krönik
Hoopmann, Michael R.; MacCoss, Michael J.; Moritz, Robert L.
2013-01-01
Hardklör and Krönik are software tools for feature detection and data reduction of high resolution mass spectra. Hardklör is used to reduce peptide isotope distributions to a single monoisotopic mass and charge state, and can deconvolve overlapping peptide isotope distributions. Krönik filters, validates, and summarizes peptide features identified with Hardklör from data obtained during liquid chromatography mass spectrometry (LC-MS). Both software tools contain a simple user interface and can be run from nearly any desktop computer. These tools are freely available from http://proteome.gs.washington.edu/software/hardklor. PMID:22389013
40 CFR Table 5 of Subpart Aaaaaaa... - Applicability of General Provisions to Subpart AAAAAAA
Code of Federal Regulations, 2010 CFR
2010-07-01
... must be conducted. § 63.7(e)(2)-(4) Conduct of Performance Tests and Data Reduction Yes. § 63.7(f)-(h) Use of Alternative Test Method; Data Analysis, Recordkeeping, and Reporting; and Waiver of Performance... CMS requirements. § 63.8(e)-(f) CMS Performance Evaluation Yes. § 63.8(g)(1)-(4) Data Reduction...
Farran, Bassam; Channanath, Arshad Mohamed; Behbehani, Kazem; Thanaraj, Thangavel Alphonse
2013-01-01
Objective We build classification models and risk assessment tools for diabetes, hypertension and comorbidity using machine-learning algorithms on data from Kuwait. We model the increased proneness in diabetic patients to develop hypertension and vice versa. We ascertain the importance of ethnicity (and natives vs expatriate migrants) and of using regional data in risk assessment. Design Retrospective cohort study. Four machine-learning techniques were used: logistic regression, k-nearest neighbours (k-NN), multifactor dimensionality reduction and support vector machines. The study uses fivefold cross validation to obtain generalisation accuracies and errors. Setting Kuwait Health Network (KHN) that integrates data from primary health centres and hospitals in Kuwait. Participants 270 172 hospital visitors (of which, 89 858 are diabetic, 58 745 hypertensive and 30 522 comorbid) comprising Kuwaiti natives, Asian and Arab expatriates. Outcome measures Incident type 2 diabetes, hypertension and comorbidity. Results Classification accuracies of >85% (for diabetes) and >90% (for hypertension) are achieved using only simple non-laboratory-based parameters. Risk assessment tools based on k-NN classification models are able to assign ‘high’ risk to 75% of diabetic patients and to 94% of hypertensive patients. Only 5% of diabetic patients are seen assigned ‘low’ risk. Asian-specific models and assessments perform even better. Pathological conditions of diabetes in the general population or in hypertensive population and those of hypertension are modelled. Two-stage aggregate classification models and risk assessment tools, built combining both the component models on diabetes (or on hypertension), perform better than individual models. Conclusions Data on diabetes, hypertension and comorbidity from the cosmopolitan State of Kuwait are available for the first time. This enabled us to apply four different case–control models to assess risks. These tools aid in the preliminary non-intrusive assessment of the population. Ethnicity is seen significant to the predictive models. Risk assessments need to be developed using regional data as we demonstrate the applicability of the American Diabetes Association online calculator on data from Kuwait. PMID:23676796
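A hedged sketch of a k-NN risk-band assignment in the same spirit is shown below; the synthetic features, sample size and probability cut-offs are illustrative assumptions and do not reproduce the models built on the Kuwait Health Network data.

```python
# k-NN classification with probability-based risk bands ('low'/'medium'/'high').
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(6)
# Columns: age, BMI, systolic BP, family history flag (synthetic stand-ins for
# simple non-laboratory parameters).
X = np.column_stack([rng.integers(20, 80, 2000),
                     rng.normal(28, 5, 2000),
                     rng.normal(125, 15, 2000),
                     rng.integers(0, 2, 2000)]).astype(float)
y = ((0.03 * X[:, 0] + 0.08 * X[:, 1] + 0.02 * X[:, 2] + X[:, 3]
      + rng.normal(size=2000)) > 7.5).astype(int)       # 1 = incident disease

model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=25))
model.fit(X, y)

prob = model.predict_proba([[55, 33.0, 140.0, 1]])[0, 1]
risk = "high" if prob > 0.5 else ("medium" if prob > 0.2 else "low")
print(f"predicted probability {prob:.2f} -> {risk} risk")
```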
A Combinatorial Approach to Detecting Gene-Gene and Gene-Environment Interactions in Family Studies
Lou, Xiang-Yang; Chen, Guo-Bo; Yan, Lei; Ma, Jennie Z.; Mangold, Jamie E.; Zhu, Jun; Elston, Robert C.; Li, Ming D.
2008-01-01
Widespread multifactor interactions present a significant challenge in determining risk factors of complex diseases. Several combinatorial approaches, such as the multifactor dimensionality reduction (MDR) method, have emerged as promising tools for better detecting gene-gene (G × G) and gene-environment (G × E) interactions. We recently developed a general combinatorial approach, namely the generalized multifactor dimensionality reduction (GMDR) method, which can entertain both qualitative and quantitative phenotypes and allow for both discrete and continuous covariates to detect G × G and G × E interactions in a sample of unrelated individuals. In this article, we report the development of an algorithm that can be used to study G × G and G × E interactions for family-based designs, called pedigree-based GMDR (PGMDR). Compared to the available method, our proposed method offers several major improvements, including allowing for covariate adjustments and being applicable to arbitrary phenotypes, arbitrary pedigree structures, and arbitrary patterns of missing marker genotypes. Our Monte Carlo simulations provide evidence that the PGMDR method is superior in identifying epistatic loci compared to the MDR-pedigree disequilibrium test (PDT). Finally, we applied our proposed approach to a genetic data set on tobacco dependence and found a significant interaction between two taste receptor genes (i.e., TAS2R16 and TAS2R38) in affecting nicotine dependence. PMID:18834969
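The core dimensionality-reduction step shared by MDR-style methods can be sketched as follows: multilocus genotype cells are pooled into "high risk" and "low risk" groups by comparing each cell's case:control ratio with the overall ratio. This simplified sketch omits the cross-validation, permutation testing, and covariate adjustment that GMDR/PGMDR add; the toy genotype data are assumptions.

```python
import numpy as np

def mdr_reduce(genotypes, status):
    """Core MDR step, sketched: label each multilocus genotype cell high risk (1)
    if its case:control ratio is at least the overall case:control ratio, else
    low risk (0), collapsing several loci into one binary attribute.
    genotypes: (n_samples, n_loci) array of genotype codes; status: 0/1."""
    genotypes = np.asarray(genotypes)
    status = np.asarray(status)
    overall = status.sum() / max((1 - status).sum(), 1)
    labels = {}
    for cell in map(tuple, np.unique(genotypes, axis=0)):
        in_cell = np.all(genotypes == cell, axis=1)
        cases = status[in_cell].sum()
        controls = (1 - status[in_cell]).sum()
        # Cells without controls are compared against one pseudo-control.
        labels[cell] = int(cases >= overall * max(controls, 1))
    return np.array([labels[tuple(g)] for g in genotypes])

# Toy data: two loci, six individuals (1 = case, 0 = control)
print(mdr_reduce([[0, 1], [0, 1], [2, 0], [2, 0], [1, 1], [1, 1]], [1, 1, 0, 0, 1, 0]))
```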
Manzoni, Gian Mauro; Rossi, Alessandro; Marazzi, Nicoletta; Agosti, Fiorenza; De Col, Alessandra; Pietrabissa, Giada; Castelnuovo, Gianluca; Molinari, Enrico; Sartorio, Allessandro
2018-01-01
Objective This study aimed to examine the feasibility, validity, and reliability of the Italian Pediatric Quality of Life Inventory Multidimensional Fatigue Scale (PedsQL™ MFS) for adult inpatients with severe obesity. Methods 200 inpatients (81% females) with severe obesity (BMI ≥ 35 kg/m2) completed the PedsQL MFS (General Fatigue, Sleep/Rest Fatigue and Cognitive Fatigue domains), the Fatigue Severity Scale, and the Center for Epidemiologic Studies Depression Scale immediately after admission to a 3-week residential body weight reduction program. A randomized subsample of 48 patients re-completed the PedsQL MFS after 3 days. Results Confirmatory factor analysis showed that a modified hierarchical model with two items moved from the Sleep/Rest Fatigue domain to the General Fatigue domain and a second-order latent factor best fitted the data. Internal consistency and test-retest reliabilities were acceptable to high in all scales, and small to high statistically significant correlations were found with all convergent measures, with the exception of BMI. Significant floor effects were found in two scales (Cognitive Fatigue and Sleep/Rest Fatigue). Conclusion The Italian modified PedsQL MFS for adults was shown to be a valid and reliable tool for the assessment of fatigue in inpatients with severe obesity. Future studies should assess its discriminant validity as well as its responsiveness to weight reduction. PMID:29402854
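For reference, the internal-consistency statistic reported above can be computed directly; the sketch below implements the standard Cronbach's alpha formula for an item-response matrix and is not tied to the PedsQL data, whose values are not reproduced here.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array of item scores:
    alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Illustrative 5-point Likert responses (rows: respondents, columns: items)
print(cronbach_alpha([[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3], [1, 2, 2]]))
```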
Seismic facies analysis based on self-organizing map and empirical mode decomposition
NASA Astrophysics Data System (ADS)
Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian
2015-01-01
Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and their time windows has a marked effect on the validity of the classification and requires iterative experimentation and prior knowledge. In general, cluster analysis is sensitive to noise when the waveform itself serves as the input data, especially with a narrow window. To overcome this limitation, the empirical mode decomposition (EMD) method is introduced into waveform classification based on the self-organizing map (SOM). We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are collected for validation. The application results show that seismic facies analysis is improved and better supports interpretation. Its strong tolerance for noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
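A minimal sketch of this de-noise-then-cluster workflow is shown below, assuming the third-party PyEMD (EMD-signal) and MiniSom packages and random stand-in waveform windows; the number of facies classes and of discarded IMFs are illustrative choices, not those of the study.

```python
import numpy as np
from PyEMD import EMD        # pip install EMD-signal
from minisom import MiniSom  # pip install minisom

def denoise_trace(trace, drop_imfs=1):
    """Decompose one trace with empirical mode decomposition and rebuild it
    without the first (highest-frequency, noise-dominated) IMF(s)."""
    imfs = EMD().emd(np.asarray(trace, dtype=float))
    return imfs[drop_imfs:].sum(axis=0)

# Stand-in waveform windows around a horizon (one row per trace).
windows = np.random.randn(200, 32)
clean = np.array([denoise_trace(w) for w in windows])

# 1-D SOM grid: each of the 8 nodes acts as one facies class.
som = MiniSom(1, 8, clean.shape[1], sigma=1.0, learning_rate=0.5)
som.train_random(clean, 5000)
facies = np.array([som.winner(w)[1] for w in clean])  # class index per trace
```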
Generalized Data Management Systems--Some Perspectives.
ERIC Educational Resources Information Center
Minker, Jack
A Generalized Data Management System (GDMS) is a software environment provided as a tool for analysts, administrators, and programmers responsible for the maintenance, query, and analysis of a database; it permits newly defined files and data to be manipulated with the existing programs and system. Because the GDMS technology is believed…
76 FR 70517 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-14
... requested. These systems generally also provide analytics, spreadsheets, and other tools designed to enable funds to analyze the data presented, as well as communication tools to process fund instructions...
Study of Tools for Network Discovery and Network Mapping
2003-11-01
connected to the switch. iv. Accessibility of historical data and event data. In general, network discovery tools keep a history of the collected... has the following software dependencies: Java Virtual Machine, Perl modules, RRD Tool, Tomcat, PostgreSQL. STRENGTHS AND... systems: provide a simple view of the current network status; generate alarms on status change; generate a history of status changes. VISUAL MAP
Kenneth L. Clark; Nicholas Skowronski; John Hom; Matthew Duveneck; Yude Pan; Stephen Van Tuyl; Jason Cole; Matthew Patterson; Stephen Maurer
2009-01-01
Our goal is to assist the New Jersey Forest Fire Service and federal wildland fire managers in the New Jersey Pine Barrens in evaluating where and when to conduct hazardous fuel reduction treatments. We used remotely sensed LIDAR (Light Detection and Ranging) data and field sampling to estimate fuel loads and consumption during prescribed fire treatments. This...
Improta, Giovanni; Cesarelli, Mario; Montuori, Paolo; Santillo, Liberatina Carmela; Triassi, Maria
2018-04-01
Lean Six Sigma (LSS) has been recognized as an effective management tool for improving healthcare performance. Here, LSS was adopted to reduce the risk of healthcare-associated infections (HAIs), a critical quality parameter in the healthcare sector. Lean Six Sigma was applied to the areas of clinical medicine (including general medicine, pulmonology, oncology, nephrology, cardiology, neurology, gastroenterology, rheumatology, and diabetology), and data regarding HAIs were collected for 28,000 patients hospitalized between January 2011 and December 2016. Following the LSS define, measure, analyse, improve, and control cycle, the factors influencing the risk of HAI were identified by using typical LSS tools (statistical analyses, brainstorming sessions, and cause-effect diagrams). Finally, corrective measures to prevent HAIs were implemented and monitored for 1 year after implementation. Lean Six Sigma proved to be a useful tool for identifying variables affecting the risk of HAIs and implementing corrective actions to improve the performance of the care process. A reduction in the number of patients colonized by sentinel bacteria was achieved after the improvement phase. The implementation of an LSS approach could significantly decrease the percentage of patients with HAIs. © 2017 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons Ltd.
[The future of scientific libraries].
De Fiore, Luca
2013-10-01
"Making predictions is always very difficult, especially about the future". Niels Bohr's quote is very appropriate when looking into the future of libraries. If the Web is now the richest library in the world, it is also the most friendly and therefore the most convenient. The evolution of libraries in the coming years - both traditional and online - will probably depend on their ability to meet the information needs of users: improved ease of use and better reliability of the information. These are objectives that require money and - given the general reduction in budgets - it is not obvious that the results will be achieved. However, there are many promising experiences at the international level that show that the world of libraries is populated by projects and creativity. Traditional or digital, libraries will increasingly present themselves more as a sharing tool than as a repository of information: it is the sharing that translates data into knowledge. In the healthcare field, the integration of online libraries with the epidemiological information systems could favor the fulfillment of unconscious information needs of health personnel; libraries will therefore be a key tool for an integrated answer to the challenge of continuing education in medicine. The Internet is no longer a library but an information ecosystem where the data are transformed into knowledge by sharing and discussion.
Imaging mass spectrometry data reduction: automated feature identification and extraction.
McDonnell, Liam A; van Remoortere, Alexandra; de Velde, Nico; van Zeijl, René J M; Deelder, André M
2010-12-01
Imaging MS now enables the parallel analysis of hundreds of biomolecules, spanning multiple molecular classes, which allows tissues to be described by their molecular content and distribution. When combined with advanced data analysis routines, tissues can be analyzed and classified based solely on their molecular content. Such molecular histology techniques have been used to distinguish regions with differential molecular signatures that could not be distinguished using established histologic tools. However, its potential to provide an independent, complementary analysis of clinical tissues has been limited by the very large file sizes and large number of discrete variables associated with imaging MS experiments. Here we demonstrate data reduction tools, based on automated feature identification and extraction, for peptide, protein, and lipid imaging MS, using multiple imaging MS technologies, that reduce data loads and the number of variables by >100×, and that highlight highly-localized features that can be missed using standard data analysis strategies. It is then demonstrated how these capabilities enable multivariate analysis on large imaging MS datasets spanning multiple tissues. Copyright © 2010 American Society for Mass Spectrometry. Published by Elsevier Inc. All rights reserved.
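As a toy stand-in for the automated feature extraction described above, the sketch below reduces a single spectrum to a list of (m/z, intensity) peaks with scipy; the relative-height threshold is an assumed parameter, and real imaging MS pipelines add de-isotoping, alignment across pixels, and recalibration on top of this.

```python
import numpy as np
from scipy.signal import find_peaks

def spectrum_to_features(mz, intensity, min_rel_height=0.01):
    """Reduce one mass spectrum to a short list of (centroid m/z, intensity)
    features by keeping only peaks above a relative intensity threshold."""
    mz = np.asarray(mz, dtype=float)
    intensity = np.asarray(intensity, dtype=float)
    idx, props = find_peaks(intensity, height=min_rel_height * intensity.max())
    return list(zip(mz[idx], props["peak_heights"]))

# features = spectrum_to_features(mz_axis, spectrum)  # hypothetical input arrays
```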
Experimental study on internal cooling system in hard turning of HCWCI using CBN tools
NASA Astrophysics Data System (ADS)
Ravi, A. M.; Murigendrappa, S. M.
2018-04-01
In recent times, hard turning has become an increasingly important technique in manufacturing processes, especially for cutting very hard materials such as high-chrome white cast iron (HCWCI). Cubic boron nitride (CBN), pCBN and carbide tools are the most suitable for shearing such metals but are uneconomical. Since hard turning is carried out under dry conditions, lowering tool wear by minimizing the tool temperature is the only practical solution. The literature reveals that no effective cooling systems are available so far to enhance the life of the cutting tools and to improve machinability characteristics. The detrimental effect of cutting parameters on cutting temperature is generally controlled by their proper selection. The objective of this paper is to develop a new cooling system to control the tool tip temperature, thereby minimizing the cutting forces and the tool wear rates. The material chosen for this work was HCWCI, and the cutting tools were CBN inserts. Intricate cavities were machined on the periphery of the tool holder for the easy flow of cold water. Taguchi techniques were adopted to design and carry out the experiments. The experimental results confirm a considerable reduction in the cutting forces and tool wear rates.
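Taguchi analysis ranks parameter settings by a signal-to-noise ratio; for "smaller is better" responses such as cutting force or tool wear it is S/N = -10·log10(mean(y²)). A small sketch, with purely illustrative replicate values rather than data from this study:

```python
import numpy as np

def sn_smaller_is_better(responses):
    """Taguchi signal-to-noise ratio for a 'smaller is better' response such as
    cutting force or flank wear: S/N = -10 * log10(mean(y**2)). Runs (parameter
    combinations) with a higher S/N are preferred."""
    y = np.asarray(responses, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Illustrative replicate cutting-force readings (N) for one orthogonal-array run.
print(sn_smaller_is_better([182.0, 176.0, 179.0]))
```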
2011-01-01
Background Insecticide-treated mosquito nets (ITNs) and indoor-residual spraying have been scaled-up across sub-Saharan Africa as part of international efforts to control malaria. These interventions have the potential to significantly impact child survival. The Lives Saved Tool (LiST) was developed to provide national and regional estimates of cause-specific mortality based on the extent of intervention coverage scale-up. We compared the percent reduction in all-cause child mortality estimated by LiST against measured reductions in all-cause child mortality from studies assessing the impact of vector control interventions in Africa. Methods We performed a literature search for appropriate studies and compared reductions in all-cause child mortality estimated by LiST to 4 studies that estimated changes in all-cause child mortality following the scale-up of vector control interventions. The following key parameters measured by each study were applied to available country projections: baseline all-cause child mortality rate, proportion of mortality due to malaria, and population coverage of vector control interventions at baseline and follow-up years. Results The percent reduction in all-cause child mortality estimated by the LiST model fell within the confidence intervals around the measured mortality reductions for all 4 studies. Two of the LiST estimates overestimated the mortality reductions by 6.1 and 4.2 percentage points (33% and 35% relative to the measured estimates), while two underestimated the mortality reductions by 4.7 and 6.2 percentage points (22% and 25% relative to the measured estimates). Conclusions The LiST model did not systematically under- or overestimate the impact of ITNs on all-cause child mortality. These results show the LiST model to perform reasonably well at estimating the effect of vector control scale-up on child mortality when compared against measured data from studies across a range of malaria transmission settings. The LiST model appears to be a useful tool in estimating the potential mortality reduction achieved from scaling-up malaria control interventions. PMID:21501453
Collentine, Dennis; Johnsson, Holger; Larsson, Peter; Markensten, Hampus; Persson, Kristian
2015-03-01
Riparian buffer zones are the only measure which has been used extensively in Sweden to reduce phosphorus losses from agricultural land. This paper describes how the FyrisSKZ web tool can be used to evaluate allocation scenarios using data from the Svärta River, an agricultural catchment located in central Sweden. Three scenarios are evaluated: a baseline, a uniform 6-m-wide buffer zone in each sub-catchment, and an allocation of areas of buffer zones to sub-catchments based on the average cost of reduction. The total P reduction increases by 30 % in the second scenario compared to the baseline scenario, and the average reduction per hectare increases by 90 % while total costs of the program fall by 32 %. In the third scenario, the average cost per unit of reduction (
40 CFR Table 7 of Subpart Yyyy of... - Applicability of General Provisions to Subpart YYYY
Code of Federal Regulations, 2010 CFR
2010-07-01
... provisions Yes § 63.7(g) Performance test data analysis, recordkeeping, and reporting Yes § 63.7(h) Waiver of... conducting performance tests Yes § 63.7(e)(2) Conduct of performance tests and reduction of data Yes Subpart... Yes § 63.8(g) Data reduction Yes Except that provisions for COMS are not applicable. Averaging periods...
NASA Astrophysics Data System (ADS)
Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip
2017-10-01
Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be extendable to multi-channel data represented by data cubes in Stokes I, Q, and U.
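A highly simplified sketch of the basket-weaving idea (not the NOD3 implementation) is given below: two maps of the same field scanned in orthogonal directions are combined in the Fourier domain, with each map down-weighted near zero spatial frequency along its own scan direction, where scan-line offsets concentrate their power. The weighting function is an illustrative assumption.

```python
import numpy as np

def basket_weave(map_scanned_x, map_scanned_y, eps=1e-3):
    """Combine two maps of the same field scanned in orthogonal directions.
    Scan-line offsets produce stripes along the scan direction, i.e. Fourier
    power near zero spatial frequency along that direction, so each map is
    down-weighted there and the other map fills in the missing information."""
    Fx = np.fft.fft2(map_scanned_x)
    Fy = np.fft.fft2(map_scanned_y)
    u = np.abs(np.fft.fftfreq(map_scanned_x.shape[1]))[None, :]  # freq along x
    v = np.abs(np.fft.fftfreq(map_scanned_x.shape[0]))[:, None]  # freq along y
    wx, wy = u + eps, v + eps    # small weight where each map is stripe-prone
    combined = (wx * Fx + wy * Fy) / (wx + wy)
    return np.fft.ifft2(combined).real
```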
RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robert S. Balch; Ron Broadhead
2005-03-01
Incomplete or sparse data such as geologic or formation characteristics introduce a high level of risk for oil exploration and development projects. "Expert" systems developed and used in several disciplines and industries have demonstrated beneficial results when working with sparse data. State-of-the-art expert exploration tools, relying on a database, and computer maps generated by neural networks and user inputs, have been developed through the use of "fuzzy" logic, a mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk has been reduced with the use of these properly verified and validated "Fuzzy Expert Exploration (FEE) Tools." Through the course of this project, FEE Tools and supporting software were developed for two producing formations in southeast New Mexico. Tools of this type can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In today's oil industry environment, many smaller exploration companies lack the resources of a pool of expert exploration personnel. Downsizing, volatile oil prices, and scarcity of domestic exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. The FEE Tools benefit a diverse group in the U.S., allowing a more efficient use of scarce funds, and potentially reducing dependence on foreign oil and providing lower product prices for consumers.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Alternative Test Method Yes EPA retains approval authority § 63.7(g) Data Analysis Yes § 63.7(h) Waiver of... monitoring systems (CEMS) § 63.8(g)(1) Data Reduction Yes § 63.8(g)(2) Data Reduction No Subpart HHHH does not require the use of CEMS or continuous opacity monitoring systems (COMS). § 63.8(g)(3)-(5) Data...
Martyna, Agnieszka; Zadora, Grzegorz; Neocleous, Tereza; Michalska, Aleksandra; Dean, Nema
2016-08-10
Many chemometric tools are invaluable and have proven effective in data mining and substantial dimensionality reduction of highly multivariate data. This becomes vital for interpreting various physicochemical data, given the rapid development of advanced analytical techniques that deliver much information in a single measurement run. This especially concerns spectra, which are frequently the subject of comparative analysis in, for example, the forensic sciences. In the present study, microtraces collected from hit-and-run accident scenarios were analysed. Plastic containers and automotive plastics (e.g. bumpers, headlamp lenses) were subjected to Fourier transform infrared spectrometry and car paints were analysed using Raman spectroscopy. In the forensic context, analytical results must be interpreted and reported according to the interpretation schemes acknowledged in the forensic sciences, using the likelihood ratio (LR) approach. However, for proper construction of LR models for highly multivariate data, such as spectra, chemometric tools must be employed for substantial data compression. Conversion from the classical feature representation to a distance representation was proposed for revealing hidden data peculiarities, and linear discriminant analysis was further applied for minimising the within-sample variability while maximising the between-sample variability. Both techniques enabled substantial reduction of data dimensionality. Univariate and multivariate likelihood ratio models were proposed for such data. It was shown that the combination of chemometric tools and the likelihood ratio approach is capable of solving the comparison problem of highly multivariate and correlated data after proper extraction of the most relevant features and variance information hidden in the data structure. Copyright © 2016 Elsevier B.V. All rights reserved.
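The compression chain described above can be sketched roughly as follows, using scikit-learn: spectra are first re-expressed as distances to a set of reference spectra and then projected with linear discriminant analysis. The random spectra, the choice of references, and the Euclidean metric are all assumptions for illustration, not the study's settings.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import pairwise_distances

# Stand-in spectra: rows are replicate measurements, columns are channels.
rng = np.random.default_rng(1)
spectra = rng.random((60, 500))
labels = np.repeat(np.arange(12), 5)     # 12 items, 5 replicates each

# Distance representation: describe each spectrum by its distances to a set of
# reference spectra instead of by its raw channels, then compress with LDA.
references = spectra[::5]                # one reference per item (an assumption)
dist_rep = pairwise_distances(spectra, references, metric="euclidean")

lda = LinearDiscriminantAnalysis(n_components=5)
scores = lda.fit_transform(dist_rep, labels)  # low-dimensional features for LR models
```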
Methodological Foundations for Designing Intelligent Computer-Based Training
1991-09-03
student models, graphic forms, version control data structures, flowcharts, etc. Circuit simulations are an obvious case. A circuit, after all, can... flowcharts as a basic data structure, and we were able to generalize our tools to create a flowchart drawing tool for inputting both the appearance and... the meaning of flowcharts efficiently. For the Sherlock work, we built a tool that permitted inputting of information about front panels and
Models of Sector Flows Under Local, Regional and Airport Weather Constraints
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak
2017-01-01
Recently, the ATM community has made important progress in collaborative trajectory management through the introduction of a new FAA traffic management initiative called a Collaborative Trajectory Options Program (CTOP). The FAA can use CTOPs to manage air traffic under multiple constraints (manifested as flow constrained areas, or FCAs) in the system, and a CTOP allows flight operators to indicate their preferences for routing and delay options. CTOPs also permit better management of the overall trajectory of flights by considering both routing and departure delay options simultaneously. However, adoption of CTOPs in the airspace has been hampered by many factors, including challenges in how to identify constrained areas and how to set rates for the FCAs. Decision support tools (DSTs) providing assistance would be particularly helpful for effective use of CTOPs. Such DSTs would need models of demand and capacity in the presence of multiple constraints. This study examines different approaches to using historical data to create and validate models of maximum flows in sectors and other airspace regions in the presence of multiple constraints. A challenge in creating an empirical model of flows under multiple constraints is a lack of sufficient historical data capturing the diverse situations that involve combinations of multiple constraints, especially those with severe weather. The approach taken here to deal with this is two-fold. First, we create a generalized sector model encompassing multiple sectors rather than individual sectors in order to increase the amount of data used for creating the model by an order of magnitude. Second, we decompose the problem so that the amount of data needed is reduced. This involves creating a baseline demand model plus a separate weather-constrained flow reduction model and then composing these into a single integrated model. The nominal demand model is a flow model (gdem) in the presence of clear local weather; it defines the flow as a function of weather constraints in neighboring regions, airport constraints, and weather in locations that can cause re-routes to the location of interest. The weather-constrained flow reduction model (fwx-red) is a model of the reduction in baseline counts as a function of local weather. Because the number of independent variables associated with each of the two decomposed models is smaller than that of a single model, the amount of data needed is reduced. Finally, a composite model that combines the two can be represented as fwx-red(gdem(e), l), where e represents non-local constraints and l represents local weather. The approaches studied for developing these models fall into three categories: (1) point estimation models, (2) empirical models, and (3) theoretical models. Errors in the predictions of these different types of models have been estimated. In situations where data are abundant, point estimation models tend to be very accurate. In contrast, empirical models do better than theoretical models when only some data are available. The biggest benefit of theoretical models is their general applicability to a wider range of situations once their degree of accuracy has been established.
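The decomposition can be made concrete with a small sketch: a baseline demand model gdem evaluated under non-local constraints e, composed with a weather-constrained reduction model fwx-red driven by local weather l. The lambda stand-ins and their inputs below are illustrative assumptions, not fitted models from the study.

```python
def composite_flow(gdem, fwx_red, non_local_constraints, local_weather):
    """Composite sector-flow estimate in the decomposed form described above:
    the clear-weather demand model evaluated under non-local constraints,
    then scaled down by the weather-constrained flow reduction model."""
    baseline = gdem(non_local_constraints)   # gdem(e)
    return fwx_red(baseline, local_weather)  # fwx-red(gdem(e), l)

# Illustrative stand-ins, not fitted models:
gdem = lambda e: 40 - 5 * e["neighbor_severity"]            # clear-weather flow
fwx_red = lambda base, l: base * (1 - 0.6 * l["coverage"])  # local-weather reduction
print(composite_flow(gdem, fwx_red, {"neighbor_severity": 1}, {"coverage": 0.3}))
```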
Models of Sector Aircraft Counts in the Presence of Local, Regional and Airport Constraints
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak
2017-01-01
Recently, the ATM community has made important progress in collaborative trajectory management through the introduction of a new FAA traffic management initiative called a Collaborative Trajectory Options Program (CTOP). The FAA can use CTOPs to manage air traffic under multiple constraints (manifested as flow constrained areas, or FCAs) in the system, and a CTOP allows flight operators to indicate their preferences for routing and delay options. CTOPs also permit better management of the overall trajectory of flights by considering both routing and departure delay options simultaneously. However, adoption of CTOPs in the airspace has been hampered by many factors, including challenges in how to identify constrained areas and how to set rates for the FCAs. Decision support tools (DSTs) providing assistance would be particularly helpful for effective use of CTOPs. Such DSTs would need models of demand and capacity in the presence of multiple constraints. This study examines different approaches to using historical data to create and validate models of maximum flows in sectors and other airspace regions in the presence of multiple constraints. A challenge in creating an empirical model of flows under multiple constraints is a lack of sufficient historical data capturing the diverse situations that involve combinations of multiple constraints, especially those with severe weather. The approach taken here to deal with this is two-fold. First, we create a generalized sector model encompassing multiple sectors rather than individual sectors in order to increase the amount of data used for creating the model by an order of magnitude. Second, we decompose the problem so that the amount of data needed is reduced. This involves creating a baseline demand model plus a separate weather-constrained flow reduction model and then composing these into a single integrated model. The nominal demand model is a flow model (gdem) in the presence of clear local weather; it defines the flow as a function of weather constraints in neighboring regions, airport constraints, and weather in locations that can cause re-routes to the location of interest. The weather-constrained flow reduction model (fwx-red) is a model of the reduction in baseline counts as a function of local weather. Because the number of independent variables associated with each of the two decomposed models is smaller than that of a single model, the amount of data needed is reduced. Finally, a composite model that combines the two can be represented as fwx-red(gdem(e), l), where e represents non-local constraints and l represents local weather. The approaches studied for developing these models fall into three categories: (1) point estimation models, (2) empirical models, and (3) theoretical models. Errors in the predictions of these different types of models have been estimated. In situations where data are abundant, point estimation models tend to be very accurate. In contrast, empirical models do better than theoretical models when only some data are available. The biggest benefit of theoretical models is their general applicability to a wider range of situations once their degree of accuracy has been established.
BigBWA: approaching the Burrows-Wheeler aligner to Big Data technologies.
Abuín, José M; Pichel, Juan C; Pena, Tomás F; Amigo, Jorge
2015-12-15
BigBWA is a new tool that uses the Big Data technology Hadoop to boost the performance of the Burrows-Wheeler aligner (BWA). Important reductions in the execution times were observed when using this tool. In addition, BigBWA is fault tolerant and it does not require any modification of the original BWA source code. BigBWA is available at the project GitHub repository: https://github.com/citiususc/BigBWA. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
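The underlying split-and-align idea can be illustrated locally as below: FASTQ chunks are aligned independently with an unmodified bwa binary and merged afterwards. This multiprocessing sketch only mimics the concept on one machine; BigBWA itself distributes the chunks with Hadoop, and the reference and chunk file names here are hypothetical.

```python
import subprocess
from multiprocessing import Pool

REFERENCE = "ref.fa"  # assumed to be indexed beforehand with `bwa index ref.fa`
CHUNKS = ["reads_00.fq", "reads_01.fq", "reads_02.fq"]  # hypothetical pre-split input

def align(chunk):
    """Align one FASTQ chunk with an unmodified bwa binary and write a SAM file."""
    out = chunk.replace(".fq", ".sam")
    with open(out, "w") as sam:
        subprocess.run(["bwa", "mem", REFERENCE, chunk], stdout=sam, check=True)
    return out

if __name__ == "__main__":
    with Pool(processes=3) as pool:
        print(pool.map(align, CHUNKS))  # one SAM per chunk, to be merged afterwards
```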
Magnetic assessment and modelling of the Aramis undulator beamline
Calvi, M.; Camenzuli, C.; Ganter, R.; Sammut, N.; Schmidt, Th.
2018-01-01
Within the SwissFEL project at the Paul Scherrer Institute (PSI), the hard X-ray line (Aramis) has been equipped with short-period in-vacuum undulators, known as the U15 series. The undulator design has been developed within the institute itself, while the prototyping and the series production have been implemented through a close collaboration with a Swiss industrial partner, Max Daetwyler AG, and several subcontractors. The magnetic measurement system has been built at PSI, together with all the data analysis tools. The Hall probe has been designed for PSI by the Swiss company SENIS. In this paper the general concepts of both the mechanical and the magnetic properties of the U15 series of undulators are presented. A description of the magnetic measurement equipment is given and the results of the magnetic measurement campaign are reported. Lastly, the data reduction methods and the associated models are presented and their actual implementation in the control system is detailed. PMID:29714179
Applying machine learning classification techniques to automate sky object cataloguing
NASA Astrophysics Data System (ADS)
Fayyad, Usama M.; Doyle, Richard J.; Weir, W. Nick; Djorgovski, Stanislav
1993-08-01
We describe the application of Artificial Intelligence machine learning techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Mt. Palomar Northern Sky Survey is nearly completed. This survey provides comprehensive coverage of the northern celestial hemisphere in the form of photographic plates. The plates are being transformed into digitized images whose quality will probably not be surpassed in the next ten to twenty years. The images are expected to contain on the order of 10^7 galaxies and 10^8 stars. Astronomers wish to determine which of these sky objects belong to various classes of galaxies and stars. Unfortunately, the size of this data set precludes analysis in an exclusively manual fashion. Our approach is to develop a software system which integrates the functions of independently developed techniques for image processing and data classification. Digitized sky images are passed through image processing routines to identify sky objects and to extract a set of features for each object. These routines are used to help select a useful set of attributes for classifying sky objects. Then GID3 (Generalized ID3) and O-B Tree, two inductive learning techniques, learn classification decision trees from examples. These classifiers will then be applied to new data. The development process is highly interactive, with astronomer input playing a vital role. Astronomers refine the feature set used to construct sky object descriptions, and evaluate the performance of the automated classification technique on new data. This paper gives an overview of the machine learning techniques with an emphasis on their general applicability, describes the details of our specific application, and reports the initial encouraging results. The results indicate that our machine learning approach is well-suited to the problem. The primary benefit of the approach is increased data reduction throughput. Another benefit is consistency of classification. The classification rules which are the product of the inductive learning techniques will form an objective, examinable basis for classifying sky objects. A final, not-to-be-underestimated benefit is that astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems based on automatically catalogued data.
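For orientation, the sketch below learns a small decision tree from per-object features and prints its rules using scikit-learn's CART implementation; the features, labels, and thresholds are invented for illustration, and GID3/O-BTree differ from CART in their splitting criteria.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Invented per-object features from the image-processing stage
# (area, ellipticity, central surface brightness) with astronomer-supplied labels.
X_train = [[12.0, 0.10, 21.5], [48.0, 0.60, 19.2], [9.5, 0.05, 22.0], [60.0, 0.70, 18.8]]
y_train = ["star", "galaxy", "star", "galaxy"]

tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print(export_text(tree, feature_names=["area", "ellipticity", "surf_bright"]))
print(tree.predict([[15.0, 0.20, 21.0]]))  # classify a new, unseen object
```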
Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)
NASA Astrophysics Data System (ADS)
Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia
2018-06-01
Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and have no single source nor a unified sense of style or methods. The complexity of spectroscopic analysis and the initial time commitment required to become competitive are prohibitive to new researchers entering the field, as well as a remaining obstacle for established groups hoping to contribute in a comparable manner to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface including tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planet, and opacity models. The foundations of these software tools and libraries exist within pockets of the exoplanet community, but our project will gather these seedling tools and grow them into a robust, uniform, and well-maintained exoplanet characterization toolkit.
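As one concrete example of the kind of calculation such a toolkit standardizes, the quadratic limb-darkening law is sketched below; the coefficients are illustrative, not values from any stellar model grid served by ExoCTK.

```python
import numpy as np

def quadratic_limb_darkening(mu, u1, u2):
    """Quadratic limb-darkening law I(mu)/I(1) = 1 - u1*(1 - mu) - u2*(1 - mu)**2,
    one of the standard parameterisations a characterization toolkit must handle."""
    mu = np.asarray(mu, dtype=float)
    return 1.0 - u1 * (1.0 - mu) - u2 * (1.0 - mu) ** 2

# Relative intensity from disk centre (mu = 1) towards the limb (mu -> 0)
print(quadratic_limb_darkening([1.0, 0.7, 0.4, 0.1], u1=0.4, u2=0.25))
```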
HI data reduction for the Arecibo Pisces-Perseus Supercluster Survey
NASA Astrophysics Data System (ADS)
Davis, Cory; Johnson, Cory; Craig, David W.; Haynes, Martha P.; Jones, Michael G.; Koopmann, Rebecca A.; Hallenbeck, Gregory L.; Undergraduate ALFALFA Team
2017-01-01
The Undergraduate ALFALFA team is currently focusing on the analysis of the Pisces-Perseus Supercluster to test current supercluster formation models. The primary goal of our research is to reduce L-band HI data from the Arecibo telescope. We use IDL programs written by our collaborators to reduce the data and find potential sources whose mass can be estimated by the baryonic Tully-Fisher relation, which relates the luminosity to the rotational velocity profile of spiral galaxies. Thus far we have reduced data and estimated HI masses for several galaxies in the supercluster region. We will give examples of data reduction and preliminary results for both the fall 2015 and 2016 observing seasons. We will also describe the data reduction process, the process of learning the associated software, and the use of virtual observatory tools such as the SDSS databases, Aladin, TOPCAT and others. This research was supported by the NSF grant AST-1211005.
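For context, the standard optically thin HI mass estimate used with such data is M_HI [M_sun] = 2.356×10^5 · D² · S_int, with the distance D in Mpc and the integrated 21-cm line flux S_int in Jy km/s. A one-line helper with an illustrative (invented) example:

```python
def hi_mass_solar(distance_mpc, integrated_flux_jy_kms):
    """Optically thin HI mass: M_HI = 2.356e5 * D**2 * S_int (solar masses),
    with D in Mpc and S_int, the integrated 21-cm line flux, in Jy km/s."""
    return 2.356e5 * distance_mpc ** 2 * integrated_flux_jy_kms

# e.g. a detection at 70 Mpc with an integrated flux of 1.2 Jy km/s
print("%.2e solar masses" % hi_mass_solar(70.0, 1.2))
```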
Development of a simple, self-contained flight test data acquisition system
NASA Technical Reports Server (NTRS)
Renz, R. R. L.
1981-01-01
A low cost flight test data acquisition system, applicable to general aviation airplanes, was developed that meets the criteria for longitudinal and lateral stability analysis. The package consists of (1) a microprocessor controller and data acquisition module; (2) a transducer module; and (3) a power supply module. The system is easy to install and is housed in the cabin or baggage compartment of the airplane. All transducers are contained in these modules except the total pressure tube, static pressure air temperature transducer, and control position transducers. The NASA-developed MMLE program was placed on a microcomputer, on which all data reduction is done. The flight testing program undertaken proved both the flight testing hardware and the data reduction method to be applicable to the current field of general aviation airplanes.
Del Fante, Peter; Allan, Don; Babidge, Elizabeth
2006-01-01
The Practice Health Atlas (PHA) is a decision support tool for general practice, designed by the Adelaide Western Division of General Practice (AWDGP). This article describes the features of the PHA and its potential role in enhancing health care. In developing the PHA, the AWDGP utilises a range of software tools and consults with a practice to understand its clinical data management approach. The PHA comprises three sections: epidemiology, business and clinical modelling systems, access to services. The objectives include developing a professional culture around quality health data and synthesis of aggregated de-identified general practice data at both practice and divisional level (and beyond) to assist with local health needs assessment, planning, and funding. Evaluation occurs through group feedback sessions and from the general practitioners and staff. It has demonstrated its potential to fulfill the objectives in outcome areas such as data quality and management, team based care, pro-active practice population health care, and business systems development, thereby contributing to improved patient health outcomes.
Ocean Surface Topography Data Products and Tools
NASA Technical Reports Server (NTRS)
Case, Kelley E.; Bingham, Andrew W.; Berwin, Robert W.; Rigor, Eric M.; Raskin, Robert G.
2004-01-01
The Physical Oceanography Distributed Active Archiving Center (PO.DAAC), NASA's primary data center for archiving and distributing oceanographic data, is supporting the Jason and TOPEX/Poseidon satellite tandem missions by providing a variety of data products, tools, and distribution methods to the wider scientific and general community. PO.DAAC has developed several new data products for sea level residual measurements, providing a long-term climate data record from 1992 to the present. These products provide compatible measurements of sea level residuals for the entire time series, including the tandem TOPEX/Poseidon and Jason missions. Several data distribution tools are available from NASA PO.DAAC. The Near-Real-Time Image Distribution Server (NEREIDS) provides quicklook browse images and binary data files. The PO.DAAC Ocean ESIP Tool (POET) provides interactive, on-line data subsetting and visualization for several altimetry data products.
ERIC Educational Resources Information Center
National Comprehensive Center for Teacher Quality, 2008
2008-01-01
The National Comprehensive Center for Teacher Quality (TQ Center) designed the Interactive Data Tools to provide users with access to state and national data that can be helpful in assessing the qualifications of teachers in the states and the extent to which a state's teacher policy climate generally supports teacher quality. The Interactive Data…
Aircraft Piston Engine Exhaust Emission Symposium
NASA Technical Reports Server (NTRS)
1976-01-01
A 2-day symposium on the reduction of exhaust emissions from aircraft piston engines was held on September 14 and 15, 1976, at the Lewis Research Center in Cleveland, Ohio. Papers were presented by both government organizations and the general aviation industry on the status of government contracts, emission measurement problems, data reduction procedures, flight testing, and emission reduction techniques.
Posthuma, Leo; Wahlstrom, Emilia; Nijenhuis, René; Dijkens, Chris; de Zwart, Dick; van de Meent, Dik; Hollander, Anne; Brand, Ellen; den Hollander, Henri A; van Middelaar, Johan; van Dijk, Sander; Hall, E F; Hoffer, Sally
2014-11-01
The United Nations response mechanism to environmental emergencies requested a tool to support disaster assessment and coordination actions by United Nations Disaster Assessment and Coordination (UNDAC) teams. The tool should support on-site decision making when substantial chemical emissions affect human health directly or via the environment and should be suitable for prioritizing impact reduction management options under challenging conditions worldwide. To answer this need, the Flash Environmental Assessment Tool (FEAT) was developed, and the scientific and practical underpinning and application of this tool are described in this paper. FEAT consists of a printed decision framework and lookup tables, generated by combining scientific data on chemicals, exposure pathways and vulnerabilities with the pragmatic needs of emergency field teams. Application of the tool yields information that can help prioritize impact reduction measures. The first years of use illustrated the usefulness of the tool and suggested additional uses and improvements. An additional use is application of the back-office tool (Hazard Identification Tool, HIT), the results of which aid decision-making by the authorities of affected countries and the preparation of field teams for on-site deployment. Another use is in disaster proaction and prevention. In this case, the application of the tool supports safe land-use planning and improved technical design of chemical facilities. UNDAC teams are trained to use the tool after large-scale sudden onset natural disasters. Copyright © 2014 Elsevier Ltd. All rights reserved.
Dykes, Patricia C; Hurley, Ann; Cashen, Margaret; Bakken, Suzanne; Duffy, Mary E
2007-01-01
The use of health information technology (HIT) for the support of communication processes and data and information access in acute care settings is a relatively new phenomenon. A means of evaluating the impact of HIT in hospital settings is needed. The purpose of this research was to design and psychometrically evaluate the Impact of Health Information Technology scale (I-HIT). I-HIT was designed to measure the perception of nurses regarding the ways in which HIT influences interdisciplinary communication and workflow patterns and nurses' satisfaction with HIT applications and tools. Content for a 43-item tool was derived from the literature, and supported theoretically by the Coiera model and by nurse informaticists. Internal consistency reliability analysis using Cronbach's alpha was conducted on the 43-item scale to initiate the item reduction process. Items with an item total correlation of less than 0.35 were removed, leaving a total of 29 items. Item analysis, exploratory principal component analysis and internal consistency reliability using Cronbach's alpha were used to confirm the 29-item scale. Principal components analysis with Varimax rotation produced a four-factor solution that explained 58.5% of total variance (general advantages, information tools to support information needs, information tools to support communication needs, and workflow implications). Internal consistency of the total scale was 0.95 and ranged from 0.80-0.89 for four subscales. I-HIT demonstrated psychometric adequacy and is recommended to measure the impact of HIT on nursing practice in acute care settings.
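The item-reduction criterion mentioned above (dropping items with item-total correlation below 0.35) can be sketched as follows; this uses the corrected item-total correlation, one common variant, and while the 0.35 cut-off is taken from the study, the response matrix in the usage comment is assumed.

```python
import numpy as np

def item_total_correlations(responses):
    """Corrected item-total correlation for each item: the Pearson correlation of
    the item with the sum of all remaining items. responses is an
    (n_respondents, n_items) array of Likert scores."""
    r = np.asarray(responses, dtype=float)
    total = r.sum(axis=1)
    return np.array([np.corrcoef(r[:, j], total - r[:, j])[0, 1]
                     for j in range(r.shape[1])])

# Hypothetical usage: keep only items at or above the 0.35 cut-off.
# keep_mask = item_total_correlations(survey_matrix) >= 0.35
```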
Center for Corporate Climate Leadership GHG Inventory Guidance for Low Emitters
Tools and guidance for low emitters and small businesses to develop an organization-wide GHG inventory and establish a plan to ensure GHG emissions data consistency for tracking progress towards reaching an emissions reduction goal.
Starlink Software Developments
NASA Astrophysics Data System (ADS)
Bly, M. J.; Giaretta, D.; Currie, M. J.; Taylor, M.
Some current and upcoming software developments from Starlink were demonstrated. These included invoking traditional Starlink applications via web services, the current version of the ORAC-DR reduction pipeline, and some new Java-based tools including Treeview, an interactive explorer of hierarchical data structures.
Brown, B J; Emery, R J; Stock, T H; Lee, E S
2004-03-01
Inspection outcome data provided by the state of Washington Department of Health, Division of Radiation Protection, for licensees of radioactive materials were encoded according to a system established by the Texas Department of Health, Bureau of Radiation Control. The data, representing calendar year 1999 inspection activities, were then analyzed and the results compared to previously published studies for the same year in the states of Texas and Maine. Despite significant differences in regulatory program size, age, and geographic proximity, the most frequently cited violations for radioactive materials licensees were shown to be similar for all three states. Of particular note were the violations identified as consistently issued in all three states. These included physical inventories and utilization logs not performed, not available, or incomplete; leak testing not performed or not performed on schedule; inadequate or unapproved operating and safety procedures; radiation survey and disposal records not available or incomplete; detection or measurement instrument calibration not performed or records not available; and radiation surveys or sampling not performed or performed with a noncalibrated instrument. Comparisons were made in an attempt to generate a summary of the most commonly issued violations that could be generalized to users of radioactive materials across the United States. A generalized list of common violations would be an invaluable tool for radiation protection programs, serving to aid in the reduction of the overall incidence of program non-compliance. Any reduction in instances of non-compliance would result in the conservation of finite public health resources that might then be directed to other pressing public health matters.
Herzlinger, Gadi; Wynn, Thomas; Goren-Inbar, Naama
2017-01-01
Stone cleavers are one of the most distinctive components of the Acheulian toolkit. These tools were produced as part of a long and complex reduction sequence and they provide indications of planning and remarkable knapping skill. These aspects hold implications regarding the cognitive complexity and abilities of their makers and users. In this study we have analyzed a cleaver assemblage originating from the Acheulian site of Gesher Benot Ya‘aqov, Israel, to provide a reconstruction of the chaîne opératoire which structured their production. This reduction sequence was taken as the basis for a cognitive analysis which allowed us to draw conclusions regarding numerous behavioral and cognitive aspects of the GBY hominins. The results indicate that cleaver production incorporated a highly specific sequence of decisions and actions which resulted in three distinct modes of cleaver modification. Furthermore, the decision to produce a cleaver must have been taken very early in the sequence, thus differentiating its production from that of handaxes. The substantial predetermination and the specific reduction sequence provide evidence that the Gesher Benot Ya‘aqov hominins had a number of cognitive categories such as a general ‘tool concept’ and a more specific ‘cleaver concept’, setting them apart from earlier tool-producing hominins and extant tool-using non-human primates. Furthermore, it appears that the Gesher Benot Ya‘aqov lithic technology was governed by expert cognition, which is the kind of thinking typical of modern human experts in their various domains. Thus, the results provide direct indications that important components of modern cognition were well established in the minds of the Gesher Benot Ya‘aqov hominins. PMID:29145489
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature contains almost no discussion of Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and which require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of the data analytics tool/technique requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
NASA Astrophysics Data System (ADS)
Plag, H.-P.
2012-04-01
Geo-referenced information is increasingly important for many scientific and societal applications. The availability of reliable and applicable spatial data and information is fundamental for addressing pressing problems such as food, water, and energy security; disaster risk reduction; climate change; environmental quality; pandemics; economic crises and wars; population migration; and, in a general sense, sustainability. Today, more than 70% of societal activities in developed countries depend directly or indirectly on geo-referenced information. The rapid development of analysis tools, such as Geographic Information Systems and web-based tools for viewing, accessing, and analyzing geo-referenced information, and the growing abundance of openly available Earth observations (e.g., through the Global Earth Observation System of Systems, GEOSS) will likely increase the dependency of science and society on geo-referenced information. Increasingly, these tools allow the combination of data sets from various sources. Improvements in interoperability, promoted particularly by GEOSS, will strengthen this trend and lead to more tools for combining data from different sources. What is currently lacking is a service-oriented infrastructure helping to ensure that data quality and applicability are not compromised through modifications and combinations. Most geo-referenced information comes without sufficient information on quality and applicability. The Group on Earth Observations (GEO) has embarked on establishing a so-called GEO Label that would provide easy-to-understand, globally available information on aspects of quality, user rating, relevance, and fitness for use of the products and services accessible through GEOSS (with the responsibility for the concept development delegated to Work Plan Task ID-03). In designing a service-oriented architecture that could support a GEO Label, it is important to understand the impact of the goals for the label on the design of the infrastructure. The design, concept, implementation, and success of a label depend on its goals, and these goals need to be well defined and widely accepted. Strong labels are generally those that are unique in their field and accepted by an authoritative body in that field. A label requires time to gain acceptance, and once established, its key characteristics normally cannot be changed. Therefore, an informed decision on a labeling scheme for geo-referenced data is crucial for success. GEO is in a position to make this decision. There is a wide range of potential goals for the GEO Label, including: (1) providing an attractive incentive for the involvement of S&T communities by giving recognition for contributions, enabling credits for providers (attribution), and supporting forward traceability (usage); (2) promoting data sharing by signaling data availability and conditions; (3) informing users by increasing trustworthiness, characterizing quality, characterizing applicability, and ensuring backward traceability (data sources); and (4) informing providers (and their funders) by providing information on relevance (meeting user needs) and on usage. GEO will have to decide which of these goals to choose for the GEO Label. Input from GEOSS users and S&T communities will help reach a decision that best serves all those depending on geo-referenced information.
e-IQ and IQ knowledge mining for generalized LDA
NASA Astrophysics Data System (ADS)
Jenkins, Jeffrey; van Bergem, Rutger; Sweet, Charles; Vietsch, Eveline; Szu, Harold
2015-05-01
How can the human brain uncover patterns, associations and features in real-time, real-world data? There must be a general strategy used to transform raw signals into useful features, but representing this generalization in the context of our information extraction tool set is lacking. In contrast to Big Data (BD), Large Data Analysis (LDA) has become a reachable multi-disciplinary goal in recent years due in part to high performance computers and algorithm development, as well as the availability of large data sets. However, the experience of the Machine Learning (ML) and information communities has not been generalized into an intuitive framework that is useful to researchers across disciplines. The data exploration phase of data mining is a prime example of this unspoken, ad-hoc nature of ML - the Computer Scientist works with a Subject Matter Expert (SME) to understand the data, and then builds tools (e.g. classifiers) which can benefit the SME and the rest of the researchers in that field. We ask, why is there not a tool to represent information in a meaningful way to the researcher asking the question? Meaning is subjective and contextual across disciplines, so, to ensure robustness, we draw examples from several disciplines and propose a generalized LDA framework for independent data understanding of heterogeneous sources which contribute to Knowledge Discovery in Databases (KDD). Then, we explore the concept of adaptive information resolution through a 6W unsupervised learning methodology feedback system. In this paper, we will describe the general process of man-machine interaction in terms of an asymmetric directed graph theory (digging for embedded knowledge), and model the inverse machine-man feedback (digging for tacit knowledge) as an ANN unsupervised learning methodology. Finally, we propose a collective learning framework which utilizes a 6W semantic topology to organize heterogeneous knowledge and diffuse information to entities within a society in a personalized way.
Curran, E; Harper, P; Loveday, H; Gilmour, H; Jones, S; Benneyan, J; Hood, J; Pratt, R
2008-10-01
Statistical process control (SPC) charts have previously been advocated for infection control quality improvement. To determine their effectiveness, a multicentre randomised controlled trial was undertaken to explore whether monthly SPC feedback from infection control nurses (ICNs) to healthcare workers of ward-acquired meticillin-resistant Staphylococcus aureus (WA-MRSA) colonisation or infection rates would produce any reductions in incidence. Seventy-five wards in 24 hospitals in the UK were randomised into three arms: (1) wards receiving SPC chart feedback; (2) wards receiving SPC chart feedback in conjunction with structured diagnostic tools; and (3) control wards receiving neither type of feedback. Twenty-five months of pre-intervention WA-MRSA data were compared with 24 months of post-intervention data. Statistically significant and sustained decreases in WA-MRSA rates were identified in all three arms (P<0.001; P=0.015; P<0.001). The mean percentage reduction was 32.3% for wards receiving SPC feedback, 19.6% for wards receiving SPC and diagnostic feedback, and 23.1% for control wards, but with no significant difference between the control and intervention arms (P=0.23). There were significantly more post-intervention 'out-of-control' episodes (P=0.021) in the control arm (averages of 0.60, 0.28, and 0.28 for Control, SPC and SPC+Tools wards, respectively). Participants identified SPC charts as an effective communication tool and valuable for disseminating WA-MRSA data.
ORAC-DR -- spectroscopy data reduction
NASA Astrophysics Data System (ADS)
Hirst, Paul; Cavanagh, Brad
ORAC-DR is a general-purpose automatic data-reduction pipeline environment. This document describes its use to reduce spectroscopy data collected at the United Kingdom Infrared Telescope (UKIRT) with the CGS4, UIST and Michelle instruments, at the Anglo-Australian Telescope (AAT) with the IRIS2 instrument, and at the Very Large Telescope with ISAAC. It outlines the algorithms used, how to make minor modifications to them, and how to correct for errors made at the telescope.
Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.
2016-01-01
Abstract. The use of a channelization mechanism on model observers not only makes mimicking human visual behavior possible, but also reduces the amount of image data needed to estimate the model observer parameters. The channelized Hotelling observer (CHO) and channelized scanning linear observer (CSLO) have recently been used to assess CT image quality for detection tasks and combined detection/estimation tasks, respectively. Although the use of channels substantially reduces the amount of data required to compute image quality, the number of scans required for CT imaging is still not practical for routine use. It is our desire to further reduce the number of scans required to make CHO or CSLO an image quality tool for routine and frequent system validations and evaluations. This work explores different data-reduction schemes and designs an approach that requires only a few CT scans. Three different kinds of approaches are included in this study: a conventional CHO/CSLO technique with a large sample size, a conventional CHO/CSLO technique with fewer samples, and an approach that we will show requires fewer samples to mimic conventional performance with a large sample size. The mean value and standard deviation of areas under ROC/EROC curve were estimated using the well-validated shuffle approach. The results indicate that an 80% data reduction can be achieved without loss of accuracy. This substantial data reduction is a step toward a practical tool for routine-task-based QA/QC CT system assessment. PMID:27493982
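The core computation behind a channelized observer is compact enough to sketch. Below is a minimal illustration, assuming signal-present and signal-absent images have already been reduced to channel-output vectors (e.g., via Gabor or Laguerre-Gauss channels); the variable names, channel count, and Gaussian toy data are assumptions for illustration, not the authors' implementation or their data-reduction scheme.

```python
# Minimal channelized Hotelling observer (CHO) sketch on pre-channelized data.
import numpy as np

def cho_template(v_signal, v_absent):
    """v_signal, v_absent: (N, C) arrays of channel outputs per image."""
    mean_diff = v_signal.mean(axis=0) - v_absent.mean(axis=0)
    # Average intra-class covariance of the channel outputs
    s = 0.5 * (np.cov(v_signal, rowvar=False) + np.cov(v_absent, rowvar=False))
    return np.linalg.solve(s, mean_diff)        # Hotelling template w = S^-1 (dv)

def cho_scores(w, v):
    return v @ w                                 # scalar test statistic per image

rng = np.random.default_rng(0)
v_absent = rng.normal(0.0, 1.0, size=(200, 10))  # toy channel outputs
v_signal = rng.normal(0.3, 1.0, size=(200, 10))
w = cho_template(v_signal, v_absent)
t_sig, t_abs = cho_scores(w, v_signal), cho_scores(w, v_absent)
# Empirical AUC as a simple figure of merit
auc = (t_sig[:, None] > t_abs[None, :]).mean()
print(f"empirical AUC ~ {auc:.2f}")
```

In practice, the data-reduction question studied in the paper concerns how few such image samples suffice to estimate the template and covariance stably.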
Development of a Comprehensive Community Nitrogen Oxide Emissions Reduction Toolkit (CCNERT)
NASA Astrophysics Data System (ADS)
Sung, Yong Hoon
The main objective of this study is to research and develop a simplified tool to estimate energy use in a community and its associated effects on air pollution. This tool is intended to predict the impacts of selected energy conservation options and efficiency programs on emission reduction. It is intended to help local governments and their residents understand and manage information collection and the procedures to be used. This study presents a broad overview of the community-wide energy use and NOx emissions inventory process. It also presents various simplified procedures to estimate each sector's energy use. In an effort to better understand community-wide energy use and its associated NOx emissions, the City of College Station, Texas, was selected as the case study community for this research. While one community might successfully reduce the production of NOx emissions by adopting electricity efficiency programs in its buildings, another community might be equally successful by changing the mix of fuel sources used to generate the electricity it consumes. In yet a third community, low-NOx automobiles may be mandated. Unfortunately, the impact and cost of one strategy relative to another change over time as major sources of pollution are reduced. Therefore, this research proposes to help community planners answer such questions and to assist local communities with their NOx emission reduction plans by developing a Comprehensive Community NOx Emissions Reduction Toolkit (CCNERT). The proposed simplified tool could have a substantial impact on reducing NOx emissions by providing decision-makers with a preliminary understanding of the impacts of various energy efficiency programs on emissions reductions. To help decision-makers, this study addresses these issues by providing a general framework for examining how a community's non-renewable energy use leads to NOx emissions, by quantifying each end-user's energy usage and its associated NOx emissions, and by evaluating the environmental benefits of various types of energy saving options.
NASA Technical Reports Server (NTRS)
Lord, Steven D.
1992-01-01
This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.
U.S. Geological Survey community for data integration: data upload, registry, and access tool
2012-01-01
As a leading science and information agency and in fulfillment of its mission to provide reliable scientific information to describe and understand the Earth, the U.S. Geological Survey (USGS) ensures that all scientific data are effectively hosted, adequately described, and appropriately accessible to scientists, collaborators, and the general public. To succeed in this task, the USGS established the Community for Data Integration (CDI) to address data and information management issues affecting the proficiency of earth science research. Through the CDI, the USGS is providing data and metadata management tools, cyber infrastructure, collaboration tools, and training in support of scientists and technology specialists throughout the project life cycle. One of the significant tools recently created to contribute to this mission is the Uploader tool. This tool allows scientists with limited data management resources to address many of the key aspects of the data life cycle: the ability to protect, preserve, publish and share data. By implementing this application inside ScienceBase, scientists also can take advantage of other collaboration capabilities provided by the ScienceBase platform.
Manzoni, Gian Mauro; Rossi, Alessandro; Marazzi, Nicoletta; Agosti, Fiorenza; De Col, Alessandra; Pietrabissa, Giada; Castelnuovo, Gianluca; Molinari, Enrico; Sartorio, Allessandro
2018-01-01
This study aimed to examine the feasibility, validity, and reliability of the Italian Pediatric Quality of Life Inventory Multidimensional Fatigue Scale (PedsQL™ MFS) for adult inpatients with severe obesity. 200 inpatients (81% females) with severe obesity (BMI ≥ 35 kg/m²) completed the PedsQL MFS (General Fatigue, Sleep/Rest Fatigue and Cognitive Fatigue domains), the Fatigue Severity Scale, and the Center for Epidemiologic Studies Depression Scale immediately after admission to a 3-week residential body weight reduction program. A randomized subsample of 48 patients re-completed the PedsQL MFS after 3 days. Confirmatory factor analysis showed that a modified hierarchical model, with two items moved from the Sleep/Rest Fatigue domain to the General Fatigue domain and a second-order latent factor, best fitted the data. Internal consistency and test-retest reliabilities were acceptable to high in all scales, and small to high statistically significant correlations were found with all convergent measures, with the exception of BMI. Significant floor effects were found in two scales (Cognitive Fatigue and Sleep/Rest Fatigue). The Italian modified PedsQL MFS for adults was shown to be a valid and reliable tool for the assessment of fatigue in inpatients with severe obesity. Future studies should assess its discriminant validity as well as its responsiveness to weight reduction. © 2018 The Author(s) Published by S. Karger GmbH, Freiburg.
Sowden, Justina N; Olver, Mark E
2017-03-01
The present study provides an examination of dynamic sexual violence risk featuring the Stable-2007 (Hanson, Harris, Scott, & Helmus, 2007) and the Violence Risk Scale-Sexual Offender version (VRS-SO; Wong, Olver, Nicholaichuk, & Gordon, 2003) in a Canadian sample of 180 federally incarcerated sexual offenders who attended a high-intensity sexual offender treatment program. Archival pretreatment and posttreatment ratings were completed on the VRS-SO and Stable-2007, and recidivism data were obtained from official criminal records, with the sample being followed up approximately 10 years postrelease. VRS-SO pre- and posttreatment dynamic scores demonstrated significant predictive accuracy for sexual, nonsexual violent, any violent (including sexual), and general recidivism, while Stable-2007 pre- and posttreatment scores were significantly associated with the latter 3 outcomes; these associations were maintained after controlling for the Static-99R (Helmus, Thornton, Hanson, & Babchishin, 2012). Finally, significant pre-post differences, amounting to approximately three quarters of a standard deviation, were found on Stable-2007 and VRS-SO scores. VRS-SO change scores were significantly associated with reductions in nonsexual violent, any violent, and general recidivism (but not sexual recidivism) after controlling for baseline risk or pretreatment score, while Stable-2007 change scores did not significantly predict reductions in any recidivism outcomes. Applications of these tools within the context of dynamic sexual violence risk assessment incorporating the use of change information are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Data user's notes of the radio astronomy experiment aboard the OGO-V spacecraft
NASA Technical Reports Server (NTRS)
Haddock, F. T.; Breckenridge, S. L.
1970-01-01
General information concerning the low-frequency radiometer, instrument package launching and operation, and scientific objectives of the flight are provided. Calibration curves and correction factors, with general and detailed information on the preflight calibration procedure are included. The data acquisition methods and the format of the data reduction, both on 35 mm film and on incremental computer plots, are described.
Applied Meteorology Unit (AMU)
NASA Technical Reports Server (NTRS)
Bauman, William; Lambert, Winifred; Wheeler, Mark; Barrett, Joe; Watson, Leela
2007-01-01
This report summarizes the Applied Meteorology Unit (AMU) activities for the second quarter of Fiscal Year 2007 (January - March 2007). Tasks reported on are: Objective Lightning Probability Tool, Peak Wind Tool for General Forecasting, Situational Lightning Climatologies for Central Florida, Anvil Threat Corridor Forecast Tool in AWIPS, Volume Averaged Height Integrated Radar Reflectivity (VAHIRR), Tower Data Skew-T Tool, and Weather Research and Forecasting (WRF) Model Sensitivity Study.
Automating OSIRIS Data Reduction for the Keck Observatory Archive
NASA Astrophysics Data System (ADS)
Holt, J.; Tran, H. D.; Goodrich, R.; Berriman, G. B.; Gelino, C. R.; KOA Team
2014-05-01
By the end of 2013, the Keck Observatory Archive (KOA) will serve data from all active instruments on the Keck Telescopes. OSIRIS (OH-Suppressing Infra-Red Imaging Spectrograph), the last active instrument to be archived in KOA, has been in use behind the adaptive optics (AO) system at Keck since February 2005. It uses an array of tiny lenslets to simultaneously produce spectra at up to 4096 locations. Due to the complicated nature of the OSIRIS raw data, the OSIRIS team developed a comprehensive data reduction program. This data reduction system has an online mode for quick real-time reductions, which are used primarily for basic data visualization and quality assessment done at the telescope while observing. The offline version of the data reduction system includes an expanded reduction method list, does more iterations for a better construction of the data cubes, and is used to produce publication-quality products. It can also use reconstruction matrices that are developed after the observations were taken, and are more refined. The KOA team is currently utilizing the standard offline reduction mode to produce quick-look browse products for the raw data. Users of the offline data reduction system generally use a graphical user interface to manually set up the reduction parameters. However, in order to reduce and serve the 200,000 science files on disk, all of the reduction parameters and steps need to be fully automated. This pipeline will also be used to automatically produce quick-look browse products for future OSIRIS data after each night's observations. Here we discuss the complexities of OSIRIS data, the reduction system in place, methods for automating the system, performance using virtualization, and progress made to date in generating the KOA products.
Automating OSIRIS Data Reduction for the Keck Observatory Archive
NASA Astrophysics Data System (ADS)
Tran, Hien D.; Holt, J.; Goodrich, R. W.; Lyke, J. E.; Gelino, C. R.; Berriman, G. B.; KOA Team
2014-01-01
Since the end of 2013, the Keck Observatory Archive (KOA) has served data from all active instruments on the Keck Telescopes. OSIRIS (OH-Suppressing Infra-Red Imaging Spectrograph), the last active instrument to be archived in KOA, has been in use behind the adaptive optics (AO) system at Keck since February 2005. It uses an array of tiny lenslets to simultaneously produce spectra at up to 4096 locations. Due to the complicated nature of the OSIRIS raw data, the OSIRIS team developed a comprehensive data reduction program. This data reduction system has an online mode for quick real-time reductions which are used primarily for basic data visualization and quality assessment done at the telescope while observing. The offline version of the data reduction system includes an expanded reduction method list, does more iterations for a better construction of the data cubes, and is used to produce publication-quality products. It can also use reconstruction matrices that are developed after the observations were taken, and are more refined. The KOA team is currently utilizing the standard offline reduction mode to produce quick-look browse products for the raw data. Users of the offline data reduction system generally use a graphical user interface to manually setup the reduction parameters. However, in order to reduce and serve the ~200,000 science files on disk, all of the reduction parameters and steps need to be fully automated. This pipeline will also be used to automatically produce quick-look browse products for future OSIRIS data after each night's observations. Here we discuss the complexities of OSIRIS data, the reduction system in place, methods for automating the system, performance using virtualization, and progress made to date in generating the KOA products.
Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P
Surgery carries a high risk for the occurrence of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool with the Hospital National Health System registration of discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. Observational and descriptive retrospective study of patients admitted to General Surgery of a tertiary hospital and undergoing surgery in 2012. Adverse events were identified by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS registered for the same patients. Once the AE were identified, they were classified according to damage and to the extent to which they could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools, and the Hanley and McNeil test was used to compare them. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS. These differences were statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Publicado por Elsevier España, S.L.U. All rights reserved.
Rebolledo-Leiva, Ricardo; Angulo-Meza, Lidia; Iriarte, Alfredo; González-Araya, Marcela C
2017-09-01
Operations management tools are critical in the process of evaluating and implementing action towards low-carbon production. Currently, sustainable production implies both efficient resource use and the obligation to meet targets for reducing greenhouse gas (GHG) emissions. The carbon footprint (CF) tool allows estimating the overall amount of GHG emissions associated with a product or activity throughout its life cycle. In this paper, we propose a four-step method for the joint use of CF assessment and Data Envelopment Analysis (DEA). Following the eco-efficiency definition, which is the delivery of goods using fewer resources and with decreasing environmental impact, we use an output-oriented DEA model to maximize production and reduce CF, taking into account the economic and ecological perspectives simultaneously. In a further step, we establish targets for the contributing CF factors in order to achieve CF reduction. The proposed method was applied to assess the eco-efficiency of five organic blueberry orchards throughout three growing seasons. The results show that this method is a practical tool for determining eco-efficiency and reducing GHG emissions. Copyright © 2017 Elsevier B.V. All rights reserved.
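The output-oriented DEA step can be posed as a small linear program per orchard. The sketch below is a minimal output-oriented (CCR) envelopment model solved with scipy.optimize.linprog; treating the carbon footprint as an additional input to be contained is a common simplification and an assumption here, not necessarily the exact formulation used by the authors, and the toy orchard data are illustrative.

```python
# Output-oriented CCR DEA efficiency score via linear programming.
import numpy as np
from scipy.optimize import linprog

def output_oriented_dea(X, Y, o):
    """X: (n, m) inputs, Y: (n, s) outputs, o: index of the unit evaluated.
    Returns phi >= 1; phi == 1 means the unit lies on the efficient frontier."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = -1.0                                   # maximize phi (minimize -phi)
    A_ub, b_ub = [], []
    for i in range(m):                            # sum_j lam_j * x_ij <= x_io
        A_ub.append(np.concatenate(([0.0], X[:, i])))
        b_ub.append(X[o, i])
    for r in range(s):                            # phi*y_ro - sum_j lam_j*y_rj <= 0
        A_ub.append(np.concatenate(([Y[o, r]], -Y[:, r])))
        b_ub.append(0.0)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (1 + n), method="highs")
    return res.x[0]

# Toy data: 5 orchards, inputs = [resource use, carbon footprint], output = yield
X = np.array([[10, 5], [12, 6], [9, 7], [15, 4], [11, 5]], dtype=float)
Y = np.array([[100], [110], [90], [120], [105]], dtype=float)
for o in range(len(X)):
    print("orchard", o, "phi =", round(output_oriented_dea(X, Y, o), 3))
```

A score of 1 marks an eco-efficient orchard; larger values indicate how much output could proportionally grow given the observed resource use and emissions.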
Asymmetric author-topic model for knowledge discovering of big data in toxicogenomics.
Chung, Ming-Hua; Wang, Yuping; Tang, Hailin; Zou, Wen; Basinger, John; Xu, Xiaowei; Tong, Weida
2015-01-01
The advancement of high-throughput screening technologies facilitates the generation of massive amounts of biological data, a big data phenomenon in biomedical science. Yet researchers still rely heavily on keyword search and/or literature review to navigate the databases, and analyses are often done on a rather small scale. As a result, the rich information in a database has not been fully utilized, particularly the information embedded in the interactions between data points, which is largely ignored and buried. For the past 10 years, probabilistic topic modeling has been recognized as an effective machine learning algorithm for annotating the hidden thematic structure of massive collections of documents. The analogy between a text corpus and large-scale genomic data enables the application of text mining tools, such as probabilistic topic models, to explore hidden patterns of genomic data and, by extension, altered biological functions. In this paper, we developed a generalized probabilistic topic model to analyze a toxicogenomics dataset consisting of a large number of gene expression profiles from rat livers treated with drugs at multiple doses and time-points. We discovered hidden patterns in gene expression associated with the effects of dose and time-point of treatment. Finally, we illustrated the ability of our model to identify evidence of potential reduction of animal use.
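To make the text-mining analogy concrete, the sketch below fits a plain latent Dirichlet allocation model to a matrix of discretized expression "counts" (conditions as documents, genes as words). This is only an illustration of the analogy: the discretization step, the toy data, and the use of scikit-learn's standard LDA are assumptions, not the authors' generalized asymmetric author-topic model.

```python
# Plain LDA stand-in for topic modeling of gene-expression data.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(1)
# Rows ~ treatment conditions (dose/time-point combinations), columns ~ genes;
# values are assumed to be non-negative discretized expression levels.
counts = rng.poisson(lam=2.0, size=(60, 500))

lda = LatentDirichletAllocation(n_components=5, random_state=0)
doc_topic = lda.fit_transform(counts)   # condition-by-pattern weights
topic_gene = lda.components_            # pattern-by-gene loadings

# Report the genes that load most heavily on each hidden expression pattern
for k, row in enumerate(topic_gene):
    print(f"pattern {k}: top genes {np.argsort(row)[::-1][:5]}")
```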
Population genetics of autopolyploids under a mixed mating model and the estimation of selfing rate.
Hardy, Olivier J
2016-01-01
Nowadays, the population genetics analysis of autopolyploid species faces many difficulties due to (i) limited development of population genetics tools under polysomic inheritance, (ii) difficulties in assessing allelic dosage when genotyping individuals and (iii) a form of inbreeding resulting from the mechanism of 'double reduction'. Consequently, few data analysis computer programs are applicable to autopolyploids. To contribute to bridging this gap, this article first derives theoretical expectations for the inbreeding and identity disequilibrium coefficients under polysomic inheritance in a mixed mating model. Moment estimators of these coefficients are proposed when exact genotypes or just marker phenotypes (i.e. allelic dosage unknown) are available. This led to the development of estimators of the selfing rate based on adult genotypes or phenotypes and applicable to any even-ploidy level. Their statistical performances and robustness were assessed by numerical simulations. Contrary to inbreeding-based estimators, the identity disequilibrium-based estimator using phenotypes is robust (absolute bias generally < 0.05), even in the presence of double reduction, null alleles or biparental inbreeding due to isolation by distance. A fairly good precision of the selfing rate estimates (root mean squared error < 0.1) is already achievable using a sample of 30-50 individuals phenotyped at 10 loci bearing 5-10 alleles each, conditions reachable using microsatellite markers. Diallelic markers (e.g. SNP) can also perform satisfactorily in diploids and tetraploids but more polymorphic markers are necessary for higher ploidy levels. The method is implemented in the software SPAGeDi and should help reduce the lack of population genetics tools applicable to autopolyploids. © 2015 John Wiley & Sons Ltd.
HIV RISK REDUCTION INTERVENTIONS AMONG SUBSTANCE-ABUSING REPRODUCTIVE-AGE WOMEN: A SYSTEMATIC REVIEW
Weissman, Jessica; Kanamori, Mariano; Dévieux, Jessy G.; Trepka, Mary Jo; De La Rosa, Mario
2017-01-01
HIV/AIDS is one of the leading causes of death among reproductive-age women throughout the world, and substance abuse plays a major role in HIV infection. We conducted a systematic review, in accordance with the 2015 Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, to assess HIV risk-reduction intervention studies among reproductive-age women who abuse substances. We initially identified 6,506 articles during our search and, after screening titles and abstracts, examining articles in greater detail, and finally excluding those rated methodologically weak, a total of 10 studies were included in this review. Studies that incorporated behavioral skills training into the intervention and were based on theoretical model(s) were, in general, the most effective at decreasing sex and drug risk behaviors. Additional HIV risk-reduction intervention research with improved methodological designs is warranted to determine the most efficacious HIV risk-reduction intervention for reproductive-age women who abuse substances. PMID:28467160
A Sensor Driven Probabilistic Method for Enabling Hyper Resolution Flood Simulations
NASA Astrophysics Data System (ADS)
Fries, K. J.; Salas, F.; Kerkez, B.
2016-12-01
A reduction in the cost of sensors and wireless communications is now enabling researchers and local governments to make flow, stage and rain measurements at locations that are not covered by existing USGS or state networks. We ask the question: how should these new sources of densified, street-level sensor measurements be used to make improved forecasts using the National Water Model (NWM)? Assimilating these data "into" the NWM can be challenging due to computational complexity, as well as the heterogeneity of sensor and other input data. Instead, we introduce a machine learning and statistical framework that layers these data "on top" of the NWM outputs to improve high-resolution hydrologic and hydraulic forecasting. By generalizing our approach into a post-processing framework, we generate a rapidly repeatable blueprint for decision makers who want to improve local forecasts by coupling sensor data with the NWM. We present preliminary results based on case studies in highly instrumented watersheds in the US. Through the use of statistical learning tools and hydrologic routing schemes, we demonstrate the ability of our approach to improve forecasts while simultaneously characterizing bias and uncertainty in the NWM.
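The "layer on top" idea amounts to training a statistical post-processor that maps model output plus local sensor covariates to observed conditions. The sketch below illustrates this with a gradient-boosted regressor on synthetic data; the choice of learner, the covariates, and the data are assumptions for illustration, not the authors' framework.

```python
# Statistical post-processing of NWM-style forecasts with local sensor data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
nwm_flow = rng.gamma(2.0, 5.0, n)            # model forecast at the reach (synthetic)
local_rain = rng.exponential(1.0, n)         # low-cost street-level rain gauge (synthetic)
observed = 0.8 * nwm_flow + 2.0 * local_rain + rng.normal(0, 1.0, n)

X = np.column_stack([nwm_flow, local_rain])
X_tr, X_te, y_tr, y_te = train_test_split(X, observed, random_state=0)

model = GradientBoostingRegressor().fit(X_tr, y_tr)
residual = y_te - model.predict(X_te)
# Residual statistics characterize the remaining bias and uncertainty
print("bias:", round(float(residual.mean()), 3), "spread:", round(float(residual.std()), 3))
```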
Falco: a quick and flexible single-cell RNA-seq processing framework on the cloud.
Yang, Andrian; Troup, Michael; Lin, Peijie; Ho, Joshua W K
2017-03-01
Single-cell RNA-seq (scRNA-seq) is increasingly used in a range of biomedical studies. Nonetheless, current RNA-seq analysis tools are not specifically designed to efficiently process scRNA-seq data due to their limited scalability. Here we introduce Falco, a cloud-based framework that enables parallelization of existing RNA-seq processing pipelines using the big data technologies Apache Hadoop and Apache Spark to perform massively parallel analysis of large-scale transcriptomic data. Using two public scRNA-seq datasets and two popular RNA-seq alignment/feature quantification pipelines, we show that the same processing pipeline runs 2.6-145.4 times faster using Falco than on a highly optimized standalone computer. Falco also allows users to utilize low-cost spot instances of Amazon Web Services, providing a ∼65% reduction in the cost of analysis. Falco is available via a GNU General Public License at https://github.com/VCCRI/Falco/. j.ho@victorchang.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
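The general parallelization pattern (not Falco's actual configuration) can be sketched with PySpark: distribute per-cell input files across a cluster and invoke an existing command-line pipeline on each. The file names and the placeholder command below are assumptions for illustration only.

```python
# Parallelization pattern: run an existing per-cell pipeline across a Spark cluster.
import subprocess
from pyspark import SparkContext

sc = SparkContext(appName="scRNAseqSketch")

# Placeholder input list; in practice these would be paths on shared/cloud storage.
fastq_files = ["cell_%04d.fastq.gz" % i for i in range(384)]

def run_pipeline(path):
    # Placeholder command; a real pipeline would call an aligner and a
    # feature-quantification step on the given file.
    result = subprocess.run(["echo", "processing", path],
                            capture_output=True, text=True)
    return (path, result.stdout.strip())

outputs = sc.parallelize(fastq_files, numSlices=48).map(run_pipeline).collect()
sc.stop()
```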
A Monte Carlo Simulation Study of the Reliability of Intraindividual Variability
Estabrook, Ryne; Grimm, Kevin J.; Bowles, Ryan P.
2012-01-01
Recent research has seen intraindividual variability (IIV) become a useful technique to incorporate trial-to-trial variability into many types of psychological studies. IIV as measured by individual standard deviations (ISDs) has shown unique prediction to several types of positive and negative outcomes (Ram, Rabbit, Stollery, & Nesselroade, 2005). One unanswered question regarding measuring intraindividual variability is its reliability and the conditions under which optimal reliability is achieved. Monte Carlo simulation studies were conducted to determine the reliability of the ISD compared to the intraindividual mean. The results indicate that ISDs generally have poor reliability and are sensitive to insufficient measurement occasions, poor test reliability, and unfavorable amounts and distributions of variability in the population. Secondary analysis of psychological data shows that use of individual standard deviations in unfavorable conditions leads to a marked reduction in statistical power, although careful adherence to underlying statistical assumptions allows their use as a basic research tool. PMID:22268793
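The simulation logic can be illustrated compactly: generate two parallel sets of occasions per person, compute each person's individual standard deviation (ISD) and mean in both halves, and take the across-half correlation as a reliability estimate. The population values, sample sizes, and split-half design below are illustrative assumptions, not the study's exact Monte Carlo conditions.

```python
# Monte Carlo sketch: reliability of the ISD versus the intraindividual mean.
import numpy as np

rng = np.random.default_rng(7)
n_persons, n_occasions = 200, 10
true_mean = rng.normal(50, 10, n_persons)        # each person's true level
true_isd = rng.gamma(4.0, 1.5, n_persons)        # each person's true variability

def simulate_occasions():
    noise = rng.normal(0, 1, (n_persons, n_occasions)) * true_isd[:, None]
    return true_mean[:, None] + noise

a, b = simulate_occasions(), simulate_occasions()
rel_mean = np.corrcoef(a.mean(axis=1), b.mean(axis=1))[0, 1]
rel_isd = np.corrcoef(a.std(axis=1, ddof=1), b.std(axis=1, ddof=1))[0, 1]
print(f"reliability of mean ~ {rel_mean:.2f}, reliability of ISD ~ {rel_isd:.2f}")
```

With few occasions the ISD correlation is typically much lower than that of the mean, which is the pattern the study reports.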
Nonpharmacologic approach to fatigue in patients with cancer.
Pachman, Deirdre R; Price, Katharine A; Carey, Elise C
2014-01-01
Cancer-related fatigue is a common yet underappreciated problem with a significant impact on functional ability and quality of life. Practice guidelines mandate that all cancer patients and survivors be screened for cancer-related fatigue (CRF) at regular intervals. Comorbidities that could contribute to fatigue should be treated, and patients with moderate to severe fatigue should undergo a comprehensive evaluation. Nonpharmacologic interventions are important tools to combat CRF and should be incorporated into routine practice. Physical activity, educational interventions, and cognitive-behavioral therapy have the most supportive data and can be recommended to patients with confidence. From a practical standpoint, general education on CRF is something that most care providers can readily offer patients as part of routine care. Other interventions that appear promising but are as yet lacking convincing evidence include mindfulness-based stress reduction, yoga, and acupuncture. Reiki, Qigong, hypnosis, and music therapy may be worthy of further investigation.
Gene Selection and Cancer Classification: A Rough Sets Based Approach
NASA Astrophysics Data System (ADS)
Sun, Lijun; Miao, Duoqian; Zhang, Hongyun
Identification of informative gene subsets responsible for discerning between available samples of gene expression data is an important task in bioinformatics. Reducts, from rough set theory, corresponding to minimal sets of essential genes for discerning samples, are an efficient tool for gene selection. Due to the computational complexity of the existing reduct algorithms, feature ranking is usually used as a first step to narrow down the gene space, and top-ranked genes are selected. In this paper, we define a novel criterion for scoring genes, based on the expression-level difference between classes and the gene's contribution to classification, and present an algorithm for generating all possible reducts from the informative genes. The algorithm takes the whole attribute set into account and finds short reducts with a significant reduction in computational complexity. An exploration of this approach on benchmark gene expression data sets demonstrates that it is successful in selecting highly discriminative genes, and the classification accuracy is impressive.
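The two-step idea (rank genes, then grow a reduct over the top-ranked ones) can be sketched as follows. The separation score, the discretization into three levels, and the greedy search are illustrative assumptions standing in for the authors' criterion and exhaustive reduct generation.

```python
# Greedy reduct-style gene selection on discretized expression values.
import numpy as np

def rank_genes(X, y):
    """Score = |difference of class means| / pooled std (two-class case)."""
    a, b = X[y == 0], X[y == 1]
    pooled = np.sqrt(0.5 * (a.var(axis=0) + b.var(axis=0))) + 1e-9
    return np.abs(a.mean(axis=0) - b.mean(axis=0)) / pooled

def greedy_reduct(X, y, top=20):
    idx = np.argsort(rank_genes(X, y))[::-1][:top]        # narrow the gene space
    cuts = np.quantile(X[:, idx], [1 / 3, 2 / 3])         # crude 3-level discretization
    D = np.digitize(X[:, idx], cuts)
    pairs = [(i, j) for i in range(len(y)) for j in range(i + 1, len(y)) if y[i] != y[j]]
    chosen = []
    while pairs:
        # pick the gene that discerns the largest number of remaining cross-class pairs
        gains = [sum(D[i, g] != D[j, g] for i, j in pairs) for g in range(top)]
        g = int(np.argmax(gains))
        if gains[g] == 0:
            break
        chosen.append(int(idx[g]))
        pairs = [(i, j) for i, j in pairs if D[i, g] == D[j, g]]
    return chosen

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 200))
y = np.repeat([0, 1], 20)
X[y == 1, :5] += 2.0                                     # 5 truly informative genes
print("reduct genes:", greedy_reduct(X, y))
```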
General pathologist-helper: The new medical app about general pathology.
Fernández-Vega, Iván
2015-01-01
Smartphone applications (apps) have become increasingly prevalent in medicine. Because most pathologists, pathology trainees, technicians, and medical students use smartphones, apps can be a new avenue for general pathology education. "General pathologist-helper (GP-HELPER)" is a novel app developed as a reference tool in general pathology, especially for general pathologists, and released for the Android and iOS platforms. "GP-HELPER" was created using the Mobincube website platform. The tool also integrates "FORUM GP-HELPER", an external website created using the Miarroba website (http://forum-gp-helper.mboards.com), and "COMMUNITY GP-HELPER", a multichannel chat created using the Chatango website platform. The application was released in July 2015 and has been periodically updated since then. The app has permanent information (offline data) about different pathology protocols (the latest TNM edition, protocols for the management of tumors of unknown primary origin, and flowcharts for some of the most difficult tumors to diagnose) and a database with more than 5000 immunohistochemistry results from different tumors. Online data have links to more than 1100 reference pathology video lectures, information on 250 antibodies, more than 70 pathology association websites, 46 pathology providers, and 78 outstanding pathology journal websites. Besides this information, the app has two interactive places, "FORUM GP-HELPER" and "COMMUNITY GP-HELPER", that let users stay in touch anywhere and at any time. An expert consult section is also available. "GP-HELPER" aims to integrate offline and online pathology data with two interactive external places in order to serve as a reference tool for general pathologists and associate members.
NASA Astrophysics Data System (ADS)
Carlson, Derrick R.
While renewable energy is in the process of maturing, energy efficiency improvements may provide an opportunity to reduce energy consumption and the consequent greenhouse gas emissions, helping to bridge the gap between current emissions and the reductions necessary to prevent serious effects of climate change; such improvements will continue to be an integral part of greenhouse gas emissions policy moving forward. Residential energy is a largely untapped source of energy reductions, as consumers who wish to reduce energy consumption for monetary, environmental, and other reasons face barriers. One such barrier is a lack of knowledge or understanding of how energy is consumed in a home and how to reduce this consumption effectively through behavioral and technological changes. One way to improve understanding of residential energy consumption is through the creation of a model that predicts which appliances and electronics will be present in a home and significantly contribute to its electricity consumption, on the basis of various characteristics of that home. The basis of this model is publicly available survey data from the Residential Energy Consumption Survey (RECS). By predicting how households are likely to consume energy, homeowners, policy makers, and other stakeholders gain access to valuable data that enables reductions in energy consumption in the residential sector. This model can be used to select homes that may be ripe for energy reductions and to predict the appliances that are the basis of these potential reductions. This work suggests that most homes in the U.S. have about eight appliances that are responsible for about 80% of the electricity consumption in that home. Characteristics such as census region, floor space, income, and total electricity consumption affect which appliances are likely to be in a home; however, the number of appliances is generally around eight. It generally takes around 4 appliances to reach the 50% threshold and 12 appliances to reach 90% of electricity consumption, which suggests significant diminishing returns for parties interested in monitoring appliance-level electricity consumption. Another way to improve understanding of residential energy consumption is through the development of residential use-phase energy vectors for use in the Economic Input-Output Life Cycle Assessment (EIO-LCA) model. The EIO-LCA model is a valuable scoping tool for predicting the environmental impacts of economic activity. This tool has a gap in its capabilities, as residential use-phase energy is outside the scope of the model. Adding use-phase energy vectors to the EIO-LCA model improves the modeling, provides a more complete estimation of energy impacts, and allows embedded energy to be compared to use-phase energy for the purchase of goods and services in the residential sector. This work adds 21 quads of energy to the residential energy sector of the model and 15 quads of energy for personal transportation. These additions represent one third of the total energy consumption of the United States and a third of the total energy in the EIO-LCA model. This work also demonstrates that for many products, such as electronics and household appliances, use-phase energy demands are much greater than manufacturing energy demands and dominate the life cycles of these products. A final way in which this thesis improves the understanding of how use-phase energy is consumed in a home is through the exploration of potential energy reductions in a home.
This analysis selects products that are used or consumed in a home and explores the potential for reductions in the embedded manufacturing and use-phase energy of each product using EIO-LCA and the energy vectors created in Chapter 3. The results give consumers an understanding of where energy is consumed in the life cycle of the products they purchase and provide policy makers with valuable information on how to focus or refocus policies aimed at reducing energy in the residential sector. This work finds that a majority of the energy consumed by retail products is consumed in the use phase of electronics and appliances. Consequently, the largest potential reductions in residential energy use can be found in the same area. The work also shows that targeting reductions in the manufacturing energy of many products is likely to be an ineffective strategy for energy reductions, with the exception of a select few products. Supply chain energy reductions may be more promising than manufacturing energy reductions, though neither is likely to be as effective as strategies that target use-phase energy reductions.
Reid, Susan A; Callister, Robin; Katekar, Michael G; Treleaven, Julia M
2017-08-01
Cervicogenic dizziness (CGD) is hard to diagnose as there is no objective test. Can a brief assessment tool be derived from the Dizziness Handicap Inventory (DHI) to assist in screening for CGD? Case-control study with split-sample analysis. 86 people with CGD and 86 people with general dizziness completed the DHI as part of the assessment of their dizziness. Descriptive statistics were used to assess how frequently each question on the DHI was answered 'yes' or 'sometimes' by participants with CGD and by participants with general dizziness. The questions that best discriminated between CGD and general dizziness were compiled into a brief assessment tool for CGD. Data from 80 participants (40 from each group) were used to generate a receiver operating characteristic (ROC) curve to establish a cut-off score for the brief assessment tool. Data from the remaining 92 participants were then used to validate the diagnostic ability of the brief assessment tool using that cut-off score. Questions 1, 9 and 11 were the most discriminatory and were combined to form the brief assessment tool. The ROC curve indicated an optimal threshold of 9. The diagnostic ability of the brief assessment tool among the remaining 46 participants from each group was: sensitivity 77% (95% CI 67 to 84), specificity 66% (56 to 75), positive likelihood ratio 2.28 (1.66 to 3.13), and negative likelihood ratio 0.35 (0.23 to 0.53). A brief assessment tool of three questions appears to be helpful in screening for CGD. Copyright © 2017. Published by Elsevier Ltd.
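Deriving a cut-off from a short questionnaire score is a standard ROC exercise, sketched below with simulated scores. The three-item scoring (0/2/4 per DHI item) and the reported threshold of 9 come from the abstract, but the data and response probabilities here are illustrative assumptions, not the study data.

```python
# ROC-based cut-off selection for a three-item screening score.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(11)
# Summed score of DHI items 1, 9 and 11 (each scored 0, 2 or 4), per group
cgd = rng.choice([0, 2, 4], size=(40, 3), p=[0.1, 0.3, 0.6]).sum(axis=1)
general = rng.choice([0, 2, 4], size=(40, 3), p=[0.5, 0.3, 0.2]).sum(axis=1)

y = np.r_[np.ones_like(cgd), np.zeros_like(general)]
score = np.r_[cgd, general]

fpr, tpr, thresholds = roc_curve(y, score)
best = np.argmax(tpr - fpr)                 # Youden's J as the optimality criterion
print("AUC:", round(roc_auc_score(y, score), 2), "suggested cut-off:", thresholds[best])
```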
2013-07-01
structure of the data and Gower’s similarity coefficient as the algorithm for calculating the proximity matrices. The following section provides a ... representative set of terrorist event data with the attributes Day, Location, Time, Primary Attack and Secondary Attack, each assigned a weight of 1; Day, Location and Primary Attack use a nominal scale and Time an interval scale. ... To calculate the similarity it uses Gower’s similarity and multidimensional scaling algorithms contained in an R statistical computing environment.
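The combination described in this fragment (Gower's similarity on mixed nominal/interval attributes, followed by multidimensional scaling) is illustrated below in Python rather than R. The toy event records, the equal weights, and the range normalization for the interval attribute are assumptions for illustration.

```python
# Gower's similarity on mixed attributes, then classical MDS on the dissimilarities.
import numpy as np
from sklearn.manifold import MDS

# Columns: Day (nominal), Location (nominal), Time (interval),
#          Primary attack type (nominal), Secondary attack type (nominal)
records = [
    ("Mon", "CityA", 8.0,  "IED",    "SmallArms"),
    ("Tue", "CityA", 14.5, "IED",    "None"),
    ("Mon", "CityB", 9.0,  "Ambush", "SmallArms"),
]
nominal, interval = [0, 1, 3, 4], [2]
weights = np.ones(5)
t_range = max(r[2] for r in records) - min(r[2] for r in records)

def gower_similarity(a, b):
    s = np.empty(5)
    for k in nominal:
        s[k] = 1.0 if a[k] == b[k] else 0.0          # match/mismatch for nominal
    for k in interval:
        s[k] = 1.0 - abs(a[k] - b[k]) / t_range      # range-normalized for interval
    return float(np.average(s, weights=weights))

n = len(records)
S = np.array([[gower_similarity(records[i], records[j]) for j in range(n)] for i in range(n)])
D = 1.0 - S                                          # dissimilarity matrix for MDS
coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(D)
print(coords)
```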
STILTS -- Starlink Tables Infrastructure Library Tool Set
NASA Astrophysics Data System (ADS)
Taylor, Mark
STILTS is a set of command-line tools for processing tabular data. It has been designed for, but is not restricted to, use on astronomical data such as source catalogues. It contains both generic (format-independent) table processing tools and tools for processing VOTable documents. Facilities offered include crossmatching, format conversion, format validation, column calculation and rearrangement, row selection, sorting, plotting, statistical calculations and metadata display. Calculations on cell data can be performed using a powerful and extensible expression language. The package is written in pure Java and based on STIL, the Starlink Tables Infrastructure Library. This gives it high portability, support for many data formats (including FITS, VOTable, text-based formats and SQL databases), extensibility and scalability. Where possible the tools are written to accept streamed data so the size of tables which can be processed is not limited by available memory. As well as the tutorial and reference information in this document, detailed on-line help is available from the tools themselves. STILTS is available under the GNU General Public Licence.
NASA Technical Reports Server (NTRS)
Ballester, P.
1992-01-01
MIDAS (Munich Image Data Analysis System) is the image processing system developed at ESO for astronomical data reduction. MIDAS is used for off-line data reduction at ESO and at many astronomical institutes all over Europe. In addition to a set of general commands for processing and analyzing images, catalogs, graphics and tables, MIDAS includes specialized packages dedicated to astronomical applications or to specific ESO instruments. Several graphical interfaces are available in the MIDAS environment: XHelp provides an interactive help facility, and XLong and XEchelle enable data reduction of long-slit and echelle spectra. GUI builders facilitate the development of interfaces. All ESO interfaces comply with the ESO User Interfaces Common Conventions, which ensures an identical look and feel for telescope operations, data analysis, and archives.
Exposure Assessment Tools by Lifestages and Populations - General Population
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
NASA Astrophysics Data System (ADS)
Leidig, Mathias; Teeuw, Richard M.; Gibson, Andrew D.
2016-08-01
The article presents a time series (2009-2013) analysis for a new version of the "Digital Divide" concept that developed in the 1990s. Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. The Data Poverty Index (DPI) provides an open-source means of annually evaluating global access to data and information. The DPI can be used to monitor aspects of data and information availability at global and national levels, with potential application at local (district) levels. Access to data and information is a major factor in disaster risk reduction, increased resilience to disaster and improved adaptation to climate change. In that context, the DPI could be a useful tool for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction (2015-2030). The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. Unlike many other indices, the DPI is underpinned by datasets that are consistently provided annually for almost all the countries of the world and can be downloaded without restriction or cost.
Modeling Complex Chemical Systems: Problems and Solutions
NASA Astrophysics Data System (ADS)
van Dijk, Jan
2016-09-01
Non-equilibrium plasmas in complex gas mixtures are at the heart of numerous contemporary technologies. They typically contain dozens to hundreds of species, involved in hundreds to thousands of reactions. Chemists and physicists have always been interested in what are now called chemical reduction techniques (CRTs). The idea of such CRTs is that they reduce the number of species that need to be considered explicitly without compromising the validity of the model. This is usually achieved on the basis of an analysis of the reaction time scales of the system under study, which identifies species that are in partial equilibrium after a given time span. The first such CRT to be widely used in plasma physics was developed in the 1960s and resulted in the concept of effective ionization and recombination rates. It was later generalized to systems in which multiple levels are affected by transport. In recent years there has been a renewed interest in tools for chemical reduction and reaction pathway analysis. An example of the latter is the PumpKin tool. Another trend is that techniques previously developed in other fields of science are adapted to handle the plasma state of matter. Examples are the Intrinsic Low-Dimensional Manifold (ILDM) method and its derivatives, which originate from combustion engineering, and the general-purpose Principal Component Analysis (PCA) technique. In this contribution we will provide an overview of the most common reduction techniques, then critically assess the pros and cons of the methods that have gained most popularity in recent years. Examples will be provided for plasmas in argon and carbon dioxide.
Applied Meteorology Unit (AMU)
NASA Technical Reports Server (NTRS)
Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Wheeler, Mark
2010-01-01
This report summarizes the Applied Meteorology Unit (AMU) activities for the first quarter of Fiscal Year 2010 (October - December 2009). A detailed project schedule is included in the Appendix. Included tasks are: (1) Peak Wind Tool for User Launch Commit Criteria (LCC), (2) Objective Lightning Probability Tool, Phase III, (3) Peak Wind Tool for General Forecasting, Phase II, (4) Upgrade Summer Severe Weather Tool in Meteorological Interactive Data Display System (MIDDS), (5) Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS) Update and Maintainability, (6) Verify 12-km resolution North American Model (MesoNAM) Performance, and (7) Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) Graphical User Interface.
Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.
Dunn, Joshua G; Weissman, Jonathan S
2016-11-22
Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily adapted to novel NGS assays. Examples, tutorials, and extensive documentation can be found at https://plastid.readthedocs.io .
Conceptual design of a data reduction system
NASA Technical Reports Server (NTRS)
1983-01-01
A telemetry data processing system was defined for data reduction. Data reduction activities in support of the developmental flights of the Space Shuttle were used as references against which requirements are assessed in general terms. A conceptual system design believed to offer significant throughput for the anticipated types of data reduction activities is presented. The design identifies the use of a large, intermediate data store as a key element in a complex of high speed, single purpose processors, each of which performs predesignated, repetitive operations on either raw or partially processed data. The recommended approach to implementing the design concept is to adopt an established interface standard and rely heavily on mature or promising technologies that are considered mainstream in the integrated circuit industry. The system design concept is believed to be implementable without reliance on exotic devices and/or operational procedures. Numerical methods were employed to examine the feasibility of digital discrimination of FDM composite signals and of eliminating line-frequency noise in data measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sauter, Nicholas K., E-mail: nksauter@lbl.gov; Hattne, Johan; Grosse-Kunstleve, Ralf W.
The Computational Crystallography Toolbox (cctbx) is a flexible software platform that has been used to develop high-throughput crystal-screening tools for both synchrotron sources and X-ray free-electron lasers. Plans for data-processing and visualization applications are discussed, and the benefits and limitations of using graphics-processing units are evaluated. Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h⁻¹) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units.
SkICAT: A cataloging and analysis tool for wide field imaging surveys
NASA Technical Reports Server (NTRS)
Weir, N.; Fayyad, U. M.; Djorgovski, S. G.; Roden, J.
1992-01-01
We describe an integrated system, SkICAT (Sky Image Cataloging and Analysis Tool), for the automated reduction and analysis of the Palomar Observatory-ST ScI Digitized Sky Survey. The Survey will consist of the complete digitization of the photographic Second Palomar Observatory Sky Survey (POSS-II) in three bands, comprising nearly three Terabytes of pixel data. SkICAT applies a combination of existing packages, including FOCAS for basic image detection and measurement and SAS for database management, as well as custom software, to the task of managing this wealth of data. One of the most novel aspects of the system is its method of object classification. Using state-of-the-art machine learning classification techniques (GID3* and O-BTree), we have developed a powerful method for automatically distinguishing point sources from non-point sources and artifacts, achieving comparably accurate discrimination a full magnitude fainter than in previous Schmidt plate surveys. The learning algorithms produce decision trees for classification by examining instances of objects classified by eye on both plate and higher quality CCD data. The same techniques will be applied to perform higher-level object classification (e.g., of galaxy morphology) in the near future. Another key feature of the system is the facility to integrate the catalogs from multiple plates (and portions thereof) to construct a single catalog of uniform calibration and quality down to the faintest limits of the survey. SkICAT also provides a variety of data analysis and exploration tools for the scientific utilization of the resulting catalogs. We include initial results of applying this system to measure the counts and distribution of galaxies in two bands down to Bj of approximately 21 mag over an approximately 70 square degree multi-plate field from POSS-II. SkICAT is constructed in a modular and general fashion and should be readily adaptable to other large-scale imaging surveys.
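The decision-tree classification idea can be sketched with a standard CART learner trained on measured image features; the GID3* and O-BTree learners used by SkICAT are not available in scikit-learn, so the tree algorithm, the FOCAS-like features, and the synthetic training objects below are stand-in assumptions illustrating the approach only.

```python
# Decision-tree star/galaxy separation on toy image-measurement features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 1000
# Toy features per object: magnitude, isophotal area, peak/area ratio, ellipticity
stars = np.column_stack([rng.normal(18, 2, n), rng.normal(20, 3, n),
                         rng.normal(1.0, 0.1, n), rng.normal(0.05, 0.02, n)])
galaxies = np.column_stack([rng.normal(19, 2, n), rng.normal(60, 20, n),
                            rng.normal(0.4, 0.1, n), rng.normal(0.3, 0.1, n)])
X = np.vstack([stars, galaxies])
y = np.r_[np.zeros(n), np.ones(n)]        # 0 = point source, 1 = extended source

clf = DecisionTreeClassifier(max_depth=5, random_state=0)
print("cross-validated accuracy:", round(float(cross_val_score(clf, X, y, cv=5).mean()), 3))
```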
The analytical representation of viscoelastic material properties using optimization techniques
NASA Technical Reports Server (NTRS)
Hill, S. A.
1993-01-01
This report presents a technique to model viscoelastic material properties with a function of the form of the Prony series. Generally, the method employed to determine the function constants requires assuming values for the exponential constants of the function and then resolving the remaining constants through linear least-squares techniques. The technique presented here allows all the constants to be determined analytically through optimization techniques. This technique is employed in a computer program named PRONY and makes use of a commercially available optimization tool developed by VMA Engineering, Inc. The PRONY program was utilized to compare the technique against previously determined models for solid rocket motor TP-H1148 propellant and V747-75 Viton fluoroelastomer. In both cases, the optimization technique generated functions that modeled the test data with at least an order of magnitude better correlation. This technique has demonstrated the capability to use small or large data sets and to use data sets that have uniformly or nonuniformly spaced data pairs. The reduction of experimental data to accurate mathematical models is a vital part of most scientific and engineering research. This technique of regression through optimization can be applied to other mathematical models that are difficult to fit to experimental data through traditional regression techniques.
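The key point, fitting all Prony-series constants (including the exponential time constants) simultaneously rather than fixing them and solving a linear problem, can be sketched with a general-purpose nonlinear least-squares routine. This uses scipy.optimize.least_squares in place of the commercial optimizer named in the report, and the synthetic "test data" are illustrative, not the propellant or fluoroelastomer data.

```python
# Nonlinear least-squares fit of a Prony series E(t) = E_inf + sum_i E_i * exp(-t / tau_i).
import numpy as np
from scipy.optimize import least_squares

def prony(params, t):
    e_inf = params[0]
    e = params[1::2]            # moduli of the exponential terms
    tau = params[2::2]          # corresponding time constants
    return e_inf + np.sum(e[:, None] * np.exp(-t[None, :] / tau[:, None]), axis=0)

t = np.logspace(-2, 3, 80)
true = np.array([1.0, 5.0, 0.1, 3.0, 10.0])     # E_inf, (E1, tau1), (E2, tau2)
data = prony(true, t) * (1 + 0.01 * np.random.default_rng(0).normal(size=t.size))

x0 = np.array([0.5, 1.0, 1.0, 1.0, 100.0])      # rough starting guess for all constants
fit = least_squares(lambda p: prony(p, t) - data, x0, bounds=(1e-6, np.inf))
print("fitted constants:", np.round(fit.x, 3))
```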
Cluster tool solution for fabrication and qualification of advanced photomasks
NASA Astrophysics Data System (ADS)
Schaetz, Thomas; Hartmann, Hans; Peter, Kai; Lalanne, Frederic P.; Maurin, Olivier; Baracchi, Emanuele; Miramond, Corinne; Brueck, Hans-Juergen; Scheuring, Gerd; Engel, Thomas; Eran, Yair; Sommer, Karl
2000-07-01
The reduction of wavelength in optical lithography, together with phase-shift technology and optical proximity correction (OPC), requires a rapid increase in cost-effective qualification of photomasks. Knowledge of CD variation, loss of pattern fidelity (especially for OPC patterns) and mask defects, in terms of their impact at wafer level, is becoming a key issue for mask quality assessment. As part of the European Community supported ESPRIT project 'Q-CAP', a new cluster concept has been developed, which allows the combination of hardware tools as well as software tools via network communication. It is designed to be open for any tool manufacturer and mask house. The bi-directional network access allows the exchange of all relevant mask data, including grayscale images, measurement results, lithography parameters, defect coordinates, layout data, process data etc., and its storage in a SQL database. The system uses SEMI format descriptions as well as standard network hardware and software components for the client-server communication. Each tool is used mainly to perform its specific application without spending expensive time on optional analysis, but the availability of the database allows each component to share the full data set gathered by all components. Therefore, the cluster can be considered as one single virtual tool. The paper shows the advantage of the cluster approach, the benefits of the tools linked together already, and a vision of a mask house in the near future.
Chaos in plasma simulation and experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, C.; Newman, D.E.; Sprott, J.C.
1993-09-01
We investigate the possibility that chaos and simple determinism are governing the dynamics of reversed field pinch (RFP) plasmas using data from both numerical simulations and experiment. A large repertoire of nonlinear analysis techniques is used to identify low dimensional chaos. These tools include phase portraits and Poincaré sections, correlation dimension, the spectrum of Lyapunov exponents and short term predictability. In addition, nonlinear noise reduction techniques are applied to the experimental data in an attempt to extract any underlying deterministic dynamics. Two model systems are used to simulate the plasma dynamics. These are the DEBS code, which models global RFP dynamics, and the dissipative trapped electron mode (DTEM) model, which models drift wave turbulence. Data from both simulations show strong indications of low-dimensional chaos and simple determinism. Experimental data were obtained from the Madison Symmetric Torus RFP and consist of a wide array of both global and local diagnostic signals. None of the signals shows any indication of low dimensional chaos or other simple determinism. Moreover, most of the analysis tools indicate the experimental system is very high dimensional with properties similar to noise. Nonlinear noise reduction is unsuccessful at extracting an underlying deterministic system.
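One of the tools named above, the correlation dimension, can be sketched with a Grassberger-Procaccia correlation sum on a delay-embedded scalar signal. The test signal here is a synthetic sine plus noise, not RFP data, and the embedding parameters are arbitrary choices:

    # Minimal Grassberger-Procaccia sketch on a delay-embedded synthetic signal.
    import numpy as np

    def delay_embed(x, dim, tau):
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i*tau : i*tau + n] for i in range(dim)])

    def correlation_sum(Y, r):
        d = np.sqrt(((Y[:, None, :] - Y[None, :, :])**2).sum(-1))
        iu = np.triu_indices(len(Y), k=1)
        return (d[iu] < r).mean()

    x = np.sin(0.1 * np.arange(2000)) + 0.05 * np.random.randn(2000)
    Y = delay_embed(x, dim=5, tau=10)[::10]          # subsample to keep the pairwise matrix small
    radii = np.logspace(-1.5, 0.5, 10)
    C = np.array([correlation_sum(Y, r) for r in radii])

    # The correlation dimension is the slope of log C(r) vs log r in the scaling region;
    # for brevity we fit over all radii with 0 < C < 0.5.
    mask = (C > 0) & (C < 0.5)
    slope = np.polyfit(np.log(radii[mask]), np.log(C[mask]), 1)[0]
    print("estimated correlation dimension:", slope)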
Environmental public health protection requires a good understanding of types and locations of pollutant emissions of health concern and their relationship to environmental public health indicators. Therefore, it is necessary to develop the methodologies, data sources, and tools...
Visual enhancement of images of natural resources: Applications in geology
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Neto, G.; Araujo, E. O.; Mascarenhas, N. D. A.; Desouza, R. C. M.
1980-01-01
The principal components technique, applied to multispectral scanner (MSS) LANDSAT data processing, results in optimum dimensionality reduction. A powerful tool for MSS image enhancement, the method provides a maximum impression of terrain ruggedness, which makes the technique well suited for geological analysis.
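A minimal sketch of the principal-components transform on a multispectral cube (synthetic stand-in data; band count and image size are arbitrary), showing how most of the variance concentrates in the leading components:

    # Hedged sketch: principal-components transform of a 4-band MSS-like image cube.
    import numpy as np
    from sklearn.decomposition import PCA

    bands, rows, cols = 4, 128, 128
    cube = np.random.rand(bands, rows, cols)              # stand-in for calibrated MSS bands
    X = cube.reshape(bands, -1).T                         # pixels as rows, bands as columns

    pca = PCA(n_components=bands)
    pcs = pca.fit_transform(X)                            # principal-component images, flattened
    pc_images = pcs.T.reshape(bands, rows, cols)
    print("PC image cube shape:", pc_images.shape)
    print("variance explained per PC:", pca.explained_variance_ratio_)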
Lipscomb, Hester J; Nolan, James; Patterson, Dennis; Dement, John M
2010-06-01
Nail guns are a common source of acute, and potentially serious, injury in residential construction. Data on nail gun injuries, hours worked and hours of tool use were collected in 2008 from union apprentice carpenters (n=464) through classroom surveys; this completed four years of serial cross-sectional data collection from apprentices. A predictive model of injury risk was constructed using Poisson regression. Injury rates declined 55% from baseline measures in 2005 with early training and increased use of tools with sequential actuation. Injury rates declined among users of tools with both actuation systems, but the rates of injury were consistently twice as high among those using tools with contact trip triggers. DISCUSSION AND IMPACT: Nail gun injuries can be reduced markedly through early training and use of tools with sequential actuation. These successful efforts need to be diffused broadly, including to the non-union sector. (c) 2010 Elsevier Ltd. All rights reserved.
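The predictive model described above is a Poisson regression of injury counts with hours of tool use as the exposure. A hedged sketch of that general setup, with synthetic data and an assumed rate ratio of about two for contact-trip triggers (not the study's estimates):

    # Illustrative sketch, not the study's model: Poisson regression with exposure offset.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 464
    contact_trip = rng.integers(0, 2, n)                    # 1 = contact trip trigger, 0 = sequential
    hours = rng.uniform(100, 2000, n)                       # hours of nail gun use (exposure)
    rate = 1e-3 * np.exp(0.7 * contact_trip)                # assumed ~2x rate for contact trip
    injuries = rng.poisson(rate * hours)

    X = sm.add_constant(contact_trip)
    fit = sm.GLM(injuries, X, family=sm.families.Poisson(), exposure=hours).fit()
    print("rate ratio (contact trip vs sequential):", np.exp(fit.params[1]))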
NASA Technical Reports Server (NTRS)
Gainer, T. G.; Hoffman, S.
1972-01-01
Basic formulations for developing coordinate transformations and motion equations used with free-flight and wind-tunnel data reduction are presented. The general forms presented include axes transformations that enable transfer back and forth between any of the five axes systems that are encountered in aerodynamic analysis. Equations of motion are presented that enable calculation of motions anywhere in the vicinity of the earth. A bibliography of publications on methods of analyzing flight data is included.
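As an illustration of the kind of axis transformation the report formulates, here is a body-to-Earth transformation for one familiar special case, a 3-2-1 (yaw-pitch-roll) Euler sequence; the report's general forms span five axis systems and are not reproduced here:

    # Hedged sketch: body axes to Earth axes via a 3-2-1 Euler sequence, arbitrary example state.
    import numpy as np

    def body_to_earth(phi, theta, psi):
        """Direction cosine matrix taking a body-axis vector into Earth axes."""
        cf, sf = np.cos(phi), np.sin(phi)
        ct, st = np.cos(theta), np.sin(theta)
        cp, sp = np.cos(psi), np.sin(psi)
        Rx = np.array([[1, 0, 0], [0, cf, -sf], [0, sf, cf]])
        Ry = np.array([[ct, 0, st], [0, 1, 0], [-st, 0, ct]])
        Rz = np.array([[cp, -sp, 0], [sp, cp, 0], [0, 0, 1]])
        return Rz @ Ry @ Rx

    v_body = np.array([100.0, 0.0, 5.0])                 # example body-axis velocity [m/s]
    phi, theta, psi = np.radians([5.0, 2.0, 30.0])
    print("Earth-axis velocity:", body_to_earth(phi, theta, psi) @ v_body)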
NASA Astrophysics Data System (ADS)
Irwansyah; Sinh, N. P.; Lai, J. Y.; Essomba, T.; Asbar, R.; Lee, P. Y.
2018-02-01
In this paper, we present a study integrating a virtual fracture-reduction simulation tool with a novel hybrid 3-DOF-RPS external fixator to relocate bone fragments to their anatomically original positions. A 3D model of the fractured bone was reconstructed and manipulated using 3D design and modeling software, PhysiGuide. The virtual reduction system was applied to reduce a bilateral femoral shaft fracture of type 32-A3. Measurement data from the fracture reduction and fixation stages were used to set the manipulator pose in a patient's clinical case. The experimental results indicate that merging these two techniques offers more possibilities to reduce virtual bone reduction time and shorten the healing treatment.
Combinative Particle Size Reduction Technologies for the Production of Drug Nanocrystals
Salazar, Jaime; Müller, Rainer H.; Möschwitzer, Jan P.
2014-01-01
Nanosizing is a suitable method to enhance the dissolution rate and therefore the bioavailability of poorly soluble drugs. The success of the particle size reduction processes depends on critical factors such as the employed technology, equipment, and drug physicochemical properties. High pressure homogenization and wet bead milling are standard comminution techniques that have been already employed to successfully formulate poorly soluble drugs and bring them to market. However, these techniques have limitations in their particle size reduction performance, such as long production times and the necessity of employing a micronized drug as the starting material. This review article discusses the development of combinative methods, such as the NANOEDGE, H 96, H 69, H 42, and CT technologies. These processes were developed to improve the particle size reduction effectiveness of the standard techniques. These novel technologies can combine bottom-up and/or top-down techniques in a two-step process. The combinative processes lead in general to improved particle size reduction effectiveness. Faster production of drug nanocrystals and smaller final mean particle sizes are among the main advantages. The combinative particle size reduction technologies are very useful formulation tools, and they will continue acquiring importance for the production of drug nanocrystals. PMID:26556191
NASA Astrophysics Data System (ADS)
Tonitto, C.; Gurwick, N. P.
2012-12-01
Policy initiatives to reduce greenhouse gas emissions (GHG) have promoted the development of agricultural management protocols to increase SOC storage and reduce GHG emissions. We review approaches for quantifying N2O flux from agricultural landscapes. We summarize the temporal and spatial extent of observations across representative soil classes, climate zones, cropping systems, and management scenarios. We review applications of simulation and empirical modeling approaches and compare validation outcomes across modeling tools. Subsequently, we review current model application in agricultural management protocols. In particular, we compare approaches adapted for compliance with the California Global Warming Solutions Act, the Alberta Climate Change and Emissions Management Act, and by the American Carbon Registry. In the absence of regional data to drive model development, policies that require GHG quantification often use simple empirical models based on highly aggregated data of N2O flux as a function of applied N - Tier 1 models according to IPCC categorization. As participants in development of protocols that could be used in carbon offset markets, we observed that stakeholders outside of the biogeochemistry community favored outcomes from simulation modeling (Tier 3) rather than empirical modeling (Tier 2). In contrast, scientific advisors were more accepting of outcomes based on statistical approaches that rely on local observations, and their views sometimes swayed policy practitioners over the course of policy development. Both Tier 2 and Tier 3 approaches have been implemented in current policy development, and it is important that the strengths and limitations of both approaches, in the face of available data, be well-understood by those drafting and adopting policies and protocols. The reliability of all models is contingent on sufficient observations for model development and validation. Simulation models applied without site-calibration generally result in poor validation results, and this point particularly needs to be emphasized during policy development. For cases where sufficient calibration data are available, simulation models have demonstrated the ability to capture seasonal patterns of N2O flux. The reliability of statistical models likewise depends on data availability. Because soil moisture is a significant driver of N2O flux, the best outcomes occur when empirical models are applied to systems with relevant soil classification and climate. The structure of current carbon offset protocols is not well-aligned with a budgetary approach to GHG accounting. Current protocols credit field-scale reduction in N2O flux as a result of reduced fertilizer use. Protocols do not award farmers credit for reductions in CO2 emissions resulting from reduced production of synthetic N fertilizer. To achieve the greatest GHG emission reductions through reduced synthetic N production and reduced landscape N saturation requires a re-envisioning of the agricultural landscape to include cropping systems with legume and manure N sources. The current focus on on-farm GHG sources focuses credits on simple reductions of N applied in conventional systems rather than on developing cropping systems which promote higher recycling and retention of N.
Boulesteix, Anne-Laure; Wilson, Rory; Hapfelmeier, Alexander
2017-09-09
The goal of medical research is to develop interventions that are in some sense superior, with respect to patient outcome, to interventions currently in use. Similarly, the goal of research in methodological computational statistics is to develop data analysis tools that are themselves superior to the existing tools. The methodology of the evaluation of medical interventions continues to be discussed extensively in the literature and it is now well accepted that medicine should be at least partly "evidence-based". Although we statisticians are convinced of the importance of unbiased, well-thought-out study designs and evidence-based approaches in the context of clinical research, we tend to ignore these principles when designing our own studies for evaluating statistical methods in the context of our methodological research. In this paper, we draw an analogy between clinical trials and real-data-based benchmarking experiments in methodological statistical science, with datasets playing the role of patients and methods playing the role of medical interventions. Through this analogy, we suggest directions for improvement in the design and interpretation of studies which use real data to evaluate statistical methods, in particular with respect to dataset inclusion criteria and the reduction of various forms of bias. More generally, we discuss the concept of "evidence-based" statistical research, its limitations and its impact on the design and interpretation of real-data-based benchmark experiments. We suggest that benchmark studies-a method of assessment of statistical methods using real-world datasets-might benefit from adopting (some) concepts from evidence-based medicine towards the goal of more evidence-based statistical research.
Multiple Imputation of Cognitive Performance as a Repeatedly Measured Outcome
Rawlings, Andreea M.; Sang, Yingying; Sharrett, A. Richey; Coresh, Josef; Griswold, Michael; Kucharska-Newton, Anna M.; Palta, Priya; Wruck, Lisa M.; Gross, Alden L.; Deal, Jennifer A.; Power, Melinda C.; Bandeen-Roche, Karen
2016-01-01
Background Longitudinal studies of cognitive performance are sensitive to dropout, as participants experiencing cognitive deficits are less likely to attend study visits, which may bias estimated associations between exposures of interest and cognitive decline. Multiple imputation is a powerful tool for handling missing data; however, its use for missing cognitive outcome measures in longitudinal analyses remains limited. Methods We use multiple imputation by chained equations (MICE) to impute cognitive performance scores of participants who did not attend the 2011-2013 exam of the Atherosclerosis Risk in Communities Study. We examined the validity of imputed scores using observed and simulated data under varying assumptions. We examined differences in the estimated association between diabetes at baseline and 20-year cognitive decline with and without imputed values. Lastly, we discuss how different analytic methods (mixed models and models fit using generalized estimating equations) and the choice of whom to impute for result in different estimands. Results Validation using observed data showed MICE produced unbiased imputations. Simulations showed a substantial reduction in the bias of the 20-year association between diabetes and cognitive decline comparing MICE (3-4% bias) to analyses of available data only (16-23% bias) in a construct where missingness was strongly informative but realistic. Associations between diabetes and 20-year cognitive decline were substantially stronger with MICE than in available-case analyses. Conclusions Our study suggests that, when informative data are available for non-examined participants, MICE can be an effective tool for imputing cognitive performance and improving assessment of cognitive decline, though careful thought should be given to the target imputation population and the analytic model chosen, as they may yield different estimands. PMID:27619926
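A minimal MICE-style sketch using scikit-learn's IterativeImputer on a single synthetic table; the study's imputation used chained equations with informative auxiliary variables and repeated measures, which this simplification does not reproduce:

    # MICE-style imputation sketch with informative missingness in the follow-up score.
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(2)
    n = 500
    diabetes = rng.integers(0, 2, n).astype(float)
    cog_baseline = rng.normal(0, 1, n)
    cog_20yr = cog_baseline - 0.5 - 0.4 * diabetes + rng.normal(0, 0.5, n)

    data = np.column_stack([diabetes, cog_baseline, cog_20yr])
    # Informative missingness: lower follow-up scores are more likely to be missing.
    miss = rng.random(n) < 1 / (1 + np.exp(2 + cog_20yr))
    data[miss, 2] = np.nan

    imputed = IterativeImputer(sample_posterior=True, random_state=0).fit_transform(data)
    print("observed-only mean:", np.nanmean(data[:, 2]), "imputed mean:", imputed[:, 2].mean())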
DYNAMO-HIA--a Dynamic Modeling tool for generic Health Impact Assessments.
Lhachimi, Stefan K; Nusselder, Wilma J; Smit, Henriette A; van Baal, Pieter; Baili, Paolo; Bennett, Kathleen; Fernández, Esteve; Kulik, Margarete C; Lobstein, Tim; Pomerleau, Joceline; Mackenbach, Johan P; Boshuizen, Hendriek C
2012-01-01
Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, generally accessible) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and in turn on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e. data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures--e.g. life expectancy and disease-free life expectancy--and detailed data--e.g. prevalences and mortality/survival rates--by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption and quantifying the disease-burden of smoking. By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based on epidemiologic evidence.
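A toy Markov projection in the spirit of the tool described above (not DYNAMO-HIA itself): one binary risk factor, one disease, annual cycles, and a comparison of a reference scenario with an intervention scenario. All rates below are invented:

    # Toy risk-factor/disease Markov projection comparing two scenarios over 20 annual cycles.
    import numpy as np

    years = 20
    incidence = {0: 0.002, 1: 0.006}            # annual disease incidence by risk-factor state
    mortality = {"healthy": 0.01, "diseased": 0.03}

    def project(p_risk):
        """Project the diseased population given a fixed risk-factor prevalence."""
        healthy = np.array([1 - p_risk, p_risk]) * 100_000   # cohort split by risk state
        diseased = 0.0
        for _ in range(years):
            new_cases = healthy * [incidence[0], incidence[1]]
            healthy = (healthy - new_cases) * (1 - mortality["healthy"])
            diseased = (diseased + new_cases.sum()) * (1 - mortality["diseased"])
        return diseased

    print("diseased after 20y, reference   :", round(project(p_risk=0.30)))
    print("diseased after 20y, intervention:", round(project(p_risk=0.15)))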
Decision Support Tool Evaluation Report for General NOAA Oil Modeling Environment(GNOME) Version 2.0
NASA Technical Reports Server (NTRS)
Spruce, Joseph P.; Hall, Callie; Zanoni, Vicki; Blonski, Slawomir; D'Sa, Eurico; Estep, Lee; Holland, Donald; Moore, Roxzana F.; Pagnutti, Mary; Terrie, Gregory
2004-01-01
NASA's Earth Science Applications Directorate evaluated the potential of NASA remote sensing data and modeling products to enhance the General NOAA Oil Modeling Environment (GNOME) decision support tool. NOAA's Office of Response and Restoration (OR&R) Hazardous Materials (HAZMAT) Response Division is interested in enhancing GNOME with near-realtime (NRT) NASA remote sensing products on oceanic winds and ocean circulation. The NASA SeaWinds sea surface wind and Jason-1 sea surface height NRT products have potential, as do sea surface temperature and reflectance products from the Moderate Resolution Imaging Spectroradiometer and sea surface reflectance products from Landsat and the Advanced Spaceborne Thermal Emission and Reflectance Radiometer. HAZMAT is also interested in the Advanced Circulation model and the Ocean General Circulation Model. Certain issues must be considered, including lack of data continuity, marginal data redundancy, and data formatting problems. Spatial resolution is an issue for near-shore GNOME applications. Additional work will be needed to incorporate NASA inputs into GNOME, including verification and validation of data products, algorithms, models, and NRT data.
An Integrated Tool for Calculating and Reducing Institution Carbon and Nitrogen Footprints
Galloway, James N.; Castner, Elizabeth A.; Andrews, Jennifer; Leary, Neil; Aber, John D.
2017-01-01
Abstract The development of nitrogen footprint tools has allowed a range of entities to calculate and reduce their contribution to nitrogen pollution, but these tools represent just one aspect of environmental pollution. For example, institutions have been calculating their carbon footprints to track and manage their greenhouse gas emissions for over a decade. This article introduces an integrated tool that institutions can use to calculate, track, and manage their nitrogen and carbon footprints together. It presents the methodology for the combined tool, describes several metrics for comparing institution nitrogen and carbon footprint results, and discusses management strategies that reduce both the nitrogen and carbon footprints. The data requirements for the two tools overlap substantially, although integrating the two tools does necessitate the calculation of the carbon footprint of food. Comparison results for five institutions suggest that the institution nitrogen and carbon footprints correlate strongly, especially in the utilities and food sectors. Scenario analyses indicate benefits to both footprints from a range of utilities and food footprint reduction strategies. Integrating these two footprints into a single tool will account for a broader range of environmental impacts, reduce data entry and analysis, and promote integrated management of institutional sustainability. PMID:29350217
An Observer's View of the ORAC System at UKIRT
NASA Astrophysics Data System (ADS)
Wright, G. S.; Bridger, A. B.; Pickup, D. A.; Tan, M.; Folger, M.; Economou, F.; Adamson, A. J.; Currie, M. J.; Rees, N. P.; Purves, M.; Kackley, R. D.
The Observatory Reduction and Acquisition Control system (ORAC) was commissioned with its first instrument at the UK Infrared Telescope (UKIRT) in October 1999, and with all of the other UKIRT instrumentation this year. ORAC's advance preparation Observing Tool makes it simpler to prepare and carry out observations. Its Observing Manager gives observers excellent feedback on their observing as it goes along, reducing wasted time. The ORAC pipelined Data Reduction system produces near-publication quality reduced data at the telescope. ORAC is now in use for all observing at UKIRT, including flexibly scheduled nights and service observing. This paper provides an observer's perspective of the system and its performance.
NASA Astrophysics Data System (ADS)
Verrucci, Enrica; Bevington, John; Vicini, Alessandro
2014-05-01
A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely-sensed imagery with statistically-sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project. These are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event), and 2) recovery monitoring and evaluation (post-event) are discussed. Strategies for using the IDC Tools for these purposes are outlined. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, it is clear the IDCT tools have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities and resilience trends can be inferred. Lastly, this work draws attention to the use of the IDCT suite as an education resource for inspiring and training new students and engineers in the field of disaster risk reduction.
A parallel algorithm for multi-level logic synthesis using the transduction method. M.S. Thesis
NASA Technical Reports Server (NTRS)
Lim, Chieng-Fai
1991-01-01
The Transduction Method has been shown to be a powerful tool in the optimization of multilevel networks. Many tools such as the SYLON synthesis system (X90), (CM89), (LM90) have been developed based on this method. A parallel implementation is presented of SYLON-XTRANS (XM89) on an eight processor Encore Multimax shared memory multiprocessor. It minimizes multilevel networks consisting of simple gates through parallel pruning, gate substitution, gate merging, generalized gate substitution, and gate input reduction. This implementation, called Parallel TRANSduction (PTRANS), also uses partitioning to break large circuits up and performs inter- and intra-partition dynamic load balancing. With this, good speedups and high processor efficiencies are achievable without sacrificing the resulting circuit quality.
Wang, Duolin; Zeng, Shuai; Xu, Chunhui; Qiu, Wangren; Liang, Yanchun; Joshi, Trupti; Xu, Dong
2017-12-15
Computational methods for phosphorylation site prediction play important roles in protein function studies and experimental design. Most existing methods are based on feature extraction, which may result in incomplete or biased features. Deep learning as the cutting-edge machine learning method has the ability to automatically discover complex representations of phosphorylation patterns from the raw sequences, and hence it provides a powerful tool for improvement of phosphorylation site prediction. We present MusiteDeep, the first deep-learning framework for predicting general and kinase-specific phosphorylation sites. MusiteDeep takes raw sequence data as input and uses convolutional neural networks with a novel two-dimensional attention mechanism. It achieves over a 50% relative improvement in the area under the precision-recall curve in general phosphorylation site prediction and obtains competitive results in kinase-specific prediction compared to other well-known tools on the benchmark data. MusiteDeep is provided as an open-source tool available at https://github.com/duolinwang/MusiteDeep. xudong@missouri.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
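A simplified sketch of a sequence-window convolutional network for phosphosite prediction; it is not the MusiteDeep architecture (which adds a two-dimensional attention mechanism), and the window length, alphabet size and layer sizes are assumptions:

    # Simplified CNN over one-hot encoded residue windows (33 residues x 21 symbols assumed).
    import numpy as np
    from tensorflow.keras import layers, models

    window, alphabet = 33, 21
    model = models.Sequential([
        layers.Input(shape=(window, alphabet)),
        layers.Conv1D(64, 7, activation="relu"),
        layers.Dropout(0.3),
        layers.Conv1D(64, 5, activation="relu"),
        layers.GlobalMaxPooling1D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),        # probability the centre residue is phosphorylated
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # Synthetic stand-in data just to show the expected shapes.
    X = np.random.rand(256, window, alphabet)
    y = np.random.randint(0, 2, 256)
    model.fit(X, y, epochs=1, batch_size=32, verbose=0)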
Modeling and Reduction With Applications to Semiconductor Processing
1999-01-01
[Abstract not available in the source record; the extracted fragments are front matter (acknowledgements and a list of figures). The recoverable content refers to a general state-space model reduction methodology, to proper orthogonal decomposition and balancing approaches, and to the reduction problem as one of finding a systematic methodology within a given mathematical framework to produce an efficient or optimal trade-off.]
Hydrologic analysis for selection and placement of conservation practices at the watershed scale
NASA Astrophysics Data System (ADS)
Wilson, C.; Brooks, E. S.; Boll, J.
2012-12-01
When a water body is exceeding water quality standards and a Total Maximum Daily Load has been established, conservation practices in the watershed are able to reduce point and non-point source pollution. Hydrological analysis is needed to place conservation practices in the most hydrologically sensitive areas. The selection and placement of conservation practices, however, is challenging in ungauged watersheds with little or no data for the hydrological analysis. The objective of this research is to perform a hydrological analysis for mitigation of erosion and total phosphorus in a mixed land use watershed, and to select and place the conservation practices in the most sensitive areas. The study area is the Hangman Creek watershed in Idaho and Washington State, upstream of Long Lake (WA) reservoir, east of Spokane, WA. While the pollutant of concern is total phosphorus (TP), reductions in TP were translated to total suspended solids or reductions in nonpoint source erosion and sediment delivery to streams. Hydrological characterization was done with a simple web-based tool, which runs the Water Erosion Prediction Project (WEPP) model for representative land types in the watersheds, where a land type is defined as a unique combination of soil type, slope configuration, land use and management, and climate. The web-based tool used site-specific spatial and temporal data on land use, soil physical parameters, slope, and climate derived from readily available data sources and provided information on potential pollutant pathways (i.e. erosion, runoff, lateral flow, and percolation). Multiple land types representative in the watershed were ordered from most effective to least effective, and displayed spatially using GIS. The methodology for the Hangman Creek watershed was validated in the nearby Paradise Creek watershed that has long-term stream discharge and monitoring as well as land use data. Output from the web-based tool shows the potential reductions for different tillage practices, buffer strips, streamside management, and conversion to the conservation reserve program in the watershed. The output also includes the relationship between land area where conservation practices are placed and the potential reduction in pollution, showing the diminished returns on investment as less sensitive areas are being treated. This application of a simple web-based tool and the use of a physically-based erosion model (i.e. WEPP) illustrates that quantitative, spatial and temporal analysis of changes in pollutant loading and site-specific recommendations of conservation practices can be made in ungauged watersheds.
ORAC-DR: Pipelining With Other People's Code
NASA Astrophysics Data System (ADS)
Economou, Frossie; Bridger, Alan; Wright, Gillian S.; Jenness, Tim; Currie, Malcolm J.; Adamson, Andy
As part of the UKIRT ORAC project, we have developed a pipeline (orac-dr) for driving on-line data reduction using existing astronomical packages as algorithm engines and display tools. The design is modular and extensible on several levels, allowing it to be easily adapted to a wide variety of instruments. Here we briefly review the design, discuss the robustness and speed of execution issues inherent in such pipelines, and address what constitutes a desirable (in terms of ``buy-in'' effort) engine or tool.
A Comparative Analysis of Life-Cycle Assessment Tools for ...
We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are waste reduction mode (WARM), municipal solid waste-decision support tool (MSW-DST), solid waste optimization life-cycle framework (SWOLF), environmental assessment system for environmental technologies (EASETECH), and waste and resources assessment for the environment (WRATE). WARM, MSW-DST, and SWOLF were developed for US-specific materials management strategies, while WRATE and EASETECH were developed for European-specific conditions. All of the tools (with the exception of WARM) allow specification of a wide variety of parameters (e.g., materials composition and energy mix) to a varying degree, thus allowing users to model specific EOL materials management methods even outside the geographical domain they are originally intended for. The flexibility to accept user-specified input for a large number of parameters increases the level of complexity and the skill set needed for using these tools. The tools were evaluated and compared based on a series of criteria, including general tool features, the scope of the analysis (e.g., materials and processes included), and the impact categories analyzed (e.g., climate change, acidification). A series of scenarios representing materials management problems currently relevant to c
Insights into early lithic technologies from ethnography
Hayden, Brian
2015-01-01
Oldowan lithic assemblages are often portrayed as a product of the need to obtain sharp flakes for cutting into animal carcases. However, ethnographic and experimental research indicates that the optimal way to produce flakes for such butchering purposes is via bipolar reduction of small cryptocrystalline pebbles rather than from larger crystalline cores resembling choppers. Ethnographic observations of stone tool-using hunter-gatherers in environments comparable with early hominins indicate that most stone tools (particularly chopper forms and flake tools) were used for making simple shaft tools including spears, digging sticks and throwing sticks. These tools bear strong resemblances to Oldowan stone tools. Bipolar reduction for butchering probably preceded chopper-like core reduction and provides a key link between primate nut-cracking technologies and the emergence of more sophisticated lithic technologies leading to the Oldowan. PMID:26483534
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, Christopher A.
In this dissertation the possibility that chaos and simple determinism are governing the dynamics of reversed field pinch (RFP) plasmas is investigated. To properly assess this possibility, data from both numerical simulations and experiment are analyzed. A large repertoire of nonlinear analysis techniques is used to identify low dimensional chaos in the data. These tools include phase portraits and Poincare sections, correlation dimension, the spectrum of Lyapunov exponents and short term predictability. In addition, nonlinear noise reduction techniques are applied to the experimental data in an attempt to extract any underlying deterministic dynamics. Two model systems are used to simulatemore » the plasma dynamics. These are the DEBS code, which models global RFP dynamics, and the dissipative trapped electron mode (DTEM) model, which models drift wave turbulence. Data from both simulations show strong indications of low dimensional chaos and simple determinism. Experimental date were obtained from the Madison Symmetric Torus RFP and consist of a wide array of both global and local diagnostic signals. None of the signals shows any indication of low dimensional chaos or low simple determinism. Moreover, most of the analysis tools indicate the experimental system is very high dimensional with properties similar to noise. Nonlinear noise reduction is unsuccessful at extracting an underlying deterministic system.« less
Szymańska, Ewa; Tinnevelt, Gerjen H; Brodrick, Emma; Williams, Mark; Davies, Antony N; van Manen, Henk-Jan; Buydens, Lutgarde M C
2016-08-05
Current challenges of clinical breath analysis include large data size and non-clinically relevant variations observed in exhaled breath measurements, which should be urgently addressed with competent scientific data tools. In this study, three different baseline correction methods are evaluated within a previously developed data size reduction strategy for multi capillary column - ion mobility spectrometry (MCC-IMS) datasets. Introduced for the first time in breath data analysis, the Top-hat method is presented as the optimum baseline correction method. A refined data size reduction strategy is employed in the analysis of a large breathomic dataset on a healthy and respiratory disease population. New insights into MCC-IMS spectra differences associated with respiratory diseases are provided, demonstrating the additional value of the refined data analysis strategy in clinical breath analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
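Morphological Top-hat filtering can be sketched on a synthetic one-dimensional spectrum: an opening with a wide flat structuring element estimates the slowly varying baseline, and subtracting it leaves the peaks. The element width below is an assumed tuning parameter and must exceed the peak width; this is not the study's MCC-IMS processing code:

    # Minimal Top-hat baseline correction on a synthetic 1-D spectrum.
    import numpy as np
    from scipy.ndimage import white_tophat

    x = np.linspace(0, 10, 2000)
    baseline = 0.5 + 0.3 * x                             # slowly drifting baseline
    peaks = np.exp(-0.5 * ((x - 3) / 0.05) ** 2) + 0.6 * np.exp(-0.5 * ((x - 7) / 0.05) ** 2)
    signal = baseline + peaks + 0.01 * np.random.randn(x.size)

    corrected = white_tophat(signal, size=101)           # opening with a 101-sample flat element
    print("baseline residual after correction:", corrected[(x > 4) & (x < 6)].mean())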
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
Peta-scale computing environments pose significant challenges for both system and application developers and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user interface set of tools. This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting edge systems. Work done under this project at Wisconsin can be divided into two categories, new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.
The GONG Data Reduction and Analysis System. [solar oscillations
NASA Technical Reports Server (NTRS)
Pintar, James A.; Andersen, Bo Nyborg; Andersen, Edwin R.; Armet, David B.; Brown, Timothy M.; Hathaway, David H.; Hill, Frank; Jones, Harrison P.
1988-01-01
Each of the six GONG observing stations will produce three, 16-bit, 256×256 images of the Sun every 60 sec of sunlight. These data will be transferred from the observing sites to the GONG Data Management and Analysis Center (DMAC), in Tucson, on high-density tapes at a combined rate of over 1 gigabyte per day. The contemporaneous processing of these data will produce several standard data products and will require a sustained throughput in excess of 7 megaflops. Peak rates may exceed 50 megaflops. Archives will accumulate at the rate of approximately 1 terabyte per year, reaching nearly 3 terabytes in 3 yr of observing. Researchers will access the data products with a machine-independent GONG Reduction and Analysis Software Package (GRASP). Based on the Image Reduction and Analysis Facility, this package will include database facilities and helioseismic analysis tools. Users may access the data as visitors in Tucson, or may access DMAC remotely through networks, or may process subsets of the data at their local institutions using GRASP or other systems of their choice. Elements of the system will reach the prototype stage by the end of 1988. Full operation is expected in 1992 when data acquisition begins.
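A back-of-envelope check of the quoted combined data rate, assuming raw 16-bit pixels, no compression or header overhead, and roughly ten hours of sunlight per station per day (all assumptions, not figures from the project):

    # Rough data-volume check for the six-station network under the stated assumptions.
    bytes_per_image = 256 * 256 * 2                      # 16-bit, 256x256 pixels
    images_per_min, stations = 3, 6
    sunlight_min_per_day = 10 * 60                       # assumed average per station
    total = bytes_per_image * images_per_min * sunlight_min_per_day * stations
    print(f"~{total / 1e9:.2f} GB/day combined")         # ~1.4 GB, consistent with "over 1 gigabyte per day"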
Liaw, Siaw-Teng; Powell-Davies, Gawaine; Pearce, Christopher; Britt, Helena; McGlynn, Lisa; Harris, Mark F
2016-03-01
With increasing computerisation in general practice, national primary care networks are mooted as sources of data for health services and population health research and planning. Existing data collection programs - MedicinesInsight, Improvement Foundation, Bettering the Evaluation and Care of Health (BEACH) - vary in purpose, governance, methodologies and tools. General practitioners (GPs) have significant roles as collectors, managers and users of electronic health record (EHR) data. They need to understand the challenges to their clinical and managerial roles and responsibilities. The aim of this article is to examine the primary and secondary use of EHR data, identify challenges, discuss solutions and explore directions. Representatives from existing programs, Medicare Locals, Local Health Districts and research networks held workshops on the scope, challenges and approaches to the quality and use of EHR data. Challenges included data quality, interoperability, fragmented governance, proprietary software, transparency, sustainability, competing ethical and privacy perspectives, and cognitive load on patients and clinicians. Proposed solutions included effective change management; transparent governance and management of intellectual property, data quality, security, ethical access, and privacy; common data models, metadata and tools; and patient/community engagement. Collaboration and common approaches to tools, platforms and governance are needed. Processes and structures must be transparent and acceptable to GPs.
Development of a personalized decision aid for breast cancer risk reduction and management.
Ozanne, Elissa M; Howe, Rebecca; Omer, Zehra; Esserman, Laura J
2014-01-14
Breast cancer risk reduction has the potential to decrease the incidence of the disease, yet remains underused. We report on the development of a web-based tool that provides automated risk assessment and personalized decision support designed for collaborative use between patients and clinicians. Under Institutional Review Board approval, we evaluated the decision tool through a patient focus group, usability testing, and provider interviews (including breast specialists, primary care physicians, genetic counselors). This included demonstrations and data collection at two scientific conferences (2009 International Shared Decision Making Conference, 2009 San Antonio Breast Cancer Symposium). Overall, the evaluations were favorable. The patient focus group evaluations and usability testing (N = 34) provided qualitative feedback about format and design; 88% of these participants found the tool useful and 94% found it easy to use. 91% of the providers (N = 23) indicated that they would use the tool in their clinical setting. BreastHealthDecisions.org represents a new approach to breast cancer prevention care and a framework for high quality preventive healthcare. The ability to integrate risk assessment and decision support in real time will allow for informed, value-driven, and patient-centered breast cancer prevention decisions. The tool is being further evaluated in the clinical setting.
Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory
NASA Astrophysics Data System (ADS)
Veal, William R.; Taylor, Dawne; Rogers, Amy L.
2009-03-01
Self-reflection is a tool of instruction that has been used in the science classroom. Research has shown great promise in using video as a learning tool in the classroom. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills has not been done. Immediate video feedback and direct instruction were employed in a general chemistry laboratory course to improve students' mastery and understanding of basic and advanced process skills. Qualitative results and statistical analysis of quantitative data proved that self-reflection significantly helped students develop basic and advanced process skills, yet did not seem to influence the general understanding of the science content.
Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)
2005-02-01
[Abstract not available in the source record; the extracted fragments are front matter listing the report's scope: Model Order Reduction (MOR) tools, system-level mixed-signal circuit synthesis and optimization tools, parasitic extraction tools, and fast time-domain mixed-signal circuit simulation (HAARSPICE algorithms), within an IC design flow for command and control mission areas.]
acdc – Automated Contamination Detection and Confidence estimation for single-cell genome data
Lux, Markus; Kruger, Jan; Rinke, Christian; ...
2016-12-20
A major obstacle in single-cell sequencing is sample contamination with foreign DNA. To guarantee clean genome assemblies and to prevent the introduction of contamination into public databases, considerable quality control efforts are put into post-sequencing analysis. Contamination screening generally relies on reference-based methods such as database alignment or marker gene search, which limits the set of detectable contaminants to organisms with closely related reference species. As genomic coverage in the tree of life is highly fragmented, there is an urgent need for a reference-free methodology for contaminant identification in sequence data. We present acdc, a tool specifically developed to aid the quality control process of genomic sequence data. By combining supervised and unsupervised methods, it reliably detects both known and de novo contaminants. First, 16S rRNA gene prediction and the inclusion of ultrafast exact alignment techniques allow sequence classification using existing knowledge from databases. Second, reference-free inspection is enabled by the use of state-of-the-art machine learning techniques that include fast, non-linear dimensionality reduction of oligonucleotide signatures and subsequent clustering algorithms that automatically estimate the number of clusters. The latter also enables the removal of any contaminant, yielding a clean sample. Furthermore, given the data complexity and the ill-posedness of clustering, acdc employs bootstrapping techniques to provide statistically profound confidence values. Tested on a large number of samples from diverse sequencing projects, our software is able to quickly and accurately identify contamination. Results are displayed in an interactive user interface. Acdc can be run from the web as well as a dedicated command line application, which allows easy integration into large sequencing project analysis workflows. Acdc can reliably detect contamination in single-cell genome data. In addition to database-driven detection, it complements existing tools by its unsupervised techniques, which allow for the detection of de novo contaminants. Our contribution has the potential to drastically reduce the amount of resources put into these processes, particularly in the context of limited availability of reference species. As single-cell genome data continues to grow rapidly, acdc adds to the toolkit of crucial quality assurance tools.
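The unsupervised, reference-free part of the workflow can be caricatured as: per-contig oligonucleotide signatures, dimensionality reduction, then clustering with an automatically chosen cluster count. The sketch below uses tetranucleotide frequencies, PCA and a silhouette-selected k-means instead of acdc's BH-SNE and bootstrapped confidence values, on synthetic GC-biased "contigs":

    # Reference-free contamination screening sketch (not the acdc pipeline itself).
    from itertools import product
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    KMERS = ["".join(p) for p in product("ACGT", repeat=4)]

    def signature(seq):
        counts = np.array([seq.count(k) for k in KMERS], dtype=float)   # non-overlapping counts, fine for a sketch
        return counts / max(counts.sum(), 1.0)

    # Synthetic "contigs": two GC-biased sources standing in for host + contaminant.
    rng = np.random.default_rng(3)
    contigs = ["".join(rng.choice(list("ACGT"), 5000, p=p))
               for p in [[0.3, 0.2, 0.2, 0.3]] * 30 + [[0.15, 0.35, 0.35, 0.15]] * 10]

    X = PCA(n_components=10).fit_transform(np.array([signature(c) for c in contigs]))
    best_k = max(range(2, 6), key=lambda k: silhouette_score(X, KMeans(k, n_init=10).fit_predict(X)))
    labels = KMeans(best_k, n_init=10).fit_predict(X)
    print("estimated clusters:", best_k, "sizes:", np.bincount(labels))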
Going with the flow: using species-discharge relationships to forecast losses in fish biodiversity.
Xenopoulos, Marguerite A; Lodge, David M
2006-08-01
In response to the scarcity of tools to make quantitative forecasts of the loss of aquatic species from anthropogenic effects, we present a statistical model that relates fish species richness to river discharge. Fish richness increases logarithmically with discharge, an index of habitat space, similar to a species-area curve in terrestrial systems. We apply the species-discharge model as a forecasting tool to build scenarios of changes in riverine fish richness from climate change, water consumption, and other anthropogenic drivers that reduce river discharge. Using hypothetical reductions in discharges (of magnitudes that have been observed in other rivers), we predict that reductions of 20-90% in discharge would result in losses of 2-38% of the fish species in two biogeographical regions in the United States (Lower Ohio-Upper Mississippi and Southeastern). Additional data on the occurrence of specific species relative to specific discharge regimes suggests that fishes found exclusively in high discharge environments (e.g., Shovelnose sturgeon) would be most vulnerable to reductions in discharge. Lag times in species extinctions after discharge reduction provide a window of opportunity for conservation efforts. Applications of the species-discharge model can help prioritize such management efforts among species and rivers.
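A hedged sketch of the species-discharge approach: fit a log-log (power-law) richness-discharge relationship on synthetic rivers, then forecast the proportional richness loss for a given discharge reduction. The coefficients are invented, not those of the paper:

    # Fit S = c * Q^z on synthetic data, then forecast richness loss under discharge reduction.
    import numpy as np

    rng = np.random.default_rng(4)
    discharge = 10 ** rng.uniform(0, 4, 50)                 # m^3/s, 50 hypothetical rivers
    richness = np.round(5 * discharge ** 0.25 * np.exp(rng.normal(0, 0.2, 50)))

    # S = c * Q^z  <=>  log S = log c + z log Q
    z, logc = np.polyfit(np.log(discharge), np.log(richness), 1)
    print("fitted exponent z:", round(z, 2))

    def predicted_loss(fraction_reduced):
        """Fractional loss in richness when discharge drops by `fraction_reduced`."""
        return 1 - (1 - fraction_reduced) ** z

    for f in (0.2, 0.5, 0.9):
        print(f"{int(f*100)}% discharge reduction -> ~{predicted_loss(f)*100:.0f}% fewer species")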
40 CFR Table 6 of Subpart Bbbbbbb... - General Provisions
Code of Federal Regulations, 2010 CFR
2010-07-01
... under which performance tests must be conducted. § 63.7(e)(2)-(4) Conduct of Performance Tests and Data Reduction Yes. § 63.7(f)-(h) Use of Alternative Test Method; Data Analysis, Recordkeeping, and Reporting...
Robust prediction of protein subcellular localization combining PCA and WSVMs.
Tian, Jiang; Gu, Hong; Liu, Wenqi; Gao, Chiyang
2011-08-01
Automated prediction of protein subcellular localization is an important tool for genome annotation and drug discovery, and Support Vector Machines (SVMs) can effectively solve this problem in a supervised manner. However, the datasets obtained from real experiments are likely to contain outliers or noises, which can lead to poor generalization ability and classification accuracy. To explore this problem, we adopt strategies to lower the effect of outliers. First we design a method based on Weighted SVMs, different weights are assigned to different data points, so the training algorithm will learn the decision boundary according to the relative importance of the data points. Second we analyse the influence of Principal Component Analysis (PCA) on WSVM classification, propose a hybrid classifier combining merits of both PCA and WSVM. After performing dimension reduction operations on the datasets, kernel-based possibilistic c-means algorithm can generate more suitable weights for the training, as PCA transforms the data into a new coordinate system with largest variances affected greatly by the outliers. Experiments on benchmark datasets show promising results, which confirms the effectiveness of the proposed method in terms of prediction accuracy. Copyright © 2011 Elsevier Ltd. All rights reserved.
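A minimal sketch of the PCA-then-weighted-SVM idea: reduce dimensionality, derive per-sample weights that down-weight likely outliers, and pass them to the SVM. The weighting below is a simple distance-to-centroid heuristic rather than the paper's kernel-based possibilistic c-means:

    # PCA followed by an SVM trained with outlier-suppressing sample weights.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=600, n_features=50, n_informative=10, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    pca = PCA(n_components=10).fit(X_tr)
    Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)

    # Down-weight points far from their class centroid in PCA space.
    weights = np.ones(len(Z_tr))
    for c in np.unique(y_tr):
        d = np.linalg.norm(Z_tr[y_tr == c] - Z_tr[y_tr == c].mean(axis=0), axis=1)
        weights[y_tr == c] = 1.0 / (1.0 + (d / d.mean()) ** 2)

    clf = SVC(kernel="rbf").fit(Z_tr, y_tr, sample_weight=weights)
    print("test accuracy:", clf.score(Z_te, y_te))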
Neurodynamic evaluation of hearing aid features using EEG correlates of listening effort.
Bernarding, Corinna; Strauss, Daniel J; Hannemann, Ronny; Seidler, Harald; Corona-Strauss, Farah I
2017-06-01
In this study, we propose a novel estimate of listening effort using electroencephalographic data. This method is a translation of our past findings, gained from the evoked electroencephalographic activity, to the oscillatory EEG activity. To test this technique, electroencephalographic data from experienced hearing aid users with moderate hearing loss were recorded, wearing hearing aids. The investigated hearing aid settings were: a directional microphone combined with a noise reduction algorithm in a medium and a strong setting, the noise reduction setting turned off, and a setting using omnidirectional microphones without any noise reduction. The results suggest that the electroencephalographic estimate of listening effort seems to be a useful tool to map the exerted effort of the participants. In addition, the results indicate that a directional processing mode can reduce the listening effort in multitalker listening situations.
Comparison of quality control software tools for diffusion tensor imaging.
Liu, Bilan; Zhu, Tong; Zhong, Jianhui
2015-04-01
Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools are being developed and are widely used and each has its different tradeoffs, there is still no general agreement on an image quality control routine for DTIs, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will be helpful for the users to make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools including DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institute of Health). Both synthetic and in vivo human brain data were used to quantify adverse effects of major DTI artifacts to tensor calculation as well as the effectiveness of different QC tools in identifying and correcting these artifacts. The technical basis of each tool was discussed, and the ways in which particular techniques affect the output of each of the tools were analyzed. The different functions and I/O formats that three QC tools provide for building a general DTI processing pipeline and integration with other popular image processing tools were also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.
Ernst, Christian; Szczesny, Andrea; Soderstrom, Naomi; Siegmund, Frank; Schleppers, Alexander
2012-09-01
One of the declared objectives of surgical suite management in Germany is to increase operating room (OR) efficiency by reducing tardiness of first case of the day starts. We analyzed whether the introduction of OR management tools by German hospitals in response to increasing economic pressure was successful in achieving this objective. The OR management tools we considered were the appointment of an OR manager and the development and adoption of a surgical suite governance document (OR charter). We hypothesized that tardiness of first case starts was lower in ORs that had adopted one or both of these tools. Drawing on representative 2005 survey data from 107 German anesthesiology departments, we used a Tobit model to estimate the effect of the introduction of an OR manager or OR charter on tardiness of first case starts, while controlling for hospital size and surgical suite complexity. Adoption reduced tardiness of first case starts by at least 7 minutes (mean reduction 15 minutes, 95% confidence interval (CI): 7-22 minutes, P < 0.001). Reductions in tardiness of first case starts figure prominently among the objectives of surgical suite management in Germany. Our results suggest that the appointment of an OR manager or the adoption of an OR charter support this objective. For short-term decision making on the day of surgery, this reduction in tardiness may have economic implications, because it reduces overutilized OR time.
77 FR 51807 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-27
... Minimum Data Elements (MDEs) for the National Breast and Cervical Cancer Early Detection Program (NBCCEDP... screening and early detection tests for breast and cervical cancer. Mammography is extremely valuable as an early detection tool because it can detect breast cancer well before the woman can feel the lump, when...
Holt, Tim A; Thorogood, Margaret; Griffiths, Frances; Munday, Stephen; Friede, Tim; Stables, David
2010-01-01
Background: Primary care databases contain cardiovascular disease risk factor data, but practical tools are required to improve identification of at-risk patients. Aim: To test the effects of a system of electronic reminders (the ‘e-Nudge’) on cardiovascular events and the adequacy of data for cardiovascular risk estimation. Design of study: Randomised controlled trial. Setting: Nineteen general practices in the West Midlands, UK. Method: The e-Nudge identifies four groups of patients aged over 50 years on the basis of estimated cardiovascular risk and adequacy of risk factor data in general practice computers. Screen messages highlight individuals at raised risk and prompt users to complete risk profiles where necessary. The proportion of the study population in the four groups was measured, as well as the rate of cardiovascular events in each arm after 2 years. Results: Over 38 000 patients' electronic records were randomised. The intervention led to an increase in the proportion of patients with sufficient data who were identifiably at risk, with a difference of 1.94% compared to the control group (95% confidence interval [CI] = 1.38 to 2.50, P<0.001). A corresponding reduction occurred in the proportion potentially at risk but requiring further data for a risk estimation (difference = –3.68%, 95% CI = –4.53 to –2.84, P<0.001). No significant difference was observed in the incidence of cardiovascular events (rate ratio = 0.96, 95% CI = 0.85 to 1.10, P = 0.59). Conclusion: Automated electronic reminders using routinely collected primary care data can improve the adequacy of cardiovascular risk factor information during everyday practice and increase the visibility of the at-risk population. PMID:20353659
Pérez Vaquero, M Á; Gorria, C; Lezaun, M; López, F J; Monge, J; Eguizabal, C; Vesga, M A
2016-05-01
The management of platelet concentrate (PC) stocks is not simple, given their short shelf life and variable demand. In general, managers decide on PC production based on personal experience. The objective of this study was to provide a tool to help decide, in a more rational and objective way, how many PC units to produce each day. From the historical data on PCs produced, transfused and discarded in the Basque Country in 2012, a mathematical model was built, based on the normality of the time series of the transfusions performed on each day of the week throughout the year. This model was implemented in an easy-to-use Excel spreadsheet and validated using real production data from 2013. Compared with the real 2013 data, in the best scenario, the number of PC units that expired was 87.7% lower, PC production was 14.3% lower and the age of the PCs transfused was nearly 1 day younger in the simulation. If a minimum stock is to be ensured at the end of each day, the outdating rate and the average age of the transfused PCs progressively increase. The practical application of the designed tool can facilitate decision-making about how many PC units to produce each day, resulting in very significant reductions in PC production and wastage and corresponding cost savings, together with an almost 1-day decrease in the mean age of PCs transfused. © 2016 The Authors. Vox Sanguinis published by John Wiley & Sons Ltd on behalf of International Society of Blood Transfusion.
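As an illustration of the kind of rule such a spreadsheet could implement, the sketch below sets each day's production so that, under a normal model of that weekday's transfusion demand, stock covers demand at a chosen service level. The weekday means, standard deviations, service level and minimum-stock figures are invented for the example and are not the actual parameters of the Basque Country model.

```python
from scipy.stats import norm

def units_to_produce(weekday, current_stock, weekday_mean, weekday_std,
                     service_level=0.95, min_end_stock=10):
    """Suggest how many platelet concentrate (PC) units to produce today.

    Demand for each weekday is modelled as Normal(mean, std); production tops the
    current stock up to the demand quantile at `service_level` plus a safety stock.
    """
    demand_quantile = norm.ppf(service_level,
                               loc=weekday_mean[weekday],
                               scale=weekday_std[weekday])
    target = demand_quantile + min_end_stock
    return max(0, int(round(target - current_stock)))

# Illustrative weekday demand statistics (units/day), Monday..Sunday
mean = {"Mon": 42, "Tue": 45, "Wed": 44, "Thu": 46, "Fri": 48, "Sat": 20, "Sun": 15}
std = {d: 0.2 * m for d, m in mean.items()}
print(units_to_produce("Fri", current_stock=30, weekday_mean=mean, weekday_std=std))
```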
Theophilus, Eugenia H; Coggins, Christopher R E; Chen, Peter; Schmidt, Eckhardt; Borgerding, Michael F
2015-03-01
Tobacco toxicant-related exposure reduction is an important tool in harm reduction. Cigarette per day reduction (CPDR) occurs as smokers migrate from smoking cigarettes to using alternative tobacco/nicotine products, or quit smoking. Few reports characterize the dose-response relationships between CPDR and effects on exposure biomarkers, especially at the low end of CPD exposure (e.g., 5 CPD). We present data on CPDR by characterizing the magnitudes of biomarker reductions. We present data from a well-controlled, one-week clinical confinement study in healthy smokers who were switched from smoking 19-25 CPD to smoking 20, 10, 5 or 0 CPD. Biomarkers were measured in blood, plasma, urine, and breath, and included smoke-related toxicants, urine mutagenicity, smoked cigarette filter analyses (mouth level exposure), and vital signs. Many of the biomarkers (e.g., plasma nicotine) showed strong CPDR dose-response reductions, while others (e.g., plasma thiocyanate) showed weaker dose-response reductions. Factors that lead to lower biomarker reductions include non-CPD-related contributors to the measured response (e.g., other exposure sources from environment, lifestyle or occupation; inter-individual variability). This study confirms CPDR dose-responsive biomarkers and suggests that a one-week design is appropriate for characterizing exposure reductions when smokers switch from cigarettes to new tobacco products. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi
2016-01-01
Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method had not been assessed, nor had it been implemented as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
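Since PAEA is described as resting on principal angles between subspaces, the sketch below shows the underlying linear-algebra step on toy data: the data's leading expression directions and a gene-set indicator are represented as column-space bases and their principal angles are computed with SciPy. This illustrates the geometric idea only, not the PAEA implementation, and the "alignment" summary at the end is an ad hoc stand-in for an enrichment score.

```python
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(0)

# Toy data: expression profiles (genes x samples) and a candidate gene set
n_genes, n_samples = 500, 12
expr = rng.normal(size=(n_genes, n_samples))
gene_set = rng.choice(n_genes, size=25, replace=False)

# Subspace spanned by the leading principal directions of the centred data
u_full, _, _ = np.linalg.svd(expr - expr.mean(axis=1, keepdims=True),
                             full_matrices=False)
data_subspace = u_full[:, :3]                        # genes x 3 basis

# One-dimensional indicator subspace for the gene set
indicator = np.zeros((n_genes, 1))
indicator[gene_set, 0] = 1.0

angles = subspace_angles(data_subspace, indicator)   # principal angles in radians
score = np.cos(angles).max()                         # ad hoc "alignment" summary
print(np.degrees(angles), round(float(score), 3))
```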
Atmospheric Delay Reduction Using KARAT for GPS Analysis and Implications for VLBI
NASA Technical Reports Server (NTRS)
Ichikawa, Ryuichi; Hobiger, Thomas; Koyama, Yasuhiro; Kondo, Tetsuro
2010-01-01
We have been developing a state-of-the-art tool to estimate atmospheric path delays by raytracing through mesoscale analysis (MANAL) data, which are operationally used for numerical weather prediction by the Japan Meteorological Agency (JMA). The tools, which we have named the KAshima RAytracing Tools (KARAT), are capable of calculating total slant delays and ray-bending angles, taking real atmospheric phenomena into account. KARAT can estimate atmospheric slant delays using either an analytical 2-D ray-propagation model by Thayer or a 3-D Eikonal solver. We compared PPP solutions obtained using KARAT with those obtained using the Global Mapping Function (GMF) and the Vienna Mapping Function 1 (VMF1) for GPS sites of GEONET (GPS Earth Observation Network System), operated by the Geographical Survey Institute (GSI). In our comparison, 57 GEONET stations were processed for the year 2008. The KARAT solutions are slightly better than the solutions using VMF1 and GMF with a linear gradient model for horizontal and height positions. Our results imply that KARAT is a useful tool for efficient reduction of atmospheric path delays in radio-based space geodetic techniques such as GNSS and VLBI.
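For readers unfamiliar with the conventional alternative to ray tracing, the sketch below evaluates the standard continued-fraction form of a tropospheric mapping function and converts zenith delays to a slant delay. The coefficients are placeholder values chosen to be of realistic magnitude, not the actual GMF/VMF1 coefficients, and the gradient term uses a simple approximation rather than any specific published gradient mapping.

```python
import numpy as np

def mapping_function(elev_rad, a, b, c):
    """Continued-fraction (Marini/Niell-style) mapping function m(e)."""
    s = np.sin(elev_rad)
    top = 1.0 + a / (1.0 + b / (1.0 + c))
    bottom = s + a / (s + b / (s + c))
    return top / bottom

def slant_delay(elev_deg, azim_deg, zhd, zwd, grad_n=0.0, grad_e=0.0,
                hyd=(1.2e-3, 2.9e-3, 62.6e-3), wet=(5.8e-4, 1.4e-3, 4.3e-2)):
    """Slant delay (m) from zenith hydrostatic/wet delays plus a linear gradient."""
    e = np.radians(elev_deg)
    a = np.radians(azim_deg)
    m_h = mapping_function(e, *hyd)   # placeholder hydrostatic coefficients
    m_w = mapping_function(e, *wet)   # placeholder wet coefficients
    # Simple gradient mapping m_w/tan(e); published models use refined forms
    gradient = m_w / np.tan(e) * (grad_n * np.cos(a) + grad_e * np.sin(a))
    return m_h * zhd + m_w * zwd + gradient

print(round(slant_delay(10.0, 45.0, zhd=2.3, zwd=0.15), 3), "m at 10 deg elevation")
```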
YersiniaBase: a genomic resource and analysis platform for comparative analysis of Yersinia.
Tan, Shi Yang; Dutta, Avirup; Jakubovics, Nicholas S; Ang, Mia Yang; Siow, Cheuk Chuen; Mutha, Naresh Vr; Heydari, Hamed; Wee, Wei Yee; Wong, Guat Jah; Choo, Siew Woh
2015-01-16
Yersinia is a genus of Gram-negative bacteria that includes serious pathogens such as Yersinia pestis, which causes plague, Yersinia pseudotuberculosis and Yersinia enterocolitica. The remaining species are generally considered non-pathogenic to humans, although there is evidence that at least some of them can cause occasional infections using mechanisms distinct from those of the more pathogenic species. With the advances in sequencing technologies, many Yersinia genomes have been sequenced. However, there is currently no specialized platform to hold the rapidly growing Yersinia genomic data and to provide analysis tools, particularly for comparative analyses, which are required to provide improved insights into their biology, evolution and pathogenicity. To facilitate ongoing and future research on Yersinia, especially the generally non-pathogenic species, a well-defined repository and analysis platform is needed to hold the Yersinia genomic data and analysis tools for the Yersinia research community. Hence, we have developed YersiniaBase, a robust and user-friendly Yersinia resource and analysis platform for the analysis of Yersinia genomic data. YersiniaBase holds a total of twelve species and 232 genome sequences, the majority of which are Yersinia pestis. In order to smooth the process of searching genomic data in a large database, we implemented an Asynchronous JavaScript and XML (AJAX)-based real-time search system in YersiniaBase. Besides incorporating existing tools, which include the JavaScript-based genome browser (JBrowse) and the Basic Local Alignment Search Tool (BLAST), YersiniaBase also provides in-house developed tools: (1) Pairwise Genome Comparison tool (PGC) for comparing two user-selected genomes; (2) Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomics analysis of Yersinia genomes; and (3) YersiniaTree for constructing phylogenetic trees of Yersinia. We ran analyses based on the tools and genomic data in YersiniaBase, and the preliminary results showed differences in the virulence genes found in Yersinia pestis and Yersinia pseudotuberculosis compared with other Yersinia species, as well as differences between Yersinia enterocolitica subsp. enterocolitica and Yersinia enterocolitica subsp. palearctica. YersiniaBase offers free access to a wide range of genomic data and analysis tools for the analysis of Yersinia. YersiniaBase can be accessed at http://yersinia.um.edu.my .
AstroCloud: An Agile platform for data visualization and specific analyses in 2D and 3D
NASA Astrophysics Data System (ADS)
Molina, F. Z.; Salgado, R.; Bergel, A.; Infante, A.
2017-07-01
Nowadays, astronomers commonly run their own tools, or distributed computational packages, for data analysis and then visualize the results with generic applications. This chain of processes comes at a high cost: (a) analyses are applied manually and are therefore difficult to automate, and (b) data have to be serialized, increasing the cost of parsing and saving intermediary data. We are developing AstroCloud, an agile, multipurpose visualization platform intended for specific analyses of astronomical images (https://astrocloudy.wordpress.com). The platform incorporates domain-specific languages, which make it easily extensible. AstroCloud supports customized plug-ins, which translate into reduced time spent on data analysis. Moreover, it supports 2D and 3D rendering, including interactive features in real time. AstroCloud is under development; we are currently implementing different options for data reduction and physical analyses.
Insights into early lithic technologies from ethnography.
Hayden, Brian
2015-11-19
Oldowan lithic assemblages are often portrayed as a product of the need to obtain sharp flakes for cutting into animal carcases. However, ethnographic and experimental research indicates that the optimal way to produce flakes for such butchering purposes is via bipolar reduction of small cryptocrystalline pebbles rather than from larger crystalline cores resembling choppers. Ethnographic observations of stone tool-using hunter-gatherers in environments comparable with early hominins indicate that most stone tools (particularly chopper forms and flake tools) were used for making simple shaft tools including spears, digging sticks and throwing sticks. These tools bear strong resemblances to Oldowan stone tools. Bipolar reduction for butchering probably preceded chopper-like core reduction and provides a key link between primate nut-cracking technologies and the emergence of more sophisticated lithic technologies leading to the Oldowan. © 2015 The Author(s).
NASA Astrophysics Data System (ADS)
Ratner, Jacqueline; Pyle, David; Mather, Tamsin
2015-04-01
Structure-from-motion (SfM) techniques are now widely available to quickly and cheaply generate digital terrain models (DTMs) from optical imagery. Topography can change rapidly during disaster scenarios and alter the nature of local hazards, making ground-based SfM a particularly useful tool in hazard studies due to its low cost, accessibility, and potential for immediate deployment. Our study is designed to serve as an analogue to potential real-world use of the SfM method for disaster risk reduction purposes. Experiments at a volcanic crater in Santorini, Greece, used crowd-sourced data collection to demonstrate the impact of user expertise and randomization of SfM data on the resultant DTM. Three groups of participants representing different expertise levels used 16 different camera models, including four camera phones, to collect 1001 photos in one hour of data collection. The datasets collected by each group were processed using the free and open-source software VisualSFM. The point densities and overall quality of the resultant SfM point clouds were compared against each other and against a LiDAR dataset as a reference to the industry standard. Our results show that the point clouds are resilient to changes in user expertise and collection method and are comparable, or even preferable, to LiDAR in data density. We find that crowd-sourced data collected by a moderately informed general public yield topography results comparable to those produced with data collected by experts. This means that in a real-world scenario involving participants with a diverse range of expertise levels, topography models could be produced from crowd-sourced data quite rapidly and to a very high standard. This could benefit disaster risk reduction as a relatively quick, simple, and low-cost method of attaining rapidly updated knowledge of terrain attributes, useful for the prediction and mitigation of many natural hazards.
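To make the point-cloud comparison concrete, the sketch below computes two simple metrics of the kind the study describes, point density and nearest-neighbour cloud-to-cloud distance, for two synthetic point clouds standing in for an SfM reconstruction and a LiDAR reference. It illustrates the evaluation step only, not the VisualSFM processing itself, and all data are generated on the spot.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

# Synthetic stand-ins: a "LiDAR" reference surface and a sparser, noisier "SfM" cloud
x, y = rng.uniform(0, 50, (2, 200_000))
lidar = np.column_stack([x, y, 0.05 * np.sin(x) + 0.02 * rng.normal(size=x.size)])
idx = rng.choice(lidar.shape[0], size=60_000, replace=False)
sfm = lidar[idx] + rng.normal(scale=0.03, size=(idx.size, 3))

def point_density(cloud, area_m2):
    """Points per square metre over the surveyed footprint."""
    return cloud.shape[0] / area_m2

def cloud_to_cloud_distance(cloud, reference):
    """Distance from each point in `cloud` to its nearest neighbour in `reference`."""
    tree = cKDTree(reference)
    d, _ = tree.query(cloud, k=1)
    return d

area = 50.0 * 50.0
d = cloud_to_cloud_distance(sfm, lidar)
print(f"SfM density:   {point_density(sfm, area):.1f} pts/m^2")
print(f"LiDAR density: {point_density(lidar, area):.1f} pts/m^2")
print(f"C2C distance:  mean {d.mean():.3f} m, 95th pct {np.quantile(d, 0.95):.3f} m")
```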
The environment power system analysis tool development program
NASA Technical Reports Server (NTRS)
Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.
1990-01-01
The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.
Britt, David W; Evans, Mark I
2007-12-01
Data are analyzed for 54 women who made an appointment with a North American Center specializing in multifetal pregnancy reduction (MFPR) to be counseled and possibly have a reduction. We examine how decision difficulty is affected by combinations of three frames through which patients may understand and consider their options and justify their decisions: a conceptional frame, marked by a belief that life begins at conception; a medical frame, marked by a belief in the statistics regarding risk and risk prevention through selective reduction; and a lifestyle frame, marked by a belief that a balance of children and career has normative value. All data were gathered through semi-structured interviews and observation during the visit to the center, over an average 2.5-hour period. Decision difficulty was indicated by self-assessed decision difficulty and by residual emotional turmoil surrounding the decision. Qualitative comparative analysis was used to analyze the impact of combinations of frames on decision difficulty. Separate analyses were conducted for women reducing only to three fetuses (or deciding not to reduce) and women who chose to reduce below three fetuses. Results indicated that for those with a non-intense conceptional frame, the decision was comparatively easy, whether the patients held high or low values of the medical and lifestyle frames. For those with an intense conceptional frame, the decision was almost uniformly difficult, with the exception of those who chose to reduce only to three fetuses. Simplifying the results to their most parsimonious scenarios oversimplifies them and precludes an understanding of how women can feel pulled in different directions by the dictates of the frames they hold. Variations in the characterization of intense medical frames, for example, can both pull toward reduction to two fetuses and neutralize shame and guilt by seeming to remove personal responsibility for the decision. We conclude that the examination of frame combinations is an important tool for understanding the way women carrying multiple fetuses negotiate their way through multi-fetal pregnancies, and that it may have more general relevance for understanding pregnancy decisions in context.
ORAC-DR: A generic data reduction pipeline infrastructure
NASA Astrophysics Data System (ADS)
Jenness, Tim; Economou, Frossie
2015-03-01
ORAC-DR is a general purpose data reduction pipeline system designed to be instrument and observatory agnostic. The pipeline works with instruments as varied as infrared integral field units, imaging arrays and spectrographs, and sub-millimeter heterodyne arrays and continuum cameras. This paper describes the architecture of the pipeline system and the implementation of the core infrastructure. We finish by discussing the lessons learned since the initial deployment of the pipeline system in the late 1990s.
ORAC-DR -- imaging data reduction
NASA Astrophysics Data System (ADS)
Currie, Malcolm J.; Cavanagh, Brad
ORAC-DR is a general-purpose automatic data-reduction pipeline environment. This document describes its use to reduce imaging data collected at the United Kingdom Infrared Telescope (UKIRT) with the UFTI, UIST, IRCAM, and Michelle instruments; at the Anglo-Australian Telescope (AAT) with the IRIS2 instrument; at the Very Large Telescope with ISAAC and NACO; from Magellan's Classic Cam, at Gemini with NIRI, and from the Isaac Newton Group using INGRID. It outlines the algorithms used and how to make minor modifications to them, and how to correct for errors made at the telescope.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, S; Ji, Y; Kim, K
Purpose: A diagnostic multileaf collimator (MLC) was designed for dose reduction in diagnostic radiography. Monte Carlo simulation was used to evaluate the efficiency of the shielding material used to produce the leaves of the MLC. Materials & Methods: A general radiography unit (Rex-650R, Listem, Korea) was modelled with Monte Carlo simulation (MCNPX, LANL, USA), and the SRS-78 program was used to calculate the energy spectrum for each tube voltage (80, 100, 120 kVp). The shielding material was SKD 11 alloy tool steel, composed of 1.6% carbon (C), 0.4% silicon (Si), 0.6% manganese (Mn), 5% chromium (Cr), 1% molybdenum (Mo), and vanadium (V); its density is 7.89 g/cm3. We simulated the leaves of the diagnostic MLC made of SKD 11 together with the general radiography unit, and calculated the efficiency of the diagnostic MLC using the tally 6 card of MCNPX as a function of energy. Results: The diagnostic MLC consisted of 25 individual metal shielding leaves on both sides, with dimensions of 10 × 0.5 × 0.5 cm3. The leaves of the MLC were controlled by motors positioned on both sides of the MLC. Depending on the energy (tube voltage), the shielding efficiency of the MLC in the Monte Carlo simulation was 99% (80 kVp), 96% (100 kVp) and 93% (120 kVp). Conclusion: We verified the efficiency of a diagnostic MLC fabricated from SKD 11 alloy tool steel. Based on these results, the diagnostic MLC was designed, and we will build it for dose reduction in diagnostic radiography.
Simrank: Rapid and sensitive general-purpose k-mer search tool
2011-01-01
Background: Terabyte-scale collections of string-encoded data are expected from consortia efforts such as the Human Microbiome Project (http://nihroadmap.nih.gov/hmp). Intra- and inter-project data similarity searches are enabled by rapid k-mer matching strategies. Software applications for sequence database partitioning, guide tree estimation, molecular classification and alignment acceleration have benefited from embedded k-mer searches as sub-routines. However, a rapid, general-purpose, open-source, flexible, stand-alone k-mer tool has not been available. Results: Here we present a stand-alone utility, Simrank, which allows users to rapidly identify the database strings most similar to query strings. Performance testing of Simrank and related tools against DNA, RNA, protein and human-language datasets found Simrank 10X to 928X faster, depending on the dataset. Conclusions: Simrank provides molecular ecologists with a high-throughput, open-source choice for comparing large sequence sets to find similarity. PMID:21524302
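As a concrete picture of what a k-mer search like Simrank computes, the sketch below ranks database sequences by the fraction of a query's k-mers they share. This is a plain Python illustration of the idea, not Simrank's own implementation or its on-disk index, and the toy sequences are invented.

```python
from collections import defaultdict

def kmers(seq, k=7):
    """Set of overlapping k-mers in a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def build_index(database, k=7):
    """Inverted index: k-mer -> ids of database sequences containing it."""
    index = defaultdict(set)
    for seq_id, seq in database.items():
        for km in kmers(seq, k):
            index[km].add(seq_id)
    return index

def rank(query, index, k=7):
    """Rank database sequences by the shared fraction of the query's k-mers."""
    qk = kmers(query, k)
    hits = defaultdict(int)
    for km in qk:
        for seq_id in index.get(km, ()):
            hits[seq_id] += 1
    return sorted(((seq_id, n / len(qk)) for seq_id, n in hits.items()),
                  key=lambda t: t[1], reverse=True)

db = {"A": "ACGTACGTACGTTTGACG", "B": "ACGTACGAACGTTAGACG", "C": "TTTTTTTTTTTTTTTTTT"}
index = build_index(db)
print(rank("ACGTACGTACGTTAGACG", index))
```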
NASA/IPAC Infrared Archive's General Image Cutouts Service
NASA Astrophysics Data System (ADS)
Alexov, A.; Good, J. C.
2006-07-01
The NASA/IPAC Infrared Science Archive (IRSA) "Cutouts" service (http://irsa.ipac.caltech.edu/applications/Cutouts) is a general tool for creating small "cutout" FITS images and JPEGs from collections of data archived at IRSA. This service is a companion to IRSA's Atlas tool (http://irsa.ipac.caltech.edu/applications/Atlas/), which currently serves over 25 different data collections of various sizes and complexity and returns entire images for a user-defined region of the sky. The Cutouts service sits on top of Atlas and extends its functionality by generating subimages at the locations and sizes requested by the user from images already identified by Atlas. These results can be downloaded individually, in batch mode (using the program wget), or as a tar file. Cutouts re-uses IRSA's software architecture along with the publicly available Montage mosaicking tools. The advantages and disadvantages of this approach to generic cutout serving will be discussed.
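For readers who want to reproduce the basic operation locally, the sketch below cuts a small stamp out of a FITS image with Astropy while preserving the WCS. This is an illustrative client-side equivalent, unrelated to the actual IRSA/Montage server implementation, and the file name, target position and cutout size are placeholders.

```python
# Minimal local cutout sketch using Astropy (assumes a FITS file with a valid WCS).
from astropy.io import fits
from astropy.wcs import WCS
from astropy.nddata import Cutout2D
from astropy.coordinates import SkyCoord
import astropy.units as u

with fits.open("example_image.fits") as hdul:             # placeholder file name
    data = hdul[0].data
    wcs = WCS(hdul[0].header)

position = SkyCoord(ra=150.1 * u.deg, dec=2.2 * u.deg)    # placeholder target
cutout = Cutout2D(data, position, size=2 * u.arcmin, wcs=wcs)

# Write the stamp with an updated header so the WCS stays correct
hdu = fits.PrimaryHDU(data=cutout.data, header=cutout.wcs.to_header())
hdu.writeto("example_cutout.fits", overwrite=True)
```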
Hagen, Espen; Ness, Torbjørn V; Khosrowshahi, Amir; Sørensen, Christina; Fyhn, Marianne; Hafting, Torkel; Franke, Felix; Einevoll, Gaute T
2015-04-30
New, silicon-based multielectrodes comprising hundreds or more electrode contacts offer the possibility to record spike trains from thousands of neurons simultaneously. This potential cannot be realized unless accurate, reliable automated methods for spike sorting are developed, which in turn requires benchmarking data sets with known ground-truth spike times. We here present a general simulation tool for computing benchmarking data for evaluation of spike-sorting algorithms entitled ViSAPy (Virtual Spiking Activity in Python). The tool is based on a well-established biophysical forward-modeling scheme and is implemented as a Python package built on top of the neuronal simulator NEURON and the Python tool LFPy. ViSAPy allows for arbitrary combinations of multicompartmental neuron models and geometries of recording multielectrodes. Three example benchmarking data sets are generated, i.e., tetrode and polytrode data mimicking in vivo cortical recordings and microelectrode array (MEA) recordings of in vitro activity in salamander retinas. The synthesized example benchmarking data mimic salient features of typical experimental recordings, for example, spike waveforms depending on interspike interval. ViSAPy goes beyond existing methods as it includes biologically realistic model noise, synaptic activation by recurrent spiking networks, finite-sized electrode contacts, and allows for inhomogeneous electrical conductivities. ViSAPy is optimized to allow for generation of long time series of benchmarking data, spanning minutes of biological time, by parallel execution on multi-core computers. ViSAPy is an open-ended tool as it can be generalized to produce benchmarking data for arbitrary recording-electrode geometries and with various levels of complexity. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
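The forward-modeling scheme the abstract refers to boils down to summing the extracellular potentials that each neuronal current source produces at each electrode contact. The sketch below shows the simplest version of that idea, a point-source approximation phi = I / (4 * pi * sigma * r) in a homogeneous medium. It is a didactic stand-in, not ViSAPy's or LFPy's API, and all positions, currents and the conductivity are invented; units are left nominal since only the structure of the computation matters here.

```python
import numpy as np

def extracellular_potential(source_pos, source_current, electrode_pos, sigma=0.3):
    """Point-source approximation of extracellular potentials.

    phi_j = sum_i I_i / (4 * pi * sigma * |r_j - r_i|)
    """
    src = np.atleast_2d(source_pos)                  # (n_sources, 3)
    ele = np.atleast_2d(electrode_pos)               # (n_contacts, 3)
    I = np.atleast_1d(source_current)                # (n_sources,)
    r = np.linalg.norm(ele[:, None, :] - src[None, :, :], axis=-1)
    r = np.maximum(r, 1.0)                           # avoid singularity at a contact
    return (I[None, :] / (4.0 * np.pi * sigma * r)).sum(axis=1)

# A toy sink/source pair and a 4-contact tetrode-like probe (positions in um)
sources = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 50.0]])
currents = np.array([-1.0, 1.0])
contacts = np.array([[20.0, 0.0, z] for z in (0.0, 25.0, 50.0, 75.0)])
print(extracellular_potential(sources, currents, contacts))
```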
The AAO fiber instrument data simulator
NASA Astrophysics Data System (ADS)
Goodwin, Michael; Farrell, Tony; Smedley, Scott; Heald, Ron; Heijmans, Jeroen; De Silva, Gayandhi; Carollo, Daniela
2012-09-01
The fiber instrument data simulator is an in-house software tool that simulates detector images of fiber-fed spectrographs developed by the Australian Astronomical Observatory (AAO). In addition to helping validate the instrument designs, the resulting simulated images are used to develop the required data reduction software. Example applications that have benefited from the tool usage are the HERMES and SAMI instrumental projects for the Anglo-Australian Telescope (AAT). Given the sophistication of these projects an end-to-end data simulator that accurately models the predicted detector images is required. The data simulator encompasses all aspects of the transmission and optical aberrations of the light path: from the science object, through the atmosphere, telescope, fibers, spectrograph and finally the camera detectors. The simulator runs under a Linux environment that uses pre-calculated information derived from ZEMAX models and processed data from MATLAB. In this paper, we discuss the aspects of the model, software, example simulations and verification.
ProphTools: general prioritization tools for heterogeneous biological networks.
Navarro, Carmen; Martínez, Victor; Blanco, Armando; Cano, Carlos
2017-12-01
Networks have been proven effective representations for the analysis of biological data. As such, there exist multiple methods to extract knowledge from biological networks. However, these approaches usually limit their scope to a single biological entity type of interest or they lack the flexibility to analyze user-defined data. We developed ProphTools, a flexible open-source command-line tool that performs prioritization on a heterogeneous network. ProphTools prioritization combines a Flow Propagation algorithm similar to a Random Walk with Restarts and a weighted propagation method. A flexible model for the representation of a heterogeneous network allows the user to define a prioritization problem involving an arbitrary number of entity types and their interconnections. Furthermore, ProphTools provides functionality to perform cross-validation tests, allowing users to select the best network configuration for a given problem. ProphTools core prioritization methodology has already been proven effective in gene-disease prioritization and drug repositioning. Here we make ProphTools available to the scientific community as flexible, open-source software and perform a new proof-of-concept case study on long noncoding RNAs (lncRNAs) to disease prioritization. ProphTools is robust prioritization software that provides the flexibility not present in other state-of-the-art network analysis approaches, enabling researchers to perform prioritization tasks on any user-defined heterogeneous network. Furthermore, the application to lncRNA-disease prioritization shows that ProphTools can reach the performance levels of ad hoc prioritization tools without losing its generality. © The Authors 2017. Published by Oxford University Press.
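To illustrate the propagation step at the core of tools like ProphTools, the sketch below runs a random walk with restarts on a small toy network: scores flow from seed nodes along column-normalized edges until convergence. This is the generic algorithm, not ProphTools' heterogeneous-network implementation, and the toy adjacency matrix is invented.

```python
import numpy as np

def random_walk_with_restarts(adj, seeds, restart=0.3, tol=1e-9, max_iter=1000):
    """Stationary scores of a random walk with restarts on an undirected graph.

    p_{t+1} = (1 - r) * W @ p_t + r * p_0, with W the column-normalized adjacency.
    """
    adj = np.asarray(adj, dtype=float)
    col_sums = adj.sum(axis=0)
    W = adj / np.where(col_sums == 0, 1.0, col_sums)
    p0 = np.zeros(adj.shape[0])
    p0[list(seeds)] = 1.0 / len(seeds)
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1.0 - restart) * W @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next
    return p

# Toy 6-node network; node 0 is the query, all nodes are ranked by propagated score
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 1, 0, 0],
              [1, 1, 0, 0, 1, 0],
              [0, 1, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 0],
              [0, 0, 0, 1, 0, 0]], dtype=float)
scores = random_walk_with_restarts(A, seeds=[0])
print(np.argsort(-scores), np.round(scores, 3))
```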
NASA Astrophysics Data System (ADS)
Li, Ziyu; Jia, Zhigang; Ni, Tao; Li, Shengbiao
2017-12-01
Natural cotton, featuring abundant oxygen-containing functional groups, has been utilized as a reductant to synthesize Ag nanoparticles on its surface. Through this facile and environment-friendly reduction process, a fibrous Ag/cotton composite (FAC) was conveniently synthesized. Various characterization techniques, including XRD, XPS, TEM, SEM, EDS and FT-IR, were used to study the material's microstructure and surface properties. The resulting FAC exhibited favorable activity for the catalytic reduction of 4-nitrophenol with a high reaction rate. Moreover, the fibrous Ag/cotton composites were capable of forming a desirable catalytic mat for catalysis with simultaneous product separation. Reactants passing through the mat could be catalytically transformed to product, which is of great significance for water treatment. The catalyst (FAC) is thus expected to have potential as a highly efficient, cost-effective and eco-friendly catalyst for industrial applications. More importantly, this newly developed synthetic methodology could serve as a general tool to design and synthesize other metal/biomass composite catalysts for a wider range of catalytic applications.
Maddineni, Satish B; Lau, Maurice M; Sangar, Vijay K
2009-08-08
Penile cancer is an uncommon malignancy with an incidence of 1 per 100,000. Conservative and radical treatments can be disfiguring and may have an impact on sexual function, quality of life (QOL), social interactions, self-image and self-esteem. Knowledge of how this disease affects patients is paramount to developing a global, multi-disciplinary approach to treatment. A Medline/PubMed literature search was conducted using the terms "sexual function penis cancer", "quality of life penis cancer" and "psychological effects penis cancer" from 1985 to 2008. Articles containing quantitative data on QOL, sexual function or psychological well-being were included. 128 patients from 6 studies were included. 5 studies contained retrospective data, whilst 1 study collected prospective data on erectile function. In the 6 studies, 13 different quantitative tools were used to assess psychological well-being, QOL and sexual function. The General Health Questionnaire (GHQ) showed impaired well-being in up to 40% of patients in 2 studies. Patients undergoing more mutilating treatments were more likely to have impaired well-being. The Hospital Anxiety and Depression Scale (HADS) demonstrated pathological anxiety in up to 31% of patients in 2 studies. 1 study used the Diagnostic and Statistical Manual of Mental Disorders (DSM III-R), with 53% of patients exhibiting mental illness, 25% avoidance behaviour and 40% impaired well-being; 12/30 suffered from post-traumatic stress disorder. The IIEF-15 was the commonest tool used to assess sexual function; results varied from 36% of patients in 1 study reporting no sexual function, to 67% in another reporting reduced sexual satisfaction, to 78% in another reporting high confidence with erections. The treatment of penile cancer results in negative effects on well-being in up to 40% of patients, with psychiatric symptoms in approximately 50%. Up to two-thirds of patients report a reduction in sexual function. This study demonstrates that penile cancer sufferers can exhibit significant psychological dysfunction, yet no standardised tools or interventional pathways are available. Therefore, there is a need to identify and assess adequate tools to measure psychological and sexual dysfunction in this group of patients.
Kling, Teresia; Johansson, Patrik; Sanchez, José; Marinescu, Voichita D.; Jörnsten, Rebecka; Nelander, Sven
2015-01-01
Statistical network modeling techniques are increasingly important tools to analyze cancer genomics data. However, current tools and resources are not designed to work across multiple diagnoses and technical platforms, thus limiting their applicability to comprehensive pan-cancer datasets such as The Cancer Genome Atlas (TCGA). To address this, we describe a new data driven modeling method, based on generalized Sparse Inverse Covariance Selection (SICS). The method integrates genetic, epigenetic and transcriptional data from multiple cancers, to define links that are present in multiple cancers, a subset of cancers, or a single cancer. It is shown to be statistically robust and effective at detecting direct pathway links in data from TCGA. To facilitate interpretation of the results, we introduce a publicly accessible tool (cancerlandscapes.org), in which the derived networks are explored as interactive web content, linked to several pathway and pharmacological databases. To evaluate the performance of the method, we constructed a model for eight TCGA cancers, using data from 3900 patients. The model rediscovered known mechanisms and contained interesting predictions. Possible applications include prediction of regulatory relationships, comparison of network modules across multiple forms of cancer and identification of drug targets. PMID:25953855
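For readers unfamiliar with sparse inverse covariance selection, the sketch below estimates a sparse precision matrix from simulated data with scikit-learn's GraphicalLasso and reads off the implied network edges. This generic single-dataset example does not reproduce the paper's generalized, multi-cancer extension, and the simulated chain graph is invented for illustration.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(2)

# Simulate data from a known sparse precision matrix (a chain graph on 5 variables)
p = 5
precision = np.eye(p) + np.diag([0.4] * (p - 1), 1) + np.diag([0.4] * (p - 1), -1)
cov = np.linalg.inv(precision)
X = rng.multivariate_normal(np.zeros(p), cov, size=2000)

model = GraphicalLasso(alpha=0.05).fit(X)
est_precision = model.precision_

# Edges = non-zero off-diagonal entries of the estimated precision matrix
edges = [(i, j) for i in range(p) for j in range(i + 1, p)
         if abs(est_precision[i, j]) > 1e-3]
print("recovered edges:", edges)
```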
Küster, Eberhard; Dorusch, Falk; Vogt, Carsten; Weiss, Holger; Altenburger, Rolf
2004-07-15
Success of groundwater remediation is typically monitored via snapshot analysis of selected chemical substances or physical parameters. Biological parameters, i.e. ecotoxicological assays, are rarely employed. Hence, the aim of the study was to develop a bioassay tool that allows on-line monitoring of contaminated groundwater as well as a toxicity reduction evaluation (TRE) of different remediation techniques in parallel, and that may furthermore be used as an additional process-control tool to supervise remediation techniques in real time. Parallel testing of groundwater remediation techniques was performed over short and long time periods, using the energy-dependent luminescence of the bacterium Vibrio fischeri as the biological monitoring parameter. One data point per hour for each remediation technique was generated by an automated biomonitor. The bacteria proved to be highly sensitive to the contaminated groundwater, and the biomonitor showed a long standing time despite the highly corrosive groundwater present in Bitterfeld, Germany. The bacterial biomonitor is demonstrated to be a valuable tool for remediation success evaluation. Dose-response relationships were generated for the six quantitatively dominant groundwater contaminants (2-chlorotoluene, 1,2- and 1,4-dichlorobenzene, monochlorobenzene, ethylbenzene and benzene). The concentrations of the individual volatile organic chemicals (VOCs) could not explain the observed effects in the bacteria. An expected mixture toxicity was calculated for the six components using the concept of concentration addition. The calculated EC(50) for the mixture was still one order of magnitude lower than the observed EC(50) of the actual groundwater. The results pointed out that chemical analysis of the six most abundant substances alone could not explain the effects observed with the bacteria. Thus, chemical analysis alone may not be an adequate tool for remediation success evaluation in terms of toxicity reduction.
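The concentration addition concept mentioned here predicts a mixture's effect concentration from the components' individual EC50 values and their proportions in the mixture, EC50_mix = (sum_i p_i / EC50_i)^(-1). The sketch below implements that formula; the EC50 values and proportions are invented placeholders, not the study's measured values.

```python
def concentration_addition_ec50(fractions, ec50s):
    """Mixture EC50 predicted by concentration addition.

    fractions: proportion of each component in the mixture (must sum to 1)
    ec50s:     individual EC50 of each component (same concentration units)
    """
    if abs(sum(fractions) - 1.0) > 1e-9:
        raise ValueError("mixture fractions must sum to 1")
    return 1.0 / sum(p / ec for p, ec in zip(fractions, ec50s))

# Placeholder values for a six-component mixture (mg/L), equal proportions
ec50s = [12.0, 8.0, 5.0, 20.0, 9.0, 15.0]
fractions = [1.0 / 6.0] * 6
print(f"predicted mixture EC50: {concentration_addition_ec50(fractions, ec50s):.2f} mg/L")
```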
General aviation design synthesis utilizing interactive computer graphics
NASA Technical Reports Server (NTRS)
Galloway, T. L.; Smith, M. R.
1976-01-01
Interactive computer graphics is a fast growing area of computer application, due to such factors as substantial cost reductions in hardware, general availability of software, and expanded data communication networks. In addition to allowing faster and more meaningful input/output, computer graphics permits the use of data in graphic form to carry out parametric studies for configuration selection and for assessing the impact of advanced technologies on general aviation designs. The incorporation of interactive computer graphics into a NASA developed general aviation synthesis program is described, and the potential uses of the synthesis program in preliminary design are demonstrated.
General aviation aircraft interior noise problem: Some suggested solutions
NASA Technical Reports Server (NTRS)
Roskam, J.; Navaneethan, R.
1984-01-01
Laboratory investigations of sound transmission through panels and the use of modern data analysis techniques applied to actual aircraft are used to determine methods of reducing general aviation interior noise. The experimental noise reduction characteristics of stiffened flat and curved panels with damping treatment are discussed. The experimental results for double-wall panels used in the general aviation industry are given. The effects of skin panel material, fiberglass insulation and trim panel material on the noise reduction characteristics of double-wall panels are investigated. With few modifications, classical sound transmission theory can be used to design the interior noise control treatment of aircraft. Acoustic intensity and analysis procedures are included.
A ranking index for quality assessment of forensic DNA profiles
2010-01-01
Background: Assessment of DNA profile quality is vital in forensic DNA analysis, both in order to determine the evidentiary value of DNA results and to compare the performance of different DNA analysis protocols. Generally, quality assessment is performed through manual examination of the DNA profiles based on empirical knowledge, or by comparing the intensities (allelic peak heights) of the capillary electrophoresis electropherograms. Results: We recently developed a ranking index for unbiased and quantitative quality assessment of forensic DNA profiles, the forensic DNA profile index (FI) (Hedman et al., Improved forensic DNA analysis through the use of alternative DNA polymerases and statistical modeling of DNA profiles, Biotechniques 47 (2009) 951-958). FI uses electropherogram data to combine the intensities of the allelic peaks with the balances within and between loci, using Principal Components Analysis. Here we present the construction of FI. We explain the mathematical and statistical methodologies used and present details of the applied data reduction method. We thereby show how to adapt the ranking index for any Short Tandem Repeat-based forensic DNA typing system, through validation against a manual grading scale and calibration against a specific set of DNA profiles. Conclusions: The developed tool provides unbiased quality assessment of forensic DNA profiles. It can be applied to any DNA profiling system based on Short Tandem Repeat markers. Apart from crime-related DNA analysis, FI can therefore be used as a quality tool in paternity or familial testing as well as in disaster victim identification. PMID:21062433
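As a rough illustration of how peak heights and balances can be folded into a single quality index with Principal Components Analysis, the sketch below scores synthetic profiles by their projection onto the first principal component of a feature matrix (mean peak height, intra-locus balance, inter-locus balance). The features, simulated values and scoring convention are simplified assumptions for the example and do not reproduce the published FI construction.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)

# Synthetic per-profile features: [mean peak height (RFU), intra-locus balance,
# inter-locus balance]; higher is better for all three in this toy setup.
good = np.column_stack([rng.normal(1500, 200, 30), rng.uniform(0.8, 1.0, 30),
                        rng.uniform(0.7, 1.0, 30)])
poor = np.column_stack([rng.normal(300, 100, 30), rng.uniform(0.3, 0.7, 30),
                        rng.uniform(0.2, 0.6, 30)])
X = np.vstack([good, poor])

# Standardize, then use the first principal component as a one-number index
Xz = (X - X.mean(axis=0)) / X.std(axis=0)
pca = PCA(n_components=1).fit(Xz)
index = Xz @ pca.components_[0]
if np.corrcoef(index, X[:, 0])[0, 1] < 0:   # orient so that higher = better quality
    index = -index
print("mean index, good profiles:", round(index[:30].mean(), 2))
print("mean index, poor profiles:", round(index[30:].mean(), 2))
```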
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neudecker, Denise; Conlin, Jeremy Lloyd; Gray, Mark Girard
This memo contains general guidelines on what documentation and tools need to be in place as well as format and data testing requirements such that evaluated nuclear data sets or entire libraries can be adopted by the nuclear data team. Additional requirements beyond this memo might apply for specific nuclear data observables. These guidelines were established based on discussions between J.L. Conlin, M.G. Gray, A.P. McCartney, D. Neudecker, D.K. Parsons and M.C. White.
NASA Astrophysics Data System (ADS)
Hullo, J.-F.; Thibault, G.
2014-06-01
As-built CAD data reconstructed from Terrestrial Laser Scanner (TLS) data have been used for more than two decades by Electricité de France (EDF) to prepare maintenance operations in its facilities. But today the picture has changed: "as-built virtual reality" must address a huge scale-up to provide data to an increasing number of applications. In this paper, we first present a wide multi-sensor, multi-purpose scanning campaign performed in a 10-floor building of a power plant in 2013: 1083 TLS stations (about 40 × 10^9 3D points referenced under a 2 cm tolerance) and 1025 RGB panoramic images (340 × 10^6 pixels per point of view). As expected, this very large survey of high-precision measurements in a complex environment stressed sensors and tools that were developed for more favourable conditions and smaller data sets. The whole survey process (tools and methods used from acquisition and processing to CAD reconstruction) underwent a detailed follow-up in order to identify the obstacles to a possible generalization to other buildings. Based on this recent feedback, we highlight some of the current bottlenecks in this paper: sensor denoising, automation of processing, improvements to data validation tools, and standardization of formats and (meta-)data structures.
Comparisons of non-Gaussian statistical models in DNA methylation analysis.
Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun
2014-06-16
As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
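Since methylation beta-values live on (0, 1), a natural non-Gaussian choice is the beta distribution. The sketch below fits beta distributions to two simulated methylation groups and assigns points to the component with the higher likelihood, a minimal supervised stand-in for the model-based clustering the paper studies; the simulated parameters are invented and this is not the authors' code.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Simulated methylation beta-values from two groups (bounded on (0, 1))
hypo = rng.beta(2, 8, size=300)     # mostly unmethylated sites
hyper = rng.beta(8, 2, size=300)    # mostly methylated sites
data = np.concatenate([hypo, hyper])
labels = np.array([0] * 300 + [1] * 300)

# Fit a beta distribution per group (location fixed at 0, scale at 1)
fits = [stats.beta.fit(data[labels == g], floc=0, fscale=1) for g in (0, 1)]

# Assign each point to the component with the larger log-likelihood
loglik = np.column_stack([stats.beta.logpdf(data, *f) for f in fits])
assigned = loglik.argmax(axis=1)
print("fitted (a, b) per group:", [(round(f[0], 2), round(f[1], 2)) for f in fits])
print("agreement with true labels:", round((assigned == labels).mean(), 3))
```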
Real time software tools and methodologies
NASA Technical Reports Server (NTRS)
Christofferson, M. J.
1981-01-01
Real-time systems are characterized by high-speed processing and throughput as well as asynchronous event-processing requirements. These requirements give rise to particular implementations of parallel or pipeline multitasking structures, of intertask or interprocess communication mechanisms, and finally of message (buffer) routing or switching mechanisms. These mechanisms or structures, along with the data structure, describe the essential character of the system. These common structural elements and mechanisms are identified, and their implementation in the form of routines, tasks or macros - in other words, tools - is formalized. The tools developed support or make available the following: reentrant task creation, generalized message routing techniques, generalized task structures/task families, standardized intertask communication mechanisms, and pipeline and parallel processing architectures in a multitasking environment. Tools development raises some interesting prospects in the areas of software instrumentation and software portability. These issues are discussed following the description of the tools themselves.
Integrating economic and biophysical data in assessing cost-effectiveness of buffer strip placement.
Balana, Bedru Babulo; Lago, Manuel; Baggaley, Nikki; Castellazzi, Marie; Sample, James; Stutter, Marc; Slee, Bill; Vinten, Andy
2012-01-01
The European Union Water Framework Directive (WFD) requires Member States to set water quality objectives and identify cost-effective mitigation measures to achieve "good status" in all waters. However, the costs and effectiveness of measures vary both within and between catchments, depending on factors such as land use and topography. The aim of this study was to develop a cost-effectiveness analysis framework for integrating estimates of phosphorus (P) losses from land-based sources, potential abatement using riparian buffers, and the economic implications of buffers. Estimates of field-by-field P exports and routing were based on crop risk and field slope classes. Buffer P trapping efficiencies were based on a literature metadata analysis. Costs of placing buffers were based on foregone farm gross margins. An integrated cost-minimization optimization model was developed and solved for different P reduction targets for the Rescobie Loch catchment in eastern Scotland. A target mean annual reduction of 376 kg in the P load to the loch was identified as required to achieve good status. Assuming all the riparian fields initially have the 2-m buffer strip required by the General Binding Rules (part of the WFD implementation in Scotland), the model gave good predictions of P loads (345-481 kg P). The modeling results show that riparian buffers alone cannot achieve the required P load reduction (up to 54% of the P can be removed). In the medium P input scenario, average costs vary from £38 to £176 per kg P at 10% and 54% P reduction, respectively. The framework demonstrates a useful tool for exploring cost-effective targeting of environmental measures. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
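The cost-minimization step can be pictured as a small linear program: choose which fields get wider buffers so that total phosphorus trapping meets the target at least cost. The sketch below solves a toy version with scipy.optimize.linprog using invented per-field trapping efficiencies and gross-margin costs; the real model's field data, buffer options and constraints are richer than this.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)

n_fields = 20
p_trapped = rng.uniform(2.0, 25.0, n_fields)   # kg P/yr removed if field is buffered
cost = rng.uniform(50.0, 400.0, n_fields)      # GBP/yr of foregone gross margin
target_kg = 150.0                              # required total P load reduction

# Decision variable x_i in [0, 1]: fraction of field i's riparian edge buffered.
# Minimize cost @ x  subject to  p_trapped @ x >= target_kg.
res = linprog(c=cost,
              A_ub=-p_trapped[None, :], b_ub=[-target_kg],
              bounds=[(0.0, 1.0)] * n_fields,
              method="highs")

if res.success:
    chosen = np.where(res.x > 1e-6)[0]
    print(f"minimum cost: GBP {res.fun:.0f}/yr")
    print("buffered fields (fraction):",
          {int(i): round(float(res.x[i]), 2) for i in chosen})
else:
    print("target not achievable with buffers alone:", res.message)
```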
General Purpose Data-Driven Online System Health Monitoring with Applications to Space Operations
NASA Technical Reports Server (NTRS)
Iverson, David L.; Spirkovska, Lilly; Schwabacher, Mark
2010-01-01
Modern space transportation and ground support system designs are becoming increasingly sophisticated and complex. Determining the health state of these systems using traditional parameter limit checking, or model-based or rule-based methods is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults, failures, or precursors of significant failures. The Inductive Monitoring System (IMS) is a general purpose, data-driven system health monitoring software tool that has been successfully applied to several aerospace applications and is under evaluation for anomaly detection in vehicle and ground equipment for next generation launch systems. After an introduction to IMS application development, we discuss these NASA online monitoring applications, including the integration of IMS with complementary model-based and rule-based methods. Although the examples presented in this paper are from space operations applications, IMS is a general-purpose health-monitoring tool that is also applicable to power generation and transmission system monitoring.
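The data-driven monitoring idea can be illustrated in a few lines: cluster archived nominal sensor vectors, record how far nominal data strays from its nearest cluster, and flag real-time vectors that exceed that envelope. The sketch below does this with k-means on synthetic data; it is a generic illustration of the approach, not the IMS algorithm or its actual distance formulation, and the sensor values are invented.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)

# Archived nominal operations data: 3 sensors, two normal operating modes
mode_a = rng.normal([10.0, 50.0, 0.2], [0.5, 2.0, 0.02], size=(500, 3))
mode_b = rng.normal([14.0, 65.0, 0.5], [0.5, 2.0, 0.02], size=(500, 3))
nominal = np.vstack([mode_a, mode_b])

# Characterize normal behaviour and an alarm threshold from the training data
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(nominal)
train_dist = np.min(km.transform(nominal), axis=1)     # distance to nearest cluster
threshold = np.quantile(train_dist, 0.999)

def monitor(sample):
    """Return (anomaly_flag, deviation) for one real-time sensor vector."""
    d = float(np.min(km.transform(np.atleast_2d(sample)), axis=1)[0])
    return d > threshold, d

print(monitor([10.1, 49.0, 0.21]))   # nominal-looking reading
print(monitor([12.0, 80.0, 0.9]))    # off-nominal reading
```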
Hamilton, Matthew; Mahiane, Guy; Werst, Elric; Sanders, Rachel; Briët, Olivier; Smith, Thomas; Cibulskis, Richard; Cameron, Ewan; Bhatt, Samir; Weiss, Daniel J; Gething, Peter W; Pretorius, Carel; Korenromp, Eline L
2017-02-10
Scale-up of malaria prevention and treatment needs to continue but national strategies and budget allocations are not always evidence-based. This article presents a new modelling tool projecting malaria infection, cases and deaths to support impact evaluation, target setting and strategic planning. Nested in the Spectrum suite of programme planning tools, the model includes historic estimates of case incidence and deaths in groups aged up to 4, 5-14, and 15+ years, and prevalence of Plasmodium falciparum infection (PfPR) among children 2-9 years, for 43 sub-Saharan African countries and their 602 provinces, from the WHO and malaria atlas project. Impacts over 2016-2030 are projected for insecticide-treated nets (ITNs), indoor residual spraying (IRS), seasonal malaria chemoprevention (SMC), and effective management of uncomplicated cases (CMU) and severe cases (CMS), using statistical functions fitted to proportional burden reductions simulated in the P. falciparum dynamic transmission model OpenMalaria. In projections for Nigeria, ITNs, IRS, CMU, and CMS scale-up reduced health burdens in all age groups, with largest proportional and especially absolute reductions in children up to 4 years old. Impacts increased from 8 to 10 years following scale-up, reflecting dynamic effects. For scale-up of each intervention to 80% effective coverage, CMU had the largest impacts across all health outcomes, followed by ITNs and IRS; CMS and SMC conferred additional small but rapid mortality impacts. Spectrum-Malaria's user-friendly interface and intuitive display of baseline data and scenario projections holds promise to facilitate capacity building and policy dialogue in malaria programme prioritization. The module's linking to the OneHealth Tool for costing will support use of the software for strategic budget allocation. In settings with moderately low coverage levels, such as Nigeria, improving case management and achieving universal coverage with ITNs could achieve considerable burden reductions. Projections remain to be refined and validated with local expert input data and actual policy scenarios.
Uddin, Muhammad Shahin; Halder, Kalyan Kumar; Tahtali, Murat; Lambert, Andrew J; Pickering, Mark R; Marchese, Margaret; Stuart, Iain
2016-11-01
Ultrasound (US) imaging is a widely used clinical diagnostic tool among medical imaging techniques. It is a comparatively safe, economical, painless, portable, and noninvasive real-time modality compared with other imaging modalities. However, the image quality of US imaging is severely affected by the presence of speckle noise and blur introduced during the acquisition process. In order to ensure a high-quality clinical diagnosis, US images must be restored by reducing their speckle noise and blur. In general, speckle noise is modeled as multiplicative noise following a Rayleigh distribution, and blur as a Gaussian function. To this end, we propose an intelligent estimator based on artificial neural networks (ANNs) to estimate the variances of the noise and blur, which, in turn, are used to obtain an image without discernible distortions. A set of statistical features computed from the image and its complex wavelet sub-bands is used as input to the ANN. In the proposed method, we solve the inverse Rayleigh function numerically for speckle reduction and use the Richardson-Lucy algorithm for de-blurring. The performance of this method is compared with that of traditional methods by applying them to synthetic, physical phantom and clinical data, which confirms better restoration results with the proposed method.
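To show the de-blurring step the abstract names, the sketch below runs Richardson-Lucy deconvolution from scikit-image on a synthetically degraded test image with a Gaussian point-spread function. The blur width and noise level are set by hand here, whereas the paper's contribution is estimating them with an ANN before restoration; this is a generic illustration, not the authors' pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import data, img_as_float
from skimage.restoration import richardson_lucy

# Synthetic degradation: Gaussian blur plus mild multiplicative (speckle-like) noise
image = img_as_float(data.camera())
psf_sigma = 2.0
blurred = gaussian_filter(image, sigma=psf_sigma)
rng = np.random.default_rng(7)
noisy = np.clip(blurred * (1.0 + 0.05 * rng.standard_normal(blurred.shape)), 0, 1)

# Build an explicit Gaussian PSF kernel matching the blur used above
size = 15
ax = np.arange(size) - size // 2
xx, yy = np.meshgrid(ax, ax)
psf = np.exp(-(xx**2 + yy**2) / (2.0 * psf_sigma**2))
psf /= psf.sum()

restored = richardson_lucy(noisy, psf, 30)   # 30 iterations
print("degraded / restored RMS error vs. original:",
      round(float(np.sqrt(np.mean((noisy - image) ** 2))), 4),
      round(float(np.sqrt(np.mean((restored - image) ** 2))), 4))
```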
Using an analytical geometry method to improve tiltmeter data presentation
Su, W.-J.
2000-01-01
The tiltmeter is a useful tool for geologic and geotechnical applications. To obtain full benefit from the tiltmeter, easy and accurate data presentations should be used. Unfortunately, the method most commonly used for tilt data reduction may yield inaccurate and low-resolution results. This article describes a simple, accurate, and high-resolution approach developed at the Illinois State Geological Survey for data reduction and presentation. The orientation of the tiltplate is determined first by using a trigonometric relationship, followed by a matrix transformation, to obtain the true amount of rotational change of the tiltplate at any given time. The mathematical derivations used for the determination and transformation are then coded into an integrated PC application by adapting the capabilities of commercial spreadsheet, database, and graphics software. Examples of data presentation from tiltmeter applications in studies of landfill covers, characterization of mine subsidence, and investigations of slope stability are also discussed.
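As a generic illustration of the kind of trigonometric reduction described (the article's exact derivation is not reproduced here), the sketch below converts a biaxial tiltmeter's two orthogonal readings into the magnitude and azimuth of the resultant tilt vector, and rotates the readings from the instrument axes into site coordinates when the tiltplate's orientation is known. The example angles and readings are invented.

```python
import numpy as np

def resultant_tilt(tilt_x_urad, tilt_y_urad):
    """Magnitude (microradians) and direction (deg from the +x axis) of the tilt vector."""
    magnitude = float(np.hypot(tilt_x_urad, tilt_y_urad))
    azimuth = float(np.degrees(np.arctan2(tilt_y_urad, tilt_x_urad)))
    return magnitude, azimuth

def to_site_coordinates(tilt_x_urad, tilt_y_urad, plate_azimuth_deg):
    """Rotate instrument-axis tilts into site (e.g. north/east) coordinates.

    plate_azimuth_deg is the orientation of the instrument x axis in site coordinates.
    """
    a = np.radians(plate_azimuth_deg)
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    return R @ np.array([tilt_x_urad, tilt_y_urad])

# Example: 35 urad on x, -12 urad on y, tiltplate x axis oriented 40 deg in site frame
mag, az = resultant_tilt(35.0, -12.0)
site_xy = to_site_coordinates(35.0, -12.0, plate_azimuth_deg=40.0)
print(round(mag, 1), round(az, 1), np.round(site_xy, 1))
```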
SAND: an automated VLBI imaging and analysing pipeline - I. Stripping component trajectories
NASA Astrophysics Data System (ADS)
Zhang, M.; Collioud, A.; Charlot, P.
2018-02-01
We present our implementation of an automated very long baseline interferometry (VLBI) data-reduction pipeline that is dedicated to interferometric data imaging and analysis. The pipeline can handle massive VLBI data efficiently, which makes it an appropriate tool to investigate multi-epoch multiband VLBI data. Compared to traditional manual data reduction, our pipeline provides more objective results as less human interference is involved. The source extraction is carried out in the image plane, while deconvolution and model fitting are performed in both the image plane and the uv plane for parallel comparison. The output from the pipeline includes catalogues of CLEANed images and reconstructed models, polarization maps, proper motion estimates, core light curves and multiband spectra. We have developed a regression STRIP algorithm to automatically detect linear or non-linear patterns in the jet component trajectories. This algorithm offers an objective method to match jet components at different epochs and to determine their proper motions.
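To ground the trajectory-fitting idea, the sketch below fits a linear proper motion to a jet component's positions over several epochs by weighted least squares and reports the speed in mas/yr. It is a generic illustration, not the STRIP algorithm itself, and the epochs, positions and uncertainties are invented.

```python
import numpy as np

# Invented component positions relative to the core (mas) at several epochs (yr)
epochs = np.array([2008.2, 2009.1, 2010.3, 2011.0, 2012.4, 2013.6])
ra = np.array([0.12, 0.21, 0.34, 0.40, 0.55, 0.67])       # mas
dec = np.array([0.05, 0.09, 0.16, 0.18, 0.26, 0.31])      # mas
sigma = np.full(epochs.size, 0.02)                        # mas, per-epoch uncertainty

def linear_motion(t, x, w):
    """Weighted least-squares fit x(t) = x0 + mu * (t - t0); returns (mu, x0)."""
    t0 = t.mean()
    mu, x0 = np.polyfit(t - t0, x, deg=1, w=w)
    return mu, x0

w = 1.0 / sigma
mu_ra, _ = linear_motion(epochs, ra, w)
mu_dec, _ = linear_motion(epochs, dec, w)
speed = np.hypot(mu_ra, mu_dec)
print(f"proper motion: {speed:.3f} mas/yr "
      f"(RA {mu_ra:.3f}, Dec {mu_dec:.3f} mas/yr)")
```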
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wall, Andrew J.; Capo, Rosemary C.; Stewart, Brian W.
2016-09-22
This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which helps avoid the data glut associated with rapid, high-sample-throughput analysis.
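A central step in Sr isotope data reduction on MC-ICP-MS instruments is correcting the measured 87Sr/86Sr ratio for instrumental mass bias by internal normalization to the accepted 86Sr/88Sr value of 0.1194 using the exponential law. The sketch below implements that standard textbook correction on invented raw ratios; it is not the specific reduction script the report distributes.

```python
import numpy as np

# Atomic masses of the Sr isotopes (u) and the canonical 86Sr/88Sr ratio
M86, M87, M88 = 85.909260, 86.908877, 87.905612
R86_88_TRUE = 0.1194

def mass_bias_corrected_87_86(r87_86_meas, r86_88_meas):
    """Exponential-law internal normalization of 87Sr/86Sr.

    beta is solved from the measured 86/88 ratio, then applied to 87/86:
        R_true = R_meas * (m_numerator / m_denominator) ** beta
    """
    beta = np.log(R86_88_TRUE / r86_88_meas) / np.log(M86 / M88)
    return r87_86_meas * (M87 / M86) ** beta

# Invented raw instrument ratios for a few integration cycles
r87_86 = np.array([0.71120, 0.71118, 0.71123])
r86_88 = np.array([0.11730, 0.11728, 0.11732])
corrected = mass_bias_corrected_87_86(r87_86, r86_88)
print(np.round(corrected, 5), "mean:", round(float(corrected.mean()), 5))
```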
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hakala, Jacqueline Alexandra
2016-11-22
This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid the data glut associated with high-throughput rapid analysis.
SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel
Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari
2009-01-01
Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered or saved in MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool for basic genetic and epidemiological analysis and data conversion in MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software. PMID:19852806
Wavelet packets for multi- and hyper-spectral imagery
NASA Astrophysics Data System (ADS)
Benedetto, J. J.; Czaja, W.; Ehler, M.; Flake, C.; Hirn, M.
2010-01-01
State-of-the-art dimension reduction and classification schemes in multi- and hyper-spectral imaging rely primarily on the information contained in the spectral component. To better capture the joint spatial and spectral data distribution, we combine the Wavelet Packet Transform with the linear dimension reduction method of Principal Component Analysis. Each spectral band is decomposed by means of the Wavelet Packet Transform, and we consider a joint entropy across all the spectral bands as a tool to exploit the spatial information. Dimension reduction is then applied to the Wavelet Packet coefficients. We present examples of this technique for hyper-spectral satellite imaging. We also investigate the role of various shrinkage techniques to model non-linearity in our approach.
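As a minimal sketch of band-wise wavelet packet decomposition followed by linear dimension reduction, assuming PyWavelets and scikit-learn are available; the joint-entropy node selection and shrinkage steps are omitted, and the feature layout is only one plausible reading of the approach, not the authors' implementation.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

def wavelet_packet_pca(cube, wavelet="db2", level=2, n_components=10):
    """cube: hyperspectral image of shape (rows, cols, bands).
    Each band is decomposed with a 2-D wavelet packet transform; the
    leaf-node coefficients are stacked as per-pixel features and then
    reduced with PCA."""
    bands = cube.shape[2]
    per_band = []
    for b in range(bands):
        wp = pywt.WaveletPacket2D(cube[:, :, b], wavelet=wavelet, maxlevel=level)
        nodes = wp.get_level(level, order="natural")
        per_band.append(np.stack([n.data for n in nodes], axis=-1))
    X = np.concatenate(per_band, axis=-1)       # (r', c', bands * n_leaf_nodes)
    X = X.reshape(-1, X.shape[-1])              # one feature vector per coefficient pixel
    return PCA(n_components=n_components).fit_transform(X)
```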
van Tongeren, Martie; Lamb, Judith; Cherrie, John W; MacCalman, Laura; Basinas, Ioannis; Hesse, Susanne
2017-10-01
Tier 1 exposure tools recommended for use under REACH are designed to easily identify situations that may pose a risk to health through conservative exposure predictions. However, no comprehensive evaluation of the performance of the lower tier tools has previously been carried out. The ETEAM project aimed to evaluate several lower tier exposure tools (ECETOC TRA, MEASE, and EMKG-EXPO-TOOL) as well as one higher tier tool (STOFFENMANAGER®). This paper describes the results of the external validation of tool estimates using measurement data. Measurement data were collected from a range of providers, both in Europe and United States, together with contextual information. Individual measurement and aggregated measurement data were obtained. The contextual information was coded into the tools to obtain exposure estimates. Results were expressed as percentage of measurements exceeding the tool estimates and presented by exposure category (non-volatile liquid, volatile liquid, metal abrasion, metal processing, and powder handling). We also explored tool performance for different process activities as well as different scenario conditions and exposure levels. In total, results from nearly 4000 measurements were obtained, with the majority for the use of volatile liquids and powder handling. The comparisons of measurement results with tool estimates suggest that the tools are generally conservative. However, the tools were more conservative when estimating exposure from powder handling compared to volatile liquids and other exposure categories. In addition, results suggested that tool performance varies between process activities and scenario conditions. For example, tools were less conservative when estimating exposure during activities involving tabletting, compression, extrusion, pelletisation, granulation (common process activity PROC14) and transfer of substance or mixture (charging and discharging) at non-dedicated facilities (PROC8a; powder handling only). With the exception of STOFFENMANAGER® (for estimating exposure during powder handling), the tools were less conservative for scenarios with lower estimated exposure levels. This is the most comprehensive evaluation of the performance of REACH exposure tools carried out to date. The results show that, although generally conservative, the tools may not always achieve the performance specified in the REACH guidance, i.e. using the 75th or 90th percentile of the exposure distribution for the risk characterisation. Ongoing development, adjustment, and recalibration of the tools with new measurement data are essential to ensure adequate characterisation and control of worker exposure to hazardous substances. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
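The paper's central comparison metric, the percentage of measurements exceeding the tool estimate, is straightforward to reproduce. Below is a minimal pandas sketch under assumed column names (measured, tool_estimate, tool, exposure_category); it is not the ETEAM project's actual code.

```python
import pandas as pd

def percent_exceeding(df: pd.DataFrame) -> pd.Series:
    """Percentage of measurements exceeding the corresponding tool estimate,
    broken down by tool and exposure category (column names are assumptions)."""
    exceeds = df["measured"] > df["tool_estimate"]
    return (exceeds.groupby([df["tool"], df["exposure_category"]])
                   .mean() * 100).rename("% exceeding tool estimate")
```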
NASA Astrophysics Data System (ADS)
Bytev, Vladimir V.; Kniehl, Bernd A.
2016-09-01
We present a further extension of the HYPERDIRE project, which is devoted to the creation of a set of Mathematica-based program packages for manipulations with Horn-type hypergeometric functions on the basis of differential equations. Specifically, we present the implementation of the differential reduction for the Lauricella function FC of three variables.
Catalogue identifier: AEPP_v4_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEPP_v4_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License, version 3
No. of lines in distributed program, including test data, etc.: 243461
No. of bytes in distributed program, including test data, etc.: 61610782
Distribution format: tar.gz
Programming language: Mathematica.
Computer: All computers running Mathematica.
Operating system: Operating systems running Mathematica.
Classification: 4.4.
Does the new version supersede the previous version?: No, it significantly extends the previous version.
Nature of problem: Reduction of the hypergeometric function FC of three variables to a set of basis functions.
Solution method: Differential reduction.
Reasons for new version: The extension package allows the user to handle the Lauricella function FC of three variables.
Summary of revisions: The previous version remains unchanged.
Running time: Depends on the complexity of the problem.
Engineer, Rakesh S; Podolsky, Seth R; Fertel, Baruch S; Grover, Purva; Jimenez, Heather; Simon, Erin L; Smalley, Courtney M
2018-05-15
The American College of Emergency Physicians embarked on the "Choosing Wisely" campaign to avoid computed tomographic (CT) scans in patients with minor head injury who are at low risk based on validated decision rules. We hypothesized that a Pediatric Mild Head Injury Care Path could be developed and implemented to reduce inappropriate CT utilization with support of a clinical decision support tool (CDST) and a structured parent discussion tool. A quality improvement project was initiated for 9 weeks to reduce inappropriate CT utilization through 5 interventions: (1) engagement of leadership, (2) provider education, (3) incorporation of a parent discussion tool to guide discussion during the emergency department (ED) visit between the parent and the provider, (4) CDST embedded in the electronic medical record, and (5) importation of data into the note to drive compliance. Patients prospectively were enrolled when providers at a pediatric and a freestanding ED entered data into the CDST for decision making. Rate of care path utilization and head CT reduction was determined for all patients with minor head injury based on International Classification of Diseases, Ninth Revision codes. Targets for care path utilization and head CT reduction were established a priori. Results were compared with baseline data collected from 2013. The CDST was used in 176 (77.5%) of 227 eligible patients. Twelve patients were excluded based on a priori criteria. Adherence to recommendations occurred in 162 (99%) of 164 patients. Head CT utilization was reduced from 62.7% to 22% (odds ratio, 0.17; 95% confidence interval, 0.12-0.24) where CDST was used by the provider. There were no missed traumatic brain injuries in our study group. A Pediatric Mild Head Injury Care Path can be implemented in a pediatric and freestanding ED, resulting in reduced head CT utilization and high levels of adherence to CDST recommendations.
System data communication structures for active-control transport aircraft, volume 1
NASA Technical Reports Server (NTRS)
Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.
1981-01-01
Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.
41 CFR 101-30.704-1 - General Services Administration.
Code of Federal Regulations, 2012 CFR
2012-07-01
... civil agencies recorded as users of the item in the DLSC data base. This distribution will be made by..., data base approved item reduction decisions concerning items within the FSC classes which are managed... agencies (except direct submitters of catalog data to DLSC) will also be forwarded covering letters which...
41 CFR 101-30.704-1 - General Services Administration.
Code of Federal Regulations, 2013 CFR
2013-07-01
... civil agencies recorded as users of the item in the DLSC data base. This distribution will be made by..., data base approved item reduction decisions concerning items within the FSC classes which are managed... agencies (except direct submitters of catalog data to DLSC) will also be forwarded covering letters which...
41 CFR 101-30.704-1 - General Services Administration.
Code of Federal Regulations, 2010 CFR
2010-07-01
... civil agencies recorded as users of the item in the DLSC data base. This distribution will be made by..., data base approved item reduction decisions concerning items within the FSC classes which are managed... agencies (except direct submitters of catalog data to DLSC) will also be forwarded covering letters which...
41 CFR 101-30.704-1 - General Services Administration.
Code of Federal Regulations, 2011 CFR
2011-07-01
... civil agencies recorded as users of the item in the DLSC data base. This distribution will be made by..., data base approved item reduction decisions concerning items within the FSC classes which are managed... agencies (except direct submitters of catalog data to DLSC) will also be forwarded covering letters which...
41 CFR 101-30.704-1 - General Services Administration.
Code of Federal Regulations, 2014 CFR
2014-07-01
... civil agencies recorded as users of the item in the DLSC data base. This distribution will be made by..., data base approved item reduction decisions concerning items within the FSC classes which are managed... agencies (except direct submitters of catalog data to DLSC) will also be forwarded covering letters which...
TPSLVM: a dimensionality reduction algorithm based on thin plate splines.
Jiang, Xinwei; Gao, Junbin; Wang, Tianjiang; Shi, Daming
2014-10-01
Dimensionality reduction (DR) has been considered one of the most significant tools for data analysis. One class of DR algorithms is based on latent variable models (LVMs). LVM-based models can handle the preimage problem easily. In this paper we propose a new LVM-based DR model, named the thin plate spline latent variable model (TPSLVM). Compared to the well-known Gaussian process latent variable model (GPLVM), our proposed TPSLVM is more powerful especially when the dimensionality of the latent space is low. Also, TPSLVM is robust to shift and rotation. This paper investigates two extensions of TPSLVM, i.e., the back-constrained TPSLVM (BC-TPSLVM) and TPSLVM with dynamics (TPSLVM-DM), as well as their combination BC-TPSLVM-DM. Experimental results show that TPSLVM and its extensions provide better data visualization and more efficient dimensionality reduction compared to PCA, GPLVM, ISOMAP, etc.
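TPSLVM itself does not appear to be available in a standard package, so the sketch below only reproduces the kind of baseline comparison the abstract reports, embedding the same data with two of the named baselines (PCA and Isomap) via scikit-learn; a TPSLVM or GPLVM implementation would be evaluated the same way.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap

X, _ = load_digits(return_X_y=True)

# Two of the baselines the abstract compares against; the latent-variable
# models (GPLVM, TPSLVM) would produce 2-D embeddings assessed the same way.
embeddings = {
    "PCA": PCA(n_components=2).fit_transform(X),
    "Isomap": Isomap(n_components=2, n_neighbors=10).fit_transform(X),
}
for name, Z in embeddings.items():
    print(name, Z.shape)
```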
Assimilatory Nitrate Reduction in Hansenula polymorpha
NASA Astrophysics Data System (ADS)
Rossi, Beatrice; Berardi, Enrico
In the last decade, the yeast Hansenula polymorpha (syn.: Pichia angusta) has become an excellent experimental model for genetic and molecular investigations of nitrate assimilation, a subject traditionally investigated in plants, filamentous fungi and bacteria. Among other advantages, H. polymorpha offers classical and molecular genetic tools, as well as the availability of genomic sequence data.
A Model School Facility for Energy (with Related Video)
ERIC Educational Resources Information Center
Spangler, Seth; Crutchfield, Dave
2011-01-01
Energy modeling can be a powerful tool for managing energy-reduction concepts for an institution. Different types of energy models are developed at various stages of a project to provide data that can verify or disprove suggested energy-efficiency measures. Education institutions should understand what an energy model can do and, more important,…
HC StratoMineR: A Web-Based Tool for the Rapid Analysis of High-Content Datasets.
Omta, Wienand A; van Heesbeen, Roy G; Pagliero, Romina J; van der Velden, Lieke M; Lelieveld, Daphne; Nellen, Mehdi; Kramer, Maik; Yeong, Marley; Saeidi, Amir M; Medema, Rene H; Spruit, Marco; Brinkkemper, Sjaak; Klumperman, Judith; Egan, David A
2016-10-01
High-content screening (HCS) can generate large multidimensional datasets and when aligned with the appropriate data mining tools, it can yield valuable insights into the mechanism of action of bioactive molecules. However, easy-to-use data mining tools are not widely available, with the result that these datasets are frequently underutilized. Here, we present HC StratoMineR, a web-based tool for high-content data analysis. It is a decision-supportive platform that guides even non-expert users through a high-content data analysis workflow. HC StratoMineR is built by using My Structured Query Language for storage and querying, PHP: Hypertext Preprocessor as the main programming language, and jQuery for additional user interface functionality. R is used for statistical calculations, logic and data visualizations. Furthermore, C++ and graphical processor unit power is diffusely embedded in R by using the rcpp and rpud libraries for operations that are computationally highly intensive. We show that we can use HC StratoMineR for the analysis of multivariate data from a high-content siRNA knock-down screen and a small-molecule screen. It can be used to rapidly filter out undesirable data; to select relevant data; and to perform quality control, data reduction, data exploration, morphological hit picking, and data clustering. Our results demonstrate that HC StratoMineR can be used to functionally categorize HCS hits and, thus, provide valuable information for hit prioritization.
DATA-MEAns: an open source tool for the classification and management of neural ensemble recordings.
Bonomini, María P; Ferrandez, José M; Bolea, Jose Angel; Fernandez, Eduardo
2005-10-30
The number of laboratories using techniques that allow the simultaneous recording of as many units as possible is increasing considerably. However, the development of tools used to analyse this multi-neuronal activity generally lags behind the development of the tools used to acquire the data. Moreover, data exchange between research groups using different multielectrode acquisition systems is hindered by commercial constraints such as proprietary file structures, high-priced licenses and strict policies on intellectual rights. This paper presents free open-source software for the classification and management of neural ensemble data. The main goal is to provide a graphical user interface that links the experimental data to a basic set of routines for analysis, visualization and classification in a consistent framework. To facilitate adaptation and extension, as well as the addition of new routines, tools and algorithms for data analysis, the source code and documentation are freely available.
Multi-Level Reduced Order Modeling Equipped with Probabilistic Error Bounds
NASA Astrophysics Data System (ADS)
Abdo, Mohammad Gamal Mohammad Mostafa
This thesis develops robust reduced order modeling (ROM) techniques to achieve the needed efficiency to render feasible the use of high fidelity tools for routine engineering analyses. Markedly different from the state-of-the-art ROM techniques, our work focuses only on techniques which can quantify the credibility of the reduction which can be measured with the reduction errors upper-bounded for the envisaged range of ROM model application. Our objective is two-fold. First, further developments of ROM techniques are proposed when conventional ROM techniques are too taxing to be computationally practical. This is achieved via a multi-level ROM methodology designed to take advantage of the multi-scale modeling strategy typically employed for computationally taxing models such as those associated with the modeling of nuclear reactor behavior. Second, the discrepancies between the original model and ROM model predictions over the full range of model application conditions are upper-bounded in a probabilistic sense with high probability. ROM techniques may be classified into two broad categories: surrogate construction techniques and dimensionality reduction techniques, with the latter being the primary focus of this work. We focus on dimensionality reduction, because it offers a rigorous approach by which reduction errors can be quantified via upper-bounds that are met in a probabilistic sense. Surrogate techniques typically rely on fitting a parametric model form to the original model at a number of training points, with the residual of the fit taken as a measure of the prediction accuracy of the surrogate. This approach, however, does not generally guarantee that the surrogate model predictions at points not included in the training process will be bound by the error estimated from the fitting residual. Dimensionality reduction techniques however employ a different philosophy to render the reduction, wherein randomized snapshots of the model variables, such as the model parameters, responses, or state variables, are projected onto lower dimensional subspaces, referred to as the "active subspaces", which are selected to capture a user-defined portion of the snapshots variations. Once determined, the ROM model application involves constraining the variables to the active subspaces. In doing so, the contribution from the variables discarded components can be estimated using a fundamental theorem from random matrix theory which has its roots in Dixon's theory, developed in 1983. This theory was initially presented for linear matrix operators. The thesis extends this theorem's results to allow reduction of general smooth nonlinear operators. The result is an approach by which the adequacy of a given active subspace determined using a given set of snapshots, generated either using the full high fidelity model, or other models with lower fidelity, can be assessed, which provides insight to the analyst on the type of snapshots required to reach a reduction that can satisfy user-defined preset tolerance limits on the reduction errors. Reactor physics calculations are employed as a test bed for the proposed developments. The focus will be on reducing the effective dimensionality of the various data streams such as the cross-section data and the neutron flux. 
The developed methods will be applied to representative assembly level calculations, where the size of the cross-section and flux spaces are typically large, as required by downstream core calculations, in order to capture the broad range of conditions expected during reactor operation. (Abstract shortened by ProQuest.).
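The thesis's probabilistic error bounds and multi-level strategy are not reproduced here; the following is only a minimal sketch of the underlying snapshot-based reduction (SVD of randomized snapshots, retain a user-defined fraction of the variation, project onto the resulting active subspace). Names and the energy threshold are illustrative.

```python
import numpy as np

def active_subspace(snapshots, energy=0.99):
    """snapshots: (n_dof, n_samples) matrix of randomized model snapshots.
    Returns an orthonormal basis U_r capturing `energy` of the snapshot
    variation; projecting onto span(U_r) gives the reduced variables."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cumulative, energy)) + 1
    return U[:, :r]

# reduced coordinates of a new state vector x:   q = U_r.T @ x
# reconstruction constrained to the subspace:    x_hat = U_r @ q
```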
The challenge of big data in public health: an opportunity for visual analytics.
Ola, Oluwakemi; Sedig, Kamran
2014-01-01
Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.
Physical Activity as a Vital Sign: A Systematic Review
Allen, Kelli D.; Ambrose, Kirsten R.; Stiller, Jamie L.; Evenson, Kelly R.; Voisin, Christiane; Hootman, Jennifer M.; Callahan, Leigh F.
2017-01-01
Introduction Physical activity (PA) is strongly endorsed for managing chronic conditions, and a vital sign tool (indicator of general physical condition) could alert providers of inadequate PA to prompt counseling or referral. This systematic review examined the use, definitions, psychometric properties, and outcomes of brief PA instruments as vital sign measures, with attention primarily to studies focused on arthritis. Methods Electronic databases were searched for English-language literature from 1985 through 2016 using the terms PA, exercise, vital sign, exercise referral scheme, and exercise counseling. Of the 838 articles identified for title and abstract review, 9 articles qualified for full text review and data extraction. Results Five brief PA measures were identified: Exercise Vital Sign (EVS), Physical Activity Vital Sign (PAVS), Speedy Nutrition and Physical Activity Assessment (SNAP), General Practice Physical Activity Questionnaire (GPPAQ), and Stanford Brief Activity Survey (SBAS). Studies focusing on arthritis were not found. Over 1.5 years of using EVS in a large hospital system, improvements occurred in relative weight loss among overweight patients and reduction in glycosylated hemoglobin among diabetic patients. On PAVS, moderate physical activity of 5 or more days per week versus fewer than 5 days per week was associated with a lower body mass index (−2.90 kg/m2). Compared with accelerometer-defined physical activity, EVS was weakly correlated (r = 0.27), had low sensitivity (27%–59%), and high specificity (74%–89%); SNAP showed weak agreement (κ = 0.12); GPPAQ had moderate sensitivity (46%) and specificity (50%), and SBAS was weakly correlated (r = 0.10–0.28), had poor to moderate sensitivity (18%–67%), and had moderate specificity (58%–79%). Conclusion Few studies have examined a brief physical activity tool as a vital sign measure. Initial investigations suggest the promise of these simple and quick assessment tools, and research is needed to test the effects of their use on chronic disease outcomes. PMID:29191260
Ecohealth System Dynamic Model as a Planning Tool for the Reduction of Breeding Sites
NASA Astrophysics Data System (ADS)
Respati, T.; Raksanagara, A.; Djuhaeni, H.; Sofyan, A.; Shandriasti, A.
2017-03-01
Dengue is still one of the major health problems in Indonesia. Dengue transmission is influenced by dengue prevention and eradication programs, community participation, the housing environment and climate. The complexity of the disease, coupled with limited resources, necessitates a different approach to prevention that includes the factors contributing to transmission. One way to prevent dengue transmission is to reduce the mosquito's breeding sites. Four factors suspected to influence breeding sites are the dengue prevention and eradication program, community participation, the housing environment, and weather conditions. In order to have an effective program for reducing breeding sites, a model is needed that can predict the existence of breeding sites while the four factors under study are controlled. The objective of this study is to develop an Ecohealth model using system dynamics as a planning tool for the reduction of breeding sites to prevent dengue transmission, with regard to the dengue prevention and eradication program, community participation, housing environment, and weather conditions. The study used a mixed-methods sequential exploratory design and comprised three stages: first, a qualitative study with in-depth interviews of 14 respondents and a focus group discussion with 6 respondents. The results from the first stage were used to develop entomology and household survey questionnaires for the second stage, conducted in 2036 households across 12 sub-districts in Bandung City. The Ecohealth system dynamics model was developed using data from the first and second stages. The analyses used were thematic analysis for qualitative data; spatial analysis, generalized estimating equations (GEE) and structural equation modeling for quantitative data; and average mean error (AME) and average variance error (AVE) for validation of the system dynamics model. The system dynamics model showed that the most effective approach to eliminating breeding places was ensuring the availability of basic sanitation for all houses. Weather factors such as precipitation can be compensated for by breeding-site eradication activities conducted on schedule and at the same time across the whole area. The conclusion of this study is that the dengue prevention and eradication program, community participation, and the housing environment contributed to the elimination of breeding places and influenced the existence of breeding sites. The availability of basic sanitation and a breeding-place eradication program carried out timely and collectively are the most effective approaches to eradicating breeding sites. The Ecohealth system dynamics model can be used as a tool for planning breeding-site eradication programs to prevent disease transmission at the city level.
Innovative Stormwater Quality Tools by SARA for Holistic Watershed Master Planning
NASA Astrophysics Data System (ADS)
Thomas, S. M.; Su, Y. C.; Hummel, P. R.
2016-12-01
Stormwater management strategies such as Best Management Practices (BMP) and Low-Impact Development (LID) have increasingly gained attention in urban runoff control, becoming vital to holistic watershed master plans. These strategies can help address existing water quality impairments and support regulatory compliance, as well as guide planning and management of future development when substantial population growth and urbanization are projected to occur. However, past efforts have been limited to qualitative planning due to the lack of suitable tools to conduct quantitative assessment. The San Antonio River Authority (SARA), with the assistance of Lockwood, Andrews & Newnam, Inc. (LAN) and AQUA TERRA Consultants (a division of RESPEC), developed comprehensive hydrodynamic and water quality models using the Hydrological Simulation Program-FORTRAN (HSPF) for several urban watersheds in the San Antonio River Basin. These models enabled watershed management to look at water quality issues on a more refined temporal and spatial scale than the limited monitoring data. They also provided a means to locate and quantify potential water quality impairments and evaluate the effects of mitigation measures. To support the models, a suite of software tools was developed, including: 1) SARA Timeseries Utility Tool for managing and processing large model timeseries files, 2) SARA Load Reduction Tool to determine load reductions needed to achieve screening levels for each modeled constituent on a sub-basin basis, and 3) SARA Enhanced BMP Tool to determine the optimal combination of BMP types and units needed to achieve the required load reductions. Using these SARA models and tools, water quality agencies and stormwater professionals can determine the optimal combinations of BMP/LID to accomplish their goals and save substantial stormwater infrastructure and management costs. The tools can also help regulators and permittees evaluate the feasibility of achieving compliance using BMP/LID. The project has gained national attention, being showcased in multiple newsletters, professional magazines, and conference presentations. The project also won the Texas American Council of Engineering Companies (ACEC) Gold Medal Award and the ACEC National Recognition Award in 2016.
Internet and cardiovascular research: the present and its future potentials and limits.
2002-03-01
The Internet and the World Wide Web have been proposed as tools to improve medical and cardiovascular research. These new technologies have been mainly applied to large-scale clinical trials, with the development of clinical-trial websites. They include tools for the management of some aspects of clinical trials, such as the dissemination of information on trial progress; randomisation and the monitoring processes; the distribution and accountability of study drugs; and remote data-entry. Several clinical-trial websites have been developed in the cardiovascular field over the last few years, but few have been designed to conduct trials fully online. Advantages of such systems include greater interaction between the coordinating centre and investigators, availability of a clean database in a short time, and cost reduction. Website developers need to take care of security issues and to use security tools (data encryption, firewalls, passwords and electronic signatures) in order to prevent unauthorised users from accessing the system and patient data.
Butler, Chris C; Dunstan, Frank; Heginbothom, Margaret; Mason, Brendan; Roberts, Zoë; Hillier, Sharon; Howe, Robin; Palmer, Stephen; Howard, Anthony
2007-01-01
Background GPs are urged to prescribe antibiotics less frequently, despite lack of evidence linking reduced antibiotic prescribing with reductions in resistance at a local level. Aim To investigate associations between changes in antibiotic dispensing and changes in antibiotic resistance at general-practice level. Design of study Seven-year study of dispensed antibiotics and antibiotic resistance in coliform isolates from urine samples routinely submitted from general practice. Setting General practices in Wales. Method Multilevel modelling of trends in resistance to ampicillin and trimethoprim, and changes in practice total antibiotic dispensing and amoxicillin and trimethoprim dispensing. Results The primary analysis included data on 164 225 coliform isolates from urine samples submitted from 240 general practices over the 7-year study period. These practices served a population of 1.7 million patients. The quartile of practices that had the greatest decrease in total antibiotic dispensing demonstrated a 5.2% reduction in ampicillin resistance over the 7-year period with changes of 0.4%, 2.4%, and −0.3% in the other three quartiles. There was a statistically significant overall decrease in ampicillin resistance of 1.03% (95% confidence interval [CI] = 0.37 to 1.67%) per decrease of 50 amoxicillin items dispensed per 1000 patients per annum. There were also significant reductions in trimethoprim resistance in the two quartiles of practices that reduced total antibiotic dispensing most compared with those that reduced it least, with an overall decrease in trimethoprim resistance of 1.08% (95% CI = 0.065 to 2.10%) per decrease of 20 trimethoprim items dispensed per 1000 patients per annum. Main findings were confirmed by secondary analyses of 256 370 isolates from 527 practices that contributed data at some point during the study period. Conclusion Reducing antibiotic dispensing at general-practice level is associated with reduced local antibiotic resistance. These findings should further encourage clinicians and patients to use antibiotics conservatively. PMID:17925135
Montella, Emma; Di Cicco, Maria Vincenza; Ferraro, Anna; Centobelli, Piera; Raiola, Eliana; Triassi, Maria; Improta, Giovanni
2017-06-01
Nowadays, the monitoring and prevention of healthcare-associated infections (HAIs) is a priority for the healthcare sector. In this article, we report on the application of the Lean Six Sigma (LSS) methodology to reduce the number of patients affected by sentinel bacterial infections who are at risk of HAI. The LSS methodology was applied in the general surgery department by using a multidisciplinary team of both physicians and academics. Data on more than 20 000 patients who underwent a wide range of surgical procedures between January 2011 and December 2014 were collected to conduct the study using the departmental information system. The most prevalent sentinel bacteria were determined among the infected patients. The preintervention (January 2011 to December 2012) and postintervention (January 2013 to December 2014) phases were compared to analyze the effects of the methodology implemented. The methodology allowed the identification of variables that influenced the risk of HAIs and the implementation of corrective actions to improve the care process, thereby reducing the percentage of infected patients. The improved process resulted in a 20% reduction in the average number of hospitalization days between preintervention and control phases, and a decrease in the mean (SD) number of days of hospitalization amounted to 36 (15.68), with a data distribution around 3 σ. The LSS is a helpful strategy that ensures a significant decrease in the number of HAIs in patients undergoing surgical interventions. The implementation of this intervention in the general surgery departments resulted in a significant reduction in both the number of hospitalization days and the number of patients affected by HAIs. This approach, together with other tools for reducing the risk of infection (surveillance, epidemiological guidelines, and training of healthcare personnel), could be applied to redesign and improve a wide range of healthcare processes. © 2016 John Wiley & Sons, Ltd.
Student-Produced Podcasts as an Assessment Tool: An Example from Geomorphology
ERIC Educational Resources Information Center
Kemp, Justine; Mellor, Antony; Kotter, Richard; Oosthoek, Jan W.
2012-01-01
The emergence of user-friendly technologies has made podcasting an accessible learning tool in undergraduate teaching. In a geomorphology course, student-produced podcasts were used as part of the assessment in 2008-2010. Student groups constructed radio shows aimed at a general audience to interpret and communicate geomorphological data within…
New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis
NASA Astrophysics Data System (ADS)
Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.
2017-12-01
Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.
Woodhouse, Steven; Piterman, Nir; Wintersteiger, Christoph M; Göttgens, Berthold; Fisher, Jasmin
2018-05-25
Reconstruction of executable mechanistic models from single-cell gene expression data represents a powerful approach to understanding developmental and disease processes. New ambitious efforts like the Human Cell Atlas will soon lead to an explosion of data with potential for uncovering and understanding the regulatory networks which underlie the behaviour of all human cells. In order to take advantage of this data, however, there is a need for general-purpose, user-friendly and efficient computational tools that can be readily used by biologists who do not have specialist computer science knowledge. The Single Cell Network Synthesis toolkit (SCNS) is a general-purpose computational tool for the reconstruction and analysis of executable models from single-cell gene expression data. Through a graphical user interface, SCNS takes single-cell qPCR or RNA-sequencing data taken across a time course, and searches for logical rules that drive transitions from early cell states towards late cell states. Because the resulting reconstructed models are executable, they can be used to make predictions about the effect of specific gene perturbations on the generation of specific lineages. SCNS should be of broad interest to the growing number of researchers working in single-cell genomics and will help further facilitate the generation of valuable mechanistic insights into developmental, homeostatic and disease processes.
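SCNS's synthesis engine is not described in the abstract beyond "searching for logical rules". As a toy illustration of that idea only (not the tool's algorithm), the sketch below brute-forces simple Boolean update rules for one gene that are consistent with binarized state transitions across a time course; all data and rule forms are hypothetical.

```python
from itertools import product

# Binarized expression states for three genes (A, B, C) at successive
# time points: hypothetical data, one transition per consecutive pair.
states = [(1, 0, 0), (1, 1, 0), (1, 1, 1)]
transitions = list(zip(states, states[1:]))

def candidate_rules():
    """Enumerate simple update rules for gene C of the form
    C' = (x AND y) or C' = (x OR y) over regulators A (index 0) and B (index 1)."""
    for i, j in product(range(2), repeat=2):
        yield (f"AND({i},{j})", lambda s, i=i, j=j: s[i] and s[j])
        yield (f"OR({i},{j})",  lambda s, i=i, j=j: s[i] or s[j])

consistent = [name for name, rule in candidate_rules()
              if all(rule(src) == dst[2] for src, dst in transitions)]
print(consistent)   # rules for gene C consistent with every observed transition
```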
NASA Astrophysics Data System (ADS)
Marty, J.; Martysevich, P.; Kramer, A.; Haralabus, G.
2012-04-01
The Provisional Technical Secretariat (PTS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has a continuous interest in enhancing its capability in infrasound source localization and characterization. This capability is based on the processing of data recorded by the infrasound network of the International Monitoring System (IMS). This infrasound network consists of sixty stations, among which forty-five are already certified and continuously transmit data to the International Data Center (IDC) in Vienna, Austria. Each infrasound station is composed of an array of infrasound sensors capable of measuring micro-pressure changes produced at ground level by infrasonic waves. It is the responsibility of the Engineering and Development Section of the IMS Division to ensure the highest quality for IMS infrasound data. This includes the design of robust and reliable infrasound stations, the use of accurate and calibrated infrasound measuring chains, the installation of efficient wind noise reduction systems and the implementation of quality-control tools. The purpose of this paper is to present ongoing PTS infrasound engineering and development projects related to the testing and validation of wind noise reduction system models, the implementation of infrasound data QC tools, the definition of guidelines for the design of IMS power supply systems and the development of a portable infrasound calibrator and of field kits for site survey and certification.
Accuracy Analysis for Finite-Volume Discretization Schemes on Irregular Grids
NASA Technical Reports Server (NTRS)
Diskin, Boris; Thomas, James L.
2010-01-01
A new computational analysis tool, the downscaling test, is introduced and applied for studying the convergence rates of truncation and discretization errors of finite-volume discretization schemes on general irregular (e.g., unstructured) grids. The study shows that the design-order convergence of discretization errors can be achieved even when truncation errors exhibit a lower-order convergence or, in some cases, do not converge at all. The downscaling test is a general, efficient, accurate, and practical tool, enabling straightforward extension of verification and validation to general unstructured grid formulations. It also allows separate analysis of the interior, boundaries, and singularities that could be useful even in structured-grid settings. There are several new findings arising from the use of the downscaling test analysis. It is shown that the discretization accuracy of a common node-centered finite-volume scheme, known to be second-order accurate for inviscid equations on triangular grids, degenerates to first order for mixed grids. Alternative node-centered schemes are presented and demonstrated to provide second- and third-order accuracies on general mixed grids. The local accuracy deterioration at intersections of tangency and inflow/outflow boundaries is demonstrated using downscaling tests tailored to examining the local behavior of the boundary conditions. The discretization-error order reduction within inviscid stagnation regions is demonstrated. The accuracy deterioration is local, affecting mainly the velocity components, but applies to any order scheme.
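The downscaling test itself is specific to the paper, but the convergence-order bookkeeping it builds on is standard. Below is a minimal sketch of estimating the observed order of accuracy from error norms on successively refined grids; the example numbers are made up.

```python
import numpy as np

def observed_order(errors, h):
    """Estimate convergence order p from error norms on a sequence of grids
    with characteristic mesh sizes h (both arrays, finest grid last):
    error ~ C * h**p, so p is the slope of log(error) versus log(h)."""
    p, _ = np.polyfit(np.log(h), np.log(errors), 1)
    return p

# e.g. discretization errors measured on grids with h = 1/16, 1/32, 1/64
print(observed_order(np.array([2.1e-3, 5.4e-4, 1.4e-4]),
                     np.array([1/16, 1/32, 1/64])))
```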
NASA Astrophysics Data System (ADS)
Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent
2017-03-01
Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web-based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging feature extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging feature extraction tools allow the user to collect imaging features, and the GLMM module allows the user to select clinical data and imaging features such as stroke lesion characteristics from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.
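The system's GLMM module is web-based and not shown in the abstract. As a rough offline sketch of the same kind of analysis, the example below fits a linear mixed model with statsmodels (standing in for the GLMM) on synthetic data; all column names and values are hypothetical, chosen only to mirror the regressors and regressands described.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for trial data: repeated assessments per subject,
# imaging biomarkers as regressors, a clinical score as regressand.
rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "subject": np.repeat(np.arange(30), 4),
    "lesion_volume": rng.normal(20, 5, n),
    "ventricle_brain_ratio": rng.normal(0.25, 0.05, n),
})
df["outcome"] = 50 - 0.8 * df["lesion_volume"] + rng.normal(0, 3, n)

# Random intercept per subject; a linear mixed model stands in here
# for the GLMM module described in the paper.
result = smf.mixedlm("outcome ~ lesion_volume + ventricle_brain_ratio",
                     data=df, groups=df["subject"]).fit()
print(result.summary())
```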
DYNAMO-HIA–A Dynamic Modeling Tool for Generic Health Impact Assessments
Lhachimi, Stefan K.; Nusselder, Wilma J.; Smit, Henriette A.; van Baal, Pieter; Baili, Paolo; Bennett, Kathleen; Fernández, Esteve; Kulik, Margarete C.; Lobstein, Tim; Pomerleau, Joceline; Mackenbach, Johan P.; Boshuizen, Hendriek C.
2012-01-01
Background Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, generally accessible) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. Methods and Results DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and in turn on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e. data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures – e.g. life expectancy and disease-free life expectancy – and detailed data – e.g. prevalences and mortality/survival rates – by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption and quantifying the disease-burden of smoking. Conclusion By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based on epidemiologic evidence. PMID:22590491
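DYNAMO-HIA itself is a GUI tool; to illustrate the Markov-style bookkeeping the abstract describes (explicit risk-factor states, incidence scaled by relative risk, year-by-year projection of a real-life population), here is a toy one-disease sketch. Every number is illustrative, not epidemiological evidence.

```python
import numpy as np

# Population counts by risk-factor state (rows: non-smokers, smokers)
# and health state (columns: healthy, diseased).
population = np.array([[800_000.0, 20_000.0],
                       [200_000.0, 15_000.0]])
base_incidence = 0.002
relative_risk = np.array([1.0, 2.5])   # per risk-factor state
mortality = np.array([0.01, 0.03])     # healthy vs diseased

def step(pop):
    """One annual Markov update: incidence moves people from healthy to
    diseased, then state-specific mortality is applied."""
    new = pop.copy()
    new_cases = pop[:, 0] * base_incidence * relative_risk
    new[:, 0] -= new_cases
    new[:, 1] += new_cases
    return new * (1.0 - mortality)     # broadcasts over the health-state columns

for year in range(10):                 # project the cohort 10 years ahead
    population = step(population)
print(population.round(0))
```

Comparing two runs of such a projection, one with the reference risk-factor prevalence and one with an intervention scenario, is the basic mechanism behind the scenario comparison the tool performs.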
ORAC: 21st Century Observing at UKIRT
NASA Astrophysics Data System (ADS)
Bridger, A.; Wright, G. S.; Tan, M.; Pickup, D. A.; Economou, F.; Currie, M. J.; Adamson, A. J.; Rees, N. P.; Purves, M. H.
The Observatory Reduction and Acquisition Control system replaces all of the existing software which interacts with the observers at UKIRT. The aim is to improve observing efficiency with a set of integrated tools that take the user from pre-observing preparation, through the acquisition of observations to the reduction using a data-driven pipeline. ORAC is designed to be flexible and extensible, and is intended for use with all future UKIRT instruments, as well as existing telescope hardware and ``legacy'' instruments. It is also designed to allow integration with phase-1 and queue-scheduled observing tools in anticipation of possible future requirements. A brief overview of the project and its relationship to other systems is given. ORAC also re-uses much code from other systems and we discuss issues relating to the trade-off between reuse and the generation of new software specific to our requirements.
Stausmire, Julie M; Cashen, Constance P; Myerholtz, Linda; Buderer, Nancy
2015-01-01
The Communication Assessment Tool (CAT) has been used and validated to assess Family and Emergency Medicine resident communication skills from the patient's perspective. However, it has not been previously reported as an outcome measure for general surgery residents. The purpose of this study is to establish initial benchmarking data for the use of the CAT as an evaluation tool in an osteopathic general surgery residency program. Results are analyzed quarterly and used by the program director to provide meaningful feedback and targeted goal setting for residents to demonstrate progressive achievement of interpersonal and communication skills with patients. The 14-item paper version of the CAT (developed by Makoul et al. for residency programs) asks patients to anonymously rate surgery residents on discrete communication skills using a 5-point rating scale immediately after the clinical encounter. Results are reported as the percentage of items rated as "excellent" (5) by the patient. The setting is a hospital-affiliated ambulatory urban surgery office staffed by the residency program. Participants are representative of adult patients of both sexes across all ages with diverse ethnic backgrounds. They include preoperative and postoperative patients, as well as those needing diagnostic testing and follow-up. Data have been collected on 17 general surgery residents from a single residency program representing 5 postgraduate year levels and 448 patient encounters since March 2012. The reliability (Cronbach α) of the tool for surgery residents was 0.98. The overall mean percentage of items rated as excellent was 70% (standard deviations = 42%), with a median of 100%. The CAT is a useful tool for measuring 1 facet of resident communication skills-the patient's perception of the physician-patient encounter. The tool provides a unique and personalized outcome measure for identifying communication strengths and improvement opportunities, allowing residents to receive specific feedback and mentoring by program directors. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
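The two summary statistics reported for the CAT, Cronbach's alpha and the percentage of items rated "excellent", are simple to compute. A minimal sketch, assuming a ratings matrix of patients by the 14 CAT items:

```python
import numpy as np

def cronbach_alpha(ratings):
    """ratings: (n_respondents, n_items) array of 1-5 CAT scores."""
    k = ratings.shape[1]
    item_var_sum = ratings.var(axis=0, ddof=1).sum()
    total_var = ratings.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

def percent_excellent(ratings):
    """Share of all item ratings given the top score of 5."""
    return 100.0 * (ratings == 5).mean()
```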
Study on design of light-weight super-abrasive wheel
NASA Astrophysics Data System (ADS)
Nohara, K.; Yanagihara, K.; Ogawa, M.
2018-01-01
A fixed-abrasive tool, also called a grinding wheel, is produced by firing an abrasive compound that contains abrasive grains and a binding powder such as vitrified material or resin. The tool is installed on the spindle of a grinding machine and rotated at 1,800-2,000 min-1 in use, so centrifugal fracture of the compound is one of the key design considerations. In recent years, however, the super-abrasive wheel has been developed as a fixed-abrasive tool and applied widely. One of its most characteristic features is the use of a metal wheel body. Grain retention strength and wheel rigidity are higher than those of a general grinding wheel, and the tool lifespan is longer; the wheel, however, is heavier. As a result, when a super-abrasive wheel is used, spindle motor power consumption increases and the wheel responds poorly to sudden acceleration or deceleration. Thus, in order to reduce grinding power consumption and obtain a quicker dynamic response of the super-abrasive wheel, a new wheel design is proposed. The design achieves a 46% weight reduction, and acceleration one second quicker than that of a conventional grinding wheel is obtained.
Air Force Operational Test and Evaluation Center, Volume 2, Number 2
1988-01-01
the special class of attributes are recorded, cost or benefit. In place of the normalization (1), we propose the following normalization NUMERICAL ... comprehensive set of modular basic data flow to meet requirements at test tools, designed to provide flexible data reduction at start, then building to ... possible. A combination of the two position error measurement techniques are used. SLR is a method of fitting a linear model to accumulate a position error
A General Exponential Framework for Dimensionality Reduction.
Wang, Su-Jing; Yan, Shuicheng; Yang, Jian; Zhou, Chun-Guang; Fu, Xiaolan
2014-02-01
As a general framework, Laplacian embedding, based on a pairwise similarity matrix, infers low-dimensional representations from high-dimensional data. However, it generally suffers from three issues: 1) algorithmic performance is sensitive to the number of neighbors; 2) the algorithm encounters the well-known small sample size (SSS) problem; and 3) the algorithm de-emphasizes small distance pairs. To address these issues, here we propose exponential embedding using the matrix exponential and provide a general framework for dimensionality reduction. In the framework, the matrix exponential can be roughly interpreted as a random walk over the feature similarity matrix, and thus is more robust. The positive-definite property of the matrix exponential deals with the SSS problem. The behavior of the decay function of exponential embedding is more significant in emphasizing small distance pairs. Under this framework, we apply the matrix exponential to extend many popular Laplacian embedding algorithms, e.g., Locality Preserving Projections, Unsupervised Discriminant Projections, and Marginal Fisher Analysis. Experiments conducted on synthesized data, UCI data sets, and the Georgia Tech face database show that the proposed new framework can well address the issues mentioned above.
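The abstract gives the general recipe (replace the similarity matrix by its matrix exponential, which is symmetric positive definite, before solving the embedding eigenproblem). The sketch below is one simplified reading of that recipe for a Laplacian-eigenmap-style embedding, not the paper's exact algorithms, and is only practical for small data sets because of the dense matrix exponential.

```python
import numpy as np
from scipy.linalg import expm
from scipy.spatial.distance import pdist, squareform

def exponential_embedding(X, n_components=2, sigma=1.0):
    """Embed the rows of X by eigendecomposing exp(W), where W is a
    Gaussian similarity matrix; exp(W) is symmetric positive definite,
    which sidesteps the small-sample-size singularity mentioned above."""
    W = np.exp(-squareform(pdist(X))**2 / (2.0 * sigma**2))
    _, vecs = np.linalg.eigh(expm(W))
    return vecs[:, -n_components:]      # leading eigenvectors as coordinates
```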
Scheduling Results for the THEMIS Observation Scheduling Tool
NASA Technical Reports Server (NTRS)
Mclaren, David; Rabideau, Gregg; Chien, Steve; Knight, Russell; Anwar, Sadaat; Mehall, Greg; Christensen, Philip
2011-01-01
We describe a scheduling system intended to assist in the development of instrument data acquisitions for the THEMIS instrument, onboard the Mars Odyssey spacecraft, and compare results from multiple scheduling algorithms. This tool creates observations of both (a) targeted geographical regions of interest and (b) general mapping observations, while respecting spacecraft constraints such as data volume, observation timing, visibility, lighting, season, and science priorities. This tool therefore must address both geometric and state/timing/resource constraints. We describe a tool that maps geometric polygon overlap constraints to set covering constraints using a grid-based approach. These set covering constraints are then incorporated into a greedy optimization scheduling algorithm incorporating operations constraints to generate feasible schedules. The resultant tool generates schedules of hundreds of observations per week out of potential thousands of observations. This tool is currently under evaluation by the THEMIS observation planning team at Arizona State University.
A simple model of carbon in the soil profile for agricultural soils in Northwestern Europe
NASA Astrophysics Data System (ADS)
Taghizadeh-Toosi, Arezoo; Hutchings, Nicholas J.; Vejlin, Jonas; Christensen, Bent T.; Olesen, Jørgen E.
2014-05-01
World soil carbon (C) stocks are second only to those in the ocean and represent three times as much C as is currently present in the atmosphere. The amount of C in soil may therefore play a significant role in carbon exchange between the atmosphere and the terrestrial environment. C-TOOL is a linked three-pool soil organic carbon (SOC) model for well-drained mineral soils under agricultural land management; its generalized parameterization allows the effects of management measures to be estimated at medium to long time scales for the entire soil profile (0-100 cm). C-TOOL simulates SOC turnover using temperature-dependent first-order kinetics to describe decomposition. Compared with many other SOC models, C-TOOL has a simpler structure, which facilitates calibration, and it requires only a few inputs (i.e., average monthly air temperature, soil clay content, soil carbon-to-nitrogen ratio, and C inputs to the soil from plants and other sources). C-TOOL was parameterized using SOC and radiocarbon data from selected long-term field treatments in the United Kingdom, Sweden and Denmark; however, less data were available from these experiments for evaluating subsoil C (25-100 cm). In Denmark, a national 7×7 km grid was established in 1986 for soil C monitoring down to 100 cm depth. The monitoring showed a significant decline in SOC from 1997 to 2009 in the 0-50 cm soil layer, mainly attributable to changes in the 25-50 cm layer, where a decline in SOC was found for all soil texture types. Across the period 1986 to 2009 there was a clear tendency for increasing SOC on the sandy soils and reductions on the loamy soils. This effect is linked to land use, since grasslands and dairy farms are more abundant in the western parts of Denmark, where most of the sandy soils are located. The results and data from soil monitoring have been used to validate the C-TOOL modelling approach for accounting for changes in SOC of Danish agricultural soils and for verifying the national inventories of SOC changes in agricultural soils. Future work will focus on further evaluating effects on subsoil C and on improving the estimation of C inputs, particularly root C input at different soil depths. Key words: Soil organic carbon, modelling, C-TOOL, agriculture, management, grassland
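A minimal sketch of the kind of temperature-dependent, first-order, three-pool bookkeeping described above. The pool names, rate constants, transfer fractions, Q10-style temperature function and initial stocks are illustrative placeholders, not the calibrated C-TOOL parameterization.

```python
import numpy as np

def temperature_modifier(t_air, q10=2.0, t_ref=10.0):
    # simple Q10-style rate modifier (illustrative; C-TOOL uses its own function)
    return q10 ** ((t_air - t_ref) / 10.0)

def run_soc_model(monthly_t_air, monthly_c_input, years):
    """Three linked pools (fast, humified, resistant) with first-order decay on a
    monthly time step. Returns the total SOC trajectory (t C/ha), month by month."""
    k = {"fast": 1.4, "hum": 0.028, "rom": 0.0001}        # base rates per year (illustrative)
    transfer = {"fast_to_hum": 0.25, "hum_to_rom": 0.01}  # fraction of decomposed C passed on
    pools = {"fast": 1.0, "hum": 50.0, "rom": 30.0}       # initial stocks, t C/ha (illustrative)
    dt = 1.0 / 12.0
    totals = []
    for _ in range(years):
        for month in range(12):
            f_t = temperature_modifier(monthly_t_air[month])
            dec_fast = pools["fast"] * k["fast"] * f_t * dt
            dec_hum = pools["hum"] * k["hum"] * f_t * dt
            dec_rom = pools["rom"] * k["rom"] * f_t * dt
            pools["fast"] += monthly_c_input[month] - dec_fast
            pools["hum"] += transfer["fast_to_hum"] * dec_fast - dec_hum
            pools["rom"] += transfer["hum_to_rom"] * dec_hum - dec_rom
            totals.append(sum(pools.values()))
    return np.array(totals)

# example run with made-up monthly air temperatures (deg C) and plant C inputs (t C/ha)
t_air = [0, 0, 2, 6, 11, 14, 16, 16, 13, 9, 4, 1]
c_in = [0, 0, 0, 0.1, 0.3, 0.5, 0.6, 0.5, 0.3, 0.1, 0, 0]
soc = run_soc_model(t_air, c_in, years=30)
print(f"SOC change over 30 years: {soc[-1] - soc[0]:+.1f} t C/ha")
```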
Stereo-Video Data Reduction of Wake Vortices and Trailing Aircrafts
NASA Technical Reports Server (NTRS)
Alter-Gartenberg, Rachel
1998-01-01
This report presents stereo image theory and the corresponding image processing software developed to analyze stereo imaging data acquired for the wake-vortex hazard flight experiment conducted at NASA Langley Research Center. In this experiment, a leading Lockheed C-130 was equipped with wing-tip smokers to visualize its wing vortices, while a trailing Boeing 737 flew into the wake vortices of the leading airplane. A Rockwell OV-10A airplane, fitted with video cameras under its wings, flew at 400 to 1000 feet above and parallel to the wakes, and photographed the wake interception process for the purpose of determining the three-dimensional location of the trailing aircraft relative to the wake. The report establishes the image-processing tools developed to analyze the video flight-test data, identifies sources of potential inaccuracies, and assesses the quality of the resultant set of stereo data reduction.
Locally linear embedding: dimension reduction of massive protostellar spectra
NASA Astrophysics Data System (ADS)
Ward, J. L.; Lumsden, S. L.
2016-09-01
We present the results of the application of locally linear embedding (LLE) to reduce the dimensionality of dereddened and continuum-subtracted near-infrared spectra, using a combination of models and real spectra of massive protostars selected from the Red MSX Source survey database. A brief comparison is also made with two other dimension reduction techniques, principal component analysis (PCA) and Isomap, using the same set of spectra, as well as with a more advanced form of LLE, Hessian locally linear embedding. We find that whilst LLE certainly has its limitations, it significantly outperforms both PCA and Isomap in classifying spectra based on the presence/absence of emission lines, and it provides a valuable tool for the classification and analysis of large spectral data sets.
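A minimal sketch of such a comparison using scikit-learn; the spectra matrix is a random placeholder, and the neighbor and component counts are illustrative choices rather than the values used in the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import LocallyLinearEmbedding, Isomap

# placeholder data: one dereddened, continuum-subtracted spectrum per row
rng = np.random.default_rng(1)
spectra = rng.normal(size=(500, 1024))

n_components, n_neighbors = 3, 12
embeddings = {
    "PCA": PCA(n_components=n_components).fit_transform(spectra),
    "LLE": LocallyLinearEmbedding(n_neighbors=n_neighbors, n_components=n_components,
                                  method="standard").fit_transform(spectra),
    "Hessian LLE": LocallyLinearEmbedding(n_neighbors=n_neighbors, n_components=n_components,
                                          method="hessian",
                                          eigen_solver="dense").fit_transform(spectra),
    "Isomap": Isomap(n_neighbors=n_neighbors, n_components=n_components).fit_transform(spectra),
}
for name, emb in embeddings.items():
    print(name, emb.shape)   # each row is a low-dimensional representation of one spectrum
```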
Astrometrica: Astrometric data reduction of CCD images
NASA Astrophysics Data System (ADS)
Raab, Herbert
2012-03-01
Astrometrica is an interactive software tool for scientific-grade astrometric data reduction of CCD images. The current version of the software is for the Windows 32-bit operating system family. Astrometrica reads FITS (8-, 16- and 32-bit integer) and SBIG image files; the size of the images is limited only by available memory. It also offers automatic image calibration (dark frame and flat field correction), automatic reference star identification, automatic moving-object detection and identification, and access to new-generation star catalogs (PPMXL, UCAC 3 and CMC-14), in addition to online help and other features. Astrometrica is shareware, available for use free of charge for a limited period of time (100 days); special arrangements can be made for educational projects.
A general engineering scenario for concurrent engineering environments
NASA Astrophysics Data System (ADS)
Mucino, V. H.; Pavelic, V.
The paper describes an engineering method scenario that categorizes the various activities and tasks into blocks, each treated as a subject that consumes and produces data and information. These methods, tools, and associated utilities interact with other engineering tools by exchanging information in such a way that a clear customer-supplier relationship for engineering data is established while data-exchange consistency is maintained throughout the design process. The events and data transactions are presented as flowcharts in which the data transactions represent the connections between the blocks, which in turn represent the engineering activities developed for the particular task required in the concurrent engineering environment.
Processing Solutions for Big Data in Astronomy
NASA Astrophysics Data System (ADS)
Fillatre, L.; Lepiller, D.
2016-09-01
This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
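A minimal, self-contained illustration of the MapReduce processing schema mentioned above: a map step emits key-value pairs, a shuffle groups them by key, and a reduce step aggregates each group. On Hadoop or Spark the same two functions would run in parallel across a cluster; the word-count task here is only a placeholder.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # emit one (word, 1) pair per word of a text record
    return [(word.lower(), 1) for word in record.split()]

def reduce_phase(key, values):
    # aggregate all values seen for one key
    return key, sum(values)

records = ["Big Data needs parallel processing",
           "Spark improves on the MapReduce processing schema",
           "parallel processing of Big Data"]

pairs = chain.from_iterable(map_phase(r) for r in records)   # map
groups = defaultdict(list)
for key, value in pairs:                                     # shuffle: group by key
    groups[key].append(value)
counts = dict(reduce_phase(k, v) for k, v in groups.items()) # reduce
print(counts["processing"])   # 3
```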
Toward automated denoising of single molecular Förster resonance energy transfer data
NASA Astrophysics Data System (ADS)
Lee, Hao-Chih; Lin, Bo-Lin; Chang, Wei-Hau; Tu, I.-Ping
2012-01-01
A wide-field two-channel fluorescence microscope is a powerful tool, as it allows the study of the conformational dynamics of hundreds to thousands of immobilized single molecules by Förster resonance energy transfer (FRET) signals. To date, the data reduction from a movie to a final set of meaningful single-molecule FRET (smFRET) traces has involved human inspection and intervention at several critical steps, greatly hampering efficiency at the post-imaging stage. To facilitate the data reduction from smFRET movies to smFRET traces and to address noise-limited issues, we developed a statistical denoising system aimed at fully automated processing. This data reduction system embeds several novel approaches. First, for background subtraction, a high-order singular value decomposition (HOSVD) method is employed to extract spatial and temporal features. Second, to register and map the two color channels, spots representing bleed-through from the donor channel into the acceptor channel are used. Finally, correlation analysis and a likelihood-ratio statistic for change point detection (CPD) are developed to study the two channels simultaneously, resolve FRET states, and report the dwell time of each state. The performance of our method has been checked using both simulated and real data.
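A much-simplified sketch of the background-subtraction step, assuming an ordinary truncated SVD on the unfolded movie instead of the full HOSVD used by the authors; the array shapes, rank and toy "blinking molecule" are illustrative.

```python
import numpy as np

def svd_background_subtract(movie, rank=1):
    """movie: (n_frames, height, width). Estimate the dominant (static or slowly
    varying) background as a low-rank approximation of the pixels-by-frames
    matrix and subtract it. This is a simplified stand-in for the HOSVD step;
    raising the rank removes more structure but can start absorbing real signal."""
    n_frames, h, w = movie.shape
    M = movie.reshape(n_frames, h * w).T                  # pixels x frames
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    background = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
    return (M - background).T.reshape(n_frames, h, w)

# toy movie: flat background plus one pixel that switches on between frames 50 and 150
rng = np.random.default_rng(2)
movie = 100.0 + rng.normal(0.0, 1.0, size=(200, 32, 32))
movie[50:150, 16, 16] += 25.0
trace = svd_background_subtract(movie)[:, 16, 16]         # intensity trace at that pixel
step = int(np.argmax(np.abs(np.diff(trace))))
print(step)   # a crude change-point estimate near one of the transitions (~49 or ~149)
```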
Information Power Grid (IPG) Tutorial 2003
NASA Technical Reports Server (NTRS)
Meyers, George
2003-01-01
For NASA and the general community today, Grid middleware: a) provides tools to access and use data sources (databases, instruments, ...); b) provides tools to access computing (unique and generic); and c) is an enabler of large-scale collaboration. Dynamically responding to needs is a key selling point of a grid: independent resources can be joined as appropriate to solve a problem. The middleware provides tools that enable the building of frameworks for applications, and it provides value-added services to the NASA user base for utilizing resources on the grid in new and more efficient ways.
MyGeoHub: A Collaborative Geospatial Research and Education Platform
NASA Astrophysics Data System (ADS)
Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.
2017-12-01
Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs), for geospatial data management, visualization and analysis. A data management building block iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings, GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enable reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway, instead it supports diverse needs ranging from just a feature-rich data management system, to complex scientific tools and workflows.
The EPA's human exposure research program for assessing cumulative risk in communities.
Zartarian, Valerie G; Schultz, Bradley D
2010-06-01
Communities are faced with challenges in identifying and prioritizing environmental issues, taking actions to reduce their exposures, and determining their effectiveness for reducing human health risks. Additional challenges include determining what scientific tools are available and most relevant, and understanding how to use those tools; given these barriers, community groups tend to rely more on risk perception than science. The U.S. Environmental Protection Agency's Office of Research and Development, National Exposure Research Laboratory (NERL) and collaborators are developing and applying tools (models, data, methods) for enhancing cumulative risk assessments. The NERL's "Cumulative Communities Research Program" focuses on key science questions: (1) How to systematically identify and prioritize key chemical stressors within a given community?; (2) How to develop estimates of exposure to multiple stressors for individuals in epidemiologic studies?; and (3) What tools can be used to assess community-level distributions of exposures for the development and evaluation of the effectiveness of risk reduction strategies? This paper provides community partners and scientific researchers with an understanding of the NERL research program and other efforts to address cumulative community risks; and key research needs and opportunities. Some initial findings include the following: (1) Many useful tools exist for components of risk assessment, but need to be developed collaboratively with end users and made more comprehensive and user-friendly for practical application; (2) Tools for quantifying cumulative risks and impact of community risk reduction activities are also needed; (3) More data are needed to assess community- and individual-level exposures, and to link exposure-related information with health effects; and (4) Additional research is needed to incorporate risk-modifying factors ("non-chemical stressors") into cumulative risk assessments. The products of this research program will advance the science for cumulative risk assessments and empower communities with information so that they can make informed, cost-effective decisions to improve public health.
Python tools for rapid development, calibration, and analysis of generalized groundwater-flow models
NASA Astrophysics Data System (ADS)
Starn, J. J.; Belitz, K.
2014-12-01
National-scale water-quality data sets for the United States have been available for several decades; however, groundwater models to interpret these data are available for only a small percentage of the country. Generalized models may be adequate to explain and project groundwater-quality trends at the national scale by using regional-scale models (defined as watersheds at or between the HUC-6 and HUC-8 levels). Coast-to-coast data such as the National Hydrologic Dataset Plus (NHD+) make it possible to extract the basic building blocks for a model anywhere in the country. IPython notebooks have been developed to automate the creation of generalized groundwater-flow models from the NHD+. The notebook format allows rapid testing of methods for model creation, calibration, and analysis. Capabilities within the Python ecosystem greatly speed up the development and testing of algorithms. GeoPandas is used for very efficient geospatial processing. Raster processing includes the Geospatial Data Abstraction Library and image-processing tools. Model creation is made possible through Flopy, a versatile input and output writer for several MODFLOW-based flow and transport model codes. Interpolation, integration, and map plotting from the standard Python tool stack also are used, making the notebook a comprehensive platform on which to build and evaluate general models. Models with alternative boundary conditions, numbers of layers, and cell spacings can be tested against one another and evaluated by using water-quality data. Novel calibration criteria were developed by comparing modeled heads to land-surface and surface-water elevations. Information, such as predicted age distributions, can be extracted from general models and tested for its ability to explain water-quality trends. Groundwater ages then can be correlated with horizontal and vertical hydrologic position, a relation that can be used for statistical assessment of likely groundwater-quality conditions. Convolution with age distributions can be used to quickly ascertain likely future water-quality conditions. Although these models are admittedly very general and are still being tested, the hope is that they will be useful for answering questions related to water quality at the regional scale.
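A minimal sketch of the kind of scripted model construction that Flopy enables inside such notebooks; the grid dimensions, hydraulic properties, boundary conditions and file names below are placeholders, whereas the real workflow derives them from NHD+ data.

```python
import flopy

# build a small single-layer MODFLOW-2005 model entirely from script
m = flopy.modflow.Modflow("general_model", exe_name="mf2005", model_ws="work")
flopy.modflow.ModflowDis(m, nlay=1, nrow=100, ncol=100,
                         delr=250.0, delc=250.0, top=50.0, botm=-50.0)
flopy.modflow.ModflowBas(m, ibound=1, strt=50.0)       # all cells active, flat starting heads
flopy.modflow.ModflowLpf(m, hk=10.0, vka=1.0)          # hydraulic conductivity (placeholder)
flopy.modflow.ModflowRch(m, rech=0.0005)               # areal recharge (placeholder)
flopy.modflow.ModflowDrn(m, stress_period_data={0: [[0, 50, 99, 40.0, 100.0]]})  # drain cell standing in for a stream
flopy.modflow.ModflowPcg(m)                            # solver
flopy.modflow.ModflowOc(m)                             # output control
m.write_input()
# m.run_model()   # requires the MODFLOW-2005 executable on the path
```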
Tiagabine in clinical practice: effects on seizure control and behavior.
Vossler, David G; Morris, George L; Harden, Cynthia L; Montouris, Georgia; Faught, Edward; Kanner, Andres M; Fix, Aaron; French, Jacqueline A
2013-08-01
Preapproval randomized controlled trials of antiepileptic drugs provide data in limited patient groups. We assessed the side effect and seizure reduction profile of tiagabine (TGB) in typical clinical practice. Investigators recorded adverse effect (AE), seizure, and assessment-of-benefit data prospectively in sequential patients treated open label with TGB. Two hundred ninety-two patients (39 children) were enrolled to be treated long term with TGB. Seizure types were focal-onset (86%), generalized-onset (12%), both focal- and generalized-onset (0.3%), and multiple associated with Lennox-Gastaut Syndrome (2%). Two hundred thirty-one received at least one dose of TGB (median = 28 mg/day) and had follow-up seizure or AE data reported. Common AEs were fatigue, dizziness, psychomotor slowing, ataxia, gastrointestinal upset, weight change, insomnia, and "others" (mostly behavioral). Serious AEs occurred in 19 patients: behavioral effects (n = 12), status epilepticus (n = 3), others (n = 3), and sudden unexplained death (n = 1). No patients experienced suicidal ideation/behavior, rash, nephrolithiasis, or organ failure. Seizure outcomes were seizure freedom (5%), ≥75% reduction (12%), ≥50% reduction (23%), and increased number of seizures (17%), or new seizure type (1%). Behavioral AEs occurred in a larger proportion of patients compared to those reported in TGB preapproval randomized controlled trials. A moderate percentage of patients had a meaningful reduction in seizure frequency. In clinical practice, TGB remains a useful antiepileptic drug. Copyright © 2013 Elsevier Inc. All rights reserved.
Doheny, H C; Whittington, M A; Jefferys, J G R; Patsalos, P N
2002-01-01
The tetanus toxin seizure model, which is associated with spontaneous and intermittent generalized and non-generalized seizures, is considered to reflect human complex partial epilepsy. The purpose of the present study was to investigate and compare the anticonvulsant effects of carbamazepine with those of levetiracetam, a new antiepileptic drug, in this model. One μl of tetanus toxin solution (containing 12 mLD50 μl−1 of tetanus toxin) was placed stereotactically into the rat left hippocampus, resulting in generalized and non-generalized seizures. Carbamazepine (4 mg kg−1 h−1) and levetiracetam (8 and 16 mg kg−1 h−1) were administered during a 7 day period via an osmotic minipump placed in the peritoneal cavity. Carbamazepine (4 mg kg−1 h−1) exhibited no significant anticonvulsant effect, compared to control, when the entire 7 day study period was evaluated, although the reduction in generalized seizures was greater (35.5%) than that for non-generalized seizures (12.6%). However, during the first 2 days of carbamazepine administration a significant reduction in both generalized seizure frequency (90%) and duration (25%) was observed; non-generalized seizures were unaffected. This time-dependent anticonvulsant effect exactly paralleled the central (CSF) and peripheral (serum) kinetics of carbamazepine, in that steady-state concentrations declined over time, with the highest concentrations achieved during the first 2 days. There was also a significant 27.3% reduction in the duration of generalized seizures during the 7 day study period (P=0.0001). Levetiracetam administration (8 and 16 mg kg−1 h−1) was associated with a dose-dependent reduction in the frequency of both generalized (39 vs 57%) and non-generalized (36 vs 41%) seizures; however, seizure suppression was more substantial for generalized seizures. A significant dose-dependent reduction in overall generalized seizure duration was also observed. These data provide experimental evidence for the clinical efficacy of levetiracetam in the management of patients with complex partial seizures. Furthermore, levetiracetam probably does not act by preventing ictogenesis per se but acts to reduce seizure severity and seizure generalization. PMID:11906955
Focus on...The right tools: Managing for fire using FIA inventory data.
USDA Forest Service
2003-01-01
The relative severity of recent fire seasons has led to numerous debates about the health, associated fire hazards, and effectiveness of fuel reduction treatments in forests across the United States. Scientific analyses of forest inventories offer policy makers and other interested parties objective information with which to make crucial forest management decisions....
Onboard Acoustic Data-Processing for the Statistical Analysis of Array Beam-Noise,
1980-12-15
performance of the sonar system as a measurement tool and others that can assess the character of the ambient- noise field at the time of the measurement. In...the plot as would "dead" hydrophones. A reduction in sensitivity of a hydrophone, a faulty preamplifier , or any other fault in the acoustic channel
CT Imaging, Data Reduction, and Visualization of Hardwood Logs
Daniel L. Schmoldt
1996-01-01
Computed tomography (CT) is a mathematical technique that, combined with noninvasive scanning such as x-ray imaging, has become a powerful tool to nondestructively test materials prior to use or to evaluate materials prior to processing. In the current context, hardwood lumber processing can benefit greatly from knowing what a log looks like prior to initial breakdown....
77 FR 18718 - Petroleum Reduction and Alternative Fuel Consumption Requirements for Federal Fleets
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-28
... Statistical Tool Web-based reporting system (FAST) for FY 2005. Moreover, section 438.102(b) would require... reflected in FY 2005 FAST data, or (2) the lesser of (a) five percent of total Federal fleet vehicle fuel... event that the Federal fleet's alternative fuel use value for FY 2005 submitted through FAST did not...
Frasher, Sarah K; Woodruff, Tracy M; Bouldin, Jennifer L
2016-06-01
In efforts to reduce nonpoint source runoff and improve water quality, Best Management Practices (BMPs) were implemented in the Outlet Larkin Creek Watershed. Farmers need to make scientifically informed decisions concerning BMPs that address contaminants from agricultural fields. The BMP Tool was developed from previous studies to estimate BMP effectiveness at reducing nonpoint source contaminants. The purpose of this study was to compare the measured percent reductions of dissolved phosphorus (DP) and total suspended solids with the percent reductions reported by the BMP Tool for validation. The BMP Tool predictions and the measured water quality parameters were similar. Construction of a sedimentation pond resulted in a 74%-76% reduction in DP, compared with the 80% predicted by the BMP Tool. However, further research is needed to validate the tool for additional water quality parameters. The BMP Tool is recommended for future BMP implementation as a useful predictor for farmers.
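For reference, the percent reduction being compared is the simple inflow/outflow calculation sketched below; the concentrations are hypothetical illustrations, not data from the study.

```python
def percent_reduction(inflow, outflow):
    """Percent reduction of a contaminant concentration across a BMP."""
    return 100.0 * (inflow - outflow) / inflow

# hypothetical dissolved-phosphorus concentrations (mg/L) up- and downstream of a sedimentation pond
measured = percent_reduction(inflow=0.50, outflow=0.125)
print(f"measured DP reduction: {measured:.0f}%   (BMP Tool prediction: 80%)")
```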
Tool for the Reduction and Assessment of Chemical and other Environmental Impacts
TRACI, the Tool for the Reduction and Assessment of Chemical and other environmental Impacts, has been developed by the US Environmental Protection Agency’s National Risk Management Research Laboratory to facilitate the characterization of stressors that have potential effects, ...
SUPER-FOCUS: a tool for agile functional analysis of shotgun metagenomic data
Green, Kevin T.; Dutilh, Bas E.; Edwards, Robert A.
2016-01-01
Summary: Analyzing the functional profile of a microbial community from unannotated shotgun sequencing reads is one of the important goals in metagenomics. Functional profiling has valuable applications in biological research because it identifies the abundances of the functional genes of the organisms present in the original sample, answering the question what they can do. Currently, available tools do not scale well with increasing data volumes, which is important because both the number and lengths of the reads produced by sequencing platforms keep increasing. Here, we introduce SUPER-FOCUS, SUbsystems Profile by databasE Reduction using FOCUS, an agile homology-based approach using a reduced reference database to report the subsystems present in metagenomic datasets and profile their abundances. SUPER-FOCUS was tested with over 70 real metagenomes, the results showing that it accurately predicts the subsystems present in the profiled microbial communities, and is up to 1000 times faster than other tools. Availability and implementation: SUPER-FOCUS was implemented in Python, and its source code and the tool website are freely available at https://edwards.sdsu.edu/SUPERFOCUS. Contact: redwards@mail.sdsu.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26454280
Robotic Seals as Therapeutic Tools in an Aged Care Facility: A Qualitative Study
Bodak, Marie; Barlas, Joanna; Harwood, June; Pether, Mary
2016-01-01
Robots, including robotic seals, have been used as an alternative to therapies such as animal assisted therapy in the promotion of health and social wellbeing of older people in aged care facilities. There is limited research available that evaluates the effectiveness of robot therapies in these settings. The aim of this study was to identify, explore, and describe the impact of the use of Paro robotic seals in an aged care facility in a regional Australian city. A qualitative, descriptive, exploratory design was employed. Data were gathered through interviews with the three recreational therapists employed at the facility who were also asked to maintain logs of their interactions with the Paro and residents. Data were transcribed and thematically analysed. Three major themes were identified from the analyses of these data: “a therapeutic tool that's not for everybody,” “every interaction is powerful,” and “keeping the momentum.” Findings support the use of Paro as a therapeutic tool, revealing improvement in emotional state, reduction of challenging behaviours, and improvement in social interactions of residents. The potential benefits justify the investment in Paro, with clear evidence that these tools can have a positive impact that warrants further exploration. PMID:27990301
SUPER-FOCUS: a tool for agile functional analysis of shotgun metagenomic data.
Silva, Genivaldo Gueiros Z; Green, Kevin T; Dutilh, Bas E; Edwards, Robert A
2016-02-01
Analyzing the functional profile of a microbial community from unannotated shotgun sequencing reads is one of the important goals in metagenomics. Functional profiling has valuable applications in biological research because it identifies the abundances of the functional genes of the organisms present in the original sample, answering the question what they can do. Currently, available tools do not scale well with increasing data volumes, which is important because both the number and lengths of the reads produced by sequencing platforms keep increasing. Here, we introduce SUPER-FOCUS, SUbsystems Profile by databasE Reduction using FOCUS, an agile homology-based approach using a reduced reference database to report the subsystems present in metagenomic datasets and profile their abundances. SUPER-FOCUS was tested with over 70 real metagenomes, the results showing that it accurately predicts the subsystems present in the profiled microbial communities, and is up to 1000 times faster than other tools. SUPER-FOCUS was implemented in Python, and its source code and the tool website are freely available at https://edwards.sdsu.edu/SUPERFOCUS. redwards@mail.sdsu.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
Design and Evaluation of a Web-Based Symptom Monitoring Tool for Heart Failure.
Wakefield, Bonnie J; Alexander, Gregory; Dohrmann, Mary; Richardson, James
2017-05-01
Heart failure is a chronic condition where symptom recognition and between-visit communication with providers are critical. Patients are encouraged to track disease-specific data, such as weight and shortness of breath. Use of a Web-based tool that facilitates data display in graph form may help patients recognize exacerbations and more easily communicate out-of-range data to clinicians. The purposes of this study were to (1) design a Web-based tool to facilitate symptom monitoring and symptom recognition in patients with chronic heart failure and (2) conduct a usability evaluation of the Web site. Patient participants generally had a positive view of the Web site and indicated it would support recording their health status and communicating with their doctors. Clinician participants generally had a positive view of the Web site and indicated it would be a potentially useful adjunct to electronic health delivery systems. Participants expressed a need to incorporate decision support within the site and wanted to add other data, for example, blood pressure, and have the ability to adjust font size. A few expressed concerns about data privacy and security. Technologies require careful design and testing to ensure they are useful, usable, and safe for patients and do not add to the burden of busy providers.
Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele
QUEST (\\url{www.quest-scidac.org}) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a historymore » of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (\\url{muq.mit.edu}).« less
NASA Astrophysics Data System (ADS)
Hahlbeck, N.; Scales, K. L.; Hazen, E. L.; Bograd, S. J.
2016-12-01
The reduction of bycatch, or incidental capture of non-target species in a fishery, is a key objective of ecosystem-based fisheries management (EBFM) and critical to the conservation of many threatened marine species. Prediction of bycatch events is therefore of great importance to EBFM efforts. Here, bycatch of the ocean sunfish (Mola mola) and bluefin tuna (Thunnus thynnus) in the California drift gillnet fishery is modeled using a suite of remotely sensed environmental variables as predictors. Data from 8321 gillnet sets were aggregated by month to reduce zero inflation and autocorrelation among sets, and a set of a priori generalized additive models (GAMs) was created for each species based on literature review and preliminary data exploration. Each of the models was fit using a binomial family with a logit link in R, and Akaike's Information Criterion with correction (AICc) was used in the first stage of model selection. K-fold cross validation was used in the second stage of model selection and performance assessment, using the least-squares linear model of predicted vs. observed values as the performance metric. The best-performing mola model indicated a strong, nearly linear negative correlation with sea surface temperature, as well as weaker nonlinear correlations with eddy kinetic energy, chlorophyll-a concentration and rugosity. These findings are consistent with current understanding of ocean sunfish habitat use; for example, previous studies suggest seasonal movement patterns and exploitation of dynamic, highly productive areas characteristic of upwelling regions. Preliminary results from the bluefin models also indicate seasonal fluctuation and correlation with environmental variables. These models can be used with near-real-time satellite data as bycatch avoidance tools for both fishers and managers, allowing for the use of more dynamic ocean management strategies to improve the sustainability of the fishery.
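The study fits binomial GAMs in R; the sketch below reproduces the model-selection logic with a plain logistic GLM in Python (statsmodels) so that the AICc step is explicit. The synthetic monthly data and the candidate predictor sets are placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400  # synthetic stand-in for monthly aggregates of gillnet sets
df = pd.DataFrame({
    "sst": rng.uniform(12, 22, n),        # sea surface temperature
    "eke": rng.gamma(2.0, 0.05, n),       # eddy kinetic energy
    "chla": rng.lognormal(0.0, 0.5, n),   # chlorophyll-a concentration
})
logit_p = 3.0 - 0.25 * df["sst"] + 2.0 * df["eke"]          # made-up true relationship
df["bycatch"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

def aicc(result, n_obs):
    k = result.df_model + 1                                  # regressors plus intercept
    return result.aic + 2 * k * (k + 1) / (n_obs - k - 1)

candidates = {
    "sst": ["sst"],
    "sst+eke": ["sst", "eke"],
    "sst+eke+chla": ["sst", "eke", "chla"],
}
for name, cols in candidates.items():
    X = sm.add_constant(df[cols])
    fit = sm.GLM(df["bycatch"], X, family=sm.families.Binomial()).fit()  # logit link is the default
    print(f"{name:15s} AICc = {aicc(fit, n):.1f}")
```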
MAGMA: Generalized Gene-Set Analysis of GWAS Data
de Leeuw, Christiaan A.; Mooij, Joris M.; Heskes, Tom; Posthuma, Danielle
2015-01-01
By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues. Statistical power for most methods is strongly affected by linkage disequilibrium between markers, multi-marker associations are often hard to detect, and the reliance on permutation to compute p-values tends to make the analysis computationally very expensive. To address these issues we have developed MAGMA, a novel tool for gene and gene-set analysis. The gene analysis is based on a multiple regression model, to provide better statistical performance. The gene-set analysis is built as a separate layer around the gene analysis for additional flexibility. This gene-set analysis also uses a regression structure to allow generalization to analysis of continuous properties of genes and simultaneous analysis of multiple gene sets and other gene properties. Simulations and an analysis of Crohn’s Disease data are used to evaluate the performance of MAGMA and to compare it to a number of other gene and gene-set analysis tools. The results show that MAGMA has significantly more power than other tools for both the gene and the gene-set analysis, identifying more genes and gene sets associated with Crohn’s Disease while maintaining a correct type 1 error rate. Moreover, the MAGMA analysis of the Crohn’s Disease data was found to be considerably faster as well. PMID:25885710
MAGMA: generalized gene-set analysis of GWAS data.
de Leeuw, Christiaan A; Mooij, Joris M; Heskes, Tom; Posthuma, Danielle
2015-04-01
By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues. Statistical power for most methods is strongly affected by linkage disequilibrium between markers, multi-marker associations are often hard to detect, and the reliance on permutation to compute p-values tends to make the analysis computationally very expensive. To address these issues we have developed MAGMA, a novel tool for gene and gene-set analysis. The gene analysis is based on a multiple regression model, to provide better statistical performance. The gene-set analysis is built as a separate layer around the gene analysis for additional flexibility. This gene-set analysis also uses a regression structure to allow generalization to analysis of continuous properties of genes and simultaneous analysis of multiple gene sets and other gene properties. Simulations and an analysis of Crohn's Disease data are used to evaluate the performance of MAGMA and to compare it to a number of other gene and gene-set analysis tools. The results show that MAGMA has significantly more power than other tools for both the gene and the gene-set analysis, identifying more genes and gene sets associated with Crohn's Disease while maintaining a correct type 1 error rate. Moreover, the MAGMA analysis of the Crohn's Disease data was found to be considerably faster as well.
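A highly simplified sketch of the multiple-regression idea behind the gene analysis, not MAGMA's implementation: the phenotype is regressed on principal components of one gene's SNP genotype matrix and the joint association is tested with an F-test. The genotypes, phenotype and dimensions are synthetic placeholders.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_samples, n_snps = 2000, 25

# synthetic genotypes (0/1/2 allele counts) for one gene, and a phenotype
# weakly influenced by two of the SNPs
G = rng.binomial(2, 0.3, size=(n_samples, n_snps)).astype(float)
y = 0.15 * G[:, 3] + 0.1 * G[:, 7] + rng.normal(size=n_samples)

# project the (possibly correlated) SNPs onto their leading principal components,
# which keeps the multiple regression well conditioned despite linkage disequilibrium
Gc = G - G.mean(axis=0)
U, s, Vt = np.linalg.svd(Gc, full_matrices=False)
n_pc = 10
pcs = U[:, :n_pc] * s[:n_pc]

model = sm.OLS(y, sm.add_constant(pcs)).fit()
print(f"gene-level F = {model.fvalue:.2f}, p = {model.f_pvalue:.2e}")
```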
Antenatal services for Aboriginal women: the relevance of cultural competence.
Reibel, Tracy; Walker, Roz
2010-01-01
Due to persistent significantly poorer Aboriginal perinatal outcomes, the Women's and Newborns' Health Network, Western Australian Department of Health, required a comprehensive appraisal of antenatal services available to Aboriginal women as a starting point for future service delivery modelling. A services audit was conducted to ascertain the usage frequency and characteristics of antenatal services used by Aboriginal women in Western Australia (WA). Telephone interviews were undertaken with eligible antenatal services utilising a purpose specific service audit tool comprising questions in five categories: 1) general characteristics; 2) risk assessment; 3) treatment, risk reduction and education; 4) access; and 5) quality of care. Data were analysed according to routine antenatal care (e.g. risk assessment, treatment and risk reduction), service status (Aboriginal specific or non-specific) and application of cultural responsiveness. Significant gaps in appropriate antenatal services for Aboriginal women in metropolitan, rural and remote regions in WA were evident. Approximately 75% of antenatal services used by Aboriginal women have not achieved a model of service delivery consistent with the principles of culturally responsive care, with few services incorporating Aboriginal specific antenatal protocols/programme, maintaining access or employing Aboriginal Health Workers (AHWs). Of 42 audited services, 18 Aboriginal specific and 24 general antenatal services reported utilisation by Aboriginal women. Of these, nine were identified as providing culturally responsive service delivery, incorporating key indicators of cultural security combined with highly consistent delivery of routine antenatal care. One service was located in the metropolitan area and eight in rural or remote locations. The audit of antenatal services in WA represents a significant step towards a detailed understanding of which services are most highly utilised and their defining characteristics. The cultural responsiveness indicators used in the audit establish benchmarks for planning culturally appropriate antenatal services that may encourage Aboriginal women to more frequently attend antenatal visits.
Venous thromboembolism prophylaxis risk assessment in a general surgery cohort: a closed-loop audit.
McGoldrick, D M; Redmond, H P
2017-08-01
Venous thromboembolism (VTE) is a potential source of morbidity and mortality in surgical in-patients. A number of guidelines exist that advise on prophylactic measures. We aimed to assess VTE prophylaxis prescribing practices and compliance with a kardex-based risk assessment tool in a general surgery population. Data on general surgery in-patients were collected on two separate wards on two separate days. Drug kardexes were assessed for VTE prophylaxis measures and use of the risk assessment tool. NICE and SIGN guidelines were adopted as a gold standard. The audit results and information on the risk assessment tool were presented as an educational intervention at two separate departmental teaching sessions. A re-audit was completed after 3 months. In Audit A, 74 patients were assessed. 70% were emergency admissions. The risk assessment tool was completed in 2.7%. 75 and 97% of patients were correctly prescribed anti-embolic stockings (AES) and low-molecular weight heparin (LMWH), respectively. 30 patients were included in Audit B, 56% of whom were emergency admissions. 66% had a risk assessment performed, a statistically significant improvement (p < 0.0001). Rates of LMWH prescribing were similar (96%), but AES prescribing was lower (36%). Rates of LMWH prescribing are high in this general surgical population, although AES prescribing rates vary. Use of the VTE risk assessment tool increased following the initial audit and intervention.
A comparison of four geophysical methods for determining the shear wave velocity of soils
Anderson, N.; Thitimakorn, T.; Ismail, A.; Hoffman, D.
2007-01-01
The Missouri Department of Transportation (MoDOT) routinely acquires seismic cone penetrometer (SCPT) shear wave velocity control as part of the routine investigation of soils within the Mississippi Embayment. In an effort to ensure their geotechnical investigations are as effective and efficient as possible, the SCPT tool and several available alternatives (crosshole [CH]; multichannel analysis of surface waves [MASW]; and refraction microtremor [ReMi]) were evaluated and compared on the basis of field data acquired at two test sites in southeast Missouri. These four methods were ranked in terms of accuracy, functionality, cost, other considerations, and overall utility. It is concluded that MASW data are generally more reliable than SCPT data, comparable to quality ReMi data, and only slightly less accurate than CH data. However, the other advantages of MASW generally make it a superior choice over the CH, SCPT, and ReMi methods for general soil classification purposes to depths of 30 m. MASW data are less expensive than CH data and SCPT data and can normally be acquired in areas inaccessible to drill and SCPT rigs. In contrast to the MASW tool, quality ReMi data can be acquired only in areas where there are interpretable levels of "passive" acoustic energy and only when the geophone array is aligned with the source(s) of such energy.
Machine Learning: A Crucial Tool for Sensor Design
Zhao, Weixiang; Bhushan, Abhinav; Santamaria, Anthony D.; Simon, Melinda G.; Davis, Cristina E.
2009-01-01
Sensors have been widely used for disease diagnosis, environmental quality monitoring, food quality control, industrial process analysis and control, and other related fields. As a key tool for sensor data analysis, machine learning is becoming a core part of novel sensor design. Dividing a complete machine learning process into three steps: data pre-treatment, feature extraction and dimension reduction, and system modeling, this paper provides a review of the methods that are widely used for each step. For each method, the principles and the key issues that affect modeling results are discussed. After reviewing the potential problems in machine learning processes, this paper gives a summary of current algorithms in this field and provides some feasible directions for future studies. PMID:20191110
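A minimal sketch of the three-step structure described above, expressed as a scikit-learn pipeline (pre-treatment, feature extraction/dimension reduction, system modeling); the sensor matrix and class labels are synthetic placeholders.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# placeholder sensor responses: 300 samples x 500 raw channels, two classes
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 500))
y = rng.integers(0, 2, size=300)
X[y == 1, :20] += 0.8              # inject a weak class-dependent signature

clf = Pipeline([
    ("pretreatment", StandardScaler()),     # step 1: data pre-treatment
    ("reduction", PCA(n_components=15)),    # step 2: feature extraction / dimension reduction
    ("model", SVC(kernel="rbf", C=1.0)),    # step 3: system modeling
])
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```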
Staff, Michael
2012-01-01
The review of clinical data extracted from electronic records is increasingly being used as a tool to assist general practitioners (GPs) in managing their patients in Australia. Type 2 diabetes (T2DM) is a chronic condition cared for primarily in the general practice setting that lends itself to the application of tools in this area. To assess the feasibility of extracting data from a general practice medical record software package to predict clinically significant outcomes for patients with T2DM, a pilot study was conducted involving two large practices where routinely collected clinical data were extracted and input into the United Kingdom Prospective Diabetes Study (UKPDS) Outcomes Model to predict life expectancy. An initial assessment of the completeness of the available data was performed, and life expectancies were then estimated for those patients aged between 45 and 64 years with adequate data. A total of 1019 patients were identified as current patients with T2DM. Sufficient data were available for 40% of patients from one practice and 49% from the other to provide inputs to the UKPDS Outcomes Model. Predicted life expectancy was similar across the practices, with women having longer life expectancies than men. Improved compliance with current management guidelines for glycaemic, lipid and blood pressure control was shown to increase life expectancy by between 1.0 and 2.4 years, depending on gender and age group. This pilot demonstrated that clinical data extraction from electronic records is feasible, although there are several limitations, chiefly caused by the incompleteness of data for patients with T2DM.
NASA Technical Reports Server (NTRS)
Roskam, J.; Grosveld, F.
1980-01-01
The effects of panel curvature and oblique angle of sound incidence on the noise reduction characteristics of an aluminum panel are experimentally investigated. The panel curvature results show a significant increase in stiffness with a comparable decrease in sound transmission through the panel in the frequency region below the panel/cavity resonance frequency. Noise reduction data have been obtained for aluminum panels with clamped, bonded and riveted edge conditions; these edge conditions are shown to influence the noise reduction characteristics of aluminum panels. Experimentally measured noise reduction characteristics of flat aluminum panels with uniaxial and biaxial in-plane stresses are presented and discussed. The results indicate an important improvement in the noise reduction of these panels in the frequency range below the fundamental panel/cavity resonance frequency.
Spark and HPC for High Energy Physics Data Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sehrish, Saba; Kowalkowski, Jim; Paterno, Marc
A full High Energy Physics (HEP) data analysis is divided into multiple data reduction phases. Processing within these phases is extremely time consuming; therefore, intermediate results are stored in files held in mass storage systems and referenced as part of large datasets. This processing model limits what can be done with interactive data analytics. Growth in the size and complexity of experimental datasets, along with emerging big data tools, is beginning to change the traditional ways of doing data analyses. Use of big data tools for HEP analysis looks promising, mainly because extremely large HEP datasets can be represented and held in memory across a system, and accessed interactively by encoding an analysis using high-level programming abstractions. The mainstream tools, however, are not designed for scientific computing or for exploiting the available HPC platform features. We use an example from the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) in Geneva, Switzerland. The LHC is the highest energy particle collider in the world. Our use case focuses on searching for new types of elementary particles explaining Dark Matter in the universe. We use HDF5 as our input data format, and Spark to implement the use case. We show the benefits and limitations of using Spark with HDF5 on Edison at NERSC.
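A minimal sketch of the pattern described above: event columns are read from HDF5 with h5py and handed to Spark for an interactive selection. The file name, dataset paths and cut values are hypothetical, and a production analysis on an HPC system would partition the reads across many files and nodes rather than loading everything on the driver.

```python
import h5py
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hep-hdf5-sketch").getOrCreate()

# read a few columns of one (hypothetical) HDF5 event file on the driver
with h5py.File("events.h5", "r") as f:
    met = f["events/missing_et"][:]     # hypothetical dataset names
    njets = f["events/n_jets"][:]

rows = [(float(m), int(n)) for m, n in zip(met, njets)]
events = spark.createDataFrame(rows, ["missing_et", "n_jets"])

# interactive selection: a simple dark-matter-style cut on missing transverse energy
selected = events.filter((events.missing_et > 200.0) & (events.n_jets >= 2))
print(selected.count())
spark.stop()
```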
NASA Astrophysics Data System (ADS)
Santiago, José M.; Alonso, Carlos; García de Jalón, Diego; Solana, Joaquín
2017-04-01
Streamflow and temperature regimes determine the availability of suitable physical habitat for instream biological communities. Iberian brown trout (Salmo trutta) populations live at a climatic border at which summer water scarcity and rising temperatures will compromise their viability throughout the current century. Because of their limited mobility, the sessile stages of the trout life cycle (i.e. eggs and larvae) are among the organisms most sensitive to changing environmental conditions. At a given spawning redd, thermal habitat is limited by the length of the period during which suitable temperatures occur. At the same time, suitable physical habitat is limited by the instream flow regime during spawning and incubation of eggs and larvae. Temperature and flow also interact, producing synergistic effects on both physical and thermal habitats. This study aims to quantitatively predict thermal and physical habitat loss for the sessile stages of the brown trout life cycle due to climate change, in mountain streams at the rear edge of the species' natural distribution, using high-resolution spatial-temporal simulations of the thermal and physical habitat. Two streams of Central Spain have been studied (Cega and Lozoya streams). Daily temperature and flow data from ad hoc downscaled IPCC (RCP4.5 and RCP8.5) predictions were used as input variables. Physical habitat changes were simulated from previously predicted streamflow data by means of hydraulic simulation tools (River2D). By taking into account the thermal tolerance limits and the proportion of lost physical habitat, limiting factors for the reproduction of brown trout in the study area were determined. The general increase in mean temperatures shortens the duration of the early developmental stages. This reduction of the sessile period is rather similar in the RCP4.5 and RCP8.5 scenarios by 2050; differences between the two scenarios become greater by 2099. The duration of the sessile developmental stages is reduced by 12 days (-10%) under scenario RCP4.5 and by as much as 30 days (-25%) under RCP8.5 in the Cega stream. The reduction of this sessile period in the Lozoya stream ranges between 14 days (-12%) under RCP4.5 and 35 days (-29%) under RCP8.5. However, this acceleration of development is not sufficient to compensate for the much greater reduction of the thermal window in which mean water temperature remains below 10°C (considered a critical threshold). In the Cega stream, the reduction of the suitable thermal window will range between 21% (RCP4.5) and 49% (RCP8.5) by 2099. In contrast, the Lozoya stream will lose much less time of suitable temperatures by 2099: 3% and 21%, according to RCP4.5 and RCP8.5, respectively. Although habitat reductions will be significant during the spawning season, the most important problems for trout population viability appear to be related to the reduction of the available time window for embryos and larvae to complete their development. In addition, owing to the differential sensitivity of instream thermal habitat to a general increase in air temperature, locally adapted mitigation programmes are highly recommended to avoid a general retraction of the species' current native range.
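A minimal sketch of the thermal-window bookkeeping described above: counting, for a daily mean water temperature series, the days of the spawning/incubation season that stay below the 10°C threshold. The synthetic series and season dates are placeholders for the downscaled scenario data used in the study.

```python
import numpy as np
import pandas as pd

# synthetic daily mean water temperatures for one hydrological year
days = pd.date_range("2050-10-01", "2051-09-30", freq="D")
seasonal = 9.0 + 7.0 * np.sin(2 * np.pi * (days.dayofyear - 100) / 365.0)
temps = pd.Series(seasonal + np.random.default_rng(0).normal(0, 0.5, len(days)), index=days)

def suitable_window_days(temps, season_start, season_end, threshold=10.0):
    """Days within the spawning/incubation season whose mean temperature is below the threshold."""
    season = temps.loc[season_start:season_end]
    return int((season < threshold).sum())

print(suitable_window_days(temps, "2050-11-15", "2051-05-31"),
      "suitable days in the incubation season")
```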
New Python-based methods for data processing
Sauter, Nicholas K.; Hattne, Johan; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel
2013-01-01
Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h−1) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units. PMID:23793153
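A generic sketch of the multiprocessing pattern that this kind of high-throughput per-image analysis relies on; it is not the cctbx.spotfinder API. Each diffraction image is analysed independently, so a process pool scales the work across cores; the image loader and thresholding spot counter are placeholder stand-ins.

```python
import glob
from multiprocessing import Pool

import numpy as np

def count_bragg_spots(path):
    """Placeholder per-image analysis: load an image and count bright pixels
    above a crude threshold (a real spot finder does considerably more)."""
    image = np.load(path)                         # assumes images stored as .npy arrays
    threshold = image.mean() + 5 * image.std()
    return path, int((image > threshold).sum())

if __name__ == "__main__":
    paths = sorted(glob.glob("run_0001/*.npy"))   # hypothetical image files
    with Pool(processes=8) as pool:
        for path, n_spots in pool.imap_unordered(count_bragg_spots, paths):
            print(f"{path}: {n_spots} candidate spots")
```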
The use of the sexual function questionnaire as a screening tool for women with sexual dysfunction.
Quirk, Frances; Haughie, Scott; Symonds, Tara
2005-07-01
To determine if the validated Sexual Function Questionnaire (SFQ), developed to assess efficacy in female sexual dysfunction (FSD) clinical trials, may also have utility in identifying target populations for such studies. Data from five clinical trials and two general population surveys were used to analyze the utility of the SFQ as a tool to discriminate between the presence of specific components of FSD (i.e., hypoactive sexual desire disorder, female sexual arousal disorder, female orgasmic disorder, and dyspareunia). Sensitivity/specificity analysis and logistic regression analysis, using data from all five clinical studies and the general population surveys, confirmed that the SFQ domains have utility in detecting the presence of specific components of FSD and provide scores indicative of the presence of a specific sexual disorder. The SFQ is a valuable new tool for detecting the presence of FSD and identifying the specific components of sexual functions affected (desire, arousal, orgasm, or dyspareunia).
NASA Technical Reports Server (NTRS)
Nieberding, W. C.
1975-01-01
A general discussion of various methods that can be used to reduce energy consumption is presented. A very brief description of Lewis Research Center facilities is given and the energy reduction methods are discussed relative to them. Some specific examples (i.e., automated equipment and data systems) of the implementation of the energy reduction methods are included.
Texture control of zircaloy tubing during tube reduction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagai, N.; Kakuma, T.; Fujita, K.
1982-01-01
Seven batches of Zircaloy-2 nuclear fuel cladding tubes with different textures were processed from tube shells of the same size, by different reduction routes, using pilger and 3-roll mills. Based on the texture data of these tubes, the texture control of Zircaloy tubing, the texture gradient across the wall, and the texture change during annealing were studied. The deformation texture of Zircaloy-2 tubing was dependent on the tool's curvature and was independent of the dimensions of the mother tubes. Different slopes of texture gradient were observed between tubing with a higher strain ratio and tubing with a lower strain ratio.
NASA Technical Reports Server (NTRS)
Swenson, Harry N.; Vincent, Danny; Tobias, Leonard (Technical Monitor)
1997-01-01
NASA and the FAA have designed and developed an automation tool known as the Traffic Management Advisor (TMA). The system was operationally evaluated at the Ft. Worth Air Route Traffic Control Center (ARTCC). The TMA is a time-based strategic planning tool that provides Traffic Management Coordinators and En Route Air Traffic Controllers the ability to efficiently optimize the capacity of a demand-impacted airport. The TMA consists of trajectory prediction, constraint-based runway scheduling, traffic flow visualization and controller advisories. The TMA was used and operationally evaluated for forty-one rush traffic periods during a one-month period in the summer of 1996. The evaluations included all shifts of air traffic operations as well as periods of inclement weather. Performance data were collected for engineering and human factors analysis and compared with similar operations without the TMA. The engineering data indicate that operations with the TMA show a one- to two-minute per aircraft delay reduction during rush periods. The human factors data indicate a perceived reduction in en route controller workload as well as an increase in job satisfaction. Upon completion of the evaluation, the TMA has become part of the normal operations at the Ft. Worth ARTCC.
TRACI - THE TOOL FOR THE REDUCTION AND ASSESSMENT OF CHEMICAL AND OTHER ENVIRONMENTAL IMPACTS
TRACI, The Tool for the Reduction and Assessment of Chemical and other environmental Impacts, is described along with its history, the underlying research, methodologies, and insights within individual impact categories. TRACI facilitates the characterization of stressors that ma...
Shen, Angela K; Warnock, Rob; Brereton, Stephaeno; McKean, Stephen; Wernecke, Michael; Chu, Steve; Kelman, Jeffrey A
2018-04-11
Older adults are at great risk of developing serious complications from seasonal influenza. We explore vaccination coverage estimates in the Medicare population through the use of administrative claims data and describe a tool designed to help shape outreach efforts and inform strategies to help raise influenza vaccination rates. This interactive mapping tool uses claims data to compare vaccination levels between geographic (i.e., state, county, zip code) and demographic (i.e., race, age) groups at different points in a season. Trends can also be compared across seasons. Utilization of this tool can assist key actors interested in prevention (medical groups, health plans, hospitals, and state and local public health authorities) in supporting strategies for reaching pools of unvaccinated beneficiaries where general national population estimates of coverage are less informative. Implementing evidence-based tools of this kind can help address persistent racial and ethnic disparities and prevent a substantial number of influenza cases and hospitalizations.
Data-Driven Model Reduction and Transfer Operator Approximation
NASA Astrophysics Data System (ADS)
Klus, Stefan; Nüske, Feliks; Koltai, Péter; Wu, Hao; Kevrekidis, Ioannis; Schütte, Christof; Noé, Frank
2018-06-01
In this review paper, we will present different data-driven dimension reduction techniques for dynamical systems that are based on transfer operator theory as well as methods to approximate transfer operators and their eigenvalues, eigenfunctions, and eigenmodes. The goal is to point out similarities and differences between methods developed independently by the dynamical systems, fluid dynamics, and molecular dynamics communities such as time-lagged independent component analysis, dynamic mode decomposition, and their respective generalizations. As a result, extensions and best practices developed for one particular method can be carried over to other related methods.
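As an illustration of one of the methods surveyed (dynamic mode decomposition), the following is a minimal NumPy sketch of exact DMD applied to a toy linear system; it is not code from the review itself.

```python
# Minimal exact-DMD sketch in NumPy, illustrating the transfer-operator-style
# approximations surveyed in the review (illustrative only).
import numpy as np

def dmd(X, Y, rank=None):
    """Approximate the linear operator A with Y ~ A X and return its
    eigenvalues and (exact) DMD modes."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    if rank is not None:
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank, :]
    # Reduced operator in the POD basis.
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes

# Toy example: snapshots of a two-dimensional rotation.
theta = 0.1
A_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
rng = np.random.default_rng(0)
X = rng.normal(size=(2, 200))
Y = A_true @ X
eigvals, _ = dmd(X, Y)
print(eigvals)  # should lie close to exp(+/- i*theta) on the unit circle
```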
Air Markets Program Data (AMPD)
The Air Markets Program Data tool allows users to search EPA data to answer scientific, general, policy, and regulatory questions about industry emissions. Air Markets Program Data (AMPD) is a web-based application that allows users easy access to both current and historical data collected as part of EPA's emissions trading programs. This site allows you to create and view reports and to download emissions data for further analysis. AMPD provides a query tool so users can create custom queries of industry source emissions data, allowance data, compliance data, and facility attributes. In addition, AMPD provides interactive maps, charts, reports, and pre-packaged datasets. AMPD does not require any additional software, plug-ins, or security controls and can be accessed using a standard web browser.
Experimental evaluation of tool run-out in micro milling
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta
2018-05-01
This paper deals with the micro milling cutting process, focusing on tool run-out measurement. Indeed, among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research aims to develop an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting-edge phase measurement is based on force signal analysis. The procedure has been tested on data from micro milling experiments performed on a Ti6Al4V sample. The results showed that it can be successfully used for tool run-out estimation.
Sparse principal component analysis in medical shape modeling
NASA Astrophysics Data System (ADS)
Sjöstrand, Karl; Stegmann, Mikkel B.; Larsen, Rasmus
2006-03-01
Principal component analysis (PCA) is a widely used tool in medical image analysis for data reduction, model building, and data understanding and exploration. While PCA is a holistic approach where each new variable is a linear combination of all original variables, sparse PCA (SPCA) aims at producing easily interpreted models through sparse loadings, i.e. each new variable is a linear combination of a subset of the original variables. One of the aims of using SPCA is the possible separation of the results into isolated and easily identifiable effects. This article introduces SPCA for shape analysis in medicine. Results for three different data sets are given in relation to standard PCA and sparse PCA by simple thresholding of small loadings. Focus is on a recent algorithm for computing sparse principal components, but a review of other approaches is supplied as well. The SPCA algorithm has been implemented using Matlab and is available for download. The general behavior of the algorithm is investigated, and strengths and weaknesses are discussed. The original report on the SPCA algorithm argues that the ordering of modes is not an issue. We disagree on this point and propose several approaches to establish sensible orderings. A method that orders modes by decreasing variance and maximizes the sum of variances for all modes is presented and investigated in detail.
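For readers who want to reproduce the idea of sparse loadings, a minimal sketch using scikit-learn's SparsePCA is shown below; the article's own implementation is in Matlab, so this is only an illustrative stand-in on synthetic data.

```python
# Illustrative comparison of dense PCA loadings vs. sparse PCA loadings.
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 30))          # stand-in for aligned shape coordinates
X -= X.mean(axis=0)                      # centre the data, as in PCA

pca = PCA(n_components=5).fit(X)
spca = SparsePCA(n_components=5, alpha=1.0, random_state=0).fit(X)

# Sparse loadings: each mode involves only a subset of the original variables,
# which is what makes the modes easier to interpret as localised effects.
print("non-zero loadings per PCA mode:   ", (np.abs(pca.components_) > 1e-12).sum(axis=1))
print("non-zero loadings per sparse mode:", (np.abs(spca.components_) > 1e-12).sum(axis=1))
```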
Korošec, Živa; Pravst, Igor
2014-09-04
Processed foods are recognized as a major contributor to high dietary sodium intake, associated with increased risk of cardiovascular disease. Different public health actions are being introduced to reduce sodium content in processed foods and sodium intake in general. A gradual reduction of sodium content in processed foods was proposed in Slovenia, but monitoring sodium content in the food supply is essential to evaluate the progress. Our primary objective was to test a new approach for assessing the sales-weighted average sodium content of prepacked foods on the market. We show that a combination of 12-month food sales data provided by food retailers covering the majority of the national market and a comprehensive food composition database compiled using food labelling data represent a robust and cost-effective approach to assessing the sales-weighted average sodium content of prepacked foods. Food categories with the highest sodium content were processed meats (particularly dry cured meat), ready meals (especially frozen pizza) and cheese. The reported results show that in most investigated food categories, market leaders in the Slovenian market have lower sodium contents than the category average. The proposed method represents an excellent tool for monitoring sodium content in the food supply.
2016-01-01
for services offers promising opportunities for savings. One force-shaping tool at the department’s disposal is the ability to convert military...Manpower Data Center (DMDC) data • discussions with human resource, manpower, and budget experts across DoD who have experience with military-to...that are not less than the savings achieved from reductions in military end strength. Moreover, the services are prohibited by statute from
Topographic variation in structure of mixed-conifer forests under an active-fire regime
Jamie Lydersen; Malcolm North
2012-01-01
Management efforts to promote forest resiliency as climate changes have often used historical forest conditions to provide general guidance for fuels reduction and forest restoration treatments. However, it has been difficult to identify what stand conditions might be fire and drought resilient because historical data and reconstruction studies are generally limited to...
Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.
Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben
2017-06-06
Modeling complex time-course patterns is a challenging issue in microarray studies due to the complex gene expression patterns that arise in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric generalized correlation analysis could be a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in omics data when studying their association with disease and health.
Tools and Data Services from the NASA Earth Satellite Observations for Climate Applications
NASA Technical Reports Server (NTRS)
Vicente, Gilberto A.
2005-01-01
Climate science and applications require access to vast amounts of archived high quality data, software tools and services for data manipulation and information extraction. These, in turn, require a detailed understanding of the data's internal structure and physical implementation before data reduction, combination and data product production can be carried out. This time-consuming task must be undertaken before the core investigation can begin and is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets of different formats, structures, and resolutions. In order to address these issues the Goddard Space Flight Center (GSFC) Earth Sciences (GES), Data and Information Service Center (DISC) Distributed Active Archive Center (DAAC) has made great progress in facilitating science and applications research by developing innovative tools and data services applied to the Earth sciences atmospheric and climate data. The GES/DISC/DAAC has successfully implemented and maintained a long-term climate satellite data archive and developed tools and services for a variety of atmospheric science missions and instruments, including AIRS, AVHRR, MODIS, SeaWiFS, SORCE, TOMS, TOVS, TRMM, UARS and Aura, providing researchers with excellent opportunities to acquire accurate and continuous atmospheric measurements. Since the number of climate science products from these various missions is steadily increasing as a result of more sophisticated sensors and new science algorithms, the main challenge for data centers like the GES/DISC/DAAC is to guide users through the variety of data sets and products, provide tools to visualize and reduce the volume of the data, and secure uninterrupted and reliable access to data and related products. This presentation will describe the effort at the GES/DISC/DAAC to build a bridge between multi-sensor data and the effective scientific use of the data, with an emphasis on the heritage satellite observations and science products for climate applications. The intent is to inform users of the existence of this large collection of data and products; suggest starting points for cross-platform science projects and data mining activities; and provide information on data services and tools. More information about the GES/DISC/DAAC satellite data and products, tools, and services can be found at http://daac.gsfc.nasa.gov.
Marchant, Carol A; Briggs, Katharine A; Long, Anthony
2008-01-01
Lhasa Limited is a not-for-profit organization that exists to promote the sharing of data and knowledge in chemistry and the life sciences. It has developed the software tools Derek for Windows, Meteor, and Vitic to facilitate such sharing. Derek for Windows and Meteor are knowledge-based expert systems that predict the toxicity and metabolism of a chemical, respectively. Vitic is a chemically intelligent toxicity database. An overview of each software system is provided along with examples of the sharing of data and knowledge in the context of their development. These examples include illustrations of (1) the use of data entry and editing tools for the sharing of data and knowledge within organizations; (2) the use of proprietary data to develop nonconfidential knowledge that can be shared between organizations; (3) the use of shared expert knowledge to refine predictions; (4) the sharing of proprietary data between organizations through the formation of data-sharing groups; and (5) the use of proprietary data to validate predictions. Sharing of chemical toxicity and metabolism data and knowledge in this way offers a number of benefits including the possibilities of faster scientific progress and reductions in the use of animals in testing. Maximizing the accessibility of data also becomes increasingly crucial as in silico systems move toward the prediction of more complex phenomena for which limited data are available.
Management information systems--green light for better info.
Foster, K; McBride, N
1992-02-27
An EIS gathers financial and non-financial data from a variety of sources, both internal and external to an organisation, and presents them accessibly and understandably. It should facilitate the presentation of data so that it is timely and relevant to senior managers' needs. Generally, there are three categories of EIS: (1) front-end tools, which enhance the presentation of output from existing systems (for example, they can take the output from a general ledger system and enhance its appearance without changing the content); (2) internal consolidation tools, which take data from a number of internal sources (such as the general ledger), store it in a central database and present it to users, often in a report book format to which all EIS-style functions may be applied; and (3) integrators of information, which integrate data from internal and external sources, where much of the data is non-financial and a major emphasis is on the users' ability to communicate with each other. The key functions that may be included within an EIS are: drill down (the facility to explore increasingly detailed levels of data); trends and variances against pre-set targets, such as financial budgets; graphics and tabular reporting; data integrity checking; analysis of the data, modelling and the production of forecasts using time series analysis techniques; exception reporting through the use of some form of alert; and incorporation of text into the output.
A trade-off analysis design tool. Aircraft interior noise-motion/passenger satisfaction model
NASA Technical Reports Server (NTRS)
Jacobson, I. D.
1977-01-01
A design tool was developed to enhance aircraft passenger satisfaction. The effect of aircraft interior motion and noise on passenger comfort and satisfaction was modelled. Effects of individual aircraft noise sources were accounted for, and the impact of noise on passenger activities and noise levels to safeguard passenger hearing were investigated. The motion noise effect models provide a means for tradeoff analyses between noise and motion variables, and also provide a framework for optimizing noise reduction among noise sources. Data for the models were collected onboard commercial aircraft flights and specially scheduled tests.
Scanner baseliner monitoring and control in high volume manufacturing
NASA Astrophysics Data System (ADS)
Samudrala, Pavan; Chung, Woong Jae; Aung, Nyan; Subramany, Lokesh; Gao, Haiyong; Gomez, Juan-Manuel
2016-03-01
We analyze performance of different customized models on baseliner overlay data and demonstrate the reduction in overlay residuals by ~10%. Smart Sampling sets were assessed and compared with the full wafer measurements. We found that performance of the grid can still be maintained by going to one-third of total sampling points, while reducing metrology time by 60%. We also demonstrate the feasibility of achieving time to time matching using scanner fleet manager and thus identify the tool drifts even when the tool monitoring controls are within spec limits. We also explore the scanner feedback constant variation with illumination sources.
Jonker, D; Rolander, B; Balogh, I; Sandsjö, L; Ekberg, K; Winkel, J
2011-10-01
The present study investigates the dental work in terms of time distribution and mechanical exposure in value-adding work (VAW) and non-VAW. Further rationalisation of dental work would typically involve an increase in the proportion of VAW. Information on mechanical exposure within the classes of VAW and non-VAW may be used to predict possible implications of rationalisation. Sixteen dentists were investigated. Using a data logger, postures and movements were continuously recorded for each subject during the 4 h of work, which included the 45 min of video recording. Time distribution and mechanical exposure for the six different work activities identified were evaluated from the video recordings, using a loss analysis technique. VAW, which comprised 54% of the total working time, generally implied significantly more constrained mechanical exposures as compared with non-VAW. The results suggest that future rationalisation of dental work, involving a reduction of non-VAW, may increase the risk of developing musculoskeletal disorders. Statement of Relevance: The present study illustrates the potential effects of rationalisation on biomechanical exposures for dentists. The results highlight the significance of integrating ergonomic issues into the rationalisation process in dentistry in addition to ordinary workstation and tool design improvements performed by ergonomists.
Web-based visualisation and analysis of 3D electron-microscopy data from EMDB and PDB.
Lagerstedt, Ingvar; Moore, William J; Patwardhan, Ardan; Sanz-García, Eduardo; Best, Christoph; Swedlow, Jason R; Kleywegt, Gerard J
2013-11-01
The Protein Data Bank in Europe (PDBe) has developed web-based tools for the visualisation and analysis of 3D electron microscopy (3DEM) structures in the Electron Microscopy Data Bank (EMDB) and Protein Data Bank (PDB). The tools include: (1) a volume viewer for 3D visualisation of maps, tomograms and models, (2) a slice viewer for inspecting 2D slices of tomographic reconstructions, and (3) visual analysis pages to facilitate analysis and validation of maps, tomograms and models. These tools were designed to help non-experts and experts alike to get some insight into the content and assess the quality of 3DEM structures in EMDB and PDB without the need to install specialised software or to download large amounts of data from these archives. The technical challenges encountered in developing these tools, as well as the more general considerations when making archived data available to the user community through a web interface, are discussed. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
Selenium isotope fractionation during reduction by Fe(II)-Fe(III) hydroxide-sulfate (green rust)
Johnson, T.M.; Bullen, T.D.
2003-01-01
We have determined the extent of Se isotope fractionation induced by reduction of selenate by sulfate interlayered green rust (GRSO4), a Fe(II)-Fe(III) hydroxide-sulfate. This compound is known to reduce selenate to Se(0), and it is the only naturally relevant abiotic selenate reduction pathway documented to date. Se reduction reactions, when they occur in nature, greatly reduce Se mobility and bioavailability. Se stable isotope analysis shows promise as an indicator of Se reduction, and Se isotope fractionation by various Se reactions must be known in order to refine this tool. We measured the increase in the 80Se/76Se ratio of dissolved selenate as lighter isotopes were preferentially consumed during reduction by GRSO4. Six different experiments that used GRSO4 made by two methods, with varying solution compositions and pH, yielded identical isotopic fractionations. Regression of all the data yielded an instantaneous isotope fractionation of 7.36 ± 0.24‰. Selenate reduction by GRSO4 induces much greater isotopic fractionation than does bacterial selenate reduction. If selenate reduction by GRSO4 occurs in nature, it may be identifiable on the basis of its relatively large isotopic fractionation. © 2003 Elsevier Science Ltd.
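The regression described above is commonly cast in a Rayleigh-type form; the abstract does not spell out its model, so the relation below is an illustrative assumption, with ε the instantaneous fractionation (in per mil) and f the fraction of selenate remaining.

```latex
\delta^{80/76}\mathrm{Se}(f) \;\approx\; \delta^{80/76}\mathrm{Se}_0 \;-\; \varepsilon \,\ln f ,
\qquad \varepsilon \approx 7.36 \pm 0.24\ \text{per mil}.
```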
Toward an Improvement of the Analysis of Neural Coding.
Alegre-Cortés, Javier; Soto-Sánchez, Cristina; Albarracín, Ana L; Farfán, Fernando D; Val-Calvo, Mikel; Ferrandez, José M; Fernandez, Eduardo
2017-01-01
Machine learning and artificial intelligence have strong roots in principles of neural computation. Some examples are the structure of the first perceptron, inspired by the retina, neuroprosthetics based on ganglion cell recordings, and Hopfield networks. In addition, machine learning provides a powerful set of tools to analyze neural data, which has already proved its efficacy in fields of research as distant as speech recognition, behavioral state classification, and LFP recordings. However, despite the huge recent advances in dimensionality reduction, pattern selection, and clustering of neural data, there has not been a proportional development of the analytical tools used for Time-Frequency (T-F) analysis in neuroscience. Bearing this in mind, we highlight the convenience of using non-linear, non-stationary tools, EMD algorithms in particular, for the transformation of oscillatory neural data (EEG, EMG, spike oscillations…) into the T-F domain prior to its analysis with machine learning tools. We argue that, to achieve meaningful conclusions, the transformed data we analyze have to be as faithful as possible to the original recording, so that the transformations forced onto the data by restrictions in the T-F computation are not propagated into the results of the machine learning analysis. Moreover, bioinspired computation such as brain-machine interfaces may be enriched by a more precise definition of neuronal coding in which non-linearities of the neuronal dynamics are considered.
Boyce, Richard D.; Handler, Steven M.; Karp, Jordan F.; Perera, Subashan; Reynolds, Charles F.
2016-01-01
Introduction: A potential barrier to nursing home research is the limited availability of research quality data in electronic form. We describe a case study of converting electronic health data from five skilled nursing facilities to a research quality longitudinal dataset by means of open-source tools produced by the Observational Health Data Sciences and Informatics (OHDSI) collaborative. Methods: The Long-Term Care Minimum Data Set (MDS), drug dispensing, and fall incident data from five SNFs were extracted, translated, and loaded into version 4 of the OHDSI common data model. Quality assurance involved identifying errors using the Achilles data characterization tool and comparing both quality measures and drug exposures in the new database for concordance with externally available sources. Findings: Records for a total 4,519 patients (95.1%) made it into the final database. Achilles identified 10 different types of errors that were addressed in the final dataset. Drug exposures based on dispensing were generally accurate when compared with medication administration data from the pharmacy services provider. Quality measures were generally concordant between the new database and Nursing Home Compare for measures with a prevalence ≥ 10%. Fall data recorded in MDS was found to be more complete than data from fall incident reports. Conclusions: The new dataset is ready to support observational research on topics of clinical importance in the nursing home including patient-level prediction of falls. The extraction, translation, and loading process enabled the use of OHDSI data characterization tools that improved the quality of the final dataset. PMID:27891528
Principal polynomial analysis.
Laparra, Valero; Jiménez, Sandra; Tuia, Devis; Camps-Valls, Gustau; Malo, Jesus
2014-11-01
This paper presents a new framework for manifold learning based on a sequence of principal polynomials that capture the possibly nonlinear nature of the data. The proposed Principal Polynomial Analysis (PPA) generalizes PCA by modeling the directions of maximal variance by means of curves, instead of straight lines. In contrast to previous approaches, PPA reduces to performing simple univariate regressions, which makes it computationally feasible and robust. Moreover, PPA shows a number of interesting analytical properties. First, PPA is a volume-preserving map, which in turn guarantees the existence of the inverse. Second, such an inverse can be obtained in closed form. Invertibility is an important advantage over other learning methods, because it makes it possible to interpret the identified features in the input domain, where the data have physical meaning. Moreover, it allows the performance of dimensionality reduction to be evaluated in sensible (input-domain) units. Volume preservation also allows an easy computation of information-theoretic quantities, such as the reduction in multi-information after the transform. Third, the analytical nature of PPA leads to a clear geometrical interpretation of the manifold: it allows the computation of Frenet-Serret frames (local features) and of generalized curvatures at any point of the space. And fourth, the analytical Jacobian allows the computation of the metric induced by the data, thus generalizing the Mahalanobis distance. These properties are demonstrated theoretically and illustrated experimentally. The performance of PPA is evaluated in dimensionality and redundancy reduction, in both synthetic and real datasets from the UCI repository.
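A deliberately simplified sketch of a single PPA-style step (leading principal direction followed by univariate polynomial regression of the residual coordinates) is given below; it illustrates the idea on toy data and is not the authors' reference implementation.

```python
# One PPA-like deflation step: PCA direction, then a polynomial curve in the
# remaining coordinates as a function of the leading projection.
import numpy as np

def ppa_step(X, degree=2):
    """One PPA-like step on data X of shape (n_samples, n_features)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    v = Vt[0]                            # leading principal direction
    t = Xc @ v                           # scores along that direction
    R = Xc - np.outer(t, v)              # orthogonal residual
    B = np.vander(t, degree + 1)         # polynomial design matrix in t
    coef, *_ = np.linalg.lstsq(B, R, rcond=None)
    R_detrended = R - B @ coef           # residual after removing the curve
    return t, coef, R_detrended

# Toy data lying near a parabola embedded in 2D.
rng = np.random.default_rng(1)
t_true = rng.uniform(-1, 1, 300)
X = np.column_stack([t_true, 0.5 * t_true**2]) + 0.01 * rng.normal(size=(300, 2))
t, coef, R = ppa_step(X, degree=2)
print(np.abs(R).mean())   # small: the polynomial captures the curvature
```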
Impact of light rail transit on traffic-related pollution and stroke mortality.
Park, Eun Sug; Sener, Ipek Nese
2017-09-01
This paper evaluates the changes in vehicle exhaust and stroke mortality for the general public residing in the surrounding area of the light rail transit (LRT) in Houston, Texas, after its opening. The number of daily deaths due to stroke for 2002-2005 from the surrounding area of the original LRT line (exposure group) and the control groups was analyzed using an interrupted time-series analysis. Ambient concentrations of acetylene before and after the opening of LRT were also compared. A statistically significant reduction in the average concentration of acetylene was observed for the exposure sites whereas the reduction was negligible at the control site. Poisson regression models applied to the stroke mortality data indicated a significant reduction in daily stroke mortality after the opening of LRT for the exposure group, while there was either an increase or a considerably smaller reduction for the control groups. The findings support the idea that LRT systems provide health benefits for the general public and that the reduction in motor-vehicle-related air pollution may have contributed to these health benefits.
Establishing Evidence for Internal Structure Using Exploratory Factor Analysis
ERIC Educational Resources Information Center
Watson, Joshua C.
2017-01-01
Exploratory factor analysis (EFA) is a data reduction technique used to condense data into smaller sets of summary variables by identifying underlying factors potentially accounting for patterns of collinearity among said variables. Using an illustrative example, the 5 general steps of EFA are described with best practices for decision making…
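As a hedged illustration of the data-reduction step described above (the article itself is tool-agnostic), the following sketch fits a two-factor model to synthetic items using scikit-learn's FactorAnalysis; the varimax rotation option assumes a recent scikit-learn release.

```python
# Illustrative EFA-style sketch: six observed items generated from two latent
# factors; the loading pattern should recover the two item clusters.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 500
f1, f2 = rng.normal(size=n), rng.normal(size=n)          # two latent factors
X = np.column_stack([
    f1 + 0.3 * rng.normal(size=n),   # items 1-3 load on factor 1
    f1 + 0.3 * rng.normal(size=n),
    f1 + 0.3 * rng.normal(size=n),
    f2 + 0.3 * rng.normal(size=n),   # items 4-6 load on factor 2
    f2 + 0.3 * rng.normal(size=n),
    f2 + 0.3 * rng.normal(size=n),
])

efa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
print(np.round(efa.components_, 2))   # loading pattern: two clear factors
```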
Garcés-Vega, Francisco; Marks, Bradley P
2014-08-01
In the last 20 years, the use of microbial reduction models has expanded significantly, including inactivation (linear and nonlinear), survival, and transfer models. However, a major constraint for model development is the impossibility of directly quantifying the number of viable microorganisms below the limit of detection (LOD) for a given study. Different approaches have been used to manage this challenge, including ignoring negative plate counts, using statistical estimations, or applying data transformations. Our objective was to illustrate and quantify the effect of negative plate count data management approaches on parameter estimation for microbial reduction models. Because it is impossible to obtain accurate plate counts below the LOD, we performed simulated experiments to generate synthetic data for both log-linear and Weibull-type microbial reductions. We then applied five different, previously reported data management practices and fitted log-linear and Weibull models to the resulting data. The results indicated a significant effect (α = 0.05) of the data management practices on the estimated model parameters and performance indicators. For example, when the negative plate counts were replaced by the LOD for log-linear data sets, the slope of the subsequent log-linear model was, on average, 22% smaller than for the original data, the resulting model underpredicted lethality by up to 2.0 log, and the Weibull model was erroneously selected as the most likely correct model for those data. The results demonstrate that it is important to explicitly report LODs and related data management protocols, which can significantly affect model results, interpretation, and utility. Ultimately, we recommend using only the positive plate counts to estimate model parameters for microbial reduction curves and avoiding any data value substitutions or transformations when managing negative plate counts to yield the most accurate model parameters.
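A hedged sketch of the kind of comparison the study describes is shown below, using synthetic survivor data (not the authors' data or code) to contrast fitting only the positive counts against substituting the LOD for censored observations.

```python
# Synthetic log-linear reduction data with a detection limit, fitted two ways.
import numpy as np
from scipy.optimize import curve_fit

def log_linear(t, logN0, k):
    return logN0 - k * t

def weibull(t, logN0, delta, p):
    # Alternative (Weibull-type) model form mentioned above; not fitted here.
    return logN0 - (t / delta) ** p

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 11)
logN_true = log_linear(t, 7.0, 0.8)                    # true log-linear decline
logN_obs = logN_true + rng.normal(0, 0.15, t.size)

LOD = 1.0                                              # log CFU/g detection limit
censored = logN_obs < LOD

# Practice A: keep only the positive (above-LOD) counts.
kA = curve_fit(log_linear, t[~censored], logN_obs[~censored])[0][1]
# Practice B: replace censored observations by the LOD value.
logN_sub = np.where(censored, LOD, logN_obs)
kB = curve_fit(log_linear, t, logN_sub)[0][1]

print(f"slope, positives only: {kA:.2f}   slope, LOD-substituted: {kB:.2f}")
# The LOD-substituted fit flattens the tail and underestimates the slope.
```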
Framework for objective evaluation of privacy filters
NASA Astrophysics Data System (ADS)
Korshunov, Pavel; Melle, Andrea; Dugelay, Jean-Luc; Ebrahimi, Touradj
2013-09-01
Extensive adoption of video surveillance, affecting many aspects of our daily lives, has alarmed the public about the increasing invasion of personal privacy. To address these concerns, many tools have been proposed for the protection of personal privacy in image and video. However, little is understood regarding the effectiveness of such tools and especially their impact on the underlying surveillance tasks, leading to a tradeoff between the preservation of privacy offered by these tools and the intelligibility of activities under video surveillance. In this paper, we investigate this privacy-intelligibility tradeoff by proposing an objective framework for the evaluation of privacy filters. We apply the proposed framework to a use case where the privacy of people is protected by obscuring faces, assuming an automated video surveillance system. We used several popular privacy protection filters, such as blurring, pixelization, and masking, and applied them with varying strengths to people's faces from different public datasets of video surveillance footage. The accuracy of a face detection algorithm was used as a measure of intelligibility (a face should be detected to perform a surveillance task), and the accuracy of a face recognition algorithm as a measure of privacy (a specific person should not be identified). Under these conditions, after application of an ideal privacy protection tool, an obfuscated face would be visible as a face but would not be correctly identified by the recognition algorithm. The experiments demonstrate that, in general, an increase in the strength of the privacy filters under consideration leads to an increase in privacy (i.e., a reduction in recognition accuracy) and to a decrease in intelligibility (i.e., a reduction in detection accuracy). Masking also proved to be the most favorable filter across all tested datasets.
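As a small illustration of one of the filters under consideration, the pure-NumPy pixelization sketch below shows how filter strength (block size) removes image detail; in the proposed framework, detection and recognition accuracies would then be measured at each strength. This is not the authors' code, and the random array is only a stand-in for a face crop.

```python
# Pixelization privacy filter: replace each block x block patch by its mean.
import numpy as np

def pixelize(image, block=8):
    """Pixelize a 2-D grayscale image by block averaging."""
    h, w = image.shape
    h2, w2 = h - h % block, w - w % block            # crop to whole blocks
    patches = image[:h2, :w2].reshape(h2 // block, block, w2 // block, block)
    means = patches.mean(axis=(1, 3), keepdims=True)
    out = image.astype(float)
    out[:h2, :w2] = np.broadcast_to(means, patches.shape).reshape(h2, w2)
    return out

face = np.random.randint(0, 256, (64, 64)).astype(float)   # stand-in for a face crop
for strength in (4, 8, 16):                                  # larger block = stronger filter
    print(strength, pixelize(face, strength).std())          # detail drops as strength grows
```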
Al-Aziz, Jameel; Christou, Nicolas; Dinov, Ivo D.
2011-01-01
The amount, complexity and provenance of data have dramatically increased in the past five years. Visualization of observed and simulated data is a critical component of any social, environmental, biomedical or scientific quest. Dynamic, exploratory and interactive visualization of multivariate data, without preprocessing by dimensionality reduction, remains a nearly insurmountable challenge. The Statistics Online Computational Resource (www.SOCR.ucla.edu) provides portable online aids for probability and statistics education, technology-based instruction and statistical computing. We have developed a new Java-based infrastructure, SOCR Motion Charts, for discovery-based exploratory analysis of multivariate data. This interactive data visualization tool enables the visualization of high-dimensional longitudinal data. SOCR Motion Charts allows mapping of ordinal, nominal and quantitative variables onto time, 2D axes, size, colors, glyphs and appearance characteristics, which facilitates the interactive display of multidimensional data. We validated this new visualization paradigm using several publicly available multivariate datasets including Ice-Thickness, Housing Prices, Consumer Price Index, and California Ozone Data. SOCR Motion Charts is designed using object-oriented programming, implemented as a Java Web-applet and is available to the entire community on the web at www.socr.ucla.edu/SOCR_MotionCharts. It can be used as an instructional tool for rendering and interrogating high-dimensional data in the classroom, as well as a research tool for exploratory data analysis. PMID:21479108
Impact of Lean on patient cycle and waiting times at a rural district hospital in KwaZulu-Natal
Naidoo, Logandran
2016-01-01
Background Prolonged waiting time is a source of patient dissatisfaction with health care and is negatively associated with patient satisfaction. Prolonged waiting times in many district hospitals result in many dissatisfied patients, overworked and frustrated staff, and poor quality of care because of the perceived increased workload. Aim The aim of the study was to determine the impact of Lean principles techniques, and tools on the operational efficiency in the outpatient department (OPD) of a rural district hospital. Setting The study was conducted at the Catherine Booth Hospital (CBH) – a rural district hospital in KwaZulu-Natal, South Africa. Methods This was an action research study with pre-, intermediate-, and post-implementation assessments. Cycle and waiting times were measured by direct observation on two occasions before, approximately two-weekly during, and on two occasions after Lean implementation. A standardised data collection tool was completed by the researcher at each of the six key service nodes in the OPD to capture the waiting times and cycle times. Results All six service nodes showed a reduction in cycle times and waiting times between the baseline assessment and post-Lean implementation measurement. Significant reduction was achieved in cycle times (27%; p < 0.05) and waiting times (from 11.93 to 10 min; p = 0.03) at the Investigations node. Although the target reduction was not achieved for the Consulting Room node, there was a significant reduction in waiting times from 80.95 to 74.43 min, (p < 0.001). The average efficiency increased from 16.35% (baseline) to 20.13% (post-intervention). Conclusion The application of Lean principles, tools and techniques provides hospital managers with an evidence-based management approach to resolving problems and improving quality indicators. PMID:27543283
Development of an Irrigation Scheduling Tool for the High Plains Region
NASA Astrophysics Data System (ADS)
Shulski, M.; Hubbard, K. G.; You, J.
2009-12-01
The High Plains Regional Climate Center (HPRCC) at the University of Nebraska is one of NOAA’s six regional climate centers in the U.S. Primary objectives of the HPRCC are to conduct applied climate research, engage in climate education and outreach, and increase the use and availability of climate information by developing value-added products. Scientists at the center are engaged in utilizing regional weather data to develop tools that can be used directly by area stakeholders, particularly for agricultural sectors. A new study is proposed that will combine NOAA products (short-term forecasts and seasonal outlooks of temperature and precipitation) with existing capabilities to construct an irrigation scheduling tool that can be used by producers in the region. This tool will make use of weather observations from the regional mesonet (specifically the AWDN, Automated Weather Data Network) and the nation-wide relational database and web portal (ACIS, Applied Climate Information System). The primary benefit to stakeholders will be a more efficient use of water and energy resources owing to the reduction of uncertainty in the timing of irrigation.
A pilot survey of pediatric surgical capacity in West Africa.
Okoye, Mekam T; Ameh, Emmanuel A; Kushner, Adam L; Nwomeh, Benedict C
2015-03-01
While some data exist for the burden of pediatric surgical disease in low- and middle-income countries (LMICs), little is known about pediatric surgical capacity. In an effort to better plan and allocate resources for pediatric surgical care in LMICs, a survey of pediatric surgical capacity using specific tool was needed. Based on the previously published Surgeons OverSeas Personnel, Infrastructure, Procedure, Equipment, and Supplies (PIPES) survey, a pediatric PIPES (PediPIPES) survey was created. To ensure relevance to local needs and inclusion of only essential items, a draft PediPIPES survey was reviewed by nine pediatric surgeons and modifications were incorporated into a final tool. The survey was then distributed to surgeons throughout sub-Saharan Africa. Data from West Africa (37 hospitals in 10 of the 16 countries in the subregion) were analyzed. Fewer than 50% (18/37) of the hospitals had more than two pediatric surgeons. Neonatal or general intensive care units were not available in 51.4% (19/37) of hospitals. Open procedures such as appendectomy were performed in all the hospitals whereas less-invasive interventions such as non-operative intussusception reduction were done in only 41% (15/37). Life-saving pediatric equipment such as apnea monitors were not available in 65% (24/37) of the hospitals. The PediPIPES survey was useful in documenting the pediatric surgical capacity in West Africa. Many hospitals in West Africa are not optimally prepared to undertake pediatric surgery. Our study showed shortages in personnel, infrastructure, procedures, equipment, and supplies necessary to adequately and appropriately provide surgical care for pediatric patients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pfiffner, Susan M.; Löffler, Frank; Ritalahti, Kirsti
The overall goal for this funded project was to develop and exploit environmental metaproteomics tools to identify biomarkers for monitoring microbial activity affecting U speciation at U-contaminated sites, correlate metaproteomics profiles with geochemical parameters and U(VI) reduction activity (or lack thereof), elucidate mechanisms contributing to U(VI) reduction, and provide remediation project managers with additional information to make science-based site management decisions for achieving cleanup goals more efficiently. Although significant progress has been made in elucidating the microbiology contribution to metal and radionuclide reduction, the cellular components, pathway(s), and mechanisms involved in U transformation remain poorly understood. Recent advances in (meta)proteomics technology enable detailed studies of complex samples, including environmental samples, which differ between sites and even show considerable variability within the same site (e.g., the Oak Ridge IFRC site). Additionally, site-specific geochemical conditions affect microbial activity and function, suggesting generalized assessment and interpretations may not suffice. This research effort integrated current understanding of the microbiology and biochemistry of U(VI) reduction and capitalized on advances in proteomics technology made over the past few years. Field-related analyses used Oak Ridge IFRC field ground water samples from locations where slow-release substrate biostimulation has been implemented to accelerate in situ U(VI) reduction rates. Our overarching hypothesis was that the metabolic signature in environmental samples, as deciphered by the metaproteome measurements, would show a relationship with U(VI) reduction activity. Since metaproteomic and metagenomic characterizations were computationally challenging and time-consuming, we used a tiered approach that combined database mining, controlled laboratory studies, U(VI) reduction activity measurements, phylogenetic analyses, and gene expression studies to support the metaproteomics characterizations. Growth experiments with target microorganisms (Anaeromyxobacter, Shewanella, Geobacter) revealed tremendous respiratory versatility, as evidenced by the ability to utilize a range of electron donors (e.g. acetate, hydrogen, pyruvate, lactate, succinate, formate) and electron acceptors (e.g. nitrate, fumarate, halogenated phenols, ferric iron, nitrous oxide, etc.). In particular, the dissimilatory metabolic reduction of metals, including radionuclides, by target microorganisms spurred interest in in situ bioremediation of contaminated soils and sediments. Distinct c-type cytochrome expression patterns were observed in target microorganisms grown with the different electron acceptors. For each target microorganism, the core proteome covered almost all metabolic pathways represented by their corresponding pan-proteomes. Unique proteins were detected for each target microorganism, and their expression and possible functionalities were linked to specific growth conditions through proteomics measurements. Optimization of the proteomic tools included in-depth comprehensive metagenomic and metaproteomic analyses on a limited number of samples. The optimized metaproteomic analyses were then applied to Oak Ridge IFRC field samples from the slow-release substrate biostimulation. Metaproteomic analysis and pathway mapping results demonstrated the distinct effects of metal and non-metal growth conditions on the proteome expression.
With these metaproteomic tools, we identified which previously hypothetical metabolic pathways were active during the analyzed time points of the slow-release substrate biostimulation. Thus, we demonstrated the utility of these tools for site assessment, efficient implementation of bioremediation and long-term monitoring. This research, linking detailed protein analysis with metal reduction activity, did (1) show that c-type cytochrome isoforms, previously associated with radionuclide reduction activity, are suitable biomarkers, (2) identify new biomarker targets for site assessment and bioremediation monitoring, and (3) provide new information about specific proteins and mechanisms involved in U(VI) reduction and immobilization. This expanded metagenomic and metaproteomic toolbox contributed to implementing science-driven site management with broad benefits to the DOE mission.
A review on the multivariate statistical methods for dimensional reduction studies
NASA Astrophysics Data System (ADS)
Aik, Lim Eng; Kiang, Lam Chee; Mohamed, Zulkifley Bin; Hong, Tan Wei
2017-05-01
In this study we review multivariate statistical methods for dimensionality reduction, as developed by various researchers. Reducing dimensionality is valuable for accelerating algorithm processing and may also genuinely improve the final classification or clustering accuracy. Noisy or even flawed input data often lead to less than desirable algorithm performance. Removing uninformative or misleading data components may indeed help an algorithm find more general grouping regions and rules and, overall, achieve better performance on new data sets.
Stevens' orbital reduction factor in ionic clusters
NASA Astrophysics Data System (ADS)
Gajek, Z.; Mulak, J.
1985-11-01
General expressions for reduction coefficients of matrix elements of angular momentum operator in ionic clusters or molecular systems have been derived. The reduction in this approach results from overlap and covalency effects and plays an important role in the reconciling of magnetic and spectroscopic experimental data. The formulated expressions make possible a phenomenological description of the effect with two independent parameters for typical equidistant clusters. Some detailed calculations also suggest the possibility of a one-parameter description. The results of these calculations for some ionic uranium compounds are presented as an example.
Echelle Data Reduction Cookbook
NASA Astrophysics Data System (ADS)
Clayton, Martin
This document is the first version of the Starlink Echelle Data Reduction Cookbook. It contains scripts and procedures developed by regular or heavy users of the existing software packages. These scripts are generally of two types: templates, which readers may be able to modify to suit their particular needs, and utilities, which carry out a particular common task and can probably be used `off-the-shelf'. Given the nature of this subject, the recipes given are quite strongly tied to the software packages, rather than being science-data led. The major part of this document is divided into two sections dealing with scripts to be used with IRAF and with Starlink software (SUN/1).
ERIC Educational Resources Information Center
Weaver, Dave
Science interfacing packages (also known as microcomputer-based laboratories or probeware) generally consist of a set of programs on disks, a user's manual, and hardware which includes one or more sensory devices. Together with a microcomputer they combine to make a powerful data acquisition and analysis tool. Packages are available for accurately…
HyperCard as a Text Analysis Tool for the Qualitative Researcher.
ERIC Educational Resources Information Center
Handler, Marianne G.; Turner, Sandra V.
HyperCard is a general-purpose program for the Macintosh computer that allows multiple ways of viewing and accessing a large body of information. Two ways in which HyperCard can be used as a research tool are illustrated. One way is to organize and analyze qualitative data from observations, interviews, surveys, and other documents. The other way…
A De Novo Tool to Measure the Preclinical Learning Climate of Medical Faculties in Turkey
ERIC Educational Resources Information Center
Yilmaz, Nilufer Demiral; Velipasaoglu, Serpil; Sahin, Hatice; Basusta, Bilge Uzun; Midik, Ozlem; Coskun, Ozlem; Budakoglu, Isil Irem; Mamakli, Sumer; Tengiz, Funda Ifakat; Durak, Halil Ibrahim; Ozan, Sema
2015-01-01
Although several scales are used to measure general and clinical learning climates, there are no scales that assess the preclinical learning climate. Therefore, the purpose of this study was to develop an effective measurement tool in order to assess the preclinical learning climate. In this cross-sectional study, data were collected from 3,540…
Kohlmayer, Florian; Prasser, Fabian; Kuhn, Klaus A
2015-12-01
With the ARX data anonymization tool, structured biomedical data can be de-identified using syntactic privacy models, such as k-anonymity. Data is transformed with two methods: (a) generalization of attribute values, followed by (b) suppression of data records. The former method results in data that is well suited for analyses by epidemiologists, while the latter method significantly reduces loss of information. Our tool uses an optimal anonymization algorithm that maximizes output utility according to a given measure. To achieve scalability, existing optimal anonymization algorithms exclude parts of the search space by predicting the outcome of data transformations regarding privacy and utility without explicitly applying them to the input dataset. These optimizations cannot be used if data is transformed with generalization and suppression. As optimal data utility and scalability are important for anonymizing biomedical data, we had to develop a novel method. In this article, we first confirm experimentally that combining generalization with suppression significantly increases data utility. Next, we prove that, within this coding model, the outcome of data transformations regarding privacy and utility cannot be predicted. As a consequence, existing algorithms fail to deliver optimal data utility. We confirm this finding experimentally. The limitation of previous work can be overcome at the cost of increased computational complexity. However, scalability is important for anonymizing data with user feedback. Consequently, we identify properties of datasets that may be predicted in our context and propose a novel and efficient algorithm. Finally, we evaluate our solution with multiple datasets and privacy models. This work presents the first thorough investigation of which properties of datasets can be predicted when data is anonymized with generalization and suppression. Our novel approach adapts existing optimization strategies to our context and combines different search methods. The experiments show that our method is able to efficiently solve a broad spectrum of anonymization problems. Our work shows that implementing syntactic privacy models is challenging and that existing algorithms are not well suited for anonymizing data with transformation models which are more complex than generalization alone. As such models have been recommended for use in the biomedical domain, our results are of general relevance for de-identifying structured biomedical data. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
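A toy illustration of the coding model discussed above (generalization of quasi-identifiers followed by suppression of records that would violate k-anonymity) is sketched below with pandas; it is a conceptual example, not the ARX implementation, and the dataset is invented.

```python
# Generalize quasi-identifiers, then suppress records in small equivalence classes.
import pandas as pd

df = pd.DataFrame({
    "age":     [23, 27, 31, 36, 36, 38, 52, 57],
    "zip":     ["81667", "81668", "81669", "81670", "81671", "81672", "81673", "81674"],
    "disease": ["A", "B", "A", "C", "B", "C", "A", "B"],
})

# Generalization: coarsen the quasi-identifiers age and zip.
gen = df.assign(age=(df["age"] // 10) * 10, zip=df["zip"].str[:3] + "**")

def suppress_to_k(data, quasi, k):
    """Drop records whose equivalence class (on the quasi-identifiers) is smaller than k."""
    sizes = data.groupby(quasi)[quasi[0]].transform("size")
    return data[sizes >= k]

anon = suppress_to_k(gen, ["age", "zip"], k=3)
print(anon)
print("records suppressed:", len(gen) - len(anon))
```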
NASA Technical Reports Server (NTRS)
Ganachaud, Alexandre; Wunsch, Carl; Kim, Myung-Chan; Tapley, Byron
1997-01-01
A global estimate of the absolute oceanic general circulation from a geostrophic inversion of in situ hydrographic data is tested against and then combined with an estimate obtained from TOPEX/POSEIDON altimetric data and a geoid model computed using the JGM-3 gravity-field solution. Within the quantitative uncertainties of both the hydrographic inversion and the geoid estimate, the two estimates derived by very different methods are consistent. When the in situ inversion is combined with the altimetry/geoid scheme using a recursive inverse procedure, a new solution, fully consistent with both hydrography and altimetry, is found. There is, however, little reduction in the uncertainties of the calculated ocean circulation and its mass and heat fluxes because the best available geoid estimate remains noisy relative to the purely oceanographic inferences. The conclusion drawn from this is that the comparatively large errors present in the existing geoid models now limit the ability of satellite altimeter data to improve directly the general ocean circulation models derived from in situ measurements. Because improvements in the geoid could be realized through a dedicated spaceborne gravity recovery mission, the impact of hypothetical much better, future geoid estimates on the circulation uncertainty is also quantified, showing significant hypothetical reductions in the uncertainties of oceanic transport calculations. Full ocean general circulation models could better exploit both existing oceanographic data and future gravity-mission data, but their present use is severely limited by the inability to quantify their error budgets.
Jones, S.A.; Braun, Christopher L.; Lee, Roger W.
2003-01-01
Concentrations of trichloroethene in ground water at the Naval Weapons Industrial Reserve Plant in Dallas, Texas, indicate three source areas of chlorinated solvents: building 1, building 6, and an off-site source west of the facility. The presence of daughter products of reductive dechlorination of trichloroethene, which were not used at the facility, south and southwest of the source areas is evidence that reductive dechlorination is occurring. In places south of the source areas, dissolved oxygen concentrations indicated that reduction of oxygen could be the dominant process, particularly south of building 6; but elevated dissolved oxygen concentrations south of building 6 might be caused by a leaking water or sewer pipe. The nitrite data indicate that denitrification is occurring in places; however, dissolved hydrogen concentrations indicate that iron reduction is the dominant process south of building 6. The distributions of ferrous iron indicate that iron reduction is occurring in places south-southwest of buildings 6 and 1; dissolved hydrogen concentrations generally support the interpretation that iron reduction is the dominant process in those places. The generally low concentrations of sulfide indicate that sulfate reduction is not a key process in most sampled areas, an interpretation that is supported by dissolved hydrogen concentrations. Ferrous iron and dissolved hydrogen concentrations indicate that ferric iron reduction is the primary oxidation-reduction process. Application of mean first-order decay rates in iron-reducing conditions for trichloroethene, dichloroethene, and vinyl chloride yielded half-lives for those solvents of 231, 347, and 2.67 days, respectively. Decay rates, and thus half-lives, at the facility are expected to be similar to those computed. A weighted scoring method to indicate sites where reductive dechlorination might be likely to occur indicated strong evidence for anaerobic biodegradation of chlorinated solvents at six sites. In general, scores were highest for samples collected on the northeast side of the facility.
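The half-lives quoted above follow from the usual first-order decay relation; as an illustrative check for trichloroethene:

```latex
C(t) = C_0\,e^{-kt},
\qquad
t_{1/2} = \frac{\ln 2}{k}
\;\;\Longrightarrow\;\;
k_{\mathrm{TCE}} = \frac{\ln 2}{231\ \mathrm{d}} \approx 3.0\times 10^{-3}\ \mathrm{d^{-1}}.
```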
ERIC Educational Resources Information Center
Beaujean, A. Alexander
2013-01-01
"R" (R Development Core Team, 2011) is a very powerful tool to analyze data, that is gaining in popularity due to its costs (its free) and flexibility (its open-source). This article gives a general introduction to using "R" (i.e., loading the program, using functions, importing data). Then, using data from Canivez, Konold, Collins, and Wilson…
Use of Data Visualisation in the Teaching of Statistics: A New Zealand Perspective
ERIC Educational Resources Information Center
Forbes, Sharleen; Chapman, Jeanette; Harraway, John; Stirling, Doug; Wild, Chris
2014-01-01
For many years, students have been taught to visualise data by drawing graphs. Recently, there has been a growing trend to teach statistics, particularly statistical concepts, using interactive and dynamic visualisation tools. Free downloadable teaching and simulation software designed specifically for schools, and more general data visualisation…
Ehrensperger, Michael M; Taylor, Kirsten I; Berres, Manfred; Foldi, Nancy S; Dellenbach, Myriam; Bopp, Irene; Gold, Gabriel; von Gunten, Armin; Inglin, Daniel; Müri, René; Rüegger, Brigitte; Kressig, Reto W; Monsch, Andreas U
2014-01-01
Optimal identification of subtle cognitive impairment in the primary care setting requires a very brief tool combining (a) patients' subjective impairments, (b) cognitive testing, and (c) information from informants. The present study developed a new, very quick and easily administered case-finding tool combining these assessments ('BrainCheck') and tested the feasibility and validity of this instrument in two independent studies. We developed a case-finding tool comprised of patient-directed (a) questions about memory and depression and (b) clock drawing, and (c) the informant-directed 7-item version of the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE). Feasibility study: 52 general practitioners rated the feasibility and acceptance of the patient-directed tool. Validation study: An independent group of 288 Memory Clinic patients (mean ± SD age = 76.6 ± 7.9, education = 12.0 ± 2.6; 53.8% female) with diagnoses of mild cognitive impairment (n = 80), probable Alzheimer's disease (n = 185), or major depression (n = 23) and 126 demographically matched, cognitively healthy volunteer participants (age = 75.2 ± 8.8, education = 12.5 ± 2.7; 40% female) participated. All patient and healthy control participants were administered the patient-directed tool, and informants of 113 patient and 70 healthy control participants completed the very short IQCODE. Feasibility study: General practitioners rated the patient-directed tool as highly feasible and acceptable. Validation study: A Classification and Regression Tree analysis generated an algorithm to categorize patient-directed data which resulted in a correct classification rate (CCR) of 81.2% (sensitivity = 83.0%, specificity = 79.4%). Critically, the CCR of the combined patient- and informant-directed instruments (BrainCheck) reached nearly 90% (89.4%; sensitivity = 97.4%, specificity = 81.6%). A new and very brief instrument for general practitioners, 'BrainCheck', combined three sources of information deemed critical for effective case-finding (that is, patients' subjective impairments, cognitive testing, informant information) and resulted in a nearly 90% CCR. Thus, it provides a very efficient and valid tool to aid general practitioners in deciding whether patients with suspected cognitive impairments should be further evaluated or not ('watchful waiting').
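The validation study above reports a CART-derived algorithm evaluated by correct classification rate, sensitivity and specificity. A minimal sketch, assuming hypothetical patient-directed features and synthetic labels, of how such a tree and those three metrics can be computed with scikit-learn (illustration only, not the published BrainCheck algorithm):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix

# Hypothetical patient-directed features (memory complaint score, depression item,
# clock-drawing score) and labels (1 = cognitively impaired, 0 = healthy control).
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.8, size=400) > 0).astype(int)

# CART-style classification tree, analogous in spirit to the analysis described
# in the abstract, but not the published algorithm.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
pred = tree.predict(X)

tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
sensitivity = tp / (tp + fn)           # correctly flagged patients
specificity = tn / (tn + fp)           # correctly cleared healthy controls
ccr = (tp + tn) / (tp + tn + fp + fn)  # correct classification rate
print(f"CCR = {ccr:.1%}, sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```

In practice the metrics would be computed on a held-out sample; the sketch only shows how the three reported quantities are defined.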
Faizi, Fakhrudin; Tavallaee, Abbas; Rahimi, Abolfazl; Saghafinia, Masoud
2017-01-01
Background Lifestyle modification has a significant role in chronic daily headache (CDH) management. Participatory action research (PAR) can play an important role in managing chronic medical conditions. However, it has been scarcely used in CDH management. Objectives This study aimed to empower patients with CDH to modify their lifestyle in order to reduce both their headache and related psychiatric co-morbidities in a multidisciplinary headache clinic at Baqiyatallah hospital, Tehran, IR Iran. Methods In the PAR plan, 37 patients (27 females) diagnosed with CDH were selected using purposeful sampling. Along with face-to-face group sessions, all available communication means such as phone calls, emails, short message system (SMS), and social media (Telegram) were used to facilitate the process. Questionnaires of the health promotion lifestyle profile (HPLP-II), visual analog scale (VAS), and depression-anxiety-stress scale (DASS-21) were used to collect data. The data were analyzed using SPSS software. Results Mean age of the patients was 38.33 (± 9.7) years. Both "general pain" and "the worst imaginable pain" reduced (mean of reduction: 2.56 ± 2.7 and 2.3 ± 2.9, respectively, P < 0.001). More than 50% of the pain reduction occurred in "the worst imaginable pain" category (-1.45 ± 2.02, P < 0.001) and the mean VAS score reduced to 5.20 (± 2.3) compared to the start of the study (7.50 ± 1.9, P < 0.001). Mean DASS-21 score also reduced significantly for depression (P < 0.016), anxiety (P < 0.026), and stress (P < 0.008). The HPLP-II score significantly improved (118.17 ± 14.8 vs. 160.83 ± 16.4, P < 0.001) and the highest increase was seen in the subscale of "stress management" (17.73 ± 2.8 vs. 25.53 ± 3.9, P < 0.001). Conclusions The empowering PAR plan combined with new communication tools helped the CDH patients better handle their lifestyle, reduce their headache, and lower their symptoms. Further studies with better use of currently available communication tools and social media are recommended for action research to be more applicable. PMID:28920050
Faizi, Fakhrudin; Tavallaee, Abbas; Rahimi, Abolfazl; Saghafinia, Masoud
2017-04-01
Lifestyle modification has a significant role in chronic daily headache (CDH) management. Participatory action research (PAR) can play an important role in managing chronic medical conditions. However, it has been scarcely used in CDH management. This study aimed to empower patients with CDH to modify their lifestyle in order to reduce both their headache and related psychiatric co-morbidities in a multidisciplinary headache clinic at Baqiyatallah hospital, Tehran, IR Iran. In the PAR plan, 37 patients (27 females) diagnosed with CDH were selected using purposeful sampling. Along with face-to-face group sessions, all available communication means such as phone calls, emails, short message system (SMS), and social media (Telegram) were used to facilitate the process. Questionnaires of the health promotion lifestyle profile (HPLP-II), visual analog scale (VAS), and depression-anxiety-stress scale (DASS-21) were used to collect data. The data were analyzed using SPSS software. Mean age of the patients was 38.33 (± 9.7) years. Both "general pain" and "the worst imaginable pain" reduced (mean of reduction: 2.56 ± 2.7 and 2.3 ± 2.9, respectively, P < 0.001). More than 50% of the pain reduction occurred in "the worst imaginable pain" category (-1.45 ± 2.02, P < 0.001) and the mean VAS score reduced to 5.20 (± 2.3) compared to the start of the study (7.50 ± 1.9, P < 0.001). Mean DASS-21 score also reduced significantly for depression (P < 0.016), anxiety (P < 0.026), and stress (P < 0.008). The HPLP-II score significantly improved (118.17 ± 14.8 vs. 160.83 ± 16.4, P < 0.001) and the highest increase was seen in the subscale of "stress management" (17.73 ± 2.8 vs. 25.53 ± 3.9, P < 0.001). The empowering PAR plan combined with new communication tools helped the CDH patients better handle their lifestyle, reduce their headache, and lower their symptoms. Further studies with better use of currently available communication tools and social media are recommended for action research to be more applicable.
TRACI 2.0 - The Tool for the Reduction and Assessment of Chemical and other environmental Impacts
TRACI 2.0, the Tool for the Reduction and Assessment of Chemical and other environmental Impacts 2.0, has been expanded and developed for sustainability metrics, life cycle impact assessment, industrial ecology, and process design impact assessment for developing increasingly sus...
TRACI 2.1 (the Tool for the Reduction and Assessment of Chemical and other environmental Impacts) has been developed for sustainability metrics, life cycle impact assessment, industrial ecology, and process design impact assessment for developing increasingly sustainable products...
Data Content and Exchange in General Practice: a Review
Kalankesh, Leila R; Farahbakhsh, Mostafa; Rahimi, Niloofar
2014-01-01
Background: Efficient communication of data is an essential requirement for general practice. Any issue in data content and its exchange between GPs and other related entities hinders continuity of patient care. Methods: The literature search for this review was conducted on three electronic databases: Medline, Scopus and Science Direct. Results: Through reviewing the papers, we extracted information on GP data content, use cases of GP information exchange, its participants, tools and methods, and incentives and barriers. Conclusion: Considering the importance of data content and exchange for GP systems, more research is needed toward providing a comprehensive framework for data content and exchange in GP systems. PMID:25648317
The Future of Additive Manufacturing in Air Force Acquisition
2017-03-22
manufacturing data - Designing and deploying a virtual aircraft fleet for future conflict - Space-based satellite production for defense capabilities via...changing system design via lower production costs, enhanced performance possibilities, and rapid replenishment. In the Technology Maturation and Risk... manufacturing as well as major cost savings via reduction of required materials, unique tooling, specialized production plans, and segments of the
Tradespace and Affordability - Phase 1
2013-07-09
assessment options Cost-effectiveness, risk reduction leverage/ROI, rework avoidance Tool, data, scenario availability Contract Number: H98230-08-D-0171...Prepare FED assessment plans and earned value milestones Try to relate earned value to risk -exposure avoided rather than budgeted cost F. Begin...evaluate and iterate plans and enablers I. Assess readiness for Commitment Review Shortfalls identified as risks and covered by risk mitigation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Ucilia
This report has the following articles: (1) Deconstructing Microbes--metagenomic research on bugs in termites relies on new data analysis tools; (2) Popular Science--a nanomaterial research paper in Nano Letters drew strong interest from the scientific community; (3) Direct Approach--researchers employ an algorithm to solve an energy-reduction issue essential in describing complex physical systems; and (4) SciDAC Special--a science journal features research on petascale enabling technologies.
Analyzing the efficacy of subtropical urban forests in offsetting carbon emissions from cities
Francisco Escobedo; Sebastian Varela; Min Zhao; John E. Wagner; Wayne Zipperer
2010-01-01
Urban forest management and policies have been promoted as a tool to mitigate carbon dioxide (CO2) emissions. This study used existing CO2 reduction measures from subtropical Miami-Dade and Gainesville, USA and modeled carbon storage and sequestration by trees to analyze policies that use urban forests to offset carbon emissions. Field data were analyzed, modeled, and...
Biological data integration: wrapping data and tools.
Lacroix, Zoé
2002-06-01
Nowadays, scientific data is inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced data accessing, analyzing, and visualization tools. Building a digital library for scientific data requires accessing and manipulating data extracted from flat files or databases, documents retrieved from the Web as well as data generated by software. We present an approach to wrapping web data sources, databases, flat files, or data generated by tools through a database view mechanism. Generally, a wrapper has two tasks: it first sends a query to the source to retrieve data and, second, builds the expected output with respect to the virtual structure. Our wrappers are composed of a retrieval component based on an intermediate object view mechanism called search views mapping the source capabilities to attributes, and an eXtensible Markup Language (XML) engine, respectively, to perform these two tasks. The originality of the approach consists of: 1) a generic view mechanism to seamlessly access data sources with limited capabilities and 2) the ability to wrap data sources as well as the useful specific tools they may provide. Our approach has been developed and demonstrated as part of the multidatabase system supporting queries via uniform object protocol model (OPM) interfaces.
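The two wrapper tasks described above (retrieve records from a source, then build output matching a virtual structure) can be sketched as follows; the class names and the flat-file source are hypothetical, and the XML output merely stands in for the paper's XML engine:

```python
from abc import ABC, abstractmethod
import xml.etree.ElementTree as ET

class Wrapper(ABC):
    """Generic wrapper split into two tasks: (1) retrieve raw records from a
    source, (2) build output matching a virtual (here, XML) structure."""

    @abstractmethod
    def retrieve(self, query: str) -> list[dict]:
        ...

    def build(self, records: list[dict]) -> str:
        # Task 2: serialize retrieved records into the expected output structure.
        root = ET.Element("results")
        for rec in records:
            item = ET.SubElement(root, "record")
            for key, value in rec.items():
                ET.SubElement(item, key).text = str(value)
        return ET.tostring(root, encoding="unicode")

class FlatFileWrapper(Wrapper):
    """Hypothetical wrapper over a tab-separated flat file of annotations."""

    def __init__(self, path: str):
        self.path = path

    def retrieve(self, query: str) -> list[dict]:
        # Task 1: query the source; here, a simple substring match on the file.
        records = []
        with open(self.path) as handle:
            for line in handle:
                gene, description = line.rstrip("\n").split("\t", 1)
                if query.lower() in description.lower():
                    records.append({"gene": gene, "description": description})
        return records

# Usage (hypothetical file):
# wrapper = FlatFileWrapper("annotations.tsv")
# print(wrapper.build(wrapper.retrieve("kinase")))
```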
Density-Dependent Quantized Least Squares Support Vector Machine for Large Data Sets.
Nan, Shengyu; Sun, Lei; Chen, Badong; Lin, Zhiping; Toh, Kar-Ann
2017-01-01
Based on the knowledge that input data distribution is important for learning, a data density-dependent quantization scheme (DQS) is proposed for sparse input data representation. The usefulness of the representation scheme is demonstrated by using it as a data preprocessing unit attached to the well-known least squares support vector machine (LS-SVM) for application on big data sets. Essentially, the proposed DQS adopts a single shrinkage threshold to obtain a simple quantization scheme, which adapts its outputs to input data density. With this quantization scheme, a large data set is quantized to a small subset where considerable sample size reduction is generally obtained. In particular, the sample size reduction can save significant computational cost when using the quantized subset for feature approximation via the Nyström method. Based on the quantized subset, the approximated features are incorporated into LS-SVM to develop a data density-dependent quantized LS-SVM (DQLS-SVM), where an analytic solution is obtained in the primal solution space. The developed DQLS-SVM is evaluated on synthetic and benchmark data with particular emphasis on large data sets. Extensive experimental results show that the learning machine incorporating DQS attains not only high computational efficiency but also good generalization performance.
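As a rough illustration of the general pipeline described above (threshold-based quantization of the inputs to a small subset, followed by Nyström feature approximation over that subset), here is a minimal sketch using a simple leader-style quantizer and standard RBF Nyström features; it is not the authors' DQS or DQLS-SVM:

```python
import numpy as np

def threshold_quantize(X: np.ndarray, radius: float) -> np.ndarray:
    """Keep a point as a landmark only if it lies farther than `radius` from
    every landmark kept so far. Dense regions are compressed onto relatively
    few landmarks, so the reduction adapts to the input data density."""
    landmarks = [X[0]]
    for x in X[1:]:
        dists = np.linalg.norm(np.asarray(landmarks) - x, axis=1)
        if dists.min() > radius:
            landmarks.append(x)
    return np.asarray(landmarks)

def rbf(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystroem_features(X, landmarks, gamma=1.0, jitter=1e-8):
    """Standard Nystroem approximation: features = K_nm @ K_mm^{-1/2}."""
    K_mm = rbf(landmarks, landmarks, gamma) + jitter * np.eye(len(landmarks))
    K_nm = rbf(X, landmarks, gamma)
    lam, U = np.linalg.eigh(K_mm)
    return K_nm @ (U / np.sqrt(lam)) @ U.T

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 4))
subset = threshold_quantize(X, radius=1.5)  # much smaller than X
Z = nystroem_features(X, subset)            # features for a linear or LS-SVM model
print(X.shape, subset.shape, Z.shape)
```

The approximated features Z would then feed a linear model or LS-SVM solved in the primal, which is where the computational savings from the smaller subset show up.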
The SCUBA map reduction cookbook
NASA Astrophysics Data System (ADS)
Sandell, G.; Jessop, N.; Jenness, T.
This cookbook tells you how to reduce and analyze maps obtained with SCUBA using the off-line SCUBA reduction package, SURF, and the Starlink KAPPA, Figaro, GAIA and CONVERT applications. The easiest way of using these packages is to run ORAC-DR, a general-purpose pipeline for reducing data from any telescope. A set of data reduction recipes is available to ORAC-DR for use when working with SCUBA maps; these recipes utilize the SURF and KAPPA packages. This cookbook makes no attempt to explain the why and how; for that, the comprehensive Starlink User Note 216 properly documents all the software tasks in SURF and should be consulted by those who need to know the details of a task or how it really works.
System Analyses of Pneumatic Technology for High Speed Civil Transport Aircraft
NASA Technical Reports Server (NTRS)
Mavris, Dimitri N.; Tai, Jimmy C.; Kirby, Michelle M.; Roth, Bryce A.
1999-01-01
The primary aspiration of this study was to objectively assess the feasibility of the application of a low speed pneumatic technology, in particular Circulation Control (CC), to an HSCT concept. Circulation Control has been chosen as an enabling technology to be applied on a generic High Speed Civil Transport (HSCT). This technology has been proven for various subsonic vehicles, including flight tests on a Navy A-6 and computational application on a Boeing 737. Yet, CC has not been widely accepted for general commercial fixed-wing use, but its potential has been extensively investigated for decades in wind tunnels across the globe for application to rotorcraft. More recently, an experimental investigation was performed at Georgia Tech Research Institute (GTRI) with application to an HSCT-type configuration. The data from those experiments was to be applied to a full-scale vehicle to assess the impact from a system-level point of view. Hence, this study attempted to quantitatively assess the impact of this technology on an HSCT. The study objective was achieved in three primary steps: 1) Defining the need for CC technology; 2) Wind tunnel data reduction; 3) Detailed takeoff/landing performance assessment. Defining the need for the CC technology application to an HSCT encompassed a preliminary system-level analysis. This was accomplished through the utilization of recent developments in modern aircraft design theory at the Aerospace Systems Design Laboratory (ASDL). These developments include the creation of techniques and methods needed for the identification of technical feasibility show-stoppers. These techniques and methods allow the designer to rapidly assess a design space and disciplinary metric enhancements to enlarge or improve the design space. The takeoff and landing field lengths were identified as the concept "show-stoppers". Once the need for CC was established, the actual application of data and trends was assessed. This assessment entailed a reduction of the wind tunnel data from the experiments performed by Mr. Bob Englar at GTRI. Relevant data was identified and manipulated based on the required format of the analysis tools utilized. Propulsive, aerodynamic, duct sizing, and vehicle sizing investigations were performed and the resulting information was supplied to a detailed takeoff and landing tool. From the assessments, CC was shown to improve the low-speed performance metrics, which were previously not satisfied. An HSCT with CC augmentation does show potential for full-scale application. Yet, an economic assessment of an HSCT with and without CC showed that a moderate penalty was incurred from the increased RDT&E costs associated with developing the CC technology and slight increases in empty weight.
Miskin, Chandrabhaga; Khurana, Divya S; Valencia, Ignacio; Legido, Agustin; Hasbani, Daphne M; Carvalho, Karen S
2016-06-01
Lacosamide is FDA-approved in patients 17 years or older with partial-onset epilepsy. We evaluated the efficacy and tolerability of lacosamide in children with refractory generalized epilepsy. We retrospectively reviewed records of 21 children with refractory generalized epilepsy treated with lacosamide in our institution from 2009-2013, divided into 2 subgroups: I, Lennox-Gastaut Syndrome, and II, other generalized epilepsies. Efficacy was defined as seizure freedom or ≥50% seizure reduction. Descriptive data analysis including seizure freedom was compared using chi-square (χ2) analysis. There were eleven females and ten males with a mean age of 11.9 years. Five patients became seizure free, nine had ≥50% seizure reduction, and seven had no response. Group I: seven had ≥50% improvement, one did not respond. Group II: five became seizure free, two had ≥50% improvement, five had no response. Lacosamide is effective and well tolerated in children with refractory generalized epilepsy, particularly patients with Lennox-Gastaut Syndrome. © The Author(s) 2016.
Methods in Astronomical Image Processing
NASA Astrophysics Data System (ADS)
Jörsäter, S.
Contents: A Brief Introductory Note; History of Astronomical Imaging; Astronomical Image Data; Images in Various Formats; Digitized Image Data; Digital Image Data; Philosophy of Astronomical Image Processing; Properties of Digital Astronomical Images; Human Image Processing; Astronomical vs. Computer Science Image Processing; Basic Tools of Astronomical Image Processing; Display Applications; Calibration of Intensity Scales; Calibration of Length Scales; Image Re-shaping; Feature Enhancement; Noise Suppression; Noise and Error Analysis; Image Processing Packages: Design of AIPS and MIDAS; AIPS; MIDAS; Reduction of CCD Data; Bias Subtraction; Clipping; Preflash Subtraction; Dark Subtraction; Flat Fielding; Sky Subtraction; Extinction Correction; Deconvolution Methods; Rebinning/Combining; Summary and Prospects for the Future
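The CCD reduction steps listed above (bias subtraction, dark subtraction, flat fielding, sky subtraction) amount to simple frame arithmetic. A minimal sketch with synthetic frames, where overscan handling, clipping and bad-pixel masking are deliberately omitted:

```python
import numpy as np

def reduce_ccd_frame(raw, bias, dark, flat, exptime, dark_exptime, sky_level=None):
    """Minimal CCD reduction sketch:
    1. subtract the bias frame,
    2. subtract the dark current scaled to the science exposure time,
    3. divide by the normalized flat field,
    4. optionally subtract a constant sky level.
    All frames are 2-D numpy arrays; refinements such as overscan correction,
    clipping, preflash subtraction and bad-pixel masks are omitted."""
    debiased = raw - bias
    dark_scaled = (dark - bias) * (exptime / dark_exptime)
    flat_norm = flat / np.median(flat)
    reduced = (debiased - dark_scaled) / flat_norm
    if sky_level is not None:
        reduced = reduced - sky_level
    return reduced

# Illustration with synthetic frames (values are arbitrary):
rng = np.random.default_rng(1)
raw = rng.normal(1000.0, 10.0, size=(64, 64))
bias = np.full((64, 64), 300.0)
dark = np.full((64, 64), 305.0)
flat = rng.normal(1.0, 0.02, size=(64, 64)) * 20000.0
frame = reduce_ccd_frame(raw, bias, dark, flat, exptime=300.0, dark_exptime=600.0)
```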
Nyblade, Laura; Jain, Aparna; Benkirane, Manal; Li, Li; Lohiniva, Anna-Leena; McLean, Roger; Turan, Janet M; Varas-Díaz, Nelson; Cintrón-Bou, Francheska; Guan, Jihui; Kwena, Zachary; Thomas, Wendell
2013-11-13
Within healthcare settings, HIV-related stigma is a recognized barrier to access of HIV prevention and treatment services and yet, few efforts have been made to scale up stigma reduction programs in service delivery. This is in part due to the lack of a brief, simple, standardized tool for measuring stigma among all levels of health facility staff that works across diverse HIV prevalence, language and healthcare settings. In response, an international consortium led by the Health Policy Project has developed and field-tested a stigma measurement tool for use with health facility staff. Experts participated in a content-development workshop to review an item pool of existing measures, identify gaps and prioritize questions. The resulting questionnaire was field-tested in six diverse sites (China, Dominica, Egypt, Kenya, Puerto Rico and St. Christopher & Nevis). Respondents included clinical and non-clinical staff. Questionnaires were self- or interviewer-administered. Analysis of item performance across sites examined both psychometric properties and contextual issues. The key outcome of the process was a substantially reduced questionnaire. Eighteen core questions measure three programmatically actionable drivers of stigma within health facilities (worry about HIV transmission, attitudes towards people living with HIV (PLHIV), and health facility environment, including policies), and enacted stigma. The questionnaire also includes one short scale for attitudes towards PLHIV (5-item scale, α=0.78). Stigma-reduction programmes in healthcare facilities are urgently needed to improve the quality of care provided, uphold the human right to healthcare, increase access to health services, and maximize investments in HIV prevention and treatment. This brief, standardized tool will facilitate inclusion of stigma measurement in research studies and in routine facility data collection, allowing for the monitoring of stigma within healthcare facilities and evaluation of stigma-reduction programmes. There is potential for wide use of the tool either as a stand-alone survey or integrated within other studies of health facility staff.
ThinkHazard!: an open-source, global tool for understanding hazard information
NASA Astrophysics Data System (ADS)
Fraser, Stuart; Jongman, Brenden; Simpson, Alanna; Nunez, Ariel; Deparday, Vivien; Saito, Keiko; Murnane, Richard; Balog, Simone
2016-04-01
Rapid and simple access to added-value natural hazard and disaster risk information is a key issue for various stakeholders of the development and disaster risk management (DRM) domains. Accessing available data often requires specialist knowledge of heterogeneous data, which are often highly technical and can be difficult for non-specialists in DRM to find and exploit. Thus, availability, accessibility and processing of these information sources are crucial issues, and an important reason why many development projects suffer significant impacts from natural hazards. The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) is currently developing a new open-source tool to address this knowledge gap: ThinkHazard! The main aim of the ThinkHazard! project is to develop an analytical tool dedicated to facilitating improvements in knowledge and understanding of natural hazards among non-specialists in DRM. It also aims at providing users with relevant guidance and information on handling the threats posed by the natural hazards present in a chosen location. Furthermore, all aspects of this tool will be open and transparent, in order to give users enough information to understand its operational principles. In this presentation, we will explain the technical approach behind the tool, which translates state-of-the-art probabilistic natural hazard data into understandable hazard classifications and practical recommendations. We will also demonstrate the functionality of the tool, and discuss limitations from a scientific as well as an operational perspective.
Using ESO Reflex with Web Services
NASA Astrophysics Data System (ADS)
Järveläinen, P.; Savolainen, V.; Oittinen, T.; Maisala, S.; Ullgrén, M.; Hook, R.
2008-08-01
ESO Reflex is a prototype graphical workflow system, based on Taverna, and primarily intended to be a flexible way of running ESO data reduction recipes along with other legacy applications and user-written tools. ESO Reflex can also readily use the Taverna Web Services features that are based on the Apache Axis SOAP implementation. Taverna is a general purpose Web Service client, and requires no programming to use such services. However, Taverna also has some restrictions: for example, no numerical types such as integers. In addition, the preferred binding style is document/literal wrapped, but most astronomical services publish the Axis default WSDL using RPC/encoded style. Despite these minor limitations we have created a simple but very promising test VO workflow using the Sesame name resolver service at CDS Strasbourg, the Hubble SIAP server at the Multi-Mission Archive at Space Telescope (MAST) and the WESIX image cataloging and catalogue cross-referencing service at the University of Pittsburgh. ESO Reflex can also pass files and URIs via the PLASTIC protocol to visualisation tools and has its own viewer for VOTables. We picked these three Web Services to try to set up a realistic and useful ESO Reflex workflow. They also demonstrate ESO Reflex's ability to use many kinds of Web Services, because each of them requires a different interface. We describe each of these services in turn and comment on how it was used.
Parmagnani, Federica; Ranzi, Andrea; Ancona, Carla; Angelini, Paola; Chiusolo, Monica; Cadum, Ennio; Lauriola, Paolo; Forastiere, Francesco
2014-01-01
The Project Epidemiological Surveillance of Health Status of Resident Population Around the Waste Treatment Plants (SESPIR) included five Italian regions (Emilia-Romagna, Piedmont, Lazio, Campania, and Sicily) and the National Institute of Health in the period 2010-2013. SESPIR was funded by the Ministry of Health as part of the National centre for diseases prevention and control (CCM) programme of 2010, with the general objective of providing methods and operational tools for the implementation of surveillance systems for waste and health, aimed at assessing the impact of the municipal solid waste (MSW) treatment cycle on the health of the population. The specific objective was to assess health impacts resulting from the presence of disposal facilities related to different regional scenarios of waste management. Suitable tools for integrated assessment of environmental and health impacts were developed and applied, using current demographic, environmental and health data. In this article, the methodology used for the quantitative estimation of the impact on the health of populations living near incinerators, landfills and mechanical biological treatment plants is presented, as well as the analysis of three different temporal scenarios: the first related to the existing plants in the period 2008-2009 (baseline), the second based on regional plans, and the third referring to a virtuous MSW management policy based on reduction of produced waste and an intensive recovery policy.
A Predictive Model for Readmissions Among Medicare Patients in a California Hospital.
Duncan, Ian; Huynh, Nhan
2017-11-17
Predictive models for hospital readmission rates are in high demand because of the Centers for Medicare & Medicaid Services (CMS) Hospital Readmission Reduction Program (HRRP). The LACE index is one of the most popular predictive tools among hospitals in the United States. The LACE index is a simple tool with 4 parameters: Length of stay, Acuity of admission, Comorbidity, and Emergency visits in the previous 6 months. The authors applied logistic regression to develop a predictive model for a medium-sized not-for-profit community hospital in California using patient-level data with more specific patient information (including 13 explanatory variables). Specifically, the logistic regression is applied to 2 populations: a general population including all patients and the specific group of patients targeted by the CMS penalty (characterized as ages 65 or older with select conditions). The 2 resulting logistic regression models have a higher sensitivity rate compared to the sensitivity of the LACE index. The C statistic values of the model applied to both populations demonstrate moderate levels of predictive power. The authors also build an economic model to demonstrate the potential financial impact of the use of the model for targeting high-risk patients in a sample hospital and demonstrate that, on balance, whether the hospital gains or loses from reducing readmissions depends on its margin and the extent of its readmission penalties.
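A minimal sketch of the kind of model comparison described above: a logistic regression fitted to hypothetical patient-level columns, with sensitivity and the C statistic (ROC AUC) read off a held-out set. The column names, coefficients and classification threshold are invented for illustration and are not the study's specification:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical patient-level data; column names are illustrative only.
rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "length_of_stay": rng.integers(1, 15, n),
    "emergent_admission": rng.integers(0, 2, n),
    "charlson_score": rng.integers(0, 6, n),
    "ed_visits_6mo": rng.integers(0, 5, n),
    "age": rng.integers(40, 95, n),
})
# Synthetic readmission outcome generated from an invented risk relationship.
logit = 0.08 * df.length_of_stay + 0.6 * df.emergent_admission + 0.3 * df.charlson_score - 3.0
df["readmitted_30d"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(
    df.drop(columns="readmitted_30d"), df["readmitted_30d"], random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

prob = model.predict_proba(X_te)[:, 1]
pred = prob >= 0.15  # risk threshold chosen purely for illustration
tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print(f"sensitivity = {tp / (tp + fn):.2f}, C statistic = {roc_auc_score(y_te, prob):.2f}")
```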
Kelly, Laura; Ziebland, Sue; Jenkinson, Crispin
2015-11-01
Health-related websites have developed to be much more than information sites: they are used to exchange experiences and find support as well as information and advice. This paper documents the development of a tool to compare the potential consequences and experiences a person may encounter when using health-related websites. Questionnaire items were developed following a review of relevant literature and qualitative secondary analysis of interviews relating to experiences of health. Item reduction steps were performed on pilot survey data (n=167). Tests of validity and reliability were subsequently performed (n=170) to determine the psychometric properties of the questionnaire. Two independent item pools entered psychometric testing: (1) items relating to general views of using the internet in relation to health, and (2) items relating to the consequences of using a specific health-related website. Identified sub-scales were found to have high construct validity, internal consistency and test-retest reliability. Analyses confirmed good psychometric properties in the eHIQ-Part 1 (11 items) and the eHIQ-Part 2 (26 items). This tool will facilitate the measurement of the potential consequences of using websites containing different types of material (scientific facts and figures, blogs, experiences, images) across a range of health conditions. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Computer programing for geosciences: Teach your students how to make tools
NASA Astrophysics Data System (ADS)
Grapenthin, Ronni
2011-12-01
When I announced my intention to pursue a Ph.D. in geophysics, some people gave me confused looks, because I was working on a master's degree in computer science at the time. My friends, like many incoming geoscience graduate students, have trouble linking these two fields. From my perspective, it is pretty straightforward: Much of geoscience revolves around novel analyses of large data sets that require custom tools—computer programs—to minimize the drudgery of manual data handling; other disciplines share this characteristic. While most faculty adapted to the need for tool development quite naturally, as they grew up around computer terminal interfaces, incoming graduate students lack intuitive understanding of programming concepts such as generalization and automation. I believe the major cause is the intuitive graphical user interfaces of modern operating systems and applications, which isolate the user from all technical details. Generally, current curricula do not recognize this gap between user and machine. For students to operate effectively, they require specialized courses teaching them the skills they need to make tools that operate on particular data sets and solve their specific problems. Courses in computer science departments are aimed at a different audience and are of limited help.
Data reduction software for LORAN-C flight test evaluation
NASA Technical Reports Server (NTRS)
Fischer, J. P.
1979-01-01
A set of programs designed to be run on an IBM 370/158 computer to read the recorded time differences from the tape produced by the LORAN data collection system, convert them to latitude/longitude and produce various plotting input files is described. The programs were written so they may be tailored easily to meet the demands of a particular data reduction job. The tape reader program is written in 370 assembler language and the remaining programs are written in standard IBM FORTRAN-IV language. The tape reader program is dependent upon the recording format used by the data collection system and on the I/O macros used at the computing facility. The other programs are generally device-independent, although the plotting routines are dependent upon the plotting method used. The data reduction programs convert the recorded data to a more readily usable form: they convert the time difference (TD) numbers to latitude/longitude (lat/long), format a printed listing of the TDs, lat/long, reference times, and other information derived from the data, and produce data files which may be used for subsequent plotting.
Liu, Yang; Chiaromonte, Francesca; Li, Bing
2017-06-01
In many scientific and engineering fields, advanced experimental and computing technologies are producing data that are not just high dimensional, but also internally structured. For instance, statistical units may have heterogeneous origins from distinct studies or subpopulations, and features may be naturally partitioned based on experimental platforms generating them, or on information available about their roles in a given phenomenon. In a regression analysis, exploiting this known structure in the predictor dimension reduction stage that precedes modeling can be an effective way to integrate diverse data. To pursue this, we propose a novel Sufficient Dimension Reduction (SDR) approach that we call structured Ordinary Least Squares (sOLS). This combines ideas from existing SDR literature to merge reductions performed within groups of samples and/or predictors. In particular, it leads to a version of OLS for grouped predictors that requires far less computation than recently proposed groupwise SDR procedures, and provides an informal yet effective variable selection tool in these settings. We demonstrate the performance of sOLS by simulation and present a first application to genomic data. The R package "sSDR," publicly available on CRAN, includes all procedures necessary to implement the sOLS approach. © 2016, The International Biometric Society.
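As a toy illustration of the general idea above (OLS reductions computed within known predictor groups and then merged into one set of derived predictors), the sketch below keeps one fitted direction per group. It is a deliberately simplified assumption-laden illustration, not the sOLS estimator or the sSDR package:

```python
import numpy as np

def groupwise_ols_reduction(X: np.ndarray, y: np.ndarray, groups: list[list[int]]):
    """For each predictor group, fit ordinary least squares of y on the centered
    group block and keep the fitted direction; return the per-group directions
    and the reduced predictors (one derived variable per group)."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    directions, reduced = [], []
    for idx in groups:
        Xg = Xc[:, idx]
        beta, *_ = np.linalg.lstsq(Xg, yc, rcond=None)  # OLS direction for this group
        directions.append(beta)
        reduced.append(Xg @ beta)
    return directions, np.column_stack(reduced)

# Example: 9 predictors split into three known groups (e.g., three platforms).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 9))
y = X[:, 0] - 2 * X[:, 4] + 0.5 * X[:, 7] + rng.normal(scale=0.5, size=500)
groups = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
dirs, Z = groupwise_ols_reduction(X, y, groups)
print(Z.shape)  # (500, 3): one reduced predictor per group, ready for modeling
```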
Scientific Visualization Tools for Enhancement of Undergraduate Research
NASA Astrophysics Data System (ADS)
Rodriguez, W. J.; Chaudhury, S. R.
2001-05-01
Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially requires mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Generally, researchers are required to fully understand the intricacies of the dataset and write computer programs or rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets, appropriate Scientific Visualization tools are imperative in allowing students to have a meaningful and pleasant research experience, while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three dimensions where the researcher can change the scales in the three dimensions, color tables and degree of smoothing interactively to focus on particular phenomena. SAGE4D provides a navigable four-dimensional interactive environment. These tools allow students to make higher order decisions based on large multidimensional sets of data while diminishing the level of frustration that results from dealing with the details of processing large data sets.
Influence of Environmental Governance on Deforestation in Municipalities of the Brazilian Amazon.
Dias, Lilian Fernandes Oliveira; Dias, David Valentim; Magnusson, William Ernest
2015-01-01
It has been argued that measuring governance at scales smaller than global could be an important management tool. However, current studies are conducted on a global scale and use expensive methods. In the present study, we assess whether the reported governance of Amazonian municipalities is related to reductions in deforestation. Economic activity (EA) affected general governance (G) positively (G = 0.81 + 1.19*EA, F(1,98) = 77.36, p < 0.001). Environmental governance (EG) was not affected significantly (p = 0.43) by deforestation before 2000 (PD), but increased significantly (p < 0.001) with general governance (G) (EG = -0.29 + 0.04*PD + 0.98*OG, F(2,97) = 42.6, p < 0.001). Deforestation was not significantly related to environmental governance (p = 0.82). The only indirect effect of significant magnitude was the effect of the density of forest reserves on recent deforestation through deforestation before 2000, which was strongly negative (-0.49). It is possible to assess reported actions to promote municipal governance through official data. However, it is not enough to assume that general governance or environmental governance at the municipal level, as reflected in the official statistics, benefits environmental conservation. In fact, even at the level of nation states, at which most quantification of governance has been undertaken, it seems that the relationship between governance and environmental preservation is only an assumption, because we are aware of no study that supports that hypothesis quantitatively. PMID:26208282
NASA Technical Reports Server (NTRS)
Aniversario, R. B.; Harvey, S. T.; Mccarty, J. E.; Parsons, J. T.; Peterson, D. C.; Pritchett, L. D.; Wilson, D. R.; Wogulis, E. R.
1983-01-01
The horizontal stabilizer of the 737 transport was redesigned. Five shipsets were fabricated using composite materials. Weight reduction greater than the 20% goal was achieved. Parts and assemblies were readily produced on production-type tooling. Quality assurance methods were demonstrated. Repair methods were developed and demonstrated. Strength and stiffness analytical methods were substantiated by comparison with test results. Cost data was accumulated in a semiproduction environment. FAA certification was obtained.
The effect of enforcement of the Master Settlement Agreement on youth exposure to print advertising.
Lieberman, Alan
2004-07-01
Enforcement of the Master Settlement Agreement's (MSA) prohibitions on youth targeting and the use of cartoons has resulted in a significant reduction in youth exposure to tobacco advertising. The MSA between the states and the tobacco companies has provided state officials with a new and powerful tool to address tobacco company marketing practices that may promote underage smoking. In the area of print advertising, enforcement of the MSA's prohibitions on youth targeting (MSA III[a]) and on the use of cartoons (MSA III[b]) has resulted in a significant reduction in youth exposure to tobacco advertising. The recent court decisions finding that R. J. Reynolds violated the youth targeting prohibition in its tobacco advertising in national magazines affirm the viability of the MSA's various restrictions and its enforcement mechanisms as a key way that state Attorneys General are responding to a range of tobacco company practices affecting youth.
1988-03-31
radar operation and data-collection activities, a large data-analysis effort has been under way in support of automatic wind-shear detection algorithm ... REDUCTION AND ALGORITHM DEVELOPMENT: A. General-Purpose Software; B. Concurrent Computer Systems; C. Sun Workstations; D. Radar Data Analysis (1. Algorithm Verification; 2. Other Studies; 3. Translations; 4. Outside Distributions); E. Mesonet/LLWAS Data Analysis (1. 1985 Data; 2 ...
Cannell, M Brad; Jetelina, Katelyn K; Zavadsky, Matt; Gonzalez, Jennifer M Reingle
2016-06-01
To develop a screening tool to enhance elder abuse and neglect detection and reporting rates among emergency medical technicians (EMTs). Our primary aim was to identify the most salient indicators of elder abuse and neglect for potential inclusion on a screening tool. We also sought to identify practical elements of the tool that would optimize EMT uptake and use in the field, such as format, length and number of items, and types of response options available. Qualitative data were collected from 23 EMTs and Adult Protective Services (APS) caseworkers who participated in one of five semi-structured focus groups. Focus group data were iteratively coded by two coders using inductive thematic identification and data reduction. Findings were subject to interpretation by the research team. EMTs and APS caseworkers identified eight domains of items that might be included on a screening tool: (1) exterior home condition; (2) interior living conditions; (3) social support; (4) medical history; (5) caregiving quality; (6) physical condition of the older adult; (7) older adult's behavior; and (8) EMTs' instincts. The screening tool should be based on observable cues in the physical or social environment, be very brief, easily integrated into electronic charting systems, and provide a decision rule for reporting guidance to optimize utility for EMTs in the field. We described characteristics of a screening tool for EMTs to enhance detection and reporting of elder abuse and neglect to APS. Future research should narrow identified items and evaluate how these domains positively predict confirmed cases of elder abuse and neglect.
NASA Technical Reports Server (NTRS)
1975-01-01
The results of the third and final phase of a study undertaken to define means of optimizing the Spacelab experiment data system by interactively manipulating the flow of data are presented. A number of payload applicable interactive techniques and an integrated interaction system for each of two possible payloads are described. These interaction systems have been functionally defined and are accompanied by block diagrams, hardware specifications, software sizing and speed requirements, operational procedures and cost/benefits analysis data for both onboard and ground-based system elements. It is shown that the accrued benefits are attributable to a reduction in data processing costs, obtained in general through a considerable reduction in the quantity of data that might otherwise be generated without interaction. An additional anticipated benefit is the increased scientific value obtained through the quicker return of all useful data.
GOSAT-2 : Science Plan, Products, Validation, and Application
NASA Astrophysics Data System (ADS)
Matsunaga, T.; Morino, I.; Yoshida, Y.; Saito, M.; Hiraki, K.; Yokota, Y.; Kamei, A.; Oishi, Y.; Dupuy, E.; Murakami, K.; Ninomiya, K.; Pang, J. S.; Yokota, T.; Maksyutov, S. S.; Machida, T.; Saigusa, N.; Mukai, H.; Nakajima, M.; Imasu, R.; Nakajima, T.
2013-12-01
Based on the success of the Greenhouse Gases Observing Satellite (GOSAT) launched in 2009, the Ministry of the Environment (MOE), the Japan Aerospace Exploration Agency (JAXA), and the National Institute for Environmental Studies (NIES) started the preparations for the follow-on satellite, GOSAT-2, in FY2011. The current target launch year of GOSAT-2 is FY2017. The objectives of GOSAT-2 include: 1) Continue and enhance spaceborne greenhouse gases observation started by GOSAT, 2) Improve our understanding of global and regional carbon cycles, and 3) Contribute to climate change related policies as one of the MRV (Measurement, Reporting, and Verification) tools for carbon emission reduction. As a scientific background/rationale of GOSAT-2, the GOSAT-2 Science Plan is being edited by the GOSAT-2 Science Team Preparation Committee. Not only carbon dioxide and methane but also carbon monoxide, tropospheric ozone, and aerosols are discussed in the plan. GOSAT-2 Level 2 (gas concentrations) and Level 4 (gas fluxes) products will be operationally generated at and distributed from the GOSAT-2 Data Handling Facility located in NIES. In addition, a new supercomputer dedicated to GOSAT-2 research and development will also be installed in NIES. The GOSAT-2 validation plan is also being discussed. Its baseline is similar to that of the current GOSAT, but various efforts will be made to extend the coverage of validation data for GOSAT-2. The efforts include an increased number of commercial passenger aircraft volunteering atmospheric measurements and additional ground-based Fourier transform spectrometers to be newly installed in Asian countries. In addition, a compact accelerator mass spectrometer is being introduced to NIES to investigate the contributions of anthropogenic emissions, which is important for GOSAT-2. Climate change related policies include the JCM (Joint Crediting Mechanism), in which MRV plays a critical role. MRV tools used in the existing JCM projects are mostly ground-based and site-specific. Satellite atmospheric measurements such as those from GOSAT or GOSAT-2 are expected to be more general and independent MRV tools and to complement ground-based tools. The current status of the GOSAT-2 science plan, products, validation, and application will be shown in the presentation.
Singh, Kunwar P; Singh, Arun K; Gupta, Shikha; Rai, Premanjali
2012-07-01
The present study aims to investigate the individual and combined effects of temperature, pH, zero-valent bimetallic nanoparticles (ZVBMNPs) dose, and chloramphenicol (CP) concentration on the reductive degradation of CP using ZVBMNPs in aqueous medium. Iron-silver ZVBMNPs were synthesized. Batch experimental data were generated using a four-factor statistical experimental design. CP reduction by ZVBMNPs was optimized using the response surface modeling (RSM) and artificial neural network-genetic algorithm (ANN-GA) approaches. The RSM and ANN methodologies were also compared for their predictive and generalization abilities using the same training and validation data set. Reductive by-products of CP were identified using the liquid chromatography-mass spectrometry technique. The optimized process variables (RSM and ANN-GA approaches) yielded CP reduction capacities of 57.37 and 57.10 mg g(-1), respectively, as compared to the experimental value of 54.0 mg g(-1) with un-optimized variables. The ANN-GA and RSM methodologies yielded comparable results and helped to achieve a higher reduction (>6%) of CP by the ZVBMNPs as compared to the experimental value. The root mean squared error, relative standard error of prediction, and correlation coefficient between the measured and model-predicted values of the response variable were 1.34, 3.79, and 0.964 for the RSM model and 0.03, 0.07, and 0.999 for the ANN model on the training set, and 1.39, 3.47, and 0.996 for the RSM model and 1.25, 3.11, and 0.990 for the ANN model on the validation set. The predictive and generalization abilities of both the RSM and ANN models were comparable. The synthesized ZVBMNPs may be used for an efficient reductive removal of CP from water.
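The three reported comparison metrics can be computed as below; the RSEP formula shown (expressed as a percentage of the observed values) is one common convention and is assumed here rather than taken from the paper:

```python
import numpy as np

def rmse(y_obs, y_pred):
    """Root mean squared error."""
    return float(np.sqrt(np.mean((np.asarray(y_pred) - np.asarray(y_obs)) ** 2)))

def rsep_percent(y_obs, y_pred):
    """Relative standard error of prediction (%), one common convention:
    100 * sqrt(sum((pred - obs)^2) / sum(obs^2))."""
    y_obs, y_pred = np.asarray(y_obs, float), np.asarray(y_pred, float)
    return float(100 * np.sqrt(np.sum((y_pred - y_obs) ** 2) / np.sum(y_obs ** 2)))

def pearson_r(y_obs, y_pred):
    """Correlation coefficient between measured and predicted values."""
    return float(np.corrcoef(y_obs, y_pred)[0, 1])

# Hypothetical measured vs. model-predicted CP reduction values (mg g^-1).
measured = np.array([41.2, 45.8, 50.1, 52.4, 54.0])
predicted = np.array([40.5, 46.3, 49.2, 53.0, 53.6])
print(rmse(measured, predicted), rsep_percent(measured, predicted), pearson_r(measured, predicted))
```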
A Concept for the One Degree Imager (ODI) Data Reduction Pipeline and Archiving System
NASA Astrophysics Data System (ADS)
Knezek, Patricia; Stobie, B.; Michael, S.; Valdes, F.; Marru, S.; Henschel, R.; Pierce, M.
2010-05-01
The One Degree Imager (ODI), currently being built by the WIYN Observatory, will provide tremendous possibilities for conducting diverse scientific programs. ODI will be a complex instrument, using non-conventional Orthogonal Transfer Array (OTA) detectors. Due to its large field of view, small pixel size, use of OTA technology, and expected frequent use, ODI will produce vast amounts of astronomical data. If ODI is to achieve its full potential, a data reduction pipeline must be developed. Long-term archiving must also be incorporated into the pipeline system to ensure the continued value of ODI data. This paper presents a concept for an ODI data reduction pipeline and archiving system. To limit costs and development time, our plan leverages existing software and hardware, including existing pipeline software, Science Gateways, Computational Grid & Cloud Technology, Indiana University's Data Capacitor and Massive Data Storage System, and TeraGrid compute resources. Existing pipeline software will be augmented to add functionality required to meet challenges specific to ODI, enhance end-user control, and enable the execution of the pipeline on grid resources including national grid resources such as the TeraGrid and Open Science Grid. The planned system offers consistent standard reductions and end-user flexibility when working with images beyond the initial instrument signature removal. It also gives end-users access to computational and storage resources far beyond what are typically available at most institutions. Overall, the proposed system provides a wide array of software tools and the necessary hardware resources to use them effectively.
NASA Technical Reports Server (NTRS)
Jeracki, Robert J. (Technical Monitor); Topol, David A.; Ingram, Clint L.; Larkin, Michael J.; Roche, Charles H.; Thulin, Robert D.
2004-01-01
This report presents results of the work completed on the preliminary design of Fan 3 of NASA's 22-inch Fan Low Noise Research project. Fan 3 was intended to build on the experience gained from Fans 1 and 2 by demonstrating noise reduction technology that surpasses 1992 levels by 6 dB. The work was performed as part of NASA's Advanced Subsonic Technology (AST) program. Work on this task was conducted in the areas of CFD code validation, acoustic prediction and validation, rotor parametric studies, and fan exit guide vane (FEGV) studies up to the time when a NASA decision was made to cancel the design, fabrication and testing phases of the work. The scope of the program changed accordingly to concentrate on two subtasks: (1) Rig data analysis and CFD code validation and (2) Fan and FEGV optimization studies. The results of the CFD code validation work showed that this tool predicts 3D flowfield features well from the blade trailing edge to about a chord downstream. The CFD tool loses accuracy as the distance from the trailing edge increases beyond a blade chord. The comparisons of noise predictions to rig test data showed that both the tone noise tool and the broadband noise tool demonstrated reasonable agreement with the data to the degree that these tools can reliably be used for design work. The section on rig airflow and inlet separation analysis describes the method used to determine total fan airflow, shows the good agreement of predicted boundary layer profiles to measured profiles, and shows separation angles of attack ranging from 29.5 to 27 degrees for the range of airflows tested. The results of the rotor parametric studies were significant in leading to the decision not to pursue a new rotor design for Fan 3 and resulted in recommendations to concentrate efforts on FEGV stator designs. The ensuing parametric study on FEGV designs showed the potential for 8 to 10 EPNdB noise reduction relative to the baseline.
Juszczyk, Dorota; Charlton, Judith; McDermott, Lisa; Soames, Jamie; Sultana, Kirin; Ashworth, Mark; Fox, Robin; Hay, Alastair D; Little, Paul; Moore, Michael V; Yardley, Lucy; Prevost, A Toby; Gulliford, Martin C
2016-01-01
Introduction Respiratory tract infections (RTIs) account for about 60% of antibiotics prescribed in primary care. This study aims to test the effectiveness, in a cluster randomised controlled trial, of electronically delivered, multicomponent interventions to reduce unnecessary antibiotic prescribing when patients consult for RTIs in primary care. The research will specifically evaluate the effectiveness of feeding back electronic health records (EHRs) data to general practices. Methods and analysis 2-arm cluster randomised trial using the EHRs of the Clinical Practice Research Datalink (CPRD). General practices in England, Scotland, Wales and Northern Ireland are being recruited and the general population of all ages represents the target population. Control trial arm practices will continue with usual care. Practices in the intervention arm will receive complex multicomponent interventions, delivered remotely to information systems, including (1) feedback of each practice's antibiotic prescribing through monthly antibiotic prescribing reports estimated from CPRD data; (2) delivery of educational and decision support tools; (3) a webinar to explain and promote effective usage of the intervention. The intervention will continue for 12 months. Outcomes will be evaluated from CPRD EHRs. The primary outcome will be the number of antibiotic prescriptions for RTIs per 1000 patient years. Secondary outcomes will be: the RTI consultation rate; the proportion of consultations for RTI with an antibiotic prescribed; subgroups of age; different categories of RTI and quartiles of intervention usage. There will be more than 80% power to detect an absolute reduction in antibiotic prescription for RTI of 12 per 1000 registered patient years. Total healthcare usage will be estimated from CPRD data and compared between trial arms. Ethics and dissemination Trial protocol was approved by the National Research Ethics Service Committee (14/LO/1730). The pragmatic design of the trial will enable subsequent translation of effective interventions at scale in order to achieve population impact. Trial registration number ISRCTN95232781; Pre-results. PMID:27491663
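The primary outcome above is expressed as a rate per 1000 registered patient-years. A minimal sketch of how that rate and the between-arm difference might be computed from aggregate counts; the counts below are invented placeholders, not trial data:

```python
def rate_per_1000_patient_years(prescriptions: int, patient_years: float) -> float:
    """Crude event rate per 1000 patient-years of follow-up."""
    return 1000.0 * prescriptions / patient_years

# Invented aggregate counts for illustration only.
control = rate_per_1000_patient_years(prescriptions=12650, patient_years=98000.0)
intervention = rate_per_1000_patient_years(prescriptions=11320, patient_years=97500.0)
print(f"control: {control:.1f}, intervention: {intervention:.1f}, "
      f"absolute reduction: {control - intervention:.1f} per 1000 patient-years")
```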
Pendergrass, Tyra M; Hieftje, Kimberly; Crusto, Cindy A; Montanaro, Erika; Fiellin, Lynn E
2016-08-01
Serious games are emerging as important tools that offer an innovative approach to teach adolescents behavioral skills to avoid risky situations. PlayForward: Elm City Stories, an interactive videogame targeting risk reduction, is currently undergoing evaluation. Collecting stakeholder data on its acceptability and real-life implementation strategies is critical for successful dissemination. We collected interview data from four stakeholder groups regarding incorporating PlayForward into settings with adolescents. Transcripts were coded, creating a comprehensive code structure for each stakeholder group. We conducted 40 semi-structured interviews that included 14 adolescents (aged 12-15 years; 10 boys), eight parents/guardians (all women), 12 after-school/school coordinators (nine women), and 14 community partners (13 women). We identified four themes that reflected stakeholders' perceptions about how the videogame might be implemented in real-world settings. (1) Stakeholder groups expressed that the topics of sex, alcohol, and drugs were not being taught in an educational setting. (2) Stakeholder groups saw a videogame as a viable option to teach about sex, alcohol, and drugs. (3) Stakeholder groups thought that the videogame would fit well into other settings, such as after-school programs or community organizations. (4) Some stakeholder groups highlighted additional tools that could help with implementation, such as manuals, homework assignments, and group discussion questions. Stakeholder groups supported the game as a delivery vehicle for targeted content, indicating high acceptability but highlighting additional tools that would aid in implementation.
González-Arenzana, Lucía; López-Alfaro, Isabel; Garde-Cerdán, Teresa; Portu, Javier; López, Rosa; Santamaría, Pilar
2018-03-23
This study was performed with the aim of reducing the microbial communities of wines after alcoholic fermentation to improve the establishment of a commercial Oenococcus oeni inoculum for developing the malolactic fermentation. Microbial community reduction was accomplished by applying Pulsed Electric Field (PEF) technology to four different wines. Overall, significant reductions in the yeast population were observed. To a lesser extent, lactic acid bacteria were reduced, while acetic acid bacteria were completely eliminated after the PEF treatment. In three out of the four tested wines, the decrease in competitive pressure between microorganisms due to the detected reduction led to a general but slight shortening of the malolactic fermentation duration. In the wine with the most adverse conditions for commercial starter establishment, the shortest malolactic fermentation was reached after PEF treatment. Finally, the sensorial quality of three out of the four treated wines was considered better after the PEF treatment. Therefore, PEF technology proved to be an important tool for improving malolactic fermentation performance. Copyright © 2018 Elsevier B.V. All rights reserved.
Hexose rearrangements upon fragmentation of N-glycopeptides and reductively aminated N-glycans.
Wuhrer, Manfred; Koeleman, Carolien A M; Deelder, André M
2009-06-01
Tandem mass spectrometry of glycans and glycoconjugates in protonated form is known to result in rearrangement reactions leading to internal residue loss. Here we studied the occurrence of hexose rearrangements in tandem mass spectrometry of N-glycopeptides and reductively aminated N-glycans by MALDI-TOF/TOF-MS/MS and ESI-ion trap-MS/MS. Fragmentation of proton adducts of oligomannosidic N-glycans of ribonuclease B that were labeled with 2-aminobenzamide and 2-aminobenzoic acid resulted in transfer of one to five hexose residues to the fluorescently tagged innermost N-acetylglucosamine. Glycopeptides from various biological sources with oligomannosidic glycans were likewise shown to undergo hexose rearrangement reactions, resulting in chitobiose cleavage products that have acquired one or two hexose moieties. Tryptic immunoglobulin G Fc-glycopeptides with biantennary N-glycans likewise showed hexose rearrangements resulting in hexose transfer to the peptide moiety retaining the innermost N-acetylglucosamine. Thus, as a general phenomenon, tandem mass spectrometry of reductively aminated glycans as well as glycopeptides may result in hexose rearrangements. This characteristic of glycopeptide MS/MS has to be considered when developing tools for de novo glycopeptide structural analysis.
International Charter of principles for sharing bio-specimens and data.
Mascalzoni, Deborah; Dove, Edward S; Rubinstein, Yaffa; Dawkins, Hugh J S; Kole, Anna; McCormack, Pauline; Woods, Simon; Riess, Olaf; Schaefer, Franz; Lochmüller, Hanns; Knoppers, Bartha M; Hansson, Mats
2015-06-01
There is growing international agreement on the need to provide greater access to research data and bio-specimen collections to optimize their long-term value and exploit their potential for health discovery and validation. This is especially evident for rare disease research. Currently, the rising value of data and bio-specimen collections does not correspond with an equal increase in data/sample-sharing and data/sample access. Contradictory legal and ethical frameworks across national borders are obstacles to effective sharing: more specifically, the absence of an integrated model proves to be a major logistical obstruction. The Charter intends to remove this obstacle by providing both the ethical foundations on which data sharing should be based and a general Material and Data Transfer Agreement (MTA/DTA). This Charter is the result of a careful negotiation of different stakeholders' interests and is built on earlier consensus documents and position statements, which provided the general international legal framework. Further to this, the Charter provides tools that may help accelerate sharing. The Charter has been formulated to serve as an enabling tool for effective and transparent data and bio-specimen sharing, and the general MTA/DTA constitutes a mechanism to ensure uniformity of access across projects and countries; it may be regarded as a consistent basic agreement for addressing data and material sharing globally. The Charter is forward looking in terms of emerging issues from the perspective of a multi-stakeholder group and, where possible, provides strategies that may address these issues.
Decoding-Accuracy-Based Sequential Dimensionality Reduction of Spatio-Temporal Neural Activities
NASA Astrophysics Data System (ADS)
Funamizu, Akihiro; Kanzaki, Ryohei; Takahashi, Hirokazu
The performance of a brain-machine interface (BMI) critically depends on the selection of input data, because the information embedded in neural activities is highly redundant. In addition, properly selected input data of reduced dimension improve decoding generalization ability and decrease computational effort, both of which are significant advantages for clinical applications. In the present paper, we propose a sequential dimensionality reduction (SDR) algorithm that effectively extracts motor/sensory-related spatio-temporal neural activities. The algorithm gradually reduces the input data dimension by dropping neural data spatio-temporally while degrading the decoding accuracy as little as possible. A support vector machine (SVM) was used as the decoder, and tone-induced neural activities in rat auditory cortices were decoded into the test tone frequencies. SDR reduced the input data dimension to a quarter and significantly improved the accuracy of decoding of novel data. Moreover, the spatio-temporal neural activity patterns selected by SDR resulted in significantly higher accuracy than high-spike-rate patterns or conventionally used spatial patterns. These results suggest that the proposed algorithm can improve the generalization ability and decrease the computational effort of decoding.
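The greedy backward-elimination idea behind SDR can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code: the decoder is scikit-learn's linear SVM, the selection criterion is plain cross-validated accuracy, and the function name, arguments and stopping rule (a target dimension) are assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def sequential_dimensionality_reduction(X, y, target_dim, cv=5):
    """Greedy backward elimination of spatio-temporal features.

    X : (n_trials, n_features) array, each feature one neuron x time bin;
    y : stimulus labels (e.g. test-tone frequency).  At each step the feature
    whose removal hurts cross-validated decoding accuracy the least is dropped.
    """
    keep = list(range(X.shape[1]))
    while len(keep) > target_dim:
        best_acc, worst_feat = -np.inf, None
        for f in keep:
            trial = [k for k in keep if k != f]
            acc = cross_val_score(SVC(kernel="linear"), X[:, trial], y, cv=cv).mean()
            if acc > best_acc:
                best_acc, worst_feat = acc, f
        keep.remove(worst_feat)  # drop the least informative feature
    return keep
```

In the paper the dimension was reduced to roughly a quarter of the original; the `target_dim` argument plays that role here.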
A cross-validation package driving Netica with python
Fienen, Michael N.; Plant, Nathaniel G.
2014-01-01
Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross-validation is a technique to avoid the overfitting that results from overly complex BNs; overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and to read, rebuild, and learn BNs from data. Insights gained from cross-validation and implications for prediction versus description are illustrated with a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than the supporting data allow, and that overfitting incurs computational costs as well as a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).
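The core of any such cross-validation study is a k-fold loop that compares calibration skill with validation skill as model complexity grows. The sketch below is a generic, library-agnostic version of that loop (it does not use the Netica API, and the `fit`/`skill` callables are placeholders), offered only to make the idea concrete.

```python
import numpy as np
from sklearn.model_selection import KFold

def kfold_skill(fit, skill, X, y, k=5, seed=0):
    """Generic k-fold cross-validation loop.

    fit(X, y) returns a fitted model; skill(model, X, y) returns a scalar
    performance metric.  X and y are NumPy arrays.  Comparing calibration
    versus validation skill as model complexity grows reveals overfitting.
    """
    cal, val = [], []
    for train, test in KFold(k, shuffle=True, random_state=seed).split(X):
        model = fit(X[train], y[train])
        cal.append(skill(model, X[train], y[train]))   # calibration skill
        val.append(skill(model, X[test], y[test]))     # validation skill
    return np.mean(cal), np.mean(val)
```

When the validation skill starts to drop while the calibration skill keeps rising, the model has become more complex than the supporting data allow.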
Fostering Outreach, Education and Exploration of the Moon Using the Lunar Mapping & Modeling Portal
NASA Astrophysics Data System (ADS)
Dodge, K.; Law, E.; Malhotra, S.; Chang, G.; Kim, R. M.; Bui, B.; Sadaqathullah, S.; Day, B. H.
2014-12-01
The Lunar Mapping and Modeling Portal (LMMP)[1], is a web-based Portal and a suite of interactive visualization and analysis tools for users to access mapped lunar data products (including image mosaics, digital elevation models, etc.) from past and current lunar missions (e.g., Lunar Reconnaissance Orbiter, Apollo, etc.). Originally designed as a mission planning tool for the Constellation Program, LMMP has grown into a generalized suite of tools facilitating a wide range of activities in support of lunar exploration including public outreach, education, lunar mission planning and scientific research. LMMP fosters outreach, education, and exploration of the Moon by educators, students, amateur astronomers, and the general public. These efforts are enhanced by Moon Tours, LMMP's mobile application, which makes LMMP's information accessible to people of all ages, putting opportunities for real lunar exploration in the palms of their hands. Our talk will include an overview of LMMP and a demonstration of its technologies (web portals, mobile apps), to show how it serves NASA data as commodities for use by advanced visualization facilities (e.g., planetariums) and how it contributes to improving teaching and learning, increasing scientific literacy of the general public, and enriching STEM efforts. References:[1] http://www.lmmp.nasa.gov
How to Identify a Domain-General Learning Mechanism when You See One
ERIC Educational Resources Information Center
Rakison, David H.; Yermolayeva, Yevdokiya
2011-01-01
A longstanding and fundamental debate in developmental science is whether knowledge is acquired through domain-specific or domain-general mechanisms. To date, there exists no tool to determine whether experimental data support one theoretical approach or the other. In this article, we argue that the U- and N-shaped curves found in a number of…
Cure-WISE: HETDEX data reduction with Astro-WISE
NASA Astrophysics Data System (ADS)
Snigula, J. M.; Cornell, M. E.; Drory, N.; Fabricius, Max.; Landriau, M.; Hill, G. J.; Gebhardt, K.
2012-09-01
The Hobby-Eberly Telescope Dark Energy Experiment (HETDEX) is a blind spectroscopic survey to map the evolution of dark energy using Lyman-alpha emitting galaxies at redshifts 1.9 < z < 3.5 as tracers. The survey instrument, VIRUS, consists of 75 IFUs distributed across the 22-arcmin field of the upgraded 9.2-m HET. Each exposure gathers 33,600 spectra. Over the projected five-year run of the survey we expect about 170 GB of data per night. For the data reduction we developed the Cure pipeline. Cure is designed to automatically find and calibrate the observed spectra, subtract the sky background, and detect and classify different types of sources. Cure employs rigorous statistical methods and complete pixel-level error propagation throughout the reduction process to ensure Poisson-limited performance and meaningful significance values. To automate the reduction of the whole dataset we implemented the Cure pipeline in the Astro-WISE framework. This integration provides for HETDEX a database backend with complete dependency tracking of the various reduction steps, automated checks, and a searchable interface to the detected sources and user management. It can be used to create various web interfaces for data access and quality control. Astro-WISE allows us to reduce the data from all the IFUs in parallel on a compute cluster. This cluster allows us to reduce the observed data in quasi real time and still have excess capacity for rerunning parts of the reduction. Finally, the Astro-WISE interface will be used to provide access to reduced data products to the general community.
Archiving, sharing, processing and publishing historical earthquakes data: the IT point of view
NASA Astrophysics Data System (ADS)
Locati, Mario; Rovida, Andrea; Albini, Paola
2014-05-01
Digital tools devised for seismological data are mostly designed for handling instrumentally recorded data. Researchers working on historical seismology are therefore forced to do their daily work with general-purpose tools and/or code of their own to address their specific tasks. The lack of out-of-the-box tools expressly conceived to deal with historical data leads to a huge amount of time lost in tedious tasks: searching for the data and manually reformatting it to move from one tool to the other, sometimes causing a loss of the original data. This reality is common to all activities related to the study of earthquakes of past centuries, from the interpretation of historical sources to the compilation of earthquake catalogues. A platform able to preserve the historical earthquake data, trace back their source, and fulfil many common tasks was very much needed. In the framework of two European projects (NERIES and SHARE) and one global project (Global Earthquake History, GEM), two new data portals were designed and implemented. The European portal "Archive of Historical Earthquakes Data" (AHEAD) and the worldwide "Global Historical Earthquake Archive" (GHEA) are aimed at addressing at least some of the above-mentioned issues. The availability of these new portals and their well-defined standards makes it easier than before to develop side tools for archiving, publishing and processing the available historical earthquake data. The AHEAD and GHEA portals, their underlying technologies and the developed side tools are presented.
Design of a practical model-observer-based image quality assessment method for CT imaging systems
NASA Astrophysics Data System (ADS)
Tseng, Hsin-Wu; Fan, Jiahua; Cao, Guangzhi; Kupinski, Matthew A.; Sainath, Paavana
2014-03-01
The channelized Hotelling observer (CHO) is a powerful method for quantitative image quality evaluations of CT systems and their image reconstruction algorithms. It has recently been used to validate the dose reduction capability of iterative image-reconstruction algorithms implemented on CT imaging systems. The use of the CHO for routine and frequent system evaluations is desirable both for quality assurance evaluations and for further system optimization. The use of channels substantially reduces the amount of data required to achieve accurate estimates of observer performance. However, the number of scans required is still large even with the use of channels. This work explores different data reduction schemes and designs a new approach that requires only a few CT scans of a phantom. For this work, the leave-one-out likelihood (LOOL) method developed by Hoffbeck and Landgrebe is studied as an efficient method of estimating the covariance matrices needed to compute CHO performance. Three different kinds of approaches are included in the study: a conventional CHO estimation technique with a large sample size, a conventional technique with fewer samples, and the new LOOL-based approach with fewer samples. The mean value and standard deviation of the area under the ROC curve (AUC) are estimated by a shuffle method. Both simulation and real data results indicate that an 80% data reduction can be achieved without loss of accuracy. This data reduction makes the proposed approach a practical tool for routine CT system assessment.
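For readers unfamiliar with the observer model, the channelized Hotelling computation itself is compact. The sketch below is a minimal NumPy version under common assumptions (channelized image vectors for signal-present and signal-absent scans, and a pooled sample covariance rather than the LOOL estimator studied in the paper):

```python
import numpy as np

def cho_snr(signal_imgs, noise_imgs, channels):
    """Channelized Hotelling observer detectability (SNR).

    signal_imgs, noise_imgs : (n_images, n_pixels) arrays of signal-present
    and signal-absent reconstructions; channels : (n_pixels, n_channels)
    channel matrix (e.g. Gabor or Laguerre-Gauss channels).
    """
    # Project the images onto the channels.
    vs = signal_imgs @ channels          # (n_signal, n_channels)
    vn = noise_imgs @ channels           # (n_noise, n_channels)

    # Class-mean difference and pooled intra-class covariance.
    ds = vs.mean(axis=0) - vn.mean(axis=0)
    cov = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vn, rowvar=False))

    # Hotelling template and detectability index.
    w = np.linalg.solve(cov, ds)
    snr = np.sqrt(ds @ w)
    return snr, w
```

The LOOL method of Hoffbeck and Landgrebe replaces the plain sample covariance with a regularized estimate whose mixing parameter is chosen by leave-one-out likelihood, which is what allows accurate CHO estimates from far fewer scans.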
NASA Astrophysics Data System (ADS)
Busonero, D.; Gai, M.
The goals of 21st century high angular precision experiments rely on the limiting performance associated with the selected instrumental configuration and observational strategy. Both global and narrow-angle micro-arcsec space astrometry require that the instrument contributions to the overall error budget be less than the desired micro-arcsec level precision. Appropriate modelling of the astrometric response is required for optimal definition of the data reduction and calibration algorithms, in order to ensure high sensitivity to the astrophysical source parameters and, in general, high accuracy. We refer to the framework of the SIM-Lite and Gaia missions, the most challenging space missions of the next decade in the narrow-angle and global astrometry fields, respectively. We focus our discussion on the Gaia data reduction issues and instrument calibration implications. We describe selected topics in the framework of the Astrometric Instrument Modelling for the Gaia mission, evidencing their role in the data reduction chain, and we give a brief overview of the Astrometric Instrument Model Data Analysis Software System, a Java-based pipeline under development by our team.
Teledermatology Helps Doctors and Hospitals to Serve Their Clients
NASA Astrophysics Data System (ADS)
Witkamp, Leonard
Telemedicine increases efficiency and accelerates the development and use of the internet-based electronic patient record. The broad use of telemedicine is hampered by rigid decision structures, slow adaptation processes and concern for its consequences. Health Management Practice (HMP) addresses these issues by developing, investigating and implementing telemedicine tools in a modular way. KSYOS TeleMedical Centre, the first virtual healthcare institution in The Netherlands, has successfully applied HMP to teledermatology. Teledermatology has led to high satisfaction and learning effects, a 65.1% referral reduction, 40% cost savings, and better quality of care. Teledermatology is an excellent tool for hospitals to manage their waiting lists, increase and strengthen their contacts with general practitioners, and provide them and the patients with better service. HMP has enabled KSYOS to perform over 16,000 teleconsultations and to expand teledermatology to other EU countries, as well as to other areas such as teleophthalmology, telespirometry and telecardiology.
Undergraduate Education with the WIYN 0.9-m Telescope
NASA Astrophysics Data System (ADS)
Pilachowski, Catherine A.
2017-01-01
Several models have been explored at Indiana University Bloomington for undergraduate student engagement in astronomy using the WIYN 0.9-m telescope at Kitt Peak. These models include individual student research projects using the telescope, student observations as part of an observational techniques course for majors, and enrichment activities for non-science majors in general education courses. Where possible, we arrange for students to travel to the telescope. More often, we are able to use simple online tools such as Skype and VNC viewers to give students an authentic observing experience. Experiences with the telescope motivate students to learn basic content in astronomy, including the celestial sphere, the electromagnetic spectrum, telescopes and detectors, the variety of astronomical objects, data reduction processes, image analysis, and color image creation and appreciation. The WIYN 0.9-m telescope is an essential tool for our program at all levels of undergraduate education.
Quality tracing in meat supply chains
Mack, Miriam; Dittmer, Patrick; Veigt, Marius; Kus, Mehmet; Nehmiz, Ulfert; Kreyenschmidt, Judith
2014-01-01
The aim of this study was the development of a quality tracing model for vacuum-packed lamb that is applicable in different meat supply chains. Based on the development of relevant sensory parameters, the predictive model was developed by combining a linear primary model and the Arrhenius model as the secondary model. Then a process analysis was conducted to define general requirements for the implementation of the temperature-based model into a meat supply chain. The required hardware and software for continuous temperature monitoring were developed in order to use the model under practical conditions. Further on a decision support tool was elaborated in order to use the model as an effective tool in combination with the temperature monitoring equipment for the improvement of quality and storage management within the meat logistics network. Over the long term, this overall procedure will support the reduction of food waste and will improve the resources efficiency of food production. PMID:24797136
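The model structure described above (a linear primary model whose rate constant follows the Arrhenius equation) can be sketched as follows; the parameter names and the direction of quality change are assumptions for illustration, not the fitted values from the study.

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol*K)

def quality_index(t_hours, temps_kelvin, q0, A, Ea):
    """Predict a sensory quality index along a logged temperature history.

    Zero-order (linear) primary model dQ/dt = -k(T), with the rate constant
    following the Arrhenius equation k(T) = A * exp(-Ea / (R*T)).
    q0, A and Ea are hypothetical, product-specific parameters that would be
    fitted to sensory data for vacuum-packed lamb.
    """
    t = np.asarray(t_hours, dtype=float)
    T = np.asarray(temps_kelvin, dtype=float)
    k = A * np.exp(-Ea / (R * T))
    dt = np.diff(t, prepend=t[0])        # hours spent at each logged sample
    return q0 - np.cumsum(k * dt)        # predicted quality at each sample
```

Fed with the temperature history logged by the monitoring hardware, a function of this kind yields a predicted quality index at any point in the supply chain, which is the basis of the decision support tool.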
Strategies to improve industrial energy efficiency
NASA Astrophysics Data System (ADS)
O'Rielly, Kristine M.
A lack of technical expertise, fueled by a lack of positive examples, can lead to companies opting not to implement energy reduction projects unless mandated by legislation. As a result, companies are missing out on exceptional opportunities to improve not only their environmental record but also save considerably on fuel costs. This study investigates the broad topic of energy efficiency within the context of the industrial sector by means of a thorough review of existing energy reduction strategies and a demonstration of their successful implementation. The study begins by discussing current industrial energy consumption trends around the globe and within the Canadian manufacturing sector. This is followed by a literature review which outlines 3 prominent energy efficiency improvement strategies currently available to companies: 1) Waste heat recovery, 2) Idle power loss reduction and production rate optimization, and lastly 3) Auxiliary equipment operational performance. Next, a broad overview of the resources and tools available to organizations looking to improve their industrial energy efficiency is provided. Following this, several case studies are presented which demonstrate the potential benefits that are available to Canadian organizations looking to improve their energy efficiency. Lastly, a discussion of a number of issues and barriers pertaining to the wide-scale implementation of industrial efficiency strategies is presented. It discusses a number of potential roadblocks, including a lack of energy consumption monitoring and data transparency. While this topic has been well researched in the past in terms of the losses encountered during various general manufacturing process streams, practically no literature exists which attempts to provide real data from companies who have implemented energy efficiency strategies. By obtaining original data directly from companies, this thesis demonstrates the potential for companies to save money and reduce GHG (greenhouse gas) emissions through the implementation of energy efficiency projects and publishes numbers which are almost impossible to find directly. By publishing success stories, it is hoped that other companies, especially SMEs (small and medium enterprises) will be able to learn from these case studies and be inspired to embark on energy efficiency projects of their own.
The Tool for the Reduction and Assessment of Chemical and other environmental Impacts (TRACI) was developed to allow the quantification of environmental impacts for a variety of impact categories which are necessary for a comprehensive impact assessment. See Figure 1. TRACI is c...
ERIC Educational Resources Information Center
Beltran-Cruz, Maribel; Cruz, Shannen Belle B.
2013-01-01
This study explored the use of social media as a tool in enhancing student's learning experiences, by using online instruction as a supplement to a face-to-face general education course, such as biological sciences. Survey data were collected from 186 students who were enrolled in a Biological Sciences course. The course was taught in a blended…
Code of Federal Regulations, 2012 CFR
2012-01-01
... through the Federal Automotive Statistical Tool (FAST), an Internet-based reporting tool. To find out how to submit motor vehicle data to GSA through FAST, consult the instructions from your agency fleet...; and (5) Fuel used. Note to § 102-34.335: The FAST system is also used by agency Fleet Managers to...
Code of Federal Regulations, 2010 CFR
2010-07-01
... through the Federal Automotive Statistical Tool (FAST), an Internet-based reporting tool. To find out how to submit motor vehicle data to GSA through FAST, consult the instructions from your agency fleet...; and (5) Fuel used. Note to § 102-34.335: The FAST system is also used by agency Fleet Managers to...
Code of Federal Regulations, 2014 CFR
2014-01-01
... through the Federal Automotive Statistical Tool (FAST), an Internet-based reporting tool. To find out how to submit motor vehicle data to GSA through FAST, consult the instructions from your agency fleet...; and (5) Fuel used. Note to § 102-34.335: The FAST system is also used by agency Fleet Managers to...
Code of Federal Regulations, 2011 CFR
2011-01-01
... through the Federal Automotive Statistical Tool (FAST), an Internet-based reporting tool. To find out how to submit motor vehicle data to GSA through FAST, consult the instructions from your agency fleet...; and (5) Fuel used. Note to § 102-34.335: The FAST system is also used by agency Fleet Managers to...
Code of Federal Regulations, 2013 CFR
2013-07-01
... through the Federal Automotive Statistical Tool (FAST), an Internet-based reporting tool. To find out how to submit motor vehicle data to GSA through FAST, consult the instructions from your agency fleet...; and (5) Fuel used. Note to § 102-34.335: The FAST system is also used by agency Fleet Managers to...
SUPER-FOCUS: A tool for agile functional analysis of shotgun metagenomic data
Silva, Genivaldo Gueiros Z.; Green, Kevin T.; Dutilh, Bas E.; ...
2015-10-09
Analyzing the functional profile of a microbial community from unannotated shotgun sequencing reads is one of the important goals in metagenomics. Functional profiling has valuable applications in biological research because it identifies the abundances of the functional genes of the organisms present in the original sample, answering the question of what they can do. Currently available tools do not scale well with increasing data volumes, which is important because both the number and lengths of the reads produced by sequencing platforms keep increasing. Here, we introduce SUPER-FOCUS, SUbsystems Profile by databasE Reduction using FOCUS, an agile homology-based approach using a reduced reference database to report the subsystems present in metagenomic datasets and profile their abundances. We tested SUPER-FOCUS with over 70 real metagenomes, the results showing that it accurately predicts the subsystems present in the profiled microbial communities, and is up to 1000 times faster than other tools.
40 CFR 63.642 - General standards.
Code of Federal Regulations, 2010 CFR
2010-07-01
... reduction. (4) Data shall be reduced in accordance with the EPA-approved methods specified in the applicable section or, if other test methods are used, the data and methods shall be validated according to the protocol in Method 301 of appendix A of this part. (e) Each owner or operator of a source subject to this...
Assessment of a Targeted Trap-Neuter-Return Pilot Study in Auckland, New Zealand
Zito, Sarah; Vigeant, Shalsee; Dale, Arnja
2018-01-01
Simple Summary It is generally accepted that stray cats need to be managed to minimise the associated negative impacts and there is a need for effective and humane management tools. One such potential tool is trap-neuter-return (TNR), which anecdotally has been used in New Zealand to manage stray cats, but no concerted and targeted implementation of this technique has been reported, nor any formal assessments conducted. A targeted TNR (TTNR) programme for urban stray cats was implemented and assessed in one Auckland suburb. Assessment was based on the number of incoming felines; stray, unsocialised cats euthanased; unsocialised, unowned cats sterilised and returned (independently of the TTNR programme); and neonatal/underage euthanasias. Incoming stray feline, underage euthanasia, and unsocialised stray cat euthanasia numbers all reduced for the targeted suburb when these outcome measures were compared for the years before and after the programme. These outcome measures had a greater reduction in the targeted suburb compared to the other Auckland suburbs not targeted by the TTNR programme, although causation cannot be inferred, as a variety of reasons could have contributed to the changes. This pilot programme suggests that TTNR could be a valuable humane cat management tool in urban New Zealand, and further assessment is warranted. Abstract There is a need for effective and humane management tools to manage urban stray cats and minimise negative impacts associated with stray cats. One such tool is targeted trap-neuter-return (TTNR), but no concerted implementation of this technique or formal assessments have been reported. To address this deficit, a TTNR programme was implemented and assessed in one Auckland suburb from May 2015 to June 2016; the programme sterilised and returned 348 cats (4.2 cats/1000 residents). Assessment was based on the number of incoming felines; stray, unsocialised cats euthanased; unsocialised, unowned cats sterilised and returned (independently of the TTNR programme); and neonatal/underage euthanasias. Incoming stray felines, underage euthanasias, and unsocialised stray cat euthanasias were all reduced for the targeted suburb when compared for the years before and after the programme (the percentage reduction in these parameters was −39, −17, −34, −7, and −47, respectively). These outcome measures had a greater reduction in the targeted suburb compared to the Auckland suburbs not targeted by the TTNR programme (p < 0.01), although causation cannot be inferred, as a variety of reasons could have contributed to the changes. This pilot programme suggests that TTNR could be a valuable, humane cat management tool in urban New Zealand, and further assessment is warranted. PMID:29757255
NCAR Earth Observing Laboratory's Data Tracking System
NASA Astrophysics Data System (ADS)
Cully, L. E.; Williams, S. F.
2014-12-01
The NCAR Earth Observing Laboratory (EOL) maintains an extensive collection of complex, multi-disciplinary datasets from national and international, current and historical projects accessible through field project web pages (https://www.eol.ucar.edu/all-field-projects-and-deployments). Data orders are processed through the EOL Metadata Database and Cyberinfrastructure (EMDAC) system. Behind the scenes is the institutionally created EOL Computing, Data, and Software/Data Management Group (CDS/DMG) Data Tracking System (DTS) tool. The DTS is used to track the complete life cycle (from ingest to long term stewardship) of the data, metadata, and provenance for hundreds of projects and thousands of data sets. The DTS is an EOL internal only tool which consists of three subsystems: Data Loading Notes (DLN), Processing Inventory Tool (IVEN), and Project Metrics (STATS). The DLN is used to track and maintain every dataset that comes to the CDS/DMG. The DLN captures general information such as title, physical locations, responsible parties, high level issues, and correspondence. When the CDS/DMG processes a data set, IVEN is used to track the processing status while collecting sufficient information to ensure reproducibility. This includes detailed "How To" documentation, processing software (with direct links to the EOL Subversion software repository), and descriptions of issues and resolutions. The STATS subsystem generates current project metrics such as archive size, data set order counts, "Top 10" most ordered data sets, and general information on who has ordered these data. The DTS was developed over many years to meet the specific needs of the CDS/DMG, and it has been successfully used to coordinate field project data management efforts for the past 15 years. This paper will describe the EOL CDS/DMG Data Tracking System including its basic functionality, the provenance maintained within the system, lessons learned, potential improvements, and future developments.
NASA Astrophysics Data System (ADS)
Shinnaga, H.; Humphreys, E.; Indebetouw, R.; Villard, E.; Kern, J.; Davis, L.; Miura, R. E.; Nakazato, T.; Sugimoto, K.; Kosugi, G.; Akiyama, E.; Muders, D.; Wyrowski, F.; Williams, S.; Lightfoot, J.; Kent, B.; Momjian, E.; Hunter, T.; ALMA Pipeline Team
2015-12-01
The ALMA Pipeline is the automated data reduction tool that runs on ALMA data. The current version of the ALMA Pipeline produces science-quality data products for standard interferometric observing modes, up to and including the calibration process. The ALMA Pipeline comprises (1) heuristics, in the form of Python scripts, that select the best processing parameters, and (2) contexts that are kept for book-keeping of the data processing. The ALMA Pipeline produces a "weblog" with detailed plots so that users can judge how each calibration step was handled. The ALMA Interferometric Pipeline was conditionally accepted in March 2014 after processing Cycle 0 and Cycle 1 data sets. From Cycle 2, the ALMA Pipeline is used for ALMA data reduction and quality assurance for projects whose observing modes it supports. Pipeline tasks are based on CASA version 4.2.2, and the first public pipeline release, called CASA 4.2.2-pipe, has been available since October 2014. ALMA data can be reduced with both CASA tasks and pipeline tasks using CASA version 4.2.2-pipe.
Kara, E O; Elliot, A J; Bagnall, H; Foord, D G F; Pnaiser, R; Osman, H; Smith, G E; Olowokure, B
2012-07-01
Certain influenza outbreaks, including the 2009 influenza A(H1N1) pandemic, can predominantly affect school-age children. Therefore the use of school absenteeism data has been considered as a potential tool for providing early warning of increasing influenza activity in the community. This study retrospectively evaluates the usefulness of these data by comparing them with existing syndromic surveillance systems and laboratory data. Weekly mean percentages of absenteeism in 373 state schools (children aged 4-18 years) in Birmingham, UK, from September 2006 to September 2009, were compared with established syndromic surveillance systems including a telephone health helpline, a general practitioner sentinel network and laboratory data for influenza. Correlation coefficients were used to examine the relationship between each syndromic system. In June 2009, school absenteeism generally peaked concomitantly with the existing influenza surveillance systems in England. Weekly school absenteeism surveillance would not have detected pandemic influenza A(H1N1) earlier but daily absenteeism data and the development of baselines could improve the timeliness of the system.
Installation and Testing of ITER Integrated Modeling and Analysis Suite (IMAS) on DIII-D
NASA Astrophysics Data System (ADS)
Lao, L.; Kostuk, M.; Meneghini, O.; Smith, S.; Staebler, G.; Kalling, R.; Pinches, S.
2017-10-01
A critical objective of the ITER Integrated Modeling Program is the development of IMAS to support ITER plasma operation and research activities. An IMAS framework has been established based on the earlier work carried out within the EU. It consists of a physics data model and a workflow engine. The data model is capable of representing both simulation and experimental data and is applicable to ITER and other devices. IMAS has been successfully installed on a local DIII-D server using a flexible installer capable of managing the core data access tools (Access Layer and Data Dictionary) and optionally the Kepler workflow engine and coupling tools. A general adaptor for OMFIT (a workflow engine) is being built for adaptation of any analysis code to IMAS using a new IMAS universal access layer (UAL) interface developed from an existing OMFIT EU Integrated Tokamak Modeling UAL. Ongoing work includes development of a general adaptor for EFIT and TGLF based on this new UAL that can be readily extended for other physics codes within OMFIT. Work supported by US DOE under DE-FC02-04ER54698.
Application of conditional moment tests to model checking for generalized linear models.
Pan, Wei
2002-06-01
Generalized linear models (GLMs) are increasingly being used in daily data analysis. However, model checking for GLMs with correlated discrete response data remains difficult. In this paper, through a case study on marginal logistic regression using a real data set, we illustrate the flexibility and effectiveness of using conditional moment tests (CMTs), along with other graphical methods, to do model checking for generalized estimating equation (GEE) analyses. Although CMTs provide an array of powerful diagnostic tests for model checking, they were originally proposed in the econometrics literature and, to our knowledge, have never been applied to GEE analyses. CMTs cover many existing tests, including the (generalized) score test for an omitted covariate, as special cases. In summary, we believe that CMTs provide a class of useful model checking tools.
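As a concrete instance of the special case mentioned above, the sketch below implements the efficient score test for an omitted covariate in an ordinary (independent-data) logistic regression using statsmodels and NumPy. It is a simplified illustration of the kind of moment condition a CMT checks, not the GEE version developed in the paper; the function and variable names are ours.

```python
import numpy as np
import statsmodels.api as sm

def omitted_covariate_score_test(y, X, z):
    """Score test for an omitted covariate z in a logistic regression on X.

    y : binary responses, X : (n, k) design matrix including the intercept,
    z : candidate omitted covariate (length n).  Returns the chi-square(1)
    test statistic for the moment condition E[z * (y - mu)] = 0.
    """
    fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
    p = fit.fittedvalues                       # fitted probabilities
    w = p * (1.0 - p)                          # GLM variance weights
    u = z @ (y - p)                            # score for the added covariate
    # Variance of u, corrected for estimation of the nuisance parameters.
    XtWX = X.T @ (X * w[:, None])
    XtWz = X.T @ (w * z)
    v = z @ (w * z) - XtWz @ np.linalg.solve(XtWX, XtWz)
    return u ** 2 / v
```

The quadratic-form correction in `v` accounts for the fact that the regression parameters were estimated rather than known, which is what distinguishes the efficient score test from a naive moment check.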
MURMoT. Design and Application of Microbial Uranium Reduction Monitoring Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loeffler, Frank E.
2014-12-31
Uranium (U) contamination in the subsurface is a major remediation challenge at many DOE sites. Traditional site remedies present enormous costs to DOE; hence, enhanced bioremediation technologies (i.e., biostimulation and bioaugmentation) combined with monitoring efforts are being considered as cost-effective corrective actions to address subsurface contamination. This research effort improved understanding of the microbial U reduction process and developed new tools for monitoring microbial activities. Application of these tools will promote science-based site management decisions that achieve contaminant detoxification, plume control, and long-term stewardship in the most efficient manner. The overarching hypothesis was that the design, validation and application of a suite of new molecular and biogeochemical tools advance process understanding, and improve environmental monitoring regimes to assess and predict in situ U immobilization. Accomplishments: This project (i) advanced nucleic acid-based approaches to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-detoxifying bacteria; (ii) developed proteomics workflows for detection of metal reduction biomarker proteins in laboratory cultures and contaminated site groundwater; (iii) developed and demonstrated the utility of U isotopic fractionation using high precision mass spectrometry to quantify U(VI) reduction for a range of reduction mechanisms and environmental conditions; and (iv) validated the new tools using field samples from U-contaminated IFRC sites, and demonstrated their prognostic and diagnostic capabilities in guiding decision making for environmental remediation and long-term site stewardship.
Kinetic modeling of liquefied petroleum gas (LPG) reduction of titania in MATLAB
NASA Astrophysics Data System (ADS)
Yin, Tan Wei; Ramakrishnan, Sivakumar; Rezan, Sheikh Abdul; Noor, Ahmad Fauzi Mohd; Izah Shoparwe, Noor; Alizadeh, Reza; Roohi, Parham
2017-04-01
In the present study, the reduction of titania (TiO2) by a liquefied petroleum gas (LPG)-hydrogen-argon gas mixture was investigated experimentally and by kinetic modelling in MATLAB. The reduction experiments were carried out in the temperature range of 1100-1200°C with a reduction time of 1-3 hours and 10-20 minutes of LPG flowing time. A shrinking core model (SCM) was employed for the kinetic modelling in order to determine the rate and extent of reduction. The highest experimental extent of reduction, 38%, occurred at a temperature of 1200°C with 3 hours reduction time and 20 minutes of LPG flowing time. The SCM gave a predicted extent of reduction of 82.1% due to assumptions made in the model. The deviation between the SCM and the experimental data was attributed to porosity, thermodynamic properties and minute thermal fluctuations within the sample. In general, the reduction rates increased with increasing reduction temperature and LPG flowing time.
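For reference, the two most common shrinking core model rate forms for a spherical particle are easy to code. The sketch below is our illustration, not the study's MATLAB implementation; the lumped rate constant k is an assumed parameter that would be fitted to the weight-loss data.

```python
import numpy as np

def scm_extent(t, k, regime="reaction"):
    """Shrinking core model conversion X(t) for a spherical particle.

    One common SCM form (assumed here): surface-reaction control gives
    1 - (1 - X)**(1/3) = k*t, while product-layer diffusion control gives
    1 - 3*(1 - X)**(2/3) + 2*(1 - X) = k*t.  The constant k lumps the rate
    constant, particle radius and reducing-gas concentration.
    """
    t = np.asarray(t, dtype=float)
    if regime == "reaction":
        return 1.0 - (1.0 - np.clip(k * t, 0.0, 1.0)) ** 3
    # Diffusion control: invert g(X) = k*t numerically on a grid of X.
    X = np.linspace(0.0, 1.0, 10001)
    g = 1.0 - 3.0 * (1.0 - X) ** (2.0 / 3.0) + 2.0 * (1.0 - X)
    return np.interp(np.clip(k * t, 0.0, 1.0), g, X)
```

Curves generated this way can then be compared with measured conversions, such as the 38% observed at 1200°C, to quantify the model-experiment deviation discussed above.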
Data Visualization and Geospatial Tools | Geospatial Data Science | NREL
renewable resources are available in specific areas. General Analysis: the Renewable Energy Atlas shows the geographic distribution of wind, solar, geothermal, hydropower, and biomass resources in the United States. The Solar and Wind Energy Resource Assessment (SWERA) model gives access to international renewable energy resource
Data visualization and analysis tools for the MAVEN mission
NASA Astrophysics Data System (ADS)
Harter, B.; De Wolfe, A. W.; Putnam, B.; Brain, D.; Chaffin, M.
2016-12-01
The Mars Atmospheric and Volatile Evolution (MAVEN) mission has been collecting data at Mars since September 2014. We have developed new software tools for exploring and analyzing the science data. Our open-source Python toolkit for working with data from MAVEN and other missions is based on the widely-used "tplot" IDL toolkit. We have replicated all of the basic tplot functionality in Python, and use the bokeh and matplotlib libraries to generate interactive line plots and spectrograms, providing additional functionality beyond the capabilities of IDL graphics. These Python tools are generalized to work with missions beyond MAVEN, and our software is available on Github. We have also been exploring 3D graphics as a way to better visualize the MAVEN science data and models. We have constructed a 3D visualization of MAVEN's orbit using the CesiumJS library, which not only allows viewing of MAVEN's orientation and position, but also allows the display of selected science data sets and their variation over time.
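A typical "tplot-style" product is a stack of panels sharing one time axis, for example a scalar line plot above a time-energy spectrogram. The matplotlib sketch below reproduces that layout generically; the function, variable names and axis labels are placeholders rather than MAVEN data product names, and it is not the toolkit's actual API.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_line_and_spectrogram(time, series, spec_time, energies, spec):
    """Two stacked panels with a shared time axis: a line plot of a scalar
    quantity and a time-energy spectrogram.  `spec` has shape
    (len(spec_time), len(energies))."""
    fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True, figsize=(8, 5))
    ax1.plot(time, series, lw=0.8)
    ax1.set_ylabel("density")
    mesh = ax2.pcolormesh(spec_time, energies, spec.T, shading="auto")
    ax2.set_yscale("log")
    ax2.set_ylabel("energy (eV)")
    ax2.set_xlabel("time")
    fig.colorbar(mesh, ax=ax2, label="counts")
    plt.show()
```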
AKSZ construction from reduction data
NASA Astrophysics Data System (ADS)
Bonechi, Francesco; Cabrera, Alejandro; Zabzine, Maxim
2012-07-01
We discuss a general procedure to encode the reduction of the target space geometry into AKSZ sigma models. This is done by considering the AKSZ construction with target the BFV model for constrained graded symplectic manifolds. We investigate the relation between this sigma model and the one with the reduced structure. We also discuss several examples in dimension two and three when the symmetries come from Lie group actions and systematically recover models already proposed in the literature.
Predictive tool for estimating the potential effect of water fluoridation on dental caries.
Foster, G R K; Downer, M C; Lunt, M; Aggarwal, V; Tickle, M
2009-03-01
To provide a tool for public health planners to estimate the potential improvement in dental caries in children that might be expected in a region if its water supply were to be fluoridated. Recent BASCD (British Association for the Study of Community Dentistry) dental epidemiological data for caries in 5- and 11-year-old children in English primary care trusts in fluoridated and non-fluoridated areas were analysed to estimate the absolute and relative improvement in dmft/DMFT and caries-free measures observed in England. Where data were sufficient for testing significance, this analysis included the effect of different levels of deprivation. A table of observed improvements was produced, together with an example of how that table can be used as a tool for estimating the expected improvement in caries in any specific region of England. Observed absolute improvements and 95% confidence intervals were: for 5-year-olds, a reduction in mean dmft of 0.56 (0.38, 0.74) for IMD 12, 0.73 (0.60, 0.85) for IMD 20, and 0.94 (0.76, 1.12) for IMD 30, with 12% (9%, 14%) more children free of caries; for 11-year-olds, a reduction in mean DMFT of 0.12 (0.04, 0.20) for IMD 12, 0.19 (0.13, 0.26) for IMD 20, and 0.29 (0.18, 0.40) for IMD 30, with 8% (5%, 11%) more children free from caries. The BASCD data taken together with a deprivation measure are capable of yielding an age-specific, 'intention to treat' model of water fluoridation that can be used to estimate the potential effect on caries levels of a notional new fluoridation scheme in an English region.
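The table of observed improvements can be turned into a simple estimator by interpolating between the published IMD points. The sketch below does this for the 5-year-old dmft figures quoted above; it is a simplified reading of the tool (linear interpolation, 5-year-olds only), not the published look-up procedure itself.

```python
import numpy as np

# Observed absolute reductions in mean dmft for 5-year-olds, taken from the
# abstract, indexed by Index of Multiple Deprivation (IMD) score.
IMD_POINTS = np.array([12.0, 20.0, 30.0])
DMFT_REDUCTION_5YR = np.array([0.56, 0.73, 0.94])

def expected_dmft_reduction(imd_score):
    """Linearly interpolate the expected reduction in mean dmft for
    5-year-olds at a region's average IMD score (values outside the
    published 12-30 range are clamped to the nearest end point)."""
    return float(np.interp(imd_score, IMD_POINTS, DMFT_REDUCTION_5YR))

# Example: a region with average IMD 25 would expect roughly 0.84 fewer
# decayed, missing or filled teeth per child under fluoridation.
```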
Dimensional Reduction for the General Markov Model on Phylogenetic Trees.
Sumner, Jeremy G
2017-03-01
We present a method of dimensional reduction for the general Markov model of sequence evolution on a phylogenetic tree. We show that taking certain linear combinations of the associated random variables (site pattern counts) reduces the dimensionality of the model from exponential in the number of extant taxa, to quadratic in the number of taxa, while retaining the ability to statistically identify phylogenetic divergence events. A key feature is the identification of an invariant subspace which depends only bilinearly on the model parameters, in contrast to the usual multi-linear dependence in the full space. We discuss potential applications including the computation of split (edge) weights on phylogenetic trees from observed sequence data.
GRAFLAB 2.3 for UNIX - A MATLAB database, plotting, and analysis tool: User's guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunn, W.N.
1998-03-01
This report is a user's manual for GRAFLAB, which is a new database, analysis, and plotting package that has been written entirely in the MATLAB programming language. GRAFLAB is currently used for data reduction, analysis, and archival. GRAFLAB was written to replace GRAFAID, which is a FORTRAN database, analysis, and plotting package that runs on VAX/VMS.
Center for Corporate Climate Leadership Goal Setting
EPA provides tools and recognition for companies setting aggressive GHG reduction goals, which can galvanize reduction efforts at a company and often leads to the identification of many additional reduction opportunities.
Spatially dynamic forest management to sustain biodiversity and economic returns.
Mönkkönen, Mikko; Juutinen, Artti; Mazziotta, Adriano; Miettinen, Kaisa; Podkopaev, Dmitry; Reunanen, Pasi; Salminen, Hannu; Tikkanen, Olli-Pekka
2014-02-15
Production of marketed commodities and protection of biodiversity in natural systems often conflict, and thus the continuously expanding human needs for more goods and benefits from global ecosystems urgently call for strategies to resolve this conflict. In this paper, we addressed the potential of a forest landscape to simultaneously produce habitats for species and economic returns, and how the conflict between habitat availability and timber production varies among taxa. Secondly, we aimed at revealing an optimal combination of management regimes that maximizes habitat availability for given levels of economic returns. We used multi-objective optimization tools to analyze data from a boreal forest landscape consisting of about 30,000 forest stands simulated 50 years into the future. We included seven alternative management regimes, spanning from the recommended intensive forest management regime to complete set-aside of stands (protection), and ten different taxa representing a wide variety of habitat associations and social values. Our results demonstrate that it is possible to achieve large improvements in habitat availability with little loss in economic returns. In general, providing dead-wood-associated species with more habitat tended to be more expensive than providing the requirements of other species. No management regime alone maximized habitat availability for the species, and systematic use of any single management regime resulted in considerable reductions in economic returns. Compared with an optimal combination of management regimes, a consistent application of the recommended management regime would result in a 5% reduction in economic returns and up to a 270% reduction in habitat availability. Thus, for all taxa a combination of management regimes was required to achieve the optimum. Refraining from silvicultural thinnings on a proportion of stands should be considered as a cost-effective management option in commercial forests to reconcile the conflict between economic returns and the habitat required by species associated with dead wood. In general, a viable strategy to maintain biodiversity in production landscapes would be to diversify management regimes. Our results emphasize the importance of careful landscape-level forest management planning because optimal combinations of management regimes were taxon-specific. For cost-efficiency, the results call for balanced and correctly targeted strategies among habitat types. Copyright © 2013 Elsevier Ltd. All rights reserved.
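The kind of trade-off analysis described above can be illustrated with a toy linear program: choose the share of the landscape assigned to each management regime so that habitat availability is maximised subject to a minimum level of economic return. This is a deliberately simplified stand-in (per-regime values are hypothetical, and the real study optimised stand by stand across ten taxa), shown only to make the optimisation setting concrete.

```python
import numpy as np
from scipy.optimize import linprog

def optimal_regime_mix(habitat, revenue, min_revenue):
    """Maximise landscape habitat availability subject to a revenue floor.

    habitat, revenue : per-regime habitat and economic-return values
    (hypothetical units); the decision variables are the non-negative
    shares of the landscape under each regime, which must sum to 1.
    """
    n = len(habitat)
    res = linprog(
        c=-np.asarray(habitat, dtype=float),        # maximise habitat
        A_ub=[-np.asarray(revenue, dtype=float)],   # revenue >= min_revenue
        b_ub=[-float(min_revenue)],
        A_eq=[np.ones(n)], b_eq=[1.0],              # shares sum to one
        bounds=[(0.0, 1.0)] * n,
        method="highs",
    )
    return res.x  # share of the landscape under each regime
```

Sweeping `min_revenue` over a range of values traces out the production-possibility frontier between economic returns and habitat availability.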
Tool-specific performance of vibration-reducing gloves for attenuating fingers-transmitted vibration
Welcome, Daniel E.; Dong, Ren G.; Xu, Xueyan S.; Warren, Christopher; McDowell, Thomas W.
2016-01-01
BACKGROUND Fingers-transmitted vibration can cause vibration-induced white finger. The effectiveness of vibration-reducing (VR) gloves for reducing hand transmitted vibration to the fingers has not been sufficiently examined. OBJECTIVE The objective of this study is to examine tool-specific performance of VR gloves for reducing finger-transmitted vibrations in three orthogonal directions (3D) from powered hand tools. METHODS A transfer function method was used to estimate the tool-specific effectiveness of four typical VR gloves. The transfer functions of the VR glove fingers in three directions were either measured in this study or during a previous study using a 3D laser vibrometer. More than seventy vibration spectra of various tools or machines were used in the estimations. RESULTS When assessed based on frequency-weighted acceleration, the gloves provided little vibration reduction. In some cases, the gloves amplified the vibration by more than 10%, especially the neoprene glove. However, the neoprene glove did the best when the assessment was based on unweighted acceleration. The neoprene glove was able to reduce the vibration by 10% or more of the unweighted vibration for 27 out of the 79 tools. If the dominant vibration of a tool handle or workpiece was in the shear direction relative to the fingers, as observed in the operation of needle scalers, hammer chisels, and bucking bars, the gloves did not reduce the vibration but increased it. CONCLUSIONS This study confirmed that the effectiveness for reducing vibration varied with the gloves and the vibration reduction of each glove depended on tool, vibration direction to the fingers, and finger location. VR gloves, including certified anti-vibration gloves do not provide much vibration reduction when judged based on frequency-weighted acceleration. However, some of the VR gloves can provide more than 10% reduction of the unweighted vibration for some tools or workpieces. Tools and gloves can be matched for better effectiveness for protecting the fingers. PMID:27867313
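The transfer-function method itself reduces to multiplying each tool's vibration spectrum by the measured glove transfer function and re-computing the overall acceleration, with or without frequency weighting. The sketch below is a simplified single-axis version (the study worked in three directions and at different finger locations); the function names and band structure are assumptions.

```python
import numpy as np

def overall_acceleration(band_accel, glove_tf=None, weights=None):
    """Overall r.m.s. acceleration from a one-third-octave band spectrum.

    band_accel : r.m.s. acceleration of the tool handle in each band (m/s^2)
    glove_tf   : magnitude of the glove-finger transfer function per band
                 (transmitted/input ratio); None means bare hand
    weights    : optional per-band frequency weighting; None gives the
                 unweighted value
    """
    a = np.asarray(band_accel, dtype=float)
    if glove_tf is not None:
        a = a * np.asarray(glove_tf, dtype=float)
    if weights is not None:
        a = a * np.asarray(weights, dtype=float)
    return float(np.sqrt(np.sum(a ** 2)))

def vibration_reduction(band_accel, glove_tf, weights=None):
    """Fraction of the tool vibration removed by the glove; negative values
    mean the glove amplifies the vibration."""
    bare = overall_acceleration(band_accel, None, weights)
    gloved = overall_acceleration(band_accel, glove_tf, weights)
    return 1.0 - gloved / bare
```

Running `vibration_reduction` with and without a weighting vector illustrates the contrast reported above: a glove can look ineffective on a frequency-weighted basis yet still remove more than 10% of the unweighted vibration for some tools.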
A generic multi-flex-body dynamics, controls simulation tool for space station
NASA Technical Reports Server (NTRS)
London, Ken W.; Lee, John F.; Singh, Ramen P.; Schubele, Buddy
1991-01-01
An order (n) multiflex body Space Station simulation tool is introduced. The flex multibody modeling is generic enough to model all phases of Space Station from build up through to Assembly Complete configuration and beyond. Multibody subsystems such as the Mobile Servicing System (MSS) undergoing a prescribed translation and rotation are also allowed. The software includes aerodynamic, gravity gradient, and magnetic field models. User defined controllers can be discrete or continuous. Extensive preprocessing of 'body by body' NASTRAN flex data is built in. A significant aspect, too, is the integrated controls design capability which includes model reduction and analytic linearization.
Network pharmacology: reigning in drug attrition?
Alian, Osama M; Shah, Minjel; Mohammad, Momin; Mohammad, Ramzi M
2013-06-01
In the process of drug development, there has been an exceptionally high attrition rate in oncological compounds entering late phases of testing. This has seen a concurrent reduction in approved NCEs (new chemical entities) reaching patients. Network pharmacology has become a valuable tool in understanding the fine details of drug-target interactions as well as painting a more practical picture of phenotype relationships to patients and drugs. By utilizing all the tools achieved through molecular medicine and combining it with high throughput data analysis, interactions and mechanisms can be elucidated and treatments reasonably tailored to patients expressing specific phenotypes (or genotypes) of disease, essentially reigning in the phenomenon of drug attrition.
Analysis of real-time vibration data
Safak, E.
2005-01-01
In recent years, a few structures have been instrumented to provide continuous vibration data in real time, recording not only large-amplitude motions generated by extreme loads, but also small-amplitude motions generated by ambient loads. The main objective in continuous recording is to track any changes in structural characteristics, and to detect damage after an extreme event, such as an earthquake or explosion. The Fourier-based spectral analysis methods have been the primary tool to analyze vibration data from structures. In general, such methods do not work well for real-time data, because real-time data are mainly composed of ambient vibrations with very low amplitudes and signal-to-noise ratios. The long duration, linearity, and the stationarity of ambient data, however, allow us to utilize statistical signal processing tools, which can compensate for the adverse effects of low amplitudes and high noise. The analysis of real-time data requires tools and techniques that can be applied in real-time; i.e., data are processed and analyzed while being acquired. This paper presents some of the basic tools and techniques for processing and analyzing real-time vibration data. The topics discussed include utilization of running time windows, tracking mean and mean-square values, filtering, system identification, and damage detection.
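As a small example of the statistical tracking tools mentioned above, the sketch below maintains running mean and mean-square estimates of a streaming signal with an exponentially forgetting window, one of the simplest ways to process ambient vibration data in real time; the window parameterisation is an assumption, not the author's implementation.

```python
import numpy as np

def running_stats(x, halflife_samples):
    """Track the running mean and mean-square of a streaming signal with an
    exponentially forgetting window, so that slow changes in the structural
    response stand out against low-amplitude, noisy ambient vibration."""
    x = np.asarray(x, dtype=float)
    lam = 0.5 ** (1.0 / halflife_samples)        # forgetting factor per sample
    mean = np.empty_like(x)
    meansq = np.empty_like(x)
    m = ms = 0.0
    for i, xi in enumerate(x):
        m = lam * m + (1.0 - lam) * xi           # running mean
        ms = lam * ms + (1.0 - lam) * xi * xi    # running mean-square
        mean[i], meansq[i] = m, ms
    return mean, meansq
```

In practice such estimates would be updated sample by sample as the data are acquired; computing them over the full array here simply makes the example easy to test offline.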
ERIC Educational Resources Information Center
Crone, Regina M.; Mehta, Smita Shukla
2016-01-01
Setting variables such as location of parent training, programming with common stimuli, generalization of discrete responses to non-trained settings, and subsequent reduction in child problem behavior may influence the effectiveness of interventions. The purpose of this study was to evaluate the effectiveness of home-versus clinic-based training…
AMIDE: a free software tool for multimodality medical image analysis.
Loening, Andreas Markus; Gambhir, Sanjiv Sam
2003-07-01
Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's abilities to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.
Eppig, Janan T; Smith, Cynthia L; Blake, Judith A; Ringwald, Martin; Kadin, James A; Richardson, Joel E; Bult, Carol J
2017-01-01
The Mouse Genome Informatics (MGI), resource ( www.informatics.jax.org ) has existed for over 25 years, and over this time its data content, informatics infrastructure, and user interfaces and tools have undergone dramatic changes (Eppig et al., Mamm Genome 26:272-284, 2015). Change has been driven by scientific methodological advances, rapid improvements in computational software, growth in computer hardware capacity, and the ongoing collaborative nature of the mouse genomics community in building resources and sharing data. Here we present an overview of the current data content of MGI, describe its general organization, and provide examples using simple and complex searches, and tools for mining and retrieving sets of data.