Science.gov

Sample records for analysis tool based

  1. JAVA based LCD Reconstruction and Analysis Tools

    SciTech Connect

    Bower, G.

    2004-10-11

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  2. GIS-based hydrogeochemical analysis tools (QUIMET)

    NASA Astrophysics Data System (ADS)

    Velasco, V.; Tubau, I.; Vázquez-Suñè, E.; Gogu, R.; Gaitanaru, D.; Alcaraz, M.; Serrano-Juan, A.; Fernàndez-Garcia, D.; Garrido, T.; Fraile, J.; Sanchez-Vila, X.

    2014-09-01

    A software platform (QUIMET) was developed to improve the sorting, analysis, calculations, visualizations, and interpretations of hydrogeochemical data in a GIS environment. QUIMET is composed of a geospatial database plus a set of tools specially designed for graphical and statistical analysis of hydrogeochemical data. The geospatial database has been designed to include organic and inorganic chemical records, as well as relevant physical parameters (temperature, Eh, electrical conductivity). The instruments for analysis cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data. They include, among others, chemical time-series analysis, ionic balance calculations, correlation of chemical parameters, and calculation of various common hydrogeochemical diagrams (Salinity, Schoeller-Berkaloff, Piper, and Stiff). The GIS platform allows the generation of maps of the spatial distribution of parameters and diagrams. Moreover, it allows a complete statistical analysis of the data, including descriptive statistics and univariate and bivariate analysis, the latter including the generation of correlation matrices and graphics. Finally, QUIMET offers interoperability with other external platforms. The platform is illustrated with a geochemical data set from the city of Badalona, located on the Mediterranean coast in NE Spain.
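
    To make the ionic balance calculation mentioned above concrete, the sketch below computes the standard charge-balance error from major-ion concentrations. It is a generic illustration, not QUIMET code; the ion list, the nominal equivalent weights and the mg/L input units are assumptions of this example.

      # Charge-balance error (CBE) from major-ion concentrations in mg/L.
      # Equivalent weights (g/eq) are nominal values assumed for this sketch.
      EQ_WEIGHT = {"Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10,
                   "HCO3": 61.02, "Cl": 35.45, "SO4": 48.03, "NO3": 62.00}
      CATIONS = ("Ca", "Mg", "Na", "K")
      ANIONS = ("HCO3", "Cl", "SO4", "NO3")

      def charge_balance_error(sample_mg_per_l):
          """Return the charge-balance error in percent for one sample."""
          meq = {ion: conc / EQ_WEIGHT[ion] for ion, conc in sample_mg_per_l.items()}
          cations = sum(meq[i] for i in CATIONS)
          anions = sum(meq[i] for i in ANIONS)
          return 100.0 * (cations - anions) / (cations + anions)

      sample = {"Ca": 80.0, "Mg": 24.0, "Na": 46.0, "K": 4.0,
                "HCO3": 244.0, "Cl": 71.0, "SO4": 96.0, "NO3": 12.0}
      print(f"CBE = {charge_balance_error(sample):.1f} %")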

  3. Klonos: A Similarity Analysis Based Tool for Software Porting

    SciTech Connect

    Hernandez, Oscar; Ding, Wei

    2014-07-30

    Klonos is a compiler-based tool that helps users port scientific applications. It is based on similarity analysis carried out with the help of the OpenUH compiler (a branch of the Open64 compiler). The tool combines syntactic and cost-model-provided metrics to cluster subroutines that are similar and can therefore be ported in a similar way. The generated porting plan allows programmers and compilers to reuse porting experience as much as possible during the porting process.

  4. Usage-Based Evolution of Visual Analysis Tools

    SciTech Connect

    Hetzler, Elizabeth G.; Rose, Stuart J.; McQuerry, Dennis L.; Medvick, Patricia A.

    2005-06-12

    Visual analysis tools have been developed to help people in many different domains more effectively explore, understand, and make decisions from their information. Challenges in making a successful tool include suitability within a user's work processes, and tradeoffs between analytic power and tool complexity, both of which impact ease of learning. This paper describes experience working with users to help them apply visual analysis tools in several different domains, and examples of how the tools evolved significantly to better match users' goals and processes.

  5. EEG analysis using wavelet-based information tools.

    PubMed

    Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

    2006-06-15

    Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizure, triggers a self-organized brain state characterized by both order and maximal complexity.
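
    The relative wavelet energies and wavelet entropy referred to above can be sketched in a few lines. The example below uses the PyWavelets package on a synthetic signal; the choice of mother wavelet ('db4') and decomposition level are arbitrary here and do not reproduce the authors' exact analysis.

      import numpy as np
      import pywt

      def wavelet_entropy(signal, wavelet="db4", level=5):
          """Relative wavelet energies and total wavelet entropy of a 1-D signal."""
          coeffs = pywt.wavedec(signal, wavelet, level=level)   # [cA_n, cD_n, ..., cD_1]
          energies = np.array([np.sum(c ** 2) for c in coeffs])
          p = energies / energies.sum()                          # relative wavelet energies
          entropy = -np.sum(p * np.log(p))                       # Shannon wavelet entropy
          return p, entropy

      t = np.linspace(0, 2, 512)
      eeg_like = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
      p, S = wavelet_entropy(eeg_like)
      print("relative energies:", np.round(p, 3), "entropy:", round(S, 3))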

  6. GOMA: functional enrichment analysis tool based on GO modules

    PubMed Central

    Huang, Qiang; Wu, Ling-Yun; Wang, Yong; Zhang, Xiang-Sun

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results. PMID:23237213
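
    GOMA ranks whole GO modules through an optimization model, which is not reproduced here; the sketch below only shows the standard single-term hypergeometric enrichment test that tools of this kind build on, with invented gene counts.

      from scipy.stats import hypergeom

      def enrichment_p(study_hits, study_size, term_hits, population_size):
          """P-value that a gene list of size `study_size` contains `study_hits`
          or more genes annotated to a GO term with `term_hits` genes overall."""
          # Survival function at k-1 gives P(X >= k) for the hypergeometric model.
          return hypergeom.sf(study_hits - 1, population_size, term_hits, study_size)

      # Hypothetical numbers: 40 of 200 submitted genes carry a term that
      # annotates 500 of 20,000 genes in the genome.
      print(enrichment_p(40, 200, 500, 20000))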

  7. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    SciTech Connect

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  8. Principles and tools for collaborative entity-based intelligence analysis.

    PubMed

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested in whether software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it was situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  9. Image edge detection based tool condition monitoring with morphological component analysis.

    PubMed

    Yu, Xiaolong; Lin, Xin; Dai, Yiquan; Zhu, Kunpeng

    2017-07-01

    The measurement and monitoring of tool condition are key to product precision in automated manufacturing. To meet this need, this study proposes a novel tool wear monitoring approach based on edge detection in the monitored image. Image edge detection is a fundamental technique for obtaining image features. The approach extracts the tool edge with morphological component analysis. Through the decomposition of the original tool wear image, the approach reduces the influence of texture and noise on edge measurement. Based on sparse representation of the target image and edge detection, the approach can accurately extract the tool wear edge as a continuous and complete contour and is convenient for characterizing tool conditions. Compared with established algorithms in the literature, this approach improves the integrity and connectivity of edges, and the results show that it achieves better geometric accuracy and a lower error rate in the estimation of tool conditions. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
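
    The full method separates texture from structure with morphological component analysis before measuring the edge; as a simplified stand-in, the sketch below extracts an edge map from a plain morphological gradient using scipy.ndimage. The structuring-element size and threshold are arbitrary choices for this illustration.

      import numpy as np
      from scipy import ndimage

      def morphological_edge_map(image, size=(3, 3), threshold=0.2):
          """Edge map from the morphological gradient (dilation minus erosion)."""
          img = image.astype(float)
          img = (img - img.min()) / (img.max() - img.min() + 1e-9)   # normalize to [0, 1]
          gradient = ndimage.grey_dilation(img, size=size) - ndimage.grey_erosion(img, size=size)
          return gradient > threshold                                 # binary edge mask

      # Synthetic "tool" image: a bright rectangle on a noisy background.
      img = np.zeros((64, 64)) + 0.05 * np.random.rand(64, 64)
      img[20:45, 15:50] = 1.0
      edges = morphological_edge_map(img)
      print("edge pixels:", int(edges.sum()))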

  10. PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis

    ERIC Educational Resources Information Center

    Waycott, Jenny; Jones, Ann; Scanlon, Eileen

    2005-01-01

    This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part-time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

  11. ATAMM analysis tool

    NASA Technical Reports Server (NTRS)

    Jones, Robert; Stoughton, John; Mielke, Roland

    1991-01-01

    Diagnostics software for analyzing Algorithm to Architecture Mapping Model (ATAMM) based concurrent processing systems is presented. ATAMM is capable of modeling the execution of large grain algorithms on distributed data flow architectures. The tool graphically displays algorithm activities and processor activities for evaluation of the behavior and performance of an ATAMM based system. The tool's measurement capabilities indicate computing speed, throughput, concurrency, resource utilization, and overhead. Evaluations are performed on a simulated system using the software tool. The tool is used to estimate theoretical lower bound performance. Analysis results are shown to be comparable to the predictions.

  12. Applications of custom developed object based analysis tool: Precipitation in Pacific, Tropical cyclones precipitation, Hail areas

    NASA Astrophysics Data System (ADS)

    Skok, Gregor; Rakovec, Jože; Strajnar, Benedikt; Bacmeister, Julio; Tribbia, Joe

    2014-05-01

    In the last few years an object-based analysis software tool was developed at the University of Ljubljana in collaboration with the National Center for Atmospheric Research (NCAR). The tool was originally based on ideas of the Method for Object-Based Diagnostic Evaluation (MODE) developed by NCAR but has since evolved and changed considerably and is now available as a separate free software package. The software is called the Forward in Time object analysis tool (FiT tool). The software was used to analyze numerous datasets, mainly focusing on precipitation. A climatology of satellite and model precipitation in the low- and mid-latitude Pacific Ocean was produced by identifying and tracking individual precipitation systems and estimating their lifespan, movement and size. A global climatology of tropical cyclone precipitation was compiled using satellite data, and areas with hail in Slovenia were tracked and analyzed using radar data. The tool will be presented along with some results of applications.
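
    The FiT tool grows objects forward in time, which is beyond a short example; the sketch below shows only the first step common to object-based methods of this kind, thresholding a precipitation field and labelling contiguous objects, on synthetic data with an assumed 5 mm/h threshold.

      import numpy as np
      from scipy import ndimage

      def identify_objects(precip_field, threshold=5.0):
          """Label contiguous areas where precipitation exceeds a threshold (mm/h)
          and return each object's size in grid cells and its centroid."""
          mask = precip_field > threshold
          labels, n_objects = ndimage.label(mask)
          idx = list(range(1, n_objects + 1))
          sizes = ndimage.sum(mask, labels, index=idx)
          centroids = ndimage.center_of_mass(mask, labels, index=idx)
          return labels, list(zip(sizes, centroids))

      field = np.random.gamma(shape=0.5, scale=4.0, size=(100, 100))   # synthetic rain field
      labels, objects = identify_objects(field)
      print(f"{len(objects)} objects found")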

  13. An agent-based tool for infrastructure interdependency policy analysis.

    SciTech Connect

    North, M. J.

    2000-12-14

    Complex Adaptive Systems (CAS) can be applied to investigate complex infrastructure interdependencies such as those between the electric power and natural gas markets. These markets are undergoing fundamental transformations including major changes in electric generator fuel sources. Electric generators that use natural gas as a fuel source are rapidly gaining market share. These generators introduce direct interdependency between the electric power and natural gas markets. These interdependencies have been investigated using the emergent behavior of CAS model agents within the Spot Market Agent Research Tool Version 2.0 Plus Natural Gas (SMART II+).

  14. dada - a web-based 2D detector analysis tool

    NASA Astrophysics Data System (ADS)

    Osterhoff, Markus

    2017-06-01

    The data daemon, dada, is a server backend for unified access to 2D pixel detector image data stored with different detectors, file formats and saved with varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines from pixel binning over azimuthal integration to raster scan processing. Users typically interact with dada through a web frontend, but all parameters for an analysis are encoded into a Uniform Resource Identifier (URI), which can also be written by hand or by scripts for batch processing.
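
    The following sketch illustrates how analysis parameters might be encoded into a URI for batch scripting. The endpoint and all parameter names are hypothetical; they are not dada's actual interface.

      from urllib.parse import urlencode

      BASE = "https://dada.example.org/analysis"      # hypothetical endpoint

      def analysis_uri(detector, scan, task, **extra):
          """Encode analysis parameters into a URI, as one might for batch scripting.
          All parameter names here are illustrative, not dada's actual interface."""
          params = {"detector": detector, "scan": scan, "task": task, **extra}
          return f"{BASE}?{urlencode(params)}"

      for scan in range(101, 104):
          print(analysis_uri("pilatus", scan, "azimuthal_integration", bins=720))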

  15. Data Processing and Analysis Tools Based on Ground-Based Synthetic Aperture Radar Imagery

    NASA Astrophysics Data System (ADS)

    Crosetto, M.; Monserrat, O.; Luzi, G.; Devanthéry, N.; Cuevas-González, M.; Barra, A.

    2017-09-01

    The Ground-Based SAR (GBSAR) is a terrestrial remote sensing technique used to measure and monitor deformation. In this paper we describe two complementary approaches to derive deformation measurements using GBSAR data. The first approach is based on radar interferometry, while the second one exploits the GBSAR amplitude. In this paper we consider the so-called discontinuous GBSAR acquisition mode. The interferometric process is not always straightforward: it requires appropriate data processing and analysis tools. One of the main critical steps is phase unwrapping, which can critically affect the deformation measurements. In this paper we describe the procedure used at the CTTC to process and analyse discontinuous GBSAR data. In the second part of the paper we describe the approach based on GBSAR amplitude images and an image-matching method.
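
    As a minimal illustration of the interferometric step, the sketch below converts an already-unwrapped phase difference between two acquisitions into line-of-sight displacement via d = -lambda * delta_phi / (4 * pi). The Ku-band wavelength and the sign convention are assumptions for this example, and real processing additionally requires 2D phase unwrapping and atmospheric corrections, as noted in the text.

      import numpy as np

      WAVELENGTH_M = 0.0175          # assumed Ku-band wavelength (~17 GHz), metres

      def los_displacement(phase_t0, phase_t1):
          """Line-of-sight displacement (m) from two interferometric phase maps (rad).
          Assumes the phase difference has already been unwrapped."""
          dphi = phase_t1 - phase_t0
          return -WAVELENGTH_M * dphi / (4.0 * np.pi)

      phi0 = np.zeros((4, 4))
      phi1 = np.full((4, 4), 0.5)    # 0.5 rad phase change everywhere
      print(los_displacement(phi0, phi1) * 1000.0, "mm")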

  16. State Analysis Database Tool

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Bennett, Matthew

    2006-01-01

    The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.

  17. A compilation of Web-based research tools for miRNA analysis.

    PubMed

    Shukla, Vaibhav; Varghese, Vinay Koshy; Kabekkodu, Shama Prasada; Mallya, Sandeep; Satyamoorthy, Kapaettu

    2017-02-25

    Since the discovery of microRNAs (miRNAs), a class of noncoding RNAs that regulate gene expression posttranscriptionally in a sequence-specific manner, a number of tools useful for both basic and advanced applications have been released. This is because of the significance of miRNAs in many pathophysiological conditions, including cancer. Numerous bioinformatics tools developed for miRNA analysis have utility for detection, expression, function, target prediction and many other related features. This review provides a comprehensive assessment of web-based tools for miRNA analysis that do not require prior knowledge of any programming language.

  18. Web-Based Tools for Modelling and Analysis of Multivariate Data: California Ozone Pollution Activity

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Christou, Nicolas

    2011-01-01

    This article presents a hands-on web-based activity motivated by the relation between human health and ozone pollution in California. This case study is based on multivariate data collected monthly at 20 locations in California between 1980 and 2006. Several strategies and tools for data interrogation and exploratory data analysis, model fitting…

  1. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    NASA Astrophysics Data System (ADS)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  2. SDA-based diagnostic and analysis tools for Collider Run II

    SciTech Connect

    Bolshakov, T.B.; Lebrun, P.; Panacek, S.; Papadimitriou, V.; Slaughter, J.; Xiao, A.; /Fermilab

    2005-05-01

    Operating and improving the understanding of the Fermilab Accelerator Complex for the colliding beam experiments requires advanced software methods and tools. The Shot Data Analysis (SDA) has been developed to fulfill this need. Data from the Fermilab Accelerator Complex is stored in a relational database, and is served to programs and users via Web-based tools. Summary tables are systematically generated during and after a store. These tables (the Supertable, the Recomputed Emittances, the Recomputed Intensities and other tables) are discussed here.

  3. Tool and Task Analysis Guide for Vocational Welding (150 Tasks). Performance Based Vocational Education.

    ERIC Educational Resources Information Center

    John H. Hinds Area Vocational School, Elwood, IN.

    This book contains a task inventory, a task analysis of 150 tasks from that inventory, and a tool list for performance-based welding courses in the state of Indiana. The task inventory and tool list reflect 28 job titles found in Indiana. In the first part of the guide, tasks are listed by these domains: carbon-arc, electron beam, G.M.A.W., gas…

  4. CoryneBase: Corynebacterium Genomic Resources and Analysis Tools at Your Fingertips

    PubMed Central

    Tan, Mui Fern; Jakubovics, Nick S.; Wee, Wei Yee; Mutha, Naresh V. R.; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh

    2014-01-01

    Corynebacteria are used for a wide variety of industrial purposes but some species are associated with human diseases. With an increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide a better understanding of their biology, phylogeny, virulence and taxonomy, which may lead to the discovery of beneficial industrial strains or contribute to better management of diseases. To facilitate the ongoing research of corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for the analysis of genomes, aimed to provide: (1) annotated genome sequences of Corynebacterium where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), a Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers access to a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/. PMID:24466021

  5. geneSurv: An interactive web-based tool for survival analysis in genomics research.

    PubMed

    Korkmaz, Selcuk; Goksuluk, Dincer; Zararsiz, Gokmen; Karahan, Sevilay

    2017-09-05

    Survival analysis methods are often used in cancer studies. It has been shown that combining clinical data with genomics increases the predictive performance of survival analysis methods, but this leads to a high-dimensional data problem. Fortunately, new methods have been developed in the last decade to overcome this problem. However, there is a strong need for an easily accessible, user-friendly and interactive tool to perform survival analysis in the presence of genomics data. We developed an open-source and freely available web-based tool for survival analysis methods that can deal with high-dimensional data. This tool includes classical methods, such as Kaplan-Meier and Cox proportional hazards regression, and advanced methods, such as penalized Cox regression and Random Survival Forests. It also offers an optimal cutoff determination method based on maximizing several test statistics. The tool has a simple and interactive interface, and it can handle high-dimensional data through feature selection and ensemble methods. To dichotomize gene expressions, geneSurv can identify optimal cutoff points. Users can upload their microarray, RNA-Seq, ChIP-Seq, proteomics, metabolomics or clinical data as an n×p data matrix, where n refers to samples and p refers to genes. This tool is available free at www.biosoft.hacettepe.edu.tr/geneSurv. All source code is available at https://github.com/selcukorkmaz/geneSurv under the GPL-3 license. Copyright © 2017 Elsevier Ltd. All rights reserved.
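
    A small sketch of the classical building blocks listed above (Kaplan-Meier, Cox regression) and of choosing an expression cutoff by maximizing the log-rank statistic, using the lifelines package on invented data; this illustrates the general approach rather than geneSurv's own implementation.

      import numpy as np
      import pandas as pd
      from lifelines import KaplanMeierFitter, CoxPHFitter
      from lifelines.statistics import logrank_test

      rng = np.random.default_rng(0)
      df = pd.DataFrame({"time": rng.exponential(24, 200),
                         "event": rng.integers(0, 2, 200),
                         "gene_expr": rng.normal(0, 1, 200)})

      # Kaplan-Meier estimate of the survival function.
      km = KaplanMeierFitter().fit(df["time"], event_observed=df["event"])
      print("median survival:", km.median_survival_time_)

      # Cox proportional hazards regression on the expression value.
      cox = CoxPHFitter().fit(df, duration_col="time", event_col="event")
      print(cox.params_)

      # Pick the expression cutoff that maximizes the log-rank test statistic.
      def best_cutoff(df, col="gene_expr"):
          candidates = np.quantile(df[col], np.linspace(0.2, 0.8, 25))
          def stat(c):
              hi, lo = df[df[col] > c], df[df[col] <= c]
              return logrank_test(hi["time"], lo["time"],
                                  event_observed_A=hi["event"],
                                  event_observed_B=lo["event"]).test_statistic
          return max(candidates, key=stat)

      print("optimal cutoff:", round(best_cutoff(df), 3))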

  6. Financial analysis of community-based forest enterprises with the Green Value tool

    Treesearch

    S. Humphries; Tom Holmes

    2016-01-01

    The Green Value tool was developed in response to the need for simplified procedures that could be used in the field to conduct financial analysis for community-based forest enterprises (CFEs). Initially our efforts focused on a set of worksheets that could be used by both researchers and CFEs to monitor and analyze costs and income for one production period. The...

  7. HISTORICAL ANALYSIS, A VALUABLE TOOL IN COMMUNITY-BASED ENVIRONMENTAL PROTECTION

    EPA Science Inventory

    A historical analysis of the ecological consequences of development can be a valuable tool in community-based environmental protection. These studies can engage the public in environmental issues and lead to informed decision making. Historical studies provide an understanding of...

  8. Web-based tools for modelling and analysis of multivariate data: California ozone pollution activity

    NASA Astrophysics Data System (ADS)

    Dinov, Ivo D.; Christou, Nicolas

    2011-09-01

    This article presents a hands-on web-based activity motivated by the relation between human health and ozone pollution in California. This case study is based on multivariate data collected monthly at 20 locations in California between 1980 and 2006. Several strategies and tools for data interrogation and exploratory data analysis, model fitting and statistical inference on these data are presented. All components of this case study (data, tools, activity) are freely available online at: http://wiki.stat.ucla.edu/socr/index.php/SOCR_MotionCharts_CAOzoneData. Several types of exploratory (motion charts, box-and-whisker plots, spider charts) and quantitative (inference, regression, analysis of variance (ANOVA)) data analyses tools are demonstrated. Two specific human health related questions (temporal and geographic effects of ozone pollution) are discussed as motivational challenges.

  9. Web-based tools for modelling and analysis of multivariate data: California ozone pollution activity

    PubMed Central

    Dinov, Ivo D.; Christou, Nicolas

    2014-01-01

    This article presents a hands-on web-based activity motivated by the relation between human health and ozone pollution in California. This case study is based on multivariate data collected monthly at 20 locations in California between 1980 and 2006. Several strategies and tools for data interrogation and exploratory data analysis, model fitting and statistical inference on these data are presented. All components of this case study (data, tools, activity) are freely available online at: http://wiki.stat.ucla.edu/socr/index.php/SOCR_MotionCharts_CAOzoneData. Several types of exploratory (motion charts, box-and-whisker plots, spider charts) and quantitative (inference, regression, analysis of variance (ANOVA)) data analyses tools are demonstrated. Two specific human health related questions (temporal and geographic effects of ozone pollution) are discussed as motivational challenges. PMID:24465054

  10. Multiscale Multiphysics-Based Modeling and Analysis on the Tool Wear in Micro Drilling

    NASA Astrophysics Data System (ADS)

    Niu, Zhichao; Cheng, Kai

    2016-02-01

    In micro-cutting processes, process variables including cutting force, cutting temperature and drill-workpiece interfacing conditions (lubrication and interaction, etc.) significantly affect tool wear in a dynamic, interactive, in-process manner. The resultant tool life and cutting performance directly affect the component surface roughness, material removal rate and form accuracy control, etc. In this paper, a multiscale multiphysics-oriented approach to modeling and analysis is presented, focusing particularly on tooling performance in micro drilling processes. Process optimization is also taken into account by establishing the intrinsic relationship between process parameters and cutting performance. The modeling and analysis are evaluated and validated through well-designed machining trials, and further supported by metrology measurements and simulations. The paper is concluded with a further discussion on the potential and application of the approach for broad micro manufacturing purposes.

  11. Mobility analysis tool based on the fundamental principle of conservation of energy.

    SciTech Connect

    Spletzer, Barry Louis; Nho, Hyuchul C.; Salton, Jonathan Robert

    2007-08-01

    In the past decade, a great deal of effort has been focused in research and development of versatile robotic ground vehicles without understanding their performance in a particular operating environment. As the usage of robotic ground vehicles for intelligence applications increases, understanding mobility of the vehicles becomes critical to increase the probability of their successful operations. This paper describes a framework based on conservation of energy to predict the maximum mobility of robotic ground vehicles over general terrain. The basis of the prediction is the difference between traction capability and energy loss at the vehicle-terrain interface. The mission success of a robotic ground vehicle is primarily a function of mobility. Mobility of a vehicle is defined as the overall capability of a vehicle to move from place to place while retaining its ability to perform its primary mission. A mobility analysis tool based on the fundamental principle of conservation of energy is described in this document. The tool is a graphical user interface application. The mobility analysis tool has been developed at Sandia National Laboratories, Albuquerque, NM. The tool is at an initial stage of development. In the future, the tool will be expanded to include all vehicles and terrain types.

  12. SigMate: a Matlab-based automated tool for extracellular neuronal signal processing and analysis.

    PubMed

    Mahmud, Mufti; Bertoldo, Alessandra; Girardi, Stefano; Maschietto, Marta; Vassanelli, Stefano

    2012-05-30

    Rapid advances in neuronal probe technology for multisite recording of brain activity have posed a significant challenge to neuroscientists for processing and analyzing the recorded signals. To be able to infer meaningful conclusions quickly and accurately from large datasets, automated and sophisticated signal processing and analysis tools are required. This paper presents a Matlab-based novel tool, "SigMate", incorporating standard methods to analyze spikes and EEG signals, and in-house solutions for local field potentials (LFPs) analysis. Available modules at present are - 1. In-house developed algorithms for: data display (2D and 3D), file operations (file splitting, file concatenation, and file column rearranging), baseline correction, slow stimulus artifact removal, noise characterization and signal quality assessment, current source density (CSD) analysis, latency estimation from LFPs and CSDs, determination of cortical layer activation order using LFPs and CSDs, and single LFP clustering; 2. Existing modules: spike detection, sorting and spike train analysis, and EEG signal analysis. SigMate has the flexibility of analyzing multichannel signals as well as signals from multiple recording sources. The in-house developed tools for LFP analysis have been extensively tested with signals recorded using standard extracellular recording electrode, and planar and implantable multi transistor array (MTA) based neural probes. SigMate will be disseminated shortly to the neuroscience community under the open-source GNU-General Public License.
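
    Among the modules listed above, current source density (CSD) analysis is easy to sketch: the standard estimate is the negative second spatial derivative of the LFP across equally spaced channels. The conductivity and electrode spacing below are placeholder values, and this is not SigMate's own code.

      import numpy as np

      def csd_second_derivative(lfp, spacing_m=100e-6, sigma=0.3):
          """Standard CSD estimate: CSD = -sigma * d2(LFP)/dz2, approximated by
          a second difference across equally spaced channels (channels x samples)."""
          d2 = lfp[:-2, :] - 2.0 * lfp[1:-1, :] + lfp[2:, :]
          return -sigma * d2 / spacing_m ** 2      # A/m^3; loses first and last channel

      lfp = np.random.randn(16, 1000) * 1e-4       # 16 channels, toy data in volts
      print(csd_second_derivative(lfp).shape)      # -> (14, 1000)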

  13. Capturing district nursing through a knowledge-based electronic caseload analysis tool (eCAT).

    PubMed

    Kane, Kay

    2014-03-01

    The Electronic Caseload Analysis Tool (eCAT) is a knowledge-based software tool to assist the caseload analysis process. The tool provides a wide range of graphical reports, along with an integrated clinical advisor, to assist district nurses, team leaders, operational and strategic managers with caseload analysis by describing, comparing and benchmarking district nursing practice in the context of population need, staff resources, and service structure. District nurses and clinical lead nurses in Northern Ireland developed the tool, along with academic colleagues from the University of Ulster, working in partnership with a leading software company. The aim was to use the eCAT tool to identify the nursing need of local populations, along with the variances in district nursing practice, and match the workforce accordingly. This article reviews the literature, describes the eCAT solution and discusses the impact of eCAT on nursing practice, staff allocation, service delivery and workforce planning, using fictitious exemplars and a post-implementation evaluation from the trusts.

  14. OEXP Analysis Tools Workshop

    NASA Technical Reports Server (NTRS)

    Garrett, L. Bernard; Wright, Robert L.; Badi, Deborah; Findlay, John T.

    1988-01-01

    This publication summarizes the software needs and available analysis tools presented at the OEXP Analysis Tools Workshop held at the NASA Langley Research Center, Hampton, Virginia on June 21 to 22, 1988. The objective of the workshop was to identify available spacecraft system (and subsystem) analysis and engineering design tools, and mission planning and analysis software that could be used for various NASA Office of Exploration (code Z) studies, specifically lunar and Mars missions.

  15. Clustergrammer, a web-based heatmap visualization and analysis tool for high-dimensional biological data.

    PubMed

    Fernandez, Nicolas F; Gundersen, Gregory W; Rahman, Adeeb; Grimes, Mark L; Rikova, Klarisa; Hornbeck, Peter; Ma'ayan, Avi

    2017-10-10

    Most tools developed to visualize hierarchically clustered heatmaps generate static images. Clustergrammer is a web-based visualization tool with interactive features such as zooming, panning, filtering, reordering, sharing, performing enrichment analysis, and providing dynamic gene annotations. Clustergrammer can be used to generate shareable interactive visualizations by uploading a data table to a website, or by embedding Clustergrammer in Jupyter Notebooks. The Clustergrammer core libraries can also be used as a toolkit by developers to generate visualizations within their own applications. Clustergrammer is demonstrated using gene expression data from the Cancer Cell Line Encyclopedia (CCLE), original post-translational modification data collected from lung cancer cell lines by a mass spectrometry approach, and original cytometry by time of flight (CyTOF) single-cell proteomics data from blood. Clustergrammer enables the production of interactive web-based visualizations for the analysis of diverse biological data.

  16. A System Analysis Tool

    SciTech Connect

    Campbell, Philip L.; Espinoza, Juan

    2000-06-01

    In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question for the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was developed based on intra-procedure control and data flow. It has been expanded commercially to inter-procedure flow. However, we extend slicing to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.
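
    A toy version of the question the tool answers ("if this variable changes, what else changes, and by what path") can be phrased as forward reachability over a variable dependence graph, as sketched below; real program slicing derives such a graph from control and data flow rather than by hand.

      from collections import deque

      # Toy dependence graph: edge a -> b means "b is computed from a".
      DEPENDS = {
          "sensor_raw": ["calibrated"],
          "calibrated": ["average", "alarm_flag"],
          "average": ["report_total"],
          "alarm_flag": [],
          "report_total": [],
      }

      def forward_slice(graph, changed):
          """Return every variable affected by `changed`, with one path to each."""
          paths, queue = {changed: [changed]}, deque([changed])
          while queue:
              node = queue.popleft()
              for nxt in graph.get(node, []):
                  if nxt not in paths:
                      paths[nxt] = paths[node] + [nxt]
                      queue.append(nxt)
          return paths

      for var, path in forward_slice(DEPENDS, "sensor_raw").items():
          print(var, "<-", " -> ".join(path))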

  17. Reproducible Analysis of Sequencing-Based RNA Structure Probing Data with User-Friendly Tools.

    PubMed

    Kielpinski, Lukasz Jan; Sidiropoulos, Nikolaos; Vinther, Jeppe

    2015-01-01

    RNA structure-probing data can improve the prediction of RNA secondary and tertiary structure and allow structural changes to be identified and investigated. In recent years, massive parallel sequencing has dramatically improved the throughput of RNA structure probing experiments, but at the same time also made analysis of the data challenging for scientists without formal training in computational biology. Here, we discuss different strategies for data analysis of massive parallel sequencing-based structure-probing data. To facilitate reproducible and standardized analysis of this type of data, we have made a collection of tools, which allow raw sequencing reads to be converted to normalized probing values using different published strategies. In addition, we also provide tools for visualization of the probing data in the UCSC Genome Browser and for converting RNA coordinates to genomic coordinates and vice versa. The collection is implemented as functions in the R statistical environment and as tools in the Galaxy platform, making them easily accessible for the scientific community. We demonstrate the usefulness of the collection by applying it to the analysis of sequencing-based hydroxyl radical probing data and comparing different normalization strategies. © 2015 Elsevier Inc. All rights reserved.

  18. A developmental screening tool for toddlers with multiple domains based on Rasch analysis.

    PubMed

    Hwang, Ai-Wen; Chou, Yeh-Tai; Hsieh, Ching-Lin; Hsieh, Wu-Shiun; Liao, Hua-Fang; Wong, Alice May-Kuen

    2015-01-01

    Using multidomain developmental screening tools is a feasible method for pediatric health care professionals to identify children at risk of developmental problems in multiple domains simultaneously. The purpose of this study was to develop a Rasch-based tool for Multidimensional Screening in Child Development (MuSiC) for children aged 0-3 years. MuSiC was developed by constructing an item bank based on three commonly used screening tools and validating it against developmental status (at risk of delay or not) in five developmental domains. Parents of a convenience sample of 632 children (aged 3-35.5 months) with and without developmental delays responded to items from the three screening tools funded by health authorities in Taiwan. The item bank was determined by item fit in Rasch analysis for each of the five developmental domains (cognitive skills, language skills, gross motor skills, fine motor skills, and socioadaptive skills). Children's performance scores in logits derived from Rasch analysis were validated against developmental status for each domain using the area under receiver operating characteristic curves. MuSiC, a 75-item developmental screening tool for five domains, was derived. The diagnostic validity of all five domains was acceptable for all stages of development, except for the infant stage (≤11 months and 15 days). MuSiC can be applied in well-child care visits as a universal screening tool for children aged 1-3 years on multiple domains. Items with sound validity for infants need to be further developed. Copyright © 2014. Published by Elsevier B.V.

  1. A new damage diagnosis approach for NC machine tools based on hybrid Stationary subspace analysis

    NASA Astrophysics Data System (ADS)

    Gao, Chen; Zhou, Yuqing; Ren, Yan

    2017-05-01

    This paper focuses on damage diagnosis for NC machine tools and puts forward a damage diagnosis method based on hybrid stationary subspace analysis (SSA) to improve the accuracy and visibility of damage identification. First, the observed single-sensor signal is reconstructed into multi-dimensional signals by the phase space reconstruction technique, which serve as the inputs of SSA. The SSA method is introduced to separate the reconstructed data into stationary and non-stationary components without the need for independence assumptions or prior information about the original signals. Subsequently, the selected non-stationary components are analysed to train an LS-SVM (Least Squares Support Vector Machine) classifier model, in which several statistical parameters in the time and frequency domains are extracted as the LS-SVM samples. An empirical analysis on NC milling machine tools is developed, and the results show the high accuracy of the proposed approach.
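
    The phase space reconstruction step mentioned above is a standard time-delay embedding; a minimal sketch follows, with arbitrary embedding dimension and delay, and without the SSA and LS-SVM stages of the paper.

      import numpy as np

      def delay_embed(x, dim=5, tau=2):
          """Phase-space reconstruction of a 1-D signal: each row is
          [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
          n = len(x) - (dim - 1) * tau
          if n <= 0:
              raise ValueError("signal too short for this (dim, tau)")
          return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

      signal = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.1 * np.random.randn(1000)
      X = delay_embed(signal, dim=5, tau=3)
      print(X.shape)        # -> (988, 5)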

  2. WholePathwayScope: a comprehensive pathway-based analysis tool for high-throughput data

    PubMed Central

    Yi, Ming; Horton, Jay D; Cohen, Jonathan C; Hobbs, Helen H; Stephens, Robert M

    2006-01-01

    Background: Analysis of High Throughput (HTP) data such as microarray and proteomics data has provided a powerful methodology to study patterns of gene regulation at genome scale. A major unresolved problem in the post-genomic era is to assemble the large amounts of data generated into a meaningful biological context. We have developed a comprehensive software tool, WholePathwayScope (WPS), for deriving biological insights from analysis of HTP data. Results: WPS extracts gene lists with shared biological themes through color cue templates. WPS statistically evaluates global functional category enrichment of gene lists and pathway-level pattern enrichment of data. WPS incorporates well-known biological pathways from KEGG (Kyoto Encyclopedia of Genes and Genomes) and Biocarta, GO (Gene Ontology) terms as well as user-defined pathways or relevant gene clusters or groups, and explores gene-term relationships within the derived gene-term association networks (GTANs). WPS simultaneously compares multiple datasets within biological contexts either as pathways or as association networks. WPS also integrates the Genetic Association Database and the Partial MedGene Database for disease-association information. We have used this program to analyze and compare microarray and proteomics datasets derived from a variety of biological systems. Application examples demonstrated the capacity of WPS to significantly facilitate the analysis of HTP data for integrative discovery. Conclusion: This tool represents a pathway-based platform for discovery integration to maximize analysis power. The tool is freely available at . PMID:16423281

  3. Stack Trace Analysis Tool

    SciTech Connect

    2013-02-19

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-featured debugger like TotalView for root cause analysis.
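
    A minimal sketch of the core idea, merging per-task stack traces into a call prefix tree and grouping tasks with identical call paths into equivalence classes, is shown below; it is not STAT or MRNet code.

      from collections import defaultdict

      def merge_traces(traces):
          """Merge {task_id: [main, ..., leaf]} stack traces into a prefix tree.
          Each node maps a frame name to (set of tasks below it, child dict)."""
          tree = {}
          for task, frames in traces.items():
              node = tree
              for frame in frames:
                  tasks, children = node.setdefault(frame, (set(), {}))
                  tasks.add(task)
                  node = children
          return tree

      def equivalence_classes(traces):
          """Group tasks whose full call paths are identical."""
          groups = defaultdict(list)
          for task, frames in traces.items():
              groups[tuple(frames)].append(task)
          return list(groups.values())

      traces = {0: ["main", "solve", "mpi_wait"],
                1: ["main", "solve", "mpi_wait"],
                2: ["main", "io_write"]}
      print(equivalence_classes(traces))     # -> [[0, 1], [2]]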

  4. Data mining tools for Salmonella characterization: application to gel-based fingerprinting analysis

    PubMed Central

    2013-01-01

    Background: Pulsed field gel electrophoresis (PFGE) is currently the most widely and routinely used method by the Centers for Disease Control and Prevention (CDC) and state health labs in the United States for Salmonella surveillance and outbreak tracking. Major drawbacks of commercially available PFGE analysis programs have been their difficulty in dealing with large datasets and the limited availability of analysis tools. There exists a need to develop new analytical tools for PFGE data mining in order to make full use of valuable data in large surveillance databases. Results: In this study, a software package was developed consisting of five types of bioinformatics approaches explored and implemented for the analysis and visualization of PFGE fingerprints. The approaches include PFGE band standardization, Salmonella serotype prediction, hierarchical cluster analysis, distance matrix analysis and two-way hierarchical cluster analysis. PFGE band standardization makes cross-group analysis of large datasets possible. The Salmonella serotype prediction approach allows users to predict serotypes of Salmonella isolates based on their PFGE patterns. The hierarchical cluster analysis approach can be used to clarify subtypes and phylogenetic relationships among groups of PFGE patterns. The distance matrix and two-way hierarchical cluster analysis tools allow users to directly visualize the similarities/dissimilarities of any two individual patterns and the inter- and intra-serotype relationships of two or more serotypes, and provide a summary of the overall relationships between user-selected serotypes as well as the distinguishable band markers of these serotypes. The functionalities of these tools were illustrated on PFGE fingerprinting data from PulseNet of CDC. Conclusions: The bioinformatics approaches included in the software package developed in this study were integrated with the PFGE database to enhance the data mining of PFGE fingerprints. Fast and

  5. Data mining tools for Salmonella characterization: application to gel-based fingerprinting analysis.

    PubMed

    Zou, Wen; Tang, Hailin; Zhao, Weizhong; Meehan, Joe; Foley, Steven L; Lin, Wei-Jiun; Chen, Hung-Chia; Fang, Hong; Nayak, Rajesh; Chen, James J

    2013-01-01

    Pulsed field gel electrophoresis (PFGE) is currently the most widely and routinely used method by the Centers for Disease Control and Prevention (CDC) and state health labs in the United States for Salmonella surveillance and outbreak tracking. Major drawbacks of commercially available PFGE analysis programs have been their difficulty in dealing with large datasets and the limited availability of analysis tools. There exists a need to develop new analytical tools for PFGE data mining in order to make full use of valuable data in large surveillance databases. In this study, a software package was developed consisting of five types of bioinformatics approaches explored and implemented for the analysis and visualization of PFGE fingerprints. The approaches include PFGE band standardization, Salmonella serotype prediction, hierarchical cluster analysis, distance matrix analysis and two-way hierarchical cluster analysis. PFGE band standardization makes cross-group analysis of large datasets possible. The Salmonella serotype prediction approach allows users to predict serotypes of Salmonella isolates based on their PFGE patterns. The hierarchical cluster analysis approach can be used to clarify subtypes and phylogenetic relationships among groups of PFGE patterns. The distance matrix and two-way hierarchical cluster analysis tools allow users to directly visualize the similarities/dissimilarities of any two individual patterns and the inter- and intra-serotype relationships of two or more serotypes, and provide a summary of the overall relationships between user-selected serotypes as well as the distinguishable band markers of these serotypes. The functionalities of these tools were illustrated on PFGE fingerprinting data from PulseNet of CDC. The bioinformatics approaches included in the software package developed in this study were integrated with the PFGE database to enhance the data mining of PFGE fingerprints. Fast and accurate prediction makes it possible
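
    PFGE patterns are commonly compared with a band-sharing (Dice) coefficient followed by hierarchical clustering such as UPGMA; the sketch below shows that generic approach on binary band-presence vectors and does not reproduce the package's band standardization or serotype prediction steps.

      import numpy as np
      from scipy.cluster.hierarchy import linkage
      from scipy.spatial.distance import squareform

      def dice_distance_matrix(band_matrix):
          """Pairwise Dice distances between isolates given a binary
          isolates x band-positions matrix (1 = band present)."""
          n = band_matrix.shape[0]
          d = np.zeros((n, n))
          for i in range(n):
              for j in range(i + 1, n):
                  a, b = band_matrix[i], band_matrix[j]
                  dice = 2.0 * np.sum(a & b) / (a.sum() + b.sum())
                  d[i, j] = d[j, i] = 1.0 - dice
          return d

      bands = np.array([[1, 1, 0, 1, 0, 1],
                        [1, 1, 0, 1, 0, 0],
                        [0, 1, 1, 0, 1, 1]])
      Z = linkage(squareform(dice_distance_matrix(bands)), method="average")  # UPGMA
      print(Z)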

  6. Providing web-based tools for time series access and analysis

    NASA Astrophysics Data System (ADS)

    Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane

    2014-05-01

    Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. However, a challenging issue is the processing of raw data and the execution of time series analysis. In most cases, data has to be found, downloaded, processed and even converted into the correct data format prior to executing time series analysis tools. Data has to be prepared for use in different existing software packages. Several packages like TIMESAT (Jönsson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations are provided as open-source software and can be executed from the command line. This is needed if data pre-processing and time series analysis are to be automated. To bring both parts, automated data access and data analysis, together, a web-based system was developed to provide access to satellite-based time series data and to the above-mentioned analysis tools. Users of the web portal are able to specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data is then processed and provided as a time series CSV file. Afterwards the user can select an analysis tool that is then executed on the server. The final data (CSV, plot images, GeoTIFFs) is visualized in the web portal and can be downloaded for further usage. As a first use case, we built up a complementary web-based system with NASA MODIS products for Germany and parts of Siberia based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible so that users can focus on the interpretation of the results. References: Jönsson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data. Computers and Geosciences 30

  7. Demand Response Analysis Tool

    SciTech Connect

    2012-03-01

    Demand Response Analysis Tool is a software tool developed at the Lawrence Berkeley National Laboratory. It was initially funded by Southern California Edison. Our goal in developing this tool is to provide an online, usable analysis tool with standardized methods for evaluating the demand and demand response performance of commercial and industrial facilities. The tool provides load variability and weather sensitivity analysis capabilities as well as the development of various types of baselines. It can be used by researchers, real estate management firms, utilities, or any individuals who are interested in analyzing their demand and demand response capabilities.

  8. A low-cost video-based tool for clinical gait analysis.

    PubMed

    Soda, Paolo; Carta, Alfonso; Formica, Domenico; Guglielmelli, Eugenio

    2009-01-01

    In physical and rehabilitation medicine, physicians need to perform clinical gait analysis to assess patients' walking ability. Despite the relevant research on motion tracking, gait analysis technologies are far from being commonly adopted in clinical practice, since they are quite expensive and require highly structured laboratories and trained personnel who are not always available. In order to overcome such limitations, this work proposes a low-cost, video-based portable tool for clinical gait analysis which provides a two-dimensional kinematic analysis of walking. The system processes a video stream by tracking markers placed at five anatomical landmarks of the subject's leg, applying a Kalman filter in conjunction with a method that copes with occlusions. The system has been validated on a healthy subject, showing that it is able to reconstruct marker position and leg kinematics even if several occlusions occur.
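
    A minimal constant-velocity Kalman filter for one marker's image position is sketched below; the noise parameters are arbitrary, and occlusions are handled simply by skipping the update step, which only approximates the method described above.

      import numpy as np

      dt = 1.0 / 30.0                                   # 30 fps video
      F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                    [0, 0, 1, 0], [0, 0, 0, 1]])        # constant-velocity model
      H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])        # only (x, y) is measured
      Q = np.eye(4) * 1e-3                              # process noise (assumed)
      R = np.eye(2) * 4.0                               # measurement noise, px^2 (assumed)

      def kalman_step(x, P, z=None):
          """One predict/update cycle; pass z=None when the marker is occluded."""
          x, P = F @ x, F @ P @ F.T + Q                 # predict
          if z is not None:                             # update only if the marker is seen
              S = H @ P @ H.T + R
              K = P @ H.T @ np.linalg.inv(S)
              x = x + K @ (z - H @ x)
              P = (np.eye(4) - K @ H) @ P
          return x, P

      x, P = np.zeros(4), np.eye(4) * 100.0
      for z in [np.array([100.0, 200.0]), np.array([101.2, 198.9]), None]:
          x, P = kalman_step(x, P, z)
          print("estimated position:", x[:2])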

  9. A web-based tool for groundwater mapping and drought analysis

    NASA Astrophysics Data System (ADS)

    Christensen, S.; Burns, M.; Jones, N.; Strassberg, G.

    2012-12-01

    In 2011-2012, the state of Texas saw the worst one-year drought on record. Fluctuations in gravity measured by GRACE satellites indicate that as much as 100 cubic kilometers of water was lost during this period. Much of this came from reservoirs and shallow soil moisture, but a significant amount came from aquifers. In response to this crisis, a Texas Drought Technology Steering Committee (TDTSC) consisting of academics and water managers was formed to develop new tools and strategies to assist the state in monitoring, predicting, and responding to drought events. In this presentation, we describe one of the tools that was developed as part of this effort. When analyzing the impact of drought on groundwater levels, it is fairly common to examine time series data at selected monitoring wells. However, accurately assessing impacts and trends requires both spatial and temporal analysis involving the development of detailed water level maps at various scales. Creating such maps in a flexible and rapid fashion is critical for effective drought analysis, but can be challenging due to the massive amounts of data involved and the processing required to generate such maps. Furthermore, wells are typically not sampled at the same points in time, and so developing a water table map for a particular date requires both spatial and temporal interpolation of water elevations. To address this challenge, a Cloud-based water level mapping system was developed for the state of Texas. The system is based on the Texas Water Development Board (TWDB) groundwater database, but can be adapted to use other databases as well. The system involves a set of ArcGIS workflows running on a server with a web-based front end and a Google Earth plug-in. A temporal interpolation geoprocessing tool was developed to estimate the piezometric heads for all wells in a given region at a specific date using a regression analysis. This interpolation tool is coupled with other geoprocessing tools to filter
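
    As a simplified stand-in for the temporal-interpolation geoprocessing step, the sketch below fits a linear regression of head versus time to each well's measurements and evaluates it at a common mapping date; the column names and data are invented, and the actual tool's regression may differ.

      import numpy as np
      import pandas as pd

      def head_at_date(df, target_date):
          """Estimate each well's water level at `target_date` by fitting a
          linear regression of head vs. time to that well's measurements."""
          target = pd.Timestamp(target_date).toordinal()
          estimates = {}
          for well, grp in df.groupby("well_id"):
              t = grp["date"].map(pd.Timestamp.toordinal).to_numpy(dtype=float)
              slope, intercept = np.polyfit(t, grp["head_m"].to_numpy(), deg=1)
              estimates[well] = slope * target + intercept
          return estimates

      obs = pd.DataFrame({
          "well_id": ["A", "A", "A", "B", "B"],
          "date": pd.to_datetime(["2011-01-10", "2011-06-01", "2011-12-20",
                                  "2011-03-05", "2011-11-15"]),
          "head_m": [152.3, 150.9, 149.2, 98.4, 96.7],
      })
      print(head_at_date(obs, "2011-09-01"))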

  10. Immunoassay based water quality analysis: A new tool for drinking water supply management

    SciTech Connect

    Kostyshyn, C.R.; Brown, W.; Hervey, E.; Hull, C.

    1996-11-01

    The recent availability of enzyme-linked immunosorbent assay (ELISA) tests for the analysis of organic environmental contaminants provides drinking water utility managers and operators with a new tool for managing treatment operations and monitoring source watersheds. Immunoassay technology permits rapid, inexpensive and accurate in-plant testing of many SDWA-regulated organic contaminants at concentrations well below established MCLs. Analytical testing which would not be practicable due to the high cost or long turnaround time limitations of conventional testing methods is now being performed using immunoassay-based analysis. Water quality data generated using immunoassay-based methods are being utilized by drinking water utilities as an integral part of source watershed management programs, process operations optimization efforts, proactive raw and finished water testing programs, and flood and incident response management.

  11. Development of an Acoustic Signal Analysis Tool “Auto-F” Based on the Temperament Scale

    NASA Astrophysics Data System (ADS)

    Modegi, Toshio

    The MIDI interface was originally designed for electronic musical instruments, but we consider that this music-note-based coding concept can be extended to general acoustic signal description. We proposed applying MIDI technology to the coding of bio-medical auscultation sound signals such as heart sounds, for retrieving medical records and performing telemedicine. We have since tried to extend our encoding targets to include vocal sounds, natural sounds and electronic bio-signals such as ECG, using the Generalized Harmonic Analysis method. Currently, we are trying to separate the vocal sounds included in popular songs and encode both the vocal sounds and the background instrumental sounds into separate MIDI channels. We are also trying to extract articulation parameters, such as MIDI pitch-bend parameters, in order to reproduce natural acoustic sounds using a GM-standard MIDI tone generator. In this paper, we present the overall algorithm of our acoustic signal analysis tool, based on those research works, which can analyze given time-based signals on the musical temperament scale. The prominent feature of this tool is that it produces high-precision MIDI codes, which reproduce signals similar to the given source signal on a GM-standard MIDI tone generator, and that it also provides the analysis results as text in XML format.
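
    The tool encodes analyzed frequencies on the equal-temperament scale as MIDI notes plus pitch-bend values that capture the deviation from the nearest note. The paper's exact mapping is not given; the sketch below shows the standard conversion from a frequency in Hz to a MIDI note number and a 14-bit pitch-bend value, assuming the common ±2-semitone bend range.

```python
import math

def freq_to_midi(freq_hz, bend_range_semitones=2.0):
    """Map a frequency to (MIDI note number, 14-bit pitch-bend value).
    A value of 8192 is the bend-wheel center (no bend)."""
    semitones = 69.0 + 12.0 * math.log2(freq_hz / 440.0)  # A4 = 440 Hz = note 69
    note = int(round(semitones))
    deviation = semitones - note                            # within +/- 0.5 semitone
    bend = int(round(8192 + deviation / bend_range_semitones * 8192))
    return note, max(0, min(16383, bend))

print(freq_to_midi(440.0))   # (69, 8192): exactly A4, no bend
print(freq_to_midi(445.0))   # slightly sharp A4, bend above center
```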

  12. iScreen: Image-Based High-Content RNAi Screening Analysis Tools.

    PubMed

    Zhong, Rui; Dong, Xiaonan; Levine, Beth; Xie, Yang; Xiao, Guanghua

    2015-09-01

    High-throughput RNA interference (RNAi) screening has opened up a path to investigating functional genomics in a genome-wide pattern. However, such studies are often restricted to assays that have a single readout format. Recently, advanced image technologies have been coupled with high-throughput RNAi screening to develop high-content screening, in which one or more cell image(s), instead of a single readout, were generated from each well. This image-based high-content screening technology has led to genome-wide functional annotation in a wider spectrum of biological research studies, as well as in drug and target discovery, so that complex cellular phenotypes can be measured in a multiparametric format. Despite these advances, data analysis and visualization tools are still largely lacking for these types of experiments. Therefore, we developed iScreen (image-Based High-content RNAi Screening Analysis Tool), an R package for the statistical modeling and visualization of image-based high-content RNAi screening. Two case studies were used to demonstrate the capability and efficiency of the iScreen package. iScreen is available for download on CRAN (http://cran.cnr.berkeley.edu/web/packages/iScreen/index.html). The user manual is also available as a supplementary document.

  13. Robust depth-based tools for the analysis of gene expression data.

    PubMed

    López-Pintado, Sara; Romo, Juan; Torrente, Aurora

    2010-04-01

    Microarray experiments provide data on the expression levels of thousands of genes and, therefore, statistical methods applicable to the analysis of such high-dimensional data are needed. In this paper, we propose robust nonparametric tools for the description and analysis of microarray data based on the concept of functional depth, which measures the centrality of an observation within a sample. We show that this concept can be easily adapted to high-dimensional observations and, in particular, to gene expression data. This allows the development of the following depth-based inference tools: (1) a scale curve for measuring and visualizing the dispersion of a set of points, (2) a rank test for deciding if 2 groups of multidimensional observations come from the same population, and (3) supervised classification techniques for assigning a new sample to one of G given groups. We apply these methods to microarray data, and to simulated data including contaminated models, and show that they are robust, efficient, and competitive with other procedures proposed in the literature, outperforming them in some situations.
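
    The paper builds its tools on a notion of functional depth that measures how central an observation is within a sample. One common depth of this kind is the modified band depth, which scores each observation by how often, on average, it lies inside the band spanned by pairs of other observations; the sketch below computes it for rows of a samples-by-genes style matrix. This is an illustrative implementation, not the authors' software.

```python
import numpy as np
from itertools import combinations

def modified_band_depth(X):
    """X: (n_curves, n_points). Returns one depth value per curve: the average
    proportion of coordinates at which the curve lies within the envelope of
    each pair of curves."""
    n, _ = X.shape
    pairs = list(combinations(range(n), 2))
    depth = np.zeros(n)
    for i in range(n):
        inside = 0.0
        for a, b in pairs:
            lo = np.minimum(X[a], X[b])
            hi = np.maximum(X[a], X[b])
            inside += np.mean((X[i] >= lo) & (X[i] <= hi))
        depth[i] = inside / len(pairs)
    return depth

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))
X[0] += 5.0                        # an outlying observation
d = modified_band_depth(X)
print(d.argmin() == 0)             # the outlier receives the lowest depth: True
```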

  14. Neutron multiplicity analysis tool

    SciTech Connect

    Stewart, Scott L

    2010-01-01

    I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This

  15. GEPAS, a web-based tool for microarray data analysis and interpretation

    PubMed Central

    Tárraga, Joaquín; Medina, Ignacio; Carbonell, José; Huerta-Cepas, Jaime; Minguez, Pablo; Alloza, Eva; Al-Shahrour, Fátima; Vegas-Azcárate, Susana; Goetz, Stefan; Escobar, Pablo; Garcia-Garcia, Francisco; Conesa, Ana; Montaner, David; Dopazo, Joaquín

    2008-01-01

    Gene Expression Profile Analysis Suite (GEPAS) is one of the most complete and extensively used web-based packages for microarray data analysis. During its more than 5 years of activity it has continuously been updated to keep pace with the state-of-the-art in the changing microarray data analysis arena. GEPAS offers diverse analysis options that include well-established as well as novel algorithms for normalization, gene selection, class prediction, clustering and functional profiling of the experiment. New options for time-course (or dose-response) experiments, microarray-based class prediction, new clustering methods and new tests for differential expression have been included. The new pipeliner module allows automating the execution of sequential analysis steps by means of a simple but powerful graphic interface. An extensive re-engineering of GEPAS has been carried out which includes the use of web services and Web 2.0 technology features, a new user interface with persistent sessions and a new extended database of gene identifiers. GEPAS is nowadays the most quoted web tool in its field; it is extensively used by researchers from many countries, and its records indicate an average usage rate of 500 experiments per day. GEPAS is available at http://www.gepas.org. PMID:18508806

  16. Research-Based Monitoring, Prediction, and Analysis Tools of the Spacecraft Charging Environment for Spacecraft Users

    NASA Technical Reports Server (NTRS)

    Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti A.; Maddox, Marlo M.; Mays, Mona Leila

    2015-01-01

    The Space Weather Research Center (http://swrc. gsfc.nasa.gov) at NASA Goddard, part of the Community Coordinated Modeling Center (http://ccmc.gsfc.nasa.gov), is committed to providing research-based forecasts and notifications to address NASA's space weather needs, in addition to its critical role in space weather education. It provides a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, tailored space weather alerts and products, and weekly summaries and reports. In this paper, we focus on how (near) real-time data (both in space and on ground), in combination with modeling capabilities and an innovative dissemination system called the integrated Space Weather Analysis system (http://iswa.gsfc.nasa.gov), enable monitoring, analyzing, and predicting the spacecraft charging environment for spacecraft users. Relevant tools and resources are discussed.

  18. Draper Station Analysis Tool

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  19. InfraPy: Python-Based Signal Analysis Tools for Infrasound

    SciTech Connect

    Blom, Philip Stephen; Marcillo, Omar Eduardo; Euler, Garrett Gene

    2016-05-31

    InfraPy is a Python-based analysis toolkit being developed at LANL. The algorithms are intended for ground-based nuclear detonation detection applications, to detect, locate, and characterize explosive sources using infrasonic observations. The implementation is usable as a stand-alone Python library or as a command-line-driven tool operating directly on a database. With multiple scientists working on the project, we've begun using a LANL git repository for collaborative development and version control. Current and planned work on InfraPy focuses on the development of new algorithms and propagation models. Collaboration with Southern Methodist University (SMU) has helped identify bugs and limitations of the algorithms. Current usage development focuses on library imports and the CLI.

  20. Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology.

    PubMed

    Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan

    2017-10-01

    Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 ( www.comparativego.com ). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.

  1. Population based MRI and DTI templates of the adult ferret brain and tools for voxelwise analysis.

    PubMed

    Hutchinson, E B; Schwerin, S C; Radomski, K L; Sadeghi, N; Jenkins, J; Komlosh, M E; Irfanoglu, M O; Juliano, S L; Pierpaoli, C

    2017-05-15

    Non-invasive imaging has the potential to play a crucial role in the characterization and translation of experimental animal models to investigate human brain development and disorders, especially when employed to study animal models that more accurately represent features of human neuroanatomy. The purpose of this study was to build and make available MRI and DTI templates and analysis tools for the ferret brain, as the ferret is a well-suited species for pre-clinical MRI studies, with a folded cortical surface, relatively high white matter volume and body dimensions that allow imaging with pre-clinical MRI scanners. Four ferret brain templates were built in this study - in-vivo MRI and DTI and ex-vivo MRI and DTI - using brain images across many ferrets, and region-of-interest (ROI) masks corresponding to established ferret neuroanatomy were generated by semi-automatic and manual segmentation. The templates and ROI masks were used to create web-based ferret brain viewing software for browsing the MRI and DTI volumes with annotations based on the ROI masks. A second objective of this study was to provide a careful description of the imaging methods used for acquisition, processing, registration and template building and to demonstrate several voxelwise analysis methods, including Jacobian analysis of morphometry differences between the female and male brain and bias-free identification of DTI abnormalities in an injured ferret brain. The templates, tools and methodological optimization presented in this study are intended to advance non-invasive imaging approaches for human-similar animal species that will enable the use of pre-clinical MRI studies for understanding and treating brain disorders. Published by Elsevier Inc.

  2. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    PubMed

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady-state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists with a new tool for the analysis of their SBML models of biochemical and cellular processes.
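
    One of the global methods the tool implements is the partial rank correlation coefficient (PRCC). The paper does not show the computation; as a rough sketch, the PRCC for a parameter can be obtained by rank-transforming all sampled parameters and the model output, removing the linear effect of the other parameters from both by regression, and correlating the residuals. The toy model below is invented for illustration.

```python
import numpy as np
from scipy.stats import rankdata

def prcc(X, y):
    """X: (n_samples, n_params) sampled parameter values, y: model outputs.
    Returns one partial rank correlation coefficient per parameter."""
    Xr = np.column_stack([rankdata(col) for col in X.T])
    yr = rankdata(y)
    n, k = Xr.shape
    out = np.zeros(k)
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(Xr, j, axis=1)])
        # Residuals after regressing parameter j and the output on the other parameters
        res_x = Xr[:, j] - others @ np.linalg.lstsq(others, Xr[:, j], rcond=None)[0]
        res_y = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
        out[j] = np.corrcoef(res_x, res_y)[0, 1]
    return out

# Toy model: the output depends strongly on parameter 0 and weakly on parameter 1
rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 3))
y = 5 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)
print(np.round(prcc(X, y), 2))
```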

  3. Physics analysis tools

    SciTech Connect

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools such as a programming language to high-level tools such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages. The categories are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, thus it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.

  4. MAGIA, a web-based tool for miRNA and Genes Integrated Analysis.

    PubMed

    Sales, Gabriele; Coppe, Alessandro; Bisognin, Andrea; Biasiolo, Marta; Bortoluzzi, Stefania; Romualdi, Chiara

    2010-07-01

    MAGIA (miRNA and genes integrated analysis) is a novel web tool for the integrative analysis of target predictions, miRNA and gene expression data. MAGIA is divided into two parts: the query section allows the user to retrieve and browse updated miRNA target predictions computed with a number of different algorithms (PITA, miRanda and Target Scan) and Boolean combinations thereof. The analysis section comprises a multistep procedure for (i) direct integration through different functional measures (parametric and non-parametric correlation indexes, a variational Bayesian model, mutual information and a meta-analysis approach based on P-value combination) of mRNA and miRNA expression data, (ii) construction of bipartite regulatory network of the best miRNA and mRNA putative interactions and (iii) retrieval of information available in several public databases of genes, miRNAs and diseases and via scientific literature text-mining. MAGIA is freely available for Academic users at http://gencomp.bio.unipd.it/magia.
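
    As a rough illustration of the direct-integration step (one of the functional measures MAGIA offers is a non-parametric correlation index), the sketch below computes Spearman correlations between matched miRNA and mRNA expression profiles for predicted target pairs and keeps the strongest anti-correlations as candidate edges of a bipartite regulatory network. The expression values, target pairs, and thresholds are hypothetical.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical expression profiles over the same 12 samples
rng = np.random.default_rng(2)
mirna_expr = {"miR-1": rng.normal(size=12), "miR-2": rng.normal(size=12)}
mrna_expr = {"GENE_A": rng.normal(size=12), "GENE_B": rng.normal(size=12)}
mrna_expr["GENE_A"] = -1.5 * mirna_expr["miR-1"] + rng.normal(scale=0.3, size=12)

# Predicted miRNA -> target pairs (in practice from PITA/miRanda/TargetScan)
predicted_targets = [("miR-1", "GENE_A"), ("miR-2", "GENE_B")]

edges = []
for mir, gene in predicted_targets:
    rho, pval = spearmanr(mirna_expr[mir], mrna_expr[gene])
    if rho < -0.5 and pval < 0.05:       # keep strong anti-correlations only
        edges.append((mir, gene, round(rho, 2)))

print(edges)   # expected: the miR-1 -> GENE_A interaction survives
```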

  5. Graphical Contingency Analysis Tool

    SciTech Connect

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis to provide more decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  6. SearchLight: a freely available web-based quantitative spectral analysis tool (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Prabhat, Prashant; Peet, Michael; Erdogan, Turan

    2016-03-01

    In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength dependent variables - even the excitation light sources have been changing. For example, LED Light Engines may have a significantly different spectral response compared to the traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of the system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that can be used to address the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. This tool is available at: http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexing imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
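
    The mathematical framework behind SearchLight is detailed in the cited reference; as a simplified sketch of the kind of quantitative evaluation described, the excitation-side and emission-side throughputs can each be estimated by integrating the product of the relevant spectra over wavelength, with their product taken as a relative signal figure. The Gaussian spectra and numbers below are entirely made up for illustration and do not represent any real fluorophore, filter, source, or detector.

```python
import numpy as np

wl = np.arange(400.0, 701.0)                       # wavelength grid, nm

def gaussian(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical normalized spectra (0..1)
source      = gaussian(470, 15)                    # LED light-engine output
ex_filter   = (np.abs(wl - 480) < 15).astype(float)
fluor_ex    = gaussian(490, 20)                    # fluorophore excitation
fluor_em    = gaussian(525, 25)                    # fluorophore emission
em_filter   = (np.abs(wl - 530) < 20).astype(float)
detector_qe = np.full_like(wl, 0.7)

# Relative signal: (excitation throughput) x (fraction of emission collected)
excitation = np.trapz(source * ex_filter * fluor_ex, wl)
emission   = np.trapz(fluor_em * em_filter * detector_qe, wl) / np.trapz(fluor_em, wl)
print(round(excitation * emission, 3))
```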

  7. Portfolio Analysis Tool

    NASA Technical Reports Server (NTRS)

    Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug

    2005-01-01

    Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but do not have access for modifying criteria for analyzing projects: access for modifying criteria is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or rehire experts, further reducing the cost of maintaining and upgrading computer code. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.

  8. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    PubMed

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using the data mining approach. The AutoMealRecord data were examined to determine if it could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) with dietary preference was assessed with multiple linear regression analyses, and in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross-validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord System. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
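
    The analysis pipeline described (principal component analysis of meal-purchase data, multiple linear regression of BMI on the extracted features, and leave-one-out validation of an "obese / not obese" cut-off at BMI ≥ 23) can be sketched with standard scikit-learn components as below. The feature names and data are hypothetical stand-ins for the AutoMealRecord variables, not the study's actual data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(3)
n = 200
meal_counts = rng.poisson(3.0, size=(n, 12))        # purchases per food category (toy)
covariates = rng.normal(size=(n, 3))                # e.g., age, energy intake, fiber (toy)
bmi = 22 + 0.8 * covariates[:, 1] + rng.normal(scale=1.5, size=n)

# Dietary patterns = principal components of the purchase matrix
patterns = PCA(n_components=5).fit_transform(meal_counts)
X = np.hstack([patterns, covariates])

# Multiple linear regression of BMI, validated with leave-one-out cross-validation
pred = cross_val_predict(LinearRegression(), X, bmi, cv=LeaveOneOut())
accuracy = np.mean((pred >= 23) == (bmi >= 23))     # "would-be obese" classification
print(round(accuracy, 3))
```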

  9. Application of motif-based tools on evolutionary analysis of multipartite single-stranded DNA viruses.

    PubMed

    Wang, Hsiang-Iu; Chang, Chih-Hung; Lin, Po-Heng; Fu, Hui-Chuan; Tang, Chuanyi; Yeh, Hsin-Hung

    2013-01-01

    Multipartite viruses contain more than one distinctive genome component, and multipartite viruses have been suggested to have evolved from a non-segmented wild-type virus. To explore whether recombination also plays a role in the evolution of the genomes of multipartite viruses, we developed a systematic approach that employs motif-finding tools to detect conserved motifs from divergent genomic regions and applies statistical approaches to select high-confidence motifs. The information that this approach provides helps us understand the evolution of viruses. In this study, we compared our motif-based strategy with current alignment-based recombination-detecting methods and applied our methods to the analysis of multipartite single-stranded plant DNA viruses, including bipartite begomoviruses, Banana bunchy top virus (BBTV) (consisting of 6 genome components) and Faba bean necrotic yellows virus (FBNYV) (consisting of 8 genome components). Our analysis revealed that recombination occurred between genome components in some begomoviruses, BBTV and FBNYV. Our data also show that several unusual recombination events have contributed to the evolution of BBTV genome components. We believe that similar approaches can be applied to resolve the evolutionary history of other viruses.

  10. SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Young, M. D.; Hayashi, S.; Gopu, A.

    2014-05-01

    As a new generation of large-format, high-resolution imagers comes online (ODI, DECAM, LSST, etc.), we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears to be unfeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to rapidly move from large-scale spatial overviews down to the level of individual sources and everything in-between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as overlaying data from publicly available sources (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.

  11. Application of Motif-Based Tools on Evolutionary Analysis of Multipartite Single-Stranded DNA Viruses

    PubMed Central

    Wang, Hsiang-Iu; Chang, Chih-Hung; Lin, Po-Heng; Fu, Hui-Chuan; Tang, ChuanYi; Yeh, Hsin-Hung

    2013-01-01

    Multipartite viruses contain more than one distinctive genome component, and multipartite viruses have been suggested to have evolved from a non-segmented wild-type virus. To explore whether recombination also plays a role in the evolution of the genomes of multipartite viruses, we developed a systematic approach that employs motif-finding tools to detect conserved motifs from divergent genomic regions and applies statistical approaches to select high-confidence motifs. The information that this approach provides helps us understand the evolution of viruses. In this study, we compared our motif-based strategy with current alignment-based recombination-detecting methods and applied our methods to the analysis of multipartite single-stranded plant DNA viruses, including bipartite begomoviruses, Banana bunchy top virus (BBTV) (consisting of 6 genome components) and Faba bean necrotic yellows virus (FBNYV) (consisting of 8 genome components). Our analysis revealed that recombination occurred between genome components in some begomoviruses, BBTV and FBNYV. Our data also show that several unusual recombination events have contributed to the evolution of BBTV genome components. We believe that similar approaches can be applied to resolve the evolutionary history of other viruses. PMID:23936517

  12. Review and comparison of web- and disk-based tools for residentialenergy analysis

    SciTech Connect

    Mills, Evan

    2002-08-25

    There exist hundreds of building energy software tools, both web- and disk-based. These tools exhibit considerable range in approach and creativity, with some being highly specialized and others able to consider the building as a whole. However, users are faced with a dizzying array of choices and, often, conflicting results. The fragmentation of development and deployment efforts has hampered tool quality and market penetration. The purpose of this review is to provide information for defining the desired characteristics of residential energy tools, and to encourage future tool development that improves on current practice. This project entails (1) creating a framework for describing possible technical and functional characteristics of such tools, (2) mapping existing tools onto this framework, (3) exploring issues of tool accuracy, and (4) identifying "best practice" and strategic opportunities for tool design. We evaluated 50 web-based residential calculators, 21 of which we regard as "whole-house" tools (i.e., covering a range of end uses). Of the whole-house tools, 13 provide open-ended energy calculations, 5 normalize the results to actual costs (a.k.a. "bill-disaggregation tools"), and 3 provide both options. Across the whole-house tools, we found a range of 5 to 58 house-descriptive features (out of 68 identified in our framework) and 2 to 41 analytical and decision-support features (55 possible). We also evaluated 15 disk-based residential calculators, six of which are whole-house tools. Of these tools, 11 provide open-ended calculations, 1 normalizes the results to actual costs, and 3 provide both options. These tools offered ranges of 18 to 58 technical features (70 possible) and 10 to 40 user- and decision-support features (56 possible). The comparison shows that such tools can employ many approaches and levels of detail. Some tools require a relatively small number of well-considered inputs while others ask a myriad of questions and still miss key

  13. Web-Based Model Visualization Tools to Aid in Model Optimization and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Alder, J.; van Griensven, A.; Meixner, T.

    2003-12-01

    Individuals applying hydrologic models need quick, easy-to-use visualization tools to assess and understand model performance. We present here the Interactive Hydrologic Modeling (IHM) visualization toolbox. The IHM utilizes high-speed Internet access, the portability of the web and the increasing power of modern computers to provide an online toolbox for quick and easy model result visualization. This visualization interface allows for the interpretation and analysis of Monte-Carlo and batch model simulation results. Oftentimes a given project will generate several thousand or even hundreds of thousands of simulations. This large number of simulations creates a challenge for post-simulation analysis. IHM's goal is to solve this problem by loading all of the data into a database with a web interface that can dynamically generate graphs for the user according to their needs. IHM currently supports: a global sample statistics table (e.g. sum of squares error, sum of absolute differences, etc.), top-ten simulations tables and graphs, graphs of an individual simulation using time step data, objective-based dotty plots, threshold-based parameter cumulative distribution function graphs (as used in the regional sensitivity analysis of Spear and Hornberger) and 2D error surface graphs of the parameter space. IHM is suitable for everything from the simplest bucket model to the largest set of Monte-Carlo model simulations with a multi-dimensional parameter and model output space. By using a web interface, IHM offers the user complete flexibility in the sense that they can be anywhere in the world using any operating system. IHM can be a time- and money-saving alternative to spending time producing graphs, conducting analyses that may not be informative, or being forced to purchase or use expensive proprietary software. IHM is a simple, free method of interpreting and analyzing batch model results, and is suitable for novice to expert hydrologic modelers.

  14. Building energy analysis tool

    DOEpatents

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  15. On-line manipulator tool condition monitoring based on vibration analysis

    NASA Astrophysics Data System (ADS)

    Gierlak, Piotr; Burghardt, Andrzej; Szybicki, Dariusz; Szuster, Marcin; Muszyńska, Magdalena

    2017-05-01

    This article presents a method of processing and analyzing the measurement signals used in diagnosing the state of a manipulator's tool. The analysis of the signals was performed in the time and frequency domains. The signals utilized in the analysis were the mechanical vibrations and the rotation speed of the tool. The database for analysis was obtained in a research environment and includes instances of the system functioning with a tool in good technical condition as well as instances with a damaged tool. To reduce the data, the registered signals are represented by selected features. The preliminary selection of the significant features of the signals is made with the sequential feature selection procedure. The reduced set of features is used to create a tool condition classifier, which has the form of an artificial neural network. The obtained classifier operates on-line on the robotized system and generates diagnostic information on the state of the tool.
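
    The processing chain described (vibration-signal features, preliminary sequential feature selection, and an artificial-neural-network classifier of tool condition) can be sketched with scikit-learn as below. The features and data are synthetic placeholders; the authors' actual feature set and network architecture are not specified in the abstract.

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def features(signal):
    """A few generic time/frequency-domain descriptors of a vibration segment."""
    spectrum = np.abs(np.fft.rfft(signal))
    return [signal.std(), np.abs(signal).max(),
            ((signal[:-1] * signal[1:]) < 0).mean(),      # zero-crossing rate
            spectrum.argmax(), spectrum.max(), spectrum.mean()]

rng = np.random.default_rng(4)
good = [features(rng.normal(size=1024)) for _ in range(80)]          # intact tool (toy)
bad = [features(rng.normal(size=1024) * 2.5) for _ in range(80)]     # damaged tool (toy)
X = np.array(good + bad)
y = np.array([0] * 80 + [1] * 80)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Keep the 3 most informative features, then train a small neural-network classifier
clf = make_pipeline(
    StandardScaler(),
    SequentialFeatureSelector(MLPClassifier(max_iter=2000, random_state=0),
                              n_features_to_select=3),
    MLPClassifier(max_iter=2000, random_state=0),
)
clf.fit(X_tr, y_tr)
print(round(clf.score(X_te, y_te), 3))
```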

  16. Configuration Analysis Tool

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1983-01-01

    The Configuration Analysis Tool (CAT) is an information storage and report generation system that aids configuration management activities. Configuration management is a discipline composed of many techniques selected to track and direct the evolution of complex systems. CAT is an interactive program that accepts, organizes, and stores information pertinent to specific phases of a project.

  17. Analysis Tools (AT)

    Treesearch

    Larry J. Gangi

    2006-01-01

    The FIREMON Analysis Tools program is designed to let the user perform grouped or ungrouped summary calculations of single measurement plot data, or statistical comparisons of grouped or ungrouped plot data taken at different sampling periods. The program allows the user to create reports and graphs, save and print them, or cut and paste them into a word processor....

  18. Logistics Process Analysis Tool

    SciTech Connect

    2008-03-31

    LPAT is the integrated system resulting from the ANL-developed Enhanced Logistics Intra-Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  19. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  20. BiNChE: a web tool and library for chemical enrichment analysis based on the ChEBI ontology.

    PubMed

    Moreno, Pablo; Beisken, Stephan; Harsha, Bhavana; Muthukrishnan, Venkatesh; Tudose, Ilinca; Dekker, Adriano; Dornfeldt, Stefanie; Taruttis, Franziska; Grosse, Ivo; Hastings, Janna; Neumann, Steffen; Steinbeck, Christoph

    2015-02-21

    Ontology-based enrichment analysis aids in the interpretation and understanding of large-scale biological data. Ontologies are hierarchies of biologically relevant groupings. Using ontology annotations, which link ontology classes to biological entities, enrichment analysis methods assess whether there is a significant over- or under-representation of entities for ontology classes. While many tools exist that run enrichment analysis for protein sets annotated with the Gene Ontology, there are only a few that can be used for small-molecule enrichment analysis. We describe BiNChE, an enrichment analysis tool for small molecules based on the ChEBI Ontology. BiNChE displays an interactive graph that can be exported as a high-resolution image or in network formats. The tool provides plain, weighted and fragment analysis based on either the ChEBI Role Ontology or the ChEBI Structural Ontology. BiNChE aids in the exploration of large sets of small molecules produced within Metabolomics or other Systems Biology research contexts. The open-source tool provides easy and highly interactive web access to enrichment analysis with the ChEBI ontology, and is additionally available as a standalone library.
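
    At the core of ontology enrichment analysis of the kind BiNChE performs is a test for over-representation of an ontology class in a study set relative to a background; a common choice is the one-sided hypergeometric (Fisher) test sketched below. This is a generic illustration, not BiNChE's exact statistic or weighting scheme, and the numbers are invented.

```python
from scipy.stats import hypergeom

def enrichment_p(study_hits, study_size, class_size, background_size):
    """P(observing >= study_hits members of an ontology class in the study set)
    under random sampling from the background (one-sided hypergeometric test)."""
    return hypergeom.sf(study_hits - 1, background_size, class_size, study_size)

# Hypothetical numbers: a 40-molecule study set, 12 of which are annotated to a
# ChEBI role that annotates 300 of the 10,000 background molecules.
p = enrichment_p(study_hits=12, study_size=40, class_size=300, background_size=10_000)
print(f"{p:.3g}")
```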

  1. Omics AnalySIs System for PRecision Oncology (OASISPRO): A Web-based Omics Analysis Tool for Clinical Phenotype Prediction.

    PubMed

    Yu, Kun-Hsing; Fitzpatrick, Michael R; Pappas, Luke; Chan, Warren; Kung, Jessica; Snyder, Michael

    2017-09-12

    Precision oncology is an approach that accounts for individual differences to guide cancer management. Omics signatures have been shown to predict clinical traits for cancer patients. However, the vast amount of omics information poses an informatics challenge in systematically identifying patterns associated with health outcomes, and no general-purpose data-mining tool exists for physicians, medical researchers, and citizen scientists without significant training in programming and bioinformatics. To bridge this gap, we built the Omics AnalySIs System for PRecision Oncology (OASISPRO), a web-based system to mine the quantitative omics information from The Cancer Genome Atlas (TCGA). This system effectively visualizes patients' clinical profiles, executes machine-learning algorithms of choice on the omics data, and evaluates the prediction performance using held-out test sets. With this tool, we successfully identified genes strongly associated with tumor stage, and accurately predicted patients' survival outcomes in many cancer types, including mesothelioma and adrenocortical carcinoma. By identifying the links between omics and clinical phenotypes, this system will facilitate omics studies on precision cancer medicine and contribute to establishing personalized cancer treatment plans. This web-based tool is available at http://tinyurl.com/oasispro; source code is available at http://tinyurl.com/oasisproSourceCode.
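
    The workflow described (pick an omics matrix and a clinical trait, run a machine-learning algorithm of choice, and evaluate on a held-out test set) can be sketched in a few lines of scikit-learn. The data below are random placeholders rather than TCGA values, and the classifier choice is an assumption for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(5)
n_patients, n_genes = 300, 500
expression = rng.normal(size=(n_patients, n_genes))             # omics matrix (placeholder)
stage = (expression[:, 0] + expression[:, 1] > 0).astype(int)   # clinical trait (toy label)

# Hold out a test set, train the chosen learner, and report held-out accuracy
X_tr, X_te, y_tr, y_te = train_test_split(expression, stage,
                                          test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(round(accuracy_score(y_te, model.predict(X_te)), 3))
```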

  2. Linking Nurses with Evidence-Based Information via Social Media Tools: An Analysis of the Literature.

    PubMed

    Carter-Templeton, Heather; Krishnamurthy, Mangala; Nelson, Ramona

    2016-01-01

    Many health professionals believe that social media tools can play a pivotal role in sharing and facilitating the use of evidence-based information with patients and other healthcare providers. By understanding how social media tools function, healthcare professionals can capitalize on these interactive platforms to improve the health of others. However, limited information exists to guide nurse educators in preparing healthcare professionals to engage patients or share evidence-based information among peers. The purpose of this literature review was to determine the extent to which professional development programs using social media for sharing evidence-based information have reported their research and/or experience in the published literature.

  3. PCard Data Analysis Tool

    SciTech Connect

    Hilts, Jim

    2005-04-01

    The Procurement Card data analysis and monitoring tool enables due-diligence review using predefined user-created queries and reports. The system tracks individual compliance emails. More specifically, the tool: helps identify exceptions or questionable and non-compliant purchases; creates a random audit sample on request; allows users to create and run new or ad-hoc queries and reports; monitors disputed charges; creates predefined emails to cardholders requesting documentation and/or clarification; and tracks audit status, notes, email status (date sent, response), and audit resolution.

  4. Development of an analysis tool for cloud base height and visibility

    NASA Astrophysics Data System (ADS)

    Umdasch, Sarah; Steinacker, Reinhold; Dorninger, Manfred; Kerschbaum, Markus; Pöttschacher, Wolfgang

    2014-05-01

    The meteorological variables cloud base height (CBH) and horizontal atmospheric visibility (VIS) at surface level are of vital importance for safety and effectiveness in aviation. Around 20% of all civil aviation accidents in the USA from 2003 to 2007 were due to weather-related causes, around 18% of which were owing to decreased visibility or ceiling (main CBH). The aim of this study is to develop a system generating quality-controlled gridded analyses of the two parameters based on the integration of various kinds of observational data. Upon completion, the tool is planned to provide guidance for nowcasting during take-off and landing as well as for flights operated under visual flight rules. Primary input data consists of manual as well as instrumental observations of CBH and VIS. In Austria, restructuring of part of the standard meteorological stations from human observation to automatic measurement of VIS and CBH is currently in progress. As ancillary data, satellite-derived products can add 2-dimensional information, e.g. Cloud Type by NWC SAF (Nowcasting Satellite Application Facilities) MSG (Meteosat Second Generation). Other useful available data are meteorological surface measurements (in particular of temperature, humidity, wind and precipitation), radiosonde, radar and high-resolution topography data. A one-year data set is used to study the spatial and weather-dependent representativeness of the CBH and VIS measurements. The VERA (Vienna Enhanced Resolution Analysis) system of the Institute of Meteorology and Geophysics of the University of Vienna provides the framework for the analysis development. Its integrated "Fingerprint" technique allows the insertion of empirical prior knowledge and ancillary information in the form of spatial patterns. Prior to the analysis, a quality control of input data is performed. For CBH and VIS, quality control can consist of internal consistency checks between different data sources. The possibility of two

  5. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces a new trade analysis software tool called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.

  7. Analysis/Design Tool

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Excelerator II, developed by INTERSOLV, Inc., provides a complete environment for rules-based expert systems. The software incorporates NASA's C Language Integrated Production System (CLIPS), a shell for constructing expert systems. Excelerator II provides complex verification and transformation routines based on matching that is simple and inexpensive. *Excelerator II was sold to SELECT Software Tools in June 1997 and is now called SELECT Excelerator. SELECT has assumed full support and maintenance for the product line.

  8. Transmission Planning Analysis Tool

    SciTech Connect

    2015-06-23

    Developed to solve a specific problem: assist transmission planning for regional transfers in interconnected power systems. This work originated in a study for the U.S. Department of State to recommend transmission reinforcements for the Central American regional system that interconnects 6 countries. Transmission planning analysis is currently performed by engineers with domain-specific and system-specific knowledge, without a unique methodology. The software codes of this disclosure assist engineers by defining systematic analysis procedures to help identify weak points and make decisions on transmission planning of regional interconnected power systems. The Transmission Planning Analysis Tool groups PSS/E results of multiple AC contingency analyses, voltage stability analyses and QV analyses across many study scenarios and arranges them in a systematic way to aid power system planning engineers or transmission operators in an effective decision-making process or in the off-line study environment.

  9. DRIVE Analysis Tool Generates Custom Vehicle Drive Cycles Based on Real-World Data (Fact Sheet)

    SciTech Connect

    Not Available

    2013-04-01

    This fact sheet from the National Renewable Energy Laboratory describes the Drive-Cycle Rapid Investigation, Visualization, and Evaluation (DRIVE) analysis tool, which uses GPS and controller area network data to characterize vehicle operation and produce custom vehicle drive cycles, analyzing thousands of hours of data in a matter of minutes.

  10. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    PubMed Central

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-01-01

    Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady-state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists with a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080

  11. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: (1) HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files; (2) Java RGA extractor: can load multiple SRS .ana files and extract pressure vs. time data; (3) C++ Contamination Simulation code: a 3D particle-tracing code for modeling transport of dust particulates and molecules. It uses residence time to determine if molecules stick, and particulates can be sampled from IEST-STD-1246 and accelerated by aerodynamic forces.

  12. Development of a management tool for reservoirs in Mediterranean environments based on uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Gómez-Beas, R.; Moñino, A.; Polo, M. J.

    2012-05-01

    In compliance with the development of the Water Framework Directive, there is a need for an integrated management of water resources, which involves the elaboration of reservoir management models. These models should include the operational and technical aspects which allow us to forecast an optimal management in the short term, besides the factors that may affect the volume of water stored in the medium and long term. The climate fluctuations of the water cycle that affect the reservoir watershed should be considered, as well as the social and economic aspects of the area. This paper shows the development of a management model for Rules reservoir (southern Spain), through which the water supply is regulated based on set criteria, in a sustainable way that respects existing commitments downstream, with the supply capacity well established as a function of demand, together with the probability of failure when the operating requirements are not fulfilled. The results obtained allowed us to characterize the reservoir response at different time scales, to introduce an uncertainty analysis, and to demonstrate the potential of the methodology proposed here as a tool for decision making.
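
    The abstract mentions estimating the probability of failure when operating requirements are not fulfilled; a common way to obtain such a figure is Monte Carlo simulation of the storage balance under uncertain inflows, as in the toy sketch below. The inflow distribution, demand, and capacity values are invented for illustration and are not those of the Rules reservoir model.

```python
import numpy as np

rng = np.random.default_rng(6)
capacity, demand, initial = 120.0, 8.0, 80.0     # hm^3, hypothetical values
n_runs, n_months = 5000, 12

failures = 0
for _ in range(n_runs):
    storage = initial
    for _ in range(n_months):
        inflow = rng.lognormal(mean=1.8, sigma=0.6)        # uncertain monthly inflow
        storage = min(capacity, storage + inflow)          # spill above capacity
        released = min(demand, storage)                    # supply what is available
        storage -= released
        if released < demand:                              # operating requirement missed
            failures += 1
            break

print(f"probability of failure within a year: {failures / n_runs:.3f}")
```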

  13. Flight Operations Analysis Tool

    NASA Technical Reports Server (NTRS)

    Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca

    2006-01-01

    Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.

  14. Dynamic Contingency Analysis Tool

    SciTech Connect

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: it uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events; it integrates dynamic models with protection scheme models for generation, transmission, and load; and it models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
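
    The hybrid dynamic/steady-state loop described above can be sketched generically as follows. The `grid` object and its methods are a hypothetical simulator adapter invented for illustration; they are not the PSS/E or psspy API, and the loop is a simplification of the actual DCAT methodology.

    ```python
    def run_cascade(grid, initiating_outage, max_rounds=20):
        """Hybrid cascading-outage sketch: alternate fast dynamic simulation
        (protection trips) with slower steady-state power-flow checks
        (thermal/voltage violations) until no new outages occur."""
        grid.apply_outage(initiating_outage)
        events = [initiating_outage]
        for _ in range(max_rounds):
            trips = grid.run_dynamic_simulation(duration_s=30.0)   # protection actions
            violations = grid.solve_power_flow()                   # post-disturbance steady state
            new_outages = list(trips) + [v.element for v in violations if v.exceeds_emergency_rating]
            if not new_outages:
                break                                              # cascade has stopped
            for element in new_outages:
                grid.apply_outage(element)
            events.extend(new_outages)
        return events
    ```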

  15. A Semantic Provenance-aware Expert Advisory System in a Web-based Science Data Analysis Tool

    NASA Astrophysics Data System (ADS)

    Zednik, S.; Lynnes, C.; Fox, P. A.; Leptoukh, G. G.; Pan, J.

    2010-12-01

    Web-based science analysis and processing tools allow users to access, analyze, and generate visualizations of data while alleviating users from having to directly manage complex data processing operations. These tools provide value by streamlining the data analysis process, but usually shield users from details of the data processing steps, algorithm assumptions, caveats, etc. Correct interpretation of the final analysis requires user understanding of how data has been generated and processed and what potential biases, anomalies, or errors may have been introduced. By providing services that leverage data lineage provenance and domain-expertise, expert systems can be built to aid the user in understanding data sources, processing, and the suitability for use of products generated by the tools. As an example of such a system, we describe a semantic, provenance-aware, expert-knowledge advisory system applied to an existing web-based Earth science data analysis tool (e.g. Giovanni from NASA/GSFC). First we introduce our integrated semantic data model, which is comprised of provenance, data processing, and science domain ontologies. Then we describe how we developed an initial set of expert rules, to reason over our data model and discover conditions in the processing provenance that could lead to anomalies or errors in the processing results. Finally we will highlight how knowledge from the semantic data model and inferences of the advisory expert ruleset may be presented to the user to assist in user understanding of the suitability of products generated by the analysis tool.

  16. An analysis tool to calculate permeability based on the Patlak method.

    PubMed

    Cetin, Ozdemir

    2012-06-01

    Strokes are commonly diagnosed by utilizing images obtained from magnetic resonance imaging (MRI) technology. Nowadays, computer software can play a large role in analyzing these images and arriving at diagnoses quickly and accurately. Additionally, this software can reduce the workload for medical personnel and lower the rate of misdiagnosis. In this paper a flexible permeability calculation tool called PCT, based on the Patlak plot method, is presented. Using the PCT we can calculate the permeability coefficient of the Blood-Brain Barrier (BBB) function. The PCT tool offers both manual and automatic options for diagnosing the regions of the brain affected by stroke. Moreover, the PCT tool supports various file formats such as DICOM, NIfTI, and Analyze.
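
    The Patlak plot reduces a tracer-kinetics problem to a straight-line fit: plotting the tissue-to-plasma concentration ratio against the time-integral of the plasma concentration divided by the plasma concentration gives the influx (permeability) constant as the slope. Below is a minimal, generic Python sketch of that fit; it is not the PCT tool's code, and the variable names are illustrative.

    ```python
    import numpy as np

    def patlak_fit(t, c_tissue, c_plasma):
        """Estimate the Patlak influx constant Ki (slope) and intercept v0
        from tissue and plasma concentration-time curves."""
        integral_cp = np.array([np.trapz(c_plasma[:i + 1], t[:i + 1]) for i in range(len(t))])
        valid = c_plasma > 0
        x = integral_cp[valid] / c_plasma[valid]   # "Patlak time"
        y = c_tissue[valid] / c_plasma[valid]      # normalised tissue uptake
        ki, v0 = np.polyfit(x, y, 1)               # straight-line fit
        return ki, v0
    ```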

  17. Recommendations for tool-handle material choice based on finite element analysis.

    PubMed

    Harih, Gregor; Dolšak, Bojan

    2014-05-01

    Huge areas of work are still done manually and require the use of different powered and non-powered hand tools. In order to increase user performance and satisfaction, and to lower the risk of acute and cumulative trauma disorders, several researchers have investigated the sizes and shapes of tool-handles. However, only a few authors have investigated tool-handle materials for further optimising them. Therefore, as presented in this paper, we have utilised a finite-element method for simulating the human fingertip whilst grasping tool-handles. We modelled and simulated steel and ethylene propylene diene monomer (EPDM) rubber as homogeneous tool-handle materials and two composites consisting of EPDM rubber and EPDM foam, and also EPDM rubber and PU foam. The simulated finger force was set to obtain characteristic contact pressures of 20 kPa, 40 kPa, 80 kPa, and 100 kPa. Numerical tests have shown that EPDM rubber lowers the contact pressure just slightly. On the other hand, both composites showed a significant reduction in contact pressure that could lower the risks of acute and cumulative trauma disorders, which are pressure-dependent. Based on the results, it is also evident that the composite containing PU foam, with a more evident and flat plateau, deformed less at lower strain rates and deformed more when the plateau was reached, in comparison to the composite with EPDM foam. It was shown that hyper-elastic foam materials, which take into account the non-linear behaviour of fingertip soft tissue, can lower the contact pressure whilst maintaining a low deformation rate of the tool-handle material for maintaining a sufficient rate of stability of the hand tool in the hands. Lower contact pressure also lowers the risk of acute and cumulative trauma disorders, and increases comfort whilst maintaining performance. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  18. Analysis of the Thermal Characteristics of Machine Tool Feed System Based on Finite Element Method

    NASA Astrophysics Data System (ADS)

    Mao, Xiaobo; Mao, Kuanmin; Du, Yikang; Wang, Fengyun; Yan, Bo

    2017-09-01

    The loading of a mobile heat source and the setting of boundary conditions are difficult problems in the analysis of the thermal characteristics of machine tools. Taking the machine tool feed system as an example, a novel method for loading a mobile heat source is proposed by establishing a function of the heat source and time. The convective heat transfer coefficient is the key parameter of the boundary conditions, and it varies with the temperature. In this paper, a model of a “variable convection heat transfer coefficient” is proposed, so that the setting of the boundary conditions for the thermal analysis is closer to the real situation. Finally, by comparing the results of the above method with experimental data, the accuracy and validity of the method are demonstrated; at the same time, the simulation effort and simulation time are greatly reduced.
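
    As a rough illustration of the two ideas above, the sketch below defines a moving (Gaussian) heat-source function of position and time and a temperature-dependent convection coefficient. The Gaussian form, the natural-convection power law, and all parameter values are assumptions made for illustration; they are not taken from the paper.

    ```python
    import numpy as np

    def moving_heat_flux(x, t, q_total=200.0, speed=0.05, sigma=0.01):
        """Heat flux (W/m) from a source of total strength q_total whose centre
        travels along the axis at `speed` m/s, spread as a Gaussian of width sigma."""
        centre = speed * t
        return q_total / (sigma * np.sqrt(2 * np.pi)) * np.exp(-(x - centre) ** 2 / (2 * sigma ** 2))

    def convection_coefficient(surface_temp_c, ambient_temp_c=20.0, h0=9.7, exponent=0.25):
        """'Variable' convection coefficient: a natural-convection-style power law
        in the surface-to-ambient temperature difference (placeholder constants)."""
        return h0 * max(surface_temp_c - ambient_temp_c, 0.0) ** exponent

    x = np.linspace(0.0, 1.0, 201)            # positions along the guideway (m)
    q_now = moving_heat_flux(x, t=5.0)        # flux profile 5 s after the start
    print(q_now.max(), convection_coefficient(45.0))
    ```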

  19. MARSTHERM: A Web-based System Providing Thermophysical Analysis Tools for Mars Research

    NASA Astrophysics Data System (ADS)

    Putzig, N. E.; Barratt, E. M.; Mellon, M. T.; Michaels, T. I.

    2013-12-01

    We introduce MARSTHERM, a web-based system that will allow researchers access to a standard numerical thermal model of the Martian near-surface and atmosphere. In addition, the system will provide tools for the derivation, mapping, and analysis of apparent thermal inertia from temperature observations by the Mars Global Surveyor Thermal Emission Spectrometer (TES) and the Mars Odyssey Thermal Emission Imaging System (THEMIS). Adjustable parameters for the thermal model include thermal inertia, albedo, surface pressure, surface emissivity, atmospheric dust opacity, latitude, surface slope angle and azimuth, season (solar longitude), and time steps for calculations and output. The model computes diurnal surface and brightness temperatures for either a single day or a full Mars year. Output options include text files and plots of seasonal and diurnal surface, brightness, and atmospheric temperatures. The tools for the derivation and mapping of apparent thermal inertia from spacecraft data are project-based, wherein the user provides an area of interest (AOI) by specifying latitude and longitude ranges. The system will then extract results within the AOI from prior global mapping of elevation (from the Mars Orbiter Laser Altimeter, for calculating surface pressure), TES annual albedo, and TES seasonal and annual-mean 2AM and 2PM apparent thermal inertia (Putzig and Mellon, 2007, Icarus 191, 68-94). In addition, a history of TES dust opacity within the AOI is computed. For each project, users may then provide a list of THEMIS images to process for apparent thermal inertia, optionally overriding the TES-derived dust opacity with a fixed value. Output from the THEMIS derivation process includes thumbnail and context images, GeoTIFF raster data, and HDF5 files containing arrays of input and output data (radiance, brightness temperature, apparent thermal inertia, elevation, quality flag, latitude, and longitude) and ancillary information. As a demonstration of capabilities

  20. A Survey of Visualization Tools Assessed for Anomaly-Based Intrusion Detection Analysis

    DTIC Science & Technology

    2014-04-01

    network security analysts’ tasks. They are AutoFocus, Beluga, Cichild, Cuttlefish, FlowScan, GeoPlot, GTrace, MapNet, Otter, Plankton, PlotPaths, Real...animation. One monitoring and one analysis capability; no response capabilities. Otter http://www.caida.org/tools/visualization/otter ...AVS Express, Otter, and Tableau Desktop. AVS Express manages memory better and provides faster graphics. Otter has high memory usage for large data

  1. Communications network analysis tool

    NASA Astrophysics Data System (ADS)

    Phillips, Wayne; Dunn, Gary

    1989-11-01

    The Communications Network Analysis Tool (CNAT) is a set of computer programs that aids in the performance evaluation of a communication system in a real-world scenario. Communication network protocols can be modeled and battle group connectivity can be analyzed in the presence of jamming and the benefit of relay platforms can be studied. The Joint Tactical Information Distribution System (JTIDS) Communication system architecture is currently being modeled; however, the computer software is modular enough to allow substitution of a new code representative of prospective communication protocols.

  2. Frequency Response Analysis Tool

    SciTech Connect

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
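
    The frequency response measure referred to above is, in essence, the change in a balancing area's net actual interchange divided by the change in interconnection frequency between a pre-event point (Value A) and a post-event settling point (Value B), reported in MW per 0.1 Hz. The sketch below shows that calculation on time-series data; the averaging windows and sign handling are illustrative assumptions, not quotations from BAL-003-1 or from the FRAT.

    ```python
    import numpy as np

    def frequency_response(time_s, freq_hz, net_interchange_mw, t_event,
                           pre_window=(-16.0, 0.0), post_window=(20.0, 52.0)):
        """Value A / Value B style frequency response estimate in MW per 0.1 Hz."""
        time_s = np.asarray(time_s)
        freq_hz = np.asarray(freq_hz)
        nia_mw = np.asarray(net_interchange_mw)

        def window_mean(series, lo, hi):
            mask = (time_s >= t_event + lo) & (time_s <= t_event + hi)
            return series[mask].mean()

        f_a = window_mean(freq_hz, *pre_window)    # pre-disturbance frequency
        f_b = window_mean(freq_hz, *post_window)   # settled post-disturbance frequency
        nia_a = window_mean(nia_mw, *pre_window)
        nia_b = window_mean(nia_mw, *post_window)
        return (nia_a - nia_b) / ((f_a - f_b) / 0.1)
    ```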

  3. IT3F: a web-based tool for functional analysis of transcription factors in plants.

    PubMed

    Bailey, Paul C; Dicks, Jo; Wang, Trevor L; Martin, Cathie

    2008-10-01

    A web-based tool, the Interspecies Transcription Factor Function Finder (IT3F), has been developed to display both evolutionary gene relationships and expression data for plant transcription factors, focussing primarily on the R2R3MYB gene subfamily for proof of concept. The graphical display of information allows users to make direct comparisons between structurally related genes and to identify those genes that are potentially orthologous, thereby assisting with their understanding of gene function. A key feature of the website is the provision of an interrogative phylogenetic tree that allows submission of new sequences corresponding to a transcription factor family or subfamily and maps their relative positions to the products of other genes on an 'existing' tree containing proteins encoded by Arabidopsis and rice genes, along with key proteins encoded by genes from other species that have been characterised functionally. In addition, a feature to select clusters of related sequences has been developed so that more detailed phylogenetic analysis can be performed to highlight potential orthologous and paralogous genes within related clusters. Arabidopsis genes that reside on duplicated regions of the genome are indicated on the tree, providing further information for interpreting gene function. An additional feature of the website allows a selected number of key Arabidopsis and rice microarray experiments to be visualised alongside the tree as a tabulated heat map of expression intensity values. Through this display, it is possible to observe relative expression levels across a whole gene family and the extent to which the expression of closely related genes within subgroups has altered since their ancestral divergence. The website is available at http://jicbio.nbi.ac.uk/IT3F/.

  4. PACSPulse: a web-based DICOM network traffic monitor and analysis tool.

    PubMed

    Nagy, Paul G; Daly, Mark; Warnock, Max; Ehlers, Kevin C; Rehm, Jeff

    2003-01-01

    PACSPulse, an open-source tool, was developed to identify and analyze the performance bottlenecks of picture archiving and communication systems (PACS). PACSPulse provides a graphical Web interface for straightforward analysis of PACS performance on the basis of data acquired by tracking usage by network, server, workstation, type of traffic, and time of day. The PACS archive logs performance and usage data on image traffic being sent to it from the imaging units and study data requested by users. The performance log is sent via file transfer protocol (FTP) to a separate server for analysis. The data are parsed and sent to a database server connected to a Web server. The Web site is used to depict trends in the performance of the entire system to detect signs of degradation. The system was built entirely of open-source components for the operating system, database, charting tool, and Web server. Performance monitoring is an essential tool for analyzing, understanding, and predicting the performance characteristics of a PACS.

  5. New analysis tools and processes for mask repair verification and defect disposition based on AIMS images

    NASA Astrophysics Data System (ADS)

    Richter, Rigo; Poortinga, Eric; Scheruebl, Thomas

    2009-10-01

    Using AIMS™ to qualify repairs of defects on photomasks is an industry standard. AIMS™ images match the lithographic imaging performance without the need for wafer prints. Utilization of this capability by photomask manufacturers has risen due to the increased complexity of layouts incorporating RET and phase shift technologies. Tighter specifications by end-users have pushed AIMS™ analysis to now include CD performance results in addition to the traditional intensity performance results. Discussed is a new Repair Verification system for automated analysis of AIMS™ images. Newly designed user interfaces and algorithms guide users through predefined analysis routines so as to minimize errors. There are two main routines discussed, one allowing multiple reference sites along with a test/defect site within a single image of repeating features. The second routine compares a test/defect measurement image with a reference measurement image. Three evaluation methods possible with the compared images are discussed in the context of providing thorough analysis capability. This paper highlights new functionality for AIMS™ analysis. Using structured analysis processes and innovative analysis tools leads to a highly efficient and more reliable result reporting of repair verification analysis.

  6. Testing of reliability - Analysis tools

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1989-01-01

    An outline is presented of issues raised in verifying the accuracy of reliability analysis tools. State-of-the-art reliability analysis tools implement various decomposition, aggregation, and estimation techniques to compute the reliability of a diversity of complex fault-tolerant computer systems. However, no formal methodology has been formulated for validating the reliability estimates produced by these tools. The author presents three stages of testing that can be performed on most reliability analysis tools to effectively increase confidence in a tool. These testing stages were applied to the SURE (Semi-Markov Unreliability Range Evaluator) reliability analysis tool, and the results of the testing are discussed.

  7. Climate Data Analysis Tools

    SciTech Connect

    2009-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages, thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general-purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications), and graphical user interfaces (GUIs). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System, or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System, or VCS).
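
    A minimal sketch of how these subsystems are typically combined from Python is shown below. It assumes CDAT is installed and that a local NetCDF file named tas.nc contains a variable tas; the file and variable names are placeholders, not part of the record above.

    ```python
    import cdms2   # Climate Data Management System: gridded data access
    import MV2     # masked-variable numerics built on Numerical Python
    import vcs     # Visualization and Control System

    f = cdms2.open('tas.nc')                    # open a NetCDF/gridded dataset
    tas = f('tas')                              # read the variable as a transient variable
    anomaly = tas - MV2.average(tas, axis=0)    # subtract the time mean at each grid point
    canvas = vcs.init()
    canvas.plot(anomaly[0])                     # quick-look plot of the first time slice
    f.close()
    ```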

  8. Development and description of a decision analysis based decision support tool for stroke prevention in atrial fibrillation

    PubMed Central

    Thomson, R.; Robinson, A.; Greenaway, J.; Lowe, P.

    2002-01-01

    Background: There is an increasing move towards clinical decision making that engages the patient, which has led to the development and use of decision aids to support better decisions. The treatment of patients in atrial fibrillation (AF) with warfarin to prevent stroke is a decision that is sensitive to patient preferences as shown by a previous decision analysis. Aim: To develop a computerised decision support tool, building upon a previous decision analysis, which would engage individual patient preferences in reaching a shared decision on whether to take warfarin to prevent stroke. Methods: The development process had two main phases: (1) the development phase which employed focus groups and repeated interviews with GPs/practice nurses and patients alongside an iterative development of a computerised tool; (2) the training and testing phase in which GPs and practice nurses underwent training in the use of the tool, including the use of simulated patients. The tool was then used in a feasibility study in a small number of patients with AF to inform the design of a subsequent randomised controlled trial. Results: The prototype tool had three components: (1) derivation of an individual patient's values for relevant health states using a standard gamble; (2) presentation/discussion of a patient's risks of stroke using the Framingham equation and the benefits/risks of warfarin from a systematic literature review; and (3) decision making component incorporating the outcome of a Markov decision analysis model. Older patients could be taken through the decision analysis based computerised tool, and patients and clinicians welcomed information on risks and benefits of treatments. The tool required time and training to use. Patients' decisions in the feasibility phase did not necessarily coincide with the output of the decision analysis model, but decision conflict appeared to be reduced and both patients and GPs were satisfied with the process. Conclusions: It is
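
    To make the structure of such a decision analysis concrete, the toy sketch below runs a two-alternative Markov cohort model in which the health-state utilities would come from each patient's standard gamble. All probabilities and utilities here are invented placeholders for illustration only; they are not the values, states, or model used in the actual tool or in the underlying decision analysis.

    ```python
    STATES = ["well", "post_stroke", "dead"]

    def cohort_qalys(p_stroke, p_fatal_bleed, utilities, years=10, p_death_other=0.05):
        """Simple Markov cohort run: track state occupancy year by year and
        accumulate quality-adjusted life years (QALYs)."""
        transition = {
            "well": {"well": 1 - p_stroke - p_fatal_bleed - p_death_other,
                     "post_stroke": p_stroke,
                     "dead": p_fatal_bleed + p_death_other},
            "post_stroke": {"well": 0.0, "post_stroke": 0.9, "dead": 0.1},
            "dead": {"well": 0.0, "post_stroke": 0.0, "dead": 1.0},
        }
        occupancy = {"well": 1.0, "post_stroke": 0.0, "dead": 0.0}
        qalys = 0.0
        for _ in range(years):
            occupancy = {s: sum(occupancy[f] * transition[f][s] for f in STATES) for s in STATES}
            qalys += sum(occupancy[s] * utilities[s] for s in STATES)
        return qalys

    # utilities for each health state would be elicited with the standard gamble
    utilities = {"well": 1.0, "post_stroke": 0.6, "dead": 0.0}
    on_warfarin = cohort_qalys(p_stroke=0.02, p_fatal_bleed=0.010, utilities=utilities)
    no_warfarin = cohort_qalys(p_stroke=0.05, p_fatal_bleed=0.002, utilities=utilities)
    print(f"QALYs with warfarin: {on_warfarin:.2f}, without: {no_warfarin:.2f}")
    ```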

  9. DaGO-Fun: tool for Gene Ontology-based functional analysis using term information content measures.

    PubMed

    Mazandu, Gaston K; Mulder, Nicola J

    2013-09-25

    The use of Gene Ontology (GO) data in protein analyses has largely contributed to the improved outcomes of these analyses. Several GO semantic similarity measures have been proposed in recent years and provide tools that allow the integration of biological knowledge embedded in the GO structure into different biological analyses. There is a need for a unified tool that provides the scientific community with the opportunity to explore these different GO similarity measure approaches and their biological applications. We have developed DaGO-Fun, an online tool available at http://web.cbio.uct.ac.za/ITGOM, which incorporates many different GO similarity measures for exploring, analyzing and comparing GO terms and proteins within the context of GO. It uses GO data and UniProt proteins with their GO annotations as provided by the Gene Ontology Annotation (GOA) project to precompute GO term information content (IC), enabling rapid response to user queries. The DaGO-Fun online tool presents the advantage of integrating all the relevant IC-based GO similarity measures, including topology- and annotation-based approaches, to facilitate effective exploration of these measures, thus enabling users to choose the most relevant approach for their application. Furthermore, this tool includes several biological applications related to GO semantic similarity scores, including the retrieval of genes based on their GO annotations, the clustering of functionally related genes within a set, and term enrichment analysis.
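
    For readers unfamiliar with information-content (IC) measures: the IC of a GO term is the negative log of its annotation frequency, and Resnik-style similarity between two terms is the IC of their most informative common ancestor. The sketch below is a generic illustration of that idea with made-up counts; it is not DaGO-Fun's code and ignores the precomputation and ontology traversal the tool performs.

    ```python
    import math

    def information_content(term, annotation_counts, total_annotations):
        """IC(t) = -log p(t), with p(t) the fraction of annotations mapped to t
        (counts assumed to be aggregated over the term's descendants)."""
        return -math.log(annotation_counts[term] / total_annotations)

    def resnik_similarity(common_ancestors, annotation_counts, total_annotations):
        """Resnik similarity: IC of the most informative common ancestor (MICA)."""
        if not common_ancestors:
            return 0.0
        return max(information_content(a, annotation_counts, total_annotations)
                   for a in common_ancestors)

    # toy example with invented aggregate annotation counts
    counts = {"GO:root": 1000, "GO:A": 120, "GO:B": 40, "GO:C": 15}
    print(resnik_similarity({"GO:root", "GO:A"}, counts, total_annotations=1000))
    ```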

  10. A suite of MATLAB-based computational tools for automated analysis of COPAS Biosort data

    PubMed Central

    Morton, Elizabeth; Lamitina, Todd

    2010-01-01

    Complex Object Parametric Analyzer and Sorter (COPAS) devices are large-object, fluorescence-capable flow cytometers used for high-throughput analysis of live model organisms, including Drosophila melanogaster, Caenorhabditis elegans, and zebrafish. The COPAS is especially useful in C. elegans high-throughput genome-wide RNA interference (RNAi) screens that utilize fluorescent reporters. However, analysis of data from such screens is relatively labor-intensive and time-consuming. Currently, there are no computational tools available to facilitate high-throughput analysis of COPAS data. We used MATLAB to develop algorithms (COPAquant, COPAmulti, and COPAcompare) to analyze different types of COPAS data. COPAquant reads single-sample files, filters and extracts values and value ratios for each file, and then returns a summary of the data. COPAmulti reads 96-well autosampling files generated with the ReFLX adapter, performs sample filtering, graphs features across both wells and plates, performs some common statistical measures for hit identification, and outputs results in graphical formats. COPAcompare performs a correlation analysis between replicate 96-well plates. For many parameters, thresholds may be defined through a simple graphical user interface (GUI), allowing our algorithms to meet a variety of screening applications. In a screen for regulators of stress-inducible GFP expression, COPAquant dramatically accelerated data analysis and allowed us to rapidly move from raw data to hit identification. Because the COPAS file structure is standardized and our MATLAB code is freely available, our algorithms should be extremely useful for analysis of COPAS data from multiple platforms and organisms. The MATLAB code is freely available at our web site (www.med.upenn.edu/lamitinalab/downloads.shtml). PMID:20569218
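
    Filtering events and computing size-normalised fluorescence ratios, as COPAquant does for single-sample files, can be sketched generically in Python as below. The column names (TOF, EXT, Green), the tab-separated export format, and the gating thresholds are assumptions for illustration; the published tools are MATLAB code and may read different fields.

    ```python
    import pandas as pd

    def summarize_copas(path, tof_min=100, ext_min=50):
        """Gate out small debris by size (TOF) and extinction (EXT), then report
        the per-object green-fluorescence-to-size ratio."""
        events = pd.read_csv(path, sep="\t")
        gated = events[(events["TOF"] >= tof_min) & (events["EXT"] >= ext_min)].copy()
        gated["green_per_tof"] = gated["Green"] / gated["TOF"]   # size-normalised GFP signal
        return {
            "n_objects": len(gated),
            "mean_ratio": gated["green_per_tof"].mean(),
            "median_ratio": gated["green_per_tof"].median(),
        }
    ```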

  11. Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, mesoscale convective systems, etc. Basic functions include selection of area of

  12. Automated cell analysis tool for a genome-wide RNAi screen with support vector machine based supervised learning

    NASA Astrophysics Data System (ADS)

    Remmele, Steffen; Ritzerfeld, Julia; Nickel, Walter; Hesser, Jürgen

    2011-03-01

    RNAi-based high-throughput microscopy screens have become an important tool in biological sciences in order to decrypt mostly unknown biological functions of human genes. However, manual analysis is impossible for such screens since the amount of image data sets can often be in the hundred thousands. Reliable automated tools are thus required to analyse the fluorescence microscopy image data sets usually containing two or more reaction channels. The herein presented image analysis tool is designed to analyse an RNAi screen investigating the intracellular trafficking and targeting of acylated Src kinases. In this specific screen, a data set consists of three reaction channels and the investigated cells can appear in different phenotypes. The main issue of the image processing task is an automatic cell segmentation which has to be robust and accurate for all different phenotypes, followed by a phenotype classification. The cell segmentation is done in two steps by segmenting the cell nuclei first and then using a classifier-enhanced region growing on the basis of the cell nuclei to segment the cells. The classification of the cells is realized by a support vector machine which has to be trained manually using supervised learning. Furthermore, the tool is brightness invariant, allowing different staining quality, and it provides a quality control that copes with typical defects during preparation and acquisition. A first version of the tool has already been successfully applied for an RNAi screen containing three hundred thousand image data sets, and the SVM-extended version is designed for additional screens.
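
    The two stages described, nucleus-seeded segmentation followed by supervised phenotype classification, can be sketched with standard libraries as below. This is a generic scikit-image/scikit-learn recipe under assumed inputs (a nuclear-stain image and per-cell feature vectors with manual labels), not the authors' pipeline, which uses classifier-enhanced region growing.

    ```python
    from skimage import filters, measure, morphology
    from sklearn.svm import SVC

    def segment_nuclei(nuclear_channel, min_area=50):
        """Step 1: threshold the nuclear channel and label connected components,
        giving one seed region per cell nucleus."""
        mask = nuclear_channel > filters.threshold_otsu(nuclear_channel)
        mask = morphology.remove_small_objects(mask, min_size=min_area)
        return measure.label(mask)

    def classify_phenotypes(train_features, train_labels, unseen_features):
        """Step 2: supervised phenotype classification with a support vector machine
        trained on manually annotated cells."""
        clf = SVC(kernel="rbf", C=1.0, gamma="scale")
        clf.fit(train_features, train_labels)
        return clf.predict(unseen_features)
    ```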

  13. FluxCTTX: A LIMS-based tool for management and analysis of cytotoxicity assays data.

    PubMed

    Faria-Campos, Alessandra C; Balottin, Luciene B; Zuin, Gianlucca; Garcia, Vinicius; Batista, Paulo H S; Granjeiro, José M; Campos, Sérgio V A

    2015-01-01

    Cytotoxicity assays have been used by researchers to screen for cytotoxicity in compound libraries. Researchers can either look for cytotoxic compounds or screen "hits" from initial high-throughput drug screens for unwanted cytotoxic effects before investing in their development as a pharmaceutical. These assays may be used as an alternative to animal experimentation and are becoming increasingly important in modern laboratories. However, the execution of these assays in large scale and different laboratories requires, among other things, the management of protocols, reagents, cell lines used as well as the data produced, which can be a challenge. The management of all this information is greatly improved by the utilization of computational tools to save time and guarantee quality. However, a tool that performs this task designed specifically for cytotoxicity assays is not yet available. In this work, we have used a workflow based LIMS -- the Flux system -- and the Together Workflow Editor as a framework to develop FluxCTTX, a tool for management of data from cytotoxicity assays performed at different laboratories. The main work is the development of a workflow, which represents all stages of the assay and has been developed and uploaded in Flux. This workflow models the activities of cytotoxicity assays performed as described in the OECD 129 Guidance Document. FluxCTTX presents a solution for the management of the data produced by cytotoxicity assays performed at Interlaboratory comparisons. Its adoption will contribute to guarantee the quality of activities in the process of cytotoxicity tests and enforce the use of Good Laboratory Practices (GLP). Furthermore, the workflow developed is complete and can be adapted to other contexts and different tests for management of other types of data.

  14. FluxCTTX: A LIMS-based tool for management and analysis of cytotoxicity assays data

    PubMed Central

    2015-01-01

    Background Cytotoxicity assays have been used by researchers to screen for cytotoxicity in compound libraries. Researchers can either look for cytotoxic compounds or screen "hits" from initial high-throughput drug screens for unwanted cytotoxic effects before investing in their development as a pharmaceutical. These assays may be used as an alternative to animal experimentation and are becoming increasingly important in modern laboratories. However, the execution of these assays in large scale and different laboratories requires, among other things, the management of protocols, reagents, cell lines used as well as the data produced, which can be a challenge. The management of all this information is greatly improved by the utilization of computational tools to save time and guarantee quality. However, a tool that performs this task designed specifically for cytotoxicity assays is not yet available. Results In this work, we have used a workflow based LIMS -- the Flux system -- and the Together Workflow Editor as a framework to develop FluxCTTX, a tool for management of data from cytotoxicity assays performed at different laboratories. The main work is the development of a workflow, which represents all stages of the assay and has been developed and uploaded in Flux. This workflow models the activities of cytotoxicity assays performed as described in the OECD 129 Guidance Document. Conclusions FluxCTTX presents a solution for the management of the data produced by cytotoxicity assays performed at Interlaboratory comparisons. Its adoption will contribute to guarantee the quality of activities in the process of cytotoxicity tests and enforce the use of Good Laboratory Practices (GLP). Furthermore, the workflow developed is complete and can be adapted to other contexts and different tests for management of other types of data. PMID:26696462

  15. A Tool To Support Failure Mode And Effects Analysis Based On Causal Modelling And Reasoning

    NASA Astrophysics Data System (ADS)

    Underwood, W. E.; Laib, S. L.

    1987-05-01

    A prototype knowledge-based system has been developed that supports Failure Mode & Effects Analysis (FMEA). The knowledge base consists of causal models of components and a representation for coupling these components into assemblies and systems. The causal models are qualitative models. They allow reasoning as to whether variables are increasing, decreasing or steady. The analysis strategies used by the prototype allow it to determine the effects of failure modes on the function of the part, the failure effect on the assembly the part is contained in, and the effect on the subsystem containing the assembly.

  16. GeoPCA: a new tool for multivariate analysis of dihedral angles based on principal component geodesics

    PubMed Central

    Sargsyan, Karen; Wright, Jon; Lim, Carmay

    2012-01-01

    The GeoPCA package is the first tool developed for multivariate analysis of dihedral angles based on principal component geodesics. Principal component geodesic analysis provides a natural generalization of principal component analysis for data distributed in non-Euclidean space, as in the case of angular data. GeoPCA presents projection of angular data on a sphere composed of the first two principal component geodesics, allowing clustering based on dihedral angles as opposed to Cartesian coordinates. It also provides a measure of the similarity between input structures based on only dihedral angles, in analogy to the root-mean-square deviation of atoms based on Cartesian coordinates. The principal component geodesic approach is shown herein to reproduce clusters of nucleotides observed in an η–θ plot. GeoPCA can be accessed via http://pca.limlab.ibms.sinica.edu.tw. PMID:22139913

  17. Exploring JavaScript and ROOT technologies to create Web-based ATLAS analysis and monitoring tools

    NASA Astrophysics Data System (ADS)

    Sánchez Pineda, A.

    2015-12-01

    We explore the potential of current web applications to create online interfaces that allow the visualization, interaction and real cut-based physics analysis and monitoring of processes through a web browser. The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based analysis in a browser, reading and using real data and official Monte Carlo simulations stored in ATLAS computing facilities. Several tools are considered: ROOT, JavaScript and HTML. Our study case is the current cut-based H → ZZ → llqq analysis of the ATLAS experiment. Preliminary but satisfactory results have been obtained online.

  18. What can management theories offer evidence-based practice? A comparative analysis of measurement tools for organisational context.

    PubMed

    French, Beverley; Thomas, Lois H; Baker, Paula; Burton, Christopher R; Pennington, Lindsay; Roddam, Hazel

    2009-05-19

    Given the current emphasis on networks as vehicles for innovation and change in health service delivery, the ability to conceptualize and measure organisational enablers for the social construction of knowledge merits attention. This study aimed to develop a composite tool to measure the organisational context for evidence-based practice (EBP) in healthcare. A structured search of the major healthcare and management databases for measurement tools from four domains: research utilisation (RU), research activity (RA), knowledge management (KM), and organisational learning (OL). Included studies were reports of the development or use of measurement tools that included organisational factors. Tools were appraised for face and content validity, plus development and testing methods. Measurement tool items were extracted, merged across the four domains, and categorised within a constructed framework describing the absorptive and receptive capacities of organisations. Thirty measurement tools were identified and appraised. Eighteen tools from the four domains were selected for item extraction and analysis. The constructed framework consists of seven categories relating to three core organisational attributes of vision, leadership, and a learning culture, and four stages of knowledge need, acquisition of new knowledge, knowledge sharing, and knowledge use. Measurement tools from RA or RU domains had more items relating to the categories of leadership, and acquisition of new knowledge; while tools from KM or learning organisation domains had more items relating to vision, learning culture, knowledge need, and knowledge sharing. There was equal emphasis on knowledge use in the different domains. If the translation of evidence into knowledge is viewed as socially mediated, tools to measure the organisational context of EBP in healthcare could be enhanced by consideration of related concepts from the organisational and management sciences. Comparison of measurement tools across

  19. What can management theories offer evidence-based practice? A comparative analysis of measurement tools for organisational context

    PubMed Central

    French, Beverley; Thomas, Lois H; Baker, Paula; Burton, Christopher R; Pennington, Lindsay; Roddam, Hazel

    2009-01-01

    Background Given the current emphasis on networks as vehicles for innovation and change in health service delivery, the ability to conceptualise and measure organisational enablers for the social construction of knowledge merits attention. This study aimed to develop a composite tool to measure the organisational context for evidence-based practice (EBP) in healthcare. Methods A structured search of the major healthcare and management databases for measurement tools from four domains: research utilisation (RU), research activity (RA), knowledge management (KM), and organisational learning (OL). Included studies were reports of the development or use of measurement tools that included organisational factors. Tools were appraised for face and content validity, plus development and testing methods. Measurement tool items were extracted, merged across the four domains, and categorised within a constructed framework describing the absorptive and receptive capacities of organisations. Results Thirty measurement tools were identified and appraised. Eighteen tools from the four domains were selected for item extraction and analysis. The constructed framework consists of seven categories relating to three core organisational attributes of vision, leadership, and a learning culture, and four stages of knowledge need, acquisition of new knowledge, knowledge sharing, and knowledge use. Measurement tools from RA or RU domains had more items relating to the categories of leadership, and acquisition of new knowledge; while tools from KM or learning organisation domains had more items relating to vision, learning culture, knowledge need, and knowledge sharing. There was equal emphasis on knowledge use in the different domains. Conclusion If the translation of evidence into knowledge is viewed as socially mediated, tools to measure the organisational context of EBP in healthcare could be enhanced by consideration of related concepts from the organisational and management sciences

  20. Java Radar Analysis Tool

    NASA Technical Reports Server (NTRS)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  1. Geo-Cultural Analysis Tool (trademark) (GCAT)

    DTIC Science & Technology

    2008-03-10

    Outline • Introduction • Why GCAT? • Geo-Cultural Analysis (Theory) • Geo-Cultural Ontology (Method) • Geo-Cultural Analysis Tool (Application...in the urban environment at any given time/day? • Approach: Use ontology enterprise system to model aggregate routine/ritual behavior using Geo...Cultural Analysis method. • Objective: Geo-Cultural Analysis Tool – Ontology enterprise-based system – Web portal for prototype application – GIS-Based

  2. Analysis of knowledge-based expert systems as tools for construction design

    NASA Astrophysics Data System (ADS)

    Cole, Arthur N.

    1991-03-01

    Because construction costs are continuously rising, Congress mandated that those within the respective branches of military service who are responsible for planning and executing construction programs develop policies and procedures that ensure that individual projects are designed, bid, and constructed as rapidly as possible. This requires an approach that demands maximum efficiency from the design process. Reviews are necessary to ensure that designs meet all requirements, but the reviews themselves must be conducted in the least amount of time so as to preclude delays. Design tools that increase efficiency include knowledge-based expert systems, which are interactive computer programs that incorporate judgement, experience, rules of thumb, and other expertise so as to provide knowledgeable advice about a specific domain. They mimic the thought process employed by a human expert in solving a problem.

  3. Discourse-Based Methods across Texts and Semiotic Modes: Three Tools for Micro-Rhetorical Analysis

    ERIC Educational Resources Information Center

    Oddo, John

    2013-01-01

    As the scope of rhetorical inquiry broadens to cover intersemiotic and intertextual phenomena, scholars are increasingly in need of new, defensible analytic procedures. Several scholars have suggested that methods of discourse analysis could enhance rhetorical criticism. Here, I introduce a discourse-based method that is empirical, delicate, and…

  5. Improving health professional's knowledge of hepatitis B using cartoon based learning tools: a retrospective analysis of pre and post tests.

    PubMed

    Sim, Moira G; McEvoy, Ashleigh C; Wain, Toni D; Khong, Eric L

    2014-11-21

    Hepatitis B serology is complex, and a lack of knowledge in its interpretation contributes to the inadequate levels of screening and referral for highly effective hepatitis antiviral treatments. This knowledge gap needs to be addressed so that current and future healthcare professionals are more confident in the detection and assessment of hepatitis B, to improve the uptake of treatment and reduce long-term complications from the disease. Cartoons have been used effectively as a teaching tool in other settings and were considered as a potentially useful teaching aid in explaining hepatitis B serology. This study examines the impact of cartoons in improving healthcare professionals' knowledge. A cartoon based learning tool designed to simplify the complexities of hepatitis B serology was developed as part of an online learning program for medical practitioners, nurses and students in these professions. A retrospective analysis was carried out of pre and post online test results. An average improvement of 96% in correct answers to case study questions on hepatitis B serology was found across all ten questions following the use of the online cartoon based learning tool. The data indicate a significant improvement in participants' knowledge of hepatitis B serology from pre-test to post-test immediately following the online cartoon based learning tool. However, further research is required to measure its long-term impact.

  6. Shot Planning and Analysis Tools

    SciTech Connect

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project comprises two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  7. MGAS: a powerful tool for multivariate gene-based genome-wide association analysis.

    PubMed

    Van der Sluis, Sophie; Dolan, Conor V; Li, Jiang; Song, Youqiang; Sham, Pak; Posthuma, Danielle; Li, Miao-Xin

    2015-04-01

    Standard genome-wide association studies, testing the association between one phenotype and a large number of single nucleotide polymorphisms (SNPs), are limited in two ways: (i) traits are often multivariate, and analysis of composite scores entails loss in statistical power; and (ii) gene-based analyses may be preferred, e.g. to decrease the multiple testing problem. Here we present a new method, multivariate gene-based association test by extended Simes procedure (MGAS), that allows gene-based testing of multivariate phenotypes in unrelated individuals. Through extensive simulation, we show that under most trait-generating genotype-phenotype models MGAS has superior statistical power to detect associated genes compared with gene-based analyses of univariate phenotypic composite scores (i.e. GATES, multiple regression), and multivariate analysis of variance (MANOVA). Re-analysis of metabolic data revealed 32 False Discovery Rate controlled genome-wide significant genes, and 12 regions harboring multiple genes; of these 44 regions, 30 were not reported in the original analysis. MGAS allows researchers to conduct their multivariate gene-based analyses efficiently, and without the loss of power that is often associated with an incorrectly specified genotype-phenotype model. MGAS is freely available in KGG v3.0 (http://statgenpro.psychiatry.hku.hk/limx/kgg/download.php). Access to the metabolic dataset can be requested at dbGaP (https://dbgap.ncbi.nlm.nih.gov/). The R-simulation code is available from http://ctglab.nl/people/sophie_van_der_sluis. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
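
    MGAS builds on Simes-type combination of the SNP-level p-values within a gene. For orientation, the sketch below implements the plain Simes combination; GATES and MGAS refine it by replacing the test counts with "effective numbers" of tests that account for linkage disequilibrium among SNPs and correlation among phenotypes, which this illustration omits.

    ```python
    import numpy as np

    def simes_gene_p(snp_pvalues):
        """Plain Simes combination: p_gene = min_j ( m * p_(j) / j ), where p_(j)
        is the j-th smallest of the m SNP-level p-values in the gene."""
        p_sorted = np.sort(np.asarray(snp_pvalues, dtype=float))
        m = len(p_sorted)
        ranks = np.arange(1, m + 1)
        return float(np.min(m * p_sorted / ranks))

    print(simes_gene_p([0.002, 0.03, 0.2, 0.6]))   # toy example -> 0.008
    ```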

  8. G-SESAME: web tools for GO-term-based gene similarity analysis and knowledge discovery

    PubMed Central

    Du, Zhidian; Li, Lin; Chen, Chin-Fu; Yu, Philip S.; Wang, James Z.

    2009-01-01

    We have developed a set of online tools for measuring the semantic similarities of Gene Ontology (GO) terms and the functional similarities of gene products, and for further discovering biomedical knowledge from the GO database. The tools have been used for about 6.9 million times by 417 institutions from 43 countries since October 2006. The online tools are available at: http://bioinformatics.clemson.edu/G-SESAME. PMID:19491312

  9. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
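
    Two of the basic relations such a bolted-joint analysis steps through are the short-form torque-preload equation and the margin-of-safety check. The sketch below shows them generically; the nut factor, factor of safety, and example numbers are typical textbook assumptions, not values from comBAT.

    ```python
    def bolt_preload(torque_nm, diameter_m, nut_factor=0.2):
        """Short-form torque equation T = K * D * F: estimated preload (N) from
        installation torque, nominal diameter, and an assumed nut factor K."""
        return torque_nm / (nut_factor * diameter_m)

    def margin_of_safety(allowable_load_n, applied_load_n, factor_of_safety=1.4):
        """MS = allowable / (applied * FS) - 1; a positive margin passes."""
        return allowable_load_n / (applied_load_n * factor_of_safety) - 1.0

    preload = bolt_preload(torque_nm=20.0, diameter_m=0.006)          # ~16.7 kN for an M6 bolt
    print(preload, margin_of_safety(allowable_load_n=25_000, applied_load_n=preload))
    ```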

  10. Analysis of Learning Tools in the study of Developmental of Interactive Multimedia Based Physic Learning Charged in Problem Solving

    NASA Astrophysics Data System (ADS)

    Manurung, Sondang; Demonta Pangabean, Deo

    2017-05-01

    The main purpose of this study is to produce a needs analysis, a literature review, and learning tools for the development of interactive, multimedia-based, problem-solving-oriented physics learning intended to improve the thinking ability of prospective physics students. The first-year result of the study is a draft based on a needs analysis of conditions in the field, the existing learning situation, and literature studies, followed by the design of devices and instruments and the development of the media. The result of the second study is a physics learning device, an interactive multimedia package oriented to problem solving, in the form of textbooks and scientific publications. The learning models were first tested on a limited sample and then evaluated and revised. The product of the research also has economic value because: (1) the virtual laboratory offered by this research provides an alternative to purchasing expensive physics laboratory equipment; (2) it addresses the shortage of physics teachers in remote areas, since the learning tool can be accessed both offline and online; and (3) it reduces materials and consumables, since tutorials can be done online. The first-year target of the research is a physics learning storyboard delivered on CD (compact disc) and in web form, and interactive multimedia for the concept of the kinetic theory of gases. This draft is based on a needs analysis of conditions in the field, the existing learning conditions, and literature studies; the learning models were first tested on a limited sample and then evaluated and revised.

  11. SVAw - a web-based application tool for automated surrogate variable analysis of gene expression studies.

    PubMed

    Pirooznia, Mehdi; Seifuddin, Fayaz; Goes, Fernando S; Leek, Jeffrey T; Zandi, Peter P

    2013-03-11

    Surrogate variable analysis (SVA) is a powerful method to identify, estimate, and utilize the components of gene expression heterogeneity due to unknown and/or unmeasured technical, genetic, environmental, or demographic factors. These sources of heterogeneity are common in gene expression studies, and failing to incorporate them into the analysis can obscure results. Using SVA increases the biological accuracy and reproducibility of gene expression studies by identifying these sources of heterogeneity and correctly accounting for them in the analysis. Here we have developed a web application called SVAw (Surrogate Variable Analysis Web app) that provides a user-friendly interface for SVA analyses of genome-wide expression studies. The software has been developed based on the open-source Bioconductor SVA package. In our software, we have extended the SVA program functionality in three aspects: (i) SVAw performs a fully automated and user-friendly analysis workflow; (ii) it calculates probe/gene statistics for both pre- and post-SVA analysis and provides a table of results for the regression of gene expression on the primary variable of interest before and after correcting for surrogate variables; and (iii) it generates a comprehensive report file, including a graphical comparison of the outcome for the user. SVAw is a freely accessible web server solution for the surrogate variable analysis of high-throughput datasets and facilitates removing all unwanted and unknown sources of variation. It is freely available for use at http://psychiatry.igm.jhmi.edu/sva. The executable packages for both web and standalone application and the instruction for installation can be downloaded from our web site.

  12. ape 3.0: New tools for distance-based phylogenetics and evolutionary analysis in R.

    PubMed

    Popescu, Andrei-Alin; Huber, Katharina T; Paradis, Emmanuel

    2012-06-01

    Reflecting its continuously increasing versatility and functionality, the popularity of the ape (analysis of phylogenetics and evolution) software package has grown steadily over the years. Among its features, it has a strong distance-based component allowing the user to compute distances from aligned DNA sequences based on most methods from the literature and also build phylogenetic trees from them. However, even data generated with modern genomic approaches can fail to give rise to sufficiently reliable distance estimates. One way to overcome this problem is to exclude such estimates from data analysis giving rise to an incomplete distance data set (as opposed to a complete one). So far their analysis has been out of reach for ape. To remedy this, we have incorporated into ape several methods from the literature for phylogenetic inference from incomplete distance matrices. In addition, we have also extended ape's repertoire for phylogenetic inference from complete distances, added a new object class to efficiently encode sets of splits of taxa, and extended the functionality of some of its existing functions. ape is distributed through the Comprehensive R Archive Network: http://cran.r-project.org/web/packages/ape/index.html Further information may be found at http://ape.mpl.ird.fr/pegas/

  13. Developing web-based data analysis tools for precision farming using R and Shiny

    NASA Astrophysics Data System (ADS)

    Jahanshiri, Ebrahim; Mohd Shariff, Abdul Rashid

    2014-06-01

Technologies that are set to increase the productivity of agricultural practices require more and more data. At the same time, farming data are becoming increasingly cheap to collect and maintain. The bulk of the data collected by sensors and sampling needs to be analysed in an efficient and transparent manner. Web technologies have long been used to develop applications that assist farmers and managers. Until recently, however, analysing data in an online environment has not been an easy task, especially in the eyes of data analysts. This barrier is now overcome by the availability of new application programming interfaces that provide real-time, web-based data analysis. In this paper, the development of a prototype web-based application for data analysis using new facilities in the R statistical package and its web development framework, Shiny, is explored. The pros and cons of this type of data analysis environment for precision farming are enumerated, and future directions in web application development for agricultural data are discussed.

  14. FSSC Science Tools: Pulsar Analysis

    NASA Technical Reports Server (NTRS)

    Thompson, Dave

    2010-01-01

    This slide presentation reviews the typical pulsar analysis, giving tips for screening of the data, the use of time series analysis, and utility tools. Specific information about analyzing Vela data is reviewed.

  15. HC StratoMineR: A Web-Based Tool for the Rapid Analysis of High-Content Datasets.

    PubMed

    Omta, Wienand A; van Heesbeen, Roy G; Pagliero, Romina J; van der Velden, Lieke M; Lelieveld, Daphne; Nellen, Mehdi; Kramer, Maik; Yeong, Marley; Saeidi, Amir M; Medema, Rene H; Spruit, Marco; Brinkkemper, Sjaak; Klumperman, Judith; Egan, David A

    2016-10-01

High-content screening (HCS) can generate large multidimensional datasets that, when aligned with the appropriate data mining tools, can yield valuable insights into the mechanism of action of bioactive molecules. However, easy-to-use data mining tools are not widely available, with the result that these datasets are frequently underutilized. Here, we present HC StratoMineR, a web-based tool for high-content data analysis. It is a decision-supportive platform that guides even non-expert users through a high-content data analysis workflow. HC StratoMineR is built using MySQL for storage and querying, PHP (PHP: Hypertext Preprocessor) as the main programming language, and jQuery for additional user interface functionality. R is used for statistical calculations, logic and data visualizations. Furthermore, C++ and graphics-processing-unit power are embedded in R through the Rcpp and rpud libraries for operations that are computationally highly intensive. We show that we can use HC StratoMineR for the analysis of multivariate data from a high-content siRNA knock-down screen and a small-molecule screen. It can be used to rapidly filter out undesirable data; to select relevant data; and to perform quality control, data reduction, data exploration, morphological hit picking, and data clustering. Our results demonstrate that HC StratoMineR can be used to functionally categorize HCS hits and, thus, provide valuable information for hit prioritization.

  16. Sight Application Analysis Tool

    SciTech Connect

    Bronevetsky, G.

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  17. Integrated Design and Analysis Tools for Software-Based Control Systems

    DTIC Science & Technology

    2005-07-01

The first Giotto paper was published in the ACM Workshop for Languages, Compilers, and Tools for Embedded Systems: Giotto is a principled, tool… Models for System Design, Kluwer, 2004. Lee, E.A. and Neuendorffer, S. Classes and Subclasses in Actor-Oriented Design. Invited paper

  18. A-DaGO-Fun: an adaptable Gene Ontology semantic similarity-based functional analysis tool.

    PubMed

    Mazandu, Gaston K; Chimusa, Emile R; Mbiyavanga, Mamana; Mulder, Nicola J

    2016-02-01

    Gene Ontology (GO) semantic similarity measures are being used for biological knowledge discovery based on GO annotations by integrating biological information contained in the GO structure into data analyses. To empower users to quickly compute, manipulate and explore these measures, we introduce A-DaGO-Fun (ADaptable Gene Ontology semantic similarity-based Functional analysis). It is a portable software package integrating all known GO information content-based semantic similarity measures and relevant biological applications associated with these measures. A-DaGO-Fun has the advantage not only of handling datasets from the current high-throughput genome-wide applications, but also allowing users to choose the most relevant semantic similarity approach for their biological applications and to adapt a given module to their needs. A-DaGO-Fun is freely available to the research community at http://web.cbio.uct.ac.za/ITGOM/adagofun. It is implemented in Linux using Python under free software (GNU General Public Licence). gmazandu@cbio.uct.ac.za or Nicola.Mulder@uct.ac.za Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. A-DaGO-Fun: an adaptable Gene Ontology semantic similarity-based functional analysis tool

    PubMed Central

    Mazandu, Gaston K.; Chimusa, Emile R.; Mbiyavanga, Mamana; Mulder, Nicola J.

    2016-01-01

    Summary: Gene Ontology (GO) semantic similarity measures are being used for biological knowledge discovery based on GO annotations by integrating biological information contained in the GO structure into data analyses. To empower users to quickly compute, manipulate and explore these measures, we introduce A-DaGO-Fun (ADaptable Gene Ontology semantic similarity-based Functional analysis). It is a portable software package integrating all known GO information content-based semantic similarity measures and relevant biological applications associated with these measures. A-DaGO-Fun has the advantage not only of handling datasets from the current high-throughput genome-wide applications, but also allowing users to choose the most relevant semantic similarity approach for their biological applications and to adapt a given module to their needs. Availability and implementation: A-DaGO-Fun is freely available to the research community at http://web.cbio.uct.ac.za/ITGOM/adagofun. It is implemented in Linux using Python under free software (GNU General Public Licence). Contact: gmazandu@cbio.uct.ac.za or Nicola.Mulder@uct.ac.za Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26476781

  20. SYNCSA--R tool for analysis of metacommunities based on functional traits and phylogeny of the community components.

    PubMed

    Debastiani, Vanderlei J; Pillar, Valério D

    2012-08-01

SYNCSA is an R package for the analysis of metacommunities based on functional traits and phylogeny of the community components. It offers tools to calculate several matrix correlations that express trait-convergence assembly patterns, trait-divergence assembly patterns and phylogenetic signal in functional traits at the species pool level and at the metacommunity level. SYNCSA is a package for the R environment, under a GPL-2 open-source license and freely available on the official CRAN web server for R (http://cran.r-project.org). vanderleidebastiani@yahoo.com.br.

  1. Image-based pupil plane characterization via principal component analysis for EUVL tools

    NASA Astrophysics Data System (ADS)

    Levinson, Zac; Burbine, Andrew; Verduijn, Erik; Wood, Obert; Mangat, Pawitter; Goldberg, Kenneth A.; Benk, Markus P.; Wojdyla, Antoine; Smith, Bruce W.

    2016-03-01

    We present an approach to image-based pupil plane amplitude and phase characterization using models built with principal component analysis (PCA). PCA is a statistical technique to identify the directions of highest variation (principal components) in a high-dimensional dataset. A polynomial model is constructed between the principal components of through-focus intensity for the chosen binary mask targets and pupil amplitude or phase variation. This method separates model building and pupil characterization into two distinct steps, thus enabling rapid pupil characterization following data collection. The pupil plane variation of a zone-plate lens from the Semiconductor High-NA Actinic Reticle Review Project (SHARP) at Lawrence Berkeley National Laboratory will be examined using this method. Results will be compared to pupil plane characterization using a previously proposed methodology where inverse solutions are obtained through an iterative process involving least-squares regression.
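    A generic sketch of the two-step idea described above (PCA of through-focus intensity followed by a polynomial model to pupil parameters) might look as follows. This is not the authors' code; the array shapes, the number of components, and the quadratic model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative training data (names and shapes are assumptions, not SHARP data):
# each row is a flattened through-focus intensity stack for one known pupil setting,
# and y holds the corresponding pupil parameter (e.g., one phase coefficient).
X = rng.normal(size=(200, 4096))      # 200 simulated stacks, 4096 pixels each
y = rng.normal(size=200)              # known pupil parameter per stack

# Step 1 (model building): PCA of the intensity stacks via SVD.
X_mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - X_mean, full_matrices=False)
n_pc = 5
scores = (X - X_mean) @ Vt[:n_pc].T   # principal-component scores

# Polynomial (here quadratic) model from PC scores to the pupil parameter.
design = np.hstack([np.ones((len(y), 1)), scores, scores**2])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)

# Step 2 (characterization): a new measurement only needs a projection
# onto the stored components followed by evaluation of the polynomial.
x_new = rng.normal(size=4096)
s_new = (x_new - X_mean) @ Vt[:n_pc].T
d_new = np.concatenate([[1.0], s_new, s_new**2])
print(float(d_new @ coef))            # predicted pupil parameter
```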

  2. Direct matrix assisted laser desorption ionization mass spectrometry-based analysis of wine as a powerful tool for classification purposes.

    PubMed

    Nunes-Miranda, J D; Santos, Hugo M; Reboiro-Jato, Miguel; Fdez-Riverola, Florentino; Igrejas, G; Lodeiro, Carlos; Capelo, J L

    2012-03-15

    The variables affecting the direct matrix assisted laser desorption ionization mass spectrometry-based analysis of wine for classification purposes have been studied. The type of matrix, the number of bottles of wine, the number of technical replicates and the number of spots used for the sample analysis have been carefully assessed to obtain the best classification possible. Ten different algorithms have been assessed as classification tools using the experimental data collected after the analysis of fourteen types of wine. The best matrix was found to be α-Cyano with a sample to matrix ratio of 1:0.75. To correctly classify the wines, profiling a minimum of five bottles per type of wine is suggested, with a minimum of three MALDI spot replicates for each bottle. The best algorithm to classify the wines was found to be Bayes Net. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Computer vision-based analysis of foods: a non-destructive colour measurement tool to monitor quality and safety.

    PubMed

    Mogol, Burçe Ataç; Gökmen, Vural

    2014-05-01

    Computer vision-based image analysis has been widely used in food industry to monitor food quality. It allows low-cost and non-contact measurements of colour to be performed. In this paper, two computer vision-based image analysis approaches are discussed to extract mean colour or featured colour information from the digital images of foods. These types of information may be of particular importance as colour indicates certain chemical changes or physical properties in foods. As exemplified here, the mean CIE a* value or browning ratio determined by means of computer vision-based image analysis algorithms can be correlated with acrylamide content of potato chips or cookies. Or, porosity index as an important physical property of breadcrumb can be calculated easily. In this respect, computer vision-based image analysis provides a useful tool for automatic inspection of food products in a manufacturing line, and it can be actively involved in the decision-making process where rapid quality/safety evaluation is needed.
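    A minimal sketch of extracting the mean CIE a* value (and a crude browning fraction) from a food image with scikit-image is shown below; the file name and the a* threshold are assumptions for illustration, not the algorithms used in the cited studies.

```python
from skimage import io, color, img_as_float

def mean_cie_a(image_path):
    """Mean CIE a* value of an RGB food image (simple illustration)."""
    rgb = img_as_float(io.imread(image_path))[..., :3]   # drop alpha channel if present
    lab = color.rgb2lab(rgb)
    return float(lab[..., 1].mean())                     # channel 1 is a* (green-red axis)

def browning_fraction(image_path, a_threshold=10.0):
    """Fraction of pixels whose a* exceeds a purely illustrative browning threshold."""
    rgb = img_as_float(io.imread(image_path))[..., :3]
    lab = color.rgb2lab(rgb)
    return float((lab[..., 1] > a_threshold).mean())

# Example usage (file name is illustrative):
# print(mean_cie_a("potato_chips.png"), browning_fraction("potato_chips.png"))
```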

  4. MicrobiomeAnalyst: a web-based tool for comprehensive statistical, visual and meta-analysis of microbiome data.

    PubMed

    Dhariwal, Achal; Chong, Jasmine; Habib, Salam; King, Irah L; Agellon, Luis B; Xia, Jianguo

    2017-04-26

The widespread application of next-generation sequencing technologies has revolutionized microbiome research by enabling high-throughput profiling of the genetic contents of microbial communities. How to analyze the resulting large complex datasets remains a key challenge in current microbiome studies. Over the past decade, powerful computational pipelines and robust protocols have been established to enable efficient raw data processing and annotation. The focus has shifted toward downstream statistical analysis and functional interpretation. Here, we introduce MicrobiomeAnalyst, a user-friendly tool that integrates recent progress in statistics and visualization techniques, coupled with novel knowledge bases, to enable comprehensive analysis of common data outputs produced from microbiome studies. MicrobiomeAnalyst contains four modules: the Marker Data Profiling module offers various options for community profiling, comparative analysis and functional prediction based on 16S rRNA marker gene data; the Shotgun Data Profiling module supports exploratory data analysis, functional profiling and metabolic network visualization of shotgun metagenomics or metatranscriptomics data; the Taxon Set Enrichment Analysis module helps interpret taxonomic signatures via enrichment analysis against >300 taxon sets manually curated from literature and public databases; finally, the Projection with Public Data module allows users to visually explore their data with public reference data for pattern discovery and biological insights. MicrobiomeAnalyst is freely available at http://www.microbiomeanalyst.ca. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. MicrobiomeAnalyst: a web-based tool for comprehensive statistical, visual and meta-analysis of microbiome data

    PubMed Central

    Dhariwal, Achal; Chong, Jasmine; Habib, Salam; King, Irah L.; Agellon, Luis B.

    2017-01-01

Abstract The widespread application of next-generation sequencing technologies has revolutionized microbiome research by enabling high-throughput profiling of the genetic contents of microbial communities. How to analyze the resulting large complex datasets remains a key challenge in current microbiome studies. Over the past decade, powerful computational pipelines and robust protocols have been established to enable efficient raw data processing and annotation. The focus has shifted toward downstream statistical analysis and functional interpretation. Here, we introduce MicrobiomeAnalyst, a user-friendly tool that integrates recent progress in statistics and visualization techniques, coupled with novel knowledge bases, to enable comprehensive analysis of common data outputs produced from microbiome studies. MicrobiomeAnalyst contains four modules: the Marker Data Profiling module offers various options for community profiling, comparative analysis and functional prediction based on 16S rRNA marker gene data; the Shotgun Data Profiling module supports exploratory data analysis, functional profiling and metabolic network visualization of shotgun metagenomics or metatranscriptomics data; the Taxon Set Enrichment Analysis module helps interpret taxonomic signatures via enrichment analysis against >300 taxon sets manually curated from literature and public databases; finally, the Projection with Public Data module allows users to visually explore their data with public reference data for pattern discovery and biological insights. MicrobiomeAnalyst is freely available at http://www.microbiomeanalyst.ca. PMID:28449106

  6. NCC: A Physics-Based Design and Analysis Tool for Combustion Systems

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Quealy, Angela

    2000-01-01

The National Combustion Code (NCC) is an integrated system of computer codes for physics-based design and analysis of combustion systems. It uses unstructured meshes and runs on parallel computing platforms. The NCC is composed of a set of distinct yet closely related modules. They are: (1) a gaseous flow module solving 3-D Navier-Stokes equations; (2) a turbulence module containing the non-linear k-epsilon models; (3) a chemistry module using either the conventional reduced kinetics approach of solving species equations or the Intrinsic Low Dimensional Manifold (ILDM) kinetics approach of table look-up in conjunction with solving the equations of the progressive variables; (4) a turbulence-chemistry interaction module including the option of solving the joint probability density function (PDF) for species and enthalpy; and (5) a spray module for solving the liquid phase equations. In early 1995, an industry-government team was formed to develop the NCC. In July 1998, the baseline beta version was completed and presented in two NCC sessions at the 34th AIAA/ASME/SAE/ASEE Joint Propulsion Conference & Exhibit, July 1998. An overview of this baseline beta version was presented at the NASA HPCCP/CAS Workshop 98, August 1998. Since then, the effort has been focused on the streamlining, validation, and enhancement of the baseline beta version. The progress is presented in two NCC sessions at the AIAA 38th Aerospace Sciences Meeting & Exhibit, January 2000. At this NASA HPCCP/CAS Workshop 2000, an overview of the NCC papers presented at the AIAA 38th Aerospace Sciences Meeting & Exhibit is presented, with emphasis on the reduction of analysis time of simulating the (gaseous) reacting flows in full combustors. In addition, results of NCC simulation of a modern turbofan combustor will also be reported.

  7. NCC: A Physics-Based Design and Analysis Tool for Combustion Systems

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Quealy, Angela

    2000-01-01

The National Combustion Code (NCC) is an integrated system of computer codes for physics-based design and analysis of combustion systems. It uses unstructured meshes and runs on parallel computing platforms. The NCC is composed of a set of distinct yet closely related modules. They are: (1) a gaseous flow module solving 3-D Navier-Stokes equations; (2) a turbulence module containing the non-linear k-epsilon models; (3) a chemistry module using either the conventional reduced kinetics approach of solving species equations or the Intrinsic Low Dimensional Manifold (ILDM) kinetics approach of table look-up in conjunction with solving the equations of the progressive variables; (4) a turbulence-chemistry interaction module including the option of solving the joint probability density function (PDF) for species and enthalpy; and (5) a spray module for solving the liquid phase equations. In early 1995, an industry-government team was formed to develop the NCC. In July 1998, the baseline beta version was completed and presented in two NCC sessions at the 34th AIAA/ASME/SAE/ASEE Joint Propulsion Conference & Exhibit, July 1998. An overview of this baseline beta version was presented at the NASA HPCCP/CAS Workshop 98, August 1998. Since then, the effort has been focused on the streamlining, validation, and enhancement of the baseline beta version. The progress is presented in two NCC sessions at the AIAA 38th Aerospace Sciences Meeting & Exhibit, January 2000. At this NASA HPCCP/CAS Workshop 2000, an overview of the NCC papers presented at the AIAA 38th Aerospace Sciences Meeting & Exhibit is presented, with emphasis on the reduction of analysis time of simulating the (gaseous) reacting flows in full combustors. In addition, results of NCC simulation of a modern turbofan combustor will also be reported.

  8. An Agro-Climatological Early Warning Tool Based on the Google Earth Engine to Support Regional Food Security Analysis

    NASA Astrophysics Data System (ADS)

    Landsfeld, M. F.; Daudert, B.; Friedrichs, M.; Morton, C.; Hegewisch, K.; Husak, G. J.; Funk, C. C.; Peterson, P.; Huntington, J. L.; Abatzoglou, J. T.; Verdin, J. P.; Williams, E. L.

    2015-12-01

The Famine Early Warning Systems Network (FEWS NET) focuses on food insecurity in developing nations and provides objective, evidence-based analysis to help government decision-makers and relief agencies plan for and respond to humanitarian emergencies. The Google Earth Engine (GEE) is a platform provided by Google Inc. to support scientific research and analysis of environmental data in their cloud environment. The intent is to allow scientists and independent researchers to mine massive collections of environmental data and leverage Google's vast computational resources to detect changes and monitor the Earth's surface and climate. GEE hosts an enormous amount of satellite imagery and climate archives, one of which is the Climate Hazards Group Infrared Precipitation with Stations dataset (CHIRPS). The CHIRPS dataset is land-based, quasi-global (latitude 50N-50S), 0.05-degree resolution, and has a relatively long-term period of record (1981-present). CHIRPS is fed into the GEE on a continuous monthly basis as new data fields are generated each month. This precipitation dataset is a key input for FEWS NET monitoring and forecasting efforts. FEWS NET intends to leverage the GEE in order to provide analysts and scientists with flexible, interactive tools to aid in their monitoring and research efforts. These scientists often work in bandwidth-limited regions, so lightweight Internet tools and services that bypass the need to download massive datasets for analysis are preferred for their work. The GEE provides just this type of service. We present a tool designed specifically for FEWS NET scientists to be used interactively for investigating and monitoring agro-climatological issues. We are able to utilize the enormous GEE computing power to generate on-the-fly statistics to calculate precipitation anomalies, z-scores, percentiles and band ratios, and to allow the user to interactively select custom areas for statistical time-series comparisons and predictions.
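    A minimal sketch of the kind of on-the-fly anomaly computation described, using the Earth Engine Python API, is given below. The CHIRPS collection ID, dates, and region are assumptions to be checked against the GEE data catalog, and the snippet requires an authenticated Earth Engine account.

```python
import ee

ee.Initialize()  # assumes a prior `earthengine authenticate`

# CHIRPS pentad precipitation (collection ID is an assumption; check the GEE catalog).
chirps = ee.ImageCollection('UCSB-CHG/CHIRPS/PENTAD')

def monthly_total(year, month):
    start = ee.Date.fromYMD(year, month, 1)
    return chirps.filterDate(start, start.advance(1, 'month')).sum()

# Long-term mean and standard deviation of (for example) July totals, 1981-2010.
julys = ee.ImageCollection([monthly_total(y, 7) for y in range(1981, 2011)])
mean = julys.mean()
std = julys.reduce(ee.Reducer.stdDev()).rename('precipitation')

# z-score anomaly for July 2015 over an illustrative region.
z = monthly_total(2015, 7).subtract(mean).divide(std)
region = ee.Geometry.Rectangle([33.0, -5.0, 42.0, 5.0])  # rough East Africa box
stats = z.reduceRegion(reducer=ee.Reducer.mean(), geometry=region, scale=5000)
print(stats.getInfo())
```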

  9. IKOS: A Framework for Static Analysis based on Abstract Interpretation (Tool Paper)

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.; Laserna, Jorge A.; Shi, Nija; Venet, Arnaud Jean

    2014-01-01

The RTCA standard (DO-178C) for developing avionic software and getting certification credits includes an extension (DO-333) that describes how developers can use static analysis in certification. In this paper, we give an overview of the IKOS static analysis framework, which helps develop static analyses that are both precise and scalable. IKOS harnesses the power of Abstract Interpretation and makes it accessible to a larger class of static analysis developers by separating concerns such as code parsing, model development, abstract domain management, results management, and analysis strategy. The benefits of the approach are demonstrated by a buffer overflow analysis applied to flight control systems.

  10. Landscape analysis software tools

    Treesearch

    Don Vandendriesche

    2008-01-01

    Recently, several new computer programs have been developed to assist in landscape analysis. The “Sequential Processing Routine for Arraying Yields” (SPRAY) program was designed to run a group of stands with particular treatment activities to produce vegetation yield profiles for forest planning. SPRAY uses existing Forest Vegetation Simulator (FVS) software coupled...

  11. AL-Base: a visual platform analysis tool for the study of amyloidogenic immunoglobulin light chain sequences

    PubMed Central

    Bodi, Kip; Prokaeva, Tatiana; Spencer, Brian; Eberhard, Maurya; Connors, Lawreen H.; Seldin, David C.

    2014-01-01

    AL-Base, a curated database of human immunoglobulin (Ig) light chain (LC) sequences derived from patients with AL amyloidosis and controls, is described, along with a collection of analytical and graphic tools designed to facilitate their analysis. AL-Base is designed to compile and analyse amyloidogenic Ig LC sequences and to compare their predicted protein sequence and structure to non-amyloidogenic LC sequences. Currently, the database contains over 3000 de-identified LC nucleotide and amino acid sequences, of which 433 encode monoclonal proteins that were reported to form fibrillar deposits in AL patients. Each sequence is categorised according to germline gene usage, clinical status and sample source. Currently, tools are available to search for sequences by various criteria, to analyse the biochemical properties of the predicted amino acids at each position and to display the results in a graphical fashion. The likelihood that each sequence has evolved through somatic hypermutation can be predicted using an automated binomial or multinomial distribution model. AL-Base is available to the scientific community for research purposes. PMID:19291508

  12. Enabling New Visualization and Analysis Tools with a Kameleon-based Data Streaming Service

    NASA Astrophysics Data System (ADS)

    Berrios, D.; Rastaetter, L.; Maddox, M. M.

    2012-12-01

The Community Coordinated Modeling Center (CCMC) at NASA Goddard Space Flight Center has developed a new data streaming service for space weather simulation data. Leveraging the capabilities of the Kameleon-Plus data access and interpolation library, it provides access to visualizations of high-resolution simulations through an easy-to-use, well-defined API. 2D slices, isosurfaces, fieldlines, and volumetric datacubes can be requested in multiple formats. We present the motivations for designing such a service, the capabilities of the underlying Kameleon-Plus library, the new data streaming service, and demonstrate experimental tools that use this service.

  13. STRESSED SEBASTES: A TRAIT-BASED EVALUATION OF CLIMATE RISKS TO ROCKFISHES OF THE NORTHEASTERN PACIFIC USING THE COASTAL BIOGEOGRAPHIC RISK ANALYSIS TOOL (CBRAT)

    EPA Science Inventory

    The EPA and USGS have developed a framework to evaluate the relative vulnerability of near-coastal species to impacts of climate change. This framework is implemented in a web-based tool, the Coastal Biogeographic Risk Analysis Tool (CBRAT). We evaluated the vulnerability of the ...

  14. Stressed Sebastes: A Trait-Based Evaluation of Climate Risks to Rockfishes of the Northeastern Pacific Using the Coastal Biogeographic Risk Analysis Tool (CBRAT)

    EPA Science Inventory

    The EPA and USGS have developed a framework to evaluate the relative vulnerability of near-coastal species to impacts of climate change. This framework was implemented in a web-based tool, the Coastal Biogeographic Risk Analysis Tool (CBRAT). We evaluated the vulnerability of the...

  15. STRESSED SEBASTES: A TRAIT-BASED EVALUATION OF CLIMATE RISKS TO ROCKFISHES OF THE NORTHEASTERN PACIFIC USING THE COASTAL BIOGEOGRAPHIC RISK ANALYSIS TOOL (CBRAT)

    EPA Science Inventory

    The EPA and USGS have developed a framework to evaluate the relative vulnerability of near-coastal species to impacts of climate change. This framework is implemented in a web-based tool, the Coastal Biogeographic Risk Analysis Tool (CBRAT). We evaluated the vulnerability of the ...

  16. Stressed Sebastes: A Trait-Based Evaluation of Climate Risks to Rockfishes of the Northeastern Pacific Using the Coastal Biogeographic Risk Analysis Tool (CBRAT)

    EPA Science Inventory

    The EPA and USGS have developed a framework to evaluate the relative vulnerability of near-coastal species to impacts of climate change. This framework was implemented in a web-based tool, the Coastal Biogeographic Risk Analysis Tool (CBRAT). We evaluated the vulnerability of the...

  17. The Cerefy Neuroradiology Atlas: a Talairach-Tournoux atlas-based tool for analysis of neuroimages available over the internet.

    PubMed

    Nowinski, Wieslaw L; Belov, Dmitry

    2003-09-01

The article introduces an atlas-assisted method and a tool called the Cerefy Neuroradiology Atlas (CNA), available over the Internet for neuroradiology and human brain mapping. The CNA contains an enhanced, extended, and fully segmented and labeled electronic version of the Talairach-Tournoux brain atlas, including parcellated gyri and Brodmann's areas. To the best of our knowledge, this is the first online, publicly available application with the Talairach-Tournoux atlas. The process of atlas-assisted neuroimage analysis is done in five steps: image data loading, Talairach landmark setting, atlas normalization, image data exploration and analysis, and result saving. Neuroimage analysis is supported by near-real-time, atlas-to-data warping based on the Talairach transformation. The CNA runs on multiple platforms; is able to process multiple anatomical and functional data sets simultaneously; and provides functions for rapid atlas-to-data registration, interactive structure labeling and annotating, and mensuration. It is also empowered with several unique features, including interactive atlas warping that facilitates fine tuning of the atlas-to-data fit, navigation on the triplanar formed by the image data and the atlas, multiple-images-in-one display with interactive atlas-anatomy-function blending, multiple label display, and saving of labeled and annotated image data. The CNA is useful for fast atlas-assisted analysis of neuroimage data sets. It increases accuracy and reduces time in the localization analysis of activation regions; facilitates communication of information about interpreted scans from the neuroradiologist to other clinicians and medical students; increases the neuroradiologist's confidence regarding anatomy and spatial relationships; and serves as a user-friendly, public-domain tool for neuroeducation. At present, more than 700 users from five continents have subscribed to the CNA.

  18. Flow Analysis Tool White Paper

    NASA Technical Reports Server (NTRS)

    Boscia, Nichole K.

    2012-01-01

Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.

  19. CoPub: a literature-based keyword enrichment tool for microarray data analysis

    PubMed Central

    Frijters, Raoul; Heupers, Bart; van Beek, Pieter; Bouwhuis, Maurice; van Schaik, René; de Vlieg, Jacob; Polman, Jan; Alkema, Wynand

    2008-01-01

    Medline is a rich information source, from which links between genes and keywords describing biological processes, pathways, drugs, pathologies and diseases can be extracted. We developed a publicly available tool called CoPub that uses the information in the Medline database for the biological interpretation of microarray data. CoPub allows batch input of multiple human, mouse or rat genes and produces lists of keywords from several biomedical thesauri that are significantly correlated with the set of input genes. These lists link to Medline abstracts in which the co-occurring input genes and correlated keywords are highlighted. Furthermore, CoPub can graphically visualize differentially expressed genes and over-represented keywords in a network, providing detailed insight in the relationships between genes and keywords, and revealing the most influential genes as highly connected hubs. CoPub is freely accessible at http://services.nbic.nl/cgi-bin/copub/CoPub.pl. PMID:18442992

  20. Sandia PUF Analysis Tool

    SciTech Connect

    2014-06-11

This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
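    For readers unfamiliar with the inter-chip Hamming distance metric mentioned above, a small NumPy sketch follows; the signature length and values are random stand-ins, and this is not code from the Sandia tool.

```python
import numpy as np
from itertools import combinations

def hamming_distance(a, b):
    """Fractional Hamming distance between two equal-length bit vectors."""
    a, b = np.asarray(a, dtype=np.uint8), np.asarray(b, dtype=np.uint8)
    return float(np.count_nonzero(a != b)) / a.size

# Illustrative PUF signatures: one 256-bit response per chip (random stand-ins).
rng = np.random.default_rng(42)
signatures = {f"chip{i}": rng.integers(0, 2, size=256) for i in range(5)}

# Inter-chip distances: ideally close to 0.5 for well-behaved PUFs.
for (n1, s1), (n2, s2) in combinations(signatures.items(), 2):
    print(n1, n2, round(hamming_distance(s1, s2), 3))
```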

  1. Planning and Implementing Web-Based Instruction: Tools for Decision Analysis.

    ERIC Educational Resources Information Center

    Harmon, Stephen W.; Jones, Marshall G.

This paper discusses issues and factors involved in deciding whether to use World Wide Web-based instruction. Levels of Web-based instruction, ranging from no Web use through informational, supplemental, essential, and communal use to immersive Web use, are discussed, and the following factors are identified for consideration before making the decision to put…

  2. VCAT: Visual Crosswalk Analysis Tool

    SciTech Connect

    Cleland, Timothy J.; Forslund, David W.; Cleland, Catherine A.

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  3. Web-Based Tools for Environmental Data

    SciTech Connect

    Laguna, G.; Lager, D.; Colombini, F.; Ottesen, P.

    2000-03-28

Lawrence Livermore National Laboratory (LLNL) has pursued an aggressive site characterization and remediation program since the early 1980's. The effort has required drilling and sampling over 1000 wells. The development of tools for interacting with the large volume of data is imperative. Working closely with interdisciplinary project scientists, we have developed a suite of web-based tools for facilitating many data-driven analysis and interpretation tasks. LLNL tool development must meet the needs of several different groups: LLNL project staff, DOE project managers, and government regulators. The project managers and regulators require general tools, answering questions such as "what locations have had detectable amounts of a particular chemical." In addition to general inquiries, regulators want specific information, such as reports of volatile organic compound concentrations for an area over time. LLNL users need tools that support analysis and facility operations as well as general inquiry tools. We have developed web-based tools that allow each class of user to obtain much of the information they desire without the assistance of database specialists. While these tools were created for particular classes of users, each tool has proven useful to other groups as well. Providing a web interface to these tools makes them easily accessible regardless of the user's location or computing platform. Cross-linking these tools increases their visibility and enables data exploration. In this paper, we will describe a selection of our web-based tools, illustrating the way we are using the web to facilitate easier and broader access to project data.

  4. Evaluating the Evidence Base of Video Analysis: A Special Education Teacher Development Tool

    ERIC Educational Resources Information Center

    Nagro, Sarah A.; Cornelius, Kyena E.

    2013-01-01

    Special education teacher development is continually studied to determine best practices for improving teacher quality and promoting student learning. Video analysis is commonly included in teacher development targeting both teacher thinking and practice intended to improve learning opportunities for students. Positive research findings support…

  5. Structured Assessment Approach: a microcomputer-based insider-vulnerability analysis tool

    SciTech Connect

    Patenaude, C.J.; Sicherman, A.; Sacks, I.J.

    1986-01-01

The Structured Assessment Approach (SAA) was developed to help assess the vulnerability of safeguards systems to insiders in a staged manner. For physical security systems, the SAA identifies possible diversion paths which are not safeguarded under various facility operating conditions and insiders who could defeat the system via direct access, collusion or indirect tampering. For material control and accounting systems, the SAA identifies those who could block the detection of a material loss or diversion via data falsification or equipment tampering. The SAA, originally designed to run on a mainframe computer, has been converted to run on a personal computer. Many features have been added to simplify and facilitate its use for conducting vulnerability analysis. For example, the SAA input, which is a text-like data file, is easily readable and can provide documentation of facility safeguards and assumptions used for the analysis.

  6. DanteR: an extensible R-based tool for quantitative analysis of -omics data

    SciTech Connect

    Taverner, Thomas; Karpievitch, Yuliya; Polpitiya, Ashoka D.; Brown, Joseph N.; Dabney, Alan R.; Anderson, Gordon A.; Smith, Richard D.

    2012-09-15

    Motivation: The size and complex nature of LC-MS proteomics data sets motivates development of specialized software for statistical data analysis and exploration. We present DanteR, a graphical R package that features extensive statistical and diagnostic functions for quantitative proteomics data analysis, including normalization, imputation, hypothesis testing, interactive visualization and peptide-to-protein rollup. More importantly, users can easily extend the existing functionality by including their own algorithms under the Add-On tab. Availability: DanteR and its associated user guide are available for download at http://omics.pnl.gov/software/. For Windows, a single click automatically installs DanteR along with the R programming environment. For Linux and Mac OS X, users must first install R and then follow instructions on the DanteR web site for package installation.
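    DanteR itself is an R package; purely to illustrate two of the steps it names, median normalization and peptide-to-protein rollup, here is a small pandas sketch with assumed column names and toy abundances.

```python
import numpy as np
import pandas as pd

# Illustrative peptide-level log2 abundances (column names are assumptions).
peptides = pd.DataFrame({
    "protein": ["P1", "P1", "P1", "P2", "P2"],
    "sample_A": [20.1, 19.8, np.nan, 25.0, 24.6],
    "sample_B": [20.9, 20.5, 21.0, 24.2, 23.9],
})
abundance_cols = ["sample_A", "sample_B"]

# Median normalization: align each sample's median to the overall median.
medians = peptides[abundance_cols].median()
peptides[abundance_cols] = peptides[abundance_cols] - (medians - medians.mean())

# Simple peptide-to-protein rollup: per-protein median of the peptide abundances.
proteins = peptides.groupby("protein")[abundance_cols].median()
print(proteins)
```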

  7. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

A large number of commercial simulation tools support performance-oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production-quality, simulation-based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  8. Femtosecond laser ablation-based mass spectrometry: An ideal tool for stoichiometric analysis of thin films

    PubMed Central

    LaHaye, Nicole L.; Kurian, Jose; Diwakar, Prasoon K.; Alff, Lambert; Harilal, Sivanandan S.

    2015-01-01

    An accurate and routinely available method for stoichiometric analysis of thin films is a desideratum of modern materials science where a material’s properties depend sensitively on elemental composition. We thoroughly investigated femtosecond laser ablation-inductively coupled plasma-mass spectrometry (fs-LA-ICP-MS) as an analytical technique for determination of the stoichiometry of thin films down to the nanometer scale. The use of femtosecond laser ablation allows for precise removal of material with high spatial and depth resolution that can be coupled to an ICP-MS to obtain elemental and isotopic information. We used molecular beam epitaxy-grown thin films of LaPd(x)Sb2 and T′-La2CuO4 to demonstrate the capacity of fs-LA-ICP-MS for stoichiometric analysis and the spatial and depth resolution of the technique. Here we demonstrate that the stoichiometric information of thin films with a thickness of ~10 nm or lower can be determined. Furthermore, our results indicate that fs-LA-ICP-MS provides precise information on the thin film-substrate interface and is able to detect the interdiffusion of cations. PMID:26285795

  9. Femtosecond laser ablation-based mass spectrometry. An ideal tool for stoichiometric analysis of thin films

    SciTech Connect

    LaHaye, Nicole L.; Kurian, Jose; Diwakar, Prasoon K.; Alff, Lambert; Harilal, Sivanandan S.

    2015-08-19

    An accurate and routinely available method for stoichiometric analysis of thin films is a desideratum of modern materials science where a material’s properties depend sensitively on elemental composition. We thoroughly investigated femtosecond laser ablation-inductively coupled plasma-mass spectrometry (fs-LA-ICP-MS) as an analytical technique for determination of the stoichiometry of thin films down to the nanometer scale. The use of femtosecond laser ablation allows for precise removal of material with high spatial and depth resolution that can be coupled to an ICP-MS to obtain elemental and isotopic information. We used molecular beam epitaxy-grown thin films of LaPd(x)Sb2 and T´-La2CuO4 to demonstrate the capacity of fs-LA-ICP-MS for stoichiometric analysis and the spatial and depth resolution of the technique. Here we demonstrate that the stoichiometric information of thin films with a thickness of ~10 nm or lower can be determined. Furthermore, our results indicate that fs-LA-ICP-MS provides precise information on the thin film-substrate interface and is able to detect the interdiffusion of cations.

  10. APL@Voro: a Voronoi-based membrane analysis tool for GROMACS trajectories.

    PubMed

    Lukat, Gunther; Krüger, Jens; Sommer, Björn

    2013-11-25

APL@Voro is a new program developed to aid in the analysis of GROMACS trajectories of lipid bilayer simulations. It can read a GROMACS trajectory file, a PDB coordinate file, and a GROMACS index file to create a two-dimensional geometric representation of a bilayer. Voronoi diagrams and Delaunay triangulations, generated for different selection models of lipids, support the analysis of the bilayer. The values calculated on the geometric structures can be visualized in a user-friendly interactive environment and then plotted and exported to different file types. APL@Voro supports complex bilayers with a mix of various lipids and proteins. For the calculation of the projected area per lipid, a modification of the well-known Voronoi approach is presented, together with a new approach for including atoms in an existing triangulation. The application of the developed software is discussed for three example systems simulated with GROMACS. The program is written in C++, is open source, and is available free of charge.
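    To illustrate the projected area-per-lipid idea (not APL@Voro's own modified algorithm), a minimal SciPy sketch is shown below: it builds a 2D Voronoi diagram from head-group positions projected onto the membrane plane and averages the areas of the bounded cells. The coordinates are random stand-ins, and periodic boundary conditions, which a real bilayer analysis needs, are ignored.

```python
import numpy as np
from scipy.spatial import Voronoi

def polygon_area(points):
    """Shoelace area of a convex 2D polygon; vertices are angle-sorted first."""
    c = points.mean(axis=0)
    order = np.argsort(np.arctan2(points[:, 1] - c[1], points[:, 0] - c[0]))
    x, y = points[order, 0], points[order, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

# Illustrative head-group xy positions for one leaflet (random stand-ins, in nm).
rng = np.random.default_rng(7)
xy = rng.uniform(0.0, 8.0, size=(64, 2))

vor = Voronoi(xy)
areas = {}
for lipid_index, region_index in enumerate(vor.point_region):
    region = vor.regions[region_index]
    if -1 in region or len(region) == 0:
        continue  # skip unbounded edge cells (no periodic images in this sketch)
    areas[lipid_index] = polygon_area(vor.vertices[region])

mean_apl = np.mean(list(areas.values()))
print(f"projected area per lipid (bounded cells only): {mean_apl:.3f} nm^2")
```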

  11. Tools based on multivariate statistical analysis for classification of soil and groundwater in Apulian agricultural sites.

    PubMed

    Ielpo, Pierina; Leardi, Riccardo; Pappagallo, Giuseppe; Uricchio, Vito Felice

    2017-06-01

In this paper, the results obtained from multivariate statistical techniques such as PCA (principal component analysis) and LDA (linear discriminant analysis) applied to a wide soil data set are presented. The results have been compared with those obtained on a groundwater data set whose samples were collected together with the soil ones within the project "Improvement of the Regional Agro-meteorological Monitoring Network (2004-2007)". LDA applied to the soil data made it possible to distinguish the geographical origin of the samples between two macroareas, Bari and Foggia provinces versus Brindisi, Lecce and Taranto provinces, with 87% correct predictions in cross-validation. In the case of the groundwater data set, the best classification was obtained when the samples were grouped into three macroareas (Foggia province, Bari province, and Brindisi, Lecce and Taranto provinces), reaching 84% correct predictions in cross-validation. The obtained information can be very useful in supporting soil and water resource management, for example the reduction of water consumption and of energy and chemical (nutrient and pesticide) inputs in agriculture.
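    A scikit-learn sketch of LDA classification with cross-validation of the kind reported above is given below; the feature matrix, labels and fold count are placeholders, not the Apulian data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data: rows are soil samples, columns are measured parameters,
# labels are the macroarea of origin (values are random stand-ins).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 12))
y = rng.choice(["Bari-Foggia", "Brindisi-Lecce-Taranto"], size=120)

# Standardize the features, then fit LDA; score with 10-fold cross-validation.
model = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=10)
print(f"correct predictions in cross-validation: {scores.mean():.0%}")
```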

  12. Femtosecond laser ablation-based mass spectrometry. An ideal tool for stoichiometric analysis of thin films

    DOE PAGES

    LaHaye, Nicole L.; Kurian, Jose; Diwakar, Prasoon K.; ...

    2015-08-19

An accurate and routinely available method for stoichiometric analysis of thin films is a desideratum of modern materials science where a material’s properties depend sensitively on elemental composition. We thoroughly investigated femtosecond laser ablation-inductively coupled plasma-mass spectrometry (fs-LA-ICP-MS) as an analytical technique for determination of the stoichiometry of thin films down to the nanometer scale. The use of femtosecond laser ablation allows for precise removal of material with high spatial and depth resolution that can be coupled to an ICP-MS to obtain elemental and isotopic information. We used molecular beam epitaxy-grown thin films of LaPd(x)Sb2 and T′-La2CuO4 to demonstrate the capacity of fs-LA-ICP-MS for stoichiometric analysis and the spatial and depth resolution of the technique. Here we demonstrate that the stoichiometric information of thin films with a thickness of ~10 nm or lower can be determined. Furthermore, our results indicate that fs-LA-ICP-MS provides precise information on the thin film-substrate interface and is able to detect the interdiffusion of cations.

  13. Risk-management and risk-analysis-based decision tools for attacks on electric power.

    PubMed

    Simonoff, Jeffrey S; Restrepo, Carlos E; Zimmerman, Rae

    2007-06-01

    Incident data about disruptions to the electric power grid provide useful information that can be used as inputs into risk management policies in the energy sector for disruptions from a variety of origins, including terrorist attacks. This article uses data from the Disturbance Analysis Working Group (DAWG) database, which is maintained by the North American Electric Reliability Council (NERC), to look at incidents over time in the United States and Canada for the period 1990-2004. Negative binomial regression, logistic regression, and weighted least squares regression are used to gain a better understanding of how these disturbances varied over time and by season during this period, and to analyze how characteristics such as number of customers lost and outage duration are related to different characteristics of the outages. The results of the models can be used as inputs to construct various scenarios to estimate potential outcomes of electric power outages, encompassing the risks, consequences, and costs of such outages.
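    As a sketch of the negative binomial regression mentioned above, the following statsmodels snippet may be useful; the variables are placeholders and do not reproduce the DAWG data or the published models.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder outage records (the DAWG data themselves are not reproduced here).
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "year": rng.integers(1990, 2005, size=300),
    "summer": rng.integers(0, 2, size=300),          # 1 if outage began in summer
    "customers_lost": rng.poisson(5000, size=300),   # stand-in count outcome
})

# Negative binomial regression of customers lost on year and season.
X = sm.add_constant(df[["year", "summer"]])
model = sm.GLM(df["customers_lost"], X, family=sm.families.NegativeBinomial())
result = model.fit()
print(result.summary())
```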

  14. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) more than an order of magnitude higher resolution than those currently available, LIDAR data allows earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth Science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools to enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of their own labs provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the

  15. Genetic analysis of Giardia and Cryptosporidium from people in Northern Australia using PCR-based tools.

    PubMed

    Ebner, Janine; Koehler, Anson V; Robertson, Gemma; Bradbury, Richard S; Jex, Aaron R; Haydon, Shane R; Stevens, Melita A; Norton, Robert; Joachim, Anja; Gasser, Robin B

    2015-12-01

    To date, there has been limited genetic study of the gastrointestinal pathogens Giardia and Cryptosporidium in northern parts of Australia. Here, PCR-based methods were used for the genetic characterization of Giardia and Cryptosporidium from 695 people with histories of gastrointestinal disorders from the tropical North of Australia. Genomic DNAs from fecal samples were subjected to PCR-based analyses of regions from the triose phosphate isomerase (tpi), small subunit (SSU) of the nuclear ribosomal RNA and/or the glycoprotein (gp60) genes. Giardia and Cryptosporidium were detected in 13 and four of the 695 samples, respectively. Giardia duodenalis assemblages A and B were found in 4 (31%) and 9 (69%) of the 13 samples in persons of <9 years of age. Cryptosporidium hominis (subgenotype IdA18), Cryptosporidium mink genotype (subgenotype IIA16R1) and C. felis were also identified in single patients of 11-21 years of age. Future studies might focus on a comparative study of these and other protists in rural communities in Northern Australia.

  16. Fuselage Versus Subcomponent Panel Response Correlation Based on ABAQUS Explicit Progressive Damage Analysis Tools

    NASA Technical Reports Server (NTRS)

    Gould, Kevin E.; Satyanarayana, Arunkumar; Bogert, Philip B.

    2016-01-01

The analysis performed in this study substantiates the need for high-fidelity, vehicle-level progressive damage analysis (PDA) structural models for use in the verification and validation of proposed sub-scale structural models and to support required full-scale vehicle-level testing. PDA results are presented that capture and correlate the responses of sub-scale 3-stringer and 7-stringer panel models and an idealized 8-ft diameter fuselage model, which provides a vehicle-level environment for the 7-stringer sub-scale panel model. Two unique skin-stringer attachment assumptions are considered and correlated in the models analyzed: the TIE constraint interface versus the cohesive element (COH3D8) interface. Evaluating different interfaces allows for assessing a range of predicted damage modes, including delamination and crack propagation responses. Damage models considered in this study are the ABAQUS built-in Hashin procedure and the COmplete STress Reduction (COSTR) damage procedure implemented through a VUMAT user subroutine using the ABAQUS/Explicit code.

  17. A tool for automated diabetic retinopathy pre-screening based on retinal image computer analysis.

    PubMed

    Gegundez-Arias, Manuel E; Marin, Diego; Ponte, Beatriz; Alvarez, Fatima; Garrido, Javier; Ortega, Carlos; Vasallo, Manuel J; Bravo, Jose M

    2017-09-01

    This paper presents a methodology and first results of an automatic detection system of first signs of Diabetic Retinopathy (DR) in fundus images, developed for the Health Ministry of the Andalusian Regional Government (Spain). The system detects the presence of microaneurysms and haemorrhages in retinography by means of techniques of digital image processing and supervised classification. Evaluation was conducted on 1058 images of 529 diabetic patients at risk of presenting evidence of DR (an image of each eye is provided). To this end, a ground-truth diagnosis was created based on gradations performed by 3 independent ophthalmology specialists. The comparison between the diagnosis provided by the system and the reference clinical diagnosis shows that the system can work at a level of sensitivity that is similar to that achieved by experts (0.9380 sensitivity per patient against 0.9416 sensitivity of several specialists). False negatives have proven to be mild cases. Moreover, while the specificity of the system is significantly lower than that of human graders (0.5098), it is high enough to screen more than half of the patients unaffected by the disease. Results are promising in integrating this system in DR screening programmes. At an early stage, the system could act as a pre-screening system, by screening healthy patients (with no obvious signs of DR) and identifying only those presenting signs of the disease. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Development of a virtual tool for the quantification and the analysis of soil erosion in olive orchards based on RUSLE

    NASA Astrophysics Data System (ADS)

    Marín, Víctor; Taguas, Encarnación V.; Redel, María Dolores; Gómez, Jose A.

    2013-04-01

Erosion rates above 30 t ha-1 yr-1 have been measured in hilly agricultural regions such as Andalusia in Southern Spain, associated with orchard crops (Gómez et al., 2008). In this region there are 1.48 Mha of olive groves (CAP, 2007), which are essential in terms of income, employment and landscape. Acquiring training and experience in modelling soil erosion through conventional teaching is difficult for students as well as for technicians. This paper presents a telematic training/analysis tool, CREO (Calculator of Rates of Erosion in Olive crops / Calculadora RUSLE para Erosión en Olivar), to quantify erosion rates in olive grove areas based on the Revised Universal Soil Loss Equation (RUSLE; Renard et al., 1997) and on specific information published on soil losses and soil characteristics in olive orchards in Southern Spain. The tool has been programmed with Matlab R2008a from MathWorks Inc. (USA), although it can be used as an executable program in Spanish and English by interested users. It consists of seven menus with visual material in which different sources, databases and methodologies are presented to quantify soil loss rates (A = R·K·LS·C·P) through the calculation of six factors. A is computed in t ha-1 yr-1; R is the rainfall erosivity factor (MJ mm ha-1 h-1 yr-1); K represents the soil erodibility (t ha h ha-1 MJ-1 mm-1); L is the slope length factor and S is the slope gradient factor (dimensionless); C is a cover management factor (dimensionless); and P is a support practice factor (dimensionless). Different equations and methodologies can be selected by the user for the calculation of each factor, while recommendations and advice can be shown for the suitable use of the tool. It is expected that CREO will be a valuable and helpful tool in environmental studies associated with olive orchard land use, and that its further use will allow a better understanding of the interaction among the different factors involved, and better access to available
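    The RUSLE product quoted above is straightforward to compute once the six factors are known; a minimal function follows, with factor values that are purely illustrative and not CREO outputs.

```python
def rusle_soil_loss(R, K, LS, C, P):
    """Average annual soil loss A (t ha^-1 yr^-1) from the RUSLE product A = R*K*LS*C*P.

    R  : rainfall erosivity factor (MJ mm ha^-1 h^-1 yr^-1)
    K  : soil erodibility factor (t ha h ha^-1 MJ^-1 mm^-1)
    LS : combined slope length and slope gradient factor (dimensionless)
    C  : cover management factor (dimensionless)
    P  : support practice factor (dimensionless)
    """
    return R * K * LS * C * P

# Illustrative values only (not taken from CREO or the cited studies):
print(rusle_soil_loss(R=900.0, K=0.03, LS=2.5, C=0.45, P=1.0))  # about 30.4 t/ha/yr
```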

  19. Water Quality Analysis Tool (WQAT)

    EPA Science Inventory

    The purpose of the Water Quality Analysis Tool (WQAT) software is to provide a means for analyzing and producing useful remotely sensed data products for an entire estuary, a particular point or area of interest (AOI or POI) in estuaries, or water bodies of interest where pre-pro...

  1. GAIA: a gram-based interaction analysis tool – an approach for identifying interacting domains in yeast

    PubMed Central

    Zhang, Kelvin X; Ouellette, BF Francis

    2009-01-01

    Background Protein-Protein Interactions (PPIs) play important roles in many biological functions. Protein domains, which are defined as independently folding structural blocks of proteins, physically interact with each other to perform these biological functions. Therefore, the identification of Domain-Domain Interactions (DDIs) is of great biological interest because it is generally accepted that PPIs are mediated by DDIs. As a result, much effort has been put into the prediction of domain pair interactions based on computational methods. Many DDI prediction tools using PPI networks and domain evolution information have been reported. However, tools that combine the primary sequences, domain annotations, and structural annotations of proteins have not been evaluated before. Results In this study, we report a novel approach called Gram-bAsed Interaction Analysis (GAIA). GAIA extracts peptide segments that are composed of a fixed length of continuous amino acids, called n-grams (where n is the number of amino acids), from the annotated domain and DDI data set in Saccharomyces cerevisiae (budding yeast) and identifies a list of n-grams that may contribute to DDIs and PPIs based on the frequencies of their appearance. GAIA also reports the coordinate position of gram pairs on each interacting domain pair. We demonstrate that our approach improves on other DDI prediction approaches when tested against a gold-standard data set and achieves a true positive rate of 82% and a false positive rate of 21%. We also identify a list of 4-gram pairs that are significantly over-represented in the DDI data set and may mediate PPIs. Conclusion GAIA represents a novel and reliable way to predict DDIs that mediate PPIs. Our results, which show the localizations of interacting grams/hotspots, provide testable hypotheses for experimental validation. Complemented with other prediction methods, this study will allow us to elucidate the interactome of cells. PMID:19208164
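
    GAIA's core operation, extracting fixed-length n-grams from domain sequences and counting how often gram pairs co-occur across interacting domain pairs, can be sketched as follows (the toy sequences and n = 4 are illustrative, not GAIA's data set):

      from collections import Counter
      from itertools import product

      def ngrams(seq, n=4):
          """All contiguous n-grams (length-n peptide segments) of a sequence."""
          return [seq[i:i + n] for i in range(len(seq) - n + 1)]

      def gram_pair_counts(domain_a, domain_b, n=4):
          """Count co-occurring n-gram pairs across one interacting domain pair."""
          return Counter(product(ngrams(domain_a, n), ngrams(domain_b, n)))

      # Toy interacting domain sequences, for illustration only.
      pairs = gram_pair_counts("MKTAYIAKQR", "GDVEKGKKIF")
      print(pairs.most_common(3))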

  2. SNiPlay: a web-based tool for detection, management and analysis of SNPs. Application to grapevine diversity projects

    PubMed Central

    2011-01-01

    Background High-throughput re-sequencing, new genotyping technologies and the availability of reference genomes allow the extensive characterization of Single Nucleotide Polymorphisms (SNPs) and insertion/deletion events (indels) in many plant species. The rapidly increasing amount of re-sequencing and genotyping data generated by large-scale genetic diversity projects requires the development of integrated bioinformatics tools able to efficiently manage, analyze, and combine these genetic data with genome structure and external data. Results In this context, we developed SNiPlay, a flexible, user-friendly and integrative web-based tool dedicated to polymorphism discovery and analysis. It integrates: 1) a pipeline, freely accessible through the internet, combining existing software with new tools to detect SNPs and to compute different types of statistical indices and graphical layouts for SNP data. From standard sequence alignments, genotyping data or Sanger sequencing traces given as input, SNiPlay detects SNP and indel events and outputs submission files for the design of Illumina's SNP chips. Subsequently, it sends sequences and genotyping data into a series of modules in charge of various processes: physical mapping to a reference genome, annotation (genomic position, intron/exon location, synonymous/non-synonymous substitutions), SNP frequency determination in user-defined groups, haplotype reconstruction and network, linkage disequilibrium evaluation, and diversity analysis (Pi, Watterson's Theta, Tajima's D). Furthermore, the pipeline allows the use of external data (such as phenotype, geographic origin, taxa, stratification) to define groups and compare statistical indices. 2) a database storing polymorphisms, genotyping data and grapevine sequences released by public and private projects. It allows the user to retrieve SNPs using various filters (such as genomic position, missing data, polymorphism type, allele frequency), to compare SNP patterns
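
    Two of the diversity statistics SNiPlay reports, nucleotide diversity (Pi) and Watterson's Theta, can be computed directly from an alignment. A rough sketch under the standard per-site definitions (toy sequences, not grapevine data, and no handling of missing data):

      from itertools import combinations

      def nucleotide_diversity(seqs):
          """Pi: mean pairwise differences per site."""
          L = len(seqs[0])
          pairs = list(combinations(seqs, 2))
          diffs = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs)
          return diffs / (len(pairs) * L)

      def watterson_theta(seqs):
          """Theta_W: segregating sites per site, scaled by the harmonic number."""
          n, L = len(seqs), len(seqs[0])
          S = sum(len(set(col)) > 1 for col in zip(*seqs))
          a_n = sum(1.0 / i for i in range(1, n))
          return S / (a_n * L)

      seqs = ["ACGTACGT", "ACGTACCT", "ATGTACGT", "ACGAACGT"]
      print(nucleotide_diversity(seqs), watterson_theta(seqs))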

  3. SNiPlay: a web-based tool for detection, management and analysis of SNPs. Application to grapevine diversity projects.

    PubMed

    Dereeper, Alexis; Nicolas, Stéphane; Le Cunff, Loïc; Bacilieri, Roberto; Doligez, Agnès; Peros, Jean-Pierre; Ruiz, Manuel; This, Patrice

    2011-05-05

    High-throughput re-sequencing, new genotyping technologies and the availability of reference genomes allow the extensive characterization of Single Nucleotide Polymorphisms (SNPs) and insertion/deletion events (indels) in many plant species. The rapidly increasing amount of re-sequencing and genotyping data generated by large-scale genetic diversity projects requires the development of integrated bioinformatics tools able to efficiently manage, analyze, and combine these genetic data with genome structure and external data. In this context, we developed SNiPlay, a flexible, user-friendly and integrative web-based tool dedicated to polymorphism discovery and analysis. It integrates: 1) a pipeline, freely accessible through the internet, combining existing software with new tools to detect SNPs and to compute different types of statistical indices and graphical layouts for SNP data. From standard sequence alignments, genotyping data or Sanger sequencing traces given as input, SNiPlay detects SNP and indel events and outputs submission files for the design of Illumina's SNP chips. Subsequently, it sends sequences and genotyping data into a series of modules in charge of various processes: physical mapping to a reference genome, annotation (genomic position, intron/exon location, synonymous/non-synonymous substitutions), SNP frequency determination in user-defined groups, haplotype reconstruction and network, linkage disequilibrium evaluation, and diversity analysis (Pi, Watterson's Theta, Tajima's D). Furthermore, the pipeline allows the use of external data (such as phenotype, geographic origin, taxa, stratification) to define groups and compare statistical indices. 2) a database storing polymorphisms, genotyping data and grapevine sequences released by public and private projects. It allows the user to retrieve SNPs using various filters (such as genomic position, missing data, polymorphism type, allele frequency), to compare SNP patterns between populations, and to

  4. Interactive Web-based Access and Analysis Tools for the Western Climate Mapping Initiative (WestMap)

    NASA Astrophysics Data System (ADS)

    Comrie, A. C.; Redmond, K.; Glueck, M. F.; Reinbold, H.

    2006-12-01

    The Western Climate Mapping Consortium (WestMap) has developed a prototype web-based interactive access and resource interface to optimize public dissemination and usage of fine-scale spatial climate time series for the western United States. The western U.S. focus reflects the complex climate interactions and diverse geography that make resource management, policy considerations, and climate research challenging in this region. WestMap was conceived by a consortium comprised of the University of Arizona/CLIMAS, the Western Regional Climate Center (WRCC)/Desert Research Institute, and the PRISM group at Oregon State University, along with collaborators at Scripps Institute of Oceanography/California Applications Project, NOAA Climate Diagnostics Center, and the USDA Natural Resource Conservation Service. WestMap evolved in direct response to a multitude of requests to the WRCC and the RISAs from public and private stakeholder communities for lengthy time series of fine-scale spatial climate aggregated to user-specified domains, and related user-friendly web-based access and analysis tools. The WestMap interface is designed to link three stakeholder-driven components: 1) climate data development and operations (access, maintenance); 2) error assessment, data analysis, diagnostics, and related tools; and 3) data access, visualization, and educational resources. The 100-year PRISM 4km monthly temperature and precipitation series serve as the initial data archive, updating automatically once in operational mode. Operational user components are being designed to allow direct stakeholder access to user-specified data and resources most relevant to current needs in a timely manner. Requested resources currently in development and limited testing stages include clickable maps, regional aggregate capabilities, basic statistical analysis, time series visualization, error assessment, and download/print capability. Phased prototype testing, currently underway internally, will

  5. SUN-TZU: Proposal for an Agent Based Battle Staff Planning Tool For Analysis of Situation Awareness Data Anomalies

    DTIC Science & Technology

    2005-06-14

    concept for an agent based situational awareness (SA) data base tool intended to find and highlight inconsistencies in the battle SA picture. The goal...is to find inconsistencies that might cue the existence of a deception story. It is bottom-up, not top-down. Sources of inconsistency other than

  6. Remote Sensing Image Analysis Without Expert Knowledge - A Web-Based Classification Tool On Top of Taverna Workflow Management System

    NASA Astrophysics Data System (ADS)

    Selsam, Peter; Schwartze, Christian

    2016-10-01

    Providing software solutions via internet has been known for quite some time and is now an increasing trend marketed as "software as a service". A lot of business units accept the new methods and streamlined IT strategies by offering web-based infrastructures for external software usage - but geospatial applications featuring very specialized services or functionalities on demand are still rare. Originally applied in desktop environments, the ILMSimage tool for remote sensing image analysis and classification was modified in its communicating structures and enabled to run on a high-power server, benefiting from Taverna software. On top, a GIS-like, web-based user interface guides the user through the different steps in ILMSimage. ILMSimage combines object oriented image segmentation with pattern recognition features. Basic image elements form a construction set to model large image objects with diverse and complex appearance. There is no need for the user to set up detailed object definitions. Training is done by delineating one or more typical examples (templates) of the desired object using a simple vector polygon. The template can be large and does not need to be homogeneous. The template is completely independent from the segmentation. The object definition is done completely by the software.

  7. Data Mining and Knowledge Discovery in Gaia survey: GUASOM, an analysis tool based on Self Organizing Maps

    NASA Astrophysics Data System (ADS)

    Manteiga, Minia; Dafonte, Jose Carlos; Ulla, Ana; Alvarez, Marco Antonio; Garabato, Daniel; Fustes, Diego

    2015-08-01

    Gaia, the astrometric cornerstone mission of the European Space Agency (ESA), was successfully launched in December 2013. In June 2014 Gaia started its scientific operations phase, scanning the sky with the different instruments on board. Gaia was designed to measure positions, parallaxes and motions to the microarcsec level, thus providing the first highly accurate 6-D map of about a thousand million objects of the Milky Way. A vast community of astronomers is looking forward to the delivery of the promised first unbiased survey of the entire sky down to magnitude 20. We present GUASOM, a data mining tool designed for knowledge discovery in large astronomical spectrophotometric archives, developed in the framework of Gaia DPAC (Data Processing and Analysis Consortium). Our tool is based on a type of unsupervised learning Artificial Neural Network named Self-Organizing Maps (SOMs). SOMs are used to organize the information in clusters of objects, as homogeneously as possible according to their spectral energy distributions, and to project them onto a 2D grid where the data structure can be visualized. Each cluster has a representative, called a prototype, which is a virtual pattern that best represents or resembles the set of input patterns belonging to that cluster. Prototypes make it easier to determine the physical nature of the objects populating each cluster. Our algorithm has been tested on SDSS observations and theoretical spectral libraries covering a wide sample of astronomical objects. Self-organizing maps permit the grouping and visualization of large amounts of data for which there is no a priori knowledge. GUASOM provides a useful toolbox for data visualization and crossmatching. To this effect, we have used the SIMBAD catalog to perform astrometric crossmatching with a sample of SDSS classification outliers, seeking identifications.
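
    The clustering step described above rests on the standard SOM update rule: each input pattern is assigned to its best-matching prototype, and that prototype and its grid neighbours are pulled toward the input. A minimal NumPy sketch of such a map (not the GUASOM code; random vectors stand in for spectral energy distributions):

      import numpy as np

      def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
          """Train a rectangular SOM; returns prototypes of shape (gx, gy, dim)."""
          rng = np.random.default_rng(seed)
          gx, gy = grid
          w = rng.random((gx, gy, data.shape[1]))
          coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy),
                                        indexing="ij"), axis=-1)
          n_steps, step = epochs * len(data), 0
          for _ in range(epochs):
              for x in rng.permutation(data):
                  t = step / n_steps
                  lr, sigma = lr0 * (1 - t), sigma0 * (1 - t) + 0.5
                  # Best-matching unit: prototype closest to the input pattern.
                  bmu = np.unravel_index(np.argmin(((w - x) ** 2).sum(-1)), (gx, gy))
                  # Gaussian neighbourhood on the 2D grid around the BMU.
                  d2 = ((coords - np.array(bmu)) ** 2).sum(-1)
                  h = lr * np.exp(-d2 / (2 * sigma ** 2))
                  w += h[..., None] * (x - w)
                  step += 1
          return w

      prototypes = train_som(np.random.rand(500, 16))  # 500 toy "spectra", 16 bands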

  8. Assessment of HTGR Helium Compressor Analysis Tool Based on Newton-Raphson Numerical Application to Through-flow Analysis

    SciTech Connect

    Ji Hwan Kim; Hyeun Min Kim; Hee Cheon NO

    2006-07-01

    This study describes the development of a computer program for analyzing the off-design performance of axial flow helium compressors, which is one of the major concerns for the power conversion system of a high temperature gas-cooled reactor (HTGR). The compressor performance has been predicted by the aerodynamic analysis of meridional flow with allowances for losses. The governing equations have been derived from the Euler turbomachine equation and the streamline curvature method, and then merged into linearized equations based on the Newton-Raphson numerical method. The effect of viscosity is considered through empirical correlations that introduce the entropy rises caused by the primary loss sources. Use of the method has been illustrated by applying it to a 20-stage helium compressor of the GTHTR300 plant. As a result, the flow throughout the stages of the compressor has been predicted and the compressor characteristics have also been investigated according to the design specification. The program results show much better stability and good convergence with respect to other through-flow methods, and good agreement with the compressor performance map provided by JAEA. (authors)
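
    The numerical core described here is a Newton-Raphson iteration on a linearized residual system. As a generic illustration of that core only (not the through-flow equations themselves), a sketch for a small nonlinear system using a finite-difference Jacobian:

      import numpy as np

      def newton_raphson(residual, x0, tol=1e-10, max_iter=50, eps=1e-7):
          """Solve residual(x) = 0 for a vector x via Newton-Raphson with a
          finite-difference Jacobian."""
          x = np.asarray(x0, dtype=float)
          for _ in range(max_iter):
              r = residual(x)
              if np.linalg.norm(r) < tol:
                  return x
              J = np.empty((len(r), len(x)))
              for j in range(len(x)):
                  dx = np.zeros_like(x)
                  dx[j] = eps
                  J[:, j] = (residual(x + dx) - r) / eps
              x = x - np.linalg.solve(J, r)
          return x

      # Toy 2-equation system standing in for the linearized through-flow relations.
      f = lambda v: np.array([v[0] ** 2 + v[1] - 3.0, v[0] - v[1] ** 2 + 1.0])
      print(newton_raphson(f, [1.0, 1.0]))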

  9. TARA: Tool Assisted Requirements Analysis

    DTIC Science & Technology

    1988-05-01

    techniques examined in detail was the use of direct ’animation’ of data flow specifications in Prolog [Bartlett, Cherrie, Lehman, MacLean and Potts, 1984...it is our objective to provide tools and techniques that are tightly coupled to CORE and the Analyst, as it is only by making such a commitment that...provides techniques and notations for all phases of elicitation, specification and analysis of requirements and results in a structured, action

  10. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.
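
    The Monte Carlo strategy the tool uses, sampling scattered input variables and observing the spread of the response, is easy to illustrate on a trivial response function. The sketch below uses a cantilever tip-deflection formula purely as a stand-in for the finite element model; all means and tolerances are hypothetical:

      import numpy as np

      rng = np.random.default_rng(42)
      N = 10_000

      # Scatter in design inputs (hypothetical means and tolerances).
      E = rng.normal(70e9, 2e9, N)         # Young's modulus [Pa]
      t = rng.normal(10e-3, 0.3e-3, N)     # panel thickness [m]
      P = rng.normal(150.0, 15.0, N)       # applied load [N]

      # Stand-in response model: tip deflection of a cantilever strip,
      # delta = P L^3 / (3 E I) with I = b t^3 / 12.
      L, b = 0.5, 0.1
      I = b * t ** 3 / 12.0
      delta = P * L ** 3 / (3.0 * E * I)

      print(f"mean deflection = {delta.mean()*1e3:.2f} mm, "
            f"std = {delta.std()*1e3:.2f} mm")
      # Rank inputs by correlation with the response to see which scatter matters.
      for name, x in [("E", E), ("t", t), ("P", P)]:
          print(name, round(float(np.corrcoef(x, delta)[0, 1]), 2))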

  11. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1994-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  12. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  13. Dynamic Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets, both observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter for the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom-in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models and identify what types of measurements the next generation of instruments will need to collect.

  14. Rapid analysis of protein backbone resonance assignments using cryogenic probes, a distributed Linux-based computing architecture, and an integrated set of spectral analysis tools.

    PubMed

    Monleón, Daniel; Colson, Kimberly; Moseley, Hunter N B; Anklin, Clemens; Oswald, Robert; Szyperski, Thomas; Montelione, Gaetano T

    2002-01-01

    Rapid data collection, spectral referencing, processing by time domain deconvolution, peak picking and editing, and assignment of NMR spectra are necessary components of any efficient integrated system for protein NMR structure analysis. We have developed a set of software tools designated AutoProc, AutoPeak, and AutoAssign, which function together with the data processing and peak-picking programs NMRPipe and Sparky, to provide an integrated software system for rapid analysis of protein backbone resonance assignments. In this paper we demonstrate that these tools, together with high-sensitivity triple resonance NMR cryoprobes for data collection and a Linux-based computer cluster architecture, can be combined to provide nearly complete backbone resonance assignments and secondary structures (based on chemical shift data) for a 59-residue protein in less than 30 hours of data collection and processing time. In this optimum case of a small protein providing excellent spectra, extensive backbone resonance assignments could also be obtained using less than 6 hours of data collection and processing time. These results demonstrate the feasibility of high throughput triple resonance NMR for determining resonance assignments and secondary structures of small proteins, and the potential for applying NMR in large scale structural proteomics projects.

  15. Tools for sea urchin genomic analysis.

    PubMed

    Cameron, R Andrew

    2014-01-01

    The Sea Urchin Genome Project Web site, SpBase ( http://SpBase.org ), in association with a suite of publicly available sequence comparison tools, provides a platform from which to analyze genes and genomic sequences of the sea urchin. This information system is specifically designed to support laboratory bench studies in cell and molecular biology. In particular, these tools and datasets have supported the description of the gene regulatory networks of the purple sea urchin S. purpuratus. This chapter details methods for undertaking the first steps in finding genes and noncoding regulatory sequences for further analysis.

  16. Microarray Data Analysis and Mining Tools

    PubMed Central

    Selvaraj, Saravanakumar; Natarajan, Jeyakumar

    2011-01-01

    Microarrays are one of the latest breakthroughs in experimental molecular biology that allow monitoring the expression levels of tens of thousands of genes simultaneously. Arrays have been applied to studies in gene expression, genome mapping, SNP discrimination, transcription factor activity, toxicity, pathogen identification and many other applications. In this paper we concentrate on discussing various bioinformatics tools used for microarray data mining tasks, with their underlying algorithms, web resources and relevant references. The paper is aimed mainly at digital biologists, to make them aware of the plethora of tools and programs available for microarray data analysis. First, we report the common data mining applications such as selecting differentially expressed genes, clustering, and classification. Next, we focus on gene expression based knowledge discovery studies such as transcription factor binding site analysis, pathway analysis, protein-protein interaction network analysis and gene enrichment analysis. PMID:21584183

  17. Microarray data analysis and mining tools.

    PubMed

    Selvaraj, Saravanakumar; Natarajan, Jeyakumar

    2011-04-22

    Microarrays are one of the latest breakthroughs in experimental molecular biology that allow monitoring the expression levels of tens of thousands of genes simultaneously. Arrays have been applied to studies in gene expression, genome mapping, SNP discrimination, transcription factor activity, toxicity, pathogen identification and many other applications. In this paper we concentrate on discussing various bioinformatics tools used for microarray data mining tasks, with their underlying algorithms, web resources and relevant references. The paper is aimed mainly at digital biologists, to make them aware of the plethora of tools and programs available for microarray data analysis. First, we report the common data mining applications such as selecting differentially expressed genes, clustering, and classification. Next, we focus on gene expression based knowledge discovery studies such as transcription factor binding site analysis, pathway analysis, protein-protein interaction network analysis and gene enrichment analysis.

  18. PopulationProfiler: A Tool for Population Analysis and Visualization of Image-Based Cell Screening Data.

    PubMed

    Matuszewski, Damian J; Wählby, Carolina; Puigvert, Jordi Carreras; Sintorn, Ida-Maria

    2016-01-01

    Image-based screening typically produces quantitative measurements of cell appearance. Large-scale screens involving tens of thousands of images, each containing hundreds of cells described by hundreds of measurements, result in overwhelming amounts of data. Reducing per-cell measurements to the averages across the image(s) for each treatment leads to loss of potentially valuable information on population variability. We present PopulationProfiler-a new software tool that reduces per-cell measurements to population statistics. The software imports measurements from a simple text file, visualizes population distributions in a compact and comprehensive way, and can create gates for subpopulation classes based on control samples. We validate the tool by showing how PopulationProfiler can be used to analyze the effect of drugs that disturb the cell cycle, and compare the results to those obtained with flow cytometry.
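
    The reduction the tool performs, from a per-cell measurement table to per-treatment population statistics plus gated subpopulation fractions, resembles the pandas sketch below. Column names, the synthetic data, and the gate thresholds are illustrative only, not the tool's file format or algorithm:

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(1)
      # Hypothetical per-cell table: one row per cell with its treatment and a
      # DNA-content measurement (stand-in for any per-cell feature).
      cells = pd.DataFrame({
          "treatment": rng.choice(["control", "drug_A"], size=5000),
          "dna_content": rng.normal(loc=2.0, scale=0.4, size=5000),
      })

      # Population statistics per treatment instead of a single per-image average.
      stats = cells.groupby("treatment")["dna_content"].describe()

      # Simple gates for cell-cycle-like subpopulations, set from the control sample.
      g1_cut, g2_cut = 1.8, 2.4
      cells["phase"] = pd.cut(cells["dna_content"],
                              bins=[-np.inf, g1_cut, g2_cut, np.inf],
                              labels=["G1", "S", "G2/M"])
      fractions = cells.groupby("treatment")["phase"].value_counts(normalize=True)
      print(stats)
      print(fractions)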

  19. Universal tool microscope remanufacture based on CCD

    NASA Astrophysics Data System (ADS)

    Kang, Jian; Hu, Zhongxiang; Zhang, Xunming; Zhang, Jiaying

    2006-02-01

    To overcome the drawbacks of traditional universal tool microscopes, a remanufacturing scheme based on charge-coupled devices (CCDs) is proposed. In this paper, the gradual remanufacturing of old tool microscopes with CCDs and grating rulers is described, along with the development of a novel measuring system designed to directly analyze images of the screw to be measured. For image analysis, novel image processing methods such as an adaptive switching median (ASM) filter and edge detection based on a modified Sobel operator are designed. For line detection, the Hough transform is also used to measure the screw parameters. Experiments on screw images demonstrate that the remanufactured universal tool microscope scheme is feasible and that the proposed measurement method is valid.
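
    The processing chain described, median filtering, Sobel-based edge detection and Hough-transform line detection, can be approximated with standard OpenCV calls. This is a sketch only: it uses a plain median filter and the standard Sobel operator rather than the authors' ASM filter and modified operator, and the input file name is hypothetical.

      import cv2
      import numpy as np

      img = cv2.imread("screw.png", cv2.IMREAD_GRAYSCALE)  # hypothetical CCD frame

      # Plain median filter standing in for the adaptive switching median (ASM) filter.
      den = cv2.medianBlur(img, 5)

      # Gradient magnitude from Sobel derivatives (the paper uses a modified operator).
      gx = cv2.Sobel(den, cv2.CV_64F, 1, 0, ksize=3)
      gy = cv2.Sobel(den, cv2.CV_64F, 0, 1, ksize=3)
      mag = cv2.convertScaleAbs(np.sqrt(gx ** 2 + gy ** 2))
      edges = cv2.threshold(mag, 60, 255, cv2.THRESH_BINARY)[1]

      # Hough transform to recover the straight flank edges of the screw thread.
      lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                              minLineLength=40, maxLineGap=5)
      print(0 if lines is None else len(lines), "line segments detected")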

  20. simuwatt - A Tablet Based Electronic Auditing Tool

    SciTech Connect

    Macumber, Daniel; Parker, Andrew; Lisell, Lars; Metzger, Ian; Brown, Matthew

    2014-05-08

    'simuwatt Energy Auditor' (TM) is a new tablet-based electronic auditing tool that is designed to dramatically reduce the time and cost to perform investment-grade audits and improve quality and consistency. The tool uses the U.S. Department of Energy's OpenStudio modeling platform and integrated Building Component Library to automate modeling and analysis. simuwatt's software-guided workflow helps users gather required data, and provides the data in a standard electronic format that is automatically converted to a baseline OpenStudio model for energy analysis. The baseline energy model is calibrated against actual monthly energy use to ASHRAE Standard 14 guidelines. Energy conservation measures from the Building Component Library are then evaluated using OpenStudio's parametric analysis capability. Automated reporting creates audit documents that describe recommended packages of energy conservation measures. The development of this tool was partially funded by the U.S. Department of Defense's Environmental Security Technology Certification Program. As part of this program, the tool is being tested at 13 buildings on 5 Department of Defense sites across the United States. Results of the first simuwatt audit tool demonstration are presented in this paper.

  1. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development. The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses, and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open source software system licensed under the NASA Open Source Agreement: free for anyone to use in development of new mission concepts or to improve current missions, freely available in source code form for enhancement or further technology development.

  2. CRAB: Distributed analysis tool for CMS

    NASA Astrophysics Data System (ADS)

    Sala, Leonardo; CMS Collaboration

    2012-12-01

    CMS has a distributed computing model, based on a hierarchy of tiered regional computing centers and adopts a data driven model for the end user analysis. This model foresees that jobs are submitted to the analysis resources where data are hosted. The increasing complexity of the whole computing infrastructure makes the simple analysis work flow more and more complicated for the end user. CMS has developed and deployed a dedicated tool named CRAB (CMS Remote Analysis Builder) in order to guarantee the physicists an efficient access to the distributed data whilst hiding the underlying complexity. This tool is used by CMS to enable the running of physics analysis jobs in a transparent manner over data distributed across sites. It factorizes out the interaction with the underlying batch farms, grid infrastructure and CMS data management tools, allowing the user to deal only with a simple and intuitive interface. We present the CRAB architecture, as well as the current status and lessons learnt in deploying this tool for use by the CMS collaboration. We also present the future development of the CRAB system.

  3. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  4. MicroPattern: a web-based tool for microbe set enrichment analysis and disease similarity calculation based on a list of microbes.

    PubMed

    Ma, Wei; Huang, Chuanbo; Zhou, Yuan; Li, Jianwei; Cui, Qinghua

    2017-01-10

    The microbiota colonizing the human body is renowned as "a forgotten organ" due to its major impact on human health and disease. Recently, microbiome studies have identified a large number of microbes differentially regulated in a variety of conditions, such as disease and diet. However, methods for discovering biological patterns in the differentially regulated microbes are still limited. For this purpose, here, we developed a web-based tool named MicroPattern to discover biological patterns for a list of microbes. In addition, MicroPattern implemented and integrated an algorithm we previously presented for the calculation of disease similarity based on disease-microbe association data. MicroPattern first grouped microbes into different sets based on the associated diseases and the colonized positions. Then, for a given list of microbes, MicroPattern performed enrichment analysis of the given microbes on all of the microbe sets. Moreover, using MicroPattern, we can also calculate disease similarity based on the shared microbe associations. Finally, we confirmed the accuracy and usefulness of MicroPattern by applying it to the changed microbes under the animal-based diet condition. MicroPattern is freely available at http://www.cuilab.cn/micropattern.
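
    The enrichment test at the heart of such a tool is typically a hypergeometric (one-sided Fisher) test of the overlap between the user's microbe list and each predefined microbe set. A generic sketch with made-up counts, not MicroPattern's data or its actual sets:

      from scipy.stats import hypergeom

      def set_enrichment_p(universe_size, set_size, query_size, overlap):
          """P(observing >= overlap hits) when drawing query_size microbes at
          random from a universe containing set_size members of the set."""
          return hypergeom.sf(overlap - 1, universe_size, set_size, query_size)

      # Hypothetical numbers: 1500 known microbes, a 60-member disease/site set,
      # a 40-microbe query list of which 12 fall in the set.
      p = set_enrichment_p(universe_size=1500, set_size=60, query_size=40, overlap=12)
      print(f"enrichment p-value = {p:.3e}")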

  5. MicroPattern: a web-based tool for microbe set enrichment analysis and disease similarity calculation based on a list of microbes

    PubMed Central

    Ma, Wei; Huang, Chuanbo; Zhou, Yuan; Li, Jianwei; Cui, Qinghua

    2017-01-01

    The microbiota colonizing the human body is renowned as “a forgotten organ” due to its major impact on human health and disease. Recently, microbiome studies have identified a large number of microbes differentially regulated in a variety of conditions, such as disease and diet. However, methods for discovering biological patterns in the differentially regulated microbes are still limited. For this purpose, here, we developed a web-based tool named MicroPattern to discover biological patterns for a list of microbes. In addition, MicroPattern implemented and integrated an algorithm we previously presented for the calculation of disease similarity based on disease-microbe association data. MicroPattern first grouped microbes into different sets based on the associated diseases and the colonized positions. Then, for a given list of microbes, MicroPattern performed enrichment analysis of the given microbes on all of the microbe sets. Moreover, using MicroPattern, we can also calculate disease similarity based on the shared microbe associations. Finally, we confirmed the accuracy and usefulness of MicroPattern by applying it to the changed microbes under the animal-based diet condition. MicroPattern is freely available at http://www.cuilab.cn/micropattern. PMID:28071710

  6. A qualitative analysis of coronary heart disease patient views of dietary adherence and web-based and mobile-based nutrition tools

    PubMed Central

    Yehle, Karen S.; Chen, Aleda M. H.; Plake, Kimberly S.; Yi, Ji Soo; Mobley, Amy R.

    2012-01-01

    PURPOSE Dietary adherence can be challenging for patients with coronary heart disease (CHD), as they may require multiple dietary changes. Choosing appropriate food items may be difficult or take extensive amounts of time without the aid of technology. The objective of this project was to (1) examine the dietary challenges faced by patients with CHD, (2) examine methods of coping with dietary challenges, (3) explore the feasibility of a web-based food decision support system, and (4) explore the feasibility of a mobile-based food decision support system. METHODS Food for the Heart (FFH), a website-based food decision support system, and Mobile Magic Lens (MML), a mobile-based system, were developed to aid in daily dietary choices. Three CHD patient focus groups were conducted and focused on CHD-associated dietary changes as well as the FFH and MML prototypes. A total of 20 CHD patients and 7 informal caregivers participated. Qualitative, content analysis was performed to find themes grounded in the responses. RESULTS Five predominant themes emerged: 1) decreasing carbohydrate intake and portion control are common dietary challenges, 2) clinician and social support makes dietary adherence easier, 3) FFH could make meal-planning and dietary adherence less complicated, 4) MML could save time and assist with healthy choices, and 5) additional features need to be added to make both tools more comprehensive. CONCLUSIONS FFH and MML may be tools that CHD patients would value in making food choices and adhering to dietary recommendations, especially if additional features are added to assist patients with changes. PMID:22760245

  7. Online machining error estimation method of numerical control gear grinding machine tool based on data analysis of internal sensors

    NASA Astrophysics Data System (ADS)

    Zhao, Fei; Zhang, Chi; Yang, Guilin; Chen, Chinyin

    2016-12-01

    This paper presents an online method for estimating the cutting error by analyzing internal sensor readings. The internal sensors of the numerical control (NC) machine tool are used to avoid installation problems. A mathematical model for estimating the cutting error is proposed to compute the relative position of the cutting point and the tool center point (TCP) from internal sensor readings, based on the cutting theory of gears. To verify the effectiveness of the proposed model, it was simulated and tested experimentally in a gear generating grinding process. The cutting error of the gear was estimated and the factors that induce cutting error were analyzed. The simulation and experiments verify that the proposed approach is an efficient way to estimate the cutting error of the work-piece during the machining process.

  8. CancerEST: a web-based tool for automatic meta-analysis of public EST data.

    PubMed

    Feichtinger, Julia; McFarlane, Ramsay J; Larcombe, Lee D

    2014-01-01

    The identification of cancer-restricted biomarkers is fundamental to the development of novel cancer therapies and diagnostic tools. The construction of comprehensive profiles to define tissue- and cancer-specific gene expression has been central to this. To this end, the exploitation of the current wealth of 'omic'-scale databases can be facilitated by automated approaches, allowing researchers to directly address specific biological questions. Here we present CancerEST, a user-friendly and intuitive web-based tool for the automated identification of candidate cancer markers/targets, for examining tissue specificity as well as for integrated expression profiling. CancerEST operates by means of constructing and meta-analyzing expressed sequence tag (EST) profiles of user-supplied gene sets across an EST database supporting 36 tissue types. Using a validation data set from the literature, we show the functionality and utility of CancerEST. DATABASE URL: http://www.cancerest.org.uk.

  9. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.

  10. Budget Risk & Prioritization Analysis Tool

    SciTech Connect

    Carlos Castillo, Jerel Nelson

    2010-12-31

    BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate, to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense

  11. Gene-based comparative analysis of tools for estimating copy number alterations using whole-exome sequencing data

    PubMed Central

    Kim, Hyung-Yong; Choi, Jin-Woo; Lee, Jeong-Yeon; Kong, Gu

    2017-01-01

    Accurate detection of copy number alterations (CNAs) using next-generation sequencing technology is essential for the development and application of more precise medical treatments for human cancer. Here, we evaluated seven CNA estimation tools (ExomeCNV, CoNIFER, VarScan2, CODEX, ngCGH, saasCNV, and falcon) using whole-exome sequencing data from 419 breast cancer tumor-normal sample pairs from The Cancer Genome Atlas. Estimations generated using each tool were converted into gene-based copy numbers; concordance for gains and losses and the sensitivity and specificity of each tool were compared to validated copy numbers from a single nucleotide polymorphism reference array. The concordance and sensitivity of the tumor-normal pair methods for estimating CNAs (saasCNV, ExomeCNV, and VarScan2) were better than those of the tumor batch methods (CoNIFER and CODEX). SaasCNV had the highest gain and loss concordances (65.0%), sensitivity (69.4%), and specificity (89.1%) for estimating copy number gains or losses. These findings indicate that improved CNA detection algorithms are needed to more accurately interpret whole-exome sequencing results in human cancer. PMID:28460482

  12. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-Rex. This talk is a combination of existing presentations; a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  13. Tools for THOR: Wave analysis

    NASA Astrophysics Data System (ADS)

    Narita, Yasuhito; Haaland, Stein; Vaivads, Andris

    2017-04-01

    The THOR mission goal is to reveal particle acceleration and heating mechanisms in turbulent space and astrophysical plasmas. Understanding the properties of waves and turbulent fluctuations plays a key role in revealing the acceleration and heating processes. An extensive set of field and particle experiments are developed and mounted on board the spacecraft. Correspondingly, many of the data analysis methods are being prepared, some inherited from past and current spacecraft missions and others developed as new analysis methods to maximize the scientific potential of the THOR mission. It is worth noting that the THOR mission performs not only single-point measurements but also multi-point measurements by interferometric probe technique. We offer a set of analysis tools designed for the THOR mission: energy spectra, compressibility, ellipticity, wavevector direction, phase speed, Poynting vector, helicity quantities, wave distribution function, higher order statistics, wave-particle resonance parameter, and detection of pitch angle scattering. The emphasis is on the use of both the field data (electric and magnetic fields) and the particle data.

  14. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    NASA Astrophysics Data System (ADS)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years and with the availability of robust data sets and computing technology, it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool has recently been developed to analyze this complex river system referred to as: 2D-Hydrodynamic Based Logic Modeling (2D-HBLM). This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables with balancing important limiting factors in determining the highest priority locations within the river corridor to implement restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognizes that social, economic, and land use limiting factors may constrain restoration options (Bechie et al. 2008). Applying natural resources management actions, like restoration prioritization, is

  15. Decision Analysis Tools for Volcano Observatories

    NASA Astrophysics Data System (ADS)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
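
    The event-tree formalism mentioned above multiplies conditional probabilities along each branch to obtain the probability of each end outcome. A toy sketch of that bookkeeping; the branch names and values are invented for illustration, not taken from the EXPLORIS tool:

      # Toy volcanic event tree: P(outcome) is the product of conditional
      # probabilities along its branch. Values are illustrative only.
      tree = {
          "unrest -> eruption": 0.30,
          "eruption -> explosive": 0.60,
          "explosive -> pyroclastic flow reaches zone A": 0.20,
      }

      def branch_probability(*steps):
          """Multiply conditional probabilities along one branch of the event tree."""
          p = 1.0
          for s in steps:
              p *= tree[s]
          return p

      p_zone_a = branch_probability("unrest -> eruption",
                                    "eruption -> explosive",
                                    "explosive -> pyroclastic flow reaches zone A")
      print(f"P(pyroclastic flow reaches zone A | unrest) = {p_zone_a:.3f}")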

  16. Regression Modeling and Meta-Analysis of Diagnostic Accuracy of SNP-Based Pathogenicity Detection Tools for UGT1A1 Gene Mutation

    PubMed Central

    Rahim, Fakher; Galehdari, Hamid; Mohammadi-asl, Javad; Saki, Najmaldin

    2013-01-01

    Aims. This review summarized all available evidence on the accuracy of SNP-based pathogenicity detection tools and introduced regression model based on functional scores, mutation score, and genomic variation degree. Materials and Methods. A comprehensive search was performed to find all mutations related to Crigler-Najjar syndrome. The pathogenicity prediction was done using SNP-based pathogenicity detection tools including SIFT, PHD-SNP, PolyPhen2, fathmm, Provean, and Mutpred. Overall, 59 different SNPs related to missense mutations in the UGT1A1 gene, were reviewed. Results. Comparing the diagnostic OR, our model showed high detection potential (diagnostic OR: 16.71, 95% CI: 3.38–82.69). The highest MCC and ACC belonged to our suggested model (46.8% and 73.3%), followed by SIFT (34.19% and 62.71%). The AUC analysis showed a significance overall performance of our suggested model compared to the selected SNP-based pathogenicity detection tool (P = 0.046). Conclusion. Our suggested model is comparable to the well-established SNP-based pathogenicity detection tools that can appropriately reflect the role of a disease-associated SNP in both local and global structures. Although the accuracy of our suggested model is not relatively high, the functional impact of the pathogenic mutations is highlighted at the protein level, which improves the understanding of the molecular basis of mutation pathogenesis. PMID:23997956
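
    The diagnostic odds ratio used to compare the tools, together with its confidence interval, can be reproduced from a 2x2 table of predictions against known pathogenicity. A sketch with hypothetical counts, not the paper's UGT1A1 data:

      import math

      def diagnostic_odds_ratio(tp, fp, fn, tn, z=1.96):
          """DOR = (TP*TN)/(FP*FN) with a log-normal 95% confidence interval."""
          dor = (tp * tn) / (fp * fn)
          se_log = math.sqrt(1 / tp + 1 / fp + 1 / fn + 1 / tn)
          lo = math.exp(math.log(dor) - z * se_log)
          hi = math.exp(math.log(dor) + z * se_log)
          return dor, (lo, hi)

      # Hypothetical confusion-matrix counts for one pathogenicity predictor.
      dor, (lo, hi) = diagnostic_odds_ratio(tp=30, fp=8, fn=6, tn=15)
      print(f"DOR = {dor:.2f} (95% CI {lo:.2f}-{hi:.2f})")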

  17. SimTool - An object based approach to simulation construction

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Yazbeck, Marwan E.; Edwards, H. C.; Barnette, Randall D.

    1993-01-01

    The creation and maintenance of large complex simulations can be a difficult and error prone task. A number of interactive and automated tools have been developed to aid in simulation construction and maintenance. Many of these tools are based upon object oriented analysis and design concepts. One such tool, SimTool, is an object based integrated tool set for the development, maintenance, and operation of large, complex and long lived simulations. This paper discusses SimTool's object based approach to simulation design, construction and execution. It also discusses the services provided to various levels of SimTool users to assist them in a wide range of simulation tasks. Also, with the aid of an implemented and working simulation example, this paper discusses SimTool's key design and operational features. Finally, this paper presents a condensed discussion of SimTool's Entity-Relationship-Attribute (ERA) modeling approach.

  18. System analysis: Developing tools for the future

    SciTech Connect

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools aimed at completing a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements, and some tools developed in-house. The tools that are included in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP is discussed, along with the results of the testing performed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  19. Web Based Personal Nutrition Management Tool

    NASA Astrophysics Data System (ADS)

    Bozkurt, Selen; Zayim, Neşe; Gülkesen, Kemal Hakan; Samur, Mehmet Kemal

    Internet is being used increasingly as a resource for accessing health-related information because of its several advantages. Therefore, Internet-based tailoring has recently become quite popular in health education and personal health management. Today, there are many web-based health programs designed for individuals. Among these, nutrition and weight management programs are popular because obesity has become a heavy burden for populations worldwide. In this study, we designed a web-based personal nutrition education and management tool, The Nutrition Web Portal, in order to enhance patients’ nutrition knowledge and support behavioral change against obesity. The present paper reports the analysis, design and development processes of The Nutrition Web Portal.

  20. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and those that are generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data flow paradigm.

  1. Automatic tools for microprocessor failure analysis

    NASA Astrophysics Data System (ADS)

    Conard, Didier; Laurent, J.; Velazco, Raoul; Ziade, Haissam; Cabestany, J.; Sala, F.

    A new approach for fault location when testing microprocessors is presented. The starting point for the backtracing analysis that converges on the failure is the automatic localization of a reduced area. Automatic image comparison based on pattern recognition is performed by means of an electron beam tester. The developed hardware and software tools allow large circuit areas to be covered, offering powerful diagnosis capabilities to the user. The validation of this technique was performed on faulty 68000 microprocessors. It shows the feasibility of automating the first and most important step of failure analysis: fault location at the chip surface.

  2. Interactive Graphics Tools for Analysis of MOLA and Other Data

    NASA Technical Reports Server (NTRS)

    Frey, H.; Roark, J.; Sakimoto, S.

    2000-01-01

    We have developed several interactive analysis tools based on the IDL programming language for the analysis of Mars Orbiter Laser Altimeter (MOLA) profile and gridded data, which are available to the general community.

  3. MarC-V: a spreadsheet-based tool for analysis, normalization, and visualization of single cDNA microarray experiments.

    PubMed

    Schageman, J J; Basit, M; Gallardo, T D; Garner, H R; Shohet, R V

    2002-02-01

    The comprehensive analysis and visualization of data extracted from cDNA microarrays can be a time-consuming and error-prone process that becomes increasingly tedious as the number of gene elements on a particular microarray grows. With the increasingly large number of gene elements on today's microarrays, analysis tools must be developed to meet this challenge. Here, we present MarC-V, a Microsoft Excel spreadsheet tool with Visual Basic macros that automates much of the visualization and calculation involved in the analysis process while providing the familiarity and flexibility of Excel. Automated features of this tool include (i) lower-bound thresholding, (ii) data normalization, (iii) generation of ratio frequency distribution plots, (iv) generation of scatter plots color-coded by expression level, (v) ratio scoring based on intensity measurements, (vi) filtering of data based on expression level or specific gene interests, and (vii) exporting of data for subsequent multi-array analysis. MarC-V also includes an import function for GenePix results (GPR) raw data files.
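
    MarC-V itself is an Excel/VBA tool, but the core operations it automates (lower-bound thresholding and normalized log ratios) are easy to express in a few lines. The snippet below is a hedged Python illustration of those steps, not MarC-V code, and the column names are assumptions.

```python
# Illustration of the kind of per-array processing MarC-V automates
# (lower-bound thresholding and median-centred log2 ratios); not MarC-V code.
import numpy as np
import pandas as pd

def normalize_ratios(df, floor=200.0):
    """df has hypothetical columns 'cy5' and 'cy3' with channel intensities."""
    cy5 = df["cy5"].clip(lower=floor)          # lower-bound thresholding
    cy3 = df["cy3"].clip(lower=floor)
    log_ratio = np.log2(cy5 / cy3)
    df["log2_ratio"] = log_ratio - np.median(log_ratio)  # median normalization
    return df

example = pd.DataFrame({"cy5": [150, 5200, 980], "cy3": [300, 2600, 1010]})
print(normalize_ratios(example))
```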

  4. Surging Seas Risk Finder: A Simple Search-Based Web Tool for Local Sea Level Rise Projections, Coastal Flood Risk Forecasts, and Inundation Exposure Analysis

    NASA Astrophysics Data System (ADS)

    Strauss, B.; Dodson, D.; Kulp, S. A.; Rizza, D. H.

    2016-12-01

    Surging Seas Risk Finder (riskfinder.org) is an online tool for accessing extensive local projections and analysis of sea level rise; coastal floods; and land, populations, contamination sources, and infrastructure and other assets that may be exposed to inundation. Risk Finder was first published in 2013 for Florida, New York and New Jersey, expanding to all states in the contiguous U.S. by 2016, when a major new version of the tool was released with a completely new interface. The revised tool was informed by hundreds of survey responses from and conversations with planners, local officials and other coastal stakeholders, plus consideration of modern best practices for responsive web design and user interfaces, and social science-based principles for science communication. Overarching design principles include simplicity and ease of navigation, leading to a landing page with Google-like sparsity and focus on search, and to an architecture based on search, so that each coastal zip code, city, county, state or other place type has its own webpage gathering all relevant analysis in modular, scrollable units. Millions of users have visited the Surging Seas suite of tools to date, and downloaded thousands of files, for stated purposes ranging from planning to business to education to personal decisions; and from institutions ranging from local to federal government agencies, to businesses, to NGOs, and to academia.

  5. Systems Thinking Tools for Improving Evidence-Based Practice: A Cross-Case Analysis of Two High School Leadership Teams

    ERIC Educational Resources Information Center

    Kensler, Lisa A. W.; Reames, Ellen; Murray, John; Patrick, Lynne

    2012-01-01

    Teachers and administrators have access to large volumes of data but research suggests that they lack the skills to use data effectively for continuous school improvement. This study involved a cross-case analysis of two high school leadership teams' early stages of evidence-based practice development; differing forms of external support were…

  7. CHOPPI: a web tool for the analysis of immunogenicity risk from host cell proteins in CHO-based protein production.

    PubMed

    Bailey-Kellogg, Chris; Gutiérrez, Andres H; Moise, Leonard; Terry, Frances; Martin, William D; De Groot, Anne S

    2014-11-01

    Despite high quality standards and continual process improvements in manufacturing, host cell protein (HCP) process impurities remain a substantial risk for biological products. Even at low levels, residual HCPs can induce a detrimental immune response compromising the safety and efficacy of a biologic. Consequently, advanced-stage clinical trials have been cancelled due to the identification of antibodies against HCPs. To enable earlier and more rapid assessment of the risks posed by residual CHO protein impurities (CHOPs) in Chinese Hamster Ovary (CHO)-based protein production, we have developed a web tool called CHOPPI, for CHO Protein Predicted Immunogenicity. CHOPPI integrates information regarding the possible presence of CHOPs (expression and secretion) with characterizations of their immunogenicity (T cell epitope count and density, and relative conservation with human counterparts). CHOPPI can generate a report for a specified CHO protein (e.g., identified from proteomics or immunoassays) or characterize an entire specified subset of the CHO genome (e.g., filtered based on confidence in transcription and similarity to human proteins). The ability to analyze potential CHOPs at a genomic scale provides a baseline to evaluate relative risk. We show here that CHOPPI can identify clear differences in immunogenicity risk among previously validated CHOPs, as well as identify additional "risky" CHO proteins that may be expressed during production and induce a detrimental immune response upon delivery. We conclude that CHOPPI is a powerful tool that provides a valuable computational complement to existing experimental approaches for CHOP risk assessment and can focus experimental efforts in the most important directions. Biotechnol. Bioeng. 2014;111: 2170-2182. © 2014 Wiley Periodicals, Inc.

  8. In Search of Practitioner-Based Social Capital: A Social Network Analysis Tool for Understanding and Facilitating Teacher Collaboration in a US-Based STEM Professional Development Program

    ERIC Educational Resources Information Center

    Baker-Doyle, Kira J.; Yoon, Susan A.

    2011-01-01

    This paper presents the first in a series of studies on the informal advice networks of a community of teachers in an in-service professional development program. The aim of the research was to use Social Network Analysis as a methodological tool to reveal the social networks developed by the teachers, and to examine whether these networks…

  10. Sustainability Tools Inventory - Initial Gaps Analysis | Science ...

    EPA Pesticide Factsheets

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4.

  11. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information for optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence, and the heterogeneous and homogeneous interactions of the organically associated elements, must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to the existing EERC spreadsheet application included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the acquired SEM image. With these image analysis capabilities, a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is

  12. Chapter 13: Tools for analysis

    Treesearch

    William Elliot; Kevin Hyde; Lee MacDonald; James. McKean

    2007-01-01

    This chapter presents a synthesis of current computer modeling tools that are, or could be, adopted for use in evaluating the cumulative watershed effects of fuel management. The chapter focuses on runoff, soil erosion and slope stability predictive tools. Readers should refer to chapters on soil erosion and stability for more detailed information on the physical...

  13. Geospatial tool-based morphometric analysis using SRTM data in Sarabanga Watershed, Cauvery River, Salem district, Tamil Nadu, India

    NASA Astrophysics Data System (ADS)

    Arulbalaji, P.; Gurugnanam, B.

    2017-02-01

    A morphometric analysis of the Sarabanga watershed in Salem district has been chosen for the present study. Geospatial tools, such as remote sensing and GIS, are utilized for the extraction of the river basin and its drainage networks. Shuttle Radar Topographic Mission (SRTM, 30 m resolution) data have been used for the morphometric analysis and for evaluating various morphometric parameters. The morphometric parameters of the Sarabanga watershed have been analyzed and evaluated using the pioneering methods of Horton and Strahler. A dendritic drainage pattern drains the Sarabanga watershed, indicating that lithology and a gentle slope category control the study area. The Sarabanga watershed covers an area of 1208 km2. The slope of the watershed varies from 10 to 40% and is controlled by the lithology of the watershed. The bifurcation ratio ranges from 3 to 4.66, indicating the influence of geological structure and that the watershed has suffered structural disturbances. The form factor indicates the elongated shape of the study area. The total stream length and area of the watershed indicate that mean annual rainfall runoff is relatively moderate. The basin relief indicates that the watershed has relatively high denudation rates. The drainage density of the watershed is low, indicating that infiltration is dominant. The ruggedness number shows that peak discharges are likely to be relatively high. The present study is very useful for planning watershed management.
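
    For readers unfamiliar with the Horton/Strahler parameters cited above, the hedged sketch below shows how a few of them (bifurcation ratio, drainage density, form factor, ruggedness number) are typically computed; the input values are hypothetical, not the Sarabanga data.

```python
# Standard morphometric parameters (Horton/Strahler); hypothetical inputs.
def bifurcation_ratios(stream_counts):
    """stream_counts[i] = number of streams of order i+1."""
    return [stream_counts[i] / stream_counts[i + 1]
            for i in range(len(stream_counts) - 1)]

area_km2 = 1208.0              # basin area
basin_length_km = 62.0         # hypothetical basin length
total_stream_length_km = 950.0 # hypothetical cumulative stream length
relief_km = 0.85               # hypothetical basin relief

drainage_density = total_stream_length_km / area_km2   # km per km^2
form_factor = area_km2 / basin_length_km ** 2
ruggedness_number = drainage_density * relief_km

print(bifurcation_ratios([420, 110, 30, 8, 2]))
print(drainage_density, form_factor, ruggedness_number)
```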

  14. Teaching tools in evidence based practice: evaluation of reusable learning objects (RLOs) for learning about meta-analysis.

    PubMed

    Bath-Hextall, Fiona; Wharrad, Heather; Leonardi-Bee, Jo

    2011-05-04

    All healthcare students are taught the principles of evidence based practice on their courses. The ability to understand the procedures used in systematically reviewing evidence reported in studies, such as meta-analysis, is an important element of evidence based practice. Meta-analysis is a difficult statistical concept for healthcare students to understand, yet it is an important technique used in systematic reviews to pool data from studies to look at the combined effectiveness of treatments. In other areas of the healthcare curricula, by supplementing lectures, workbooks and workshops with pedagogically designed, multimedia learning objects (known as reusable learning objects or RLOs) we have shown an improvement in students' perceived understanding of subjects they found difficult. In this study we describe the development and evaluation of two RLOs on meta-analysis. The RLOs supplement associated lectures and aim to improve healthcare students' understanding of meta-analysis. Following a quality-controlled design process, two RLOs were developed and delivered to two cohorts of students, a Master in Public Health course and a Postgraduate Diploma in Nursing course. Students' understanding of five key concepts of meta-analysis was measured before and after a lecture and again after RLO use. The RLOs were also evaluated for their educational value, learning support, media attributes and usability using closed and open questions. Students rated their understanding of meta-analysis as improved after a lecture and further improved after completing the RLOs (Wilcoxon paired test, p < 0.01 in all cases). Whilst the media components of the RLOs, such as animations, helped most students (86%) understand concepts including, for example, Forest plots, 93% of students rated usability and control as important to their learning. A small number of students stated they needed the support of a lecturer alongside the RLOs (7% 'Agreed' and 21% 'Neutral'). Meta-analysis RLOs that
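
    The before/after comparison described above uses a Wilcoxon paired (signed-rank) test; a minimal sketch with scipy is shown below, using made-up self-rating scores rather than the study's data.

```python
# Minimal Wilcoxon signed-rank example (paired before/after ratings);
# the scores are invented, not the study's data.
from scipy.stats import wilcoxon

before_lecture = [2, 3, 2, 1, 3, 2, 2, 4, 1, 2]
after_rlo      = [4, 4, 3, 3, 5, 4, 3, 5, 3, 4]

stat, p_value = wilcoxon(before_lecture, after_rlo)
print(f"W={stat}, p={p_value:.4f}")
```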

  15. Analysis Tools for CFD Multigrid Solvers

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

    2004-01-01

    Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.

  16. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and displays the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  17. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    PubMed

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  18. A Performance-Based Web Budget Tool

    ERIC Educational Resources Information Center

    Abou-Sayf, Frank K.; Lau, Wilson

    2007-01-01

    A web-based, formula-driven tool has been developed for the purpose of performing two distinct academic department budgeting functions: allocation of funding to the department, and budget management by the department. The tool's major features are discussed and its uses demonstrated. The tool's advantages are presented. (Contains 10 figures.)

  19. SamuROI, a Python-Based Software Tool for Visualization and Analysis of Dynamic Time Series Imaging at Multiple Spatial Scales

    PubMed Central

    Rueckl, Martin; Lenzi, Stephen C.; Moreno-Velasquez, Laura; Parthier, Daniel; Schmitz, Dietmar; Ruediger, Sten; Johenning, Friedrich W.

    2017-01-01

    The measurement of activity in vivo and in vitro has shifted from electrical to optical methods. While the indicators for imaging activity have improved significantly over the last decade, tools for analysing optical data have not kept pace. Most available analysis tools are limited in their flexibility and applicability to datasets obtained at different spatial scales. Here, we present SamuROI (Structured analysis of multiple user-defined ROIs), an open source Python-based analysis environment for imaging data. SamuROI simplifies exploratory analysis and visualization of image series of fluorescence changes in complex structures over time and is readily applicable at different spatial scales. In this paper, we show the utility of SamuROI in Ca2+-imaging based applications at three spatial scales: the micro-scale (i.e., sub-cellular compartments including cell bodies, dendrites and spines); the meso-scale, (i.e., whole cell and population imaging with single-cell resolution); and the macro-scale (i.e., imaging of changes in bulk fluorescence in large brain areas, without cellular resolution). The software described here provides a graphical user interface for intuitive data exploration and region of interest (ROI) management that can be used interactively within Jupyter Notebook: a publicly available interactive Python platform that allows simple integration of our software with existing tools for automated ROI generation and post-processing, as well as custom analysis pipelines. SamuROI software, source code and installation instructions are publicly available on GitHub and documentation is available online. SamuROI reduces the energy barrier for manual exploration and semi-automated analysis of spatially complex Ca2+ imaging datasets, particularly when these have been acquired at different spatial scales. PMID:28706482
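
    A core operation behind ROI-based tools like SamuROI is extracting a fluorescence time series for each region of interest and converting it to ΔF/F; the sketch below shows that step with plain numpy on a synthetic image stack (it is not SamuROI code).

```python
# Extract a per-ROI fluorescence trace and convert to dF/F; synthetic data,
# not SamuROI code.
import numpy as np

rng = np.random.default_rng(0)
stack = rng.poisson(100, size=(500, 64, 64)).astype(float)  # (time, y, x)
roi_mask = np.zeros((64, 64), dtype=bool)
roi_mask[20:30, 40:50] = True                                # one rectangular ROI

trace = stack[:, roi_mask].mean(axis=1)        # mean ROI intensity per frame
f0 = np.percentile(trace, 10)                  # simple baseline estimate
df_over_f = (trace - f0) / f0
print(df_over_f[:5])
```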

  20. SamuROI, a Python-Based Software Tool for Visualization and Analysis of Dynamic Time Series Imaging at Multiple Spatial Scales.

    PubMed

    Rueckl, Martin; Lenzi, Stephen C; Moreno-Velasquez, Laura; Parthier, Daniel; Schmitz, Dietmar; Ruediger, Sten; Johenning, Friedrich W

    2017-01-01

    The measurement of activity in vivo and in vitro has shifted from electrical to optical methods. While the indicators for imaging activity have improved significantly over the last decade, tools for analysing optical data have not kept pace. Most available analysis tools are limited in their flexibility and applicability to datasets obtained at different spatial scales. Here, we present SamuROI (Structured analysis of multiple user-defined ROIs), an open source Python-based analysis environment for imaging data. SamuROI simplifies exploratory analysis and visualization of image series of fluorescence changes in complex structures over time and is readily applicable at different spatial scales. In this paper, we show the utility of SamuROI in Ca(2+)-imaging based applications at three spatial scales: the micro-scale (i.e., sub-cellular compartments including cell bodies, dendrites and spines); the meso-scale, (i.e., whole cell and population imaging with single-cell resolution); and the macro-scale (i.e., imaging of changes in bulk fluorescence in large brain areas, without cellular resolution). The software described here provides a graphical user interface for intuitive data exploration and region of interest (ROI) management that can be used interactively within Jupyter Notebook: a publicly available interactive Python platform that allows simple integration of our software with existing tools for automated ROI generation and post-processing, as well as custom analysis pipelines. SamuROI software, source code and installation instructions are publicly available on GitHub and documentation is available online. SamuROI reduces the energy barrier for manual exploration and semi-automated analysis of spatially complex Ca(2+) imaging datasets, particularly when these have been acquired at different spatial scales.

  1. Monoclonal antibody based inhibition ELISA as a new tool for the analysis of melamine in milk and pet food samples.

    PubMed

    Zhou, Yu; Li, Chun-Yuan; Li, Yan-Song; Ren, Hong-Lin; Lu, Shi-Ying; Tian, Xiang-Li; Hao, Ya-Ming; Zhang, Yuan-Yuan; Shen, Qing-Feng; Liu, Zeng-Shan; Meng, Xian-Mei; Zhang, Jun-Hui

    2012-12-15

    Recent cases of melamine misuse to give a false impression of high protein content in milk, which emerged in China in September 2008, have become an international health event. To meet the need for rapid and reliable monitoring of melamine in milk samples, a monoclonal antibody (mAb) was produced and an inhibition enzyme-linked immunosorbent assay (ELISA) based on the mAb was developed. The standard curve was linear in the range from 0.03 to 9 ng mL(-1) with a detection limit (LOD) of 0.01 ng mL(-1). The sensitivity of the assay was 0.35 ng mL(-1). The average recovery values of melamine in liquid milk, powder milk, dog food and cat food were 99%, 96%, 9% and 98%, respectively, and the coefficient of variation (CV) values of all samples were less than 10%. The obtained results showed a potential method for the rapid and reliable monitoring of melamine in liquid milk and milk powder samples. Copyright © 2012 Elsevier Ltd. All rights reserved.
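
    The calibration and quality figures quoted above (standard curve, recovery, CV) follow from routine calculations; the sketch below illustrates them in Python with invented absorbance readings, and the semi-log linear fit is an assumed curve form rather than the authors' exact procedure.

```python
# Illustrative ELISA calibration and recovery/CV calculations; the readings
# are invented and the semi-log linear fit is an assumed curve form.
import numpy as np

conc = np.array([0.03, 0.1, 0.3, 1.0, 3.0, 9.0])        # ng/mL standards
signal = np.array([0.95, 0.82, 0.66, 0.49, 0.33, 0.20])  # normalized B/B0

slope, intercept = np.polyfit(np.log10(conc), signal, 1)

def estimate_conc(b_b0):
    """Invert the fitted semi-log standard curve."""
    return 10 ** ((b_b0 - intercept) / slope)

spiked, measured = 2.0, np.array([1.94, 2.05, 1.98])     # ng/mL, replicates
recovery = measured.mean() / spiked * 100
cv = measured.std(ddof=1) / measured.mean() * 100
print(f"recovery={recovery:.1f}%  CV={cv:.1f}%  est={estimate_conc(0.4):.2f} ng/mL")
```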

  2. 3Omics: a web-based systems biology tool for analysis, integration and visualization of human transcriptomic, proteomic and metabolomic data.

    PubMed

    Kuo, Tien-Chueh; Tian, Tze-Feng; Tseng, Yufeng Jane

    2013-07-23

    Integrative and comparative analyses of multiple transcriptomics, proteomics and metabolomics datasets require an intensive knowledge of tools and background concepts. Thus, it is challenging for users to perform such analyses, highlighting the need for a single tool for such purposes. The 3Omics one-click web tool was developed to visualize and rapidly integrate multiple human inter- or intra-transcriptomic, proteomic, and metabolomic data by combining five commonly used analyses: correlation networking, coexpression, phenotyping, pathway enrichment, and GO (Gene Ontology) enrichment. 3Omics generates inter-omic correlation networks to visualize relationships in data with respect to time or experimental conditions for all transcripts, proteins and metabolites. If only two of three omics datasets are input, then 3Omics supplements the missing transcript, protein or metabolite information related to the input data by text-mining the PubMed database. 3Omics' coexpression analysis assists in revealing functions shared among different omics datasets. 3Omics' phenotype analysis integrates Online Mendelian Inheritance in Man with available transcript or protein data. Pathway enrichment analysis on metabolomics data by 3Omics reveals enriched pathways in the KEGG/HumanCyc database. 3Omics performs statistical Gene Ontology-based functional enrichment analyses to display significantly overrepresented GO terms in transcriptomic experiments. Although the principal application of 3Omics is the integration of multiple omics datasets, it is also capable of analyzing individual omics datasets. The information obtained from the analyses of 3Omics in Case Studies 1 and 2 is also in accordance with comprehensive findings in the literature. 3Omics incorporates the advantages and functionality of existing software into a single platform, thereby simplifying data analysis and enabling the user to perform a one-click integrated analysis. Visualization and analysis results are

  3. General Mission Analysis Tool (GMAT) Mathematical Specifications

    NASA Technical Reports Server (NTRS)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  4. Multi-mission telecom analysis tool

    NASA Technical Reports Server (NTRS)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  5. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.

  6. Thermal System Analysis Tools (TSAT)

    DTIC Science & Technology

    2007-11-02

    [Report excerpt: figure-list entries "Visual Basic Development Window" and "Visual Basic Toolbox with TSAT Engineering Tools Added"; the accompanying text notes that such component objects can be assembled in an appropriate Windows application, such as Excel, PowerPoint, Visual Basic, and Visio.]

  7. The (non)comparability of the correlation effect size across different measurement procedures: a challenge to meta-analysis as a tool for identifying "evidence based practices".

    PubMed

    Nugent, William R

    2011-05-01

    Meta-analysis is becoming a principal tool for research synthesis and for the identification and justification of evidence based practices. A fundamental assumption in meta-analysis is that effect sizes based upon different measures are comparable. Recent work has challenged this assumption in the case of the standardized mean difference. In this article it is shown that population universe (true) score level correlation effect sizes, for the relationship between two constructs A and B, based upon different measures will be comparable only if construct validity invariance holds across the measures used to make inferences to A and the measures used to make inferences to B. The results of a simulation study are also reported which show that the results of a meta-analysis may be significantly and adversely affected by violations of construct validity invariance. Finally, it is concluded that the theoretical results obtained in this article, and the results of the simulation study, combine to suggest that the role of meta-analysis in the synthesis of social work research, and in the identification of evidence based practices, be de-emphasized until important questions about the sensitivity of meta-analysis to violations of construct validity invariance are answered.
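
    One concrete way to see the comparability problem is through the classical attenuation relation, in which the observed correlation shrinks with the reliability (and, more generally, the construct validity) of each measure; the sketch below only illustrates that attenuation effect with simulated data and is not the article's model.

```python
# Classical attenuation: observed r depends on the measures' reliabilities,
# so "the same" effect measured with different instruments need not agree.
# Illustration only; this is not the article's model.
import numpy as np

rng = np.random.default_rng(1)
n, true_r = 10_000, 0.5
a = rng.standard_normal(n)
b = true_r * a + np.sqrt(1 - true_r**2) * rng.standard_normal(n)

for rel_a, rel_b in [(0.9, 0.9), (0.7, 0.9), (0.5, 0.6)]:
    # add measurement error so each observed score has the given reliability
    x = np.sqrt(rel_a) * a + np.sqrt(1 - rel_a) * rng.standard_normal(n)
    y = np.sqrt(rel_b) * b + np.sqrt(1 - rel_b) * rng.standard_normal(n)
    print(rel_a, rel_b, round(np.corrcoef(x, y)[0, 1], 3),
          "expected:", round(true_r * np.sqrt(rel_a * rel_b), 3))
```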

  8. Analysis of Utility and Use of a Web-Based Tool for Digital Signal Processing Teaching by Means of a Technological Acceptance Model

    ERIC Educational Resources Information Center

    Toral, S. L.; Barrero, F.; Martinez-Torres, M. R.

    2007-01-01

    This paper presents an exploratory study about the development of a structural and measurement model for the technological acceptance (TAM) of a web-based educational tool. The aim consists of measuring not only the use of this tool, but also the external variables with a significant influence in its use for planning future improvements. The tool,…

  10. An image-based software tool for screening retinal fundus images using vascular morphology and network transport analysis

    NASA Astrophysics Data System (ADS)

    Clark, Richard D.; Dickrell, Daniel J.; Meadows, David L.

    2014-03-01

    As the number of digital retinal fundus images taken each year grows at an increasing rate, there exists a similarly increasing need for automatic eye disease detection through image-based analysis. A new method has been developed for classifying standard color fundus photographs into both healthy and diseased categories. This classification was based on the calculated network fluid conductance, a function of the geometry and connectivity of the vascular segments. To evaluate the network resistance, the retinal vasculature was first manually separated from the background to ensure an accurate representation of the geometry and connectivity. The arterial and venous networks were then semi-automatically separated into two separate binary images. The connectivity of the arterial network was then determined through a series of morphological image operations. The network comprised segments of vasculature and points of bifurcation, with each segment having characteristic geometric and fluid properties. Based on the connectivity and fluid resistance of each vascular segment, an arterial network flow conductance was calculated, which described the ease with which blood can pass through a vascular system. In this work, 27 eyes (13 healthy and 14 diabetic) from patients roughly 65 years in age were evaluated using this methodology. Healthy arterial networks exhibited an average fluid conductance of 419 ± 89 μm3/mPa-s while the average network fluid conductance of the diabetic set was 165 ± 87 μm3/mPa-s (p < 0.001). The results of this new image-based software demonstrated an ability to automatically, quantitatively and efficiently screen diseased eyes from color fundus imagery.
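
    The network fluid conductance described above combines per-segment Poiseuille conductances according to how the segments are connected; the hedged sketch below shows the idea for a tiny hypothetical three-segment arterial tree (it is not the paper's software, and the viscosity, radii, and lengths are invented).

```python
# Poiseuille conductance of vessel segments combined for a tiny hypothetical
# bifurcating tree: one parent segment feeding two daughters in parallel.
# Not the paper's software; viscosity, radii and lengths are invented.
import math

MU = 3.5e-3  # assumed blood viscosity, Pa*s

def segment_conductance(radius_um, length_um):
    """Poiseuille: g = pi * r^4 / (8 * mu * L), returned in um^3/(mPa*s)."""
    g_si = math.pi * (radius_um * 1e-6) ** 4 / (8 * MU * length_um * 1e-6)
    return g_si * 1e15  # m^3/(Pa*s) -> um^3/(mPa*s)

g_parent = segment_conductance(60, 800)
g_d1 = segment_conductance(45, 1200)
g_d2 = segment_conductance(40, 1000)

g_daughters = g_d1 + g_d2                          # parallel branches add
g_network = 1 / (1 / g_parent + 1 / g_daughters)   # series combination
print(f"network conductance ~ {g_network:.0f} um^3/mPa-s")
```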

  11. Patent analysis as a tool for research planning: study on natural based therapeutics against cancer stem cells.

    PubMed

    Arya, Richa; Bhutkar, Smita; Dhulap, Sivakami; Hirwani, R R

    2015-01-01

    Medicines developed from traditional systems are well known for their various important pharmaceutical uses. Cancer has been known since ancient times and is mentioned in the ancient Ayurvedic books. Thus, naturally based products play a significant role in cancer chemotherapeutics. Further, approximately 70% of anticancer compounds are based on natural products or have been derived from their structural scaffolds. Hence, there is a growing interest in developing medicines from these natural resources. Among the methods of treating cancer, therapies targeting cancer stem cells have been found to control metastatic tumors; cancer stem cells are a newly identified factor associated with relapse. This patent review aims to highlight the use of natural products to treat cancer by targeting the cancer stem cells. The review will also provide insights into the reported mechanisms by which the natural products act in order to suppress or kill cancer stem cells. The analysis has been done using various criteria, such as the patenting trend over the years, a comparison of active assignees, and a comparison of the technical aspects disclosed in the different patent documents. The analysis further highlights different bioactives whose scaffolds could be promising candidates in the development of anti-cancer drugs targeting the cancer stem cells. The technical aspects covered in this review include: bioactives and formulations comprising the extracts or bioactives, their mode of action, and the type of assay considered to study the efficacy of the natural products. Further, the mapping has helped us to identify potential therapeutic areas in which to evaluate herbs/bioactives and their uses for developing new formulations.

  12. 1H NMR Spectroscopy and Multivariate Analysis of Monovarietal EVOOs as a Tool for Modulating Coratina-Based Blends

    PubMed Central

    Del Coco, Laura; De Pascali, Sandra Angelica; Fanizzi, Francesco Paolo

    2014-01-01

    Coratina cultivar-based olives are very common among 100% Italian extra virgin olive oils (EVOOs). Often, the very spicy character of this cultivar, mostly due to the high polyphenols concentration, requires blending with other “sweetener” oils. In this work, monovarietal EVOO samples from the Coratina cultivar (Apulia, Italy) were investigated and compared with monovarietal EVOO from native or recently introduced Apulian (Italy) cultivars (Ogliarola Garganica, Ogliarola Barese, Cima di Mola, Peranzana, Picholine), from Calabria (Italy) (Carolea and Rossanese) and from other Mediterranean countries, such as Spain (Picual) and Greece (Kalamata and Koroneiki) by 1H NMR spectroscopy and multivariate analysis (principal component analysis (PCA)). In this regard, NMR signals could allow a first qualitative evaluation of the chemical composition of EVOO and, in particular, of its minor component content (phenols and aldehydes), an intrinsic behavior of EVOO taste, related to the cultivar and geographical origins. Moreover, this study offers an opportunity to address blended EVOOs tastes by using oils from a specific region or country of origin. PMID:28234316

  13. Vibalizer: a free, web-based tool for rapid, quantitative comparison and analysis of calculated vibrational modes.

    PubMed

    Grafton, Anthony K

    2007-05-01

    This report describes the development and applications of a software package called Vibalizer, the first and only method that provides free, fast, interactive, and quantitative comparison and analysis of calculated vibrational modes. Using simple forms and menus in a web-based interface, Vibalizer permits the comparison of vibrational modes from different, but similar molecules and also performs rapid calculation and comparison of isotopically substituted molecules' normal modes. Comparing and matching complex vibrational modes can be completed in seconds with Vibalizer, whereas matching vibrational modes manually can take hours and gives only qualitative comparisons subject to human error and differing individual judgments. In addition to these core features, Vibalizer also provides several other useful features, including the ability to automatically determine first-approximation mode descriptions, to help users analyze the results of vibrational frequency calculations. Because the software can be dimensioned to handle almost arbitrarily large systems, Vibalizer may be of particular use when analyzing the vibrational modes of complex systems such as proteins and extended materials systems. Additionally, the ease of use of the Vibalizer interface and the straightforward interpretation of results may find favor with educators who incorporate molecular modeling into their classrooms. The Vibalizer interface is available for free use at http://www.compchem.org, and it is also available as a locally-installable package that will run on a Linux-based web server.
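
    Quantitative matching of vibrational modes, of the sort Vibalizer automates, can be reduced to comparing normalized displacement vectors between two calculations; the sketch below shows one simple overlap-based matching scheme and is an illustration of the general idea only, not Vibalizer's algorithm.

```python
# Simple overlap-based matching of normal modes between two calculations.
# Illustration of the general idea only, not Vibalizer's algorithm.
import numpy as np

def match_modes(modes_a, modes_b):
    """modes_*: arrays of shape (n_modes, 3*n_atoms) of displacement vectors."""
    a = modes_a / np.linalg.norm(modes_a, axis=1, keepdims=True)
    b = modes_b / np.linalg.norm(modes_b, axis=1, keepdims=True)
    overlap = np.abs(a @ b.T)            # |dot product| between every mode pair
    best = overlap.argmax(axis=1)        # best partner in B for each mode in A
    return best, overlap

rng = np.random.default_rng(2)
ref = rng.standard_normal((6, 9))        # 6 modes of a toy 3-atom system
best, overlap = match_modes(ref, ref + 0.05 * rng.standard_normal((6, 9)))
print(best, overlap.max(axis=1).round(3))
```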

  14. A custom image-based analysis tool for quantifying elastin and collagen micro-architecture in the wall of the human aorta from multi-photon microscopy.

    PubMed

    Koch, Ryan G; Tsamis, Alkiviadis; D'Amore, Antonio; Wagner, William R; Watkins, Simon C; Gleason, Thomas G; Vorp, David A

    2014-03-21

    The aorta possesses a micro-architecture that imparts and supports a high degree of compliance and mechanical strength. Alteration of the quantity and/or arrangement of the main load-bearing components of this micro-architecture--the elastin and collagen fibers--leads to mechanical, and hence functional, changes associated with aortic disease and aging. Therefore, in the future, the ability to rigorously characterize the wall fiber micro-architecture could provide insight into the complicated mechanisms of aortic wall remodeling in aging and disease. Elastin and collagen fibers can be observed using state-of-the-art multi-photon microscopy. Image-analysis algorithms have been effective at characterizing fibrous constructs using various microscopy modalities. The objective of this study was to develop a custom MATLAB-language automated image-based analysis tool to describe multiple parameters of elastin and collagen micro-architecture in human soft fibrous tissue samples using multi-photon microscopy images. Human aortic tissue samples were used to develop the code. The tool smooths, cleans and equalizes fiber intensities in the image before segmenting the fibers into a binary image. The binary image is cleaned and thinned to a fiber skeleton representation of the image. The developed software analyzes the fiber skeleton to obtain intersections, fiber orientation, concentration, porosity, diameter distribution, segment length and tortuosity. In the future, the developed custom image-based analysis tool can be used to describe the micro-architecture of aortic wall samples in a variety of conditions. While this work targeted the aorta, the software has the potential to describe the architecture of other fibrous materials, tube-like networks and connective tissues.
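
    The processing steps the tool performs (smoothing, binarization, skeletonization, porosity) map onto standard image-processing operations; the sketch below reproduces that general pipeline with scikit-image on a synthetic image and is not the authors' MATLAB tool.

```python
# Generic smooth -> threshold -> skeletonize pipeline of the kind described;
# synthetic image, not the authors' MATLAB tool.
import numpy as np
from skimage.filters import gaussian, threshold_otsu
from skimage.morphology import skeletonize

rng = np.random.default_rng(3)
image = rng.random((256, 256))
image[100:110, :] += 1.0          # a horizontal "fiber"
image[:, 60:66] += 1.0            # a vertical "fiber"

smoothed = gaussian(image, sigma=2)
binary = smoothed > threshold_otsu(smoothed)
skeleton = skeletonize(binary)

porosity = 1.0 - binary.mean()            # fraction of non-fiber pixels
fiber_length_px = int(skeleton.sum())     # rough skeleton length in pixels
print(f"porosity={porosity:.2f}, skeleton pixels={fiber_length_px}")
```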

  15. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parame...

  16. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execu...

  18. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGICAL MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execut...

  1. Problem-Based Learning Tools

    ERIC Educational Resources Information Center

    Chin, Christine; Chia, Li-Gek

    2008-01-01

    One way of implementing project-based science (PBS) is to use problem-based learning (PBL), in which students formulate their own problems. These problems are often ill-structured, mirroring complex real-life problems where data are often messy and inconclusive. In this article, the authors describe how they used PBL in a ninth-grade biology class in…

  2. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing

    PubMed Central

    2014-01-01

    Background. RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. Results. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program’s functionality. Conclusions. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312

  3. Model Analysis ToolKit

    SciTech Connect

    Harp, Dylan R.

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes defining parameters, defining observations, defining the model (a Python function), and defining samplesets (sets of parameter combinations). Currently supported functionality includes forward model runs, Latin-Hypercube sampling of parameters, multi-dimensional parameter studies, parallel execution of parameter samples, model calibration using an internal Levenberg-Marquardt algorithm, model calibration using the lmfit package, model calibration using the levmar package, and Markov Chain Monte Carlo using the pymc package. MATK facilitates model analysis using scipy (calibration via scipy.optimize) and rpy2 (a Python interface to R).
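
    As a rough illustration of the kind of parameter sampling listed above, the sketch below draws a Latin-Hypercube sampleset and runs a toy forward model over it; it uses scipy's own sampler as a stand-in rather than MATK's API, whose exact call signatures are not shown here.

```python
# Latin-Hypercube parameter study over a toy forward model, using scipy's
# sampler as a stand-in; this is not MATK's own API.
import numpy as np
from scipy.stats import qmc

def forward_model(k1, k2):
    """Toy model standing in for a user-supplied Python function."""
    return k1 * np.exp(-k2)

bounds_low, bounds_high = [0.1, 0.01], [10.0, 1.0]
sampler = qmc.LatinHypercube(d=2, seed=42)
samples = qmc.scale(sampler.random(n=50), bounds_low, bounds_high)

results = np.array([forward_model(k1, k2) for k1, k2 in samples])
print(results.mean(), results.std())
```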

  4. Is principal component analysis an effective tool to predict face attractiveness? A contribution based on real 3D faces of highly selected attractive women, scanned with stereophotogrammetry.

    PubMed

    Galantucci, Luigi Maria; Di Gioia, Eliana; Lavecchia, Fulvio; Percoco, Gianluca

    2014-05-01

    In the literature, several papers report studies on mathematical models used to describe facial features and to predict female facial beauty based on 3D human face data. Many authors have proposed the principal component analysis (PCA) method, which permits modeling of the entire human face using a limited number of parameters. In some cases, these models have been correlated with beauty classifications, obtaining good attractiveness predictability using wrapped 2D or 3D models. To verify these results, in this paper, the authors conducted a three-dimensional digitization study of 66 very attractive female subjects using a computerized noninvasive tool known as 3D digital photogrammetry. The sample consisted of the 64 contestants of the final phase of the Miss Italy 2010 beauty contest, plus the two highest-ranked contestants in the 2009 competition. PCA was conducted on this sample of real faces to verify whether there is a correlation between ranking and the principal components of the face models. No correlation was found; therefore, this hypothesis is not confirmed for our sample. Considering that the results of the contest are not solely a function of facial attractiveness, but are undoubtedly significantly impacted by it, the authors, based on their experience and on real faces, conclude that PCA is not a valid prediction tool for attractiveness. The database of the features belonging to the analyzed sample is downloadable online and further contributions are welcome.
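
    The analysis described above amounts to projecting face geometry onto principal components and checking whether any component correlates with contest ranking; a minimal sketch of that check, on random stand-in data rather than the digitized faces, is shown below.

```python
# PCA of face-shape vectors followed by a rank correlation with contest
# placing; random stand-in data, not the digitized Miss Italy faces.
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
faces = rng.standard_normal((66, 300))   # 66 subjects x flattened landmark coords
ranking = rng.permutation(66) + 1        # stand-in contest placing

scores = PCA(n_components=5).fit_transform(faces)
for i in range(scores.shape[1]):
    rho, p = spearmanr(scores[:, i], ranking)
    print(f"PC{i + 1}: rho={rho:+.2f}, p={p:.2f}")
```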

  5. LabVIEW-Based Data Acquisition, Control, and Analysis Programs for BESSY as Versatile Tools for Optimization and Machine Controls

    SciTech Connect

    Dressler, O.; Feikes, J.; Kuske, P.; Kuszynski, J.

    2004-11-10

    Complex machines like synchrotron light sources or newly proposed Free Electron Lasers (FEL) possess a variety of coupled parameters that require complex optimization procedures to achieve the best possible working conditions. A programming tool like LabVIEW, with its emphasis on easy data acquisition and its very high flexibility, is used extensively to simultaneously access diverse measurement instruments such as oscilloscopes, spectrum analyzers, and waveform generators, and to combine them into measurement routines that can access all process variables available under EPICS.

  6. Public Participation Guide: Form-Based Tools

    EPA Pesticide Factsheets

    Form-based tools are tools that require participants to complete a form – whether in hard-copy (paper) or on the web – to respond to specific questions, register general comments about particular issues, evaluate various options, or rank order preferences.

  7. Prospector: A web-based tool for rapid acquisition of gold standard data for pathology research and image analysis.

    PubMed

    Wright, Alexander I; Magee, Derek R; Quirke, Philip; Treanor, Darren E

    2015-01-01

    Obtaining ground truth for pathological images is essential for various experiments, especially for training and testing image analysis algorithms. However, obtaining pathologist input is often difficult, time consuming and expensive. This leads to algorithms being over-fitted to small datasets, and inappropriate validation, which causes poor performance on real world data. There is a great need to gather data from pathologists in a simple and efficient manner, in order to maximise the amount of data obtained. We present a lightweight, web-based HTML5 system for administering and participating in data collection experiments. The system is designed for rapid input with minimal effort, and can be accessed from anywhere in the world with a reliable internet connection. We present two case studies that use the system to assess how limitations on fields of view affect pathologist agreement, and to what extent poorly stained slides affect judgement. In both cases, the system collects pathologist scores at a rate of less than two seconds per image. The system has multiple potential applications in pathology and other domains.

  8. HydrogeoSieveXL: an Excel-based tool to estimate hydraulic conductivity from grain-size analysis

    NASA Astrophysics Data System (ADS)

    Devlin, J. F.

    2015-06-01

    For over a century, hydrogeologists have estimated hydraulic conductivity ( K) from grain-size distribution curves. The benefits of the practice are simplicity, cost, and a means of identifying spatial variations in K. Many techniques have been developed over the years, but all suffer from similar shortcomings: no accounting of heterogeneity within samples (i.e., aquifer structure is lost), loss of grain packing characteristics, and failure to account for the effects of overburden pressure on K. In addition, K estimates can vary by an order of magnitude between the various methods, and it is not generally possible to identify the best method for a given sample. The drawbacks are serious, but the advantages have seen the use of grain-size distribution curves for K estimation continue, often using a single selected method to estimate K in a given project. In most cases, this restriction results from convenience. It is proposed here that extending the analysis to include several methods would be beneficial since it would provide a better indication of the range of K that might apply. To overcome the convenience limitation, an Excel-based spreadsheet program, HydrogeoSieveXL, is introduced here. HydrogeoSieveXL is a freely available program that calculates K from grain-size distribution curves using 15 different methods. HydrogeoSieveXL was found to calculate K values essentially identical to those reported in the literature, using the published grain-size distribution curves.
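
    Grain-size-based K estimators of the kind HydrogeoSieveXL implements generally interpolate a characteristic grain diameter (such as d10) from the sieve curve and feed it into an empirical formula; the sketch below shows one common form, the Hazen approximation, with a hypothetical sieve curve, and is not the spreadsheet's own code.

```python
# Estimate d10 from a sieve curve and apply the Hazen approximation
# K [cm/s] ~ C * d10[mm]^2 (C taken as 1.0 here); hypothetical sieve data,
# not HydrogeoSieveXL's own code.
import numpy as np

grain_mm = np.array([0.075, 0.15, 0.30, 0.60, 1.18, 2.36])    # sieve openings
percent_finer = np.array([3.0, 8.0, 22.0, 55.0, 82.0, 96.0])  # % passing

# interpolate d10 (grain size at 10% finer) on a log grain-size axis
d10_mm = 10 ** np.interp(10.0, percent_finer, np.log10(grain_mm))

C = 1.0                        # Hazen coefficient (commonly quoted as ~0.4-1.2)
K_cm_per_s = C * d10_mm ** 2
print(f"d10 ~ {d10_mm:.3f} mm, Hazen K ~ {K_cm_per_s:.2e} cm/s")
```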

  9. SHARAD Radargram Analysis Tool Development in JMARS

    NASA Astrophysics Data System (ADS)

    Adler, J. B.; Anwar, S.; Dickenshied, S.; Carter, S.

    2016-09-01

    New tools are being developed in JMARS, a free GIS software, for SHARAD radargram viewing and analysis. These capabilities are useful for the polar science community, and for constraining the viability of ice resource deposits for human exploration.

  10. 2010 Solar Market Transformation Analysis and Tools

    SciTech Connect

    none,

    2010-04-01

    This document describes the DOE-funded solar market transformation analysis and tools under development in Fiscal Year 2010 so that stakeholders can access available resources and get engaged where interested.

  11. Statistical methods for the forensic analysis of striated tool marks

    SciTech Connect

    Hoeksema, Amy Beth

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
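    As a hedged illustration of the general likelihood-ratio idea described above (not the authors' exact statistic), the sketch below compares two samples of similarity scores under a normal model, testing whether field-versus-lab comparisons share the distribution of lab-versus-lab comparisons; all score values are hypothetical.

        # Likelihood-ratio test for a difference between two samples of similarity
        # scores, assuming normality (simplified illustration only).
        import numpy as np
        from scipy.stats import norm, chi2

        def lr_test(x, y):
            x, y = np.asarray(x, float), np.asarray(y, float)
            pooled = np.concatenate([x, y])
            # H0: one common mean and variance; H1: separate means and variances.
            ll0 = norm.logpdf(pooled, pooled.mean(), pooled.std()).sum()
            ll1 = (norm.logpdf(x, x.mean(), x.std()).sum()
                   + norm.logpdf(y, y.mean(), y.std()).sum())
            stat = 2.0 * (ll1 - ll0)
            return stat, chi2.sf(stat, df=2)   # two extra free parameters under H1

        lab_vs_lab   = [0.82, 0.79, 0.85, 0.81, 0.88, 0.84]   # hypothetical scores
        field_vs_lab = [0.61, 0.58, 0.66, 0.63, 0.60, 0.65]
        stat, p = lr_test(lab_vs_lab, field_vs_lab)
        print(f"LR statistic = {stat:.2f}, approximate p = {p:.3g}")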

  12. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

    Reverse engineering tools can be used to satisfy the information needs of software maintainers; tool support is essential especially when maintaining large-scale legacy systems. Reverse engineering tools offer various kinds of capabilities for delivering the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  13. Statistical Tools for Forensic Analysis of Toolmarks

    SciTech Connect

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  14. Algal Functional Annotation Tool: a web-based analysis suite to functionally interpret large gene lists using integrated annotation and expression data

    PubMed Central

    2011-01-01

    Background Progress in genome sequencing is proceeding at an exponential pace, and several new algal genomes are becoming available every year. One of the challenges facing the community is the association of protein sequences encoded in the genomes with biological function. While most genome assembly projects generate annotations for predicted protein sequences, they are usually limited and integrate functional terms from a limited number of databases. Another challenge is the use of annotations to interpret large lists of 'interesting' genes generated by genome-scale datasets. Previously, these gene lists had to be analyzed across several independent biological databases, often on a gene-by-gene basis. In contrast, several annotation databases, such as DAVID, integrate data from multiple functional databases and reveal underlying biological themes of large gene lists. While several such databases have been constructed for animals, none is currently available for the study of algae. Due to renewed interest in algae as potential sources of biofuels and the emergence of multiple algal genome sequences, a significant need has arisen for such a database to process the growing compendiums of algal genomic data. Description The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of genes on KEGG pathway maps
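    The enrichment step such annotation tools perform is commonly a hypergeometric over-representation test; the sketch below shows that calculation under assumed gene counts (it is not the Algal Functional Annotation Tool's implementation, and all numbers are hypothetical).

        # Hypergeometric test for over-representation of an annotation term in a
        # user-supplied gene list relative to the genome background.
        from scipy.stats import hypergeom

        def term_enrichment_p(genome_size, genes_with_term, list_size, list_hits):
            # P(X >= list_hits) when drawing list_size genes from a genome in which
            # genes_with_term genes carry the annotation term.
            return hypergeom.sf(list_hits - 1, genome_size, genes_with_term, list_size)

        # Hypothetical numbers: 15,000 annotated genes, 120 in the pathway of
        # interest, 300 genes in the user's list, 12 of which carry the term.
        p = term_enrichment_p(15000, 120, 300, 12)
        print(f"enrichment p-value = {p:.3e}")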

  15. Simplified building energy analysis tool for architects

    NASA Astrophysics Data System (ADS)

    Chaisuparasmikul, Pongsak

    Energy Modeler is an energy software program designed to study the relative change of energy uses (heating, cooling, and lighting loads) in different architectural design schemes. This research focuses on developing a tool to improve energy efficiency of the built environment. The research studied the impact of different architectural design responses for two distinct global climates: temperate and tropical climatic zones. This energy-based interfacing program is intended to help architects, engineers, educators, students, building designers, major consumers of architectural services, and other professionals whose work interfaces with that of architects to perceive, quickly visualize, and compare the energy performance and savings of different design schemes. The buildings in which we live or work have a great impact on our natural environment. Energy savings and consumption reductions in our buildings are probably among the best indicators of progress toward environmental sustainability, since they reduce the depletion of the world's fossil fuels (oil, natural gas, coal, etc.). When architects set about designing an environmentally responsive building for an owner or the public, they often lack the energy-based information and design tools to tell them whether building loads and energy consumption respond strongly to the modifications they make. Buildings are dynamic in nature and change over time, with many design variables involved. Architects need energy-based rules or tools to assist them in the design process. Energy-efficient design for sustainable solutions requires attention throughout the design process and is closely tied to architectural decisions. Early involvement is the only guaranteed way of properly considering fundamental building design issues related to building site, form and exposure. The research presents the methodology and process, which leads to the discussion of the research findings. The innovative work is to make these tools

  16. JavaProtein Dossier: a novel web-based data visualization tool for comprehensive analysis of protein structure

    PubMed Central

    Neshich, Goran; Rocchia, Walter; Mancini, Adauto L.; Yamagishi, Michel E. B.; Kuser, Paula R.; Fileto, Renato; Baudet, Christian; Pinto, Ivan P.; Montagner, Arnaldo J.; Palandrani, Juliana F.; Krauchenco, Joao N.; Torres, Renato C.; Souza, Savio; Togawa, Roberto C.; Higa, Roberto H.

    2004-01-01

    JavaProtein Dossier (JPD) is a new concept, database and visualization tool providing one of the largest collections of the physicochemical parameters describing proteins' structure, stability, function and interaction with other macromolecules. By collecting as many descriptors/parameters as possible within a single database, we can achieve a better use of the available data and information. Furthermore, data grouping allows us to generate different parameters with the potential to provide new insights into the sequence–structure–function relationship. In JPD, residue selection can be performed according to multiple criteria. JPD can simultaneously display and analyze all the physicochemical parameters of any pair of structures, using precalculated structural alignments, allowing direct parameter comparison at corresponding amino acid positions among homologous structures. In order to focus on the physicochemical (and consequently pharmacological) profile of proteins, visualization tools (showing the structure and structural parameters) also had to be optimized. Our response to this challenge was the use of Java technology with its exceptional level of interactivity. JPD is freely accessible (within the Gold Sting Suite) at http://sms.cbi.cnptia.embrapa.br, http://mirrors.rcsb.org/SMS, http://trantor.bioc.columbia.edu/SMS and http://www.es.embnet.org/SMS/ (Option: JavaProtein Dossier). PMID:15215458

  17. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3, ...) entered by the user. Normal Distribution Estimates calculates the statistical value that corresponds to a cumulative probability, given the sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points extends and generates a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication, or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA curve-fits data to the linear equation y = f(x) and performs an ANOVA to check its significance.
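    For readers who prefer a scriptable equivalent, the following Python sketch reproduces a few of the calculations listed above (descriptive statistics, a normal-distribution estimate, and a regression significance check); the sample data are hypothetical and the toolset itself remains a set of Excel spreadsheets.

        # Rough Python equivalents of some of the spreadsheet calculations.
        import numpy as np
        from scipy import stats

        data = np.array([12.1, 11.8, 12.6, 13.0, 12.4, 11.9, 12.7])  # hypothetical sample

        # Descriptive statistics
        print("mean =", data.mean(), "std =", data.std(ddof=1), "n =", data.size)

        # Normal Distribution Estimate: value at a cumulative probability,
        # given a mean and standard deviation.
        print("95th percentile =",
              stats.norm.ppf(0.95, loc=data.mean(), scale=data.std(ddof=1)))

        # Linear regression with a significance check of the slope.
        x = np.arange(data.size)
        res = stats.linregress(x, data)
        print(f"y = {res.intercept:.3f} + {res.slope:.3f} x, p(slope) = {res.pvalue:.3f}")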

  18. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
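    A toy sketch of the attack-graph idea follows, under assumptions not taken from the patent: states as nodes, attacker effort as edge weights, and high-risk paths taken to be those within a small epsilon of the minimum total effort.

        # Toy attack graph: find the minimum-effort attack path and all paths
        # within epsilon of it. Node names and effort values are hypothetical.
        import networkx as nx

        G = nx.DiGraph()
        edges = [  # (from_state, to_state, attacker_effort)
            ("outside", "dmz_shell", 3.0),
            ("outside", "phish_user", 2.0),
            ("phish_user", "workstation", 1.0),
            ("dmz_shell", "internal_net", 2.5),
            ("workstation", "internal_net", 2.0),
            ("internal_net", "db_admin", 4.0),
        ]
        G.add_weighted_edges_from(edges)

        best = nx.shortest_path_length(G, "outside", "db_admin", weight="weight")
        epsilon = 1.0
        risky = [p for p in nx.shortest_simple_paths(G, "outside", "db_admin", weight="weight")
                 if nx.path_weight(G, p, weight="weight") <= best + epsilon]
        print("minimum effort:", best)
        for p in risky:
            print(" -> ".join(p), "effort:", nx.path_weight(G, p, weight="weight"))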

  19. Tool-use by rats (Rattus norvegicus): tool-choice based on tool features.

    PubMed

    Nagano, Akane; Aoyama, Kenjiro

    2017-03-01

    In the present study, we investigated whether rats (Rattus norvegicus) could be trained to use tools in an experimental setting. In Experiment 1, we investigated whether rats became able to choose appropriate hook-shaped tools to obtain food based on the spatial arrangements of the tool and food, similar to tests conducted in non-human primates and birds. With training, the rats were able to choose the appropriate hooks. In Experiments 2 and 3, we conducted transfer tests with novel tools. The rats had to choose between a functional and non-functional rake-shaped tool in these experiments. In Experiment 2, the tools differed from those of Experiment 1 in terms of shape, color, and texture. In Experiment 3, there was a contradiction between the appearance and the functionality of these tools. The rats could obtain the food with a functional rake with a transparent blade but could not obtain food with a non-functional rake with an opaque soft blade. All rats chose the functional over the non-functional rakes in Experiment 2, but none of the rats chose the functional rake in Experiment 3. Thus, the rats were able to choose the functional rakes only when there was no contradiction between the appearance and functionality of the tools. These results suggest that rats understand the spatial and physical relationships between the tool, food, and self when there was no such contradiction.

  20. AHCODA-DB: a data repository with web-based mining tools for the analysis of automated high-content mouse phenomics data.

    PubMed

    Koopmans, Bastijn; Smit, August B; Verhage, Matthijs; Loos, Maarten

    2017-04-04

    Systematic, standardized and in-depth phenotyping and data analyses of rodent behaviour empower gene-function studies, drug testing and therapy design. However, no data repositories are currently available for standardized quality control, data analysis and mining at the resolution of individual mice. Here, we present AHCODA-DB, a public data repository with standardized quality control and exclusion criteria aimed at enhancing robustness of data, enabled with web-based mining tools for the analysis of individually and group-wise collected mouse phenotypic data. AHCODA-DB allows monitoring of the in vivo effects of compounds collected from conventional behavioural tests and from automated home-cage experiments assessing spontaneous behaviour, anxiety and cognition without human interference. AHCODA-DB includes such data from mutant mice (transgenics, knock-out, knock-in), (recombinant) inbred strains, and compound effects in wildtype mice and disease models. AHCODA-DB provides real-time statistical analyses with single-mouse resolution and a versatile suite of data presentation tools. On March 9th, 2017, AHCODA-DB contained 650,000 data points on 2419 parameters from 1563 mice. AHCODA-DB provides users with tools to systematically explore mouse behavioural data, with both positive and negative outcomes, published and unpublished, across time and experiments with single-mouse resolution. The standardized (automated) experimental settings and the large current dataset (1563 mice) in AHCODA-DB provide a unique framework for the interpretation of behavioural data and drug effects. The use of common ontologies allows data export to other databases such as the Mouse Phenome Database. Unbiased presentation of positive and negative data obtained under the highly standardized screening conditions increases the cost efficiency of publicly funded mouse screening projects and helps to reach consensus conclusions on drug responses and mouse behavioural phenotypes. The website is publicly

  1. Fast-NPS-A Markov Chain Monte Carlo-based analysis tool to obtain structural information from single-molecule FRET measurements

    NASA Astrophysics Data System (ADS)

    Eilert, Tobias; Beckers, Maximilian; Drechsler, Florian; Michaelis, Jens

    2017-10-01

    The analysis tool and software package Fast-NPS can be used to analyse smFRET data to obtain quantitative structural information about macromolecules in their natural environment. In the algorithm a Bayesian model gives rise to a multivariate probability distribution describing the uncertainty of the structure determination. Since Fast-NPS aims to be an easy-to-use general-purpose analysis tool for a large variety of smFRET networks, we established an MCMC-based sampling engine that approximates the target distribution and requires no parameter specification by the user at all. For efficient local exploration we automatically adapt the multivariate proposal kernel according to the shape of the target distribution. In order to handle multimodality, the sampler is equipped with a parallel tempering scheme that is fully adaptive with respect to temperature spacing and number of chains. Since the molecular surrounding of a dye molecule affects its spatial mobility and thus the smFRET efficiency, we introduce dye models which can be selected for every dye molecule individually. These models allow the user to represent the smFRET network in great detail, leading to an increased localisation precision. Finally, a tool to validate the chosen model combination is provided.
    Program files doi: http://dx.doi.org/10.17632/7ztzj63r68.1
    Licensing provisions: Apache-2.0
    Programming language: GUI in MATLAB (The MathWorks) and the core sampling engine in C++
    Nature of problem: Sampling of highly diverse multivariate probability distributions in order to solve for macromolecular structures from smFRET data.
    Solution method: MCMC algorithm with fully adaptive proposal kernel and parallel tempering scheme.
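    The core sampling strategy can be illustrated with a much-reduced sketch: a random-walk Metropolis sampler whose multivariate proposal covariance is adapted to the target's shape during burn-in. This is not Fast-NPS code; it omits parallel tempering and dye models and uses a toy two-dimensional Gaussian in place of a real smFRET posterior.

        # Minimal adaptive random-walk Metropolis sampler (illustration only).
        import numpy as np

        def log_target(x):
            # Toy correlated 2-D Gaussian standing in for the real posterior.
            cov = np.array([[1.0, 0.8], [0.8, 1.0]])
            return -0.5 * x @ np.linalg.solve(cov, x)

        def adaptive_metropolis(log_p, x0, n_steps=20000, adapt_every=500):
            rng = np.random.default_rng(0)
            x = np.asarray(x0, float)
            lp = log_p(x)
            prop_cov = np.eye(len(x)) * 0.1
            chain = []
            for i in range(n_steps):
                prop = rng.multivariate_normal(x, prop_cov)
                lp_prop = log_p(prop)
                if np.log(rng.random()) < lp_prop - lp:      # Metropolis acceptance
                    x, lp = prop, lp_prop
                chain.append(x.copy())
                if i and i % adapt_every == 0 and i < n_steps // 2:
                    # Rescale the empirical covariance of recent samples (burn-in only).
                    emp = np.cov(np.array(chain[-adapt_every:]).T)
                    prop_cov = 2.38**2 / len(x) * emp + 1e-6 * np.eye(len(x))
            return np.array(chain)

        samples = adaptive_metropolis(log_target, [3.0, -3.0])
        print("posterior mean estimate:", samples[len(samples) // 2:].mean(axis=0))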

  2. Surface analysis of stone and bone tools

    NASA Astrophysics Data System (ADS)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  3. A Meta-Analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    ERIC Educational Resources Information Center

    Zhang, Lin

    2014-01-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses emerging issues, such as how learning effectiveness can be understood in relation to…

  5. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and in regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. This review aims to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks e.g. peak picking.

  6. The physics analysis tools project for the ATLAS experiment

    NASA Astrophysics Data System (ADS)

    Lenzi, Bruno; Atlas Collaboration

    2012-12-01

    The Large Hadron Collider is expected to start colliding proton beams in 2009. The enormous amount of data produced by the ATLAS experiment (≈1 PB per year) will be used in searches for the Higgs boson and physics beyond the Standard Model. In order to meet this challenge, a suite of common Physics Analysis Tools has been developed as part of the Physics Analysis software project. These tools run within the ATLAS software framework, ATHENA, and cover a wide range of applications. There are tools responsible for event selection based on analysed data and detector quality information, tools responsible for specific physics analysis operations including data quality monitoring and physics validation, and complete analysis toolkits (frameworks) whose goal is to help physicists perform their analyses while hiding the details of the ATHENA framework.

  7. WaveNet: A Web-Based Metocean Data Access, Processing and Analysis Tool; Part 5 - WW3 Database

    DTIC Science & Technology

    2015-02-01

    …modeling and planning missions require metocean data (e.g., winds, waves, tides, water levels). WaveNet is a web-based graphical-user-interface (GUI) … map interface to query, select, and display data for a given geographic region from the different sources available. The user may select the date range…

  8. Photogrammetry Tool for Forensic Analysis

    NASA Technical Reports Server (NTRS)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
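    The geometric core of the cube-merging procedure, expressing one cube's coordinate system in another's, can be sketched with the standard Kabsch/SVD rigid-transform fit shown below; the point coordinates are invented and the snippet is illustrative rather than the tool's implementation.

        # Recover the rigid transform (rotation R, translation t) mapping points
        # expressed in frame 2 into frame 1, given the same points in both frames.
        import numpy as np

        def rigid_transform(pts_frame2, pts_frame1):
            """Least-squares R, t such that pts_frame1 ~= pts_frame2 @ R.T + t."""
            p2, p1 = np.asarray(pts_frame2, float), np.asarray(pts_frame1, float)
            c2, c1 = p2.mean(axis=0), p1.mean(axis=0)
            H = (p2 - c2).T @ (p1 - c1)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:          # guard against a reflection
                Vt[-1] *= -1
                R = Vt.T @ U.T
            return R, c1 - R @ c2

        # Hypothetical reference points seen in two photogrammetry solutions.
        frame2 = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1.0]])
        R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.0]])
        frame1 = frame2 @ R_true.T + np.array([2.0, 1.0, 0.5])
        R, t = rigid_transform(frame2, frame1)
        print("rotation:\n", np.round(R, 3), "\ntranslation:", np.round(t, 3))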

  9. Built Environment Energy Analysis Tool Overview (Presentation)

    SciTech Connect

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  10. Correlation-Based Network Generation, Visualization, and Analysis as a Powerful Tool in Biological Studies: A Case Study in Cancer Cell Metabolism

    PubMed Central

    Toubiana, David; Fait, Aaron

    2016-01-01

    In the last decade, vast data sets have been generated in biological and medical studies. The challenge lies in their summary, complexity reduction, and interpretation. Correlation-based networks and the graph-theoretic properties of this type of network can be successfully used during this process. However, the procedure has its pitfalls and requires specific knowledge that often lies beyond classical biology and includes many computational tools and software. Here we introduce one of a series of methods for correlation-based network generation and analysis using freely available software. The pipeline allows the user to control each step of the network generation and provides flexibility in the selection of correlation methods and thresholds. The pipeline was implemented on published metabolomics data of a population of human breast carcinoma cell lines MDA-MB-231 under two conditions: normal and hypoxia. The analysis revealed significant differences between the metabolic networks in response to the tested conditions. The network under hypoxia had 1.7 times more significant correlations between metabolites, compared to normal conditions. Unique metabolic interactions were identified which could lead to the identification of improved markers or aid in elucidating the mechanism of regulation between distantly related metabolites induced by the cancer growth. PMID:27840831
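    A minimal Python version of the pipeline's central step, computing pairwise correlations, keeping only significant ones, and assembling them into a network, is sketched below; the published pipeline uses other freely available software, and the metabolite values here are random placeholders.

        # Build a correlation-based network: nodes are metabolites, edges are
        # statistically significant pairwise correlations above a threshold.
        import numpy as np
        import networkx as nx
        from scipy.stats import pearsonr

        rng = np.random.default_rng(1)
        metabolites = [f"met_{i}" for i in range(8)]
        data = rng.normal(size=(30, len(metabolites)))       # 30 samples x 8 metabolites
        data[:, 1] = 0.8 * data[:, 0] + 0.2 * rng.normal(size=30)   # induce correlation
        data[:, 3] = -0.7 * data[:, 2] + 0.3 * rng.normal(size=30)

        G = nx.Graph()
        G.add_nodes_from(metabolites)
        r_threshold, p_threshold = 0.5, 0.05
        for i in range(len(metabolites)):
            for j in range(i + 1, len(metabolites)):
                r, p = pearsonr(data[:, i], data[:, j])
                if abs(r) >= r_threshold and p <= p_threshold:   # keep significant edges
                    G.add_edge(metabolites[i], metabolites[j], weight=r)

        print(f"{G.number_of_edges()} significant correlations out of "
              f"{len(metabolites) * (len(metabolites) - 1) // 2} pairs")
        print("network density:", nx.density(G))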

  11. Stable isotope ratio analysis as a tool to discriminate between rainbow trout (O. mykiss) fed diets based on plant or fish-meal proteins.

    PubMed

    Moreno-Rojas, J M; Tulli, F; Messina, M; Tibaldi, E; Guillou, C

    2008-12-01

    The use of stable isotope ratio analysis (SIRA) as a rapid analytical tool to characterize and discriminate farmed fish on the basis of the feedstuffs included in the diet formulation is discussed. Two isoproteic (44.8%) and isolipidic (19.6%) extruded diets were formulated: a fish-meal-based diet (FM diet), containing fish meal as the sole protein source; a plant-protein-based diet (PP diet), where pea protein concentrate and wheat gluten meal replaced 80% of fish meal protein. The diets were fed to eight groups of rainbow trout (initial body weight: 106.6 g) for 103 days in two daily meals under controlled rearing conditions. Growth performance (final body weight: 318.5 g; specific growth rate: 1.06%) and feed-to-gain ratio (0.79) were not affected by the dietary treatment. The differences in isotopic values of the two diets were clearly reflected in the different carbon and nitrogen isotopic values in rainbow trout fillets. The δ13C and δ15N values of muscle of farmed rainbow trout showed differences between farmed fish fed a fish-protein-based diet (-20.47 ± 0.34 and 12.38 ± 0.57 for δ13C and δ15N, respectively) and those fed a plant-protein-based diet (-23.96 ± 0.38 and 7.15 ± 0.51 for δ13C and δ15N, respectively). The results suggest that SIRA provides a robust and verifiable analytical tool to discriminate between fish fed on a plant or a fish protein diet.

  12. Computer simulation is an undervalued tool for genetic analysis: a historical view and presentation of SHIMSHON--a Web-based genetic simulation package.

    PubMed

    Greenberg, David A

    2011-01-01

    Computer simulation methods are under-used tools in genetic analysis because simulation approaches have been portrayed as inferior to analytic methods. Even when simulation is used, its advantages are not fully exploited. Here, I present SHIMSHON, our package of genetic simulation programs that have been developed, tested, used for research, and used to generate data for Genetic Analysis Workshops (GAW). These simulation programs, now web-accessible, can be used by anyone to answer questions about designing and analyzing genetic disease studies for locus identification. This work has three foci: (1) the historical context of SHIMSHON's development, suggesting why simulation has not been more widely used so far. (2) Advantages of simulation: computer simulation helps us to understand how genetic analysis methods work. It has advantages for understanding disease inheritance and methods for gene searches. Furthermore, simulation methods can be used to answer fundamental questions that either cannot be answered by analytical approaches or cannot even be defined until the problems are identified and studied, using simulation. (3) I argue that, because simulation was not accepted, there was a failure to grasp the meaning of some simulation-based studies of linkage. This may have contributed to perceived weaknesses in linkage analysis; weaknesses that did not, in fact, exist. Copyright © 2011 S. Karger AG, Basel.

  14. Evaluation of the temporal structure of postural sway fluctuations based on a comprehensive set of analysis tools

    NASA Astrophysics Data System (ADS)

    Kirchner, M.; Schubert, P.; Schmidtbleicher, D.; Haas, C. T.

    2012-10-01

    The analysis of postural control has a long history. Traditionally, the amount of body sway is solely used as an index of postural stability. Although this leads to some extent to an effective evaluation of balance performance, the control mechanisms involved have not yet been fully understood. The concept of nonlinear dynamics suggests that variability in the motor output is not randomness but structure, providing the stimulus to reveal the functionality of postural sway. The present work evaluates sway dynamics by means of centre-of-pressure (COP) excursions in a quiet standing task versus a dual-task condition at three different test durations (30, 60, 300 s). Besides the application of traditional methods, which estimate the overall size of sway, the temporal pattern of body sway was quantified via wavelet transform, multiscale entropy and fractal analysis. We found higher sensitivity of the structural parameters to modulations of postural control strategies and, in part, an improved evaluation of sway dynamics in longer recordings. It could be shown that postural control modifications take place on different timescales corresponding to the interplay of the sensory systems. A continued application of nonlinear analysis can help to better understand postural control mechanisms.
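    One of the structural measures named above can be sketched briefly: sample entropy computed on coarse-grained series, which is the core of multiscale entropy. The signal below is a surrogate random walk standing in for a real COP recording, and the implementation is a simplified illustration rather than the study's code.

        # Sample entropy over coarse-grained series (simplified multiscale entropy).
        import numpy as np

        def sample_entropy(x, m=2, r_factor=0.2):
            x = np.asarray(x, float)
            r = r_factor * x.std()
            def count_matches(m):
                templates = np.array([x[i:i + m] for i in range(len(x) - m)])
                count = 0
                for i in range(len(templates)):
                    dist = np.max(np.abs(templates - templates[i]), axis=1)
                    count += np.sum(dist <= r) - 1       # exclude the self-match
                return count
            B, A = count_matches(m), count_matches(m + 1)
            return -np.log(A / B) if A > 0 and B > 0 else np.inf

        def coarse_grain(x, scale):
            n = len(x) // scale
            return np.asarray(x[:n * scale], float).reshape(n, scale).mean(axis=1)

        rng = np.random.default_rng(0)
        cop = np.cumsum(rng.normal(size=3000))           # surrogate COP excursion
        for scale in (1, 2, 5, 10):
            print(f"scale {scale:2d}: SampEn = {sample_entropy(coarse_grain(cop, scale)):.3f}")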

  15. Environmental Inquiry by College Students: Original Research and Peer Review Using Web-Based Collaborative Tools. Preliminary Quantitative Data Analysis.

    ERIC Educational Resources Information Center

    Cakir, Mustafa; Carlsen, William S.

    The Environmental Inquiry (EI) program (Cornell University and Pennsylvania State University) supports inquiry-based, student-centered science teaching on selected topics in the environmental sciences. Texts to support high school student research are published by the National Science Teachers Association (NSTA) in the domains of environmental…

  16. Website Analysis as a Tool for Task-Based Language Learning and Higher Order Thinking in an EFL Context

    ERIC Educational Resources Information Center

    Roy, Debopriyo

    2014-01-01

    Besides focusing on grammar, writing skills, and web-based language learning, researchers in "CALL" and second language acquisition have also argued for the importance of promoting higher-order thinking skills in ESL (English as Second Language) and EFL (English as Foreign Language) classrooms. There is solid evidence supporting the…

  18. Analysis of the Perceptions of CPM (Critical Path Method) as a Project Management Tool on Base Level Civil Engineering Projects.

    DTIC Science & Technology

    1986-09-01

    …20% of minor construction, maintenance, and repair projects using CPM is so small that no generalizations can be made. However, of the six … respondents in these categories, five indicate that CPM would be effective on base-level projects. …

  19. Automated Steel Cleanliness Analysis Tool (ASCAT)

    SciTech Connect

    Gary Casuccio; Michael Potter; Fred Schwerer; Dr. Richard J. Fruehan; Dr. Scott Story

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer-controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium-treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  20. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  1. Enabling Web-Based GIS Tools for Internet and Mobile Devices To Improve and Expand NASA Data Accessibility and Analysis Functionality for the Renewable Energy and Agricultural Applications

    NASA Astrophysics Data System (ADS)

    Ross, A.; Stackhouse, P. W.; Tisdale, B.; Tisdale, M.; Chandler, W.; Hoell, J. M., Jr.; Kusterer, J.

    2014-12-01

    The NASA Langley Research Center Science Directorate and Atmospheric Science Data Center have initiated a pilot program to utilize Geographic Information System (GIS) tools that enable, generate and store climatological averages using spatial queries and calculations in a spatial database, resulting in greater accessibility of data for government agencies, industry and private sector individuals. The major objectives of this effort include 1) processing and reformulating current data to be consistent with ESRI and OpenGIS tools, 2) developing functions that improve capability and analysis and produce "on-the-fly" data products, extending these beyond single locations to regional and global scales, 3) updating the current web sites to enable both web-based and mobile application displays optimized for mobile platforms, 4) interacting with user communities in government and industry to test formats and usage, and 5) developing a series of metrics that allow monitoring of progress and performance. Significant project results will include the development of Open Geospatial Consortium (OGC) compliant web services (WMS, WCS, WFS, WPS) that serve renewable energy and agricultural application products to users using GIS software and tools. Each data product and OGC service will be registered within ECHO, the Common Metadata Repository, the Geospatial Platform, and Data.gov to ensure the data are easily discoverable and provide data users with enhanced access to SSE data, parameters, services, and applications. This effort supports cross-agency, cross-organization interoperability of SSE data products and services by collaborating with DOI, NRCan, NREL, NCAR, and HOMER for requirements vetting and test-bed use before making the data available to the wider public.
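    As a hedged example of the kind of OGC-compliant request such services accept, the snippet below builds a standard WMS 1.3.0 GetMap query string; the endpoint URL and layer name are placeholders, not the project's actual service.

        # Construct a WMS 1.3.0 GetMap request URL (endpoint and layer are hypothetical).
        from urllib.parse import urlencode

        params = {
            "service": "WMS",
            "version": "1.3.0",
            "request": "GetMap",
            "layers": "sse_insolation_annual",   # placeholder layer name
            "crs": "EPSG:4326",
            "bbox": "-90,-180,90,180",           # minlat,minlon,maxlat,maxlon for EPSG:4326
            "width": "1024",
            "height": "512",
            "format": "image/png",
        }
        print("https://example.org/wms?" + urlencode(params))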

  2. The environment power system analysis tool development program

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis, and a database for storing system designs and results of analysis.

  3. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    PubMed

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development.

  4. PHYLOViZ Online: web-based tool for visualization, phylogenetic inference, analysis and sharing of minimum spanning trees

    PubMed Central

    Ribeiro-Gonçalves, Bruno; Francisco, Alexandre P.; Vaz, Cátia; Ramirez, Mário; Carriço, João André

    2016-01-01

    High-throughput sequencing methods have generated allele and single nucleotide polymorphism information for thousands of bacterial strains that are publicly available in online repositories and have created the possibility of generating similar information for hundreds to thousands more strains in a single study. Minimum spanning tree analysis of allelic data offers a scalable and reproducible methodological alternative to traditional phylogenetic inference approaches, useful in epidemiological investigations and population studies of bacterial pathogens. PHYLOViZ Online was developed to allow users to do these analyses without software installation and to enable easy access to and sharing of data and analysis results from any Internet-enabled computer. PHYLOViZ Online also offers a RESTful API for programmatic access to data and algorithms, allowing it to be seamlessly integrated into any third-party web service or software. PHYLOViZ Online is freely available at https://online.phyloviz.net. PMID:27131357
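    The underlying computation can be sketched compactly: pairwise Hamming distances between allelic profiles feed a minimum spanning tree. The snippet below (not PHYLOViZ code; the profiles are invented) shows that calculation with SciPy.

        # Minimum spanning tree over MLST-style allelic profiles.
        import numpy as np
        from scipy.sparse.csgraph import minimum_spanning_tree

        profiles = {          # hypothetical strains x 7 loci allele numbers
            "ST1": [1, 3, 1, 1, 1, 1, 3],
            "ST2": [1, 3, 1, 2, 1, 1, 3],
            "ST3": [4, 3, 1, 2, 1, 5, 3],
            "ST4": [1, 1, 1, 1, 2, 1, 3],
        }
        names = list(profiles)
        mat = np.array([profiles[n] for n in names])
        # Pairwise Hamming distances (number of loci with different alleles).
        dist = (mat[:, None, :] != mat[None, :, :]).sum(axis=2)

        mst = minimum_spanning_tree(dist).toarray()
        for i, j in zip(*np.nonzero(mst)):
            print(f"{names[i]} -- {names[j]}  (distance {int(mst[i, j])})")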

  5. Development of an expert analysis tool based on an interactive subsidence hazard map for urban land use in the city of Celaya, Mexico

    NASA Astrophysics Data System (ADS)

    Alloy, A.; Gonzalez Dominguez, F.; Nila Fonseca, A. L.; Ruangsirikulchai, A.; Gentle, J. N., Jr.; Cabral, E.; Pierce, S. A.

    2016-12-01

    Land subsidence as a result of groundwater extraction in central Mexico's larger urban centers began in the 1980s, driven by population and economic growth. The city of Celaya has undergone subsidence for several decades, and one consequence is the development of an active normal fault system that affects its urban infrastructure and residential areas. To facilitate analysis of the subsidence and support land-use decision-making, we created an online interactive map enabling users to easily obtain information associated with land subsidence. Geological and socioeconomic data for the city were collected, including fault locations, population data, and other important infrastructure and structural data obtained from fieldwork as part of a study-abroad interchange undergraduate course. The subsidence and associated faulting hazard map was created using an InSAR-derived subsidence velocity map and population data from INEGI, identifying hazard zones with a spatial analysis approach based on a subsidence-gradient and population risk matrix. This interactive map provides a simple perspective on different vulnerable urban elements. As an accessible visualization tool, it will enhance communication between scientific and socio-economic disciplines. Our project also lays the groundwork for a future expert analysis system: an open-source and easily accessible Python-coded, SQLite-database-driven website that archives fault and subsidence data along with visual documentation of damage to civil structures. This database takes field notes and provides an entry form for uniform datasets, which are used to generate a JSON. Such a database is useful because it allows geoscientists to have a centralized repository and access to their observations over time. Because of the widespread presence of the subsidence phenomena throughout cities in central Mexico, the spatial analysis has been automated using the open source software R. Raster, rgeos, shapefiles, and rgdal
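    A simplified sketch of a gradient-based hazard classification of the kind described, combining the horizontal gradient of a subsidence-velocity raster with population through a small risk matrix, is given below in Python with made-up rasters; the project itself automated this step in R.

        # Combine differential subsidence (gradient magnitude) with population
        # through a simple 3x3 risk matrix. All raster values are synthetic.
        import numpy as np

        rng = np.random.default_rng(0)
        velocity = rng.normal(-3.0, 2.0, size=(50, 50))      # cm/yr, negative = subsiding
        population = rng.integers(0, 500, size=(50, 50))     # people per cell

        gy, gx = np.gradient(velocity)                       # change per cell spacing
        gradient_mag = np.hypot(gx, gy)                      # differential subsidence

        # Rows = gradient tercile, columns = population tercile.
        g_class = np.digitize(gradient_mag, np.quantile(gradient_mag, [1/3, 2/3]))
        p_class = np.digitize(population, np.quantile(population, [1/3, 2/3]))
        risk_matrix = np.array([[0, 1, 1],
                                [1, 2, 2],
                                [1, 2, 3]])                  # 0 low ... 3 critical
        hazard = risk_matrix[g_class, p_class]
        print("cells per hazard class:", np.bincount(hazard.ravel()))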

  6. CImbinator: a web-based tool for drug synergy analysis in small- and large-scale datasets.

    PubMed

    Flobak, Åsmund; Vazquez, Miguel; Lægreid, Astrid; Valencia, Alfonso

    2017-08-01

    Drug synergies are sought to identify combinations of drugs that are particularly beneficial. User-friendly software solutions that can assist analysis of large-scale datasets are required. CImbinator is a web-service that can aid in batch-wise and in-depth analyses of data from small-scale and large-scale drug combination screens. CImbinator offers to quantify drug combination effects, using both the commonly employed median-effect equation and advanced experimental mathematical models describing dose-response relationships. CImbinator is written in Ruby and R. It uses the R package drc for advanced drug response modeling. CImbinator is available at http://cimbinator.bioinfo.cnio.es , the source-code is open and available at https://github.com/Rbbt-Workflows/combination_index . A Docker image is also available at https://hub.docker.com/r/mikisvaz/rbbt-ci_mbinator/ . asmund.flobak@ntnu.no or miguel.vazquez@cnio.es. Supplementary data are available at Bioinformatics online.
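    The median-effect calculation mentioned above can be illustrated with the standard Chou-Talalay combination index; the sketch below uses hypothetical single-agent parameters and is not CImbinator's code.

        # Combination index from the median-effect equation (Chou-Talalay form).
        def dose_for_effect(fa, Dm, m):
            """Median-effect equation solved for dose: Dx = Dm * (fa/(1-fa))**(1/m)."""
            return Dm * (fa / (1.0 - fa)) ** (1.0 / m)

        def combination_index(d1, d2, fa_combo, drug1, drug2):
            """CI < 1 suggests synergy, CI = 1 additivity, CI > 1 antagonism."""
            Dx1 = dose_for_effect(fa_combo, *drug1)
            Dx2 = dose_for_effect(fa_combo, *drug2)
            return d1 / Dx1 + d2 / Dx2

        # Hypothetical single-agent fits: (Dm = dose for 50% effect, m = slope).
        drug_a = (2.0, 1.1)
        drug_b = (5.0, 0.9)
        # 1 uM of A plus 2 uM of B gave 60% effect in the combination experiment.
        ci = combination_index(1.0, 2.0, 0.60, drug_a, drug_b)
        print(f"combination index = {ci:.2f}")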

  7. Continued Development of Python-Based Thomson Data Analysis and Associated Visualization Tool for NSTX-U

    NASA Astrophysics Data System (ADS)

    Wallace, William; Miller, Jared; Diallo, Ahmed

    2015-11-01

    MultiPoint Thomson Scattering (MPTS) is an established, accurate method of finding the temperature, density, and pressure of a magnetically confined plasma. Two Nd:YAG (1064 nm) lasers are fired into the plasma at an effective frequency of 60 Hz, and the light is Doppler-shifted by Thomson scattering. Polychromators on the NSTX-U midplane collect the scattered photons at various radii/scattering angles, and the avalanche photodiode voltages are saved to an MDSplus tree for later analysis. IDL code is then used to determine plasma temperature, pressure, and density from the captured polychromator measurements via the Selden formulas [1]. Previous work [2] converted the single-processor IDL code into Python code and prepared a new architecture for multiprocessing MPTS in parallel. However, that work did not progress to generating output data and curve fits that match the previous IDL results. This project refactored the Python code into an object-oriented architecture and created a software test suite for the new architecture, which allowed identification of the code that generated the difference in output. Another effort currently underway is to display the Thomson data in an intuitive, interactive format. This work was supported in part by the U.S. Department of Energy, Office of Science, Office of Workforce Development for Teachers and Scientists (WDTS) under the Community College Internship (CCI) program.

  8. Tools for Life Support Systems Analysis

    NASA Astrophysics Data System (ADS)

    Lange, K.; Ewert, M.

    An analysis of the optimum level of closure of a life support system is a complex task involving hundreds, if not thousands, of parameters. In the absence of complete data on candidate technologies and a complete definition of the mission architecture and requirements, many assumptions are necessary. Because of the large number of parameters, it is difficult to fully comprehend and compare studies performed by different analysts. The Systems Integration, Modeling, and Analysis (SIMA) Project Element within NASA's Advanced Life Support (ALS) Project has taken measures to improve this situation by issuing documents that define ALS requirements, baseline assumptions, and reference missions. As a further step to capture and retain available knowledge and to facilitate system-level studies, various software tools are being developed. These include a database tool for storing, organizing, and updating technology parameters, modeling tools for evaluating time-average and dynamic system performance, and sizing tools for estimating overall system mass, volume, power, cooling, logistics, and crew time. This presentation describes ongoing work on the development and integration of these tools for life support systems analysis.

  9. Geographical information system (GIS) as a new tool to evaluate epidemiology based on spatial analysis and clinical outcomes in acromegaly.

    PubMed

    Naves, Luciana Ansaneli; Porto, Lara Benigno; Rosa, João Willy Corrêa; Casulari, Luiz Augusto; Rosa, José Wilson Corrêa

    2015-02-01

    Geographical information systems (GIS) have emerged as a group of innovative software components useful for projects in epidemiology and planning in health care systems. This is an original study investigating environmental and geographical influences on the epidemiology of acromegaly in Brazil. We aimed to validate a method to link an acromegaly registry with a GIS mapping program, to describe the spatial distribution of patients, to identify disease clusters, and to evaluate whether access to health care could influence the outcome of the disease. Clinical data from 112 consecutive patients were collected and home addresses were plotted in the GIS software for spatial analysis. The buffer spatial distribution of patients living in Brasilia showed that 38.1% lived from 0.33 to 8.66 km, 17.7% from 8.67 to 18.06 km, 22.2% from 18.07 to 25.67 km and 22% from 25.68 to 36.70 km from the Reference Medical Center (RMC), and no unexpected clusters were identified. Migration of 26 patients from 11 other cities in different regions of the country was observed. Most patients (64%) with adenomas bigger than 25 mm lived more than 20 km away from the RMC, but no significant correlation was found between the distance from a patient's home to the RMC and tumor diameter (r = 0.45, p = 0.20) or delay in diagnosis (r = 0.43, p = 0.30). The geographical distribution of diagnosed cases did not affect the latency of diagnosis or tumor size, but the recognition of significant migration indicates that improvements in the medical assistance network are needed.

  10. Functional MRI-based connectivity analysis: A promising tool for the investigation of the pathophysiology and comorbidity of epilepsy.

    PubMed

    Xiao, Fenglai; An, Dongmei; Zhou, Dong

    2017-01-01

    Epilepsy has been recognized as a brain network disorder. Therefore, functional MRI (fMRI)-based connectivity is an ideal technique for exploring the complex effects of epilepsy on the brain. Functional connectivity studies have provided insights into the physiopathogenesis of the epileptic network underlying focal epilepsies, genetic generalized epilepsy, and specific epileptic syndromes. An increasing number of studies have focused on the deleterious effects of epilepsy on other brain networks to help to explain cognitive deficits and psychiatric symptoms. Anti-epileptic treatment studies have yielded information about the side effects and the restoration of functional abnormalities after using the drug. Researchers who have examined predictors of surgical outcomes have suggested that there might be identifiable pre-surgical patterns of functional connectivity that are associated with a greater likelihood of positive cognitive or seizure outcomes. However, knowledge regarding the role of fMRI connectivity remains limited in clinical settings. Further validation through invasive investigations and follow-up studies is required for its reliable application in the clinical management of individual patients. Copyright © 2016. Published by Elsevier Ltd.

  11. Constraint-Based Model of Shewanella oneidensis MR-1 Metabolism: A Tool for Data Analysis and Hypothesis Generation

    PubMed Central

    Hill, Eric A.; Geydebrekht, Oleg V.; De Ingeniis, Jessica; Zhang, Xiaolin; Osterman, Andrei; Scott, James H.; Reed, Samantha B.; Romine, Margaret F.; Konopka, Allan E.; Beliaev, Alexander S.; Fredrickson, Jim K.

    2010-01-01

    Shewanellae are gram-negative facultatively anaerobic metal-reducing bacteria commonly found in chemically (i.e., redox) stratified environments. Occupying such niches requires the ability to rapidly acclimate to changes in electron donor/acceptor type and availability; hence, the ability to compete and thrive in such environments must ultimately be reflected in the organization and utilization of electron transfer networks, as well as central and peripheral carbon metabolism. To understand how Shewanella oneidensis MR-1 utilizes its resources, the metabolic network was reconstructed. The resulting network consists of 774 reactions, 783 genes, and 634 unique metabolites and contains biosynthesis pathways for all cell constituents. Using constraint-based modeling, we investigated aerobic growth of S. oneidensis MR-1 on numerous carbon sources. To achieve this, we (i) used experimental data to formulate a biomass equation and estimate cellular ATP requirements, (ii) developed an approach to identify cycles (such as futile cycles and circulations), (iii) classified how reaction usage affects cellular growth, (iv) predicted cellular biomass yields on different carbon sources and compared model predictions to experimental measurements, and (v) used experimental results to refine metabolic fluxes for growth on lactate. The results revealed that aerobic lactate-grown cells of S. oneidensis MR-1 used less efficient enzymes to couple electron transport to proton motive force generation, and possibly operated at least one futile cycle involving malic enzymes. Several examples are provided whereby model predictions were validated by experimental data, in particular the role of serine hydroxymethyltransferase and glycine cleavage system in the metabolism of one-carbon units, and growth on different sources of carbon and energy. This work illustrates how integration of computational and experimental efforts facilitates the understanding of microbial metabolism at a systems
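
    Constraint-based modeling of the kind described above typically reduces to a flux balance analysis linear program: maximize a biomass flux subject to steady-state mass balance and flux bounds. The toy three-reaction network below is invented purely to illustrate the formulation.

```python
# Toy flux-balance-analysis sketch: maximize a "biomass" flux subject to
# steady-state mass balance S.v = 0 and flux bounds. The network is invented.
import numpy as np
from scipy.optimize import linprog

# Columns: uptake, conversion, biomass; rows: metabolites A and B
S = np.array([[1, -1,  0],
              [0,  1, -1]])
bounds = [(0, 10), (0, 10), (0, 10)]   # hypothetical flux bounds
c = np.array([0, 0, -1])               # linprog minimizes, so negate the biomass flux

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal fluxes:", res.x, "max biomass flux:", -res.fun)
```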

  12. Microarray-based gene expression analysis as a process characterization tool to establish comparability of complex biological products: scale-up of a whole-cell immunotherapy product.

    PubMed

    Wang, Min; Senger, Ryan S; Paredes, Carlos; Banik, Gautam G; Lin, Andy; Papoutsakis, Eleftherios T

    2009-11-01

    Whole-cell immunotherapies and other cellular therapies have shown promising results in clinical trials. Due to the complex nature of the whole-cell product and the sometimes limited correlation of clinical potency with the proposed mechanism of action, these cellular immunotherapy products are generally not considered well characterized. Therefore, one major challenge in the product development of whole-cell therapies is the ability to demonstrate comparability of the product after changes in the manufacturing process. Such changes are nearly inevitable as manufacturing experience grows, leading to improved and robust processes that may have higher commercial feasibility. In order to comprehensively assess the impact of process changes on the final product, and thus establish comparability, a matrix of characterization assays (in addition to lot release assays) assessing the various aspects of the cellular product is required. In this study, we assessed the capability of DNA-microarray-based gene-expression analysis as a characterization tool using GVAX cancer immunotherapy cells manufactured by Cell Genesys, Inc. The GVAX immunotherapy product consists of two prostate cancer cell lines (CG1940 and CG8711) engineered to secrete human GM-CSF. To demonstrate the capability of the assay, we assessed the transcriptional changes in the product when produced in the presence or absence of fetal bovine serum, and under normal and hypoxic conditions, both changes intended to stress the cell lines. We then assessed the impact of an approximately 10-fold process scale-up on the final product at the transcriptional level. These data were used to develop comparisons and statistical analyses suitable for characterizing culture reproducibility and cellular product similarity. Use of gene-expression data for process characterization proved to be a reproducible and sensitive method for detecting differences due to small or large changes in culture conditions as might be

  13. Screening and case finding tools for the detection of dementia. Part I: evidence-based meta-analysis of multidomain tests.

    PubMed

    Mitchell, Alex J; Malladi, Srinivasa

    2010-09-01

    To evaluate the diagnostic accuracy of all brief multidomain alternatives to the Mini-Mental State Examination (MMSE) in the detection of dementia. A literature search, critical appraisal, and meta-analysis were conducted of robust diagnostic validity studies involving cognitive batteries. Twenty-nine distinct brief batteries were tested in 44 large-scale analyses. Twenty studies took place in specialist settings (11 in memory clinics and 9 in secondary care), ten studies were conducted in primary care, and 14 in the community. In community settings with a low prevalence of dementia, short screening methods of no more than 10 minutes had an overall sensitivity of 72.0% (95% confidence interval [CI] = 60.4%-82.3%) and a specificity of 88.2% (95% CI = 83.0%-92.5%). The optimal individual tests were the Telephonic interview based on MSQ, Category fluency/Memory impairment screen-Telephonic interview and 6 item Cognitive Impairment Test (6-CIT), but data were limited by the absence of multiple independent confirmation for any individual test. In primary care where the prevalence of dementia is usually modest, the optimal individual tools were the Abbreviated mental test score/Mental status questionnaire (MSQ), and Prueba cognitive de leganes (PCL). Furthermore, the Abbreviated mental test score (AMTS) was superior to the MMSE for case finding, but for screening the MMSE was optimal. If length is not a major consideration, the MMSE may remain the best tool for primary care clinicians who want to rule in and rule out a diagnosis. In specialist settings where the prevalence of dementia is often high, the optimal individual tools were the DEMTECT, Montreal cognitive assessment (MOCA), Memory Alteration test, and MINI-COG. Two tools were potentially superior to the MMSE for rule in and rule out, namely the 6-CIT and MINI-COG. Only four analyses looked specifically at accuracy in early-stage dementia, and each showed at least equivalent diagnostic accuracy, suggesting these

  14. A tool for model based diagnostics of the AGS Booster

    SciTech Connect

    Luccio, A.

    1993-12-31

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on a HP-Apollo workstation system, has proved very general and of immediate physical interpretation.
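
    The iterative model-fitting idea can be sketched as a least-squares problem: a model-derived response matrix maps candidate error sources (displacements, angles, field errors) to orbit readings, and the errors are estimated from measured orbit data. The matrix and data below are synthetic, not AGS Booster values.

```python
# Schematic of model-based orbit fitting: a response matrix R (from a lattice
# model such as MAD) relates candidate error sources to beam-position readings;
# the errors are then estimated by least squares from a measured orbit.
import numpy as np

rng = np.random.default_rng(1)
n_bpms, n_errors = 24, 4
R = rng.normal(size=(n_bpms, n_errors))                 # hypothetical model response matrix
true_errors = np.array([0.2, -0.1, 0.05, 0.0])
measured_orbit = R @ true_errors + 0.01 * rng.normal(size=n_bpms)  # noisy BPM data

fit, residuals, rank, _ = np.linalg.lstsq(R, measured_orbit, rcond=None)
print("fitted error sources:", np.round(fit, 3))
```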

  15. Hillmaker: an open source occupancy analysis tool.

    PubMed

    Isken, Mark W

    2005-12-01

    Managerial decision making problems in the healthcare industry often involve considerations of customer occupancy by time of day and day of week. We describe an occupancy analysis tool called Hillmaker which has been used in numerous healthcare operations studies. It is being released as a free and open source software project.
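
    A minimal sketch of the occupancy-by-time-of-day and day-of-week tabulation that tools like Hillmaker perform is shown below; the column names and sample stays are hypothetical.

```python
# Sketch of an occupancy-by-weekday/hour tally from arrival and departure times,
# using pandas (column names and records are hypothetical).
import pandas as pd

stays = pd.DataFrame({
    "in":  pd.to_datetime(["2024-01-01 08:15", "2024-01-01 09:40", "2024-01-02 22:05"]),
    "out": pd.to_datetime(["2024-01-01 12:30", "2024-01-02 01:10", "2024-01-03 06:45"]),
})

# Expand each stay into the hourly bins it occupies, then count by weekday/hour.
rows = []
for _, s in stays.iterrows():
    for t in pd.date_range(s["in"].floor("h"), s["out"], freq="h"):
        rows.append({"weekday": t.day_name(), "hour": t.hour})
occ = pd.DataFrame(rows).value_counts(["weekday", "hour"]).rename("occupancy")
print(occ.head())
```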

  16. Data Analysis — Algorithms and Tools

    NASA Astrophysics Data System (ADS)

    Spousta, Martin

    2015-05-01

    Modeling of detector response, modeling of physics, and software tools for modeling and analyzing are three fields among others that were discussed during 16th International workshop on Advanced Computing and Analysis Techniques in physics research - ACAT 2014. This short report represents a summary of track two where the current status and progress in these fields were reported and discussed.

  17. UDAT: A multi-purpose data analysis tool

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2017-04-01

    UDAT is a pattern recognition tool for mass analysis of various types of data, including image and audio. Based on its WND-CHARM (ascl:1312.002) prototype, UDAT computes a large set of numerical content descriptors from each file it analyzes and selects the most informative features using statistical analysis. The tool can perform automatic classification of galaxy images by training with annotated galaxy images. It also has unsupervised learning capabilities, such as query-by-example of galaxies based on morphology. That is, given an input galaxy image of interest, the tool can search through a large database of images to retrieve the galaxies that are most similar to the query image. The downside of the tool is its computational complexity, which in most cases will require a small or medium cluster.
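
    The query-by-example idea can be sketched as descriptor extraction followed by a nearest-neighbour search; the features below are simple stand-ins, not UDAT's WND-CHARM descriptors.

```python
# Sketch of query-by-example: compute numerical descriptors per image, normalize
# them, and return the nearest neighbours of a query image.
import numpy as np

def descriptors(image: np.ndarray) -> np.ndarray:
    # Stand-in features: simple intensity statistics and gradient energy.
    gy, gx = np.gradient(image.astype(float))
    return np.array([image.mean(), image.std(), np.abs(gx).mean(), np.abs(gy).mean()])

rng = np.random.default_rng(2)
database = rng.random((100, 32, 32))                 # hypothetical galaxy image stack
feats = np.array([descriptors(im) for im in database])
mu, sigma = feats.mean(0), feats.std(0)
feats_n = (feats - mu) / sigma                        # normalize so no feature dominates

query = (descriptors(database[0]) - mu) / sigma       # query with the first image
dist = np.linalg.norm(feats_n - query, axis=1)
print("most similar images:", np.argsort(dist)[:5])
```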

  18. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  19. Integrated multidisciplinary analysis tool IMAT users' guide

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  20. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    SciTech Connect

    Bush, B.; Penev, M.; Melaina, M.; Zuboy, J.

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient indepth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
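
    A minimal discounted-cash-flow sketch of the kind of computation such a tool performs is shown below; the station cost, revenue, and discount rate are hypothetical and unrelated to H2FAST's defaults.

```python
# Simple discounted-cash-flow sketch; parameters are hypothetical.
def npv(rate: float, cash_flows: list[float]) -> float:
    """Net present value with cash_flows[0] occurring at time zero."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

capex = -1_500_000                 # year-0 station capital cost (hypothetical)
annual_net_revenue = 220_000       # fuel sales minus operating cost (hypothetical)
flows = [capex] + [annual_net_revenue] * 15
print(f"NPV at 8% discount rate: ${npv(0.08, flows):,.0f}")
```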

  1. From sensor networks to connected analysis tools

    NASA Astrophysics Data System (ADS)

    Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.

    2012-04-01

    Multi-disciplinary data systems provide excellent tools for locating data, but most eventually deliver a series of local files for further processing, offering marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high-density measurements, integrating them with lower-density existing measurements and encouraging cross- and inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals and also provide connected tools for direct data access from analysis applications. SwissEx (www.swiss-experiment.ch) provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and field sites, as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and the preprocessing library MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application-specific tasks. We will work alongside several projects from a wide range of disciplines to help them develop tools which require either real-time data or large data samples. As well as developing domain-specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the

  2. Constructing Knowledge Bases: A Promising Instructional Tool.

    ERIC Educational Resources Information Center

    Trollip, Stanley R.; Lippert, Renate C.

    1987-01-01

    Argues that construction of knowledge bases is an instructional tool that encourages students' critical thinking in problem solving situations through metacognitive experiences. A study is described in which college students created expert systems to test the effectiveness of this method of instruction, and benefits for students and teachers are…

  3. Web-Based Learning Design Tool

    ERIC Educational Resources Information Center

    Bruno, F. B.; Silva, T. L. K.; Silva, R. P.; Teixeira, F. G.

    2012-01-01

    Purpose: The purpose of this paper is to propose a web-based tool that enables the development and provision of learning designs and its reuse and re-contextualization as generative learning objects, aimed at developing educational materials. Design/methodology/approach: The use of learning objects can facilitate the process of production and…

  4. Web-Based Learning Design Tool

    ERIC Educational Resources Information Center

    Bruno, F. B.; Silva, T. L. K.; Silva, R. P.; Teixeira, F. G.

    2012-01-01

    Purpose: The purpose of this paper is to propose a web-based tool that enables the development and provision of learning designs and its reuse and re-contextualization as generative learning objects, aimed at developing educational materials. Design/methodology/approach: The use of learning objects can facilitate the process of production and…

  5. Diamond-turning tool setting by interferogram analysis

    SciTech Connect

    Rasnick, W.H.; Yoder, R.C.

    1980-10-22

    A method was developed to establish a numerically controlled tool path with respect to the work spindle centerline. Particularly adapted to the diamond turning of optics, this method is based upon interferogram analysis and is applicable to the establishment of the work spindle centerline relative to the tool path for any center-turned optic having a well-defined vertex radius of curvature. The application reported is for an f/2 concave spherical mirror.

  6. Process-Based Quality (PBQ) Tools Development

    SciTech Connect

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts.

  7. Abest: The Ada - Based Expert System Tool

    NASA Astrophysics Data System (ADS)

    Semeco, Antonio C.; Ho, David

    1987-05-01

    This article describes the design and implementation of ABEST, the ADA Based Expert System Tool. ABEST is a general purpose software tool intended for use in the development of expert system applications in an ADA environment. Since it may not be clear whether it is feasible (or even desirable) to use ADA in such applications, a discussion of the advantages and disadvantages of the language in that respect is included in the Introduction. An overview of ABEST follows, illustrating its main features through a sample application in digital circuit fault diagnosis. A discussion of planned enhancements and extensions is also provided.

  8. Web100-based Network Diagnostic Tool

    SciTech Connect

    Carlson, Richard A.

    2003-03-20

    NDT is a client/server based network diagnostic tool developed to aid in finding network performance and configuration problems. The tool measures data transfer rates between two internet hosts (client and server). It also gathers detailed TCP statistical variable counters supplied by the Web100 modified server and uses these TCP variables to compute the theoretical performance rate between the two internet hosts. It then compares these analytical results with the measured results to determine if performance or configuration problems exist and translates these results into plain text messages to aid users and network operators in resolving reported problems.
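
    One widely used analytical bound of this kind is the Mathis et al. throughput formula, rate ≤ (MSS/RTT)·C/√p; whether NDT uses exactly this expression is not stated above, so the sketch below is illustrative only.

```python
# Illustrative analytical estimate of achievable TCP throughput (Mathis et al.)
# from TCP variables such as MSS, round-trip time, and packet loss rate.
import math

def mathis_throughput_bps(mss_bytes: float, rtt_s: float, loss_rate: float,
                          c: float = math.sqrt(3 / 2)) -> float:
    """Throughput <= (MSS / RTT) * C / sqrt(p), returned in bits per second."""
    return (mss_bytes * 8 / rtt_s) * c / math.sqrt(loss_rate)

# 1460-byte segments, 50 ms RTT, 0.01% loss (hypothetical path parameters)
print(f"{mathis_throughput_bps(1460, 0.050, 1e-4) / 1e6:.1f} Mbit/s")
```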

  9. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges; 1) Program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification. 2) Model-checking tools for concurrent object-oriented software that achieve memorability through synergy with program abstraction and static analysis tools.

  10. Buffer$--An Economic Analysis Tool

    Treesearch

    Gary Bentrup

    2007-01-01

    Buffer$ is an economic spreadsheet tool for analyzing the cost-benefits of conservation buffers by resource professionals. Conservation buffers are linear strips of vegetation managed for multiple landowner and societal objectives. The Microsoft Excel based spreadsheet can calculate potential income derived from a buffer, including income from cost-share/incentive...

  11. miRDis: a Web tool for endogenous and exogenous microRNA discovery based on deep-sequencing data analysis.

    PubMed

    Zhang, Hanyuan; Vieira Resende E Silva, Bruno; Cui, Juan

    2017-01-10

    Small RNA sequencing is the most widely used tool for microRNA (miRNA) discovery, and shows great potential for the efficient study of miRNA cross-species transport, i.e., by detecting the presence of exogenous miRNA sequences in the host species. Because of the increased appreciation of dietary miRNAs and their far-reaching implications in human health, research interest is currently growing with regard to exogenous miRNA bioavailability, mechanisms of cross-species transport, and miRNA function in cellular biological processes. In this article, we present microRNA Discovery (miRDis), a new small RNA sequencing data analysis pipeline for both endogenous and exogenous miRNA detection. Specifically, we developed and deployed a Web service that supports annotation and expression profiling of known host miRNAs and the detection of novel miRNAs, other noncoding RNAs, and exogenous miRNAs from dietary species. As a proof of concept, we analyzed a set of human plasma sequencing data from a milk-feeding study, where 225 human miRNAs were detected in the plasma samples and 44 showed elevated expression after milk intake. By examining the bovine-specific sequences, the data indicate that three bovine miRNAs (bta-miR-378, -181* and -150) are present in human plasma, possibly because of dietary uptake. Further evaluation based on different sets of public data demonstrates that miRDis outperforms other state-of-the-art tools in both detection and quantification of miRNA from either animal or plant sources. The miRDis Web server is available at: http://sbbi.unl.edu/miRDis/index.php. © The Author 2017. Published by Oxford University Press.

  12. Fairing Separation Analysis Using SepTOOL

    NASA Technical Reports Server (NTRS)

    Zalewski, Bart F.; Dial, William B.; Kosareo, Daniel N.

    2015-01-01

    This document describes the relevant equations programmed in the spreadsheet software SepTOOL, developed by ZIN Technologies, Inc. (ZIN), to determine the separation clearance between a launch vehicle payload fairing and the remaining stages. The software uses closed-form rigid body dynamic solutions of the vehicle in combination with flexible body dynamics of the fairing, which are obtained from flexible body dynamic analysis or from test data, and superimposes the two results to obtain the minimum separation clearance for any given set of flight trajectory conditions. Using closed-form solutions allows SepTOOL to perform separation calculations several orders of magnitude faster than numerical methods, which allows users to perform real-time parameter studies. Moreover, SepTOOL can optimize vehicle performance to minimize separation clearance. This tool can evaluate various shapes and sizes of fairings along with different vehicle configurations and trajectories. These geometries and parameters are entered through a user-friendly interface. Although the software was specifically developed for evaluating the separation clearance of launch vehicle payload fairings, separation dynamics of other launch vehicle components can be evaluated, provided that aerodynamic loads acting on the vehicle during the separation event are negligible. This document describes the development of SepTOOL, presenting the analytical procedure and theoretical equations, although the implementation of these equations is not disclosed. Realistic examples are presented, and the results are verified with ADAMS (MSC Software Corporation) simulations. It should be noted that SepTOOL is a preliminary separation clearance assessment software for payload fairings and should not be used for final clearance analysis.

  13. [Evidence based surgery. A necessary tool].

    PubMed

    Duran-Vega, Héctor César

    2015-01-01

    Evidence-based surgery is a tool that has been adopted worldwide by surgeons. As all decisions must be current and have a scientific basis, the approach for performing it must be standardised. Five important steps are required to perform surgery based on evidence: (1) convert the need for information into a question that can be answered; (2) find the best information to answer that question; (3) critically evaluate the evidence for its validity, impact, and applicability; (4) integrate the evidence with one's own experience and with the evaluation of the patient, taking into account their biology, values, and specific circumstances; and (5) evaluate the effectiveness and efficiency of the execution of steps 1-4 and propose how to improve them. This article presents the main tools to perform surgery properly based on evidence. Copyright © 2015 Academia Mexicana de Cirugía A.C. Published by Masson Doyma México S.A. All rights reserved.

  14. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
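
    The paper's spherical-coordinate FDTD equations are not reproduced here; as a generic illustration of the leapfrog update scheme that FDTD uses, a bare-bones 1D Cartesian example follows.

```python
# Bare-bones 1D Cartesian FDTD (Yee) leapfrog loop, shown only to illustrate
# the scheme; the paper's spherical-coordinate formulation is not reproduced.
import numpy as np

n, steps = 400, 800
c0, dz = 3e8, 1e-3
dt = dz / (2 * c0)                      # Courant-stable time step
Ex = np.zeros(n)
Hy = np.zeros(n)

for t in range(steps):
    # Update magnetic field from the spatial derivative of E (half step)
    Hy[:-1] += (dt / (4e-7 * np.pi * dz)) * (Ex[1:] - Ex[:-1])
    # Update electric field from the spatial derivative of H
    Ex[1:] += (dt / (8.854e-12 * dz)) * (Hy[1:] - Hy[:-1])
    # Soft Gaussian source approximating a short (UWB-like) pulse
    Ex[n // 2] += np.exp(-((t - 60) / 20.0) ** 2)

print(f"peak |Ex| after {steps} steps: {np.abs(Ex).max():.3f}")
```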

  15. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  16. Improving long-term, retrospective precipitation datasets using satellite-based surface soil moisture retrievals and the soil moisture analysis rainfall tool (SMART)

    USDA-ARS?s Scientific Manuscript database

    Using historical satellite surface soil moisture products, the Soil Moisture Analysis Rainfall Tool (SMART) is applied to improve the accuracy of a multi-decadal global daily rainfall product that has been bias-corrected to match the monthly totals of available ground observations. In order to adapt...

  17. [SIGAPS, a tool for the analysis of scientific publications].

    PubMed

    Sillet, Arnauld

    2015-04-01

    The System for the Identification, Management and Analysis of Scientific Publications (SIGAPS) is essential for the funding of teaching hospitals on the basis of scientific publications. It is based on the analysis of articles indexed in Medline, and the SIGAPS score is calculated by taking into account the position of the author in the author list and the ranking of the journal within its disciplinary field. It also offers tools for the bibliometric analysis of scientific production.

  18. Challenges Facing Design and Analysis Tools

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

    2001-01-01

    The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools

  19. Challenges Facing Design and Analysis Tools

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

    2001-01-01

    The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools

  20. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work required from the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. There are five evaluations: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for a random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
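
    As an illustration of one of the evaluations listed above, the sketch below computes a Bode frequency response for a simple hypothetical second-order plant using SciPy rather than MATRIXx.

```python
# Bode frequency response for a hypothetical second-order plant.
import numpy as np
from scipy import signal

# G(s) = 1 / (s^2 + 0.4 s + 1): lightly damped second-order system (hypothetical)
plant = signal.TransferFunction([1.0], [1.0, 0.4, 1.0])
w, mag_db, phase_deg = signal.bode(plant, w=np.logspace(-1, 1, 200))
print(f"peak gain {mag_db.max():.1f} dB near {w[np.argmax(mag_db)]:.2f} rad/s")
```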

  1. Parachute system design, analysis, and simulation tool

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-01-01

    For over twenty years designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash and line sail. In addition to these codes, material property databases have been acquired. Recently we have initiated a project to integrate these codes and databases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials database including high-rate-of-strain data.

  2. Meta-analysis diagnostic accuracy of SNP-based pathogenicity detection tools: a case of UTG1A1 gene mutations

    PubMed Central

    Galehdari, Hamid; Saki, Najmaldin; Mohammadi-asl, Javad; Rahim, Fakher

    2013-01-01

    Crigler-Najjar syndrome (CNS) type I and type II are usually inherited as autosomal recessive conditions that result from mutations in the UGT1A1 gene. The main objective of the present review is to summarize all available evidence on the accuracy of SNP-based pathogenicity detection tools, compared to published clinical results, for predicting which nsSNPs lead to disease, using prediction performance metrics. A comprehensive search was performed to find all mutations related to CNS. Database searches included dbSNP, SNPdbe, HGMD, Swissvar, Ensembl, and OMIM. All mutations related to CNS were extracted. Pathogenicity prediction was done using SNP-based pathogenicity detection tools including SIFT, PHD-SNP, PolyPhen2, FATHMM, Provean, and Mutpred. Overall, 59 different SNPs related to missense mutations in the UGT1A1 gene were reviewed. Comparing the diagnostic odds ratios (OR), PolyPhen2 and Mutpred had the highest value, 4.983 (95% CI: 1.24 – 20.02) in both, followed by SIFT (diagnostic OR: 3.25, 95% CI: 1.07 – 9.83). The highest MCC among the SNP-based pathogenicity detection tools belonged to SIFT (34.19%), followed by Provean, PolyPhen2, and Mutpred (29.99%, 29.89%, and 29.89%, respectively). Likewise, the highest ACC belonged to SIFT (62.71%), followed by PolyPhen2 and Mutpred (61.02% in both). Our results suggest that some of the well-established SNP-based pathogenicity detection tools can appropriately reflect the role of a disease-associated SNP in both local and global structures. PMID:23875061
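
    The reported metrics can be made concrete with a small worked example computing the diagnostic odds ratio (DOR), Matthews correlation coefficient (MCC), and accuracy (ACC) from a 2x2 confusion matrix; the counts are invented, not the study's data.

```python
# Worked example of DOR, MCC, and ACC from a 2x2 confusion matrix (invented counts).
import math

tp, fp, fn, tn = 30, 10, 12, 25   # hypothetical tool predictions vs. clinical labels

dor = (tp * tn) / (fp * fn)
acc = (tp + tn) / (tp + fp + fn + tn)
mcc = (tp * tn - fp * fn) / math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
print(f"DOR = {dor:.2f}  ACC = {acc:.1%}  MCC = {mcc:.2f}")
```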

  3. The Adversarial Route Analysis Tool: A Web Application

    SciTech Connect

    Casson, William H. Jr.

    2012-08-02

    The Adversarial Route Analysis Tool is a kind of Google Maps for adversaries: a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.

  4. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    SciTech Connect

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  5. Web-based discovery, access and analysis tools for the provision of different data sources like remote sensing products and climate data

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Hese, S.; Schmullius, C.

    2012-12-01

    To provide different Earth Observation products for the area of Siberia, the Siberian Earth System Science Cluster (SIB-ESS-C) was established as a spatial data infrastructure at the University of Jena (Germany), Department for Earth Observation. The infrastructure implements standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO) for data discovery, data access, and data analysis. The objective of SIB-ESS-C is to facilitate environmental research and Earth system science in Siberia. Several products from the Moderate Resolution Imaging Spectroradiometer sensor were integrated by serving ISO-compliant metadata and providing an OGC-compliant Web Map Service for data visualization and Web Coverage / Web Feature Services for data access. Furthermore, climate data from the World Meteorological Organization were downloaded, converted, and provided as an OGC Sensor Observation Service. Each climate data station is described with ISO-compliant metadata. All these datasets from multiple sources are provided within the SIB-ESS-C infrastructure (Figure 1), and an automatic workflow integrates updates of these datasets daily. The brokering approach within the SIB-ESS-C system is to collect data from different sources, convert the data into common data formats where necessary, and provide them through standardized Web services. Additional tools are made available within the SIB-ESS-C Geoportal for easy access to download and analysis functions (Figure 2). The data can be visualized, accessed, and analysed with this Geoportal. Because the services are OGC-compliant, the data can also be accessed with other OGC-compliant clients. (Figure 1: technical concept of SIB-ESS-C providing different data sources; Figure 2: screenshot of the web-based SIB-ESS-C system.)
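
    As a generic illustration of how an OGC Web Map Service is queried (a GetMap request is an HTTP GET with standardized parameters), the sketch below builds such a request; the endpoint and layer name are placeholders, not the actual SIB-ESS-C service.

```python
# Build a WMS GetMap request URL; endpoint and layer name are placeholders.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "modis_lst",                  # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "50,60,80,140",                 # min lat, min lon, max lat, max lon
    "WIDTH": "800", "HEIGHT": "400",
    "FORMAT": "image/png",
}
print("https://example.org/sibessc/wms?" + urlencode(params))
```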

  6. RSAT 2015: Regulatory Sequence Analysis Tools

    PubMed Central

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A.; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M.; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-01-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632

  7. RSAT 2015: Regulatory Sequence Analysis Tools.

    PubMed

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/.

  8. Correlating Petrophysical Well Logs Using Fractal-based Analysis to Identify Changes in the Signal Complexity Across Neutron, Density, Dipole Sonic, and Gamma Ray Tool Types

    NASA Astrophysics Data System (ADS)

    Matthews, L.; Gurrola, H.

    2015-12-01

    Typical petrophysical well log correlation is accomplished by manual pattern recognition, leading to subjective correlations. The change in character in a well log depends upon the change in the response of the tool to lithology. The petrophysical interpreter looks for a change in one log type that would correspond to the way a different tool responds to the same lithology. To develop an objective way to pick changes in well log characteristics, we adapt a method of first-arrival picking used on seismic data to analyze changes in the character of well logs. We chose to use the fractal method developed by Boschetti et al. [1] (1996). This method worked better than we expected, and we found similar changes in the fractal dimension across very different tool types (sonic vs density vs gamma ray). We reason that the fractal response of the log is not dependent on the physics of the tool response but rather on the change in the complexity of the log data. When a formation changes physical character in time or space, the recorded magnitudes in the tool data change complexity at the same time, even if the original tool responses are very different. The relative complexity of the data, regardless of the tool used, depends upon the complexity of the medium as sampled by the tool measurement. The relative complexity of the recorded magnitude data changes as a tool transitions from one character type to another. The character we are measuring is the roughness or complexity of the petrophysical curve. Our method provides a way to directly compare different log types based on a quantitative change in signal complexity. For example, using changes in data complexity allows us to correlate gamma ray suites with sonic logs within a well and then across to an adjacent well with similar signatures. Our method allows reliable, automatic correlations to be made in data sets beyond the reasonable cognitive limits of geoscientists, in both speed and consistency of pattern recognition. [1] Fabio Boschetti
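
    The authors use the fractal method of Boschetti et al.; as a generic illustration of quantifying the roughness of a 1D log curve, the sketch below implements the Higuchi fractal-dimension estimator, which is a different but related complexity measure.

```python
# Higuchi estimator of the fractal dimension of a 1D signal (generic illustration).
import numpy as np

def higuchi_fd(x: np.ndarray, k_max: int = 8) -> float:
    """Estimate the fractal dimension of a 1D signal (Higuchi, 1988)."""
    n = len(x)
    log_inv_k, log_len = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # Normalized curve length of the series subsampled with stride k
            lengths.append(np.abs(np.diff(x[idx])).sum() * (n - 1) / ((len(idx) - 1) * k * k))
        log_inv_k.append(np.log(1.0 / k))
        log_len.append(np.log(np.mean(lengths)))
    slope, _ = np.polyfit(log_inv_k, log_len, 1)   # FD is the slope of log L(k) vs log(1/k)
    return slope

rng = np.random.default_rng(3)
random_walk = np.cumsum(rng.normal(size=2000))   # smoother curve, FD near 1.5
white_noise = rng.normal(size=2000)              # rougher curve, FD near 2
print(round(higuchi_fd(random_walk), 2), round(higuchi_fd(white_noise), 2))
```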

  9. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Astrophysics Data System (ADS)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.
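
    A minimal sketch of the pick-list recommendation idea follows: selected component functions and interface types index into a reusable failure-mode library. The library entries are generic placeholders, not the NASA library described above.

```python
# Map a component's selected function and interface type to candidate failure
# modes drawn from a reusable library (placeholder entries only).
FMEA_LIBRARY = {
    ("provide power", "electrical"): ["loss of output", "output out of tolerance", "short circuit"],
    ("transfer fluid", "fluid"):     ["external leakage", "blocked flow", "reduced flow"],
    ("transmit data", "signal"):     ["loss of signal", "erroneous data", "intermittent output"],
}

def recommend_failure_modes(function: str, interface: str) -> list[str]:
    return FMEA_LIBRARY.get((function.lower(), interface.lower()),
                            ["no library match - enter manually"])

print(recommend_failure_modes("Transfer fluid", "Fluid"))
```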

  10. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Technical Reports Server (NTRS)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  11. BBAT: Bunch and bucket analysis tool

    SciTech Connect

    Deng, D.P.

    1995-05-01

    BBAT was written to meet the need for an interactive graphical tool to explore longitudinal phase space. It is intended for quickly testing new ideas or tricks, and it is especially suitable for machine physicists and operations staff, both in the control room during machine studies and off-line for analyzing data. The heart of the package is a set of C routines that do the number crunching. The graphics part is wired together with the scripting language Tcl/Tk and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT deals with a single rf system. For a double rf system, one can use Dr. BBAT, which stands for Double rf Bunch and Bucket Analysis Tool. One use of Dr. BBAT is to visualize the process of bunch coalescing and flat-bunch creation.
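
    For a stationary bucket the longitudinal motion is pendulum-like and the separatrix has a simple closed form; the sketch below evaluates it as a textbook illustration of the phase space such a tool explores, not as BBAT's own implementation.

```python
# Separatrix of a stationary rf bucket: dp(phi) = +/- dp_bucket * |cos(phi/2)|.
# dp_bucket is a hypothetical bucket half-height, not a machine value.
import numpy as np

dp_bucket = 2.0e-3                     # hypothetical bucket half-height (dp/p)
phi = np.linspace(-np.pi, np.pi, 9)    # synchrotron phase (rad)
dp_sep = dp_bucket * np.abs(np.cos(phi / 2.0))
for p, d in zip(phi, dp_sep):
    print(f"phi = {p:6.2f} rad  ->  separatrix dp/p = +/-{d:.2e}")
```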

  12. Software and tools for microarray data analysis.

    PubMed

    Mehta, Jai Prakash; Rani, Sweta

    2011-01-01

    A typical microarray experiment results in series of images, depending on the experimental design and number of samples. Software analyses the images to obtain the intensity at each spot and quantify the expression for each transcript. This is followed by normalization, and then various data analysis techniques are applied on the data. The whole analysis pipeline requires a large number of software to accurately handle the massive amount of data. Fortunately, there are large number of freely available and commercial software to churn the massive amount of data to manageable sets of differentially expressed genes, functions, and pathways. This chapter describes the software and tools which can be used to analyze the gene expression data right from the image analysis to gene list, ontology, and pathways.

  13. Interoperability of the analysis tools within the IMPEx project

    NASA Astrophysics Data System (ADS)

    Génot, Vincent; Khodachenko, Maxim; Kallio, Esa; Al-Ubaidi, Tarek; Gangloff, Michel; Budnik, Elena; Bouchemit, Myriam; Renard, Benjamin; Bourel, Natacha; Modolo, Ronan; Hess, Sébastien; André, Nicolas; Penou, Emmanuel; Topf, Florian; Alexeev, Igor; Belenkaya, Elena; Kalegaev, Vladimir; Schmidt, Walter

    2013-04-01

    The growing amount of data in planetary sciences requires adequate visualisation tools enabling in-depth analysis. Within the FP7 IMPEx infrastructure, data will originate from heterogeneous sources: large observational databases (CDAWeb, AMDA at CDPP, ...), simulation databases for hybrid and MHD codes (FMI, LATMOS), and a planetary magnetic field models database and online services (SINP). Together with the common "time series" visualisation functionality for both in-situ and modeled data (provided by the AMDA and CLWeb tools), IMPEx will also provide immersion capabilities into the complex 3D data originating from models (provided by 3DView). The functionalities of these tools will be described. The emphasis will be put on how these tools 1/ can share information (for instance Time Tables or user-composed parameters) and 2/ can be operated synchronously via dynamic connections based on Virtual Observatory standards.

  14. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  15. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  16. Enhancement of Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users, including the energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on the application of data and scientific techniques while processing multiple users' tasks simultaneously. Future development includes expanding the types of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System reanalysis data (CFSR) and NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer-term projection models, as well as plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancement and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).

  17. Web Based Tool for Mission Operations Scenarios

    NASA Technical Reports Server (NTRS)

    Boyles, Carole A.; Bindschadler, Duane L.

    2008-01-01

    A conventional practice for spaceflight projects is to document scenarios in a monolithic Operations Concept document. Such documents can be hundreds of pages long and may require laborious updates. Software development practice utilizes scenarios in the form of smaller, individual use cases, which are often structured and managed using UML. We have developed a process and a web-based scenario tool that utilizes a similar philosophy of smaller, more compact scenarios (but avoids the formality of UML). The need for a scenario process and tool became apparent during the authors' work on a large astrophysics mission. It was noted that every phase of the Mission (e.g., formulation, design, verification and validation, and operations) looked back to scenarios to assess completeness of requirements and design. It was also noted that terminology needed to be clarified and structured to assure communication across all levels of the project. Attempts to manage, communicate, and evolve scenarios at all levels of a project using conventional tools (e.g., Excel) and methods (Scenario Working Group meetings) were not effective given limitations on budget and staffing. The objective of this paper is to document the scenario process and tool created to offer projects a low-cost capability to create, communicate, manage, and evolve scenarios throughout project development. The process and tool have the further benefit of allowing the association of requirements with particular scenarios, establishing and viewing relationships between higher- and lower-level scenarios, and the ability to place all scenarios in a shared context. The resulting structured set of scenarios is widely visible (using a web browser), easily updated, and can be searched according to various criteria including the level (e.g., Project, System, and Team) and Mission Phase. Scenarios are maintained in a web-accessible environment that provides a structured set of scenario fields and allows for maximum

  18. Web Based Tool for Mission Operations Scenarios

    NASA Technical Reports Server (NTRS)

    Boyles, Carole A.; Bindschadler, Duane L.

    2008-01-01

    A conventional practice for spaceflight projects is to document scenarios in a monolithic Operations Concept document. Such documents can be hundreds of pages long and may require laborious updates. Software development practice utilizes scenarios in the form of smaller, individual use cases, which are often structured and managed using UML. We have developed a process and a web-based scenario tool that utilizes a similar philosophy of smaller, more compact scenarios (but avoids the formality of UML). The need for a scenario process and tool became apparent during the authors' work on a large astrophysics mission. It was noted that every phase of the Mission (e.g., formulation, design, verification and validation, and operations) looked back to scenarios to assess completeness of requirements and design. It was also noted that terminology needed to be clarified and structured to assure communication across all levels of the project. Attempts to manage, communicate, and evolve scenarios at all levels of a project using conventional tools (e.g., Excel) and methods (Scenario Working Group meetings) were not effective given limitations on budget and staffing. The objective of this paper is to document the scenario process and tool created to offer projects a low-cost capability to create, communicate, manage, and evolve scenarios throughout project development. The process and tool have the further benefit of allowing the association of requirements with particular scenarios, establishing and viewing relationships between higher- and lower-level scenarios, and the ability to place all scenarios in a shared context. The resulting structured set of scenarios is widely visible (using a web browser), easily updated, and can be searched according to various criteria including the level (e.g., Project, System, and Team) and Mission Phase. Scenarios are maintained in a web-accessible environment that provides a structured set of scenario fields and allows for maximum

  19. Determinants for global cargo analysis tools

    NASA Astrophysics Data System (ADS)

    Wilmoth, M.; Kay, W.; Sessions, C.; Hancock, M.

    2007-04-01

    The purpose of Global TRADER (GT) is not only to gather and query supply-chain transactional data for facts but also to analyze that data for hidden knowledge for the purpose of useful and meaningful pattern prediction. The application of advanced analytics provides benefits beyond simple information retrieval from GT, including computer-aided detection of useful patterns and associations. Knowledge discovery, offering a breadth and depth of analysis unattainable by manual processes, involves three components: repository structures, analytical engines, and user tools and reports. For a large and complex domain like supply-chains, there are many stages to developing the most advanced analytic capabilities; however, significant benefits accrue as components are incrementally added. These benefits include detecting emerging patterns; identifying new patterns; fusing data; creating models that can learn and predict behavior; and identifying new features for future tools. The GT Analyst Toolset was designed to overcome a variety of constraints, including lack of third party data, partial data loads, non-cleansed data (non-disambiguation of parties, misspellings, transpositions, etc.), and varying levels of analyst experience and expertise. The end result was a set of analytical tools that are flexible, extensible, tunable, and able to support a wide range of analyst demands.

  20. PATHA: Performance Analysis Tool for HPC Applications

    SciTech Connect

    Yoo, Wucherl; Koo, Michelle; Cao, Yi; Sim, Alex; Nugent, Peter; Wu, Kesheng

    2016-02-18

    Large science projects rely on complex workflows to analyze terabytes or petabytes of data. These jobs often run over thousands of CPU cores and simultaneously perform data accesses, data movements, and computation. It is difficult to identify bottlenecks or to debug performance issues in these large workflows. In order to address these challenges, we have developed the Performance Analysis Tool for HPC Applications (PATHA) using state-of-the-art open-source big data processing tools. Our framework can ingest system logs to extract key performance measures, and apply sophisticated statistical tools and data mining methods to the performance data. Furthermore, it utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of PATHA, we conduct a case study on the workflows from an astronomy project known as the Palomar Transient Factory (PTF). This study processed 1.6 TB of system logs collected on the NERSC supercomputer Edison. Using PATHA, we were able to identify performance bottlenecks, which reside in three tasks of the PTF workflow and depend on the density of celestial objects.
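
    The abstract does not give the log format or the statistical methods used; as a rough illustration of the kind of aggregation such a framework performs, the sketch below parses task durations from hypothetical log lines and ranks tasks by total time to surface candidate bottlenecks. The log format and task names are assumptions.

    ```python
    import re
    from collections import defaultdict

    # Hypothetical log format: "<timestamp> task=<name> elapsed=<seconds>"
    LOG_LINES = [
        "2015-07-01T02:10:11 task=image_subtraction elapsed=182.4",
        "2015-07-01T02:13:20 task=source_extraction elapsed=95.1",
        "2015-07-01T02:15:02 task=image_subtraction elapsed=240.9",
    ]

    def aggregate(lines):
        """Sum elapsed time per task to highlight candidate bottlenecks."""
        totals = defaultdict(float)
        pattern = re.compile(r"task=(\S+)\s+elapsed=([\d.]+)")
        for line in lines:
            m = pattern.search(line)
            if m:
                totals[m.group(1)] += float(m.group(2))
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

    for task, seconds in aggregate(LOG_LINES):
        print(f"{task}: {seconds:.1f} s")
    ```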

  1. PATHA: Performance Analysis Tool for HPC Applications

    DOE PAGES

    Yoo, Wucherl; Koo, Michelle; Cao, Yi; ...

    2016-02-18

    Large science projects rely on complex workflows to analyze terabytes or petabytes of data. These jobs often run over thousands of CPU cores and simultaneously perform data accesses, data movements, and computation. It is difficult to identify bottlenecks or to debug performance issues in these large workflows. In order to address these challenges, we have developed the Performance Analysis Tool for HPC Applications (PATHA) using state-of-the-art open-source big data processing tools. Our framework can ingest system logs to extract key performance measures, and apply sophisticated statistical tools and data mining methods to the performance data. Furthermore, it utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of PATHA, we conduct a case study on the workflows from an astronomy project known as the Palomar Transient Factory (PTF). This study processed 1.6 TB of system logs collected on the NERSC supercomputer Edison. Using PATHA, we were able to identify performance bottlenecks, which reside in three tasks of the PTF workflow and depend on the density of celestial objects.

  2. Accelerator Based Tools of Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Seestrom, Susan

    2017-01-01

    The Manhattan Project had to solve difficult challenges in physics and materials science. During the Cold War a large nuclear stockpile was developed. In both cases, the approach was largely empirical. Today that stockpile must be certified without nuclear testing, a task that becomes more difficult as the stockpile ages. I will discuss the role of modern accelerator-based experiments, such as x-ray radiography, proton radiography, and neutron and nuclear physics experiments, in stockpile stewardship. These new tools provide data of exceptional sensitivity and are answering questions about the stockpile, improving our scientific understanding, and providing validation for the computer simulations that are relied upon to certify today's stockpile.

  3. Sustainability Tools Inventory - Initial Gaps Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influenc...

  4. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    PubMed

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-01-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were namely 1. a checklist without risk estimation (Tool A), 2. a checklist with a risk scale (Tool B), 3. a risk calculation without a formal hazard identification stage (Tool C), and 4. a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were a lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of a different nature. Tools C and D utilized more systematic approaches than Tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of 1. its comprehensive structure with respect to the steps suggested in risk management, 2. its dynamic approach to hazard identification, and 3. its use of data resulting from the risk analysis.
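
    As a concrete illustration of the matrix-style estimation used by a tool such as Tool D, the sketch below combines severity and likelihood ratings into a qualitative risk level. The scales and cutoffs are illustrative assumptions, not those of any of the four tools studied.

    ```python
    def risk_level(severity, likelihood):
        """Combine 1-5 severity and likelihood ratings into a qualitative level.
        The scales and cutoffs here are illustrative, not those of any cited tool."""
        score = severity * likelihood
        if score >= 15:
            return "high"
        if score >= 8:
            return "medium"
        return "low"

    # Example: entry into a tank with a possible oxygen-deficient atmosphere
    print(risk_level(severity=5, likelihood=3))  # -> "high"
    ```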

  5. A Comparative Analysis of Life-Cycle Assessment Tools for ...

    EPA Pesticide Factsheets

    We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are waste reduction model (WARM), municipal solid waste-decision support tool (MSW-DST), solid waste optimization life-cycle framework (SWOLF), environmental assessment system for environmental technologies (EASETECH), and waste and resources assessment for the environment (WRATE). WARM, MSW-DST, and SWOLF were developed for US-specific materials management strategies, while WRATE and EASETECH were developed for European-specific conditions. All of the tools (with the exception of WARM) allow specification of a wide variety of parameters (e.g., materials composition and energy mix) to a varying degree, thus allowing users to model specific EOL materials management methods even outside the geographical domain they are originally intended for. The flexibility to accept user-specified input for a large number of parameters increases the level of complexity and the skill set needed for using these tools. The tools were evaluated and compared based on a series of criteria, including general tool features, the scope of the analysis (e.g., materials and processes included), and the impact categories analyzed (e.g., climate change, acidification). A series of scenarios representing materials management problems currently relevant to c

  7. Tools for augmented-reality-based liver resection planning

    NASA Astrophysics Data System (ADS)

    Reitinger, Bernhard; Bornik, Alexander; Beichel, Reinhard; Werkgartner, Georg; Sorantin, Erich

    2004-05-01

    Surgical resection has evolved into an accepted and widely used method for the treatment of liver tumors. In order to elaborate an optimal resection strategy, computer-aided planning tools are required. However, measurements based on 2D cross-sectional images are difficult to perform. Moreover, resection planning with current desktop-based systems using 3D visualization is also a tedious task because of limited 3D interaction. To facilitate the planning process, different tools are presented that allow easy user interaction in an augmented reality environment. Methods for quantitative analysis such as volume calculation and distance measurements are discussed with a focus on the user interaction aspect. In addition, a tool for automatically generating anatomical resection proposals based on knowledge about tumor locations and the portal vein tree is described. The presented methods are part of an evolving liver surgery planning system which is currently being evaluated by physicians.

  8. Pattern recognition tool based on complex network-based approach

    NASA Astrophysics Data System (ADS)

    Casanova, Dalcimar; Backes, André Ricardo; Martinez Bruno, Odemir

    2013-02-01

    This work proposes a generalization of the method introduced by the authors in 'A complex network-based approach for boundary shape analysis'. Instead of modelling a contour as a graph and using complex network measures to characterize it, we generalize the technique: the work proposes a mathematical tool for characterizing signals, curves, and sets of points. To evaluate the descriptive power of the proposal, an experiment on plant identification based on leaf vein images is conducted. Leaf venation is a taxonomic characteristic used for plant identification, and these structures are complex and difficult to represent as signals or curves for analysis with classical pattern recognition approaches. Here, we model the veins as a set of points and represent them as graphs. As features, we use degree and joint-degree measurements computed over a dynamic evolution of the network. The results demonstrate that the technique has good discriminative power and can be used for plant identification, as well as for other complex pattern recognition tasks.
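
    A simplified sketch of the general idea follows: connect points whose pairwise distance falls below a threshold and record the mean node degree as the threshold evolves, giving a feature vector for the point set. The thresholds and points are illustrative, and the authors' joint-degree measurements are omitted.

    ```python
    import math
    from itertools import combinations

    def degree_signature(points, thresholds):
        """For each distance threshold, link points closer than the threshold and
        record the mean node degree; the resulting vector is a shape descriptor.
        This is a simplified sketch of the dynamic-evolution idea, not the
        authors' exact measurements (which also include joint degrees)."""
        n = len(points)
        signature = []
        for t in thresholds:
            degree = [0] * n
            for (i, p), (j, q) in combinations(enumerate(points), 2):
                if math.dist(p, q) < t:
                    degree[i] += 1
                    degree[j] += 1
            signature.append(sum(degree) / n)
        return signature

    points = [(0.0, 0.0), (1.0, 0.2), (2.1, 0.1), (3.0, 1.5)]
    print(degree_signature(points, thresholds=[0.5, 1.5, 2.5]))
    ```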

  9. Retention payoff-based cost per day open regression equations: Application in a user-friendly decision support tool for investment analysis of automated estrus detection technologies.

    PubMed

    Dolecheck, K A; Heersche, G; Bewley, J M

    2016-12-01

    Assessing the economic implications of investing in automated estrus detection (AED) technologies can be overwhelming for dairy producers. The objectives of this study were to develop new regression equations for estimating the cost per day open (DO) and to apply the results to create a user-friendly, partial budget, decision support tool for investment analysis of AED technologies. In the resulting decision support tool, the end user can adjust herd-specific inputs regarding general management, current reproductive management strategies, and the proposed AED system. Outputs include expected DO, reproductive cull rate, net present value, and payback period for the proposed AED system. Utility of the decision support tool was demonstrated with an example dairy herd created using data from DairyMetrics (Dairy Records Management Systems, Raleigh, NC), Food and Agricultural Policy Research Institute (Columbia, MO), and published literature. Resulting herd size, rolling herd average milk production, milk price, and feed cost were 323 cows, 10,758 kg, $0.41/kg, and $0.20/kg of dry matter, respectively. Automated estrus detection technologies with 2 levels of initial system cost (low: $5,000 vs. high: $10,000), tag price (low: $50 vs. high: $100), and estrus detection rate (low: 60% vs. high: 80%) were compared over a 7-yr investment period. Four scenarios were considered in a demonstration of the investment analysis tool: (1) a herd using 100% visual observation for estrus detection before adopting 100% AED, (2) a herd using 100% visual observation before adopting 75% AED and 25% visual observation, (3) a herd using 100% timed artificial insemination (TAI) before adopting 100% AED, and (4) a herd using 100% TAI before adopting 75% AED and 25% TAI. Net present value in scenarios 1 and 2 was always positive, indicating a positive investment situation. Net present value in scenarios 3 and 4 was always positive in combinations using a $50 tag price, and in scenario 4, the $5
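
    The investment outputs mentioned above (net present value and payback period) follow from standard partial-budget arithmetic; the sketch below computes both from projected annual cash flows. The figures are illustrative assumptions and are not taken from the study.

    ```python
    def net_present_value(rate, cash_flows):
        """Discount a list of annual cash flows (year 0 first) at the given rate."""
        return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

    def payback_period(cash_flows):
        """Return the first year in which cumulative (undiscounted) cash flow
        turns non-negative, or None if it never does."""
        cumulative = 0.0
        for year, cf in enumerate(cash_flows):
            cumulative += cf
            if cumulative >= 0:
                return year
        return None

    # Illustrative 7-year horizon: initial system cost, then annual net benefits
    flows = [-10000.0] + [2500.0] * 7
    print(round(net_present_value(0.08, flows), 2), payback_period(flows))
    ```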

  10. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  11. Tools for T-RFLP data analysis using Excel.

    PubMed

    Fredriksson, Nils Johan; Hermansson, Malte; Wilén, Britt-Marie

    2014-11-08

    Terminal restriction fragment length polymorphism (T-RFLP) analysis is a DNA-fingerprinting method that can be used for comparisons of the microbial community composition in a large number of samples. There is no consensus on how T-RFLP data should be treated and analyzed before comparisons between samples are made, and several different approaches have been proposed in the literature. The analysis of T-RFLP data can be cumbersome and time-consuming, and for large datasets manual data analysis is not feasible. The currently available tools for automated T-RFLP analysis, although valuable, offer little flexibility, and few, if any, options regarding what methods to use. To enable comparisons and combinations of different data treatment methods an analysis template and an extensive collection of macros for T-RFLP data analysis using Microsoft Excel were developed. The Tools for T-RFLP data analysis template provides procedures for the analysis of large T-RFLP datasets including application of a noise baseline threshold and setting of the analysis range, normalization and alignment of replicate profiles, generation of consensus profiles, normalization and alignment of consensus profiles and final analysis of the samples including calculation of association coefficients and diversity index. The procedures are designed so that in all analysis steps, from the initial preparation of the data to the final comparison of the samples, there are various different options available. The parameters regarding analysis range, noise baseline, T-RF alignment and generation of consensus profiles are all given by the user and several different methods are available for normalization of the T-RF profiles. In each step, the user can also choose to base the calculations on either peak height data or peak area data. The Tools for T-RFLP data analysis template enables an objective and flexible analysis of large T-RFLP datasets in a widely used spreadsheet application.
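
    Two of the steps described, applying a noise baseline threshold and normalizing a T-RF profile to relative abundances, can be illustrated with the short sketch below. The threshold and peak values are illustrative, and the actual template implements these steps as Excel macros rather than Python.

    ```python
    def clean_profile(peaks, noise_threshold):
        """Drop peaks below a noise baseline, then normalize the remaining peak
        heights (or areas) to relative abundances that sum to 1.
        `peaks` maps T-RF length (bp) to peak height; the threshold is illustrative."""
        kept = {trf: h for trf, h in peaks.items() if h >= noise_threshold}
        total = sum(kept.values())
        return {trf: h / total for trf, h in kept.items()} if total else {}

    profile = {72: 1500.0, 145: 80.0, 210: 4200.0, 388: 950.0}
    print(clean_profile(profile, noise_threshold=100.0))
    ```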

  12. A Tool for Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort to either manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., the costs of gearing up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. Ensuring that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, along with a prototype tool to support it. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.

  13. A conceptual design tool for RBCC engine performance analysis

    NASA Astrophysics Data System (ADS)

    Olds, John R.; Saks, Greg

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework.

  14. A conceptual design tool for RBCC engine performance analysis

    SciTech Connect

    Olds, J.R.; Saks, G.

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework. © 1997 American Institute of Physics.

  15. A reliability analysis tool for SpaceWire network

    NASA Astrophysics Data System (ADS)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power, and fault protection. High reliability is a vital issue for spacecraft; therefore, it is very important to analyze and improve the reliability performance of a SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. According to the functional division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of every task leads to a system reliability matrix, and the reliability of the network system can be deduced by integrating all of the reliability indexes in the matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and the multi-path-task reliability are also implemented. Using this tool, we analyze several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability of the system or task. Finally, this reliability analysis tool will have a direct influence on both task division and topology selection in the design phase of a SpaceWire network system.
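
    The series/parallel combination underlying a task-based reliability analysis can be sketched as follows: a task path succeeds only if every unit along it works, and a dual-redundant unit is combined in parallel. The reliability figures below are illustrative assumptions, not values from the paper.

    ```python
    def series(reliabilities):
        """A task path works only if every unit in series works."""
        r = 1.0
        for x in reliabilities:
            r *= x
        return r

    def parallel(reliabilities):
        """A redundant group works if at least one unit works."""
        q = 1.0
        for x in reliabilities:
            q *= (1.0 - x)
        return 1.0 - q

    # Illustrative numbers: node -> router -> node, basic vs. dual-redundant router
    basic = series([0.995, 0.990, 0.995])
    redundant = series([0.995, parallel([0.990, 0.990]), 0.995])
    print(f"basic: {basic:.5f}, dual-redundant: {redundant:.5f}")
    ```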

  16. Development of New Modeling and Analysis Tools for Solar Sails

    NASA Technical Reports Server (NTRS)

    Lou, Michael; Fang, Houfei; Yang, Bingen

    2004-01-01

    Existing finite-element-based structural analysis codes are ineffective in treating deployable gossamer space systems, including solar sails that are formed by long space-deployable booms and extremely large thin-film membrane apertures. Recognizing this, the NASA Space Transportation Technology Program has initiated and sponsored a focused research effort to develop new and computationally efficient structural modeling and analysis tools for solar sails. The technical approach of this ongoing effort will be described. Two solution methods, the Distributed Transfer Function Method and the Parameter-Variation-Principle method, on which the technical approach was formulated, are also discussed.

  18. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…

  19. RSAT 2011: regulatory sequence analysis tools

    PubMed Central

    Thomas-Chollier, Morgane; Defrance, Matthieu; Medina-Rivera, Alejandra; Sand, Olivier; Herrmann, Carl; Thieffry, Denis

    2011-01-01

    RSAT (Regulatory Sequence Analysis Tools) comprises a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. Thirteen new programs have been added to the 30 described in the 2008 NAR Web Software Issue, including an automated sequence retrieval from EnsEMBL (retrieve-ensembl-seq), two novel motif discovery algorithms (oligo-diff and info-gibbs), a 100-times faster version of matrix-scan enabling the scanning of genome-scale sequence sets, and a series of facilities for random model generation and statistical evaluation (random-genome-fragments, random-motifs, random-sites, implant-sites, sequence-probability, permute-matrix). Our most recent work also focused on motif comparison (compare-matrices) and evaluation of motif quality (matrix-quality) by combining theoretical and empirical measures to assess the predictive capability of position-specific scoring matrices. To process large collections of peak sequences obtained from ChIP-seq or related technologies, RSAT provides a new program (peak-motifs) that combines several efficient motif discovery algorithms to predict transcription factor binding motifs, match them against motif databases and predict their binding sites. Availability (web site, stand-alone programs and SOAP/WSDL (Simple Object Access Protocol/Web Services Description Language) web services): http://rsat.ulb.ac.be/rsat/. PMID:21715389

  20. Image-Based 3D Reconstruction Data as an Analysis and Documentation Tool for Architects: The Case of Plaka Bridge in Greece

    NASA Astrophysics Data System (ADS)

    Kouimtzoglou, T.; Stathopoulou, E. K.; Agrafiotis, P.; Georgopoulos, A.

    2017-02-01

    Modern advances in the field of image-based 3D reconstruction of complex architectures are valuable tools that offer researchers great possibilities for integrating such procedures into their studies. In the same way that photogrammetry has been a well-known and useful tool in the cultural heritage community for years, state-of-the-art reconstruction techniques generate complete and easy-to-use 3D data, thus enabling engineers, architects and other cultural heritage experts to approach their case studies in an exhaustive and efficient way. The generated data can be a valuable and accurate basis upon which further plans and studies will be drafted. These and other aspects of the use of image-based 3D data for architectural studies are presented and analysed in this paper, based on the experience gained from a specific case study, the Plaka Bridge. This historic structure is of particular interest, as it was recently lost due to extreme weather conditions and serves as strong proof that preventive actions are of utmost importance in order to preserve our common past.

  1. Using the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel J.; Parker, Joel

    2017-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT). These slides will be used to accompany the demonstration. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. This talk is a combination of existing presentations and material: the system user guide and technical documentation, a GMAT basics overview, and technical presentations from the TESS project on its application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project. Slides for navigation and optimal control are borrowed from system documentation and training material.

  2. Multi-Mission Power Analysis Tool

    NASA Technical Reports Server (NTRS)

    Broderick, Daniel

    2011-01-01

    Multi-Mission Power Analysis Tool (MMPAT) Version 2 simulates spacecraft power generation, use, and storage in order to support spacecraft design, mission planning, and spacecraft operations. It can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. A user-friendly GUI (graphical user interface) makes it easy to use. Multiple deployments allow use on the desktop, in batch mode, or as a callable library. It includes detailed models of solar arrays, radioisotope thermoelectric generators, nickel-hydrogen and lithium-ion batteries, and various load types. There is built-in flexibility through user-designed state models and table-driven parameters.

  3. Healthcare BI: a tool for meaningful analysis.

    PubMed

    Rohloff, Rose

    2011-05-01

    Implementing an effective business intelligence (BI) system requires organizationwide preparation and education to allow for meaningful analysis of information. Hospital executives should take steps to ensure that: Staff entering data are proficient in how the data are to be used for decision making, and integration is based on clean data from primary sources of entry. Managers have the business acumen required for effective data analysis. Decision makers understand how multidimensional BI offers new ways of analysis that represent significant improvements over historical approaches using static reporting.

  4. Timeline analysis tools for law enforcement

    NASA Astrophysics Data System (ADS)

    Mucks, John

    1997-02-01

    The timeline analysis system (TAS) was developed by Rome Laboratory to assist intelligence analysts with the comprehension of large amounts of information. Under the TAS program, data visualization, manipulation and reasoning tools were developed in close coordination with end users. The initial TAS prototype was developed for foreign command and control analysts at Space Command in Colorado Springs and was fielded there in 1989. The TAS prototype replaced manual paper timeline maintenance and analysis techniques and has become an integral part of Space Command's information infrastructure. TAS was designed to be domain independent and has been tailored and proliferated to a number of other users. The TAS program continues to evolve because of strong user support. User-funded enhancements and Rome Lab-funded technology upgrades have significantly enhanced TAS over the years and will continue to do so for the foreseeable future. TAS was recently provided to the New York State Police (NYSP) for evaluation using actual case data. Timeline analysis, it turns out, is a popular methodology in law enforcement. The evaluation has led to a more comprehensive application and evaluation project sponsored by the National Institute of Justice (NIJ). This paper describes the capabilities of TAS, the results of the initial NYSP evaluation, and the plan for a more comprehensive NYSP evaluation.

  5. Web-based portfolios: a valuable tool for surgical education.

    PubMed

    Lewis, Catherine E; Tillou, Areti; Yeh, Michael W; Quach, Chi; Hiatt, Jonathan R; Hines, O Joe

    2010-06-01

    Our residency program developed and implemented an online portfolio system. In the present communication, we describe this system and provide an early analysis of its effect on competency-based performance and acceptance of the system by the residents. To measure competency-based performance, end-of-rotation global evaluations of residents by faculty completed before (n = 1488) and after (n = 697) implementation of the portfolio were compared. To assess acceptance, residents completed a 20-question survey. Practice-based learning and improvement improved following implementation of the portfolio system (P = 0.002). There was also a trend toward improvement in the remaining competencies. In the survey tool (response rate 69%), 95% of the residents agreed that the purpose and functions of the system had been explained to them, and 82% affirmed understanding of ways in which the system could help them, although fewer than half reported that their portfolio had aided in their development of the competencies. All residents appreciated the system's organizational capabilities, and 87% agreed that the portfolio was a useful educational tool. This web portfolio program is a valuable new instrument for both residents and administrators. Early analysis of its impact demonstrates a positive effect across all competencies, and survey analysis revealed that residents have a positive view of this new system. As the portfolio is further incorporated into the educational program, we believe that our residents will discover new tools to craft a career of genuine self-directed learning. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  6. Forensic massively parallel sequencing data analysis tool: Implementation of MyFLq as a standalone web- and Illumina BaseSpace(®)-application.

    PubMed

    Van Neste, Christophe; Gansemans, Yannick; De Coninck, Dieter; Van Hoofstat, David; Van Criekinge, Wim; Deforce, Dieter; Van Nieuwerburgh, Filip

    2015-03-01

    Routine use of massively parallel sequencing (MPS) for forensic genomics is on the horizon. In the last few years, several algorithms and workflows have been developed to analyze forensic MPS data. However, none have yet been tailored to the needs of the forensic analyst who does not possess an extensive bioinformatics background. We developed our previously published forensic MPS data analysis framework MyFLq (My-Forensic-Loci-queries) into an open-source, user-friendly, web-based application. It can be installed as a standalone web application, or run directly from the Illumina BaseSpace environment. In the former, laboratories can keep their data on-site, while in the latter, data from forensic samples that are sequenced on an Illumina sequencer can be uploaded to BaseSpace during acquisition, and can subsequently be analyzed using the published MyFLq BaseSpace application. Additional features were implemented such as an interactive graphical report of the results, an interactive threshold selection bar, and an allele length-based analysis in addition to the sequence-based analysis. Practical use of the application is demonstrated through the analysis of four 16-plex short tandem repeat (STR) samples, showing the complementarity between the sequence- and length-based analysis of the same MPS data. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  7. Built Environment Analysis Tool: April 2013

    SciTech Connect

    Porter, C.

    2013-05-01

    This documentation describes the tool development. It was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

  8. Virtual Tool Mark Generation for Efficient Striation Analysis

    SciTech Connect

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-02-16

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5–10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.
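
    A rough sketch of the projection step is shown below: a 3D point cloud of the tip is collapsed along the travel direction into a 2D edge profile, which can then be compared with a measured plate cross-section. The simple normalized cross-correlation used here is a stand-in for the statistical likelihood algorithm of Chumbley et al., and the data are synthetic rather than microscope scans.

    ```python
    import numpy as np

    def virtual_mark_profile(tip_points, n_bins=200):
        """Project tip points (columns: lateral x, travel y, height z) along the
        travel direction: keep the maximum height in each lateral bin, which
        approximates the edge that would scrape the plate surface."""
        x, z = tip_points[:, 0], tip_points[:, 2]
        edges = np.linspace(x.min(), x.max(), n_bins + 1)
        idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
        profile = np.full(n_bins, -np.inf)
        np.maximum.at(profile, idx, z)
        profile[np.isneginf(profile)] = z.min()  # empty bins fall back to the lowest height
        return profile

    def similarity(a, b):
        """Normalized cross-correlation between two equal-length profiles."""
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return float(np.mean(a * b))

    # Toy point cloud standing in for a microscope scan of a screwdriver tip
    rng = np.random.default_rng(0)
    tip = rng.random((5000, 3))
    mark = virtual_mark_profile(tip)
    print(similarity(mark, mark))  # identical profiles -> 1.0
    ```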

  9. Virtual tool mark generation for efficient striation analysis.

    PubMed

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-07-01

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5-10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.

  10. BASE: Bayesian Astrometric and Spectroscopic Exoplanet Detection and Characterization Tool

    NASA Astrophysics Data System (ADS)

    Schulze-Hartung, Tim

    2012-08-01

    BASE is a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The tool fulfills two major tasks of exoplanet science, namely the detection of exoplanets and the characterization of their orbits. BASE was developed to provide the possibility of an integrated Bayesian analysis of stellar astrometric and Doppler-spectroscopic measurements with respect to their binary or planetary companions' signals, correctly treating the astrometric measurement uncertainties and allowing exploration of the whole parameter space without the need for informative prior constraints. The tool automatically diagnoses convergence of its Markov chain Monte Carlo (MCMC) sampler to the posterior and regularly outputs status information. For orbit characterization, BASE delivers important results such as the probability densities and correlations of model parameters and derived quantities. BASE is a highly configurable command-line tool developed in Fortran 2008 and compiled with GFortran. Options can be used to control the program's behaviour and supply information such as the stellar mass or prior information. Any option can be supplied in a configuration file and/or on the command line.
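
    As an illustration of the MCMC idea (not BASE's actual model or sampler), the sketch below runs a random-walk Metropolis sampler over a single parameter, the semi-amplitude of a sinusoidal radial-velocity signal with an assumed known period and Gaussian noise.

    ```python
    import math
    import random

    # Synthetic radial-velocity data: v(t) = K * sin(2*pi*t/P) + noise
    # (known period P, Gaussian noise) -- purely illustrative, not BASE's model.
    random.seed(1)
    P, K_TRUE, SIGMA = 10.0, 35.0, 5.0
    times = [i * 0.7 for i in range(40)]
    data = [K_TRUE * math.sin(2 * math.pi * t / P) + random.gauss(0, SIGMA) for t in times]

    def log_posterior(K):
        """Flat prior on K > 0 plus a Gaussian likelihood."""
        if K <= 0:
            return -math.inf
        return -0.5 * sum(((v - K * math.sin(2 * math.pi * t / P)) / SIGMA) ** 2
                          for t, v in zip(times, data))

    def metropolis(n_steps=5000, step=2.0, start=20.0):
        """Random-walk Metropolis sampler over the single parameter K."""
        chain, k, lp = [], start, log_posterior(start)
        for _ in range(n_steps):
            k_new = k + random.gauss(0, step)
            lp_new = log_posterior(k_new)
            if lp_new >= lp or random.random() < math.exp(lp_new - lp):
                k, lp = k_new, lp_new
            chain.append(k)
        return chain

    samples = metropolis()[1000:]  # discard burn-in
    print(sum(samples) / len(samples))  # posterior mean, close to K_TRUE
    ```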

  11. TERPRED: A Dynamic Structural Data Analysis Tool

    PubMed Central

    Walker, Karl; Cramer, Carole L.; Jennings, Steven F.; Huang, Xiuzhen

    2012-01-01

    Computational protein structure prediction mainly involves main-chain prediction and side-chain conformation determination. In this research, we developed a new structural bioinformatics tool, TERPRED, for generating dynamic protein side-chain rotamer libraries. Compared with various current rotamer sampling methods, our work is unique in that it provides a method to generate a rotamer library dynamically based on small sequence fragments of a target protein. The Rotamer Generator provides a means for existing side-chain sampling methods that use static, pre-existing rotamer libraries to sample from dynamic, target-dependent libraries. Also, existing side-chain packing algorithms that require large rotamer libraries for optimal performance could possibly utilize smaller, target-relevant libraries for improved speed. PMID:25302339

  12. Haystack, a web-based tool for metabolomics research

    PubMed Central

    2014-01-01

    Background Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. Results To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Conclusion Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. It offers users a range of data visualization
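
    The core binning idea can be sketched as follows: intensities are summed into fixed m/z intervals so that every sample becomes a vector of the same length, which can then be fed to PCA. The bin width and the toy data below are illustrative assumptions, not Haystack's defaults.

    ```python
    import numpy as np

    def bin_spectrum(mz, intensity, mz_min=100.0, mz_max=1000.0, width=1.0):
        """Sum intensities into fixed m/z intervals so every sample becomes a
        vector of the same length (one value per bin). Bin width is illustrative."""
        edges = np.arange(mz_min, mz_max + width, width)
        binned, _ = np.histogram(mz, bins=edges, weights=intensity)
        return binned

    # Two toy "samples"; real data would come from LCMS runs.
    rng = np.random.default_rng(0)
    samples = np.array([
        bin_spectrum(rng.uniform(100, 1000, 500), rng.random(500)),
        bin_spectrum(rng.uniform(100, 1000, 500), rng.random(500)),
    ])

    # PCA via SVD on mean-centered binned data
    centered = samples - samples.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    scores = centered @ vt.T
    print(scores[:, 0])  # first principal component score for each sample
    ```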

  13. Usability of a web-based personal nutrition management tool.

    PubMed

    Bozkurt, Selen; Zayim, Neşe; Gulkesen, Kemal Hakan; Samur, Mehmet Kemal; Karaağaoglu, Nilgun; Saka, Osman

    2011-12-01

    'Personal Nutrition Management Tool' (PENUMAT) is an interactive web-based application that aims to help individuals seeking nutrition information on the Internet. However, little is known about the usability of such applications. The purpose of this study was to evaluate the usability of PENUMAT using a multi-method approach. For an in-depth usability analysis, a multi-method approach involving protocol analysis, interviews and a system usability scale (SUS) was adopted. The sample consisted of 10 healthy volunteers (five males and five females) between the ages of 22 and 60. The overall usability score was calculated, and usability problems and users' opinions were obtained. All usability problems were classified according to the heuristics and listed with their frequencies. Overall, the usability score ranged from 77.5 to 100, with a median of 88.7. The in-depth usability analysis exposed several usability problems, mostly related to content, navigation and interactivity. Interview results showed that 'being personal and private' (70%) and 'providing personal feedback' (60%) were the most appreciated characteristics of the tool. Although the tool has an acceptable overall usability score, several previously unnoticed usability problems in the interface design were revealed by the in-depth analysis. Therefore, the importance of using a multi-method analysis of usability was pointed out.
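
    The System Usability Scale score reported above is computed from the 10 questionnaire items with the standard SUS formula, sketched below; the responses shown are illustrative, not data from the study.

    ```python
    def sus_score(responses):
        """Standard SUS scoring for 10 items rated 1-5:
        odd-numbered items contribute (rating - 1), even-numbered items (5 - rating),
        and the sum is scaled by 2.5 to give a 0-100 score."""
        assert len(responses) == 10
        total = 0
        for i, r in enumerate(responses, start=1):
            total += (r - 1) if i % 2 == 1 else (5 - r)
        return total * 2.5

    # Illustrative responses from one participant (not data from the study)
    print(sus_score([5, 2, 4, 1, 5, 2, 4, 1, 5, 2]))  # -> 87.5
    ```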

  14. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
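
    Hack's stream length-gradient (SL) index, one basis of the method, can be computed along a longitudinal profile as sketched below; reaches with anomalously high SL are candidate knickpoints. The profile values and the flagging threshold are illustrative assumptions, and the actual tool operates on DEM-derived drainage profiles within ArcGIS.

    ```python
    def sl_index(distances, elevations):
        """Hack's stream length-gradient index for each reach of a longitudinal
        profile: SL = (drop over the reach / reach length) * distance from the
        divide to the reach midpoint. `distances` are downstream distances (m)
        from the divide; `elevations` are channel elevations (m)."""
        values = []
        for i in range(len(distances) - 1):
            dl = distances[i + 1] - distances[i]
            dh = elevations[i] - elevations[i + 1]
            midpoint = (distances[i] + distances[i + 1]) / 2.0
            values.append((dh / dl) * midpoint)
        return values

    # Illustrative profile with a steepened reach (candidate knickpoint) near 3 km
    dist = [500, 1000, 2000, 3000, 4000, 5000]
    elev = [980, 960, 930, 900, 820, 800]
    sl = sl_index(dist, elev)
    threshold = 2.0 * (sum(sl) / len(sl))  # flag reaches well above the mean (illustrative)
    print([i for i, v in enumerate(sl) if v > threshold])
    ```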

  15. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraph models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
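
    A minimal sketch of the digraph idea follows: directed edges mean "failure of A can cause failure of B", and a set of initiating failures is propagated downstream by reachability. The node names and model are illustrative assumptions, not FEAT's.

    ```python
    from collections import defaultdict, deque

    def propagate(edges, initial_failures):
        """Given directed edges (a, b) meaning 'failure of a can cause failure of b',
        return every node reachable from the initiating failure set."""
        graph = defaultdict(list)
        for a, b in edges:
            graph[a].append(b)
        failed, queue = set(initial_failures), deque(initial_failures)
        while queue:
            node = queue.popleft()
            for nxt in graph[node]:
                if nxt not in failed:
                    failed.add(nxt)
                    queue.append(nxt)
        return failed

    # Illustrative model: loss of a power bus propagates to its loads; names are made up.
    edges = [("battery", "power_bus"), ("power_bus", "heater"),
             ("power_bus", "transmitter"), ("clock", "transmitter")]
    print(propagate(edges, {"power_bus"}))  # -> {'power_bus', 'heater', 'transmitter'}
    ```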

  17. Multi-mission space vehicle subsystem analysis tools

    NASA Technical Reports Server (NTRS)

    Kordon, M.; Wood, E.

    2003-01-01

    Spacecraft engineers often rely on specialized simulation tools to facilitate the analysis, design and operation of space systems. Unfortunately, these tools are often designed for one phase of a single mission and cannot be easily adapted to other phases or other missions. The Multi-Mission Space Vehicle Subsystem Analysis Tools are designed to provide a solution to this problem.

  18. ISHM Decision Analysis Tool: Operations Concept

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew with relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. I describe both of these aspects.

  19. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many shield thicknesses are needed to get an accurate result. So convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
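
    The convergence testing described can be illustrated with a toy Monte Carlo experiment: repeat an estimate at increasing sample (ray) counts and watch the spread between repeats shrink roughly as 1/sqrt(N). The integrand below is a stand-in, not an actual shielding or dose calculation.

    ```python
    import math
    import random
    import statistics

    def toy_estimate(n_rays, rng):
        """Stand-in for a ray-traced dose estimate: average an attenuation-like
        quantity exp(-path) over randomly sampled path lengths (not a real
        shielding calculation)."""
        return sum(math.exp(-rng.uniform(0.0, 2.0)) for _ in range(n_rays)) / n_rays

    rng = random.Random(42)
    for n in (100, 1000, 10000):
        repeats = [toy_estimate(n, rng) for _ in range(20)]
        print(n, round(statistics.stdev(repeats), 5))  # spread shrinks roughly as 1/sqrt(n)
    ```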

  20. Biomedical tools based on magnetic nanoparticles

    NASA Astrophysics Data System (ADS)

    Saba, Anna R.; Castillo, Paula M.; Fantechi, Elvira; Sangregorio, Claudio; Lascialfari, Alessandro; Sbarbati, Andrea; Casu, Alberto; Falqui, Andrea; Casula, Maria F.

    2013-02-01

    Magnetic and superparamagnetic colloids represent a versatile platform for the design of functional nanostructures which may act as effective tools for biomedicine, being active in cancer therapy, tissue imaging and magnetic separation. The structural, morphological and hence magnetic features of the magnetic nanoparticles must be tuned for optimal performance in a given application. In this work, iron oxide nanocrystals have been prepared as prospective heat mediators in magnetic fluid hyperthermia therapy. A procedure based on the partial oxidation of iron(II) precursors in water-based media has been adopted, and the synthesis outcome has been investigated by X-ray diffraction and transmission electron microscopy. It was found that by adjusting the synthetic parameters (mainly the oxidation rate), magnetic iron oxide nanocrystals with cubic and cuboctahedral shape and an average size of 50 nm were obtained. The nanocrystals were tested as hyperthermic mediators through Specific Absorption Rate (SAR) measurements. The samples act as heat mediators, being able to raise the temperature from physiological temperature to the temperatures used for magnetic hyperthermia upon short exposure to an alternating magnetic field, and they exhibit reproducible temperature kinetics.

  1. Solar Array Verification Analysis Tool (SAVANT) Developed

    NASA Technical Reports Server (NTRS)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.
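
    The final sentence describes a continuous slowing down approximation (CSDA) for particles traversing the coverglass stack. A minimal sketch of the generic range-energy form of the CSDA is shown below; the power-law range-energy relation and all numbers are placeholder assumptions for illustration and do not represent SAVANT's actual stopping-power model.

    ```python
    import numpy as np

    # Hypothetical power-law range-energy relation R(E) = A * E**P (illustrative units).
    A, P = 0.0022, 1.77

    def csda_exit_energy(e_in, thickness):
        """Continuous slowing down approximation via residual range.

        A particle entering a layer exits with the energy whose range equals the
        incident range minus the layer thickness (same units as the range)."""
        r_in = A * e_in**P                  # range of the incident particle
        r_out = r_in - thickness            # residual range after the layer
        if r_out <= 0.0:
            return 0.0                      # particle stops inside the layer
        return (r_out / A) ** (1.0 / P)     # invert R(E) to get the exit energy

    # Degrade a simple incident spectrum through a coverglass-like layer.
    energies = np.array([10.0, 30.0, 100.0])          # MeV, illustrative
    exit_e = [csda_exit_energy(e, 0.05) for e in energies]
    print(exit_e)
    ```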

  2. Is motion analysis a valid tool for assessing laparoscopic skill?

    PubMed

    Mason, John D; Ansell, James; Warren, Neil; Torkington, Jared

    2013-05-01

    The use of simulation for laparoscopic training has led to the development of objective tools for skills assessment. Motion analysis represents one area of focus. This study was designed to assess the evidence for the use of motion analysis as a valid tool for laparoscopic skills assessment. Embase, MEDLINE and PubMed were searched using the following domains: (1) motion analysis, (2) validation and (3) laparoscopy. Studies investigating motion analysis as a tool for assessment of laparoscopic skill in general surgery were included. Common endpoints in motion analysis metrics were compared between studies according to a modified form of the Oxford Centre for Evidence-Based Medicine levels of evidence and recommendation. Thirteen studies were included from 2,039 initial papers. Twelve (92.3 %) reported the construct validity of motion analysis across a range of laparoscopic tasks. Of these 12, 5 (41.7 %) evaluated the ProMIS Augmented Reality Simulator, 3 (25 %) the Imperial College Surgical Assessment Device (ICSAD), 2 (16.7 %) the Hiroshima University Endoscopic Surgical Assessment Device (HUESAD), 1 (8.33 %) the Advanced Dundee Endoscopic Psychomotor Tester (ADEPT) and 1 (8.33 %) the Robotic and Video Motion Analysis Software (ROVIMAS). Face validity was reported by 1 (7.7 %) study each for ADEPT and ICSAD. Concurrent validity was reported by 1 (7.7 %) study each for ADEPT, ICSAD and ProMIS. There was no evidence for predictive validity. Evidence exists to validate motion analysis for use in laparoscopic skills assessment. Valid parameters are time taken, path length and number of hand movements. Future work should concentrate on the conversion of motion data into competency-based scores for trainee feedback.

  3. Tool Use of Experienced Learners in Computer-Based Learning Environments: Can Tools Be Beneficial?

    ERIC Educational Resources Information Center

    Juarez Collazo, Norma A.; Corradi, David; Elen, Jan; Clarebout, Geraldine

    2014-01-01

    Research has documented the use of tools in computer-based learning environments as problematic, that is, learners do not use the tools and when they do, they tend to do it suboptimally. This study attempts to disentangle cause and effect of this suboptimal tool use for experienced learners. More specifically, learner variables (metacognitive and…

  4. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    NASA Astrophysics Data System (ADS)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.
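
    As a small illustration of the comparison reported above, measurements of the same nodules by several tools can be compared with a one-way ANOVA. The numbers below are invented for demonstration and are unrelated to the study's data; the study's repeated-measures design would in practice call for a more careful analysis.

    ```python
    from scipy import stats

    # Hypothetical maximum-diameter measurements (mm) of the same nodules by three tools.
    tool_a = [6.1, 8.4, 12.2, 15.9, 21.3]
    tool_b = [6.8, 9.1, 13.0, 16.6, 22.5]
    tool_c = [5.7, 8.0, 11.8, 15.1, 20.7]

    f_stat, p_value = stats.f_oneway(tool_a, tool_b, tool_c)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
    ```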

  5. Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)

    SciTech Connect

    Melaina, M.; Penev, M.

    2012-09-01

    NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.

  6. PyRAT - python radiography analysis tool (u)

    SciTech Connect

    Temple, Brian A; Buescher, Kevin L; Armstrong, Jerawan C

    2011-01-14

    PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on Linux and Windows platforms. The tool is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process. The tool utilizes the NOMAD mixed variable optimization tool to perform the optimization.

  7. Tool life modeling and computer simulation of tool wear when nickel-based material turning

    NASA Astrophysics Data System (ADS)

    Zebala, W.

    2016-09-01

    This paper presents tool life investigations concerning the modeling and simulation of tool wear when turning a difficult-to-cut material, a nickel-based sintered powder workpiece. The cutting tool, made of CBN, has a special geometry. The workpiece, in the form of a disc, is an aircraft engine part. The aim of the research is to optimize the cutting data in order to decrease tool wear and improve the machined surface roughness.

  8. WhichGenes: a web-based tool for gathering, building, storing and exporting gene sets with application in gene set enrichment analysis.

    PubMed

    Glez-Peña, Daniel; Gómez-López, Gonzalo; Pisano, David G; Fdez-Riverola, Florentino

    2009-07-01

    WhichGenes is a web-based interactive gene set building tool offering a very simple interface to extract always-updated gene lists from multiple databases and unstructured biological data sources. While the user can specify new gene sets of interest by following a simple four-step wizard, the tool is able to run several queries in parallel. Every time a new set is generated, it is automatically added to the private gene-set cart and the user is notified by an e-mail containing a direct link to the new set stored on the server. WhichGenes provides functionalities to edit, delete and rename existing sets as well as the capability of generating new ones by combining previously existing sets (intersection, union and difference operators). Users can export their sets, configuring the output format and selecting among multiple gene identifiers. In addition to the user-friendly environment, WhichGenes allows programmers to access its functionalities programmatically through a Representational State Transfer web service. The WhichGenes front-end is freely available at http://www.whichgenes.org/, and the WhichGenes API is accessible at http://www.whichgenes.org/api/.

  9. Early warning tools for ecotoxicity assessment based on Phaeodactylum tricornutum.

    PubMed

    Renzi, Monia; Roselli, Leonilde; Giovani, Andrea; Focardi, Silvano E; Basset, Alberto

    2014-08-01

    Phaeodactylum tricornutum was exposed to various toxic substances (zinc, copper or dodecylbenzenesulfonic acid sodium salt) in accordance with the AlgalToxkit(®) protocol based on the UNI EN ISO 10253 method in order to quantitatively compare the responses obtained by traditional growth-rate inhibition tests with morphological (biovolume) and physiological (chlorophyll-a, phaeophytin ratio) endpoints. A novel approach is proposed for detecting early and sub-lethal effects based on biovolume quantification using confocal microscopy coupled with an image analysis system. The results showed that effects on both biovolume and the photosynthetic complex are sensitive and powerful early warning tools for evaluating sub-lethal effects of exposure. In particular, biovolume showed sensitive and early responses to the tested surfactant. Qualitatively, we also observed structural anomalies and effects on natural auto-fluorescence in exposed cells, which also represent potentially useful tools for ecotoxicological studies.

  10. Multiway Filtering Based on Multilinear Algebra Tools

    NASA Astrophysics Data System (ADS)

    Bourennane, Salah; Fossati, Caroline

    This paper presents some recent filtering methods based on the lower-rank tensor approximation approach for denoising tensor signals. In this approach, multicomponent data are represented by tensors, that is, multiway arrays, and the presented tensor filtering methods rely on multilinear algebra. First, the classical channel-by-channel SVD-based filtering method is overviewed. Then, an extension of the classical matrix filtering method is presented. It is based on the lower rank-(K1, ..., KN) truncation of the HOSVD, which performs a multimode Principal Component Analysis (PCA) and is implicitly developed for an additive white Gaussian noise. Two tensor filtering methods recently developed by the authors are also overviewed. The performances and comparative results between all these tensor filtering methods are presented for the cases of noise reduction in color images.
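
    A minimal NumPy sketch of the lower rank-(K1, ..., KN) HOSVD truncation idea follows: unfold the tensor along each mode, keep the Kn leading left singular vectors, and project the data onto the resulting multilinear subspace. This is a generic illustration of the technique, not the authors' implementation, and the synthetic image is a placeholder.

    ```python
    import numpy as np

    def unfold(t, mode):
        """Mode-n unfolding: move `mode` to the front and flatten the rest."""
        return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

    def fold(mat, mode, shape):
        """Inverse of unfold for a tensor of the given shape."""
        rest = [s for i, s in enumerate(shape) if i != mode]
        return np.moveaxis(mat.reshape([shape[mode]] + rest), 0, mode)

    def hosvd_truncate(t, ranks):
        """Lower rank-(K1,...,KN) truncation of the HOSVD (multimode PCA denoising)."""
        # Leading left singular vectors of each mode-n unfolding of the noisy tensor.
        factors = [np.linalg.svd(unfold(t, n), full_matrices=False)[0][:, :k]
                   for n, k in enumerate(ranks)]
        # Project onto the dominant subspace of every mode, then reconstruct.
        out = t
        for n, u in enumerate(factors):
            shape = list(out.shape); shape[n] = u.shape[1]
            out = fold(u.T @ unfold(out, n), n, shape)
        for n, u in enumerate(factors):
            shape = list(out.shape); shape[n] = u.shape[0]
            out = fold(u @ unfold(out, n), n, shape)
        return out

    # Example: denoise a small synthetic color image (rows x cols x channels).
    rng = np.random.default_rng(0)
    clean = np.outer(np.sin(np.linspace(0, 3, 64)), np.cos(np.linspace(0, 3, 64)))[:, :, None] * np.ones(3)
    noisy = clean + 0.1 * rng.standard_normal(clean.shape)
    denoised = hosvd_truncate(noisy, ranks=(8, 8, 1))
    ```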

  11. Computer-Based Cognitive Tools: Description and Design.

    ERIC Educational Resources Information Center

    Kennedy, David; McNaught, Carmel

    With computers, tangible tools are represented by the hardware (e.g., the central processing unit, scanners, and video display unit), while intangible tools are represented by the software. There is a special category of computer-based software tools (CBSTs) that have the potential to mediate cognitive processes--computer-based cognitive tools…

  12. Industrial Geospatial Analysis Tool for Energy Evaluation (IGATE-E)

    SciTech Connect

    Alkadi, Nasr E; Starke, Michael R; Ma, Ookie; Nimbalkar, Sachin U; Cox, Daryl

    2013-01-01

    IGATE-E is an energy analysis tool for industrial energy evaluation. The tool applies statistical modeling to multiple publicly available datasets and provides information at the geospatial resolution of zip code using bottom-up approaches. Within each zip code, the current version of the tool estimates the electrical energy consumption of manufacturing industries for each industry type, using DOE's Industrial Assessment Center database (IAC-DB) and DOE's Energy Information Administration Manufacturing Energy Consumption Survey database (EIA-MECS DB), in addition to other commercially available databases such as the Manufacturing News database (MNI, Inc.). Ongoing and future work includes adding modules for predicting fuel energy consumption streams, the energy consumption of manufacturing process steps, and major energy-intensive processes (EIPs) within each industry type, among other metrics of interest. The tool provides validation against DOE's EIA-MECS state-level energy estimations and permits several statistical examinations. IGATE-E is intended to be a decision support and planning tool for a wide spectrum of energy analysts, researchers, government organizations, private consultants, industry partners, and the like.

  13. Procrustes rotation as a diagnostic tool for projection pursuit analysis.

    PubMed

    Wentzell, Peter D; Hou, Siyuan; Silva, Carolina Santos; Wicks, Chelsi C; Pimentel, Maria Fernanda

    2015-06-02

    Projection pursuit (PP) is an effective exploratory data analysis tool because it optimizes the projection of high-dimensional data using distributional characteristics rather than variance or distance metrics. The recent development of fast and simple PP algorithms based on minimization of kurtosis for clustering data has made this powerful tool more accessible, but under conditions where the sample-to-variable ratio is small, PP fails due to opportunistic overfitting of random correlations to limiting distributional targets. Therefore, some kind of variable compression or data regularization is required in these cases. However, this introduces an additional parameter whose optimization is manually time consuming and subject to bias. The present work describes the use of Procrustes analysis as a diagnostic tool that can be used to evaluate the results of PP analysis in an efficient manner. Through Procrustes rotation, the similarity of different PP projections can be examined in an automated fashion with "Procrustes maps" to establish regions of stable projections as a function of the parameter to be optimized. The application of this diagnostic is demonstrated using principal components analysis to compress FTIR spectra from ink samples of ten different brands of pen, and also in conjunction with regularized PP for soybean disease classification.
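
    A minimal sketch of the Procrustes diagnostic described above: two projection-pursuit score configurations, obtained for example with different regularization settings, are superimposed by Procrustes analysis and compared through the resulting disparity. The random scores below are placeholders standing in for real PP projections.

    ```python
    import numpy as np
    from scipy.spatial import procrustes

    rng = np.random.default_rng(1)
    # Placeholder PP scores for two parameter settings: setting B is a rotated,
    # slightly noisy copy of setting A, so the projections should look "stable".
    scores_a = rng.standard_normal((50, 2))
    rot = np.array([[0.0, -1.0], [1.0, 0.0]])
    scores_b = scores_a @ rot + 0.05 * rng.standard_normal((50, 2))

    # Procrustes superimposes the two configurations; a small disparity means similar projections.
    _, _, disparity = procrustes(scores_a, scores_b)
    print(f"Procrustes disparity: {disparity:.4f}")
    ```

    Repeating this comparison over a grid of parameter values gives the kind of "Procrustes map" the abstract refers to.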

  14. Systematic Omics Analysis Review (SOAR) Tool to Support Risk Assessment

    PubMed Central

    McConnell, Emma R.; Bell, Shannon M.; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J.; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D.

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884

  15. Systematic Omics Analysis Review (SOAR) tool to support risk assessment.

    PubMed

    McConnell, Emma R; Bell, Shannon M; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment.

  16. IsoCleft Finder – a web-based tool for the detection and analysis of protein binding-site geometric and chemical similarities

    PubMed Central

    Najmanovich, Rafael

    2013-01-01

    IsoCleft Finder is a web-based tool for the detection of local geometric and chemical similarities between potential small-molecule binding cavities and a non-redundant dataset of ligand-bound known small-molecule binding-sites. The non-redundant dataset developed as part of this study is composed of 7339 entries representing unique Pfam/PDB-ligand (hetero group code) combinations with known levels of cognate ligand similarity. The query cavity can be uploaded by the user or detected automatically by the system using existing PDB entries as well as user-provided structures in PDB format. In all cases, the user can refine the definition of the cavity interactively via a browser-based Jmol 3D molecular visualization interface. Furthermore, users can restrict the search to a subset of the dataset using a cognate-similarity threshold. Local structural similarities are detected using the IsoCleft software and ranked according to two criteria (number of atoms in common and Tanimoto score of local structural similarity) and the associated Z-score and p-value measures of statistical significance. The results, including predicted ligands, target proteins, similarity scores, number of atoms in common, etc., are shown in a powerful interactive graphical interface. This interface permits the visualization of target ligands superimposed on the query cavity and additionally provides a table of pairwise ligand topological similarities. Similarities between top scoring ligands serve as an additional tool to judge the quality of the results obtained. We present several examples where IsoCleft Finder provides useful functional information. IsoCleft Finder results are complementary to existing approaches for the prediction of protein function from structure, rational drug design and x-ray crystallography. IsoCleft Finder can be found at: http://bcb.med.usherbrooke.ca/isocleftfinder. PMID:24555058
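
    The Tanimoto score mentioned above is, in its simplest binary form, the ratio of shared features to the union of features between two sets. The sketch below is a generic illustration with made-up cavity feature identifiers; it is not IsoCleft Finder's actual scoring code, which operates on matched atoms in 3D.

    ```python
    def tanimoto(features_a, features_b):
        """Tanimoto coefficient for two sets of (hypothetical) feature identifiers."""
        a, b = set(features_a), set(features_b)
        inter = len(a & b)
        return inter / (len(a) + len(b) - inter) if (a or b) else 0.0

    # Illustrative cavity feature identifiers (made up for demonstration).
    query_cavity = {"N1", "O2", "C3", "C4", "S5"}
    hit_cavity = {"N1", "O2", "C4", "C7"}
    print(f"Tanimoto similarity: {tanimoto(query_cavity, hit_cavity):.2f}")
    ```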

  17. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  18. The Development of a Specific and Sensitive LC-MS-Based Method for the Detection and Quantification of Hydroperoxy- and Hydroxydocosahexaenoic Acids as a Tool for Lipidomic Analysis

    PubMed Central

    Derogis, Priscilla B. M. C.; Freitas, Florêncio P.; Marques, Anna S. F.; Cunha, Daniela; Appolinário, Patricia P.; de Paula, Fernando; Lourenço, Tiago C.; Murgu, Michael; Di Mascio, Paolo; Medeiros, Marisa H. G.; Miyamoto, Sayuri

    2013-01-01

    Docosahexaenoic acid (DHA) is an n-3 polyunsaturated fatty acid that is highly enriched in the brain, and the oxidation products of DHA are present or increased during neurodegenerative disease progression. The characterization of the oxidation products of DHA is critical to understanding the roles that these products play in the development of such diseases. In this study, we developed a sensitive and specific analytical tool for the detection and quantification of twelve major DHA hydroperoxide (HpDoHE) and hydroxide (HDoHE) isomers (isomers at positions 4, 5, 7, 8, 10, 11, 13, 14, 16, 17, 19 and 20) in biological systems. The HpDoHE were synthesized by photooxidation, and the corresponding hydroxides were obtained by reduction with NaBH4. The isolated isomers were characterized by LC-MS/MS, and unique and specific fragment ions were chosen to construct a selected reaction monitoring (SRM) method for the targeted quantitative analysis of each HpDoHE and HDoHE isomer. The detection limits for the LC-MS/MS-SRM assay were 1-670 pg for HpDoHE and 0.5-8.5 pg for HDoHE injected onto a column. Using this method, it was possible to detect the basal levels of HDoHE isomers in both rat plasma and brain samples. Therefore, the developed LC-MS/MS-SRM method can be used as an important tool to identify and quantify the hydro(pero)xy derivatives of DHA in biological systems and may be helpful for oxidative lipidomic studies. PMID:24204871

  19. Web-based Open Tool Integration Framework

    DTIC Science & Technology

    2006-05-01

    [Citation fragments recovered from this record:] "... Transformations in OMG's Model-Driven Architecture," AGTIVE 2003, LNCS 2062, pp. 243-259; Gabor Karsai, "Tool Integration Aspects in the Model-Driven ..."; Karsai et al., "Design patterns for open tool integration," Softw Syst Model (2005) 4: 157-170, DOI 10.1007/s10270-004-0073-y; FSE Workshop on Tool-Integration in System Development, Helsinki, Finland.

  20. TA-DA: A TOOL FOR ASTROPHYSICAL DATA ANALYSIS

    SciTech Connect

    Da Rio, Nicola; Robberto, Massimo

    2012-12-01

    We present the Tool for Astrophysical Data Analysis (TA-DA), new software aimed at greatly simplifying and improving the analysis of stellar photometric data in comparison with theoretical models, and allowing the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled Interactive Data Language widget-based application; its graphical user interface makes it considerably user-friendly. In this paper, we describe the software and its functionalities.

  1. PANET: A GPU-Based Tool for Fast Parallel Analysis of Robustness Dynamics and Feed-Forward/Feedback Loop Structures in Large-Scale Biological Networks

    PubMed Central

    Trinh, Hung-Cuong; Le, Duc-Hau; Kwon, Yung-Keun

    2014-01-01

    It has been a challenge in systems biology to unravel relationships between structural properties and dynamic behaviors of biological networks. A Cytoscape plugin named NetDS was recently proposed to analyze the robustness-related dynamics and feed-forward/feedback loop structures of biological networks. Despite such a useful function, limitations on the network size that can be analyzed exist due to high computational costs. In addition, the plugin cannot verify an intrinsic property which can be induced by an observed result because it has no function to simulate the observation on a large number of random networks. To overcome these limitations, we have developed a novel software tool, PANET. First, the time-consuming parts of NetDS were redesigned to be processed in parallel using the OpenCL library. This approach utilizes the full computing power of multi-core central processing units and graphics processing units. Eventually, this made it possible to investigate a large-scale network such as a human signaling network with 1,609 nodes and 5,063 links. We also developed a new function to perform a batch-mode simulation where it generates a lot of random networks and conducts robustness calculations and feed-forward/feedback loop examinations of them. This helps us to determine if the findings in real biological networks are valid in arbitrary random networks or not. We tested our plugin in two case studies based on two large-scale signaling networks and found interesting results regarding relationships between coherently coupled feed-forward/feedback loops and robustness. In addition, we verified whether or not those findings are consistently conserved in random networks through batch-mode simulations. Taken together, our plugin is expected to effectively investigate various relationships between dynamics and structural properties in large-scale networks. Our software tool, user manual and example datasets are freely available at http

  2. PANET: a GPU-based tool for fast parallel analysis of robustness dynamics and feed-forward/feedback loop structures in large-scale biological networks.

    PubMed

    Trinh, Hung-Cuong; Le, Duc-Hau; Kwon, Yung-Keun

    2014-01-01

    It has been a challenge in systems biology to unravel relationships between structural properties and dynamic behaviors of biological networks. A Cytoscape plugin named NetDS was recently proposed to analyze the robustness-related dynamics and feed-forward/feedback loop structures of biological networks. Despite such a useful function, limitations on the network size that can be analyzed exist due to high computational costs. In addition, the plugin cannot verify an intrinsic property which can be induced by an observed result because it has no function to simulate the observation on a large number of random networks. To overcome these limitations, we have developed a novel software tool, PANET. First, the time-consuming parts of NetDS were redesigned to be processed in parallel using the OpenCL library. This approach utilizes the full computing power of multi-core central processing units and graphics processing units. Eventually, this made it possible to investigate a large-scale network such as a human signaling network with 1,609 nodes and 5,063 links. We also developed a new function to perform a batch-mode simulation where it generates a lot of random networks and conducts robustness calculations and feed-forward/feedback loop examinations of them. This helps us to determine if the findings in real biological networks are valid in arbitrary random networks or not. We tested our plugin in two case studies based on two large-scale signaling networks and found interesting results regarding relationships between coherently coupled feed-forward/feedback loops and robustness. In addition, we verified whether or not those findings are consistently conserved in random networks through batch-mode simulations. Taken together, our plugin is expected to effectively investigate various relationships between dynamics and structural properties in large-scale networks. Our software tool, user manual and example datasets are freely available at http://panet-csc.sourceforge.net/.
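
    The batch-mode idea, checking whether a structural finding in the real network also appears in randomized networks, can be sketched as follows with a simple feed-forward-loop count. This is a generic Python/NetworkX illustration using size-matched random networks; it is not PANET's OpenCL implementation, and the placeholder "real" network stands in for an actual signaling network.

    ```python
    import networkx as nx

    def count_ffl(g):
        """Count feed-forward loops (edges a->b, b->c and a->c) in a directed graph."""
        return sum(1 for a, b in g.edges()
                     for c in g.successors(b)
                     if c != a and c != b and g.has_edge(a, c))

    # Placeholder "real" network; in practice this would be the signaling network under study.
    real = nx.gnp_random_graph(200, 0.03, seed=0, directed=True)
    observed = count_ffl(real)

    # Batch-mode style check: many random networks with the same node and edge counts.
    n, m = real.number_of_nodes(), real.number_of_edges()
    random_counts = [count_ffl(nx.gnm_random_graph(n, m, seed=s, directed=True))
                     for s in range(50)]

    mean_random = sum(random_counts) / len(random_counts)
    print(f"FFLs in observed network: {observed}, mean over random networks: {mean_random:.1f}")
    ```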

  3. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  4. The cognitive bases of human tool use.

    PubMed

    Vaesen, Krist

    2012-08-01

    This article has two goals. The first is to assess, in the face of accruing reports on the ingenuity of great ape tool use, whether and in what sense human tool use still evidences unique, higher cognitive ability. To that effect, I offer a systematic comparison between humans and nonhuman primates with respect to nine cognitive capacities deemed crucial to tool use: enhanced hand-eye coordination, body schema plasticity, causal reasoning, function representation, executive control, social learning, teaching, social intelligence, and language. Since striking differences between humans and great apes stand firm in eight out of nine of these domains, I conclude that human tool use still marks a major cognitive discontinuity between us and our closest relatives. As a second goal of the paper, I address the evolution of human technologies. In particular, I show how the cognitive traits reviewed help to explain why technological accumulation evolved so markedly in humans, and so modestly in apes.

  5. Reliability-Based Electronics Shielding Design Tools

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; O'Neill, P. J.; Zang, T. A.; Pandolf, J. E.; Tripathi, R. K.; Koontz, Steven L.; Boeder, P.; Reddell, B.; Pankop, C.

    2007-01-01

    Shielding design on large human-rated systems allows minimization of radiation impact on electronic systems. Shielding design tools require adequate methods for evaluating design layouts, guiding qualification testing, and following up on the final design evaluation.

  6. HANSIS software tool for the automated analysis of HOLZ lines.

    PubMed

    Holec, D; Sridhara Rao, D V; Humphreys, C J

    2009-06-01

    A software tool, named as HANSIS (HOLZ analysis), has been developed for the automated analysis of higher-order Laue zone (HOLZ) lines in convergent beam electron diffraction (CBED) patterns. With this tool, the angles and distances between the HOLZ intersections can be measured and the data can be presented graphically with a user-friendly interface. It is capable of simultaneous analysis of several HOLZ patterns and thus provides a tool for systematic studies of CBED patterns.
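
    The core geometric operations, locating the intersection of two HOLZ lines and measuring the angle between them, reduce to elementary linear algebra. The sketch below is a generic illustration with made-up line parameters, not the HANSIS code itself.

    ```python
    import numpy as np

    def intersection_and_angle(p1, d1, p2, d2):
        """Intersection point of, and angle between, two lines given as point + direction."""
        p1, d1, p2, d2 = (np.asarray(v, dtype=float) for v in (p1, d1, p2, d2))
        # Solve p1 + t*d1 = p2 + s*d2 for (t, s); the first component gives the crossing.
        t, _ = np.linalg.solve(np.column_stack((d1, -d2)), p2 - p1)
        point = p1 + t * d1
        cosang = abs(d1 @ d2) / (np.linalg.norm(d1) * np.linalg.norm(d2))
        angle = np.degrees(np.arccos(np.clip(cosang, 0.0, 1.0)))
        return point, angle

    # Two illustrative HOLZ lines in pattern (pixel) coordinates.
    pt, ang = intersection_and_angle((0.0, 0.0), (1.0, 0.2), (5.0, -3.0), (0.1, 1.0))
    print(pt, ang)
    ```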

  7. Methodology Investigation: Automation of the Multilingual Static Analysis Tool (MSAT).

    DTIC Science & Technology

    1987-03-01

    [Recoverable details from this record:] Methodology Investigation: Automation of the Multilingual Static Analysis Tool (MSAT), final report for the RDTE Methodology Improvement Program, TECOM Project No. 7-CO-P86, by K. E. Van Karsen, Software and Automation Division, U.S. Army Electronic Proving Ground (accession number A189 240).

  8. Cost-Benefit Analysis Tools for Avionics Parts Obsolescence

    DTIC Science & Technology

    2002-04-01

    [Recoverable details from this record:] Cost-Benefit Analysis Tools for Avionics Parts Obsolescence, Air Command and Staff College, Air University, report AU/ACSC/02-103/2002-04. A surviving abstract fragment notes that if analysis tools for comparing the resolution options exist, they could be instrumental for program/item managers in making timely solution decisions.

  9. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    NASA Technical Reports Server (NTRS)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  10. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    NASA Technical Reports Server (NTRS)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  11. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    SciTech Connect

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  12. msBiodat analysis tool, big data analysis for high-throughput experiments.

    PubMed

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) encompasses a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering can be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present the msBiodat Analysis Tool, a web-based application designed to bring big-data analysis to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that gives researchers, regardless of programming experience, the advantages of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.
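
    The kind of filtering described, keeping only proteins from an MS hit list that satisfy an annotation and abundance condition, can be sketched as a merge-and-filter in pandas. The column names and the tiny annotation table below are illustrative assumptions, not msBiodat's schema or query language.

    ```python
    import pandas as pd

    # Hypothetical MS result list and annotation table (illustrative columns).
    ms_hits = pd.DataFrame({
        "uniprot": ["P12345", "Q67890", "P11111", "O22222"],
        "spectral_count": [34, 12, 5, 27],
    })
    annotations = pd.DataFrame({
        "uniprot": ["P12345", "P11111", "O22222"],
        "go_term": ["GO:0006915", "GO:0008152", "GO:0006915"],  # e.g. apoptotic process
    })

    # Keep only hits annotated with the GO term of interest and a minimum spectral count.
    merged = ms_hits.merge(annotations, on="uniprot", how="inner")
    selected = merged[(merged["go_term"] == "GO:0006915") & (merged["spectral_count"] >= 10)]
    print(selected)
    ```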

  13. Ball Bearing Analysis with the ORBIS Tool

    NASA Technical Reports Server (NTRS)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with predictions of bearing internal load distributions, stiffness, deflection and stresses.

  14. ProMAT: protein microarray analysis tool

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Varnum, Susan M.; Anderson, Kevin K.; Bollinger, Nikki; Zangar, Richard C.

    2006-04-04

    Summary: ProMAT is a software tool for statistically analyzing data from ELISA microarray experiments. The software estimates standard curves, sample protein concentrations and their uncertainties for multiple assays. ProMAT generates a set of comprehensive figures for assessing results and diagnosing process quality. The tool is available for Windows or Mac, and is distributed as open-source Java and R code. Availability: ProMAT is available at http://www.pnl.gov/statistics/ProMAT. ProMAT requires Java version 1.5.0 and R version 1.9.1 (or more recent versions) which are distributed with the tool.
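
    ELISA standard curves of the kind ProMAT estimates are commonly modeled with a four-parameter logistic; the hedged sketch below fits such a curve and inverts it to back-calculate a sample concentration. It illustrates the general idea only and does not reproduce ProMAT's statistical machinery or uncertainty estimates; all numbers are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(x, a, b, c, d):
        """Four-parameter logistic: response as a function of concentration."""
        return d + (a - d) / (1.0 + (x / c) ** b)

    def invert_four_pl(y, a, b, c, d):
        """Back-calculate concentration from a measured response."""
        return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

    # Hypothetical standards (concentration, measured intensity).
    conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
    resp = np.array([120, 180, 400, 900, 1800, 2600, 3000], dtype=float)

    popt, _ = curve_fit(four_pl, conc, resp, p0=[100.0, 1.0, 5.0, 3200.0], maxfev=10000)
    print("Estimated concentration for response 1500:", invert_four_pl(1500.0, *popt))
    ```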

  15. Graphical tools for network meta-analysis in STATA.

    PubMed

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  16. Internet-based Modeling, Mapping, and Analysis for the Greater Everglades (IMMAGE; Version 1.0): web-based tools to assess the impact of sea level rise in south Florida

    USGS Publications Warehouse

    Hearn, Paul; Strong, David; Swain, Eric; Decker, Jeremy

    2013-01-01

    South Florida's Greater Everglades area is particularly vulnerable to sea level rise, due to its rich endowment of animal and plant species and its heavily populated urban areas along the coast. Rising sea levels are expected to have substantial impacts on inland flooding, the depth and extent of surge from coastal storms, the degradation of water supplies by saltwater intrusion, and the integrity of plant and animal habitats. Planners and managers responsible for mitigating these impacts require advanced tools to help them more effectively identify areas at risk. The U.S. Geological Survey's (USGS) Internet-based Modeling, Mapping, and Analysis for the Greater Everglades (IMMAGE) Web site has been developed to address these needs by providing more convenient access to projections from models that forecast the effects of sea level rise on surface water and groundwater, the extent of surge and resulting economic losses from coastal storms, and the distribution of habitats. IMMAGE not only provides an advanced geographic information system (GIS) interface to support decision making, but also includes topic-based modules that explain and illustrate key concepts for nontechnical users. The purpose of this report is to familiarize both technical and nontechnical users with the IMMAGE Web site and its various applications.

  17. Tools for Knowledge Analysis, Synthesis, and Sharing

    NASA Astrophysics Data System (ADS)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  18. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify

  19. Autocorrelation of molecular electrostatic potential surface properties combined with partial least squares analysis as alternative attractive tool to generate ligand-based 3D-QSARs.

    PubMed

    Moro, Stefano; Bacilieri, Magdalena; Ferrari, Cristina; Spalluto, Giampiero

    2005-03-01

    A database of 106 human A3 adenosine receptor antagonists was used to derive two alternative PLS models: one starting from CoMFA descriptors and the other starting from the autocorrelation descriptors. The peculiarity of this work is the introduction of autocorrelation vectors as molecular descriptors for the PLS analysis. The autocorrelation allows comparing molecules (and their properties) with different structures and with different spatial orientations without any previous alignment. In particular, the Molecular Electrostatic Potential (MEP) was the property computed, and its information was encoded in autocorrelation vectors. The 3D spatial distribution and the values of the electrostatic potential are in fact largely responsible for the binding of a substrate to its receptor binding site. Validation was done with an external test set and the results of the two models were compared. Interestingly, our preliminary results seem to indicate that this new alternative approach could robustly compete with the already well-consolidated CoMFA approach. In particular, we have suggested that it could be a very interesting tool to filter large structural databases in several virtual screening applications.
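
    The autocorrelation descriptor named above can be sketched as a distance-binned sum of property products over molecular surface points, which is what makes it alignment-free. The random surface points and MEP values below are placeholders, and the binning scheme is an assumption for illustration rather than the authors' exact descriptor.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist

    def mep_autocorrelation(coords, values, bins):
        """Distance-binned autocorrelation of a surface property (e.g. the MEP)."""
        n = len(values)
        i, j = np.triu_indices(n, k=1)
        products = values[i] * values[j]              # property products for every point pair
        distances = pdist(coords)                     # pairwise distances, same pair order
        vector = np.zeros(len(bins) - 1)
        for k in range(len(bins) - 1):
            mask = (distances >= bins[k]) & (distances < bins[k + 1])
            vector[k] = products[mask].sum()
        return vector

    rng = np.random.default_rng(2)
    surface_xyz = rng.uniform(-5, 5, size=(300, 3))   # placeholder surface points (angstrom)
    mep = rng.normal(0.0, 0.05, size=300)             # placeholder MEP values (a.u.)
    descriptor = mep_autocorrelation(surface_xyz, mep, bins=np.arange(0.0, 12.0, 1.0))
    print(descriptor)
    ```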

  20. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    PubMed

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
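
    The core summary produced by such simulations, the incremental cost-effectiveness ratio (ICER), is the difference in mean costs divided by the difference in mean quality-adjusted survival between the paired cohorts. The toy numbers below are invented for illustration and are unrelated to the actual model or the trials it was validated against.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 10_000  # pairs of virtual cohorts

    # Placeholder simulated outcomes per cohort pair: (cost, QALYs) with and without the program.
    cost_prog = rng.normal(42_000, 6_000, n)
    qaly_prog = rng.normal(4.9, 0.6, n)
    cost_usual = rng.normal(38_000, 6_000, n)
    qaly_usual = rng.normal(4.6, 0.6, n)

    icer = (cost_prog.mean() - cost_usual.mean()) / (qaly_prog.mean() - qaly_usual.mean())
    print(f"Incremental cost-effectiveness ratio: ${icer:,.0f} per QALY gained")
    ```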

  1. Exploratory data analysis as a tool for similarity assessment and clustering of chiral polysaccharide-based systems used to separate pharmaceuticals in supercritical fluid chromatography.

    PubMed

    De Klerck, Katrijn; Vander Heyden, Yvan; Mangelings, Debby

    2014-01-24

    In the search for appropriate chromatographic conditions to separate enantiomers, screening strategies are often applied because achieving chiral separations is tedious. These screenings aim to find suitable separation conditions relatively quickly. However, the definition of these screenings mostly relies on years of expertise or on the labour- and time-intensive investigation of a broad range of chiral stationary- and mobile phases. A large amount of data is generated using either approach. In this study, the obtained data are investigated in a systematic manner and (dis)similar systems are searched for. For this case study, 48 chromatographic systems were characterized by the enantioresolutions of 29 racemates. Exploratory data analysis was performed by means of projection pursuit, revealing the different enantioselective patterns of the chromatographic systems. To quantify the (dis)similarity, correlation coefficients and Euclidean distances were calculated. These results were visualized in colour maps to allow investigating the degree of (dis)similarity between the systems. These maps proved to be a helpful tool in the selection of dissimilar/orthogonal chromatographic conditions. Hierarchical-cluster-analysis dendrograms were constructed next to evaluate the clustering of similar systems, i.e. with an equivalent enantioselectivity. Screening sequences were extracted and compared with the initial one, defined by direct data interpretation. In a final section, selection of dissimilar systems was done by means of the Kennard and Stone algorithm. The systems selected by the applied techniques did not necessarily perform better than the selection by direct data interpretation. Nevertheless, high cumulative success rates are achieved for the selected combinations, due to the broad enantioselectivity, the high individual success rates and the complementarity of the chiral selectors. Copyright © 2013 Elsevier B.V. All rights reserved.
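
    A minimal sketch of the (dis)similarity workflow described above: compute correlation-based and Euclidean dissimilarities between chromatographic systems from their resolution profiles, then cluster systems with equivalent enantioselectivity hierarchically. The random resolution matrix is a placeholder standing in for the 48 systems x 29 racemates table.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(3)
    # Placeholder: resolutions of 29 racemates (columns) on 48 chromatographic systems (rows).
    resolutions = rng.gamma(shape=2.0, scale=1.0, size=(48, 29))

    # Dissimilarity maps: 1 - Pearson correlation, and Euclidean distance, between systems.
    corr_dissim = squareform(pdist(resolutions, metric="correlation"))
    eucl_dissim = squareform(pdist(resolutions, metric="euclidean"))

    # Hierarchical clustering of systems with similar enantioselectivity patterns.
    tree = linkage(pdist(resolutions, metric="correlation"), method="average")
    clusters = fcluster(tree, t=4, criterion="maxclust")

    print("Most dissimilar pair (1 - correlation):", np.unravel_index(np.argmax(corr_dissim), corr_dissim.shape))
    print("Most dissimilar pair (Euclidean):", np.unravel_index(np.argmax(eucl_dissim), eucl_dissim.shape))
    print("Cluster labels:", clusters)
    ```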

  2. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used, NASA-developed, open-source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .NET and C# version is used for development. It leverages code samples shared by the World Wind community, and the direction of the COAST 2.0 enhancements is based on coastal science community feedback and a needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  3. KaPPA-view: a web-based analysis tool for integration of transcript and metabolite data on plant metabolic pathway maps.

    PubMed

    Tokimatsu, Toshiaki; Sakurai, Nozomu; Suzuki, Hideyuki; Ohta, Hiroyuki; Nishitani, Kazuhiko; Koyama, Tanetoshi; Umezawa, Toshiaki; Misawa, Norihiko; Saito, Kazuki; Shibata, Daisuke

    2005-07-01

    The application of DNA array technology and chromatographic separation techniques coupled with mass spectrometry to transcriptomic and metabolomic analyses in plants has resulted in the generation of considerable quantitative data related to transcription and metabolism. The integration of "omic" data is one of the major concerns associated with research into identifying gene function. Thus, we developed a Web-based tool, KaPPA-View, for representing quantitative data for individual transcripts and/or metabolites on plant metabolic pathway maps. We prepared a set of comprehensive metabolic pathway maps for Arabidopsis (Arabidopsis thaliana) and depicted these graphically in Scalable Vector Graphics format. Individual transcripts assigned to a reaction are represented symbolically together with the symbols of the reaction and metabolites on metabolic pathway maps. Using quantitative values for transcripts and/or metabolites submitted by the user as Comma Separated Value-formatted text through the Internet, the KaPPA-View server inserts colored symbols corresponding to a defined metabolic process at that site on the maps and returns them to the user's browser. The server also provides information on transcripts and metabolites in pop-up windows. To demonstrate the process, we describe the dataset obtained for transgenic plants that overexpress the PAP1 gene encoding a MYB transcription factor on metabolic pathway maps. The presentation of data in this manner is useful for viewing metabolic data in a way that facilitates the discussion of gene function.

  4. A quality assessment tool for markup-based clinical guidelines.

    PubMed

    Shalom, Erez; Shahar, Yuval; Taieb-Maimon, Meirav; Lunenfeld, Eitan

    2008-11-06

    We introduce a tool for quality assessment of procedural and declarative knowledge. We developed this tool for evaluating the specification of mark-up-based clinical GLs. Using this graphical tool, the expert physician and knowledge engineer collaborate to score, on a pre-defined scoring scale, each of the knowledge roles of the mark-ups, comparing them to a gold standard. The tool enables the mark-ups to be scored simultaneously by different users at different sites.

  5. Analysis and validation of automated skull stripping tools: a validation study based on 296 MR images from the Honolulu Asia aging study.

    PubMed

    Hartley, S W; Scher, A I; Korf, E S C; White, L R; Launer, L J

    2006-05-01

    As population-based epidemiologic studies may acquire images from thousands of subjects, automated image post-processing is needed. However, error in these methods may be biased and related to subject characteristics relevant to the research question. Here, we compare two automated methods of brain extraction against manually segmented images and evaluate whether method accuracy is associated with subject demographic and health characteristics. MRI data (n = 296) are from the Honolulu Asia Aging Study, a population-based study of elderly Japanese-American men. The intracranial space was manually outlined on the axial proton density sequence by a single operator. The brain was extracted automatically using BET (Brain Extraction Tool) and BSE (Brain Surface Extractor) on axial proton density images. Total intracranial volume was calculated for the manually segmented images (ticvM), the BET-segmented images (ticvBET), and the BSE-segmented images (ticvBSE). Mean ticvBSE was closer to that of ticvM, but ticvBET was more highly correlated with ticvM than ticvBSE. BSE both overestimated (positive error) and underestimated (negative error) ticv to a significant degree, but net error was relatively low. BET had large positive and very low negative error. Method accuracy, measured in percent positive and negative error, varied slightly with age, head circumference, presence of the apolipoprotein E epsilon4 polymorphism, subcortical and cortical infarcts, and enlarged ventricles. This epidemiologic approach to the assessment of potential bias in image post-processing tasks shows that both skull-stripping programs performed well in this large image dataset when compared to manually segmented images. Although method accuracy was statistically associated with some subject characteristics, the extent of the misclassification (in terms of percent of brain volume) was small.
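
    The accuracy measures used in this comparison, percent positive and negative error of an automated brain mask relative to the manual standard, can be sketched as below; the voxel masks here are synthetic stand-ins for real segmentations.

      import numpy as np

      def volume_errors(auto_mask, manual_mask, voxel_ml=0.001):
          """Percent over- and under-estimation of an automated brain mask
          relative to a manually segmented mask (both boolean arrays)."""
          manual_vol = manual_mask.sum() * voxel_ml
          positive = np.logical_and(auto_mask, ~manual_mask).sum() * voxel_ml  # overestimation
          negative = np.logical_and(~auto_mask, manual_mask).sum() * voxel_ml  # underestimation
          return 100 * positive / manual_vol, 100 * negative / manual_vol

      rng = np.random.default_rng(1)
      manual = rng.random((64, 64, 40)) > 0.5
      auto = manual.copy()
      flip = rng.random(manual.shape) < 0.02        # simulate 2% voxel disagreement
      auto[flip] = ~auto[flip]

      pos_err, neg_err = volume_errors(auto, manual)
      print(f"positive error {pos_err:.1f}%, negative error {neg_err:.1f}%")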

  6. Genome-wide analysis of signatures of selection in populations of African honey bees (Apis mellifera) using new web-based tools.

    PubMed

    Fuller, Zachary L; Niño, Elina L; Patch, Harland M; Bedoya-Reina, Oscar C; Baumgarten, Tracey; Muli, Elliud; Mumoki, Fiona; Ratan, Aakrosh; McGraw, John; Frazier, Maryann; Masiga, Daniel; Schuster, Stephen; Grozinger, Christina M; Miller, Webb

    2015-07-10

    With the development of inexpensive, high-throughput sequencing technologies, it has become feasible to examine questions related to population genetics and molecular evolution of non-model species in their ecological contexts on a genome-wide scale. Here, we employed a newly developed suite of integrated, web-based programs to examine population dynamics and signatures of selection across the genome using several well-established tests, including FST, pN/pS, and McDonald-Kreitman. We applied these techniques to study populations of honey bees (Apis mellifera) in East Africa. In Kenya, there are several described A. mellifera subspecies, which are thought to be localized to distinct ecological regions. We performed whole genome sequencing of 11 worker honey bees from apiaries distributed throughout Kenya and identified 3.6 million putative single-nucleotide polymorphisms. The dense coverage allowed us to apply several computational procedures to study population structure and the evolutionary relationships among the populations, and to detect signs of adaptive evolution across the genome. While there is considerable gene flow among the sampled populations, there are clear distinctions between populations from the northern desert region and those from the temperate, savannah region. We identified several genes showing population genetic patterns consistent with positive selection within African bee populations, and between these populations and European A. mellifera or Asian Apis florea. These results lay the groundwork for future studies of adaptive ecological evolution in honey bees, and demonstrate the use of new, freely available web-based tools and workflows ( http://usegalaxy.org/r/kenyanbee ) that can be applied to any model system with genomic information.
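
    One of the tests named above, FST between two populations, can be approximated per SNP with Hudson's estimator and combined across SNPs as a ratio of sums; the allele frequencies and sample sizes below are invented for illustration, not the Kenyan bee data.

      import numpy as np

      def hudson_fst(p1, n1, p2, n2):
          """Hudson's FST estimator, combined across SNPs as a ratio of sums.
          p1, p2: sample alt-allele frequencies per SNP; n1, n2: haploid sample sizes."""
          num = (p1 - p2) ** 2 - p1 * (1 - p1) / (n1 - 1) - p2 * (1 - p2) / (n2 - 1)
          den = p1 * (1 - p2) + p2 * (1 - p1)
          return num.sum() / den.sum()

      rng = np.random.default_rng(7)
      n_snps, n1, n2 = 100_000, 22, 22           # e.g. 11 diploid workers per population
      p_anc = rng.uniform(0.05, 0.95, n_snps)    # shared ancestral frequencies
      p_true1 = np.clip(p_anc + rng.normal(0, 0.08, n_snps), 0.0, 1.0)   # "desert" region
      p_true2 = np.clip(p_anc + rng.normal(0, 0.08, n_snps), 0.0, 1.0)   # "savannah" region
      p1 = rng.binomial(n1, p_true1) / n1        # observed sample frequencies
      p2 = rng.binomial(n2, p_true2) / n2

      print("genome-wide FST ~", round(hudson_fst(p1, n1, p2, n2), 4))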

  7. VMPLOT: A versatile analysis tool for mission operations

    NASA Technical Reports Server (NTRS)

    Bucher, Allen W.

    1993-01-01

    VMPLOT is a versatile analysis tool designed by the Magellan Spacecraft Team to graphically display engineering data used to support mission operations. While there is nothing revolutionary or innovative about graphical data analysis tools, VMPLOT has some distinguishing features that set it apart from other custom or commercially available software packages. These features include the ability to utilize time in a Universal Time Coordinated (UTC) or Spacecraft Clock (SCLK) format as an enumerated data type, the ability to automatically scale both axes based on the data to be displayed (including time), the ability to combine data from different files, and the ability to utilize the program either interactively or in batch mode, thereby enhancing automation. Another important feature of VMPLOT, not visible to the user, is the software engineering philosophy it embodies. A layered approach was used to isolate program functionality to different layers. This was done to increase program portability to different platforms and to ease maintenance and enhancements due to changing requirements. The functionality of the unique features of VMPLOT is described, highlighting the algorithms that make these features possible. The software engineering philosophies used in the creation of the software tool are also summarized.

  8. SOFAST: Sandia Optical Fringe Analysis Slope Tool

    SciTech Connect

    Andraka, Charles E.

    2015-10-20

    SOFAST is used to characterize the surface slope of reflective mirrors for solar applications. SOFAST uses a large monitor or projection screen to display fringe patterns, and a machine vision camera to image the reflection of these patterns in the subject mirror. From these images, a detailed map of surface normals can be generated and compared to design or fitted mirror shapes. SOFAST uses standard Fringe Reflection (Deflectometry) approaches to measure the mirror surface normals. SOFAST uses an extrinsic analysis of key points on the facet to locate the camera and monitor relative to the facet coordinate system. It then refines this position based on the measured surface slope and integrated shape of the mirror facet. The facet is placed into a reference frame such that key points on the facet match the design facet in orientation and position. This is key to evaluating a facet as suitable for a specific solar application. SOFAST reports the measurements of the facet as detailed surface normal location in a format suitable for ray tracing optical analysis codes. SOFAST also reports summary information as to the facet fitted shape (monomial) and error parameters. Useful plots of the error distribution are also presented.
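
    The fringe-reflection step, recovering a wrapped phase map (and hence, with the system geometry, surface slope) from phase-shifted sinusoidal patterns reflected in the mirror, can be sketched with the standard four-step phase-shifting formula. This is a generic deflectometry sketch rather than SOFAST's implementation, and the geometry scale factor is a placeholder.

      import numpy as np

      def four_step_phase(i0, i90, i180, i270):
          """Wrapped phase from four fringe images shifted by 0, 90, 180 and 270 degrees."""
          return np.arctan2(i270 - i90, i0 - i180)

      # Synthetic test: a smooth 'true' phase observed through four shifted fringe images.
      x, y = np.meshgrid(np.linspace(-1, 1, 200), np.linspace(-1, 1, 200))
      true_phase = 2.0 * x + 0.5 * x * y
      shots = [100 + 50 * np.cos(true_phase + d) for d in (0, np.pi / 2, np.pi, 3 * np.pi / 2)]

      phase = four_step_phase(*shots)

      # In deflectometry the phase maps to a screen position, and the surface slope follows
      # from the camera/screen/facet geometry; the 0.01 scale factor here is a placeholder.
      slope_x = 0.01 * phase
      print("recovered phase range:", round(float(phase.min()), 2), "to", round(float(phase.max()), 2))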

  9. Data Analysis Tools for Visualization Study

    DTIC Science & Technology

    2015-08-01

    ...represented true threats. The correct answers and the selections by each subject were recorded as fixed-format text files. My tools parse the text files and insert the data into tables in a...

  10. Tool Use and Performance: Relationships between Tool- and Learner-Related Characteristics in a Computer-Based Learning Environment

    ERIC Educational Resources Information Center

    Juarez-Collazo, Norma A.; Elen, Jan; Clarebout, Geraldine

    2013-01-01

    It is still unclear which tool and learner characteristics influence tool use, and how, and consequently performance in computer-based learning environments (CBLEs). This study examines the relationships between tool-related characteristics (tool presentation: non-/embedded tool and instructional cues: non-/explained tool functionality) and…

  11. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  12. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    SciTech Connect

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
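
    The core measurement described, a radial intensity plot around the galaxy centre whose peak positions drift in angle with radius for spiral galaxies, can be sketched as below; the image is synthetic and the slope fit is simplified relative to Ganalyzer.

      import numpy as np

      def radial_intensity(img, cx, cy, radius, n_theta=360):
          """Sample image intensity along a circle of given radius about (cx, cy)."""
          theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
          xs = np.clip(np.round(cx + radius * np.cos(theta)).astype(int), 0, img.shape[1] - 1)
          ys = np.clip(np.round(cy + radius * np.sin(theta)).astype(int), 0, img.shape[0] - 1)
          return theta, img[ys, xs]

      # Synthetic one-armed spiral: the intensity peak drifts in angle as radius grows.
      h = w = 256
      yy, xx = np.mgrid[0:h, 0:w]
      r = np.hypot(xx - w / 2, yy - h / 2) + 1e-6
      ang = np.arctan2(yy - h / 2, xx - w / 2)
      img = np.exp(-r / 60) * (1 + 0.8 * np.cos(ang - 0.05 * r))

      radii = np.arange(10, 100, 5)
      peak_angles = []
      for radius in radii:
          theta, profile = radial_intensity(img, w / 2, h / 2, radius)
          peak_angles.append(theta[np.argmax(profile)])

      # Slope of peak angle versus radius is a simple spirality measure (near zero for ellipticals).
      slope = np.polyfit(radii, np.unwrap(np.array(peak_angles)), 1)[0]
      print(f"spirality slope ~ {slope:.3f} rad/pixel")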

  13. Tools for integrated sequence-structure analysis with UCSF Chimera

    PubMed Central

    Meng, Elaine C; Pettersen, Eric F; Couch, Gregory S; Huang, Conrad C; Ferrin, Thomas E

    2006-01-01

    Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is available for Microsoft

  14. MTK: An AI tool for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Rudokas, Mary R.

    1988-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control, and trend analysis of the Space Station Thermal Control System (TCS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined here with examples from the thermal system to highlight the motivating factors behind them, followed by an overview of the capabilities of MTK, which was developed to address these issues in a generic fashion.

  15. MTK: An AI tool for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  16. Entropy-Based Classification of Subsurface Scatterers: A Valuable Tool for the Analysis of Data Obtained by the Fully Polarimetric WISDOM Radar

    NASA Astrophysics Data System (ADS)

    Plettemeier, D.; Statz, C.; Hahnel, R.; Benedix, W. S.; Hamran, S. E.; Ciarletti, V.

    2016-12-01

    The "Water Ice Subsurface Deposition on Mars" Experiment (WISDOM) is a Ground Penetrating Radar (GPR) and part of the 2020 ExoMars Rover payload. It will be the first GPR operating on a planetary rover and the first fully polarimetric radar tasked at probing the subsurface of Mars. WISDOM operates at frequencies between 500 MHz and 3 GHz yielding a centimetric resolution and a penetration depth of about 3 meters in Martian soil. Its prime scientific objective is the detailed characterization of the material distribution within the first few meters of the Martian subsurface as a contribution to the search for evidence of past life. For the first time, WISDOM will give access to the geological structure, electromagnetic nature, and hydrological state of the shallow subsurface by retrieving the layering and properties of the buried reflectors at an unprecedented resolution and, due to the fully polarimetric measurements, amount of information. Furthermore, a "real time" subsurface analysis will support the drill operations by identifying locations of high scientific interest and low risk. Key element in the WISDOM data analysis is the fast and reliable classification and correct localization of subsurface scatterers and layers. The fully polarimetric nature of the WISDOM measurements allows the use of the entropy-alpha decomposition (H-alpha). This method enables the classification of reconstructed images of the subsurface (obtained by inverse imaging algorithms, e.g. f-k migration) with regard to the main scattering mechanisms of geological features present in the image of the subsurface. It is, for example, possible to differentiate smooth surfaces, rough surfaces, isolated spherical scatterers, double- and bounce scattering, anisotropic scatterers, clouds of small scatterers of similar shape as well as layers of oblate spheroids. Preliminary tests under laboratory conditions suggest the feasibility and value of the approach for the classification of geological

  17. Tools for Knowledge Analysis, Synthesis, and Sharing

    ERIC Educational Resources Information Center

    Medland, Michael B.

    2007-01-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own…

  19. STools: IDL Tools for Spectroscopic Analysis

    NASA Astrophysics Data System (ADS)

    Allende Prieto, Carlos

    2017-08-01

    STools contains a variety of simple tools for spectroscopy, such as reading an IRAF-formatted (multispec) echelle spectrum in FITS, measuring the wavelength of the center of a line, Gaussian convolution, deriving synthetic photometry from an input spectrum, and extracting and interpolating a MARCS model atmosphere (standard composition).
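
    Two of the operations listed, Gaussian convolution of a spectrum and measuring the wavelength of a line centre, can be sketched in a few lines (in Python rather than IDL, on a synthetic spectrum).

      import numpy as np

      def gauss_convolve(wave, flux, fwhm):
          """Convolve a uniformly sampled spectrum with a Gaussian of the given FWHM."""
          sigma = fwhm / (2 * np.sqrt(2 * np.log(2)))
          dw = wave[1] - wave[0]
          x = np.arange(-4 * sigma, 4 * sigma + dw, dw)
          kernel = np.exp(-0.5 * (x / sigma) ** 2)
          return np.convolve(flux, kernel / kernel.sum(), mode="same")

      def line_center(wave, flux):
          """Depth-weighted centroid of an absorption line (continuum assumed at 1)."""
          depth = 1.0 - flux
          return np.sum(wave * depth) / np.sum(depth)

      wave = np.linspace(5000.0, 5020.0, 2000)                            # Angstrom
      flux = 1 - 0.6 * np.exp(-0.5 * ((wave - 5010.3) / 0.08) ** 2)       # synthetic line
      smoothed = gauss_convolve(wave, flux, fwhm=0.3)

      window = (wave > 5009.0) & (wave < 5011.5)
      print("line centre ~", round(line_center(wave[window], smoothed[window]), 2), "Angstrom")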

  20. Knowledge Mapping: A Multipurpose Task Analysis Tool.

    ERIC Educational Resources Information Center

    Esque, Timm J.

    1988-01-01

    Describes knowledge mapping, a tool developed to increase the objectivity and accuracy of task difficulty ratings for job design. Application in a semiconductor manufacturing environment is discussed, including identifying prerequisite knowledge for a given task; establishing training development priorities; defining knowledge levels; identifying…

  2. A Multidimensional Analysis Tool for Visualizing Online Interactions

    ERIC Educational Resources Information Center

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  4. Design and Application of the Exploration Maintainability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    requirements to support those activities. Using a Monte Carlo approach, the tool simulates potential failures in defined systems, based on established component reliabilities, and then evaluates the capability of the crew to repair those failures given a defined store of spares and maintenance items. Statistical analysis of Monte Carlo runs provides probabilistic estimates of overall mission safety and reliability. This paper will describe the operation of the EMAT, including historical data sources used to populate the model, simulation processes, and outputs. Analysis results are provided for a candidate exploration system, including baseline estimates of required sparing mass and volume. Sensitivity analysis regarding the effectiveness of proposed strategies to reduce mass and volume requirements and improve mission reliability is included in these results.
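
    The Monte Carlo failure-and-repair logic described above can be illustrated with a toy version: components fail according to exponential failure times, and a failure is recoverable only while a matching spare remains in the store. The component list, reliabilities, and mission length below are invented, not EMAT data.

      import random

      def simulate_mission(components, spares, mission_days, trials=10_000, seed=0):
          """Fraction of Monte Carlo missions in which every failure could be repaired.

          components: list of (name, mtbf_days) tuples; spares: dict name -> spare count.
          """
          rng = random.Random(seed)
          ok = 0
          for _ in range(trials):
              store = dict(spares)
              success = True
              for name, mtbf in components:
                  t = 0.0
                  while True:
                      t += rng.expovariate(1.0 / mtbf)   # time to next failure of this component
                      if t > mission_days:
                          break
                      if store.get(name, 0) == 0:        # no spare left: unrepaired failure
                          success = False
                          break
                      store[name] -= 1
                  if not success:
                      break
              ok += success
          return ok / trials

      components = [("pump", 400.0), ("fan", 250.0), ("controller", 900.0)]  # hypothetical
      spares = {"pump": 2, "fan": 3, "controller": 1}
      print("P(all failures repairable) ~", simulate_mission(components, spares, mission_days=500))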

  5. Graphics-Based Parallel Programming Tools

    DTIC Science & Technology

    1991-09-01

    ...more general context by implementing perspective views within the Voyeur system [12]. Voyeur is a more conventional tool for displaying application-specific visualizations of parallel programs [13]...

  6. Graphics-Based Parallel Programming Tools

    DTIC Science & Technology

    1992-01-01

    ...the Voyeur system [12]. Voyeur is a more conventional tool for displaying application-specific visualizations of parallel programs [13], and it provides...

  7. An Integrated Tool for System Analysis of Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  8. Dynamic drag force based on iterative density mapping: A new numerical tool for three-dimensional analysis of particle trajectories in a dielectrophoretic system.

    PubMed

    Knoerzer, Markus; Szydzik, Crispin; Tovar-Lopez, Francisco Javier; Tang, Xinke; Mitchell, Arnan; Khoshmanesh, Khashayar

    2016-02-01

    Dielectrophoresis is a widely used means of manipulating suspended particles within microfluidic systems. In order to efficiently design such systems for a desired application, various numerical methods exist that enable particle trajectory plotting in two or three dimensions based on the interplay of hydrodynamic and dielectrophoretic forces. While various models are described in the literature, few are capable of modeling interactions between particles as well as their surrounding environment as these interactions are complex, multifaceted, and computationally expensive to the point of being prohibitive when considering a large number of particles. In this paper, we present a numerical model designed to enable spatial analysis of the physical effects exerted upon particles within microfluidic systems employing dielectrophoresis. The model presents a means of approximating the effects of the presence of large numbers of particles through dynamically adjusting hydrodynamic drag force based on particle density, thereby introducing a measure of emulated particle-particle and particle-liquid interactions. This model is referred to as "dynamic drag force based on iterative density mapping." The resultant numerical model is used to simulate and predict particle trajectory and velocity profiles within a microfluidic system incorporating curved dielectrophoretic microelectrodes. The simulated data are compared favorably with experimental data gathered using microparticle image velocimetry, and are contrasted against simulated data generated using the traditional "effective moment Stokes-drag method," showing more accurate particle velocity profiles for areas of high particle density.
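
    The structure of the approach, scaling the drag-limited particle velocity with an iteratively updated local density map so that crowded regions move more slowly, can be sketched in one dimension. The drag-scaling law, the DEP velocity field, and all constants below are placeholders illustrating the idea, not the published model.

      import numpy as np

      rng = np.random.default_rng(3)
      n, steps, dt = 2000, 400, 1e-3
      x = rng.uniform(0.0, 1.0, n)                 # particle positions along a channel (toy units)
      bins = np.linspace(0.0, 1.0, 51)

      u_fluid = 0.05                               # carrier-flow velocity

      def dep_velocity(pos):                       # placeholder DEP-induced velocity field
          return -0.02 * np.sin(2 * np.pi * pos)

      for _ in range(steps):
          # Iterative density mapping: histogram the particles, look up local crowding.
          density, _ = np.histogram(x, bins=bins)
          idx = np.clip(np.digitize(x, bins) - 1, 0, density.size - 1)
          crowding = density[idx] / density.mean()

          # Drag-limited velocity is reduced where crowding is high, emulating
          # particle-particle and particle-liquid interactions (the 1/(1 + 0.5*c) law is made up).
          v = (u_fluid + dep_velocity(x)) / (1.0 + 0.5 * crowding)
          x = np.clip(x + v * dt, 0.0, 1.0)

      print("mean particle position after", steps, "steps:", round(float(x.mean()), 3))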

  9. Distributed design tools: Mapping targeted design tools onto a Web-based distributed architecture for high-performance computing

    SciTech Connect

    Holmes, V.P.; Linebarger, J.M.; Miller, D.J.; Poore, C.A.

    1999-11-30

    Design Tools use a Web-based Java interface to guide a product designer through the design-to-analysis cycle for a specific, well-constrained design problem. When these Design Tools are mapped onto a Web-based distributed architecture for high-performance computing, the result is a family of Distributed Design Tools (DDTs). The software components that enable this mapping consist of a Task Sequencer, a generic Script Execution Service, and the storage of both data and metadata in an active, object-oriented database called the Product Database Operator (PDO). The benefits of DDTs include improved security, reliability, scalability (in both problem size and computing hardware), robustness, and reusability. In addition, access to the PDO unlocks its wide range of services for distributed components, such as lookup and launch capability, persistent shared memory for communication between cooperating services, state management, event notification, and archival of design-to-analysis session data.

  10. Integrated Modeling Tools for Thermal Analysis and Applications

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

    1999-01-01

    Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools that are used in these disciplines suffer a degree of incompatibility, each having developed along their own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems) which integrates many aspects of structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural

  12. Model analysis tools in the Virtual Model Repository (VMR)

    NASA Astrophysics Data System (ADS)

    De Zeeuw, D.; Ridley, A. J.

    2013-12-01

    The Virtual Model Repository (VMR) provides scientific analysis tools for a wide variety of numerical models of the Earth's magnetosphere. Data discovery, visualization tools and data/model comparisons are provided in a consistent and intuitive format. A large collection of numerical model runs are available to analyze, including the large Earth magnetosphere event run library at the CCMC and many runs from the University of Michigan. Relevant data useful for data/model comparisons is found using various APIs and included in many of the visualization tools. Recent additions to the VMR include a comprehensive suite of tools for analysis of the Global Ionosphere Thermosphere Model (GITM).

  13. Graphical Acoustic Liner Design and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
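
    The one-dimensional transmission line model mentioned as the basis for the impedance calculation can be sketched for a single locally reacting, rigid-backed channel: the normalized surface impedance is a facesheet resistance plus a cot(kL) cavity reactance. The channel depth and resistance values below are illustrative assumptions, not taken from the tool.

      import numpy as np

      def liner_impedance(freq, depth, resistance=1.0, c=343.0):
          """Normalized surface impedance zeta = R + iX of a locally reacting,
          rigid-backed channel of given depth (m), per a 1-D transmission line model."""
          k = 2 * np.pi * freq / c                  # acoustic wavenumber
          x_cavity = -1.0 / np.tan(k * depth)       # -cot(kL) cavity reactance
          return resistance + 1j * x_cavity

      freqs = np.array([500.0, 1000.0, 2000.0, 3000.0])           # Hz
      zeta = liner_impedance(freqs, depth=0.038, resistance=0.9)   # 38 mm channel (hypothetical)
      for f, z in zip(freqs, zeta):
          print(f"{f:6.0f} Hz: zeta = {z.real:.2f} {z.imag:+.2f}i")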

  14. Development of wavelet analysis tools for turbulence

    NASA Technical Reports Server (NTRS)

    Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.

    1992-01-01

    Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.

  15. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  16. DUK - A Fast and Efficient Kmer Based Sequence Matching Tool

    SciTech Connect

    Li, Mingkun; Copeland, Alex; Han, James

    2011-03-21

    A new tool, DUK, was developed to perform the matching task. Matching determines whether a query sequence partially or totally matches given reference sequences. Matching is similar to alignment; indeed, many traditional analysis tasks such as contaminant removal use alignment tools. But for matching there is no need to know which bases of a query sequence match which position of a reference sequence; it is only necessary to know whether a match exists. This subtle difference can make matching much faster than alignment. DUK is accurate, versatile, fast, and memory-efficient. It uses a k-mer hashing method to index reference sequences and a Poisson model to calculate p-values. DUK is carefully implemented in C++ with an object-oriented design, and the resulting classes can also be used to develop other tools quickly. DUK has been widely used at JGI for a wide range of applications such as contaminant removal, organelle genome separation, and assembly refinement. Many real applications and simulated datasets demonstrate its power.
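
    The k-mer hashing idea behind DUK, indexing every k-mer of the reference and flagging a query that shares enough k-mers with the index, can be sketched as follows; k and the hit threshold are arbitrary here, and the Poisson p-value step is omitted.

      def kmers(seq, k):
          return {seq[i:i + k] for i in range(len(seq) - k + 1)}

      def build_index(references, k=16):
          """Hash set of all k-mers occurring in the reference sequences."""
          index = set()
          for ref in references:
              index |= kmers(ref, k)
          return index

      def matches(query, index, k=16, min_hits=1):
          """True if the query shares at least min_hits k-mers with the reference index."""
          hits = sum(1 for km in kmers(query, k) if km in index)
          return hits >= min_hits

      refs = ["ACGTACGTTAGGCTAGCTAGGATCGATCGGATCGTAGCTAGCTAAGCTT"]    # toy contaminant
      index = build_index(refs, k=16)
      print(matches("TTTTGGATCGATCGGATCGTAGCTAGTTTT", index, k=16))   # shares a 16-mer -> True
      print(matches("AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA", index, k=16))   # -> False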

  17. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase-chain-reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview about quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
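
    One of the most common quantification strategies covered by such tools, relative quantification by the 2^-ddCq method, fits in a few lines; the Cq values below are invented, and refinements such as efficiency correction or multiple reference genes are omitted.

      def ddcq_fold_change(cq_target_treated, cq_ref_treated,
                           cq_target_control, cq_ref_control):
          """Relative expression by the 2^-ddCq method (assumes ~100% PCR efficiency)."""
          dcq_treated = cq_target_treated - cq_ref_treated
          dcq_control = cq_target_control - cq_ref_control
          ddcq = dcq_treated - dcq_control
          return 2.0 ** (-ddcq)

      # Hypothetical quantification cycles (Cq) for a target gene and a reference gene.
      fold = ddcq_fold_change(cq_target_treated=24.1, cq_ref_treated=18.0,
                              cq_target_control=26.3, cq_ref_control=18.2)
      print(f"fold change ~ {fold:.2f}")   # 4-fold up-regulation in this example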

  18. Operations other than war: Requirements for analysis tools research report

    SciTech Connect

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  19. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, the Joule-Thomson valve, the turboexpander, and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives a clear idea for deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
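
    The kind of first-pass estimate a process simulator automates for a Linde-Hampson liquefier can be illustrated with the ideal liquid-yield energy balance y = (h1 - h2) / (h1 - hf), where h1 and h2 are the enthalpies of the low- and high-pressure streams at ambient temperature and hf is the saturated-liquid enthalpy. The enthalpy values below are rough placeholders, not HYSYS output.

      def linde_liquid_yield(h1, h2, hf):
          """Ideal Linde-Hampson liquid yield per unit mass of compressed gas.

          h1: enthalpy of low-pressure gas at ambient T (kJ/kg)
          h2: enthalpy of high-pressure gas at ambient T (kJ/kg)
          hf: enthalpy of saturated liquid at the low pressure (kJ/kg)
          """
          return (h1 - h2) / (h1 - hf)

      # Rough illustrative numbers for air at ~300 K, 1 bar vs 200 bar (placeholders).
      y = linde_liquid_yield(h1=461.0, h2=432.0, hf=29.0)
      print(f"liquid yield ~ {y * 100:.1f}% of the compressed stream")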

  20. meRanTK: methylated RNA analysis ToolKit.

    PubMed

    Rieder, Dietmar; Amort, Thomas; Kugler, Elisabeth; Lusser, Alexandra; Trajanoski, Zlatko

    2016-03-01

    The significance and function of posttranscriptional cytosine methylation in poly(A)RNA attracts great interest but is still poorly understood. High-throughput sequencing of RNA treated with bisulfite (RNA-BSseq) or subjected to enrichment techniques like Aza-IP or miCLIP enables transcriptome wide studies of this particular modification at single base pair resolution. However, to date, there are no specialized software tools available for the analysis of RNA-BSseq or Aza-IP data. Therefore, we developed meRanTK, the first publicly available tool kit which addresses the special demands of high-throughput RNA cytosine methylation data analysis. It provides fast and easy to use splice-aware bisulfite sequencing read mapping, comprehensive methylation calling and identification of differentially methylated cytosines by statistical analysis of single- and multi-replicate experiments. Application of meRanTK to RNA-BSseq or Aza-IP data produces accurate results in standard compliant formats. meRanTK, source code and test data are released under the GNU GPLv3+ license and are available at http://icbi.at/software/meRanTK/. CONTACT: dietmar.rieder@i-med.ac.at. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
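
    The per-cytosine methylation call such a toolkit produces, a methylation level plus a test of the non-converted read count against the bisulfite non-conversion rate, can be sketched as follows; the counts and the non-conversion rate are invented, and this is not meRanTK's actual statistical model.

      from scipy.stats import binom

      def call_methylation(c_count, total_count, nonconversion_rate=0.01, alpha=0.05):
          """Methylation level and a one-sided binomial p-value that the observed
          non-converted C count exceeds the background non-conversion rate."""
          level = c_count / total_count
          # P(X >= c_count) under Binomial(total_count, nonconversion_rate)
          p_value = binom.sf(c_count - 1, total_count, nonconversion_rate)
          return level, p_value, p_value < alpha

      # Hypothetical site: 9 of 60 reads retained a C after bisulfite treatment.
      level, p, significant = call_methylation(9, 60)
      print(f"methylation level {level:.2f}, p = {p:.2e}, methylated: {significant}")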

  1. Needs Assessment and Analysis: Tools for Change.

    ERIC Educational Resources Information Center

    Rodriguez, Stephen R.

    1988-01-01

    Considers the processes associated with holistic needs assessment and other front end activities such as needs analysis, front-end analysis, and task analysis. The Organizational Elements Model (OEM) is described to clarify how processes relate to levels of organizational planning, and the optimal contexts for use of each process are suggested.…

  2. The Lagrangian analysis tool LAGRANTO - version 2.0

    NASA Astrophysics Data System (ADS)

    Sprenger, M.; Wernli, H.

    2015-02-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities: (i) trajectory starting positions can be described easily based on different geometrical and/or meteorological conditions; e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels; (ii) a versatile selection of trajectories is offered based on single or combined criteria; these criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity (PV) greater than 2 PVU); and (iii) full versions are available for global ECMWF and regional COSMO data; core functionality is also provided for the regional WRF and UM models, and for the global 20th Century Reanalysis data set. The intuitive application of LAGRANTO is first presented for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO is used to quasi-operationally diagnose stratosphere-troposphere exchange events over Europe. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution are needed to adequately resolve the rather complex flow structure associated with orographic blocking due to the Alps. Finally, an example of backward trajectories presents the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes simple tools, e.g., to visualize and merge trajectories. Furthermore, a detailed user guide exists, which describes all LAGRANTO capabilities.

  3. The LAGRANTO Lagrangian analysis tool - version 2.0

    NASA Astrophysics Data System (ADS)

    Sprenger, M.; Wernli, H.

    2015-08-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities. Trajectory starting positions can be defined easily and flexibly based on different geometrical and/or meteorological conditions, e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels. After the computation of the trajectories, a versatile selection of trajectories is offered based on single or combined criteria. These criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity, PV, greater than 2 PVU; 1 PVU = 10^-6 K m^2 kg^-1 s^-1). Full versions of this new version of LAGRANTO are available for global ECMWF and regional COSMO data, and core functionality is provided for the regional WRF and MetUM models and the global 20th Century Reanalysis data set. The paper first presents the intuitive application of LAGRANTO for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO can be used to quasi-operationally diagnose stratosphere-troposphere exchange events. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution serve to resolve the rather complex flow structure associated with orographic blocking due to the Alps, as shown in a third example. A final example illustrates the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes auxiliary tools, e.g., to visualize trajectories. A detailed user guide describes all LAGRANTO capabilities.
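
    The selection step driven by the simple command language (e.g. "GT:PV:2") can be mimicked with a tiny parser and filter; the trajectory data layout and the "at any time along the trajectory" evaluation rule are assumptions for illustration, not LAGRANTO internals.

      import numpy as np

      OPS = {"GT": np.greater, "LT": np.less, "GE": np.greater_equal, "LE": np.less_equal}

      def select_trajectories(trajs, criterion):
          """Keep trajectories satisfying 'OP:FIELD:VALUE' at any point along the path.

          trajs: dict mapping field name -> array of shape (n_traj, n_times).
          """
          op_name, field, value = criterion.split(":")
          return OPS[op_name](trajs[field], float(value)).any(axis=1)

      rng = np.random.default_rng(0)
      trajs = {"PV": rng.gamma(1.0, 1.0, size=(500, 25))}      # synthetic PV values in PVU
      keep = select_trajectories(trajs, "GT:PV:2")
      print(f"{keep.sum()} of {keep.size} trajectories have PV > 2 PVU somewhere")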

  4. PyRAT (python radiography analysis tool): overview

    SciTech Connect

    Armstrong, Jerawan C; Temple, Brian A; Buescher, Kevin L

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) optimization based inversion approach with goal of identifying unknown object configurations - MVO problem; (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method of linear inverse problem to recover partial information of object configurations; (5) using a priori knowledge of problem solutions to define feasible region and discrete neighbor for the MVO problem - initial data analysis + material library yields a priori knowledge; and (6) using the NOMAD (C++ version) software in the object.
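
    Feature (4), recovering partial information via Tikhonov regularization of a linear inverse problem, amounts to solving min ||Ax - b||^2 + lambda^2 ||x||^2; a compact sketch on a random ill-posed test problem (not radiographic data) is:

      import numpy as np

      def tikhonov(A, b, lam):
          """Solve min ||A x - b||^2 + lam^2 ||x||^2 via the regularized normal equations."""
          n = A.shape[1]
          return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

      rng = np.random.default_rng(5)
      A = rng.normal(size=(80, 120))          # under-determined, ill-posed forward model
      x_true = np.zeros(120)
      x_true[::10] = 1.0
      b = A @ x_true + rng.normal(scale=0.05, size=80)

      for lam in (1e-3, 1e-1, 1.0):
          x = tikhonov(A, b, lam)
          print(f"lambda={lam:g}: residual={np.linalg.norm(A @ x - b):.3f}, "
                f"solution norm={np.linalg.norm(x):.3f}")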

  5. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel is presented. The results are in the form of community noise metrics and

  6. SECRETOOL: integrated secretome analysis tool for fungi.

    PubMed

    Cortázar, Ana R; Aransay, Ana M; Alfaro, Manuel; Oguiza, José A; Lavín, José L

    2014-02-01

    The secretome (full set of secreted proteins) has been studied in multiple fungal genomes to elucidate the potential role of those protein collections in a number of metabolic processes, from host infection to wood degradation. Since amino acid composition is a key factor in recognizing secretory proteins, SECRETOOL comprises a group of web tools that enable secretome predictions from amino acid sequence files, up to complete fungal proteomes, in one step. SECRETOOL is freely available on the web at http://genomics.cicbiogune.es/SECRETOOL/Secretool.php.

  7. [Analysis on evaluation tool for literature quality in clinical study].

    PubMed

    Liu, Qing; Zhai, Wei; Tan, Ya-qin; Huang, Juan

    2014-09-01

    The tools used for literature quality evaluation are introduced. The common evaluation tools that are publicly and extensively used worldwide for evaluating the quality of clinical trial literature are analyzed, including the Jadad scale, the Consolidated Standards of Reporting Trials (CONSORT) statement, the Grades of Recommendations Assessment, Development and Evaluation (GRADE) system, and others. Additionally, the current development, updates, and applications of these tools are discussed.

  8. A browser-based tool for space weather and space climate studies

    NASA Astrophysics Data System (ADS)

    Tanskanen, E. I.; Pérez-Suárez, D.

    2014-04-01

    A browser-based research tool has been developed for on-line time series analysis. Large amounts of high-resolution measurements are nowadays available from different heliospheric locations. How to best handle the ever-increasing amount of information about near-Earth space weather conditions, and how to improve social data analysis tools for space studies, have become open issues. To resolve these issues, we have developed an interactive web interface, called Substorm Zoo, which we expect to become a powerful tool for scientists and a useful tool for the public.

  9. Image analysis tools and emerging algorithms for expression proteomics

    PubMed Central

    English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.

    2012-01-01

    Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput ‘shotgun’ proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user’s and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS. PMID:21046614

  10. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2015-11-01

    Progress on the full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes of 2D axisymmetric toroidal equilibria within MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters and will be an efficient tool for simulating MHD instabilities with low, intermediate, and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code covers both the construction of the matrix for the eigenvalue problem and the inverse-iteration algorithm implemented in MARS for solving the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. The solution of the eigenvalue problem is parallelized by reproducing the steps of the present MARS algorithm with parallel libraries and procedures. Results of the MARS parallelization and of the development of a new fixed-boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.
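
    The inverse-iteration algorithm mentioned above can be illustrated with a small serial sketch; MARS solves a far larger problem and distributes this work across processors, so the NumPy routine below is only a toy version of the underlying idea, not MARS code.

      import numpy as np

      def inverse_iteration(A, shift, tol=1e-10, max_iter=200):
          """Find the eigenpair of A whose eigenvalue lies closest to `shift`."""
          n = A.shape[0]
          M = A - shift * np.eye(n)
          x = np.random.default_rng(0).standard_normal(n)
          x /= np.linalg.norm(x)
          lam = shift
          for _ in range(max_iter):
              y = np.linalg.solve(M, x)       # one inverse-iteration step
              x_new = y / np.linalg.norm(y)
              lam_new = x_new @ A @ x_new     # Rayleigh-quotient estimate
              if abs(lam_new - lam) < tol:
                  return lam_new, x_new
              x, lam = x_new, lam_new
          return lam, x

      A = np.diag([1.0, 3.0, 7.0]) + 0.01 * np.ones((3, 3))
      eigval, eigvec = inverse_iteration(A, shift=2.5)
      print(eigval)   # converges to the eigenvalue nearest 2.5 (about 3)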

  11. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2014-10-01

    Progress on the full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes of 2D axisymmetric toroidal equilibria within MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters and will be an efficient tool for simulating MHD instabilities with low, intermediate, and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code covers both the construction of the matrix for the eigenvalue problem and the inverse-iteration algorithm implemented in MARS for solving the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. The solution of the eigenvalue problem is parallelized by reproducing the steps of the present MARS algorithm with parallel libraries and procedures. Initial results of the code parallelization will be reported. Work is supported by the U.S. DOE SBIR program.
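
    As a hedged sketch of the load-distribution idea (building the matrix surface by surface on different processors), the fragment below assigns magnetic surfaces to MPI ranks round-robin with mpi4py and gathers the per-surface blocks on the root rank; the block contents are placeholders, not MARS physics. Run it under an MPI launcher, e.g. mpirun -n 4 python build_matrix.py.

      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      n_surfaces = 200
      my_surfaces = range(rank, n_surfaces, size)   # round-robin assignment

      def build_block(surface_index):
          """Placeholder for the per-surface contribution to the matrix."""
          return surface_index, np.eye(4) * surface_index

      my_blocks = [build_block(s) for s in my_surfaces]

      # Gather all per-surface blocks on rank 0, where the full matrix
      # for the eigenvalue problem would be assembled.
      all_blocks = comm.gather(my_blocks, root=0)
      if rank == 0:
          total = sum(len(blocks) for blocks in all_blocks)
          print(f"assembled contributions from {total} surfaces on {size} ranks")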

  12. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2013-10-01

    Progress on the full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes of 2D axisymmetric toroidal equilibria within MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters and will be an efficient tool for simulating MHD instabilities with low, intermediate, and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code covers both the construction of the matrix for the eigenvalue problem and the inverse-iteration algorithm implemented in MARS for solving the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. The solution of the eigenvalue problem is parallelized by reproducing the steps of the present MARS algorithm with parallel libraries and procedures. Preliminary results of the code parallelization will be reported. Work is supported by the U.S. DOE SBIR program.

  13. Abstract Interfaces for Data Analysis - Component Architecture for Data Analysis Tools

    SciTech Connect

    Barrand, Guy

    2002-08-20

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualization), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organization, one goal of the group is to systematically design a set of abstract interfaces using modern OO analysis and OO design techniques. An initial domain analysis identified several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimizing the re-use and maintainability of each component individually. The interfaces have been defined in Java and C++, and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.
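
    The AIDA interfaces themselves are defined in Java and C++; purely as an illustration of the decoupling idea, the Python sketch below codes an "Analyzer"-style client against an abstract histogram interface so that concrete implementations can be swapped without touching the client. All names here are hypothetical analogues, not the actual AIDA API.

      from abc import ABC, abstractmethod

      class IHistogram1D(ABC):
          @abstractmethod
          def fill(self, value: float, weight: float = 1.0) -> None: ...

          @abstractmethod
          def entries(self) -> int: ...

      class ListHistogram1D(IHistogram1D):
          """One concrete implementation; others could be substituted freely."""
          def __init__(self, nbins: int, lo: float, hi: float):
              self.edges = [lo + i * (hi - lo) / nbins for i in range(nbins + 1)]
              self.counts = [0.0] * nbins
              self._entries = 0

          def fill(self, value, weight=1.0):
              for i in range(len(self.counts)):
                  if self.edges[i] <= value < self.edges[i + 1]:
                      self.counts[i] += weight
                      self._entries += 1
                      return

          def entries(self):
              return self._entries

      def summarize(hist: IHistogram1D) -> str:
          """An Analyzer-style client coded only against the abstract interface."""
          return f"histogram with {hist.entries()} entries"

      h = ListHistogram1D(10, 0.0, 1.0)
      for x in (0.1, 0.25, 0.9):
          h.fill(x)
      print(summarize(h))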

  14. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    PubMed

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper.
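
    The tool described above is MATLAB-based; as a hedged, language-neutral illustration of two of the analysis steps it offers, the sketch below computes a Welch power spectral density for synthetic EEG-like channels and then projects the channels onto principal components (SciPy and scikit-learn, with an assumed 256 Hz sampling rate).

      import numpy as np
      from scipy.signal import welch
      from sklearn.decomposition import PCA

      fs = 256                                   # assumed sampling rate in Hz
      t = np.arange(0, 10, 1 / fs)
      rng = np.random.default_rng(0)

      # Three synthetic channels: a 10 Hz alpha-like rhythm plus noise.
      channels = np.vstack([
          np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
          for _ in range(3)
      ])

      # Per-channel power spectral density and mean alpha-band (8-12 Hz) power.
      freqs, psd = welch(channels, fs=fs, nperseg=512)
      alpha_power = psd[:, (freqs >= 8) & (freqs <= 12)].mean(axis=1)
      print("alpha-band power per channel:", np.round(alpha_power, 3))

      # Multivariate step: project the channels onto two principal components.
      scores = PCA(n_components=2).fit_transform(channels.T)
      print("PCA scores shape:", scores.shape)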

  15. A Methodology for Integrating Computer-Based Learning Tools in Science Curricula

    ERIC Educational Resources Information Center

    Papadouris, Nicos; Constantinou, Constantinos P.

    2009-01-01

    This paper demonstrates a methodology for effectively integrating computer-based learning tools in science teaching and learning. This methodology provides a means of systematic analysis to identify the capabilities of particular software tools and to formulate a series of competencies relevant to physical science that could be developed by means…

  17. Value Analysis: A Tool for Community Colleges.

    ERIC Educational Resources Information Center

    White, Rita A.

    Adoption of a value analysis program is proposed to aid colleges in identifying and implementing educationally sound labor-saving devices and procedures, enabling them to meet more students' needs at less cost with no quality reduction and a minimum of staff resistance. Value analysis is defined as a method for studying how well a product does…

  18. AOP Knowledge Base/Wiki Tool Set

    EPA Science Inventory

    Utilizing ToxCast Data and Lifestage Physiologically-Based Pharmacokinetic (PBPK) models to Drive Adverse Outcome Pathways (AOPs)-Based Margin of Exposures (ABME) to Chemicals. Hisham A. El-Masri1, Nicole C. Klienstreur2, Linda Adams1, Tamara Tal1, Stephanie Padilla1, Kristin Is...

  20. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    ERIC Educational Resources Information Center

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…