Suurmond, Robert; van Rhee, Henk; Hak, Tony
2017-12-01
We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
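The one statistical procedure the abstract names, a Knapp-Hartung-adjusted confidence interval around a DerSimonian-Laird random-effects estimate, is compact enough to sketch. The following Python fragment is an illustrative textbook implementation, not code from Meta-Essentials; the example effect sizes and variances are made up.

```python
# Minimal sketch: DerSimonian-Laird tau^2 with a Knapp-Hartung-adjusted CI.
import numpy as np
from scipy import stats

def dl_kh_summary(y, v, alpha=0.05):
    """y: per-study effect sizes; v: their within-study variances."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    k = len(y)
    w = 1.0 / v                                    # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fe) ** 2)                # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)             # DerSimonian-Laird estimate
    w_re = 1.0 / (v + tau2)                        # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    # Knapp-Hartung variance with a t-based interval on k-1 df
    var_kh = np.sum(w_re * (y - mu) ** 2) / ((k - 1) * np.sum(w_re))
    half = stats.t.ppf(1 - alpha / 2, k - 1) * np.sqrt(var_kh)
    return mu, (mu - half, mu + half)

mu, ci = dl_kh_summary([0.3, 0.5, 0.1, 0.4], [0.04, 0.02, 0.05, 0.03])
print(mu, ci)
```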
Analysis of pre-service physics teacher skills designing simple physics experiments based technology
NASA Astrophysics Data System (ADS)
Susilawati; Huda, C.; Kurniawan, W.; Masturi; Khoiri, N.
2018-03-01
Pre-service physics teachers' skill in designing simple experiment sets is very important for deepening students' conceptual understanding and for practicing scientific skills in the laboratory. This study describes the skills of physics students in designing simple technology-based experiments. The experimental design stages include simple tool design and sensor modification. The research method is descriptive, with a sample of 25 students and 5 variations of simple physics experimental design. Based on the results of interviews and observations, pre-service physics teachers' skill in designing simple technology-based experiments is good, while their skill in sensor modification and application is still lacking. This suggests that pre-service physics teachers still need substantial practice in designing physics experiments that use modified sensors. The interviews also indicate that students are highly motivated to take an active part in laboratory activities and show strong curiosity about building simple practicum tools for physics experiments.
Nutrition screening tools: an analysis of the evidence.
Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy
2012-05-01
In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and four tools (the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST)) received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.
Social Network Analysis: A Simple but Powerful Tool for Identifying Teacher Leaders
ERIC Educational Resources Information Center
Smith, P. Sean; Trygstad, Peggy J.; Hayes, Meredith L.
2018-01-01
Instructional teacher leadership is central to a vision of distributed leadership. However, identifying instructional teacher leaders can be a daunting task, particularly for administrators who find themselves either newly appointed or faced with high staff turnover. This article describes the use of social network analysis (SNA), a simple but…
The Adversarial Route Analysis Tool: A Web Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casson, William H. Jr.
2012-08-02
The Adversarial Route Analysis Tool is a web-based geospatial application, similar to Google Maps, for modeling adversaries. It helps the U.S. government plan operations that predict where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.
EZ and GOSSIP, two new VO compliant tools for spectral analysis
NASA Astrophysics Data System (ADS)
Franzetti, P.; Garilli, B.; Fumana, M.; Paioro, L.; Scodeggio, M.; Paltani, S.; Scaramella, R.
2008-10-01
We present EZ and GOSSIP, two new VO compliant tools dedicated to spectral analysis. EZ is a tool to perform automatic redshift measurement; GOSSIP is a tool created to perform the SED fitting procedure in a simple, user-friendly and efficient way. These two tools have been developed by the PANDORA Group at INAF-IASF (Milano); EZ has been developed in collaboration with Osservatorio Monte Porzio (Roma) and the Integral Science Data Center (Geneva). EZ is released to the astronomical community; GOSSIP is currently in beta-testing.
ERIC Educational Resources Information Center
Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin
2016-01-01
In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…
GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis
NASA Astrophysics Data System (ADS)
Nass, Andrea; van Gasselt, Stephan
2015-04-01
Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications dedicated and ready-to-use GIS tools are available in standard software systems, while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology, where many such (basic) tools can be used to build complex analysis tools, e.g. in image and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g. ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or store rover data, but geologic mapping data can also be stored and accessed more efficiently using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface roughness are two commonly estimated quantities of particular interest, and for both a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on the extraction of landing-site characteristics using established criteria. We provide working examples and particularly focus on the concept of terrain roughness as it is interpreted in geomorphology and engineering studies.
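As an illustration of why competing roughness definitions matter, here is a minimal sketch of one common choice, detrended RMS height over a sliding window of a digital elevation model; the DEM array, window size, and units are assumptions, and this is not the toolset's own code.

```python
# Detrended RMS height: RMS deviation of a DEM window from its best-fit plane.
import numpy as np

def rms_roughness(dem, win=5):
    """dem: 2D elevation array [m]; win: odd window size in pixels."""
    rows, cols = dem.shape
    out = np.full_like(dem, np.nan, dtype=float)
    half = win // 2
    yy, xx = np.mgrid[0:win, 0:win]
    A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(win * win)])
    for r in range(half, rows - half):
        for c in range(half, cols - half):
            z = dem[r - half:r + half + 1, c - half:c + half + 1].ravel()
            coef, *_ = np.linalg.lstsq(A, z, rcond=None)  # fit a local plane
            out[r, c] = np.sqrt(np.mean((z - A @ coef) ** 2))
    return out

roughness = rms_roughness(np.random.default_rng(0).normal(0.0, 1.0, (64, 64)))
```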
Wagner, Lucas; Schmal, Christoph; Staiger, Dorothee; Danisman, Selahattin
2017-01-01
The analysis of circadian leaf movement rhythms is a simple yet effective method to study effects of treatments or gene mutations on the circadian clock of plants. Currently, leaf movements are analysed using time-lapse photography and subsequent bioinformatic analysis of leaf movements. Programs that are used for this purpose either perform a single function (i.e. leaf tip detection or rhythm analysis) or are limited to specific computational environments. We developed a leaf movement analysis tool, PALMA, that works on the command line and combines image extraction with rhythm analysis using fast Fourier transformation and non-linear least squares fitting. We validated PALMA both on simulated time series and in experiments using the known short-period mutant sensitivity to red light reduced 1 (srr1-1). We compared PALMA with two established leaf movement analysis tools and found it to perform equally well. Finally, we tested the effect of reduced iron conditions on the leaf movement rhythms of wild type plants. Here, we found that PALMA successfully detected period lengthening under reduced iron conditions. PALMA correctly estimated the period of both simulated and real-life leaf movement experiments. As a platform-independent console program that unites both functions needed for the analysis of circadian leaf movements, it is a valid alternative to existing leaf movement analysis tools.
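A minimal sketch of the two analysis steps PALMA's abstract names: an FFT to locate a candidate circadian period, refined by non-linear least-squares fitting of a cosine model. This is illustrative Python, not PALMA's source; the synthetic trace is made up.

```python
# FFT peak -> initial frequency guess -> cosine fit via curve_fit.
import numpy as np
from scipy.optimize import curve_fit

def estimate_period(t, pos):
    """t: hours (evenly sampled); pos: leaf-tip position (arbitrary units)."""
    dt = t[1] - t[0]
    detrended = pos - pos.mean()
    freqs = np.fft.rfftfreq(len(t), d=dt)
    power = np.abs(np.fft.rfft(detrended)) ** 2
    f0 = freqs[1:][np.argmax(power[1:])]            # skip the DC bin
    model = lambda t, A, f, phi, c: A * np.cos(2 * np.pi * f * t + phi) + c
    p, _ = curve_fit(model, t, pos, p0=[detrended.std(), f0, 0.0, pos.mean()])
    return 1.0 / p[1]                               # refined period, hours

t = np.arange(0.0, 120.0, 0.5)                      # five days, 30-min sampling
trace = np.cos(2 * np.pi * t / 24.7) \
        + 0.1 * np.random.default_rng(1).normal(size=t.size)
print(estimate_period(t, trace))                    # ~24.7 h
```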
The Persistence of Mode 1 Technology in the Korean Late Paleolithic
Lee, Hyeong Woo
2013-01-01
Ssangjungri (SJ), an open-air site with several Paleolithic horizons, was recently discovered in South Korea. Most of the identified artifacts are simple core and flake tools that indicate an expedient knapping strategy. Bifacially worked core tools, which might be considered non-classic bifaces, have also been found. The prolific horizons at the site were dated by accelerator mass spectrometry (AMS) to about 30 kya. Another newly discovered Paleolithic open-air site, Jeungsan (JS), shows a homogeneous lithic pattern during this period. The dominant artifact types and the raw materials used are similar in character to those from SJ, although JS yielded a larger number of simple core and flake tools with non-classic bifaces. Chronometric analysis by AMS and optically stimulated luminescence (OSL) indicates that the prime stratigraphic levels at JS also date to approximately 30 kya, and the numerous conjoining pieces indicate that the layers were not seriously affected by post-depositional processes. Thus, it can be confirmed that simple core and flake tools were produced at temporally and culturally independent sites until after 30 kya, supporting the hypothesis of a wide and persistent use of simple technology into the Late Pleistocene. PMID:23724113
Development of methodology for horizontal axis wind turbine dynamic analysis
NASA Technical Reports Server (NTRS)
Dugundji, J.
1982-01-01
Horizontal axis wind turbine dynamics were studied. The following findings are summarized: (1) review of the MOSTAS computer programs for dynamic analysis of horizontal axis wind turbines; (2) review of various analysis methods for rotating systems with periodic coefficients; (3) review of structural dynamics analysis tools for large wind turbines; (4) experiments for yaw characteristics of a rotating rotor; (5) development of a finite element model for rotors; (6) development of simple models for aeroelastics; and (7) development of simple models for stability and response of wind turbines on flexible towers.
Environmental management practices are trending away from simple, local- scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to impleme...
I'll take that to go: Big data bags and minimal identifiers for exchange of large, complex datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chard, Kyle; D'Arcy, Mike; Heavner, Benjamin D.
Big data workflows often require the assembly and exchange of complex, multi-element datasets. For example, in biomedical applications, the input to an analytic pipeline can be a dataset consisting of thousands of images and genome sequences assembled from diverse repositories, requiring a description of the contents of the dataset in a concise and unambiguous form. Typical approaches to creating datasets for big data workflows assume that all data reside in a single location, requiring costly data marshaling and permitting errors of omission and commission because dataset members are not explicitly specified. We address these issues by proposing simple methods and tools for assembling, sharing, and analyzing large and complex datasets that scientists can easily integrate into their daily workflows. These tools combine a simple and robust method for describing data collections (BDBags), data descriptions (Research Objects), and simple persistent identifiers (Minids) to create a powerful ecosystem of tools and services for big data analysis and sharing. We present these tools and use biomedical case studies to illustrate their use for the rapid assembly, sharing, and analysis of large datasets.
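BDBags build on the BagIt packaging convention, so the core idea can be shown with the standard library alone: a payload directory plus a checksum manifest. This is a conceptual sketch, not the authors' tooling; real BDBags add remote-file references and Minid/Research Object metadata.

```python
# Minimal BagIt-style bag: data/ payload, sha256 manifest, bagit.txt marker.
import hashlib
from pathlib import Path

def make_minimal_bag(bag_dir: Path, payload: dict[str, bytes]) -> None:
    data = bag_dir / "data"
    data.mkdir(parents=True, exist_ok=True)
    lines = []
    for name, content in payload.items():
        (data / name).write_bytes(content)
        digest = hashlib.sha256(content).hexdigest()   # fixity for each member
        lines.append(f"{digest}  data/{name}")
    (bag_dir / "manifest-sha256.txt").write_text("\n".join(lines) + "\n")
    (bag_dir / "bagit.txt").write_text(
        "BagIt-Version: 1.0\nTag-File-Character-Encoding: UTF-8\n")

make_minimal_bag(Path("example_bag"), {"genome.txt": b"ACGT" * 10})
```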
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, Christopher A.
In this dissertation the possibility that chaos and simple determinism are governing the dynamics of reversed field pinch (RFP) plasmas is investigated. To properly assess this possibility, data from both numerical simulations and experiment are analyzed. A large repertoire of nonlinear analysis techniques is used to identify low dimensional chaos in the data. These tools include phase portraits and Poincaré sections, correlation dimension, the spectrum of Lyapunov exponents, and short term predictability. In addition, nonlinear noise reduction techniques are applied to the experimental data in an attempt to extract any underlying deterministic dynamics. Two model systems are used to simulate the plasma dynamics. These are the DEBS code, which models global RFP dynamics, and the dissipative trapped electron mode (DTEM) model, which models drift wave turbulence. Data from both simulations show strong indications of low dimensional chaos and simple determinism. Experimental data were obtained from the Madison Symmetric Torus RFP and consist of a wide array of both global and local diagnostic signals. None of the signals shows any indication of low dimensional chaos or simple determinism. Moreover, most of the analysis tools indicate the experimental system is very high dimensional with properties similar to noise. Nonlinear noise reduction is unsuccessful at extracting an underlying deterministic system.
Comparison of different objective functions for parameterization of simple respiration models
M.T. van Wijk; B. van Putten; D.Y. Hollinger; A.D. Richardson
2008-01-01
The eddy covariance measurements of carbon dioxide fluxes collected around the world offer a rich source for detailed data analysis. Simple, aggregated models are attractive tools for gap filling, budget calculation, and upscaling in space and time. Key in the application of these models is their parameterization and a robust estimate of the uncertainty and reliability...
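The entry's subject, how the choice of objective function changes the fitted parameters of a simple respiration model, can be demonstrated on the common Q10 model; the model choice, synthetic data, and the two loss functions below are assumptions for illustration, not the paper's setup.

```python
# Fit R = R_ref * Q10**((T - 10)/10) under squared vs. absolute residuals.
import numpy as np
from scipy.optimize import minimize

def fit_q10(T, R, loss="squared"):
    def objective(p):
        r_ref, q10 = p
        resid = R - r_ref * q10 ** ((T - 10.0) / 10.0)
        return np.sum(resid ** 2) if loss == "squared" else np.sum(np.abs(resid))
    return minimize(objective, x0=[1.0, 2.0], method="Nelder-Mead").x

rng = np.random.default_rng(2)
T = rng.uniform(0.0, 30.0, 200)                     # soil temperature, deg C
R = 1.5 * 2.1 ** ((T - 10.0) / 10.0) + rng.normal(0.0, 0.3, T.size)
print(fit_q10(T, R, "squared"), fit_q10(T, R, "absolute"))  # parameters differ
```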
Fisher, Rohan P; Myers, Bronwyn A
2011-02-25
Despite the demonstrated utility of GIS for health applications, there are perceived problems in low resource settings: GIS software can be expensive and complex; input data are often of low quality. This study aimed to test the appropriateness of new, inexpensive and simple GIS tools in poorly resourced areas of a developing country. GIS applications were trialled in pilot studies based on mapping of health resources and health indicators at the clinic and district level in the predominantly rural province of Nusa Tenggara Timur in eastern Indonesia. The pilot applications were (i) rapid field collection of health infrastructure data using a GPS enabled PDA, (ii) mapping health indicator data using open source GIS software, and (iii) service availability mapping using a free modelling tool. Through contextualised training, district and clinic staff acquired skills in spatial analysis and visualisation and, six months after the pilot studies, they were using these skills for advocacy in the planning process, to inform the allocation of some health resources, and to evaluate some public health initiatives. We demonstrated that GIS can be a useful and inexpensive tool for the decentralisation of health data analysis to low resource settings through the use of free and simple software, locally relevant training materials and by providing data collection tools to ensure data reliability.
Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools
ERIC Educational Resources Information Center
Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.
2014-01-01
Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…
Anton TenWolde; Mark T. Bomberg
2009-01-01
Overall, despite the lack of exact input data, the use of design tools, including models, is much superior to simply following rules of thumb, and a moisture analysis should be standard procedure for any building envelope design. Exceptions can only be made for buildings in the same climate, with similar occupancy, and similar envelope construction. This chapter...
SPARTA: Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis.
Johnson, Benjamin K; Scholz, Matthew B; Teal, Tracy K; Abramovitch, Robert B
2016-02-04
Many tools exist in the analysis of bacterial RNA sequencing (RNA-seq) transcriptional profiling experiments to identify differentially expressed genes between experimental conditions. Generally, the workflow includes quality control of reads, mapping to a reference, counting transcript abundance, and statistical tests for differentially expressed genes. In spite of the numerous tools developed for each component of an RNA-seq analysis workflow, easy-to-use bacterially oriented workflow applications to combine multiple tools and automate the process are lacking. With many tools to choose from for each step, the task of identifying a specific tool, adapting the input/output options to the specific use-case, and integrating the tools into a coherent analysis pipeline is not a trivial endeavor, particularly for microbiologists with limited bioinformatics experience. To make bacterial RNA-seq data analysis more accessible, we developed a Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis (SPARTA). SPARTA is a reference-based bacterial RNA-seq analysis workflow application for single-end Illumina reads. SPARTA is turnkey software that simplifies the process of analyzing RNA-seq data sets, making bacterial RNA-seq analysis a routine process that can be undertaken on a personal computer or in the classroom. The easy-to-install, complete workflow processes whole transcriptome shotgun sequencing data files by trimming reads and removing adapters, mapping reads to a reference, counting gene features, calculating differential gene expression, and, importantly, checking for potential batch effects within the data set. SPARTA outputs quality analysis reports, gene feature counts and differential gene expression tables and scatterplots. SPARTA provides an easy-to-use bacterial RNA-seq transcriptional profiling workflow to identify differentially expressed genes between experimental conditions. This software will enable microbiologists with limited bioinformatics experience to analyze their data and integrate next generation sequencing (NGS) technologies into the classroom. The SPARTA software and tutorial are available at sparta.readthedocs.org.
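The workflow SPARTA automates (trim, map, count, test) is a linear chain of external programs. The sketch below shows only that orchestration pattern; the command names and flags are hypothetical placeholders, not SPARTA's actual components (see sparta.readthedocs.org for the real pipeline).

```python
# Orchestration pattern only: each step is a hypothetical external command.
import subprocess

STEPS = [
    ["trim_reads", "sample.fastq", "-o", "trimmed.fastq"],            # hypothetical
    ["map_reads", "-r", "reference.fasta", "trimmed.fastq",
     "-o", "aligned.sam"],                                            # hypothetical
    ["count_features", "aligned.sam", "genes.gff", "-o", "counts.tsv"],  # hypothetical
]

def run_pipeline(steps):
    for cmd in steps:
        print("running:", " ".join(cmd))
        subprocess.run(cmd, check=True)    # abort the workflow on any failure

run_pipeline(STEPS)
```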
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.
Simonyan, Vahan; Mazumder, Raja
2014-09-30
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.
NASA Technical Reports Server (NTRS)
Katz, Daniel S.; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.; Borgioli, Andrea
2000-01-01
The process of designing and analyzing a multiple-reflector system has traditionally been time-intensive, requiring large amounts of both computational and human time. At many frequencies, a discrete approximation of the radiation integral may be used to model the system. The code which implements this physical optics (PO) algorithm was developed at the Jet Propulsion Laboratory. It analyzes systems of antennas in pairs, and for each pair, the analysis can be computationally time-consuming. Additionally, the antennas must be described using a local coordinate system for each antenna, which makes it difficult to integrate the design into a multi-disciplinary framework in which there is traditionally one global coordinate system, even before considering deforming the antenna as prescribed by external structural and/or thermal factors. Finally, setting up the code to correctly analyze all the antenna pairs in the system can take a fair amount of time, and introduces possible human error. The use of parallel computing to reduce the computational time required for the analysis of a given pair of antennas has been previously discussed. This paper focuses on the other problems mentioned above. It will present a methodology and examples of use of an automated tool that performs the analysis of a complete multiple-reflector system in an integrated multi-disciplinary environment (including CAD modeling, and structural and thermal analysis) at the click of a button. This tool, named MOD Tool (Millimeter-wave Optics Design Tool), has been designed and implemented as a distributed tool, with a client that runs almost identically on Unix, Mac, and Windows platforms, and a server that runs primarily on a Unix workstation and can interact with parallel supercomputers with simple instruction from the user interacting with the client.
Gravitational Wave Detection in the Introductory Lab
NASA Astrophysics Data System (ADS)
Burko, Lior M.
2017-01-01
Great physics breakthroughs are rarely included in the introductory physics course. General relativity and binary black hole coalescence are no different, and can be included in the introductory course only in a very limited sense. However, we can design activities that directly involve the detection of GW150914, the designation of the gravitational-wave signal detected on September 14, 2015, thereby engaging students in this exciting discovery directly. The activities naturally do not include the construction of a detector or the detection of gravitational waves. Instead, we design them to include analysis of the data from GW150914, which offers some interesting analysis activities for students of the introductory course. The same activities can be assigned either as a laboratory exercise or as a computational project for the same population of students. The analysis tools used here are simple and available to the intended student population. They do not include the sophisticated tools that LIGO used to carefully analyze the detected signal. However, these simple tools are sufficient to allow the student to obtain important results. We have successfully assigned this lab project to students of the calculus-based introductory course at Georgia Gwinnett College.
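One such "simple tool" activity is band-passing the public strain data so the chirp becomes visible. The sketch below assumes a local text file of strain samples and a 4096 Hz sampling rate; both are assumptions, though GW150914 strain is publicly downloadable from the Gravitational Wave Open Science Center.

```python
# Band-pass detector strain to the audible chirp band with a Butterworth filter.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 4096                                    # samples per second (assumed)
strain = np.loadtxt("gw150914_strain.txt")   # hypothetical local copy of the data

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)                 # zero-phase filtering

filtered = bandpass(strain, 35.0, 350.0, fs)
t = np.arange(strain.size) / fs
# Plotting t vs. filtered now shows the oscillation growing in frequency
# and amplitude just before merger.
```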
ViSimpl: Multi-View Visual Analysis of Brain Simulation Data
Galindo, Sergio E.; Toharia, Pablo; Robles, Oscar D.; Pastor, Luis
2016-01-01
After decades of independent morphological and functional brain research, a key point in neuroscience nowadays is to understand the combined relationships between the structure of the brain and its components and their dynamics on multiple scales, ranging from circuits of neurons at micro or mesoscale to brain regions at macroscale. With such a goal in mind, there is a vast amount of research focusing on modeling and simulating activity within neuronal structures, and these simulations generate large and complex datasets which have to be analyzed in order to gain the desired insight. In such context, this paper presents ViSimpl, which integrates a set of visualization and interaction tools that provide a semantic view of brain data with the aim of improving its analysis procedures. ViSimpl provides 3D particle-based rendering that allows visualizing simulation data with their associated spatial and temporal information, enhancing the knowledge extraction process. It also provides abstract representations of the time-varying magnitudes supporting different data aggregation and disaggregation operations and giving also focus and context clues. In addition, ViSimpl tools provide synchronized playback control of the simulation being analyzed. Finally, ViSimpl allows performing selection and filtering operations relying on an application called NeuroScheme. All these views are loosely coupled and can be used independently, but they can also work together as linked views, both in centralized and distributed computing environments, enhancing the data exploration and analysis procedures. PMID:27774062
Projectiles, pendula, and special relativity
NASA Astrophysics Data System (ADS)
Price, Richard H.
2005-05-01
The kind of flat-earth gravity used in introductory physics appears in an accelerated reference system in special relativity. From this viewpoint, we work out the special relativistic description of a ballistic projectile and a simple pendulum, two examples of simple motion driven by earth-surface gravity. The analysis uses only the basic mathematical tools of special relativity typical of a first-year university course.
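A hedged numerical companion: one common relativistic model of a ballistic projectile integrates dp/dt = (0, -mg) with v = p/sqrt(m^2 + |p|^2) in natural units (c = g = m = 1). This illustrates the regime the paper studies; it is not a reproduction of the paper's accelerated-frame derivation.

```python
# Relativistic projectile under a constant coordinate force, natural units.
import numpy as np

v0 = np.array([0.6, 0.6])                   # launch velocity, |v| < 1 (c = 1)
p = v0 / np.sqrt(1.0 - v0 @ v0)             # relativistic momentum (m = 1)
x = np.zeros(2)
dt, path = 1e-4, [x.copy()]
while x[1] >= 0.0:                          # until return to launch height
    v = p / np.sqrt(1.0 + p @ p)            # velocity recovered from momentum
    x = x + v * dt
    p = p + np.array([0.0, -1.0]) * dt      # constant "flat-earth" force
    path.append(x.copy())
path = np.array(path)   # the arc visibly deviates from the Newtonian parabola
```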
Simple Example of Backtest Overfitting (SEBO)
DOE Office of Scientific and Technical Information (OSTI.GOV)
In the field of mathematical finance, a "backtest" is the usage of historical market data to assess the performance of a proposed trading strategy. It is a relatively simple matter for a present-day computer system to explore thousands, millions or even billions of variations of a proposed strategy, and pick the best performing variant as the "optimal" strategy "in sample" (i.e., on the input dataset). Unfortunately, such an "optimal" strategy often performs very poorly "out of sample" (i.e., on another dataset), because the parameters of the investment strategy have been overfit to the in-sample data, a situation known as "backtest overfitting". While the mathematics of backtest overfitting has been examined in several recent theoretical studies, here we pursue a more tangible analysis of this problem, in the form of an online simulator tool. Given an input random walk time series, the tool develops an "optimal" variant of a simple strategy by exhaustively exploring all integer parameter values among a handful of parameters. That "optimal" strategy is overfit, since by definition a random walk is unpredictable. Then the tool tests the resulting "optimal" strategy on a second random walk time series. In most runs using our online tool, the "optimal" strategy derived from the first time series performs poorly on the second time series, demonstrating how hard it is not to overfit a backtest. We offer this online tool, "Simple Example of Backtest Overfitting (SEBO)", to facilitate further research in this area.
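The experiment the tool performs is easy to re-create: exhaustively "optimize" a toy strategy on one random walk, then score it on a fresh one. The moving-average crossover strategy and parameter grid below are illustrative assumptions, not SEBO's exact rules.

```python
# Overfit a moving-average crossover in sample; watch it fail out of sample.
import numpy as np

rng = np.random.default_rng(7)

def pnl(prices, fast, slow):
    """Long when the fast MA exceeds the slow MA, flat otherwise."""
    ma = lambda n: np.convolve(prices, np.ones(n) / n, mode="valid")
    s = ma(slow)
    f = ma(fast)[-len(s):]                    # align both averages at the end
    pos = (f[:-1] > s[:-1]).astype(float)
    return np.sum(pos * np.diff(prices[-len(s):]))

walk = lambda: np.cumsum(rng.normal(0.0, 1.0, 2000)) + 100.0
train, test = walk(), walk()

grid = [(f, s) for f in range(2, 20) for s in range(21, 80, 2)]
best = max(grid, key=lambda p: pnl(train, *p))   # the overfit "optimal" variant
print("in-sample :", pnl(train, *best))          # looks impressive
print("out-sample:", pnl(test, *best))           # typically near zero or negative
```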
A Thermal Management Systems Model for the NASA GTX RBCC Concept
NASA Technical Reports Server (NTRS)
Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)
2002-01-01
The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering-level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium with Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air-breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.
On-Line Analysis of Southern FIA Data
Michael P. Spinney; Paul C. Van Deusen; Francis A. Roesch
2006-01-01
The Southern On-Line Estimator (SOLE) is a web-based FIA database analysis tool designed with an emphasis on modularity. The Java-based user interface is simple and intuitive to use and the R-based analysis engine is fast and stable. Each component of the program (data retrieval, statistical analysis and output) can be individually modified to accommodate major...
Benchmarking of Decision-Support Tools Used for Tiered Sustainable Remediation Appraisal.
Smith, Jonathan W N; Kerrison, Gavin
2013-01-01
Sustainable remediation comprises soil and groundwater risk-management actions that are selected, designed, and operated to maximize net environmental, social, and economic benefit (while assuring protection of human health and safety). This paper describes a benchmarking exercise to comparatively assess potential differences in environmental management decision making resulting from application of different sustainability appraisal tools ranging from simple (qualitative) to more quantitative (multi-criteria and fully monetized cost-benefit analysis), as outlined in the SuRF-UK framework. The appraisal tools were used to rank remedial options for risk management of a subsurface petroleum release that occurred at a petrol filling station in central England. The remediation options were benchmarked using a consistent set of soil and groundwater data for each tier of sustainability appraisal. The ranking of remedial options was very similar in all three tiers, and an environmental management decision to select the most sustainable options at tier 1 would have been the same decision at tiers 2 and 3. The exercise showed that, for relatively simple remediation projects, a simple sustainability appraisal led to the same remediation option selection as more complex appraisal, and can be used to reliably inform environmental management decisions on other relatively simple land contamination projects.
A Simple Framework for Evaluating Authorial Contributions for Scientific Publications.
Warrender, Jeffrey M
2016-10-01
A simple tool is provided to assist researchers in assessing contributions to a scientific publication, for ease in evaluating which contributors qualify for authorship, and in what order the authors should be listed. The tool identifies four phases of activity leading to a publication: Conception and Design, Data Acquisition, Analysis and Interpretation, and Manuscript Preparation. By comparing a project participant's contribution in a given phase to several specified thresholds, a score of up to five points can be assigned; the contributor's scores in all four phases are summed to yield a total "contribution score", which is compared to a threshold to determine which contributors merit authorship. This tool may be useful in a variety of contexts in which a systematic approach to authorial credit is desired.
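The scoring scheme is mechanical enough to transcribe directly: up to five points per phase, summed across four phases, compared against a threshold. The authorship threshold below is an assumption for illustration; the paper specifies its own.

```python
# Sum per-phase scores (0..5 each) and compare against an authorship threshold.
PHASES = ("conception_design", "data_acquisition",
          "analysis_interpretation", "manuscript_preparation")

def contribution_score(scores: dict[str, int]) -> int:
    assert all(0 <= scores.get(p, 0) <= 5 for p in PHASES)
    return sum(scores.get(p, 0) for p in PHASES)        # total in 0..20

def merits_authorship(scores: dict[str, int], threshold: int = 6) -> bool:
    return contribution_score(scores) >= threshold      # threshold is assumed

alice = {"conception_design": 4, "data_acquisition": 1,
         "analysis_interpretation": 3, "manuscript_preparation": 2}
print(contribution_score(alice), merits_authorship(alice))
```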
Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard
2018-06-01
Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image analysis workflows for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks .
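A minimal workflow of the kind the notebooks teach, using standard SimpleITK calls (read, smooth, Otsu-threshold, export to NumPy); the file names are placeholders.

```python
# Read -> smooth -> segment -> export, all through the SimpleITK interface.
import SimpleITK as sitk

image = sitk.ReadImage("example_volume.nii.gz", sitk.sitkFloat32)  # placeholder path
smoothed = sitk.CurvatureFlow(image1=image, timeStep=0.125,
                              numberOfIterations=5)                # edge-preserving smoothing
segmentation = sitk.OtsuThreshold(smoothed, 0, 1)                  # background 0, foreground 1
sitk.WriteImage(segmentation, "segmentation.nii.gz")
array = sitk.GetArrayFromImage(segmentation)   # NumPy view for further analysis
```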
Architecture for interoperable software in biology.
Bare, James Christopher; Baliga, Nitin S
2014-07-01
Understanding biological complexity demands a combination of high-throughput data and interdisciplinary skills. One way to bring to bear the necessary combination of data types and expertise is by encapsulating domain knowledge in software and composing that software to create a customized data analysis environment. To this end, simple flexible strategies are needed for interconnecting heterogeneous software tools and enabling data exchange between them. Drawing on our own work and that of others, we present several strategies for interoperability and their consequences, in particular, a set of simple data structures (list, matrix, network, table, and tuple) that have proven sufficient to achieve a high degree of interoperability. We provide a few guidelines for the development of future software that will function as part of an interoperable community of software tools for biological data analysis and visualization. © The Author 2012. Published by Oxford University Press.
ERIC Educational Resources Information Center
Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai
2016-01-01
A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…
Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan
2014-12-15
LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Génot, V.; André, N.; Cecconi, B.; Bouchemit, M.; Budnik, E.; Bourrel, N.; Gangloff, M.; Dufourg, N.; Hess, S.; Modolo, R.; Renard, B.; Lormant, N.; Beigbeder, L.; Popescu, D.; Toniutti, J.-P.
2014-11-01
The interest in data communication between analysis tools in planetary sciences and space physics is illustrated in this paper via several examples of the use of SAMP. The Simple Application Messaging Protocol was developed within the framework of the IVOA (International Virtual Observatory Alliance) from an earlier protocol called PLASTIC. SAMP enables easy communication and interoperability between astronomy software, stand-alone and web-based; it is now increasingly adopted by the planetary sciences and space physics community. Its attractiveness is based, on one hand, on the use of common file formats for exchange and, on the other hand, on established messaging models. Examples of uses at the CDPP and elsewhere are presented. The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of data products from space missions and ground observatories. Besides these activities, the CDPP has developed services like AMDA (Automated Multi Dataset Analysis, http://amda.cdpp.eu/), which enables in-depth analysis of large amounts of data through dedicated functionalities such as visualization, conditional search, and cataloging. Besides AMDA, the 3DView (http://3dview.cdpp.eu/) tool provides immersive visualizations and is being further developed to include simulation and observational data. These tools and their interactions with each other, notably via SAMP, are presented via science cases of interest to the planetary sciences and space physics communities.
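SAMP usage is short enough to sketch with astropy's standard client: connect to a hub and broadcast a table-load message that any listening tool may act on. The table URL is a placeholder, and a SAMP hub must already be running.

```python
# Broadcast a VOTable to all SAMP-connected tools via astropy's client.
from astropy.samp import SAMPIntegratedClient

client = SAMPIntegratedClient()
client.connect()                                   # requires a running SAMP hub
client.notify_all({
    "samp.mtype": "table.load.votable",            # standard SAMP message type
    "samp.params": {
        "url": "file:///tmp/fluxes.xml",           # placeholder table location
        "name": "example-timeseries",
    },
})
client.disconnect()
```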
Navigating freely-available software tools for metabolomics analysis.
Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph
2017-01-01
The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. This review aims to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either having ≥ 50 citations on Web of Science (as of 08/09/16) or being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.
Dimensional Analysis in Physics and the Buckingham Theorem
ERIC Educational Resources Information Center
Misic, Tatjana; Najdanovic-Lukic, Marina; Nesic, Ljubisa
2010-01-01
Dimensional analysis is a simple, clear and intuitive method for determining the functional dependence of physical quantities that are of importance to a certain process. However, in physics textbooks, very little space is usually given to this approach and it is often presented only as a diagnostic tool used to determine the validity of…
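A one-line worked example of the method, for readers unfamiliar with it: dimensional analysis alone fixes the period of a simple pendulum up to a constant.

```latex
% Worked example: assume the period T depends on length l, gravity g, and
% mass m as T = C l^a g^b m^c with C dimensionless. Matching dimensions:
\[
  \mathrm{s} = \mathrm{m}^{a}\,(\mathrm{m\,s^{-2}})^{b}\,\mathrm{kg}^{c}
  \;\Longrightarrow\; c = 0,\quad a + b = 0,\quad -2b = 1,
\]
\[
  \text{hence } a = \tfrac12,\; b = -\tfrac12
  \quad\Longrightarrow\quad T = C\,\sqrt{l/g}.
\]
% Dimensional analysis fixes the functional dependence up to the constant C;
% the full dynamics gives C = 2\pi for small oscillations.
```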
Assessment and Planning Using Portfolio Analysis
ERIC Educational Resources Information Center
Roberts, Laura B.
2010-01-01
Portfolio analysis is a simple yet powerful management tool. Programs and activities are placed on a grid with mission along one axis and financial return on the other. The four boxes of the grid (low mission, low return; high mission, low return; high return, low mission; high return, high mission) help managers identify which programs might be…
Microcomputers, Software and Foreign Languages for Special Purposes: An Analysis of TXTPRO.
ERIC Educational Resources Information Center
Tang, Michael S.
TXTPRO, a computer program developed as a graduate-level research tool for descriptive linguistic analysis, produces simple alphabetic and word frequency lists, analyzes word combinations, and develops concordances. With modifications, a teacher could enter the program into a mainframe or a microcomputer and use it for text analyses to develop…
Lam, Johnny; Marklein, Ross A; Jimenez-Torres, Jose A; Beebe, David J; Bauer, Steven R; Sung, Kyung E
2017-12-01
Multipotent stromal cells (MSCs, often called mesenchymal stem cells) have garnered significant attention within the field of regenerative medicine because of their purported ability to differentiate down musculoskeletal lineages. Given the inherent heterogeneity of MSC populations, recent studies have suggested that cell morphology may be indicative of MSC differentiation potential. Toward improving current methods and developing simple yet effective approaches for the morphological evaluation of MSCs, we combined passive pumping microfluidic technology with high-dimensional morphological characterization to produce robust tools for standardized high-throughput analysis. Using ultraviolet (UV) light as a modality for reproducible polystyrene substrate modification, we show that MSCs seeded on microfluidic straight channel devices incorporating UV-exposed substrates exhibited morphological changes that responded accordingly to the degree of substrate modification. Substrate modification also effected greater morphological changes in MSCs seeded at a lower rather than higher density within microfluidic channels. Despite largely comparable trends in morphology, MSCs seeded in microscale as opposed to traditional macroscale platforms displayed much higher sensitivity to changes in substrate properties. In summary, we adapted and qualified microfluidic cell culture platforms comprising simple straight channel arrays as a viable and robust tool for high-throughput quantitative morphological analysis to study cell-material interactions.
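The morphological characterization step reduces to segmenting cells and extracting shape descriptors. A sketch with scikit-image follows; the image file, threshold choice, and size filter are assumptions, not the study's protocol.

```python
# Segment cells by Otsu threshold, then extract per-cell shape descriptors.
import numpy as np
from skimage import io, filters, measure

image = io.imread("msc_channel.tif", as_gray=True)    # placeholder image
mask = image > filters.threshold_otsu(image)          # simple segmentation
labels = measure.label(mask)                          # connected components

features = [
    {
        "area": r.area,
        "eccentricity": r.eccentricity,
        "solidity": r.solidity,                       # convexity measure
        "aspect_ratio": r.major_axis_length / max(r.minor_axis_length, 1e-9),
    }
    for r in measure.regionprops(labels)
    if r.area > 50                                    # drop segmentation specks
]
print(f"{len(features)} cells characterized")
```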
Ben Rekaya, Mariem; Laroussi, Nadia; Messaoud, Olfa; Jones, Mariem; Jerbi, Manel; Naouali, Chokri; Bouyacoub, Yosra; Chargui, Mariem; Kefi, Rym; Fazaa, Becima; Boubaker, Mohamed Samir; Boussen, Hamouda; Mokni, Mourad; Abdelhak, Sonia; Zghal, Mohamed; Khaled, Aida; Yacoub-Youssef, Houda
2014-01-01
The xeroderma pigmentosum variant (XP-V) form is characterized by a late onset of skin symptoms. Our aim is the clinical and genetic investigation of Tunisian XP-V patients in order to develop a simple tool for early diagnosis. We investigated 16 suspected XP patients belonging to ten consanguineous families. Analysis of the POLH gene was performed by linkage analysis, long range PCR, and sequencing. Genetic analysis showed linkage to the POLH gene with a founder haplotype in all affected patients. Long range PCR of exon 9 to exon 11 showed a 3926 bp deletion compared to control individuals. Sequence analysis demonstrates that this deletion occurred between two Alu-Sq2 repetitive sequences in the same orientation, located in introns 9 and 10, respectively. We suggest that this mutation, POLH NG_009252.1: g.36847_40771del3925, was caused by an unequal crossover event between two homologous chromosomes at meiosis. These results allowed us to develop a simple PCR-based test to screen suspected XP-V patients. In Tunisia, the prevalence of the XP-V group seems to be underestimated and clinical diagnosis usually comes late. Cascade screening for this founder mutation by PCR in regions with a high frequency of XP provides a rapid and cost-effective tool for early diagnosis of XP-V in Tunisia and North Africa.
Simple Parametric Model for Airfoil Shape Description
NASA Astrophysics Data System (ADS)
Ziemkiewicz, David
2017-12-01
We show a simple, analytic equation describing a class of two-dimensional shapes well suited for representation of aircraft airfoil profiles. Our goal was to create a description characterized by a small number of parameters with easily understandable meaning, providing a tool to alter the shape with optimization procedures as well as manual tweaks by the designer. The generated shapes are well suited for numerical analysis with 2D flow solving software such as XFOIL.
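The paper's own equation is not reproduced here; as a stand-in with the same spirit (few parameters, each physically meaningful), this sketch evaluates the classic NACA four-digit thickness distribution.

```python
# Classic NACA 4-digit half-thickness distribution along the unit chord.
import numpy as np

def naca_thickness(x, t):
    """x: chordwise coordinate in [0, 1]; t: max thickness / chord."""
    return 5.0 * t * (0.2969 * np.sqrt(x) - 0.1260 * x - 0.3516 * x**2
                      + 0.2843 * x**3 - 0.1015 * x**4)

x = np.linspace(0.0, 1.0, 200)
yt = naca_thickness(x, 0.12)        # NACA 0012: symmetric, 12% thick
upper, lower = yt, -yt              # outline of the symmetric section
```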
Using high speed smartphone cameras and video analysis techniques to teach mechanical wave physics
NASA Astrophysics Data System (ADS)
Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano
2017-07-01
We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allows one to measure, in a simple yet rigorous way, the speed of pulses along a spring and the period of transverse standing waves generated in the same spring. These experiments can be helpful in addressing several relevant concepts about the physics of mechanical waves and in overcoming some of the typical student misconceptions in this same field.
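The measurement itself reduces to positions per frame: the sketch below turns tracked pulse positions from slow-motion footage into a speed via a linear fit. The frame rate, pixel scale, and positions are invented for illustration.

```python
# Pulse speed from tracked frame positions: fit x(t) and take the slope.
import numpy as np

fps = 240.0                               # slow-motion capture rate (assumed)
m_per_px = 2.0e-3                         # spatial calibration (assumed)
pulse_px = np.array([112, 131, 150, 170, 189, 208])  # tracked pulse positions

t = np.arange(pulse_px.size) / fps        # time stamp of each frame
x = pulse_px * m_per_px                   # positions in metres
speed = np.polyfit(t, x, 1)[0]            # slope of x(t) = pulse speed, m/s
print(f"pulse speed ~ {speed:.2f} m/s")
```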
The Simple Video Coder: A free tool for efficiently coding social video data.
Barto, Daniel; Bird, Clark W; Hamilton, Derek A; Fink, Brandi C
2017-08-01
Videotaping of experimental sessions is a common practice across many disciplines of psychology, ranging from clinical therapy, to developmental science, to animal research. Audio-visual data are a rich source of information that can be easily recorded; however, analysis of the recordings presents a major obstacle to project completion. Coding behavior is time-consuming and often requires ad-hoc training of a student coder. In addition, existing software is either prohibitively expensive or cumbersome, which leaves researchers with inadequate tools to quickly process video data. We offer the Simple Video Coder: free, open-source software for behavior coding that is flexible in accommodating different experimental designs, is intuitive for students to use, and produces outcome measures of event timing, frequency, and duration. Finally, the software also offers extraction tools to splice video into coded segments suitable for training future human coders or for use as input for pattern classification algorithms.
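Downstream of such a coder, the outcome measures are simple aggregations. A sketch with pandas follows; the column names and events are assumptions, not the tool's actual output format.

```python
# From coded events (onset/offset in seconds) to frequency and total duration.
import pandas as pd

codes = pd.DataFrame({
    "behavior": ["groom", "rear", "groom", "freeze"],
    "onset":    [1.20, 4.75, 9.10, 12.40],
    "offset":   [3.90, 6.10, 10.80, 15.00],
})
codes["duration"] = codes["offset"] - codes["onset"]
summary = codes.groupby("behavior")["duration"].agg(frequency="count",
                                                    total_s="sum")
print(summary)
```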
R-based Tool for a Pairwise Structure-activity Relationship Analysis.
Klimenko, Kyrylo
2018-04-01
The Structure-Activity Relationship analysis is a complex process that can be enhanced by computational techniques. This article describes a simple tool for SAR analysis that has a graphic user interface and a flexible approach towards the input of molecular data. The application allows calculating molecular similarity, represented by the Tanimoto index and Euclidean distance, as well as determining activity cliffs by means of the Structure-Activity Landscape Index. The calculation is performed in a pairwise manner, either for the reference compound and other compounds or for all possible pairs in the data set. The results of SAR analysis are visualized using two types of plots. The application capability is demonstrated by the analysis of a set of COX2 inhibitors with respect to Isoxicam. This tool is available online; it includes a manual and input file examples. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
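A minimal sketch of the two quantities named above, assuming fingerprints represented as sets of "on" bits and hypothetical activity values; the Structure-Activity Landscape Index is taken in its usual |ΔA|/(1 − similarity) form:

```python
def tanimoto(fp1: set, fp2: set) -> float:
    """Tanimoto index on fingerprints represented as sets of 'on' bits."""
    return len(fp1 & fp2) / len(fp1 | fp2)

def sali(act1: float, act2: float, sim: float) -> float:
    """Structure-Activity Landscape Index: large when similar structures
    have very different activities (an activity cliff)."""
    return abs(act1 - act2) / (1.0 - sim) if sim < 1.0 else float("inf")

# Hypothetical fingerprints (bit sets) and activities (e.g., pIC50 values)
ref = ({1, 3, 5, 8, 13}, 7.2)   # reference compound
cmp_ = ({1, 3, 5, 8, 21}, 4.9)  # comparison compound
sim = tanimoto(ref[0], cmp_[0])
print(f"Tanimoto={sim:.2f}, SALI={sali(ref[1], cmp_[1], sim):.1f}")
```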
WebScope: A New Tool for Fusion Data Analysis and Visualization
NASA Astrophysics Data System (ADS)
Yang, Fei; Dang, Ningning; Xiao, Bingjia
2010-04-01
A visualization tool was developed through a web browser based on Java applets embedded into HTML pages, in order to provide worldwide access to the EAST experimental data. It can display data from various trees in different servers in a single panel. With WebScope, it is easier to make a comparison between different data sources and perform a simple calculation over different data sources.
Development of a prototype commonality analysis tool for use in space programs
NASA Technical Reports Server (NTRS)
Yeager, Dorian P.
1988-01-01
A software tool to aid in performing commonality analyses, called the Commonality Analysis Problem Solver (CAPS), was designed, and a prototype version (CAPS 1.0) was implemented and tested. CAPS 1.0 runs in an MS-DOS or IBM PC-DOS environment. CAPS is designed around a simple input language which provides a natural syntax for the description of feasibility constraints. It provides its users with the ability to load a database representing a set of design items, describe the feasibility constraints on items in that database, and do a comprehensive cost analysis to find the most economical substitution pattern.
Chaos in plasma simulation and experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, C.; Newman, D.E.; Sprott, J.C.
1993-09-01
We investigate the possibility that chaos and simple determinism are governing the dynamics of reversed field pinch (RFP) plasmas using data from both numerical simulations and experiment. A large repertoire of nonlinear analysis techniques is used to identify low dimensional chaos. These tools include phase portraits and Poincaré sections, correlation dimension, the spectrum of Lyapunov exponents and short term predictability. In addition, nonlinear noise reduction techniques are applied to the experimental data in an attempt to extract any underlying deterministic dynamics. Two model systems are used to simulate the plasma dynamics. These are the DEBS code, which models global RFP dynamics, and the dissipative trapped electron mode (DTEM) model, which models drift wave turbulence. Data from both simulations show strong indications of low-dimensional chaos and simple determinism. Experimental data were obtained from the Madison Symmetric Torus RFP and consist of a wide array of both global and local diagnostic signals. None of the signals shows any indication of low dimensional chaos or other simple determinism. Moreover, most of the analysis tools indicate the experimental system is very high dimensional with properties similar to noise. Nonlinear noise reduction is unsuccessful at extracting an underlying deterministic system.
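As an illustration of one tool from the repertoire above, the sketch below gives a crude Grassberger-Procaccia correlation-dimension estimate (a standard textbook formulation, not the authors' code); a clean low-dimensional signal such as a sine should yield a dimension near 1.

```python
import numpy as np

def corr_dim(x, m=3, tau=1):
    """Crude Grassberger-Procaccia estimate: delay-embed the series, form
    the correlation sum C(r), and take the slope of log C vs. log r."""
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i*tau:i*tau + n] for i in range(m)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    d = d[np.triu_indices(n, k=1)]                 # unique pair distances
    rs = np.logspace(np.log10(np.percentile(d, 5)),
                     np.log10(np.percentile(d, 50)), 8)
    C = np.array([(d < r).mean() for r in rs])
    slope, _ = np.polyfit(np.log(rs), np.log(C), 1)
    return slope

t = np.linspace(0, 30*np.pi, 800)
print(f"estimated dimension of a sine ~ {corr_dim(np.sin(t)):.2f}")
```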
Web-Based Analysis for Student-Generated Complex Genetic Profiles
ERIC Educational Resources Information Center
Kass, David H.; LaRoe, Robert
2007-01-01
A simple, rapid method for generating complex genetic profiles using Alu-based markers was recently developed for students primarily at the undergraduate level to learn more about forensics and paternity analysis. On the basis of the Cold Spring Harbor Allele Server, which provides an excellent tool for analyzing a single Alu variant, we present a…
ERIC Educational Resources Information Center
Preacher, Kristopher J.; Curran, Patrick J.; Bauer, Daniel J.
2006-01-01
Simple slopes, regions of significance, and confidence bands are commonly used to evaluate interactions in multiple linear regression (MLR) models, and the use of these techniques has recently been extended to multilevel or hierarchical linear modeling (HLM) and latent curve analysis (LCA). However, conducting these tests and plotting the…
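A minimal sketch of the simple-slopes computation for an MLR interaction, on synthetic data: fit y = b0 + b1·x + b2·z + b3·xz by ordinary least squares, then evaluate the slope of y on x, with its standard error, at chosen moderator values z0.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x, z = rng.normal(size=n), rng.normal(size=n)
y = 1 + 0.5*x + 0.3*z + 0.4*x*z + rng.normal(size=n)

# OLS for y = b0 + b1*x + b2*z + b3*x*z
X = np.column_stack([np.ones(n), x, z, x*z])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
s2 = resid @ resid / (n - X.shape[1])
cov = s2 * np.linalg.inv(X.T @ X)

# Simple slope of y on x at moderator value z0, with its standard error
for z0 in (-1.0, 0.0, 1.0):
    slope = b[1] + b[3]*z0
    se = np.sqrt(cov[1, 1] + z0**2 * cov[3, 3] + 2*z0*cov[1, 3])
    print(f"z0={z0:+.1f}: slope={slope:.3f} (SE={se:.3f}), t={slope/se:.2f}")
```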
Video analysis of projectile motion using tablet computers as experimental tools
NASA Astrophysics Data System (ADS)
Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.
2014-01-01
Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.
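A minimal sketch of the analysis step on synthetic "video app" data: fitting a quadratic to the vertical position recovers g from the second-order coefficient of y(t) = y0 + v0·t − (g/2)·t².

```python
import numpy as np

g_true = 9.81
t = np.arange(0, 0.8, 1/30)                        # 30 fps video frames
y = 0.2 + 3.0*t - 0.5*g_true*t**2 + 0.003*np.random.randn(t.size)

# Quadratic coefficient a2 = -g/2, linear coefficient = launch speed v0
a2, v0, y0 = np.polyfit(t, y, 2)
print(f"g ~ {-2*a2:.2f} m/s^2, v0 ~ {v0:.2f} m/s")
```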
Audio signal analysis for tool wear monitoring in sheet metal stamping
NASA Astrophysics Data System (ADS)
Ubhayaratne, Indivarie; Pereira, Michael P.; Xiang, Yong; Rolfe, Bernard F.
2017-02-01
Stamping tool wear can significantly degrade product quality, and hence, online tool condition monitoring is a timely need in many manufacturing industries. Even though a large amount of research has been conducted employing different sensor signals, there is still an unmet demand for a low-cost, easy-to-set-up condition monitoring system. Audio signal analysis is a simple method that has the potential to meet this demand, but has not been previously used for stamping process monitoring. Hence, this paper studies the existence and the significance of the correlation between emitted sound signals and the wear state of sheet metal stamping tools. The corrupting sources generated by the tooling of the stamping press and surrounding machinery have higher amplitudes compared to that of the sound emitted by the stamping operation itself. Therefore, a newly developed semi-blind signal extraction technique was employed as a pre-processing technique to mitigate the contribution of these corrupting sources. The spectral analysis results of the raw and extracted signals demonstrate a significant qualitative relationship between wear progression and the emitted sound signature. This study lays the basis for employing low-cost audio signal analysis in the development of a real-time industrial tool condition monitoring system.
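A hedged illustration of the spectral-analysis step only (not the authors' semi-blind extraction technique): compute a Welch power spectrum of a synthetic press recording and pick out the dominant component above the low-frequency machinery band.

```python
import numpy as np
from scipy.signal import welch

fs = 44_100                                   # typical audio sampling rate
t = np.arange(0, 1.0, 1/fs)
# Synthetic recording: a tonal 'stamping' component buried in shop noise
audio = 0.1*np.sin(2*np.pi*1800*t) + 0.5*np.random.randn(t.size)

f, pxx = welch(audio, fs=fs, nperseg=4096)
mask = f > 500
peak = f[mask][np.argmax(pxx[mask])]
print(f"dominant component above 500 Hz: {peak:.0f} Hz")
```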
Recent Developments in OVERGRID, OVERFLOW-2 and Chimera Grid Tools Scripts
NASA Technical Reports Server (NTRS)
Chan, William M.
2004-01-01
OVERGRID and OVERFLOW-2 feature easy to use multiple-body dynamics. The new features of OVERGRID include a preliminary chemistry interface, standard atmosphere and mass properties calculators, a simple unsteady solution viewer, and a debris tracking interface. Script library development in Chimera Grid Tools has applications in turbopump grid generation. This viewgraph presentation profiles multiple component dynamics, validation test cases for a sphere, cylinder, and oscillating airfoil, and debris analysis.
Bieri, Michael; d'Auvergne, Edward J; Gooley, Paul R
2011-06-01
Investigation of protein dynamics on the ps-ns and μs-ms timeframes provides detailed insight into the mechanisms of enzymes and the binding properties of proteins. Nuclear magnetic resonance (NMR) is an excellent tool for studying protein dynamics at atomic resolution. Analysis of relaxation data using model-free analysis can be a tedious and time consuming process, which requires good knowledge of scripting procedures. The software relaxGUI was developed for fast and simple model-free analysis and is fully integrated into the software package relax. It is written in Python and uses wxPython to build the graphical user interface (GUI) for maximum performance and multi-platform use. This software allows the analysis of NMR relaxation data with ease and the generation of publication quality graphs as well as color coded images of molecular structures. The interface is designed for simple data analysis and management. The software was tested and validated against the command line version of relax.
Analytical and multibody modeling for the power analysis of standing jumps.
Palmieri, G; Callegari, M; Fioretti, S
2015-01-01
Two methods for the power analysis of standing jumps are proposed and compared in this article. The first method is based on a simple analytical formulation which requires as input the coordinates of the center of gravity in three specified instants of the jump. The second method is based on a multibody model that simulates the jumps, processing the data obtained by a three-dimensional (3D) motion capture system and the dynamometric measurements obtained by the force platforms. The multibody model is developed with OpenSim, an open-source software which provides tools for the kinematic and dynamic analyses of 3D human body models. The study is focused on two of the typical tests used to evaluate the muscular activity of lower limbs, which are the counter movement jump and the standing long jump. The comparison between the results obtained by the two methods confirms that the proposed analytical formulation is correct and represents a simple tool suitable for a preliminary analysis of total mechanical work and the mean power exerted in standing jumps.
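A minimal sketch of a center-of-gravity-based estimate in the spirit of the analytical method (the exact formulation is not given in the abstract; all inputs below are hypothetical): the ballistic flight phase gives the takeoff speed, and the push-off work and mean power follow.

```python
import math

m, g = 70.0, 9.81                 # body mass (kg), gravity (m/s^2)
h0, h1, h2 = 0.85, 1.05, 1.40     # CoG at crouch, takeoff, apex (m)
t_push = 0.30                     # push-off duration (s), from timestamps

v_takeoff = math.sqrt(2*g*(h2 - h1))        # flight phase is ballistic
work = m*g*(h1 - h0) + 0.5*m*v_takeoff**2   # raise CoG + kinetic energy
print(f"takeoff speed {v_takeoff:.2f} m/s, work {work:.0f} J, "
      f"mean power {work/t_push:.0f} W")
```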
RSAT 2018: regulatory sequence analysis tools 20th anniversary.
Nguyen, Nga Thi Thuy; Contreras-Moreira, Bruno; Castro-Mondragon, Jaime A; Santana-Garcia, Walter; Ossio, Raul; Robles-Espinoza, Carla Daniela; Bahin, Mathieu; Collombet, Samuel; Vincens, Pierre; Thieffry, Denis; van Helden, Jacques; Medina-Rivera, Alejandra; Thomas-Chollier, Morgane
2018-05-02
RSAT (Regulatory Sequence Analysis Tools) is a suite of modular tools for the detection and the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, including from genome-wide datasets like ChIP-seq/ATAC-seq, (ii) motif scanning, (iii) motif analysis (quality assessment, comparisons and clustering), (iv) analysis of regulatory variations, (v) comparative genomics. Six public servers jointly support 10 000 genomes from all kingdoms. Six novel or refactored programs have been added since the 2015 NAR Web Software Issue, including updated programs to analyse regulatory variants (retrieve-variation-seq, variation-scan, convert-variations), along with tools to extract sequences from a list of coordinates (retrieve-seq-bed), to select motifs from motif collections (retrieve-matrix), and to extract orthologs based on Ensembl Compara (get-orthologs-compara). Three use cases illustrate the integration of new and refactored tools to the suite. This Anniversary update gives a 20-year perspective on the software suite. RSAT is well-documented and available through Web sites, SOAP/WSDL (Simple Object Access Protocol/Web Services Description Language) web services, virtual machines and stand-alone programs at http://www.rsat.eu/.
NASA Astrophysics Data System (ADS)
Henley, E. M.; Pope, E. C. D.
2017-12-01
This commentary concerns recent work on solar wind forecasting by Owens and Riley (2017). The approach taken makes effective use of tools commonly used in terrestrial weather: notably, the generation of an "ensemble" forecast via a simple model, and the application of a "cost-loss" analysis to the resulting probabilistic information, to explore the benefit of this forecast to users with different risk appetites. This commentary aims to highlight these useful techniques to the wider space weather audience and to briefly discuss the general context of application of terrestrial weather approaches to space weather.
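A minimal sketch of the standard cost-loss decision model: a user pays cost C to protect, or risks loss L if the event occurs unprotected, so protecting pays off when the forecast probability exceeds C/L. All numbers below are synthetic, and the forecasts are assumed perfectly reliable.

```python
import numpy as np

rng = np.random.default_rng(1)
p_forecasts = rng.uniform(0, 1, 1000)            # ensemble event probabilities
events = rng.uniform(0, 1, 1000) < p_forecasts   # events drawn reliably

for C, L in [(1, 10), (1, 4), (1, 2)]:
    protect = p_forecasts > C / L                # act when p exceeds C/L
    expense = np.where(protect, C, np.where(events, L, 0.0)).mean()
    print(f"C/L={C/L:.2f}: mean expense per decision = {expense:.2f}")
```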
Preliminary Development of an Object-Oriented Optimization Tool
NASA Technical Reports Server (NTRS)
Pak, Chan-gi
2011-01-01
The National Aeronautics and Space Administration Dryden Flight Research Center has developed a FORTRAN-based object-oriented optimization (O3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. The object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the central executive module and the discipline modules, or both. Six sample optimization problems are presented. The first four sample problems are based on simple mathematical equations; the fifth and sixth problems consider a three-bar truss, which is a classical example in structural synthesis. Instructions for preparing input data for the O3 tool are presented.
Simultaneous Aerodynamic and Structural Design Optimization (SASDO) for a 3-D Wing
NASA Technical Reports Server (NTRS)
Gumbert, Clyde R.; Hou, Gene J.-W.; Newman, Perry A.
2001-01-01
The formulation and implementation of an optimization method called Simultaneous Aerodynamic and Structural Design Optimization (SASDO) is shown as an extension of the Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) method. It is extended by the inclusion of structure element sizing parameters as design variables and Finite Element Method (FEM) analysis responses as constraints. The method aims to reduce the computational expense incurred in performing shape and sizing optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, FEM structural analysis and sensitivity analysis tools. SASDO is applied to a simple, isolated, 3-D wing in inviscid flow. Results show that the method finds the same local optimum as a conventional optimization method with some reduction in the computational cost and without significant modifications to the analysis tools.
User manual for two simple postscript output FORTRAN plotting routines
NASA Technical Reports Server (NTRS)
Nguyen, T. X.
1991-01-01
Graphics is one of the important tools in engineering analysis and design. However, plotting routines that generate output on high quality laser printers normally come in graphics packages, which tend to be expensive and system dependent. These factors become important for small computer systems or desktop computers, especially when only some form of a simple plotting routine is sufficient. With the Postscript language becoming popular, there are more and more Postscript laser printers now available. Simple, versatile, low cost plotting routines that can generate output on high quality laser printers are needed and standard FORTRAN language plotting routines using output in Postscript language seems logical. The purpose here is to explain two simple FORTRAN plotting routines that generate output in Postscript language.
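The PostScript idea is small enough to show directly. A minimal sketch, written in Python rather than FORTRAN and not the authors' routines, that emits a standalone PostScript file drawing a polyline:

```python
import math

def polyline_to_ps(points, path="plot.ps", scale=72):
    """Write a minimal PostScript file drawing a polyline.
    Coordinates are in inches; 1 inch = 72 PostScript points."""
    with open(path, "w") as f:
        f.write("%!PS-Adobe-3.0\n0.5 setlinewidth\nnewpath\n")
        x0, y0 = points[0]
        f.write(f"{x0*scale:.1f} {y0*scale:.1f} moveto\n")
        for x, y in points[1:]:
            f.write(f"{x*scale:.1f} {y*scale:.1f} lineto\n")
        f.write("stroke\nshowpage\n")

# A damped sine, offset onto the page
pts = [(1 + 6*i/100, 5 + math.exp(-3*i/100) * math.sin(12*i/100))
       for i in range(101)]
polyline_to_ps(pts)
```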
Muir, Dylan R; Kampa, Björn M
2014-01-01
Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories.
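A minimal sketch of one analysis named above, the peri-stimulus time histogram, on a stand-in dF/F trace with hypothetical stimulus onset frames (this is the generic computation, not FocusStack's implementation):

```python
import numpy as np

fs = 10.0                                # imaging frame rate, Hz
trace = np.random.randn(3000) * 0.05     # stand-in dF/F time series
stim_frames = np.arange(200, 2800, 300)  # stimulus onsets (frame indices)
pre, post = 10, 30                       # frames before/after each onset

# Align snippets on stimulus onset and average across presentations
snips = np.stack([trace[s - pre:s + post] for s in stim_frames])
psth = snips.mean(axis=0)
t = np.arange(-pre, post) / fs
print(f"PSTH spans {t[0]:.1f}..{t[-1]:.1f} s, peak dF/F = {psth.max():.3f}")
```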
PyFolding: Open-Source Graphing, Simulation, and Analysis of the Biophysical Properties of Proteins.
Lowe, Alan R; Perez-Riba, Albert; Itzhaki, Laura S; Main, Ewan R G
2018-02-06
For many years, curve-fitting software has been heavily utilized to fit simple models to various types of biophysical data. Although such software packages are easy to use for simple functions, they are often expensive and present substantial impediments to applying more complex models or for the analysis of large data sets. One field that is reliant on such data analysis is the thermodynamics and kinetics of protein folding. Over the past decade, increasingly sophisticated analytical models have been generated, but without simple tools to enable routine analysis. Consequently, users have needed to generate their own tools or otherwise find willing collaborators. Here we present PyFolding, a free, open-source, and extensible Python framework for graphing, analysis, and simulation of the biophysical properties of proteins. To demonstrate the utility of PyFolding, we have used it to analyze and model experimental protein folding and thermodynamic data. Examples include: 1) multiphase kinetic folding fitted to linked equations, 2) global fitting of multiple data sets, and 3) analysis of repeat protein thermodynamics with Ising model variants. Moreover, we demonstrate how PyFolding is easily extensible to novel functionality beyond applications in protein folding via the addition of new models. Example scripts to perform these and other operations are supplied with the software, and we encourage users to contribute notebooks and models to create a community resource. Finally, we show that PyFolding can be used in conjunction with Jupyter notebooks as an easy way to share methods and analysis for publication and among research teams. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
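As a hedged illustration of the kind of fit PyFolding automates (this is not its API), the sketch below fits a standard two-state equilibrium unfolding model to synthetic denaturation data with scipy.

```python
import numpy as np
from scipy.optimize import curve_fit

RT = 0.593  # kcal/mol at 25 C

def two_state(D, dG, m):
    """Fraction unfolded for two-state equilibrium unfolding:
    dG(D) = dG - m*D, so fu = 1 / (1 + exp(dG(D)/RT))."""
    return 1.0 / (1.0 + np.exp((dG - m*D) / RT))

# Synthetic denaturation data: dG = 5 kcal/mol, m = 1.8 kcal/mol/M
D = np.linspace(0, 6, 25)
fu = two_state(D, 5.0, 1.8) + 0.02*np.random.randn(D.size)

popt, _ = curve_fit(two_state, D, fu, p0=[4.0, 1.5])
print(f"dG = {popt[0]:.2f} kcal/mol, m = {popt[1]:.2f} kcal/mol/M, "
      f"midpoint = {popt[0]/popt[1]:.2f} M")
```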
Ramu, Chenna
2003-07-01
SIRW (http://sirw.embl.de/) is a World Wide Web interface to the Simple Indexing and Retrieval System (SIR) that is capable of parsing and indexing various flat file databases. In addition it provides a framework for doing sequence analysis (e.g. motif pattern searches) for selected biological sequences through keyword search. SIRW is an ideal tool for the bioinformatics community for searching as well as analyzing biological sequences of interest.
Estimation of kinematic parameters in CALIFA galaxies: no-assumption on internal dynamics
NASA Astrophysics Data System (ADS)
García-Lorenzo, B.; Barrera-Ballesteros, J.; CALIFA Team
2016-06-01
We propose a simple approach to homogeneously estimate kinematic parameters of a broad variety of galaxies (elliptical, spirals, irregulars or interacting systems). This methodology avoids the use of any kinematical model or any assumption on internal dynamics. This simple but novel approach allows us to determine the frequency of kinematic distortions, systemic velocity, kinematic center, and kinematic position angles, which are directly measured from the two-dimensional distributions of radial velocities. We test our analysis tools using the CALIFA Survey.
Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley; Lung, Shun-fat
2008-01-01
An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem are successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.
ERIC Educational Resources Information Center
Onorato, P.; Mascheretti, P.; DeAmbrosis, A.
2012-01-01
In this paper, we describe how simple experiments realizable by using easily found and low-cost materials allow students to explore quantitatively the magnetic interaction thanks to the help of an Open Source Physics tool, the Tracker Video Analysis software. The static equilibrium of a "column" of permanents magnets is carefully investigated by…
Fernandes, Telmo J R; Costa, Joana; Oliveira, M Beatriz P P; Mafra, Isabel
2017-09-01
This work aimed to exploit the use of DNA mini-barcodes combined with high resolution melting (HRM) for the authentication of gadoid species: Atlantic cod (Gadus morhua), Pacific cod (Gadus macrocephalus), Alaska pollock (Theragra chalcogramma) and saithe (Pollachius virens). Two DNA barcode regions, namely cytochrome c oxidase subunit I (COI) and cytochrome b (cytb), were analysed in silico to identify genetic variability among the four species and used, subsequently, to develop a real-time PCR method coupled with HRM analysis. The cytb mini-barcode enabled best discrimination of the target species with a high level of confidence (99.3%). The approach was applied successfully to identify gadoid species in 30 fish-containing foods, 30% of which were not as declared on the label. Herein, a novel approach for rapid, simple and cost-effective discrimination/clustering, as a tool to authenticate Gadidae fish species, according to their genetic relationship, is proposed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Haider, Nadia
2017-01-01
Investigation of genetic variation and phylogenetic relationships among date palm (Phoenix dactylifera L.) cultivars is useful for their conservation and genetic improvement. Various molecular markers such as restriction fragment length polymorphisms (RFLPs), simple sequence repeat (SSR), representational difference analysis (RDA), and amplified fragment length polymorphism (AFLP) have been developed to molecularly characterize date palm cultivars. PCR-based markers random amplified polymorphic DNA (RAPD) and inter-simple sequence repeat (ISSR) are powerful tools to determine the relatedness of date palm cultivars that are difficult to distinguish morphologically. In this chapter, the principles, materials, and methods of RAPD and ISSR techniques are presented. Analysis of data generated from these two techniques and the use of these data to reveal phylogenetic relationships among date palm cultivars are also discussed.
SURE reliability analysis: Program and mathematics
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; White, Allan L.
1988-01-01
The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The computational methods on which the program is based provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.
Discourse analysis in general practice: a sociolinguistic approach.
Nessa, J; Malterud, K
1990-06-01
It is a simple but important fact that as general practitioners we talk to our patients. The quality of the conversation is of vital importance for the outcome of the consultation. The purpose of this article is to discuss a methodological tool borrowed from sociolinguistics--discourse analysis. To assess the suitability of this method for analysis of general practice consultations, the authors have performed a discourse analysis of one single consultation. Our experiences are presented here.
Cloud-Based Tools to Support High-Resolution Modeling (Invited)
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Swain, N.; Christensen, S.
2013-12-01
The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers for using such models on a more routine basis include massive amounts of spatial data that must be processed for each new scenario and lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HT Condor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HT Condor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archiving, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482
Next generation simulation tools: the Systems Biology Workbench and BioSPICE integration.
Sauro, Herbert M; Hucka, Michael; Finney, Andrew; Wellock, Cameron; Bolouri, Hamid; Doyle, John; Kitano, Hiroaki
2003-01-01
Researchers in quantitative systems biology make use of a large number of different software packages for modelling, analysis, visualization, and general data manipulation. In this paper, we describe the Systems Biology Workbench (SBW), a software framework that allows heterogeneous application components--written in diverse programming languages and running on different platforms--to communicate and use each other's capabilities via a fast binary encoded-message system. Our goal was to create a simple, high performance, open-source software infrastructure which is easy to implement and understand. SBW enables applications (potentially running on separate, distributed computers) to communicate via a simple network protocol. The interfaces to the system are encapsulated in client-side libraries that we provide for different programming languages. We describe in this paper the SBW architecture, a selection of current modules, including Jarnac, JDesigner, and SBWMeta-tool, and the close integration of SBW into BioSPICE, which enables both frameworks to share tools and complement and strengthen each other's capabilities.
Improved Analysis of Earth System Models and Observations using Simple Climate Models
NASA Astrophysics Data System (ADS)
Nadiga, B. T.; Urban, N. M.
2016-12-01
Earth system models (ESM) are the most comprehensive tools we have to study climate change and develop climate projections. However, the computational infrastructure required and the cost incurred in running such ESMs precludes direct use of such models in conjunction with a wide variety of tools that can further our understanding of climate. Here we are referring to tools that range from dynamical systems tools that give insight into underlying flow structure and topology to tools that come from various applied mathematical and statistical techniques and are central to quantifying stability, sensitivity, uncertainty and predictability to machine learning tools that are now being rapidly developed or improved. Our approach to facilitate the use of such models is to analyze output of ESM experiments (cf. CMIP) using a range of simpler models that consider integral balances of important quantities such as mass and/or energy in a Bayesian framework. We highlight the use of this approach in the context of the uptake of heat by the world oceans in the ongoing global warming. Indeed, since in excess of 90% of the anomalous radiative forcing due to greenhouse gas emissions is sequestered in the world oceans, the nature of ocean heat uptake crucially determines the surface warming that is realized (cf. climate sensitivity). Nevertheless, ESMs themselves are never run long enough to directly assess climate sensitivity. So, we consider a range of models based on integral balances--balances that have to be realized in all first-principles based models of the climate system including the most detailed state-of-the-art climate simulations. The models range from simple models of energy balance to those that consider dynamically important ocean processes such as the conveyor-belt circulation (Meridional Overturning Circulation, MOC), North Atlantic Deep Water (NADW) formation, Antarctic Circumpolar Current (ACC) and eddy mixing. Results from Bayesian analysis of such models using both ESM experiments and actual observations are presented. One such result points to the importance of direct sequestration of heat below 700 m, a process that is not allowed for in the simple models that have been traditionally used to deduce climate sensitivity.
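A minimal sketch of the simple-model side of this approach: a two-layer energy-balance model in which anomalous forcing F warms a surface layer that exchanges heat with a deep ocean layer. All parameter values are illustrative only, not those used in the study.

```python
# Two-layer ocean energy-balance sketch (illustrative parameters)
C, Cd = 8.0, 100.0      # heat capacities (W yr m^-2 K^-1)
lam, gamma = 1.3, 0.7   # climate feedback, vertical exchange (W m^-2 K^-1)
F = 3.7                 # step forcing, roughly 2xCO2 (W m^-2)

dt, years = 0.1, 200
T = Td = 0.0
for _ in range(int(years / dt)):
    dT = (F - lam*T - gamma*(T - Td)) / C   # surface layer energy balance
    dTd = gamma*(T - Td) / Cd               # deep layer uptake
    T, Td = T + dt*dT, Td + dt*dTd

print(f"warming after {years} yr: surface {T:.2f} K, deep {Td:.2f} K; "
      f"equilibrium {F/lam:.2f} K")
```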
Nonlinear transient analysis via energy minimization
NASA Technical Reports Server (NTRS)
Kamat, M. P.; Knight, N. F., Jr.
1978-01-01
The formulation basis for nonlinear transient analysis of finite element models of structures using energy minimization is provided. Geometric and material nonlinearities are included. The development is restricted to simple one and two dimensional finite elements which are regarded as being the basic elements for modeling full aircraft-like structures under crash conditions. The results indicate the effectiveness of the technique as a viable tool for this purpose.
Nancy Diaz; Dean Apostol
1992-01-01
This publication presents a Landscape Design and Analysis Process, along with some simple methods and tools for describing landscapes and their function. The information is qualitative in nature and highlights basic concepts, but does not address landscape ecology in great depth. Readers are encouraged to consult the list of selected references in Chapter 2 if they...
Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis.
Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E
2018-04-15
Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele. darrell.hurt@nih.gov.
POVME 2.0: An Enhanced Tool for Determining Pocket Shape and Volume Characteristics
2015-01-01
Analysis of macromolecular/small-molecule binding pockets can provide important insights into molecular recognition and receptor dynamics. Since its release in 2011, the POVME (POcket Volume MEasurer) algorithm has been widely adopted as a simple-to-use tool for measuring and characterizing pocket volumes and shapes. We here present POVME 2.0, which is an order of magnitude faster, has improved accuracy, includes a graphical user interface, and can produce volumetric density maps for improved pocket analysis. To demonstrate the utility of the algorithm, we use it to analyze the binding pocket of RNA editing ligase 1 from the unicellular parasite Trypanosoma brucei, the etiological agent of African sleeping sickness. The POVME analysis characterizes the full dynamics of a potentially druggable transient binding pocket and so may guide future antitrypanosomal drug-discovery efforts. We are hopeful that this new version will be a useful tool for the computational- and medicinal-chemist community. PMID:25400521
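A minimal sketch of the grid-based idea behind pocket-volume measurement (not POVME's implementation): count grid points inside an inclusion sphere that do not clash with any protein atom, and multiply by the voxel volume. The toy coordinates below are hypothetical.

```python
import numpy as np

def pocket_volume(atoms, radii, center, region_r, spacing=0.5):
    """Grid estimate of pocket volume: points within the inclusion sphere
    that lie outside every atom's radius, times the voxel volume."""
    g = np.arange(-region_r, region_r + spacing, spacing)
    pts = np.stack(np.meshgrid(g, g, g), axis=-1).reshape(-1, 3) + center
    pts = pts[np.linalg.norm(pts - center, axis=1) <= region_r]
    free = np.ones(len(pts), dtype=bool)
    for a, r in zip(atoms, radii):
        free &= np.linalg.norm(pts - a, axis=1) > r
    return free.sum() * spacing**3

# Toy system: four 'atoms' ringing an empty pocket center (coordinates in A)
atoms = np.array([[3.0, 0, 0], [-3, 0, 0], [0, 3, 0], [0, -3, 0]])
vol = pocket_volume(atoms, [1.7]*4, center=np.zeros(3), region_r=4.0)
print(f"pocket volume ~ {vol:.0f} A^3")
```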
PyPedal, an open source software package for pedigree analysis
USDA-ARS?s Scientific Manuscript database
The open source software package PyPedal (http://pypedal.sourceforge.net/) was first released in 2002, and provided users with a set of simple tools for manipulating pedigrees. Its flexibility has been demonstrated by its use in a number of settings for large and small populations. After substantia...
Automation tools for demonstration of goal directed and self-repairing flight control systems
NASA Technical Reports Server (NTRS)
Agarwal, A. K.
1988-01-01
The coupling of expert systems and control design and analysis techniques is documented to provide a realizable self-repairing flight control system. Key features of such a flight control system are identified and a limited set of rules for a simple aircraft model is presented.
A Land-Use-Planning Simulation Using Google Earth
ERIC Educational Resources Information Center
Bodzin, Alec M.; Cirucci, Lori
2009-01-01
Google Earth (GE) is proving to be a valuable tool in the science classroom for understanding the environment and making responsible environmental decisions (Bodzin 2008). GE provides learners with a dynamic mapping experience using a simple interface with a limited range of functions. This interface makes geospatial analysis accessible and…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-21
... environmental analysis be completed for all major Federal actions significantly affecting the environment. NEPA... simple tool to ensure that project and environmental information is obtained. The questionnaire applies... collection). Affected Public: Business or other for profit organizations; individuals or households; not-for...
Application of Transformations in Parametric Inference
ERIC Educational Resources Information Center
Brownstein, Naomi; Pensky, Marianna
2008-01-01
The objective of the present paper is to provide a simple approach to statistical inference using the method of transformations of variables. We demonstrate performance of this powerful tool on examples of constructions of various estimation procedures, hypothesis testing, Bayes analysis and statistical inference for the stress-strength systems.…
Numerical model of solar dynamic radiator for parametric analysis
NASA Technical Reports Server (NTRS)
Rhatigan, Jennifer L.
1989-01-01
Growth power requirements for Space Station Freedom will be met through addition of 25 kW solar dynamic (SD) power modules. Extensive thermal and power cycle modeling capabilities have been developed which are powerful tools in Station design and analysis, but which prove cumbersome and costly for simple component preliminary design studies. In order to aid in refining the SD radiator to the mature design stage, a simple and flexible numerical model was developed. The model simulates heat transfer and fluid flow performance of the radiator and calculates area, mass, and impact survivability for many combinations of flow tube and panel configurations, fluid and material properties, and environmental and cycle variations.
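A minimal sketch of such a simple radiator model, with illustrative parameters only: the working fluid is marched through panel segments, each assumed to radiate at the local fluid temperature to an effective sink.

```python
sigma = 5.67e-8            # Stefan-Boltzmann constant (W m^-2 K^-4)
mdot, cp = 0.5, 2000.0     # fluid mass flow (kg/s), specific heat (J/kg/K)
eps, A_seg = 0.85, 0.5     # emissivity, radiating area per segment (m^2)
T_env = 250.0              # effective sink temperature (K)

T = 500.0                  # fluid inlet temperature (K)
q_total = 0.0
for _ in range(40):        # march along 40 segments of the flow path
    # Assume segment surface temperature ~ local fluid temperature
    q = eps * sigma * A_seg * (T**4 - T_env**4)
    T -= q / (mdot * cp)   # fluid cools as the segment rejects heat
    q_total += q
print(f"outlet {T:.1f} K, heat rejected {q_total/1e3:.1f} kW")
```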
Implementation of GenePattern within the Stanford Microarray Database.
Hubble, Jeremy; Demeter, Janos; Jin, Heng; Mao, Maria; Nitzberg, Michael; Reddy, T B K; Wymore, Farrell; Zachariah, Zachariah K; Sherlock, Gavin; Ball, Catherine A
2009-01-01
Hundreds of researchers across the world use the Stanford Microarray Database (SMD; http://smd.stanford.edu/) to store, annotate, view, analyze and share microarray data. In addition to providing registered users at Stanford access to their own data, SMD also provides access to public data, and tools with which to analyze those data, to any public user anywhere in the world. Previously, the addition of new microarray data analysis tools to SMD has been limited by available engineering resources, and in addition, the existing suite of tools did not provide a simple way to design, execute and share analysis pipelines, or to document such pipelines for the purposes of publication. To address this, we have incorporated the GenePattern software package directly into SMD, providing access to many new analysis tools, as well as a plug-in architecture that allows users to directly integrate and share additional tools through SMD. In this article, we describe our implementation of the GenePattern microarray analysis software package into the SMD code base. This extension is available with the SMD source code that is fully and freely available to others under an Open Source license, enabling other groups to create a local installation of SMD with an enriched data analysis capability.
A new tool to evaluate postgraduate training posts: the Job Evaluation Survey Tool (JEST).
Wall, David; Goodyear, Helen; Singh, Baldev; Whitehouse, Andrew; Hughes, Elizabeth; Howes, Jonathan
2014-10-02
Three reports in 2013 about healthcare and patient safety in the UK, namely Berwick, Francis and Keogh, have highlighted the need for junior doctors' views about their training experience to be heard. In the UK, the General Medical Council (GMC) quality assures medical training programmes and requires postgraduate deaneries to undertake quality management and monitoring of all training posts in their area. The aim of this study was to develop a simple trainee questionnaire for evaluation of postgraduate training posts based on the GMC UK standards and to look at its reliability and validity, including comparison with a well-established and internationally validated tool, the Postgraduate Hospital Educational Environment Measure (PHEEM). The Job Evaluation Survey Tool (JEST), a fifteen-item job evaluation questionnaire, was drawn up in 2006, piloted with Foundation doctors (2007), field tested with specialist paediatric registrars (2008) and used over a three-year period (2008-11) by Foundation doctors. Statistical analyses including descriptives, reliability, correlation and factor analysis were undertaken, and the JEST was compared with the PHEEM. The JEST had a reliability of 0.91 in the pilot study of 76 Foundation doctors, 0.88 in field testing of 173 paediatric specialist registrars and 0.91 in three years of general use in foundation training with 3367 doctors completing the JEST. Correlation of the JEST with the PHEEM was 0.80 (p < 0.001). Factor analysis showed two factors, a teaching factor and a social and lifestyle one. The JEST has proved to be a simple, valid and reliable evaluation tool in the monitoring and evaluation of postgraduate hospital training posts.
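The reliabilities quoted above are internal-consistency coefficients of the Cronbach's alpha type. A minimal sketch of that computation on hypothetical item responses:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical responses: 8 trainees x 5 Likert items
scores = np.array([[4, 5, 4, 4, 5], [3, 3, 4, 3, 3], [5, 5, 5, 4, 5],
                   [2, 3, 2, 2, 3], [4, 4, 5, 4, 4], [3, 4, 3, 3, 3],
                   [5, 4, 5, 5, 5], [2, 2, 3, 2, 2]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```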
Application of simple negative feedback model for avalanche photodetectors investigation
NASA Astrophysics Data System (ADS)
Kushpil, V. V.
2009-10-01
A simple negative feedback model based on Miller's formula is used to investigate the properties of Avalanche Photodetectors (APDs). The proposed method can be applied to study classical APD as well as new type of devices, which are operating in the Internal Negative Feedback (INF) regime. The method shows a good sensitivity to technological APD parameters making it possible to use it as a tool to analyse various APD parameters. It also allows better understanding of the APD operation conditions. The simulations and experimental data analysis for different types of APDs are presented.
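A minimal sketch of the idea under stated assumptions: Miller's formula M = 1/(1 − (V/Vb)^n) gives the multiplication, and the negative feedback is modeled (as an assumption of this sketch, not necessarily the author's coupling) as a voltage drop across a series resistance proportional to the multiplied photocurrent. All parameter values are illustrative.

```python
def miller_gain(v, vb=50.0, n=3.0):
    """Miller's formula for avalanche multiplication below breakdown Vb."""
    return 1.0 / (1.0 - (v / vb)**n)

def gain_with_feedback(v_bias, i_photo=1e-7, r_fb=1e6, vb=50.0, n=3.0):
    """Fixed point of V_junction = V_bias - M(V)*I_photo*R_feedback."""
    v = v_bias
    for _ in range(200):
        m = miller_gain(v, vb, n)
        v = 0.5 * v + 0.5 * (v_bias - m * i_photo * r_fb)  # damped update
    return miller_gain(v, vb, n)

# Feedback stabilizes the gain as the bias approaches breakdown
for v_bias in (45.0, 48.0, 49.5):
    print(f"V_bias={v_bias:4.1f} V -> M={gain_with_feedback(v_bias):6.1f} "
          f"(no feedback: {miller_gain(v_bias):6.1f})")
```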
Time-lapse and slow-motion tracking of temperature changes: response time of a thermometer
NASA Astrophysics Data System (ADS)
Moggio, L.; Onorato, P.; Gratton, L. M.; Oss, S.
2017-03-01
We propose the use of smartphone-based time-lapse and slow-motion video techniques together with tracking analysis as valuable tools for investigating thermal processes such as the response time of a thermometer. The two simple experimental activities presented here, suitable also for high school and undergraduate students, allow one to measure in a simple yet rigorous way the response time of an alcohol thermometer and show its critical dependence on the properties of the surrounding environment, giving insight into instrument characteristics, heat transfer and thermal equilibrium concepts.
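A minimal sketch of the fitting step, assuming the usual first-order response T(t) = T_env + (T0 − T_env)·exp(−t/τ) and synthetic time-lapse readings:

```python
import numpy as np
from scipy.optimize import curve_fit

def response(t, T_env, T0, tau):
    """First-order thermometer response with time constant tau."""
    return T_env + (T0 - T_env) * np.exp(-t / tau)

# Synthetic time-lapse data: thermometer moved from 20 C into 35 C air
t = np.linspace(0, 120, 40)                          # s
T = response(t, 35.0, 20.0, 18.0) + 0.1*np.random.randn(t.size)

popt, _ = curve_fit(response, t, T, p0=[30.0, 20.0, 10.0])
print(f"estimated response time tau ~ {popt[2]:.1f} s")
```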
Methods and potentials for using satellite image classification in school lessons
NASA Astrophysics Data System (ADS)
Voss, Kerstin; Goetzke, Roland; Hodam, Henryk
2011-11-01
The FIS project - FIS stands for Fernerkundung in Schulen (Remote Sensing in Schools) - aims at a better integration of the topic "satellite remote sensing" in school lessons. According to this, the overarching objective is to teach pupils basic knowledge and fields of application of remote sensing. Despite the growing significance of digital geomedia, the topic "remote sensing" is not broadly supported in schools. Often, the topic is reduced to a short reflection on satellite images and used only for additional illustration of issues relevant for the curriculum. Without addressing the issue of image data, this can hardly contribute to the improvement of the pupils' methodical competences. Because remote sensing covers more than simple, visual interpretation of satellite images, it is necessary to integrate remote sensing methods like preprocessing, classification and change detection. Dealing with these topics often fails because of confusing background information and the lack of easy-to-use software. Based on these insights, the FIS project created different simple analysis tools for remote sensing in school lessons, which enable teachers as well as pupils to be introduced to the topic in a structured way. This functionality as well as the fields of application of these analysis tools will be presented in detail with the help of three different classification tools for satellite image classification.
Kero, Tanja; Lindsjö, Lars; Sörensen, Jens; Lubberink, Mark
2016-08-01
(11)C-PIB PET is a promising non-invasive diagnostic tool for cardiac amyloidosis. Semiautomatic analysis of PET data is now available but it is not known how accurate these methods are for amyloid imaging. The aim of this study was to evaluate the feasibility of one semiautomatic software tool for analysis and visualization of (11)C-PIB left ventricular retention index (RI) in cardiac amyloidosis. Patients with systemic amyloidosis and cardiac involvement (n = 10) and healthy controls (n = 5) were investigated with dynamic (11)C-PIB PET. Two observers analyzed the PET studies with semiautomatic software to calculate the left ventricular RI of (11)C-PIB and to create parametric images. The mean RI at 15-25 min from the semiautomatic analysis was compared with RI based on manual analysis and showed comparable values (0.056 vs 0.054 min(-1) for amyloidosis patients and 0.024 vs 0.025 min(-1) in healthy controls; P = .78) and the correlation was excellent (r = 0.98). Inter-reader reproducibility also was excellent (intraclass correlation coefficient, ICC > 0.98). Parametric polarmaps and histograms made visual separation of amyloidosis patients and healthy controls fast and simple. Accurate semiautomatic analysis of cardiac (11)C-PIB RI in amyloidosis patients is feasible. Parametric polarmaps and histograms make visual interpretation fast and simple.
The SURE Reliability Analysis Program
NASA Technical Reports Server (NTRS)
Butler, R. W.
1986-01-01
The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.
Prospective Molecular Characterization of Burn Wound Colonization: Novel Tools and Analysis
2012-10-01
sequence analysis to identify the genetic characteristics that enable Staphylococcus aureus to progress from simple skin and soft tissue infections ... to sepsis and endocarditis. We are confident that this work will lead to significant advancements in wound care and healing and human microbiome ... of diabetic foot ulcers become infected at some point, with 25% of the infected foot ulcers resulting in lower limb amputation, making wound
Software Tools to Support Research on Airport Departure Planning
NASA Technical Reports Server (NTRS)
Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul
2003-01-01
A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.
Oligonucleotide (GTG)5 as an epidemiological tool in the study of nontuberculous mycobacteria.
Cilliers, F J; Warren, R M; Hauman, J H; Wiid, I J; van Helden, P D
1997-01-01
Analysis of restriction fragment length polymorphisms in the genome of Mycobacterium tuberculosis (DNA fingerprinting) has proved to be a useful epidemiological tool in the study of tuberculosis within populations or communities. However, to date, no similar method has been developed to study the epidemiology of nontuberculous mycobacteria (NTM). In this communication, we report that a simple oligonucleotide repeat, (GTG)5, can be used to accurately genotype all species and strains of NTM tested. We suggest that this technology is an easily applied and accurate tool which can be used for the study of the epidemiology of NTM. PMID:9163479
Coupled Oscillators: Interesting Experiments for High School Students
ERIC Educational Resources Information Center
Kodejška, C.; Lepil, O.; Sedlácková, H.
2018-01-01
This work deals with the experimental demonstration of coupled oscillators using simple tools in the form of mechanically coupled pendulums, magnetically coupled elastic strings or electromagnetic oscillators. For the evaluation of results, the Vernier LabQuest data logger and video analysis in the Tracker program were used. In the first part of…
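A minimal sketch of the underlying physics for the pendulum case: two identical pendulums coupled by a weak spring have normal modes at ω1 = √(g/L) and ω2 = √(g/L + 2k/m), and displacing only one pendulum excites both modes equally, producing beats. Parameter values below are illustrative.

```python
import numpy as np

g, L, k, m = 9.81, 1.0, 0.5, 1.0
w1, w2 = np.sqrt(g/L), np.sqrt(g/L + 2*k/m)   # normal-mode frequencies

t = np.linspace(0, 60, 6000)
# Starting with pendulum 1 displaced = equal mix of both normal modes
th1 = 0.5 * (np.cos(w1*t) + np.cos(w2*t))
th2 = 0.5 * (np.cos(w1*t) - np.cos(w2*t))

beat_period = 2*np.pi / (w2 - w1)             # full energy-transfer cycle
print(f"modes {w1:.2f} and {w2:.2f} rad/s; beat period {beat_period:.1f} s; "
      f"pendulum 2 peak amplitude {abs(th2).max():.2f}")
```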
Semi-quantitative analysis of FT-IR spectra of humic fractions of nine US soils
USDA-ARS?s Scientific Manuscript database
Fourier Transform Infrared Spectroscopy (FT-IR) is a simple and fast tool for characterizing soil organic matter. However, most FT-IR spectra are only analyzed qualitatively. In this work, we prepared mobile humic acid (MHA) and recalcitrant calcium humate (CaHA) from nine soils collected from six ...
Online Reflections about Tinkering in Early Childhood: A Socio-Cultural Analysis
ERIC Educational Resources Information Center
Jane, Beverley
2006-01-01
Science education research predominantly shows that students improve their scientific understandings when they tinker (or pull apart) tools and simple household machines. In this study, the qualitative data collected took the form of online journal entries by final year, female, primary teacher trainees, who reflected upon their early childhood…
Faster than "g", Revisited with High-Speed Imaging
ERIC Educational Resources Information Center
Vollmer, Michael; Mollmann, Klaus-Peter
2012-01-01
The introduction of modern high-speed cameras in physics teaching provides a tool not only for easy visualization, but also for quantitative analysis of many simple though fast occurring phenomena. As an example, we present a very well-known demonstration experiment--sometimes also discussed in the context of falling chimneys--which is commonly…
KIDMAP--A Diagnostic Tool for Teachers.
ERIC Educational Resources Information Center
Lee, Yew Jin; Linacre, John M.; Yeoh, Oon Chye
While assessment is the bread and butter of the teaching profession, its practitioners usually do not extend analysis of test responses beyond simple measures such as facility or discrimination indices in classical test theory. Item response theory (IRT) has much to offer but its nonintuitive content and difficulty make it a formidable obstacle in…
Geib, Scott M; Hall, Brian; Derego, Theodore; Bremer, Forest T; Cannoles, Kyle; Sim, Sheina B
2018-04-01
One of the most overlooked, yet critical, components of a whole genome sequencing (WGS) project is the submission and curation of the data to a genomic repository, most commonly the National Center for Biotechnology Information (NCBI). While large genome centers or genome groups have developed software tools for post-annotation assembly filtering, annotation, and conversion into the NCBI's annotation table format, these tools typically require back-end setup and connection to a Structured Query Language (SQL) database and/or some knowledge of programming (Perl, Python) to implement. With WGS becoming commonplace, genome sequencing projects are moving away from the genome centers and into the ecology or biology lab, where fewer resources are present to support the process of genome assembly curation. To fill this gap, we developed software to assess, filter, and transfer annotation and convert a draft genome assembly and annotation set into the NCBI annotation table (.tbl) format, facilitating submission to the NCBI Genome Assembly database. This software has no dependencies, is compatible across platforms, and utilizes a simple command to perform a variety of simple and complex post-analysis, pre-NCBI submission WGS project tasks. The Genome Annotation Generator is a consistent and user-friendly bioinformatics tool that can be used to generate a .tbl file that is consistent with the NCBI submission pipeline. The Genome Annotation Generator achieves the goal of providing a publicly available tool that will facilitate the submission of annotated genome assemblies to the NCBI. It is useful for any individual researcher or research group that wishes to submit a genome assembly of their study system to the NCBI.
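As a hedged illustration of the .tbl target format (not the Genome Annotation Generator's code): the layout below follows the published NCBI 5-column feature-table convention as I understand it, with a ">Feature" header, tab-separated start/stop/key lines, and tab-indented qualifier lines; the annotations are hypothetical, and current NCBI documentation should be checked before real use.

```python
# Hypothetical annotations: (start, stop, feature_key, qualifiers)
features = [
    (1, 1500, "gene", {"locus_tag": "ABC_0001"}),
    (1, 1500, "mRNA", {"product": "hypothetical protein"}),
    (101, 1300, "CDS", {"product": "hypothetical protein",
                        "protein_id": "gnl|center|ABC_0001"}),
]

with open("example.tbl", "w") as out:
    out.write(">Feature scaffold_1\n")          # one header per sequence
    for start, stop, key, quals in features:
        out.write(f"{start}\t{stop}\t{key}\n")  # columns 1-3: span and key
        for name, value in quals.items():
            out.write(f"\t\t\t{name}\t{value}\n")  # columns 4-5: qualifiers
```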
Causal Relation Analysis Tool of the Case Study in the Engineer Ethics Education
NASA Astrophysics Data System (ADS)
Suzuki, Yoshio; Morita, Keisuke; Yasui, Mitsukuni; Tanada, Ichirou; Fujiki, Hiroyuki; Aoyagi, Manabu
In engineering ethics education, the virtual experiencing of dilemmas is essential. Learning through the case study method is a particularly effective means. Many case studies are, however, difficult to deal with because they often include many complex causal relationships and social factors. It would thus be convenient if there were a tool that could analyze the factors of a case example and organize them into a hierarchical structure to get a better understanding of the whole picture. The tool that was developed applies a cause-and-effect matrix and simple graph theory. It analyzes the causal relationship between facts in a hierarchical structure and organizes complex phenomena. The effectiveness of this tool is shown by presenting an actual example.
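A minimal sketch of the kind of cause-and-effect organization described, as a toy Python script (the factor names and encoding are hypothetical, not the authors' implementation): factors are linked by "contributes to" edges and then layered by their depth in the causal chain.

    # Toy causal-relation layering: each factor's level is one more than the
    # deepest factor that causes it, yielding a hierarchical picture.
    causes_of = {  # hypothetical case-study encoding: effect -> list of causes
        "accident": ["design flaw", "ignored warning"],
        "ignored warning": ["schedule pressure"],
        "design flaw": [],
        "schedule pressure": [],
    }

    def level(factor):
        parents = causes_of.get(factor, [])
        return 0 if not parents else 1 + max(level(p) for p in parents)

    for f in sorted(causes_of, key=level):
        print(level(f), f)   # root causes print at level 0, outcomes last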
gHRV: Heart rate variability analysis made easy.
Rodríguez-Liñares, L; Lado, M J; Vila, X A; Méndez, A J; Cuesta, P
2014-08-01
In this paper, the gHRV software tool is presented. It is a simple, free and portable tool developed in Python for analysing heart rate variability. It includes a graphical user interface and it can import files in multiple formats, analyse time intervals in the signal, test statistical significance and export the results. This paper also contains, as an example of use, a clinical analysis performed with the gHRV tool, namely to determine whether the heart rate variability indexes change across different stages of sleep. Results from tests completed by researchers who have tried gHRV are also reported: in general the application was positively valued and the results reflect a high level of satisfaction. gHRV is in continuous development and new versions will include suggestions made by testers. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
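For context, the standard time-domain HRV indexes such a tool computes can be sketched in a few lines of Python (toy RR series; not gHRV's code):

    import numpy as np

    rr = np.array([812.0, 790.0, 805.0, 821.0, 797.0, 810.0])  # RR intervals in ms
    sdnn = rr.std(ddof=1)                              # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))         # beat-to-beat variability
    pnn50 = 100.0 * np.mean(np.abs(np.diff(rr)) > 50)  # % successive diffs > 50 ms
    print(f"SDNN={sdnn:.1f} ms  RMSSD={rmssd:.1f} ms  pNN50={pnn50:.0f}%")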
TableViewer for Herschel Data Processing
NASA Astrophysics Data System (ADS)
Zhang, L.; Schulz, B.
2006-07-01
The TableViewer utility is a GUI tool written in Java to support interactive data processing and analysis for the Herschel Space Observatory (Pilbratt et al. 2001). The idea was inherited from a prototype written in IDL (Schulz et al. 2005). It allows users to graphically view and analyze tabular data organized in columns with equal numbers of rows. It can be run either as a standalone application, where data access is restricted to FITS (FITS 1999) files only, or from the Quick Look Analysis (QLA) or Interactive Analysis (IA) command line, from which objects are also accessible. The graphic display is very versatile, allowing plots in either linear or log scales. Zooming, panning, and changing data columns are performed rapidly using a group of navigation buttons. Selecting and deselecting fields of data points controls the input to simple analysis tasks such as building a statistics table or generating power spectra. The binary data stored in a TableDataset^1, a Product, or in FITS files can also be displayed as tabular data, where values in individual cells can be modified. TableViewer provides several processing utilities which, besides calculating statistics for all or selected channels and computing power spectra, allow users to convert/repair datasets by changing the unit name of data columns and by modifying data values in columns with a simple calculator tool. Interactively selected data can be separated out, and modified data sets can be saved to FITS files. The tool will be especially helpful in the early phases of Herschel data analysis, when quick access to the contents of data products is important. ^1 TableDataset and Product are Java classes defined in herschel.ia.dataset.
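The per-column statistics and power-spectrum tasks described above are easy to picture in a few lines (illustrative NumPy, not the Java implementation):

    import numpy as np

    t = np.linspace(0.0, 10.0, 1000)                       # time column in seconds
    col = np.sin(2 * np.pi * 1.5 * t) + 0.1 * np.random.randn(t.size)

    stats = {"mean": col.mean(), "std": col.std(), "min": col.min(), "max": col.max()}
    freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
    power = np.abs(np.fft.rfft(col)) ** 2                  # simple periodogram
    print(stats, "peak at", freqs[power.argmax()], "Hz")   # ~1.5 Hz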
Practical thoughts on cost-benefit analysis and health services.
Burchell, A; Weeden, R
1982-08-01
Cost-benefit analysis is fast becoming--if it is not already--an essential tool in decision making. It is, however, a complex subject, and one in which few doctors have been trained. This paper offers practical thoughts on the art of cost-benefit analysis, and is written for clinicians and other medical specialists who, though inexpert in the techniques of accountancy, nevertheless wish to carry out their own simple analyses in a manner that will enable them, and others, to take effective decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Fritz, John Floren
2013-08-27
Minimega is a simple emulytics platform for creating testbeds of networked devices. The platform consists of easily deployable tools to facilitate bringing up large networks of virtual machines including Windows, Linux, and Android. Minimega attempts to allow experiments to be brought up quickly with nearly no configuration. Minimega also includes tools for simple cluster management, as well as tools for creating Linux based virtual machine images.
Simple methods of exploiting the underlying structure of rule-based systems
NASA Technical Reports Server (NTRS)
Hendler, James
1986-01-01
Much recent work in the field of expert systems research has aimed at exploiting the underlying structure of the rule base for purposes of analysis. Such techniques as Petri nets and GAGs have been proposed as representational structures that will allow complete analysis. Much has been made of proving isomorphisms between the rule bases and the mechanisms, and of examining the theoretical power of this analysis. In this paper we describe some early work on a new system which has much simpler (and thus, one hopes, more easily achieved) aims and less formality. The technique being examined is a very simple one: OPS5 programs are analyzed in a purely syntactic way and an FSA description is generated. In this paper we describe the technique and some user interface tools which exploit this structure.
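The flavor of such a purely syntactic analysis can be sketched as follows (hypothetical rule encoding in Python; OPS5 syntax itself is not shown): a rule contributes a transition wherever its actions can enable another rule's conditions.

    # Toy syntactic rule-graph extraction: edge r1 -> r2 if an action of r1
    # asserts a fact that appears among r2's conditions.
    rules = {  # hypothetical rule base: name -> (conditions, asserted facts)
        "r1": ({"start"}, {"goal-set"}),
        "r2": ({"goal-set"}, {"plan-ready"}),
        "r3": ({"plan-ready"}, {"done"}),
    }

    edges = [(a, b)
             for a, (_, acts) in rules.items()
             for b, (conds, _) in rules.items()
             if acts & conds]
    print(edges)   # [('r1', 'r2'), ('r2', 'r3')] -- a simple FSA-like chain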
Combustion and Magnetohydrodynamic Processes in Advanced Pulse Detonation Rocket Engines
2012-10-01
use of high-order numerical methods can also be a powerful tool in the analysis of such complex flows, but we need to understand the interaction of... ...dimensional effects with complex reaction kinetics, the simple one-dimensional detonation structure provides a rich spectrum of dynamical features which are...
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We report on the real-time creation of an application for hands-on neurophysiology in an advanced undergraduate teaching laboratory. Enabled by the rapid software development tools included in the Matlab technical computing environment (The Mathworks, Natick, MA), a team consisting of a neurophysiology educator and a biophysicist trained as an electrical engineer worked with a course of approximately 15 students from engineering and biology backgrounds. The result is the powerful freeware data acquisition and analysis environment, "g-PRIME." The software was developed from week to week in response to curriculum demands and student feedback. The program evolved from a simple software oscilloscope, enabling RC circuit analysis, to a suite of tools supporting analysis of neuronal excitability and synaptic transmission in invertebrate model systems. The program has subsequently expanded in application to university courses, research, and high school projects in the US and abroad as free courseware.
NASA Astrophysics Data System (ADS)
Sessa, Francesco; D'Angelo, Paola; Migliorati, Valentina
2018-01-01
In this work we have developed an analytical procedure to identify metal ion coordination geometries in liquid media based on the calculation of Combined Distribution Functions (CDFs) starting from Molecular Dynamics (MD) simulations. CDFs provide a fingerprint which can be easily and unambiguously assigned to a reference polyhedron. The CDF analysis has been tested on five systems and has proven to reliably identify the correct geometries of several ion coordination complexes. This tool is simple and general and can be efficiently applied to different MD simulations of liquid systems.
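The CDF construction lends itself to a compact sketch (toy data in Python; not the authors' procedure): two coordination descriptors from each MD frame are binned into a two-dimensional histogram whose peak pattern is the geometric fingerprint.

    import numpy as np

    rng = np.random.default_rng(0)
    distances = rng.normal(2.4, 0.1, 5000)   # ion-oxygen distances (angstrom), toy values
    angles = rng.normal(90.0, 8.0, 5000)     # O-ion-O angles (degrees), toy values

    cdf, d_edges, a_edges = np.histogram2d(distances, angles, bins=(50, 60))
    # The location of the peaks in `cdf` is what gets matched against
    # reference coordination polyhedra.
    print(cdf.max())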
Meta-analyzing dependent correlations: an SPSS macro and an R script.
Cheung, Shu Fai; Chan, Darius K-S
2014-06-01
The presence of dependent correlation is a common problem in meta-analysis. Cheung and Chan (2004, 2008) have shown that samplewise-adjusted procedures perform better than the more commonly adopted simple within-sample mean procedures. However, samplewise-adjusted procedures have rarely been applied in meta-analytic reviews, probably due to the lack of suitable ready-to-use programs. In this article, we compare the samplewise-adjusted procedures with existing procedures to handle dependent effect sizes, and present the samplewise-adjusted procedures in a way that will make them more accessible to researchers conducting meta-analysis. We also introduce two tools, an SPSS macro and an R script, that researchers can apply to their meta-analyses; these tools are compatible with existing meta-analysis software packages.
Perl One-Liners: Bridging the Gap Between Large Data Sets and Analysis Tools.
Hokamp, Karsten
2015-01-01
Computational analyses of biological data are becoming increasingly powerful, and researchers intending to carry out their own analyses can often choose from a wide array of tools and resources. However, their application might be obstructed by the wide variety of different data formats that are in use, from standard, commonly used formats to output files from high-throughput analysis platforms. The latter are often too large to be opened, viewed, or edited by standard programs, potentially leading to a bottleneck in the analysis. Perl one-liners provide a simple solution to quickly reformat, filter, and merge data sets in preparation for downstream analyses. This chapter presents example code that can be easily adjusted to meet individual requirements. An online version is available at http://bioinf.gen.tcd.ie/pol.
Divide and Conquer (DC) BLAST: fast and easy BLAST execution within HPC environments
Yim, Won Cheol; Cushman, John C.
2017-07-22
Bioinformatics is currently faced with very large-scale data sets that lead to computational jobs, especially sequence similarity searches, that can take absurdly long times to run. For example, the National Center for Biotechnology Information (NCBI) Basic Local Alignment Search Tool (BLAST and BLAST+) suite, which is by far the most widely used tool for rapid similarity searching among nucleic acid or amino acid sequences, is highly central processing unit (CPU) intensive. While the BLAST suite of programs performs searches very rapidly, it has the potential to be accelerated. In recent years, distributed computing environments have become more widely accessible and used due to the increasing availability of high-performance computing (HPC) systems. Therefore, simple solutions for data parallelization are needed to expedite BLAST and other sequence analysis tools. However, existing software for parallel sequence similarity searches often requires extensive computational experience and skill on the part of the user. In order to accelerate BLAST and other sequence analysis tools, Divide and Conquer BLAST (DCBLAST) was developed to perform NCBI BLAST searches within a cluster, grid, or HPC environment by using a query sequence distribution approach. Scaling from one (1) to 256 CPU cores resulted in significant improvements in processing speed. Thus, DCBLAST dramatically accelerates the execution of BLAST searches using a simple, accessible, robust, and parallel approach. DCBLAST works across multiple nodes automatically and it overcomes the speed limitation of single-node BLAST programs. DCBLAST can be used on any HPC system, can take advantage of hundreds of nodes, and has no output limitations. Thus, this freely available tool simplifies distributed computation pipelines to facilitate the rapid discovery of sequence similarities between very large data sets.
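The query-distribution idea is simple to picture: split the input FASTA into chunks and submit each as its own BLAST job. A minimal Python sketch (illustrative only; the DCBLAST tool also handles scheduler submission and result merging):

    # Split a FASTA file into n_chunks round-robin chunks, one per BLAST job.
    def split_fasta(path, n_chunks):
        records, header, seq = [], None, []
        with open(path) as fh:
            for line in fh:
                if line.startswith(">"):
                    if header:
                        records.append((header, "".join(seq)))
                    header, seq = line.strip(), []
                else:
                    seq.append(line.strip())
            if header:
                records.append((header, "".join(seq)))
        for i in range(n_chunks):
            with open(f"chunk_{i}.fasta", "w") as out:
                for h, s in records[i::n_chunks]:
                    out.write(f"{h}\n{s}\n")

    # Each chunk_i.fasta is then submitted as an independent blastn/blastp job.
    split_fasta("queries.fasta", 8)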
Simple morphological control over functional diversity of SERS materials
NASA Astrophysics Data System (ADS)
Semenova, A. A.; Goodilin, E. A.
2018-03-01
Nowadays, surface-enhanced Raman spectroscopy (SERS) is becoming a promising universal low-cost and real-time tool in biomedical applications, medical screening and forensic analysis, allowing for the detection of various molecules below nanomolar concentrations. Silver nanoparticles and nanostructures have proven to be a common choice for SERS measurements due to their tunable plasmon resonance, high stability and facile fabrication methods. However, a proper design of silver-based nanomaterials for highly sensitive SERS applications still remains a challenge. In this work, effective and simple preparation methods for various silver nanostructures are proposed and systematically developed using aqueous diamminesilver(I) hydroxide as a precursor.
Ganalyzer: A tool for automatic galaxy image analysis
NASA Astrophysics Data System (ADS)
Shamir, Lior
2011-05-01
Ganalyzer is a model-based tool that automatically analyzes and classifies galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large datasets of galaxy images collected by autonomous sky surveys such as SDSS, LSST or DES.
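The radial intensity plot at the heart of that pipeline can be sketched compactly (illustrative Python; not the Ganalyzer source):

    import numpy as np

    def radial_profile(image, cx, cy, n_bins=50):
        # Mean pixel intensity as a function of distance from the galaxy center.
        y, x = np.indices(image.shape)
        r = np.hypot(x - cx, y - cy)
        bins = np.linspace(0.0, r.max(), n_bins + 1)
        idx = np.clip(np.digitize(r.ravel(), bins) - 1, 0, n_bins - 1)
        sums = np.bincount(idx, weights=image.ravel(), minlength=n_bins)
        counts = np.bincount(idx, minlength=n_bins)
        return sums / np.maximum(counts, 1)

    # Peaks in this profile, and the slopes around them, feed the
    # spirality measurement that drives the morphological classification.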
NASA Astrophysics Data System (ADS)
Avila, Edward R.
The Electric Insertion Transfer Experiment (ELITE) is an Air Force Advanced Technology Transition Demonstration which is being executed as a cooperative Research and Development Agreement between the Phillips Lab and TRW. The objective is to build, test, and fly a solar-electric orbit transfer and orbit maneuvering vehicle, as a precursor to an operational electric orbit transfer vehicle (EOTV). This paper surveys some of the analysis tools used to do parametric studies and discusses the study results. The primary analysis tool was the Electric Vehicle Analyzer (EVA) developed by the Phillips Lab and modified by The Aerospace Corporation. It uses a simple orbit averaging approach to model low-thrust transfer performance, and runs in a PC environment. The assumptions used in deriving the EVA math model are presented. This tool and others surveyed were used to size the solar array power required for the spacecraft, and develop a baseline mission profile that meets the requirements of the ELITE mission.
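For orientation, a textbook orbit-averaged estimate for a continuous low-thrust transfer between coplanar circular orbits (a standard approximation, not necessarily EVA's exact model) is:

    $$ \Delta v \approx \left| \sqrt{\mu/a_1} - \sqrt{\mu/a_2} \right|, \qquad t_{\text{transfer}} \approx \frac{\Delta v}{T/m} $$

where a_1 and a_2 are the initial and final orbit radii, T is the thrust and m the spacecraft mass; the near-linear trade between available power (thrust) and trip time is what drives the solar-array sizing studies described above.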
Evaluation of interaction dynamics of concurrent processes
NASA Astrophysics Data System (ADS)
Sobecki, Piotr; Białasiewicz, Jan T.; Gross, Nicholas
2017-03-01
The purpose of this paper is to present wavelet tools that enable the detection of temporal interactions of concurrent processes. In particular, the determination of interaction coherence of time-varying signals is achieved using a complex continuous wavelet transform. The paper uses an electrocardiogram (ECG) and seismocardiogram (SCG) data set to demonstrate multiple continuous wavelet analysis techniques based on the Morlet wavelet transform. A MATLAB Graphical User Interface (GUI), developed in the reported research to assist in quick and simple data analysis, is presented. These software tools can discover the interaction dynamics of time-varying signals, hence they can reveal their correlation in phase and amplitude, as well as their non-linear interconnections. The user-friendly MATLAB GUI makes the software straightforward to apply: the user loads the two processes under investigation, chooses the required processing parameters, and then performs the analysis. The software is a useful tool for researchers who need to investigate the interaction dynamics of concurrent processes.
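The underlying computation can be sketched outside MATLAB as well; a minimal Python analogue of a Morlet-based cross-wavelet analysis (toy signals, not the paper's GUI):

    import numpy as np
    from scipy import signal

    fs = 250.0
    t = np.arange(0, 10, 1 / fs)
    ecg_like = np.sin(2 * np.pi * 1.2 * t)
    scg_like = np.sin(2 * np.pi * 1.2 * t + 0.7) + 0.2 * np.random.randn(t.size)

    freqs = np.linspace(0.5, 5.0, 40)
    widths = 6.0 * fs / (2 * np.pi * freqs)      # Morlet scale for each frequency
    W1 = signal.cwt(ecg_like, signal.morlet2, widths, w=6.0)
    W2 = signal.cwt(scg_like, signal.morlet2, widths, w=6.0)
    cross = W1 * np.conj(W2)   # phase of `cross` tracks the time-varying phase relation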
eCAF: A New Tool for the Conversational Analysis of Electronic Communication
ERIC Educational Resources Information Center
Duncan-Howell, Jennifer
2009-01-01
Electronic communication is characteristically concerned with "the message" (eM), those who send them (S), and those who receive and read them (R). This relationship could be simplified into the equation eM = S + R. When this simple equation is applied to electronic communication, several elements are added that make this straightforward act of…
The Analysis of the Blogs Created in a Blended Course through the Reflective Thinking Perspective
ERIC Educational Resources Information Center
Dos, Bulent; Demir, Servet
2013-01-01
Blogs have evolved from simple online diaries to communication tools with the capacity to engage people in collaboration, knowledge sharing, reflection and debate. Blog archives can be a source of information about student learning, providing a basis for ongoing feedback and redesign of learning activities. Previous studies show that blogs can…
Random Amplified Polymorphic DNA PCR in the Teaching of Molecular Epidemiology
ERIC Educational Resources Information Center
Reinoso, Elina B.; Bettera, Susana G.
2016-01-01
In this article, we describe a basic practical laboratory designed for fifth-year undergraduate students of Microbiology as part of the Epidemiology course. This practice provides the students with the tools for molecular epidemiological analysis of pathogenic microorganisms using a rapid and simple PCR technique. The aim of this work was to assay…
Time-Lapse and Slow-Motion Tracking of Temperature Changes: Response Time of a Thermometer
ERIC Educational Resources Information Center
Moggio, L.; Onorato, P.; Gratton, L. M.; Oss, S.
2017-01-01
We propose the use of a smartphone based time-lapse and slow-motion video techniques together with tracking analysis as valuable tools for investigating thermal processes such as the response time of a thermometer. The two simple experimental activities presented here, suitable also for high school and undergraduate students, allow one to measure…
NASA Astrophysics Data System (ADS)
Zetterlind, V.; Pledgie, S.
2009-12-01
Low-cost, low-latency, robust geolocation and display of aerial video is a common need for a wide range of earth observing as well as emergency response and security applications. While hardware costs for aerial video collection systems, GPS, and inertial sensors have been decreasing, software costs for geolocation algorithms and reference imagery/DTED remain expensive and highly proprietary. As part of a Federal Small Business Innovative Research project, MosaicATM and EarthNC, Inc. have developed a simple geolocation system based on the Google Earth API and Google's built-in DTED and reference imagery libraries. This system geolocates aerial video from platform and camera position, attitude, and field-of-view metadata, using the geometric photogrammetric principle of ray intersection with DTED. Geolocated video can be directly rectified and viewed in the Google Earth API during processing. Work is underway to extend the geolocation code to NASA World Wind for additional flexibility and a fully open-source platform. In addition to this airborne remote sensing work, MosaicATM has developed the Surface Operations Data Analysis and Adaptation (SODAA) tool, funded by NASA Ames, which supports analysis of airport surface operations to optimize aircraft movements and reduce fuel burn and delays. As part of SODAA, MosaicATM and EarthNC, Inc. have developed powerful tools to display national airspace data and time-animated 3D flight tracks in Google Earth for 4D analysis. The SODAA tool can convert raw-format flight track data, FAA National Flight Data (NFD), and FAA 'Adaptation' airport surface data to a spatial database representation and then to Google Earth KML. The SODAA client provides users with a simple graphical interface through which to generate queries with a wide range of predefined and custom filters, plot results, and export for playback in Google Earth in conjunction with NFD and Adaptation overlays.
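The ray-intersection principle reduces, in its simplest form, to marching a camera ray downward until it meets the terrain surface. A hedged Python sketch (flat local coordinate frame and a hypothetical elevation lookup; real systems work in geodetic coordinates with proper DTED sampling):

    import numpy as np

    def terrain_elevation(x, y):                 # hypothetical DTED lookup
        return 100.0 + 5.0 * np.sin(0.01 * x)

    def geolocate(origin, direction, step=1.0, max_range=10_000.0):
        p = np.asarray(origin, dtype=float)
        d = np.asarray(direction, dtype=float)
        d /= np.linalg.norm(d)
        for _ in range(int(max_range / step)):
            p = p + step * d
            if p[2] <= terrain_elevation(p[0], p[1]):
                return p                         # first ground intersection
        return None

    # Camera at 1500 m altitude, looking forward and down.
    print(geolocate([0.0, 0.0, 1500.0], [0.8, 0.1, -0.3]))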
A Fan-tastic Quantitative Exploration of Ohm's Law
NASA Astrophysics Data System (ADS)
Mitchell, Brandon; Ekey, Robert; McCullough, Roy; Reitz, William
2018-02-01
Teaching simple circuits and Ohm's law to students in the introductory classroom has been extensively investigated through the common practice of using incandescent light bulbs to help students develop a conceptual foundation before moving on to quantitative analysis. However, the bulb filaments' resistance has a large temperature dependence, which makes them less suitable as a tool for quantitative analysis. Some instructors show that light bulbs do not obey Ohm's law either outright or through inquiry-based laboratory experiments. Others avoid the subject altogether by using bulbs strictly for qualitative purposes and then later switching to resistors for a numerical analysis, or by changing the operating conditions of the bulb so that it is "barely" glowing. It seems incongruous to develop a conceptual basis for the behavior of simple circuits using bulbs only to later reveal that they do not follow Ohm's law. Recently, small computer fans were proposed as a suitable replacement of bulbs for qualitative analysis of simple circuits where the current is related to the rotational speed of the fans. In this contribution, we demonstrate that fans can also be used for quantitative measurements and provide suggestions for successful classroom implementation.
Molecular beacon sequence design algorithm.
Monroe, W Todd; Haselton, Frederick R
2003-01-01
A method based on Web-based tools is presented to design optimally functioning molecular beacons. Molecular beacons, fluorogenic hybridization probes, are a powerful tool for the rapid and specific detection of a particular nucleic acid sequence. However, their synthesis costs can be considerable. Since molecular beacon performance is based on its sequence, it is imperative to rationally design an optimal sequence before synthesis. The algorithm presented here uses simple Microsoft Excel formulas and macros to rank candidate sequences. This analysis is carried out using mfold structural predictions along with other free Web-based tools. For smaller laboratories where molecular beacons are not the focus of research, the public domain algorithm described here may be usefully employed to aid in molecular beacon design.
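The ranking idea can be illustrated with a deliberately simplified Python toy (the criteria below, GC content near 50% and a closed hairpin stem, are illustrative stand-ins; the published algorithm ranks candidates using mfold structural predictions inside Excel):

    def gc_fraction(seq):
        return (seq.count("G") + seq.count("C")) / len(seq)

    def revcomp(seq):
        return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

    def score(seq, stem=5):
        stem_ok = seq[:stem] == revcomp(seq[-stem:])   # beacon hairpin stem closes
        return (1 if stem_ok else 0, -abs(gc_fraction(seq) - 0.5))

    candidates = ["GCGAGCTAGGTTAACCTAGGCTCGC", "GCGAGTTTTTTTTTTTTTTTCTCGC"]
    print(max(candidates, key=score))   # prefers the balanced-GC candidate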
fluff: exploratory analysis and visualization of high-throughput sequencing data
Georgiou, Georgios
2016-01-01
Summary. In this article we describe fluff, a software package that allows for simple exploration, clustering and visualization of high-throughput sequencing data mapped to a reference genome. The package contains three command-line tools to generate publication-quality figures in an uncomplicated manner using sensible defaults. Genome-wide data can be aggregated, clustered and visualized in a heatmap, according to different clustering methods. This includes a predefined setting to identify dynamic clusters between different conditions or developmental stages. Alternatively, clustered data can be visualized in a bandplot. Finally, fluff includes a tool to generate genomic profiles. As command-line tools, the fluff programs can easily be integrated into standard analysis pipelines. The installation is straightforward and documentation is available at http://fluff.readthedocs.org. Availability. fluff is implemented in Python and runs on Linux. The source code is freely available for download at https://github.com/simonvh/fluff. PMID:27547532
Rong, Y; Padron, A V; Hagerty, K J; Nelson, N; Chi, S; Keyhani, N O; Katz, J; Datta, S P A; Gomes, C; McLamore, E S
2018-04-30
Impedimetric biosensors for measuring small molecules based on weak/transient interactions between bioreceptors and target analytes are a challenge for detection electronics, particularly in field studies or in the analysis of complex matrices. Protein-ligand binding sensors have enormous potential for biosensing, but achieving accuracy in complex solutions is a major challenge. There is a need for simple post hoc analytical tools that are not computationally expensive, yet provide near-real-time feedback on data derived from impedance spectra. Here, we show the use of a simple, open source support vector machine learning algorithm for analyzing impedimetric data in lieu of using equivalent circuit analysis. We demonstrate two different protein-based biosensors to show that the tool can be used for various applications. We conclude with a mobile phone-based demonstration focused on the measurement of acetone, an important biomarker related to the onset of diabetic ketoacidosis. In all conditions tested, the open source classifier was capable of performing as well as, or better than, the equivalent circuit analysis for characterizing weak/transient interactions between a model ligand (acetone) and a small chemosensory protein derived from the tsetse fly. In addition, the tool has a low computational requirement, facilitating use in mobile acquisition systems such as mobile phones. The protocol is deployed through Jupyter notebook (an open source computing environment available for mobile phone, tablet or computer use) and the code was written in Python. For each of the applications, we provide step-by-step instructions in English, Spanish, Mandarin and Portuguese to facilitate widespread use. All code was based on scikit-learn, an open source machine learning library for Python, and was run in Jupyter notebook, an open-source web application for Python. The tool can easily be integrated with mobile biosensor equipment for rapid detection, facilitating use by a broad range of impedimetric biosensor users. This post hoc analysis tool can serve as a launchpad for the convergence of nanobiosensors in planetary health monitoring applications based on mobile phone hardware.
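A hedged sketch of the approach described, an open-source scikit-learn classifier applied to impedance spectra in place of equivalent-circuit fitting (synthetic data; not the authors' notebook):

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    n, n_freq = 200, 50
    baseline = rng.normal(0.0, 1.0, (n, n_freq))     # |Z| across frequencies
    labels = rng.integers(0, 2, n)                   # 0 = blank, 1 = analyte bound
    spectra = baseline + labels[:, None] * 0.8       # binding shifts the spectrum

    X_train, X_test, y_train, y_test = train_test_split(spectra, labels, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))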
Comparison and correlation of Simple Sequence Repeats distribution in genomes of Brucella species
Kiran, Jangampalli Adi Pradeep; Chakravarthi, Veeraraghavulu Praveen; Kumar, Yellapu Nanda; Rekha, Somesula Swapna; Kruti, Srinivasan Shanthi; Bhaskar, Matcha
2011-01-01
Computational genomics is one of the important tools for understanding the distribution of closely related genomes, including simple sequence repeats (SSRs), in an organism, which gives valuable information regarding genetic variation. The central objective of the present study was to screen the SSRs distributed in coding and non-coding regions among different human Brucella species, which are involved in a range of pathological disorders. Computational analysis of the SSRs in Brucella indicates few deviations from expected random models. Statistical analysis also reveals that trinucleotide SSRs are overrepresented and tetranucleotide SSRs underrepresented in Brucella genomes. From the data, it can be suggested that the overrepresented trinucleotide SSRs in genomic and coding regions might be responsible for generating functional variation in the expressed proteins, which in turn may lead to differences in the pathogenicity, virulence determinants, stress-response genes, transcription regulators and host-adaptation proteins of Brucella genomes. Abbreviations: SSRs - Simple Sequence Repeats, ORFs - Open Reading Frames. PMID:21738309
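Screening for SSRs of a given unit length reduces to a short pattern search; a minimal Python sketch (illustrative, not the study's pipeline):

    import re

    def find_ssrs(seq, unit_len=3, min_repeats=4):
        # Overlapping scan for a unit of `unit_len` bases repeated
        # at least `min_repeats` times in tandem.
        pattern = re.compile(r"(?=(([ACGT]{%d})\2{%d,}))" % (unit_len, min_repeats - 1))
        return [(m.start(), m.group(2), len(m.group(1)) // unit_len)
                for m in pattern.finditer(seq)]

    print(find_ssrs("TTACGACGACGACGTT"))   # [(2, 'ACG', 4)]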
Pimentel, Lígia; Fontes, Ana Luiza; Salsinha, Sofia; Machado, Manuela; Correia, Inês; Gomes, Ana Maria; Pintado, Manuela; Rodríguez-Alcalá, Luís Miguel
2018-03-08
Lipids have gained relevance over the last 20 years, as our knowledge of their role has changed from merely energy/structural molecules to compounds also involved in several biological processes. This led to the creation in 2003 of a new emerging research field: lipidomics. Phospholipids in particular have pharmacological and food applications and participate in cell signalling and homeostatic pathways, yet their analysis faces several challenges. Their fractionation/purification is especially difficult, as they are amphiphilic compounds. Moreover, it usually involves SPE or TLC procedures requiring specific materials, limiting their suitability for routine analysis. Finally, phospholipids can interfere with the ionization of other molecules during mass spectrometry analysis. Thus, simple high-throughput reliable methods to selectively isolate these compounds, based on differences between the chemical characteristics of lipids, would represent valuable tools for their study as well as that of other compounds. The current review describes the state of the art in the extraction of phospholipids using liquid-liquid methods for their targeted isolation. The technological and biological importance of these compounds and ion suppression phenomena are also reviewed. Methods based on precipitation with acetone or isolation using methanol seem to be suitable for selective isolation of phospholipids in both biological and food samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.
Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton
2013-01-01
The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).
Computational medicinal chemistry in fragment-based drug discovery: what, how and when.
Rabal, Obdulia; Urbano-Cuadrado, Manuel; Oyarzabal, Julen
2011-01-01
The use of fragment-based drug discovery (FBDD) has increased in the last decade due to the encouraging results obtained to date. In this scenario, computational approaches, together with experimental information, play an important role in guiding and speeding up the process. By default, FBDD is generally considered a constructive approach. However, such additive behavior is not always present; therefore, simple fragment maturation will not always deliver the expected results. In this review, computational approaches utilized in FBDD are reported together with real case studies that exemplify their applicability domains, in order to analyze them and thereby maximize their performance and reliability. A proper use of these computational tools can minimize misleading conclusions, preserving confidence in the FBDD strategy, as well as achieve higher impact in the drug-discovery process. FBDD goes one step beyond a simple constructive approach. A broad set of computational tools (docking, R-group quantitative structure-activity relationship, fragmentation tools, fragment management tools, patent analysis and fragment-hopping, for example) can be utilized in FBDD, providing a clear positive impact if they are utilized in the proper scenario: what, how and when. An initial assessment of additive/non-additive behavior is a critical point in defining the most convenient approach for fragment elaboration.
Linear regression metamodeling as a tool to summarize and present simulation model results.
Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M
2013-10-01
Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
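The core of the approach is an ordinary least-squares fit of simulated outcomes on standardized inputs; a hedged Python sketch with synthetic stand-ins for the cancer-model PSA draws:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(2)
    n = 10_000
    params = rng.normal(size=(n, 3))               # standardized PSA input draws
    net_benefit = 5.0 + 1.2 * params[:, 0] - 0.4 * params[:, 1] + rng.normal(0, 0.1, n)

    meta = LinearRegression().fit(params, net_benefit)
    print("base case (intercept):", meta.intercept_)
    print("sensitivity coefficients:", meta.coef_)  # relative parameter importance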
DMET-analyzer: automatic analysis of Affymetrix DMET data.
Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario
2012-10-05
Clinical bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in patient genomes and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, the discovery of genetic variants in patients who present different drug responses in pharmacogenomic and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for automatic association analysis between variation in patient genomes and the clinical conditions of patients, i.e. their different responses to drugs. The proposed system allows: (i) automation of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing SNP databases (e.g. dbSNP); (iii) association of SNPs with pathways through search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer are demonstrated through different case studies regarding the analysis of clinical datasets produced at the University Hospital of Catanzaro, Italy. DMET-Analyzer is a novel tool able to automatically analyse data produced by the DMET platform in case-control association studies. Using such a tool, users avoid the manual execution of multiple statistical tests, preventing possible errors and reducing the time needed for a whole experiment. Moreover, annotations and direct links to external databases may increase the biological knowledge extracted. The system is freely available for academic purposes at: https://sourceforge.net/projects/dmetanalyzer/files/
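The kind of association test being automated can be illustrated in a few lines (hedged toy counts; not the DMET-Analyzer statistics module): a Fisher exact test on a 2x2 case-control genotype table.

    from scipy.stats import fisher_exact

    #                carriers  non-carriers   (hypothetical SNP counts)
    responders    = [      20,           40]
    nonresponders = [      35,           15]

    odds_ratio, p_value = fisher_exact([responders, nonresponders])
    print(f"OR={odds_ratio:.2f}, p={p_value:.4g}")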
Reichert, Matthew D.; Alvarez, Nicolas J.; Brooks, Carlton F.; ...
2014-09-24
Pendant bubble and drop devices are invaluable tools in understanding surfactant behavior at fluid–fluid interfaces. The simple instrumentation and analysis are used widely to determine adsorption isotherms, transport parameters, and interfacial rheology. However, much of the analysis performed is developed for planar interfaces. Moreover, the application of a planar analysis to drops and bubbles (curved interfaces) can lead to erroneous and unphysical results. We revisit this analysis for a well-studied surfactant system at air–water interfaces over a wide range of curvatures as applied to both expansion/contraction experiments and interfacial elasticity measurements. The impact of curvature and transport on measured properties is quantified and compared to other scaling relationships in the literature. Our results provide tools to design interfacial experiments for accurate determination of isotherm, transport and elastic properties.
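For reference, drop-shape tensiometry rests on the Young-Laplace relation between the pressure jump, the interfacial tension and the curvature, which for the two principal radii R_1 and R_2 reads:

    $$ \Delta P = \gamma \left( \frac{1}{R_1} + \frac{1}{R_2} \right) $$

so that for a spherical interface of radius R, \Delta P = 2\gamma/R; it is this explicit curvature dependence that makes a planar analysis suspect for small drops and bubbles.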
Teaching meta-analysis using MetaLight.
Thomas, James; Graziosi, Sergio; Higgins, Steve; Coe, Robert; Torgerson, Carole; Newman, Mark
2012-10-18
Meta-analysis is a statistical method for combining the results of primary studies. It is often used in systematic reviews and is increasingly a method and topic that appears in student dissertations. MetaLight is a freely available software application that runs simple meta-analyses and contains specific functionality to facilitate the teaching and learning of meta-analysis. While there are many courses and resources for meta-analysis available and numerous software applications to run meta-analyses, there are few pieces of software which are aimed specifically at helping those teaching and learning meta-analysis. Valuable teaching time can be spent learning the mechanics of a new software application, rather than on the principles and practices of meta-analysis. We discuss ways in which the MetaLight tool can be used to present some of the main issues involved in undertaking and interpreting a meta-analysis. While there are many software tools available for conducting meta-analysis, in the context of a teaching programme such software can require expenditure both in terms of money and in terms of the time it takes to learn how to use it. MetaLight was developed specifically as a tool to facilitate the teaching and learning of meta-analysis and we have presented here some of the ways it might be used in a training situation.
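The arithmetic a teaching tool like this demonstrates is compact enough to show directly; a minimal fixed-effect (inverse-variance) pooling in Python (toy numbers):

    import numpy as np

    effects = np.array([0.30, 0.15, 0.45, 0.20])   # study effect sizes
    ses = np.array([0.10, 0.12, 0.20, 0.08])       # their standard errors

    w = 1.0 / ses**2                               # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    print(f"pooled effect = {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI)")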
Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira
2015-01-01
Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typical complex response of events dispersed over various latencies and poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated based on responses evoked from the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, which was compared to event detection, latency and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Even with an incorrectly specified template, Waveform Similarity Analysis outperformed Trained Eye Analysis for predicting signal amplitude, although it produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when signal-to-noise ratio is poor. Waveform Similarity Analysis provides a simple, reliable and convenient approach to quantify latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies. PMID:26325291
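The template-matching core of the method is easy to sketch (toy trace in Python; not the published implementation): the trace is scanned with a normalized cross-correlation and events are read off as peaks.

    import numpy as np

    fs = 10_000.0
    template = np.sin(np.linspace(0, np.pi, 50))        # idealized CAP shape
    trace = 0.05 * np.random.randn(2000)
    trace[400:450] += 0.8 * template                    # inject an event at 40 ms

    corr = np.correlate(trace, template, mode="valid")
    energy = np.convolve(trace**2, np.ones(template.size), "valid")
    corr /= np.linalg.norm(template) * np.sqrt(energy + 1e-12)  # normalize per window
    peak = corr.argmax()
    print(f"latency = {peak / fs * 1e3:.1f} ms, similarity = {corr[peak]:.2f}")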
El Sharabasy, Sherif F; Soliman, Khaled A
2017-01-01
The date palm is an ancient domesticated plant with great diversity and has been cultivated in the Middle East and North Africa for at least 5000 years. Date palm cultivars are classified based on the fruit moisture content as dry, semidry, and soft dates. There are a number of biochemical and molecular techniques available for characterization of date palm variation. This chapter focuses on the DNA-based marker techniques random amplified polymorphic DNA (RAPD) and inter-simple sequence repeat (ISSR), in addition to biochemical markers based on isozyme analysis. These techniques, coupled with appropriate statistical tools, have proved useful for determining phylogenetic relationships among date palm cultivars and provide information resources for date palm gene banks.
The SURE reliability analysis program
NASA Technical Reports Server (NTRS)
Butler, R. W.
1986-01-01
The SURE program is a new reliability tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.
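As a scaled-down illustration of the kind of computation involved (a plain continuous-time Markov model solved exactly; SURE's semi-Markov bounding method is more general):

    import numpy as np
    from scipy.linalg import expm

    # States: 0 = all good, 1 = one fault active, 2 = system failure (absorbing).
    lam, delta, mu = 1e-4, 3.6e3, 1e-4     # fault arrival, recovery, second fault (per hour)
    Q = np.array([[-lam,         lam,  0.0],
                  [delta, -delta - mu,  mu],
                  [ 0.0,         0.0,  0.0]])

    p0 = np.array([1.0, 0.0, 0.0])
    p_t = p0 @ expm(Q * 10.0)              # state distribution after a 10-hour mission
    print("P(system failure) =", p_t[2])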
Control and prediction of the course of brewery fermentations by gravimetric analysis.
Kosín, P; Savel, J; Broz, A; Sigler, K
2008-01-01
A simple, fast and cheap test suitable for predicting the course of brewery fermentations based on mass analysis is described and its efficiency is evaluated. Compared to commonly used yeast vitality tests, this analysis takes into account wort composition and other factors that influence fermentation performance. It can be used to predict the shape of the fermentation curve in brewery fermentations and in research and development projects concerning yeast vitality, fermentation conditions and wort composition. It can also be a useful tool for homebrewers to control their fermentations.
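The principle can be illustrated with a hedged toy calculation: mass lost during fermentation is escaped CO2, and Balling's classical stoichiometry (2.0665 g extract yields roughly 0.9565 g CO2) ties CO2 loss to extract consumed. Illustrative numbers only; the paper's own procedure may differ in detail.

    mass_start_kg = 20.000      # fermenter mass at pitching (hypothetical)
    mass_now_kg = 19.750        # fermenter mass now (hypothetical)
    co2_loss_kg = mass_start_kg - mass_now_kg

    extract_consumed_kg = (2.0665 / 0.9565) * co2_loss_kg   # Balling relation
    print(f"CO2 lost: {co2_loss_kg:.3f} kg -> extract consumed: {extract_consumed_kg:.3f} kg")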
Baggio, Jacopo A; BurnSilver, Shauna B; Arenas, Alex; Magdanz, James S; Kofinas, Gary P; De Domenico, Manlio
2016-11-29
Network analysis provides a powerful tool to analyze complex influences of social and ecological structures on community and household dynamics. Most network studies of social-ecological systems use simple, undirected, unweighted networks. We analyze multiplex, directed, and weighted networks of subsistence food flows collected in three small indigenous communities in Arctic Alaska potentially facing substantial economic and ecological changes. Our analysis of plausible future scenarios suggests that changes to social relations and key households have greater effects on community robustness than changes to specific wild food resources.
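A stripped-down version of such a robustness probe, using a directed, weighted food-flow graph (toy data in Python with networkx; the study's multiplex analysis is far richer):

    import networkx as nx

    G = nx.DiGraph()
    G.add_weighted_edges_from([
        ("hunter_A", "house_1", 120.0),   # kg of wild food shared per year (toy)
        ("hunter_A", "house_2", 80.0),
        ("house_1", "house_3", 30.0),
        ("hunter_B", "house_3", 60.0),
    ])

    def fed_households(g):
        # Households receiving at least one inbound food flow.
        return sum(1 for n in g if g.in_degree(n) > 0)

    print("before:", fed_households(G))
    G.remove_node("hunter_A")             # losing a key provisioning household
    print("after:", fed_households(G))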
Container-Based Clinical Solutions for Portable and Reproducible Image Analysis.
Matelsky, Jordan; Kiar, Gregory; Johnson, Erik; Rivera, Corban; Toma, Michael; Gray-Roncal, William
2018-05-08
Medical imaging analysis depends on the reproducibility of complex computation. Linux containers enable the abstraction, installation, and configuration of environments so that software can be both distributed in self-contained images and used repeatably by tool consumers. While several initiatives in neuroimaging have adopted approaches for creating and sharing more reliable scientific methods and findings, Linux containers are not yet mainstream in clinical settings. We explore related technologies and their efficacy in this setting, highlight important shortcomings, demonstrate a simple use-case, and endorse the use of Linux containers for medical image analysis.
miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.
Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M
2009-07-01
Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. Given this scenario, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The web server tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/.
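Preparing that input is a one-pass counting job; a minimal Python sketch (illustrative; confirm the server's exact input format before use):

    from collections import Counter

    def collapse_reads(fastq_path):
        # Collapse raw reads into unique sequences with copy numbers.
        counts = Counter()
        with open(fastq_path) as fh:
            for i, line in enumerate(fh):
                if i % 4 == 1:                 # sequence lines in FASTQ
                    counts[line.strip()] += 1
        return counts

    for seq, n in collapse_reads("small_rna.fastq").most_common(5):
        print(f"{seq}\t{n}")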
NASA Astrophysics Data System (ADS)
Pedersen, N. L.
2015-06-01
The strength of a gear is typically defined relative to durability (pitting) and load capacity (tooth breakage). Tooth breakage is controlled by the root shape, and this part of the tooth can be designed freely because there is no contact between mating gears here. The shape of gears is generally defined by different standards, with the ISO standard probably being the most common one. Gears are manufactured using two principally different tools: rack tools and gear tools. In this work, the bending stress of involute teeth is minimized by shape optimization performed directly on the final gear. The optimized shape is then used to find the cutting tool (the gear envelope) that can create this optimized gear shape. A simple but sufficiently flexible root parameterization is applied, and emphasis is put on the importance of separating the shape parameterization from the finite element analysis of stresses. Large improvements in the stress level are found.
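For orientation, the classical first-order estimate of root bending stress that such optimizations improve upon is the Lewis formula (a standard textbook baseline, not the paper's finite element model):

    $$ \sigma_F = \frac{F_t}{b\, m\, Y} $$

with tangential tooth force F_t, face width b, module m and Lewis form factor Y; the optimization effectively reshapes the root to lower the stress concentration that the form factor only crudely captures.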
Exploratory Causal Analysis in Bivariate Time Series Data
NASA Astrophysics Data System (ADS)
McCracken, James M.
Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data sets, but little research exists on how these tools compare to each other in practice. This work introduces and defines exploratory causal analysis (ECA) to address this issue, along with the concept of data causality in the taxonomy of causal studies introduced in this work. The motivation is to provide a framework for exploring potential causal structures in time series data sets. ECA is used on several synthetic and empirical data sets, and it is found that all of the tested time series causality tools agree with each other (and intuitive notions of causality) for many simple systems but can provide conflicting causal inferences for more complicated systems. It is proposed that such disagreements between different time series causality tools during ECA might provide deeper insight into the data than could be found otherwise.
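As a concrete taste of one tool in that comparison set, a Granger causality test between two toy coupled series (Python with statsmodels; illustrative only, not the thesis code):

    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(3)
    n = 500
    x = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = 0.6 * x[t - 1] + 0.2 * rng.normal()   # x drives y with lag 1

    data = np.column_stack([y, x])     # tests whether the second column drives the first
    res = grangercausalitytests(data, maxlag=2, verbose=False)
    print(res[1][0]["ssr_ftest"])      # (F statistic, p-value, df_denom, df_num) at lag 1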
Weighing Evidence "Steampunk" Style via the Meta-Analyser.
Bowden, Jack; Jackson, Chris
2016-10-01
The funnel plot is a graphical visualization of summary data estimates from a meta-analysis, and is a useful tool for detecting departures from the standard modeling assumptions. Although perhaps not widely appreciated, a simple extension of the funnel plot can help to facilitate an intuitive interpretation of the mathematics underlying a meta-analysis at a more fundamental level, by equating it to determining the center of mass of a physical system. We used this analogy to explain the concepts of weighing evidence and of biased evidence to a young audience at the Cambridge Science Festival, without recourse to precise definitions or statistical formulas and with a little help from Sherlock Holmes! Following on from the science fair, we have developed an interactive web-application (named the Meta-Analyser) to bring these ideas to a wider audience. We envisage that our application will be a useful tool for researchers when interpreting their data. First, to facilitate a simple understanding of fixed and random effects modeling approaches; second, to assess the importance of outliers; and third, to show the impact of adjusting for small study bias. This final aim is realized by introducing a novel graphical interpretation of the well-known method of Egger regression.
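The analogy can be stated exactly: the pooled estimate of a meta-analysis is the balance point of the study estimates under their weights, which is precisely the center-of-mass formula:

    $$ \hat{\theta} = \frac{\sum_i w_i \hat{\theta}_i}{\sum_i w_i} \quad\longleftrightarrow\quad x_{\mathrm{cm}} = \frac{\sum_i m_i x_i}{\sum_i m_i} $$

with the inverse-variance weights w_i playing the role of the masses m_i.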
Medical image segmentation to estimate HER2 gene status in breast cancer
NASA Astrophysics Data System (ADS)
Palacios-Navarro, Guillermo; Acirón-Pomar, José Manuel; Vilchez-Sorribas, Enrique; Zambrano, Eddie Galarza
2016-02-01
This work deals with the estimation of HER2 gene status in breast tumour images treated with in situ hybridization (ISH) techniques. We propose a simple algorithm to obtain the amplification factor of the HER2 gene. The obtained results closely match those obtained manually by specialists. The developed algorithm is based on colour image segmentation and has been included in a software application tool for breast tumour analysis. The developed tool focuses on estimating the seriousness of tumours, facilitating the work of pathologists and contributing to a better diagnosis.
Geena 2, improved automated analysis of MALDI/TOF mass spectra.
Romano, Paolo; Profumo, Aldo; Rocco, Mattia; Mangerini, Rosa; Ferri, Fabio; Facchiano, Angelo
2016-03-02
Mass spectrometry (MS) is producing high volumes of data supporting oncological sciences, especially for translational research. Most of the related processing can be carried out by combining existing tools at different levels, but little is currently available for the automation of the fundamental steps. For the analysis of MALDI/TOF spectra, a number of pre-processing steps are required, including joining of isotopic abundances for a given molecular species, normalization of signals against an internal standard, background noise removal, averaging multiple spectra from the same sample, and aligning spectra from different samples. In this paper, we present Geena 2, a public software tool for the automated execution of these pre-processing steps for MALDI/TOF spectra. Geena 2 has been developed in a Linux-Apache-MySQL-PHP web development environment, with scripts in PHP and Perl. Input and output are managed as simple formats that can be consumed by any database system and spreadsheet software. Input data may also be stored in a MySQL database. Processing methods are based on original heuristic algorithms which are introduced in the paper. Three simple and intuitive web interfaces are available: the Standard Search Interface, which allows complete control over all parameters; the Bright Search Interface, which lets the user tune the parameters for the alignment of spectra; and the Quick Search Interface, which limits the number of parameters to a minimum by using default values for the majority of parameters. Geena 2 has been utilized, in conjunction with a statistical analysis tool, in three published experimental works: a proteomic study on the effects of long-term cryopreservation on the low molecular weight fraction of the serum proteome, and two retrospective serum proteomic studies, one on the risk of developing breast cancer in patients affected by gross cystic disease of the breast (GCDB) and the other on the identification of a predictor of breast cancer mortality following breast cancer surgery, whose results were validated by ELISA, a completely independent method. Geena 2 is a public tool for the automated pre-processing of MS data originated by MALDI/TOF instruments, with a simple and intuitive web interface. It is now under active development for the inclusion of further filtering options and for the adoption of standard formats for MS spectra.
Web-based visual analysis for high-throughput genomics
2013-01-01
Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618
DNA Electrochemistry and Electrochemical Sensors for Nucleic Acids.
Ferapontova, Elena E
2018-06-12
Sensitive, specific, and fast analysis of nucleic acids (NAs) is strongly needed in medicine, environmental science, biodefence, and agriculture for the study of bacterial contamination of food and beverages and genetically modified organisms. Electrochemistry offers accurate, simple, inexpensive, and robust tools for the development of such analytical platforms that can successfully compete with other approaches for NA detection. Here, electrode reactions of DNA, basic principles of electrochemical NA analysis, and their relevance for practical applications are reviewed and critically discussed.
Text mining and its potential applications in systems biology.
Ananiadou, Sophia; Kell, Douglas B; Tsujii, Jun-ichi
2006-12-01
With biomedical literature increasing at a rate of several thousand papers per week, it is impossible to keep abreast of all developments; therefore, automated means to manage the information overload are required. Text mining techniques, which involve the processes of information retrieval, information extraction and data mining, provide a means of solving this. By adding meaning to text, these techniques produce a more structured analysis of textual knowledge than simple word searches, and can provide powerful tools for the production and analysis of systems biology models.
Data Visualization and Analysis for Climate Studies using NASA Giovanni Online System
NASA Technical Reports Server (NTRS)
Rui, Hualan; Leptoukh, Gregory; Lloyd, Steven
2008-01-01
With many global earth observation systems and missions focused on climate systems, and the associated large volumes of observational data available for exploring and explaining how climate is changing and why, there is an urgent need for climate services. Giovanni, the NASA GES DISC Interactive Online Visualization ANd ANalysis Infrastructure, is a simple-to-use yet powerful tool for analysing these data for research on global warming and climate change, as well as for applications to weather, air quality, agriculture, and water resources.
A simple and effective method for detecting precipitated proteins in MALDI-TOF MS.
Oshikane, Hiroyuki; Watabe, Masahiko; Nakaki, Toshio
2018-04-01
MALDI-TOF MS has developed rapidly into an essential analytical tool for the life sciences. Cinnamic acid derivatives are generally employed in routine molecular weight determinations of intact proteins using MALDI-TOF MS. However, a protein of interest may precipitate when mixed with the matrix solution, possibly preventing MS detection. We herein provide a simple approach to enable the MS detection of such precipitated protein species by means of a "direct deposition method": loading the precipitate directly onto the sample plate. It is thus expected to improve routine MS analysis of intact proteins. Copyright © 2018. Published by Elsevier Inc.
Learning investment indicators through data extension
NASA Astrophysics Data System (ADS)
Dvořák, Marek
2017-07-01
Stock prices in the form of time series were analysed using univariate and multivariate statistical methods. After simple data preprocessing in the form of logarithmic differences, we augmented this univariate time series into a multivariate representation. The method uses sliding windows to calculate several dozen new variables from simple statistics, such as first and second moments, as well as more complicated statistics, such as autoregression coefficients and residual analysis, followed by an optional quadratic transformation that was further used for data extension. These were used as explanatory variables in a regularized (LASSO) logistic regression that estimated a Buy-Sell Index (BSI) from real stock market data.
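A minimal sketch of this feature-extension pipeline in Python; the window length, feature set, label definition, and the use of scikit-learn's L1-penalized logistic regression are illustrative assumptions, since the abstract does not specify them.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def extend(prices, window=20):
    """Turn a univariate price series into a feature matrix via sliding windows."""
    r = np.diff(np.log(prices))                    # logarithmic differences
    rows = []
    for t in range(window, len(r)):
        w = r[t - window:t]
        ar1 = np.corrcoef(w[:-1], w[1:])[0, 1]     # crude lag-1 autocorrelation
        rows.append([w.mean(), w.std(), ar1])
    X = np.array(rows)
    return np.hstack([X, X**2]), r[window:]        # optional quadratic extension

# Invented data: random-walk prices and a naive "BSI"-like label (next return > 0).
rng = np.random.default_rng(1)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 2000)))
X, future = extend(prices)
y = (future > 0).astype(int)
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
print("nonzero coefficients:", int(np.sum(model.coef_ != 0)))
```

The L1 penalty drives most coefficients to zero, which is the point of using LASSO here: only a few of the several dozen constructed variables survive as indicators.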
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanamori, Masashi, E-mail: kanamori.masashi@jaxa.jp; Takahashi, Takashi, E-mail: takahashi.takashi@jaxa.jp; Aoyama, Takashi, E-mail: aoyama.takashi@jaxa.jp
2015-10-28
This paper introduces a tool for predicting the propagation of loud noise, with application to aeronautics in mind. The tool, named SPnoise, is based on the HOWARD approach, which can express the multidimensionality of the diffraction effect almost exactly, at the cost of neglecting back scattering. In particular, the paper addresses the prediction of the effect of atmospheric turbulence on sonic boom, one of the important issues in aeronautics. Thanks to the simple and efficient modeling of the atmospheric turbulence, SPnoise successfully re-creates the main feature of this effect, which often emerges in the region just behind the front and rear shock waves in the sonic boom signature.
Measurement of jaw motion: the proposal of a simple and accurate method.
Pinheiro, A P; Pereira, A A; Andrade, A O; Bellomo, D
2011-01-01
The analysis of jaw movements has long been used as a measure for clinical diagnosis and assessment. A number of strategies are available for monitoring the trajectory; however, most of these strategies make use of expensive tools, which are often not available to many clinics in the world. In this context, this research proposes the development of a new tool capable of quantifying the movements of opening/closing, protrusion and laterotrusion of the mandible. These movements are important for the clinical evaluation of both the temporomandibular function and the muscles involved in mastication. The proposed system, unlike current commercial systems, employs a low-cost video camera and a computer program, which is used for reconstructing the trajectory of a reflective marker that is fixed on the jaw. To illustrate the application of the devised tool, a clinical trial was carried out, investigating the jaw movements of 10 subjects. The results obtained in this study were compatible with those found in the literature, with the advantage of using a low-cost, simple, non-invasive and flexible solution customized for the practical needs of clinics. The average error of the system was less than 1.0%.
Mukherji, Sutapa
2018-03-01
In this paper, we study a one-dimensional totally asymmetric simple exclusion process with position-dependent hopping rates. Under open boundary conditions, this system exhibits boundary-induced phase transitions in the steady state. Similarly to totally asymmetric simple exclusion processes with uniform hopping, the phase diagram consists of low-density, high-density, and maximal-current phases. In various phases, the shape of the average particle density profile across the lattice including its boundary-layer parts changes significantly. Using the tools of boundary-layer analysis, we obtain explicit solutions for the density profile in different phases. A detailed analysis of these solutions under different boundary conditions helps us obtain the equations for various phase boundaries. Next, we show how the shape of the entire density profile including the location of the boundary layers can be predicted from the fixed points of the differential equation describing the boundary layers. We discuss this in detail through several examples of density profiles in various phases. The maximal-current phase appears to be an especially interesting phase where the boundary layer flows to a bifurcation point on the fixed-point diagram.
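As an illustration of the kind of system studied here, the following is a minimal Monte Carlo sketch of a TASEP with position-dependent hopping rates under open boundaries; the rate profile, entry/exit rates, and lattice size are invented for the example and do not reproduce the paper's boundary-layer analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
L, alpha, beta = 200, 0.3, 0.7          # lattice size, entry rate, exit rate
p = 1.0 - 0.3 * np.sin(np.pi * np.arange(L) / L)  # position-dependent hop rates

occ = np.zeros(L, dtype=int)            # site occupations (0 or 1)
density = np.zeros(L)
steps, burn = 200_000, 50_000
for step in range(steps):
    i = rng.integers(-1, L)             # -1 means attempt an entry move
    if i == -1:
        if occ[0] == 0 and rng.random() < alpha:
            occ[0] = 1                  # particle enters at the left boundary
    elif i == L - 1:
        if occ[-1] == 1 and rng.random() < beta:
            occ[-1] = 0                 # particle exits at the right boundary
    elif occ[i] == 1 and occ[i + 1] == 0 and rng.random() < p[i]:
        occ[i], occ[i + 1] = 0, 1       # hop right with the local rate
    if step >= burn:
        density += occ
print("mean bulk density:", (density / (steps - burn))[L // 2])
```

Averaging `density` over sites after burn-in gives a numerical density profile whose boundary-layer shape is what the paper analyses analytically.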
Clayton, William; Eaton, Carla Jane; Dupont, Pierre-Yves; Gillanders, Tim; Cameron, Nick; Saikia, Sanjay; Scott, Barry
2017-01-01
Epichloë grass endophytes comprise a group of filamentous fungi of both sexual and asexual species. Because of the beneficial characteristics they endow upon their grass hosts, the identification of these endophyte species has been of great agronomic and scientific interest. Simple sequence repeat loci and the variation in their repeat elements have been used to rapidly identify endophyte species and strains; however, little is known about how the structure of repeat elements changes between species and strains, or where these repeat elements are located in the fungal genome. We report on an in-depth analysis of the structure and genomic location of the simple sequence repeat locus B10, commonly used for Epichloë endophyte species identification. The B10 repeat was found to be located within an exon of a putative bZIP transcription factor, suggesting possible impacts on polypeptide sequence and thus protein function. Analysis of this repeat in the asexual endophyte hybrid Epichloë uncinata revealed that the structure of B10 alleles reflects the ancestral species that hybridized to give rise to this species. Understanding the structure and sequence of these simple sequence repeats provides a useful set of tools for readily distinguishing strains and for gaining insights into the ancestral species that have undergone hybridization events.
Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology
Grüning, Björn A.; Paszkiewicz, Konrad; Pritchard, Leighton
2013-01-01
The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu). PMID:24109552
NASA Technical Reports Server (NTRS)
Petrenko, M.; Hegde, M.; Bryant, K.; Johnson, J. E.; Ritrivi, A.; Shen, S.; Volmer, B.; Pham, L. B.
2015-01-01
Goddard Earth Sciences Data and Information Services Center (GES DISC) has been providing access to scientific data sets since the 1990s. Beginning as one of the first Earth Observing System Data and Information System (EOSDIS) archive centers, GES DISC has evolved to offer a wide range of science-enabling services. With a growing understanding of the needs and goals of its science users, GES DISC continues to improve and expand on its broad set of data discovery and access tools, sub-setting services, and visualization tools. Nonetheless, the multitude of available tools, a partial overlap of functionality, and the independent and uncoupled interfaces employed by these tools often leave the end users confused as to which tools or services are most appropriate for the task at hand. As a result, some of the services remain underutilized or largely unknown to the users, significantly reducing the availability of the data and leading to a great loss of scientific productivity. In order to improve the accessibility of GES DISC tools and services, we have designed and implemented UUI, the Unified User Interface. UUI seeks to provide a simple, unified, and intuitive one-stop shop experience for the key services available at GES DISC, including sub-setting (Simple Subset Wizard), granule file search (Mirador), plotting (Giovanni), and other services. In this poster, we will discuss the main lessons, obstacles, and insights encountered while designing the UUI experience. We will also present the architecture and technology behind UUI, including NodeJS, Angular, and MongoDB, and speculate on the future of the tool at GES DISC and in the broader context of space science informatics.
Object-oriented parsing of biological databases with Python.
Ramu, C; Gemünd, C; Gibson, T J
2000-07-01
While biological database activity is increasing rapidly, rather little has been done on parsing these databases in a simple and object-oriented way. We present here an elegant, simple yet powerful way of parsing biological flat-file databases. We have taken EMBL, SWISS-PROT and GENBANK as examples. EMBL and SWISS-PROT do not differ much in their format structure. GENBANK has a very different format structure from EMBL and SWISS-PROT. Extracting the desired fields in an entry (for example, a sub-sequence with an associated feature) for later analysis is a constant need in the biological sequence-analysis community: this is illustrated with tools to make new splice-site databases. The interface to the parser is abstract in the sense that access to all the databases is independent of their different formats, since the parsing instructions are hidden.
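As a flavour of what such a parser can look like, here is a minimal sketch of an object-oriented reader for EMBL-style flat files (two-letter line codes such as ID and DE, a sequence block opened by SQ, and records terminated by //); it is an illustrative reconstruction, not the authors' actual interface.

```python
class Entry:
    """One flat-file record: line-code fields plus the sequence."""
    def __init__(self):
        self.fields = {}   # line code -> list of content strings, e.g. "DE" -> [...]
        self.sequence = ""

def parse_embl(path):
    """Yield Entry objects from an EMBL-style flat file."""
    entry, in_seq = Entry(), False
    with open(path) as handle:
        for line in handle:
            if line.startswith("//"):          # end of record
                yield entry
                entry, in_seq = Entry(), False
            elif line.startswith("SQ"):        # sequence block header
                in_seq = True
            elif in_seq:
                # keep only residue letters, dropping counts and spacing
                entry.sequence += "".join(c for c in line if c.isalpha())
            elif len(line) > 5 and line[:2].strip():
                entry.fields.setdefault(line[:2], []).append(line[5:].rstrip())

# Usage sketch (hypothetical file name):
# for e in parse_embl("example.embl"):
#     print(e.fields.get("ID"), len(e.sequence))
```

The abstraction the abstract describes corresponds to keeping `Entry` format-independent while hiding the per-database parsing rules inside functions like `parse_embl`.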
Semantic integration of gene expression analysis tools and data sources using software connectors
2013-01-01
Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380
Semantic integration of gene expression analysis tools and data sources using software connectors.
Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G
2013-10-25
The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
Analyzing Discourse Processing Using a Simple Natural Language Processing Tool
ERIC Educational Resources Information Center
Crossley, Scott A.; Allen, Laura K.; Kyle, Kristopher; McNamara, Danielle S.
2014-01-01
Natural language processing (NLP) provides a powerful approach for discourse processing researchers. However, there remains a notable degree of hesitation by some researchers to consider using NLP, at least on their own. The purpose of this article is to introduce and make available a "simple" NLP (SiNLP) tool. The overarching goal of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ian Metzger, Jesse Dean
2010-12-31
This software requires inputs of simple water fixture inventory information and calculates the water/energy and cost benefits of various retrofit opportunities. The tool includes water conservation measures for low-flow toilets, low-flow urinals, low-flow faucets, and low-flow showerheads. It calculates water savings, energy savings, demand reduction, cost savings, and building life-cycle cost metrics, including simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool displays the environmental benefits of a project.
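The financial metrics listed are standard; as a sketch, the following computes simple payback, discounted payback, net present value, and savings-to-investment ratio for an invented retrofit (all inputs are made-up example numbers, not defaults from the tool).

```python
import numpy as np

cost = 12_000.0            # invented retrofit cost (USD)
annual_savings = 1_800.0   # invented water + energy cost savings per year
rate, years = 0.03, 15     # discount rate and analysis period (assumptions)

simple_payback = cost / annual_savings
discounted = np.array([annual_savings / (1 + rate) ** t for t in range(1, years + 1)])
npv = discounted.sum() - cost                       # net present value
sir = discounted.sum() / cost                       # savings-to-investment ratio
# first year in which cumulative discounted savings reach the cost
discounted_payback = int(np.searchsorted(discounted.cumsum(), cost)) + 1

print(f"simple payback {simple_payback:.1f} yr, discounted payback "
      f"{discounted_payback} yr, NPV ${npv:,.0f}, SIR {sir:.2f}")
```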
Formal methods for modeling and analysis of hybrid systems
NASA Technical Reports Server (NTRS)
Tiwari, Ashish (Inventor); Lincoln, Patrick D. (Inventor)
2009-01-01
A technique based on the use of a quantifier elimination decision procedure for real closed fields and simple theorem proving to construct a series of successively finer qualitative abstractions of hybrid automata is taught. The resulting abstractions are always discrete transition systems which can then be used by any traditional analysis tool. The constructed abstractions are conservative and can be used to establish safety properties of the original system. The technique works on linear and non-linear polynomial hybrid systems: the guards on discrete transitions and the continuous flows in all modes can be specified using arbitrary polynomial expressions over the continuous variables. An exemplar tool in the SAL environment built over the theorem prover PVS is detailed. The technique scales well to large and complex hybrid systems.
Rotational Analysis of Phase Plane Curves: Complex and Pure Imaginary Eigenvalues
ERIC Educational Resources Information Center
Murray, Russell H.
2005-01-01
Although the phase plane can be plotted and analyzed using an appropriate software package, the author found it worthwhile to engage the students with the theorem and the two proofs. The theorem is a powerful tool that provides insight into the rotational behavior of the phase plane diagram in a simple way: just check the signs of c and [alpha].…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noethen, M.M.; Eggermann, K.; Propping, P.
1995-10-01
It is well accepted that association studies are a major tool in investigating the contribution of single genes to the development of diseases that do not follow simple Mendelian inheritance patterns (so-called complex traits). Such major psychiatric diseases as bipolar affective disorder and schizophrenia clearly fall into this category. 7 refs., 1 tab.
Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B
2013-03-23
Mass spectrometry (MS) has evolved to become the primary high throughput tool for proteomics based biomarker discovery. To date, multiple challenges in protein MS data analysis remain: large-scale and complex data set management; MS peak identification and indexing; and high-dimensional differential peak analysis with false discovery rate (FDR) control across the concurrent statistical tests. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets to identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The presented web application supports online uploading and analysis of large-scale MS data through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
Preliminary design methods for fiber reinforced composite structures employing a personal computer
NASA Technical Reports Server (NTRS)
Eastlake, C. N.
1986-01-01
The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
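As an illustration of the classical-lamination-theory step described above, the following is a minimal sketch that builds the laminate extensional stiffness (A) matrix from ply properties and extracts an effective in-plane modulus; the ply data are invented example values and the code is a modern reconstruction, not the original personal-computer program.

```python
import numpy as np

def qbar(E1, E2, G12, v12, theta_deg):
    """Transformed reduced stiffness matrix for one ply (plane stress)."""
    v21 = v12 * E2 / E1
    d = 1 - v12 * v21
    Q = np.array([[E1 / d, v12 * E2 / d, 0],
                  [v12 * E2 / d, E2 / d, 0],
                  [0, 0, G12]])
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    # Stress transformation matrix; R handles engineering shear strain.
    T = np.array([[c * c, s * s, 2 * c * s],
                  [s * s, c * c, -2 * c * s],
                  [-c * s, c * s, c * c - s * s]])
    R = np.diag([1.0, 1.0, 2.0])
    return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

# Invented carbon/epoxy ply properties (GPa) and a [0/45/-45/90]s layup.
E1, E2, G12, v12, t = 140.0, 10.0, 5.0, 0.3, 0.125e-3
angles = [0, 45, -45, 90, 90, -45, 45, 0]
A = sum(qbar(E1, E2, G12, v12, a) * t for a in angles) * 1e9  # Pa*m
h = t * len(angles)
a = np.linalg.inv(A)
Ex = 1.0 / (h * a[0, 0])  # effective laminate modulus in the x-direction
print(f"effective Ex = {Ex / 1e9:.1f} GPa")
```

An effective modulus computed this way can then be fed into ordinary isotropic beam or plate formulas, which is exactly the approximation strategy the abstract describes.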
Implementing change in health professions education: stakeholder analysis and coalition building.
Baum, Karyn D; Resnik, Cheryl D; Wu, Jennifer J; Roey, Steven C
2007-01-01
The challenges facing the health sciences education fields are more evident than ever. Professional health sciences educators have more demands on their time, more knowledge to manage, and ever-dwindling sources of financial support. Change is often necessary to either keep programs viable or meet the changing needs of health education. This article outlines a simple but powerful three-step tool to help educators become successful agents of change. Through the application of principles well known and widely used in business management, readers will understand the concepts behind stakeholder analysis and coalition building. These concepts are part of a powerful tool kit that educators need in order to become effective agents of change in the health sciences environment. Using the example of curriculum change at a school of veterinary medicine, we will outline the three steps involved, from stakeholder identification and analysis to building and managing coalitions for change.
NASA Technical Reports Server (NTRS)
Burns, K. Lee; Altino, Karen
2008-01-01
The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites, to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool, and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun for a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the currently existing tool.
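The core computation described (the climatological probability of exceeding a constraint limit) can be sketched in a few lines; the wind climatology, constraint value, and distribution below are invented placeholders, not data or methods from APRA or PACER.

```python
import numpy as np

# Invented climatology: hourly peak wind (knots) for many historical Februaries.
rng = np.random.default_rng(3)
feb_winds = rng.weibull(2.0, 10_000) * 12.0

limit = 20.0  # hypothetical vehicle constraint: peak wind must stay below this
p_exceed = np.mean(feb_winds > limit)
print(f"P(violation) = {p_exceed:.3f}, "
      f"launch availability = {1 - p_exceed:.3f} for a single attempt")
```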
Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach
Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen
2016-01-01
A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance for different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools for various sizes but similar qualities. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design. PMID:28773800
Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach.
Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen
2016-08-09
A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance for different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools for various sizes but similar qualities. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design.
ELM - A SIMPLE TOOL FOR THERMAL-HYDRAULIC ANALYSIS OF SOLID-CORE NUCLEAR ROCKET FUEL ELEMENTS
NASA Technical Reports Server (NTRS)
Walton, J. T.
1994-01-01
ELM is a simple computational tool for modeling the steady-state thermal-hydraulics of propellant flow through fuel element coolant channels in nuclear thermal rockets. Written for the nuclear propulsion project of the Space Exploration Initiative, ELM evaluates the various heat transfer coefficient and friction factor correlations available for turbulent pipe flow with heat addition. In the past, these correlations were found in different reactor analysis codes, but now comparisons are possible within one program. The logic of ELM is based on the one-dimensional conservation of energy in combination with Newton's Law of Cooling to determine the bulk flow temperature and the wall temperature across a control volume. Since the control volume is an incremental length of tube, the corresponding pressure drop is determined by application of the Law of Conservation of Momentum. The size, speed, and accuracy of ELM make it a simple tool for use in fuel element parametric studies. ELM is a machine independent program written in FORTRAN 77. It has been successfully compiled on an IBM PC compatible running MS-DOS using Lahey FORTRAN 77, a DEC VAX series computer running VMS, and a Sun4 series computer running SunOS UNIX. ELM requires 565K of RAM under SunOS 4.1, 360K of RAM under VMS 5.4, and 406K of RAM under MS-DOS. Because this program is machine independent, no executable is provided on the distribution media. The standard distribution medium for ELM is one 5.25 inch 360K MS-DOS format diskette. ELM was developed in 1991. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. Sun4 and SunOS are trademarks of Sun Microsystems, Inc. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation.
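The marching logic that ELM's description outlines (a one-dimensional energy balance plus Newton's law of cooling per control volume, with a momentum-based pressure drop) can be sketched as follows; the propellant properties, geometry, and correlations are invented stand-ins, e.g. a Dittus-Boelter-style Nusselt correlation rather than whichever correlations ELM actually offers.

```python
import numpy as np

# Invented channel/flow parameters (hydrogen-like propellant, SI units).
D, Lc, n_cv = 2.5e-3, 1.0, 200            # diameter, length, control volumes
mdot, cp, mu, k = 5e-3, 14e3, 1.2e-5, 0.4 # mass flow and fluid properties
qpp = 5e6                                 # wall heat flux (W/m^2), assumed uniform
T, P = 100.0, 5.0e6                       # inlet bulk temperature (K), pressure (Pa)
rho = 1.0                                 # crude constant-density assumption
P_in = P

A = np.pi * D**2 / 4
G = mdot / A                              # mass flux
Re, Pr = G * D / mu, cp * mu / k
h = 0.023 * Re**0.8 * Pr**0.4 * k / D     # Dittus-Boelter-style correlation
f = 0.184 * Re**-0.2                      # smooth-tube friction factor (assumed)

dx = Lc / n_cv
for _ in range(n_cv):
    T += qpp * np.pi * D * dx / (mdot * cp)   # energy balance on the control volume
    P -= f * (dx / D) * G**2 / (2 * rho)      # Darcy pressure drop (momentum)
Tw = T + qpp / h                              # Newton's law of cooling at the exit
print(f"exit bulk T = {T:.0f} K, exit wall T = {Tw:.0f} K, "
      f"dP = {(P_in - P) / 1e3:.0f} kPa")
```

Swapping in different `h` and `f` correlations per control volume is precisely the kind of comparison the abstract says ELM was written to make.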
Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC
NASA Astrophysics Data System (ADS)
Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.
2015-08-01
This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at the transistor level. As analog cells can behave in an unpredictable way when a particle strike interacts with critical areas, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effects influence. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation, and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.
Brunner, J; Krummenauer, F; Lehr, H A
2000-04-01
Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available computer software (Adobe Photoshop), run on a Macintosh G3 computer with inbuilt graphic capture board provides versatile, easy to use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density and (iii) of postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.
[Model of Analysis and Prevention of Accidents - MAPA: tool for operational health surveillance].
de Almeida, Ildeberto Muniz; Vilela, Rodolfo Andrade de Gouveia; da Silva, Alessandro José Nunes; Beltran, Sandra Lorena
2014-12-01
The analysis of work-related accidents is important for accident surveillance and prevention. Current methods of analysis seek to overcome reductionist views that see these occurrences as simple events explained by operator error. The objective of this paper is to analyze the Model of Analysis and Prevention of Accidents (MAPA) and its use in monitoring interventions, highlighting aspects learned through practical use of the tool. The descriptive analytical method was used, introducing the steps of the model. To illustrate contributions and difficulties, cases where the tool was used in the context of service were selected. MAPA integrates theoretical approaches that have already been tried in studies of accidents by providing useful conceptual support from the data collection stage through to the conclusion and intervention stages. Besides revealing weaknesses of the traditional approach, it helps identify organizational determinants, such as management failings, system design and safety management involved in the accident. The main challenges lie in users' grasp of the concepts, in exploring organizational aspects upstream in the chain of decisions or at higher levels of the hierarchy, and in intervening to change the determinants of these events.
ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures
NASA Astrophysics Data System (ADS)
Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.
2008-08-01
This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular the following archives: the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectral analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to a lines database and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows, and to deploy and run them on the ESAC Grid. RISA makes full use of the interoperability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can be used directly by general VO tools.
A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program
NASA Technical Reports Server (NTRS)
Bartoszek, J. T.; Huckins, B.; Coyle, M.
1979-01-01
A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.
A simple model of hysteresis behavior using spreadsheet analysis
NASA Astrophysics Data System (ADS)
Ehrmann, A.; Blachowicz, T.
2015-01-01
Hysteresis loops occur in many scientific and technical problems, especially as the field-dependent magnetization of ferromagnetic materials, but also as stress-strain curves of materials measured by tensile tests including thermal effects, liquid-solid phase transitions, in cell biology, or in economics. While several mathematical models exist which aim to calculate hysteresis energies and other parameters, here we offer a simple model for a general hysteretic system, showing different hysteresis loops depending on the defined parameters. The calculation, which is based on basic spreadsheet analysis plus a simple macro code, can be used by students to understand how these systems work and how the parameters influence the reactions of the system to an external field. Importantly, in the step-by-step mode, each change of the system state, compared to the last step, becomes visible. The simple program can be developed further through several changes and additions, enabling the building of a tool capable of answering real physical questions in the broad field of magnetism as well as in other scientific areas in which similar hysteresis loops occur.
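In the spirit of the spreadsheet model (though not a copy of it), here is a minimal Python sketch of a generic hysteretic system built from an ensemble of two-state relay elements, a crude Preisach-type construction; the thresholds and field sweep are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
up = rng.uniform(0.0, 1.0, n)             # switching-up thresholds of the relays
down = up - rng.uniform(0.2, 0.8, n)      # switching-down thresholds (down < up)

state = -np.ones(n)                       # all relays start in the "down" state
H = np.concatenate([np.linspace(-1.5, 1.5, 200),   # ascending field branch
                    np.linspace(1.5, -1.5, 200)])  # descending field branch
M = []
for h in H:                               # step-by-step: each state change is visible
    state[h >= up] = 1.0
    state[h <= down] = -1.0
    M.append(state.mean())

# The (H, M) pairs trace a hysteresis loop; e.g. the remanence-like value
# at H ~ 0 on the descending branch:
desc = slice(200, 400)
print("descending-branch M at H~0:", M[200 + int(np.argmin(np.abs(H[desc])))])
```

Changing the threshold distributions plays the role of the model parameters in the abstract: wider gaps between `up` and `down` produce wider, more strongly hysteretic loops.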
CRCDA—Comprehensive resources for cancer NGS data analysis
Thangam, Manonanthini; Gopal, Ramesh Kumar
2015-01-01
Next generation sequencing (NGS) innovations put a compelling landmark in life science and changed the direction of research in clinical oncology with their productivity in diagnosing and treating cancer. The aim of our portal, Comprehensive Resources for Cancer NGS Data Analysis (CRCDA), is to provide a collection of different NGS tools and pipelines under diverse classes, together with cancer pathways and databases and, furthermore, literature information from PubMed. The literature data was constrained to the 18 most common cancer types, such as breast cancer and colon cancer, that are prevalent in the worldwide population. For convenience, NGS cancer tools have been categorized into cancer genomics, cancer transcriptomics, cancer epigenomics, quality control, and visualization. Pipelines for variant detection, quality control, and data analysis are listed to provide an out-of-the-box solution for NGS data analysis, which may help researchers to overcome challenges in selecting and configuring individual tools for analysing exome, whole genome, and transcriptome data. An extensive search page was developed that can be queried by using (i) type of data [literature, gene data and sequence read archive (SRA) data] and (ii) type of cancer (selected based on global incidence and accessibility of data). For each category of analysis, a variety of tools are available, and the biggest challenge lies in searching for and using the right tool for the right application. The objective of this work is to collect the tools available in each category at various places and to arrange the tools and other data in a simple and user-friendly manner, so that biologists and oncologists can find information more easily. To the best of our knowledge, we have collected and presented a comprehensive package of most of the resources available for cancer NGS data analysis. Given these factors, we believe that this website will be a useful resource for the NGS research community working on cancer. Database URL: http://bioinfo.au-kbc.org.in/ngs/ngshome.html. PMID:26450948
wft4galaxy: a workflow testing tool for galaxy.
Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi
2017-12-01
Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container; the latter reduces installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.
An Evidence-Based Videotaped Running Biomechanics Analysis.
Souza, Richard B
2016-02-01
Running biomechanics play an important role in the development of injuries. Performing a running biomechanics analysis on injured runners can help to develop treatment strategies. This article provides a framework for a systematic video-based running biomechanics analysis plan based on the current evidence on running injuries, using 2-dimensional (2D) video and readily available tools. Fourteen measurements are proposed in this analysis plan from lateral and posterior video. Identifying simple 2D surrogates for 3D biomechanical variables of interest allows for widespread translation of best practices and offers the best opportunity to impact the highly prevalent problem of the injured runner. Copyright © 2016 Elsevier Inc. All rights reserved.
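One building block of any 2D video analysis like the one proposed is computing a joint angle from three digitized landmark positions in a frame; the sketch below, with made-up hip/knee/ankle pixel coordinates, shows the standard vector-angle calculation rather than any specific measurement from the article's fourteen.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by segments b->a and b->c."""
    u, v = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Made-up pixel coordinates from one lateral video frame.
hip, knee, ankle = (412, 310), (428, 470), (415, 625)
print(f"angle at the knee: {joint_angle(hip, knee, ankle):.1f} deg")
```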
Monteiro, Pedro Tiago; Pais, Pedro; Costa, Catarina; Manna, Sauvagya; Sá-Correia, Isabel; Teixeira, Miguel Cacho
2017-01-04
We present the PATHOgenic YEAst Search for Transcriptional Regulators And Consensus Tracking (PathoYeastract - http://pathoyeastract.org) database, a tool for the analysis and prediction of transcription regulatory associations at the gene and genomic levels in the pathogenic yeasts Candida albicans and C. glabrata. Upon data retrieval from hundreds of publications, followed by curation, the database currently includes 28 000 unique documented regulatory associations between transcription factors (TF) and target genes and 107 DNA binding sites, considering 134 TFs in both species. Following the structure used for the YEASTRACT database, PathoYeastract makes available bioinformatics tools that enable the user to exploit the existing information to predict the TFs involved in the regulation of a gene or genome-wide transcriptional response, while ranking those TFs in order of their relative importance. Each search can be filtered based on the selection of specific environmental conditions, experimental evidence or positive/negative regulatory effect. Promoter analysis tools and interactive visualization tools for the representation of TF regulatory networks are also provided. The PathoYeastract database further provides simple tools for the prediction of gene and genomic regulation based on orthologous regulatory associations described for other yeast species, a comparative genomics setup for the study of cross-species evolution of regulatory networks. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
NASA Astrophysics Data System (ADS)
Strauss, B.; Dodson, D.; Kulp, S. A.; Rizza, D. H.
2016-12-01
Surging Seas Risk Finder (riskfinder.org) is an online tool for accessing extensive local projections and analysis of sea level rise; coastal floods; and land, populations, contamination sources, and infrastructure and other assets that may be exposed to inundation. Risk Finder was first published in 2013 for Florida, New York and New Jersey, expanding to all states in the contiguous U.S. by 2016, when a major new version of the tool was released with a completely new interface. The revised tool was informed by hundreds of survey responses from and conversations with planners, local officials and other coastal stakeholders, plus consideration of modern best practices for responsive web design and user interfaces, and social science-based principles for science communication. Overarching design principles include simplicity and ease of navigation, leading to a landing page with Google-like sparsity and focus on search, and to an architecture based on search, so that each coastal zip code, city, county, state or other place type has its own webpage gathering all relevant analysis in modular, scrollable units. Millions of users have visited the Surging Seas suite of tools to date, and downloaded thousands of files, for stated purposes ranging from planning to business to education to personal decisions; and from institutions ranging from local to federal government agencies, to businesses, to NGOs, and to academia.
Pulling My Gut out--Simple Tools for Engaging Students in Gross Anatomy Lectures
ERIC Educational Resources Information Center
Chan, Lap Ki
2010-01-01
A lecture is not necessarily a monologue, promoting only passive learning. If appropriate techniques are used, a lecture can stimulate active learning too. One such method is demonstration, which can engage learners' attention and increase the interaction between the lecturer and the learners. This article describes two simple and useful tools for…
Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring
Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia
2010-01-01
The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted a high interest by the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and applications carried out by the Spanish scientific community is presented. A brief explanation of theoretical aspects that highlight their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551
SAMP, the Simple Application Messaging Protocol: Letting applications talk to each other
NASA Astrophysics Data System (ADS)
Taylor, M. B.; Boch, T.; Taylor, J.
2015-06-01
SAMP, the Simple Application Messaging Protocol, is a hub-based communication standard for the exchange of data and control between participating client applications. It has been developed within the context of the Virtual Observatory with the aim of enabling specialised data analysis tools to cooperate as a loosely integrated suite, and is now in use by many and varied desktop and web-based applications dealing with astronomical data. This paper reviews the requirements and design principles that led to SAMP's specification, provides a high-level description of the protocol, and discusses some of its common and possible future usage patterns, with particular attention to those factors that have aided its success in practice.
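For readers who want to try SAMP from a script, here is a minimal sketch using the astropy.samp client (a real Python implementation of the protocol, though the table URL and message parameters below are placeholders); it broadcasts a "load this VOTable" notification to whatever SAMP-aware tools are connected to a running hub.

```python
from astropy.samp import SAMPIntegratedClient

client = SAMPIntegratedClient(name="demo-sender")
client.connect()                     # requires a running SAMP hub

# Broadcast a notification asking connected clients to load a table
# (the URL is a placeholder, not a real resource).
client.notify_all({
    "samp.mtype": "table.load.votable",
    "samp.params": {
        "url": "file:///tmp/example.votable",
        "name": "example table",
    },
})
client.disconnect()
```

The mtype string ("table.load.votable") is one of the standard vocabulary entries that lets, for example, a catalogue tool hand a table to a plotting tool without either knowing the other's internals.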
André, Etienne; Boutonnet, Baptiste; Charles, Pauline; Martini, Cyril; Aguiar-Hualde, Juan-Manuel; Latil, Sylvain; Guérineau, Vincent; Hammad, Karim; Ray, Priyanka; Guillot, Régis; Huc, Vincent
2016-02-24
Short segments of zigzag single-walled carbon nanotubes (SWCNTs) were obtained from a calixarene scaffold by using a completely new, simple and expedited strategy that allowed fine-tuning of their diameters. This new approach also allows for functionalised short segments of zigzag SWCNTs to be obtained; a prerequisite towards their lengthening. These new SWCNT short segments/calixarene composites show interesting behaviour in solution. DFT analysis of these new compounds also suggests interesting photophysical behaviour. Along with the synthesis of various SWCNTs segments, this approach also constitutes a powerful tool for the construction of new, radially oriented π systems. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Margaria, Tiziana; Kubczak, Christian; Steffen, Bernhard
2008-04-25
With Bio-jETI, we introduce a service platform for interdisciplinary work on biological application domains and illustrate its use in a concrete application concerning statistical data processing in R and xcms for an LC/MS analysis of FAAH gene knockout. Bio-jETI uses the jABC environment for service-oriented modeling and design as a graphical process modeling tool and the jETI service integration technology for remote tool execution. As a service definition and provisioning platform, Bio-jETI has the potential to become a core technology in interdisciplinary service orchestration and technology transfer. Domain experts, like biologists not trained in computer science, directly define complex service orchestrations as process models and use efficient and complex bioinformatics tools in a simple and intuitive way.
Creating Simple Admin Tools Using Info*Engine and Java
NASA Technical Reports Server (NTRS)
Jones, Corey; Kapatos, Dennis; Skradski, Cory; Felkins, J. D.
2012-01-01
PTC has provided a simple way to dynamically interact with Windchill using Info*Engine. This presentation will describe how to create simple Info*Engine tasks capable of saving Windchill 10.0 administrators from tedious work.
NASA Astrophysics Data System (ADS)
Gerhard, Christoph; Adams, Geoff
2015-10-01
Geometric optics is at the heart of optics teaching. Some of us may remember using pins and string to test the simple lens equation at school. Matters get more complex at undergraduate/postgraduate levels as we are introduced to paraxial rays, real rays, wavefronts, aberration theory and much more. Software is essential for the later stages, and the right software can profitably be used even at school. We present two free PC programs, which have been widely used in optics teaching, and have been further developed in close cooperation with lecturers/professors in order to address the current content of the curricula for optics, photonics and lasers in higher education. PreDesigner is a single thin lens modeller. It illustrates the simple lens law with construction rays and then allows the user to include field size and aperture. Sliders can be used to adjust key values with instant graphical feedback. This tool thus represents a helpful teaching medium for the visualization of basic interrelations in optics. WinLens3DBasic can model multiple thin or thick lenses with real glasses. It shows the system foci, principal planes and nodal points, gives paraxial ray trace values, details the Seidel aberrations, and offers real ray tracing and many forms of analysis. It is simple to reverse lenses and model tilts and decenters. This tool therefore provides a good base for learning lens design fundamentals. Much work has been put into offering these features in ways that are easy to use, and offer opportunities to enhance the student's background understanding.
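The simple lens law that PreDesigner visualizes is easy to state in code. A minimal Python sketch, assuming the real-is-positive sign convention (the function name and example values are illustrative, not taken from the tools):

```python
def thin_lens_image(f_mm, u_mm):
    """Solve 1/f = 1/u + 1/v for image distance v and magnification m.

    Distances in mm; u is the object distance. A negative v indicates
    a virtual image on the object side of the lens.
    """
    if u_mm == f_mm:
        raise ValueError("object at the focal point: image at infinity")
    v_mm = 1.0 / (1.0 / f_mm - 1.0 / u_mm)
    return v_mm, -v_mm / u_mm

# A 50 mm lens imaging an object 200 mm away:
v, m = thin_lens_image(50.0, 200.0)
print(f"image at {v:.1f} mm, magnification {m:.2f}")  # 66.7 mm, -0.33
```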
Brunner, C; Hoffmann, K; Thiele, T; Schedler, U; Jehle, H; Resch-Genger, U
2015-04-01
Commercial platforms consisting of ready-to-use microarrays printed with target-specific DNA probes, a microarray scanner, and software for data analysis are available for different applications in medical diagnostics and food analysis, detecting, e.g., viral and bacteriological DNA sequences. The transfer of these tools from basic research to routine analysis, their broad acceptance in regulated areas, and their use in medical practice require suitable calibration tools for regular control of instrument performance in addition to internal assay controls. Here, we present the development of a novel assay-adapted calibration slide for a commercialized DNA-based assay platform, consisting of precisely arranged fluorescent areas of various intensities obtained by incorporating different concentrations of a "green" dye and a "red" dye in a polymer matrix. These dyes are "Cy3" and "Cy5" analogues with improved photostability, chosen because their spectroscopic properties closely match those of common labels for the green and red channels of microarray scanners. This simple tool allows the performance of the microarray scanner provided with the biochip platform to be assessed and controlled efficiently and regularly, and different scanners to be compared. It will eventually be used as a fluorescence intensity scale for referencing assay results and to enhance the overall comparability of diagnostic tests.
Simple Sensitivity Analysis for Orion GNC
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool (Critical Factors Tool or CFT) developed to find the input variables or pairs of variables which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. Input variables such as moments, mass, thrust dispersions, and date of launch were found to be significant factors for the success of various requirements. Examples are shown in this paper, as well as a summary and physics discussion of the EFT-1 driving factors that the tool found.
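One of the simple measures mentioned, estimating how requirement success probability varies with a dispersed input, can be sketched in a few lines. This is an illustrative reconstruction in Python, not the CFT itself; the toy model and names are assumptions:

```python
import numpy as np

def success_probability_by_bin(x, success, n_bins=5):
    """P(success) within quantile bins of one dispersed input variable.

    A large spread across bins flags x as a driving factor for the
    requirement; a flat profile suggests insensitivity.
    """
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
    bins = np.clip(np.digitize(x, edges[1:-1]), 0, n_bins - 1)
    return np.array([success[bins == b].mean() for b in range(n_bins)])

rng = np.random.default_rng(0)
thrust = rng.normal(1.0, 0.05, 10_000)                      # dispersed input
miss = rng.normal(0.0, 1.0, 10_000) + 5.0 * (thrust - 1.0)  # toy touchdown miss
print(success_probability_by_bin(thrust, np.abs(miss) < 2.0))
```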
Software Models Impact Stresses
NASA Technical Reports Server (NTRS)
Hanshaw, Timothy C.; Roy, Dipankar; Toyooka, Mark
1991-01-01
Generalized Impact Stress Software designed to assist engineers in predicting stresses caused by variety of impacts. Program straightforward, simple to implement on personal computers, "user friendly", and handles variety of boundary conditions applied to struck body being analyzed. Applications include mathematical modeling of motions and transient stresses of spacecraft, analysis of slamming of piston, of fast valve shutoffs, and play of rotating bearing assembly. Provides fast and inexpensive analytical tool for analysis of stresses and reduces dependency on expensive impact tests. Written in FORTRAN 77. Requires use of commercial software package PLOT88.
Joint symbolic dynamics for the assessment of cardiovascular and cardiorespiratory interactions
Baumert, Mathias; Javorka, Michal; Kabir, Muammar
2015-01-01
Beat-to-beat variations in heart period provide information on cardiovascular control and are closely linked to variations in arterial pressure and respiration. Joint symbolic analysis of heart period, systolic arterial pressure and respiration allows for a simple description of their shared short-term dynamics that are governed by cardiac baroreflex control and cardiorespiratory coupling. In this review, we discuss methodology and research applications. Studies suggest that analysis of joint symbolic dynamics provides a powerful tool for identifying physiological and pathophysiological changes in cardiovascular and cardiorespiratory control. PMID:25548272
Joint symbolic dynamics for the assessment of cardiovascular and cardiorespiratory interactions.
Baumert, Mathias; Javorka, Michal; Kabir, Muammar
2015-02-13
Beat-to-beat variations in heart period provide information on cardiovascular control and are closely linked to variations in arterial pressure and respiration. Joint symbolic analysis of heart period, systolic arterial pressure and respiration allows for a simple description of their shared short-term dynamics that are governed by cardiac baroreflex control and cardiorespiratory coupling. In this review, we discuss methodology and research applications. Studies suggest that analysis of joint symbolic dynamics provides a powerful tool for identifying physiological and pathophysiological changes in cardiovascular and cardiorespiratory control.
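The symbolization at the heart of this method is compact enough to sketch. A simplified Python variant, assuming binary increase/decrease symbols and words of length three (published joint symbolic dynamics schemes differ in alphabet and thresholds):

```python
import numpy as np
from collections import Counter

def symbolize(series):
    """Map successive differences to symbols: 1 = increase, 0 = otherwise."""
    return (np.diff(np.asarray(series, dtype=float)) > 0).astype(int)

def joint_word_distribution(heart_period, systolic_pressure, word_len=3):
    """Relative frequencies of joint symbol words from the two signals."""
    s1, s2 = symbolize(heart_period), symbolize(systolic_pressure)
    words = Counter(
        (tuple(s1[i:i + word_len]), tuple(s2[i:i + word_len]))
        for i in range(min(len(s1), len(s2)) - word_len + 1)
    )
    total = sum(words.values())
    return {w: n / total for w, n in words.items()}
```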
USDA-ARS's Scientific Manuscript database
Plant pathogen detection takes many forms. In simple cases, researchers are attempting to detect a known pathogen from a known host utilizing targeted nucleic acid or antigenic assays. However, in more complex scenarios researchers may not know the identity of a pathogen, or they may need to screen ...
Detection of genomic rearrangements in cucumber using genomecmp software
NASA Astrophysics Data System (ADS)
Kulawik, Maciej; Pawełkowicz, Magdalena Ewa; Wojcieszek, Michał; Pląder, Wojciech; Nowak, Robert M.
2017-08-01
Comparative genomics is a rapidly evolving science, driven by the growing amount of genome sequence information available in databases. A simple comparison of general genome features such as genome size, number of genes, and chromosome number presents an entry point into comparative genomic analysis. Here we present the utility of the new tool genomecmp for finding rearrangements between compared sequences and its applications in plant comparative genomics.
Cunningham, J C; Sinka, I C; Zavaliangos, A
2004-08-01
In this first of two articles on the modeling of tablet compaction, the experimental inputs related to the constitutive model of the powder and the powder/tooling friction are determined. The continuum-based analysis of tableting makes use of an elasto-plastic model, which incorporates the elements of yield, plastic flow potential, and hardening, to describe the mechanical behavior of microcrystalline cellulose over the range of densities experienced during tableting. Specifically, a modified Drucker-Prager/cap plasticity model, which includes material parameters such as cohesion, internal friction, and hydrostatic yield pressure that evolve with the internal state variable relative density, was applied. Linear elasticity is assumed with the elastic parameters, Young's modulus, and Poisson's ratio dependent on the relative density. The calibration techniques were developed based on a series of simple mechanical tests including diametrical compression, simple compression, and die compaction using an instrumented die. The friction behavior is measured using an instrumented die and the experimental data are analyzed using the method of differential slices. The constitutive model and frictional properties are essential experimental inputs to the finite element-based model described in the companion article. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association J Pharm Sci 93:2022-2039, 2004
Weighing Evidence “Steampunk” Style via the Meta-Analyser
Bowden, Jack; Jackson, Chris
2016-01-01
The funnel plot is a graphical visualization of summary data estimates from a meta-analysis, and is a useful tool for detecting departures from the standard modeling assumptions. Although perhaps not widely appreciated, a simple extension of the funnel plot can help to facilitate an intuitive interpretation of the mathematics underlying a meta-analysis at a more fundamental level, by equating it to determining the center of mass of a physical system. We used this analogy to explain the concepts of weighing evidence and of biased evidence to a young audience at the Cambridge Science Festival, without recourse to precise definitions or statistical formulas and with a little help from Sherlock Holmes! Following on from the science fair, we have developed an interactive web-application (named the Meta-Analyser) to bring these ideas to a wider audience. We envisage that our application will be a useful tool for researchers when interpreting their data. First, to facilitate a simple understanding of fixed and random effects modeling approaches; second, to assess the importance of outliers; and third, to show the impact of adjusting for small study bias. This final aim is realized by introducing a novel graphical interpretation of the well-known method of Egger regression. PMID:28003684
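The centre-of-mass analogy corresponds exactly to the inverse-variance weighted mean of a fixed-effect meta-analysis: each study's weight 1/se² plays the role of mass. A minimal Python sketch with toy numbers (not data from the paper):

```python
import numpy as np

def fixed_effect_pool(estimates, std_errors):
    """Pooled estimate = centre of mass of study estimates, mass = 1/se^2."""
    y = np.asarray(estimates, dtype=float)
    w = 1.0 / np.asarray(std_errors, dtype=float) ** 2
    pooled = np.sum(w * y) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled_se

pooled, se = fixed_effect_pool([0.20, 0.50, 0.35], [0.10, 0.25, 0.15])
print(f"pooled effect {pooled:.3f}, 95% CI +/- {1.96 * se:.3f}")
```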
Rasheed, Waqas; Neoh, Yee Yik; Bin Hamid, Nor Hisham; Reza, Faruque; Idris, Zamzuri; Tang, Tong Boon
2017-10-01
Functional neuroimaging modalities play an important role in deciding the diagnosis and course of treatment of neuronal dysfunction and degeneration. This article presents an analytical tool with visualization that exploits the strengths of the MEG (magnetoencephalography) neuroimaging technique. The tool automates MEG data import (in tSSS format), channel information extraction, time/frequency decomposition, and circular graph visualization (connectogram) for simple result inspection. For advanced users, the tool also provides magnitude squared coherence (MSC) values allowing personalized threshold levels, and the computation of a default model from MEG data of a control population. The default model obtained from healthy population data serves as a useful benchmark to diagnose and monitor neuronal recovery during treatment. The proposed tool further provides optional labels with international 10-10 system nomenclature in order to facilitate comparison studies with EEG (electroencephalography) sensor space. Potential applications in epilepsy and traumatic brain injury studies are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
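The coherence-to-connectogram step can be sketched with SciPy's standard magnitude squared coherence estimator; the frequency band, threshold, and segment length below are illustrative, not the tool's defaults:

```python
import numpy as np
from scipy.signal import coherence

def msc_adjacency(data, fs, band=(8.0, 13.0), threshold=0.5):
    """Band-averaged MSC between all channel pairs, thresholded into the
    binary adjacency matrix behind a connectogram.

    data: array of shape (n_channels, n_samples); fs: sampling rate in Hz.
    """
    n = data.shape[0]
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            f, cxy = coherence(data[i], data[j], fs=fs, nperseg=256)
            in_band = (f >= band[0]) & (f <= band[1])
            adj[i, j] = adj[j, i] = cxy[in_band].mean() > threshold
    return adj
```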
Collaboration tools and techniques for large model datasets
Signell, R.P.; Carniel, S.; Chiggiato, J.; Janekovic, I.; Pullen, J.; Sherwood, C.R.
2008-01-01
In MREA and many other marine applications, it is common to have multiple models running with different grids, run by different institutions. Techniques and tools are described for low-bandwidth delivery of data from large multidimensional datasets, such as those from meteorological and oceanographic models, directly into generic analysis and visualization tools. Output is stored using the NetCDF CF Metadata Conventions, and then delivered to collaborators over the web via OPeNDAP. OPeNDAP datasets served by different institutions are then organized via THREDDS catalogs. Tools and procedures are then used which enable scientists to explore data on the original model grids using tools they are familiar with. It is also low-bandwidth, enabling users to extract just the data they require, an important feature for access from ships or remote areas. The entire implementation is simple enough to be handled by modelers working with their webmasters - no advanced programming support is necessary. © 2007 Elsevier B.V. All rights reserved.
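The access pattern described, pulling only a small slab of a remote model dataset over OPeNDAP, looks like this in Python with the netCDF4 library (the URL and variable name are hypothetical, and the library must be built with DAP support):

```python
from netCDF4 import Dataset

# Hypothetical endpoint; real datasets are typically found via a THREDDS catalog.
url = "http://example.org/thredds/dodsC/roms/forecast.nc"

ds = Dataset(url)                           # opens the remote dataset lazily
temp = ds.variables["temp"][-1, -1, :, :]   # only this slab crosses the network:
print(temp.shape)                           # last time step, surface layer
ds.close()
```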
Al-Khalifah, Nasser S; Shanavaskhan, A E
2017-01-01
Ambiguity in the total number of date palm cultivars across the world points to the need for an enumerative study using standard morphological and molecular markers. Among molecular markers, DNA markers are more suitable and ubiquitous for most applications. They are highly polymorphic in nature, frequently occurring in genomes, easy to access, and highly reproducible. Various molecular markers such as restriction fragment length polymorphism (RFLP), amplified fragment length polymorphism (AFLP), simple sequence repeats (SSR), inter-simple sequence repeats (ISSR), and random amplified polymorphic DNA (RAPD) markers have been successfully used as efficient tools for the analysis of genetic variation in date palm. This chapter explains a stepwise protocol for extracting total genomic DNA from date palm leaves. A user-friendly protocol for RAPD analysis and a table showing the primers used in different molecular techniques that produce polymorphisms in date palm are also provided.
Decoding spike timing: the differential reverse correlation method
Tkačik, Gašper; Magnasco, Marcelo O.
2009-01-01
It is widely acknowledged that detailed timing of action potentials is used to encode information, for example in auditory pathways; however the computational tools required to analyze encoding through timing are still in their infancy. We present a simple example of encoding, based on a recent model of time-frequency analysis, in which units fire action potentials when a certain condition is met, but the timing of the action potential depends also on other features of the stimulus. We show that, as a result, spike-triggered averages are smoothed so much they do not represent the true features of the encoding. Inspired by this example, we present a simple method, differential reverse correlations, that can separate an analysis of what causes a neuron to spike, and what controls its timing. We analyze with this method the leaky integrate-and-fire neuron and show the method accurately reconstructs the model's kernel. PMID:18597928
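For context, the baseline spike-triggered average, which the paper shows can be smoothed into uninformativeness by timing jitter, is computed as follows (a minimal Python sketch; the differential reverse-correlation refinement itself involves more machinery):

```python
import numpy as np

def spike_triggered_average(stimulus, spike_indices, window):
    """Mean of the stimulus segments that precede each spike.

    stimulus: 1-D array on a regular time grid; spike_indices: sample
    indices of spikes; window: number of samples before each spike.
    """
    segments = [stimulus[t - window:t] for t in spike_indices if t >= window]
    return np.mean(segments, axis=0)
```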
[Quality assurance of the renal applications software].
del Real Núñez, R; Contreras Puertas, P I; Moreno Ortega, E; Mena Bares, L M; Maza Muret, F R; Latre Romero, J M
2007-01-01
The need for quality assurance of all technical aspects of nuclear medicine studies is widely recognised. However, little attention has been paid to the quality assurance of the applications software. Our work reported here aims at verifying the analysis software for processing of renal nuclear medicine studies (renograms). The software tools were used to build a synthetic dynamic model of the renal system. The model consists of two phases: perfusion and function. The organs of interest (kidneys, bladder and aortic artery) were simple geometric forms. The uptake of the renal structures was described by mathematical functions. Curves corresponding to normal or pathological conditions were simulated for kidneys, bladder and aortic artery by appropriate selection of parameters. There was no difference between the parameters of the mathematical curves and the quantitative data produced by the renal analysis program. Our test procedure for verifying the renal applications software is simple to apply, reliable, reproducible and rapid.
ERIC Educational Resources Information Center
Gron, Liz U.; Bradley, Shelly B.; McKenzie, Jennifer R.; Shinn, Sara E.; Teague, M. Warfield
2013-01-01
This paper presents the use of simple, outcome-based assessment tools to design and evaluate the first semester of a new introductory laboratory program created to teach green analytical chemistry using environmental samples. This general chemistry laboratory program, like many introductory courses, has a wide array of stakeholders within and…
2012-01-01
Background The radiation field on most megavoltage radiation therapy units is indicated by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment. Hence it is imperative that the light field is congruent with the radiation field. Method A simple quality assurance tool has been designed for rapid and simple testing of light and radiation field congruence using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Results Both the single and double exposure techniques were evaluated, with the double exposure technique providing better visualization of the light-radiation field markers. Light and radiation field congruence could be verified to within 1 mm. This satisfies the American Association of Physicists in Medicine Task Group report number 142 recommendation of a 2 mm tolerance. Conclusion The QA tool can be used with either an EPID or CR to provide a simple and rapid method to verify light and radiation field congruence. PMID:22452821
Full quantum mechanical analysis of atomic three-grating Mach–Zehnder interferometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanz, A.S., E-mail: asanz@iff.csic.es; Davidović, M.; Božić, M.
2015-02-15
Atomic three-grating Mach–Zehnder interferometry constitutes an important tool to probe fundamental aspects of the quantum theory. There is, however, a remarkable gap in the literature between the oversimplified models and robust numerical simulations considered to describe the corresponding experiments. Consequently, the former usually lead to paradoxical scenarios, such as the wave–particle dual behavior of atoms, while the latter make data analysis in simple terms difficult. Here these issues are tackled by means of a simple grating working model consisting of evenly-spaced Gaussian slits. As is shown, this model suffices to explore and explain such experiments both analytically and numerically, giving a good account of the full atomic journey inside the interferometer, and hence helping to demystify the physics involved. More specifically, it provides a clear and unambiguous picture of the wavefront splitting that takes place inside the interferometer, illustrating how the momentum along each emerging diffraction order is well defined even though the wave function itself still displays a rather complex shape. To this end, the local transverse momentum is also introduced in this context as a reliable analytical tool. The splitting, apart from being a key issue to understand atomic Mach–Zehnder interferometry, also demonstrates at a fundamental level how wave and particle aspects are always present in the experiment, without incurring any contradiction or interpretive paradox. On the other hand, at a practical level, the generality and versatility of the model and methodology presented make them suitable to attack analogous problems in a simple manner after a convenient tuning.
Highlights:
• A simple model is proposed to analyze experiments based on atomic Mach–Zehnder interferometry.
• The model can be easily handled both analytically and computationally.
• A theoretical analysis based on the combination of the position and momentum representations is considered.
• Wave and particle aspects are shown to coexist within the same experiment, thus removing the old wave–corpuscle dichotomy.
• A good agreement between numerical simulations and experimental data is found without appealing to best-fit procedures.
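The evenly-spaced Gaussian slit model is straightforward to reproduce numerically. A Python sketch of the grating transmission function and its far-field (Fourier) intensity, with illustrative parameters rather than those of the experiments analysed in the paper:

```python
import numpy as np

x = np.linspace(-50.0, 50.0, 2 ** 14)   # transverse coordinate (arb. units)
d, sigma, n_slits = 2.0, 0.3, 21        # period, Gaussian slit width, slit count

centers = d * (np.arange(n_slits) - n_slits // 2)
t = sum(np.exp(-(x - c) ** 2 / (2.0 * sigma ** 2)) for c in centers)

# The far-field amplitude is the Fourier transform of the transmission
# function; diffraction orders appear as evenly spaced peaks in k-space.
intensity = np.abs(np.fft.fftshift(np.fft.fft(t))) ** 2
```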
Consumption value theory and the marketing of public health: an effective formative research tool.
Nelson, Douglas G; Byus, Kent
2002-01-01
Contemporary public health requires the support and participation of its constituency. This study assesses the capacity of consumption value theory to identify the basis of this support. A telephone survey design used simple random sampling of adult residents of Cherokee County, Oklahoma. Factor analysis and stepwise discriminant analysis were used to identify and classify personal and societal level support variables. Most residents base societal level support on epistemic values. Direct services clientele base their support on positive emotional values derived from personal contact and attractive programs. Residents are curious about public health and want to know more about the health department. Whereas marketing the effectiveness of public health programs would yield relatively little support, marketing health promotion activities may attract public opposition. This formative research tool suggests a marketing strategy for public health practitioners.
Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA
2008-01-01
Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data management and analysis tasks through a single easy-to-use web application. The system architecture is flexible and scalable to allow new array types, analysis algorithms and tools to be added with relative ease and to cope with large increases in data volume. PMID:19032776
Big Data Tools as Applied to ATLAS Event Data
NASA Astrophysics Data System (ADS)
Vukotic, I.; Gardner, R. W.; Bryant, L. A.
2017-10-01
Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and the associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and tools like Spark, Jupyter, R, SciPy, Caffe, TensorFlow, etc. Machine learning challenges such as the Higgs Boson Machine Learning Challenge and the Tracking challenge, event viewers (VP1, ATLANTIS, ATLASrift), and still-to-be-developed educational and outreach tools would be able to access the data through a simple REST API. In this preliminary investigation we focus on derived xAOD data sets. These are much smaller than the primary xAODs, having containers, variables, and events of interest to a particular analysis. Encouraged by the performance of Elasticsearch for the ADC analytics platform, we developed an algorithm for indexing derived xAOD event data. We have made an appropriate document mapping and have imported a full set of standard model W/Z datasets. We compare the disk space efficiency of this approach to that of standard ROOT files and its performance in simple cut-flow data analysis, and we present preliminary results on its scaling characteristics with different numbers of clients, query complexity, and size of the data retrieved.
The Phyre2 web portal for protein modelling, prediction and analysis
Kelley, Lawrence A; Mezulis, Stefans; Yates, Christopher M; Wass, Mark N; Sternberg, Michael JE
2017-01-01
Summary Phyre2 is a suite of tools available on the web to predict and analyse protein structure, function and mutations. The focus of Phyre2 is to provide biologists with a simple and intuitive interface to state-of-the-art protein bioinformatics tools. Phyre2 replaces Phyre, the original version of the server for which we previously published a protocol. In this updated protocol, we describe Phyre2, which uses advanced remote homology detection methods to build 3D models, predict ligand binding sites, and analyse the effect of amino-acid variants (e.g. nsSNPs) for a user's protein sequence. Users are guided through results by a simple interface at a level of detail determined by them. This protocol will guide a user from submitting a protein sequence to interpreting the secondary and tertiary structure of their models, their domain composition and model quality. A range of additional available tools is described to find a protein structure in a genome, to submit a large number of sequences at once and to automatically run weekly searches for proteins that are difficult to model. The server is available at http://www.sbg.bio.ic.ac.uk/phyre2. A typical structure prediction will be returned between 30 minutes and 2 hours after submission. PMID:25950237
Simplified aeroelastic modeling of horizontal axis wind turbines
NASA Technical Reports Server (NTRS)
Wendell, J. H.
1982-01-01
Certain aspects of the aeroelastic modeling and behavior of the horizontal axis wind turbine (HAWT) are examined. Two simple three degree of freedom models are described in this report, and tools are developed which allow other simple models to be derived. The first simple model developed is an equivalent hinge model to study the flap-lag-torsion aeroelastic stability of an isolated rotor blade. The model includes nonlinear effects, preconing, and noncoincident elastic axis, center of gravity, and aerodynamic center. A stability study is presented which examines the influence of key parameters on aeroelastic stability. Next, two general tools are developed to study the aeroelastic stability and response of a teetering rotor coupled to a flexible tower. The first of these tools is an aeroelastic model of a two-bladed rotor on a general flexible support. The second general tool is a harmonic balance solution method for the resulting second order system with periodic coefficients. The second simple model developed is a rotor-tower model which serves to demonstrate the general tools. This model includes nacelle yawing, nacelle pitching, and rotor teetering. Transient response time histories are calculated and compared to a similar model in the literature. Agreement between the two is very good, especially considering how few harmonics are used. Finally, a stability study is presented which examines the effects of support stiffness and damping, inflow angle, and preconing.
Quality Control System using Simple Implementation of Seven Tools for Batik Textile Manufacturing
NASA Astrophysics Data System (ADS)
Ragil Suryoputro, Muhammad; Sugarindra, Muchamad; Erfaisalsyah, Hendy
2017-06-01
In order to produce better products and mitigate defects, every company must implement a quality control system that is capable and reliable. One approach is the simple implementation of the seven quality control tools. The case studied in this research was the defect level of xyz grey fabric woven on shuttle loom 2 at a batik textile manufacturing company. The seven tools include: flowchart, check sheet, histogram, scatter diagram combined with control charts, Pareto diagram and fishbone diagram (cause-and-effect diagram). The check sheet identified the types of defects in the woven xyz grey fabric as warp defects, double warp, warp breaks, empty warp, loose warp, ugly edges, thick warp, and rust. The control chart analysis indicates that the process is out of control; the chart still contains many outlying data points. The scatter diagram shows a positive correlation between the percentage of defects and production volume. Based on the Pareto diagram, repair priority goes to the dominant defect type, warp (44%), while the histogram shows double warp with the highest value, 23,635.11 m. In addition, the fishbone diagram traces double warp and the other defect types to causes in materials, methods, machines, measurements, man and environment. With these findings the company can take preventive and corrective action to minimize defects and improve product quality.
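The Pareto step of the seven-tools workflow is a one-screen computation: sort the defect counts, accumulate percentages, and read off the vital few. A Python sketch with illustrative counts (not the paper's data):

```python
import numpy as np

defects = {"double warp": 520, "warp breaks": 410, "empty warp": 130,
           "loose warp": 90, "ugly edges": 60, "thick warp": 40, "rust": 25}

items = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)
counts = np.array([n for _, n in items], dtype=float)
cum_pct = 100.0 * np.cumsum(counts) / counts.sum()

for (name, n), pct in zip(items, cum_pct):
    print(f"{name:12s} {n:5.0f}   cumulative {pct:5.1f}%")
# Defect types up to roughly 80% cumulative share are the repair priorities.
```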
NASA Astrophysics Data System (ADS)
Kalchenko, Vyacheslav; Molodij, Guillaume; Kuznetsov, Yuri; Smolyakov, Yuri; Israeli, David; Meglinski, Igor; Harmelin, Alon
2016-03-01
The use of fluorescence imaging of vascular permeability has become a gold standard for assessing the inflammation process during experimental immune response in vivo. Optical fluorescence imaging provides a very useful and simple tool for this purpose. The motivation comes from the need for robust and simple quantification and data presentation of inflammation based on vascular permeability. Change in fluorescence intensity as a function of time is a widely accepted method of assessing vascular permeability during inflammation related to the immune response. In the present study we propose to add a new dimension by applying a more sophisticated, quantitative analysis of the vascular reaction based on methods derived from astronomical observations, in particular space-time Fourier filtering followed by a polynomial orthogonal mode decomposition. We demonstrate that the temporal evolution of the fluorescence intensity observed at certain pixels correlates quantitatively with blood flow circulation under normal conditions. The approach allows us to determine the regions of permeability and to monitor both the fast kinetics related to the distribution of the contrast material in the circulatory system and the slow kinetics associated with its extravasation. Thus, we introduce a simple and convenient method for fast quantitative visualization of the leakage related to the inflammatory (immune) reaction in vivo.
Suárez Álvarez, Óscar; Fernández-Feito, Ana; Vallina Crespo, Henar; Aldasoro Unamuno, Elena; Cofiño, Rafael
2018-05-11
It is essential to develop a comprehensive approach to institutionally promoted interventions to assess their impact on health from the perspective of the social determinants of health and equity. Simple, adapted tools must be developed to carry out these assessments. The aim of this paper is to present two tools to assess the impact of programmes and community-based interventions on the social determinants of health. The first tool is intended to assess health programmes through interviews and analysis of information provided by the assessment team. The second tool, by means of online assessments of community-based interventions, also enables a report on inequality issues that includes recommendations for improvement. In addition to reducing health-related social inequities, the implementation of these tools can also help to improve the efficiency of public health interventions. Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
Zakaria, Rasheed; Ellenbogen, Jonathan; Graham, Catherine; Pizer, Barry; Mallucci, Conor; Kumar, Ram
2013-08-01
Complications may occur following posterior fossa tumour surgery in children. Such complications are subjectively and inconsistently reported even though they may have significant long-term behavioural and cognitive consequences for the child. This makes comparison of surgeons, programmes and treatments problematic. We have devised a causality tool for assessing whether an adverse event after surgery can be classified as a surgical complication using a series of simple questions, based on a tool used in assessing adverse drug reactions. This tool, which we have called the "Liverpool Neurosurgical Complication Causality Assessment Tool", was developed by reviewing a series of ten posterior fossa tumour cases with a panel of neurosurgery, neurology, oncology and neuropsychology specialists working in a multidisciplinary paediatric tumour treatment programme. We have demonstrated its use and hope that it may improve reliability between different assessors, both in evaluating the outcomes of existing programmes and treatments and in trials that directly compare the effects of surgical and medical treatments.
Exposure assessment in health assessments for hand-arm vibration syndrome.
Mason, H J; Poole, K; Young, C
2011-08-01
Assessing past cumulative vibration exposure is part of assessing the risk of hand-arm vibration syndrome (HAVS) in workers exposed to hand-arm vibration and invariably forms part of a medical assessment of such workers. To investigate the strength of relationships between the presence and severity of HAVS and different cumulative exposure metrics obtained from a self-reporting questionnaire. Cumulative exposure metrics were constructed from a tool-based questionnaire applied in a group of HAVS referrals and workplace field studies. These metrics included simple years of vibration exposure, cumulative total hours of all tool use, and differing combinations of acceleration magnitudes for specific tools and their daily use, including the current frequency-weighting method contained in ISO 5349-1:2001. Use of simple years of exposure is a weak predictor of HAVS or its increasing severity. The calculation of cumulative hours across all vibrating tools used is a more powerful predictor. More complex calculations involving likely acceleration data for specific classes of tools, either frequency weighted or not, did not offer a clear further advantage in this dataset. This may be due to the uncertainty associated with workers' recall of their past tool usage or the variability between tools in the magnitude of their vibration emission. Assessing years of exposure or 'latency' in a worker should be replaced by cumulative hours of tool use. This can be readily obtained using a tool-pictogram-based self-reporting questionnaire and a simple spreadsheet calculation.
Visualising nursing data using correspondence analysis.
Kokol, Peter; Blažun Vošner, Helena; Železnik, Danica
2016-09-01
Digitally stored, large healthcare datasets enable nurses to use 'big data' techniques and tools in nursing research. Big data is complex and multi-dimensional, so visualisation may be a preferable approach to analyse and understand it. To demonstrate the use of visualisation of big data in a technique called correspondence analysis. In the authors' study, relations among data in a nursing dataset were shown visually in graphs using correspondence analysis. The case presented demonstrates that correspondence analysis is easy to use, shows relations between data visually in a form that is simple to interpret, and can reveal hidden associations between data. Correspondence analysis supports the discovery of new knowledge. Implications for practice: Knowledge obtained using correspondence analysis can be transferred immediately into practice or used to foster further research.
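Computationally, correspondence analysis is an SVD of the standardized residuals of a contingency table. A compact NumPy sketch of the generic algorithm (not the software used in the study):

```python
import numpy as np

def correspondence_analysis(table):
    """Return 2-D row and column coordinates for a CA biplot.

    table: 2-D array of non-negative counts (a contingency table).
    """
    P = np.asarray(table, dtype=float)
    P /= P.sum()
    r, c = P.sum(axis=1), P.sum(axis=0)
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U * s) / np.sqrt(r)[:, None]     # principal row coordinates
    cols = (Vt.T * s) / np.sqrt(c)[:, None]  # principal column coordinates
    return rows[:, :2], cols[:, :2]          # plot these two dimensions
```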
Evaluation of IOTA Simple Ultrasound Rules to Distinguish Benign and Malignant Ovarian Tumours.
Garg, Sugandha; Kaur, Amarjit; Mohi, Jaswinder Kaur; Sibia, Preet Kanwal; Kaur, Navkiran
2017-08-01
IOTA stands for the International Ovarian Tumour Analysis group. Ovarian cancer is one of the most common cancers in women and in the majority of cases is diagnosed at a late stage. The limiting factor for early diagnosis is the lack of standardized terms and procedures in gynaecological sonography. The introduction of the IOTA rules has provided some consistency in defining morphological features of ovarian masses through a standardized examination technique. To evaluate the efficacy of the IOTA simple ultrasound rules in distinguishing benign and malignant ovarian tumours and establishing their use as a tool in the early diagnosis of ovarian malignancy. A hospital-based, prospective case-control study was conducted. Patients with suspected ovarian pathology were evaluated using the IOTA ultrasound rules and designated as benign or malignant. Findings were correlated with histopathological findings. The collected data were statistically analysed using the chi-square test and the kappa statistic. Of the initial 55 patients, 50 who underwent surgery were included in the final analysis. The IOTA simple rules were applicable in 45 of these 50 patients (90%). The sensitivity for the detection of malignancy in cases where the IOTA simple rules were applicable was 91.66% and the specificity was 84.84%. Accuracy was 86.66%. Classifying inconclusive cases as malignant, the sensitivity and specificity were 93% and 80%, respectively. Agreement between ultrasound and histopathological diagnosis gave a kappa value of 0.323. The IOTA simple ultrasound rules were highly sensitive and specific in predicting ovarian malignancy preoperatively, while being reproducible and easy to teach and use.
LADES: a software for constructing and analyzing longitudinal designs in biomedical research.
Vázquez-Alcocer, Alan; Garzón-Cortes, Daniel Ladislao; Sánchez-Casas, Rosa María
2014-01-01
One of the most important steps in biomedical longitudinal studies is choosing a good experimental design that can provide high accuracy in the analysis of results with a minimum sample size. Several methods for constructing efficient longitudinal designs have been developed based on power analysis and the statistical model used for analyzing the final results. However, this technology has not been made available to practitioners through user-friendly software. In this paper we introduce LADES (Longitudinal Analysis and Design of Experiments Software) as an alternative and easy-to-use tool for conducting longitudinal analysis and constructing efficient longitudinal designs. LADES incorporates methods for creating cost-efficient longitudinal designs, unequal longitudinal designs, and simple longitudinal designs. In addition, LADES includes different methods for analyzing longitudinal data, such as linear mixed models and generalized estimating equations, among others. A study of European eels is reanalyzed in order to show the capabilities of LADES. Three treatments contained in three aquariums with five eels each were analyzed. Data were collected from week 0 up to the 12th week post-treatment for all the eels (complete design). The response under evaluation is sperm volume. A linear mixed model was fitted to the results using LADES. The complete design had a power of 88.7% using 15 eels. With LADES we propose the use of an unequal design with only 14 eels and 89.5% efficiency. LADES was developed as a powerful and simple tool to promote the use of statistical methods for analyzing and creating longitudinal experiments in biomedical research.
Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures
NASA Technical Reports Server (NTRS)
Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel
1990-01-01
A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite bonded joints below the cryogenic temperature of 30 K (-405 F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings, and composite gusset/clip joints, all bonded to M55J/954-6 and T300/954-6 hybrid composite tubes (75 mm square). Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited, and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Adding to this challenge is the difficulty of testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce the required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on the developed techniques and properties.
MI-Sim: A MATLAB package for the numerical analysis of microbial ecological interactions.
Wade, Matthew J; Oakley, Jordan; Harbisher, Sophie; Parker, Nicholas G; Dolfing, Jan
2017-01-01
Food-webs and other classes of ecological network motifs are a means of describing feeding relationships between consumers and producers in an ecosystem. They have application across scales, where they differ only in the underlying characteristics of the organisms and substrates describing the system. Mathematical modelling, using mechanistic approaches to describe the dynamic behaviour and properties of the system through sets of ordinary differential equations, has been used extensively in ecology. Models allow simulation of the dynamics of the various motifs, and their numerical analysis provides a greater understanding of the interplay between the system components and their intrinsic properties. We have developed the MI-Sim software for use with MATLAB to allow a rigorous and rapid numerical analysis of several common ecological motifs. MI-Sim contains a series of the most commonly used motifs such as cooperation, competition and predation. It does not require detailed knowledge of mathematical analytical techniques and is offered as a single graphical user interface containing all input and output options. The tools available in the current version of MI-Sim include model simulation, steady-state existence and stability analysis, and basin of attraction analysis. The software includes seven ecological interaction motifs and seven growth function models. Unlike other system analysis tools, MI-Sim is designed as a simple and user-friendly tool specific to ecological population-type models, allowing for rapid assessment of their dynamical and behavioural properties.
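Although MI-Sim itself is a MATLAB package, the kind of motif it analyses is easy to reproduce. A Python sketch of a two-species competition motif with Monod growth in a chemostat, integrated to steady state (all parameter values are illustrative):

```python
import numpy as np
from scipy.integrate import odeint

def competition(y, t, D, S_in):
    """Two species competing for one substrate; Monod kinetics, yield 0.4."""
    S, X1, X2 = y
    mu1 = 0.5 * S / (0.10 + S)   # growth rate of species 1
    mu2 = 0.4 * S / (0.05 + S)   # growth rate of species 2
    dS = D * (S_in - S) - (mu1 * X1 + mu2 * X2) / 0.4
    return [dS, (mu1 - D) * X1, (mu2 - D) * X2]

t = np.linspace(0.0, 400.0, 4000)
traj = odeint(competition, [1.0, 0.1, 0.1], t, args=(0.1, 5.0))
print(traj[-1])  # typically one competitor excludes the other at steady state
```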
New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
Evaluating Lexical Coverage in Simple English Wikipedia Articles: A Corpus-Driven Study
ERIC Educational Resources Information Center
Hendry, Clinton; Sheepy, Emily
2017-01-01
Simple English Wikipedia is a user-contributed online encyclopedia intended for young readers and readers whose first language is not English. We compiled a corpus of the entirety of Simple English Wikipedia as of June 20th, 2017. We used lexical frequency profiling tools to investigate the vocabulary size needed to comprehend Simple English…
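Lexical frequency profiling reduces to counting how much of the running text is covered by cumulative frequency bands. A minimal Python sketch (the two tiny word lists stand in for the usual 1,000-word bands):

```python
import re
from collections import Counter

def lexical_coverage(text, band_word_lists):
    """Cumulative percentage of running words covered by each band."""
    counts = Counter(re.findall(r"[a-z']+", text.lower()))
    total = sum(counts.values())
    known, coverage = set(), []
    for band in band_word_lists:
        known |= set(band)
        covered = sum(n for w, n in counts.items() if w in known)
        coverage.append(100.0 * covered / total)
    return coverage

print(lexical_coverage("the cat sat on the mat",
                       [["the", "on"], ["cat", "sat", "mat"]]))  # [50.0, 100.0]
```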
Mind mapping in qualitative research.
Tattersall, Christopher; Powell, Julia; Stroud, James; Pringle, Jan
We tested a theory that mind mapping could be used as a tool in qualitative research to transcribe and analyse an interview. We compared results derived from mind mapping with those from interpretive phenomenological analysis by examining patients' and carers' perceptions of a new nurse-led service. Mind mapping could be used to rapidly analyse simple qualitative audio-recorded interviews. More research is needed to establish the extent to which mind mapping can assist qualitative researchers.
Future Automotive Systems Technology Simulator (FASTSim)
DOE Office of Scientific and Technical Information (OSTI.GOV)
An advanced vehicle powertrain systems analysis tool, the Future Automotive Systems Technology Simulator (FASTSim) provides a simple way to compare powertrains and estimate the impact of technology improvements on light-, medium- and heavy-duty vehicle efficiency, performance, cost, and battery life. Created by the National Renewable Energy Laboratory, FASTSim accommodates a range of vehicle types - including conventional vehicles, electric-drive vehicles, and fuel cell vehicles - and is available for free download in Microsoft Excel and Python formats.
Preloaded joint analysis methodology for space flight systems
NASA Technical Reports Server (NTRS)
Chambers, Jeffrey A.
1995-01-01
This report contains a compilation of some of the most basic equations governing simple preloaded joint systems and discusses the more common modes of failure associated with such hardware. It is intended to provide the mechanical designer with the tools necessary for designing a basic bolted joint. Although the information presented is intended to aid in the engineering of space flight structures, the fundamentals are equally applicable to other forms of mechanical design.
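Two of the most basic relations such a compilation covers, the torque-preload estimate T = K·d·F and a margin against joint separation, can be sketched in Python; the nut factor and joint load factor below are typical textbook values, not values from the report:

```python
def preload_from_torque(T_Nm, d_m, K=0.2):
    """Estimate preload F from tightening torque via T = K * d * F.
    K ~ 0.2 is a common nut factor for unlubricated steel fasteners."""
    return T_Nm / (K * d_m)

def separation_margin(preload_N, ext_load_N, n=0.25):
    """Margin of safety against joint separation: the clamped parts see
    (1 - n) of the external load, where n is the joint load factor."""
    return preload_N / ((1.0 - n) * ext_load_N) - 1.0

F = preload_from_torque(T_Nm=25.0, d_m=0.008)   # M8 bolt at 25 N*m
print(f"preload ~ {F:.0f} N, separation MS = "
      f"{separation_margin(F, ext_load_N=6000.0):.2f}")
```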
REopt Improves the Operations of Alcatraz's Solar PV-Battery-Diesel Hybrid System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olis, Daniel R; Walker, H. A; Van Geet, Otto D
This poster identifies operations improvement strategies for a photovoltaic (PV)-battery-diesel hybrid system at the National Park Service's Alcatraz Island using NREL's REopt analysis tool. The current 'cycle charging' strategy results in significant curtailing of energy production from the PV array, requiring excessive diesel use, while also incurring high wear on batteries without benefit of improved efficiency. A simple 'load following' strategy results in near optimal operating cost reduction.
Ganalyzer: A Tool for Automatic Galaxy Image Analysis
NASA Astrophysics Data System (ADS)
Shamir, Lior
2011-08-01
We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
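The first steps of the pipeline are easy to sketch: locate the centre, then sample intensity on rings of increasing radius to build the radial intensity plot. An illustrative Python function reconstructed from the description (not Ganalyzer's own code):

```python
import numpy as np

def radial_intensity(image, cx, cy, r, n_angles=360):
    """Pixel intensities on a circle of radius r about (cx, cy).

    Stacking these rows for increasing r gives the radial intensity plot;
    the drift of its peaks with r measures the galaxy's spirality.
    """
    theta = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    xs = np.clip(np.round(cx + r * np.cos(theta)).astype(int), 0, image.shape[1] - 1)
    ys = np.clip(np.round(cy + r * np.sin(theta)).astype(int), 0, image.shape[0] - 1)
    return image[ys, xs]
```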
Margaria, Tiziana; Kubczak, Christian; Steffen, Bernhard
2008-01-01
Background With Bio-jETI, we introduce a service platform for interdisciplinary work on biological application domains and illustrate its use in a concrete application concerning statistical data processing in R and xcms for an LC/MS analysis of FAAH gene knockout. Methods Bio-jETI uses the jABC environment for service-oriented modeling and design as a graphical process modeling tool and the jETI service integration technology for remote tool execution. Conclusions As a service definition and provisioning platform, Bio-jETI has the potential to become a core technology in interdisciplinary service orchestration and technology transfer. Domain experts, like biologists not trained in computer science, directly define complex service orchestrations as process models and use efficient and complex bioinformatics tools in a simple and intuitive way. PMID:18460173
In Silico PCR Tools for a Fast Primer, Probe, and Advanced Searching.
Kalendar, Ruslan; Muterko, Alexandr; Shamekova, Malika; Zhambakin, Kabyl
2017-01-01
The polymerase chain reaction (PCR) is fundamental to molecular biology and is the most important practical molecular technique for the research laboratory. The principle of this technique has been further used and applied in plenty of other simple or complex nucleic acid amplification technologies (NAAT). In parallel to laboratory "wet bench" experiments for nucleic acid amplification technologies, in silico or virtual (bioinformatics) approaches have been developed, among them in silico PCR analysis. In silico NAAT analysis is a useful and efficient complementary method to ensure the specificity of primers or probes for an extensive range of PCR applications, including homology gene discovery, molecular diagnosis, DNA fingerprinting, and repeat searching. Predicting the sensitivity and specificity of primers and probes requires a search to determine whether they match a database with an optimal number of mismatches, similarity, and stability. In the development of in silico bioinformatics tools for nucleic acid amplification technologies, the prospects for the development of new NAAT or similar approaches should be taken into account, including forward-looking and comprehensive analysis that is not limited to only one PCR technique variant. The software FastPCR and the online Java web tool are integrated tools for in silico PCR of linear and circular DNA, multiple primer or probe searches in large or small databases, and advanced searches. These tools are suitable for processing of batch files that are essential for automation when working with large amounts of data. The FastPCR software is available for download at http://primerdigital.com/fastpcr.html and the online Java version at http://primerdigital.com/tools/pcr.html.
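At its core, an in silico PCR scan is a mismatch-tolerant string search. A naive Python sketch (illustrative only; a real tool such as FastPCR also scans the reverse complement and weighs 3'-end stability and melting temperature):

```python
def primer_sites(template, primer, max_mismatches=2):
    """0-based positions where the primer binds the forward strand
    with at most max_mismatches mismatches, with the mismatch count."""
    hits = []
    for i in range(len(template) - len(primer) + 1):
        mism = sum(a != b for a, b in zip(template[i:i + len(primer)], primer))
        if mism <= max_mismatches:
            hits.append((i, mism))
    return hits

print(primer_sites("ACGTACGTTAGCACGAACGT", "ACGAACG", max_mismatches=1))
# [(0, 1), (12, 0)]
```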
Suresh, R
2017-08-01
Pertinent marks on fired cartridge cases, such as those from the firing pin, breech face, extractor, and ejector, are used for firearm identification. A non-standard semiautomatic pistol and four .22 rimfire cartridges (head stamp KF) were used for a known-source comparison study. Two test-fired cartridge cases were examined under a stereomicroscope. The characteristic marks were captured by digital camera, and comparative analysis of the striation marks was done using different tools available in Microsoft Word (Windows 8) on a computer system. The similarities of the striation marks thus obtained are highly convincing for identifying the firearm. In this paper, an effort has been made to study and compare the striation marks of two fired cartridge cases using a stereomicroscope, a digital camera and a computer system. A comparison microscope was not used in this study. The method described in this study is simple, cost-effective, portable for field work, and can be carried in a crime scene vehicle to facilitate immediate on-the-spot examination. The findings may be highly helpful to the forensic community, law enforcement agencies and students. Copyright © 2017 Elsevier B.V. All rights reserved.
A selective nanosensor device for exhaled breath analysis.
Gouma, P; Prasad, A; Stanacevic, S
2011-09-01
This paper describes a novel concept for a three-nanosensor array microsystem that may potentially serve as a handheld breath analyzer, a coarse diagnostic tool providing first-line detection. The specification and performance of a simple metal oxide nanosensor operating at three distinct temperatures are discussed, with a focus on the need for a noninvasive blood cholesterol monitor. Interfacing the sensor array to an integrated circuit for electrical readout and temperature control provides a complete microsystem capable of capturing a single exhaled breath and analyzing it with respect to the relative content of isoprene, carbon dioxide and ammonia gas. This inexpensive sensor technology may be used as a personalized medical diagnostics tool in the near future.
Simplified formulae for the estimation of offshore wind turbines clutter on marine radars.
Grande, Olatz; Cañizo, Josune; Angulo, Itziar; Jenn, David; Danoon, Laith R; Guerra, David; de la Vega, David
2014-01-01
The potential impact that offshore wind farms may cause on nearby marine radars should be considered before the wind farm is installed. Strong radar echoes from the turbines may degrade radars' detection capability in the area around the wind farm. Although conventional computational methods provide accurate results of scattering by wind turbines, they are not directly implementable in software tools that can be used to conduct the impact studies. This paper proposes a simple model to assess the clutter that wind turbines may generate on marine radars. This method can be easily implemented in the system modeling software tools for the impact analysis of a wind farm in a real scenario.
Simplified Formulae for the Estimation of Offshore Wind Turbines Clutter on Marine Radars
Grande, Olatz; Cañizo, Josune; Jenn, David; Danoon, Laith R.; Guerra, David
2014-01-01
The potential impact that offshore wind farms may cause on nearby marine radars should be considered before the wind farm is installed. Strong radar echoes from the turbines may degrade radars' detection capability in the area around the wind farm. Although conventional computational methods provide accurate results of scattering by wind turbines, they are not directly implementable in software tools that can be used to conduct the impact studies. This paper proposes a simple model to assess the clutter that wind turbines may generate on marine radars. This method can be easily implemented in the system modeling software tools for the impact analysis of a wind farm in a real scenario. PMID:24782682
Description of operation of fast-response solenoid actuator in diesel fuel system model
NASA Astrophysics Data System (ADS)
Zhao, J.; Grekhov, L. V.; Fan, L.; Ma, X.; Song, E.
2018-03-01
The performance of the fast-response solenoid actuator (FRSA) of engine fuel systems is characterized by a response time of less than 0.1 ms and by the necessity of accounting for the non-stationary peculiarities of mechanical, hydraulic, electrical and magnetic processes. Simple models of magnetization with static and dynamic hysteresis are used for this purpose. An experimental study of FRSA performance within the electro-hydraulic injector of a common-rail system demonstrated agreement between the computational and experimental results. The computation of these processes is not only a tool for analysis, but also a tool for the design and optimization of solenoid actuators for new engine fuel systems.
Transition Matrices: A Tool to Assess Student Learning and Improve Instruction
NASA Astrophysics Data System (ADS)
Morris, Gary A.; Walter, Paul; Skees, Spencer; Schwartz, Samantha
2017-03-01
This paper introduces a new spreadsheet tool for adoption by high school or college-level physics teachers who use common assessments in a pre-instruction/post-instruction mode to diagnose student learning and teaching effectiveness. The spreadsheet creates a simple matrix that identifies the percentage of students who select each possible pre-/post-test answer combination on each question of the diagnostic exam. Leveraging analysis of the quality of the incorrect answer choices, one can order the answer choices from worst to best (i.e., correct), resulting in "transition matrices" that can provide deeper insight into student learning and the success or failure of the pedagogical approach than traditional analyses that employ dichotomous scoring.
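A minimal sketch of the matrix such a spreadsheet produces, assuming four answer choices and invented pre/post responses; entries are the percentage of students making each pre-to-post transition.

```python
# Pre/post "transition matrix" for one question: count the fraction of
# students selecting each (pre-answer, post-answer) pair. Data are invented.
from collections import Counter

choices = ["A", "B", "C", "D"]          # assume D is the correct answer
pre  = ["A", "B", "B", "C", "D", "A", "C", "D", "B", "D"]
post = ["D", "D", "B", "D", "D", "C", "D", "D", "D", "D"]

counts = Counter(zip(pre, post))
n = len(pre)

# Rows = pre-test choice, columns = post-test choice, entries = percentages.
print("pre\\post " + " ".join(f"{c:>6}" for c in choices))
for r in choices:
    row = " ".join(f"{100 * counts[(r, c)] / n:6.1f}" for c in choices)
    print(f"{r:>9} {row}")
```

Ordering the rows and columns from worst to best distractor, as the paper suggests, turns this table directly into the "transition matrix" used to judge whether students moved toward better reasoning even when they did not reach the correct answer.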
Measurement and classification of heart and lung sounds by using LabView for educational use.
Altrabsheh, B
2010-01-01
This study presents the design, development and implementation of a simple low-cost method of phonocardiography signal detection. Human heart and lung signals are detected by using a simple microphone connected to a personal computer; the signals are recorded and analysed using LabView software. Amplitude and frequency analyses are carried out for various phonocardiography pathological cases. Methods for automatic classification of normal and abnormal heart sounds, murmurs and lung sounds are presented. Various cases of heart and lung sound measurement are recorded and analysed, and the measurements can be saved for further analysis. The method in this study can be used by doctors as a detection aid and may be useful for teaching purposes at medical and nursing schools.
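As a rough illustration of the amplitude and frequency analyses described (not the LabView implementation itself), the following sketch applies an FFT to a synthetic heart-like tone; the sampling rate and signal are invented.

```python
# Illustrative amplitude/frequency analysis of a recorded sound. The signal
# here is a synthetic 60 Hz tone plus noise, not real phonocardiography data.
import numpy as np

fs = 4000                               # sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
signal = np.sin(2 * np.pi * 60 * t) + 0.2 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(f"dominant frequency: {peak:.1f} Hz")
print(f"peak amplitude:     {signal.max():.2f}")
```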
Network structure of multivariate time series.
Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito
2015-10-21
Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows information on a high-dimensional dynamical system to be extracted through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow us to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
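The abstract does not spell out the series-to-layer mapping; a common mapping of this kind in the related literature is the horizontal visibility graph, sketched below on synthetic data, with each component of the multivariate series yielding one layer of the multiplex network. This is offered as an illustration of the idea, not as the paper's exact construction.

```python
# Sketch of one plausible series-to-layer mapping: the horizontal visibility
# graph (HVG), where samples i and j are linked if every sample between them
# is lower than both. Each component of a multivariate series yields one
# layer of the multiplex network. Data are synthetic.
import numpy as np

def horizontal_visibility_edges(x):
    """Return the edge list of the HVG of a 1-D series x."""
    edges = []
    n = len(x)
    for i in range(n - 1):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                edges.append((i, j))
            # once a sample at least as high as x[i] is met, nothing beyond
            # it can see node i, so we can stop early
            if x[j] >= x[i]:
                break
    return edges

rng = np.random.default_rng(0)
series = rng.random((2, 50))            # two components -> two layers
layers = [horizontal_visibility_edges(comp) for comp in series]
for m, edges in enumerate(layers):
    print(f"layer {m}: {len(edges)} edges")
```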
NASA Astrophysics Data System (ADS)
Nine, H. M. Zulker
The adversity of metallic corrosion is of growing concern to industrial engineers and scientists. Corrosion attacks metal surfaces and causes structural damage as well as direct and indirect economic losses. Multiple corrosion monitoring tools are available, although they are time-consuming and costly. Given the wide availability of image-capturing devices, image-based corrosion monitoring is an attractive innovation. By exposing stainless steel SS 304 and low-carbon steel QD 1008 panels to distilled water, half-saturated sodium chloride and saturated sodium chloride solutions, and performing subsequent RGB image analysis in Matlab, this research identifies and investigates a simple and cost-effective corrosion measurement tool. Open circuit potential and electrochemical impedance spectroscopy results were compared with the RGB analysis to corroborate the corrosion measurements. Finally, to understand the importance of ambiguity in crisis communication, the communication process between Union Carbide and the Indian Government regarding the Bhopal incident in 1984 was analyzed.
Interpolation problem for the solutions of linear elasticity equations based on monogenic functions
NASA Astrophysics Data System (ADS)
Grigor'ev, Yuri; Gürlebeck, Klaus; Legatiuk, Dmitrii
2017-11-01
Interpolation is an important tool for many practical applications, and it is often beneficial to interpolate not only with a simple basis system, but rather with solutions of a certain differential equation, e.g. the elasticity equation. A typical example of this type of interpolation is the collocation method, widely used in practice. It is known that interpolation theory is fully developed in the framework of classical complex analysis. However, in quaternionic analysis, which shows many analogies to complex analysis, the situation is more complicated due to the non-commutative multiplication. Thus, a fundamental theorem of algebra is not available, and standard tools from linear algebra cannot be applied in the usual way. To overcome these problems, a special system of monogenic polynomials, the so-called pseudo-complex polynomials, which share some properties of complex powers, is used. In this paper, we present an approach to the interpolation problem in which solutions of the elasticity equations in three dimensions are used as an interpolation basis.
Day, Charles A.; Kraft, Lewis J.; Kang, Minchul; Kenworthy, Anne K.
2012-01-01
Fluorescence recovery after photobleaching (FRAP) is a powerful, versatile and widely accessible tool to monitor molecular dynamics in living cells that can be performed using modern confocal microscopes. Although the basic principles of FRAP are simple, quantitative FRAP analysis requires careful experimental design, data collection and analysis. In this review we discuss the theoretical basis for confocal FRAP, followed by step-by-step protocols for FRAP data acquisition using a laser scanning confocal microscope for (1) measuring the diffusion of a membrane protein, (2) measuring the diffusion of a soluble protein, and (3) analysis of intracellular trafficking. Finally, data analysis procedures are discussed and an equation for determining the diffusion coefficient of a molecular species undergoing pure diffusion is presented. PMID:23042527
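The closing equation for a purely diffusing species is not reproduced in the abstract; one commonly used closed-form estimate of this kind is the Soumpasis relation for a circular bleach spot, sketched below with illustrative numbers (not values from the paper).

```python
# Hedged sketch of a "pure diffusion" FRAP estimate: the Soumpasis relation
# D = gamma * w^2 / (4 * t_half) for a circular bleach region, where w is the
# bleach-spot radius, t_half the half-time of recovery, and gamma ~ 0.88.
def frap_diffusion_coefficient(radius_um, t_half_s, gamma=0.88):
    """Estimate D (um^2/s) for pure 2-D diffusion into a circular bleach spot."""
    return gamma * radius_um**2 / (4.0 * t_half_s)

# Illustrative values: a 1 um spot recovering with t_half = 0.25 s.
print(frap_diffusion_coefficient(radius_um=1.0, t_half_s=0.25))  # ~0.88 um^2/s
```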
Intuitive Tools for the Design and Analysis of Communication Payloads for Satellites
NASA Technical Reports Server (NTRS)
Culver, Michael R.; Soong, Christine; Warner, Joseph D.
2014-01-01
In an effort to make future communications satellite payload design more efficient and accessible, two tools were created with intuitive graphical user interfaces (GUIs). The first tool allows payload designers to graphically design their payload by simple drag and drop of payload components onto a design area within the program. Information about each selected component is pulled from a database of common space-qualified communication components sold by commercial companies. Once a design is completed, various reports can be generated, such as the Master Equipment List. The second tool is a link budget calculator designed specifically for ease of use. Other features of this tool include access to a database of NASA ground-based apertures for near-Earth and deep-space communication, the Tracking and Data Relay Satellite System (TDRSS) apertures, and information about the solar system relevant to link budget calculations. The link budget tool allows for over 50 different combinations of user inputs, eliminating the need for multiple spreadsheets and the user errors associated with them. Both of the aforementioned tools increase the productivity of space communication systems designers, and are approachable enough to allow non-communication experts to design preliminary communication payloads.
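As an illustration of the arithmetic such a link budget calculator automates (with invented values, not the tool's database or defaults): received power is EIRP plus receive gain minus free-space path loss and miscellaneous losses, all in decibels.

```python
# Minimal link budget arithmetic: received power (dBW) = EIRP + Rx gain
# - free-space path loss - miscellaneous losses. All values are illustrative.
import math

def free_space_path_loss_db(freq_hz, distance_m):
    """FSPL = 20 log10(4 * pi * d * f / c)."""
    c = 2.998e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

eirp_dbw     = 50.0          # transmitter EIRP (assumed)
rx_gain_db   = 60.0          # ground aperture gain (assumed)
misc_loss_db = 3.0           # pointing, polarization, atmosphere (assumed)
fspl_db = free_space_path_loss_db(8.4e9, 2.0e9)   # X-band, 2 million km

rx_power_dbw = eirp_dbw + rx_gain_db - fspl_db - misc_loss_db
print(f"FSPL = {fspl_db:.1f} dB, received power = {rx_power_dbw:.1f} dBW")
```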
Cox, Trevor F; Ranganath, Lakshminarayan
2011-12-01
Alkaptonuria (AKU) is caused by excessive accumulation of homogentisic acid (HGA) in body fluids, due to lack of the enzyme homogentisate dioxygenase, leading in turn to varied clinical manifestations, mainly through conversion of HGA to a polymeric melanin-like pigment in a process known as ochronosis. A potential treatment, a drug called nitisinone, is available to decrease the formation of HGA. However, successful demonstration of its efficacy in modifying the natural history of AKU requires an effective quantitative assessment tool. We describe two potential tools for quantitating disease burden in AKU. The first tool scores clinical features, combining clinical assessments, investigations and questionnaires, in 15 patients with AKU. The second tool is a scoring system that includes only items obtained from questionnaires, used in 44 people with AKU. Statistical analyses were carried out on the two patient datasets to assess the AKU tools; these included the calculation of Cronbach's alpha, multidimensional scaling and simple linear regression analysis. The conclusion was that there is good evidence that the tools could be adopted as AKU assessment tools, perhaps with further refinement before being used in the practical setting of a clinical trial.
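For concreteness, one of the statistics used, Cronbach's alpha, can be computed as follows; the small response matrix is invented for illustration.

```python
# Cronbach's alpha for a k-item score:
# alpha = k/(k-1) * (1 - sum(item variances) / variance of the total score).
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = questionnaire items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

scores = [[3, 4, 3, 5], [2, 2, 3, 2], [4, 5, 4, 4], [1, 2, 1, 2], [3, 3, 4, 3]]
print(f"alpha = {cronbach_alpha(scores):.2f}")
```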
Rueckl, Martin; Lenzi, Stephen C; Moreno-Velasquez, Laura; Parthier, Daniel; Schmitz, Dietmar; Ruediger, Sten; Johenning, Friedrich W
2017-01-01
The measurement of activity in vivo and in vitro has shifted from electrical to optical methods. While the indicators for imaging activity have improved significantly over the last decade, tools for analysing optical data have not kept pace. Most available analysis tools are limited in their flexibility and applicability to datasets obtained at different spatial scales. Here, we present SamuROI (Structured analysis of multiple user-defined ROIs), an open source Python-based analysis environment for imaging data. SamuROI simplifies exploratory analysis and visualization of image series of fluorescence changes in complex structures over time and is readily applicable at different spatial scales. In this paper, we show the utility of SamuROI in Ca2+-imaging based applications at three spatial scales: the micro-scale (i.e., sub-cellular compartments including cell bodies, dendrites and spines); the meso-scale, (i.e., whole cell and population imaging with single-cell resolution); and the macro-scale (i.e., imaging of changes in bulk fluorescence in large brain areas, without cellular resolution). The software described here provides a graphical user interface for intuitive data exploration and region of interest (ROI) management that can be used interactively within Jupyter Notebook: a publicly available interactive Python platform that allows simple integration of our software with existing tools for automated ROI generation and post-processing, as well as custom analysis pipelines. SamuROI software, source code and installation instructions are publicly available on GitHub and documentation is available online. SamuROI reduces the energy barrier for manual exploration and semi-automated analysis of spatially complex Ca2+ imaging datasets, particularly when these have been acquired at different spatial scales.
Rueckl, Martin; Lenzi, Stephen C.; Moreno-Velasquez, Laura; Parthier, Daniel; Schmitz, Dietmar; Ruediger, Sten; Johenning, Friedrich W.
2017-01-01
The measurement of activity in vivo and in vitro has shifted from electrical to optical methods. While the indicators for imaging activity have improved significantly over the last decade, tools for analysing optical data have not kept pace. Most available analysis tools are limited in their flexibility and applicability to datasets obtained at different spatial scales. Here, we present SamuROI (Structured analysis of multiple user-defined ROIs), an open source Python-based analysis environment for imaging data. SamuROI simplifies exploratory analysis and visualization of image series of fluorescence changes in complex structures over time and is readily applicable at different spatial scales. In this paper, we show the utility of SamuROI in Ca2+-imaging based applications at three spatial scales: the micro-scale (i.e., sub-cellular compartments including cell bodies, dendrites and spines); the meso-scale, (i.e., whole cell and population imaging with single-cell resolution); and the macro-scale (i.e., imaging of changes in bulk fluorescence in large brain areas, without cellular resolution). The software described here provides a graphical user interface for intuitive data exploration and region of interest (ROI) management that can be used interactively within Jupyter Notebook: a publicly available interactive Python platform that allows simple integration of our software with existing tools for automated ROI generation and post-processing, as well as custom analysis pipelines. SamuROI software, source code and installation instructions are publicly available on GitHub and documentation is available online. SamuROI reduces the energy barrier for manual exploration and semi-automated analysis of spatially complex Ca2+ imaging datasets, particularly when these have been acquired at different spatial scales. PMID:28706482
Development of Waypoint Planning Tool in Response to NASA Field Campaign Challenges
NASA Technical Reports Server (NTRS)
He, Matt; Hardin, Danny; Mayer, Paul; Blakeslee, Richard; Goodman, Michael
2012-01-01
Airborne real-time observations are a major component of NASA's Earth Science research and satellite ground validation studies. Multiple aircraft are involved in most NASA field campaigns. The coordination of the aircraft with satellite overpasses, other airplanes and the constantly evolving, dynamic weather conditions often determines the success of the campaign. Planning a research aircraft mission within the context of meeting the science objectives is a complex task, because it requires real-time situational awareness of the weather conditions that affect the aircraft track. A flight planning tool is needed to provide situational awareness information to the mission scientists and to help them plan and modify the flight tracks. Scientists at the University of Alabama in Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map filled with real-time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analysis during and after each campaign helped identify both issues and new requirements, and initiated the next wave of development; the Waypoint Planning Tool has now gone through three rounds of development and analysis. Its development has been shaped directly by advances in GIS and mapping technologies: from the standalone Google Earth application and simple KML functionality, to the Google Earth Plugin on a web platform, to the rise of open-source GIS tools built on new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives.
NASA Astrophysics Data System (ADS)
Domínguez-Rodrigo, Manuel; Barba, Rebeca; Soto, Enrique; Sesé, Carmen; Santonja, Manuel; Pérez-González, Alfredo; Yravedra, José; Galán, Ana Belén
2015-10-01
Cuesta de la Bajada is a Middle Pleistocene site (MIS 8-9) containing some of the earliest documented evidence of the Middle Paleolithic stone tool tradition. The small-format tool assemblage, dominated by simple flakes and scrapers, is associated with abundant remains of equids and cervids, on which both percussion and cut marks are well represented. The anatomical distribution of these bone surface modifications indicates primary access to fleshed carcasses by hominins. Hunting is further supported by the analysis of age profiles, in which prime adults predominate among both equids and cervids. The taphonomic analysis of the site adds more information on human predatory behaviors, as documented in other Middle Pleistocene sites, and is one of the best examples of hunting documented in the Middle Pleistocene European archaeological record.
Coupled oscillators: interesting experiments for high school students
NASA Astrophysics Data System (ADS)
Kodejška, Č.; Lepil, O.; Sedláčková, H.
2018-07-01
This work deals with the experimental demonstration of coupled oscillators using simple tools, in the form of mechanically coupled pendulums, magnetically coupled elastic strings and electromagnetic oscillators. For the evaluation of results, the Vernier LabQuest data logger and video analysis in the Tracker program were used. In the first part of this work, coupled mechanical oscillators of different types are shown, with data analysed using the Tracker or Vernier Logger Pro programs. The second part describes a measurement using two LC circuits with inductively or capacitively coupled electromagnetic oscillators, and presents the experimental results obtained.
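A minimal numerical companion to the mechanical experiment, assuming two identical small-angle pendulums joined by a weak spring (all parameter values illustrative): energy should slosh between the bobs at the beat frequency set by the splitting of the two normal modes.

```python
# Two identical pendulums coupled by a weak spring, small-angle equations,
# integrated with SciPy. Parameters are illustrative, not from the paper.
import numpy as np
from scipy.integrate import solve_ivp

g, L, k_over_m = 9.81, 0.5, 0.4        # gravity, pendulum length, coupling

def rhs(t, y):
    th1, w1, th2, w2 = y
    return [w1, -(g / L) * th1 - k_over_m * (th1 - th2),
            w2, -(g / L) * th2 - k_over_m * (th2 - th1)]

# Start with only pendulum 1 displaced; watch the amplitude transfer.
sol = solve_ivp(rhs, (0, 30), [0.1, 0, 0, 0], dense_output=True)
t = np.linspace(0, 30, 7)
print(np.round(sol.sol(t)[0], 3))      # angle of pendulum 1 at sample times
```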
InterFace: A software package for face image warping, averaging, and principal components analysis.
Kramer, Robin S S; Jenkins, Rob; Burton, A Mike
2017-12-01
We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
Simple Sensitivity Analysis for Orion Guidance Navigation and Control
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, covering everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables, or pairs of variables, which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, indicate where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. The tool identified input variables such as moments, mass, thrust dispersions, and date of launch as significant factors for the success of various requirements. Examples are shown in this paper, as well as a summary and physics discussion of the EFT-1 driving factors found by the tool.
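A hedged sketch of one sensitivity measure in this spirit (not the CFT implementation): estimate how the probability of requirement success varies across each dispersed input by binning Monte Carlo samples. The toy "simulation" below stands in for real GNC runs.

```python
# Per-input sensitivity proxy: bin each dispersed input into equal-count bins
# and compare the success probability across bins. All data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
inputs = {"mass": rng.normal(0, 1, n), "thrust": rng.normal(0, 1, n),
          "launch_day": rng.uniform(-1, 1, n)}
# Toy requirement: success depends mainly on the thrust dispersion.
success = (inputs["thrust"] + 0.1 * rng.normal(0, 1, n)) < 0.8

for name, x in inputs.items():
    bins = np.quantile(x, np.linspace(0, 1, 6))       # 5 equal-count bins
    idx = np.clip(np.digitize(x, bins[1:-1]), 0, 4)
    p = [success[idx == b].mean() for b in range(5)]
    spread = max(p) - min(p)                          # sensitivity proxy
    print(f"{name:>10}: success prob by bin {np.round(p, 2)}, spread {spread:.2f}")
```

Inputs whose bin-to-bin spread is large are candidates for the "critical factors"; pairwise versions of the same binning can flag interacting variables.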
CytoBayesJ: software tools for Bayesian analysis of cytogenetic radiation dosimetry data.
Ainsbury, Elizabeth A; Vinnikov, Volodymyr; Puig, Pedro; Maznyk, Nataliya; Rothkamm, Kai; Lloyd, David C
2013-08-30
A number of authors have suggested that a Bayesian approach may be most appropriate for analysis of cytogenetic radiation dosimetry data. In the Bayesian framework, probability of an event is described in terms of previous expectations and uncertainty. Previously existing, or prior, information is used in combination with experimental results to infer probabilities or the likelihood that a hypothesis is true. It has been shown that the Bayesian approach increases both the accuracy and quality assurance of radiation dose estimates. New software entitled CytoBayesJ has been developed with the aim of bringing Bayesian analysis to cytogenetic biodosimetry laboratory practice. CytoBayesJ takes a number of Bayesian or 'Bayesian like' methods that have been proposed in the literature and presents them to the user in the form of simple user-friendly tools, including testing for the most appropriate model for distribution of chromosome aberrations and calculations of posterior probability distributions. The individual tools are described in detail and relevant examples of the use of the methods and the corresponding CytoBayesJ software tools are given. In this way, the suitability of the Bayesian approach to biological radiation dosimetry is highlighted and its wider application encouraged by providing a user-friendly software interface and manual in English and Russian. Copyright © 2013 Elsevier B.V. All rights reserved.
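To make the Bayesian idea concrete, here is a hedged grid-approximation sketch of a dicentric-based dose estimate; the linear-quadratic calibration coefficients, cell count and observed aberration count are invented, and this is not CytoBayesJ's actual algorithm.

```python
# Dicentric counts are modelled as Poisson with yield Y(D) = c + a*D + b*D^2
# per cell; a flat prior on dose is updated with the observed count to give
# a posterior over dose. All numbers are invented for illustration.
import numpy as np
from scipy.stats import poisson

c, a, b = 0.001, 0.02, 0.06            # assumed calibration curve (per cell)
cells, dicentrics = 500, 45            # scored cells and observed aberrations

doses = np.linspace(0, 5, 501)         # grid over candidate doses (Gy)
yields = cells * (c + a * doses + b * doses**2)
posterior = poisson.pmf(dicentrics, yields)      # flat prior => likelihood
posterior /= np.trapz(posterior, doses)          # normalize over the grid

mean_dose = np.trapz(doses * posterior, doses)
print(f"posterior mean dose ~ {mean_dose:.2f} Gy")
```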
Interactive 3D visualization for theoretical virtual observatories
NASA Astrophysics Data System (ADS)
Dykes, T.; Hassan, A.; Gheller, C.; Croton, D.; Krokos, M.
2018-06-01
Virtual observatories (VOs) are online hubs of scientific knowledge. They encompass a collection of platforms dedicated to the storage and dissemination of astronomical data, from simple data archives to e-research platforms offering advanced tools for data exploration and analysis. Whilst the more mature platforms within VOs primarily serve the observational community, there are also services fulfilling a similar role for theoretical data. Scientific visualization can be an effective tool for the analysis and exploration of data sets made accessible through web platforms for theoretical data, which often contain spatial dimensions and properties inherently suitable for visualization via, e.g., mock imaging in 2D or volume rendering in 3D. We analyse the current state of 3D visualization for big theoretical astronomical data sets through scientific web portals and virtual observatory services. We discuss some of the challenges for interactive 3D visualization and how it can augment the workflow of users in a virtual observatory context. Finally, we showcase a lightweight client-server visualization tool for particle-based data sets allowing quantitative visualization via data filtering, and we highlight two example use cases within the Theoretical Astrophysical Observatory.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.
2014-10-01
Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are implemented in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.
2015-03-01
Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are implemented in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
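As an illustration of the frequency ratio calculation that the tool (described in both abstracts above) automates, with invented pixel counts: for each class of a conditioning factor, FR is the share of hazard events in the class divided by the share of total area in the class.

```python
# Frequency ratio (FR) per factor class:
# FR = (% of hazard events in the class) / (% of total area in the class).
# Pixel counts below are invented for illustration.
class_pixels = {"low slope": 60000, "mid slope": 30000, "steep": 10000}
event_pixels = {"low slope": 20,    "mid slope": 60,    "steep": 120}

total_area   = sum(class_pixels.values())
total_events = sum(event_pixels.values())

for cls in class_pixels:
    fr = (event_pixels[cls] / total_events) / (class_pixels[cls] / total_area)
    print(f"{cls:>9}: FR = {fr:.2f}")   # FR > 1 indicates positive association
```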
NASA Astrophysics Data System (ADS)
Pascoe, Stephen; Iwi, Alan; kershaw, philip; Stephens, Ag; Lawrence, Bryan
2014-05-01
The advent of large-scale data and the consequent analysis problems have led to two new challenges for the research community: how to share such data to get the maximum value, and how to carry out efficient analysis. Solving both challenges requires a form of parallelisation: the first is social parallelisation (involving trust and information sharing), the second data parallelisation (involving new algorithms and tools). The JASMIN infrastructure supports both kinds of parallelism by providing a multi-tenant environment with petabyte-scale storage, VM provisioning and batch cluster facilities. The JASMIN Analysis Platform (JAP) is an analysis software layer for JASMIN which emphasises ease of transition from a researcher's local environment to JASMIN. JAP brings together tools traditionally used by multiple communities and configures them to work together, enabling users to move analysis from their local environment to JASMIN without rewriting code. JAP also provides facilities to exploit JASMIN's parallel capabilities whilst maintaining a familiar analysis environment wherever possible. Modern open-source analysis tools typically have multiple dependent packages, increasing the installation burden on system administrators. When one considers a suite of tools, often with both common and conflicting dependencies, analysis pipelines can become locked to a particular installation simply because of the effort required to reconstruct the dependency tree. JAP addresses this problem by providing a consistent suite of RPMs compatible with RedHat Enterprise Linux and CentOS 6.4. Researchers can install JAP locally, either as RPMs or through a pre-built VM image, giving them the confidence that moving analysis to JASMIN will not disrupt their environment. Analysis parallelisation is in its infancy in the climate sciences, with few tools capable of exploiting any parallel environment beyond manual scripting of the use of multiple processors. JAP begins to bridge this gap through a variety of higher-level tools for parallelisation and job scheduling, such as IPython-parallel and MPI support for interactive analysis languages. We find that enabling even simple parallelisation of workflows, together with the state-of-the-art I/O performance of JASMIN storage, provides many users with the large increases in efficiency they need to scale their analyses to contemporary data volumes and tackle new, previously inaccessible problems.
The Multisensory Attentional Consequences of Tool Use: A Functional Magnetic Resonance Imaging Study
Holmes, Nicholas P.; Spence, Charles; Hansen, Peter C.; Mackay, Clare E.; Calvert, Gemma A.
2008-01-01
Background: Tool use in humans requires that multisensory information is integrated across different locations, from objects seen to be distant from the hand, but felt indirectly at the hand via the tool. We tested the hypothesis that using a simple tool to perceive vibrotactile stimuli results in the enhanced processing of visual stimuli presented at the distal, functional part of the tool. Such a finding would be consistent with a shift of spatial attention to the location where the tool is used. Methodology/Principal Findings: We tested this hypothesis by scanning healthy human participants' brains using functional magnetic resonance imaging, while they used a simple tool to discriminate between target vibrations, accompanied by congruent or incongruent visual distractors, on the same or opposite side to the tool. The attentional hypothesis was supported: BOLD response in occipital cortex, particularly in the right hemisphere lingual gyrus, varied significantly as a function of tool position, increasing contralaterally, and decreasing ipsilaterally to the tool. Furthermore, these modulations occurred despite the fact that participants were repeatedly instructed to ignore the visual stimuli, to respond only to the vibrotactile stimuli, and to maintain visual fixation centrally. In addition, the magnitude of multisensory (visual-vibrotactile) interactions in participants' behavioural responses significantly predicted the BOLD response in occipital cortical areas that were also modulated as a function of both visual stimulus position and tool position. Conclusions/Significance: These results show that using a simple tool to locate and to perceive vibrotactile stimuli is accompanied by a shift of spatial attention to the location where the functional part of the tool is used, resulting in enhanced processing of visual stimuli at that location, and decreased processing at other locations. This was most clearly observed in the right hemisphere lingual gyrus. Such modulations of visual processing may reflect the functional importance of visuospatial information during human tool use. PMID:18958150
iCanPlot: Visual Exploration of High-Throughput Omics Data Using Interactive Canvas Plotting
Sinha, Amit U.; Armstrong, Scott A.
2012-01-01
Increasing use of high throughput genomic scale assays requires effective visualization and analysis techniques to facilitate data interpretation. Moreover, existing tools often require programming skills, which discourages bench scientists from examining their own data. We have created iCanPlot, a compelling platform for visual data exploration based on the latest technologies. Using the recently adopted HTML5 Canvas element, we have developed a highly interactive tool to visualize tabular data and identify interesting patterns in an intuitive fashion without the need of any specialized computing skills. A module for geneset overlap analysis has been implemented on the Google App Engine platform: when the user selects a region of interest in the plot, the genes in the region are analyzed on the fly. The visualization and analysis are amalgamated for a seamless experience. Further, users can easily upload their data for analysis—which also makes it simple to share the analysis with collaborators. We illustrate the power of iCanPlot by showing an example of how it can be used to interpret histone modifications in the context of gene expression. PMID:22393367
NASA Astrophysics Data System (ADS)
Hoebelheinrich, N. J.; Lynnes, C.; West, P.; Ferritto, M.
2014-12-01
Two problems common to many geoscience domains are the difficulty of finding tools to work with a given dataset collection and, conversely, the difficulty of finding data for a known tool. A collaborative team from the Earth Science Information Partnership (ESIP) has come together to design and create a web service, called ToolMatch, to address these problems. The team began by defining an initial, relatively simple conceptual model that addresses the two use cases described above. The conceptual model is expressed as an ontology using OWL (Web Ontology Language) and DCterms (Dublin Core Terms), and utilizes standard ontologies such as DOAP (Description of a Project), FOAF (Friend of a Friend), SKOS (Simple Knowledge Organization System) and DCAT (Data Catalog Vocabulary). The ToolMatch service takes advantage of various Semantic Web and web standards, such as OpenSearch, RESTful web services, SWRL (Semantic Web Rule Language) and SPARQL (SPARQL Protocol and RDF Query Language). The first version of the ToolMatch service was deployed in early fall 2014. While more complete testing is required, a number of communities besides ESIP member organizations have expressed interest in collaborating to create, test and use the service and to incorporate it into their own web pages, tools and/or services, including the USGS Data Catalog service, DataONE, the Deep Carbon Observatory, the Virtual Solar Terrestrial Observatory (VSTO), and the U.S. Global Change Research Program. In this session, presenters will discuss the inception and development of the ToolMatch service, the collaborative process used to design, refine, and test the service, and future plans for the service.
Bhalla, Kavi; Harrison, James E
2016-04-01
Burden of disease and injury methods can be used to summarise and compare the effects of conditions in terms of disability-adjusted life years (DALYs). Burden estimation methods are not inherently complex. However, as commonly implemented, the methods include complex modelling and estimation. To provide a simple and open-source software tool that allows estimation of incidence-DALYs due to injury, given data on incidence of deaths and non-fatal injuries. The tool includes a default set of estimation parameters, which can be replaced by users. The tool was written in Microsoft Excel. All calculations and values can be seen and altered by users. The parameter sets currently used in the tool are based on published sources. The tool is available without charge online at http://calculator.globalburdenofinjuries.org. To use the tool with the supplied parameter sets, users need to only paste a table of population and injury case data organised by age, sex and external cause of injury into a specified location in the tool. Estimated DALYs can be read or copied from tables and figures in another part of the tool. In some contexts, a simple and user-modifiable burden calculator may be preferable to undertaking a more complex study to estimate the burden of disease. The tool and the parameter sets required for its use can be improved by user innovation, by studies comparing DALYs estimates calculated in this way and in other ways, and by shared experience of its use. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
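The core arithmetic such a calculator wraps is simple; a sketch under assumed parameter values (not the tool's supplied parameter set):

```python
# DALYs = YLL + YLD for one age/sex/external-cause cell, where
# YLL = deaths x standard life expectancy lost and
# YLD = incident cases x disability weight x average duration.
# All parameter values below are illustrative.
deaths, life_exp_lost = 120, 40.0
cases, disability_weight, duration = 3000, 0.05, 2.0

yll = deaths * life_exp_lost
yld = cases * disability_weight * duration
print(f"YLL = {yll:.0f}, YLD = {yld:.0f}, DALYs = {yll + yld:.0f}")
```

The real tool repeats this cell-by-cell over the pasted population and case table and sums the results; the disability weights and durations are exactly the "parameter set" users can replace.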
ERIC Educational Resources Information Center
Plummer, Donna; Kuhlman, Wilma
2005-01-01
To introduce students to rocks and their characteristics, teachers can begin rock units with the activities described in this article. Students need only the ability to make simple observations using their senses and simple tools.
p3d--Python module for structural bioinformatics.
Fufezan, Christian; Specht, Michael
2009-08-21
High-throughput bioinformatic analysis tools are needed to mine the large amount of structural data via knowledge-based approaches. The development of such tools requires a robust interface for accessing the structural data in an easy way. For this, the Python scripting language is an optimal choice, since its philosophy is to write understandable source code. p3d is an object-oriented Python module that adds a simple yet powerful interface to the Python interpreter for processing and analysing three-dimensional protein structure files (PDB files). p3d's strength arises from the combination of (a) very fast spatial access to the structural data due to the implementation of a binary space partitioning (BSP) tree, (b) set theory, and (c) functions that combine (a) and (b) and that use human-readable language in the search queries rather than complex computer language. All these factors combined facilitate the rapid development of bioinformatic tools that can perform quick and complex analyses of protein structures. p3d is thus well suited to the rapid development of structural bioinformatics tools in the Python scripting language.
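The flavour of query p3d supports can be illustrated with a brute-force stand-in (this is not p3d's actual API; p3d accelerates the distance test with its BSP tree): select all atoms of one set within a cutoff of another set, i.e. combined spatial and set logic.

```python
# Brute-force stand-in for a "waters within 3 A of the protein" query,
# combining a set selection with a distance test. Coordinates are invented.
import math

atoms = [  # (name, chain, x, y, z)
    ("CA", "A", 0.0, 0.0, 0.0), ("CB", "A", 1.2, 0.1, 0.3),
    ("O",  "W", 2.0, 0.5, 0.1), ("O",  "W", 8.0, 7.0, 5.0),
]

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

protein = [a for a in atoms if a[1] == "A"]
waters  = [a for a in atoms if a[1] == "W"]

close = [w for w in waters if any(dist(w[2:], p[2:]) < 3.0 for p in protein)]
print(close)
```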
NASA Astrophysics Data System (ADS)
Kuusela, Tom A.
2017-09-01
A He-Ne laser is an example of a class A laser, which can be described by a single nonlinear differential equation of the complex electric field. This laser system has only one degree of freedom and is thus inherently stable. A He-Ne laser can be driven to the chaotic condition when a large fraction of the output beam is injected back to the laser. In practice, this can be done simply by adding an external mirror. In this situation, the laser system has infinite degrees of freedom and therefore it can have a chaotic attractor. We show the fundamental laser equations and perform elementary stability analysis. In experiments, the laser intensity variations are measured by a simple photodiode circuit. The laser output intensity time series is studied using nonlinear analysis tools which can be found freely on the internet. The results show that the laser system with feedback has an attractor of a reasonably high dimension and that the maximal Lyapunov exponent is positive, which is clear evidence of chaotic behaviour. The experimental setup and analysis steps are so simple that the studies can even be implemented in the undergraduate physics laboratory.
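A hedged sketch of the divergence-based estimate behind a maximal Lyapunov exponent, of the general kind the freely available analysis tools implement: embed the series, find each point's nearest neighbour, and track how the pair separates. A logistic-map series stands in for the measured laser intensity, and the embedding parameters are illustrative.

```python
# Simplified nearest-neighbour divergence estimate of the maximal Lyapunov
# exponent. For the fully chaotic logistic map the true value is ln 2 ~ 0.69.
import numpy as np

x = np.empty(2000); x[0] = 0.4
for i in range(1999):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])        # chaotic surrogate series

m, tau, horizon = 2, 1, 5                       # embedding and look-ahead
emb = np.column_stack([x[i:len(x) - (m - 1) * tau + i]
                       for i in range(0, m * tau, tau)])
n = len(emb) - horizon

logs = []
for i in range(n):
    d = np.linalg.norm(emb[:n] - emb[i], axis=1)
    d[max(0, i - 10):i + 11] = np.inf           # exclude temporal neighbours
    j = int(np.argmin(d))
    d0 = d[j]
    dk = np.linalg.norm(emb[i + horizon] - emb[j + horizon])
    if d0 > 0 and dk > 0:
        logs.append(np.log(dk / d0) / horizon)

print(f"max Lyapunov exponent ~ {np.mean(logs):.2f} (ln 2 ~ 0.69 expected)")
```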
Birch, Ivan; Vernon, Wesley; Walker, Jeremy; Saxelby, Jai
2013-10-01
Gait analysis from closed circuit camera footage is now commonly used as evidence in criminal trials. The biomechanical analysis of human gait is a well established science in both clinical and laboratory settings. However, closed circuit camera footage is rarely of the quality of that taken in the more controlled clinical and laboratory environments. The less than ideal quality of much of this footage for use in gait analysis is associated with a range of issues, the combination of which can often render the footage unsuitable for use in gait analysis. The aim of this piece of work was to develop a tool for assessing the suitability of closed circuit camera footage for the purpose of forensic gait analysis. A Delphi technique was employed with a small sample of expert forensic gait analysis practitioners, to identify key quality elements of CCTV footage used in legal proceedings. Five elements of the footage were identified and then subdivided into 15 contributing sub-elements, each of which was scored using a 5-point Likert scale. A Microsoft Excel worksheet was developed to calculate automatically an overall score from the fifteen sub-element scores. Five expert witnesses experienced in using CCTV footage for gait analysis then trialled the prototype tool on current case footage. A repeatability study was also undertaken using standardized CCTV footage. The results showed the tool to be a simple and repeatable means of assessing the suitability of closed circuit camera footage for use in forensic gait analysis. The inappropriate use of poor quality footage could lead to challenges to the practice of forensic gait analysis. All parties involved in criminal proceedings must therefore understand the fitness for purpose of any footage used. The development of this tool could offer a method of achieving this goal, and help to assure the continued role of forensic gait analysis as an aid to the identification process. Copyright © 2013 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Zarzycki, Paweł K; Zarzycka, Magdalena B; Clifton, Vicki L; Adamski, Jerzy; Głód, Bronisław K
2011-08-19
The goal of this paper is to demonstrate the separation and detection capability of the eco-friendly micro-TLC technique for the classification of spirulina and selected herbs from pharmaceutical and food products. Target compounds were extracted using relatively low-parachor liquids. A number of spirulina samples, originating from pharmaceutical formulations and food products, were isolated using a simple one-step extraction with a small volume of methanol, acetone or tetrahydrofuran. Herb samples rich in chlorophyll dyes were analyzed as reference materials. Quantitative data derived from micro-plates under visible light conditions and after iodine staining were explored using chemometric tools, including cluster analysis and principal components analysis. Using this method we could easily distinguish genuine spirulina from non-spirulina samples, as well as fresh from expired commercial products; furthermore, we could identify biodegradation peaks appearing in micro-TLC profiles. This methodology can be applied as a fast screening or fingerprinting tool for the classification of genuine spirulina and herb samples, and in particular may be used commercially for rapid quality-control screening of products. Furthermore, this approach allows low-cost fractionation of target substances, including cyanobacterial pigments, in raw biological or environmental samples for preliminary chemotaxonomic investigations. Due to the low consumption of the mobile phase (usually less than 1 mL per run), this method can be considered an environmentally friendly analytical tool and may be an alternative to fingerprinting protocols based on HPLC machines and simple separation systems involving planar micro-fluidic or micro-chip devices. Copyright © 2011 Elsevier B.V. All rights reserved.
Turner, Andrew D; Waack, Julia; Lewis, Adam; Edwards, Christine; Lawton, Linda
2018-02-01
A simple, rapid UHPLC-MS/MS method has been developed and optimised for the quantitation of microcystins and nodularin in a wide variety of sample matrices. The microcystin analogues targeted were MC-LR, MC-RR, MC-LA, MC-LY, MC-LF, MC-LW, MC-YR, MC-WR, [Asp3] MC-LR, [Dha7] MC-LR, MC-HilR and MC-HtyR. Optimisation studies were conducted to develop a simple, quick and efficient extraction protocol without the need for complex pre-analysis concentration procedures, together with a rapid, sub-5 min chromatographic separation of toxins in shellfish and algal supplement tablet powders, as well as in water and cyanobacterial bloom samples. Validation studies were undertaken on each matrix-analyte combination for the full set of method performance characteristics, following international guidelines. The method was found to be specific and linear over the full calibration range. Method sensitivity, in terms of limits of detection, quantitation and reporting, was found to be significantly improved in comparison to LC-UV methods and applicable to the analysis of each of the four matrices. Overall, acceptable recoveries were determined for each of the matrices studied, with associated precision and within-laboratory reproducibility well within expected guidance limits. Results from the formalised ruggedness analysis of all available cyanotoxins showed that the method was robust for all parameters investigated. The results presented here show that the optimised LC-MS/MS method for cyanotoxins is fit for the purpose of detection and quantitation of a range of microcystins and nodularin in shellfish, algal supplement tablet powder, water and cyanobacteria. The method provides a valuable early warning tool for the rapid, routine extraction and analysis of natural waters, cyanobacterial blooms, algal powders, food supplements and shellfish tissues, enabling monitoring labs to supplement traditional microscopy techniques and report toxicity results within a short timeframe of sample receipt. The new method, now accredited to the ISO 17025 standard, is simple, quick, applicable to multiple matrices and highly suitable for use as a routine, high-throughput, fast-turnaround regulatory monitoring tool. Copyright © 2017 Elsevier B.V. All rights reserved.
Evaluation of IOTA Simple Ultrasound Rules to Distinguish Benign and Malignant Ovarian Tumours
Kaur, Amarjit; Mohi, Jaswinder Kaur; Sibia, Preet Kanwal; Kaur, Navkiran
2017-01-01
Introduction: IOTA stands for the International Ovarian Tumour Analysis group. Ovarian cancer is one of the most common cancers in women and in the majority of cases is diagnosed at a late stage. The limiting factor for early diagnosis is the lack of standardized terms and procedures in gynaecological sonography. The introduction of the IOTA rules has provided some consistency in defining the morphological features of ovarian masses through a standardized examination technique. Aim: To evaluate the efficacy of the IOTA simple ultrasound rules in distinguishing benign and malignant ovarian tumours, and to establish their use as a tool in the early diagnosis of ovarian malignancy. Materials and Methods: A hospital-based case-control prospective study was conducted. Patients with suspected ovarian pathology were evaluated using the IOTA ultrasound rules and designated as benign or malignant, and findings were correlated with histopathology. The collected data were statistically analysed using the chi-square test and the kappa statistic. Results: Of the initial 55 patients, 50 who underwent surgery were included in the final analysis. The IOTA simple rules were applicable in 45 of these 50 patients (90%). The sensitivity for the detection of malignancy in cases where the IOTA simple rules were applicable was 91.66% and the specificity was 84.84%; accuracy was 86.66%. Classifying inconclusive cases as malignant, the sensitivity and specificity were 93% and 80% respectively. Agreement between ultrasound and histopathological diagnosis gave a kappa value of 0.323. Conclusion: The IOTA simple ultrasound rules were highly sensitive and specific in predicting ovarian malignancy preoperatively, while being reproducible and easy to train and use. PMID:28969237
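The reported figures reduce to confusion-matrix arithmetic; a sketch with invented counts (not the study's data):

```python
# Sensitivity, specificity, accuracy and Cohen's kappa from counts of
# ultrasound calls vs. histopathology. Counts below are invented.
tp, fn = 11, 1          # malignant on histology: called malignant / benign
fp, tn = 5, 28          # benign on histology:    called malignant / benign
n = tp + fn + fp + tn

sens = tp / (tp + fn)
spec = tn / (tn + fp)
acc  = (tp + tn) / n

p_obs = acc                                         # observed agreement
p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (p_obs - p_exp) / (1 - p_exp)
print(f"sens {sens:.2f}, spec {spec:.2f}, acc {acc:.2f}, kappa {kappa:.2f}")
```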
Ibrahim, G H; Buch, M H; Lawson, C; Waxman, R; Helliwell, P S
2009-01-01
To evaluate an existing tool (the Swedish modification of the Psoriasis Assessment Questionnaire) and to develop a new instrument to screen for psoriatic arthritis in people with psoriasis. The starting point was a community-based survey of people with psoriasis, using questionnaires developed from the literature. Selected respondents were examined, and additional known cases of psoriatic arthritis were included in the analysis. The new instrument was developed using univariate statistics and a logistic regression model, comparing people with and without psoriatic arthritis. The instruments were compared using receiver operating characteristic (ROC) curve analysis. 168 questionnaires were returned (response rate 27%) and 93 people attended for examination (55% of questionnaire respondents). Of these 93, twelve were newly diagnosed with psoriatic arthritis during the study. These 12 were supplemented by 21 people with known psoriatic arthritis. Just 5 questions were found to be significant predictors of psoriatic arthritis in this population. Figures for sensitivity and specificity were 0.92 and 0.78 respectively, an improvement on the Alenius tool (sensitivity and specificity 0.63 and 0.72 respectively). A new screening tool for identifying people with psoriatic arthritis has been developed. Five simple questions demonstrated good sensitivity and specificity in this population, but further validation is required.
NASA Technical Reports Server (NTRS)
Van Dyke, Michael B.
2014-01-01
During random vibration testing of electronic boxes there is often a desire to know the dynamic response of certain internal printed wiring boards (PWBs) for the purpose of monitoring the response of sensitive hardware or for post-test forensic analysis in support of anomaly investigation. Due to restrictions on internally mounted accelerometers for most flight hardware there is usually no means to empirically observe the internal dynamics of the unit, so one must resort to crude and highly uncertain approximations. One common practice is to apply Miles Equation, which does not account for the coupled response of the board in the chassis, resulting in significant over- or under-prediction. This paper explores the application of simple multiple-degree-of-freedom lumped parameter modeling to predict the coupled random vibration response of the PWBs in their fundamental modes of vibration. A simple tool using this approach could be used during or following a random vibration test to interpret vibration test data from a single external chassis measurement to deduce internal board dynamics by means of a rapid correlation analysis. Such a tool might also be useful in early design stages as a supplemental analysis to a more detailed finite element analysis to quickly prototype and analyze the dynamics of various design iterations. After developing the theoretical basis, a lumped parameter modeling approach is applied to an electronic unit for which both external and internal test vibration response measurements are available for direct comparison. Reasonable correlation of the results demonstrates the potential viability of such an approach. Further development of the preliminary approach presented in this paper will involve correlation with detailed finite element models and additional relevant test data.
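For context, the single-degree-of-freedom approximation the paper moves beyond is Miles' equation; a minimal sketch with illustrative numbers follows.

```python
# Miles' equation: RMS acceleration of a lightly damped single-DOF mode
# driven by a flat input acceleration spectral density (ASD),
# x_rms = sqrt((pi/2) * fn * Q * ASD). Values below are illustrative.
import math

def miles_grms(fn_hz, q, asd_g2_per_hz):
    """RMS response (g) of a SDOF system at natural frequency fn with
    amplification Q, driven by input ASD in g^2/Hz."""
    return math.sqrt(math.pi / 2 * fn_hz * q * asd_g2_per_hz)

print(f"{miles_grms(fn_hz=200.0, q=10.0, asd_g2_per_hz=0.04):.1f} g RMS")
```

Because this ignores the coupling between the board and the chassis, it can over- or under-predict substantially, which is exactly the gap the paper's coupled lumped-parameter model is meant to close.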
Design/analysis of the JWST ISIM bonded joints for survivability at cryogenic temperatures
NASA Astrophysics Data System (ADS)
Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel
2005-08-01
A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite adhesively bonded joints at the cryogenic temperature of 30K (-405°F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings and composite gusset/clip joints all bonded to hybrid composite tubes (75mm square) made with M55J/954-6 and T300/954-6 prepregs. Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty in testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on developed techniques and properties.
A sophisticated cad tool for the creation of complex models for electromagnetic interaction analysis
NASA Astrophysics Data System (ADS)
Dion, Marc; Kashyap, Satish; Louie, Aloisius
1991-06-01
This report describes the essential features of the MS-DOS version of DIDEC-DREO, an interactive program for creating wire grid, surface patch, and cell models of complex structures for electromagnetic interaction analysis. It uses the device-independent graphics library DIGRAF and the graphics kernel system HALO, and can be executed on systems with various graphics devices. Complicated structures can be created by direct alphanumeric keyboard entry, digitization of blueprints, conversion from existing geometric structure files, and merging of simple geometric shapes. A completed DIDEC geometric file may then be converted to the format required for input to a variety of time-domain and frequency-domain electromagnetic interaction codes. This report gives a detailed description of the program DIDEC-DREO, its installation, and its theoretical background. Each available interactive command is described. The associated program HEDRON, which generates simple geometric shapes, and other programs that extract current amplitude data from electromagnetic interaction code outputs, are also discussed.
Analyzing Hidden Semantics in Social Bookmarking of Open Educational Resources
NASA Astrophysics Data System (ADS)
Minguillón, Julià
Web 2.0 services such as social bookmarking allow users to manage and share the links they find interesting, adding their own tags to describe them. This is especially interesting in the field of open educational resources, as Delicious is a simple way to bridge the institutional point of view (i.e. learning object repositories) with the individual one (i.e. personal collections), thus promoting the discovery and sharing of such resources by other users. In this paper we propose a methodology for analyzing such tags in order to discover hidden semantics (i.e. taxonomies and vocabularies) that can be used to improve the descriptions of learning objects and make learning object repositories more visible and discoverable. We propose the use of a simple statistical analysis tool, principal component analysis, to discover which tags form clusters that can be semantically interpreted. We compare the results obtained with a collection of resources related to open educational resources, in order to better understand the real needs of people searching for open educational resources.
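A minimal version of the proposed analysis, with an invented resource-by-tag matrix standing in for Delicious data: run PCA and inspect tag loadings on the leading components to spot co-occurring tag clusters.

```python
# PCA on a resource-by-tag count matrix via SVD; tags with large loadings of
# the same sign on a component tend to co-occur. All data are invented.
import numpy as np

tags = ["oer", "physics", "video", "repository", "license", "chemistry"]
X = np.array([[5, 4, 0, 1, 0, 0],      # rows = bookmarked resources
              [4, 5, 1, 0, 0, 0],      # columns = tag usage counts
              [0, 0, 5, 0, 4, 1],
              [1, 0, 4, 0, 5, 0],
              [0, 1, 0, 5, 1, 4]], dtype=float)

Xc = X - X.mean(axis=0)                       # centre columns
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
for pc in range(2):                           # loadings of the first two PCs
    order = np.argsort(-np.abs(Vt[pc]))
    top = ", ".join(f"{tags[i]} ({Vt[pc, i]:+.2f})" for i in order[:3])
    print(f"PC{pc + 1}: {top}")
```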
Kim, Y S; Balland, V; Limoges, B; Costentin, C
2017-07-21
Cyclic voltammetry is a particularly useful tool for characterizing charge accumulation in conductive materials. A simple model is presented to evaluate proton transport effects on charge storage in conductive materials associated with a redox process coupled with proton insertion in the bulk material from an aqueous buffered solution, a situation frequently encountered in metal oxide materials. The interplay between proton transport inside and outside the materials is described using a formulation of the problem through introduction of dimensionless variables that allows defining the minimum number of parameters governing the cyclic voltammetry response with consideration of a simple description of the system geometry. This approach is illustrated by analysis of proton insertion in a mesoporous TiO2 film.
An operational approach to high resolution agro-ecological zoning in West-Africa.
Le Page, Y; Vasconcelos, Maria; Palminha, A; Melo, I Q; Pereira, J M C
2017-01-01
The objective of this work is to develop a simple methodology for high-resolution crop suitability analysis under current and future climate, easily applicable and useful in Least Developed Countries. The approach addresses both regional planning in the context of climate change projections and pre-emptive short-term rural extension interventions based on same-year agricultural season forecasts, while being implementable with off-the-shelf resources. The developed tools are applied operationally in a case study in three regions of Guinea-Bissau, and the results obtained, as well as the advantages and limitations of the methods applied, are discussed. In this paper we show how a simple approach can easily generate information on climate vulnerability and how it can be used operationally in rural extension services.
Random amplified polymorphic DNA PCR in the teaching of molecular epidemiology.
Reinoso, Elina B; Bettera, Susana G
2016-07-08
In this article, we describe a basic practical laboratory module designed for fifth-year undergraduate students of Microbiology as part of the Epidemiology course. This practice provides the students with the tools for molecular epidemiological analysis of pathogenic microorganisms using a rapid and simple PCR technique. The aim of this work was to assay the RAPD-PCR technique in order to infer possible epidemiological relationships. The activity gives students an appreciation of the value of applying a simple molecular biological method such as RAPD-PCR to a discipline-specific question. It comprises a three-session laboratory module to genetically assay DNAs from strains isolated from a food outbreak. © 2016 The International Union of Biochemistry and Molecular Biology, 44(4):391-396, 2016.
Velderraín, José Dávila; Martínez-García, Juan Carlos; Álvarez-Buylla, Elena R
2017-01-01
Mathematical models based on dynamical systems theory are well-suited tools for the integration of available molecular experimental data into coherent frameworks in order to propose hypotheses about the cooperative regulatory mechanisms driving developmental processes. Computational analysis of the proposed models using well-established methods enables testing the hypotheses by contrasting predictions with observations. Within such framework, Boolean gene regulatory network dynamical models have been extensively used in modeling plant development. Boolean models are simple and intuitively appealing, ideal tools for collaborative efforts between theorists and experimentalists. In this chapter we present protocols used in our group for the study of diverse plant developmental processes. We focus on conceptual clarity and practical implementation, providing directions to the corresponding technical literature.
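An illustrative sketch of the modelling approach described above, not the authors' protocol: a three-gene Boolean network iterated synchronously from every initial state until it reaches an attractor. The gene names and regulatory rules are invented for illustration.

```python
from itertools import product

def step(state):
    a, b, c = state
    return (
        b and not c,   # A is activated by B, repressed by C
        a or c,        # B is activated by A or C
        not a,         # C is repressed by A
    )

# Enumerate all 2^3 initial states and follow each trajectory to an attractor.
for init in product([False, True], repeat=3):
    seen, state = [], init
    while state not in seen:
        seen.append(state)
        state = step(state)
    cycle = seen[seen.index(state):]          # the attractor reached
    print(init, "->", [tuple(int(x) for x in s) for s in cycle])
```

In developmental applications the attractors are interpreted as cell types or fates, which is what makes this simple formalism appealing for collaborations between theorists and experimentalists.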
LIMO EEG: a toolbox for hierarchical LInear MOdeling of ElectroEncephaloGraphic data.
Pernet, Cyril R; Chauveau, Nicolas; Gaspar, Carl; Rousselet, Guillaume A
2011-01-01
Magnetic- and electric-evoked brain responses have traditionally been analyzed by comparing the peaks or mean amplitudes of signals from selected channels and averaged across trials. More recently, tools have been developed to investigate single trial response variability (e.g., EEGLAB) and to test differences between averaged evoked responses over the entire scalp and time dimensions (e.g., SPM, Fieldtrip). LIMO EEG is a Matlab toolbox (EEGLAB compatible) to analyse evoked responses over all space and time dimensions, while accounting for single trial variability using a simple hierarchical linear modelling of the data. In addition, LIMO EEG provides robust parametric tests, therefore providing a new and complementary tool in the analysis of neural evoked responses.
LIMO EEG: A Toolbox for Hierarchical LInear MOdeling of ElectroEncephaloGraphic Data
Pernet, Cyril R.; Chauveau, Nicolas; Gaspar, Carl; Rousselet, Guillaume A.
2011-01-01
Magnetic- and electric-evoked brain responses have traditionally been analyzed by comparing the peaks or mean amplitudes of signals from selected channels and averaged across trials. More recently, tools have been developed to investigate single trial response variability (e.g., EEGLAB) and to test differences between averaged evoked responses over the entire scalp and time dimensions (e.g., SPM, Fieldtrip). LIMO EEG is a Matlab toolbox (EEGLAB compatible) to analyse evoked responses over all space and time dimensions, while accounting for single trial variability using a simple hierarchical linear modelling of the data. In addition, LIMO EEG provides robust parametric tests, therefore providing a new and complementary tool in the analysis of neural evoked responses. PMID:21403915
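A conceptual sketch of the two-level ("hierarchical") approach, not LIMO EEG itself: level 1 fits a general linear model per channel and time point within each subject; level 2 tests the resulting betas across subjects. The data shapes and design are invented placeholders.

```python
import numpy as np
from scipy import stats

n_subj, n_trials, n_chan, n_time = 12, 80, 32, 100
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(n_trials), rng.integers(0, 2, n_trials)])  # intercept + condition

betas = np.empty((n_subj, n_chan, n_time))
for s in range(n_subj):
    eeg = rng.standard_normal((n_trials, n_chan, n_time))   # placeholder single-trial data
    Y = eeg.reshape(n_trials, -1)                           # trials x (channels*time)
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)               # level 1: one GLM per point
    betas[s] = B[1].reshape(n_chan, n_time)                 # condition effect per subject

t, p = stats.ttest_1samp(betas, 0.0, axis=0)                # level 2: test across subjects
print(p.shape)   # (32, 100): a p-value for every channel and time point
```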
High energy PIXE: A tool to characterize multi-layer thick samples
NASA Astrophysics Data System (ADS)
Subercaze, A.; Koumeir, C.; Métivier, V.; Servagent, N.; Guertin, A.; Haddad, F.
2018-02-01
High energy PIXE is a useful and non-destructive tool for characterizing multi-layer thick samples such as cultural heritage objects. In a previous work, we demonstrated the possibility of performing quantitative analysis of simple multi-layer samples using high energy PIXE, without any assumption on their composition. In this work, an in-depth study of the parameters involved in the previously published method is proposed. Its extension to more complex samples with a repeated layer is also presented. Experiments were performed at the ARRONAX cyclotron using 68 MeV protons. The thicknesses and sequences of a multi-layer sample including two different layers of the same element have been determined. The performance and limits of this method are presented and discussed.
NONMEMory: a run management tool for NONMEM.
Wilkins, Justin J
2005-06-01
NONMEM is an extremely powerful tool for nonlinear mixed-effect modelling and simulation of pharmacokinetic and pharmacodynamic data. However, it is a console-based application whose output does not lend itself to rapid interpretation or efficient management. NONMEMory has been created to be a comprehensive project manager for NONMEM, providing detailed summary, comparison and overview of the runs comprising a given project, including the display of output data, simple post-run processing, fast diagnostic plots and run output management, complementary to other available modelling aids. Analysis time ought not to be spent on trivial tasks, and NONMEMory's role is to eliminate these as far as possible by increasing the efficiency of the modelling process. NONMEMory is freely available from http://www.uct.ac.za/depts/pha/nonmemory.php.
Biniarz, Piotr; Łukaszewicz, Marcin
2017-06-01
The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.
NASA Astrophysics Data System (ADS)
Horodinca, M.
2016-08-01
This paper intends to present some new results related to computer-aided monitoring of transient regimes on machine tools, based on the evolution of the active electrical power absorbed by the electric motor used to drive the main kinematic chains and the evolution of the rotational speed and acceleration of the main shaft. The active power is calculated in numerical form using the evolution of the instantaneous voltage and current delivered by the electrical power system to the electric motor. The rotational speed and acceleration of the main shaft are calculated based on the signal delivered by a sensor. Three real-time analogue signals are acquired with a very simple computer-assisted setup which contains a voltage transformer, a current transformer, an AC generator as rotational speed sensor, a data acquisition system and a personal computer. The data processing and analysis were done using Matlab software. Several different transient regimes were investigated, and several important conclusions related to the advantages of this monitoring technique were formulated. Many other features of the experimental setup are also available: supervising the mechanical loading of machine tools during cutting processes, or diagnosing machine-tool condition by analysis of the active electrical power signal in the frequency domain.
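A rough sketch of the signal processing steps described, assuming sampled voltage/current arrays: instantaneous power p(t) = u(t)·i(t), active power as its mean over whole periods, and shaft speed estimated from the frequency of an AC tachogenerator signal. Sampling rate, amplitudes and frequencies are invented.

```python
import numpy as np

fs = 10_000                                   # sampling rate [Hz], assumed
t = np.arange(0, 1.0, 1 / fs)
u = 325 * np.sin(2 * np.pi * 50 * t)          # synthetic mains voltage
i = 10 * np.sin(2 * np.pi * 50 * t - 0.3)     # synthetic motor current, lagging

p_inst = u * i                                # instantaneous power
P_active = p_inst.mean()                      # active power over whole periods
print(f"P = {P_active:.0f} W")                # ~ (325*10/2)*cos(0.3) for these signals

# Speed from an AC tachogenerator: count zero crossings of its output.
tacho = np.sin(2 * np.pi * 25 * t)            # 25 Hz; scaling to rpm depends on the generator
crossings = np.count_nonzero(np.diff(np.signbit(tacho)))
print("tacho frequency:", crossings / 2, "Hz")
```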
Parametric Study of Biconic Re-Entry Vehicles
NASA Technical Reports Server (NTRS)
Steele, Bryan; Banks, Daniel W.; Whitmore, Stephen A.
2007-01-01
An optimization based on hypersonic aerodynamic performance and volumetric efficiency was accomplished for a range of biconic configurations. Both axisymmetric and quasi-axisymmetric geometries (bent and flattened) were analyzed. The aerodynamic optimization was based on hypersonic simple incidence-angle analysis tools. The range of configurations included those suitable for a lunar return trajectory with a lifting aerocapture at Earth and an overall volume that could support a nominal crew. The results yielded five configurations that had acceptable aerodynamic performance and met overall geometry and size limitations.
A Computer-Based Educational Approach to the Air Command and Staff College Associate Program
1985-04-01
control interactive video, grade student responses and perform some analysis on the data. Its main advantages lie in the ability of the author to...basic goal of providing the instructor with assistance in the development of good CBE. One way of viewing the different tools on the market is to...practice, tutorials and simple games all have as their premise the computer replacing the teacher in a one-on-one encounter. The other modes, simulation
Software Development Cost Estimation Executive Summary
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Menzies, Tim
2006-01-01
Identify simple fully validated cost models that provide estimation uncertainty with cost estimate. Based on COCOMO variable set. Use machine learning techniques to determine: a) Minimum number of cost drivers required for NASA domain based cost models; b) Minimum number of data records required and c) Estimation Uncertainty. Build a repository of software cost estimation information. Coordinating tool development and data collection with: a) Tasks funded by PA&E Cost Analysis; b) IV&V Effort Estimation Task and c) NASA SEPG activities.
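A hedged sketch of the COCOMO-style form this summary refers to: effort grows as a power law of code size, scaled by a product of cost-driver multipliers. The coefficients below are the classic published intermediate-COCOMO values for a semi-detached project, not NASA-calibrated ones.

```python
def cocomo_effort(kloc: float, multipliers=(), a: float = 3.0, b: float = 1.12) -> float:
    """Effort in person-months: a * KLOC^b * product of effort multipliers."""
    em = 1.0
    for m in multipliers:
        em *= m
    return a * kloc ** b * em

# 50 KLOC project with two illustrative drivers (high complexity, experienced team):
print(round(cocomo_effort(50, [1.15, 0.85]), 1), "person-months")
```

Reducing the number of cost drivers, as the summary proposes, amounts to shrinking the multiplier list while keeping estimation uncertainty acceptable.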
A simple and inexpensive external fixator.
Noor, M A
1988-11-01
A simple and inexpensive external fixator has been designed. It is constructed of galvanized iron pipe and mild steel bolts and nuts. It can easily be manufactured in a hospital workshop with a minimum of tools.
NASA Astrophysics Data System (ADS)
Hassan, A. H.; Fluke, C. J.; Barnes, D. G.
2012-09-01
Upcoming and future astronomy research facilities will systematically generate terabyte-sized data sets, moving astronomy into the Petascale data era. While such facilities will provide astronomers with unprecedented levels of accuracy and coverage, the increases in dataset size and dimensionality will pose serious computational challenges for many current astronomy data analysis and visualization tools. With such data sizes, even simple data analysis tasks (e.g. calculating a histogram or computing data minimum/maximum) may not be achievable without access to a supercomputing facility. To effectively handle such dataset sizes, which exceed today's single machine memory and processing limits, we present a framework that exploits the distributed power of GPUs and many-core CPUs, with a goal of providing data analysis and visualization tasks as a service for astronomers. By mixing shared and distributed memory architectures, our framework effectively utilizes the underlying hardware infrastructure, handling both batched and real-time data analysis and visualization tasks. Offering such functionality as a service in a “software as a service” manner will reduce the total cost of ownership, provide an easy-to-use tool to the wider astronomical community, and enable a more optimized utilization of the underlying hardware infrastructure.
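A simple serial illustration of the underlying idea (the framework described distributes this across GPUs and CPU cores): global min/max and a histogram computed chunk by chunk, so the dataset never needs to fit in one machine's memory. The partial results combine associatively, which is what makes the distribution possible.

```python
import numpy as np

def chunked_stats(chunks, bins=64, lo=-1.0, hi=1.0):
    gmin, gmax = np.inf, -np.inf
    hist = np.zeros(bins, dtype=np.int64)
    edges = np.linspace(lo, hi, bins + 1)
    for chunk in chunks:                 # each chunk yields one partial result
        gmin = min(gmin, chunk.min())
        gmax = max(gmax, chunk.max())
        hist += np.histogram(chunk, bins=edges)[0]
    return gmin, gmax, hist

rng = np.random.default_rng(1)
stream = (rng.uniform(-1, 1, 1_000_000) for _ in range(8))  # stand-in for a huge data cube
print(chunked_stats(stream)[:2])
```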
ERIC Educational Resources Information Center
Sigford, Ann; Nelson, Nancy
1998-01-01
Presents a program for elementary teachers to learn how to use hand tools and household appliances to teach the principles of physics. The lesson helps teachers become familiar with simple hand tools, combat the apprehension of mechanical devices, and develop an interest in tools and technology. Session involves disassembling appliances to…
Development and Validation of the Texas Best Management Practice Evaluation Tool (TBET)
USDA-ARS?s Scientific Manuscript database
Conservation planners need simple yet accurate tools to predict sediment and nutrient losses from agricultural fields to guide conservation practice implementation and increase cost-effectiveness. The Texas Best management practice Evaluation Tool (TBET), which serves as an input/output interpreter...
Díaz-Gay, Marcos; Vila-Casadesús, Maria; Franch-Expósito, Sebastià; Hernández-Illán, Eva; Lozano, Juan José; Castellví-Bel, Sergi
2018-06-14
Mutational signatures have proven to be a valuable pattern in somatic genomics, mainly regarding cancer, with a potential application as a biomarker in clinical practice. Up to now, several bioinformatic packages to address this topic have been developed in different languages/platforms. MutationalPatterns has arisen as the most efficient tool for the comparison with the signatures currently reported in the Catalogue of Somatic Mutations in Cancer (COSMIC) database. However, the analysis of mutational signatures is nowadays restricted to a small community of bioinformatic experts. In this work we present Mutational Signatures in Cancer (MuSiCa), a new web tool based on MutationalPatterns and built using the Shiny framework in R language. By means of a simple interface suited to non-specialized researchers, it provides a comprehensive analysis of the somatic mutational status of the supplied cancer samples. It permits characterizing the profile and burden of mutations, as well as quantifying COSMIC-reported mutational signatures. It also allows classifying samples according to the above signature contributions. MuSiCa is a helpful web application to characterize mutational signatures in cancer samples. It is accessible online at http://bioinfo.ciberehd.org/GPtoCRC/en/tools.html and source code is freely available at https://github.com/marcos-diazg/musica .
Metsalu, Tauno; Vilo, Jaak
2015-01-01
Principal Component Analysis (PCA) is a widely used method of reducing the dimensionality of high-dimensional data, often followed by visualizing two of the components on the scatterplot. Although widely used, the method lacks an easy-to-use web interface that scientists with little programming experience could use to make plots of their own data. The same applies to creating heatmaps: it is possible to add conditional formatting for Excel cells to show colored heatmaps, but for more advanced features such as clustering and experimental annotations, more sophisticated analysis tools have to be used. We present a web tool called ClustVis that aims to have an intuitive user interface. Users can upload data from a simple delimited text file that can be created in a spreadsheet program. It is possible to modify data processing methods and the final appearance of the PCA and heatmap plots by using drop-down menus, text boxes, sliders etc. Appropriate defaults are given to reduce the time needed by the user to specify input parameters. As an output, users can download the PCA plot and heatmap in one of the preferred file formats. This web server is freely available at http://biit.cs.ut.ee/clustvis/. PMID:25969447
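A sketch of the same workflow outside the web tool, under stated assumptions: a tab-delimited file named expression.txt with samples as rows (both hypothetical), PCA for the scatterplot, and a hierarchically clustered heatmap.

```python
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, leaves_list

df = pd.read_csv("expression.txt", sep="\t", index_col=0)   # hypothetical input file
scores = PCA(n_components=2).fit_transform(df - df.mean())

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.scatter(scores[:, 0], scores[:, 1])
ax1.set(xlabel="PC1", ylabel="PC2", title="PCA")

order = leaves_list(linkage(df.values, method="average"))   # cluster the rows
ax2.imshow(df.values[order], aspect="auto", cmap="viridis")
ax2.set_title("heatmap (rows clustered)")
plt.savefig("clustvis_like.png")
```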
An open source GIS-based tool to integrate the fragmentation mechanism in rockfall propagation
NASA Astrophysics Data System (ADS)
Matas, Gerard; Lantada, Nieves; Gili, Josep A.; Corominas, Jordi
2015-04-01
Rockfalls are frequent instability processes in road cuts, open pit mines and quarries, steep slopes and cliffs. Even though the stability of rock slopes can be determined using analytical approaches, the assessment of large rock cliffs requires simplifying assumptions due to the difficulty of working with a large number of joints and the scatter of both their orientations and strength parameters. The attitude and persistence of joints within the rock mass define the size of kinematically unstable rock volumes. Furthermore, a rock block will eventually split into several fragments during its propagation downhill due to its impact with the ground surface. Knowledge of the size, energy, trajectory… of each block resulting from fragmentation is critical in determining the vulnerability of buildings and protection structures. The objective of this contribution is to present a simple and open source tool to simulate the fragmentation mechanism in rockfall propagation models and in the calculation of impact energies. This tool includes common modes of motion for falling boulders based on the previous literature. The final tool is being implemented in a GIS (Geographic Information System) using open source Python programming. The tool under development will be simple, modular, compatible with any GIS environment, open source, and able to model rockfall phenomena correctly. It could be used in any area susceptible to rockfalls after a prior adjustment of the parameters. After the adjustment of the model parameters to a given area, a simulation can be performed to obtain maps of kinetic energy, frequency, stopping density and passing heights. This GIS-based tool and the analysis of fragmentation laws using data collected from recent rockfalls are being developed within the RockRisk Project (2014-2016). This project is funded by the Spanish Ministerio de Economía y Competitividad and entitled "Rockfalls in cliffs: risk quantification and its prevention" (BIA2013-42582-P).
The Precision Formation Flying Integrated Analysis Tool (PFFIAT)
NASA Technical Reports Server (NTRS)
Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor
2004-01-01
Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.
The Precision Formation Flying Integrated Analysis Tool (PFFIAT)
NASA Technical Reports Server (NTRS)
Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor
2004-01-01
Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.
MyGeoHub: A Collaborative Geospatial Research and Education Platform
NASA Astrophysics Data System (ADS)
Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.
2017-12-01
Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings, GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead it supports diverse needs ranging from just a feature-rich data management system, to complex scientific tools and workflows.
Hydrologic analysis for selection and placement of conservation practices at the watershed scale
NASA Astrophysics Data System (ADS)
Wilson, C.; Brooks, E. S.; Boll, J.
2012-12-01
When a water body is exceeding water quality standards and a Total Maximum Daily Load has been established, conservation practices in the watershed are able to reduce point and non-point source pollution. Hydrological analysis is needed to place conservation practices in the most hydrologically sensitive areas. The selection and placement of conservation practices, however, is challenging in ungauged watersheds with little or no data for the hydrological analysis. The objective of this research is to perform a hydrological analysis for mitigation of erosion and total phosphorus in a mixed land use watershed, and to select and place the conservation practices in the most sensitive areas. The study area is the Hangman Creek watershed in Idaho and Washington State, upstream of Long Lake (WA) reservoir, east of Spokane, WA. While the pollutant of concern is total phosphorus (TP), reductions in TP were translated to total suspended solids or reductions in nonpoint source erosion and sediment delivery to streams. Hydrological characterization was done with a simple web-based tool, which runs the Water Erosion Prediction Project (WEPP) model for representative land types in the watersheds, where a land type is defined as a unique combination of soil type, slope configuration, land use and management, and climate. The web-based tool used site-specific spatial and temporal data on land use, soil physical parameters, slope, and climate derived from readily available data sources and provided information on potential pollutant pathways (i.e. erosion, runoff, lateral flow, and percolation). Multiple land types representative of the watershed were ordered from most effective to least effective, and displayed spatially using GIS. The methodology for the Hangman Creek watershed was validated in the nearby Paradise Creek watershed, which has long-term stream discharge and monitoring data as well as land use data. Output from the web-based tool shows the potential reductions for different tillage practices, buffer strips, streamside management, and conversion to the conservation reserve program in the watershed. The output also includes the relationship between the land area where conservation practices are placed and the potential reduction in pollution, showing the diminished returns on investment as less sensitive areas are being treated. This application of a simple web-based tool and the use of a physically-based erosion model (i.e. WEPP) illustrates that quantitative, spatial and temporal analysis of changes in pollutant loading and site-specific recommendations of conservation practices can be made in ungauged watersheds.
Twelve essential tools for living the life of whole person health care.
Schlitz, Marilyn; Valentina, Elizabeth
2013-01-01
The integration of body, mind, and spirit has become a key dimension of health education and disease prevention and treatment; however, our health care system remains primarily disease centered. Finding simple steps to help each of us find our own balance can improve our lives, our work, and our relationships. On the basis of interviews with health care experts at the leading edge of the new model of medicine, this article identifies simple tools to improve the health of patients and caregivers.
A novel methodology for building robust design rules by using design based metrology (DBM)
NASA Astrophysics Data System (ADS)
Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan
2013-03-01
This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has been to use a simulation tool and a simple-pattern spider mask. At the early stage of a device, the accuracy of the simulation tool is poor, and the evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations including various 1D and 2D design structures. To overcome the difficulty of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which those mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of the test patterns were inspected within a few hours. Mass silicon data were handled not by personal judgment but by statistical processing. From the results, robust design rules are successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.
History of optics: a modern teaching tool
NASA Astrophysics Data System (ADS)
Vazquez, D.; Gonzalez-Cano, A.; Diaz-Herrera, N.; Llombart, N.; Alda, J.
2012-10-01
The history of optics is a very rich field of science, and it offers many simple and significant examples of the application and success of the experimental method; it is therefore a very good tool for conveying to the student the way science proceeds and for instilling the right spirit of critical analysis, building and testing of models, etc. Optical phenomena are especially well suited for this because optical observations and experiments have made science advance in a crucial way in many different periods of history, because they are in many cases quite visual and quite simple in concept, and because it is very easy to produce experimental setups in classrooms. Also, the intrinsic multidisciplinary character of Optics, a subject that has historically influenced in a notorious way fields such as art, philosophy, religion and cultural and social studies in general, provides a very wide frame that permits applying these examples to many different audiences. We present here some reflections on the role that the history of optics can play in teaching, and show some real examples of its application during the many years that we have been employing it in the context of the Optics School of the Complutense University of Madrid, Spain.
3D inelastic analysis methods for hot section components
NASA Technical Reports Server (NTRS)
Dame, L. T.; Chen, P. C.; Hartle, M. S.; Huang, H. T.
1985-01-01
The objective is to develop analytical tools capable of economically evaluating the cyclic time-dependent plasticity which occurs in hot section engine components in areas of strain concentration resulting from the combination of both mechanical and thermal stresses. Three models were developed. A simple model performs time-dependent inelastic analysis using the power law creep equation. The second model is the classical model of Professors Walter Haisler and David Allen of Texas A&M University. The third model is the unified model of Bodner, Partom, et al. All models were customized for linear variation of loads and temperatures, with all material properties and constitutive models being temperature dependent.
Transcriptome analysis by strand-specific sequencing of complementary DNA
Parkhomchuk, Dmitri; Borodina, Tatiana; Amstislavskiy, Vyacheslav; Banaru, Maria; Hallen, Linda; Krobitsch, Sylvia; Lehrach, Hans; Soldatov, Alexey
2009-01-01
High-throughput complementary DNA sequencing (RNA-Seq) is a powerful tool for whole-transcriptome analysis, supplying information about a transcript's expression level and structure. However, it is difficult to determine the polarity of transcripts, and therefore identify which strand is transcribed. Here, we present a simple cDNA sequencing protocol that preserves information about a transcript's direction. Using Saccharomyces cerevisiae and mouse brain transcriptomes as models, we demonstrate that knowing the transcript's orientation allows more accurate determination of the structure and expression of genes. It also helps to identify new genes and enables studying promoter-associated and antisense transcription. The transcriptional landscapes we obtained are available online. PMID:19620212
Transcriptome analysis by strand-specific sequencing of complementary DNA.
Parkhomchuk, Dmitri; Borodina, Tatiana; Amstislavskiy, Vyacheslav; Banaru, Maria; Hallen, Linda; Krobitsch, Sylvia; Lehrach, Hans; Soldatov, Alexey
2009-10-01
High-throughput complementary DNA sequencing (RNA-Seq) is a powerful tool for whole-transcriptome analysis, supplying information about a transcript's expression level and structure. However, it is difficult to determine the polarity of transcripts, and therefore identify which strand is transcribed. Here, we present a simple cDNA sequencing protocol that preserves information about a transcript's direction. Using Saccharomyces cerevisiae and mouse brain transcriptomes as models, we demonstrate that knowing the transcript's orientation allows more accurate determination of the structure and expression of genes. It also helps to identify new genes and enables studying promoter-associated and antisense transcription. The transcriptional landscapes we obtained are available online.
Structural analyses for the modification and verification of the Viking aeroshell
NASA Technical Reports Server (NTRS)
Stephens, W. B.; Anderson, M. S.
1976-01-01
The Viking aeroshell is an extremely lightweight flexible shell structure that has undergone thorough buckling analyses in the course of its development. The analytical tools and modeling technique required to reveal the structural behavior are presented. Significant results are given which illustrate the complex failure modes not usually observed in simple models and analyses. Both shell-of-revolution analysis for the pressure loads and thermal loads during entry and a general shell analysis for concentrated tank loads during launch were used. In many cases fixes or alterations to the structure were required, and the role of the analytical results in determining these modifications is indicated.
Risk Interfaces to Support Integrated Systems Analysis and Development
NASA Technical Reports Server (NTRS)
Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria
2016-01-01
Objectives for systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. Why? To support development of integrated solutions that prevent unwanted outcomes (implementable approaches to minimize mission resources: mass, power, crew time, etc.) and to support development of tools for autonomy, needed for exploration (assess and maintain resilience of individuals, teams, and the integrated system). Output of this exercise: representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information and simple status based on the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.
Neuronal and network computation in the brain
NASA Astrophysics Data System (ADS)
Babloyantz, A.
1999-03-01
The concepts and methods of non-linear dynamics have been a powerful tool for studying some aspects of brain dynamics. In this paper we show how, from time series analysis of electroencephalograms in sick and healthy subjects, the chaotic nature of brain activity could be unveiled. This finding gave rise to the concept of spatiotemporal cortical chaotic networks, which in turn was the foundation for a simple brain-like device that is able to become attentive and perform pattern recognition and motion detection. A new method of time series analysis is also proposed which demonstrates for the first time the existence of a neuronal code in the interspike intervals of cochlear cells.
Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) for a 3-D Flexible Wing
NASA Technical Reports Server (NTRS)
Gumbert, Clyde R.; Hou, Gene J.-W.
2001-01-01
The formulation and implementation of an optimization method called Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) are extended from single discipline analysis (aerodynamics only) to multidisciplinary analysis - in this case, static aero-structural analysis - and applied to a simple 3-D wing problem. The method aims to reduce the computational expense incurred in performing shape optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, Finite Element Method (FEM) structural analysis and sensitivity analysis tools. Results for this small problem show that the method reaches the same local optimum as conventional optimization. However, unlike its application to the wing (single discipline analysis), the method, as implemented here, may not show significant reduction in the computational cost. Similar reductions were seen in the two-design-variable (DV) problem results but not in the 8-DV results given here.
2013-01-01
Background Perturbations in intestinal microbiota composition have been associated with a variety of gastrointestinal tract-related diseases. The alleviation of symptoms has been achieved using treatments that alter the gastrointestinal tract microbiota toward that of healthy individuals. Identifying differences in microbiota composition through the use of 16S rRNA gene hypervariable tag sequencing has profound health implications. Current computational methods for comparing microbial communities are usually based on multiple alignments and phylogenetic inference, making them time consuming and requiring exceptional expertise and computational resources. As sequencing data rapidly grows in size, simpler analysis methods are needed to meet the growing computational burdens of microbiota comparisons. Thus, we have developed a simple, rapid, and accurate method, independent of multiple alignments and phylogenetic inference, to support microbiota comparisons. Results We create a metric, called compression-based distance (CBD) for quantifying the degree of similarity between microbial communities. CBD uses the repetitive nature of hypervariable tag datasets and well-established compression algorithms to approximate the total information shared between two datasets. Three published microbiota datasets were used as test cases for CBD as an applicable tool. Our study revealed that CBD recaptured 100% of the statistically significant conclusions reported in the previous studies, while achieving a decrease in computational time required when compared to similar tools without expert user intervention. Conclusion CBD provides a simple, rapid, and accurate method for assessing distances between gastrointestinal tract microbiota 16S hypervariable tag datasets. PMID:23617892
The Hand Burn Severity (HABS) score: A simple tool for stratifying severity of hand burns.
Bache, Sarah E; Fitzgerald O'Connor, Edmund; Theodorakopoulou, Evgenia; Frew, Quentin; Philp, Bruce; Dziewulski, Peter
2017-02-01
Hand burns represent a unique challenge to the burns team due to the intricate structure and unrivalled functional importance of the hand. The initial assessment and prognosis rely on consideration of the specific site involved as well as the depth of the burn. We created a simple severity score that could be used by referring non-specialists and researchers alike. The Hand Burn Severity (HABS) score stratifies hand burns according to severity with a numerical value of between 0 (no burn) and 18 (most severe) per hand. Three independent assessors scored the photographs of 121 burned hands of 106 adult and paediatric patients, demonstrating excellent inter-rater reliability (r=0.91, p<0.0001 on testing with Lin's correlation coefficient). A significant relationship was shown between the HABS score and a reliable binary outcome of the requirement for surgical excision on Mann-Whitney U testing (U=152; Z=9.8; p=0.0001). A receiver operator characteristic (ROC) curve analysis found a cut-off score of 5.5, indicating that those with a HABS score below 6 did not require an operation, whereas those with a score of 6 or above did. The HABS score was shown to be more sensitive and specific than assessment of burn depth alone. The HABS score is a simple-to-use tool for stratifying severity at initial presentation of hand burns, which will be useful when referring and when reporting outcomes. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
Yang, Fang; Chia, Nicholas; White, Bryan A; Schook, Lawrence B
2013-04-23
Perturbations in intestinal microbiota composition have been associated with a variety of gastrointestinal tract-related diseases. The alleviation of symptoms has been achieved using treatments that alter the gastrointestinal tract microbiota toward that of healthy individuals. Identifying differences in microbiota composition through the use of 16S rRNA gene hypervariable tag sequencing has profound health implications. Current computational methods for comparing microbial communities are usually based on multiple alignments and phylogenetic inference, making them time consuming and requiring exceptional expertise and computational resources. As sequencing data rapidly grows in size, simpler analysis methods are needed to meet the growing computational burdens of microbiota comparisons. Thus, we have developed a simple, rapid, and accurate method, independent of multiple alignments and phylogenetic inference, to support microbiota comparisons. We create a metric, called compression-based distance (CBD) for quantifying the degree of similarity between microbial communities. CBD uses the repetitive nature of hypervariable tag datasets and well-established compression algorithms to approximate the total information shared between two datasets. Three published microbiota datasets were used as test cases for CBD as an applicable tool. Our study revealed that CBD recaptured 100% of the statistically significant conclusions reported in the previous studies, while achieving a decrease in computational time required when compared to similar tools without expert user intervention. CBD provides a simple, rapid, and accurate method for assessing distances between gastrointestinal tract microbiota 16S hypervariable tag datasets.
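The abstracts do not give CBD's exact formula, so the sketch below uses the closely related normalised compression distance with zlib to show the core idea: shared structure between two datasets makes their concatenation compress almost as well as either alone. The sequences are toy stand-ins for 16S hypervariable tag sets.

```python
import zlib

def c(data: bytes) -> int:
    return len(zlib.compress(data, 9))

def compression_distance(x: bytes, y: bytes) -> float:
    cx, cy, cxy = c(x), c(y), c(x + y)
    # Normalised compression distance: near 0 for similar inputs, near 1 for unrelated.
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"ACGTACGTACGT" * 50          # toy stand-in for one community's tag dataset
b_ = b"ACGTACGAACGT" * 50         # a slightly different community
print(compression_distance(a, b_))   # small value: the two datasets share structure
```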
Eksborg, Staffan
2013-01-01
Pharmacokinetic studies are important for optimizing drug dosing, but require proper validation of the pharmacokinetic procedures used. However, simple and reliable statistical methods suitable for evaluation of the predictive performance of pharmacokinetic analysis are essentially lacking. The aim of the present study was to construct and evaluate a graphic procedure for quantification of the predictive performance of individual and population pharmacokinetic compartment analysis. Original data from previously published pharmacokinetic compartment analyses after intravenous, oral, and epidural administration, and digitized data obtained from published scatter plots of observed vs predicted drug concentrations from population pharmacokinetic studies using the NPEM algorithm, the NONMEM computer program and Bayesian forecasting procedures, were used for estimating the predictive performance according to the proposed graphical method and by the method of Sheiner and Beal. The graphical plot proposed in the present paper proved to be a useful tool for evaluation of the predictive performance of both individual and population compartment pharmacokinetic analysis. The proposed method is simple to use and gives valuable information concerning time- and concentration-dependent inaccuracies that might occur in individual and population pharmacokinetic compartment analysis. Predictive performance can be quantified by the fraction of concentration ratios within arbitrarily specified ranges, e.g. within the range 0.8-1.2.
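A minimal sketch of the quantification described at the end of the abstract: compute the ratio of observed to predicted concentration for each point and report the fraction falling inside a specified range such as 0.8-1.2. The concentration values are invented.

```python
import numpy as np

observed = np.array([1.9, 2.2, 3.1, 0.9, 1.4, 2.8])    # measured concentrations
predicted = np.array([2.0, 2.0, 3.0, 1.2, 1.3, 2.5])   # model predictions

ratios = observed / predicted
inside = np.mean((ratios >= 0.8) & (ratios <= 1.2))    # fraction within the range
print(f"{inside:.0%} of ratios within 0.8-1.2")
```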
Toward Interactive Scenario Analysis and Exploration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gayle, Thomas R.; Summers, Kenneth Lee; Jungels, John
2015-01-01
As Modeling and Simulation (M&S) tools have matured, their applicability and importance have increased across many national security challenges. In particular, they provide a way to test how something may behave without the need to do real world testing. However, current and future changes across several factors including capabilities, policy, and funding are driving a need for rapid response or evaluation in ways that many M&S tools cannot address. Issues around large data, computational requirements, delivery mechanisms, and analyst involvement already exist and pose significant challenges. Furthermore, rising expectations, rising input complexity, and increasing depth of analysis will only increase the difficulty of these challenges. In this study we examine whether innovations in M&S software coupled with advances in "cloud" computing and "big-data" methodologies can overcome many of these challenges. In particular, we propose a simple, horizontally-scalable distributed computing environment that could provide the foundation (i.e. "cloud") for next-generation M&S-based applications based on the notion of "parallel multi-simulation". In our context, the goal of parallel multi-simulation is to consider as many simultaneous paths of execution as possible. Therefore, with sufficient resources, the complexity is dominated by the cost of single scenario runs as opposed to the number of runs required. We show the feasibility of this architecture through a stable prototype implementation coupled with the Umbra Simulation Framework [6]. Finally, we highlight the utility through multiple novel analysis tools and by showing the performance improvement compared to existing tools.
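A very small sketch of the parallel multi-simulation notion, not the Umbra-based prototype: many scenario variants execute simultaneously, so with sufficient workers the wall-clock cost approaches that of a single run. The scenario function is a placeholder.

```python
from concurrent.futures import ProcessPoolExecutor

def run_scenario(seed: int) -> float:
    import random
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(100_000))   # placeholder simulation work

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        # 32 simultaneous paths of execution, one per scenario variant.
        results = list(pool.map(run_scenario, range(32)))
    print(min(results), max(results))
```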
Customisation of the exome data analysis pipeline using a combinatorial approach.
Pattnaik, Swetansu; Vaidyanathan, Srividya; Pooja, Durgad G; Deepak, Sa; Panda, Binay
2012-01-01
The advent of next generation sequencing (NGS) technologies has revolutionised the way biologists produce, analyse and interpret data. Although NGS platforms provide a cost-effective way to discover genome-wide variants from a single experiment, variants discovered by NGS need follow-up validation due to the high error rates associated with various sequencing chemistries. Recently, whole exome sequencing has been proposed as an affordable option compared to whole genome runs, but it still requires follow-up validation of all the novel exomic variants. Customarily, a consensus approach is used to overcome the systematic errors inherent to the sequencing technology, alignment and post-alignment variant detection algorithms. However, the aforementioned approach warrants the use of multiple sequencing chemistries, multiple alignment tools and multiple variant callers, which may not be viable in terms of time and money for individual investigators with limited informatics know-how. Biologists often lack the requisite training to deal with the huge amount of data produced by NGS runs and face difficulty in choosing from the list of freely available analytical tools for NGS data analysis. Hence, there is a need to customise the NGS data analysis pipeline to preferentially retain true variants by minimising the incidence of false positives, and to make the choice of the right analytical tools easier. To this end, we have sampled different freely available tools used at the alignment and post-alignment stages, suggesting the use of the most suitable combination determined by a simple framework of pre-existing metrics to create significant datasets.
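A toy sketch of the consensus idea the abstract relies on: keep only variants reported by at least k of n callers. The call sets below are invented tuples; real pipelines compare normalised VCF records rather than tuples.

```python
def consensus(callsets, k=2):
    counts = {}
    for calls in callsets:                       # one set per variant caller
        for variant in calls:                    # variant = (chrom, pos, ref, alt)
            counts[variant] = counts.get(variant, 0) + 1
    return {v for v, n in counts.items() if n >= k}

# Hypothetical call sets from three callers:
caller_a = {("chr1", 1001, "A", "G"), ("chr2", 5002, "T", "C")}
caller_b = {("chr1", 1001, "A", "G"), ("chr3", 777, "G", "A")}
caller_c = {("chr1", 1001, "A", "G"), ("chr2", 5002, "T", "C")}

print(consensus([caller_a, caller_b, caller_c], k=2))
```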
SIGKit: Software for Introductory Geophysics Toolkit
NASA Astrophysics Data System (ADS)
Kruse, S.; Bank, C. G.; Esmaeili, S.; Jazayeri, S.; Liu, S.; Stoikopoulos, N.
2017-12-01
The Software for Introductory Geophysics Toolkit (SIGKit) affords students the opportunity to create model data and perform simple processing of field data for various geophysical methods. SIGkit provides a graphical user interface built with the MATLAB programming language, but can run even without a MATLAB installation. At this time SIGkit allows students to pick first arrivals and match a two-layer model to seismic refraction data; grid total-field magnetic data, extract a profile, and compare this to a synthetic profile; and perform simple processing steps (subtraction of a mean trace, hyperbola fit) to ground-penetrating radar data. We also have preliminary tools for gravity, resistivity, and EM data representation and analysis. SIGkit is being built by students for students, and the intent of the toolkit is to provide an intuitive interface for simple data analysis and understanding of the methods, and act as an entrance to more sophisticated software. The toolkit has been used in introductory courses as well as field courses. First reactions from students are positive. Think-aloud observations of students using the toolkit have helped identify problems and helped shape it. We are planning to compare the learning outcomes of students who have used the toolkit in a field course to students in a previous course to test its effectiveness.
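A sketch of the standard two-layer refraction forward model that such a tool lets students match to picked first arrivals: the direct wave arrives at t = x/v1 and the head wave at t = x/v2 + 2h·sqrt(v2² - v1²)/(v1·v2). The velocities, layer depth and offsets below are assumptions, not SIGKit defaults.

```python
import numpy as np

def travel_times(x, v1=500.0, v2=1500.0, h=5.0):
    """First-arrival times for a two-layer model (v2 > v1, layer thickness h)."""
    direct = x / v1
    intercept = 2 * h * np.sqrt(v2**2 - v1**2) / (v1 * v2)
    refracted = x / v2 + intercept
    return np.minimum(direct, refracted)     # first arrival at each offset

offsets = np.linspace(1, 60, 12)             # geophone offsets [m], assumed
for x, t in zip(offsets, travel_times(offsets)):
    print(f"{x:5.1f} m  {1000*t:6.1f} ms")
```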
Simplified, inverse, ejector design tool
NASA Technical Reports Server (NTRS)
Dechant, Lawrence J.
1993-01-01
A simple lumped parameter based inverse design tool has been developed which provides flow path geometry and entrainment estimates subject to operational, acoustic, and design constraints. These constraints are manifested through specification of primary mass flow rate or ejector thrust, fully-mixed exit velocity, and static pressure matching. Fundamentally, integral forms of the conservation equations coupled with the specified design constraints are combined to yield an easily invertible linear system in terms of the flow path cross-sectional areas. Entrainment is computed by back substitution. Initial comparison with experimental and analogous one-dimensional methods shows good agreement. Thus, this simple inverse design code provides an analytically based, preliminary design tool with direct application to High Speed Civil Transport (HSCT) design studies.
Venhorst, Kristie; Zelle, Sten G; Tromp, Noor; Lauer, Jeremy A
2014-01-01
The objective of this study was to develop a rating tool for policy makers to prioritize breast cancer interventions in low- and middle-income countries (LMICs), based on a simple multi-criteria decision analysis (MCDA) approach. The definition and identification of criteria play a key role in MCDA, and our rating tool could be used as part of a broader priority-setting exercise in a local setting. This tool may contribute to a more transparent priority-setting process and fairer decision-making in future breast cancer policy development. First, an expert panel (n = 5) discussed key considerations for tool development. A literature review followed to inventory all relevant criteria and construct an initial set of criteria. A Delphi study was then performed and questionnaires used to discuss a final list of criteria with clear definitions and potential scoring scales. For this Delphi study, multiple breast cancer policy and priority-setting experts from different LMICs were selected and invited by the World Health Organization. Fifteen international experts participated in all three Delphi rounds to assess and evaluate each criterion. This study resulted in a preliminary rating tool for assessing breast cancer interventions in LMICs. The tool consists of 10 carefully crafted criteria (effectiveness, quality of the evidence, magnitude of individual health impact, acceptability, cost-effectiveness, technical complexity, affordability, safety, geographical coverage, and accessibility), with clear definitions and potential scoring scales. This study describes the development of a rating tool to assess breast cancer interventions in LMICs. Our tool can offer supporting knowledge for the use or development of rating tools as part of a broader (MCDA based) priority-setting exercise in local settings. Further steps for improving the tool are proposed and should lead to its useful adoption in LMICs.
Remote Control and Data Acquisition: A Case Study
NASA Technical Reports Server (NTRS)
DeGennaro, Alfred J.; Wilkinson, R. Allen
2000-01-01
This paper details software tools developed to remotely command experimental apparatus, and to acquire and visualize the associated data in soft real time. The work was undertaken because commercial products failed to meet the needs. This work has identified six key factors intrinsic to development of quality research laboratory software. Capabilities include access to all new instrument functions without any programming or dependence on others to write drivers or virtual instruments, simple full screen text-based experiment configuration and control user interface, months of continuous experiment run-times, order of 1% CPU load for condensed matter physics experiment described here, very little imposition of software tool choices on remote users, and total remote control from anywhere in the world over the Internet or from home on a 56 Kb modem as if the user is sitting in the laboratory. This work yielded a set of simple robust tools that are highly reliable, resource conserving, extensible, and versatile, with a uniform simple interface.
Analytical Tools Interface for Landscape Assessments
Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...
Development of a simplified urban water balance model (WABILA).
Henrichs, M; Langner, J; Uhl, M
2016-01-01
During the last decade, water sensitive urban design (WSUD) has become more and more accepted. However, no simple tool has been available to evaluate the influence of these measures on the local water balance. To counteract the impact of new settlements, planners focus on mitigating increases in runoff through the installation of infiltration systems. This leads to increasing non-natural groundwater recharge and decreased evapotranspiration. Simple software tools that evaluate or simulate the effect of WSUD on the local water balance are therefore still needed. The authors developed a tool named WABILA (Wasserbilanz) that can support planners in optimal WSUD. WABILA is an easy-to-use planning tool based on simplified regression functions for established measures and land covers. Results show that WSUD has to be site-specific, based on climate conditions and the natural water balance.
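An illustrative sketch of the idea behind such simplified water-balance tools, not WABILA's actual regression functions: annual precipitation is partitioned into runoff, evapotranspiration and groundwater recharge with per-land-cover coefficients. All coefficients and areas here are invented.

```python
precip_mm = 800.0
land_cover = {            # cover: (area share, runoff, evapotranspiration, recharge)
    "roof":        (0.25, 0.80, 0.15, 0.05),
    "paved":       (0.25, 0.65, 0.20, 0.15),
    "green space": (0.50, 0.05, 0.60, 0.35),
}

totals = {"runoff": 0.0, "et": 0.0, "recharge": 0.0}
for share, r, et, gw in land_cover.values():
    totals["runoff"] += share * r * precip_mm
    totals["et"] += share * et * precip_mm
    totals["recharge"] += share * gw * precip_mm

print({k: round(v, 1) for k, v in totals.items()})   # mm per year; sums to 800
```

Comparing these totals against the pre-development (natural) partition is what shows whether a planned WSUD mix restores the local water balance.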
Sampling and sensitivity analyses tools (SaSAT) for computational modelling
Hoare, Alexander; Regan, David G; Wilson, David P
2008-01-01
SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
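The core SaSAT workflow (sample the parameter space, run the model, rank parameters by correlation with the output) can be sketched as follows; this is an illustrative Python analogue, since SaSAT itself is Matlab-based, and it uses Spearman rank correlation in place of SaSAT's fuller set of coefficients.

```python
# Latin hypercube sampling of two epidemic parameters, followed by a
# rank-correlation sensitivity ranking against a simple model output.
import numpy as np
from scipy.stats import qmc, spearmanr

sampler = qmc.LatinHypercube(d=2, seed=1)
u = sampler.random(n=200)                 # samples in the unit square
beta = 0.1 + 0.4 * u[:, 0]                # transmission rate
gamma = 0.05 + 0.2 * u[:, 1]              # recovery rate
r0 = beta / gamma                         # simple epidemic output

for name, x in [("beta", beta), ("gamma", gamma)]:
    rho, p = spearmanr(x, r0)
    print(f"{name}: rho={rho:+.2f} (p={p:.1e})")
```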
NASA Astrophysics Data System (ADS)
Kacprzyk, Janusz; Zadrożny, Sławomir
2010-05-01
We present how the conceptually and numerically simple concept of a fuzzy linguistic database summary can be a very powerful tool for gaining much insight into the very essence of data. The use of linguistic summaries provides tools for the verbalisation of data analysis (mining) results which, in addition to the more commonly used visualisation, e.g. via a graphical user interface, can contribute to increased human consistency and ease of use, notably for supporting decision makers via the data-driven decision support system paradigm. Two new, relevant aspects of the analysis, first initiated by the authors, are also outlined. First, following Kacprzyk and Zadrożny, it is further considered how linguistic data summarisation is closely related to some types of solutions used in natural language generation (NLG). This makes it possible to draw on the increasingly effective and efficient tools and techniques developed in NLG. Second, similar remarks are given on relations to systemic functional linguistics. Moreover, following Kacprzyk and Zadrożny, comments are given on an extremely relevant aspect of the scalability of linguistic summarisation of data, using the new concept of conceptual scalability.
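A minimal sketch of the underlying idea, assuming Yager-style summaries with simple piecewise-linear membership functions (the shapes and thresholds below are arbitrary choices for illustration):

```python
# Truth of the linguistic summary "most records are low": apply the
# fuzzy quantifier "most" to the mean membership of the records in the
# fuzzy predicate "low". Membership shapes are illustrative only.
def mu_low(x, a=20.0, b=40.0):          # fuzzy predicate with a shoulder
    return 1.0 if x <= a else 0.0 if x >= b else (b - x) / (b - a)

def mu_most(p):                          # fuzzy quantifier "most"
    return 0.0 if p <= 0.3 else 1.0 if p >= 0.8 else (p - 0.3) / 0.5

data = [12, 25, 31, 18, 44, 22, 35, 15]
truth = mu_most(sum(mu_low(x) for x in data) / len(data))
print(f'T("most records are low") = {truth:.2f}')
```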
NEFI: Network Extraction From Images
Dirnberger, M.; Kehl, T.; Neumann, A.
2015-01-01
Networks are amongst the central building blocks of many systems. Given a graph of a network, methods from graph theory enable a precise investigation of its properties. Software for the analysis of graphs is widely available and has been applied to study various types of networks. In some applications, graph acquisition is relatively simple. However, for many networks data collection relies on images, where graph extraction requires domain-specific solutions. Here we introduce NEFI, a tool that extracts graphs from images of networks originating in various domains. In previous work on graph extraction, theoretical results are fully accessible only to an expert audience, and ready-to-use implementations for non-experts are rarely available or insufficiently documented. NEFI provides a novel platform allowing practitioners to easily extract graphs from images by combining basic tools from image processing, computer vision and graph theory. Thus, NEFI constitutes an alternative to tedious manual graph extraction and special-purpose tools. We anticipate NEFI to enable time-efficient collection of large datasets. The analysis of these novel datasets may open up the possibility to gain new insights into the structure and function of various networks. NEFI is open source and available at http://nefi.mpi-inf.mpg.de. PMID:26521675
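A hedged sketch of the kind of pipeline NEFI automates, using scikit-image; this shows only the skeletonization and junction-detection step, not NEFI's full domain-specific processing.

```python
# Binarize an image of a network, skeletonize it, then treat skeleton
# pixels with more than two skeleton neighbours as graph nodes.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def skeleton_junctions(gray: np.ndarray) -> list:
    binary = gray > threshold_otsu(gray)
    skel = skeletonize(binary)
    nodes = []
    for r in range(1, skel.shape[0] - 1):
        for c in range(1, skel.shape[1] - 1):
            # 3x3 sum includes the centre pixel, so >3 means >2 neighbours
            if skel[r, c] and skel[r-1:r+2, c-1:c+2].sum() > 3:
                nodes.append((r, c))
    return nodes
```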
MOLEonline: a web-based tool for analyzing channels, tunnels and pores (2018 update).
Pravda, Lukáš; Sehnal, David; Toušek, Dominik; Navrátilová, Veronika; Bazgier, Václav; Berka, Karel; Svobodová Vareková, Radka; Koca, Jaroslav; Otyepka, Michal
2018-04-30
MOLEonline is an interactive, web-based application for the detection and characterization of channels (pores and tunnels) within biomacromolecular structures. The updated version of MOLEonline overcomes limitations of the previous version by incorporating the recently developed LiteMol Viewer visualization engine and providing a simple, fully interactive user experience. The application enables two modes of calculation: one is dedicated to the analysis of channels, while the other was specifically designed for transmembrane pores. As the application can use both PDB and mmCIF formats, it can be leveraged to analyze a wide spectrum of biomacromolecular structures, e.g. stemming from NMR, X-ray and cryo-EM techniques. The tool is interconnected with other bioinformatics tools (e.g., PDBe, CSA, ChannelsDB, OPM, UniProt) to facilitate both the setup of calculations and the analysis of acquired results. MOLEonline provides unprecedented analytics for the detection and structural characterization of channels, as well as information about their numerous physicochemical features. Here we present the application of MOLEonline for structural analyses of α-hemolysin and transient receptor potential mucolipin 1 (TRPML1) pores. The MOLEonline application is freely available via the Internet at https://mole.upol.cz.
ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data.
Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J; Intarapanich, Apichart; Tongsima, Sissades; Piriyapongsa, Jittima
2017-01-01
Biochemical methods are available for enriching 5' ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5' ends from these data by statistical analysis of the enrichment. Although statistics-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect data distribution and thus algorithm performance. We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5' ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5' ends than TSSAR. In general, the transcript 5' ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5' ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied for analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at the ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and GitHub repository (https://github.com/PavitaKae/ToNER).
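A conceptual sketch of the transformation step (not ToNER's implementation): Box-Cox-transform enrichment ratios toward normality, then flag sites in the extreme tail of the transformed distribution. The data here are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ratios = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)  # mock ratios
ratios[::500] *= 8.0                   # spike some "enriched" positions

transformed, lam = stats.boxcox(ratios)        # fit lambda, transform
z = (transformed - transformed.mean()) / transformed.std()
enriched = np.flatnonzero(z > stats.norm.ppf(0.999))
print(f"lambda={lam:.2f}, {enriched.size} candidate enriched sites")
```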
Powell, Rebecca L R; Urbanski, Mateusz M; Burda, Sherri; Nanfack, Aubin; Kinge, Thompson; Nyambi, Phillipe N
2008-01-01
The predominance of unique recombinant forms (URFs) of HIV-1 in Cameroon suggests that dual infection, the concomitant or sequential infection with genetically distinct HIV-1 strains, occurs frequently in this region; yet, identifying dual infection among large HIV cohorts in local, resource-limited settings is uncommon, since this generally relies on labor-intensive and costly sequencing methods. Consequently, there is a need to develop an effective, cost-efficient method appropriate to the developing world to identify these infections. In the present study, the heteroduplex assay (HDA) was used to verify dual or single infection status, as shown by traditional sequence analysis, for 15 longitudinally sampled study subjects from Cameroon. Heteroduplex formation, indicative of a dual infection, was identified for all five study subjects shown by sequence analysis to be dually infected. Conversely, heteroduplex formation was not detectable for all 10 HDA reactions of the singly infected study subjects. These results suggest that the HDA is a simple yet powerful and inexpensive tool for the detection of both intersubtype and intrasubtype dual infections, and that the HDA harbors significant potential for reliable, high-throughput screening for dual infection. As these infections and the recombinants they generate facilitate leaps in HIV-1 evolution, and may present major challenges for treatment and vaccine design, this assay will be critical for monitoring the continuing pandemic in regions of the world where HIV-1 viral diversity is broad.
NASA Technical Reports Server (NTRS)
Agarwal, G. C.; Osafo-Charles, F.; Oneill, W. D.; Gottlieb, G. L.
1982-01-01
Time series analysis is applied to model human operator dynamics in pursuit and compensatory tracking modes. The normalized residual criterion is used as a one-step analytical tool to encompass the processes of identification, estimation, and diagnostic checking. A parameter constraining technique is introduced to develop more reliable models of human operator dynamics. The human operator is adequately modeled by a second order dynamic system both in pursuit and compensatory tracking modes. In comparing the data sampling rates, 100 msec between samples is adequate and is shown to provide better results than 200 msec sampling. The residual power spectrum and eigenvalue analysis show that the human operator is not a generator of periodic characteristics.
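A hedged sketch of the flavor of such an identification step: fitting a second-order discrete-time model to input-output tracking data by least squares. The paper's exact estimator, parameter-constraining technique, and diagnostic checks are not reproduced here.

```python
# Fit y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2] from sampled
# target (u) and response (y) signals; residuals support model checking.
import numpy as np

def fit_arx2(u: np.ndarray, y: np.ndarray):
    Y = y[2:]
    X = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])
    theta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    residuals = Y - X @ theta
    return theta, residuals
```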
Nadadhur, Aishwarya G; Emperador Melero, Javier; Meijer, Marieke; Schut, Desiree; Jacobs, Gerbren; Li, Ka Wan; Hjorth, J J Johannes; Meredith, Rhiannon M; Toonen, Ruud F; Van Kesteren, Ronald E; Smit, August B; Verhage, Matthijs; Heine, Vivi M
2017-01-01
Generation of neuronal cultures from induced pluripotent stem cells (hiPSCs) serves the study of human brain disorders. However, we lack neuronal networks with balanced excitatory-inhibitory activities that are suitable for single-cell analysis. We generated low-density networks of hPSC-derived GABAergic and glutamatergic cortical neurons. We used two different co-culture models with astrocytes. Using confocal microscopy, electrophysiological recordings, calcium imaging and mRNA analysis, we show that these cultures have balanced excitatory-inhibitory synaptic identities. These simple and robust protocols offer the opportunity for single-cell to multi-level analysis of patient hiPSC-derived cortical excitatory-inhibitory networks, thereby creating advanced tools to study the disease mechanisms underlying neurodevelopmental disorders.
Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey
2014-01-01
With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Fulfilling these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time- and frequency-domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.
Exploring Magnetic Fields with a Compass
NASA Astrophysics Data System (ADS)
Lunk, Brandon; Beichner, Robert
2011-01-01
A compass is an excellent classroom tool for the exploration of magnetic fields. Any student can tell you that a compass is used to determine which direction is north, but when paired with some basic trigonometry, the compass can be used to actually measure the strength of the magnetic field due to a nearby magnet or current-carrying wire. In this paper, we present a series of simple activities adapted from the Matter & Interactions textbook for doing just this. Interestingly, these simple measurements are comparable to predictions made by the Bohr model of the atom. Although antiquated, Bohr's atom can lead the way to a deeper analysis of the atomic properties of magnets. Although originally developed for an introductory calculus-based course, these activities can easily be adapted for use in an algebra-based class or even at the high school level.
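The trigonometry behind the activity is compact enough to state directly: with the magnet's field perpendicular to Earth's horizontal field, a needle deflection theta gives B_magnet = B_earth * tan(theta). A small sketch, with a typical (assumed, not measured) horizontal field strength:

```python
# The compass needle settles along the vector sum of Earth's horizontal
# field and the magnet's field, so the deflection angle encodes the
# magnet's field strength. B_earth below is a typical value, ~20 uT.
import math

def field_from_deflection(theta_deg: float, b_earth_T: float = 2e-5) -> float:
    return b_earth_T * math.tan(math.radians(theta_deg))

print(f"{field_from_deflection(30):.2e} T")   # ~1.2e-05 T at 30 degrees
```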
Li, Daojin; Yin, Danyang; Chen, Yang; Liu, Zhen
2017-05-19
Protein phosphorylation is a major post-translational modification, which plays a vital role in cellular signaling of numerous biological processes. Mass spectrometry (MS) has been an essential tool for the analysis of protein phosphorylation, for which a key step is to selectively enrich phosphopeptides from complex biological samples. In this study, a metal-organic frameworks (MOFs)-based monolithic capillary has been successfully prepared as an effective sorbent for the selective enrichment of phosphopeptides and has been off-line coupled with matrix-assisted laser desorption ionization-time-of-flight mass spectrometry (MALDI-TOF MS) for efficient analysis of phosphopeptides. Using β-casein as a representative phosphoprotein, efficient phosphorylation analysis by this off-line platform was verified. Phosphorylation analysis of a nonfat milk sample was also demonstrated. By introducing the large surface areas and highly ordered pores of MOFs into a monolithic column, the MOFs-based monolithic capillary exhibited several significant advantages, such as excellent selectivity toward phosphopeptides, superb tolerance to interference and a simple operation procedure. Because of these highly desirable properties, the MOFs-based monolithic capillary could be a useful tool for protein phosphorylation analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
Analysis Tools for the Ion Cyclotron Emission Diagnostic on DIII-D
NASA Astrophysics Data System (ADS)
Del Castillo, C. A.; Thome, K. E.; Pinsker, R. I.; Meneghini, O.; Pace, D. C.
2017-10-01
Ion cyclotron emission (ICE) waves are excited by suprathermal particles such as neutral beam particles and fusion products. An ICE diagnostic is under consideration for use at ITER, where it could provide an important passive measurement of fast-ion location and losses, which are otherwise difficult to determine. Simple ICE data analysis codes had previously been developed, but more sophisticated codes are required to facilitate data analysis. Several terabytes of ICE data were collected on DIII-D during the 2015-2017 campaign. The ICE diagnostic consists of antenna straps and dedicated magnetic probes that are both digitized at 200 MHz. A suite of Python spectral analysis tools within the OMFIT framework is under development to perform the memory-intensive analysis of these data. A fast and optimized analysis allows ready access to data visualizations as spectrograms and as plots of both frequency and time cuts of the data. A database of processed ICE data is being constructed to understand the relationship between the frequency and intensity of ICE and a variety of experimental parameters, including neutral beam power and geometry, local and global plasma parameters, magnetic fields, and many others. Work supported in part by US DoE under the Science Undergraduate Laboratory Internship (SULI) program and under DE-FC02-04ER54698.
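As a flavor of the spectral step such a suite must perform, here is a hedged sketch of a spectrogram of a synthetic probe signal; the signal content, window sizes, and the 25 MHz line are illustrative, with only the 200 MHz digitization rate taken from the abstract.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 200e6                                   # 200 MHz digitizer
t = np.arange(0, 1e-3, 1.0 / fs)             # 1 ms of mock probe data
x = np.sin(2 * np.pi * 25e6 * t) + 0.1 * np.random.randn(t.size)

# Short-time FFT: frequency-vs-time power for spectrogram-style plots
f, tau, Sxx = spectrogram(x, fs=fs, nperseg=4096, noverlap=2048)
peak = f[Sxx.mean(axis=1).argmax()]
print(f"dominant line near {peak/1e6:.1f} MHz")
```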
Rocker: Open source, easy-to-use tool for AUC and enrichment calculations and ROC visualization.
Lätti, Sakari; Niinivehmas, Sanna; Pentikäinen, Olli T
2016-01-01
The receiver operating characteristic (ROC) curve, together with the calculation of the area under the curve (AUC), is a useful tool for evaluating performance on biomedical and chemoinformatics data. For example, in virtual drug screening, ROC curves are very often used to visualize the efficiency of the application used to separate active ligands from inactive molecules. Unfortunately, most of the available tools for ROC analysis are implemented in commercial software packages, or as plugins in statistical software, which are not always the easiest to use. Here, we present Rocker, a simple ROC curve visualization tool that can be used to generate publication-quality images. Rocker also includes automatic calculation of the AUC for the ROC curve and of the Boltzmann-enhanced discrimination of ROC (BEDROC). Furthermore, in virtual screening campaigns it is often important to understand the early enrichment of active ligand identification; for this, Rocker offers an automated calculation routine. To enable further development of Rocker, it is freely available (MIT-GPL license) for use and modification from our website (http://www.jyu.fi/rocker).
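For reference, the AUC that tools like Rocker compute reduces to the probability that a randomly chosen active outranks a randomly chosen inactive; a minimal rank-based sketch (not Rocker's code):

```python
import numpy as np

def auc(scores: np.ndarray, labels: np.ndarray) -> float:
    """Probability that a random active outranks a random inactive."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

scores = np.array([0.9, 0.8, 0.7, 0.6, 0.4, 0.3])
labels = np.array([1, 1, 0, 1, 0, 0])
print(auc(scores, labels))    # 0.889 for this toy ranking
```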
Mobile Modelling for Crowdsourcing Building Interior Data
NASA Astrophysics Data System (ADS)
Rosser, J.; Morley, J.; Jackson, M.
2012-06-01
Indoor spatial data forms an important foundation to many ubiquitous computing applications. It gives context to users operating location-based applications, provides an important source of documentation of buildings and can be of value to computer systems where an understanding of environment is required. Unlike external geographic spaces, no centralised body or agency is charged with collecting or maintaining such information. Widespread deployment of mobile devices provides a potential tool that would allow rapid model capture and update by a building's users. Here we introduce some of the issues involved in volunteering building interior data and outline a simple mobile tool for capture of indoor models. The nature of indoor data is inherently private; however in-depth analysis of this issue and legal considerations are not discussed in detail here.
Kinetic Profiling of Catalytic Organic Reactions as a Mechanistic Tool.
Blackmond, Donna G
2015-09-02
The use of modern kinetic tools to obtain virtually continuous reaction progress data over the course of a catalytic reaction opens up a vista that provides mechanistic insights into both simple and complex catalytic networks. Reaction profiles offer a rate/concentration scan that tells the story of a batch reaction time course in a qualitative "fingerprinting" manner as well as in quantitative detail. Reaction progress experiments may be mathematically designed to elucidate catalytic rate laws from only a fraction of the number of experiments required in classical kinetic measurements. The information gained from kinetic profiles provides clues to direct further mechanistic analysis by other approaches. Examples from a variety of catalytic reactions spanning two decades of the author's work help to delineate nuances on a central mechanistic theme.
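The core of the approach, turning a dense concentration-time profile into a rate profile by numerical differentiation and then inspecting rate against concentration, can be sketched in a few lines; the data here are synthetic first-order decay, for illustration only.

```python
import numpy as np

t = np.linspace(0.0, 3600.0, 200)          # s
c = 0.5 * np.exp(-1.5e-3 * t)              # [substrate], M (synthetic)
rate = -np.gradient(c, t)                  # M/s from the progress curve

# Slope of log(rate) vs log(c) estimates the apparent reaction order
order = np.polyfit(np.log(c), np.log(rate), 1)[0]
print(f"apparent order in substrate: {order:.2f}")   # ~1.0
```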
Advancements in RNASeqGUI towards a Reproducible Analysis of RNA-Seq Experiments
Russo, Francesco; Righelli, Dario
2016-01-01
We present the advancements and novelties recently introduced in RNASeqGUI, a graphical user interface that helps biologists to handle and analyse large data collected in RNA-Seq experiments. This work focuses on the concept of reproducible research and shows how it has been incorporated in RNASeqGUI to provide reproducible (computational) results. The novel version of RNASeqGUI combines graphical interfaces with tools for reproducible research, such as literate statistical programming, human readable report, parallel executions, caching, and interactive and web-explorable tables of results. These features allow the user to analyse big datasets in a fast, efficient, and reproducible way. Moreover, this paper represents a proof of concept, showing a simple way to develop computational tools for Life Science in the spirit of reproducible research. PMID:26977414
Multisource feedback: 360-degree assessment of professional skills of clinical directors.
Palmer, Robert; Rayner, Hugh; Wall, David
2007-08-01
360-degree assessment is a valuable tool for measuring the behaviour of National Health Service (NHS) staff. The important role of a clinical director as a medical leader is increasingly recognized, and the attributes of a good clinical director can be defined. Set against these attributes, a 360-degree assessment tool has been designed. The job description for clinical directors was used to develop a questionnaire sent to senior hospital staff. The views of staff within the hospital are similar irrespective of gender, post held or length of time in post. Analysis has shown that three independent factors can be distilled, namely operational management, interpersonal skills and creative/strategic thinking. A simple validated questionnaire has been developed and successfully introduced for the 360-degree assessment of clinical directors.
Tissue enrichment analysis for C. elegans genomics.
Angeles-Albores, David; N Lee, Raymond Y; Chan, Juancarlos; Sternberg, Paul W
2016-09-13
Over the last ten years, there has been explosive development in methods for measuring gene expression. These methods can identify thousands of genes altered between conditions, but understanding these datasets and forming hypotheses based on them remains challenging. One way to analyze these datasets is to associate ontologies (hierarchical, descriptive vocabularies with controlled relations between terms) with genes and to look for enrichment of specific terms. Although the Gene Ontology (GO) is available for Caenorhabditis elegans, it does not include anatomical information. We have developed a tool for identifying enrichment of C. elegans tissues among gene sets and generated a website GUI where users can access this tool. Since a common drawback of ontology enrichment analyses is their verbosity, we developed a very simple filtering algorithm to reduce the ontology size by an order of magnitude. We adjusted these filters and validated our tool using a set of 30 gold standards from Expression Cluster data in WormBase. We show that our tool can discriminate between embryonic and larval tissues and can identify tissues down to the single-cell level. We used our tool to identify multiple neuronal tissues that are down-regulated due to pathogen infection in C. elegans. Our Tissue Enrichment Analysis (TEA) can be found within WormBase and can be downloaded using Python's standard pip installer. It tests a slimmed-down C. elegans tissue ontology for enrichment of specific terms and provides users with a text and graphic representation of the results.
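The standard statistic behind ontology enrichment tools of this kind is a hypergeometric tail probability; a sketch of that calculation (not TEA's exact implementation, which adds the ontology slimming described above):

```python
# P(observing >= k annotated genes in a query of size n, given K
# annotated among N total genes). All counts below are invented.
from scipy.stats import hypergeom

N, K = 20_000, 150        # genome size, genes annotated to a tissue
n, k = 300, 12            # query size, annotated genes in the query
p = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value: {p:.2e}")
```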
Myokit: A simple interface to cardiac cellular electrophysiology.
Clerx, Michael; Collins, Pieter; de Lange, Enno; Volders, Paul G A
2016-01-01
Myokit is a new, powerful and versatile software tool for modeling and simulation of cardiac cellular electrophysiology. Myokit consists of an easy-to-read modeling language, a graphical user interface, single- and multi-cell simulation engines and a library of advanced analysis tools accessible through a Python interface. Models can be loaded from Myokit's native file format or imported from CellML. Model export is provided to C, MATLAB, CellML, CUDA and OpenCL. Patch-clamp data can be imported and used to estimate model parameters. In this paper, we review existing tools for simulating the cardiac cellular action potential and find that current tools do not cater specifically to model development, and that there is a gap between easy-to-use but limited software and powerful tools that require strong programming skills from their users. We then describe Myokit's capabilities, focusing on its model description language, simulation engines and import/export facilities in detail. Using three examples, we show how Myokit can be used for clinically relevant investigations, multi-model testing and parameter estimation in Markov models, all with minimal programming effort from the user. In this way, Myokit bridges a gap between performance, versatility and user-friendliness. Copyright © 2015 Elsevier Ltd. All rights reserved.
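A minimal usage sketch based on Myokit's documented load-and-simulate pattern; treat the file name and the logged variable names as placeholders rather than a tested script.

```python
import myokit

# Load a model, pacing protocol, and embedded script from an .mmt file
model, protocol, script = myokit.load('example.mmt')   # placeholder file
sim = myokit.Simulation(model, protocol)
log = sim.run(1000)                    # 1000 ms of the action potential
print(log['engine.time'][:5], log['membrane.V'][:5])   # assumed names
```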
Revel8or: Model Driven Capacity Planning Tool Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Liu, Yan; Bui, Ngoc B.
2007-05-31
Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8or. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.
VARED: Verification and Analysis of Requirements and Early Designs
NASA Technical Reports Server (NTRS)
Badger, Julia; Throop, David; Claunch, Charles
2014-01-01
Requirements are a part of every project life cycle; everything going forward in a project depends on them. Good requirements are hard to write; there are few useful tools to test, verify, or check them; and it is difficult to properly marry them to the subsequent design, especially if the requirements are written in natural language. In fact, the inconsistencies and errors in requirements, along with the difficulty of finding these errors, contribute greatly to the cost of the testing and verification stage of flight software projects [1]. Large projects tend to have several thousand requirements written at various levels by different groups of people. The design process is distributed, and a lack of widely accepted standards for requirements often results in a product that varies widely in style and quality. A simple way to improve this would be to standardize the design process using a set of tools and widely accepted requirements design constraints. The difficulty with this approach is finding the appropriate constraints and tools. Common complaints about the available tools concern ease of use, functionality, and available features. Also, although preferable, it is rare that these tools are capable of testing the quality of the requirements.
Correction tool for Active Shape Model based lumbar muscle segmentation.
Valenzuela, Waldo; Ferguson, Stephen J; Ignasiak, Dominika; Diserens, Gaelle; Vermathen, Peter; Boesch, Chris; Reyes, Mauricio
2015-08-01
In the clinical environment, the accuracy and speed of the image segmentation process play a key role in the analysis of pathological regions. Despite advances in anatomic image segmentation, time-effective correction tools are commonly needed to improve segmentation results. Therefore, these tools must provide fast corrections with a low number of interactions, and a user-independent solution. In this work we present a new interactive method for correcting image segmentations. Given an initial segmentation and the original image, our tool provides a 2D/3D environment that enables 3D shape correction through simple 2D interactions. Our scheme is based on direct manipulation of free-form deformation adapted to a 2D environment. This approach enables an intuitive and natural correction of 3D segmentation results. The developed method has been implemented into a software tool and has been evaluated for the task of lumbar muscle segmentation from Magnetic Resonance Images. Experimental results show that full segmentation correction could be performed within an average correction time of 6±4 minutes and an average of 68±37 interactions, while maintaining the quality of the final segmentation result within an average Dice coefficient of 0.92±0.03.
PsychoPy--Psychophysics software in Python.
Peirce, Jonathan W
2007-05-15
The vast majority of studies into visual processing are conducted using computer display technology. The current paper describes a new free suite of software tools designed to make this task easier, using the latest advances in hardware and software. PsychoPy is a platform-independent experimental control system written in the Python interpreted language using entirely free libraries. PsychoPy scripts are designed to be extremely easy to read and write, while retaining complete power for the user to customize the stimuli and environment. Tools are provided within the package to allow everything from stimulus presentation and response collection (from a wide range of devices) to simple data analysis such as psychometric function fitting. Most importantly, PsychoPy is highly extensible and the whole system can evolve via user contributions. If a user wants to add support for a particular stimulus, analysis or hardware device they can look at the code for existing examples, modify them and submit the modifications back into the package so that the whole community benefits.
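A minimal PsychoPy script of the kind the package makes easy: draw a drifting grating until a key is pressed. The window and stimulus parameters are arbitrary illustrative choices.

```python
from psychopy import visual, core, event

win = visual.Window(size=(800, 600), units='deg', monitor='testMonitor')
grating = visual.GratingStim(win, sf=2.0, size=4.0, tex='sin')

while not event.getKeys():          # run until any keypress
    grating.phase += 0.01           # drift the grating each frame
    grating.draw()
    win.flip()

win.close()
core.quit()
```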
Selection by consequences, behavioral evolution, and the price equation.
Baum, William M
2017-05-01
Price's equation describes evolution across time in simple mathematical terms. Although it is not a theory, but a derived identity, it is useful as an analytical tool. It affords lucid descriptions of genetic evolution, cultural evolution, and behavioral evolution (often called "selection by consequences") at different levels (e.g., individual vs. group) and at different time scales (local and extended). The importance of the Price equation for behavior analysis lies in its ability to precisely restate selection by consequences, thereby restating, or even replacing, the law of effect. Beyond this, the equation may be useful whenever one regards ontogenetic behavioral change as evolutionary change, because it describes evolutionary change in abstract, general terms. As an analytical tool, the behavioral Price equation is an excellent aid in understanding how behavior changes within organisms' lifetimes. For example, it illuminates evolution of response rate, analyses of choice in concurrent schedules, negative contingencies, and dilemmas of self-control. © 2017 Society for the Experimental Analysis of Behavior.
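For reference, the identity in its standard two-term form (a restatement of the equation the abstract discusses, not a new result): the change in the mean trait value decomposes into a selection covariance plus a transmission term.

```latex
% w_i: fitness of entity i; z_i: its trait value; bars denote means.
\bar{w}\,\Delta\bar{z} \;=\; \operatorname{Cov}(w_i, z_i) \;+\; \operatorname{E}\!\left(w_i\,\Delta z_i\right)
```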
Analysis of Tube Hydroforming by means of an Inverse Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Johnson, Kenneth I.; Khaleel, Mohammad A.
2003-05-01
This paper presents a computational tool for the analysis of freely hydroformed tubes by means of an inverse approach. The formulation of the inverse method developed by Guo et al. is adopted and extended to tube hydroforming problems in which the initial geometry is a round tube submitted to hydraulic pressure and axial feed at the tube ends (end-feed). A simple criterion based on a forming limit diagram is used to predict the necking regions in the deformed workpiece. Although the developed computational tool is a stand-alone code, it has been linked to the Marc finite element code for meshing and visualization of results. The application of the inverse approach to tube hydroforming is illustrated through the analyses of aluminum alloy AA6061-T4 seamless tubes under free hydroforming conditions. The results obtained are in good agreement with those issued from a direct incremental approach. However, the computational time of the inverse procedure is much less than that of the incremental method.
Framework for SEM contour analysis
NASA Astrophysics Data System (ADS)
Schneider, L.; Farys, V.; Serret, E.; Fenouillet-Beranger, C.
2017-03-01
SEM images provide valuable information about patterning capability. Geometrical properties such as Critical Dimension (CD) can be extracted from them and are used to calibrate OPC models, thus making OPC more robust and reliable. However, there is currently a shortage of appropriate metrology tools to inspect complex two-dimensional patterns in the same way as one would work with simple one-dimensional patterns. In this article we present a full framework for the analysis of SEM images. It has been proven to be fast, reliable and robust for every type of structure, and particularly for two-dimensional structures. To achieve this result, several innovative solutions have been developed and will be presented in the following pages. Firstly, we will present a new noise filter which is used to reduce noise on SEM images, followed by an efficient topography identifier, and finally we will describe the use of a topological skeleton as a measurement tool that can extend CD measurements on all kinds of patterns.
PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.
Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco
2016-07-11
Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to face these scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) it provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise; (b) it provides tools to easily plot the results as interval estimates or flux distributions; (c) it is composed of simple functions that MATLAB users can apply in flexible ways; (d) it includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty; (e) it can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available toolbox that is able to perform Interval and Possibilistic MFA estimations.
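An illustrative Python analogue of an interval MFA estimate (the PFA Toolbox itself is MATLAB-based): with the balance S v = 0 and interval bounds on measured fluxes, the feasible range of any flux follows from two linear programs. The toy network and bounds below are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

S = np.array([[1.0, -1.0, -1.0]])                 # one node: v1 = v2 + v3
bounds = [(0.8, 1.2), (0.0, None), (0.3, 0.5)]    # measured v1, free v2, v3

def flux_range(j: int):
    """Min and max of flux j over the feasible set {S v = 0, bounds}."""
    c = np.zeros(3); c[j] = 1.0
    lo = linprog(c,  A_eq=S, b_eq=[0.0], bounds=bounds)
    hi = linprog(-c, A_eq=S, b_eq=[0.0], bounds=bounds)
    return lo.fun, -hi.fun

print(flux_range(1))      # interval estimate for the unmeasured v2
```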
Burridge-Knopoff Model as an Educational and Demonstrational Tool in Seismicity Prediction
NASA Astrophysics Data System (ADS)
Kato, M.
2007-12-01
While our effort is ongoing, the fact that predicting destructive earthquakes is not straightforward is hard to convey to the general public. Japan is prone to two types of destructive earthquakes: interplate events along the Japan Trench and Nankai Trough, and intraplate events that often occur beneath megacities. The periodicity of interplate earthquakes is usually explained by the elastic rebound theory, but we are aware that the historical seismicity along the Nankai Trough is not simply periodic. Inland intraplate events have geologically postulated recurrence intervals that are far longer than a human lifetime, and we do not have ample knowledge to model their behavior, which includes interaction among intraplate and interplate events. To demonstrate that the accumulation and release of elastic energy is complex even in a simple system, we propose to utilize the Burridge-Knopoff (BK) model as a demonstrational tool. This one-dimensional model is easy to construct and handle, making it an effective educational tool for classroom use as well. Our simulator is a simple realization of the original one-dimensional BK model, consisting of small blocks, springs and a motor. Accumulation and release of strain is visibly observable, and by guessing when the next large events occur we are able to learn intuitively that observation of strain accumulation is only one element in predicting large events. Quantitative analysis of the system is also possible by measuring the movement of blocks. While the long-term average of strain energy is controlled by the loading rate, the observed seismicity is neither time-predictable nor slip-predictable. The time between successive events is never constant. The distribution of released energy obeys a power law, similar to the Ishimoto-Iida and Gutenberg-Richter laws. This tool is also useful in demonstrating the nonlinear behavior of a complex system.
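A minimal quantitative sketch of such a spring-block simulation, with arbitrary classroom-scale parameters and a deliberately crude slip rule (a single-pass relaxation rather than full stick-slip dynamics):

```python
import numpy as np

N, k_c, k_p, f_s, f_d, v = 10, 1.0, 0.2, 1.0, 0.5, 1e-3
x = np.zeros(N)                        # block positions
events = []

for step in range(200_000):
    load = v * step                    # slow motor loading
    f = k_p * (load - x)               # loading-spring force
    f[:-1] += k_c * (x[1:] - x[:-1])   # coupling to right neighbour
    f[1:]  += k_c * (x[:-1] - x[1:])   # coupling to left neighbour
    slipping = np.abs(f) > f_s         # static friction exceeded
    if slipping.any():
        # crude single-pass relaxation toward the dynamic friction level
        x[slipping] += (f[slipping] - np.sign(f[slipping]) * f_d) / (k_p + 2 * k_c)
        events.append((step, slipping.sum()))

sizes = np.array([s for _, s in events])
print(f"{len(events)} events, largest involved {sizes.max()} blocks")
```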
Rule Systems for Runtime Verification: A Short Tutorial
NASA Astrophysics Data System (ADS)
Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex
In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA's next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
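To fix ideas, here is a toy rule-based trace monitor in the spirit of LogScope, though not in its syntax: the rule checks that every COMMAND is followed by a matching SUCCESS before the trace ends. The event names are invented for the example.

```python
def monitor(trace):
    """Check 'every COMMAND is eventually answered by SUCCESS'."""
    pending = set()
    for event, arg in trace:
        if event == "COMMAND":
            pending.add(arg)
        elif event == "SUCCESS":
            pending.discard(arg)
        elif event == "END" and pending:
            return f"violation: no SUCCESS for {sorted(pending)}"
    return "trace accepted"

log = [("COMMAND", "PICT"), ("SUCCESS", "PICT"),
       ("COMMAND", "DRIVE"), ("END", None)]
print(monitor(log))        # violation: no SUCCESS for ['DRIVE']
```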
Saliva as a diagnostic fluid. Literature review
Mancheño-Franch, Aisha; Marzal-Gamarra, Cristina; Carlos-Fabuel, Laura
2012-01-01
There is a growing interest in diagnosis based on the analysis of saliva. This is a simple, non-invasive method of obtaining oral samples which is safe for both the health worker and the patient, not to mention allowing for simple and cost-efficient storage. The majority of studies use whole saliva samples: complex fluids containing both local and systemic sources, whose composition corresponds to that of the blood. Whole saliva contains a considerable amount of desquamated epithelial cells, microorganisms and remnants of food and drink; it is essential to cleanse and refine the saliva samples to remove any external elements. Immediate processing of the sample is recommended in order to avoid decomposition; where this is not possible, the sample may be stored at -80°C. Salivary analysis, much the same as blood analysis, aims to detect various medications or indicators of certain diseases while providing a relatively simple tool for both early diagnosis and the monitoring of various irregularities. The practicalities of salivary analysis have been studied in fields such as: viral and bacterial infections, autoimmune diseases (like Sjögren's syndrome and coeliac disease), endocrinopathies (such as Cushing's syndrome), oncology (early diagnosis of breast, lung and stomach carcinoma and oral squamous cell carcinoma), stress assessment, medication detection and forensic science, among others. It is hoped that salivary analysis, with the help of current technological advances, will be valued much more highly in the near future. There still remain contradictory results with respect to analytic markers, which is why further studies on wider-ranging samples are fundamental to prove its viability. Key words: Saliva, biomarkers, early diagnosis. PMID:24558562
NASA Astrophysics Data System (ADS)
Hart, D. M.; Merchant, B. J.; Abbott, R. E.
2012-12-01
The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data were collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude, and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.
Fisher, Rohan; Lassa, Jonatan
2017-04-18
Modelling travel time to services has become a common public health tool for planning service provision, but the usefulness of these analyses is constrained by the availability of accurate input data and limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world, where access to basic data is limited and travel is often complex and multi-modal. Improving the accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple, open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more widespread use of GIS analysis of service access and to allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context, but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process of conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition, this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modelled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and responsive nature of the applications described has the potential to allow complex environmental, social and political considerations to be incorporated and visualised. Through supporting evidence-based planning, the innovative modelling practices described have the potential to help local health and emergency response planning in the developing world.
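A hedged sketch of the cost-distance step underlying such travel-time models, using scikit-image's minimum-cost-path solver over a friction surface; the grid values are minutes-per-cell and purely illustrative.

```python
import numpy as np
from skimage.graph import MCP_Geometric

friction = np.array([[1, 1, 4, 4],
                     [1, 2, 4, 1],
                     [1, 1, 1, 1]], dtype=float)   # minutes per cell

mcp = MCP_Geometric(friction)
costs, _ = mcp.find_costs(starts=[(0, 0)])         # clinic at (0, 0)
print(np.round(costs, 1))   # accumulated travel time to every cell
```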
NASA Technical Reports Server (NTRS)
Chwalowski, Pawel; Samareh, Jamshid A.; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.
2009-01-01
The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, this approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use medium- to high-fidelity tools early in the design stages, when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts. Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules, as well as examples for a telescoping wing, a folding wing, and a bat-like wing. The paper also includes the verification of a medium-fidelity aerodynamic tool used for the aerodynamic database generation with a steady and unsteady high-fidelity CFD analysis tool for a folding wing example.
High-Performance Data Analysis Tools for Sun-Earth Connection Missions
NASA Technical Reports Server (NTRS)
Messmer, Peter
2011-01-01
The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.
Rocket Engine Oscillation Diagnostics
NASA Technical Reports Server (NTRS)
Nesman, Tom; Turner, James E. (Technical Monitor)
2002-01-01
Rocket engine oscillating data can reveal many physical phenomena ranging from unsteady flow and acoustics to rotordynamics and structural dynamics. Because of this, engine diagnostics based on oscillation data should employ both signal analysis and physical modeling. This paper describes an approach to rocket engine oscillation diagnostics, types of problems encountered, and example problems solved. Determination of design guidelines and environments (or loads) from oscillating phenomena is required during initial stages of rocket engine design, while the additional tasks of health monitoring, incipient failure detection, and anomaly diagnostics occur during engine development and operation. Oscillations in rocket engines are typically related to flow driven acoustics, flow excited structures, or rotational forces. Additional sources of oscillatory energy are combustion and cavitation. Included in the example problems is a sampling of signal analysis tools employed in diagnostics. The rocket engine hardware includes combustion devices, valves, turbopumps, and ducts. Simple models of an oscillating fluid system or structure can be constructed to estimate pertinent dynamic parameters governing the unsteady behavior of engine systems or components. In the example problems it is shown that simple physical modeling when combined with signal analysis can be successfully employed to diagnose complex rocket engine oscillatory phenomena.
Li, Xinan; Xu, Hongyuan; Cheung, Jeffrey T
2016-12-01
This work describes a new approach for gait analysis and balance measurement. It uses an inertial measurement unit (IMU) that can either be embedded inside a dynamically unstable platform for balance measurement or mounted on the lower back of a human participant for gait analysis. The acceleration data along three Cartesian coordinates are analyzed by the gait-force model to extract biomechanical information in both the dynamic state, as in the gait analyzer, and the steady state, as in the balance scale. For the gait analyzer, the simple, noninvasive and versatile approach makes it appealing to a broad range of applications in clinical diagnosis, rehabilitation monitoring, athletic training, sport-apparel design, and many other areas. For the balance scale, it provides a portable platform to measure postural deviation and the balance index under visual or vestibular sensory input conditions. Despite its simple construction and operation, excellent agreement has been demonstrated between its performance and that of a high-cost commercial balance unit over a wide dynamic range. The portable balance scale is an ideal tool for routine monitoring of the balance index, fall-risk assessment, and other balance-related health issues for both clinical and household use.
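One elementary gait measure recoverable from trunk acceleration, cadence from peaks in the vertical axis, can be sketched as follows; the signal is synthetic, and the gait-force model in the paper extracts far more than this.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                    # Hz, assumed IMU rate
t = np.arange(0, 10, 1 / fs)
# gravity plus a 1.8 Hz step oscillation plus sensor noise (synthetic)
az = 9.81 + np.sin(2 * np.pi * 1.8 * t) + 0.2 * np.random.randn(t.size)

peaks, _ = find_peaks(az, height=10.2, distance=int(0.4 * fs))
cadence = 60.0 * len(peaks) / t[-1]
print(f"cadence ~ {cadence:.0f} steps/min")   # ~108 for this signal
```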
External validation of a simple clinical tool used to predict falls in people with Parkinson disease
Duncan, Ryan P.; Cavanaugh, James T.; Earhart, Gammon M.; Ellis, Terry D.; Ford, Matthew P.; Foreman, K. Bo; Leddy, Abigail L.; Paul, Serene S.; Canning, Colleen G.; Thackeray, Anne; Dibble, Leland E.
2015-01-01
Background: Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time consuming component of patient care. Recently a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. Methods: We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. Results: The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76-0.89), comparable to the developmental study. Conclusion: The results validated the utility of the tool for allowing clinicians to quickly and accurately identify an individual's risk of an impending fall. PMID:26003412
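A schematic of the three-predictor screen follows; the thresholds come from the abstract, but the simple count of positive predictors is a placeholder for the published model, which maps the predictors to fall probabilities.

```python
def fall_risk_flags(fell_last_year: bool, fog_last_month: bool,
                    gait_velocity_ms: float) -> int:
    """Count of positive predictors (0-3); not the published scoring."""
    return sum([fell_last_year, fog_last_month, gait_velocity_ms < 1.1])

print(fall_risk_flags(True, False, 1.05))   # 2 of 3 predictors positive
```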
Direct Simple Shear Test Data Analysis using Jupyter Notebooks on DesignSafe-CI
NASA Astrophysics Data System (ADS)
Eslami, M.; Esteva, M.; Brandenberg, S. J.
2017-12-01
Due to the large number of files and their complex structure, managing data generated during natural hazards experiments requires scalable and specialized tools. DesignSafe-CI (https://www.designsafe-ci.org/) is a web-based research platform that provides computational tools to analyze, curate, and publish critical data for natural hazards research, making it understandable and reusable. We present a use case from a series of Direct Simple Shear (DSS) experiments in which we used DS-CI to post-process, visualize, publish, and enable further analysis of the data. Current practice in geotechnical design against earthquakes relies on the soil's plasticity index (PI) to assess liquefaction susceptibility and cyclic softening triggering procedures, although quite divergent recommendations on the required level of plasticity can be found in the literature for these purposes. A series of cyclic and monotonic direct simple shear experiments was conducted on three low-plasticity fine-grained mixtures at the same plasticity index to examine the effectiveness of the PI in characterizing these types of materials. Results revealed that the plasticity index is an insufficient indicator of the cyclic behavior of low-plasticity fine-grained soils, and that corrections for pore fluid chemistry and clay mineralogy may be necessary in future liquefaction susceptibility and cyclic softening assessment procedures. Each monotonic or cyclic experiment contains two stages, consolidation and shear, which include time series of load, displacement, and corresponding stresses and strains, as well as equivalent excess pore-water pressure. Using the DS-CI curation pipeline, we categorized the data to display and describe the experiment's structure and the files corresponding to each stage of the experiments. Two separate notebooks in Python 3 were created using the Jupyter application available in DS-CI. A data plotter aids in visualizing the experimental data in relation to the sensor from which it was generated. The analysis notebook allows combining outcomes of multiple tests, conducting diverse analyses to find critical parameters, and developing plots at arbitrary strain levels. Using the platform aids both researchers working with the data and those reusing it.
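A notebook cell in the spirit of the analysis described might look as follows. The file name and column names are hypothetical; actual DesignSafe-CI datasets define their own schema.

```python
# Illustrative Jupyter cell: shear-stage time series and excess pore-pressure
# ratio from a (hypothetical) cyclic DSS export.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("dss_cyclic_shear.csv")                 # shear-stage records
df["ru"] = df["excess_pwp_kpa"] / df["sigma_vc_kpa"]     # excess pore-pressure ratio

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3.5))
ax1.plot(df["shear_strain_pct"], df["shear_stress_kpa"])
ax1.set(xlabel="shear strain (%)", ylabel="shear stress (kPa)")
ax2.plot(df["cycle"], df["ru"])
ax2.set(xlabel="cycle number", ylabel="r_u")
plt.tight_layout()
plt.show()
```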
Laplace Transform Based Radiative Transfer Studies
NASA Astrophysics Data System (ADS)
Hu, Y.; Lin, B.; Ng, T.; Yang, P.; Wiscombe, W.; Herath, J.; Duffy, D.
2006-12-01
Multiple scattering is the major uncertainty for data analysis of space-based lidar measurements. Until now, accurate quantitative lidar data analysis has been limited to very thin objects that are dominated by single scattering, where photons from the laser beam scatter only once with particles in the atmosphere before reaching the receiver, and a simple linear relationship between physical properties and the lidar signal exists. In reality, multiple scattering is always a factor in space-based lidar measurements, and it dominates space-based lidar returns from clouds, dust aerosols, vegetation canopy, and phytoplankton. While multiple scattering returns are clear signals, the lack of a fast-enough lidar multiple scattering computation tool forces us to treat them as unwanted "noise" and use simple multiple scattering correction schemes to remove them. Such treatment wastes the multiple scattering signals and may cause orders-of-magnitude errors in retrieved physical properties. Thus the lack of fast and accurate time-dependent radiative transfer tools significantly limits lidar remote sensing capabilities. Analyzing lidar multiple scattering signals requires fast and accurate time-dependent radiative transfer computations. Currently, multiple scattering is computed with Monte Carlo simulations, which take minutes to hours, are too slow for interactive satellite data analysis processes, and can only be used to help system/algorithm design and error assessment. We present an innovative physics approach to solve the time-dependent radiative transfer problem. The technique utilizes FPGA-based reconfigurable computing hardware. The approach is as follows. 1. Physics solution: perform a Laplace transform on the time and spatial dimensions and a Fourier transform on the viewing azimuth dimension, converting the solution of the radiative transfer differential equation into a fast matrix inversion problem. The majority of the radiative transfer computation then goes to matrix inversion, FFT, and inverse Laplace transforms. 2. Hardware solution: perform the well-defined matrix inversions, FFTs, and Laplace transforms on highly parallel, reconfigurable computing hardware. This physics-based computational tool leads to accurate quantitative analysis of space-based lidar signals and improves data quality of current lidar missions such as CALIPSO. This presentation will introduce the basic idea of this approach, preliminary results based on SRC's FPGA-based Mapstation, and how we may apply it to CALIPSO data analysis.
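The core idea, that a Laplace transform turns the time-dependent problem into a family of matrix inversions, can be illustrated on a toy linear system. The sketch below solves y'(t) = Ay(t) by inverting (sI - A) and applying a fixed-Talbot numerical inverse Laplace transform (after Abate and Valko), then checks the result against the time-domain matrix exponential. The 2x2 operator is an arbitrary stand-in, not a radiative transfer matrix.

```python
# Toy illustration: Laplace transform converts y' = A y into the matrix
# inversion Y(s) = (sI - A)^{-1} y(0), evaluated at contour points and
# inverted numerically with the fixed-Talbot rule.
import numpy as np
from scipy.linalg import expm

def talbot_invert(F, t, M=32):
    """Numerically invert a vector-valued Laplace transform F(s) at time t."""
    r = 2.0 * M / (5.0 * t)
    acc = 0.5 * np.exp(r * t) * F(r)          # k = 0 term (real axis)
    for k in range(1, M):
        th = k * np.pi / M
        cot = np.cos(th) / np.sin(th)
        s = r * th * (cot + 1j)               # Talbot contour point
        sigma = th + (th * cot - 1.0) * cot
        acc = acc + np.exp(t * s) * F(s) * (1.0 + 1j * sigma)
    return (r / M) * acc.real

A = np.array([[-2.0, 1.0], [0.5, -1.0]])      # toy "transfer" operator
y0 = np.array([1.0, 0.0])
F = lambda s: np.linalg.solve(s * np.eye(2) - A, y0 + 0j)

t = 1.5
print(talbot_invert(F, t))   # matrix-inversion route
print(expm(A * t) @ y0)      # reference time-domain solution
```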
CANEapp: a user-friendly application for automated next generation transcriptomic data analysis.
Velmeshev, Dmitry; Lally, Patrick; Magistri, Marco; Faghihi, Mohammad Ali
2016-01-13
Next generation sequencing (NGS) technologies are indispensable for molecular biology research, but data analysis represents the bottleneck in their application. Users need to be familiar with computer terminal commands, the Linux environment, and various software tools and scripts. Analysis workflows have to be optimized and experimentally validated to extract biologically meaningful data. Moreover, as larger datasets are being generated, their analysis requires use of high-performance servers. To address these needs, we developed CANEapp (application for Comprehensive automated Analysis of Next-generation sequencing Experiments), a unique suite that combines a Graphical User Interface (GUI) and an automated server-side analysis pipeline that is platform-independent, making it suitable for any server architecture. The GUI runs on a PC or Mac and seamlessly connects to the server to provide full GUI control of RNA-sequencing (RNA-seq) project analysis. The server-side analysis pipeline contains a framework that is implemented on a Linux server through completely automated installation of software components and reference files. Analysis with CANEapp is also fully automated and performs differential gene expression analysis and novel noncoding RNA discovery through alternative workflows (Cuffdiff and the R packages edgeR and DESeq2). We compared CANEapp to other similar tools, and it significantly improves on previous developments. We experimentally validated CANEapp's performance by applying it to data derived from different experimental paradigms and confirming the results with quantitative real-time PCR (qRT-PCR). CANEapp adapts to any server architecture by effectively using available resources and thus handles large amounts of data efficiently. CANEapp performance has been experimentally validated on various biological datasets. CANEapp is available free of charge at http://psychiatry.med.miami.edu/research/laboratory-of-translational-rna-genomics/CANE-app . We believe that CANEapp will serve both biologists with no computational experience and bioinformaticians as a simple, time-saving but accurate and powerful tool to analyze large RNA-seq datasets and will provide foundations for future development of integrated and automated high-throughput genomics data analysis tools. Due to its inherently standardized pipeline and combination of automated analysis and platform independence, CANEapp is ideal for large-scale collaborative RNA-seq projects between different institutions and research groups.
Value-added Data Services at the Goddard Earth Sciences Data and Information Services Center
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory G.; Alcott, Gary T.; Kempler, Steven J.; Lynnes, Christopher S.; Vollmer, Bruce E.
2004-01-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC), in addition to serving the Earth Science community as one of the major Distributed Active Archive Centers (DAACs), provides much more than just data. Among the value-added services available to general users are subsetting data spatially and/or by parameter, online analysis (to avoid unnecessarily downloading all the data), and assistance in obtaining data from other centers. Services available to data producers and high-volume users include consulting on building new products with standard formats and metadata and construction of data management systems. A particularly useful service is data processing at the DISC (i.e., close to the input data) with the user's algorithm. This can take a number of different forms: as a configuration-managed algorithm within the main processing stream; as a stand-alone program next to the on-line data storage; as build-it-yourself code within the Near-Archive Data Mining (NADM) system; or as an on-the-fly analysis with simple algorithms embedded into the web-based tools. Partnerships between the GES DISC and scientists, both producers and users, allow the scientists to concentrate on science, while the GES DISC handles the data management, e.g., formats, integration, and data processing. The existing data management infrastructure at the GES DISC supports a wide spectrum of options: from simple data support to sophisticated on-line analysis tools, producing economies of scale and rapid time-to-deploy. At the same time, such partnerships allow the GES DISC to serve the user community more efficiently and to better prioritize on-line holdings. Several examples of successful partnerships are described in the presentation.
Benefits and Pitfalls: Simple Guidelines for the Use of Social Networking Tools in K-12 Education
ERIC Educational Resources Information Center
Huffman, Stephanie
2013-01-01
The article will outline a framework for the use of social networking tools in K-12 education framed around four thought-provoking questions: 1) what are the benefits and pitfalls of using social networking tools in P-12 education, 2) how do we plan effectively for the use of social networking tools, 3) what role does professional development play…
NASA Astrophysics Data System (ADS)
Warren, M. A.; Goult, S.; Clewley, D.
2018-06-01
Advances in technology allow remotely sensed data to be acquired with increasingly higher spatial and spectral resolutions. These data may then be used to influence government decision-making and solve a number of research and application driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. Often the software required to process data is varied and can be highly technical and too advanced for the novice user to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain that allows users accessing the tool over a web interface to submit jobs and process data remotely. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free and open-source tools to take radiometrically corrected data from sensor geometry into geocorrected form and to generate simple or complex band ratio products. The final processed data products are acquired via an HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network using a simple-to-use web interface.
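A band-ratio product of the kind SCOPS generates can be sketched in a few lines of numpy. The band indices and the normalised-difference form below are assumptions for illustration; a real run would select wavelengths from the sensor's band list.

```python
# Normalised band-ratio product over a (bands, rows, cols) reflectance cube.
import numpy as np

def normalised_ratio(cube, band_a, band_b):
    """Returns (a - b) / (a + b) per pixel, NaN where the sum is zero."""
    a = cube[band_a].astype(float)
    b = cube[band_b].astype(float)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(a + b != 0, (a - b) / (a + b), np.nan)

cube = np.random.rand(100, 64, 64)          # stand-in for a geocorrected cube
ndvi_like = normalised_ratio(cube, 80, 50)  # e.g. NIR vs. red band indices
print(ndvi_like.shape, np.nanmean(ndvi_like))
```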
Molgenis-impute: imputation pipeline in a box.
Kanterakis, Alexandros; Deelen, Patrick; van Dijk, Freerk; Byelas, Heorhiy; Dijkstra, Martijn; Swertz, Morris A
2015-08-19
Genotype imputation is an important procedure in current genomic analyses such as genome-wide association studies, meta-analyses and fine mapping. Although high-quality tools are available that perform the steps of this process, considerable effort and expertise are required to set up and run a best-practice imputation pipeline, particularly for larger genotype datasets, where imputation has to scale out in parallel on computer clusters. Here we present MOLGENIS-impute, an 'imputation in a box' solution that seamlessly and transparently automates the setup and running of all the steps of the imputation process. These steps include genome build liftover, genotype phasing with SHAPEIT2, quality control, sample and chromosomal chunking/merging, and imputation with IMPUTE2. MOLGENIS-impute builds on MOLGENIS-compute, a simple pipeline management platform for submission and monitoring of bioinformatics tasks in High Performance Computing (HPC) environments like local/cloud servers, clusters and grids. All the required tools, data and scripts are downloaded and installed in a single step. Researchers with diverse backgrounds and expertise have tested MOLGENIS-impute in different locations and have imputed over 30,000 samples so far using the 1,000 Genomes Project and new Genome of the Netherlands data as the imputation reference. The tests have been performed on PBS/SGE clusters, cloud VMs and in a grid HPC environment. MOLGENIS-impute gives priority to the ease of setting up, configuring and running an imputation. It has minimal dependencies and wraps the pipeline in a simple command-line interface, without sacrificing the flexibility to adapt or limiting the options of the underlying imputation tools. It does not require knowledge of a workflow system or programming, and is targeted at researchers who just want to apply best practices in imputation via simple commands. It is built on the MOLGENIS compute workflow framework to enable customization with additional computational steps, or it can be included in other bioinformatics pipelines. It is available as open source from: https://github.com/molgenis/molgenis-imputation.
A numerical tool for reproducing driver behaviour: experiments and predictive simulations.
Casucci, M; Marchitto, M; Cacciabue, P C
2010-03-01
This paper presents the simulation tool called SDDRIVE (Simple Simulation of Driver performance), which is the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a full-scale virtual reality simulator to validate the simulation. The predictive potential of the tool is then shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes in similar traffic conditions.
Diagnosis of Late-Stage, Early-Onset, Small-Fiber Polyneuropathy
2016-10-01
develop biotechnology tools for simple diagnosis (sweat testing and pupilometry), 3) identify gene polymorphisms to detect risk for SFPN. None...Goal 4) Specific Aim 2: To develop and evaluate simple biotechnology devices for diagnosing and monitoring longstanding eoSFPN based on
Analyzing the dynamics of cell cycle processes from fixed samples through ergodic principles
Wheeler, Richard John
2015-01-01
Tools to analyze cyclical cellular processes, particularly the cell cycle, are of broad value for cell biology. Cell cycle synchronization and live-cell time-lapse observation are widely used to analyze these processes but are not available for many systems. Simple mathematical methods built on the ergodic principle are a well-established, widely applicable, and powerful alternative analysis approach, although they are less widely used. These methods extract data about the dynamics of a cyclical process from a single time-point “snapshot” of a population of cells progressing through the cycle asynchronously. Here, I demonstrate application of these simple mathematical methods to analysis of basic cyclical processes—cycles including a division event, cell populations undergoing unicellular aging, and cell cycles with multiple fission (schizogony)—as well as recent advances that allow detailed mapping of the cell cycle from continuously changing properties of the cell such as size and DNA content. This includes examples using existing data from mammalian, yeast, and unicellular eukaryotic parasite cell biology. Through the ongoing advances in high-throughput cell analysis by light microscopy, electron microscopy, and flow cytometry, these mathematical methods are becoming ever more important and are a powerful complementary method to traditional synchronization and time-lapse cell cycle analysis methods. PMID:26543196
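The central trick is the rank-to-time mapping. For a steady-state population that doubles by division, the age density is phi(a) = (2 ln 2 / T) 2^(-a/T), so a cell at quantile f of an ordered progression marker sits at cycle-time fraction a/T = -log2(1 - f/2). A minimal sketch, with cell size as an assumed monotonic marker:

```python
# Ergodic rank-to-time mapping: estimate each cell's position in the cycle
# (as a fraction of cycle length T) from a single-snapshot marker ranking.
import numpy as np

def cycle_time_fraction(marker):
    """marker: 1-D array of a monotonically increasing cycle marker."""
    order = np.argsort(marker)
    f = (np.arange(len(marker)) + 0.5) / len(marker)   # quantile of each rank
    frac = -np.log2(1.0 - f / 2.0)                     # inverse of the age CDF
    out = np.empty_like(frac)
    out[order] = frac                                  # map back to input order
    return out

sizes = np.random.lognormal(mean=0.0, sigma=0.2, size=1000)
print(cycle_time_fraction(sizes)[:5])
```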
Exploring the Dynamics of Cell Processes through Simulations of Fluorescence Microscopy Experiments
Angiolini, Juan; Plachta, Nicolas; Mocskos, Esteban; Levi, Valeria
2015-01-01
Fluorescence correlation spectroscopy (FCS) methods are powerful tools for unveiling the dynamical organization of cells. For simple cases, such as molecules passively moving in a homogeneous media, FCS analysis yields analytical functions that can be fitted to the experimental data to recover the phenomenological rate parameters. Unfortunately, many dynamical processes in cells do not follow these simple models, and in many instances it is not possible to obtain an analytical function through a theoretical analysis of a more complex model. In such cases, experimental analysis can be combined with Monte Carlo simulations to aid in interpretation of the data. In response to this need, we developed a method called FERNET (Fluorescence Emission Recipes and Numerical routines Toolkit) based on Monte Carlo simulations and the MCell-Blender platform, which was designed to treat the reaction-diffusion problem under realistic scenarios. This method enables us to set complex geometries of the simulation space, distribute molecules among different compartments, and define interspecies reactions with selected kinetic constants, diffusion coefficients, and species brightness. We apply this method to simulate single- and multiple-point FCS, photon-counting histogram analysis, raster image correlation spectroscopy, and two-color fluorescence cross-correlation spectroscopy. We believe that this new program could be very useful for predicting and understanding the output of fluorescence microscopy experiments. PMID:26039162
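A stripped-down Monte Carlo FCS simulation conveys the flavor of such toolkits: particles diffuse freely, a Gaussian detection profile converts positions to intensity, and the intensity autocorrelation G(tau) is computed. All parameter values are arbitrary toy choices, and none of FERNET's geometry or reaction machinery is reproduced.

```python
# Minimal Monte Carlo FCS sketch: free 2-D diffusion, Gaussian detection
# volume, and the normalised intensity autocorrelation.
import numpy as np

rng = np.random.default_rng(1)
n, steps, dt, D, w, box = 50, 20000, 1e-4, 1.0, 0.3, 4.0
pos = rng.uniform(-box / 2, box / 2, size=(n, 2))
intensity = np.empty(steps)
for t in range(steps):
    pos += rng.normal(scale=np.sqrt(2 * D * dt), size=pos.shape)
    pos = (pos + box / 2) % box - box / 2          # periodic boundaries
    intensity[t] = np.exp(-2 * (pos ** 2).sum(axis=1) / w ** 2).sum()

dI = intensity - intensity.mean()
acf = np.correlate(dI, dI, mode="full")[steps - 1:] / steps
G = acf / intensity.mean() ** 2                    # normalised G(tau)
print(G[:5])
```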
An R package for the integrated analysis of metabolomics and spectral data.
Costa, Christopher; Maraschin, Marcelo; Rocha, Miguel
2016-06-01
Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques such as nuclear magnetic resonance, gas or liquid chromatography, mass spectrometry, and infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks such as biological and biomedical discovery, biotechnology and drug development. However, as happens with other omics data, the analysis of metabolomics datasets poses multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, of the available software tools, none addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple-to-use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Improved motors for utility applications: Volume 6, Squirrel-cage rotor analysis: Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffith, J.W.; McCoy, R.M.
1986-11-01
An analysis of squirrel cage induction motor rotors was undertaken in response to an Industry Assessment Study finding 10% of motor failures to be rotor related. The analysis focuses on evaluating rotor design life. The evaluation combines state-of-the-art electromagnetic, thermal, and structural solution techniques into an integrated analysis and presents a simple summary. Finite element techniques are central tools in the analysis. The analysis is applied to a specific forced-draft fan drive design. Fans as a category of application have a higher failure rate than other categories of power station auxiliary motor applications. Forced-draft fan drives are one of the major fan drives which accelerate a relatively high value of rotor load inertia. Various starting and operating conditions are studied for this forced-draft fan drive motor, including a representative application duty cycle.
NASA Astrophysics Data System (ADS)
Zuhdi, Ubaidillah
2014-04-01
The purpose of this study is to obtain another perspective on the role of Information and Communication Technology (ICT) sectors in the national economy of Indonesia. The period of analysis is 1990-2005. This study employs Input-Output (IO) analysis as the analytical tool; more specifically, it uses the simple output multipliers method to achieve this purpose. A comparison with a previous study is conducted to that end. The previous study, using Structural Decomposition Analysis (SDA), showed that ICT sectors did not have an important role in the Indonesian national economy in the above period. Similar results appear in this study. In other words, this study does not find another perspective on the role of these sectors in the Indonesian national economy in the analysis period.
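The simple output multiplier of sector j is the column sum of the Leontief inverse L = (I - A)^(-1). A toy three-sector sketch (the coefficient matrix is invented, not Indonesian data):

```python
# Simple output multipliers from a toy 3-sector input-output table.
import numpy as np

A = np.array([[0.10, 0.05, 0.02],    # technical coefficients a_ij:
              [0.20, 0.15, 0.10],    # input from sector i per unit
              [0.05, 0.10, 0.08]])   # of output of sector j
L = np.linalg.inv(np.eye(3) - A)     # Leontief inverse
simple_output_multipliers = L.sum(axis=0)
print(simple_output_multipliers)     # one multiplier per sector
```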
Simple Nutrition Screening Tool for Pediatric Inpatients.
White, Melinda; Lawson, Karen; Ramsey, Rebecca; Dennis, Nicole; Hutchinson, Zoe; Soh, Xin Ying; Matsuyama, Misa; Doolan, Annabel; Todd, Alwyn; Elliott, Aoife; Bell, Kristie; Littlewood, Robyn
2016-03-01
Pediatric nutrition risk screening tools are not routinely implemented throughout many hospitals, despite prevalence studies demonstrating malnutrition is common in hospitalized children. Existing tools lack the simplicity of those used to assess nutrition risk in the adult population. This study reports the accuracy of a new, quick, and simple pediatric nutrition screening tool (PNST) designed to be used for pediatric inpatients. The pediatric Subjective Global Nutrition Assessment (SGNA) and anthropometric measures were used to develop and assess the validity of 4 simple nutrition screening questions comprising the PNST. Participants were pediatric inpatients in 2 tertiary pediatric hospitals and 1 regional hospital. Two affirmative answers to the PNST questions were found to maximize the specificity and sensitivity to the pediatric SGNA and body mass index (BMI) z scores for malnutrition in 295 patients. The PNST identified 37.6% of patients as being at nutrition risk, whereas the pediatric SGNA identified 34.2%. The sensitivity and specificity of the PNST compared with the pediatric SGNA were 77.8% and 82.1%, respectively. The sensitivity of the PNST at detecting patients with a BMI z score of less than -2 was 89.3%, and the specificity was 66.2%. Both the PNST and pediatric SGNA were relatively poor at detecting patients who were stunted or overweight, with the sensitivity and specificity being less than 69%. The PNST provides a sensitive, valid, and simpler alternative to existing pediatric nutrition screening tools such as Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP), Screening Tool Risk on Nutritional status and Growth (STRONGkids), and Paediatric Yorkhill Malnutrition Score (PYMS) to ensure the early detection of hospitalized children at nutrition risk. © 2014 American Society for Parenteral and Enteral Nutrition.
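Validation of such a screen reduces to cross-tabulating the cutoff (two affirmative answers) against a reference standard. A sketch with simulated answers standing in for real PNST and SGNA data:

```python
# Screening rule and validation metrics. The two-affirmative cutoff is as
# described; the answer and reference data below are simulated placeholders.
import numpy as np

rng = np.random.default_rng(2)
answers = rng.integers(0, 2, size=(295, 4))        # 4 yes/no screening items
at_risk_screen = answers.sum(axis=1) >= 2          # two affirmative answers
at_risk_truth = rng.integers(0, 2, size=295) == 1  # stand-in reference standard

tp = np.sum(at_risk_screen & at_risk_truth)
fn = np.sum(~at_risk_screen & at_risk_truth)
fp = np.sum(at_risk_screen & ~at_risk_truth)
tn = np.sum(~at_risk_screen & ~at_risk_truth)
print("sensitivity", tp / (tp + fn), "specificity", tn / (tn + fp))
```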
Simple proteomics data analysis in the object-oriented PowerShell.
Mohammed, Yassene; Palmblad, Magnus
2013-01-01
Scripting languages such as Perl and Python are appreciated for solving simple, everyday tasks in bioinformatics. A more recent, object-oriented command shell and scripting language, Windows PowerShell, has many attractive features: an object-oriented interactive command line, fluent navigation and manipulation of XML files, ability to consume Web services from the command line, consistent syntax and grammar, rich regular expressions, and advanced output formatting. The key difference between classical command shells and scripting languages, such as bash, and object-oriented ones, such as PowerShell, is that in the latter the result of a command is a structured object with inherited properties and methods rather than a simple stream of characters. Conveniently, PowerShell is included in all new releases of Microsoft Windows and therefore already installed on most computers in classrooms and teaching labs. In this chapter we demonstrate how PowerShell in particular allows easy interaction with mass spectrometry data in XML formats, connection to Web services for tools such as BLAST, and presentation of results as formatted text or graphics. These features make PowerShell much more than "yet another scripting language."
Community-driven computational biology with Debian Linux.
Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles
2010-12-21
The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.
Proposal of a micromagnetic standard problem for ferromagnetic resonance simulations
NASA Astrophysics Data System (ADS)
Baker, Alexander; Beg, Marijan; Ashton, Gregory; Albert, Maximilian; Chernyshenko, Dmitri; Wang, Weiwei; Zhang, Shilei; Bisotti, Marc-Antonio; Franchin, Matteo; Hu, Chun Lian; Stamps, Robert; Hesjedal, Thorsten; Fangohr, Hans
2017-01-01
Nowadays, micromagnetic simulations are a common tool for studying a wide range of different magnetic phenomena, including the ferromagnetic resonance. A technique for evaluating reliability and validity of different micromagnetic simulation tools is the simulation of proposed standard problems. We propose a new standard problem by providing a detailed specification and analysis of a sufficiently simple problem. By analyzing the magnetization dynamics in a thin permalloy square sample, triggered by a well defined excitation, we obtain the ferromagnetic resonance spectrum and identify the resonance modes via Fourier transform. Simulations are performed using both finite difference and finite element numerical methods, with OOMMF and Nmag simulators, respectively. We report the effects of initial conditions and simulation parameters on the character of the observed resonance modes for this standard problem. We provide detailed instructions and code to assist in using the results for evaluation of new simulator tools, and to help with numerical calculation of ferromagnetic resonance spectra and modes in general.
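Reading a resonance spectrum off a simulated ringdown is a Fourier-transform exercise. The sketch below uses a synthetic damped two-mode signal in place of the spatially averaged magnetization an OOMMF or Nmag run would produce:

```python
# FMR spectrum from a ringdown: power spectrum of the magnetization component
# after the excitation. The two damped modes here are synthetic stand-ins.
import numpy as np

dt, n = 5e-12, 4096                      # 5 ps sampling, ~20 ns window
t = np.arange(n) * dt
my = (np.exp(-t / 3e-9) * np.sin(2 * np.pi * 8.2e9 * t)
      + 0.4 * np.exp(-t / 2e-9) * np.sin(2 * np.pi * 11.1e9 * t))

spectrum = np.abs(np.fft.rfft(my - my.mean())) ** 2
freqs = np.fft.rfftfreq(n, dt)
print(f"dominant mode near {freqs[np.argmax(spectrum)] / 1e9:.2f} GHz")
```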
2016-02-01
from the tools being used. For example, while Coq proves properties it does not dump an explanation of the proofs in any currently supported form. The... Hotel room locks and card keys use a simple protocol to manage the transition of rooms from one guest to the next. The lock...retains that guest key's code. A new guest checks in and gets a card with a new current code, and the previous code set to the previous guest's current
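The recodable-lock scheme the fragment describes can be reconstructed as follows (the report itself concerns formal verification of such protocols, e.g. with Coq; this Python sketch only illustrates the protocol's logic). Each card carries the previous and current codes:

```python
# Standard recodable hotel-lock protocol: the lock stores one code; a card
# carries (previous, current). A matching current code opens; a matching
# previous code opens and recodes the lock, invalidating older cards.
class Lock:
    def __init__(self, code):
        self.code = code                    # the guest code the lock retains

    def try_open(self, card):
        prev_code, cur_code = card
        if cur_code == self.code:           # same guest as before
            return True
        if prev_code == self.code:          # new guest: recode the lock
            self.code = cur_code
            return True
        return False                        # stale card: stays locked

lock = Lock(code=41)
alice = (40, 41)                            # checked in earlier
bob = (41, 42)                              # issued at the front desk later
print(lock.try_open(bob))                   # True: lock recodes itself to 42
print(lock.try_open(alice))                 # False: Alice's card is now stale
```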
The potential of genetic algorithms for conceptual design of rotor systems
NASA Technical Reports Server (NTRS)
Crossley, William A.; Wells, Valana L.; Laananen, David H.
1993-01-01
The capabilities of genetic algorithms as a non-calculus based, global search method make them potentially useful in the conceptual design of rotor systems. Coupling reasonably simple analysis tools to the genetic algorithm was accomplished, and the resulting program was used to generate designs for rotor systems to match requirements similar to those of both an existing helicopter and a proposed helicopter design. This provides a comparison with the existing design and also provides insight into the potential of genetic algorithms in design of new rotors.
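A bare-bones genetic algorithm shows the coupling pattern: the fitness function is the slot where a rotor-analysis tool would be called. The toy objective below stands in for that analysis.

```python
# Minimal genetic algorithm; fitness() is a placeholder for a coupled
# rotor-performance analysis.
import random

def fitness(x):                  # stand-in for a rotor-analysis objective
    return -sum((xi - 0.3) ** 2 for xi in x)

def evolve(n_genes=4, pop_size=30, generations=60, p_mut=0.1):
    pop = [[random.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]       # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_genes)
            child = a[:cut] + b[cut:]        # one-point crossover
            children.append([random.random() if random.random() < p_mut else g
                             for g in child])  # per-gene mutation
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())   # converges toward the optimum at (0.3, 0.3, 0.3, 0.3)
```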
BioImageXD: an open, general-purpose and high-throughput image-processing platform.
Kankaanpää, Pasi; Paavolainen, Lassi; Tiitta, Silja; Karjalainen, Mikko; Päivärinne, Joacim; Nieminen, Jonna; Marjomäki, Varpu; Heino, Jyrki; White, Daniel J
2012-06-28
BioImageXD puts open-source computer science tools for three-dimensional visualization and analysis into the hands of all researchers, through a user-friendly graphical interface tuned to the needs of biologists. BioImageXD has no restrictive licenses or undisclosed algorithms and enables publication of precise, reproducible and modifiable workflows. It allows simple construction of processing pipelines and should enable biologists to perform challenging analyses of complex processes. We demonstrate its performance in a study of integrin clustering in response to selected inhibitors.
2010-01-01
We present an extensible software model for the genotype and phenotype community, XGAP. Readers can download a standard XGAP (http://www.xgap.org) or auto-generate a custom version using MOLGENIS with programming interfaces to R-software and web-services or user interfaces for biologists. XGAP has simple load formats for any type of genotype, epigenotype, transcript, protein, metabolite or other phenotype data. Current functionality includes tools ranging from eQTL analysis in mouse to genome-wide association studies in humans. PMID:20214801
Swertz, Morris A; Velde, K Joeri van der; Tesson, Bruno M; Scheltema, Richard A; Arends, Danny; Vera, Gonzalo; Alberts, Rudi; Dijkstra, Martijn; Schofield, Paul; Schughart, Klaus; Hancock, John M; Smedley, Damian; Wolstencroft, Katy; Goble, Carole; de Brock, Engbert O; Jones, Andrew R; Parkinson, Helen E; Jansen, Ritsert C
2010-01-01
We present an extensible software model for the genotype and phenotype community, XGAP. Readers can download a standard XGAP (http://www.xgap.org) or auto-generate a custom version using MOLGENIS with programming interfaces to R-software and web-services or user interfaces for biologists. XGAP has simple load formats for any type of genotype, epigenotype, transcript, protein, metabolite or other phenotype data. Current functionality includes tools ranging from eQTL analysis in mouse to genome-wide association studies in humans.
SIMBA: a web tool for managing bacterial genome assembly generated by Ion PGM sequencing technology.
Mariano, Diego C B; Pereira, Felipe L; Aguiar, Edgar L; Oliveira, Letícia C; Benevides, Leandro; Guimarães, Luís C; Folador, Edson L; Sousa, Thiago J; Ghosh, Preetam; Barh, Debmalya; Figueiredo, Henrique C P; Silva, Artur; Ramos, Rommel T J; Azevedo, Vasco A C
2016-12-15
The evolution of Next-Generation Sequencing (NGS) has considerably reduced the cost per sequenced base, allowing a significant rise in sequencing projects, mainly in prokaryotes. However, the range of available NGS platforms requires different strategies and software to correctly assemble genomes. Properly completing an assembly project also requires the installation or modification of various software tools, so users need significant expertise in this software and command-line scripting experience on Unix platforms, in addition to basic expertise in genome assembly methodologies and techniques. These difficulties often delay complete genome assembly projects. In order to overcome this, we developed SIMBA (SImple Manager for Bacterial Assemblies), a freely available web tool that integrates several component tools for assembling and finishing bacterial genomes. SIMBA provides a friendly and intuitive user interface so that bioinformaticians, even those with low computational expertise, can work under a centralized administrative control system of assemblies managed by the assembly center head. SIMBA guides users through the assembly process with simple and interactive pages. The SIMBA workflow is divided into three modules: (i) projects, which allows a general view of genome sequencing projects, in addition to data quality analysis and data format conversions; (ii) assemblies, which allows de novo assemblies with the software Mira, Minia, Newbler and SPAdes, as well as assembly quality validation with the QUAST software; and (iii) curation, which presents methods for finishing assemblies through tools for scaffolding contigs and closing gaps. We also present a case study that validated the efficacy of SIMBA in managing bacterial assembly projects sequenced using Ion Torrent PGM. Besides being a web tool for genome assembly, SIMBA is a complete genome assembly project management system, which can be useful for managing several projects in laboratories. SIMBA source code is available to download and install on local web servers at http://ufmg-simba.sourceforge.net .
SEADS 3.0 Sectoral Energy/Employment Analysis and Data System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roop, Joseph M.; Anderson, David A.; Schultz, Robert W.
2007-12-17
SEADS 3.0, the Sectoral Energy/Employment Analysis and Data System, is a revision and upgrade of SEADS-PC, a software package designed for the analysis of policies that can be described by modifying the final demands of consumers, businesses, or governments (Roop, et al., 1995). If a question can be formulated so that its implications can be translated into changes in final demands for goods and services, then SEADS 3.0 provides a quick and easy tool to assess preliminary impacts. And SEADS 3.0 should be considered just that: a quick and easy way to get preliminary results. Often a thorough answer, even to such a simple question as, "What would be the effect on U.S. energy use and employment if the Federal Government doubled R&D expenditures?", requires a more sophisticated analytical framework than the input-output structure embedded in SEADS 3.0. This tool uses a static input-output model to assess the impacts of changes in final demands, first on industry output, then on employment and energy use. The employment and energy impacts are derived by multiplying the industry outputs (derived from the changed final demands) by industry-specific energy and employment coefficients. The tool also allows for the specification of regional or state employment impacts, though this option is not available for energy impacts.
NASA Technical Reports Server (NTRS)
Stockwell, Alan E.; Cooper, Paul A.
1991-01-01
The Integrated Multidisciplinary Analysis Tool (IMAT) consists of a menu driven executive system coupled with a relational database which links commercial structures, structural dynamics and control codes. The IMAT graphics system, a key element of the software, provides a common interface for storing, retrieving, and displaying graphical information. The IMAT Graphics Manual shows users of commercial analysis codes (MATRIXx, MSC/NASTRAN and I-DEAS) how to use the IMAT graphics system to obtain high quality graphical output using familiar plotting procedures. The manual explains the key features of the IMAT graphics system, illustrates their use with simple step-by-step examples, and provides a reference for users who wish to take advantage of the flexibility of the software to customize their own applications.
A simple method for processing data with least square method
NASA Astrophysics Data System (ADS)
Wang, Chunyan; Qi, Liqun; Chen, Yongxiang; Pang, Guangning
2017-08-01
The least squares method is widely used in data processing and error estimation. It has become an essential technique for parameter estimation, data processing, regression analysis, and experimental data fitting, and a criterion tool for statistical inference. In measurement data analysis, complex relationships are usually fitted on the least squares principle, i.e., matrices are used to solve for the final estimate and to improve its accuracy. In this paper, a new method for this solution is presented that is based on algebraic computation and is relatively straightforward and easy to understand. The practicability of the method is illustrated with a concrete example.
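For reference, the standard matrix formulation the abstract alludes to: for a linear model y = X*beta + e, the least squares estimate solves the normal equations (X^T X) beta = X^T y. A sketch with synthetic data:

```python
# Ordinary least squares via the normal equations on synthetic data.
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 50)
y = 2.0 + 0.7 * x + rng.normal(scale=0.5, size=x.size)

X = np.column_stack([np.ones_like(x), x])      # design matrix
beta = np.linalg.solve(X.T @ X, X.T @ y)       # normal equations
residuals = y - X @ beta
print("intercept, slope:", beta, " RSS:", residuals @ residuals)
```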
Inter-subject phase synchronization for exploratory analysis of task-fMRI.
Bolt, Taylor; Nomi, Jason S; Vij, Shruti G; Chang, Catie; Uddin, Lucina Q
2018-08-01
Analysis of task-based fMRI data is conventionally carried out using a hypothesis-driven approach, where blood-oxygen-level dependent (BOLD) time courses are correlated with a hypothesized temporal structure. In some experimental designs, this temporal structure can be difficult to define. In other cases, experimenters may wish to take a more exploratory, data-driven approach to detecting task-driven BOLD activity. In this study, we demonstrate the efficiency and power of an inter-subject synchronization approach for exploratory analysis of task-based fMRI data. Combining the tools of instantaneous phase synchronization and independent component analysis, we characterize whole-brain task-driven responses in terms of group-wise similarity in temporal signal dynamics of brain networks. We applied this framework to fMRI data collected during performance of a simple motor task and a social cognitive task. Analyses using an inter-subject phase synchronization approach revealed a large number of brain networks that dynamically synchronized to various features of the task, often not predicted by the hypothesized temporal structure of the task. We suggest that this methodological framework, along with readily available tools in the fMRI community, provides a powerful exploratory, data-driven approach for analysis of task-driven BOLD activity. Copyright © 2018 Elsevier Inc. All rights reserved.
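The phase-synchronization half of the framework is compact to express: Hilbert-transform each subject's band-limited signal to an instantaneous phase, then measure group-wise similarity at each time point as the length of the mean resultant vector. The signals below are synthetic stand-ins for ICA network time courses.

```python
# Instantaneous inter-subject phase synchronization on synthetic signals.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(4)
t = np.linspace(0, 60, 1200)
task = np.sin(2 * np.pi * 0.05 * t)                  # shared task drive
subjects = np.array([task + 0.8 * rng.standard_normal(t.size)
                     for _ in range(20)])

phases = np.angle(hilbert(subjects, axis=1))         # subjects x time
sync = np.abs(np.exp(1j * phases).mean(axis=0))      # 0..1 at each time point
print(sync.mean(), sync.max())
```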
NASA Astrophysics Data System (ADS)
Pearl, Judea
2000-03-01
Written by one of the pre-eminent researchers in the field, this book provides a comprehensive exposition of modern analysis of causation. It shows how causality has grown from a nebulous concept into a mathematical theory with significant applications in the fields of statistics, artificial intelligence, philosophy, cognitive science, and the health and social sciences. Pearl presents a unified account of the probabilistic, manipulative, counterfactual and structural approaches to causation, and devises simple mathematical tools for analyzing the relationships between causal connections, statistical associations, actions and observations. The book will open the way for including causal analysis in the standard curriculum of statistics, artificial intelligence, business, epidemiology, social science and economics. Students in these areas will find natural models, simple identification procedures, and precise mathematical definitions of causal concepts that traditional texts have tended to evade or make unduly complicated. This book will be of interest to professionals and students in a wide variety of fields. Anyone who wishes to elucidate meaningful relationships from data, predict effects of actions and policies, assess explanations of reported events, or form theories of causal understanding and causal speech will find this book stimulating and invaluable.
Jorge-Botana, Guillermo; Olmos, Ricardo; Luzón, José M
2018-01-01
The aim of this paper is to describe and explain a useful computational methodology to model the semantic development of word representation: word maturity. In particular, the methodology is based on the longitudinal word monitoring created by Kireyev and Landauer, using latent semantic analysis for the representation of lexical units. The paper is divided into two parts. First, the steps required to model the development of the meaning of words are explained in detail; we describe the technical and theoretical aspects of each step. Second, we provide a simple example of the application of this methodology with some simple tools that can be used by applied researchers. This paper can serve as a user-friendly guide for researchers interested in modeling changes in the semantic representations of words. Some current aspects of the technique and future directions are also discussed. WIREs Cogn Sci 2018, 9:e1457. doi: 10.1002/wcs.1457. This article is categorized under: Computer Science > Natural Language Processing; Linguistics > Language Acquisition; Psychology > Development and Aging. © 2017 Wiley Periodicals, Inc.
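A compact sketch of the word-maturity idea: build an LSA space from the documents "seen so far", align it to the full adult space on the shared vocabulary, and score a word by the cosine between its two vectors. The corpus, dimensionality, and the Procrustes alignment step are illustrative assumptions rather than the paper's exact recipe.

```python
# Toy word-maturity estimate: cosine between a word's vector in a partial-
# corpus LSA space (aligned via orthogonal Procrustes) and the adult space.
import numpy as np
from scipy.linalg import orthogonal_procrustes

def lsa(term_doc, k):
    U, S, _ = np.linalg.svd(term_doc, full_matrices=False)
    return U[:, :k] * S[:k]                    # one k-dim vector per word

rng = np.random.default_rng(7)
adult_corpus = rng.random((200, 400))          # words x documents (toy counts)
child_corpus = adult_corpus[:, :80]            # first documents "seen"

k = 20
adult, child = lsa(adult_corpus, k), lsa(child_corpus, k)
R, _ = orthogonal_procrustes(child, adult)     # rotate child into adult space
aligned = child @ R

w = 17                                         # an arbitrary word index
maturity = (aligned[w] @ adult[w]) / (
    np.linalg.norm(aligned[w]) * np.linalg.norm(adult[w]))
print(f"word maturity (cosine) = {maturity:.3f}")
```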
NASA Giovanni: A Tool for Visualizing, Analyzing, and Inter-comparing Soil Moisture Data
NASA Technical Reports Server (NTRS)
Teng, William; Rui, Hualan; Vollmer, Bruce; deJeu, Richard; Fang, Fan; Lei, Guang-Dih; Parinussa, Robert
2014-01-01
There are many existing satellite soil moisture algorithms and their derived data products, but there is no simple way for a user to inter-compare the products or analyze them together with other related data. An environment that facilitates such inter-comparison and analysis would be useful for validation of satellite soil moisture retrievals against in situ data and for determining the relationships between different soil moisture products. As part of the NASA Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure) family of portals, which has provided users worldwide with a simple but powerful way to explore NASA data, a beta prototype Giovanni Inter-comparison of Soil Moisture Products portal has been developed. A number of soil moisture data products are currently included in the prototype portal. More will be added, based on user requirements and feedback and as resources become available. Two application examples for the portal are provided. The NASA Giovanni Soil Moisture portal is versatile and extensible, with many possible uses, for research and applications, as well as for the education community.
Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations
NASA Technical Reports Server (NTRS)
Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.
2015-01-01
The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
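A miniature example of the genre: propagate a point mass with RK4 and difference it against the closed-form trajectory. Real check-cases replace uniform gravity with rotating, oblate-spheroid Earth models, but the comparison logic is the same.

```python
# Toy verification check-case: RK4 propagation vs. analytic ballistic solution.
import numpy as np

g = 9.80665

def deriv(state):                       # state = [x, z, vx, vz]
    x, z, vx, vz = state
    return np.array([vx, vz, 0.0, -g])

def rk4_step(s, h):
    k1 = deriv(s)
    k2 = deriv(s + h / 2 * k1)
    k3 = deriv(s + h / 2 * k2)
    k4 = deriv(s + h * k3)
    return s + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

s, h = np.array([0.0, 0.0, 50.0, 50.0]), 0.01
for _ in range(500):                    # 5 s of flight
    s = rk4_step(s, h)

t = 5.0
analytic = np.array([50 * t, 50 * t - 0.5 * g * t ** 2, 50.0, 50.0 - g * t])
print("max abs difference:", np.max(np.abs(s - analytic)))
```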
Use of simple models to determine wake vortex categories for new aircraft.
DOT National Transportation Integrated Search
2015-06-22
The paper describes how to use simple models and, if needed, sensitivity analyses to determine the wake vortex categories for new aircraft. The methodology provides a tool for the regulators to assess the relative risk of introducing new aircraft int...
Some research perspectives in galloping phenomena: critical conditions and post-critical behavior
NASA Astrophysics Data System (ADS)
Piccardo, Giuseppe; Pagnini, Luisa Carlotta; Tubino, Federica
2015-01-01
This paper gives an overview of wind-induced galloping phenomena, describing their manifold features and the many advances that have taken place in this field. Starting from a quasi-steady model of aeroelastic forces exerted by the wind on a rigid cylinder with three degrees of freedom, two translations and a rotation in the plane of the model cross-section, the fluid-structure interaction forces are described in simple terms, yet in a manner suited to the complexity of mechanical systems, both in the linear and in the nonlinear field, thus allowing investigation of a wide range of structural typologies and their dynamic behavior. The paper is driven by some key concerns. A great effort is made in underlining the strengths and weaknesses of the classic quasi-steady theory as well as of the simplistic assumptions that are introduced in order to investigate such complex phenomena through simple engineering models. A second aspect, which is crucial to the authors' approach, is to take into account and harmonize the engineering, physical and mathematical perspectives in an interdisciplinary way, something that does not happen often. The authors underline that the quasi-steady approach is an irreplaceable tool, though approximate and simple, for performing engineering analyses; at the same time, the study of this phenomenon gives origin to numerous problems that make the application of high-level mathematical solutions particularly attractive. Finally, the paper discusses a wide range of features of the galloping theory and its practical use which deserve further attention and refinements, pointing to the great potential represented by new fields of application and advanced analysis tools.
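The best-known product of the quasi-steady linear theory is the Den Hartog criterion, which makes a concrete example: transverse galloping requires a1 = -(dCL/dalpha + CD) > 0, and the onset velocity follows from setting total damping to zero, U_cr = 4*m*zeta*omega_n / (rho*D*a1). All numbers below are illustrative, not from a specific structure.

```python
# Quasi-steady galloping onset check (Den Hartog criterion) with toy values.
import numpy as np

rho, D, m = 1.25, 0.2, 15.0              # air density, cross-wind size, mass/length
zeta, omega_n = 0.005, 2 * np.pi * 1.0   # damping ratio, natural frequency (rad/s)
dCL_dalpha, CD = -3.0, 1.2               # static coefficients at alpha = 0

a1 = -(dCL_dalpha + CD)                  # linearised transverse force slope
if a1 > 0:
    U_cr = 4 * m * zeta * omega_n / (rho * D * a1)
    print(f"galloping-prone section, onset velocity ~ {U_cr:.1f} m/s")
else:
    print("Den Hartog criterion not met: no quasi-steady galloping")
```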
NASA Astrophysics Data System (ADS)
Masoud, Alaa; Koike, Katsuaki
2017-09-01
Detection and analysis of linear features related to surface and subsurface structures have been deemed necessary in natural resource exploration and earth surface instability assessment. Subjectivity in choosing the control parameters required by conventional methods of lineament detection may cause unreliable results. To reduce this ambiguity, we developed LINDA (LINeament Detection and Analysis), an integrated tool with a graphical user interface in Visual Basic. This tool automates the detection and analysis of linear features from grid data of topography (digital elevation model; DEM), gravity and magnetic surfaces, as well as data from remote sensing imagery. A simple interface with five display windows creates a user-friendly interactive environment. The interface facilitates grid data shading, detection and grouping of segments, lineament analyses for calculating strike and dip and estimating fault type, and interactive viewing of lineament geometry. Density maps of the center and intersection points of linear features (segments and lineaments) are also included. A systematic analysis of test DEMs and Landsat 7 ETM+ imagery datasets in the North and South Eastern Deserts of Egypt is implemented to demonstrate the capability of LINDA and the correct use of its functions. Linear features from the DEM are superior to those from the imagery in terms of frequency, but both agree with the locations and directions of V-shaped valleys and dykes and with reference fault data. Through the case studies, LINDA's applicability is demonstrated in highlighting dominant structural trends, which can aid understanding of geodynamic frameworks in any region.
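An analogous open-source route to automated segment detection (not LINDA's own algorithm) can be sketched with scikit-image: shade a DEM by its slope, detect edges, extract segments with a probabilistic Hough transform, and tabulate azimuths.

```python
# Lineament-style segment detection on a synthetic DEM via edge detection
# and a probabilistic Hough transform.
import numpy as np
from skimage.feature import canny
from skimage.transform import probabilistic_hough_line

dem = np.cumsum(np.random.default_rng(5).standard_normal((256, 256)), axis=0)
gx, gy = np.gradient(dem)
shaded = np.hypot(gx, gy)                       # simple slope "shading"

edges = canny(shaded / shaded.max(), sigma=2.0)
segments = probabilistic_hough_line(edges, threshold=10,
                                    line_length=25, line_gap=3)
azimuths = [np.degrees(np.arctan2(y1 - y0, x1 - x0)) % 180
            for (x0, y0), (x1, y1) in segments]
print(len(segments), "segments; first azimuths:", azimuths[:5])
```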
Technology: Presentations in the Cloud with a Twist
ERIC Educational Resources Information Center
Siegle, Del
2011-01-01
Technology tools have come a long way from early word processing applications and opportunities for students to engage in simple programming. Many tools now exist for students to develop and share products in a variety of formats and for a wide range of audiences. PowerPoint is probably the most ubiquitously used tool for student projects. In…
Scratch as a Computational Modelling Tool for Teaching Physics
ERIC Educational Resources Information Center
Lopez, Victor; Hernandez, Maria Isabel
2015-01-01
The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…
Selb, Melissa; Gimigliano, Francesca; Prodinger, Birgit; Stucki, Gerold; Pestelli, Germano; Iocco, Maurizio; Boldrini, Paolo
2017-04-01
As part of international efforts to develop and implement national models including the specification of ICF-based clinical data collection tools, the Italian rehabilitation community initiated a project to develop simple, intuitive descriptions of the ICF Rehabilitation Set, highlighting the core concept of each category in user-friendly language. This paper outlines the Italian experience in developing simple, intuitive descriptions of the ICF Rehabilitation Set as an ICF-based clinical data collection tool for Italy. Consensus process. Expert conference. Multidisciplinary group of rehabilitation professionals. The first of a two-stage consensus process involved developing an initial proposal for simple, intuitive descriptions of each ICF Rehabilitation Set category, based on descriptions generated in a similar process in China. Stage two involved a consensus conference. Divided into three working groups, participants discussed and voted (vote A) on whether the initially proposed description of each ICF Rehabilitation Set category was simple and intuitive enough for use in daily practice. Afterwards, the categories with descriptions considered ambiguous, i.e., not simple and intuitive enough, were divided among the working groups, which were asked to propose new descriptions for their allocated categories. These proposals were then voted on (vote B) in a plenary session. In the last step of the consensus conference, each working group developed a new proposal for each of the categories whose descriptions were still considered ambiguous, and participants then voted (final vote) for the proposed description they preferred. Nineteen clinicians from diverse rehabilitation disciplines and various regions of Italy participated in the consensus process. Three ICF categories achieved consensus in vote A, 20 ICF categories were accepted in vote B, and the remaining 7 categories were decided in the final vote. The findings are discussed in light of current efforts toward developing strategies for ICF implementation, specifically for the application of an ICF-based clinical data collection tool, not only in Italy but also in the rest of Europe. The descriptions are promising as minimal standards for monitoring the impact of interventions and for standardized reporting of functioning as a relevant outcome in rehabilitation.
Rubin, Katrine Hass; Friis-Holmberg, Teresa; Hermann, Anne Pernille; Abrahamsen, Bo; Brixen, Kim
2013-08-01
A huge number of risk assessment tools has been developed, but far from all have been validated in external studies, many lack methodologically transparent evidence, and few are integrated into national guidelines. Therefore, we performed a systematic review to provide an overview of existing valid and reliable risk assessment tools for the prediction of osteoporotic fractures. Additionally, we aimed to determine whether the performance of each tool was sufficient for practical use and to examine whether the complexity of the tools influenced their discriminative power. We searched the PubMed, Embase, and Cochrane databases for papers and evaluated these with respect to methodological quality using the Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS) checklist. A total of 48 tools were identified; 20 had been externally validated, but only six tools had been tested more than once in a population-based setting with acceptable methodological quality. None of the tools performed consistently better than the others, and simple tools (i.e., the Osteoporosis Self-assessment Tool [OST], Osteoporosis Risk Assessment Instrument [ORAI], and Garvan Fracture Risk Calculator [Garvan]) often did as well as or better than more complex tools (i.e., Simple Calculated Risk Estimation Score [SCORE], WHO Fracture Risk Assessment Tool [FRAX], and Qfracture). No studies determined the effectiveness of tools in selecting patients for therapy and thus improving fracture outcomes. High-quality studies in randomized designs with population-based cohorts with different case mixes are needed. Copyright © 2013 American Society for Bone and Mineral Research.
NASA Technical Reports Server (NTRS)
Cooke, C. H.
1975-01-01
STICAP (Stiff Circuit Analysis Program) is a FORTRAN 4 computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst with a tool for automatically computing the transient responses and frequency responses of large linear time-invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, making the program simple to use. The engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. The program structure is also depicted from a systems programmer's viewpoint, and flow charts and other software documentation are given.
Cygankiewicz, Iwona; Zareba, Wojciech
2013-01-01
Heart rate variability (HRV) provides indirect insight into autonomic nervous system tone, and has a well-established role as a marker of cardiovascular risk. Recent decades have brought increasing interest in HRV assessment as a diagnostic tool in the detection of autonomic impairment and the prediction of prognosis in several neurological disorders. Both bedside analysis of simple HRV markers and more sophisticated HRV analyses, including time domain, frequency domain, and nonlinear analysis, have been proven to detect early autonomic involvement in several neurological disorders. Furthermore, altered HRV parameters were shown to be related to cardiovascular risk, including the risk of sudden cardiac death, in patients with neurological diseases. This chapter aims to review the clinical and prognostic application of HRV analysis in diabetes, stroke, multiple sclerosis, muscular dystrophies, Parkinson's disease and epilepsy. © 2013 Elsevier B.V. All rights reserved.
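The "simple markers" in question are typically time-domain statistics over RR intervals; SDNN and RMSSD are standard instances. A minimal sketch with a synthetic interval series:

```python
# Time-domain HRV markers from a list of RR intervals (ms): SDNN and RMSSD.
import numpy as np

rr = 800 + 40 * np.random.default_rng(6).standard_normal(300)  # RR in ms
sdnn = rr.std(ddof=1)                         # overall variability
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))    # beat-to-beat (vagal) component
print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```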
Development of a bi-equilibrium model for biomass gasification in a downdraft bed reactor.
Biagini, Enrico; Barontini, Federica; Tognotti, Leonardo
2016-02-01
This work proposes a simple and accurate tool for predicting the main parameters of biomass gasification (syngas composition, heating value, flow rate), suitable for process study and system analysis. A multizonal model based on non-stoichiometric equilibrium models and a repartition factor, simulating the bypass of pyrolysis products through the oxidant zone, was developed. The results of tests with different feedstocks (corn cobs, wood pellets, rice husks and vine pruning) in a demonstrative downdraft gasifier (350 kW) were used for validation. The average discrepancy between model and experimental results was up to 8 times smaller than that of the simple equilibrium model. The repartition factor was successfully related to the operating conditions and characteristics of the biomass, making it possible to simulate different conditions of the gasifier (variation in potentiality, densification and mixing of feedstock) and analyze the model sensitivity. Copyright © 2015 Elsevier Ltd. All rights reserved.
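The abstract does not reproduce the model equations; as a loose sketch of the repartition-factor idea only, the final syngas can be viewed as a mix of an equilibrium-converted stream with the fraction f of pyrolysis products that bypasses the oxidant zone. All numbers below are invented placeholders, not values from the paper:

```python
# Hedged sketch of the bi-equilibrium aggregation step: blend an
# equilibrium-converted gas stream with the bypassing pyrolysis stream,
# weighted by a hypothetical repartition factor f.
f = 0.15  # hypothetical repartition factor (fraction bypassing the oxidant zone)

equilibrium_gas = {"H2": 0.18, "CO": 0.22, "CO2": 0.10, "CH4": 0.01, "N2": 0.49}
pyrolysis_gas   = {"H2": 0.08, "CO": 0.30, "CO2": 0.15, "CH4": 0.12, "N2": 0.35}

syngas = {sp: (1 - f) * equilibrium_gas[sp] + f * pyrolysis_gas[sp]
          for sp in equilibrium_gas}
print(syngas)  # mole fractions of the blended product gas
```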
Finite state model and compatibility theory - New analysis tools for permutation networks
NASA Technical Reports Server (NTRS)
Huang, S.-T.; Tripathi, S. K.
1986-01-01
A simple model to describe the fundamental operation theory of shuffle-exchange-type permutation networks, the finite permutation machine (FPM), is described, and theorems which transform the control matrix result to a continuous compatible vector result are developed. It is found that only 2n-1 shuffle exchange passes are necessary, and that 3n-3 passes are sufficient, to realize all permutations, reducing the sufficient number of passes by two from previous results. The flexibility of the approach is demonstrated by the description of a stack permutation machine (SPM) which can realize all permutations, and by showing that the FPM corresponding to the Benes (1965) network belongs to the SPM. The FPM corresponding to the network with two cascaded reverse-exchange networks is found to realize all permutations, and a simple mechanism to verify several equivalence relationships of various permutation networks is discussed.
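For background on the networks the FPM formalism models (this sketch uses plain indexing, not the paper's control-matrix notation), one shuffle-exchange pass on N = 2^n lines is a perfect shuffle, which rotates the n-bit line address left by one position, optionally followed by a pairwise exchange with one control bit per switch:

```python
# One shuffle-exchange pass on N = 2^n lines (illustrative sketch).
def shuffle(data):
    n = len(data)          # must be a power of two
    half = n // 2
    out = [None] * n
    for i in range(half):
        out[2 * i] = data[i]             # first half -> even slots
        out[2 * i + 1] = data[half + i]  # second half -> odd slots
    return out

def exchange(data, swap_mask):
    # swap_mask[j] = True swaps pair (2j, 2j+1): one control bit per switch.
    out = list(data)
    for j, swap in enumerate(swap_mask):
        if swap:
            out[2 * j], out[2 * j + 1] = out[2 * j + 1], out[2 * j]
    return out

x = list(range(8))
print(exchange(shuffle(x), [True, False, True, False]))
```

The paper's result concerns how many such cascaded passes are needed before every permutation of the N lines becomes realizable.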
Project Management Software for Distributed Industrial Companies
NASA Astrophysics Data System (ADS)
Dobrojević, M.; Medjo, B.; Rakin, M.; Sedmak, A.
This paper gives an overview of the development of a new software solution for project management, intended mainly for use in industrial environments. The main concern of the proposed solution is application in everyday engineering practice in various, mainly distributed industrial companies. With this in mind, special care has been devoted to developing appropriate tools for tracking, storing and analyzing information about the project, and delivering it on time to the right team members or other responsible persons. The proposed solution is Internet-based and uses the LAMP/WAMP (Linux or Windows - Apache - MySQL - PHP) platform, because of its stability, versatility, open source technology and simple maintenance. The modular structure of the software makes it easy to customize according to client-specific needs, with a very short implementation period. Its main advantages are simple usage, quick implementation, easy system maintenance, and the short training and only basic computer skills needed for operators.
Forensic collection of trace chemicals from diverse surfaces with strippable coatings.
Jakubowski, Michael J; Beltis, Kevin J; Drennan, Paul M; Pindzola, Bradford A
2013-11-07
Surface sampling for chemical analysis plays a vital role in environmental monitoring, industrial hygiene, homeland security and forensics. The standard surface sampling tool, a simple cotton gauze pad, is failing to meet the needs of the community as analytical techniques become more sensitive and the variety of analytes increases. In previous work, we demonstrated the efficacy of non-destructive, conformal, spray-on strippable coatings for chemical collection from simple glass surfaces. Here we expand that work by presenting chemical collection at a low spiking level (0.1 g m⁻²) from a diverse array of common surfaces - painted metal, engineering plastics, painted wallboard and concrete - using strippable coatings. The collection efficiency of the strippable coatings is compared with that of gauze pads and far exceeds it. Collection from concrete, a particular challenge for wipes like gauze, averaged 73% over eight chemically diverse compounds for the strippable coatings, whereas gauze averaged 10%.
Several steps/day indicators predict changes in anthropometric outcomes: HUB city steps
USDA-ARS's Scientific Manuscript database
Walking for exercise remains the most frequently reported leisure-time activity, likely because it is simple, inexpensive, and easily incorporated into most people’s lifestyle. Pedometers are simple, convenient, and economical tools that can be used to quantify step-determined physical activity. F...
Predicting Fish Densities in Lotic Systems: a Simple Modeling Approach
Fish density models are essential tools for fish ecologists and fisheries managers. However, applying these models can be difficult because of high levels of model complexity and the large number of parameters that must be estimated. We designed a simple fish density model and te...
A Progression of Static Equilibrium Laboratory Exercises
ERIC Educational Resources Information Center
Kutzner, Mickey; Kutzner, Andrew
2013-01-01
Although simple architectural structures like bridges, catwalks, cantilevers, and Stonehenge have been integral in human societies for millennia, as have levers and other simple tools, modern students of introductory physics continue to grapple with Newton's conditions for static equilibrium. As formulated in typical introductory physics…
Development of Genomic Simple Sequence Repeats (SSR) by Enrichment Libraries in Date Palm.
Al-Faifi, Sulieman A; Migdadi, Hussein M; Algamdi, Salem S; Khan, Mohammad Altaf; Al-Obeed, Rashid S; Ammar, Megahed H; Jakse, Jerenj
2017-01-01
Development of highly informative markers such as simple sequence repeats (SSR) for cultivar identification and germplasm characterization and management is essential for date palm genetic studies. The present study documents the development of SSR markers and assesses genetic relationships of commonly grown date palm (Phoenix dactylifera L.) cultivars in different geographical regions of Saudi Arabia. A total of 93 novel simple sequence repeat (SSR) markers were screened for their ability to detect polymorphism in date palm. Around 71% of the genomic SSRs are dinucleotide, 25% trinucleotide, 3% tetranucleotide, and 1% pentanucleotide motifs, and they show 100% polymorphism. The Unweighted Pair Group Method with Arithmetic Mean (UPGMA) cluster analysis illustrates that cultivars tend to group according to their class of maturity, region of cultivation, and fruit color. Analysis of molecular variance (AMOVA) reveals genetic variation among and within cultivars of 27% and 73%, respectively, according to the geographical distribution of the cultivars. The developed microsatellite markers are of additional value for date palm characterization and can be used by researchers in population genetics, cultivar identification, as well as genetic resource exploration and management. The cultivars tested exhibited a significant amount of genetic diversity and could be suitable for successful breeding programs. Genomic sequences generated from this study are available at the National Center for Biotechnology Information (NCBI) Sequence Read Archive (Accession number LIBGSS_039019).
NASA Technical Reports Server (NTRS)
Hairr, John W.; Huang, Jui-Ten; Ingram, J. Edward; Shah, Bharat M.
1992-01-01
The ISPAN program (Interactive Stiffened Panel Analysis) is an interactive design tool that is intended to provide a means of performing simple and self-contained preliminary analysis of aircraft primary structures made of composite materials. The program combines a series of modules with the finite element code DIAL as its backbone. Four ISPAN modules were developed and are documented. These include: (1) flat stiffened panel; (2) curved stiffened panel; (3) flat tubular panel; and (4) curved geodesic panel. Users input geometric and material properties, load information and the type of analysis (linear, bifurcation buckling, or post-buckling) interactively. Using this information, the program generates a finite element mesh and performs the analysis. The output, in the form of summary tables of stress or margins of safety, contour plots of loads or stress, and deflected shape plots, may be generated and used to evaluate a specific design.
Simple tool for planting acorns
William R. Beaufait
1957-01-01
A handy, inexpensive tool for planting acorns has been developed at the Delta Research Center of the Southern Forest Experiment Station and used successfully in experimental plantings. One of its merits is that it ensures a planting hole of exactly the desired depth.
Mousavi, Soraya; Mariotti, Roberto; Regni, Luca; Nasini, Luigi; Bufacchi, Marina; Pandolfi, Saverio; Baldoni, Luciana; Proietti, Primo
2017-01-01
Germplasm collections of tree crop species represent fundamental tools for the conservation of diversity and key steps for its characterization and evaluation. For the olive tree, several collections were created all over the world, but only a few of them have been fully characterized and molecularly identified. The olive collection of Perugia University (UNIPG), established in the 1960s, represents one of the first attempts to gather and safeguard olive diversity, keeping together cultivars from different countries. In the present study, a set of 370 previously uncharacterized olive trees was screened with 10 standard simple sequence repeats (SSRs) and nine new EST-SSR markers, to correctly and thoroughly identify all genotypes, verify their representativeness of the entire cultivated olive variation, and validate the effectiveness of the new markers in comparison to standard genotyping tools. The SSR analysis revealed the presence of 59 genotypes, corresponding to 72 well-known cultivars, 13 of which are present exclusively in this collection. The new EST-SSRs have shown values of diversity parameters quite similar to those of the best standard SSRs. When compared to hundreds of Mediterranean cultivars, the UNIPG olive accessions were split into the three main populations (East, Center and West Mediterranean), confirming that the collection has a good representativeness of the entire olive variability. Furthermore, Bayesian analysis, performed on the 59 genotypes of the collection using both sets of markers, demonstrated that they split into four clusters, with a more balanced membership obtained with EST-SSRs than with standard SSRs. The new OLEST (Olea expressed sequence tags) SSR markers proved as effective as the best standard markers. The information obtained from this study represents a highly valuable tool for ex situ conservation and management of olive genetic resources, useful for building a common database from worldwide olive cultivar collections, also based on recently developed markers.
Making Temporal Logic Calculational: A Tool for Unification and Discovery
NASA Astrophysics Data System (ADS)
Boute, Raymond
In temporal logic, calculational proofs beyond simple cases are often seen as challenging. The situation is reversed by making temporal logic calculational, yielding shorter and clearer proofs than traditional ones, and serving as a (mental) tool for unification and discovery. A side effect of unifying theories is easier access for practitioners. The starting point is a simple generic (software tool independent) Functional Temporal Calculus (FTC). Specific temporal logics are then captured via endosemantic functions. This concept reflects tacit conventions throughout mathematics and, once identified, is general and useful. FTC also yields a reasoning style that helps in discovering theorems by calculation rather than just proving given facts. This is illustrated by deriving various theorems, most related to liveness issues in TLA+, and finding strengthenings of known results. Educational issues are addressed in passing.
Szarka, Mate; Guttman, Andras
2017-10-17
We present the application of a smartphone-anatomy-based technology in the field of liquid phase bioseparations, particularly in capillary electrophoresis. A simple capillary electrophoresis system was built with LED-induced fluorescence detection and a credit-card-sized minicomputer to prove the concept of real-time fluorescent imaging (a zone-adjustable time-lapse fluorescence image processor) and separation control. The system was evaluated by analyzing under- and overloaded aminopyrenetrisulfonate (APTS)-labeled oligosaccharide samples. The open-source-software-based image processing tool allowed undistorted signal modulation (reprocessing) when the signal was inappropriate for the actual detection system settings (too low or too high). The novel smart detection tool for fluorescently labeled biomolecules greatly expands the dynamic range and enables retrospective correction of injections with unsuitable signal levels without the necessity to repeat the analysis.
orthoFind Facilitates the Discovery of Homologous and Orthologous Proteins.
Mier, Pablo; Andrade-Navarro, Miguel A; Pérez-Pulido, Antonio J
2015-01-01
Finding homologous and orthologous protein sequences is often the first step in evolutionary studies, annotation projects, and experiments of functional complementation. Despite all currently available computational tools, there is a requirement for easy-to-use tools that provide functional information. Here, a new web application called orthoFind is presented, which allows a quick search for homologous and orthologous proteins given one or more query sequences, allowing a recurrent and exhaustive search against reference proteomes, and being able to include user databases. It addresses the protein multidomain problem, searching for homologs with the same domain architecture, and gives a simple functional analysis of the results to help in the annotation process. orthoFind is easy to use and has been proven to provide accurate results with different datasets. Availability: http://www.bioinfocabd.upo.es/orthofind/.
Dumont, Elodie; De Bleye, Charlotte; Sacré, Pierre-Yves; Netchacovitch, Lauranne; Hubert, Philippe; Ziemons, Eric
2016-05-01
Over recent decades, growing environmental concern has driven the expansion of green analytical chemistry tools. Vibrational spectroscopy, belonging to this class of analytical tools, is particularly interesting given its numerous advantages, such as fast data acquisition and no sample preparation. In this context, near-infrared, Raman and mainly surface-enhanced Raman spectroscopy (SERS) have gained interest in many fields, including bioanalysis. The former two techniques only allow the analysis of concentrated compounds in simple matrices, whereas the emergence of SERS extended the performance of vibrational spectroscopy to very sensitive and selective analyses. Complex SERS substrates were also developed enabling biomarker measurements, paving the way for SERS immunoassays. Therefore, in this paper, the strengths and weaknesses of these techniques will be highlighted with a focus on recent progress.
Seed: a user-friendly tool for exploring and visualizing microbial community data.
Beck, Daniel; Dennis, Christopher; Foster, James A
2015-02-15
In this article we present Simple Exploration of Ecological Data (Seed), a data exploration tool for microbial communities. Seed is written in R using the Shiny library. This provides access to powerful R-based functions and libraries through a simple user interface. Seed allows users to explore ecological datasets using principal coordinate analyses, scatter plots, bar plots, hierarchical clustering and heatmaps. Seed is open source and available at https://github.com/danlbek/Seed. Contact: danlbek@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
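Seed itself is an R/Shiny application; as an independent illustration of the first ordination it offers, the following minimal NumPy sketch computes a classical principal coordinate analysis (PCoA) from a sample-by-sample distance matrix by double-centering and eigendecomposition:

```python
# Classical PCoA: embed a distance matrix D into k coordinates.
import numpy as np

def pcoa(D, k=2):
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:k]      # largest eigenvalues first
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Toy 3-sample distance matrix (placeholder data, not from the paper).
D = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.5],
              [2.0, 1.5, 0.0]])
print(pcoa(D))  # 2-D coordinates suitable for a scatter plot
```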
Shen, Lishuang; Attimonelli, Marcella; Bai, Renkui; Lott, Marie T; Wallace, Douglas C; Falk, Marni J; Gai, Xiaowu
2018-06-01
Accurate mitochondrial DNA (mtDNA) variant annotation is essential for the clinical diagnosis of diverse human diseases. Substantial challenges to this process include the inconsistency in mtDNA nomenclatures, the existence of multiple reference genomes, and a lack of reference population frequency data. Clinicians need a simple bioinformatics tool that is user-friendly, and bioinformaticians need a powerful informatics resource for programmatic usage. Here, we report the development and functionality of the MSeqDR mtDNA Variant Tool set (mvTool), a one-stop mtDNA variant annotation and analysis Web service. mvTool is built upon the MSeqDR infrastructure (https://mseqdr.org), with contributions of expert-curated data from MITOMAP (https://www.mitomap.org) and HmtDB (https://www.hmtdb.uniba.it/hmdb). mvTool supports all mtDNA nomenclatures, converts variants to standard rCRS- and HGVS-based nomenclatures, and annotates novel mtDNA variants. Besides generic annotations from dbNSFP and the Variant Effect Predictor (VEP), mvTool provides allele frequencies in more than 47,000 germline mitogenomes, and disease and pathogenicity classifications from MSeqDR, MITOMAP, HmtDB and ClinVar (Landrum et al., 2013). mvTool also provides annotations of somatic mtDNA variants. The mvTool API is implemented for programmatic access using inputs in VCF, HGVS, or classical mtDNA variant nomenclatures. The results are reported as hyperlinked HTML tables, JSON, Excel, and VCF formats. MSeqDR mvTool is freely accessible at https://mseqdr.org/mvtool.php. © 2018 Wiley Periodicals, Inc.
Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony
2018-01-01
This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U.S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, the simple climate matching tool CLIMEX Match Climates, the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and then were evaluated using the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity was compared using a generalized linear mixed model. The modeling tool itself had low statistical significance, while weed species alone accounted for 69.1% and 48.5% of the variance for prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones when predicting the potential distribution of a weed not yet present in the United States. Considerations of model accuracy should also be balanced with those of reproducibility and ease of use. More important than the choice of modeling tool is the construction of robust protocols and the testing of both new and experienced users under blind conditions that approximate operational conditions.
Guitton, Yann; Tremblay-Franco, Marie; Le Corguillé, Gildas; Martin, Jean-François; Pétéra, Mélanie; Roger-Mele, Pierrick; Delabrière, Alexis; Goulitquer, Sophie; Monsoor, Misharl; Duperier, Christophe; Canlet, Cécile; Servien, Rémi; Tardivel, Patrick; Caron, Christophe; Giacomoni, Franck; Thévenot, Etienne A
2017-12-01
Metabolomics is a key approach in modern functional genomics and systems biology. Due to the complexity of metabolomics data, the variety of experimental designs, and the multiplicity of bioinformatics tools, providing experimenters with a simple and efficient resource to conduct comprehensive and rigorous analysis of their data is of utmost importance. In 2014, we launched the Workflow4Metabolomics (W4M; http://workflow4metabolomics.org) online infrastructure for metabolomics built on the Galaxy environment, which offers user-friendly features to build and run data analysis workflows including preprocessing, statistical analysis, and annotation steps. Here we present the new W4M 3.0 release, which contains twice as many tools as the first version, and provides two features which are, to our knowledge, unique among online resources. First, data from the four major metabolomics technologies (i.e., LC-MS, FIA-MS, GC-MS, and NMR) can be analyzed on a single platform. By using three studies in human physiology, alga evolution, and animal toxicology, we demonstrate how the 40 available tools can be easily combined to address biological issues. Second, the full analysis (including the workflow, the parameter values, the input data and output results) can be referenced with a permanent digital object identifier (DOI). Publication of data analyses is of major importance for robust and reproducible science. Furthermore, the publicly shared workflows are of high value for e-learning and training. The Workflow4Metabolomics 3.0 e-infrastructure thus not only offers a unique online environment for analysis of data from the main metabolomics technologies, but is also the first reference repository for metabolomics workflows. Copyright © 2017 Elsevier Ltd. All rights reserved.
Swartz, Jordan; Koziatek, Christian; Theobald, Jason; Smith, Silas; Iturrate, Eduardo
2017-05-01
Testing for venous thromboembolism (VTE) is associated with cost and risk to patients (e.g. radiation). To assess the appropriateness of imaging utilization at the provider level, it is important to know that provider's diagnostic yield (percentage of tests positive for the diagnostic entity of interest). However, determining diagnostic yield typically requires either time-consuming, manual review of radiology reports or the use of complex and/or proprietary natural language processing software. The objectives of this study were twofold: 1) to develop and implement a simple, user-configurable, and open-source natural language processing tool to classify radiology reports with high accuracy and 2) to use the results of the tool to design a provider-specific VTE imaging dashboard, consisting of both utilization rate and diagnostic yield. Two physicians reviewed a training set of 400 lower extremity ultrasound (UTZ) and computed tomography pulmonary angiogram (CTPA) reports to understand the language used in VTE-positive and VTE-negative reports. The insights from this review informed the arguments to the five modifiable parameters of the NLP tool. A validation set of 2,000 studies was then independently classified by the reviewers and by the tool; the classifications were compared and the performance of the tool was calculated. The tool was highly accurate in classifying the presence and absence of VTE for both the UTZ (sensitivity 95.7%; 95% CI 91.5-99.8, specificity 100%; 95% CI 100-100) and CTPA reports (sensitivity 97.1%; 95% CI 94.3-99.9, specificity 98.6%; 95% CI 97.8-99.4). The diagnostic yield was then calculated at the individual provider level and the imaging dashboard was created. We have created a novel NLP tool designed for users without a background in computer programming, which has been used to classify venous thromboembolism reports with a high degree of accuracy. The tool is open-source and available for download at http://iturrate.com/simpleNLP. Results obtained using this tool can be applied to enhance quality by presenting information about utilization and yield to providers via an imaging dashboard. Copyright © 2017 Elsevier B.V. All rights reserved.
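The authors' tool is configurable through five parameters and is available at the URL above; as a hedged illustration of the general keyword-plus-negation approach such rule-based report classifiers take (the phrase lists below are invented placeholders, not the authors' actual parameters):

```python
# Sketch of a keyword-plus-negation radiology report classifier.
import re

POSITIVE = [r"acute (deep venous |pulmonary )?thromb", r"pulmonary embol"]
NEGATION = [r"no evidence of", r"negative for", r"without"]

def classify_report(text):
    text = text.lower()
    for pat in POSITIVE:
        for m in re.finditer(pat, text):
            # Look back a short window for a negating phrase.
            window = text[max(0, m.start() - 40):m.start()]
            if not any(re.search(neg, window) for neg in NEGATION):
                return "VTE-positive"
    return "VTE-negative"

print(classify_report("No evidence of acute pulmonary embolism."))          # VTE-negative
print(classify_report("Findings consistent with acute pulmonary embolism."))  # VTE-positive
```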
Boenzi, Sara; Deodato, Federica; Taurisano, Roberta; Martinelli, Diego; Verrigni, Daniela; Carrozzo, Rosalba; Bertini, Enrico; Pastore, Anna; Dionisi-Vici, Carlo; Johnson, David W
2014-11-01
Two oxysterols, cholestan-3β,5α,6β-triol (C-triol) and 7-ketocholesterol (7-KC), have recently been proposed as diagnostic markers of Niemann-Pick type C (NP-C) disease, representing a potential alternative diagnostic tool to the more invasive and time-consuming filipin test in cultured fibroblasts. Usually, the oxysterols are detected and quantified by a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method using atmospheric pressure chemical ionization (APCI) or electrospray ionization (ESI) sources, after a variety of derivatization procedures to enhance sensitivity. We developed a sensitive LC-MS/MS method to quantify the oxysterols in plasma as dimethylaminobutyrate esters, suitable for ESI analysis. This method, with an easy liquid-phase extraction and a short derivatization procedure, has been validated to demonstrate specificity, linearity, recovery, lowest limit of quantification, accuracy and precision. The assay was linear over a concentration range of 0.5-200 ng/mL for C-triol and 1.0-200 ng/mL for 7-KC. Intra-day and inter-day coefficients of variation (CV%) were <15% for both metabolites. Receiver operating characteristic analysis estimated an area under the curve of 0.998 for C-triol and 0.972 for 7-KC, implying significant discriminatory power of both oxysterols in this patient population. In summary, our method provides a simple, rapid and non-invasive diagnostic tool for the biochemical diagnosis of NP-C disease. Copyright © 2014 Elsevier B.V. All rights reserved.
Impedance microflow cytometry for viability studies of microorganisms
NASA Astrophysics Data System (ADS)
Di Berardino, Marco; Hebeisen, Monika; Hessler, Thomas; Ziswiler, Adrian; Largiadèr, Stephanie; Schade, Grit
2011-02-01
Impedance-based Coulter counters and their derivatives are widely used cell analysis tools in many laboratories and normally use DC or low-frequency AC to perform these electrical analyses. The emergence of micro-fabrication technologies in the last decade, however, provides a new means of measuring the electrical properties of cells. Microfluidic approaches combined with impedance spectroscopy measurements in the radio frequency (RF) range increase sensitivity and information content and thus push single-cell analyses beyond simple cell counting and sizing applications towards multiparametric cell characterization. Promising results have already been shown in the fields of cell differentiation and blood analysis. Here we emphasize the potential of this technology by presenting new data obtained from viability studies on microorganisms. Impedance measurements of several yeast and bacteria strains performed at frequencies around 10 MHz enable an easy discrimination between dead and viable cells. Moreover, cytotoxic effects of antibiotics and other reagents, as well as cell starvation, can also be monitored easily. Control analyses performed with conventional flow cytometers using various fluorescent dyes (propidium iodide, oxonol) indicate a good correlation and further highlight the capability of this device. The label-free approach, on the one hand, makes the use of usually expensive fluorochromes obsolete and, on the other, practically eliminates laborious sample preparation procedures. Until now, online cell monitoring was limited to the determination of viable biomass, which provides rather poor information about a cell culture. Impedance microflow cytometry, among other aspects, offers a simple solution to these limitations and might become an important tool for bioprocess monitoring applications in the biotech industry.
Microstructure, crystallographic texture and mechanical properties of friction stir welded AA2017A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmed, M.M.Z., E-mail: mohamed_ahmed4@s-petrol.suez.edu.eg; Department of Metallurgical and Materials Engineering, Suez Canal University, Suez 43721; Wynne, B.P.
2012-02-15
In this study a thick-section (20 mm) friction stir welded AA2017A-T451 has been characterized in terms of microstructure, crystallographic texture and mechanical properties. For microstructural analysis both optical and scanning electron microscopes have been used. A detailed crystallographic texture analysis has been carried out using the electron backscattering diffraction technique. Crystallographic texture has been examined in both the shoulder- and probe-affected regions of the weld nugget (NG). An entirely weak texture is observed in the shoulder-affected region, which is mainly explained by the effect of the sequential multi-pass deformation experienced from both the tool probe and the tool shoulder. The texture in the probe-dominated region at the advancing side (AS) of the weld is relatively weak but still resembles the simple shear texture of FCC metals, with B/B̄ and C components present across the whole map. However, the texture is stronger at the retreating side (RS) than at the AS of the weld, being dominated by B/B̄ components with the C component almost absent across the map. Alternating bands of the B and B̄ components are observed only at the AS of the weld. Highlights: detailed investigation of microstructure and crystallographic texture; the grain size varies from the top to the bottom of the NG; an entirely weak texture is observed in the shoulder-affected region; the texture in the probe-affected region is dominated by simple shear texture.
SIGNUM: A Matlab, TIN-based landscape evolution model
NASA Astrophysics Data System (ADS)
Refice, A.; Giachetta, E.; Capolongo, D.
2012-08-01
Several numerical landscape evolution models (LEMs) have been developed to date, and many are available as open source codes. Most are written in efficient programming languages such as Fortran or C, but often require additional code efforts to plug into more user-friendly data analysis and/or visualization tools to ease interpretation and scientific insight. In this paper, we present an effort to port a common core of accepted physical principles governing landscape evolution directly into a high-level language and data analysis environment such as Matlab. SIGNUM (acronym for Simple Integrated Geomorphological Numerical Model) is an independent and self-contained Matlab, TIN-based landscape evolution model, built to simulate topography development at various space and time scales. SIGNUM is presently capable of simulating hillslope processes such as linear and nonlinear diffusion, fluvial incision into bedrock, spatially varying surface uplift which can be used to simulate changes in base level, thrust and faulting, as well as effects of climate changes. Although based on accepted and well-known processes and algorithms in its present version, it is built with a modular structure that makes it easy to modify and upgrade the simulated physical processes to suit virtually any user's needs. The code is conceived as an open-source project, and is thus an ideal tool for both research and didactic purposes, thanks to the high-level nature of the Matlab environment and its popularity among the scientific community. In this paper the simulation code is presented together with some simple examples of surface evolution, and guidelines for the development of new modules and algorithms are proposed.
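SIGNUM itself is Matlab and TIN-based; as an independent illustration of the simplest process it simulates, linear hillslope diffusion dz/dt = κ∇²z, here is a minimal sketch on a regular grid (the grid, boundary treatment and parameter values are simplifications for brevity, not SIGNUM's implementation):

```python
# Explicit linear diffusion of a topography z on a regular grid.
import numpy as np

def diffuse(z, kappa=0.01, dx=1.0, dt=1.0, steps=100):
    for _ in range(steps):
        # Five-point Laplacian; np.roll gives periodic boundaries.
        lap = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
               np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4 * z) / dx**2
        z = z + dt * kappa * lap  # explicit update; stable if dt <= dx^2/(4*kappa)
    return z

z = np.zeros((50, 50))
z[25, 25] = 100.0        # an initial peak to be smoothed
print(diffuse(z).max())  # the peak decays as material diffuses downslope
```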
A Simple Mechanical Model for the Isotropic Harmonic Oscillator
ERIC Educational Resources Information Center
Nita, Gelu M.
2010-01-01
A constrained elastic pendulum is proposed as a simple mechanical model for the isotropic harmonic oscillator. The conceptual and mathematical simplicity of this model recommends it as an effective pedagogical tool in teaching basic physics concepts at advanced high school and introductory undergraduate course levels. (Contains 2 figures.)
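The abstract does not restate the model's equations; for reference, the defining relations of the isotropic harmonic oscillator that any such mechanical analogue must reproduce (standard results, not taken from the paper) are:

```latex
% Isotropic harmonic oscillator: a linear central restoring force with
% the same stiffness k in every direction.
\begin{align}
  m\,\ddot{\mathbf{r}} &= -k\,\mathbf{r}, \qquad \omega = \sqrt{k/m},\\
  \mathbf{r}(t) &= \mathbf{A}\cos(\omega t) + \mathbf{B}\sin(\omega t).
\end{align}
% Every bounded orbit is therefore an ellipse centered on the origin,
% with each Cartesian component oscillating at the same frequency.
```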
The Simple Theory of Public Library Services.
ERIC Educational Resources Information Center
Newhouse, Joseph P.
A simple normative theory applicable to public library services was developed as a tool to aid libraries in answering the question: which books should be bought by the library? Although developed for normative purposes, the theory generates testable predictions. It is relevant to measuring benefits from services which are provided publicly because…
Safety in the Chemical Laboratory: Laboratory Air Quality: Part I. A Concentration Model.
ERIC Educational Resources Information Center
Butcher, Samuel S.; And Others
1985-01-01
Offers a simple model for estimating vapor concentrations in instructional laboratories. Three methods are described for measuring ventilation rates, and the results of measurements in six laboratories are presented. The model should provide a simple screening tool for evaluating worst-case personal exposures. (JN)
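The article's model is not reproduced in the abstract; screening models of this kind usually take the standard well-mixed ("one-box") form, stated here as background rather than as the authors' exact formulation:

```latex
% One-box model: V = room volume, Q = ventilation rate,
% G = vapor generation rate, C(t) = concentration (C(0) = 0 assumed).
\begin{align}
  V\,\frac{dC}{dt} &= G - Q\,C,\\
  C(t) &= \frac{G}{Q}\left(1 - e^{-Qt/V}\right),
  \qquad C_\infty = \frac{G}{Q}.
\end{align}
```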
USDA-ARS's Scientific Manuscript database
Simple sequence repeat (SSR) markers are widely used tools for inferences about genetic diversity, phylogeography and spatial genetic structure. Their applications assume that variation among alleles is essentially caused by an expansion or contraction of the number of repeats and that, accessorily,...
López Varela, Maria Victorina; Montes de Oca, Maria; Rey, Alejandra; Casas, Alejandro; Stirbulov, Roberto; Di Boscio, Valentina
2016-10-01
Opportunistic chronic obstructive pulmonary disease (COPD) case-finding approaches for high-risk individuals with or without symptoms are a feasible option for disease identification. PUMA is an opportunistic case-finding study conducted in the primary care setting of Argentina, Colombia, Venezuela and Uruguay. The objectives were to measure COPD prevalence in an at-risk population visiting primary care for any reason, and to assess the yield of this opportunistic approach and the accuracy of a score developed to detect COPD. Subjects attending routine primary care visits, ≥40 years of age, current or former smokers or exposed to biomass smoke, completed a questionnaire and performed spirometry. COPD was defined as a post-bronchodilator (post-BD) forced expiratory volume in 1 s (FEV1)/forced vital capacity (FVC) ratio < 0.70 and, alternatively, below the lower limit of normal of FEV1/FVC. A total of 1743 subjects completed the interview; 1540 performed acceptable spirometry. COPD prevalence was 20.1% (n = 309; ranging from 11.0% in Venezuela to 29.6% in Argentina) when defined using post-BD FEV1/FVC < 0.70, and 14.7% (n = 226; ranging from 8.3% in Venezuela to 21.8% in Colombia) using the lower limit of normal. Logistic regression analysis for both definitions showed that the risk of COPD was significantly higher for persons >50 years, heavy smokers (>30 pack-years), with dyspnoea, and having prior spirometry. A simple score and a weighted score, constructed using the following predictive factors: gender, age, pack-years of smoking, dyspnoea, sputum, cough and prior spirometry, had a mean accuracy for detecting COPD (post-BD FEV1/FVC < 0.70) of 76% and 79% for the simple and weighted scores, respectively. This simple seven-item score is an accurate screening tool to select subjects for spirometry in primary care. © 2016 Asian Pacific Society of Respirology.
Kangaroo – A pattern-matching program for biological sequences
2002-01-01
Background Biologists are often interested in performing a simple database search to identify proteins or genes that contain a well-defined sequence pattern. Many databases do not provide straightforward or readily available query tools to perform simple searches, such as identifying transcription binding sites, protein motifs, or repetitive DNA sequences. However, in many cases simple pattern-matching searches can reveal a wealth of information. We present in this paper a regular expression pattern-matching tool that was used to identify short repetitive DNA sequences in human coding regions for the purpose of identifying potential mutation sites in mismatch repair deficient cells. Results Kangaroo is a web-based regular expression pattern-matching program that can search for patterns in DNA, protein, or coding region sequences in ten different organisms. The program is implemented to facilitate a wide range of queries with no restriction on the length or complexity of the query expression. The program is accessible on the web at http://bioinfo.mshri.on.ca/kangaroo/ and the source code is freely distributed at http://sourceforge.net/projects/slritools/. Conclusion A low-level simple pattern-matching application can prove to be a useful tool in many research settings. For example, Kangaroo was used to identify potential genetic targets in a human colorectal cancer variant that is characterized by a high frequency of mutations in coding regions containing mononucleotide repeats. PMID:12150718
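A query in the spirit of the repeat searches described, using only Python's standard re module (this is an independent illustration, not Kangaroo's implementation), finds mononucleotide repeats in a DNA string:

```python
# Find mononucleotide repeats (runs of >= min_len of one base) in DNA.
import re

def mononucleotide_repeats(seq, min_len=6):
    pattern = re.compile(r"(A{%d,}|C{%d,}|G{%d,}|T{%d,})" % ((min_len,) * 4))
    return [(m.start(), m.group()) for m in pattern.finditer(seq.upper())]

print(mononucleotide_repeats("ATGAAAAAAAGCTTTTTTTCG"))
# [(3, 'AAAAAAA'), (12, 'TTTTTTT')]
```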
ERIC Educational Resources Information Center
New Teacher Project, 2011
2011-01-01
This "Rating a Teacher Observation Tool" identifies five simple questions and provides an easy-to-use scorecard to help policymakers decide whether an observation framework is likely to produce fair and accurate results. The five questions are: (1) Do the criteria and tools cover the classroom performance areas most connected to student outcomes?…
NASA Technical Reports Server (NTRS)
Casey, E. J.; Commadore, C. C.; Ingles, M. E.
1980-01-01
Long wire bundles twist into uniform spiral harnesses with help of simple apparatus. Wires pass through spacers and through hand-held tool with hole for each wire. Ends are attached to low speed bench motor. As motor turns, operator moves hand tool away forming smooth twists in wires between motor and tool. Technique produces harnesses that generate less radio-frequency interference than do irregularly twisted cables.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorokine, Alexandre
2011-10-01
Simple Ontology Format (SOFT) library and file format specification provides a set of simple tools for developing and maintaining ontologies. The library, implemented as a Perl module, supports parsing and verification of files in the SOFT format, operations on ontologies (adding, removing, or filtering entities), and conversion of ontologies into other formats. SOFT allows users to quickly create an ontology using only a basic text editor, verify it, and portray it in a graph layout system using customized styles.
ERPLAB: an open-source toolbox for the analysis of event-related potentials
Lopez-Calderon, Javier; Luck, Steven J.
2014-01-01
ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations. PMID:24782741
Mimee, Benjamin; Duceppe, Marc-Olivier; Véronneau, Pierre-Yves; Lafond-Lapalme, Joël; Jean, Martine; Belzile, François; Bélair, Guy
2015-11-01
Cyst nematodes are important agricultural pests responsible for billions of dollars of losses each year. Plant resistance is the most effective management tool, but it requires a close monitoring of population genetics. Current technologies for pathotyping and genotyping cyst nematodes are time-consuming, expensive and imprecise. In this study, we capitalized on the reproduction mode of cyst nematodes to develop a simple population genetic analysis pipeline based on genotyping-by-sequencing and Pool-Seq. This method yielded thousands of SNPs and allowed us to study the relationships between populations of different origins or pathotypes. Validation of the method on well-characterized populations also demonstrated that it was a powerful and accurate tool for population genetics. The genomewide allele frequencies of 23 populations of golden nematode, from nine countries and representing the five known pathotypes, were compared. A clear separation of the pathotypes and fine genetic relationships between and among global populations were obtained using this method. In addition to being powerful, this tool has proven to be very time- and cost-efficient and could be applied to other cyst nematode species. © 2015 Her Majesty the Queen in Right of Canada Molecular Ecology Resources © 2015 John Wiley & Sons Ltd Reproduced with the permission of the Minister of Agriculture and Agri-food.
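As a toy illustration of the quantity such a Pool-Seq pipeline estimates at each SNP, the per-site alternate-allele frequency of a pooled population sample is simply the alternate read count over the total depth (the counts below are invented):

```python
# Hypothetical pooled read counts per SNP: (reference reads, alternate reads).
pool_counts = {"snp_1": (85, 15), "snp_2": (40, 60), "snp_3": (98, 2)}

# Estimated population allele frequency at each site from the pooled reads.
allele_freq = {snp: alt / (ref + alt) for snp, (ref, alt) in pool_counts.items()}
print(allele_freq)  # {'snp_1': 0.15, 'snp_2': 0.6, 'snp_3': 0.02}
```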
Development of a software tool to support chemical and biological terrorism intelligence analysis
NASA Astrophysics Data System (ADS)
Hunt, Allen R.; Foreman, William
1997-01-01
AKELA has developed a software tool which uses a systems-analytic approach to model the critical processes which support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these four model components.
Névéol, Aurélie; Pereira, Suzanne; Kerdelhué, Gaetan; Dahamna, Badisse; Joubert, Michel; Darmoni, Stéfan J
2007-01-01
The growing number of resources to be indexed in the catalogue of online health resources in French (CISMeF) calls for curating strategies involving automatic indexing tools while maintaining the catalogue's high indexing quality standards. The goal was to develop a simple automatic tool that retrieves MeSH descriptors from document titles. In parallel to research on advanced indexing methods, a bag-of-words tool was developed for timely inclusion in CISMeF's maintenance system. An evaluation was carried out on a corpus of 99 documents. The indexing sets retrieved by the automatic tool were compared to manual indexing based on the title and on the full text of resources. 58% of the major main headings were retrieved by the bag-of-words algorithm, and the precision on main heading retrieval was 69%. Bag-of-words indexing has effectively been used on selected resources to be included in CISMeF since August 2006. Meanwhile, ongoing work aims at improving the current version of the tool.
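A hedged sketch of title-only bag-of-words indexing in the spirit of the CISMeF tool: tokenize the title and look each token up in a term-to-descriptor dictionary. The tiny French-term-to-MeSH dictionary below is an invented placeholder, not CISMeF data:

```python
# Title-based bag-of-words indexing against a term-to-MeSH dictionary.
import re

TERM_TO_MESH = {"asthme": "Asthma", "enfant": "Child", "vaccination": "Vaccination"}

def index_title(title):
    # Split on anything that is not a (possibly accented) letter.
    tokens = re.findall(r"[a-zàâçéèêëîïôöûùüÿ]+", title.lower())
    return sorted({TERM_TO_MESH[t] for t in tokens if t in TERM_TO_MESH})

print(index_title("Asthme et vaccination chez l'enfant"))
# ['Asthma', 'Child', 'Vaccination']
```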
Cox, Ruth; Sanchez, Javier; Revie, Crawford W
2013-01-01
Global climate change is known to result in the emergence or re-emergence of some infectious diseases. Reliable methods to identify the infectious diseases of humans and animals that are most likely to be influenced by climate are therefore required. Since different priorities will affect the decision to address a particular pathogen threat, decision makers need a standardised method of prioritisation. Ranking methods and Multi-Criteria Decision approaches provide such a standardised method and were employed here to design two different pathogen prioritisation tools. The opinion of 64 experts was elicited to assess the importance of 40 criteria that could be used to prioritise emerging infectious diseases of humans and animals in Canada. A weight was calculated for each criterion according to the expert opinion. Attributes were defined for each criterion as a transparent and repeatable method of measurement. Two different Multi-Criteria Decision Analysis tools were tested, both of which used an additive aggregation approach. These were an Excel spreadsheet tool and a tool developed in the software 'M-MACBETH'. The tools were trialed on nine 'test' pathogens. Two different methods of criteria weighting were compared, one using fixed weighting values, the other using probability distributions to account for uncertainty and variation in expert opinion. The ranking of the nine pathogens varied according to the weighting method that was used. In both tools, using both weighting methods, the diseases that tended to rank the highest were West Nile virus, Giardiasis and Chagas, while Coccidioidomycosis tended to rank the lowest. Both tools offer a simple and user-friendly approach to prioritising pathogens according to climate change by including explicit scoring of 40 criteria and incorporating weighting methods based on expert opinion. They provide a dynamic interactive method that can help to identify pathogens for which a full risk assessment should be pursued.
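The additive aggregation both tools use has a simple form: each pathogen's overall score is the weighted sum of its criterion scores. The criteria, weights and scores below are invented placeholders (the study used 40 expert-weighted criteria):

```python
# Additive Multi-Criteria Decision Analysis aggregation (toy example).
WEIGHTS = {"severity": 0.4, "climate_sensitivity": 0.35, "spread": 0.25}

def mcda_score(scores):
    # Weighted sum of normalized criterion scores.
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

pathogens = {
    "West Nile virus":    {"severity": 0.8, "climate_sensitivity": 0.9, "spread": 0.7},
    "Coccidioidomycosis": {"severity": 0.5, "climate_sensitivity": 0.3, "spread": 0.2},
}
ranked = sorted(pathogens, key=lambda p: mcda_score(pathogens[p]), reverse=True)
print(ranked)  # highest-scoring pathogen first
```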
The Statistical Consulting Center for Astronomy (SCCA)
NASA Technical Reports Server (NTRS)
Akritas, Michael
2001-01-01
The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like χ² minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools, and matched these tools with specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail, and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.stat.psu.edu/~mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing statistically oriented papers submitted to the Astrophysical Journal, giving talks at meetings, including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and publishing papers of astrostatistical content.
Tool for analyzing the vulnerability of buildings to flooding: the case of Switzerland
NASA Astrophysics Data System (ADS)
Choffet, Marc; Bianchi, Renzo; Jaboyedoff, Michel; Kölz, Ehrfried; Lateltin, Olivier; Leroi, Eric; Mayis, Arnaud
2010-05-01
However property exposed to flooding is protected, a residual risk remains; this is what feedback from past flooding events shows. This residual risk is linked, on the one hand, with the possibility that the protection measures may fail or may not work as intended and, on the other hand, with the possibility that the flood exceeds the chosen level of protection. In many European countries, governments and insurance companies are thinking in terms of vulnerability reduction. This publication presents a new tool to evaluate the vulnerability of buildings in a context of flooding. The tool is developed within the project "Analysis of the vulnerability of buildings to flooding", funded by the Foundation for Prevention of the Cantonal Insurances, Switzerland. It is composed of three modules and aims to provide a method for reducing the vulnerability of buildings to flooding. The first two modules allow identifying and listing all the elements composing the building. The third module is dedicated to the choice of efficient risk-reducing measures on the basis of cost-benefit analyses. The diagnostic tool for different parts of the building is being developed to allow real estate appraisers, insurance companies and homeowners to rapidly assess the vulnerability of buildings in flood-prone areas. The tool is driven by several databases built from the collection and analysis of data, information, standards and feedback from risk management, hydrology, architecture, construction, materials engineering, insurance, and the economics of construction. A method for determining the local hazard is also proposed, to estimate the height of potential floods threatening a building, based on a back-analysis of Swiss hazard maps. To calibrate the model, seven cantonal insurance institutions participate in the study by providing data, such as the amount of damage in flooded areas. The poster will present some results from the development of the tool, such as the amount of damage to buildings and the analysis possibilities the tool offers. Furthermore, analysis of data from the insurance companies revealed trends in the costs of damage due to flooding; some graphics illustrating the tool design will be presented in the poster. It will be shown that the tool allows for a census of buildings and raises awareness of their vulnerability to flooding. An explanation of the database development concerning remediation measure costs and damage costs is also proposed, together with simple and innovative remedial measures. With the help of some examples, it is shown that the tool opens interesting perspectives for the development of insurance strategies for building stocks in flood-prone areas.
IDD Info: a software to manage surveillance data of Iodine Deficiency Disorders.
Liu, Peng; Teng, Bai-Jun; Zhang, Shu-Bin; Su, Xiao-Hui; Yu, Jun; Liu, Shou-Jun
2011-08-01
IDD Info, a new software package for managing survey data on Iodine Deficiency Disorders (IDD), is presented in this paper. IDD Info aims to create IDD project databases, to process and analyze various national or regional surveillance data, and to produce the final report. It provides functions for choosing a database from existing ones, revising it, selecting indicators from a pool to establish a database, and adding indicators to the pool. It also provides simple tools to scan one database and compare two databases, to set IDD standard parameters, to analyze data by single or multiple indicators, and finally to form a typeset report with customized content. IDD Info was developed using the Chinese national IDD surveillance data of 2005. Its validity was evaluated by comparison with the survey report produced by China CDC. IDD Info is a professional analysis tool that speeds up IDD data analysis by about 14.28% with respect to standard reference routines. It consequently enhances analysis performance and user compliance. IDD Info is a practical and accurate means of managing the multifarious IDD surveillance data that can be widely used by non-statisticians in national and regional IDD surveillance. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Mugnes, J.-M.; Robert, C.
2015-11-01
Spectral analysis is a powerful tool to investigate stellar properties and it has been widely used for decades now. However, the methods considered to perform this kind of analysis are mostly based on iteration among a few diagnostic lines to determine the stellar parameters. While these methods are often simple and fast, they can lead to errors and large uncertainties due to the required assumptions. Here, we present a method based on Bayesian statistics to find simultaneously the best combination of effective temperature, surface gravity, projected rotational velocity, and microturbulence velocity, using all the available spectral lines. Different tests are discussed to demonstrate the strength of our method, which we apply to 54 mid-resolution spectra of field and cluster B stars obtained at the Observatoire du Mont-Mégantic. We compare our results with those found in the literature. Differences are seen which are well explained by the different methods used. We conclude that the B-star microturbulence velocities are often underestimated. We also confirm the trend that B stars in clusters are on average faster rotators than field B stars.
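To make the Bayesian approach concrete, the sketch below estimates two stellar parameters on a simple grid by evaluating a Gaussian likelihood over all spectral pixels at once, in the spirit of the method described above. It is not the authors' pipeline: model_spectrum is a hypothetical toy stand-in for interpolation in a grid of synthetic spectra, and flat priors are assumed.

import numpy as np

def model_spectrum(teff, logg, wavelengths):
    # Hypothetical toy model: a single line whose depth and width depend
    # on the parameters (a real analysis would interpolate in a grid of
    # synthetic spectra).
    depth = 0.5 * (teff / 20000.0)
    width = 0.5 + 0.1 * logg
    return 1.0 - depth * np.exp(-0.5 * ((wavelengths - 4471.0) / width) ** 2)

def posterior_grid(wavelengths, observed, sigma, teffs, loggs):
    logp = np.empty((len(teffs), len(loggs)))
    for i, t in enumerate(teffs):
        for j, g in enumerate(loggs):
            resid = observed - model_spectrum(t, g, wavelengths)
            logp[i, j] = -0.5 * np.sum((resid / sigma) ** 2)  # flat priors
    logp -= logp.max()                    # numerical stability
    p = np.exp(logp)
    return p / p.sum()

wl = np.linspace(4466.0, 4476.0, 200)
obs = model_spectrum(17000.0, 4.0, wl) + np.random.default_rng(1).normal(0.0, 0.01, wl.size)
teffs = np.linspace(15000.0, 20000.0, 26)
loggs = np.linspace(3.5, 4.5, 21)
post = posterior_grid(wl, obs, 0.01, teffs, loggs)
print("posterior-mean Teff:", (post.sum(axis=1) * teffs).sum())
print("posterior-mean log g:", (post.sum(axis=0) * loggs).sum())

A full analysis in the spirit of the paper would extend the grid to four parameters (effective temperature, surface gravity, projected rotational velocity, and microturbulence) and marginalize in the same way.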
Functional analysis of regulatory single-nucleotide polymorphisms.
Pampín, Sandra; Rodríguez-Rey, José C
2007-04-01
The identification of regulatory polymorphisms has become a key problem in human genetics. In the past few years there has been a conceptual change in the way in which regulatory single-nucleotide polymorphisms are studied. We review the new approaches and discuss how gene expression studies can contribute to a better knowledge of the genetics of common diseases. New techniques for the association of single-nucleotide polymorphisms with changes in gene expression have recently been developed. This, together with a more comprehensive use of the older in-vitro methods, has produced a great amount of genetic information. When added to current databases, it will help to design better tools for the detection of regulatory single-nucleotide polymorphisms. The identification of functional regulatory single-nucleotide polymorphisms cannot be done by simple inspection of the DNA sequence. In-vivo techniques, based on primer extension, and the more recently developed 'haploChIP' allow the association of gene variants with changes in gene expression. Gene expression analysis by conventional in-vitro techniques is the only way to identify the functional consequences of regulatory single-nucleotide polymorphisms. The amount of information produced in the last few years will help to refine the tools for the future analysis of regulatory gene variants.
NASA Technical Reports Server (NTRS)
Funk, Christie J.; Perry, Boyd, III; Silva, Walter A.; Newman, Brett
2014-01-01
A software program and associated methodology to study gust loading on aircraft exists for a class of geometrically simplified flexible configurations. This program consists of a simple aircraft response model with two rigid and three flexible symmetric degrees of freedom and allows for the calculation of various airplane responses due to a discrete one-minus-cosine gust as well as continuous turbulence. Simplifications, assumptions, and opportunities for potential improvements pertaining to the existing software program are first identified; then a revised version of the original software tool is developed with improved methodology to include more complex geometries, additional excitation cases, and additional output data so as to provide a more useful and precise tool for gust load analysis. To enhance the usefulness of the original software program, a wing control surface and a horizontal tail control surface are added, an extended application of the discrete one-minus-cosine gust input is employed, a supplemental continuous turbulence spectrum is implemented, and a capability to animate the total vehicle deformation response to gust inputs is included. These revisions and enhancements are implemented, and an analysis of the results is used to validate the modifications.
Fischer, John P; Nelson, Jonas A; Shang, Eric K; Wink, Jason D; Wingate, Nicholas A; Woo, Edward Y; Jackson, Benjamin M; Kovach, Stephen J; Kanchwala, Suhail
2014-12-01
Groin wound complications after open vascular surgery procedures are common, morbid, and costly. The purpose of this study was to generate a simple, validated, clinically usable risk assessment tool for predicting groin wound morbidity after infra-inguinal vascular surgery. A retrospective review of consecutive patients undergoing groin cutdowns for femoral access between 2005 and 2011 was performed. Patients necessitating salvage flaps were compared to those who did not, and a stepwise logistic regression was performed and validated using a bootstrap technique. Utilising this analysis, a simplified risk score was developed to predict the risk of developing a wound which would necessitate salvage. A total of 925 patients were included in the study. The salvage flap rate was 11.2% (n = 104). Predictors determined by logistic regression included prior groin surgery (OR = 4.0, p < 0.001), prosthetic graft (OR = 2.7, p < 0.001), coronary artery disease (OR = 1.8, p = 0.019), peripheral arterial disease (OR = 5.0, p < 0.001), and obesity (OR = 1.7, p = 0.039). Based upon the respective logistic coefficients, a simplified scoring system was developed to enable preoperative risk stratification regarding the likelihood of a significant complication which would require a salvage muscle flap. The c-statistic for the regression demonstrated excellent discrimination at 0.89. This study presents a simple, internally validated risk assessment tool that accurately predicts wound morbidity requiring flap salvage in open groin vascular surgery patients. The preoperatively high-risk patient can be identified and selectively targeted as a candidate for a prophylactic muscle flap.
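The abstract reports the odds ratios but not the published point values, so the following is a hedged sketch of how such a score is typically assembled: points proportional to the logistic coefficients (the natural log of each odds ratio), summed over the risk factors present. The point scale and rounding below are illustrative assumptions, not the authors' published score.

import math

ODDS_RATIOS = {
    "prior_groin_surgery": 4.0,
    "prosthetic_graft": 2.7,
    "coronary_artery_disease": 1.8,
    "peripheral_arterial_disease": 5.0,
    "obesity": 1.7,
}

# One point per ~0.5 units of ln(OR), rounded to an integer (assumed scale).
POINTS = {k: round(math.log(v) / 0.5) for k, v in ODDS_RATIOS.items()}

def risk_score(patient):
    """Sum points for each risk factor present (patient: dict of bools)."""
    return sum(POINTS[f] for f, present in patient.items() if present)

example = {
    "prior_groin_surgery": True,
    "prosthetic_graft": False,
    "coronary_artery_disease": True,
    "peripheral_arterial_disease": True,
    "obesity": False,
}
print(POINTS)
print("score:", risk_score(example))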
Thermal performance modeling of NASA's scientific balloons
NASA Astrophysics Data System (ADS)
Franco, H.; Cathey, H.
The flight performance of a scientific balloon is highly dependent on the interaction between the balloon and its environment. The balloon is a thermal vehicle. Modeling a scientific balloon's thermal performance has proven to be a difficult analytical task. Most previous thermal models have attempted these analyses by using either a bulk thermal model approach or simplified representations of the balloon. These approaches have to date provided reasonable, but not very accurate, results. Improvements have been made in recent years using thermal analysis tools developed for the thermal modeling of spacecraft and other sophisticated heat transfer problems. These tools, which now allow for accurate modeling of highly transmissive materials, have been applied to the thermal analysis of NASA's scientific balloons. A research effort has been started that utilizes the "Thermal Desktop" add-on to AutoCAD. This paper will discuss the development of thermal models for both conventional and Ultra Long Duration super-pressure balloons. This research effort has focused on incremental stages of analysis to assess the accuracy of the tool and the model resolution required to produce usable data. The first-stage balloon thermal analyses started with simple spherical balloon models with a limited number of nodes, and expanded the number of nodes to determine the required model resolution. These models were then modified to include additional details such as load tapes. The second-stage analyses looked at natural-shaped zero-pressure balloons. Load tapes were then added to these shapes, again with the goal of determining the required modeling accuracy by varying the number of gores. The third stage, following the same steps as the zero-pressure balloon efforts, was directed at modeling super-pressure pumpkin-shaped balloons. The results were then used to develop analysis guidelines and an approach for modeling balloons, both for simple first-order estimates and for detailed full models. The development of the radiative environment and program input files, the development of the modeling techniques for balloons, and the development of appropriate data output handling techniques for both the raw data and data plots will be discussed. A general guideline to match predicted balloon performance with known flight data will also be presented. One long-term goal of this effort is to develop simplified approaches and techniques to include the results in performance codes being developed.
J-Earth: An Essential Resource for Terrestrial Remote Sensing and Data Analysis
NASA Astrophysics Data System (ADS)
Dunn, S.; Rupp, J.; Cheeseman, S.; Christensen, P. R.; Prashad, L. C.; Dickenshied, S.; Anwar, S.; Noss, D.; Murray, K.
2011-12-01
There is a need for a software tool that can display and analyze various types of earth science and social data through a simple, user-friendly interface. The J-Earth software tool has been designed to be easily accessible for download and intuitive to use, regardless of the technical background of the user base. This tool does not require courses or textbooks to learn, yet is powerful enough to allow a more general community of users to perform complex data analysis. Professions that will benefit from this tool range from geologists, geographers, and climatologists to sociologists, economists, and ecologists, as well as policy makers. J-Earth was developed by the Arizona State University Mars Space Flight Facility as part of the JMARS (Java Mission-planning and Analysis for Remote Sensing) suite of open-source tools. The program is a Geographic Information Systems (GIS) application used for viewing and processing satellite and airborne remote sensing data. While the functionality of JMARS has historically focused on the research needs of the planetary science community, J-Earth has been designed for a much broader Earth-based user audience. NASA instrument products accessible within J-Earth include data from ASTER, GOES, Landsat, MODIS, and TIMS. While J-Earth contains exceptionally comprehensive and high-resolution satellite-derived data and imagery, this tool also includes many socioeconomic data products from projects led by international organizations and universities. Datasets used in J-Earth take the form of grids, rasters, remote sensor "stamps", maps, and shapefiles. Some highly demanded global datasets available within J-Earth include five levels of administrative/political boundaries, climate data for current conditions as well as models for future climates, population counts and densities, land cover/land use, and poverty indicators. While this application shares the same powerful functionality as JMARS, J-Earth's appearance is enhanced for much easier data analysis. J-Earth utilizes a layering system to view data from different sources, which can then be exported, scaled, colored, and superimposed for quick comparisons. Users may now perform spatial analysis over several diverse datasets with respect to a defined geographic area or the entire globe. In addition, several newly acquired global datasets contain a temporal dimension which, when accessed through J-Earth, makes this a unique and powerful tool for spatial analysis over time. The functionality and ease of use set J-Earth apart from other terrestrial GIS software packages and enable endless social, political, and scientific possibilities.
Pan, Hongwei; Lei, Hongjun; Liu, Xin; Wei, Huaibin; Liu, Shufang
2017-09-01
A large number of simple and informal landfills exist in developing countries, posing tremendous soil and groundwater pollution threats. Early warning and monitoring of landfill leachate pollution status is of great importance. However, there is a shortage of affordable and effective tools and methods. In this study, a soil column experiment was performed to simulate the pollution status of leachate using three-dimensional excitation-emission fluorescence (3D-EEMF) and parallel factor analysis (PARAFAC) models. The sum of squared residuals (SSR) and principal component analysis (PCA) were used to determine the optimal number of components for PARAFAC. A one-way analysis of variance showed that the component scores of the soil column leachate were significantly influenced by landfill leachate (p < 0.05). Therefore, the ratio of the component scores of the soil under the landfill to those of natural soil could be used to evaluate the leakage status of landfill leachate. Furthermore, a hazard index (HI) and a hazard evaluation standard were established. A case study of the Kaifeng landfill indicated a low hazard (level 5) by use of the HI. In summary, the HI is presented as a tool to evaluate landfill pollution status and to guide municipal solid waste management. Copyright © 2017 Elsevier Ltd. All rights reserved.
French, William R; Zimmerman, Lisa J; Schilling, Birgit; Gibson, Bradford W; Miller, Christine A; Townsend, R Reid; Sherrod, Stacy D; Goodwin, Cody R; McLean, John A; Tabb, David L
2015-02-06
We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1-100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets.
IT-based wellness tools for older adults: Design concepts and feedback.
Joe, Jonathan; Hall, Amanda; Chi, Nai-Ching; Thompson, Hilaire; Demiris, George
2018-03-01
To explore older adults' preferences regarding e-health applications through use of generated concepts that inform wellness tool design. The 6-8-5 method and affinity mapping were used to create e-health design ideas that were translated into storyboards and scenarios. Focus groups were conducted to obtain feedback on the prototypes and included participant sketching. A qualitative analysis of the focus groups for emerging themes was conducted, and sketches were analyzed. Forty-three older adults participated in six focus group sessions. The majority of participants found the wellness tools useful. Preferences included features that supported participants in areas of unmet needs, such as ability to find reliable health information, cognitive training, or maintaining social ties. Participants favored features such as use of voice navigation, but were concerned over cost and the need for technology skills and access. Sketches reinforced these wants, including portability, convenience, and simplicity. Several factors were found to increase the desirability of such devices including convenient access to their health and health information, a simple, accessible interface, and support for memory issues. Researchers and designers should incorporate the feedback of older adults regarding wellness tools, so that future designs meet the needs of older adults.
Mohammed, Emad A.; Naugler, Christopher
2017-01-01
Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996
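As a rough illustration of the models the tool ranks, the sketch below fits the Holt-Winters additive and multiplicative models to synthetic monthly test volumes with the statsmodels library and picks the better fit. It is an assumption-laden stand-in for the workflow, not the authors' web application; the ranking criterion (in-sample mean squared error) is one reasonable choice among several.

import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic monthly test volumes with trend and seasonality.
rng = np.random.default_rng(0)
idx = pd.date_range("2015-01-01", periods=60, freq="MS")
volumes = pd.Series(
    1000 + 5 * np.arange(60)
    + 100 * np.sin(2 * np.pi * np.arange(60) / 12)
    + rng.normal(0, 20, 60),
    index=idx,
)

fits = {
    "HW additive": ExponentialSmoothing(
        volumes, trend="add", seasonal="add", seasonal_periods=12).fit(),
    "HW multiplicative": ExponentialSmoothing(
        volumes, trend="add", seasonal="mul", seasonal_periods=12).fit(),
}
# Rank models by in-sample error (AIC is another common choice).
best = min(fits, key=lambda k: np.mean(fits[k].resid ** 2))
print("best model:", best)
print(fits[best].forecast(12))  # next 12 months of predicted demand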
Analytical Tools in School Finance Reform.
ERIC Educational Resources Information Center
Johns, R. L.
This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…
Airtightness the simple(CS) way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, S.
Builders who might buck against such time-consuming air sealing methods as polyethylene wrap and the airtight drywall approach (ADA) may respond better to current strategies. One such method, called SimpleCS, has proven especially effective. SimpleCS, pronounced simplex, stands for simple caulk and seal. A modification of the ADA, SimpleCS is an air-sealing management tool, a simplified systems approach to building tight homes. The system addresses the crucial question of when and by whom various air sealing steps should be done. It avoids the problems that often occur when later contractors cut open polyethylene wrap to drill holes in the drywall. The author describes how SimpleCS works, and the cost and training involved.
Optimal low thrust geocentric transfer. [mission analysis computer program]
NASA Technical Reports Server (NTRS)
Edelbaum, T. N.; Sackett, L. L.; Malchow, H. L.
1973-01-01
A computer code which will rapidly calculate time-optimal low thrust transfers is being developed as a mission analysis tool. The final program will apply to NEP or SEP missions and will include a variety of environmental effects. The current program assumes constant acceleration. The oblateness effect and shadowing may be included. Detailed state and costate equations are given for the thrust effect, oblateness effect, and shadowing. A simple but adequate model yields analytical formulas for power degradation due to the Van Allen radiation belts for SEP missions. The program avoids the classical singularities by the use of equinoctial orbital elements. Kryloff-Bogoliuboff averaging is used to facilitate rapid calculation. Results for selected cases using the current program are given.
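For context, Edelbaum's classical constant-acceleration result gives a closed-form delta-v for a low-thrust transfer between circular orbits with a plane change; the short sketch below applies it as a back-of-the-envelope check. It is not extracted from the program described above, and the thrust acceleration value is an assumption.

import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def edelbaum_dv(r0, r1, delta_i_deg):
    """Delta-v (m/s) for a low-thrust circle-to-circle transfer with plane change."""
    v0 = math.sqrt(MU_EARTH / r0)  # initial circular orbit speed
    v1 = math.sqrt(MU_EARTH / r1)  # final circular orbit speed
    di = math.radians(delta_i_deg)
    return math.sqrt(v0**2 - 2.0*v0*v1*math.cos(math.pi/2.0*di) + v1**2)

# LEO (7000 km radius) to GEO (42164 km) with a 28.5 deg plane change:
dv = edelbaum_dv(7.0e6, 4.2164e7, 28.5)
accel = 1e-4  # constant thrust acceleration, m/s^2 (assumed)
print(f"delta-v = {dv:.0f} m/s, transfer time = {dv/accel/86400:.0f} days")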
LOD significance thresholds for QTL analysis in experimental populations of diploid species
Van Ooijen JW
1999-11-01
Linkage analysis with molecular genetic markers is a very powerful tool in the biological research of quantitative traits. The lack of an easy way to determine which areas of the genome can be designated as statistically significant for containing a gene affecting the quantitative trait of interest hampers the important prediction of the false-positive rate. In this paper four tables, obtained by large-scale simulations, are presented that can be used with a simple formula to obtain the false-positive rate for analyses of the standard types of experimental populations of diploid species with any genome size. A new definition of the term 'suggestive linkage' is proposed that allows a more objective comparison of results across species.
Brown, Ryan M; Meah, Christopher J; Heath, Victoria L; Styles, Iain B; Bicknell, Roy
2016-01-01
Angiogenesis involves the generation of new blood vessels from the existing vasculature and is dependent on many growth factors and signaling events. In vivo angiogenesis is dynamic and complex, meaning assays are commonly utilized to explore specific targets for research into this area. Tube-forming assays offer an excellent overview of the molecular processes in angiogenesis. The Matrigel tube forming assay is a simple-to-implement but powerful tool for identifying biomolecules involved in angiogenesis. A detailed experimental protocol on the implementation of the assay is described in conjunction with an in-depth review of methods that can be applied to the analysis of the tube formation. In addition, an ImageJ plug-in is presented which allows automatic quantification of tube images reducing analysis times while removing user bias and subjectivity.
A simple computer-based measurement and analysis system of pulmonary auscultation sounds.
Polat, Hüseyin; Güler, Inan
2004-12-01
Listening to various lung sounds has proven to be an important diagnostic tool for detecting and monitoring certain types of lung diseases. In this study a computer-based system has been designed for easy measurement and analysis of lung sound using the software package DasyLAB. The designed system presents the following features: it is able to digitally record the lung sounds which are captured with an electronic stethoscope plugged to a sound card on a portable computer, display the lung sound waveform for auscultation sites, record the lung sound into the ASCII format, acoustically reproduce the lung sound, edit and print the sound waveforms, display its time-expanded waveform, compute the Fast Fourier Transform (FFT), and display the power spectrum and spectrogram.
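A minimal modern equivalent of the described processing chain, assuming a digitized recording, is sketched below: compute a Welch power spectrum and a spectrogram with SciPy. The synthetic signal and sampling rate are stand-ins for a stethoscope capture; the original system used the DasyLAB package, not Python.

import numpy as np
from scipy import signal

fs = 8000  # sampling rate, Hz (assumed)
t = np.arange(0, 5.0, 1 / fs)
# Synthetic stand-in for a recorded lung sound: broadband breath noise
# plus a 400 Hz wheeze-like component.
rng = np.random.default_rng(0)
sound = rng.normal(0, 0.1, t.size) + 0.3 * np.sin(2 * np.pi * 400 * t)

# Power spectrum via Welch's method (FFT-based averaging).
freqs, psd = signal.welch(sound, fs=fs, nperseg=1024)

# Time-frequency spectrogram.
f, tt, sxx = signal.spectrogram(sound, fs=fs, nperseg=256, noverlap=128)
print("peak spectral frequency: %.0f Hz" % freqs[np.argmax(psd)])
print("spectrogram shape (freq bins x time frames):", sxx.shape)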
Bishoyi, Ashok Kumar; Sharma, Anjali; Kavane, Aarti; Geetha, K A
2016-06-01
Cymbopogon is an important genus of the family Poaceae, cultivated mainly for its essential oils, which possess high medicinal and economic value. Several cultivars of Cymbopogon species are available for commercial cultivation in India, and identification of these cultivars has been carried out by means of morphological markers and essential oil composition. Since these parameters are highly influenced by environmental factors, in most cases it is difficult to identify Cymbopogon cultivars. In the present study, Random Amplified Polymorphic DNA (RAPD) and Inter-Simple Sequence Repeat (ISSR) markers were employed to discriminate nine leading varieties of Cymbopogon, since little or no prior genomic information is available for the genus. Ninety RAPD and 70 ISSR primers were used, which generated 63% and 69% polymorphic amplicons, respectively. Similarity in the pattern of the UPGMA-derived dendrograms of the RAPD and ISSR analyses revealed the reliability of the markers chosen for the study. Varietal/cultivar-specific markers generated from the study could be utilised for varietal/cultivar authentication, thus monitoring the quality of essential oil production in Cymbopogon. These markers can also be utilised for the IPR protection of the cultivars. Moreover, the study provides a molecular marker toolkit in both random and simple sequence repeats for diverse molecular research in the same or related genera.
Valletta, Elisa; Kučera, Lukáš; Prokeš, Lubomír; Amato, Filippo; Pivetta, Tiziana; Hampl, Aleš; Havel, Josef; Vaňhara, Petr
2016-01-01
Cross-contamination of eukaryotic cell lines used in biomedical research represents a highly relevant problem. Analysis of repetitive DNA sequences, such as Short Tandem Repeats (STR), or Simple Sequence Repeats (SSR), is a widely accepted, simple, and commercially available technique to authenticate cell lines. However, it provides only qualitative information that depends on the extent of reference databases for interpretation. In this work, we developed and validated a rapid and routinely applicable method for evaluation of cell culture cross-contamination levels based on mass spectrometric fingerprints of intact mammalian cells coupled with artificial neural networks (ANNs). We used human embryonic stem cells (hESCs) contaminated by either mouse embryonic stem cells (mESCs) or mouse embryonic fibroblasts (MEFs) as a model. We determined the contamination level using a mass spectra database of known calibration mixtures that served as training input for an ANN. The ANN was then capable of correct quantification of the level of contamination of hESCs by mESCs or MEFs. We demonstrate that MS analysis, when linked to proper mathematical instruments, is a tangible tool for unraveling and quantifying heterogeneity in cell cultures. The analysis is applicable in routine scenarios for cell authentication and/or cell phenotyping in general.
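The sketch below illustrates the calibration idea on synthetic data: spectra of known mixtures train a small neural network that then estimates the contamination fraction of an unknown sample. It uses scikit-learn's MLPRegressor as a generic stand-in for the ANN described in the paper, and random vectors as stand-ins for real whole-cell mass spectral fingerprints.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
n_bins = 200
profile_a = rng.random(n_bins)   # "pure culture A" fingerprint (toy)
profile_b = rng.random(n_bins)   # "pure culture B" fingerprint (toy)

def mixture_spectrum(frac_b):
    """Spectrum of a calibration mixture containing fraction frac_b of B."""
    s = (1 - frac_b) * profile_a + frac_b * profile_b
    return s + rng.normal(0, 0.02, n_bins)  # measurement noise

fracs = np.repeat(np.linspace(0, 0.5, 11), 10)     # known calibration levels
X = np.array([mixture_spectrum(f) for f in fracs])
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                     random_state=0).fit(X, fracs)

unknown = mixture_spectrum(0.15)
print("estimated contamination: %.3f" % model.predict([unknown])[0])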
ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data
Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J.; Intarapanich, Apichart; Tongsima, Sissades
2017-01-01
Background: Biochemical methods are available for enriching 5′ ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5′ ends from these data by statistical analysis of the enrichment. Although statistics-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect data distribution and thus algorithm performance. Results: We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5′ ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5′ ends than TSSAR. In general, the transcript 5′ ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. Conclusion: ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5′ ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied to analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at the ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and the GitHub repository (https://github.com/PavitaKae/ToNER). PMID:28542466
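A minimal sketch of the core statistical step, under assumptions (synthetic Poisson counts and a simple upper-tail z cutoff rather than ToNER's full procedure), is given below: form per-nucleotide enrichment ratios, Box-Cox transform them toward normality with SciPy, and call sites in the extreme upper tail.

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 10000
unenriched = rng.poisson(50, n) + 1          # +1 avoids division by zero
enriched = rng.poisson(50, n) + 1
true_starts = rng.choice(n, 20, replace=False)
enriched[true_starts] += rng.poisson(400, 20)  # spiked-in 5' ends

ratio = enriched / unenriched                 # positive, as Box-Cox requires
transformed, lam = stats.boxcox(ratio)        # MLE of the Box-Cox lambda
z = (transformed - transformed.mean()) / transformed.std()

alpha = 1e-4
cutoff = stats.norm.isf(alpha)                # upper-tail normal threshold
hits = np.flatnonzero(z > cutoff)
print(f"lambda = {lam:.2f}, {hits.size} enriched sites called")
print("true sites recovered:", np.intersect1d(hits, true_starts).size)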
Mohammad Al Alfy, Ibrahim
2018-01-01
A set of three pads was constructed from primary materials (sand, gravel, and cement) to calibrate the gamma-gamma density tool. A simple equation was devised to convert the qualitative cps values to quantitative g/cc values. The neutron-neutron porosity tool measures qualitative cps porosity values; a direct equation was derived to calculate the porosity percentage from these cps values. The cement-bond log shows the quantity of cement surrounding well pipes. Processing this log is difficult owing to various parameters, such as the drilled well diameter as well as the internal diameter, thickness, and type of the well pipes. An equation was derived to calculate the cement percentage at standard conditions; this equation can be modified according to varying conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.
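As a hedged illustration of the pad-based calibration (the paper's actual equation and pad values are not reproduced here), the sketch below fits density against the logarithm of count rate, since the gamma-gamma response falls roughly exponentially with density; all numbers are assumed.

import numpy as np

pad_density = np.array([1.8, 2.1, 2.4])        # g/cc (assumed pad values)
pad_cps = np.array([5200.0, 3900.0, 2900.0])   # measured count rates (assumed)

# Gamma-gamma response is roughly exponential in density, so fit
# density against log(cps).
coeffs = np.polyfit(np.log(pad_cps), pad_density, 1)

def cps_to_density(cps):
    """Convert a qualitative cps reading to a quantitative g/cc value."""
    return np.polyval(coeffs, np.log(cps))

print("density at 3400 cps: %.2f g/cc" % cps_to_density(3400.0))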
Using cluster analysis to organize and explore regional GPS velocities
Simpson, Robert W.; Thatcher, Wayne; Savage, James C.
2012-01-01
Cluster analysis offers a simple visual exploratory tool for the initial investigation of regional Global Positioning System (GPS) velocity observations, which are providing increasingly precise mappings of actively deforming continental lithosphere. The deformation fields from dense regional GPS networks can often be concisely described in terms of relatively coherent blocks bounded by active faults, although the choice of blocks, their number and size, can be subjective and is often guided by the distribution of known faults. To illustrate our method, we apply cluster analysis to GPS velocities from the San Francisco Bay Region, California, to search for spatially coherent patterns of deformation, including evidence of block-like behavior. The clustering process identifies four robust groupings of velocities that we identify with four crustal blocks. Although the analysis uses no prior geologic information other than the GPS velocities, the cluster/block boundaries track three major faults, both locked and creeping.
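A minimal sketch of this exploratory step, assuming station velocities given as east and north components, is shown below using k-means from scikit-learn on synthetic data; the study's own clustering algorithm and velocity data are not reproduced, and choosing the number of clusters is itself part of the exploration.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Four synthetic "blocks", each moving with its own mean velocity (mm/yr).
block_means = np.array([[20.0, 8.0], [14.0, 5.0], [8.0, 3.0], [2.0, 1.0]])
velocities = np.vstack([m + rng.normal(0, 0.7, (50, 2)) for m in block_means])

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(velocities)
for k in range(4):
    members = velocities[km.labels_ == k]
    print(f"cluster {k}: {len(members)} stations, "
          f"mean velocity = {members.mean(axis=0).round(1)} mm/yr")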
NASA Technical Reports Server (NTRS)
Bogert, Philip B.; Satyanarayana, Arunkumar; Chunchu, Prasad B.
2006-01-01
Splitting, ultimate failure load, and the damage path in center-notched composite specimens subjected to in-plane tension loading are predicted using a progressive failure analysis methodology. A 2-D Hashin-Rotem failure criterion is used in determining intra-laminar fiber and matrix failures. This progressive failure methodology has been implemented in the Abaqus/Explicit and Abaqus/Standard finite element codes through the user-written subroutines "VUMAT" and "USDFLD", respectively. A 2-D finite element model is used for predicting the intra-laminar damage. Analysis results obtained from the Abaqus/Explicit and Abaqus/Standard codes show good agreement with experimental results. The importance of modeling delamination in the progressive failure analysis methodology is recognized for future studies. The use of an explicit-integration dynamics code for simple specimen geometry and static loading establishes a foundation for future analyses, where complex loading and nonlinear dynamic interactions of damage and structure will necessitate it.
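For reference, a common 2-D form of the Hashin-Rotem intralaminar criterion is sketched below as a plain function; the paper implements the criterion inside Abaqus user subroutines (which are Fortran, not Python), and the strength values in the example are typical assumed numbers, not the study's material data.

def hashin_rotem_2d(s11, s22, s12, Xt, Xc, Yt, Yc, S):
    """Return (fiber_index, matrix_index); failure when an index >= 1."""
    # Fiber failure: tension or compression along the fiber direction.
    fiber = (s11 / Xt) ** 2 if s11 >= 0.0 else (s11 / Xc) ** 2
    # Matrix failure: transverse stress combined with in-plane shear.
    Y = Yt if s22 >= 0.0 else Yc
    matrix = (s22 / Y) ** 2 + (s12 / S) ** 2
    return fiber, matrix

# Example ply stresses (MPa) and typical carbon/epoxy strengths (assumed):
f, m = hashin_rotem_2d(s11=1200.0, s22=30.0, s12=40.0,
                       Xt=2000.0, Xc=1500.0, Yt=50.0, Yc=200.0, S=80.0)
print(f"fiber index = {f:.2f}, matrix index = {m:.2f}")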
CellProfiler Tracer: exploring and validating high-throughput, time-lapse microscopy image data.
Bray, Mark-Anthony; Carpenter, Anne E
2015-11-04
Time-lapse analysis of cellular images is an important and growing need in biology. Algorithms for cell tracking are widely available; what researchers have been missing is a single open-source software package to visualize standard tracking output (from software like CellProfiler) in a way that allows convenient assessment of track quality, especially for researchers tuning tracking parameters for high-content time-lapse experiments. This makes quality assessment and algorithm adjustment a substantial challenge, particularly when dealing with hundreds of time-lapse movies collected in a high-throughput manner. We present CellProfiler Tracer, a free and open-source tool that complements the object tracking functionality of the CellProfiler biological image analysis package. Tracer allows multi-parametric morphological data to be visualized on object tracks, providing visualizations that have already been validated within the scientific community for time-lapse experiments, and combining them with simple graph-based measures for highlighting possible tracking artifacts. CellProfiler Tracer is a useful, free tool for inspection and quality control of object tracking data, available from http://www.cellprofiler.org/tracer/.
Garten, Justin; Hoover, Joe; Johnson, Kate M; Boghrati, Reihane; Iskiwitch, Carol; Dehghani, Morteza
2018-02-01
Theory-driven text analysis has made extensive use of psychological concept dictionaries, leading to a wide range of important results. These dictionaries have generally been applied through word count methods which have proven to be both simple and effective. In this paper, we introduce Distributed Dictionary Representations (DDR), a method that applies psychological dictionaries using semantic similarity rather than word counts. This allows for the measurement of the similarity between dictionaries and spans of text ranging from complete documents to individual words. We show how DDR enables dictionary authors to place greater emphasis on construct validity without sacrificing linguistic coverage. We further demonstrate the benefits of DDR on two real-world tasks and finally conduct an extensive study of the interaction between dictionary size and task performance. These studies allow us to examine how DDR and word count methods complement one another as tools for applying concept dictionaries and where each is best applied. Finally, we provide references to tools and resources to make this method both available and accessible to a broad psychological audience.
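A minimal sketch of the DDR scoring step is given below, assuming a toy embedding lookup in place of real pretrained vectors (e.g., word2vec): both the dictionary and the text are reduced to mean word vectors and compared by cosine similarity, instead of counting dictionary hits.

import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["harm", "hurt", "care", "protect", "table", "walk", "blue"]
EMBEDDINGS = {w: rng.normal(size=50) for w in VOCAB}  # toy vectors (assumed)

def mean_vector(words):
    vecs = [EMBEDDINGS[w] for w in words if w in EMBEDDINGS]
    return np.mean(vecs, axis=0)

def ddr_score(dictionary_words, text_words):
    """Cosine similarity between dictionary and text mean vectors."""
    d, t = mean_vector(dictionary_words), mean_vector(text_words)
    return float(d @ t / (np.linalg.norm(d) * np.linalg.norm(t)))

harm_dictionary = ["harm", "hurt"]           # small, high-validity seed set
text = ["protect", "care", "walk"]
print("similarity to harm dictionary: %.3f" % ddr_score(harm_dictionary, text))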
Study of Tools for Network Discovery and Network Mapping
2003-11-01
Only fragments of this report are recoverable. In general, network discovery tools keep a history of the collected data and event data. One tool evaluated has the following software dependencies: Java Virtual Machine, Perl modules, RRD Tool, Tomcat, and PostgreSQL. Noted strengths include providing a simple view of the current network status, generating alarms on status changes, generating a history of status changes, and a visual map.
Development of Way Point Planning Tool in Response to NASA Field Campaign Challenges
NASA Astrophysics Data System (ADS)
He, M.; Hardin, D. M.; Conover, H.; Graves, S. J.; Meyer, P.; Blakeslee, R. J.; Goodman, M. L.
2012-12-01
Airborne real time observations are a major component of NASA's Earth Science research and satellite ground validation studies. For mission scientists, planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real time situational awareness of the weather conditions that affect the aircraft track. Multiple aircrafts are often involved in NASA field campaigns. The coordination of the aircrafts with satellite overpasses, other airplanes and the constantly evolving, dynamic weather conditions often determines the success of the campaign. A flight planning tool is needed to provide situational awareness information to the mission scientists, and help them plan and modify the flight tracks. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map filled with real time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analysis during and after each campaign helped identify both issues and new requirements, and initiated the next wave of development. Currently the Waypoint Planning Tool has gone through three rounds of development and analysis processes. The development of this waypoint tool is directly affected by the technology advances on GIS/Mapping technologies. From the standalone Google Earth application and simple KML functionalities, to Google Earth Plugin and Java Web Start/Applet on web platform, and to the rising open source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. The newly innovated, cross-platform, modular designed JavaScript-controlled Way Point Tool is planned to be integrated with NASA Airborne Science Mission Tool Suite. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives. This presentation will discuss the development processes of the Waypoint Planning Tool in responding to field campaign challenges, identify new information technologies, and describe the capabilities and features of the Waypoint Planning Tool with the real time aspect, interactive nature, and the resultant benefits to the airborne science community.
Wang, Tong; Wu, Hai-Long; Xie, Li-Xia; Zhu, Li; Liu, Zhi; Sun, Xiao-Dong; Xiao, Rong; Yu, Ru-Qin
2017-04-01
In this work, a smart chemometrics-enhanced strategy, high-performance liquid chromatography with diode array detection coupled with a second-order calibration method based on the alternating trilinear decomposition algorithm, was proposed to simultaneously quantify 12 polyphenols in different kinds of apple peel and pulp samples. The proposed strategy proved to be a powerful tool for solving the problems of coelution, unknown interferences, and chromatographic shifts in high-performance liquid chromatography analysis, making it possible to determine 12 polyphenols in complex apple matrices within 10 min under simple elution conditions. The average recoveries with standard deviations, and figures of merit including sensitivity, selectivity, limit of detection, and limit of quantitation, were calculated to validate the accuracy of the proposed method. Compared to the quantitative results from the classic high-performance liquid chromatography method, statistical and graphical analysis showed that the proposed strategy obtained more reliable results. All results indicated that the proposed method for the quantitative analysis of apple polyphenols is accurate, fast, universal, simple, and green, and it is expected to be developed as an attractive alternative for the simultaneous determination of multi-targeted analytes in complex matrices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A simple stochastic weather generator for ecological modeling
A.G. Birt; M.R. Valdez-Vivas; R.M. Feldman; C.W. Lafon; D. Cairns; R.N. Coulson; M. Tchakerian; W. Xi; Jim Guldin
2010-01-01
Stochastic weather generators are useful tools for exploring the relationship between organisms and their environment. This paper describes a simple weather generator that can be used in ecological modeling projects. We provide a detailed description of methodology, and links to full C++ source code (http://weathergen.sourceforge.net) required to implement or modify...
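A hedged miniature of such a generator (not the paper's C++ implementation) is sketched below: a two-state Markov chain drives wet/dry day occurrence, wet-day rainfall is drawn from an exponential distribution, and daily temperature follows an AR(1) process; all parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(2024)
P_WET_GIVEN_DRY = 0.25   # transition probabilities (assumed)
P_WET_GIVEN_WET = 0.60
RAIN_MEAN_MM = 6.0       # mean rain depth on wet days (assumed)
T_MEAN, T_AR, T_SD = 15.0, 0.8, 2.5  # AR(1) temperature parameters (assumed)

def simulate(n_days):
    wet, temp = False, T_MEAN
    out = []
    for _ in range(n_days):
        p = P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY
        wet = rng.random() < p
        rain = rng.exponential(RAIN_MEAN_MM) if wet else 0.0
        temp = T_MEAN + T_AR * (temp - T_MEAN) + rng.normal(0, T_SD)
        out.append((rain, temp))
    return np.array(out)

sim = simulate(365)
print("wet days: %d, total rain: %.0f mm, mean T: %.1f C"
      % ((sim[:, 0] > 0).sum(), sim[:, 0].sum(), sim[:, 1].mean()))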
M"Health" for Higher Education
ERIC Educational Resources Information Center
Aburas, Abdurazzag A.; Ayran, Mujgan
2013-01-01
Better education requires better, more advanced tools for students. The smartphone has become a central part of our daily life. A new mobile-based medical interface design is introduced for medicine students. The graphical user interface must be easy and simple; indeed, the main interface design issue for mobile is being simple and easy to use. Human Mobile…
Methods for quantifying simple gravity sensing in Drosophila melanogaster.
Inagaki, Hidehiko K; Kamikouchi, Azusa; Ito, Kei
2010-01-01
Perception of gravity is essential for animals: most animals possess specific sense organs to detect the direction of the gravitational force. Little is known, however, about the molecular and neural mechanisms underlying their behavioral responses to gravity. Drosophila melanogaster, having a rather simple nervous system and a large variety of molecular genetic tools available, serves as an ideal model for analyzing the mechanisms underlying gravity sensing. Here we describe an assay to measure simple gravity responses of flies behaviorally. This method can be applied for screening genetic mutants of gravity perception. Furthermore, in combination with recent genetic techniques to silence or activate selective sets of neurons, it serves as a powerful tool to systematically identify neural substrates required for the proper behavioral responses to gravity. The assay requires 10 min to perform, and two experiments can be performed simultaneously, enabling 12 experiments per hour.
Simple tool for prediction of parotid gland sparing in intensity-modulated radiation therapy.
Gensheimer, Michael F; Hummel-Kramer, Sharon M; Cain, David; Quang, Tony S
2015-01-01
Sparing one or both parotid glands is a key goal when planning head and neck cancer radiation treatment. If the planning target volume (PTV) overlaps one or both parotid glands substantially, it may not be possible to achieve adequate gland sparing. This finding results in physicians revising their PTV contours after an intensity-modulated radiation therapy (IMRT) plan has been run and reduces workflow efficiency. We devised a simple formula for predicting mean parotid gland dose from the overlap of the parotid gland and isotropically expanded PTV contours. We tested the tool using 44 patients from 2 institutions and found agreement between predicted and actual parotid gland doses (mean absolute error = 5.3 Gy). This simple method could increase treatment planning efficiency by improving the chance that the first plan presented to the physician will have optimal parotid gland sparing. Published by Elsevier Inc.
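The published formula itself is not reproduced in the abstract, so the sketch below is a hypothetical reconstruction of the workflow: dilate the PTV mask isotropically, compute its overlap fraction with the parotid mask, and feed that fraction into a linear predictor. The masks, dilation radius, and coefficients are all illustrative assumptions, not the authors' fitted formula.

import numpy as np
from scipy import ndimage

# Toy 3-D binary masks on a common voxel grid (True = inside structure).
grid = np.zeros((60, 60, 60), dtype=bool)
ptv, parotid = grid.copy(), grid.copy()
ptv[20:40, 20:40, 20:40] = True
parotid[32:48, 30:44, 25:40] = True

# Isotropic expansion of the PTV by ~3 voxels.
ptv_expanded = ndimage.binary_dilation(ptv, iterations=3)

overlap_fraction = (ptv_expanded & parotid).sum() / parotid.sum()

# Hypothetical linear calibration from overlap fraction to mean dose.
A_GY, B_GY = 60.0, 8.0
predicted_mean_dose = A_GY * overlap_fraction + B_GY
print(f"overlap = {overlap_fraction:.2f}, "
      f"predicted mean parotid dose ~ {predicted_mean_dose:.1f} Gy")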
An overview of C. elegans biology.
Strange, Kevin
2006-01-01
The establishment of Caenorhabditis elegans as a "model organism" began with the efforts of Sydney Brenner in the early 1960s. Brenner's focus was to find a suitable animal model in which the tools of genetic analysis could be used to define molecular mechanisms of development and nervous system function. C. elegans provides numerous experimental advantages for such studies. These advantages include a short life cycle, production of large numbers of offspring, easy and inexpensive laboratory culture, forward and reverse genetic tractability, and a relatively simple anatomy. This chapter will provide a brief overview of C. elegans biology.
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1989-01-01
The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
Invariant approach to the character classification
NASA Astrophysics Data System (ADS)
Šariri, Kristina; Demoli, Nazif
2008-04-01
Image moment analysis is a very useful tool that allows image description invariant to translation, rotation, scale change, and some types of image distortion. The aim of this work was the development of a simple method for fast and reliable classification of characters using Hu's and affine moment invariants. The Euclidean distance was used as a discrimination feature, with its statistical parameters estimated. The method was tested on the classification of Times New Roman letters as well as sets of handwritten characters. It is shown that using all Hu's invariants and three affine invariants as the discrimination set improves the recognition rate by 30%.
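A minimal sketch of this classification scheme, using OpenCV's Hu moments on toy glyphs (real input would be scanned characters), is shown below; a log transform is applied because the seven invariants span many orders of magnitude, and the nearest template by Euclidean distance wins.

import cv2
import numpy as np

def hu_features(binary_img):
    """Log-scaled Hu moment invariants of a binary glyph image."""
    hu = cv2.HuMoments(cv2.moments(binary_img, binaryImage=True)).ravel()
    # Log transform keeps all seven invariants on comparable scales.
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

def make_glyph(draw):
    img = np.zeros((64, 64), dtype=np.uint8)
    draw(img)
    return img

template_T = make_glyph(lambda im: (cv2.line(im, (10, 10), (54, 10), 255, 5),
                                    cv2.line(im, (32, 10), (32, 54), 255, 5)))
template_O = make_glyph(lambda im: cv2.circle(im, (32, 32), 20, 255, 5))
query = make_glyph(lambda im: cv2.circle(im, (30, 34), 18, 255, 5))

dists = {name: np.linalg.norm(hu_features(query) - hu_features(t))
         for name, t in [("T", template_T), ("O", template_O)]}
print("classified as:", min(dists, key=dists.get), dists)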
On a computational model of building thermal dynamic response
NASA Astrophysics Data System (ADS)
Jarošová, Petra; Vala, Jiří
2016-07-01
The development and exploitation of advanced materials, structures, and technologies in civil engineering, both for buildings with carefully controlled interior temperature and for common residential houses, together with new European and national directives and technical standards, stimulate the development of computational tools that are robust yet sufficiently simple and inexpensive, supporting design and the optimization of energy consumption. This paper demonstrates that such seemingly contradictory requirements can be reconciled using a simplified non-stationary thermal model of a building, motivated by the analogy with the analysis of electric circuits; certain semi-analytical forms of the solution come from the method of lines.
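The circuit analogy can be made concrete with a one-node resistor-capacitor (1R1C) model, in which indoor temperature evolves like the voltage on a capacitor; the sketch below integrates it with an explicit Euler step. It is a minimal illustration with assumed parameters, not the simplified model developed in the paper.

import numpy as np

R = 0.005   # K/W, envelope thermal resistance (assumed)
C = 2.0e7   # J/K, effective thermal capacity (assumed)
dt = 600.0  # s, time step

hours = np.arange(0, 72, dt / 3600.0)
t_out = 5.0 + 5.0 * np.sin(2 * np.pi * hours / 24.0)   # outdoor temperature, C
q_heat = 4000.0                                         # heating power, W

t_in = np.empty(hours.size)
t_in[0] = 20.0
for k in range(hours.size - 1):
    # C dT/dt = (T_out - T_in)/R + Q  (explicit Euler step)
    dT = ((t_out[k] - t_in[k]) / R + q_heat) * dt / C
    t_in[k + 1] = t_in[k] + dT

print("indoor temperature range: %.1f to %.1f C" % (t_in.min(), t_in.max()))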
Kopyt, Paweł; Celuch, Małgorzata
2007-01-01
A practical implementation of a hybrid simulation system capable of modeling coupled electromagnetic-thermodynamic problems typical in microwave heating is described. The paper presents two approaches to modeling such problems. Both are based on an FDTD-based commercial electromagnetic solver coupled to an external thermodynamic analysis tool required for calculations of heat diffusion. The first approach utilizes a simple FDTD-based thermal solver while in the second it is replaced by a universal commercial CFD solver. The accuracy of the two modeling systems is verified against the original experimental data as well as the measurement results available in literature.
NASA Astrophysics Data System (ADS)
Carpi, Laura; Masoller, Cristina
2018-02-01
Many natural systems display transitions among different dynamical regimes, which are difficult to identify when the data are noisy and high dimensional. A technologically relevant example is a fiber laser, which can display complex dynamical behaviors that involve nonlinear interactions of millions of cavity modes. Here we study the laminar-turbulence transition that occurs when the laser pump power is increased. By applying various data analysis tools to empirical intensity time series we characterize their persistence and demonstrate that at the transition temporal correlations can be precisely represented by a surprisingly simple model.
Gwee, Kok-Ann; Bergmans, Paul; Kim, JinYong; Coudsy, Bogdana; Sim, Angelia; Chen, Minhu; Lin, Lin; Hou, Xiaohua; Wang, Huahong; Goh, Khean-Lee; Pangilinan, John A; Kim, Nayoung; des Varannes, Stanislas Bruley
2017-01-01
Background/Aims: There is a need for a simple and practical tool adapted for the diagnosis of chronic constipation (CC) in the Asian population. This study compared the Asian Neurogastroenterology and Motility Association (ANMA) CC tool and Rome III criteria for the diagnosis of CC in Asian subjects. Methods: This multicenter, cross-sectional study included subjects presenting at outpatient gastrointestinal clinics across Asia. Subjects with CC alert symptoms completed a combination Diagnosis Questionnaire to obtain a diagnosis based on 4 different diagnostic methods: self-defined, investigator's judgment, ANMA CC tool, and Rome III criteria. The primary endpoint was the level of agreement/disagreement between the ANMA CC diagnostic tool and Rome III criteria for the diagnosis of CC. Results: The primary analysis comprised 449 subjects, 414 of whom had a positive diagnosis according to the ANMA CC tool. Rome III positive/ANMA positive and Rome III negative/ANMA negative diagnoses were reported in 76.8% and 7.8% of subjects, respectively, resulting in an overall percentage agreement of 84.6% between the 2 diagnostic methods. The overall percentage disagreement between these 2 diagnostic methods was 15.4%. A higher level of agreement was seen between the ANMA CC tool and self-defined (374 subjects [90.3%]) or investigator's judgment criteria (388 subjects [93.7%]) compared with the Rome III criteria. Conclusion: This study demonstrates that the ANMA CC tool can be useful for Asian patients with CC. PMID:27764907
Supervised learning of tools for content-based search of image databases
NASA Astrophysics Data System (ADS)
Delanoy, Richard L.
1996-03-01
A computer environment, called the Toolkit for Image Mining (TIM), is being developed with the goal of enabling users with diverse interests and varied computer skills to create search tools for content-based image retrieval and other pattern matching tasks. Search tools are generated using a simple paradigm of supervised learning that is based on the user pointing at mistakes of classification made by the current search tool. As mistakes are identified, a learning algorithm uses the identified mistakes to build up a model of the user's intentions, construct a new search tool, apply the search tool to a test image, display the match results as feedback to the user, and accept new inputs from the user. Search tools are constructed in the form of functional templates, which are generalized matched filters capable of knowledge-based image processing. The ability of this system to learn the user's intentions from experience contrasts with other existing approaches to content-based image retrieval that base searches on the characteristics of a single input example or on a predefined and semantically-constrained textual query. Currently, TIM is capable of learning spectral and textural patterns, but should be adaptable to the learning of shapes, as well. Possible applications of TIM include not only content-based image retrieval, but also quantitative image analysis, the generation of metadata for annotating images, data prioritization or data reduction in bandwidth-limited situations, and the construction of components for larger, more complex computer vision algorithms.
NASA Astrophysics Data System (ADS)
Rivers, M. L.; Gualda, G. A.
2009-05-01
One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information using tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel-beam tomographic data, including removal of anomalous pixels, ring artifact reduction, and automated determination of the rotation center, and (2) visualize both raw and reconstructed data, either as individual frames or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, measuring sample volume (useful for porous samples like pumice), and computing volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF-format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating-point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of tiffs into '.volume' files is available within 'vol_tools'. Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of tiffs to disk, and (3) generate MPEG movies for inclusion in presentations, publications, websites, etc. Both are freely available as run-time ('.sav') versions that can be run using the free IDL Virtual Machine, available from ITT Visual Information Solutions: http://www.ittvis.com/ProductServices/IDL/VirtualMachine.aspx The run-time versions of 'tomo_display' and 'vol_tools' can be downloaded from: http://cars.uchicago.edu/software/idl/tomography.html and http://sites.google.com/site/voltools/
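For readers working with the same data outside IDL, NetCDF files can be opened with general-purpose libraries. A minimal Python sketch, assuming the netCDF4 package; the variable name inside the file and the file names are guesses, since the abstract does not document the internal layout of the '.volume' format:

    import numpy as np
    from netCDF4 import Dataset

    # 'VOLUME' as the stored variable name is an assumption, not documented above.
    def load_volume(path, var='VOLUME'):
        with Dataset(path) as nc:
            return np.asarray(nc.variables[var][:])

    a = load_volume('sample_above_edge.volume')   # hypothetical file names
    b = load_volume('sample_below_edge.volume')
    hist, bin_edges = np.histogram(a, bins=256)   # intensity histogram
    diff = a.astype(np.int32) - b                 # differential absorption volume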
NASA Astrophysics Data System (ADS)
Han, Xu; Xie, Guangping; Laflen, Brandon; Jia, Ming; Song, Guiju; Harding, Kevin G.
2015-05-01
In the real application environment of field engineering, a large variety of metrology tools are required by the technician to inspect part profile features. However, some of these tools are burdensome and address only a single application or measurement. In other cases, standard tools lack the capability of accessing irregular profile features. Customers of field engineering want the next generation of metrology devices to be able to replace the many current tools with one single device. This paper will describe a method based on the ring optical gage concept for the measurement of numerous kinds of profile features useful to the field technician. The ring optical system is composed of a collimated laser, a conical mirror, and a CCD camera. To be useful for a wide range of applications, the ring optical system requires profile feature extraction algorithms and data manipulation directed toward real-world applications in field operation. The paper will discuss such practical applications as measuring a non-ideal round hole with both off-centered and oblique axes. The algorithms needed to analyze other features, such as the width of gaps, the radius of transition fillets, the fall of step surfaces, and surface parallelism, will also be discussed in this paper. With the assistance of image processing and geometric algorithms, these features can be extracted with reasonable performance. Tailoring the feature extraction analysis to this specific gage offers the potential for a wider application base beyond simple inner-diameter measurements. The paper will present experimental results that are compared with standard gages to demonstrate the performance and feasibility of the analysis in real-world field engineering. Potential accuracy improvement methods, a new dual-ring design, and future work will be discussed at the end of this paper.
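A geometric building block that several of these measurements share is fitting a circle to edge points extracted from the CCD image, e.g., to locate an off-centered hole. Below is a minimal sketch of the standard algebraic (Kasa) least-squares fit, a common choice but not necessarily the authors' algorithm; the test data are synthetic.

    import numpy as np

    def fit_circle(x, y):
        # Algebraic least-squares circle fit: solve
        # x^2 + y^2 = 2*a*x + 2*b*y + c for center (a, b) and radius r.
        A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
        rhs = x**2 + y**2
        (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        r = np.sqrt(c + a**2 + b**2)
        return a, b, r

    # Illustrative use with noisy synthetic hole-edge points.
    t = np.linspace(0, 2 * np.pi, 200)
    x = 1.5 + 4.0 * np.cos(t) + np.random.normal(0, 0.01, t.size)
    y = -0.7 + 4.0 * np.sin(t) + np.random.normal(0, 0.01, t.size)
    print(fit_circle(x, y))  # approximately (1.5, -0.7, 4.0)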
Perry, Cary; LeMay, Nancy; Rodway, Greg; Tracy, Allison; Galer, Joan
2005-01-01
Background This article describes the validation of an instrument to measure work group climate in public health organizations in developing countries. The instrument, the Work Group Climate Assessment Tool (WCA), was applied in Brazil, Mozambique, and Guinea to assess the intermediate outcomes of a program to develop leadership for performance improvement. Data were collected from 305 individuals in 42 work groups, who completed a self-administered questionnaire. Methods The WCA was initially validated using Cronbach's alpha reliability coefficient and exploratory factor analysis. This article presents the results of a second validation study to refine the initial analyses to account for nested data, to provide item-level psychometrics, and to establish construct validity. Analyses included eigenvalue decomposition analysis, confirmatory factor analysis, and validity and reliability analyses. Results This study confirmed the validity and reliability of the WCA across work groups with different demographic characteristics (gender, education, management level, and geographical location). The study showed that there is agreement between the theoretical construct of work climate and the items in the WCA tool across different populations. The WCA captures a single perception of climate rather than individual sub-scales of clarity, support, and challenge. Conclusion The WCA is useful for comparing the climates of different work groups, tracking the changes in climate in a single work group over time, or examining differences among individuals' perceptions of their work group climate. Application of the WCA before and after a leadership development process can help work groups hold a discussion about current climate and select a target for improvement. The WCA provides work groups with a tool to take ownership of their own group climate through a process that is simple and objective and that protects individual confidentiality. PMID:16223447
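For readers replicating the reliability step, Cronbach's alpha has a compact closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), for k items. A minimal sketch on synthetic Likert-style data; the actual WCA item responses are, of course, not reproduced here, and the array shapes are illustrative only.

    import numpy as np

    def cronbach_alpha(scores):
        # Cronbach's alpha for a (respondents x items) score matrix.
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                           # number of items
        item_vars = scores.var(axis=0, ddof=1).sum()  # sum of item variances
        total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
        return (k / (k - 1)) * (1 - item_vars / total_var)

    # Illustrative use: 305 synthetic respondents x 10 correlated Likert items.
    rng = np.random.default_rng(1)
    base = rng.integers(1, 6, size=(305, 1))
    items = np.clip(base + rng.integers(-1, 2, size=(305, 10)), 1, 5)
    print(round(cronbach_alpha(items), 2))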