Distributed and Collaborative Software Analysis
NASA Astrophysics Data System (ADS)
Ghezzi, Giacomo; Gall, Harald C.
Throughout the years, software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of…
Rotorcraft Conceptual Design Environment
2009-10-01
systems engineering design tool sets. The DaVinci Project vision is to develop software architecture and tools specifically for acquisition system... enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.
Computational Analysis of Material Flow During Friction Stir Welding of AA5059 Aluminum Alloys
2011-01-01
tool material (AISI H13 tool steel) is modeled as an isotropic linear-elastic material. Within the analysis, the effects of some of the FSW key process... threads/m; (b) tool material = AISI H13 tool steel; (c) workpiece material = AA5059; (d) tool rotation speed = 500 rpm; (e) tool travel speed... the strain-hardening term is augmented to account for the effect of dynamic recrystallization) while the FSW tool material (AISI H13
Analysis and design of friction stir welding tool
NASA Astrophysics Data System (ADS)
Jagadeesha, C. B.
2016-12-01
Since its inception, no one has performed a formal analysis and design of the FSW tool; initial dimensions of the FSW tool are decided by educated guess. Optimum stresses on the tool pin have been determined at optimized parameters for bead-on-plate welding on an AZ31B-O Mg alloy plate. Fatigue analysis showed that the FSW tool chosen for the welding experiment does not have infinite life; its life was determined to be 2.66×10^5 cycles (revolutions). One can therefore conclude that an arbitrarily dimensioned FSW tool generally has finite life and cannot be relied on for infinite life. In general, one can determine in advance, by this analysis, the suitability of a tool and its material for FSW of the given workpiece materials in terms of the fatigue life of the tool.
GOMA: functional enrichment analysis tool based on GO modules
Huang, Qiang; Wu, Ling-Yun; Wang, Yong; Zhang, Xiang-Sun
2013-01-01
Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results. PMID:23237213
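For readers unfamiliar with how term-level enrichment is usually scored, the sketch below shows the standard hypergeometric test that module-based methods such as GOMA build on; the gene counts are made up, and this is not GOMA's module-optimization algorithm.

```python
from scipy.stats import hypergeom

def go_term_enrichment(study_hits, study_size, term_genes, population_size):
    """One-sided hypergeometric p-value that a GO term is over-represented
    in a study gene set drawn from a gene universe (population)."""
    # P(X >= study_hits) when drawing study_size genes from a population
    # containing term_genes genes annotated to the term.
    return hypergeom.sf(study_hits - 1, population_size, term_genes, study_size)

# Hypothetical counts: 40 of 300 study genes carry the term,
# 500 of 20000 genes in the universe carry it.
p = go_term_enrichment(study_hits=40, study_size=300, term_genes=500, population_size=20000)
print(f"enrichment p-value: {p:.3e}")
```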
DaGO-Fun: tool for Gene Ontology-based functional analysis using term information content measures.
Mazandu, Gaston K; Mulder, Nicola J
2013-09-25
The use of Gene Ontology (GO) data in protein analyses has largely contributed to the improved outcomes of these analyses. Several GO semantic similarity measures have been proposed in recent years and provide tools that allow the integration of biological knowledge embedded in the GO structure into different biological analyses. There is a need for a unified tool that provides the scientific community with the opportunity to explore these different GO similarity measure approaches and their biological applications. We have developed DaGO-Fun, an online tool available at http://web.cbio.uct.ac.za/ITGOM, which incorporates many different GO similarity measures for exploring, analyzing and comparing GO terms and proteins within the context of GO. It uses GO data and UniProt proteins with their GO annotations as provided by the Gene Ontology Annotation (GOA) project to precompute GO term information content (IC), enabling rapid response to user queries. The DaGO-Fun online tool presents the advantage of integrating all the relevant IC-based GO similarity measures, including topology- and annotation-based approaches to facilitate effective exploration of these measures, thus enabling users to choose the most relevant approach for their application. Furthermore, this tool includes several biological applications related to GO semantic similarity scores, including the retrieval of genes based on their GO annotations, the clustering of functionally related genes within a set, and term enrichment analysis.
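A rough illustration of the information-content idea behind such annotation-based measures: IC(t) = -log p(t) computed from hypothetical annotation counts, with a Resnik-style similarity taken from the most informative common ancestor. The GO traversal and corpus statistics that DaGO-Fun precomputes are simplified away.

```python
import math

# Hypothetical annotation counts (term -> number of annotated proteins),
# including annotations inherited from descendant terms.
counts = {"GO:root": 10000, "GO:A": 2500, "GO:B": 300, "GO:C": 40}

def ic(term):
    """Information content: rarer (more specific) terms are more informative."""
    return -math.log(counts[term] / counts["GO:root"])

def resnik_similarity(common_ancestors):
    """Resnik-style similarity: IC of the most informative common ancestor."""
    return max(ic(a) for a in common_ancestors)

print(f"IC(GO:C) = {ic('GO:C'):.2f}")
# Similarity of two terms whose common ancestors are the root and GO:A.
print(f"sim = {resnik_similarity(['GO:root', 'GO:A']):.2f}")
```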
Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C
2015-11-01
Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
DaGO-Fun: tool for Gene Ontology-based functional analysis using term information content measures
2013-01-01
Background The use of Gene Ontology (GO) data in protein analyses has largely contributed to the improved outcomes of these analyses. Several GO semantic similarity measures have been proposed in recent years and provide tools that allow the integration of biological knowledge embedded in the GO structure into different biological analyses. There is a need for a unified tool that provides the scientific community with the opportunity to explore these different GO similarity measure approaches and their biological applications. Results We have developed DaGO-Fun, an online tool available at http://web.cbio.uct.ac.za/ITGOM, which incorporates many different GO similarity measures for exploring, analyzing and comparing GO terms and proteins within the context of GO. It uses GO data and UniProt proteins with their GO annotations as provided by the Gene Ontology Annotation (GOA) project to precompute GO term information content (IC), enabling rapid response to user queries. Conclusions The DaGO-Fun online tool presents the advantage of integrating all the relevant IC-based GO similarity measures, including topology- and annotation-based approaches to facilitate effective exploration of these measures, thus enabling users to choose the most relevant approach for their application. Furthermore, this tool includes several biological applications related to GO semantic similarity scores, including the retrieval of genes based on their GO annotations, the clustering of functionally related genes within a set, and term enrichment analysis. PMID:24067102
Hinton, Elizabeth G; Oelschlegel, Sandra; Vaughn, Cynthia J; Lindsay, J Michael; Hurst, Sachiko M; Earl, Martha
2013-01-01
This study utilizes an informatics tool to analyze a robust literature search service in an academic medical center library. Structured interviews with librarians were conducted focusing on the benefits of such a tool, expectations for performance, and visual layout preferences. The resulting application utilizes Microsoft SQL Server and .Net Framework 3.5 technologies, allowing for the use of a web interface. Customer tables and MeSH terms are included. The National Library of Medicine MeSH database and entry terms for each heading are incorporated, resulting in functionality similar to searching the MeSH database through PubMed. Data reports will facilitate analysis of the search service.
WholePathwayScope: a comprehensive pathway-based analysis tool for high-throughput data
Yi, Ming; Horton, Jay D; Cohen, Jonathan C; Hobbs, Helen H; Stephens, Robert M
2006-01-01
Background Analysis of High Throughput (HTP) Data such as microarray and proteomics data has provided a powerful methodology to study patterns of gene regulation at genome scale. A major unresolved problem in the post-genomic era is to assemble the large amounts of data generated into a meaningful biological context. We have developed a comprehensive software tool, WholePathwayScope (WPS), for deriving biological insights from analysis of HTP data. Result WPS extracts gene lists with shared biological themes through color cue templates. WPS statistically evaluates global functional category enrichment of gene lists and pathway-level pattern enrichment of data. WPS incorporates well-known biological pathways from KEGG (Kyoto Encyclopedia of Genes and Genomes) and Biocarta, GO (Gene Ontology) terms as well as user-defined pathways or relevant gene clusters or groups, and explores gene-term relationships within the derived gene-term association networks (GTANs). WPS simultaneously compares multiple datasets within biological contexts either as pathways or as association networks. WPS also integrates Genetic Association Database and Partial MedGene Database for disease-association information. We have used this program to analyze and compare microarray and proteomics datasets derived from a variety of biological systems. Application examples demonstrated the capacity of WPS to significantly facilitate the analysis of HTP data for integrative discovery. Conclusion This tool represents a pathway-based platform for discovery integration to maximize analysis power. The tool is freely available at . PMID:16423281
USDA-ARS's Scientific Manuscript database
Using multiple historical satellite surface soil moisture products, the Kalman Filtering-based Soil Moisture Analysis Rainfall Tool (SMART) is applied to improve the accuracy of a multi-decadal global daily rainfall product that has been bias-corrected to match the monthly totals of available rain g...
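The record above is truncated, but the core idea (a Kalman-type update that nudges a rainfall estimate toward a proxy recovered from satellite soil moisture changes) can be sketched as a scalar filter; the variances below are hypothetical and this is not the SMART algorithm itself.

```python
def kalman_update(prior_rain, prior_var, sm_derived_rain, obs_var):
    """Scalar Kalman update: blend a prior rainfall estimate with a
    rainfall proxy derived from satellite soil moisture changes."""
    gain = prior_var / (prior_var + obs_var)          # how much to trust the proxy
    corrected = prior_rain + gain * (sm_derived_rain - prior_rain)
    corrected_var = (1.0 - gain) * prior_var
    return corrected, corrected_var

# Hypothetical daily values (mm): biased product says 2.0, soil-moisture proxy says 5.0.
print(kalman_update(prior_rain=2.0, prior_var=4.0, sm_derived_rain=5.0, obs_var=9.0))
```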
Analysis of Java Client/Server and Web Programming Tools for Development of Educational Systems.
ERIC Educational Resources Information Center
Muldner, Tomasz
This paper provides an analysis of old and new programming tools for development of client/server programs, particularly World Wide Web-based programs. The focus is on development of educational systems that use interactive shared workspaces to provide portable and expandable solutions. The paper begins with a short description of relevant terms.…
GARNET--gene set analysis with exploration of annotation relations.
Rho, Kyoohyoung; Kim, Bumjin; Jang, Youngjun; Lee, Sanghyun; Bae, Taejeong; Seo, Jihae; Seo, Chaehwa; Lee, Jihyun; Kang, Hyunjung; Yu, Ungsik; Kim, Sunghoon; Lee, Sanghyuk; Kim, Wan Kyu
2011-02-15
Gene set analysis is a powerful method of deducing biological meaning for an a priori defined set of genes. Numerous tools have been developed to test statistical enrichment or depletion in specific pathways or gene ontology (GO) terms. Major difficulties towards biological interpretation are integrating diverse types of annotation categories and exploring the relationships between annotation terms of similar information. GARNET (Gene Annotation Relationship NEtwork Tools) is an integrative platform for gene set analysis with many novel features. It includes tools for retrieval of genes from annotation databases, statistical analysis & visualization of annotation relationships, and managing gene sets. In an effort to allow access to a full spectrum of amassed biological knowledge, we have integrated a variety of annotation data that include the GO, domain, disease, drug, chromosomal location, and custom-defined annotations. Diverse types of molecular networks (pathways, transcription and microRNA regulations, protein-protein interaction) are also included. The pair-wise relationship between annotation gene sets was calculated using kappa statistics. GARNET consists of three modules (gene set manager, gene set analysis and gene set retrieval), which are tightly integrated to provide virtually automatic analysis for gene sets. A dedicated viewer for annotation networks has been developed to facilitate exploration of the related annotations. GARNET (gene annotation relationship network tools) is an integrative platform for diverse types of gene set analysis, where complex relationships among gene annotations can be easily explored with an intuitive network visualization tool (http://garnet.isysbio.org/ or http://ercsb.ewha.ac.kr/garnet/).
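The kappa statistic mentioned above quantifies agreement between two annotation gene sets over a shared gene universe beyond chance; a minimal sketch with hypothetical sets follows (the usual Cohen's kappa formula, not necessarily GARNET's exact implementation).

```python
def kappa(set_a, set_b, universe):
    """Cohen's kappa for agreement between two gene-set memberships."""
    n = len(universe)
    both = sum(1 for g in universe if g in set_a and g in set_b)
    neither = sum(1 for g in universe if g not in set_a and g not in set_b)
    p_obs = (both + neither) / n
    p_a, p_b = len(set_a) / n, len(set_b) / n
    p_exp = p_a * p_b + (1 - p_a) * (1 - p_b)       # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

universe = {f"g{i}" for i in range(100)}             # hypothetical gene universe
apoptosis = {f"g{i}" for i in range(0, 30)}
cell_death = {f"g{i}" for i in range(10, 40)}
print(kappa(apoptosis, cell_death, universe))
```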
NASA Technical Reports Server (NTRS)
Cheung, Kar-Ming; Tung, Ramona H.; Lee, Charles H.
2003-01-01
In this paper, we describe the development roadmap and discuss the various challenges of an evolvable and extensible multi-mission telecom planning and analysis framework. Our long-term goal is to develop a set of powerful, flexible telecommunications analysis tools that can be easily adapted to different missions while maintaining the common Deep Space Communication requirements. The ability to re-use the DSN ground models and the common software utilities in our adaptations has contributed significantly to our development efforts measured in terms of consistency, accuracy, and minimal effort redundancy, which can translate into shorter development time and major cost savings for the individual missions. In our roadmap, we will address the design principles, technical achievements and the associated challenges for the following telecom analysis tools: (i) Telecom Forecaster Predictor - TFP; (ii) Unified Telecom Predictor - UTP; (iii) Generalized Telecom Predictor - GTP; (iv) Generic TFP; (v) Web-based TFP; (vi) Application Program Interface - API; (vii) Mars Relay Network Planning Tool - MRNPT.
Interactive visual analysis promotes exploration of long-term ecological data
T.N. Pham; J.A. Jones; R. Metoyer; F.J. Swanson; R.J. Pabst
2013-01-01
Long-term ecological data are crucial in helping ecologists understand ecosystem function and environmental change. Nevertheless, these kinds of data sets are difficult to analyze because they are usually large, multivariate, and spatiotemporal. Although existing analysis tools such as statistical methods and spreadsheet software permit rigorous tests of pre-conceived...
Analysis of Ten Reverse Engineering Tools
NASA Astrophysics Data System (ADS)
Koskinen, Jussi; Lehmonen, Tero
Reverse engineering tools can be used in satisfying the information needs of software maintainers. Especially in the case of maintaining large-scale legacy systems, tool support is essential. Reverse engineering tools offer various kinds of capabilities to provide the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.
Longitudinal adoption rates of complex decision support tools in primary care.
McCullagh, Lauren; Mann, Devin; Rosen, Lisa; Kannry, Joseph; McGinn, Thomas
2014-12-01
Translating research findings into practice promises to standardise care. Translation includes the integration of evidence-based guidelines at the point of care, discerning the best methods to disseminate research findings, and models to sustain the implementation of best practices. By applying usability testing to clinical decision support (CDS) design, overall adoption rates of 60% can be realised. What has not been examined is how long adoption rates are sustained and the characteristics associated with long-term use. We conducted a secondary analysis to decipher the factors impacting sustained use of CDS tools. This study was a secondary data analysis from a clinical trial conducted at an academic institution in New York City. Study data were identified from patients' electronic health records (EHR). The trial tested the implementation of an integrated clinical prediction rule (iCPR) into the EHR. The primary outcome variable was acceptance of the iCPR tool. iCPR tool completion and iCPR smartset completion were additional outcome variables of interest. The secondary aim was to examine user characteristics associated with iCPR tool use in later time periods. Characteristics of interest included age, resident year, use of electronic health records (yes/no) and use of best practice alerts (BPA) (yes/no). Generalised linear mixed models (GLiMM) were used to compare iCPR use over time for each outcome of interest: namely, iCPR acceptance, iCPR completion and iCPR smartset completion. GLiMM was also used to examine resident characteristics associated with iCPR tool use in later time periods; specifically, intermediate and long-term (ie, 90+ days). The tool was accepted, on average, 82.18% of the time in the first 90 days (short-term period). Use decreased to 56.07% and 45.61% in the intermediate and long-term time periods, respectively. There was a significant association between iCPR tool completion and time periods (p<0.0001). There was no significant difference in iCPR tool completion between resident encounters in the intermediate and long-term periods (p = 0.6627). There was a significant association between iCPR smartset completion and time periods (p = 0.0021). There were no significant associations between iCPR smartset completion and any of the four predictors of interest. We examined the frequencies of components of the iCPR tool being accepted over time by individual clinicians. Rates of adoption of the different components of the tool decreased substantially over time. The data suggest that, over time and with prolonged exposure to CDS tools, providers are less likely to utilise the tool. It is not clear whether fatigue with the CDS tool, acquired knowledge of the clinical prediction rule, or gained clinical experience and gestalt is influencing adoption rates. Further analysis of individual adoption rates over time and their impact on clinical outcomes should be conducted.
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Nemeth, Michael P.; Hilburger, Mark W.
2004-01-01
A technology review and assessment of modeling and analysis efforts underway in support of a safe return to flight of the thermal protection system (TPS) for the Space Shuttle external tank (ET) are summarized. This review and assessment effort focuses on the structural modeling and analysis practices employed for ET TPS foam design and analysis and on identifying analysis capabilities needed in the short term and long term. The current understanding of the relationship between complex flight environments and ET TPS foam failure modes is reviewed as it relates to modeling and analysis. A literature review on modeling and analysis of TPS foam material systems is also presented. Finally, a review of modeling and analysis tools employed in the Space Shuttle Program is presented for the ET TPS acreage and close-out foam regions. This review includes existing simplified engineering analysis tools as well as finite element analysis procedures.
Teaching meta-analysis using MetaLight.
Thomas, James; Graziosi, Sergio; Higgins, Steve; Coe, Robert; Torgerson, Carole; Newman, Mark
2012-10-18
Meta-analysis is a statistical method for combining the results of primary studies. It is often used in systematic reviews and is increasingly a method and topic that appears in student dissertations. MetaLight is a freely available software application that runs simple meta-analyses and contains specific functionality to facilitate the teaching and learning of meta-analysis. While there are many courses and resources for meta-analysis available and numerous software applications to run meta-analyses, there are few pieces of software which are aimed specifically at helping those teaching and learning meta-analysis. Valuable teaching time can be spent learning the mechanics of a new software application, rather than on the principles and practices of meta-analysis. We discuss ways in which the MetaLight tool can be used to present some of the main issues involved in undertaking and interpreting a meta-analysis. While there are many software tools available for conducting meta-analysis, in the context of a teaching programme such software can require expenditure both in terms of money and in terms of the time it takes to learn how to use it. MetaLight was developed specifically as a tool to facilitate the teaching and learning of meta-analysis and we have presented here some of the ways it might be used in a training situation.
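To make concrete what a "simple meta-analysis" of the kind MetaLight runs involves, the sketch below pools hypothetical study effect sizes with the standard fixed-effect inverse-variance method; MetaLight's own interface and options are not reproduced.

```python
import math

def fixed_effect_pool(effects, std_errors):
    """Inverse-variance weighted pooled effect and its 95% confidence interval."""
    weights = [1.0 / se**2 for se in std_errors]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical standardized mean differences from three primary studies.
effects = [0.30, 0.55, 0.10]
std_errors = [0.15, 0.20, 0.12]
print(fixed_effect_pool(effects, std_errors))
```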
Comparison of seven fall risk assessment tools in community-dwelling Korean older women.
Kim, Taekyoung; Xiong, Shuping
2017-03-01
This study aimed to compare seven widely used fall risk assessment tools in terms of validity and practicality, and to provide a guideline for choosing appropriate fall risk assessment tools for elderly Koreans. Sixty community-dwelling Korean older women (30 fallers and 30 matched non-fallers) were evaluated. Performance measures of all tools were compared between the faller and non-faller groups through two sample t-tests. Receiver Operating Characteristic curves were generated with odds ratios for discriminant analysis. Results showed that four tools had significant discriminative power, and the shortened version of Falls Efficacy Scale (SFES) showed excellent discriminant validity, followed by Berg Balance Scale (BBS) with acceptable discriminant validity. The Mini Balance Evaluation System Test and Timed Up and Go, however, had limited discriminant validities. In terms of practicality, SFES was also excellent. These findings suggest that SFES is the most suitable tool for assessing the fall risks of community-dwelling Korean older women, followed by BBS. Practitioner Summary: There is no general guideline on which fall risk assessment tools are suitable for community-dwelling Korean older women. This study compared seven widely used assessment tools in terms of validity and practicality. Results suggested that the short Falls Efficacy Scale is the most suitable tool, followed by Berg Balance Scale.
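A minimal sketch of the ROC-based discriminant analysis described above, using scikit-learn on made-up tool scores for fallers and non-fallers; the study's actual data and cut-offs are not reproduced.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical data: 1 = faller, 0 = non-faller, with a screening-tool score per person.
y_true = np.array([1] * 10 + [0] * 10)
scores = np.array([28, 25, 27, 30, 22, 26, 24, 29, 21, 23,
                   15, 18, 12, 20, 16, 14, 19, 13, 17, 11])

auc = roc_auc_score(y_true, scores)                  # discriminative power
fpr, tpr, thresholds = roc_curve(y_true, scores)
best = np.argmax(tpr - fpr)                          # Youden index picks a cut-off
print(f"AUC = {auc:.2f}, suggested cut-off = {thresholds[best]}")
```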
Decision Analysis Tools for Volcano Observatories
NASA Astrophysics Data System (ADS)
Hincks, T. H.; Aspinall, W.; Woo, G.
2005-12-01
Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Yannan; Hou, Zhangshuan; Meng, Da
2016-07-17
In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches mainly focus on maintaining the inter-dependency between multiple geographically related areas. These approaches are applied to cross-correlated load time series as well as their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
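As a self-contained illustration of the ARIMA component of that workflow (the sequential Gaussian simulation and principal component analysis steps are omitted), the sketch below fits an ARIMA model to a synthetic hourly load series with statsmodels and produces a short-term forecast with uncertainty bounds.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic hourly load: daily cycle plus noise (stand-in for a real area's history).
rng = np.random.default_rng(0)
hours = np.arange(24 * 30)
load = 1000 + 150 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 20, hours.size)

model = ARIMA(load, order=(2, 0, 1))                 # (p, d, q) chosen for illustration
fitted = model.fit()
forecast = fitted.get_forecast(steps=24)             # next-day point forecast
print(forecast.predicted_mean[:6])
print(forecast.conf_int(alpha=0.05)[:3])             # uncertainty bounds
```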
GOEAST: a web-based software toolkit for Gene Ontology enrichment analysis.
Zheng, Qi; Wang, Xiu-Jie
2008-07-01
Gene Ontology (GO) analysis has become a commonly used approach for functional studies of large-scale genomic or transcriptomic data. Although a lot of software with GO-related analysis functions has been developed, new tools are still needed to meet the requirements for data generated by newly developed technologies or for advanced analysis purposes. Here, we present a Gene Ontology Enrichment Analysis Software Toolkit (GOEAST), an easy-to-use web-based toolkit that identifies statistically overrepresented GO terms within given gene sets. Compared with available GO analysis tools, GOEAST has the following improved features: (i) GOEAST displays enriched GO terms in graphical format according to their relationships in the hierarchical tree of each GO category (biological process, molecular function and cellular component), and therefore provides better understanding of the correlations among enriched GO terms; (ii) GOEAST supports analysis for data from various sources (probe or probe set IDs of Affymetrix, Illumina, Agilent or customized microarrays, as well as different gene identifiers) and multiple species (about 60 prokaryote and eukaryote species); (iii) one unique feature of GOEAST is to allow cross comparison of the GO enrichment status of multiple experiments to identify functional correlations among them. GOEAST also provides rigorous statistical tests to enhance the reliability of analysis results. GOEAST is freely accessible at http://omicslab.genetics.ac.cn/GOEAST/
LONG TERM HYDROLOGICAL IMPACT ASSESSMENT (LTHIA)
LTHIA is a universal Urban Sprawl analysis tool that is available to all at no charge through the Internet. It estimates impacts on runoff, recharge and nonpoint source pollution resulting from past or proposed land use changes. It gives long-term average annual runoff for a lan...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munasinghe, M.; Meier, P.
1988-01-01
Given the importance of energy in modern economies, the first part of the volume is devoted to examining some of the key conceptual and analytical tools available for energy-policy analysis and planning. Policy tools and institutional frameworks that will facilitate better energy management are also discussed. Energy-policy analysis is explained, while effective energy management techniques are discussed to achieve desirable national objectives, using a selected set of policies and policy instruments. In the second part of the volume, the actual application of the principles set out earlier is explained through a case study of Sri Lanka. The monograph integrates the many aspects of the short-term programs already begun with the options for the medium to long term, and ends with the outline of a long-term strategy for Sri Lanka.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez Galdamez, Rinaldo A.; Recknagle, Kurtis P.
2012-04-30
This report provides an overview of the work performed for Solid Oxide Fuel Cell (SOFC) modeling during the 2012 Winter/Spring Science Undergraduate Laboratory Internship at Pacific Northwest National Laboratory (PNNL). A brief introduction on the concept, operation basics and applications of fuel cells is given for the general audience. Further details are given regarding the modifications and improvements of the Distributed Electrochemistry (DEC) Modeling tool developed by PNNL engineers to model SOFC long-term performance. Within this analysis, a literature review on anode degradation mechanisms is explained and future plans of implementing these into the DEC modeling tool are also proposed.
Tissue enrichment analysis for C. elegans genomics.
Angeles-Albores, David; N Lee, Raymond Y; Chan, Juancarlos; Sternberg, Paul W
2016-09-13
Over the last ten years, there has been explosive development in methods for measuring gene expression. These methods can identify thousands of genes altered between conditions, but understanding these datasets and forming hypotheses based on them remains challenging. One way to analyze these datasets is to associate ontologies (hierarchical, descriptive vocabularies with controlled relations between terms) with genes and to look for enrichment of specific terms. Although Gene Ontology (GO) is available for Caenorhabditis elegans, it does not include anatomical information. We have developed a tool for identifying enrichment of C. elegans tissues among gene sets and generated a website GUI where users can access this tool. Since a common drawback to ontology enrichment analyses is their verbosity, we developed a very simple filtering algorithm to reduce the ontology size by an order of magnitude. We adjusted these filters and validated our tool using a set of 30 gold standards from Expression Cluster data in WormBase. We show that our tool can discriminate between embryonic and larval tissues and can even identify tissues down to the single-cell level. We used our tool to identify multiple neuronal tissues that are down-regulated due to pathogen infection in C. elegans. Our Tissue Enrichment Analysis (TEA) can be found within WormBase, and can be downloaded using Python's standard pip installer. It tests a slimmed-down C. elegans tissue ontology for enrichment of specific terms and provides users with a text and graphic representation of the results.
Practical applications of surface analytic tools in tribology
NASA Technical Reports Server (NTRS)
Ferrante, J.
1980-01-01
Many of the currently available, widely used tools for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology and are truly surface sensitive (that is, less than 10 atomic layers) are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.
Automation life-cycle cost model
NASA Technical Reports Server (NTRS)
Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne
1992-01-01
The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; Life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and Uncertainty is a reality for increasingly complex problems and few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: Assess desirable capabilities (structure into near- and far-term); Identify useful existing models/data; Identify parameters for utility analysis; Define tool framework; Encode scenario thread for model validation; and Provide transition path for tool development. This report contains all relevant technical progress made on this contractual effort.
Analysis of Orbital Lifetime Prediction Parameters in Preparation for Post-Mission Disposal
NASA Astrophysics Data System (ADS)
Choi, Ha-Yeon; Kim, Hae-Dong; Seong, Jae-Dong
2015-12-01
Atmospheric drag force is an important source of perturbation for Low Earth Orbit (LEO) satellites, and solar activity is a major factor in changes in atmospheric density. In particular, the orbital lifetime of a satellite varies with changes in solar activity, so care must be taken in predicting the remaining orbital lifetime during preparation for post-mission disposal. In this paper, the System Tool Kit (STK®) Long-term Orbit Propagator is used to analyze the changes in orbital lifetime predictions with respect to solar activity. In addition, the STK® Lifetime tool is used to analyze the change in orbital lifetime with respect to the solar flux data generation needed for the orbital lifetime calculation, and with respect to control of the drag coefficient. Analysis showed that applying the most recent solar flux file within the Lifetime tool gives a predicted trend that is closest to the actual orbit. We also examine the effect of the drag coefficient by performing a comparative analysis between varying and constant coefficients in terms of solar activity intensities.
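To give a feel for why atmospheric density (and hence solar activity) drives the predicted lifetime, the sketch below applies the classical per-revolution decay of a circular orbit, Δa ≈ -2π C_D (A/m) ρ a², with hypothetical spacecraft and density values; it is a back-of-the-envelope model, not the STK Long-term Orbit Propagator.

```python
import math

R_EARTH = 6378.137e3       # Earth radius, m

def decay_revolutions(alt0_km, alt_end_km, cd, area_m2, mass_kg, density_fn):
    """Count revolutions for a circular orbit to decay between two altitudes
    using the per-revolution change da ~= -2*pi*Cd*(A/m)*rho*a^2."""
    a = R_EARTH + alt0_km * 1e3
    a_end = R_EARTH + alt_end_km * 1e3
    revs = 0
    while a > a_end and revs < 10_000_000:
        rho = density_fn((a - R_EARTH) / 1e3)        # kg/m^3 at current altitude
        a -= 2 * math.pi * cd * (area_m2 / mass_kg) * rho * a**2
        revs += 1
    return revs

# Hypothetical exponential density model; real density depends strongly on solar flux (F10.7).
density = lambda alt_km: 1e-12 * math.exp(-(alt_km - 400.0) / 60.0)
print(decay_revolutions(500, 300, cd=2.2, area_m2=1.0, mass_kg=50.0, density_fn=density))
```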
Discrete event simulation tool for analysis of qualitative models of continuous processing systems
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)
1990-01-01
An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.
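For readers unfamiliar with the discrete event technique referred to above, the sketch below shows the core pattern (a time-ordered event queue whose handlers schedule follow-on events after delays); it illustrates the general idea only, not the patented tool's module structure.

```python
import heapq

def simulate(initial_events, handlers, until=100.0):
    """Minimal discrete event loop: pop the earliest event, invoke its handler,
    and push any follow-on events it schedules as (delay, kind, payload)."""
    queue = list(initial_events)
    heapq.heapify(queue)
    while queue:
        time, kind, payload = heapq.heappop(queue)
        if time > until:
            break
        for delay, new_kind, new_payload in handlers[kind](time, payload):
            heapq.heappush(queue, (time + delay, new_kind, new_payload))

# Hypothetical continuous process modeled discretely: a tank alternately fills and drains.
def fill(t, level):
    print(f"t={t:5.1f}  filling, level={level}")
    return [(10.0, "drain", level + 1)] if level < 3 else []

def drain(t, level):
    print(f"t={t:5.1f}  draining, level={level}")
    return [(5.0, "fill", level - 1)]

simulate([(0.0, "fill", 0)], {"fill": fill, "drain": drain})
```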
A comparative analysis of Patient-Reported Expanded Disability Status Scale tools.
Collins, Christian DE; Ivry, Ben; Bowen, James D; Cheng, Eric M; Dobson, Ruth; Goodin, Douglas S; Lechner-Scott, Jeannette; Kappos, Ludwig; Galea, Ian
2016-09-01
Patient-Reported Expanded Disability Status Scale (PREDSS) tools are an attractive alternative to the Expanded Disability Status Scale (EDSS) during long term or geographically challenging studies, or in pressured clinical service environments. Because the studies reporting these tools have used different metrics to compare the PREDSS and EDSS, we undertook an individual patient data level analysis of all available tools. Spearman's rho and the Bland-Altman method were used to assess correlation and agreement respectively. A systematic search for validated PREDSS tools covering the full EDSS range identified eight such tools. Individual patient data were available for five PREDSS tools. Excellent correlation was observed between EDSS and PREDSS with all tools. A higher level of agreement was observed with increasing levels of disability. In all tools, the 95% limits of agreement were greater than the minimum EDSS difference considered to be clinically significant. However, the intra-class coefficient was greater than that reported for EDSS raters of mixed seniority. The visual functional system was identified as the most significant predictor of the PREDSS-EDSS difference. This analysis will (1) enable researchers and service providers to make an informed choice of PREDSS tool, depending on their individual requirements, and (2) facilitate improvement of current PREDSS tools. © The Author(s), 2015.
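The agreement statistics used above can be illustrated on hypothetical paired scores: Spearman's rho for correlation, and the Bland-Altman bias with 95% limits of agreement.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical paired disability scores (clinician EDSS vs patient-reported PREDSS).
edss   = np.array([1.5, 2.0, 3.0, 3.5, 4.0, 5.5, 6.0, 6.5, 7.0, 8.0])
predss = np.array([1.0, 2.5, 3.0, 4.0, 4.5, 5.0, 6.0, 7.0, 7.0, 8.5])

rho, p = spearmanr(edss, predss)                     # rank correlation

diff = predss - edss                                 # Bland-Altman agreement
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
print(f"rho={rho:.2f} (p={p:.3f}), bias={bias:.2f}, 95% limits of agreement={loa}")
```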
Prioritizing biological pathways by recognizing context in time-series gene expression data.
Lee, Jusang; Jo, Kyuri; Lee, Sunwon; Kang, Jaewoo; Kim, Sun
2016-12-23
The primary goal of pathway analysis using transcriptome data is to find significantly perturbed pathways. However, pathway analysis is not always successful in identifying pathways that are truly relevant to the context under study. A major reason for this difficulty is that a single gene is involved in multiple pathways. In the KEGG pathway database, there are 146 genes, each of which is involved in more than 20 pathways. Thus activation of even a single gene will result in activation of many pathways. This complex relationship often makes the pathway analysis very difficult. While we need much more powerful pathway analysis methods, a readily available alternative is to incorporate literature information. In this study, we propose a novel approach for prioritizing pathways by combining results from both pathway analysis tools and literature information. The basic idea is as follows. Whenever there are enough articles that provide evidence on which pathways are relevant to the context, we can be assured that the pathways are indeed related to the context, which is termed relevance in this paper. However, if there are few or no articles reported, then we should rely on the results from the pathway analysis tools, which is termed significance in this paper. We realized this concept as an algorithm by introducing a Context Score and an Impact Score and then combining the two into a single score. Our method ranked truly relevant pathways significantly higher than existing pathway analysis tools in experiments with two data sets. Our novel framework was implemented as ContextTRAP by utilizing two existing tools, TRAP and BEST. ContextTRAP will be a useful tool for the pathway-based analysis of gene expression data since the user can specify the context of the biological experiment in a set of keywords. The web version of ContextTRAP is available at http://biohealth.snu.ac.kr/software/contextTRAP .
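The abstract does not give the exact formula for combining the two scores, so the sketch below shows one plausible way to merge a literature-derived Context Score with a pathway-analysis Impact Score into a single ranking (a weighted sum of min-max normalized scores); it illustrates the idea, not ContextTRAP's actual scoring.

```python
def combined_ranking(context_scores, impact_scores, alpha=0.5):
    """Rank pathways by a weighted sum of min-max normalized scores.
    alpha balances literature evidence (context) against statistical evidence (impact)."""
    def normalize(scores):
        lo, hi = min(scores.values()), max(scores.values())
        return {k: (v - lo) / (hi - lo) if hi > lo else 0.0 for k, v in scores.items()}
    ctx, imp = normalize(context_scores), normalize(impact_scores)
    combined = {p: alpha * ctx[p] + (1 - alpha) * imp[p] for p in context_scores}
    return sorted(combined, key=combined.get, reverse=True)

# Hypothetical scores for three KEGG pathways.
context = {"Apoptosis": 12, "Cell cycle": 3, "p53 signaling": 8}      # supporting article counts
impact = {"Apoptosis": 0.4, "Cell cycle": 0.9, "p53 signaling": 0.7}  # pathway-tool scores
print(combined_ranking(context, impact, alpha=0.6))
```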
Taverna: a tool for building and running workflows of services
Hull, Duncan; Wolstencroft, Katy; Stevens, Robert; Goble, Carole; Pocock, Mathew R.; Li, Peter; Oinn, Tom
2006-01-01
Taverna is an application that eases the use and integration of the growing number of molecular biology tools and databases available on the web, especially web services. It allows bioinformaticians to construct workflows or pipelines of services to perform a range of different analyses, such as sequence analysis and genome annotation. These high-level workflows can integrate many different resources into a single analysis. Taverna is available freely under the terms of the GNU Lesser General Public License (LGPL) from . PMID:16845108
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Stone, John E.; Hynninen, Antti-Pekka; Phillips, James C.; Schulten, Klaus
2017-01-01
All-atom molecular dynamics simulations of biomolecules provide a powerful tool for exploring the structure and dynamics of large protein complexes within realistic cellular environments. Unfortunately, such simulations are extremely demanding in terms of their computational requirements, and they present many challenges in terms of preparation, simulation methodology, and analysis and visualization of results. We describe our early experiences porting the popular molecular dynamics simulation program NAMD and the simulation preparation, analysis, and visualization tool VMD to GPU-accelerated OpenPOWER hardware platforms. We report our experiences with compiler-provided autovectorization and compare with hand-coded vector intrinsics for the POWER8 CPU. We explore the performance benefits obtained from unique POWER8 architectural features such as 8-way SMT and its value for particular molecular modeling tasks. Finally, we evaluate the performance of several GPU-accelerated molecular modeling kernels and relate them to other hardware platforms. PMID:29202130
NASA Astrophysics Data System (ADS)
Huang, J. Y.; Tung, C. P.
2017-12-01
There is an important book in Chinese society called the "Peasant Calendar". The Peasant Calendar is originally based on the orbit of the Sun, and each year is divided into 24 solar terms. Each term has its own special meaning and conception. For example, "Spring Begins" means the end of winter and the beginning of spring. In Taiwan, the 24 solar terms play an important role in agriculture because farmers always use the Peasant Calendar to decide when to sow. However, each solar term in Taiwan is currently fixed at about 15 days. This does not capture the temporal variability of climate and cannot truly reflect the regional climate characteristics of different areas. The number of days in each solar term should be more flexible. Since weather is associated with climate, all weather phenomena can be regarded as a multiple-fluctuation signal. In this research, 30 years of observational data of surface temperature and precipitation from 1976 to 2016 are used. The data are cut into different time series, such as a week, a month, six months to one year, and so on. Signal analysis tools such as wavelet analysis, change-point analysis and the Fourier transform are used to determine the length of each solar term. After determining the days of each solar term, statistical tests are used to find the relationships between the length of solar terms and climate variability (e.g., ENSO and PDO). For example, one of the solar terms, called "Major Heat", should typically be more than 20 days in Taiwan due to global warming and the heat island effect. An improved Peasant Calendar can help farmers make better decisions, control crop schedules and use farmland more efficiently. For instance, warmer conditions can accelerate the accumulation of accumulated temperature, which is the key to a crop's growth stage. The result can also be used for disaster reduction (e.g., preventing agricultural damage) and water resources projects.
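One of the signal-analysis steps mentioned above, change-point detection, can be sketched with a simple cumulative-sum statistic on a temperature series; the data below are synthetic, and the authors' wavelet and Fourier analyses are not reproduced.

```python
import numpy as np

def cusum_change_point(series):
    """Locate the most likely single change in mean via the cumulative sum
    of deviations from the overall mean (largest |CUSUM| marks the shift)."""
    deviations = series - series.mean()
    cusum = np.cumsum(deviations)
    return int(np.argmax(np.abs(cusum)))

# Synthetic daily mean temperature: a cooler regime followed by a warmer one,
# standing in for the transition between two solar terms.
rng = np.random.default_rng(1)
temps = np.concatenate([rng.normal(24, 1.0, 18), rng.normal(29, 1.0, 20)])
print("estimated change day:", cusum_change_point(temps))   # expect near the regime shift (day ~17)
```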
Scenario analysis and strategic planning: practical applications for radiology practices.
Lexa, Frank James; Chan, Stephen
2010-05-01
Modern business science has many tools that can be of great value to radiologists and their practices. One of the most important and underused is long-term planning. Part of the problem has been the pace of change. Making a 5-year plan makes sense only if you develop robust scenarios of the possible future conditions you will face. Scenario analysis is one of many highly regarded tools that can improve your predictive capability. However, as with many tools, it pays to have some training and to get practical tips on how to improve their value. It also helps to learn from other people's mistakes rather than your own. The authors discuss both theoretical and practical issues in using scenario analysis to improve your planning process. They discuss actionable ways this set of tools can be applied in a group meeting or retreat. Copyright (c) 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.
A computational image analysis glossary for biologists.
Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M
2012-09-01
Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies.
HydroClimATe: hydrologic and climatic analysis toolkit
Dickinson, Jesse; Hanson, Randall T.; Predmore, Steven K.
2014-01-01
The potential consequences of climate variability and climate change have been identified as major issues for the sustainability and availability of the worldwide water resources. Unlike global climate change, climate variability represents deviations from the long-term state of the climate over periods of a few years to several decades. Currently, rich hydrologic time-series data are available, but the combination of data preparation and statistical methods developed by the U.S. Geological Survey as part of the Groundwater Resources Program is relatively unavailable to hydrologists and engineers who could benefit from estimates of climate variability and its effects on periodic recharge and water-resource availability. This report documents HydroClimATe, a computer program for assessing the relations between variable climatic and hydrologic time-series data. HydroClimATe was developed for a Windows operating system. The software includes statistical tools for (1) time-series preprocessing, (2) spectral analysis, (3) spatial and temporal analysis, (4) correlation analysis, and (5) projections. The time-series preprocessing tools include spline fitting, standardization using a normal or gamma distribution, and transformation by a cumulative departure. The spectral analysis tools include discrete Fourier transform, maximum entropy method, and singular spectrum analysis. The spatial and temporal analysis tool is empirical orthogonal function analysis. The correlation analysis tools are linear regression and lag correlation. The projection tools include autoregressive time-series modeling and generation of many realizations. These tools are demonstrated in four examples that use stream-flow discharge data, groundwater-level records, gridded time series of precipitation data, and the Multivariate ENSO Index.
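As a small illustration of the spectral-analysis component (the discrete Fourier transform option), the sketch below computes a periodogram of a synthetic monthly hydrologic series with SciPy and picks out the dominant period; the maximum entropy, singular spectrum, and EOF tools are not reproduced.

```python
import numpy as np
from scipy.signal import periodogram

# Synthetic monthly groundwater-level anomalies with an annual cycle plus noise.
rng = np.random.default_rng(2)
months = np.arange(240)                               # 20 years of monthly data
levels = 0.8 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.3, months.size)

freqs, power = periodogram(levels, fs=12.0)           # fs in samples per year
dominant = freqs[np.argmax(power[1:]) + 1]            # skip the zero-frequency bin
print(f"dominant frequency: {dominant:.2f} cycles/yr (period ~{1 / dominant:.1f} yr)")
```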
WEC Design Response Toolbox v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey
2016-03-30
The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.
ERIC Educational Resources Information Center
Rose, Carolyn; Wang, Yi-Chia; Cui, Yue; Arguello, Jaime; Stegmann, Karsten; Weinberger, Armin; Fischer, Frank
2008-01-01
In this article we describe the emerging area of text classification research focused on the problem of collaborative learning process analysis both from a broad perspective and more specifically in terms of a publicly available tool set called TagHelper tools. Analyzing the variety of pedagogically valuable facets of learners' interactions is a…
Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide
NASA Astrophysics Data System (ADS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.
1993-03-01
A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS; presents an example problem; and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.
Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide
NASA Technical Reports Server (NTRS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.
1993-01-01
A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS; presents an example problem; and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.
Application of risk analysis in water resources management
NASA Astrophysics Data System (ADS)
Varouchakis, Emmanouil; Palogos, Ioannis
2017-04-01
A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision making on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined applies a scaled parabolic fine that varies with over-pumping violations, in contrast to common practices that usually impose short-term fines. An application of this kind, at this level of detail, is new. The results indicate that the probability uncertainty is the driving issue that determines the optimal decision with each methodology, and depending on how the unknown probability is handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers (stakeholders) examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause inside an audit interval. In contrast to practices that assess the effect of each proposed action separately considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits. The tool is deployed as a web service for easier stakeholder access.
A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.
ERIC Educational Resources Information Center
Kim, Jin Eun
A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…
1999-03-01
... of epistemic forms and games, which can form the basis for building a tool to support expert analyses. Subject terms: expert analysis, epistemic forms, epistemic games. Principal Investigators: Allan Collins and William Ferguson, BBN Technologies (1998). Contents: Introduction; Prior Work; Structural-Analysis Games; Functional ...
Whalen, Kimberly J; Buchholz, Susan W
The overall objective of this review is to quantitatively measure the psychometric properties and the feasibility of caregiver burden screening tools. The more specific objectives were to determine the reliability, validity, and feasibility of tools that are used to screen for caregiver burden and strain. This review considered international quantitative research papers that addressed the psychometric properties and feasibility of caregiver burden screening tools. The search strategy aimed to find both published and unpublished studies from 1980-2007, limited to the English language. An initial limited search of MEDLINE and CINAHL was undertaken, followed by analysis of the text words contained in the title and abstract and the index terms used to describe the article. A second search identified keywords and index terms across major databases. Third, the reference list of identified reports and articles was searched for additional studies. Each paper was assessed by two independent reviewers for methodological quality prior to inclusion in the review using an appropriate critical appraisal instrument from the Joanna Briggs Institute's System for the Unified Management, Assessment and Review (SUMARI) package. Because burden is a multidimensional construct defined internationally with a multitude of other terms, only those studies whose title, abstract or keywords contained the search terminology developed for this review were identified for retrieval. The construct of caregiver burden is not standardized, and many terms are used to describe burden. A caregiver is also identified as a carer. Instruments exist in multiple languages and have been tested in multiple populations. A total of 112 papers, experimental and non-experimental in nature, were included in the review. The majority of papers were non-experimental studies that tested or used a caregiver burden screening tool. Because of the nature of these papers, a meta-analysis of the results was not possible. Instead, a table is used to depict the 74 caregiver burden screening tools that meet the psychometric and feasibility standards of this review. The Zarit Burden Interview (ZBI), in particular the 22-item version, has been examined the most throughout the literature. In addition to its sound psychometric properties, the ZBI has been widely used across languages and cultures. The significant amount of research that has already been done on psychometric testing of caregiver burden tools has provided a solid foundation for additional research. Although some tools have been well tested, many tools have only limited published psychometric and feasibility data. The clinician needs to be aware of this and may need to team up with a researcher to obtain additional research data on their specific population before using a minimally tested caregiver burden screening tool. Because caregiver burden is multidimensional and many different terms are used to describe burden, both the clinician and researcher need to be precise in their selection of the appropriate tool for their work.
Surface analysis of stone and bone tools
NASA Astrophysics Data System (ADS)
Stemp, W. James; Watson, Adam S.; Evans, Adrian A.
2016-03-01
Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.
Comparative genome analysis in the integrated microbial genomes (IMG) system.
Markowitz, Victor M; Kyrpides, Nikos C
2007-01-01
Comparative genome analysis is critical for the effective exploration of a rapidly growing number of complete and draft sequences for microbial genomes. The Integrated Microbial Genomes (IMG) system (img.jgi.doe.gov) has been developed as a community resource that provides support for comparative analysis of microbial genomes in an integrated context. IMG allows users to navigate the multidimensional microbial genome data space and focus their analysis on a subset of genes, genomes, and functions of interest. IMG provides graphical viewers, summaries, and occurrence profile tools for comparing genes, pathways, and functions (terms) across specific genomes. Genes can be further examined using gene neighborhoods and compared with sequence alignment tools.
McNamee, R L; Eddy, W F
2001-12-01
Analysis of variance (ANOVA) is widely used for the study of experimental data. Here, the reach of this tool is extended to cover the preprocessing of functional magnetic resonance imaging (fMRI) data. This technique, termed visual ANOVA (VANOVA), provides both numerical and pictorial information to aid the user in understanding the effects of various parts of the data analysis. Unlike a formal ANOVA, this method does not depend on the mathematics of orthogonal projections or strictly additive decompositions. An illustrative example is presented and the application of the method to a large number of fMRI experiments is discussed. Copyright 2001 Wiley-Liss, Inc.
van Dusseldorp, Loes; Hamers, Hub; van Achterberg, Theo; Schoonhoven, Lisette
2014-07-15
At many hospitals and long-term care organizations (such as nursing homes), executive board members have a responsibility to manage patient safety. Executive WalkRounds offer an opportunity for boards to build a trusting relationship with professionals and seem useful as a leadership tool to pick up on soft signals, which are indirect signals or early warnings that something is wrong. Because the majority of the research on WalkRounds has been performed in hospitals, it is unknown how board members of long-term care organizations develop their patient safety policy. Also, it is not clear if these board members use soft signals as a leadership tool and, if so, how this influences their patient safety policies. The objective of this study is to explore the added value and the feasibility of WalkRounds for patient safety management in long-term care. This study also aims to identify how executive board members of long-term care organizations manage patient safety and to describe the characteristics of boards. An explorative before-and-after study was conducted between April 2012 and February 2014 in 13 long-term care organizations in the Netherlands. After implementing the intervention in 6 organizations, data from 72 WalkRounds were gathered by observation and a reporting form. Before and after the intervention period, data collection included interviews, questionnaires, and studying reports of the executive boards. A mixed-method analysis is performed using descriptive statistics, t tests, and content analysis. Results are expected to be ready in mid 2014. It is a challenge to keep track of ongoing development and implementation of patient safety management tools in long-term care. By performing this study in cooperation with the participating long-term care organizations, insight into the potential added value and the feasibility of this method will increase.
Image Analysis in Plant Sciences: Publish Then Perish.
Lobet, Guillaume
2017-07-01
Image analysis has become a powerful technique for most plant scientists. In recent years dozens of image analysis tools have been published in plant science journals. These tools cover the full spectrum of plant scales, from single cells to organs and canopies. However, the field of plant image analysis remains in its infancy. It still has to overcome important challenges, such as the lack of robust validation practices or the absence of long-term support. In this Opinion article, I: (i) present the current state of the field, based on data from the plant-image-analysis.org database; (ii) identify the challenges faced by its community; and (iii) propose workable ways of improvement. Copyright © 2017 Elsevier Ltd. All rights reserved.
Optimality problem of network topology in stocks market analysis
NASA Astrophysics Data System (ADS)
Djauhari, Maman Abdurachman; Gan, Siew Lee
2015-02-01
Since its introduction fifteen years ago, the minimal spanning tree has become an indispensable tool in econophysics. It is used to filter the important economic information contained in a complex system of financial market commodities. Here we show that, in general, that tool is not optimal in terms of topological properties. Consequently, the economic interpretation of the filtered information might be misleading. To overcome that non-optimality problem, a set of criteria and a selection procedure for an optimal minimal spanning tree will be developed. By using New York Stock Exchange data, the advantages of the proposed method will be illustrated in terms of the power law of the degree distribution.
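For orientation, the correlation-based minimal spanning tree construction that this line of work builds on can be sketched as follows. This is an illustrative reconstruction of the standard Mantegna-style filtering (distance d = sqrt(2(1 - rho))), not the authors' optimality criteria or selection procedure; it assumes daily closing prices in a pandas DataFrame and uses networkx.

```python
import numpy as np
import pandas as pd
import networkx as nx

def correlation_mst(prices: pd.DataFrame) -> nx.Graph:
    """Build a minimal spanning tree from closing prices
    (rows = days, columns = stocks) using the correlation distance
    d_ij = sqrt(2 * (1 - rho_ij))."""
    returns = np.log(prices).diff().dropna()   # daily log returns
    rho = returns.corr()                       # correlation matrix
    dist = np.sqrt(2.0 * (1.0 - rho))          # correlation distance

    g = nx.Graph()
    tickers = list(prices.columns)
    for i, a in enumerate(tickers):
        for b in tickers[i + 1:]:
            g.add_edge(a, b, weight=float(dist.loc[a, b]))
    return nx.minimum_spanning_tree(g, weight="weight")

# The degree distribution of the filtered network is what a power-law
# check of the kind mentioned in the abstract would be run on:
# mst = correlation_mst(prices_df)
# degrees = [d for _, d in mst.degree()]
```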
Louis. R. Iverson; Paul. G. Risser; Paul. G. Risser
1987-01-01
Geographic information systems and remote sensing techniques are powerful tools in the analysis of long-term changes in vegetation and land use, especially because spatial information from two or more time intervals can be compared more readily than by manual methods. A primary restriction is the paucity of data that has been digitized from earlier periods. The...
Mathematical tool from corn stover TGA to determine its composition.
Freda, Cesare; Zimbardi, Francesco; Nanna, Francesco; Viola, Egidio
2012-08-01
Corn stover was treated by a steam explosion process at four different temperatures. A fraction of each of the four exploded materials was extracted with water. The eight samples (four from steam explosion and four from water extraction of the exploded materials) were analysed by wet chemistry to quantify the amount of cellulose, hemicellulose and lignin. Thermogravimetric analysis (TGA) in an air atmosphere was performed on the eight samples. A mathematical tool was developed, using the TGA data, to determine the composition of corn stover in terms of cellulose, hemicellulose and lignin. It models the biomass degradation temperature as a multiple linear function of the cellulose, hemicellulose and lignin content of the biomass, with interaction terms. The mathematical tool predicted cellulose, hemicellulose and lignin contents with average absolute errors of 1.69, 5.59 and 0.74%, respectively, compared to the wet chemical method.
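As a rough illustration of the modelling idea (not the authors' actual calibration or data), a multiple linear fit of degradation temperature against cellulose, hemicellulose and lignin fractions with pairwise interaction terms might look like the sketch below; all numbers are placeholders.

```python
import numpy as np

# Placeholder data: fractions of cellulose (C), hemicellulose (H),
# lignin (L) and the observed degradation temperature T for each sample.
C = np.array([0.38, 0.41, 0.35, 0.44, 0.40, 0.37, 0.42, 0.39])
H = np.array([0.26, 0.22, 0.28, 0.20, 0.24, 0.27, 0.21, 0.25])
L = np.array([0.18, 0.21, 0.17, 0.23, 0.19, 0.18, 0.22, 0.20])
T = np.array([331., 335., 328., 339., 333., 330., 337., 332.])

# Design matrix: intercept, main effects, and pairwise interaction terms.
X = np.column_stack([np.ones_like(C), C, H, L, C * H, C * L, H * L])
coef, *_ = np.linalg.lstsq(X, T, rcond=None)

# Once fitted, the relation can be used the other way round: estimating
# composition from measured degradation temperatures, as the abstract describes.
T_hat = X @ coef
print("fitted coefficients:", coef)
print("mean absolute error:", np.mean(np.abs(T - T_hat)))
```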
Serious Games as New Educational Tools: How Effective Are They? A Meta-Analysis of Recent Studies
ERIC Educational Resources Information Center
Girard, C.; Ecalle, J.; Magnan, A.
2013-01-01
Computer-assisted learning is known to be an effective tool for improving learning in both adults and children. Recent years have seen the emergence of the so-called "serious games (SGs)" that are flooding the educational games market. In this paper, the term "serious games" is used to refer to video games (VGs) intended to serve a useful purpose.…
NASA Astrophysics Data System (ADS)
Brennan-Tonetta, Margaret
This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reductions. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock and policy options. The methodology and decision support tool are applicable to other states and regions (with location-specific modifications), thus contributing to the achievement of state and federal goals of renewable energy utilization. (3) Development of policy recommendations based on the results of the decision support tool that will help to guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first-ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. As such, the current database is not able to perform analysis of tradeoffs across broad policy objectives such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste. Instead, it operates one level below that, with comparisons of kWh or GGE generated by different feedstock/technology combinations at the state and county level. Modification of the model to incorporate factors that will enable the analysis of broader energy policy issues, such as those mentioned above, is recommended for future research efforts.
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay W.; Lyzenga, Gregory A.; Granat, Robert A.; Norton, Charles D.; Rundle, John B.; Pierce, Marlon E.; Fox, Geoffrey C.; McLeod, Dennis; Ludwig, Lisa Grant
2012-01-01
QuakeSim 2.0 improves understanding of earthquake processes by providing modeling tools and integrating model applications and various heterogeneous data sources within a Web services environment. QuakeSim is a multisource, synergistic, data-intensive environment for modeling the behavior of earthquake faults individually, and as part of complex interacting systems. Remotely sensed geodetic data products may be explored, compared with faults and landscape features, mined by pattern analysis applications, and integrated with models and pattern analysis applications in a rich Web-based and visualization environment. Integration of heterogeneous data products with pattern informatics tools enables efficient development of models. Federated database components and visualization tools allow rapid exploration of large datasets, while pattern informatics enables identification of subtle, but important, features in large data sets. QuakeSim is valuable for earthquake investigations and modeling in its current state, and also serves as a prototype and nucleus for broader systems under development. The framework provides access to physics-based simulation tools that model the earthquake cycle and related crustal deformation. Spaceborne GPS and Interferometric Synthetic Aperture Radar (InSAR) data provide information on near-term crustal deformation, while paleoseismic geologic data provide longer-term information on earthquake fault processes. These data sources are integrated into QuakeSim's QuakeTables database system, and are accessible by users or various model applications. UAVSAR repeat pass interferometry data products are added to the QuakeTables database, and are available through a browseable map interface or Representational State Transfer (REST) interfaces. Model applications can retrieve data from QuakeTables, or from third-party GPS velocity data services; alternatively, users can manually input parameters into the models. Pattern analysis of GPS and seismicity data has proved useful for mid-term forecasting of earthquakes, and for detecting subtle changes in crustal deformation. The GPS time series analysis has also proved useful as a data-quality tool, enabling the discovery of station anomalies and data processing and distribution errors. Improved visualization tools enable more efficient data exploration and understanding. Tools provide flexibility to science users for exploring data in new ways through download links, but also facilitate standard, intuitive, and routine uses for science users and end users such as emergency responders.
Whalen, Kimberly; Bavuso, Karen; Bouyer-Ferullo, Sharon; Goldsmith, Denise; Fairbanks, Amanda; Gesner, Emily; Lagor, Charles; Collins, Sarah
2016-01-01
To understand requests for nursing Clinical Decision Support (CDS) interventions at a large integrated health system undergoing vendor-based EHR implementation. In addition, to establish a process to guide both short-term implementation and long-term strategic goals to meet nursing CDS needs. We conducted an environmental scan to understand the current state of nursing CDS over three months. The environmental scan consisted of a literature review and an analysis of CDS requests received from across our health system. We identified existing high priority CDS and paper-based tools used in nursing practice at our health system that guide decision-making. A total of 46 nursing CDS requests were received. Fifty-six percent (n=26) were specific to a clinical specialty; 22 percent (n=10) were focused on facilitating clinical consults in the inpatient setting. "Risk Assessments/Risk Reduction/Promotion of Healthy Habits" (n=23) was the most requested High Priority Category received for nursing CDS. A continuum of types of nursing CDS needs emerged using the Data-Information-Knowledge-Wisdom Conceptual Framework: 1) facilitating data capture, 2) meeting information needs, 3) guiding knowledge-based decision making, and 4) exposing analytics for wisdom-based clinical interpretation by the nurse. Identifying and prioritizing paper-based tools that can be modified into electronic CDS is a challenge. CDS strategy is an evolving process that relies on close collaboration and engagement with clinical sites for short-term implementation and should be incorporated into a long-term strategic plan that can be optimized and achieved over time. The Data-Information-Knowledge-Wisdom Conceptual Framework in conjunction with the High Priority Categories established may be a useful tool to guide a strategic approach for meeting short-term nursing CDS needs and aligning with the organizational strategic plan.
Practical applications of surface analytic tools in tribology
NASA Technical Reports Server (NTRS)
Ferrante, J.
1980-01-01
A brief description of many of the widely used tools is presented. Of this list, those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology along with being truly surface sensitive (that is less than 10 atomic layers) are presented. The latter group is critiqued in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under real conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.
EEG analysis using wavelet-based information tools.
Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A
2006-06-15
Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizure, triggers a self-organized brain state characterized by both order and maximal complexity.
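A minimal sketch of how relative wavelet energies and a normalized wavelet entropy can be computed for a single EEG epoch, assuming the PyWavelets package and a Daubechies-4 decomposition; the wavelet, level, and signal below are illustrative and not the parameters of the reviewed study.

```python
import numpy as np
import pywt

def wavelet_energy_entropy(signal, wavelet="db4", level=5):
    """Relative wavelet energies and normalized wavelet entropy of a
    1-D EEG epoch from a multiresolution decomposition."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()              # relative wavelet energies
    p = np.clip(p, 1e-12, None)                # guard against empty bands
    entropy = -np.sum(p * np.log(p)) / np.log(len(p))  # normalized to [0, 1]
    return p, entropy

# Example on a synthetic epoch (stand-in for a real EEG record):
fs = 256.0
t = np.arange(0, 4, 1 / fs)
epoch = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
rel_energy, wentropy = wavelet_energy_entropy(epoch)
print(rel_energy, wentropy)
```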
Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp
2016-11-18
ChIP-seq and related high-throughput chromatin profiling assays generate ever-increasing volumes of highly valuable biological data. To make sense of it, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis of such data. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting with a read alignment file. The tools are optimized for memory efficiency and speed, thus allowing for processing of large data volumes on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design in that the output from one application can serve as input to another one. Complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/ .
A cascading failure analysis tool for post processing TRANSCARE simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
This is a MATLAB-based tool to post-process simulation results in the EPRI software TRANSCARE, for massive cascading failure analysis following severe disturbances. There are a few key modules available in this tool, including: 1. automatically creating a contingency list to run TRANSCARE simulations, including substation outages above a certain kV threshold, N-k (1, 2 or 3) generator outages, and branch outages; 2. reading in and analyzing a CKO file of a PCG definition, an initiating event list, and a CDN file; 3. post-processing all the simulation results saved in a CDN file and performing critical event corridor analysis; 4. providing a summary of TRANSCARE simulations; 5. identifying the most frequently occurring event corridors in the system; and 6. ranking the contingencies using a user-defined security index to quantify consequences in terms of total load loss, total number of cascades, etc.
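The ranking step in item 6 lends itself to a compact illustration. The tool itself is MATLAB-based; the Python sketch below only shows the idea of a user-defined security index that weights total load loss and cascade count, with hypothetical contingency names, field names, and weights.

```python
from dataclasses import dataclass

@dataclass
class ContingencyResult:
    name: str            # e.g. "N-2: GEN_12 + LINE_45" (illustrative)
    load_loss_mw: float  # total load lost in the cascade
    n_cascades: int      # number of cascading stages observed

def security_index(r: ContingencyResult,
                   w_load: float = 1.0,
                   w_cascade: float = 50.0) -> float:
    """User-defined severity score: weighted sum of total load loss and
    the number of cascading stages (weights are illustrative)."""
    return w_load * r.load_loss_mw + w_cascade * r.n_cascades

def rank_contingencies(results):
    """Rank post-processed contingencies from most to least severe."""
    return sorted(results, key=security_index, reverse=True)

# Usage with placeholder post-processed results:
results = [
    ContingencyResult("N-1: GEN_07", 120.0, 1),
    ContingencyResult("N-2: SUB_03 outage", 640.0, 4),
    ContingencyResult("N-3: LINE_10+LINE_11+GEN_02", 310.0, 2),
]
for r in rank_contingencies(results):
    print(f"{r.name}: index = {security_index(r):.1f}")
```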
Wilkinson, Jessica; Goff, Morgan; Rusoja, Evan; Hanson, Carl; Swanson, Robert Chad
2018-06-01
This review of systems thinking (ST) case studies seeks to compile and analyse cases from ST literature and provide practitioners with a reference for ST in health practice. Particular attention was given to (1) reviewing the frequency and use of key ST terms, methods, and tools in the context of health, and (2) extracting and analysing longitudinal themes across cases. A systematic search of databases was conducted, and a total of 36 case studies were identified. A combination of integrative and inductive qualitative approaches to analysis was used. Most cases identified took place in high-income countries and applied ST retrospectively. The most commonly used ST terms were agent/stakeholder/actor (n = 29), interdependent/interconnected (n = 28), emergence (n = 26), and adaptability/adaptation (n = 26). Common ST methods and tools were largely underutilized. Social network analysis was the most commonly used method (n = 4), and innovation or change management history was the most frequently used tool (n = 11). Four overarching themes were identified; the importance of the interdependent and interconnected nature of a health system, characteristics of leaders in a complex adaptive system, the benefits of using ST, and barriers to implementing ST. This review revealed that while much has been written about the potential benefits of applying ST to health, it has yet to completely transition from theory to practice. There is however evidence of the practical use of an ST lens as well as specific methods and tools. With clear examples of ST applications, the global health community will be better equipped to understand and address key health challenges. © 2017 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Phojanamongkolkij, Nipa; Okuniek, Nikolai; Lohr, Gary W.; Schaper, Meilin; Christoffels, Lothar; Latorella, Kara A.
2014-01-01
The runway is a critical resource of any air transport system. It is used for arrivals, departures, and for taxiing aircraft and is universally acknowledged as a constraining factor to capacity for both surface and airspace operations. It follows that investigation of the effective use of runways, both in terms of selection and assignment as well as the timing and sequencing of the traffic, is paramount to efficient traffic flows. Both the German Aerospace Center (DLR) and NASA have developed concepts and tools to improve atomic aspects of coordinated arrival/departure/surface management operations and runway configuration management. In December 2012, NASA entered into a Collaborative Agreement with DLR. Four collaborative work areas were identified, one of which is called "Runway Management." As part of collaborative research in the "Runway Management" area, which is conducted with the DLR Institute of Flight Guidance, located in Braunschweig, the goal is to develop an integrated system composed of the three DLR tools - arrival, departure, and surface management (collectively referred to as A/D/S-MAN) - and NASA's tactical runway configuration management (TRCM) tool. To achieve this goal, it is critical to prepare a concept of operations (ConOps) detailing how the NASA runway management and DLR arrival, departure, and surface management tools will function together to the benefit of each. To assist with the preparation of the ConOps, the integrated NASA and DLR tools are assessed through a functional analysis method described in this report. The report first provides the high-level operational environments for air traffic management (ATM) in Germany and in the U.S., and descriptions of DLR's A/D/S-MAN and NASA's TRCM tools at the level of detail necessary to complement the purpose of the study. Functional analyses of each tool and a completed functional analysis of an integrated system design are presented next in the report. Future efforts to fully develop the ConOps will include: developing scenarios to fully test environmental, procedural, and data availability assumptions; executing the analysis by a walk-through of the integrated system using these scenarios; defining the appropriate role of operators in terms of their monitoring requirements and decision authority; executing the analysis by a walk-through of the integrated system with operator involvement; characterizing the environmental, system data requirements, and operator role assumptions for the ConOps.
Redefining risk research priorities for nanomaterials
NASA Astrophysics Data System (ADS)
Grieger, Khara D.; Baun, Anders; Owen, Richard
2010-02-01
Chemical-based risk assessment underpins the current approach to responsible development of nanomaterials (NM). It is now recognised, however, that this process may take decades, leaving decision makers with little support in the near term. Despite this, current and near future research efforts are largely directed at establishing (eco)toxicological and exposure data for NM, and comparatively little research has been undertaken on tools or approaches that may facilitate near-term decisions, some of which we briefly outline in this analysis. We propose a reprioritisation of NM risk research efforts to redress this imbalance, including the development of more adaptive risk governance frameworks, alternative/complementary tools to risk assessment, and health and environment surveillance.
I PASS: an interactive policy analysis simulation system.
Doug Olson; Con Schallau; Wilbur Maki
1984-01-01
This paper describes an interactive policy analysis simulation system (IPASS) that can be used to analyze the long-term economic and demographic effects of alternative forest resource management policies. The IPASS model is a dynamic analytical tool that forecasts growth and development of an economy. It allows the user to introduce changes in selected parameters based...
Principles and tools for collaborative entity-based intelligence analysis.
Bier, Eric A; Card, Stuart K; Bodnar, John W
2010-01-01
Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.
Analysis of post-mining excavations as places for municipal waste
NASA Astrophysics Data System (ADS)
Górniak-Zimroz, Justyna
2018-01-01
Waste management planning is an interdisciplinary task covering a wide range of issues including costs, legal requirements, spatial planning, environmental protection, geography, demographics, and the techniques used in collecting, transporting, processing and disposing of waste. Designing and analyzing this issue is difficult and requires the use of advanced analysis methods and tools available in GIS geographic information systems, which contain readily available graphical and descriptive databases, data analysis tools providing expert decision support while selecting the best-designed alternative, and simulation models that allow the user to simulate many variants of waste management together with graphical visualization of the results of the performed analyses. As part of the research study, work has been undertaken on the use of multi-criteria data analysis in waste management in areas located in southwestern Poland. This work proposes including post-mining excavations in waste management as places for the final or temporary collection of waste, assessed in terms of their suitability with the tools available in GIS systems.
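The multi-criteria suitability assessment can be illustrated with a simple weighted-sum score of the kind commonly used in GIS site-selection analyses; the criteria names, weights, site names, and values below are hypothetical and only indicate the shape of the calculation, not the study's actual criteria.

```python
# Weighted-sum multi-criteria scoring for candidate post-mining excavations.
# Criterion values are assumed normalized to [0, 1], where 1 is most
# favourable; names and weights are illustrative only.
WEIGHTS = {
    "distance_to_settlements": 0.30,
    "groundwater_protection": 0.35,
    "transport_access": 0.20,
    "remaining_capacity": 0.15,
}

def suitability(site: dict) -> float:
    """Composite suitability score of one candidate site."""
    return sum(WEIGHTS[k] * site[k] for k in WEIGHTS)

candidates = {
    "excavation_A": {"distance_to_settlements": 0.8, "groundwater_protection": 0.6,
                     "transport_access": 0.7, "remaining_capacity": 0.9},
    "excavation_B": {"distance_to_settlements": 0.5, "groundwater_protection": 0.9,
                     "transport_access": 0.4, "remaining_capacity": 0.6},
}

# Rank candidate sites from most to least suitable.
for name, crit in sorted(candidates.items(),
                         key=lambda kv: suitability(kv[1]), reverse=True):
    print(f"{name}: {suitability(crit):.2f}")
```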
ISS Mini AERCam Radio Frequency (RF) Coverage Analysis Using iCAT Development Tool
NASA Technical Reports Server (NTRS)
Bolen, Steve; Vazquez, Luis; Sham, Catherine; Fredrickson, Steven; Fink, Patrick; Cox, Jan; Phan, Chau; Panneton, Robert
2003-01-01
The long-term goals of the National Aeronautics and Space Administration's (NASA's) Human Exploration and Development of Space (HEDS) enterprise may require the development of autonomous free-flier (FF) robotic devices to operate within the vicinity of low-Earth orbiting spacecraft to supplement human extravehicular activities (EVAs) in space. Future missions could require external visual inspection of the spacecraft that would be difficult, or dangerous, for humans to perform. Under some circumstances, it may be necessary to employ an un-tethered communications link between the FF and the users. The interactive coverage analysis tool (iCAT) is a software tool that has been developed to perform critical analysis of the communications link performance for a FF operating in the vicinity of the International Space Station (ISS) external environment. The tool allows users to interactively change multiple parameters of the communications link to efficiently perform systems engineering trades on network performance. These trades can be directly translated into design and requirements specifications. This tool significantly reduces the development time in determining a communications network topology by allowing multiple parameters to be changed, and the results of link coverage to be statistically characterized and plotted interactively.
A comprehensive comparison of tools for differential ChIP-seq analysis
Steinhauser, Sebastian; Kurzawa, Nils; Eils, Roland
2016-01-01
ChIP-seq has become a widely adopted genomic assay in recent years to determine binding sites for transcription factors or enrichments for specific histone modifications. Besides detection of enriched or bound regions, an important question is to determine differences between conditions. While this is a common analysis for gene expression, for which a large number of computational approaches have been validated, the same question for ChIP-seq is particularly challenging owing to the complexity of ChIP-seq data in terms of noisiness and variability. Many different tools have been developed and published in recent years. However, a comprehensive comparison and review of these tools is still missing. Here, we have reviewed 14 tools, which have been developed to determine differential enrichment between two conditions. They differ in their algorithmic setups, and also in the range of applicability. Hence, we have benchmarked these tools on real data sets for transcription factors and histone modifications, as well as on simulated data sets to quantitatively evaluate their performance. Overall, there is a great variety in the type of signal detected by these tools with a surprisingly low level of agreement. Depending on the type of analysis performed, the choice of method will crucially impact the outcome. PMID:26764273
Wei, Qing; Khan, Ishita K; Ding, Ziyun; Yerneni, Satwica; Kihara, Daisuke
2017-03-20
The number of genomics and proteomics experiments is growing rapidly, producing an ever-increasing amount of data that are awaiting functional interpretation. A number of function prediction algorithms were developed and improved to enable fast and automatic function annotation. With its well-defined structure and manual curation, Gene Ontology (GO) is the most frequently used vocabulary for representing gene functions. To understand the relationships and similarity between GO annotations of genes, it is important to have a convenient pipeline that quantifies and visualizes GO function analyses in a systematic fashion. NaviGO is a web-based tool for interactive visualization, retrieval, and computation of functional similarity and associations of GO terms and genes. Similarity of GO terms and gene functions is quantified with six different scores, including protein-protein interaction and context-based association scores we have developed in our previous works. Interactive navigation of the GO function space provides intuitive and effective real-time visualization of functional groupings of GO terms and genes as well as statistical analysis of enriched functions. We developed NaviGO, which visualizes and analyses functional similarity and associations of GO terms and genes. The NaviGO webserver is freely available at: http://kiharalab.org/web/navigo .
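To give a sense of how an information-content-based GO similarity is computed, the toy sketch below implements Resnik similarity (the IC of the most informative common ancestor). Resnik is used here only as a classical stand-in; it is not necessarily one of NaviGO's six scores, and the mini-ontology and annotation counts are invented.

```python
import math

# Toy ancestor table and cumulative annotation counts (illustrative only);
# in a real setting these come from the GO DAG and GOA annotations.
ANCESTORS = {
    "GO:A": {"GO:A", "GO:root"},
    "GO:B": {"GO:B", "GO:A", "GO:root"},
    "GO:C": {"GO:C", "GO:A", "GO:root"},
    "GO:root": {"GO:root"},
}
TERM_COUNTS = {"GO:root": 100, "GO:A": 40, "GO:B": 10, "GO:C": 15}
TOTAL = TERM_COUNTS["GO:root"]

def ic(term: str) -> float:
    """Information content: -log of annotation frequency."""
    return -math.log(TERM_COUNTS[term] / TOTAL)

def resnik(t1: str, t2: str) -> float:
    """Resnik similarity: IC of the most informative common ancestor."""
    common = ANCESTORS[t1] & ANCESTORS[t2]
    return max(ic(t) for t in common)

print(resnik("GO:B", "GO:C"))  # IC of GO:A, their most informative ancestor
```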
Névéol, Aurélie; Zeng, Kelly; Bodenreider, Olivier
2006-01-01
Objective: This paper explores alternative approaches for the evaluation of an automatic indexing tool for MEDLINE, complementing the traditional precision and recall method. Materials and methods: The performance of MTI, the Medical Text Indexer used at NLM to produce MeSH recommendations for biomedical journal articles, is evaluated on a random set of MEDLINE citations. The evaluation examines semantic similarity at the term level (indexing terms). In addition, the documents retrieved by queries resulting from MTI index terms for a given document are compared to the PubMed related citations for this document. Results: Semantic similarity scores between sets of index terms are higher than the corresponding Dice similarity scores. Overall, 75% of the original documents and 58% of the top ten related citations are retrieved by queries based on the automatic indexing. Conclusions: The alternative measures studied in this paper confirm previous findings and may be used to select particular documents from the test set for a more thorough analysis. PMID:17238409
Using association rules to measure Subjective Organization after Acquired Brain Injury.
Parente, Frederick; Finley, John-Christopher
2018-01-01
Subjective Organization (SO) refers to the human tendency to impose organization on our environment. Persons with Acquired Brain Injury (ABI) often lose the ability to organize; however, there are no performance-based measures of organization that can be used to document this disability. The authors propose a method of association rule analysis (AR) that can be used as a clinical tool for assessing a patient's ability to organize. Twenty-three patients with ABI recalled a list of twelve unrelated nouns over twelve study and test trials. Several measures of AR computed on these data were correlated with various measures of short-term, long-term, and delayed recall of the words. All of the AR measures correlated significantly with the short-term and long-term memory measures. The confidence measure was the best predictor of memory, and the number of association rules generated was the best predictor of learning. The confidence measure can be used as a clinical tool to assess SO with individual ABI survivors.
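The confidence of an association rule A -> B is the conditional proportion of trials containing B among those containing A. A toy sketch over hypothetical recall trials (the word lists and data are placeholders, not the study's stimuli):

```python
def confidence(trials, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent over a set of
    recall trials: P(consequent recalled | antecedent recalled)."""
    with_a = [trial for trial in trials if antecedent in trial]
    if not with_a:
        return 0.0
    return sum(consequent in trial for trial in with_a) / len(with_a)

# Placeholder recall data: each set holds the words a patient produced
# on one test trial.
trials = [
    {"apple", "chair", "river"},
    {"apple", "chair"},
    {"chair", "river", "cloud"},
    {"apple", "chair", "cloud"},
]
print(confidence(trials, "apple", "chair"))  # 1.0: 'chair' recalled whenever 'apple' is
```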
Application of Bayesian and cost benefit risk analysis in water resources management
NASA Astrophysics Data System (ADS)
Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.
2016-03-01
Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not yet been considered for the water resources management of a watershed. A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision making on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined applies a scaled parabolic fine that varies with over-pumping violations, in contrast to common practices that usually impose short-term fines. The methodological steps are presented analytically, together with originally developed code. An application of this kind, at this level of detail, is new. The results indicate that the probability uncertainty is the driving issue that determines the optimal decision with each methodology, and depending on how the unknown probability is handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause inside an audit interval. In contrast to practices that assess the effect of each proposed action separately considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.
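A minimal sketch of the prior-predictive expected-cost comparison that underlies this kind of Bayesian decision analysis: a prior over the unknown violation probability is combined with a binomial sampling distribution of violating audits and a parabolic fine schedule, then compared against the reservoir construction cost. All numbers, the fine schedule, and the prior are illustrative assumptions, not the paper's values.

```python
import numpy as np
from scipy.stats import binom

# Illustrative numbers only (not the paper's values).
p_candidates = np.array([0.2, 0.5, 0.8])   # possible over-pumping violation rates
prior = np.array([0.3, 0.5, 0.2])          # prior belief over those rates

RESERVOIR_COST = 2.0e6                     # build the reservoir: no fines expected
N_AUDITS = 20                              # audits in the planning horizon

def parabolic_fine(k: int) -> float:
    """Scaled parabolic fine growing with the number of violating audits."""
    return 5.0e3 * k ** 2

def expected_fines() -> float:
    """Prior-predictive expected total fine if no reservoir is built,
    averaging the binomial sampling distribution over the prior."""
    k = np.arange(N_AUDITS + 1)
    fines = np.array([parabolic_fine(int(x)) for x in k])
    total = 0.0
    for p, w in zip(p_candidates, prior):
        total += w * np.sum(binom.pmf(k, N_AUDITS, p) * fines)
    return total

build, no_build = RESERVOIR_COST, expected_fines()
print("build reservoir" if build < no_build else "do not build",
      f"(build: {build:.0f}, expected fines: {no_build:.0f})")
```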
Human and Robotic Mission to Small Bodies: Mapping, Planning and Exploration
NASA Technical Reports Server (NTRS)
Neffian, Ara V.; Bellerose, Julie; Beyer, Ross A.; Archinal, Brent; Edwards, Laurence; Lee, Pascal; Colaprete, Anthony; Fong, Terry
2013-01-01
This study investigates the requirements, performs a gap analysis and makes a set of recommendations for mapping products and exploration tools required to support operations and scientific discovery for near-term and future NASA missions to small bodies. The mapping products and their requirements are based on the analysis of current mission scenarios (rendezvous, docking, and sample return) and recommendations made by the NEA Users Team (NUT) in the framework of human exploration. The mapping products that satisfy operational, scientific, and public outreach goals include topography, images, albedo, gravity, mass, density, subsurface radar, mineralogical and thermal maps. The gap analysis points to a need for incremental generation of mapping products from low (flyby) to high-resolution data needed for anchoring and docking, real-time spatial data processing for hazard avoidance and astronaut or robot localization in low gravity, high dynamic environments, and motivates a standard for coordinate reference systems capable of describing irregular body shapes. Another aspect investigated in this study is the set of requirements and the gap analysis for exploration tools that support visualization and simulation of operational conditions including soil interactions, environment dynamics, and communications coverage. Building robust, usable data sets and visualization/simulation tools is the best way for mission designers and simulators to make correct decisions for future missions. In the near term, it is the most useful way to begin building capabilities for small body exploration without needing to commit to specific mission architectures.
High Frequency Scattering Code in a Distributed Processing Environment
1991-06-01
... use of automated analysis tools is indicated. One tool developed by Pacific-Sierra Research Corporation and marketed by Intel Corporation for ... XQ: EXECUTE CODE; EN: END CODE. This input deck differs from that in the manual because the "PP" option is disabled in the modified code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Jerden, James
2016-02-01
The overall objective of the SFR Regulatory Technology Development Plan (RTDP) effort is to identify and address potential impediments to the SFR regulatory licensing process. In FY14, an analysis by Argonne identified the development of an SFR-specific MST methodology as an existing licensing gap with high regulatory importance and a potentially long lead-time to closure. This work was followed by an initial examination of the current state of knowledge regarding SFR source term development (ANL-ART-3), which reported several potential gaps. Among these were the potential inadequacies of current computational tools to properly model and assess the transport and retention of radionuclides during a metal fuel pool-type SFR core damage incident. The objective of the current work is to determine the adequacy of existing computational tools, and the associated knowledge database, for the calculation of an SFR MST. To accomplish this task, a trial MST calculation will be performed using available computational tools to establish their limitations with regard to relevant radionuclide release/retention/transport phenomena. The application of existing modeling tools will provide a definitive test to assess their suitability for an SFR MST calculation, while also identifying potential gaps in the current knowledge base and providing insight into open issues regarding regulatory criteria/requirements. The findings of this analysis will assist in determining future research and development needs.
Analysis of the Relevance of Posts in Asynchronous Discussions
ERIC Educational Resources Information Center
Azevedo, Breno T.; Reategui, Eliseo; Behar, Patrícia A.
2014-01-01
This paper presents ForumMiner, a tool for the automatic analysis of students' posts in asynchronous discussions. ForumMiner uses a text mining system to extract graphs from texts that are given to students as a basis for their discussion. These graphs contain the most relevant terms found in the texts, as well as the relationships between them.…
Control Theoretic Modeling for Uncertain Cultural Attitudes and Unknown Adversarial Intent
2009-02-01
Constructive computational tools. Subject terms: social learning, social networks, multiagent systems, game theory. ... over-reactionary behaviors; 3) analysis of rational social learning in networks: analysis of belief propagation in social networks in various ... general methodology as a predictive device for social network formation and for communication network formation with constraints on the lengths of ...
Source-term development for a contaminant plume for use by multimedia risk assessment models
NASA Astrophysics Data System (ADS)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.
2000-02-01
Multimedia modelers from the US Environmental Protection Agency (EPA) and US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools that are used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world, 90Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. Source characteristics and a release mechanism are developed and described; also described is a typical process and procedure that an analyst would follow in developing a source term for using this class of analytical tool in a preliminary assessment.
Using Global Climate Data to Inform Long-Term Water Planning Decisions
NASA Astrophysics Data System (ADS)
Groves, D. G.; Lempert, R.
2008-12-01
Water managers throughout the world are working to consider climate change in their long-term water planning processes. The best available information regarding plausible future hydrologic conditions is largely derived from global circulation models and from paleoclimate data. To date, there is no single approach for (1) utilizing these data in water management planning tools for analysis and (2) evaluating the myriad of possible adaptation options. This talk will describe several approaches being used at RAND to incorporate global projections of climate change into local, regional, and state-wide long-term water planning. It will draw on current work with the California Department of Water Resources and other local Western water agencies, and a recently completed project with the Inland Empire Utilities Agency. Work to date suggests that climate information can be assimilated into local water planning tools to help identify robust climate adaptation water management strategies.
NASA Technical Reports Server (NTRS)
Waters, Eric D.
2013-01-01
Recent high-level interest in the capability of small launch vehicles has placed significant demand on determining the trade space these vehicles occupy. This has led to the development of a zero-level analysis tool that can quickly determine the minimum expected vehicle gross liftoff weight (GLOW) in terms of vehicle stage specific impulse (Isp) and propellant mass fraction (pmf) for any given payload value. Drawing on extensive Earth-to-orbit trajectory experience, the total delta-v the vehicle must achieve can be estimated, including relevant loss terms. This foresight into expected losses allows for more specific assumptions relating to the initial estimates of thrust-to-weight values for each stage. The tool was further validated against a trajectory model, in this case the Program to Optimize Simulated Trajectories (POST), to determine whether the initial sizing delta-v was adequate to meet payload expectations. Presented here is a description of how the tool is set up and the approach the analyst must take when using the tool. Also presented are the expected outputs, which depend on the type of small launch vehicle being sized. The method of validation is discussed, as well as where the sizing tool fits into the vehicle design process.
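The zero-level sizing idea can be sketched from the rocket equation: for each stage, the required mass ratio exp(delta-v / (g0 * Isp)) together with the stage pmf fixes the stage mass needed to push the payload plus the upper stack. The sketch below is an illustrative reconstruction under those assumptions, not the NASA tool itself; the stage delta-v split, Isp, and pmf values are placeholders.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def stage_mass(payload: float, dv: float, isp: float, pmf: float) -> float:
    """Stage (propellant + dry) mass needed to give `payload` a velocity
    change `dv`, from the rocket equation and propellant mass fraction."""
    mr = math.exp(dv / (G0 * isp))          # required mass ratio
    denom = 1.0 - mr * (1.0 - pmf)
    if denom <= 0.0:
        raise ValueError("requested dv not achievable with this Isp/pmf")
    return payload * (mr - 1.0) / denom

def glow(payload: float, stages) -> float:
    """Gross liftoff mass for a stack described upper-stage-first as a
    list of (dv, Isp, pmf) tuples; values used below are illustrative."""
    mass = payload
    for dv, isp, pmf in stages:
        mass += stage_mass(mass, dv, isp, pmf)
    return mass

# Example: 150 kg payload, two stages, ~9.3 km/s ideal dv including losses.
print(f"GLOW ~ {glow(150.0, [(4500.0, 340.0, 0.88), (4800.0, 290.0, 0.90)]):.0f} kg")
```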
Open access for ALICE analysis based on virtualization technology
NASA Astrophysics Data System (ADS)
Buncic, P.; Gheata, M.; Schutz, Y.
2015-12-01
Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.
Quality assurance software inspections at NASA Ames: Metrics for feedback and modification
NASA Technical Reports Server (NTRS)
Wenneson, G.
1985-01-01
Software inspections, a set of formal technical review procedures held at selected key points during software development in order to find defects in software documents, are described in terms of history, participants, tools, procedures, statistics, and database analysis.
Economic, health, and environmental impacts of cyanobacteria and associated harmful algal blooms are increasingly recognized by policymakers, managers, and scientific researchers. However, spatially-distributed, long-term data on cyanobacteria blooms are largely unavailable. The ...
Detection of mental stress due to oral academic examination via ultra-short-term HRV analysis.
Castaldo, R; Xu, W; Melillo, P; Pecchia, L; Santamaria, L; James, C
2016-08-01
Mental stress may cause cognitive dysfunction, cardiovascular disorders and depression. Mental stress detection via short-term Heart Rate Variability (HRV) analysis has been widely explored in recent years, while ultra-short-term (less than 5 minutes) HRV analysis has not. This study aims to detect mental stress using linear and non-linear HRV features extracted from 3-minute ECG excerpts recorded from 42 university students during an oral examination (stress) and at rest after a vacation. HRV features were then extracted and analyzed according to the literature using validated software tools. Statistical and data mining analyses were then performed on the extracted HRV features. The best performing machine learning method was the C4.5 tree algorithm, which discriminated between stress and rest with a sensitivity, specificity and accuracy of 78%, 80% and 79%, respectively.
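A hedged sketch of the kind of pipeline described: a few standard time-domain HRV features computed from RR intervals are fed to a decision tree classifier. scikit-learn's tree (CART) is used here as a stand-in for C4.5, and the RR-interval data are synthetic placeholders rather than the study's recordings.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

def hrv_features(rr_ms: np.ndarray) -> list:
    """Standard time-domain HRV features from an RR-interval series (ms)
    taken from roughly a 3-minute ECG excerpt."""
    diff = np.diff(rr_ms)
    mean_rr = rr_ms.mean()
    sdnn = rr_ms.std(ddof=1)                     # overall variability
    rmssd = np.sqrt(np.mean(diff ** 2))          # short-term variability
    pnn50 = np.mean(np.abs(diff) > 50) * 100.0   # % successive diffs > 50 ms
    return [mean_rr, sdnn, rmssd, pnn50]

# Placeholder dataset: label 0 = rest, 1 = stress (synthetic RR series).
rng = np.random.default_rng(0)
rest = [hrv_features(800 + 60 * rng.standard_normal(180)) for _ in range(20)]
stress = [hrv_features(650 + 35 * rng.standard_normal(180)) for _ in range(20)]
X = np.array(rest + stress)
y = np.array([0] * 20 + [1] * 20)

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```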
The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis
NASA Technical Reports Server (NTRS)
Burks, Jason Eric; Sperow, Ken
2015-01-01
A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. The tool aids forecasters in severe weather situations by providing valuable satellite and radar derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides a valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram Tool during the OPG assessment confirmed that it will be a valuable asset to the operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.
Hoogendam, Arjen; Stalenhoef, Anton FH; Robbé, Pieter F de Vries; Overbeke, A John PM
2008-01-01
Background The use of PubMed to answer daily medical care questions is limited because it is challenging to retrieve a small set of relevant articles and time is restricted. Knowing what aspects of queries are likely to retrieve relevant articles can increase the effectiveness of PubMed searches. The objectives of our study were to identify queries that are likely to retrieve relevant articles by relating PubMed search techniques and tools to the number of articles retrieved and the selection of articles for further reading. Methods This was a prospective observational study of queries regarding patient-related problems sent to PubMed by residents and internists in internal medicine working in an Academic Medical Centre. We analyzed queries, search results, query tools (Mesh, Limits, wildcards, operators), selection of abstract and full-text for further reading, using a portal that mimics PubMed. Results PubMed was used to solve 1121 patient-related problems, resulting in 3205 distinct queries. Abstracts were viewed in 999 (31%) of these queries, and in 126 (39%) of 321 queries using query tools. The average term count per query was 2.5. Abstracts were selected in more than 40% of queries using four or five terms, increasing to 63% if the use of four or five terms yielded 2–161 articles. Conclusion Queries sent to PubMed by physicians at our hospital during daily medical care contain fewer than three terms. Queries using four to five terms, retrieving less than 161 article titles, are most likely to result in abstract viewing. PubMed search tools are used infrequently by our population and are less effective than the use of four or five terms. Methods to facilitate the formulation of precise queries, using more relevant terms, should be the focus of education and research. PMID:18816391
Chang, Le; Baseggio, Oscar; Sementa, Luca; Cheng, Daojian; Fronzoni, Giovanna; Toffoli, Daniele; Aprà, Edoardo; Stener, Mauro; Fortunelli, Alessandro
2018-06-13
We introduce Individual Component Maps of Rotatory Strength (ICM-RS) and Rotatory Strength Density (RSD) plots as analysis tools of chiro-optical linear response spectra deriving from time-dependent density functional theory (TDDFT) simulations. ICM-RS and RSD allow one to visualize the origin of chiro-optical response in momentum or real space, including signed contributions and therefore highlighting cancellation terms that are ubiquitous in chirality phenomena, and should be especially useful in analyzing the spectra of complex systems. As test cases, we use ICM-RS and RSD to analyze circular dichroism spectra of selected (Ag-Au)30(SR)18 monolayer-protected metal nanoclusters, showing the potential of the proposed tools to derive insight and understanding, and eventually rational design, in chiro-optical studies of complex systems.
Information transfer satellite concept study. Volume 2: Technical
NASA Technical Reports Server (NTRS)
Bergin, P.; Kincade, C.; Kurpiewski, D.; Leinhaupel, F.; Millican, F.; Onstad, R.
1971-01-01
The ITS concept study is preceded by two requirements studies whose primary objectives are to identify viable demands and to develop the functional requirements associated with these demands. In addition to continuing this basic activity, the ITS concept study objectives are to (1) develop tools and techniques for planning advanced information transfer satellite communications systems, and (2) select viable systems for further analysis in both their near-term and far-term aspects.
Robust retention and transfer of tool construction techniques in chimpanzees (Pan troglodytes).
Vale, Gill L; Flynn, Emma G; Pender, Lydia; Price, Elizabeth; Whiten, Andrew; Lambeth, Susan P; Schapiro, Steven J; Kendal, Rachel L
2016-02-01
Long-term memory can be critical to a species' survival in environments with seasonal and even longer-term cycles of resource availability. The present, longitudinal study investigated whether complex tool behaviors used to gain an out-of-reach reward, following a hiatus of about 3 years and 7 months since initial experiences with a tool use task, were retained and subsequently executed more quickly by experienced than by naïve chimpanzees. Ten of the 11 retested chimpanzees displayed impressive long-term procedural memory, creating elongated tools using the same methods employed years previously, either combining 2 tools or extending a single tool. The complex tool behaviors were also transferred to a different task context, showing behavioral flexibility. This represents some of the first evidence for appreciable long-term procedural memory, and improvements in the utility of complex tool manufacture in chimpanzees. Such long-term procedural memory and behavioral flexibility have important implications for the longevity and transmission of behavioral traditions. (c) 2016 APA, all rights reserved.
Robust Retention and Transfer of Tool Construction Techniques in Chimpanzees (Pan troglodytes)
Vale, Gill L.; Flynn, Emma G.; Pender, Lydia; Price, Elizabeth; Whiten, Andrew; Lambeth, Susan P.; Schapiro, Steven J.; Kendal, Rachel L.
2016-01-01
Long-term memory can be critical to a species’ survival in environments with seasonal and even longer-term cycles of resource availability. The present, longitudinal study investigated whether complex tool behaviors used to gain an out-of-reach reward, following a hiatus of about 3 years and 7 months since initial experiences with a tool use task, were retained and subsequently executed more quickly by experienced than by naïve chimpanzees. Ten of the 11 retested chimpanzees displayed impressive long-term procedural memory, creating elongated tools using the same methods employed years previously, either combining 2 tools or extending a single tool. The complex tool behaviors were also transferred to a different task context, showing behavioral flexibility. This represents some of the first evidence for appreciable long-term procedural memory, and improvements in the utility of complex tool manufacture in chimpanzees. Such long-term procedural memory and behavioral flexibility have important implications for the longevity and transmission of behavioral traditions. PMID:26881941
Selection by consequences, behavioral evolution, and the price equation.
Baum, William M
2017-05-01
Price's equation describes evolution across time in simple mathematical terms. Although it is not a theory, but a derived identity, it is useful as an analytical tool. It affords lucid descriptions of genetic evolution, cultural evolution, and behavioral evolution (often called "selection by consequences") at different levels (e.g., individual vs. group) and at different time scales (local and extended). The importance of the Price equation for behavior analysis lies in its ability to precisely restate selection by consequences, thereby restating, or even replacing, the law of effect. Beyond this, the equation may be useful whenever one regards ontogenetic behavioral change as evolutionary change, because it describes evolutionary change in abstract, general terms. As an analytical tool, the behavioral Price equation is an excellent aid in understanding how behavior changes within organisms' lifetimes. For example, it illuminates evolution of response rate, analyses of choice in concurrent schedules, negative contingencies, and dilemmas of self-control. © 2017 Society for the Experimental Analysis of Behavior.
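For reference, the Price equation referred to above is commonly written in the following standard form (notation assumed here: z_i is the trait value and w_i the fitness of the i-th entity, with population means \bar{z} and \bar{w}); the first term captures selection, the second transmission bias:

\[
\Delta \bar{z} \;=\; \frac{\operatorname{Cov}(w_i, z_i)}{\bar{w}} \;+\; \frac{\operatorname{E}\!\left(w_i\,\Delta z_i\right)}{\bar{w}}
\]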
Web-based applications for building, managing and analysing kinetic models of biological systems.
Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A
2009-01-01
Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.
Design and Analysis Tool for External-Compression Supersonic Inlets
NASA Technical Reports Server (NTRS)
Slater, John W.
2012-01-01
A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids useable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies and CFD analyses were performed to verify some of the analysis results.
Chang, Xing; Zhou, Xin; Luo, Linzhi; Yang, Chengjia; Pan, Hui; Zhang, Shuyang
2017-09-12
This study aimed to identify hotspots in research on clinical competence measurements from 2012 to 2016. The authors retrieved literature published between 2012 and 2016 from PubMed using selected medical subject headings (MeSH) terms. They used BibExcel software to generate high-frequency MeSH terms and identified hotspots by co-word analysis and cluster analysis. The authors retrieved 588 related articles and identified 31 high-frequency MeSH terms. In addition, they obtained 6 groups of high-frequency MeSH terms that reflected the domain hotspots. This study identified 6 hotspots of domain research, including studies on influencing factors and perception evaluation, improving and developing measurement tools, feedback measurement, measurement approaches based on computer simulation, the measurement of specific students in different learning phases, and the measurement of students' communication ability. All of these research topics could provide useful information for educators and researchers to conduct further in-depth studies.
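The sketch below illustrates the general shape of a co-word analysis of MeSH terms like the one described: count pairwise co-occurrences of terms across articles and cluster the high-frequency terms by co-occurrence similarity. The term lists, the frequency cutoff, and the clustering settings are illustrative placeholders, not the study's data or BibExcel's procedure.

    # Sketch of co-word analysis: build a MeSH term co-occurrence matrix and
    # cluster high-frequency terms. Toy data only.
    from collections import Counter
    from itertools import combinations

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    articles = [
        ["Clinical Competence", "Educational Measurement", "Students, Medical"],
        ["Clinical Competence", "Feedback", "Computer Simulation"],
        ["Educational Measurement", "Feedback", "Communication"],
    ]

    term_counts = Counter(t for terms in articles for t in set(terms))
    vocab = sorted(t for t, c in term_counts.items() if c >= 2)  # "high-frequency" cutoff
    index = {t: i for i, t in enumerate(vocab)}

    co = np.zeros((len(vocab), len(vocab)))
    for terms in articles:
        for a, b in combinations(sorted(set(terms) & set(vocab)), 2):
            co[index[a], index[b]] += 1
            co[index[b], index[a]] += 1

    # Cluster terms: convert co-occurrence counts to a distance and cut the tree.
    dist = 1.0 / (1.0 + co)
    np.fill_diagonal(dist, 0.0)
    condensed = dist[np.triu_indices(len(vocab), k=1)]
    labels = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
    print(dict(zip(vocab, labels)))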
Development of a site analysis tool for distributed wind projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaw, Shawn
The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.
Alam, Zaid; Peddinti, Gopal
2017-01-01
The advent of the polypharmacology paradigm in drug discovery calls for novel chemoinformatic tools for analyzing compounds' multi-targeting activities. Such tools should provide an intuitive representation of the chemical space through capturing and visualizing underlying patterns of compound similarities linked to their polypharmacological effects. Most of the existing compound-centric chemoinformatics tools lack interactive options and user interfaces that are critical for the real-time needs of chemical biologists carrying out compound screening experiments. Toward that end, we introduce C-SPADE, an open-source exploratory web-tool for interactive analysis and visualization of drug profiling assays (biochemical, cell-based or cell-free) using compound-centric similarity clustering. C-SPADE allows the users to visually map the chemical diversity of a screening panel, explore investigational compounds in terms of their similarity to the screening panel, perform polypharmacological analyses and guide drug-target interaction predictions. C-SPADE requires only the raw drug profiling data as input, and it automatically retrieves the structural information and constructs the compound clusters in real-time, thereby reducing the time required for manual analysis in drug development or repurposing applications. The web-tool provides a customizable visual workspace that can either be downloaded as a figure or Newick tree file or shared as a hyperlink with other users. C-SPADE is freely available at http://cspade.fimm.fi/. PMID:28472495
Global Alignment of Pairwise Protein Interaction Networks for Maximal Common Conserved Patterns
Tian, Wenhong; Samatova, Nagiza F.
2013-01-01
A number of tools for the alignment of protein-protein interaction (PPI) networks have laid the foundation for PPI network analysis. Most alignment tools focus on finding conserved interaction regions across the PPI networks through either local or global mapping of similar sequences. Researchers are still trying to improve the speed, scalability, and accuracy of network alignment. In view of this, we introduce a connected-components-based fast algorithm, HopeMap, for network alignment. Observing that the number of true orthologs across species is small compared to the total number of proteins in all species, we take a different approach based on a precompiled list of homologs identified by KO terms. Applying this approach to S. cerevisiae (yeast) and D. melanogaster (fly), E. coli K12 and S. typhimurium, and E. coli K12 and C. crescentus, we analyze all clusters identified in the alignment. The results are evaluated through up-to-date known gene annotations, gene ontology (GO), and KEGG ortholog groups (KO). Compared to existing tools, our approach is fast with linear computational cost, highly accurate in terms of KO and GO term specificity and sensitivity, and can be extended to multiple alignments easily.
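The following sketch illustrates the general idea of a homolog-anchored, connected-components style of network alignment; it is not the HopeMap implementation. Proteins sharing a KO group across two species form candidate pairs, two pairs are linked when both members interact in their respective PPI networks, and conserved clusters are read off the connected components.

    # Sketch of a homolog-anchored, connected-components style network alignment
    # (illustrative only). Toy PPI networks and KO assignments.
    from itertools import combinations

    import networkx as nx

    ppi_a = nx.Graph([("a1", "a2"), ("a2", "a3"), ("a3", "a4")])
    ppi_b = nx.Graph([("b1", "b2"), ("b2", "b3"), ("b4", "b5")])

    # KO term -> (protein in species A, protein in species B); toy data.
    ko_pairs = {"K001": ("a1", "b1"), "K002": ("a2", "b2"),
                "K003": ("a3", "b3"), "K004": ("a4", "b4")}

    match_graph = nx.Graph()
    match_graph.add_nodes_from(ko_pairs.values())
    for (u1, v1), (u2, v2) in combinations(ko_pairs.values(), 2):
        # Link two homolog pairs when both sides interact in their own network.
        if ppi_a.has_edge(u1, u2) and ppi_b.has_edge(v1, v2):
            match_graph.add_edge((u1, v1), (u2, v2))

    clusters = [c for c in nx.connected_components(match_graph) if len(c) > 1]
    print(clusters)  # one conserved cluster: {('a1','b1'), ('a2','b2'), ('a3','b3')}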
Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P
2016-10-01
An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates available at present are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with an adequate accuracy. This work proposes a complete methodology to estimate dust source terms starting from a broad information gathering. The large number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for dust source term production prediction for future devices is presented.
Generic trending and analysis system
NASA Technical Reports Server (NTRS)
Keehan, Lori; Reese, Jay
1994-01-01
The Generic Trending and Analysis System (GTAS) is a generic spacecraft performance monitoring tool developed by NASA Code 511 and Loral Aerosys. It is designed to facilitate quick anomaly resolution and trend analysis. Traditionally, the job of off-line analysis has been performed using hardware and software systems developed for real-time spacecraft contacts; then, the systems were supplemented with a collection of tools developed by Flight Operations Team (FOT) members. Since the number of upcoming missions is increasing, NASA can no longer afford to operate in this manner. GTAS improves control center productivity and effectiveness because it provides a generic solution across multiple missions. Thus, GTAS eliminates the need for each individual mission to develop duplicate capabilities. It also allows for more sophisticated tools to be developed because it draws resources from several projects. In addition, the GTAS software system incorporates commercial off-the-shelf (COTS) software packages and reuses components of other NASA-developed systems wherever possible. GTAS has incorporated lessons learned from previous missions by involving the users early in the development process. GTAS users took a proactive role in requirements analysis, design, development, and testing. Because of user involvement, several special tools were designed and are now being developed. GTAS users expressed considerable interest in facilitating data collection for long-term trending and analysis. As a result, GTAS provides easy access to large volumes of processed telemetry data directly in the control center. The GTAS archival and retrieval capabilities are supported by the integration of optical disk technology and a COTS relational database management system.
Bavuso, Karen; Bouyer-Ferullo, Sharon; Goldsmith, Denise; Fairbanks, Amanda; Gesner, Emily; Lagor, Charles; Collins, Sarah
2016-01-01
Objectives: To understand requests for nursing Clinical Decision Support (CDS) interventions at a large integrated health system undergoing vendor-based EHR implementation. In addition, to establish a process to guide both short-term implementation and long-term strategic goals to meet nursing CDS needs. Materials and Methods: We conducted an environmental scan to understand the current state of nursing CDS over three months. The environmental scan consisted of a literature review and an analysis of CDS requests received from across our health system. We identified existing high-priority CDS and paper-based tools used in nursing practice at our health system that guide decision-making. Results: A total of 46 nursing CDS requests were received. Fifty-six percent (n=26) were specific to a clinical specialty; 22 percent (n=10) were focused on facilitating clinical consults in the inpatient setting. “Risk Assessments/Risk Reduction/Promotion of Healthy Habits” (n=23) was the most requested High Priority Category received for nursing CDS. A continuum of types of nursing CDS needs emerged using the Data-Information-Knowledge-Wisdom Conceptual Framework: 1) facilitating data capture, 2) meeting information needs, 3) guiding knowledge-based decision making, and 4) exposing analytics for wisdom-based clinical interpretation by the nurse. Conclusion: Identifying and prioritizing paper-based tools that can be modified into electronic CDS is a challenge. CDS strategy is an evolving process that relies on close collaboration and engagement with clinical sites for short-term implementation and should be incorporated into a long-term strategic plan that can be optimized and achieved over time. The Data-Information-Knowledge-Wisdom Conceptual Framework, in conjunction with the High Priority Categories established, may be a useful tool to guide a strategic approach for meeting short-term nursing CDS needs and aligning with the organizational strategic plan. PMID:27437036
Open Science and the Monitoring of Aquatic Ecosystems
Open science represents both a philosophy and a set of tools that can be leveraged for more effective scientific analysis. At the core of the open science movement is the concept that research should be reproducible and transparent, in addition to having long-term provenance thro...
Geospatial Technologies and Higher Education in Argentina
ERIC Educational Resources Information Center
Leguizamon, Saturnino
2010-01-01
The term "geospatial technologies" encompasses a large area of fields involving cartography, spatial analysis, geographic information system, remote sensing, global positioning systems and many others. These technologies should be expected to be available (as "natural tools") for a country with a large surface and a variety of…
Reward, Punishment, and Cooperation: A Meta-Analysis
ERIC Educational Resources Information Center
Balliet, Daniel; Mulder, Laetitia B.; Van Lange, Paul A. M.
2011-01-01
How effective are rewards (for cooperation) and punishment (for noncooperation) as tools to promote cooperation in social dilemmas or situations when immediate self-interest and longer term collective interest conflict? What variables can promote the impact of these incentives? Although such questions have been examined, social and behavioral…
NASA Astrophysics Data System (ADS)
Jupri, Al; Drijvers, Paul; van den Heuvel-Panhuizen, Marja
2016-02-01
The use of digital tools in algebra education is expected not only to contribute to skill mastery, but also to the acquisition of conceptual understanding. The question is how digital tools affect students' thinking and understanding. This paper presents an analysis of data from one group of three seventh-grade students (12-13 years old) on the use of a digital tool for algebra, in particular the Cover-up applet for solving equations. This case study was part of a larger teaching experiment on initial algebra enriched with digital technology which aimed to improve students' conceptual understanding and skills in solving equations in one variable. The qualitative analysis of a video observation and of digital and written work showed that the use of the applet affects student thinking in terms of the strategies students use while dealing with the equations. We conclude that the effects of the use of the digital tool can be traced in students' problem-solving strategies in a paper-and-pencil environment, which are similar to the strategies used while working with the digital tool. In future research, we recommend using specific theoretical lenses, such as the theory of instrumental genesis and the onto-semiotic approach, to reveal more explicit relationships between students' conceptual understanding and the use of a digital tool.
Stevens, John R; Jones, Todd R; Lefevre, Michael; Ganesan, Balasubramanian; Weimer, Bart C
2017-01-01
Microbial community analysis experiments to assess the effect of a treatment intervention (or environmental change) on the relative abundance levels of multiple related microbial species (or operational taxonomic units) simultaneously using high-throughput genomics are becoming increasingly common. Within the framework of the evolutionary phylogeny of all species considered in the experiment, this translates to a statistical need to identify the phylogenetic branches that exhibit a significant consensus response (in terms of operational taxonomic unit abundance) to the intervention. We present the R software package SigTree, a collection of flexible tools that make use of meta-analysis methods and regular expressions to identify and visualize significantly responsive branches in a phylogenetic tree, while appropriately adjusting for multiple comparisons.
Morrison, James; Kaufman, John
2016-12-01
Vascular access is invaluable in the treatment of hospitalized patients. Central venous catheters provide a durable and long-term solution while saving patients from repeated needle sticks for peripheral IVs and blood draws. The initial catheter placement procedure and long-term catheter usage place patients at risk for infection. The goal of this project was to develop a system to track and evaluate central line-associated blood stream infections related to interventional radiology placement of central venous catheters. A customized web-based clinical database was developed via open-source tools to provide a dashboard for data mining and analysis of the catheter placement and infection information. Preliminary results were gathered over a 4-month period confirming the utility of the system. The tools and methodology employed to develop the vascular access tracking system could be easily tailored to other clinical scenarios to assist in quality control and improvement programs.
Astrophysics and Big Data: Challenges, Methods, and Tools
NASA Astrophysics Data System (ADS)
Garofalo, Mauro; Botta, Alessio; Ventre, Giorgio
2017-06-01
Nowadays there is no field of research that is not flooded with data. Among the sciences, astrophysics has always been driven by the analysis of massive amounts of data. The development of new and more sophisticated observation facilities, both ground-based and spaceborne, has made data more and more complex (Variety) and has driven an exponential growth in both data Volume (i.e., on the order of petabytes) and Velocity in terms of production and transmission. Therefore, new and advanced processing solutions will be needed to process this huge amount of data. We investigate some of these solutions, based on machine learning models as well as tools and architectures for Big Data analysis that can be exploited in the astrophysical context.
Development of a software tool to support chemical and biological terrorism intelligence analysis
NASA Astrophysics Data System (ADS)
Hunt, Allen R.; Foreman, William
1997-01-01
AKELA has developed a software tool which uses a systems analytic approach to model the critical processes which support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process, which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these model components.
eSACP - a new Nordic initiative towards developing statistical climate services
NASA Astrophysics Data System (ADS)
Thorarinsdottir, Thordis; Thejll, Peter; Drews, Martin; Guttorp, Peter; Venälainen, Ari; Uotila, Petteri; Benestad, Rasmus; Mesquita, Michel d. S.; Madsen, Henrik; Fox Maule, Cathrine
2015-04-01
The Nordic research council NordForsk has recently announced its support for a new 3-year research initiative on "statistical analysis of climate projections" (eSACP). eSACP will focus on developing e-science tools and services based on statistical analysis of climate projections for the purpose of helping decision-makers and planners in the face of expected future challenges in regional climate change. The motivation behind the project is the growing recognition in our society that forecasts of future climate change are associated with various sources of uncertainty, and that any long-term planning and decision-making dependent on a changing climate must account for this. At the same time there is an obvious gap between scientists from different fields and between practitioners in terms of understanding how climate information relates to different parts of the "uncertainty cascade". In eSACP we will develop generic e-science tools and statistical climate services to facilitate the use of climate projections by decision-makers and scientists from all fields for climate impact analyses and for the development of robust adaptation strategies, which properly (in a statistical sense) account for the inherent uncertainty. The new tool will be publicly available and will include functionality to utilize the extensive and dynamically growing repositories of data, state-of-the-art statistical techniques to quantify the uncertainty, and innovative approaches to visualize the results. Such a tool will not only be valuable for future assessments and underpin the development of dedicated climate services, but will also assist the scientific community in making its case on the consequences of our changing climate more clearly to policy makers and the general public. The eSACP project is led by Thordis Thorarinsdottir, Norwegian Computing Center, and also includes the Finnish Meteorological Institute, the Norwegian Meteorological Institute, the Technical University of Denmark and the Bjerknes Centre for Climate Research, Norway. This poster will present details of focus areas in the project and show some examples of the expected analysis tools.
Time-Series Analysis: A Cautionary Tale
NASA Technical Reports Server (NTRS)
Damadeo, Robert
2015-01-01
Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.
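As a point of reference for the kind of trend derivation discussed above, the sketch below fits a linear trend plus one annual harmonic to an irregularly sampled synthetic series by least squares; it is purely illustrative and does not include the sampling-bias corrections that are the subject of the paper.

    # Minimal regression-based trend estimate for an irregularly sampled series:
    # linear trend plus one annual harmonic, fitted by ordinary least squares.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0.0, 20.0, size=300))          # years, non-uniform sampling
    y = 0.02 * t + 0.5 * np.sin(2 * np.pi * t) + rng.normal(scale=0.3, size=t.size)

    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("trend per year:", coef[1])   # close to 0.02 for this synthetic series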
Orbiter Boundary Layer Transition Prediction Tool Enhancements
NASA Technical Reports Server (NTRS)
Berry, Scott A.; King, Rudolph A.; Kegerise, Michael A.; Wood, William A.; McGinley, Catherine B.; Berger, Karen T.; Anderson, Brian P.
2010-01-01
Updates to an analytic tool developed for Shuttle support to predict the onset of boundary layer transition resulting from thermal protection system damage or repair are presented. The boundary layer transition tool is part of a suite of tools that analyze the local aerothermodynamic environment to enable informed disposition of damage for making recommendations to fly as is or to repair. Using mission-specific trajectory information and details of each damage site or repair, the expected time (and thus Mach number) of transition onset is predicted to help define proper environments for use in subsequent thermal and stress analysis of the thermal protection system and structure. The boundary layer transition criteria utilized within the tool were updated based on new local boundary layer properties obtained from high-fidelity computational solutions. Also, new ground-based measurements were obtained to allow for a wider parametric variation with both protuberances and cavities, and then the resulting correlations were calibrated against updated flight data. The end result is to provide correlations that allow increased confidence in the resulting transition predictions. Recently, a new approach was adopted to remove conservatism in terms of sustained turbulence along the wing leading edge. Finally, some of the newer flight data are also discussed in terms of how these results reflect back on the updated correlations.
STOP using just GO: a multi-ontology hypothesis generation tool for high throughput experimentation
2013-01-01
Background: Gene Ontology (GO) enrichment analysis remains one of the most common methods for hypothesis generation from high-throughput datasets. However, we believe that researchers strive to test other hypotheses that fall outside of GO. Here, we developed and evaluated a tool for hypothesis generation from gene or protein lists using ontological concepts present in manually curated text that describes those genes and proteins. Results: As a consequence, we have developed the method Statistical Tracking of Ontological Phrases (STOP), which expands the realm of testable hypotheses in gene set enrichment analyses by integrating automated annotations of genes to terms from over 200 biomedical ontologies. While not as precise as manually curated terms, we find that the additional enriched concepts have value when coupled with traditional enrichment analyses using curated terms. Conclusion: Multiple ontologies have been developed for gene and protein annotation; by using a dataset of both manually curated GO terms and automatically recognized concepts from curated text, we can expand the realm of hypotheses that can be discovered. The web application STOP is available at http://mooneygroup.org/stop/. PMID:23409969
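The abstract does not spell out the enrichment statistic itself, so the sketch below shows the standard hypergeometric over-representation test commonly used by term enrichment tools, applied to a single hypothetical term with illustrative counts; it is a generic example, not STOP's specific method.

    # Standard hypergeometric over-representation test for one annotation term.
    # Counts are illustrative placeholders.
    from scipy.stats import hypergeom

    M = 20000   # background genes
    n = 150     # background genes annotated with the term
    N = 300     # genes in the submitted list
    k = 12      # genes in the list annotated with the term

    # P(X >= k): probability of seeing at least k annotated genes by chance.
    p_value = hypergeom.sf(k - 1, M, n, N)
    print(p_value)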
Comparisons of Kinematics and Dynamics Simulation Software Tools
NASA Technical Reports Server (NTRS)
Shiue, Yeu-Sheng Paul
2002-01-01
Kinematic and dynamic analyses for moving bodies are essential to system engineers and designers in the process of design and validations. 3D visualization and motion simulation plus finite element analysis (FEA) give engineers a better way to present ideas and results. Marshall Space Flight Center (MSFC) system engineering researchers are currently using IGRIP from DELMIA Inc. as a kinematic simulation tool for discrete bodies motion simulations. Although IGRIP is an excellent tool for kinematic simulation with some dynamic analysis capabilities in robotic control, explorations of other alternatives with more powerful dynamic analysis and FEA capabilities are necessary. Kinematics analysis will only examine the displacement, velocity, and acceleration of the mechanism without considering effects from masses of components. With dynamic analysis and FEA, effects such as the forces or torques at the joint due to mass and inertia of components can be identified. With keen market competition, ALGOR Mechanical Event Simulation (MES), MSC visualNastran 4D, Unigraphics Motion+, and Pro/MECHANICA were chosen for explorations. In this study, comparisons between software tools were presented in terms of following categories: graphical user interface (GUI), import capability, tutorial availability, ease of use, kinematic simulation capability, dynamic simulation capability, FEA capability, graphical output, technical support, and cost. Propulsion Test Article (PTA) with Fastrac engine model exported from IGRIP and an office chair mechanism were used as examples for simulations.
Abstract:Managing urban water infrastructures faces the challenge of jointly dealing with assets of diverse types, useful life, cost, ages and condition. Service quality and sustainability require sound long-term planning, well aligned with tactical and operational planning and m...
Using Course Currency as a Didactic Tool
ERIC Educational Resources Information Center
Wachsman, Yoav
2007-01-01
Classroom participation is an important and frequently used pedagogical strategy. This paper examines how awarding students with course currency, bills that are redeemable for bonus points at the end of the term, affects class participation and students' understanding of the material. The research uses surveys and data analysis to examine the…
A Streamlined Molecular Biology Module for Undergraduate Biochemistry Labs
ERIC Educational Resources Information Center
Muth, Gregory W.; Chihade, Joseph W.
2008-01-01
Site-directed mutagenesis and other molecular biology techniques, including plasmid manipulation and restriction analysis, are commonly used tools in the biochemistry research laboratory. In redesigning our biochemistry lab curricula, we sought to integrate these techniques into a term-long, project-based course. In the module presented here,…
Hasan Imani-Nasab, Mohammad; Seyedin, Hesam; Yazdizadeh, Bahareh; Majdzadeh, Reza
2017-01-01
Background: SUPPORT tools consist of 18 articles addressing the health policy-makers so that they can learn how to make evidence-informed health policies. These tools have been particularly recommended for developing countries. The present study tries to explain the process of evidence utilization for developing policy documents in the Iranian Ministry of Health and Medical Education (MoHME) and to compare the findings with those of SUPPORT tools. Methods: A qualitative research was conducted, using the framework analysis approach. Participants consisted of senior managers and technicians in MoHME. Purposeful sampling was done, with a maximum variety, for the selection of research participants: individuals having at least 5 years of experience in preparing evidence-based policy documents. Face-to-face interviews were conducted for data collection. As a guideline for the interviews, ‘the Utilization of Evidence in Policy-Making Organizations’ procedure was used. The data were analyzed through the analysis of the framework method using MAXQDA 10 software. Results: The participants acquired the research evidence in a topic-based form, and they were less likely to search on the basis of the evidence pyramid. To assess the quality of evidence, they did not use standard critical tools; to adapt the evidence and interventions with the local setting, they did not use the ideas and experiences of all stakeholders, and in preparing the evidence-based policy documents, they did not take into consideration the window of opportunity, did not refrain from using highly technical terms, did not write user-friendly summaries, and did not present alternative policy options. In order to develop health policies, however, they used the following innovations: attention to the financial burden of policy issues on the agenda, sensitivity analysis of the preferred policy option on the basis of technical, sociopolitical, and economic feasibility, advocacy from other scholars, using the multi-criteria decision-making models for the prioritization of policy options, implementation of policy based on the degree of readiness of policy-implementing units, and the classification of policy documents on the basis of different conditions of policy-making (urgent, short-term, and long-term). Conclusion: Findings showed that the process of evidence utilization in IR-MoH enjoys some innovations for the support of health policy development. The present study provides IR-MoH with considerable opportunities for the improvement of evidence-informed health policy-making. Moreover, the SUPPORT process and tools are recommended to be used in developing countries. PMID:28812845
Liau, Siow Yen; Mohamed Izham, M I; Hassali, M A; Shafie, A A
2010-01-01
Cardiovascular diseases, the main causes of hospitalisations and death globally, have put an enormous economic burden on the healthcare system. Several risk factors are associated with the occurrence of cardiovascular events. At the heart of efficient prevention of cardiovascular disease is the concept of risk assessment. This paper aims to review the available cardiovascular risk-assessment tools and their applicability in predicting cardiovascular risk among Asian populations. A systematic search was performed using keywords as MeSH terms and Boolean operators. A total of 25 risk-assessment tools were identified. Of these, only two risk-assessment tools (8%) were derived from an Asian population. These risk-assessment tools differ in various ways, including characteristics of the derivation sample, type of study, time frame of follow-up, end points, statistical analysis and risk factors included. Very few cardiovascular risk-assessment tools were developed in Asian populations. In order to accurately predict the cardiovascular risk of our population, there is a need to develop a risk-assessment tool based on local epidemiological data.
SlideJ: An ImageJ plugin for automated processing of whole slide images.
Della Mea, Vincenzo; Baroni, Giulia L; Pilutti, David; Di Loreto, Carla
2017-01-01
The digital slide, or Whole Slide Image, is a digital image, acquired with specific scanners, that represents a complete tissue sample or cytological specimen at microscopic level. While Whole Slide image analysis is recognized among the most interesting opportunities, the typical size of such images, up to gigapixels, can be very demanding in terms of memory requirements. Thus, while algorithms and tools for processing and analysis of single microscopic field images are available, the size of Whole Slide images makes the direct use of such tools prohibitive or impossible. In this work a plugin for ImageJ, named SlideJ, is proposed with the objective of seamlessly extending the application of image analysis algorithms implemented in ImageJ for single microscopic field images to whole digital slide analysis. The plugin is complemented by examples of macros in the ImageJ scripting language to demonstrate its use in concrete situations.
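The sketch below illustrates only the underlying strategy of applying a single-field analysis routine tile by tile to a very large image; it is not SlideJ's API, and a real whole-slide file would be read through a dedicated format library rather than held in a numpy array.

    # Generic tiling sketch for very large images: apply a per-field analysis
    # function tile by tile instead of loading the whole slide at once.
    import numpy as np

    def analyze_tile(tile):
        # Stand-in for a single-microscopic-field analysis (e.g., a mean stain value).
        return float(tile.mean())

    def analyze_in_tiles(image, tile_size=1024):
        h, w = image.shape[:2]
        results = {}
        for y in range(0, h, tile_size):
            for x in range(0, w, tile_size):
                tile = image[y:y + tile_size, x:x + tile_size]
                results[(y, x)] = analyze_tile(tile)
        return results

    slide = np.random.randint(0, 255, size=(4096, 4096), dtype=np.uint8)  # toy "slide"
    print(len(analyze_in_tiles(slide)))  # 16 tiles of 1024 x 1024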
SlideJ: An ImageJ plugin for automated processing of whole slide images
Baroni, Giulia L.; Pilutti, David; Di Loreto, Carla
2017-01-01
The digital slide, or Whole Slide Image, is a digital image, acquired with specific scanners, that represents a complete tissue sample or cytological specimen at microscopic level. While Whole Slide image analysis is recognized among the most interesting opportunities, the typical size of such images, up to gigapixels, can be very demanding in terms of memory requirements. Thus, while algorithms and tools for processing and analysis of single microscopic field images are available, the size of Whole Slide images makes the direct use of such tools prohibitive or impossible. In this work a plugin for ImageJ, named SlideJ, is proposed with the objective of seamlessly extending the application of image analysis algorithms implemented in ImageJ for single microscopic field images to whole digital slide analysis. The plugin is complemented by examples of macros in the ImageJ scripting language to demonstrate its use in concrete situations. PMID:28683129
Sun, Duanchen; Liu, Yinliang; Zhang, Xiang-Sun; Wu, Ling-Yun
2017-09-21
High-throughput experimental techniques have been dramatically improved and widely applied in the past decades. However, biological interpretation of the high-throughput experimental results, such as differentially expressed gene sets derived from microarray or RNA-seq experiments, is still a challenging task. Gene Ontology (GO) is commonly used in functional enrichment studies. The GO terms identified via current functional enrichment analysis tools often contain direct parent or descendant terms in the GO hierarchical structure. Highly redundant terms make it difficult for users to analyze the underlying biological processes. In this paper, a novel network-based probabilistic generative model, NetGen, was proposed to perform the functional enrichment analysis. An additional protein-protein interaction (PPI) network was explicitly used to assist the identification of significantly enriched GO terms. NetGen achieved superior performance compared to the existing methods in the simulation studies. The effectiveness of NetGen was explored further on four real datasets. Notably, several GO terms which were not directly linked with the active gene list for each disease were identified. These terms were closely related to the corresponding diseases when checked against the curated literature. NetGen has been implemented in the R package CopTea, publicly available at GitHub (http://github.com/wulingyun/CopTea/). Our procedure leads to a more reasonable and interpretable result of the functional enrichment analysis. As a novel term combination-based functional enrichment analysis method, NetGen is complementary to current individual term-based methods, and can help to explore the underlying pathogenesis of complex diseases.
Annotare--a tool for annotating high-throughput biomedical investigations and resulting data.
Shankar, Ravi; Parkinson, Helen; Burdett, Tony; Hastings, Emma; Liu, Junmin; Miller, Michael; Srinivasa, Rashmi; White, Joseph; Brazma, Alvis; Sherlock, Gavin; Stoeckert, Christian J; Ball, Catherine A
2010-10-01
Computational methods in molecular biology will increasingly depend on standards-based annotations that describe biological experiments in an unambiguous manner. Annotare is a software tool that enables biologists to easily annotate their high-throughput experiments, biomaterials and data in a standards-compliant way that facilitates meaningful search and analysis. Annotare is available from http://code.google.com/p/annotare/ under the terms of the open-source MIT License (http://www.opensource.org/licenses/mit-license.php). It has been tested on both Mac and Windows.
Reactive-Diffusive-Advective Traveling Waves in a Family of Degenerate Nonlinear Equations.
Sánchez-Garduño, Faustino; Pérez-Velázquez, Judith
This paper deals with the analysis of existence of traveling wave solutions (TWS) for a diffusion-degenerate (at D(0) = 0) and advection-degenerate (at h′(0) = 0) reaction-diffusion-advection (RDA) equation. Diffusion is a strictly increasing function and the reaction term generalizes the kinetic part of the Fisher-KPP equation. We consider different forms of the convection term h(u): (1) h′(u) is constant k, (2) h′(u) = ku with k > 0, and (3) it is a quite general form which guarantees the degeneracy in the advective term. In Case 1, we prove that the task can be reduced to that for the corresponding equation, where k = 0, and then previous results reported from the authors can be extended. For the other two cases, we use both analytical and numerical tools. The analysis we carried out is based on the restatement of searching TWS for the full RDA equation into a two-dimensional dynamical problem. This consists of searching for the conditions on the parameter values for which there exist heteroclinic trajectories of the ordinary differential equations (ODE) system in the traveling wave coordinates. Throughout the paper we obtain the dynamics by using tools coming from qualitative theory of ODE.
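A general form of the RDA equation consistent with the description above is, for reference (the specific kinetic term g and the traveling-wave ansatz are assumptions for illustration):

\[
\frac{\partial u}{\partial t} \;=\; \frac{\partial}{\partial x}\!\left( D(u)\,\frac{\partial u}{\partial x} \right) \;-\; \frac{\partial h(u)}{\partial x} \;+\; g(u),
\qquad D(0) = 0, \quad h'(0) = 0,
\]

with g(u) a Fisher-KPP-type kinetic term (e.g. g(u) = u(1 - u)) and traveling wave solutions sought in the form u(x, t) = \phi(x - ct), which leads to the two-dimensional ODE system mentioned in the abstract.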
NASA's In-Space Propulsion Technology Project Overview, Near-term Products and Mission Applicability
NASA Technical Reports Server (NTRS)
Dankanich, John; Anderson, David J.
2008-01-01
The In-Space Propulsion Technology (ISPT) Project, funded by NASA's Science Mission Directorate (SMD), is continuing to invest in propulsion technologies that will enable or enhance NASA robotic science missions. This overview provides development status, near-term mission benefits, applicability, and availability of in-space propulsion technologies in the areas of aerocapture, electric propulsion, advanced chemical thrusters, and systems analysis tools. Aerocapture investments improved (1) guidance, navigation, and control models of blunt-body rigid aeroshells, (2) atmospheric models for Earth, Titan, Mars and Venus, and (3) models for aerothermal effects. Investments in electric propulsion technologies focused on completing NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system. The project is also concluding its High Voltage Hall Accelerator (HiVHAC) mid-term product specifically designed for a low-cost electric propulsion option. The primary chemical propulsion investment is on the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance for lower cost. The project is also delivering products to assist technology infusion and quantify mission applicability and benefits through mission analysis and tools. In-space propulsion technologies are applicable, and potentially enabling, for flagship destinations currently under evaluation, as well as having broad applicability to future Discovery and New Frontiers mission solicitations.
Reactive-Diffusive-Advective Traveling Waves in a Family of Degenerate Nonlinear Equations
Sánchez-Garduño, Faustino
2016-01-01
This paper deals with the analysis of existence of traveling wave solutions (TWS) for a diffusion-degenerate (at D(0) = 0) and advection-degenerate (at h′(0) = 0) reaction-diffusion-advection (RDA) equation. Diffusion is a strictly increasing function and the reaction term generalizes the kinetic part of the Fisher-KPP equation. We consider different forms of the convection term h(u): (1) h′(u) is constant k, (2) h′(u) = ku with k > 0, and (3) it is a quite general form which guarantees the degeneracy in the advective term. In Case 1, we prove that the task can be reduced to that for the corresponding equation, where k = 0, and then previous results reported from the authors can be extended. For the other two cases, we use both analytical and numerical tools. The analysis we carried out is based on the restatement of searching TWS for the full RDA equation into a two-dimensional dynamical problem. This consists of searching for the conditions on the parameter values for which there exist heteroclinic trajectories of the ordinary differential equations (ODE) system in the traveling wave coordinates. Throughout the paper we obtain the dynamics by using tools coming from qualitative theory of ODE. PMID:27689131
Hasan Imani-Nasab, Mohammad; Seyedin, Hesam; Yazdizadeh, Bahareh; Majdzadeh, Reza
2017-01-08
SUPPORT tools consist of 18 articles addressing the health policy-makers so that they can learn how to make evidence-informed health policies. These tools have been particularly recommended for developing countries. The present study tries to explain the process of evidence utilization for developing policy documents in the Iranian Ministry of Health and Medical Education (MoHME) and to compare the findings with those of SUPPORT tools. A qualitative research was conducted, using the framework analysis approach. Participants consisted of senior managers and technicians in MoHME. Purposeful sampling was done, with a maximum variety, for the selection of research participants: individuals having at least 5 years of experience in preparing evidence-based policy documents. Face-to-face interviews were conducted for data collection. As a guideline for the interviews, 'the Utilization of Evidence in Policy-Making Organizations' procedure was used. The data were analyzed through the analysis of the framework method using MAXQDA 10 software. The participants acquired the research evidence in a topic-based form, and they were less likely to search on the basis of the evidence pyramid. To assess the quality of evidence, they did not use standard critical tools; to adapt the evidence and interventions with the local setting, they did not use the ideas and experiences of all stakeholders, and in preparing the evidence-based policy documents, they did not take into consideration the window of opportunity, did not refrain from using highly technical terms, did not write user-friendly summaries, and did not present alternative policy options. In order to develop health policies, however, they used the following innovations: attention to the financial burden of policy issues on the agenda, sensitivity analysis of the preferred policy option on the basis of technical, sociopolitical, and economic feasibility, advocacy from other scholars, using the multi-criteria decision-making models for the prioritization of policy options, implementation of policy based on the degree of readiness of policy-implementing units, and the classification of policy documents on the basis of different conditions of policy-making (urgent, short-term, and long-term). Findings showed that the process of evidence utilization in IR-MoH enjoys some innovations for the support of health policy development. The present study provides IR-MoH with considerable opportunities for the improvement of evidence-informed health policy-making. Moreover, the SUPPORT process and tools are recommended to be used in developing countries. © 2017 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
On-Line Tool for the Assessment of Radiation in Space - Deep Space Mission Enhancements
NASA Technical Reports Server (NTRS)
Sandridge, Chris a.; Blattnig, Steve R.; Norman, Ryan B.; Slaba, Tony C.; Walker, Steve A.; Spangler, Jan L.
2011-01-01
The On-Line Tool for the Assessment of Radiation in Space (OLTARIS, https://oltaris.nasa.gov) is a web-based set of tools and models that allows engineers and scientists to assess the effects of space radiation on spacecraft, habitats, rovers, and spacesuits. The site is intended to be a design tool for those studying the effects of space radiation for current and future missions as well as a research tool for those developing advanced material and shielding concepts. The tools and models are built around the HZETRN radiation transport code and are primarily focused on human- and electronic-related responses. The focus of this paper is to highlight new capabilities that have been added to support deep space (outside Low Earth Orbit) missions. Specifically, the electron, proton, and heavy ion design environments for the Europa mission have been incorporated along with an efficient coupled electron-photon transport capability to enable the analysis of complicated geometries and slabs exposed to these environments. In addition, a neutron albedo lunar surface environment was also added, which will be of value for the analysis of surface habitats. These updates will be discussed in terms of their implementation and how OLTARIS can be used by instrument vendors, mission designers, and researchers to analyze their specific requirements.
BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Selva, Jacopo
2013-04-01
The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which focuses on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language ensures that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for instance, the global standards defined in the frame of the GEM project for seismic hazard and risk) will ensure interoperability with other FOSS software and tools and, at the same time, keep the tool readily available to the geo-scientific community. An already available example of such a connection is the BET_VH(**) tool, whose probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be used for the case study of the municipality of Naples, by considering three different natural hazards (volcanic eruptions, earthquakes and tsunamis) and by performing the consequent long-term risk evaluation. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is a probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to be run on the Vhub cyber-infrastructure, a free web-based collaborative tool in volcanology research (see http://vhub.org/resources/betvh).
Computational Analysis of Material Flow During Friction Stir Welding of AA5059 Aluminum Alloys
NASA Astrophysics Data System (ADS)
Grujicic, M.; Arakere, G.; Pandurangan, B.; Ochterbeck, J. M.; Yen, C.-F.; Cheeseman, B. A.; Reynolds, A. P.; Sutton, M. A.
2012-09-01
Workpiece material flow and stirring/mixing during the friction stir welding (FSW) process are investigated computationally. Within the numerical model of the FSW process, the FSW tool is treated as a Lagrangian component while the workpiece material is treated as an Eulerian component. The employed coupled Eulerian/Lagrangian computational analysis of the welding process was of a two-way thermo-mechanical character (i.e., frictional-sliding/plastic-work dissipation is taken to act as a heat source in the thermal-energy balance equation) while temperature is allowed to affect mechanical aspects of the model through temperature-dependent material properties. The workpiece material (AA5059, a solid-solution strengthened and strain-hardened aluminum alloy) is represented using a modified version of the classical Johnson-Cook model (within which the strain-hardening term is augmented to take into account the effect of dynamic recrystallization) while the FSW tool material (AISI H13 tool steel) is modeled as an isotropic linear-elastic material. Within the analysis, the effects of some of the FSW key process parameters are investigated (e.g., weld pitch, tool tilt-angle, and the tool pin-size). The results pertaining to the material flow during FSW are compared with their experimental counterparts. It is found that, for the most part, experimentally observed material-flow characteristics are reproduced within the current FSW-process model.
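For reference, the classical (unmodified) Johnson-Cook flow stress invoked above is sigma = (A + B*eps_p^n) * (1 + C*ln(eps_rate/eps_rate0)) * (1 - T*^m), with T* the homologous temperature. The sketch below evaluates it in Python with illustrative parameter values that are not calibrated to AA5059, and it does not include the paper's dynamic-recrystallization modification.

```python
import numpy as np

def johnson_cook_stress(eps_p, eps_rate, T, A=300e6, B=450e6, n=0.35,
                        C=0.015, eps_rate0=1.0, T_room=293.0, T_melt=873.0, m=1.0):
    """Classical Johnson-Cook flow stress (Pa); parameter values are illustrative only."""
    strain_hardening = A + B * eps_p**n
    rate_term = 1.0 + C * np.log(np.maximum(eps_rate / eps_rate0, 1e-12))
    T_star = np.clip((T - T_room) / (T_melt - T_room), 0.0, 1.0)  # homologous temperature
    thermal_softening = 1.0 - T_star**m
    return strain_hardening * rate_term * thermal_softening

# Flow stress at 10% plastic strain, 100 1/s strain rate and 500 K under the toy parameters.
print(johnson_cook_stress(0.10, 1.0e2, 500.0) / 1e6, "MPa")
```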
Lakeside: Merging Urban Design with Scientific Analysis
Guzowski, Leah; Catlett, Charlie; Woodbury, Ed
2018-01-16
Researchers at the U.S. Department of Energy's Argonne National Laboratory and the University of Chicago are developing tools that merge urban design with scientific analysis to improve the decision-making process associated with large-scale urban developments. One such tool, called LakeSim, has been prototyped with an initial focus on consumer-driven energy and transportation demand, through a partnership with the Chicago-based architectural and engineering design firm Skidmore, Owings & Merrill, the Clean Energy Trust and developer McCaffery Interests. LakeSim began with the need to answer practical questions about urban design and planning, requiring a better understanding of the long-term impact of design decisions on energy and transportation demand for a 600-acre development project on Chicago's South Side - the Chicago Lakeside Development project.
Methodology Development for Assessment of Spaceport Technology Returns and Risks
NASA Technical Reports Server (NTRS)
Joglekar, Prafulla; Zapata, Edgar
2001-01-01
As part of Kennedy Space Center's (KSC's) challenge to open the space frontier, new spaceport technologies must be developed, matured and successfully transitioned to operational systems. R&D investment decisions can be considered from multiple perspectives. Near-, mid-, and far-term technology horizons must be understood. Because a multitude of technology investment opportunities are available, we must identify choices that promise the greatest likelihood of significant lifecycle returns. At the same time, the costs and risks of any choice must be well understood and balanced against its potential returns. The problem is not one of simply rank-ordering projects in terms of their desirability. KSC wants to determine a portfolio of projects that simultaneously satisfies multiple goals, such as getting the biggest bang for the buck, supporting projects that may be too risky for private funding, staying within annual budget cycles without foregoing the requirements of a long-term technology vision, and ensuring the development of a diversity of technologies that support the variety of operational functions involved in space transportation. This work aims to assist in the development of methods and techniques that support strategic technology investment decisions and ease the process of determining an optimal portfolio of spaceport R&D investments. The available literature on risks and returns to R&D is reviewed and the most useful pieces are brought to the attention of the Spaceport Technology Development Office (STDO). KSC's current project management procedures are reviewed. It is found that the "one size fits all" nature of KSC's existing procedures and project selection criteria is not conducive to prudent decision-making. Directions for improving KSC's procedures and criteria are outlined. With the help of a contractor, STDO is currently developing a tool, named Change Management Analysis Tool (CMAT)/Portfolio Analysis Tool (PAT), to assist KSC's R&D portfolio determination. A critical review of CMAT/PAT is undertaken. Directions for the improvement of this tool are provided. STDO and KSC intend to follow up on many, if not all, of the recommendations provided.
Developing a uniformed assessment tool to evaluate care service needs for disabled persons in Japan.
Takei, Teiji; Takahashi, Hiroshi; Nakatani, Hiroki
2008-05-01
Until recently, care services for disabled persons in Japan had been under rigid public-sector control in terms of provision and funding. A reform introduced in 2003 brought a rapid increase in the utilization of services and a serious shortage of financial resources. Under these circumstances, the "Services and Supports for Persons with Disabilities Act" was enacted in 2005, requiring that the care service provision process be transparent, fair and standardized. The purpose of this study is to develop an objective tool for assessing the need for disability care. In the present study we evaluate 1423 cases of patients receiving care services in 60 municipalities, including all three categories of disability (physical, intellectual and mental). Using the data from all 106 assessment items, we conducted factor analysis and regression analysis to develop an assessment tool for people with disabilities. The data revealed that instrumental activities of daily living (IADL) played an essential role in assessing disability levels. We have developed a uniformed assessment tool that has been utilized to guide the types and quantity of care services throughout Japan.
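The abstract reports a two-step factor-analysis-plus-regression workflow on 106 items; the sketch below reproduces only the shape of that workflow on synthetic data with scikit-learn (the actual items, factor structure and outcome variable are not available here, so everything other than the sample and item counts is invented).

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_cases, n_items = 1423, 106                                        # sizes taken from the abstract
items = rng.integers(0, 4, size=(n_cases, n_items)).astype(float)   # synthetic item scores
care_hours = items[:, :10].sum(axis=1) + rng.normal(0, 2, n_cases)  # synthetic outcome

# Step 1: reduce the 106 items to a small number of latent factors (e.g. IADL-like).
fa = FactorAnalysis(n_components=5, random_state=0)
factor_scores = fa.fit_transform(items)

# Step 2: regress the service-need outcome on the factor scores.
reg = LinearRegression().fit(factor_scores, care_hours)
# On purely random synthetic data the R^2 is not meaningful; the point is the workflow.
print("R^2 on synthetic data:", round(reg.score(factor_scores, care_hours), 3))
```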
Cavalcante, Fátima Gonçalves; Minayo, Maria Cecília de Souza; Gutierrez, Denise Machado Duran; de Sousa, Girliani Silva; da Silva, Raimunda Magalhães; Moura, Rosylaine; Meneghel, Stela Nazareth; Grubits, Sonia; Conte, Marta; Cavalcante, Ana Célia Sousa; Figueiredo, Ana Elisa Bastos; Mangas, Raimunda Matilde do Nascimento; Fachola, María Cristina Heuguerot; Izquierdo, Giovane Mendieta
2015-06-01
The article analyses the quality and consistency of a comprehensive interview guide, adapted to study attempted suicide and its ideation among the elderly, and imparts the method followed in applying this tool. The objective is to show how the use of a semi-structured interview and the organization and data analysis set-up were tested and perfected by a network of researchers from twelve universities or research centers in Brazil, Uruguay and Colombia. The method involved application and evaluation of the tool and joint production of an instruction manual on data collection, systematization and analysis. The methodology was followed in 67 interviews with elderly people aged 60 or older and in 34 interviews with health professionals in thirteen Brazilian municipalities and in Montevideo and Bogotá, allowing the consistency of the tool and the applicability of the method to be checked during the process and at the end. The enhanced guide and the instructions for reproducing it are presented herein. The results indicate the suitability and credibility of this methodological approach, tested and certified in interdisciplinary and interinstitutional terms.
Reed, Shelby D.; Neilson, Matthew P.; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H.; Polsky, Daniel E.; Graham, Felicia L.; Bowers, Margaret T.; Paul, Sara C.; Granger, Bradi B.; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara; Levy, Wayne C.
2015-01-01
Background Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. Methods We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics, use of evidence-based medications, and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model (SHFM). Projections of resource use and quality of life are modeled using relationships with time-varying SHFM scores. The model can be used to evaluate parallel-group and single-cohort designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. Results The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. Conclusion The TEAM-HF Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. PMID:26542504
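As a minimal illustration of the incremental cost-effectiveness calculation that such paired-cohort simulations feed into (the TEAM-HF model itself is far richer), one might compute the ICER and a cost-effectiveness probability from synthetic cohort draws as follows; all costs, QALYs and the willingness-to-pay threshold are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims = 10_000   # number of paired virtual cohorts, as in the abstract

# Synthetic per-patient lifetime costs (USD) and quality-adjusted life years (QALYs).
cost_control = rng.normal(60_000, 8_000, n_sims)
cost_program = rng.normal(64_000, 8_000, n_sims)   # disease management adds cost...
qaly_control = rng.normal(4.0, 0.5, n_sims)
qaly_program = rng.normal(4.2, 0.5, n_sims)        # ...and adds quality-adjusted survival

delta_cost = cost_program - cost_control
delta_qaly = qaly_program - qaly_control

# Incremental cost-effectiveness ratio of the mean differences.
icer = delta_cost.mean() / delta_qaly.mean()
# Probability the program is cost-effective at a $100,000/QALY willingness-to-pay threshold.
prob_ce = np.mean(delta_cost - 100_000 * delta_qaly < 0)
print(f"ICER ~ ${icer:,.0f}/QALY, P(cost-effective) ~ {prob_ce:.2f}")
```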
NASA Astrophysics Data System (ADS)
Bartolini, Stefania; Sobradelo, Rosa; Martí, Joan
2016-08-01
Short-term hazard assessment is an important part of the volcanic management cycle, especially at the onset of an episode of volcanic agitation (unrest). For this reason, one of the main tasks of modern volcanology is to use monitoring data to identify and analyse precursory signals and so determine where and when an eruption might occur. This work follows from Sobradelo and Martí [Short-term volcanic hazard assessment through Bayesian inference: retrospective application to the Pinatubo 1991 volcanic crisis. Journal of Volcanology and Geothermal Research 290, 111, 2015], who defined the principles of a new methodology for conducting short-term hazard assessment at unrest volcanoes. Using the same case study, the eruption of Pinatubo (15 June 1991), this work introduces a new free Python tool, ST-HASSET, which implements the Sobradelo and Martí (2015) methodology to track the time evolution of unrest indicators in short-term volcanic hazard assessment. Moreover, the tool is designed to complement long-term hazard assessment with continuous monitoring data when the volcano goes into unrest. It is based on Bayesian inference and transforms different pre-eruptive monitoring parameters into a common probabilistic scale for comparison among unrest episodes from the same volcano or from similar ones. This allows common pre-eruptive behaviours and patterns to be identified. ST-HASSET is especially designed to assist experts and decision makers as a crisis unfolds, and allows sudden changes in the activity of a volcano to be detected. Therefore, it makes an important contribution to the analysis and interpretation of relevant data for understanding the evolution of volcanic unrest.
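ST-HASSET's published formulation is not reproduced in the abstract; as a generic illustration of Bayesian updating of an eruption probability from monitoring indicators, the sketch below performs a Beta-Binomial update with a hypothetical prior and hypothetical anomaly counts, and should not be read as the actual method.

```python
from scipy import stats

# Hypothetical prior on the probability of eruption given unrest (prior mean 0.2).
alpha_prior, beta_prior = 2.0, 8.0

# Hypothetical monitoring evidence: 6 of 8 tracked pre-eruptive indicators are anomalous.
anomalous, monitored = 6, 8

alpha_post = alpha_prior + anomalous
beta_post = beta_prior + (monitored - anomalous)
posterior = stats.beta(alpha_post, beta_post)

print("Posterior mean eruption probability:", round(posterior.mean(), 3))
print("90% credible interval:", [round(x, 3) for x in posterior.interval(0.90)])
```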
Battery Lifetime Analysis and Simulation Tool Suite
The suite supports comparisons of different battery-use strategies to predict long-term performance in electric vehicles (EVs) and to assess the economic and greenhouse gas impacts of different EV scenarios. Users can enter their own battery duty cycles for direct simulation.
Construction and Validation of Textbook Analysis Grids for Ecology and Environmental Education
ERIC Educational Resources Information Center
Caravita, Silvia; Valente, Adriana; Luzi, Daniela; Pace, Paul; Valanides, Nicos; Khalil, Iman; Berthou, Guillemette; Kozan-Naumescu, Adrienne; Clement, Pierre
2008-01-01
Knowledge about ecology and environmental education (EE) constitutes a basic tool for promoting a sustainable future, and was a target area of the BIOHEAD-Citizen Project. School textbooks were considered as representative sources of evidence in terms of ecology and environmental education, and were used for comparison among the countries…
Time as a Tool for Policy Analysis in Aging.
ERIC Educational Resources Information Center
Pastorello, Thomas
National policy makers have put forth different life cycle planning proposals for the more satisfying integration of education, work and leisure over the life course. This speech describes a decision-making scheme, the Time Paradigm, for research-based choice among various proposals. The scheme is defined in terms of a typology of time-related…
ERIC Educational Resources Information Center
Yoon, Ee-Seul; Lubienski, Christopher
2018-01-01
This paper suggests that synergies can be produced by using geospatial analyses as a bridge between traditional qualitative-quantitative distinctions in education research. While mapping tools have been effective for informing education policy studies, especially in terms of educational access and choice, they have also been underutilized and…
Using Spreadsheet Modeling Techniques for Capital Project Review. AIR 1985 Annual Forum Paper.
ERIC Educational Resources Information Center
Kaynor, Robert K.
The value of microcomputer modeling tools and spreadsheets to help college institutional researchers analyze proposed capital projects is discussed, along with strengths and weaknesses of different software packages. Capital budgeting is the analysis that supports decisions about the allocation and commitment of funds to long-term capital…
Got Tools? The Blended Learning Analysis and Design Expediter
ERIC Educational Resources Information Center
Elsenheimer, Jim
2006-01-01
Blended learning is an approach to instructional design that seeks to maximize learning potential by applying the most effective form of instruction for a given program element. The term "blended learning" should not refer to just the mixing of training delivery methods (as it is often defined) but to the orchestrated application and integration…
A Computer Simulation Modeling Tool to Assist Colleges in Long-Range Planning. Final Report.
ERIC Educational Resources Information Center
Salmon, Richard; And Others
Long-range planning involves the establishment of educational objectives within a rational philosophy, the design of activities and programs to meet stated objectives, the organization and allocation of resources to implement programs, and the analysis of results in terms of the objectives. Current trends of educational growth and complexity…
Layer-oriented simulation tool.
Arcidiacono, Carmelo; Diolaiti, Emiliano; Tordi, Massimiliano; Ragazzoni, Roberto; Farinato, Jacopo; Vernet, Elise; Marchetti, Enrico
2004-08-01
The Layer-Oriented Simulation Tool (LOST) is a numerical simulation code developed for analysis of the performance of multiconjugate adaptive optics modules following a layer-oriented approach. The LOST code computes the atmospheric layers in terms of phase screens and then propagates the phase delays introduced in the natural guide stars' wave fronts by using geometrical optics approximations. These wave fronts are combined in an optical or numerical way, including the effects of wave-front sensors on measurements in terms of phase noise. The LOST code is described, and two applications to layer-oriented modules are briefly presented. We focus on the multiconjugate adaptive optics demonstrator to be mounted on the Very Large Telescope and on the Near-IR-Visible Adaptive Interferometer for Astronomy (NIRVANA) interferometric system to be installed at the combined focus of the Large Binocular Telescope.
A Microarray Tool Provides Pathway and GO Term Analysis.
Koch, Martin; Royer, Hans-Dieter; Wiese, Michael
2011-12-01
Analysis of gene expression profiles is no longer exclusively a task for bioinformatics experts. However, gaining statistically significant results is challenging and requires both biological knowledge and computational know-how. Here we present a novel, user-friendly microarray reporting tool called maRt. The software provides access to bioinformatic resources, such as gene ontology terms and biological pathways, by use of the DAVID and BioMart web services. Results are summarized in structured HTML reports, each presenting a different layer of information. In these reports, content from diverse sources is integrated and interlinked. To speed up processing, maRt takes advantage of the multi-core technology of modern desktop computers by using parallel processing. Since the software is built upon an RCP infrastructure, it might serve as a starting point for developers aiming to integrate novel R-based applications. The installer, documentation and various kinds of tutorials are available under the LGPL license at the website of our institute, http://www.pharma.uni-bonn.de/www/mart. This software is free for academic use. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
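maRt delegates enrichment to the DAVID and BioMart web services, which are not replicated here; as a generic illustration of the over-representation statistics behind such GO term reports, a one-sided hypergeometric test for a single hypothetical term can be computed as follows.

```python
from scipy import stats

# Hypothetical counts for one GO term:
population = 20_000        # annotated genes on the array
term_genes = 300           # genes annotated with the GO term
selected = 150             # differentially expressed genes
term_in_selected = 12      # selected genes carrying the term

# P(X >= term_in_selected) under a hypergeometric null (classic over-representation test).
p_value = stats.hypergeom.sf(term_in_selected - 1, population, term_genes, selected)
print(f"Enrichment p-value: {p_value:.3e}")
```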
NASA Astrophysics Data System (ADS)
Gómez-Beas, R.; Moñino, A.; Polo, M. J.
2012-05-01
In compliance with the development of the Water Framework Directive, there is a need for an integrated management of water resources, which involves the elaboration of reservoir management models. These models should include the operational and technical aspects that allow optimal management to be forecast in the short term, as well as the factors that may affect the volume of water stored in the medium and long term. The climate fluctuations of the water cycle that affect the reservoir watershed should be considered, as well as the social and economic aspects of the area. This paper shows the development of a management model for the Rules reservoir (southern Spain), through which the water supply is regulated based on set criteria, in a way that is sustainable and consistent with existing commitments downstream, with the supply capacity well established as a function of demand, together with the probability of failure when the operating requirements are not fulfilled. The results obtained allowed us to characterize the reservoir response at different time scales, to introduce an uncertainty analysis, and to demonstrate the potential of the methodology proposed here as a tool for decision making.
A network-based analysis of CMIP5 "historical" experiments
NASA Astrophysics Data System (ADS)
Bracco, A.; Foudalis, I.; Dovrolis, C.
2012-12-01
In computer science, "complex network analysis" refers to a set of metrics, modeling tools and algorithms commonly used in the study of complex nonlinear dynamical systems. Its main premise is that the underlying topology or network structure of a system has a strong impact on its dynamics and evolution. By allowing local and non-local statistical interactions to be investigated, network analysis provides a powerful, but only marginally explored, framework to validate climate models and investigate teleconnections, assessing their strength, range, and impacts on the climate system. In this work we propose a new, fast, robust and scalable methodology to examine, quantify, and visualize climate sensitivity, while constraining general circulation model (GCM) outputs with observations. The goal of our novel approach is to uncover relations in the climate system that are not (or not fully) captured by more traditional methodologies used in climate science and often adopted from nonlinear dynamical systems analysis, and to explain known climate phenomena in terms of the network structure or its metrics. Our methodology is based on a solid theoretical framework and employs mathematical and statistical tools that have been exploited only tentatively in climate research so far. Suitably adapted to the climate problem, these tools can assist in visualizing the trade-offs in representing global links and teleconnections among different data sets. Here we present the methodology and compare network properties for different reanalysis data sets and a suite of CMIP5 coupled GCM outputs. With an extensive model intercomparison in terms of the climate network that each model leads to, we quantify how well each model reproduces major teleconnections, rank model performances, and identify common or specific errors in comparing model outputs and observations.
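The paper's exact network construction is not given in the abstract; a minimal sketch of how a correlation-threshold climate network can be built from gridded anomaly time series with NumPy and NetworkX (synthetic data, arbitrary threshold, hypothetical "teleconnection" planted in the first 25 nodes) is shown below.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
n_nodes, n_months = 50, 360                       # e.g. 50 grid cells, 30 years of monthly anomalies
series = rng.normal(size=(n_nodes, n_months))     # synthetic anomaly time series
common_mode = rng.normal(size=n_months)
series[:25] += common_mode                        # synthetic "teleconnection": shared signal

corr = np.corrcoef(series)                        # node-to-node correlation matrix
threshold = 0.3                                   # arbitrary link threshold

G = nx.Graph()
G.add_nodes_from(range(n_nodes))
for i in range(n_nodes):
    for j in range(i + 1, n_nodes):
        if abs(corr[i, j]) >= threshold:
            G.add_edge(i, j, weight=corr[i, j])

# Simple network metrics often used to compare models and reanalyses.
print("edges:", G.number_of_edges())
print("mean degree:", 2 * G.number_of_edges() / n_nodes)
print("clustering coefficient:", round(nx.average_clustering(G), 3))
```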
Willard, Scott D; Nguyen, Mike M
2013-01-01
To evaluate the utility of using Internet search trends data to estimate kidney stone occurrence and understand the priorities of patients with kidney stones. Internet search trends data represent a unique resource for monitoring population self-reported illness and health information-seeking behavior. The Google Insights for Search analysis tool was used to study searches related to kidney stones, with each search term returning a search volume index (SVI) according to the search frequency relative to the total search volume. SVIs for the term, "kidney stones," were compiled by location and time parameters and compared with the published weather and stone prevalence data. Linear regression analysis was performed to determine the association of the search interest score with known epidemiologic variations in kidney stone disease, including latitude, temperature, season, and state. The frequency of the related search terms was categorized by theme and qualitatively analyzed. The SVI correlated significantly with established kidney stone epidemiologic predictors. The SVI correlated with the state latitude (R-squared=0.25; P<.001), the state mean annual temperature (R-squared=0.24; P<.001), and state combined sex prevalence (R-squared=0.25; P<.001). Female prevalence correlated more strongly than did male prevalence (R-squared=0.37; P<.001, and R-squared=0.17; P=.003, respectively). The national SVI correlated strongly with the average U.S. temperature by month (R-squared=0.54; P=.007). The search term ranking suggested that Internet users are most interested in the diagnosis, followed by etiology, infections, and treatment. Geographic and temporal variability in kidney stone disease appear to be accurately reflected in Internet search trends data. Internet search trends data might have broader applications for epidemiologic and urologic research. Copyright © 2013 Elsevier Inc. All rights reserved.
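Google Insights for Search has since been folded into Google Trends, so its interface is not shown here; purely as an illustration of the kind of linear regression reported above (SVI against state mean annual temperature), the sketch below runs scipy.stats.linregress on synthetic state-level data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_states = 50
mean_temp_f = rng.uniform(40, 75, n_states)                 # synthetic state mean annual temperatures
svi = 30 + 0.8 * mean_temp_f + rng.normal(0, 8, n_states)   # synthetic search volume index

result = stats.linregress(mean_temp_f, svi)
print(f"R-squared = {result.rvalue**2:.2f}, p = {result.pvalue:.3g}")
```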
A hardware acceleration based on high-level synthesis approach for glucose-insulin analysis
NASA Astrophysics Data System (ADS)
Daud, Nur Atikah Mohd; Mahmud, Farhanahani; Jabbar, Muhamad Hairol
2017-01-01
In this paper, the research focuses on Type 1 Diabetes Mellitus (T1DM). Since this disease requires constant attention to the blood glucose concentration, which is managed with insulin injections, it is important to have a tool that is able to predict that level when a certain amount of carbohydrate is consumed at meal time. Therefore, the Hovorka model, which targets T1DM, is chosen in this research. A high-level language, C++, is chosen to construct the mathematical model of the Hovorka model. This code is then converted into an intellectual property (IP) block, also known as a hardware accelerator, using a high-level synthesis (HLS) approach, which improves the design and performance of the glucose-insulin analysis tool, as explained further in this paper. This is the first step in this research before implementing the design in a system-on-chip (SoC) to achieve a high-performance system for the glucose-insulin analysis tool.
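The full Hovorka model comprises many compartments and parameters that the abstract does not list; the sketch below integrates a deliberately simplified, generic two-state glucose-insulin ODE only to illustrate the kind of meal-response simulation the accelerator targets. All equations and parameter values here are invented placeholders, not the Hovorka model.

```python
import numpy as np
from scipy.integrate import odeint

def minimal_glucose_insulin(state, t, meal_rate):
    """Toy two-compartment model: glucose G (mmol/L) and plasma insulin I (mU/L)."""
    G, I = state
    G_basal, I_basal = 5.5, 10.0
    dG = -0.02 * (G - G_basal) - 0.001 * (I - I_basal) * G + meal_rate(t)
    dI = -0.1 * (I - I_basal) + 0.05 * max(G - G_basal, 0.0)
    return [dG, dI]

def meal_rate(t):
    # Carbohydrate appearance from a meal eaten at t = 60 min (toy square pulse).
    return 0.08 if 60.0 <= t <= 90.0 else 0.0

t = np.linspace(0, 360, 361)                       # six hours, one-minute steps
trajectory = odeint(minimal_glucose_insulin, [5.5, 10.0], t, args=(meal_rate,))
print("Peak glucose (mmol/L):", round(trajectory[:, 0].max(), 2))
```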
Dyrlund, Thomas F; Poulsen, Ebbe T; Scavenius, Carsten; Sanggaard, Kristian W; Enghild, Jan J
2012-09-01
Data processing and analysis of proteomics data are challenging and time consuming. In this paper, we present MS Data Miner (MDM) (http://sourceforge.net/p/msdataminer), a freely available web-based software solution aimed at minimizing the time required for the analysis, validation, data comparison, and presentation of data files generated in MS software, including Mascot (Matrix Science), Mascot Distiller (Matrix Science), and ProteinPilot (AB Sciex). The program was developed to significantly decrease the time required to process large proteomic data sets for publication. This open-source system includes a spectra validation system and an automatic screenshot generation tool for Mascot-assigned spectra. In addition, a Gene Ontology term analysis function and a tool for generating comparative Excel data reports are included. We illustrate the benefits of MDM during a proteomics study comprising more than 200 LC-MS/MS analyses recorded on an AB Sciex TripleTOF 5600, identifying more than 3000 unique proteins and 3.5 million peptides. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A primer in macromolecular linguistics.
Searls, David B
2013-03-01
Polymeric macromolecules, when viewed abstractly as strings of symbols, can be treated in terms of formal language theory, providing a mathematical foundation for characterizing such strings both as collections and in terms of their individual structures. In addition this approach offers a framework for analysis of macromolecules by tools and conventions widely used in computational linguistics. This article introduces the ways that linguistics can be and has been applied to molecular biology, covering the relevant formal language theory at a relatively nontechnical level. Analogies between macromolecules and human natural language are used to provide intuitive insights into the relevance of grammars, parsing, and analysis of language complexity to biology. Copyright © 2012 Wiley Periodicals, Inc.
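As a concrete, deliberately tiny example of the grammar-based view described above, the sketch below implements a recognizer for the classic context-free "stem" grammar S -> aSu | uSa | gSc | cSg | epsilon, which generates perfectly base-paired RNA stems; it is an illustration of the approach, not a tool from the article.

```python
PAIRS = {("a", "u"), ("u", "a"), ("g", "c"), ("c", "g")}

def derivable(seq: str) -> bool:
    """True if seq can be derived from S -> aSu | uSa | gSc | cSg | epsilon."""
    if seq == "":
        return True
    if len(seq) % 2 == 1:
        return False
    if (seq[0], seq[-1]) in PAIRS:
        return derivable(seq[1:-1])      # peel off the outermost base pair
    return False

print(derivable("gcaugc"))   # True: a perfectly nested stem
print(derivable("gcau"))     # False: the ends do not pair
```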
Substance flow analysis as a tool for urban water management.
Chèvre, N; Guignard, C; Rossi, L; Pfeifer, H-R; Bader, H-P; Scheidegger, R
2011-01-01
Human activity results in the production of a wide range of pollutants that can enter the water cycle through stormwater or wastewater. Among others, heavy metals are still detected in high concentrations around urban areas and their impact on aquatic organisms is of major concern. In this study, we propose to use a substance flow analysis as a tool for heavy metals management in urban areas. We illustrate the approach with the case of copper in Lausanne, Switzerland. The results show that around 1,500 kg of copper enter the aquatic compartment yearly. This amount contributes to sediment enrichment, which may pose a long-term risk for benthic organisms. The major sources of copper in receiving surface water are roofs and catenaries of trolleybuses. They represent 75% of the total input of copper into the urban water system. Actions to reduce copper pollution should therefore focus on these sources. Substance flow analysis also highlights that copper enters surface water mainly during rain events, i.e., without passing through any treatment procedure. A reduction in pollution could also be achieved by improving stormwater management. In conclusion, the study showed that substance flow analysis is a very effective tool for sustainable urban water management.
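The study's full flow model is not reproduced in the abstract; the sketch below only restates the quoted figures (about 1,500 kg/yr reaching surface water, 75% of it from roofs and trolleybus catenaries) as a trivial flow balance, with the remaining split labelled as assumed.

```python
# Annual copper inputs to urban surface water (kg/yr); values other than the
# total and the roofs+catenaries share are illustrative placeholders.
total_to_water = 1500.0
roofs_and_catenaries = 0.75 * total_to_water
other_sources = total_to_water - roofs_and_catenaries

sources = {
    "roofs and trolleybus catenaries": roofs_and_catenaries,
    "other diffuse sources (assumed)": other_sources,
}

for name, flux in sources.items():
    print(f"{name:35s} {flux:7.1f} kg/yr ({100 * flux / total_to_water:4.1f}%)")
```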
Yi, Ming; Mudunuri, Uma; Che, Anney; Stephens, Robert M
2009-06-29
One of the challenges in the analysis of microarray data is to integrate and compare the selected (e.g., differential) gene lists from multiple experiments for common or unique underlying biological themes. A common way to approach this problem is to extract common genes from these gene lists and then subject these genes to enrichment analysis to reveal the underlying biology. However, the capacity of this approach is largely restricted by the limited number of common genes shared by datasets from multiple experiments, which could be caused by the complexity of the biological system itself. We now introduce a new Pathway Pattern Extraction Pipeline (PPEP), which extends the existing WPS application by providing a new pathway-level comparative analysis scheme. To facilitate comparing and correlating results from different studies and sources, PPEP contains new interfaces that allow evaluation of the pathway-level enrichment patterns across multiple gene lists. As an exploratory tool, this analysis pipeline may help reveal the underlying biological themes at both the pathway and gene levels. The analysis scheme provided by PPEP begins with multiple gene lists, which may be derived from different studies in terms of the biological contexts, applied technologies, or methodologies. These lists are then subjected to pathway-level comparative analysis for extraction of pathway-level patterns. This analysis pipeline helps to explore the commonality or uniqueness of these lists at the level of pathways or biological processes from different but relevant biological systems using a combination of statistical enrichment measurements, pathway-level pattern extraction, and graphical display of the relationships of genes and their associated pathways as Gene-Term Association Networks (GTANs) within the WPS platform. As a proof of concept, we have used the new method to analyze many datasets from our collaborators as well as some public microarray datasets. This tool provides a new pathway-level analysis scheme for integrative and comparative analysis of data derived from different but relevant systems. The tool is freely available as a Pathway Pattern Extraction Pipeline implemented in our existing software package WPS, which can be obtained at http://www.abcc.ncifcrf.gov/wps/wps_index.php.
Patthi, Basavaraj; Kumar, Jishnu Krishna; Singla, Ashish; Gupta, Ritu; Prasad, Monika; Ali, Irfan; Dhama, Kuldeep; Niraj, Lav Kumar
2017-09-01
Oral diseases are a pandemic cause of morbidity with widespread geographic distribution. This technology-based era has brought about easier knowledge transfer than the traditional dependency on information obtained from family doctors. Hence, harvesting this system of trends can aid in oral disease quantification. To conduct an exploratory analysis of the changes in internet search volumes of oral diseases by using Google Trends© (GT©). GT© was utilized to provide real-world facts based on search terms related to categories, interest by region and interest over time. The time period chosen was from January 2004 to December 2016. Five different search terms were explored and compared based on the highest relative search volumes, along with comma-separated value files to obtain an insight into the highest search traffic. The search volume measured over this time span showed the term "Dental caries" to be the most searched in Japan, "Gingivitis" in Jordan, "Oral Cancer" in Taiwan, "No Teeth" in Australia, "HIV symptoms" in Zimbabwe, "Broken Teeth" in the United Kingdom, "Cleft palate" in the Philippines and "Toothache" in Indonesia, and the comparison of the top five searched terms showed "Gingivitis" to have the highest search volume. The results from the present study offer insight into a competent tool that can analyse and compare oral diseases over time. The trend research platform can be used with great acumen to follow emerging diseases and their drift across geographic populations. This tool can be utilized in forecasting, modulating marketing strategies and planning disability limitation techniques.
PanWeb: A web interface for pan-genomic analysis.
Pantoja, Yan; Pinheiro, Kenny; Veras, Allan; Araújo, Fabrício; Lopes de Sousa, Ailton; Guimarães, Luis Carlos; Silva, Artur; Ramos, Rommel T J
2017-01-01
With increased production of genomic data since the advent of next-generation sequencing (NGS), there has been a need to develop new bioinformatics tools and areas, such as comparative genomics. In comparative genomics, the genetic material of an organism is directly compared to that of another organism to better understand biological species. Moreover, the exponentially growing number of deposited prokaryote genomes has enabled the investigation of several genomic characteristics that are intrinsic to certain species. Thus, a new approach to comparative genomics, termed pan-genomics, was developed. In pan-genomics, various organisms of the same species or genus are compared. Currently, there are many tools that can perform pan-genomic analyses, such as PGAP (Pan-Genome Analysis Pipeline), Panseq (Pan-Genome Sequence Analysis Program) and PGAT (Prokaryotic Genome Analysis Tool). Among these software tools, PGAP was developed in the Perl scripting language and its reliance on UNIX platform terminals and its requirement for an extensive parameterized command line can become a problem for users without previous computational knowledge. Thus, the aim of this study was to develop a web application, known as PanWeb, that serves as a graphical interface for PGAP. In addition, using the output files of the PGAP pipeline, the application generates graphics using custom-developed scripts in the R programming language. PanWeb is freely available at http://www.computationalbiology.ufpa.br/panweb.
Neu, Thomas R; Kuhlicke, Ute
2017-02-10
Microbial biofilm systems are defined as interface-associated microorganisms embedded in a self-produced matrix. The extracellular matrix represents a continuous challenge in terms of characterization and analysis. The tools applied in more detailed studies comprise extraction/chemical analysis, molecular characterization, and visualisation using various techniques. Imaging by laser microscopy has become a standard tool for biofilm analysis, and, in combination with fluorescently labelled lectins, the glycoconjugates of the matrix can be assessed. By employing this approach, a wide range of pure-culture biofilms from different habitats was examined using the commercially available lectins. From the results, a binary barcode pattern of lectin binding can be generated. Furthermore, the results can be fine-tuned and transferred into a heat map according to signal intensity. The lectin barcode approach is suggested as a useful tool for investigating biofilm matrix characteristics and dynamics at various levels, e.g. bacterial cell surfaces, adhesive footprints, individual microcolonies, and the gross biofilm or bio-aggregate. Hence fluorescence lectin bar-coding (FLBC) serves as a basis for a subsequent tailor-made fluorescence lectin-binding analysis (FLBA) of a particular biofilm. So far, the lectin approach represents the only tool for in situ characterization of the glycoconjugate makeup of biofilm systems. Furthermore, lectin staining lends itself to combination with other fluorescence techniques in order to correlate it with cellular biofilm constituents in general and glycoconjugate producers in particular.
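No thresholds or signal values are given in the text; as a schematic of how a binary lectin barcode and an intensity heat map could be derived from a lectin-by-biofilm signal matrix, consider the following sketch with synthetic intensities and an arbitrary binding threshold.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
lectins = [f"lectin_{i}" for i in range(10)]
biofilms = [f"strain_{j}" for j in range(6)]
signal = rng.gamma(shape=2.0, scale=1.0, size=(len(biofilms), len(lectins)))  # synthetic intensities

barcode = (signal > 1.5).astype(int)      # binary barcode: lectin binds / does not bind
print(barcode)

plt.imshow(signal, aspect="auto", cmap="viridis")   # heat map fine-tuned by signal intensity
plt.yticks(range(len(biofilms)), biofilms)
plt.xticks(range(len(lectins)), lectins, rotation=90)
plt.colorbar(label="fluorescence intensity (a.u.)")
plt.tight_layout()
plt.show()
```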
BATCH-GE: Batch analysis of Next-Generation Sequencing data for genome editing assessment
Boel, Annekatrien; Steyaert, Woutert; De Rocker, Nina; Menten, Björn; Callewaert, Bert; De Paepe, Anne; Coucke, Paul; Willaert, Andy
2016-01-01
Targeted mutagenesis by the CRISPR/Cas9 system is currently revolutionizing genetics. The ease of this technique has enabled genome engineering in-vitro and in a range of model organisms and has pushed experimental dimensions to unprecedented proportions. Due to its tremendous progress in terms of speed, read length, throughput and cost, Next-Generation Sequencing (NGS) has been increasingly used for the analysis of CRISPR/Cas9 genome editing experiments. However, the current tools for genome editing assessment lack flexibility and fall short in the analysis of large amounts of NGS data. Therefore, we designed BATCH-GE, an easy-to-use bioinformatics tool for batch analysis of NGS-generated genome editing data, available from https://github.com/WouterSteyaert/BATCH-GE.git. BATCH-GE detects and reports indel mutations and other precise genome editing events and calculates the corresponding mutagenesis efficiencies for a large number of samples in parallel. Furthermore, this new tool provides flexibility by allowing the user to adapt a number of input variables. The performance of BATCH-GE was evaluated in two genome editing experiments, aiming to generate knock-out and knock-in zebrafish mutants. This tool will not only contribute to the evaluation of CRISPR/Cas9-based experiments, but will be of use in any genome editing experiment and has the ability to analyze data from every organism with a sequenced genome. PMID:27461955
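BATCH-GE's indel-calling logic is not described in the abstract; the sketch below illustrates only the final mutagenesis-efficiency calculation, i.e. the fraction of aligned reads carrying an edit at the target site, using hypothetical per-sample read counts.

```python
def editing_efficiency(edited_reads: int, total_reads: int) -> float:
    """Fraction of reads at the target locus that carry an indel or precise edit."""
    if total_reads == 0:
        raise ValueError("no reads aligned to the target region")
    return edited_reads / total_reads

# Hypothetical per-sample read counts from an NGS amplicon experiment.
samples = {"embryo_1": (4_210, 5_000), "embryo_2": (1_150, 4_800)}
for name, (edited, total) in samples.items():
    print(f"{name}: {100 * editing_efficiency(edited, total):.1f}% editing efficiency")
```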
Automated shock detection and analysis algorithm for space weather application
NASA Astrophysics Data System (ADS)
Vorotnikov, Vasiliy S.; Smith, Charles W.; Hu, Qiang; Szabo, Adam; Skoug, Ruth M.; Cohen, Christina M. S.
2008-03-01
Space weather applications have grown steadily as real-time data have become increasingly available. Numerous industrial applications have arisen, with safeguarding of the power distribution grids being of particular interest. NASA uses short-term and long-term space weather predictions in its launch facilities. Researchers studying ionospheric, auroral, and magnetospheric disturbances use real-time space weather services to determine launch times. Commercial airlines, communication companies, and the military use space weather measurements to manage their resources and activities. As the effects of solar transients upon the Earth's environment and society grow with the increasing complexity of technology, better tools are needed to monitor and evaluate the characteristics of the incoming disturbances. There is a need for automated shock detection and analysis methods that are applicable to in situ measurements upstream of the Earth. Such tools can provide advance warning of approaching disturbances that have significant space weather impacts. Knowledge of the shock strength and speed can also provide insight into the nature of the approaching solar transient prior to arrival at the magnetopause. We report on efforts to develop a tool that can find and analyze shocks in interplanetary plasma data without operator intervention. This method will run with sufficient speed to be a practical space weather tool, providing useful shock information within 1 min of the necessary data reaching the ground. The ability to run without human intervention frees space weather operators to perform other vital services. We describe ways of handling upstream data that minimize the frequency of false positive alerts while providing the most complete description of approaching disturbances that is reasonably possible.
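The operational detection criteria are not given in the abstract; a toy sketch of how a fast-forward shock candidate might be flagged from simultaneous jumps in solar wind speed and density (synthetic data, invented thresholds) is shown below.

```python
import numpy as np

rng = np.random.default_rng(5)
minutes = 600
speed = 400 + rng.normal(0, 5, minutes)       # km/s
density = 5 + rng.normal(0, 0.3, minutes)     # cm^-3
speed[300:] += 80                             # synthetic shock: simultaneous jumps at t = 300 min
density[300:] *= 1.8

window = 10                                   # minutes on each side of a candidate point
for i in range(window, minutes - window):
    dv = speed[i:i + window].mean() - speed[i - window:i].mean()
    ratio = density[i:i + window].mean() / density[i - window:i].mean()
    if dv > 40 and ratio > 1.3:               # invented fast-forward shock thresholds
        print(f"shock candidate at t = {i} min: dV = {dv:.0f} km/s, n2/n1 = {ratio:.2f}")
        break
```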
Droughts and water scarcity: facing challenges
NASA Astrophysics Data System (ADS)
Pereira, Luis S.
2014-05-01
Water scarcity characterizes large portions of the world, particularly the Mediterranean area. It is due to natural causes - climate aridity, which is permanent, and droughts, which are temporary - and to human causes - long-term desertification and short-term water shortages. Droughts aggravate water scarcity. Knowledge of all these processes is well developed, but management tools are still insufficient, as are the tools required to support appropriate planning and management. In particular, new approaches and tools for assessing the related impacts on agriculture and other economic and social activities are required. Droughts occur in all climates but their characteristics differ largely among regions in terms of frequency, duration and intensity. Research has already produced a large number of tools that allow appropriate monitoring of drought occurrence and intensity, including the dynamics of drought occurrence and time evolution. Advances in drought prediction are already available, but we are still far from knowing when a drought will start, how it will evolve and when it will dissipate. New developments using teleconnections and GCMs are being considered. Climate change is a fact. Are drought occurrence and severity changing with global change? Opinions are divided on this subject, since the driving factors and processes are varied and the tools for the corresponding analysis are also various. In particular, weather data series are often too short for obtaining appropriate answers. In a domain where research is producing improved knowledge and innovative approaches, research nevertheless faces a variety of challenges. The main ones, dealt with in this keynote, refer to concepts and definitions, the use of monitoring indices, the prediction of drought initiation and evolution, improved assessment of drought impacts, and the possible influence of climate change on drought occurrence and severity.
Stansfield, Claire; O'Mara-Eves, Alison; Thomas, James
2017-09-01
Using text mining to aid the development of database search strings for topics described by diverse terminology has potential benefits for systematic reviews; however, methods and tools for accomplishing this are poorly covered in the research methods literature. We briefly review the literature on applications of text mining for search term development for systematic reviewing. We found that the tools can be used in 5 overarching ways: improving the precision of searches; identifying search terms to improve search sensitivity; aiding the translation of search strategies across databases; searching and screening within an integrated system; and developing objectively derived search strategies. Using a case study and selected examples, we then reflect on the utility of certain technologies (term frequency-inverse document frequency and Termine, term frequency, and clustering) in improving the precision and sensitivity of searches. Challenges in using these tools are discussed. The utility of these tools is influenced by the different capabilities of the tools, the way the tools are used, and the text that is analysed. Increased awareness of how the tools perform facilitates the further development of methods for their use in systematic reviews. Copyright © 2017 John Wiley & Sons, Ltd.
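Purely as an illustration of the term frequency-inverse document frequency scoring mentioned above, the sketch below ranks candidate search terms from a handful of invented abstracts with scikit-learn; it stands in for, and is not, any of the tools reviewed.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy corpus standing in for records retrieved by a preliminary search.
abstracts = [
    "community engagement intervention for diabetes self management",
    "self management education improves glycaemic control in diabetes",
    "qualitative study of community pharmacy services",
]

vectorizer = TfidfVectorizer(stop_words="english", ngram_range=(1, 2))
tfidf = vectorizer.fit_transform(abstracts)

# Average TF-IDF weight of each term across the corpus, highest first.
scores = tfidf.toarray().mean(axis=0)
terms = vectorizer.get_feature_names_out()
for term, score in sorted(zip(terms, scores), key=lambda x: -x[1])[:8]:
    print(f"{term:30s} {score:.3f}")
```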
Automated Video Analysis of Non-verbal Communication in a Medical Setting.
Hart, Yuval; Czerniak, Efrat; Karnieli-Miller, Orit; Mayo, Avraham E; Ziv, Amitai; Biegon, Anat; Citron, Atay; Alon, Uri
2016-01-01
Non-verbal communication plays a significant role in establishing good rapport between physicians and patients and may influence aspects of patient health outcomes. It is therefore important to analyze non-verbal communication in medical settings. Current approaches to measuring non-verbal interactions in medicine employ coding by human raters. Such tools are labor intensive and hence limit the scale of possible studies. Here, we present an automated video analysis tool for non-verbal interactions in a medical setting. We test the tool using videos of subjects who interact with an actor portraying a doctor. The actor interviews the subjects following one of two scripted scenarios: in one scenario the actor shows minimal engagement with the subject; the second scenario includes active listening by the doctor and attentiveness to the subject. We analyze the cross-correlation in total kinetic energy of the two people in the dyad, and also characterize the frequency spectrum of their motion. We find large differences in interpersonal motion synchrony and entrainment between the two performance scenarios. The active listening scenario shows more synchrony and more symmetric followership than the other scenario. Moreover, the active listening scenario shows more high-frequency motion, termed jitter, that has recently been suggested to be a marker of followership. The present approach may be useful for analyzing physician-patient interactions in terms of synchrony and dominance in a range of medical settings.
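The paper's video-based motion-energy extraction is not reproduced here; the sketch below illustrates only the subsequent step of cross-correlating two kinetic-energy time series and reading off the lag of maximum correlation, using synthetic signals (correlation_lags requires SciPy 1.6 or later).

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(11)
fps, seconds = 25, 60
t = np.arange(fps * seconds)
doctor = np.abs(rng.normal(0, 1, t.size)) + 0.5 * np.sin(2 * np.pi * t / 200)
patient = np.roll(doctor, 30) + 0.3 * rng.normal(0, 1, t.size)   # follows with ~1.2 s lag

doctor -= doctor.mean()
patient -= patient.mean()
xcorr = signal.correlate(patient, doctor, mode="full")
lags = signal.correlation_lags(patient.size, doctor.size, mode="full")
best_lag = lags[np.argmax(xcorr)]
print(f"peak cross-correlation at lag {best_lag} frames ({best_lag / fps:.2f} s)")
```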
EMR continuance usage intention of healthcare professionals.
Sayyah Gilani, Mina; Iranmanesh, Mohammad; Nikbin, Davoud; Zailani, Suhaiza
2017-03-01
Electronic medical records (EMRs) have been proven to be effective tools for improving the safety and quality of healthcare despite their relatively low usage rate in hospitals. The long-term development of EMRs depends on their continued use by healthcare professionals. In this study, technology continuance theory (TCT) was used to evaluate the short-term and long-term continuance acceptance of EMRs among healthcare professionals. Data were gathered by surveying 195 medical professionals in Iran. The data were analyzed using the partial least squares (PLS) technique. The analysis showed that the TCT provides a deep understanding of user continuance intention toward EMRs. In addition, the findings illustrated that the determinants of continuance intention vary between short-term and long-term users. The theoretical and practical implications of the study are discussed.
Material Protection, Accounting, and Control Technologies (MPACT): Modeling and Simulation Roadmap
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cipiti, Benjamin; Dunn, Timothy; Durbin, Samual
The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy's (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal. This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, a distributed test bed that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling. To aid in framing its long-term goal, during FY16 a modeling and simulation roadmap is being developed for three major areas of investigation: (1) radiation transport and sensors, (2) process and chemical models, and (3) shock physics and assessments. For each area, current modeling approaches are described, and gaps and needs are identified.
Flight Mechanics of the Entry, Descent and Landing of the ExoMars Mission
NASA Technical Reports Server (NTRS)
HayaRamos, Rodrigo; Boneti, Davide
2007-01-01
ExoMars is ESA's current mission to the planet Mars. A high-mobility rover and a fixed station will be deployed on the surface of Mars. This paper addresses the flight mechanics of the Entry, Descent and Landing (EDL) phases used for the mission analysis and design of the baseline and back-up scenarios of the mission. The EDL concept is based on a ballistic entry, followed by a descent under parachutes and inflatable devices (airbags) for landing. The mission analysis and design are driven by the flexibility required in terms of landing site and arrival dates and by the very stringent requirement in terms of landing accuracy. The challenging requirements currently imposed on the mission call for innovative analysis and design techniques to support system design trade-offs that cope with the variability in entry conditions. The concept of the Global Entry Corridor has been conceived, designed, implemented and successfully validated as a key tool to provide a global picture of the mission capabilities in terms of landing site reachability.
Deriving Earth Science Data Analytics Requirements
NASA Technical Reports Server (NTRS)
Kempler, Steven J.
2015-01-01
Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and which require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of requirements for data analytics tools and techniques that would support specific ESDA type goals. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.
POLYVIEW-MM: web-based platform for animation and analysis of molecular simulations
Porollo, Aleksey; Meller, Jaroslaw
2010-01-01
Molecular simulations offer important mechanistic and functional clues in studies of proteins and other macromolecules. However, interpreting the results of such simulations increasingly requires tools that can combine information from multiple structural databases and other web resources, and provide highly integrated and versatile analysis tools. Here, we present a new web server that integrates high-quality animation of molecular motion (MM) with structural and functional analysis of macromolecules. The new tool, dubbed POLYVIEW-MM, enables animation of trajectories generated by molecular dynamics and related simulation techniques, as well as visualization of alternative conformers, e.g. obtained as a result of protein structure prediction methods or small molecule docking. To facilitate structural analysis, POLYVIEW-MM combines interactive view and analysis of conformational changes using Jmol and its tailored extensions, publication quality animation using PyMol, and customizable 2D summary plots that provide an overview of MM, e.g. in terms of changes in secondary structure states and relative solvent accessibility of individual residues in proteins. Furthermore, POLYVIEW-MM integrates visualization with various structural annotations, including automated mapping of known interaction sites from structural homologs, mapping of cavities and ligand binding sites, transmembrane regions and protein domains. URL: http://polyview.cchmc.org/conform.html. PMID:20504857
Single-Cell RNA-Sequencing: Assessment of Differential Expression Analysis Methods.
Dal Molin, Alessandra; Baruzzo, Giacomo; Di Camillo, Barbara
2017-01-01
The sequencing of the transcriptomes of single cells, or single-cell RNA-sequencing, has now become the dominant technology for the identification of novel cell types and for the study of stochastic gene expression. In recent years, various tools for analyzing single-cell RNA-sequencing data have been proposed, many of them with the purpose of performing differential expression analysis. In this work, we compare four different tools for single-cell RNA-sequencing differential expression, together with two popular methods originally developed for the analysis of bulk RNA-sequencing data but largely applied to single-cell data. We discuss results obtained on two real and one synthetic dataset, along with considerations about the perspectives of single-cell differential expression analysis. In particular, we explore the methods' performance in four different scenarios, mimicking different unimodal or bimodal distributions of the data, as is characteristic of single-cell transcriptomics. We observed marked differences between the selected methods in terms of precision and recall, the number of detected differentially expressed genes and the overall performance. Globally, the results obtained in our study suggest that it is difficult to identify a single best-performing tool and that efforts are needed to improve methodologies for single-cell RNA-sequencing data analysis and to achieve better accuracy of results.
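The comparison metrics themselves are standard; as a minimal illustration of how the precision and recall of a differential expression caller are computed against a known truth set, the sketch below uses synthetic gene labels.

```python
def precision_recall(called: set, truth: set):
    """Precision and recall of a set of genes called DE against the true DE set."""
    tp = len(called & truth)
    precision = tp / len(called) if called else 0.0
    recall = tp / len(truth) if truth else 0.0
    return precision, recall

truth = {f"gene_{i}" for i in range(0, 100)}   # synthetic true DE genes
called = {f"gene_{i}" for i in range(0, 80)} | {f"gene_{i}" for i in range(500, 540)}

p, r = precision_recall(called, truth)
print(f"precision = {p:.2f}, recall = {r:.2f}")
```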
Wittich, Walter; Höbler, Fiona; Jarry, Jonathan; McGilton, Katherine S
2018-01-26
This study aimed to identify screening tools, technologies and strategies that vision and hearing care specialists recommend to front-line healthcare professionals for the screening of older adults in long-term care homes who have dementia. An environmental scan of healthcare professionals took place via telephone interviews between December 2015 and March 2016. All interviews were audio recorded, transcribed, proofed for accuracy, and their contents thematically analysed by two members of the research team. A convenience sample of 11 professionals from across Canada specialising in the fields of vision and hearing healthcare and technology for older adults with cognitive impairment were included in the study. As part of a larger mixed-methods project, this qualitative study used semistructured interviews and their subsequent content analysis. Following a two-step content analysis of interview data, coded citations were grouped into three main categories: (1) barriers, (2) facilitators and (3) tools and strategies that do or do not work for sensory screening of older adults with dementia. We report on the information offered by participants within each of these themes, along with a summary of tools and strategies that work for screening older adults with dementia. Recommendations from sensory specialists to nurses working in long-term care included the need for improved interprofessional communication and collaboration, as well as flexibility, additional time and strategic use of clinical intuition and ingenuity. These suggestions at times contradicted the realities of service provision or the need for standardised and validated measures. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Applying AI tools to operational space environmental analysis
NASA Technical Reports Server (NTRS)
Krajnak, Mike; Jesse, Lisa; Mucks, John
1995-01-01
The U.S. Air Force and National Oceanic Atmospheric Agency (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.
methylPipe and compEpiTools: a suite of R packages for the integrative analysis of epigenomics data.
Kishore, Kamal; de Pretis, Stefano; Lister, Ryan; Morelli, Marco J; Bianchi, Valerio; Amati, Bruno; Ecker, Joseph R; Pelizzola, Mattia
2015-09-29
Numerous methods are available to profile several epigenetic marks, providing data with different genome coverage and resolution. Large epigenomic datasets are then generated, and often combined with other high-throughput data, including RNA-seq, ChIP-seq for transcription factor (TF) binding and DNase-seq experiments. Despite the numerous computational tools covering specific steps in the analysis of large-scale epigenomics data, comprehensive software solutions for their integrative analysis are still missing. Multiple tools must be identified and combined to jointly analyze histone marks, TF binding and other -omics data together with DNA methylation data, complicating the analysis of these data and their integration with publicly available datasets. To overcome the burden of integrating various data types with multiple tools, we developed two companion R/Bioconductor packages. The former, methylPipe, is tailored to the analysis of high- or low-resolution DNA methylomes in several species, accommodating (hydroxy-)methyl-cytosines in both CpG and non-CpG sequence context. The analysis of multiple whole-genome bisulfite sequencing experiments is supported, while maintaining the ability to integrate targeted genomic data. The latter, compEpiTools, seamlessly incorporates the results obtained with methylPipe and supports their integration with other epigenomics data. It provides a number of methods to score these data in regions of interest, leading to the identification of enhancers, lncRNAs, and RNAPII stalling/elongation dynamics. Moreover, it allows a fast and comprehensive annotation of the resulting genomic regions, and the association of the corresponding genes with non-redundant Gene Ontology terms. Finally, the package includes a flexible method based on heatmaps for the integration of various data types, combining annotation tracks with continuous or categorical data tracks. methylPipe and compEpiTools provide a comprehensive Bioconductor-compliant solution for the integrative analysis of heterogeneous epigenomics data. These packages give biologists with minimal R skills a complete toolkit that facilitates the analysis of their own data, and they accelerate the analyses performed by more experienced bioinformaticians.
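As a rough illustration of the "score these data in regions of interest" step, the following minimal Python sketch averages a per-base signal (for example, per-cytosine methylation) over genomic intervals; it is not the packages' R/Bioconductor API, and the data structures are assumptions made for the example.

```python
# Minimal sketch (not the methylPipe/compEpiTools API): score regions of
# interest by averaging a base-level signal such as per-cytosine methylation.
def mean_signal_per_region(regions, signal):
    """regions: list of (chrom, start, end); signal: dict chrom -> {pos: value}."""
    scores = []
    for chrom, start, end in regions:
        values = [v for pos, v in signal.get(chrom, {}).items() if start <= pos < end]
        scores.append(sum(values) / len(values) if values else None)
    return scores

signal = {"chr1": {100: 0.9, 150: 0.8, 400: 0.1}}
print(mean_signal_per_region([("chr1", 50, 200), ("chr1", 300, 500)], signal))
# -> [0.85, 0.1]
```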
NASA Astrophysics Data System (ADS)
Shprits, Y.; Zhelavskaya, I. S.; Kellerman, A. C.; Spasojevic, M.; Kondrashov, D. A.; Ghil, M.; Aseev, N.; Castillo Tibocha, A. M.; Cervantes Villa, J. S.; Kletzing, C.; Kurth, W. S.
2017-12-01
The increasing volume of satellite measurements requires the deployment of new tools that can utilize such vast amounts of data. Satellite measurements are usually limited to a single location in space, which complicates the data analysis geared towards reproducing the global state of the space environment. In this study we show how measurements can be combined by means of data assimilation and how machine learning can help analyze large amounts of data and develop global models that are trained on single-point measurements. Data Assimilation: Manual analysis of the satellite measurements is a challenging task, while automated analysis is complicated by the fact that measurements are given at various locations in space, have different instrumental errors, and often vary by orders of magnitude. We show results of the long-term reanalysis of radiation belt measurements along with fully operational real-time predictions using the data-assimilative VERB code. Machine Learning: We present an application of machine learning tools for the analysis of NASA Van Allen Probes upper-hybrid frequency measurements. Using the obtained data set we train a new global predictive neural network. The results for the Van Allen Probes-based neural network are compared with historical IMAGE satellite observations. We also show examples of predictions of geomagnetic indices using neural networks. Combination of machine learning and data assimilation: We discuss how data assimilation tools and machine learning tools can be combined so that physics-based insight into the dynamics of the particular system can be combined with empirical knowledge of its non-linear behavior.
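The combination described above can be sketched generically: a data-assimilation update blends a forecast with an observation according to their uncertainties, while a data-driven model supplies the forecast. The sketch below uses a scalar Kalman update and a least-squares fit as a stand-in for a neural network; it is illustrative only and is not the VERB code or the Van Allen Probes network.

```python
# Illustrative sketch only: a scalar Kalman-filter update (data assimilation)
# plus a trivial empirical predictor standing in for a neural network.
import numpy as np

def kalman_update(x_forecast, p_forecast, obs, obs_var):
    """Blend a model forecast with an observation, weighting by uncertainties."""
    gain = p_forecast / (p_forecast + obs_var)
    x_analysis = x_forecast + gain * (obs - x_forecast)
    p_analysis = (1.0 - gain) * p_forecast
    return x_analysis, p_analysis

# "Machine learning" surrogate: fit a simple empirical model to past
# single-point measurements (here a least-squares line via numpy).
t = np.arange(10.0)
measured = 2.0 * t + np.random.normal(scale=0.5, size=t.size)
coeffs = np.polyfit(t, measured, deg=1)
ml_forecast = np.polyval(coeffs, 10.0)          # empirical prediction for t = 10

x_a, p_a = kalman_update(ml_forecast, p_forecast=1.0, obs=20.4, obs_var=0.25)
print(x_a, p_a)
```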
Accidents at Work and Costs Analysis: A Field Study in a Large Italian Company
BATTAGLIA, Massimo; FREY, Marco; PASSETTI, Emilio
2014-01-01
Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents, even though cost information may facilitate preventive occupational health and safety management initiatives. The field study, carried out in a large Italian company, illustrates technical and organisational aspects associated with the implementation of an accident costs analysis tool. The results indicate that the implementation (and the use) of the tool requires a considerable commitment by the company, that accident costs analysis should serve to reinforce the importance of health and safety prevention and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894
Failure Modes and Effects Analysis (FMEA): A Bibliography
NASA Technical Reports Server (NTRS)
2000-01-01
Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, which helps managers understand vulnerabilities of systems, as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index or the major subject terms.
GenomicTools: a computational platform for developing high-throughput analytics in genomics.
Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo
2012-01-15
Recent advances in sequencing technology have resulted in the dramatic increase of sequencing data, which, in turn, requires efficient management of computational resources, such as computing time, memory requirements as well as prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
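One of the "mathematical operations between sets of genomic regions" mentioned above is interval intersection. The following plain-Python sketch shows the idea for sorted intervals on a single chromosome; it is for illustration only and is not the GenomicTools C++ implementation or its command-line interface.

```python
# Sketch of a region-set operation like those GenomicTools provides
# (here: intersection of two sorted interval sets on one chromosome).
def intersect(a, b):
    """a, b: sorted lists of (start, end) half-open intervals; returns overlaps."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        start = max(a[i][0], b[j][0])
        end = min(a[i][1], b[j][1])
        if start < end:
            out.append((start, end))
        # advance the interval that ends first
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return out

print(intersect([(0, 100), (200, 300)], [(50, 250)]))  # [(50, 100), (200, 250)]
```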
Multi-criteria analysis for PM10 planning
NASA Astrophysics Data System (ADS)
Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa
To implement sound air quality policies, Regulatory Agencies require tools to evaluate outcomes and costs associated with different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
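A toy example can make the multi-objective formulation concrete: enumerate candidate emission reductions, evaluate two objectives, and keep the non-dominated scenarios. The response and cost functions below are arbitrary assumptions for illustration, not the identified source-receptor models.

```python
# Toy illustration of the multi-objective idea: evaluate two hypothetical
# objectives (a PM10 indicator and an abatement cost) over candidate emission
# reductions and keep the non-dominated scenarios.
candidates = [i / 10 for i in range(11)]       # fraction of precursor emissions removed

def pm10_indicator(r):                         # hypothetical nonlinear air-quality response
    return 40.0 * (1.0 - r) ** 1.5

def abatement_cost(r):                         # hypothetical convex technology cost
    return 100.0 * r ** 2

scenarios = [(r, pm10_indicator(r), abatement_cost(r)) for r in candidates]

def non_dominated(points):
    keep = []
    for p in points:
        dominated = any(q[1] <= p[1] and q[2] <= p[2] and (q[1] < p[1] or q[2] < p[2])
                        for q in points)
        if not dominated:
            keep.append(p)
    return keep

for r, pm, cost in non_dominated(scenarios):
    print(f"reduction={r:.1f}  PM10={pm:5.1f}  cost={cost:6.1f}")
```

In this toy setting every candidate lies on the trade-off curve, which is exactly the kind of efficient frontier handed to the decision maker.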
MatchingTools: A Python library for symbolic effective field theory calculations
NASA Astrophysics Data System (ADS)
Criado, Juan C.
2018-06-01
MatchingTools is a Python library for doing symbolic calculations in effective field theory. It provides the tools to construct general models by defining their field content and their interaction Lagrangian. Once a model is given, the heavy particles can be integrated out at the tree level to obtain an effective Lagrangian in which only the light particles appear. After integration, some of the terms of the resulting Lagrangian might not be independent. MatchingTools contains functions for transforming these terms to rewrite them in terms of any chosen set of operators.
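Tree-level integrating out can be illustrated with a toy model worked symbolically in SymPy: a heavy real scalar S of mass M coupled to a light field phi with strength g, with the heavy kinetic term neglected at leading order in 1/M. This is a sketch of the technique only, not the MatchingTools API, and the model is an assumption made for the example.

```python
# Minimal SymPy sketch of tree-level matching for a toy model (not the
# MatchingTools API): solve the static equation of motion of the heavy
# scalar S and substitute back to get the leading effective operator.
from sympy import symbols, diff, solve, simplify

phi, S, M, g = symbols("phi S M g", positive=True)

L = -M**2 * S**2 / 2 + g * S * phi**2        # heavy-field potential + interaction
S_eom = solve(diff(L, S), S)[0]              # S = g*phi**2 / M**2 at leading order
L_eff = simplify(L.subs(S, S_eom))           # -> g**2*phi**4 / (2*M**2)
print(S_eom, L_eff)
```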
Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.
2005-01-01
Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous crossproduct terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.
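For reference, the upper linear fractional transformation that such LFT models rely on is commonly written as follows; this is the standard textbook form in which the uncertainty block is pulled out of the nominal interconnection, not an expression taken from the paper or its software.

```latex
% Standard upper-LFT form: uncertainty \Delta closes the upper loop of the
% partitioned interconnection matrix M.
F_u(M,\Delta) \;=\; M_{22} + M_{21}\,\Delta\,(I - M_{11}\Delta)^{-1} M_{12},
\qquad
M = \begin{bmatrix} M_{11} & M_{12} \\ M_{21} & M_{22} \end{bmatrix}.
```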
VisualUrText: A Text Analytics Tool for Unstructured Textual Data
NASA Astrophysics Data System (ADS)
Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.
2018-05-01
The growing amount of unstructured text on the Internet is tremendous. Text repositories come from Web 2.0, business intelligence and social networking applications. It is also believed that 80-90% of future data growth will be in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text Mining is a well-known technique for discovering interesting patterns, trends and non-trivial knowledge in massive unstructured text data. Text Mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing and analyzing unstructured text data and visualizing the cleaned text data in multiple forms such as a Document Term Matrix (DTM), Frequency Graph, Network Analysis Graph, Word Cloud and Dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analyses.
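A Document Term Matrix of the kind listed above can be built in a few lines with scikit-learn; the sketch below illustrates the output format generically and is not the VisualUrText tool itself.

```python
# Minimal sketch of building a Document Term Matrix (DTM) and term
# frequencies with scikit-learn.
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "unstructured text data from social networking applications",
    "text mining discovers patterns and trends in text data",
]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)                      # sparse documents x terms matrix

terms = vectorizer.get_feature_names_out()
frequencies = dtm.sum(axis=0).A1                          # total count of each term
print(sorted(zip(terms, frequencies), key=lambda t: -t[1]))
```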
Global Nanotribology Research Output (1996–2010): A Scientometric Analysis
Elango, Bakthavachalam; Rajendran, Periyaswamy; Bornmann, Lutz
2013-01-01
This study aims to assess the nanotribology research output at global level using scientometric tools. The SCOPUS database was used to retrieve records related to the nanotribology research for the period 1996–2010. Publications were counted on a fractional basis. The level of collaboration and its citation impact were examined. The performance of the most productive countries, institutes and most preferred journals is assessed. Various visualization tools such as the Sci2 tool and Ucinet were employed. The USA ranked top in terms of number of publications, citations per paper and h-index, while Switzerland published a higher percentage of international collaborative papers. The most productive institution was Tsinghua University followed by Ohio State University and Lanzhou Institute of Chemical Physics, CAS. The most preferred journals were Tribology Letters, Wear and Journal of Japanese Society of Tribologists. The result of author keywords analysis reveals that Molecular Dynamics, MEMS, Hard Disk and Diamond like Carbon are major research topics. PMID:24339900
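The fractional counting used above has a simple arithmetic definition: a paper with authors from k countries contributes 1/k of a publication to each country. The records in the sketch below are illustrative placeholders, not data from the study.

```python
# Sketch of fractional counting: a paper co-authored from k countries
# contributes 1/k to each country's publication count.
from collections import defaultdict

papers = [
    {"countries": ["USA"]},
    {"countries": ["USA", "Switzerland"]},
    {"countries": ["China", "USA", "Germany"]},
]

counts = defaultdict(float)
for paper in papers:
    share = 1.0 / len(paper["countries"])
    for country in paper["countries"]:
        counts[country] += share

print(dict(counts))   # USA: 1.0 + 0.5 + 1/3 ≈ 1.83, etc.
```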
Comparative utility of LANDSAT-1 and Skylab data for coastal wetland mapping and ecological studies
NASA Technical Reports Server (NTRS)
Anderson, R.; Alsid, L.; Carter, V.
1975-01-01
Skylab 190-A photography and LANDSAT-1 analog data have been analyzed to determine coastal wetland mapping potential as a near term substitute for aircraft data and as a long term monitoring tool. The level of detail and accuracy of each was compared. Skylab data provides more accurate classification of wetland types, better delineation of freshwater marshes and more detailed analysis of drainage patterns. LANDSAT-1 analog data is useful for general classification, boundary definition and monitoring of human impact in wetlands.
1991-06-01
500 remaining machine tool firms had less than twenty employees each. Manufacturing rationalization was negligible; product specialization and combined...terms, most of the fuselage. Over 130 Japanese employees were dispatched to Seattle during the 767 development, even though the agreement was for...through, and consider not just the name plates, but who's involved in sharing the risk--and the rewards, if any--you recite lots of other names: M.T.U
NASA Astrophysics Data System (ADS)
Pérez-Moreno, Javier; Clays, Koen
The generalized Thomas-Kuhn sum rules are used to characterize the nonlinear optical response of organic chromophores in terms of fundamental parameters that can be measured experimentally. The nonlinear optical performance of organic molecules is evaluated from the combination of hyper-Rayleigh scattering measurements and the analysis in terms of the fundamental limits. Different strategies for the enhancement of nonlinear optical behavior at the molecular and supramolecular level are evaluated and new paradigms for the design of more efficient nonlinear optical molecules are proposed and investigated.
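For orientation, the generalized Thomas-Kuhn sum rules referred to here are usually written in the following form (quoted from memory and stated only as the standard textbook relation, not as a result of this work), for an N-electron system with energies E_n, electron mass m_e, and transition moments x_{mn} = <m|x|n>:

```latex
\sum_{n}\left[E_n - \tfrac{1}{2}\left(E_m + E_p\right)\right] x_{mn}\, x_{np}
  \;=\; \frac{\hbar^{2} N}{2 m_e}\,\delta_{mp}.
```

Relations of this type are what constrain the transition moments and hence set the fundamental limits against which the measured hyperpolarizabilities are compared.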
Building Information Model: advantages, tools and adoption efficiency
NASA Astrophysics Data System (ADS)
Abakumov, R. G.; Naumov, A. E.
2018-03-01
The paper expands the definition and essence of Building Information Modeling. It describes the content and effects of applying Information Modeling at different stages of a real property item's life cycle. An analysis of long-term and short-term advantages is given. The authors include an analytical review of the Revit software package in comparison with Autodesk with respect to features, advantages and disadvantages, cost and payback cutoff. A prognostic calculation is given for the efficiency of adopting the Building Information Modeling technology, with examples of its successful adoption in Russia and worldwide.
Collection Evaluation and Evolution
NASA Technical Reports Server (NTRS)
Habermann, Ted; Kozimor, John
2017-01-01
We will review metadata evaluation tools and share results from our most recent CMR analysis. We will demonstrate results using Google spreadsheets and present new results in terms of number of records that include specific content. We will show evolution of UMM-compliance over time and also show results of comparing various CMR collections (NASA, non-NASA, and SciOps).
A Geometric Approach to Fair Division
ERIC Educational Resources Information Center
Barbanel, Julius
2010-01-01
We wish to divide a cake among some collection of people (who may have very different notions of the comparative value of pieces of cake) in a way that is both "fair" and "efficient." We explore the meaning of these terms, introduce two geometric tools to aid our analysis, and present a proof (due to Dietrich Weller) that establishes the existence…
ERIC Educational Resources Information Center
Broderick, Jane Tingle; Hong, Seong Bock
2011-01-01
The Cycle of Inquiry (COI) is a tool for emergent curriculum planning and for professional development of early childhood teachers and teacher education students. The COI includes a sequence of five organizational forms connecting analysis of documentation data with intentional planning for long-term emergent inquiry inspired by the Reggio Emilia…
NASA Astrophysics Data System (ADS)
Donato, M. B.; Milasi, M.; Vitanza, C.
2010-09-01
An existence result of a Walrasian equilibrium for an integrated model of exchange, consumption and production is obtained. The equilibrium model is characterized in terms of a suitable generalized quasi-variational inequality; so the existence result comes from an original technique which takes into account tools of convex and set-valued analysis.
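A generalized quasi-variational inequality of the kind mentioned is usually stated as below; this is the standard abstract form (the paper's exact formulation may differ), with K a set-valued constraint map and F a set-valued operator.

```latex
% Standard statement of a generalized quasi-variational inequality.
\text{find } x^{*} \in K(x^{*}) \text{ and } u^{*} \in F(x^{*}) \text{ such that }\quad
\langle u^{*}, \, y - x^{*} \rangle \;\ge\; 0 \qquad \forall\, y \in K(x^{*}).
```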
System Dynamics in Medical Education: A Tool for Life
ERIC Educational Resources Information Center
Rubin, David M.; Richards, Christopher L.; Keene, Penelope A. C.; Paiker, Janice E.; Gray, A. Rosemary T.; Herron, Robyn F. R.; Russell, Megan J.; Wigdorowitz, Brian
2012-01-01
A course in system dynamics has been included in the first year of our university's six-year medical curriculum. System Dynamics is a discipline that facilitates the modelling, simulation and analysis of a wide range of problems in terms of two fundamental concepts viz. rates and levels. Many topics encountered in the medical school curriculum,…
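The "rates and levels" idea lends itself to a very small worked sketch: a level (stock) is updated by integrating its net rate (flow) over time. The drug-accumulation setting and all numbers below are illustrative assumptions, not course material.

```python
# Tiny sketch of the system-dynamics "rates and levels" concept: a level is
# updated by Euler-integrating its inflow minus outflow.
def simulate(level0, dose_rate, elimination_fraction, dt, steps):
    level, history = level0, []
    for _ in range(steps):
        inflow = dose_rate                         # rate adding to the level
        outflow = elimination_fraction * level     # rate draining the level
        level += (inflow - outflow) * dt           # Euler integration of the stock
        history.append(level)
    return history

print(simulate(level0=0.0, dose_rate=10.0, elimination_fraction=0.2, dt=1.0, steps=5))
```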
An Open Source Agenda for Research Linking Text and Image Content Features.
ERIC Educational Resources Information Center
Goodrum, Abby A.; Rorvig, Mark E.; Jeong, Ki-Tai; Suresh, Chitturi
2001-01-01
Proposes methods to utilize image primitives to support term assignment for image classification. Proposes to release code for image analysis in a common tool set for other researchers to use. Of particular focus is the expansion of work by researchers in image indexing to include image content-based feature extraction capabilities in their work.…
MEMORE: An Environment for Data Collection and Analysis on the Use of Computers in Education
ERIC Educational Resources Information Center
Goldschmidt, Ronaldo; Fernandes de Souza, Isabel; Norris, Monica; Passos, Claudio; Ferlin, Claudia; Cavalcanti, Maria Claudia; Soares, Jorge
2016-01-01
The use of computers as teaching and learning tools plays a particularly important role in modern society. Within this scenario, Brazil launched its own version of the "One Laptop per Child" (OLPC) program, and this initiative, termed PROUCA, has already distributed hundreds of low-cost laptops for educational purposes in many Brazilian…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reis, Chuck; Nelson, Eric; Armer, James
The purpose of this playbook and accompanying spreadsheets is to generalize the detailed CBP analysis and to put tools in the hands of experienced refrigeration designers to evaluate multiple applications of refrigeration waste heat reclaim across the United States. Supermarkets with large portfolios of similar buildings can use these tools to assess the impact of large-scale implementation of heat reclaim systems. In addition, the playbook provides best practices for implementing heat reclaim systems to achieve the best long-term performance possible. It includes guidance on operations and maintenance as well as measurement and verification.
Annotare—a tool for annotating high-throughput biomedical investigations and resulting data
Shankar, Ravi; Parkinson, Helen; Burdett, Tony; Hastings, Emma; Liu, Junmin; Miller, Michael; Srinivasa, Rashmi; White, Joseph; Brazma, Alvis; Sherlock, Gavin; Stoeckert, Christian J.; Ball, Catherine A.
2010-01-01
Summary: Computational methods in molecular biology will increasingly depend on standards-based annotations that describe biological experiments in an unambiguous manner. Annotare is a software tool that enables biologists to easily annotate their high-throughput experiments, biomaterials and data in a standards-compliant way that facilitates meaningful search and analysis. Availability and Implementation: Annotare is available from http://code.google.com/p/annotare/ under the terms of the open-source MIT License (http://www.opensource.org/licenses/mit-license.php). It has been tested on both Mac and Windows. Contact: rshankar@stanford.edu PMID:20733062
Antenna pattern interpolation by generalized Whittaker reconstruction
NASA Astrophysics Data System (ADS)
Tjonneland, K.; Lindley, A.; Balling, P.
Whittaker reconstruction is an effective tool for interpolation of band limited data. Whittaker originally introduced the interpolation formula termed the cardinal function as the function that represents a set of equispaced samples but has no periodic components of period less than twice the sample spacing. It appears that its use for reflector antennas was pioneered in France. The method is now a useful tool in the analysis and design of multiple beam reflector antenna systems. A good description of the method has been given by Bucci et al. This paper discusses some problems encountered with the method and their solution.
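The cardinal-series interpolation described above can be sketched directly with numpy's normalized sinc function, f(x) ≈ Σ f(nT) sinc((x − nT)/T); the sketch is a generic illustration of Whittaker reconstruction, not the antenna-analysis software discussed in the abstract.

```python
# Sketch of Whittaker (cardinal-series) reconstruction from equispaced
# samples. numpy.sinc is the normalized sinc sin(pi x)/(pi x).
import numpy as np

def whittaker_reconstruct(samples, T, x):
    """samples: f(nT) for n = 0..N-1; T: sample spacing; x: evaluation points."""
    n = np.arange(len(samples))
    # each row of the kernel matrix is sinc((x_i - n*T)/T)
    kernel = np.sinc((x[:, None] - n[None, :] * T) / T)
    return kernel @ samples

T = 0.5
samples = np.sin(2 * np.pi * 0.4 * np.arange(8) * T)   # band-limited test signal
x = np.linspace(0, 3.5, 15)
print(whittaker_reconstruct(samples, T, x))
```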
Evaluation of a New Digital Automated Glycemic Pattern Detection Tool.
Comellas, María José; Albiñana, Emma; Artes, Maite; Corcoy, Rosa; Fernández-García, Diego; García-Alemán, Jorge; García-Cuartero, Beatriz; González, Cintia; Rivero, María Teresa; Casamira, Núria; Weissmann, Jörg
2017-11-01
Blood glucose meters are reliable devices for data collection, providing electronic logs of historical data that are easier to interpret than handwritten logbooks. Automated tools to analyze these data are necessary to facilitate glucose pattern detection and support treatment adjustment. A broad variety of such tools has emerged, often without formal evaluation. The aim of this study was to compare eDetecta, a new automated pattern detection tool, to nonautomated pattern analysis in terms of time investment, data interpretation, and clinical utility, with the overarching goal of identifying, early in the tool's development and implementation, areas for improvement and potential safety risks. Multicenter web-based evaluation in which 37 endocrinologists were asked to assess glycemic patterns of 4 real reports (2 continuous subcutaneous insulin infusion [CSII] and 2 multiple daily injection [MDI]). Endocrinologist and eDetecta analyses were compared on time spent to analyze each report and agreement on the presence or absence of defined patterns. The eDetecta module markedly reduced the time needed to analyze each case relative to manual review of the emminens eConecta reports (CSII: 18 min; MDI: 12.5 min). Agreement between endocrinologists and eDetecta varied depending on the patterns, with a high level of agreement in patterns of glycemic variability. Further analysis of areas of low agreement led to identifying where the algorithms used could be improved to optimize trend pattern identification. eDetecta was a useful tool for glycemic pattern detection, helping clinicians to reduce the time required to review emminens eConecta glycemic reports. No safety risks were identified during the study.
Kennerly, Susan; Heggestad, Eric D; Myers, Haley; Yap, Tracey L
2015-07-29
An effective workforce performing within the context of a positive cultural environment is central to a healthcare organization's ability to achieve quality outcomes. The Nursing Culture Assessment Tool (NCAT) provides nurses with a valid and reliable tool that captures the general aspects of nursing culture. This study extends earlier work confirming the tool's construct validity and dimensionality by standardizing the scoring approach and establishing norm-referenced scoring. Scoring standardization provides a reliable point of comparison for NCAT users. NCAT assessments support nursing's ability to evaluate nursing culture, use results to shape the culture into one that supports change, and advance nursing's best practices and care outcomes. Registered nurses, licensed practical nurses, and certified nursing assistants from 54 long-term care facilities in Kentucky, Nevada, North Carolina, and Oregon were surveyed. Confirmatory factor analysis yielded six first-order factors forming the NCAT's subscales (Expectations, Behaviors, Teamwork, Communication, Satisfaction, Commitment) (Comparative Fit Index 0.93) and a second-order factor, the Total Culture Score. Aggregated facility-level comparisons of observed group variance with expected random variance, using rwg(J) statistics, are presented. Normative scores, cumulative rank percentages, and guidance on how the NCAT can be used in implementing planned change are provided.
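For readers unfamiliar with the rwg(J) agreement index, it is usually computed in the James-Demaree-Wolf form shown below (quoted from memory as the conventional definition; the study may use a variant), where s̄²_xj is the mean observed item variance across the J items and σ²_EU is the variance of a uniform null distribution over A response options.

```latex
r_{wg(J)} \;=\;
  \frac{J\left(1 - \bar{s}^{2}_{xj}/\sigma^{2}_{EU}\right)}
       {J\left(1 - \bar{s}^{2}_{xj}/\sigma^{2}_{EU}\right) + \bar{s}^{2}_{xj}/\sigma^{2}_{EU}},
\qquad
\sigma^{2}_{EU} = \frac{A^{2}-1}{12}.
```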
Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean
2017-03-01
In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised, and so it is in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools inspired by systems thinking and complexity theories, and the methodological lessons learned from their application. These tools were used in a case study; detailed results of this study are being prepared for publication in additional articles. Applying a complexity 'lens', the case study examines the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implications for the subject under study, and the existing tools associated with those theories which inspired us in the design of the data collection and analysis process. The tools and their application processes are presented in the Results section, followed in the Discussion section by a critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, each tool bringing a different and complementary perspective on the system.
Kovacs Burns, Katharina; Bellows, Mandy; Eigenseher, Carol; Gallivan, Jennifer
2014-04-15
Extensive literature exists on public involvement or engagement, but what actual tools or guides exist that are practical, tested and easy to use specifically for initiating and implementing patient and family engagement, is uncertain. No comprehensive review and synthesis of general international published or grey literature on this specific topic was found. A systematic scoping review of published and grey literature is, therefore, appropriate for searching through the vast general engagement literature to identify 'patient/family engagement' tools and guides applicable in health organization decision-making, such as within Alberta Health Services in Alberta, Canada. This latter organization requested this search and review to inform the contents of a patient engagement resource kit for patients, providers and leaders. Search terms related to 'patient engagement', tools, guides, education and infrastructure or resources, were applied to published literature databases and grey literature search engines. Grey literature also included United States, Australia and Europe where most known public engagement practices exist, and Canada as the location for this study. Inclusion and exclusion criteria were set, and include: English documents referencing 'patient engagement' with specific criteria, and published between 1995 and 2011. For document analysis and synthesis, document analysis worksheets were used by three reviewers for the selected 224 published and 193 grey literature documents. Inter-rater reliability was ensured for the final reviews and syntheses of 76 published and 193 grey documents. Seven key themes emerged from the literature synthesis analysis, and were identified for patient, provider and/or leader groups. Articles/items within each theme were clustered under main topic areas of 'tools', 'education' and 'infrastructure'. The synthesis and findings in the literature include 15 different terms and definitions for 'patient engagement', 17 different engagement models, numerous barriers and benefits, and 34 toolkits for various patient engagement and evaluation initiatives. Patient engagement is very complex. This scoping review for patient/family engagement tools and guides is a good start for a resource inventory and can guide the content development of a patient engagement resource kit to be used by patients/families, healthcare providers and administrators.
Knickpoint finder: A software tool that improves neotectonic analysis
NASA Astrophysics Data System (ADS)
Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.
2015-03-01
This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software used a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
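The core operation, flagging abrupt slope changes along a downstream elevation profile, can be sketched with numpy as below. This is an illustration of the general idea only, not the ArcGIS/Python Knickpoint Finder tool, and the threshold is an arbitrary assumption.

```python
# Illustrative sketch: flag points along a downstream elevation profile where
# the local channel slope changes abruptly, a simple proxy for knickpoints.
import numpy as np

def find_knickpoints(distance, elevation, slope_change_threshold=0.03):
    slope = np.gradient(elevation, distance)          # local channel slope
    slope_change = np.abs(np.diff(slope))             # change between neighbours
    return np.where(slope_change > slope_change_threshold)[0] + 1

distance = np.linspace(0, 1000, 101)                   # metres downstream
elevation = np.piecewise(distance, [distance < 500, distance >= 500],
                         [lambda d: 300 - 0.02 * d, lambda d: 290 - 0.10 * (d - 500)])
print(find_knickpoints(distance, elevation))           # indices near the 500 m break
```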
How to support forest management in a world of change: results of some regional studies.
Fürst, C; Lorz, C; Vacik, H; Potocic, N; Makeschin, F
2010-12-01
This article presents results of several studies in Middle, Eastern and Southeastern Europe on needs and application areas, desirable attributes and marketing potentials of forest management support tools. By comparing present and future application areas, a trend from sectoral planning towards landscape planning and integration of multiple stakeholder needs is emerging. In terms of conflicts, where management support tools might provide benefit, no clear tendencies were found, neither on local nor on regional level. In contrast, on national and European levels, support of the implementation of laws, directives, and regulations was found to be of highest importance. Following the user-requirements analysis, electronic tools supporting communication are preferred against paper-based instruments. The users identified most important attributes of optimized management support tools: (i) a broad accessibility for all users at any time should be guaranteed, (ii) the possibility to integrate iteratively experiences from case studies and from regional experts into the knowledge base (learning system) should be given, and (iii) a self-explanatory user interface is demanded, which is also suitable for users rather inexperienced with electronic tools. However, a market potential analysis revealed that the willingness to pay for management tools is very limited, although the participants specified realistic ranges of maximal amounts of money, which would be invested if the products were suitable and payment inevitable. To bridge the discrepancy between unwillingness to pay and the need to use management support tools, optimized financing or cooperation models between practice and science must be found.
A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.
Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy
2016-12-01
Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
Continuous Coordination Tools and their Evaluation
NASA Astrophysics Data System (ADS)
Sarma, Anita; Al-Ani, Ban; Trainer, Erik; Silva Filho, Roberto S.; da Silva, Isabella A.; Redmiles, David; van der Hoek, André
This chapter discusses a set of co-ordination tools (the Continuous Co-ordination (CC) tool suite that includes Ariadne, Workspace Activity Viewer (WAV), Lighthouse, Palantír, and YANCEES) and details of our evaluation framework for these tools. Specifically, we discuss how we assessed the usefulness and the usability of these tools within the context of a predefined evaluation framework called
A Qualitative Evaluation of Clinical Audit in UK Dental Foundation Training.
Thornley, Peter; Quinn, Alyson
2017-11-10
Clinical Audit (CA) has been recognized as a useful tool for improving service delivery, clinical governance, and the education and performance of the dental team. This study develops the discussion by investigating its use as an educational tool within UK Dental Foundation Training (DFT). The aim was to investigate the views of Foundation Dentists (FDs) and Training Programme Directors (TPDs) on the CA module in their FD training schemes, to provide insight and recommendations for those supervising and undertaking CA. A literature review was conducted, followed by a qualitative research methodology using group interviews. The interviews were transcribed and thematically analyzed using NVIVO, a Computer-Assisted Qualitative Data Analysis tool. CA was found to be a useful tool for teaching management and professionalism and can bring some improvement to clinical practice, but TPDs have doubts about the long-term effects on service delivery. The role of the Educational Supervisor (ES) is discussed and recommendations are given for those supervising and conducting CA.
The BioCyc collection of microbial genomes and metabolic pathways.
Karp, Peter D; Billington, Richard; Caspi, Ron; Fulcher, Carol A; Latendresse, Mario; Kothari, Anamika; Keseler, Ingrid M; Krummenacker, Markus; Midford, Peter E; Ong, Quang; Ong, Wai Kit; Paley, Suzanne M; Subhraveti, Pallavi
2017-08-17
BioCyc.org is a microbial genome Web portal that combines thousands of genomes with additional information inferred by computer programs, imported from other databases and curated from the biomedical literature by biologist curators. BioCyc also provides an extensive range of query tools, visualization services and analysis software. Recent advances in BioCyc include an expansion in the content of BioCyc in terms of both the number of genomes and the types of information available for each genome; an expansion in the amount of curated content within BioCyc; and new developments in the BioCyc software tools including redesigned gene/protein pages and metabolite pages; new search tools; a new sequence-alignment tool; a new tool for visualizing groups of related metabolic pathways; and a facility called SmartTables, which enables biologists to perform analyses that previously would have required a programmer's assistance. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Williams, Kent E; Voigt, Jeffrey R
2004-01-01
The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.
Computer applications in scientific balloon quality control
NASA Astrophysics Data System (ADS)
Seely, Loren G.; Smith, Michael S.
Seal defects and seal tensile strength are primary determinants of product quality in scientific balloon manufacturing; they therefore require a unit of quality measure. Inexpensive and powerful data-processing tools make it possible to analyze quality trends in these products. The results of one such analysis are presented here in graphic form for use on the production floor. Software descriptions and their sample outputs are presented, together with a summary of the overall and long-term effects of these methods on product quality.
A managerial accounting analysis of hospital costs.
Frank, W G
1976-01-01
Variance analysis, an accounting technique, is applied to an eight-component model of hospital costs to determine the contribution each component makes to cost increases. The method is illustrated by application to data on total costs from 1950 to 1973 for all U.S. nongovernmental not-for-profit short-term general hospitals. The costs of a single hospital are analyzed and compared to the group costs. The potential uses and limitations of the method as a planning and research tool are discussed. PMID:965233
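The basic mechanics of variance analysis can be shown with a generic two-component decomposition of a cost change into price and volume effects; the figures below are hypothetical and the example is not the paper's eight-component hospital model.

```python
# Generic illustration of variance analysis: a cost change is split into a
# price component and a volume component.
def variance_components(price0, qty0, price1, qty1):
    price_variance = (price1 - price0) * qty1      # effect of price change at new volume
    volume_variance = (qty1 - qty0) * price0       # effect of volume change at old price
    total = price1 * qty1 - price0 * qty0
    return price_variance, volume_variance, total

pv, vv, total = variance_components(price0=100.0, qty0=1000, price1=120.0, qty1=1100)
print(pv, vv, total, pv + vv == total)             # 22000 + 10000 = 32000
```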
Flis, Ivan; van Eck, Nees Jan
2017-07-20
This study investigated the structure of psychological literature as represented by a corpus of 676,393 articles in the period from 1950 to 1999. The corpus was extracted from 1,269 journals indexed by PsycINFO. The data in our analysis consisted of the relevant terms mined from the titles and abstracts of all of the articles in the corpus. Based on the co-occurrences of these terms, we developed a series of chronological visualizations using a bibliometric software tool called VOSviewer. These visualizations produced a stable structure through the 5 decades under analysis, and this structure was analyzed as a data-mined proxy for the disciplinary formation of scientific psychology in the second part of the 20th century. Considering the stable structure uncovered by our term co-occurrence analysis and its visualization, we discuss it in the context of Lee Cronbach's "Two Disciplines of Scientific Psychology" (1957) and conventional history of 20th-century psychology's disciplinary formation and history of methods. Our aim was to provide a comprehensive digital humanities perspective on the large-scale structural development of research in English-language psychology from 1950 to 1999. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
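Term co-occurrence counting of the kind that underlies such maps is straightforward to sketch: two terms co-occur when they appear in the same title or abstract. The sketch below is generic Python, not VOSviewer, and the documents are placeholders.

```python
# Sketch of term co-occurrence counting for a small corpus.
from itertools import combinations
from collections import Counter

documents = [
    {"reinforcement", "learning", "memory"},
    {"psychometrics", "factor", "analysis"},
    {"memory", "learning", "analysis"},
]

cooccurrence = Counter()
for terms in documents:
    for a, b in combinations(sorted(terms), 2):
        cooccurrence[(a, b)] += 1

print(cooccurrence.most_common(5))
```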
Dust control effectiveness of drywall sanding tools.
Young-Corbett, Deborah E; Nussbaum, Maury A
2009-07-01
In this laboratory study, four drywall sanding tools were evaluated in terms of dust generation rates in the respirable and thoracic size classes. In a repeated measures study design, 16 participants performed simulated drywall finishing tasks with each of four tools: (1) ventilated sander, (2) pole sander, (3) block sander, and (4) wet sponge. Dependent variables of interest were thoracic and respirable breathing zone dust concentrations. Analysis by Friedman's Test revealed that the ventilated drywall sanding tool produced significantly less dust, of both size classes, than did the other three tools. The pole and wet sanders produced significantly less dust of both size classes than did the block sander. The block sander, the most commonly used tool in drywall finishing operations, produced significantly more dust of both size classes than did the other three tools. When compared with the block sander, the other tools offer substantial dust reduction. The ventilated tool reduced respirable concentrations by 88% and thoracic concentrations by 85%. The pole sander reduced respirable concentrations by 58% and thoracic by 50%. The wet sander produced reductions of 60% and 47% in the respirable and thoracic classes, respectively. Wet sponge sanders and pole sanders are effective at reducing breathing-zone dust concentrations; however, based on its superior dust control effectiveness, the ventilated sander is the recommended tool for drywall finishing operations.
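The repeated-measures comparison reported above can be reproduced in outline with scipy's Friedman test; the dust concentrations in the sketch are placeholders, not the study's data.

```python
# Sketch of a Friedman test on repeated-measures dust concentrations for four
# tools (placeholder values, one row per tool, one column per participant).
from scipy.stats import friedmanchisquare

ventilated = [0.12, 0.10, 0.15, 0.11]
pole       = [0.40, 0.38, 0.45, 0.42]
block      = [1.00, 0.95, 1.10, 1.05]
wet        = [0.39, 0.36, 0.44, 0.41]

statistic, p_value = friedmanchisquare(ventilated, pole, block, wet)
print(statistic, p_value)
```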
Developing person-centred analysis of harm in a paediatric hospital: a quality improvement report.
Lachman, Peter; Linkson, Lynette; Evans, Trish; Clausen, Henning; Hothi, Daljit
2015-05-01
The provision of safe care is complex and difficult to achieve. Awareness of what happens in real time is one of the ways to develop a safe system within a culture of safety. At Great Ormond Street Hospital, we developed and tested a tool specifically designed for patients and families to report harm, with the aim of raising awareness and opportunities for staff to continually improve and provide safe care. Over a 10-month period, we developed processes to report harm. We used the Model for Improvement and multiple Plan, Do, Study, Act cycles for testing. We measured changes using culture surveys as well as analysis of the reports. The tool was tested in different formats and moved from a provider centric to a person-centred tool analysed in real time. An independent person working with the families was best placed to support reporting. Immediate feedback to families was managed by senior staff, and provided the opportunity for clarification, transparency and apologies. Feedback to staff provided learning opportunities. Improvements in culture climate and staff reporting were noted in the short term. The integration of patient involvement in safety monitoring systems is essential to achieve safety. The high number of newly identified 'near-misses' and 'critical incidents' by families demonstrated an underestimation of potentially harmful events. This testing and introduction of a self-reporting, real-time bedside tool has led to active engagement with families and patients and raised situation awareness. We believe that this will lead to improved and safer care in the longer term. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Global Search Trends of Oral Problems using Google Trends from 2004 to 2016: An Exploratory Analysis
Patthi, Basavaraj; Singla, Ashish; Gupta, Ritu; Prasad, Monika; Ali, Irfan; Dhama, Kuldeep; Niraj, Lav Kumar
2017-01-01
Introduction Oral diseases are pandemic cause of morbidity with widespread geographic distribution. This technology based era has brought about easy knowledge transfer than traditional dependency on information obtained from family doctors. Hence, harvesting this system of trends can aid in oral disease quantification. Aim To conduct an exploratory analysis of the changes in internet search volumes of oral diseases by using Google Trends© (GT©). Materials and Methods GT© were utilized to provide real world facts based on search terms related to categories, interest by region and interest over time. Time period chosen was from January 2004 to December 2016. Five different search terms were explored and compared based on the highest relative search volumes along with comma separated value files to obtain an insight into highest search traffic. Results The search volume measured over the time span noted the term “Dental caries” to be the most searched in Japan, “Gingivitis” in Jordan, “Oral Cancer” in Taiwan, “No Teeth” in Australia, “HIV symptoms” in Zimbabwe, “Broken Teeth” in United Kingdom, “Cleft palate” in Philippines, “Toothache” in Indonesia and the comparison of top five searched terms provided the “Gingivitis” with highest search volume. Conclusion The results from the present study offers an insight into a competent tool that can analyse and compare oral diseases over time. The trend research platform can be used on emerging diseases and their drift in geographic population with great acumen. This tool can be utilized in forecasting, modulating marketing strategies and planning disability limitation techniques. PMID:29207825
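The comma-separated value files mentioned above can be compared with a few lines of pandas. The file name, the two-line header and the column layout in this sketch are assumptions about a typical Google Trends export, not data from the study.

```python
# Sketch of comparing relative search volumes exported from Google Trends as CSV.
import pandas as pd

df = pd.read_csv("multiTimeline.csv", skiprows=2)      # assumed: export has a 2-line header
df = df.rename(columns={df.columns[0]: "month"})

terms = [c for c in df.columns if c != "month"]
print(df[terms].mean().sort_values(ascending=False))   # average relative volume per term
print(df.loc[df[terms[0]].idxmax(), "month"])          # month of peak interest for first term
```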
ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.
Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa
2016-05-01
The automated analysis of indirect immunofluorescence images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of the intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. The automatization of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the work-flow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the patterns of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies of fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparison with some of the most representative state-of-the-art works. Unlike most other works in the recent literature, ANAlyte aims at the automatization of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of intensity and fluorescent pattern with accuracy better than or comparable to state-of-the-art techniques, even when such techniques are run on manually segmented cells. Hence, ANAlyte can be proposed as a valid solution to the problem of ANA testing automatization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
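The three-stage structure (intensity classification, cell segmentation, per-cell pattern classification) can be sketched as a toy pipeline with placeholder rules; this is a schematic illustration only and is not the ANAlyte implementation or its classifiers.

```python
# Schematic sketch of a three-stage IIF pipeline with placeholder rules.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def classify_intensity(image):
    # placeholder rule: call the slide "positive" if global contrast is high
    return "positive" if image.std() > 0.1 else "negative/weak"

def segment_cells(image):
    mask = image > threshold_otsu(image)          # global threshold
    return label(mask)                            # connected components as cells

def classify_pattern(image, labels):
    # placeholder per-cell feature: mean intensity of each segmented cell
    cell_means = [r.mean_intensity for r in regionprops(labels, intensity_image=image)]
    return "homogeneous" if np.std(cell_means) < 0.05 else "speckled"

image = np.random.rand(64, 64)
labels = segment_cells(image)
print(classify_intensity(image), classify_pattern(image, labels))
```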
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Dean N.
The climate and weather data science community gathered December 3–5, 2013, at Lawrence Livermore National Laboratory, in Livermore, California, for the third annual Earth System Grid Federation (ESGF) and Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Meeting, which was hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT are global collaborations designed to develop a new generation of open-source software infrastructure that provides distributed access and analysis to observed and simulated data from the climate and weather communities. The tools and infrastructure developed under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change, while the F2F meetings help to build a stronger climate and weather data science community and stronger federated software infrastructure. The 2013 F2F meeting determined requirements for existing and impending national and international community projects; enhancements needed for data distribution, analysis, and visualization infrastructure; and standards and resources needed for better collaborations.
Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS
NASA Astrophysics Data System (ADS)
Joshi, D. M.; Patel, H. K.
2015-10-01
Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives clear idea in deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant which leads to the design of an efficient productive plant.
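One common cross-check on such a liquefaction simulation is the ideal Linde-Hampson liquid yield obtained from an overall energy balance on the heat exchanger, Joule-Thompson valve and separator. The relation below is the standard textbook expression, quoted here for orientation rather than taken from the paper.

```latex
y \;=\; \frac{\dot{m}_{liq}}{\dot{m}} \;=\; \frac{h_{1} - h_{2}}{h_{1} - h_{f}},
```

where h_2 is the enthalpy of the compressed gas entering the heat exchanger, h_1 that of the low-pressure return gas at the same warm-end temperature, and h_f that of the saturated liquid withdrawn.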
Zakaria, Rasheed; Ellenbogen, Jonathan; Graham, Catherine; Pizer, Barry; Mallucci, Conor; Kumar, Ram
2013-08-01
Complications may occur following posterior fossa tumour surgery in children. Such complications are subjectively and inconsistently reported, even though they may have significant long-term behavioural and cognitive consequences for the child. This makes comparison of surgeons, programmes and treatments problematic. We have devised a causality tool for assessing whether an adverse event after surgery can be classified as a surgical complication using a series of simple questions, based on a tool used in assessing adverse drug reactions. This tool, which we have called the "Liverpool Neurosurgical Complication Causality Assessment Tool", was developed by reviewing a series of ten posterior fossa tumour cases with a panel of neurosurgery, neurology, oncology and neuropsychology specialists working in a multidisciplinary paediatric tumour treatment programme. We have demonstrated its use and hope that it may improve reliability between different assessors, both in evaluating the outcomes of existing programmes and treatments and in aiding trials that directly compare the effects of surgical and medical treatments.
Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Mike; Cipiti, Ben; Demuth, Scott Francis
2017-01-30
The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, distributed test bed, that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.
42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems
NASA Technical Reports Server (NTRS)
Stoneking, Eric
2018-01-01
Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 has been publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.
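To hint at the core computation an attitude simulator carries out, the minimal sketch below integrates Euler's rigid-body equations with a simple rate-damping torque; the inertia tensor, gain and integrator are assumptions for illustration and do not reflect 42's models or code.

```python
# Minimal rigid-body attitude-rate propagation sketch (not code from 42).
# Integrates Euler's equations  I*dw/dt = -w x (I*w) + T  with a rate-damping
# control torque, using a fixed-step explicit Euler integrator.
import numpy as np

I = np.diag([10.0, 12.0, 8.0])          # assumed inertia tensor, kg*m^2
I_inv = np.linalg.inv(I)
w = np.array([0.05, -0.02, 0.03])       # initial body rates, rad/s
Kd = 2.0                                 # assumed rate-damping gain
dt, t_end = 0.1, 60.0

for _ in range(int(t_end / dt)):
    torque = -Kd * w                     # simple detumbling control law
    w_dot = I_inv @ (torque - np.cross(w, I @ w))
    w = w + dt * w_dot                   # explicit Euler step

print("Body rates after 60 s of detumbling [rad/s]:", np.round(w, 5))
```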
Comparing Educational Tools Using Activity Theory: Clickers and Flashcards
NASA Astrophysics Data System (ADS)
Price, Edward; De Leone, Charles; Lasry, Nathaniel
2010-10-01
Physics educators and researchers have recently begun to distinguish between pedagogical approaches and the educational technologies that are used to implement them. For instance, peer instruction has been shown to be equally effective, in terms of student learning outcomes, when implemented with clickers or flashcards. Therefore, technological tools (clickers and flashcards) can be viewed as means to mediate pedagogical techniques (peer instruction or traditional instruction). In this paper, we use activity theory to examine peer instruction, with particular attention to the role of tools. This perspective helps clarify clickers' and flashcards' differences, similarities, impacts in the classroom, and utility to education researchers. Our analysis can suggest improvements and new uses. Finally, we propose activity theory as a useful approach in understanding and improving the use of technology in the physics classroom.
Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durkee, Joe W.; Cipiti, Ben; Demuth, Scott Francis
The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, distributed test bed, that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.
2016-12-01
...based on life expectancy and the TSP account selected. The TSP Growth and Annuity Element also estimates how taxes will increase at the time the service...the BRS. Subject terms: military retirement, blended retirement, HIGH-36, thrift savings plan, investment risk, retirement taxes, net present...
Radar Orbit Analysis Tool Using Least Squares Estimator
2007-09-01
...motion, it is necessary to determine how the dynamics between the two groups differ. One solution is to develop a model that can detect non...with just J2 and two-body terms was also addressed. Methodology: Solving the estimation problem required dividing the process into four stages
C3, A Command-line Catalog Cross-match Tool for Large Astrophysical Catalogs
NASA Astrophysics Data System (ADS)
Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio
2017-02-01
Modern Astrophysics is based on multi-wavelength data organized into large and heterogeneous catalogs. Hence, the need for efficient, reliable and scalable catalog cross-matching methods plays a crucial role in the era of the petabyte scale. Furthermore, multi-band data often have very different angular resolutions, requiring the highest generality of cross-matching features, mainly in terms of region shape and resolution. In this work we present C3 (Command-line Catalog Cross-match), a multi-platform application designed to efficiently cross-match massive catalogs. It is based on a multi-core parallel processing paradigm and conceived to be executed as a stand-alone command-line process or integrated within any generic data reduction/analysis pipeline, providing maximum flexibility to the end user in terms of portability, parameter configuration, catalog formats, angular resolution, region shapes, coordinate units and cross-matching types. Using real data extracted from public surveys, we discuss the cross-matching capabilities and computing time efficiency, also through a direct comparison with some publicly available tools chosen among the most used within the community and representative of different interface paradigms. We verified that the C3 tool has excellent capabilities to perform an efficient and reliable cross-matching between large data sets. Although the elliptical cross-match and the parametric handling of angular orientation and offset are known concepts in the astrophysical context, their availability in the presented command-line tool makes C3 competitive in the context of public astronomical tools.
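A toy version of the basic positional cross-match idea (nearest neighbour within an angular radius) is sketched below with synthetic coordinates; C3's elliptical regions, resolution handling and multi-core design are not reproduced.

```python
# Toy positional cross-match between two small catalogs using the haversine
# angular separation and a fixed matching radius, on synthetic coordinates.
import numpy as np

def ang_sep(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees between points given in degrees."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    d = np.sin((dec2 - dec1) / 2) ** 2 + \
        np.cos(dec1) * np.cos(dec2) * np.sin((ra2 - ra1) / 2) ** 2
    return np.degrees(2 * np.arcsin(np.sqrt(d)))

rng = np.random.default_rng(1)
cat_a = rng.uniform([150.0, -2.0], [151.0, -1.0], size=(50, 2))   # RA, Dec (deg)
cat_b = cat_a + rng.normal(scale=2e-4, size=cat_a.shape)          # perturbed copy
radius = 1.0 / 3600.0                                             # 1 arcsec

matches = []
for i, (ra, dec) in enumerate(cat_a):
    seps = ang_sep(ra, dec, cat_b[:, 0], cat_b[:, 1])
    j = int(np.argmin(seps))
    if seps[j] <= radius:
        matches.append((i, j, seps[j] * 3600.0))

print(f"{len(matches)} matches within 1 arcsec out of {len(cat_a)} sources")
```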
NASA Astrophysics Data System (ADS)
Pérez-Peña, J. V.; Al-Awabdeh, M.; Azañón, J. M.; Galve, J. P.; Booth-Rea, G.; Notti, D.
2017-07-01
The present-day wide availability of high-resolution Digital Elevation Models has improved tectonic geomorphology analyses in both their methodological aspects and their geological meaning. Analyses based on topographic profiles are valuable for exploring short- and long-term landscape responses to tectonic activity and climate changes, and swath and river longitudinal profiles are two of the most widely used analyses for this purpose. Most of these morphometric analyses are conducted in GIS software, which has become a standard tool for analyzing drainage network metrics. In this work we present two ArcGIS Add-Ins to automatically delineate swath and normalized river profiles. Both tools are programmed in Visual Basic .NET and use the ArcObjects library architecture to access vector and raster data directly. The SwathProfiler Add-In allows the topography within a swath or band to be analyzed by representing maximum, minimum and mean elevations, first and third quartiles, local relief and hypsometry. We have defined a new transverse hypsometric integral index (THi) that analyzes hypsometry along the swath and offers valuable information in this kind of graphic. The NProfiler Add-In allows longitudinal normalized river profiles and their related morphometric indexes, such as normalized concavity (CT), maximum concavity (Cmax) and length of maximum concavity (Lmax), to be represented. Both tools facilitate the spatial analysis of topography and drainage networks directly in a GIS environment such as ArcMap and provide graphical outputs. To illustrate how these tools work, we analyzed two study areas, the Sierra Alhamilla mountain range (Betic Cordillera, SE Spain) and the eastern margin of the Dead Sea (Jordan). The first study area has recently been studied from a morphotectonic perspective, and these new tools add value to the previous studies. The second study area has not been analyzed by quantitative tectonic geomorphology, and the results suggest a landscape in a transient state due to a continuous base-level fall produced by the formation of the Dead Sea basin.
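As an example of the kind of statistic such swath tools report, the sketch below evaluates the classic elevation-relief-ratio approximation of the hypsometric integral, HI = (mean - min)/(max - min), for successive transverse slices of a synthetic swath; the exact definition of THi in the SwathProfiler Add-In may differ.

```python
# Sketch of a swath-profile statistic: the elevation-relief-ratio approximation
# of the hypsometric integral, HI = (mean - min) / (max - min), evaluated for
# successive transverse slices (cross-sections) along a swath. Purely
# illustrative; the swath elevations are synthetic.
import numpy as np

rng = np.random.default_rng(2)
# Synthetic swath: rows = positions along the swath, columns = across the swath.
swath = 500.0 + 300.0 * rng.random((200, 50))   # elevations in metres (fake data)

def hypsometric_integral(z):
    zmin, zmax = z.min(), z.max()
    return (z.mean() - zmin) / (zmax - zmin)

# One HI value per transverse slice along the swath.
thi = np.array([hypsometric_integral(row) for row in swath])
print("Mean transverse HI along the swath:", round(float(thi.mean()), 3))
```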
Rosado, Luís; da Costa, José M Correia; Elias, Dirk; Cardoso, Jaime S
2017-09-21
Microscopy examination has been the pillar of malaria diagnosis, being the recommended procedure when its quality can be maintained. However, the need for trained personnel and adequate equipment limits its availability and accessibility in malaria-endemic areas. Rapid, accurate, accessible diagnostic tools are increasingly required as malaria control programs extend parasite-based diagnosis and the prevalence decreases. This paper presents an image processing and analysis methodology using supervised classification to assess the presence of malaria parasites and determine the species and life cycle stage in Giemsa-stained thin blood smears. The main differentiating factor is the use of microscopic images acquired exclusively with low-cost and accessible tools such as smartphones; a dataset of 566 images manually annotated by an experienced parasitologist was used. Eight different species-stage combinations were considered in this work, with an automatic detection performance ranging from 73.9% to 96.2% in terms of sensitivity and from 92.6% to 99.3% in terms of specificity. These promising results attest to the potential of using this approach as a valid alternative to conventional microscopy examination, with comparable detection performance and acceptable computational times.
da Costa, José M. Correia; Elias, Dirk
2017-01-01
Microscopy examination has been the pillar of malaria diagnosis, being the recommended procedure when its quality can be maintained. However, the need for trained personnel and adequate equipment limits its availability and accessibility in malaria-endemic areas. Rapid, accurate, accessible diagnostic tools are increasingly required as malaria control programs extend parasite-based diagnosis and the prevalence decreases. This paper presents an image processing and analysis methodology using supervised classification to assess the presence of malaria parasites and determine the species and life cycle stage in Giemsa-stained thin blood smears. The main differentiating factor is the use of microscopic images acquired exclusively with low-cost and accessible tools such as smartphones; a dataset of 566 images manually annotated by an experienced parasitologist was used. Eight different species-stage combinations were considered in this work, with an automatic detection performance ranging from 73.9% to 96.2% in terms of sensitivity and from 92.6% to 99.3% in terms of specificity. These promising results attest to the potential of using this approach as a valid alternative to conventional microscopy examination, with comparable detection performance and acceptable computational times. PMID:28934170
A Near-Term, High-Confidence Heavy Lift Launch Vehicle
NASA Technical Reports Server (NTRS)
Rothschild, William J.; Talay, Theodore A.
2009-01-01
The use of well understood, legacy elements of the Space Shuttle system could yield a near-term, high-confidence Heavy Lift Launch Vehicle that offers significant performance, reliability, schedule, risk, cost, and work force transition benefits. A side-mount Shuttle-Derived Vehicle (SDV) concept has been defined that has major improvements over previous Shuttle-C concepts. This SDV is shown to carry crew plus large logistics payloads to the ISS, support an operationally efficient and cost effective program of lunar exploration, and offer the potential to support commercial launch operations. This paper provides the latest data and estimates on the configurations, performance, concept of operations, reliability and safety, development schedule, risks, costs, and work force transition opportunities for this optimized side-mount SDV concept. The results presented in this paper have been based on established models and fully validated analysis tools used by the Space Shuttle Program, and are consistent with similar analysis tools commonly used throughout the aerospace industry. While these results serve as a factual basis for comparisons with other launch system architectures, no such comparisons are presented in this paper. The authors welcome comparisons between this optimized SDV and other Heavy Lift Launch Vehicle concepts.
The nature and evaluation of commercial expert system building tools, revision 1
NASA Technical Reports Server (NTRS)
Gevarter, William B.
1987-01-01
This memorandum reviews the factors that constitute an Expert System Building Tool (ESBT) and evaluates current tools in terms of these factors. Evaluation of these tools is based on their structure and their alternative forms of knowledge representation, inference mechanisms and developer end-user interfaces. Next, functional capabilities, such as diagnosis and design, are related to alternative forms of mechanization. The characteristics and capabilities of existing commercial tools are then reviewed in terms of these criteria.
Mira, José Joaquín; Vicente, Maria Asuncion; Fernandez, Cesar; Guilabert, Mercedes; Ferrús, Lena; Zavala, Elena; Silvestre, Carmen; Pérez-Pérez, Pastora
2016-01-01
Background Lack of time, lack of familiarity with root cause analysis, or suspicion that the reporting may result in negative consequences hinder involvement in the analysis of safety incidents and the search for preventive actions that can improve patient safety. Objective The aim was to develop a tool that enables hospital and primary care professionals to immediately analyze the causes of incidents and to propose and implement measures intended to prevent their recurrence. Methods The design of the Web-based tool (BACRA) considered research on the barriers to reporting, a review of incident analysis tools, and the experience of eight managers from the field of patient safety. BACRA’s design was improved in successive versions (BACRA v1.1 and BACRA v1.2) based on feedback from 86 middle managers. BACRA v1.1 was used by 13 frontline professionals to analyze safety incidents; 59 professionals used BACRA v1.2 and assessed the respective usefulness and ease of use of both versions. Results BACRA contains seven tabs that guide the user through the process of analyzing a safety incident and proposing preventive actions for similar future incidents. BACRA does not identify the person completing each analysis, since the password introduced to hide said analysis is linked only to the information concerning the incident and not to any personal data. The tool was used by 72 professionals from hospitals and primary care centers. BACRA v1.2 was assessed more favorably than BACRA v1.1, both in terms of its usefulness (z=2.2, P=.03) and its ease of use (z=3.0, P=.003). Conclusions BACRA helps to analyze safety incidents and to propose preventive actions. BACRA guarantees anonymity of the analysis and reduces the reluctance of professionals to carry out this task. BACRA is useful and easy to use. PMID:27678308
Carrillo, Irene; Mira, José Joaquín; Vicente, Maria Asuncion; Fernandez, Cesar; Guilabert, Mercedes; Ferrús, Lena; Zavala, Elena; Silvestre, Carmen; Pérez-Pérez, Pastora
2016-09-27
Lack of time, lack of familiarity with root cause analysis, or suspicion that the reporting may result in negative consequences hinder involvement in the analysis of safety incidents and the search for preventive actions that can improve patient safety. The aim was to develop a tool that enables hospital and primary care professionals to immediately analyze the causes of incidents and to propose and implement measures intended to prevent their recurrence. The design of the Web-based tool (BACRA) considered research on the barriers to reporting, a review of incident analysis tools, and the experience of eight managers from the field of patient safety. BACRA's design was improved in successive versions (BACRA v1.1 and BACRA v1.2) based on feedback from 86 middle managers. BACRA v1.1 was used by 13 frontline professionals to analyze safety incidents; 59 professionals used BACRA v1.2 and assessed the respective usefulness and ease of use of both versions. BACRA contains seven tabs that guide the user through the process of analyzing a safety incident and proposing preventive actions for similar future incidents. BACRA does not identify the person completing each analysis, since the password introduced to hide said analysis is linked only to the information concerning the incident and not to any personal data. The tool was used by 72 professionals from hospitals and primary care centers. BACRA v1.2 was assessed more favorably than BACRA v1.1, both in terms of its usefulness (z=2.2, P=.03) and its ease of use (z=3.0, P=.003). BACRA helps to analyze safety incidents and to propose preventive actions. BACRA guarantees anonymity of the analysis and reduces the reluctance of professionals to carry out this task. BACRA is useful and easy to use.
2013-01-01
Background The harmonization of European health systems brings with it a need for tools to allow the standardized collection of information about medical care. A common coding system and standards for the description of services are needed to allow local data to be incorporated into evidence-informed policy, and to permit equity and mobility to be assessed. The aim of this project has been to design such a classification and a related tool for the coding of services for Long Term Care (DESDE-LTC), based on the European Service Mapping Schedule (ESMS). Methods The development of DESDE-LTC followed an iterative process using nominal groups in 6 European countries. 54 researchers and stakeholders in health and social services contributed to this process. In order to classify services, we use the minimal organization unit or “Basic Stable Input of Care” (BSIC), coded by its principal function or “Main Type of Care” (MTC). The evaluation of the tool included an analysis of feasibility, consistency, ontology, inter-rater reliability, Boolean Factor Analysis, and a preliminary impact analysis (screening, scoping and appraisal). Results DESDE-LTC includes an alpha-numerical coding system, a glossary and an assessment instrument for mapping and counting LTC. It shows high feasibility, consistency, inter-rater reliability and face, content and construct validity. DESDE-LTC is ontologically consistent. It is regarded by experts as useful and relevant for evidence-informed decision making. Conclusion DESDE-LTC contributes to establishing a common terminology, taxonomy and coding of LTC services in a European context, and a standard procedure for data collection and international comparison. PMID:23768163
Attribute classification for generating GPR facies models
NASA Astrophysics Data System (ADS)
Tronicke, Jens; Allroggen, Niklas
2017-04-01
Ground-penetrating radar (GPR) is an established geophysical tool for exploring near-surface sedimentary environments. It has been successfully used, for example, to reconstruct past depositional environments, to investigate sedimentary processes, to aid hydrogeological investigations, and to assist in hydrocarbon reservoir analog studies. Interpreting such 2D/3D GPR data usually relies on a concept known as GPR facies analysis, in which GPR facies are defined as units composed of characteristic reflection patterns (in terms of reflection amplitude, continuity, geometry, and internal configuration). The resulting facies models are then interpreted in terms of depositional processes, sedimentary environments, litho-, and hydrofacies. Typically, such GPR facies analyses are implemented as a manual workflow, which is laborious and rather inefficient, especially for 3D data sets. In addition, such a subjective strategy bears the potential for inconsistency, because the outcome depends on the expertise and experience of the interpreter. In this presentation, we investigate the feasibility of delineating GPR facies in an objective and largely automated manner. Our proposed workflow relies on a three-step procedure. First, we calculate a variety of geometrical and physical attributes from processed 2D and 3D GPR data sets. Then, we analyze and evaluate this attribute data base (e.g., using statistical tools such as principal component analysis) to reduce its dimensionality and avoid redundant information. Finally, we integrate the reduced data base using tools such as composite imaging, cluster analysis, and neural networks. Using field examples acquired across different depositional environments, we demonstrate that the resulting 2D/3D facies models ease and improve the interpretation of GPR data. We conclude that our interpretation strategy allows GPR facies models to be generated in a consistent and largely automated manner and might be helpful in a variety of near-surface applications.
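A minimal sketch of the attribute-reduction and clustering stage of such a workflow is shown below, using standardization, PCA and k-means on a synthetic attribute matrix; the real workflow also includes composite imaging and neural networks, which are not shown.

```python
# Sketch of the attribute-reduction and clustering stage of an automated GPR
# facies workflow: standardize attributes, reduce dimensionality with PCA, and
# group samples into facies-like clusters with k-means. The attribute matrix
# here is synthetic; real inputs would be amplitude, coherence,
# instantaneous-frequency maps, etc.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_samples, n_attributes = 5000, 12
attributes = rng.normal(size=(n_samples, n_attributes))   # placeholder attributes

X = StandardScaler().fit_transform(attributes)
pca = PCA(n_components=0.9)            # keep components explaining 90% of variance
scores = pca.fit_transform(X)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

print("Retained components:", pca.n_components_)
print("Samples per tentative facies:", np.bincount(labels))
```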
Automated Video Analysis of Non-verbal Communication in a Medical Setting
Hart, Yuval; Czerniak, Efrat; Karnieli-Miller, Orit; Mayo, Avraham E.; Ziv, Amitai; Biegon, Anat; Citron, Atay; Alon, Uri
2016-01-01
Non-verbal communication plays a significant role in establishing good rapport between physicians and patients and may influence aspects of patient health outcomes. It is therefore important to analyze non-verbal communication in medical settings. Current approaches to measuring non-verbal interactions in medicine employ coding by human raters. Such tools are labor intensive and hence limit the scale of possible studies. Here, we present an automated video analysis tool for non-verbal interactions in a medical setting. We test the tool using videos of subjects who interact with an actor portraying a doctor. The actor interviews the subjects following one of two scripted scenarios: in one scenario the actor shows minimal engagement with the subject; the second scenario includes active listening by the doctor and attentiveness to the subject. We analyze the cross-correlation in total kinetic energy of the two people in the dyad, and also characterize the frequency spectrum of their motion. We find large differences in interpersonal motion synchrony and entrainment between the two performance scenarios. The active listening scenario shows more synchrony and more symmetric followership than the other scenario. Moreover, the active listening scenario shows more high-frequency motion, termed jitter, which has recently been suggested to be a marker of followership. The present approach may be useful for analyzing physician-patient interactions in terms of synchrony and dominance in a range of medical settings. PMID:27602002
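The synchrony measure described here can be illustrated with a lagged cross-correlation between two kinetic-energy time series; the signals below are synthetic, with the "patient" following the "doctor" at a one-second delay, and the analysis is only a sketch of the idea, not the study's actual pipeline.

```python
# Sketch of a synchrony measure: normalized cross-correlation between the
# kinetic-energy time series of the two people in a dyad, evaluated over a
# range of lags. Signals are synthetic; in the study they would come from
# frame-to-frame motion energy extracted from video.
import numpy as np

rng = np.random.default_rng(4)
fps, seconds = 25, 120
t = np.arange(fps * seconds)
doctor = np.abs(rng.normal(size=t.size)) + 0.5 * np.sin(2 * np.pi * t / 200)
# The "patient" partially follows the doctor with a 1-second delay plus noise.
patient = np.roll(doctor, fps) * 0.8 + 0.4 * np.abs(rng.normal(size=t.size))

def norm_xcorr(x, y, lag):
    """Pearson correlation between x(t) and y(t + lag)."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return float(np.corrcoef(x, y)[0, 1])

lags = np.arange(-3 * fps, 3 * fps + 1)          # -3 s .. +3 s
xcorr = np.array([norm_xcorr(doctor, patient, int(l)) for l in lags])
best = lags[int(np.argmax(xcorr))]
print(f"Peak correlation {xcorr.max():.2f} at lag {best / fps:+.2f} s")
```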
SUPIN: A Computational Tool for Supersonic Inlet Design
NASA Technical Reports Server (NTRS)
Slater, John W.
2016-01-01
A computational tool named SUPIN is being developed to design and analyze the aerodynamic performance of supersonic inlets. The inlet types available include the axisymmetric pitot, three-dimensional pitot, axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flow-field is divided into parts to provide a framework for the geometry and aerodynamic modeling. Each part of the inlet is defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick design and analysis. SUPIN provides inlet geometry in the form of coordinates, surface angles, and cross-sectional areas. SUPIN can generate inlet surface grids and three-dimensional, structured volume grids for use with higher-fidelity computational fluid dynamics (CFD) analysis. Capabilities highlighted in this paper include the design and analysis of streamline-traced external-compression inlets, modeling of porous bleed, and the design and analysis of mixed-compression inlets. CFD analyses are used to verify the SUPIN results.
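As an example of the low-fidelity analytic relations an inlet design tool of this kind can draw on, the sketch below evaluates the textbook normal-shock total-pressure ratio; it is standard gas dynamics, not SUPIN's actual implementation.

```python
# Normal-shock total-pressure recovery from one-dimensional compressible-flow
# theory, one of the classic low-fidelity relations used in inlet analysis.
def normal_shock_pt_ratio(M1, gamma=1.4):
    """Total pressure ratio pt2/pt1 across a normal shock at Mach M1 (> 1)."""
    a = ((gamma + 1.0) * M1**2 / ((gamma - 1.0) * M1**2 + 2.0)) ** (gamma / (gamma - 1.0))
    b = ((gamma + 1.0) / (2.0 * gamma * M1**2 - (gamma - 1.0))) ** (1.0 / (gamma - 1.0))
    return a * b

for M1 in (1.3, 1.6, 2.0, 2.5):
    print(f"M1 = {M1:>3}: pt2/pt1 = {normal_shock_pt_ratio(M1):.4f}")
```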
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khaleel, Mohammad A.; Lin, Zijing; Singh, Prabhakar
2004-05-03
A 3D simulation tool for modeling solid oxide fuel cells is described. The tool combines the versatility and efficiency of a commercial finite element analysis code, MARC®, with an in-house developed robust and flexible electrochemical (EC) module. Based upon characteristic parameters obtained experimentally and assigned by the user, the EC module calculates the current density distribution, heat generation, and fuel and oxidant species concentration, taking the temperature profile provided by MARC® and operating conditions such as the fuel and oxidant flow rate and the total stack output voltage or current as the input. MARC® performs flow and thermal analyses based on the initial and boundary thermal and flow conditions and the heat generation calculated by the EC module. The main coupling between MARC® and EC is for MARC® to supply the temperature field to EC and for EC to give the heat generation profile to MARC®. The loosely coupled, iterative scheme is advantageous in terms of memory requirement, numerical stability and computational efficiency. The coupling is iterated to self-consistency for a steady-state solution. Sample results for steady states as well as the startup process for stacks with different flow designs are presented to illustrate the modeling capability and numerical performance characteristic of the simulation tool.
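The structure of the loosely coupled iteration described above can be sketched as a fixed-point loop in which a stand-in "thermal solver" and a stand-in "EC module" exchange temperature and heat-generation fields until self-consistency; both physics models below are made-up linear responses, not the MARC/EC coupling itself.

```python
# Structure of a loosely coupled thermal/electrochemical iteration: the
# "thermal solver" maps heat generation to temperature, the "EC module" maps
# temperature to heat generation, and the loop iterates to self-consistency.
import numpy as np

n_cells = 50
T = np.full(n_cells, 1000.0)            # initial temperature guess, K

def ec_module(T):
    """Stand-in electrochemical module: heat generation vs. temperature."""
    return 2.0e3 + 1.5 * (T - 1000.0)   # W/m^3, made-up linear dependence

def thermal_solver(q):
    """Stand-in finite-element thermal analysis: temperature response to heating."""
    return 950.0 + 0.03 * q             # K, made-up linear response

for iteration in range(100):
    q = ec_module(T)
    T_new = thermal_solver(q)
    change = float(np.max(np.abs(T_new - T)))
    # Under-relaxation improves stability of the fixed-point iteration.
    T = 0.5 * T + 0.5 * T_new
    if change < 1e-6:
        break

print(f"Converged after {iteration + 1} iterations; mean T = {T.mean():.2f} K")
```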
CytoBayesJ: software tools for Bayesian analysis of cytogenetic radiation dosimetry data.
Ainsbury, Elizabeth A; Vinnikov, Volodymyr; Puig, Pedro; Maznyk, Nataliya; Rothkamm, Kai; Lloyd, David C
2013-08-30
A number of authors have suggested that a Bayesian approach may be most appropriate for analysis of cytogenetic radiation dosimetry data. In the Bayesian framework, probability of an event is described in terms of previous expectations and uncertainty. Previously existing, or prior, information is used in combination with experimental results to infer probabilities or the likelihood that a hypothesis is true. It has been shown that the Bayesian approach increases both the accuracy and quality assurance of radiation dose estimates. New software entitled CytoBayesJ has been developed with the aim of bringing Bayesian analysis to cytogenetic biodosimetry laboratory practice. CytoBayesJ takes a number of Bayesian or 'Bayesian like' methods that have been proposed in the literature and presents them to the user in the form of simple user-friendly tools, including testing for the most appropriate model for distribution of chromosome aberrations and calculations of posterior probability distributions. The individual tools are described in detail and relevant examples of the use of the methods and the corresponding CytoBayesJ software tools are given. In this way, the suitability of the Bayesian approach to biological radiation dosimetry is highlighted and its wider application encouraged by providing a user-friendly software interface and manual in English and Russian. Copyright © 2013 Elsevier B.V. All rights reserved.
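A minimal conjugate gamma-Poisson example conveys the prior-plus-data idea behind such Bayesian dose estimation: a prior on the aberration yield per cell is updated with an observed dicentric count. The prior parameters and counts below are hypothetical, and this is not CytoBayesJ's methodology.

```python
# Minimal conjugate gamma-Poisson example: combine a prior expectation for the
# dicentric yield per cell with an observed aberration count to obtain a
# posterior distribution for the yield.
from scipy import stats

# Assumed prior on yield lambda (dicentrics per cell): Gamma(alpha, rate=beta).
alpha_prior, beta_prior = 2.0, 20.0          # prior mean 0.1 dicentrics/cell

# Hypothetical scoring result: 48 dicentrics observed in 500 cells.
dicentrics, cells = 48, 500

# Conjugate update: posterior is Gamma(alpha + k, rate = beta + n).
alpha_post = alpha_prior + dicentrics
beta_post = beta_prior + cells
posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)

lo, hi = posterior.ppf([0.025, 0.975])
print(f"Posterior mean yield: {posterior.mean():.3f} dicentrics/cell")
print(f"95% credible interval: ({lo:.3f}, {hi:.3f})")
```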
Senol Cali, Damla; Kim, Jeremie S; Ghose, Saugata; Alkan, Can; Mutlu, Onur
2018-04-02
Nanopore sequencing technology has the potential to render other sequencing technologies obsolete with its ability to generate long reads and provide portability. However, the high error rates of the technology pose a challenge for generating accurate genome assemblies. The tools used for nanopore sequence analysis are of critical importance, as they should overcome the high error rates of the technology. Our goal in this work is to comprehensively analyze current publicly available tools for nanopore sequence analysis to understand their advantages, disadvantages and performance bottlenecks. It is important to understand where the current tools do not perform well in order to develop better tools. To this end, we (1) analyze the multiple steps and the associated tools in the genome assembly pipeline using nanopore sequence data, and (2) provide guidelines for determining the appropriate tools for each step. Based on our analyses, we make four key observations: (1) the choice of the tool for basecalling plays a critical role in overcoming the high error rates of nanopore sequencing technology. (2) Read-to-read overlap finding tools, GraphMap and Minimap, perform similarly in terms of accuracy. However, Minimap has a lower memory usage, and it is faster than GraphMap. (3) There is a trade-off between accuracy and performance when deciding on the appropriate tool for the assembly step. The fast but less accurate assembler Miniasm can be used for quick initial assembly, and further polishing can be applied on top of it to increase the accuracy, which leads to faster overall assembly. (4) The state-of-the-art polishing tool, Racon, generates high-quality consensus sequences while providing a significant speedup over another polishing tool, Nanopolish. We analyze various combinations of different tools and expose the trade-offs between accuracy, performance, memory usage and scalability. We conclude that our observations can guide researchers and practitioners in making conscious and effective choices for each step of the genome assembly pipeline using nanopore sequence data. Also, with the help of the bottlenecks we have identified, developers can improve the current tools or build new ones that are both accurate and fast, to overcome the high error rates of nanopore sequencing technology.
Secure FAST: Security Enhancement in the NATO Time Sensitive Targeting Tool
2010-11-01
...and collaboration tool, designed to aid in the tracking and prosecuting of Time Sensitive Targets. The FAST tool provides user-level authentication and authorisation in terms of security. It uses operating-system-level security but does not provide application-level security for...
ERIC Educational Resources Information Center
Johs-Artisensi, Jennifer L.; Olson, Douglas M.; Nahm, Abraham Y.
2016-01-01
Long term care administrators need a broad base of knowledge, skills, and interests to provide leadership and be successful in managing a fiscally responsible, quality long term care organization. Researchers developed a tool to help students assess whether a long term care administration major is a compatible fit. With input from professionals in…
Breakdown Cause and Effect Analysis. Case Study
NASA Astrophysics Data System (ADS)
Biały, Witold; Ružbarský, Juraj
2018-06-01
Every company must ensure that the production process proceeds without interferences. Within this article, the author uses the term "interferences" in reference to unplanned stoppages caused by breakdowns. Unfortunately, usually due to machine operators' mistakes, machines break down, causing stoppages and thus generating additional costs for the company. This article presents a cause and effect analysis of a breakdown in a production process. FMEA as well as quality management tools (the Ishikawa diagram and the Pareto chart) were used for the analysis. Correction measures were presented which allowed for a significant reduction in the number of stoppages caused by breakdowns.
Evaluation of a New Digital Automated Glycemic Pattern Detection Tool
Albiñana, Emma; Artes, Maite; Corcoy, Rosa; Fernández-García, Diego; García-Alemán, Jorge; García-Cuartero, Beatriz; González, Cintia; Rivero, María Teresa; Casamira, Núria; Weissmann, Jörg
2017-01-01
Background: Blood glucose meters are reliable devices for data collection, providing electronic logs of historical data that are easier to interpret than handwritten logbooks. Automated tools to analyze these data are necessary to facilitate glucose pattern detection and support treatment adjustment. Such tools are emerging in broad variety, often with little formal evaluation. The aim of this study was to compare eDetecta, a new automated pattern detection tool, to nonautomated pattern analysis in terms of time investment, data interpretation, and clinical utility, with the overarching goal of identifying areas of improvement and potential safety risks early in the development and implementation of the tool. Methods: Multicenter web-based evaluation in which 37 endocrinologists were asked to assess glycemic patterns of 4 real reports (2 continuous subcutaneous insulin infusion [CSII] and 2 multiple daily injection [MDI]). Endocrinologist and eDetecta analyses were compared on the time spent analyzing each report and on agreement on the presence or absence of defined patterns. Results: The eDetecta module markedly reduced the time taken to analyze each case compared with manual analysis based on the emminens eConecta reports (CSII: 18 min; MDI: 12.5 min). Agreement between endocrinologists and eDetecta varied depending on the patterns, with a high level of agreement for patterns of glycemic variability. Further analysis of areas of low agreement led to identifying where the algorithms used could be improved to optimize trend pattern identification. Conclusion: eDetecta was a useful tool for glycemic pattern detection, helping clinicians to reduce the time required to review emminens eConecta glycemic reports. No safety risks were identified during the study. PMID:29091477
NASA Astrophysics Data System (ADS)
Solecki, W. D.; Friedman, E. S.; Breitzer, R.
2016-12-01
Increasingly frequent extreme weather events are becoming an immediate priority for urban coastal practitioners and stakeholders, adding complexity to decisions concerning risk management for short-term action and the long-term needs of city climate stakeholders. The conflict between the prioritization of short- versus long-term events by decision-makers creates a disconnect between climate science and its applications. The Consortium for Climate Risk in the Urban Northeast (CCRUN), a NOAA RISA team, is developing a set of mechanisms to help bridge this gap. The mechanisms are designed to promote the application of climate science on extreme weather events and their aftermath. It is in the post-event policy window where significant opportunities for science-policy linkages exist. In particular, CCRUN is interested in producing actionable and useful information for city managers to use in decision-making processes surrounding extreme weather events and climate change. These processes include a sector-specific needs assessment survey instrument and two tools for urban coastal practitioners and stakeholders. The tools focus on post-event learning and connections between resilience and transformative adaptation. Elements of the two tools are presented. Post extreme event learning supports urban coastal practitioners and decision-makers concerned about maximizing opportunities for knowledge transfer and assimilation, and policy initiation and development following an extreme weather event. For the urban U.S. Northeast, post-event learning helps coastal stakeholders build the capacity to adapt to extreme weather events, and inform and develop their planning capacity through analysis of past actions and steps taken in response to Hurricane Sandy. Connecting resilience with transformative adaptation is intended to promote resilience in urban Northeast coastal settings to the long-term negative consequences of extreme weather events. This is done through a knowledge co-production engagement process that links innovative and flexible adaptation pathways that can address requirements for short-term action and long-term needs.
NASA Astrophysics Data System (ADS)
O'Connor, Alison; Kirtman, Benjamin; Harrison, Scott; Gorman, Joe
2016-05-01
The US Navy faces several limitations when planning operations with regard to forecasting environmental conditions. Currently, mission analysis and planning tools rely heavily on short-term (less than a week) forecasts or long-term statistical climate products. However, newly available data in the form of weather forecast ensembles provide dynamical and statistical extended-range predictions that can produce more accurate predictions if ensemble members can be combined correctly. Charles River Analytics is designing the Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS), which performs data fusion over extended-range multi-model ensembles, such as the North American Multi-Model Ensemble (NMME), to produce a unified forecast for several weeks to several seasons in the future. We evaluated thirty years of forecasts using machine learning to select predictions for an all-encompassing and superior forecast that can be used to inform the Navy's decision planning process.
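A toy version of skill-weighted ensemble fusion is sketched below: members are weighted by inverse hindcast mean-squared error and then combined. The data are synthetic and the scheme is only illustrative of the general idea, not the COMPASS design or the actual NMME data.

```python
# Toy skill-weighted fusion of extended-range ensemble members: weight each
# model by the inverse of its mean-squared error over a hindcast period, then
# combine the current forecasts with those weights.
import numpy as np

rng = np.random.default_rng(5)
truth_hindcast = rng.normal(size=360)                      # fake verifying obs
n_models = 6
model_skill = rng.uniform(0.3, 1.5, size=n_models)         # per-model error scale
hindcasts = truth_hindcast + rng.normal(scale=model_skill[:, None],
                                        size=(n_models, truth_hindcast.size))

mse = np.mean((hindcasts - truth_hindcast) ** 2, axis=1)
weights = (1.0 / mse) / np.sum(1.0 / mse)                  # inverse-MSE weights

current_members = rng.normal(loc=1.0, scale=model_skill)   # fake current forecasts
fused = float(np.dot(weights, current_members))
print("Member weights:", np.round(weights, 3))
print("Fused forecast anomaly:", round(fused, 3))
```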
WebGIVI: a web-based gene enrichment analysis and visualization tool.
Sun, Liang; Zhu, Yongnan; Mahmood, A S M Ashique; Tudor, Catalina O; Ren, Jia; Vijay-Shanker, K; Chen, Jian; Schmidt, Carl J
2017-05-04
A major challenge of high-throughput transcriptome studies is presenting the data to researchers in an interpretable format. In many cases, the outputs of such studies are gene lists which are then examined for enriched biological concepts. One approach to help the researcher interpret large gene datasets is to associate genes and informative terms (iTerm) that are obtained from the biomedical literature using the eGIFT text-mining system. However, examining large lists of iTerm and gene pairs is a daunting task. We have developed WebGIVI, an interactive web-based visualization tool ( http://raven.anr.udel.edu/webgivi/ ) to explore gene:iTerm pairs. WebGIVI was built via Cytoscape and Data Driven Document JavaScript libraries and can be used to relate genes to iTerms and then visualize gene and iTerm pairs. WebGIVI can accept a gene list that is used to retrieve the gene symbols and corresponding iTerm list. This list can be submitted to visualize the gene iTerm pairs using two distinct methods: a Concept Map or a Cytoscape Network Map. In addition, WebGIVI also supports uploading and visualization of any two-column tab-separated data. WebGIVI provides an interactive and integrated network graph of genes and iTerms that allows filtering, sorting, and grouping, which can aid biologists in developing hypotheses based on the input gene lists. In addition, WebGIVI can visualize hundreds of nodes and generate a high-resolution image that is important for most research publications. The source code can be freely downloaded at https://github.com/sunliang3361/WebGIVI . The WebGIVI tutorial is available at http://raven.anr.udel.edu/webgivi/tutorial.php .
A Middle Palaeolithic wooden digging stick from Aranbaltza III, Spain.
Rios-Garaizar, Joseba; López-Bultó, Oriol; Iriarte, Eneko; Pérez-Garrido, Carlos; Piqué, Raquel; Aranburu, Arantza; Iriarte-Chiapusso, María José; Ortega-Cordellat, Illuminada; Bourguignon, Laurence; Garate, Diego; Libano, Iñaki
2018-01-01
Aranbaltza is an archaeological complex formed by at least three open-air sites. Between 2014 and 2015, a test excavation carried out in Aranbaltza III revealed the presence of a sand and clay sedimentary sequence formed in floodplain environments, within which six sedimentary units have been identified. This sequence was formed between 137-50 ka, and includes several archaeological horizons, attesting to the long-term presence of Neanderthal communities in this area. One of these horizons, corresponding to Unit 4, yielded two wooden tools. One of these tools is a beveled pointed tool that was shaped through a complex operational sequence involving branch shaping, bark peeling, twig removal, shaping, polishing, thermal exposure and chopping. A use-wear analysis of the tool shows traces related to digging soil, so it has been interpreted as a digging stick. This is the first time such a tool has been identified in a European Late Middle Palaeolithic context; it also represents one of the first well-preserved Middle Palaeolithic wooden tools found in southern Europe. This artefact is one of the few examples available of wooden tool preservation in the European Palaeolithic, allowing us to further explore the role wooden technologies played in Neanderthal communities.
Mañas-Martínez, Ana B; Bucar-Barjud, Marina; Campos-Fernández, Julia; Gimeno-Orna, José Antonio; Pérez-Calvo, Juan; Ocón-Bretón, Julia
2018-04-24
To assess the prevalence of oropharyngeal dysphagia (OD) using the Eating Assessment Tool (EAT-10) and its association with malnutrition and long-term mortality. A retrospective cohort study of patients admitted to the general internal medicine ward. In the first 48 hours after hospital admission, OD was assessed using the EAT-10, and presence of malnutrition with the Mini Nutritional Assessment-Short Form (MNA-SF). The association of OD with malnutrition and long-term mortality was analyzed. Ninety patients with a mean age of 83 (SD: 11.8) years were enrolled. Of these, 56.7% were at risk of OD according to the EAT-10. This group of patients had greater prevalence rates of malnutrition (88.2% vs. 48.7%; P=.001) and mortality (70% vs 35.9%; P=.001). During follow-up for 872.71 (SD: 642.89) days, risk of OD according to the EAT-10 was an independent predictor of mortality in a multivariate analysis (HR: 2.8; 95%CI: 1.49-5.28; P=.001). The EAT-10 is a useful tool for screening for OD. Adequate screening for OD is important because of its associated risks of malnutrition and long-term mortality. Copyright © 2018 SEEN y SED. Publicado por Elsevier España, S.L.U. All rights reserved.
Tebel, Katrin; Boldt, Vivien; Steininger, Anne; Port, Matthias; Ebert, Grit; Ullmann, Reinhard
2017-01-06
The analysis of DNA copy number variants (CNV) has increasing impact in the field of genetic diagnostics and research. However, the interpretation of CNV data derived from high-resolution array CGH or NGS platforms is complicated by the considerable variability of the human genome. Therefore, tools for multidimensional data analysis and comparison of patient cohorts are needed to assist in the discrimination of clinically relevant CNVs from others. We developed GenomeCAT, a standalone Java application for the analysis and integrative visualization of CNVs. GenomeCAT is composed of three modules dedicated to the inspection of single cases, comparative analysis of multidimensional data, and group comparisons aiming at the identification of recurrent aberrations in patients sharing the same phenotype. Its flexible import options ease the comparative analysis of a user's own results derived from microarray or NGS platforms with data from the literature or public repositories. Multidimensional data obtained from different experiment types can be merged into a common data matrix to enable common visualization and analysis. All results are stored in the integrated MySQL database, but can also be exported as tab-delimited files for further statistical calculations in external programs. GenomeCAT offers a broad spectrum of visualization and analysis tools that assist in the evaluation of CNVs in the context of other experiment data and annotations. The use of GenomeCAT does not require any specialized computer skills. The various R packages implemented for data analysis are fully integrated into GenomeCAT's graphical user interface, and the installation process is supported by a wizard. The flexibility in terms of data import and export, in combination with the ability to create a common data matrix, makes the program well suited as an interface between genomic data from heterogeneous sources and external software tools. Due to the modular architecture, the functionality of GenomeCAT can be easily extended by further R packages or customized plug-ins to meet future requirements.
Trajectory Design for the Transiting Exoplanet Survey Satellite
NASA Technical Reports Server (NTRS)
Dichmann, Donald J.; Parker, Joel J. K.; Williams, Trevor W.; Mendelsohn, Chad R.
2014-01-01
The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission, scheduled to be launched in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the Schematics Window Methodology (SWM76) launch window analysis tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements. Keywords: resonant orbit, stability, lunar flyby, phasing loops, trajectory optimization
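A quick two-body consistency check of the quoted geometry: by Kepler's third law, an orbit whose period is half the Moon's sidereal period has a semi-major axis of roughly 38 Earth radii, matching (17 Re + 59 Re)/2. The constants below are standard values, and the calculation ignores lunar and solar perturbations; it is not the mission-design tooling described above.

```python
# Rough two-body check of a 2:1 lunar-resonant orbit using Kepler's third law.
import math

MU_EARTH = 398600.4418          # km^3/s^2
R_EARTH = 6378.137              # km
P_MOON = 27.321661 * 86400.0    # sidereal month, s

P = P_MOON / 2.0                                    # 2:1 resonance period
a = (MU_EARTH * (P / (2.0 * math.pi)) ** 2) ** (1.0 / 3.0)
print(f"Resonant period : {P / 86400.0:.2f} days")
print(f"Semi-major axis : {a:.0f} km  (~{a / R_EARTH:.1f} Earth radii)")
print(f"Quoted geometry : (17 Re + 59 Re) / 2 = {(17 + 59) / 2:.1f} Re")
```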
NASA Technical Reports Server (NTRS)
Modesitt, Kenneth L.
1990-01-01
A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.
Curtis, Helen J; Goldacre, Ben
2018-02-23
We aimed to compile and normalise England's national prescribing data for 1998-2016 to facilitate research on long-term time trends and create an open-data exploration tool for wider use. We compiled data from each individual year's national statistical publications and normalised them by mapping each drug to its current classification within the national formulary where possible. We created a freely accessible, interactive web tool to allow anyone to interact with the processed data. We downloaded all available annual prescription cost analysis datasets, which include cost and quantity for all prescription items dispensed in the community in England. Medical devices and appliances were excluded. We measured the extent of normalisation of data and aimed to produce a functioning accessible analysis tool. All data were imported successfully. 87.5% of drugs were matched exactly on name to the current formulary and a further 6.5% to similar drug names. All drugs in core clinical chapters were reconciled to their current location in the data schema, with only 1.26% of drugs not assigned a current chemical code. We created an openly accessible interactive tool to facilitate wider use of these data. Publicly available data can be made accessible through interactive online tools to help researchers and policy-makers explore time trends in prescribing. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Karthikeyan, Muthukumarasamy; Pandit, Yogesh; Pandit, Deepak; Vyas, Renu
2015-01-01
Virtual screening is an indispensable tool for coping with the massive amount of data generated by high-throughput omics technologies. With the objective of enhancing the automation capability of the virtual screening process, a robust portal termed MegaMiner has been built on a cloud computing platform, wherein the user submits a text query and directly accesses the proposed lead molecules along with their drug-like, lead-like and docking scores. Textual representation of chemical structure data is fraught with ambiguity in the absence of a global identifier. We have used a combination of statistical models, a chemical dictionary and regular expressions to build a disease-specific dictionary. To demonstrate the effectiveness of this approach, a case study on malaria has been carried out in the present work. MegaMiner offered superior results compared to other text mining search engines, as established by F score analysis. A single query term 'malaria' in the portlet led to the retrieval of related PubMed records, protein classes, drug classes and 8000 scaffolds, which were internally processed and filtered to suggest new molecules as potential anti-malarials. The results obtained were validated by docking the virtual molecules into relevant protein targets. It is hoped that MegaMiner will serve as an indispensable tool not only for identifying hidden relationships between various biological and chemical entities but also for building better corpora and ontologies.
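The F score used to compare retrieval engines is the harmonic mean of precision and recall; the sketch below computes it for hypothetical retrieval counts, not MegaMiner's reported results.

```python
# The F score is the harmonic mean of precision and recall. The counts below
# are hypothetical and only illustrate the computation.
def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical retrieval outcome: 80 relevant documents returned, 20 irrelevant
# documents returned, 40 relevant documents missed.
p, r, f1 = precision_recall_f1(tp=80, fp=20, fn=40)
print(f"precision={p:.2f}  recall={r:.2f}  F1={f1:.2f}")
```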
Evaluation of a novel Serious Game based assessment tool for patients with Alzheimer's disease.
Vallejo, Vanessa; Wyss, Patric; Rampa, Luca; Mitache, Andrei V; Müri, René M; Mosimann, Urs P; Nef, Tobias
2017-01-01
Despite growing interest in developing ecological assessments of difficulties in patients with Alzheimer's disease, new methods for assessing the cognitive difficulties related to functional activities are missing. To complement current evaluation, the use of Serious Games is a promising approach, as it offers the possibility to recreate a virtual environment with daily living activities and a precise and complete cognitive evaluation. The aim of the present study was to evaluate the usability and the screening potential of a new ecological tool for the assessment of cognitive functions in patients with Alzheimer's disease. Eighteen patients with Alzheimer's disease and twenty healthy controls participated in the study. They were asked to complete six daily living virtual tasks assessing several cognitive functions, following a one-day scenario: three navigation tasks, one shopping task, one cooking task and one table preparation task. Usability of the game was evaluated through a questionnaire and through the analysis of the computer interactions for the two groups. Furthermore, performance in terms of time to achieve each task and percentage of completion of the several tasks was recorded. Results indicate that both groups subjectively found the game user friendly and were objectively able to play the game without difficulties in the computer interactions. Comparison of the performances between the two groups indicated significant differences in the percentage of achievement of the several tasks and in the time needed to achieve them. This study suggests that this new Serious Game based assessment tool is a user-friendly and ecological method to evaluate the cognitive abilities related to the difficulties patients can encounter in daily living activities, and that it can be used as a screening tool, as it allowed Alzheimer's patients' performance to be distinguished from that of healthy controls.
Brownfields Environmental Insurance and Risk Management Tools Glossary of Terms
This document provides a list of terms that are typically used by the environmental insurance industry, transactional specialists, and other parties involved in using environmental insurance or risk management tools.
Bio-TDS: bioscience query tool discovery system.
Gnimpieba, Etienne Z; VanDiermen, Menno S; Gustafson, Shayla M; Conn, Bill; Lushbough, Carol M
2017-01-04
Bioinformatics and computational biology play a critical role in bioscience and biomedical research. As researchers design their experimental projects, one major challenge is to find the most relevant bioinformatics toolkits that will lead to new knowledge discovery from their data. The Bio-TDS (Bioscience Query Tool Discovery Systems, http://biotds.org/) has been developed to assist researchers in retrieving the most applicable analytic tools by allowing them to formulate their questions as free text. The Bio-TDS is a flexible retrieval system that affords users from multiple bioscience domains (e.g. genomic, proteomic, bio-imaging) the ability to query over 12 000 analytic tool descriptions integrated from well-established community repositories. One of the primary components of the Bio-TDS is the ontology and natural language processing workflow for annotation, curation, query processing, and evaluation. The Bio-TDS's scientific impact was evaluated using sample questions posed by researchers retrieved from Biostars, a site focusing on biological data analysis. The Bio-TDS was compared to five similar bioscience analytic tool retrieval systems, with the Bio-TDS outperforming the others in terms of relevance and completeness. The Bio-TDS offers researchers the capacity to associate their bioscience question with the most relevant computational toolsets required for the data analysis in their knowledge discovery process. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Susan Will-Wolf; Peter Neitlich
2010-01-01
Development of a regional lichen gradient model from community data is a powerful tool to derive lichen indexes of response to environmental factors for large-scale and long-term monitoring of forest ecosystems. The Forest Inventory and Analysis (FIA) Program of the U.S. Department of Agriculture Forest Service includes lichens in its national inventory of forests of...
ERIC Educational Resources Information Center
Coulson, Andrew J.
2014-01-01
Long-term trends in academic performance and spending are valuable tools for evaluating past education policies and informing current ones. But such data have been scarce at the state level, where the most important education policy decisions are made. State spending data exist reaching back to the 1960s, but the figures have been scattered across…
Simulating Mission Command for Planning and Analysis
2015-06-01
mission plan. Subject terms: Mission Planning, CPM, PERT, Simulation, DES, Simkit, Triangle Distribution, Critical Path. Abbreviations (fragment): Battalion Task Force; CO, Company; CPM, Critical Path Method; DES, Discrete Event Simulation; FA BAT, Field Artillery Battalion; FEL, Future Event List; FIST ... Abstract fragment: ... management tools that can be utilized to find the critical path in military projects. These are the Critical Path Method (CPM) and the Program Evaluation and
ERIC Educational Resources Information Center
Scrimshaw, Susan
This guidebook is both a practical tool and a source book to aid health planners assess the importance, extent, and impact of indigenous and private sector medical systems in developing nations. Guidelines are provided for assessment in terms of: use patterns; the meaning and importance to users of various available health services; and ways of…
Analysis of Commercial Unsaturated Polyester Repair Resins
2009-07-01
resins utilizing renewable fatty acid-based monomers. Subject terms: vinyl ester, styrene, fatty acid monomers, HAP, triglycerides. Abstract fragment: criteria for selecting the appropriate repair include whether the component can be removed and whether the back side is accessible. For a typical moderate ... field repair, any remaining coating in the repair area is removed by hand sanding or portable tools. Damage is cut out in an appropriate
Herpich, Carolina Marciela; Amaral, Ana Paula; Leal-Junior, Ernesto Cesar Pinto; Tosato, Juliana de Paiva; Gomes, Cid Andre Fidelis de Paula; Arruda, Éric Edmur Camargo; Glória, Igor Phillip dos Santos; Garcia, Marilia Barbosa Santos; Barbosa, Bruno Roberto Borges; Rodrigues, Monique Sampaio; Silva, Katiane Lima; El Hage, Yasmin; Politti, Fabiano; Gonzalez, Tabajara de Oliveira; Bussadori, Sandra Kalil; Biasotto-Gonzalez, Daniela Aparecida
2015-01-01
The aim of the present study was to perform a systematic review of the literature on the effects of low-level laser therapy in the treatment of TMD, and to analyze the use of different assessment tools. [Subjects and Methods] Searches were carried out of the BIREME, MEDLINE, PubMed and SciELO electronic databases by two independent researchers for papers published in English and Portuguese using the terms: “temporomandibular joint laser therapy” and “TMJ laser treatment”. [Results] Following the application of the eligibility criteria, 11 papers were selected for in-depth analysis. The papers analyzed exhibited considerable methodological differences, especially with regard to the number of sessions, anatomic site and duration of low-level laser therapy irradiation, as well as irradiation parameters, diagnostic criteria and assessment tools. [Conclusion] Further studies are needed, especially randomized clinical trials, to establish the exact dose and ideal parameters for low-level laser therapy and define the best assessment tools in this promising field of research that may benefit individuals with signs and symptoms of TMD. PMID:25642095
LittleQuickWarp: an ultrafast image warping tool.
Qu, Lei; Peng, Hanchuan
2015-02-01
Warping images into a standard coordinate space is critical for many image computing related tasks. However, for multi-dimensional and high-resolution images, an accurate warping operation itself is often very expensive in terms of computer memory and computational time. For high-throughput image analysis studies such as brain mapping projects, it is desirable to have high performance image warping tools that are compatible with common image analysis pipelines. In this article, we present LittleQuickWarp, a swift and memory efficient tool that boosts 3D image warping performance dramatically and at the same time has high warping quality similar to the widely used thin plate spline (TPS) warping. Compared to the TPS, LittleQuickWarp can improve the warping speed 2-5 times and reduce the memory consumption 6-20 times. We have implemented LittleQuickWarp as an Open Source plug-in program on top of the Vaa3D system (http://vaa3d.org). The source code and a brief tutorial can be found in the Vaa3D plugin source code repository. Copyright © 2014 Elsevier Inc. All rights reserved.
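For context, the following is a minimal baseline of the kind of thin-plate-spline (TPS) warping the abstract compares against, not the LittleQuickWarp algorithm itself. It uses SciPy's RBFInterpolator with synthetic control-point correspondences; all coordinates are made up.

```python
# A minimal TPS baseline (not LittleQuickWarp): warp 3D coordinates with
# SciPy's RBFInterpolator, fitted on synthetic landmark correspondences.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
src = rng.uniform(0, 100, size=(20, 3))            # landmarks in the subject image
dst = src + rng.normal(0, 2.0, size=src.shape)     # matching landmarks in the atlas

# Fit one TPS mapping from source coordinates to target coordinates.
tps = RBFInterpolator(src, dst, kernel="thin_plate_spline")

# Warp an arbitrary set of voxel coordinates (here a coarse grid).
grid = np.stack(np.meshgrid(*[np.linspace(0, 100, 5)] * 3, indexing="ij"), axis=-1)
warped = tps(grid.reshape(-1, 3)).reshape(grid.shape)
print(warped.shape)        # (5, 5, 5, 3) warped coordinates
```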
RootGraph: a graphic optimization tool for automated image analysis of plant roots
Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N.; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J.
2015-01-01
This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions. PMID:26224880
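The sketch below is only loosely inspired by the described approach: it skeletonizes a tiny synthetic root mask and builds a pixel-adjacency graph whose endpoints and branch points are the raw material a graph-based method could work from. It is not the RootGraph algorithm and omits its optimization step.

```python
# Illustrative sketch only (not RootGraph): skeletonise a binary root mask and
# build a pixel-adjacency graph, then read off endpoints and branch points.
import numpy as np
import networkx as nx
from skimage.morphology import skeletonize

mask = np.zeros((40, 40), dtype=bool)
mask[5:35, 19:22] = True        # synthetic main axis
mask[20:23, 21:32] = True       # one synthetic lateral

skel = skeletonize(mask)
coords = {tuple(p) for p in np.argwhere(skel)}

g = nx.Graph()
g.add_nodes_from(coords)
for (r, c) in coords:                       # connect 8-neighbouring skeleton pixels
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr or dc) and (r + dr, c + dc) in coords:
                g.add_edge((r, c), (r + dr, c + dc))

endpoints = [n for n in g if g.degree[n] == 1]
branch_points = [n for n in g if g.degree[n] >= 3]
print(len(endpoints), "endpoints,", len(branch_points), "branch point(s)")
```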
Toward a Responsibility-Catering Prioritarian Ethical Theory of Risk.
Wikman-Svahn, Per; Lindblom, Lars
2018-03-05
Standard tools used in societal risk management such as probabilistic risk analysis or cost-benefit analysis typically define risks in terms of only probabilities and consequences and assume a utilitarian approach to ethics that aims to maximize expected utility. The philosopher Carl F. Cranor has argued against this view by devising a list of plausible aspects of the acceptability of risks that points towards a non-consequentialist ethical theory of societal risk management. This paper revisits Cranor's list to argue that the alternative ethical theory responsibility-catering prioritarianism can accommodate the aspects identified by Cranor and that the elements in the list can be used to inform the details of how to view risks within this theory. An approach towards operationalizing the theory is proposed based on a prioritarian social welfare function that operates on responsibility-adjusted utilities. A responsibility-catering prioritarian ethical approach towards managing risks is a promising alternative to standard tools such as cost-benefit analysis.
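As one possible way to make the proposal concrete, the sketch below evaluates a prioritarian social welfare function on responsibility-adjusted utilities. The concave transform, the responsibility-adjustment formula, and all numbers are assumptions chosen for illustration; the paper does not commit to these specific forms.

```python
# A minimal sketch, assuming one concrete functional form: a prioritarian
# social welfare function W = sum_i g(u_i) applied to responsibility-adjusted
# utilities, with a strictly concave g giving priority to the worse off.
import math

def prioritarian_welfare(utilities, responsibility_shares, baseline=0.0):
    # Hypothetical adjustment: discount the part of each person's loss that
    # they are (stipulated to be) responsible for.
    adjusted = [u + r * (baseline - u) for u, r in zip(utilities, responsibility_shares)]
    g = lambda u: math.sqrt(max(u, 0.0))      # concave transform (illustrative choice)
    return sum(g(u) for u in adjusted)

# Two options distributing the same total utility differently:
print(prioritarian_welfare([9.0, 1.0], [0.0, 0.5]))   # unequal outcome
print(prioritarian_welfare([5.0, 5.0], [0.0, 0.5]))   # equal outcome scores higher
```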
Braunschmidt, Brigitte; Müller, Gerhard; Jukic-Puntigam, Margareta; Steininger, Alfred
2013-01-01
Incontinence-associated dermatitis (IAD) is the clinical manifestation of moisture-related skin damage (Beeckman, Woodward, & Gray, 2011). Valid assessment instruments are needed for risk assessment and classification of IAD. The aim of this quantitative-descriptive cross-sectional study was to determine the inter-rater reliability of the item scores of the German Incontinence Associated Dermatitis Intervention Tool (IADIT-D) between two independent assessors of nursing home residents (n = 381) in long-term care facilities. The 19 pairs of assessors consisted of registered nurses. Data analysis began with the calculation of the total percentage of agreement. Because this value is not chance-corrected, the kappa coefficients and the AC1 statistic were calculated as well. The total percentage of inter-rater agreement was 84% (n = 319). In a second step, the analysis across all items showed high (kappa = .70) and very high (AC1 = .83) agreement levels, respectively. For risk assessment (kappa = .82; AC1 = .94), the values corresponded to very high agreement levels, and for classification (kappa(w) = .70; AC1 = .76) to high agreement levels. The high to very high agreement values of the IADIT-D demonstrate that its items can be regarded as stable with regard to inter-rater reliability for use in long-term care facilities. In addition, further validation studies are needed.
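For reference, the two chance-corrected agreement statistics reported in the study can be computed as in the sketch below (two raters, nominal categories). The example ratings are invented, not the IADIT-D data.

```python
# Minimal sketch of Cohen's kappa and Gwet's AC1 for two raters.
from collections import Counter

def agreement_stats(rater_a, rater_b):
    n = len(rater_a)
    cats = sorted(set(rater_a) | set(rater_b))
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = Counter(rater_a), Counter(rater_b)
    # Cohen's kappa chance agreement: product of the two raters' marginals
    pe_kappa = sum((pa[c] / n) * (pb[c] / n) for c in cats)
    # Gwet's AC1 chance agreement: based on the averaged marginals
    pi = {c: (pa[c] / n + pb[c] / n) / 2 for c in cats}
    pe_ac1 = sum(pi[c] * (1 - pi[c]) for c in cats) / (len(cats) - 1)
    kappa = (po - pe_kappa) / (1 - pe_kappa)
    ac1 = (po - pe_ac1) / (1 - pe_ac1)
    return kappa, ac1

a = ["IAD", "IAD", "none", "none", "IAD", "none", "none", "IAD", "none", "none"]
b = ["IAD", "IAD", "none", "IAD", "IAD", "none", "none", "IAD", "none", "none"]
print(agreement_stats(a, b))   # approx. (0.80, 0.80) for this toy example
```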
Applications of "Integrated Data Viewer'' (IDV) in the classroom
NASA Astrophysics Data System (ADS)
Nogueira, R.; Cutrim, E. M.
2006-06-01
Conventionally, weather products utilized in synoptic meteorology reduce phenomena occurring in four dimensions to a 2-dimensional form. This constitutes a road-block for non-atmospheric-science majors who need to take meteorology as a non-mathematical course complementary to their major programs. This research examines the use of the Integrated Data Viewer (IDV) as a teaching tool, as it allows a 4-dimensional representation of weather products. IDV was tested in the teaching of synoptic meteorology, weather analysis, and weather map interpretation to non-science students in the laboratory sessions of an introductory meteorology class at Western Michigan University. Comparison of student exam scores according to the laboratory teaching technique (traditional lab manual vs. IDV) was performed for short- and long-term learning. Results of the statistical analysis show that the Fall 2004 students in the IDV-based lab session retained learning. In Spring 2005, however, the exam scores did not reflect retention in learning when the IDV-based and MANUAL-based lab scores were compared (short-term learning, i.e., an exam taken one week after the lab exercise). Testing of long-term learning, with seven weeks between the two exams in Spring 2005, showed no statistically significant difference between the IDV-based group scores and the MANUAL-based group scores, although the IDV group obtained a slightly higher average exam score than the MANUAL group. Statistical testing of the principal hypothesis in this study leads to the conclusion that the IDV-based method did not prove to be a better teaching tool than the traditional paper-based method. Future studies could potentially find significant differences in the effectiveness of the manual and IDV methods under more controlled conditions; in particular, students in the control group should not be exposed to weather analysis using IDV during lecture.
A Systematic Analysis of Term Reuse and Term Overlap across Biomedical Ontologies
Kamdar, Maulik R.; Tudorache, Tania; Musen, Mark A.
2016-01-01
Reusing ontologies and their terms is a principle and best practice that most ontology development methodologies strongly encourage. Reuse comes with the promise to support the semantic interoperability and to reduce engineering costs. In this paper, we present a descriptive study of the current extent of term reuse and overlap among biomedical ontologies. We use the corpus of biomedical ontologies stored in the BioPortal repository, and analyze different types of reuse and overlap constructs. While we find an approximate term overlap between 25–31%, the term reuse is only <9%, with most ontologies reusing fewer than 5% of their terms from a small set of popular ontologies. Clustering analysis shows that the terms reused by a common set of ontologies have >90% semantic similarity, hinting that ontology developers tend to reuse terms that are sibling or parent–child nodes. We validate this finding by analysing the logs generated from a Protégé plugin that enables developers to reuse terms from BioPortal. We find most reuse constructs were 2-level subtrees on the higher levels of the class hierarchy. We developed a Web application that visualizes reuse dependencies and overlap among ontologies, and that proposes similar terms from BioPortal for a term of interest. We also identified a set of error patterns that indicate that ontology developers did intend to reuse terms from other ontologies, but that they were using different and sometimes incorrect representations. Our results stipulate the need for semi-automated tools that augment term reuse in the ontology engineering process through personalized recommendations. PMID:28819351
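A minimal sketch of the two quantities at issue, under simplified definitions (overlap as shared term identifiers, reuse as terms carried under a foreign prefix), is shown below; the term sets and prefixes are invented, and the study's own definitions are richer (e.g., label matching and semantic similarity).

```python
# Toy computation of term overlap and term reuse between two ontologies.
def overlap_and_reuse(terms_a, terms_b, prefix_a="ONTA"):
    overlap = terms_a & terms_b
    overlap_pct = 100 * len(overlap) / min(len(terms_a), len(terms_b))
    # "reuse": terms in ontology A declared under another ontology's prefix
    reused_by_a = {t for t in terms_a if not t.startswith(prefix_a + ":")}
    reuse_pct = 100 * len(reused_by_a) / len(terms_a)
    return overlap_pct, reuse_pct

ont_a = {"ONTA:0001", "ONTA:0002", "GO:0008150", "CHEBI:15377"}   # invented
ont_b = {"ONTB:0001", "ONTB:0002", "GO:0008150", "CHEBI:15377"}   # invented
print(overlap_and_reuse(ont_a, ont_b))   # (50.0, 50.0)
```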
Enhancement of Local Climate Analysis Tool
NASA Astrophysics Data System (ADS)
Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.
2012-12-01
The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on application of data and scientific techniques in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).
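As a small example of one analysis type LCAT exposes (trend analysis), the sketch below fits an ordinary least-squares trend to a hypothetical station series of annual mean temperatures; it is not LCAT code and the data are synthetic.

```python
# Not LCAT itself: ordinary least-squares trend on a synthetic annual series.
import numpy as np
from scipy.stats import linregress

years = np.arange(1981, 2011)
rng = np.random.default_rng(1)
annual_temp = 12.0 + 0.03 * (years - years[0]) + rng.normal(0, 0.4, years.size)

fit = linregress(years, annual_temp)
print(f"trend = {fit.slope * 10:.2f} degC/decade, p = {fit.pvalue:.3f}")
```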
NASA Technical Reports Server (NTRS)
Hague, D. S.; Woodbury, N. W.
1975-01-01
The MARS system is a tool for rapid prediction of aircraft or engine characteristics based on correlation-regression analysis of past designs stored in the data bases. An example of output obtained from the MARS system is given, involving the derivation of an expression for the gross weight of subsonic transport aircraft in terms of nine independent variables. The need is illustrated for careful selection of correlation variables and for continual review of the resulting estimation equations. For Vol. 1, see N76-10089.
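The following sketch illustrates the kind of correlation-regression estimating relation described, fitting a log-linear model of gross weight on a few design variables by ordinary least squares; the predictors, data, and functional form are synthetic stand-ins, not the MARS data base or its nine variables.

```python
# Illustrative only: least-squares fit of (log) gross weight on design variables.
import numpy as np

rng = np.random.default_rng(2)
n = 60
payload = rng.uniform(5e3, 40e3, n)        # hypothetical predictor 1
range_nm = rng.uniform(1e3, 6e3, n)        # hypothetical predictor 2
gross_wt = 3.0 * payload**0.8 * range_nm**0.3 * rng.lognormal(0, 0.05, n)

# log-linear model: log W = b0 + b1*log(payload) + b2*log(range)
X = np.column_stack([np.ones(n), np.log(payload), np.log(range_nm)])
coef, *_ = np.linalg.lstsq(X, np.log(gross_wt), rcond=None)
print("intercept:", coef[0], "exponents:", coef[1:])
```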
Joint symbolic dynamics for the assessment of cardiovascular and cardiorespiratory interactions
Baumert, Mathias; Javorka, Michal; Kabir, Muammar
2015-01-01
Beat-to-beat variations in heart period provide information on cardiovascular control and are closely linked to variations in arterial pressure and respiration. Joint symbolic analysis of heart period, systolic arterial pressure and respiration allows for a simple description of their shared short-term dynamics that are governed by cardiac baroreflex control and cardiorespiratory coupling. In this review, we discuss methodology and research applications. Studies suggest that analysis of joint symbolic dynamics provides a powerful tool for identifying physiological and pathophysiological changes in cardiovascular and cardiorespiratory control. PMID:25548272
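A minimal sketch of the symbolization step described in the review is given below: beat-to-beat increments of two signals are mapped to binary symbols and joined into short words whose distribution is then summarized. The series are random placeholders, not physiological recordings, and published joint-symbolic-dynamics variants differ in alphabet and word length.

```python
# Sketch of joint symbolic analysis: symbolize increments of heart period (RR)
# and systolic pressure (SBP), form joint words of length 3, and tabulate them.
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)
rr = 800 + np.cumsum(rng.normal(0, 5, 300))     # ms, synthetic
sbp = 120 + np.cumsum(rng.normal(0, 1, 300))    # mmHg, synthetic

s_rr = (np.diff(rr) > 0).astype(int)            # 1 = increase, 0 = non-increase
s_sbp = (np.diff(sbp) > 0).astype(int)

words = Counter(
    (tuple(s_rr[i:i + 3]), tuple(s_sbp[i:i + 3]))
    for i in range(len(s_rr) - 2)
)
total = sum(words.values())
for word, count in words.most_common(3):
    print(word, f"{count / total:.3f}")
```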
Martin, Anne; Connelly, Andrew; Bland, Ruth M; Reilly, John J
2017-01-01
This study aimed to systematically review and appraise evidence on the short-term (e.g. morbidity, mortality) and long-term (obesity and non-communicable diseases, NCDs) health consequences of catch-up growth (vs. no catch-up growth) in individuals with a history of low birth weight (LBW). We searched MEDLINE, EMBASE, Global Health, CINAHL Plus, Cochrane Library, ProQuest Dissertations and Theses, and reference lists. Study quality was assessed using the risk of bias assessment tool from the Agency for Health Care Research and Quality, and the evidence base was assessed using the GRADE tool. Eight studies in seven cohorts (two from high-income countries, five from low-middle-income countries) met the inclusion criteria for short-term (mean age: 13.4 months) and/or longer-term (mean age: 11.1 years) health outcomes of catch-up growth, which had occurred by 24 or 59 months. Of five studies on short-term health outcomes, three found positive associations between weight catch-up growth and body mass and/or glucose metabolism; one suggested reduced risk of hospitalisation and mortality with catch-up growth. Three studies on longer-term health outcomes found that catch-up growth was associated with higher body mass, BMI or cholesterol. GRADE assessment suggested that evidence quantity and quality were low. Catch-up growth following LBW may have benefits for the individual with LBW in the short term, and may have adverse population health impacts in the long term, but the evidence is limited. Future cohort studies could address the question of the consequences of catch-up growth following LBW more convincingly, with a view to informing future prevention of obesity and NCDs. © 2016 John Wiley & Sons Ltd.
Orlandi, Silvia; Reyes Garcia, Carlos Alberto; Bandini, Andrea; Donzelli, Gianpaolo; Manfredi, Claudia
2016-11-01
Scientific and clinical advances in perinatology and neonatology have enhanced the chances of survival of preterm and very low weight neonates. Infant cry analysis is a suitable noninvasive complementary tool to assess the neurologic state of infants, which is particularly important in the case of preterm neonates. This article aims at exploiting differences between full-term and preterm infant cry with robust automatic acoustical analysis and data mining techniques. Twenty-two acoustical parameters are estimated in more than 3000 cry units from cry recordings of 28 full-term and 10 preterm newborns. Feature extraction is performed through the BioVoice dedicated software tool, developed at the Biomedical Engineering Lab, University of Firenze, Italy. Classification and pattern recognition are based on genetic algorithms for the selection of the best attributes. Training is performed comparing four classifiers: Logistic Curve, Multilayer Perceptron, Support Vector Machine, and Random Forest, and three different testing options: full training set, 10-fold cross-validation, and 66% split. Results show that the best feature set is made up of 10 parameters capable of assessing differences between preterm and full-term newborns with about 87% accuracy. Best results are obtained with the Random Forest method (receiver operating characteristic area, 0.94). These 10 cry features might convey important additional information to assist the clinical specialist in the diagnosis and follow-up of possible delays or disorders in neurologic development due to premature birth in this extremely vulnerable population of patients. The proposed approach is a first step toward an automatic infant cry recognition system for fast and proper identification of risk in preterm babies. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
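The classification step (though not the BioVoice feature extraction) can be sketched as below: a Random Forest evaluated with 10-fold cross-validation on a matrix of 22 features per cry unit. The features and labels here are simulated, not the study's data.

```python
# Sketch of the classification step only: Random Forest with 10-fold CV on a
# simulated matrix of 22 acoustical parameters per cry unit.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 22))                                  # 22 parameters
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 1, 300) > 0).astype(int)  # preterm vs full-term (simulated)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```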
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.
Rothenfluh, Fabia; Schulz, Peter J
2018-06-14
Websites on which users can rate their physician are becoming increasingly popular, but little is known about the website quality, the information content, and the tools they offer users to assess physicians. This study assesses these aspects on physician-rating websites in German- and English-speaking countries. The objective of this study was to collect information on websites with a physician rating or review tool in 12 countries in terms of metadata, website quality (transparency, privacy and freedom of speech of physicians and patients, check mechanisms for appropriateness and accuracy of reviews, and ease of page navigation), professional information about the physician, rating scales and tools, as well as traffic rank. A systematic Web search based on a set of predefined keywords was conducted on Google, Bing, and Yahoo in August 2016. A final sample of 143 physician-rating websites was analyzed and coded for metadata, quality, information content, and the physician-rating tools. The majority of websites were registered in the United States (40/143) or Germany (25/143). The vast majority were commercially owned (120/143, 83.9%), and 69.9% (100/143) displayed some form of physician advertisement. Overall, information content (mean 9.95/25) as well as quality were low (mean 18.67/47). Websites registered in the United Kingdom obtained the highest quality scores (mean 26.50/47), followed by Australian websites (mean 21.50/47). In terms of rating tools, physician-rating websites most frequently asked users to score overall performance, punctuality, or wait time in practice. This study shows that websites that provide physician ratings should improve and communicate their quality standards, especially in terms of physician and user protection, as well as transparency. In addition, given that quality standards on physician-rating websites are low overall, the development of transparent guidelines is required. Furthermore, attention should be paid to the financial goals that the majority of physician-rating websites, especially those that are commercially owned, pursue. ©Fabia Rothenfluh, Peter J Schulz. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 14.06.2018.
RuleGO: a logical rules-based tool for description of gene groups by means of Gene Ontology
Gruca, Aleksandra; Sikora, Marek; Polanski, Andrzej
2011-01-01
Genome-wide expression profiles obtained with the use of DNA microarray technology provide an abundance of experimental data on biological and molecular processes. Such an amount of data needs to be further analyzed and interpreted in order to draw biological conclusions from the experimental results. The analysis requires a lot of experience and is usually a time-consuming process, so various annotation databases are frequently used to improve it. Here, we present RuleGO, a web-based application that allows the user to describe gene groups on the basis of logical rules that include Gene Ontology (GO) terms in their premises. The presented application allows obtaining rules that reflect the coappearance of GO terms describing the genes supported by the rules. The ontology level and the number of coappearing GO terms are adjusted automatically; the user only limits the space of possible solutions. The RuleGO application is freely available at http://rulego.polsl.pl/. PMID:21715384
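The sketch below is not the RuleGO rule-induction algorithm, but it illustrates the underlying co-occurrence signal: a hypergeometric test of whether a pair of co-appearing GO terms annotates more genes in a study group than expected by chance. The gene identifiers, annotations, and study set are invented.

```python
# Hypergeometric enrichment of a pair of co-appearing GO terms (toy data).
from scipy.stats import hypergeom

background = {f"g{i}" for i in range(1000)}
annot = {                                               # hypothetical annotations
    "GO:0006629": {f"g{i}" for i in range(0, 120)},     # e.g. lipid metabolic process
    "GO:0005739": {f"g{i}" for i in range(60, 200)},    # e.g. mitochondrion
}
study = {f"g{i}" for i in range(50, 100)} | {f"g{i}" for i in range(900, 930)}

both = annot["GO:0006629"] & annot["GO:0005739"]        # genes carrying both terms
k = len(both & study)                                   # ...that are also in the study set
p = hypergeom.sf(k - 1, len(background), len(both), len(study))
print(f"{k} of {len(study)} study genes carry both terms, p = {p:.2e}")
```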
A Data-Driven Diagnostic Framework for Wind Turbine Structures: A Holistic Approach
Bogoevska, Simona; Spiridonakos, Minas; Chatzi, Eleni; Dumova-Jovanoska, Elena; Höffer, Rudiger
2017-01-01
The complex dynamics of operational wind turbine (WT) structures challenges the applicability of existing structural health monitoring (SHM) strategies for condition assessment. At the center of Europe’s renewable energy strategic planning, WT systems call for implementation of strategies that may describe the WT behavior in its complete operational spectrum. The framework proposed in this paper relies on the symbiotic treatment of acting environmental/operational variables and the monitored vibration response of the structure. The approach aims at accurate simulation of the temporal variability characterizing the WT dynamics, and subsequently at the tracking of the evolution of this variability in a longer-term horizon. The bi-component analysis tool is applied on long-term data, collected as part of continuous monitoring campaigns on two actual operating WT structures located in different sites in Germany. The obtained data-driven structural models verify the potential of the proposed strategy for development of an automated SHM diagnostic tool. PMID:28358346
Fault tree analysis for data-loss in long-term monitoring networks.
Dirksen, J; ten Veldhuis, J A E; Schilperoort, R P S
2009-01-01
Prevention of data-loss is an important aspect in the design as well as the operational phase of monitoring networks since data-loss can seriously limit intended information yield. In the literature limited attention has been paid to the origin of unreliable or doubtful data from monitoring networks. Better understanding of causes of data-loss points out effective solutions to increase data yield. This paper introduces FTA as a diagnostic tool to systematically deduce causes of data-loss in long-term monitoring networks in urban drainage systems. In order to illustrate the effectiveness of FTA, a fault tree is developed for a monitoring network and FTA is applied to analyze the data yield of a UV/VIS submersible spectrophotometer. Although some of the causes of data-loss cannot be recovered because the historical database of metadata has been updated infrequently, the example points out that FTA still is a powerful tool to analyze the causes of data-loss and provides useful information on effective data-loss prevention.
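A minimal numerical sketch of the idea is given below: with independent basic events, OR and AND gates combine probabilities as shown, giving the probability of the top event (data loss). The tree structure and probabilities are invented for illustration and are not taken from the paper.

```python
# Minimal fault-tree evaluation for monitor data loss (invented structure).
# Assuming independent basic events: OR gates combine as 1 - prod(1 - p),
# AND gates as prod(p).
from math import prod

def or_gate(*p):  return 1 - prod(1 - x for x in p)
def and_gate(*p): return prod(p)

p_sensor_fouling = 0.05   # UV/VIS optics fouled
p_power_failure  = 0.02
p_comms_outage   = 0.03
p_logger_full    = 0.01
p_no_maintenance = 0.30   # maintenance visit missed

# data loss if (fouling AND no timely maintenance) OR power OR comms OR logger
p_top = or_gate(and_gate(p_sensor_fouling, p_no_maintenance),
                p_power_failure, p_comms_outage, p_logger_full)
print(f"P(data loss in interval) ~= {p_top:.3f}")
```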
Trends and Issues in Fuzzy Control and Neuro-Fuzzy Modeling
NASA Technical Reports Server (NTRS)
Chiu, Stephen
1996-01-01
Everyday experience in building and repairing things around the home has taught us the importance of using the right tool for the right job. Although we tend to think of a 'job' in broad terms, such as 'build a bookcase,' we understand well that the 'right job' associated with each 'right tool' is typically a narrowly bounded subtask, such as 'tighten the screws.' Unfortunately, we often lose sight of this principle when solving engineering problems; we treat a broadly defined problem, such as controlling or modeling a system, as a narrow one that has a single 'right tool' (e.g., linear analysis, fuzzy logic, neural network). We need to recognize that a typical real-world problem contains a number of different sub-problems, and that a truly optimal solution (the best combination of cost, performance, and features) is obtained by applying the right tool to the right sub-problem. Here I share some of my perspectives on what constitutes the 'right job' for fuzzy control and describe recent advances in neuro-fuzzy modeling to illustrate and to motivate the synergistic use of different tools.
GO2PUB: Querying PubMed with semantic expansion of gene ontology terms
2012-01-01
Background: With the development of high throughput methods of gene analyses, there is a growing need for mining tools to retrieve relevant articles in PubMed. As PubMed grows, literature searches become more complex and time-consuming. Automated search tools with good precision and recall are necessary. We developed GO2PUB to automatically enrich PubMed queries with gene names, symbols and synonyms annotated by a GO term of interest or one of its descendants. Results: GO2PUB enriches PubMed queries based on selected GO terms and keywords. It processes the result and displays the PMID, title, authors, abstract and bibliographic references of the articles. Gene names, symbols and synonyms that have been generated as extra keywords from the GO terms are also highlighted. GO2PUB is based on a semantic expansion of PubMed queries using the semantic inheritance between terms through the GO graph. Two experts manually assessed the relevance of GO2PUB, GoPubMed and PubMed on three queries about lipid metabolism. Experts' agreement was high (kappa = 0.88). GO2PUB returned 69% of the relevant articles, GoPubMed: 40% and PubMed: 29%. GO2PUB and GoPubMed have 17% of their results in common, corresponding to 24% of the total number of relevant results. 70% of the articles returned by more than one tool were relevant. 36% of the relevant articles were returned only by GO2PUB, 17% only by GoPubMed and 14% only by PubMed. To determine whether these results can be generalized, we generated twenty queries based on random GO terms with a granularity similar to those of the first three queries and compared the proportions of GO2PUB and GoPubMed results. These were 77% and 40%, respectively, for the first queries, and 70% and 38% for the random queries. The two experts also assessed the relevance of seven of the twenty queries (the three related to lipid metabolism and four related to other domains). Expert agreement was high (0.93 and 0.8). GO2PUB and GoPubMed performances were similar to those of the first queries. Conclusions: We demonstrated that the use of genes annotated by either GO terms of interest or a descendant of these GO terms yields some relevant articles ignored by other tools. The comparison of GO2PUB, based on semantic expansion, with GoPubMed, based on text mining techniques, showed that both tools are complementary. The analysis of the randomly-generated queries suggests that the results obtained about lipid metabolism can be generalized to other biological processes. GO2PUB is available at http://go2pub.genouest.org. PMID:22958570
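A toy sketch of the semantic-expansion idea (not the GO2PUB implementation) follows: starting from one GO term, collect its descendants in a small hand-made fragment of the GO graph, gather gene symbols and synonyms annotated to any of them, and assemble a PubMed-style boolean query. All terms, genes, and synonyms below are invented placeholders.

```python
# Toy semantic expansion: GO term -> descendants -> annotated genes -> query.
go_children = {                      # tiny hypothetical is-a fragment
    "GO:0006629": ["GO:0016042", "GO:0008610"],
    "GO:0016042": [],
    "GO:0008610": [],
}
gene_annotations = {                 # hypothetical gene -> GO annotations
    "LPL": {"GO:0016042"}, "FASN": {"GO:0008610"}, "TP53": {"GO:0006915"},
}
gene_synonyms = {"LPL": ["lipoprotein lipase"], "FASN": ["fatty acid synthase"]}

def descendants(term):
    out = {term}
    for child in go_children.get(term, []):
        out |= descendants(child)
    return out

terms = descendants("GO:0006629")
genes = [g for g, annot in gene_annotations.items() if annot & terms]
gene_clause = " OR ".join(
    f'"{name}"' for g in genes for name in [g, *gene_synonyms.get(g, [])]
)
query = f"({gene_clause}) AND (inflammation)"     # user keyword added
print(query)
```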
Aerospace Systems Design in NASA's Collaborative Engineering Environment
NASA Technical Reports Server (NTRS)
Monell, Donald W.; Piland, William M.
1999-01-01
Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g. manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often lead to the inability of assessing critical programmatic and technical issues (e.g., cost risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.
Aerospace Systems Design in NASA's Collaborative Engineering Environment
NASA Technical Reports Server (NTRS)
Monell, Donald W.; Piland, William M.
2000-01-01
Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operation). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability of assessing critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.
Aerospace Systems Design in NASA's Collaborative Engineering Environment
NASA Astrophysics Data System (ADS)
Monell, Donald W.; Piland, William M.
2000-07-01
Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability of assessing critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.
A Knowledge Portal and Collaboration Environment for the Earth Sciences
NASA Astrophysics Data System (ADS)
D'Agnese, F. A.
2008-12-01
Earth Knowledge is developing a web-based 'Knowledge Portal and Collaboration Environment' that will serve as the information-technology-based foundation of a modular Internet-based Earth-Systems Monitoring, Analysis, and Management Tool. This 'Knowledge Portal' is essentially a 'mash-up' of web-based and client-based tools and services that support on-line collaboration, community discussion, and broad public dissemination of earth and environmental science information in a wide-area distributed network. In contrast to specialized knowledge-management or geographic-information systems developed for long-term and incremental scientific analysis, this system will exploit familiar software tools using industry standard protocols, formats, and APIs to discover, process, fuse, and visualize existing environmental datasets using Google Earth and Google Maps. An early form of these tools and services is being used by Earth Knowledge to facilitate the investigations and conversations of scientists, resource managers, and citizen-stakeholders addressing water resource sustainability issues in the Great Basin region of the desert southwestern United States. These ongoing projects will serve as use cases for the further development of this information-technology infrastructure. This 'Knowledge Portal' will accelerate the deployment of Earth-system data and information into an operational knowledge management system that may be used by decision-makers concerned with stewardship of water resources in the American Desert Southwest.
He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison
2018-01-12
Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) support different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles reuse existing terms and semantic relations from reliable ontologies, develop and apply well-established ontology design patterns (ODPs), and involve community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Dean N.
2015-01-27
The climate and weather data science community met December 9–11, 2014, in Livermore, California, for the fourth annual Earth System Grid Federation (ESGF) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Conference, hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT remain global collaborations committed to developing a new generation of open-source software infrastructure that provides distributed access and analysis to simulated and observed data from the climate and weather communities. The tools and infrastructure created under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change. In addition, the F2F conference fosters a stronger climate and weather data science community and facilitates a stronger federated software infrastructure. The 2014 F2F conference detailed the progress of ESGF, UV-CDAT, and other community efforts over the year and set new priorities and requirements for existing and impending national and international community projects, such as the Coupled Model Intercomparison Project Phase Six. Specifically discussed at the conference were project capabilities and enhancement needs for data distribution, analysis, visualization, hardware and network infrastructure, standards, and resources.
Chrimes, Dillon; Kitos, Nicole R; Kushniruk, Andre; Mann, Devin M
2014-09-01
Usability testing can be used to evaluate human-computer interaction (HCI) and communication in shared decision making (SDM) for patient-provider behavioral change and behavioral contracting. Traditional evaluations of usability using scripted or mock patient scenarios with think-aloud protocol analysis provide a way to identify HCI issues. In this paper we describe the application of these methods in the evaluation of the Avoiding Diabetes Thru Action Plan Targeting (ADAPT) tool, and test the usability of the tool to support the ADAPT framework for integrated care counseling of pre-diabetes. The think-aloud protocol analysis typically does not provide an assessment of how patient-provider interactions are effected in "live" clinical workflow or whether a tool is successful. Therefore, "Near-live" clinical simulations involving applied simulation methods were used to compliment the think-aloud results. This complementary usability technique was used to test the end-user HCI and tool performance by more closely mimicking the clinical workflow and capturing interaction sequences along with assessing the functionality of computer module prototypes on clinician workflow. We expected this method to further complement and provide different usability findings as compared to think-aloud analysis. Together, this mixed method evaluation provided comprehensive and realistic feedback for iterative refinement of the ADAPT system prior to implementation. The study employed two phases of testing of a new interactive ADAPT tool that embedded an evidence-based shared goal setting component into primary care workflow for dealing with pre-diabetes counseling within a commercial physician office electronic health record (EHR). Phase I applied usability testing that involved "think-aloud" protocol analysis of eight primary care providers interacting with several scripted clinical scenarios. Phase II used "near-live" clinical simulations of five providers interacting with standardized trained patient actors enacting the clinical scenario of counseling for pre-diabetes, each of whom had a pedometer that recorded the number of steps taken over a week. In both phases, all sessions were audio-taped and motion screen-capture software was activated for onscreen recordings. Transcripts were coded using iterative qualitative content analysis methods. In Phase I, the impact of the components and layout of ADAPT on user's Navigation, Understandability, and Workflow were associated with the largest volume of negative comments (i.e. approximately 80% of end-user commentary), while Usability and Content of ADAPT were representative of more positive than negative user commentary. The heuristic category of Usability had a positive-to-negative comment ratio of 2.1, reflecting positive perception of the usability of the tool, its functionality, and overall co-productive utilization of ADAPT. However, there were mixed perceptions about content (i.e., how the information was displayed, organized and described in the tool). In Phase II, the duration of patient encounters was approximately 10 min with all of the Patient Instructions (prescriptions) and behavioral contracting being activated at the end of each visit. Upon activation, providers accepted the pathway prescribed by the tool 100% of the time and completed all the fields in the tool in the simulation cases. Only 14% of encounter time was spent using the functionality of the ADAPT tool in terms of keystrokes and entering relevant data. 
The rest of the time was spent on communication and dialog to populate the patient instructions. In all cases, the interaction sequence of reviewing and discussing the patient's exercise and diet was linked to the functionality of the ADAPT tool in terms of monitoring, response-efficacy, self-efficacy, and negotiation in the patient-provider dialog. There was a change from one-way dialog to two-way dialog and negotiation that ended in a behavioral contract. This change demonstrated the tool's sequence, which supported recording current exercise and diet followed by a diet and exercise goal setting procedure to reduce the risk of diabetes onset. This study demonstrated that "think-aloud" protocol analysis with "near-live" clinical simulations provided a successful usability evaluation of a new primary care pre-diabetes shared goal setting tool. Each phase of the study provided complementary observations on problems with the new onscreen tool and was used to show the influence of the ADAPT framework on the usability, workflow integration, and communication between the patient and provider. The think-aloud tests with the provider showed the tool can be used according to the ADAPT framework (exercise-to-diet behavior change and tool utilization), while the clinical simulations revealed the ADAPT framework to realistically support patient-provider communication to obtain a behavioral change contract. SDM interactions and mechanisms affecting protocol-based care can be more completely captured by combining "near-live" clinical simulations with traditional "think-aloud" analysis, which augments clinician utilization. More analysis is required to verify whether the rich communication actions found in Phase II complement clinical workflows. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Data and Tools | Energy Analysis | NREL
NREL develops energy analysis data and tools. Collections include: Data Products; Technology and Performance Analysis Tools; Energy Systems Analysis Tools; Economic and Financial Analysis Tools.
Odukoya, Jonathan A; Adekeye, Olajide; Igbinoba, Angie O; Afolabi, A
2018-01-01
Teachers and students worldwide often dance to the tune of tests and examinations. Assessments are powerful tools for catalyzing the achievement of educational goals, especially if done rightly. One of the tools for 'doing it rightly' is item analysis. The core objective of this study, therefore, was to ascertain the item difficulty and distractive indices of the university-wide courses. A range of 112-1956 undergraduate students participated in this study. With the use of secondary data, the ex-post facto design was adopted for this project. In virtually all cases, the majority of the items (ranging between 65% and 97% of the 70 items fielded in each course) did not meet psychometric standards in terms of difficulty and distractive indices and consequently needed to be moderated or deleted. Considering the importance of these courses, the need to apply item analysis when developing these tests was emphasized.
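For illustration, the two classical item statistics reported (difficulty and distractor functioning) can be computed as below from a simulated set of responses to one multiple-choice item; the 5% cut-off for a functioning distractor is a common rule of thumb, not a value taken from the study.

```python
# Item difficulty and distractor functioning for one simulated MCQ item.
import numpy as np

rng = np.random.default_rng(5)
options = np.array(["A", "B", "C", "D"])
key = "B"                                            # correct answer for this item
responses = rng.choice(options, size=500, p=[0.15, 0.55, 0.25, 0.05])

difficulty = np.mean(responses == key)               # proportion answering correctly
print(f"difficulty index p = {difficulty:.2f}")
for opt in options[options != key]:
    share = np.mean(responses == opt)
    flag = "" if share >= 0.05 else "  <- non-functioning distractor (<5%)"
    print(f"distractor {opt}: chosen by {share:.2%}{flag}")
```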
Khan, Arif Ul Maula; Torelli, Angelo; Wolf, Ivo; Gretz, Norbert
2018-05-08
In biological assays, automated cell/colony segmentation and counting is imperative owing to huge image sets. Problems occurring due to drifting image acquisition conditions, background noise and high variation in colony features in experiments demand a user-friendly, adaptive and robust image processing/analysis method. We present AutoCellSeg (based on MATLAB), which implements a supervised automatic and robust image segmentation method. AutoCellSeg utilizes multi-thresholding aided by a feedback-based watershed algorithm taking segmentation plausibility criteria into account. It is usable in different operation modes and intuitively enables the user to select object features interactively for supervised image segmentation. It allows the user to correct results with a graphical interface. This publicly available tool outperforms tools like OpenCFU and CellProfiler in terms of accuracy and provides many additional useful features for end-users.
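A rough Python/scikit-image analogue of the described pipeline is sketched below (AutoCellSeg itself is MATLAB-based and adds supervised feature selection and plausibility feedback): Otsu thresholding, a distance-transform watershed to split touching colonies, and a count of labelled objects on a synthetic plate image.

```python
# Illustrative colony counting: threshold, distance-transform watershed, count.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu, gaussian
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# synthetic image: three bright blobs, two of them touching
img = np.zeros((120, 120))
for r, c in [(40, 40), (40, 60), (85, 85)]:
    rr, cc = np.ogrid[:120, :120]
    img += np.exp(-((rr - r) ** 2 + (cc - c) ** 2) / (2 * 9.0 ** 2))
img = gaussian(img, 1) + np.random.default_rng(6).normal(0, 0.01, img.shape)

binary = img > threshold_otsu(img)
distance = ndi.distance_transform_edt(binary)
coords = peak_local_max(distance, min_distance=10, labels=binary)
seeds = np.zeros(distance.shape, dtype=bool)
seeds[tuple(coords.T)] = True
markers, _ = ndi.label(seeds)
labels = watershed(-distance, markers, mask=binary)
print("colonies counted:", labels.max())
```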
Integrating interface slicing into software engineering processes
NASA Technical Reports Server (NTRS)
Beck, Jon
1993-01-01
Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Jerden, James
A mechanistic source term (MST) calculation attempts to realistically assess the transport and release of radionuclides from a reactor system to the environment during a specific accident sequence. The U.S. Nuclear Regulatory Commission (NRC) has repeatedly stated its expectation that advanced reactor vendors will utilize an MST during the U.S. reactor licensing process. As part of a project to examine possible impediments to sodium fast reactor (SFR) licensing in the U.S., an analysis was conducted regarding the current capabilities to perform an MST for a metal fuel SFR. The purpose of the project was to identify and prioritize any gaps in current computational tools, and the associated database, for the accurate assessment of an MST. The results of the study demonstrate that an SFR MST is possible with current tools and data, but several gaps exist that may lead to possibly unacceptable levels of uncertainty, depending on the goals of the MST analysis.
Paliwoda, Michelle; New, Karen; Bogossian, Fiona
2016-09-01
All newborns are at risk of deterioration as a result of failing to make the transition to extra uterine life. Signs of deterioration can be subtle and easily missed. It has been postulated that the use of an Early Warning Tool may assist clinicians in recognising and responding to signs of deterioration earlier in neonates, thereby preventing a serious adverse event. To examine whether observations from a Standard Observation Tool, applied to three neonatal Early Warning Tools, would hypothetically trigger an escalation of care more frequently than actual escalation of care using the Standard Observation Tool. A retrospective case-control study. A maternity unit in a tertiary public hospital in Australia. Neonates born in 2013 of greater than or equal to 34(+0) weeks gestation, admitted directly to the maternity ward from their birthing location and whose subsequent deterioration required admission to the neonatal unit, were identified as cases from databases of the study hospital. Each case was matched with three controls, inborn during the same period and who did not experience deterioration and neonatal unit admission. Clinical and physiological data recorded on a Standard Observation Tool, from time of admission to the maternity ward, for cases and controls were charted onto each of three Early Warning Tools. The primary outcome was whether the tool 'triggered an escalation of care'. Descriptive statistics (n, %, Mean and SD) were employed. Cases (n=26) comprised late preterm, early term and post-term neonates and matched by gestational age group with 3 controls (n=78). Overall, the Standard Observation Tool triggered an escalation of care for 92.3% of cases compared to the Early Warning Tools; New South Wales Health 80.8%, United Kingdom Newborn Early Warning Chart 57.7% and The Australian Capital Territory Neonatal Early Warning Score 11.5%. Subgroup analysis by gestational age found differences between the tools in hypothetically triggering an escalation of care. The Standard Observation Tool triggered an escalation of care more frequently than the Early Warning Tools, which may be as a result of behavioural data captured on the Standard Observation Tool and escalated, which could not be on the Early Warning Tools. Findings demonstrate that a single tool applied to all gestational age ranges may not be effective in identifying early deterioration or may over trigger an escalation of care. Further research is required into the sensitivity and specificity of Early Warning Tools in neonatal sub-populations. Copyright © 2016 Elsevier Ltd. All rights reserved.
Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis
McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.
2009-01-01
Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest. Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. 
Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.
Customisation of the exome data analysis pipeline using a combinatorial approach.
Pattnaik, Swetansu; Vaidyanathan, Srividya; Pooja, Durgad G; Deepak, Sa; Panda, Binay
2012-01-01
The advent of next generation sequencing (NGS) technologies has revolutionised the way biologists produce, analyse and interpret data. Although NGS platforms provide a cost-effective way to discover genome-wide variants from a single experiment, variants discovered by NGS need follow-up validation due to the high error rates associated with various sequencing chemistries. Recently, whole exome sequencing has been proposed as an affordable option compared to whole genome runs, but it still requires follow-up validation of all the novel exomic variants. Customarily, a consensus approach is used to overcome the systematic errors inherent to the sequencing technology, alignment and post-alignment variant detection algorithms. However, the aforementioned approach warrants the use of multiple sequencing chemistries, multiple alignment tools and multiple variant callers, which may not be viable in terms of time and money for individual investigators with limited informatics know-how. Biologists often lack the requisite training to deal with the huge amount of data produced by NGS runs and face difficulty in choosing from the list of freely available analytical tools for NGS data analysis. Hence, there is a need to customise the NGS data analysis pipeline to preferentially retain true variants by minimising the incidence of false positives and to make the choice of the right analytical tools easier. To this end, we have sampled different freely available tools used at the alignment and post-alignment stage, suggesting the use of the most suitable combination determined by a simple framework of pre-existing metrics to create significant datasets.
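A minimal sketch of the consensus idea described in this abstract, assuming the caller outputs have already been reduced to plain (chrom, pos, ref, alt) tuples; VCF parsing and the study's actual metric framework are omitted. Variants reported by more than one caller are retained, which trims caller-specific false positives.

```python
from collections import Counter

def consensus_variants(callsets, min_support=2):
    """Keep variants reported by at least `min_support` callers.

    callsets -- list of iterables of (chrom, pos, ref, alt) tuples,
                one iterable per variant caller.
    """
    counts = Counter()
    for calls in callsets:
        counts.update(set(calls))          # de-duplicate within a caller
    return {v for v, n in counts.items() if n >= min_support}

# Hypothetical call sets from three callers
caller_a = [("chr1", 12345, "A", "G"), ("chr2", 999, "C", "T")]
caller_b = [("chr1", 12345, "A", "G"), ("chr3", 42, "G", "A")]
caller_c = [("chr1", 12345, "A", "G"), ("chr2", 999, "C", "T")]

print(consensus_variants([caller_a, caller_b, caller_c]))
# {('chr1', 12345, 'A', 'G'), ('chr2', 999, 'C', 'T')}
```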
Yu, Xiaoyu; Reva, Oleg N
2018-01-01
Modern phylogenetic studies may benefit from the analysis of complete genome sequences of various microorganisms. Evolutionary inferences based on genome-scale analysis are believed to be more accurate than the gene-based alternative. However, the computational complexity of current phylogenomic procedures, the inappropriateness of standard phylogenetic tools for processing genome-wide data, and the lack of reliable substitution models which correlate with alignment-free phylogenomic approaches deter microbiologists from using these opportunities. For example, the super-matrix and super-tree approaches of phylogenomics use multiple integrated genomic loci or individual gene-based trees to infer an overall consensus tree. However, these approaches potentially multiply errors of gene annotation and sequence alignment, not to mention the computational complexity and laboriousness of the methods. In this article, we demonstrate that the annotation- and alignment-free comparison of genome-wide tetranucleotide frequencies, termed oligonucleotide usage patterns (OUPs), allowed a fast and reliable inference of phylogenetic trees. These were congruent to the corresponding whole genome super-matrix trees in terms of tree topology when compared with other known approaches including 16S ribosomal RNA and GyrA protein sequence comparison, complete genome-based MAUVE, and CVTree methods. A Web-based program to perform the alignment-free OUP-based phylogenomic inferences was implemented at http://swphylo.bi.up.ac.za/. Applicability of the tool was tested on different taxa from subspecies to intergeneric levels. Distinguishing between closely related taxonomic units may be reinforced by providing the program with alignments of marker protein sequences, e.g., GyrA. PMID:29511354
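The alignment-free OUP approach summarized above can be illustrated with a small sketch, not the authors' implementation: each genome is reduced to a normalized tetranucleotide frequency vector and pairwise distances between those vectors are computed; the resulting distance matrix could then be fed to any standard tree-building routine (e.g. neighbour joining).

```python
from itertools import product
import math

TETRAMERS = ["".join(p) for p in product("ACGT", repeat=4)]

def oup_profile(seq):
    """Normalized tetranucleotide frequency vector of a genome sequence."""
    seq = seq.upper()
    counts = dict.fromkeys(TETRAMERS, 0)
    for i in range(len(seq) - 3):
        kmer = seq[i:i + 4]
        if kmer in counts:          # skip windows containing ambiguous bases
            counts[kmer] += 1
    total = sum(counts.values()) or 1
    return [counts[k] / total for k in TETRAMERS]

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Toy "genomes"; real input would be complete genome sequences.
genomes = {"g1": "ACGTACGTGGCCAATT" * 50,
           "g2": "ACGTACGTGGCCAATA" * 50,
           "g3": "TTTTGGGGCCCCAAAA" * 50}
profiles = {name: oup_profile(s) for name, s in genomes.items()}
for a in genomes:
    for b in genomes:
        if a < b:
            print(a, b, round(euclidean(profiles[a], profiles[b]), 4))
```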
Eye movement analysis of reading from computer displays, eReaders and printed books.
Zambarbieri, Daniela; Carniglia, Elena
2012-09-01
To compare eye movements during silent reading of three eBooks and a printed book. The three different eReading tools were a desktop PC, iPad tablet and Kindle eReader. Video-oculographic technology was used for recording eye movements. In the case of reading from the computer display the recordings were made by a video camera placed below the computer screen, whereas for reading from the iPad tablet, eReader and printed book the recording system was worn by the subject and had two cameras: one for recording the movement of the eyes and the other for recording the scene in front of the subject. Data analysis provided quantitative information in terms of number of fixations, their duration, and the direction of the movement, the latter to distinguish between fixations and regressions. Mean fixation duration was different only in reading from the computer display, and was similar for the Tablet, eReader and printed book. The percentage of regressions with respect to the total amount of fixations was comparable for eReading tools and the printed book. The analysis of eye movements during reading an eBook from different eReading tools suggests that subjects' reading behaviour is similar to reading from a printed book. © 2012 The College of Optometrists.
Novel presentational approaches were developed for reporting network meta-analysis.
Tan, Sze Huey; Cooper, Nicola J; Bujkiewicz, Sylwia; Welton, Nicky J; Caldwell, Deborah M; Sutton, Alexander J
2014-06-01
To present graphical tools for reporting network meta-analysis (NMA) results aiming to increase the accessibility, transparency, interpretability, and acceptability of NMA analyses. The key components of NMA results were identified based on recommendations by agencies such as the National Institute for Health and Care Excellence (United Kingdom). Three novel graphs were designed to amalgamate the identified components using familiar graphical tools such as the bar, line, or pie charts and adhering to good graphical design principles. Three key components for presentation of NMA results were identified, namely relative effects and their uncertainty, probability of an intervention being best, and between-study heterogeneity. Two of the three graphs developed present results (for each pairwise comparison of interventions in the network) obtained from both NMA and standard pairwise meta-analysis for easy comparison. They also include options to display the probability best, ranking statistics, heterogeneity, and prediction intervals. The third graph presents rankings of interventions in terms of their effectiveness to enable clinicians to easily identify "top-ranking" interventions. The graphical tools presented can display results tailored to the research question of interest, and targeted at a whole spectrum of users from the technical analyst to the nontechnical clinician. Copyright © 2014 Elsevier Inc. All rights reserved.
Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.
Smith, Anne E; Gans, Will
2015-03-01
The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5 ). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.
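A toy illustration of the sensitivity to the risk coefficient discussed above, using the log-linear concentration-response form commonly applied in PM2.5 health impact assessment; the coefficient values, baseline rate and population below are invented for illustration and are not taken from BenMAP or the cited literature.

```python
import math

def attributable_deaths(beta, delta_pm, baseline_rate, population):
    """Log-linear health impact function: y0 * (1 - exp(-beta * dC)) * Pop."""
    return baseline_rate * (1.0 - math.exp(-beta * delta_pm)) * population

# Illustrative (not BenMAP) inputs
delta_pm = 2.0            # ug/m3 reduction in annual-mean PM2.5
baseline_rate = 0.008     # annual all-cause deaths per person
population = 1_000_000

# Sensitivity of the estimate to the assumed risk coefficient (per ug/m3)
for beta in (0.000, 0.003, 0.006, 0.012):
    avoided = attributable_deaths(beta, delta_pm, baseline_rate, population)
    print(f"beta={beta:.3f}  avoided deaths ~ {avoided:.0f}")
```

The spread of results across the assumed coefficients is the kind of epistemic sensitivity the article argues a risk tool should surface automatically.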
Using enterprise architecture to analyse how organisational structure impact motivation and learning
NASA Astrophysics Data System (ADS)
Närman, Pia; Johnson, Pontus; Gingnell, Liv
2016-06-01
When technology, environment, or strategies change, organisations need to adjust their structures accordingly. These structural changes do not always enhance organisational performance as intended, partly because organisational developers do not understand the consequences of structural changes on performance. This article presents a model-based framework for quantitative analysis of the effect of organisational structure on organisational performance in terms of employee motivation and learning. The model is based on Mintzberg's work on organisational structure. The quantitative analysis is formalised using the Object Constraint Language (OCL) and the Unified Modelling Language (UML) and implemented in an enterprise architecture tool.
RNA-Puzzles: A CASP-like evaluation of RNA three-dimensional structure prediction
Cruz, José Almeida; Blanchet, Marc-Frédérick; Boniecki, Michal; Bujnicki, Janusz M.; Chen, Shi-Jie; Cao, Song; Das, Rhiju; Ding, Feng; Dokholyan, Nikolay V.; Flores, Samuel Coulbourn; Huang, Lili; Lavender, Christopher A.; Lisi, Véronique; Major, François; Mikolajczak, Katarzyna; Patel, Dinshaw J.; Philips, Anna; Puton, Tomasz; Santalucia, John; Sijenyi, Fredrick; Hermann, Thomas; Rother, Kristian; Rother, Magdalena; Serganov, Alexander; Skorupski, Marcin; Soltysinski, Tomasz; Sripakdeevong, Parin; Tuszynska, Irina; Weeks, Kevin M.; Waldsich, Christina; Wildauer, Michael; Leontis, Neocles B.; Westhof, Eric
2012-01-01
We report the results of a first, collective, blind experiment in RNA three-dimensional (3D) structure prediction, encompassing three prediction puzzles. The goals are to assess the leading edge of RNA structure prediction techniques; compare existing methods and tools; and evaluate their relative strengths, weaknesses, and limitations in terms of sequence length and structural complexity. The results should give potential users insight into the suitability of available methods for different applications and facilitate efforts in the RNA structure prediction community in ongoing efforts to improve prediction tools. We also report the creation of an automated evaluation pipeline to facilitate the analysis of future RNA structure prediction exercises. PMID:22361291
FT-NIR: A Tool for Process Monitoring and More.
Martoccia, Domenico; Lutz, Holger; Cohen, Yvan; Jerphagnon, Thomas; Jenelten, Urban
2018-03-30
With ever-increasing pressure to optimize product quality, to reduce cost and to safely increase production output from existing assets, all combined with regular changes in terms of feedstock and operational targets, process monitoring with traditional instruments reaches its limits. One promising answer to these challenges is in-line, real time process analysis with spectroscopic instruments, and above all Fourier-Transform Near Infrared spectroscopy (FT-NIR). Its potential to afford decreased batch cycle times, higher yields, reduced rework and minimized batch variance is presented and application examples in the field of fine chemicals are given. We demonstrate that FT-NIR can be an efficient tool for improved process monitoring and optimization, effective process design and advanced process control.
Using an ICT Tool as a Solution for the Educational and Social Needs of Long-Term Sick Adolescents
ERIC Educational Resources Information Center
Zhu, Chang; Van Winkel, Lies
2015-01-01
This research investigates the role of an ICT tool for meeting the educational and social needs of long-term sick adolescents. Both surveys and interviews were conducted in this study. The participants of this study were sick school students between 12-19 years old. The interviewed participants had used the ICT-supporting tool for three months to…
1998-12-15
A study analyzing battlefield visualization (BV) as a component of information dominance and superiority. This study outlines basic requirements for effective BV in terms of terrain data, information systems (synthetic environment; COA development and analysis tools) and BV development management, with a focus on technology insertion strategies. This study also reports on existing BV systems and provides 16 recommendations for Army BV support efforts, including interested organization, funding levels and duration of effort for each recommended action.
1998-03-01
traveling public, air carriers, and persons employed by or conducting business at public airports. Subject terms: Airport Security, Federal... Topics covered include sterile areas, exclusive areas, security alert levels, and airport security tools such as electronic detection systems; acronyms used include Airport Security Coordinator, ASP (Airport Security Program), BIS (Biometric Identification System), CCTV (Closed Circuit Television), and CJIS (Criminal Justice Information...).
Gyehee Lee; Liping A. Cai; Everette Mills; Joseph T. O' Leary
2002-01-01
The Internet plays a significant role in generating new business and facilitating customers' need for a better way to plan and book their trips. From a marketer's perspective, one of the seemingly "fatal attractions" of the Internet for DMOs is that it can be an extremely effective tool in terms of both cost effectiveness and market penetration compared...
ERIC Educational Resources Information Center
Baran, Medine
2016-01-01
This study was carried out to determine high school students' perceptions of the courses of Physics and the factors influential on their perceptions with respect to gender. The research sample included 154 high school students (F:78; M:76). In the study, as the data collection tool, a structured interview form was used. The data collected in the…
Educational and Scientific Applications of Climate Model Diagnostic Analyzer
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.
2016-12-01
Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threading computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed the educational and scientific applications of CMDA. Educationally, CMDA supported the summer school of the JPL Center for Climate Sciences for three years since 2014. In the summer school, the students work on group research projects where CMDA provides datasets and analysis tools. Each student is assigned to a virtual machine with CMDA installed in Amazon Web Services. A provenance management system for CMDA is developed to keep track of students' usage of CMDA, and to recommend datasets and analysis tools for their research topic. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case developed is described and listed in terms of a scientific goal, datasets used, the analysis tools used, scientific results discovered from the use case, an analysis result such as output plots and data files, and a link to the exact analysis service call with all the input arguments filled. For example, one science use case is the evaluation of the NCAR CAM5 model with MODIS total cloud fraction. The analysis service used is Difference Plot Service of Two Variables, and the datasets used are NCAR CAM total cloud fraction and MODIS total cloud fraction. The scientific highlight of the use case is that the CAM5 model overall does a fairly decent job at simulating total cloud cover, though it simulates too few clouds, especially near and offshore of the eastern ocean basins where low clouds are dominant.
Trajectory Design for the Transiting Exoplanet Survey Satellite (TESS)
NASA Technical Reports Server (NTRS)
Dichmann, Donald J.; Parker, Joel; Williams, Trevor; Mendelsohn, Chad
2014-01-01
The Transiting Exoplanet Survey Satellite (TESS) is a National Aeronautics and Space Administration (NASA) mission launching in 2017. TESS will travel in a highly eccentric orbit around Earth, with initial perigee radius near 17 Earth radii (Re) and apogee radius near 59 Re. The orbit period is near 2:1 resonance with the Moon, with apogee nearly 90 degrees out-of-phase with the Moon, in a configuration that has been shown to be operationally stable. TESS will execute phasing loops followed by a lunar flyby, with a final maneuver to achieve 2:1 resonance with the Moon. The goals of a resonant orbit with long-term stability, short eclipses and limited oscillations of perigee present significant challenges to the trajectory design. To rapidly assess launch opportunities, we adapted the SWM76 launch window tool to assess the TESS mission constraints. To understand the long-term dynamics of such a resonant orbit in the Earth-Moon system we employed Dynamical Systems Theory in the Circular Restricted 3-Body Problem (CR3BP). For precise trajectory analysis we use a high-fidelity model and multiple shooting in the General Mission Analysis Tool (GMAT) to optimize the maneuver delta-V and meet mission constraints. Finally we describe how the techniques we have developed can be applied to missions with similar requirements.
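As a back-of-the-envelope check on the resonance geometry described above (standard two-body formulas only, not the GMAT or SWM76 analysis): a 2:1 lunar resonant orbit has a period of roughly half the lunar sidereal month, and Kepler's third law then puts its semi-major axis near 38 Earth radii, consistent with the quoted 17 Re perigee and 59 Re apogee.

```python
import math

MU_EARTH = 398600.4418                        # km^3/s^2
R_EARTH = 6378.137                            # km
LUNAR_SIDEREAL_PERIOD = 27.321661 * 86400.0   # s

# Period of a 2:1 resonant orbit (two spacecraft revolutions per lunar month)
period = LUNAR_SIDEREAL_PERIOD / 2.0

# Kepler's third law: a = (mu * P^2 / (4*pi^2))^(1/3)
a = (MU_EARTH * period**2 / (4.0 * math.pi**2)) ** (1.0 / 3.0)
print(f"semi-major axis ~ {a:.0f} km = {a / R_EARTH:.1f} Re")

# With perigee near 17 Re and apogee near 59 Re, a = (rp + ra)/2 = 38 Re,
# which matches the resonant value computed above.
print(f"(17 + 59)/2 = {(17 + 59) / 2} Re")
```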
The NASA In-Space Propulsion Technology Project, Products, and Mission Applicability
NASA Technical Reports Server (NTRS)
Anderson, David J.; Pencil, Eric; Liou, Larry; Dankanich, John; Munk, Michelle M.; Kremic, Tibor
2009-01-01
The In-Space Propulsion Technology (ISPT) Project, funded by NASA's Science Mission Directorate (SMD), is continuing to invest in propulsion technologies that will enable or enhance NASA robotic science missions. This overview provides development status, near-term mission benefits, applicability, and availability of in-space propulsion technologies in the areas of aerocapture, electric propulsion, advanced chemical thrusters, and systems analysis tools. Aerocapture investments improved: guidance, navigation, and control models of blunt-body rigid aeroshells; atmospheric models for Earth, Titan, Mars, and Venus; and models for aerothermal effects. Investments in electric propulsion technologies focused on completing NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6 to 7 kW throttle-able gridded ion system. The project is also concluding its High Voltage Hall Accelerator (HiVHAC) mid-term product specifically designed for a low-cost electric propulsion option. The primary chemical propulsion investment is on the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance for lower cost. The project is also delivering products to assist technology infusion and quantify mission applicability and benefits through mission analysis and tools. In-space propulsion technologies are applicable, and potentially enabling for flagship destinations currently under evaluation, as well as having broad applicability to future Discovery and New Frontiers mission solicitations.
2014-01-01
Background Extensive literature exists on public involvement or engagement, but what actual tools or guides exist that are practical, tested and easy to use specifically for initiating and implementing patient and family engagement, is uncertain. No comprehensive review and synthesis of general international published or grey literature on this specific topic was found. A systematic scoping review of published and grey literature is, therefore, appropriate for searching through the vast general engagement literature to identify ‘patient/family engagement’ tools and guides applicable in health organization decision-making, such as within Alberta Health Services in Alberta, Canada. This latter organization requested this search and review to inform the contents of a patient engagement resource kit for patients, providers and leaders. Methods Search terms related to ‘patient engagement’, tools, guides, education and infrastructure or resources, were applied to published literature databases and grey literature search engines. Grey literature also included United States, Australia and Europe where most known public engagement practices exist, and Canada as the location for this study. Inclusion and exclusion criteria were set, and include: English documents referencing ‘patient engagement’ with specific criteria, and published between 1995 and 2011. For document analysis and synthesis, document analysis worksheets were used by three reviewers for the selected 224 published and 193 grey literature documents. Inter-rater reliability was ensured for the final reviews and syntheses of 76 published and 193 grey documents. Results Seven key themes emerged from the literature synthesis analysis, and were identified for patient, provider and/or leader groups. Articles/items within each theme were clustered under main topic areas of ‘tools’, ‘education’ and ‘infrastructure’. The synthesis and findings in the literature include 15 different terms and definitions for ‘patient engagement’, 17 different engagement models, numerous barriers and benefits, and 34 toolkits for various patient engagement and evaluation initiatives. Conclusions Patient engagement is very complex. This scoping review for patient/family engagement tools and guides is a good start for a resource inventory and can guide the content development of a patient engagement resource kit to be used by patients/families, healthcare providers and administrators. PMID:24735787
Digital processing of mesoscale analysis and space sensor data
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The mesoscale analysis and space sensor (MASS) data management and analysis system on the research computer system is presented. The MASS database management and analysis system was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The overall tasks performed concerning the software, database management, and display capabilities of the research computer system, in terms of providing a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data, are described.
Lessons Learned During the Refurbishment and Testing of an Observatory After Long-Term Storage
NASA Technical Reports Server (NTRS)
Hawk, John; Peabody, Sharon; Stavely, Richard
2015-01-01
Thermal Fluids Analysis Workshop (TFAWS) 2015, Silver Spring, MD NCTS 21070-15. This paper addresses the lessons learned during the refurbishment and testing of the thermal control system for a spacecraft which was placed into long-term storage. The DSCOVR (Deep Space Climate Observatory) Observatory (formerly known as Triana) was originally scheduled to launch on the Space Shuttle in 2002. With the Triana spacecraft nearly complete, the mission was canceled and the satellite was abruptly put into storage in 2001. In 2008 the observatory was removed from storage to begin refurbishment and testing. Problems arose associated with hardware that was not currently manufactured, coatings degradation, and a significant lack of documentation. Also addressed is the conversion of the thermal and geometric math models for use with updated thermal analysis software tools.
Seven Capital Devices for the Future of Stroke Rehabilitation
Iosa, M.; Morone, G.; Fusco, A.; Bragoni, M.; Coiro, P.; Multari, M.; Venturiero, V.; De Angelis, D.; Pratesi, L.; Paolucci, S.
2012-01-01
Stroke is the leading cause of long-term disability for adults in industrialized societies. Rehabilitation efforts are intended to avoid long-term impairments but, at present, rehabilitative outcomes are still poor. Novel tools based on new technologies have been developed to improve motor recovery. In this paper, we have taken into account seven promising technologies that can improve rehabilitation of patients with stroke in the near future: (1) robotic devices for lower and upper limb recovery, (2) brain computer interfaces, (3) noninvasive brain stimulators, (4) neuroprostheses, (5) wearable devices for quantitative human movement analysis, (6) virtual reality, and (7) tablet-pc used for neurorehabilitation. PMID:23304640
NASA Astrophysics Data System (ADS)
Shpotyuk, Oleh; Filipecki, Jacek; Ingram, Adam; Golovchak, Roman; Vakiv, Mykola; Klym, Halyna; Balitska, Valentyna; Shpotyuk, Mykhaylo; Kozdras, Andrzej
2015-02-01
Methodological possibilities of positron annihilation lifetime (PAL) spectroscopy applied to characterize different types of nanomaterials treated within a three-term fitting procedure are critically reconsidered. In contrast to conventional three-term analysis based on admixed positron- and positronium-trapping modes, the process of nanostructurization is considered as substitutional positron-positronium trapping within the same host matrix. The developed formalism allows estimation of interfacial void volumes responsible for positron trapping and of characteristic bulk positron lifetimes in nanoparticle-affected inhomogeneous media. This algorithm was well justified using the example of thermally induced nanostructurization occurring in 80GeSe2-20Ga2Se3 glass.
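The three-term fitting referred to above amounts to decomposing a lifetime spectrum into a sum of exponential components. A minimal sketch using synthetic data; the instrument resolution function, source corrections and the paper's substitutional trapping formalism are deliberately left out.

```python
import numpy as np
from scipy.optimize import curve_fit

def three_term(t, i1, tau1, i2, tau2, i3, tau3):
    """Sum of three exponential decay components (intensities i_k, lifetimes tau_k)."""
    return (i1 / tau1 * np.exp(-t / tau1)
            + i2 / tau2 * np.exp(-t / tau2)
            + i3 / tau3 * np.exp(-t / tau3))

# Synthetic spectrum with order-of-magnitude lifetimes (ns); values are illustrative
t = np.linspace(0.0, 10.0, 500)
true_params = (0.6, 0.20, 0.3, 0.45, 0.1, 2.0)
rng = np.random.default_rng(0)
y = three_term(t, *true_params) + rng.normal(0.0, 0.01, t.size)

# Bounded least-squares fit recovers the three lifetimes
popt, _ = curve_fit(three_term, t, y,
                    p0=(0.5, 0.2, 0.3, 0.5, 0.2, 1.5),
                    bounds=(0, np.inf))
print("fitted lifetimes (ns):", popt[1], popt[3], popt[5])
```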
New approaches in assessing food intake in epidemiology.
Conrad, Johanna; Koch, Stefanie A J; Nöthlings, Ute
2018-06-22
A promising direction for improving dietary intake measurement in epidemiologic studies is the combination of short-term and long-term dietary assessment methods using statistical methods. In this respect, web-based instruments are particularly interesting, as their application offers several potential advantages such as self-administration and a shorter completion time. The objective of this review is to provide an overview of new web-based short-term instruments and to describe their features. A number of web-based short-term dietary assessment tools for application in different countries and age-groups have been developed so far. Particular attention should be paid to the underlying database and the search function of the tool. Moreover, web-based instruments can improve the estimation of portion sizes by offering several options to the user. Web-based dietary assessment methods are associated with lower costs and reduced burden for participants and researchers, and show comparable validity to traditional instruments. When there is a need for a web-based tool, researchers should consider adapting existing tools rather than developing new instruments. The combination of short-term and long-term instruments seems more feasible with the use of new technology.
A Middle Palaeolithic wooden digging stick from Aranbaltza III, Spain
López-Bultó, Oriol; Iriarte, Eneko; Pérez-Garrido, Carlos; Piqué, Raquel; Aranburu, Arantza; Iriarte-Chiapusso, María José; Ortega-Cordellat, Illuminada; Bourguignon, Laurence; Garate, Diego; Libano, Iñaki
2018-01-01
Aranbaltza is an archaeological complex formed by at least three open-air sites. Between 2014 and 2015 a test excavation carried out in Aranbaltza III revealed the presence of a sand and clay sedimentary sequence formed in floodplain environments, within which six sedimentary units have been identified. This sequence was formed between 137–50 ka, and includes several archaeological horizons, attesting to the long-term presence of Neanderthal communities in this area. One of these horizons, corresponding with Unit 4, yielded two wooden tools. One of these tools is a beveled pointed tool that was shaped through a complex operational sequence involving branch shaping, bark peeling, twig removal, shaping, polishing, thermal exposition and chopping. A use-wear analysis of the tool shows traces related to digging soil, so it has been interpreted as a digging stick. This is the first time such a tool has been identified in a European Late Middle Palaeolithic context; it also represents one of the first well-preserved Middle Palaeolithic wooden tools found in southern Europe. This artefact represents one of the few examples available of wooden tool preservation for the European Palaeolithic, allowing us to further explore the role wooden technologies played in Neanderthal communities. PMID:29590205
Seasonal-Scale Optimization of Conventional Hydropower Operations in the Upper Colorado System
NASA Astrophysics Data System (ADS)
Bier, A.; Villa, D.; Sun, A.; Lowry, T. S.; Barco, J.
2011-12-01
Sandia National Laboratories is developing the Hydropower Seasonal Concurrent Optimization for Power and the Environment (Hydro-SCOPE) tool to examine basin-wide conventional hydropower operations at seasonal time scales. This tool is part of an integrated, multi-laboratory project designed to explore different aspects of optimizing conventional hydropower operations. The Hydro-SCOPE tool couples a one-dimensional reservoir model with a river routing model to simulate hydrology and water quality. An optimization engine wraps around this model framework to solve for long-term operational strategies that best meet the specific objectives of the hydrologic system while honoring operational and environmental constraints. The optimization routines are provided by Sandia's open source DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) software. Hydro-SCOPE allows for multi-objective optimization, which can be used to gain insight into the trade-offs that must be made between objectives. The Hydro-SCOPE tool is being applied to the Upper Colorado Basin hydrologic system. This system contains six reservoirs, each with its own set of objectives (such as maximizing revenue, optimizing environmental indicators, meeting water use needs, or other objectives) and constraints. This leads to a large optimization problem with strong connectedness between objectives. The systems-level approach used by the Hydro-SCOPE tool allows simultaneous analysis of these objectives, as well as understanding of potential trade-offs related to different objectives and operating strategies. The seasonal-scale tool will be tightly integrated with the other components of this project, which examine day-ahead and real-time planning, environmental performance, hydrologic forecasting, and plant efficiency.
NASA Astrophysics Data System (ADS)
Haddag, B.; Kagnaya, T.; Nouari, M.; Cutard, T.
2013-01-01
Modelling machining operations allows estimation of cutting parameters that are difficult to obtain experimentally, in particular quantities characterizing the tool-workpiece interface. Temperature is one of these quantities; it has an impact on tool wear, so its estimation is important. This study deals with a new modelling strategy, based on two steps of calculation, for analysis of the heat transfer into the cutting tool. Unlike classical methods, which consider only the cutting tool with an approximate heat flux applied at the cutting face, estimated from experimental data (e.g. measured cutting force, cutting power), the proposed approach consists of two successive 3D finite element calculations that are fully independent of experimental measurements; only the definition of the behaviour of the tool-workpiece couple is necessary. The first one is a 3D thermomechanical modelling of the chip formation process, which allows estimating cutting forces, chip morphology and its flow direction. The second calculation is a 3D thermal modelling of the heat diffusion into the cutting tool, by using an adequate thermal loading (applied uniform or non-uniform heat flux). This loading is estimated using some quantities obtained from the first step calculation, such as contact pressure, sliding velocity distributions and contact area. Comparisons, on the one hand, between experimental data and the first calculation and, on the other hand, between temperatures measured with embedded thermocouples and the second calculation show good agreement in terms of cutting forces, chip morphology and cutting temperature.
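The second calculation step described above, diffusion of frictional heat into the tool, can be caricatured in one dimension with an explicit finite-difference scheme and a prescribed heat flux at the loaded face; the material properties and the flux below are placeholders, not values from the study.

```python
import numpy as np

# Placeholder tool-material properties (order of magnitude for a steel/carbide tool)
k = 50.0        # thermal conductivity, W/(m K)
rho = 8000.0    # density, kg/m^3
cp = 400.0      # specific heat, J/(kg K)
alpha = k / (rho * cp)

L, n = 0.01, 101                  # 10 mm of tool depth, grid points
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha          # stable explicit time step
q_in = 2.0e6                      # assumed heat flux at the rake face, W/m^2

T = np.full(n, 20.0)              # initial temperature, deg C
for _ in range(int(1.0 / dt)):    # simulate roughly 1 s of cutting
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    Tn[0] = Tn[1] + q_in * dx / k  # flux boundary condition at the loaded face
    Tn[-1] = 20.0                  # far end held at ambient
    T = Tn

print(f"rake-face temperature after ~1 s: {T[0]:.0f} C")
```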
Küster, Eberhard; Dorusch, Falk; Vogt, Carsten; Weiss, Holger; Altenburger, Rolf
2004-07-15
Success of groundwater remediation is typically controlled via snapshot analysis of selected chemical substances or physical parameters. Biological parameters, i.e. ecotoxicological assays, are rarely employed. Hence the aim of the study was to develop a bioassay tool which allows on-line monitoring of contaminated groundwater, as well as a toxicity reduction evaluation (TRE) of different remediation techniques in parallel, and may furthermore be used as an additional tool for process control to supervise remediation techniques in real-time mode. Parallel testing of groundwater remediation techniques was accomplished for short and long time periods by using the energy-dependent luminescence of the bacterium Vibrio fischeri as the biological monitoring parameter. One data point every hour for each remediation technique was generated by an automated biomonitor. The bacteria proved to be highly sensitive to the contaminated groundwater and the biomonitor showed a long standing time despite the highly corrosive groundwater present in Bitterfeld, Germany. The bacterial biomonitor is demonstrated to be a valuable tool for remediation success evaluation. Dose-response relationships were generated for the six quantitatively dominant groundwater contaminants (2-chlorotoluene, 1,2- and 1,4-dichlorobenzene, monochlorobenzene, ethylbenzene and benzene). The concentrations of individual volatile organic chemicals (VOCs) could not explain the observed effects in the bacteria. An expected mixture toxicity was calculated for the six components using the concept of concentration addition. The calculated EC(50) for the mixture was still one order of magnitude lower than the observed EC(50) of the actual groundwater. The results pointed out that chemical analysis of the six quantitatively dominant substances alone was not able to explain the effects observed with the bacteria. Thus chemical analysis alone may not be an adequate tool for remediation success evaluation in terms of toxicity reduction.
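The concentration-addition prediction mentioned above has a simple closed form: if component i contributes fraction p_i of the total mixture concentration and has effect concentration EC50_i on its own, the expected mixture EC50 is 1 / sum(p_i / EC50_i). A sketch with made-up numbers; the study's measured EC50 values are not reproduced here.

```python
def ec50_mixture(fractions, ec50s):
    """Predicted mixture EC50 under concentration addition.

    fractions -- concentration fraction of each component in the mixture (sums to 1)
    ec50s     -- single-substance EC50 of each component (same units as the result)
    """
    assert abs(sum(fractions) - 1.0) < 1e-9
    return 1.0 / sum(p / e for p, e in zip(fractions, ec50s))

# Hypothetical six-component mixture with equal concentration fractions (mg/L)
fractions = [1 / 6] * 6
ec50s = [12.0, 8.0, 15.0, 20.0, 6.0, 10.0]
print(f"predicted mixture EC50 ~ {ec50_mixture(fractions, ec50s):.1f} mg/L")
```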
Depth of manual dismantling analysis: A cost–benefit approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Achillas, Ch., E-mail: c.achillas@ihu.edu.gr; Aidonis, D.; Vlachokostas, Ch.
Highlights: ► A mathematical modeling tool for OEMs. ► The tool can be used by OEMs, recyclers of electr(on)ic equipment or WEEE management systems’ regulators. ► The tool makes use of cost–benefit analysis in order to determine the optimal depth of product disassembly. ► The reusable materials and the quantity of metals and plastics recycled can be quantified in an easy-to-comprehend manner. - Abstract: This paper presents a decision support tool for manufacturers and recyclers towards end-of-life strategies for waste electrical and electronic equipment. A mathematical formulation based on the cost benefit analysis concept is herein analytically described in order to determine the parts and/or components of an obsolete product that should be either non-destructively recovered for reuse or be recycled. The framework optimally determines the depth of disassembly for a given product, taking into account economic considerations. On this basis, it embeds all relevant cost elements to be included in the decision-making process, such as recovered materials and (depreciated) parts/components, labor costs, energy consumption, equipment depreciation, quality control and warehousing. This tool can be part of the strategic decision-making process in order to maximize profitability or minimize end-of-life management costs. A case study to demonstrate the models’ applicability is presented for a typical electronic product in terms of structure and material composition. Taking into account the market values of the pilot product’s components, the manual disassembly is proven profitable with the marginal revenues from recovered reusable materials to be estimated at 2.93–23.06 €, depending on the level of disassembly.
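A toy version of the depth-of-disassembly logic described above: each successive disassembly step carries a revenue (recovered parts and materials) and a cost (labour, energy, and so on), and the optimal depth is the step count that maximizes cumulative net benefit. The per-step figures below are invented, not those of the pilot product.

```python
def optimal_depth(revenues, costs):
    """Return (best_depth, best_profit) over cumulative disassembly steps."""
    best_depth, best_profit, cumulative = 0, 0.0, 0.0
    for depth, (r, c) in enumerate(zip(revenues, costs), start=1):
        cumulative += r - c            # steps must be taken in sequence
        if cumulative > best_profit:
            best_depth, best_profit = depth, cumulative
    return best_depth, best_profit

# Hypothetical per-step recovered value and per-step cost (euros)
revenues = [4.0, 6.5, 3.0, 1.2, 0.4]
costs    = [1.0, 2.0, 2.5, 1.5, 1.0]
depth, profit = optimal_depth(revenues, costs)
print(f"disassemble {depth} steps for a net benefit of {profit:.2f} EUR")
```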
Open Source GIS based integrated watershed management
NASA Astrophysics Data System (ADS)
Byrne, J. M.; Lindsay, J.; Berg, A. A.
2013-12-01
Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be of high resolution, process-based, with real-time capability to assess changing resource issues critical to short, medium and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address challenging resource management issues in industry, government and nongovernmental agencies. Current research and analysis tools were developed to manage meteorological, climatological, and land and water resource data efficiently at high resolution in space and time. The deliverable for this work is a Whitebox-GENESYS open-source resource management capacity with routines for GIS-based watershed management including water in agriculture and food production. We are adding urban water management routines through GENESYS in 2013-15 with an engineering PhD candidate. Both Whitebox-GAT and GENESYS are already well-established tools. The proposed research will combine these products to create an open-source geomatics-based water resource management tool that is revolutionary in both capacity and availability to a wide array of Canadian and global users.
Tool for analyzing the vulnerability of buildings to flooding: the case of Switzerland
NASA Astrophysics Data System (ADS)
Choffet, Marc; Bianchi, Renzo; Jaboyedoff, Michel; Kölz, Ehrfried; Lateltin, Olivier; Leroi, Eric; Mayis, Arnaud
2010-05-01
Whatever measures are used to protect property exposed to flooding, a residual risk remains; this is what feedback from past flood events shows. This residual risk is linked, on the one hand, with the possibility that the protection measures may fail or may not work as intended and, on the other hand, with the possibility that the flood exceeds the chosen level of protection. In many European countries, governments and insurance companies are thinking in terms of vulnerability reduction. This publication presents a new tool to evaluate the vulnerability of buildings in a context of flooding. This tool is developed by the project "Analysis of the vulnerability of buildings to flooding", which is funded by the Foundation for Prevention of Cantonal insurances, Switzerland. It is composed of three modules and aims to provide a method for reducing the vulnerability of buildings to flooding. The first two modules allow all the elements composing the building to be identified and listed. The third module is dedicated to the choice of efficient risk-reducing measures on the basis of cost-benefit analyses. The diagnostic tool for different parts of the building is being developed to allow real estate appraisers, insurance companies and homeowners to rapidly assess the vulnerability of buildings in flood prone areas. The tool works with several databases that have been selected from the collection and analysis of data, information, standards and feedback from risk management, hydrology, architecture, construction, materials engineering, insurance, and the economy of construction. A method for determining the local hazard is also proposed, to determine the height of potential floods threatening a building, based on a back analysis of Swiss hazard maps. To calibrate the model, seven cantonal insurance institutions participate in the study by providing data, such as the amount of damage in flooded areas. The poster will present some results from the development of the tool, such as the amount of damage to buildings and the analysis possibilities offered by the tool. Furthermore, analysis of data from the insurance companies led to the emergence of trends in the costs of damage due to flooding. Some graphics will be presented in the poster to illustrate the tool design. It will be shown that the tool allows for a census of buildings and raises awareness of their vulnerability to flooding. An explanation of the database development concerning remediation measure costs and damage costs is also provided. Simple and innovative remedial measures could be shown in the poster. With the help of some examples, it is shown that the tool opens interesting perspectives for the development of insurance strategies for building stocks in flood prone areas.
NASA Astrophysics Data System (ADS)
Blomqvist, Niclas; Whipp, David
2016-04-01
The topography of the Earth's surface is the result of the interaction of tectonics, erosion and climate. Thus, topography should contain a record of these processes that can be extracted by topographic analysis. The question considered in this study is whether the spatial variations in erosion that have sculpted the modern topography are representative of the long-term erosion rates in mountainous regions. We compare long-term erosion rates derived from low-temperature thermochronometry to erosional proxies calculated from topographic and climatic data analysis. The study has been performed on a global scale including six orogens: The Himalaya, Andes, Taiwan, Olympic Mountains, Southern Alps in New Zealand and European Alps. The data were analyzed using a new swath profile analysis tool for ArcGIS called ArcSwath (https://github.com/HUGG/ArcSwath) to determine the correlations between the long-term erosion rates and modern elevations, slope angles, relief in 2.5-km- and 5-km-diameter circles, erosion potential, normalized channel steepness index ksn, and annual rainfall. ArcSwath uses a Python script that has been incorporated into an ArcMap 10.2 add-in tool, extracting swath profiles in about ten seconds compared to earlier workflows that could take more than an hour. In ArcMap, UTM-projected point or raster files can be used for creating swath profiles. Point data are projected onto the swath and the statistical parameters (minimum, mean and maximum of the values across the swath) are calculated for the raster data. Both can be immediately plotted using the Python matplotlib library, or plotted externally using the csv-file that is produced by ArcSwath. When raster and point data are plotted together, it is easier to make comparisons and see correlations between the selected data. An unambiguous correlation between the topographic or climatic metrics and long-term erosion rates was not found. Fitting of linear regression lines to the topographic/climatic metric data and the long-term erosion rates shows that 86 of 288 plots (30%) have "good" R2 values (> 0.35) and 135 of 288 (47%) have an "acceptable" R2 value (> 0.2). The "good" and "acceptable" values have been selected on the basis of visual fit to the regression line. The majority of the plots with a "good" correlation value have positive correlations, while 11/86 plots have negative slopes for the regression lines. Interestingly, two topographic profile shapes were clear in swath profiles: Concave-up (e.g., the central-western Himalaya and the northern Bolivian Andes) and concave-down or straight (e.g., the eastern Himalayas and the southern Bolivian Andes). On the orogen scale, the concave-up shape is often related to relatively high precipitation and erosion rates on the slopes of steep topography. The concave-down/straight profiles seem to occur in association with low rainfall and/or erosion rates. Though we cannot say with confidence, the lack of a clear correlation between long-term erosion rates and climate or topography may be due to the difference in their respective timescales, as climate can vary over timescales shorter than 10^5-10^7 years. In that case, variations between fluvial and glacial erosion may have overprinted the erosional effects of one another.
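The swath statistics mentioned above (minimum, mean and maximum across the swath width) reduce, for a raster already sampled or rotated so that the profile runs along one axis, to column-wise reductions over a 2D array. A small numpy sketch, not the ArcSwath add-in itself:

```python
import numpy as np

def swath_stats(raster):
    """Min/mean/max across the swath width at each position along the profile.

    raster -- 2D array with rows spanning the swath width and columns
              running along the profile direction (NaN = no data).
    """
    return (np.nanmin(raster, axis=0),
            np.nanmean(raster, axis=0),
            np.nanmax(raster, axis=0))

# Synthetic elevation swath: 200 cells across, 500 cells along the profile
rng = np.random.default_rng(1)
profile = np.linspace(500.0, 4500.0, 500)              # rising baseline, m
swath = profile + rng.normal(0.0, 300.0, (200, 500))   # across-swath scatter
zmin, zmean, zmax = swath_stats(swath)
print(zmin[:3].round(0), zmean[:3].round(0), zmax[:3].round(0))
```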
Piezoelectric Actuator Modeling Using MSC/NASTRAN and MATLAB
NASA Technical Reports Server (NTRS)
Reaves, Mercedes C.; Horta, Lucas G.
2003-01-01
This paper presents a procedure for modeling structures containing piezoelectric actuators using MSC/NASTRAN and MATLAB. The paper describes the utility and functionality of one set of validated modeling tools. The tools described herein use MSC/NASTRAN to model the structure with piezoelectric actuators and a thermally induced strain to model straining of the actuators due to an applied voltage field. MATLAB scripts are used to assemble the dynamic equations and to generate frequency response functions. The application of these tools is discussed using a cantilever aluminum beam with a surface mounted piezoelectric actuator as a sample problem. Software in the form of MSC/NASTRAN DMAP input commands, MATLAB scripts, and a step-by-step procedure to solve the example problem are provided. Analysis results are generated in terms of frequency response functions from deflection and strain data as a function of input voltage to the actuator.
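The thermal analogy mentioned above rests on matching the free piezoelectric strain to an equivalent thermal strain: the voltage-induced strain d31·V/t is equated to alpha·dT, so a fictitious temperature load dT = d31·V/(t·alpha) reproduces the actuation in a purely thermo-elastic model. A sketch with representative, assumed PZT numbers (not the values used in the paper):

```python
# Equivalent-temperature load for a piezoelectric actuator modeled via thermal strain
d31 = -190e-12        # piezoelectric strain coefficient, m/V (typical PZT, assumed)
thickness = 0.25e-3   # actuator thickness between electrodes, m (assumed)
alpha_fict = 1.0e-6   # fictitious CTE assigned to the actuator material, 1/K
voltage = 100.0       # applied voltage, V

free_strain = d31 * voltage / thickness   # strain the unclamped patch would develop
delta_T = free_strain / alpha_fict        # temperature load giving the same strain
print(f"free strain = {free_strain:.2e}, equivalent delta T = {delta_T:.1f} K")
```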
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradel, Lauren; Endert, Alexander; Koch, Kristen
2013-08-01
Large, high-resolution vertical displays carry the potential to increase the accuracy of collaborative sensemaking, given correctly designed visual analytics tools. From an exploratory user study using a fictional textual intelligence analysis task, we investigated how users interact with the display to construct spatial schemas and externalize information, as well as how they establish shared and private territories. We investigated the space management strategies of users partitioned by type of tool philosophy followed (visualization- or text-centric). We classified the types of territorial behavior exhibited in terms of how the users interacted with information on the display (integrated or independent workspaces). Next, we examined how territorial behavior impacted the common ground between the pairs of users. Finally, we offer design suggestions for building future co-located collaborative visual analytics tools specifically for use on large, high-resolution vertical displays.
TriageTools: tools for partitioning and prioritizing analysis of high-throughput sequencing data.
Fimereli, Danai; Detours, Vincent; Konopka, Tomasz
2013-04-01
High-throughput sequencing is becoming a popular research tool but carries with it considerable costs in terms of computation time, data storage and bandwidth. Meanwhile, some research applications focusing on individual genes or pathways do not necessitate processing of a full sequencing dataset. Thus, it is desirable to partition a large dataset into smaller, manageable, but relevant pieces. We present a toolkit for partitioning raw sequencing data that includes a method for extracting reads that are likely to map onto pre-defined regions of interest. We show the method can be used to extract information about genes of interest from DNA or RNA sequencing samples in a fraction of the time and disk space required to process and store a full dataset. We report speedup factors between 2.6 and 96, depending on settings and samples used. The software is available at http://www.sourceforge.net/projects/triagetools/.
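The read-extraction idea summarized above can be sketched with a simple k-mer filter: build the set of k-mers occurring in the region of interest and keep any read that shares at least a few of them. This is only an illustration of the general approach (no reverse-complement handling, no quality filtering), not the TriageTools implementation.

```python
def kmers(seq, k=25):
    """All k-length substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def triage_reads(reads, region_seq, k=25, min_hits=2):
    """Keep reads sharing at least `min_hits` k-mers with the region of interest."""
    target = kmers(region_seq, k)
    return [r for r in reads if len(kmers(r, k) & target) >= min_hits]

# Toy example: one read drawn from the region, one unrelated
region = "ACGT" * 40                      # hypothetical 160-bp region of interest
reads = [region[10:90],                   # overlaps the region
         "TTAGGCAT" * 10]                 # does not
print(len(triage_reads(reads, region)))   # -> 1
```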
Grisham, William; Schottler, Natalie A.; McCauley, Lisa M. Beck; Pham, Anh P.; Ruiz, Maureen L.; Fong, Michelle C.; Cui, Xinran
2011-01-01
Zebra finch song behavior is sexually dimorphic: males sing and females do not. The neural system underlying this behavior is sexually dimorphic, and this sex difference is easy to quantify. During development, the zebra finch song system can be altered by steroid hormones, specifically estradiol, which actually masculinizes it. Because of the ease of quantification and experimental manipulation, the zebra finch song system has great potential for use in undergraduate labs. Unfortunately, the underlying costs prohibit use of this system in undergraduate labs. Further, the time required to perform a developmental study renders such undertakings unrealistic within a single academic term. We have overcome these barriers by creating digital tools, including an image library of song nuclei from zebra finch brains. Students using this library replicate and extend a published experiment examining the dose of estradiol required to masculinize the female zebra finch brain. We have used this library for several terms, and students not only obtain significant experimental results but also make gains in understanding content, experimental controls, and inferential statistics (analysis of variance and post hoc tests). We have provided free access to these digital tools at the following website: http://mdcune.psych.ucla.edu/modules/birdsong. PMID:21633071
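The inferential statistics exercised in this module, a one-way ANOVA followed by post hoc comparisons, can be reproduced on any set of measured song-nucleus volumes with standard tools; the dose groups and volumes below are fabricated placeholders, not data from the image library.

```python
from scipy import stats

# Hypothetical song-nucleus volumes (arbitrary units) for three estradiol doses
control   = [1.0, 1.2, 0.9, 1.1, 1.0]
low_dose  = [1.4, 1.6, 1.5, 1.3, 1.7]
high_dose = [2.1, 2.3, 2.0, 2.2, 2.4]

# One-way ANOVA across the three groups
f_stat, p_value = stats.f_oneway(control, low_dose, high_dose)
print(f"one-way ANOVA: F = {f_stat:.1f}, p = {p_value:.2e}")

# A simple Bonferroni-corrected post hoc: pairwise t-tests
pairs = [("control", control, "low", low_dose),
         ("control", control, "high", high_dose),
         ("low", low_dose, "high", high_dose)]
for name_a, a, name_b, b in pairs:
    t, p = stats.ttest_ind(a, b)
    print(f"{name_a} vs {name_b}: corrected p = {min(p * len(pairs), 1.0):.4f}")
```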
Short-term Forecasting Tools for Agricultural Nutrient Management.
Easton, Zachary M; Kleinman, Peter J A; Buda, Anthony R; Goering, Dustin; Emberston, Nichole; Reed, Seann; Drohan, Patrick J; Walter, M Todd; Guinan, Pat; Lory, John A; Sommerlot, Andrew R; Sharpley, Andrew
2017-11-01
The advent of real-time, short-term farm management tools is motivated by the need to protect water quality above and beyond the general guidance offered by existing nutrient management plans. Advances in high-performance computing and hydrologic or climate modeling have enabled rapid dissemination of real-time information that can assist landowners and conservation personnel with short-term management planning. This paper reviews short-term decision support tools for agriculture that are under various stages of development and implementation in the United States: (i) Wisconsin's Runoff Risk Advisory Forecast (RRAF) System, (ii) New York's Hydrologically Sensitive Area Prediction Tool, (iii) Virginia's Saturated Area Forecast Model, (iv) Pennsylvania's Fertilizer Forecaster, (v) Washington's Application Risk Management (ARM) System, and (vi) Missouri's Design Storm Notification System. Although these decision support tools differ in their underlying model structure, the resolution at which they are applied, and the hydroclimates to which they are relevant, all provide forecasts (range 24-120 h) of runoff risk or soil moisture saturation derived from National Weather Service Forecast models. Although this review highlights the need for further development of robust and well-supported short-term nutrient management tools, their potential for adoption and ultimate utility requires an understanding of the appropriate context of application, the strategic and operational needs of managers, access to weather forecasts, scales of application (e.g., regional vs. field level), data requirements, and outreach communication structure. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
The State of Software for Evolutionary Biology.
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-05-01
With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably in terms of the number of features and lines of code, and consequently also in software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.
Long-Term Pavement Performance Bind Online [Product Brief
DOT National Transportation Integrated Search
2017-02-23
This Product Brief introduces the reader to the Long-Term Pavement Performance Bind (LTPPBind) Online Web-based tool for selecting asphalt binder performance grades (PGs).(1) It explains what the tool is, who can benefit from its use, what its main f...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G.P.; Burns, H.H.; Langton, C.
2013-07-01
The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste and Nuclear Materials Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to 100 years and longer for operating facilities and longer than 1000 years for waste disposal. The CBP Software Toolbox has produced tangible benefits to the DOE Performance Assessment (PA) community. A review of prior DOE PAs has provided a list of potential opportunities for improving cementitious barrier performance predictions through the use of the CBP software tools. These opportunities include: 1) impact of atmospheric exposure to concrete and grout before closure, such as accelerated slag and Tc-99 oxidation, 2) prediction of changes in K_d/mobility as a function of time that result from changing pH and redox conditions, 3) concrete degradation from rebar corrosion due to carbonation, 4) early age cracking from drying and/or thermal shrinkage and 5) degradation due to sulfate attack. The CBP has already had opportunity to provide near-term, tangible support to ongoing DOE-EM PAs such as the Savannah River Saltstone Disposal Facility (SDF) by providing a sulfate attack analysis that predicts the extent and damage that sulfate ingress will have on the concrete vaults over extended time (i.e., > 1000 years). This analysis is one of the many technical opportunities in cementitious barrier performance that can be addressed by the DOE-EM sponsored CBP software tools. Modification of the existing tools can provide many opportunities to bring defense in depth in prediction of the performance of cementitious barriers over time. (authors)
Kennedy, Anne; Vassilev, Ivaylo; James, Elizabeth; Rogers, Anne
2016-02-29
For people with long-term conditions, social networks provide a potentially central means of mobilising, mediating and accessing support for health and well-being. Few interventions address the implementation of improving engagement with and through social networks. This paper describes the development and implementation of a web-based tool which comprises: network mapping, user-centred preference elicitation and need assessment and facilitated engagement with resources. The study aimed to determine whether the intervention was acceptable, implementable and acted to enhance support and to add to theory concerning social networks and engagement with resources and activities. A longitudinal design with 15 case studies used ethnographic methods comprising video, non-participant observation of intervention delivery and qualitative interviews (baseline, 6 and 12 months). Participants were people with type 2 diabetes living in a marginalised island community. Facilitators were local health trainers and care navigators. Analysis applied concepts concerning the implementation of technology for self-management support to explain how new practices of work were operationalised and how the technology impacted on relationships, fit with everyday life and allowed for visual feedback. Most participants reported identifying and taking up new activities as a result of using the tool. Thematic analysis suggested that workability of the tool was predicated on disruption and reconstruction of networks, challenging/supportive facilitation, and change and reflection over time concerning network support. Visualisation of the network enabled people to mobilise support and engage in new activities. The tool aligned synergistically with the facilitators' role of linking people to local resources. The social network tool works through a process of initiating positive disruption of established self-management practice through mapping and reflection on personal network membership and support. This opens up possibilities for reconstructing self-management differently from current practice. Key facets of successful implementation were: the visual maps of networks and support options; facilitation characterised by a perceived lack of status difference, which assisted engagement and constructive discussion of support and preferences for activities; and background work (a reliable database, tailored preferences, option reduction) for facilitator and user ease of use.
Verification of a rapid mooring and foundation design tool
Weller, Sam D.; Hardwick, Jon; Gomez, Steven; ...
2018-02-15
Marine renewable energy devices require mooring and foundation systems that are suitable in terms of device operation and are also robust and cost effective. In the initial stages of mooring and foundation development a large number of possible configuration permutations exist. Filtering of unsuitable designs is possible using information specific to the deployment site (i.e. bathymetry, environmental conditions) and device (i.e. mooring and/or foundation system role and cable connection requirements). The identification of a final solution requires detailed analysis, which includes load cases based on extreme environmental statistics following certification guidance processes. Static and/or quasi-static modelling of the mooring and/or foundation system serves as an intermediate design filtering stage enabling dynamic time-domain analysis to be focused on a small number of potential configurations. Mooring and foundation design is therefore reliant on logical decision making throughout this stage-gate process. The open-source DTOcean (Optimal Design Tools for Ocean Energy Arrays) Tool includes a mooring and foundation module, which automates the configuration selection process for fixed and floating wave and tidal energy devices. As far as the authors are aware, this is one of the first tools to be developed for the purpose of identifying potential solutions during the initial stages of marine renewable energy design. While the mooring and foundation module does not replace a full design assessment, it provides, in addition to suitable configuration solutions, assessments in terms of reliability, economics and environmental impact. This article provides insight into the solution identification approach used by the module and features the verification of both the mooring system calculations and the foundation design using commercial software. Several case studies are investigated: a floating wave energy converter and several anchoring systems. It is demonstrated that the mooring and foundation module is able to provide device and/or site developers with rapid mooring and foundation design solutions to appropriate design criteria.
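As an illustration of the quasi-static screening step mentioned above, the standard single-line catenary relations (generic textbook form, not necessarily DTOcean's own formulation) connect the horizontal tension component \(H\), the submerged line weight per unit length \(w\) and the line geometry:

\[
s = \frac{H}{w}\,\sinh\!\left(\frac{w\,x}{H}\right), \qquad
h = \frac{H}{w}\left[\cosh\!\left(\frac{w\,x}{H}\right)-1\right], \qquad
T_{\text{fairlead}} = H + w\,h,
\]

where \(x\) is the horizontal distance from the touchdown point, \(s\) the suspended line length and \(h\) the vertical rise to the fairlead. Configurations whose predicted fairlead tensions under extreme load cases exceed a line's rating can be filtered out before the expensive dynamic time-domain analysis.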
Pelagic habitat visualization: the need for a third (and fourth) dimension: HabitatSpace
Beegle-Krause, C; Vance, Tiffany; Reusser, Debbie; Stuebe, David; Howlett, Eoin
2009-01-01
Habitat in open water is not simply a 2-D to 2.5-D surface such as the ocean bottom or the air-water interface. Rather, pelagic habitat is a 3-D volume of water that can change over time, leading us to the term habitat space. Visualization and analysis in 2-D is well supported with GIS tools, but a new tool was needed for visualization and analysis in four dimensions. Observational data (cruise profiles (x_o, y_o, z, t_o)), numerical circulation model fields (x, y, z, t), and trajectories (larval fish, 4-D lines) need to be merged together in a meaningful way for visualization and analysis. As a first step toward this new framework, UNIDATA’s Integrated Data Viewer (IDV) has been used to create a set of tools for habitat analysis in 4-D. IDV was designed for 3-D+time geospatial data in the meteorological community. NetCDF Java libraries allow the tool to read many file formats, including remotely located data (e.g. data available via OPeNDAP). With this project, IDV has been adapted for use in delineating habitat space for multiple fish species in the ocean. The ability to define and visualize boundaries of a water mass, which meets specific biologically relevant criteria (e.g., volume, connectedness, and inter-annual variability) based on model results and observational data, will allow managers to investigate the survival of individual year classes of commercially important fisheries. Better understanding of the survival of these year classes will lead to improved forecasting of fisheries recruitment.
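A minimal sketch of the underlying operation, delineating a water mass that meets a biologically relevant criterion from a gridded model field, is shown below. The file name, variable name, cell sizes and temperature window are hypothetical placeholders, not values from the paper.

```python
# Hedged sketch: treat the "habitat space" as the set of model grid cells whose
# temperature falls inside a biologically relevant window, and report its volume.
import numpy as np
from netCDF4 import Dataset  # pip install netCDF4

with Dataset("ocean_model_output.nc") as ds:      # hypothetical file
    temp = ds.variables["temp"][0, :, :, :]       # assumed (depth, lat, lon) at t=0

dz, dy, dx = 5.0, 1000.0, 1000.0                  # assumed cell dimensions (m)
mask = (temp > 4.0) & (temp < 8.0)                # assumed habitat window (deg C)
volume_km3 = mask.sum() * dz * dy * dx / 1e9
print(f"habitat volume ~ {volume_km3:.1f} km^3")
```

Repeating the calculation per time step gives the inter-annual variability of the habitat volume mentioned in the abstract.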
Vallenet, David; Belda, Eugeni; Calteau, Alexandra; Cruveiller, Stéphane; Engelen, Stefan; Lajus, Aurélie; Le Fèvre, François; Longin, Cyrille; Mornico, Damien; Roche, David; Rouy, Zoé; Salvignol, Gregory; Scarpelli, Claude; Thil Smith, Adam Alexander; Weiman, Marion; Médigue, Claudine
2013-01-01
MicroScope is an integrated platform dedicated to both the methodical updating of microbial genome annotation and to comparative analysis. The resource provides data from completed and ongoing genome projects (automatic and expert annotations), together with data sources from post-genomic experiments (i.e. transcriptomics, mutant collections) allowing users to perfect and improve the understanding of gene functions. MicroScope (http://www.genoscope.cns.fr/agc/microscope) combines tools and graphical interfaces to analyse genomes and to perform the manual curation of gene annotations in a comparative context. Since its first publication in January 2006, the system (previously named MaGe for Magnifying Genomes) has been continuously extended both in terms of data content and analysis tools. The last update of MicroScope was published in 2009 in the Database journal. Today, the resource contains data for >1600 microbial genomes, of which ∼300 are manually curated and maintained by biologists (1200 personal accounts today). Expert annotations are continuously gathered in the MicroScope database (∼50 000 a year), contributing to the improvement of the quality of microbial genomes annotations. Improved data browsing and searching tools have been added, original tools useful in the context of expert annotation have been developed and integrated and the website has been significantly redesigned to be more user-friendly. Furthermore, in the context of the European project Microme (Framework Program 7 Collaborative Project), MicroScope is becoming a resource providing for the curation and analysis of both genomic and metabolic data. An increasing number of projects are related to the study of environmental bacterial (meta)genomes that are able to metabolize a large variety of chemical compounds that may be of high industrial interest. PMID:23193269
Enrichr: interactive and collaborative HTML5 gene list enrichment analysis tool
2013-01-01
Background: System-wide profiling of genes and proteins in mammalian cells produces lists of differentially expressed genes/proteins that need to be further analyzed for their collective functions in order to extract new knowledge. Once unbiased lists of genes or proteins are generated from such experiments, these lists are used as input for computing enrichment with existing lists created from prior knowledge organized into gene-set libraries. While many enrichment analysis tools and gene-set library databases have been developed, there is still room for improvement. Results: Here, we present Enrichr, an integrative web-based and mobile software application that includes new gene-set libraries, an alternative approach to rank enriched terms, and various interactive visualization approaches to display enrichment results using the JavaScript library Data Driven Documents (D3). The software can also be embedded into any tool that performs gene list analysis. We applied Enrichr to analyze nine cancer cell lines by comparing their enrichment signatures to the enrichment signatures of matched normal tissues. We observed a common pattern of upregulation of the polycomb group PRC2 and enrichment for the histone mark H3K27me3 in many cancer cell lines, as well as alterations in Toll-like receptor and interleukin signaling in K562 cells when compared with normal myeloid CD33+ cells. Such analyses provide global visualization of critical differences between normal tissues and cancer cell lines but can be applied to many other scenarios. Conclusions: Enrichr is an easy-to-use, intuitive enrichment analysis web-based tool providing various types of visualization summaries of the collective functions of gene lists. Enrichr is open source and freely available online at: http://amp.pharm.mssm.edu/Enrichr. PMID:23586463
Enrichr: interactive and collaborative HTML5 gene list enrichment analysis tool.
Chen, Edward Y; Tan, Christopher M; Kou, Yan; Duan, Qiaonan; Wang, Zichen; Meirelles, Gabriela Vaz; Clark, Neil R; Ma'ayan, Avi
2013-04-15
System-wide profiling of genes and proteins in mammalian cells produces lists of differentially expressed genes/proteins that need to be further analyzed for their collective functions in order to extract new knowledge. Once unbiased lists of genes or proteins are generated from such experiments, these lists are used as input for computing enrichment with existing lists created from prior knowledge organized into gene-set libraries. While many enrichment analysis tools and gene-set library databases have been developed, there is still room for improvement. Here, we present Enrichr, an integrative web-based and mobile software application that includes new gene-set libraries, an alternative approach to rank enriched terms, and various interactive visualization approaches to display enrichment results using the JavaScript library Data Driven Documents (D3). The software can also be embedded into any tool that performs gene list analysis. We applied Enrichr to analyze nine cancer cell lines by comparing their enrichment signatures to the enrichment signatures of matched normal tissues. We observed a common pattern of upregulation of the polycomb group PRC2 and enrichment for the histone mark H3K27me3 in many cancer cell lines, as well as alterations in Toll-like receptor and interleukin signaling in K562 cells when compared with normal myeloid CD33+ cells. Such analyses provide global visualization of critical differences between normal tissues and cancer cell lines but can be applied to many other scenarios. Enrichr is an easy-to-use, intuitive enrichment analysis web-based tool providing various types of visualization summaries of the collective functions of gene lists. Enrichr is open source and freely available online at: http://amp.pharm.mssm.edu/Enrichr.
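The basic overlap test behind this kind of enrichment analysis can be sketched as below: a one-sided Fisher's exact test on the overlap between an input gene list and one library term. Gene names and the background size are invented for illustration, and Enrichr itself layers its own alternative ranking on top of this test.

```python
# Minimal sketch of the core enrichment computation: a one-sided Fisher's exact
# test of the overlap between an input gene list and one gene-set library term.
from scipy.stats import fisher_exact

background_size = 20000                                    # assumed genome size
input_genes = {"TP53", "EZH2", "SUZ12", "EED", "MYC"}      # made-up input list
library_term = {"EZH2", "SUZ12", "EED", "RBBP4", "JARID2"} # e.g. a "PRC2" term

overlap = len(input_genes & library_term)
a = overlap                                   # in list and in term
b = len(input_genes) - overlap                # in list, not in term
c = len(library_term) - overlap               # in term, not in list
d = background_size - a - b - c               # in neither
odds_ratio, p_value = fisher_exact([[a, b], [c, d]], alternative="greater")
print(f"overlap={a}, odds ratio={odds_ratio:.1f}, p={p_value:.2e}")
```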
WE-D-204-06: An Open Source ImageJ CatPhan Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, G
2015-06-15
Purpose: The CatPhan is a popular QA device for assessing CT image quality. There are a number of software options which perform analysis of the CatPhan. However, these options are often expensive and offer the user little ability to adjust the analysis if it is not running properly. An open source tool is an effective solution. Methods: To use the software, the user imports the CT as an image sequence in ImageJ. The user then scrolls to the slice with the lateral dots. The user then runs the plugin. If tolerance constraints are not already created, the user is prompted to enter them or to use generic tolerances. Upon completion of the analysis, the plugin calls pdfLaTeX to compile the pdf report. There is a csv version of the report as well. A log of the results from all CatPhan scans is kept as a csv file. The user can use this to baseline the machine. Results: The tool is capable of detecting the orientation of the phantom. If the CatPhan was scanned backwards, one can simply flip the stack of images horizontally and proceed with the analysis. The analysis includes Sensitometry (estimating the effective beam energy), HU values and linearity, Low Contrast Visibility (using LDPE & Polystyrene), Contrast Scale, Geometric Accuracy, Slice Thickness Accuracy, Spatial Resolution (giving the MTF using the line pairs as well as the point spread function), CNR, Low Contrast Detectability (including the raw data), and Uniformity (including the Cupping Effect). Conclusion: This is a robust tool that analyzes more components of the CatPhan than other software options (with the exception of ImageOwl). It produces an elegant pdf and keeps a log of analyses for long-term tracking of the system. Because it is open source, users are able to customize any component of it.
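One of the simpler metrics listed above, contrast-to-noise ratio (CNR), can be sketched as follows. The slice, ROI positions and the exact CNR definition are placeholders; published tools differ slightly in how they define the denominator.

```python
# Hedged sketch: CNR from two circular ROIs of a (synthetic) CT slice.
import numpy as np

def circular_roi(image, cx, cy, radius):
    """Return the pixel values inside a circular ROI centred at (cx, cy)."""
    yy, xx = np.ogrid[:image.shape[0], :image.shape[1]]
    return image[(xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2]

rng = np.random.default_rng(0)
slice_hu = rng.normal(0, 5, size=(512, 512))   # water-like noisy background
slice_hu[200:260, 200:260] += 40               # synthetic contrast insert

insert = circular_roi(slice_hu, 230, 230, 20)
background = circular_roi(slice_hu, 400, 400, 20)
cnr = abs(insert.mean() - background.mean()) / background.std()
print(f"CNR = {cnr:.1f}")
```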
Network analysis applications in hydrology
NASA Astrophysics Data System (ADS)
Price, Katie
2017-04-01
Applied network theory has seen pronounced expansion in recent years, in fields such as epidemiology, computer science, and sociology. Concurrent development of analytical methods and frameworks has increased the possibilities and tools available to researchers seeking to apply network theory to a variety of problems. While water and nutrient fluxes through stream systems clearly demonstrate a directional network structure, the hydrological applications of network theory remain underexplored. This presentation covers a review of network applications in hydrology, followed by an overview of promising network analytical tools that potentially offer new insights into conceptual modeling of hydrologic systems, identifying behavioral transition zones in stream networks and thresholds of dynamical system response. Network applications were tested along an urbanization gradient in two Atlanta, Georgia, USA watersheds: Peachtree Creek and Proctor Creek. Peachtree Creek contains a nest of five long-term USGS streamflow and water quality gages, allowing network application of long-term flow statistics. The watershed spans a range of suburban and heavily urbanized conditions. Summary flow statistics and water quality metrics were analyzed using a suite of network analysis techniques to test the conceptual modeling and predictive potential of the methodologies. Storm events and low flow dynamics during Summer 2016 were analyzed using multiple network approaches, with an emphasis on tomogravity methods. Results indicate that network theory approaches offer novel perspectives for understanding long-term and event-based hydrological data. Key future directions for network applications include 1) optimizing data collection, 2) identifying "hotspots" of contaminant and overland flow influx to stream systems, 3) defining process domains, and 4) analyzing dynamic connectivity of various system components, including groundwater-surface water interactions.
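A toy version of the directional network structure described above is sketched below: a small stream network with edges pointing downstream, with betweenness centrality used as a crude "hotspot" indicator. The node names and topology are invented, and the paper's tomogravity analysis is not reproduced here.

```python
# Hedged sketch: a directed stream network (edges point downstream) and
# betweenness centrality as a simple flag for potential hotspots.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("headwater_A", "confluence_1"), ("headwater_B", "confluence_1"),
    ("confluence_1", "gage_1"), ("headwater_C", "gage_1"),
    ("gage_1", "outlet"),
])

bc = nx.betweenness_centrality(G)
for node, score in sorted(bc.items(), key=lambda kv: -kv[1]):
    print(f"{node:14s} {score:.2f}")   # high scores sit on many flow paths
```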
An improved method for functional similarity analysis of genes based on Gene Ontology.
Tian, Zhen; Wang, Chunyu; Guo, Maozu; Liu, Xiaoyan; Teng, Zhixia
2016-12-23
Measures of gene functional similarity are essential tools for gene clustering, gene function prediction, evaluation of protein-protein interactions, disease gene prioritization and other applications. In recent years, many gene functional similarity methods have been proposed based on the semantic similarity of GO terms. However, these leading approaches may make error-prone judgments, especially when they measure the specificity of GO terms as well as the information content (IC) of a term set. Therefore, how to estimate gene functional similarity reliably is still a challenging problem. We propose WIS, an effective method to measure gene functional similarity. First of all, WIS computes the IC of a term by employing its depth, the number of its ancestors, and the topology of its descendants in the GO graph. Secondly, WIS calculates the IC of a term set by considering the weighted inherited semantics of terms. Finally, WIS estimates gene functional similarity based on the IC overlap ratio of term sets. WIS is superior to some other representative measures in experiments on the functional classification of genes in a biological pathway, collaborative evaluation of GO-based semantic similarity measures, protein-protein interaction prediction and correlation with gene expression. Further analysis suggests that WIS takes full account of the specificity of terms and the weighted inherited semantics between GO terms. The proposed WIS method is an effective and reliable way to compare gene function. The web service of WIS is freely available at http://nclab.hit.edu.cn/WIS/.
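For readers unfamiliar with IC-based measures, the sketch below shows the general shape of such a computation: a classic annotation-frequency IC and a toy best-match score between two term sets. This is explicitly not the WIS formulation (which uses term depth, ancestors and descendant topology); the annotation counts are invented.

```python
# Illustrative sketch only: annotation-frequency information content (IC) and a
# toy best-match similarity between two GO term sets. NOT the WIS formulation.
import math

# hypothetical annotation counts per GO term in some corpus
annotation_count = {"GO:A": 5000, "GO:B": 120, "GO:C": 15, "GO:D": 15}
total = sum(annotation_count.values())

def ic(term: str) -> float:
    """Rarer terms are more specific, hence carry more information."""
    return -math.log(annotation_count[term] / total)

def set_similarity(set1, set2) -> float:
    """Best-match average of a simple pairwise IC-ratio score (toy scoring)."""
    def best(t, other):
        return max(min(ic(t), ic(u)) / max(ic(t), ic(u)) for u in other)
    scores = [best(t, set2) for t in set1] + [best(t, set1) for t in set2]
    return sum(scores) / len(scores)

print(round(set_similarity({"GO:B", "GO:C"}, {"GO:C", "GO:D"}), 3))
```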
Short-term forecasting tools for agricultural nutrient management
USDA-ARS?s Scientific Manuscript database
The advent of real time/short term farm management tools is motivated by the need to protect water quality above and beyond the general guidance offered by existing nutrient management plans. Advances in high performance computing and hydrologic/climate modeling have enabled rapid dissemination of ...
Probabilistic finite elements for fatigue and fracture analysis
NASA Astrophysics Data System (ADS)
Belytschko, Ted; Liu, Wing Kam
Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.
Computer-Aided Analysis of Patents for Product Technology Maturity Forecasting
NASA Astrophysics Data System (ADS)
Liang, Yanhong; Gan, Dequan; Guo, Yingchun; Zhang, Peng
Product technology maturity forecasting is vital for any enterprise seeking to seize opportunities for innovation and stay competitive over the long term. The Theory of Inventive Problem Solving (TRIZ) is acknowledged both as a systematic methodology for innovation and as a powerful tool for technology forecasting. Based on TRIZ, the state of the art in product technology maturity assessment and the limits of its application are discussed. With the application of text mining and patent analysis technologies, this paper proposes a computer-aided approach for product technology maturity forecasting. It can overcome the shortcomings of the current methods.
Probabilistic finite elements for fatigue and fracture analysis
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Liu, Wing Kam
1992-01-01
Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.
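The two problem types can be summarized by the standard first-order second-moment relations that PFEM-type analyses build on; the notation below is generic and not taken from the paper:

\[
\mu_u \approx u(\bar{\mathbf b}), \qquad
\sigma_u^2 \approx \sum_{i=1}^{n}\sum_{j=1}^{n}
\left.\frac{\partial u}{\partial b_i}\right|_{\bar{\mathbf b}}
\left.\frac{\partial u}{\partial b_j}\right|_{\bar{\mathbf b}}
\operatorname{Cov}(b_i,b_j), \qquad
P_f = \int_{g(\mathbf b)\le 0} f_{\mathbf b}(\mathbf b)\, d\mathbf b,
\]

where \(u\) is a structural response, \(b_i\) are the random design parameters with mean vector \(\bar{\mathbf b}\), \(g\) is the limit-state function, and \(f_{\mathbf b}\) is the joint density of the parameters.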
Riera, Amalis; Ford, John K; Ross Chapman, N
2013-09-01
Killer whales in British Columbia are at risk, and little is known about their winter distribution. Passive acoustic monitoring of their year-round habitat is a valuable supplemental method to traditional visual and photographic surveys. However, long-term acoustic studies of odontocetes have some limitations, including the generation of large amounts of data that require highly time-consuming processing. There is a need to develop tools and protocols to maximize the efficiency of such studies. Here, two types of analysis, real-time and long-term spectral averages, were compared to assess their performance at detecting killer whale calls in long-term acoustic recordings. In addition, two different duty cycles, 1/3 and 2/3, were tested. Both the use of long-term spectral averages and a lower duty cycle resulted in a decrease in call detection and positive pod identification, leading to underestimations of the amount of time the whales were present. The impact of these limitations should be considered in future killer whale acoustic surveys. A compromise between a lower-resolution data processing method and a higher duty cycle is suggested for maximum methodological efficiency.
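The long-term spectral average (LTSA) approach amounts to averaging consecutive spectrogram columns into coarse time bins so that months of recordings can be screened quickly. A minimal sketch is given below; the synthetic signal, sample rate and bin length are placeholders.

```python
# Hedged sketch of a long-term spectral average (LTSA).
import numpy as np
from scipy.signal import spectrogram

fs = 8000                                    # assumed sample rate (Hz)
t = np.arange(0, 60, 1 / fs)                 # one minute of synthetic audio
x = np.sin(2 * np.pi * 1200 * t) + 0.1 * np.random.randn(t.size)

f, times, Sxx = spectrogram(x, fs=fs, nperseg=1024)
bin_seconds = 5                              # average into 5 s LTSA bins
cols_per_bin = int(bin_seconds / (times[1] - times[0]))
n_bins = Sxx.shape[1] // cols_per_bin
ltsa = Sxx[:, :n_bins * cols_per_bin].reshape(f.size, n_bins, cols_per_bin).mean(axis=2)
print("LTSA shape (freq bins x time bins):", ltsa.shape)
```

The time compression is what makes LTSA screening fast, and it is also why faint or brief calls can be missed relative to real-time analysis, consistent with the reduced detection rates reported above.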
Battlespace Awareness: Heterogeneous Sensor Maps of Large Scale, Complex Environments
2017-06-13
reference frames enable a system designer to describe the position of any sensor or platform at any point of time. This section introduces the...analysis to evaluate the quality of reconstructions created by our algorithms. CloudCompare is an open-source tool designed for this purpose [65]. In...structure of the data. The data term seeks to keep the proposed solution (u) similar to the originally observed values (f). A systems designer must
Uchendu, C; Blake, H
2017-03-01
Diabetes is a chronic progressive condition presenting physical, social and psychological challenges that increase the risk of comorbid mental health problems. Cognitive-behavioural therapy (CBT) is effective in treating a variety of psychological disorders, and may potentially improve glycaemic control and psychological outcomes in diabetes. This systematic review and meta-analysis aims to establish the effectiveness of CBT on glycaemic control and comorbid diabetes-related distress, depression, anxiety and quality of life in the short, medium and longer term among adults with diabetes. An electronic search was conducted in PubMed, Embase, MEDLINE, PsycINFO, CINAHL, Web of Knowledge, Cochrane Central Register of Controlled Trials and references in reviews. Twelve randomized controlled trials (RCTs) were identified that evaluated the effectiveness of CBT on at least one of: glycaemic control, diabetes-related distress, anxiety, depression or quality of life in adults with Type 1 or Type 2 diabetes. The Cochrane Risk of Bias Tool and Review Manager version 5.3 were used for risk of bias assessment and meta-analysis, respectively. CBT is effective in improving short-term and medium-term glycaemic control, although no significant effect was found for long-term glycaemic control. CBT improved short- and medium-term anxiety and depression, and long-term depression. Mixed results were found for diabetes-related distress and quality of life. CBT is beneficial in improving depression for adults with diabetes. It may have benefits for improving glycaemic control and other aspects of psychological health, although the findings are inconclusive. © 2016 Diabetes UK.
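For context, the inverse-variance pooling that meta-analysis software such as Review Manager performs can be written in its generic random-effects form (this does not reproduce the paper's specific figures):

\[
\hat\theta = \frac{\sum_i w_i\,\hat\theta_i}{\sum_i w_i}, \qquad
w_i = \frac{1}{\hat\sigma_i^2 + \hat\tau^2}, \qquad
\operatorname{SE}(\hat\theta) = \frac{1}{\sqrt{\sum_i w_i}},
\]

where \(\hat\theta_i\) is each trial's effect estimate (e.g., the mean difference in HbA1c), \(\hat\sigma_i^2\) its within-study variance, and \(\hat\tau^2\) the estimated between-study heterogeneity (set to zero in a fixed-effect model).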
An overview of the BioCreative 2012 Workshop Track III: interactive text mining task
Arighi, Cecilia N.; Carterette, Ben; Cohen, K. Bretonnel; Krallinger, Martin; Wilbur, W. John; Fey, Petra; Dodson, Robert; Cooper, Laurel; Van Slyke, Ceri E.; Dahdul, Wasila; Mabee, Paula; Li, Donghui; Harris, Bethany; Gillespie, Marc; Jimenez, Silvia; Roberts, Phoebe; Matthews, Lisa; Becker, Kevin; Drabkin, Harold; Bello, Susan; Licata, Luana; Chatr-aryamontri, Andrew; Schaeffer, Mary L.; Park, Julie; Haendel, Melissa; Van Auken, Kimberly; Li, Yuling; Chan, Juancarlos; Muller, Hans-Michael; Cui, Hong; Balhoff, James P.; Chi-Yang Wu, Johnny; Lu, Zhiyong; Wei, Chih-Hsuan; Tudor, Catalina O.; Raja, Kalpana; Subramani, Suresh; Natarajan, Jeyakumar; Cejuela, Juan Miguel; Dubey, Pratibha; Wu, Cathy
2013-01-01
In many databases, biocuration primarily involves literature curation, which usually involves retrieving relevant articles, extracting information that will translate into annotations and identifying new incoming literature. As the volume of biological literature increases, the use of text mining to assist in biocuration becomes increasingly relevant. A number of groups have developed tools for text mining from a computer science/linguistics perspective, and there are many initiatives to curate some aspect of biology from the literature. Some biocuration efforts already make use of a text mining tool, but there have not been many broad-based systematic efforts to study which aspects of a text mining tool contribute to its usefulness for a curation task. Here, we report on an effort to bring together text mining tool developers and database biocurators to test the utility and usability of tools. Six text mining systems presenting diverse biocuration tasks participated in a formal evaluation, and appropriate biocurators were recruited for testing. The performance results from this evaluation indicate that some of the systems were able to improve efficiency of curation by speeding up the curation task significantly (∼1.7- to 2.5-fold) over manual curation. In addition, some of the systems were able to improve annotation accuracy when compared with the performance on the manually curated set. In terms of inter-annotator agreement, the factors that contributed to significant differences for some of the systems included the expertise of the biocurator on the given curation task, the inherent difficulty of the curation and attention to annotation guidelines. After the task, annotators were asked to complete a survey to help identify strengths and weaknesses of the various systems. The analysis of this survey highlights how important task completion is to the biocurators’ overall experience of a system, regardless of the system’s high score on design, learnability and usability. In addition, strategies to refine the annotation guidelines and systems documentation, to adapt the tools to the needs and query types the end user might have and to evaluate performance in terms of efficiency, user interface, result export and traditional evaluation metrics have been analyzed during this task. This analysis will help to plan for a more intense study in BioCreative IV. PMID:23327936
CRAB3: Establishing a new generation of services for distributed analysis at CMS
NASA Astrophysics Data System (ADS)
Cinquilli, M.; Spiga, D.; Grandi, C.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Riahi, H.; Vaandering, E.
2012-12-01
In CMS Computing the highest priorities for analysis tools are the improvement of the end users’ ability to produce and publish reliable samples and analysis results as well as a transition to a sustainable development and operations model. To achieve these goals CMS decided to incorporate analysis processing into the same framework as data and simulation processing. This strategy foresees that all workload tools (Tier0, Tier1, production, analysis) share a common core with long-term maintainability as well as the standardization of the operator interfaces. The re-engineered analysis workload manager, called CRAB3, makes use of newer technologies, such as RESTful web services and NoSQL databases, aiming to increase the scalability and reliability of the system. As opposed to CRAB2, in CRAB3 all work is centrally injected and managed in a global queue. A pool of agents, which can be geographically distributed, consumes work from the central services serving the user tasks. The new architecture of CRAB substantially changes the deployment model and operations activities. In this paper we present the implementation of CRAB3, emphasizing how the new architecture improves the workflow automation and simplifies maintainability. In particular, we will highlight the impact of the new design on daily operations.
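The "central queue plus pool of agents" pattern described above can be illustrated conceptually as follows; this is a generic sketch using threads as stand-ins for distributed agents, not CRAB3 code or its actual APIs.

```python
# Conceptual sketch only (not CRAB3): user tasks are injected into one central
# queue and a pool of "agents" pulls work from it until the queue is drained.
import queue
import threading

central_queue: "queue.Queue[str]" = queue.Queue()
for task_id in range(8):                      # centrally injected user tasks
    central_queue.put(f"analysis-task-{task_id}")

def agent(name: str) -> None:
    while True:
        try:
            task = central_queue.get_nowait()
        except queue.Empty:
            return
        print(f"{name} processing {task}")    # stand-in for real job submission
        central_queue.task_done()

workers = [threading.Thread(target=agent, args=(f"agent-{i}",)) for i in range(3)]
for w in workers:
    w.start()
for w in workers:
    w.join()
```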
CFGP: a web-based, comparative fungal genomics platform.
Park, Jongsun; Park, Bongsoo; Jung, Kyongyong; Jang, Suwang; Yu, Kwangyul; Choi, Jaeyoung; Kong, Sunghyung; Park, Jaejin; Kim, Seryun; Kim, Hyojeong; Kim, Soonok; Kim, Jihyun F; Blair, Jaime E; Lee, Kwangwon; Kang, Seogchan; Lee, Yong-Hwan
2008-01-01
Since the completion of the Saccharomyces cerevisiae genome sequencing project in 1996, the genomes of over 80 fungal species have been sequenced or are currently being sequenced. Resulting data provide opportunities for studying and comparing fungal biology and evolution at the genome level. To support such studies, the Comparative Fungal Genomics Platform (CFGP; http://cfgp.snu.ac.kr), a web-based multifunctional informatics workbench, was developed. The CFGP comprises three layers, including the basal layer, middleware and the user interface. The data warehouse in the basal layer contains standardized genome sequences of 65 fungal species. The middleware processes queries via six analysis tools, including BLAST, ClustalW, InterProScan, SignalP 3.0, PSORT II and a newly developed tool named BLASTMatrix. The BLASTMatrix permits the identification and visualization of genes homologous to a query across multiple species. The Data-driven User Interface (DUI) of the CFGP was built on a new concept of pre-collecting data and post-executing analysis instead of the 'fill-in-the-form-and-press-SUBMIT' user interfaces utilized by most bioinformatics sites. A tool termed Favorite, which supports the management of encapsulated sequence data and provides a personalized data repository to users, is another novel feature in the DUI.
A critical appraisal of advances in the diagnosis of diverticular disease.
Tursi, Antonio
2018-06-19
Diverticulosis of the colon is a common condition, and about one-fourth of affected people develop symptoms, a condition called 'diverticular disease' (DD). Since there are still some concerns about the diagnosis of DD, the aim of this review was to analyze current and evolving advances in its diagnosis. Area covered: Analysis of clinical, radiological, laboratory and endoscopic tools for making a correct diagnosis of DD was performed according to the current PubMed literature. Expert commentary: A combination of the clinical characteristics of the abdominal pain and fecal calprotectin expression may help to differentiate between symptomatic uncomplicated diverticular disease and irritable bowel syndrome. Abdominal computerized tomography (CT) scanning is still the gold standard in diagnosing acute diverticulitis and its complications. CT colonography may be useful as a tool for predicting the outcome of the disease. The Diverticular Inflammation and Complications Assessment (DICA) endoscopic classification shows a significant relationship between the severity of the DICA score and inflammatory indexes, as well as with the severity of abdominal pain. Moreover, it seems to be predictive of the outcome of the disease in terms of acute diverticulitis occurrence/recurrence and surgery occurrence. Finally, preliminary data indicate that intestinal microbiota analysis is a promising tool for diagnosing and monitoring this disease.
Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neubauer, J.
2014-12-01
The deployment and use of lithium-ion (Li-ion) batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge (SOC) histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory (NREL) has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite. This suite of tools pairs NREL’s high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
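To show why both application and climate inputs matter, the sketch below uses a generic capacity-fade model mixing calendar aging (growing with time and temperature) and cycling aging (growing with throughput and depth of discharge). The functional form and every coefficient are invented placeholders, not NREL's degradation model.

```python
# Hedged illustration: a generic (not NREL's) capacity-fade model.
import math

def capacity_fade(years: float, avg_temp_c: float, cycles_per_day: float,
                  depth_of_discharge: float) -> float:
    """Return remaining relative capacity (1.0 = fresh cell); coefficients invented."""
    arrhenius = math.exp(0.06 * (avg_temp_c - 25.0))          # assumed T sensitivity
    calendar_loss = 0.03 * arrhenius * math.sqrt(years)        # ~sqrt(t) calendar fade
    cycle_loss = 2e-5 * cycles_per_day * 365 * years * depth_of_discharge
    return max(0.0, 1.0 - calendar_loss - cycle_loss)

print(f"hot climate:  {capacity_fade(8, 35, 1.0, 0.8):.2%} capacity left")
print(f"mild climate: {capacity_fade(8, 15, 1.0, 0.8):.2%} capacity left")
```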
Spreading Effect in Industrial Complex Network Based on Revised Structural Holes Theory
Ye, Qing; Guan, Jun
2016-01-01
This paper analyzed the spreading effect of industrial sectors with a complex network model from the perspective of econophysics. Input-output analysis, as an important research tool, focuses more on static analysis. However, the fundamental aim of industry analysis is to figure out how interaction between different industries makes impacts on economic development, which turns out to be a dynamic process. Thus, an industrial complex network based on input-output tables from WIOD is proposed to be a bridge connecting accurate static quantitative analysis and comparable dynamic analysis. With the application of revised structural holes theory, flow betweenness and random walk centrality were respectively chosen to evaluate industrial sectors’ long-term and short-term spreading effect processes in this paper. It shows that industries with higher flow betweenness or random walk centrality would bring about more intensive industrial spreading effects along the industrial chains they stand in, because the value stream transmission of an industrial sector depends on how many products or services it can get from the other ones, and such sectors are regarded as brokers with greater information superiority and more intermediate interests. PMID:27218468
Spreading Effect in Industrial Complex Network Based on Revised Structural Holes Theory.
Xing, Lizhi; Ye, Qing; Guan, Jun
2016-01-01
This paper analyzed the spreading effect of industrial sectors with a complex network model from the perspective of econophysics. Input-output analysis, as an important research tool, focuses more on static analysis. However, the fundamental aim of industry analysis is to figure out how interaction between different industries makes impacts on economic development, which turns out to be a dynamic process. Thus, an industrial complex network based on input-output tables from WIOD is proposed to be a bridge connecting accurate static quantitative analysis and comparable dynamic analysis. With the application of revised structural holes theory, flow betweenness and random walk centrality were respectively chosen to evaluate industrial sectors' long-term and short-term spreading effect processes in this paper. It shows that industries with higher flow betweenness or random walk centrality would bring about more intensive industrial spreading effects along the industrial chains they stand in, because the value stream transmission of an industrial sector depends on how many products or services it can get from the other ones, and such sectors are regarded as brokers with greater information superiority and more intermediate interests.
Becker, François; Fourgeau, Patrice; Carpentier, Patrick H; Ouchène, Amina
2018-06-01
We postulate that blue telangiectasia and brownish pigmentation at ankle level, early markers of chronic venous insufficiency, can be quantified for longitudinal studies of chronic venous disease in Caucasian people. Objectives and methods: To describe a photographic technique specially developed for this purpose. The pictures were acquired using a dedicated photo stand to position the foot in a reproducible way, with a normalized lighting and acquisition protocol. The image analysis was performed with a tool developed using algorithms optimized to detect and quantify blue telangiectasia and brownish pigmentation and their relative surface in the region of interest. The short-term reproducibility of the measures was also tested. Results: The quantification of the blue telangiectasia and of the brownish pigmentation using automated digital photo analysis is feasible. The short-term reproducibility is good for blue telangiectasia quantification. It is less accurate for the brownish pigmentation. Conclusion: The blue telangiectasia of the corona phlebectatica and the ankle flare can be assessed using a clinimetric approach based on automated digital photo analysis.
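The kind of quantification described above can be illustrated with a simple colour-threshold sketch: count pixels in the region of interest whose colour is blue-dominant and report their relative surface. The image, ROI and thresholds are synthetic; the published tool uses its own optimized detection algorithms.

```python
# Hedged sketch: relative surface of "blue-dominant" pixels in a region of interest.
import numpy as np

rng = np.random.default_rng(1)
image = rng.integers(120, 200, size=(400, 400, 3), dtype=np.uint8)  # skin-like noise
image[150:180, 100:300] = (60, 70, 160)                             # fake telangiectasia

roi = image[100:300, 50:350].astype(int)                            # region of interest
r, g, b = roi[..., 0], roi[..., 1], roi[..., 2]
blue_mask = (b > r + 30) & (b > g + 30)                             # assumed colour rule
relative_surface = blue_mask.mean()
print(f"blue telangiectasia covers {relative_surface:.1%} of the ROI")
```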
NASA Technical Reports Server (NTRS)
Hou, Gene
1998-01-01
Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among the many methods available for sensitivity analysis, automatic differentiation has been proven through many applications in fluid dynamics and structural mechanics to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require a large amount of memory. This project will apply an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order thermal derivatives, respectively. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.
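Conceptually, what a forward-mode automatic differentiation tool such as ADIFOR does is propagate a derivative alongside every value; ADIFOR itself is a Fortran source transformer, so the dual-number sketch below is only an illustration of the idea on a toy response.

```python
# Conceptual sketch of forward-mode automatic differentiation via dual numbers.
from dataclasses import dataclass
import math

@dataclass
class Dual:
    val: float   # function value
    der: float   # derivative w.r.t. the chosen design parameter

    def __add__(self, o):
        return Dual(self.val + o.val, self.der + o.der)

    def __mul__(self, o):
        return Dual(self.val * o.val, self.val * o.der + self.der * o.val)

def sin(x: Dual) -> Dual:
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# toy "response" u(p) = p*p + sin(p), differentiated at p = 2 by seeding der = 1
p = Dual(2.0, 1.0)
u = p * p + sin(p)
print(f"u = {u.val:.4f}, du/dp = {u.der:.4f}  (analytic: {2*2 + math.cos(2):.4f})")
```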
Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040
NASA Technical Reports Server (NTRS)
Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.
2012-01-01
Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated, optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating to mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs. Finally, the software is managed in accordance with the CMMI (Capability Maturity Model Integration), where it has been appraised at maturity level 3.
Sebestyén, Endre; Nagy, Tibor; Suhai, Sándor; Barta, Endre
2009-01-01
Background The comparative genomic analysis of a large number of orthologous promoter regions of the chordate and plant genes from the DoOP databases shows thousands of conserved motifs. Most of these motifs differ from any known transcription factor binding site (TFBS). To identify common conserved motifs, we need a specific tool to be able to search amongst them. Since conserved motifs from the DoOP databases are linked to genes, the result of such a search can give a list of genes that are potentially regulated by the same transcription factor(s). Results We have developed a new tool called DoOPSearch for the analysis of the conserved motifs in the promoter regions of chordate or plant genes. We used the orthologous promoters of the DoOP database to extract thousands of conserved motifs from different taxonomic groups. The advantage of this approach is that different sets of conserved motifs might be found depending on how broad the taxonomic coverage of the underlying orthologous promoter sequence collection is (consider e.g. primates vs. mammals or Brassicaceae vs. Viridiplantae). The DoOPSearch tool allows the users to search these motif collections or the promoter regions of DoOP with user supplied query sequences or any of the conserved motifs from the DoOP database. To find overrepresented gene ontologies, the gene lists obtained can be analysed further using a modified version of the GeneMerge program. Conclusion We present here a comparative genomics based promoter analysis tool. Our system is based on a unique collection of conserved promoter motifs characteristic of different taxonomic groups. We offer both a command line and a web-based tool for searching in these motif collections using user specified queries. These can be either short promoter sequences or consensus sequences of known transcription factor binding sites. The GeneMerge analysis of the search results allows the user to identify statistically overrepresented Gene Ontology terms that might provide a clue on the function of the motifs and genes. PMID:19534755
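The underlying search task, scanning promoter sequences for a consensus motif while tolerating a few mismatches, can be sketched as below. The sequences and motif are invented, and DoOPSearch's own matching engine is more sophisticated than this naive scan.

```python
# Hedged sketch: naive scan of promoter sequences for a consensus motif,
# allowing a fixed number of mismatches per window.
def scan_for_motif(sequence: str, motif: str, max_mismatches: int = 1):
    """Yield (position, mismatches) for every window within the tolerance."""
    m = len(motif)
    for i in range(len(sequence) - m + 1):
        mismatches = sum(1 for a, b in zip(sequence[i:i + m], motif) if a != b)
        if mismatches <= max_mismatches:
            yield i, mismatches

promoters = {                                   # made-up example sequences
    "geneA_promoter": "TTGACGTCATCGGGCTATAAAGGC",
    "geneB_promoter": "CCCTTGACGTGATAAGGCTGCAAT",
}
for name, seq in promoters.items():
    hits = list(scan_for_motif(seq, "TGACGTCA", max_mismatches=1))
    print(name, hits)
```

Collecting the genes whose promoters contain hits, and then running an enrichment step such as GeneMerge over that list, mirrors the workflow described in the abstract.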
Electronic waste management approaches: An overview
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiddee, Peeranart; Cooperative Research Centre for Contamination Assessment and Remediation of the Environment, Mawson Lakes Campus, Adelaide, SA 5095; Naidu, Ravi, E-mail: ravi.naidu@crccare.com
2013-05-15
Highlights: ► Human toxicity of hazardous substances in e-waste. ► Environmental impacts of e-waste from disposal processes. ► Life Cycle Assessment (LCA), Material Flow Analysis (MFA), Multi Criteria Analysis (MCA) and Extended Producer Responsibility (EPR) to manage and solve e-waste problems. ► Key issues relating to tools managing e-waste for sustainable e-waste management. - Abstract: Electronic waste (e-waste) is one of the fastest-growing pollution problems worldwide given the presence of a variety of toxic substances which can contaminate the environment and threaten human health, if disposal protocols are not meticulously managed. This paper presents an overview of toxic substances present in e-waste, their potential environmental and human health impacts, together with management strategies currently being used in certain countries. Several tools including Life Cycle Assessment (LCA), Material Flow Analysis (MFA), Multi Criteria Analysis (MCA) and Extended Producer Responsibility (EPR) have been developed to manage e-wastes, especially in developed countries. The key to success in terms of e-waste management is to develop eco-design devices, properly collect e-waste, recover and recycle material by safe methods, dispose of e-waste by suitable techniques, forbid the transfer of used electronic devices to developing countries, and raise awareness of the impact of e-waste. No single tool is adequate, but together they can complement each other to solve this issue. A national scheme such as EPR is a good policy for solving the growing e-waste problem.
Chrimes, Dillon; Kushniruk, Andre; Kitos, Nicole R.
2014-01-01
Purpose Usability testing can be used to evaluate human computer interaction (HCI) and communication in shared decision making (SDM) for patient-provider behavioral change and behavioral contracting. Traditional evaluations of usability using scripted or mock patient scenarios with think-aloud protocol analysis provide a means to identify HCI issues. In this paper we describe the application of these methods in the evaluation of the Avoiding Diabetes Thru Action Plan Targeting (ADAPT) tool, and test the usability of the tool to support the ADAPT framework for integrated care counseling of pre-diabetes. The think-aloud protocol analysis typically does not provide an assessment of how patient-provider interactions are affected in “live” clinical workflow or whether a tool is successful. Therefore, “near-live” clinical simulations involving applied simulation methods were used to complement the think-aloud results. This complementary usability technique was used to test the end-user HCI and tool performance by more closely mimicking the clinical workflow and capturing interaction sequences, while also assessing the functionality of the computer module prototypes within the clinician workflow. We expected this method to further complement and provide different usability findings as compared to think-aloud analysis. Together, this mixed method evaluation provided comprehensive and realistic feedback for iterative refinement of the ADAPT system prior to implementation. Methods The study employed two phases of testing of a new interactive ADAPT tool that embedded an evidence-based shared goal setting component into primary care workflow for dealing with pre-diabetes counseling within a commercial physician office electronic health record (EHR). Phase I applied usability testing that involved “think-aloud” protocol analysis of 8 primary care providers interacting with several scripted clinical scenarios. Phase II used “near-live” clinical simulations of 5 providers interacting with standardized trained patient actors enacting the clinical scenario of counseling for pre-diabetes, each of whom had a pedometer that recorded the number of steps taken over a week. In both phases, all sessions were audio-taped and motion screen-capture software was activated for onscreen recordings. Transcripts were coded using iterative qualitative content analysis methods. Results In Phase I, the impact of the components and layout of ADAPT on users’ Navigation, Understandability, and Workflow was associated with the largest volume of negative comments (i.e. approximately 80% of end-user commentary), while Usability and Content of ADAPT were representative of more positive than negative user commentary. The heuristic category of Usability had a positive-to-negative comment ratio of 2.1, reflecting positive perception of the usability of the tool, its functionality, and overall co-productive utilization of ADAPT. However, there were mixed perceptions about content (i.e., how the information was displayed, organized and described in the tool). In Phase II, the duration of patient encounters was approximately 10 minutes, with all of the Patient Instructions (prescriptions) and behavioral contracting being activated at the end of each visit. Upon activation, providers accepted the pathway prescribed by the tool 100% of the time and completed all the fields in the tool in the simulation cases. Only 14% of encounter time was spent using the functionality of the ADAPT tool in terms of keystrokes and entering relevant data. 
The rest of the time was spent on communication and dialogue to populate the patient instructions. In all cases, the interaction sequence of reviewing and discussing the patient's exercise and diet was linked to the functionality of the ADAPT tool in terms of monitoring, response-efficacy, self-efficacy, and negotiation in the patient-provider dialogue. There was a change from one-way dialogue to two-way dialogue and negotiation that ended in a behavioral contract. This change demonstrated the tool’s sequence, which supported recording current exercise and diet followed by a diet and exercise goal setting procedure to reduce the risk of diabetes onset. Conclusions This study demonstrated that “think-aloud” protocol analysis with “near-live” clinical simulations provided a successful usability evaluation of a new primary care pre-diabetes shared goal setting tool. Each phase of the study provided complementary observations on problems with the new onscreen tool and was used to show the influence of the ADAPT framework on the usability, workflow integration, and communication between the patient and provider. The think-aloud tests with the provider showed the tool can be used according to the ADAPT framework (exercise-to-diet behavior change and tool utilization), while the clinical simulations revealed the ADAPT framework to realistically support patient-provider communication to obtain a behavioral change contract. SDM interactions and mechanisms affecting protocol-based care can be more completely captured by combining “near-live” clinical simulations with traditional “think-aloud” analysis, which augments clinician utilization. More analysis is required to verify whether the rich communication actions found in Phase II complement clinical workflows. PMID:24981988
Visualization of protein interaction networks: problems and solutions
2013-01-01
Background Visualization concerns the representation of data visually and is an important task in scientific research. Protein-protein interactions (PPI) are discovered using either wet lab techniques, such as mass spectrometry, or in silico prediction tools, resulting in large collections of interactions stored in specialized databases. The set of all interactions of an organism forms a protein-protein interaction network (PIN) and is an important tool for studying the behaviour of the cell machinery. Since graphic representation of PINs may highlight important substructures, e.g. protein complexes, visualization is more and more used to study the underlying graph structure of PINs. Although graphs are well known data structures, there are several open problems regarding PIN visualization: the high number of nodes and connections, the heterogeneity of nodes (proteins) and edges (interactions), and the possibility to annotate proteins and interactions with biological information extracted from ontologies (e.g. Gene Ontology), which enriches the PINs with semantic information but complicates their visualization. Methods In recent years many software tools for the visualization of PINs have been developed. Initially intended for visualization only, some of them have subsequently been enriched with new functions for PPI data management and PIN analysis. The paper analyzes the main software tools for PIN visualization considering four main criteria: (i) technology, i.e. availability/license of the software and supported OS (Operating System) platforms; (ii) interoperability, i.e. ability to import/export networks in various formats, ability to export data in a graphic format, extensibility of the system, e.g. through plug-ins; (iii) visualization, i.e. supported layout and rendering algorithms and availability of parallel implementation; (iv) analysis, i.e. availability of network analysis functions, such as clustering or mining of the graph, and the possibility to interact with external databases. Results Currently, many tools are available and it is not easy for users to choose among them. Some tools offer sophisticated 2D and 3D network visualization making available many layout algorithms, other tools are more data-oriented and support integration of interaction data coming from different sources and data annotation. Finally, some specialized tools are dedicated to the analysis of pathways and cellular processes and are oriented toward systems biology studies, where the dynamic aspects of the processes being studied are central. Conclusion A current trend is the deployment of open, extensible visualization tools (e.g. Cytoscape) that may be incrementally enriched by the interactomics community with novel and more powerful functions for PIN analysis, through the development of plug-ins. On the other hand, another emerging trend regards the efficient and parallel implementation of the visualization engine, which may provide high interactivity and near real-time response, as in NAViGaTOR. From a technological point of view, open-source, free and extensible tools, like Cytoscape, guarantee long-term sustainability due to the size of their developer and user communities, and provide great flexibility since new functions are continuously added by the developer community through new plug-ins, while the emerging parallel, often closed-source tools like NAViGaTOR can offer near real-time response even in the analysis of very large PINs. PMID:23368786
NASA Astrophysics Data System (ADS)
Monasterio, Leonardo Monteiro
2010-03-01
This paper analyzes the spatial dynamics of Brazilian regional inequalities between 1872 and 2000 using contemporary tools. The first part of the paper provides new estimates of income per capita in 1872 by municipality using census and electoral information on income by occupation. The level of analysis is the Minimum Comparable Areas 1872-2000 developed by Reis et al. (Áreas mínimas comparáveis para os períodos intercensitários de 1872 a 2000, 2007). These areas are the smallest aggregations of adjacent municipalities required to allow consistent geographic comparisons between census years. In the second section of the paper, Exploratory Spatial Data Analysis, Markov chains and (spatially conditioned) stochastic kernel techniques are applied to the dataset. The results suggest that, in broad terms, the spatial pattern of income distribution in Brazil has remained stable over that period.
Prolonged instability prior to a regime shift
Spanbauer, Trisha; Allen, Craig R.; Angeler, David G.; Eason, Tarsha; Fritz, Sherilyn C.; Garmestani, Ahjond S.; Nash, Kirsty L.; Stone, Jeffery R.
2014-01-01
Regime shifts are generally defined as the point of ‘abrupt’ change in the state of a system. However, a seemingly abrupt transition can be the product of a system reorganization that has been ongoing much longer than is evident in statistical analysis of a single component of the system. Using both univariate and multivariate statistical methods, we tested a long-term high-resolution paleoecological dataset with a known change in species assemblage for a regime shift. Analysis of this dataset with Fisher Information and multivariate time series modeling showed that there was a ~2000-year period of instability prior to the regime shift. This period of instability and the subsequent regime shift coincide with regional climate change, indicating that the system is undergoing extrinsic forcing. Paleoecological records offer a unique opportunity to test tools for the detection of thresholds and stable states, and thus to examine the long-term stability of ecosystems over periods of multiple millennia.
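As a rough illustration of the leading-indicator analysis described above, the sketch below computes a simple discretized Fisher Information estimate (FI = 4·Σ(Δ√p)² over binned system states) in sliding windows of a time series. This is a simplified stand-in, not the authors' implementation; the binning scheme and synthetic data are assumptions.

```python
import numpy as np

def fisher_information(window, n_bins=10):
    """Discrete Fisher Information of one window of observations.

    Observations are binned into states; FI = 4 * sum over adjacent states of
    (sqrt(p_k) - sqrt(p_{k+1}))^2, a common discrete approximation.
    """
    span = np.ptp(window, axis=0) + 1e-12              # avoid division by zero
    binned = np.floor((window - window.min(axis=0)) / span * (n_bins - 1))
    states = [tuple(row) for row in binned.astype(int)]
    labels = sorted(set(states))
    p = np.array([states.count(s) for s in labels], dtype=float)
    p /= p.sum()
    q = np.sqrt(p)
    return 4.0 * float(np.sum(np.diff(q) ** 2))

def sliding_fi(series, width=50, step=10, n_bins=10):
    """Fisher Information in overlapping windows along a (multi)variate series."""
    x = np.asarray(series, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    return [fisher_information(x[i:i + width], n_bins)
            for i in range(0, len(x) - width + 1, step)]

# Synthetic example: a noisy series with a shift in mean partway through;
# windows spanning the transition typically show lower FI (less dynamic order).
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0, 1, 400), rng.normal(4, 1, 400)])
print(sliding_fi(y)[:3], "...")
```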
Atmospheric, Long Baseline, and Reactor Neutrino Data Constraints on θ13
NASA Astrophysics Data System (ADS)
Roa, J. E.; Latimer, D. C.; Ernst, D. J.
2009-08-01
An atmospheric neutrino oscillation tool that uses full three-neutrino oscillation probabilities and a full three-neutrino treatment of the Mikheyev-Smirnov-Wolfenstein effect, together with an analysis of the K2K, MINOS, and CHOOZ data, is used to examine the bounds on θ13. The recent, more finely binned, Super-K atmospheric data are employed. For L/Eν ≳ 10^4 km/GeV, we previously found significant terms linear in θ13. This analysis finds θ13 bounded from above by the atmospheric data while bounded from below by CHOOZ. The origin of this result arises from data in the previously mentioned very long baseline region; here, matter effects conspire with terms linear in θ13 to produce asymmetric bounds on θ13. Assuming CP conservation, we find θ13 = -0.07 +0.18/-0.11 (90% C.L.).
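For reference, the full three-neutrino vacuum oscillation probability underlying such an analysis has the standard textbook form below (matter effects modify the effective mixing via the MSW potential, and sign conventions for the CP-violating term vary); this is not the authors' specific parameterization:

```latex
P(\nu_\alpha \to \nu_\beta) = \delta_{\alpha\beta}
  - 4 \sum_{i>j} \mathrm{Re}\!\left( U^{*}_{\alpha i} U_{\beta i} U_{\alpha j} U^{*}_{\beta j} \right)
      \sin^{2}\!\left( \frac{\Delta m^{2}_{ij} L}{4E} \right)
  + 2 \sum_{i>j} \mathrm{Im}\!\left( U^{*}_{\alpha i} U_{\beta i} U_{\alpha j} U^{*}_{\beta j} \right)
      \sin\!\left( \frac{\Delta m^{2}_{ij} L}{2E} \right),
\qquad \sin\theta_{13} = \left| U_{e3} \right|,
```

where U is the PMNS mixing matrix, L the baseline and E the neutrino energy.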
NASA Astrophysics Data System (ADS)
Krehbiel, C.; Maiersperger, T.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.
2016-12-01
Three major obstacles facing big Earth data users include data storage, management, and analysis. As the amount of satellite remote sensing data increases, so does the need for better data storage and management strategies to exploit the plethora of data now available. Standard GIS tools can help big Earth data users who interact with and analyze increasingly large and diverse datasets. In this presentation we highlight how NASA's Land Processes Distributed Active Archive Center (LP DAAC) is tackling these big Earth data challenges. We provide a real-life use case to describe three tools and services provided by the LP DAAC to more efficiently exploit big Earth data in a GIS environment. First, we describe the Open-source Project for a Network Data Access Protocol (OPeNDAP), which requests only the specific data needed, minimizing the amount of data that a user downloads and improving the efficiency of data downloading and processing. Next, we cover the LP DAAC's Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), a web application interface for extracting and analyzing land remote sensing data. From there, we review an ArcPython toolbox that was developed to provide quality control services for land remote sensing data products. Locating and extracting specific subsets of larger big Earth datasets improves data storage and management efficiency for the end user, and quality control services provide a straightforward interpretation of big Earth data. These tools and services are beneficial to the GIS user community in terms of standardizing workflows and improving data storage, management, and analysis tactics.
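As a hedged illustration of the OPeNDAP-style subsetting described above, the snippet below uses xarray to open a remote dataset lazily and transfer only a spatial slice. The URL, variable name and coordinate names are placeholders, not actual LP DAAC endpoints.

```python
import xarray as xr

# Hypothetical OPeNDAP endpoint for a land remote sensing product; substitute
# the real data-pool URL for the granule of interest.
URL = "https://example.gov/opendap/hyrax/MOD13Q1/granule.nc"

# Opening the URL reads only metadata; no array data are transferred yet.
ds = xr.open_dataset(URL)

# Subsetting by coordinate before loading means only the requested slice
# travels over the network -- the storage/management benefit OPeNDAP offers.
# Variable and coordinate names ("NDVI", "lat", "lon", descending latitude)
# are assumptions for this sketch.
ndvi_subset = ds["NDVI"].isel(time=0).sel(
    lat=slice(44.0, 42.0), lon=slice(-104.0, -96.0))
ndvi_subset.load()          # data transfer happens here
print(ndvi_subset.shape)
```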
Software applications for flux balance analysis.
Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup
2014-01-01
Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and interoperability. Overall, most of the applications are able to perform the basic tasks of model creation and FBA simulation. The COBRA Toolbox, OptFlux and FASIMU are versatile enough to support advanced in silico algorithms to identify environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet stand out for providing user-friendly interfaces for model handling. In terms of software architecture, FBA-SimVis and OptFlux offer flexible environments, as they support plug-ins/add-ons to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services, such as central model repositories and support for collaborative efforts, was observed among the web-based applications with the help of advanced web technologies. Furthermore, most recent applications such as the Model SEED, FAME, MetaFlux and MicrobesFlux have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion of the future directions of FBA applications is offered for the benefit of potential tool developers.
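At its core, FBA is a linear program: maximize a target (e.g. biomass) flux subject to steady-state mass balance S·v = 0 and flux bounds. The toy sketch below uses scipy rather than any of the reviewed packages, purely to make the formulation concrete; the three-reaction network is invented.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, A -> B, B -> biomass, with metabolites A and B
# at steady state. Columns are reactions [uptake, A->B, B->biomass];
# rows are metabolites [A, B].
S = np.array([[ 1, -1,  0],
              [ 0,  1, -1]])

bounds = [(0, 10), (0, 1000), (0, 1000)]     # flux bounds; uptake capped at 10
c = np.array([0, 0, -1])                     # maximize biomass = minimize -v_biomass

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun)     # 10.0, limited by the uptake bound
print("flux distribution:", res.x)
```

The reviewed packages wrap exactly this kind of optimization behind model I/O (e.g. SBML), constraint handling and visualization.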
Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project
NASA Astrophysics Data System (ADS)
van Eck, T.; Giardini, D.
2010-12-01
The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive European integrated RI for earthquake seismological data that is scalable and sustainable. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented and produced advanced analysis tools and software packages. A seismic data portal provides a single access point and overview for European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include, among others: - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from more than 53 observatories. These data are continuously monitored, quality controlled and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core of a comprehensive European acceleration database. Standardized parameter analyses and the associated software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000-1963, M ≥ 5.8), including analysis tools. - Data from three one-year OBS deployments at three sites (Atlantic, Ionian and Ligurian Sea) in the general SEED format, thus creating the core of an integrated database for ocean-, sea- and land-based seismological observatories. Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models, including a standardized model description and several visualisation tools, currently being adapted to a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community-accepted forecast testing and model validation approach, and a core hazard portal developed with the same technologies as the NERIES data portal. - Homogeneous shakemap estimation tools implemented at several large European observatories, and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization with relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange and information management in seismology as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010 - 2014), is funded and will integrate the seismological and the earthquake engineering infrastructures. NERIES further provided the proof of concept for the ESFRI2008 initiative: the European Plate Observing System (EPOS). Its preparatory phase (2010 - 2014) is also funded by the EC.
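Waveform data from archives such as those described above (e.g. ORFEUS/EIDA) are nowadays commonly retrieved through FDSN web services; a hedged ObsPy example is sketched below. The network, station and channel codes are only illustrative and should be replaced with ones that actually exist for the chosen time window.

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

# ORFEUS is one of the data centres named above; the station code here is
# illustrative -- substitute a network/station known to be available.
client = Client("ORFEUS")
t0 = UTCDateTime("2010-01-01T00:00:00")
stream = client.get_waveforms(network="NL", station="HGN", location="*",
                              channel="BHZ", starttime=t0, endtime=t0 + 3600)
print(stream)            # one hour of broadband vertical-component data
# stream.plot()          # uncomment for a quick-look plot
```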
Solar Sail Propulsion Technology at NASA
NASA Technical Reports Server (NTRS)
Johnson, Charles Les
2007-01-01
NASA's In-Space Propulsion Technology Program developed the first generation of solar sail propulsion systems sufficient to accomplish inner solar system science and exploration missions. These first generation solar sails, when operational, will range in size from 40 meters to well over 100 meters in diameter and have an area density of less than 13 grams per square meter. A rigorous, multi-year technology development effort culminated in 2005 with the testing of two different 20-m solar sail systems under thermal vacuum conditions. This effort provided a number of significant insights into the optimal design and expected performance of solar sails as well as an understanding of the methods and costs of building and using them. In addition, solar sail orbital analysis tools for mission design were developed and tested. Laboratory simulations of the effects of long-term space radiation exposure were also conducted on two candidate solar sail materials. Detailed radiation and charging environments were defined for mission trajectories outside the protection of the earth's magnetosphere, in the solar wind environment. These were used in other analytical tools to prove the adequacy of sail design features for accommodating the harsh space environment. The presentation will describe the status of solar sail propulsion within NASA, near-term solar sail mission applications, and near-term plans for further development.
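To put the quoted sail loading in perspective, an idealized back-of-the-envelope estimate (assuming a perfectly reflecting flat sail at 1 AU, and ignoring spacecraft bus mass and non-ideal optics, so it overstates real performance) gives:

```latex
P_{\mathrm{rad}} \approx \frac{2 S_{\odot}}{c}
  \approx \frac{2 \times 1361\ \mathrm{W\,m^{-2}}}{3\times 10^{8}\ \mathrm{m\,s^{-1}}}
  \approx 9.1\ \mu\mathrm{N\,m^{-2}},
\qquad
a \approx \frac{P_{\mathrm{rad}}}{\sigma}
  \approx \frac{9.1\times 10^{-6}\ \mathrm{N\,m^{-2}}}{0.013\ \mathrm{kg\,m^{-2}}}
  \approx 0.7\ \mathrm{mm\,s^{-2}},
```

i.e. a sail at the stated 13 g/m² areal density would, in this idealized limit, see on the order of a millimeter-per-second-squared of acceleration at 1 AU.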
An R package for the integrated analysis of metabolomics and spectral data.
Costa, Christopher; Maraschin, Marcelo; Rocha, Miguel
2016-06-01
Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques such as nuclear magnetic resonance, gas or liquid chromatography, mass spectrometry, infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks such as biological and biomedical discovery, biotechnology and drug development. However, as happens with other omics data, the analysis of metabolomics datasets poses multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, of the available software tools, none addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple-to-use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Digital teaching tools and global learning communities.
Williams, Mary; Lockhart, Patti; Martin, Cathie
2015-01-01
In 2009, we started a project to support the teaching and learning of university-level plant sciences, called Teaching Tools in Plant Biology. Articles in this series are published by the plant science journal, The Plant Cell (published by the American Society of Plant Biologists). Five years on, we investigated how the published materials are being used through an analysis of the Google Analytics pageviews distribution and through a user survey. Our results suggest that this project has had a broad, global impact in supporting higher education, and also that the materials are used differently by individuals in terms of their role (instructor, independent learner, student) and geographical location. We also report on our ongoing efforts to develop a global learning community that encourages discussion and resource sharing.
Psychometric testing of an instrument to measure the experience of home.
Molony, Sheila L; McDonald, Deborah Dillon; Palmisano-Mills, Christine
2007-10-01
Research related to quality of life in long-term care has been hampered by a paucity of measurement tools sensitive to environmental interventions. The primary aim of this study was to test the psychometric properties of a new instrument, the Experience of Home (EOH) Scale, designed to measure the strength of the experience of meaningful person-environment transaction. The instrument was administered to 200 older adults in diverse dwelling types. Principal components analysis provided support for construct validity, eliciting a three-factor solution accounting for 63.18% of variance in scores. Internal consistency reliability was supported with Cronbach's alpha of .96 for the entire scale. The EOH Scale is a unique research tool to evaluate interventions to improve quality of living in residential environments.
AMIDE: a free software tool for multimodality medical image analysis.
Loening, Andreas Markus; Gambhir, Sanjiv Sam
2003-07-01
Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's abilities to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.
NASA Astrophysics Data System (ADS)
Poat, M. D.; Lauret, J.; Betts, W.
2015-12-01
The STAR online computing environment is an intensive, ever-growing system used for real-time data collection and analysis. Composed of heterogeneous and sometimes custom-tuned groups of machines, the computing infrastructure was previously managed by manual configurations and inconsistently monitored by a combination of tools. This situation led to configuration inconsistency and an overload of repetitive tasks, along with lackluster communication between personnel and machines. Globally securing this heterogeneous cyberinfrastructure was tedious at best, and an agile, policy-driven system ensuring consistency was pursued. Three configuration management tools, Chef, Puppet, and CFEngine, were compared in terms of reliability, versatility and performance, along with a comparison of the infrastructure monitoring tools Nagios and Icinga. STAR has selected the CFEngine configuration management tool and the Icinga infrastructure monitoring system, leading to a versatile and sustainable solution. By leveraging these two tools STAR can now swiftly upgrade and modify the environment to its needs with ease, as well as promptly react to cyber-security requests. By creating a sustainable long-term monitoring solution, the time to detect failures was reduced from days to minutes, allowing rapid action before issues become dire problems that could cause loss of precious experimental data or uptime.
RNA-SSPT: RNA Secondary Structure Prediction Tools
Ahmad, Freed; Mahboob, Shahid; Gulzar, Tahsin; Din, Salah U; Hanif, Tanzeela; Ahmad, Hifza; Afzal, Muhammad
2013-01-01
The prediction of RNA structure is useful for understanding evolution in both in silico and in vitro studies. Physical methods such as NMR studies to determine RNA secondary structure are expensive and difficult. Computational RNA secondary structure prediction is easier. Comparative sequence analysis provides the best solution, but secondary structure prediction from a single RNA sequence is challenging. RNA-SSPT is a tool that computationally predicts the secondary structure of a single RNA sequence. Most RNA secondary structure prediction tools do not allow pseudoknots in the structure or are unable to locate them. The Nussinov dynamic programming algorithm has been implemented in RNA-SSPT. The current study assumes that only the energetically most favorable secondary structure is required, and a modification of the algorithm is also available that produces base pairs so as to lower the total free energy of the secondary structure. For visualization of RNA secondary structure, NAVIEW (written in C) is used, ported to C# to meet the tool's requirements. RNA-SSPT is built in C# using .NET 2.0 in Microsoft Visual Studio 2005 Professional Edition. The accuracy of RNA-SSPT is tested in terms of Sensitivity and Positive Predictive Value. It is a tool which serves both secondary structure prediction and secondary structure visualization purposes. PMID:24250115
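The Nussinov base-pair-maximization recurrence mentioned above can be sketched compactly. This is a generic textbook version (maximize base pairs subject to a minimum hairpin loop), not the RNA-SSPT code, which additionally handles pseudoknot location and the energy-oriented modification described.

```python
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def nussinov(seq, min_loop=3):
    """Maximum base-pair count for each subsequence seq[i..j] (Nussinov DP)."""
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                       # case: i left unpaired
            for k in range(i + min_loop + 1, j + 1):  # case: i pairs with k
                if (seq[i], seq[k]) in PAIRS:
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, 1 + dp[i + 1][k - 1] + right)
            dp[i][j] = best
    return dp

def traceback(dp, seq, i, j, pairs, min_loop=3):
    """Recover one optimal set of base pairs from the filled DP table."""
    if j - i <= min_loop:
        return
    if dp[i][j] == dp[i + 1][j]:
        traceback(dp, seq, i + 1, j, pairs, min_loop)
        return
    for k in range(i + min_loop + 1, j + 1):
        if (seq[i], seq[k]) in PAIRS:
            right = dp[k + 1][j] if k + 1 <= j else 0
            if dp[i][j] == 1 + dp[i + 1][k - 1] + right:
                pairs.append((i, k))
                traceback(dp, seq, i + 1, k - 1, pairs, min_loop)
                if k + 1 <= j:
                    traceback(dp, seq, k + 1, j, pairs, min_loop)
                return

seq = "GGGAAAUCC"
dp = nussinov(seq)
pairs = []
traceback(dp, seq, 0, len(seq) - 1, pairs)
print(pairs)   # [(0, 8), (1, 7), (2, 6)] -- paired positions (0-based)
```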
Cultural expressions of depression and the development of the Indonesian Depression Checklist.
Widiana, Herlina Siwi; Simpson, Katrina; Manderson, Lenore
2018-06-01
Depression may manifest differently across cultural settings, suggesting the value of an assessment tool that is sensitive enough to capture these variations. The study reported in this article aimed to develop a depression screening tool for Indonesians derived from ethnographic interviews with 20 people who had been diagnosed as having depression by clinical psychologists at primary health centers. The tool, which we have termed the Indonesian Depression Checklist (IDC), consists of 40 items. The tool was administered to 125 people assessed to have depression by 40 clinical psychologists in primary health centers. The data were analyzed with Confirmatory Factor Analysis (CFA) using IBM SPSS AMOS software. CFA identified a five-factor hierarchical model (χ² = 168.157, p = .091; CFI = .963; TLI = .957; RMSEA = .036). A 19-item inventory of the IDC, with five factors - Physical Symptoms, Affect, Cognition, Social Engagement and Religiosity - was identified. There was a strong correlation between the total score of the IDC and the total score of the Center for Epidemiological Studies-Depression scale (revised version CES-D), a standard tool for assessing symptoms of depression. The IDC accommodates culturally distinctive aspects of depression among Indonesians that are not included in the CES-D.
NASA Astrophysics Data System (ADS)
Masoud, Alaa; Koike, Katsuaki
2017-09-01
Detection and analysis of linear features related to surface and subsurface structures have been deemed necessary in natural resource exploration and earth surface instability assessment. Subjectivity in choosing control parameters required in conventional methods of lineament detection may cause unreliable results. To reduce this ambiguity, we developed LINDA (LINeament Detection and Analysis), an integrated tool with a graphical user interface, written in Visual Basic. This tool automates processes of detection and analysis of linear features from grid data of topography (digital elevation model; DEM), gravity and magnetic surfaces, as well as data from remote sensing imagery. A simple interface with five display windows forms a user-friendly interactive environment. The interface facilitates grid data shading, detection and grouping of segments, lineament analyses for calculating strike and dip and estimating fault type, and interactive viewing of lineament geometry. Density maps of the center and intersection points of linear features (segments and lineaments) are also included. A systematic analysis of test DEMs and Landsat 7 ETM+ imagery datasets in the North and South Eastern Deserts of Egypt is implemented to demonstrate the capability of LINDA and the correct use of its functions. Linear features from the DEM are superior to those from the imagery in terms of frequency, but both agree with the location and direction of V-shaped valleys, dykes, and reference fault data. Through the case studies, the applicability of LINDA to highlighting dominant structural trends is demonstrated, which can aid understanding of geodynamic frameworks in any region.
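LINDA's exact segment-detection algorithm is not spelled out here; as a rough Python analogue of one common approach, the sketch below applies a gradient filter to a DEM grid and groups the resulting edge pixels into straight segments with a probabilistic Hough transform (scikit-image). The thresholds and the synthetic DEM are arbitrary choices for illustration.

```python
import numpy as np
from skimage.filters import sobel
from skimage.transform import probabilistic_hough_line

def detect_segments(dem, edge_quantile=0.95, min_length=25, gap=3):
    """Very rough lineament-style segment detection on a gridded surface."""
    edges = sobel(dem.astype(float))                  # gradient magnitude
    binary = edges > np.quantile(edges, edge_quantile)
    segments = probabilistic_hough_line(
        binary, threshold=10, line_length=min_length, line_gap=gap)
    # Orientation of each segment in image coordinates (0-180 degrees)
    orientations = [np.degrees(np.arctan2(y1 - y0, x1 - x0)) % 180
                    for (x0, y0), (x1, y1) in segments]
    return segments, orientations

# Synthetic DEM: a diagonal scarp plus noise, standing in for real topography
yy, xx = np.mgrid[0:200, 0:200]
dem = np.where(xx + yy > 200, 100.0, 0.0)
dem += np.random.default_rng(0).normal(0, 0.5, dem.shape)
segments, orientations = detect_segments(dem)
print(len(segments), "segments, median orientation:", np.median(orientations))
```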
PANDORA: keyword-based analysis of protein sets by integration of annotation sources.
Kaplan, Noam; Vaaknin, Avishay; Linial, Michal
2003-10-01
Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing testing thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge on individual proteins.
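The core set-intersection idea (finding protein subsets that share annotation terms, and the intersections of such subsets) can be sketched as follows. This is an illustration of the concept only, not PANDORA's implementation, and the demo annotations are invented.

```python
from collections import defaultdict
from itertools import combinations

def keyword_subsets(annotations, min_size=2):
    """Group proteins by shared annotation terms and report term intersections.

    `annotations` maps protein -> set of terms (SwissProt keywords, GO terms,
    InterPro entries, ...). Returns each single term and each term pair
    together with the proteins carrying all of those terms.
    """
    by_term = defaultdict(set)
    for protein, terms in annotations.items():
        for term in terms:
            by_term[term].add(protein)

    subsets = {(t,): members for t, members in by_term.items()
               if len(members) >= min_size}
    for t1, t2 in combinations(sorted(by_term), 2):
        shared = by_term[t1] & by_term[t2]
        if len(shared) >= min_size:
            subsets[(t1, t2)] = shared
    return subsets

demo = {"P1": {"kinase", "membrane"}, "P2": {"kinase", "membrane"},
        "P3": {"kinase", "nucleus"}}
for terms, proteins in keyword_subsets(demo).items():
    print(terms, sorted(proteins))
# ('kinase',) ['P1', 'P2', 'P3'] ... ('kinase', 'membrane') ['P1', 'P2']
```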
Ballard, Stephanie A; Peretti, Matteo; Lungu, Ovidiu; Voyer, Philippe; Tabamo, Fruan; Alfonso, Linda; Cetin-Sahin, Deniz; Johnson, Sarasa M A; Wilchesky, Machelle
Although specialized communication tools can effectively reduce acute care transfers, few studies have assessed the factors that may influence the use of such tools by nursing staff at the individual level. We evaluated the associations between years of experience, tool-related training, nursing attitudes, and intensity of use of a communication tool developed to reduce transfers in a long-term care facility. We employed a mixed methods design using data from medical charts, electronic records, and semi-structured interviews. Experienced nurses used the tool significantly less than inexperienced nurses, and training had a significant positive impact on tool use. Nurses found the purpose of the tool to be confusing. No significant differences in attitude were observed based on years of experience or intensity of use. Project findings indicate that focused efforts to enrich training may increase intervention adherence. Experienced nurses in particular should be made aware of the benefits of utilizing communication tools. Copyright © 2017 Elsevier Inc. All rights reserved.
Information Communication using Knowledge Engine on Flood Issues
NASA Astrophysics Data System (ADS)
Demir, I.; Krajewski, W. F.
2012-04-01
The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to and visualization of flood inundation maps, real-time flood conditions, flood forecasts both short-term and seasonal, and other flood-related data for communities in Iowa. The system is designed for use by the general public, often people with no domain knowledge and a limited general science background. To improve effective communication with such an audience, we have introduced a new way in IFIS to get information on flood-related issues: instead of navigating among hundreds of features and interfaces of the information system and web-based sources, users obtain answers through dynamic computations based on a collection of built-in data, analyses, and methods. The IFIS Knowledge Engine connects to distributed sources of real-time stream gauges, and to in-house data sources, analysis and visualization tools, to answer questions grouped into several categories. Users will be able to provide input based on the query within the categories of rainfall, flood conditions, forecast, inundation maps, flood risk and data sensors. Our goal is the systematization of knowledge on flood-related issues, and to provide a single source for definitive answers to factual queries. The long-term goal of this knowledge engine is to make all flood-related knowledge easily accessible to everyone, and to provide an educational geoinformatics tool. A future implementation of the system will be able to accept free-form input and will offer voice recognition capabilities within browser and mobile applications. We intend to deliver increasing capabilities for the system over the coming releases of IFIS. This presentation provides an overview of our Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses future plans for providing knowledge on flood-related issues and resources.
Saba, Joseph; Audureau, Etienne; Bizé, Marion; Koloshuk, Barbara; Ladner, Joël
2013-04-01
The objective was to develop and validate a multilateral index to determine patient ability to pay for medication in low- and middle-income countries. Primary data were collected in 2009 from 117 cancer patients in China, India, Thailand, and Malaysia. The initial tool included income-, expenditure-, and asset-based items using ad hoc determined brackets. Principal components analysis was performed to determine final weights. Agreement (Kappa) was measured between results from the final tool and from an Impact Survey (IS) conducted after beginning drug therapy to quantify a patient's actual ability to pay in terms of number of drug cycles per year. The authors present the step-by-step methodology employed to develop the tool on a country-by-country basis. The overall Cronbach's alpha value was 0.84. Agreement between the Patient Financial Eligibility Tool (PFET) and IS was perfect (equal number of drug cycles) for 58.1% of patients, fair (1 cycle difference) for 29.1%, and poor (>1 cycle) for 12.8%. The overall Kappa was 0.76 (P<0.0001). The PFET is an effective tool for determining an individual's ability to pay for medication. Combined with tiered models for patient participation in the cost of medication, it could help to increase access to high-priced products in developing countries.
Funding Solar Projects at Federal Agencies: Mechanisms and Selection Criteria (Brochure)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Implementing solar energy projects at federal facilities is a process. The project planning phase of the process includes determining goals, building a team, determining site feasibility and selecting the appropriate project funding tool. This fact sheet gives practical guidance to assist decision-makers with understanding and selecting the funding tool that would best address their site goals. Because project funding tools are complex, federal agencies should seek project assistance before making final decisions. High capital requirements combined with limits on federal agency energy contracts create challenges for funding solar projects. Solar developers typically require long-term contracts (15-20 years) to spread out the initial investment and to enable payments similar to conventional utility bill payments. In the private sector, 20-year contracts have been developed, vetted, and accepted, but the General Services Administration (GSA) contract authority (federal acquisition regulation [FAR] part 41) typically limits contract terms to 10 years. Payments on shorter-term contracts make solar economically unattractive compared with conventional generation. However, in several instances, the federal sector has utilized innovative funding tools that allow long-term contracts or has created a project package that is economically attractive within a shorter contract term.
Bruining, Hilgo; Matsui, Asuka; Oguro-Ando, Asami; Kahn, René S; Van't Spijker, Heleen M; Akkermans, Guus; Stiedl, Oliver; van Engeland, Herman; Koopmans, Bastijn; van Lith, Hein A; Oppelaar, Hugo; Tieland, Liselotte; Nonkes, Lourens J; Yagi, Takeshi; Kaneko, Ryosuke; Burbach, J Peter H; Yamamoto, Nobuhiko; Kas, Martien J
2015-10-01
Quantitative genetic analysis of basic mouse behaviors is a powerful tool to identify novel genetic phenotypes contributing to neurobehavioral disorders. Here, we analyzed genetic contributions to single-trial, long-term social and nonsocial recognition and subsequently studied the functional impact of an identified candidate gene on behavioral development. Genetic mapping of single-trial social recognition was performed in chromosome substitution strains, a sophisticated tool for detecting quantitative trait loci (QTL) of complex traits. Follow-up occurred by generating and testing knockout (KO) mice of a selected QTL candidate gene. Functional characterization of these mice was performed through behavioral and neurological assessments across developmental stages and analyses of gene expression and brain morphology. Chromosome substitution strain 14 mapping studies revealed an overlapping QTL related to long-term social and object recognition harboring Pcdh9, a cell-adhesion gene previously associated with autism spectrum disorder. Specific long-term social and object recognition deficits were confirmed in homozygous (KO) Pcdh9-deficient mice, while heterozygous mice only showed long-term social recognition impairment. The recognition deficits in KO mice were not associated with alterations in perception, multi-trial discrimination learning, sociability, behavioral flexibility, or fear memory. Rather, KO mice showed additional impairments in sensorimotor development reflected by early touch-evoked biting, rotarod performance, and sensory gating deficits. This profile emerged with structural changes in deep layers of sensory cortices, where Pcdh9 is selectively expressed. This behavior-to-gene study implicates Pcdh9 in cognitive functions required for long-term social and nonsocial recognition. This role is supported by the involvement of Pcdh9 in sensory cortex development and sensorimotor phenotypes. Copyright © 2015 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
The qualitative assessment of pneumatic actuators operation in terms of vibration criteria
NASA Astrophysics Data System (ADS)
Hetmanczyk, M. P.; Michalski, P.
2015-11-01
The work quality of pneumatic actuators can be assessed in terms of multiple criteria. In the case of complex systems with pneumatic actuators retained at end positions (with occurrence of piston impact on the cylinder covers), vibration criteria constitute the most reliable indicators. The paper presents an impact assessment of the operating condition of a rodless pneumatic cylinder with regard to selected vibration symptoms. On the basis of the performed analysis, the authors identified meaningful indicators that allow evaluation of the performance and tuning of the end-position damping of the piston movement using the most common diagnostic tools (portable vibration analyzers). The presented method is useful for tuning parameters in industrial conditions.
LFQuant: a label-free fast quantitative analysis tool for high-resolution LC-MS/MS proteomics data.
Zhang, Wei; Zhang, Jiyang; Xu, Changming; Li, Ning; Liu, Hui; Ma, Jie; Zhu, Yunping; Xie, Hongwei
2012-12-01
Database-search-based methods for label-free quantification aim to reconstruct the peptide extracted ion chromatogram based on the identification information, which can limit the search space and thus make the data processing much faster. The random effect of the MS/MS sampling can be remedied by cross-assignment among different runs. Here, we present a new label-free fast quantitative analysis tool, LFQuant, for high-resolution LC-MS/MS proteomics data based on database searching. It is designed to accept raw data in two common formats (mzXML and Thermo RAW), and database search results from mainstream tools (MASCOT, SEQUEST, and X!Tandem), as input data. LFQuant can handle large-scale label-free data with fractionation such as SDS-PAGE and 2D LC. It is easy to use and provides handy user interfaces for data loading, parameter setting, quantitative analysis, and quantitative data visualization. LFQuant was compared with two common quantification software packages, MaxQuant and IDEAL-Q, on the replication data set and the UPS1 standard data set. The results show that LFQuant performs better than both in terms of precision and accuracy, and consumes significantly less processing time. LFQuant is freely available under the GNU General Public License v3.0 at http://sourceforge.net/projects/lfquant/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
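The quantification step described above (reconstructing a peptide's extracted ion chromatogram, XIC, around its identified m/z) can be sketched as below. This is a conceptual illustration only, not LFQuant's code; the scan format and example values are assumed.

```python
import numpy as np

def extracted_ion_chromatogram(scans, target_mz, ppm_tol=10.0):
    """Build an XIC for one identified peptide from centroided MS1 scans.

    `scans` is an iterable of (retention_time, mz_array, intensity_array).
    Intensities of peaks within +/- ppm_tol of target_mz are summed per scan;
    the trapezoid area of the trace serves as a label-free abundance estimate.
    """
    tol = target_mz * ppm_tol * 1e-6
    rts, trace = [], []
    for rt, mz, inten in scans:
        mask = np.abs(np.asarray(mz, float) - target_mz) <= tol
        rts.append(float(rt))
        trace.append(float(np.asarray(inten, float)[mask].sum()))
    rts, trace = np.array(rts), np.array(trace)
    area = (float(np.sum(np.diff(rts) * (trace[:-1] + trace[1:]) / 2.0))
            if len(rts) > 1 else 0.0)
    return rts, trace, area

# Tiny made-up example: three MS1 scans, peptide at m/z 523.28
scans = [(30.0, [523.277, 600.1], [1e5, 2e4]),
         (30.5, [523.281, 610.2], [3e5, 1e4]),
         (31.0, [523.279, 620.3], [1.5e5, 5e3])]
rts, trace, area = extracted_ion_chromatogram(scans, 523.28)
print(trace, area)
```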
Prolonged Instability Prior to a Regime Shift
Regime shifts are generally defined as the point of ‘abrupt’ change in the state of a system. However, a seemingly abrupt transition can be the product of a system reorganization that has been ongoing much longer than is evident in statistical analysis of a single component of the system. Using both univariate and multivariate statistical methods, we tested a long-term high-resolution paleoecological dataset with a known change in species assemblage for a regime shift. Analysis of this dataset with Fisher Information and multivariate time series modeling showed that there was a ~2000-year period of instability prior to the regime shift. This period of instability and the subsequent regime shift coincide with regional climate change, indicating that the system is undergoing extrinsic forcing. Paleoecological records offer a unique opportunity to test tools for the detection of thresholds and stable states, and thus to examine the long-term stability of ecosystems over periods of multiple millennia. This manuscript explores various methods of assessing the transition between alternative states in an ecological system described by a long-term high-resolution paleoecological dataset.
Building Training Curricula for Accelerating the Use of NOAA Climate Products and Tools
NASA Astrophysics Data System (ADS)
Timofeyeva-Livezey, M. M.; Meyers, J. C.; Stevermer, A.; Abshire, W. E.; Beller-Simms, N.; Herring, D.
2016-12-01
The National Oceanic and Atmospheric Administration (NOAA) plays a leading role in U.S. intergovernmental efforts on the Climate Data Initiative and the Climate Resilience Toolkit (CRT). CRT (http://toolkit.climate.gov/) is a valuable resource that provides tools, information, and subject matter expertise to decision makers in various sectors, such as agriculture, water resources and transportation, to help them build resilience to our changing climate. In order to make best use of the toolkit and all the resources within it, a training component is critical. The training section helps build users' understanding of the data, science, and impacts of climate variability and change. CRT identifies five steps in building resilience that include the use of appropriate tools to support decision makers depending on their needs. One tool that can potentially be integrated into CRT is NOAA's Local Climate Analysis Tool (LCAT), which provides access to trusted NOAA data and scientifically sound analysis techniques for doing regional and local climate studies on climate variability and climate change. However, for LCAT to be used effectively, we have found that an iterative learning approach using specific examples is needed to train users. For example, for LCAT application in the analysis of water resources, we use existing CRT case studies for Arizona and Florida water supply users. The Florida example demonstrates primary sensitivity to climate variability impacts, whereas the Arizona example takes into account longer-term climate change. The types of analyses included in LCAT are time series analysis of local climate and the estimated rate of change in the local climate. It also provides a composite analysis to evaluate the relationship between local climate and climate variability events such as the El Niño Southern Oscillation, the Pacific North American Index, and other modes of climate variability. This paper will describe the development of a training module for the use of LCAT and its integration into CRT. An iterative approach was used that incorporates specific examples of decision making while working with subject matter experts within the water supply community. The recommended strategy is to use a "stepping stone" learning structure to build users' knowledge of best practices for the use of LCAT.
Information environments for supporting consistent registrar medical handover.
Alem, Leila; Joseph, Michele; Kethers, Stefanie; Steele, Cathie; Wilkinson, Ross
This study was two-fold in nature. Initially, it examined the information environment and the use of customary information tools to support medical handovers in a large metropolitan teaching hospital on four weekends (i.e. Friday night to Monday morning). Weekend medical handovers were found to involve sequences of handovers where patients were discussed at the discretion of the doctor handing over; no reliable discussion of all patients of concern occurred at any one handover, with few information tools being used; and after a set of weekend handovers, there was no complete picture on a Monday morning without an analysis of all patient progress notes. In a subsequent case study, three information tools specifically designed as an intervention to enrich the information environment were evaluated. Results indicate that these tools did support greater continuity in who was discussed but not in what was discussed at handover. After the intervention, if a doctor discussed a patient at handover, that patient was more likely to be discussed at subsequent handovers. However, the picture on Monday morning remained fragmentary. The results are discussed in terms of the complexities inherent in the handover process.
NASA Astrophysics Data System (ADS)
Soni, Sourabh Kumar; Thomas, Benedict
2018-04-01
The term "weldability" has been used to describe a wide variety of characteristics when a material is subjected to welding. In our analysis we perform experimental investigation to estimate the tensile strength of welded joint strength and then optimization of welding process parameters by using taguchi method and Artificial Neural Network (ANN) tool in MINITAB and MATLAB software respectively. The study reveals the influence on weldability of steel by varying composition of steel by mechanical characterization. At first we prepare the samples of different grades of steel (EN8, EN 19, EN 24). The samples were welded together by metal inert gas welding process and then tensile testing on Universal testing machine (UTM) was conducted for the same to evaluate the tensile strength of the welded steel specimens. Further comparative study was performed to find the effects of welding parameter on quality of weld strength by employing Taguchi method and Neural Network tool. Finally we concluded that taguchi method and Neural Network Tool is much efficient technique for optimization.
Fast analysis of radionuclide decay chain migration
NASA Astrophysics Data System (ADS)
Chen, J. S.; Liang, C. P.; Liu, C. W.; Li, L.
2014-12-01
A novel tool for rapidly predicting the long-term plume behavior of an arbitrary-length radionuclide decay chain is presented in this study. This fast tool is based on generalized analytical solutions, derived in compact form, for a set of two-dimensional advection-dispersion equations coupled with sequential first-order decay reactions in a groundwater system. The performance of the developed tool is evaluated against a numerical model using a Laplace transform finite difference scheme. The results of the performance evaluation indicate that the developed model is robust and accurate. The developed model is then used to rapidly explore the transport behavior of a four-member radionuclide decay chain. Results show that the plume extents and concentration levels of any target radionuclide are very sensitive to the longitudinal and transverse dispersion, decay rate constant and retardation factor. The developed model is a useful tool for rapidly assessing the ecological and environmental impact of accidental radionuclide releases such as the Fukushima nuclear disaster, where multiple radionuclides leaked from the reactor, subsequently contaminating the local groundwater and ocean seawater in the vicinity of the nuclear plant.
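A common mathematical statement of the problem described (assumed here; the paper's exact formulation may differ in details such as where decay acts) couples a 2D advection-dispersion equation to sequential first-order decay for each chain member i:

```latex
R_i \frac{\partial C_i}{\partial t}
  = D_L \frac{\partial^2 C_i}{\partial x^2}
  + D_T \frac{\partial^2 C_i}{\partial y^2}
  - v \frac{\partial C_i}{\partial x}
  - k_i R_i C_i + k_{i-1} R_{i-1} C_{i-1},
\qquad i = 1, \dots, N, \quad k_0 \equiv 0,
```

where C_i is the aqueous concentration, v the pore-water velocity along x, D_L and D_T the longitudinal and transverse dispersion coefficients, R_i the retardation factor and k_i the first-order decay constant of member i; the ingrowth term k_{i-1} R_{i-1} C_{i-1} assumes decay acts on both dissolved and sorbed phases.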
RipleyGUI: software for analyzing spatial patterns in 3D cell distributions
Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik
2013-01-01
The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, a MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons that is important for a detailed study of structure-function relationships. For example, neocortex that can be subdivided into six layers based on cell density and cell types can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
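The statistic at the heart of the tool, Ripley's K-function for 3D point patterns, can be estimated naively as sketched below (no edge correction, so values are biased low near the study-volume boundary; RipleyGUI itself is MATLAB-based and includes statistical tests and spatial transformations). The synthetic cell positions are placeholders.

```python
import numpy as np

def ripley_k_3d(points, radii, volume):
    """Naive 3D Ripley's K estimate: K(r) = V/n^2 * #{ordered pairs with d <= r}.

    Under complete spatial randomness K(r) ~ (4/3)*pi*r^3; values above that
    suggest clustering, values below suggest regularity (or edge-effect bias,
    since no edge correction is applied here).
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # drop self-pairs
    return np.array([volume / n**2 * np.count_nonzero(d <= r) for r in radii])

rng = np.random.default_rng(0)
cells = rng.uniform(0, 100, size=(500, 3))       # synthetic soma positions (µm)
radii = np.linspace(5, 20, 4)
print(ripley_k_3d(cells, radii, volume=100.0**3))
print(4 / 3 * np.pi * radii**3)                  # CSR reference for comparison
```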
Finite Element Simulation of Machining of Ti6Al4V Alloy
NASA Astrophysics Data System (ADS)
Rizzuti, S.; Umbrello, D.
2011-05-01
Titanium and its alloys are an important class of materials, especially for aerospace applications, due to their excellent combination of strength and fracture toughness as well as low density. However, these materials are generally regarded as difficult to machine because of their low thermal conductivity and high chemical reactivity with cutting tool materials. Moreover, the low thermal conductivity of titanium inhibits dissipation of heat within the workpiece, causing a higher temperature at the cutting edge and, at higher cutting speeds, generating rapid chipping at the cutting edge which leads to catastrophic failure. In addition, chip morphology significantly influences the thermo-mechanical behaviour at the workpiece/tool interface, which also affects the tool life. In this paper a finite element analysis of machining of Ti6Al4V is presented. In particular, cutting force, chip morphology and segmentation are taken into account due to their predominant roles in determining machinability and tool wear during the machining of these alloys. Results in terms of residual stresses are also presented. Moreover, the numerical results are compared with experimental ones.
Overaas, Cecilie K; Johansson, Melker S; de Campos, Tarcisio F; Ferreira, Manuela L; Natvig, Bard; Mork, Paul J; Hartvigsen, Jan
2017-12-16
Individuals with persistent low back pain commonly have a broad range of other health concerns, including co-occurring musculoskeletal pain, which significantly affect their quality of life, symptom severity, and treatment outcomes. The purpose of this review is to gain a better understanding of the prevalence and patterns of co-occurring musculoskeletal pain complaints in those with persistent low back pain and their potential association with age, sex, and back-related disability, as these might affect prognosis and management. This systematic review protocol has been designed according to the Preferred Reporting Items for Systematic Reviews and Meta-Analysis Protocols. We will perform a comprehensive search, with no date limit, in the following bibliographic databases: MEDLINE and Embase (via Ovid), CINAHL, and Scopus for citation tracking, based on the following domains: back pain and co-occurring musculoskeletal pain, combined with a focus on study design. Appropriate papers will be screened against the eligibility criteria by three reviewers independently, data will be extracted by two independent author pairs, and disagreements will be resolved by consensus meetings or other reviewers if required. Assessment of methodological quality and risk of bias will be conducted using a modified version of the Risk of Bias Tool for Prevalence Studies developed by Hoy and colleagues. The overall risk of bias will be determined for each included study based on the raters' consensus of the responses to the items in this tool. In case of sufficiently homogeneous studies, a meta-analysis will be performed. Given the lack of standard terms used to define co-occurring musculoskeletal pain, the search strategy will include the broader term "back pain," different terms for the "other co-occurring pain," and specific study designs combined with several exclusion terms. The results of this proposed review will identify the prevalence and patterns of co-occurring musculoskeletal pain among those with persistent low back pain, which is likely to inform clinical management, research, and policy in the management of musculoskeletal disorders. PROSPERO CRD42017068807.
Just tooling around: Experiences with arctools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuttle, M.A.
1994-06-01
The three US Department of Energy (DOE) installations on the Oak Ridge Reservation (Oak Ridge National Laboratory, Y-12 and K-25) were established during World War II as part of the Manhattan Project to build "the bomb." In later years, the work at these facilities involved nuclear energy research, defense-related activities, and uranium enrichment, resulting in the generation of radioactive material and other toxic by-products. Work is now in progress to identify and clean up the environmental contamination from these and other wastes. Martin Marietta Energy Systems, Inc., which manages the Oak Ridge sites as well as DOE installations at Portsmouth, Ohio and Paducah, Kentucky, has been charged with creating and maintaining a comprehensive environmental information system in order to comply with the Federal Facility Agreement (FFA) for the Oak Ridge Reservation and the State of Tennessee Oversight Agreement between the US Department of Energy and the State of Tennessee. As a result, the Oak Ridge Environmental Information System (OREIS) was conceived and is currently being implemented. The tools chosen for the OREIS system are Oracle for the relational database, SAS for data analysis and graphical representation, and Arc/INFO and ArcView for the spatial analysis and display component. Within the broad scope of ESRI's Arc/INFO software, ArcTools was chosen as the graphical user interface for inclusion of Arc/INFO into OREIS. The purpose of this paper is to examine the advantages and disadvantages of incorporating ArcTools for the presentation of Arc/INFO in the OREIS system. The immediate and mid-term development goals of the OREIS system as they relate to ArcTools will be presented. A general discussion of our experiences with the ArcTools product is also included.
Comparing genome versus proteome-based identification of clinical bacterial isolates.
Galata, Valentina; Backes, Christina; Laczny, Cédric Christian; Hemmrich-Stanisak, Georg; Li, Howard; Smoot, Laura; Posch, Andreas Emanuel; Schmolke, Susanne; Bischoff, Markus; von Müller, Lutz; Plum, Achim; Franke, Andre; Keller, Andreas
2018-05-01
Whole-genome sequencing (WGS) is gaining importance in the analysis of bacterial cultures derived from patients with infectious diseases. Existing computational tools for WGS-based identification have, however, been evaluated on previously defined data, thereby relying unwarily on the available taxonomic information. Here, we newly sequenced 846 clinical gram-negative bacterial isolates representing multiple distinct genera and compared the performance of five tools (CLARK, Kaiju, Kraken, DIAMOND/MEGAN and TUIT). To establish a faithful 'gold standard', the expert-driven taxonomy was compared with identifications based on matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS) analysis. Additionally, the tools were also evaluated using a data set of 200 Staphylococcus aureus isolates. CLARK and Kraken (with k = 31) performed best, with 626 (100%) and 193 (99.5%) correct species classifications for the gram-negative and S. aureus isolates, respectively. Moreover, CLARK and Kraken demonstrated the highest mean F-measure values (85.5/87.9% and 94.4/94.7% for the two data sets, respectively) in comparison with DIAMOND/MEGAN (71 and 85.3%), Kaiju (41.8 and 18.9%) and TUIT (34.5 and 86.5%). Finally, CLARK, Kaiju and Kraken outperformed the other tools by a factor of 30 to 170 in terms of runtime. We conclude that the application of nucleotide-based tools using k-mers, e.g. CLARK or Kraken, allows for accurate and fast taxonomic characterization of bacterial isolates from WGS data. Hence, our results suggest WGS-based genotyping to be a promising alternative to MS-based biotyping in clinical settings. Moreover, we suggest that complementary information should be used for the evaluation of taxonomic classification tools, as public databases may suffer from suboptimal annotations.
NASA Astrophysics Data System (ADS)
Mues, Sarah; Lilge, Inga; Schönherr, Holger; Kemper, Björn; Schnekenburger, Jürgen
2017-02-01
The major problem of Digital Holographic Microscopy (DHM) long-term live cell imaging is that over time most of the tracked cells move out of the image area and other ones move in. Therefore, most of the cells are lost for the evaluation of individual cellular processes. Here, we present an effective solution for this crucial problem of long-term microscopic live cell analysis. We have generated functionalized slides containing areas of 250 μm × 200 μm. These micropatterned biointerfaces consist of passivating polyacrylamide (PAAm) brushes. The inner areas are backfilled with octadecanethiol (ODT), which allows cell attachment. The fouling properties of these surfaces are highly controllable, and the defined areas, designed to match the size of our microscopic image areas, were effective in keeping all cells inside the rectangles over the selected imaging period.
Developments in seismic monitoring for risk reduction
Celebi, M.
2007-01-01
This paper presents recent state-of-the-art developments to obtain displacements and drift ratios for seismic monitoring and damage assessment of buildings. In most cases, decisions on the safety of buildings following seismic events are based on visual inspections of the structures. Real-time instrumental measurements using GPS or double integration of accelerations, however, offer a viable alternative. Relevant parameters, such as the type of connections and structural characteristics (including storey geometry), can be estimated to compute drifts corresponding to several pre-selected threshold stages of damage. Drift ratios determined from real-time monitoring can then be compared to these thresholds in order to estimate the damage condition. This approach is demonstrated in three steel frame buildings in San Francisco, California. Recently recorded data of strong shaking from these buildings indicate that the monitoring system can be a useful tool in rapid assessment of buildings and other structures following an earthquake. Such systems can also be used for risk monitoring, as a method to assess performance-based design and analysis procedures, for long-term assessment of the structural characteristics of a building, and as a possible long-term damage detection tool.
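As a hedged illustration of the double-integration step mentioned above, the Python sketch below converts two synthetic floor acceleration records into displacements and a peak drift ratio. The simple linear detrending stands in for the baseline correction and filtering a real monitoring system would apply, and the records and storey height are invented.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid
from scipy.signal import detrend

def displacement_from_accel(acc, dt):
    """Double-integrate an acceleration record to displacement (sketch).

    Detrending between integrations is a crude stand-in for proper baseline
    correction and band-pass filtering."""
    vel = detrend(cumulative_trapezoid(acc, dx=dt, initial=0.0))
    disp = detrend(cumulative_trapezoid(vel, dx=dt, initial=0.0))
    return disp

# Hypothetical records at two instrumented floors, 3.5 m apart
dt = 0.01
t = np.arange(0, 30, dt)
acc_lower = 0.5 * np.sin(2 * np.pi * 1.0 * t)          # m/s^2, synthetic
acc_upper = 0.8 * np.sin(2 * np.pi * 1.0 * t + 0.3)

drift = displacement_from_accel(acc_upper, dt) - displacement_from_accel(acc_lower, dt)
drift_ratio = np.max(np.abs(drift)) / 3.5              # peak inter-storey drift / height
print(f"peak drift ratio: {drift_ratio:.4f}")
```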
A web-based tool for groundwater mapping and drought analysis
NASA Astrophysics Data System (ADS)
Christensen, S.; Burns, M.; Jones, N.; Strassberg, G.
2012-12-01
In 2011-2012, the state of Texas saw the worst one-year drought on record. Fluctuations in gravity measured by GRACE satellites indicate that as much as 100 cubic kilometers of water was lost during this period. Much of this came from reservoirs and shallow soil moisture, but a significant amount came from aquifers. In response to this crisis, a Texas Drought Technology Steering Committee (TDTSC) consisting of academics and water managers was formed to develop new tools and strategies to assist the state in monitoring, predicting, and responding to drought events. In this presentation, we describe one of the tools that was developed as part of this effort. When analyzing the impact of drought on groundwater levels, it is fairly common to examine time series data at selected monitoring wells. However, accurately assessing impacts and trends requires both spatial and temporal analysis involving the development of detailed water level maps at various scales. Creating such maps in a flexible and rapid fashion is critical for effective drought analysis, but can be challenging due to the massive amounts of data involved and the processing required to generate such maps. Furthermore, wells are typically not sampled at the same points in time, and so developing a water table map for a particular date requires both spatial and temporal interpolation of water elevations. To address this challenge, a Cloud-based water level mapping system was developed for the state of Texas. The system is based on the Texas Water Development Board (TWDB) groundwater database, but can be adapted to use other databases as well. The system involves a set of ArcGIS workflows running on a server with a web-based front end and a Google Earth plug-in. A temporal interpolation geoprocessing tool was developed to estimate the piezometric heads for all wells in a given region at a specific date using a regression analysis. This interpolation tool is coupled with other geoprocessing tools to filter data and interpolate point elevations spatially to produce water level, drawdown, and depth-to-groundwater maps. The web interface allows users to generate these maps at locations and times of interest. A sequence of maps can be generated over a period of time and animated to visualize how water levels are changing. The time series regression analysis can also be used to make short-term predictions of future water levels.
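A minimal sketch of the temporal-interpolation idea, not the actual TWDB/ArcGIS geoprocessing tool: fit a simple regression to one well's time series and evaluate it at the target date. The well record, dates, and minimum-observation threshold below are hypothetical.

```python
import numpy as np

def head_at_date(sample_times, sample_heads, target_time, min_obs=3):
    """Estimate the piezometric head of one well at a target date (sketch).

    Fits a linear trend through the well's time series (times as floats, e.g.
    decimal years) and evaluates it at `target_time`. Returns None when there
    are too few observations to fit a trend."""
    t = np.asarray(sample_times, float)
    h = np.asarray(sample_heads, float)
    if len(t) < min_obs:
        return None
    slope, intercept = np.polyfit(t, h, deg=1)
    return slope * target_time + intercept

# Hypothetical well record (decimal years, heads in metres)
times = [2008.2, 2009.1, 2010.4, 2011.0, 2011.9]
heads = [152.3, 151.8, 150.9, 150.1, 148.7]
print(head_at_date(times, heads, target_time=2011.5))
```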
Nowinski, Wieslaw L; Belov, Dmitry
2003-09-01
The article introduces an atlas-assisted method and a tool called the Cerefy Neuroradiology Atlas (CNA), available over the Internet for neuroradiology and human brain mapping. The CNA contains an enhanced, extended, and fully segmented and labeled electronic version of the Talairach-Tournoux brain atlas, including parcellated gyri and Brodmann's areas. To the best of our knowledge, this is the first online, publicly available application with the Talairach-Tournoux atlas. The process of atlas-assisted neuroimage analysis is done in five steps: image data loading, Talairach landmark setting, atlas normalization, image data exploration and analysis, and result saving. Neuroimage analysis is supported by a near-real-time, atlas-to-data warping based on the Talairach transformation. The CNA runs on multiple platforms; is able to process multiple anatomical and functional data sets simultaneously; and provides functions for rapid atlas-to-data registration, interactive structure labeling and annotating, and mensuration. It is also empowered with several unique features, including interactive atlas warping facilitating fine tuning of the atlas-to-data fit, navigation on the triplanar formed by the image data and the atlas, multiple-images-in-one display with interactive atlas-anatomy-function blending, multiple label display, and saving of labeled and annotated image data. The CNA is useful for fast atlas-assisted analysis of neuroimage data sets. It increases accuracy and reduces time in the localization analysis of activation regions; facilitates communication of information about the interpreted scans from the neuroradiologist to other clinicians and medical students; increases the neuroradiologist's confidence in terms of anatomy and spatial relationships; and serves as a user-friendly, public domain tool for neuroeducation. At present, more than 700 users from five continents have subscribed to the CNA.
NASA Astrophysics Data System (ADS)
Meksiarun, Phiranuphon; Ishigaki, Mika; Huck-Pezzei, Verena A. C.; Huck, Christian W.; Wongravee, Kanet; Sato, Hidetoshi; Ozaki, Yukihiro
2017-03-01
This study aimed to extract the paraffin component from paraffin-embedded oral cancer tissue spectra using three multivariate analysis (MVA) methods: Independent Component Analysis (ICA), Partial Least Squares (PLS) and Independent Component-Partial Least Squares (IC-PLS). The estimated paraffin components were used for removing the contribution of paraffin from the tissue spectra. These three methods were compared in terms of the efficiency of paraffin removal and the ability to retain tissue information. It was found that ICA, PLS and IC-PLS could remove the paraffin component from the spectra at almost the same level, while Principal Component Analysis (PCA) could not. In terms of retaining the integrity of the cancer tissue spectra, the effects of PLS and IC-PLS on the non-paraffin regions were significantly smaller than those of ICA, which deteriorated cancer tissue spectral areas. The paraffin-removed spectra were used for constructing Raman images of oral cancer tissue and compared with Hematoxylin and Eosin (H&E) stained tissues for verification. This study has demonstrated the capability of Raman spectroscopy together with multivariate analysis methods as a diagnostic tool for paraffin-embedded tissue sections.
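To illustrate how an ICA-estimated paraffin component might be identified, the sketch below runs scikit-learn's FastICA on synthetic mixed spectra and picks the source most correlated with a paraffin-like reference band. The band positions, mixing weights, and two-component model are assumptions made for the example; this is not the authors' IC-PLS procedure.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
wn = np.linspace(700, 1800, 400)                       # wavenumber axis (cm^-1), synthetic

# Synthetic "pure" spectra: a broad tissue-like band and sharp paraffin-like bands
tissue = np.exp(-((wn - 1450) / 120) ** 2)
paraffin = np.exp(-((wn - 1296) / 8) ** 2) + 0.6 * np.exp(-((wn - 1062) / 6) ** 2)

# Mixed spectra with varying paraffin contribution plus noise
weights = rng.uniform(0.2, 1.5, size=(50, 1))
X = tissue + weights * paraffin + 0.01 * rng.standard_normal((50, wn.size))

# Treat wavenumber points as samples so each recovered source is spectrum-like
ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(X.T)                       # shape (400, 2)

# Identify the paraffin-like source by correlation with the reference band shape
corr = [abs(np.corrcoef(sources[:, k], paraffin)[0, 1]) for k in range(2)]
paraffin_idx = int(np.argmax(corr))
print("estimated paraffin component index:", paraffin_idx, "corr:", round(max(corr), 3))
```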
Ingham, Steven C; Fanslau, Melody A; Burnham, Greg M; Ingham, Barbara H; Norback, John P; Schaffner, Donald W
2007-06-01
A computer-based tool (available at: www.wisc.edu/foodsafety/meatresearch) was developed for predicting pathogen growth in raw pork, beef, and poultry meat. The tool, THERM (temperature history evaluation for raw meats), predicts the growth of pathogens in pork and beef (Escherichia coli O157:H7, Salmonella serovars, and Staphylococcus aureus) and on poultry (Salmonella serovars and S. aureus) during short-term temperature abuse. The model was developed as follows: 25-g samples of raw ground pork, beef, and turkey were inoculated with a five-strain cocktail of the target pathogen(s) and held at isothermal temperatures from 10 to 43.3 degrees C. Log CFU per sample data were obtained for each pathogen and used to determine lag-phase duration (LPD) and growth rate (GR) by DMFit software. The LPD and GR were used to develop the THERM predictive tool, into which chronological time and temperature data for raw meat processing and storage are entered. The THERM tool then predicts a delta log CFU value for the desired pathogen-product combination. The accuracy of THERM was tested in 20 different inoculation experiments that involved multiple products (coarse-ground beef, skinless chicken breast meat, turkey scapula meat, and ground turkey) and temperature-abuse scenarios. With the time-temperature data from each experiment, THERM accurately predicted pathogen growth and no growth (with growth defined as delta log CFU > 0.3) in 67, 85, and 95% of the experiments with E. coli O157:H7, Salmonella serovars, and S. aureus, respectively, and yielded fail-safe predictions in the remaining experiments. We conclude that THERM is a useful tool for qualitatively predicting pathogen behavior (growth and no growth) in raw meats. Potential applications include evaluating process deviations and critical limits under the HACCP (hazard analysis critical control point) system.
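A hedged sketch of the LPD/GR bookkeeping such a tool performs: lag is consumed in proportion to elapsed time over the temperature-dependent lag-phase duration, and growth accumulates at the temperature-dependent rate once the lag budget is spent. The parameter table and abuse scenario below are invented for illustration and are not THERM's fitted coefficients.

```python
import numpy as np

# Hypothetical isothermal parameters (NOT the THERM coefficients): temperature (°C),
# lag-phase duration LPD (h), and growth rate GR (log CFU/h) for a target pathogen.
temps = np.array([10.0, 15.0, 20.0, 25.0, 30.0, 37.0, 43.0])
lpd   = np.array([30.0, 12.0,  6.0,  3.0,  2.0,  1.5,  2.0])
gr    = np.array([0.02, 0.05, 0.12, 0.25, 0.40, 0.55, 0.45])

def predicted_delta_log(time_h, temp_c):
    """Predict cumulative growth (delta log CFU) for a time-temperature history.

    Lag is consumed as the sum of dt/LPD(T); growth accumulates at GR(T) only
    after the lag budget is spent. A sketch, not a validated model."""
    delta_log, lag_used = 0.0, 0.0
    for i in range(1, len(time_h)):
        dt = time_h[i] - time_h[i - 1]
        T = 0.5 * (temp_c[i] + temp_c[i - 1])
        if T < temps[0]:                      # below the modeled growth range: no change
            continue
        if lag_used < 1.0:
            lag_used += dt / np.interp(T, temps, lpd)
        else:
            delta_log += dt * np.interp(T, temps, gr)
    return delta_log

# Example temperature-abuse scenario: product warms to 25 °C for a few hours
hours = np.array([0, 1, 2, 4, 6, 8, 10], float)
temperature = np.array([4, 10, 18, 25, 25, 15, 6], float)
print(f"predicted delta log CFU: {predicted_delta_log(hours, temperature):.2f}")
```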
Mysara, Mohamed; Elhefnawi, Mahmoud; Garibaldi, Jonathan M
2012-06-01
The investigation of small interfering RNA (siRNA) and its post-transcriptional gene regulation has become an extremely important research topic, both for fundamental reasons and for potential longer-term therapeutic benefits. Several factors affect the functionality of siRNA, including positional preferences, target accessibility and other thermodynamic features. State-of-the-art tools aim to optimize the selection of target siRNAs by identifying those that may have high experimental inhibition. Such tools implement artificial neural network models, such as Biopredsi and ThermoComposition21, and linear regression models, such as DSIR, i-Score and Scales, among others. However, all these models have limitations in performance. In this work, a new neural-network-trained siRNA scoring/efficacy prediction model was developed by combining two existing scoring algorithms (ThermoComposition21 and i-Score), together with the whole stacking energy (ΔG), in a multi-layer artificial neural network. These three parameters were chosen after a comparative combinatorial study of five well-known tools. Our model, 'MysiRNA', was trained on 2431 siRNA records and tested using three further datasets. MysiRNA was compared with 11 existing scoring tools in an evaluation study assessing predicted versus experimental siRNA efficiency, where it achieved the highest performance both in terms of correlation coefficient (R(2)=0.600) and receiver operating characteristic analysis (AUC=0.808), improving prediction accuracy by up to 18% with respect to the sensitivity and specificity of the best available tools. MysiRNA is a novel, freely accessible model capable of predicting siRNA inhibition efficiency with improved specificity and sensitivity. This multiclassifier approach could help improve prediction performance in several bioinformatics areas. The MysiRNA model, part of the MysiRNA-Designer package [1], is expected to play a key role in siRNA selection and evaluation. Copyright © 2012 Elsevier Inc. All rights reserved.
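A minimal sketch of the general approach of combining existing scores and ΔG in a small neural network, using scikit-learn on synthetic data. The feature values, target relationship, and network size are assumptions; this is not MysiRNA's trained model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
thermo21 = rng.uniform(0, 10, n)      # stand-in for a ThermoComposition21-like score
i_score = rng.uniform(0, 100, n)      # stand-in for an i-Score-like score
delta_g = rng.uniform(-60, -20, n)    # whole stacking energy ΔG (kcal/mol), synthetic
inhibition = 0.4 * i_score + 3.0 * thermo21 + 0.3 * delta_g + rng.normal(0, 5, n)

X = np.c_[thermo21, i_score, delta_g]
X_tr, X_te, y_tr, y_te = train_test_split(X, inhibition, random_state=0)

# Small multi-layer perceptron regressor on standardized inputs
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
```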
Equation-free analysis of agent-based models and systematic parameter determination
NASA Astrophysics Data System (ADS)
Thomas, Spencer A.; Lloyd, David J. B.; Skeldon, Anne C.
2016-12-01
Agent-based models (ABMs) are increasingly used in social science, economics, mathematics, biology and computer science to describe time-dependent systems in circumstances where a description in terms of equations is difficult. Yet few tools are currently available for the systematic analysis of ABM behaviour. Numerical continuation and bifurcation analysis is a well-established tool for the study of deterministic systems. Recently, equation-free (EF) methods have been developed to extend numerical continuation techniques to systems where the dynamics are described at a microscopic scale and continuation of a macroscopic property of the system is considered. To date, the practical use of EF methods has been limited by: (1) the overhead of application-specific implementation; (2) the laborious configuration of problem-specific parameters; and (3) large ensemble sizes (potentially) leading to computationally restrictive run-times. In this paper we address these issues with our tool for the EF continuation of stochastic systems, which includes algorithms to systematically configure problem-specific parameters and enhance robustness to noise. Our tool is generic, can be applied to any 'black-box' simulator, and determines the essential EF parameters prior to EF analysis. Robustness is significantly improved using our convergence-constraint with corrector-repeat (C3R) method. This algorithm automatically detects outliers based on the dynamics of the underlying system, enabling both an order-of-magnitude reduction in ensemble size and continuation of systems at much higher levels of noise than classical approaches. We demonstrate our method with application to several ABM models, revealing parameter dependence, bifurcation and stability analysis of these complex systems and giving a deep understanding of the dynamical behaviour of the models in a way that is not otherwise easily obtainable. In each case we demonstrate our systematic parameter determination stage for configuring the system-specific EF parameters.
A UML Profile for State Analysis
NASA Technical Reports Server (NTRS)
Murray, Alex; Rasmussen, Robert
2010-01-01
State Analysis is a systems engineering methodology for the specification and design of control systems, developed at the Jet Propulsion Laboratory. The methodology emphasizes an analysis of the system under control in terms of States and their properties and behaviors and their effects on each other, a clear separation of the control system from the controlled system, cognizance in the control system of the controlled system's State, goal-based control built on constraining the controlled system's States, and disciplined techniques for State discovery and characterization. State Analysis (SA) introduces two key diagram types: State Effects and Goal Network diagrams. The team at JPL developed a tool for performing State Analysis. The tool includes a drawing capability, backed by a database that supports the diagram types and the organization of the elements of the SA models. But the tool does not support the usual activities of software engineering and design - a disadvantage, since systems to which State Analysis can be applied tend to be very software-intensive. This motivated the work described in this paper: the development of a preliminary Unified Modeling Language (UML) profile for State Analysis. Having this profile would enable systems engineers to specify a system using the methods and graphical language of State Analysis, which is easily linked with a larger system model in SysML (Systems Modeling Language), while also giving software engineers engaged in implementing the specified control system immediate access to and use of the SA model, in the same language, UML, used for other software design. That is, a State Analysis profile would serve as a shared modeling bridge between system and software models for the behavior aspects of the system. This paper begins with an overview of State Analysis and its underpinnings, followed by an overview of the mapping of SA constructs to the UML metamodel. It then delves into the details of these mappings and the constraints associated with them. Finally, we give an example of the use of the profile for expressing an example SA model.
Petterson, S R
2016-02-01
The aim of this study was to develop a modified quantitative microbial risk assessment (QMRA) framework that could be applied as a decision support tool to choose between alternative drinking water interventions in the developing context. The impact of different household water treatment (HWT) interventions on the overall incidence of diarrheal disease and disability adjusted life years (DALYs) was estimated, without relying on source water pathogen concentration as the starting point for the analysis. A framework was developed and a software tool constructed and then implemented for an illustrative case study for Nepal based on published scientific data. Coagulation combined with free chlorine disinfection provided the greatest estimated health gains in the short term; however, when long-term compliance was incorporated into the calculations, the preferred intervention was porous ceramic filtration. The model demonstrates how the QMRA framework can be used to integrate evidence from different studies to inform management decisions, and in particular to prioritize the next best intervention with respect to estimated reduction in diarrheal incidence. This study only considered HWT interventions; it is recognized that a systematic consideration of sanitation, recreation, and drinking water pathways is important for effective management of waterborne transmission of pathogens, and the approach could be expanded to consider the broader water-related context. © 2015 Society for Risk Analysis.
Ly, Thomas; Pamer, Carol; Dang, Oanh; Brajovic, Sonja; Haider, Shahrukh; Botsis, Taxiarchis; Milward, David; Winter, Andrew; Lu, Susan; Ball, Robert
2018-05-31
The FDA Adverse Event Reporting System (FAERS) is a primary data source for identifying unlabeled adverse events (AEs) in a drug or biologic drug product's postmarketing phase. Many AE reports must be reviewed by drug safety experts to identify unlabeled AEs, even if the reported AEs are previously identified, labeled AEs. Integrating the labeling status of drug product AEs into FAERS could increase report triage and review efficiency. Medical Dictionary for Regulatory Activities (MedDRA) is the standard for coding AE terms in FAERS cases. However, drug manufacturers are not required to use MedDRA to describe AEs in product labels. We hypothesized that natural language processing (NLP) tools could assist in automating the extraction and MedDRA mapping of AE terms in drug product labels. We evaluated the performance of three NLP systems, (ETHER, I2E, MetaMap) for their ability to extract AE terms from drug labels and translate the terms to MedDRA Preferred Terms (PTs). Pharmacovigilance-based annotation guidelines for extracting AE terms from drug labels were developed for this study. We compared each system's output to MedDRA PT AE lists, manually mapped by FDA pharmacovigilance experts using the guidelines, for ten drug product labels known as the "gold standard AE list" (GSL) dataset. Strict time and configuration conditions were imposed in order to test each system's capabilities under conditions of no human intervention and minimal system configuration. Each NLP system's output was evaluated for precision, recall and F measure in comparison to the GSL. A qualitative error analysis (QEA) was conducted to categorize a random sample of each NLP system's false positive and false negative errors. A total of 417, 278, and 250 false positive errors occurred in the ETHER, I2E, and MetaMap outputs, respectively. A total of 100, 80, and 187 false negative errors occurred in ETHER, I2E, and MetaMap outputs, respectively. Precision ranged from 64% to 77%, recall from 64% to 83% and F measure from 67% to 79%. I2E had the highest precision (77%), recall (83%) and F measure (79%). ETHER had the lowest precision (64%). MetaMap had the lowest recall (64%). The QEA found that the most prevalent false positive errors were context errors such as "Context error/General term", "Context error/Instructions or monitoring parameters", "Context error/Medical history preexisting condition underlying condition risk factor or contraindication", and "Context error/AE manifestations or secondary complication". The most prevalent false negative errors were in the "Incomplete or missed extraction" error category. Missing AE terms were typically due to long terms, or terms containing non-contiguous words which do not correspond exactly to MedDRA synonyms. MedDRA mapping errors were a minority of errors for ETHER and I2E but were the most prevalent false positive errors for MetaMap. The results demonstrate that it may be feasible to use NLP tools to extract and map AE terms to MedDRA PTs. However, the NLP tools we tested would need to be modified or reconfigured to lower the error rates to support their use in a regulatory setting. Tools specific for extracting AE terms from drug labels and mapping the terms to MedDRA PTs may need to be developed to support pharmacovigilance. Conducting research using additional NLP systems on a larger, diverse GSL would also be informative. Copyright © 2018. Published by Elsevier Inc.
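For reference, per-label precision, recall and F-measure against a gold-standard term list reduce to simple set arithmetic; the sketch below uses hypothetical term sets, not the study's GSL data.

```python
def precision_recall_f(true_terms, extracted_terms):
    """Precision, recall and F-measure of an extracted term set against a
    gold-standard list (sketch of per-label NLP evaluation)."""
    true_set, pred_set = set(true_terms), set(extracted_terms)
    tp = len(true_set & pred_set)
    fp = len(pred_set - true_set)
    fn = len(true_set - pred_set)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f

# Hypothetical gold-standard and system output for one drug label
gold = {"nausea", "vomiting", "hepatotoxicity", "rash"}
system = {"nausea", "rash", "headache"}
print(precision_recall_f(gold, system))
```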
Inter-subject phase synchronization for exploratory analysis of task-fMRI.
Bolt, Taylor; Nomi, Jason S; Vij, Shruti G; Chang, Catie; Uddin, Lucina Q
2018-08-01
Analysis of task-based fMRI data is conventionally carried out using a hypothesis-driven approach, where blood-oxygen-level dependent (BOLD) time courses are correlated with a hypothesized temporal structure. In some experimental designs, this temporal structure can be difficult to define. In other cases, experimenters may wish to take a more exploratory, data-driven approach to detecting task-driven BOLD activity. In this study, we demonstrate the efficiency and power of an inter-subject synchronization approach for exploratory analysis of task-based fMRI data. Combining the tools of instantaneous phase synchronization and independent component analysis, we characterize whole-brain task-driven responses in terms of group-wise similarity in temporal signal dynamics of brain networks. We applied this framework to fMRI data collected during performance of a simple motor task and a social cognitive task. Analyses using an inter-subject phase synchronization approach revealed a large number of brain networks that dynamically synchronized to various features of the task, often not predicted by the hypothesized temporal structure of the task. We suggest that this methodological framework, along with readily available tools in the fMRI community, provides a powerful exploratory, data-driven approach for analysis of task-driven BOLD activity. Copyright © 2018 Elsevier Inc. All rights reserved.
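A minimal sketch of the core computation, assuming band-limited network time courses (e.g., ICA component time series) are already in hand: instantaneous phases from the Hilbert transform, then the length of the across-subject mean resultant vector at each time point. The synthetic data below stand in for real fMRI time courses.

```python
import numpy as np
from scipy.signal import hilbert

def inter_subject_phase_sync(signals):
    """Time-resolved inter-subject phase synchronization (sketch).

    signals : (n_subjects, n_timepoints) array of band-limited time courses.
    Returns, for each time point, the length of the mean resultant vector of
    the subjects' instantaneous phases (1 = perfectly synchronized, ~0 = none).
    """
    phases = np.angle(hilbert(signals, axis=1))        # instantaneous phase per subject
    return np.abs(np.mean(np.exp(1j * phases), axis=0))

# Synthetic example: 20 subjects share a task-driven 0.05 Hz response plus noise
rng = np.random.default_rng(0)
t = np.arange(0, 300, 2.0)                             # TR = 2 s
shared = np.sin(2 * np.pi * 0.05 * t)
data = shared + 0.8 * rng.standard_normal((20, t.size))
print(inter_subject_phase_sync(data).mean().round(3))
```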
NASA Astrophysics Data System (ADS)
Hilliard, Antony
Energy Monitoring and Targeting (M&T) is a well-established business process that develops information about utility energy consumption in a business or institution. While M&T has persisted as a worthwhile energy conservation support activity, it has not been widely adopted. This dissertation explains M&T challenges in terms of diagnosing and controlling energy consumption, informed by a naturalistic field study of M&T work. A Cognitive Work Analysis of M&T identifies structures that diagnosis can search, information flows unsupported in canonical support tools, and opportunities to extend the most popular tool for M&T: Cumulative Sum of Residuals (CUSUM) charts. A design application outlines how CUSUM charts were augmented with a more contemporary statistical change detection strategy, Recursive Parameter Estimates, modified to better suit the M&T task using Representation Aiding principles. The design was experimentally evaluated in a controlled M&T synthetic task and was shown to significantly improve diagnosis performance.
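For illustration, a CUSUM-of-residuals chart reduces to a baseline regression of consumption on a driver variable followed by a cumulative sum of the residuals. The sketch below fits the baseline on the full (hypothetical) record for brevity, whereas practice would normally fit it on a designated baseline period and track the CUSUM afterwards.

```python
import numpy as np

def cusum_of_residuals(driver, consumption):
    """CUSUM chart data for energy Monitoring & Targeting (sketch).

    Fits the usual baseline regression of consumption against a driver
    (e.g., heating degree days) and returns the cumulative sum of residuals.
    A sustained drift in the CUSUM suggests a change in energy performance."""
    driver = np.asarray(driver, float)
    consumption = np.asarray(consumption, float)
    slope, intercept = np.polyfit(driver, consumption, deg=1)
    residuals = consumption - (slope * driver + intercept)
    return np.cumsum(residuals)

# Hypothetical monthly data: heating degree days and metered consumption (kWh)
degree_days = np.array([320, 280, 240, 180, 120, 80, 60, 90, 150, 220, 290, 330])
consumption = np.array([41, 37, 33, 26, 19, 14, 12, 15, 24, 33, 42, 47]) * 1000
print(cusum_of_residuals(degree_days, consumption).astype(int))
```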
Effectiveness of a Case-Based Computer Program on Students' Ethical Decision Making.
Park, Eun-Jun; Park, Mihyun
2015-11-01
The aim of this study was to test the effectiveness of a case-based computer program, using an integrative ethical decision-making model, on the ethical decision-making competency of nursing students in South Korea. This study used a pre- and posttest comparison design. Students in the intervention group used a computer program for case analysis assignments, whereas students in the standard group used a traditional paper assignment for case analysis. The findings showed that using the case-based computer program as a complementary tool for the ethics courses offered at the university enhanced students' ethical preparedness and satisfaction with the course. On the basis of the findings, it is recommended that nurse educators use a case-based computer program as a complementary self-study tool in ethics courses to supplement student learning without an increase in course hours, particularly in terms of analyzing ethics cases with dilemma scenarios and exercising ethical decision making. Copyright 2015, SLACK Incorporated.
Tracking and Quantifying Developmental Processes in C. elegans Using Open-source Tools.
Dutta, Priyanka; Lehmann, Christina; Odedra, Devang; Singh, Deepika; Pohl, Christian
2015-12-16
Quantitatively capturing developmental processes is crucial to derive mechanistic models and key to identify and describe mutant phenotypes. Here protocols are presented for preparing embryos and adult C. elegans animals for short- and long-term time-lapse microscopy and methods for tracking and quantification of developmental processes. The methods presented are all based on C. elegans strains available from the Caenorhabditis Genetics Center and on open-source software that can be easily implemented in any laboratory independently of the microscopy system used. A reconstruction of a 3D cell-shape model using the modelling software IMOD, manual tracking of fluorescently-labeled subcellular structures using the multi-purpose image analysis program Endrov, and an analysis of cortical contractile flow using PIVlab (Time-Resolved Digital Particle Image Velocimetry Tool for MATLAB) are shown. It is discussed how these methods can also be deployed to quantitatively capture other developmental processes in different models, e.g., cell tracking and lineage tracing, tracking of vesicle flow.
Near-infrared hyperspectral imaging for quality analysis of agricultural and food products
NASA Astrophysics Data System (ADS)
Singh, C. B.; Jayas, D. S.; Paliwal, J.; White, N. D. G.
2010-04-01
Agricultural and food processing industries are always looking to implement real-time quality monitoring techniques as a part of good manufacturing practices (GMPs) to ensure the high quality and safety of their products. Near-infrared (NIR) hyperspectral imaging is gaining popularity as a powerful non-destructive tool for quality analysis of several agricultural and food products. This technique has the ability to analyse spectral data in a spatially resolved manner (i.e., each pixel in the image has its own spectrum) by applying both conventional image processing and the chemometric tools used in spectral analyses. Hyperspectral imaging has demonstrated potential in detecting defects and contaminants in meats, fruits, cereals, and processed food products. This paper discusses the methodology of hyperspectral imaging in terms of hardware, software, calibration, data acquisition and compression, and the development of prediction and classification algorithms, and it presents a thorough review of the current applications of hyperspectral imaging in the analyses of agricultural and food products.
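A common pattern in hyperspectral chemometrics is to unfold the hypercube so each pixel becomes one spectrum, apply a decomposition, and fold the scores back into images. The sketch below does this with PCA on a synthetic cube; it is only a generic illustration, not a specific published pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic NIR hypercube: 64 x 64 pixels, 120 wavelength bands
rng = np.random.default_rng(0)
rows, cols, bands = 64, 64, 120
cube = rng.normal(size=(rows, cols, bands))

# Unfold the cube so each pixel is one spectrum, decompose, and refold the scores
spectra = cube.reshape(-1, bands)                 # (n_pixels, n_bands)
scores = PCA(n_components=3).fit_transform(spectra)
score_images = scores.reshape(rows, cols, 3)      # one score image per component
print(score_images.shape)
```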
Positioning matrix of economic efficiency and complexity: a case study in a university hospital.
Ippolito, Adelaide; Viggiani, Vincenzo
2014-01-01
At the end of 2010, the Federico II University Hospital in Naples, Italy, initiated a series of discussions aimed at designing and applying a positioning matrix to its departments. This analysis was developed to create a tool able to extract meaningful information both to increase knowledge about individual departments and to inform the choices of general management during strategic planning. The name given to this tool was the positioning matrix of economic efficiency and complexity. In the matrix, the x-axis measures the ratio between revenues and costs, whereas the y-axis measures the index of complexity, thus showing "profitability" while bearing in mind the complexity of activities. By using the positioning matrix, it was possible to conduct a critical analysis of the characteristics of the Federico II University Hospital and to extract useful information for general management to use during strategic planning at the end of 2010 when defining medium-term objectives. Copyright © 2013 John Wiley & Sons, Ltd.
The advanced software development workstation project
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III; Pitman, Charles L.
1991-01-01
The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE) with the emphasis on those advanced methods, tools, and processes that will be of benefit to support all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS) which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phase of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge base expert assistance.
Planetarium instructional efficacy: A research synthesis
NASA Astrophysics Data System (ADS)
Brazell, Bruce D.
The purpose of the current study was to explore the instructional effectiveness of the planetarium in astronomy education using meta-analysis. A review of the literature revealed 46 studies related to planetarium efficacy. However, only 19 of the studies satisfied selection criteria for inclusion in the meta-analysis. Selected studies were then subjected to coding procedures, which extracted information such as subject characteristics, experimental design, and outcome measures. From these data, 24 effect sizes were calculated in the area of student achievement and five effect sizes were determined in the area of student attitudes using reported statistical information. Mean effect sizes were calculated for both the achievement and the attitude distributions. Additionally, each effect size distribution was subjected to homogeneity analysis. The attitude distribution was found to be homogeneous with a mean effect size of -0.09, which was not significant, p = .2535. The achievement distribution was found to be heterogeneous with a statistically significant mean effect size of +0.28, p < .05. Since the achievement distribution was heterogeneous, the analog to the ANOVA procedure was employed to explore variability in this distribution in terms of the coded variables. The analog to the ANOVA procedure revealed that the variability introduced by the coded variables did not fully explain the variability in the achievement distribution beyond subject-level sampling error under a fixed effects model. Therefore, a random effects model analysis was performed which resulted in a mean effect size of +0.18, which was not significant, p = .2363. However, a large random effect variance component was determined indicating that the differences between studies were systematic and yet to be revealed. The findings of this meta-analysis showed that the planetarium has been an effective instructional tool in astronomy education in terms of student achievement. However, the meta-analysis revealed that the planetarium has not been a very effective tool for improving student attitudes towards astronomy.
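For reference, the fixed-effect mean effect size and the Q homogeneity statistic used in such syntheses follow directly from inverse-variance weighting; the sketch below uses invented effect sizes and variances, not the study's coded data.

```python
import numpy as np

def fixed_effect_summary(effects, variances):
    """Fixed-effect mean effect size and Q homogeneity statistic (sketch).

    Weights are inverse variances; Q is compared against a chi-square with
    k - 1 degrees of freedom to test homogeneity of the effect sizes."""
    d = np.asarray(effects, float)
    w = 1.0 / np.asarray(variances, float)
    d_bar = np.sum(w * d) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    q = np.sum(w * (d - d_bar) ** 2)
    return d_bar, se, q

# Hypothetical achievement effect sizes and their variances from coded studies
effects = [0.45, 0.10, 0.62, -0.05, 0.33]
variances = [0.04, 0.02, 0.09, 0.03, 0.05]
mean_d, se, q = fixed_effect_summary(effects, variances)
print(f"mean d = {mean_d:.2f} (SE {se:.2f}), Q = {q:.2f} on {len(effects) - 1} df")
```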
The State of Software for Evolutionary Biology
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-01-01
With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525
Tribological performances of new steel grades for hot stamping tools
NASA Astrophysics Data System (ADS)
Medea, F.; Venturato, G.; Ghiotti, A.; Bruschi, S.
2017-09-01
In recent years, the use of High Strength Steels (HSS) as structural parts in car body-in-white manufacturing has rapidly increased thanks to their favourable strength-to-weight ratio and stiffness, which allow a reduction in fuel consumption to meet the new, stricter regulations on CO2 emissions. A survey of the technical and scientific literature shows a large interest in the development of different coatings for the blanks, from the traditional Al-Si up to new Zn-based coatings, and in the analysis of hard PVD and CVD coatings and plasma nitriding applied to the tools. By contrast, fewer investigations have focused on the development and testing of new tool steel grades capable of improving the wear resistance and the thermal properties that are required for in-die quenching during forming. On this basis, the paper deals with the analysis and comparison of the tribological performance, in terms of wear, friction and heat transfer, of new tool steel grades for high-temperature applications, characterized by a higher thermal conductivity than the commonly used tools. The testing equipment and procedures, as well as the measurement analyses used to evaluate the friction coefficient, wear and heat transfer phenomena, are presented. Emphasis is given to the physical simulation techniques that were specifically developed to reproduce the thermal and mechanical cycles on the metal sheets and dies as in industrial practice. The reference industrial process is the direct hot stamping of the 22MnB5 HSS coated with the common Al-Si coating for automotive applications.
The Application of Function Points to Predict Source Lines of Code for Software Development
1992-09-01
there are some disadvantages. Software estimating tools are expensive. A single tool may cost more than $15,000 due to the high market value of the...term and Lang variables simultaneously only added marginal improvements over models with these terms included singularly. Using all the available
Teamwork Assessment Tools in Obstetric Emergencies: A Systematic Review.
Onwochei, Desire N; Halpern, Stephen; Balki, Mrinalini
2017-06-01
Team-based training and simulation can improve patient safety by improving communication, decision making, and the performance of team members. Currently, there is no general consensus on whether or not a specific assessment tool is better adapted to evaluate teamwork in obstetric emergencies. The purpose of this qualitative systematic review was to find the tools available to assess team effectiveness in obstetric emergencies. We searched Embase, Medline, PubMed, Web of Science, PsycINFO, CINAHL, and Google Scholar for prospective studies that evaluated nontechnical skills in multidisciplinary teams involving obstetric emergencies. The search included studies from 1944 until January 11, 2016. Data on reliability and validity measures were collected and used for interpretation. A descriptive analysis was performed on the data. Thirteen studies were included in the final qualitative synthesis. All the studies assessed teams in the context of obstetric simulation scenarios, but only six included anesthetists in the simulations. One study evaluated its teamwork tool using only validity measures, five used only reliability measures, and one used both. The most reliable tools identified were the Clinical Teamwork Scale, the Global Assessment of Obstetric Team Performance, and the Global Rating Scale of performance. However, they were still lacking in terms of quality and validity. More work needs to be conducted to establish the validity of teamwork tools for nontechnical skills, and the development of an ideal tool is warranted. Further studies are required to assess how outcomes, such as performance and patient safety, are influenced when using these tools.
The “Common Solutions” Strategy of the Experiment Support group at CERN for the LHC Experiments
NASA Astrophysics Data System (ADS)
Girone, M.; Andreeva, J.; Barreiro Megino, F. H.; Campana, S.; Cinquilli, M.; Di Girolamo, A.; Dimou, M.; Giordano, D.; Karavakis, E.; Kenyon, M. J.; Kokozkiewicz, L.; Lanciotti, E.; Litmaath, M.; Magini, N.; Negri, G.; Roiser, S.; Saiz, P.; Saiz Santos, M. D.; Schovancova, J.; Sciabà, A.; Spiga, D.; Trentadue, R.; Tuckett, D.; Valassi, A.; Van der Ster, D. C.; Shiers, J. D.
2012-12-01
After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments’ computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management monitoring, File Transfer monitoring) and the Site Status Board. This talk focuses primarily on the strategic aspects of providing such common solutions and how this relates to the overall goals of long-term sustainability and the relationship to the various WLCG Technical Evolution Groups. The success of the service components has given us confidence in the process, and has developed the trust of the stakeholders. We are now attempting to expand the development of common solutions into the more critical workflows. The first is a feasibility study of common analysis workflow execution elements between ATLAS and CMS. We look forward to additional common development in the future.
Predictive and concurrent validity of the Braden scale in long-term care: a meta-analysis.
Wilchesky, Machelle; Lungu, Ovidiu
2015-01-01
Pressure ulcer prevention is an important long-term care (LTC) quality indicator. While the Braden Scale is a recommended risk assessment tool, there is a paucity of information specifically pertaining to its validity within the LTC setting. We, therefore, undertook a systematic review and meta-analysis comparing Braden Scale predictive and concurrent validity within this context. We searched the Medline, EMBASE, PsychINFO and PubMed databases from 1985-2014 for studies containing the requisite information to analyze tool validity. Our initial search yielded 3,773 articles. Eleven datasets emanating from nine published studies describing 40,361 residents met all meta-analysis inclusion criteria and were analyzed using random effects models. Pooled sensitivity, specificity, positive predictive value (PPV), and negative predictive values were 86%, 38%, 28%, and 93%, respectively. Specificity was poorer in concurrent samples as compared with predictive samples (38% vs. 72%), while PPV was low in both sample types (25 and 37%). Though random effects model results showed that the Scale had good overall predictive ability [RR, 4.33; 95% CI, 3.28-5.72], none of the concurrent samples were found to have "optimal" sensitivity and specificity. In conclusion, the appropriateness of the Braden Scale in LTC is questionable given its low specificity and PPV, in particular in concurrent validity studies. Future studies should further explore the extent to which the apparent low validity of the Scale in LTC is due to the choice of cutoff point and/or preventive strategies implemented by LTC staff as a matter of course. © 2015 by the Wound Healing Society.
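The per-study inputs to such a meta-analysis are the familiar 2x2-table indices; the sketch below computes them for hypothetical counts (these are not the pooled figures reported above), and the resulting per-study values are what a random-effects pooling would then combine.

```python
def diagnostic_indices(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from one study's 2x2 table (sketch)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Hypothetical counts: Braden "at risk" classifications vs. observed pressure ulcers
print(diagnostic_indices(tp=86, fp=220, fn=14, tn=135))
```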
Long-term imaging of circadian locomotor rhythms of a freely crawling C. elegans population
Winbush, Ari; Gruner, Matthew; Hennig, Grant W.; van der Linden, Alexander M.
2016-01-01
Background: Locomotor activity is used extensively as a behavioral output to study the underpinnings of circadian rhythms. Recent studies have required a populational approach for the study of circadian rhythmicity in Caenorhabditis elegans locomotion. New method: We describe an imaging system for long-term automated recording and analysis of locomotion data of multiple free-crawling C. elegans animals on the surface of an agar plate. We devised image analysis tools for measuring specific features related to movement and shape to identify circadian patterns. Results: We demonstrate the utility of our system by quantifying circadian locomotor rhythms in wild-type and mutant animals induced by temperature cycles. We show that 13 °C:18 °C (12:12 h) cycles are sufficient to entrain locomotor activity of wild-type animals, which persists but is rapidly damped during 13 °C free-running conditions. Animals with mutations in tax-2, a cyclic nucleotide-gated (CNG) ion channel, significantly reduce locomotor activity during entrainment and free-running. Comparison with existing method(s): Current methods for measuring circadian locomotor activity are generally restricted to recording individual swimming animals of C. elegans, which is a form of locomotion distinct from the crawling behavior generally observed in the laboratory. Our system works well with up to 20 crawling adult animals and allows for a detailed analysis of locomotor activity over long periods of time. Conclusions: Our population-based approach provides a powerful tool for quantification of circadian rhythmicity of C. elegans locomotion, and could allow for a screening system of candidate circadian genes in this model organism. PMID:25911068
Food Service Technical Terms. English-Spanish Lexicon.
ERIC Educational Resources Information Center
Shin, Masako T.
This English-Spanish lexicon presents food service technical terms. The terms are divided into seven categories: basic food items, common baking terms, food cutting terms, general cooking terms, non-English culinary terms, and tools and equipment. Each English word or term is followed by its Spanish equivalent(s). (YLB)
Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A
1995-06-01
PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.
NASA Astrophysics Data System (ADS)
Rohland, Stefanie; Pfurtscheller, Clemens; Seebauer, Sebastian
2016-04-01
Keywords: private preparedness, property protection, flood, heavy rains, Transtheoretical Model, evaluation of methods and tools
Experiences in Europe and Austria from coping with numerous floods and heavy rain events in recent decades point to room for improvement in reducing damages and adverse effects. One of the emerging issues is private preparedness, which has only received sporadic attention in Austria until now. Current activities to promote property protection are, however, not underpinned by a long-term strategy, thus minimizing their cumulative effect. While printed brochures and online information are widely available, innovative information services, tailored to and actively addressing specific target groups, are thin on the ground. This project reviews established approaches, national as well as international, with a focus on German-speaking areas, checking their long-term effectiveness with the help of expert workshops and an empirical analysis of survey data. The Transtheoretical Model (Prochaska, 1977) serves as the analytical framework: we assign specific tools to distinct stages of behavioural change. People's openness to absorb risk information or their willingness to engage in private preparedness depend on an incremental process of considering, appraising, introducing and finally maintaining preventive actions. Based on this stage-specific perspective and the workshop results, gaps of intervention are identified to define best-practice examples and recommendations that can be realized within the prevailing legislative and organizational framework at national, regional and local level in Austria.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, J.
The U. S. Department of Energy's (DOE) Office of Environmental Management (EM) has the responsibility for cleaning up 60 sites in 22 states that were associated with the legacy of the nation's nuclear weapons program and other research and development activities. These sites are unique and many of the technologies needed to successfully disposition the associated wastes have yet to be developed or would require significant re-engineering to be adapted for future EM cleanup efforts. In 2008, the DOE-EM Engineering and Technology Program (EM-22) released the Engineering and Technology Roadmap in response to Congressional direction and the need to focus on longer term activities required for the completion of the aforementioned cleanup program. One of the strategic initiatives included in the Roadmap was to enhance long term performance monitoring as defined by 'Develop and deploy cost effective long-term strategies and technologies to monitor closure sites (including soil, groundwater, and surface water) with multiple contaminants (organics, metals and radionuclides) to verify integrated long-term cleanup performance'. To support this long-term monitoring (LTM) strategic initiative, EM-22 and the Savannah River National Laboratory (SRNL) organized and held an interactive symposium, known as the 2009 DOE-EM Long-Term Monitoring Technical Forum, to define and prioritize LTM improvement strategies and products that could be realized within a 3 to 5 year investment time frame. This near-term focus on fundamental research would then be used as a foundation for development of applied programs to improve the closure and long-term performance of EM's legacy waste sites. The Technical Forum was held in Atlanta, GA on February 11-12, 2009, and attended by 57 professionals with a focus on identifying those areas of opportunity that would most effectively advance the transition of the current practices to a more effective strategy for the LTM paradigm. The meeting format encompassed three break-out sessions, which focused on needs and opportunities associated with the following LTM technical areas: (1) Performance Monitoring Tools, (2) Systems, and (3) Information Management. The specific objectives of the Technical Forum were to identify: (1) technical targets for reducing EM costs for life-cycle monitoring; (2) cost-effective approaches and tools to support the transition from active to passive remedies at EM waste sites; and (3) specific goals and objectives associated with the lifecycle monitoring initiatives outlined within the Roadmap. The first Breakout Session on LTM performance measurement tools focused on the integration and improvement of LTM performance measurement and monitoring tools that deal with parameters such as ecosystems, boundary conditions, geophysics, remote sensing, biomarkers, ecological indicators and other types of data used in LTM configurations. Although specific tools were discussed, it was recognized that the Breakout Session could not comprehensively discuss all monitoring technologies in the time provided. Attendees provided key references where other organizations have assessed monitoring tools. Three investment sectors were developed in this Breakout Session. The second Breakout Session was on LTM systems.
The focus of this session was to identify new and inventive LTM systems addressing the framework for interactive parameters such as infrastructure, sensors, diagnostic features, field screening tools, state-of-the-art characterization monitoring systems/concepts, and ecosystem approaches to site conditions and evolution. LTM systems consist of the combination of data acquisition and management efforts, data processing and analysis efforts, and reporting tools. The objective of the LTM systems workgroup was to provide a vision and path towards novel and innovative LTM systems, which should be able to provide relevant, actionable information on system performance in a cost-effective manner. Two investment sectors were developed in this Breakout Session. The last Breakout Session of the Technical Forum was on LTM information management. The session focus was on the development and implementation of novel information management systems for LTM, including techniques to address data issues such as: efficient management of large and diverse datasets; consistency and comparability in data management and incorporation of accurate historical information; data interpretation and information synthesis, including statistical methods, modeling, and visualization; and linkage of data to site management objectives and leveraging information to forge consensus among stakeholders. One investment sector was developed in this Breakout Session.
Gross, Anita R.; Kaplan, Faith; Huang, Stacey; Khan, Mahweesh; Santaguida, P. Lina; Carlesso, Lisa C.; MacDermid, Joy C.; Walton, David M.; Kenardy, Justin; Söderlund, Anne; Verhagen, Arianne; Hartvigsen, Jan
2013-01-01
Objectives: To conduct an overview on psychological interventions, orthoses, patient education, ergonomics, and primary/secondary (1°/2°) neck pain prevention for adults with acute-chronic neck pain. Search Strategy: Computerized databases and grey literature were searched (2006-2012). Selection Criteria: Systematic reviews of randomized controlled trials (RCTs) on pain, function/disability, global perceived effect, quality-of-life and patient satisfaction were retrieved. Data Collection & Analysis: Two independent authors selected articles, assessed risk of bias using the AMSTAR tool and extracted data. The GRADE tool was used to evaluate the body of evidence, and an external panel provided critical review. Main Results: We retrieved 30 reviews (AMSTAR scores 5-9) reporting on 75 RCTs with the following moderate GRADE evidence. For acute whiplash associated disorder (WAD), an education video in emergency rooms (1 RCT, 405 participants) favoured pain reduction at long-term follow-up, thus helping 1 in 23 people (Standard Mean Difference: -0.44; 95% CI: -0.66 to -0.23). Use of a soft collar (2 RCTs, 1278 participants) was not beneficial in the long term. For chronic neck pain, a mind-body intervention (2 RCTs, 1 meta-analysis, 191 participants) improved short-term pain/function in 1 of 4 to 6 participants. In workers, 2 minutes of daily scapula-thoracic endurance training (1 RCT, 127 participants) over 10 weeks was beneficial in 1 of 4 participants. A number of psychosocial interventions, workplace interventions, collar use and self-management educational strategies were not beneficial. Reviewers' Conclusions: Moderate evidence exists for quantifying beneficial and non-beneficial effects of a limited number of interventions for acute WAD and chronic neck pain. Larger trials with more rigorous controls need to target promising interventions. PMID:24133554
NOBLE - Flexible concept recognition for large-scale biomedical natural language processing.
Tseytlin, Eugene; Mitchell, Kevin; Legowski, Elizabeth; Corrigan, Julia; Chavan, Girish; Jacobson, Rebecca S
2016-01-14
Natural language processing (NLP) applications are increasingly important in biomedical data analysis, knowledge engineering, and decision support. Concept recognition is an important component task for NLP pipelines, and can be either general-purpose or domain-specific. We describe a novel, flexible, and general-purpose concept recognition component for NLP pipelines, and compare its speed and accuracy against five commonly used alternatives on both a biological and clinical corpus. NOBLE Coder implements a general algorithm for matching terms to concepts from an arbitrary vocabulary set. The system's matching options can be configured individually or in combination to yield specific system behavior for a variety of NLP tasks. The software is open source, freely available, and easily integrated into UIMA or GATE. We benchmarked speed and accuracy of the system against the CRAFT and ShARe corpora as reference standards and compared it to MMTx, MGrep, Concept Mapper, cTAKES Dictionary Lookup Annotator, and cTAKES Fast Dictionary Lookup Annotator. We describe key advantages of the NOBLE Coder system and associated tools, including its greedy algorithm, configurable matching strategies, and multiple terminology input formats. These features provide unique functionality when compared with existing alternatives, including state-of-the-art systems. On two benchmarking tasks, NOBLE's performance exceeded commonly used alternatives, performing almost as well as the most advanced systems. Error analysis revealed differences in error profiles among systems. NOBLE Coder is comparable to other widely used concept recognition systems in terms of accuracy and speed. Advantages of NOBLE Coder include its interactive terminology builder tool, ease of configuration, and adaptability to various domains and tasks. NOBLE provides a term-to-concept matching system suitable for general concept recognition in biomedical NLP pipelines.
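As a rough illustration of the kind of dictionary-driven, greedy longest-match lookup that concept recognizers of this type perform, the sketch below matches token spans against a tiny hypothetical vocabulary. The concept IDs, vocabulary entries, and matching rules are invented for illustration and do not reproduce NOBLE Coder's actual algorithm or data structures.

```python
# Minimal sketch of greedy longest-match concept recognition over a small
# hypothetical vocabulary; NOBLE Coder's real implementation differs.

VOCABULARY = {
    ("breast", "cancer"): "C0006142",          # hypothetical concept IDs
    ("cancer",): "C0006826",
    ("estrogen", "receptor"): "C0034804",
}
MAX_TERM_LEN = max(len(term) for term in VOCABULARY)

def recognize(tokens):
    """Scan left to right, always taking the longest vocabulary match."""
    matches, i = [], 0
    while i < len(tokens):
        for span in range(min(MAX_TERM_LEN, len(tokens) - i), 0, -1):
            candidate = tuple(w.lower() for w in tokens[i:i + span])
            if candidate in VOCABULARY:
                matches.append((i, i + span, VOCABULARY[candidate]))
                i += span          # greedy: consume the matched span
                break
        else:
            i += 1                 # no match starting here, move on
    return matches

print(recognize("Estrogen receptor status in breast cancer".split()))
# [(0, 2, 'C0034804'), (4, 6, 'C0006142')]
```

A real coder would additionally handle word-order variants, abbreviations, stemming, and overlapping candidates, which is where the configurable matching strategies mentioned above come in.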
Health Systems and Their Assessment: A Methodological Proposal of the Synthetic Outcome Measure
Romaniuk, Piotr; Kaczmarek, Krzysztof; Syrkiewicz-Świtała, Magdalena; Holecki, Tomasz; Szromek, Adam R.
2018-01-01
The effectiveness of health systems is an area of constant interest for public health researchers and practitioners. The varied approach to effectiveness itself has resulted in numerous methodological proposals related to its measurement. The limitations of the currently used methods lead to a constant search for better tools for the assessment of health systems. This article shows the possibilities of using the health system synthetic outcome measure (SOM) for this purpose. It is an original tool using 41 indicators referring to the epidemiological situation, health behaviors, and factors related to the health-care system, which allows a relatively quick and easy assessment of the health system in terms of its effectiveness. Constructing the measure of health system functioning in this way allows it to be presented in a dynamic perspective, i.e., assessing not only the health system itself at a given moment in time but also changes in the value of the effectiveness measures. In order to demonstrate the cognitive value of the SOM, an analysis of the effectiveness of health systems in 21 countries of Central and Eastern Europe during the transformation period was carried out. The mean SOM values calculated on the basis of the component measures made it possible to differentiate countries in terms of the effectiveness of their health systems. Considering the whole period, a similar level of health system effects can be observed in Slovenia, Croatia, the Czech Republic, Slovakia, Poland, Macedonia, and Albania. In the middle group, Hungary, Romania, Latvia, Lithuania, Georgia, Estonia, Bulgaria, Belarus, and Armenia were found. The third group, weakest in terms of achieved effects, was formed by the health systems of Ukraine, Moldova, and Russia. The presented method allows for the analysis of health system outcomes from a comparative angle, eliminating the arbitrariness of pinpointing a model solution as a potential reference point in the assessment of the systems. The measure, with the use of additional statistical tools to establish correlations with elements of the external and internal environment of a health system, allows for analyses of the conditions underlying differences in the effects of health system operation and the circumstances for the effectiveness of reform processes.
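The paper defines the SOM from 41 specific indicators. As a hedged illustration of how such a synthetic measure can be constructed in general, the sketch below min-max normalizes a handful of hypothetical indicators (inverting "destimulants" such as mortality, for which lower is better) and averages them per country; the indicator names, values, and aggregation rule are assumptions, not the authors' exact method.

```python
# Generic synthetic-measure sketch on hypothetical data: indicators are
# zero-unitarized to [0, 1], destimulants are inverted, and the per-country
# mean serves as the synthetic outcome value.
import numpy as np

# hypothetical data: rows = countries, columns = indicators
indicators = np.array([
    # life_expectancy, infant_mortality, smoking_rate
    [77.2, 4.1, 22.0],
    [73.5, 8.9, 30.5],
    [70.1, 12.3, 35.0],
])
is_stimulant = np.array([True, False, False])  # higher is better only for the first

lo, hi = indicators.min(axis=0), indicators.max(axis=0)
normalized = (indicators - lo) / (hi - lo)                          # zero-unitarization
normalized[:, ~is_stimulant] = 1.0 - normalized[:, ~is_stimulant]   # invert destimulants

som = normalized.mean(axis=1)            # one synthetic outcome value per country
print(np.round(som, 3))
```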
Ornellas, Pâmela Oliveira; Antunes, Leonardo Dos Santos; Fontes, Karla Bianca Fernandes da Costa; Póvoa, Helvécio Cardoso Corrêa; Küchler, Erika Calvano; Iorio, Natalia Lopes Pontes; Antunes, Lívia Azeredo Alves
2016-09-01
This study aimed to perform a systematic review to assess the effectiveness of antimicrobial photodynamic therapy (aPDT) in the reduction of microorganisms in deep carious lesions. An electronic search was conducted in Pubmed, Web of Science, Scopus, Lilacs, and Cochrane Library, followed by a manual search. The MeSH terms, MeSH synonyms, related terms, and free terms were used in the search. As eligibility criteria, only clinical studies were included. Initially, 227 articles were identified in the electronic search, and 152 studies remained after analysis and exclusion of the duplicated studies; 6 remained after application of the eligibility criteria; and 3 additional studies were found in the manual search. After access to the full articles, three were excluded, leaving six for evaluation by the criteria of the Cochrane Collaboration's tool for assessing risk of bias. Of these, five presented some risk of bias. All results from the selected studies showed a significant reduction of microorganisms in deep carious lesions for both primary and permanent teeth. The meta-analysis demonstrated a significant reduction in microorganism counts in all analyses (p<0.00001). Based on these findings, there is scientific evidence emphasizing the effectiveness of aPDT in reducing microorganisms in deep carious lesions.
Land surface Verification Toolkit (LVT)
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.
2017-01-01
LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.
Development of solution techniques for nonlinear structural analysis
NASA Technical Reports Server (NTRS)
Vos, R. G.; Andrews, J. S.
1974-01-01
Nonlinear structural solution methods in the current research literature are classified according to order of the solution scheme, and it is shown that the analytical tools for these methods are uniformly derivable by perturbation techniques. A new perturbation formulation is developed for treating an arbitrary nonlinear material, in terms of a finite-difference generated stress-strain expansion. Nonlinear geometric effects are included in an explicit manner by appropriate definition of an applicable strain tensor. A new finite-element pilot computer program PANES (Program for Analysis of Nonlinear Equilibrium and Stability) is presented for treatment of problems involving material and geometric nonlinearities, as well as certain forms of nonconservative loading.
CMS endcap RPC performance analysis
NASA Astrophysics Data System (ADS)
Teng, H.; CMS Collaboration
2014-08-01
The Resistive Plate Chamber (RPC) detector system in the LHC-CMS experiment is designed for trigger purposes. The endcap RPC system has been operated successfully from the commissioning period (2008) to the end of Run 1 (2013). We have developed an analysis tool for endcap RPC performance and validated the efficiency calculation algorithm, focusing on the first endcap station, which was assembled and tested by the Peking University group. We cross-checked the results obtained against those extracted with alternative methods and found good agreement in terms of performance parameters [1]. The results showed that the CMS-RPC endcap system fulfilled the performance expected in the Technical Design Report [2].
Impact of Drought on Groundwater and Soil Moisture - A Geospatial Tool for Water Resource Management
NASA Astrophysics Data System (ADS)
Ziolkowska, J. R.; Reyes, R.
2016-12-01
For many decades, recurring droughts in different regions of the US have been negatively impacting ecosystems and economic sectors. Oklahoma and Texas suffered from exceptional and extreme droughts in 2011-2014, with almost 95% of the state areas being affected (Drought Monitor, 2015). In 2011 alone, around $1.6 billion was lost to drought in Oklahoma's agricultural sector (Stotts 2011), and $7.6 billion in Texas agriculture (Fannin 2012). While surface water is among the instant indicators of drought conditions, it does not translate directly to groundwater resources, which are the main source of irrigation water. Both surface water and groundwater are susceptible to drought, but groundwater depletion is a long-term process and might not show immediately. However, understanding groundwater availability is crucial for designing water management strategies and sustainable water use in the agricultural and other economic sectors. This paper presents an interactive, geospatially weighted evaluation model, and at the same time a tool, for analyzing groundwater resources that can be used for decision support in water management. The tool combines both groundwater and soil moisture changes in Oklahoma and Texas in 2003-2014, thus representing the most important indicators of agricultural and hydrological drought. The model allows for analyzing temporal and geospatial long-term drought at the county level. It can be expanded to other regions in the US and the world. The model has been validated against the Palmer Drought Severity Index to account for other indicators of meteorological drought. It can serve as a basis for an upcoming socio-economic and environmental analysis of short- and long-term drought events in different geographic regions.
Donini, Lorenzo M; Poggiogalle, Eleonora; Molfino, Alessio; Rosano, Aldo; Lenzi, Andrea; Rossi Fanelli, Filippo; Muscaritoli, Maurizio
2016-10-01
Malnutrition plays a major role in clinical and functional impairment in older adults. The use of validated, user-friendly and rapid screening tools for malnutrition in the elderly may improve the diagnosis and, possibly, the prognosis. The aim of this study was to assess the agreement between Mini-Nutritional Assessment (MNA), considered as a reference tool, MNA short form (MNA-SF), Malnutrition Universal Screening Tool (MUST), and Nutrition Risk Screening (NRS-2002) in elderly institutionalized participants. Participants were enrolled among nursing home residents and underwent a multidimensional evaluation. Predictive value and survival analysis were performed to compare the nutritional classifications obtained from the different tools. A total of 246 participants (164 women, age: 82.3 ± 9 years, and 82 men, age: 76.5 ± 11 years) were enrolled. Based on MNA, 22.6% of females and 17% of males were classified as malnourished; 56.7% of women and 61% of men were at risk of malnutrition. Agreement between MNA and MUST or NRS-2002 was classified as "fair" (k = 0.270 and 0.291, respectively; P < .001), whereas the agreement between MNA and MNA-SF was classified as "moderate" (k = 0.588; P < .001). Because of the high percentage of false negative participants, MUST and NRS-2002 presented a low overall predictive value compared with MNA and MNA-SF. Clinical parameters were significantly different in false negative participants with MUST or NRS-2002 from true negative and true positive individuals using the reference tool. For all screening tools, there was a significant association between malnutrition and mortality. MNA showed the best predictive value for survival among well-nourished participants. Functional, psychological, and cognitive parameters, not considered in MUST and NRS-2002 tools, are probably more important risk factors for malnutrition than acute illness in geriatric long-term care inpatient settings and may account for the low predictive value of these tests. MNA-SF seems to combine the predictive capacity of the full version of the MNA with a sufficiently short time of administration. Copyright © 2016 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
How to normalize metatranscriptomic count data for differential expression analysis.
Klingenberg, Heiner; Meinicke, Peter
2017-01-01
Differential expression analysis on the basis of RNA-Seq count data has become a standard tool in transcriptomics. Several studies have shown that prior normalization of the data is crucial for a reliable detection of transcriptional differences. Until now it has not been clear whether and how the transcriptomic approach can be used for differential expression analysis in metatranscriptomics. We propose a model for differential expression in metatranscriptomics that explicitly accounts for variations in the taxonomic composition of transcripts across different samples. As a main consequence the correct normalization of metatranscriptomic count data under this model requires the taxonomic separation of the data into organism-specific bins. Then the taxon-specific scaling of organism profiles yields a valid normalization and allows us to recombine the scaled profiles into a metatranscriptomic count matrix. This matrix can then be analyzed with statistical tools for transcriptomic count data. For taxon-specific scaling and recombination of scaled counts we provide a simple R script. When applying transcriptomic tools for differential expression analysis directly to metatranscriptomic data with an organism-independent (global) scaling of counts the resulting differences may be difficult to interpret. The differences may correspond to changing functional profiles of the contributing organisms but may also result from a variation of taxonomic abundances. Taxon-specific scaling eliminates this variation and therefore the resulting differences actually reflect a different behavior of organisms under changing conditions. In simulation studies we show that the divergence between results from global and taxon-specific scaling can be drastic. In particular, the variation of organism abundances can imply a considerable increase of significant differences with global scaling. Also, on real metatranscriptomic data, the predictions from taxon-specific and global scaling can differ widely. Our studies indicate that in real data applications performed with global scaling it might be impossible to distinguish between differential expression in terms of transcriptomic changes and differential composition in terms of changing taxonomic proportions. As in transcriptomics, a proper normalization of count data is also essential for differential expression analysis in metatranscriptomics. Our model implies a taxon-specific scaling of counts for normalization of the data. The application of taxon-specific scaling consequently removes taxonomic composition variations from functional profiles and therefore provides a clear interpretation of the observed functional differences.
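The authors provide an R script for taxon-specific scaling; the Python sketch below re-implements the general idea on synthetic counts rather than reproducing their code. Counts are split into organism-specific bins, each bin receives its own per-sample size factors (here a median-of-ratios style estimate), and the scaled bins are recombined into one matrix. The bin labels, count matrix, and scaling estimator are illustrative assumptions.

```python
# Hedged sketch of taxon-specific scaling: normalize each organism bin on its
# own, then recombine the scaled bins for downstream differential expression.
import numpy as np

def median_of_ratios_factors(counts):
    """Per-sample size factors within one taxon bin (genes x samples)."""
    log_counts = np.log(counts, where=counts > 0, out=np.full(counts.shape, np.nan))
    ref = np.nanmean(log_counts, axis=1, keepdims=True)      # gene-wise log reference
    ratios = log_counts - ref
    return np.exp(np.nanmedian(ratios, axis=0))              # one factor per sample

def taxon_specific_scaling(counts, taxon_of_gene):
    """counts: genes x samples; taxon_of_gene: bin label for each gene (row)."""
    scaled = np.zeros_like(counts, dtype=float)
    for taxon in np.unique(taxon_of_gene):
        rows = taxon_of_gene == taxon
        factors = median_of_ratios_factors(counts[rows])
        scaled[rows] = counts[rows] / factors                 # scale within the bin
    return scaled

rng = np.random.default_rng(0)
counts = rng.poisson(lam=50, size=(6, 4)).astype(float)       # synthetic count matrix
taxa = np.array(["orgA", "orgA", "orgA", "orgB", "orgB", "orgB"])
print(np.round(taxon_specific_scaling(counts, taxa), 1))
```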
Vail, III, William Banning; Momii, Steven Thomas
1998-01-01
Methods and apparatus are described to produce stick-slip motion of a logging tool within a cased well attached to a wireline that is drawn upward by a continuously rotating wireline drum. The stick-slip motion results in the periodic upward movement of the tool in the cased well, described in terms of a dwell time during which the tool is stationary, the move time during which the tool moves, and the stroke, which is the upward distance that the tool translates during the "slip" portion of the stick-slip motion. This method of measurement is used to log the well at different vertical positions of the tool. Therefore, any typical "station-to-station logging tool" may be modified to be a "continuous logging tool", where "continuous" means that the wireline drum continually rotates while the tool undergoes stick-slip motion downhole and measurements are performed during the dwell times when the tool is momentarily stationary. The stick-slip methods of operation and the related apparatus are particularly described in terms of making measurements of formation resistivity from within a cased well during the dwell times when the tool is momentarily stationary during the periodic stick-slip motion of the logging tool.
Decoupling the use and meaning of strategic plans in public healthcare.
Lega, Federico; Longo, Francesco; Rotolo, Andrea
2013-01-04
The culture of New Public Management has promoted the diffusion of strategic management tools throughout Public Healthcare Organisations (PHOs). There is consensus that better strategic planning tools are required to achieve higher levels of organisational performance. This paper provides evidence and understanding of the emergent uses and scope of strategic planning in PHOs, in order to answer three research questions: (i) has the New Public Management approach changed the organisational culture of PHOs in terms of how they adopt, diffuse, and use strategic planning documents? (ii) how coherent are strategic planning documents in PHOs? and (iii) what are the main purposes of strategic documents in PHOs? An analysis was carried out in three Italian Local Health Authorities. We analysed the number and types of formal strategic documents adopted between 2004 and 2012, evaluating their degree of coherence and coordination, their hierarchy, their degree of disclosure, and the consistency of their strategic goals. A content analysis was performed to investigate overlap in terms of content and focus, and a qualitative analysis was carried out to study and represent the relationships between documents. The analysis showed that a rich set of strategic documents were adopted by each PHO. However, these are often uncoordinated and overlap in terms of content. They adopt different language and formats for various stakeholders. The presence of diverse external drivers may explain the divergent focus, priorities and inconsistent goals in the strategic documents. This planning complexity makes it difficult to determine how the overall goals and mission of an organisation are defined and made visible. The evidence suggests that PHOs use a considerable number of strategic documents. However, they employ no clear or explicit overarching strategy currently, and strategic planning appears to be externally oriented. All the documents communicate similar topics to different stakeholders, although they use different language to answer to the different expectations of each stakeholder. Therefore, strategic planning and plans seem to be driven by neo-institutional approaches, as they are means to build consensus and negotiate ground for strategic actions, rather than means to identify strategic choices and priorities.
Short-term earthquake forecasting based on an epidemic clustering model
NASA Astrophysics Data System (ADS)
Console, Rodolfo; Murru, Maura; Falcone, Giuseppe
2016-04-01
The application of rigorous statistical tools, with the aim of verifying any prediction method, requires a univocal definition of the hypothesis, or the model, characterizing the concerned anomaly or precursor, so that it can be objectively recognized in any circumstance and by any observer. This is necessary to move beyond the old-fashioned approach consisting only of the retrospective, anecdotal study of past cases. A rigorous definition of an earthquake forecasting hypothesis should lead to the objective identification of particular sub-volumes (usually named alarm volumes) of the total time-space volume within which the probability of occurrence of strong earthquakes is higher than usual. The test of such a hypothesis needs the observation of a sufficient number of past cases upon which a statistical analysis is possible. This analysis should aim to determine the rate at which the precursor has been followed (success rate) or not followed (false alarm rate) by the target seismic event, or the rate at which a target event has been preceded (alarm rate) or not preceded (failure rate) by the precursor. The binary table obtained from this kind of analysis leads to the definition of the parameters of the model that achieve the maximum number of successes and the minimum number of false alarms for a specific class of precursors. The mathematical tools suitable for this purpose may include the definition of the Probability Gain or the R-Score, as well as the application of popular plots such as the Molchan error diagram and the ROC diagram. Another tool for evaluating the validity of a forecasting method is the concept of the likelihood ratio (also named performance factor) of occurrence and non-occurrence of seismic events under different hypotheses. Whatever method is chosen for building up a new hypothesis, usually based on retrospective data, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this step can be problematic for seismicity characterized by long-term recurrence. However, the separation of the database collected in the past into two sections (one on which the best fit of the parameters is carried out, and the other on which the hypothesis is tested) can be a viable solution, known as retrospective-forward testing. In this study we show examples of the application of the above-mentioned concepts to the analysis of the Italian catalog of instrumental seismicity, making use of an epidemic algorithm developed to model short-term clustering features. This model, for which a precursory anomaly is just the occurrence of seismic activity, does not need the retrospective categorization of earthquakes in terms of foreshocks, mainshocks and aftershocks. It was introduced more than 15 years ago and has been tested so far in a number of real cases. It is now being run by several seismological centers around the world in forward real-time mode for testing purposes.
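To make the binary-table bookkeeping concrete, the hedged sketch below scores a synthetic alarm strategy against a synthetic catalogue of target events, reporting hits, misses and false alarms together with the alarm space-time fraction and miss rate used in a Molchan error diagram, plus a simple probability gain relative to uniformly random alarms. All numbers are invented; the paper's actual testing uses the Italian instrumental catalogue and an epidemic-type clustering model.

```python
# Scoring a binary alarm strategy on hypothetical space-time cells.
import numpy as np

rng = np.random.default_rng(1)
n_bins = 1000                                   # hypothetical space-time cells
alarms = rng.random(n_bins) < 0.10              # alarm declared in ~10% of cells
events = rng.random(n_bins) < 0.02              # target events in ~2% of cells

hits = np.sum(alarms & events)
misses = np.sum(~alarms & events)
false_alarms = np.sum(alarms & ~events)

tau = alarms.mean()                             # fraction of space-time under alarm
nu = misses / max(hits + misses, 1)             # miss rate (Molchan y-axis)
gain = (hits / max(events.sum(), 1)) / tau      # probability gain vs random alarms

print(f"hits={hits} misses={misses} false alarms={false_alarms}")
print(f"alarm fraction tau={tau:.3f}, miss rate nu={nu:.3f}, probability gain={gain:.2f}")
```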
Wang, Jia-Hong; Zhao, Ling-Feng; Lin, Pei; Su, Xiao-Rong; Chen, Shi-Jun; Huang, Li-Qiang; Wang, Hua-Feng; Zhang, Hai; Hu, Zhen-Fu; Yao, Kai-Tai; Huang, Zhong-Xi
2014-09-01
Identifying biological functions and molecular networks in a gene list and how the genes may relate to various topics is of considerable value to biomedical researchers. Here, we present a web-based text-mining server, GenCLiP 2.0, which can analyze human genes with enriched keywords and molecular interactions. Compared with other similar tools, GenCLiP 2.0 offers two unique features: (i) analysis of gene functions with free terms (i.e. any terms in the literature) generated by literature mining or provided by the user and (ii) accurate identification and integration of comprehensive molecular interactions from Medline abstracts, to construct molecular networks and subnetworks related to the free terms. http://ci.smu.edu.cn. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
OOMMPPAA: A Tool To Aid Directed Synthesis by the Combined Analysis of Activity and Structural Data
2014-01-01
There is an ever-increasing resource, in terms of both structural information and activity data, for many protein targets. In this paper we describe OOMMPPAA, a novel computational tool designed to inform compound design by combining such data. OOMMPPAA uses 3D matched molecular pairs to generate 3D ligand conformations. It then identifies pharmacophoric transformations between pairs of compounds and associates them with their relevant activity changes. OOMMPPAA presents this data in an interactive application, providing the user with a visual summary of important interaction regions in the context of the binding site. We present validation of the tool using openly available data for CDK2 and a GlaxoSmithKline data set for a SAM-dependent methyl-transferase. We demonstrate OOMMPPAA's application in optimizing both potency and cell permeability and use OOMMPPAA to highlight nuanced and cross-series SAR. OOMMPPAA is freely available to download at http://oommppaa.sgc.ox.ac.uk/OOMMPPAA/. PMID:25244105
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prindle, N.H.; Mendenhall, F.T.; Trauth, K.
1996-05-01
The Systems Prioritization Method (SPM) is a decision-aiding tool developed by Sandia National Laboratories (SNL). SPM provides an analytical basis for supporting programmatic decisions for the Waste Isolation Pilot Plant (WIPP) to meet selected portions of the applicable US EPA long-term performance regulations. The first iteration of SPM (SPM-1), the prototype for SPM, was completed in 1994. It served as a benchmark and a test bed for developing the tools needed for the second iteration of SPM (SPM-2). SPM-2, completed in 1995, is intended for programmatic decision making. This is Volume II of the three-volume final report of the second iteration of the SPM. It describes the technical input and model implementation for SPM-2, and presents the SPM-2 technical baseline and the activities, activity outcomes, outcome probabilities, and the input parameters for SPM-2 analysis.
NASA Astrophysics Data System (ADS)
Yeates, E.; Dreaper, G.; Afshari, S.; Tavakoly, A. A.
2017-12-01
Over the past six fiscal years, the United States Army Corps of Engineers (USACE) has contracted an average of about a billion dollars per year for navigation channel dredging. To execute these funds effectively, USACE Districts must determine which navigation channels need to be dredged in a given year. Improving this prioritization process results in more efficient waterway maintenance. This study uses the Streamflow Prediction Tool, a runoff routing model based on global weather forecast ensembles, to estimate dredged volumes. This study establishes regional linear relationships between cumulative flow and dredged volumes over a long-term simulation covering 30 years (1985-2015), using drainage area and shoaling parameters. The study framework integrates the National Hydrography Dataset (NHDPlus Dataset) with parameters from the Corps Shoaling Analysis Tool (CSAT) and dredging record data from USACE District records. Results in the test cases of the Houston Ship Channel and the Sabine and Port Arthur Harbor waterways in Texas indicate positive correlation between the simulated streamflows and actual dredging records.
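As a minimal illustration of the kind of regional linear relationship described above, the sketch below fits dredged volume against cumulative flow on synthetic data; the real study additionally conditions on drainage-area and CSAT shoaling parameters and uses District dredging records rather than simulated values.

```python
# Hedged sketch: linear fit of dredged volume vs cumulative streamflow on
# synthetic data, with a simple R^2 goodness-of-fit check.
import numpy as np

rng = np.random.default_rng(42)
cumulative_flow = rng.uniform(1e3, 5e4, size=30)             # hypothetical units
dredged_volume = 120.0 + 0.04 * cumulative_flow + rng.normal(0, 150, size=30)

slope, intercept = np.polyfit(cumulative_flow, dredged_volume, 1)
predicted = intercept + slope * cumulative_flow
r2 = 1 - np.sum((dredged_volume - predicted) ** 2) / np.sum(
    (dredged_volume - dredged_volume.mean()) ** 2)

print(f"volume ~ {intercept:.1f} + {slope:.4f} * cumulative_flow  (R^2 = {r2:.2f})")
```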
Yeh, Shih-Ching; Huang, Ming-Chun; Wang, Pa-Chun; Fang, Te-Yung; Su, Mu-Chun; Tsai, Po-Yi; Rizzo, Albert
2014-10-01
Dizziness is a major consequence of imbalance and vestibular dysfunction. Compared to surgery and drug treatments, balance training is non-invasive and more desirable. However, training exercises are usually tedious, and existing assessment tools are insufficient for rapidly gauging a patient's severity. An interactive virtual reality (VR) game-based rehabilitation program that adopted Cawthorne-Cooksey exercises, together with a sensor-based measuring system, was introduced. To verify the therapeutic effect, a clinical experiment with 48 patients and 36 normal subjects was conducted. Quantified balance indices were measured and analyzed by statistical tools and a Support Vector Machine (SVM) classifier. In terms of balance indices, patients who completed the training process improved, and the difference between normal subjects and patients is clear. Further analysis with the SVM classifier shows that recognizing the differences between patients and normal subjects is feasible, and these results can be used to evaluate patient severity and support rapid assessment. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
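A hedged sketch of the classification step follows: an RBF-kernel SVM with cross-validation separating patients from normal subjects on a few balance indices. The feature values and hyperparameters are synthetic stand-ins, not the study's data or settings; only the group sizes (48 patients, 36 controls) are taken from the abstract.

```python
# Cross-validated SVM on synthetic balance-index features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
controls = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(36, 3))   # hypothetical indices
patients = rng.normal(loc=[1.5, 1.0, 2.0], scale=1.2, size=(48, 3))
X = np.vstack([controls, patients])
y = np.array([0] * 36 + [1] * 48)                                     # 0 = normal, 1 = patient

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```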
Modelling and analysis of gene regulatory network using feedback control theory
NASA Astrophysics Data System (ADS)
El-Samad, H.; Khammash, M.
2010-01-01
Molecular pathways are a part of a remarkable hierarchy of regulatory networks that operate at all levels of organisation. These regulatory networks are responsible for much of the biological complexity within the cell. The dynamic character of these pathways and the prevalence of feedback regulation strategies in their operation make them amenable to systematic mathematical analysis using the same tools that have been used with success in analysing and designing engineering control systems. In this article, we aim at establishing this strong connection through various examples where the behaviour exhibited by gene networks is explained in terms of their underlying control strategies. We complement our analysis by a survey of mathematical techniques commonly used to model gene regulatory networks and analyse their dynamic behaviour.
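To illustrate how feedback regulation in a gene circuit maps onto the control-theoretic view described above, the sketch below integrates a standard textbook model of negative autoregulation (a protein repressing its own synthesis through a Hill function). The parameter values and the specific motif are illustrative and not drawn from the article.

```python
# Toy negative-autoregulation motif: repressed synthesis minus linear decay.
import numpy as np
from scipy.integrate import solve_ivp

alpha, K, n, gamma = 10.0, 1.0, 2.0, 1.0      # production, threshold, Hill exponent, decay

def negative_autoregulation(t, x):
    p = x[0]
    dpdt = alpha / (1.0 + (p / K) ** n) - gamma * p   # feedback-repressed production
    return [dpdt]

sol = solve_ivp(negative_autoregulation, (0.0, 10.0), [0.0], dense_output=True)
t = np.linspace(0.0, 10.0, 6)
print(np.round(sol.sol(t)[0], 3))              # protein level approaching steady state
```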
TEXTINFO: a tool for automatic determination of patient clinical profiles using text analysis.
Borst, F.; Lyman, M.; Nhàn, N. T.; Tick, L. J.; Sager, N.; Scherrer, J. R.
1991-01-01
The clinical data contained in narrative patient documents is made available via grammatical and semantic processing. Retrievals from the resulting relational database tables are matched against a set of clinical descriptors to obtain clinical profiles of the patients in terms of the descriptors present in the documents. Discharge summaries of 57 Dept. of Digestive Surgery patients were processed in this manner. Factor analysis and discriminant analysis procedures were then applied, showing the profiles to be useful for diagnosis definitions (by establishing relations between diagnoses and clinical findings), for diagnosis assessment (by viewing the match between a definition and observed events recorded in a patient text), and potentially for outcome evaluation based on the classification abilities of clinical signs. PMID:1807679
Order parameter analysis of synchronization transitions on star networks
NASA Astrophysics Data System (ADS)
Chen, Hong-Bin; Sun, Yu-Ting; Gao, Jian; Xu, Can; Zheng, Zhi-Gang
2017-12-01
The collective behaviors of populations of coupled oscillators have attracted significant attention in recent years. In this paper, an order parameter approach is proposed to study the low-dimensional dynamical mechanism of collective synchronizations, by adopting the star-topology of coupled oscillators as a prototype system. The order parameter equation of star-linked phase oscillators can be obtained in terms of the Watanabe-Strogatz transformation, Ott-Antonsen ansatz, and the ensemble order parameter approach. Different solutions of the order parameter equation correspond to the diverse collective states, and different bifurcations reveal various transitions among these collective states. The properties of various transitions in the star-network model are revealed by using tools of nonlinear dynamics such as time reversibility analysis and linear stability analysis.
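As a numerical companion to the order-parameter analysis, the sketch below integrates a star-topology Kuramoto model (one hub coupled to K leaves) and tracks the usual Kuramoto order parameter r = |<exp(i*theta)>| over time. The frequencies, coupling strength and initial conditions are illustrative choices, and the code does not implement the Watanabe-Strogatz or Ott-Antonsen reductions discussed in the paper.

```python
# Star-network Kuramoto model with a global order-parameter readout.
import numpy as np
from scipy.integrate import solve_ivp

K_leaves, coupling = 10, 1.5
omega = np.concatenate(([K_leaves * 1.0], np.ones(K_leaves)))   # hub faster than leaves

def star_kuramoto(t, theta):
    hub, leaves = theta[0], theta[1:]
    dhub = omega[0] + coupling * np.sum(np.sin(leaves - hub))    # hub feels all leaves
    dleaves = omega[1:] + coupling * np.sin(hub - leaves)        # each leaf feels the hub
    return np.concatenate(([dhub], dleaves))

rng = np.random.default_rng(3)
theta0 = rng.uniform(0, 2 * np.pi, K_leaves + 1)
sol = solve_ivp(star_kuramoto, (0, 50), theta0, t_eval=np.linspace(0, 50, 500))

r = np.abs(np.exp(1j * sol.y).mean(axis=0))     # order parameter at each time point
print(f"initial r = {r[0]:.2f}, final r = {r[-1]:.2f}")
```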
Identification of metabolic pathways using pathfinding approaches: a systematic review.
Abd Algfoor, Zeyad; Shahrizal Sunar, Mohd; Abdullah, Afnizanfaizal; Kolivand, Hoshang
2017-03-01
Metabolic pathways have become increasingly available for various microorganisms. Such pathways have spurred the development of a wide array of computational tools, in particular, mathematical pathfinding approaches. This article can facilitate the understanding of computational analysis of metabolic pathways in genomics. Moreover, stoichiometric and pathfinding approaches in metabolic pathway analysis are discussed. Three major types of studies are elaborated: stoichiometric identification models, pathway-based graph analysis and pathfinding approaches in cellular metabolism. Furthermore, evaluation of the outcomes of the pathways with mathematical benchmarking metrics is provided. This review would lead to better comprehension of metabolism behaviors in living cells, in terms of computed pathfinding approaches. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
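As a minimal, hedged illustration of a pathfinding query on a metabolic network, the sketch below runs a shortest-path search over a toy substrate-product graph with networkx. Real pathfinding approaches reviewed here typically weight edges (for example by atom conservation or reaction cost) rather than counting hops, and the metabolite names are simplified placeholders.

```python
# Shortest-path query on a toy metabolite graph.
import networkx as nx

reactions = [  # hypothetical substrate -> product pairs
    ("glucose", "g6p"), ("g6p", "f6p"), ("f6p", "fbp"),
    ("fbp", "g3p"), ("g3p", "pep"), ("pep", "pyruvate"),
    ("g6p", "6pg"), ("6pg", "ribulose5p"),
]
G = nx.DiGraph(reactions)

path = nx.shortest_path(G, source="glucose", target="pyruvate")
print(" -> ".join(path))
```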
ERIC Educational Resources Information Center
Kartoshkina, Yuliya
2013-01-01
A tool was developed internally to assess the intercultural competence of students taking part in short-term study abroad programs. Four scales were built to assess possible change in students' host culture knowledge, cross-cultural awareness, cross-cultural adaptation, and self-assessed foreign language proficiency. Enrollment in a…
NASA Astrophysics Data System (ADS)
Kapanen, Mika; Tenhunen, Mikko; Hämäläinen, Tuomo; Sipilä, Petri; Parkkinen, Ritva; Järvinen, Hannu
2006-07-01
Quality control (QC) data of radiotherapy linear accelerators, collected by Helsinki University Central Hospital between the years 2000 and 2004, were analysed. The goal was to provide information for the evaluation and elaboration of QC of accelerator outputs and to propose a method for QC data analysis. Short- and long-term drifts in outputs were quantified by fitting empirical mathematical models to the QC measurements. Normally, long-term drifts were well (<=1%) modelled by either a straight line or a single-exponential function. A drift of 2% occurred in 18 ± 12 months. The shortest drift times of only 2-3 months were observed for some new accelerators just after the commissioning but they stabilized during the first 2-3 years. The short-term reproducibility and the long-term stability of local constancy checks, carried out with a sealed plane parallel ion chamber, were also estimated by fitting empirical models to the QC measurements. The reproducibility was 0.2-0.5% depending on the positioning practice of a device. Long-term instabilities of about 0.3%/month were observed for some checking devices. The reproducibility of local absorbed dose measurements was estimated to be about 0.5%. The proposed empirical model fitting of QC data facilitates the recognition of erroneous QC measurements and abnormal output behaviour, caused by malfunctions, offering a tool to improve dose control.
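A hedged sketch of the model-fitting idea follows: a single-exponential drift model is fitted to synthetic output-constancy measurements, and the fitted parameters are used to estimate the time to a 2% drift. The measurement values, noise level and parameters are invented, whereas the paper fits such empirical models to real accelerator QC records.

```python
# Fit a single-exponential drift model to synthetic output measurements and
# estimate the time until the output has drifted by 2%.
import numpy as np
from scipy.optimize import curve_fit

def exp_drift(t, a, b, tau):
    return a + b * (1.0 - np.exp(-t / tau))          # output (%) vs time (months)

rng = np.random.default_rng(7)
t_months = np.arange(0, 36, 1.0)
output = exp_drift(t_months, 100.0, 2.5, 14.0) + rng.normal(0, 0.2, t_months.size)

popt, _ = curve_fit(exp_drift, t_months, output, p0=(100.0, 2.0, 10.0))
a, b, tau = popt
t_2pct = -tau * np.log(1.0 - 2.0 / b) if b > 2.0 else np.inf   # time to a 2% drift
print(f"fitted drift amplitude {b:.2f}%, time constant {tau:.1f} months, "
      f"2% drift after ~{t_2pct:.1f} months")
```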
Image-Based Single Cell Profiling: High-Throughput Processing of Mother Machine Experiments
Sachs, Christian Carsten; Grünberger, Alexander; Helfrich, Stefan; Probst, Christopher; Wiechert, Wolfgang; Kohlheyer, Dietrich; Nöh, Katharina
2016-01-01
Background: Microfluidic lab-on-chip technology combined with live-cell imaging has enabled the observation of single cells in their spatio-temporal context. The mother machine (MM) cultivation system is particularly attractive for the long-term investigation of rod-shaped bacteria since it facilitates continuous cultivation and observation of individual cells over many generations in a highly parallelized manner. To date, the lack of fully automated image analysis software limits the practical applicability of the MM as a phenotypic screening tool. Results: We present an image analysis pipeline for the automated processing of MM time lapse image stacks. The pipeline supports all analysis steps, i.e., image registration, orientation correction, channel/cell detection, cell tracking, and result visualization. Tailored algorithms account for the specialized MM layout to enable a robust automated analysis. Image data generated in a two-day growth study (≈ 90 GB) is analyzed in ≈ 30 min, with negligible differences in growth rate between automated and manual evaluation. The proposed methods are implemented in the software molyso (MOther machine AnaLYsis SOftware), which provides a new profiling tool for the unbiased analysis of hitherto inaccessible large-scale MM image stacks. Conclusion: Presented is the software molyso, a ready-to-use open source software (BSD-licensed) for the unsupervised analysis of MM time-lapse image stacks. molyso source code and user manual are available at https://github.com/modsim/molyso. PMID:27661996
Computational science: shifting the focus from tools to models
Hinsen, Konrad
2014-01-01
Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728
Wu, Guey-Hau; Liou, Yiing-Mei; Huang, Lian-Hua
2004-10-01
In assessing the health of a community, it is important to select tools appropriate to the community's characteristics. The framework for this paper is the system framework for community assessment developed by Trotter, Smith and Maurer (2000); the data were collected by windshield survey, literature review, interview, and observation. Through data analysis and the identification of the community's problems, the authors prioritize those problems in accordance with Goeppinger and Schuster's (1992) criteria. They illustrate the practicality and local applicability of this method by means of a local case. Finally, the authors evaluate the framework in terms of concept clearance, variable classification, and indicator measurement. In addition, they propose concrete suggestions for community workers to consider in the selection of assessment tools, and to enrich nursing knowledge.
Single Cell Gene Expression Profiling of Skeletal Muscle-Derived Cells.
Gatto, Sole; Puri, Pier Lorenzo; Malecova, Barbora
2017-01-01
Single cell gene expression profiling is a fundamental tool for studying the heterogeneity of a cell population by addressing the phenotypic and functional characteristics of each cell. Technological advances that have coupled microfluidic technologies with high-throughput quantitative RT-PCR analyses have enabled detailed analyses of single cells in various biological contexts. In this chapter, we describe the procedure for isolating the skeletal muscle interstitial cells termed Fibro-Adipogenic Progenitors (FAPs) and their gene expression profiling at the single cell level. Moreover, we accompany our bench protocol with bioinformatics analysis designed to process raw data as well as to visualize single cell gene expression data. Single cell gene expression profiling is therefore a useful tool in the investigation of FAPs heterogeneity and their contribution to muscle homeostasis.
Automatic identification of artifacts in electrodermal activity data.
Taylor, Sara; Jaques, Natasha; Chen, Weixuan; Fedor, Szymon; Sano, Akane; Picard, Rosalind
2015-01-01
Recently, wearable devices have allowed for long term, ambulatory measurement of electrodermal activity (EDA). Despite the fact that ambulatory recording can be noisy, and recording artifacts can easily be mistaken for a physiological response during analysis, to date there is no automatic method for detecting artifacts. This paper describes the development of a machine learning algorithm for automatically detecting EDA artifacts, and provides an empirical evaluation of classification performance. We have encoded our results into a freely available web-based tool for artifact and peak detection.
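As a hedged sketch of framing artifact detection as supervised classification, the snippet below trains a cross-validated classifier on simple per-epoch features (first-difference magnitude, range, local variance). The features, labels and classifier choice are illustrative stand-ins and do not reproduce the paper's feature set or model.

```python
# Artifact-vs-clean classification on synthetic per-epoch EDA features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_epochs = 400
# features per epoch: max absolute first difference, signal range, local variance
clean = np.column_stack([rng.normal(0.05, 0.02, n_epochs // 2),
                         rng.normal(0.3, 0.1, n_epochs // 2),
                         rng.normal(0.01, 0.005, n_epochs // 2)])
artifact = np.column_stack([rng.normal(0.5, 0.2, n_epochs // 2),
                            rng.normal(1.5, 0.5, n_epochs // 2),
                            rng.normal(0.2, 0.1, n_epochs // 2)])
X = np.vstack([clean, artifact])
y = np.array([0] * (n_epochs // 2) + [1] * (n_epochs // 2))   # 1 = artifact epoch

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```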
Marketing quality and value to the managed care market.
Kazmirski, G
1998-11-01
Quantifying quality and marketing care delivery have been long-term challenges in the health care market. Insurers, employers, other purchasers of care, and providers face a constant challenge in positioning their organizations in a proactive, competitive niche. Tools that measure patient's self-reported perception of health care needs and expectations have increased the ability to quantify quality of care delivery. When integrated with case management and disease management strategies, outcomes reporting and variance analysis tracking can be packaged to position a provider in a competitive niche.
The use of rational functions in numerical quadrature
NASA Astrophysics Data System (ADS)
Gautschi, Walter
2001-08-01
Quadrature problems involving functions that have poles outside the interval of integration can profitably be solved by methods that are exact not only for polynomials of appropriate degree, but also for rational functions having the same (or the most important) poles as the function to be integrated. Constructive and computational tools for accomplishing this are described and illustrated in a number of quadrature contexts. The superiority of such rational/polynomial methods is shown by an analysis of the remainder term and documented by numerical examples.
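A small numerical illustration of the motivation: plain Gauss-Legendre quadrature converges slowly for an integrand with poles close to the interval, which is exactly the situation the rational rules are designed to handle (the rational construction itself is not reproduced here; the integrand and pole location are illustrative choices).

```python
# Slow Gauss-Legendre convergence for 1/(x^2 + a^2) with a pole near [-1, 1].
import numpy as np

a = 0.1
exact = 2.0 * np.arctan(1.0 / a) / a          # closed form of the integral on [-1, 1]

for n in (5, 10, 20, 40):
    nodes, weights = np.polynomial.legendre.leggauss(n)
    approx = np.sum(weights / (nodes**2 + a**2))
    print(f"n = {n:2d}: error = {abs(approx - exact):.2e}")
```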
The concept of coupling impedance in the self-consistent plasma wake field excitation
NASA Astrophysics Data System (ADS)
Fedele, R.; Akhter, T.; De Nicola, S.; Migliorati, M.; Marocchino, A.; Massimo, F.; Palumbo, L.
2016-09-01
Within the framework of the Vlasov-Maxwell system of equations, we describe the self-consistent interaction of a relativistic charged-particle beam with its surroundings while propagating through a plasma-based acceleration device. This is done in terms of the concept of the coupling (longitudinal) impedance, in full analogy with conventional accelerators. It is shown that here, too, the coupling impedance is a very useful tool for Nyquist-type stability analysis. Examples of specific physical situations are finally illustrated.
Labor Resource Audit and Analysis: A Tool for Management Planning and Control
1989-06-01
NASA Technical Reports Server (NTRS)
1974-01-01
The BRAVO User's Manual is presented which describes the BRAVO methodology in terms of step-by-step procedures, so that it may be used as a tool for a team of analysts performing cost effectiveness analyses on potential future space applications. BRAVO requires a relatively general set of input information and a relatively small expenditure of resources. For Vol. 1, see N74-12493; for Vol. 2, see N74-14530.
Multi-Mission Strategic Technology Prioritization Study
NASA Technical Reports Server (NTRS)
Weisbin, C. R.; Rodriquez, G.; Elfes, A.; Derleth, J.; Smith, J. H.; Manvi, R.; Kennedy, B.; Shelton, K.
2004-01-01
This viewgraph presentation provides an overview of a pilot study intended to demonstrate in an auditable fashion how advanced space technology development can best impact future NASA missions. The study was a joint project by staff members of NASA's Jet Propulsion Laboratory (JPL), and Goddard Space Flight Center (GSFC). The other goals of the study were to show an approach to deal effectively with inter-program analysis trades, and to explore the limits of these approaches and tools in terms of what can be realistically achieved.
Analysis of a mammography teaching program based on an affordance design model.
Luo, Ping; Eikman, Edward A; Kealy, William; Qian, Wei
2006-12-01
The wide use of computer technology in education, particularly in mammogram reading, calls for the evaluation of e-learning. Existing media-comparison studies, learner attitude evaluations, and performance tests are problematic. Based on an affordance design model, this study examined an existing e-learning program on mammogram reading. The selection criteria included content relatedness, representativeness, e-learning orientation, image quality, program completeness, and accessibility. A case study was conducted to examine the affordance features, functions, and presentations of the selected software. Data collection and analysis methods included interviews, protocol-based document analysis, and usability tests and inspection. Some statistics were also calculated. The examination of PBE showed that this educational software provides a set of purpose-built tools. The learner can use these tools in the process of optimizing displays, scanning images, comparing different projections, marking regions of interest, constructing a descriptive report, assessing one's learning outcomes, and comparing one's decisions with the experts' decisions. Further, PBE provides some resources for the learner to construct one's knowledge and skills, including a categorized image library, a term-searching function, and some teaching links. Besides, users found it easy to navigate and carry out tasks. The users also reacted positively toward PBE's navigation system, instructional aids, layout, pace and flow of information, graphics, and other presentation design. The software provides learners with cognitive tools that support their perceptual problem-solving processes and extend their capabilities. Learners can internalize the mental models in mammogram reading through multiple perceptual triangulations, sensitization of related features, semantic description of mammogram findings, and expert-guided semantic report construction. The design of these cognitive tools and the software interface matches the findings and principles in human learning and instructional design. Working with PBE's case-based simulations and categorized gallery, learners can enrich and transfer their experience to their jobs.
Integration of EGA secure data access into Galaxy.
Hoogstrate, Youri; Zhang, Chao; Senf, Alexander; Bijlard, Jochem; Hiltemann, Saskia; van Enckevort, David; Repo, Susanna; Heringa, Jaap; Jenster, Guido; J A Fijneman, Remond; Boiten, Jan-Willem; A Meijer, Gerrit; Stubbs, Andrew; Rambla, Jordi; Spalding, Dylan; Abeln, Sanne
2016-01-01
High-throughput molecular profiling techniques routinely generate vast amounts of data for translational medicine studies. Secure, access-controlled systems are needed to manage, store, transfer and distribute these data due to their personally identifiable nature. The European Genome-phenome Archive (EGA) was created to facilitate access to, and management of, long-term archives of bio-molecular data. Each data provider is responsible for ensuring that a Data Access Committee is in place to grant access to data stored in the EGA. Moreover, the transfer of data during upload and download is encrypted. ELIXIR, a European research infrastructure for life-science data, initiated a project (2016 Human Data Implementation Study) to understand and document the ELIXIR requirements for secure management of controlled-access data. As part of this project, a full ecosystem was designed to connect archived raw experimental molecular profiling data with interpreted data and the computational workflows, using the CTMM Translational Research IT (CTMM-TraIT) infrastructure (http://www.ctmm-trait.nl) as an example. Here we present the first outcomes of this project: a framework to enable the download of EGA data to a Galaxy server in a secure way. Galaxy provides an intuitive user interface for molecular biologists and bioinformaticians to run and design data analysis workflows. More specifically, we developed a tool, ega_download_streamer, that can download data securely from the EGA into a Galaxy server, where they can subsequently be further processed. This tool allows a user, within the browser, to run an entire analysis on sensitive data from the EGA and to make that analysis available to other researchers in a reproducible manner, as shown with a proof-of-concept study. The tool ega_download_streamer is available in the Galaxy tool shed: https://toolshed.g2.bx.psu.edu/view/yhoogstrate/ega_download_streamer.
Can we trust the calculation of texture indices of CT images? A phantom study.
Caramella, Caroline; Allorant, Adrien; Orlhac, Fanny; Bidault, Francois; Asselain, Bernard; Ammari, Samy; Jaranowski, Patricia; Moussier, Aurelie; Balleyguier, Corinne; Lassau, Nathalie; Pitre-Champagnat, Stephanie
2018-04-01
Texture analysis is an emerging tool in the field of medical imaging analysis. However, many issues have been raised regarding its use on patient images, and it is crucial to harmonize and standardize this new imaging measurement tool. This study was designed to evaluate the reliability of texture indices of CT images on a phantom, including a reproducibility study, to assess the discriminatory capacity of indices potentially relevant to CT medical images, and to determine their redundancy. For the reproducibility and discriminatory analyses, eight identical CT acquisitions were performed on a phantom including one homogeneous insert and two similar heterogeneous inserts. Texture indices were selected for their high reproducibility and ability to discriminate different textures. For the redundancy analysis, 39 acquisitions of the same phantom were performed using varying acquisition parameters, and a correlation matrix was used to explore the pairwise relationships. LIFEx software was used to explore 34 different parameters, including first-order and texture indices. Only eight of the 34 indices exhibited high reproducibility and discriminated the textures from each other. Skewness and kurtosis from the histogram were independent of the other six indices but were intercorrelated with each other; the remaining six indices (entropy, dissimilarity, and contrast of the co-occurrence matrix; contrast of the neighborhood gray-level difference matrix; SZE and ZLNU of the gray-level size-zone matrix) were correlated with one another to varying degrees. Care should be taken when using texture analysis as a tool to characterize CT images because changes in quantitation may be due primarily to internal variability rather than to real physio-pathological effects. Some texture indices nevertheless appear sufficiently reliable and capable of discriminating similar textures on CT images. © 2018 American Association of Physicists in Medicine.
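As an illustration of how co-occurrence-based texture indices of this kind are computed, the sketch below derives contrast, dissimilarity, and an entropy value from a gray-level co-occurrence matrix of a 2-D image using scikit-image. The array ct_slice, the eight-gray-level quantization, and the one-pixel offsets are illustrative assumptions, not the LIFEx settings used in the study.

    # Minimal sketch: co-occurrence (GLCM) texture indices on a 2-D image.
    # `ct_slice` is a synthetic stand-in for a real region of interest.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(0)
    ct_slice = rng.normal(40.0, 10.0, size=(64, 64))   # stand-in for a CT ROI

    # Quantize the ROI to a small number of gray levels, as texture packages do.
    levels = 8
    lo, hi = ct_slice.min(), ct_slice.max()
    quantized = np.clip((ct_slice - lo) / (hi - lo) * levels, 0, levels - 1).astype(np.uint8)

    # Symmetric, normalized GLCM over four directions at a 1-pixel offset.
    glcm = graycomatrix(quantized, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, symmetric=True, normed=True)

    contrast = graycoprops(glcm, "contrast").mean()
    dissimilarity = graycoprops(glcm, "dissimilarity").mean()

    # Entropy of the co-occurrence probabilities, averaged over the four angles
    # (graycoprops does not provide entropy directly).
    entropies = []
    for k in range(glcm.shape[3]):
        p = glcm[:, :, 0, k]
        p = p[p > 0]
        entropies.append(-np.sum(p * np.log2(p)))
    entropy = float(np.mean(entropies))

    print(f"contrast={contrast:.3f} dissimilarity={dissimilarity:.3f} entropy={entropy:.3f}")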
Proceedings of the 11th Thermal and Fluids Analysis Workshop
NASA Astrophysics Data System (ADS)
Sakowski, Barbara
2002-07-01
The Eleventh Thermal & Fluids Analysis Workshop (TFAWS 2000) was held the week of August 21-25 at The Forum in downtown Cleveland. This year's annual event focused on building stronger links between the research community and the engineering design/application world and celebrated the theme "Bridging the Gap Between Research and Design". Dr. Simon Ostrach delivered the keynote address, "Research for Design (R4D)", and encouraged a more deliberate approach to performing research with near-term engineering design applications in mind. Over 100 persons attended TFAWS 2000, including participants from five different countries. This year's conference devoted a full-day seminar to the discussion of analysis and design tools associated with aeropropulsion research at the Glenn Research Center. As in previous years, the workshop also included hands-on instruction in state-of-the-art analysis tools, paper sessions on selected topics, short courses, and application software demonstrations. TFAWS 2000 was co-hosted by the Thermal/Fluids Systems Design and Analysis Branch of NASA GRC and by the Ohio Aerospace Institute and was co-chaired by Barbara A. Sakowski and James R. Yuko. The annual NASA delegates meeting is a standard component of TFAWS where the civil servants of the various centers represented discuss current and future events which affect the Community of Applied Thermal and Fluid ANalystS (CATFANS). At this year's delegates meeting the following goals (among others) were set by the collective body of delegates: (1) participation of all Centers in the update of the NASA material properties database (TPSX); (2) developing and collaboratively supporting multi-center proposals; (3) expanding the scope of TFAWS to include other federal laboratories; (4) initiation of white papers on thermal tools and standards; and (5) formation of an Agency-wide TFAWS steering committee.
Subluxation and semantics: a corpus linguistics study.
Budgell, Brian
2016-06-01
The purpose of this study was to analyze the curriculum of one chiropractic college in order to discover if there were any implicit consensus definitions of the term subluxation. Using the software WordSmith Tools, the corpus of an undergraduate chiropractic curriculum was analyzed by reviewing collocated terms and through discourse analysis of text blocks containing words based on the root 'sublux.' It was possible to identify 3 distinct concepts which were each referred to as 'subluxation:' i) an acute or instantaneous injurious event; ii) a clinical syndrome which manifested post-injury; iii) a physical lesion, i.e. an anatomical or physiological derangement which in most instances acted as a pain generator. In fact, coherent implicit definitions of subluxation exist and may enjoy broad but subconscious acceptance. However, confusion likely arises from failure to distinguish which concept an author or speaker is referring to when they employ the term subluxation.
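The collocation step described above can be approximated in a few lines of standard-library Python: the sketch below counts the words that co-occur within a fixed window of any token built on the root 'sublux'. The sample text and window size are illustrative stand-ins, not the WordSmith Tools settings or the actual curriculum corpus.

    # Minimal sketch of window-based collocation counting around the root "sublux".
    import re
    from collections import Counter

    corpus = ("The subluxation was described as an acute injurious event. "
              "A vertebral subluxation may act as a pain generator, and the "
              "subluxated segment was adjusted.")

    tokens = re.findall(r"[a-z']+", corpus.lower())
    window = 4                      # count collocates within +/- 4 tokens
    collocates = Counter()

    for i, tok in enumerate(tokens):
        if tok.startswith("sublux"):
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            neighbours = tokens[lo:i] + tokens[i + 1:hi]
            collocates.update(w for w in neighbours if not w.startswith("sublux"))

    print(collocates.most_common(10))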
A Value Measure for Public-Sector Enterprise Risk Management: A TSA Case Study.
Fletcher, Kenneth C; Abbas, Ali E
2018-05-01
This article presents a public value measure that can be used to aid executives in the public sector to better assess policy decisions and maximize value to the American people. Using Transportation Security Administration (TSA) programs as an example, we first identify the basic components of public value. We then propose a public value account to quantify the outcomes of various risk scenarios, and we determine the certain equivalent of several important TSA programs. We illustrate how this proposed measure can quantify the effects of two main challenges that government organizations face when conducting enterprise risk management: (1) short-term versus long-term incentives and (2) avoiding potential negative consequences even if they occur with low probability. Finally, we illustrate how this measure enables the use of various tools from decision analysis to be applied in government settings, such as stochastic dominance arguments and certain equivalent calculations. Regarding the TSA case study, our analysis demonstrates the value of continued expansion of the TSA trusted traveler initiative and increasing the background vetting for passengers who are afforded expedited security screening. © 2017 Society for Risk Analysis.
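For readers unfamiliar with the certain-equivalent calculation mentioned above, the sketch below computes it for a discrete set of public-value outcomes under an exponential utility function. The outcome values, probabilities, and risk tolerance are invented for illustration and do not come from the TSA analysis.

    # Minimal sketch: certain equivalent of a risky public-value lottery under an
    # exponential utility u(x) = -exp(-x / rho). All numbers are illustrative.
    import math

    outcomes = [-50.0, 10.0, 120.0]      # hypothetical public-value outcomes
    probs = [0.2, 0.5, 0.3]              # hypothetical scenario probabilities
    rho = 100.0                          # hypothetical risk tolerance

    expected_value = sum(p * x for p, x in zip(probs, outcomes))
    expected_utility = sum(p * -math.exp(-x / rho) for p, x in zip(probs, outcomes))
    certain_equivalent = -rho * math.log(-expected_utility)   # u^{-1}(E[u])

    print(f"expected value     = {expected_value:.2f}")
    print(f"certain equivalent = {certain_equivalent:.2f}")   # below EV for a risk-averse decision maker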
Non-parametric characterization of long-term rainfall time series
NASA Astrophysics Data System (ADS)
Tiwari, Harinarayan; Pandey, Brij Kishor
2018-03-01
The statistical study of rainfall time series is one approach to efficient hydrological system design. Identifying and characterizing long-term rainfall time series could aid in improving hydrological system forecasting. In the present study, statistical analyses were applied to the long-term (1851-2006) rainfall time series of seven meteorological regions of India. Linear trend analysis was carried out using the Mann-Kendall test for the observed rainfall series. The trend observed with this approach was then ascertained using the innovative trend analysis method. Innovative trend analysis has been found to be a strong tool to detect the general trend of rainfall time series. The sequential Mann-Kendall test has also been carried out to examine nonlinear trends in the series. The partial sum of cumulative deviation test is also found to be suitable for detecting nonlinear trends. Innovative trend analysis, the sequential Mann-Kendall test and the partial cumulative deviation test have the potential to detect both general and nonlinear trends in rainfall time series. Annual rainfall analysis suggests that the maximum rise in mean rainfall is 11.53% for West Peninsular India, whereas the maximum fall in mean rainfall is 7.8% for the North Mountainous Indian region. The innovative trend analysis method is also capable of finding the number of change points present in the time series. Additionally, we have performed the von Neumann ratio test and the cumulative deviation test to estimate the departure from homogeneity. Singular spectrum analysis has been applied in this study to evaluate the order of departure from homogeneity in the rainfall time series. The monsoon season (JS) of the North Mountainous India and West Peninsular India zones has a higher departure from homogeneity, and singular spectrum analysis shows results in coherence with this finding.
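As a pointer to the kind of trend test used here, the sketch below implements the standard (non-seasonal) Mann-Kendall S statistic and its normal approximation for a short synthetic series. The rainfall values are made up, and the tie correction and the seasonal/sequential variants used in the study are omitted for brevity.

    # Minimal sketch of the Mann-Kendall trend test (no ties assumed, non-seasonal)
    # applied to a short synthetic "annual rainfall" series; values are illustrative.
    import math

    rain = [830, 815, 870, 790, 905, 940, 880, 960, 935, 990]
    n = len(rain)

    # S statistic: sum of signs over all pairs (j > i).
    s = sum(
        (1 if rain[j] > rain[i] else -1 if rain[j] < rain[i] else 0)
        for i in range(n - 1)
        for j in range(i + 1, n)
    )

    # Variance of S under the null hypothesis of no trend (no tie correction).
    var_s = n * (n - 1) * (2 * n + 5) / 18.0

    # Continuity-corrected standard normal deviate.
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0

    p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    print(f"S={s} Z={z:.2f} two-sided p={p_two_sided:.3f}")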
Lancaster, Timothy S; Schill, Matthew R; Greenberg, Jason W; Ruaengsri, Chawannuch; Schuessler, Richard B; Lawton, Jennifer S; Maniar, Hersh S; Pasque, Michael K; Moon, Marc R; Damiano, Ralph J; Melby, Spencer J
2018-05-01
The recently developed American College of Cardiology Foundation-Society of Thoracic Surgeons (STS) Collaboration on the Comparative Effectiveness of Revascularization Strategy (ASCERT) Long-Term Survival Probability Calculator is a valuable addition to existing short-term risk-prediction tools for cardiac surgical procedures but has yet to be externally validated. Institutional data of 654 patients aged 65 years or older undergoing isolated coronary artery bypass grafting between 2005 and 2010 were reviewed. Predicted survival probabilities were calculated using the ASCERT model. Survival data were collected using the Social Security Death Index and institutional medical records. Model calibration and discrimination were assessed for the overall sample and for risk-stratified subgroups based on (1) ASCERT 7-year survival probability and (2) the predicted risk of mortality (PROM) from the STS Short-Term Risk Calculator. Logistic regression analysis was performed to evaluate additional perioperative variables contributing to death. Overall survival was 92.1% (569 of 597) at 1 year and 50.5% (164 of 325) at 7 years. Calibration assessment found no significant differences between predicted and actual survival curves for the overall sample or for the risk-stratified subgroups, whether stratified by predicted 7-year survival or by PROM. Discriminative performance was comparable between the ASCERT and PROM models for 7-year survival prediction (p < 0.001 for both; C-statistic = 0.815 for ASCERT and 0.781 for PROM). Prolonged ventilation, stroke, and hospital length of stay were also predictive of long-term death. The ASCERT survival probability calculator was externally validated for prediction of long-term survival after coronary artery bypass grafting in all risk groups. The widely used STS PROM performed comparably as a predictor of long-term survival. Both tools provide important information for preoperative decision making and patient counseling about potential outcomes after coronary artery bypass grafting. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
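The C-statistic reported above is a concordance measure; for a binary outcome it reduces to the area under the ROC curve. The sketch below shows one way to compute it for predicted 7-year survival probabilities against observed vital status, using invented toy data rather than the institutional cohort.

    # Minimal sketch: C-statistic (AUC) for predicted survival probabilities versus
    # observed 7-year vital status. Probabilities and outcomes are toy values.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    predicted_survival = np.array([0.85, 0.62, 0.40, 0.91, 0.55, 0.30, 0.77, 0.48])
    alive_at_7_years = np.array([1, 1, 0, 1, 0, 0, 1, 1])

    c_statistic = roc_auc_score(alive_at_7_years, predicted_survival)
    print(f"C-statistic = {c_statistic:.3f}")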
Top-level modeling of an ALS system utilizing object-oriented techniques
NASA Astrophysics Data System (ADS)
Rodriguez, L. F.; Kang, S.; Ting, K. C.
The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system-level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging its utilization. Systems analysis is further enabled by a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here are the procedure utilized to develop the modeling tool, the vision of the modeling tool, and the current focus for each of the subsystem models.
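To make the object-oriented decomposition concrete, the sketch below mimics the modular structure described above with a common subsystem interface and a top-level model that steps each subsystem in turn. The class names echo the subsystems listed in the abstract, but the state variables and update rules are placeholders, not the actual ALS model (which was implemented in Java).

    # Minimal sketch of a modular, object-oriented top-level model: each subsystem
    # exposes the same step() interface so the system model can advance them in
    # sequence. Rates and state variables are placeholders, not the ALS equations.
    from abc import ABC, abstractmethod

    class Subsystem(ABC):
        @abstractmethod
        def step(self, dt_hours: float) -> None:
            """Advance the subsystem's internal state by dt_hours."""

    class Crew(Subsystem):
        def __init__(self, size: int):
            self.size = size
            self.o2_demand_kg = 0.0

        def step(self, dt_hours: float) -> None:
            # Placeholder: roughly 0.035 kg O2 per crew member per hour.
            self.o2_demand_kg += 0.035 * self.size * dt_hours

    class BiomassProduction(Subsystem):
        def __init__(self, area_m2: float):
            self.area_m2 = area_m2
            self.o2_produced_kg = 0.0

        def step(self, dt_hours: float) -> None:
            # Placeholder production rate per square metre of crop area.
            self.o2_produced_kg += 0.002 * self.area_m2 * dt_hours

    class ALSSystemModel:
        def __init__(self, subsystems):
            self.subsystems = subsystems

        def run(self, hours: int, dt_hours: float = 1.0) -> None:
            for _ in range(int(hours / dt_hours)):
                for subsystem in self.subsystems:
                    subsystem.step(dt_hours)

    crew, biomass = Crew(size=4), BiomassProduction(area_m2=40.0)
    ALSSystemModel([crew, biomass]).run(hours=24)
    print(f"O2 demand: {crew.o2_demand_kg:.2f} kg, O2 produced: {biomass.o2_produced_kg:.2f} kg")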
NASA's Cryogenic Fluid Management Technology Project
NASA Technical Reports Server (NTRS)
Tramel, Terri L.; Motil, Susan M.
2008-01-01
The Cryogenic Fluid Management (CFM) Project's primary objective is to develop storage, transfer, and handling technologies for cryogens that will support the enabling of high-performance cryogenic propulsion systems, lunar surface systems and economical ground operations. Such technologies can significantly reduce propellant launch mass and required on-orbit margins, reduce or even eliminate propellant tank fluid boil-off losses for long-term missions, and simplify vehicle operations. This paper will present the status of the specific technologies that the CFM Project is developing. The two main areas of concentration are analysis model development and CFM hardware development. The project develops analysis tools and models based on thermodynamics, hydrodynamics, and existing flight/test data. These tools assist in the development of pressure/thermal control devices (such as the Thermodynamic Vent System (TVS) and multi-layer insulation), with the ultimate goal of developing a mature set of tools and models that can characterize the performance of the pressure/thermal control devices incorporated in the design of an entire CFM system with minimal cryogen loss. The project performs hardware development and testing to verify our understanding of the physical principles involved and to validate the performance of CFM components, subsystems and systems. The resulting database provides information to anchor our analytical models. This paper describes some of the current activities of NASA's Cryogenic Fluid Management Project.
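As a very small example of the kind of thermal bookkeeping such analysis tools encapsulate, the sketch below estimates steady-state boil-off from a cryogenic tank given an assumed heat leak and the latent heat of vaporization. The heat-leak value is invented, the latent heat is an approximate handbook figure for liquid hydrogen, and real CFM models resolve far more physics (stratification, pressurization, TVS operation, and so on).

    # Minimal sketch: steady-state boil-off estimate for a liquid-hydrogen tank.
    latent_heat_lh2_j_per_kg = 446_000.0   # ~446 kJ/kg, approximate value for LH2
    heat_leak_w = 5.0                      # assumed parasitic heat leak into the tank

    boiloff_kg_per_s = heat_leak_w / latent_heat_lh2_j_per_kg
    boiloff_kg_per_day = boiloff_kg_per_s * 86_400.0

    print(f"boil-off ~ {boiloff_kg_per_day:.3f} kg/day for a {heat_leak_w:.0f} W heat leak")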
Identifying currents in the gene pool for bacterial populations using an integrative approach.
Tang, Jing; Hanage, William P; Fraser, Christophe; Corander, Jukka
2009-08-01
The evolution of bacterial populations has recently become considerably better understood due to large-scale sequencing of population samples. It has become clear that DNA sequences from a multitude of genes, as well as broad sample coverage of a target population, are needed to obtain a relatively unbiased view of its genetic structure and the patterns of ancestry connected to the strains. However, the traditional statistical methods for evolutionary inference, such as phylogenetic analysis, are associated with several difficulties under such an extensive sampling scenario, in particular when a considerable amount of recombination is anticipated to have taken place. To meet the needs of large-scale analyses of population structure for bacteria, we introduce here several statistical tools for the detection and representation of recombination between populations. We also introduce a model-based description of the shape of a population in sequence space, in terms of its molecular variability and affinity towards other populations. Extensive real data from the genus Neisseria are utilized to demonstrate the potential of an approach in which these population genetic tools are combined with a phylogenetic analysis. The statistical tools introduced here are freely available in the BAPS 5.2 software, which can be downloaded from http://web.abo.fi/fak/mnf/mate/jc/software/baps.html.
CFGP: a web-based, comparative fungal genomics platform
Park, Jongsun; Park, Bongsoo; Jung, Kyongyong; Jang, Suwang; Yu, Kwangyul; Choi, Jaeyoung; Kong, Sunghyung; Park, Jaejin; Kim, Seryun; Kim, Hyojeong; Kim, Soonok; Kim, Jihyun F.; Blair, Jaime E.; Lee, Kwangwon; Kang, Seogchan; Lee, Yong-Hwan
2008-01-01
Since the completion of the Saccharomyces cerevisiae genome sequencing project in 1996, the genomes of over 80 fungal species have been sequenced or are currently being sequenced. The resulting data provide opportunities for studying and comparing fungal biology and evolution at the genome level. To support such studies, the Comparative Fungal Genomics Platform (CFGP; http://cfgp.snu.ac.kr), a web-based multifunctional informatics workbench, was developed. The CFGP comprises three layers: the basal layer, the middleware and the user interface. The data warehouse in the basal layer contains standardized genome sequences of 65 fungal species. The middleware processes queries via six analysis tools, including BLAST, ClustalW, InterProScan, SignalP 3.0, PSORT II and a newly developed tool named BLASTMatrix. BLASTMatrix permits the identification and visualization of genes homologous to a query across multiple species. The Data-driven User Interface (DUI) of the CFGP was built on a new concept of pre-collecting data and post-executing analysis instead of the 'fill-in-the-form-and-press-SUBMIT' user interfaces employed by most bioinformatics sites. A tool termed Favorite, which supports the management of encapsulated sequence data and provides a personalized data repository to users, is another novel feature of the DUI. PMID:17947331
Fernández-de-Manúel, Laura; Díaz-Díaz, Covadonga; Jiménez-Carretero, Daniel; Torres, Miguel; Montoya, María C
2017-05-01
Embryonic stem cells (ESCs) can be established as permanent cell lines, and their potential to differentiate into adult tissues has led to widespread use for studying the mechanisms and dynamics of stem cell differentiation and exploring strategies for tissue repair. Imaging live ESCs during development is now feasible due to advances in optical imaging and engineering of genetically encoded fluorescent reporters; however, a major limitation is the low spatio-temporal resolution of long-term 3-D imaging required for generational and neighborhood reconstructions. Here, we present the ESC-Track (ESC-T) workflow, which includes an automated cell and nuclear segmentation and tracking tool for 4-D (3-D + time) confocal image data sets as well as a manual editing tool for visual inspection and error correction. ESC-T automatically identifies cell divisions and membrane contacts for lineage tree and neighborhood reconstruction and computes quantitative features from individual cell entities, enabling analysis of fluorescence signal dynamics and tracking of cell morphology and motion. We use ESC-T to examine Myc intensity fluctuations in the context of mouse ESC (mESC) lineage and neighborhood relationships. ESC-T is a powerful tool for evaluation of the genealogical and microenvironmental cues that maintain ESC fitness.
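The segmentation step at the core of such a workflow can be illustrated in a few lines: the sketch below applies Otsu thresholding and 3-D connected-component labeling to a single frame with scikit-image and scipy. The synthetic volume stands in for a real confocal stack, and this is of course far simpler than the ESC-T pipeline itself.

    # Minimal sketch: nuclear segmentation of one 3-D frame by Otsu thresholding and
    # connected-component labeling; the volume is a synthetic stand-in.
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.filters import threshold_otsu

    rng = np.random.default_rng(1)
    volume = rng.normal(100.0, 5.0, size=(16, 128, 128))   # background (z, y, x)
    volume[6:10, 30:50, 30:50] += 80.0                      # two bright "nuclei"
    volume[8:12, 80:100, 70:90] += 80.0

    mask = volume > threshold_otsu(volume)        # global Otsu threshold
    labels, n_nuclei = ndi.label(mask)            # 3-D connected components

    # Per-nucleus centroids and voxel counts, the kind of per-cell features that
    # feed tracking and lineage reconstruction.
    index = range(1, n_nuclei + 1)
    centroids = ndi.center_of_mass(mask, labels, index)
    sizes = ndi.sum(mask, labels, index)
    for i, (c, s) in enumerate(zip(centroids, sizes), start=1):
        print(f"nucleus {i}: centroid={tuple(round(v, 1) for v in c)} voxels={int(s)}")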
Mollayeva, Tatyana; Thurairajah, Pravheen; Burton, Kirsteen; Mollayeva, Shirin; Shapiro, Colin M; Colantonio, Angela
2016-02-01
This review appraises the development process and the measurement properties of the Pittsburgh Sleep Quality Index (PSQI), gauging its potential as a screening tool for sleep dysfunction in non-clinical and clinical samples; it also compares non-clinical and clinical populations in terms of PSQI scores. The MEDLINE, Embase, PsycINFO, and HAPI databases were searched. Critical appraisal of studies of measurement properties was performed using COSMIN. Of the 37 reviewed studies, 22 examined construct validity, 19 known-group validity, 15 internal consistency, and 3 test-retest reliability. Study quality ranged from poor to excellent, with the majority rated fair. Internal consistency, based on Cronbach's alpha, was good. Discrepancies were observed in factor-analytic studies. In non-clinical and clinical samples with known differences in sleep quality, the PSQI global scores and all subscale scores, with the exception of sleep disturbance, differed significantly. The best-evidence synthesis for the PSQI showed strong reliability and validity, and moderate structural validity, in a variety of samples, suggesting the tool fulfills its intended utility. A taxometric analysis could contribute to a better understanding of sleep dysfunction as either a dichotomous or continuous construct. Copyright © 2015 Elsevier Ltd. All rights reserved.
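As a brief reminder of the internal-consistency statistic underlying much of this evidence, the sketch below computes Cronbach's alpha from an item-score matrix. The toy matrix only mirrors the shape of seven PSQI component scores; the values are invented.

    # Minimal sketch: Cronbach's alpha for a respondents-by-items score matrix.
    import numpy as np

    scores = np.array([
        [1, 2, 1, 0, 1, 2, 1],
        [2, 3, 2, 1, 2, 2, 2],
        [0, 1, 0, 0, 1, 1, 0],
        [3, 3, 2, 2, 3, 3, 2],
        [1, 1, 1, 1, 0, 1, 1],
        [2, 2, 3, 1, 2, 2, 3],
    ], dtype=float)

    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
    print(f"Cronbach's alpha = {alpha:.3f}")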
Yu, Yao; Tu, Kang; Zheng, Siyuan; Li, Yun; Ding, Guohui; Ping, Jie; Hao, Pei; Li, Yixue
2009-08-25
In the post-genomic era, the development of high-throughput gene expression detection technology has provided huge amounts of experimental data, challenging traditional pipelines for data processing and analysis in scientific research. In our work, we integrated gene expression information from the Gene Expression Omnibus (GEO), biomedical ontology from Medical Subject Headings (MeSH) and signaling pathway knowledge from sigPathway entries to develop a context mining tool for gene expression analysis: GEOGLE. GEOGLE offers a rapid and convenient way to search for relevant experimental datasets, pathways and biological terms according to multiple types of queries, including biomedical vocabularies, GDS IDs, gene IDs, pathway names and signature lists. Moreover, GEOGLE summarizes the signature genes from a subset of GDSes and estimates the correlation between gene expression and the phenotypic distinction with an integrated p value. This approach, which performs global searches of expression data, may expand the traditional way of collecting heterogeneous gene expression experiment data. GEOGLE is a novel tool that provides researchers a quantitative way to understand the correlation between gene expression and phenotypic distinction through meta-analysis of gene expression datasets from different experiments, as well as the biological meaning behind it. The web site and user guide of GEOGLE are available at: http://omics.biosino.org:14000/kweb/workflow.jsp?id=00020.
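One common way to obtain an integrated p value of the kind described above is Fisher's method for combining per-dataset p values; the sketch below applies it to a few invented p values with scipy. GEOGLE's own integration scheme may differ in detail.

    # Minimal sketch: combining per-dataset p values with Fisher's method.
    from scipy.stats import combine_pvalues

    p_values = [0.04, 0.20, 0.01]   # hypothetical per-GDS p values for one gene set
    statistic, combined_p = combine_pvalues(p_values, method="fisher")
    print(f"Fisher chi-square = {statistic:.2f}, combined p = {combined_p:.4f}")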
Validation of the Italian version of the HSE Indicator Tool.
Magnavita, N
2012-06-01
An Italian version of the Health & Safety Executive's (HSE) Management Standards Revised Indicator Tool (MS-RIT) has been used to monitor working conditions that may lead to stress. The aims were, first, to examine the factor structure of the Italian version of the MS-RIT in comparison with the original UK tool and to investigate its validity and reliability, and second, to study the association between occupational stress and psychological distress. Workers from 17 companies self-completed the MS-RIT and the General Health Questionnaire, used to measure psychological distress, while they waited for their periodic examination at the workplace. Factor analysis was employed to ascertain whether the Italian version maintained the original subdivision into seven scales. Odds ratios were calculated to estimate the risk of impairment associated with exposure to stress at the workplace. In total, 748 workers participated; the response rate was 91%. The factor structure of the Italian MS-RIT corresponded partially to the original UK version. The 'demand', 'control', 'role', 'relationship' and 'colleague-support' scales were equivalent to the UK ones. A principal factor, termed 'elasticity', incorporated the UK 'management-support' and 'change' scales. Reliability analysis of the sub-scales revealed Cronbach's alpha values ranging from 0.75 to 0.86. Our findings confirmed the usefulness of the Italian version of the HSE MS-RIT in stress control.
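The odds-ratio calculation used in this kind of validation study is straightforward to reproduce; the sketch below derives the odds ratio and its approximate 95% confidence interval from a 2 x 2 table of stress exposure versus psychological distress. The cell counts are invented, not data from the study above.

    # Minimal sketch: odds ratio and log-based (Woolf) 95% CI from a 2 x 2 table.
    import math

    a, b = 45, 155    # exposed: distressed / not distressed (invented counts)
    c, d = 25, 475    # unexposed: distressed / not distressed (invented counts)

    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
    print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")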
Near-Term Actions to Address Long-Term Climate Risk
NASA Astrophysics Data System (ADS)
Lempert, R. J.
2014-12-01
Addressing climate change requires effective long-term policy making, which occurs when reflecting on potential events decades or more in the future leads policy makers to choose near-term actions different from those they would otherwise pursue. Contrary to some expectations, policy makers do sometimes make such long-term decisions, but not as commonly and successfully as climate change may require. In recent years, however, the new capabilities of analytic decision-support tools, combined with improved understanding of cognitive and organizational behaviors, have significantly improved the methods available for organizations to manage longer-term climate risks. In particular, these tools allow decision makers to understand which near-term actions consistently contribute to achieving both short- and long-term societal goals, even in the face of deep uncertainty regarding the long-term future. This talk will describe applications of these approaches to infrastructure, water, and flood risk management planning, as well as studies of how near-term choices about policy architectures can affect long-term greenhouse gas emission reduction pathways.
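A stripped-down illustration of the robustness reasoning behind such decision-support tools is a regret table over deeply uncertain futures: the sketch below scores a few hypothetical near-term actions against a few scenarios and picks the action with the smallest worst-case regret. The actions, scenarios, and payoffs are entirely invented.

    # Minimal sketch: minimax-regret screening of near-term actions across deeply
    # uncertain long-term scenarios. Actions, scenarios, and payoffs are invented.
    payoffs = {                      # action -> payoff per scenario (higher is better)
        "business_as_usual": [90, 40, 10],
        "moderate_adaptation": [75, 70, 55],
        "aggressive_mitigation": [60, 65, 80],
    }
    n_scenarios = 3
    best_per_scenario = [max(p[s] for p in payoffs.values()) for s in range(n_scenarios)]

    worst_regret = {
        action: max(best_per_scenario[s] - p[s] for s in range(n_scenarios))
        for action, p in payoffs.items()
    }
    robust_action = min(worst_regret, key=worst_regret.get)
    print(worst_regret)
    print(f"minimax-regret choice: {robust_action}")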
The Multiple Control of Verbal Behavior
Michael, Jack; Palmer, David C; Sundberg, Mark L
2011-01-01
Amid the novel terms and original analyses in Skinner's Verbal Behavior, the importance of his discussion of multiple control is easily missed, but multiple control of verbal responses is the rule rather than the exception. In this paper we summarize and illustrate Skinner's analysis of multiple control and introduce the terms convergent multiple control and divergent multiple control. We point out some implications for applied work and discuss examples of the role of multiple control in humor, poetry, problem solving, and recall. Joint control and conditional discrimination are discussed as special cases of multiple control. We suggest that multiple control is a useful analytic tool for interpreting virtually all complex behavior, and we consider the concepts of derived relations and naming as cases in point. PMID:22532752
A scalable, self-analyzing digital locking system for use on quantum optics experiments.
Sparkes, B M; Chrzanowski, H M; Parrain, D P; Buchler, B C; Lam, P K; Symul, T
2011-07-01
Digital control of optics experiments has many advantages over analog control systems, specifically in terms of the scalability, cost, flexibility, and the integration of system information into one location. We present a digital control system, freely available for download online, specifically designed for quantum optics experiments that allows for automatic and sequential re-locking of optical components. We show how the inbuilt locking analysis tools, including a white-noise network analyzer, can be used to help optimize individual locks, and we verify the long-term stability of the digital system. Finally, we present an example of the benefits of digital locking for quantum optics by applying the code to a specific experiment used to characterize optical Schrödinger cat states.
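A white-noise network analysis of the kind mentioned above amounts to estimating a transfer function from a broadband excitation and the system's response; the sketch below does this for a simulated first-order low-pass system using Welch cross- and auto-spectra from scipy. The filter and noise are synthetic stand-ins for a real locking loop, not the published system.

    # Minimal sketch: transfer-function estimate H(f) = Pxy / Pxx from a white-noise
    # excitation, as a software network analyzer would do.
    import numpy as np
    from scipy import signal

    fs = 10_000.0                               # sample rate, Hz
    rng = np.random.default_rng(2)
    x = rng.normal(size=200_000)                # white-noise excitation

    b, a = signal.butter(1, 100.0, fs=fs)       # "plant": 1st-order low-pass at 100 Hz
    y = signal.lfilter(b, a, x)                 # system response

    f, pxx = signal.welch(x, fs=fs, nperseg=4096)
    _, pxy = signal.csd(x, y, fs=fs, nperseg=4096)
    h = pxy / pxx                               # complex frequency-response estimate

    idx = np.argmin(np.abs(f - 100.0))
    print(f"|H| at ~100 Hz: {abs(h[idx]):.2f} (expect ~0.71 for a 1st-order pole)")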
Performance Analysis of IIUM Wireless Campus Network
NASA Astrophysics Data System (ADS)
Abd Latif, Suhaimi; Masud, Mosharrof H.; Anwar, Farhat
2013-12-01
International Islamic University Malaysia (IIUM) is one of the leading universities in the world in terms of quality of education, achieved in part by providing numerous facilities, including wireless services, to every enrolled student. The quality of this wireless service is controlled and monitored by the Information Technology Division (ITD), an ISO-standardized organization under the university. This paper aims to investigate the constraints of the IIUM wireless campus network. It evaluates the performance of the network in terms of delay, throughput and jitter. The QualNet 5.2 simulator tool was employed to measure these performance metrics for the IIUM wireless campus network. The observations from the simulation results could inform ITD's efforts to further improve its wireless services.
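For readers unfamiliar with how delay, jitter, and throughput are derived from simulator traces, the sketch below computes all three from a toy list of packet send/receive records. The timestamps and packet sizes are invented, not QualNet output.

    # Minimal sketch: delay, jitter, and throughput from per-packet send/receive
    # records; all values are invented.
    packets = [  # (send_time_s, recv_time_s, size_bytes)
        (0.000, 0.012, 1500),
        (0.020, 0.035, 1500),
        (0.040, 0.051, 1500),
        (0.060, 0.078, 1500),
    ]

    delays = [recv - send for send, recv, _ in packets]
    mean_delay = sum(delays) / len(delays)

    # Jitter as the mean absolute difference between consecutive one-way delays.
    jitter = sum(abs(b - a) for a, b in zip(delays, delays[1:])) / (len(delays) - 1)

    duration = packets[-1][1] - packets[0][0]
    throughput_bps = sum(size * 8 for _, _, size in packets) / duration

    print(f"delay={mean_delay*1e3:.1f} ms  jitter={jitter*1e3:.1f} ms  "
          f"throughput={throughput_bps/1e6:.2f} Mbit/s")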
Farid, Suzanne S; Washbrook, John; Titchener-Hooker, Nigel J
2005-01-01
This paper presents the application of a decision-support tool, SIMBIOPHARMA, for assessing different manufacturing strategies under uncertainty for the production of biopharmaceuticals. SIMBIOPHARMA captures both the technical and business aspects of biopharmaceutical manufacture within a single tool that permits manufacturing alternatives to be evaluated in terms of cost, time, yield, project throughput, resource utilization, and risk. Its use for risk analysis is demonstrated through a hypothetical case study that uses the Monte Carlo simulation technique to imitate the randomness inherent in manufacturing subject to technical and market uncertainties. The case study addresses whether start-up companies should invest in a stainless steel pilot plant or use disposable equipment for the production of early phase clinical trial material. The effects of fluctuating product demands and titers on the performance of a biopharmaceutical company manufacturing clinical trial material are analyzed. The analysis highlights the impact of different manufacturing options on the range in possible outcomes for the project throughput and cost of goods and the likelihood that these metrics exceed a critical threshold. The simulation studies highlight the benefits of incorporating uncertainties when evaluating manufacturing strategies. Methods of presenting and analyzing information generated by the simulations are suggested. These are used to help determine the ranking of alternatives under different scenarios. The example illustrates the benefits to companies of using such a tool to improve management of their R&D portfolios so as to control the cost of goods.
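The Monte Carlo element of such an analysis can be sketched compactly: the code below samples uncertain product titer and annual demand, computes a per-gram cost of goods for each trial, and reports the probability that it exceeds a critical threshold. The distributions, cost figures, and threshold are illustrative assumptions, not SIMBIOPHARMA inputs.

    # Minimal sketch: Monte Carlo estimate of cost of goods per gram under uncertain
    # titer and demand; all distributions and costs are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(3)
    n_trials = 100_000

    titer_g_per_l = rng.triangular(0.5, 1.0, 1.5, size=n_trials)      # uncertain titer
    demand_g = rng.normal(2_000.0, 400.0, size=n_trials).clip(min=500.0)

    batch_volume_l = 2_000.0
    batches_needed = np.ceil(demand_g / (titer_g_per_l * batch_volume_l))

    fixed_cost = 2_000_000.0            # hypothetical facility cost per year
    cost_per_batch = 150_000.0          # hypothetical consumables, labour, QC per batch
    cost_of_goods_per_g = (fixed_cost + batches_needed * cost_per_batch) / demand_g

    threshold = 2_000.0                 # hypothetical critical cost-of-goods threshold, $/g
    p_exceed = float((cost_of_goods_per_g > threshold).mean())
    print(f"median CoG = {np.median(cost_of_goods_per_g):.0f} $/g, "
          f"P(CoG > {threshold:.0f} $/g) = {p_exceed:.2f}")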
IPRStats: visualization of the functional potential of an InterProScan run.
Kelly, Ryan J; Vincent, David E; Friedberg, Iddo
2010-12-21
InterPro is a collection of protein signatures for the classification and automated annotation of proteins. InterProScan is a software tool that scans protein sequences against InterPro member databases using a variety of profile-based, hidden Markov model and position-specific scoring matrix methods. It not only combines a set of analysis tools, but also performs data look-up from various sources, as well as some redundancy removal. InterProScan is robust and scalable, able to perform on any machine from a netbook to a large cluster. However, when performing whole-genome or metagenome analysis, there is a need for a fast statistical visualization of the results in order to gain a good initial grasp of the functional potential of the sequences in the analyzed data set. This is especially important when analyzing and comparing metagenomic or metaproteomic datasets. IPRStats is a tool for the visualization of InterProScan results. InterProScan results are parsed from the InterProScan XML or EBIXML file into an SQLite or MySQL database. The results for each signature database scan are read and displayed as pie charts or bar charts as summary statistics. A table is also provided, in which each entry is a signature (e.g. a Pfam entry) accompanied by one or more Gene Ontology terms, if InterProScan was run using the Gene Ontology option. We present a platform-independent, open-source licensed tool that is useful for InterProScan users who wish to view a summary of their results in a rapid and concise fashion.
Jain, Tarun; Nowak, Richard; Hudson, Michael; Frisoli, Tiberio; Jacobsen, Gordon; McCord, James
2016-06-01
The HEART score is a risk-stratification tool that was developed and validated for patients evaluated for possible acute coronary syndrome (ACS) in the emergency department (ED). We sought to determine the short-term and long-term prognostic utility of the HEART score. A retrospective single-center analysis of 947 patients evaluated for possible ACS in the ED in 1999 was conducted. Patients were followed for major adverse cardiac events (MACEs) at 30 days: death, acute myocardial infarction, or revascularization procedure. All-cause mortality was assessed at 5 years. The HEART score was compared with the Thrombolysis in Myocardial Infarction (TIMI) score. At 30 days, 14% (135/947) of patients had an MACE: 48 deaths (5%), 84 acute myocardial infarctions (9%), and 48 (5%) revascularization procedures. The MACE rate was 0.6% (1/175, a revascularization procedure) in patients with a HEART score ≤3, 9.5% (53/557) in patients with a HEART score between 4 and 6, and 38% (81/215) in those with a HEART score ≥7. The C-statistic for predicting 30-day MACE was 0.82 for the HEART score and 0.68 for the TIMI score (P < 0.05). Patients with a HEART score ≤3 had a lower 5-year mortality rate than those with a TIMI score of 0 (10.6% vs. 20.5%, P = 0.02). The HEART score is a valuable risk-stratification tool for predicting not only short-term MACE but also long-term mortality in patients evaluated for possible ACS in the ED. The HEART score had superior prognostic value compared with the TIMI score.
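For reference, the HEART score sums five components (History, ECG, Age, Risk factors, Troponin), each scored 0-2, giving a total of 0-10; the sketch below encodes that arithmetic and the ≤3 / 4-6 / ≥7 banding used in the abstract. The example patient is invented.

    # Minimal sketch: HEART score arithmetic (five components, each 0-2) and the
    # low / intermediate / high bands used above; the example patient is invented.
    def heart_score(history: int, ecg: int, age: int, risk_factors: int, troponin: int) -> int:
        components = (history, ecg, age, risk_factors, troponin)
        assert all(0 <= c <= 2 for c in components), "each component is scored 0, 1 or 2"
        return sum(components)

    def heart_band(score: int) -> str:
        if score <= 3:
            return "low (<=3)"
        if score <= 6:
            return "intermediate (4-6)"
        return "high (>=7)"

    score = heart_score(history=1, ecg=0, age=2, risk_factors=1, troponin=0)
    print(score, heart_band(score))   # -> 4 intermediate (4-6)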