Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit
NASA Astrophysics Data System (ADS)
Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.
2013-12-01
Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as coronal mass ejections and high-speed solar wind streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited to space physics post-processing. Building on the work of the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The toolkit plugin currently provides readers for LFM and Enlil data sets, along with automated tools for data comparison against NASA's CDAWeb database. As work progresses, many additional tools will be added, and through open-source collaboration we hope to add readers for additional model types, as well as any further tools the scientific community deems necessary. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.
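Although the abstract does not show GHOST's interfaces, the general usage pattern for a ParaView reader plugin can be sketched. The plugin library path, reader module, and data file below are hypothetical placeholders; only the paraview.simple calls are standard ParaView API. This would run under ParaView's pvpython:

```python
# Minimal sketch of driving a ParaView reader plugin from pvpython.
# The plugin path and data file are placeholders, not GHOST's actual
# distribution names.
from paraview.simple import LoadPlugin, OpenDataFile, Show, Render

LoadPlugin('/path/to/libGHOST.so', remote=False, ns=globals())  # hypothetical
reader = OpenDataFile('lfm_run001.hdf')   # e.g., an LFM output file
Show(reader)
Render()
```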
Strategic thinking for radiology.
Schilling, R B
1997-08-01
We have now analyzed the use and benefits of four Strategic Thinking Tools for Radiology: the Vision Statement, the High Five, the Two-by-Two, and Real-Win-Worth. Additional tools will be provided during the tutorial. The tools provided above should be considered as examples. They all contain the 10 benefits outlined earlier to varying degrees. It is extremely important that the tools be used in a manner consistent with the Vision Statement of the organization. The specific situation, the effectiveness of the team, and the experience developed with the tools over time will determine the true benefits of the process. It has also been shown that with active use of the types of tools provided above, teams have learned to modify the tools for increased effectiveness and have created additional tools for specific purposes. Once individuals in the organization become committed to improving communication and to using tools/frameworks for solving problems as a team, effectiveness becomes boundless.
Evaluating the Utility of Web-Based Consumer Support Tools Using Rough Sets
NASA Astrophysics Data System (ADS)
Maciag, Timothy; Hepting, Daryl H.; Slezak, Dominik; Hilderman, Robert J.
On the Web, many popular e-commerce sites provide consumers with decision support tools to assist in their commerce-related decision-making, and many consumers rank the utility of these tools quite highly. Data obtained from web usage mining analyses, which may provide knowledge about a user's online experiences, could help indicate the utility of these tools. This type of analysis could provide insight into whether the provided tools adequately assist consumers in their online shopping activities or whether new or additional enhancements need consideration. Although some research in this regard has been described in previous literature, there is still much that can be done. The authors of this paper hypothesize that a measurement of consumer decision accuracy, i.e. a measurement of how closely consumers' choices match their stated preferences, could help indicate the utility of these tools. This paper describes a procedure developed towards this goal using elements of rough set theory. The authors evaluated the procedure using two support tools: one based on a tool developed by the US-EPA, and the other, called cogito, developed by one of the authors. Results from the evaluation provided interesting insights into the utility of both support tools. Although the cogito tool obtained slightly higher decision accuracy, both tools could be improved through additional enhancements. Details of the procedure and the results obtained from the evaluation are provided, and opportunities for future work are also discussed.
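The abstract does not reproduce the procedure itself, but the rough-set machinery it builds on is compact. Below is a minimal sketch of the accuracy-of-approximation measure, with invented session attributes standing in for the paper's actual web-usage features:

```python
# Rough-set accuracy of approximation on toy consumer-session data.
# Attributes and decisions are invented for illustration only.
from collections import defaultdict

# Each session: (condition attributes, decision about the final choice).
sessions = [
    (("few_filters", "short"), "good"),
    (("few_filters", "short"), "bad"),
    (("many_filters", "short"), "good"),
    (("many_filters", "long"), "good"),
]

# Partition sessions into equivalence classes over the condition attributes.
classes = defaultdict(set)
for i, (conds, _) in enumerate(sessions):
    classes[conds].add(i)

target = {i for i, (_, dec) in enumerate(sessions) if dec == "good"}

lower = set().union(*(c for c in classes.values() if c <= target))  # certain
upper = set().union(*(c for c in classes.values() if c & target))   # possible

# Accuracy of 1.0 would mean the decision is fully determined by the
# observed attributes; lower values indicate roughness (ambiguity).
print(len(lower) / len(upper))  # 0.5 for this toy data
```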
Sustainability Tools Inventory - Initial Gaps Analysis
This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4.
Implementation of GenePattern within the Stanford Microarray Database.
Hubble, Jeremy; Demeter, Janos; Jin, Heng; Mao, Maria; Nitzberg, Michael; Reddy, T B K; Wymore, Farrell; Zachariah, Zachariah K; Sherlock, Gavin; Ball, Catherine A
2009-01-01
Hundreds of researchers across the world use the Stanford Microarray Database (SMD; http://smd.stanford.edu/) to store, annotate, view, analyze and share microarray data. In addition to providing registered users at Stanford access to their own data, SMD also provides access to public data, and tools with which to analyze those data, to any public user anywhere in the world. Previously, the addition of new microarray data analysis tools to SMD had been limited by available engineering resources, and the existing suite of tools did not provide a simple way to design, execute and share analysis pipelines, or to document such pipelines for the purposes of publication. To address this, we have incorporated the GenePattern software package directly into SMD, providing access to many new analysis tools, as well as a plug-in architecture that allows users to directly integrate and share additional tools through SMD. In this article, we describe our implementation of the GenePattern microarray analysis software package into the SMD code base. This extension is available with the SMD source code, which is fully and freely available to others under an Open Source license, enabling other groups to create a local installation of SMD with an enriched data analysis capability.
Teaching Web Security Using Portable Virtual Labs
ERIC Educational Resources Information Center
Chen, Li-Chiou; Tao, Lixin
2012-01-01
We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…
The Legacy Archive for Microwave Background Data Analysis (LAMBDA)
NASA Astrophysics Data System (ADS)
Miller, Nathan; LAMBDA
2018-01-01
The Legacy Archive for Microwave Background Data Analysis (LAMBDA) provides CMB researchers with archival data for cosmology missions, software tools, and links to other sites of interest. LAMBDA is one-stop shopping for CMB researchers. It hosts data from WMAP along with many suborbital experiments. Over the past year, LAMBDA has acquired new data from SPTpol, SPIDER and ACTPol. In addition to the primary CMB, LAMBDA also provides foreground data. LAMBDA has several ongoing efforts to provide tools for CMB researchers. These tools include a web interface for CAMB and a web interface for a CMB survey footprint database and plotting tool. Additionally, we have recently developed a Docker container with standard CMB analysis tools and demonstrations in the form of Jupyter notebooks. These containers will be publicly available through Docker's container repository, and the source will be available on GitHub.
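The abstract does not enumerate the notebooks' contents; as an indication of what a CAMB demonstration might contain, here is a short sketch using the camb Python package (an assumption on our part, since the container's toolset is not listed):

```python
# Compute a lensed CMB temperature power spectrum with the camb package.
# Cosmological parameter values are illustrative, not a fit to any data.
import camb

pars = camb.CAMBparams()
pars.set_cosmology(H0=67.5, ombh2=0.022, omch2=0.122)
pars.InitPower.set_params(ns=0.965)
pars.set_for_lmax(2500)

results = camb.get_results(pars)
powers = results.get_cmb_power_spectra(pars, CMB_unit='muK')
cl_tt = powers['total'][:, 0]   # lensed TT spectrum, D_ell in muK^2
print(cl_tt[2:10])
```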
Exploring Learner's Patterns of Using the Online Course Tool in University Classes
ERIC Educational Resources Information Center
Yamamoto, Yoshihiko; Usami, Akinori
2015-01-01
Online course tools such as WebCT or Manaba+R are popularly used in university classes and enhance learners' understanding of their course contents. In addition, teachers try to utilize these online course tools for their students such as giving their students online discussions, providing students with additional materials and so forth. However,…
Thanki, Anil S; Soranzo, Nicola; Haerty, Wilfried; Davey, Robert P
2018-03-01
Gene duplication is a major factor contributing to evolutionary novelty, and the contraction or expansion of gene families has often been associated with morphological, physiological, and environmental adaptations. The study of homologous genes helps us to understand the evolution of gene families. It plays a vital role in finding ancestral gene duplication events as well as in identifying genes that have diverged from a common ancestor under positive selection. Various tools are available, such as MSOAR, OrthoMCL, and HomoloGene, to identify gene families and visualize syntenic information between species, providing an overview of the evolution of syntenic regions at the family level. Unfortunately, none of them provide information about structural changes within genes, such as the conservation of ancestral exon boundaries among multiple genomes. The Ensembl GeneTrees computational pipeline generates gene trees based on coding sequences, provides details about exon conservation, and is used in the Ensembl Compara project to discover gene families. A certain amount of expertise is required to configure and run the Ensembl Compara GeneTrees pipeline via the command line. Therefore, we converted this pipeline into a Galaxy workflow, called GeneSeqToFamily, and provided additional functionality. This workflow uses existing tools from the Galaxy ToolShed, as well as additional wrappers and tools that are required to run the workflow. GeneSeqToFamily represents the Ensembl GeneTrees pipeline as a set of interconnected Galaxy tools, so they can be run interactively within Galaxy's user-friendly workflow environment while still providing the flexibility to tailor the analysis by changing configurations and tools if necessary. Additional tools allow users to subsequently visualize the gene families produced by the workflow, using the Aequatus.js interactive tool, which was developed as part of the Aequatus software project.
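Galaxy workflows such as GeneSeqToFamily can also be driven programmatically. The following sketch uses the BioBlend client library (not mentioned in the paper); the server URL, API key, file name, and input wiring are placeholders, and the workflow's real input structure should be taken from its own documentation:

```python
# Hypothetical sketch: import and invoke a Galaxy workflow via BioBlend.
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="https://usegalaxy.example.org", key="YOUR_API_KEY")

wf = gi.workflows.import_workflow_from_local_path("GeneSeqToFamily.ga")
history = gi.histories.create_history(name="gene-families")

# Upload coding sequences and wire them to the workflow's first input.
upload = gi.tools.upload_file("cds_sequences.fasta", history["id"])
dataset_id = upload["outputs"][0]["id"]
inputs = {"0": {"src": "hda", "id": dataset_id}}  # input step "0" assumed

gi.workflows.invoke_workflow(wf["id"], inputs=inputs, history_id=history["id"])
```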
NASA Technical Reports Server (NTRS)
Voellmer, George M.
1992-01-01
Mechanism enables robot to change tools on end of arm. Actuated by motion of robot: requires no additional electrical or pneumatic energy to make or break connection between tool and wrist at end of arm. Includes three basic subassemblies: wrist interface plate attached to robot arm at wrist, tool interface plate attached to tool, and holster. Separate tool interface plate and holster provided for each tool robot uses.
Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology
Latendresse, Mario; Paley, Suzanne M.; Krummenacker, Markus; Ong, Quang D.; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M.; Caspi, Ron
2016-01-01
Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms. PMID:26454094
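As one concrete illustration of the Python API mentioned above, the PythonCyc bindings query a locally running Pathway Tools server (started with its -python option). The organism identifier and slot names below follow PythonCyc/GFP conventions but should be treated as assumptions rather than documented behavior:

```python
# Sketch of querying a Pathway/Genome Database through PythonCyc.
import pythoncyc

ecoli = pythoncyc.select_organism('ecoli')   # EcoCyc PGDB
pathways = ecoli.all_pathways()              # pathway frame IDs
print(len(pathways), 'pathways in EcoCyc')

# GFP-style slot access on a frame (here, the L-tryptophan compound).
print(ecoli.get_slot_value('TRP', 'common-name'))
```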
78 FR 52927 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-27
... TeamSTEPPS® (aka Team Strategies and Tools for Enhancing Performance and Patient Safety) to provide an evidence-based suite of tools and strategies for training teamwork-based patient safety to... strategies provided in the program in action. In addition to developing Master Trainers, AHRQ has also...
ERIC Educational Resources Information Center
Greiner, Keith
2007-01-01
This is a one-page summary of work-study assistance as an academic tool for college and university students. The summary includes references to on-line resource documents that provide additional details.
Spectrum simulation in DTSA-II.
Ritchie, Nicholas W M
2009-10-01
Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help in understanding the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides easy-to-use yet flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.
GAC: Gene Associations with Clinical, a web based application.
Zhang, Xinyan; Rupji, Manali; Kowalski, Jeanne
2017-01-01
We present GAC, an R Shiny-based tool for interactive visualization of clinical associations based on high-dimensional data. The tool provides a web-based suite to perform supervised principal component analysis (SuperPC), an approach that combines high-dimensional data, such as gene expression, with clinical data to infer clinical associations. We extended the approach to address binary outcomes, in addition to continuous and time-to-event data, in our package, thereby increasing the use and flexibility of SuperPC. Additionally, the tool provides an interactive visualization for summarizing results based on a forest plot for both binary and time-to-event data. In summary, the GAC suite of tools provides a one-stop shop for conducting statistical analyses to identify and visualize the association between a clinical outcome of interest and high-dimensional data types, such as genomic data. Our GAC package has been implemented in R and is available via http://shinygispa.winship.emory.edu/GAC/. The developmental repository is available at https://github.com/manalirupji/GAC.
Data Association Algorithms for Tracking Satellites
2013-03-27
validation of the new tools. The description provided here includes the mathematical background and description of the models implemented, as well as a...simulation development. This work includes the addition of higher-fidelity models in CU-TurboProp and validation of the new tools. The description...ode45(), used in Ananke, and (3) provide the necessary inputs to the bidirectional reflectance distribution function (BRDF) model provided by Pacific
Simulation Tools for Power Electronics Courses Based on Java Technologies
ERIC Educational Resources Information Center
Canesin, Carlos A.; Goncalves, Flavio A. S.; Sampaio, Leonardo P.
2010-01-01
This paper presents interactive power electronics educational tools. These interactive tools make use of the benefits of Java language to provide a dynamic and interactive approach to simulating steady-state ideal rectifiers (uncontrolled and controlled; single-phase and three-phase). Additionally, this paper discusses the development and use of…
Bolstering Teaching through Online Tools
ERIC Educational Resources Information Center
Singh, Anil; Mangalaraj, George; Taneja, Aakash
2010-01-01
This paper offers a compilation of technologies that provides either free or low-cost solutions to the challenges of teaching online courses. It presents various teaching methods the outlined tools and technologies can support, with emphasis on fit between these tools and the tasks they are meant to serve. In addition, it highlights various…
Complex ambulatory settings demand scheduling systems.
Ross, K M
1998-01-01
Practice management systems are becoming more and more complex as they are asked to integrate all aspects of patient and resource management. Although patient scheduling is a standard expectation in any ambulatory environment, facility and equipment resource scheduling are additional functionalities of scheduling systems. Because these functions were not typically managed in manual patient scheduling, the result was often resource mismanagement, along with a potential negative impact on utilization, patient flow and provider productivity. As ambulatory organizations have become more seasoned users of practice management software, the value of resource scheduling has become apparent. Appointment scheduling within a fully integrated practice management system is recognized as an enhancement of scheduling itself and, as one component of patient information management, provides additional tools to manage other information needs.
Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology.
Karp, Peter D; Latendresse, Mario; Paley, Suzanne M; Krummenacker, Markus; Ong, Quang D; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M; Caspi, Ron
2016-09-01
Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Sean; Dewan, Leslie; Massie, Mark
This report presents results from a collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear (GAIN) Nuclear Energy Voucher program. The TAP concept is a molten salt reactor using configurable zirconium hydride moderator rod assemblies to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. The implementation of continuous-energy Monte Carlo transport and depletion tools in ChemTriton provides for full-core three-dimensional modeling and simulation. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this concept. Additional analyses of mass feed rates and enrichments, isotopic removals, tritium generation, core power distribution, core vessel helium generation, moderator rod heat deposition, and reactivity coefficients provide additional information to make informed design decisions. This work demonstrates capabilities of ORNL modeling and simulation tools for neutronic and fuel cycle analysis of molten salt reactor concepts.
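The critical moderator-to-fuel ratio search described above can be pictured as a one-dimensional root find on k_eff. The sketch below is purely illustrative: keff_of() is a toy surrogate, not a transport calculation, and ChemTriton's actual search logic is not given in this abstract:

```python
# Bisection search for the moderator-to-fuel ratio at which k_eff = 1.
def keff_of(mod_to_fuel_ratio):
    # Toy surrogate model, monotonically increasing in the ratio.
    return 0.9 + 0.25 * mod_to_fuel_ratio ** 0.5

def critical_search(lo=0.0, hi=2.0, tol=1e-6):
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if keff_of(mid) < 1.0:   # undermoderated in this toy model
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"critical moderator-to-fuel ratio ~ {critical_search():.4f}")  # ~0.16
```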
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-21
... provides an online reporting tool to support the annual HSIP reporting process. Additional information is... Reporting into the online reporting tool meets all report requirements and USDOT Web site compatibility...
NASA Technical Reports Server (NTRS)
Burns, K. Lee; Altino, Karen
2008-01-01
The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites, to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool, and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun for a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the existing tool.
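In its simplest form, the climatological exceedance calculation at APRA's core reduces to an empirical probability. Below is a toy sketch with a synthetic wind climatology and an invented constraint limit; the real tool also resolves seasonal and diurnal variability:

```python
# Toy climatological exceedance / launch-availability calculation.
import numpy as np

rng = np.random.default_rng(1)
winds_kts = rng.weibull(2.0, size=20_000) * 12.0  # synthetic peak winds

limit_kts = 30.0                                  # invented vehicle constraint
p_exceed = np.mean(winds_kts > limit_kts)         # empirical exceedance
print(f"P(exceed {limit_kts:.0f} kt) = {p_exceed:.3%}")
print(f"launch availability = {1 - p_exceed:.1%}")
```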
CHRONIOUS: a wearable platform for monitoring and management of patients with chronic disease.
Bellos, Christos; Papadopoulos, Athanassios; Rosso, Roberto; Fotiadis, Dimitrios I
2011-01-01
The CHRONIOUS system has been developed on an open architecture design consisting of a set of subsystems that interact to provide all needed services to chronic disease patients. An advanced multi-parametric expert system is being implemented that effectively fuses information from various sources using intelligent techniques. Data are collected by the sensors of a body network monitoring vital signs, while additional tools record dietary habits and plans, drug intake, environmental and biochemical parameters, and activity data. The CHRONIOUS platform provides guidelines and standards for future generations of "chronic disease management systems" and facilitates sophisticated monitoring tools. In addition, an ontological information retrieval system is being delivered to satisfy the need for up-to-date clinical information on Chronic Obstructive Pulmonary Disease (COPD) and Chronic Kidney Disease (CKD). Moreover, support tools are being embedded in the system, such as the Mental Tools for monitoring patient mental health status. The integrated platform provides real-time patient monitoring and supervision, both indoors and outdoors, and represents a generic platform for the management of various chronic diseases.
GAC: Gene Associations with Clinical, a web based application
Zhang, Xinyan; Rupji, Manali; Kowalski, Jeanne
2018-01-01
We present GAC, an R Shiny-based tool for interactive visualization of clinical associations based on high-dimensional data. The tool provides a web-based suite to perform supervised principal component analysis (SuperPC), an approach that combines high-dimensional data, such as gene expression, with clinical data to infer clinical associations. We extended the approach to address binary outcomes, in addition to continuous and time-to-event data, in our package, thereby increasing the use and flexibility of SuperPC. Additionally, the tool provides an interactive visualization for summarizing results based on a forest plot for both binary and time-to-event data. In summary, the GAC suite of tools provides a one-stop shop for conducting statistical analyses to identify and visualize the association between a clinical outcome of interest and high-dimensional data types, such as genomic data. Our GAC package has been implemented in R and is available via http://shinygispa.winship.emory.edu/GAC/. The developmental repository is available at https://github.com/manalirupji/GAC. PMID:29263780
2015-03-01
...have also established an Advanced ReTOOL program, which will provide additional summer training experiences and post-baccalaureate opportunities to...address cultural appropriateness of research conceptualization, design and implementation; (3) provide culturally appropriate cancer prevention, screening
The Durham Adaptive Optics Simulation Platform (DASP): Current status
NASA Astrophysics Data System (ADS)
Basden, A. G.; Bharmal, N. A.; Jenkins, D.; Morris, T. J.; Osborn, J.; Peng, J.; Staykov, L.
2018-01-01
The Durham Adaptive Optics Simulation Platform (DASP) is a Monte-Carlo modelling tool used for the simulation of astronomical and solar adaptive optics systems. In recent years, this tool has been used to predict the expected performance of the forthcoming extremely large telescope adaptive optics systems, and has seen the addition of several modules with new features, including Fresnel optics propagation and extended object wavefront sensing. Here, we provide an overview of the features of DASP and the situations in which it can be used. Additionally, the user tools for configuration and control are described.
Bennett, Hunter; Davison, Kade; Arnold, John; Slattery, Flynn; Martin, Max; Norton, Kevin
2017-10-01
Multicomponent movement assessment tools have become commonplace for measuring movement quality, with the stated aims of indicating injury risk and performance capabilities. Despite popular use, there has been no attempt to compare the components of each tool reported in the literature, the processes by which they were developed, or the underpinning rationale for their included content. As such, the objective of this systematic review was to provide a comprehensive summary of current movement assessment tools and appraise the evidence supporting their development. A systematic literature search was performed using PRISMA guidelines to identify multicomponent movement assessment tools. Commonalities between tools and the evidence provided to support the content of each tool were identified. Each tool underwent critical appraisal to identify the rigor with which it was developed and its applicability to professional practice. Eleven tools were identified, of which 5 provided evidence to support their content as assessments of movement quality. One assessment tool (Soccer Injury Movement Screen [SIMS]) received an overall score of above 65% on critical appraisal, with a further 2 tools (Movement Competency Screen [MCS] and modified 4 movement screen [M4-MS]) scoring above 60%. Only the MCS provided clear justification for its developmental process. The remaining 8 tools scored between 40 and 60%. On appraisal, the MCS, M4-MS, and SIMS seem to provide the most practical value for assessing movement quality, as they provide the strongest reports of developmental rigor and an identifiable evidence base. In addition, considering the evidence provided, these tools may have the strongest potential for identifying performance capabilities and guiding exercise prescription in athletic and sport-specific populations.
76 FR 32331 - Preliminary Plan for Retrospective Review of Existing Regulations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-06
... review. In addition, DHS launched an IdeaScale Web page; this social media tool provided an additional... agency's regulatory program more effective or less burdensome in achieving its regulatory objectives. DHS...
Pupils, Tools and the Zone of Proximal Development
ERIC Educational Resources Information Center
Abtahi, Yasmine
2018-01-01
In this article, I use the Vygotskian concept of the Zone of Proximal Development (ZPD) to examine the learning experience of two grade seven pupils as they attempted to solve an addition of fractions problem using fraction strips. The aim is to highlight how tools can facilitate the enactment of a ZPD, within which the tool provides the guidance.…
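The specific problem the pupils attempted is not given in the abstract; an illustrative fraction-strip task of the kind described, in which the strips make the common denominator visible as equal subdivisions of the whole, is:

```latex
\frac{1}{2} + \frac{1}{3} = \frac{3}{6} + \frac{2}{6} = \frac{5}{6}
```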
European solvent industry group generic exposure scenario risk and exposure tool
Zaleski, Rosemary T; Qian, Hua; Zelenka, Michael P; George-Ares, Anita; Money, Chris
2014-01-01
The European Solvents Industry Group (ESIG) Generic Exposure Scenario (GES) Risk and Exposure Tool (EGRET) was developed to facilitate the safety evaluation of consumer uses of solvents, as required by the European Union Registration, Evaluation and Authorization of Chemicals (REACH) Regulation. This exposure-based risk assessment tool provides estimates of both exposure and risk characterization ratios for consumer uses. It builds upon the consumer portion of the European Center for Ecotoxicology and Toxicology of Chemicals (ECETOC) Targeted Risk Assessment (TRA) tool by implementing refinements described in ECETOC TR107. Technical enhancements included the use of additional data to refine scenario defaults and the ability to include additional parameters in exposure calculations. Scenarios were also added to cover all frequently encountered consumer uses of solvents. The TRA tool structure was modified to automatically determine conditions necessary for safe use. EGRET reports results using specific standard phrases in a format consistent with REACH exposure scenario guidance, in order that the outputs can be readily assimilated within safety data sheets and similar information technology systems. Evaluation of tool predictions for a range of commonly encountered consumer uses of solvents found it provides reasonable yet still conservative exposure estimates. PMID:23361440
European solvent industry group generic exposure scenario risk and exposure tool.
Zaleski, Rosemary T; Qian, Hua; Zelenka, Michael P; George-Ares, Anita; Money, Chris
2014-01-01
The European Solvents Industry Group (ESIG) Generic Exposure Scenario (GES) Risk and Exposure Tool (EGRET) was developed to facilitate the safety evaluation of consumer uses of solvents, as required by the European Union Registration, Evaluation and Authorization of Chemicals (REACH) Regulation. This exposure-based risk assessment tool provides estimates of both exposure and risk characterization ratios for consumer uses. It builds upon the consumer portion of the European Center for Ecotoxicology and Toxicology of Chemicals (ECETOC) Targeted Risk Assessment (TRA) tool by implementing refinements described in ECETOC TR107. Technical enhancements included the use of additional data to refine scenario defaults and the ability to include additional parameters in exposure calculations. Scenarios were also added to cover all frequently encountered consumer uses of solvents. The TRA tool structure was modified to automatically determine conditions necessary for safe use. EGRET reports results using specific standard phrases in a format consistent with REACH exposure scenario guidance, in order that the outputs can be readily assimilated within safety data sheets and similar information technology systems. Evaluation of tool predictions for a range of commonly encountered consumer uses of solvents found it provides reasonable yet still conservative exposure estimates.
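To make the risk characterization ratio concrete, here is a toy calculation in the spirit of the tool: a generic well-mixed-room inhalation estimate with invented numbers, not EGRET's published algorithm:

```python
# Toy consumer inhalation exposure and risk characterization ratio (RCR).
product_amount_g = 10.0    # product used per event (invented)
weight_fraction = 0.10     # solvent content of the product (invented)
release_fraction = 1.0     # assume all solvent evaporates
room_volume_m3 = 20.0      # standard room
dnel_mg_m3 = 150.0         # hypothetical long-term inhalation DNEL

# Event airborne concentration, assuming instant mixing and no ventilation.
exposure_mg_m3 = (product_amount_g * 1000 * weight_fraction
                  * release_fraction / room_volume_m3)

rcr = exposure_mg_m3 / dnel_mg_m3
print(f"exposure = {exposure_mg_m3:.0f} mg/m3, RCR = {rcr:.2f}")
# RCR < 1 indicates conditions of safe use under these assumptions.
```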
EJSCREEN Data--2015 Public Release
EJSCREEN is an environmental justice (EJ) screening and mapping tool that provides EPA with a nationally consistent dataset and methodology for calculating EJ indexes, which can be used for highlighting places that may be candidates for further review, analysis, or outreach as the agency develops programs, policies and other activities. The tool provides both summary and detailed information at the Census block group level or a user-defined area for both demographic and environmental indicators. The summary information is in the form of EJ Indexes which combine demographic information with a single environmental indicator (such as proximity to traffic) that can help identify communities living in areas with greater potential for environmental and health impacts. The tool also provides additional detailed demographic and environmental information to supplement screening analyses. EJSCREEN displays this information in color-coded maps, bar charts, and standard reports. Users should keep in mind that screening tools are subject to substantial uncertainty in their demographic and environmental data, particularly when looking at small geographic areas, such as Census block groups. Data on the full range of environmental impacts and demographic factors in any given location are almost certainly not available directly through this tool, and its initial results should be supplemented with additional information and local knowledge before making any judgments about potential EJ concerns.
Automating Expertise in Collaborative Learning Environments
ERIC Educational Resources Information Center
LaVoie, Noelle; Streeter, Lynn; Lochbaum, Karen; Wroblewski, David; Boyce, Lisa; Krupnick, Charles; Psotka, Joseph
2010-01-01
We have developed a set of tools for improving online collaborative learning including an automated expert that monitors and moderates discussions, and additional tools to evaluate contributions, semantically search all posted comments, access a library of hundreds of digital books and provide reports to instructors. The technology behind these…
VizieR Online Data Catalog: James Clerk Maxwell Telescope Science Archive (CADC, 2003)
NASA Astrophysics Data System (ADS)
Canadian Astronomy Data Centre
2018-01-01
The JCMT Science Archive (JSA), a collaboration between the CADC and EOA, is the official distribution site for observational data obtained with the James Clerk Maxwell Telescope (JCMT) on Mauna Kea, Hawaii. The JSA search interface is provided by the CADC Search tool, which provides generic access to the complete set of telescopic data archived at the CADC. Help on the use of this tool is provided via tooltips. For additional information on instrument capabilities and data reduction, please consult the SCUBA-2 and ACSIS instrument pages provided on the JAC-maintained JCMT pages. JCMT-specific help related to the use of the CADC AdvancedSearch tool is available from the JAC. (1 data file).
New additions to the cancer precision medicine toolkit.
Mardis, Elaine R
2018-04-13
New computational and database-driven tools are emerging to aid in the interpretation of cancer genomic data as its use becomes more common in clinical evidence-based cancer medicine. Two such open source tools, published recently in Genome Medicine, provide important advances to address the clinical cancer genomics data interpretation bottleneck.
Semantic-Aware Components and Services of ActiveMath
ERIC Educational Resources Information Center
Melis, Erica; Goguadze, Giorgi; Homik, Martin; Libbrecht, Paul; Ullrich, Carsten; Winterstein, Stefan
2006-01-01
ActiveMath is a complex web-based adaptive learning environment with a number of components and interactive learning tools. The basis for handling semantics of learning content is provided by its semantic (mathematics) content markup, which is additionally annotated with educational metadata. Several components, tools and external services can…
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.
1992-01-01
The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand alone analysis codes that result in the streamlining of the exchange of data between programs, reducing errors and improving efficiency.
Sohl, Stephanie Jean; Birdee, Gurjeet; Elam, Roy
2016-11-01
Improving health behaviors is fundamental to preventing and controlling chronic disease. Healthcare providers who have a patient-centered communication style and appropriate behavioral change tools can empower patients to engage in and sustain healthy behaviors. This review highlights motivational interviewing and mindfulness along with other evidence-based strategies for enhancing patient-centered communication and the behavior change process. Motivational interviewing and mindfulness are especially useful for empowering patients to set self-determined, or autonomous, goals for behavior change. This is important because autonomously motivated behavioral change is more sustainable. Additional strategies such as self-monitoring are discussed as useful for supporting the implementation and maintenance of goals. Thus, there is a need for healthcare providers to develop such tools to empower sustained behavior change. The additional support of a new role, a health coach who specializes in facilitating the process of health-related behavior change, may be required to substantially impact public health.
Sohl, Stephanie Jean; Birdee, Gurjeet; Elam, Roy
2015-01-01
Improving health behaviors is fundamental to preventing and controlling chronic disease. Healthcare providers who have a patient-centered communication style and appropriate behavioral change tools can empower patients to engage in and sustain healthy behaviors. This review highlights motivational interviewing and mindfulness along with other evidence-based strategies for enhancing patient-centered communication and the behavior change process. Motivational interviewing and mindfulness are especially useful for empowering patients to set self-determined, or autonomous, goals for behavior change. This is important because autonomously motivated behavioral change is more sustainable. Additional strategies such as self-monitoring are discussed as useful for supporting the implementation and maintenance of goals. Thus, there is a need for healthcare providers to develop such tools to empower sustained behavior change. The additional support of a new role, a health coach who specializes in facilitating the process of health-related behavior change, may be required to substantially impact public health. PMID:28239308
An Intuitive Dashboard for Bayesian Network Inference
NASA Astrophysics Data System (ADS)
Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.
2014-03-01
Current Bayesian network software packages provide good graphical interfaces for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling end-users to easily perform inferences over Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on cause-and-effect relationships, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the Qt and SMILE libraries in C++.
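For readers who want to experiment with the cause-and-effect style of inference the dashboard exposes, here is a minimal example using pgmpy, a different library from the Qt/SMILE stack named in the abstract; the network and probabilities are invented:

```python
# Two-node cause -> effect Bayesian network with a simple query.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("Rain", "TrafficJam")])  # cause -> effect
model.add_cpds(
    TabularCPD("Rain", 2, [[0.8], [0.2]]),
    TabularCPD("TrafficJam", 2,
               [[0.9, 0.4],    # P(no jam | no rain), P(no jam | rain)
                [0.1, 0.6]],
               evidence=["Rain"], evidence_card=[2]),
)
model.check_model()

infer = VariableElimination(model)
print(infer.query(["TrafficJam"], evidence={"Rain": 1}))
```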
Alpha, Tau Rho; Diggles, Michael F.
1998-01-01
This CD-ROM contains 17 teaching tools: 16 interactive HyperCard 'stacks' and a printable model. They are separated into the following categories: Geologic Processes, Earthquakes and Faulting, and Map Projections and Globes. A 'navigation' stack, Earth Science, is provided as a 'launching' place from which to access all of the other stacks. You can also open the HyperCard Stacks folder and launch any of the 16 stacks yourself. In addition, a 17th tool, Earth and Tectonic Globes, is provided as a printable document. Each of the tools can be copied onto a 1.4-MB floppy disk and distributed freely.
Diagnostic tools in Rhinology EAACI position paper
2011-01-01
This EAACI Task Force document aims to provide readers with a comprehensive and complete overview of the currently available tools for the diagnosis of nasal and sino-nasal disease. We have tried to logically order the important issues related to history taking, clinical examination and additional investigative tools for evaluating the severity of sinonasal disease into a consensus document. A panel of European experts in the field of Rhinology has contributed to this consensus document on Diagnostic Tools in Rhinology. PMID:22410181
ERIC Educational Resources Information Center
Texas State Technical Coll., Waco.
This volume developed by the Machine Tool Advanced Skill Technology (MAST) program contains key administrative documents and provides additional sources for machine tool and precision manufacturing information and important points of contact in the industry. The document contains the following sections: a foreword; grant award letter; timeline for…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Konovalenko, Ivan S., E-mail: ivkon@ispms.tsc.ru; Konovalenko, Igor S., E-mail: igkon@ispms.tsc.ru; Kolubaev, Evgeniy A., E-mail: eak@ispms.tsc.ru
2015-10-27
A molecular dynamics model was constructed to describe material loading on the atomic scale in a mode identical to friction stir welding. It was shown that additional vibration applied to the tool during loading provides the specified intensity values and continuous thermomechanical action during welding. An increase in the intensity of the additional vibration increases both the force acting on the workpiece from the rotating tool and the temperature within the welded area.
Learning with LOGO: Logo and Vectors.
ERIC Educational Resources Information Center
Lough, Tom; Tipps, Steve
1986-01-01
This is the first of a two-part series on the general concept of vector space. Provides tool procedures to allow investigation of vector properties, vector addition and subtraction, and X and Y components. Lists several sources of additional vector ideas. (JM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rack, Frank; Storms, Michael; Schroeder, Derryl
The primary accomplishments of the JOI Cooperative Agreement with DOE/NETL in this quarter were (1) the preliminary postcruise evaluation of the tools and measurement systems that were used during ODP Leg 204 to study hydrate deposits on Hydrate Ridge, offshore Oregon from July through September 2002; and (2) the preliminary study of the hydrate-bearing core samples preserved in pressure vessels and in liquid nitrogen cryofreezers, which are now stored at the ODP Gulf Coast Repository in College Station, TX. During ODP Leg 204, several newly modified downhole tools were deployed to better characterize the subsurface lithologies and environments hosting microbial populations and gas hydrates. A preliminary review of the use of these tools is provided herein. The DVTP, DVTP-P, APC-methane, and APC-Temperature tools (ODP memory tools) were used extensively and successfully during ODP Leg 204 aboard the D/V JOIDES Resolution. These systems provided a strong operational capability for characterizing the in situ properties of methane hydrates in subsurface environments on Hydrate Ridge during ODP Leg 204. Pressure was also measured during a trial run of the Fugro piezoprobe, which operates on similar principles as the DVTP-P. The final report describing the deployments of the Fugro Piezoprobe is provided in Appendix A of this report. A preliminary analysis and comparison between the piezoprobe and DVTP-P tools is provided in Appendix B of this report. Finally, a series of additional holes were cored at the crest of Hydrate Ridge (Site 1249) specifically geared toward the rapid recovery and preservation of hydrate samples as part of a hydrate geriatric study partially funded by the Department of Energy (DOE). In addition, the preliminary results from gamma density non-invasive imaging of the cores preserved in pressure vessels are provided in Appendix C of this report. An initial visual inspection of the samples stored in liquid nitrogen is provided in Appendix D of this report.
Two-Dimensional Neutronic and Fuel Cycle Analysis of the Transatomic Power Molten Salt Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Betzler, Benjamin R.; Powers, Jeffrey J.; Worrall, Andrew
2017-01-15
This status report presents the results from the first phase of the collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear, Nuclear Energy Voucher program. The TAP design is a molten salt reactor using movable moderator rods to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this design. Additional analyses of time step sizes, mass feed rates and enrichments, and isotopic removals provide additional information to make informed design decisions. This work further demonstrates capabilities of ORNL modeling and simulation tools for analysis of molten salt reactor designs and strongly positions this effort for the upcoming three-dimensional core analysis.
Ekirapa-Kiracho, Elizabeth; Ghosh, Upasona; Brahmachari, Rittika; Paina, Ligia
2017-12-28
Effective stakeholder engagement in research and implementation is important for improving the development and implementation of policies and programmes. A variety of tools has been employed for stakeholder engagement. In this paper, we discuss two participatory methods for engaging with stakeholders - participatory social network analysis (PSNA) and participatory impact pathways analysis (PIPA). Based on our experience, we derive lessons about when and how to apply these tools. This paper was informed by a review of project reports and documents in addition to reflection meetings with the researchers who applied the tools. These reports were synthesised and used to make thick descriptions of the applications of the methods while highlighting key lessons. PSNA and PIPA both allowed a deep understanding of how the system actors are interconnected and how they influence maternal health and maternal healthcare services. The findings from the PSNA provided guidance on how the stakeholders of a health system are interconnected and on how to stimulate more positive interaction between them by exposing existing gaps. The PIPA meeting enabled the participants to envision how they could expand their networks and resources by thinking through the contributions that they could make to the project. The processes considered critical for successful application of the tools and achievement of outcomes included training of facilitators, the language used during facilitation, the number of times the tool is applied, the length of the tools, pretesting of the tools, and use of quantitative and qualitative methods. Whereas both tools allowed the identification of stakeholders and provided a deeper understanding of the type of networks and dynamics within the network, PIPA had a higher potential for promoting collaboration between stakeholders, likely because it allowed interaction between them. Additionally, it was implemented within a participatory action research project. PIPA also allowed participatory evaluation of the project from the perspective of the community. This paper provides lessons about the use of these participatory tools.
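As an analytic complement to the paper-based PSNA exercise, the connectivity insights it produces can be mimicked computationally. The sketch below uses networkx with invented stakeholders; the actual method described in the paper was participatory, not code-driven:

```python
# Degree centrality on a toy stakeholder network.
import networkx as nx

g = nx.Graph()
g.add_edges_from([
    ("village health team", "midwife"),
    ("village health team", "mother"),
    ("midwife", "district health office"),
    ("mother", "traditional birth attendant"),
    ("district health office", "NGO programme"),
])

# High-centrality actors hold the network together; low scores and missing
# edges expose the gaps where new interaction could be stimulated.
for actor, score in sorted(nx.degree_centrality(g).items(),
                           key=lambda kv: -kv[1]):
    print(f"{score:.2f}  {actor}")
```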
A Tool for Estimating Variability in Wood Preservative Treatment Retention
Patricia K. Lebow; Adam M. Taylor; Timothy M. Young
2015-01-01
Composite sampling is standard practice for evaluation of preservative retention levels in preservative-treated wood. Current protocols provide an average retention value but no estimate of uncertainty. Here we describe a statistical method for calculating uncertainty estimates using the standard sampling regime with minimal additional chemical analysis. This tool can...
Benchmarking: A Study of School and School District Effect and Efficiency.
ERIC Educational Resources Information Center
Swanson, Austin D.; Engert, Frank
The "New York State School Report Card" provides a vehicle for benchmarking with respect to student achievement. In this study, additional tools were developed for making external comparisons with respect to achievement, and tools were added for assessing fiscal policy and efficiency. Data from school years 1993-94 through 1995-96 were…
A survey of tools and resources for the next generation analyst
NASA Astrophysics Data System (ADS)
Hall, David L.; Graham, Jake; Catherman, Emily
2015-05-01
We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.
Interpolator for numerically controlled machine tools
Bowers, Gary L.; Davenport, Clyde M.; Stephens, Albert E.
1976-01-01
A digital differential analyzer circuit is provided that depending on the embodiment chosen can carry out linear, parabolic, circular or cubic interpolation. In the embodiment for parabolic interpolations, the circuit provides pulse trains for the X and Y slide motors of a two-axis machine to effect tool motion along a parabolic path. The pulse trains are generated by the circuit in such a way that parabolic tool motion is obtained from information contained in only one block of binary input data. A part contour may be approximated by one or more parabolic arcs. Acceleration and initial velocity values from a data block are set in fixed bit size registers for each axis separately but simultaneously and the values are integrated to obtain the movement along the respective axis as a function of time. Integration is performed by continual addition at a specified rate of an integrand value stored in one register to the remainder temporarily stored in another identical size register. Overflows from the addition process are indicative of the integral. The overflow output pulses from the second integration may be applied to motors which position the respective machine slides according to a parabolic motion in time to produce a parabolic machine tool motion in space. An additional register for each axis is provided in the circuit to allow "floating" of the radix points of the integrand registers and the velocity increment to improve position accuracy and to reduce errors encountered when the acceleration integrand magnitudes are small when compared to the velocity integrands. A divider circuit is provided in the output of the circuit to smooth the output pulse spacing and prevent motor stall, because the overflow pulses produced in the binary addition process are spaced unevenly in time. The divider has the effect of passing only every nth motor drive pulse, with n being specifiable. The circuit inputs (integrands, rates, etc.) are scaled to give exactly n times the desired number of pulses out, in order to compensate for the divider.
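The two-stage integration scheme is easy to emulate in software. The following sketch mirrors the register/overflow mechanism described above; the register width, step count, and integrand values are assumptions for the example, and the output-smoothing divider is omitted:

```python
# Two-stage digital differential analyzer (DDA) for parabolic motion.
WIDTH = 16
MOD = 1 << WIDTH          # register capacity; overflow past this is a pulse

def dda_parabola(accel, vel0, steps):
    """Integrate acceleration twice; return the drive-pulse train for one axis."""
    vel_reg = 0           # remainder register, first integration stage
    pos_reg = 0           # remainder register, second integration stage
    vel = vel0            # velocity integrand, seeded from the data block
    pulses = []
    for _ in range(steps):
        vel_reg += accel          # continual addition of the integrand
        if vel_reg >= MOD:        # stage-1 overflow increments the velocity
            vel_reg -= MOD
            vel += 1
        pos_reg += vel
        pulse = pos_reg >= MOD    # stage-2 overflow is a motor drive pulse
        if pulse:
            pos_reg -= MOD
        pulses.append(pulse)
    return pulses

x_pulses = dda_parabola(accel=40, vel0=300, steps=100_000)
print(sum(x_pulses), "X-axis drive pulses")
```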
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
NASA Access Mechanism - Graphical user interface information retrieval system
NASA Technical Reports Server (NTRS)
Hunter, Judy F.; Generous, Curtis; Duncan, Denise
1993-01-01
Access to online information sources of aerospace, scientific, and engineering data, a mission focus for NASA's Scientific and Technical Information Program, has always been limited by factors such as telecommunications, query language syntax, lack of standardization in the information, and the lack of adequate tools to assist in searching. Today, the NASA STI Program's NASA Access Mechanism (NAM) prototype offers a solution to these problems by providing the user with a set of tools that provide a graphical interface to remote, heterogeneous, and distributed information in a manner adaptable to both casual and expert users. Additionally, the NAM provides access to many Internet-based services such as Electronic Mail, the Wide Area Information Servers system, Peer Locating tools, and electronic bulletin boards.
A Diagnostics Tool to detect ensemble forecast system anomaly and guide operational decisions
NASA Astrophysics Data System (ADS)
Park, G. H.; Srivastava, A.; Shrestha, E.; Thiemann, M.; Day, G. N.; Draijer, S.
2017-12-01
The hydrologic community is moving toward using ensemble forecasts to take uncertainty into account during the decision-making process. The New York City Department of Environmental Protection (DEP) implements several types of ensemble forecasts in their decision-making process: ensemble products for a statistical model (Hirsch and enhanced Hirsch); the National Weather Service (NWS) Advanced Hydrologic Prediction Service (AHPS) forecasts based on the classical Ensemble Streamflow Prediction (ESP) technique; and the new NWS Hydrologic Ensemble Forecasting Service (HEFS) forecasts. To remove structural error and apply the forecasts to additional forecast points, the DEP post-processes both the AHPS and the HEFS forecasts. These ensemble forecasts produce large quantities of complex data, and drawing conclusions from them is time-consuming and difficult. The complexity of these forecasts also makes it difficult to identify system failures resulting from poor data, missing forecasts, and server breakdowns. To address these issues, we developed a diagnostic tool that summarizes ensemble forecasts and provides additional information such as historical forecast statistics, forecast skill, and model forcing statistics. This additional information highlights the key information that enables operators to evaluate the forecast in real-time, dynamically interact with the data, and review additional statistics, if needed, to make better decisions. We used Bokeh, a Python interactive visualization library, and a multi-database management system to create this interactive tool. The tool compiles and stores data into HTML pages that allow operators to readily analyze the data with built-in user interaction features. This paper will present a brief description of the ensemble forecasts, forecast verification results, and the intended applications for the diagnostic tool.
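Since the abstract names Bokeh as the visualization library, here is a minimal sketch of the kind of standalone, interactive HTML summary such a tool could emit; the synthetic ensemble data, labels, and file name are assumptions, not details of the DEP system.

```python
# Minimal Bokeh sketch: render synthetic ensemble streamflow traces and
# their mean to a self-contained, interactive HTML page. All data and
# labels are invented for illustration.
import numpy as np
from bokeh.plotting import figure, output_file, save

rng = np.random.default_rng(0)
days = np.arange(30)                                    # forecast lead time
traces = [100 + rng.normal(0, 5, 30).cumsum() for _ in range(20)]

output_file("ensemble_summary.html", title="Ensemble forecast summary")
p = figure(title="30-day ensemble streamflow forecast",
           x_axis_label="Lead time (days)", y_axis_label="Flow (m3/s)")
for trace in traces:
    p.line(days, trace, color="gray", line_alpha=0.25)  # individual members
p.line(days, np.mean(traces, axis=0), color="navy", line_width=2,
       legend_label="ensemble mean")
save(p)   # writes the HTML page operators can open and interact with
```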
Precision non-contact polishing tool
Taylor, John S.
1997-01-01
A non-contact polishing tool that combines two orthogonal slurry flow geometries to provide flexibility in altering the shape of the removal footprint. By varying the relative contributions of the two flow geometries, the footprint shape can be varied between the characteristic shapes corresponding to the two independent flow regimes. In addition, the tool can include a pressure activated means by which the shape of the brim of the tool can be varied. The tool can be utilized in various applications, such as x-ray optical surfaces, x-ray lithography, lenses, etc., where stringent shape and finish tolerances are required.
Livet, Melanie; Fixsen, Amanda
2018-01-01
With mental health services shifting to community-based settings, community mental health (CMH) organizations are under increasing pressure to deliver effective services. Despite availability of evidence-based interventions, there is a gap between effective mental health practices and the care that is routinely delivered. Bridging this gap requires availability of easily tailorable implementation support tools to assist providers in implementing evidence-based interventions with quality, thereby increasing the likelihood of achieving the desired client outcomes. This study documents the process and lessons learned from exploring the feasibility of adapting one such technology-based tool, Centervention, as the example innovation for use in CMH settings. Mixed-methods data on core features, innovation-provider fit, and organizational capacity were collected from 44 CMH providers. Lessons learned included the need to augment delivery through technology with more personal interactions, the importance of customizing and integrating the tool with existing technologies, and the need to incorporate a number of strategies to assist with adoption and use of Centervention-like tools in CMH contexts. This study adds to the current body of literature on the adaptation process for technology-based tools and provides information that can guide additional innovations for CMH settings.
2014-01-01
Background Chronic Obstructive Pulmonary Disease (COPD) is a growing worldwide problem that imposes a great burden on the daily life of patients. Since there is no cure, the goal of treating COPD is to maintain or improve quality of life. We have developed a new tool, the Assessment of Burden of COPD (ABC) tool, to assess and visualize the integrated health status of patients with COPD, and to provide patients and healthcare providers with a treatment algorithm. This tool may be used during consultations to monitor the burden of COPD and to adjust treatment if necessary. The aim of the current study is to analyse the effectiveness of the ABC tool compared with usual care on health related quality of life among COPD patients over a period of 18 months. Methods/Design A cluster randomised controlled trial will be conducted in COPD patients in both primary and secondary care throughout the Netherlands. An intervention group, receiving care based on the ABC tool, will be compared with a control group receiving usual care. The primary outcome will be the change in score on a disease-specific-quality-of-life questionnaire, the Saint George Respiratory Questionnaire. Secondary outcomes will be a different questionnaire (the COPD Assessment Test), lung function and number of exacerbations. During the 18 months follow-up, seven measurements will be conducted, including a baseline and final measurement. Patients will receive questionnaires to be completed at home. Additional data, such as number of exacerbations, will be recorded by the patients’ healthcare providers. A total of 360 patients will be recruited by 40 general practitioners and 20 pulmonologists. Additionally, a process evaluation will be performed among patients and healthcare providers. Discussion The new ABC tool complies with the 2014 Global Initiative for Chronic Obstructive Lung Disease guidelines, which describe the necessity to classify patients on both their airway obstruction and a comprehensive symptom assessment. It has been developed to classify patients, but also to provide visual insight into the burden of COPD and to provide treatment advice. Trial registration Netherlands Trial Register, NTR3788. PMID:25098313
Contextual constraints for the design of patient-centered health IT tools.
Gonzales, Michael J; O'Connor, Maria Francesca; Riek, Laurel D
2013-01-01
Technologists are constantly working to improve clinical practice by developing new health information technology (Health IT) tools, yet may not always consider the context of how these tools may be used. Patient preferences can vary widely as a result of demographics, health conditions, physical limitations, and personal inclinations, with healthcare providers having to adapt clinical encounters to better suit patient needs. Health IT tools, too, need to be agile across different healthcare contexts, with each stakeholder's specific needs in mind. In this paper, we discuss the challenges and limitations associated with the design and automation of contextually sensitive devices in the healthcare environment. We target the various contexts in which health information is presented in patient-provider encounters, and discuss contextual constraints that may apply to the aforementioned situations. In addition, we present a number of suggestions for informational constraints and the design of informational tools in these settings so that patient and provider informational needs can be better met in clinical communication contexts.
Solar System Treks: Interactive Web Portals for STEM, Exploration and Beyond
NASA Astrophysics Data System (ADS)
Law, E.; Day, B. H.; Viotti, M.
2017-12-01
NASA's Solar System Treks project produces a suite of online visualization and analysis tools for lunar and planetary mapping and modeling. These portals offer great benefits for education and public outreach, providing access to data from a wide range of instruments aboard a variety of past and current missions. As a component of NASA's STEM Activation Infrastructure, they are available as resources for NASA STEM programs, and to the greater STEM community. As new missions are planned to a variety of planetary bodies, these tools facilitate public understanding of the missions and engage the public in the process of identifying and selecting where these missions will land. There are currently three web portals in the program: Moon Trek (https://moontrek.jpl.nasa.gov), Mars Trek (https://marstrek.jpl.nasa.gov), and Vesta Trek (https://vestatrek.jpl.nasa.gov). A new release of Mars Trek includes new tools and data products focusing on human landing site selection. Backed by evidence-based cognitive and computer science findings, an additional version is available for educational and public audiences in support of learning along novice-to-expert pathways, enabling authentic, real-world interaction with planetary data. Portals for additional planetary bodies are planned. As web-based toolsets, the portals do not require users to purchase or install any software beyond current web browsers. The portals provide analysis tools for measurement and study of planetary terrain. They allow data to be layered and adjusted to optimize visualization. Visualizations are easily stored and shared. The portals provide 3D visualization and give users the ability to mark terrain for generation of STL/OBJ files that can be directed to 3D printers. Such 3D prints are valuable tools in museums, public exhibits, and classrooms - especially for the visually impaired. The program supports additional clients, web services, and APIs facilitating dissemination of planetary data to external applications and venues. NASA challenges and hackathons also provide members of the software development community opportunities to participate in tool development and leverage data from the portals.
Poverty and pediatric palliative care: what can we do?
Beaune, Laura; Leavens, Anne; Muskat, Barbara; Ford-Jones, Lee; Rapoport, Adam; Zlotnik Shaul, Randi; Morinis, Julia; Chapman, Lee Ann
2014-01-01
It has been recognized that families of children with life-limiting health conditions struggle with significant financial demands, yet may not have awareness of resources available to them. Additionally, health care providers may not be aware of the socioeconomic needs of families they care for. This article describes a mixed-methods study examining the content validity and utility for health care providers of a poverty screening tool and companion resource guide for the pediatric palliative care population. The study found high relevance and validity of the tool. Significant barriers to implementing the screening tool in clinical practice were described by participants, including: concerns regarding time required, roles and responsibilities, and discomfort in asking about income. Implications for practice and suggestions for improving the tool are discussed. Screening and attention to the social determinants of health lie within the scope of practice of all health care providers. Social workers can play a leadership role in this work.
ASTROS: A multidisciplinary automated structural design tool
NASA Technical Reports Server (NTRS)
Neill, D. J.
1989-01-01
ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.
Air Markets Program Data (AMPD)
The Air Markets Program Data tool allows users to search EPA data to answer scientific, general, policy, and regulatory questions about industry emissions. Air Markets Program Data (AMPD) is a web-based application that allows users easy access to both current and historical data collected as part of EPA's emissions trading programs. This site allows you to create and view reports and to download emissions data for further analysis. AMPD provides a query tool so users can create custom queries of industry source emissions data, allowance data, compliance data, and facility attributes. In addition, AMPD provides interactive maps, charts, reports, and pre-packaged datasets. AMPD does not require any additional software, plug-ins, or security controls and can be accessed using a standard web browser.
Expert systems tools for Hubble Space Telescope observation scheduling
NASA Technical Reports Server (NTRS)
Miller, Glenn; Rosenthal, Don; Cohen, William; Johnston, Mark
1987-01-01
The utility of expert systems techniques for the Hubble Space Telescope (HST) planning and scheduling is discussed and a plan for development of expert system tools which will augment the existing ground system is described. Additional capabilities provided by these tools will include graphics-oriented plan evaluation, long-range analysis of the observation pool, analysis of optimal scheduling time intervals, constructing sequences of spacecraft activities which minimize operational overhead, and optimization of linkages between observations. Initial prototyping of a scheduler used the Automated Reasoning Tool running on a LISP workstation.
A Tool and Application Programming Interface for Browsing Historical Geostationary Satellite Data
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Ayers, J.
2013-12-01
Providing access to information is a key concern for NASA Langley Research Center. We describe a tool and method that allows end users to easily browse and access information that is otherwise difficult to acquire and manipulate. The tool described has as its core the application-programming interface that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the enhanced imagery as an input into their own work flows. This project builds upon NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite imagery accessible and easily searchable. As we see the increasing use of virtual supply chains that provide additional value at each link, there is value in making satellite imagery available through a simple access method, as well as allowing users to browse and view that imagery as they need rather than in a manner most convenient for the data provider.
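The abstract does not document the API's actual endpoints, so the following Python sketch only illustrates the access pattern; the URL and every parameter name are hypothetical.

```python
# Hypothetical illustration of consuming an imagery-browsing web API of
# the kind described above; the endpoint and parameter names are invented.
import requests

BASE_URL = "https://example.nasa.gov/satimagery"   # hypothetical endpoint

params = {
    "satellite": "GOES-13",                        # hypothetical parameters
    "product": "visible",
    "time": "2013-06-01T12:00Z",
    "bbox": "-110,25,-70,50",                      # lon/lat bounding box
}
resp = requests.get(f"{BASE_URL}/image", params=params, timeout=30)
resp.raise_for_status()

with open("goes13_visible.png", "wb") as f:
    f.write(resp.content)   # imagery ready for a downstream workflow
```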
ERIC Educational Resources Information Center
Cann, Cynthia W.; Brumagim, Alan L.
2008-01-01
The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…
Providing Common Access Mechanisms for Dissimilar Network Interconnection Nodes
1991-02-01
Network management involves both maintaining adequate data transmission capabilities in the face of growing and changing needs and keeping the network... 1. Display Only tools are able to obtain information from an IN or a set of INs and display this information, but are not able to change the configuration or state of an IN. 2. Display and Control tools have the same capabilities as Display Only tools, but in addition are capable of changing the...
Woehrle, Holger; Arzt, Michael; Graml, Andrea; Fietze, Ingo; Young, Peter; Teschler, Helmut; Ficker, Joachim H
2018-01-01
This study investigated the addition of a real-time feedback patient engagement tool on positive airway pressure (PAP) adherence when added to a proactive telemedicine strategy. Data from a German healthcare provider (ResMed Healthcare Germany) were retrospectively analyzed. Patients who first started PAP therapy between 1 September 2009 and 30 April 2014, and were managed using telemedicine (AirView™; proactive care) or telemedicine + patient engagement tool (AirView™ + myAir™; patient engagement) were eligible. Patient demographics, therapy start date, sleep-disordered breathing indices, device usage hours, and therapy termination rate were obtained and compared between the two groups. The first 500 patients managed by telemedicine-guided care and a patient engagement tool were matched with 500 patients managed by telemedicine-guided care only. The proportion of nights with device usage ≥4 h was 77 ± 25% in the patient engagement group versus 63 ± 32% in the proactive care group (p < 0.001). Therapy termination occurred less often in the patient engagement group (p < 0.001). The apnea-hypopnea index was similar in the two groups, but leak was significantly lower in the patient engagement versus proactive care group (2.7 ± 4.0 vs 4.1 ± 5.3 L/min; p < 0.001). Addition of a patient engagement tool to telemonitoring-guided proactive care was associated with higher device usage and lower leak. This suggests that addition of an engagement tool may help improve PAP therapy adherence and reduce mask leak. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Development of microsatellite markers in Parthenium ssp.
USDA-ARS?s Scientific Manuscript database
Molecular markers provide the most efficient means to study genetic diversity within and among species of a particular genus. In addition, molecular markers can facilitate breeding efforts by providing tools necessary to reduce the time required to obtain recombinant genotypes with improved agricu...
Knowledge maps: a tool for online assessment with automated feedback.
Ho, Veronica W; Harris, Peter G; Kumar, Rakesh K; Velan, Gary M
2018-12-01
In higher education, most assessments or examinations comprise either multiple-choice items or open-ended questions such as modified essay questions (MEQs). Online concept and knowledge maps are potential tools for assessment, which might emphasize meaningful, integrated understanding of phenomena. We developed an online knowledge-mapping assessment tool, which provides automated feedback on student-submitted maps. We conducted a pilot study to investigate the potential utility of online knowledge mapping as a tool for automated assessment by comparing the scores generated by the software with manual grading of a MEQ on the same topic for a cohort of first-year medical students. In addition, an online questionnaire was used to gather students' perceptions of the tool. Map items were highly discriminating between students of differing knowledge of the topic overall. Regression analysis showed a significant correlation between map scores and MEQ scores, and responses to the questionnaire regarding use of knowledge maps for assessment were overwhelmingly positive. These results suggest that knowledge maps provide a similar indication of students' understanding of a topic as a MEQ, with the advantage of instant, consistent computer grading and time savings for educators. Online concept and knowledge maps could be a useful addition to the assessment repertoire in higher education.
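The abstract does not specify the grading algorithm, but a common approach to automated map scoring is proposition matching. The sketch below is a hypothetical illustration: each map is reduced to (concept, link, concept) triples, and a student map is scored against a reference map, with unmatched reference propositions returned as material for automated feedback.

```python
# Hypothetical sketch of automated knowledge-map scoring by proposition
# matching; the actual rubric of the tool above is not given in the abstract.

def propositions(edges):
    """Normalize (source, link, target) triples for comparison."""
    return {(s.lower().strip(), l.lower().strip(), t.lower().strip())
            for s, l, t in edges}

def score_map(student_edges, reference_edges):
    """Return fraction of reference propositions matched, plus the gaps."""
    student = propositions(student_edges)
    reference = propositions(reference_edges)
    return len(student & reference) / len(reference), sorted(reference - student)

reference = [("asthma", "involves", "airway inflammation"),
             ("airway inflammation", "causes", "bronchoconstriction")]
student = [("Asthma", "involves", "airway inflammation")]

score, missing = score_map(student, reference)
# score == 0.5; `missing` lists the propositions to report as feedback
```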
Gressel, Gregory M; Lundsberg, Lisbet S; Illuzzi, Jessica L; Danton, Cheryl M; Sheth, Sangini S; Xu, Xiao; Gariepy, Aileen
2014-12-01
To explore patient and provider perspectives regarding a new Web-based contraceptive support tool. We conducted a qualitative study at an urban Medicaid-based clinic among sexually active women interested in starting a new contraceptive method, clinic providers and staff. All participants were given the opportunity to explore Bedsider, an online contraceptive support tool developed for sexually active women ages 18-29 by the National Campaign to Prevent Teen and Unplanned Pregnancy and endorsed by the American Congress of Obstetricians and Gynecologists. Focus groups were conducted separately among patient participants and clinic providers/staff using open-ended structured interview guides to identify specific themes and key concepts related to use of this tool in an urban clinic setting. Patient participants were very receptive to this online contraceptive support tool, describing it as trustworthy, accessible and empowering. In contrast, clinic providers and staff had concerns regarding the Website's legitimacy, accessibility, ability to empower patients and applicability, which limited their willingness to recommend its use to patients. Contrasting opinions regarding Bedsider may point to a potential disconnect between how providers and patients view contraception information tools. Further qualitative and quantitative studies are needed to explore women's perspectives on contraceptive education and counseling and providers' understanding of these perspectives. This study identifies a contrast between how patients and providers in an urban clinic setting perceive a Web-based contraceptive tool. Given a potential patient-provider discrepancy in preferred methods and approaches to contraceptive counseling, additional research is needed to enhance this important arena of women's health care. Copyright © 2014 Elsevier Inc. All rights reserved.
This site provides EJ policy, information resources, compliance and enforcement data tools, and community outreach activities. Additional topics include grants and program information documents, and federal advisory committee and interagency working group activities.
77 FR 77038 - Procurement List; Proposed Additions
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-31
... purpose is to provide interested persons an opportunity to submit comments on the proposed actions...: Keystone Vocational Services, Inc., Sharon, PA Contracting Activity: General Services Administration, Tools...
Competency-Based Education for the Molecular Genetic Pathology Fellow
Talbert, Michael L.; Dunn, S. Terence; Hunt, Jennifer; Hillyard, David R.; Mirza, Imran; Nowak, Jan A.; Van Deerlin, Vivianna; Vnencak-Jones, Cindy L.
2009-01-01
The following report represents guidelines for competency-based fellowship training in Molecular Genetic Pathology (MGP) developed by the Association for Molecular Pathology Training and Education Committee and Directors of MGP Programs in the United States. The goals of the effort were to describe each of the Accreditation Council for Graduate Medical Education competencies as they apply to MGP fellowship training, provide a summary of goals and objectives, and recommend assessment tools. These guidelines are particularly pertinent to MGP training, which is a relatively new specialty that operates within a rapidly changing scientific and technological arena. It is hoped that this document will provide additional material for directors of existing MGP programs to consider for improvement of program objectives and enhancement of evaluation tools already in place. In addition, the guidelines should provide a valuable framework for the development of new MGP programs. PMID:19797613
Meyer, Denny; Abbott, Jo-Anne; Rehm, Imogen; Bhar, Sunil; Barak, Azy; Deng, Gary; Wallace, Klaire; Ogden, Edward; Klein, Britt
2017-04-01
Suicidal patients often visit healthcare professionals in their last month before suicide, but medical practitioners are unlikely to raise the issue of suicide with patients because of time constraints and uncertainty regarding an appropriate approach. A brief tool called the e-PASS Suicidal Ideation Detector (eSID) was developed for medical practitioners to help detect the presence of suicidal ideation (SI) in their clients. If SI is detected, the system alerts medical practitioners to address this issue with a client. The eSID tool was developed due to the absence of an easy-to-use, evidence-based SI detection tool for general practice. The tool was developed using binary logistic regression analyses of data provided by clients accessing an online psychological assessment function. Ten primary healthcare professionals provided advice regarding the use of the tool. The analysis identified eleven factors in addition to the Kessler-6 for inclusion in the model used to predict the probability of recent SI. The model performed well across gender and age groups 18-64 (AUR 0.834, 95% CI 0.828-0.841, N = 16,703). Healthcare professionals were interviewed; they recommended that the tool be incorporated into existing medical software systems and that additional resources be supplied, tailored to the level of risk identified. The eSID is expected to trigger risk assessments by healthcare professionals when this is necessary. Initial reactions of healthcare professionals to the tool were favorable, but further testing and in situ development are required.
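As a rough illustration of the modelling approach, the sketch below fits a binary logistic regression predicting recent suicidal ideation from a Kessler-6 score plus additional items; the simulated data, coefficients, and 0.5 alert threshold are invented and do not reproduce the published eSID model.

```python
# Illustrative-only sketch of the binary logistic regression approach;
# the simulated predictors and threshold are not the published model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
k6 = rng.integers(0, 25, n).astype(float)   # Kessler-6 total score stand-in
extra = rng.normal(size=(n, 2))             # stand-ins for additional items

true_logit = -4 + 0.2 * k6 + 0.5 * extra[:, 0]
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)  # SI indicator

X = sm.add_constant(np.column_stack([k6, extra]))
model = sm.Logit(y, X).fit(disp=0)

p_si = model.predict(X)        # predicted probability of recent SI
alert = p_si > 0.5             # a threshold that could trigger an alert
```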
Circuit design tool. User's manual, revision 2
NASA Technical Reports Server (NTRS)
Miyake, Keith M.; Smith, Donald E.
1992-01-01
The CAM chip design was produced in a UNIX software environment using a design tool that supports definition of digital electronic modules, composition of these modules into higher level circuits, and event-driven simulation of these circuits. Our design tool provides an interface whose goals include straightforward but flexible primitive module definition and circuit composition, efficient simulation, and a debugging environment that facilitates design verification and alteration. The tool provides a set of primitive modules which can be composed into higher level circuits. Each module is a C-language subroutine that uses a set of interface protocols understood by the design tool. Primitives can be altered simply by recoding their C-code image; in addition new primitives can be added allowing higher level circuits to be described in C-code rather than as a composition of primitive modules--this feature can greatly enhance the speed of simulation.
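The manual's primitive modules are C subroutines; purely as a language-neutral illustration of the module-composition and event-driven-simulation idea (in Python rather than the tool's actual C interface protocols), the sketch below defines a NAND primitive, wires two together into a higher level circuit, and propagates events through it.

```python
# Illustration only: a toy event-driven simulator in the spirit of the
# tool described above. Names and scheduling details are assumptions.
import heapq, itertools

class Nand:
    """Primitive module: 2-input NAND with unit propagation delay."""
    def __init__(self, name):
        self.name = name
        self.inputs = [0, 0]
        self.fanout = []              # (module, input_pin) pairs it drives

    def evaluate(self):
        return 0 if (self.inputs[0] and self.inputs[1]) else 1

_seq = itertools.count()              # tie-breaker keeps heap entries orderable

def schedule(events, time, module, pin, value):
    heapq.heappush(events, (time, next(_seq), module, pin, value))

def simulate(events, until=20):
    trace = []
    while events and events[0][0] <= until:
        t, _, mod, pin, value = heapq.heappop(events)
        if mod.inputs[pin] == value:
            continue                  # no input change: nothing to propagate
        mod.inputs[pin] = value
        out = mod.evaluate()
        trace.append((t, mod.name, out))
        for target, tpin in mod.fanout:
            schedule(events, t + 1, target, tpin, out)   # unit delay
    return trace

# Compose: NAND "a" with both inputs tied acts as an inverter feeding a
# second inverter "b", forming a two-module higher level circuit.
a, b = Nand("a"), Nand("b")
a.fanout = [(b, 0), (b, 1)]
events = []
schedule(events, 0, a, 0, 1)
schedule(events, 0, a, 1, 1)
print(simulate(events))
```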
Additive Manufacturing of Tooling for Refrigeration Cabinet Foaming Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Post, Brian K; Nuttall, David; Cukier, Michael
The primary objective of this project was to leverage the Big Area Additive Manufacturing (BAAM) process and materials into a long term, quick change tooling concept to drastically reduce product lead and development timelines and costs. Current refrigeration foam molds are complicated to manufacture, involving casting several aluminum parts in an approximate shape, machining components of the molds, and post fitting and shimming of the parts in an articulated fixture. The total process timeline can take over 6 months. The foaming process is slower than required for production; therefore multiple fixtures, 10 to 27, are required per refrigerator model. Molds are particular to a specific product configuration, making mixed model assembly challenging for sequencing, mold changes, or auto changeover features. The initial goal was to create a tool leveraging the ORNL materials and additive process to build a tool in 4 to 6 weeks or less. A secondary goal was to create common fixture cores and provide lightweight fixture sections that could be revised in a very short time to increase equipment flexibility, reduce lead times, lower the barriers to first production trials, and reduce tooling costs.
GAP Final Technical Report 12-14-04
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrew J. Bordner, PhD, Senior Research Scientist
2004-12-14
The Genomics Annotation Platform (GAP) was designed to develop new tools for high-throughput functional annotation and characterization of protein sequences and structures resulting from genomics and structural proteomics, along with benchmarking and application of those tools. Furthermore, this platform integrated the genomic-scale sequence and structural analysis and prediction tools with the advanced structure prediction and bioinformatics environment of ICM. The development of GAP was primarily oriented towards the annotation of new biomolecular structures using both structural and sequence data. Even though the amount of protein X-ray crystal data is growing exponentially, the volume of sequence data is growing even more rapidly. This trend was exploited by leveraging the wealth of sequence data to provide functional annotation for protein structures. The additional information provided by GAP is expected to assist the majority of the commercial users of ICM, who are involved in drug discovery, in identifying promising drug targets as well as in devising strategies for the rational design of therapeutics directed at the protein of interest. The GAP also provided valuable tools for biochemistry education and structural genomics centers. In addition, GAP incorporates many novel prediction and analysis methods not available in other molecular modeling packages. This development led to signing the first Molsoft agreement in the structural genomics annotation area with the University of Oxford Structural Genomics Center. This commercial agreement validated the Molsoft efforts under the GAP project and provided the basis for further development of the large-scale functional annotation platform.
NASA Astrophysics Data System (ADS)
Welton, B.; Chouinard, K.; Sultan, M.; Becker, D.; Milewski, A.; Becker, R.
2010-12-01
Rising populations in the arid and semi-arid parts of the world are increasing the demand for fresh water supplies worldwide. Many data sets needed for assessment of hydrologic applications across vast regions of the world are expensive, unpublished, difficult to obtain, or at varying scales, which complicates their use. Fortunately, this situation is changing with the development of global remote sensing datasets and web-based platforms such as GIS Server. GIS provides a cost-effective vehicle for comparing, analyzing, and querying a variety of spatial datasets as geographically referenced layers. We have recently constructed a web-based GIS that incorporates all relevant geological, geochemical, geophysical, and remote sensing data sets that were readily used to identify reservoir types and potential well locations on local and regional scales in various tectonic settings including: (1) extensional environments (Red Sea rift), (2) transcurrent fault systems (Najd Fault in the Arabian-Nubian Shield), and (3) compressional environments (Himalayas). The web-based GIS can also be used to detect spatial and temporal trends in precipitation, recharge, and runoff in large watersheds on local, regional, and continental scales. These applications were enabled through the construction of a web-based ArcGIS Server with a Google Maps interface and the development of customized geoprocessing tools. ArcGIS Server provides out-of-the-box setups that are generic in nature. This platform includes all of the standard web-based GIS tools (e.g., pan, zoom, identify, search, data querying, and measurement). In addition to the standard suite of tools provided by ArcGIS Server, an additional set of advanced data manipulation and display tools was developed to allow for a more complete and customizable view of the area of interest. The most notable addition to the standard GIS Server tools is the set of custom on-demand geoprocessing tools (e.g., graph, statistical functions, custom raster creation, profile, TRMM). The generation of a wide range of derivative maps (e.g., buffer zone, contour map, graphs, temporal rainfall distribution maps) from various map layers (e.g., geologic maps, geophysics, satellite images) allows for more user flexibility. The use of these tools, along with the Google Maps API, which enables website users to utilize high-quality GeoEye 2 images provided by Google in conjunction with our data, creates a more complete picture of the area being observed and allows custom derivative maps to be created in the field and viewed immediately on the web, processes that were previously restricted to offline databases.
Adequacy of surface analytical tools for studying the tribology of ceramics
NASA Technical Reports Server (NTRS)
Sliney, H. E.
1986-01-01
Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less well-known techniques that may also prove useful.
Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang
2008-01-01
Background Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large scale GWAS. Results The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large scale GWAS and achieved excellent scalability for large scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large scale GWAS, and the epiSNP serial computing programs are convenient tools for epistasis analysis in small scale GWAS using commonly available computer hardware. PMID:18644146
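As a rough sketch of one of the listed tests, the following fits a linear model with additive and dominance codings for two SNPs and tests the additive x additive interaction term. This illustrates the general idea only; EPISNP's actual tests are based on the extended Kempthorne model, which is not reproduced here.

```python
# Sketch of an additive x additive epistasis test for one SNP pair via
# an ordinary linear model; not the extended Kempthorne model tests.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 2000
snp1 = rng.integers(0, 3, n)          # genotypes coded as 0/1/2 allele copies
snp2 = rng.integers(0, 3, n)

a1, a2 = snp1 - 1.0, snp2 - 1.0       # additive coding (-1, 0, 1)
d1 = (snp1 == 1).astype(float)        # dominance coding (0, 1, 0)
d2 = (snp2 == 1).astype(float)
y = 0.2 * a1 * a2 + rng.normal(size=n)   # simulate an additive x additive effect

X = sm.add_constant(np.column_stack([a1, a2, d1, d2, a1 * a2]))
fit = sm.OLS(y, X).fit()
print(fit.pvalues[-1])                # p-value of the additive x additive term
```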
Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis
2015-01-01
Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.; Olariu, Stephen
1995-01-01
The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front-end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors, and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.
NASA Technical Reports Server (NTRS)
Mayer, Richard
1988-01-01
The integrated development support environment (IDSE) is a suite of integrated software tools that provide intelligent support for information modelling. These tools assist in function, information, and process modeling. Additional tools exist to assist in gathering and analyzing information to be modeled. This is a user's guide to application of the IDSE. Sections covering the requirements and design of each of the tools are presented. There are currently three integrated computer aided manufacturing definition (IDEF) modeling methodologies: IDEF0, IDEF1, and IDEF2. Also, four appendices exist to describe hardware and software requirements, installation procedures, and basic hardware usage.
Jane: a new tool for the cophylogeny reconstruction problem.
Conow, Chris; Fielder, Daniel; Ovadia, Yaniv; Libeskind-Hadas, Ran
2010-02-03
This paper describes the theory and implementation of a new software tool, called Jane, for the study of historical associations. This problem arises in parasitology (associations of hosts and parasites), molecular systematics (associations of organisms and genes), and biogeography (associations of organisms and regions). The underlying problem is that of reconciling pairs of trees subject to biologically plausible events and costs associated with these events. Existing software tools for this problem have strengths and limitations, and the new Jane tool described here provides functionality that complements existing tools. The Jane software tool uses a polynomial time dynamic programming algorithm in conjunction with a genetic algorithm to find very good, and often optimal, solutions even for relatively large pairs of trees. The tool allows the user to provide rich timing information on both the host and parasite trees. In addition, the user can limit host switch distance and specify multiple host switch costs by specifying regions in the host tree and costs for host switches between pairs of regions. Jane also provides a graphical user interface that allows the user to interactively experiment with modifications to the solutions found by the program. Jane is shown to be a useful tool for cophylogenetic reconstruction. Its functionality complements existing tools and it is therefore likely to be of use to researchers in the areas of parasitology, molecular systematics, and biogeography.
Basic Wind Tech Course - Lesson Plans and Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swapp, Andy
2011-07-01
The funds from this project were used to purchase tools and instrumentation to help replicate actual on-the-job wind energy scenarios which provided the students with the practical or applied components of wind energy jobs. This project enhanced the educational experiences provided for the students in terms of engineering and science components of wind energy by using electronics, control systems, and electro-mechanical instrumentation to help students learn standardized wind-specific craftsman skills. In addition the tools and instrumentation helped the students learn the safety necessary to work in the wind industry.
New Data Services for Polar Investigators from Integrated Earth Data Applications (IEDA)
NASA Astrophysics Data System (ADS)
Nitsche, F. O.; Ferrini, V.; Morton, J. J.; Arko, R. A.; McLain, K.; O'hara, S. H.; Carbotte, S. M.; Lehnert, K. A.; IEDA Team, I.
2013-12-01
Accessibility and preservation of data is needed to support multi-disciplinary research in the key environmentally sensitive Polar Regions. IEDA (Integrated Earth Data Applications) is a community-based data facility funded by the US National Science Foundation (NSF) to support, sustain, and advance the geosciences by providing data services for observational solid earth data from the Ocean, Earth, and Polar Sciences. IEDA tools and services relevant to the Polar Research Community include the Antarctic and Southern Ocean Data System (ASODS), the U.S. Antarctic Program Data Coordination Center (USAP-DCC), GeoMapApp, as well as a number of services for sample-based data (SESAR and EarthChem). In addition to existing tools, which assist Polar investigators in archiving their data, and creating DIF records for global searches in AMD, IEDA recently added several new tools and services that will provide further support for investigators with the data life cycle process. These include a data management plan (http://www.iedadata.org/compliance/plan) and data compliance reporting tool (http://www.iedadata.org/compliance/report) that will help investigators comply with the requirements of funding agencies such as the National Science Foundation (NSF). Data, especially from challenging Polar Regions, are likely to be used by other scientists for future studies. Therefore, data acknowledgment is an important concern of many investigators. To encourage data acknowledgments by data users, we link references of publications (when known) to datasets and cruises registered within the ASODS system as part of our data curation services (http://www.marine-geo.org/portals/antarctic/references.php). In addition, IEDA offers a data publication service to register scientific data with DOI's, making data sets citable as publications with attribution to investigators as authors. IEDA is a publication agent of the DataCite consortium. Offering such services provides additional incentives for making data available through data centers. Such tools and services are important building blocks of a coherent and comprehensive (cyber) data support structure for Polar investigators.
Center for Corporate Climate Leadership Goal Setting
EPA provides tools and recognition for companies setting aggressive GHG reduction goals, which can galvanize reduction efforts at a company and often leads to the identification of many additional reduction opportunities.
Rover Wheel-Actuated Tool Interface
NASA Technical Reports Server (NTRS)
Matthews, Janet; Ahmad, Norman; Wilcox, Brian
2007-01-01
A report describes an interface for utilizing some of the mobility features of a mobile robot for general-purpose manipulation of tools and other objects. The robot in question, now undergoing conceptual development for use on the Moon, is the All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) rover, which is designed to roll over gentle terrain or walk over rough or steep terrain. Each leg of the robot is a six-degree-of-freedom general purpose manipulator tipped by a wheel with a motor drive. The tool interface includes a square cross-section peg, equivalent to a conventional socket-wrench drive, that rotates with the wheel. The tool interface also includes a clamp that holds a tool on the peg, and a pair of fold-out cameras that provides close-up stereoscopic images of the tool and its vicinity. The field of view of the imagers is actuated by the clamp mechanism and is specific to each tool. The motor drive can power any of a variety of tools, including rotating tools for helical fasteners, drills, and such clamping tools as pliers. With the addition of a flexible coupling, it could also power another tool or remote manipulator at a short distance. The socket drive can provide very high torque and power because it is driven by the wheel motor.
Word Processing Programs and Weaker Writers/Readers: A Meta-Analysis of Research Findings
ERIC Educational Resources Information Center
Morphy, Paul; Graham, Steve
2012-01-01
Since its advent word processing has become a common writing tool, providing potential advantages over writing by hand. Word processors permit easy revision, produce legible characters quickly, and may provide additional supports (e.g., spellcheckers, speech recognition). Such advantages should remedy common difficulties among weaker…
Utilizing lean tools to improve value and reduce outpatient wait times in an Indian hospital.
Miller, Richard; Chalapati, Nirisha
2015-01-01
This paper aims to demonstrate how lean tools were applied to some unique issues of providing healthcare in a developing country, where many patients face challenges not found in developed countries. The challenges provide insight into how lean tools can be utilized to provide similar results across the world. This paper is based on a qualitative case study carried out by a master's student implementing lean at a hospital in India. This paper finds that lean tools such as value-stream mapping and root cause analysis can lead to dramatic reductions in waste and improvements in productivity. The fact that the majority of patients pay for their own healthcare and lack transportation created scheduling problems that required patients to receive their diagnosis and pay for treatment within a single day. Many additional wastes were identified that were significantly impacting the hospital's ability to provide care. As a result of this project, average outpatient wait times were reduced from 1 hour to 15 minutes, along with a significant increase in labor productivity. The results demonstrate how lean tools can increase value to the patients. It also provides a framework that healthcare providers in developed and developing countries can utilize to analyze their value streams and reduce waste. This paper is one of the first to address the unique issues of implementing lean in a healthcare setting in a developing country.
Direct push driven in situ color logging tool (CLT): technique, analysis routines, and application
NASA Astrophysics Data System (ADS)
Werban, U.; Hausmann, J.; Dietrich, P.; Vienken, T.
2014-12-01
Direct push technologies have recently seen broad development, providing several tools for in situ parameterization of unconsolidated sediments. One of these techniques is the measurement of soil colors - a proxy that relates to soil/sediment properties. We introduce the direct push driven color logging tool (CLT) for real-time and depth-resolved investigation of soil colors within the visible spectrum. Until now, no routines have existed for handling high-resolution (mm-scale) soil color data. To develop such a routine, we transform raw data (CIEXYZ) into soil color surrogates of selected color spaces (CIExyY, CIEL*a*b*, CIEL*c*h*, sRGB) and denoise small-scale natural variability by Haar and Daubechies-4 wavelet transformation, gathering interpretable color logs over depth. However, interpreting color log data as a single application remains challenging. Additional information, such as site-specific knowledge of the geological setting, is required to correlate soil color data to specific layer properties. Hence, we provide example results from a joint interpretation of in situ-obtained soil color data and state-of-the-art direct push profiling tool data and discuss the benefit of the additional data. The developed routine is capable of transferring the colorimetric data into interpretable color surrogates. Soil color data proved to correlate with small-scale lithological/chemical changes (e.g., grain size, oxidative and reductive conditions), especially when combined with additional direct push vertical high-resolution data (e.g., cone penetration testing and soil sampling). Thus, the technique enables enhanced profiling by providing another reproducible high-resolution parameter for analyzing subsurface conditions. This opens potential new areas of application and new outputs for such data in site investigation. It is our intention to improve color measurements in terms of application method and data interpretation, making them useful for characterizing vadose layer/soil/sediment properties.
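The two processing steps named above, color-space transformation and wavelet denoising, can be sketched compactly. The following Python example converts CIEXYZ readings to CIEL*a*b* with the standard formulas and smooths a synthetic lightness log with a Haar wavelet; the reference white, threshold, and data are illustrative assumptions.

```python
# Sketch of the CLT processing chain: CIEXYZ -> CIEL*a*b*, then Haar
# wavelet denoising of the depth log. Data and threshold are invented.
import numpy as np
import pywt

def xyz_to_lab(X, Y, Z, white=(95.047, 100.0, 108.883)):    # D65 white point
    def f(t):
        return np.where(t > (6/29)**3, np.cbrt(t), t / (3*(6/29)**2) + 4/29)
    fx, fy, fz = f(X/white[0]), f(Y/white[1]), f(Z/white[2])
    return 116*fy - 16, 500*(fx - fy), 200*(fy - fz)        # L*, a*, b*

# Synthetic lightness log: two layers plus mm-scale sensor noise.
depth = np.linspace(0, 2, 1024)                             # metres
L_raw = (np.where(depth < 1.0, 55.0, 35.0)
         + np.random.default_rng(7).normal(0, 3, depth.size))

# Haar denoising: decompose, soft-threshold the details, reconstruct.
coeffs = pywt.wavedec(L_raw, "haar", level=5)
coeffs[1:] = [pywt.threshold(c, value=3.0, mode="soft") for c in coeffs[1:]]
L_smooth = pywt.waverec(coeffs, "haar")                     # interpretable log
```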
Web-Based Tools for Text-Based Patient-Provider Communication in Chronic Conditions: Scoping Review
Grunfeld, Eva; Makuwaza, Tutsirai; Bender, Jacqueline L
2017-01-01
Background Patients with chronic conditions require ongoing care which not only necessitates support from health care providers outside appointments but also self-management. Web-based tools for text-based patient-provider communication, such as secure messaging, allow for sharing of contextual information and personal narrative in a simple accessible medium, empowering patients and enabling their providers to address emerging care needs. Objective The objectives of this study were to (1) conduct a systematic search of the published literature and the Internet for Web-based tools for text-based communication between patients and providers; (2) map tool characteristics, their intended use, contexts in which they were used, and by whom; (3) describe the nature of their evaluation; and (4) understand the terminology used to describe the tools. Methods We conducted a scoping review using the MEDLINE (Medical Literature Analysis and Retrieval System Online) and EMBASE (Excerpta Medica Database) databases. We summarized information on the characteristics of the tools (structure, functions, and communication paradigm), intended use, context and users, evaluation (study design and outcomes), and terminology. We performed a parallel search of the Internet to compare with tools identified in the published literature. Results We identified 54 papers describing 47 unique tools from 13 countries studied in the context of 68 chronic health conditions. The majority of tools (77%, 36/47) had functions in addition to communication (eg, viewable care plan, symptom diary, or tracker). Eight tools (17%, 8/47) were described as allowing patients to communicate with the team or multiple health care providers. Most of the tools were intended to support communication regarding symptom reporting (49%, 23/47), and lifestyle or behavior modification (36%, 17/47). The type of health care providers who used tools to communicate with patients were predominantly allied health professionals of various disciplines (30%, 14/47), nurses (23%, 11/47), and physicians (19%, 9/47), among others. Over half (52%, 25/48) of the tools were evaluated in randomized controlled trials, and 23 tools (48%, 23/48) were evaluated in nonrandomized studies. Terminology of tools varied by intervention type and functionality and did not consistently reflect a theme of communication. The majority of tools found in the Internet search were patient portals from 6 developers; none were found among published articles. Conclusions Web-based tools for text-based patient-provider communication were identified from a wide variety of clinical contexts and with varied functionality. Tools were most prevalent in contexts where intended use was self-management. Few tools for team-based communication were found, but this may become increasingly important as chronic disease care becomes more interdisciplinary. PMID:29079552
O'Donnell, Michael S.; Aldridge, Cameron L.; Doherty, Kevin E.; Fedy, Bradley C.
2015-01-01
We deliver all products described herein as online geographic information system data for visualization and downloading. We outline the data properties for each model and their data inputs, describe the process of selecting appropriate data products for multifarious applications, describe all data products and software, provide newly derived model composites, and discuss how land managers may use the models to inform future sage-grouse studies and potentially refine conservation efforts. The models, software tools, and associated opportunities for novel applications of these products should provide a suite of additional, but not exclusive, tools for assessing Wyoming Greater Sage-grouse habitats, which land managers, conservationists, and scientists can apply to myriad applications.
INFORMATION: THEORY, BRAIN, AND BEHAVIOR
Jensen, Greg; Ward, Ryan D.; Balsam, Peter D.
2016-01-01
In the 65 years since its formal specification, information theory has become an established statistical paradigm, providing powerful tools for quantifying probabilistic relationships. Behavior analysis has begun to adopt these tools as a novel means of measuring the interrelations between behavior, stimuli, and contingent outcomes. This approach holds great promise for making more precise determinations about the causes of behavior and the forms in which conditioning may be encoded by organisms. In addition to providing an introduction to the basics of information theory, we review some of the ways that information theory has informed the studies of Pavlovian conditioning, operant conditioning, and behavioral neuroscience. In addition to enriching each of these empirical domains, information theory has the potential to act as a common statistical framework by which results from different domains may be integrated, compared, and ultimately unified. PMID:24122456
NASA Technical Reports Server (NTRS)
Saito, Jim
1987-01-01
This user guide for the verification and validation (V&V) tools of the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run using the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools, since they were not updated and are not currently active. Additionally, the guide contains current descriptions of the AED V&V tools and provides information to augment NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream, which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100, and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 printfile.
Schneider, T; Arumi, D; Crook, T J; Sun, F; Michel, M C
2014-09-01
To compare the effects of additional educational material on treatment satisfaction of overactive bladder (OAB) patients treated with a muscarinic receptor antagonist. In an observational study of OAB patients being treated by their physician with fesoterodine for 4 months (FAKTEN study), sites were randomised to providing standard treatment or additional educational material including the SAGA tool. Patient satisfaction was assessed by three validated patient-reported outcomes including the Treatment Satisfaction Question. Because of premature discontinuation of the study, descriptive statistical analysis was performed. A total of 431 and 342 patients received standard treatment or additional educational material, respectively. At study end, 76.1% [95% CI = 71.3, 80.4] of patients with standard care and 79.6% [95% CI = 74.4, 84.1] with additional SAGA tool were satisfied with treatment (primary end-point). Comparable outcomes with and without the additional educational material were also found in various patient subgroups, at the 1-month time point, and for the other patient-reported outcomes. A notable exception was the subgroup of treatment-naïve patients in which the percentage of satisfied patients was 77.2% vs. 89.5% with standard treatment and additional SAGA tool, respectively (post hoc analysis). In an observational study, most overactive bladder patients were satisfied with fesoterodine treatment. Because of the small sample size, the study does not support or refute the hypothesis that adding the SAGA tool will improve patient satisfaction with treatment. The potential effect of additional educational material in treatment-naïve patients warrants further dedicated studies. © 2014 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat
2008-01-01
The MRO Sequence Checking Tool program, mro_check, automates significant portions of the MRO (Mars Reconnaissance Orbiter) sequence checking procedure. Though MRO has checks similar to those of ODY's (Mars Odyssey) Mega Check tool, the checks needed for MRO are unique to the MRO spacecraft. The MRO sequence checking tool automates the majority of the sequence validation procedure and checklists that are used to validate the sequences generated by the MRO MPST (mission planning and sequencing team). The tool performs more than 50 different checks on a sequence. The automation ranges from summarizing data about the sequence for visual verification, to performing automated checks and providing a report for each step. To allow for the addition of new checks as needed, the tool is built in a modular fashion, as sketched below.
ERIC Educational Resources Information Center
Lang, Sarah N.; Schoppe-Sullivan, Sarah J.; Jeon, Lieny
2017-01-01
By adapting a self-administered assessment of coparenting, we sought to provide a new tool, the Cocaring Relationship Questionnaire, to measure parent-teacher, or cocaring relationships, and provide additional construct validity for the multidimensional concept of cocaring. Next, recognizing the importance of parental involvement for young…
ERIC Educational Resources Information Center
Center for Renewable Energy and Sustainable Tech., Washington, DC.
An educational tool concerning renewable energy and the environment, this CD-ROM provides nearly 1,000 screens of text, graphics, videos, and interactive exercises. It also provides a detailed index, charts of U.S. energy consumption by state, an energy glossary, and a list of related Web sites. This CD-ROM, additionally, offers "The School…
ERIC Educational Resources Information Center
Lakin, Joni M.
2012-01-01
Ability tests are used by teachers to provide additional context for interpreting student achievement and as a tool for differentiating instruction to the cognitive strengths and weaknesses of students. Tests that provide the most useful information for these purposes measure school-related content domains including verbal and quantitative…
A Microsoft Excel® 2010 Based Tool for Calculating Interobserver Agreement
Reed, Derek D; Azulay, Richard L
2011-01-01
This technical report provides detailed information on the rationale for using a common computer spreadsheet program (Microsoft Excel®) to calculate various forms of interobserver agreement for both continuous and discontinuous data sets. In addition, we provide a brief tutorial on how to use an Excel spreadsheet to automatically compute traditional total count, partial agreement-within-intervals, exact agreement, trial-by-trial, interval-by-interval, scored-interval, unscored-interval, total duration, and mean duration-per-interval interobserver agreement algorithms. We conclude with a discussion of how practitioners may integrate this tool into their clinical work. PMID:22649578
NASA Astrophysics Data System (ADS)
Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.
2017-12-01
The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has been serving as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided innovative web-based point-of-access tools for the scientific community, including: the Runs-On-Request System, which provides unprecedented global access to the largest collection of state-of-the-art solar and space physics models; Integrated Space Weather Analysis (iSWA), a powerful dissemination system for space weather information; advanced online visualization and analysis tools for more accurate interpretation of model results; standard data formats for simulation data downloads; and mobile apps for viewing space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, including the tools that have revolutionized the way we do research and improved our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).
NASA Astrophysics Data System (ADS)
Castner, E.; Leach, A. M.; Galloway, J. N.; Andrews, J.
2015-12-01
Nitrogen footprints (NF) connect entities with the reactive nitrogen (Nr; all species of nitrogen except N2) lost to the environment as a result of their activities. While necessary to life, excess Nr can be detrimental to ecosystem and human health, causing impacts such as smog, eutrophication, biodiversity loss, and climate change. The NF tool was recently developed to help institutions measure and reduce their environmental impact. This tool accounts for the NF from energy usage, food production and consumption, fertilizer usage, research animals, and agricultural activities. The tool also provides scenario analysis to help institutions reduce their NF and establish a reduction target. Currently in a testing phase, seven institutions have used the tool to calculate their NF, and six additional institutions have calculations in progress. Many institutions interested in sustainability have already calculated their carbon footprint (CF), which reports the total greenhouse gas emissions resulting from institution activities. The University of New Hampshire Sustainability Institute (UNHSI) Campus Carbon Calculator, developed in 2001, is used by thousands of institutions in the United States. While important, the CF addresses just one aspect of an institution's environmental impact: global climate change. The NF broadens this perspective by connecting to additional environmental impacts that are both global and local. The data requirements for the CF and NF have a significant overlap, especially in the energy sector. Given the similarity of data requirements and the benefits of considering the two footprints together, the two tools are in the preliminary stages of being merged. We will first provide an overview of the NF tool for institutions. We will then compare available NF and CF results from multiple institutions to assess trends and correlations and to determine the impact of different scenarios on both footprints.
Kopp, Mary K; Hornberger, Cynthia
2008-02-01
Additional efforts are needed to assist public health, school, and clinic-based pediatric nurses in identifying the prevalence of obesity among Kansas Medicaid-eligible children, 21 years or younger. A Proper Exercise and Nutrition (PEN) tool kit was mailed to 500 public health nurses who performed KAN Be Healthy (KBH) assessments. KBH nurses were provided an expanded training curriculum on growth, nutrition, and obesity along with appropriate screening tools. Nurses were surveyed about their current assessment practices and general knowledge of obese clients. After the PEN tool kit distribution, nurses reported an increased use of screening tools and standardized referral parameters. This program intervention was successful in changing nursing practice, resulting in evidence-based understanding of obesity screening and assessment.
Conservation of the introgressed European water frog complex using molecular tools.
Holsbeek, G; Maes, G E; De Meester, L; Volckaert, F A M
2009-03-01
In Belgium, the Pelophylax esculentus complex has recently been subjected to multiple introductions of non-native water frogs, increasing the occurrence of hybridisation events. In the present study, we tested the reliability of morphometric and recently developed microsatellite tools to identify introgression and to determine the origin of exotic Belgian water frogs. By analysing 150 individuals of each taxon of the P. esculentus complex and an additional 60 specimens of the introduced P. cf. bedriagae, we show that neither of the currently available tools appears to have sufficient power to reliably distinguish all Belgian water frog species. We therefore aimed at increasing the discriminatory power of a microsatellite identification tool by developing a new marker panel with additional microsatellite loci. By adding only two new microsatellite loci (RlCA5 and RlCA1b20), all taxa of the P. esculentus complex could be distinguished from each other with high confidence. Three more loci (Res3, Res5 and Res17) provided a powerful discrimination of the exotic species.
ERIC Educational Resources Information Center
Bachman, Lyle F.
1989-01-01
Applied linguistics and psychometrics have influenced language testing, providing additional tools for investigating factors affecting language test performance and assuring measurement reliability. An examination is presented of language testing, including the theoretical issues involved, the methodological advances, language test development,…
Rising dough and baking bread at the Australian synchrotron
NASA Astrophysics Data System (ADS)
Mayo, S. C.; McCann, T.; Day, L.; Favaro, J.; Tuhumury, H.; Thompson, D.; Maksimenko, A.
2016-01-01
Wheat protein quality and the amount of common salt added in dough formulation can have a significant effect on the microstructure and loaf volume of bread. High-speed synchrotron micro-CT provides an ideal tool for observing the three dimensional structure of bread dough in situ during proving (rising) and baking. In this work, the synchrotron micro-CT technique was used to observe the structure and time evolution of doughs made from high and low protein flour and three different salt additives. These experiments showed that, as expected, high protein flour produces a higher volume loaf compared to low protein flour regardless of salt additives. Furthermore the results show that KCl in particular has a very negative effect on dough properties resulting in much reduced porosity. The hundreds of datasets produced and analysed during this experiment also provided a valuable test case for handling large quantities of data using tools on the Australian Synchrotron's MASSIVE cluster.
Use of Influenza Risk Assessment Tool for Prepandemic Preparedness
Trock, Susan C.
2018-01-01
In 2010, the Centers for Disease Control and Prevention began to develop an Influenza Risk Assessment Tool (IRAT) to methodically capture and assess information relating to influenza A viruses not currently circulating among humans. The IRAT uses a multiattribute, additive model to generate a summary risk score for each virus. Although the IRAT is not intended to predict the next pandemic influenza A virus, it has provided input into prepandemic preparedness decisions. PMID:29460739
Development of the geometry database for the CBM experiment
NASA Astrophysics Data System (ADS)
Akishina, E. P.; Alexandrov, E. I.; Alexandrov, I. N.; Filozova, I. A.; Friese, V.; Ivanov, V. V.
2018-01-01
The paper describes the current state of the Geometry Database (Geometry DB) for the CBM experiment. The main purpose of this database is to provide convenient tools for: (1) managing the geometry modules; (2) assembling various versions of the CBM setup as a combination of geometry modules and additional files. The CBM users of the Geometry DB may use both GUI (Graphical User Interface) and API (Application Programming Interface) tools for working with it.
NASA Astrophysics Data System (ADS)
Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti; Maddox, Marlo
2015-04-01
With the addition of the Space Weather Research Center (a sub-team within CCMC) in 2010 to address NASA's own space weather needs, CCMC has become a unique entity that not only facilitates research by providing access to state-of-the-art space science and space weather models, but also plays a critical role in providing unique space weather services to NASA robotic missions, developing innovative tools and transitioning research to operations via user feedback. With scientists, forecasters and software developers working together within one team, through close and direct connection with space weather customers and trusted relationships with model developers, CCMC is flexible, nimble and effective in meeting customer needs. In this presentation, we highlight a few unique aspects of CCMC/SWRC's space weather services, such as addressing space weather throughout the solar system, pushing the frontier of space weather forecasting via the ensemble approach, providing direct personnel and tool support for spacecraft anomaly resolution, prompting development of multi-purpose tools and knowledge bases, and educating and engaging the next generation of space weather scientists.
On the next generation of reliability analysis tools
NASA Technical Reports Server (NTRS)
Babcock, Philip S., IV; Leong, Frank; Gai, Eli
1987-01-01
The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
NASA's Lunar and Planetary Mapping and Modeling Program
NASA Astrophysics Data System (ADS)
Law, E.; Day, B. H.; Kim, R. M.; Bui, B.; Malhotra, S.; Chang, G.; Sadaqathullah, S.; Arevalo, E.; Vu, Q. A.
2016-12-01
NASA's Lunar and Planetary Mapping and Modeling Program produces a suite of online visualization and analysis tools. Originally designed for mission planning and science, these portals offer great benefits for education and public outreach (EPO), providing access to data from a wide range of instruments aboard a variety of past and current missions. As a component of NASA's Science EPO Infrastructure, they are available as resources for NASA STEM EPO programs, and to the greater EPO community. As new missions are planned to a variety of planetary bodies, these tools are facilitating the public's understanding of the missions and engaging the public in the process of identifying and selecting where these missions will land. There are currently three web portals in the program: the Lunar Mapping and Modeling Portal or LMMP (http://lmmp.nasa.gov), Vesta Trek (http://vestatrek.jpl.nasa.gov), and Mars Trek (http://marstrek.jpl.nasa.gov). Portals for additional planetary bodies are planned. As web-based toolsets, the portals do not require users to purchase or install any software beyond current web browsers. The portals provide analysis tools for measurement and study of planetary terrain. They allow data to be layered and adjusted to optimize visualization. Visualizations are easily stored and shared. The portals provide 3D visualization and give users the ability to mark terrain for generation of STL files that can be directed to 3D printers. Such 3D prints are valuable tools in museums, public exhibits, and classrooms - especially for the visually impaired. Along with the web portals, the program supports additional clients, web services, and APIs that facilitate dissemination of planetary data to a range of external applications and venues. NASA challenges and hackathons are also providing members of the software development community opportunities to participate in tool development and leverage data from the portals.
Medicare recovery audit contractors: what providers need to know and how to prepare.
Riley, James B; Greis, Jason S
2009-04-01
RAC audits are on their way in 2009 and all providers should strive to be prepared. In a depressed economy in which most providers are experiencing decreased or stagnating reimbursement, an additional hit to the bottom line from a RAC audit is especially unwelcome. Providers have a number of tools at their disposal, however, to proactively prepare in advance of receiving a RAC demand letter or medical record request.
A dielectric logging tool with insulated collar for formation fluid detection around borehole
NASA Astrophysics Data System (ADS)
Wang, Bin; Li, Kang; Kong, Fan-Min; Zhao, Jia
2015-08-01
A dielectric tool with an insulated collar for analyzing fluid saturation outside a borehole was introduced. The UWB (ultra-wideband) antenna mounted on the tool was optimized to launch a transient pulse. The broadband evaluation method provided more advantages when compared with traditional dielectric tools. The EM (electromagnetic) power distribution outside the borehole was studied, and it was shown that energy was propagated in two modes. Furthermore, the mechanism of the modes was discussed. In order to increase the tool's investigation depth, a novel insulated collar was introduced. In addition, operation in different formations was discussed, and the tool proved able to efficiently launch lateral EM waves. Response voltages indicated that the proposed scheme was able to evaluate the fluid saturation of reservoir formations and dielectric dispersion properties. It may be used as an alternative tool for imaging logging applications.
Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan
2014-01-01
LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784
Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan
2014-12-15
LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.
A critical assessment of topologically associating domain prediction tools
Dali, Rola
2017-01-01
Abstract Topologically associating domains (TADs) have been proposed to be the basic unit of chromosome folding and have been shown to play key roles in genome organization and gene regulation. Several different tools are available for TAD prediction, but their properties have never been thoroughly assessed. In this manuscript, we compare the output of seven different TAD prediction tools on two published Hi-C data sets. TAD predictions varied greatly between tools in number, size distribution and other biological properties. Assessed against a manual annotation of TADs, individual TAD boundary predictions were found to be quite reliable, but their assembly into complete TAD structures was much less so. In addition, many tools were sensitive to sequencing depth and resolution of the interaction frequency matrix. This manuscript provides users and designers of TAD prediction tools with information that will help guide the choice of tools and the interpretation of their predictions. PMID:28334773
22 Students and 22 Teachers: Socio-Cultural Mediation in the Early Childhood Classroom
ERIC Educational Resources Information Center
Lozano, Leticia I.
2014-01-01
It is essential for teachers to provide a setting where student interaction is fostered as a mediational tool for learning, thus expediting the natural transfer of language and knowledge among students (Cummins, 1979). Doing so provides students a way of learning in an additive environment (Soltero, 2004). Could such a classroom have the potential…
Introducing MCgrid 2.0: Projecting cross section calculations on grids
NASA Astrophysics Data System (ADS)
Bothmann, Enrico; Hartland, Nathan; Schumann, Steffen
2015-11-01
MCgrid is a software package that provides access to interpolation tools for Monte Carlo event generator codes, allowing for the fast and flexible variation of scales, coupling parameters and PDFs in cutting edge leading- and next-to-leading-order QCD calculations. We present the upgrade to version 2.0 which has a broader scope of interfaced interpolation tools, now providing access to fastNLO, and features an approximated treatment for the projection of MC@NLO-type calculations onto interpolation grids. MCgrid 2.0 also now supports the extended information provided through the HepMC event record used in the recent SHERPA version 2.2.0. The additional information provided therein allows for the support of multi-jet merged QCD calculations in a future update of MCgrid.
Friction Stir Weld System for Welding and Weld Repair
NASA Technical Reports Server (NTRS)
Ding, R. Jeffrey (Inventor); Romine, Peter L. (Inventor); Oelgoetz, Peter A. (Inventor)
2001-01-01
A friction stir weld system for welding and weld repair has a base foundation unit connected to a hydraulically controlled elevation platform and a hydraulically adjustable pin tool. The base foundation unit may be fixably connected to a horizontal surface or may be connected to a mobile support in order to provide mobility to the friction stir welding system. The elevation platform may be utilized to raise and lower the adjustable pin tool about a particular axis. Additional components which may be necessary for the friction stir welding process include back plate tooling, fixturing and/or a roller mechanism.
Slok, Annerika H M; in 't Veen, Johannes C C M; Chavannes, Niels H; van der Molen, Thys; Rutten-van Mölken, Maureen P M H; Kerstjens, Huib A M; Salomé, Philippe L; Holverda, Sebastiaan; Dekhuijzen, P N Richard; Schuiten, Denise; Asijee, Guus M; van Schayck, Onno C P
2014-07-10
In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate communication with patients in clinical practice. This paper describes the development of an integrated tool to assess the burden of COPD in daily practice. A definition of the burden of COPD was formulated by a Dutch expert team. Interviews showed that patients and health-care providers agreed on this definition. We found no existing instruments that fully measured burden of disease according to this definition. However, the Clinical COPD Questionnaire meets most requirements, and was therefore used and adapted. The adapted questionnaire is called the Assessment of Burden of COPD (ABC) scale. In addition, the ABC tool was developed, of which the ABC scale is the core part. The ABC tool is a computer program with an algorithm that visualises outcomes and provides treatment advice. The next step in the development of the tool is to test the validity and effectiveness of both the ABC scale and tool in daily practice.
Sandrock, R.J.
1961-12-12
A self-actuated gripping tool is described for transferring fuel elements and the like into reactors and other inaccessible locations. The tool will grasp or release the load only when properly positioned for this purpose. In addition, the load cannot be released except when unsupported by the tool, so that jarring or contact will not bring about accidental release of the load. The gripping members or jaws of the device are cam-actuated by an axially slidable shaft which has two lockable positions. A spring urges the shaft into one position and a solenoid is provided to overcome the spring and move it into the other position. The weight of the tool operates a sleeve to lock the shaft in its existing position. Only when the cable supporting the tool is slack is the device capable of being actuated either to grasp or release its load. (AEC)
Accelerating Industrial Adoption of Metal Additive Manufacturing Technology
NASA Astrophysics Data System (ADS)
Vartanian, Kenneth; McDonald, Tom
2016-03-01
While metal additive manufacturing (AM) technology has clear benefits, there are still factors preventing its adoption by industry. These factors include the high cost of metal AM systems, the difficulty for machinists to learn and operate metal AM machines, the long approval process for part qualification/certification, and the need for better process controls; however, the high AM system cost is the main barrier deterring adoption. In this paper, we will discuss an America Makes-funded program to reduce AM system cost by combining metal AM technology with conventional computerized numerical controlled (CNC) machine tools. Information will be provided on how an Optomec-led team retrofitted a legacy CNC vertical mill with laser engineered net shaping (LENS®—LENS is a registered trademark of Sandia National Labs) AM technology, dramatically lowering deployment cost. The upgraded system, dubbed LENS Hybrid Vertical Mill, enables metal additive and subtractive operations to be performed on the same machine tool and even on the same part. Information on the LENS Hybrid system architecture, learnings from initial system deployment and continuing development work will also be provided to help guide further development activities within the materials community.
cDNA Microarray Screening in Food Safety
ROY, SASHWATI; SEN, CHANDAN K
2009-01-01
cDNA microarray technology and related bioinformatics tools present a wide range of novel application opportunities. The technology may be productively applied to address food safety. In this mini-review article, we present an update highlighting the late-breaking discoveries that demonstrate the vitality of cDNA microarray technology as a tool to analyze food safety with reference to microbial pathogens and genetically modified foods. In order to bring microarray technology to mainstream food safety, it is important to develop robust, user-friendly tools that may be applied in a field setting. In addition, there needs to be a standardized process for regulatory agencies to interpret and act upon microarray-based data. The cDNA microarray approach is an emergent technology in diagnostics. Its value lies in being able to provide complementary molecular insight when employed in addition to traditional tests for food safety, as part of a more comprehensive battery of tests. PMID:16466843
Hewett, Rafe; VanCuren, Anne; Trocio, Loralee; Beaudrault, Sara; Gund, Anona; Luther, Mimi; Groom, Holly
2013-01-01
This project's objective was to enhance efforts to improve vaccine-ordering efficiencies among targeted clinics using publicly purchased vaccines. Using an assessment of ordering behavior developed by the Centers for Disease Control and Prevention, we selected and trained immunization providers and assessed improvements in ordering behavior by comparing ordering patterns before and after the intervention. The participants were 144 Vaccines for Children program providers in Oregon, trained in the Economic Order Quantity process between January and November 2010. As the intervention, providers were invited to participate in regional trainings, which included assignment of ordering frequency and dissemination of tools to support adherence to the recommended ordering frequency. The outcome measures were the percent increase in targeted clinics ordering according to the recommended order frequency and the resulting decrease in orders placed. Only 35% of targeted providers were ordering according to the recommended ordering frequency before the project began. After completing training, utilizing ordering tools and ordering over a 7-month period, 78% of the targeted clinics were ordering according to the recommended frequency, a 120% increase in the number of clinics ordering with the recommended frequency. At baseline, targeted clinics placed 915 total vaccine orders over a 7-month period. After completing training and participating in the Economic Order Quantity process, only 645 orders were placed, a reduction of 30%. The initiative was successful in reducing the number of orders placed by Vaccines for Children providers in Oregon. A previous effort to reduce ordering, without the use of training or tools, did not achieve the same levels of provider compliance, suggesting that the addition of staff and development of tools were helpful in supporting behavior change and improving providers' ability to adhere to assigned order frequencies. Reducing order frequency results in more efficient vaccine-ordering patterns and benefits vaccine distributors, Oregon Immunization Program staff, and provider staff.
Mann, Janet; Patterson, Eric M.
2013-01-01
Tool-use research has focused primarily on land-based animals, with less consideration given to aquatic animals and the environmental challenges and conditions they face. Here, we review aquatic tool use and examine the contributing ecological, physiological, cognitive and social factors. Tool use among aquatic animals is rare but taxonomically diverse, occurring in fish, cephalopods, mammals, crabs, urchins and possibly gastropods. While additional research is required, the scarcity of tool use can likely be attributable to the characteristics of aquatic habitats, which are generally not conducive to tool use. Nonetheless, studying tool use by aquatic animals provides insights into the conditions that promote and inhibit tool-use behaviour across biomes. Like land-based tool users, aquatic animals tend to find tools on the substrate and use tools during foraging. However, unlike on land, tool users in water often use other animals (and their products) and water itself as a tool. Among sea otters and dolphins, the two aquatic tool users studied in greatest detail, some individuals specialize in tool use, which is vertically socially transmitted possibly because of their long dependency periods. In all, the contrasts between aquatic- and land-based tool users enlighten our understanding of the adaptive value of tool-use behaviour. PMID:24101631
Developing and using a rubric for evaluating evidence-based medicine point-of-care tools.
Shurtz, Suzanne; Foster, Margaret J
2011-07-01
The research sought to establish a rubric for evaluating evidence-based medicine (EBM) point-of-care tools in a health sciences library. The authors searched the literature for EBM tool evaluations and found that most previous reviews were designed to evaluate the ability of an EBM tool to answer a clinical question. The researchers' goal was to develop and complete rubrics for assessing these tools based on criteria for a general evaluation of tools (reviewing content, search options, quality control, and grading) and criteria for an evaluation of clinical summaries (searching tools for treatments of common diagnoses and evaluating summaries for quality control). Differences between EBM tools' options, content coverage, and usability were minimal. However, the products' methods for locating and grading evidence varied widely in transparency and process. As EBM tools are constantly updating and evolving, evaluation of these tools needs to be conducted frequently. Standards for evaluating EBM tools need to be established, with one method being the use of objective rubrics. In addition, EBM tools need to provide more information about authorship, reviewers, methods for evidence collection, and grading system employed.
A Computer-Based Nursing Diagnosis Consultant
Evans, Steven
1984-01-01
This consultant permits a nurse to enter patient signs and symptoms which are then interpreted by the system in order to relate them to well-established nursing-related dysfunctional patterns. The system attempts to confirm the pattern by soliciting additional patient information from the nurse. This process provides an educational prompt to the nurse, and the suggestions of the system also provide a clinical support tool that can be of practical value. As our testing hones the system and subtlety is added to the weighing of the evidence the nurse provides, it is expected that this tool will be a useful adjunct to computer-based nursing services in support of health care. This Nursing Diagnosis Consultant is yet another element in the COMMES family of consultants for health professionals.
Chemical Tool Peer Review Summary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cashion, Avery Ted; Cieslewski, Grzegorz
Chemical tracers are commonly used to characterize fracture networks and to determine the connectivity between the injection and production wells. Currently, most tracer experiments involve injecting the tracer at the injection well, manually collecting liquid samples at the wellhead of the production well, and sending the samples off for laboratory analysis. While this method provides accurate tracer concentration data, it does not provide information regarding the location of the fractures conducting the tracer between wellbores. The goal of this project is to develop chemical sensors and design a prototype tool to help understand the fracture properties of a geothermal reservoir by monitoring tracer concentrations along the depth of the well. The sensors will be able to detect certain species of the ionic tracers (mainly iodide) and pH in-situ during the tracer experiment. The proposed high-temperature (HT) tool will house the chemical sensors as well as a standard logging sensor package of pressure, temperature, and flow sensors in order to provide additional information on the state of the geothermal reservoir. The sensors and the tool will be able to survive extended deployments at temperatures up to 225 °C and high pressures to provide real-time temporal and spatial feedback of tracer concentration. Data collected from this tool will allow for the real-time identification of the fractures conducting chemical tracers between wellbores along with the pH of the reservoir fluid at various depths.
Visual Purple, the Next Generation Crisis Management Decision Training Tool
2001-09-01
talents of professional Hollywood screenwriters during the scripting and writing process of the simulations. Additionally, cinematic techniques learned...cultural, and language experts for research development. Additionally, GTA provides country specific support in script writing and cinematic resources as...The result is an entirely new dimension of realism that traditional exercises often fail to capture. The scenario requires the participant to make the
DspaceOgre 3D Graphics Visualization Tool
NASA Technical Reports Server (NTRS)
Jain, Abhinandan; Myin, Steven; Pomerantz, Marc I.
2011-01-01
This general-purpose 3D graphics visualization C++ tool is designed for visualization of simulation and analysis data for articulated mechanisms. Examples of such systems are vehicles, robotic arms, biomechanics models, and biomolecular structures. DspaceOgre builds upon the open-source Ogre3D graphics visualization library. It provides additional classes to support the management of complex scenes involving multiple viewpoints and different scene groups, and can be used as a remote graphics server. This software provides improved support for adding programs at the graphics processing unit (GPU) level for improved performance. It also improves upon the messaging interface it exposes for use as a visualization server.
Using Risk Assessment Methodologies to Meet Management Objectives
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2015-01-01
Current decision making involves numerous possible combinations of technology elements, safety and health issues, operational aspects and process considerations to satisfy program goals. Identifying potential risk considerations as part of the management decision-making process provides additional tools to make more informed management decisions. Adapting and using risk assessment methodologies can generate new perspectives on various risk and safety concerns that are not immediately apparent. Safety and operational risks can be identified, and final decisions can balance these considerations with cost and schedule risks. Additional assessments can also show likelihood of event occurrence and event consequence to provide a more informed basis for decision making, as well as cost-effective mitigation strategies. Methodologies available to perform risk assessments range from qualitative identification of risk potential to detailed assessments where quantitative probabilities are calculated. The methodology used should be based on factors that include: 1) type of industry and industry standards, 2) tasks, tools, and environment, 3) type and availability of data, and 4) industry views and requirements regarding risk and reliability. Risk assessments are a tool for decision makers to understand potential consequences and be in a position to reduce, mitigate or eliminate costly mistakes or catastrophic failures.
The Small Bodies Imager Browser --- finding asteroid and comet images without pain
NASA Astrophysics Data System (ADS)
Palmer, E.; Sykes, M.; Davis, D.; Neese, C.
2014-07-01
To facilitate accessing and downloading spatially resolved imagery of asteroids and comets in the NASA Planetary Data System (PDS), we have created the Small Bodies Image Browser. It is an HTML5 web page that runs inside a standard web browser and requires no installation (http://sbn.psi.edu/sbib/). The volume of data returned by spacecraft missions has grown substantially over the last decade. While this wealth of data provides scientists with ample support for research, it has greatly increased the difficulty of managing, accessing and processing these data. Further, the complexity necessary for a long-term archive results in an architecture that is efficient for computers, but not user friendly. The Small Bodies Image Browser (SBIB) is tied into the PDS archive of the Small Bodies Asteroid Subnode hosted at the Planetary Science Institute [1]. Currently, the tool contains the entire repository of the Dawn mission's encounter with Vesta [2], and we will be adding other datasets in the future. For Vesta, this includes both the level 1A and 1B images from the Framing Camera (FC) and the level 1B spectral cubes from the Visual and Infrared (VIR) spectrometer, providing over 30,000 individual images. A key strength of the tool is providing quick and easy access to these data. The tool allows for searches based on clicking on a map or typing in coordinates. The SBIB can show an entire mission phase (such as cycle 7 of the Low Altitude Mapping Orbit) and the associated footprints, as well as search by image name. It can focus the search by mission phase, resolution or instrument. Imagery archived in the PDS is generally provided by missions in a single or narrow range of formats. To enhance the value and usability of these data to researchers, SBIB makes them available in the original formats as well as PNG, JPEG and ArcGIS-compatible ISIS cubes [3]. Additionally, we provide header files for the VIR cubes so they can be read into ENVI without additional processing. Finally, we also provide both camera-based and map-projected products with geometric data embedded for use within ArcGIS and ISIS. We use the Gaskell shape model for terrain projections [4]. There are several other outstanding data analysis tools that have access to asteroid and comet data: JAsteroid (a derivative of JMARS [5]) and the Applied Physics Laboratory's Small Body Mapping Tool [6]. The SBIB has specifically focused on providing data in the easiest manner possible rather than trying to be an analytical tool.
Modelling the urban water cycle as an integrated part of the city: a review.
Urich, Christian; Rauch, Wolfgang
2014-01-01
In contrast to common perceptions, the urban water infrastructure system is a complex and dynamic system that is constantly evolving and adapting to changes in the urban environment, to sustain existing services and provide additional ones. Instead of simplifying urban water infrastructure to a static system that is decoupled from its urban context, new management strategies use the complexity of the system to their advantage by integrating centralised with decentralised solutions and explicitly embedding water systems into their urban form. However, to understand and test possible adaptation strategies, urban water modelling tools are required to support exploration of their effectiveness as the human-technology-environment system coevolves under different future scenarios. The urban water modelling community has taken first steps to developing these new modelling tools. This paper critically reviews the historical development of urban water modelling tools and provides a summary of the current state of integrated modelling approaches. It reflects on the challenges that arise through the current practice of coupling urban water management tools with urban development models and discusses a potential pathway towards a new generation of modelling tools.
Center for Corporate Climate Leadership Leveraging Third-party Programs for Supplier Outreach
Third-party programs maximize efficient use of resources by helping companies request and analyze emissions information from suppliers and then provide suppliers with additional tools to develop their own GHG inventories and manage their GHG emissions.
Planetary Geologic Mapping Python Toolbox: A Suite of Tools to Support Mapping Workflows
NASA Astrophysics Data System (ADS)
Hunter, M. A.; Skinner, J. A.; Hare, T. M.; Fortezzo, C. M.
2017-06-01
The collective focus of the Planetary Geologic Mapping Python Toolbox is to provide researchers with additional means to migrate legacy GIS data, assess the quality of data and analysis results, and simplify common mapping tasks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pavlak, Gregory S.; Henze, Gregor P.; Hirsch, Adam I.
This paper demonstrates an energy signal tool to assess the system-level and whole-building energy use of an office building in downtown Denver, Colorado. The energy signal tool uses a traffic light visualization to alert a building operator to energy use which is substantially different from expected. The tool selects which light to display for a given energy end-use by comparing measured energy use to expected energy use, accounting for uncertainty. A red light is only displayed when a fault is likely enough, and abnormal operation costly enough, that taking action will yield the lowest cost result. While the theoretical advances and tool development were reported previously, the tool had only been tested using a basic building model and had not, until now, been experimentally verified. Expected energy use for the field demonstration is provided by a compact reduced-order representation of the Alliance Center, generated from a detailed DOE-2.2 energy model. Actual building energy consumption data are taken from the summer of 2014 for the office building immediately after a significant renovation project. The purpose of this paper is to demonstrate a first look at the building following its major renovation compared to the design intent. The tool indicated strong under-consumption in lighting and plug loads and strong over-consumption in HVAC energy consumption, which prompted several focused actions for follow-up investigation. In addition, this paper illustrates the application of Bayesian inference to estimate posterior parameter probability distributions from measured data. Practical discussion of the application is provided, along with additional findings from further investigating the significant difference between expected and actual energy consumption.
BiNChE: a web tool and library for chemical enrichment analysis based on the ChEBI ontology.
Moreno, Pablo; Beisken, Stephan; Harsha, Bhavana; Muthukrishnan, Venkatesh; Tudose, Ilinca; Dekker, Adriano; Dornfeldt, Stefanie; Taruttis, Franziska; Grosse, Ivo; Hastings, Janna; Neumann, Steffen; Steinbeck, Christoph
2015-02-21
Ontology-based enrichment analysis aids in the interpretation and understanding of large-scale biological data. Ontologies are hierarchies of biologically relevant groupings. Using ontology annotations, which link ontology classes to biological entities, enrichment analysis methods assess whether there is a significant over- or under-representation of entities for ontology classes. While many tools exist that run enrichment analysis for protein sets annotated with the Gene Ontology, only a few can be used for small-molecule enrichment analysis. We describe BiNChE, an enrichment analysis tool for small molecules based on the ChEBI Ontology. BiNChE displays an interactive graph that can be exported as a high-resolution image or in network formats. The tool provides plain, weighted and fragment analysis based on either the ChEBI Role Ontology or the ChEBI Structural Ontology. BiNChE aids in the exploration of large sets of small molecules produced within metabolomics or other systems biology research contexts. The open-source tool provides easy and highly interactive web access to enrichment analysis with the ChEBI ontology, and is additionally available as a standalone library.
Nagata, Tomohisa; Mori, Koji; Aratake, Yutaka; Ide, Hiroshi; Ishida, Hiromi; Nobori, Junichiro; Kojima, Reiko; Odagami, Kiminori; Kato, Anna; Tsutsumi, Akizumi; Matsuda, Shinya
2014-01-01
The aim of the present study was to develop standardized cost estimation tools that provide information to employers about occupational safety and health (OSH) activities for effective and efficient decision making in Japanese companies. We interviewed OSH staff members, including full-time professional occupational physicians, to list all OSH activities. Using activity-based costing, cost data were obtained from retrospective analyses of occupational safety and health costs over a 1-year period in three manufacturing workplaces, and from retrospective analyses of occupational health services costs in four manufacturing workplaces. We additionally verified the tools in four workplaces, including service businesses. We created the OSH and occupational health standardized cost estimation tools. OSH costs consisted of personnel costs, expenses, outsourcing costs and investments for 15 OSH activities. The tools provided accurate, relevant information on OSH activities and occupational health services. The standardized information obtained from our OSH and occupational health cost estimation tools can be used to manage OSH costs, make comparisons of OSH costs between companies and organizations, and help occupational health physicians and employers determine the best course of action.
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.
1993-01-01
Over the past several years, the primary goal of this grant has been to design and implement software to be used in the conceptual design of aerospace vehicles. The work carried out under this grant was performed jointly with members of the Vehicle Analysis Branch (VAB) of NASA LaRC, Computer Sciences Corp., and Vigyan Corp., and has resulted in the development of several packages and design studies. Primary among these are the interactive geometric modeling tool, the Solid Modeling Aerospace Research Tool (smart), and the integration and execution tools provided by the Environment for Application Software Integration and Execution (EASIE). In addition, the personnel of this grant provide consultation in the areas of structural design, algorithm development, and software development and implementation, particularly in computer-aided design, geometric surface representation, and parallel algorithms.
MacEwan, Matthew J; Dudek, Nancy L; Wood, Timothy J; Gofton, Wade T
2016-01-01
CONSTRUCT: The Ottawa Surgical Competency Operating Room Evaluation (O-SCORE) is a 9-item surgical evaluation tool designed to assess technical competence in surgical trainees using behavioral anchors. The initial development of the O-SCORE produced evidence for valid results. Further work is required to determine whether the use of a single surgeon or an unblinded rater introduces bias. In addition, the relationship of the O-SCORE to other currently used technical assessment tools should be explored to provide validity evidence related to other measures. We designed this project to provide continued validity evidence for the O-SCORE on these two issues. Nineteen residents and 2 staff orthopedic surgeons from the University of Ottawa volunteered to participate in a two-part OSCE-style station. Participants completed a written questionnaire followed by a videotaped 10-minute simulated open reduction and internal fixation of a midshaft radius fracture. Videos were rated individually by 2 blinded staff orthopedic surgeons using an Objective Structured Assessment of Technical Skills (OSATS) global rating scale, an OSATS checklist, and the O-SCORE, in random order. O-SCORE results were sensitive to surgical training level even when raters were blinded, providing further evidence of validity. Strong agreement between the two independent observers suggests that the measure captures a performance easily recognized by surgical observers. Ratings on the O-SCORE were also strongly associated with global ratings on the most extensively validated technical evaluation tool currently available (OSATS). Collectively, these results suggest that the O-SCORE generates accurate, reproducible, and meaningful results when used in a randomized and blinded fashion, providing continued validity evidence for using this tool to evaluate surgical trainee competence.
Computational medicinal chemistry in fragment-based drug discovery: what, how and when.
Rabal, Obdulia; Urbano-Cuadrado, Manuel; Oyarzabal, Julen
2011-01-01
The use of fragment-based drug discovery (FBDD) has increased in the last decade due to the encouraging results obtained to date. In this scenario, computational approaches, together with experimental information, play an important role in guiding and speeding up the process. By default, FBDD is generally considered a constructive approach. However, such additive behavior is not always present; therefore, simple fragment maturation will not always deliver the expected results. In this review, computational approaches utilized in FBDD are reported together with real case studies in which their applicability domains are exemplified, in order to analyze them and thus maximize their performance and reliability. Proper use of these computational tools can minimize misleading conclusions, preserving the credibility of the FBDD strategy, and achieve higher impact in the drug-discovery process. FBDD goes one step beyond a simple constructive approach. A broad set of computational tools (docking, R-group quantitative structure-activity relationships, fragmentation tools, fragment-management tools, patent analysis and fragment hopping, for example) can be utilized in FBDD, providing a clear positive impact if they are applied in the proper scenario: what, how and when. An initial assessment of additive/non-additive behavior is a critical point in defining the most convenient approach for fragment elaboration.
Extended version of the "Sniffin' Sticks" identification test: test-retest reliability and validity.
Sorokowska, A; Albrecht, E; Haehner, A; Hummel, T
2015-03-30
The extended, 32-item version of the Sniffin' Sticks identification test was developed to create a precise tool enabling repeated, longitudinal testing of individual olfactory subfunctions. Odors of the previous test version had to be changed for technical reasons, and the odor identification test therefore needed re-investigation in terms of reliability, validity, and normative values. In our study we investigated the olfactory abilities of 100 patients with olfactory dysfunction and 100 controls. We reconfirmed the high test-retest reliability of the extended version of the Sniffin' Sticks identification test and the high correlations between the new and the original parts of this tool. In addition, we confirmed the validity of the test, as it discriminated clearly between controls and patients with olfactory loss. The additional set of 16 odor identification sticks can either be included in the current olfactory test, creating a more detailed diagnostic tool, or used separately, enabling olfactory function to be followed over time. Additionally, the normative values presented in our paper may provide useful guidelines for interpreting the results of the extended identification test. The revised version of the Sniffin' Sticks 32-item odor identification test is a reliable and valid tool for the assessment of olfactory function. Copyright © 2015 Elsevier B.V. All rights reserved.
ARX - A Comprehensive Tool for Anonymizing Biomedical Data
Prasser, Fabian; Kohlmayer, Florian; Lautenschläger, Ronald; Kuhn, Klaus A.
2014-01-01
Collaboration and data sharing have become core elements of biomedical research. Especially when sensitive data from distributed sources are linked, privacy threats have to be considered. Statistical disclosure control allows the protection of sensitive data by introducing fuzziness. Reduction of data quality, however, needs to be balanced against gains in protection. Therefore, tools are needed which provide a good overview of the anonymization process to those responsible for data sharing. These tools require graphical interfaces and the use of intuitive and replicable methods. In addition, extensive testing, documentation and openness to reviews by the community are important. Existing publicly available software is limited in functionality, and often active support is lacking. We present ARX, an anonymization tool that i) implements a wide variety of privacy methods in a highly efficient manner, ii) provides an intuitive cross-platform graphical interface, iii) offers a programming interface for integration into other software systems, and iv) is well documented and actively supported. PMID:25954407
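As an illustration of one privacy model that anonymization tools of this kind implement, the following is a generic k-anonymity check in Python; it is not ARX's actual API (ARX exposes a Java interface), and the sample records are hypothetical.

```python
# A generic illustration of k-anonymity, one of the privacy models a tool
# like ARX implements; this is not ARX's actual (Java) API.
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every combination of quasi-identifier values occurs in at
    least k records (i.e., each equivalence class has size >= k)."""
    classes = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in classes.values())

rows = [
    {"zip": "537**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "537**", "age": "30-39", "diagnosis": "asthma"},
    {"zip": "537**", "age": "40-49", "diagnosis": "flu"},
]
print(is_k_anonymous(rows, ["zip", "age"], k=2))  # False: one class of size 1
```

In practice, generalization (e.g., masking ZIP digits or binning ages, as in the rows above) is applied iteratively until a check like this passes while data quality loss stays acceptable.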
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrow, William R.; Shehabi, Arman; Smith, Sarah
The LIGHTEnUP Analysis Tool (Lifecycle Industry GreenHouse gas, Technology and Energy through the Use Phase) has been developed for the United States Department of Energy's (U.S. DOE) Advanced Manufacturing Office (AMO) to forecast both the manufacturing-sector and product life-cycle energy consumption implications of manufactured products across the U.S. economy. The tool architecture incorporates publicly available historical and projection datasets of U.S. economy-wide energy use, including manufacturing, building operations, electricity generation and transportation. The tool requires minimal inputs to define alternate scenarios against business-as-usual projection data. The tool is not an optimization or equilibrium model and therefore does not select technologies or deployment scenarios endogenously. Instead, inputs are developed exogenously by the user to reflect detailed engineering calculations, future targets and goals, or creative insights. The tool projects the scenario's energy, CO2 emissions, and energy expenditure (i.e., economic spending to purchase energy) implications and provides documentation to communicate results. The tool provides a transparent and uniform system for comparing the manufacturing and use-phase impacts of technologies, and allows the user to create multiple scenarios reflecting a range of possible future outcomes. Reasonable scenarios, however, require careful attention to assumptions and details about the future. This tool is part of an emerging set of AMO life cycle analysis (LCA) tools, such as the Material Flows through Industry (MFI) tool and the Additive Manufacturing LCA tool.
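A minimal sketch of the scenario accounting such a tool performs follows, assuming hypothetical business-as-usual and scenario trajectories; the real tool draws its baseline from the public datasets named above.

```python
# A minimal sketch of LIGHTEnUP-style scenario accounting: the user supplies
# an alternate trajectory exogenously, and the tool reports the difference
# from the business-as-usual (BAU) projection. All values are hypothetical.
bau_twh      = {2020: 100.0, 2030: 110.0, 2040: 120.0}   # BAU energy use
scenario_twh = {2020: 100.0, 2030: 104.0, 2040: 108.0}   # user-defined scenario

kg_co2_per_kwh = 0.4    # assumed grid emission factor
usd_per_kwh    = 0.07   # assumed energy price

for year in sorted(bau_twh):
    saved_twh = bau_twh[year] - scenario_twh[year]
    co2_mt = saved_twh * 1e9 * kg_co2_per_kwh / 1e9   # kg -> megatonnes
    usd_b = saved_twh * 1e9 * usd_per_kwh / 1e9       # dollars -> billions
    print(f"{year}: {saved_twh:.1f} TWh saved, "
          f"{co2_mt:.1f} Mt CO2 avoided, ${usd_b:.2f}B spending avoided")
```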
Comparative analysis and visualization of multiple collinear genomes
2012-01-01
Background Genome browsers are a common tool used by biologists to visualize genomic features including genes, polymorphisms, and many others. However, existing genome browsers and visualization tools are not well-suited to perform meaningful comparative analysis among a large number of genomes. With the increasing quantity and availability of genomic data, there is an increased burden to provide useful visualization and analysis tools for comparison of multiple collinear genomes such as the large panels of model organisms which are the basis for much of the current genetic research. Results We have developed a novel web-based tool for visualizing and analyzing multiple collinear genomes. Our tool illustrates genome-sequence similarity through a mosaic of intervals representing local phylogeny, subspecific origin, and haplotype identity. Comparative analysis is facilitated through reordering and clustering of tracks, which can vary throughout the genome. In addition, we provide local phylogenetic trees as an alternate visualization to assess local variations. Conclusions Unlike previous genome browsers and viewers, ours allows for simultaneous and comparative analysis. Our browser provides intuitive selection and interactive navigation about features of interest. Dynamic visualizations adjust to scale and data content making analysis at variable resolutions and of multiple data sets more informative. We demonstrate our genome browser for an extensive set of genomic data sets composed of almost 200 distinct mouse laboratory strains. PMID:22536897
Huang, Camillan
2003-01-01
Technology has created a new dimension for visual teaching and learning with web-delivered interactive media. The Virtual Labs Project has embraced this technology, with instructional design and evaluation methodologies behind simPHYSIO, a suite of simulation-based, online interactive teaching modules in physiology for Stanford students. In addition, simPHYSIO provides the convenience of anytime web access and a modular structure that allows personalization and customization of the learning material. This innovative tool provides a solid delivery and pedagogical backbone that can be applied to developing an interactive, simulation-based training tool for the use and management of the Picture Archiving and Communication System (PACS) image information system. The disparity in knowledge between health and IT professionals can be bridged by providing convenient modular teaching tools to fill the gaps. Because PACS has become widely distributed, with many interfaces, components, and customizations, an innovative method for teaching the system as a whole is necessary for its successful implementation and operation. This paper discusses techniques for developing an interactive teaching tool, a case study of its implementation, and a perspective on applying this approach to an online PACS training tool. Copyright 2002 Elsevier Science Ltd.
ProteoWizard: open source software for rapid proteomics tools development.
Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag
2008-11-01
The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers for the mzML data format, has been written using modern C++ techniques and design principles, and supports a variety of platforms with native compilers. The software has been released under the Apache v2 license specifically to ensure that it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e., GCC on Linux, MSVC on Windows and XCode on OS X) is available for download free of charge at http://proteowizard.sourceforge.net. This website also provides code examples and documentation. It is our hope that the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.
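For readers who want to see what unified access to the mzML format looks like from a script, the sketch below reads a hypothetical mzML file; it uses the independent pyteomics package purely as an illustration, not ProteoWizard's own C++ libraries.

```python
# Illustration of consuming the mzML format that ProteoWizard reads and
# writes. ProteoWizard itself is a C++ library; this sketch uses the
# separate pyteomics package just to show the shape of mzML spectral data.
from pyteomics import mzml

with mzml.read("example.mzML") as reader:   # hypothetical file name
    for spectrum in reader:
        mz = spectrum["m/z array"]
        intensity = spectrum["intensity array"]
        print(spectrum["id"], len(mz), intensity.max())
        break  # just inspect the first spectrum
```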
NASA Astrophysics Data System (ADS)
Vaishali, S.; Narendranath, S.; Sreekumar, P.
An IDL (Interactive Data Language) based widget application, developed for the calibration of the C1XS instrument on Chandrayaan-1 (Narendranath et al., 2010), has been modified to provide a generic package for the analysis of data from X-ray detectors. The package supports files in ASCII as well as FITS format. Data can be fitted with a list of built-in functions to derive the spectral redistribution function (SRF). We have incorporated functions such as HYPERMET (Philips & Marlow 1976), including non-Gaussian components of the SRF such as the low-energy tail, low-energy shelf and escape peak. In addition, users can incorporate further models that may be required to describe detector-specific features. Spectral fits use the routine mpfit, which implements the Levenberg-Marquardt least-squares fitting method. The SRF derived from this tool can be fed into an accompanying program to generate a redistribution matrix file (RMF) compatible with the X-ray spectral analysis package XSPEC. The tool provides a user-friendly interface helpful to beginners, and also provides transparency and advanced features for experts.
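A minimal sketch of HYPERMET-style line-shape fitting follows, with scipy's Levenberg-Marquardt routine standing in for IDL's mpfit; only a Gaussian plus an exponential low-energy tail is modeled, and the data are simulated.

```python
# A minimal sketch of HYPERMET-style SRF fitting via Levenberg-Marquardt
# least squares (scipy stands in for the IDL `mpfit` used by the tool).
# Only the Gaussian photopeak plus exponential low-energy tail is shown;
# a full SRF adds the shelf and escape-peak terms.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

def hypermet_core(E, A, E0, sigma, T, beta):
    gauss = A * np.exp(-((E - E0) ** 2) / (2.0 * sigma ** 2))
    # Exponential tail below the peak, smoothly joined through erfc
    tail = T * np.exp((E - E0) / beta) * erfc(
        (E - E0) / (np.sqrt(2) * sigma) + sigma / (np.sqrt(2) * beta))
    return gauss + tail

# Fit a simulated 5.9 keV Mn K-alpha peak (hypothetical data)
E = np.linspace(5.0, 6.8, 200)
truth = hypermet_core(E, 1000, 5.9, 0.06, 40, 0.3)
counts = np.random.poisson(truth).astype(float)
popt, pcov = curve_fit(hypermet_core, E, counts,
                       p0=[900, 5.88, 0.05, 30, 0.25])  # initial guesses
print(dict(zip(["A", "E0", "sigma", "T", "beta"], popt)))
```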
This map service displays all air-related layers used in the USEPA Community/Tribal-Focused Exposure and Risk Screening Tool (C/T-FERST) mapping application (https://www.epa.gov/c-ferst). The following data sources (and layers) are contained in this service: USEPA's 2005 National-Scale Air Toxic Assessment (NATA) data. Data are shown at the census tract level (2000 census tract boundaries, US Census Bureau) for Cumulative Cancer and Non-Cancer risks (Neurological and Respiratory) from 139 air toxics. In addition, individual pollutant estimates of Ambient Concentration, Exposure Concentration, Cancer, and Non-Cancer risks (Neurological and Respiratory) are provided for: Acetaldehyde, Acrolein, Arsenic, Benzene, 1,3-Butadiene, Chromium, Diesel PM, Formaldehyde, Lead, Naphthalene, and Polycyclic Aromatic Hydrocarbon (PAH). The original Access tables were downloaded from USEPA's Office of Air and Radiation (OAR): https://www.epa.gov/national-air-toxics-assessment/2005-national-air-toxics-assessment. The data classification (defined interval) for this map service was developed for USEPA's Office of Research and Development's (ORD) Community-Focused Exposure and Risk Screening Tool (C-FERST) per guidance provided by OAR. The 2005 NATA provides information on 177 of the 187 Clean Air Act air toxics (https://www.epa.gov/sites/production/files/2015-10/documents/2005-nata-pollutants.pdf) plus diesel particulate matter (diesel PM was assessed for non-cancer only).
Fuzzy Logic as a Tool for Assessing Students' Knowledge and Skills
ERIC Educational Resources Information Center
Voskoglou, Michael Gr.
2013-01-01
Fuzzy logic, which is based on fuzzy sets theory introduced by Zadeh in 1965, provides a rich and meaningful addition to standard logic. The applications which may be generated from or adapted to fuzzy logic are wide-ranging and provide the opportunity for modeling under conditions which are imprecisely defined. In this article we develop a fuzzy…
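Although the abstract above is truncated, the approach it describes rests on fuzzy membership functions. As a generic illustration of that machinery (not the specific model developed in the article), triangular membership functions can turn a crisp score into degrees of membership in linguistic grades:

```python
# A generic example of the basic fuzzy-logic ingredient such assessment
# models build on: membership functions mapping a crisp score to degrees
# of membership in linguistic grade categories. Breakpoints are hypothetical.
def triangular(x, a, b, c):
    """Triangular membership: 0 at a and c, 1 at the apex b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

grades = {
    "poor":      lambda s: triangular(s, 0, 25, 50),
    "fair":      lambda s: triangular(s, 25, 50, 75),
    "good":      lambda s: triangular(s, 50, 75, 100),
    "excellent": lambda s: triangular(s, 75, 100, 125),  # right shoulder
}

score = 68.0
print({g: round(mu(score), 2) for g, mu in grades.items()})
# A score of 68 is partly "fair" (0.28) and mostly "good" (0.72),
# rather than falling into one crisp grade.
```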
Comprehensive BRL-CAD Primitive Database
2015-03-01
The database provides the target describers of BRL-CAD with a representative example of each primitive's shape and its properties. In addition, after the database was completed, a tool was created to generate primitive shapes automatically for use by target describers (CAD experts).
Mouse Models for Drug Discovery. Can New Tools and Technology Improve Translational Power?
Zuberi, Aamir; Lutz, Cathleen
2016-01-01
The use of mouse models in biomedical research and preclinical drug evaluation is on the rise. The advent of new molecular genome-altering technologies such as CRISPR/Cas9 allows genetic mutations to be introduced into the germ line of a mouse faster and less expensively than with previous methods. In addition, rapid progress in the development and use of somatic transgenesis using viral vectors, as well as manipulations of gene expression with siRNAs and antisense oligonucleotides, allows even greater exploration into genomics and systems biology. These technological advances come at a time when cost reductions in genome sequencing have led to the identification of pathogenic mutations in patient populations, providing unprecedented opportunities for the use of mice to model human disease. The ease of genetic engineering in mice also offers a potential paradigm shift in resource sharing and in the speed with which models are made available in the public domain. The knowledge alone that a model can be quickly remade should provide relief to resources encumbered by licensing and Material Transfer Agreements. For decades, mouse strains have provided an exquisite experimental tool for studying the pathophysiology of disease and assessing therapeutic options in a genetically defined system. However, a major limitation of the mouse has been the limited genetic diversity of common laboratory mice. This has been overcome with the recent development of the Collaborative Cross and Diversity Outbred mice, strains that provide new tools capable of capturing genetic diversity approaching that found in human populations. The Collaborative Cross and Diversity Outbred strains thus provide a means to observe and characterize the toxicity or efficacy of new therapeutic drugs across a given population. The combination of traditional and contemporary mouse genome-editing tools, along with the addition of genetic diversity in new modeling systems, is synergistic and serves to make the mouse a better model for biomedical research, enhancing the potential for preclinical drug discovery and personalized medicine. PMID:28053071
Interferometric correction system for a numerically controlled machine
Burleson, Robert R.
1978-01-01
An interferometric correction system for a numerically controlled machine is provided to improve the positioning accuracy of a machine tool, for example in a high-precision numerically controlled machine. A laser interferometer feedback system monitors the positioning of the machine tool, which is moved by command pulses applied to a positioning system. The correction system compares the commanded position, as indicated by the command pulse train applied to the positioning system, with the actual position of the tool as monitored by the laser interferometer. If the tool position lags the commanded position by more than a preselected error, additional pulses are added to the pulse train applied to the positioning system to advance the tool closer to the commanded position, thereby reducing the lag error. If the actual tool position leads the commanded position and the advance error exceeds the preselected error magnitude, pulses are deleted from the pulse train to correct the position error of the tool relative to the commanded position.
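The correction logic described above can be summarized in a few lines; the sketch below simulates it in Python, with the pulse size and error threshold as hypothetical values.

```python
# A minimal simulation of the correction logic the abstract describes:
# compare the commanded position (integrated command pulses) with the
# interferometer reading, and add or delete pulses once the error exceeds
# a preselected threshold. Units and thresholds are hypothetical.
PULSE = 0.5          # tool travel per pulse, micrometres
ERROR_LIMIT = 2.0    # preselected error magnitude, micrometres

def correct(commanded_um, measured_um):
    """Return pulses to inject (+) into or delete (-) from the pulse train."""
    error = commanded_um - measured_um
    if abs(error) <= ERROR_LIMIT:
        return 0                      # within tolerance: no correction
    return round(error / PULSE)       # lagging -> add, leading -> delete

print(correct(100.0, 96.5))   # tool lags by 3.5 um -> +7 pulses
print(correct(100.0, 103.0))  # tool leads by 3.0 um -> -6 pulses
```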
Wang, Yinghua; Yan, Jiaqing; Wen, Jianbin; Yu, Tao; Li, Xiaoli
2016-01-01
Objects: Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and Brain Function Mapping (BFM) visualization is still lacking. In this study, we developed a BFM Tool, which facilitates electrode position registration and BFM visualization, with an application to epilepsy surgeries. Methods: The BFM Tool mainly utilizes electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as the node size/color, edge color/thickness, mapping method, can be adjusted easily using the setting panel. Moreover, users may manually import/export location and connectivity data to generate figures for further application. The role of this software is demonstrated by a clinical study of language area localization. Results: The BFM Tool helps clinical doctors and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Conclusions: Our tool provides convenient electrode registration, easy brain function visualization, and has good performance. It is clinical-oriented and is easy to deploy and use. The BFM tool is suitable for epilepsy and other clinical iEEG applications. PMID:27199729
Object oriented studies into artificial space debris
NASA Technical Reports Server (NTRS)
Adamson, J. M.; Marshall, G.
1988-01-01
A prototype simulation is being developed under contract to the Royal Aerospace Establishment (RAE), Farnborough, England, to assist in the discrimination of artificial space objects/debris. The methodology undertaken has been to link Object Oriented programming, intelligent knowledge based system (IKBS) techniques and advanced computer technology with numeric analysis to provide a graphical, symbolic simulation. The objective is to provide an additional layer of understanding on top of conventional classification methods. Use is being made of object and rule based knowledge representation, multiple reasoning, truth maintenance and uncertainty. Software tools being used include Knowledge Engineering Environment (KEE) and SymTactics for knowledge representation. Hooks are being developed within the SymTactics framework to incorporate mathematical models describing orbital motion and fragmentation. Penetration and structural analysis can also be incorporated. SymTactics is an Object Oriented discrete event simulation tool built as a domain specific extension to the KEE environment. The tool provides facilities for building, debugging and monitoring dynamic (military) simulations.
Distributed visualization of gridded geophysical data: the Carbon Data Explorer, version 0.2.3
NASA Astrophysics Data System (ADS)
Endsley, K. A.; Billmire, M. G.
2016-01-01
Due to the proliferation of geophysical models, particularly climate models, the increasing resolution of their spatiotemporal estimates of Earth system processes, and the desire to easily share results with collaborators, there is a genuine need for tools to manage, aggregate, visualize, and share data sets. We present a new, web-based software tool - the Carbon Data Explorer - that provides these capabilities for gridded geophysical data sets. While originally developed for visualizing carbon flux, this tool can accommodate any time-varying, spatially explicit scientific data set, particularly NASA Earth system science level III products. In addition, the tool's open-source licensing and web presence facilitate distributed scientific visualization, comparison with other data sets and uncertainty estimates, and data publishing and distribution.
Grants4Targets - an innovative approach to translate ideas from basic research into novel drugs.
Lessl, Monika; Schoepe, Stefanie; Sommer, Anette; Schneider, Martin; Asadullah, Khusru
2011-04-01
Collaborations between industry and academia are steadily gaining importance. To combine expertise, Bayer Healthcare has set up a novel open-innovation approach called Grants4Targets. Ideas on novel drug targets can easily be submitted to http://www.grants4targets.com. After a review process, grants are provided to perform focused experiments to further validate the proposed targets. In addition to financial support, specific know-how on target validation and drug discovery is provided: experienced scientists are nominated as project partners and, depending on the project, tools or specific models are made available. Around 280 applications have been received and 41 projects granted. In our experience, this type of bridging fund, combined with joint efforts, provides a valuable tool to foster drug-discovery collaborations. Copyright © 2010 Elsevier Ltd. All rights reserved.
Manufacturing and metrology for IR conformal windows and domes
NASA Astrophysics Data System (ADS)
Ferralli, Ian; Blalock, Todd; Brunelle, Matt; Lynch, Timothy; Myer, Brian; Medicus, Kate
2017-05-01
Freeform and conformal optics have the potential to dramatically improve optical systems by enabling designs with fewer optical components, reduced aberrations, and improved aerodynamic performance. These optical components differ from standard components in their surface shape, typically a non-symmetric, equation-based definition, and in their material properties. Traditional grinding and polishing tools are unable to handle these freeform shapes, and standard metrology tools cannot measure these surfaces. Desired substrates are typically hard ceramics, including polycrystalline alumina or aluminum oxynitride. Beyond the challenges this hardness poses for manufacturing, these crystalline materials can be highly susceptible to grain decoration, creating unacceptable scatter in optical systems. In this presentation, we will show progress toward addressing the unique challenges of manufacturing conformal windows and domes. Particular attention is given to our robotic polishing platform, based on an industrial robot adapted to accept a wide range of tooling and parts. The robot's flexibility has given us an opportunity to address the unique challenges of conformal windows: slurries and polishing active layers can easily be changed to adapt to varying materials and address grain decoration, and tool size and shape can be changed to suit the varying sizes and shapes of conformal optics. In addition, the robotic platform can serve as the base for a deflectometry-based metrology tool to measure surface form error. This system, whose precision is independent of the robot's positioning accuracy, will allow us to measure optics in situ, saving time and reducing part risk. In conclusion, we will show examples of conformal windows manufactured using the processes we have developed.
Improving evaluation at two medical schools.
Schiekirka-Schwake, Sarah; Dreiling, Katharina; Pyka, Katharina; Anders, Sven; von Steinbüchel, Nicole; Raupach, Tobias
2017-08-03
Student evaluations of teaching can provide useful feedback for teachers and programme coordinators alike. We designed a novel evaluation tool assessing teacher performance and student learning outcome, and implemented it at two German medical schools. In this article, we report student and teacher perceptions of the novel tool and of the implementation process. Focus group discussions and one-to-one interviews involving 22 teachers and 31 undergraduate medical students were conducted. Following adjustments to the feedback reports (e.g. the colour coding of results) at one medical school, 42 teachers were asked about their perceptions of the revised report and the personal benefit of the evaluation tool. Teachers appreciated the individual feedback provided by the evaluation tool and stated that they wanted to improve their teaching based on the results; however, they missed most of the preparative communication. Students were unsure about the additional benefit of the instrument compared with traditional evaluation tools. A majority were unwilling to complete evaluation forms in their spare time, and some felt that the new questionnaire was too long and that the evaluations occurred too often. They were particularly interested in feedback on how their comments had helped to further improve teaching. Despite evidence of the utility of the tool for individual teachers, implementation of changes to the process of evaluation appears to have been suboptimal, mainly owing to a perceived lack of communication. In order to motivate students to provide evaluation data, feedback loops including aims and consequences should be established. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
ERIC Educational Resources Information Center
Isakson, Carol
2004-01-01
Search engines rapidly add new services and experimental tools in trying to outmaneuver each other for customers. In this article, the author describes the latest additional services of some search engines and provides their sources. The author also offers tips for using these new search upgrades.
Remediating soils: Designing biochars to meet the need
Biochar, the porous, carbon-rich product of pyrolysis, may provide an additional tool for remediating both metal- and organic-contaminated soils and for reducing other soil limitations. Soils that are contaminated with metals or organics, or limited in some other way, are a worldwide problem...
A survey of tools for the analysis of quantitative PCR (qPCR) data.
Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas
2014-09-01
Real-time quantitative polymerase chain reaction (qPCR) is a standard technique in most laboratories, used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows tools, 5 web-based tools, 9 R-based packages and 5 tools for other platforms. The reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
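Among the quantification strategies such tools implement, the 2^(-ΔΔCq) (Livak) method is the most common. A minimal sketch, with hypothetical Cq values:

```python
# The most common relative-quantification strategy covered by such tools:
# the 2^(-ddCq) (Livak) method. All Cq values below are hypothetical.
def fold_change(cq_target_treated, cq_ref_treated,
                cq_target_control, cq_ref_control):
    d_cq_treated = cq_target_treated - cq_ref_treated   # normalize to reference gene
    d_cq_control = cq_target_control - cq_ref_control
    dd_cq = d_cq_treated - d_cq_control
    return 2.0 ** (-dd_cq)   # assumes ~100% amplification efficiency

# Target gene Cq 24.1 vs reference 18.0 in the treated sample,
# 26.5 vs 18.2 in the control: ~4.6-fold up-regulation.
print(fold_change(24.1, 18.0, 26.5, 18.2))
```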
Analysis of Facial Injuries Caused by Power Tools.
Kim, Jiye; Choi, Jin-Hee; Hyun Kim, Oh; Won Kim, Sug
2016-06-01
The number of injuries caused by power tools is steadily increasing as more domestic woodwork is undertaken and more power tools are used recreationally. Injuries caused by power-tool accidents are an issue because they can lead to substantial costs for patients and the national insurance system. Previous reports have described the increase in hand surgery as a consequence of power-tool use, its economic impact, and the characteristics of hand injuries caused by power saws. In recent years, the authors have noticed that, in addition to hand injuries, facial injuries caused by power tools commonly present to the emergency room. This study reviewed data on facial injuries caused by power saws, gathered from patients who visited the trauma center at our hospital over the last 4 years, and analyzed the incidence and epidemiology of these injuries. The authors found that facial injuries caused by power tools have risen continually. Such injuries are accidental, and they cause permanent facial disfigurement and functional disability. Accidents are almost inevitable in particular workplaces; however, most facial injuries could be avoided through sufficient operator training and through tool operators wearing suitable protective devices. The evaluation of the epidemiology and patterns of facial injuries caused by power tools in this study should provide the information required to reduce the number of accidental injuries.
Avionics System Architecture Tool
NASA Technical Reports Server (NTRS)
Chau, Savio; Hall, Ronald; Traylor, marcus; Whitfield, Adrian
2005-01-01
Avionics System Architecture Tool (ASAT) is a computer program intended for use during the avionics-system-architecture design phase of the process of designing a spacecraft for a specific mission. ASAT enables simulation of the dynamics of the command-and-data-handling functions of the spacecraft avionics in the scenarios in which the spacecraft is expected to operate. ASAT is built upon I-Logix Statemate MAGNUM, which provides a complement of dynamic system modeling tools, including a graphical user interface (GUI), model-checking capabilities, and a simulation engine. ASAT augments this with a library of predefined avionics components and additional software to support building and analyzing avionics hardware architectures using these components.
Development of a high-temperature diagnostics-while-drilling tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavira, David J.; Huey, David; Hetmaniak, Chris
2009-01-01
The envisioned benefits of Diagnostics-While-Drilling (DWD) are based on the principle that high-speed, real-time information from the downhole environment will promote better control of the drilling process. Although in practice a DWD system could provide information related to any aspect of exploration and production of subsurface resources, the current DWD system provides data on drilling dynamics. This particular set of new tools provided by DWD will allow quicker detection of problems, reduce drilling flat-time and facilitate more efficient drilling (drilling optimization), with the overarching result of decreased drilling costs. In addition to providing the driller with an improved, real-time picture of the drilling conditions downhole, data generated from DWD systems provide researchers with valuable, high-fidelity data sets necessary for developing and validating an enhanced understanding of the drilling process. Toward this end, the availability of DWD creates a synergy with other Sandia Geothermal programs, such as the hard-rock bit program, where the introduction of alternative rock-reduction technologies is contingent on the reduction or elimination of damaging dynamic effects. The rationale for the program and early development efforts are described in more detail elsewhere [SAND2003-2069 and SAND2000-0239]. A first-generation low-temperature (LT) DWD system was fielded in a series of proof-of-concept (POC) tests to validate functionality. Using the LT system, DWD was subsequently used to support a single-laboratory/multiple-partner CRADA (Cooperative Research and Development Agreement) entitled Advanced Drag Bits for Hard-Rock Drilling. The drag-bit CRADA was established between Sandia and four bit companies, and involved testing of a PDC bit from each company [Wise, et al., 2003, 2004] in the same lithologic interval at the Gas Technology Institute (GTI) test facility near Catoosa, OK. In addition, the LT DWD system has been fielded in cost-sharing efforts with an industrial partner to support the development of new-generation hard-rock drag bits. Following the demonstrated success of the POC DWD system, efforts were initiated in FY05 to design, fabricate and test a high-temperature (HT) capable version of the DWD system, with a design temperature of 225 C. Programmatic requirements dictated that an HT DWD tool be developed during FY05 and that a working system be demonstrated before the end of FY05. During initial design discussions it was decided that, to the extent possible, the HT DWD system would maintain functionality similar to that of the low-temperature system; that is, it would also be designed to provide the driller with real-time information on bit and bottom-hole-assembly (BHA) dynamics while drilling. Additionally, because of time and fiscal constraints associated with the HT system development, the design of the HT DWD tool would follow that of the LT tool: the downhole electronics package would be contained in a concentrically located pressure barrel, and externally applied strain gages with thru-tool connectors would again be used. Also, to maximize the number of wells available to the HT DWD system and to allow better comparison with the low-temperature design, the diameter of the tool was maintained at 7 inches.
This report discusses the efforts associated with the development of a DWD system capable of sustained operation at 225 C. It documents work performed in the second phase of the Diagnostics-While-Drilling (DWD) project, in which a high-temperature (HT) version of the phase 1 low-temperature (LT) proof-of-concept (POC) DWD tool was built and tested. Descriptions of the design, fabrication and field testing of the HT tool are provided. Background on prior phases of the project can be found in SAND2003-2069 and SAND2000-0239.
Moffitt, Christine M.
2017-01-01
This project tested and revised a risk assessment/management tool, authored by Moffitt and Stockton, designed to give hatchery biologists and others a structure for measuring risk and tools to control, prevent or eliminate invasive New Zealand mudsnails (NZMS) and other invasive mollusks in fish hatcheries and hatchery operations. The document has two parts: the risk assessment tool, and an appendix that summarizes options for control or management. The framework of the guidance document combines approaches used by the Hazard Analysis and Critical Control Points (HACCP) process with those developed by the Commission for Environmental Cooperation (CEC) of Canada, Mexico, and the United States in the Tri-National Risk Assessment Guidelines for Aquatic Alien Invasive Species. The framework assesses risk potential from two activities: probability of infestation and consequences of infestation. Each activity is weighted equally in determining the risk potential. These two activities are divided into seven basic elements that utilize scientific, technical, and other relevant information in the risk assessment process. To determine the probability of infestation, four steps are scored and the scores averaged; this assessment follows a familiar HACCP process covering pathways of entry, entry potential, colonization potential, and spread potential. The economic, environmental and social consequences are considered as economic impact, environmental impact, and social and cultural influences. To test the document, the Principal Investigator identified interested hatchery managers through contacts at regional aquaculture meetings, fish health meetings, and the network of invasive species managers and scientists participating in the Western Regional Panel on Aquatic Nuisance Species, the 100th Meridian Initiative's Columbia River Basin Team, and the Western New Zealand Mudsnail Conference in Seattle. Targeted hatchery workshops were conducted with staff at Dworshak National Fish Hatchery Complex (ID), Similkameen Pond, Oroville, WA, and Ringold Springs State Hatchery (WA). As a result of communications with hatchery staff and invasive species managers, and of on-site assessments of hatchery facilities, the document was modified and enhanced, and additional resources were added to keep it up to date. The result is a simpler tool that can lead hatchery or management personnel through the risk assessment process and provide an introduction to risk management and communication. In addition to the typical HACCP processes, this tool adds steps to rate and consider uncertainty and the weight of evidence regarding options and monitoring results. Uncertainty of outcome exists for most tools that can be used to control or prevent NZMS or other invasive mollusks from infesting an area. The document also emphasizes that specific control tools and plans must be tailored to each specific setting, considering its economic, environmental and social influences.
From the testing and evaluation process there was a strong recognition that a number of control and prevention tools previously suggested in the literature, based on laboratory and small-scale trials, may not be compatible with regional and national regulations, economic constraints, social or cultural constraints, or the engineering and water-chemistry characteristics of each facility. The options for control are summarized in the second document, Review of Control Measures for Hatcheries Infested with NZMS (Appendix A), which provides sources for additional resources and specific tools, along with guidance on the feasibility and success of each approach. This tool also emphasizes that management plans need to be adaptive and incorporate oversight from professionals familiar with measuring the risks of fish diseases and treatments (e.g. fish health practitioners and water quality and effluent management teams). Finally, with such a team, the adaptive management approach must be ongoing and become a regular component of hatchery operations. Although it was the intent that this two-part document be included in the revised National Management and Control Plan for the NZMS proposed by the U.S. Fish and Wildlife Service (USFWS) and others, it is provided as a stand-alone document.
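A minimal sketch of the two-part risk calculation described above follows, with probability of infestation averaged over the four HACCP-style steps and weighted equally with consequences; the 0-4 scale and example scores are hypothetical.

```python
# A minimal sketch of the two-part risk calculation the document describes:
# probability of infestation is the average of four HACCP-style step scores,
# weighted equally with the averaged consequence scores. The 0-4 scale and
# the example scores are hypothetical illustrations.
def risk_potential(entry_pathways, entry, colonization, spread,
                   economic, environmental, social):
    probability = (entry_pathways + entry + colonization + spread) / 4.0
    consequences = (economic + environmental + social) / 3.0
    return (probability + consequences) / 2.0   # equal weighting

# Scores on a 0 (negligible) to 4 (high) scale for a hypothetical hatchery:
print(risk_potential(3, 3, 4, 2, economic=3, environmental=4, social=2))
```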
KSC technicians use propellant slump measurement tool on ATA SRM
NASA Technical Reports Server (NTRS)
1988-01-01
Kennedy Space Center (KSC) technicians use a new propellant slump measurement tool on the Assembly Test Article (ATA) aft solid rocket motor (SRM). The tool measures any slumping of the top of the solid rocket booster (SRB) solid propellant. Data gathered by this tool and others during the ATA test will be analyzed by SRM engineers. Astronaut Stephen S. Oswald, at far right (barely visible), and Morton Thiokol supervisor Howard Fichtl look on during the data-gathering process. The month-long ATA test is designed to evaluate the performance of new tools required to put the tighter-fitting redesigned SRM joints together. In addition, new procedures are being used and ground crews are receiving training in preparation for stacking the STS-26 flight set of motors. View provided by KSC with alternate number KSC-87PC-956.
Visualization of the NASA ICON mission in 3d
NASA Astrophysics Data System (ADS)
Mendez, R. A., Jr.; Immel, T. J.; Miller, N.
2016-12-01
The ICON Explorer mission (http://icon.ssl.berkeley.edu) will provide several data products for the atmosphere and ionosphere after its launch in 2017. This project will support the mission by investigating the capability of visualization tools to display current and predicted observatory characteristics and data acquisition. Visualization of this mission can be accomplished using tools like Google Earth or CesiumJS, as well as assistance from Java or Python. Ideally we will bring this visualization into the homes of people without the need for additional software. The path to launching a standalone website, building this environment, and providing a full toolkit will be discussed. Eventually, the initial work could lead to the addition of downloadable visualization packages for mission demonstration or science visualization.
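As an illustration of delivering such visualizations without extra software, a Google Earth-loadable KML placemark can be generated with plain Python; the coordinates and file name below are hypothetical.

```python
# An illustration of the kind of artifact such a visualization site could
# serve: a minimal KML placemark for one observatory position, written with
# plain Python so no extra packages are needed. Coordinates are hypothetical.
KML = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <Point>
      <altitudeMode>absolute</altitudeMode>
      <coordinates>{lon},{lat},{alt_m}</coordinates>
    </Point>
  </Placemark>
</kml>
"""

with open("icon_position.kml", "w") as f:
    f.write(KML.format(name="ICON (example epoch)",
                       lon=-122.4, lat=12.3, alt_m=575000))  # ~575 km orbit
```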
Measuring Workload Demand of Informatics Systems with the Clinical Case Demand Index
Iyengar, M. Sriram; Rogith, Deevakar; Florez-Arango, Jose F
2017-01-01
Introduction: The increasing use of Health Information Technology (HIT) can add substantially to the workload of clinical providers. Current methods for assessing workload do not take into account the nature of clinical cases or the use of HIT tools while solving them. Methods: The Clinical Case Demand Index (CCDI), consisting of a summary score and a visual representation, was developed to meet this need. Consistency with current perceived-workload measures was evaluated in a randomized controlled trial of a mobile health system. Results: The CCDI is significantly correlated with existing workload measures and inversely related to provider performance. Discussion: The CCDI combines subjective and objective characteristics of clinical cases along with cognitive and clinical dimensions. Applications include evaluation of HIT tools, clinician scheduling, and medical education. Conclusion: The CCDI supports comparative effectiveness research on HIT tools and could have numerous additional applications, including training, clinical trials, and the design of clinical workflows. PMID:29854166
Tools for controlling protein interactions with light
Tucker, Chandra L.; Vrana, Justin D.; Kennedy, Matthew J.
2014-01-01
Genetically-encoded actuators that allow control of protein-protein interactions with light, termed ‘optical dimerizers’, are emerging as new tools for experimental biology. In recent years, numerous new and versatile dimerizer systems have been developed. Here we discuss the design of optical dimerizer experiments, including choice of a dimerizer system, photoexcitation sources, and coordinate use of imaging reporters. We provide detailed protocols for experiments using two dimerization systems we previously developed, CRY2/CIB and UVR8/UVR8, for use controlling transcription, protein localization, and protein secretion with light. Additionally, we provide instructions and software for constructing a pulse-controlled LED light device for use in experiments requiring extended light treatments. PMID:25181301
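The extended light treatments mentioned above amount to a pulse schedule. The sketch below shows one in Python, with the hardware call `set_led` left as a hypothetical stub standing in for the device's actual control interface.

```python
# A sketch of the pulse scheduling such an LED device performs for extended
# light treatments. The `set_led` call is a hypothetical stub; the authors'
# actual device has its own hardware control layer.
import time

def set_led(on: bool):          # hypothetical hardware interface
    print("LED", "ON" if on else "OFF")

def pulse_program(pulse_s=2.0, interval_s=60.0, duration_s=3600.0):
    """Deliver pulse_s of light every interval_s for duration_s total,
    e.g. for periodic photoexcitation of a CRY2/CIB-type dimerizer."""
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        set_led(True)
        time.sleep(pulse_s)
        set_led(False)
        time.sleep(max(0.0, interval_s - pulse_s))

# pulse_program()  # uncomment to run a 1 h program of 2 s pulses each minute
```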
LLVM Infrastructure and Tools Project Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCormick, Patrick Sean
2017-11-06
This project works with the open-source LLVM Compiler Infrastructure (http://llvm.org) to provide tools and capabilities that address needs and challenges faced by the ECP community (applications, libraries, and other components of the software stack). Our focus is on providing a more productive development environment that enables (i) improved compilation times and code generation for parallelism, (ii) additional features/capabilities within the design and implementations of LLVM components for improved platform/performance portability, and (iii) improved composition of the underlying implementation details of the programming environment, capturing resource utilization, overheads, etc., including runtime systems that are often not easily addressed by application and library developers.
Data management system advanced development
NASA Technical Reports Server (NTRS)
Douglas, Katherine; Humphries, Terry
1990-01-01
The Data Management System (DMS) Advanced Development task provides for the development of concepts, new tools, DMS services, and for the testing of the Space Station DMS hardware and software. It also provides for the development of techniques capable of determining the effects of system changes/enhancements, additions of new technology, and/or hardware and software growth on system performance. This paper will address the built-in characteristics which will support network monitoring requirements in the design of the evolving DMS network implementation, functional and performance requirements for a real-time, multiprogramming, multiprocessor operating system, and the possible use of advanced development techniques such as expert systems and artificial intelligence tools in the DMS design.
2014-09-01
...the contracting process or procuring of systems that do not totally meet the system requirements provided by the combat developer... constraints, restrictions and exceptions provided by the Berry Amendment. Additionally, data will be collected on acquisition processes from...
Purohit, Bhaskar; Maneskar, Abhishek; Saxena, Deepak
2016-04-14
Addressing the shortage of health service providers (doctors and nurses) in rural health centres remains a huge challenge. The lack of motivation of health service providers to serve in rural areas is one of the major reasons for this shortage. While many studies have aimed at analysing the reasons for low motivation, hardly any studies in India have focused on developing valid and reliable tools to measure motivation among health service providers. Hence, the objective of the study was to test and develop a valid and reliable instrument to assess the motivation of health service providers working with the public health system in India, and the extent to which the motivation factors included in the study motivate health service providers to perform better at work. The present study adapted an already developed tool on motivation, and its reliability and validity were established using several methods. The first stage of tool development involved content development and assessment: after a detailed literature review, a predeveloped tool with 19 items was adapted and then modified to suit the local context by adding 7 items, so that the final tool comprised 26 items. A correlation matrix was applied to check the pattern of relationships among the items. The total sample size for the study was 154 health service providers from one western state in India. To assess sampling adequacy, the Kaiser-Meyer-Olkin measure and Bartlett's test of sphericity were applied, and factor analysis was then carried out to calculate the eigenvalues and to understand the relative impact of the factors affecting motivation. A correlation matrix value of 0.017 was obtained, indicating multicollinearity among the observations. Based on the initial factor analysis, 8 of the 26 items fell below the cutoff of 0.6 and were excluded. Running the factor analysis again supported the inclusion of 18 items, which were subsequently labelled under the following heads: transparency, goals, security, convenience, benefits, encouragement, adequacy of earnings and further growth, and power. There is a great need to develop instruments aimed at assessing the motivation of health service providers. The instrument used in this study has good psychometric properties and may serve as a useful tool to assess motivation among healthcare providers.
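A sketch of the validation pipeline described above (KMO, Bartlett's test, then exploratory factor analysis with a 0.6 loading cutoff) follows, using the third-party factor_analyzer package; the random placeholder data merely mirrors the study's 154 x 26 response shape.

```python
# A sketch of the instrument-validation pipeline described: sampling adequacy
# (KMO), Bartlett's test of sphericity, then exploratory factor analysis with
# a loading cutoff. Uses the third-party `factor_analyzer` package; the data
# here are random placeholders shaped like the study (154 respondents x 26 items).
import numpy as np
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (calculate_kmo,
                                             calculate_bartlett_sphericity)

responses = np.random.randint(1, 6, size=(154, 26)).astype(float)  # Likert 1-5

chi_sq, p_value = calculate_bartlett_sphericity(responses)
kmo_per_item, kmo_overall = calculate_kmo(responses)
print(f"Bartlett p={p_value:.3g}, overall KMO={kmo_overall:.2f}")

fa = FactorAnalyzer(n_factors=8, rotation="varimax")
fa.fit(responses)
loadings = fa.loadings_
# Flag items whose best loading falls below a 0.6 cutoff, as in the study:
weak_items = np.where(np.abs(loadings).max(axis=1) < 0.6)[0]
print("items to drop:", weak_items)
```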
Developing and using a rubric for evaluating evidence-based medicine point-of-care tools
Foster, Margaret J
2011-01-01
Objective: The research sought to establish a rubric for evaluating evidence-based medicine (EBM) point-of-care tools in a health sciences library. Methods: The authors searched the literature for EBM tool evaluations and found that most previous reviews were designed to evaluate the ability of an EBM tool to answer a clinical question. The researchers' goal was to develop and complete rubrics for assessing these tools based on criteria for a general evaluation of tools (reviewing content, search options, quality control, and grading) and criteria for an evaluation of clinical summaries (searching tools for treatments of common diagnoses and evaluating summaries for quality control). Results: Differences between EBM tools' options, content coverage, and usability were minimal. However, the products' methods for locating and grading evidence varied widely in transparency and process. Conclusions: As EBM tools are constantly updating and evolving, evaluation of these tools needs to be conducted frequently. Standards for evaluating EBM tools need to be established, with one method being the use of objective rubrics. In addition, EBM tools need to provide more information about authorship, reviewers, methods for evidence collection, and grading system employed. PMID:21753917
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.
Simonyan, Vahan; Mazumder, Raja
2014-09-30
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.
Open Access to Multi-Domain Collaborative Analysis of Geospatial Data Through the Internet
NASA Astrophysics Data System (ADS)
Turner, A.
2009-12-01
The internet has provided us with a high-bandwidth, low-latency, globally connected network in which to rapidly share real-time data from sensors, reports, and imagery. In addition, this data has become even easier to obtain, consume, and analyze. Another aspect of the internet has been the increased approachability of complex systems through lightweight interfaces, with additional complex services able to provide more advanced connections into data services. These analyses and discussions have primarily been siloed within single domains, or kept out of the reach of amateur scientists and interested citizens. However, through more open access to analytical tools and data, experts can collaborate with citizens to gather information, provide interfaces for experimenting and querying results, and develop improved insights and feedback for further investigation. For example, farmers in Uganda are able to use their mobile phones to query, analyze, and be alerted to banana crop disease based on agriculture and climatological data. In the U.S., local groups use online social media sharing sites to gather data on storm-water runoff and stream siltation in order to alert wardens and environmental agencies. This talk will present various web-based geospatial visualization and analysis techniques and tools, such as Google Earth and GeoCommons, that have emerged to support collaboration between experts of various domains as well as between experts, government, and citizen scientists. Through increased communication and the sharing of data and tools, it is possible to gain broad insight and to develop joint, working solutions to a variety of difficult scientific and policy-related questions.
Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena
2018-01-01
The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun them, simply because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to assess whether the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement of experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results.
jSPyDB, an open source database-independent tool for data management
NASA Astrophysics Data System (ADS)
Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo
2011-12-01
Nowadays, the number of commercial tools available for accessing databases, built on Java or .NET, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, and they are platform-dependent and CPU- and memory-intensive. jSPyDB is a free web-based tool written in Python and Javascript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for a better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not provide users the possibility to directly execute any SQL statement.
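As an aside, the database independence jSPyDB gains from SQLAlchemy can be illustrated with a short sketch (not taken from jSPyDB itself): the backend is selected entirely by the connection URL, while the query code stays unchanged.

```python
# Minimal illustration of backend-independent access via SQLAlchemy.
from sqlalchemy import create_engine, text

# Swapping this URL (e.g. to "postgresql://user@host/db" or "oracle://...")
# switches the backend without touching the query code below.
engine = create_engine("sqlite:///demo.db")

with engine.connect() as conn:
    conn.execute(text("CREATE TABLE IF NOT EXISTS runs (id INTEGER, tag TEXT)"))
    conn.execute(text("INSERT INTO runs VALUES (1, 'calibration')"))
    conn.commit()
    for row in conn.execute(text("SELECT id, tag FROM runs")):
        print(row.id, row.tag)
```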
MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.
Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y
2017-08-14
Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations on the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results in different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration, and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms from different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available in the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics.
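The co-occurrence idea behind the integration step can be shown with a toy sketch (an illustration of the concept, not MetaMeta's actual algorithm; the profiles and the minimum-support threshold are invented):

```python
# Toy merge of per-tool taxonomic profiles: keep taxa reported by at least
# two tools, average their relative abundances, then renormalise.
from collections import defaultdict

profiles = {  # hypothetical {taxon: relative abundance} per tool
    "tool_a": {"E. coli": 0.6, "B. subtilis": 0.3, "phantom sp.": 0.1},
    "tool_b": {"E. coli": 0.5, "B. subtilis": 0.5},
    "tool_c": {"E. coli": 0.7, "B. subtilis": 0.2},
}

support = defaultdict(list)
for profile in profiles.values():
    for taxon, abundance in profile.items():
        support[taxon].append(abundance)

min_support = 2  # co-occurrence threshold (assumed)
merged = {t: sum(a) / len(a) for t, a in support.items() if len(a) >= min_support}
total = sum(merged.values())
merged = {t: round(a / total, 3) for t, a in merged.items()}
print(merged)  # "phantom sp." (seen by one tool only) is filtered out
```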
Helioviewer: A Web 2.0 Tool for Visualizing Heterogeneous Heliophysics Data
NASA Astrophysics Data System (ADS)
Hughitt, V. K.; Ireland, J.; Lynch, M. J.; Schmeidel, P.; Dimitoglou, G.; Müeller, D.; Fleck, B.
2008-12-01
Solar physics datasets are becoming larger, richer, more numerous and more distributed. Feature/event catalogs (describing objects of interest in the original data) are becoming important tools in navigating these data. In the wake of this increasing influx of data and catalogs there has been a growing need for highly sophisticated tools for accessing and visualizing this wealth of information. Helioviewer is a novel tool for integrating and visualizing disparate sources of solar and Heliophysics data. Taking advantage of the newly available power of modern web application frameworks, Helioviewer merges image and feature catalog data, and provides for Heliophysics data a familiar interface not unlike Google Maps or MapQuest. In addition to streamlining the process of combining heterogeneous Heliophysics datatypes such as full-disk images and coronagraphs, the inclusion of visual representations of automated and human-annotated features provides the user with an integrated and intuitive view of how different factors may be interacting on the Sun. Currently, Helioviewer offers images from The Extreme ultraviolet Imaging Telescope (EIT), The Large Angle and Spectrometric COronagraph experiment (LASCO) and the Michelson Doppler Imager (MDI) instruments onboard The Solar and Heliospheric Observatory (SOHO), as well as The Transition Region and Coronal Explorer (TRACE). Helioviewer also incorporates feature/event information from the LASCO CME List, NOAA Active Regions, CACTus CME and Type II Radio Bursts feature/event catalogs. The project is undergoing continuous development with many more data sources and additional functionality planned for the near future.
MEvoLib v1.0: the first molecular evolution library for Python.
Álvarez-Jarreta, Jorge; Ruiz-Pesini, Eduardo
2016-10-28
Molecular evolution studies involve many hard computational problems that are solved, in most cases, with heuristic algorithms providing nearly optimal solutions. Hence, diverse software tools exist for the different stages involved in a molecular evolution workflow. We present MEvoLib, the first molecular evolution library for Python, providing a framework to work with different tools and methods involved in the common tasks of molecular evolution workflows. In contrast with already existing bioinformatics libraries, MEvoLib is focused on the stages involved in molecular evolution studies, enclosing the set of tools with a common purpose in a single high-level interface with fast access to their frequent parameterizations. The gene clustering from partial or complete sequences has been improved with a new method that integrates accessible external information (e.g., GenBank's features data). Moreover, MEvoLib adjusts the fetching process from NCBI databases to optimize the download bandwidth usage. In addition, it has been implemented using parallelization techniques to cope with even large-scale scenarios. MEvoLib is the first library for Python designed to facilitate molecular evolution research for both expert and novice users. Its unique interface for each common task comprises several tools with their most used parameterizations. It also includes a method that takes advantage of biological knowledge to improve the gene partition of sequence datasets. Additionally, its implementation incorporates parallelization techniques to reduce computational costs when handling very large input datasets.
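The "single high-level interface over several tools" design can be pictured with a hypothetical sketch; the function and command templates below are assumptions for illustration and are not MEvoLib's real API:

```python
# Hypothetical facade over external aligners with stock parameterizations.
import subprocess

ALIGNERS = {
    "mafft": ["mafft", "--auto"],    # mafft writes the alignment to stdout
    "muscle": ["muscle", "-align"],  # muscle v5-style invocation
}

def align(tool: str, fasta_in: str, fasta_out: str) -> None:
    """Run the chosen aligner with a frequent parameterization."""
    if tool not in ALIGNERS:
        raise ValueError(f"unknown aligner: {tool}")
    if tool == "mafft":
        with open(fasta_out, "w") as out:
            subprocess.run(ALIGNERS[tool] + [fasta_in], stdout=out, check=True)
    else:
        subprocess.run(ALIGNERS[tool] + [fasta_in, "-output", fasta_out], check=True)

align("mafft", "genes.fasta", "genes.aln")  # requires mafft on PATH
```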
Intelligent computer-aided training authoring environment
NASA Technical Reports Server (NTRS)
Way, Robert D.
1994-01-01
Although there has been much research into intelligent tutoring systems (ITS), there are few authoring systems available that support ITS metaphors. Instructional developers are generally obliged to use tools designed for creating on-line books. We are currently developing an authoring environment derived from NASA's research on intelligent computer-aided training (ICAT). The ICAT metaphor, currently in use at NASA, has proven effective in disciplines from satellite deployment to high school physics. This technique provides a personal trainer (PT) who instructs the student using a simulated work environment (SWE). The PT acts as a tutor, providing individualized instruction and assistance to each student. Teaching in an SWE allows the student to learn tasks by doing them, rather than by reading about them. This authoring environment will expedite ICAT development by providing a tool set that guides the trainer modeling process. Additionally, this environment provides a vehicle for distributing NASA's ICAT technology to the private sector.
Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design
NASA Technical Reports Server (NTRS)
Schutte, Paul C.; Trujillo, Anna; Pritchett, Amy R.
2000-01-01
While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plug-in' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design
NASA Technical Reports Server (NTRS)
Pritchett, Amy R.
2002-01-01
While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
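The plug-in architecture both RFS reports describe (components registered with a core that initializes them and routes their messages) can be sketched compactly; this Python sketch is only an analogy for the compiled plug-in modules:

```python
# Minimal component registry with initialization and topic-based routing.
class SimCore:
    def __init__(self):
        self.components = {}

    def register(self, name, component):
        self.components[name] = component
        component.init(self)  # interface standard: every component has init()

    def publish(self, topic, data):
        # Route a message to every component exposing an on_<topic> handler.
        for c in self.components.values():
            handler = getattr(c, f"on_{topic}", None)
            if handler:
                handler(data)

class Aerodynamics:
    def init(self, core):
        self.core = core

    def on_state(self, state):
        print("aero update for state:", state)

core = SimCore()
core.register("aero", Aerodynamics())
core.publish("state", {"alt_ft": 10000, "ias_kt": 250})
```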
New Technologies for Managing Cotton Modules
USDA-ARS?s Scientific Manuscript database
The use of RFID transponders in the module tags on round modules formed by John Deere harvesters has opened up new possibilities for managing modules and harvest data. Tools are needed to help integrate this new technology and provide additional value to growers and ginners. A mobile application w...
Sediment tracers in water erosion studies: Current approaches and challenges
USDA-ARS?s Scientific Manuscript database
The interest in the use of sediment tracers as a complementary tool to traditional water soil erosion or deposition measurements or assessment has increased due to the additional information they may provide such as sediment source identification and tracking of sediment movement over the landscape ...
Collision detection and modeling of rigid and deformable objects in laparoscopic simulator
NASA Astrophysics Data System (ADS)
Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru
2015-03-01
Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated with virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at a rate of at least 30 fps. Our current laparoscopic simulator detects collisions between a point on the tool tip and points on the organ surfaces, with haptic devices attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point-based collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and speed up the organ deformation reaction. We discuss our proposal for an efficient method to compute multiple simultaneous collisions between rigid (laparoscopic tools) and deformable (organs) objects, and to perform the subsequent collision response, with haptic feedback, in real time.
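A minimal sketch of multi-point detection along a rigid rod, assuming the rod is sampled at discrete points and the organ surface is reduced to a vertex cloud with a contact threshold (the geometry and threshold are invented; the paper's actual scheme works on triangular meshes):

```python
import numpy as np

def rod_contacts(p0, p1, surface_pts, n_samples=16, threshold=1.0):
    """Return rod sample points lying within `threshold` of the surface."""
    ts = np.linspace(0.0, 1.0, n_samples)[:, None]
    samples = (1.0 - ts) * p0 + ts * p1              # points along the rod axis
    # pairwise distances, shape (n_samples, n_surface_points)
    d = np.linalg.norm(samples[:, None, :] - surface_pts[None, :, :], axis=2)
    return samples[d.min(axis=1) < threshold]

surface = np.random.rand(500, 3) * 50.0               # stand-in for organ vertices
hits = rod_contacts(np.array([0.0, 0.0, 0.0]), np.array([60.0, 0.0, 0.0]), surface)
print(len(hits), "contact points on the rod")
```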
Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization
NASA Technical Reports Server (NTRS)
Gern, Frank H.
2012-01-01
This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are presented for the evaluation of different aerodynamic tools, including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel were used as a reference data set in order to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite-element-based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional aerodynamic validation data.
"Think aloud" and "Near live" usability testing of two complex clinical decision support tools.
Richardson, Safiya; Mishuris, Rebecca; O'Connell, Alexander; Feldstein, David; Hess, Rachel; Smith, Paul; McCullagh, Lauren; McGinn, Thomas; Mann, Devin
2017-10-01
Low provider adoption continues to be a significant barrier to realizing the potential of clinical decision support. "Think Aloud" and "Near Live" usability testing were conducted on two clinical decision support tools. Each was composed of an alert, a clinical prediction rule that estimated the risk of either group A Streptococcus pharyngitis or pneumonia, and an automatic order set based on risk. The objective of this study was to further understanding of the facilitators of usability and to evaluate the types of additional information gained from proceeding to "Near Live" testing after completing "Think Aloud". This was a qualitative observational study conducted at a large academic health care system with 12 primary care providers. During "Think Aloud" testing, participants were provided with written clinical scenarios and asked to verbalize their thought process while interacting with the tool. During "Near Live" testing, participants interacted with a mock patient. Morae usability software was used to record full screen capture and audio during every session. Participant comments were placed into coding categories and analyzed for generalizable themes. Themes were compared across usability methods. "Think Aloud" and "Near Live" usability testing generated similar themes under the coding categories visibility, workflow, content, understandability, and navigation. However, they generated significantly different themes under the coding categories usability, practical usefulness, and medical usefulness. During both types of testing, participants found the tool easier to use when important text was distinct in its appearance, alerts were passive and appropriately timed, content was up to date, language was clear and simple, and each component of the tool included obvious indicators of next steps. Participant comments reflected higher expectations for usability and usefulness during "Near Live" testing. For example, visit aids, such as automatically generated order sets, were felt to be less useful during "Near Live" testing because they would not be all-inclusive for the visit. These complementary types of usability testing generated unique and generalizable insights. Feedback during "Think Aloud" testing primarily helped to improve the tools' ease of use. The additional feedback from "Near Live" testing, which mimics a real clinical encounter, was helpful for eliciting key barriers and facilitators to provider workflow and adoption. Copyright © 2017 Elsevier B.V. All rights reserved.
Litchfield, Ian; Gill, Paramjit; Avery, Tony; Campbell, Stephen; Perryman, Katherine; Marsden, Kate; Greenfield, Sheila
2018-05-22
Primary care is changing rapidly to meet the needs of an ageing and chronically ill population. New ways of working are called for, yet the introduction of innovative service interventions is complicated by organisational challenges arising from its scale and diversity and the growing complexity of patients and their care. One such intervention is the multi-strand, single-platform Patient Safety Toolkit, developed to help practices provide safer care in this dynamic and pressured environment where the likelihood of adverse incidents is increasing. Here we describe the attitudes of staff toward these tools and how their implementation was shaped by a number of contextual factors specific to each practice. The Patient Safety Toolkit comprised six tools: a system of rapid note review, an online staff survey, a patient safety questionnaire, prescribing safety indicators, a medicines reconciliation tool, and a safe systems checklist. We implemented these tools at practices across the Midlands, the North West, and the South Coast of England and conducted semi-structured interviews to determine staff perspectives on their effectiveness and applicability. The Toolkit was used in 46 practices and a total of 39 follow-up interviews were conducted. Three key influences on the implementation of the Toolkit emerged; these related to the tools' ease of use and the novelty of the information they provide; whether their implementation required additional staff training or practice resource; and, finally, factors specific to the practice's local environment, such as overlapping initiatives orchestrated by their CCG. The concept of a balanced toolkit to address a range of safety issues proved popular. A number of barriers and facilitators emerged; in particular, tools that provided relevant information with a minimal impact on practice resources were favoured. Individual practice circumstances also played a role. Practices with IT-aware staff were at an advantage and those previously utilising patient safety initiatives were less likely to adopt additional tools with overlapping outputs. By acknowledging these influences we can better interpret reaction to and adoption of individual elements of the toolkit and optimise future implementation.
MannDB: A microbial annotation database for protein characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, C; Lam, M; Smith, J
2006-05-19
MannDB was created to meet a need for rapid, comprehensive automated protein sequence analyses to support selection of proteins suitable as targets for driving the development of reagents for pathogen or protein toxin detection. Because a large number of open-source tools were needed, it was necessary to produce a software system to scale the computations for whole-proteome analysis. Thus, we built a fully automated system for executing software tools and for storage, integration, and display of automated protein sequence analysis and annotation data. MannDB is a relational database that organizes data resulting from fully automated, high-throughput protein-sequence analyses using open-source tools. Types of analyses provided include predictions of cleavage, chemical properties, classification, features, functional assignment, post-translational modifications, motifs, antigenicity, and secondary structure. Proteomes (lists of hypothetical and known proteins) are downloaded and parsed from Genbank and then inserted into MannDB, and annotations from SwissProt are downloaded when identifiers are found in the Genbank entry or when identical sequences are identified. Currently 36 open-source tools are run against MannDB protein sequences either on local systems or by means of batch submission to external servers. In addition, BLAST against protein entries in MvirDB, our database of microbial virulence factors, is performed. A web client browser enables viewing of computational results and downloaded annotations, and a query tool enables structured and free-text search capabilities. When available, links to external databases, including MvirDB, are provided. MannDB contains whole-proteome analyses for at least one representative organism from each category of biological threat organism listed by APHIS, CDC, HHS, NIAID, USDA, USFDA, and WHO. MannDB comprises a large number of genomes and comprehensive protein sequence analyses representing organisms listed as high-priority agents on the websites of several governmental organizations concerned with bio-terrorism. MannDB provides the user with a BLAST interface for comparison of native and non-native sequences and a query tool for conveniently selecting proteins of interest. In addition, the user has access to a web-based browser that compiles comprehensive and extensive reports.
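The download-and-parse step from GenBank can be illustrated with Biopython's Entrez utilities (a sketch of the general pattern, not MannDB's code; the accession and e-mail address are placeholders):

```python
from Bio import Entrez, SeqIO

Entrez.email = "user@example.org"  # NCBI requires a contact address

handle = Entrez.efetch(db="protein", id="NP_414542.1",  # placeholder accession
                       rettype="gb", retmode="text")
record = SeqIO.read(handle, "genbank")
handle.close()

print(record.id, record.description)
print("sequence length:", len(record.seq))
# Cross-references in the record (e.g. SwissProt identifiers) can then be
# used to pull annotations, as described above.
for xref in record.dbxrefs:
    print("xref:", xref)
```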
A novel teaching system for industrial robots.
Lin, Hsien-I; Lin, Yu-Hsiang
2014-03-27
The most important tool for controlling an industrial robotic arm is a teach pendant, which controls the robotic arm movement in work spaces and accomplishes teaching tasks. A good teaching tool should be easy to operate and able to complete teaching tasks rapidly and effortlessly. In this study, a new teaching system is proposed for enabling users to operate robotic arms and accomplish teaching tasks easily. The proposed teaching system consists of the teach pen, optical markers on the pen, a motion capture system, and the pen tip estimation algorithm. With the marker positions captured by the motion capture system, the pose of the teach pen is accurately calculated by the pen tip algorithm and used to control the robot tool frame. In addition, Fitts' Law is adopted to verify the usefulness of this new system, and the results show that the system provides high accuracy, excellent operation performance, and a stable error rate. Moreover, the system maintains superior performance even when users work on platforms with different inclination angles.
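For reference, Fitts' Law in its common Shannon formulation models the movement time MT to a target of width W at distance D as (standard textbook form; the abstract does not state which variant was used):

```latex
MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)
```

where a and b are empirically fitted constants and the logarithmic term is the index of difficulty in bits.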
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
GeneTools--application for functional annotation and statistical hypothesis testing.
Beisvag, Vidar; Jünge, Frode K R; Bergum, Hallgeir; Jølsum, Lars; Lydersen, Stian; Günther, Clara-Cecilie; Ramampiaro, Heri; Langaas, Mette; Sandvik, Arne K; Laegreid, Astrid
2006-10-24
Modern biology has shifted from "one gene" approaches to methods for genomic-scale analysis like microarray technology, which allow simultaneous measurement of thousands of genes. This has created a need for tools facilitating interpretation of biological data in "batch" mode. However, such tools often leave the investigator with large volumes of apparently unorganized information. To meet this interpretation challenge, gene-set, or cluster, testing has become a popular analytical tool. Many gene-set testing methods and software packages are now available, most of which use a variety of statistical tests to assess the genes in a set for biological information. However, the field is still evolving, and there is a great need for "integrated" solutions. GeneTools is a web service providing access to a database that brings together information from a broad range of resources. The annotation data are updated weekly, guaranteeing that users get the most recently available data. Data submitted by the user are stored in the database, where they can easily be updated, shared between users, and exported in various formats. GeneTools provides three different tools: i) the NMC Annotation Tool, which offers annotations from several databases like UniGene, Entrez Gene, SwissProt and GeneOntology, in both single- and batch-search mode; ii) the GO Annotator Tool, where users can add new gene ontology (GO) annotations to genes of interest (these user-defined GO annotations can be used in further analysis or exported for public distribution); and iii) eGOn, a tool for visualization and statistical hypothesis testing of GO category representation. As the first GO tool, eGOn supports hypothesis testing for three different situations (master-target situation, mutually exclusive target-target situation, and intersecting target-target situation). An important additional function is an evidence-code filter that allows users to select the GO annotations for the analysis. GeneTools is the first "all in one" annotation tool, providing users with rapid extraction of highly relevant gene annotation data for thousands of genes or clones at once. It allows a user to define and archive new GO annotations and it supports hypothesis testing related to GO category representations. GeneTools is freely available through www.genetools.no.
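The GO-category hypothesis testing eGOn performs can be pictured with a toy over-representation test (an illustration only, not GeneTools' implementation; the counts are made up):

```python
# Is a GO term over-represented in a target gene list versus the master list?
from scipy.stats import fisher_exact

#                 in GO term   not in GO term
# target list          12              88
# rest of master       40             860
table = [[12, 88], [40, 860]]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio={odds_ratio:.2f}, p={p_value:.4f}")
```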
Timler, Dariusz; Bogusiak, Katarzyna; Kasielska-Trojan, Anna; Neskoromna-Jędrzejczak, Aneta; Gałązkowski, Robert; Szarpak, Łukasz
2016-02-01
The aim of the study was to verify the effectiveness of short text messages (short message service, or SMS) as an additional notification tool in case of fire or a mass casualty incident in a hospital. A total of 2242 SMS text messages were sent to 59 hospital workers divided into 3 groups (n=21, n=19, n=19). Messages were sent from a Samsung GT-S8500 Wave cell phone, and Orange Poland was chosen as the telecommunication provider. During a 3-month trial period, messages were sent between 3:35 PM and midnight with no regular pattern. Employees were asked to respond by estimating how much time it would take them to reach the hospital in case of a mass casualty incident. The mean reaction time (SMS reply) was 36.41 minutes. The mean declared time of arrival at the hospital was 100.5 minutes. After excluding 10% of extreme values for declared arrival time, the mean arrival time was estimated as 38.35 minutes. Short text messages (SMS) can be considered an additional tool for notifying medical staff in case of a mass casualty incident.
MAPGEN: Mixed-Initiative Activity Planning for the Mars Exploration Rover Mission
NASA Technical Reports Server (NTRS)
Ai-Chang, Mitchell; Bresina, John; Hsu, Jennifer; Jonsson, Ari; Kanefsky, Bob; McCurdy, Michael; Morris, Paul; Rajan, Kanna; Vera, Alonso; Yglesias, Jeffrey
2004-01-01
This document describes the Mixed-initiative Activity Plan GENeration system, MAPGEN. This system is one of the critical tools in the Mars Exploration Rover mission surface operations, where it is used to build activity plans for each of the rovers, each Martian day. The MAPGEN system combines an existing tool for activity plan editing and resource modeling with an advanced constraint-based reasoning and planning framework. The constraint-based planning component provides active constraint and rule enforcement, automated planning capabilities, and a variety of tools and functions that are useful for building activity plans in an interactive fashion. In this demonstration, we will show the capabilities of the system and demonstrate how the system has been used in actual Mars rover operations. In contrast to the demonstration given at ICAPS 03, significant improvements have been made to the system. These include various additional capabilities based on automated reasoning and planning techniques, as well as a new Constraint Editor (CE) support tool. As part of the process for generating command loads, MAPGEN provides engineers and scientists with an intelligent activity planning tool that allows them to more effectively generate complex plans that maximize the science return each day. The key to the effectiveness of the MAPGEN tool is an underlying constraint-based planning and reasoning engine.
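The kind of temporal constraint reasoning such an engine performs can be sketched with a simple temporal network tightened by Floyd-Warshall (an illustration of the general technique, not MAPGEN's engine; the activities and bounds are invented):

```python
# d[i][j] upper-bounds t_j - t_i (minutes); negative entries encode lower bounds.
INF = float("inf")
# events: 0 = sol start, 1 = drive start, 2 = imaging start
d = [[0,   60,  INF],   # drive starts within 60 min of sol start
     [-10, 0,   30],    # drive >= 10 min after sol start; imaging within 30 min of drive
     [INF, -5,  0]]     # imaging >= 5 min after drive start

n = len(d)
for k in range(n):              # all-pairs tightening (Floyd-Warshall)
    for i in range(n):
        for j in range(n):
            if d[i][k] + d[k][j] < d[i][j]:
                d[i][j] = d[i][k] + d[k][j]

consistent = all(d[i][i] == 0 for i in range(n))  # negative cycle => inconsistent
print("plan consistent:", consistent)
print("imaging must start within", d[0][2], "min of sol start")
```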
New tools in cybertherapy: the VEPSY web site.
Castelnuovo, Gianluca; Buselli, Claudio; De Ferrari, Roberta; Gaggioli, Andrea; Mantovani, Fabrizia; Molinari, Enrico; Villamira, Marco; Riva, Giuseppe
2004-01-01
In recent years, the rapid development of the Internet and new communication technologies has had a great impact on psychology and psychotherapy. Psychotherapists are relying with increasing interest on new technological tools such as the videophone, audio and video chat, e-mail, SMS, and the new Instant Messaging tools (IMs). All these technologies outline a stimulating as well as complex scenario: in order to effectively exploit their potential, it is important to study the possible role played by Internet-based tools within a psychotherapeutic process. Could the technology substitute for health care practitioners, or are these tools only a resource in addition to the traditional ones in the therapist's hands? The major aim of this chapter is to provide a framework for the integration of old and new tools in mental health care. Different theoretical positions about the possible role played by e-therapy are reported, showing the possible changes that psychotherapy will necessarily face in a cyber setting. The VEPSY website, an integration of different Internet-based tools developed within the VEPSY UPDATED Project, is described as an example of a clinical application matching old (and functional) practices with new (and promising) media for the treatment of different mental disorders. A rationale about the possible scenarios for the use of the VEPSY website in the clinical process is provided.
The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.
Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A
2010-03-01
Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUIs, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC).
The intelligent clinical laboratory as a tool to increase cancer care management productivity.
Mohammadzadeh, Niloofar; Safdari, Reza
2014-01-01
Studies of the causes of cancer, early detection, prevention, and treatment need accurate, comprehensive, and timely cancer data. The clinical laboratory provides important cancer information needed by physicians, which influences clinical decisions regarding treatment, diagnosis, and patient monitoring. Poor communication between health care providers and clinical laboratory personnel can lead to medical errors and wrong decisions in providing cancer care. Because of the key impact of laboratory information on cancer diagnosis and treatment, the quality of the tests, lab reports, and appropriate lab management are very important. A laboratory information management system (LIMS) can have an important role in diagnosis, fast and effective access to cancer data, decreasing redundancy and costs, and facilitating the integration and collection of data from different types of instruments and systems. In spite of significant advantages, LIMS is limited by factors such as problems in adapting to new instruments that may change existing work processes. Applying intelligent software alongside existing information systems, in addition to removing these restrictions, has important benefits, including adding non-laboratory-generated information to the reports, facilitating decision making, and improving the quality and productivity of cancer care services. Laboratory systems must have the flexibility to change and the capability to develop and benefit from intelligent devices. Intelligent laboratory information management systems need to benefit from informatics tools and the latest technologies, such as open source software. The aim of this commentary is to survey the application, opportunities, and necessity of the intelligent clinical laboratory as a tool to increase cancer care management productivity.
Nursing journal clubs and the clinical nurse specialist.
Westlake, Cheryl; Albert, Nancy M; Rice, Karen L; Bautista, Cynthia; Close, Jackie; Foster, Jan; Timmerman, Gayle M
2015-01-01
The purpose of this article was to describe the clinical nurse specialist's role in developing and implementing a journal club. Tools for critiquing clinical and research articles, with an application of each, are provided. The journal club provides a forum through which nurses maintain their knowledge base about clinically relevant topics and developments in their specific clinical discipline, analyze and synthesize the relevant scientific literature as evidence, and engage in informal discussions about evidence-based and best practices. The value of journal clubs includes nursing staff education, review of and support for evidence-based practice, promotion of nursing research, and fostering of organization-wide nursing practice changes. The process for establishing a journal club and suggested appraisal tools are discussed. In addition, strategies for overcoming barriers to the implementation of a journal club are outlined. Suggested article review questions and a reporting format for clinical and research articles are provided, with examples from 2 articles. Finally, a glossary of terms commonly used by research scientists and manuscript writers is listed and additional resources are provided. The clinical nurse specialist's role in developing and implementing a journal club will be facilitated through the use of this article. Enhanced nursing staff education, evidence-based practice, organization-wide nursing practice changes, and nursing research may follow the implementation of a nursing journal club.
Open access to high-level data and analysis tools in the CMS experiment at the LHC
Calderon, A.; Colling, D.; Huffman, A.; ...
2015-12-23
The CMS experiment, in recognition of its commitment to data preservation and open access as well as to education and outreach, has made its first public release of high-level data under the CC0 waiver: up to half of the proton-proton collision data (by volume) at 7 TeV from 2010 in CMS Analysis Object Data format. CMS has prepared, in collaboration with CERN and the other LHC experiments, an open-data web portal based on Invenio. The portal provides access to CMS public data as well as to analysis tools and documentation for the public. The tools include an event display and histogram application that run in the browser. In addition, a virtual machine containing a CMS software environment along with XRootD access to the data is available. Within the virtual machine the public can analyse CMS data; example code is provided. We describe the accompanying tools and documentation and discuss the first experiences of data use.
Influence of export control policy on the competitiveness of machine tool producing organizations
NASA Astrophysics Data System (ADS)
Ahrstrom, Jeffrey D.
The possible influence of export control policies on producers of export-controlled machine tools is examined in this quantitative study. International market competitiveness theories hold that market-controlling policies such as export control regulations may influence an organization's ability to compete (Burris, 2010). Differences in the domestic application of export control policy on machine tool exports may impose throttling effects on the competitiveness of participating firms (Freedenberg, 2010). Commodity shipments from Japan, Germany, and the United States to the Russian market will be examined using descriptive statistics; gravity modeling of these specific markets provides a foundation for comparison to actual shipment data; and industry participant responses to a user-developed survey will provide additional data for analysis using a Kruskal-Wallis one-way analysis of variance. There is scarce academic research data on the topic of export control effects within the machine tool industry. Research results may be of interest to industry leadership in market participation decisions, advocacy arguments, and strategic planning. Industry advocates and export policy decision makers could find the data of interest in supporting positions for or against modifications of export control policies.
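The planned Kruskal-Wallis comparison is straightforward to run once survey responses are grouped by country; a sketch with invented ratings:

```python
from scipy.stats import kruskal

# Hypothetical competitiveness ratings from respondents in each country.
us = [3, 4, 2, 5, 3, 4]
germany = [5, 6, 5, 4, 6, 5]
japan = [4, 5, 4, 6, 5, 4]

h_stat, p_value = kruskal(us, germany, japan)
print(f"H={h_stat:.2f}, p={p_value:.3f}")  # small p suggests the groups differ
```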
Utopia Providing Trusted Social Network Relationships within an Un-trusted Environment
NASA Astrophysics Data System (ADS)
Gauvin, William; Liu, Benyuan; Fu, Xinwen; Wang, Jie
This paper introduces an unobtrusive method and distributed solution set to aid users of online social networking sites by creating a trusted environment in which every member has the ability to identify each other within their private social network by name, gender, age, location, and the specific usage patterns adopted by the group. Utopia protects members by understanding how the social network is created and the specific aspects of the group that make it unique and identifiable. The main focus of Utopia is the protection of the group, and of its privacy within a social network, from predators and spammers that characteristically do not fit within the well-defined usage boundaries of the social network as a whole. The solution set provides defensive as well as offensive tools to identify these threats. Once identified, client desktop tools are used to prevent these predators from further interaction within the group. In addition, offensive tools are used to determine the origin of the predator to allow actions to be taken by automated tools and law enforcement to alleviate the threat.
Stienen, Jozette Jc; Ottevanger, Petronella B; Wennekes, Lianne; Dekker, Helena M; van der Maazen, Richard Wm; Mandigers, Caroline Mpw; van Krieken, Johan Hjm; Blijlevens, Nicole Ma; Hermens, Rosella Pmg
2015-01-09
An overload of health-related information is available for patients on numerous websites, guidelines, and information leaflets. However, the increasing need for personalized health-related information is currently unmet. This study evaluates an educational e-tool for patients with non-Hodgkin's lymphoma (NHL) designed to meet patient needs with respect to personalized and complete health-related information provision. The e-tool aims to help NHL patients manage and understand their personal care pathway by providing them with insight into their own care pathway, the possibility to keep a diary, and structured health-related information. Together with a multidisciplinary NHL expert panel, we developed an e-tool consisting of two sections: (1) a personal section for patients' own care pathway and their experiences, and (2) an informative section including information on NHL. We developed an ideal NHL care pathway based on the available (inter)national guidelines. The ideal care pathway, including the date of first consultation, diagnosis, and therapy start, was used to set up the personal care pathway. The informative section was developed in collaboration with the patient association, Hematon. Fourteen patients and 6 laymen were asked to evaluate the e-tool. The 24-item questionnaire used covered layout (6 questions), user convenience (3 questions), menu clarity (3 questions), information clarity (5 questions), and general impression (7 questions). In addition, the panel members were asked to give their feedback by email. A comprehensive overview of diagnostics, treatments, and aftercare can be established by patients completing the questions from the personal section. The informative section consisted of NHL information regarding NHL in general, diagnostics, therapy, aftercare, and waiting times. Six patients and 6 laymen completed the questionnaire. Overall, the feedback was positive, with at least 75% satisfaction on each feedback item. Important strengths mentioned were the use of a low health-literacy level, the opportunity to document the personal care pathway and experiences, and the clear overview of the information provided. The e-tool was considered particularly useful for preparing the consultation with one's doctor and for providing all information on one website, including the opportunity for a personalized care pathway and diary. The majority of the revisions concerned wording and clarity. In addition, more explicit information on immunotherapy, experimental therapy, and psychosocial support was added. We have developed a personal care management e-tool for NHL patients. This tool offers a unique way to help patients manage their personal care pathway and gain insight into their NHL by providing health-related information and a personal diary. This evaluation showed that our e-tool meets patients' needs concerning personalized health-related information and might serve as a good example for other oncologic diseases. Future research should focus on the possible impact of the e-tool on doctor-patient communication during consultations.
MDWeb and MDMoby: an integrated web-based platform for molecular dynamics simulations.
Hospital, Adam; Andrio, Pau; Fenollosa, Carles; Cicin-Sain, Damjan; Orozco, Modesto; Gelpí, Josep Lluís
2012-05-01
MDWeb and MDMoby constitute a web-based platform to ease access to molecular dynamics (MD) in the standard and high-throughput regimes. The platform provides tools to prepare systems from PDB structures, mimicking the procedures followed by human experts. It provides inputs and can send simulations for three of the most popular MD packages (Amber, NAMD and Gromacs). Tools for the analysis of trajectories, either provided by the user or retrieved from our MoDEL database (http://mmb.pcb.ub.es/MoDEL), are also incorporated. The platform has two access modes: a set of web services based on the BioMoby framework (MDMoby), which is programmatically accessible, and a web portal (MDWeb), available at http://mmb.irbbarcelona.org/MDWeb; additional information and methodology details can be found at http://mmb.irbbarcelona.org/MDWeb/help.php.
Ecoupling server: A tool to compute and analyze electronic couplings.
Cabeza de Vaca, Israel; Acebes, Sandra; Guallar, Victor
2016-07-05
Electron transfer processes are often studied through the evaluation and analysis of the electronic coupling (EC). Since most standard QM codes do not readily provide such a measure, additional user-friendly tools to compute and analyze electronic couplings from external wave functions are of high value. This communication presents the first server to provide a friendly interface for the evaluation and analysis of electronic couplings under two different approximations (FDC and GMH). The Ecoupling server accepts inputs from common QM and QM/MM software and provides useful plots to understand and analyze the results easily. The web server has been implemented in CGI-python using Apache and is accessible at http://ecouplingserver.bsc.es. The Ecoupling server is free and open to all users without login. © 2016 Wiley Periodicals, Inc.
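As a quick illustration of one of the two approximations named above, the standard two-state generalized Mulliken-Hush (GMH) expression can be evaluated in a few lines. This is a minimal sketch of ours, not the server's code; the variable names and the example numbers are invented.

import math

def gmh_coupling(delta_e, mu_diag_diff, mu_trans):
    """Two-state GMH coupling: H_ab = mu12 * dE / sqrt(dmu^2 + 4 * mu12^2).

    delta_e      -- adiabatic energy gap (e.g., in eV)
    mu_diag_diff -- difference of the two state dipoles along the charge-transfer axis
    mu_trans     -- transition dipole along the same axis (same units as mu_diag_diff)
    """
    return abs(mu_trans) * delta_e / math.sqrt(mu_diag_diff ** 2 + 4.0 * mu_trans ** 2)

print(gmh_coupling(delta_e=0.5, mu_diag_diff=10.0, mu_trans=1.0))  # hypothetical values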
Moreno-Martínez, F Javier; Montoro, Pedro R; Rodríguez-Rojo, Inmaculada C
2014-12-01
This article presents a new corpus of 820 words pertaining to 14 semantic categories, 7 natural (animals, body parts, insects, flowers, fruits, trees, and vegetables) and 7 man-made (buildings, clothing, furniture, kitchen utensils, musical instruments, tools, and vehicles); each word in the database was collected empirically in a previous exemplar generation study. In the present study, 152 Spanish speakers provided data for four psycholinguistic variables known to affect lexical-semantic processing in both neurologically intact and brain-damaged participants: age of acquisition, familiarity, manipulability, and typicality. Furthermore, we collected lexical frequency data derived from Internet search hits, plus three additional Spanish lexical frequency indexes. Word length, number of syllables, and the proportion of respondents citing the exemplar as a category member (which can be useful as an additional measure of typicality) are also provided. Reliability and validity indexes showed that our items display characteristics similar to those of other corpora. Overall, this new corpus of words provides a useful tool for scientists engaged in cognitive- and neuroscience-based research focused on examining language, memory, and object processing. The full set of norms can be downloaded from www.psychonomic.org/archive.
Effects of Feeding Back the Motivation of a Collaboratively Learning Group
ERIC Educational Resources Information Center
Schoor, Cornelia; Kownatzki, Salome; Narciss, Susanne; Körndle, Hermann
2014-01-01
Introduction: Motivation is an important issue in both face-to-face and computer-supported collaborative learning. There are several approaches for enhancing motivation, including group awareness tools that provide feedback on the group's motivation. However, this feedback was rarely unconfounded with other constructs. Additionally, it is…
No Strings Attached: Open Source Solutions
ERIC Educational Resources Information Center
Fredricks, Kathy
2009-01-01
Imagine downloading a new software application and not having to worry about licensing, finding dollars in the budget, or incurring additional maintenance costs. Imagine finding a Web design tool in the public domain--free for use. Imagine major universities that provide online courses with no strings attached. Imagine online textbooks without a…
Tools of the Trade: Microcomputer-Based Instrumentation: As Easy As ADC.
ERIC Educational Resources Information Center
Lam, Tom
1983-01-01
Provides a brief introduction to using the microcomputer as a piece of laboratory equipment. Focuses on the range of hardware currently available to assist microcomputer-based-instrumentation (MBI), specifically the analog-to-digital converters. In addition, discusses specification of system performance, selection and integration of components,…
Engaged Music Learning through Children's Literature
ERIC Educational Resources Information Center
Eppink, Joseph A.
2009-01-01
Children's literature is a wonderful addition to the general music classroom. Stories and poems can be key strategic tools for teaching musical concepts and skills while leading students to further experience writing, vocabulary, and literature. Children's literature and music provide an opportunity to increase the love of music and reading within…
New technologies and in vitro testing approaches have been valuable additions to risk assessments that have historically relied solely on in vivo test results. Compared to in vivo methods, in vitro high throughput screening (HTS) assays are less expensive, faster and can provide ...
ERIC Educational Resources Information Center
Waters, John K.
2009-01-01
Biometrics has been making its way into school districts for the past decade. Biometric tools draw information from a person's identifying physical components, providing a virtually fail-safe level of protection for K-12 schools. In addition to their security uses, biometric systems are currently used in schools for cafeteria purchases, library…
INDICATORS OF UV EXPOSURE IN CORALS: RELEVANCE TO GLOBAL CLIMATE CHANGE AND CORAL BLEACHING
Increased exposure to solar UV radiation and elevated water temperatures are believed to play a role in the bleaching of corals. To provide additional tools for evaluating the role of UV radiation, we have examined UV-specific effects in coral and have characterized factors that ...
Bayesian network interface for assisting radiology interpretation and education
NASA Astrophysics Data System (ADS)
Duda, Jeffrey; Botzolakis, Emmanuel; Chen, Po-Hao; Mohan, Suyash; Nasrallah, Ilya; Rauschecker, Andreas; Rudie, Jeffrey; Bryan, R. Nick; Gee, James; Cook, Tessa
2018-03-01
In this work, we present the use of Bayesian networks for radiologist decision support during clinical interpretation. This computational approach has the advantage of avoiding incorrect diagnoses that result from known human cognitive biases such as anchoring bias, framing effect, availability bias, and premature closure. To integrate Bayesian networks into clinical practice, we developed an open-source web application that provides diagnostic support for a variety of radiology disease entities (e.g., basal ganglia diseases, bone lesions). The Clinical tool presents the user with a set of buttons representing clinical and imaging features of interest. These buttons are used to set the value for each observed feature. As features are identified, the conditional probabilities for each possible diagnosis are updated in real time. Additionally, using sensitivity analysis, the interface may be set to inform the user which remaining imaging features provide maximum discriminatory information to choose the most likely diagnosis. The Case Submission tools allow the user to submit a validated case and the associated imaging features to a database, which can then be used for future tuning/testing of the Bayesian networks. These submitted cases are then reviewed by an assigned expert using the provided QC tool. The Research tool presents users with cases with previously labeled features and a chosen diagnosis, for the purpose of performance evaluation. Similarly, the Education page presents cases with known features, but provides real time feedback on feature selection.
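As a rough sketch of the real-time updating described above (not the authors' network; the features, diagnoses, and probabilities below are invented, and a naive-Bayes independence assumption stands in for a full Bayesian network):

P_FEATURE = {  # hypothetical P(feature present | diagnosis)
    "calcification": {"diagnosis_a": 0.8, "diagnosis_b": 0.2},
    "t2_hyperintensity": {"diagnosis_a": 0.3, "diagnosis_b": 0.7},
}
PRIOR = {"diagnosis_a": 0.5, "diagnosis_b": 0.5}

def posterior(observed):
    """Update diagnosis probabilities from {feature: True/False} observations."""
    post = dict(PRIOR)
    for feature, present in observed.items():
        for dx in post:
            p = P_FEATURE[feature][dx]
            post[dx] *= p if present else (1.0 - p)
    total = sum(post.values())
    return {dx: v / total for dx, v in post.items()}

# Each button press updates the posterior in real time:
print(posterior({"calcification": True}))

A sensitivity analysis of the kind mentioned can then rank the remaining unobserved features by how much they are expected to reduce the entropy of this posterior.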
AdViSHE: A Validation-Assessment Tool of Health-Economic Models for Decision Makers and Model Users.
Vemer, P; Corro Ramos, I; van Voorn, G A K; Al, M J; Feenstra, T L
2016-04-01
A trade-off exists between building confidence in health-economic (HE) decision models and the use of scarce resources. We aimed to address this trade-off by creating a practical tool that provides model users with a structured view of the validation status of HE decision models. A Delphi panel was organized and concluded with a workshop during an international conference. The proposed tool was constructed iteratively based on comments from, and the discussion amongst, panellists. During the Delphi process, comments were solicited on the importance and feasibility of possible validation techniques for modellers, their relevance for decision makers, and the overall structure and formulation of the tool. The panel consisted of 47 experts in HE modelling and HE decision making from various professional and international backgrounds. In addition, 50 discussants actively engaged in the discussion at the conference workshop and returned 19 questionnaires with additional comments. The final version consists of 13 items covering all relevant aspects of HE decision models: the conceptual model, the input data, the implemented software program, and the model outcomes. Assessment of the Validation Status of Health-Economic decision models (AdViSHE) is a validation-assessment tool in which model developers report systematically both on the validation efforts performed and on their outcomes. Model users can then establish whether confidence in the model is justified or whether additional validation efforts should be undertaken. In this way, AdViSHE enhances the transparency of the validation status of HE models and supports efficient model validation.
National Energy Audit Tool for Multifamily Buildings Development Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malhotra, Mini; MacDonald, Michael; Accawi, Gina K
The U.S. Department of Energy's (DOE's) Weatherization Assistance Program (WAP) enables low-income families to reduce their energy costs by providing funds to make their homes more energy efficient. In addition, the program funds Weatherization Training and Technical Assistance (T and TA) activities to support a range of program operations. These activities include measuring and documenting performance, monitoring programs, promoting advanced techniques and collaborations to further improve program effectiveness, and training, including developing tools and information resources. The T and TA plan outlines the tasks, activities, and milestones to support the weatherization network with the program implementation ramp-up efforts. Weatherization of multifamily buildings has been recognized as an effective way to ramp up weatherization efforts. To support this effort, the 2009 National Weatherization T and TA plan includes the task of expanding the functionality of the Weatherization Assistant, a DOE-sponsored family of energy audit computer programs, to perform audits for large and small multifamily buildings. This report describes the planning effort for a new multifamily energy audit tool for DOE's WAP. The functionality of the Weatherization Assistant is being expanded to also perform energy audits of small multifamily and large multifamily buildings. The process covers an assessment of needs that includes input from national experts during two national Web conferences. The assessment of needs is then translated into capability and performance descriptions for the proposed new multifamily energy audit tool, with some description of what might or should be provided in the new tool. The assessment of needs is combined with our best judgment to lay out a strategy for development of the multifamily tool that proceeds in stages, with features of an initial tool (version 1) and a more capable version 2 handled with currently available resources. Additional development in the future is expected to be needed if more capabilities are to be added. A rough schedule for development of the version 1 tool is presented. The components and capabilities described in this plan will serve as the starting point for development of the proposed new multifamily energy audit tool for WAP.
Open Source Tools for Seismicity Analysis
NASA Astrophysics Data System (ADS)
Powers, P.
2010-12-01
The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori law and the Gutenberg-Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The JavaScript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
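To make the statistical quantities mentioned above concrete, here is a minimal sketch of our own (not code from the libraries described): Aki's maximum-likelihood estimator for the Gutenberg-Richter b-value and the modified Omori aftershock rate. The catalog and parameter values are invented.

import math

def b_value_mle(magnitudes, completeness_mag):
    """Aki's maximum-likelihood b-value for events at or above the completeness magnitude."""
    mags = [m for m in magnitudes if m >= completeness_mag]
    mean_excess = sum(mags) / len(mags) - completeness_mag
    return math.log10(math.e) / mean_excess

def omori_rate(t_days, K=100.0, c=0.05, p=1.1):
    """Modified Omori law: aftershock rate K / (c + t)^p at time t after the mainshock."""
    return K / (c + t_days) ** p

print(b_value_mle([2.1, 2.5, 3.0, 2.2, 4.1, 2.8], completeness_mag=2.0))
print(omori_rate(1.0))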
Data standards, sense and stability: Scratchpads, the ICZN and ZooBank
Baker, Edward; Michel, Ellinor
2011-01-01
The International Commission on Zoological Nomenclature has used the Scratchpads platform (currently being developed and maintained by ViBRANT) as the foundation for its redesigned website and as a platform for engaging with its users. The existing Scratchpad tools, with extensions to provide additional functions, have allowed for a major transformation in the presentation of linked nomenclatural tools. Continued development of the new website will act as a springboard for the ICZN to participate more fully in the wider community of biodiversity informatics. PMID:22207812
Olechnovic, Kliment; Margelevicius, Mindaugas; Venclovas, Ceslovas
2011-03-01
We present Voroprot, an interactive cross-platform software tool that provides a unique set of capabilities for exploring geometric features of protein structure. Voroprot allows the construction and visualization of the Apollonius diagram (also known as the additively weighted Voronoi diagram), the Apollonius graph, protein alpha shapes, interatomic contact surfaces, solvent accessible surfaces, pockets and cavities inside protein structure. Voroprot is available for Windows, Linux and Mac OS X operating systems and can be downloaded from http://www.ibt.lt/bioinformatics/voroprot/.
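For readers unfamiliar with the Apollonius (additively weighted Voronoi) construction mentioned above, the underlying distance is simply the Euclidean distance to an atom's center minus the atom's radius. A toy sketch of ours (not Voroprot code; the coordinates and radii are invented):

import math

ATOMS = {  # name: ((x, y, z), radius), illustrative values only
    "N": ((0.0, 0.0, 0.0), 1.55),
    "CA": ((1.46, 0.0, 0.0), 1.70),
    "O": ((2.20, 1.10, 0.0), 1.52),
}

def apollonius_owner(point):
    """Atom whose additively weighted Voronoi cell contains `point`."""
    def weighted_dist(name):
        center, radius = ATOMS[name]
        return math.dist(point, center) - radius
    return min(ATOMS, key=weighted_dist)

print(apollonius_owner((1.0, 0.5, 0.0)))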
The Alba ray tracing code: ART
NASA Astrophysics Data System (ADS)
Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi
2013-09-01
The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its use as part of optimization routines as well as easy control from a graphical user interface. Additional tools for slope error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, while still providing normalized values of flux and resolution in physically meaningful units.
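The ray-accumulation idea mentioned above can be pictured with a toy streaming accumulator (our illustration, not ART code): only running sums are kept, so statistics improve with every batch of rays while memory use stays constant.

class RayAccumulator:
    def __init__(self):
        self.n_rays = 0
        self.weight_sum = 0.0  # summed weights of rays reaching the detector

    def add_batch(self, weights):
        self.n_rays += len(weights)
        self.weight_sum += sum(weights)

    def flux(self, source_flux):
        """Transmitted fraction times source flux, in the source's physical units."""
        return self.weight_sum / self.n_rays * source_flux

acc = RayAccumulator()
acc.add_batch([1.0, 0.8, 0.0, 0.9])  # hypothetical per-ray transmission weights
print(acc.flux(1e12))                # e.g., photons per second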
An Overview of Tools for Creating, Validating and Using PDS Metadata
NASA Astrophysics Data System (ADS)
King, T. A.; Hardman, S. H.; Padams, J.; Mafi, J. N.; Cecconi, B.
2017-12-01
NASA's Planetary Data System (PDS) has defined information models for creating metadata to describe bundles, collections and products for all the assets acquired by planetary science projects. Version 3 of the PDS Information Model (commonly known as "PDS3") is widely used and describes most of the existing planetary archive. Recently, PDS released version 4 of the Information Model (commonly known as "PDS4"), which is designed to improve the consistency, efficiency and discoverability of information. To aid in creating, validating and using PDS4 metadata, the PDS and a few associated groups have developed a variety of tools. In addition, some commercial tools, both free and for a fee, can be used to create and work with PDS4 metadata. We present an overview of these tools, describe those currently under development and provide guidance as to which tools may be most useful for missions, instrument teams and the individual researcher.
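Since PDS4 labels are XML documents, lightweight inspection is possible even without the dedicated tools surveyed here. A minimal sketch of ours (the filename is hypothetical, and the namespace URI given is the PDS4 core namespace as we understand it):

import xml.etree.ElementTree as ET

NS = {"pds": "http://pds.nasa.gov/pds4/pds/v1"}  # PDS4 core namespace

def summarize_label(path):
    """Print the logical identifier and title from a PDS4 product label."""
    root = ET.parse(path).getroot()
    lid = root.findtext(".//pds:logical_identifier", namespaces=NS)
    title = root.findtext(".//pds:title", namespaces=NS)
    print(f"{lid}: {title}")

summarize_label("example_product.xml")  # hypothetical label file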
Mukherjee, Joydeep; Llewellyn, Lyndon E; Evans-Illidge, Elizabeth A
2008-01-01
Microbial marine biodiscovery is a recent scientific endeavour developing at a time when information and other technologies are also making great strides. Global visualisation of datasets is now available to the world through powerful and readily available software such as Worldwind™, ArcGIS Explorer™ and Google Earth™. Overlaying custom information upon these tools is within the reach of every scientist, and more and more scientific organisations are making data available that can be integrated into these global visualisation tools. The integrated global view that these tools enable provides a powerful desktop exploration tool. Here we demonstrate the value of this approach to marine microbial biodiscovery by developing a geobibliography that incorporates citations on tropical and near-tropical marine microbial natural products research with Google Earth™ and additional ancillary global data sets. The tools and software used are all readily available, and the reader is able to use and install the material described in this article. PMID:19172194
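Overlays for Google Earth™ and similar tools are typically supplied as KML, a plain XML format, so georeferenced citations of the kind described can be generated with a few lines of code. A minimal sketch of ours (the citation record is invented):

CITATIONS = [
    ("Example sponge isolate study", 147.0, -18.3),  # (title, longitude, latitude)
]

def write_kml(path, records):
    """Write one KML placemark per georeferenced citation."""
    placemarks = "".join(
        f"<Placemark><name>{title}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for title, lon, lat in records
    )
    with open(path, "w") as fh:
        fh.write('<?xml version="1.0" encoding="UTF-8"?>'
                 '<kml xmlns="http://www.opengis.net/kml/2.2">'
                 f"<Document>{placemarks}</Document></kml>")

write_kml("geobibliography.kml", CITATIONS)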
NASA Astrophysics Data System (ADS)
Sanan, P.; Tackley, P. J.; Gerya, T.; Kaus, B. J. P.; May, D.
2017-12-01
StagBL is an open-source parallel solver and discretization library for geodynamic simulation, encapsulating and optimizing operations essential to staggered-grid finite volume Stokes flow solvers. It provides a parallel staggered-grid abstraction with a high-level interface in C and Fortran. On top of this abstraction, tools are available to define boundary conditions and interact with particle systems. Tools and examples to efficiently solve Stokes systems defined on the grid are provided in small (direct solver), medium (simple preconditioners), and large (block factorization and multigrid) model regimes. By working directly with leading application codes (StagYY, I3ELVIS, and LaMEM) and providing an API and examples to integrate with others, StagBL aims to become a community tool supplying scalable, portable, reproducible performance toward novel science in regional- and planet-scale geodynamics and planetary science. By implementing kernels used by many research groups beneath a uniform abstraction layer, the library will enable optimization for modern hardware, thus reducing community barriers to large- or extreme-scale parallel simulation on modern architectures. In particular, the library will include CPU-, Manycore-, and GPU-optimized variants of matrix-free operators and multigrid components. The common layer provides a framework upon which to introduce innovative new tools. StagBL will leverage p4est to provide distributed adaptive meshes, and incorporate a multigrid convergence analysis tool. These options, in addition to a wealth of solver options provided by an interface to PETSc, will make the most modern solution techniques available from a common interface. StagBL in turn provides a PETSc interface, DMStag, to its central staggered grid abstraction. We present public version 0.5 of StagBL, including preliminary integration with application codes and demonstrations with its own demonstration application, StagBLDemo. Central to StagBL is the notion of an uninterrupted pipeline from toy/teaching codes to high-performance, extreme-scale solves. StagBLDemo replicates the functionality of an advanced MATLAB-style regional geodynamics code, thus providing users with a concrete procedure to exceed the performance and scalability limitations of smaller-scale tools.
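To make "staggered grid" concrete for readers outside the field: in the usual marker-and-cell arrangement, pressures live at cell centers while velocity components live on cell faces. A toy 2-D illustration of ours (not StagBL or DMStag code):

import numpy as np

nx, ny = 4, 3
p = np.zeros((nx, ny))        # pressure at cell centers
vx = np.zeros((nx + 1, ny))   # x-velocity on vertical faces
vy = np.zeros((nx, ny + 1))   # y-velocity on horizontal faces
dx = dy = 1.0

def divergence(i, j):
    """Discrete divergence of cell (i, j) from its four face velocities."""
    return (vx[i + 1, j] - vx[i, j]) / dx + (vy[i, j + 1] - vy[i, j]) / dy

print(divergence(0, 0))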
French, William R; Zimmerman, Lisa J; Schilling, Birgit; Gibson, Bradford W; Miller, Christine A; Townsend, R Reid; Sherrod, Stacy D; Goodwin, Cody R; McLean, John A; Tabb, David L
2015-02-06
We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1-100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets.
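Readers who want to experiment with wavelet-based peak picking of the general kind implemented in CantWaiT can try the (much simpler) continuous-wavelet-transform picker that ships with SciPy. This sketch is ours and is unrelated to the ProteoWizard algorithm; the synthetic spectrum is invented.

import numpy as np
from scipy.signal import find_peaks_cwt

# Synthetic profile-mode spectrum: two Gaussian peaks plus mild noise.
mz = np.linspace(100.0, 110.0, 2000)
intensity = (np.exp(-((mz - 103.0) / 0.02) ** 2)
             + 0.5 * np.exp(-((mz - 107.5) / 0.02) ** 2)
             + 0.01 * np.random.default_rng(0).standard_normal(mz.size))

# The wavelet widths (in samples) should span the expected peak widths.
peak_idx = find_peaks_cwt(intensity, widths=np.arange(2, 20))
print(mz[peak_idx])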
Travel During Pregnancy: Considerations for the Obstetric Provider.
Antony, Kathleen M; Ehrenthal, Deborah; Evensen, Ann; Iruretagoyena, J Igor
2017-02-01
Travel among US citizens is becoming increasingly common, and travel during pregnancy is also speculated to be increasingly common. During pregnancy, the obstetric provider may be the first or only clinician approached with questions regarding travel. In this review, we discuss the reasons women travel during pregnancy, medical considerations for long-haul air travel, destination-specific medical complications, and precautions for pregnant women to take both before travel and while abroad. To improve the quality of pretravel counseling for patients before or during pregnancy, we have created 2 tools: a guide for assessing the pregnant patient's risk during travel and a pretravel checklist for the obstetric provider. A PubMed search for English-language publications about travel during pregnancy was performed using the search terms "travel" and "pregnancy" and was limited to those published since the year 2000. Studies on subtopics were not limited by year of publication. Eight review articles were identified. Three additional studies that analyzed data from travel clinics were found, and 2 studies reported on the frequency of international travel during pregnancy. Additional publications addressed air travel during pregnancy (10 reviews, 16 studies), high-altitude travel during pregnancy (5 reviews, 5 studies), and destination-specific illnesses in pregnant travelers. Travel during pregnancy, including international travel, is common. Pregnant travelers have unique travel-related and destination-specific risks. We review those risks and provide tools for obstetric providers to use in counseling pregnant travelers.
LIMO EEG: a toolbox for hierarchical LInear MOdeling of ElectroEncephaloGraphic data.
Pernet, Cyril R; Chauveau, Nicolas; Gaspar, Carl; Rousselet, Guillaume A
2011-01-01
Magnetic- and electric-evoked brain responses have traditionally been analyzed by comparing the peaks or mean amplitudes of signals from selected channels and averaged across trials. More recently, tools have been developed to investigate single trial response variability (e.g., EEGLAB) and to test differences between averaged evoked responses over the entire scalp and time dimensions (e.g., SPM, Fieldtrip). LIMO EEG is a Matlab toolbox (EEGLAB compatible) to analyse evoked responses over all space and time dimensions, while accounting for single trial variability using a simple hierarchical linear modelling of the data. In addition, LIMO EEG provides robust parametric tests, therefore providing a new and complementary tool in the analysis of neural evoked responses.
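The two-level ("hierarchical") scheme can be illustrated in its simplest mass-univariate form. This is a toy sketch of ours on random data, not LIMO EEG's actual code, which among other things offers robust estimators:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_trials, n_points = 12, 80, 50

# Level 1: per-subject GLM across single trials, one beta per time point.
betas = np.empty((n_subjects, n_points))
for s in range(n_subjects):
    X = np.column_stack([np.ones(n_trials), rng.integers(0, 2, n_trials)])  # intercept + condition
    Y = rng.standard_normal((n_trials, n_points))  # trials x time points (one channel)
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    betas[s] = B[1]  # condition effect, estimated from single-trial variability

# Level 2: test the condition betas against zero across subjects at each time point.
t, pvals = stats.ttest_1samp(betas, 0.0, axis=0)
print(pvals.min())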
Combining Induced Pluripotent Stem Cells and Genome Editing Technologies for Clinical Applications.
Chang, Chia-Yu; Ting, Hsiao-Chien; Su, Hong-Lin; Jeng, Jing-Ren
2018-01-01
In this review, we introduce current developments in induced pluripotent stem cells (iPSCs), site-specific nuclease (SSN)-mediated genome editing tools, and the combined application of these two novel technologies in biomedical research and therapeutic trials. The sustainable pluripotent property of iPSCs in vitro not only provides unlimited cell sources for basic research but also benefits precision medicine for human diseases. In addition, rapidly evolving SSN tools efficiently tailor genetic manipulations for exploring gene functions and can be utilized to correct genetic defects of congenital diseases in the near future. Combining iPSC and SSN technologies will create new reliable human disease models with isogenic backgrounds in vitro and provide new solutions for cell replacement and precise therapies.
The dark side of the immunohistochemical moon: industry.
Kalyuzhny, Alexander E
2009-12-01
Modern biological research is dependent on tools developed and provided by commercial suppliers, and antibodies for immunohistochemistry are among the most frequently used of these tools. Not all commercial antibodies perform as expected, however; this problem leads researchers to waste time and money when using antibodies that perform inadequately. Different commercial suppliers offer antibodies of varying degrees of quality and, in some cases, are unable to provide expert technical support for the immunohistochemical use of their antibodies. This article briefly describes the production of commercial antibodies from the manufacturer's perspective and presents some guidelines for choosing appropriate commercial antibodies for immunohistochemistry. Additionally, the article suggests steps to establish mutually beneficial relationships between commercial antibody suppliers and researchers who use them.
Social Media As a Leadership Tool for Pharmacists
Toney, Blake; Goff, Debra A.; Weber, Robert J.
2015-01-01
The profession of pharmacy is currently experiencing transformational change in health system practice models with pharmacists’ provider status. Gaining buy-in and support of stakeholders in medicine, nursing, and other advocates for patient care is critical. To this end, building momentum to advance the profession will require experimentation with and utilization of more efficient ways to disseminate relevant information. Traditional methods to communicate can be inefficient and painstakingly slow. Health care providers are turning to social media to network, connect, engage, educate, and learn. Pharmacy leaders can use social media as an additional tool in the leadership toolkit. This article of the Director’s Forum shows how social media can assist pharmacy leaders in further developing patient-centered pharmacy services. PMID:26448676
Rhebergen, Martijn D F; Hulshof, Carel T J; Lenderink, Annet F; van Dijk, Frank J H
2010-10-22
Common information facilities do not always provide the quality information needed to answer questions on health or health-related issues, such as Occupational Safety and Health (OSH) matters. Barriers may be the accessibility, quantity and readability of information. Online Question & Answer (Q&A) network tools, which link questioners directly to experts can overcome some of these barriers. When designing and testing online tools, assessing the usability and applicability is essential. Therefore, the purpose of this study is to assess the usability and applicability of a new online Q&A network tool for answers on OSH questions. We applied a cross-sectional usability test design. Eight occupational health experts and twelve potential questioners from the working population (workers) were purposively selected to include a variety of computer- and internet-experiences. During the test, participants were first observed while executing eight tasks that entailed important features of the tool. In addition, they were interviewed. Through task observations and interviews we assessed applicability, usability (effectiveness, efficiency and satisfaction) and facilitators and barriers in use. Most features were usable, though several could be improved. Most tasks were executed effectively. Some tasks, for example searching stored questions in categories, were not executed efficiently and participants were less satisfied with the corresponding features. Participants' recommendations led to improvements. The tool was found mostly applicable for additional information, to observe new OSH trends and to improve contact between OSH experts and workers. Hosting and support by a trustworthy professional organization, effective implementation campaigns, timely answering and anonymity were seen as important use requirements. This network tool is a promising new strategy for offering company workers high quality information to answer OSH questions. Q&A network tools can be an addition to existing information facilities in the field of OSH, but also to other healthcare fields struggling with how to answer questions from people in practice with high quality information. In the near future, we will focus on the use of the tool and its effects on information and knowledge dissemination.
Purification of functionalized DNA origami nanostructures.
Shaw, Alan; Benson, Erik; Högberg, Björn
2015-05-26
The high programmability of DNA origami has provided tools for precise manipulation of matter at the nanoscale. This manipulation of matter opens up the possibility to arrange functional elements for a diverse range of applications that utilize the nanometer precision provided by these structures. However, the realization of functionalized DNA origami still suffers from imperfect production methods, in particular in the purification step, where excess material is separated from the desired functionalized DNA origami. In this article we demonstrate and optimize two purification methods that have not previously been applied to DNA origami. In addition, we provide a systematic study comparing the purification efficacy of these and five other commonly used purification methods. Three types of functionalized DNA origami were used as model systems in this study. DNA origami was patterned with either small molecules, antibodies, or larger proteins. With the results of our work we aim to provide a guideline in quality fabrication of various types of functionalized DNA origami and to provide a route for scalable production of these promising tools.
iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations
NASA Astrophysics Data System (ADS)
Vanfretti, L.; Rabuzin, T.; Baudette, M.; Murad, M.
The iTesla Power Systems Library (iPSL) is a Modelica package providing a set of power system components for phasor time-domain modeling and simulation. The Modelica language provides a systematic approach to developing models using a formal mathematical description that uniquely specifies the physical behavior of a component or of the entire system. Furthermore, the standardized specification of the Modelica language (Modelica Association [1]) enables unambiguous model exchange by allowing any Modelica-compliant tool to use the models for simulation and analysis without the need for a specific model transformation tool. As the Modelica language is developed under open specifications, any tool that implements these requirements can be used, giving users the freedom to choose an Integrated Development Environment (IDE) of their choice. Furthermore, any integration solver can be implemented within a Modelica tool to simulate Modelica models. Additionally, Modelica is an object-oriented language, enabling code factorization and model re-use, and improving the readability of a library by structuring it with an object-oriented hierarchy. The developed library is released under an open-source license to enable wide distribution and let users customize it to their specific needs. This paper describes the iPSL and provides illustrative application examples.
The Future of Air Traffic Management
NASA Technical Reports Server (NTRS)
Denery, Dallas G.; Erzberger, Heinz; Edwards, Thomas A. (Technical Monitor)
1998-01-01
A system for the control of terminal area traffic to improve productivity, referred to as the Center-TRACON Automation System (CTAS), is being developed at NASA's Ames Research Center under a joint program with the FAA. CTAS consists of a set of integrated tools that provide computer-generated advisories for en-route and terminal area controllers. The premise behind the design of CTAS has been that successful planning of traffic requires accurate trajectory prediction. Databases consisting of representative aircraft performance models, airline-preferred operational procedures, and a three-dimensional wind model support the trajectory prediction. The research effort has been the design of a set of automation tools that make use of this trajectory prediction capability to assist controllers in the overall management of traffic. The first tool, the Traffic Management Advisor (TMA), provides the overall flow management between the en route and terminal areas. A second tool, the Final Approach Spacing Tool (FAST), provides terminal area controllers with sequence and runway advisories to allow optimal use of the runways. The TMA and FAST are now being used in daily operations at Dallas/Ft. Worth airport. Additional activities include the development of several other tools. These include: 1) the En Route Descent Advisor, which assists the en route controller in issuing conflict-free descents and ascents; 2) the extension of FAST to include speed and heading advisories, and the Expedite Departure Path (EDP), which assists the terminal controller in the management of departures; and 3) the Collaborative Arrival Planner (CAP), which will assist the airlines in operational decision making. The purpose of this presentation is to review the CTAS concept and to present the results of recent field tests. The paper will first discuss the overall concept and then discuss the status of the individual tools.
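The sequencing logic behind advisories of this kind can be pictured with a toy first-come-first-served scheduler that enforces a minimum separation at the runway threshold (our invented example, not CTAS code; the separation value and ETAs are arbitrary):

def schedule_arrivals(etas_min, min_separation_min=1.5):
    """Assign runway times first-come-first-served with minimum separation."""
    assigned = []
    for eta in sorted(etas_min):
        if assigned and eta < assigned[-1] + min_separation_min:
            eta = assigned[-1] + min_separation_min  # delay to preserve separation
        assigned.append(eta)
    return assigned

# Predicted ETAs in minutes from now for four aircraft:
print(schedule_arrivals([10.0, 10.5, 12.0, 12.2]))

Real tools, of course, optimize sequence and runway assignment against accurate trajectory predictions rather than simply delaying aircraft in arrival order.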
Review of functional near-infrared spectroscopy in neurorehabilitation
Mihara, Masahito; Miyai, Ichiro
2016-01-01
We provide a brief overview of the research and clinical applications of near-infrared spectroscopy (NIRS) in the neurorehabilitation field. NIRS has several potential advantages and shortcomings as a neuroimaging tool and is suitable for research application in the rehabilitation field. As one of the main applications of NIRS, we discuss its application as a monitoring tool, including investigating the neural mechanism of functional recovery after brain damage and investigating the neural mechanisms for controlling bipedal locomotion and postural balance in humans. In addition to being a monitoring tool, advances in signal processing techniques allow us to use NIRS as a therapeutic tool in this field. With a brief summary of recent studies investigating the clinical application of NIRS using a motor imagery task, we discuss the possible clinical usage of NIRS in brain–computer interface and neurofeedback. PMID:27429995
Mouse Models for Drug Discovery. Can New Tools and Technology Improve Translational Power?
Zuberi, Aamir; Lutz, Cathleen
2016-12-01
The use of mouse models in biomedical research and preclinical drug evaluation is on the rise. The advent of new molecular genome-altering technologies such as CRISPR/Cas9 allows for genetic mutations to be introduced into the germ line of a mouse faster and less expensively than previous methods. In addition, the rapid progress in the development and use of somatic transgenesis using viral vectors, as well as manipulations of gene expression with siRNAs and antisense oligonucleotides, allows for even greater exploration into genomics and systems biology. These technological advances come at a time when cost reductions in genome sequencing have led to the identification of pathogenic mutations in patient populations, providing unprecedented opportunities in the use of mice to model human disease. The ease of genetic engineering in mice also offers a potential paradigm shift in resource sharing and the speed by which models are made available in the public domain. Predictably, the knowledge alone that a model can be quickly remade will provide relief to resources encumbered by licensing and Material Transfer Agreements. For decades, mouse strains have provided an exquisite experimental tool to study the pathophysiology of disease and assess therapeutic options in a genetically defined system. However, a major limitation of the mouse has been the limited genetic diversity associated with common laboratory mice. This has been overcome with the recent development of the Collaborative Cross and Diversity Outbred mice. These strains provide new tools capable of replicating genetic diversity approaching that found in human populations. The Collaborative Cross and Diversity Outbred strains thus provide a means to observe and characterize the toxicity or efficacy of new therapeutic drugs for a given population. The combination of traditional and contemporary mouse genome editing tools, along with the addition of genetic diversity in new modeling systems, is synergistic and serves to make the mouse a better model for biomedical research, enhancing the potential for preclinical drug discovery and personalized medicine. © The Author 2016. Published by Oxford University Press.
Thompson, Rachel; Wojcieszek, Aleena M
2012-06-18
Little information is known about what information women want when choosing a birth facility. The objective of this study was to inform the development of a consumer decision support tool about birth facility by identifying the information needs of maternity care consumers in Queensland, Australia. Participants were 146 women residing in both urban and rural areas of Queensland, Australia who were pregnant and/or had recently given birth. A cross-sectional survey was administered in which participants were asked to rate the importance of 42 information items to their decision-making about birth facility. Participants could also provide up to ten additional information items of interest in an open-ended question. On average, participants rated 30 of the 42 information items as important to decision-making about birth facility. While the majority of information items were valued by most participants, those related to policies about support people, other women's recommendations about the facility, freedom to choose one's preferred position during labour and birth, the aesthetic quality of the facility, and access to on-site neonatal intensive care were particularly widely valued. Additional items of interest frequently focused on postnatal care and support, policies related to medical intervention, and access to water immersion. The women surveyed had significant and diverse information needs for decision-making about birth facility. These findings have immediate applications for the development of decision support tools about birth facility, and highlight the need for tools which provide a large volume of information in an accessible and user-friendly format. These findings may also be used to guide communication and information-sharing by care providers involved in counselling pregnant women and families about their options for birth facility or providing referrals to birth facilities.
Solutions for Mining Distributed Scientific Data
NASA Astrophysics Data System (ADS)
Lynnes, C.; Pham, L.; Graves, S.; Ramachandran, R.; Maskey, M.; Keiser, K.
2007-12-01
Researchers at the University of Alabama in Huntsville (UAH) and the Goddard Earth Sciences Data and Information Services Center (GES DISC) are working on approaches and methodologies facilitating the analysis of large amounts of distributed scientific data. Despite the existence of full-featured analysis tools, such as the Algorithm Development and Mining (ADaM) toolkit from UAH, and data repositories, such as the GES DISC, that provide online access to large amounts of data, there remain obstacles to getting the analysis tools and the data together in a workable environment. Does one bring the data to the tools or deploy the tools close to the data? The large size of many current Earth science datasets incurs significant overhead in network transfer for analysis workflows, even with the advanced networking capabilities that are available between many educational and government facilities. The UAH and GES DISC teams are developing a capability to define analysis workflows using distributed services and online data resources. We are developing two solutions for this problem that address different analysis scenarios. The first is a Data Center Deployment of the analysis services for large data selections, orchestrated by a remotely defined analysis workflow. The second is a Data Mining Center approach of providing a cohesive analysis solution for smaller subsets of data. The two approaches can be complementary and thus provide flexibility for researchers to exploit the best solution for their data requirements. The Data Center Deployment of the analysis services has been implemented by deploying ADaM web services at the GES DISC so they can access the data directly, without the need of network transfers. Using the Mining Workflow Composer, a user can define an analysis workflow that is then submitted through a Web Services interface to the GES DISC for execution by a processing engine. The workflow definition is composed, maintained and executed at a distributed location, but most of the actual services comprising the workflow are available local to the GES DISC data repository. Additional refinements will ultimately provide a package that is easily implemented and configured at other data centers for analysis of further science data sets. Enhancements to the ADaM toolkit allow the staging of distributed data wherever the services are deployed, to support a Data Mining Center that can provide additional computational resources, large storage of output, easier addition and updates to available services, and access to data from multiple repositories. The Data Mining Center case provides researchers more flexibility to quickly try different workflow configurations and refine the process, using smaller amounts of data that may likely be transferred from distributed online repositories. This environment is sufficient for some analyses, but can also be used as an initial sandbox to test and refine a solution before staging the execution at a Data Center Deployment. Detection of airborne dust both over water and land in MODIS imagery using mining services for both solutions will be presented. The dust detection is just one possible example of the mining and analysis capabilities the proposed mining services solutions will provide to the science community. More information about the available services and the current status of this project is available at http://www.itsc.uah.edu/mws/
Lyles, Courtney R; Altschuler, Andrea; Chawla, Neetu; Kowalski, Christine; McQuillan, Deanna; Bayliss, Elizabeth; Heisler, Michele; Grant, Richard W
2016-09-14
Complex patients with multiple chronic conditions often face significant challenges communicating and coordinating with their primary care physicians. These challenges are exacerbated by the limited time allotted to primary care visits. Our aim was to employ a user-centered design process to create a tablet tool for use by patients for visit discussion prioritization. We employed user-centered design methods to create a tablet-based waiting room tool that enables complex patients to identify and set discussion topic priorities for their primary care visit. In an iterative design process, we completed one-on-one interviews with 40 patients and their 17 primary care providers, followed by three design sessions with a 12-patient group. We audiorecorded and transcribed all discussions and categorized major themes. In addition, we met with 15 key health communication, education, and technology leaders within our health system to further review the design and plan for broader implementation of the tool. In this paper, we present the significant changes made to the tablet tool at each phase of this design work. Patient feedback emphasized the need to make the tablet tool accessible for patients who lacked technical proficiency and to reduce the quantity and complexity of text presentation. Both patients and their providers identified specific content choices based on their personal experiences (eg, the ability to raise private or sensitive concerns) and recommended targeting new patients. Stakeholder groups provided essential input on the need to augment text with video and to create different versions of the videos to match sex and race/ethnicity of the actors with patients. User-centered design in collaboration with patients, providers, and key health stakeholders led to marked evolution in the initial content, layout, and target audience for a tablet waiting room tool intended to assist complex patients with setting visit discussion priorities.
Drinking Water Consequences Tools. A Literature Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pasqualini, Donatella
2016-05-12
In support of the goals of the Department of Homeland Security's (DHS) National Protection and Programs Directorate and the Federal Emergency Management Agency, the DHS Office of Science and Technology is seeking to develop and/or modify consequence assessment tools to enable drinking water system owner/operators to estimate the societal and economic consequences of drinking water disruption due to threats and hazards. This work will expand the breadth of consequence estimation methods and tools using the best-available data describing water distribution infrastructure, owner/asset-level economic losses, regional-scale economic activity, and health. In addition, this project will deploy the consequence methodology and capability within a Web-based platform. This report supports the DHS effort by providing a literature review of existing tools for assessing the consequences of disruptions to water and wastewater systems. The review includes tools that assess water system resilience, vulnerability, and risk. This will help identify gaps and limitations of these tools in order to plan for the development of the next-generation consequence tool for water and wastewater system disruption.
King County Nearshore Habitat Mapping Data Report: Picnic Point to Shilshole Bay Marina
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodruff, Dana L.; Farley, Paul J.; Borde, Amy B.
2000-12-31
The objective of this study is to provide accurate, georeferenced maps of benthic habitats to assist in the siting of a new wastewater treatment plant outfall and the assessment of habitats of endangered, threatened, and economically important species. The mapping was conducted in the fall of 1999 using two complementary techniques: side-scan sonar and underwater videography. Products derived from these techniques include geographic information system (GIS) compatible polygon data of substrate type and vegetation cover, including eelgrass and kelp. Additional GIS overlays include underwater video track line data of total macroalgae, selected macroalgal species, fish, and macroinvertebrates. The combination of geo-referenced side-scan sonar and underwater video is a powerful technique for assessing and mapping nearshore habitat in Puget Sound. Side-scan sonar offers the ability to map eelgrass with high spatial accuracy and resolution, and provides information on patch size, shape, and coverage. It also provides information on substrate change and the location of specific targets (e.g., piers, docks, pilings, large boulders, debris piles). The addition of underwater video is a complementary tool providing both groundtruthing for the sonar and additional information on macro fauna and flora. As a groundtruthing technique, the video was able to confirm differences between substrate types, as well as detect subtle spatial changes in substrate. It also verified information related to eelgrass, including the density classification categories and the type of substrate associated with eelgrass, which could not be determined easily with side-scan sonar. Video is also a powerful tool for mapping the location of macroalgae (including kelp and Ulva), fish, and macroinvertebrates. The ability to geo-locate these resources in their functional habitat provides an added layer of information and analytical potential.
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.
2013-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system Sphinx together with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for tests, benchmark and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We will highlight these features and discuss important benefits of this approach for Amanzi. In addition, we will show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
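For readers unfamiliar with Sphinx, enabling equation rendering and executable plot examples takes only a few lines of configuration. This is a generic sketch of ours, not the project's actual conf.py; the project name is an illustrative value.

# conf.py -- minimal Sphinx configuration (illustrative values)
project = "Amanzi"
extensions = [
    "sphinx.ext.mathjax",                    # high-quality equation rendering
    "matplotlib.sphinxext.plot_directive",   # run matplotlib code embedded in the docs
]
source_suffix = ".rst"   # reStructuredText sources, versioned with the code
master_doc = "index"

Running sphinx-build with the html or latex builder (-b html / -b latex) over the same sources then produces the different output formats mentioned above.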
Content Validation and Evaluation of an Endovascular Teamwork Assessment Tool.
Hull, L; Bicknell, C; Patel, K; Vyas, R; Van Herzeele, I; Sevdalis, N; Rudarakanchana, N
2016-07-01
To modify, content validate, and evaluate a teamwork assessment tool for use in endovascular surgery. A multistage, multimethod study was conducted. Stage 1 included expert review and modification of the existing Observational Teamwork Assessment for Surgery (OTAS) tool. Stage 2 included identification of additional exemplar behaviours contributing to effective teamwork and enhanced patient safety in endovascular surgery (using real-time observation, focus groups, and semistructured interviews of multidisciplinary teams). Stage 3 included content validation of exemplar behaviours using expert consensus according to established psychometric recommendations, and evaluation of the structure, content, feasibility, and usability of the Endovascular Observational Teamwork Assessment Tool (Endo-OTAS) by an expert multidisciplinary panel. Stage 4 included final expert team review of exemplars. OTAS core team behaviours were maintained (communication, coordination, cooperation, leadership, and team monitoring). Of the 114 OTAS behavioural exemplars, 19 were modified, four removed, and 39 additional endovascular-specific behaviours identified. Content validation of these 153 exemplar behaviours showed that 113/153 (73.9%) reached the predetermined Item-Content Validity Index rating for teamwork and/or patient safety. After expert team review, 140/153 (91.5%) exemplars were deemed to warrant inclusion in the tool. More than 90% of the expert panel agreed that Endo-OTAS is an appropriate teamwork assessment tool with observable behaviours. Some concerns were noted about the time required to conduct observations and provide performance feedback. Endo-OTAS is a novel teamwork assessment tool, with evidence for content validity and relevance to endovascular teams. Endo-OTAS enables systematic, objective assessment of the quality of team performance during endovascular procedures. Copyright © 2016. Published by Elsevier Ltd.
mMass 3: a cross-platform software environment for precise analysis of mass spectrometric data.
Strohalm, Martin; Kavan, Daniel; Novák, Petr; Volný, Michael; Havlícek, Vladimír
2010-06-01
While tools for the automated analysis of MS and LC-MS/MS data are continuously improving, it is still often the case that at the end of an experiment, the mass spectrometrist will spend time carefully examining individual spectra. Current software support is mostly provided only by the instrument vendors, and the available software tools are often instrument-dependent. Here we present a new generation of mMass, a cross-platform environment for the precise analysis of individual mass spectra. The software covers a wide range of processing tasks such as import from various data formats, smoothing, baseline correction, peak picking, deisotoping, charge determination, and recalibration. Functions presented in the earlier versions such as in silico digestion and fragmentation were redesigned and improved. In addition to Mascot, an interface for ProFound has been implemented. A specific tool is available for isotopic pattern modeling to enable precise data validation. The largest available lipid database (from the LIPID MAPS Consortium) has been incorporated and together with the new compound search tool lipids can be rapidly identified. In addition, the user can define custom libraries of compounds and use them analogously. The new version of mMass is based on a stand-alone Python library, which provides the basic functionality for data processing and interpretation. This library can serve as a good starting point for other developers in their projects. Binary distributions of mMass, its source code, a detailed user's guide, and video tutorials are freely available from www.mmass.org .
Testing simple deceptive honeypot tools
NASA Astrophysics Data System (ADS)
Yahyaoui, Aymen; Rowe, Neil C.
2015-05-01
Deception can be a useful defensive technique against cyber-attacks; it has the advantage of unexpectedness to attackers and offers a variety of tactics. Honeypots are a good tool for deception. They act as decoy computers to confuse attackers and exhaust their time and resources. This work tested the effectiveness of two free honeypot tools in real networks by varying their location and virtualization, and by adding more deception to them. We tested a Web honeypot tool, Glastopf, and an SSH honeypot tool, Kippo. We deployed the Web honeypot in both a residential network and our organization's network, and as both real and virtual machines; the organization honeypot attracted more attackers starting in the third week. Results also showed that the virtual honeypots received attacks from more unique IP addresses, and that adding deception to the Web honeypot, in the form of additional linked Web pages and interactive features, generated more interest by attackers. For comparison, we examined log files of a legitimate Web site, www.cmand.org. The traffic distributions for the Web honeypot and the legitimate Web site showed similarities (with much malicious traffic from Brazil), but the SSH honeypot was different (with much malicious traffic from China). Contrary to previous experiments, where traffic to static honeypots decreased quickly, our honeypots received increasing traffic over a period of three months. It appears that both honeypot tools are useful for providing intelligence about cyber-attack methods, and that additional deception is helpful.
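As a toy illustration of the kind of log analysis behind the unique-IP counts reported here (this is not the authors' analysis code, and the log file names are hypothetical; Glastopf and Kippo each have their own log formats):

```python
# Toy tally of unique attacker IPs in a honeypot log (illustrative only;
# the plain-text log format assumed here is hypothetical).
import re
from collections import Counter

IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def unique_ips(log_path):
    """Return a Counter of IPv4 addresses found in a text log file."""
    hits = Counter()
    with open(log_path) as f:
        for line in f:
            for ip in IP_RE.findall(line):
                hits[ip] += 1
    return hits

# Example: compare the breadth of attacks on two honeypots.
# print(len(unique_ips("glastopf.log")), len(unique_ips("kippo.log")))
```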
Comprehensive review on additives of topical dosage forms for drug delivery.
Garg, Tarun; Rath, Goutam; Goyal, Amit K
2015-12-01
Skin is the largest organ of the human body and plays the most important role in protecting against pathogens and foreign matter. Three modes of administration (topical, regional, and transdermal) are widely used for the delivery of various dosage forms. Among these, topical dosage forms are preferred because they provide local therapeutic activity when applied to the skin or mucous membranes. Additives, or pharmaceutical excipients (the non-drug components of a dosage form), are used as inactive ingredients in dosage forms or as tools for structuring them. The main uses of topical dosage form additives are controlling the extent of absorption, maintaining viscosity, improving stability and organoleptic properties, and increasing the bulk of the formulation. The overall goal of this article is to provide the clinician with information on topical dosage form additives and their current major applications against various diseases.
A study on the applications of AI in finishing of additive manufacturing parts
NASA Astrophysics Data System (ADS)
Fathima Patham, K.
2017-06-01
Artificial intelligence and computer simulation are powerful technological tools for solving complex problems in the manufacturing industries. Additive Manufacturing is a powerful manufacturing technique that provides design flexibility: products with complex shapes are manufactured directly, without the need for machining or tooling. However, the main drawback of components produced using Additive Manufacturing processes is the quality of their surfaces. This study aims to minimize the defects caused during Additive Manufacturing with the aid of Artificial Intelligence. The developed AI system has three layers, each of which attempts to eliminate or minimize production errors. The first layer optimizes the digitization of the 3D CAD model of the product and hence reduces staircase errors. The second layer optimizes the 3D printing machine parameters in order to eliminate the warping effect. The third layer helps choose the surface finishing technique suitable for the printed component, based on the Degree of Complexity of the product and the material. The efficiency of the developed AI system was examined on functional parts such as gears.
Guerra, Ricardo Oliveira; Oliveira, Bruna Silva; Alvarado, Beatriz Eugenia; Curcio, Carmen Lucia; Rejeski, W Jack; Marsh, Anthony P; Ip, Edward H; Barnard, Ryan T; Guralnik, Jack M; Zunzunegui, Maria Victoria
2016-01-01
Aim: To assess the reliability and the validity of Portuguese- and Spanish-translated versions of the video-based short-form Mobility Assessment Tool in assessing self-reported mobility, and to provide evidence for the applicability of these videos in elderly Latin American populations as a complement to physical performance measures. Methods: The sample consisted of 300 elderly participants (150 from Brazil, 150 from Colombia) recruited at neighborhood social centers. Mobility was assessed with the Mobility Assessment Tool, and compared with the Short Physical Performance Battery score and self-reported functional limitations. Reliability was calculated using intraclass correlation coefficients. Multiple linear regression analyses were used to assess associations among mobility assessment tools, health, and sociodemographic variables. Results: A significant gradient of increasing Mobility Assessment Tool score with better physical function was observed for both self-reported and objective measures, and in each city. Associations between self-reported mobility and health were strong and significant. Mobility Assessment Tool scores were lower in women at both sites. Intraclass correlation coefficients of the Mobility Assessment Tool were 0.94 (95% confidence interval 0.90–0.97) in Brazil and 0.81 (95% confidence interval 0.66–0.91) in Colombia. Mobility Assessment Tool scores were lower in Manizales than in Natal after adjustment for Short Physical Performance Battery, self-rated health, and sex. Conclusions: These results provide evidence for high reliability and good validity of the Mobility Assessment Tool in its Spanish and Portuguese versions used in Latin American populations. In addition, the Mobility Assessment Tool can detect mobility differences related to environmental features that cannot be captured by objective performance measures. PMID:24666718
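Test-retest reliability of the kind reported above is commonly computed as an intraclass correlation coefficient. A generic sketch follows (the authors' statistical software is not stated; this uses the third-party pingouin package, and the data and column names are hypothetical):

```python
# Sketch of a test-retest ICC computation like the one reported above
# (generic illustration; not the authors' code). Requires pandas + pingouin.
import pandas as pd
import pingouin as pg

df = pd.DataFrame({
    "subject": [1, 1, 2, 2, 3, 3, 4, 4],
    "session": ["t1", "t2"] * 4,            # two administrations of the tool
    "score":   [55, 57, 42, 44, 68, 66, 50, 49],
})
icc = pg.intraclass_corr(data=df, targets="subject",
                         raters="session", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])  # several ICC variants with 95% CIs
```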
Lesion correlates of impairments in actual tool use following unilateral brain damage.
Salazar-López, E; Schwaiger, B J; Hermsdörfer, J
2016-04-01
To understand how the brain controls actions involving tools, tests have been developed employing different paradigms such as pantomime, imitation, and real tool use. The relevant areas have been localized in the premotor cortex, the middle temporal gyrus, and the superior and inferior parietal lobes. This study employs Voxel Lesion Symptom Mapping to relate functional impairment in actual tool use to the extent and localization of structural damage in the left (LBD, N=31) and right (RBD, N=19) hemisphere in chronic stroke patients. A series of 12 tools was presented to participants in a carousel. In addition, a non-tool condition tested the prescribed manipulation of a bar. The execution was scored according to an apraxic error scale based on the dimensions grasp, movement, direction, and space. Results in the LBD group show that the ventro-dorsal stream constitutes the core of the defective network responsible for impaired tool use; it is composed of the inferior parietal lobe, the supramarginal and angular gyri, and the dorsal premotor cortex. In addition, involvement of regions in the temporal lobe, the rolandic operculum, the ventral premotor cortex, and the middle occipital gyrus provides evidence of the role of the ventral stream in this task. Brain areas related to the use of the bar largely overlapped with this network. For patients with RBD, the data were less conclusive; however, a trend toward involvement of the temporal lobe in apraxic errors was apparent. Skilled bar manipulation depended on the same temporal area in these patients. Therefore, actual tool use depends on a well-described left fronto-parietal-temporal network. RBD affects actual tool use; however, the underlying neural processes may be more widely distributed and more heterogeneous. Goal-directed manipulation of non-tool objects seems to involve very similar brain areas as tool use, suggesting that both types of manipulation share identical processes and neural representations. Copyright © 2016 Elsevier Ltd. All rights reserved.
Martucci, Katherine T; Mackey, Sean C
2018-06-01
Neuroimaging research has demonstrated definitive involvement of the central nervous system in the development, maintenance, and experience of chronic pain. Structural and functional neuroimaging has helped elucidate central nervous system contributors to chronic pain in humans. Neuroimaging of pain has provided a tool for increasing our understanding of how pharmacologic and psychologic therapies improve chronic pain. To date, findings from neuroimaging pain research have benefitted clinical practice by providing clinicians with an educational framework to discuss the biopsychosocial nature of pain with patients. Future advances in neuroimaging-based therapeutics (e.g., transcranial magnetic stimulation, real-time functional magnetic resonance imaging neurofeedback) may provide additional benefits for clinical practice. In the future, with standardization and validation, brain imaging could provide objective biomarkers of chronic pain, and guide treatment for personalized pain management. Similarly, brain-based biomarkers may provide an additional predictor of perioperative prognoses.
Guidelines for the analysis of free energy calculations
Klimovich, Pavel V.; Shirts, Michael R.; Mobley, David L.
2015-01-01
Free energy calculations based on molecular dynamics (MD) simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub at https://github.com/choderalab/pymbar-examples, that implements the analysis practices reviewed here for several reference simulation packages, and which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope these tools and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations. PMID:25808134
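As a hedged sketch of the kind of estimate such tools automate, the snippet below runs MBAR on placeholder data (this follows the pymbar 4 interface; method names differ in earlier pymbar releases, and the random inputs stand in for real reduced-potential matrices from simulation output):

```python
# Minimal MBAR free-energy estimate with pymbar (a sketch, not the
# alchemical-analysis.py workflow; assumes the pymbar 4 interface).
import numpy as np
from pymbar import MBAR

# u_kn[k, n]: reduced potential of sample n evaluated in state k;
# N_k[k]: number of samples drawn from state k. Synthetic placeholders here.
K, N = 3, 100
u_kn = np.random.rand(K, K * N)   # placeholder data, not real simulation output
N_k = np.full(K, N)

mbar = MBAR(u_kn, N_k)
results = mbar.compute_free_energy_differences()
print(results["Delta_f"])   # matrix of free energy differences between states
print(results["dDelta_f"])  # corresponding uncertainty estimates
```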
Pilot study of digital tools to support multimodal hand hygiene in a clinical setting.
Thirkell, Gary; Chambers, Joanne; Gilbart, Wayne; Thornhill, Kerrill; Arbogast, James; Lacey, Gerard
2018-03-01
Digital tools for hand hygiene do not share data, limiting their potential to support multimodal programs. The Christie NHS Foundation Trust, United Kingdom, worked with GOJO (in the United States), MEG (in Ireland), and SureWash (in Ireland) to integrate their systems and pilot their combined use in a clinical setting. A 28-bed medical oncology unit piloted the system for 5 weeks. Live data from the tools were combined to create a novel combined risk status metric that was displayed publicly and via a management Web site. The combined risk status declined over the pilot period; however, larger and longer-duration studies are required to reach statistical significance. Staff and especially patient reactions were positive: 70% of the hand hygiene training events were performed by patients. The digital tools did not negatively impact clinical workflow and received positive engagement from staff and patients. The combined risk status did not change significantly over the short pilot period, in part because no specific hand hygiene improvement campaign was underway at the time of the pilot study. The results indicate that integrated digital tools can provide both rich data and novel feedback mechanisms that measure impact and support the implementation of multimodal hand hygiene campaigns, reducing the need for significant additional personnel resources. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. All rights reserved.
The coaching process: an effective tool for professional development.
Kowalski, Karren; Casper, Colleen
2007-01-01
A model for coaching in nursing is described. Criteria for selecting a coach are discussed, and competencies for a coach are recommended. In addition, guidelines for coaching sessions are provided, as well as an example of an action plan outline to help the coachee identify areas of desired growth and options for developing those areas.
Demonstrating the Relationship between School Nurse Workload and Student Outcomes
ERIC Educational Resources Information Center
Daughtry, Donna; Engelke, Martha Keehner
2018-01-01
This article describes how one very large, diverse school district developed a Student Acuity Tool for School Nurse Assignment and used a logic model to successfully advocate for additional school nurse positions. The logic model included three student outcomes that were evaluated: provide medications and procedures safely and accurately, increase…
Collaborative Scaffolding in Online Task-Based Voice Interactions between Advanced Learners
ERIC Educational Resources Information Center
Kenning, Marie-Madeleine
2010-01-01
This paper reports some of the findings of a distinctive innovative use of audio-conferencing involving a population (campus-based advanced learners) and a type of application (task-based language learning) that have received little attention to date: the use of Wimba Voice Tools to provide additional opportunities for spoken interactions between…
The Use of a Computer-Based Writing Program: Facilitation or Frustration?
ERIC Educational Resources Information Center
Chen, Chi-Fen Emily; Cheng, Wei-Yuan
2006-01-01
The invention of computer-based writing programs has revolutionized the teaching of second language writing. Embedded with an artificial intelligence scoring engine, such a program can provide students with both an immediate score and diagnostic feedback on their essays. In addition, some of these programs offer convenient writing and editing tools to facilitate…
Breaking into the Movies: Pedagogy and the Politics of Film.
ERIC Educational Resources Information Center
Giroux, Henry A.
2001-01-01
Proposes that in addition to entertaining, film offers up subject positions; mobilizes desires; influences its audience unconsciously; and helps construct the landscape of American culture. Notes that film can provide a pedagogical tool for offering students alternative views of the world. Concludes that as a form of public pedagogy, film combines…
Exploring Metaskills of Knowledge-Creating Inquiry in Higher Education
ERIC Educational Resources Information Center
Muukkonen, Hanni; Lakkala, Minna
2009-01-01
The skills of knowledge-creating inquiry are explored as a challenge for higher education. The knowledge-creation approach to learning provides a theoretical tool for addressing them: In addition to the individual and social aspects in regulation of inquiry, the knowledge-creation approach focuses on aspects related to advancing shared objects of…
Dynamics of buckbrush populations under simulated forest restoration alternatives
David W. Huffman; Margaret M. Moore
2008-01-01
Plant population models are valuable tools for assessing ecological tradeoffs between forest management approaches. In addition, these models can provide insight on plant life history patterns and processes important for persistence and recovery of populations in changing environments. In this study, we evaluated a set of ecological restoration alternatives for their...
IP Addressing: Problem-Based Learning Approach on Computer Networks
ERIC Educational Resources Information Center
Jevremovic, Aleksandar; Shimic, Goran; Veinovic, Mladen; Ristic, Nenad
2017-01-01
The case study presented in this paper describes the pedagogical aspects and experience gathered while using an e-learning tool named IPA-PBL. Its main purpose is to provide additional motivation for adopting theoretical principles and procedures in a computer networks course. In the proposed model, the sequencing of activities of the learning…
ERIC Educational Resources Information Center
Sebastian, James; Allensworth, Elaine; Stevens, David
2014-01-01
Background: In this paper we call for studying school leadership and its relationship to instruction and learning through approaches that highlight the role of configurations of multiple organizational supports. A configuration-focused approach to studying leadership and other essential supports provides a valuable addition to existing tools in…
Safe Schools: Unified Emergency Contingency Plan for Schools.
ERIC Educational Resources Information Center
Illinois State Police, Springfield.
This contingency plan is intended to stimulate emergency planning and provide an organizational tool for Illinois schools to use in the development of individual emergency plans. It may accommodate and complement a school's current contingency plan and will allow for the inclusion of additional material concerning school safety. It is intended as…
Michael J. Firko; Jane Leslie Hayes
1990-01-01
Quantitative genetic studies of resistance can provide estimates of genetic parameters not available with other types of genetic analyses. Three methods are discussed for estimating the amount of additive genetic variation in resistance to individual insecticides and the subsequent estimation of heritability (h²) of resistance. Sibling analysis and...
ERIC Educational Resources Information Center
Klemes, Joel; Epstein, Alit; Zuker, Michal; Grinberg, Nira; Ilovitch, Tamar
2006-01-01
The current study examines how a computerized learning environment assists students with learning disabilities (LD) enrolled in a distance learning course at the Open University of Israel. The technology provides computer display of the text, synchronized with auditory output and accompanied by additional computerized study skill tools which…
MAGIC: A Tool for Combining, Interpolating, and Processing Magnetograms
NASA Technical Reports Server (NTRS)
Allred, Joel
2012-01-01
Transients in the solar coronal magnetic field are ultimately the source of space weather. Models which seek to track the evolution of the coronal field require magnetogram images to be used as boundary conditions. These magnetograms are obtained by numerous instruments with different cadences and resolutions. A tool is required which allows modelers to find all available data and use them to craft accurate and physically consistent boundary conditions for their models. We have developed a software tool, MAGIC (MAGnetogram Interpolation and Composition), to perform exactly this function. MAGIC can manage the acquisition of magnetogram data, cast it into a source-independent format, and then perform the necessary spatial and temporal interpolation to provide magnetic field values as requested onto model-defined grids. MAGIC has the ability to patch magnetograms from different sources together, providing a more complete picture of the Sun's field than is possible from single magnetograms. In doing this, care must be taken so as not to introduce nonphysical current densities along the seam between magnetograms. We have designed a method which minimizes these spurious current densities. MAGIC also includes a number of post-processing tools which can provide additional information to models. For example, MAGIC includes an interface to the DAVE4VM tool, which derives surface flow velocities from the time evolution of the surface magnetic field. MAGIC has been developed as an application of the KAMELEON data formatting toolkit which has been developed by the CCMC.
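As a rough illustration of the temporal interpolation step described here (this is not MAGIC's algorithm; real magnetogram compositing must also handle projection, differing resolutions, and current-density consistency at seams), a minimal sketch:

```python
# Toy temporal interpolation between two magnetogram frames (illustrative
# only; placeholder arrays stand in for real line-of-sight field maps).
import numpy as np

def interpolate_frames(b0, b1, t0, t1, t):
    """Linearly interpolate field maps b0 (at time t0) and b1 (at t1) to time t."""
    w = (t - t0) / (t1 - t0)        # fractional position between the frames
    return (1.0 - w) * b0 + w * b1

b0 = np.random.randn(256, 256)      # placeholder field at t0 (seconds)
b1 = np.random.randn(256, 256)      # placeholder field at t1
b_mid = interpolate_frames(b0, b1, 0.0, 600.0, 300.0)  # halfway in time
```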
Wild monkeys flake stone tools.
Proffitt, Tomos; Luncz, Lydia V; Falótico, Tiago; Ottoni, Eduardo B; de la Torre, Ignacio; Haslam, Michael
2016-11-03
Our understanding of the emergence of technology shapes how we view the origins of humanity. Sharp-edged stone flakes, struck from larger cores, are the primary evidence for the earliest stone technology. Here we show that wild bearded capuchin monkeys (Sapajus libidinosus) in Brazil deliberately break stones, unintentionally producing recurrent, conchoidally fractured, sharp-edged flakes and cores that have the characteristics and morphology of intentionally produced hominin tools. The production of archaeologically visible cores and flakes is therefore no longer unique to the human lineage, providing a comparative perspective on the emergence of lithic technology. This discovery adds an additional dimension to interpretations of the human Palaeolithic record, the possible function of early stone tools, and the cognitive requirements for the emergence of stone flaking.
Vallejo, Vanessa; Mitache, Andrei V; Tarnanas, Ioannis; Muri, Rene; Mosimann, Urs P; Nef, Tobias
2015-08-01
Computer games for a serious purpose, so-called serious games, can provide additional information for the screening and diagnosis of cognitive impairment. Moreover, they have the advantage of being an ecological tool, since they involve tasks of daily living. However, there is a need for better comprehensive designs regarding the acceptance of this technology, as the target population is older adults who are not used to interacting with novel technologies. Moreover, given the complexity of the diagnosis and the need for precise assessment, an evaluation of the best approach to analyzing the performance data is required. The present study examines the usability of a new screening tool and proposes several new outlines for data analysis.
NASA Astrophysics Data System (ADS)
Vogt, S.; Neumayer, F. F.; Serkyov, I.; Jesner, G.; Kelsch, R.; Geile, M.; Sommer, A.; Golle, R.; Volk, W.
2017-09-01
Steel is the most common material used in vehicle chassis, which makes its study an important topic for the automotive industry. Recently developed ultra-high-strength steels (UHSS) provide extreme tensile strength up to 1,500 MPa and combine great crashworthiness with good weight reduction potential. However, in order to reach the final shape of sheet metal parts, additional cutting steps such as trimming and piercing are often required. The final trimming of quenched metal sheets presents a huge challenge to a conventional process, mainly because of the extreme cutting force required. The high cutting impact, due to the material's brittleness, causes excessive tool wear or even sudden tool failure. Therefore, a laser is commonly used for the cutting process, which is time- and energy-consuming. The purpose of this paper is to demonstrate the capability of a conventional blanking tool design in a continuous-stroke piercing process using boron steel 22MnB5 sheets. Two types of tool steel were tested for their suitability as active cutting elements: the electro-slag-remelted (ESR) cold work tool steel Bohler K340 ISODUR and the powder-metallurgical (PM) high speed steel Bohler S390 MICROCLEAN. An FEM study provided information about an optimized punch design that withstands buckling under high cutting forces. The wear behaviour of the process was assessed by the tool wear of the active cutting elements as well as the quality of the cut surfaces.
Lange, Belinda; Chang, Chien-Yen; Suma, Evan; Newman, Bradley; Rizzo, Albert Skip; Bolas, Mark
2011-01-01
The use of commercial video games as rehabilitation tools, such as the Nintendo WiiFit, has recently gained much interest in the physical therapy arena. Motion tracking controllers such as the Nintendo Wiimote are not sensitive enough to accurately measure performance in all components of balance. Additionally, users can figure out how to "cheat" inaccurate trackers by performing minimal movement (e.g., twisting the wrist holding a Wiimote instead of making a full arm swing). Physical rehabilitation requires accurate and appropriate tracking and feedback of performance. To this end, we are developing applications that leverage recent advances in commercial video game technology to provide full-body control of animated virtual characters. A key component of our approach is the use of newly available low-cost depth-sensing camera technology that provides markerless full-body tracking on a conventional PC. The aim of this research was to develop and assess an interactive game-based rehabilitation tool for balance training of adults with neurological injury.
Instrument Development of Integrative Health and Wellness Assessment™.
McElligott, Deborah; Eckardt, Sarah; Montgomery Dossey, Barbara; Luck, Susan; Eckardt, Patricia
2017-12-01
The nurse coach role was developed to address the needs of our nation and the world for health and wellbeing. The Theory of Integrative Nurse Coaching provides a foundation for coaching interventions supporting health promotion, and a framework for the development of the Integrative Health and Wellness Assessment (IHWA) short form. This 36-question Likert-type scale self-reporting tool assists participants in assessing healthy behaviors through a self-reflection process, provides information for the coaching relationship, and may serve as an outcome measurement. This article describes the history of the IHWA tool and the development and pilot testing of the IHWA short form using guidelines provided by DeVellis. Results of the Kaiser-Meyer-Olkin test yielded a value of .520, and Bartlett's test of sphericity was significant. Cronbach's alpha for overall scale internal consistency was .88 (n = 36). Pilot study results indicate that the scale could be improved through additional revision, and an ongoing study is in progress.
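The internal-consistency statistic reported here, Cronbach's alpha, follows directly from the item-score matrix via the standard formula; a generic sketch (not the authors' code, with synthetic data in place of the IHWA responses):

```python
# Cronbach's alpha for a k-item scale (standard formula; generic sketch).
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Synthetic demo: 30 respondents x 36 items sharing a common factor, so the
# items are internally consistent and alpha comes out high.
rng = np.random.default_rng(0)
factor = rng.normal(size=(30, 1))                 # each respondent's trait level
scores = factor + rng.normal(scale=0.8, size=(30, 36))
print(round(cronbach_alpha(scores), 2))
```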
Empirical flow parameters : a tool for hydraulic model validity
Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.
2013-01-01
The objectives of this project were (1) to determine and present, from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors, to produce charts such as Figure 1, and to produce empirical distributions of the various flow parameters to provide a methodology to "check if model results are way off!"; (2) to produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas, to provide a secondary way to compare such values to a conventional hydraulic modeling approach; and (3) to present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.
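Of the ancillary values listed, the Froude number follows directly from the standard open-channel formula; a minimal sketch (generic variable names, not the project's code):

```python
# Froude number, one of the ancillary values mentioned above.
# Fr = V / sqrt(g * D): Fr < 1 is subcritical flow, Fr > 1 supercritical.
import math

def froude_number(velocity_m_s, hydraulic_depth_m, g=9.81):
    """Compute the Froude number from mean velocity and hydraulic depth."""
    return velocity_m_s / math.sqrt(g * hydraulic_depth_m)

print(froude_number(2.0, 1.5))  # e.g., 2 m/s flow at 1.5 m mean depth -> ~0.52
```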
Economic impact of a nationwide interoperable e-Health system using the PENG evaluation tool.
Parv, L; Saluse, J; Aaviksoo, A; Tiik, M; Sepper, R; Ross, P
2012-01-01
The aim of this paper is to evaluate the costs and benefits of the Estonian interoperable health information exchange system. In addition, a framework will be built for follow-up monitoring and analysis of a nationwide HIE system. The PENG evaluation tool was used to map and quantify the costs and benefits arising from type II diabetic patient management for patients, providers, and society. The analysis concludes with a quantification based on real costs and potential benefits identified by a panel of experts. Setting up a countrywide interoperable eHealth system incurs a large initial investment. However, if the system is working seamlessly, benefits will surpass costs within three years. The results show that while society stands to benefit the most, the costs will be borne mainly by the healthcare providers. Therefore, new government policies should be devised to encourage providers to invest, ensuring society-wide benefits.
Integration of g4tools in Geant4
NASA Astrophysics Data System (ADS)
Hřivnáčová, Ivana
2014-06-01
g4tools, originally part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It allows users to create and manipulate histograms and ntuples and to write them in the supported file formats (ROOT, AIDA XML, CSV, and HBOOK). It is integrated in Geant4 through analysis manager classes, thus providing a uniform interface to the g4tools objects and hiding the differences between the classes for the different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes following user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example; it is already used in the majority of the Geant4 examples within the Geant4 9.6 release. In this paper, we give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.
EJSCREEN Version 1, Environmental Data
This map service displays raw environmental data for the 12 environmental indicators used in EJSCREEN. The map service displays percentiles for each of the environmental indicators to provide perspective on how a selected location compares to the entire nation. EJSCREEN is an environmental justice screening tool that provides EPA with a nationally consistent approach to screening for potential areas of EJ concern that may warrant further investigation. The EJ indexes are block group level results that combine multiple demographic factors with a single environmental variable (such as proximity to traffic) that can be used to help identify communities living with the greatest potential for negative environmental and health effects. The EJSCREEN tool is currently for internal EPA use only. It is anticipated that as users become accustomed to this new tool, individual programs within the Agency will develop program use guidelines and a community of practice will develop around them within the EPA Geoplatform. Users should keep in mind that screening tools are subject to substantial uncertainty in their demographic and environmental data, particularly when looking at small geographic areas, such as Census block groups. Data on the full range of environmental impacts and demographic factors in any given location are almost certainly not available directly through this tool, and its initial results should be supplemented with additional information and local knowledge.
The CRB1 Complex: Following the Trail of Crumbs to a Feasible Gene Therapy Strategy.
Quinn, Peter M; Pellissier, Lucie P; Wijnholds, Jan
2017-01-01
Once considered science fiction, gene therapy is rapidly becoming scientific reality, targeting a growing number of the approximately 250 genes linked to hereditary retinal disorders such as retinitis pigmentosa and Leber's congenital amaurosis. Powerful new technologies have emerged, leading to the development of humanized models for testing and screening these therapies, bringing us closer to the goal of personalized medicine. These tools include the ability to differentiate human induced pluripotent stem cells (iPSCs) to create a "retina-in-a-dish" model and the self-formed ectodermal autonomous multi-zone, which can mimic whole eye development. In addition, highly specific gene-editing tools are now available, including the CRISPR/Cas9 system and the recently developed homology-independent targeted integration approach, which allows gene editing in non-dividing cells. Variants in the CRB1 gene have long been associated with retinopathies, and more recently the CRB2 gene has also been shown to have possible clinical relevance with respect to retinopathies. In this review, we discuss the role of the CRB protein complex in patients with retinopathy. In addition, we discuss new opportunities provided by stem cells and gene-editing tools, and we provide insight into how the retinal therapeutic pipeline can be improved. Finally, we discuss the current state of adeno-associated virus-mediated gene therapy and how it can be applied to treat retinopathies associated with mutations in CRB1.
A New U.S. West Coast Network of Atmospheric River Observatories
NASA Astrophysics Data System (ADS)
White, A. B.; Wilczak, J. M.; Ayers, T. E.; King, C. W.; Jordan, J. R.; Shaw, W. J.; Flaherty, J. E.; Morris, V. R.
2015-12-01
The West Coast of North America is the gateway to winter storms forming over the Pacific Ocean that deliver most of the precipitation and water supply to the region. Satellites are capable of detecting the concentrated water vapor in these storms (a.k.a. atmospheric rivers) over the oceans, but because of the complex emissivity of land surfaces, fail to do so over land. In addition, these storms often are accompanied by a baroclinically induced low-level jet that drives the moisture up the windward slopes of coastal and inland mountain ranges and produces orographically enhanced precipitation. To date, satellites cannot resolve this important feature. NOAA's Hydrometeorology Testbed (HMT; hmt.noaa.gov) has developed the concept of an atmospheric river observatory (ARO); a collection of ground-based instruments capable of detecting and monitoring the water vapor transport in the low-level jet region. With funding provided by the California Department of Water Resources and U.S. Department of Energy, HMT has installed a picket fence of AROs along the U.S. West Coast. In addition, HMT has developed an award-winning water vapor flux tool that takes advantage of the data collected by the AROs to provide situational awareness and decision support for end users. This tool recently has been updated to include operational weather prediction output. The ARO network and water vapor flux tool will be described in this presentation.
Mossavar-Rahmani, Yasmin; Henry, Holly; Rodabough, Rebecca; Bragg, Charlotte; Brewer, Amy; Freed, Trish; Kinzel, Laura; Pedersen, Margaret; Soule, C Oehme; Vosburg, Shirley
2004-01-01
Self-monitoring promotes behavior changes by promoting awareness of eating habits and creating self-efficacy. It is an important component of the Women's Health Initiative dietary intervention. During the first year of intervention, 74% of the total sample of 19,542 dietary intervention participants self-monitored. As the study progressed, the self-monitoring rate declined, to 59% by spring 2000. Participants were challenged by an inability to accurately estimate the fat content of restaurant foods and by the inconvenience of carrying bulky self-monitoring tools. In 1996, a Self-Monitoring Working Group was organized to develop additional self-monitoring options that were responsive to participant needs. This article describes the original and additional self-monitoring tools and trends in tool use over time. The original tools were the Food Diary and Fat Scan. Additional tools include the Keeping Track of Goals, Quick Scan, Picture Tracker, and Eating Pattern Changes instruments. The additional tools were used by the majority (5,353 of 10,260, or 52%) of participants who were self-monitoring by spring 2000. Developing self-monitoring tools that are responsive to participant needs increases the likelihood that self-monitoring can enhance dietary reporting adherence, especially in long-term clinical trials.
Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan
2015-01-01
Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.
NASA Astrophysics Data System (ADS)
Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.
2012-12-01
The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open-source DotSpatial geographic information system (GIS) framework, the R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes, including a forward model (e.g., MODFLOW, HYDRUS) and geostatistical analysis (e.g., R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open-source nature of the project will engender the development of additional model drivers by third-party scientists.
Civil Protection Practitioners' Response to Introducing Nowcasting in Weather Warnings
NASA Astrophysics Data System (ADS)
Ulbrich, Thorsten
2014-05-01
The HErZ project WEXICOM (Improving the process of weather warnings and extreme weather information in the chain from the meteorological forecasts to their communication for the Berlin conurbation) assesses the communication and use of weather warnings. In cooperation with DWD, we conducted two online surveys with German relief forces before and after a nowcasting application was introduced into the weather warning platform FeWIS. The aim is to investigate how relief workers make use of the additional information. DWD supports German civil protection by providing the warning platform FeWIS (Fire brigade Weather Information System) for registered relief workers. The platform provides information on meteorological hazards needed to take precautions and to support rescue actions. In June 2013, DWD added nowcast estimates of storm attributes, including warning cones, based on a 1x1 km grid. The tool, named "GewitterMonitor", is based on NowcastMIX and uses short-term weather models and observations to derive warnings with high precision on the intensity, location, and timing of thunderstorm cells for the following two hours. A first survey, conducted prior to the addition of the nowcasting information, investigated how users benefit from FeWIS and how they perceive its functionality and reliability. Following the introduction, users gained experience applying the nowcasting tool in the 2013 thunderstorm season. In winter 2013/2014, we conducted another online survey. The post-survey comprises questions on the use of the GewitterMonitor and on how the tool supports relief forces in responding to meteorological risks. It also repeats questions on the perceived functionality and reliability of FeWIS and poses questions derived from the previous survey. This second survey collects practitioners' feedback on the GewitterMonitor and, by relating responses to the prior survey, allows changes in how users perceive the performance of FeWIS to be detected.
Tools4miRs – one place to gather all the tools for miRNA analysis
Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr
2016-01-01
Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626
Adaptive eLearning modules for cytopathology education: A review and approach.
Samulski, T Danielle; La, Teresa; Wu, Roseann I
2016-11-01
Clinical training imposes time and resource constraints on educators and learners, making it difficult to provide and absorb meaningful instruction. Additionally, innovative and personalized education has become an expectation of adult learners. Fortunately, the development of web-based educational tools provides a possible solution to these challenges. Within this review, we introduce the utility of adaptive eLearning platforms in pathology education. In addition to a review of the current literature, we provide the reader with a suggested approach for module creation, as well as a critical assessment of an available platform, based on our experience in creating adaptive eLearning modules for teaching basic concepts in gynecologic cytopathology. Diagn. Cytopathol. 2016;44:944-951. © 2016 Wiley Periodicals, Inc.
Integrated Systems Health Management (ISHM) Toolkit
NASA Technical Reports Server (NTRS)
Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim
2013-01-01
A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.
Telecom Modeling with ChatterBell.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jrad, Ahmad M.; Kelic, Andjelka
This document provides a description and user manual for the ChatterBell voice telecom modeling and simulation capability. The intended audience consists of network planners and practitioners who wish to use the tool to model a particular voice network and analyze its behavior under varying assumptions and possible failure conditions. ChatterBell is built on top of the N-SMART voice simulation and visualization suite that was developed through collaboration between Sandia National Laboratories and Bell Laboratories of Lucent Technologies. The new and improved modeling and simulation tool has been modified and modernized to incorporate the latest developments in the telecom world, including the widespread use of VoIP technology. In addition, ChatterBell provides new commands and modeling capabilities that were not available in the N-SMART application.
CRISPR Primer Designer: Design primers for knockout and chromosome imaging CRISPR-Cas system.
Yan, Meng; Zhou, Shi-Rong; Xue, Hong-Wei
2015-07-01
The clustered regularly interspaced short palindromic repeats (CRISPR)-associated system enables biologists to edit genomes precisely and provides a powerful tool for perturbing endogenous gene regulation, epigenetic markers, and genome architecture. However, there are concerns about the specificity of the system, especially when it is used to knock out a gene. Previous design tools were mostly either built into websites or run as command-line programs; none ran locally with a user-friendly interface. In addition, with the development of CRISPR-derived systems such as chromosome imaging, there were still no tools helping users to generate specific end-user spacers. We herein present CRISPR Primer Designer for researchers to design primers for CRISPR applications. The program has a user-friendly interface, can analyze BLAST results using multiple parameters, scores each candidate spacer, and generates the primers for a given plasmid. In addition, CRISPR Primer Designer runs locally, can be used to search for spacer clusters, and exports primers for the CRISPR-Cas system-based chromosome imaging system. © 2014 Institute of Botany, Chinese Academy of Sciences.
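As a toy illustration of the spacer-selection step (using the generic SpCas9 NGG-PAM rule; this is not CRISPR Primer Designer's implementation, which additionally scores specificity from BLAST results):

```python
# Toy scan for 20-nt spacers immediately upstream of an NGG PAM
# (generic SpCas9 rule; illustrative only, forward strand only).
import re

def find_spacers(seq, spacer_len=20):
    """Yield (start, spacer, pam) for every NGG PAM with room for a spacer."""
    seq = seq.upper()
    for m in re.finditer(r"(?=([ACGT]GG))", seq):   # overlapping PAM matches
        pam_start = m.start()
        if pam_start >= spacer_len:
            yield (pam_start - spacer_len,
                   seq[pam_start - spacer_len:pam_start],  # 20-nt protospacer
                   m.group(1))                             # NGG PAM

demo = "ATGCGTACGTTAGCATGCATCCGGATCGTACGATCGTAGCTAGGCTAGCTAGG"
for start, spacer, pam in find_spacers(demo):
    print(start, spacer, pam)
```

A real design tool would also scan the reverse complement and rank each candidate by off-target risk before emitting primers.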
Development of a Web-based question database for students' self-assessment.
Hammoud, Maya M; Barclay, Mel L
2002-09-01
Computer-based testing (CBT) for the purpose of the national licensure examination has increased interest among medical students in this modality of testing. The advent of Web-based question-delivery systems for self-assessment and learning has made it possible for students to practice this technology and participate in self-directed learning. Test Pilot(TM) is a Web-based program that provides a fast and easy tool for the development and deployment of online testing. Our objectives for introducing the program were to (1) develop a large database of questions for students' practice and self-assessment; (2) include multimedia tools such as illustrations and short videos to enhance learning; (3) provide a feedback tool for clerkship and site directors regarding student performance; and (4) evaluate this tool in terms of students' frequency of use, students' satisfaction, and its potential effectiveness in enhancing learning. The Obstetrics and Gynecology clerkship at the University of Michigan is held at four different sites. In the past, students were provided with access to floppy disks that contained about 500 self-assessment questions. These questions have been reformatted, updated, and transferred to Test Pilot. Visual illustrations have been added to the questions, along with more varied formats, including extended matching, fill-in, and essay questions. The questions are divided into ten-question quizzes. The students get immediate feedback after answering each question and a summary of performance at the end of each quiz. Security, access, and analysis are facilitated because the questions and responses are stored centrally. In addition, Test Pilot captures information regarding individual and collective student performance. At the end of the rotation, students fill out a form evaluating the Test Pilot program and comparing it with the quiz disks. In addition, we are collecting data regarding the actual use of Test Pilot, which will be compared with the students' surveys and final exam scores. Test Pilot has many benefits, including access control, immediate feedback, automated scoring, interactive learning, and data analysis. The enhancement of material permitted by a Web-based system increases the depth and variety of the learning experience by adding perceptual dimensions. Test Pilot also provides the clerkship director with the capability to obtain improved measurements of student performance and captures the student's self-learning and testing process. It can potentially identify weaknesses or inconsistencies across the different sites and recognize students who may need additional help early in the rotation. Over a one-year period, most students switched from the quiz disks to Test Pilot. The students reported satisfaction with the Web-based format and found it user-friendly. They especially liked the immediate feedback. The students have requested that more questions and multimedia options be added. We plan to continue the development and assessment of this learning tool.
Intelligent Processing Equipment Within the Environmental Protection Agency
NASA Technical Reports Server (NTRS)
Greathouse, Daniel G.; Nalesnik, Richard P.
1992-01-01
Protection of the environment and environmental remediation require the cooperation, at all levels, of government and industry. Intelligent processing equipment, in addition to other artificial intelligence-based tools, was used by the Environmental Protection Agency to provide personnel safety and improve the efficiency of those responsible for protection and remediation of the environment. These exploratory efforts demonstrate the feasibility and utility of expanding the development and widespread use of these tools. A survey of current intelligent processing equipment applications in the Agency is presented, followed by a brief discussion of possible uses in the future.
Quality indexing with computer-aided lexicography
NASA Technical Reports Server (NTRS)
Buchan, Ronald L.
1992-01-01
Indexing with computers is a far cry from indexing with the first indexing tool, the manual card sorter. With the aid of computer-aided lexicography, both indexing and indexing tools can provide standardization, consistency, and accuracy, resulting in greater quality control than ever before. A brief survey of computer activity in indexing is presented, with detailed illustrations from NASA activity. The techniques mentioned, such as Retrospective Indexing (RI), can be applied to many indexing systems. In addition to improvements in indexing quality, the paper demonstrates the improved efficiency with which certain tasks can be done.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reis, Chuck; Nelson, Eric; Armer, James
The purpose of this playbook and accompanying spreadsheets is to generalize the detailed CBP analysis and to put tools in the hands of experienced refrigeration designers to evaluate multiple applications of refrigeration waste heat reclaim across the United States. Supermarkets with large portfolios of similar buildings can use these tools to assess the impact of large-scale implementation of heat reclaim systems. In addition, the playbook provides best practices for implementing heat reclaim systems to achieve the best long-term performance possible. It includes guidance on operations and maintenance as well as measurement and verification.
A Practical Tutorial on Modified Condition/Decision Coverage
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Veerhusen, Dan S.; Chilenski, John J.; Rierson, Leanna K.
2001-01-01
This tutorial provides a practical approach to assessing modified condition/decision coverage (MC/DC) for aviation software products that must comply with regulatory guidance for DO-178B level A software. The tutorial's approach to MC/DC is a 5-step process that allows a certification authority or verification analyst to evaluate MC/DC claims without the aid of a coverage tool. In addition to the MC/DC approach, the tutorial addresses factors to consider in selecting and qualifying a structural coverage analysis tool, tips for reviewing life cycle data related to MC/DC, and pitfalls common to structural coverage analysis.
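To make the criterion concrete, here is a small worked example of MC/DC (our illustration, not taken from the tutorial). For the decision A and (B or C), MC/DC requires showing that each condition independently affects the outcome; a minimal test set needs only N+1 vectors for N conditions.

```python
# Worked MC/DC example for the decision: A and (B or C).
# Each condition is paired with a test that flips only that condition
# and changes the outcome, demonstrating its independent effect.

def decision(a, b, c):
    return a and (b or c)

tests = [
    (True,  True,  False),   # -> True
    (False, True,  False),   # -> False; with test 1 shows A's independent effect
    (True,  False, True),    # -> True;  with test 4 shows C's independent effect
    (True,  False, False),   # -> False; with test 1 shows B's independent effect
]

for a, b, c in tests:
    print(a, b, c, "->", decision(a, b, c))
```

Pairing the vectors as commented is exactly the independence demonstration a verification analyst performs when evaluating MC/DC claims by hand, without a coverage tool.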
Supporting Scientific Analysis within Collaborative Problem Solving Environments
NASA Technical Reports Server (NTRS)
Watson, Velvin R.; Kwak, Dochan (Technical Monitor)
2000-01-01
Collaborative problem solving environments for scientists should contain the analysis tools the scientists require in addition to the remote collaboration tools used for general communication. Unfortunately, most scientific analysis tools have been designed for a "stand-alone mode" and cannot be easily modified to work well in a collaborative environment. This paper addresses the questions, "What features are desired in a scientific analysis tool contained within a collaborative environment?", "What are the tool design criteria needed to provide these features?", and "What support is required from the architecture to support these design criteria?" First, the features of scientific analysis tools that are important for effective analysis in collaborative environments are listed. Next, several design criteria for developing analysis tools that will provide these features are presented. Then requirements for the architecture to support these design criteria are listed. Some proposed architectures for collaborative problem solving environments are reviewed and their capabilities to support the specified design criteria are discussed. A deficiency in the most popular architecture for remote application sharing, the ITU T.120 architecture, prevents it from supporting highly interactive, dynamic, high resolution graphics. To illustrate that the specified design criteria can provide a highly effective analysis tool within a collaborative problem solving environment, a scientific analysis tool that contains the specified design criteria has been integrated into a collaborative environment and tested for effectiveness. The tests were conducted in collaborations between remote sites in the US and between remote sites on different continents. The tests showed that the tool (a tool for the visual analysis of computer simulations of physics) was highly effective for both synchronous and asynchronous collaborative analyses. The important features provided by the tool (and made possible by the specified design criteria) are: 1. The tool provides highly interactive, dynamic, high resolution, 3D graphics. 2. All remote scientists can view the same dynamic, high resolution, 3D scenes of the analysis as the analysis is being conducted. 3. The responsiveness of the tool is nearly identical to the responsiveness of the tool in a stand-alone mode. 4. The scientists can transfer control of the analysis between themselves. 5. Any analysis session or segment of an analysis session, whether done individually or collaboratively, can be recorded and posted on the Web for other scientists or students to download and play in either a collaborative or individual mode. 6. The scientist or student who downloaded the session can, individually or collaboratively, modify or extend the session with his/her own "what if" analysis of the data and post his/her version of the analysis back onto the Web. 7. The peak network bandwidth used in the collaborative sessions is only 1K bit/second even though the scientists at all sites are viewing high resolution (1280 x 1024 pixels), dynamic, 3D scenes of the analysis. The links between the specified design criteria and these performance features are presented.
NASA Technical Reports Server (NTRS)
1991-01-01
IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.
NASA Astrophysics Data System (ADS)
Krause, Lee S.; Burns, Carla L.
2000-06-01
This paper discusses the research currently in progress to develop the Conceptual Federation Object Model Design Tool. The objective of the Conceptual FOM (C-FOM) Design Tool effort is to provide domain and subject matter experts, such as scenario developers, with automated support for understanding and utilizing available HLA simulation and other simulation assets during HLA Federation development. The C-FOM Design Tool will import Simulation Object Models from HLA reuse repositories, such as the MSSR, to populate the domain space that will contain all the objects and their supported interactions. In addition, the C-FOM tool will support the conversion of non-HLA legacy models into HLA-compliant models by applying proven abstraction techniques against the legacy models. Domain experts will be able to build scenarios based on the domain objects and interactions in both text and graphical form and export a minimal FOM. The ability for domain and subject matter experts to effectively access HLA and non-HLA assets is critical to the long-term acceptance of the HLA initiative.
Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool
NASA Astrophysics Data System (ADS)
Torlapati, Jagadish; Prabhakar Clement, T.
2013-01-01
We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
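As a rough illustration of the class of problem RT1D solves (a generic Python sketch, not the VBA code itself), an explicit finite-difference scheme for 1D advection-dispersion with first-order decay can be written as follows; grid sizes and parameter values are arbitrary choices for the example.

```python
import numpy as np

# Minimal sketch (not RT1D itself): explicit finite-difference solution of
#   dC/dt = -v dC/dx + D d2C/dx2 - k C
# with upwind advection, central dispersion, and first-order decay.
v, D, k = 0.5, 0.05, 0.01              # velocity, dispersion, decay rate
L, nx = 10.0, 101
dx = L / (nx - 1)
dt = 0.5 * min(dx / v, dx**2 / (2 * D))  # respect advective/diffusive limits

C = np.zeros(nx)
for _ in range(2000):
    C[0] = 1.0                          # constant-concentration inlet
    adv = -v * (C[1:-1] - C[:-2]) / dx                  # upwind advection
    disp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2   # dispersion
    C[1:-1] += dt * (adv + disp - k * C[1:-1])
    C[-1] = C[-2]                       # zero-gradient outlet
print(C[::10])                          # concentration profile snapshot
```

The benchmark problems in the paper layer more complex biochemical and geochemical reaction terms onto this same transport backbone.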
PolNet: A Tool to Quantify Network-Level Cell Polarity and Blood Flow in Vascular Remodeling.
Bernabeu, Miguel O; Jones, Martin L; Nash, Rupert W; Pezzarossa, Anna; Coveney, Peter V; Gerhardt, Holger; Franco, Claudio A
2018-05-08
In this article, we present PolNet, an open-source software tool for the study of blood flow and cell-level biological activity during vessel morphogenesis. We provide an image acquisition, segmentation, and analysis protocol to quantify endothelial cell polarity in entire in vivo vascular networks. In combination, we use computational fluid dynamics to characterize the hemodynamics of the vascular networks under study. The tool enables, to our knowledge for the first time, a network-level analysis of polarity and flow for individual endothelial cells. To date, PolNet has proven invaluable for the study of endothelial cell polarization and migration during vascular patterning, as demonstrated by two recent publications. Additionally, the tool can be easily extended to correlate blood flow with other experimental observations at the cellular/molecular level. We release the source code of our tool under the Lesser General Public License. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
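A minimal sketch of the polarity-versus-flow comparison at the heart of such an analysis (our illustration, not PolNet's implementation) might look like the following, assuming one polarity vector per cell (e.g., a nucleus-to-Golgi axis) and a local flow vector at each cell's position.

```python
import numpy as np

# Illustrative sketch: angle between each cell's polarity vector and the
# local flow direction. Angles near 180 degrees would indicate cells
# polarized against the flow. Data below are hypothetical.
def polarity_angles(polarity_vecs, flow_vecs):
    p = polarity_vecs / np.linalg.norm(polarity_vecs, axis=1, keepdims=True)
    f = flow_vecs / np.linalg.norm(flow_vecs, axis=1, keepdims=True)
    cosines = np.clip(np.sum(p * f, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cosines))

pol = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.1]])   # 3 cells, 2D
flow = np.array([[1.0, 0.0], [1.0, 0.0], [1.0, 0.0]])   # uniform flow
print(polarity_angles(pol, flow))    # ~[0, 90, 174] degrees
```

Aggregating such angles over every cell in a network, with flow vectors supplied by the fluid-dynamics step, is what turns a per-cell measurement into the network-level analysis the abstract describes.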
The BioCyc collection of microbial genomes and metabolic pathways.
Karp, Peter D; Billington, Richard; Caspi, Ron; Fulcher, Carol A; Latendresse, Mario; Kothari, Anamika; Keseler, Ingrid M; Krummenacker, Markus; Midford, Peter E; Ong, Quang; Ong, Wai Kit; Paley, Suzanne M; Subhraveti, Pallavi
2017-08-17
BioCyc.org is a microbial genome Web portal that combines thousands of genomes with additional information inferred by computer programs, imported from other databases and curated from the biomedical literature by biologist curators. BioCyc also provides an extensive range of query tools, visualization services and analysis software. Recent advances in BioCyc include an expansion in the content of BioCyc in terms of both the number of genomes and the types of information available for each genome; an expansion in the amount of curated content within BioCyc; and new developments in the BioCyc software tools including redesigned gene/protein pages and metabolite pages; new search tools; a new sequence-alignment tool; a new tool for visualizing groups of related metabolic pathways; and a facility called SmartTables, which enables biologists to perform analyses that previously would have required a programmer's assistance. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Asati, Atul; Kachurina, Olga; Kachurin, Anatoly
2012-01-01
Considering the importance of ganglioside antibodies as biomarkers in various immune-mediated neuropathies and neurological disorders, we developed a high throughput multiplexing tool for the assessment of ganglioside-specific antibodies based on the Bio-Plex/Luminex platform. In this report, we demonstrate that the ganglioside high throughput multiplexing tool is robust, highly specific, and demonstrates ∼100-fold higher concentration sensitivity for IgG detection than ELISA. In addition to the ganglioside-coated array, the high throughput multiplexing tool contains beads coated with influenza hemagglutinins derived from the H1N1 A/Brisbane/59/07 and H1N1 A/California/07/09 strains. Influenza beads provided the added advantage of simultaneous detection of ganglioside- and influenza-specific antibodies, a capacity important for assaying both infectious antigen-specific and autoimmune antibodies following vaccination or disease. Taken together, these results support the potential adoption of the ganglioside high throughput multiplexing tool for measuring ganglioside antibodies in various neuropathic and neurological disorders. PMID:22952605
Automated Flight Dynamics Product Generation for the EOS AM-1 Spacecraft
NASA Technical Reports Server (NTRS)
Matusow, Carla
1999-01-01
As part of NASA's Earth Science Enterprise, the Earth Observing System (EOS) AM-1 spacecraft is designed to monitor long-term, global, environmental changes. Because of the complexity of the AM-1 spacecraft, the mission operations center requires more than 80 distinct flight dynamics products (reports). To create these products, the AM-1 Flight Dynamics Team (FDT) will use a combination of modified commercial software packages (e.g., Analytical Graphics' Satellite ToolKit) and NASA-developed software applications. While providing the most cost-effective solution to meeting the mission requirements, the integration of these software applications raises several operational concerns: (1) Routine product generation requires knowledge of multiple applications executing on a variety of hardware platforms. (2) Generating products is a highly interactive process requiring a user to interact with each application multiple times to generate each product. (3) Routine product generation requires several hours to complete. (4) User interaction with each application introduces the potential for errors, since users are required to manually enter filenames and input parameters as well as run applications in the correct sequence. Generating products requires some level of flight dynamics expertise to determine the appropriate inputs and sequencing. To address these issues, the FDT developed an automation software tool called AutoProducts, which runs on a single hardware platform and provides all necessary coordination and communication among the various flight dynamics software applications. AutoProducts autonomously retrieves necessary files, sequences and executes applications with the correct input parameters, and delivers the final flight dynamics products to the appropriate customers. Although AutoProducts will normally generate pre-programmed sets of routine products, its graphical interface allows for easy configuration of customized and one-of-a-kind products. Additionally, AutoProducts has been designed as a mission-independent tool, and can be easily reconfigured to support other missions or incorporate new flight dynamics software packages. After the AM-1 launch, AutoProducts will run automatically at pre-determined time intervals. The AutoProducts tool reduces many of the concerns associated with flight dynamics product generation. Although AutoProducts required a significant effort to develop because of the complexity of the interfaces involved, its use will provide significant cost savings through reduced operator time and maximum product reliability. In addition, user satisfaction is significantly improved and flight dynamics experts have more time to perform valuable analysis work. This paper will describe the evolution of the AutoProducts tool, highlighting the cost savings and customer satisfaction resulting from its development. It will also provide details about the tool, including its graphical interface and operational capabilities.
A Whale of a Tale: Creating Spacecraft Telemetry Data Analysis Products for the Deep Impact Mission
NASA Technical Reports Server (NTRS)
Sturdevant, Kathryn F.; Wright, Jesse J.; Lighty, Roger A.; Nakamura, Lori L.
2006-01-01
This paper describes some of the challenges and lessons learned from the Deep Impact (DI) Mission Ground Data System's (GDS) telemetry data processing and product generation tool, nicknamed 'Whale.' One of the challenges of any mission is to analyze testbed and operational telemetry data. To date, methods of retrieving these data have required spacecraft subsystem members to become experts in the use of a myriad of query and plot tools. As budgets shrink and GDS teams grow smaller, more of the burden to understand these tools falls on the users. The user base also varies from novice to expert, and requiring users to become GDS tool experts in addition to spacecraft domain experts is an undue burden. The "Whale" approach is to process all of the data for a given spacecraft test and provide each subsystem with plots and data products 'automagically.'
Molecular genealogy tools for white-tailed deer with chronic wasting disease
Ernest, Holly B.; Hoar, Bruce R.; Well, Jay A.; O’Rourke, Katherine I.
2010-01-01
Molecular genetic data provide powerful tools for genealogy reconstruction to reveal mechanisms underlying disease ecology. White-tailed deer (Odocoileus virginianus) congregate in matriarchal groups; kin-related close social spacing may be a factor in the spread of infectious diseases. Spread of chronic wasting disease (CWD), a prion disorder of deer and their cervid relatives, is presumed to be associated with direct contact between individuals and by exposure to shared food and water sources contaminated with prions shed by infected deer. Key aspects of disease ecology are yet unknown. DNA tools for pedigree reconstruction were developed to fill knowledge gaps in disease dynamics in prion-infected wild animals. Kinship indices using data from microsatellite loci and sequence haplotypes of mitochondrial DNA were employed to assemble genealogies. Molecular genealogy tools will be useful for landscape-level population genetic research and monitoring, in addition to epidemiologic studies examining transmission of CWD in captive and free-ranging cervids. PMID:20592847
Roback, M G; Green, S M; Andolfatto, G; Leroy, P L; Mason, K P
2018-01-01
Many hospitals and medical and dental clinics and offices routinely monitor their procedural-sedation practices, tracking adverse events, outcomes, and efficacy in order to optimize sedation delivery and practice. Currently, substantial differences exist between settings in the content, collection, definition, and interpretation of such sedation outcomes, with resulting widespread variation in reporting. With the objective of reducing such disparities, the International Committee for the Advancement of Procedural Sedation has herein developed a multidisciplinary, consensus-based, standardized tool intended to be applicable to all types of sedation providers in all locations worldwide. This tool is amenable for inclusion in either a paper or an electronic medical record. An additional, parallel research tool is presented to promote consistency and standardized data collection for procedural-sedation investigations. Copyright © 2017. Published by Elsevier Ltd.
Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping
2018-05-22
Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meaning in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods, such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly used data visualization methods, including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. Availability: PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. Contact: 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.
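As an illustration of the statistics behind a volcano plot of the kind PANDA-view draws (a generic sketch, not PANDA-view's code), one can compute per-protein fold changes and t-test p-values like this; the simulated data and cutoffs are arbitrary choices for the example.

```python
import numpy as np
from scipy import stats

# Illustrative sketch: per-protein t-tests between two conditions give the
# log2 fold changes and -log10 p-values plotted in a volcano plot.
rng = np.random.default_rng(0)
control = rng.normal(10, 1, size=(500, 4))   # 500 proteins x 4 replicates
treated = rng.normal(10, 1, size=(500, 4))
treated[:25] *= 2.5                          # spike in differential proteins

log2fc = np.log2(treated.mean(axis=1) / control.mean(axis=1))
pvals = stats.ttest_ind(treated, control, axis=1).pvalue
neglog_p = -np.log10(pvals)                  # the volcano plot's y-axis

significant = (np.abs(log2fc) > 1) & (pvals < 0.05)
print(f"{significant.sum()} proteins pass |log2FC| > 1 and p < 0.05")
```

The interactive part of such a plot is then just a matter of rendering (log2fc, neglog_p) points and letting the user hover or click to inspect individual proteins.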
NASA Astrophysics Data System (ADS)
Grubert, Emily; Siders, Anne
2016-09-01
Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
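For readers unfamiliar with topic modeling, a minimal sketch of the technique applied to abstracts might look like the following; the toy corpus, library choice (scikit-learn), and parameters are illustrative assumptions, not the authors' actual toolchain.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Illustrative sketch: fit LDA to a corpus of abstracts and print the top
# words per topic, the raw material for tracking thematic change over time.
abstracts = [
    "coastal wetland restoration and sea level rise adaptation",
    "carbon emissions trading policy and electricity markets",
    "wetland carbon sequestration under restoration scenarios",
    "electricity market reform and renewable energy policy",
]  # hypothetical stand-ins for ~8000 real abstracts

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(abstracts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```

Binning documents by year and tracking each topic's average weight over time is one common way to turn such a model into the "thematic changes over time" analysis the case study describes.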
Muellner, Ulrich J; Vial, Flavie; Wohlfender, Franziska; Hadorn, Daniela; Reist, Martin; Muellner, Petra
2015-01-01
The reporting of outputs from health surveillance systems should be done in a near real-time and interactive manner in order to provide decision makers with powerful means to identify, assess, and manage health hazards as early and efficiently as possible. While this is currently rarely the case in veterinary public health surveillance, reporting tools do exist for the visual exploration and interactive interrogation of health data. In this work, we used tools freely available from the Google Maps and Charts library to develop a web application reporting health-related data derived from slaughterhouse surveillance and from a newly established web-based equine surveillance system in Switzerland. Both sets of tools allowed entry-level usage with little or no programming skill while being flexible enough to cater for more complex scenarios for users with greater programming skills. In particular, interfaces linking statistical software and Google tools provide additional analytical functionality (such as algorithms for the detection of unusually high case occurrences) for inclusion in the reporting process. We show that such powerful approaches could improve the timely dissemination and communication of technical information to decision makers and other stakeholders and could foster the early-warning capacity of animal health surveillance systems.
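A minimal sketch of one such detection rule (illustrative only; the paper does not specify its algorithms) flags a daily count that exceeds the recent baseline by several standard deviations, in the spirit of classic syndromic-surveillance control charts.

```python
import numpy as np

# Illustrative sketch: flag a day's case count if it exceeds the mean of
# the previous seven days by more than three standard deviations.
def flag_unusual(counts, window=7, threshold=3.0):
    counts = np.asarray(counts, dtype=float)
    flags = []
    for t in range(window, len(counts)):
        base = counts[t - window:t]
        limit = base.mean() + threshold * base.std(ddof=1)
        flags.append(counts[t] > limit)
    return flags

daily_cases = [2, 3, 1, 2, 4, 2, 3, 2, 3, 14, 3]   # hypothetical counts
print(flag_unusual(daily_cases))    # only the spike of 14 is flagged
```

In a reporting pipeline, the flagged days would be pushed to the web application's map and chart views so decision makers see the anomaly alongside the raw counts.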
NASA Astrophysics Data System (ADS)
Williams, C. A.; Dicaprio, C.; Simons, M.
2003-12-01
With the advent of projects such as the Plate Boundary Observatory and future InSAR missions, spatially dense geodetic data of high quality will provide an increasingly detailed picture of the movement of the earth's surface. To interpret such information, powerful and easily accessible modeling tools are required. We are presently developing such a tool that we feel will meet many of the needs for evaluating quasi-static earth deformation. As a starting point, we begin with a modified version of the finite element code TECTON, which has been specifically designed to solve tectonic problems involving faulting and viscoelastic/plastic earth behavior. As our first priority, we are integrating the code into the GeoFramework, which is an extension of the Python-based Pyre modeling framework. The goal of this framework is to provide simplified user interfaces for powerful modeling codes, to provide easy access to utilities such as meshers and visualization tools, and to provide a tight integration between different modeling tools so they can interact with each other. The initial integration of the code into this framework is essentially complete, and a more thorough integration, where Python-based drivers control the entire solution, will be completed in the near future. We have an evolving set of priorities that we expect to solidify as we receive more input from the modeling community. Current priorities include the development of linear and quadratic tetrahedral elements, the development of a parallelized version of the code using the PETSc libraries, the addition of more complex rheologies, realistic fault friction models, adaptive time stepping, and spherical geometries. In this presentation we describe current progress toward our various priorities, briefly describe the structure of the code within the GeoFramework, and demonstrate some sample applications.
When calibration is not enough
NASA Astrophysics Data System (ADS)
Kingsley, Jeffrey R.; Johnson, Leslie
1999-12-01
When added CD (Critical Dimension) capacity is needed, there are several routes that can be taken: add shifts and people to existing equipment, obtain additional equipment and staff, or use an outside service provider for peak and emergency work. In all but the first scenario, the qualification of the 'new' equipment, and its correlation to the existing measurements, is key to meaningful results. In many cases simply calibrating the new tool with the same reference material or standard used to calibrate the existing tools will provide the level of agreement required. In fact, calibrating instruments using different standards can provide an acceptable level of agreement in cases where accuracy is a second-tier consideration. However, there are also situations where factors outside of calibration can influence the results. In this study, CD measurements from a mask sample being used to qualify an outside service provider showed good agreement for the narrower linewidths, but significant deviation occurred with increasing CD. In the course of a root cause investigation, it was found that a variety of factors may influence the agreement found between two tools. What are these 'other factors' and how are they found? In the present case, the results of a 'round robin' consensus from a variety of tools were used to initially determine which tool needed to be investigated. The instrument parameters felt to be the most important causes of the disagreement were identified and experiments run to test their influence. The factors investigated as the cause of the disagreement included (1) type of detector and location with respect to sample, (2) beam voltage, (3) scan rotation/sample orientation issues, and (4) edge detection algorithm.
Trade Space Specification Tool (TSST) for Rapid Mission Architecture (Version 1.2)
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Schrock, Mitchell; Borden, Chester S.; Moeller, Robert C.
2013-01-01
Trade Space Specification Tool (TSST) is designed to quickly capture ideas during early spacecraft and mission architecture design and categorize them into trade space dimensions and options for later analysis. It is implemented as an Eclipse RCP application, which can be run as a standalone program. Users rapidly create concept items with single clicks on a graphical canvas, and can organize and create linkages between the ideas using drag-and-drop actions within the same graphical view. Various views, such as a trade view, rules view, and architecture view, are provided to help users visualize the trade space. This software can identify, explore, and assess aspects of the mission trade space, as well as capture and organize linkages/dependencies between trade space components. The tool supports a user-in-the-loop preliminary logical examination and filtering of trade space options to help identify which paths in the trade space are feasible (and preferred) and what analyses need to be done later with executable models. This tool provides multiple user views of the trade space to guide the analyst/team, facilitating interpretation and communication of the trade space components and linkages, identifying gaps in combining and selecting trade space options, and guiding user decision-making about which combinations of architectural options should be pursued for further evaluation. This software provides an environment to capture mission trade space elements rapidly and to assist users in their architecture analysis. It is focused primarily on mission and spacecraft architecture design rather than general-purpose design applications. In addition, it provides considerable flexibility for creating concepts and organizing ideas. The software is developed as an Eclipse plug-in and can potentially be integrated with other Eclipse-based tools.
DynGO: a tool for visualizing and mining of Gene Ontology and its associations
Liu, Hongfang; Hu, Zhang-Zhi; Wu, Cathy H
2005-01-01
Background A large volume of data and information about genes and gene products has been stored in various molecular biology databases. A major challenge for knowledge discovery using these databases is to identify related genes and gene products in disparate databases. The development of Gene Ontology (GO) as a common vocabulary for annotation allows integrated queries across multiple databases and identification of semantically related genes and gene products (i.e., genes and gene products that have similar GO annotations). Meanwhile, dozens of tools have been developed for browsing, mining or editing GO terms, their hierarchical relationships, or their "associated" genes and gene products (i.e., genes and gene products annotated with GO terms). Tools that allow users to directly search and inspect relations among all GO terms and their associated genes and gene products from multiple databases are needed. Results We present a standalone package called DynGO, which provides several advanced functionalities in addition to the standard browsing capability of the official GO browsing tool (AmiGO). DynGO allows users to conduct batch retrieval of GO annotations for a list of genes and gene products, and semantic retrieval of genes and gene products sharing similar GO annotations. The results are shown in an association tree organized according to GO hierarchies and supported with many dynamic display options, such as sorting tree nodes or changing the orientation of the tree. For GO curators and frequent GO users, DynGO provides fast and convenient access to GO annotation data. DynGO is generally applicable to any data set where the records are annotated with GO terms, as illustrated by two examples. Conclusion We have presented a standalone package, DynGO, that provides functionalities to search and browse GO and its association databases, as well as several additional functions such as batch retrieval and semantic retrieval. The complete documentation and software are freely available for download from the website. PMID:16091147
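A minimal sketch of semantic retrieval in this spirit (our illustration; DynGO's actual similarity measure likely exploits the GO hierarchy rather than flat set overlap) ranks genes by the Jaccard similarity of their GO annotation sets.

```python
# Illustrative sketch: rank genes by Jaccard similarity of GO annotation
# sets. Gene names and GO assignments below are hypothetical.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

annotations = {
    "geneA": {"GO:0006915", "GO:0042981", "GO:0005739"},
    "geneB": {"GO:0006915", "GO:0042981", "GO:0005634"},
    "geneC": {"GO:0007165", "GO:0005886"},
}

query = annotations["geneA"]
ranked = sorted(
    ((g, jaccard(query, terms))
     for g, terms in annotations.items() if g != "geneA"),
    key=lambda kv: kv[1], reverse=True,
)
print(ranked)   # geneB scores 0.5; geneC scores 0.0
```

Hierarchy-aware measures additionally credit annotations that share ancestor terms, which matters when two genes are annotated with related but non-identical GO terms.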
Human Disease Insight: An integrated knowledge-based platform for disease-gene-drug information.
Tasleem, Munazzah; Ishrat, Romana; Islam, Asimul; Ahmad, Faizan; Hassan, Md Imtaiyaz
2016-01-01
The scope of the Human Disease Insight (HDI) database is not limited to researchers or physicians as it also provides basic information to non-professionals and creates disease awareness, thereby reducing the chances of patient suffering due to ignorance. HDI is a knowledge-based resource providing information on human diseases to both scientists and the general public. Here, our mission is to provide a comprehensive human disease database containing most of the available useful information, with extensive cross-referencing. HDI is a knowledge management system that acts as a central hub to access information about human diseases and associated drugs and genes. In addition, HDI contains well-classified bioinformatics tools with helpful descriptions. These integrated bioinformatics tools enable researchers to annotate disease-specific genes and perform protein analysis, search for biomarkers and identify potential vaccine candidates. Eventually, these tools will facilitate the analysis of disease-associated data. The HDI provides two types of search capabilities and includes provisions for downloading, uploading and searching disease/gene/drug-related information. The logistical design of the HDI allows for regular updating. The database is designed to work best with Mozilla Firefox and Google Chrome and is freely accessible at http://humandiseaseinsight.com. Copyright © 2015 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.
ERPLAB: an open-source toolbox for the analysis of event-related potentials
Lopez-Calderon, Javier; Luck, Steven J.
2014-01-01
ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations. PMID:24782741
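A minimal sketch of the core averaging operations described above (generic Python, not ERPLAB's MATLAB code): averaging time-locked epochs suppresses trial-to-trial noise, and subtracting two condition averages yields a difference wave. The simulated waveform and noise levels are arbitrary.

```python
import numpy as np

# Illustrative sketch: average time-locked EEG epochs into ERPs and form
# a difference wave between two conditions.
rng = np.random.default_rng(1)
n_trials, n_samples = 200, 300

def simulate_epochs(peak):
    # Gaussian-shaped component at sample 150 plus trial noise
    signal = peak * np.exp(-0.5 * ((np.arange(n_samples) - 150) / 20) ** 2)
    return signal + rng.normal(0, 5, size=(n_trials, n_samples))

target = simulate_epochs(peak=8.0)     # condition with a larger peak
standard = simulate_epochs(peak=3.0)

erp_target = target.mean(axis=0)       # averaging cancels the noise
erp_standard = standard.mean(axis=0)
difference_wave = erp_target - erp_standard
print(round(difference_wave.max(), 2)) # ~5.0, the simulated effect size
```

With 200 trials, the residual noise in each average drops by a factor of about sqrt(200), which is why the 5-unit condition effect stands out clearly in the difference wave.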
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Tengfang; Flapper, Joris; Ke, Jing
The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with the options differentiated by the level of detail at which a process or plant is characterized: 1) plant level; 2) process-group level; and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases established through reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that use of the BEST-Dairy tool will advance understanding of energy and water usage in individual dairy plants, augment benchmarking activities in the marketplace, and facilitate implementation of efficiency measures and strategies to reduce energy and water usage in the dairy industry. Industrial adoption of this emerging tool in the market is expected to benefit dairy plants, which are important customers of California utilities. Further demonstration of this benchmarking tool is recommended to facilitate its commercialization and the expansion of its functions. Wider use of the BEST-Dairy tool and its continued expansion in functionality will help to reduce actual consumption of energy and water in the dairy industry sector. The outcomes align well with the goals set by AB 1250 for the PIER program.
Bezgin, Gleb; Reid, Andrew T; Schubert, Dirk; Kötter, Rolf
2009-01-01
Brain atlases are widely used in experimental neuroscience as tools for locating and targeting specific brain structures. Delineated structures in a given atlas, however, are often difficult to interpret and to interface with database systems that supply additional information using hierarchically organized vocabularies (ontologies). Here we discuss the concept of volume-to-ontology mapping in the context of macroscopical brain structures. We present Java tools with which we have implemented this concept for retrieval of mapping and connectivity data on the macaque brain from the CoCoMac database in connection with an electronic version of "The Rhesus Monkey Brain in Stereotaxic Coordinates" authored by George Paxinos and colleagues. The software, including our manually drawn monkey brain template, can be downloaded freely under the GNU General Public License. It adds value to the printed atlas and has a wider (neuro-)informatics application since it can read appropriately annotated data from delineated sections of other species and organs, and turn them into 3D registered stacks. The tools provide additional features, including visualization and analysis of connectivity data, volume and centre-of-mass estimates, and graphical manipulation of entire structures, which are potentially useful for a range of research and teaching applications.
SU-E-E-02: An Excel-Based Study Tool for ABR-Style Exams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cline, K; Stanley, D; Defoor, D
2015-06-15
Purpose: As the landscape of learning and testing shifts toward a computer-based environment, a replacement for paper-based methods of studying is desirable. Using Microsoft Excel, a study tool was developed that allows the user to populate multiple-choice questions and then generate an interactive quiz session to answer them. Methods: The code for the tool was written using Microsoft Excel Visual Basic for Applications with the intent that the tool could be implemented by any institution with Excel. The base tool is a template with a setup macro, which builds out the structure based on the user's input. Once the framework is built, the user can input sets of multiple-choice questions and answer choices, and even add figures. The tool can be run in random-question or sequential-question mode for single or multiple courses of study. The interactive session allows the user to select answer choices, and immediate feedback is provided. Once the user is finished studying, the tool records the day's progress by reporting progress statistics useful for trending. Results: Six doctoral students at UTHSCSA have used this tool for the past two months to study for their qualifying exam, which is similar in format and content to the American Board of Radiology (ABR) Therapeutic Part II exam. The students collaborated to create a repository of questions, met weekly to go over these questions, and then used the tool to prepare for their exam. Conclusion: The study tool has provided an effective and efficient way for students to collaborate and be held accountable for exam preparation. The ease of use and familiarity of Excel are important factors in the tool's adoption. There are software packages for creating similar question banks, but this study tool has no additional cost for those who already have Excel. The study tool will be made openly available.
The catalogCleaner: Separating the Sheep from the Goats
NASA Astrophysics Data System (ADS)
O'Brien, K.; Hankin, S. C.; Schweitzer, R.; Koyuk, H.
2012-12-01
The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well known and widely used standards and focusing initially on well understood data types, such as gridded data from climate models. This phased approach serves to engage data providers and users and also has a high probability of demonstrable successes. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous, or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. For example, serving five hundred individual files from a single climate model might be compliant, but enhancing the service so that those files are all aggregated together into one virtual dataset available through a single access URL provides a much more useful service. The UAF project began showcasing the advantages of providing compliant data by manually building a master catalog generated from hand-picked THREDDS servers. With an understanding that educating data managers to provide standards-compliant data and metadata can take years, the UAF project wanted to continue increasing the volume of data served through the master catalog as much as possible. However, it quickly became obvious, given the sheer volume of data servers available, that the manual process of building a master catalog was not scalable. Thus, the idea for the catalogCleaner tool was born. The goal of this tool is to automatically crawl a remote OPeNDAP or THREDDS server and, from the information in the server, build a "clean" catalog of data that will: a) be served through uniform access services; b) have CF-compliant metadata; and c) link the data directly to common visualization tools, thereby allowing users to immediately begin exploring actual data. In addition, the UAF-generated clean catalog can then be used to drive data discovery tools such as Geoportal, GI-CAT, etc. This presentation will further explore the motivation for creating this tool and its implementation, as well as the myriad challenges and difficulties that were encountered along the way.
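A heavily hedged sketch of the crawling step (our illustration, not the catalogCleaner code) follows a THREDDS catalog's catalogRef links and collects dataset entries. The namespace URI below is from the THREDDS InvCatalog 1.0 schema; the URL is hypothetical, and error handling and the actual cleaning rules are omitted.

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

# Illustrative sketch: recursively walk a THREDDS catalog, collecting the
# urlPath of leaf datasets and following catalogRef links to sub-catalogs.
NS = {"t": "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"}
XLINK = "{http://www.w3.org/1999/xlink}href"

def crawl(catalog_url, datasets, depth=0, max_depth=3):
    if depth > max_depth:
        return
    root = ET.parse(urllib.request.urlopen(catalog_url)).getroot()
    for ds in root.iter(f"{{{NS['t']}}}dataset"):
        if ds.get("urlPath"):              # leaf dataset with an access path
            datasets.append(ds.get("urlPath"))
    for ref in root.iter(f"{{{NS['t']}}}catalogRef"):
        child = urllib.parse.urljoin(catalog_url, ref.get(XLINK))
        crawl(child, datasets, depth + 1, max_depth)

found = []
crawl("https://example.org/thredds/catalog.xml", found)  # hypothetical URL
print(len(found), "datasets found")
```

The "cleaning" that gives the tool its name happens after a crawl like this: aggregating related files, checking CF compliance, and emitting a single tidy catalog for downstream discovery tools.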
NASA Technical Reports Server (NTRS)
Merry, Josh; Takeshita, Jennifer; Tweedy, Bryan; Burford, Dwight
2006-01-01
In this presentation, the results of a recent study on the effect of pin tool design for friction stir welding thin sheets (0.040") of aluminum alloys 2024 and 7075 are provided. The objective of this study was to investigate and document the effect of tool shoulder and pin diameter, as well as the presence of pin flutes, on the resultant microstructure and mechanical properties at both room temperature and cryogenic temperature. Specifically, the comparison between the three tools includes: FSW process load analysis (tool forces required to fabricate the welds); static mechanical properties (ultimate tensile strength, yield strength, and elongation); and a process window documenting the range of parameters that can be used with the three pin tools investigated. All samples were naturally aged for a period greater than 10 days. Prior research has shown that 7075 may require post-weld heat treatment. Therefore, an additional pair of room temperature and cryogenic temperature samples was post-weld aged to the 7075-T7 condition prior to mechanical testing.
Suárez Álvarez, Óscar; Fernández-Feito, Ana; Vallina Crespo, Henar; Aldasoro Unamuno, Elena; Cofiño, Rafael
2018-05-11
It is essential to take a comprehensive approach to institutionally promoted interventions in order to assess their impact on health from the perspective of the social determinants of health and equity. Simple, adapted tools must be developed to carry out these assessments. The aim of this paper is to present two tools for assessing the impact of programmes and community-based interventions on the social determinants of health. The first tool is intended to assess health programmes through interviews and analysis of information provided by the assessment team. The second tool enables online assessment of community-based interventions and also generates a report on inequality issues that includes recommendations for improvement. In addition to reducing health-related social inequities, the implementation of these tools can also help to improve the efficiency of public health interventions. Copyright © 2018 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.
The Gamma-Ray Burst ToolSHED is Open for Business
NASA Astrophysics Data System (ADS)
Giblin, Timothy W.; Hakkila, Jon; Haglin, David J.; Roiger, Richard J.
2004-09-01
The GRB ToolSHED, a Gamma-Ray Burst SHell for Expeditions in Data-Mining, is now online and available via a web browser to all in the scientific community. The ToolSHED is an online web utility that contains pre-processed burst attributes of the BATSE catalog and a suite of induction-based machine learning and statistical tools for classification and cluster analysis. Users create their own login account and study burst properties within user-defined multi-dimensional parameter spaces. Although new GRB attributes are periodically added to the database for user selection, the ToolSHED has a feature that allows users to upload their own burst attributes (e.g. spectral parameters, etc.) so that additional parameter spaces can be explored. A data visualization feature using GNUplot and web-based IDL has also been implemented to provide interactive plotting of user-selected session output. In an era in which GRB observations and attributes are becoming increasingly more complex, a utility such as the GRB ToolSHED may play an important role in deciphering GRB classes and understanding intrinsic burst properties.
SGRAPH (SeismoGRAPHer): Seismic waveform analysis and integrated tools in seismology
NASA Astrophysics Data System (ADS)
Abdelwahed, Mohamed F.
2012-03-01
Although numerous seismological programs are currently available, most of them suffer from the inability to manipulate different data formats and the lack of embedded seismological tools. SeismoGRAPHer, or simply SGRAPH, is a new system for maintaining and analyzing seismic waveform data in a stand-alone, Windows-based application that manipulates a wide range of data formats. SGRAPH was intended to be a tool sufficient for performing basic waveform analysis and solving advanced seismological problems. The graphical user interface (GUI) utilities and the Windows functionalities, such as dialog boxes, menus, and toolbars, simplify the user interaction with the data. SGRAPH supports common data formats, such as SAC, SEED, GSE, ASCII, and Nanometrics Y-format, and provides the ability to solve many seismological problems with built-in inversion tools. Loaded traces are maintained, processed, plotted, and saved as SAC, ASCII, or PS (post script) file formats. SGRAPH includes Generalized Ray Theory (GRT), genetic algorithm (GA), least-square fitting, auto-picking, fast Fourier transforms (FFT), and many additional tools. This program provides rapid estimation of earthquake source parameters, location, attenuation, and focal mechanisms. Advanced waveform modeling techniques are provided for crustal structure and focal mechanism estimation. SGRAPH has been employed in the Egyptian National Seismic Network (ENSN) as a tool assisting with routine work and data analysis. More than 30 users have been using previous versions of SGRAPH in their research for more than 3 years. The main features of this application are ease of use, speed, small disk space requirements, and the absence of third-party developed components. Because of its architectural structure, SGRAPH can be interfaced with newly developed methods or applications in seismology. A complete setup file, including the SGRAPH package with the online user guide, is available.
Overview of Nuclear Physics Data: Databases, Web Applications and Teaching Tools
NASA Astrophysics Data System (ADS)
McCutchan, Elizabeth
2017-01-01
The mission of the United States Nuclear Data Program (USNDP) is to provide current, accurate, and authoritative data for use in pure and applied areas of nuclear science and engineering. This is accomplished by compiling, evaluating, and disseminating extensive datasets. Our main products include the Evaluated Nuclear Structure File (ENSDF) containing information on nuclear structure and decay properties and the Evaluated Nuclear Data File (ENDF) containing information on neutron-induced reactions. The National Nuclear Data Center (NNDC), through the website www.nndc.bnl.gov, provides web-based retrieval systems for these and many other databases. In addition, the NNDC hosts several on-line physics tools, useful for calculating various quantities relating to basic nuclear physics. In this talk, I will first introduce the quantities which are evaluated and recommended in our databases. I will then outline the searching capabilities which allow one to quickly and efficiently retrieve data. Finally, I will demonstrate how the database searches and web applications can provide effective teaching tools concerning the structure of nuclei and how they interact. Work supported by the Office of Nuclear Physics, Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-98CH10886.
Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan
2015-10-29
This report provides a broad historical review of EM Pump development and details of MATRIX development under this project. It summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. The report covers Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model within Task 4, the analysis code was updated and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the improved model can evaluate are: space constraints; voltage capability of the insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency; and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: the status of analysis model development; improvements made to older simulations; and comparison to experimental data.
Toxicokinetic and Dosimetry Modeling Tools for Exposure ...
New technologies and in vitro testing approaches have been valuable additions to risk assessments that have historically relied solely on in vivo test results. Compared to in vivo methods, in vitro high throughput screening (HTS) assays are less expensive, faster, and can provide mechanistic insights into chemical action. However, extrapolating from in vitro chemical concentrations to target tissue or blood concentrations in vivo is fraught with uncertainties, and modeling is dependent upon pharmacokinetic variables not measured in in vitro assays. To address this need, new tools have been created for characterizing, simulating, and evaluating chemical toxicokinetics. Physiologically-based pharmacokinetic (PBPK) models provide estimates of chemical exposures that produce potentially hazardous tissue concentrations, while tissue microdosimetry PK models relate whole-body chemical exposures to cell-scale concentrations. These tools rely on high-throughput in vitro measurements, and successful methods exist for pharmaceutical compounds that determine PK from limited in vitro measurements and chemical structure-derived property predictions. These high throughput (HT) methods provide a more rapid and less resource-intensive alternative to traditional PK model development. We have augmented these in vitro data with chemical structure-based descriptors and mechanistic tissue partitioning models to construct HTPBPK models for over three hundred environmental and pharmaceutical compounds.
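As a worked illustration of the reverse-dosimetry arithmetic such HT methods rely on (all parameter values below are hypothetical, and the formula is the commonly used well-stirred liver approximation, not necessarily the authors' exact model):

```python
# Hedged sketch of a high-throughput reverse-dosimetry calculation:
# estimate steady-state plasma concentration for a 1 mg/kg/day dose from
# renal and hepatic clearance, then invert to find the oral dose that
# would produce a target in vitro active concentration.
GFR = 0.29      # renal filtration, L/h/kg (illustrative value)
Qh = 5.2        # hepatic blood flow, L/h/kg (illustrative value)
fub = 0.1       # fraction unbound in plasma (from an in vitro assay)
clint = 2.0     # intrinsic hepatic clearance, L/h/kg (in vitro-derived)

cl_hepatic = (Qh * fub * clint) / (Qh + fub * clint)   # well-stirred model
cl_total = GFR * fub + cl_hepatic                      # L/h/kg

dose_rate = 1.0 / 24.0                 # 1 mg/kg/day expressed as mg/kg/h
css = dose_rate / cl_total             # steady-state concentration, mg/L
print(f"Css at 1 mg/kg/day = {css:.3f} mg/L")

ac50 = 10.0                            # in vitro active concentration, mg/L
oral_equiv = ac50 / css                # mg/kg/day that would reach AC50
print(f"oral equivalent of {ac50} mg/L: {oral_equiv:.1f} mg/kg/day")
```

Comparing such oral-equivalent doses against exposure estimates is what lets HTS hits be prioritized by margin of exposure rather than by in vitro potency alone.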
Accessing the public MIMIC-II intensive care relational database for clinical research.
Scott, Daniel J; Lee, Joon; Silva, Ikaro; Park, Shinhyuk; Moody, George B; Celi, Leo A; Mark, Roger G
2013-01-10
The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge "Predicting mortality of ICU Patients". QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database.
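As a hedged illustration of the "simple example SQL queries" mentioned above, the snippet below pulls a few rows from a local MIMIC-II PostgreSQL instance such as the one shipped in the VM image. The connection settings and the schema/table names are assumptions and should be checked against a given MIMIC-II installation.

    import psycopg2

    # Connect to a locally installed MIMIC-II database (settings assumed).
    conn = psycopg2.connect(dbname="mimic2", user="mimic2", host="localhost")
    with conn.cursor() as cur:
        # Fetch a few patient records -- a typical first exploratory query.
        cur.execute("""
            SELECT subject_id, sex, dob
            FROM mimic2v26.d_patients
            LIMIT 5;
        """)
        for row in cur.fetchall():
            print(row)
    conn.close()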
Influenza Research Database: An integrated bioinformatics resource for influenza virus research.
Zhang, Yun; Aevermann, Brian D; Anderson, Tavis K; Burke, David F; Dauphin, Gwenaelle; Gu, Zhiping; He, Sherry; Kumar, Sanjeev; Larsen, Christopher N; Lee, Alexandra J; Li, Xiaomei; Macken, Catherine; Mahaffey, Colin; Pickett, Brett E; Reardon, Brian; Smith, Thomas; Stewart, Lucy; Suloway, Christian; Sun, Guangyu; Tong, Lei; Vincent, Amy L; Walters, Bryan; Zaremba, Sam; Zhao, Hongtao; Zhou, Liwei; Zmasek, Christian; Klem, Edward B; Scheuermann, Richard H
2017-01-04
The Influenza Research Database (IRD) is a U.S. National Institute of Allergy and Infectious Diseases (NIAID)-sponsored Bioinformatics Resource Center dedicated to providing bioinformatics support for influenza virus research. IRD facilitates the research and development of vaccines, diagnostics and therapeutics against influenza virus by providing a comprehensive collection of influenza-related data integrated from various sources, a growing suite of analysis and visualization tools for data mining and hypothesis generation, personal workbench spaces for data storage and sharing, and active user community support. Here, we describe the recent improvements in IRD including the use of cloud and high performance computing resources, analysis and visualization of user-provided sequence data with associated metadata, predictions of novel variant proteins, annotations of phenotype-associated sequence markers and their predicted phenotypic effects, hemagglutinin (HA) clade classifications, an automated tool for HA subtype numbering conversion, linkouts to disease event data and the addition of host factor and antiviral drug components. All data and tools are freely available without restriction from the IRD website at https://www.fludb.org. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
G. Scott Place; Bruce Hronek
2001-01-01
Open space is a necessary tool in our park system for fostering creativity and allowing for relaxation. In addition, open space areas allow people to exercise, find self-worth, and use their imagination. This manuscript addresses what is happening in the open space provided in several park settings. Do residents use open space as a place where they can play...
Next Generation Loading System for Detonators and Primers
Designed, fabricated, and installed next-generation tooling to provide additional manufacturing capabilities for new detonators and other small...prototype munitions on automated, semi-automated, and manual machines. Led the design effort for, procured, and installed a primary explosive Drying Oven for a pilot...facility. Designed, fabricated, and installed a Primary Explosives Waste Treatment System in a pilot environmental processing facility. Designed
NASA Research to Support the Airlines
NASA Technical Reports Server (NTRS)
Evans, Cody; Mogford, Richard H.
2017-01-01
This presentation is an update on continued research and partnerships with airline and industry partners. In this presentation, several recent research efforts are discussed and illustrations are provided to bring greater awareness to the commercial aviation industry. By discussing projects like the Flight Awareness Collaboration Tool and dispatcher human factors studies, we can solicit additional feedback and participation.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
... closely tailor their investment and risk management strategies and decisions. Furthermore, the Exchange... powerful tool for hedging a market sector, and that various strategies that the investor put into play were... to provide investors with additional short term option classes for investment, trading, and risk...
Measured outcomes with hypnosis as an experimental tool in a cardiovascular physiology laboratory.
Casiglia, Edoardo; Tikhonoff, Valérie; Giordano, Nunzia; Andreatta, Elisa; Regaldo, Giuseppe; Tosello, Maria T; Rossi, Augusto M; Bordin, Daniele; Giacomello, Margherita; Facco, Enrico
2012-01-01
The authors detail their multidisciplinary collaboration of cardiologists, physiologists, neurologists, psychologists, engineers, and statisticians in researching the effects of hypnosis on the cardiovascular system, and their additions to that incomplete literature. The article presents their results and provides guidelines for researchers interested in replicating studies of hypnosis' effects on the cardiovascular system.
An Analysis of Data Activities and Instructional Supports in Middle School Science Textbooks
ERIC Educational Resources Information Center
Morris, Bradley J.; Masnick, Amy M.; Baker, Katie; Junglen, Angela
2015-01-01
A critical component of science and math education is reasoning with data. Science textbooks are instructional tools that provide opportunities for learning science content (e.g. facts about force and motion) and process skills (e.g. data recording) that support and augment reasoning with data. In addition, the construction and design of textbooks…
PowerPoint and Concept Maps: A Great Double Act
ERIC Educational Resources Information Center
Simon, Jon
2015-01-01
This article explores how concept maps can provide a useful addition to PowerPoint slides to convey interconnections of knowledge and help students see how knowledge is often non-linear. While most accounting educators are familiar with PowerPoint, they are likely to be less familiar with concept maps and this article shows how the tool can be…
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNesby, Kevin L.; Homan, Barrie E.; Benjamin, Richard A.
The techniques presented in this paper allow mapping of temperature, pressure, chemical species, and energy deposition during and following detonations of explosives, using high-speed cameras as the main diagnostic tool. Additionally, this work provides measurements in the explosive near to far field (0-500 charge diameters) of surface temperatures, peak air-shock pressures, some chemical species signatures, shock energy deposition, and air-shock formation.
Felicia D. Archuleta; Paulette L. Ford
2013-01-01
Black-tailed prairie dogs (Cynomys ludovicianus) are considered a keystone species in grassland ecosystems. Through their burrowing activities, they conspicuously alter grassland landscapes and provide foraging, shelter and nesting habitat for a diverse array of grassland species, in addition to serving as prey for the endangered black-footed ferret (Mustela nigripes...
Impact of iPads on Break-Time in Primary Schools--A Danish Context
ERIC Educational Resources Information Center
Schilhab, Theresa
2017-01-01
Today, technology in the form of tablet computers (e.g. iPads) is crucial as a tool for learning and education. Tablets support educational activities such as archiving, word processing, and generation of academic products. They also connect with the Internet, providing access to news, encyclopaedic entries, and e-books. In addition, tablets have…
Pi in the Sky: Hands-on Mathematical Activities for Teaching Astronomy.
ERIC Educational Resources Information Center
Pethoud, Robert
This book of activities was designed to provide students with the opportunity to create mental models of concepts in astronomy while using simple, homemade tools. In addition, these sequential, hands-on activities are designed to help students see how scientific knowledge is obtained. The introduction describes the rationale for the book and describes the…
ERIC Educational Resources Information Center
Holcomb, Edie L.
2004-01-01
This book builds upon the best-selling first edition to provide additional guidance and support for educators who are "ready, willing, and able" to explore more sophisticated uses of data. New tools and activities facilitate active engagement with data and a collaborative culture of collective responsibility for the learning of all…
Smartphone based face recognition tool for the blind.
Kramer, K M; Hedin, D S; Rolkosky, D J
2010-01-01
The inability to identify people during group meetings is a disadvantage for blind people in many professional and educational situations. To explore the efficacy of face recognition using smartphones in these settings, we have prototyped and tested a face recognition tool for blind users. The tool utilizes smartphone technology in conjunction with a wireless network to provide audio feedback identifying the people in front of the blind user. Testing indicated that the face recognition technology can tolerate up to a 40-degree angle between the direction a person is facing and the camera's axis, achieving a 96% success rate with no false positives. Future work will be done to further develop the technology for local face recognition on the smartphone in addition to remote server-based face recognition.
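For readers who want to experiment with the detection stage of such a pipeline, the sketch below runs OpenCV's bundled Haar-cascade face detector on a single image. This is a generic detection example, not the authors' recognition system, and the input file name is illustrative.

    import cv2

    # Load OpenCV's bundled frontal-face Haar cascade.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    img = cv2.imread("meeting.jpg")           # hypothetical input frame
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        print(f"face at x={x}, y={y}, size={w}x{h}")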
Visualizing vascular structures in virtual environments
NASA Astrophysics Data System (ADS)
Wischgoll, Thomas
2013-01-01
In order to learn more about the cause of coronary heart diseases and develop diagnostic tools, the extraction and visualization of vascular structures from volumetric scans for further analysis is an important step. Determining a geometric representation of the vasculature allows the geometry to be inspected and additional quantitative data to be calculated and incorporated into the visualization. To provide a more user-friendly visualization tool, virtual environment paradigms can be utilized. This paper describes techniques for interactive rendering of large-scale vascular structures within virtual environments. These can be applied to almost any virtual environment configuration, such as CAVE-type displays. Specifically, the tools presented in this paper were tested on a Barco I-Space and a large 62x108 inch passive projection screen with a Kinect sensor for user tracking.
Integrating Climate and Ocean Change Vulnerability into Conservation Planning
NASA Astrophysics Data System (ADS)
Mcleod, E.; Green, A.; Game, E.; Anthony, K.; Cinner, J.; Heron, S. F.; Kleypas, J. A.; Lovelock, C.; Pandolfi, J.; Pressey, B.; Salm, R.; Schill, S.; Woodroffe, C. D.
2013-05-01
Tropical coastal and marine ecosystems are particularly vulnerable to ocean warming, ocean acidification, and sea-level rise. Yet these projected climate and ocean change impacts are rarely considered in conservation planning due to the lack of guidance on how existing climate and ocean change models, tools, and data can be applied. We address this gap by describing how conservation planning can use available tools and data for assessing the vulnerability of tropical marine ecosystems to key climate threats. Additionally, we identify limitations of existing tools and provide recommendations for future research to improve integration of climate and ocean change information and conservation planning. Such information is critical for developing a conservation response that adequately protects these ecosystems and dependent coastal communities in the face of climate and ocean change.
Cementitious Barriers Partnership FY2013 End-Year Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G. P.; Langton, C. A.; Burns, H. H.
2013-11-01
In FY2013, the Cementitious Barriers Partnership (CBP) demonstrated continued tangible progress toward fulfilling the objective of developing a set of software tools to improve understanding and prediction of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. In November 2012, the CBP released “Version 1.0” of the CBP Software Toolbox, a suite of software for simulating reactive transport in cementitious materials and important degradation phenomena. In addition, the CBP completed development of new software for the “Version 2.0” Toolbox to be released in early FY2014 and demonstrated use of the Version 1.0 Toolbox on DOE applications. The current primary software components in both Versions 1.0 and 2.0 are LeachXS/ORCHESTRA, STADIUM, and a GoldSim interface for probabilistic analysis of selected degradation scenarios. The CBP Software Toolbox Version 1.0 supports analysis of external sulfate attack (including damage mechanics), carbonation, and primary constituent leaching. Version 2.0 adds analysis of chloride attack and of dual-regime flow and contaminant migration in fractured and non-fractured cementitious material. The LeachXS component embodies an extensive material property measurements database along with chemical speciation and reactive mass transport simulation cases, with emphasis on leaching of major, trace and radionuclide constituents from cementitious materials used in DOE facilities, such as Saltstone (Savannah River) and Cast Stone (Hanford), tank closure grouts, and barrier concretes. STADIUM focuses on the physical and structural service life of materials and components based on chemical speciation and reactive mass transport of major cement constituents and aggressive species (e.g., chloride, sulfate, etc.). THAMES is a planned future CBP Toolbox component focused on simulation of the microstructure of cementitious materials and calculation of the resultant hydraulic and constituent mass transfer parameters needed in modeling. Two CBP software demonstrations were conducted in FY2013, one to support the Saltstone Disposal Facility (SDF) at SRS and the other on a representative Hanford high-level waste tank. The CBP Toolbox demonstration on the SDF provided analysis of the most probable degradation mechanisms of the cementitious vault enclosure caused by sulfate and carbonation ingress. This analysis was documented and resulted in the issuance of an SDF Performance Assessment Special Analysis by Liquid Waste Operations this fiscal year. The two new software tools supporting chloride attack and dual-regime flow will provide additional degradation tools to better evaluate the performance of DOE and commercial cementitious barriers. The CBP SRNL experimental program produced two patent applications and field data that will be used in the development and calibration of CBP software tools being developed in FY2014. The CBP software and simulation tools differ from other efforts in that all the tools are based upon specific and relevant experimental research on cementitious materials utilized in DOE applications. The CBP FY2013 program involved continuing research to improve and enhance the simulation tools as well as developing new tools that model other key degradation phenomena not addressed in Version 1.0.
Efforts to verify the various simulation tools through laboratory experiments and analysis of field specimens are also ongoing and will continue into FY2014 to quantify and reduce the uncertainty associated with performance assessments. This end-year report summarizes FY2013 software development efforts and the various experimental programs that are providing data for calibration and validation of the CBP-developed software.
Klein, Dawn M; Fix, Gemmae M; Hogan, Timothy P; Simon, Steven R; Nazi, Kim M; Turvey, Carolyn L
2015-08-18
Information sharing between providers is critical for care coordination, especially in health systems such as the United States Department of Veterans Affairs (VA), where many patients also receive care from other health care organizations. Patients can facilitate this sharing by using the Blue Button, an online tool that promotes patients' ability to view, print, and download their health records. The aim of this study was to characterize (1) patients' use of Blue Button, an online information-sharing tool in VA's patient portal, My HealtheVet, (2) information-sharing practices between VA and non-VA providers, and (3) how providers and patients use a printed Blue Button report during a clinical visit. Semistructured qualitative interviews were conducted with 34 VA patients, 10 VA providers, and 9 non-VA providers. Interviews focused on patients' use of Blue Button, information-sharing practices between VA and non-VA providers, and how patients and providers use a printed Blue Button report during a clinical visit. Qualitative themes were identified through iterative rounds of coding starting with an a priori schema based on technology adoption theory. Information sharing between VA and non-VA providers relied primarily on the patient. Patients most commonly used Blue Button to access and share VA laboratory results. Providers recognized the need for improved information sharing, valued the Blue Button printout, and expressed interest in a way to share information electronically across settings. Consumer-oriented technologies such as Blue Button can facilitate patients sharing health information with providers in other health care systems; however, more education is needed to inform patients of this use to facilitate care coordination. Additional research is needed to explore how personal health record documents, such as Blue Button reports, can be easily shared and incorporated into the clinical workflow of providers.
UCSC genome browser: deep support for molecular biomedical research.
Mangan, Mary E; Williams, Jennifer M; Lathe, Scott M; Karolchik, Donna; Lathe, Warren C
2008-01-01
The volume and complexity of genomic sequence data, and the additional experimental data required for annotation of the genomic context, pose a major challenge for display and access for biomedical researchers. Genome browsers organize this data and make it available in various ways to extract useful information to advance research projects. The UCSC Genome Browser is one of these resources. The official sequence data for a given species forms the framework to display many other types of data such as expression, variation, cross-species comparisons, and more. Visual representations of the data are available for exploration. Data can be queried with sequences. Complex database queries are also easily achieved with the Table Browser interface. Associated tools permit additional query types or access to additional data sources such as images of in situ localizations. Support for solving researchers' issues is provided through active discussion mailing lists and updated training materials. The UCSC Genome Browser provides a source of deep support for a wide range of biomedical molecular research (http://genome.ucsc.edu).
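Alongside the web-based Table Browser, UCSC documents a public read-only MySQL mirror of the same tables; the snippet below sketches a typical query against it. The host, database, and table names follow UCSC's published conventions but should be treated as assumptions to verify against current documentation.

    import pymysql

    # Query the UCSC public MySQL mirror for RefSeq gene models of TP53 on hg38.
    conn = pymysql.connect(host="genome-mysql.soe.ucsc.edu",
                           user="genome", database="hg38")
    with conn.cursor() as cur:
        cur.execute(
            "SELECT name, chrom, txStart, txEnd "
            "FROM refGene WHERE name2 = %s LIMIT 5", ("TP53",))
        for row in cur.fetchall():
            print(row)
    conn.close()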
Cramer, Bradley D.; Kleffner, Mark A.; Brett, Carlton E.; McLaughlin, P.I.; Jeppsson, Lennart; Munnecke, Axel; Samtleben, Christian
2010-01-01
The Wenlock Epoch of the Silurian Period has become one of the chronostratigraphically best-constrained intervals of the Paleozoic. The integration of multiple chronostratigraphic tools, such as conodont and graptolite biostratigraphy, sequence stratigraphy, and δ13Ccarb chemostratigraphy, has greatly improved global chronostratigraphic correlation and portions of the Wenlock can now be correlated with precision better than ±100 kyr. Additionally, such detailed and integrated chronostratigraphy provides an opportunity to evaluate the fidelity of individual chronostratigraphic tools. Here, we use conodont biostratigraphy, sequence stratigraphy and carbon isotope (δ13Ccarb) chemostratigraphy to demonstrate that the conodont Kockelella walliseri, an important guide fossil for middle and upper Sheinwoodian strata (lower stage of the Wenlock Series), first appears at least one full stratigraphic sequence lower in Laurentia than in Baltica. Rather than serving as a demonstration of the unreliability of conodont biostratigraphy, this example serves to demonstrate the promise of high-resolution Paleozoic stratigraphy. The temporal difference between the two first occurrences was likely less than 1 million years, and although it is conceptually understood that speciation and colonization must have been non-instantaneous events, Paleozoic paleobiogeographic variability on such short timescales (tens to hundreds of kyr) traditionally has been ignored or considered to be of little practical importance. The expansion of high-resolution Paleozoic stratigraphy in the future will require robust biostratigraphic zonations that embrace the integration of multiple chronostratigraphic tools as well as the paleobiogeographic variability in ranges that they will inevitably demonstrate. In addition, a better understanding of the paleobiogeographic migration histories of marine organisms will provide a unique tool for future Paleozoic paleoceanography and paleobiology research. © 2010 Elsevier B.V.
Rasinger, J D; Marbaix, H; Dieu, M; Fumière, O; Mauro, S; Palmblad, M; Raes, M; Berntssen, M H G
2016-09-16
The rapidly growing aquaculture industry drives the search for sustainable protein sources in fish feed. In the European Union (EU), since 2013 non-ruminant processed animal proteins (PAP) are again permitted to be used in aquafeeds. To ensure that commercial fish feeds do not contain PAP from prohibited species, EU reference methods were established. However, due to the heterogeneous and complex nature of PAP, complementary methods are required to guarantee the safe use of this fish feed ingredient. In addition, there is a need for tissue-specific PAP detection to identify the sources (i.e. bovine carcass, blood, or meat) of illegal PAP use. In the present study, we investigated and compared different protein extraction, solubilisation and digestion protocols on different proteomics platforms for the detection and differentiation of prohibited PAP. In addition, we assessed whether tissue-specific PAP detection was feasible using proteomics tools. All work was performed independently in two different laboratories. We found that, irrespective of sample preparation, gel-based proteomics tools were inappropriate when working with PAP. Gel-free shotgun proteomics approaches in combination with direct spectral comparison were able to provide quality species- and tissue-specific data to complement and refine current methods of PAP detection and identification. To guarantee the safe use of processed animal protein (PAP) in aquafeeds, efficient PAP detection and monitoring tools are required. The present study investigated and compared various proteomics workflows and shows that the application of shotgun proteomics in combination with direct comparison of spectral libraries provides the desired species- and tissue-specific classification of this heat-sterilized and pressure-treated (≥133°C at 3 bar for 20 min) protein feed ingredient. Copyright © 2016 Elsevier B.V. All rights reserved.
FunGene: the functional gene pipeline and repository.
Fish, Jordan A; Chai, Benli; Wang, Qiong; Sun, Yanni; Brown, C Titus; Tiedje, James M; Cole, James R
2013-01-01
Ribosomal RNA genes have become the standard molecular markers for microbial community analysis for good reasons, including universal occurrence in cellular organisms, availability of large databases, and ease of rRNA gene region amplification and analysis. As markers, however, rRNA genes have some significant limitations. The rRNA genes are often present in multiple copies, unlike most protein-coding genes. The slow rate of change in rRNA genes means that multiple species sometimes share identical 16S rRNA gene sequences, while many more species share identical sequences in the short 16S rRNA regions commonly analyzed. In addition, the genes involved in many important processes are not distributed in a phylogenetically coherent manner, potentially due to gene loss or horizontal gene transfer. While rRNA genes remain the most commonly used markers, key genes in ecologically important pathways, e.g., those involved in carbon and nitrogen cycling, can provide important insights into community composition and function not obtainable through rRNA analysis. However, working with ecofunctional gene data requires some tools beyond those required for rRNA analysis. To address this, our Functional Gene Pipeline and Repository (FunGene; http://fungene.cme.msu.edu/) offers databases of many common ecofunctional genes and proteins, as well as integrated tools that allow researchers to browse these collections and choose subsets for further analysis, build phylogenetic trees, test primers and probes for coverage, and download aligned sequences. Additional FunGene tools are specialized to process coding gene amplicon data. For example, FrameBot produces frameshift-corrected protein and DNA sequences from raw reads while finding the most closely related protein reference sequence. These tools can help provide better insight into microbial communities by directly studying key genes involved in important ecological processes.
SAR For REDD+ in the Mai Ndombe District (DRC)
NASA Astrophysics Data System (ADS)
Haarpaintner, Jorg
2016-08-01
The overall goal of the project "SAR for REDD" is to provide cloud-penetrating satellite synthetic aperture radar (SAR) pre-processing and analysing capabilities and tools to support operational tropical forest monitoring in REDD countries, primarily in Africa. The project's end-user is the Observatoire Satellital des Forêts d'Afrique Centrale (OSFAC). This paper presents an overall summary of the project and shows first results of the satellite products that will be delivered to the user in addition to software tools to enhance the user's own technical capacity. The products shown here are SAR mosaics and derived forest-land cover maps based on C-band Sentinel-1A data for 2015, ALOS PALSAR data for the period 2007-2010 and ALOS-2 PALSAR-2 data for 2015. In addition, a forest cover change map from 2007 to 2010 based on ALOS PALSAR has been produced and is compared to results from the Global Forest Cover project [1].
NASA Astrophysics Data System (ADS)
Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Lian, Yanping; Yu, Cheng; Liu, Zeliang; Yan, Jinhui; Wolff, Sarah; Wu, Hao; Ndip-Agbor, Ebot; Mozaffar, Mojtaba; Ehmann, Kornel; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam
2018-05-01
Additive manufacturing (AM) possesses appealing potential for manipulating material compositions, structures and properties in end-use products with arbitrary shapes without the need for specialized tooling. Since the physical process is difficult to experimentally measure, numerical modeling is a powerful tool to understand the underlying physical mechanisms. This paper presents our latest work in this regard based on comprehensive material modeling of process-structure-property relationships for AM materials. The numerous influencing factors that emerge from the AM process motivate the need for novel rapid design and optimization approaches. For this, we propose data-mining as an effective solution. Such methods—used in the process-structure, structure-properties and the design phase that connects them—would allow for a design loop for AM processing and materials. We hope this article will provide a road map to enable AM fundamental understanding for the monitoring and advanced diagnostics of AM processing.
Approaches for Defining the Hsp90-dependent Proteome
Hartson, Steven D.; Matts, Robert L.
2011-01-01
Hsp90 is the target of ongoing drug discovery studies seeking new compounds to treat cancer, neurodegenerative diseases, and protein folding disorders. To better understand Hsp90's roles in cellular pathologies and in normal cells, numerous studies have utilized proteomics assays and related high-throughput tools to characterize its physical and functional protein partnerships. This review surveys these studies and summarizes the strengths and limitations of the individual approaches. We also include downloadable spreadsheets compiling all of the Hsp90-interacting proteins identified in more than 23 studies. These tools include cross-references among gene aliases, human homologues of yeast Hsp90-interacting proteins, hyperlinks to database entries, summaries of canonical pathways that are enriched in the Hsp90 interactome, and additional bioinformatic annotations. In addition to summarizing Hsp90 proteomics studies performed to date and the insights they have provided, we identify gaps in our current understanding of Hsp90-mediated proteostasis. PMID:21906632
New Tools to Search for Data in the European Space Agency's Planetary Science Archive
NASA Astrophysics Data System (ADS)
Grotheer, E.; Macfarlane, A. J.; Rios, C.; Arviset, C.; Heather, D.; Fraga, D.; Vallejo, F.; De Marchi, G.; Barbarisi, I.; Saiz, J.; Barthelemy, M.; Docasal, R.; Martinez, S.; Besse, S.; Lim, T.
2016-12-01
The European Space Agency's (ESA) Planetary Science Archive (PSA), which can be accessed at http://archives.esac.esa.int/psa, provides public access to the archived data of Europe's missions to our neighboring planets. These datasets are compliant with the Planetary Data System (PDS) standards. Recently, a new interface has been released, which includes upgrades to make PDS4 data available from newer missions such as ExoMars and BepiColombo. Additionally, the PSA development team has been working to ensure that the legacy PDS3 data will be more easily accessible via the new interface as well. In addition to a new querying interface, the new PSA also allows access via the EPN-TAP and PDAP protocols. This makes the PSA data sets compatible with other archive-related tools and projects, such as the Virtual European Solar and Planetary Access (VESPA) project for creating a virtual observatory.
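Because EPN-TAP is a standard IVOA TAP profile, a synchronous query can be sketched with plain HTTP. The endpoint URL below is a placeholder rather than the PSA's actual service address, while the epn_core table and its granule_uid and instrument_name columns follow the published EPN-TAP conventions.

    import requests

    # Synchronous ADQL query against a hypothetical EPN-TAP service endpoint.
    TAP_SYNC = "https://example-psa-tap.esa.int/tap/sync"   # placeholder URL
    query = "SELECT TOP 10 granule_uid, instrument_name FROM epn_core"
    resp = requests.get(TAP_SYNC, params={
        "REQUEST": "doQuery", "LANG": "ADQL",
        "FORMAT": "votable", "QUERY": query})
    print(resp.text[:500])   # first part of the VOTable XML response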
SLDAssay: A software package and web tool for analyzing limiting dilution assays.
Trumble, Ilana M; Allmon, Andrew G; Archin, Nancie M; Rigdon, Joseph; Francis, Owen; Baldoni, Pedro L; Hudgens, Michael G
2017-11-01
Serial limiting dilution (SLD) assays are used in many areas of infectious disease related research. This paper presents SLDAssay, a free and publicly available R software package and web tool for analyzing data from SLD assays. SLDAssay computes the maximum likelihood estimate (MLE) for the concentration of target cells, with corresponding exact and asymptotic confidence intervals. Exact and asymptotic goodness-of-fit p-values, and a bias-corrected (BC) MLE are also provided. No other publicly available software currently implements the BC MLE or the exact methods. For validation of SLDAssay, results from Myers et al. (1994) are replicated. Simulations demonstrate that the BC MLE is less biased than the MLE. Additionally, simulations demonstrate that exact methods tend to give better confidence interval coverage and goodness-of-fit tests with lower type I error than the asymptotic methods. Additional advantages of the exact methods are discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
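Under the single-hit Poisson model that limiting-dilution MLEs are conventionally built on, the concentration estimate can be sketched in a few lines. The data below are hypothetical, and this is not SLDAssay's R code, which additionally provides the exact intervals and the bias correction.

    import numpy as np
    from scipy.optimize import minimize_scalar

    # Hypothetical SLD data: input cells per well at each dilution,
    # positive wells, and replicate wells per dilution.
    cells = np.array([1e6, 2e5, 4e4, 8e3])
    pos   = np.array([12, 9, 4, 1])
    reps  = np.array([12, 12, 12, 12])

    def neg_log_lik(log_c):
        c = np.exp(log_c)                    # target-cell concentration per input cell
        p = 1.0 - np.exp(-c * cells)         # P(well positive) under the single-hit model
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -np.sum(pos * np.log(p) + (reps - pos) * np.log(1.0 - p))

    res = minimize_scalar(neg_log_lik, bounds=(-30.0, 0.0), method="bounded")
    print(f"MLE concentration: {np.exp(res.x):.3g} per input cell")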
Spec Tool: an online education and research resource
NASA Astrophysics Data System (ADS)
Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.
2016-06-01
Education and public outreach (EPO) activities related to remote sensing, space, planetary and geo-physics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. Because suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better understanding of theoretical spectroscopy and imaging spectroscopy data in a 'hands-on' activity. The tool is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. It enables users to visualize spectral signatures from the USGS spectral library as well as additional spectra collected in the EPIF, such as those of dunes in southern Israel and Turkmenistan. For researchers and educators, the tool allows loading locally collected samples for further analysis.
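The two analysis algorithms named above are compact enough to sketch directly. The spectra below are hypothetical four-band examples, and this is the standard textbook formulation rather than Spec Tool's own code.

    import numpy as np

    def spectral_angle(a, b):
        """Spectral angle (radians) between two spectra -- the SAM measure."""
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(cos, -1.0, 1.0))

    # Hypothetical 4-band reflectance spectra.
    target = np.array([0.10, 0.25, 0.40, 0.35])
    ref = np.array([0.12, 0.24, 0.38, 0.36])
    print(f"SAM angle: {spectral_angle(target, ref):.4f} rad")

    # Linear unmixing: solve target ~ E @ f for endmember fractions f.
    E = np.column_stack([ref, [0.30, 0.30, 0.30, 0.30]])  # bands x endmembers
    f, *_ = np.linalg.lstsq(E, target, rcond=None)
    print("estimated endmember fractions:", f)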
Risk stratification following acute myocardial infarction.
Singh, Mandeep
2007-07-01
This article reviews the current risk assessment models available for patients presenting with myocardial infarction (MI). These practical tools enhance the health care provider's ability to rapidly and accurately assess patient risk from the event or revascularization therapy, and are of paramount importance in managing patients presenting with MI. This article highlights the models used for ST-elevation MI (STEMI) and non-ST elevation MI (NSTEMI) and provides an additional description of models used to assess risks after primary angioplasty (ie, angioplasty performed for STEMI).
NASA Technical Reports Server (NTRS)
Jedlovec, Gary J.; Molthan, Andrew; Zavodsky, Bradley T.; Case, Jonathan L.; LaFontaine, Frank J.; Srikishen, Jayanthi
2010-01-01
The NASA Short-term Prediction Research and Transition Center (SPoRT)'s new "Weather in a Box" resources will provide weather research and forecast modeling capabilities for real-time applications. Model output will provide additional forecast guidance and support research into the impacts of new NASA satellite data sets and software capabilities. By combining several research tools and satellite products, SPoRT can generate model guidance that is strongly influenced by unique NASA contributions.
Lawrence, Renée H; Tomolo, Anne M
2011-03-01
Although practice-based learning and improvement (PBLI) is now recognized as a fundamental and necessary skill set, we are still in need of tools that yield specific information about gaps in knowledge and application to help nurture the development of quality improvement (QI) skills in physicians in a proficient and proactive manner. We developed a questionnaire and coding system as an assessment tool to evaluate and provide feedback regarding PBLI self-efficacy, knowledge, and application skills for residency programs and related professional requirements. Five nationally recognized QI experts/leaders reviewed and completed our questionnaire. Through an iterative process, a coding system based on identifying key variables needed for ideal responses was developed to score project proposals. The coding system comprised 14 variables related to the QI projects, and an additional 30 variables related to the core knowledge concepts related to PBLI. A total of 86 residents completed the questionnaire, and 2 raters coded their open-ended responses. Interrater reliability was assessed by percentage agreement and Cohen κ for individual variables and Lin concordance correlation for total scores for knowledge and application. Discriminative validity (t test to compare known groups) and coefficient of reproducibility as an indicator of construct validity (item difficulty hierarchy) were also assessed. Interrater reliability estimates were good (percentage of agreements, above 90%; κ, above 0.4 for most variables; concordances for total scores were R = .88 for knowledge and R = .98 for application). Despite the residents' limited range of experiences in the group with prior PBLI exposure, our tool met our goal of differentiating between the 2 groups in our preliminary analyses. Correcting for chance agreement identified some variables that are potentially problematic. Although additional evaluation is needed, our tool may prove helpful and provide detailed information about trainees' progress and the curriculum.
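Since the evaluation leans on chance-corrected agreement, a minimal computation of Cohen's kappa for two raters' codes may help make the statistic concrete; the example labels below are hypothetical, not the study's data.

    import numpy as np

    def cohens_kappa(r1, r2):
        """Cohen's kappa for two raters' categorical codes."""
        r1, r2 = np.asarray(r1), np.asarray(r2)
        cats = np.union1d(r1, r2)
        po = np.mean(r1 == r2)                                        # observed agreement
        pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)   # chance agreement
        return (po - pe) / (1.0 - pe)

    # Hypothetical codes assigned by two raters to six responses.
    print(cohens_kappa(["yes", "no", "yes", "no", "yes", "no"],
                       ["yes", "no", "no", "no", "yes", "yes"]))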
Guidelines for the analysis of free energy calculations.
Klimovich, Pavel V; Shirts, Michael R; Mobley, David L
2015-05-01
Free energy calculations based on molecular dynamics simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub as part of the pymbar package (located at http://github.com/choderalab/pymbar), that implements the analysis practices reviewed here for several reference simulation packages, and which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope this tool and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations.
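Of the estimators mentioned, thermodynamic integration is the simplest to illustrate: the free energy difference is the integral of the ensemble-averaged dU/dlambda over the coupling parameter. The sketch below applies trapezoidal quadrature to hypothetical per-window averages; it is a minimal illustration, not alchemical-analysis.py itself.

    import numpy as np

    # Hypothetical lambda schedule and <dU/dlambda> averages per window (kJ/mol).
    lam = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    dudl = np.array([45.2, 30.1, 18.7, 9.4, 2.3])

    # Thermodynamic integration via the trapezoidal rule.
    dG = np.trapz(dudl, lam)
    print(f"TI estimate of the free energy difference: {dG:.2f} kJ/mol")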
π Scope: python based scientific workbench with visualization tool for MDSplus data
NASA Astrophysics Data System (ADS)
Shiraiwa, S.
2014-10-01
πScope is a Python-based scientific data analysis and visualization tool built on wxPython and Matplotlib. Although it is designed to be a generic tool, the primary motivations for developing the new software are 1) to provide an updated tool to browse MDSplus data, with functionality beyond dwscope and jScope, and 2) to provide a universal foundation for constructing interface tools to perform computer simulation and modeling for Alcator C-Mod. It provides many features for visualizing MDSplus data during tokamak experiments, including overplotting of different signals and discharges, various plot types (line, contour, image, etc.), in-panel data analysis using Python scripts, and publication-quality graphics generation. Additionally, the logic for producing multi-panel plots is designed to be backward compatible with dwscope, enabling smooth migration for dwscope users. πScope uses multi-threading to reduce data-transfer latency, and its object-oriented design makes it easy to modify and expand, while its open-source nature allows portability. A built-in tree data browser allows a user to approach the data structure both from a GUI and from a script, enabling relatively complex data analysis workflows to be built quickly. As an example, an IDL-based interface for performing GENRAY/CQL3D simulations was ported to πScope, allowing LHCD simulations to be run between shots using C-Mod experimental profiles. This workflow is being used to generate a large database to develop an LHCD actuator model for the plasma control system. Supported by USDoE Award DE-FC02-99ER54512.
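For orientation, reading a signal through the MDSplus Python bindings, the data layer πScope browses, looks roughly like the sketch below. The tree name, shot number, and node path are hypothetical, and the exact call pattern may vary with MDSplus version.

    # Reading one signal with the MDSplus Python bindings (names assumed).
    from MDSplus import Tree

    shot = 1150101001                 # hypothetical shot number
    tree = Tree("cmod", shot)         # open the experiment tree for this shot
    node = tree.getNode(r"\ip")       # plasma-current node (path assumed)
    ip = node.data()                  # signal samples as a numpy array
    t = node.dim_of().data()          # matching time base
    print(ip.shape, t.shape)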
Engine System Model Development for Nuclear Thermal Propulsion
NASA Technical Reports Server (NTRS)
Nelson, Karl W.; Simpson, Steven P.
2006-01-01
In order to design, analyze, and evaluate conceptual Nuclear Thermal Propulsion (NTP) engine systems, an improved NTP design and analysis tool has been developed. The NTP tool utilizes the Rocket Engine Transient Simulation (ROCETS) system tool and many of the routines from the Enabler reactor model found in Nuclear Engine System Simulation (NESS). Improved non-nuclear component models and an external shield model were added to the tool. With the addition of a nearly complete system reliability model, the tool will provide performance, sizing, and reliability data for NERVA-Derived NTP engine systems. A new detailed reactor model is also being developed and will replace Enabler. The new model will allow more flexibility in reactor geometry and include detailed thermal hydraulics and neutronics models. A description of the reactor, component, and reliability models is provided. Another key feature of the modeling process is the use of comprehensive spreadsheets for each engine case. The spreadsheets include individual worksheets for each subsystem with data, plots, and scaled figures, making the output very useful to each engineering discipline. Sample performance and sizing results with the Enabler reactor model are provided including sensitivities. Before selecting an engine design, all figures of merit must be considered including the overall impacts on the vehicle and mission. Evaluations based on key figures of merit of these results and results with the new reactor model will be performed. The impacts of clustering and external shielding will also be addressed. Over time, the reactor model will be upgraded to design and analyze other NTP concepts with CERMET and carbide fuel cores.
BIRCH: a user-oriented, locally-customizable, bioinformatics system.
Fristensky, Brian
2007-02-09
Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.
Coastal On-line Assessment and Synthesis Tool 2.0
NASA Technical Reports Server (NTRS)
Brown, Richard; Navard, Andrew; Nguyen, Beth
2011-01-01
COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.
NASA transmission research and its probable effects on helicopter transmission design
NASA Technical Reports Server (NTRS)
Zaretsky, E. V.; Coy, J. J.; Townsend, D. P.
1983-01-01
Transmissions studied for application to helicopters in addition to the more conventional geared transmissions include hybrid (traction/gear), bearingless planetary, and split torque transmissions. Research is being performed to establish the validity of analysis and computer codes developed to predict the performance, efficiency, life, and reliability of these transmissions. Results of this research should provide the transmission designer with analytical tools to design for minimum weight and noise with maximum life and efficiency. In addition, the advantages and limitations of drive systems as well as the more conventional systems will be defined.
Tan, Amanda W Y; Hemelrijk, Charlotte K; Malaivijitnond, Suchinda; Gumert, Michael D
2018-05-12
Examining how animals direct social learning during skill acquisition under natural conditions generates data for testing hypotheses about how transmission biases influence cultural change in animal populations. We studied a population of macaques on Koram Island, Thailand, and examined model-based biases during interactions by unskilled individuals with tool-using group members. We first compared the prevalence of interactions (watching, obtaining food, object exploration) and proximity to tool users during interactions in developing individuals (infants, juveniles) versus mature non-learners (adolescents, adults), to provide evidence that developing individuals actively seek information about tool use from social partners. All infants and juveniles, but only 49% of mature individuals, interacted with tool users. Macaques predominantly obtained food by scrounging or stealing, suggesting that maximizing scrounging opportunities motivates interactions with tool users. However, while interactions by adults were limited to obtaining food, young macaques and particularly infants also watched tool users and explored objects, indicating additional interest in tool use itself. We then ran matrix correlations to identify interaction biases and the attributes of tool users that influenced them. Biases correlated with social affiliation, but macaques also preferentially targeted tool users that could increase scrounging and learning opportunities. The results suggest that social structure may constrain social learning, but the motivation to bias interactions towards tool users to maximize feeding opportunities may also socially modulate learning by facilitating close proximity to better tool users, and further interest in tool-use actions and materials, especially during development.
2014-01-01
Background. Evidence rankings do not consider equally internal (IV), external (EV), and model validity (MV) for clinical studies including complementary and alternative medicine/integrative medicine (CAM/IM) research. This paper describe this model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature that has been published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to Jan 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool to incorporate for the evaluation of quality of EV/MV research that is more sensitive to CAM/IM research. Conclusion. Improved reporting on EV can help produce and provide information that will help guide policy makers, public health researchers, and other scientists in their selection, development, and improvement in their research-tested intervention. Overall, clinical studies with high EV have the potential to provide the most useful information about “real-world” consequences of health interventions. It is hoped that this novel tool which considers IV, EV, and MV on equal footing will better guide clinical decision making. PMID:24734111
Is nanotechnology the key to unravel and engineer biological processes?
Navarro, Melba; Planell, Josep A
2012-01-01
Regenerative medicine is an emerging field that aims to develop new reparative strategies to treat degenerative diseases, injury, and trauma through developmental pathways, in order to rebuild the architecture of the original injured organ and take over its functionality. Most of the processes and interactions involved in the regenerative process take place at the subcellular scale. Nanotechnology provides the tools and technology not only to detect, measure, or image the interactions between different biomolecules and biological entities, but also to control and guide the regenerative process. The relevance of nanotechnology to the development of regenerative medicine, as well as an overview of the different tools that contribute to unraveling and engineering biological systems, are presented in this chapter. In addition, general data about the social impact of and global investment in nanotechnology are provided.
Medical ethics on film: towards a reconstruction of the teaching of healthcare professionals.
Volandes, Angelo
2007-11-01
The clinical vignette remains the standard means by which medical ethics are taught to students in the healthcare professions. Although written or verbal vignettes are useful as a pedagogic tool for teaching ethics and introducing students to real cases, they are limited, since students must imagine the clinical scenario. Medical ethics are almost universally taught during the early years of training, when students are unfamiliar with the clinical reality in which ethical issues arise. Film vignettes bridge that imaginative gap. By providing vivid details with images, film vignettes offer rich and textured accounts of cases, including the patient's perspective and the clinical reality. Film vignettes provide a detailed ethnography that allows for a more complete discussion of the ethical issues. Film can therefore serve as an additional tool for teaching medical ethics to members of the healthcare professions.
NASA Technical Reports Server (NTRS)
Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti A.; Maddox, Marlo M.; Mays, Mona Leila
2015-01-01
The Space Weather Research Center (http://swrc.gsfc.nasa.gov) at NASA Goddard, part of the Community Coordinated Modeling Center (http://ccmc.gsfc.nasa.gov), is committed to providing research-based forecasts and notifications to address NASA's space weather needs, in addition to its critical role in space weather education. It provides a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, tailored space weather alerts and products, and weekly summaries and reports. In this paper, we focus on how (near) real-time data (both in space and on the ground), in combination with modeling capabilities and an innovative dissemination system called the integrated Space Weather Analysis system (http://iswa.gsfc.nasa.gov), enable monitoring, analyzing, and predicting the spacecraft charging environment for spacecraft users. Relevant tools and resources are discussed.
Modeling a Wireless Network for International Space Station
NASA Technical Reports Server (NTRS)
Alena, Richard; Yaprak, Ece; Lamouri, Saad
2000-01-01
This paper describes the application of wireless local area network (LAN) simulation modeling methods to the hybrid LAN architecture designed to support crew-computing tools aboard the International Space Station (ISS). These crew-computing tools, such as wearable computers and portable advisory systems, will provide crew members with real-time vehicle and payload status information and access to digital technical and scientific libraries, significantly enhancing human capabilities in space. A wireless network, therefore, will provide wearable computers and remote instruments with access to the high-performance computational power needed by next-generation 'intelligent' software applications. Wireless network performance in such simulated environments is characterized by the sustainable throughput of data under different traffic conditions. These data will be used to help plan the addition of more access points supporting new modules, and more nodes for increased network capacity, as the ISS grows.
In vivo RNAi: Today and Tomorrow
Perrimon, Norbert; Ni, Jian-Quan; Perkins, Lizabeth
2010-01-01
RNA interference (RNAi) provides a powerful reverse-genetics approach for analyzing gene function both in tissue culture and in vivo. Because of its widespread applicability and effectiveness, it has become an essential part of the toolkits of model organisms such as Caenorhabditis elegans, Drosophila, and the mouse. In addition, the use of RNAi in animals in which genetic tools are either poorly developed or nonexistent enables a myriad of fundamental questions to be asked. Here, we review the methods and applications of in vivo RNAi for characterizing gene function in model organisms and discuss their impact on the study of developmental as well as evolutionary questions. Further, we discuss the applications of RNAi technologies to crop improvement, pest control, and RNAi therapeutics, providing an appreciation of the potential for phenomenal applications of RNAi in agriculture and medicine. PMID:20534712
Najmanovich, Rafael
2013-01-01
IsoCleft Finder is a web-based tool for the detection of local geometric and chemical similarities between potential small-molecule binding cavities and a non-redundant dataset of ligand-bound known small-molecule binding sites. The non-redundant dataset developed as part of this study is composed of 7339 entries representing unique Pfam/PDB-ligand (hetero group code) combinations with known levels of cognate ligand similarity. The query cavity can be uploaded by the user or detected automatically by the system using existing PDB entries as well as user-provided structures in PDB format. In all cases, the user can refine the definition of the cavity interactively via a browser-based Jmol 3D molecular visualization interface. Furthermore, users can restrict the search to a subset of the dataset using a cognate-similarity threshold. Local structural similarities are detected using the IsoCleft software and ranked according to two criteria (number of atoms in common and Tanimoto score of local structural similarity) and the associated Z-score and p-value measures of statistical significance. The results, including predicted ligands, target proteins, similarity scores, number of atoms in common, etc., are shown in a powerful interactive graphical interface. This interface permits the visualization of target ligands superimposed on the query cavity and additionally provides a table of pairwise ligand topological similarities. Similarities between top-scoring ligands serve as an additional tool to judge the quality of the results obtained. We present several examples where IsoCleft Finder provides useful functional information. IsoCleft Finder results are complementary to existing approaches for the prediction of protein function from structure, rational drug design, and X-ray crystallography. IsoCleft Finder can be found at: http://bcb.med.usherbrooke.ca/isocleftfinder. PMID:24555058
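As a hedged illustration of the two ranking criteria named in this abstract (number of atoms in common and a Tanimoto score of local structural similarity), the following Python sketch ranks hypothetical candidate sites. It is not IsoCleft Finder's actual code, and the PDB identifiers and atom counts are invented.

```python
# Illustrative sketch only: rank candidate binding sites by atoms-in-common,
# breaking ties with a Tanimoto coefficient over the two atom sets.

def tanimoto(n_common: int, n_query: int, n_target: int) -> float:
    """Tanimoto coefficient of two atom sets given their match count."""
    union = n_query + n_target - n_common
    return n_common / union if union else 0.0

def rank_hits(query_atoms: int, hits: list[dict]) -> list[dict]:
    """Sort candidate sites by matched atoms, then by Tanimoto score."""
    for h in hits:
        h["tanimoto"] = tanimoto(h["n_common"], query_atoms, h["n_atoms"])
    return sorted(hits, key=lambda h: (h["n_common"], h["tanimoto"]),
                  reverse=True)

# Hypothetical candidate sites from a reference dataset.
hits = [
    {"pdb": "1ABC", "n_atoms": 40, "n_common": 22},
    {"pdb": "2XYZ", "n_atoms": 35, "n_common": 22},
    {"pdb": "3DEF", "n_atoms": 60, "n_common": 25},
]
for h in rank_hits(30, hits):
    print(h["pdb"], h["n_common"], round(h["tanimoto"], 3))
```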
Automated Generation of Technical Documentation and Provenance for Reproducible Research
NASA Astrophysics Data System (ADS)
Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.
2017-12-01
Data provenance and detailed technical documentation are essential components of high-quality reproducible research, yet they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project is difficult to get right, as it is a time-consuming and often tedious process for the researchers involved. As a result, the provenance records and technical documentation provided alongside research results can be incomplete or inconsistent with the processes actually followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, it can be a poor substitute for good technical documentation and is often more difficult for a third party to understand - particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules that the tool requires to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed, the documentation and provenance records are regenerated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.
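The following Python sketch illustrates the core idea described above: because the classification rules live in a machine-readable internal model, the same rule definitions that classify the input layers can also emit documentation and a provenance record. The rule names, fields, and record format are assumptions for illustration, not the actual tool's API.

```python
# Minimal sketch, not the tool's real code: rules are data, so classification,
# documentation, and provenance all derive from the same rule objects.
import datetime

RULES = [
    {"name": "steep_slopes", "layer": "slope", "op": ">", "value": 15,
     "doc": "Cells with slope > 15 degrees are classed as high erosion risk."},
    {"name": "wetlands", "layer": "landcover", "op": "==", "value": "wetland",
     "doc": "Wetland landcover cells are excluded from development."},
]

def apply_rules(cell: dict) -> tuple[list[str], dict]:
    """Classify one cell and return (matched rule names, provenance record)."""
    matched = []
    for r in RULES:
        v = cell[r["layer"]]
        if (r["op"] == ">" and v > r["value"]) or \
           (r["op"] == "==" and v == r["value"]):
            matched.append(r["name"])
    # The provenance record is regenerated on every run, so it always
    # describes the exact rules and inputs actually used.
    provenance = {
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "inputs": sorted({r["layer"] for r in RULES}),
        "rules_applied": [r["name"] for r in RULES],
        "documentation": [r["doc"] for r in RULES],
    }
    return matched, provenance

matched, prov = apply_rules({"slope": 20, "landcover": "wetland"})
print(matched)  # ['steep_slopes', 'wetlands']
```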
Kraemer, Kari; Cohen, Mark E; Liu, Yaoming; Barnhart, Douglas C; Rangel, Shawn J; Saito, Jacqueline M; Bilimoria, Karl Y; Ko, Clifford Y; Hall, Bruce L
2016-11-01
There is an increased desire among patients and families to be involved in the surgical decision-making process. A surgeon's ability to provide patients and families with patient-specific estimates of postoperative complications is critical for shared decision making and informed consent. Surgeons can also use patient-specific risk estimates to decide whether or not to operate and what options to offer patients. Our objective was to develop and evaluate a publicly available risk estimation tool that would cover many common pediatric surgical procedures across all specialties. American College of Surgeons NSQIP Pediatric standardized data from 67 hospitals were used to develop a risk estimation tool. Surgeons enter 18 preoperative variables (demographics, comorbidities, procedure) that are used in a logistic regression model to predict 9 postoperative outcomes. A surgeon adjustment score is also incorporated to adjust for any additional risk not accounted for in the 18 risk factors. A pediatric surgical risk calculator was developed based on 181,353 cases covering 382 CPT codes across all specialties. It had excellent discrimination for mortality (c-statistic = 0.98), morbidity (c-statistic = 0.81), and 7 additional complications (c-statistic > 0.77). The Hosmer-Lemeshow statistic and graphic representations also showed excellent calibration. The ACS NSQIP Pediatric Surgical Risk Calculator was developed using standardized and audited multi-institutional data from the ACS NSQIP Pediatric, and it provides empirically derived, patient-specific postoperative risks. It can be used as a tool in the shared decision-making process by providing clinicians, families, and patients with useful information for many of the most common operations performed on pediatric patients in the US.
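A minimal sketch of the modelling approach the abstract describes: a logistic regression over preoperative risk factors whose linear predictor is shifted by a surgeon adjustment score. The coefficients and variables below are invented for illustration; they are not the published ACS NSQIP Pediatric model.

```python
# Hedged sketch: logistic risk prediction with a surgeon adjustment term.
import math

def predicted_risk(intercept: float, coefs: dict, patient: dict,
                   surgeon_adjustment: float = 0.0) -> float:
    """Return the predicted probability of one postoperative outcome."""
    z = intercept + surgeon_adjustment          # adjustment shifts log-odds
    z += sum(coefs[k] * patient[k] for k in coefs)
    return 1.0 / (1.0 + math.exp(-z))           # logistic link

# Hypothetical two-variable model for one of the nine outcomes.
coefs = {"age_years": -0.05, "asa_class": 0.9}
p = predicted_risk(-4.0, coefs, {"age_years": 3, "asa_class": 2},
                   surgeon_adjustment=0.2)
print(f"predicted complication risk: {p:.1%}")
```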
Akl, Elie A; Fadlallah, Racha; Ghandour, Lilian; Kdouh, Ola; Langlois, Etienne; Lavis, John N; Schünemann, Holger; El-Jardali, Fadi
2017-09-04
Groups or institutions funding or conducting systematic reviews in health policy and systems research (HPSR) should prioritise topics according to the needs of policymakers and stakeholders. The aim of this study was to develop and validate a tool to prioritise questions for systematic reviews in HPSR. We developed the tool following a four-step approach consisting of (1) definition of the purpose and scope of the tool, (2) item generation and reduction, (3) testing for content and face validity, and (4) pilot testing of the tool. The research team involved international experts in HPSR, systematic review methodology, and tool development, led by the Center for Systematic Reviews on Health Policy and Systems Research (SPARK). We followed an inclusive approach in determining the final selection of items to allow customisation to the user's needs. The purpose of the SPARK tool was to prioritise questions in HPSR so that they can be addressed in systematic reviews. In the item generation and reduction phase, an extensive literature search yielded 40 relevant articles, which were reviewed by the research team to create a preliminary list of 19 candidate items for inclusion in the tool. As part of testing for content and face validity, input from international experts led to the refining, changing, merging, and addition of new items, and to organisation of the tool into two modules. Following pilot testing, we finalised the tool, with 22 items organised in two modules - the first module including 13 items to be rated by policymakers and stakeholders, and the second including 9 items to be rated by systematic review teams. Users can customise the tool to their needs by omitting items that may not be applicable to their settings. We also developed a user manual that provides guidance on how to use the SPARK tool, along with signaling questions. We have developed and conducted initial validation of the SPARK tool to prioritise questions for systematic reviews in HPSR, along with a user manual. By aligning systematic review production with policy priorities, the tool will help support evidence-informed policymaking and reduce research waste. We invite others to contribute with additional real-life implementation of the tool.
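One plausible way the two-module rating structure could be aggregated into a priority score is sketched below. The equal weighting of modules, the 1-5 rating scale, and the skipping of omitted items are assumptions for illustration, not the published SPARK scoring rules.

```python
# Illustrative sketch only: aggregate the two rating modules into one score.

def priority_score(policymaker_ratings: list,
                   review_team_ratings: list) -> float:
    """Average the two module means; omitted (None) items are skipped,
    mirroring the tool's option to drop items that do not apply."""
    def mean(xs):
        xs = [x for x in xs if x is not None]
        return sum(xs) / len(xs) if xs else 0.0
    return 0.5 * mean(policymaker_ratings) + 0.5 * mean(review_team_ratings)

# One question rated on a 1-5 scale: 13 policymaker items, 9 review-team items.
print(priority_score([4, 5, 3, None, 4, 4, 5, 3, 4, 4, 5, 4, 3],
                     [3, 4, 4, 5, 3, 4, 4, None, 5]))
```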
Whalen, Kimberly J; Buchholz, Susan W
The overall objective of this review is to quantitatively measure the psychometric properties and the feasibility of caregiver burden screening tools. The more specific objectives were to determine the reliability, validity, and feasibility of tools used to screen for caregiver burden and strain. This review considered international quantitative research papers that addressed the psychometric properties and feasibility of caregiver burden screening tools. The search strategy aimed to find both published and unpublished studies from 1980 to 2007, published in the English language only. An initial limited search of MEDLINE and CINAHL was undertaken, followed by analysis of the text words contained in the title and abstract and of the index terms used to describe the article. A second search identified keywords and index terms across major databases. Third, the reference lists of identified reports and articles were searched for additional studies. Each paper was assessed by two independent reviewers for methodological quality prior to inclusion in the review, using an appropriate critical appraisal instrument from the Joanna Briggs Institute's System for the Unified Management, Assessment and Review (SUMARI) package. Because burden is a multidimensional construct defined internationally with a multitude of other terms, only those studies whose title, abstract, or keywords contained the search terminology developed for this review were identified for retrieval. The construct of caregiver burden is not standardized, and many terms are used to describe burden. A caregiver is also identified as a carer. Instruments exist in multiple languages and have been tested in multiple populations. A total of 112 papers, experimental and non-experimental in nature, were included in the review. The majority were non-experimental studies that tested or used a caregiver burden screening tool. Because of the nature of these papers, a meta-analysis of the results was not possible. Instead, a table is used to depict the 74 caregiver burden screening tools that meet the psychometric and feasibility standards of this review. The Zarit Burden Interview (ZBI), in particular the 22-item version, has been examined the most throughout the literature. In addition to its sound psychometric properties, the ZBI has been widely used across languages and cultures. The significant amount of research already done on psychometric testing of caregiver burden tools has provided a solid foundation for additional research. Although some tools have been well tested, many have published only limited psychometric properties and feasibility data. Clinicians need to be aware of this and may need to team up with a researcher to obtain additional data on their specific population before using a minimally tested caregiver burden screening tool. Because caregiver burden is multidimensional and many different terms are used to describe burden, both the clinician and the researcher need to be precise in their selection of the appropriate tool for their work.
A modeling tool to support decision making in future hydropower development in Chile
NASA Astrophysics Data System (ADS)
Vicuna, S.; Hermansen, C.; Cerda, J. P.; Olivares, M. A.; Gomez, T. I.; Toha, E.; Poblete, D.; Mao, L.; Falvey, M. J.; Pliscoff, P.; Melo, O.; Lacy, S.; Peredo, M.; Marquet, P. A.; Maturana, J.; Gironas, J. A.
2017-12-01
Modeling tools support planning by providing a transparent means to assess the outcomes of natural resource management alternatives within technical frameworks in the presence of conflicting objectives. Such tools, when employed to model different scenarios, complement discussion in a policy-making context. Examples of practical use of this type of tool exist, such as in Canadian public forest management, but they are not common, especially in developing countries. We present a tool to support the selection from a portfolio of potential future hydropower projects in Chile. This tool, developed by a large team of researchers under the guidance of the Chilean Energy Ministry, is especially relevant in a context of evident regionalism, skepticism, and changing societal values in a country that has achieved sustained growth alongside increased demands from society. The tool operates at the scale of a river reach (1-5 km long) on a domain that can be defined according to the scale needs of the related discussion; its application can vary from river basins to regions or other spatial configurations of interest. The tool addresses both the available hydropower potential and the existence (inferred or observed) of other ecological, social, cultural, and productive characteristics of the territory that are valuable to society, and it provides a means to evaluate their interaction. The occurrence of each of these valuable characteristics in the territory is measured by generating a presence-density score for each. Considering the level of constraint each characteristic imposes on hydropower development, the characteristics are weighted against each other and an aggregate score is computed. With this information, optimal trade-offs between additional hydropower capacity and valuable local characteristics are computed over the entire domain using the classical 0-1 knapsack optimization algorithm. Various scenarios with different weightings and hydropower development targets are tested and compared. The results illustrate the capabilities of the tool to identify promising hydropower development strategies and to aid public policy discussions aimed at establishing incentives and regulations, providing decision makers with supporting material for a more informed discussion.
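The trade-off computation named above can be illustrated with a classical 0-1 knapsack: each candidate project contributes hydropower capacity (the value) and consumes a budget of aggregated territorial-impact score (the weight). The numbers are invented; this is a sketch of the algorithm class, not the project's implementation.

```python
# Sketch of a 0-1 knapsack: maximise MW subject to an aggregate impact cap.

def knapsack(projects, impact_budget):
    """projects: list of (capacity_mw, impact_score) with integer impacts."""
    dp = [0.0] * (impact_budget + 1)        # dp[b] = best MW within budget b
    choice = [[False] * len(projects) for _ in range(impact_budget + 1)]
    for i, (mw, impact) in enumerate(projects):
        for b in range(impact_budget, impact - 1, -1):  # descending: 0-1 items
            if dp[b - impact] + mw > dp[b]:
                dp[b] = dp[b - impact] + mw
                choice[b] = choice[b - impact][:]
                choice[b][i] = True
    selected = [i for i, c in enumerate(choice[impact_budget]) if c]
    return dp[impact_budget], selected

projects = [(120, 7), (45, 2), (80, 5), (30, 1)]  # (MW, impact) per reach
best_mw, selected = knapsack(projects, impact_budget=8)
print(best_mw, selected)  # 155.0 [1, 2, 3]
```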
DMPTool: New Guidance, Resources, and Outreach for Quality Data Management Plans
NASA Astrophysics Data System (ADS)
Cruse, P.; Sallans, A.
2013-12-01
A growing number of US federal funding agencies require data management plans (DMPs) as part of new research grant proposals. To help researchers meet this requirement, several organizations (the California Digital Library, University of Illinois, University of Virginia, Smithsonian Institution, the DataONE consortium, and the (UK) Digital Curation Centre) came together to develop the DMPTool in 2011. The goal of the DMPTool is to provide researchers with guidance, links to resources, and help with writing data management plans. Thanks to a grant from the Alfred P. Sloan Foundation, these organizations have been able to develop DMPTool2, adding new features and functionality while aiming to grow the number of users and supported funder requirements and to build a community around DMP best practices. Researchers create plans in the tool by selecting their desired funding agency. The tool provides the specific requirements of the selected agency as well as detailed help with each area of the plan. Users have access to complete DMP life-cycle management, tracking changes through creation, editing, submission, evaluation, and publication. They may also perform enhanced keyword searches and view publicly available plans. With role-based user authorization, users may hold various roles within the interface: creators, collaborators, institutional administrators, and tool administrators. Furthermore, partner institutions can add significant value to the process with several special tool features. Along with institutional branding in the interface, they can provide links to resources they offer users, such as preservation repositories, consultation services, or news and event items. In addition, partner institutions can provide help with specific plan questions and even suggest responses. Institutional administrators can also mine data on plans in order to better support researchers. Institutions may be represented in different roles: as a funder, as a researcher's affiliate, or as an institution with its own DMP requirements.
Providing Access and Visualization to Global Cloud Properties from GEO Satellites
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Palikonda, R.; Ayers, J. K.
2015-12-01
Providing public access to cloud macro- and microphysical properties is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a tool and method that allow end users to easily browse and access cloud information that is otherwise difficult to acquire and manipulate. The core of the tool is an application-programming interface that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the dynamically generated imagery as an input to their own workflows, both for image generation and for cloud product requisition. This project builds upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product imagery accessible and easily searchable. As virtual supply chains that add value at each link come into increasing use, there is value in making satellite-derived cloud product information available through a simple access method, and in allowing users to browse and view that imagery as they need it rather than in a manner most convenient for the data provider. Using the Open Geospatial Consortium's Web Processing Service as our access method, we describe a hybrid local and cloud-based parallel processing system that can return both satellite imagery and cloud product imagery, as well as the binary data used to generate them, in multiple formats. The images and cloud products are sourced from multiple satellites and also from "merged" datasets created by temporally and spatially matching satellite sensors. Finally, the tool and API allow users to access information spanning the full time range for which our group has data available. In the case of satellite imagery, the temporal range can span the entire lifetime of the sensor.
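A hedged sketch of calling such a service through the standard OGC WPS 1.0.0 key-value-pair encoding follows; the endpoint URL, process identifier, and input names are placeholders, not the Langley group's published interface.

```python
# Sketch of a WPS Execute request via the OGC 1.0.0 KVP encoding.
# Endpoint, identifier, and inputs below are illustrative placeholders.
import requests

WPS_URL = "https://example.nasa.gov/wps"  # placeholder endpoint

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "CloudProductImagery",   # hypothetical process name
    "datainputs": ("satellite=GOES-13;product=cloud_phase;"
                   "time=2015-07-01T12:00:00Z;format=geotiff"),
}
resp = requests.get(WPS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("cloud_phase.tif", "wb") as f:
    f.write(resp.content)
```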
Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing
NASA Technical Reports Server (NTRS)
Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.
2010-01-01
The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development and throughout the life of the Orion project.
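A minimal sketch of the data-driven idea: automated transitions defined in configuration data rather than compiled code, so a sequence can be changed without recompilation. The JSON schema, mode names, and parameters are invented for illustration and are not Orion's actual format.

```python
# Sketch of table-driven sequencing: editing the data changes the automation.
import json

SEQUENCE = json.loads("""
[
  {"mode": "coast", "trigger": "burn_start", "next": "powered_flight",
   "params": {"guidance_gain": 0.8}},
  {"mode": "powered_flight", "trigger": "burn_complete", "next": "coast",
   "params": {"guidance_gain": 0.2}}
]
""")

def next_mode(current: str, event: str):
    """Look up the configured transition for (mode, event), if any."""
    for row in SEQUENCE:
        if row["mode"] == current and row["trigger"] == event:
            return row["next"], row["params"]
    return current, {}  # no transition defined: stay in the current mode

print(next_mode("coast", "burn_start"))  # ('powered_flight', {...})
```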
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shekar, Venkateswaran; Fiondella, Lance; Chatterjee, Samrat
Several transportation network vulnerability models have been proposed. However, most consider disruptions only as a static snapshot in time and measure only the impact on total travel time. These approaches can consider neither the time-varying nature of travel demand nor other undesirable outcomes that follow from transportation network disruptions. This paper proposes an algorithmic approach to assess the vulnerability of a transportation network that considers time-varying demand with an open-source dynamic transportation simulation tool. The open-source nature of the tool allows us to systematically consider many disruption scenarios and quantitatively compare their relative criticality. This is far more efficient than traditional approaches, which would require days or weeks of a transportation engineer's time to manually set up, run, and assess these simulations. In addition to travel time, we also collect statistics on the additional fuel consumed and the corresponding carbon dioxide emissions. Our approach thus provides a more systematic assessment that is both time-varying and able to consider additional negative consequences of disruptions for decision makers to evaluate.
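The scenario loop described above might look like the following sketch, where run_simulation stands in for the dynamic traffic simulator (a placeholder, not a real API): each candidate link is disrupted in turn, and the extra travel time, fuel, and CO2 relative to the baseline are recorded so scenarios can be ranked by criticality.

```python
# Illustrative sketch of systematic disruption-scenario assessment.

def assess_vulnerability(links, run_simulation):
    baseline = run_simulation(disrupted=None)  # time-varying demand baseline
    results = []
    for link in links:
        r = run_simulation(disrupted=link)
        results.append({
            "link": link,
            "extra_travel_time_h": r["travel_time_h"] - baseline["travel_time_h"],
            "extra_fuel_l": r["fuel_l"] - baseline["fuel_l"],
            "extra_co2_kg": r["co2_kg"] - baseline["co2_kg"],
        })
    # Rank scenarios by travel-time impact, most critical first.
    return sorted(results, key=lambda x: x["extra_travel_time_h"], reverse=True)

# Example with a stub simulator (replace with the real simulation call):
def stub_sim(disrupted=None):
    penalty = 1.15 if disrupted else 1.0
    return {"travel_time_h": 1000 * penalty, "fuel_l": 50000 * penalty,
            "co2_kg": 120000 * penalty}

print(assess_vulnerability(["I-95N", "Bridge-7"], stub_sim)[0])
```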
ERIC Educational Resources Information Center
Stallman, Helen M.; King, Sharron
2016-01-01
The increasing awareness and impact of mental health problems in university students, in addition to a need for objective measures of teaching quality, provide the impetus for a new approach to supporting students. There is a need for more effective tools that integrate the institutional silos of teaching, learning, support, and wellbeing to help…
United States Air Force Civil Engineering Additive Manufacturing Applications: Tools and Jigs
designs for printing applications. The overall results advance the Air Force's 3D printing knowledge while providing critical information for decision makers on this up-and-coming technology. ... The results indicate that 3D scanning technology will reach a point within the next 5 years where it can help foster the rapid build-up of 3D CE asset
ERIC Educational Resources Information Center
Brant, Jacek Wiktor
2015-01-01
In the wake of the current world financial crisis, serious efforts are being made to rethink the dominant economic assumptions. There is a growing movement in universities to make economics more relevant and to embrace an understanding of diverse models. Additionally, philosophical schools such as critical realism have provided new tools for thinking…
Technology as a Tool for Urban Classrooms. ERIC/CUE Digest, Number 95.
ERIC Educational Resources Information Center
Burnett, Gary
By 1992, according to a study by the Council of Chief State School Officers, more than 3.5 million computers were in U.S. elementary and secondary schools--a ratio of one computer for every 13 students. In addition, 99 percent of all schools across the country reported that they provide their students with some access to computers. Sometimes…
ERIC Educational Resources Information Center
Lowe, Karen
2003-01-01
Discusses the process of weeding, updating, and building a school library media collection that supports the state curriculum. Explains resource alignment, a process for using the shelf list as a tool to analyze and align media center resources to state curricula, and describes a five-year plan and its usefulness for additional funding. (LRW)
2006-09-01
Underwater Robot Challenge was organized and supported by the City University of Hong Kong and the WWF (Worldwide Fund for Nature). In addition to providing background information and resources for developing the mission scenario, working with Ocean.US and the ORION
NREL Leads Wind Farm Modeling Research - Continuum Magazine | NREL
[Photo caption: ten 2-MW Bonus wind turbines; photo provided by HC Sorensen, Middelgrunden Wind Turbine Cooperative.] The National Renewable Energy Laboratory (NREL) has created complex computer modeling tools to improve wind turbine design and to model the overall wind farm activity surrounding a multi-megawatt wind turbine. In addition to its work with Doppler LIDAR, the
Almeida, Lucas Henrique de; Gonçalves, Maísa de Carvalho; Novaes, Marcos Carneiro; Paresqui, Rayner Constantino; Bispo, Pitágoras da Conceição
2018-01-12
Specimens of the perlid stonefly Anacroneuria flintorum from different regions of the Brazilian Atlantic Forest along the Atlantic coast of Brazil were studied. In this paper, the nymph of A. flintorum is described based on reared specimens and molecular associations. Additionally, we provide new locality records and comments about variation of the species.
Urban tree crown health assessment system: a tool for communities and citizen foresters
Matthew F. Winn; Sang-Mook Lee; Philip A. Araman
2007-01-01
Trees are important assets to urban communities. In addition to the aesthetic values that urban trees provide, they also aid in such things as erosion control, pollution removal, and rainfall interception. The urban environment, however, can often produce stresses to these trees. Soil compaction, limited root growth, and groundwater contamination are just a few of the...
Power and Energy: Industrial Arts Curriculum Guide. Grades 9-12. Curriculum Guide 1335 (Tentative).
ERIC Educational Resources Information Center
Louisiana State Dept. of Education, Baton Rouge.
The tentative guide in power and energy for senior high school use is part of a series of industrial arts curriculum materials developed by the State of Louisiana. The course is designed to provide "hands-on" experience with tools and materials along with a study of the industrial processes in power and energy. In addition, the student…
One important outcome of this project will be the development of a long-term demonstration program that will provide teaching and research tools for many decades. In addition, we will develop models that will be available for others to use, present our findings to others, and ...
ERIC Educational Resources Information Center
Louisiana State Dept. of Education, Baton Rouge.
The tentative guide in graphic arts technology for senior high schools is part of a series of industrial arts curriculum materials developed by the State of Louisiana. The course is designed to provide "hands-on" experience with tools and materials along with a study of the industrial processes in graphic arts technology. In addition,…
SeqLib: a C ++ API for rapid BAM manipulation, sequence alignment and sequence assembly
Wala, Jeremiah; Beroukhim, Rameen
2017-01-01
We present SeqLib, a C++ API and command line tool that provides a rapid and user-friendly interface to BAM/SAM/CRAM files, global sequence alignment operations and sequence assembly. Four C libraries perform core operations in SeqLib: HTSlib for BAM access, BWA-MEM and BLAT for sequence alignment and Fermi for error correction and sequence assembly. Benchmarking indicates that SeqLib has lower CPU and memory requirements than leading C++ sequence analysis APIs. We demonstrate an example of how minimal SeqLib code can extract, error-correct and assemble reads from a CRAM file and then align with BWA-MEM. SeqLib also provides additional capabilities, including chromosome-aware interval queries and read plotting. Command line tools are available for performing integrated error correction, micro-assemblies and alignment. Availability and Implementation: SeqLib is available on Linux and OSX for the C++98 standard and later at github.com/walaj/SeqLib. SeqLib is released under the Apache2 license. Additional capabilities for BLAT alignment are available under the BLAT license. Contact: jwala@broadinstitue.org; rameen@broadinstitute.org PMID:28011768
Incorporation of a horizontally transferred gene into an operon during cnidarian evolution.
Dana, Catherine E; Glauber, Kristine M; Chan, Titus A; Bridge, Diane M; Steele, Robert E
2012-01-01
Genome sequencing has revealed examples of horizontally transferred genes, but we still know little about how such genes are incorporated into their host genomes. We have previously reported the identification of a gene (flp) that appears to have entered the Hydra genome through horizontal transfer. Here we provide additional evidence in support of our original hypothesis that the transfer was from a unicellular organism, and we show that the transfer occurred in an ancestor of two medusozoan cnidarian species. In addition, we show that the gene is part of a bicistronic operon in the Hydra genome. These findings identify a new animal phylum in which trans-spliced leader addition has led to the formation of operons, and they define the requirements for the evolution of an operon in Hydra. The identification of operons in Hydra also provides a tool that can be exploited in the construction of transgenic Hydra strains.
Development of a StandAlone Surgical Haptic Arm.
Jones, Daniel; Lewis, Andrew; Fischer, Gregory S
2011-01-01
When performing telesurgery with current commercially available Minimally Invasive Robotic Surgery (MIRS) systems, a surgeon cannot feel the tool interactions that are inherent in traditional laparoscopy. It is proposed that haptic feedback in the control of MIRS systems could improve the speed, safety and learning curve of robotic surgery. To test this hypothesis, a standalone surgical haptic arm (SASHA) capable of manipulating da Vinci tools has been designed and fabricated with the additional ability of providing information for haptic feedback. This arm was developed as a research platform for developing and evaluating approaches to telesurgery, including various haptic mappings between master and slave and evaluating the effects of latency.
NASA Astrophysics Data System (ADS)
Kleber, E.; Crosby, C. J.; Arrowsmith, R.; Robinson, S.; Haddad, D. E.
2013-12-01
The use of Light Detection and Ranging (lidar) derived topography has become an indispensable tool in Earth science research. The collection of high-resolution lidar topography from an airborne or terrestrial platform allows landscapes and landforms to be represented at sub-meter resolution and in three dimensions. In addition to its high value for scientific research, lidar-derived topography has tremendous potential as a tool for Earth science education. Recent science education initiatives and a community call for access to research-level data make the time ripe to expose lidar data and derived data products as teaching tools. High-resolution topographic data foster several Disciplinary Core Ideas (DCIs) of the Next Generation Science Standards (NGSS, 2013), address the Big Ideas of the new community-driven Earth Science Literacy Initiative (ESLI, 2009), and teach to a number of National Science Education Standards (NSES, 1996) and Benchmarks for Science Literacy (AAAS, 1993) for undergraduate physical and environmental Earth science classes. The spatial context of lidar data complements concepts like visualization, place-based learning, inquiry-based teaching, and active learning that are essential to teaching in the geosciences. As the official host of EarthScope lidar datasets for tectonically active areas in the western United States, the NSF-funded OpenTopography facility provides user-friendly access to a wealth of data that is easily incorporated into Earth science educational materials. OpenTopography (www.opentopography.org), in collaboration with EarthScope, has developed education and outreach activities to foster teacher, student, and researcher utilization of lidar data. These educational resources use lidar data coupled with free tools such as Google Earth to provide a means for students and the interested public to visualize and explore Earth's surface in an interactive manner not possible with most other remotely sensed imagery. The education section of the OpenTopography portal has recently been strengthened with the addition of several new resources and the reorganization of existing content for easy discovery. New resources include a detailed frequently asked questions (FAQ) section, updated 'How-to' videos for downloading data from OpenTopography, and additional webpages aimed at students, educators, and researchers, leveraging existing and updated resources from OpenTopography, EarthScope, and other organizations. In addition, the OpenLandform catalog, an online collection of classic geologic landforms depicted in lidar, has been updated to include additional tectonic landforms from EarthScope lidar datasets.
Pellet to Part Manufacturing System for CNCs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roschli, Alex C.; Love, Lonnie J.; Post, Brian K.
Oak Ridge National Laboratory’s Manufacturing Demonstration Facility worked with Hybrid Manufacturing Technologies to develop a compact prototype composite additive manufacturing head that can effectively extrude injection molding pellets. The head interfaces with conventional CNC machine tools enabling rapid conversion of conventional machine tools to additive manufacturing tools. The intent was to enable wider adoption of Big Area Additive Manufacturing (BAAM) technology and combine BAAM technology with conventional machining systems.
NASA Technical Reports Server (NTRS)
Shih, Hsin-Yi; Tien, James S.; Ferkul, Paul (Technical Monitor)
2001-01-01
The recently developed numerical model of concurrent-flow flame spread over thin solids has been used as a simulation tool to support the design of a space experiment. The two-dimensional and three-dimensional, steady forms of the compressible Navier-Stokes equations with chemical reactions are solved. With the coupled multi-dimensional solver for radiative heat transfer, the model is capable of answering a number of questions regarding the experiment concept and the hardware design. In this paper, the capabilities of the numerical model are demonstrated by providing guidance on several experiment design issues. The test matrix and operating conditions of the experiment are estimated from the modeling results. Three-dimensional calculations are made to simulate the flame-spreading experiment with a realistic hardware configuration. The computed detailed flame structures provide insight for data collection. In addition, the heating load and the requirements for product exhaust cleanup in the flow tunnel are estimated with the model. We anticipate that using this simulation tool will enable a more efficient and successful space experiment.
Gaining perspective on the water-energy nexus at the community scale.
Perrone, Debra; Murphy, Jennifer; Hornberger, George M
2011-05-15
Water and energy resources are interrelated, but their influence on each other is rarely considered. To quantify the water and energy portfolios associated with a community's water-energy nexus (WEN) and the influence of geographic location on resources, we present the WEN tool. The WEN tool quantifies a community's transport resources (consumed for, or lost before, delivery) and nexus resources (energy for water and water for energy) so communities can assess their resource flows. In addition, to provide insight into the full range of impacts of water and energy resource acquisition and to frame the influence of geography on resources, we coin the term "urban resource islands". The concept of urban resource islands provides a framework for considering the implications of geography for a community's water and energy resource acquisition and use. The WEN tool and the concept of resource islands can prompt communities to think about their hidden resources and to integrate such concepts into their sustainability trade-off analyses and policy decisions. In this paper, we use Tucson, Arizona, United States as a case study.
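A toy sketch of the accounting the WEN tool performs, using invented numbers and field names: energy embedded in delivered water (energy for water) and water consumed in generating delivered energy (water for energy), alongside transport losses.

```python
# Illustrative nexus accounting with invented intensities; not the WEN tool.
community = {
    "water_delivered_m3": 50e6,
    "water_lost_in_transport_m3": 5e6,
    "energy_delivered_MWh": 2.0e6,
    "energy_for_water_kWh_per_m3": 1.8,   # pumping + treatment intensity
    "water_for_energy_m3_per_MWh": 2.5,   # cooling/consumption intensity
}

energy_for_water_MWh = (community["water_delivered_m3"]
                        * community["energy_for_water_kWh_per_m3"] / 1000)
water_for_energy_m3 = (community["energy_delivered_MWh"]
                       * community["water_for_energy_m3_per_MWh"])

print(f"energy for water:        {energy_for_water_MWh:,.0f} MWh")
print(f"water for energy:        {water_for_energy_m3:,.0f} m3")
print(f"water lost in transport: {community['water_lost_in_transport_m3']:,.0f} m3")
```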
NASA Technical Reports Server (NTRS)
Kocher, Joshua E; Gilliam, David P.
2005-01-01
Secure computing is a necessity in the hostile environment that the internet has become. Protection from nefarious individuals and organizations requires a solution that is more a methodology than a one-time fix. One aspect of this methodology is knowing which network ports a computer has open to the world. These network ports are essentially the doorways from the internet into the computer. An assessment method that uses the nmap software to scan ports has been developed to aid System Administrators (SAs) with the analysis of open ports on their system(s). Additionally, baselines for several operating systems have been developed so that SAs can compare their open ports to a baseline for a given operating system. Further, the tool is deployed on a website where SAs and users can request a port scan of their computer. The results are then emailed to the requestor. This tool aids users, SAs, and security professionals by providing an overall picture of what services are running, what ports are open, potential trojan programs or backdoors, and what ports can be closed.
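Assuming nmap is installed, the comparison step described above could look like this sketch: scan a host, parse open ports from nmap's grepable output, and diff them against a stored baseline. The baseline representation is an assumption, not the tool's actual format.

```python
# Sketch of scan-and-compare against an OS baseline using nmap (-oG output).
import subprocess

def open_ports(host: str) -> set:
    """Run nmap and return the set of open TCP ports."""
    out = subprocess.run(["nmap", "-p", "1-1024", "-oG", "-", host],
                         capture_output=True, text=True, check=True).stdout
    ports = set()
    for line in out.splitlines():
        if "Ports:" in line:
            # Entries look like "22/open/tcp//ssh///"
            for entry in line.split("Ports:")[1].split(","):
                num, state = entry.strip().split("/")[0:2]
                if state == "open":
                    ports.add(int(num))
    return ports

def compare_to_baseline(host: str, baseline: set) -> None:
    found = open_ports(host)
    print("unexpected open ports:", sorted(found - baseline))
    print("expected but closed:  ", sorted(baseline - found))

compare_to_baseline("192.0.2.10", baseline={22, 80})  # example host/baseline
```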
Virtual GEOINT Center: C2ISR through an avatar's eyes
NASA Astrophysics Data System (ADS)
Seibert, Mark; Tidbal, Travis; Basil, Maureen; Muryn, Tyler; Scupski, Joseph; Williams, Robert
2013-05-01
As the number of devices collecting and sending data around the world increases, finding ways to visualize and understand those data is becoming more and more of a problem, often described as the problem of "Big Data." The Virtual GEOINT Center (VGC) aims to help solve that problem by providing a way to combine the use of the virtual world with outside tools. Using open-source software such as OpenSim and Blender, the VGC uses a visually rich 3D environment to display the data sent to it. The VGC comprises two major components: the Kinect Minimap and the Geoint Map. The Kinect Minimap uses the Microsoft Kinect and its open-source software to render a miniature display of the people the Kinect detects in front of it. The Geoint Map collects smartphone sensor information from online databases and displays it in real time on a map generated by Google Maps. By combining outside tools and the virtual world, the VGC can help a user "visualize" data and provide additional tools to "understand" the data.
KNIME for reproducible cross-domain analysis of life science data.
Fillbrunn, Alexander; Dietz, Christian; Pfeuffer, Julianus; Rahn, René; Landrum, Gregory A; Berthold, Michael R
2017-11-10
Experiments in the life sciences often involve tools from a variety of domains such as mass spectrometry, next-generation sequencing, or image processing. Passing data between those tools often involves complex scripts for controlling data flow, data transformation, and statistical analysis. Such scripts are not only prone to be platform dependent, they also tend to grow as the experiment progresses and are seldom well documented, a fact that hinders the reproducibility of the experiment. Workflow systems such as the KNIME Analytics Platform aim to solve these problems by providing a platform for connecting tools graphically and guaranteeing the same results on different operating systems. As open source software, KNIME allows scientists and programmers to provide their own extensions to the scientific community. In this review paper we present selected extensions from the life sciences that simplify data exploration, analysis, and visualization and are interoperable due to KNIME's unified data model. Additionally, we name other workflow systems that are commonly used in the life sciences and highlight their similarities and differences to KNIME.
Developing a Graphical User Interface for the ALSS Crop Planning Tool
NASA Technical Reports Server (NTRS)
Koehlert, Erik
1997-01-01
The goal of my project was to create a graphical user interface for a prototype crop scheduler. The crop scheduler was developed by Dr. Jorge Leon and Laura Whitaker for the ALSS (Advanced Life Support System) program. The addition of a system-independent graphical user interface to the crop planning tool will make the application more accessible to a wider range of users and enhance its value as an analysis, design, and planning tool. My presentation will demonstrate the form and functionality of this interface. This graphical user interface allows users to edit system parameters stored in the file system. Data on the interaction of the crew, crops, and waste processing system with the available system resources is organized and labeled. Program output, which is stored in the file system, is also presented to the user in performance-time plots and organized charts. The menu system is designed to guide the user through analysis and decision making tasks, providing some help if necessary. The Java programming language was used to develop this interface in hopes of providing portability and remote operation.
[Diagnostic tools for canine parvovirus infection].
Proksch, A L; Hartmann, K
2015-01-01
Canine parvovirus (CPV) infection is one of the most important and common infectious diseases in dogs, particularly affecting young puppies once maternal antibodies have waned and vaccine-induced antibodies have not yet developed. The mortality rate remains high. Therefore, a rapid and reliable diagnostic tool is essential, both 1) to provide intensive care treatment and 2) to identify virus-shedding animals and thus prevent virus spread. Whilst the detection of antibodies against CPV is considered unsuitable for diagnosing the disease, there are several different methods to directly detect complete virus, virus antigen, or DNA. In addition to testing in commercial laboratories, rapid in-house tests based on ELISA are available worldwide. The specificity of these ELISA rapid in-house tests is reported to be excellent. However, results on sensitivity vary, and high numbers of false-negative results are commonly reported, potentially leading to misdiagnosis. Polymerase chain reaction (PCR) is a very sensitive and specific diagnostic tool. It also provides the opportunity to differentiate vaccine strains from natural infection when sequencing is performed after PCR.
The I4 Online Query Tool for Earth Observations Data
NASA Technical Reports Server (NTRS)
Stefanov, William L.; Vanderbloemen, Lisa A.; Lawrence, Samuel J.
2015-01-01
The NASA Earth Observation System Data and Information System (EOSDIS) delivers an average of 22 terabytes per day of data collected by orbital and airborne sensor systems to end users through an integrated online search environment (the Reverb/ECHO system). Earth observations data collected by sensors on the International Space Station (ISS) are not currently included in the EOSDIS system, and are only accessible through various individual online locations. This increases the effort required by end users to query multiple datasets, and limits the opportunity for data discovery and innovations in analysis. The Earth Science and Remote Sensing Unit of the Exploration Integration and Science Directorate at NASA Johnson Space Center has collaborated with the School of Earth and Space Exploration at Arizona State University (ASU) to develop the ISS Instrument Integration Implementation (I4) data query tool to provide end users a clean, simple online interface for querying both current and historical ISS Earth Observations data. The I4 interface is based on the Lunaserv and Lunaserv Global Explorer (LGE) open-source software packages developed at ASU for query of lunar datasets. In order to avoid mirroring existing databases - and the need to continually sync/update those mirrors - our design philosophy is for the I4 tool to be a pure query engine only. Once an end user identifies a specific scene or scenes of interest, I4 transparently takes the user to the appropriate online location to download the data. The tool consists of two public-facing web interfaces. The Map Tool provides a graphic geobrowser environment where the end user can navigate to an area of interest and select single or multiple datasets to query. The Map Tool displays active image footprints for the selected datasets (Figure 1). Selecting a footprint will open a pop-up window that includes a browse image and a link to available image metadata, along with a link to the online location to order or download the actual data. Search results are either delivered in the form of browse images linked to the appropriate online database, similar to the Map Tool, or they may be transferred within the I4 environment for display as footprints in the Map Tool. Datasets searchable through I4 (http://eol.jsc.nasa.gov/I4_tool) currently include: Crew Earth Observations (CEO) cataloged and uncataloged handheld astronaut photography; Sally Ride EarthKAM; Hyperspectral Imager for the Coastal Ocean (HICO); and the ISS SERVIR Environmental Research and Visualization System (ISERV). The ISS is a unique platform in that it will have multiple users over its lifetime, and that no single remote sensing system has a permanent internal or external berth. The open source I4 tool is designed to enable straightforward addition of new datasets as they become available such as ISS-RapidSCAT, Cloud Aerosol Transport System (CATS), and the High Definition Earth Viewing (HDEV) system. Data from other sensor systems, such as those operated by the ISS International Partners or under the auspices of the US National Laboratory program, can also be added to I4 provided sufficient access to enable searching of data or metadata is available. Commercial providers of remotely sensed data from the ISS may be particularly interested in I4 as an additional means of directing potential customers and clients to their products.
NASA Astrophysics Data System (ADS)
Baldwin, R.; Ansari, S.; Reid, G.; Lott, N.; Del Greco, S.
2007-12-01
The main goal in developing and deploying Geographic Information System (GIS) services at NOAA's National Climatic Data Center (NCDC) is to provide users with simple access to data archives while integrating new and informative climate products. Several systems at NCDC provide a variety of climatic data in GIS formats and/or map viewers. The Online GIS Map Services provide users with data discovery options that flow into detailed product selection maps, which may be queried using standard "region finder" tools or gazetteer (geographical dictionary search) functions. Each tabbed selection offers steps to help users progress through the systems. A series of additional base map layers and data types has been added to provide companion information. New map services include: Severe Weather Data Inventory, Local Climatological Data, Divisional Data, Global Summary of the Day, and Normals/Extremes products. THREDDS Data Server technology is utilized to provide access to gridded multidimensional datasets such as model, satellite, and radar data. This access allows users to download data as a gridded NetCDF file, which is readable by ArcGIS. In addition, users may subset the data for a specific geographic region, time period, height range, or variable prior to download. The NCDC Weather Radar Toolkit (WRT) is a client tool that accesses Weather Surveillance Radar 1988 Doppler (WSR-88D) data locally or remotely from the NCDC archive, the NOAA FTP server, or any URL or THREDDS Data Server. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations, and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. As more users become accustomed to GIS, questions of better, cheaper, faster access soon follow. Expanding use and availability can best be accomplished through standards that promote interoperability. Our GIS-related products provide Open Geospatial Consortium (OGC) compliant Web Map Services (WMS), Web Feature Services (WFS), Web Coverage Services (WCS), and Federal Geographic Data Committee (FGDC) metadata as a complement to the map viewers. KML/KMZ data files (soon to be OGC-compliant specifications) also provide access.
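The THREDDS subsetting workflow described above can be sketched with xarray over OPeNDAP, pulling down only a variable/region/time subset rather than the full gridded file; the URL, variable name, and coordinate names are placeholders, not NCDC's actual catalog.

```python
# Sketch of remote subsetting over OPeNDAP; server URL and names are
# placeholders, and coordinate names/ordering depend on the dataset.
import xarray as xr

url = "https://example.ncdc.noaa.gov/thredds/dodsC/model/ruc/latest"
ds = xr.open_dataset(url)  # lazy: no data transferred yet

subset = (ds["temperature"]
          .sel(time="2007-09-01T12:00:00")
          .sel(lat=slice(25, 50), lon=slice(-125, -65)))  # CONUS window
subset.to_netcdf("temperature_conus.nc")  # only the subset is downloaded
```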
Geoscience data visualization and analysis using GeoMapApp
NASA Astrophysics Data System (ADS)
Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha
2013-04-01
Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs, providing functionality and direct access to data when a network connection is not possible. Hundreds of datasets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data and images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and for quantitatively interrogating data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data, and they can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). Once data are loaded in GeoMapApp, a variety of options are provided to export data and/or 2D/3D visualizations into common formats including grids, images, text files, spreadsheets, etc. Examples of interdisciplinary investigations that make use of GeoMapApp visualization and analysis functionality will be provided.
Additive manufacturing: Toward holistic design
Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.; ...
2017-03-18
Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote the realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Spurred by these capabilities, design optimization tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, and an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.
Rubin, Katrine Hass; Friis-Holmberg, Teresa; Hermann, Anne Pernille; Abrahamsen, Bo; Brixen, Kim
2013-08-01
A large number of risk assessment tools have been developed, but far from all have been validated in external studies, many lack methodological and transparent evidence, and few are integrated into national guidelines. We therefore performed a systematic review to provide an overview of existing valid and reliable risk assessment tools for the prediction of osteoporotic fractures. Additionally, we aimed to determine whether the performance of each tool was sufficient for practical use and, last, to examine whether the complexity of the tools influenced their discriminative power. We searched PubMed, Embase, and Cochrane databases for papers and evaluated these with respect to methodological quality using the Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS) checklist. A total of 48 tools were identified; 20 had been externally validated; however, only six tools had been tested more than once in a population-based setting with acceptable methodological quality. None of the tools performed consistently better than the others, and simple tools (i.e., the Osteoporosis Self-assessment Tool [OST], Osteoporosis Risk Assessment Instrument [ORAI], and Garvan Fracture Risk Calculator [Garvan]) often did as well as or better than more complex tools (i.e., Simple Calculated Risk Estimation Score [SCORE], WHO Fracture Risk Assessment Tool [FRAX], and Qfracture). No studies determined the effectiveness of tools in selecting patients for therapy and thus improving fracture outcomes. High-quality studies of randomized design with population-based cohorts with different case mixes are needed. Copyright © 2013 American Society for Bone and Mineral Research.
Barcoding and Border Biosecurity: Identifying Cyprinid Fishes in the Aquarium Trade
Collins, Rupert A.; Armstrong, Karen F.; Meier, Rudolf; Yi, Youguang; Brown, Samuel D. J.; Cruickshank, Robert H.; Keeling, Suzanne; Johnston, Colin
2012-01-01
Background: Poorly regulated international trade in ornamental fishes poses risks to both biodiversity and economic activity via invasive alien species and exotic pathogens. Border security officials need robust tools to confirm identifications, often requiring hard-to-obtain taxonomic literature and expertise. DNA barcoding offers a potentially attractive tool for quarantine inspection, but has yet to be scrutinised for aquarium fishes. Here, we present a barcoding approach for ornamental cyprinid fishes by: (1) expanding current barcode reference libraries; (2) assessing barcode congruence with morphological identifications under numerous scenarios (e.g. inclusion of GenBank data, presence of singleton species, choice of analytical method); and (3) providing supplementary information to identify difficult species. Methodology/Principal Findings: We sampled 172 ornamental cyprinid fish species from the international trade, and provide data for 91 species currently unrepresented in reference libraries (GenBank/Bold). DNA barcodes were found to be highly congruent with our morphological assignments, achieving success rates of 90–99%, depending on the method used (neighbour-joining monophyly, bootstrap, nearest neighbour, GMYC, percent threshold). Inclusion of data from GenBank (additional 157 spp.) resulted in a more comprehensive library, but at a cost to success rate due to the increased number of singleton species. In addition to DNA barcodes, our study also provides supporting data in the form of specimen images, morphological characters, taxonomic bibliography, preserved vouchers, and nuclear rhodopsin sequences. Using this nuclear rhodopsin data we also uncovered evidence of interspecific hybridisation, and highlighted unrecognised diversity within popular aquarium species, including the endangered Indian barb Puntius denisonii. Conclusions/Significance: We demonstrate that DNA barcoding provides a highly effective biosecurity tool for rapidly identifying ornamental fishes. In cases where DNA barcodes are unable to offer an identification, we improve on previous studies by consolidating supplementary information from multiple data sources, and empower biosecurity agencies to confidently identify high-risk fishes in the aquarium trade. PMID:22276096
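For readers unfamiliar with distance-based barcode identification, the following minimal sketch illustrates the nearest-neighbour assignment idea mentioned above: a query sequence is assigned to the species of its closest reference barcode if the distance falls within a threshold. The sequences, species and 2% threshold are illustrative assumptions, not data from the study.

```python
# Minimal sketch of nearest-neighbour DNA-barcode identification, one of the
# assignment methods compared above. Sequences are assumed to be aligned
# barcodes of equal length; names and data are toy examples.
def p_distance(a: str, b: str) -> float:
    """Proportion of differing sites between two aligned sequences."""
    diffs = sum(1 for x, y in zip(a, b) if x != y)
    return diffs / len(a)

def identify(query: str, reference: dict, threshold: float = 0.02) -> str:
    """Assign the query to the species of its nearest reference sequence,
    if that neighbour lies within a fixed distance threshold (e.g. 2%)."""
    best_species, best_dist = None, float("inf")
    for species, seq in reference.items():
        d = p_distance(query, seq)
        if d < best_dist:
            best_species, best_dist = species, d
    return best_species if best_dist <= threshold else "no confident match"

reference_library = {
    "Puntius denisonii": "ACTGACTGACTG",   # toy 12-bp 'barcodes' for illustration
    "Danio rerio":       "ACTGTCTGACAG",
}
print(identify("ACTGACTGACTA", reference_library))
```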
Mapping with Small UAS: A Point Cloud Accuracy Assessment
NASA Astrophysics Data System (ADS)
Toth, Charles; Jozkow, Grzegorz; Grejner-Brzezinska, Dorota
2015-12-01
Interest in using inexpensive Unmanned Aerial System (UAS) technology for topographic mapping has recently increased significantly. Small UAS platforms equipped with consumer-grade cameras can easily acquire high-resolution aerial imagery, allowing for dense point cloud generation, followed by surface model creation and orthophoto production. In contrast to conventional airborne mapping systems, a UAS has limited ground coverage due to low flying height and limited flying time, yet it offers an attractive alternative to high-performance airborne systems, as the cost of the sensors and platform, and the flight logistics, are relatively low. In addition, UAS are better suited for small-area data acquisitions and for acquiring data in difficult-to-access areas, such as urban canyons or densely built-up environments. The main question with respect to the use of UAS is whether the inexpensive consumer sensors installed on UAS platforms can provide geospatial data quality comparable to that provided by conventional systems. This study aims at a performance evaluation of the current practice of UAS-based topographic mapping by reviewing the practical aspects of sensor configuration, georeferencing and point cloud generation, including comparisons between sensor types and processing tools. The main objective is to provide accuracy characterization and practical information for selecting and using UAS solutions in general mapping applications. The analysis is based on statistical evaluation as well as visual examination of experimental data acquired by a Bergen octocopter with three different image sensor configurations, including a GoPro HERO3+ Black Edition, a Nikon D800 DSLR and a Velodyne HDL-32. In addition, georeferencing data of varying quality were acquired and evaluated. The optical imagery was processed using three commercial point cloud generation tools. Comparing point clouds created by active and passive sensors of differing quality, and processed by different commercial software tools, provides essential information for the performance validation of UAS technology.
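The accuracy characterization described above typically reduces to residual statistics against surveyed ground control. The sketch below shows one common form, per-axis and 3D RMSE over checkpoints; the coordinates are illustrative, and matching of cloud points to checkpoints is assumed already done.

```python
# Minimal sketch of the kind of accuracy statistics used to evaluate a
# UAS-derived point cloud against surveyed checkpoints. Arrays are
# illustrative; the study's actual data and software are not reproduced here.
import numpy as np

# (x, y, z) of surveyed ground checkpoints and the matched cloud points
checkpoints = np.array([[10.0, 20.0, 5.00], [30.0, 40.0, 5.20]])
cloud_points = np.array([[10.02, 19.97, 5.06], [29.95, 40.04, 5.11]])

residuals = cloud_points - checkpoints
rmse_xyz = np.sqrt((residuals ** 2).mean(axis=0))       # per-axis RMSE
rmse_3d = np.sqrt((residuals ** 2).sum(axis=1).mean())  # total 3D RMSE
print(f"RMSE x/y/z: {rmse_xyz}, 3D: {rmse_3d:.3f} m")
```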
Arabidopsis research requires a critical re-evaluation of genetic tools.
Nikonorova, Natalia; Yue, Kun; Beeckman, Tom; De Smet, Ive
2018-06-27
An increasing number of reports question conclusions based on loss-of-function lines that have unexpected genetic backgrounds. In this opinion paper, we urge researchers to meticulously (re)investigate phenotypes retrieved from various genetic backgrounds and be critical regarding some previously drawn conclusions. As an example, we provide new evidence that acr4-2 mutant phenotypes with respect to columella stem cells are due to the lack of ACR4 and not - at least not as a major contributor - to a mutation in QRT1. In addition, we take the opportunity to alert the scientific community about the qrt1-2 background of a large number of Syngenta Arabidopsis Insertion Library (SAIL) T-DNA lines, a feature that is not commonly recognized by Arabidopsis researchers. This qrt1-2 background might have an important impact on the interpretation of the results obtained using these research tools, now and in the past. In conclusion, as a community, we should continuously assess and - if necessary - correct our conclusions based on the large number of (genetic) tools our work is built on. In addition, the positive or negative results of this self-criticism should be made available to the scientific community.
Drug-loaded nanoparticles induce gene expression in human pluripotent stem cell derivatives
NASA Astrophysics Data System (ADS)
Gajbhiye, Virendra; Escalante, Leah; Chen, Guojun; Laperle, Alex; Zheng, Qifeng; Steyer, Benjamin; Gong, Shaoqin; Saha, Krishanu
2013-12-01
Tissue engineering and advanced manufacturing of human stem cells require a suite of tools to control gene expression spatiotemporally in culture. Inducible gene expression systems offer cell-extrinsic control, typically through addition of small molecules, but small molecule inducers typically contain few functional groups for further chemical modification. Doxycycline (DXC), a potent small molecule inducer of tetracycline (Tet) transgene systems, was conjugated to a hyperbranched dendritic polymer (Boltorn H40) and subsequently reacted with polyethylene glycol (PEG). The resulting PEG-H40-DXC nanoparticle exhibited pH-sensitive drug release behavior and successfully controlled gene expression in stem-cell-derived fibroblasts with a Tet-On system. While free DXC inhibited fibroblast proliferation and matrix metalloproteinase (MMP) activity, PEG-H40-DXC nanoparticles maintained higher fibroblast proliferation levels and MMP activity. The results demonstrate that the PEG-H40-DXC nanoparticle system provides an effective tool for controlling gene expression in human stem cell derivatives. Electronic supplementary information (ESI) available: ESI containing 1H NMR spectra and additional fibroblast characterization data. See DOI: 10.1039/c3nr04794f
Reducing Information Overload in Large Seismic Data Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.
2000-08-02
Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g., single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially aid in providing a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics can be identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets. The tools could also be used as a means to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia.
All of these software tools would provide the researcher with unprecedented power without having to learn the intricacies and complexities of relational database systems.
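To make the clustering step concrete, here is a minimal sketch of dendrogram-style grouping of event waveforms: a cross-correlation dissimilarity matrix feeds SciPy's hierarchical clustering, which supports the single, complete and other linkage methods mentioned above. This is an illustrative stand-in for the MatSeis dendrogram tool, not its actual code.

```python
# Minimal sketch: waveforms are compared by normalized cross-correlation and
# grouped by hierarchical clustering, as in the dendrogram tool described above.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
waveforms = rng.standard_normal((5, 200))  # 5 toy event waveforms, 200 samples each

# Build a dissimilarity matrix: 1 - max normalized cross-correlation
n = len(waveforms)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        a, b = waveforms[i], waveforms[j]
        cc = np.correlate(a, b, mode="full") / (np.linalg.norm(a) * np.linalg.norm(b))
        dist[i, j] = dist[j, i] = 1.0 - cc.max()

# Complete-linkage clustering; single or other linkages work the same way
Z = linkage(squareform(dist), method="complete")
clusters = fcluster(Z, t=0.8, criterion="distance")
print(clusters)  # cluster label per waveform
```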
Simulink/PARS Integration Support
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vacaliuc, B.; Nakhaee, N.
2013-12-18
The state of the art for signal processor hardware has far out-paced the development tools for placing applications on that hardware. In addition, signal processors are available in a variety of architectures, each uniquely capable of handling specific types of signal processing efficiently. With these processors becoming smaller and demanding less power, it has become possible to group multiple processors, a heterogeneous set of processors, into single systems. Different portions of the desired problem set can be assigned to different processor types as appropriate. As software development tools do not keep pace with these processors, especially when multiple processors of different types are used, a method is needed to enable software code portability among multiple processors and multiple types of processors along with their respective software environments. Sundance DSP, Inc. has developed a software toolkit called “PARS”, whose objective is to provide a framework that uses suites of tools provided by different vendors, along with modeling tools and a real time operating system, to build an application that spans different processor types. The software language used to express the behavior of the system is a very high level modeling language, “Simulink”, a MathWorks product. ORNL has used this toolkit to effectively implement several deliverables. This CRADA describes this collaboration between ORNL and Sundance DSP, Inc.
Synthetic biology: Novel approaches for microbiology.
Padilla-Vaca, Felipe; Anaya-Velázquez, Fernando; Franco, Bernardo
2015-06-01
In the past twenty years, molecular genetics has created powerful tools for the genetic manipulation of living organisms. Whole genome sequencing has provided the information necessary to assess knowledge of gene function and protein networks. In addition, new tools permit the modification of organisms to perform desired tasks. Gene function analysis is sped up by novel approaches that couple high-throughput data generation and mining. Synthetic biology is an emerging field that uses tools for generating novel gene networks and for whole genome synthesis and engineering. New applications in biotechnological, pharmaceutical and biomedical research are envisioned for synthetic biology. In recent years these new strategies have opened up possibilities for studying gene and genome editing and for creating novel tools for functional studies in viruses, parasites and pathogenic bacteria. There is also the possibility of re-designing organisms to generate vaccine subunits or to produce new pharmaceuticals to combat multi-drug-resistant pathogens. In this review we provide our opinion on the applicability of synthetic biology strategies for functional studies of pathogenic organisms and some applications, such as genome editing and gene network studies, to further comprehend virulence factors and determinants in pathogenic organisms. We also discuss what we consider important ethical issues for this field of molecular biology, especially regarding potential misuse of the new technologies. Copyright© by the Spanish Society for Microbiology and Institute for Catalan Studies.
Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard
2013-01-01
Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and interinstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. Results: The approach provides data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
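As an illustration of the published analysis pipeline, the sketch below picks a dose threshold from a simple ROC sweep and then applies the Fisher exact, Welch t, and Kolmogorov-Smirnov tests named above to synthetic data. It shows the shape of the method only; the authors' actual C#.Net/R implementation is not reproduced here.

```python
# Minimal sketch of the statistical filtering described above: a ROC-derived
# threshold followed by standard tests comparing outcome groups. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
dose = np.concatenate([rng.normal(20, 5, 40), rng.normal(30, 5, 40)])  # metric values
outcome = np.array([0] * 40 + [1] * 40)                                # toxicity flag

# Pick the threshold maximizing Youden's J along a simple ROC sweep
thresholds = np.unique(dose)
j_scores = []
for t in thresholds:
    pred = dose >= t
    tpr = (pred & (outcome == 1)).sum() / (outcome == 1).sum()
    fpr = (pred & (outcome == 0)).sum() / (outcome == 0).sum()
    j_scores.append(tpr - fpr)
best_t = thresholds[int(np.argmax(j_scores))]

# Tests used to confirm a dose-response signal at that threshold
table = [[((dose >= best_t) & (outcome == 1)).sum(), ((dose >= best_t) & (outcome == 0)).sum()],
         [((dose < best_t) & (outcome == 1)).sum(), ((dose < best_t) & (outcome == 0)).sum()]]
_, p_fisher = stats.fisher_exact(table)
_, p_welch = stats.ttest_ind(dose[outcome == 1], dose[outcome == 0], equal_var=False)
_, p_ks = stats.ks_2samp(dose[outcome == 1], dose[outcome == 0])
print(best_t, p_fisher, p_welch, p_ks)
```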
Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard
2013-11-01
With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and interinstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and the software tool for analysis of large data sets.
Failure environment analysis tool applications
NASA Astrophysics Data System (ADS)
Pack, Ginger L.; Wadsworth, David B.
1993-02-01
Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within itself the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT is also useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.
Graph-based optimization of epitope coverage for vaccine antigen design
Theiler, James Patrick; Korber, Bette Tina Marie
2017-01-29
Epigraph is a recently developed algorithm that enables the computationally efficient design of single or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, include a comparison of different heuristics that can be used when graphs are not acyclic, and we describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we also show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.
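The coverage objective can be pictured with a small sketch: score candidate antigens by the population frequency of the k-length potential epitopes they contain and keep the best scorer. The toy sequences and selection below are illustrative simplifications; the Epigraph algorithm itself optimizes paths through a directed graph of such k-mers.

```python
# Minimal sketch of the potential-epitope coverage idea behind Epigraph:
# score candidate antigen sequences by the population frequency of the
# k-length peptides (potential epitopes) they contain. Toy data only.
from collections import Counter

K = 9  # potential epitopes are comparable in length to T-cell epitopes

def kmers(seq: str, k: int = K):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

population = ["MKTAYIAKQR", "MKTAYIAKQL", "MKTVYIAKQR"]  # toy sequence variants
freq = Counter(km for seq in population for km in kmers(seq))

def coverage(antigen: str) -> int:
    """Total population count of potential epitopes contained in the antigen."""
    return sum(freq[km] for km in set(kmers(antigen)))

candidates = ["MKTAYIAKQR", "MKTVYIAKQL"]
best = max(candidates, key=coverage)
print(best, coverage(best))
```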
Multisite Evaluation of a Data Quality Tool for Patient-Level Clinical Data Sets
Huser, Vojtech; DeFalco, Frank J.; Schuemie, Martijn; Ryan, Patrick B.; Shang, Ning; Velez, Mark; Park, Rae Woong; Boyce, Richard D.; Duke, Jon; Khare, Ritu; Utidjian, Levon; Bailey, Charles
2016-01-01
Introduction: Data quality and fitness for analysis are crucial if outputs of analyses of electronic health record data or administrative claims data are to be trusted by the public and the research community. Methods: We describe a data quality analysis tool (called Achilles Heel) developed by the Observational Health Data Sciences and Informatics Collaborative (OHDSI) and compare outputs from this tool as it was applied to 24 large healthcare datasets across seven different organizations. Results: We highlight 12 data quality rules that identified issues in at least 10 of the 24 datasets and provide a full set of 71 rules identified in at least one dataset. Achilles Heel is freely available software that provides a useful starter set of data quality rules with the ability to add additional rules. We also present results of a structured email-based interview of all participating sites that collected qualitative comments about the value of Achilles Heel for data quality evaluation. Discussion: Our analysis represents the first comparison of outputs from a data quality tool that implements a fixed (but extensible) set of data quality rules. Thanks to a common data model, we were able to quickly compare multiple datasets originating from several countries in America, Europe and Asia. PMID:28154833
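To give a flavor of what a patient-level data quality rule looks like, the sketch below flags implausible birth years in a toy person table and reports whether the rule fails. The column names, limits and tolerance are illustrative assumptions, not the actual OHDSI Achilles Heel rules.

```python
# Minimal sketch of one style of data quality rule applied to a patient-level
# dataset: flag implausible values and fail the rule when the error fraction
# exceeds a tolerance. Column names and thresholds are assumptions.
import pandas as pd

persons = pd.DataFrame({
    "person_id": [1, 2, 3, 4],
    "year_of_birth": [1950, 2090, 1987, 1875],  # 2090 and 1875 are implausible
})

def rule_implausible_birth_year(df: pd.DataFrame, lo=1900, hi=2016, tol=0.0):
    bad = df[(df["year_of_birth"] < lo) | (df["year_of_birth"] > hi)]
    frac = len(bad) / len(df)
    return {"rule": "implausible year_of_birth", "violations": len(bad),
            "failed": frac > tol}

print(rule_implausible_birth_year(persons))
```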
A simple web-based tool to compare freshwater fish data collected using AFS standard methods
Bonar, Scott A.; Mercado-Silva, Norman; Rahr, Matt; Torrey, Yuta T.; Cate, Averill
2016-01-01
The American Fisheries Society (AFS) recently published Standard Methods for Sampling North American Freshwater Fishes. Enlisting the expertise of 284 scientists from 107 organizations throughout Canada, Mexico, and the United States, this text was developed to facilitate comparisons of fish data across regions or time. Here we describe a user-friendly web tool that automates among-sample comparisons in individual fish condition, population length-frequency distributions, and catch per unit effort (CPUE) data collected using AFS standard methods. Currently, the web tool (1) provides instantaneous summaries of almost 4,000 data sets of condition, length frequency, and CPUE of common freshwater fishes collected using standard gears in 43 states and provinces; (2) is easily appended with new standardized field data to update subsequent queries and summaries; (3) compares fish data from a particular water body with continent, ecoregion, and state data summaries; and (4) provides additional information about AFS standard fish sampling including benefits, ongoing validation studies, and opportunities to comment on specific methods. The web tool—programmed in a PHP-based Drupal framework—was supported by several AFS Sections, agencies, and universities and is freely available from the AFS website and fisheriesstandardsampling.org. With widespread use, the online tool could become an important resource for fisheries biologists.
Graph-based optimization of epitope coverage for vaccine antigen design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theiler, James Patrick; Korber, Bette Tina Marie
Epigraph is a recently developed algorithm that enables the computationally efficient design of single or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, include a comparison of different heuristics that can be used when graphs are not acyclic, and we describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we also show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.
Failure environment analysis tool applications
NASA Technical Reports Server (NTRS)
Pack, Ginger L.; Wadsworth, David B.
1993-01-01
Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within itself the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT is also useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.
Failure environment analysis tool applications
NASA Technical Reports Server (NTRS)
Pack, Ginger L.; Wadsworth, David B.
1994-01-01
Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within itself the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT is also useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.
Effectiveness of classroom response systems within an active learning environment.
Welch, Susan
2013-11-01
In nursing education, the inclusion of pedagogical tools is necessary to transform Millennial classrooms. One such pedagogical tool currently offered is classroom response systems (CRS). The purpose of this study was to evaluate the effectiveness of CRS as a pedagogical tool in improving nursing students' examination performance within an active learning environment. A pretest-posttest design was used to determine whether there was a relationship between the use of CRS (independent variable) and nursing students' examination performance in a first-year Professional Practice course (dependent variable). Paired t tests revealed no greater improvement in posttest scores. Therefore, the use of CRS technology was not effective in increasing nursing students' examination scores in the Professional Practice course. Additional research is needed to provide adequate understanding of the effectiveness of CRS within the nursing education classroom. Copyright 2013, SLACK Incorporated.
Laing, Karen; Baumgartner, Katherine
2005-01-01
Many endoscopy units are looking for ways to improve their efficiency without increasing the number of staff, purchasing additional equipment, or making patients feel as if they have been rushed through the care process. To accomplish this, a few hospitals have looked to other industries for help. Recently, "lean" methods and tools from the manufacturing industry have been applied successfully in health care systems and have proven to be an effective way to eliminate waste and redundancy in workplace processes. In service organizations, "lean" methods and tools focus on providing the most efficient and effective flow of services and products. This article describes the journey of one endoscopy department within a community hospital to illustrate the application of "lean" methods and tools and the results.
Bigger data, collaborative tools and the future of predictive drug discovery
NASA Astrophysics Data System (ADS)
Ekins, Sean; Clark, Alex M.; Swamidass, S. Joshua; Litterman, Nadia; Williams, Antony J.
2014-10-01
Over the past decade we have seen a growth in the provision of chemistry data and cheminformatics tools as either free websites or software-as-a-service commercial offerings. These have transformed how we find molecule-related data and use such tools in our research. There have also been efforts to improve collaboration between researchers, either openly or through secure transactions using commercial tools. A major challenge in the future will be how such databases and software approaches handle larger amounts of data as they accumulate from high-throughput screening, and how they enable the user to draw insights, make predictions and move projects forward. We now discuss how information from some drug discovery datasets can be made more accessible and how privacy of data should not overwhelm the desire to share it at an appropriate time with collaborators. We also discuss additional software tools that could be made available and provide our thoughts on the future of predictive drug discovery in this age of big data. We use some examples from our own research on neglected diseases, collaborations, mobile apps and algorithm development to illustrate these ideas.
Advanced engineering environment collaboration project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamph, Jane Ann; Pomplun, Alan R.; Kiba, Grant W.
2008-12-01
The Advanced Engineering Environment (AEE) is a model for an engineering design and communications system that will enhance project collaboration throughout the nuclear weapons complex (NWC). Sandia National Laboratories and Parametric Technology Corporation (PTC) worked together on a prototype project to evaluate the suitability of a portion of PTC's Windchill 9.0 suite of data management, design and collaboration tools as the basis for an AEE. The AEE project team implemented Windchill 9.0 development servers in both classified and unclassified domains and used them to test and evaluate the Windchill tool suite relative to the needs of the NWC using weapons project use cases. A primary deliverable was the development of a new real time collaborative desktop design and engineering process using PDMLink (data management tool), Pro/Engineer (mechanical computer aided design tool) and ProductView Lite (visualization tool). Additional project activities included evaluations of PTC's electrical computer aided design, visualization, and engineering calculations applications. This report documents the AEE project work to share information and lessons learned with other NWC sites. It also provides PTC with recommendations for improving their products for NWC applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plimpton, Steve; Jones, Matt; Crozier, Paul
2006-01-01
Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3d, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.
MFCompress: a compression tool for FASTA and multi-FASTA data.
Pinho, Armando J; Pratas, Diogo
2014-01-01
The data deluge phenomenon is becoming a serious problem in most genomic centers. To alleviate it, general purpose tools, such as gzip, are used to compress the data. However, although pervasive and easy to use, these tools fall short when the intention is to reduce the data as much as possible, for example, for medium- and long-term storage. A number of algorithms have been proposed for the compression of genomics data, but unfortunately only a few of them have been made available as usable and reliable compression tools. In this article, we describe one such tool, MFCompress, specially designed for the compression of FASTA and multi-FASTA files. In comparison to gzip and applied to multi-FASTA files, MFCompress can provide additional average compression gains of almost 50%, i.e., it potentially doubles the available storage, although at the cost of some more computation time. On highly redundant datasets, and in comparison with gzip, 8-fold size reductions have been obtained. Both source code and binaries for several operating systems are freely available for non-commercial use at http://bioinformatics.ua.pt/software/mfcompress/.
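As a rough way to reproduce the kind of comparison reported above, the sketch below measures gzip's reduction on a FASTA file using only the Python standard library; a specialized compressor's output size would be compared the same way. The input filename is a placeholder, and MFCompress's own command line is deliberately not shown.

```python
# Minimal sketch of measuring compression gain on a FASTA file with gzip,
# the baseline used in the comparison above. The filename is hypothetical.
import gzip

fasta = "genome.fa"  # hypothetical input file

with open(fasta, "rb") as f:
    raw = f.read()
gz_size = len(gzip.compress(raw))

print(f"original: {len(raw)} bytes, gzip: {gz_size} bytes "
      f"({100 * (1 - gz_size / len(raw)):.1f}% reduction)")
# A specialized FASTA compressor would be run externally (e.g. via
# subprocess) and its output size compared with gz_size in the same way.
```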
NASA Technical Reports Server (NTRS)
Balikhin, M. A.; Rodriguez, J. V.; Boynton, R. J.; Walker, S. N.; Aryan, Homayon; Sibeck, D. G.; Billings, S. A.
2016-01-01
Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field Bz observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.
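For reference, the Heidke skill score used above can be computed directly from a 2x2 contingency table of forecast versus observed high-flux events, as in this minimal sketch with illustrative counts.

```python
# Minimal sketch of the Heidke skill score (HSS) for rare-event forecasts,
# computed from a 2x2 contingency table. Counts are illustrative only.
def heidke_skill_score(hits: int, false_alarms: int, misses: int, correct_negatives: int) -> float:
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    # Number of correct forecasts expected by chance, from the marginals
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    return (a + d - expected) / (n - expected)

print(heidke_skill_score(hits=20, false_alarms=5, misses=10, correct_negatives=300))
```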
Solernou, Albert; Hanson, Benjamin S; Richardson, Robin A; Welch, Robert; Read, Daniel J; Harlen, Oliver G; Harris, Sarah A
2018-03-01
Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.
Requirements Development for the NASA Advanced Engineering Environment (AEE)
NASA Technical Reports Server (NTRS)
Rogers, Eric; Hale, Joseph P.; Zook, Keith; Gowda, Sanjay; Salas, Andrea O.
2003-01-01
The requirements development process for the Advanced Engineering Environment (AEE) is presented. This environment has been developed to allow NASA to perform independent analysis and design of space transportation architectures and technologies. Given the highly collaborative and distributed nature of AEE, a variety of organizations are involved in the development, operations and management of the system. Furthermore, there are additional organizations involved representing external customers and stakeholders. Thorough coordination and effective communication are essential to translate desired expectations of the system into requirements. Functional, verifiable requirements for this (and indeed any) system are necessary to fulfill several roles. Requirements serve as a contractual tool, configuration management tool, and as an engineering tool, sometimes simultaneously. The role of requirements as an engineering tool is particularly important because a stable set of requirements for a system provides a common framework of system scope and characterization among team members. Furthermore, the requirements provide the basis for checking completion of system elements and form the basis for system verification. Requirements are at the core of systems engineering. The AEE Project has undertaken a thorough process to translate the desires and expectations of external customers and stakeholders into functional system-level requirements that are captured with sufficient rigor to allow development planning, resource allocation and system-level design, development, implementation and verification. These requirements are maintained in an integrated, relational database that provides traceability to governing Program requirements and also to verification methods and subsystem-level requirements.
Balikhin, M A; Rodriguez, J V; Boynton, R J; Walker, S N; Aryan, H; Sibeck, D G; Billings, S A
2016-01-01
Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field Bz observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.
Infrared Spectroscopy as a Chemical Fingerprinting Tool
NASA Technical Reports Server (NTRS)
Huff, Tim; Munafo, Paul M. (Technical Monitor)
2002-01-01
Infrared (IR) spectroscopy is a powerful analytical tool in the chemical fingerprinting of materials. The technique is rapid, reproducible and usually non-invasive. With the appropriate accessories, the technique can be used to examine samples in either a solid, liquid or gas phase. Solid samples of varying sizes and shapes may be used, and with the addition of microscopic IR (microspectroscopy) capabilities, minute materials such as single fibers and threads may be examined. With the addition of appropriate software, microspectroscopy can be used for automated discrete point or compositional surface area mapping, with the latter providing a means to record changes in the chemical composition of a material surface over a defined area. Both aqueous and non-aqueous free-flowing solutions can be analyzed using appropriate IR techniques, as can viscous liquids such as heavy oils and greases. Due to the ability to characterize gaseous samples, IR spectroscopy can also be coupled with thermal processes such as thermogravimetric (TG) analyses to provide both thermal and chemical data in a single run. In this configuration, solids (or liquids) heated in a TG analyzer undergo decomposition, with the evolving gases directed into the IR spectrometer. Thus, information is provided on the thermal properties of a material and the order in which its chemical constituents are broken down during incremental heating. Specific examples of these varied applications will be cited, with data interpretation and method limitations further discussed.
Habitability research priorities for the International Space Station and beyond.
Whitmore, M; Adolf, J A; Woolford, B J
2000-09-01
Advanced technology and the desire to explore space have resulted in increasingly longer manned space missions. Long Duration Space Flights (LDSF) have provided a considerable amount of scientific research on the ability of humans to adapt and function in microgravity environments. In addition, studies conducted in analogous environments, such as winter-over expeditions in Antarctica, have complemented the scientific understanding of human performance in LDSF. These findings indicate long duration missions may take a toll on the individual, both physiologically and psychologically, with potential impacts on performance. Significant factors in any manned LDSF are habitability, workload and performance. They are interrelated and influence one another, and therefore necessitate an integrated research approach. An integral part of this approach will be identifying and developing tools not only for assessment of habitability, workload, and performance, but also for prediction of these factors as well. In addition, these tools will be used to identify and provide countermeasures to minimize decrements and maximize mission success. The purpose of this paper is to identify research goals and methods for the International Space Station (ISS) in order to identify critical factors and level of impact on habitability, workload, and performance, and to develop and validate countermeasures. Overall, this approach will provide the groundwork for creating an optimal environment in which to live and work onboard ISS as well as preparing for longer planetary missions.
Research Priorities for the International Space Station and Beyond
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Adolf, Jurine A.; Woolford, Barbara J.
1999-01-01
Advanced technology and the desire to explore space have resulted in increasingly longer manned space missions. Long Duration Space Flights (LDSF) have provided a considerable amount of scientific research on the ability of humans to adapt and function in microgravity environments. In addition, studies conducted in analogous environments, such as winter-over expeditions in Antarctica, have complemented the scientific understanding of human performance in LDSF. These findings indicate long duration missions may take a toll on the individual, both physiologically and psychologically, with potential impacts on performance. Significant factors in any manned LDSF are habitability, workload and performance. They are interrelated and influence one another, and therefore necessitate an integrated research approach. An integral part of this approach will be identifying and developing tools not only for assessment of habitability, workload, and performance, but also for prediction of these factors as well. In addition, these tools will be used to identify and provide countermeasures to minimize decrements and maximize mission success. The purpose of this paper is to identify research goals and methods for the International Space Station (ISS) in order to identify critical factors and level of impact on habitability, workload, and performance, and to develop and validate countermeasures. Overall, this approach will provide the groundwork for creating an optimal environment in which to live and work onboard ISS as well as preparing for longer planetary missions.
Tatem, Kathleen S; Quinn, James L; Phadke, Aditi; Yu, Qing; Gordish-Dressman, Heather; Nagaraju, Kanneboyina
2014-09-29
The open field activity monitoring system comprehensively assesses the locomotor and behavioral activity levels of mice. It is a useful tool for assessing locomotive impairment in animal models of neuromuscular disease and the efficacy of therapeutic drugs that may improve locomotion and/or muscle function. The open field activity measurement provides a different measure than muscle strength, which is commonly assessed by grip strength measurements. When used with additional outcome measures, it can also show how drugs may affect other body systems. In addition, measures such as total distance traveled mirror the 6 min walk test, a clinical trial outcome measure. However, open field activity monitoring is also associated with significant challenges: open field activity measurements vary according to animal strain, age, sex, and circadian rhythm. In addition, room temperature, humidity, lighting, noise, and even odor can affect assessment outcomes. Overall, this manuscript provides a well-tested and standardized open field activity SOP for preclinical trials in animal models of neuromuscular diseases. We provide a discussion of important considerations, typical results, and data analysis, and detail the strengths and weaknesses of open field testing. In addition, we provide recommendations for optimal study design when using open field activity in a preclinical trial.
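As a small illustration of one outcome measure named above, the sketch below computes total distance traveled from a sampled (x, y) trajectory; the coordinates are illustrative, and a real system would read them from its tracking output.

```python
# Minimal sketch of deriving total distance traveled, an open-field outcome
# measure mirroring the 6 min walk test, from tracked (x, y) positions.
import numpy as np

xy = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 1.5], [2.5, 1.0]])  # positions in cm
steps = np.diff(xy, axis=0)                       # displacement between samples
total_distance = np.sqrt((steps ** 2).sum(axis=1)).sum()
print(f"total distance traveled: {total_distance:.2f} cm")
```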
Fundamental Aeronautics Program: Overview of Project Work in Supersonic Cruise Efficiency
NASA Technical Reports Server (NTRS)
Castner, Raymond
2011-01-01
The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2011) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.
Toyz: A framework for scientific analysis of large datasets and astronomical images
NASA Astrophysics Data System (ADS)
Moolekamp, F.; Mamajek, E.
2015-11-01
As the size of images and data products derived from astronomical data continues to increase, new tools are needed to visualize and interact with those data in a meaningful way. Motivated by our own astronomical images taken with the Dark Energy Camera (DECam), we present Toyz, an open source Python package for viewing and analyzing images and data stored on a remote server or cluster. Users connect to the Toyz web application via a web browser, making it a convenient tool for students to visualize and interact with astronomical data without having to install any software on their local machines. It also provides researchers with an easy-to-use tool that allows them to browse files on a server, quickly view very large images (>2 Gb) taken with DECam and other large-FOV cameras, and create their own visualization tools that can be added as extensions to the default Toyz framework.
NASA Technical Reports Server (NTRS)
Castner, Ray
2012-01-01
The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2012) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.
Uranus: a rapid prototyping tool for FPGA embedded computer vision
NASA Astrophysics Data System (ADS)
Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.
2007-01-01
The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate, and fix design problems. This work presents Uranus, a software tool for the simulation and evaluation of image processing algorithms, with support for migrating them to an FPGA environment for algorithm acceleration and embedded processing. The tool includes an integrated library of previously coded software operators and provides the necessary support to read and display image sequences as well as video files. The user can combine the precompiled soft-operators in a high-level process chain and code his or her own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected to a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.
DATA-MEAns: an open source tool for the classification and management of neural ensemble recordings.
Bonomini, María P; Ferrandez, José M; Bolea, Jose Angel; Fernandez, Eduardo
2005-10-30
The number of laboratories using techniques that acquire simultaneous recordings from as many units as possible is increasing considerably. However, the development of tools to analyse this multi-neuronal activity generally lags behind the development of the tools used to acquire the data. Moreover, data exchange between research groups using different multielectrode acquisition systems is hindered by commercial constraints such as proprietary file structures, high-priced licenses, and restrictive policies on intellectual rights. This paper presents free, open-source software for the classification and management of neural ensemble data. The main goal is to provide a graphical user interface that links the experimental data to a basic set of routines for analysis, visualization, and classification in a consistent framework. To facilitate adaptation and extension, as well as the addition of new routines, tools, and algorithms for data analysis, the source code and documentation are freely available.
SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool
Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda
2008-01-01
Background: It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. The systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results: This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady-state analysis, robustness analysis, and local and global sensitivity analysis of SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficients, Sobol's method, and weighted averages of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion: SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080
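To make one of the named global methods concrete, here is a hedged sketch of partial rank correlation coefficients (PRCC) applied to a toy model. It is not SBML-SAT code; the three "rate constants" and the model are invented for illustration.

```python
# PRCC sketch: rank-transform inputs and output, then for each parameter
# correlate the residuals after regressing out the other parameters' ranks.
import numpy as np
from scipy.stats import rankdata

def prcc(X, y):
    R = np.column_stack([rankdata(c) for c in X.T])   # rank-transform inputs
    ry = rankdata(y)                                  # rank-transform output
    out = []
    for j in range(R.shape[1]):
        A = np.column_stack([np.delete(R, j, axis=1), np.ones(len(ry))])
        res_x = R[:, j] - A @ np.linalg.lstsq(A, R[:, j], rcond=None)[0]
        res_y = ry - A @ np.linalg.lstsq(A, ry, rcond=None)[0]
        out.append(np.corrcoef(res_x, res_y)[0, 1])
    return np.array(out)

rng = np.random.default_rng(0)
X = rng.uniform(0.1, 2.0, size=(500, 3))          # 3 toy "rate constants"
y = X[:, 0] / (X[:, 1] + 0.5) + 0.01 * X[:, 2]    # toy model output
print(prcc(X, y))   # first two parameters dominate, third is near zero
```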
Ayadurai, Shamala; Sunderland, V Bruce; Tee, Lisa Bg; Md Said, Siti Norlina; Hattingh, H Laetitia
2018-06-07
A review of pharmacist diabetes intervention studies revealed a lack of structured process in providing diabetes care, which consequently produced varied results, ranging from substantial to minimal improvements. This study aimed to determine the effectiveness of a structured clinical guidelines tool, the Simpler™ tool, in the delivery of diabetes care. The primary outcome was significant improvement in HbA1c (glycated haemoglobin). Secondary outcomes were improved lipid profiles and blood pressure (BP). A 6-month, parallel, multi-centre, two-arm, randomised controlled trial involving 14 pharmacists at seven primary care clinics was conducted in Johor, Malaysia. Pharmacists without prior specialised diabetes training were trained to use the tool. Patients were randomised within each centre to: 1) Simpler™ care (SC), receiving care from pharmacists who applied the tool (n=55); or 2) usual care (UC), receiving usual care and dispensing services (n=69). SC reduced HbA1c significantly, by 1.59% (95%CI: -2.2, -0.9), compared with 0.25% (95%CI: -0.62, 0.11) in UC (P<0.001). In addition, SC patients had significantly improved systolic BP (-6.28 mmHg (95%CI: -10.5, -2.0), p=0.005). The proportion of patients who reached the Malaysian guideline treatment goals was significantly higher in the SC arm (14.3% vs 1.5% for HbA1c, p=0.020; 80% vs 42% for systolic BP, p=0.001; 60.5% vs 40.4% for LDL cholesterol, p=0.046). Use of the Simpler™ tool facilitated delivery of comprehensive evidence-based diabetes management and significantly improved clinical outcomes. The Simpler™ tool supported pharmacists in providing enhanced structured diabetes care. This article is protected by copyright. All rights reserved.
Practices to prevent venous thromboembolism: a brief review
Lau, Brandyn D; Haut, Elliott R
2014-01-01
Background: Venous thromboembolism (VTE) is a common cause of preventable harm for hospitalised patients. Over the past decade, numerous intervention types have been implemented in attempts to improve the prescription of VTE prophylaxis in hospitals, with varying degrees of success. We reviewed key articles to assess the efficacy of different types of interventions to improve prescription of VTE prophylaxis for hospitalised patients. Methods: We conducted a search of MEDLINE for key studies published between 2001 and 2012 of interventions employing education, paper based tools, computerised tools, real time audit and feedback, or combinations of intervention types to improve prescription of VTE prophylaxis for patients in hospital settings. Process outcomes of interest were prescription of any VTE prophylaxis and best practice VTE prophylaxis. Clinical outcomes of interest were any VTE and potentially preventable VTE, defined as VTE occurring in patients not prescribed appropriate prophylaxis. Results: 16 articles were included in this review. Two studies employed education only, four implemented paper based tools, four used computerised tools, two evaluated audit and feedback strategies, and four studies used combinations of intervention types. Individual modalities result in improved prescription of VTE prophylaxis; however, the greatest and most sustained improvements were those that combined education with computerised tools. Conclusions: Many intervention types have proven effective to different degrees in improving VTE prevention. Provider education is likely a required additional component and should be combined with other intervention types. Active mandatory tools are likely more effective than passive ones. Information technology tools that are well integrated into provider workflow, such as alerts and computerised clinical decision support, can improve best practice prophylaxis use and prevent patient harm resulting from VTE. PMID:23708438
Dande, Payal; Samant, Purva
2018-01-01
Tuberculosis [TB] has afflicted numerous nations in the world. As per a report by the World Health Organization [WHO], an estimated 1.4 million TB deaths occurred in 2015, with an additional 0.4 million deaths from TB disease among people living with HIV. Most TB deaths could be prevented if the disease were detected at an early stage. The existing diagnostic processes, such as blood tests or sputum tests, are not only tedious but also take a long time to analyse and cannot differentiate between the different drug-resistant stages of TB. The search for newer, prompter methods of disease detection has been aided by the latest Artificial Intelligence [AI] tools. The Artificial Neural Network [ANN] is one of the important tools being used widely in the diagnosis and evaluation of medical conditions. This review provides a brief introduction to the various AI tools used in TB detection and gives a detailed description of the utilization of ANNs as an efficient diagnostic technique. The paper also provides a critical assessment of ANNs and of the existing techniques for the diagnosis of TB. Researchers and practitioners in the field look forward to using ANNs and other upcoming AI tools, such as fuzzy logic, genetic algorithms, and artificial-intelligence simulation, as promising current and future technologies for tackling the global menace of tuberculosis. The latest advancements in the diagnostic field include the combined use of ANNs with various other AI tools, such as fuzzy logic, which has increased the efficacy and specificity of the diagnostic techniques. Copyright © 2017 Elsevier Ltd. All rights reserved.
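For readers unfamiliar with the ANN approach the review surveys, the following is a self-contained, hedged sketch: a small feedforward network classifying synthetic "diagnostic feature" vectors with scikit-learn. The features and labels are simulated; real studies would use clinical, microbiological, or imaging features.

```python
# Illustrative ANN classifier on simulated data -- not from any cited study.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(400, 8))                      # 8 synthetic features
# Synthetic "diagnosis" driven by two of the features plus noise.
y = (X[:, 0] + 0.8 * X[:, 3] + rng.normal(scale=0.5, size=400) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```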
Sharma, Puneet; Pienaar, Ilse S
2014-11-01
The technology toolbox by which neural elements can be selectively manipulated in vertebrate and invertebrate brains has expanded greatly in recent years, and now includes sophisticated optogenetics and novel designer receptors. Application of such tools allows researchers to ascertain whether a particular behavioural phenotype is associated with a specific neural circuit. Optogenetics has already found application in the study of Parkinson's disease (PD) circuitry and therapies, whereas novel designer receptors hold promise for advancing current understanding of the mechanisms underlying parkinsonian motor and non-motor symptoms. In particular, this new generation of research tools provides a method by which significant insights can be gained into the brain networks implicated in diseases such as PD. These tools also promise to assist in the development of novel therapies targeting degenerated dopaminergic and non-dopaminergic neurons in the diseased basal ganglia system of PD patients, to provide symptomatic relief or even reverse neurodegenerative processes. The present review discusses how such technologies, in conjunction with sensitive behavioural assays, continue to advance our knowledge of the circuit and signalling properties inherent to PD pathology. The discussion also highlights how these experimental approaches open additional exploratory avenues that may result in dramatically improved therapeutic options for PD patients. Copyright © 2014 Elsevier Ltd. All rights reserved.
Accessing the public MIMIC-II intensive care relational database for clinical research
2013-01-01
Background: The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. Results: QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge “Predicting mortality of ICU Patients”. Conclusions: QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database. PMID:23302652
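For the locally installed VM route described above, queries go straight to the PostgreSQL database. The sketch below is illustrative only: the connection details and the table name are placeholders and may not match the actual MIMIC-II schema.

```python
# Hypothetical direct query against a local MIMIC-II PostgreSQL instance.
import psycopg2

conn = psycopg2.connect(dbname="mimic2", user="mimic",
                        password="...", host="localhost")  # placeholders
with conn, conn.cursor() as cur:        # "with conn" wraps a transaction
    cur.execute(
        """
        SELECT subject_id, COUNT(*) AS n_stays
        FROM icustay_detail            -- placeholder table name
        GROUP BY subject_id
        ORDER BY n_stays DESC
        LIMIT 10;
        """
    )
    for subject_id, n_stays in cur.fetchall():
        print(subject_id, n_stays)
```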
Advances in magnetic resonance neuroimaging techniques in the evaluation of neonatal encephalopathy.
Panigrahy, Ashok; Blüml, Stefan
2007-02-01
Magnetic resonance (MR) imaging has become an essential tool in the evaluation of neonatal encephalopathy. Magnetic resonance-compatible neonatal incubators allow sick neonates to be transported to the MR scanner, and neonatal head coils can improve the signal-to-noise ratio, which is critical for advanced MR imaging techniques. Refinements of conventional imaging techniques include the use of PROPELLER techniques for motion correction. Magnetic resonance spectroscopic imaging and diffusion tensor imaging provide quantitative assessment of both brain development and brain injury in the newborn with respect to metabolite abnormalities and hypoxic-ischemic injury. Knowledge of normal developmental changes in MR spectroscopy metabolite concentrations and diffusion tensor metrics is essential for interpreting pathological cases. Perfusion MR and functional MR can provide additional physiological information. Both MR spectroscopy and diffusion tensor imaging can provide additional information in the differential diagnosis of neonatal encephalopathy, including perinatal white matter injury, hypoxic-ischemic brain injury, metabolic disease, infection, and birth injury.
A Categorization of Dynamic Analyzers
NASA Technical Reports Server (NTRS)
Lujan, Michelle R.
1997-01-01
Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check the consistency of programs with respect to variables. Dynamic analyzers, in contrast, depend on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on its syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) metrics; 2) models; and 3) monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques included under it. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding a program's dynamic behavior from different perspectives through analysis of the input/output data.
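As a toy illustration of the "monitor" category (invented for this rewrite, not taken from the paper), the following Python decorator checks a caller-supplied criterion at run time. The violation surfaces only when the program executes with offending input, which is exactly what distinguishes dynamic from static analysis.

```python
# A minimal runtime monitor: checks a precondition on every call.
import functools

def monitor(precondition, message="precondition violated"):
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            if not precondition(*args, **kwargs):
                raise AssertionError(f"{fn.__name__}: {message}")
            return fn(*args, **kwargs)
        return inner
    return wrap

@monitor(lambda xs: len(xs) > 0, "empty input")
def mean(xs):
    return sum(xs) / len(xs)

print(mean([1, 2, 3]))       # passes the monitor
try:
    mean([])                 # violates the criterion at run time
except AssertionError as err:
    print(err)               # "mean: empty input"
```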
A community resource benchmarking predictions of peptide binding to MHC-I molecules.
Peters, Bjoern; Bui, Huynh-Hoa; Frankild, Sune; Nielson, Morten; Lundegaard, Claus; Kostem, Emrah; Basch, Derek; Lamberth, Kasper; Harndahl, Mikkel; Fleri, Ward; Wilson, Stephen S; Sidney, John; Lund, Ole; Buus, Soren; Sette, Alessandro
2006-06-09
Recognition of peptides bound to major histocompatibility complex (MHC) class I molecules by T lymphocytes is an essential part of immune surveillance. Each MHC allele has a characteristic peptide binding preference, which can be captured in prediction algorithms, allowing for the rapid scan of entire pathogen proteomes for peptides likely to bind MHC. Here we make public a large set of 48,828 quantitative peptide-binding affinity measurements relating to 48 different mouse, human, macaque, and chimpanzee MHC class I alleles. We use these data to establish a set of benchmark predictions with one neural network method and two matrix-based prediction methods extensively utilized in our groups. In general, the neural network outperforms the matrix-based predictions, mainly because of its ability to generalize even from a small amount of data. We also retrieved predictions from tools publicly available on the internet. While differences in the data used to generate these predictions hamper direct comparisons, we do conclude that tools based on combinatorial peptide libraries perform remarkably well. The transparent prediction evaluation on this dataset provides tool developers with a benchmark for comparison of newly developed prediction methods. In addition, to generate and evaluate our own prediction methods, we have established an easily extensible web-based prediction framework that allows automated side-by-side comparisons of prediction methods implemented by experts. This is an advance over the current practice of tool developers having to generate reference predictions themselves, which can lead to underestimating the performance of prediction methods they are not as familiar with as their own. The overall goal of this effort is to provide a transparent prediction evaluation allowing bioinformaticians to identify promising features of prediction methods and providing guidance to immunologists regarding the reliability of prediction tools.
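A minimal sketch of the matrix-based prediction style benchmarked here: a position-specific scoring matrix assigns each residue of a 9-mer a weight, and the peptide's score is the positionwise sum. The matrix values below are randomly generated for illustration, not trained on the published measurements.

```python
# Toy position-specific scoring matrix (PSSM) scoring of 9-mer peptides.
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
rng = np.random.default_rng(1)
pssm = rng.normal(size=(9, len(AA)))     # 9 positions x 20 amino acids

def score_peptide(pep):
    """Sum the per-position weights for each residue (log-odds style)."""
    assert len(pep) == pssm.shape[0]
    return sum(pssm[i, AA.index(aa)] for i, aa in enumerate(pep))

# Scan a tiny "proteome" of 9-mers and rank by predicted binding score.
peptides = ["SIINFEKLV", "GILGFVFTL", "LLFGYPVYV"]
ranked = sorted(peptides, key=score_peptide, reverse=True)
print([(p, round(score_peptide(p), 2)) for p in ranked])
```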
Data Independent Acquisition analysis in ProHits 4.0.
Liu, Guomin; Knight, James D R; Zhang, Jian Ping; Tsou, Chih-Chiang; Wang, Jian; Lambert, Jean-Philippe; Larsen, Brett; Tyers, Mike; Raught, Brian; Bandeira, Nuno; Nesvizhskii, Alexey I; Choi, Hyungwon; Gingras, Anne-Claude
2016-10-21
Affinity purification coupled with mass spectrometry (AP-MS) is a powerful technique for the identification and quantification of physical interactions. AP-MS requires careful experimental design, appropriate control selection and quantitative workflows to successfully identify bona fide interactors amongst a large background of contaminants. We previously introduced ProHits, a Laboratory Information Management System for interaction proteomics, which tracks all samples in a mass spectrometry facility, initiates database searches and provides visualization tools for spectral counting-based AP-MS approaches. More recently, we implemented Significance Analysis of INTeractome (SAINT) within ProHits to provide scoring of interactions based on spectral counts. Here, we provide an update to ProHits to support Data Independent Acquisition (DIA) with identification software (DIA-Umpire and MSPLIT-DIA), quantification tools (through DIA-Umpire, or externally via targeted extraction), and assessment of quantitative enrichment (through mapDIA) and scoring of interactions (through SAINT-intensity). With additional improvements, notably support of the iProphet pipeline, facilitated deposition into ProteomeXchange repositories and enhanced export and viewing functions, ProHits 4.0 offers a comprehensive suite of tools to facilitate affinity proteomics studies. It remains challenging to score, annotate and analyze proteomics data in a transparent manner. ProHits was previously introduced as a LIMS to enable storing, tracking and analysis of standard AP-MS data. In this revised version, we expand ProHits to include integration with a number of identification and quantification tools based on Data-Independent Acquisition (DIA). ProHits 4.0 also facilitates data deposition into public repositories, and the transfer of data to new visualization tools. Copyright © 2016 Elsevier B.V. All rights reserved.
Erickson, Richard A.; Thogmartin, Wayne E.; Szymanski, Jennifer A.
2014-01-01
Background: Myotis species of bats such as the Indiana Bat and Little Brown Bat are facing population declines because of White-nose syndrome (WNS). These species also face threats from anthropogenic activities such as wind energy development. Population models may be used to provide insights into threats facing these species. We developed a population model, BatTool, as an R package to help decision makers and natural resource managers examine factors influencing the dynamics of these species. The R package includes two components: 1) a deterministic and stochastic model that are accessible from the command line and 2) a graphical user interface (GUI). Results: BatTool is an R package allowing natural resource managers and decision makers to understand Myotis spp. population dynamics. Through the use of a GUI, the model allows users to understand how WNS and other take events may affect the population. The results are saved both graphically and as data files. Additionally, R-savvy users may access the population functions through the command line and reuse the code as part of future research. This R package could also be used as part of a population dynamics or wildlife management course. Conclusions: BatTool provides access to a Myotis spp. population model. This tool can help natural resource managers and decision makers with the Endangered Species Act deliberations for these species and with issuing take permits as part of regulatory decision making. The tool is available online as part of this publication.
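BatTool itself is an R package; as a hedged, language-neutral illustration of the deterministic core of such a model, here is a Python sketch of a two-stage (juvenile/adult) matrix projection with an extra WNS mortality factor applied in chosen years. All rates are invented for illustration.

```python
# Toy stage-structured projection, loosely in the spirit of BatTool's
# deterministic model; vital rates and the WNS survival factor are made up.
import numpy as np

L = np.array([[0.00, 0.45],    # fecundity: adults -> juveniles
              [0.60, 0.85]])   # survival: juvenile->adult, adult->adult

def project(n0, years, wns_years=(), wns_survival=0.3):
    n, trajectory = np.asarray(n0, dtype=float), []
    for t in range(years):
        n = L @ n                      # one annual transition
        if t in wns_years:             # WNS-related overwinter mortality
            n = n * wns_survival
        trajectory.append(n.sum())
    return trajectory

# 200 juveniles + 800 adults, with a WNS event in year 3.
print(project([200, 800], years=10, wns_years={3}))
```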
Russ, Alissa L; Jahn, Michelle A; Patel, Himalaya; Porter, Brian W; Nguyen, Khoa A; Zillich, Alan J; Linsky, Amy; Simon, Steven R
2018-06-01
An electronic medication reconciliation tool was previously developed by another research team to aid provider-patient communication for medication reconciliation. To evaluate the usability of this tool, we integrated artificial safety probes into standard usability methods. The objective of this article is to describe this method of using safety probes, which enabled us to evaluate how well the tool supports users' detection of medication discrepancies. We completed a mixed-method usability evaluation in a simulated setting with 30 participants: 20 healthcare professionals (HCPs) and 10 patients. We used factual scenarios but embedded three artificial safety probes: (1) a missing medication (i.e., omission); (2) an extraneous medication (i.e., commission); and (3) an inaccurate dose (i.e., dose discrepancy). We measured users' detection of each probe to estimate the probability that an HCP or patient would detect these discrepancies. Additionally, we recorded participants' detection of naturally occurring discrepancies. Each safety probe was detected by ≤50% of HCPs. Patients' detection rates were generally higher. Estimates indicate that an HCP and patient, together, would detect 44.8% of these medication discrepancies. Additionally, HCPs and patients detected 25 and 45 naturally occurring discrepancies, respectively. Overall, detection of medication discrepancies was low. Findings indicate that more advanced interface designs are warranted. Future research is needed on how technologies can be designed to better aid HCPs' and patients' detection of medication discrepancies. This is one of the first studies to evaluate the usability of a collaborative medication reconciliation tool and assess HCPs' and patients' detection of medication discrepancies. Results demonstrate that embedded safety probes can enhance standard usability methods by measuring additional, clinically focused usability outcomes. The novel safety probes we used may serve as an initial, standard set for future medication reconciliation research. More prevalent use of safety probes could strengthen usability research for a variety of health information technologies. Published by Elsevier Inc.
Medical ethics on film: towards a reconstruction of the teaching of healthcare professionals
Volandes, Angelo
2007-01-01
The clinical vignette remains the standard means by which medical ethics are taught to students in the healthcare professions. Although written or verbal vignettes are useful as a pedagogic tool for teaching ethics and introducing students to real cases, they are limited, since students must imagine the clinical scenario. Medical ethics are almost universally taught during the early years of training, when students are unfamiliar with the clinical reality in which ethics issues arise. Film vignettes fill in that imaginative leap. By providing vivid details with images, film vignettes offer rich and textured details of cases, including the patient's perspective and the clinical reality. Film vignettes provide a detailed ethnography that allows for a more complete discussion of the ethical issues. Film can serve as an additional tool for teaching medical ethics to members of the healthcare professions. PMID:17971475
Software reliability through fault-avoidance and fault-tolerance
NASA Technical Reports Server (NTRS)
Vouk, Mladen A.; Mcallister, David F.
1993-01-01
Strategies and tools for the testing, risk assessment, and risk control of dependable software-based systems were developed. Part of this project consists of studies to enable the transfer of technology to industry, for example risk management techniques for safety-conscious systems. Theoretical investigations of the Boolean and Relational Operator (BRO) testing strategy were conducted for condition-based testing. The Basic Graph Generation and Analysis tool (BGG) was extended to fully incorporate several variants of the BRO metric. Single- and multi-phase risk, coverage, and time-based models are being developed to provide additional theoretical and empirical basis for estimation of the reliability and availability of large, highly dependable software. A model for software process and risk management was developed. The use of cause-effect graphing for software specification and validation was investigated. Lastly, advanced software fault-tolerance models were studied to provide alternatives and improvements in situations where simple software fault-tolerance strategies break down.
Bearing tester data compilation, analysis, and reporting and bearing math modeling
NASA Technical Reports Server (NTRS)
1983-01-01
The Shaberth bearing analysis computer program was developed for the analysis of jet engine shaft/bearing systems operating above room temperature with normal hydrocarbon lubricants. It is also possible to use this tool to evaluate shaft/bearing systems operating in cryogenic environments. Effects such as fluid drag, radial temperature gradients, outer race misalignment, and clearance changes were simulated and evaluated. In addition, the effects of speed and preload on bearing radial stiffness were evaluated. The Shaberth program was also used to provide contact stresses from which contact geometry was calculated to support other analyses, such as the determination of cryogenic fluid film thickness in the contacts and the evaluation of surface and subsurface stresses necessary for bearing failure evaluation. This program was a vital tool for the thermal analysis of the bearing in that it provides the heat generation rates at the rolling element/race contacts for input into a thermal model of the bearing/shaft assembly.
SNPit: a federated data integration system for the purpose of functional SNP annotation.
Shen, Terry H; Carlson, Christopher S; Tarczy-Hornoch, Peter
2009-08-01
Genome-wide association studies can potentially identify the genetic causes behind the majority of human diseases. With the advent of more advanced genotyping techniques, there is now an explosion of data gathered on single nucleotide polymorphisms (SNPs). The need exists for an integrated system that can provide up-to-date functional annotation information on SNPs. We have developed the SNP Integration Tool (SNPit) system to address this need. Built upon a federated data integration system, SNPit provides current information on a comprehensive list of SNP data sources. Additional logical inference analysis was included through an inference engine plug-in. The SNPit web servlet is available online for use. SNPit allows users to go to one source for up-to-date information on the functional annotation of SNPs. A tool that can help to integrate and analyze the potential functional significance of SNPs is important for understanding the results from genome-wide association studies.
Neandertal cannibalism and Neandertal bones used as tools in Northern Europe
Rougier, Hélène; Crevecoeur, Isabelle; Beauval, Cédric; Posth, Cosimo; Flas, Damien; Wißing, Christoph; Furtwängler, Anja; Germonpré, Mietje; Gómez-Olivencia, Asier; Semal, Patrick; van der Plicht, Johannes; Bocherens, Hervé; Krause, Johannes
2016-01-01
Almost 150 years after the first identification of Neandertal skeletal material, the cognitive and symbolic abilities of these populations remain a subject of intense debate. We present 99 new Neandertal remains from the Troisième caverne of Goyet (Belgium) dated to 40,500–45,500 calBP. The remains were identified through a multidisciplinary study that combines morphometrics, taphonomy, stable isotopes, radiocarbon dating and genetic analyses. The Goyet Neandertal bones show distinctive anthropogenic modifications, which provide clear evidence of butchery activities; four of the bones were also used for retouching stone tools. In addition to being the first site to have yielded multiple Neandertal bones used as retouchers, Goyet not only provides the first unambiguous evidence of Neandertal cannibalism in Northern Europe, but also highlights considerable diversity in mortuary behaviour among the region’s late Neandertal population in the period immediately preceding their disappearance. PMID:27381450
Rublee, Parke A; Remington, David L; Schaefer, Eric F; Marshall, Michael M
2005-01-01
Molecular methods, including conventional PCR, real-time PCR, denaturing gradient gel electrophoresis, fluorescent fragment detection PCR, and fluorescent in situ hybridization, have all been developed for use in identifying and studying the distribution of the toxic dinoflagellates Pfiesteria piscicida and P. shumwayae. Application of the methods has demonstrated a worldwide distribution of both species and provided insight into their environmental tolerance range and temporal changes in distribution. Genetic variability among geographic locations generally appears low in rDNA genes, and detection of the organisms in ballast water is consistent with rapid dispersal or high gene flow among populations, but additional sequence data are needed to verify this hypothesis. The rapid development and application of these tools serves as a model for study of other microbial taxa and provides a basis for future development of tools that can simultaneously detect multiple targets.
PathVisio-Faceted Search: an exploration tool for multi-dimensional navigation of large pathways
Fried, Jake Y.; Luna, Augustin
2013-01-01
Purpose: The PathVisio-Faceted Search plugin helps users explore and understand complex pathways by overlaying experimental data and data from web services, such as Ensembl BioMart, onto diagrams drawn using formalized notations in PathVisio. The plugin then provides a filtering mechanism, known as a faceted search, to find and highlight diagram nodes (e.g. genes and proteins) of interest based on imported data. The tool additionally provides a flexible scripting mechanism to handle complex queries. Availability: The PathVisio-Faceted Search plugin is compatible with PathVisio 3.0 and above. PathVisio is compatible with Windows, Mac OS X and Linux. The plugin, documentation, example diagrams and Groovy scripts are available at http://PathVisio.org/wiki/PathVisioFacetedSearchHelp. The plugin is free, open-source and licensed under the Apache 2.0 License. Contact: augustin@mail.nih.gov or jakeyfried@gmail.com PMID:23547033
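The real plugin is written against PathVisio's Java API and scripted in Groovy; the following is only a generic Python sketch of the faceted-search idea itself: each node carries attributes imported from data sources, and facets are conjunctive filters used to highlight matching nodes. Names and thresholds are invented.

```python
# Generic faceted filtering over annotated pathway nodes (illustrative only).
nodes = [
    {"id": "TP53", "type": "gene",    "log2fc": -1.8, "chrom": "17"},
    {"id": "MDM2", "type": "gene",    "log2fc":  2.1, "chrom": "12"},
    {"id": "p53",  "type": "protein", "log2fc":  0.1, "chrom": "17"},
]

# Each facet is an independent predicate; a node must satisfy all of them.
facets = {
    "type":   lambda n: n["type"] == "gene",
    "log2fc": lambda n: abs(n["log2fc"]) > 1.0,
}

highlighted = [n["id"] for n in nodes if all(f(n) for f in facets.values())]
print(highlighted)   # ['TP53', 'MDM2']
```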
NASA Astrophysics Data System (ADS)
Xiong, Yanmei; Zhang, Yuyan; Rong, Pengfei; Yang, Jie; Wang, Wei; Liu, Dingbin
2015-09-01
We developed a simple high-throughput colorimetric assay to detect glucose based on the glucose oxidase (GOx)-catalysed enlargement of gold nanoparticles (AuNPs). Compared with the currently available glucose kit method, the AuNP-based assay provides higher clinical sensitivity at lower cost, indicating its great potential to be a powerful tool for clinical screening of glucose. Electronic supplementary information (ESI) available: experimental section and additional figures. See DOI: 10.1039/c5nr03758a
The use of virtual reality simulation of head trauma in a surgical boot camp.
Vergara, Victor M; Panaiotis; Kingsley, Darra; Alverson, Dale C; Godsmith, Timothy; Xia, Shan; Caudell, Thomas P
2009-01-01
Surgical "boot camps" provide excellent opportunities to enhance orientation, learning, and preparation of new surgery interns as they enter the clinical arena. This paper describes the utilization of an interactive virtual reality (VR) simulation and associated virtual patient (VP) as an additional tool for surgical boot camps. Complementing other forms of simulation, virtual patients (VPs) require less specialized equipment and can also provide a wide variety of medical scenarios. In this paper we discuss a study that measured the learning effectiveness of a real-world VP simulation used by a class of new surgery interns who operated it with a standard computer interface. The usability of the simulator as a learning tool has been demonstrated and measured. This study brings the use of VR simulation with VPs closer to wider application and integration into a training curriculum, such as a surgery intern boot camp.
Preliminary design methods for fiber reinforced composite structures employing a personal computer
NASA Technical Reports Server (NTRS)
Eastlake, C. N.
1986-01-01
The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
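A hedged sketch of the classical-lamination-theory step the abstract describes: build the in-plane stiffness (A) matrix from ply properties and angles, then extract an effective laminate modulus for use in the familiar isotropic formulas. The material values are generic carbon/epoxy-like numbers, not from the original program.

```python
# Classical lamination theory: transformed reduced stiffness and A matrix.
import numpy as np

def qbar(E1, E2, G12, v12, theta_deg):
    """Transformed reduced stiffness matrix of one ply at angle theta."""
    v21 = v12 * E2 / E1
    d = 1.0 - v12 * v21
    Q11, Q22, Q12, Q66 = E1 / d, E2 / d, v12 * E2 / d, G12
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    Qb11 = Q11*c**4 + 2*(Q12 + 2*Q66)*s*s*c*c + Q22*s**4
    Qb22 = Q11*s**4 + 2*(Q12 + 2*Q66)*s*s*c*c + Q22*c**4
    Qb12 = (Q11 + Q22 - 4*Q66)*s*s*c*c + Q12*(s**4 + c**4)
    Qb66 = (Q11 + Q22 - 2*Q12 - 2*Q66)*s*s*c*c + Q66*(s**4 + c**4)
    Qb16 = (Q11 - Q12 - 2*Q66)*s*c**3 + (Q12 - Q22 + 2*Q66)*c*s**3
    Qb26 = (Q11 - Q12 - 2*Q66)*c*s**3 + (Q12 - Q22 + 2*Q66)*s*c**3
    return np.array([[Qb11, Qb12, Qb16],
                     [Qb12, Qb22, Qb26],
                     [Qb16, Qb26, Qb66]])

# [0/45/-45/90]s laminate, generic carbon/epoxy ply data (GPa, mm).
plies, t = [0, 45, -45, 90, 90, -45, 45, 0], 0.125
A = sum(qbar(140.0, 10.0, 5.0, 0.3, th) * t for th in plies)
h = t * len(plies)
Ex = (A[0, 0] - A[0, 1]**2 / A[1, 1]) / h   # effective in-plane modulus
print(f"Effective Ex ~ {Ex:.1f} GPa")
```

The effective modulus can then be fed into standard isotropic beam or plate formulas, which is the approximation strategy the abstract describes.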
DOE Office of Scientific and Technical Information (OSTI.GOV)
Starr, D. L.; Wozniak, P. R.; Vestrand, W. T.
2002-01-01
SkyDOT (Sky Database for Objects in Time-Domain) is a Virtual Observatory currently comprised of data from the RAPTOR, ROTSE I, and OGLE II survey projects. This makes it a very large time-domain database. In addition, the RAPTOR project provides SkyDOT with real-time variability data as well as stereoscopic information. With its web interface, we believe SkyDOT will be a very useful tool for both astronomers and the public. Our main task has been to construct an efficient relational database containing all existing data while handling a real-time inflow of data. We also provide a useful web interface allowing easy access to both astronomers and the public. Initially, this server will allow common searches, specific queries, and access to light curves. In the future we will include machine-learning classification tools and access to spectral information.
Vlaic, Sebastian; Hoffmann, Bianca; Kupfer, Peter; Weber, Michael; Dräger, Andreas
2013-09-01
GRN2SBML automatically encodes gene regulatory networks derived from several inference tools in the systems biology markup language (SBML). Through a graphical user interface, the networks can be annotated via the simple object access protocol (SOAP)-based application programming interface of BioMart Central Portal and the MIRIAM (minimum information required in the annotation of models) registry. Additionally, we provide an R package that processes the output of supported inference algorithms and automatically passes all required parameters to GRN2SBML. GRN2SBML therefore closes a gap in the processing pipeline between the inference of gene regulatory networks and their subsequent analysis, visualization, and storage. GRN2SBML is freely available under the GNU Public License version 3 and can be downloaded from http://www.hki-jena.de/index.php/0/2/490. General information on GRN2SBML, examples and tutorials are available at the tool's web page.
Southern California Disasters II
NASA Technical Reports Server (NTRS)
Nicholson, Heather; Todoroff, Amber L.; LeBoeuf, Madeline A.
2015-01-01
The USDA Forest Service (USFS) has multiple programs in place which primarily utilize Landsat imagery to produce burn severity indices for aiding wildfire damage assessment and mitigation. These indices provide widely-used wildfire damage assessment tools to decision makers. When the Hyperspectral Infrared Imager (HyspIRI) is launched in 2022, the sensor's hyperspectral resolution will support new methods for assessing natural disaster impacts on ecosystems, including wildfire damage to forests. This project used simulated HyspIRI data to study three southern California fires: Aspen, French, and King. Burn severity indices were calculated from the data and the results were quantitatively compared to the comparable USFS products currently in use. The final results from this project illustrate how HyspIRI data may be used in the future to enhance assessment of fire-damaged areas and provide additional monitoring tools for decision support to the USFS and other land management agencies.
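The burn-severity indices mentioned are typically variants of the normalized burn ratio (NBR); the following is a hedged sketch of the standard differenced NBR (dNBR) computation on pre- and post-fire NIR/SWIR reflectance arrays. The arrays here are random stand-ins for real scenes, and the 0.66 cut-off is one commonly cited high-severity threshold, not a USFS-specific value.

```python
# dNBR sketch on simulated reflectance grids.
import numpy as np

def nbr(nir, swir):
    """Normalized burn ratio; epsilon guards against divide-by-zero."""
    return (nir - swir) / (nir + swir + 1e-12)

rng = np.random.default_rng(7)
pre_nir  = rng.uniform(0.2, 0.5, (100, 100))
pre_swir = rng.uniform(0.05, 0.2, (100, 100))
post_nir, post_swir = pre_nir * 0.6, pre_swir * 1.4   # simulated fire effect

dnbr = nbr(pre_nir, pre_swir) - nbr(post_nir, post_swir)
severe = dnbr > 0.66          # one commonly used high-severity threshold
print("fraction mapped high severity:", float(severe.mean()))
```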
Villoria Sáez, Paola; del Río Merino, Mercedes; Porras-Amores, César
2012-02-01
The management planning of construction and demolition (C&D) waste currently relies on a single indicator, which does not provide sufficiently detailed information; more innovative and precise indicators therefore need to be determined and implemented. The aim of this research work is to improve existing C&D waste quantification tools for the construction of new residential buildings in Spain. For this purpose, several housing projects were studied to estimate the C&D waste generated during their construction. This paper determines the values of three indicators for estimating the generation of C&D waste in new residential buildings in Spain, itemized by type of waste and construction stage. The inclusion of two more accurate indicators, in addition to the global one commonly in use, provides a significant improvement in C&D waste quantification tools and management planning.
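A sketch of how such per-stage indicators would be applied in practice: each indicator gives an expected waste volume per built square metre for one construction stage, and project totals follow from multiplying by floor area. The indicator values below are placeholders, not the figures reported in the study.

```python
# Hypothetical per-stage C&D waste indicators (m3 of waste per m2 built).
STAGE_INDICATORS_M3_PER_M2 = {
    "structure": 0.015,   # placeholder value
    "masonry":   0.040,   # placeholder value
    "finishes":  0.025,   # placeholder value
}

def estimate_waste(floor_area_m2):
    """Return per-stage and total expected waste volumes for a project."""
    by_stage = {stage: rate * floor_area_m2
                for stage, rate in STAGE_INDICATORS_M3_PER_M2.items()}
    return by_stage, sum(by_stage.values())

by_stage, total = estimate_waste(5000)   # a 5,000 m2 residential project
print(by_stage, f"total ~ {total:.0f} m3")
```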