Structural Embeddings: Mechanization with Method
NASA Technical Reports Server (NTRS)
Munoz, Cesar; Rushby, John
1999-01-01
The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.
Formal Methods Tool Qualification
NASA Technical Reports Server (NTRS)
Wagner, Lucas G.; Cofer, Darren; Slind, Konrad; Tinelli, Cesare; Mebsout, Alain
2017-01-01
Formal methods tools have been shown to be effective at finding defects in safety-critical digital systems including avionics systems. The publication of DO-178C and the accompanying formal methods supplement DO-333 allows applicants to obtain certification credit for the use of formal methods without providing justification for them as an alternative method. This project conducted an extensive study of existing formal methods tools, identifying obstacles to their qualification and proposing mitigations for those obstacles. Further, it interprets the qualification guidance for existing formal methods tools and provides case study examples for open source tools. This project also investigates the feasibility of verifying formal methods tools by generating proof certificates which capture proof of the formal methods tool's claim, which can be checked by an independent, proof certificate checking tool. Finally, the project investigates the feasibility of qualifying this proof certificate checker, in the DO-330 framework, in lieu of qualifying the model checker itself.
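To make the proof-certificate approach concrete, here is a minimal sketch of an independent certificate checker, assuming a toy certificate format (a propositional resolution refutation); it illustrates the idea only and is not the project's actual tooling or format.

```python
# Toy proof-certificate checker: verifies a resolution refutation produced by
# some formal methods tool, independently of that tool. The certificate format
# is assumed/hypothetical: each step names two earlier clauses and a pivot.

def resolve(c1, c2, pivot):
    """Resolve two clauses (sets of integer literals) on a pivot literal."""
    assert pivot in c1 and -pivot in c2, "pivot must appear with opposite signs"
    return (c1 - {pivot}) | (c2 - {-pivot})

def check_certificate(clauses, steps):
    """Verify that `steps` derives the empty clause from the input `clauses`."""
    derived = [frozenset(c) for c in clauses]
    for i, j, pivot in steps:
        derived.append(frozenset(resolve(set(derived[i]), set(derived[j]), pivot)))
    return derived[-1] == frozenset()  # empty clause => UNSAT claim checks out

# Refutation of (x) AND (not x OR y) AND (not y):
cnf = [{1}, {-1, 2}, {-2}]
cert = [(0, 1, 1),   # resolve (x) with (not x OR y) on x -> (y), stored at index 3
        (3, 2, 2)]   # resolve (y) with (not y) on y      -> (), stored at index 4
assert check_certificate(cnf, cert)
```

A checker this small is far easier to qualify under DO-330 than the model checker that produced the proof, which is the crux of the approach described above.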
UQTk Version 3.0.3 User Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sargsyan, Khachik; Safta, Cosmin; Chowdhary, Kamaljit Singh
2017-05-01
The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 3.0.3 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.
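UQTk's non-intrusive methods treat the computational model as a black box that is simply sampled. The sketch below illustrates that style of analysis with plain-numpy Monte Carlo propagation; it does not use UQTk's own API, and the model and input distributions are placeholders.

```python
# Generic non-intrusive uncertainty propagation: sample uncertain inputs, run
# the black-box model, summarize the output distribution. Plain numpy, not UQTk.
import numpy as np

def model(x):
    # Stand-in computational model: inputs in, scalar prediction out.
    return np.sin(x[..., 0]) + 0.5 * x[..., 1] ** 2

rng = np.random.default_rng(42)
# Assumed input uncertainties: x0 ~ Normal(1.0, 0.1), x1 ~ Uniform(0, 2).
samples = np.column_stack([rng.normal(1.0, 0.1, 100_000),
                           rng.uniform(0.0, 2.0, 100_000)])
outputs = model(samples)

print(f"mean = {outputs.mean():.4f}, std = {outputs.std():.4f}")
# Crude sensitivity indicator: correlation of each input with the output.
for k in range(samples.shape[1]):
    r = np.corrcoef(samples[:, k], outputs)[0, 1]
    print(f"input {k}: correlation with output = {r:+.3f}")
```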
Method for forming an abrasive surface on a tool
Seals, Roland D.; White, Rickey L.; Swindeman, Catherine J.; Kahl, W. Keith
1999-01-01
A method for fabricating a tool used in cutting, grinding and machining operations is provided. The method is used to deposit a mixture comprising an abrasive material and a bonding material on a tool surface. The materials are propelled toward the receiving surface of the tool substrate using a thermal spray process. The thermal spray process melts the bonding material portion of the mixture, but not the abrasive material. Upon impacting the tool surface, the mixture or composition solidifies to form a hard abrasive tool coating.
Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis
2015-01-01
Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.
Kashani-Amin, Elaheh; Tabatabaei-Malazy, Ozra; Sakhteman, Amirhossein; Larijani, Bagher; Ebrahim-Habibi, Azadeh
2018-02-27
Prediction of proteins' secondary structure is one of the major steps in the generation of homology models. These models provide structural information which is used to design suitable ligands for potential medicinal targets. However, selecting a proper tool among multiple secondary structure prediction (SSP) options is challenging. The current study provides insight into currently favored methods and tools, within various contexts. A systematic review was performed for comprehensive access to recent (2013-2016) studies which used or recommended protein SSP tools. Three databases, Web of Science, PubMed and Scopus, were systematically searched, and 99 out of 209 studies were finally found eligible for data extraction. Four categories of applications for the 59 retrieved SSP tools were: (I) prediction of structural features of a given sequence, (II) evaluation of a method, (III) providing input for a new SSP method and (IV) integrating an SSP tool as a component of a program. PSIPRED was found to be the most popular tool in all four categories. JPred and tools utilizing the PHD (Profile network from HeiDelberg) method occupied second and third places of popularity in categories I and II. JPred was only found in the first two categories, while PHD was present in three fields. This study provides a comprehensive insight into the recent usage of SSP tools which could be helpful for selecting a proper tool.
Designing and Using Software Tools for Educational Purposes: FLAT, a Case Study
ERIC Educational Resources Information Center
Castro-Schez, J. J.; del Castillo, E.; Hortolano, J.; Rodriguez, A.
2009-01-01
Educational software tools are considered to enrich teaching strategies, providing a more compelling means of exploration and feedback than traditional blackboard methods. Moreover, software simulators provide a more motivating link between theory and practice than pencil-paper methods, encouraging active and discovery learning in the students.…
Formal Assurance Certifiable Tooling Strategy Final Report
NASA Technical Reports Server (NTRS)
Bush, Eric; Oglesby, David; Bhatt, Devesh; Murugesan, Anitha; Engstrom, Eric; Mueller, Joe; Pelican, Michael
2017-01-01
This is the Final Report of a research project to investigate issues and provide guidance for the qualification of formal methods tools under the DO-330 qualification process. It consisted of three major subtasks spread over two years: 1) an assessment of theoretical soundness issues that may affect qualification for three categories of formal methods tools, 2) a case study simulating the DO-330 qualification of two actual tool sets, and 3) an investigation of risk mitigation strategies that might be applied to chains of such formal methods tools in order to increase confidence in their use for the certification of airborne software.
Assessing Adaptive Instructional Design Tools and Methods in ADAPT[IT].
ERIC Educational Resources Information Center
Eseryel, Deniz; Spector, J. Michael
ADAPT[IT] (Advanced Design Approach for Personalized Training - Interactive Tools) is a European project within the Information Society Technologies program that is providing design methods and tools to guide a training designer according to the latest cognitive science and standardization principles. ADAPT[IT] addresses users in two significantly…
EMERGY METHODS: VALUABLE INTEGRATED ASSESSMENT TOOLS
NHEERL's Atlantic Ecology Division is investigating emergy methods as tools for integrated assessment in several projects evaluating environmental impacts, policies, and alternatives for remediation and intervention. Emergy accounting is a methodology that provides a quantitative...
Chimera Grid Tools
NASA Technical Reports Server (NTRS)
Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert
2005-01-01
Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.
Feasibility and acceptability of a computer-based tool to improve contraceptive counseling.
Wilson, Ellen K; Krieger, Kathleen E; Koo, Helen P; Minnis, Alexandra M; Treiman, Katherine
2014-07-01
The objective was to test the feasibility and acceptability of a computerized tool, Smart Choices, designed to enhance the quality of contraceptive counseling in family planning clinics. The tool includes (a) a questionnaire completed by patients and summarized in a printout for providers and (b) a birth control guide patients explore to learn about various contraceptive methods. In 2 family planning clinics, we conducted interviews with 125 women who used the Smart Choices computerized tool and with 7 providers. Smart Choices integrated well into clinic flow in one clinic, but less well in the other, which had very short waiting times. Patients were generally enthusiastic about Smart Choices, including its helpfulness in preparing them and their providers for the counseling session and increasing their knowledge of contraceptive methods. Providers varied in how much they used the printout and in their opinions about its usefulness. Some felt its usefulness was limited because it overlapped with the clinic's intake forms or because it did not match their concept of counseling needs. Others felt it provided valuable information not collected by intake forms, and more honest information. Some found Smart Choices to be most helpful with patients who were unsure what method they wanted. Smart Choices is feasible to implement and well received by patients, but modifications are needed to increase provider enthusiasm for this tool. The Smart Choices tool requires refinement before widespread dissemination.
2D and 3D Method of Characteristic Tools for Complex Nozzle Development
NASA Technical Reports Server (NTRS)
Rice, Tharen
2003-01-01
This report details the development of 2D and 3D Method of Characteristics (MOC) tools for the design of complex nozzle geometries. These tools are GUI driven and can be run on most Windows-based platforms. The report provides a user's manual for these tools and explains the mathematical algorithms used in the MOC solutions.
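At the heart of 2D MOC nozzle design is the Prandtl-Meyer function; the snippet below implements that standard compressible-flow relation (textbook theory, not code taken from the report).

```python
# Prandtl-Meyer function nu(M): the turning angle underlying Method of
# Characteristics nozzle design. Standard isentropic-flow theory.
import math

def prandtl_meyer(M, gamma=1.4):
    """Return the Prandtl-Meyer angle in degrees for Mach number M >= 1."""
    if M < 1.0:
        raise ValueError("Prandtl-Meyer function is defined for M >= 1")
    g = (gamma + 1.0) / (gamma - 1.0)
    nu = (math.sqrt(g) * math.atan(math.sqrt((M * M - 1.0) / g))
          - math.atan(math.sqrt(M * M - 1.0)))
    return math.degrees(nu)

# In the classic minimum-length 2D nozzle construction, the maximum wall
# angle at the throat is nu(M_exit) / 2.
print(f"{prandtl_meyer(2.4) / 2.0:.1f} degrees")  # ~18.4 for M_exit = 2.4
```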
The fact of ignorance: revisiting the Socratic method as a tool for teaching critical thinking.
Oyler, Douglas R; Romanelli, Frank
2014-09-15
Critical thinking, while highly valued as an ability of health care providers, remains a skill that many educators find difficult to teach. This review provides an analysis examining why current methods of teaching critical thinking to health care students (primarily medical and pharmacy students) often fail and describes a premise and potential utility of the Socratic method as a tool to teach critical thinking in health care education.
Vibration damping method and apparatus
Redmond, James M.; Barney, Patrick S.; Parker, Gordon G.; Smith, David A.
1999-01-01
The present invention provides vibration damping method and apparatus that can damp vibration in more than one direction without requiring disassembly, that can accommodate varying tool dimensions without requiring re-tuning, and that does not interfere with tool tip operations and cooling. The present invention provides active dampening by generating bending moments internal to a structure such as a boring bar to dampen vibration thereof.
Analyzing Human-Landscape Interactions: Tools That Integrate
NASA Astrophysics Data System (ADS)
Zvoleff, Alex; An, Li
2014-01-01
Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature; in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for making the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches to data fusion and integration across sites or disciplines poses an important challenge for future work on integrating human and landscape components.
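As a flavor of the simulation approaches surveyed above, here is a toy cellular automaton for land-use change; the grid, neighborhood rule, and conversion probability are hypothetical.

```python
# Toy cellular automaton of land development: undeveloped cells with enough
# developed neighbors may convert each step. Rule and parameters are invented.
import numpy as np

rng = np.random.default_rng(0)
grid = (rng.random((50, 50)) < 0.05).astype(int)  # 1 = developed, 0 = not

def step(grid, p_convert=0.3):
    # Count developed Moore-neighborhood neighbors via shifted copies.
    nbrs = sum(np.roll(np.roll(grid, di, axis=0), dj, axis=1)
               for di in (-1, 0, 1) for dj in (-1, 0, 1)
               if (di, dj) != (0, 0))
    candidates = (grid == 0) & (nbrs >= 3)
    converts = candidates & (rng.random(grid.shape) < p_convert)
    return grid | converts.astype(int)

for year in range(10):
    grid = step(grid)
print(f"developed fraction after 10 steps: {grid.mean():.3f}")
```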
Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi
2014-11-01
To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data, but PO gives a distinctive insight, revealing what people are really doing instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion covers using the grounded theory method with PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights that cannot be seen or revealed by other data collection methods. It is not commonly discussed in nursing research, and this paper can therefore serve as a useful tool for those who intend to use PO and grounded theory in their nursing research.
Methods for transition toward computer assisted cognitive examination.
Jurica, P; Valenzi, S; Struzik, Z R; Cichocki, A
2015-01-01
We present a software framework which enables the extension of current methods for the assessment of cognitive fitness using recent technological advances. Screening for cognitive impairment is becoming more important as the world's population grows older, and current methods could be enhanced by the use of computers. Introduction of new methods to clinics requires basic tools for the collection and communication of collected data. Our aim was to develop tools that, with minimal interference, offer new opportunities for the enhancement of current interview-based cognitive examinations. We suggest methods and discuss the process by which established cognitive tests can be adapted for data collection through digitization by pen-enabled tablets. We discuss a number of methods for evaluation of collected data, which promise to increase the resolution and objectivity of the common scoring strategy based on visual inspection. By involving computers in the roles of both instructing and scoring, we aim to increase the precision and reproducibility of cognitive examination. The tools provided in the Python framework CogExTools, available at http://bsp.brain.riken.jp/cogextools/, enable the design, application and evaluation of screening tests for the assessment of cognitive impairment. The toolbox is a research platform; it represents a foundation for further collaborative development by the wider research community and enthusiasts. It is free to download and use, and open-source. We introduce a set of open-source tools that facilitate the design and development of new cognitive tests for modern technology. We provide these tools in order to enable the adaptation of technology for cognitive examination in clinical settings. The tools provide the first step in a possible transition toward standardized mental state examination using computers.
Electromagnetic variable degrees of freedom actuator systems and methods
Montesanti, Richard C. [Pleasanton, CA]; Trumper, David L. [Plaistow, NH]; Kirtley, James L., Jr.
2009-02-17
The present invention provides a variable reluctance actuator system and method that can be adapted for simultaneous rotation and translation of a moving element by applying a normal-direction magnetic flux on the moving element. In a beneficial example arrangement, the moving element includes a swing arm that carries a cutting tool at a set radius from an axis of rotation so as to produce a rotary fast tool servo that provides a tool motion in a direction substantially parallel to the surface-normal of a workpiece at the point of contact between the cutting tool and workpiece. An actuator rotates a swing arm such that a cutting tool moves toward and away from a mounted rotating workpiece in a controlled manner in order to machine the workpiece. Position sensors provide rotation and displacement information for a swing arm to a control system. A control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and in-feed slide of a precision lathe.
Optimizing usability of an economic decision support tool: prototype of the EQUIPT tool.
Cheung, Kei Long; Hiligsmann, Mickaël; Präger, Maximilian; Jones, Teresa; Józwiak-Hagymásy, Judit; Muñoz, Celia; Lester-George, Adam; Pokhrel, Subhash; López-Nicolás, Ángel; Trapero-Bertran, Marta; Evers, Silvia M A A; de Vries, Hein
2018-01-01
Economic decision-support tools can provide valuable information for tobacco control stakeholders, but their usability may impact the adoption of such tools. This study aims to illustrate a mixed-method usability evaluation of an economic decision-support tool for tobacco control, using the EQUIPT ROI tool prototype as a case study. A cross-sectional mixed methods design was used, including a heuristic evaluation, a thinking-aloud approach, and a questionnaire testing and exploring the usability of the Return on Investment (ROI) tool. A total of sixty-six users evaluated the tool (thinking aloud) and completed the questionnaire. For the heuristic evaluation, four experts evaluated the interface. In total, twenty-one percent of the respondents perceived good usability. A total of 118 usability problems were identified, of which twenty-six were categorized as most severe, indicating high priority to fix them before implementation. Combining user-based and expert-based evaluation methods is recommended, as these were shown to identify unique usability problems. The evaluation provides input to optimize the usability of a decision-support tool, and may serve as a vantage point for other developers conducting usability evaluations to refine similar tools before wide-scale implementation. Such studies could reduce implementation gaps by optimizing usability, enhancing in turn the research impact of such interventions.
System and method for integrating hazard-based decision making tools and processes
Hodgin, C. Reed [Westminster, CO]
2012-03-20
A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.
PyHLA: tests for the association between HLA alleles and diseases.
Fan, Yanhui; Song, You-Qiang
2017-02-06
Recently, several tools have been designed for human leukocyte antigen (HLA) typing using single nucleotide polymorphism (SNP) array and next-generation sequencing (NGS) data. These tools provide high-throughput and cost-effective approaches for identifying HLA types. Therefore, tools for downstream association analysis are highly desirable. Although several tools have been designed for multi-allelic marker association analysis, they were designed only for microsatellite markers and do not scale well with increasing data volumes, or they were designed for large-scale data but provided a limited number of tests. We have developed a Python package called PyHLA, which implements several methods for HLA association analysis, to fill the gap. PyHLA is a tailor-made, easy to use, and flexible tool designed specifically for the association analysis of the HLA types imputed from genome-wide genotyping and NGS data. PyHLA provides functions for association analysis, zygosity tests, and interaction tests between HLA alleles and diseases. Monte Carlo permutation and several methods for multiple testing corrections have also been implemented. PyHLA provides a convenient and powerful tool for HLA analysis. Existing methods have been integrated and desired methods have been added in PyHLA. Furthermore, PyHLA is applicable to small and large sample sizes and can finish the analysis in a timely manner on a personal computer with different platforms. PyHLA is implemented in Python. PyHLA is a free, open source software distributed under the GPLv2 license. The source code, tutorial, and examples are available at https://github.com/felixfan/PyHLA.
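The style of test PyHLA implements can be sketched generically: Fisher's exact test on allele carrier counts with a multiple-testing correction. The snippet below is an illustration with hypothetical counts, not PyHLA's actual code or command-line interface.

```python
# Allele-disease association sketch: 2x2 Fisher's exact test per HLA allele,
# with Bonferroni correction. Counts are hypothetical.
from scipy.stats import fisher_exact

# allele -> (case carriers, case non-carriers, control carriers, control non-carriers)
counts = {
    "HLA-DRB1*15:01": (60, 140, 30, 170),
    "HLA-B*57:01":    (12, 188, 15, 185),
}

n_tests = len(counts)
for allele, (a, b, c, d) in counts.items():
    odds_ratio, p = fisher_exact([[a, b], [c, d]])
    p_bonf = min(1.0, p * n_tests)  # Bonferroni-corrected p-value
    print(f"{allele}: OR={odds_ratio:.2f}, p={p:.3g}, corrected p={p_bonf:.3g}")
```

PyHLA layers Monte Carlo permutation and several multiple-testing correction options on top of tests of this general shape.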
An automated benchmarking platform for MHC class II binding prediction methods.
Andreatta, Massimo; Trolle, Thomas; Yan, Zhen; Greenbaum, Jason A; Peters, Bjoern; Nielsen, Morten
2018-05-01
Computational methods for the prediction of peptide-MHC binding have become an integral and essential component of candidate selection in experimental T cell epitope discovery studies. The sheer number of published prediction methods, and the often discordant reports on their performance, poses a considerable quandary for the experimentalist who needs to choose the best tool for their research. With the goal of providing an unbiased, transparent evaluation of the state of the art in the field, we created an automated platform to benchmark peptide-MHC class II binding prediction tools. The platform evaluates the absolute and relative predictive performance of all participating tools on data newly entered into the Immune Epitope Database (IEDB) before they are made public, thereby providing a frequent, unbiased assessment of available prediction tools. The benchmark runs on a weekly basis, is fully automated, and displays up-to-date results on a publicly accessible website. The initial benchmark described here included six commonly used prediction servers, but other tools are encouraged to join with a simple sign-up procedure. Performance evaluation on 59 data sets composed of over 10 000 binding affinity measurements suggested that NetMHCIIpan is currently the most accurate tool, followed by NN-align and the IEDB consensus method. Weekly reports on the participating methods can be found online at: http://tools.iedb.org/auto_bench/mhcii/weekly/. mniel@bioinformatics.dtu.dk. Supplementary data are available at Bioinformatics online.
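The per-dataset evaluation such a platform automates can be sketched as ranking competing predictors by a metric such as ROC AUC on newly released measurements; the snippet below is a generic illustration with simulated scores, not the IEDB platform's code.

```python
# Rank binding predictors by ROC AUC against fresh benchmark labels.
# Labels and tool scores are simulated for illustration.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
is_binder = rng.random(500) < 0.3            # newly curated binder labels
strength = {"tool_A": 2.0, "tool_B": 0.8}    # hypothetical predictor quality

scores = {name: is_binder * s + rng.normal(0.0, 1.0, 500)
          for name, s in strength.items()}   # simulated prediction scores

for auc, name in sorted(((roc_auc_score(is_binder, sc), name)
                         for name, sc in scores.items()), reverse=True):
    print(f"{name}: AUC = {auc:.3f}")
```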
Method Of Wire Insertion For Electric Machine Stators
Brown, David L; Stabel, Gerald R; Lawrence, Robert Anthony
2005-02-08
A method of inserting coils in slots of a stator is provided. The method includes interleaving a first set of first phase windings and a first set of second phase windings on an insertion tool. The method also includes activating the insertion tool to radially insert the first set of first phase windings and the first set of second phase windings in the slots of the stator. In one embodiment, interleaving the first set of first phase windings and the first set of second phase windings on the insertion tool includes forming the first set of first phase windings in first phase openings defined in the insertion tool, and forming the first set of second phase windings in second phase openings defined in the insertion tool.
Oliver, David M; Hanley, Nick D; van Niekerk, Melanie; Kay, David; Heathwaite, A Louise; Rabinovici, Sharyl J M; Kinzelman, Julie L; Fleming, Lora E; Porter, Jonathan; Shaikh, Sabina; Fish, Rob; Chilton, Sue; Hewitt, Julie; Connolly, Elaine; Cummins, Andy; Glenk, Klaus; McPhail, Calum; McRory, Eric; McVittie, Alistair; Giles, Amanna; Roberts, Suzanne; Simpson, Katherine; Tinch, Dugald; Thairs, Ted; Avery, Lisa M; Vinten, Andy J A; Watts, Bill D; Quilliam, Richard S
2016-02-01
The use of molecular tools, principally qPCR, versus traditional culture-based methods for quantifying microbial parameters (e.g., Fecal Indicator Organisms) in bathing waters generates considerable ongoing debate at the science-policy interface. Advances in science have allowed the development and application of molecular biological methods for rapid (~2 h) quantification of microbial pollution in bathing and recreational waters. In contrast, culture-based methods can take between 18 and 96 h for sample processing. Thus, molecular tools offer an opportunity to provide a more meaningful statement of microbial risk to water-users by providing near-real-time information enabling potentially more informed decision-making with regard to water-based activities. However, complementary studies concerning the potential costs and benefits of adopting rapid methods as a regulatory tool are in short supply. We report on findings from an international Working Group that examined the breadth of social impacts, challenges, and research opportunities associated with the application of molecular tools to bathing water regulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ringgenberg, P.D.; Burris, W.J.
1988-06-28
A method is described of flow testing a formation in a wellbore, comprising: providing a testing string including at least one annulus pressure responsive tool bore closure valve; providing a packer and setting the packer in the wellbore to seal thereacross; running the testing string into the wellbore with the tool bore closure valve in an open position; stinging into the set packer with the bottom of the testing string; increasing pressure a first time in the wellbore annulus around the testing string and above the set packer without cycling the tool bore closure valve; reducing pressure in the wellbore annulus; closing the tool bore closure valve responsive to the pressure reduction; increasing pressure a second time in the wellbore annulus; reopening the tool bore closure valve responsive to the second increase; and flowing fluids from the formation through the reopened tool bore closure valve.
DIY Adapted Repurposed Tool (ART) Kit: A Recipe for Success
ERIC Educational Resources Information Center
Schoonover, Judith; Schwind, Deborah B.
2018-01-01
Every individual should be provided with opportunities to self-express through participation in art activities. In order to provide independent exploration and creativity during art, it may be necessary to adapt art tools, modify how the activity is accomplished, and examine the environment to determine the best methods to provide access. Simple…
Cox, Ruth; Sanchez, Javier; Revie, Crawford W
2013-01-01
Global climate change is known to result in the emergence or re-emergence of some infectious diseases. Reliable methods to identify the infectious diseases of humans and animals that are most likely to be influenced by climate are therefore required. Since different priorities will affect the decision to address a particular pathogen threat, decision makers need a standardised method of prioritisation. Ranking methods and Multi-Criteria Decision approaches provide such a standardised method and were employed here to design two different pathogen prioritisation tools. The opinion of 64 experts was elicited to assess the importance of 40 criteria that could be used to prioritise emerging infectious diseases of humans and animals in Canada. A weight was calculated for each criterion according to the expert opinion. Attributes were defined for each criterion as a transparent and repeatable method of measurement. Two different Multi-Criteria Decision Analysis tools were tested, both of which used an additive aggregation approach. These were an Excel spreadsheet tool and a tool developed in the software 'M-MACBETH'. The tools were trialed on nine 'test' pathogens. Two different methods of criteria weighting were compared, one using fixed weighting values, the other using probability distributions to account for uncertainty and variation in expert opinion. The ranking of the nine pathogens varied according to the weighting method that was used. In both tools, using both weighting methods, the diseases that tended to rank the highest were West Nile virus, Giardiasis and Chagas, while Coccidioidomycosis tended to rank the lowest. Both tools offer a simple and user-friendly approach to prioritising pathogens in the context of climate change, with explicit scoring of 40 criteria and weighting methods based on expert opinion. They provide a dynamic, interactive method that can help to identify pathogens for which a full risk assessment should be pursued.
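The additive aggregation both tools use reduces to a weighted sum of criterion scores for each pathogen. The sketch below shows the fixed-weight and probabilistic-weight variants; the criteria, weights, and scores are illustrative placeholders, not the study's 40 elicited criteria.

```python
# Additive Multi-Criteria Decision Analysis: priority = sum of weight * score.
# Criteria, weights, and scores are invented for illustration.
import numpy as np

weights = np.array([0.5, 0.3, 0.2])  # e.g., climate sensitivity, severity, spread
scores = {                            # attribute scores on a common 0-1 scale
    "West Nile virus":     np.array([0.9, 0.6, 0.5]),
    "Giardiasis":          np.array([0.7, 0.3, 0.8]),
    "Coccidioidomycosis":  np.array([0.4, 0.5, 0.2]),
}

# Fixed-weight variant: a single deterministic ranking.
for pathogen, s in sorted(scores.items(), key=lambda kv: -(weights @ kv[1])):
    print(f"{pathogen}: priority = {weights @ s:.2f}")

# Probabilistic variant: draw weights from a distribution over expert opinion
# and report the spread of the resulting priority score.
draws = np.random.default_rng(7).dirichlet(10 * weights, size=1000)
wnv = draws @ scores["West Nile virus"]
print(f"West Nile virus priority under weight uncertainty: "
      f"{wnv.mean():.2f} +/- {wnv.std():.2f}")
```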
MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.
Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y
2017-08-14
Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations on the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results in different datasets and configurations. All this variation creates a complicated scenario for researchers when deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms from different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available on the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics .
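The co-occurrence-based integration can be sketched crudely: taxa reported by more tools are up-weighted when the profiles are merged. The snippet below is a simplification for illustration, not MetaMeta's actual algorithm.

```python
# Merge taxonomic profiles from several tools; weight each taxon by the
# fraction of tools that detected it (co-occurrence) times its mean abundance.
from collections import defaultdict

profiles = {  # tool -> {taxon: relative abundance}; hypothetical outputs
    "tool_A": {"E. coli": 0.40, "B. subtilis": 0.35, "phantom_sp": 0.25},
    "tool_B": {"E. coli": 0.50, "B. subtilis": 0.50},
    "tool_C": {"E. coli": 0.45, "B. subtilis": 0.30, "S. aureus": 0.25},
}

support = defaultdict(list)
for profile in profiles.values():
    for taxon, abundance in profile.items():
        support[taxon].append(abundance)

n_tools = len(profiles)
merged = {t: (len(a) / n_tools) * (sum(a) / len(a)) for t, a in support.items()}
total = sum(merged.values())
for taxon, score in sorted(merged.items(), key=lambda kv: -kv[1]):
    print(f"{taxon}: {score / total:.3f}")  # integrated, renormalized profile
```

Note how "phantom_sp", reported by only one tool, is down-weighted relative to taxa supported by all three; that is the intuition behind the sensitivity and reliability claim above.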
Gressel, Gregory M; Lundsberg, Lisbet S; Illuzzi, Jessica L; Danton, Cheryl M; Sheth, Sangini S; Xu, Xiao; Gariepy, Aileen
2014-12-01
To explore patient and provider perspectives regarding a new Web-based contraceptive support tool. We conducted a qualitative study at an urban Medicaid-based clinic among sexually active women interested in starting a new contraceptive method, as well as clinic providers and staff. All participants were given the opportunity to explore Bedsider, an online contraceptive support tool developed for sexually active women ages 18-29 by the National Campaign to Prevent Teen and Unplanned Pregnancy and endorsed by the American Congress of Obstetricians and Gynecologists. Focus groups were conducted separately among patient participants and clinic providers/staff using open-ended structured interview guides to identify specific themes and key concepts related to use of this tool in an urban clinic setting. Patient participants were very receptive to this online contraceptive support tool, describing it as trustworthy, accessible and empowering. In contrast, clinic providers and staff had concerns regarding the Website's legitimacy, accessibility, ability to empower patients and applicability, which limited their willingness to recommend its use to patients. Contrasting opinions regarding Bedsider may point to a potential disconnect between how providers and patients view contraception information tools. Further qualitative and quantitative studies are needed to explore women's perspectives on contraceptive education and counseling and providers' understanding of these perspectives. This study identifies a contrast between how patients and providers in an urban clinic setting perceive a Web-based contraceptive tool. Given a potential patient-provider discrepancy in preferred methods and approaches to contraceptive counseling, additional research is needed to enhance this important arena of women's health care.
A new idea for visualization of lesions distribution in mammogram based on CPD registration method.
Pan, Xiaoguang; Qi, Buer; Yu, Hongfei; Wei, Haiping; Kang, Yan
2017-07-20
Mammography is currently the most effective technique for detecting breast cancer. Lesion distribution can provide support for clinical diagnosis and epidemiological studies. We present a new idea to help radiologists study breast lesion distribution conveniently. We also developed an automatic tool based on this idea which can show a visualization of lesion distribution in a standard mammogram. The workflow is: first, establish a lesion database to study; then, extract breast contours and match different women's mammograms to a standard mammogram; finally, show the lesion distribution in the standard mammogram and provide the distribution statistics. The crucial step in developing this tool was matching different women's mammograms correctly. We used a hybrid breast contour extraction method combined with the coherent point drift method to match different women's mammograms. We tested our automatic tool on four mass datasets of 641 images. The distribution results shown by the tool were consistent with the results counted manually from the corresponding reports and mammograms. We also discuss the registration error, which was less than 3.3 mm in average distance. The new idea is effective, and the automatic tool can provide lesion distribution results consistent with radiologists' assessments simply and conveniently.
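The registration step can be illustrated with the open-source pycpd package (assumed available via pip install pycpd); the contours, the affine CPD variant, and the lesion coordinates below are placeholders, not the authors' implementation.

```python
# Map a subject breast contour onto a standard one with Coherent Point Drift,
# then carry a lesion annotation into standard-mammogram coordinates.
import numpy as np
from pycpd import AffineRegistration  # assumed dependency: pip install pycpd

# Hypothetical (x, y) contour points from the standard and subject mammograms.
standard_contour = np.array([[0, 0], [0, 10], [3, 14], [7, 12], [9, 5]], float)
subject_contour = standard_contour * 0.9 + np.array([1.0, -0.5])

reg = AffineRegistration(X=standard_contour, Y=subject_contour)
reg.register()  # estimate the transform aligning subject to standard

# Transform a lesion annotated on the subject mammogram into standard space.
lesion = np.array([[4.0, 6.0]])
print("lesion in standard coordinates:", reg.transform_point_cloud(Y=lesion))
```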
A Tool and Application Programming Interface for Browsing Historical Geostationary Satellite Data
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Ayers, J.
2013-12-01
Providing access to information is a key concern for NASA Langley Research Center. We describe a tool and method that allows end users to easily browse and access information that is otherwise difficult to acquire and manipulate. The tool described has as its core the application-programming interface that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the enhanced imagery as an input into their own workflows. This project builds upon NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite imagery accessible and easily searchable. As we see the increasing use of virtual supply chains that provide additional value at each link, there is value in making satellite imagery available through a simple access method as well as allowing users to browse and view that imagery as they need rather than in a manner most convenient for the data provider.
Steele Gray, Carolyn; Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl
2016-02-18
Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs) who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD. This paper describes the design and development approach, specifically the process of incorporating qualitative research methods into user-centered design approaches to create the ePRO tool. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers. Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool. Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14), along with a PICK analysis (Possible, Implementable, to be Challenged, to be Killed) guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5). Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes. Interpretive descriptive methods allow for an understanding of user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs, and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers.
Constructing Benchmark Databases and Protocols for Medical Image Analysis: Diabetic Retinopathy
Kauppi, Tomi; Kämäräinen, Joni-Kristian; Kalesnykiene, Valentina; Sorri, Iiris; Uusitalo, Hannu; Kälviäinen, Heikki
2013-01-01
We address the performance evaluation practices for developing medical image analysis methods, in particular, how to establish and share databases of medical images with verified ground truth and solid evaluation protocols. Such databases support the development of better algorithms, execution of profound method comparisons, and, consequently, technology transfer from research laboratories to clinical practice. For this purpose, we propose a framework consisting of reusable methods and tools for the laborious task of constructing a benchmark database. We provide a software tool for medical image annotation helping to collect class label, spatial span, and expert's confidence on lesions and a method to appropriately combine the manual segmentations from multiple experts. The tool and all necessary functionality for method evaluation are provided as public software packages. As a case study, we utilized the framework and tools to establish the DiaRetDB1 V2.1 database for benchmarking diabetic retinopathy detection algorithms. The database contains a set of retinal images, ground truth based on information from multiple experts, and a baseline algorithm for the detection of retinopathy lesions.
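Combining manual segmentations from multiple experts can be sketched as a confidence-weighted per-pixel vote; the actual DiaRetDB1 fusion method may differ, and the masks below are simulated.

```python
# Fuse several experts' binary lesion masks by confidence-weighted voting.
# Ground truth, noise level, and confidences are simulated for illustration.
import numpy as np

rng = np.random.default_rng(3)
truth = np.zeros((64, 64), bool)
truth[20:30, 25:40] = True  # hypothetical lesion region

# Each expert: a noisy mask (5% of pixels flipped) plus a confidence in [0, 1].
experts = [(truth ^ (rng.random(truth.shape) < 0.05), conf)
           for conf in (0.9, 0.7, 0.8)]

weights = np.array([c for _, c in experts])
stack = np.stack([m for m, _ in experts]).astype(float)
consensus = (weights[:, None, None] * stack).sum(axis=0) / weights.sum() >= 0.5

print(f"consensus agrees with truth on {(consensus == truth).mean():.1%} of pixels")
```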
Lewis, Meron; Lee, Amanda
2016-11-01
To undertake a systematic review to determine similarities and differences in metrics and results between recently and/or currently used tools, protocols and methods for monitoring Australian healthy food prices and affordability. Electronic databases of peer-reviewed literature and online grey literature were systematically searched using the PRISMA approach for articles and reports relating to healthy food and diet price assessment tools, protocols, methods and results that utilised retail pricing. National, state, regional and local areas of Australia from 1995 to 2015. Assessment tools, protocols and methods to measure the price of 'healthy' foods and diets. The search identified fifty-nine discrete surveys of 'healthy' food pricing incorporating six major food pricing tools (those used in multiple areas and time periods) and five minor food pricing tools (those used in a single survey area or time period). Analysis demonstrated methodological differences regarding: included foods; reference households; use of availability and/or quality measures; household income sources; store sampling methods; data collection protocols; analysis methods; and results. 'Healthy' food price assessment methods used in Australia lack comparability across all metrics and most do not fully align with a 'healthy' diet as recommended by the current Australian Dietary Guidelines. None have been applied nationally. Assessment of the price, price differential and affordability of healthy (recommended) and current (unhealthy) diets would provide more robust and meaningful data to inform health and fiscal policy in Australia. The INFORMAS 'optimal' approach provides a potential framework for development of these methods.
Isoperms: An Environmental Management Tool.
ERIC Educational Resources Information Center
Sebera, Donald K.
A quantitative tool, the isoperm method, is described; it quantifies the effect of environmental factors of temperature (T) and percent relative humidity (%RH) on the anticipated useful life expectancy of paper-based collections. The isoperm method provides answers to questions of the expected lifetime of the collection under various temperature…
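The isoperm follows from an Arrhenius model of paper degradation in which the rate is proportional to relative humidity; the sketch below computes the permanence ratio, with the activation energy an assumed typical value for cellulose rather than a figure from this article.

```python
# Isoperm-style permanence ratio: expected life under new (T, %RH) conditions
# relative to a reference, from rate ~ RH * exp(-Ea / (R * T)).
import math

R = 8.314  # gas constant, J/(mol K)

def isoperm(t_new_c, rh_new, t_ref_c=20.0, rh_ref=50.0, ea=100e3):
    """Permanence ratio; > 1 means longer expected useful life than reference.

    ea is an assumed activation energy (J/mol) typical of cellulose degradation.
    """
    t_new, t_ref = t_new_c + 273.15, t_ref_c + 273.15
    return (rh_ref / rh_new) * math.exp(ea / R * (1.0 / t_new - 1.0 / t_ref))

# Cooler, drier storage extends expected lifetime substantially:
print(f"{isoperm(10.0, 30.0):.1f}x the reference life expectancy")  # ~7x
```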
Teaching science with technology: Using EPA’s EnviroAtlas in the classroom
Background/Question/Methods U.S. EPA’s EnviroAtlas provides a collection of web-based, interactive tools and resources for exploring ecosystem goods and services. EnviroAtlas contains two primary tools: An Interactive Map, which provides access to 300+ maps at multiple exte...
Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO
NASA Technical Reports Server (NTRS)
Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.
2016-01-01
A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio, φ, maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
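The pattern described, an OpenMDAO component exposing analytic partial derivatives to a gradient-based driver, can be sketched with a toy model standing in for the CEA-based analysis; the temperature function below is illustrative, not a CEA result.

```python
# OpenMDAO component with analytic derivatives, driving a gradient-based
# optimization of equivalence ratio phi. Toy model, not the CEA method.
import numpy as np
import openmdao.api as om

class CombustionTemp(om.ExplicitComponent):
    def setup(self):
        self.add_input('phi', val=0.5)      # air-fuel equivalence ratio
        self.add_output('T4', val=1000.0)   # combustion temperature, K
        self.declare_partials('T4', 'phi')  # analytic, not finite difference

    def compute(self, inputs, outputs):
        phi = inputs['phi']
        outputs['T4'] = 2200.0 * phi * np.exp(1.0 - phi)  # invented, peaks at phi = 1

    def compute_partials(self, inputs, partials):
        phi = inputs['phi']
        partials['T4', 'phi'] = 2200.0 * (1.0 - phi) * np.exp(1.0 - phi)

prob = om.Problem()
prob.model.add_subsystem('burner', CombustionTemp(), promotes=['*'])
prob.driver = om.ScipyOptimizeDriver()
prob.driver.options['optimizer'] = 'SLSQP'
prob.model.add_design_var('phi', lower=0.1, upper=2.0)
prob.model.add_objective('T4', scaler=-1.0)  # negate to maximize temperature
prob.setup()
prob.run_driver()
print(f"optimal phi = {prob.get_val('phi')[0]:.3f}")  # ~1.0 for this toy model
```

With the partials supplied analytically, the driver avoids the repeated function evaluations a finite-difference approximation would need, which is the speed advantage the abstract reports.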
A community resource benchmarking predictions of peptide binding to MHC-I molecules.
Peters, Bjoern; Bui, Huynh-Hoa; Frankild, Sune; Nielsen, Morten; Lundegaard, Claus; Kostem, Emrah; Basch, Derek; Lamberth, Kasper; Harndahl, Mikkel; Fleri, Ward; Wilson, Stephen S; Sidney, John; Lund, Ole; Buus, Soren; Sette, Alessandro
2006-06-09
Recognition of peptides bound to major histocompatibility complex (MHC) class I molecules by T lymphocytes is an essential part of immune surveillance. Each MHC allele has a characteristic peptide binding preference, which can be captured in prediction algorithms, allowing for the rapid scan of entire pathogen proteomes for peptides likely to bind MHC. Here we make public a large set of 48,828 quantitative peptide-binding affinity measurements relating to 48 different mouse, human, macaque, and chimpanzee MHC class I alleles. We use this data to establish a set of benchmark predictions with one neural network method and two matrix-based prediction methods extensively utilized in our groups. In general, the neural network outperforms the matrix-based predictions, mainly due to its ability to generalize even on a small amount of data. We also retrieved predictions from tools publicly available on the internet. While differences in the data used to generate these predictions hamper direct comparisons, we do conclude that tools based on combinatorial peptide libraries perform remarkably well. The transparent prediction evaluation on this dataset provides tool developers with a benchmark for comparison of newly developed prediction methods. In addition, to generate and evaluate our own prediction methods, we have established an easily extensible web-based prediction framework that allows automated side-by-side comparisons of prediction methods implemented by experts. This is an advance over the current practice of tool developers having to generate reference predictions themselves, which can lead to underestimating the performance of prediction methods they are not as familiar with as their own. The overall goal of this effort is to provide a transparent prediction evaluation allowing bioinformaticians to identify promising features of prediction methods and providing guidance to immunologists regarding the reliability of prediction tools.
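A matrix-based predictor of the kind referenced reduces to summing position-specific weights over a peptide; the toy 9-mer matrix below is hypothetical, not one of the published MHC matrices.

```python
# Score 9-mer peptides with a position-specific scoring matrix (PSSM).
# The weights are invented; real MHC matrices cover all 20 residues.
weights = [
    {"Y": 1.2, "F": 0.9},                # position 1: aromatic anchor preference
    {"L": 0.4}, {}, {}, {}, {}, {}, {},  # middle positions: weak/no preference
    {"V": 1.5, "L": 1.1, "I": 0.8},      # position 9: hydrophobic C-terminal anchor
]

def pssm_score(peptide):
    """Sum per-position log-odds weights; unlisted residues contribute 0."""
    assert len(peptide) == len(weights), "matrix is for 9-mers"
    return sum(col.get(aa, 0.0) for col, aa in zip(weights, peptide))

for pep in ("YLEPGPVTV", "AAAAAAAAA"):
    print(f"{pep}: score = {pssm_score(pep):.2f}")
```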
Spatial allocation of forest recreation value
Kenneth A. Baerenklau; Armando Gonzalez-Caban; Catrina Paez; Edgard Chavez
2009-01-01
Non-market valuation methods and geographic information systems are useful planning and management tools for public land managers. Recent attention has been given to investigation and demonstration of methods for combining these tools to provide spatially-explicit representations of non-market value. Most of these efforts have focused on spatial allocation of...
Automatic feed system for ultrasonic machining
Calkins, Noel C.
1994-01-01
Method and apparatus for ultrasonic machining in which feeding of a tool assembly holding a machining tool toward a workpiece is accomplished automatically. In ultrasonic machining, a tool located just above a workpiece and vibrating in a vertical direction imparts vertical movement to particles of abrasive material which then remove material from the workpiece. The tool does not contact the workpiece. Apparatus for moving the tool assembly vertically is provided such that it operates with a relatively small amount of friction. Adjustable counterbalance means is provided which allows the tool to be immobilized in its vertical travel. A downward force, termed overbalance force, is applied to the tool assembly. The overbalance force causes the tool to move toward the workpiece as material is removed from the workpiece.
Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl
2016-01-01
Background Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs) who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD. Objective This paper describes the design and development approach, specifically the process of incorporating qualitative research methods into user-centered design approaches to create the ePRO tool. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers. Methods Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool. Results Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14) along with a PICK analysis—Possible, Implementable, (to be) Challenged, (to be) Killed—guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5). Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes. Conclusions Interpretive descriptive methods allow for an understanding of user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs, and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers. PMID:26892952
Developing and using a rubric for evaluating evidence-based medicine point-of-care tools.
Shurtz, Suzanne; Foster, Margaret J
2011-07-01
The research sought to establish a rubric for evaluating evidence-based medicine (EBM) point-of-care tools in a health sciences library. The authors searched the literature for EBM tool evaluations and found that most previous reviews were designed to evaluate the ability of an EBM tool to answer a clinical question. The researchers' goal was to develop and complete rubrics for assessing these tools based on criteria for a general evaluation of tools (reviewing content, search options, quality control, and grading) and criteria for an evaluation of clinical summaries (searching tools for treatments of common diagnoses and evaluating summaries for quality control). Differences between EBM tools' options, content coverage, and usability were minimal. However, the products' methods for locating and grading evidence varied widely in transparency and process. As EBM tools are constantly updating and evolving, evaluation of these tools needs to be conducted frequently. Standards for evaluating EBM tools need to be established, with one method being the use of objective rubrics. In addition, EBM tools need to provide more information about authorship, reviewers, methods for evidence collection, and grading system employed.
Method and tool to reverse the charges in anti-reflection films used for solar cell applications
Sharma, Vivek; Tracy, Clarence
2017-01-31
A method is provided for making a solar cell. The method includes providing a stack including a substrate, a barrier layer disposed on the substrate, and an anti-reflective layer disposed on the barrier layer, where the anti-reflective layer has charge centers. The method also includes generating a corona with a charging tool and contacting the anti-reflective layer with the corona thereby injecting charge into at least some of the charge centers in the anti-reflective layer. Ultra-violet illumination and temperature-based annealing may be used to modify the charge of the anti-reflective layer.
Assembling a Case Study Tool Kit: 10 Tools for Teaching with Cases
ERIC Educational Resources Information Center
Prud'homme-Généreux, Annie
2017-01-01
This column provides original articles on innovations in case study teaching, assessment of the method, as well as case studies with teaching notes. The author shares the strategies and tools that teachers can use to manage a case study classroom effectively.
Dynamic visualization techniques for high consequence software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pollock, G.M.
1998-02-01
This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.
A large-scale benchmark of gene prioritization methods.
Guala, Dimitri; Sonnhammer, Erik L L
2017-04-21
In order to maximize the use of results from high-throughput experimental studies, e.g. GWAS, for identification and diagnostics of new disease-associated genes, it is important to have properly analyzed and benchmarked gene prioritization tools. While prospective benchmarks are underpowered to provide statistically significant results in their attempt to differentiate the performance of gene prioritization tools, a strategy for retrospective benchmarking has been missing, and new tools usually only provide internal validations. The Gene Ontology (GO) contains genes clustered around annotation terms. This intrinsic property of GO can be utilized in construction of robust benchmarks that are objective with respect to the problem domain. We demonstrate how this can be achieved for network-based gene prioritization tools, utilizing the FunCoup network. We use cross-validation and a set of appropriate performance measures to compare state-of-the-art gene prioritization algorithms: three based on network diffusion, NetRank and two implementations of Random Walk with Restart, and MaxLink that utilizes network neighborhood. Our benchmark suite provides a systematic and objective way to compare the multitude of available and future gene prioritization tools, enabling researchers to select the best gene prioritization tool for the task at hand, and helping to guide the development of more accurate methods.
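As an illustration of the network-diffusion family benchmarked here, below is a generic Random Walk with Restart sketch on an invented toy network; it is not the FunCoup network or any of the benchmarked implementations.

    # Generic Random Walk with Restart on a toy gene network (illustrative;
    # the network and seed gene are invented).
    import numpy as np

    # Symmetric adjacency matrix of a 5-gene toy network.
    A = np.array([
        [0, 1, 1, 0, 0],
        [1, 0, 1, 1, 0],
        [1, 1, 0, 0, 0],
        [0, 1, 0, 0, 1],
        [0, 0, 0, 1, 0],
    ], dtype=float)

    W = A / A.sum(axis=0)              # column-stochastic transition matrix
    restart = 0.3                      # restart probability (typical range 0.1-0.5)
    p0 = np.array([1.0, 0, 0, 0, 0])   # seed: known disease gene at index 0

    p = p0.copy()
    for _ in range(100):               # iterate to approximate stationarity
        p_next = (1 - restart) * W @ p + restart * p0
        if np.abs(p_next - p).sum() < 1e-10:
            break
        p = p_next

    # Candidate genes ranked by steady-state visiting probability.
    print(np.argsort(-p))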
Ballermann, Mark; Shaw, Nicola T; Mayes, Damon C; Gibney, R T Noel
2011-01-01
Electronic documentation methods may assist critical care providers with information management tasks in Intensive Care Units (ICUs). We conducted a quasi-experimental observational study to investigate patterns of information tool use by ICU physicians, nurses, and respiratory therapists during verbal communication tasks. Critical care providers used tools less at 3 months after the introduction of a critical care information system (CCIS). At 12 months, care providers referred to paper and permanent records, especially during shift changes. The results suggest potential areas of improvement for clinical information systems in assisting critical care providers in ensuring informational continuity around their patients.
NASA Astrophysics Data System (ADS)
Tan, Samantha H.; Chen, Ning; Liu, Shi; Wang, Kefei
2003-09-01
As part of the semiconductor industry "contamination-free manufacturing" effort, significant emphasis has been placed on reducing potential sources of contamination from process equipment and process equipment components. Process tools contain process chambers and components that are exposed to the process environment or process chemistry and in some cases are in direct contact with production wafers. Any contamination from these sources must be controlled or eliminated in order to maintain high process yields, device performance, and device reliability. This paper discusses new nondestructive analytical methods for quantitative measurement of the cleanliness of metal, quartz, polysilicon and ceramic components that are used in process equipment tools. The goal of these new procedures is to measure the effectiveness of cleaning procedures and to verify whether a tool component part is sufficiently clean for installation and subsequent routine use in the manufacturing line. These procedures provide a reliable "qualification method" for tool component certification and also provide a routine quality control method for reliable operation of cleaning facilities. Cost advantages to wafer manufacturing include higher yields due to improved process cleanliness and elimination of yield loss and downtime resulting from the installation of "bad" components in process tools. We also discuss a representative example of wafer contamination having been linked to a specific process tool component.
OPTHYLIC: An Optimised Tool for Hybrid Limits Computation
NASA Astrophysics Data System (ADS)
Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée
2018-05-01
A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
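A toy illustration of a hybrid frequentist-Bayesian CLs computation for a single-bin counting experiment follows; this is a minimal sketch, not the OPTHYLIC implementation, and the yields and uncertainty below are invented.

    # Toy CLs for a single-bin counting experiment. The background
    # uncertainty is marginalized by sampling, which is the "hybrid"
    # Bayesian ingredient; all numbers are invented.
    import numpy as np

    rng = np.random.default_rng(0)

    def cls(n_obs, s, b, b_sigma, n_toys=100_000):
        # Sample the background rate from a truncated Gaussian prior.
        b_toys = np.clip(rng.normal(b, b_sigma, n_toys), 0, None)
        n_sb = rng.poisson(s + b_toys)     # signal+background pseudo-experiments
        n_b = rng.poisson(b_toys)          # background-only pseudo-experiments
        p_sb = np.mean(n_sb <= n_obs)      # CL_{s+b}
        p_b = np.mean(n_b <= n_obs)        # CL_b
        return p_sb / p_b if p_b > 0 else 0.0

    # Scan the signal rate until CLs drops below 0.05 -> 95% CL upper limit.
    n_obs, b, b_sigma = 3, 2.5, 0.5
    for s in np.arange(0.1, 10, 0.1):
        if cls(n_obs, s, b, b_sigma) < 0.05:
            print(f"95% CL upper limit on signal rate: ~{s:.1f} events")
            break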
NASA Astrophysics Data System (ADS)
Kozhina, T. D.; Kurochkin, A. V.
2016-04-01
The paper presents results of investigative tests of GTE compressor Ti-alloy blades machined by electrochemical machining (ECM) with oscillating tool-electrodes, carried out to define the optimal ECM process parameters that attain the blade quality specified in the design documentation while providing maximal performance. The new technological methods suggested by the test results, in particular the application of vibrating tool-electrodes and the use of locating elements made of high-strength materials, significantly extend the capabilities of this method.
Tools4miRs – one place to gather all the tools for miRNA analysis
Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr
2016-01-01
Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626
Kristensen, Finn Børlum; Lampe, Kristian; Chase, Deborah L; Lee-Robin, Sun Hae; Wild, Claudia; Moharra, Montse; Garrido, Marcial Velasco; Nielsen, Camilla Palmhøj; Røttingen, John-Arne; Neikter, Susanna Allgurin; Bistrup, Marie Louise
2009-12-01
This article presents an overview of the practical methods and tools to support transnational Health Technology Assessment (HTA) that were developed and pilot tested by the European network for HTA (EUnetHTA), which involved a total of sixty-four Partner organizations. The methods differ according to scope and purpose of each of the tools developed. They included, for example, literature reviews, surveys, Delphi and consensus methods, workshops, pilot tests, and internal/public consultation. Practical results include an HTA Core Model and a Handbook on the use of the model, two pilot examples of HTA core information, an HTA Adaptation Toolkit for taking existing reports into new settings, a book about HTA and health policy making in Europe, a newsletter providing structured information about emerging/new technologies, an interactive Web-based tool to share information about monitoring activities for emerging/new technologies, and a Handbook on HTA capacity building for Member States with limited institutionalization of HTA. The tools provide high-quality information and methodological frameworks for HTA that facilitate preparation of HTA documentation, and sharing of information in and across national or regional systems. The tools will be used and further tested by partners in the EUnetHTA Collaboration aiming to (i) help reduce unnecessary duplication of HTA activities, (ii) develop and promote good practice in HTA methods and processes, (iii) share what can be shared, (iv) facilitate local adaptation of HTA information, (v) improve the links between health policy and HTA.
Googling DNA sequences on the World Wide Web.
Hajibabaei, Mehrdad; Singer, Gregory A C
2009-11-10
New web-based technologies provide an excellent opportunity for sharing and accessing information and using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm and implemented it for searching species-specific genomic sequences, DNA barcodes, by using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm that divides a sequence library (DNA barcodes) and a query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages. We developed pre- and post-processing software to provide customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data. It provides a robust and efficient solution for sequence search on the web. The integration of our search method for large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
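A minimal sketch of the word-based idea appears below; the word length and matching rule are invented for illustration, and the actual packages delegate the search itself to tools such as Google Desktop Search.

    # Sketch of an alignment-free, word-based representation of DNA barcodes
    # (illustrative; the word length and matching rule are invented).
    def to_words(seq, k=8):
        """Split a sequence into non-overlapping k-letter words."""
        seq = seq.upper().replace("\n", "")
        return [seq[i:i + k] for i in range(0, len(seq) - k + 1, k)]

    def shared_word_fraction(query, reference, k=8):
        """Fraction of query words found in the reference's word set."""
        q, r = to_words(query, k), set(to_words(reference, k))
        return sum(w in r for w in q) / len(q) if q else 0.0

    barcode = "ACCTTGGCAATTGGCCAAGGTTCCAAGGTTAA"
    query = "ACCTTGGCAATTGGCCAAGGTTCCAAGGTTAA"
    print(shared_word_fraction(query, barcode))  # 1.0 for an exact match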
Chrimes, Dillon; Kushniruk, Andre; Kitos, Nicole R.
2014-01-01
Purpose Usability testing can be used to evaluate human computer interaction (HCI) and communication in shared decision making (SDM) for patient-provider behavioral change and behavioral contracting. Traditional evaluations of usability using scripted or mock patient scenarios with think-aloud protocol analysis provide a to identify HCI issues. In this paper we describe the application of these methods in the evaluation of the Avoiding Diabetes Thru Action Plan Targeting (ADAPT) tool, and test the usability of the tool to support the ADAPT framework for integrated care counseling of pre-diabetes. The think-aloud protocol analysis typically does not provide an assessment of how patient-provider interactions are effected in “live” clinical workflow or whether a tool is successful. Therefore, “Near-live” clinical simulations involving applied simulation methods were used to compliment the think-aloud results. This complementary usability technique was used to test the end-user HCI and tool performance by more closely mimicking the clinical workflow and capturing interaction sequences along with assessing the functionality of computer module prototypes on clinician workflow. We expected this method to further complement and provide different usability findings as compared to think-aloud analysis. Together, this mixed method evaluation provided comprehensive and realistic feedback for iterative refinement of the ADAPT system prior to implementation. Methods The study employed two phases of testing of a new interactive ADAPT tool that embedded an evidence-based shared goal setting component into primary care workflow for dealing with pre-diabetes counseling within a commercial physician office electronic health record (EHR). Phase I applied usability testing that involved “think-aloud” protocol analysis of 8 primary care providers interacting with several scripted clinical scenarios. Phase II used “near-live” clinical simulations of 5 providers interacting with standardized trained patient actors enacting the clinical scenario of counseling for pre-diabetes, each of whom had a pedometer that recorded the number of steps taken over a week. In both phases, all sessions were audio-taped and motion screen-capture software was activated for onscreen recordings. Transcripts were coded using iterative qualitative content analysis methods. Results In Phase I, the impact of the components and layout of ADAPT on user’s Navigation, Understandability, and Workflow were associated with the largest volume of negative comments (i.e. approximately 80% of end-user commentary), while Usability and Content of ADAPT were representative of more positive than negative user commentary. The heuristic category of Usability had a positive-to-negative comment ratio of 2.1, reflecting positive perception of the usability of the tool, its functionality, and overall co-productive utilization of ADAPT. However, there were mixed perceptions about content (i.e., how the information was displayed, organized and described in the tool). In Phase II, the duration of patient encounters was approximately 10 minutes with all of the Patient Instructions (prescriptions) and behavioral contracting being activated at the end of each visit. Upon activation, providers accepted the pathway prescribed by the tool 100% of the time and completed all the fields in the tool in the simulation cases. Only 14% of encounter time was spent using the functionality of the ADAPT tool in terms of keystrokes and entering relevant data. 
The rest of the time was spent on communication and dialogue to populate the patient instructions. In all cases, the interaction sequence of reviewing and discussing exercise and diet of the patient was linked to the functionality of the ADAPT tool in terms of monitoring, response-efficacy, self-efficacy, and negotiation in the patient-provider dialogue. There was a change from one-way dialogue to two-way dialogue and negotiation that ended in a behavioral contract. This change demonstrated the tool’s sequence, which supported recording current exercise and diet followed by a diet and exercise goal setting procedure to reduce the risk of diabetes onset. Conclusions This study demonstrated that “think-aloud” protocol analysis with “near-live” clinical simulations provided a successful usability evaluation of a new primary care pre-diabetes shared goal setting tool. Each phase of the study provided complementary observations on problems with the new onscreen tool and was used to show the influence of the ADAPT framework on the usability, workflow integration, and communication between the patient and provider. The think-aloud tests with the provider showed the tool can be used according to the ADAPT framework (exercise-to-diet behavior change and tool utilization), while the clinical simulations revealed the ADAPT framework to realistically support patient-provider communication to obtain a behavioral change contract. SDM interactions and mechanisms affecting protocol-based care can be more completely captured by combining “near-live” clinical simulations with traditional “think-aloud” analysis, which augments clinician utilization. More analysis is required to verify whether the rich communication actions found in Phase II complement clinical workflows. PMID:24981988
New concept in brazing metallic honeycomb panels
NASA Technical Reports Server (NTRS)
Carter, P. D.; Layton, R. E.; Stratton, F. W.
1973-01-01
Aluminum oxide coating provides surface which will not be wetted by brazing alloy and which stops metallic diffusion welding of tooling materials to part being produced. This method eliminates loss of tooling materials and parts from braze wetting and allows fall-apart disassembly of tooling after brazing.
Schoville, Benjamin J; Brown, Kyle S; Harris, Jacob A; Wilkins, Jayne
2016-01-01
The Middle Stone Age (MSA) is associated with early evidence for symbolic material culture and complex technological innovations. However, one of the most visible aspects of MSA technologies are unretouched triangular stone points that appear in the archaeological record as early as 500,000 years ago in Africa and persist throughout the MSA. How these tools were being used and discarded across a changing Pleistocene landscape can provide insight into how MSA populations prioritized technological and foraging decisions. Creating inferential links between experimental and archaeological tool use helps to establish prehistoric tool function, but is complicated by the overlaying of post-depositional damage onto behaviorally worn tools. Taphonomic damage patterning can provide insight into site formation history, but may preclude behavioral interpretations of tool function. Here, multiple experimental processes that form edge damage on unretouched lithic points from taphonomic and behavioral processes are presented. These provide experimental distributions of wear on tool edges from known processes that are then quantitatively compared to the archaeological patterning of stone point edge damage from three MSA lithic assemblages-Kathu Pan 1, Pinnacle Point Cave 13B, and Die Kelders Cave 1. By using a model-fitting approach, the results presented here provide evidence for variable MSA behavioral strategies of stone point utilization on the landscape consistent with armature tips at KP1, and cutting tools at PP13B and DK1, as well as damage contributions from post-depositional sources across assemblages. This study provides a method with which landscape-scale questions of early modern human tool-use and site-use can be addressed.
Murray-Davis, Beth; McDonald, Helen; Cross-Sudworth, Fiona; Ahmed, Rashid; Simioni, Julia; Dore, Sharon; Marrin, Michael; DeSantis, Judy; Leyland, Nicholas; Gardosi, Jason; Hutton, Eileen; McDonald, Sarah
2015-08-01
Adverse events occur in up to 10% of obstetric cases, and up to one half of these could be prevented. Case reviews and root cause analysis using a structured tool may help health care providers to learn from adverse events and to identify trends and recurring systems issues. We sought to establish the reliability of a root cause analysis computer application called Standardized Clinical Outcome Review (SCOR). We designed a mixed methods study to evaluate the effectiveness of the tool. We conducted qualitative content analysis of five charts reviewed by both the traditional obstetric quality assurance methods and the SCOR tool. We also determined inter-rater reliability by having four health care providers review the same five cases using the SCOR tool. The comparative qualitative review revealed that the traditional quality assurance case review process used inconsistent language and made serious, personalized recommendations for those involved in the case. In contrast, the SCOR review provided a consistent format for recommendations, a list of action points, and highlighted systems issues. The mean percentage agreement between the four reviewers for the five cases was 75%. The different health care providers completed data entry and assessment of the case in a similar way. Missing data from the chart and poor wording of questions were identified as issues affecting percentage agreement. The SCOR tool provides a standardized, objective, obstetric-specific tool for root cause analysis that may improve identification of risk factors and dissemination of action plans to prevent future events.
Newton, Paul; Chandler, Val; Morris-Thomson, Trish; Sayer, Jane; Burke, Linda
2015-01-01
To map current selection and recruitment processes for newly qualified nurses and to explore the advantages and limitations of current selection and recruitment processes. The need to improve current selection and recruitment practices for newly qualified nurses is highlighted in health policy internationally. A cross-sectional, sequential-explanatory mixed-method design with 4 components: (1) Literature review of selection and recruitment of newly qualified nurses; (2) Literature review of a public sector profession's selection and recruitment processes; (3) Survey mapping existing selection and recruitment processes for newly qualified nurses; and (4) Qualitative study about recruiters' selection and recruitment processes. Literature searches on the selection and recruitment of newly qualified candidates in teaching and nursing (2005-2013) were conducted. Cross-sectional, mixed-method data were collected via a survey instrument from thirty-one (n = 31) individuals at health care providers in London who had responsibility for the selection and recruitment of newly qualified nurses. Of the providers who took part, six (n = 6) were purposively selected for qualitative interviews. Issues of supply and demand in the workforce, rather than selection and recruitment tools, predominated in the literature reviews. Examples of tools to measure values, attitudes and skills were found in the nursing literature. The mapping exercise found that providers used many selection and recruitment tools; some providers combined tools to streamline the process and assure the quality of candidates. Most providers had processes which addressed the issue of quality in the selection and recruitment of newly qualified nurses. The 'assessment centre model', which providers were adopting, allowed for multiple levels of assessment and streamlined recruitment. There is a need to validate the efficacy of the selection tools. © 2014 John Wiley & Sons Ltd.
Cement bond evaluation method in horizontal wells using segmented bond tool
NASA Astrophysics Data System (ADS)
Song, Ruolong; He, Li
2018-06-01
Most of the existing cement evaluation technologies suffer from tool eccentralization due to gravity in highly deviated and horizontal wells. This paper proposes a correction method to lessen the effects of tool eccentralization on cement bond evaluation results using a segmented bond tool, which has an omnidirectional sonic transmitter and eight segmented receivers evenly arranged around the tool 2 ft from the transmitter. Using a 3-D finite-difference parallel numerical simulation method, we investigate the logging responses of the centred and eccentred segmented bond tool in a variety of bond conditions. From the numerical results, we find that the tool eccentricity and channel azimuth can be estimated from the measured sector amplitudes. The average sector amplitude measured with an eccentred tool can be corrected to the value for a centred tool. The corrected amplitude is then used to calculate the channel size. The proposed method is applied to both synthetic and field data. For synthetic data, the method estimates the tool eccentricity with small error, and the bond map is improved after correction. For field data, the estimated tool eccentricity is in good agreement with the measured well deviation angle. Though the method still suffers from low accuracy in calculating the channel azimuth, the credibility of the corrected bond map is improved, especially in horizontal wells. It provides a way to evaluate the bond condition in horizontal wells using an existing logging tool. The numerical results in this paper can aid the understanding of segmented-tool measurements in both vertical and horizontal wells.
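As a rough illustration of how eccentricity might be inferred from the eight sector amplitudes, the sketch below fits the first azimuthal harmonic of invented amplitude data; the paper's actual correction is derived from the 3-D finite-difference modeling, so this is only a plausible stand-in.

    # Illustrative first-harmonic analysis of eight sector amplitudes
    # (a plausible way to estimate eccentricity azimuth and strength;
    # the amplitudes are invented and this is not the paper's correction).
    import numpy as np

    amps = np.array([52, 55, 60, 63, 61, 57, 53, 50], dtype=float)  # mV, invented
    az = np.deg2rad(np.arange(8) * 45.0)          # receiver azimuths

    mean_amp = amps.mean()
    c1 = (2 / 8) * np.sum(amps * np.cos(az))      # first-harmonic coefficients
    s1 = (2 / 8) * np.sum(amps * np.sin(az))
    ecc_azimuth = np.rad2deg(np.arctan2(s1, c1)) % 360
    ecc_strength = np.hypot(c1, s1) / mean_amp    # relative modulation depth

    print(f"mean amplitude {mean_amp:.1f} mV, "
          f"eccentricity azimuth ~{ecc_azimuth:.0f} deg, "
          f"relative modulation {ecc_strength:.2f}")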
TACIT: An open-source text analysis, crawling, and interpretation tool.
Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra
2017-04-01
As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.
Tolle, Charles R [Idaho Falls, ID; Clark, Denis E [Idaho Falls, ID; Smartt, Herschel B [Idaho Falls, ID; Miller, Karen S [Idaho Falls, ID
2009-10-06
A material-forming tool and a method for forming a material are described, including a shank portion; a shoulder portion that releasably engages the shank portion; a pin that releasably engages the shoulder portion, wherein the pin defines a passageway; and a source of a material coupled in material-flowing relation relative to the pin. The material-forming tool is utilized in methodology that includes providing a first material; providing a second material and placing the second material into contact with the first material; and locally plastically deforming the first material with the material-forming tool so as to mix the first material and second material together to form a resulting material having characteristics different from the respective first and second materials.
NASA Astrophysics Data System (ADS)
Mueller, David S.
2013-04-01
Selection of the appropriate extrapolation methods for computing the discharge in the unmeasured top and bottom parts of a moving-boat acoustic Doppler current profiler (ADCP) streamflow measurement is critical to the total discharge computation. The software tool, extrap, combines normalized velocity profiles from the entire cross section and multiple transects to determine a mean profile for the measurement. The use of an exponent derived from normalized data from the entire cross section is shown to be valid for application of the power velocity distribution law in the computation of the unmeasured discharge in a cross section. Selected statistics are combined with empirically derived criteria to automatically select the appropriate extrapolation methods. A graphical user interface (GUI) provides the user tools to visually evaluate the automatically selected extrapolation methods and manually change them, as necessary. The sensitivity of the total discharge to available extrapolation methods is presented in the GUI. Use of extrap by field hydrographers has demonstrated that extrap is a more accurate and efficient method of determining the appropriate extrapolation methods compared with tools currently (2012) provided in the ADCP manufacturers' software.
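The power velocity distribution law underlying this approach can be sketched as follows; the bin heights, velocities, and recovered exponent below are synthetic, and the real tool fits normalized data composited from multiple transects.

    # Sketch of power-law extrapolation of unmeasured discharge (the general
    # approach behind extrap; the bin data below are invented).
    import numpy as np

    depth = 10.0                                  # total depth, m
    z = np.array([2, 3, 4, 5, 6, 7, 8])           # bin heights above bed, m
    u = 1.2 * (z / depth) ** (1 / 6)              # "measured" velocities, m/s

    # Fit u = a * z^b by linear regression in log-log space.
    b, log_a = np.polyfit(np.log(z), np.log(u), 1)
    a = np.exp(log_a)

    # Unit-width discharge in a layer is the integral of a*z^b dz.
    def layer_q(z1, z2):
        return a * (z2 ** (b + 1) - z1 ** (b + 1)) / (b + 1)

    q_bottom = layer_q(0.0, z.min())              # unmeasured bottom layer
    q_top = layer_q(z.max(), depth)               # unmeasured top layer
    print(f"exponent b = {b:.3f}, bottom q = {q_bottom:.2f}, top q = {q_top:.2f} m^2/s")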
Guidelines for reporting and using prediction tools for genetic variation analysis.
Vihinen, Mauno
2013-02-01
Computational prediction methods are widely used for the analysis of human genome sequence variants and their effects on gene/protein function, splice site aberration, pathogenicity, and disease risk. New methods are frequently developed. We believe that guidelines are essential for those writing articles about new prediction methods, as well as for those applying these tools in their research, so that the necessary details are reported. This will enable readers to gain the full picture of technical information, performance, and interpretation of results, and to facilitate comparisons of related methods. Here, we provide instructions on how to describe new methods, report datasets, and assess the performance of predictive tools. We also discuss what details of predictor implementation are essential for authors to understand. Similarly, these guidelines for the use of predictors provide instructions on what needs to be delineated in the text, as well as how researchers can avoid unwarranted conclusions. They are applicable to most prediction methods currently utilized. By applying these guidelines, authors will help reviewers, editors, and readers to more fully comprehend prediction methods and their use. © 2012 Wiley Periodicals, Inc.
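For binary predictors, the kind of performance reporting such guidelines call for can be illustrated with a short sketch; the confusion-matrix counts are invented.

    # Standard performance measures for a binary variant-effect predictor
    # (counts are invented for illustration).
    import math

    tp, fn, tn, fp = 120, 30, 200, 50   # invented confusion-matrix counts

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)                # positive predictive value
    npv = tn / (tn + fn)                # negative predictive value
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))

    for name, value in [("sensitivity", sensitivity), ("specificity", specificity),
                        ("PPV", ppv), ("NPV", npv),
                        ("accuracy", accuracy), ("MCC", mcc)]:
        print(f"{name}: {value:.3f}")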
Logging-while-coring method and apparatus
Goldberg, David S.; Myers, Gregory J.
2007-11-13
A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time-saving and unique scientific advantages.
Logging-while-coring method and apparatus
Goldberg, David S.; Myers, Gregory J.
2007-01-30
A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time-saving and unique scientific advantages.
Software development environments: Status and trends
NASA Technical Reports Server (NTRS)
Duffel, Larry E.
1988-01-01
Currently software engineers are the essential integrating factors tying several components together. The components consist of process, methods, computers, tools, support environments, and software engineers. The engineers today empower the tools versus the tools empowering the engineers. Some of the issues in software engineering are quality, managing the software engineering process, and productivity. A strategy to accomplish this is to promote the evolution of software engineering from an ad hoc, labor intensive activity to a managed, technology supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment and educating the personnel.
Ingrassia, Pier Luigi; Prato, Federico; Geddo, Alessandro; Colombo, Davide; Tengattini, Marco; Calligaro, Sara; La Mura, Fabrizio; Franc, Jeffrey Michael; Della Corte, Francesco
2010-11-01
Functional exercises represent an important link between disaster planning and disaster response. Although these exercises are widely performed, no standardized method exists for their evaluation. To describe a simple and objective method to assess medical performance during functional exercise events, an evaluation tool comprising three data fields (triage, clinical maneuvers, and radio usage), accompanied by direct anecdotal observational methods, was used to evaluate a large functional mass casualty incident exercise. Seventeen medical responders managed 112 victims of a simulated building explosion. Although 81% of the patients were assigned the appropriate triage codes, evacuation from the site did not follow triage priority. Required maneuvers were performed correctly in 85.2% of airway maneuvers and 78.7% of breathing maneuvers; however, significant under-treatment occurred, possibly due to equipment shortages. Extensive use of radio communication was documented. In evaluating this tool, the structured markers were informative, but further information provided by direct observation was invaluable. A three-part tool (triage, medical maneuvers, and radio usage) can provide a method to evaluate functional mass casualty incident exercises, and is easily implemented. For the best results, it should be used in conjunction with direct observation. The evaluation tool has great potential as a reproducible and internationally recognized tool for evaluating disaster management exercises. Copyright © 2010 Elsevier Inc. All rights reserved.
Topography measurements and applications in ballistics and tool mark identifications
Vorburger, T V; Song, J; Petraco, N
2016-01-01
The application of surface topography measurement methods to the field of firearm and toolmark analysis is fairly new. The field has been boosted by the development of a number of competing optical methods, which has improved the speed and accuracy of surface topography acquisitions. We describe here some of these measurement methods as well as several analytical methods for assessing similarities and differences among pairs of surfaces. We also provide a few examples of research results to identify cartridge cases originating from the same firearm or tool marks produced by the same tool. Physical standards and issues of traceability are also discussed. PMID:27182440
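A simplified example of the similarity analysis is sketched below: the maximum of the normalized cross-correlation between two surface profiles, computed over relative shifts. The profiles are synthetic, and the circular shift is a simplification; actual toolmark comparisons use areal topographies and more careful windowing.

    # Maximum normalized cross-correlation between two profiles (synthetic
    # data; illustrative of the family of similarity measures used, not a
    # specific published implementation).
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(0, 4 * np.pi, 400)
    profile_a = np.sin(3 * x) + 0.1 * rng.normal(size=x.size)
    profile_b = np.roll(profile_a, 25) + 0.1 * rng.normal(size=x.size)  # shifted copy

    def max_ccf(a, b, max_lag=60):
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        best = -1.0
        for lag in range(-max_lag, max_lag + 1):
            shifted = np.roll(b, lag)    # circular shift, for simplicity
            best = max(best, float(np.mean(a * shifted)))
        return best

    print(f"maximum cross-correlation: {max_ccf(profile_a, profile_b):.2f}")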
GEsture: an online hand-drawing tool for gene expression pattern search.
Wang, Chunyan; Xu, Yiqing; Wang, Xuelin; Zhang, Li; Wei, Suyun; Ye, Qiaolin; Zhu, Youxiang; Yin, Hengfu; Nainwal, Manoj; Tanon-Reyes, Luis; Cheng, Feng; Yin, Tongming; Ye, Ning
2018-01-01
Gene expression profiling data provide useful information for the investigation of biological function and process. However, identifying a specific expression pattern from extensive time series gene expression data is not an easy task. Clustering, a popular method, is often used to classify genes with similar expression; however, genes with a 'desirable' or 'user-defined' pattern cannot be efficiently detected by clustering methods. To address these limitations, we developed an online tool called GEsture. Users can draw or graph a curve using a mouse instead of inputting abstract parameters of clustering methods. GEsture explores genes showing similar, opposite and time-delayed expression patterns in time series datasets, taking a gene expression curve as input. We present three examples that illustrate the capacity of GEsture for gene hunting while following users' requirements. GEsture also provides visualization tools (such as expression pattern figures, heat maps and correlation networks) to display the search results. The outputs may provide useful information for researchers to understand the targets, function and biological processes of the involved genes.
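A minimal sketch of how similar, opposite, and time-delayed patterns can be scored against a drawn curve using correlation follows (invented data; not GEsture's actual algorithm):

    # Scoring genes against a user-drawn query curve with plain and lagged
    # Pearson correlation (toy data; illustrative only).
    import numpy as np

    t = np.arange(8)
    drawn = np.sin(t / 2.0)                      # user-drawn query curve
    genes = {
        "similar": np.sin(t / 2.0) + 0.05 * np.cos(t),
        "opposite": -np.sin(t / 2.0),
        "delayed": np.sin((t - 2) / 2.0),
    }

    def corr(a, b):
        return float(np.corrcoef(a, b)[0, 1])

    for name, series in genes.items():
        same = corr(drawn, series)
        # best lagged correlation over small positive delays
        lagged = max(corr(drawn[:-lag], series[lag:]) for lag in (1, 2, 3))
        print(f"{name}: r = {same:.2f}, best lagged r = {lagged:.2f}")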
Fan Noise Prediction with Applications to Aircraft System Noise Assessment
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Envia, Edmane; Burley, Casey L.
2009-01-01
This paper describes an assessment of current fan noise prediction tools by comparing measured and predicted sideline acoustic levels from a benchmark fan noise wind tunnel test. Specifically, an empirical method and newly developed coupled computational approach are utilized to predict aft fan noise for a benchmark test configuration. Comparisons with sideline noise measurements are performed to assess the relative merits of the two approaches. The study identifies issues entailed in coupling the source and propagation codes, as well as provides insight into the capabilities of the tools in predicting the fan noise source and subsequent propagation and radiation. In contrast to the empirical method, the new coupled computational approach provides the ability to investigate acoustic near-field effects. The potential benefits/costs of these new methods are also compared with the existing capabilities in a current aircraft noise system prediction tool. The knowledge gained in this work provides a basis for improved fan source specification in overall aircraft system noise studies.
Bolstering Teaching through Online Tools
ERIC Educational Resources Information Center
Singh, Anil; Mangalaraj, George; Taneja, Aakash
2010-01-01
This paper offers a compilation of technologies that provides either free or low-cost solutions to the challenges of teaching online courses. It presents various teaching methods the outlined tools and technologies can support, with emphasis on fit between these tools and the tasks they are meant to serve. In addition, it highlights various…
Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A
1995-06-01
PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.
ERIC Educational Resources Information Center
Upitis, Rena; Brook, Julia
2017-01-01
Even though there are demonstrated benefits of using online tools to support student musicians, there is a persistent challenge of providing sufficient and effective professional development for independent music teachers to use such tools successfully. This paper describes several methods for helping teachers use an online tool called iSCORE,…
Development, standardization, and validation of analytical methods provides state-of-the-science techniques to evaluate the presence, or absence, of select PPCPs in biosolids. This research provides the approaches, methods, and tools to assess the exposures and redu...
Ekirapa-Kiracho, Elizabeth; Ghosh, Upasona; Brahmachari, Rittika; Paina, Ligia
2017-12-28
Effective stakeholder engagement in research and implementation is important for improving the development and implementation of policies and programmes. A varied number of tools have been employed for stakeholder engagement. In this paper, we discuss two participatory methods for engaging with stakeholders - participatory social network analysis (PSNA) and participatory impact pathways analysis (PIPA). Based on our experience, we derive lessons about when and how to apply these tools. This paper was informed by a review of project reports and documents in addition to reflection meetings with the researchers who applied the tools. These reports were synthesised and used to make thick descriptions of the applications of the methods while highlighting key lessons. PSNA and PIPA both allowed a deep understanding of how the system actors are interconnected and how they influence maternal health and maternal healthcare services. The findings from the PSNA provided guidance on how stakeholders of a health system are interconnected and how they can stimulate more positive interaction between the stakeholders by exposing existing gaps. The PIPA meeting enabled the participants to envision how they could expand their networks and resources by mentally thinking about the contributions that they could make to the project. The processes that were considered critical for successful application of the tools and achievement of outcomes included training of facilitators, language used during the facilitation, the number of times the tool is applied, length of the tools, pretesting of the tools, and use of quantitative and qualitative methods. Whereas both tools allowed the identification of stakeholders and provided a deeper understanding of the type of networks and dynamics within the network, PIPA had a higher potential for promoting collaboration between stakeholders, likely due to allowing interaction between them. Additionally, it was implemented within a participatory action research project. PIPA also allowed participatory evaluation of the project from the perspective of the community. This paper provides lessons about the use of these participatory tools.
Volumetric Verification of Multiaxis Machine Tool Using Laser Tracker
Aguilar, Juan José
2014-01-01
This paper presents a method of volumetric verification for machine tools with linear and rotary axes using a laser tracker. Beyond a method for a particular machine, it presents a methodology that can be used for any machine type. The paper presents the schema and kinematic model of a machine with three axes of movement, two linear and one rotary, including the measurement system and the nominal rotation matrix of the rotary axis. Using this model, the machine tool volumetric error is obtained, and nonlinear optimization techniques are employed to improve the accuracy of the machine tool. The verification provides mathematical, rather than physical, compensation, in less time than other verification methods, by means of the indirect measurement of the geometric errors of the machine from the linear and rotary axes. The paper presents an extensive study of the appropriateness and drawbacks of the regression function employed, depending on the types of movement of the axes of any machine. In the same way, strengths and weaknesses of measurement methods and optimization techniques are presented, depending on the space available to place the measurement system. These studies provide the most appropriate strategies to verify each machine tool, taking into consideration its configuration and its available work space. PMID:25202744
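A toy version of the identification step is sketched below: fitting a few geometric error parameters to measured-versus-nominal positions by least squares. Here the model is linear (scale and squareness errors only) and the data are synthetic; the paper's method uses a full kinematic model with rotary axes and nonlinear optimization.

    # Least-squares identification of simple geometric errors from measured
    # vs. nominal XY positions (linear toy model with synthetic data).
    import numpy as np

    rng = np.random.default_rng(2)
    nominal = rng.uniform(0, 500, size=(50, 2))          # commanded X,Y (mm)
    X, Y = nominal[:, 0], nominal[:, 1]

    # Synthetic "measured" data: scale errors plus an XY squareness error.
    sx, sy, sq = 1e-4, -2e-4, 5e-5
    measured = nominal.copy()
    measured[:, 0] += sx * X + sq * Y
    measured[:, 1] += sy * Y
    measured += rng.normal(0, 0.001, measured.shape)     # tracker noise (mm)

    # Stack both residual equations: rx = sx*X + sq*Y, ry = sy*Y.
    zeros = np.zeros_like(X)
    M = np.vstack([np.column_stack([X, zeros, Y]),
                   np.column_stack([zeros, Y, zeros])])
    r = np.concatenate([measured[:, 0] - X, measured[:, 1] - Y])
    est, *_ = np.linalg.lstsq(M, r, rcond=None)
    print("estimated [sx, sy, squareness]:", est)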
Ocean Surface Topography Data Products and Tools
NASA Technical Reports Server (NTRS)
Case, Kelley E.; Bingham, Andrew W.; Berwin, Robert W.; Rigor, Eric M.; Raskin, Robert G.
2004-01-01
The Physical Oceanography Distributed Active Archive Center (PO.DAAC), NASA's primary data center for archiving and distributing oceanographic data, is supporting the Jason and TOPEX/Poseidon satellite tandem missions by providing a variety of data products, tools, and distribution methods to the wider scientific and general community. PO.DAAC has developed several new data products for sea level residual measurements, providing a long-term climate data record from 1992 to the present. These products provide compatible measurements of sea level residuals for the entire time series, including the tandem TOPEX/Poseidon and Jason missions. Several data distribution tools are available from NASA PO.DAAC. The Near-Real-Time Image Distribution Server (NEREIDS) provides quick-look browse images and binary data files. The PO.DAAC Ocean ESIP Tool (POET) provides interactive, on-line data subsetting and visualization for several altimetry data products.
Design and Analysis Tools for Supersonic Inlets
NASA Technical Reports Server (NTRS)
Slater, John W.; Folk, Thomas C.
2009-01-01
Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.
Survey of Non-Rigid Registration Tools in Medicine.
Keszei, András P; Berkels, Benjamin; Deserno, Thomas M
2017-02-01
We catalogue available software solutions for non-rigid image registration to support scientists in selecting suitable tools for specific medical registration purposes. Registration tools were identified using non-systematic searches in PubMed, Web of Science, IEEE Xplore® Digital Library, and Google Scholar, and through references in identified sources (n = 22). Exclusions were due to unavailability or inappropriateness. The remaining (n = 18) tools were classified by (i) access and technology, (ii) interfaces and application, (iii) living community, (iv) supported file formats, and (v) types of registration methodologies, emphasizing the similarity measures implemented. Out of the 18 tools, (i) 12 are open source, 8 are released under a permissive free license, which imposes the least restrictions on the use and further development of the tool, and 8 provide graphical processing unit (GPU) support; (ii) 7 are built on software platforms and 5 were developed for brain image registration; (iii) 6 are under active development but only 3 have had their last update in 2015 or 2016; (iv) 16 support the Analyze format, while 7 file formats can be read with only one of the tools; and (v) 6 provide multiple registration methods and 6 provide landmark-based registration methods. Based on open source status, licensing, GPU support, active community, supported file formats, algorithms, and similarity measures, the tools elastix and Plastimatch are chosen for the ITK platform and for platform-independent use, respectively. Researchers in medical image analysis already have a large choice of registration tools freely available. However, the most recently published algorithms may not yet be included in the tools.
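Two of the similarity measures that recur across these tools can be illustrated in a few lines (toy 2-D images; real tools operate on 3-D volumes with many more options):

    # Two common registration similarity measures on toy images
    # (illustrative; real tools work on full 3-D volumes).
    import numpy as np

    rng = np.random.default_rng(3)
    fixed = rng.random((32, 32))
    moving = fixed + 0.05 * rng.normal(size=fixed.shape)  # slightly perturbed copy

    def ssd(a, b):
        """Sum of squared differences: lower is better."""
        return float(np.sum((a - b) ** 2))

    def ncc(a, b):
        """Normalized cross-correlation: closer to 1 is better."""
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return float(np.mean(a * b))

    print(f"SSD = {ssd(fixed, moving):.2f}, NCC = {ncc(fixed, moving):.3f}")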
Held, Rachel Forster; Santos, Susan; Marki, Michelle; Helmer, Drew
2016-09-02
We developed and disseminated an educational DVD to introduce U.S. Veterans to independently practiced complementary and alternative medicine (CAM) techniques and encourage CAM experimentation. The project's goal was to determine optimal dissemination methods to facilitate implementation within the Veterans Health Administration. In the first phase, the DVD was disseminated using four methods: passive, provider-mediated, active, and peer-mediated. In the second, implementation phase, "champion" providers who supported CAM integrated dissemination into clinical practice. Qualitative data came from Veteran focus groups and semi-structured provider interviews. Data from both phases were triangulated to identify common themes. Effective dissemination requires engaging patients. Providers who most successfully integrated the DVD into practice already had CAM knowledge, and worked in settings where CAM was accepted clinical practice, or with leadership or infrastructure that supported a culture of CAM use. Institutional buy-in allowed for provider networking and effective implementation of the tool. Providers were given autonomy to determine the most appropriate dissemination strategies, which increased enthusiasm and use. Many of the lessons learned from this project can be applied to the dissemination of any new educational tool within a healthcare setting. Results reiterate the importance of utilizing best practices for introducing educational tools within the healthcare context and the need for thoughtful, multi-faceted dissemination strategies.
A visual training tool for the Photoload sampling technique
Violet J. Holley; Robert E. Keane
2010-01-01
This visual training aid is designed to provide Photoload users a tool to increase the accuracy of fuel loading estimations when using the Photoload technique. The Photoload Sampling Technique (RMRS-GTR-190) provides fire managers a sampling method for obtaining consistent, accurate, inexpensive, and quick estimates of fuel loading. It is designed to require only one...
NASA EEE Parts and Advanced Interconnect Program (AIP)
NASA Technical Reports Server (NTRS)
Gindorf, T.; Garrison, A.
1996-01-01
From the program objectives: I. Accelerate the readiness of new technologies through development of validation, assessment, and test methods/tools. II. Provide NASA projects with infusion paths for emerging technologies. III. Provide NASA projects with technology selection, application, and validation guidelines for hardware and processes. IV. Disseminate quality assurance, reliability, validation, tools, and availability information to the NASA community.
Chrimes, Dillon; Kitos, Nicole R; Kushniruk, Andre; Mann, Devin M
2014-09-01
Usability testing can be used to evaluate human-computer interaction (HCI) and communication in shared decision making (SDM) for patient-provider behavioral change and behavioral contracting. Traditional evaluations of usability using scripted or mock patient scenarios with think-aloud protocol analysis provide a way to identify HCI issues. In this paper we describe the application of these methods in the evaluation of the Avoiding Diabetes Thru Action Plan Targeting (ADAPT) tool, and test the usability of the tool to support the ADAPT framework for integrated care counseling of pre-diabetes. Think-aloud protocol analysis typically does not assess how patient-provider interactions are affected in "live" clinical workflow or whether a tool is successful. Therefore, "near-live" clinical simulations involving applied simulation methods were used to complement the think-aloud results. This complementary usability technique tested end-user HCI and tool performance by more closely mimicking the clinical workflow, capturing interaction sequences, and assessing the functionality of computer module prototypes within clinician workflow. We expected this method to yield usability findings that differ from and complement those of think-aloud analysis. Together, this mixed-method evaluation provided comprehensive and realistic feedback for iterative refinement of the ADAPT system prior to implementation. The study employed two phases of testing of a new interactive ADAPT tool that embedded an evidence-based shared goal-setting component into primary care workflow for pre-diabetes counseling within a commercial physician office electronic health record (EHR). Phase I applied usability testing that involved "think-aloud" protocol analysis of eight primary care providers interacting with several scripted clinical scenarios. Phase II used "near-live" clinical simulations of five providers interacting with standardized trained patient actors enacting the clinical scenario of counseling for pre-diabetes, each of whom had a pedometer that recorded the number of steps taken over a week. In both phases, all sessions were audio-taped and motion screen-capture software was activated for onscreen recordings. Transcripts were coded using iterative qualitative content analysis methods. In Phase I, the impact of the components and layout of ADAPT on users' Navigation, Understandability, and Workflow was associated with the largest volume of negative comments (approximately 80% of end-user commentary), while Usability and Content of ADAPT drew more positive than negative user commentary. The heuristic category of Usability had a positive-to-negative comment ratio of 2.1, reflecting positive perception of the usability of the tool, its functionality, and overall co-productive utilization of ADAPT. However, there were mixed perceptions about content (i.e., how the information was displayed, organized, and described in the tool). In Phase II, the duration of patient encounters was approximately 10 min, with all of the Patient Instructions (prescriptions) and behavioral contracting being activated at the end of each visit. Upon activation, providers accepted the pathway prescribed by the tool 100% of the time and completed all the fields in the tool in the simulation cases. Only 14% of encounter time was spent using the functionality of the ADAPT tool in terms of keystrokes and entering relevant data.
The rest of the time was spent on communication and dialog to populate the patient instructions. In all cases, the interaction sequence of reviewing and discussing the patient's exercise and diet was linked to the functionality of the ADAPT tool in terms of monitoring, response-efficacy, self-efficacy, and negotiation in the patient-provider dialog. There was a change from one-way dialog to two-way dialog and negotiation that ended in a behavioral contract. This change demonstrated the tool's sequence, which supported recording current exercise and diet followed by a diet and exercise goal-setting procedure to reduce the risk of diabetes onset. This study demonstrated that "think-aloud" protocol analysis combined with "near-live" clinical simulations provided a successful usability evaluation of a new primary care pre-diabetes shared goal-setting tool. Each phase of the study provided complementary observations on problems with the new onscreen tool and was used to show the influence of the ADAPT framework on usability, workflow integration, and communication between the patient and provider. The think-aloud tests with the providers showed the tool can be used according to the ADAPT framework (exercise-to-diet behavior change and tool utilization), while the clinical simulations revealed the ADAPT framework to realistically support patient-provider communication to obtain a behavioral change contract. SDM interactions and mechanisms affecting protocol-based care can be more completely captured by combining "near-live" clinical simulations with traditional "think-aloud" analysis, which augments clinician utilization. More analysis is required to verify whether the rich communication actions found in Phase II complement clinical workflows. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Quantum chemical studies of estrogenic compounds
USDA-ARS?s Scientific Manuscript database
Quantum chemical methods are potent tools to provide information on the chemical structure and electronic properties of organic molecules. Modern computational chemistry methods have provided a great deal of insight into the binding of estrogenic compounds to estrogenic receptors (ER), an important ...
While drilling system and method
Mayes, James C.; Araya, Mario A.; Thorp, Richard Edward
2007-02-20
A while drilling system and method for determining downhole parameters is provided. The system includes a retrievable while drilling tool positionable in a downhole drilling tool, a sensor chassis and at least one sensor. The while drilling tool is positionable in the downhole drilling tool and has a first communication coupler at an end thereof. The sensor chassis is supported in the drilling tool. The sensor chassis has a second communication coupler at an end thereof for operative connection with the first communication coupler. The sensor is positioned in the chassis and is adapted to measure internal and/or external parameters of the drilling tool. The sensor is operatively connected to the while drilling tool via the communication coupler for communication therebetween. The sensor may be positioned in the while drilling tool and retrievable with the drilling tool. Preferably, the system is operable in high temperature and high pressure conditions.
Building Flexible User Interfaces for Solving PDEs
NASA Astrophysics Data System (ADS)
Logg, Anders; Wells, Garth N.
2010-09-01
FEniCS is a collection of software tools for the automated solution of differential equations by finite element methods. In this note, we describe how FEniCS can be used to solve a simple nonlinear model problem with varying levels of automation. At one extreme, FEniCS provides tools for the fully automated and adaptive solution of nonlinear partial differential equations. At the other extreme, FEniCS provides a range of tools that allow the computational scientist to experiment with novel solution algorithms.
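To make the fully automated end of this spectrum concrete, here is a minimal sketch assuming the legacy FEniCS/DOLFIN Python interface. It solves a linear Poisson problem rather than the note's nonlinear model problem, but it shows how the user states only the variational forms while FEniCS handles assembly and solution.

```python
from fenics import *  # legacy FEniCS/DOLFIN interface (assumed available)

mesh = UnitSquareMesh(32, 32)           # discretize the unit square
V = FunctionSpace(mesh, "P", 1)         # piecewise-linear Lagrange elements

u = TrialFunction(V)
v = TestFunction(V)
f = Constant(1.0)

a = dot(grad(u), grad(v)) * dx          # bilinear form of the weak problem
L = f * v * dx                          # linear form

bc = DirichletBC(V, Constant(0.0), "on_boundary")

u_h = Function(V)
solve(a == L, u_h, bc)                  # assembly and linear solve are automated
```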
A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.
Płotka-Wasylka, J
2018-05-01
A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, colored from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
Collision detection and modeling of rigid and deformable objects in laparoscopic simulator
NASA Astrophysics Data System (ADS)
Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru
2015-03-01
Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated with virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at a rate of at least 30 fps. Our current laparoscopic simulator detects the collision between a point on the tool tip and the organ surfaces, with haptic devices attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point-based collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and speed up the organ deformation reaction. We discuss our proposal for an efficient method to compute simultaneous multiple collisions between rigid (laparoscopic tools) and deformable (organs) objects, and to perform the subsequent collision response, with haptic feedback, in real time.
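A hedged sketch of the multi-point idea follows: candidate points are sampled along the rigid rod axis and each is tested against the organ surface through a proximity query. The `closest_point_on_mesh` helper is a hypothetical stand-in for the simulator's mesh query; a real implementation would use a spatial acceleration structure to stay within the 1 kHz haptics budget.

```python
import numpy as np

def sample_rod(base, tip, n=8):
    # sample candidate collision points along the rigid rod axis
    t = np.linspace(0.0, 1.0, n)[:, None]
    return base[None, :] * (1.0 - t) + tip[None, :] * t

def colliding_points(rod_points, closest_point_on_mesh, radius):
    # test each sampled point against the organ surface mesh
    hits = []
    for p in rod_points:
        q = closest_point_on_mesh(p)     # assumed mesh proximity query
        if np.linalg.norm(p - q) < radius:
            hits.append((p, q))          # rod point and nearest surface point
    return hits
```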
X-ray mask and method for making
Morales, Alfredo M.
2004-10-26
The present invention describes a method for fabricating an x-ray mask tool, a contact lithographic mask that can provide an x-ray exposure dose which is adjustable from point to point. The tool is useful in the preparation of LIGA plating molds made from PMMA or similar materials. In particular, the tool is useful for providing an ability to apply a graded, or "stepped", x-ray exposure dose across a photosensitive substrate. By controlling the x-ray radiation dose from point to point, it is possible to control the development process for removing exposed portions of the substrate, adjusting it such that each of these portions develops at a more or less uniform rate regardless of feature size or feature density distribution.
Costs Associated with Using the ASA24® Dietary Assessment Tool
As part of its mission to advance measures and methods for monitoring cancer-related behaviors and other risk factors, the Risk Factor Assessment Branch provides tools and resources to the extramural research community.
PC graphics generation and management tool for real-time applications
NASA Technical Reports Server (NTRS)
Truong, Long V.
1992-01-01
A graphics tool was designed and developed for easy generation and management of personal computer graphics. It also provides methods and 'run-time' software for many common artificial intelligence (AI) or expert system (ES) applications.
Public data and open source tools for multi-assay genomic investigation of disease.
Kannan, Lavanya; Ramos, Marcel; Re, Angela; El-Hachem, Nehme; Safikhani, Zhaleh; Gendoo, Deena M A; Davis, Sean; Gomez-Cabrero, David; Castelo, Robert; Hansen, Kasper D; Carey, Vincent J; Morgan, Martin; Culhane, Aedín C; Haibe-Kains, Benjamin; Waldron, Levi
2016-07-01
Molecular interrogation of a biological sample through DNA sequencing, RNA and microRNA profiling, proteomics and other assays, has the potential to provide a systems level approach to predicting treatment response and disease progression, and to developing precision therapies. Large publicly funded projects have generated extensive and freely available multi-assay data resources; however, bioinformatic and statistical methods for the analysis of such experiments are still nascent. We review multi-assay genomic data resources in the areas of clinical oncology, pharmacogenomics and other perturbation experiments, population genomics and regulatory genomics and other areas, and tools for data acquisition. Finally, we review bioinformatic tools that are explicitly geared toward integrative genomic data visualization and analysis. This review provides starting points for accessing publicly available data and tools to support development of needed integrative methods. © The Author 2015. Published by Oxford University Press.
Pollack, Ari H; Miller, Andrew; Mishra, Sonali R.; Pratt, Wanda
2016-01-01
Participatory design, a method by which system users and stakeholders meaningfully contribute to the development of a new process or technology, has great potential to revolutionize healthcare technology, yet has seen limited adoption. We conducted a design session with eleven physicians working to create a novel clinical information tool utilizing participatory design methods. During the two-hour session, the physicians quickly engaged in the process and generated a large quantity of information, informing the design of a future tool. By utilizing facilitators experienced in design methodology, with detailed domain expertise, and well integrated into the healthcare organization, the participatory design session engaged a group of users who are often disenfranchised with existing processes as well as health information technology in general. We provide insight into why participatory design works with clinicians and provide guiding principles for how to implement these methods in healthcare organizations interested in advancing health information technology. PMID:28269900
Evidence Arguments for Using Formal Methods in Software Certification
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Pai, Ganesh
2013-01-01
We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example, an unmanned aircraft system, where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.
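The pattern-instantiation step (b) can be pictured with a toy sketch; the pattern text and fields below are invented for illustration and are not the authors' actual safety case pattern.

```python
# Invented pattern text and fields, for illustration only.
pattern = ("{prop} of {component} is assured because {tool} discharged "
           "{n_obligations} proof obligations generated from its source code.")

tool_output = {
    "prop": "Absence of run-time errors",
    "component": "the autopilot software",
    "tool": "a formal code analysis tool",
    "n_obligations": 128,
}

evidence_argument = pattern.format(**tool_output)
print(evidence_argument)
```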
Exploring Mission Concepts with the JPL Innovation Foundry A-Team
NASA Technical Reports Server (NTRS)
Ziemer, John K.; Ervin, Joan; Lang, Jared
2013-01-01
The JPL Innovation Foundry has established a new approach for exploring, developing, and evaluating early concepts called the A-Team. The A-Team combines innovative collaborative methods with subject matter expertise and analysis tools to help mature mission concepts. Science, implementation, and programmatic elements are all considered during an A-Team study. Methods are grouped by Concept Maturity Level (CML), from 1 through 3, including idea generation and capture (CML 1), initial feasibility assessment (CML 2), and trade space exploration (CML 3). Methods used for each CML are presented, and the key team roles are described from two points of view: innovative methods and technical expertise. A-Team roles for providing innovative methods include the facilitator, study lead, and assistant study lead. A-Team roles for providing technical expertise include the architect, lead systems engineer, and integration engineer. In addition to these key roles, each A-Team study is uniquely staffed to match the study topic and scope including subject matter experts, scientists, technologists, flight and instrument systems engineers, and program managers as needed. Advanced analysis and collaborative engineering tools (e.g. cost, science traceability, mission design, knowledge capture, study and analysis support infrastructure) are also under development for use in A-Team studies and will be discussed briefly. The A-Team facilities provide a constructive environment for innovative ideas from all aspects of mission formulation to eliminate isolated studies and come together early in the development cycle when they can provide the biggest impact. This paper provides an overview of the A-Team, its study processes, roles, methods, tools and facilities.
ERIC Educational Resources Information Center
Aydogan, Tuncay; Ergun, Serap
2016-01-01
Concept mapping is a method of graphical learning that can be beneficial as a study method for concept linking and organization. Concept maps, which provide an elegant, easily understood representation of an expert's domain knowledge, are tools for organizing and representing knowledge. These tools have been used in educational environments to…
The visibility of QSEN competencies in clinical assessment tools in Swedish nurse education.
Nygårdh, Annette; Sherwood, Gwen; Sandberg, Therese; Rehn, Jeanette; Knutsson, Susanne
2017-12-01
Prospective nurses need specific and sufficient knowledge to be able to provide quality care. The Swedish Society of Nursing has emphasized the importance of the six quality and safety competencies (QSEN), which originated in the US, in Swedish nursing education. AIM: To investigate the visibility of the QSEN competencies in the assessment tools used in clinical practice. METHOD: A quantitative descriptive method was used to analyze assessment tools from 23 universities. RESULTS: Teamwork and collaboration was the most visible competency. Patient-centered care was visible to a large degree but was not referred to by name. Informatics was the least visible, a notable concern since all nurses should be competent in informatics to provide quality and safety in care. These results provide guidance as academic and clinical programs around the world implement assessment of how well nurses have developed these essential quality and safety competencies. Copyright © 2017 Elsevier Ltd. All rights reserved.
Enhancement of Chemical Entity Identification in Text Using Semantic Similarity Validation
Grego, Tiago; Couto, Francisco M.
2013-01-01
With the amount of chemical data being produced and reported in the literature growing at a fast pace, it is increasingly important to efficiently retrieve this information. To tackle this issue, text mining tools have been applied, but despite their good performance they still produce many errors, which we believe can be filtered using semantic similarity. Thus, this paper proposes a novel method that receives the results of chemical entity identification systems, such as Whatizit, and exploits the semantic relationships in ChEBI to measure the similarity between the entities found in the text. The method assigns a single validation score to each entity based on its similarities with the other entities also identified in the text. Then, by using a given threshold, the method selects a set of validated entities and a set of outlier entities. We evaluated our method using the results of two state-of-the-art chemical entity identification tools, three semantic similarity measures, and two text window sizes. The method was able to increase precision without filtering a significant number of correctly identified entities. This means that the method can effectively discriminate the correctly identified chemical entities while discarding a significant number of identification errors. For example, selecting a validation set with 75% of all identified entities, we were able to increase the precision by 28% for one of the chemical entity identification tools (Whatizit), while maintaining 97% of the correctly identified entities in that subset. Our method can be directly used as an add-on by any state-of-the-art entity identification tool that provides mappings to a database, in order to improve their results. The proposed method is included in a freely accessible web tool at www.lasige.di.fc.ul.pt/webtools/ice/. PMID:23658791
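The scoring-and-threshold idea can be sketched as follows. This is an illustrative reading of the method, assuming the validation score is an average pairwise similarity; the `similarity` argument stands in for a ChEBI-based measure whose implementation is not reproduced here.

```python
def validation_scores(entities, similarity):
    # score each entity by its average similarity to the other entities in the text
    scores = {}
    for e in entities:
        others = [x for x in entities if x != e]
        scores[e] = sum(similarity(e, x) for x in others) / max(len(others), 1)
    return scores

def split_by_threshold(scores, threshold):
    # entities at or above the threshold are validated; the rest are outliers
    validated = {e for e, s in scores.items() if s >= threshold}
    return validated, set(scores) - validated

# toy usage with a made-up similarity function
entities = ["glucose", "fructose", "iron"]
sim = lambda a, b: 0.9 if {a, b} <= {"glucose", "fructose"} else 0.1
validated, outliers = split_by_threshold(validation_scores(entities, sim), 0.3)
```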
X-ray mask and method for providing same
Morales, Alfredo M [Pleasanton, CA; Skala, Dawn M [Fremont, CA
2004-09-28
The present invention describes a method for fabricating an x-ray mask tool which can achieve pattern features having a lateral dimension of less than 1 micron. The process uses a thin photoresist and a standard lithographic mask to transfer a trace image pattern onto the surface of a silicon wafer by exposing and developing the resist. The exposed portion of the silicon substrate is then anisotropically etched to provide an etched image of the trace image pattern consisting of a series of channels in the silicon having a high depth-to-width aspect ratio. These channels are then filled by depositing a metal such as gold to provide an inverse image of the trace image, thereby providing a robust x-ray mask tool.
X-ray mask and method for providing same
Morales, Alfredo M.; Skala, Dawn M.
2002-01-01
The present invention describes a method for fabricating an x-ray mask tool which can achieve pattern features having a lateral dimension of less than 1 micron. The process uses a thin photoresist and a standard lithographic mask to transfer a trace image pattern onto the surface of a silicon wafer by exposing and developing the resist. The exposed portion of the silicon substrate is then anisotropically etched to provide an etched image of the trace image pattern consisting of a series of channels in the silicon having a high depth-to-width aspect ratio. These channels are then filled by depositing a metal such as gold to provide an inverse image of the trace image, thereby providing a robust x-ray mask tool.
Slok, Annerika H M; Twellaar, Mascha; Jutbo, Leslie; Kotz, Daniel; Chavannes, Niels H; Holverda, Sebastiaan; Salomé, Philippe L; Dekhuijzen, P N Richard; Rutten-van Mölken, Maureen P M H; Schuiten, Denise; In 't Veen, Johannes C C M; van Schayck, Onno C P
2016-11-17
In the management of chronic conditions, such as chronic obstructive pulmonary disease (COPD), there is a shift from doctor-driven care to patient-centred integrated care with active involvement of and self-management by the patient. A recently developed tool, the Assessment of Burden of COPD (ABC) tool, can be used in this transition to facilitate self-management support and shared decision-making. We performed a qualitative study, in which we collected and analysed the data using the methods of conventional content analysis. We performed in-depth interviews consisting of mainly open questions. Fifteen healthcare providers and 21 patients who had worked with the ABC tool in daily care were interviewed. In general, participants responded positively to the tool. Healthcare providers felt the visual representation provided was effective and comprehensible for patients and gave them insight into their disease, a finding that patients confirmed. If patients were allowed to choose between a consultation with or without the ABC tool, the majority would prefer using the tool: it provides them with an overview and insight, which makes it easier to discuss all relevant topics related to COPD. The tool can provide structure in consultations and is compatible with the concepts of 'motivational interviewing' and 'individualised care-planning'. Suggestions for improvement related to content and layout. So far, the tool has only been available as a stand-alone online program that is not connected to electronic medical record systems. It was therefore suggested that the tool be integrated into those systems to enhance its usability and its uptake by healthcare providers.
CORSSTOL: Cylinder Optimization of Rings, Skin, and Stringers with Tolerance sensitivity
NASA Technical Reports Server (NTRS)
Finckenor, J.; Bevill, M.
1995-01-01
Cylinder Optimization of Rings, Skin, and Stringers with Tolerance sensitivity (CORSSTOL) is a design optimization program incorporating a method to examine the effects of user-provided manufacturing tolerances on weight and failure. CORSSTOL gives designers a tool to determine tolerances based on need. This is a decisive way to choose the best design among several manufacturing methods with differing capabilities and costs. CORSSTOL initially optimizes a stringer-stiffened cylinder for weight without tolerances. The skin and stringer geometry are varied, subject to stress and buckling constraints. Then the same analysis and optimization routines are used to minimize the maximum-material-condition weight subject to the least favorable combination of tolerances. The adjusted optimum dimensions are provided with the weight and constraint sensitivities of each design variable. The designer can immediately identify critical tolerances. The safety of parts made out of tolerance can also be determined. During design and development of weight-critical systems, design/analysis tools that provide product-oriented results are of vital significance. The development of this program and methodology provides designers with an effective cost- and weight-saving design tool. The tolerance sensitivity method can be applied to any system defined by a set of deterministic equations.
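The "least favorable combination of tolerances" idea can be sketched independently of CORSSTOL. The fragment below is illustrative only: it enumerates the tolerance corners of a design and reports the worst-case margin for a user-supplied constraint function; the dimensions and margin function are invented.

```python
from itertools import product

def worst_case_margin(dims, tols, margin):
    # minimum constraint margin over all +/- tolerance corner combinations
    worst = float("inf")
    for signs in product((-1.0, 1.0), repeat=len(dims)):
        trial = [d + s * t for d, s, t in zip(dims, signs, tols)]
        worst = min(worst, margin(trial))
    return worst

# toy example: skin thickness and stringer height with a made-up margin function
margin = lambda d: d[0] * 10.0 + d[1] * 2.0 - 1.0   # >= 0 means the design is safe
print(worst_case_margin(dims=[0.08, 0.20], tols=[0.01, 0.02], margin=margin))
```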
Delaunay Triangulation as a New Coverage Measurement Method in Wireless Sensor Network
Chizari, Hassan; Hosseini, Majid; Poston, Timothy; Razak, Shukor Abd; Abdullah, Abdul Hanan
2011-01-01
Sensing and communication coverage are among the most important trade-offs in Wireless Sensor Network (WSN) design. A minimum bound of sensing coverage is vital in scheduling, target tracking and redeployment phases, as well as providing communication coverage. Some methods measure the coverage as a percentage value, but detailed information has been missing. Two scenarios with equal coverage percentage may not have the same Quality of Coverage (QoC). In this paper, we propose a new coverage measurement method using Delaunay Triangulation (DT). This method can provide the percentage value reported by existing coverage measurement tools, along with more detailed information. Moreover, it categorizes sensors as 'fat', 'healthy' or 'thin' to show the dense, optimal and scattered areas. It can also yield the largest empty area of sensors in the field. Simulation results show that the proposed DT method can achieve accurate coverage information, and provides many tools to compare QoC between different scenarios. PMID:22163792
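A minimal sketch of the underlying computation, using SciPy's Delaunay triangulation: triangle areas over the sensor positions indicate local density, and the largest triangle approximates the largest empty region. The density thresholds for the paper's 'fat'/'healthy'/'thin' categories are not reproduced here.

```python
import numpy as np
from scipy.spatial import Delaunay

def triangle_areas(points):
    # areas of the Delaunay triangles over the sensor positions
    tri = Delaunay(points)
    p = points[tri.simplices]                      # (n_triangles, 3, 2)
    a = p[:, 1] - p[:, 0]
    b = p[:, 2] - p[:, 0]
    return 0.5 * np.abs(a[:, 0] * b[:, 1] - a[:, 1] * b[:, 0])

rng = np.random.default_rng(0)
sensors = rng.random((50, 2))                      # random sensor field
areas = triangle_areas(sensors)
print("largest empty region (triangle area):", areas.max())
```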
Developmental Testing of Habitability and Human Factors Tools and Methods During Neemo 15
NASA Technical Reports Server (NTRS)
Thaxton, S. S.; Litaker, H. L., Jr.; Holden, K. L.; Adolf, J. A.; Pace, J.; Morency, R. M.
2011-01-01
Currently, no established methods exist to collect real-time human factors and habitability data while crewmembers are living aboard the International Space Station (ISS), traveling aboard other space vehicles, or living in remote habitats. Human factors and habitability data regarding space vehicles and habitats are instead acquired at the end of missions during postflight crew debriefs. These debriefs occur weeks or often longer after events have occurred, forcing significant reliance on incomplete and imperfect human memory. Without a means to collect real-time data, small issues may have a cumulative effect and continue to cause crew frustration and inefficiencies. Without timely and appropriate reporting methodologies, issues may be repeated or lost. TOOL DEVELOPMENT AND EVALUATION: As part of a directed research project (DRP) aiming to develop and validate tools and methods for collecting near real-time human factors and habitability data, a preliminary set of tools and methods was developed. These tools and methods were evaluated during the NASA Extreme Environments Mission Operations (NEEMO) 15 mission in October 2011. Two versions of a software tool were used to collect observational data from NEEMO crewmembers, who also used targeted strategies for using video cameras to collect observations. The Space Habitability Observation Reporting Tool (SHORT) was created based on a tool previously developed by NASA to capture human factors and habitability issues during spaceflight. SHORT uses a web-based interface that allows users to enter a text description of any observations they wish to report and assign a priority level if changes are needed. In addition to the web-based format, a mobile Apple (iOS) version was implemented, referred to as iSHORT. iSHORT allows users to provide text, audio, photograph, and video data to report observations. iSHORT can be deployed on an iPod Touch, iPhone, or iPad; for NEEMO 15, the app was provided on an iPad2.
Using Drawing Technology to Assess Students' Visualizations of Chemical Reaction Processes
ERIC Educational Resources Information Center
Chang, Hsin-Yi; Quintana, Chris; Krajcik, Joseph
2014-01-01
In this study, we investigated how students used a drawing tool to visualize their ideas of chemical reaction processes. We interviewed 30 students using thinking-aloud and retrospective methods and provided them with a drawing tool. We identified four types of connections the students made as they used the tool: drawing on existing knowledge,…
ERIC Educational Resources Information Center
Bordeianu, Sever; Carter, Christina E.; Dennis, Nancy K.
2000-01-01
Describes Web-based online public access catalogs (Web OPACs) and other Web-based tools as gateway methods for providing access to library collections. Addresses solutions for overcoming barriers to information, such as through the implementation of proxy servers and other authentication tools for remote users. (Contains 18 references.)…
The KMAT: Benchmarking Knowledge Management.
ERIC Educational Resources Information Center
de Jager, Martha
Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…
On-line Tools for Assessing Petroleum Releases
The Internet tools described in this report provide methods and models for evaluation of contaminated sites. Two problems are addressed by models. The first is the placement of wells for correct delineation of contaminant plumes. Because aquifer recharge can displace plumes dow...
Primer on consumer marketing research : procedures, methods, and tools
DOT National Transportation Integrated Search
1994-03-01
The Volpe Center developed a marketing research primer which provides a guide to the approach, procedures, and research tools used by private industry in predicting consumer response. The final two chapters of the primer focus on the challenges of do...
Systems, methods, and apparatus of a low conductance silicon micro-leak for mass spectrometer inlet
NASA Technical Reports Server (NTRS)
Harpold, Dan N. (Inventor); Niemann, Hasso B. (Inventor); Jamieson, Brian G. (Inventor); Lynch, Bernard A. (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which, in some embodiments, a mass spectrometer micro-leak includes a number of channels fabricated by semiconductor processing tools and a number of inlet holes that provide access to the channels.
Systems, Methods, and Apparatus of a Low Conductance Silicon Micro-Leak for Mass Spectrometer Inlet
NASA Technical Reports Server (NTRS)
Harpold, Dan N. (Inventor); Niemann, Hasso B. (Inventor); Jamieson, Brian G. (Inventor); Lynch, Bernard A. (Inventor)
2013-01-01
Systems, methods and apparatus are provided through which, in some embodiments, a mass spectrometer micro-leak includes a number of channels fabricated by semiconductor processing tools and a number of inlet holes that provide access to the channels.
Veterans' experience in using the online Surgeon General's family health history tool.
Arar, Nedal; Seo, Joann; Abboud, Hanna E; Parchman, Michael; Noel, Polly
2011-09-01
AIM: To assess veterans' experience and satisfaction in using the Surgeon General's (SG) online family health history (FHH) tool, and determine the perceived facilitators and barriers to using the online SG-FHH tool. MATERIALS & METHODS: A mixed-method design using both qualitative and quantitative approaches was employed in this study. A total of 35 veterans at the VA Medical Center in San Antonio, Texas, USA were invited to enter their FHH information using the online SG-FHH tool, complete the study's satisfaction survey and participate in a short semi-structured interview. The goal of the semi-structured interviews was to assess participants' perceived facilitators and barriers to using the online SG-FHH tool. All participants were also provided with a printed copy of their pedigree, which was generated by the SG-FHH tool, and were encouraged to share it with their relatives and providers. RESULTS: The majority of participants (91%) said that they had access to a computer with internet capability, and 77% reported that they knew how to use a computer. More than two-thirds of the participants felt that items on the SG-FHH tool were easy to read and felt that FHH categories were relevant to their family's health. Approximately 94% of participants viewed the SG-FHH tool as useful, and the majority of participants (97%) indicated that they were likely to recommend the tool to others. Content analysis of the semi-structured interviews highlighted several barriers to veterans' use of the SG-FHH tool and their FHH information. These included: lack of patients' knowledge regarding their relatives' FHH, and privacy and confidentiality concerns. CONCLUSION: This study provides information on the performance and functionality of an inexpensive and widely accessible method for FHH collection. Furthermore, our findings highlight several opportunities and challenges facing the utilization of FHH information as a clinical and genomic tool at the Veterans Health Administration (VHA). The results suggest that strategies that improve veterans' knowledge regarding the importance of their FHH information and that address their concerns about privacy and confidentiality may enhance the successful implementation of FHH information into VHA clinical practice. IMPLICATIONS: Identifying a locally accepted method for FHH collection and documentation that can be conducted outside of the patient visit will reduce time burdens for providers and patients and allow for a focus on other important topics during clinic visits. Improvement in familial risk screening and assessment will enable the VHA to be prepared for personalized medicine and to focus its resources on promoting critically important health behaviors for populations with the highest risk of developing chronic diseases and their complications.
Methods, Tools and Current Perspectives in Proteogenomics
Ruggles, Kelly V.; Krug, Karsten; Wang, Xiaojing; Clauser, Karl R.; Wang, Jing; Payne, Samuel H.; Fenyö, David; Zhang, Bing; Mani, D. R.
2017-01-01
With combined technological advancements in high-throughput next-generation sequencing and deep mass spectrometry-based proteomics, proteogenomics, i.e. the integrative analysis of proteomic and genomic data, has emerged as a new research field. Early efforts in the field were focused on improving protein identification using sample-specific genomic and transcriptomic sequencing data. More recently, integrative analysis of quantitative measurements from genomic and proteomic studies has identified novel insights into gene expression regulation, cell signaling, and disease. Many methods and tools have been developed or adapted to enable an array of integrative proteogenomic approaches, and in this article we systematically classify published methods and tools into four major categories: (1) sequence-centric proteogenomics; (2) analysis of proteogenomic relationships; (3) integrative modeling of proteogenomic data; and (4) data sharing and visualization. We provide a comprehensive review of methods and available tools in each category and highlight their typical applications. PMID:28456751
Identifying persistent and characteristic features in firearm tool marks on cartridge cases
NASA Astrophysics Data System (ADS)
Ott, Daniel; Soons, Johannes; Thompson, Robert; Song, John
2017-12-01
Recent concerns about subjectivity in forensic firearm identification have motivated the development of algorithms to compare firearm tool marks that are imparted on ammunition and to generate quantitative measures of similarity. In this paper, we describe an algorithm that identifies impressed tool marks on a cartridge case that are both consistent between firings and contribute strongly to a surface similarity metric. The result is a representation of the tool mark topography that emphasizes both significant and persistent features across firings. This characteristic surface map is useful for understanding the variability and persistence of the tool marks created by a firearm and can provide improved discrimination between the comparison scores of samples fired from the same firearm and the scores of samples fired from different firearms. The algorithm also provides a convenient method for visualizing areas of similarity that may be useful in providing quantitative support for visual comparisons by trained examiners.
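One common ingredient of such similarity metrics can be sketched briefly. The fragment below is a hedged illustration, not the authors' algorithm: it computes the normalized cross-correlation of two mean-removed surface height maps, and marks cells whose similarity persists across firing pairs, in the spirit of the paper's characteristic surface map.

```python
import numpy as np

def normalized_cross_correlation(surface_a, surface_b):
    # similarity of two mean-removed surface height maps, in [-1, 1]
    a = surface_a - surface_a.mean()
    b = surface_b - surface_b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

def persistent_cells(correlation_maps, threshold=0.5):
    # cells whose local similarity exceeds a threshold in every firing pair
    stacked = np.stack(correlation_maps)
    return (stacked > threshold).all(axis=0)
```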
Perspectives on Wellness Self-Monitoring Tools for Older Adults
Huh, Jina; Le, Thai; Reeder, Blaine; Thompson, Hilaire J.; Demiris, George
2013-01-01
Purpose: Our purpose was to understand different stakeholder perceptions about the use of self-monitoring tools, specifically in the area of older adults' personal wellness. In conjunction with the advent of personal health records, tracking personal health using self-monitoring technologies shows promising patient support opportunities. While clinicians' tools for monitoring of older adults have been explored, we know little about how older adults may self-monitor their wellness and health and how their health care providers would perceive such use. Methods: We conducted three focus groups with health care providers (n=10) and four focus groups with community-dwelling older adults (n=31). Results: Older adult participants found the concept of self-monitoring unfamiliar, and this influenced a narrowed interest in the use of wellness self-monitoring tools. On the other hand, health care provider participants showed open attitudes towards wellness monitoring tools for older adults and brainstormed about various stakeholders' use cases. The two participant groups showed diverging perceptions in terms of: perceived uses, stakeholder interests, information ownership and control, and sharing of wellness monitoring tools. Conclusions: Our paper provides implications and solutions for how older adults' wellness self-monitoring tools can enhance patient-health care provider interaction, patient education, and improvement in overall wellness. PMID:24041452
Purification of functionalized DNA origami nanostructures.
Shaw, Alan; Benson, Erik; Högberg, Björn
2015-05-26
The high programmability of DNA origami has provided tools for precise manipulation of matter at the nanoscale. This manipulation of matter opens up the possibility to arrange functional elements for a diverse range of applications that utilize the nanometer precision provided by these structures. However, the realization of functionalized DNA origami still suffers from imperfect production methods, in particular in the purification step, where excess material is separated from the desired functionalized DNA origami. In this article we demonstrate and optimize two purification methods that have not previously been applied to DNA origami. In addition, we provide a systematic study comparing the purification efficacy of these and five other commonly used purification methods. Three types of functionalized DNA origami were used as model systems in this study. DNA origami was patterned with either small molecules, antibodies, or larger proteins. With the results of our work we aim to provide a guideline in quality fabrication of various types of functionalized DNA origami and to provide a route for scalable production of these promising tools.
NASA Astrophysics Data System (ADS)
Chen, Mingjun; Li, Ziang; Yu, Bo; Peng, Hui; Fang, Zhen
2013-09-01
In the grinding of high-quality fused silica parts with complex surfaces or structures using a small-diameter ball-headed metal-bonded diamond wheel, existing dressing methods are not suitable for dressing the ball-headed diamond wheel precisely: they are either on-line, in-process dressing methods that may cause collision problems, or they neglect the effects of tool setting error and electrode wear. An on-machine precision preparation and dressing method is proposed for the ball-headed diamond wheel based on electrical discharge machining. Using this method, a small-diameter cylindrical diamond wheel is machined into a hemispherical-headed form. The resulting ball-headed diamond wheel is dressed after several grinding passes to recover the geometrical accuracy and sharpness lost to wheel wear. A tool setting method based on a high-precision optical system is presented to reduce the wheel center setting error and dimension error. The effect of electrode tool wear is investigated by electrical dressing experiments, and an electrode tool wear compensation model is established based on the experimental results, which show that the wear ratio coefficient K' tends toward a constant with increasing electrode feed length, with a mean value of 0.156. Grinding experiments on fused silica are carried out on a test bench to evaluate the performance of the preparation and dressing method. The experimental results show that the surface roughness of the finished workpiece is 0.03 μm. The effects of the grinding parameters and dressing frequency on the surface roughness are investigated based on the roughness measurements. This research provides an on-machine preparation and dressing method for ball-headed metal-bonded diamond wheels used in the grinding of fused silica, addressing both the tool setting problem and the effect of electrode tool wear.
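A worked illustration of the compensation idea follows. It assumes, purely for illustration, that K' is the ratio of electrode length lost to electrode feed length; the paper's exact definition of K' may differ, so treat this as a sketch of the compensation logic rather than the authors' model.

```python
K_PRIME = 0.156  # reported mean wear ratio coefficient

def compensated_feed(target_feed):
    # ASSUMPTION: wear consumes K' of each unit of feed, so extend the
    # commanded feed until the effective feed equals the target
    return target_feed / (1.0 - K_PRIME)

print(compensated_feed(1.0))  # ~1.185 mm commanded per 1.0 mm effective feed
```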
Mueller, David S.
2013-01-01
profiles from the entire cross section and multiple transects to determine a mean profile for the measurement. The use of an exponent derived from normalized data from the entire cross section is shown to be valid for application of the power velocity distribution law in the computation of the unmeasured discharge in a cross section. Selected statistics are combined with empirically derived criteria to automatically select the appropriate extrapolation methods. A graphical user interface (GUI) provides the user tools to visually evaluate the automatically selected extrapolation methods and manually change them, as necessary. The sensitivity of the total discharge to available extrapolation methods is presented in the GUI. Use of extrap by field hydrographers has demonstrated that extrap is a more accurate and efficient method of determining the appropriate extrapolation methods compared with tools currently (2012) provided in the ADCP manufacturers’ software.
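The power velocity distribution law named above can be illustrated with a short fit. The data below are invented and roughly follow the classic one-sixth power law; extrap's actual selection criteria combine statistics from the whole cross section and are richer than this sketch.

```python
import numpy as np

# u(z) = a * z**b, with z the normalized distance from the streambed
z = np.array([0.2, 0.4, 0.6, 0.8])      # normalized depth (made-up data)
u = np.array([0.76, 0.86, 0.92, 0.96])  # normalized velocity (made-up data)

b, log_a = np.polyfit(np.log(z), np.log(u), 1)  # least-squares fit in log space
print("fitted exponent:", b)             # ~1/6 for the classic power law
```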
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points in the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
Evaluating the effectiveness of gloves in reducing the hazards of hand-transmitted vibration.
Griffin, M J
1998-05-01
A method of evaluating the effectiveness of gloves in reducing the hazards of hand-transmitted vibration is proposed. The glove isolation effectiveness was calculated from: (a) the measured transmissibility of a glove, (b) the vibration spectrum on the handle of a specific tool (or class of tools), and (c) the frequency weighting indicating the degree to which different frequencies of vibration cause injury. With previously reported tool vibration spectra and glove transmissibilities (from 10-1000 Hz), the method was used to test 10 gloves with 20 different powered tools. The frequency weighting for hand-transmitted vibration advocated in British standard 6842 (1987) and international standard 5349 (1986) greatly influences the apparent isolation effectiveness of gloves. With the frequency weighting, the gloves had little effect on the transmission of vibration to the hand from most of the tools. Only for two or three tools (those dominated by high frequency vibration) did any glove provide useful attenuation. Without the frequency weighting, some gloves showed useful attenuation of the vibration on most powered tools. In view of the uncertain effect of the vibration frequency in the causation of disorders from hand-transmitted vibration, it is provisionally suggested that the wearing of a glove by the user of a particular vibratory tool could be encouraged if the glove reduces the transmission of vibration when it is evaluated without the frequency weighting and does not increase the vibration when it is evaluated with the frequency weighting. A current international standard for the measurement and evaluation of the vibration transmitted by gloves can classify a glove as an antivibration glove when it provides no useful attenuation of vibration, whereas a glove providing useful attenuation of vibration on a specific tool can fail the test.
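The evaluation combines the three quantities (a)-(c) into weighted accelerations; a hedged numerical sketch follows. The band values and the weighting curve below are placeholders (a crude roll-off above 16 Hz), not the ISO 5349/BS 6842 table, so the numbers only illustrate why weighting masks glove attenuation at high frequencies.

```python
import numpy as np

freqs = np.array([16.0, 31.5, 63.0, 125.0, 250.0, 500.0, 1000.0])  # Hz
tool_accel = np.array([3.0, 5.0, 8.0, 6.0, 4.0, 2.0, 1.0])   # m/s^2 per band
glove_T = np.array([1.0, 1.0, 0.95, 0.9, 0.7, 0.5, 0.3])     # transmissibility
weighting = 16.0 / np.maximum(freqs, 16.0)                    # crude ~1/f proxy

def weighted_rms(accel):
    # overall frequency-weighted acceleration from per-band values
    return np.sqrt(np.sum((weighting * accel) ** 2))

ratio = weighted_rms(glove_T * tool_accel) / weighted_rms(tool_accel)
print(f"weighted transmissibility ratio: {ratio:.2f}")  # ~1 => little benefit
```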
Rotary fast tool servo system and methods
Montesanti, Richard C.; Trumper, David L.
2007-10-02
A high bandwidth rotary fast tool servo provides tool motion in a direction nominally parallel to the surface-normal of a workpiece at the point of contact between the cutting tool and workpiece. Three or more flexure blades having all ends fixed are used to form an axis of rotation for a swing arm that carries a cutting tool at a set radius from the axis of rotation. An actuator rotates a swing arm assembly such that a cutting tool is moved in and away from the lathe-mounted, rotating workpiece in a rapid and controlled manner in order to machine the workpiece. A pair of position sensors provides rotation and position information for a swing arm to a control system. A control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and in-feed slide of a precision lathe.
Rotary fast tool servo system and methods
Montesanti, Richard C [Cambridge, MA; Trumper, David L [Plaistow, NH; Kirtley, Jr., James L.
2009-08-18
A high bandwidth rotary fast tool servo provides tool motion in a direction nominally parallel to the surface-normal of a workpiece at the point of contact between the cutting tool and workpiece. Three or more flexure blades having all ends fixed are used to form an axis of rotation for a swing arm that carries a cutting tool at a set radius from the axis of rotation. An actuator rotates a swing arm assembly such that a cutting tool is moved in and away from the lathe-mounted, rotating workpiece in a rapid and controlled manner in order to machine the workpiece. One or more position sensors provides rotation and position information for a swing arm to a control system. A control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and in-feed slide of a precision lathe.
Teacher Logs: A Tool for Gaining a Comprehensive Understanding of Classroom Practices
ERIC Educational Resources Information Center
Glennie, Elizabeth J.; Charles, Karen J.; Rice, Olivia N.
2017-01-01
Examining repeated classroom encounters over time provides a comprehensive picture of activities. Studies of instructional practices in classrooms have traditionally relied on two methods: classroom observations, which are expensive, and surveys, which are limited in scope and accuracy. Teacher logs provide a "real-time" method for…
Toxicogenomics is the study of changes in gene expression, protein, and metabolite profiles within cells and tissues, complementary to more traditional toxicological methods. Genomics tools provide detailed molecular data about the underlying biochemical mechanisms of toxicity, a...
Targeted polypeptide degradation
Church, George M [Brookline, MA; Janse, Daniel M [Brookline, MA
2008-05-13
This invention pertains to compositions, methods, cells and organisms useful for selectively localizing polypeptides to the proteasome for degradation. Therapeutic methods and pharmaceutical compositions for treating disorders associated with the expression and/or activity of a polypeptide by targeting these polypeptides for degradation, as well as methods for targeting therapeutic polypeptides for degradation and/or activating therapeutic polypeptides by degradation are provided. The invention provides methods for identifying compounds that mediate proteasome localization and/or polypeptide degradation. The invention also provides research tools for the study of protein function.
Dcode.org anthology of comparative genomic tools.
Loots, Gabriela G; Ovcharenko, Ivan
2005-07-01
Comparative genomics provides the means to demarcate functional regions in anonymous DNA sequences. The successful application of this method to identifying novel genes is currently shifting to deciphering the non-coding encryption of gene regulation across genomes. To facilitate the practical application of comparative sequence analysis to genetics and genomics, we have developed several analytical and visualization tools for the analysis of arbitrary sequences and whole genomes. These tools include two alignment tools, zPicture and Mulan; a phylogenetic shadowing tool, eShadow for identifying lineage- and species-specific functional elements; two evolutionary conserved transcription factor analysis tools, rVista and multiTF; a tool for extracting cis-regulatory modules governing the expression of co-regulated genes, Creme 2.0; and a dynamic portal to multiple vertebrate and invertebrate genome alignments, the ECR Browser. Here, we briefly describe each one of these tools and provide specific examples on their practical applications. All the tools are publicly available at the http://www.dcode.org/ website.
Mori, Brenda; Brooks, Dina; Norman, Kathleen E; Herold, Jodi; Beaton, Dorcas E
2015-08-01
To develop the first draft of a Canadian tool to assess physiotherapy (PT) students' performance in clinical education (CE). Phase 1: to gain consensus on the items within the new tool, the number and placement of the comment boxes, and the rating scale; Phase 2: to explore the face and content validity of the draft tool. Phase 1 used the Delphi method; Phase 2 used cognitive interviewing methods with recent graduates and clinical instructors (CIs) and detailed interviews with clinical education and measurement experts. Consensus was reached on the first draft of the new tool by round 3 of the Delphi process, which was completed by 21 participants. Interviews were completed with 13 CIs, 6 recent graduates, and 7 experts. Recent graduates and CIs were able to interpret the tool accurately, felt they could apply it to a recent CE experience, and provided suggestions to improve the draft. Experts provided salient advice. The first draft of a new tool to assess PT students in CE, the Canadian Physiotherapy Assessment of Clinical Performance (ACP), was developed and will undergo further development and testing, including national consultation with stakeholders. Data from Phase 2 will contribute to developing an online education module for CIs and students.
Wang, Yinghua; Yan, Jiaqing; Wen, Jianbin; Yu, Tao; Li, Xiaoli
2016-01-01
Objectives: Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and Brain Function Mapping (BFM) visualization is still lacking. In this study, we developed a BFM Tool, which facilitates electrode position registration and BFM visualization, with an application to epilepsy surgeries. Methods: The BFM Tool mainly utilizes electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as node size/color, edge color/thickness, and mapping method, can be adjusted easily using the settings panel. Moreover, users may manually import/export location and connectivity data to generate figures for further application. The role of this software is demonstrated by a clinical study of language area localization. Results: The BFM Tool helps clinicians and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Conclusions: Our tool provides convenient electrode registration and easy brain function visualization, and has good performance. It is clinically oriented and easy to deploy and use. The BFM Tool is suitable for epilepsy and other clinical iEEG applications. PMID:27199729
A survey of tools for the analysis of quantitative PCR (qPCR) data.
Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas
2014-09-01
Real-time quantitative polymerase chain reaction (qPCR) is a standard technique in most laboratories, used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The available tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview about quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
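One quantification strategy such surveys cover is relative quantification by the delta-delta-Ct method. A minimal sketch, assuming roughly 100% amplification efficiency and using illustrative Ct values (this is not code from any of the surveyed tools):

```python
# Minimal sketch of the delta-delta-Ct relative quantification strategy,
# one of the qPCR quantification methods surveyed. Values are illustrative.

def ddct_fold_change(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control):
    """Fold change of a target gene relative to a reference gene,
    assuming ~100% amplification efficiency (factor of 2 per cycle)."""
    dct_treated = ct_target_treated - ct_ref_treated
    dct_control = ct_target_control - ct_ref_control
    ddct = dct_treated - dct_control
    return 2.0 ** (-ddct)

# Example: the target amplifies 3 cycles earlier in treated samples.
print(ddct_fold_change(22.0, 18.0, 25.0, 18.0))  # -> 8.0
```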
A simple web-based tool to compare freshwater fish data collected using AFS standard methods
Bonar, Scott A.; Mercado-Silva, Norman; Rahr, Matt; Torrey, Yuta T.; Cate, Averill
2016-01-01
The American Fisheries Society (AFS) recently published Standard Methods for Sampling North American Freshwater Fishes. Enlisting the expertise of 284 scientists from 107 organizations throughout Canada, Mexico, and the United States, this text was developed to facilitate comparisons of fish data across regions or time. Here we describe a user-friendly web tool that automates among-sample comparisons in individual fish condition, population length-frequency distributions, and catch per unit effort (CPUE) data collected using AFS standard methods. Currently, the web tool (1) provides instantaneous summaries of almost 4,000 data sets of condition, length frequency, and CPUE of common freshwater fishes collected using standard gears in 43 states and provinces; (2) is easily appended with new standardized field data to update subsequent queries and summaries; (3) compares fish data from a particular water body with continent, ecoregion, and state data summaries; and (4) provides additional information about AFS standard fish sampling including benefits, ongoing validation studies, and opportunities to comment on specific methods. The web tool—programmed in a PHP-based Drupal framework—was supported by several AFS Sections, agencies, and universities and is freely available from the AFS website and fisheriesstandardsampling.org. With widespread use, the online tool could become an important resource for fisheries biologists.
Non-orthogonal tool/flange and robot/world calibration.
Ernst, Floris; Richter, Lars; Matthäus, Lars; Martens, Volker; Bruder, Ralf; Schlaefer, Alexander; Schweikard, Achim
2012-12-01
For many robot-assisted medical applications, it is necessary to accurately compute the relation between the robot's coordinate system and the coordinate system of a localisation or tracking device. Today, this is typically carried out using hand-eye calibration methods like those proposed by Tsai/Lenz or Daniilidis. We present a new method for simultaneous tool/flange and robot/world calibration by estimating a solution to the matrix equation AX = YB. It is computed using a least-squares approach. Because real robots and localisation devices are afflicted by errors, our approach allows for non-orthogonal matrices, partially compensating for imperfect calibration of the robot or localisation device. We also introduce a new method where full robot/world and partial tool/flange calibration is possible by using localisation devices providing less than six degrees of freedom (DOFs). The methods are evaluated on simulation data and on real-world measurements from optical and magnetic tracking devices, volumetric ultrasound providing 3-DOF data, and a surface laser scanning device. We compare our methods with two classical approaches: the method by Tsai/Lenz and the method by Daniilidis. In all experiments, the new algorithms outperform the classical methods in terms of translational accuracy by up to 80% and perform similarly in terms of rotational accuracy. Additionally, the methods are shown to be stable: the number of calibration stations used has far less influence on calibration quality than for the classical methods. Our work shows that the new method can be used for estimating the relationship between the robot's and the localisation device's coordinate systems. The new method can also be used for deficient systems providing only 3-DOF data, and it can be employed in real-time scenarios because of its speed. Copyright © 2012 John Wiley & Sons, Ltd.
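For illustration, the matrix equation AX = YB admits a direct linear least-squares treatment when orthogonality is not enforced, via the standard identity vec(AXB) = (B^T kron A) vec(X). The sketch below is a generic homogeneous-transform solver in that spirit, not the authors' exact algorithm:

```python
# A minimal least-squares sketch for the hand-eye equation A X = Y B,
# without enforcing orthogonality (a generic solver, not the paper's code).
import numpy as np

def solve_axyb(As, Bs):
    """Given lists of 4x4 transforms A_i, B_i, find X, Y minimizing
    sum ||A_i X - Y B_i||^2 in a homogeneous least-squares sense."""
    rows = []
    I4 = np.eye(4)
    for A, B in zip(As, Bs):
        # vec(A X) = (I kron A) vec(X);  vec(Y B) = (B^T kron I) vec(Y)
        rows.append(np.hstack([np.kron(I4, A), -np.kron(B.T, I4)]))
    M = np.vstack(rows)
    _, _, Vt = np.linalg.svd(M)
    v = Vt[-1]                           # null-space direction, up to scale
    X = v[:16].reshape(4, 4, order="F")  # column-stacking (vec) convention
    Y = v[16:].reshape(4, 4, order="F")
    # Fix the unknown scale/sign so the homogeneous corner of X equals 1.
    X, Y = X / X[3, 3], Y / X[3, 3]
    return X, Y
```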
Automating testbed documentation and database access using World Wide Web (WWW) tools
NASA Technical Reports Server (NTRS)
Ames, Charles; Auernheimer, Brent; Lee, Young H.
1994-01-01
A method for providing uniform transparent access to disparate distributed information systems was demonstrated. A prototype testing interface was developed to access documentation and information using publicly available hypermedia tools. The prototype gives testers a uniform, platform-independent user interface to on-line documentation, user manuals, and mission-specific test and operations data. Mosaic was the common user interface, and HTML (Hypertext Markup Language) provided hypertext capability.
Method selection for sustainability assessments: The case of recovery of resources from waste water.
Zijp, M C; Waaijers-van der Loop, S L; Heijungs, R; Broeren, M L M; Peeters, R; Van Nieuwenhuijzen, A; Shen, L; Heugens, E H W; Posthuma, L
2017-07-15
Sustainability assessments provide scientific support in decision procedures towards sustainable solutions. However, to contribute to identifying and choosing sustainable solutions, the sustainability assessment has to fit the decision context. Two complicating factors exist. First, different stakeholders tend to have different views on what a sustainability assessment should encompass. Second, a plethora of sustainability assessment methods exist, due to the multi-dimensional character of the concept, and different methods yield different representations of sustainability. Based on a literature review, we present a protocol to facilitate method selection together with stakeholders. The protocol guides the exploration of i) the decision context, ii) the different views of stakeholders and iii) the selection of pertinent assessment methods. In addition, we present an online tool for method selection. This tool identifies assessment methods that meet the specifications obtained with the protocol, and currently contains characteristics of 30 sustainability assessment methods. The utility of the protocol and the tool are tested in a case study on the recovery of resources from domestic waste water. In several iterations, a combination of methods was selected, followed by execution of the selected sustainability assessment methods. The assessment results can be used in the first phase of the decision procedure that leads to a strategic choice for sustainable resource recovery from waste water in the Netherlands. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Statistical Project Control Tool for Engineering Managers
NASA Technical Reports Server (NTRS)
Bauch, Garland T.
2001-01-01
This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer). The literature review also pointed to project success factors, traditional project control tools, and performance measures that are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failures are increasing; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs using the SPCT method plotting results of 3 successful projects and 3 failed projects are reviewed, with success and failure being defined by the owner.
Interactive Visualization of Dependencies
ERIC Educational Resources Information Center
Moreno, Camilo Arango; Bischof, Walter F.; Hoover, H. James
2012-01-01
We present an interactive tool for browsing course requisites as a case study of dependency visualization. This tool uses multiple interactive visualizations to allow the user to explore the dependencies between courses. A usability study revealed that the proposed browser provides significant advantages over traditional methods, in terms of…
Total internal reflection laser tools and methods
Zediker, Mark S.; Faircloth, Brian O.; Kolachalam, Sharath K.; Grubb, Daryl L.
2016-02-02
Provided are high-power laser tools and laser heads that utilize total internal reflection ("TIR") structures to direct the laser beam along a laser beam path within the TIR structure. The TIR structures may be a TIR prism having its hypotenuse as a TIR surface.
Analytical Tools in School Finance Reform.
ERIC Educational Resources Information Center
Johns, R. L.
This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…
INTEGRATING A LANDSCAPE HYDROLOGIC ANALYSIS FOR WATERSHED ASSESSMENT
Methods to provide linkages between a hydrologic modeling tool (AGWA) and landscape assessment tool (ATtILA) for determining the vulnerability of semi-arid landscapes to natural and human-induced landscape pattern changes have been developed. The objective of this study is to ...
Prony Ringdown GUI (CERTS Prony Ringdown, part of the DSI Tool Box)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuffner, Francis; Marinovici, Laurentiu (PNNL); Hauer, John (PNNL)
2014-02-21
The PNNL Prony Ringdown graphical user interface is one analysis tool included in the Dynamic System Identification toolbox (DSI Toolbox). The DSI Toolbox is a MATLAB-based collection of tools for parsing and analyzing phasor measurement unit data, especially in regard to small signal stability. It includes tools to read the data, preprocess it, and perform small signal analysis. The toolbox is designed to provide a research environment for examining phasor measurement unit data and performing small signal stability analysis. The software uses a series of text-driven menus to help guide users and organize the toolbox features. Methods for reading in and populating phasor measurement unit data are provided, with appropriate preprocessing options for small-signal-stability analysis. The toolbox includes the Prony Ringdown GUI and basic algorithms to estimate information on oscillatory modes of the system, such as modal frequency and damping ratio.
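For context, classical Prony ringdown analysis fits a linear prediction model to a sampled signal and reads modal frequency and damping ratio off the model's poles. A minimal sketch of that idea on a toy single-mode signal (the DSI Toolbox itself is MATLAB-based; this is not its code):

```python
# A minimal sketch of Prony ringdown analysis: estimate oscillatory mode
# frequency and damping ratio from a sampled ringdown signal.
import numpy as np

def prony_modes(y, dt, order=8):
    """Fit an order-p linear prediction model to y and return
    (frequency_hz, damping_ratio) arrays, one entry per model pole."""
    p, n = order, len(y)
    # Linear prediction: y[k] = a1*y[k-1] + ... + ap*y[k-p]
    A = np.column_stack([y[p - 1 - j:n - 1 - j] for j in range(p)])
    a, *_ = np.linalg.lstsq(A, y[p:], rcond=None)
    roots = np.roots(np.concatenate(([1.0], -a)))
    lam = np.log(roots.astype(complex)) / dt      # continuous-time poles
    freq = np.abs(lam.imag) / (2 * np.pi)
    zeta = -lam.real / np.abs(lam)                # damping ratio per pole
    return freq, zeta

# Example: a 0.5 Hz mode with 5% damping, recovered from 0.1 s samples.
t = np.arange(0, 20, 0.1)
wn = 2 * np.pi * 0.5
y = np.exp(-0.05 * wn * t) * np.cos(wn * np.sqrt(1 - 0.05**2) * t)
f, z = prony_modes(y, 0.1)
```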
ARX - A Comprehensive Tool for Anonymizing Biomedical Data
Prasser, Fabian; Kohlmayer, Florian; Lautenschläger, Ronald; Kuhn, Klaus A.
2014-01-01
Collaboration and data sharing have become core elements of biomedical research. Especially when sensitive data from distributed sources are linked, privacy threats have to be considered. Statistical disclosure control allows the protection of sensitive data by introducing fuzziness. Reduction of data quality, however, needs to be balanced against gains in protection. Therefore, tools are needed which provide a good overview of the anonymization process to those responsible for data sharing. These tools require graphical interfaces and the use of intuitive and replicable methods. In addition, extensive testing, documentation and openness to reviews by the community are important. Existing publicly available software is limited in functionality, and often active support is lacking. We present ARX, an anonymization tool that i) implements a wide variety of privacy methods in a highly efficient manner, ii) provides an intuitive cross-platform graphical interface, iii) offers a programming interface for integration into other software systems, and iv) is well documented and actively supported. PMID:25954407
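Among the privacy methods such tools implement, k-anonymity is one of the most common. A minimal sketch of the underlying check, with hypothetical records (ARX itself is a Java tool and far more sophisticated):

```python
# A minimal sketch of the k-anonymity property that anonymization tools
# like ARX enforce: every quasi-identifier combination must be shared
# by at least k records. Records and field names are illustrative.
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every combination of quasi-identifier values occurs
    in at least k records."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values()) >= k

rows = [{"zip": "537**", "age": "30-40", "dx": "flu"},
        {"zip": "537**", "age": "30-40", "dx": "cold"}]
print(is_k_anonymous(rows, ["zip", "age"], k=2))  # True
```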
Shared control of a medical robot with haptic guidance.
Xiong, Linfei; Chng, Chin Boon; Chui, Chee Kong; Yu, Peiwu; Li, Yao
2017-01-01
Tele-operation of robotic surgery reduces the radiation exposure during interventional radiological operations. However, endoscope vision without force feedback on the surgical tool increases the difficulty of precise manipulation and the risk of tissue damage. The shared control of vision and force provides a novel approach to enhanced control with haptic guidance, which could lead to subtle dexterity and better maneuverability during minimally invasive surgery (MIS). The paper provides an innovative shared control method for a robotic MIS system, in which vision and haptic feedback are incorporated to provide guidance cues to the clinician during surgery. The incremental potential field (IPF) method is utilized to generate a guidance path based on the anatomy of the tissue and surgical tool interaction. Haptic guidance is provided at the master end to assist the clinician during tele-operated surgical robotic tasks. The approach has been validated with path following and virtual tumor targeting experiments. The experimental results demonstrate that, compared with vision-only guidance, shared control with vision and haptics improved the accuracy and efficiency of surgical robotic manipulation, reducing tool-position error and execution time. The validation experiments demonstrate that the shared control approach could help the surgical robot system provide stable assistance and precise performance in executing the designated surgical task. The methodology could also be implemented with other surgical robots with different surgical tools and applications.
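As an illustration of potential-field haptic guidance, the sketch below computes an attractive force pulling the tool toward the nearest point of a precomputed guidance path, saturated for safety. It is a simplified stand-in for the paper's incremental potential field (IPF) method, with made-up gains:

```python
# A minimal sketch of haptic guidance from an attractive potential field,
# assuming a quadratic potential around a precomputed guidance path.
import numpy as np

def guidance_force(tool_pos, path_pts, k_att=0.8, f_max=2.0):
    """Attractive force (N) pulling the tool toward the nearest path point."""
    d = path_pts - tool_pos                 # vectors to each path sample
    nearest = path_pts[np.argmin(np.linalg.norm(d, axis=1))]
    f = k_att * (nearest - tool_pos)        # F = -grad(0.5 * k * |e|^2)
    n = np.linalg.norm(f)
    return f if n <= f_max else f * (f_max / n)  # saturate for safety

path = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
print(guidance_force(np.array([1.0, 0.5, 0.0]), path))
```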
Review: visual analytics of climate networks
NASA Astrophysics Data System (ADS)
Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.
2015-09-01
Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
Morales, Alfredo M.; Gonzales, Marcela
2004-06-15
The present invention describes a method for fabricating an embossing tool or an x-ray mask tool, providing microstructures that smoothly vary in height from point to point in etched substrates, i.e., structures that can vary in all three dimensions. The process uses a lithographic technique to transfer an image pattern onto the surface of a silicon wafer by exposing and developing the resist and then etching the silicon substrate. Importantly, the photoresist is variably exposed so that when developed some of the resist layer remains. The remaining undeveloped resist acts as an etchant barrier to the reactive plasma used to etch the silicon substrate and therefore provides the ability to etch structures of variable depths.
An experimental method for the assessment of color simulation tools.
Lillo, Julio; Alvaro, Leticia; Moreira, Humberto
2014-07-22
The Simulcheck method for evaluating the accuracy of color simulation tools in relation to dichromats is described and used to test three color simulation tools: Variantor, Coblis, and Vischeck. A total of 10 dichromats (five protanopes, five deuteranopes) and 10 normal trichromats participated in the current study. Simulcheck includes two psychophysical tasks: the Pseudoachromatic Stimuli Identification task and the Minimum Achromatic Contrast task. The Pseudoachromatic Stimuli Identification task allows determination of the two chromatic angles (h(uv) values) that generate a minimum response in the yellow–blue opponent mechanism and, consequently, pseudoachromatic stimuli (greens or reds). The Minimum Achromatic Contrast task requires the selection of the gray background that produces minimum contrast (near zero change in the achromatic mechanism) for each pseudoachromatic stimulus selected in the previous task (L(R) values). Results showed important differences in the colorimetric transformations performed by the three evaluated simulation tools and their accuracy levels. Vischeck simulation accurately implemented the algorithm of Brettel, Viénot, and Mollon (1997). Only Vischeck appeared accurate (similarity in h(uv) and L(R) values between real and simulated dichromats) and, consequently, could render reliable color selections. It is concluded that Simulcheck is a consistent method because it provided an equivalent pattern of results for h(uv) and L(R) values irrespective of the stimulus set used to evaluate a simulation tool. Simulcheck was also considered valid because real dichromats provided expected h(uv) and L(R) values when performing the two psychophysical tasks included in this method. © 2014 ARVO.
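For reference, h(uv) is the standard CIELUV hue angle. A minimal sketch of computing it, with L(R) interpreted here as a relative luminance (that interpretation is an assumption for illustration; this is not the authors' code):

```python
# A minimal sketch of the colorimetric quantities named in the abstract:
# the CIELUV hue angle h(uv), and L(R) treated here as a luminance ratio
# of stimulus to minimum-contrast gray background (an assumption).
import math

def hue_angle_uv(u_star, v_star):
    """CIELUV hue angle h(uv) in degrees, in [0, 360)."""
    return math.degrees(math.atan2(v_star, u_star)) % 360.0

def luminance_ratio(l_stimulus, l_background):
    """Relative luminance of a pseudoachromatic stimulus vs. background."""
    return l_stimulus / l_background

print(hue_angle_uv(-20.0, 15.0))  # a greenish hue angle, ~143 degrees
```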
Realizing the Potential of Mobile Mental Health: New Methods for New Data in Psychiatry
Staples, Patrick; Onnela, Jukka-Pekka
2015-01-01
Smartphones are now ubiquitous and can be harnessed to offer psychiatry a wealth of real-time data regarding patient behavior, self-reported symptoms, and even physiology. The data collected from smartphones meet the three criteria of big data: velocity, volume, and variety. Although these data have tremendous potential, transforming them into clinically valid and useful information requires using new tools and methods as a part of assessment in psychiatry. In this paper, we introduce and explore numerous analytical methods and tools from the computational and statistical sciences that appear readily applicable to psychiatric data collected using smartphones. By matching smartphone data with appropriate statistical methods, psychiatry can better realize the potential of mobile mental health and empower both patients and providers with novel clinical tools. PMID:26073363
WebArray: an online platform for microarray data analysis
Xia, Xiaoqin; McClelland, Michael; Wang, Yipeng
2005-01-01
Background Many cutting-edge microarray analysis tools and algorithms, including the commonly used limma and affy packages in Bioconductor, need sophisticated knowledge of mathematics, statistics and computer skills for implementation. Commercially available software can provide a user-friendly interface at considerable cost. To facilitate the use of these tools for microarray data analysis on an open platform, we developed an online microarray data analysis platform, WebArray, for bench biologists to utilize these tools to explore data from single/dual color microarray experiments. Results The currently implemented functions are based on the limma and affy packages from Bioconductor, the spacings LOESS histogram (SPLOSH) method, a PCA-assisted normalization method and a genome mapping method. WebArray incorporates these packages and provides a user-friendly interface for accessing a wide range of key functions of limma and others, such as spot quality weighting, background correction, graphical plotting, normalization, linear modeling, empirical Bayes statistical analysis, false discovery rate (FDR) estimation, and chromosomal mapping for genome comparison. Conclusion WebArray offers a convenient platform for bench biologists to access several cutting-edge microarray data analysis tools. The website is freely available and runs on a Linux server with Apache and MySQL. PMID:16371165
In search of tools to aid logical thinking and communicating about medical decision making.
Hunink, M G
2001-01-01
To have real-time impact on medical decision making, decision analysts need a wide variety of tools to aid logical thinking and communication. Decision models provide a formal framework to integrate evidence and values, but they are commonly perceived as complex and difficult to understand by those unfamiliar with the methods, especially in the context of clinical decision making. The theory of constraints, introduced by Eliyahu Goldratt in the business world, provides a set of tools for logical thinking and communication that could potentially be useful in medical decision making. The author used the concept of a conflict resolution diagram to analyze the decision to perform carotid endarterectomy prior to coronary artery bypass grafting in a patient with both symptomatic coronary and asymptomatic carotid artery disease. The method enabled clinicians to visualize and analyze the issues, identify and discuss the underlying assumptions, search for the best available evidence, and use the evidence to make a well-founded decision. The method also facilitated communication among those involved in the care of the patient. Techniques from fields other than decision analysis can potentially expand the repertoire of tools available to support medical decision making and to facilitate communication in decision consults.
ProphTools: general prioritization tools for heterogeneous biological networks.
Navarro, Carmen; Martínez, Victor; Blanco, Armando; Cano, Carlos
2017-12-01
Networks have been proven effective representations for the analysis of biological data. As such, there exist multiple methods to extract knowledge from biological networks. However, these approaches usually limit their scope to a single biological entity type of interest or they lack the flexibility to analyze user-defined data. We developed ProphTools, a flexible open-source command-line tool that performs prioritization on a heterogeneous network. ProphTools prioritization combines a Flow Propagation algorithm similar to a Random Walk with Restarts and a weighted propagation method. A flexible model for the representation of a heterogeneous network allows the user to define a prioritization problem involving an arbitrary number of entity types and their interconnections. Furthermore, ProphTools provides functionality to perform cross-validation tests, allowing users to select the best network configuration for a given problem. ProphTools core prioritization methodology has already been proven effective in gene-disease prioritization and drug repositioning. Here we make ProphTools available to the scientific community as flexible, open-source software and perform a new proof-of-concept case study on long noncoding RNAs (lncRNAs) to disease prioritization. ProphTools is robust prioritization software that provides the flexibility not present in other state-of-the-art network analysis approaches, enabling researchers to perform prioritization tasks on any user-defined heterogeneous network. Furthermore, the application to lncRNA-disease prioritization shows that ProphTools can reach the performance levels of ad hoc prioritization tools without losing its generality. © The Authors 2017. Published by Oxford University Press.
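For illustration, a random walk with restarts, which the abstract says the flow propagation resembles, can be written in a few lines. A minimal sketch on a tiny hypothetical network (not ProphTools' actual implementation):

```python
# A minimal sketch of a random walk with restarts (RWR), the propagation
# scheme ProphTools' flow propagation is said to resemble. W is a
# column-normalized adjacency matrix; seeds mark the query entities.
import numpy as np

def rwr(W, seeds, restart=0.3, tol=1e-8, max_iter=1000):
    """Steady-state visit probabilities, used as prioritization scores."""
    p0 = seeds / seeds.sum()
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1 - restart) * W @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next
    return p

A = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]], float)
W = A / A.sum(axis=0)                       # column-normalize
scores = rwr(W, seeds=np.array([1.0, 0, 0, 0]))
```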
The Pocket Psychiatrist: Tools to enhance psychiatry education in family medicine.
Bass, Deanna; Brandenburg, Dana; Danner, Christine
2015-01-01
Primary care is the setting where the majority of patients seek assistance for their mental health problems. To assist family medicine residents in providing effective care to patients for mental health problems during residency and after graduation, it is essential they receive training in the assessment, diagnosis, and treatment of common mental health conditions. While there is some limited education time with a psychiatrist in our department, residents need tools and resources that provide education during their continuity clinics even when the psychiatrist is not available. Information on two tools that were developed is provided. These tools include teaching residents a brief method for conducting a psychiatric interview as well as a means to access evidence-based information on diagnosis and treatment of mental health conditions through templates available within our electronic medical record. © The Author(s) 2015.
Ashbaugh, F.A.; Murry, K.R.
1986-02-10
A boring tool and a method of operation are provided for boring two concentric holes of precision diameters and depths in a single operation. The boring tool includes an elongated tool body, a shank for attachment to a standard adjustable boring head which is used on a manual or numerical control milling machine and first and second diametrically opposed cutting flutes formed for cutting in opposite directions. The diameter of the elongated tool body is substantially equal to the distance from the first flute tip to the axis of rotation plus the distance from the second flute tip to the axis of rotation. The axis of rotation of the tool is spaced from the tool centerline a distance substantially equal to one-half the distance from the second flute tip to the axis of rotation minus one-half the distance from the first flute tip to the axis of rotation. The method includes the step of inserting the boring tool into the boring head, adjusting the distance between the tool centerline and the tool axis of rotation as described above and boring the two concentric holes.
Ashbaugh, Fred N.; Murry, Kenneth R.
1988-12-27
A boring tool and a method of operation are provided for boring two concentric holes of precision diameters and depths in a single operation. The boring tool includes an elongated tool body, a shank for attachment to a standard adjustable boring head which is used on a manual or numerical control milling machine and first and second diametrically opposed cutting edges formed for cutting in opposite directions. The diameter of the elongated tool body is substantially equal to the distance from the first cutting edge tip to the axis of rotation plus the distance from the second cutting edge tip to the axis of rotation. The axis of rotation of the tool is spaced from the tool centerline a distance substantially equal to one-half the distance from the second cutting edge tip to the axis of rotation minus one-half the distance from the first cutting edge tip to the axis of rotation. The method includes the step of inserting the boring tool into the boring head, adjusting the distance between the tool centerline and the tool axis of rotation as described above and boring the two concentric holes.
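The claimed geometry reduces to simple arithmetic: the tool body diameter is the sum of the two tip-to-axis distances, and the centerline offset is half their difference. A worked example with illustrative radii:

```python
# The patent's geometry as arithmetic; r1 and r2 are the distances from
# the first and second cutting edge tips to the axis of rotation
# (illustrative values, i.e., the two bore radii).

r1 = 5.0   # first cutting edge tip to axis of rotation (small bore radius)
r2 = 8.0   # second cutting edge tip to axis of rotation (large bore radius)

tool_body_diameter = r1 + r2          # "substantially equal to" r1 + r2
centerline_offset = r2 / 2 - r1 / 2   # axis-of-rotation offset from centerline

print(tool_body_diameter, centerline_offset)  # 13.0, 1.5
```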
Moles: Tool-Assisted Environment Isolation with Closures
NASA Astrophysics Data System (ADS)
de Halleux, Jonathan; Tillmann, Nikolai
Isolating test cases from environment dependencies is often desirable, as it increases test reliability and reduces test execution time. However, code that calls non-virtual methods or consumes sealed classes is often impossible to test in isolation. Moles is a new lightweight framework which addresses this problem. For any .NET method, Moles allows test-code to provide alternative implementations, given as .NET delegates, for which C# provides very concise syntax while capturing local variables in a closure object. Using code instrumentation, the Moles framework will redirect calls to provided delegates instead of the original methods. The Moles framework is designed to work together with the dynamic symbolic execution tool Pex to enable automated test generation. In a case study, testing code programmed against the Microsoft SharePoint Foundation API, we achieved full code coverage while running tests in isolation without an actual SharePoint server. The Moles framework integrates with .NET and Visual Studio.
Design and Analysis Tool for External-Compression Supersonic Inlets
NASA Technical Reports Server (NTRS)
Slater, John W.
2012-01-01
A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids useable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies and CFD analyses were performed to verify some of the analysis results.
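As an example of the analytic relations such low-fidelity methods rest on, the total pressure recovery across a normal shock is a standard compressible-flow result (textbook gas dynamics, not SUPIN source code):

```python
# A minimal sketch of one analytic relation a low-fidelity inlet tool
# relies on: total pressure recovery across a normal shock (gamma = 1.4).

def normal_shock_recovery(M, g=1.4):
    """pt2/pt1 across a normal shock at upstream Mach number M > 1."""
    a = ((g + 1) * M**2 / ((g - 1) * M**2 + 2)) ** (g / (g - 1))
    b = ((g + 1) / (2 * g * M**2 - (g - 1))) ** (1 / (g - 1))
    return a * b

print(normal_shock_recovery(1.6))  # ~0.895 at Mach 1.6
```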
NASA Astrophysics Data System (ADS)
Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T.; Morris, R. V.; Laura, J.
2018-04-01
The PySAT point spectra tool provides a flexible graphical interface, enabling scientists to apply a wide variety of preprocessing and machine learning methods to point spectral data, with an emphasis on multivariate regression.
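A minimal sketch of the kind of multivariate regression workflow such a tool wraps, here using scikit-learn partial least squares on made-up spectra and compositions (names and data are illustrative, not PySAT's code):

```python
# A minimal sketch of multivariate regression on point spectra, in the
# spirit of what a GUI like PySAT exposes. Data here are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import normalize

rng = np.random.default_rng(0)
spectra = rng.random((50, 200))          # 50 spectra x 200 channels
sio2_wt = rng.random(50) * 100           # hypothetical composition target, wt%

X = normalize(spectra, norm="l1")        # a simple preprocessing step
model = PLSRegression(n_components=5).fit(X, sio2_wt)
pred = model.predict(normalize(rng.random((1, 200)), norm="l1"))
```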
Pilot TMDL Applications Using the Impervious Cover Method
This report provides a description of the Impervious Cover (IC) method and tests its feasibility as a total maximum daily load (TMDL) development tool using watersheds nominated by five New England States.
Dehlendorf, Christine; Fitzpatrick, Judith; Steinauer, Jody; Swiader, Lawrence; Grumbach, Kevin; Hall, Cara; Kuppermann, Miriam
2017-07-01
We developed and formatively evaluated a tablet-based decision support tool for use by women prior to a contraceptive counseling visit to help them engage in shared decision making regarding method selection. Drawing upon formative work around women's preferences for contraceptive counseling and conceptual understanding of health care decision making, we iteratively developed a storyboard and then digital prototypes, based on best practices for decision support tool development. Pilot testing using both quantitative and qualitative data and cognitive testing was conducted. We obtained feedback from patient and provider advisory groups throughout the development process. Ninety-six percent of women who used the tool in pilot testing reported that it helped them choose a method, and qualitative interviews indicated acceptability of the tool's content and presentation. Compared to the control group, women who used the tool demonstrated trends toward increased likelihood of complete satisfaction with their method. Participant responses to cognitive testing were used in tool refinement. Our decision support tool appears acceptable to women in the family planning setting. Formative evaluation of the tool supports its utility among patients making contraceptive decisions, which can be further evaluated in a randomized controlled trial. Copyright © 2017 Elsevier B.V. All rights reserved.
GeneTools--application for functional annotation and statistical hypothesis testing.
Beisvag, Vidar; Jünge, Frode K R; Bergum, Hallgeir; Jølsum, Lars; Lydersen, Stian; Günther, Clara-Cecilie; Ramampiaro, Heri; Langaas, Mette; Sandvik, Arne K; Laegreid, Astrid
2006-10-24
Modern biology has shifted from "one gene" approaches to methods for genomic-scale analysis like microarray technology, which allow simultaneous measurement of thousands of genes. This has created a need for tools facilitating interpretation of biological data in "batch" mode. However, such tools often leave the investigator with large volumes of apparently unorganized information. To meet this interpretation challenge, gene-set, or cluster, testing has become a popular analytical tool. Many gene-set testing methods and software packages are now available, most of which use a variety of statistical tests to assess the genes in a set for biological information. However, the field is still evolving, and there is a great need for "integrated" solutions. GeneTools is a web service providing access to a database that brings together information from a broad range of resources. The annotation data are updated weekly, guaranteeing that users get the most recently available data. Data submitted by the user are stored in the database, where they can easily be updated, shared between users and exported in various formats. GeneTools provides three different tools: i) the NMC Annotation Tool, which offers annotations from several databases like UniGene, Entrez Gene, SwissProt and GeneOntology, in both single- and batch search mode; ii) the GO Annotator Tool, where users can add new gene ontology (GO) annotations to genes of interest (these user-defined GO annotations can be used in further analysis or exported for public distribution); and iii) eGOn, a tool for visualization and statistical hypothesis testing of GO category representation. As the first GO tool to do so, eGOn supports hypothesis testing for three different situations (the master-target situation, the mutually exclusive target-target situation and the intersecting target-target situation). An important additional function is an evidence-code filter that allows users to select the GO annotations for the analysis. GeneTools is the first "all in one" annotation tool, providing users with rapid extraction of highly relevant gene annotation data for, e.g., thousands of genes or clones at once. It allows a user to define and archive new GO annotations and it supports hypothesis testing related to GO category representations. GeneTools is freely available through www.genetools.no
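For illustration, a common form of GO category hypothesis testing is a 2x2 Fisher's exact test for over-representation of a category in a target gene list relative to a master list. A minimal sketch in that spirit (eGOn's actual tests may differ):

```python
# A minimal sketch of GO-category over-representation testing for a
# master-target comparison, using a 2x2 Fisher's exact test.
from scipy.stats import fisher_exact

def go_enrichment(target_in, target_size, master_in, master_size):
    """p-value for over-representation of one GO category in the target
    list, assuming the target list is a subset of the master list."""
    table = [[target_in, target_size - target_in],
             [master_in - target_in,
              (master_size - master_in) - (target_size - target_in)]]
    return fisher_exact(table, alternative="greater")[1]

# 15 of 100 target genes vs. 200 of 10000 master genes in the category.
print(go_enrichment(15, 100, 200, 10000))
```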
Systems Analysis - a new paradigm and decision support tools for the water framework directive
NASA Astrophysics Data System (ADS)
Bruen, M.
2008-05-01
In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this direction. This paper places the new paradigm in the context of the classical one and discusses some of the new approaches which can be used in the implementation of the WFD. These include multi-criteria decision support methods suitable for environmental problems, adaptive management, cognitive mapping, social learning and cooperative design, and group decision-making. Concordance methods (such as ELECTRE) and the Analytic Hierarchy Process (AHP) are identified as multi-criteria methods that can be readily integrated into Decision Support Systems (DSS) that deal with complex environmental issues with very many criteria, some of which are qualitative. The expanding use of the new paradigm provides an opportunity to observe and learn from the interaction of stakeholders with the new technology and to assess its effectiveness.
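As an illustration of one method named above, AHP derives criteria weights from a pairwise comparison matrix via its principal eigenvector. A minimal sketch with an illustrative three-criteria judgment matrix:

```python
# A minimal sketch of the AHP weighting step a DSS might embed: derive
# criteria priorities from a pairwise comparison matrix (toy judgments).
import numpy as np

P = np.array([[1,   3,   5],     # pairwise judgments: criterion i vs. j
              [1/3, 1,   2],
              [1/5, 1/2, 1]])

vals, vecs = np.linalg.eig(P)
w = np.abs(vecs[:, np.argmax(vals.real)].real)
weights = w / w.sum()            # normalized priority vector
print(weights.round(3))          # roughly [0.648, 0.230, 0.122]
```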
Toxicokinetic and Dosimetry Modeling Tools for Exposure ...
New technologies and in vitro testing approaches have been valuable additions to risk assessments that have historically relied solely on in vivo test results. Compared to in vivo methods, in vitro high throughput screening (HTS) assays are less expensive, faster and can provide mechanistic insights into chemical action. However, extrapolating from in vitro chemical concentrations to target tissue or blood concentrations in vivo is fraught with uncertainties, and modeling is dependent upon pharmacokinetic variables not measured in in vitro assays. To address this need, new tools have been created for characterizing, simulating, and evaluating chemical toxicokinetics. Physiologically-based pharmacokinetic (PBPK) models provide estimates of chemical exposures that produce potentially hazardous tissue concentrations, while tissue microdosimetry PK models relate whole-body chemical exposures to cell-scale concentrations. These tools rely on high-throughput in vitro measurements, and successful methods exist for pharmaceutical compounds that determine PK from limited in vitro measurements and chemical structure-derived property predictions. These high throughput (HT) methods provide a more rapid and less resource-intensive alternative to traditional PK model development. We have augmented these in vitro data with chemical structure-based descriptors and mechanistic tissue partitioning models to construct HTPBPK models for over three hundred environmental and pharmaceutical chemicals.
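A minimal sketch of the reverse dosimetry idea underlying such high-throughput methods: convert an in vitro active concentration to an equivalent daily dose with a steady-state model. Parameter names and values below are illustrative assumptions, not the project's model:

```python
# A minimal sketch of high-throughput reverse dosimetry with a
# steady-state one-compartment model (all parameters illustrative).

def oral_equivalent_dose(c_active_uM, cl_int_L_per_day, gfr_L_per_day,
                         fub, mw_g_per_mol):
    """Daily oral dose (mg/kg/day) producing a steady-state plasma
    concentration equal to the in vitro active concentration, assuming
    100% absorption, hepatic + renal clearance, and linear kinetics."""
    css_mg_per_L = c_active_uM * mw_g_per_mol / 1000.0   # uM -> mg/L
    clearance = fub * (cl_int_L_per_day + gfr_L_per_day)  # L/day/kg
    return css_mg_per_L * clearance                       # dose = Css * CL

print(oral_equivalent_dose(1.0, 10.0, 2.5, 0.1, 300.0))  # mg/kg/day
```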
Livet, Melanie; Fixsen, Amanda
2018-01-01
With mental health services shifting to community-based settings, community mental health (CMH) organizations are under increasing pressure to deliver effective services. Despite availability of evidence-based interventions, there is a gap between effective mental health practices and the care that is routinely delivered. Bridging this gap requires availability of easily tailorable implementation support tools to assist providers in implementing evidence-based intervention with quality, thereby increasing the likelihood of achieving the desired client outcomes. This study documents the process and lessons learned from exploring the feasibility of adapting such a technology-based tool, Centervention, as the example innovation, for use in CMH settings. Mixed-methods data on core features, innovation-provider fit, and organizational capacity were collected from 44 CMH providers. Lessons learned included the need to augment delivery through technology with more personal interactions, the importance of customizing and integrating the tool with existing technologies, and the need to incorporate a number of strategies to assist with adoption and use of Centervention-like tools in CMH contexts. This study adds to the current body of literature on the adaptation process for technology-based tools and provides information that can guide additional innovations for CMH settings.
An intelligent tool for activity data collection.
Sarkar, A M Jehad
2011-01-01
Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to set up an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collections. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user's activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool's performance in producing reliable datasets.
Providing Access and Visualization to Global Cloud Properties from GEO Satellites
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Palikonda, R.; Ayers, J. K.
2015-12-01
Providing public access to cloud macro- and microphysical properties is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a tool and method that allow end users to easily browse and access cloud information that is otherwise difficult to acquire and manipulate. The core of the tool is an application programming interface that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the dynamically generated imagery as an input into their own workflows for both image generation and cloud product requisition. This project builds upon NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product imagery accessible and easily searchable. As virtual supply chains that provide additional value at each link become more common, there is value in making satellite-derived cloud product information available through a simple access method and in allowing users to browse and view that imagery as they need it, rather than in a manner most convenient for the data provider. Using the Open Geospatial Consortium's Web Processing Service as our access method, we describe a hybrid local and cloud-based parallel processing system that can return both satellite imagery and cloud product imagery, as well as the binary data used to generate them, in multiple formats. The images and cloud products are sourced from multiple satellites and also "merged" datasets created by temporally and spatially matching satellite sensors. Finally, the tool and API allow users to access information spanning the full time range for which our group has data available. In the case of satellite imagery, the temporal range can span the entire lifetime of the sensor.
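A minimal sketch of what a client call to such an OGC Web Processing Service could look like; the endpoint, process identifier, and inputs below are hypothetical placeholders, not the actual service:

```python
# A minimal sketch of an OGC WPS 1.0.0 Execute request via KVP encoding.
# The URL, process identifier, and input names are hypothetical.
import requests

WPS_URL = "https://example.org/wps"        # hypothetical endpoint

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "CloudProductImagery",   # hypothetical process name
    # KVP-encoded inputs: product, satellite, time, and bounding box
    "datainputs": ("product=cloud_phase;satellite=GOES-13;"
                   "time=2015-06-01T12:00:00Z;bbox=-120,20,-60,50"),
}

resp = requests.get(WPS_URL, params=params, timeout=60)
resp.raise_for_status()
print(resp.headers.get("Content-Type"))    # e.g. an image or XML response
```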
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Daly, Don S.; Willse, Alan R.
The Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB is a flexible, open-source microarray image analysis tool that allows the user to customize the analysis of sets of microarray images. This tool provides several methods for identifying spots and quantifying spot statistics, as well as extensive diagnostic statistics and images to identify poor data quality or processing. The open nature of this software allows researchers to understand the algorithms used to provide intensity estimates and to modify them easily if desired.
3-DOF Force-Sensing Motorized Micro-Forceps for Robot-Assisted Vitreoretinal Surgery
Gonenc, Berk; Chamani, Alireza; Handa, James; Gehlbach, Peter; Taylor, Russell H.; Iordachita, Iulian
2017-01-01
In vitreoretinal surgery, membrane peeling is a prototypical task where a layer of fibrous tissue is delaminated off the retina with a micro-forceps by applying very fine forces that are mostly imperceptible to the surgeon. Previously we developed sensitized ophthalmic surgery tools based on fiber Bragg grating (FBG) strain sensors, which were shown to precisely detect forces at the instrument’s tip in two degrees of freedom perpendicular to the tool axis. This paper presents a new design that employs an additional sensor to capture also the tensile force along the tool axis. The grasping functionality is provided via a compact motorized unit. To compute forces, we investigate two distinct fitting methods: a linear regression and a nonlinear fitting based on second-order Bernstein polynomials. We carry out experiments to test the repeatability of sensor outputs, calibrate the sensor and validate its performance. Results demonstrate sensor wavelength repeatability within 2 pm. Although the linear method provides sufficient accuracy in measuring transverse forces, in the axial direction it produces a root mean square (rms) error over 3 mN even for a confined magnitude and direction of forces. On the other hand, the nonlinear method provides a more consistent and accurate measurement of both the transverse and axial forces for the entire force range (0–25 mN). Validation including random samples shows that our tool with the nonlinear force computation method can predict 3-D forces with an rms error under 0.15 mN in the transverse plane and within 2 mN accuracy in the axial direction. PMID:28736508
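For illustration, the two fitting approaches can be contrasted on a single-channel toy problem: a linear map versus a second-order Bernstein polynomial basis in a normalized sensor reading s in [0, 1]. Data and coefficients below are made up; the real calibration is multi-channel and three-dimensional:

```python
# A minimal sketch contrasting a linear calibration fit with a
# second-order Bernstein-polynomial fit on synthetic sensor data.
import numpy as np

def bernstein2(s):
    """Second-order Bernstein basis evaluated at s in [0, 1]."""
    return np.column_stack([(1 - s) ** 2, 2 * s * (1 - s), s ** 2])

rng = np.random.default_rng(1)
s = rng.random(100)                       # normalized sensor readings
force = 5.0 * s**2 + 1.0 * s + 0.2       # made-up nonlinear tip force (mN)

X_lin = np.column_stack([s, np.ones_like(s)])
coef_lin, *_ = np.linalg.lstsq(X_lin, force, rcond=None)
coef_b, *_ = np.linalg.lstsq(bernstein2(s), force, rcond=None)

rms_lin = np.sqrt(np.mean((X_lin @ coef_lin - force) ** 2))
rms_b = np.sqrt(np.mean((bernstein2(s) @ coef_b - force) ** 2))
print(rms_lin, rms_b)   # the quadratic basis captures the curvature
```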
A Tool for the Automated Design and Evaluation of Habitat Interior Layouts
NASA Technical Reports Server (NTRS)
Simon, Matthew A.; Wilhite, Alan W.
2013-01-01
The objective of space habitat design is to minimize mass and system size while providing adequate space for all necessary equipment and a functional layout that supports crew health and productivity. Unfortunately, development and evaluation of interior layouts is often ignored during conceptual design because of the subjectivity and long times required using current evaluation methods (e.g., human-in-the-loop mockup tests and in-depth CAD evaluations). Early, more objective assessment could prevent expensive design changes that may increase vehicle mass and compromise functionality. This paper describes a new interior design evaluation method to enable early, structured consideration of habitat interior layouts. This interior layout evaluation method features a comprehensive list of quantifiable habitat layout evaluation criteria, automatic methods to measure these criteria from a geometry model, and application of systems engineering tools and numerical methods to construct a multi-objective value function measuring the overall habitat layout performance. In addition to a detailed description of this method, a C++/OpenGL software tool which has been developed to implement this method is also discussed. This tool leverages geometry modeling coupled with collision detection techniques to identify favorable layouts subject to multiple constraints and objectives (e.g., minimize mass, maximize contiguous habitable volume, maximize task performance, and minimize crew safety risks). Finally, a few habitat layout evaluation examples are described to demonstrate the effectiveness of this method and tool to influence habitat design.
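A minimal sketch of the weighted multi-objective value function such a tool might construct; criteria names, weights, and scores are illustrative assumptions, not the paper's actual criteria list:

```python
# A minimal sketch of a multi-objective layout value function: a weighted
# sum of normalized criterion scores in [0, 1], higher is better.

def layout_value(scores, weights):
    """Overall layout value for one candidate interior layout."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights sum to 1
    return sum(weights[c] * scores[c] for c in weights)

weights = {"mass": 0.30, "habitable_volume": 0.30,
           "task_performance": 0.25, "safety": 0.15}
layout_a = {"mass": 0.7, "habitable_volume": 0.8,
            "task_performance": 0.6, "safety": 0.9}
print(layout_value(layout_a, weights))  # 0.735
```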
Recommendations following a multi-laboratory comparison of microbial source tracking methods
Microbial source tracking (MST) methods are under development to provide resource managers with tools to identify sources of fecal contamination in water. Some of the most promising methods currently under development were recently evaluated in the Source Identification Protocol ...
Fernández-Carrobles, M. Milagro; Tadeo, Irene; Bueno, Gloria; Noguera, Rosa; Déniz, Oscar; Salido, Jesús; García-Rojo, Marcial
2013-01-01
Given that angiogenesis and lymphangiogenesis are strongly related to prognosis in neoplastic and other pathologies, and that many methods exist that provide different results, we aim to construct a morphometric tool that measures different aspects of the shape and size of vascular vessels in a complete and accurate way. The tool presented is based on vessel closing, an essential property for properly characterizing the size and shape of vascular and lymphatic vessels. The method is fast and accurate, improving on existing tools for angiogenesis analysis. The tool also improves the accuracy of vascular density measurements, since the set of endothelial cells forming a vessel is considered as a single object. PMID:24489494
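For illustration, "vessel closing" can be approximated with a morphological binary closing that fills a vessel's lumen so the whole vessel is measured as one object. A minimal scipy-based sketch on a toy ring mask (not the authors' implementation):

```python
# A minimal sketch of the vessel-closing idea via morphological closing:
# fill the lumen of a ring-shaped vessel mask so it counts as one object.
import numpy as np
from scipy import ndimage

mask = np.zeros((9, 9), bool)
mask[2:7, 2:7] = True
mask[3:6, 3:6] = False                    # ring of endothelium, open lumen

closed = ndimage.binary_closing(mask, structure=np.ones((5, 5)))
labels, n = ndimage.label(closed)
areas = ndimage.sum(closed, labels, index=list(range(1, n + 1)))
print(n, areas)                           # one vessel object and its area
```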
Recent advances in vitro assays, in silico tools, and systems biology approaches provide opportunities for refined mechanistic understanding for chemical safety assessment that will ultimately lead to reduced reliance on animal-based methods. With the U.S. commercial chemical lan...
Tool-specific performance of vibration-reducing gloves for attenuating fingers-transmitted vibration
Welcome, Daniel E.; Dong, Ren G.; Xu, Xueyan S.; Warren, Christopher; McDowell, Thomas W.
2016-01-01
BACKGROUND Fingers-transmitted vibration can cause vibration-induced white finger. The effectiveness of vibration-reducing (VR) gloves in reducing hand-transmitted vibration to the fingers has not been sufficiently examined. OBJECTIVE The objective of this study is to examine the tool-specific performance of VR gloves in reducing finger-transmitted vibrations in three orthogonal directions (3D) from powered hand tools. METHODS A transfer function method was used to estimate the tool-specific effectiveness of four typical VR gloves. The transfer functions of the VR glove fingers in three directions were either measured in this study or during a previous study using a 3D laser vibrometer. More than seventy vibration spectra of various tools or machines were used in the estimations. RESULTS When assessed based on frequency-weighted acceleration, the gloves provided little vibration reduction. In some cases, the gloves amplified the vibration by more than 10%, especially the neoprene glove. However, the neoprene glove performed best when the assessment was based on unweighted acceleration: it reduced the unweighted vibration by 10% or more for 27 of the 79 tools. If the dominant vibration of a tool handle or workpiece was in the shear direction relative to the fingers, as observed in the operation of needle scalers, hammer chisels, and bucking bars, the gloves did not reduce the vibration but increased it. CONCLUSIONS This study confirmed that the effectiveness for reducing vibration varied among the gloves, and that the vibration reduction of each glove depended on the tool, the vibration direction to the fingers, and the finger location. VR gloves, including certified anti-vibration gloves, do not provide much vibration reduction when judged based on frequency-weighted acceleration. However, some of the VR gloves can provide more than 10% reduction of the unweighted vibration for some tools or workpieces. Tools and gloves can be matched for better effectiveness in protecting the fingers. PMID:27867313
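A minimal sketch of the transfer-function approach described above: a glove's finger transmissibility is applied band-by-band to a tool's handle spectrum, and the overall rms acceleration is compared before and after. All numbers are illustrative, not measurements from the study.

```python
import numpy as np

# Band centre frequencies (Hz), the tool's handle acceleration per
# 1/3-octave band (m/s^2), and a glove's finger transmissibility per
# band. All values are invented for illustration.
freqs = np.array([16.0, 31.5, 63.0, 125.0, 250.0, 500.0, 1000.0])
a_tool = np.array([3.0, 6.0, 10.0, 8.0, 5.0, 3.0, 1.5])
T_glove = np.array([1.05, 1.0, 0.95, 0.85, 0.70, 0.50, 0.40])

a_finger = T_glove * a_tool          # estimated vibration reaching the fingers

def overall_rms(a):
    return np.sqrt(np.sum(a ** 2))   # band rms values add in quadrature

reduction = 1.0 - overall_rms(a_finger) / overall_rms(a_tool)
print(f"unweighted rms reduction: {reduction:.1%}")
```

A frequency-weighted assessment would multiply both spectra by band weights that emphasize low frequencies before summing, which is why the weighted and unweighted judgments of the same glove can disagree.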
Aubrey, Wayne; Riley, Michael C; Young, Michael; King, Ross D; Oliver, Stephen G; Clare, Amanda
2015-01-01
Many advances in synthetic biology require the removal of a large number of genomic elements from a genome. Most existing deletion methods leave behind markers, and as there are a limited number of markers, such methods can only be applied a fixed number of times. Deletion methods that recycle markers generally are either imprecise (remove untargeted sequences), or leave scar sequences which can cause genome instability and rearrangements. No existing marker recycling method is automation-friendly. We have developed a novel openly available deletion tool that consists of: 1) a method for deleting genomic elements that can be repeatedly used without limit, is precise, scar-free, and suitable for automation; and 2) software to design the method's primers. Our tool is sequence agnostic and could be used to delete large numbers of coding sequences, promoter regions, transcription factor binding sites, terminators, etc. in a single genome. We have validated our tool on the deletion of non-essential open reading frames (ORFs) from S. cerevisiae. The tool is applicable to arbitrary genomes, and we provide primer sequences for the deletion of: 90% of the ORFs from the S. cerevisiae genome, 88% of the ORFs from the S. pombe genome, and 85% of the ORFs from the L. lactis genome.
Software Tools to Support the Assessment of System Health
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
2013-01-01
This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDIMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard and truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDIMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of sensors that optimally meet the performance goals and the constraints. It identifies optimal sensor suite solutions by utilizing a merit (i.e., cost) function with one of several available optimization approaches. As part of its analysis, S4 can expose fault conditions that are difficult to diagnose due to an incomplete diagnostic philosophy and/or a lack of sensors. S4 was originally developed and applied to liquid rocket engines. It was subsequently used to study the optimized selection of sensors for a simulation-based aircraft engine diagnostic system. The ETA Tool is a software-based analysis tool that augments the testability analysis and reporting capabilities of a commercial-off-the-shelf (COTS) package. An initial diagnostic assessment is performed by the COTS software using a user-developed, qualitative, directed-graph model of the system being analyzed. The ETA Tool accesses system design information captured within the model and the associated testability analysis output to create a series of six reports for various system engineering needs. These reports are highlighted in the presentation. The ETA Tool was developed by NASA to support the verification of fault management requirements early in the Launch Vehicle design process. Due to their early development during the design process, the TEAMS-based diagnostic model and the ETA Tool were able to positively influence the system design by highlighting gaps in failure detection, fault isolation, and failure recovery.
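As a toy illustration of merit-function-driven sensor selection of the kind S4 performs (this is not the actual S4 algorithm or its merit function), the sketch below scores candidate suites and searches exhaustively over small subsets; the sensor names and scores are invented.

```python
from itertools import combinations

# Toy sensor selection: score every suite of up to four sensors with an
# invented merit function and keep the best. S4's real merit functions,
# search techniques, and constraints are user-defined.
SENSORS = ["N1", "N2", "EGT", "Wf", "P3", "T3"]
BENEFIT = {"N1": 4.0, "N2": 3.0, "EGT": 5.0, "Wf": 2.0, "P3": 3.0, "T3": 2.0}

def merit(suite):
    diagnostic_benefit = sum(BENEFIT[s] for s in suite)
    synergy = 1.5 if "EGT" in suite and "P3" in suite else 0.0
    cost = 1.2 * len(suite)            # penalize larger suites
    return diagnostic_benefit + synergy - cost

candidates = (set(c) for r in range(1, 5) for c in combinations(SENSORS, r))
best = max(candidates, key=merit)
print("best suite:", sorted(best), "merit:", merit(best))
```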
Risk Management Implementation Tool
NASA Technical Reports Server (NTRS)
Wright, Shayla L.
2004-01-01
Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making, in order to continually assess what could go wrong, determine which risks are important to deal with, implement strategies to deal with those risks, and measure the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach staff how to apply risk management to their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control; with these steps and the methods and tools that go along with them, identifying and dealing with risk is clear-cut. The office that I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The Risk Management Implementation Tool will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects. The RMIT will provide a means for NASA to continuously assess risks. The goals and purposes for this tool are to provide a simple means to manage risks, to be used by program and project managers throughout NASA for managing risk, and to take an aggressive approach to advertise and advocate the use of RMIT at each NASA center.
Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith
2015-01-01
Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper focuses on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is demonstrated through a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool that provides effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.
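A minimal sketch of the COPRAS ranking stage, assuming the attribute weights have already been produced by the fuzzy AHP step; the decision matrix, weights, and attribute types below are illustrative only.

```python
import numpy as np

# Rows = candidate machine tools, columns = attributes (e.g. spindle
# power, accuracy, cost); `benefit` marks attributes to maximize.
# All numbers, including the AHP-derived weights, are invented.
X = np.array([[3200, 0.8, 7.5],
              [2800, 0.9, 6.0],
              [3500, 0.7, 8.2]], dtype=float)
w = np.array([0.40, 0.35, 0.25])          # weights (e.g. from fuzzy AHP)
benefit = np.array([True, True, False])

D = w * X / X.sum(axis=0)                 # weighted normalized decision matrix
S_plus = D[:, benefit].sum(axis=1)        # sums over beneficial attributes
S_minus = D[:, ~benefit].sum(axis=1)      # sums over cost attributes

# Relative significance: Q_i = S+_i + sum(S-) / (S-_i * sum(1 / S-_i))
Q = S_plus + S_minus.sum() / (S_minus * (1.0 / S_minus).sum())
utility = 100.0 * Q / Q.max()             # closeness coefficient, best = 100%
print("ranking (best first):", np.argsort(-utility))
```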
Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet
NASA Technical Reports Server (NTRS)
Muss, J. A.; Johnson, C. W.; Gotchy, M. B.
2000-01-01
The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.
The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.
Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A
2010-03-01
Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUIs, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC).
Inrig, Stephen J; Higashi, Robin T; Tiro, Jasmin A; Argenbright, Keith E; Lee, Simon J Craddock
2017-04-01
Despite federal funding for breast cancer screening, fragmented infrastructure and limited organizational capacity hinder access to the full continuum of breast cancer screening and clinical follow-up procedures among rural-residing women. We proposed a regional hub-and-spoke model, partnering with local providers to expand access across North Texas. We describe development and application of an iterative, mixed-method tool to assess county capacity to conduct community outreach and/or patient navigation in a partnership model. Our tool combined publicly-available quantitative data with qualitative assessments during site visits and semi-structured interviews. Application of our tool resulted in shifts in capacity designation in 10 of 17 county partners: 8 implemented local outreach with hub navigation; 9 relied on the hub for both outreach and navigation. Key factors influencing capacity: (1) formal linkages between partner organizations; (2) inter-organizational relationships; (3) existing clinical service protocols; (4) underserved populations. Qualitative data elucidate how our tool captured these capacity changes. Our capacity assessment tool enabled the hub to establish partnerships with county organizations by tailoring support to local capacity and needs. Absent a vertically integrated provider network for preventive services in these rural counties, our tool facilitated a virtually integrated regional network to extend access to breast cancer screening to underserved women. Copyright © 2016 Elsevier Ltd. All rights reserved.
Commercial Molecular Tests for Fungal Diagnosis from a Practical Point of View.
Lackner, Michaela; Lass-Flörl, Cornelia
2017-01-01
The increasing interest in molecular diagnostics is a result of tremendously improved knowledge on fungal infections in the past 20 years and the rapid development of new methods, in particular polymerase chain reaction. High expectations have been placed on molecular diagnostics, and the number of laboratories now using the relevant technology is rapidly increasing, resulting in an obvious need for standardization and definition of laboratory organization. In the past 10 years, multiple new molecular tools were marketed for the detection of DNA, antibodies, cell wall components, or other antigens. In contrast to classical culture methods, molecular methods do not detect a viable organism, but only molecules which indicate its presence; these can be nucleic acids, cell components (antigens), or antibodies (Fig. 1). In this chapter, an overview is provided of commercially available detection tools, their strengths, and how to use them. A main focus is laid on providing tips and tricks that make daily life easier. We try to highlight methodical details which are not covered in the manufacturers' instructions for these test kits, but are based on our personal experience in the laboratory. It is important to keep in mind that molecular tools cannot replace culture, microscopy, or a critical view of the patient's clinical history, signs, and symptoms, but they provide a valuable add-on tool. Diagnosis should not be based solely on a molecular test, but molecular tools might deliver an important piece of information that helps match the diagnostic puzzle to a diagnosis, in particular as few tests are in vitro diagnostic (IVD) tests, or only part of the whole test carries the IVD certificate (e.g., DNA extraction is often not included). Please be aware that the authors do not claim to provide a complete overview of all commercially available diagnostic assays currently marketed for fungal detection, as those are subject to constant change. A main focus is put on commonly used panfungal assays and pathogen-specific assays, including Aspergillus-specific, Candida-specific, Cryptococcus-specific, Histoplasma-specific, and Pneumocystis-specific assays. Assays are categorized according to their underlying principle as either antigen-detecting, antibody-detecting, or DNA-detecting (Fig. 1). Other non-DNA-detecting nucleic acid methods such as FISH and PNA FISH are not summarized in this chapter; an overview of test performance, common false positives, and the clinical evaluation of commercial tests in studies is provided in a previous book series by Javier Yugueros Marcos and David H. Pincus (Marcos and Pincus, Methods Mol Biol 968:25-54, 2013).
Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie
This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems, software up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods, non-testing approaches such as predictive computer models up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised on international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the test method for a given purpose. Relevance encapsulates the scientific basis of the test method, its capacity to predict adverse effects in the "target system" (i.e. human health or the environment) as well as its applicability for the intended purpose. In this chapter we focus on the validation of non-animal in vitro alternative testing methods and review the concepts, challenges, processes and tools fundamental to the validation of in vitro methods intended for hazard testing of chemicals. We explore major challenges and peculiarities of validation in this area. Based on the notion that validation per se is a scientific endeavour that needs to adhere to key scientific principles, namely objectivity and appropriate choice of methodology, we examine basic aspects of study design and management, and provide illustrations of statistical approaches to describe predictive performance of validated test methods as well as their reliability.
Exploring the single-cell RNA-seq analysis landscape with the scRNA-tools database.
Zappia, Luke; Phipson, Belinda; Oshlack, Alicia
2018-06-25
As single-cell RNA-sequencing (scRNA-seq) datasets have become more widespread, the number of tools designed to analyse these data has dramatically increased. Navigating the vast sea of tools now available is becoming increasingly challenging for researchers. In order to better facilitate selection of appropriate analysis tools we have created the scRNA-tools database (www.scRNA-tools.org) to catalogue and curate analysis tools as they become available. Our database collects a range of information on each scRNA-seq analysis tool and categorises them according to the analysis tasks they perform. Exploration of this database gives insights into the areas of rapid development of analysis methods for scRNA-seq data. We see that many tools perform tasks specific to scRNA-seq analysis, particularly clustering and ordering of cells. We also find that the scRNA-seq community embraces an open-source and open-science approach, with most tools available under open-source licenses and preprints being extensively used as a means to describe methods. The scRNA-tools database provides a valuable resource for researchers embarking on scRNA-seq analysis and records the growth of the field over time.
Correction tool for Active Shape Model based lumbar muscle segmentation.
Valenzuela, Waldo; Ferguson, Stephen J; Ignasiak, Dominika; Diserens, Gaelle; Vermathen, Peter; Boesch, Chris; Reyes, Mauricio
2015-08-01
In the clinical environment, accuracy and speed of the image segmentation process play a key role in the analysis of pathological regions. Despite advances in anatomic image segmentation, time-effective correction tools are commonly needed to improve segmentation results. Therefore, these tools must provide faster corrections with a low number of interactions, and a user-independent solution. In this work we present a new interactive method for correcting image segmentations. Given an initial segmentation and the original image, our tool provides a 2D/3D environment that enables 3D shape correction through simple 2D interactions. Our scheme is based on direct manipulation of free form deformation adapted to a 2D environment. This approach enables an intuitive and natural correction of 3D segmentation results. The developed method has been implemented into a software tool and has been evaluated for the task of lumbar muscle segmentation from Magnetic Resonance Images. Experimental results show that a full segmentation correction could be performed within an average correction time of 6±4 minutes and an average of 68±37 interactions, while maintaining the quality of the final segmentation result within an average Dice coefficient of 0.92±0.03.
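The core idea of correcting a contour through free form deformation can be sketched as follows. This toy version uses a bilinear (order-1) lattice rather than the B-spline formulation typically used in FFD, and the grid, displacement, and contour are invented, not taken from the paper's tool.

```python
import numpy as np

def ffd_deform(points, disp):
    """points: (N, 2) coords in [0, 1]^2; disp: (G, G, 2) control offsets."""
    G = disp.shape[0]
    u = points * (G - 1)                       # position in lattice units
    ij = np.clip(np.floor(u).astype(int), 0, G - 2)
    f = u - ij                                 # fractional position in a cell
    out = points.astype(float).copy()
    for di in (0, 1):                          # bilinear blend of the four
        for dj in (0, 1):                      # surrounding control points
            w = (f[:, 0] if di else 1 - f[:, 0]) * (f[:, 1] if dj else 1 - f[:, 1])
            out += w[:, None] * disp[ij[:, 0] + di, ij[:, 1] + dj]
    return out

disp = np.zeros((4, 4, 2))
disp[1, 2] = (0.03, -0.02)                     # "drag" one control point
contour = np.stack([np.linspace(0.2, 0.8, 50), np.full(50, 0.5)], axis=1)
print(ffd_deform(contour, disp)[:3])
```

Dragging a single control point smoothly bends all nearby contour points, which is what makes this style of interaction intuitive for correcting segmentations.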
Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization
NASA Technical Reports Server (NTRS)
Gern, Frank H.
2012-01-01
This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are presented for the evaluation of different aerodynamic tools including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel were used as a reference data set in order to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite element based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional aerodynamic validation data.
Measuring Workload Demand of Informatics Systems with the Clinical Case Demand Index
Iyengar, M. Sriram; Rogith, Deevakar; Florez-Arango, Jose F
2017-01-01
Introduction: The increasing use of Health Information Technology (HIT) can add substantially to the workload of clinical providers. Current methods for assessing workload do not take into account the nature of clinical cases and the use of HIT tools while solving them. Methods: The Clinical Case Demand Index (CCDI), consisting of a summary score and visual representation, was developed to meet this need. Consistency with current perceived workload measures was evaluated in a randomized controlled trial of a mobile health system. Results: CCDI is significantly correlated with existing workload measures and inversely related to provider performance. Discussion: CCDI combines subjective and objective characteristics of clinical cases along with cognitive and clinical dimensions. Applications include evaluation of HIT tools, clinician scheduling, and medical education. Conclusion: CCDI supports comparative effectiveness research of HIT tools. In addition, CCDI could have numerous applications, including training, clinical trials, design of clinical workflows, and others. PMID:29854166
Concentration solar power optimization system and method of using the same
Andraka, Charles E
2014-03-18
A system and method for optimizing at least one mirror of at least one CSP system is provided. The system has a screen for displaying light patterns for reflection by the mirror, a camera for receiving a reflection of the light patterns from the mirror, and a solar characterization tool. The solar characterization tool has a characterizing unit for determining at least one mirror parameter of the mirror based on an initial position of the camera and the screen, and a refinement unit for refining the determined parameter(s) based on an adjusted position of the camera and screen whereby the mirror is characterized. The system may also be provided with a solar alignment tool for comparing at least one mirror parameter of the mirror to a design geometry whereby an alignment error is defined, and at least one alignment unit for adjusting the mirror to reduce the alignment error.
Enabling the use of hereditary information from pedigree tools in medical knowledge-based systems.
Gay, Pablo; López, Beatriz; Plà, Albert; Saperas, Jordi; Pous, Carles
2013-08-01
The use of family information is a key issue when dealing with inherited illnesses. This kind of information usually comes in the form of pedigree files, which contain structured information as trees or graphs describing the family relationships. Knowledge-based systems should incorporate the information gathered by pedigree tools to assist medical decision making. In this paper, we propose a method to achieve such a goal, which consists of defining new indicators, and methods and rules to compute them from family trees. The method is illustrated with several case studies. We provide information about its implementation and integration in a case-based reasoning tool. The method has been experimentally tested with breast cancer diagnosis data. The results show the feasibility of our methodology. Copyright © 2013 Elsevier Inc. All rights reserved.
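A minimal sketch of deriving one hereditary indicator from a pedigree structure, assuming the pedigree is available as parent links plus an affected flag. The indicator chosen here (number of affected first-degree relatives) and the data layout are illustrative, not the paper's own indicators or file format.

```python
# Toy pedigree: each person has parent links and an "affected" flag.
pedigree = {
    "proband": {"parents": ["mother", "father"], "affected": False},
    "mother":  {"parents": ["grandma", "grandpa"], "affected": True},
    "father":  {"parents": [], "affected": False},
    "sister":  {"parents": ["mother", "father"], "affected": True},
    "grandma": {"parents": [], "affected": True},
    "grandpa": {"parents": [], "affected": False},
}

def first_degree(person):
    """Parents plus anyone sharing at least one parent (siblings)."""
    parents = set(pedigree[person]["parents"])
    siblings = {p for p, rec in pedigree.items()
                if p != person and set(rec["parents"]) & parents}
    return parents | siblings

affected = sum(pedigree[r]["affected"] for r in first_degree("proband"))
print("affected first-degree relatives:", affected)   # -> 2
```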
Poverty and pediatric palliative care: what can we do?
Beaune, Laura; Leavens, Anne; Muskat, Barbara; Ford-Jones, Lee; Rapoport, Adam; Zlotnik Shaul, Randi; Morinis, Julia; Chapman, Lee Ann
2014-01-01
It has been recognized that families of children with life-limiting health conditions struggle with significant financial demands, yet may not have awareness of resources available to them. Additionally, health care providers may not be aware of the socioeconomic needs of families they care for. This article describes a mixed-methods study examining the content validity and utility for health care providers of a poverty screening tool and companion resource guide for the pediatric palliative care population. The study found high relevance and validity of the tool. Significant barriers to implementing the screening tool in clinical practice were described by participants, including: concerns regarding time required, roles and responsibilities, and discomfort in asking about income. Implications for practice and suggestions for improving the tool are discussed. Screening and attention to the social determinants of health lie within the scope of practice of all health care providers. Social workers can play a leadership role in this work.
Washburne, Alex D; Silverman, Justin D; Leff, Jonathan W; Bennett, Dominic J; Darcy, John L; Mukherjee, Sayan; Fierer, Noah; David, Lawrence A
2017-01-01
Marker gene sequencing of microbial communities has generated big datasets of microbial relative abundances varying across environmental conditions, sample sites and treatments. These data often come with putative phylogenies, providing unique opportunities to investigate how shared evolutionary history affects microbial abundance patterns. Here, we present a method to identify the phylogenetic factors driving patterns in microbial community composition. We use the method, "phylofactorization," to re-analyze datasets from the human body and soil microbial communities, demonstrating how phylofactorization is a dimensionality-reducing tool, an ordination-visualization tool, and an inferential tool for identifying edges in the phylogeny along which putative functional ecological traits may have arisen.
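Conceptually, each phylofactorization step contrasts the taxa on the two sides of a phylogeny edge with an isometric log-ratio (ILR) balance; the sketch below computes that balance for one candidate split. This is a simplified illustration of the underlying contrast, not the published algorithm, and the abundance data are random placeholders.

```python
import numpy as np

def ilr_balance(abund, R, S):
    """abund: (samples, taxa) relative abundances; R, S: taxon index lists
    for the two sides of a candidate phylogeny edge."""
    r, s = len(R), len(S)
    gm = lambda x: np.exp(np.log(x).mean(axis=1))     # geometric mean per sample
    return np.sqrt(r * s / (r + s)) * np.log(gm(abund[:, R]) / gm(abund[:, S]))

# 10 samples over 6 taxa, drawn from a Dirichlet as stand-in compositions.
abund = np.random.default_rng(1).dirichlet(np.ones(6), size=10)
print(ilr_balance(abund, R=[0, 1, 2], S=[3, 4, 5]))
```

The full method would evaluate such a balance for every candidate edge and factor out the edge whose balance best explains variation across samples, then iterate on the remainder.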
Pisa, Pedro T; Landais, Edwige; Margetts, Barrie; Vorster, Hester H; Friedenreich, Christine M; Huybrechts, Inge; Martin-Prevel, Yves; Branca, Francesco; Lee, Warren T K; Leclercq, Catherine; Jerling, Johann; Zotor, Francis; Amuna, Paul; Al Jawaldeh, Ayoub; Aderibigbe, Olaide Ruth; Amoussa, Waliou Hounkpatin; Anderson, Cheryl A M; Aounallah-Skhiri, Hajer; Atek, Madjid; Benhura, Chakare; Chifamba, Jephat; Covic, Namukolo; Dary, Omar; Delisle, Hélène; El Ati, Jalila; El Hamdouchi, Asmaa; El Rhazi, Karima; Faber, Mieke; Kalimbira, Alexander; Korkalo, Liisa; Kruger, Annamarie; Ledo, James; Machiweni, Tatenda; Mahachi, Carol; Mathe, Nonsikelelo; Mokori, Alex; Mouquet-Rivier, Claire; Mutie, Catherine; Nashandi, Hilde Liisa; Norris, Shane A; Onabanjo, Oluseye Olusegun; Rambeloson, Zo; Saha, Foudjo Brice U; Ubaoji, Kingsley Ikechukwu; Zaghloul, Sahar; Slimani, Nadia
2018-01-02
To carry out an inventory of the availability, challenges, and needs of dietary assessment (DA) methods in Africa as a pre-requisite to provide evidence, and to set directions (strategies) for implementing common dietary methods and supporting web-research infrastructure across countries. The inventory was performed within the framework of the "Africa's Study on Physical Activity and Dietary Assessment Methods" (AS-PADAM) project, which involves international institutional and African networks. An inventory questionnaire was developed and disseminated through the networks. Eighteen countries responded to the dietary inventory questionnaire. Various DA tools were reported in Africa; the 24-Hour Dietary Recall and the Food Frequency Questionnaire were the most commonly used tools. Few tools were validated and tested for reliability. Face-to-face interview was the most common method of administration. No computerized software or other new (web) technologies were reported. No tools were standardized across countries. The lack of comparable DA methods across the represented countries is a major obstacle to implementing comprehensive and joint nutrition-related programmes for surveillance, programme evaluation, research, and prevention. There is a need to develop new DA methods or adapt existing ones across countries, employing related research infrastructure that has been validated and standardized in other settings, with a view to standardizing methods for wider use.
ADAM: analysis of discrete models of biological systems using computer algebra.
Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard
2011-07-20
Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics.
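For a small synchronous Boolean network, attractors can be found by exhaustive simulation, as sketched below with invented update rules; ADAM's contribution is to avoid this exponential enumeration by translating the model into a polynomial dynamical system and finding attractors algebraically, which scales to the sparse models common in biology.

```python
from itertools import product

def step(state):
    """Synchronous update of a toy 3-node Boolean network (rules invented)."""
    x, y, z = state
    return (y and z, x or z, not x)

def attractors(n=3):
    """Enumerate all 2^n states; follow each orbit until it repeats."""
    found = set()
    for start in product([False, True], repeat=n):
        seen, s = [], start
        while s not in seen:
            seen.append(s)
            s = step(s)
        cycle = tuple(seen[seen.index(s):])       # periodic part of the orbit
        # Canonicalize by taking the lexicographically smallest rotation.
        found.add(min(cycle[i:] + cycle[:i] for i in range(len(cycle))))
    return found

for cyc in attractors():
    print(f"{len(cyc)}-cycle:", [tuple(int(b) for b in s) for s in cyc])
```

Fixed points of such a system are exactly the solutions of x = F(x), which over GF(2) becomes a system of polynomial equations, the long-studied computer algebra problem the abstract refers to.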
A comparison of sleep assessment tools by nurses and patients in critical care.
Richardson, Annette; Crow, Wendy; Coghill, Elaine; Turnock, Christopher
2007-09-01
The aim of this critical care sleep assessment pilot study was to evaluate the usefulness of three sleep assessment tools to identify which, if any, provided the closest comparison between the nurses' judgement and the patients' experience of their sleep. The study objectives were to: (i) compare patients' and nurses' assessment of sleep using three different rating tools; (ii) ascertain patients' preferences with non-interventional, user-friendly, practical tools in critical care; and (iii) recommend changes and improvements to the way that sleep is assessed and documented. Sleep is important for promoting critical care recovery, and sleep disturbance is known to cause irritability, aggression and increased stress levels. The availability and use of valid critical care sleep assessment tools is limited. In this descriptive comparative study, three sleep assessment rating scales were constructed to provide easy-to-understand tools for completion by both patients and nurses in critical care. Structured interviews were undertaken with 82 patients and 82 nurses using a convenience sample from four multispecialty critical care units in one large teaching trust. Patients were included in the study if they met a list of pre-defined criteria intended to obtain responses from lucid, orientated patients. No tool produced a close association between the nurses' assessment of the patients' sleep and the patients' assessment of their own sleep. Patients found two of the three tools easy to use when rating their sleep. Objective invasive measurements of sleep, as well as complex subjective tools, appear inappropriate for use as part of daily critical care practice. The application of simple rating scores has a high degree of error when nurses assess patients' sleep, even though high levels of patient observation and assessment are practised in critical care. More research is needed to examine the assessment of sleep in critical care, particularly linking rating scales to alternative methods of physiological assessment of sleep. Findings indicate nurses are unable to accurately assess critical care patients' sleep using rating assessment tools. However, patients were found to prefer two sleep assessment tools: one banded in hours to assess sleep quantity, and one offering a comparison against normal sleep to assess sleep quality. This study reviews the importance of sleep assessment and the diverse methods available for assessing sleep, focussing on the critically ill patient. More noteworthy, it highlights that nurses' sole judgements of patients' sleep are not a reliable method in clinical practice; however, it provides some indication of the applicability of 'easy to use' tools to assist patients in assessing their own sleep.
Nanohole optical tweezers in heterogeneous mixture analysis
NASA Astrophysics Data System (ADS)
Hacohen, Noa; Ip, Candice J. X.; Laxminarayana, Gurunatha K.; DeWolf, Timothy S.; Gordon, Reuven
2017-08-01
Nanohole optical trapping is a tool that has been shown to analyze proteins at the single-molecule level using pure samples. The next step is to detect and study single molecules in unpurified samples. We demonstrate that, using our double-nanohole optical tweezer configuration, single particles in an egg white solution can be classified when trapped. Molecules of different sizes produce different signal variations in their trapped state, allowing the proteins to be statistically characterized. Root-mean-square signal variation and trap stiffness are used to distinguish between the different trapped proteins. This method of isolating and identifying single molecules in heterogeneous samples has great potential to become a reliable tool for the biomedical and scientific communities.
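Two of the statistics mentioned above can be sketched as follows. Note that the equipartition estimate of trap stiffness strictly applies to a calibrated position signal, so treating a recorded trace this way is a simplification, and the trace below is synthetic rather than experimental data.

```python
import numpy as np

kB, T = 1.380649e-23, 295.0             # Boltzmann constant (J/K), lab temp (K)

# Synthetic stand-in for a trapped-particle trace: 12 nm Gaussian jitter.
trace = np.random.default_rng(2).normal(0.0, 12e-9, 100_000)

# Root-mean-square variation of the signal about its mean.
rms_variation = np.sqrt(np.mean((trace - trace.mean()) ** 2))

# Equipartition theorem: k = k_B * T / <x^2> for a harmonic trap.
stiffness = kB * T / np.var(trace)      # N/m

print(f"rms variation: {rms_variation * 1e9:.1f} nm")
print(f"trap stiffness: {stiffness * 1e6:.2f} pN/um")
```

Because larger molecules diffuse less inside the trap, the rms variation of the trapped signal falls (and the apparent stiffness rises) with particle size, which is what allows statistical classification.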
RECENT DEVELOPMENTS IN ANALYTICAL METHODS FOR FIBROUS AMPHIBOLE IN VERMICULITE ATTIC INSULATION
The U.S. Environmental Protection Agency has developed a test method for the analysis of fibrous amphibole in vermiculite attic insulation. This method was developed to provide the Agency with monitoring tools to study the occurrence and potential for exposure to fibrous amphibo...
Van de Velde, Stijn; Roshanov, Pavel; Kortteisto, Tiina; Kunnamo, Ilkka; Aertgeerts, Bert; Vandvik, Per Olav; Flottorp, Signe
2016-03-05
A computerised clinical decision support system (CCDSS) is a technology that uses patient-specific data to provide relevant medical knowledge at the point of care. It is considered to be an important quality improvement intervention, and the implementation of CCDSS is growing substantially. However, the significant investments do not consistently result in value for money due to content, context, system and implementation issues. The Guideline Implementation with Decision Support (GUIDES) project aims to improve the impact of CCDSS through optimised implementation based on high-quality evidence-based recommendations. To achieve this, we will develop tools that address the factors that determine successful CCDSS implementation. We will develop the GUIDES tools in four steps, using the methods and results of the Tailored Implementation for Chronic Diseases (TICD) project as a starting point: (1) a review of research evidence and frameworks on the determinants of implementing recommendations using CCDSS; (2) a synthesis of a comprehensive framework for the identified determinants; (3) the development of tools for use of the framework and (4) pilot testing the utility of the tools through the development of a tailored CCDSS intervention in Norway, Belgium and Finland. We selected the conservative management of knee osteoarthritis as a prototype condition for the pilot. During the process, the authors will collaborate with an international expert group to provide input and feedback on the tools. This project will provide guidance and tools on methods of identifying implementation determinants and selecting strategies to implement evidence-based recommendations through CCDSS. We will make the GUIDES tools available to CCDSS developers, implementers, researchers, funders, clinicians, managers, educators, and policymakers internationally. The tools and recommendations will be generic, which makes them scalable to a large spectrum of conditions. Ultimately, the better implementation of CCDSS may lead to better-informed decisions and improved care and patient outcomes for a wide range of conditions. PROSPERO, CRD42016033738.
A New Disability-related Health Care Needs Assessment Tool for Persons With Brain Disorders
Kim, Yoon; Eun, Sang June; Kim, Wan Ho; Lee, Bum-Suk; Leigh, Ja-Ho; Kim, Jung-Eun
2013-01-01
Objectives This study aimed to develop a health needs assessment (HNA) tool for persons with brain disorders and to assess the unmet needs of persons with brain disorders using the developed tool. Methods The authors used consensus methods to develop a HNA tool. Using a randomized stratified systematic sampling method adjusted for sex, age, and districts, 57 registered persons (27 severe and 30 mild cases) with brain disorders dwelling in Seoul, South Korea were chosen and medical specialists investigated all of the subjects with the developed tools. Results The HNA tool for brain disorders we developed included four categories: 1) medical interventions and operations, 2) assistive devices, 3) rehabilitation therapy, and 4) regular follow-up. This study also found that 71.9% of the subjects did not receive appropriate medical care, which implies that the severity of their disability is likely to be exacerbated and permanent, and the loss irrecoverable. Conclusions Our results showed that the HNA tool for persons with brain disorders based on unmet needs defined by physicians can be a useful method for evaluating the appropriateness and necessity of medical services offered to the disabled, and it can serve as the norm for providing health care services for disabled persons. Further studies should be undertaken to increase validity and reliability of the tool. Fundamental research investigating the factors generating or affecting the unmet needs is necessary; its results could serve as basis for developing policies to eliminate or alleviate these factors. PMID:24137530
A Tool for Estimating Variability in Wood Preservative Treatment Retention
Patricia K. Lebow; Adam M. Taylor; Timothy M. Young
2015-01-01
Composite sampling is standard practice for evaluation of preservative retention levels in preservative-treated wood. Current protocols provide an average retention value but no estimate of uncertainty. Here we describe a statistical method for calculating uncertainty estimates using the standard sampling regime with minimal additional chemical analysis. This tool can...
Selecting a Free Web-Hosted Survey Tool for Student Use
ERIC Educational Resources Information Center
Elbeck, Matt
2014-01-01
This study provides marketing educators a review of free web-based survey services and guidance for student use. A mixed methods approach started with online searches and metrics identifying 13 free web-hosted survey services, described as demonstration or project tools, and ranked using popularity and importance web-based metrics. For each…
A Web-Based Learning Tool Improves Student Performance in Statistics: A Randomized Masked Trial
ERIC Educational Resources Information Center
Gonzalez, Jose A.; Jover, Lluis; Cobo, Erik; Munoz, Pilar
2010-01-01
Background: e-status is a web-based tool able to generate different statistical exercises and to provide immediate feedback to students' answers. Although the use of Information and Communication Technologies (ICTs) is becoming widespread in undergraduate education, there are few experimental studies evaluating its effects on learning. Method: All…
ERIC Educational Resources Information Center
Shriberg, Michael
2002-01-01
This paper analyzes recent efforts to measure sustainability in higher education across institutions. The benefits of cross-institutional assessments include: identifying and benchmarking leaders and best practices; communicating common goals, experiences, and methods; and providing a directional tool to measure progress toward the concept of a…
Opportunities and Possibilities: Philosophical Hermeneutics and the Educational Researcher
ERIC Educational Resources Information Center
Agrey, Loren G.
2014-01-01
The opportunities that philosophical hermeneutics provide as a research tool are explored and it is shown that this qualitative research method can be employed as a valuable tool for the educational researcher. Used as an alternative to the standard quantitative approach to educational research, currently being the dominant paradigm of data…
An Examination of Selected Software Testing Tools: 1992
1992-12-01
… historical test database, the test management and problem reporting tools were examined using the sample test database provided by each supplier. … track the impact of new methods, organizational structures, and technologies. Metrics Manager is supported by an industry database that allows …
Selection and application of microbial source tracking tools for water-quality investigations
Stoeckel, Donald M.
2005-01-01
Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.
NASA Astrophysics Data System (ADS)
Ohnuma, Hidetoshi; Kawahira, Hiroichi
1998-09-01
An automatic alternative phase shift mask (PSM) pattern layout tool has been newly developed. The tool is dedicated to embedded DRAM in logic devices, to shrink gate line width while improving line-width controllability in lithography processes with design rules below 0.18 micrometers using KrF excimer laser exposure. The tool can create Levenson-type PSMs used in combination with a binary mask in a double exposure method for positive photoresist. Using graphs, the tool automatically creates alternative PSM patterns without introducing any phase conflicts. By applying it to actual embedded DRAM in logic cells, we have produced 0.16 micrometer gate resist patterns in both random logic and DRAM areas. The patterns were fabricated using two masks with the double exposure method. Gate line width has been well controlled under a practical exposure-focus window.
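The graph formulation of conflict-free phase assignment can be sketched as two-coloring: features that are too close must receive opposite phases, and a valid assignment exists exactly when the conflict graph is bipartite (has no odd cycles). This illustrates the general technique, not the tool's actual algorithm.

```python
from collections import deque

def assign_phases(n_features, conflict_edges):
    """BFS 2-coloring: nodes are critical features, edges join features
    closer than the minimum spacing. Returns a 0/180-degree phase per
    feature, or None if an odd cycle makes the layout unresolvable."""
    adj = {v: [] for v in range(n_features)}
    for a, b in conflict_edges:
        adj[a].append(b)
        adj[b].append(a)
    phase = {}
    for v in adj:
        if v in phase:
            continue
        phase[v] = 0
        queue = deque([v])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in phase:
                    phase[w] = 180 - phase[u]   # alternate 0 / 180 degrees
                    queue.append(w)
                elif phase[w] == phase[u]:
                    return None                 # odd cycle: phase conflict
    return phase

print(assign_phases(4, [(0, 1), (1, 2), (2, 3)]))  # {0: 0, 1: 180, 2: 0, 3: 180}
```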
Composite sandwich structure and method for making same
NASA Technical Reports Server (NTRS)
Magurany, Charles J. (Inventor)
1995-01-01
A core for a sandwich structure which has multi-ply laminate ribs separated by voids is made as an integral unit in one single curing step. Tooling blocks corresponding to the voids are first wrapped by strips of prepreg layup equal to one half of each rib laminate so a continuous wall of prepreg material is formed around the tooling blocks. The wrapped tooling blocks are next pressed together laterally, like tiles, so adjoining walls from two tooling blocks are joined. The assembly is then cured by conventional methods, and afterwards the tooling blocks are removed so voids are formed. The ribs can be provided with integral tabs forming bonding areas for face sheets, and face sheets may be co-cured with the core ribs. The new core design is suitable for discrete rib cores used in space telescopes and reflector panels, where quasi-isotropic properties and zero coefficient of thermal expansion are required.
One- and two-dimensional dopant/carrier profiling for ULSI
NASA Astrophysics Data System (ADS)
Vandervorst, W.; Clarysse, T.; De Wolf, P.; Trenkler, T.; Hantschel, T.; Stephenson, R.; Janssens, T.
1998-11-01
Dopant/carrier profiles constitute the basis of the operation of a semiconductor device and thus play a decisive role in the performance of a transistor. They are subject to the same scaling laws as the other constituents of a modern semiconductor device and continuously evolve towards shallower and more complex configurations. This evolution has increased the demands on profiling techniques, in particular in terms of resolution and quantification, such that constant reevaluation and improvement of the tools is required. As no single technique provides all the necessary information (dopant distribution, electrical activation, ...) with the requested spatial and depth resolution, the present paper attempts to provide an assessment of those tools which can be considered as the main metrology technologies for ULSI applications. For 1D dopant profiling, secondary ion mass spectrometry (SIMS) has progressed towards a generally accepted tool meeting the requirements. For 1D carrier profiling, spreading resistance profiling and microwave surface impedance profiling are envisaged as the best choices, but extra developments are required to promote them to routinely applicable methods. As no main metrology tool exists for 2D dopant profiling, the main emphasis is on 2D carrier profiling tools based on scanning probe microscopy. Scanning spreading resistance microscopy (SSRM) and scanning capacitance microscopy (SCM) are the preferred methods, although neither of them yet meets all the requirements. Complementary information can be extracted from nanopotentiometry, which samples the device operation in more detail. Concurrent use of carrier profiling tools, nanopotentiometry, analysis of device characteristics, and simulations is required to provide a complete characterization of deep submicron devices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
Farrokh-Eslamlou, Hamidreza; Aghlmand, Siamak; Eslami, Mohammad; Homer, Caroline S E
2014-04-01
We investigated whether use of the World Health Organization's (WHO's) Decision-Making Tool (DMT) for Family Planning Clients and Providers would improve the process and outcome quality indicators of family planning (FP) services in Iran. The DMT was adapted for the Iranian setting. The study evaluated 24 key FP quality indicators grouped into two main areas, namely process and outcome. The tool was implemented in 52 urban and rural public health facilities in four selected and representative provinces of Iran. A pre-post methodology was undertaken to examine whether use of the tool improved the quality of FP services and client satisfaction with the services. Quantitative data were collected through observations of counselling and exit interviews with clients using structured questionnaires. Different numbers of FP clients were recruited during the baseline and post-intervention rounds (n=448 vs 547, respectively). The DMT improved many client-provider interaction indicators, including verbal and non-verbal communication (p<0.05). The tool also positively impacted clients' choice of contraceptive method, providers' technical competence, and the quality of information provided to clients (p<0.05). Use of the tool improved clients' satisfaction with FP services (from 72% to 99%; p<0.05). The adapted WHO DMT has the potential to improve the quality of FP services.
2013-01-01
Background Predictive tools are already being implemented to assist in Emergency Department bed management by forecasting the expected total volume of patients. Yet these tools are unable to detect and diagnose when estimates fall short. Early detection of hotspots, that is, subpopulations of patients presenting in unusually high numbers, would help authorities to manage limited health resources and communicate effectively about emerging risks. We evaluate an anomaly detection tool that signals when, and in what way, Emergency Departments in 18 hospitals across the state of Queensland, Australia, are significantly exceeding their forecasted patient volumes. Methods The tool in question is an adaptation of the Surveillance Tree methodology initially proposed by Sparks and Okugami (IntStatl 1:2–24, 2010) for the monitoring of vehicle crashes. The methodology was trained on presentations to 18 Emergency Departments across Queensland over the period 2006 to 2008. Artificial increases were added to simulated, in-control counts for these data to evaluate the tool’s sensitivity, timeliness and diagnostic capability. The results were compared with those from a univariate control chart. The tool was then applied to data from 2009, the year of the H1N1 (or ‘Swine Flu’) pandemic. Results The Surveillance Tree method was found to be at least as effective as a univariate, exponentially weighted moving average (EWMA) control chart when increases occurred in a subgroup of the monitored population. The method has advantages over the univariate control chart in that it allows for the monitoring of multiple disease groups while still allowing control of the overall false alarm rate. It is also able to detect changes in the makeup of the Emergency Department presentations, even when the total count remains unchanged. Furthermore, the Surveillance Tree method provides diagnostic information useful for service improvements or disease management. Conclusions Multivariate surveillance provides a useful tool in the management of hospital Emergency Departments by not only efficiently detecting unusually high numbers of presentations, but by providing information about which groups of patients are causing the increase. PMID:24313914
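For orientation, a minimal Python sketch of the univariate EWMA comparator mentioned in this abstract appears below; the daily counts, smoothing weight, in-control parameters, and limit width are illustrative assumptions, not values from the study.

    import numpy as np

    counts = np.array([100, 96, 104, 99, 101, 98, 130, 142, 150])  # daily ED totals (invented)
    lam, L = 0.2, 3.0              # smoothing weight and control-limit width (assumed)
    mu, sigma = 100.0, 10.0        # in-control mean and standard deviation (assumed)

    z = mu                         # initialize the EWMA at the in-control mean
    for t, x in enumerate(counts):
        z = lam * x + (1 - lam) * z                                    # EWMA update
        se = sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1))))
        if z > mu + L * se:        # one-sided alarm for unusually high volumes
            print(f"day {t}: alarm, EWMA={z:.1f} exceeds limit {mu + L * se:.1f}")

Such a chart monitors only the total count; the Surveillance Tree approach described above additionally partitions the count into subgroups to locate which groups drive an alarm.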
Web-Based Tools for Text-Based Patient-Provider Communication in Chronic Conditions: Scoping Review
Grunfeld, Eva; Makuwaza, Tutsirai; Bender, Jacqueline L
2017-01-01
Background Patients with chronic conditions require ongoing care which not only necessitates support from health care providers outside appointments but also self-management. Web-based tools for text-based patient-provider communication, such as secure messaging, allow for sharing of contextual information and personal narrative in a simple accessible medium, empowering patients and enabling their providers to address emerging care needs. Objective The objectives of this study were to (1) conduct a systematic search of the published literature and the Internet for Web-based tools for text-based communication between patients and providers; (2) map tool characteristics, their intended use, contexts in which they were used, and by whom; (3) describe the nature of their evaluation; and (4) understand the terminology used to describe the tools. Methods We conducted a scoping review using the MEDLINE (Medical Literature Analysis and Retrieval System Online) and EMBASE (Excerpta Medica Database) databases. We summarized information on the characteristics of the tools (structure, functions, and communication paradigm), intended use, context and users, evaluation (study design and outcomes), and terminology. We performed a parallel search of the Internet to compare with tools identified in the published literature. Results We identified 54 papers describing 47 unique tools from 13 countries studied in the context of 68 chronic health conditions. The majority of tools (77%, 36/47) had functions in addition to communication (eg, viewable care plan, symptom diary, or tracker). Eight tools (17%, 8/47) were described as allowing patients to communicate with the team or multiple health care providers. Most of the tools were intended to support communication regarding symptom reporting (49%, 23/47), and lifestyle or behavior modification (36%, 17/47). The type of health care providers who used tools to communicate with patients were predominantly allied health professionals of various disciplines (30%, 14/47), nurses (23%, 11/47), and physicians (19%, 9/47), among others. Over half (52%, 25/48) of the tools were evaluated in randomized controlled trials, and 23 tools (48%, 23/48) were evaluated in nonrandomized studies. Terminology of tools varied by intervention type and functionality and did not consistently reflect a theme of communication. The majority of tools found in the Internet search were patient portals from 6 developers; none were found among published articles. Conclusions Web-based tools for text-based patient-provider communication were identified from a wide variety of clinical contexts and with varied functionality. Tools were most prevalent in contexts where intended use was self-management. Few tools for team-based communication were found, but this may become increasingly important as chronic disease care becomes more interdisciplinary. PMID:29079552
3D liver volume reconstructed for palpation training.
Tibamoso, Gerardo; Perez-Gutierrez, Byron; Uribe-Quevedo, Alvaro
2013-01-01
Virtual reality systems for medical procedures, such as the palpation of different organs, require fast, robust, accurate and reliable computational methods to provide realism during interaction with 3D biological models. This paper presents the segmentation, reconstruction and palpation simulation of a healthy liver volume as a tool for training. The chosen method considers the mechanical characteristics and properties of the liver to simulate palpation interactions correctly, which makes it an appropriate complementary tool for familiarizing medical students with liver anatomy.
Streamflow Duration Assessment Method for the Pacific Northwest
The Streamflow Duration Assessment Method for the Pacific Northwest is a scientific tool developed by EPA and the U.S. Army Corps of Engineers to provide a rapid assessment framework to distinguish between ephemeral, intermittent and perennial streams.
methylPipe and compEpiTools: a suite of R packages for the integrative analysis of epigenomics data.
Kishore, Kamal; de Pretis, Stefano; Lister, Ryan; Morelli, Marco J; Bianchi, Valerio; Amati, Bruno; Ecker, Joseph R; Pelizzola, Mattia
2015-09-29
Numerous methods are available to profile several epigenetic marks, providing data with different genome coverage and resolution. Large epigenomic datasets are then generated, and often combined with other high-throughput data, including RNA-seq, ChIP-seq for transcription factor (TF) binding, and DNase-seq experiments. Despite the numerous computational tools covering specific steps in the analysis of large-scale epigenomics data, comprehensive software solutions for their integrative analysis are still missing. Multiple tools must be identified and combined to jointly analyze histone marks, TF binding and other -omics data together with DNA methylation data, complicating the analysis of these data and their integration with publicly available datasets. To overcome the burden of integrating various data types with multiple tools, we developed two companion R/Bioconductor packages. The former, methylPipe, is tailored to the analysis of high- or low-resolution DNA methylomes in several species, accommodating (hydroxy-)methyl-cytosines in both CpG and non-CpG sequence contexts. The analysis of multiple whole-genome bisulfite sequencing experiments is supported, while maintaining the ability to integrate targeted genomic data. The latter, compEpiTools, seamlessly incorporates the results obtained with methylPipe and supports their integration with other epigenomics data. It provides a number of methods to score these data in regions of interest, leading to the identification of enhancers, lncRNAs, and RNAPII stalling/elongation dynamics. Moreover, it allows a fast and comprehensive annotation of the resulting genomic regions, and the association of the corresponding genes with non-redundant Gene Ontology terms. Finally, the package includes a flexible method based on heatmaps for the integration of various data types, combining annotation tracks with continuous or categorical data tracks. methylPipe and compEpiTools provide a comprehensive Bioconductor-compliant solution for the integrative analysis of heterogeneous epigenomics data. These packages give biologists with minimal R skills a complete toolkit facilitating the analysis of their own data, and help accelerate the analyses performed by more experienced bioinformaticians.
Methods of epigenome editing for probing the function of genomic imprinting.
Rienecker, Kira DA; Hill, Matthew J; Isles, Anthony R
2016-10-01
The curious patterns of imprinted gene expression draw interest from several scientific disciplines to the functional consequences of genomic imprinting. Methods of probing the function of imprinting itself have largely been indirect and correlational, relying heavily on conventional transgenics. Recently, the burgeoning field of epigenome editing has provided new tools and suggested strategies for asking causal questions with site specificity. This perspective article aims to outline how these new methods may be applied to questions of functional imprinting and, with this aim in mind, to suggest new dimensions for the expansion of these epigenome-editing tools.
Debris Examination Using Ballistic and Radar Integrated Software
NASA Technical Reports Server (NTRS)
Griffith, Anthony; Schottel, Matthew; Lee, David; Scully, Robert; Hamilton, Joseph; Kent, Brian; Thomas, Christopher; Benson, Jonathan; Branch, Eric; Hardman, Paul;
2012-01-01
The Debris Examination Using Ballistic and Radar Integrated Software (DEBRIS) program was developed to provide rapid and accurate analysis of debris observed by the NASA Debris Radar (NDR). This software provides a greatly improved analysis capacity over earlier manual processes, allowing for up to four times as much data to be analyzed by one-quarter of the personnel required by earlier methods. There are two applications that comprise the DEBRIS system: the Automated Radar Debris Examination Tool (ARDENT) and the primary DEBRIS tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geihe, Erika; Trantow, Brian; Wender, Paul
The introduction of tools to study, control or expand the inner-workings of algae has been slow to develop. Provided are embodiments of a molecular method based on guanidinium-rich molecular transporters (GR-MoTrs) for bringing molecular cargos into algal cells. The methods of the disclosure have been shown to work in wild-type algae that have an intact cell wall. Developed using Chlamydomonas reinhardtii, this method is also successful with less studied algae, including Neochloris oleoabundans and Scenedesmus dimorphus, thus providing a new and versatile tool for algal research and modification. The method of delivering a cargo compound to an algal cell comprises contacting an algal cell with a guanidinium-rich delivery vehicle comprising a guanidinium-rich molecular transporter (GR-MoTr) linked to a cargo compound desired to be delivered to the algal cell, whereby the guanidinium-rich molecular transporter can traverse the algal cell wall, thereby delivering the cargo compound to the algal cell.
Johnson, Eileanoir B.; Gregory, Sarah; Johnson, Hans J.; Durr, Alexandra; Leavitt, Blair R.; Roos, Raymund A.; Rees, Geraint; Tabrizi, Sarah J.; Scahill, Rachael I.
2017-01-01
The selection of an appropriate segmentation tool is a challenge facing any researcher aiming to measure gray matter (GM) volume. Many tools have been compared, yet there is currently no method that can be recommended above all others; in particular, there is a lack of validation in disease cohorts. This work utilizes a clinical dataset to conduct an extensive comparison of segmentation tools. Our results confirm that all tools have advantages and disadvantages, and we present a series of considerations that may be of use when selecting a GM segmentation method, rather than a ranking of these tools. Seven segmentation tools were compared using 3 T MRI data from 20 controls, 40 premanifest Huntington’s disease (HD), and 40 early HD participants. Segmented volumes underwent detailed visual quality control. Reliability and repeatability of total, cortical, and lobular GM were investigated in repeated baseline scans. The relationship between each tool was also examined. Longitudinal within-group change over 3 years was assessed via generalized least squares regression to determine sensitivity of each tool to disease effects. Visual quality control and raw volumes highlighted large variability between tools, especially in occipital and temporal regions. Most tools showed reliable performance and the volumes were generally correlated. Results for longitudinal within-group change varied between tools, especially within lobular regions. These differences highlight the need for careful selection of segmentation methods in clinical neuroimaging studies. This guide acts as a primer aimed at the novice or non-technical imaging scientist providing recommendations for the selection of cohort-appropriate GM segmentation software. PMID:29066997
Spectral analysis for GNSS coordinate time series using chirp Fourier transform
NASA Astrophysics Data System (ADS)
Feng, Shengtao; Bo, Wanju; Ma, Qingzun; Wang, Zifan
2017-12-01
Spectral analysis of global navigation satellite system (GNSS) coordinate time series provides a principal tool for understanding the intrinsic mechanisms that affect tectonic movements. Spectral analysis methods such as the fast Fourier transform, the Lomb-Scargle spectrum, the evolutionary power spectrum, the wavelet power spectrum, etc. are used to find periodic characteristics in time series. Among these, the chirp Fourier transform (CFT), which has less stringent requirements, is tested with synthetic and actual GNSS coordinate time series, demonstrating the accuracy and efficiency of the method. With series length limited only to even numbers, CFT provides a convenient tool for windowed spectral analysis. Results on ideal synthetic data show CFT to be accurate and efficient, while results on actual data show that CFT can be used to derive periodic information from GNSS coordinate time series.
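As a rough illustration of zoomed spectral analysis of a coordinate series, the Python sketch below uses the chirp z-transform (scipy.signal.czt), a close relative of the CFT described here, rather than the paper's own algorithm; the sampling rate, frequency band, and synthetic signal are assumptions for illustration only.

    import numpy as np
    from scipy.signal import czt

    fs = 1.0                                   # one position estimate per day (assumed)
    t = np.arange(2000)
    x = 3.0 * np.sin(2 * np.pi * t / 365.25) + np.random.randn(t.size)  # annual term + noise

    f1, f2, m = 1 / 500.0, 1 / 100.0, 512      # zoom into periods of 100-500 days
    w = np.exp(-2j * np.pi * (f2 - f1) / (m * fs))   # ratio between evaluation points
    a = np.exp(2j * np.pi * f1 / fs)                 # starting point on the unit circle
    spectrum = np.abs(czt(x, m=m, w=w, a=a))         # DFT evaluated only inside the band

    freqs = f1 + (f2 - f1) * np.arange(m) / m
    print("dominant period (days):", 1 / freqs[spectrum.argmax()])

Evaluating the transform only on a narrow band of the unit circle is what makes this family of methods convenient for windowed analysis of long geodetic series.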
Ashman, Amy M; Collins, Clare E; Brown, Leanne J; Rae, Kym M
2016-01-01
Background Dietitians ideally should provide personally tailored nutrition advice to pregnant women. Provision is hampered by a lack of appropriate tools for nutrition assessment and counselling in practice settings. Smartphone technology, through the use of image-based dietary records, can address limitations of traditional methods of recording dietary intake. Feedback on these records can then be provided by the dietitian via smartphone. Efficacy and validity of these methods require examination. Objective The aims of the Australian Diet Bytes and Baby Bumps study, which used image-based dietary records and a purpose-built brief Selected Nutrient and Diet Quality (SNaQ) tool to provide tailored nutrition advice to pregnant women, were to assess relative validity of the SNaQ tool for analyzing dietary intake compared with nutrient analysis software, to describe the nutritional intake adequacy of pregnant participants, and to assess acceptability of dietary feedback via smartphone. Methods Eligible women used a smartphone app to record everything they consumed over 3 nonconsecutive days. Records consisted of an image of the food or drink item placed next to a fiducial marker, with a voice or text description, or both, providing additional detail. We used the SNaQ tool to analyze participants’ intake of daily food group servings and selected key micronutrients for pregnancy relative to Australian guideline recommendations. A visual reference guide consisting of images of foods and drinks in standard serving sizes assisted the dietitian with quantification. Feedback on participants’ diets was provided via 2 methods: (1) a short video summary sent to participants’ smartphones, and (2) a follow-up telephone consultation with a dietitian. Agreement between dietary intake assessment using the SNaQ tool and nutrient analysis software was evaluated using Spearman rank correlation and Cohen kappa. Results We enrolled 27 women (median age 28.8 years, 8 Indigenous Australians, 15 primiparas), of whom 25 completed the image-based dietary record. Median intakes of grains, vegetables, fruit, meat, and dairy were below recommendations. Median (interquartile range) intake of energy-dense, nutrient-poor foods was 3.5 (2.4-3.9) servings/day and exceeded recommendations (0-2.5 servings/day). Positive correlations between the SNaQ tool and nutrient analysis software were observed for energy (ρ=.898, P<.001) and all selected micronutrients (iron, calcium, zinc, folate, and iodine, ρ range .510-.955, all P<.05), both with and without vitamin and mineral supplements included in the analysis. Cohen kappa showed moderate to substantial agreement for selected micronutrients when supplements were included (kappa range .488-.803, all P ≤.001) and for calcium, iodine, and zinc when excluded (kappa range .554-.632, all P<.001). A total of 17 women reported changing their diet as a result of the personalized nutrition advice. Conclusions The SNaQ tool demonstrated acceptable validity for assessing adequacy of key pregnancy nutrient intakes and preliminary evidence of utility to support dietitians in providing women with personalized advice to optimize nutrition during pregnancy. PMID:27815234
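The agreement statistics named in this abstract are standard; a minimal Python sketch follows, with invented placeholder intakes (the SNaQ values, reference values, and adequacy labels below are not from the study).

    from scipy.stats import spearmanr
    from sklearn.metrics import cohen_kappa_score

    snaq_iron = [8.2, 11.5, 9.9, 14.0, 7.1]        # mg/day scored with the SNaQ tool (invented)
    software_iron = [7.9, 12.1, 10.4, 13.2, 7.5]   # mg/day from nutrient analysis software (invented)
    rho, p = spearmanr(snaq_iron, software_iron)   # rank correlation for continuous intakes

    snaq_ok = ["yes", "no", "yes", "yes", "no"]    # meets recommendation per SNaQ (invented)
    ref_ok = ["yes", "no", "yes", "no", "no"]      # meets recommendation per software (invented)
    kappa = cohen_kappa_score(snaq_ok, ref_ok)     # chance-corrected categorical agreement
    print(f"Spearman rho={rho:.2f} (p={p:.3f}), Cohen kappa={kappa:.2f}")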
Method and tool for expanding tubular members by electro-hydraulic forming
Golovashchenko, Sergey Fedorovich; Bonnen, John Joseph Francis
2013-10-29
An electro-hydraulic forming tool having one or more electrodes for forming parts with sharp corners. The electrodes may be moved and sequentially discharged several times to form various areas of the tube. Alternatively, a plurality of electrodes may be provided within an insulating tube that defines a charge area opening. The insulating tube is moved to locate the charge area opening adjacent one of the electrodes to form spaced locations on a preform. In other embodiments, a filament wire is provided in a cartridge or supported by an insulative support.
Ergonomic glovebox workspace layout tool and associated method of use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roddy, Shannon Howard
The present invention provides an elongate tool that aids in the placement of objects and machinery within a glovebox, such that the objects and machinery can be safely handled by a user. The tool includes a plurality of visual markings (in English units, metric units, other units, grooves, ridges, varying widths, etc.) that indicate distance from the user within the glovebox, optionally broken into placement preference zones that are color coded, grayscale coded, or the like.
Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith
2015-01-01
Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated by a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and fuzzy COPRAS as a precise and effective multi-attribute decision-making tool for evaluating machine tools in an uncertain environment. PMID:26368541
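To make the ranking step concrete, here is a crisp (non-fuzzy) Python sketch of COPRAS; the decision matrix, weights, and benefit/cost split are invented, and the paper's fuzzy AHP weighting is replaced here by fixed weights.

    import numpy as np

    X = np.array([[3500.0, 0.8, 12.0],   # machine tool A: cost, accuracy, capacity (invented)
                  [4200.0, 0.9, 15.0],   # machine tool B
                  [3900.0, 0.7, 14.0]])  # machine tool C
    w = np.array([0.5, 0.3, 0.2])        # attribute weights, e.g. from fuzzy AHP (assumed)
    benefit = np.array([False, True, True])   # first attribute is a cost, the rest benefits

    D = w * X / X.sum(axis=0)                 # sum-normalized, weighted decision matrix
    S_plus = D[:, benefit].sum(axis=1)        # aggregate of benefit attributes
    S_minus = D[:, ~benefit].sum(axis=1)      # aggregate of cost attributes
    Q = S_plus + S_minus.sum() / (S_minus * (1.0 / S_minus).sum())  # relative significance
    utility = 100 * Q / Q.max()               # closeness to the best alternative, in percent
    print("ranking, best first:", np.argsort(-utility))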
Mapping healthcare systems: a policy relevant analytic tool
Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L.V.
2017-01-01
Background In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool, the University of California, San Francisco mapping tool (the Tool), which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Methods Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories; and the relationship between these entities. Results We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. Conclusions As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare system structural designs, using a common framework that fosters multi-country comparative analyses. PMID:28541518
2013-01-01
Background Tools to support clinical or patient decision-making in the treatment/management of a health condition are used in a range of clinical settings for numerous preference-sensitive healthcare decisions. Their impact in clinical practice is largely dependent on their quality across a range of domains. We critically analysed currently available tools to support decision making or patient understanding in the treatment of acute ischaemic stroke with intravenous thrombolysis, as an exemplar to provide clinicians/researchers with practical guidance on development, evaluation and implementation of such tools for other preference-sensitive treatment options/decisions in different clinical contexts. Methods Tools were identified from bibliographic databases, Internet searches and a survey of UK and North American stroke networks. Two reviewers critically analysed tools to establish: information on benefits/risks of thrombolysis included in tools, and the methods used to convey probabilistic information (verbal descriptors, numerical and graphical); adherence to guidance on presenting outcome probabilities (IPDASi probabilities items) and information content (Picker Institute Checklist); readability (Fog Index); and the extent that tools had comprehensive development processes. Results Nine tools of 26 identified included information on a full range of benefits/risks of thrombolysis. Verbal descriptors, frequencies and percentages were used to convey probabilistic information in 20, 19 and 18 tools respectively, whilst nine used graphical methods. Shortcomings in presentation of outcome probabilities (e.g. omitting outcomes without treatment) were identified. Patient information tools had an aggregate median Fog index score of 10. None of the tools had comprehensive development processes. Conclusions Tools to support decision making or patient understanding in the treatment of acute stroke with thrombolysis have been sub-optimally developed. Development of tools should utilise mixed methods and strategies to meaningfully involve clinicians, patients and their relatives in an iterative design process; include evidence-based methods to augment interpretability of textual and probabilistic information (e.g. graphical displays showing natural frequencies) on the full range of outcome states associated with available options; and address patients with different levels of health literacy. Implementation of tools will be enhanced when mechanisms are in place to periodically assess the relevance of tools and where necessary, update the mode of delivery, form and information content. PMID:23777368
NASA Astrophysics Data System (ADS)
Ee, K. C.; Dillon, O. W.; Jawahir, I. S.
2004-06-01
This paper discusses the influence of major chip-groove parameters of a cutting tool on the chip formation process in orthogonal machining using finite element (FE) methods. In the FE formulation, a thermal elastic-viscoplastic material model is used together with a modified Johnson-Cook material law for the flow stress. The chip back-flow angle and the chip up-curl radius are calculated for a range of cutting conditions by varying the chip-groove parameters. The analysis provides greater understanding of the effectiveness of chip-groove configurations and points a way to correlate cutting conditions with tool-wear when machining with a grooved cutting tool.
Microstructure-Property-Design Relationships in the Simulation Era: An Introduction (PREPRINT)
2010-01-01
... microstructure-sensitive design tools for single-crystal turbine blades provides an accessible glimpse into future computational tools and their data requirements. Subject terms: single-crystal turbine blades, computational methods, integrated computational materials.
This document provides guidance for Logistics, Multi-modal, and Shippers on how to use outside data collection systems to populate the carrier data and activity sections of the SmartWay tools using an automated method. (EPA publication # EPA-420-B-16-057a)
Making Quality Sense: A Guide to Quality, Tools and Techniques, Awards and the Thinking Behind Them.
ERIC Educational Resources Information Center
Owen, Jane
This document is intended to guide further education colleges and work-based learning providers through some of the commonly used tools, techniques, and theories of quality management. The following are among the topics discussed: (1) various ways of defining quality; methods used by organizations to achieve quality (quality control, quality…
The Instructional Instrument SL-EDGE Student Library-Educational DiGital Environment.
ERIC Educational Resources Information Center
Kyriakopoulou, Antonia; Kalamboukis, Theodore
An educational digital environment that will provide appropriate methods and techniques for the support and enhancement of the educational and learning process is a valuable tool for both educators and learners. In the context of such a mission, the educational tool SL-EDGE (Student Library-Educational DiGital Environment) has been developed. The…
Student Perception of Social Media as a Course Tool
ERIC Educational Resources Information Center
McCarthy, Richard V.; McCarthy, Mary M.
2014-01-01
If a technology provides features that are useful then it will have a positive impact on performance. Social media has morphed into one of the preferred methods of communication for many people; much has been written to proclaim its benefits including its usefulness as a tool to help students achieve success within the classroom. But is it…
ERIC Educational Resources Information Center
Cann, Cynthia W.; Brumagim, Alan L.
2008-01-01
The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…
Cross-Cultural Education in U.S. Medical Schools: Development of an Assessment Tool.
ERIC Educational Resources Information Center
Dolhun, Eduardo Pena; Munoz, Claudia; Grumbach, Kevin
2003-01-01
Medical schools were invited to provide written and Web-based materials related to implementing cross-cultural competency in their curricula. A tool was developed to measure teaching methods, skill sets, and eight content areas in cross-cultural education. Most programs emphasized teaching general themes, such as the doctor-patient relationship,…
Using the Model United Nations as a Teaching Tool.
ERIC Educational Resources Information Center
Efird, L. Julian
This document provides a description of the Model United Nations (MUN) program, its educational benefits, an overview of its practice within the United States, and outlines methods for using the MUN as a teaching tool. A total of 72 MUNs involving high school and college students was reported in 1977-78. As a simulation, the MUN provides…
ERIC Educational Resources Information Center
Mirzaei, Maryam Sadat; Meshgi, Kourosh; Akita, Yuya; Kawahara, Tatsuya
2017-01-01
This paper introduces a novel captioning method, partial and synchronized captioning (PSC), as a tool for developing second language (L2) listening skills. Unlike conventional full captioning, which provides the full text and allows comprehension of the material merely by reading, PSC promotes listening to the speech by presenting a selected…
A multi-center study benchmarks software tools for label-free proteome quantification
Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan
2016-01-01
The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
Mass spectrometry as a quantitative tool in plant metabolomics
Jorge, Tiago F.; Mata, Ana T.
2016-01-01
Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967
Simplified, inverse, ejector design tool
NASA Technical Reports Server (NTRS)
Dechant, Lawrence J.
1993-01-01
A simple, lumped-parameter-based inverse design tool has been developed which provides flow path geometry and entrainment estimates subject to operational, acoustic, and design constraints. These constraints are manifested through specification of primary mass flow rate or ejector thrust, fully-mixed exit velocity, and static pressure matching. Fundamentally, integral forms of the conservation equations coupled with the specified design constraints are combined to yield an easily invertible linear system in terms of the flow path cross-sectional areas. Entrainment is computed by back substitution. Initial comparisons with experimental data and analogous one-dimensional methods show good agreement. Thus, this simple inverse design code provides an analytically based, preliminary design tool with direct application to High Speed Civil Transport (HSCT) design studies.
Using microwave Doppler radar in automated manufacturing applications
NASA Astrophysics Data System (ADS)
Smith, Gregory C.
Since the beginning of the Industrial Revolution, manufacturers worldwide have used automation to improve productivity, gain market share, and meet growing or changing consumer demand for manufactured products. To stimulate further industrial productivity, manufacturers need more advanced automation technologies: "smart" part handling systems, automated assembly machines, CNC machine tools, and industrial robots that use new sensor technologies, advanced control systems, and intelligent decision-making algorithms to "see," "hear," "feel," and "think" at the levels needed to handle complex manufacturing tasks without human intervention. The investigator's dissertation offers three methods that could help make "smart" CNC machine tools and industrial robots possible: (1) A method for detecting acoustic emission using a microwave Doppler radar detector, (2) A method for detecting tool wear on a CNC lathe using a Doppler radar detector, and (3) An online non-contact method for detecting industrial robot position errors using a microwave Doppler radar motion detector. The dissertation studies indicate that microwave Doppler radar could be quite useful in automated manufacturing applications. In particular, the methods developed may help solve two difficult problems that hinder further progress in automating manufacturing processes: (1) Automating metal-cutting operations on CNC machine tools by providing a reliable non-contact method for detecting tool wear, and (2) Fully automating robotic manufacturing tasks by providing a reliable low-cost non-contact method for detecting on-line position errors. In addition, the studies offer a general non-contact method for detecting acoustic emission that may be useful in many other manufacturing and non-manufacturing areas, as well (e.g., monitoring and nondestructively testing structures, materials, manufacturing processes, and devices). By advancing the state of the art in manufacturing automation, the studies may help stimulate future growth in industrial productivity, which also promises to fuel economic growth and promote economic stability. The study also benefits the Department of Industrial Technology at Iowa State University and the field of Industrial Technology by contributing to the ongoing "smart" machine research program within the Department of Industrial Technology and by stimulating research into new sensor technologies within the University and within the field of Industrial Technology.
ERIC Educational Resources Information Center
Roe, Katie; McConney, Andrew; Mansfield, Caroline F.
2014-01-01
Modern zoos utilise a variety of education tools for communicating with visitors. Previous research has discussed the benefits of providing multiple education communications, yet little research provides an indication of what communications are being employed within zoos today. This research is a two-phased, mixed-methods investigation into the…
Propulsion Diagnostic Method Evaluation Strategy (ProDiMES) User's Guide
NASA Technical Reports Server (NTRS)
Simon, Donald L.
2010-01-01
This report is a User's Guide for the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES). ProDiMES is a standard benchmarking problem and a set of evaluation metrics to enable the comparison of candidate aircraft engine gas path diagnostic methods. This Matlab (The MathWorks, Inc.) based software tool enables users to independently develop and evaluate diagnostic methods. Additionally, a set of blind test case data is distributed as part of the software, enabling the side-by-side comparison of diagnostic approaches developed by multiple users. The User's Guide describes the various components of ProDiMES and provides instructions for the installation and operation of the tool.
Probability of Detection (POD) as a statistical model for the validation of qualitative methods.
Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T
2011-01-01
A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
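To illustrate the shape of such a model, here is a minimal Python sketch that fits a POD curve as a logistic function of log concentration; the functional form, spike levels, and counts are illustrative assumptions rather than the paper's exact model or data.

    import numpy as np
    from scipy.optimize import curve_fit

    conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])   # spike levels, e.g. CFU/g (invented)
    positives = np.array([2, 7, 14, 19, 20])     # positive results out of n replicates (invented)
    n = 20

    def pod(c, mu, sigma):
        # POD modeled as a logistic function of log10(concentration)
        return 1.0 / (1.0 + np.exp(-(np.log10(c) - mu) / sigma))

    (mu, sigma), _ = curve_fit(pod, conc, positives / n, p0=[0.0, 0.3])
    print(f"concentration at POD=0.5: ~{10 ** mu:.2f}")

Treating the response rate as a continuous curve over concentration is what lets qualitative methods be compared with the same machinery used for quantitative ones.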
Machine Learning: A Crucial Tool for Sensor Design
Zhao, Weixiang; Bhushan, Abhinav; Santamaria, Anthony D.; Simon, Melinda G.; Davis, Cristina E.
2009-01-01
Sensors have been widely used for disease diagnosis, environmental quality monitoring, food quality control, industrial process analysis and control, and other related fields. As a key tool for sensor data analysis, machine learning is becoming a core part of novel sensor design. Dividing a complete machine learning process into three steps: data pre-treatment, feature extraction and dimension reduction, and system modeling, this paper provides a review of the methods that are widely used for each step. For each method, the principles and the key issues that affect modeling results are discussed. After reviewing the potential problems in machine learning processes, this paper gives a summary of current algorithms in this field and provides some feasible directions for future studies. PMID:20191110
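The three steps named in this abstract map naturally onto a scikit-learn pipeline; the sketch below is a generic stand-in with simulated sensor data, not the review's own algorithms.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 200))      # 120 readings from a 200-channel sensor (simulated)
    y = rng.integers(0, 2, size=120)     # two analyte classes (simulated)

    model = make_pipeline(StandardScaler(),       # step 1: data pre-treatment
                          PCA(n_components=10),   # step 2: dimension reduction
                          SVC(kernel="rbf"))      # step 3: system modeling
    # with random labels this prints chance-level accuracy; real sensor
    # data with class structure would score higher
    print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())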
New Techniques for the Generation and Analysis of Tailored Microbial Systems on Surfaces.
Furst, Ariel L; Smith, Matthew J; Francis, Matthew B
2018-05-17
The interactions between microbes and surfaces provide critically important cues that control the behavior and growth of the cells. As our understanding of complex microbial communities improves, there is a growing need for experimental tools that can establish and control the spatial arrangements of these cells in a range of contexts. Recent improvements in methods to attach bacteria and yeast to nonbiological substrates, combined with an expanding set of techniques available to study these cells, position this field for many new discoveries. Improving methods for controlling the immobilization of bacteria provides powerful experimental tools for testing hypotheses regarding microbiome interactions, studying the transfer of nutrients between bacterial species, and developing microbial communities for green energy production and pollution remediation.
RATT: Rapid Annotation Transfer Tool
Otto, Thomas D.; Dillon, Gary P.; Degrave, Wim S.; Berriman, Matthew
2011-01-01
Second-generation sequencing technologies have made large-scale sequencing projects commonplace. However, making use of these datasets often requires gene function to be ascribed genome wide. Although tool development has kept pace with the changes in sequence production, for tasks such as mapping, de novo assembly or visualization, genome annotation remains a challenge. We have developed a method to rapidly provide accurate annotation for new genomes using previously annotated genomes as a reference. The method, implemented in a tool called RATT (Rapid Annotation Transfer Tool), transfers annotations from a high-quality reference to a new genome on the basis of conserved synteny. We demonstrate that a Mycobacterium tuberculosis genome or a single 2.5 Mb chromosome from a malaria parasite can be annotated in less than five minutes with only modest computational resources. RATT is available at http://ratt.sourceforge.net. PMID:21306991
SU-E-E-02: An Excel-Based Study Tool for ABR-Style Exams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cline, K; Stanley, D; Defoor, D
2015-06-15
Purpose: As the landscape of learning and testing shifts toward a computer-based environment, a replacement for paper-based methods of studying is desirable. Using Microsoft Excel, a study tool was developed that allows the user to populate multiple-choice questions and then generate an interactive quiz session to answer them. Methods: The code for the tool was written using Microsoft Excel Visual Basic for Applications with the intent that this tool could be implemented by any institution with Excel. The base tool is a template with a setup macro, which builds out the structure based on the user's input. Once the framework is built, the user can input sets of multiple-choice questions, answer choices, and even add figures. The tool can be run in random-question or sequential-question mode for single or multiple courses of study. The interactive session allows the user to select answer choices, and immediate feedback is provided. Once the user is finished studying, the tool records the day's progress by reporting progress statistics useful for trending. Results: Six doctoral students at UTHSCSA have used this tool for the past two months to study for their qualifying exam, which is similar in format and content to the American Board of Radiology (ABR) Therapeutic Part II exam. The students collaborated to create a repository of questions, met weekly to go over these questions, and then used the tool to prepare for their exam. Conclusion: The study tool has provided an effective and efficient way for students to collaborate and be held accountable for exam preparation. The ease of use and familiarity of Excel are important factors for the tool's use. There are software packages to create similar question banks, but this study tool has no additional cost for those that already have Excel. The study tool will be made openly available.
Laing, Karen; Baumgartner, Katherine
2005-01-01
Many endoscopy units are looking for ways to improve their efficiency without increasing the number of staff, purchasing additional equipment, or making patients feel as if they have been rushed through the care process. To accomplish this, a few hospitals have looked to other industries for help. Recently, "lean" methods and tools from the manufacturing industry have been applied successfully in health care systems and have proven to be an effective way to eliminate waste and redundancy in workplace processes. In service organizations, "lean" methods and tools focus on providing the most efficient and effective flow of services and products. This article describes the journey of one endoscopy department within a community hospital to illustrate the application of "lean" methods and tools and their results.
[Mixed methods research in public health: issues and illustration].
Guével, Marie-Renée; Pommier, Jeanine
2012-01-01
For many years, researchers in a range of fields have combined quantitative and qualitative methods. However, the combined use of quantitative and qualitative methods has only recently been conceptualized and defined as mixed methods research. Some authors have described the emerging field as a third methodological tradition (in addition to the qualitative and quantitative traditions). Mixed methods research combines different perspectives and facilitates the study of complex interventions or programs, particularly in public health, an area where interdisciplinarity is critical. However, the existing literature is primarily in English. By contrast, the literature in French remains limited. The purpose of this paper is to present the emergence of mixed methods research for francophone public health specialists. A literature review was conducted to identify the main characteristics of mixed methods research. The results provide an overall picture of the mixed methods approach through its history, definitions, and applications, and highlight the tools developed to clarify the approach (typologies) and to implement it (integration of results and quality standards). The tools highlighted in the literature review are illustrated by a study conducted in France. Mixed methods research opens new possibilities for examining complex research questions and provides relevant and promising opportunities for addressing current public health issues in France.
Gill, Ashlinder; Khan, Anum Irfan; Hans, Parminder Kaur; Kuluski, Kerry; Cott, Cheryl
2016-01-01
Background People experiencing complex chronic disease and disability (CCDD) face some of the greatest challenges of any patient population. Primary care providers find it difficult to manage multiple discordant conditions and symptoms and often complex social challenges experienced by these patients. The electronic Patient Reported Outcome (ePRO) tool is designed to overcome some of these challenges by supporting goal-oriented primary care delivery. Using the tool, patients and providers collaboratively develop health care goals on a portal linked to a mobile device to help patients and providers track progress between visits. Objectives This study tested the usability and feasibility of adopting the ePRO tool into a single interdisciplinary primary health care practice in Toronto, Canada. The Fit between Individuals, Task, and Technology (FITT) framework was used to guide our assessment and explore whether the ePRO tool is: (1) feasible for adoption in interdisciplinary primary health care practices and (2) usable from both the patient and provider perspectives. This usability pilot is part of a broader user-centered design development strategy. Methods A 4-week pilot study was conducted in which patients and providers used the ePRO tool to develop health-related goals, which patients then monitored using a mobile device. Patients and providers collaboratively set goals using the system during an initial visit and had at least 1 follow-up visit at the end of the pilot to discuss progress. Focus groups and interviews were conducted with patients and providers to capture usability and feasibility measures. Data from the ePRO system were extracted to provide information regarding tool usage. Results Six providers and 11 patients participated in the study; 3 patients dropped out mainly owing to health issues. The remaining 8 patients completed 210 monitoring protocols, equal to over 1300 questions, with patients often answering questions daily. Providers and patients accessed the portal on an average of 10 and 1.5 times, respectively. Users found the system easy to use, with some patients reporting that the tool helped in their ability to self-manage, catalyzed a sense of responsibility over their care, and improved patient-centered care delivery. Some providers found that the tool helped focus conversations on goal setting. However, the tool did not fit well with provider workflows, monitoring questions were not adequately tailored to individual patient needs, and daily reporting became tedious and time-consuming for patients. Conclusions Although our study suggests relatively low usability and feasibility of the ePRO tool, we are encouraged by the early impact on patient outcomes and generally positive responses from both user groups regarding the potential of the tool to improve care for patients with CCDD. As is consistent with our user-centered design development approach, we have modified the tool based on user feedback, and are now testing the redeveloped tool through an exploratory trial. PMID:27256035
Integrating reliability and maintainability into a concurrent engineering environment
NASA Astrophysics Data System (ADS)
Phillips, Clifton B.; Peterson, Robert R.
1993-02-01
This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry. Private industry considered the study important and provided the university access to innovative tools under a cooperative agreement. The current capability of reliability and maintainability tools and how they fit into the design process is investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method is provided for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life-cycle performance improvement information to engineers and their managers before systems are deployed, and allows them to monitor and track performance while a system is in operation.
Use of omics methods to determine the mode of action of natural phytotoxins
USDA-ARS?s Scientific Manuscript database
Technology has greatly increased the power of omics methods to profile transcription, protein, and metabolite responses to phytotoxins. These methods hold promise as a tool for providing clues to the modes of action of such compounds. However, to date, only two putative modes of action have been found…
Flowgen: Flowchart-based documentation for C++ codes
NASA Astrophysics Data System (ADS)
Kosower, David A.; Lopez-Villarejo, J. J.
2015-11-01
We present the Flowgen tool, which generates flowcharts from annotated C++ source code. The tool generates a set of interconnected high-level UML activity diagrams, one for each function or method in the C++ sources. It provides a simple and visual overview of complex implementations of numerical algorithms. Flowgen is complementary to the widely-used Doxygen documentation tool. The ultimate aim is to render complex C++ computer codes accessible, and to enhance collaboration between programmers and algorithm or science specialists. We describe the tool and a proof-of-concept application to the VINCIA plug-in for simulating collisions at CERN's Large Hadron Collider.
Uncertainty visualisation in the Model Web
NASA Astrophysics Data System (ADS)
Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.
2012-04-01
Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only a few tools exist for visualising data in a standardised way. Furthermore, they are usually realised as thick clients and lack functionality for handling data coming from web services, as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations&Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic, quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps allow a simpler visualisation, separating value and uncertainty maps, for non-experts and a first overview. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistics plots and time series in different windows, with sliders to interactively move through time, space and uncertainty (thresholds).
NASA Astrophysics Data System (ADS)
Paquette, Mark S.
New tools are often required to facilitate new discoveries and test new methods. Commercial offerings can be prohibitively expensive and difficult to customize. Developing ad-hoc tools provides the most flexibility and an opportunity to modify and refine a technology. An embossing system was developed for silk film imprinting and stamping to facilitate, and add versatility to, micro- and nanoscale device manufacturing in biopolymers. This system features temperature-controlled embossing surfaces, adjustable embossing pressures, and variable embossing times. The device can also be fitted with interchangeable temperature-controlled embossing and stamping tools. The design, development, fabrication, applications, and future improvements of the system are explored. This device may facilitate new discoveries in the realm of biopolymer micro- and nanomanufacturing and may provide a path towards high-volume production of silk-film-based technologies.
Rublee, Parke A; Remington, David L; Schaefer, Eric F; Marshall, Michael M
2005-01-01
Molecular methods, including conventional PCR, real-time PCR, denaturing gradient gel electrophoresis, fluorescent fragment detection PCR, and fluorescent in situ hybridization, have all been developed for use in identifying and studying the distribution of the toxic dinoflagellates Pfiesteria piscicida and P. shumwayae. Application of the methods has demonstrated a worldwide distribution of both species and provided insight into their environmental tolerance range and temporal changes in distribution. Genetic variability among geographic locations generally appears low in rDNA genes, and detection of the organisms in ballast water is consistent with rapid dispersal or high gene flow among populations, but additional sequence data are needed to verify this hypothesis. The rapid development and application of these tools serve as a model for the study of other microbial taxa and provide a basis for future development of tools that can simultaneously detect multiple targets.
Preliminary design methods for fiber reinforced composite structures employing a personal computer
NASA Technical Reports Server (NTRS)
Eastlake, C. N.
1986-01-01
The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
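The core computation described above can be sketched compactly. The following Python snippet applies standard classical lamination theory to estimate an effective in-plane modulus for a symmetric laminate; the ply properties and layup are illustrative assumptions, not values from the report (which implemented the method as interactive subroutines).

```python
# A minimal sketch of the approach: use classical lamination theory to estimate
# an effective in-plane modulus Ex for a symmetric laminate, which can then feed
# familiar isotropic structural formulas. Ply properties below are illustrative
# (roughly carbon/epoxy), not from the report.
import numpy as np

def q_matrix(e1, e2, g12, v12):
    """Reduced stiffness matrix Q for a single orthotropic ply (plane stress)."""
    v21 = v12 * e2 / e1
    d = 1.0 - v12 * v21
    return np.array([[e1 / d, v12 * e2 / d, 0.0],
                     [v12 * e2 / d, e2 / d, 0.0],
                     [0.0, 0.0, g12]])

def qbar(q, theta_deg):
    """Rotate ply stiffness Q into laminate axes (standard transformation)."""
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    q11, q12, q22, q66 = q[0, 0], q[0, 1], q[1, 1], q[2, 2]
    qb = np.empty((3, 3))
    qb[0, 0] = q11*c**4 + 2*(q12 + 2*q66)*s**2*c**2 + q22*s**4
    qb[1, 1] = q11*s**4 + 2*(q12 + 2*q66)*s**2*c**2 + q22*c**4
    qb[0, 1] = qb[1, 0] = (q11 + q22 - 4*q66)*s**2*c**2 + q12*(s**4 + c**4)
    qb[0, 2] = qb[2, 0] = (q11 - q12 - 2*q66)*s*c**3 + (q12 - q22 + 2*q66)*s**3*c
    qb[1, 2] = qb[2, 1] = (q11 - q12 - 2*q66)*s**3*c + (q12 - q22 + 2*q66)*s*c**3
    qb[2, 2] = (q11 + q22 - 2*q12 - 2*q66)*s**2*c**2 + q66*(s**4 + c**4)
    return qb

# Symmetric [0/45/-45/90]s layup, 0.125 mm plies; moduli in Pa.
plies = [0, 45, -45, 90, 90, -45, 45, 0]
t_ply = 0.125e-3
q = q_matrix(e1=140e9, e2=10e9, g12=5e9, v12=0.3)
A = sum(qbar(q, th) * t_ply for th in plies)  # in-plane stiffness matrix
h = t_ply * len(plies)
a = np.linalg.inv(A)
ex_eff = 1.0 / (h * a[0, 0])                  # effective laminate modulus
print(f"Effective Ex = {ex_eff / 1e9:.1f} GPa")
```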
Sansom, P; Copley, V R; Naik, F C; Leach, S; Hall, I M
2013-01-01
Statistical methods used in spatio-temporal surveillance of disease are able to identify abnormal clusters of cases but typically do not provide a measure of the degree of association between one case and another. Such a measure would facilitate the assignment of cases to common groups and be useful in outbreak investigations of diseases that potentially share the same source. This paper presents a model-based approach, which on the basis of available location data, provides a measure of the strength of association between cases in space and time and which is used to designate and visualise the most likely groupings of cases. The method was developed as a prospective surveillance tool to signal potential outbreaks, but it may also be used to explore groupings of cases in outbreak investigations. We demonstrate the method by using a historical case series of Legionnaires’ disease amongst residents of England and Wales. PMID:23483594
Inferring subunit stoichiometry from single molecule photobleaching
2013-01-01
Single molecule photobleaching is a powerful tool for determining the stoichiometry of protein complexes. By attaching fluorophores to proteins of interest, the number of associated subunits in a complex can be deduced by imaging single molecules and counting fluorophore photobleaching steps. Because some bleaching steps might be unobserved, the ensemble of steps will be binomially distributed. In this work, it is shown that inferring the true composition of a complex from such data is nontrivial because binomially distributed observations present an ill-posed inference problem. That is, a unique and optimal estimate of the relevant parameters cannot be extracted from the observations. Because of this, a method has not been firmly established to quantify confidence when using this technique. This paper presents a general inference model for interpreting such data and provides methods for accurately estimating parameter confidence. The formalization and methods presented here provide a rigorous analytical basis for this pervasive experimental tool. PMID:23712552
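The ill-posedness the paper describes is easy to reproduce. The sketch below (a simplified stand-in for the paper's inference model) evaluates the likelihood of observed step counts k_i ~ Binomial(n, p) over a grid, showing that several (n, p) combinations fit almost equally well.

```python
# Illustrative sketch (not the paper's full model): the likelihood for observed
# bleaching-step counts k_i ~ Binomial(n, p), with both the true subunit number
# n and the detection probability p unknown. The near-flat ridge across (n, p)
# pairs with similar n*p is what makes the inference ill-posed.
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(1)
true_n, true_p = 4, 0.8
counts = rng.binomial(true_n, true_p, size=30)  # observed steps per molecule

p_grid = np.linspace(0.4, 0.99, 60)
for n in range(counts.max(), 9):
    loglik = np.array([binom.logpmf(counts, n, p).sum() for p in p_grid])
    print(f"n = {n}: max log-likelihood = {loglik.max():.1f} "
          f"at p = {p_grid[loglik.argmax()]:.2f}")
# Several (n, p) pairs achieve nearly the same maximum, so n is not identified
# without extra information (e.g. an independent estimate of p).
```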
Some Innovative Methods to Improve Profiles Derivation
ERIC Educational Resources Information Center
Pei, Lai Kwan
2008-01-01
As the government aims to provide appropriate education to all children (No Child Left Behind Act), it is important that education providers can assess student performance correctly so that they can provide appropriate instruction. Profile analysis is a very useful tool to interpret test scores and measure…
Gilliam, Meredith; Krein, Sarah L; Belanger, Karen; Fowler, Karen E; Dimcheff, Derek E; Solomon, Gabriel
2017-01-01
Background: Incomplete or delayed access to discharge information by outpatient providers and patients contributes to discontinuity of care and poor outcomes. Objective: To evaluate the effect of a new electronic discharge summary tool on the timeliness of documentation and communication with outpatient providers. Methods: In June 2012, we implemented an electronic discharge summary tool at our 145-bed university-affiliated Veterans Affairs hospital. The tool facilitates completion of a comprehensive discharge summary note that is available for patients and outpatient medical providers at the time of hospital discharge. Discharge summary note availability, outpatient provider satisfaction, and time between the decision to discharge a patient and discharge note completion were all evaluated before and after implementation of the tool. Results: The percentage of discharge summary notes completed by the time of first post-discharge clinical contact improved from 43% in February 2012 to 100% in September 2012 and was maintained at 100% in 2014. A survey of 22 outpatient providers showed that 90% preferred the new summary and 86% found it comprehensive. Despite increasing required documentation, the time required to discharge a patient, from physician decision to discharge note completion, improved from 5.6 h in 2010 to 4.1 h in 2012 (p = 0.04), and to 2.8 h in 2015 (p < 0.001). Conclusion: The implementation of a novel discharge summary tool improved the timeliness and comprehensiveness of discharge information as needed for the delivery of appropriate, high-quality follow-up care, without adversely affecting the efficiency of the discharge process. PMID:28491308
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
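The advantage of analytic derivatives over finite differencing can be illustrated generically; the sketch below uses SciPy on the Rosenbrock test function rather than Pycycle or OpenMDAO, but the mechanism, supplying an exact gradient to a gradient-based optimizer, is the same.

```python
# A generic sketch of the benefit claimed above (not Pycycle/OpenMDAO code):
# gradient-based optimization with an analytic gradient versus a
# finite-difference approximation, here on the Rosenbrock function.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([-1.2, 1.0])
analytic = minimize(rosen, x0, jac=rosen_der, method="BFGS")
fin_diff = minimize(rosen, x0, jac=None, method="BFGS")  # gradient via finite differences

print("analytic:    f =", analytic.fun, " function evals =", analytic.nfev)
print("finite diff: f =", fin_diff.fun, " function evals =", fin_diff.nfev)
# The analytic-derivative run needs far fewer function evaluations and avoids
# step-size noise, the same advantage Pycycle gains for engine cycle models.
```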
An object-oriented description method of EPMM process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Yang, Fan
2017-06-01
To use mature object-oriented tools and languages in software process modelling, and to bring software process models closer to industrial standards, object-oriented modelling of software processes needs to be studied. Starting from the formal process definition in EPMM, and considering that Petri nets are primarily a formal modelling tool, this paper combines Petri-net modelling with object-oriented modelling ideas and provides an implementation method for converting Petri-net-based EPMM process models into object models expressed in an object-oriented description.
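As a hypothetical illustration of the conversion idea, the sketch below represents Petri-net places and transitions as objects; the class and method names are invented for illustration and are not EPMM's actual notation.

```python
# A hypothetical sketch: Petri-net elements of a process model as objects.
class Place:
    def __init__(self, name, tokens=0):
        self.name, self.tokens = name, tokens

class Transition:
    def __init__(self, name, inputs, outputs):
        self.name, self.inputs, self.outputs = name, inputs, outputs

    def enabled(self):
        return all(p.tokens > 0 for p in self.inputs)

    def fire(self):
        if not self.enabled():
            raise RuntimeError(f"{self.name} is not enabled")
        for p in self.inputs:
            p.tokens -= 1
        for p in self.outputs:
            p.tokens += 1

# A two-step software process: design must finish before coding starts.
ready, designed, coded = Place("ready", tokens=1), Place("designed"), Place("coded")
design = Transition("design", [ready], [designed])
code = Transition("code", [designed], [coded])
design.fire()
code.fire()
print(coded.tokens)  # 1: the process reached its final state
```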
Decision support methods for the environmental assessment of contamination at mining sites.
Jordan, Gyozo; Abdaal, Ahmed
2013-09-01
Polluting mine accidents and widespread environmental contamination associated with historic mining in Europe and elsewhere have triggered the improvement of related environmental legislation and of the environmental assessment and management methods for the mining industry. Mining has some unique features, such as natural background pollution associated with natural mineral deposits, industrial activities and contamination located in the three-dimensional sub-surface space, the problem of long-term remediation after mine closure, and the problem of secondary contaminated areas around mine sites and abandoned mines in historic regions like Europe. These mining-specific problems require special tools to address the complexity of the environmental problems of mining-related contamination. The objective of this paper is to review and evaluate some of the decision support methods that have been developed and applied to mining contamination. Only those methods that are efficient decision support tools and also provide a 'holistic' approach to the complex problem are considered. These tools are (1) landscape ecology, (2) industrial ecology, (3) landscape geochemistry, (4) geo-environmental models, (5) environmental impact assessment, (6) environmental risk assessment, (7) material flow analysis and (8) life cycle assessment. This unique inter-disciplinary study should enable both the researcher and the practitioner to obtain a broad view of the state-of-the-art of decision support methods for the environmental assessment of contamination at mine sites. Documented examples and abundant references are also provided.
miR-MaGiC improves quantification accuracy for small RNA-seq.
Russell, Pamela H; Vestal, Brian; Shi, Wen; Rudra, Pratyaydipta D; Dowell, Robin; Radcliffe, Richard; Saba, Laura; Kechris, Katerina
2018-05-15
Many tools have been developed to profile microRNA (miRNA) expression from small RNA-seq data. These tools must contend with several issues: the small size of miRNAs, the small number of unique miRNAs, the fact that similar miRNAs can be transcribed from multiple loci, and the presence of miRNA isoforms known as isomiRs. Methods failing to address these issues can return misleading information. We propose a novel quantification method designed to address these concerns. We present miR-MaGiC, a novel miRNA quantification method, implemented as a cross-platform tool in Java. miR-MaGiC performs stringent mapping to a core region of each miRNA and defines a meaningful set of target miRNA sequences by collapsing the miRNA space to "functional groups". We hypothesize that these two features, mapping stringency and collapsing, provide more accurate quantification at a more meaningful unit (i.e., the miRNA family). We test miR-MaGiC and several published methods on 210 small RNA-seq libraries, evaluating each method's ability to accurately reflect global miRNA expression profiles. We define accuracy as total counts close to the total number of input reads originating from miRNAs. We find that miR-MaGiC, which incorporates both stringency and collapsing, provides the most accurate counts.
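The collapsing step can be illustrated with a toy example. In the sketch below, the grouping rule (stripping the -5p/-3p arm and the lettered variant suffix) is a simplification assumed for illustration, not miR-MaGiC's actual functional-group definition.

```python
# A hypothetical illustration of "collapsing": per-miRNA counts are summed into
# functional groups so reads mapping ambiguously among near-identical family
# members are still counted once at the family level.
import re
from collections import Counter

def functional_group(mirna_name):
    name = re.sub(r"-(5p|3p)$", "", mirna_name)  # drop arm suffix
    return re.sub(r"[a-z]$", "", name)           # drop letter variant (miR-26a -> miR-26)

per_mirna_counts = {"hsa-miR-26a-5p": 120, "hsa-miR-26b-5p": 45,
                    "hsa-miR-26a-3p": 8, "hsa-let-7a-5p": 300}
group_counts = Counter()
for name, n in per_mirna_counts.items():
    group_counts[functional_group(name)] += n
print(dict(group_counts))  # {'hsa-miR-26': 173, 'hsa-let-7': 300}
```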
MO-F-211-01: Methods for Completing Practice Quality Improvement (PQI).
Johnson, J; Brown, K; Ibbott, G; Pawlicki, T
2012-06-01
Practice Quality Improvement (PQI) is becoming an expected part of routine practice in healthcare as an approach to providing more efficient, effective and high-quality care. Additionally, as part of the ABR's Maintenance of Certification (MOC) pathway, medical physicists are now expected to complete a PQI project. This session will describe the history behind and benefits of the ABR's MOC program and provide details of quality improvement methods and how to successfully complete a PQI project. PQI methods include various commonly used engineering and management tools. The Plan-Do-Study-Act (PDSA) cycle will be presented as one project planning and implementation tool. Other PQI analysis instruments such as flowcharts, Pareto charts, process control charts and fishbone diagrams will also be explained with examples. Cause analysis, solution development and implementation, and post-implementation measurement will be presented. Project identification and definition as well as appropriate measurement tool selection will be offered. Methods to choose key quality metrics (key quality indicators) will also be addressed. Several sample PQI projects and templates available through the AAPM and other organizations will be described. At least three examples of completed PQI projects will be shared. Learning objectives: 1. Identify and define a PQI project; 2. Identify and select measurement methods/techniques for use with the PQI project; 3. Describe example(s) of completed projects. © 2012 American Association of Physicists in Medicine.
Citizen science provides a reliable and scalable tool to track disease-carrying mosquitoes.
Palmer, John R B; Oltra, Aitana; Collantes, Francisco; Delgado, Juan Antonio; Lucientes, Javier; Delacour, Sarah; Bengoa, Mikel; Eritja, Roger; Bartumeus, Frederic
2017-10-24
Recent outbreaks of Zika, chikungunya and dengue highlight the importance of better understanding the spread of disease-carrying mosquitoes across multiple spatio-temporal scales. Traditional surveillance tools are limited by jurisdictional boundaries and cost constraints. Here we show how a scalable citizen science system can solve this problem by combining citizen scientists' observations with expert validation and correcting for sampling effort. Our system provides accurate early warning information about the Asian tiger mosquito (Aedes albopictus) invasion in Spain, well beyond that available from traditional methods, and vital for public health services. It also provides estimates of tiger mosquito risk comparable to those from traditional methods but more directly related to the human-mosquito encounters that are relevant for epidemiological modelling and scalable enough to cover the entire country. These results illustrate how powerful public participation in science can be and suggest citizen science is positioned to revolutionize mosquito-borne disease surveillance worldwide.
InteGO2: a web tool for measuring and visualizing gene semantic similarities using Gene Ontology.
Peng, Jiajie; Li, Hongxiang; Liu, Yongzhuang; Juan, Liran; Jiang, Qinghua; Wang, Yadong; Chen, Jin
2016-08-31
The Gene Ontology (GO) has been used in high-throughput omics research as a major bioinformatics resource. The hierarchical structure of GO provides users a convenient platform for biological information abstraction and hypothesis testing. Computational methods have been developed to identify functionally similar genes; however, none of the existing measurements takes into account all the rich information in GO. Web-based applications built on these existing methods compute gene functional similarities but provide only text-based output, and without a graphical visualization interface, interpreting the results is difficult. We present InteGO2, a web tool that allows researchers to calculate GO-based gene semantic similarities using seven widely used GO-based similarity measurements. We also provide an integrative measurement that synergistically combines all the individual measurements to improve overall performance. Using HTML5 and cytoscape.js, InteGO2 provides a graphical interface to visualize the resulting gene functional association networks. InteGO2 is an easy-to-use HTML5-based web tool. With it, researchers can conveniently measure gene or gene product functional similarity and visualize the network of functional interactions in a graphical interface. InteGO2 can be accessed via http://mlg.hit.edu.cn:8089/ .
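As a minimal stand-in for the idea of GO-based functional similarity (not one of InteGO2's seven measurements), the sketch below scores two genes by the Jaccard overlap of their GO annotation sets; the annotations are invented.

```python
# A toy GO-based gene functional similarity: Jaccard overlap of annotation sets.
def go_similarity(terms_a, terms_b):
    a, b = set(terms_a), set(terms_b)
    return len(a & b) / len(a | b) if a | b else 0.0

gene_a = ["GO:0006915", "GO:0008219", "GO:0042981"]  # made-up apoptosis-related terms
gene_b = ["GO:0006915", "GO:0042981", "GO:0016049"]
print(f"similarity = {go_similarity(gene_a, gene_b):.2f}")  # 0.50
```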
A direct push resistivity method was evaluated as a complementary screening tool to provide rapid in-situ contaminant detection to aid in better defining locations for drilling, sampling, and monitoring well installation at hazardous waste sites. Nine continuous direct push resi...
Subsurface Exploration Methods for Soft Ground Rapid Transit Tunnels : Volume 2. Appendixes A-F.
DOT National Transportation Integrated Search
1976-04-01
This study assesses subsurface exploration methods with respect to their ability to provide adequate data for the construction of rapid transit, soft-ground bored and cut-and-cover tunnels. Geophysical and other exploration tools not now widely used ...
From mobile ADCP to high-resolution SSC: a cross-section calibration tool
Boldt, Justin A.
2015-01-01
Sediment is a major cause of stream impairment, and improved sediment monitoring is a crucial need. Point samples of suspended-sediment concentration (SSC) are often not enough to answer critical questions in a changing environment. As technology has improved, there now exists the opportunity to obtain discrete measurements of SSC and flux at a spatial scale unmatched by any other device. Acoustic instruments are ubiquitous in the U.S. Geological Survey (USGS) for making streamflow measurements, and when calibrated with physical sediment samples they may be used for sediment measurements as well. The acoustic backscatter measured by an acoustic Doppler current profiler (ADCP) has long been known to correlate well with suspended sediment, but until recently its use has been mainly qualitative. This new method using acoustic surrogates has great potential to leverage routine data collection to provide calibrated, quantitative measures of SSC, which hold promise to be more accurate, complete, and cost-efficient than other methods. This extended abstract presents a method for measuring high spatial and temporal resolution SSC using a down-looking, mobile ADCP at discrete cross-sections. The high resolution of the sediment data is a primary advantage and a vast improvement over other discrete methods for measuring SSC. Although acoustic surrogate technology using continuous, fixed-deployment (side-looking) ADCPs is proven, the same methods cannot be used with down-looking ADCPs because vertical variation in SSC and particle-size distribution violates theory and complicates assumptions. A software tool was developed to assist in using acoustic backscatter from a down-looking, mobile ADCP as a surrogate for SSC. This tool has a simple graphical user interface that loads the data, assists in the calibration procedure, and provides data visualization and output options. This tool is designed to improve ongoing efforts to monitor and predict resource responses to a changing environment. Because ADCPs are used routinely for streamflow measurements, using acoustic backscatter from ADCPs as a surrogate for SSC has the potential to revolutionize sediment measurements by providing rapid measurements of sediment flux and distribution at spatial and temporal scales far beyond the capabilities of traditional physical samplers.
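A simplified sketch of the calibration step follows: fit a log-linear relation between backscatter and SSC from paired physical samples, then predict SSC from new backscatter values. The numbers are synthetic, and the USGS tool additionally handles beam geometry and attenuation corrections that this sketch omits.

```python
# Simplified backscatter-to-SSC calibration with synthetic paired data.
import numpy as np

backscatter_db = np.array([62.0, 68.0, 74.0, 80.0, 86.0])   # from ADCP ensembles
ssc_mg_per_l = np.array([18.0, 45.0, 110.0, 260.0, 640.0])  # paired physical samples

slope, intercept = np.polyfit(backscatter_db, np.log10(ssc_mg_per_l), 1)
print(f"log10(SSC) = {slope:.3f} * dB + {intercept:.2f}")

new_db = np.array([70.0, 77.0])
print("predicted SSC (mg/L):", 10 ** (slope * new_db + intercept))
```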
Adequacy of surface analytical tools for studying the tribology of ceramics
NASA Technical Reports Server (NTRS)
Sliney, H. E.
1986-01-01
Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less known techniques that may also prove useful.
Multivariate Density Estimation and Remote Sensing
NASA Technical Reports Server (NTRS)
Scott, D. W.
1983-01-01
Current efforts to develop methods and computer algorithms to effectively represent multivariate data commonly encountered in remote sensing applications are described. While this may involve scatter diagrams, multivariate representations of nonparametric probability density estimates are emphasized. The density function provides a useful graphical tool for looking at data and a useful theoretical tool for classification. The approach is illustrated by a thunderstorm data analysis.
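A modern minimal version of the core idea, a nonparametric (kernel) density estimate of multivariate data, can be written in a few lines; this is an illustration, not the authors' algorithm.

```python
# A small sketch of a nonparametric density estimate of bivariate data, the
# kind of object used here as a graphical and classification tool.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
data = np.vstack([rng.normal(0, 1, 500), rng.normal(0, 2, 500)])  # shape (2, 500)
kde = gaussian_kde(data)

grid = np.vstack([np.zeros(3), np.array([-2.0, 0.0, 2.0])])
print(kde(grid))  # estimated density at three points along one axis
```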
ERIC Educational Resources Information Center
Ortega, Ryan A.; Brame, Cynthia J.
2015-01-01
Concept mapping was developed as a method of displaying and organizing hierarchical knowledge structures. Using the new, multidimensional presentation software Prezi, we have developed a new teaching technique designed to engage higher-level skills in the cognitive domain. This tool, synthesis mapping, is a natural evolution of concept mapping,…
The Business Change Initiative: A Novel Approach to Improved Cost and Schedule Management
NASA Technical Reports Server (NTRS)
Shinn, Stephen A.; Bryson, Jonathan; Klein, Gerald; Lunz-Ruark, Val; Majerowicz, Walt; McKeever, J.; Nair, Param
2016-01-01
Goddard Space Flight Center's Flight Projects Directorate employed a Business Change Initiative (BCI) to infuse a series of activities coordinated to drive improved cost and schedule performance across Goddard's missions. This sustaining change framework provides a platform to manage and implement cost and schedule control techniques throughout the project portfolio. The BCI concluded in December 2014, deploying over 100 cost and schedule management changes including best practices, tools, methods, training, and knowledge sharing. The new business approach has driven the portfolio to improved programmatic performance. The last eight launched GSFC missions have optimized cost, schedule, and technical performance on a sustained basis to deliver on time and within budget, returning funds in many cases. While not every future mission will boast such strong performance, improved cost and schedule tools, management practices, and ongoing comprehensive evaluations of program planning and control methods to refine and implement best practices will continue to provide a framework for sustained performance. This paper will describe the tools, techniques, and processes developed during the BCI and the utilization of collaborative content management tools to disseminate project planning and control techniques to ensure continuous collaboration and optimization of cost and schedule management in the future.
2012-01-01
Background: The radiation field on most megavoltage radiation therapy units is indicated by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment, so it is imperative that the light field be congruent with the radiation field. Method: A simple quality assurance tool has been designed for rapid and simple testing of light field and radiation field congruence using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Results: Both the single and double exposure techniques were evaluated, with the double exposure technique providing better visualization of the light-radiation field markers. Light and radiation field congruence could be detected to within 1 mm, satisfying the 2 mm tolerance recommended in American Association of Physicists in Medicine Task Group report number 142. Conclusion: The QA tool can be used with either an EPID or CR to provide a simple and rapid method to verify light and radiation field congruence. PMID:22452821
Biopython: freely available Python tools for computational molecular biology and bioinformatics.
Cock, Peter J A; Antao, Tiago; Chang, Jeffrey T; Chapman, Brad A; Cox, Cymon J; Dalke, Andrew; Friedberg, Iddo; Hamelryck, Thomas; Kauff, Frank; Wilczynski, Bartek; de Hoon, Michiel J L
2009-06-01
The Biopython project is a mature open-source international collaboration of volunteer developers, providing Python libraries for a wide range of bioinformatics problems. Biopython includes modules for reading and writing different sequence file formats and multiple sequence alignments, dealing with 3D macromolecular structures, interacting with common tools such as BLAST, ClustalW and EMBOSS, accessing key online databases, as well as providing numerical methods for statistical learning. Biopython is freely available, with documentation and source code at www.biopython.org, under the Biopython license.
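A small example of the kind of task Biopython handles, reading a FASTA file and writing reverse complements with Bio.SeqIO ("input.fasta" is a placeholder path):

```python
# Read a FASTA file, report each record, and write reverse complements.
from Bio import SeqIO
from Bio.SeqRecord import SeqRecord

records = list(SeqIO.parse("input.fasta", "fasta"))  # placeholder input path
for rec in records:
    print(rec.id, len(rec.seq))

rc_records = [SeqRecord(rec.seq.reverse_complement(), id=rec.id + "_rc", description="")
              for rec in records]
SeqIO.write(rc_records, "output_rc.fasta", "fasta")
```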
DOT National Transportation Integrated Search
1980-06-01
The purpose of this report is to provide the tunneling profession with improved practical tools in the technical or design area, which provide more accurate representations of the ground-structure interaction in tunneling. The design methods range fr...
Discovery through maps: Exploring real-world applications of ecosystem services
Background/Question/Methods U.S. EPA’s EnviroAtlas provides a collection of interactive tools and resources for exploring ecosystem goods and services. The purpose of EnviroAtlas is to provide better access to consistently derived ecosystems and socio-economic data to facil...
THE CAUSAL ANALYSIS / DIAGNOSIS DECISION ...
CADDIS is an on-line decision support system that helps investigators in the regions, states and tribes find, access, organize, use and share information to produce causal evaluations in aquatic systems. It is based on the US EPA's Stressor Identification process which is a formal method for identifying causes of impairments in aquatic systems. CADDIS 2007 increases access to relevant information useful for causal analysis and provides methods and tools that practitioners can use to analyze their own data. The new Candidate Cause section provides overviews of commonly encountered causes of impairments to aquatic systems: metals, sediments, nutrients, flow alteration, temperature, ionic strength, and low dissolved oxygen. CADDIS includes new Conceptual Models that illustrate the relationships from sources to stressors to biological effects. An Interactive Conceptual Model for phosphorus links the diagram with supporting literature citations. The new Analyzing Data section helps practitioners analyze their data sets and interpret and use those results as evidence within the USEPA causal assessment process. Downloadable tools include a graphical user interface statistical package (CADStat), and programs for use with the freeware R statistical package, and a Microsoft Excel template. These tools can be used to quantify associations between causes and biological impairments using innovative methods such as species-sensitivity distributions, biological inferenc
Wang, Duolin; Zeng, Shuai; Xu, Chunhui; Qiu, Wangren; Liang, Yanchun; Joshi, Trupti; Xu, Dong
2017-12-15
Computational methods for phosphorylation site prediction play important roles in protein function studies and experimental design. Most existing methods are based on feature extraction, which may result in incomplete or biased features. Deep learning as the cutting-edge machine learning method has the ability to automatically discover complex representations of phosphorylation patterns from the raw sequences, and hence it provides a powerful tool for improvement of phosphorylation site prediction. We present MusiteDeep, the first deep-learning framework for predicting general and kinase-specific phosphorylation sites. MusiteDeep takes raw sequence data as input and uses convolutional neural networks with a novel two-dimensional attention mechanism. It achieves over a 50% relative improvement in the area under the precision-recall curve in general phosphorylation site prediction and obtains competitive results in kinase-specific prediction compared to other well-known tools on the benchmark data. MusiteDeep is provided as an open-source tool available at https://github.com/duolinwang/MusiteDeep. xudong@missouri.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
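To make the general approach concrete, the toy sketch below builds a 1-D convolutional classifier over a one-hot-encoded residue window in PyTorch; it is not MusiteDeep's actual architecture and omits the two-dimensional attention mechanism.

```python
# A toy sketch of CNN-based phosphosite scoring (not MusiteDeep's architecture):
# a 1-D CNN scores a one-hot residue window centred on a candidate site.
import torch
import torch.nn as nn

AA = "ACDEFGHIKLMNPQRSTVWY"  # 20 amino acids

def one_hot(window):
    x = torch.zeros(len(AA), len(window))
    for i, aa in enumerate(window):
        x[AA.index(aa), i] = 1.0
    return x

class SiteCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(len(AA), 32, kernel_size=9, padding=4), nn.ReLU(),
            nn.AdaptiveMaxPool1d(1))
        self.fc = nn.Linear(32, 1)

    def forward(self, x):                 # x: (batch, 20, window_len)
        h = self.conv(x).squeeze(-1)      # (batch, 32)
        return torch.sigmoid(self.fc(h))  # probability of a phosphosite

model = SiteCNN()
window = "MKVLAASTSPQRSGSGSEKPVSAPASTSVRRS".ljust(33, "A")  # toy 33-mer
prob = model(one_hot(window).unsqueeze(0))
print(float(prob))  # untrained, so ~0.5; training would use labelled windows
```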
Extension of a GIS procedure for calculating the RUSLE equation LS factor
NASA Astrophysics Data System (ADS)
Zhang, Hongming; Yang, Qinke; Li, Rui; Liu, Qingrui; Moore, Demie; He, Peng; Ritsema, Coen J.; Geissen, Violette
2013-03-01
The Universal Soil Loss Equation (USLE) and revised USLE (RUSLE) are often used to estimate soil erosion at regional landscape scales; however, a major limitation is the difficulty of extracting the LS factor. The geographic information system-based (GIS-based) methods which have been developed for estimating the LS factor for USLE and RUSLE also have limitations. The unit contributing area-based estimation method (UCA) converts slope length to unit contributing area to account for two-dimensional topography, but it is not able to predict the different zones of soil erosion and deposition. The flowpath and cumulative cell length-based method (FCL) overcomes this disadvantage but does not consider channel networks and flow convergence in two-dimensional topography. The purpose of this research was to overcome these limitations and extend the FCL method through inclusion of channel networks and convergent flow. We developed LS-TOOL in Microsoft's .NET environment using C# with a user-friendly interface. Comparing the LS factor calculated with the three methodologies (UCA, FCL and LS-TOOL), LS-TOOL delivers encouraging results. In particular, LS-TOOL uses breaks in slope identified from the DEM to locate soil erosion and deposition zones, channel networks and convergent flow areas. Comparing slope length and LS factor values generated using LS-TOOL with manual methods, LS-TOOL corresponds more closely with the reality of the Xiannangou catchment than results using UCA or FCL. The LS-TOOL algorithm can automatically calculate slope length, slope steepness, the L and S factors, and the combined LS factor, providing the results as ASCII files which can easily be used in GIS software. This study is an important step forward in conducting more accurate large-area erosion evaluation.
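For context, the snippet below computes the standard RUSLE L and S factors for a single cell using the commonly cited McCool et al. formulations; LS-TOOL's contribution, delimiting slope length from DEM breaks in slope and channel networks, is not reproduced here.

```python
# Standard RUSLE slope length (L) and steepness (S) factors for one cell,
# using the McCool et al. formulations as commonly cited in the literature.
import math

def ls_factor(slope_length_m, slope_deg):
    theta = math.radians(slope_deg)
    beta = (math.sin(theta) / 0.0896) / (3.0 * math.sin(theta) ** 0.8 + 0.56)
    m = beta / (1.0 + beta)            # slope-length exponent
    L = (slope_length_m / 22.13) ** m  # 22.13 m = unit-plot length
    if math.tan(theta) < 0.09:         # slope gradient below 9%
        S = 10.8 * math.sin(theta) + 0.03
    else:
        S = 16.8 * math.sin(theta) - 0.50
    return L * S

print(f"LS = {ls_factor(slope_length_m=50.0, slope_deg=8.0):.2f}")
```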
Systems analysis - a new paradigm and decision support tools for the water framework directive
NASA Astrophysics Data System (ADS)
Bruen, M.
2007-06-01
In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this direction. This paper places the new paradigm in the context of the classical one and discusses some of the new approaches which can be used in the implementation of the WFD. These include multi-criteria decision support methods suitable for environmental problems, adaptive management, cognitive mapping, social learning and cooperative design and group decision-making. Concordance methods (such as ELECTRE) and the Analytical Hierarchy Process (AHP) are identified as multi-criteria methods that can be readily integrated into Decision Support Systems (DSS) that deal with complex environmental issues with very many criteria, some of which are qualitative. The expanding use of the new paradigm provides an opportunity to observe and learn from the interaction of stakeholders with the new technology and to assess its effectiveness. This is best done by trained sociologists fully integrated into the processes. The WINCOMS research project is an example applied to the implementation of the WFD in Ireland.
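One of the methods named above, AHP, is compact enough to sketch: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix. The criteria and comparison values below are invented for illustration.

```python
# AHP in brief: weights from the principal eigenvector of a pairwise
# comparison matrix, plus the conventional consistency check.
import numpy as np

# A[i, j] = how strongly criterion i is preferred to criterion j (1-9 scale);
# rows/columns here might stand for cost, ecology, public acceptance.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, principal].real)
w /= w.sum()
print("weights:", np.round(w, 3))

# Consistency ratio (CR < 0.1 is conventionally acceptable); RI = 0.58 for n = 3.
n, RI = 3, 0.58
CI = (eigvals.real[principal] - n) / (n - 1)
print("CR =", round(CI / RI, 3))
```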
Suurmond, Robert; van Rhee, Henk; Hak, Tony
2017-12-01
We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
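The random-effects machinery mentioned above can be sketched in a few lines: the DerSimonian-Laird estimate of between-study variance and the resulting pooled effect. The effect sizes are made up, and the Knapp-Hartung adjustment that Meta-Essentials applies to the confidence interval is omitted.

```python
# Bare-bones DerSimonian-Laird random-effects pooling on made-up studies.
import numpy as np

effects = np.array([0.30, 0.45, 0.12, 0.55])      # per-study effect sizes
variances = np.array([0.02, 0.03, 0.015, 0.05])   # per-study sampling variances

w = 1.0 / variances                                # fixed-effect weights
q = np.sum(w * (effects - np.sum(w * effects) / w.sum()) ** 2)
df = len(effects) - 1
tau2 = max(0.0, (q - df) / (w.sum() - np.sum(w**2) / w.sum()))  # DL estimator

w_star = 1.0 / (variances + tau2)                  # random-effects weights
pooled = np.sum(w_star * effects) / w_star.sum()
se = np.sqrt(1.0 / w_star.sum())
print(f"tau^2 = {tau2:.4f}, pooled effect = {pooled:.3f} (SE {se:.3f})")
```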
An Intelligent Tool for Activity Data Collection
Jehad Sarkar, A. M.
2011-01-01
Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data, not only for evaluating their performance but also for training the systems to function better. However, a tremendous amount of effort is required to set up an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collection. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying testbeds. It can be used as an inexpensive alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user’s activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool’s performance in producing reliable datasets. PMID:22163832
Damschroder, Laura J; Fetters, Michael D; Zikmund-Fisher, Brian J; Crabtree, Benjamin F; Hudson, Shawna V; Ruffin IV, Mack T; Fucinari, Juliana; Kang, Minji; Taichman, L Susan; Creswell, John W
2018-01-01
Background Women with chronic medical conditions, such as diabetes and hypertension, have a higher risk of pregnancy-related complications compared with women without medical conditions and should be offered contraception if desired. Although evidence based guidelines for contraceptive selection in the presence of medical conditions are available via the United States Medical Eligibility Criteria (US MEC), these guidelines are underutilized. Research also supports the use of decision tools to promote shared decision making between patients and providers during contraceptive counseling. Objective The overall goal of the MiHealth, MiChoice project is to design and implement a theory-driven, Web-based tool that incorporates the US MEC (provider-level intervention) within the vehicle of a contraceptive decision tool for women with chronic medical conditions (patient-level intervention) in community-based primary care settings (practice-level intervention). This will be a 3-phase study that includes a predesign phase, a design phase, and a testing phase in a randomized controlled trial. This study protocol describes phase 1 and aim 1, which is to determine patient-, provider-, and practice-level factors that are relevant to the design and implementation of the contraceptive decision tool. Methods This is a mixed methods implementation study. To customize the delivery of the US MEC in the decision tool, we selected high-priority constructs from the Consolidated Framework for Implementation Research and the Theoretical Domains Framework to drive data collection and analysis at the practice and provider level, respectively. A conceptual model that incorporates constructs from the transtheoretical model and the health beliefs model undergirds patient-level data collection and analysis and will inform customization of the decision tool for this population. We will recruit 6 community-based primary care practices and conduct quantitative surveys and semistructured qualitative interviews with women who have chronic medical conditions, their primary care providers (PCPs), and clinic staff, as well as field observations of practice activities. Quantitative survey data will be summarized with simple descriptive statistics and relationships between participant characteristics and contraceptive recommendations (for PCPs), and current contraceptive use (for patients) will be examined using Fisher exact test. We will conduct thematic analysis of qualitative data from interviews and field observations. The integration of data will occur by comparing, contrasting, and synthesizing qualitative and quantitative findings to inform the future development and implementation of the intervention. Results We are currently enrolling practices and anticipate study completion in 15 months. Conclusions This protocol describes the first phase of a multiphase mixed methods study to develop and implement a Web-based decision tool that is customized to meet the needs of women with chronic medical conditions in primary care settings. Study findings will promote contraceptive counseling via shared decision making and reflect evidence-based guidelines for contraceptive selection. Trial Registration ClinicalTrials.gov NCT03153644; https://clinicaltrials.gov/ct2/show/NCT03153644 (Archived by WebCite at http://www.webcitation.org/6yUkA5lK8) PMID:29669707
ADAM: Analysis of Discrete Models of Biological Systems Using Computer Algebra
2011-01-01
Background Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. Results We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Conclusions Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics. PMID:21774817
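The underlying question ADAM answers efficiently can be posed in brute force for a small network, as in the sketch below; ADAM itself converts the model to a polynomial dynamical system and uses computer algebra, which scales far better than this enumeration.

```python
# Brute-force attractor search for a small synchronous Boolean network:
# walk every state forward until it repeats, and record the cycle it lands in.
from itertools import product

# Example 3-node network: x1' = x2, x2' = x1 and not x3, x3' = x2
def step(state):
    x1, x2, x3 = state
    return (x2, x1 and not x3, x2)

attractors = set()
for start in product([False, True], repeat=3):
    seen, s = {}, start
    while s not in seen:      # iterate until a state repeats
        seen[s] = len(seen)
        s = step(s)
    cycle_start = seen[s]     # states with index >= cycle_start form the cycle
    cycle = tuple(sorted(st for st, i in seen.items() if i >= cycle_start))
    attractors.add(cycle)

for cyc in attractors:
    print([tuple(int(v) for v in st) for st in cyc])
```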
Workspace Program for Complex-Number Arithmetic
NASA Technical Reports Server (NTRS)
Patrick, M. C.; Howell, Leonard W., Jr.
1986-01-01
COMPLEX is a workspace program designed to empower APL with complex-number capabilities. Complex-variable methods provide analytical tools invaluable for applications in mathematics, science, and engineering. COMPLEX is written in APL.
Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia
2012-01-01
The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools. PMID:23201980
A new disability-related health care needs assessment tool for persons with brain disorders.
Kim, Yoon; Eun, Sang June; Kim, Wan Ho; Lee, Bum-Suk; Leigh, Ja-Ho; Kim, Jung-Eun; Lee, Jin Yong
2013-09-01
This study aimed to develop a health needs assessment (HNA) tool for persons with brain disorders and to assess the unmet needs of persons with brain disorders using the developed tool. The authors used consensus methods to develop the HNA tool. Using a randomized stratified systematic sampling method adjusted for sex, age, and district, 57 registered persons (27 severe and 30 mild cases) with brain disorders dwelling in Seoul, South Korea were chosen, and medical specialists investigated all of the subjects with the developed tool. The HNA tool for brain disorders we developed included four categories: 1) medical interventions and operations, 2) assistive devices, 3) rehabilitation therapy, and 4) regular follow-up. This study also found that 71.9% of the subjects did not receive appropriate medical care, which implies that the severity of their disability is likely to be exacerbated and permanent, and the loss irrecoverable. Our results showed that the HNA tool for persons with brain disorders based on unmet needs defined by physicians can be a useful method for evaluating the appropriateness and necessity of medical services offered to the disabled, and it can serve as the norm for providing health care services for disabled persons. Further studies should be undertaken to increase the validity and reliability of the tool. Fundamental research investigating the factors generating or affecting the unmet needs is necessary; its results could serve as a basis for developing policies to eliminate or alleviate these factors.
Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping
2018-05-22
Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly-used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.
ERIC Educational Resources Information Center
Gagnon, Micheline; And Others
1983-01-01
A method for determining the tridimensional angular displacement of skates during the two-legged stop in ice hockey was developed and validated. The angles were measured by geometry, using a cinecamera and specially equipped skates. The method provides a new tool for kinetic analyses of skating movements. (Authors/PP)
A Learner-Centered Grading Method Focused on Reaching Proficiency with Course Learning Outcomes
ERIC Educational Resources Information Center
Toledo, Santiago; Dubas, Justin M.
2017-01-01
Getting students to use grading feedback as a tool for learning is a continual challenge for educators. This work proposes a method for evaluating student performance that provides feedback to students based on standards of learning dictated by clearly delineated course learning outcomes. This method combines elements of standards-based grading…
USDA-ARS's Scientific Manuscript database
A rapid method for extracting eriophyoid mites was adapted from previous studies to provide growers and IPM consultants with a practical, efficient, and reliable tool to monitor for rust mites in vineyards. The rinse in bag (RIB) method allows quick extraction of mites from collected plant parts (sh...
Abe Vicente, Mariana; Barão, Katia; Silva, Tiago Donizetti; Forones, Nora Manoukian
2013-01-01
To evaluate methods for the identification of nutrition risk and nutritional status in outpatients with colorectal (CRC) and gastric cancer (GC), and to compare the results to those obtained for patients already treated for these cancers. A cross-sectional study was conducted on 137 patients: group 1 (n = 75) consisting of patients with GC or CRC, and group 2 (n = 62) consisting of patients under follow-up after treatment of GC or CRC who had been tumor free for longer than 3 months. Nutritional status was assessed in these patients using objective methods [body mass index (BMI), phase angle, serum albumin]; nutritional screening tools [Malnutrition Universal Screening Tool (MUST), Malnutrition Screening Tool (MST), Nutritional Risk Index (NRI)]; and subjective assessment [Patient-Generated Subjective Global Assessment (PG-SGA)]. The sensitivity and specificity of each method were calculated relative to the PG-SGA used as gold standard. One hundred thirty-seven patients participated in the study. Stage IV cancer patients were more common in group 1. There was no difference in BMI between groups (p = 0.67). Analysis of the association between methods of assessing nutritional status and PG-SGA showed that the nutritional screening tools provided more significant results (p < 0.05) than the objective methods in the two groups. PG-SGA detected the highest proportion of undernourished patients in group 1. The nutritional screening tools MUST, NRI and MST were more sensitive than the objective methods. Phase angle measurement was the most sensitive objective method in group 1. The nutritional screening tools showed the best association with PG-SGA and were also more sensitive than the objective methods. The results suggest combining MUST and PG-SGA for patients with cancer before and after treatment. Copyright © AULA MEDICA EDICIONES 2013. Published by AULA MEDICA. All rights reserved.
New methodology to baseline and match AME polysilicon etcher using advanced diagnostic tools
NASA Astrophysics Data System (ADS)
Poppe, James; Shipman, John; Reinhardt, Barbara E.; Roussel, Myriam; Hedgecock, Raymond; Fonda, Arturo
1999-09-01
As process controls tighten in the semiconductor industry, the need to understand the variables that determine system performance becomes more important. For plasma etch systems, process success depends on the control of key parameters such as vacuum integrity, pressure, gas flows, and RF power. It is imperative to baseline, monitor, and control these variables. This paper presents an overview of the methods and tools used by the Motorola BMC fabrication facility to characterize an Applied Materials polysilicon etcher. Tool performance data obtained from our traditional measurement techniques are limited in scope and do not provide a complete picture of ultimate tool performance. Presently, the BMC's traditional characterization tools provide a snapshot of the static operation of the equipment under test (EUT); however, the dynamic performance cannot be completely evaluated without the aid of specialized diagnostic equipment. To provide a complete system baseline evaluation of the polysilicon etcher, three diagnostic tools were utilized: the Lucas Labs Vacuum Diagnostic System, a residual gas analyzer, and the ENI voltage/impedance probe. The diagnostic methodology used to baseline and match key parameters of qualified production equipment has had an immense impact on other equipment characterization in the facility. It has resulted in reduced cycle time for new equipment introduction as well.
Dwivedi, Bhakti; Kowalski, Jeanne
2018-01-01
While many methods exist for integrating multi-omics data or defining gene sets, there is no one single tool that defines gene sets based on merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to help facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/. PMID:29415010
Modeling crime events by d-separation method
NASA Astrophysics Data System (ADS)
Aarthee, R.; Ezhilmaran, D.
2017-11-01
Problematic legal cases have recently called for a scientifically founded method of dealing with the qualitative and quantitative roles of evidence in a case [1]. To deal with the quantitative role, we propose a d-separation method for modelling crime events. D-separation is a graphical criterion for identifying independence in a directed acyclic graph. By developing a d-separation method, we aim to lay the foundations for a software support tool that can handle evidential reasoning in legal cases. Such a tool is meant to be used by a judge or juror, in alliance with various experts who can provide information about the details, and will hopefully improve communication between judges or jurors and experts. The proposed method uncovers more valid independencies than other graphical criteria.
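A d-separation check itself is straightforward to sketch via the classic moralization criterion; the crime-themed DAG below is invented for illustration and is not from the paper.

```python
# D-separation via moralization: X and Y are d-separated given Z iff they are
# disconnected in the moralized ancestral graph with Z removed.
import networkx as nx
from itertools import combinations

def d_separated(g, xs, ys, zs):
    keep = set(xs) | set(ys) | set(zs)
    for node in list(keep):
        keep |= nx.ancestors(g, node)
    anc = g.subgraph(keep)
    moral = nx.Graph(anc.to_undirected())
    for node in anc:                      # marry the parents of each node
        for u, v in combinations(anc.predecessors(node), 2):
            moral.add_edge(u, v)
    moral.remove_nodes_from(zs)
    return not any(nx.has_path(moral, x, y) for x in xs for y in ys
                   if x in moral and y in moral)

# motive -> crime <- opportunity ; crime -> evidence
g = nx.DiGraph([("motive", "crime"), ("opportunity", "crime"), ("crime", "evidence")])
print(d_separated(g, {"motive"}, {"opportunity"}, set()))         # True: no active path
print(d_separated(g, {"motive"}, {"opportunity"}, {"evidence"}))  # False: collider opened
```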
González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio
2015-03-01
A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multi-component samples. The method requires no user input or judgment, so there are no parameters to be tweaked. Furthermore, it provides a reliability factor for the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for automated spectral analysis, where the analyzed spectrum comes from a sample about which there is no prior knowledge, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
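The PCA/ICA machinery the method builds on can be illustrated on synthetic data (this is not the authors' full pipeline, which adds metrics and a reliability factor): FastICA recovers two component "spectra" from noisy mixtures.

```python
# Illustrative PCA/ICA unmixing of synthetic two-component "spectra".
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 400)
comp1 = np.exp(-((x - 0.3) / 0.02) ** 2)      # synthetic band 1
comp2 = np.exp(-((x - 0.7) / 0.04) ** 2)      # synthetic band 2
mixing = rng.uniform(0.2, 1.0, size=(20, 2))  # 20 measured mixtures
spectra = mixing @ np.vstack([comp1, comp2]) + 0.01 * rng.normal(size=(20, 400))

pca = PCA(n_components=2).fit(spectra)        # two components dominate
print("variance explained:", pca.explained_variance_ratio_.round(3))

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(spectra.T)        # (400, 2): recovered spectra,
print(sources.shape)                          # up to scale and sign
```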
NASA Astrophysics Data System (ADS)
Yoon, Seung-Chul; Park, Bosoon; Lawrence, Kurt C.
2017-05-01
Various types of optical imaging techniques measuring light reflectivity and scattering can detect microbial colonies of foodborne pathogens on agar plates. Until recently, these techniques were developed to provide solutions for hypothesis-driven studies, which focused on developing tools and batch/offline machine learning methods with well-defined sets of data. These have relatively high accuracy and rapid response times because the tools and methods are often optimized for the collected data. However, they often need to be retrained or recalibrated when new untrained data and/or features are added. A big-data-driven technique is more suitable for online learning of new or ambiguous samples and for mining unknown or hidden features. Although big data research in hyperspectral imaging is emerging in remote sensing, and many tools and methods have been developed in other applications such as bioinformatics, these tools and methods still need to be evaluated and adjusted for applications where conventional batch machine learning algorithms have been dominant. The primary objective of this study is to evaluate appropriate big data analytic tools and methods for online learning and mining of foodborne pathogens on agar plates. After the tools and methods are successfully identified, they will be applied to rapidly search big color and hyperspectral image data of microbial colonies collected over the past 5 years in house and to find the most probable colony or group of colonies in the collected big data. The metadata, such as collection time, and any unstructured data (e.g. comments) will also be analyzed and presented with the output results. The expected result is a novel, big-data-driven technology to correctly detect and recognize microbial colonies of various foodborne pathogens on agar plates.
A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James
2011-11-01
Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation, one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
Twitter's tweet method modelling and simulation
NASA Astrophysics Data System (ADS)
Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.
2015-02-01
This paper proposes a concept for Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper leverages the system dynamics paradigm to model Twitter marketing tools and methods, using the iThink™ system to implement them, and applies the design science research methodology for the proof of concept of the models and modelling processes. The models were developed for a Twitter marketing agent/company and tested in real circumstances with real numbers, and were finalized through a number of revisions and iterations of design, development, simulation, testing and evaluation. The paper also addresses the methods that best suit organized promotion through targeting on the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are authenticated by the management of the company. The paper implements system dynamics concepts of Twitter marketing method modelling and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, in order to maximize the profit of the company/agent.
Liu, Shiyuan; Xu, Shuang; Wu, Xiaofei; Liu, Wei
2012-06-18
This paper proposes an iterative method for in situ lens aberration measurement in lithographic tools based on a quadratic aberration model (QAM) that is a natural extension of the linear model formed by taking into account interactions among individual Zernike coefficients. By introducing a generalized operator named cross triple correlation (CTC), the quadratic model can be calculated very quickly and accurately with the help of fast Fourier transform (FFT). The Zernike coefficients up to the 37th order or even higher are determined by solving an inverse problem through an iterative procedure from several through-focus aerial images of a specially designed mask pattern. The simulation work has validated the theoretical derivation and confirms that such a method is simple to implement and yields a superior quality of wavefront estimate, particularly for the case when the aberrations are relatively large. It is fully expected that this method will provide a useful practical means for the in-line monitoring of the imaging quality of lithographic tools.
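The paper's cross-triple-correlation/FFT machinery is not reproduced in the abstract; purely as a generic stand-in, the sketch below inverts a quadratic model of the same overall shape by Gauss-Newton iteration, with random matrices in place of the lithographic sensitivities.

    import numpy as np

    rng = np.random.default_rng(1)
    n_zern, n_meas = 9, 40                  # illustrative sizes, not the paper's

    A = rng.standard_normal((n_meas, n_zern))                 # linear sensitivities
    B = [(lambda M: 0.05 * (M + M.T) / 2)(rng.standard_normal((n_zern, n_zern)))
         for _ in range(n_meas)]                              # symmetric quadratic terms

    def forward(z):
        """Quadratic aberration model: s_i = A_i . z + z^T B_i z."""
        return A @ z + np.array([z @ Bi @ z for Bi in B])

    z_true = 0.1 * rng.standard_normal(n_zern)
    s_meas = forward(z_true)

    # Gauss-Newton iteration: the Jacobian row of the quadratic model is A_i + 2 B_i z.
    z = np.zeros(n_zern)
    for _ in range(20):
        J = A + 2.0 * np.array([Bi @ z for Bi in B])
        z += np.linalg.lstsq(J, s_meas - forward(z), rcond=None)[0]

    print(np.max(np.abs(z - z_true)))       # tiny for this well-posed toy problem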
Whitmore, Lee; Mavridis, Lazaros; Wallace, B A; Janes, Robert W
2018-01-01
Circular dichroism spectroscopy is a widely used yet simple method in structural biology for providing information on the secondary structure and folds of proteins. DichroMatch (DM@PCDDB) is an online tool newly available in the Protein Circular Dichroism Data Bank (PCDDB) that takes advantage of the wealth of spectral data and metadata deposited therein to enable identification of the spectral nearest neighbors of a query protein based on four different methods of spectral matching. DM@PCDDB can potentially provide novel information about structural relationships between proteins and can be used in comparison studies of protein homologs and orthologs. © 2017 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.
Direct writing of metal nanostructures: lithographic tools for nanoplasmonics research.
Leggett, Graham J
2011-03-22
Continued progress in the fast-growing field of nanoplasmonics will require the development of new methods for the fabrication of metal nanostructures. Optical lithography provides a continually expanding tool box. Two-photon processes, as demonstrated by Shukla et al. (doi: 10.1021/nn103015g), enable the fabrication of gold nanostructures encapsulated in dielectric material in a simple, direct process and offer the prospect of three-dimensional fabrication. At higher resolution, scanning probe techniques enable nanoparticle placement by localized oxidation, and near-field sintering of nanoparticulate films enables direct writing of nanowires. Direct laser "printing" of single gold nanoparticles offers a remarkable capability for the controlled fabrication of model structures for fundamental studies, particle by particle. Optical methods continue to provide powerful support for research into metamaterials.
Higher-order automatic differentiation of mathematical functions
NASA Astrophysics Data System (ADS)
Charpentier, Isabelle; Dal Cappello, Claude
2015-04-01
Functions of mathematical physics such as the Bessel functions, the Chebyshev polynomials, the Gauss hypergeometric function and so forth have practical applications in many scientific domains. On the one hand, the differentiation formulas provided in reference books apply to real or complex variables, but they do not account for the chain rule. On the other hand, automatic differentiation, which is based on the chain rule, has become a natural tool in numerical modeling; nevertheless, automatic differentiation tools do not cover these numerous mathematical functions. This paper describes formulas and provides codes for the higher-order automatic differentiation of mathematical functions. The first method is based on Faà di Bruno's formula, which generalizes the chain rule. The second one makes use of the second-order differential equations these functions satisfy. Both methods are exemplified with the aforementioned functions.
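As a minimal instance of the recurrence-based style of higher-order differentiation (not the authors' code), the sketch below propagates truncated Taylor coefficients through exp using the convolution recurrence implied by g' = f' g; other elementary and special functions admit analogous recurrences.

    import numpy as np
    from math import factorial

    def exp_taylor(f):
        """Taylor coefficients of exp(f(t)) about t=0, given those of f.

        Uses the recurrence k*g_k = sum_{j=1}^{k} j*f_j*g_{k-j},
        obtained by matching coefficients in g'(t) = f'(t) g(t).
        """
        f = np.asarray(f, dtype=float)
        g = np.zeros_like(f)
        g[0] = np.exp(f[0])
        for k in range(1, len(f)):
            g[k] = sum(j * f[j] * g[k - j] for j in range(1, k + 1)) / k
        return g

    # Check against the series of exp(2t), whose coefficients are 2^k / k!
    print(exp_taylor([0.0, 2.0, 0.0, 0.0, 0.0]))       # [1, 2, 2, 4/3, 2/3]
    print([2 ** k / factorial(k) for k in range(5)])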
Constance I. Millar; Christopher W. Swanston; David L. Peterson
2014-01-01
Federal agencies have led the development of adaptation principles and tools in forest ecosystems over the past decade. Successful adaptation efforts generally require organizations to: (1) develop science-management partnerships, (2) provide education on climate change science, (3) provide a toolkit of methods and processes for vulnerability assessment and adaptation...
Delmaar, J E; Bokkers, B G H; ter Burg, W; van Engelen, J G M
2013-02-01
The demonstration of safe use of chemicals in consumer products, as required under REACH, is proposed to follow a tiered process. In the first tier, simple conservative methods and assumptions should be used to quickly verify whether risks are expected for a particular use. The ECETOC TRA Consumer Exposure Tool was developed to assist in first-tier risk assessments for substances in consumer products. The ECETOC TRA is not a prioritization tool, but is meant as a first screening. Therefore, the exposure assessment needs to cover all products/articles in a specific category. For the assessment of dermal exposure to substances in articles, ECETOC TRA uses the concept of a 'contact layer', a hypothetical layer that limits the exposure to a substance contained in the product. For each product/article category, ECETOC TRA proposes default values for the thickness of this contact layer. As relevant experimental exposure data are currently lacking, the default values are based on expert judgment alone. In this paper it is verified whether this concept meets the requirement of being a conservative exposure evaluation method. This is done by confronting the expert-judgment-based predictions of ECETOC TRA with a mechanistic emission model based on the well-established theory of diffusion of substances in materials. Diffusion models have been applied and tested in many emission modeling applications, and experimentally determined input data for a number of material and substance combinations are available. The estimated emissions provide information on the range of emissions that could occur in reality. First-tier tools such as the ECETOC TRA are required to cover all products/articles in a category and to provide estimates that are at least as high as expected on the basis of current scientific knowledge. Since this was not the case, it is concluded that the ECETOC TRA does not provide a properly conservative estimation method for dermal exposure to articles. An alternative method is proposed. Copyright © 2012 Elsevier Inc. All rights reserved.
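The paper's benchmark model is not reproduced here; the sketch below only shows the classical Fickian slab-emission series that such diffusion models build on, with placeholder values for the diffusion coefficient D and layer thickness L.

    import numpy as np

    def emitted_fraction(t, D, L, n_terms=200):
        """Fraction of substance emitted from a slab of thickness L after time t.

        Classical Fickian series solution: uniform initial concentration,
        one sealed face, one exposed face held at zero concentration.
        """
        n = np.arange(n_terms)
        lam = (2 * n + 1) * np.pi / (2 * L)
        terms = (8.0 / ((2 * n + 1) ** 2 * np.pi ** 2)) * np.exp(-D * lam ** 2 * t)
        return 1.0 - terms.sum()

    # Placeholder values: a slow-diffusing additive (D = 1e-14 m^2/s) in a
    # 1 mm material layer, evaluated after 30 days.
    print(emitted_fraction(t=30 * 86400, D=1e-14, L=1e-3))   # roughly 0.18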
Research on the Intensity Analysis and Result Visualization of Construction Land in Urban Planning
NASA Astrophysics Data System (ADS)
Cui, J.; Dong, B.; Li, J.; Li, L.
2017-09-01
As a fundamental task in urban planning, the intensity analysis of construction land involves many repetitive data processing steps that are prone to errors and data precision loss, and current practice lacks efficient methods and tools for visualizing the analysis results. In this research, a portable tool was developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid result visualization. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data are imported, the analysis results and related maps and graphs, including the intensity values, the zoning map, the skyline analysis map, etc., are produced automatically. Finally, the tool is installation-free and can be deployed quickly between planning teams.
New Automotive Air Conditioning System Simulation Tool Developed in MATLAB/Simulink
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiss, T.; Chaney, L.; Meyer, J.
Further improvements in vehicle fuel efficiency require accurate evaluation of the vehicle's transient total power requirement. When operated, the air conditioning (A/C) system is the largest auxiliary load on a vehicle; therefore, accurate evaluation of the load it places on the vehicle's engine and/or energy storage system is especially important. Vehicle simulation software, such as 'Autonomie,' has been used by OEMs to evaluate vehicles' energy performance. A transient A/C simulation tool incorporated into vehicle simulation models would also provide a tool for developing more efficient A/C systems through a thorough consideration of transient A/C system performance. The dynamic system simulation software MATLAB/Simulink was used to develop new and more efficient vehicle energy system controls. The various modeling methods used for the new simulation tool are described in detail. Comparison with measured data is provided to demonstrate the validity of the model.
CsSNP: A Web-Based Tool for the Detecting of Comparative Segments SNPs.
Wang, Yi; Wang, Shuangshuang; Zhou, Dongjie; Yang, Shuai; Xu, Yongchao; Yang, Chao; Yang, Long
2016-07-01
SNP (single nucleotide polymorphism) analysis is a popular tool for the study of genetic diversity, evolution, and other areas. It is therefore desirable to have a convenient, robust, rapid, and open-source SNP-detection tool available to all researchers. Since the detection of SNPs requires special software and a series of steps including alignment, detection, analysis, and presentation, SNP studies can be difficult for nonprofessional users. CsSNP (Comparative segments SNP, http://biodb.sdau.edu.cn/cssnp/ ) is a freely available web tool based on the Blat, Blast, and Perl programs that detects comparative-segment SNPs and shows detailed information about them. The results are filtered and presented in statistics figures and a Gbrowse map. The platform contains the reference genomic sequences and coding sequences of 60 plant species, and provides new opportunities for users to detect SNPs easily. CsSNP offers nonprofessional users a convenient way to find comparative-segment SNPs in their own sequences, gives them information and analysis of the SNPs, and displays these data in a dynamic map. It provides a new method to detect SNPs and may accelerate related studies.
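CsSNP's Blat/Blast/Perl pipeline is not shown in the abstract; once segments are aligned, though, the core detection step reduces to a base-by-base comparison along the lines of this sketch (made-up sequences, hypothetical helper name).

    def detect_snps(ref, query):
        """Report SNP positions between two already-aligned sequences.

        Skips alignment gaps ('-') and ambiguous bases ('N'); a real pipeline
        would first align the comparative segments (e.g., with Blat or Blast).
        """
        snps = []
        for pos, (a, b) in enumerate(zip(ref.upper(), query.upper()), start=1):
            if a != b and a not in "-N" and b not in "-N":
                snps.append((pos, a, b))
        return snps

    ref   = "ATGGCTTACGATCCGTA-ACGT"
    query = "ATGGCATACGATCNGTAGACGT"
    for pos, a, b in detect_snps(ref, query):
        print(f"SNP at position {pos}: {a} -> {b}")   # position 6: T -> A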
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Callaghan, Michael E., E-mail: elspeth.raymond@health.sa.gov.au; Freemasons Foundation Centre for Men's Health, University of Adelaide; Urology Unit, Repatriation General Hospital, SA Health, Flinders Centre for Innovation in Cancer
Purpose: To identify, through a systematic review, all validated tools used for the prediction of patient-reported outcome measures (PROMs) in patients being treated with radiation therapy for prostate cancer, and to provide a comparative summary of accuracy and generalizability. Methods and Materials: PubMed and EMBASE were searched from July 2007. Title/abstract screening, full text review, and critical appraisal were undertaken by 2 reviewers, whereas data extraction was performed by a single reviewer. Eligible articles had to provide a summary measure of accuracy and undertake internal or external validation. Tools were recommended for clinical implementation if they had been externally validated and found to have accuracy ≥70%. Results: The search strategy identified 3839 potential studies, of which 236 progressed to full text review and 22 were included. From these studies, 50 tools predicted gastrointestinal/rectal symptoms, 29 tools predicted genitourinary symptoms, 4 tools predicted erectile dysfunction, and no tools predicted quality of life. For patients treated with external beam radiation therapy, 3 tools could be recommended for the prediction of rectal toxicity, gastrointestinal toxicity, and erectile dysfunction. For patients treated with brachytherapy, 2 tools could be recommended for the prediction of urinary retention and erectile dysfunction. Conclusions: A large number of tools for the prediction of PROMs in prostate cancer patients treated with radiation therapy have been developed. Only a small minority are accurate and have been shown to be generalizable through external validation. This review provides an accessible catalogue of tools that are ready for clinical implementation, as well as those that should be prioritized for validation.
Voice-enabled Knowledge Engine using Flood Ontology and Natural Language Processing
NASA Astrophysics Data System (ADS)
Sermet, M. Y.; Demir, I.; Krajewski, W. F.
2015-12-01
The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts, flood-related data, information and interactive visualizations for communities in Iowa. The IFIS is designed for use by the general public, often people with no domain knowledge and a limited general science background. To improve effective communication with such an audience, we have introduced a voice-enabled knowledge engine for flood-related issues in IFIS. Instead of navigating within the many features and interfaces of the information system and web-based sources, the system provides dynamic computations based on a collection of built-in data, analysis, and methods. The IFIS Knowledge Engine connects to real-time stream gauges, in-house data sources, and analysis and visualization tools to answer natural language questions. Our goal is the systematization of data and modeling results on flood-related issues in Iowa, and to provide an interface for definitive answers to factual queries. The goal of the knowledge engine is to make all flood-related knowledge in Iowa easily accessible to everyone, and to support voice-enabled natural language input. We aim to integrate and curate all flood-related data, implement analytical and visualization tools, and make it possible to compute answers from questions. The IFIS explicitly implements analytical methods and models, as algorithms, and curates all flood-related data and resources so that all these resources are computable. The IFIS Knowledge Engine computes the answer by deriving it from its computational knowledge base. The knowledge engine processes the statement, accesses the data warehouse, runs complex database queries on the server side, and returns outputs in various formats. This presentation provides an overview of the IFIS Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses future plans for providing knowledge on flood-related issues and resources. The IFIS Knowledge Engine provides an alternative access method to the comprehensive set of tools and data resources available in IFIS. The current implementation of the system accepts free-form input and provides voice recognition capabilities within browser and mobile applications.
Guichard, Anne; Tardieu, Émilie; Dagenais, Christian; Nour, Kareen; Lafontaine, Ginette; Ridde, Valéry
2017-04-01
The aim of this project was to identify and prioritize a set of conditions to be considered for incorporating a health equity tool into public health practice. Concept mapping and focus groups were implemented as complementary methods to investigate the conditions of use of a health equity tool by public health organizations in Quebec. Using a hybrid integrated research design is a richer way to address the complexity of questions emerging from intervention and planning settings. This approach provides a deeper, operational, and contextualized understanding of research results involving different professional and organizational cultures, and thereby supports the decision-making process. Concept mapping served to identify and prioritize, in a limited timeframe, the conditions to be considered for incorporating a health equity tool into public health practices. Focus groups then provided a more refined understanding of the barriers, issues, and facilitating factors surrounding the tool's adoption, helped distinguish among participants' perspectives based on functional roles and organizational contexts, and clarified some apparently contradictory results from the concept map. The combined use of these two techniques brought the strengths of each approach to bear, thereby overcoming some of the respective limitations of concept mapping and focus groups. This design is appropriate for investigating targets with multiple levels of complexity. Copyright © 2017 Elsevier Ltd. All rights reserved.
Automated shock detection and analysis algorithm for space weather application
NASA Astrophysics Data System (ADS)
Vorotnikov, Vasiliy S.; Smith, Charles W.; Hu, Qiang; Szabo, Adam; Skoug, Ruth M.; Cohen, Christina M. S.
2008-03-01
Space weather applications have grown steadily as real-time data have become increasingly available. Numerous industrial applications have arisen, with safeguarding of the power distribution grids being a particular interest. NASA uses short-term and long-term space weather predictions in its launch facilities. Researchers studying ionospheric, auroral, and magnetospheric disturbances use real-time space weather services to determine launch times. Commercial airlines, communication companies, and the military use space weather measurements to manage their resources and activities. As the effects of solar transients upon the Earth's environment and society grow with the increasing complexity of technology, better tools are needed to monitor and evaluate the characteristics of the incoming disturbances. There is a need for automated shock detection and analysis methods that are applicable to in situ measurements upstream of the Earth. Such tools can provide advance warning of approaching disturbances that have significant space weather impacts. Knowledge of the shock strength and speed can also provide insight into the nature of the approaching solar transient prior to arrival at the magnetopause. We report on efforts to develop a tool that can find and analyze shocks in interplanetary plasma data without operator intervention. This method will run with sufficient speed to be a practical space weather tool, providing useful shock information within 1 min of having the necessary data to ground. The ability to run without human intervention frees space weather operators to perform other vital services. We describe ways of handling upstream data that minimize the frequency of false-positive alerts while providing the most complete description of approaching disturbances that is reasonably possible.
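The team's operational algorithm is not described in detail in the abstract; purely as an illustration of the idea, this sketch flags abrupt jumps between upstream and downstream window averages in synthetic solar-wind data, with invented thresholds and window lengths (real detectors add magnetic-field and Rankine-Hugoniot consistency tests).

    import numpy as np

    def flag_shock_candidates(speed, density, window=10,
                              speed_jump=40.0, density_ratio=1.4):
        """Return indices where up/downstream window averages differ sharply.

        speed in km/s, density in cm^-3, sampled on a regular cadence.
        """
        candidates = []
        for i in range(window, len(speed) - window):
            up_v, down_v = speed[i - window:i].mean(), speed[i:i + window].mean()
            up_n, down_n = density[i - window:i].mean(), density[i:i + window].mean()
            if (down_v - up_v) > speed_jump and (down_n / up_n) > density_ratio:
                candidates.append(i)
        return candidates

    # Synthetic stream with a step at sample 300.
    rng = np.random.default_rng(2)
    v = np.where(np.arange(600) < 300, 400.0, 470.0) + 5 * rng.standard_normal(600)
    n = np.where(np.arange(600) < 300, 5.0, 9.0) + 0.3 * rng.standard_normal(600)
    print(flag_shock_candidates(v, n))   # indices clustered near 300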
Using MCBEND for neutron or gamma-ray deterministic calculations
NASA Astrophysics Data System (ADS)
Geoff, Dobson; Adam, Bird; Brendan, Tollit; Paul, Smith
2017-09-01
MCBEND 11 is the latest version of the general radiation transport Monte Carlo code from AMEC Foster Wheeler's ANSWERS® Software Service. MCBEND is well established in the UK shielding community for radiation shielding and dosimetry assessments. MCBEND supports a number of acceleration techniques, for example the use of an importance map in conjunction with Splitting/Russian Roulette. MCBEND has a well-established automated tool to generate this importance map, commonly referred to as the MAGIC module, which uses a diffusion adjoint solution. This method is fully integrated with the MCBEND geometry and material specification, and can easily be run as part of a normal MCBEND calculation. An often overlooked feature of MCBEND is the ability to use this method for forward scoping calculations, which can be run as a very quick deterministic method. Additionally, the development of the Visual Workshop environment for results display provides new capabilities for the use of the forward calculation as a productivity tool. In this paper, we illustrate the use of the combination of the old and the new in order to provide an enhanced analysis capability. We also explore the use of more advanced deterministic methods for scoping calculations used in conjunction with MCBEND, with a view to providing a suite of methods to accompany the main Monte Carlo solver.
Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advance Molecular Imaging Tools.
Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe
2018-01-01
Processing and interpretation of biological images may provide invaluable insights on complex, living systems, because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and the different instrumentations focused on molecular image processing and analysis. Quantitative data analysis through various open-source software and algorithmic protocols will provide a novel approach for modeling the experimental research program. Besides this, we also highlight the predictable future trends regarding methods for automatically analyzing biological data. Such tools will be very useful to understand the detailed biological and mathematical expressions underlying in silico systems biology processes with modeling properties.
NASA Astrophysics Data System (ADS)
Kucera, P. A.; Burek, T.; Halley-Gotway, J.
2015-12-01
NCAR's Joint Numerical Testbed Program (JNTP) focuses on the evaluation of experimental forecasts of tropical cyclones (TCs), with the goal of developing new research tools and diagnostic evaluation methods that can be transitioned to operations. Recent activities include the development of new TC forecast verification methods and the development of an adaptable TC display and diagnostic system. The next-generation display and diagnostic system is being developed to support the evaluation needs of the U.S. National Hurricane Center (NHC) and the broader TC research community. The new hurricane display and diagnostic capabilities allow forecasters and research scientists to more deeply examine the performance of operational and experimental models. The system is built upon modern and flexible technology, including platform-independent OpenLayers mapping tools. The forecast track and intensity, along with associated observed track information, are stored in an efficient MySQL database. The system provides an easy-to-use interactive display and diagnostic tools to examine forecast tracks stratified by intensity. Consensus forecasts can be computed and displayed interactively. The system is designed to display information for both real-time and historical TCs. The display configurations are easily adaptable to meet end-user preferences. Ongoing enhancements include improving capabilities for stratification and evaluation of historical best tracks, development and implementation of additional methods to stratify and compute consensus hurricane track and intensity forecasts, and improved graphical display tools. The display is also being enhanced to incorporate gridded forecast, satellite, and sea surface temperature fields. The presentation will provide an overview of the display and diagnostic system development and a demonstration of the current capabilities.
Design and Testing of an Air Force Services Mystery Shopping Program.
1998-11-01
Base level Air Force Services' lodging and foodservice activities use limited service quality measurement tools to determine customer perceptions of service quality. These tools, specifically management observation and customer comment cards, do not provide a complete picture of service quality. Other service quality measurement methods, such as mystery shopping, are rarely used. Bases do not consider using mystery shopping programs because of the...
Shallow Water Reverberation Measurement and Prediction
1994-06-01
The three-dimensional Hamiltonian Acoustic Ray-tracing Program for the Ocean (HARPO) was used as the primary propagation modeling tool. The temporal signal processing consisted of a short-time Fourier transform spectral estimation method applied to data from a single hydrophone. The report summarizes the work completed, discusses lessons learned, and provides advice regarding future work to refine the present study.
Reducing Information Overload in Large Seismic Data Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.
2000-08-02
Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g. single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially aid in providing a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics can be identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets. The tools could also be used as a means to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia.
All of these software tools would provide the researcher with unprecedented power without having to learn the intricacies and complexities of relational database systems.
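As a rough analogue of the dendrogram tool described above (not the MatSeis code itself), this sketch clusters synthetic waveforms by correlation distance with SciPy, using the complete-linkage choice among the methods the abstract names; swapping method="single" or another linkage is a one-word change.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(3)
    t = np.linspace(0, 1, 400)

    # Two synthetic "event families" with distinct dominant frequencies, plus noise.
    waveforms = np.vstack(
        [np.sin(2 * np.pi * f * t) + 0.2 * rng.standard_normal(t.size)
         for f in (8, 8, 8, 21, 21, 21)]
    )

    # Correlation distance (1 - r) between waveform pairs, then hierarchical linkage.
    d = pdist(waveforms, metric="correlation")
    Z = linkage(d, method="complete")
    print(fcluster(Z, t=2, criterion="maxclust"))   # e.g. [1 1 1 2 2 2]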
Soak Up the Rain New England Webinar Series: National ...
Presenters will provide an introduction to the most recent EPA green infrastructure tools to R1 stakeholders and their use in making decisions about implementing green infrastructure. We will discuss structuring your green infrastructure decision, finding appropriate information and tools, evaluating options, and selecting the right Best Management Practices mix for your needs. WMOST (Watershed Management Optimization Support Tool): for screening a wide range of practices for cost-effectiveness in achieving watershed or water utilities management goals. GIWiz (Green Infrastructure Wizard): a web application connecting communities to EPA Green Infrastructure tools and resources. Opti-Tool: designed to assist in developing technically sound and optimized cost-effective stormwater management plans. National Stormwater Calculator: a desktop application for estimating the impact of land cover change and green infrastructure controls on stormwater runoff. DASEES-GI (Decision Analysis for a Sustainable Environment, Economy, and Society): a framework for linking objectives and measures with green infrastructure methods.
Air traffic management evaluation tool
NASA Technical Reports Server (NTRS)
Sheth, Kapil S. (Inventor); Sridhar, Banavar (Inventor); Bilimoria, Karl D. (Inventor); Grabbe, Shon (Inventor); Chatterji, Gano Broto (Inventor); Schipper, John F. (Inventor)
2010-01-01
Method and system for evaluating and implementing air traffic management tools and approaches for managing and avoiding an air traffic incident before the incident occurs. The invention provides flight plan routing and direct routing or wind-optimal routing, using great circle navigation and spherical Earth geometry. The invention provides for aircraft dynamics effects, such as wind effects at each altitude, altitude changes, airspeed changes and aircraft turns, to provide predictions of aircraft trajectory (and, optionally, aircraft fuel use). A second system provides several aviation applications using the first system. These applications include conflict detection and resolution, miles-in-trail or minutes-in-trail aircraft separation, flight arrival management, flight re-routing, weather prediction and analysis, and interpolation of weather variables based upon sparse measurements.
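The patent's trajectory machinery is not public, but the great-circle and spherical-Earth primitives it mentions are standard; a minimal sketch of the distance and initial-course calculations follows (haversine form, mean Earth radius of 6371 km assumed).

    from math import radians, degrees, sin, cos, asin, atan2, sqrt

    R_EARTH_KM = 6371.0  # mean spherical Earth radius (assumption)

    def great_circle_km(lat1, lon1, lat2, lon2):
        """Haversine great-circle distance between two points given in degrees."""
        p1, p2 = radians(lat1), radians(lat2)
        dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
        return 2 * R_EARTH_KM * asin(sqrt(a))

    def initial_course_deg(lat1, lon1, lat2, lon2):
        """Initial true course from point 1 toward point 2, degrees clockwise from north."""
        p1, p2 = radians(lat1), radians(lat2)
        dl = radians(lon2 - lon1)
        x = sin(dl) * cos(p2)
        y = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dl)
        return degrees(atan2(x, y)) % 360

    # SFO to JFK, roughly:
    print(great_circle_km(37.62, -122.38, 40.64, -73.78))    # ~4150 km
    print(initial_course_deg(37.62, -122.38, 40.64, -73.78)) # ~70 deg (ENE)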
Jacques, Eveline; Wells, Darren M; Bennett, Malcolm J; Vissenberg, Kris
2015-01-01
High-resolution imaging of cytoskeletal structures paves the way for standardized methods to quantify cytoskeletal organization. Here we provide a detailed description of the analysis performed to determine the microtubule patterns in gravistimulated roots, using the recently developed software tool MicroFilament Analyzer.
FODEM: A Multi-Threaded Research and Development Method for Educational Technology
ERIC Educational Resources Information Center
Suhonen, Jarkko; de Villiers, M. Ruth; Sutinen, Erkki
2012-01-01
Formative development method (FODEM) is a multithreaded design approach that was originated to support the design and development of various types of educational technology innovations, such as learning tools, and online study programmes. The threaded and agile structure of the approach provides flexibility to the design process. Intensive…
Tool Wear Feature Extraction Based on Hilbert Marginal Spectrum
NASA Astrophysics Data System (ADS)
Guan, Shan; Song, Weijie; Pang, Hongyang
2017-09-01
In the metal cutting process, the signal contains a wealth of tool wear state information. A tool wear signal analysis and feature extraction method based on the Hilbert marginal spectrum is proposed. Firstly, the tool wear signal was decomposed by the empirical mode decomposition algorithm, and the intrinsic mode functions containing the main information were screened out by the correlation coefficient and the variance contribution rate. Secondly, the Hilbert transform was performed on the main intrinsic mode functions, yielding the Hilbert time-frequency spectrum and the Hilbert marginal spectrum. Finally, amplitude-domain indexes were extracted on the basis of the Hilbert marginal spectrum and used to construct the recognition feature vector of the tool wear state. The research results show that the extracted features can effectively characterize the different wear states of the tool, which provides a basis for monitoring tool wear condition.
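EMD itself is not shown here (it is not in SciPy); assuming the IMFs are already available, the Hilbert marginal spectrum reduces to accumulating instantaneous amplitude over instantaneous frequency, roughly as in this sketch with a synthetic two-tone signal standing in for cutting-force data.

    import numpy as np
    from scipy.signal import hilbert

    def marginal_spectrum(imfs, fs, n_bins=128):
        """Hilbert marginal spectrum: amplitude accumulated per frequency bin."""
        edges = np.linspace(0, fs / 2, n_bins + 1)
        spectrum = np.zeros(n_bins)
        for imf in imfs:
            analytic = hilbert(imf)
            amp = np.abs(analytic)
            phase = np.unwrap(np.angle(analytic))
            freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency
            # Accumulate each sample's amplitude into its frequency bin.
            idx = np.clip(np.digitize(freq, edges) - 1, 0, n_bins - 1)
            np.add.at(spectrum, idx, amp[1:])
        return edges[:-1], spectrum

    fs = 1000.0
    t = np.arange(0, 1, 1 / fs)
    imfs = [np.sin(2 * np.pi * 50 * t), 0.5 * np.sin(2 * np.pi * 180 * t)]
    freqs, spec = marginal_spectrum(imfs, fs)
    print(freqs[np.argsort(spec)[-2:]])    # bins near 50 Hz and 180 Hz dominate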
Community-driven computational biology with Debian Linux.
Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles
2010-12-21
The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.
NASA Technical Reports Server (NTRS)
ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.
2005-01-01
The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.
Laser production of articles from powders
Lewis, Gary K.; Milewski, John O.; Cremers, David A.; Nemec, Ronald B.; Barbe, Michael R.
1998-01-01
Method and apparatus for forming articles from materials in particulate form in which the materials are melted by a laser beam and deposited at points along a tool path to form an article of the desired shape and dimensions. Preferably the tool path and other parameters of the deposition process are established using computer-aided design and manufacturing techniques. A controller comprised of a digital computer directs movement of a deposition zone along the tool path and provides control signals to adjust apparatus functions, such as the speed at which a deposition head which delivers the laser beam and powder to the deposition zone moves along the tool path.
Laser production of articles from powders
Lewis, G.K.; Milewski, J.O.; Cremers, D.A.; Nemec, R.B.; Barbe, M.R.
1998-11-17
Method and apparatus for forming articles from materials in particulate form in which the materials are melted by a laser beam and deposited at points along a tool path to form an article of the desired shape and dimensions. Preferably the tool path and other parameters of the deposition process are established using computer-aided design and manufacturing techniques. A controller comprised of a digital computer directs movement of a deposition zone along the tool path and provides control signals to adjust apparatus functions, such as the speed at which a deposition head which delivers the laser beam and powder to the deposition zone moves along the tool path. 20 figs.
Buelow, Janice; Miller, Wendy; Fishman, Jesse
2018-01-01
Background: Nurses have become increasingly involved in overseeing the management of patients with complex medical conditions, including those with epilepsy. Nurses who are not specialists in epilepsy can play a central role in providing optimal care, education, and support to their patients with epilepsy, given the proper tools. Objective: Our objective was to create a tool that can be used by nurses in the clinic setting to help facilitate discussion of topics relevant to enhancing medical care and management of patients with epilepsy. To address this need, a panel of epilepsy nursing experts used a patient-centered care approach to develop an Epilepsy Nursing Communication Tool (ENCT). Methods: An initial set of topics and questions was created based on findings from a literature review. Eight nurse experts reviewed and revised the ENCT using focus groups and discussion forums. The revised ENCT was provided to nurses who care for patients with epilepsy but had not been involved in ENCT development. Nurses were asked to rate the usability and feasibility on a 5-point scale to assess whether the tool captured important topics and was easy to use. Results: Ten nurses provided usability and feasibility assessments. Results indicated strong tool utility, with median scores of 4.5, 4, and 4 for usefulness, ease of use, and acceptability, respectively. Conclusions: The preliminary ENCT shows promise in providing a tool that nurses can use in their interactions with patients with epilepsy to help address the complexity of disease management, which may help improve overall patient care. PMID:29505437
NESSUS/EXPERT - An expert system for probabilistic structural analysis methods
NASA Technical Reports Server (NTRS)
Millwater, H.; Palmer, K.; Fink, P.
1988-01-01
An expert system (NESSUS/EXPERT) is presented which provides assistance in using probabilistic structural analysis methods. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator. NESSUS/EXPERT was developed with a combination of FORTRAN and CLIPS, a C language expert system tool, to exploit the strengths of each language.
Comparison of three methods for evaluation of work postures in a truck assembly plant.
Zare, Mohsen; Biau, Sophie; Brunet, Rene; Roquelaure, Yves
2017-11-01
This study compared the results of three risk assessment tools (self-reported questionnaire, observational tool, direct measurement method) for the upper limbs and back in a truck assembly plant at two cycle times (11 and 8 min). The weighted Kappa factor showed fair agreement between the observational and direct measurement methods for the arm (0.39) and back (0.47). The weighted Kappa factor for these methods was poor for the neck (0) and wrist (0), but the observed proportional agreement (Po) was 0.78 for the neck and 0.83 for the wrist. The weighted Kappa factor between the questionnaire and direct measurement showed poor or slight agreement (0) for the different body segments in both cycle times. The results revealed moderate agreement between the observational tool and the direct measurement method, and poor agreement between the self-reported questionnaire and direct measurement. Practitioner Summary: This study provides risk exposure measurement by different common ergonomic methods in the field. The results help to develop valid measurements and improve exposure evaluation. Hence, ergonomists/practitioners should apply the methods with caution, or at least know what the issues/errors are.
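For readers unfamiliar with the statistics quoted, a weighted Kappa and the observed proportional agreement for two graded risk ratings can be computed as below; the scores are toy data, not the study's measurements.

    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical risk scores (1=low, 2=medium, 3=high) for 12 work cycles,
    # from an observational tool and a direct measurement method.
    observational = np.array([1, 2, 2, 3, 1, 2, 3, 3, 2, 1, 2, 3])
    direct        = np.array([1, 2, 3, 3, 1, 2, 3, 2, 2, 1, 1, 3])

    kappa = cohen_kappa_score(observational, direct, weights="linear")
    p_o = np.mean(observational == direct)   # observed proportional agreement
    print(f"weighted kappa = {kappa:.2f}, Po = {p_o:.2f}")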
Fu, Ling-Lin; Li, Jian-Rong
2014-01-01
The ability to trace fecal indicators and food-borne pathogens to the point of origin has major ramifications for food industry, food regulatory agencies, and public health. Such information would enable food producers and processors to better understand sources of contamination and thereby take corrective actions to prevent transmission. Microbial source tracking (MST), which currently is largely focused on determining sources of fecal contamination in waterways, is also providing the scientific community tools for tracking both fecal bacteria and food-borne pathogens contamination in the food chain. Approaches to MST are commonly classified as library-dependent methods (LDMs) or library-independent methods (LIMs). These tools will have widespread applications, including the use for regulatory compliance, pollution remediation, and risk assessment. These tools will reduce the incidence of illness associated with food and water. Our aim in this review is to highlight the use of molecular MST methods in application to understanding the source and transmission of food-borne pathogens. Moreover, the future directions of MST research are also discussed.
Clinical guideline representation in a CDS: a human information processing method.
Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique
2012-01-01
The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared not to suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as the requirements elicitation method. An information processing model was developed through an analysis of think-aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the clinicians' mental strategies employed in deciding on survivor screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns that supported the design of a highly usable CDS system.
Bhalla, Kavi; Harrison, James E
2016-04-01
Burden of disease and injury methods can be used to summarise and compare the effects of conditions in terms of disability-adjusted life years (DALYs). Burden estimation methods are not inherently complex. However, as commonly implemented, the methods include complex modelling and estimation. To provide a simple and open-source software tool that allows estimation of incidence-DALYs due to injury, given data on incidence of deaths and non-fatal injuries. The tool includes a default set of estimation parameters, which can be replaced by users. The tool was written in Microsoft Excel. All calculations and values can be seen and altered by users. The parameter sets currently used in the tool are based on published sources. The tool is available without charge online at http://calculator.globalburdenofinjuries.org. To use the tool with the supplied parameter sets, users need to only paste a table of population and injury case data organised by age, sex and external cause of injury into a specified location in the tool. Estimated DALYs can be read or copied from tables and figures in another part of the tool. In some contexts, a simple and user-modifiable burden calculator may be preferable to undertaking a more complex study to estimate the burden of disease. The tool and the parameter sets required for its use can be improved by user innovation, by studies comparing DALYs estimates calculated in this way and in other ways, and by shared experience of its use. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
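The Excel tool itself is linked in the abstract; purely to convey the arithmetic, the textbook incidence-DALY decomposition (undiscounted, without age weighting, which the tool's parameter sets may refine) looks like this:

    def dalys(deaths, years_lost_per_death, cases, disability_weight, duration_years):
        """DALYs = YLL + YLD (simple undiscounted form).

        YLL = deaths x standard life expectancy remaining at age of death
        YLD = incident cases x disability weight x average duration
        """
        yll = deaths * years_lost_per_death
        yld = cases * disability_weight * duration_years
        return yll + yld

    # Hypothetical road-injury stratum: 20 deaths (avg 45 life-years lost each),
    # 800 nonfatal cases with disability weight 0.2 lasting 0.5 years on average.
    print(dalys(20, 45, 800, 0.2, 0.5))   # 900 + 80 = 980 DALYs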
Characterizing Task-Based OpenMP Programs
Muddukrishna, Ananya; Jonsson, Peter A.; Brorsson, Mats
2015-01-01
Programmers struggle to understand performance of task-based OpenMP programs since profiling tools only report thread-based performance. Performance tuning also requires task-based performance in order to balance per-task memory hierarchy utilization against exposed task parallelism. We provide a cost-effective method to extract detailed task-based performance information from OpenMP programs. We demonstrate the utility of our method by quickly diagnosing performance problems and characterizing exposed task parallelism and per-task instruction profiles of benchmarks in the widely-used Barcelona OpenMP Tasks Suite. Programmers can tune performance faster and understand performance tradeoffs more effectively than existing tools by using our method to characterize task-based performance. PMID:25860023
A method of designing smartphone interface based on the extended user's mental model
NASA Astrophysics Data System (ADS)
Zhao, Wei; Li, Fengmin; Bian, Jiali; Pan, Juchen; Song, Song
2017-01-01
The user's mental model is a core guiding theory of product design, especially for practical products. The essence of a practical product is that it is a tool used to meet users' needs, and the most important feature of a tool is usability. The design method based on the user's mental model provides a series of practical and feasible theoretical guidelines for improving the usability of a product according to the user's awareness of things. In this paper, we propose a method of designing smartphone interfaces based on an extended user's mental model derived from further research on user groups. This approach achieves personalized customization of the smartphone application interface and enhances the efficiency of application use.
An integrated network visualization framework towards metabolic engineering applications.
Noronha, Alberto; Vilaça, Paulo; Rocha, Miguel
2014-12-30
Over the last years, several methods for the phenotype simulation of microorganisms under specified genetic and environmental conditions have been proposed in the context of Metabolic Engineering (ME). These methods have provided insight into the functioning of microbial metabolism and played a key role in the design of genetic modifications that can lead to strains of industrial interest. On the other hand, in the context of Systems Biology research, biological network visualization has reinforced its role as a core tool in understanding biological processes. However, it has been scarcely used to foster ME-related methods, in spite of the acknowledged potential. In this work, open-source software that aims to fill the gap between ME and metabolic network visualization is proposed, in the form of a plugin to the OptFlux ME platform. The framework is based on an abstract layer, where the network is represented as a bipartite graph containing minimal information about the underlying entities and their desired relative placement. The framework provides input/output support for networks specified in standard formats, such as XGMML, SBGN or SBML, providing a connection to genome-scale metabolic models. A user interface makes it possible to edit, manipulate and query nodes in the network, providing tools to visualize diverse effects, including visual filters and aspect changing (e.g. colors, shapes and sizes). These tools are particularly interesting for ME, since they allow overlaying phenotype simulation results or elementary flux modes over the networks. The framework and its source code are freely available, together with documentation and other resources, and are illustrated with well-documented case studies.
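The plugin's abstract layer is described only at a high level; its bipartite metabolite/reaction representation can be sketched with networkx as below, with two made-up glycolytic reactions standing in for a genome-scale model.

    import networkx as nx

    # Bipartite graph: one node set for metabolites, one for reactions.
    G = nx.DiGraph()
    metabolites = ["glucose", "g6p", "f6p"]
    reactions = ["HEX1", "PGI"]
    G.add_nodes_from(metabolites, bipartite="metabolite")
    G.add_nodes_from(reactions, bipartite="reaction")

    # Edges encode substrate -> reaction -> product relationships.
    G.add_edges_from([("glucose", "HEX1"), ("HEX1", "g6p"),
                      ("g6p", "PGI"), ("PGI", "f6p")])

    # Overlay a phenotype-simulation result as a visual attribute,
    # e.g. mapping a hypothetical flux value to an edge/node width.
    fluxes = {"HEX1": 7.5, "PGI": 7.2}
    for rxn, flux in fluxes.items():
        G.nodes[rxn]["line_width"] = 1 + flux / 2
    print(G.nodes["HEX1"])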
Designing Real-time Decision Support for Trauma Resuscitations
Yadav, Kabir; Chamberlain, James M.; Lewis, Vicki R.; Abts, Natalie; Chawla, Shawn; Hernandez, Angie; Johnson, Justin; Tuveson, Genevieve; Burd, Randall S.
2016-01-01
Background Use of electronic clinical decision support (eCDS) has been recommended to improve implementation of clinical decision rules. Many eCDS tools, however, are designed and implemented without taking into account the context in which clinical work is performed. Implementation of the pediatric traumatic brain injury (TBI) clinical decision rule at one Level I pediatric emergency department includes an electronic questionnaire triggered when ordering a head computed tomography using computerized physician order entry (CPOE). Providers use this CPOE tool in less than 20% of trauma resuscitation cases. A human factors engineering approach could identify the implementation barriers that are limiting the use of this tool. Objectives The objective was to design a pediatric TBI eCDS tool for trauma resuscitation using a human factors approach. The hypothesis was that clinical experts will rate a usability-enhanced eCDS tool better than the existing CPOE tool for user interface design and suitability for clinical use. Methods This mixed-methods study followed usability evaluation principles. Pediatric emergency physicians were surveyed to identify barriers to using the existing eCDS tool. Using standard trauma resuscitation protocols, a hierarchical task analysis of pediatric TBI evaluation was developed. Five clinical experts, all board-certified pediatric emergency medicine faculty members, then iteratively modified the hierarchical task analysis until reaching consensus. The software team developed a prototype eCDS display using the hierarchical task analysis. Three human factors engineers provided feedback on the prototype through a heuristic evaluation, and the software team refined the eCDS tool using a rapid prototyping process. The eCDS tool then underwent iterative usability evaluations by the five clinical experts using video review of 50 trauma resuscitation cases. A final eCDS tool was created based on their feedback, with content analysis of the evaluations performed to ensure all concerns were identified and addressed. Results Among 26 EPs (76% response rate), the main barriers to using the existing tool were that the information displayed is redundant and does not fit clinical workflow. After the prototype eCDS tool was developed based on the trauma resuscitation hierarchical task analysis, the human factors engineers rated it to be better than the CPOE tool for nine of 10 standard user interface design heuristics on a three-point scale. The eCDS tool was also rated better for clinical use on the same scale, in 84% of 50 expert–video pairs, and was rated equivalent in the remainder. Clinical experts also rated barriers to use of the eCDS tool as being low. Conclusions An eCDS tool for diagnostic imaging designed using human factors engineering methods has improved perceived usability among pediatric emergency physicians. PMID:26300010
Approaches, tools and methods used for setting priorities in health research in the 21st century
Yoshida, Sachiyo
2016-01-01
Background Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. Methods To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001–2014. Results A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (<1%). About 3% of studies reported no clear process and provided very little information on how priorities were set. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. Conclusion The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure, such as the CHNRI method, James Lind Alliance method and Combined Approach Matrix, it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more transparency and replicability. It is too early to say whether any single method can address the needs of most exercises conducted at different levels, or if better results may perhaps be achieved through combination of components of several methods. PMID:26401271
Personal Constructions of Biological Concepts--The Repertory Grid Approach
ERIC Educational Resources Information Center
McCloughlin, Thomas J. J.; Matthews, Philip S. C.
2017-01-01
This work discusses repertory grid analysis as a tool for investigating the structures of students' representations of biological concepts. Repertory grid analysis provides the researcher with a variety of techniques that are not associated with standard methods of concept mapping for investigating conceptual structures. It can provide valuable…
Using "Relationship Marketing" Theory To Develop a Training Model for Admissions Recruiters.
ERIC Educational Resources Information Center
Gyure, James F.; Arnold, Susan G.
2001-01-01
Addresses a critical aspect of enrollment management by providing a "conceptual training outline" based on relationship marketing and management principles for admissions recruiters and other appropriate enrollment staff. Provides a set of "Attitude Tools" to suggest how various training methods might benefit from a consistent…
Strategies for Teaching Fractions: Using Error Analysis for Intervention and Assessment
ERIC Educational Resources Information Center
Spangler, David B.
2011-01-01
Many students struggle with fractions and must understand them before learning higher-level math. Veteran educator David B. Spangler provides research-based tools that are aligned with NCTM and Common Core State Standards. He outlines powerful diagnostic methods for analyzing student work and providing timely, specific, and meaningful…
DOT National Transportation Integrated Search
1998-10-01
The overall objectives of this study were (1) to provide basic performance evaluation of asphalt overlays on rigid pavements and (2) to provide a design tool for supporting a long-range rehabilitation plan for the US 59 corridor in the Lufkin Distr...
Project Pride Evaluation Report.
ERIC Educational Resources Information Center
Jennewein, Marilyn; And Others
Project PRIDE (Probe, Research, Inquire, Discover, and Evaluate) is evaluated in this report to provide data to be used as a learning tool for project staff and student participants. Major objectives of the project are to provide an inter-disciplinary, objective approach to the study of the American heritage, and to incorporate methods and…
Prioritizing Contaminants for Monitoring and Management
EPA researcher presents work to develop methods and tools that integrate chemical monitoring with pathway-based bioactivity measurements, which will help provide screening-level assessments useful to identify and prioritize emerging contaminants.
The ratio method: A new tool to study one-neutron halo nuclei
Capel, Pierre; Johnson, R. C.; Nunes, F. M.
2013-10-02
Recently a new observable to study halo nuclei was introduced, based on the ratio between breakup and elastic angular cross sections. Analysis of specific reactions shows this new observable to be independent of the reaction mechanism and to provide nuclear-structure information about the projectile. Here we explore the details of this ratio method, including its sensitivity to the binding energy and angular momentum of the projectile. We also study the reliability of the method as a function of breakup energy. Lastly, we provide guidelines and specific examples for experimentalists who wish to apply this method.
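Schematically, the observable is the ratio of the breakup and elastic angular distributions; the exact normalization used in the ratio method (for instance, a summed cross section in the denominator) follows the authors' definition, so the form below is only an illustrative sketch.

```latex
% Illustrative sketch of the ratio observable (exact normalization as
% defined in the original papers):
\[
  R(E,\theta) \;=\;
  \frac{d\sigma_{\mathrm{bu}}/d\Omega\,(E,\theta)}
       {d\sigma_{\mathrm{el}}/d\Omega\,(\theta)}
\]
% For a one-neutron halo projectile, R is predicted to be nearly independent
% of the reaction mechanism, so its shape reflects the halo neutron's binding
% energy and orbital angular momentum.
```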
Fan Database and Web-tool for Choosing Quieter Spaceflight Fans
NASA Technical Reports Server (NTRS)
Allen, Christopher S.; Burnside, Nathan J.
2007-01-01
One critical aspect of designing spaceflight hardware is the selection of fans to provide the necessary cooling. With efforts to minimize cost and the tendency to be conservative with the amount of cooling provided, it is easy to choose an overpowered fan. One impact of this is that the fan uses more energy than is necessary. But the more significant impact is that the hardware produces much more acoustic noise than if an optimal fan were chosen. Choosing the right fan for a specific hardware application is no simple task. It requires knowledge of cooling requirements and various fan performance characteristics, as well as knowledge of the aerodynamic losses of the hardware in which the fan is to be installed. Knowledge of the acoustic emissions of each fan as a function of operating condition is also required in order to choose a quieter fan for a given design point. The purpose of this paper is to describe a database and design-tool that have been developed to aid spaceflight hardware developers in choosing a fan for their application based on aerodynamic performance and reduced acoustic emissions. This web-based tool provides a limited amount of fan data, provides a method for selecting a fan based on its projected operating point, and also provides a method for comparing and contrasting aerodynamic performance and acoustic data from different fans. Drill-down techniques are used to display details of the spectral noise characteristics of the fan at specific operating conditions. The fan aerodynamic and acoustic data were acquired at Ames Research Center in the Experimental Aero-Physics Branch's Anechoic Chamber. Acoustic data were acquired according to ANSI Standard S12.11-1987, "Method for the Measurement of Noise Emitted by Small Air-Moving Devices." One significant improvement made to this technique included automation that allows for a substantial increase in flow-rate resolution. The web-tool was developed at Johnson Space Center and is based on the web-development application SEQUEL, which includes graphics and drill-down capabilities. This paper describes the type and amount of data taken for the fans and gives examples of this data. This paper also describes the design-tool and gives examples of how it can be used to choose quieter fans for use in spaceflight hardware.
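To make the selection logic concrete, here is a minimal sketch of the kind of design-point filtering such a web-tool might perform; the fan records, field names, and numbers are hypothetical stand-ins, not the actual NASA database schema.

```python
# Minimal sketch of fan selection by operating point and noise
# (hypothetical records and field names; not the actual NASA database).
from dataclasses import dataclass

@dataclass
class FanRecord:
    name: str
    flow_cfm: float          # delivered flow at the rated operating point
    pressure_pa: float       # static pressure rise at that flow
    sound_power_db: float    # A-weighted sound power at that point
    power_w: float           # electrical input power

def pick_quietest_fan(fans, required_flow_cfm, required_pressure_pa):
    """Return the quietest fan that meets or exceeds the design point."""
    candidates = [f for f in fans
                  if f.flow_cfm >= required_flow_cfm
                  and f.pressure_pa >= required_pressure_pa]
    if not candidates:
        raise ValueError("no fan meets the operating point")
    return min(candidates, key=lambda f: f.sound_power_db)

fans = [
    FanRecord("A", 30.0, 120.0, 52.0, 6.0),
    FanRecord("B", 25.0, 100.0, 44.0, 4.5),
    FanRecord("C", 28.0, 110.0, 47.0, 5.0),
]
print(pick_quietest_fan(fans, 24.0, 95.0).name)  # -> "B"
```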
Twelve tips for getting started using mixed methods in medical education research.
Lavelle, Ellen; Vuk, Jasna; Barber, Carolyn
2013-04-01
Mixed methods research, which is gaining popularity in medical education, provides a new and comprehensive approach for addressing teaching, learning, and evaluation issues in the field. The aim of this article is to provide medical education researchers with 12 tips, based on consideration of current literature in the health professions and in educational research, for conducting and disseminating mixed methods research. Engaging in mixed methods research requires consideration of several major components: the mixed methods paradigm, types of problems, mixed method designs, collaboration, and developing or extending theory. Mixed methods is an ideal tool for addressing a full range of problems in medical education to include development of theory and improving practice.
Method for grinding precision components
Ramanath, Srinivasan; Kuo, Shih Yee; Williston, William H.; Buljan, Sergej-Tomislav
2000-01-01
A method for precision cylindrical grinding of hard brittle materials, such as ceramics or glass and composites comprising ceramics or glass, provides material removal rates as high as 19–380 cm³/min/cm. The abrasive tools used in the method comprise a strong, lightweight wheel core bonded to a continuous rim of abrasive segments containing superabrasive grain in a dense metal bond matrix.
Pao's Selection Method for Quality Papers and the Subsequent Use of Medical Literature
ERIC Educational Resources Information Center
Boyce, Bert; Primov, Karen
1977-01-01
Pao's "quality filter" selection method is re-examined as to its effectiveness in selecting papers that not only are of use to medical educators but to researchers as well. It is concluded that the method does provide the librarian with a tool for forming a highly selective bibliography in a particular medical literature without need for…
Reproducing the internal and external anatomy of fossil bones: Two new automatic digital tools.
Profico, Antonio; Schlager, Stefan; Valoriani, Veronica; Buzi, Costantino; Melchionna, Marina; Veneziano, Alessio; Raia, Pasquale; Moggi-Cecchi, Jacopo; Manzi, Giorgio
2018-04-21
We present two new automatic tools, developed under the R environment, to reproduce the internal and external structures of bony elements. The first method, Computer-Aided Laser Scanner Emulator (CA-LSE), provides the reconstruction of the external portions of a 3D mesh by simulating the action of a laser scanner. The second method, Automatic Segmentation Tool for 3D objects (AST-3D), performs the digital reconstruction of anatomical cavities. We present the application of the CA-LSE and AST-3D methods to different anatomical remains, highly variable in terms of shape, size and structure: a modern human skull, a malleus bone, and a Neanderthal deciduous tooth. Both methods are developed in the R environment and embedded in the packages "Arothron" and "Morpho," where both the code and the data are fully available. The application of CA-LSE and AST-3D allows the isolation and manipulation of the internal and external components of the 3D virtual representation of complex bony elements. In particular, we present the output of the four case studies: a complete modern human endocast and the right maxillary sinus, the dental pulp of the Neanderthal tooth, and the inner network of blood vessels of the malleus. Both methods proved to be much faster, cheaper, and more accurate than other conventional approaches. The tools we present are available as add-ons to existing software within the R platform. Because of their ease of application and the unrestricted availability of the proposed methods, these tools can be widely used by paleoanthropologists, paleontologists and anatomists.
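As a rough illustration of the CA-LSE idea (not the Arothron/R implementation), one can mark mesh vertices as "external" when they are the first geometry hit by rays cast from simulated scanner positions; the sketch below uses the Python trimesh library and a toy icosphere as the specimen.

```python
# Toy re-creation of the CA-LSE idea (not the Arothron/R implementation):
# keep only mesh vertices visible from a sphere of simulated scanner
# positions, discarding internal structure. Requires: pip install trimesh rtree
import numpy as np
import trimesh

mesh = trimesh.creation.icosphere(subdivisions=3)      # stand-in specimen
scanners = trimesh.creation.icosphere(subdivisions=1).vertices * 3.0

visible = np.zeros(len(mesh.vertices), dtype=bool)
for cam in scanners:
    directions = mesh.vertices - cam                    # one ray per vertex
    hits = mesh.ray.intersects_first(
        ray_origins=np.tile(cam, (len(mesh.vertices), 1)),
        ray_directions=directions)
    for v, tri in enumerate(hits):
        # a vertex counts as visible if the first triangle hit contains it
        if tri != -1 and v in mesh.faces[tri]:
            visible[v] = True

external = mesh.vertices[visible]                       # the "scanned" surface
print(f"{visible.sum()} of {len(visible)} vertices visible")
```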
Multispectral analysis tools can increase utility of RGB color images in histology
NASA Astrophysics Data System (ADS)
Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard
2018-04-01
Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools are demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV-surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
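As a concrete sketch of the spectral phasor idea applied to a 3-band image, the first-harmonic phasor coordinates can be computed per pixel as normalized cosine and sine projections across the channel axis; the code below is a minimal numpy illustration, not the authors' implementation.

```python
# Minimal sketch: first-harmonic spectral phasor coordinates for an image
# cube of shape (H, W, N); with N = 3 the channels are simply R, G, B.
import numpy as np

def spectral_phasor(cube):
    """Return (G, S) phasor coordinate images for a (H, W, N) cube."""
    n_ch = cube.shape[-1]
    n = np.arange(n_ch)
    total = cube.sum(axis=-1) + 1e-12          # avoid divide-by-zero
    g = (cube * np.cos(2 * np.pi * n / n_ch)).sum(axis=-1) / total
    s = (cube * np.sin(2 * np.pi * n / n_ch)).sum(axis=-1) / total
    return g, s

rgb = np.random.rand(64, 64, 3)                # stand-in RGB image
G, S = spectral_phasor(rgb)                    # each pixel maps to a 2D point
# Pixels with similar spectra cluster together in the (G, S) plane, which is
# what makes the phasor plot useful for exploration and segmentation.
```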
Self-actuating and self-diagnosing plastically deforming piezo-composite flapping wing MAV
NASA Astrophysics Data System (ADS)
Harish, Ajay B.; Harursampath, Dineshkumar; Mahapatra, D. Roy
2011-04-01
In this work, we propose a constitutive model to describe the behavior of Piezoelectric Fiber Reinforced Composite (PFRC) material consisting of an elasto-plastic matrix reinforced by strong elastic piezoelectric fibers. Computational efficiency is achieved using analytical solutions for the elastic stiffness matrix derived from the Variational Asymptotic Method (VAM). This is extended to provide Structural Health Monitoring (SHM) based on plasticity-induced degradation of the flapping frequency of the PFRC. Overall, this work provides an effective mathematical tool that can be used for structural self-health monitoring of plasticity-induced flapping degradation of PFRC flapping wing MAVs. The developed tool can be re-calibrated to also provide SHM for other forms of failure such as fatigue, matrix cracking, etc.
MEvoLib v1.0: the first molecular evolution library for Python.
Álvarez-Jarreta, Jorge; Ruiz-Pesini, Eduardo
2016-10-28
Molecular evolution studies involve many hard computational problems that are solved, in most cases, with heuristic algorithms providing a nearly optimal solution. Hence, diverse software tools exist for the different stages involved in a molecular evolution workflow. We present MEvoLib, the first molecular evolution library for Python, providing a framework to work with different tools and methods involved in the common tasks of molecular evolution workflows. In contrast with existing bioinformatics libraries, MEvoLib is focused on the stages involved in molecular evolution studies, enclosing the set of tools with a common purpose in a single high-level interface with fast access to their frequent parameterizations. The gene clustering from partial or complete sequences has been improved with a new method that integrates accessible external information (e.g. GenBank's features data). Moreover, MEvoLib adjusts the fetching process from NCBI databases to optimize the download bandwidth usage. In addition, it has been implemented using parallelization techniques to cope with even large-scale scenarios. MEvoLib is the first library for Python designed to facilitate molecular evolution research for both expert and novice users. Its unique interface for each common task comprises several tools with their most used parameterizations. It also includes a method that takes advantage of biological knowledge to improve the gene partition of sequence datasets. Additionally, its implementation incorporates parallelization techniques to reduce computational cost when handling very large input datasets.
Roles and methods of performance evaluation of hospital academic leadership.
Zhou, Ying; Yuan, Huikang; Li, Yang; Zhao, Xia; Yi, Lihua
2016-01-01
The rapidly advancing implementation of public hospital reform urgently requires the identification and classification of a pool of exceptional medical specialists, corresponding with incentives to attract and retain them, providing a nucleus of distinguished expertise to ensure public hospital preeminence. This paper examines the significance of academic leadership, from a strategic management perspective, including various tools, methods and mechanisms used in the theory and practice of performance evaluation, and employed in the selection, training and appointment of academic leaders. Objective methods of assessing leadership performance are also provided for reference.
Setting the standard, implementation and auditing within haemodialysis.
Jones, J
1997-01-01
With an ever increasing awareness of the need to deliver a quality of care that is measurable in Nursing, the concept of Standards provides an ideal tool (1). Standards operate outside the boundaries of policies and procedures to provide an audit tool of authenticity and flexibility. Within our five Renal Units, while we felt confident that we were delivering an excellent standard of care to our patients and continually trying to improve upon it, what we really needed was a method of measuring this current level of care and highlighting key areas where we could offer improvement.
Digitizing the Facebow: A Clinician/Technician Communication Tool.
Kalman, Les; Chrapka, Julia; Joseph, Yasmin
2016-01-01
Communication between the clinician and the technician has been an ongoing problem in dentistry. To improve the issue, a dental software application has been developed--the Virtual Facebow App. It is an alternative to the traditional analog facebow, used to orient the maxillary cast in mounting. Comparison data of the two methods indicated that the digitized virtual facebow provided increased efficiency in mounting, increased accuracy in occlusion, and lower cost. Occlusal accuracy, lab time, and total time were statistically significant (P<.05). The virtual facebow provides a novel alternative for cast mounting and another tool for clinician-technician communication.
Tools, Techniques, and Applications: Normalizing the VR Paradigm
NASA Technical Reports Server (NTRS)
Duncan, Gaeme
2008-01-01
Oshynee's precision Learning Objective performance factor rubrics with associated behavioral anchors integrate with Thinking Worlds™ to provide event data recording and dynamic prescriptive feedback. Thinking Worlds™ provides SCORM parametric data for reporting within the game and within an overarching curriculum or workplace evaluation strategy. Open-sourced, browser-based digital dashboard reporting tools collect data from TW, LMS, LCMS, HR, and workplace metrics or control systems. The games may be delivered across the internet or by a range of networked and stand-alone methods, using the delivery model(s) required by the host organization.
Solernou, Albert; Hanson, Benjamin S; Richardson, Robin A; Welch, Robert; Read, Daniel J; Harlen, Oliver G; Harris, Sarah A
2018-03-01
Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.
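The following toy sketch (not FFEA itself) illustrates the core combination the package is built on: a finite-element-style stiffness matrix driven by overdamped dynamics with thermal noise scaled to satisfy the fluctuation-dissipation relation; all parameters are arbitrary.

```python
# Toy 1D analog of the FFEA idea (not the FFEA package): overdamped dynamics
# of an anchored elastic chain with fluctuation-dissipation-consistent noise.
import numpy as np

n, k, gamma, kT, dt, steps = 20, 1.0, 1.0, 0.1, 0.01, 200_000
rng = np.random.default_rng(0)

# Assemble a finite-element-style stiffness matrix for the chain.
K = np.zeros((n, n))
for i in range(n - 1):
    K[i, i] += k; K[i + 1, i + 1] += k
    K[i, i + 1] -= k; K[i + 1, i] -= k
K[0, 0] += k                                   # anchor the first node

u = np.zeros(n)                                # nodal displacements
energy = []
for step in range(steps):
    noise = rng.normal(size=n) * np.sqrt(2.0 * kT * gamma / dt)
    u += dt / gamma * (-K @ u + noise)         # Euler-Maruyama step
    if step > steps // 2:                      # sample after burn-in
        energy.append(0.5 * u @ K @ u)

# Equipartition check: mean elastic energy should approach (n/2) * kT
# once the slow modes have equilibrated over a long run.
print(np.mean(energy), 0.5 * n * kT)
```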
Tools to support evidence-informed public health decision making
2014-01-01
Background Public health professionals are increasingly expected to engage in evidence-informed decision making to inform practice and policy decisions. Evidence-informed decision making involves the use of research evidence along with expertise, existing public health resources, knowledge about community health issues, the local context and community, and the political climate. The National Collaborating Centre for Methods and Tools has identified a seven step process for evidence-informed decision making. Tools have been developed to support public health professionals as they work through each of these steps. This paper provides an overview of tools used in three Canadian public health departments involved in a study to develop capacity for evidence-informed decision making. Methods As part of a knowledge translation and exchange intervention, a Knowledge Broker worked with public health professionals to identify and apply tools for use with each of the steps of evidence-informed decision making. The Knowledge Broker maintained a reflective journal and interviews were conducted with a purposive sample of decision makers and public health professionals. This paper presents qualitative analysis of the perceived usefulness and usability of the tools. Results Tools were used in the health departments to assist in: question identification and clarification; searching for the best available research evidence; assessing the research evidence for quality through critical appraisal; deciphering the ‘actionable message(s)’ from the research evidence; tailoring messages to the local context to ensure their relevance and suitability; deciding whether and planning how to implement research evidence in the local context; and evaluating the effectiveness of implementation efforts. Decision makers provided descriptions of how the tools were used within the health departments and made suggestions for improvement. Overall, the tools were perceived as valuable for advancing and sustaining evidence-informed decision making. Conclusion Tools are available to support the process of evidence-informed decision making among public health professionals. The usability and usefulness of these tools for advancing and sustaining evidence-informed decision making are discussed, including recommendations for the tools’ application in other public health settings beyond this study. Knowledge and awareness of these tools may assist other health professionals in their efforts to implement evidence-informed practice. PMID:25034534
An advanced probabilistic structural analysis method for implicit performance functions
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.
1989-01-01
In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
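A minimal numerical illustration of the AMV idea, for independent normal inputs and a hypothetical performance function: linearize g at the mean to locate the most probable point for each probability level, then re-evaluate the true g there to correct the linear (mean value) CDF estimate.

```python
# Minimal illustration of the advanced mean value (AMV) idea for independent
# normal inputs and a hypothetical performance function g (not from the paper).
import numpy as np
from scipy.stats import norm

mu = np.array([10.0, 5.0])
sigma = np.array([1.0, 0.8])

def g(x):                                     # hypothetical nonlinear response
    return x[..., 0] ** 2 / (1.0 + x[..., 1])

# Gradient at the mean (finite differences), mapped to standard normal space.
eps = 1e-6
grad = np.array([(g(mu + eps * np.eye(2)[i]) - g(mu)) / eps for i in range(2)])
a = grad * sigma
alpha = a / np.linalg.norm(a)

zs = np.linspace(-3.0, 3.0, 7)
g_mv = g(mu) + zs * np.linalg.norm(a)         # mean value (linearized) quantiles
g_amv = np.array([g(mu + sigma * (z * alpha)) for z in zs])  # re-evaluate at MPP

# Monte Carlo reference quantiles for comparison.
x = np.random.default_rng(1).normal(mu, sigma, size=(200_000, 2))
samples = g(x)
for z, gm, ga in zip(zs, g_mv, g_amv):
    p = norm.cdf(z)
    print(f"P={p:.4f}  MV={gm:8.2f}  AMV={ga:8.2f}  "
          f"MC={np.quantile(samples, p):8.2f}")
```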
Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P
2011-05-19
There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). The characterisation is presented in matrix forms to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce let alone money and time. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection.
Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika
2017-01-01
Computer work is associated with musculoskeletal disorders (MSDs). Several methods have been developed to assess computer work risk factors related to MSDs. This review aims to give an overview of current pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015. The selected methods focused on computer work, pen-and-paper observational methods, office risk factors and musculoskeletal disorders. This review assessed the risk factors, reliability and validity of pen-and-paper observational methods associated with computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were posture, office components, force and repetition. Of the seven methods, only five had been tested for reliability; they were proven reliable and were rated as moderate to good. For validity, only four of the seven methods were tested, and the results were moderate. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition and office environment, at office workstations and computer work. Although the most important factor in developing a tool is proper validation of exposure assessment techniques, several existing observational methods have not been tested for reliability and validity. Furthermore, this review could provide researchers with ways to improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.
Cutting force measurement of electrical jigsaw by strain gauges
NASA Astrophysics Data System (ADS)
Kazup, L.; Varadine Szarka, A.
2016-11-01
This paper describes a measuring method based on strain gauges for accurate determination of an electric jigsaw's cutting force. The goal of the measurement is to provide an overall perspective of the forces generated in a jigsaw's gearbox during a cutting period, since the lifetime of the tool is primarily affected by these forces. This analysis is part of a research and development project aiming to develop a special linear magnetic brake for realizing automatic lifetime tests of electric jigsaws and similar handheld tools. Accurate determination of the cutting force makes it possible to define realistic test cycles during the automatic lifetime test. The accuracy and precision afforded by a well-described cutting-force characteristic, together with the possibility of automation, provide a new dimension for lifetime testing of handheld tools with alternating movement.
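For reference, a hedged sketch of the signal chain from bridge voltage to force: the strain formula is the standard quarter-bridge relation, while the final force conversion assumes a purely axial sensing member, which simplifies a real gearbox installation.

```python
# Hedged sketch: converting a quarter-bridge strain-gauge reading to force.
# The bridge formula is the standard quarter-bridge relation; the force
# conversion assumes a purely axial sensing member (a simplification of a
# real jigsaw gearbox measurement).
def quarter_bridge_strain(v_out, v_ex, gauge_factor=2.0, v_out_zero=0.0):
    """Strain from a quarter Wheatstone bridge (ratiometric reading)."""
    vr = (v_out - v_out_zero) / v_ex
    return -4.0 * vr / (gauge_factor * (1.0 + 2.0 * vr))

def axial_force(strain, youngs_modulus_pa, cross_section_m2):
    """F = E * A * strain for an axial member (assumed geometry)."""
    return youngs_modulus_pa * cross_section_m2 * strain

eps = quarter_bridge_strain(v_out=2.1e-3, v_ex=5.0)   # ~ -840 microstrain
force = axial_force(eps, 200e9, 25e-6)                # steel, 25 mm^2 section
print(f"strain = {eps:.2e}, force = {force:.1f} N")
```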
Fatigue Life Assessment of 65Si7 Leaf Springs: A Comparative Study
Arora, Vinkel Kumar; Bhushan, Gian; Aggarwal, M. L.
2014-01-01
The experimental fatigue life prediction of leaf springs is a time consuming process. Engineers working in the field of leaf springs face a constant challenge to formulate alternate methods of fatigue life assessment. The work presented in this paper provides alternate methods for fatigue life assessment of leaf springs. A 65Si7 light commercial vehicle leaf spring is chosen for this study. The experimental fatigue life and load rate are determined on a full scale leaf spring testing machine. Four alternate methods of fatigue life assessment are depicted. Firstly, using the SAE spring design manual approach, the fatigue test stroke is established and the fatigue life is predicted from the intersection of the maximum and initial stresses. The second method is a graphical method based on the modified Goodman criterion. In the third method, codes are written in FORTRAN for fatigue life assessment based on an analytical technique. The fourth method uses computer-aided engineering (CAE) tools. The CAD model of the leaf spring is prepared in SolidWorks and analyzed using ANSYS. Using CAE tools, ideal types of contact and meshing elements are proposed. The method which provides a fatigue life closest to the experimental value while consuming less time is suggested. PMID:27379327
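In the spirit of the paper's graphical (modified Goodman) and analytical methods, a minimal sketch of a stress-life estimate is shown below; the material constants are hypothetical placeholders rather than the 65Si7 values from the study.

```python
# Hedged sketch of a modified-Goodman fatigue estimate, in the spirit of the
# paper's graphical/analytical methods. Material constants are hypothetical
# placeholders, not the 65Si7 values from the study.
def equivalent_reversed_stress(sigma_max, sigma_min, s_ut):
    """Map a (mean, alternating) stress state to a fully reversed stress
    via the modified Goodman line: s_ar = s_a / (1 - s_m / S_ut)."""
    s_a = 0.5 * (sigma_max - sigma_min)
    s_m = 0.5 * (sigma_max + sigma_min)
    return s_a / (1.0 - s_m / s_ut)

def basquin_life(s_ar, sigma_f_prime, b):
    """Cycles to failure from Basquin's relation: s_ar = s_f' * (2N)^b."""
    return 0.5 * (s_ar / sigma_f_prime) ** (1.0 / b)

s_ar = equivalent_reversed_stress(sigma_max=900e6, sigma_min=100e6, s_ut=1500e6)
print(f"N = {basquin_life(s_ar, sigma_f_prime=1800e6, b=-0.09):.3e} cycles")
```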
Fizzy: feature subset selection for metagenomics.
Ditzler, Gregory; Morrison, J Calvin; Lan, Yemin; Rosen, Gail L
2015-11-04
Some of the current software tools for comparative metagenomics provide ecologists with the ability to investigate and explore bacterial communities using α- & β-diversity. Feature subset selection--a sub-field of machine learning--can also provide a unique insight into the differences between metagenomic or 16S phenotypes. In particular, feature subset selection methods can obtain the operational taxonomic units (OTUs), or functional features, that have a high level of influence on the condition being studied. For example, in a previous study we have used information-theoretic feature selection to understand the differences between protein family abundances that best discriminate between age groups in the human gut microbiome. We have developed a new Python command line tool, which is compatible with the widely adopted BIOM format, for microbial ecologists that implements information-theoretic subset selection methods for biological data formats. We demonstrate the software tool's capabilities on publicly available datasets. We have made the software implementation of Fizzy available to the public under the GNU GPL license. The standalone implementation can be found at http://github.com/EESI/Fizzy.
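Fizzy's own interface is documented at the repository above; purely to illustrate the underlying information-theoretic idea, the sketch below ranks the columns of a toy OTU table by mutual information with a binary phenotype using scikit-learn.

```python
# Not Fizzy itself: a scikit-learn sketch of the same information-theoretic
# idea -- rank OTU (feature) columns by mutual information with the phenotype.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
n_samples, n_otus = 60, 200
otu_table = rng.poisson(5, size=(n_samples, n_otus)).astype(float)
labels = rng.integers(0, 2, size=n_samples)        # two phenotypes
otu_table[labels == 1, :5] += 10                   # make 5 OTUs informative

mi = mutual_info_classif(otu_table, labels, random_state=0)
top = np.argsort(mi)[::-1][:10]
print("top OTUs by mutual information:", top)      # should include 0..4
```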
Clarity versus complexity: land-use modeling as a practical tool for decision-makers
Sohl, Terry L.; Claggett, Peter
2013-01-01
The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing the key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.
Implementation of a Distributed Object-Oriented Database Management System
1989-03-01
and heuristic algorithms. A method for determining unit allocation by splitting relations in the conceptual schema based on queries and updates is...level frameworks can provide to the user the appearance of many tools being closely integrated. In particular, the KBSA tools use many high-level...development process should begin first with conceptual design of the system. Approximately one month should be used to decide how the new projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarocki, John Charles; Zage, David John; Fisher, Andrew N.
LinkShop is a software tool for applying the method of Linkography to the analysis of time-sequence data. LinkShop provides command line, web, and application programming interfaces (APIs) for the input and processing of time-sequence data, abstraction models, and ontologies. The software creates graph representations of the abstraction model, ontology, and derived linkograph. Finally, the tool allows the user to perform statistical measurements on the linkograph and refine the ontology through direct manipulation of the linkograph.
Human factors issues in the design of user interfaces for planning and scheduling
NASA Technical Reports Server (NTRS)
Murphy, Elizabeth D.
1991-01-01
The purpose is to provide an overview of human factors issues that impact the effectiveness of user interfaces to automated scheduling tools. The following methods are employed: (1) a survey of planning and scheduling tools; (2) the identification and analysis of human factors issues; (3) the development of design guidelines based on human factors literature; and (4) the generation of display concepts to illustrate guidelines.
2016-02-01
proof in mathematics. For example, consider the proof of the Pythagorean Theorem illustrated at: http://www.cut-the-knot.org/pythagoras/ where 112...methods and tools have made significant progress in their ability to model software designs and prove correctness theorems about the systems modeled..."assumption criticality" or "theorem root set size" SITAPS detects potentially brittle verification cases. SITAPS provides tools and techniques that
The Hyperspectral Imager for the Coastal Ocean (HICO): Sensor and Data Processing Overview
2010-01-20
backscattering coefficients, and others. Several of these software modules will be developed within the Automated Processing System (APS), a data... NRL developed APS, which processes satellite data into ocean color data products. APS is a collection of methods...used for ocean color processing which provide the tools for the automated processing of satellite imagery [1]. These tools are in the process of
Evaluation of hand sensibility: a review.
Novak, C B
2001-01-01
Many assessment devices and measures have been described to evaluate sensibility, with little consensus on the optimal measurement tool. The purpose of this paper is to review the assessment methods and devices used in the evaluation of hand sensibility. Consideration is given to the characteristics of each measurement tool, the information necessary for complete patient evaluation, and the battery of valid and reliable measurements that provide the most complete and accurate patient assessment.
Innovations for the future of pharmacovigilance.
Almenoff, June S
2007-01-01
Post-marketing pharmacovigilance involves the review and management of safety information from many sources. Among these sources, spontaneous adverse event reporting systems are among the most challenging and resource-intensive to manage. Traditionally, efforts to monitor spontaneous adverse event reporting systems have focused on review of individual case reports. The science of pharmacovigilance could be enhanced with the availability of systems-based tools that facilitate analysis of aggregate data for purposes of signal detection, signal evaluation and knowledge management. GlaxoSmithKline (GSK) recently implemented Online Signal Management (OSM) as a data-driven framework for managing the pharmacovigilance of marketed products. This pioneering work builds upon the strong history GSK has of innovation in this area. OSM is a software application co-developed by GSK and Lincoln Technologies that integrates traditional pharmacovigilance methods with modern quantitative statistical methods and data visualisation tools. OSM enables the rapid identification of trends from the individual adverse event reports received by GSK. OSM also provides knowledge-management tools to ensure the successful tracking of emerging safety issues. GSK has developed standard procedures and 'best practices' around the use of OSM to ensure the systematic evaluation of complex safety datasets. In summary, the implementation of OSM provides new tools and efficient processes to advance the science of pharmacovigilance.
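OSM's internals are proprietary, but the "modern quantitative statistical methods" referred to are typically disproportionality statistics computed from aggregate report counts; the proportional reporting ratio (PRR) below is one widely used example.

```python
# The OSM internals are proprietary; the proportional reporting ratio (PRR)
# is simply one widely used disproportionality statistic of the kind the
# abstract alludes to, computed from a 2x2 table of report counts.
def prr(a, b, c, d):
    """PRR for a drug-event pair.
    a: reports with drug and event      b: drug, other events
    c: other drugs with event           d: other drugs, other events
    """
    return (a / (a + b)) / (c / (c + d))

# Toy counts: 40 reports of the event on the drug of interest vs background.
print(f"PRR = {prr(40, 960, 200, 49800):.2f}")   # ~10 -> potential signal
```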
Chemical Tool Peer Review Summary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cashion, Avery Ted; Cieslewski, Grzegorz
Chemical tracers are commonly used to characterize fracture networks and to determine the connectivity between the injection and production wells. Currently, most tracer experiments involve injecting the tracer at the injection well, manually collecting liquid samples at the wellhead of the production well, and sending the samples off for laboratory analysis. While this method provides accurate tracer concentration data, it does not provide information regarding the location of the fractures conducting the tracer between wellbores. The goal of this project is to develop chemical sensors and design a prototype tool to help understand the fracture properties of a geothermal reservoir by monitoring tracer concentrations along the depth of the well. The sensors will be able to detect certain species of the ionic tracers (mainly iodide) and pH in-situ during the tracer experiment. The proposed high-temperature (HT) tool will house the chemical sensors as well as a standard logging sensor package of pressure, temperature, and flow sensors in order to provide additional information on the state of the geothermal reservoir. The sensors and the tool will be able to survive extended deployments at temperatures up to 225 °C and high pressures to provide real-time temporal and spatial feedback of tracer concentration. Data collected from this tool will allow for the real-time identification of the fractures conducting chemical tracers between wellbores along with the pH of the reservoir fluid at various depths.
Regularization and computational methods for precise solution of perturbed orbit transfer problems
NASA Astrophysics Data System (ADS)
Woollands, Robyn Michele
The author has developed a suite of algorithms for solving the perturbed Lambert's problem in celestial mechanics. These algorithms have been implemented as a parallel computation tool that has broad applicability. This tool is composed of four component algorithms, each of which provides unique benefits for solving a particular type of orbit transfer problem. The first utilizes a Keplerian solver (a-iteration) for solving the unperturbed Lambert's problem. This algorithm not only provides a "warm start" for solving the perturbed problem but is also used to identify which of several perturbed solvers is best suited for the job. The second algorithm solves the perturbed Lambert's problem using a variant of the modified Chebyshev-Picard iteration initial value solver that solves two-point boundary value problems. This method converges over about one third of an orbit and does not require a Newton-type shooting method, so no state transition matrix needs to be computed. The third algorithm makes use of regularization of the differential equations through the Kustaanheimo-Stiefel transformation and extends the domain of convergence over which the modified Chebyshev-Picard iteration two-point boundary value solver will converge, from about one third of an orbit to almost a full orbit. This algorithm also does not require a Newton-type shooting method. The fourth algorithm uses the method of particular solutions and the modified Chebyshev-Picard iteration initial value solver to solve the perturbed two-impulse Lambert problem over multiple revolutions. The method of particular solutions is a shooting method but differs from Newton-type shooting methods in that it does not require integration of the state transition matrix. The mathematical developments that underlie these four algorithms are derived in the chapters of this dissertation. For each of the algorithms, some orbit transfer test cases are included to provide insight on the accuracy and efficiency of these individual algorithms. Following this discussion, the combined parallel algorithm, known as the unified Lambert tool, is presented, and an explanation is given as to how it automatically selects which of the three perturbed solvers to use to compute the perturbed solution for a particular orbit transfer. The unified Lambert tool may be used to determine a single orbit transfer or to generate an extremal field map. A case study is presented for a mission that is required to rendezvous with two pieces of orbit debris (spent rocket boosters). The unified Lambert tool software developed in this dissertation is already being utilized by several industrial partners, and we are confident that it will play a significant role in practical applications, including the solution of Lambert problems that arise in current applications focused on enhanced space situational awareness.
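The dissertation's solvers expand the solution in Chebyshev polynomials; purely to show the fixed-point idea behind Picard iteration, here is a bare-bones version on a uniform grid for a scalar test problem.

```python
# Bare-bones Picard iteration x_{k+1}(t) = x0 + integral_0^t f(s, x_k(s)) ds
# on a uniform grid -- a toy version of the fixed-point idea behind modified
# Chebyshev-Picard iteration (which uses Chebyshev nodes and polynomials).
import numpy as np

def cumtrapz(y, t):
    """Cumulative trapezoidal integral of y over t, starting at zero."""
    out = np.zeros_like(y)
    out[1:] = np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(t))
    return out

f = lambda t, x: -x                       # test problem, exact solution e^{-t}
t = np.linspace(0.0, 2.0, 201)
x = np.ones_like(t)                       # initial guess: constant x0 = 1
for k in range(50):
    x_new = 1.0 + cumtrapz(f(t, x), t)    # Picard update
    if np.max(np.abs(x_new - x)) < 1e-12:
        break
    x = x_new

print(f"iterations: {k}, max error: {np.max(np.abs(x - np.exp(-t))):.2e}")
```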
Design of a high altitude long endurance flying-wing solar-powered unmanned air vehicle
NASA Astrophysics Data System (ADS)
Alsahlani, A. A.; Johnston, L. J.; Atcliffe, P. A.
2017-06-01
The low-Reynolds-number environment of high-altitude flight places severe demands on the aerodynamic design and stability and control of a high altitude, long endurance (HALE) unmanned air vehicle (UAV). The aerodynamic efficiency of a flying-wing configuration makes it an attractive design option for such an application and is investigated in the present work. The proposed configuration has a high-aspect-ratio, swept-wing planform, the wing sweep being necessary to provide an adequate moment arm for outboard longitudinal and lateral control surfaces. A design optimization framework is developed under a MATLAB environment, combining aerodynamic, structural, and stability analysis. Low-order analysis tools are employed to facilitate efficient computations, which is important when there are multiple optimization loops for the various engineering analyses. In particular, a vortex-lattice method is used to compute the wing planform aerodynamics, coupled to a two-dimensional (2D) panel method to derive aerofoil sectional characteristics. Integral boundary-layer methods are coupled to the panel method in order to predict flow separation boundaries during the design iterations. A quasi-analytical method is adapted for application to flying-wing configurations to predict the wing weight, and a linear finite-beam element approach is used for structural analysis of the wing-box. Stability is a particular concern in the low-density environment of high-altitude flight for flying-wing aircraft, and so provision of adequate directional stability and control power forms part of the optimization process. At present, a modified Genetic Algorithm is used in all of the optimization loops. Each of the low-order engineering analysis tools is validated using higher-order methods to provide confidence in the use of these computationally-efficient tools in the present design-optimization framework. This paper includes the results of employing the present optimization tools in the design of a HALE, flying-wing UAV to indicate that this is a viable design configuration option.
Conser, Christiana; Seebacher, Lizbeth; Fujino, David W.; Reichard, Sarah; DiTomaso, Joseph M.
2015-01-01
Weed Risk Assessment (WRA) methods for evaluating invasiveness in plants have evolved rapidly in the last two decades. Many WRA tools exist, but none were specifically designed to screen ornamental plants prior to being released into the environment. To be accepted as a tool to evaluate ornamental plants for the nursery industry, it is critical that a WRA tool accurately predicts non-invasiveness without falsely categorizing them as invasive. We developed a new Plant Risk Evaluation (PRE) tool for ornamental plants. The 19 questions in the final PRE tool were narrowed down from 56 original questions from existing WRA tools. We evaluated the 56 WRA questions by screening 21 known invasive and 14 known non-invasive ornamental plants. After statistically comparing the predictability of each question and the frequency the question could be answered for both invasive and non-invasive species, we eliminated questions that provided no predictive power, were irrelevant in our current model, or could not be answered reliably at a high enough percentage. We also combined many similar questions. The final 19 remaining PRE questions were further tested for accuracy using 56 additional known invasive plants and 36 known non-invasive ornamental species. The resulting evaluation demonstrated that when “needs further evaluation” classifications were not included, the accuracy of the model was 100% for both predicting invasiveness and non-invasiveness. When “needs further evaluation” classifications were included as either false positive or false negative, the model was still 93% accurate in predicting invasiveness and 97% accurate in predicting non-invasiveness, with an overall accuracy of 95%. We conclude that the PRE tool should not only provide growers with a method to accurately screen their current stock and potential new introductions, but also increase the probability of the tool being accepted for use by the industry as the basis for a nursery certification program. PMID:25803830
Sarmiento, Kelly; Donnell, Zoe; Hoffman, Rosanne; Tennant, Bethany
2018-04-23
Objective: To explore healthcare providers' experiences managing mTBI and better understand their use of mTBI assessment tools and guidelines. Design: Cross-sectional. Methods: A random sample of 1,760 healthcare providers responded to the web-based DocStyles survey between June 18 and 30, 2014. The sample included family/general practitioners, internists, pediatricians, and nurse practitioners who reported seeing pediatric patients. We examined their experiences with mTBI to identify opportunities to increase preparedness and improve management of mTBI. Results: Fifty-nine percent of healthcare providers reported that they diagnosed or managed pediatric patients with mTBI within the last 12 months. Of those, 44.4% felt 'very prepared' to make decisions about when pediatric patients can safely return to activities, such as school and sports, after a mTBI. When asked how often they use screening or assessment tools to assess pediatric patients with mTBI, almost half reported that they 'seldom' or 'never' use those resources (24.6% and 22.0%, respectively). Conclusions: Most healthcare providers reported seeing pediatric patients with mTBI, yet most feel only somewhat prepared to manage this injury in their practice. Broader use of screening tools and guidelines that include clinical decision support tools may be useful for healthcare providers who care for pediatric patients with mTBI.
22 CFR 124.2 - Exemptions for training and military service.
Code of Federal Regulations, 2010 CFR
2010-04-01
... methods and tools include the development and/or use of mockups, computer models and simulations, and test facilities. (iii) Manufacturing know-how, such as: Information that provides detailed manufacturing processes...
49 CFR 396.25 - Qualifications of brake inspectors.
Code of Federal Regulations, 2011 CFR
2011-10-01
... methods, procedures, tools and equipment used when performing an assigned brake service or inspection task... motor carrier or intermodal equipment provider at its principal place of business, or at the location at...
49 CFR 396.25 - Qualifications of brake inspectors.
Code of Federal Regulations, 2010 CFR
2010-10-01
... methods, procedures, tools and equipment used when performing an assigned brake service or inspection task... motor carrier or intermodal equipment provider at its principal place of business, or at the location at...
Technology-Aided Assessment of Sensorimotor Function in Early Infancy
Allievi, Alessandro G.; Arichi, Tomoki; Gordon, Anne L.; Burdet, Etienne
2014-01-01
There is a pressing need for new techniques capable of providing accurate information about sensorimotor function during the first 2 years of childhood. Here, we review current clinical methods and challenges for assessing motor function in early infancy, and discuss the potential benefits of applying technology-assisted methods. We also describe how the use of these tools with neuroimaging, and in particular functional magnetic resonance imaging (fMRI), can shed new light on the intra-cerebral processes underlying neurodevelopmental impairment. This knowledge is of particular relevance in the early infant brain, which has an increased capacity for compensatory neural plasticity. Such tools could bring a wealth of knowledge about the underlying pathophysiological processes of diseases such as cerebral palsy; act as biomarkers to monitor the effects of possible therapeutic interventions; and provide clinicians with much needed early diagnostic information. PMID:25324827
A tool for simulating parallel branch-and-bound methods
NASA Astrophysics Data System (ADS)
Golubeva, Yana; Orlov, Yury; Posypkin, Mikhail
2016-01-01
The Branch-and-Bound method is known as one of the most powerful but very resource-consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution; the design and study of load balancing algorithms is therefore a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, search tree sizes, and supercomputer interconnect characteristics, thereby fostering deep study of load distribution strategies. The resolution of the optimization problem by the B&B method is replaced by a stochastic branching process, and data exchanges are modeled using the concept of logical time. The simulator's user-friendly graphical interface provides efficient visualization and convenient performance analysis.
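To make the simulated setup concrete, here is a minimal sketch (not the authors' simulator): subproblem resolution is replaced by a stochastic branching process, all workers advance in discrete logical-time ticks, and an idle worker steals half of the busiest queue. The branching probability, its decay with depth, and the steal policy are illustrative assumptions.

import random

def simulate(n_workers=8, branch_prob=0.9, max_children=2, seed=1):
    rng = random.Random(seed)
    queues = [[] for _ in range(n_workers)]
    queues[0].append(0)                      # root subproblem (depth 0) on worker 0
    ticks = processed = 0
    while any(queues):
        ticks += 1                           # one logical-time step
        for w in range(n_workers):
            if queues[w]:
                depth = queues[w].pop()      # LIFO gives a depth-first traversal
                processed += 1
                # stochastic branching stands in for real bounding/pruning
                if rng.random() < branch_prob * (0.95 ** depth):
                    queues[w].extend([depth + 1] * max_children)
            else:
                donor = max(range(n_workers), key=lambda d: len(queues[d]))
                half = len(queues[donor]) // 2
                if half:                     # steal half of the busiest queue
                    queues[w], queues[donor] = queues[donor][:half], queues[donor][half:]
    return ticks, processed

print(simulate())                            # (logical ticks, nodes processed)

Varying the worker count and the steal policy in such a skeleton is exactly the kind of experiment the simulator is built to run at scale.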
Fast algorithms for Quadrature by Expansion I: Globally valid expansions
NASA Astrophysics Data System (ADS)
Rachh, Manas; Klöckner, Andreas; O'Neil, Michael
2017-09-01
The use of integral equation methods for the efficient numerical solution of PDE boundary value problems requires two main tools: quadrature rules for the evaluation of layer potential integral operators with singular kernels, and fast algorithms for solving the resulting dense linear systems. Classically, these tools were developed separately. In this work, we present a unified numerical scheme based on coupling Quadrature by Expansion, a recent quadrature method, to a customized Fast Multipole Method (FMM) for the Helmholtz equation in two dimensions. The method allows the evaluation of layer potentials in linear-time complexity, anywhere in space, with a uniform, user-chosen level of accuracy as a black-box computational method. Providing this capability requires geometric and algorithmic considerations beyond the needs of standard FMMs as well as careful consideration of the accuracy of multipole translations. We illustrate the speed and accuracy of our method with various numerical examples.
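As a toy illustration of the QBX idea, the sketch below treats the 2D Laplace single-layer potential on the unit circle rather than the paper's Helmholtz/FMM setting, and fits the local expansion by least squares instead of computing its coefficients spectrally; both simplifications are assumptions of the sketch. Plain smooth quadrature degrades for a target 0.001 inside the boundary, while an expansion about a center pushed off the surface recovers it.

import numpy as np

N = 2000                                   # boundary quadrature nodes
t = 2*np.pi*np.arange(N)/N
src = np.exp(1j*t)                         # unit-circle boundary (complex plane)
sigma = np.cos(t)                          # density; exact interior potential is x/2

def slp(z):
    # smooth trapezoid rule for u(z) = -(1/2pi) * integral log|z-y| sigma(y) ds(y)
    return (-1/(2*np.pi))*np.sum(np.log(np.abs(z - src))*sigma)*(2*np.pi/N)

target = 0.999*np.exp(0.3j)                # target 1e-3 inside the boundary
exact = target.real/2

r, p, m = 0.2, 8, 32                       # center depth, expansion order, fit samples
center = (1 - r)*np.exp(0.3j)              # expansion center pushed off the surface
ring = center + 0.8*r*np.exp(2j*np.pi*np.arange(m)/m)
u_ring = np.array([slp(z) for z in ring])  # accurate: the ring stays off the boundary

# u is harmonic, so u = Re sum_l c_l (z - center)^l; fit real/imag parts by least squares
W = np.stack([(ring - center)**l for l in range(p + 1)], axis=1)
M = np.hstack([W.real, -W.imag[:, 1:]])
coef, *_ = np.linalg.lstsq(M, u_ring, rcond=None)

w = np.array([(target - center)**l for l in range(p + 1)])
qbx = np.concatenate([w.real, -w.imag[1:]]) @ coef
print(f"exact {exact:.8f}  naive {slp(target):.8f}  qbx {qbx:.8f}")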
NASA Technical Reports Server (NTRS)
Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola
2005-01-01
Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.
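Where such on-line assurance is realized in code, one simple ingredient is a runtime monitor for convergent learning. The sketch below flags a learning controller whose recent mean tracking error stops shrinking relative to the preceding window; the window size and decrease ratio are illustrative assumptions, not criteria taken from the NASA tool suite.

from collections import deque

class ConvergenceMonitor:
    """Feed update(abs(y - y_ref)) once per control step."""
    def __init__(self, window=100, min_decrease=0.99):
        self.errs = deque(maxlen=window)
        self.min_decrease = min_decrease

    def update(self, tracking_error):
        self.errs.append(abs(tracking_error))
        if len(self.errs) < self.errs.maxlen:
            return "WARMUP"                  # not enough history yet
        half = self.errs.maxlen // 2
        older = sum(list(self.errs)[:half]) / half
        recent = sum(list(self.errs)[half:]) / half
        # alert when learning has stopped reducing the error envelope
        return "OK" if recent <= self.min_decrease * older else "ALERT"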
Computed tomography-based volumetric tool for standardized measurement of the maxillary sinus
Giacomini, Guilherme; Pavan, Ana Luiza Menegatti; Altemani, João Mauricio Carrasco; Duarte, Sergio Barbosa; Fortaleza, Carlos Magno Castelo Branco; Miranda, José Ricardo de Arruda
2018-01-01
Volume measurements of the maxillary sinus may be useful to identify diseases affecting the paranasal sinuses. However, the literature shows a lack of consensus among studies measuring this volume, which may be attributable to different computed tomography data acquisition techniques, segmentation methods, and focuses of investigation, among other reasons. Furthermore, methods for volumetrically quantifying the maxillary sinus are commonly manual or semiautomated, requiring substantial user expertise and time. The purpose of the present study was to develop an automated tool for quantifying the total and air-free volume of the maxillary sinus based on computed tomography images. The quantification tool seeks to standardize maxillary sinus volume measurements, thus allowing better comparisons and determinations of factors that influence maxillary sinus size. The automated tool utilized image processing techniques (watershed, threshold, and morphological operators). The maxillary sinus volume was quantified in 30 patients. To evaluate the accuracy of the automated tool, the results were compared with manual segmentation performed by an experienced radiologist using a standard procedure. The mean percent differences between the automated and manual methods were 7.19% ± 5.83% and 6.93% ± 4.29% for total and air-free maxillary sinus volume, respectively. Linear regression and Bland-Altman statistics showed good agreement and low dispersion between the two methods. The present automated tool for maxillary sinus volume assessment was rapid, reliable, robust, accurate, and reproducible and may be applied in clinical practice. The tool may be used to standardize measurements of maxillary sinus volume. Such standardization is extremely important for allowing comparisons between studies, providing a better understanding of the role of the maxillary sinus, and determining the factors that influence maxillary sinus size under normal and pathological conditions. PMID:29304130
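A simplified sketch of this kind of pipeline is shown below (thresholding, a morphological operator, connected-component labeling); the published tool also uses watershed and was validated against manual segmentation, and the HU cutoff and voxel bookkeeping here are illustrative assumptions.

import numpy as np
from scipy import ndimage

def air_cavity_volumes_cm3(ct_hu, voxel_mm3, air_hu=-400.0):
    air = ct_hu < air_hu                              # air-filled voxels by HU threshold
    air = ndimage.binary_closing(air, iterations=2)   # close thin mucosal gaps
    labels, n = ndimage.label(air)                    # connected air cavities
    sizes = ndimage.sum(air, labels, index=range(1, n + 1))
    return sorted((s * voxel_mm3 / 1000.0 for s in sizes), reverse=True)

# usage: volumes = air_cavity_volumes_cm3(ct_volume, voxel_mm3=0.5**3)
# the maxillary sinuses would then be selected among the largest labeled cavities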
Real-time fluorescence loop mediated isothermal amplification for the diagnosis of malaria.
Lucchi, Naomi W; Demas, Allison; Narayanan, Jothikumar; Sumari, Deborah; Kabanywanyi, Abdunoor; Kachur, S Patrick; Barnwell, John W; Udhayakumar, Venkatachalam
2010-10-29
Molecular diagnostic methods can complement existing tools to improve the diagnosis of malaria. However, they require good laboratory infrastructure, restricting their use to reference laboratories and research studies. Adopting molecular tools for routine use in malaria-endemic countries will therefore require simpler molecular platforms. The recently developed loop-mediated isothermal amplification (LAMP) method is relatively simple and can be improved for better use in endemic countries. In this study, we attempted to improve this method for malaria diagnosis by using a simple and portable device capable of performing both the amplification and the fluorescence-based detection of LAMP in one platform. We refer to this as the RealAmp method. Published genus-specific primers were used to test the utility of this method. DNA derived from different species of malaria parasites was used for the initial characterization. Clinical samples of P. falciparum were used to determine the sensitivity and specificity of this system compared to microscopy and a nested PCR method. Additionally, directly boiled parasite preparations were compared with a conventional DNA isolation method. The RealAmp method was found to be simple and allowed real-time detection of DNA amplification. The time to amplification varied but was generally less than 60 minutes. All human-infecting Plasmodium species were detected. The sensitivity and specificity of RealAmp in detecting P. falciparum were 96.7% and 91.7%, respectively, compared to microscopy, and 98.9% and 100%, respectively, compared to a standard nested PCR method. In addition, this method consistently detected P. falciparum from directly boiled blood samples. RealAmp has great potential as a field-usable molecular tool for the diagnosis of malaria and can provide an alternative to conventional PCR-based diagnostic methods for field use in clinical and operational programs.
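A hedged sketch of one way to call a reaction from such a real-time fluorescence trace: report the "time to amplification" as the first time the signal exceeds the early-read baseline by k standard deviations. The baseline window and k are illustrative assumptions, not the published RealAmp algorithm.

import numpy as np

def time_to_amplification(minutes, fluorescence, baseline_n=10, k=5.0):
    m = np.asarray(minutes, dtype=float)
    f = np.asarray(fluorescence, dtype=float)
    base = f[:baseline_n]                       # early readings define the baseline
    threshold = base.mean() + k * base.std()
    above = np.nonzero(f > threshold)[0]
    return float(m[above[0]]) if above.size else None   # None means no amplification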
A dielectric logging tool with insulated collar for formation fluid detection around borehole
NASA Astrophysics Data System (ADS)
Wang, Bin; Li, Kang; Kong, Fan-Min; Zhao, Jia
2015-08-01
A dielectric tool with an insulated collar for analyzing fluid saturation outside a borehole was introduced. The UWB (ultra-wideband) antenna mounted on the tool was optimized to launch a transient pulse. The broadband evaluation method provides advantages over traditional dielectric tools. The EM (electromagnetic) power distribution outside the borehole was studied, showing that energy propagates in two modes, and the mechanism of these modes was discussed. In order to increase the tool's investigation depth, a novel insulated collar was introduced. In addition, operation in different formations was discussed, and the tool proved able to efficiently launch lateral EM waves. Response voltages indicated that the proposed scheme is able to evaluate the fluid saturation and dielectric dispersion properties of reservoir formations. It may be used as an alternative tool for imaging logging applications.
Response simulation and theoretical calibration of a dual-induction resistivity LWD tool
NASA Astrophysics Data System (ADS)
Xu, Wei; Ke, Shi-Zhen; Li, An-Zong; Chen, Peng; Zhu, Jun; Zhang, Wei
2014-03-01
In this paper, responses of a new dual-induction resistivity logging-while-drilling (LWD) tool in 3D inhomogeneous formation models are simulated by the vector finite element method (VFEM); the influences of the borehole, invaded zone, surrounding strata, and tool eccentricity are analyzed; and calibration loop parameters and calibration coefficients of the LWD tool are discussed. The results show that the tool has a greater depth of investigation than existing electromagnetic propagation LWD tools and is more sensitive to azimuthal conductivity. Both the deep and medium induction responses have linear relationships with formation conductivity, given optimal calibration loop parameters and calibration coefficients. Because of their different depths of investigation and resolution, deep induction and medium induction are affected differently by the formation model parameters and thus have different correction factors. The simulation results can provide theoretical references for the research and interpretation of dual-induction resistivity LWD tools.
Method and apparatus for transmitting and receiving data to and from a downhole tool
Hall, David R.; Fox, Joe
2007-03-13
A transmission line network system for transmitting and/or receiving data from a downhole tool. The invention is achieved by providing one or more transceiving elements, preferably rings, at either end of a downhole tool. A conduit containing a coaxial cable capable of communicating an electrical signal is attached to the transceiving element and extends through a central bore of the downhole tool and through the central bore of any tool intermediate the first transceiving element and a second transceiving element. Upon receiving an electrical signal from the cable, the second transceiving element may convert such signal to a magnetic field. The magnetic field may be detected by a third transceiving element in close proximity to the second transceiving element. In this manner, many different tools may be included in a downhole transmission network without requiring substantial modification, if any, of any particular tool.
Free DICOM de-identification tools in clinical research: functioning and safety of patient privacy.
Aryanto, K Y E; Oudkerk, M; van Ooijen, P M A
2015-12-01
To compare non-commercial DICOM toolkits for their de-identification ability in removing a patient's personal health information (PHI) from a DICOM header. Ten DICOM toolkits were selected for de-identification tests. Tests were performed by using the system's default de-identification profile and, subsequently, the tools' best adjusted settings. We aimed to eliminate fifty elements considered to contain identifying patient information. The tools were also examined for their respective methods of customization. Only one tool was able to de-identify all required elements with the default setting. Not all of the toolkits provide a customizable de-identification profile. Six tools allowed changes by selecting the provided profiles, giving input through a graphical user interface (GUI) or configuration text file, or providing the appropriate command-line arguments. Using adjusted settings, four of those six toolkits were able to perform full de-identification. Only five tools could properly de-identify the defined DICOM elements, and in four cases, only after careful customization. Therefore, free DICOM toolkits should be used with extreme care to prevent the risk of disclosing PHI, especially when using the default configuration. In case optimal security is required, one of the five toolkits is proposed. • Free DICOM toolkits should be carefully used to prevent patient identity disclosure. • Each DICOM tool produces its own specific outcomes from the de-identification process. • In case optimal security is required, using one DICOM toolkit is proposed.
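For readers who want the flavor of such de-identification in code, here is a minimal sketch using pydicom, a separate toolkit and not one of the ten evaluated above; the tag list is a small illustrative subset of the fifty PHI elements the study targeted.

import pydicom

PHI_TAGS = ["PatientName", "PatientID", "PatientBirthDate",
            "PatientAddress", "ReferringPhysicianName", "InstitutionName"]

def deidentify(path_in, path_out):
    ds = pydicom.dcmread(path_in)
    for tag in PHI_TAGS:
        if tag in ds:
            setattr(ds, tag, "")      # blank rather than delete (one policy choice)
    ds.remove_private_tags()          # private tags often carry PHI as well
    ds.save_as(path_out)

As the study's results suggest, any such default list must be checked element by element against the full PHI profile before the output is trusted.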
The Web as an educational tool for/in learning/teaching bioinformatics statistics.
Oliver, J; Pisano, M E; Alonso, T; Roca, P
2005-12-01
Statistics provides essential tools in bioinformatics for interpreting the results of a database search and for managing the enormous amounts of information produced by genomics, proteomics and metabolomics. The goal of this project was the development of a software tool that would be as simple as possible to demonstrate the use of statistics in bioinformatics. Computer Simulation Methods (CSMs) developed using Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing, easy graphical representation, and general acceptance by the scientific community. The result of these endeavours is a set of utilities which can be accessed from the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework in traditional statistical teaching methods, the consensus was that Web-based instruction had numerous advantages, but that traditional methods with manual calculations were still needed for theory and practice. Once the basic statistical formulas had been mastered, Excel spreadsheets and graphics proved very useful for exploring many parameters rapidly without tedious manual calculation. CSMs will be of great importance for the training of students and professionals in the field of bioinformatics, and for upcoming applications in self-directed learning and continuing education.
A novel, image analysis-based method for the evaluation of in vitro antagonism.
Szekeres, András; Leitgeb, Balázs; Kredics, László; Manczinger, László; Vágvölgyi, Csaba
2006-06-01
A novel method is proposed for the accurate evaluation of in vitro antagonism. Based on the measurement of the areas of the fungal colonies, biocontrol indices were calculated which are characteristic of the antagonistic Trichoderma strains. These indices provide a useful tool to describe the biocontrol abilities of fungi.
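The paper derives its indices from measured colony areas; the exact formula below is an assumption for illustration only: percent reduction of the pathogen's colony area in dual culture relative to a control plate.

def biocontrol_index(pathogen_area_dual, pathogen_area_control):
    # hypothetical index: 100 means complete suppression of the pathogen
    return 100.0 * (1.0 - pathogen_area_dual / pathogen_area_control)

print(biocontrol_index(210.0, 640.0))    # hypothetical areas, roughly 67% inhibition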
Coloc-stats: a unified web interface to perform colocalization analysis of genomic features.
Simovski, Boris; Kanduri, Chakravarthi; Gundersen, Sveinung; Titov, Dmytro; Domanska, Diana; Bock, Christoph; Bossini-Castillo, Lara; Chikina, Maria; Favorov, Alexander; Layer, Ryan M; Mironov, Andrey A; Quinlan, Aaron R; Sheffield, Nathan C; Trynka, Gosia; Sandve, Geir K
2018-06-05
Functional genomics assays produce sets of genomic regions as one of their main outputs. To biologically interpret such region-sets, researchers often use colocalization analysis, where the statistical significance of colocalization (overlap, spatial proximity) between two or more region-sets is tested. Existing colocalization analysis tools vary in the statistical methodology and analysis approaches, thus potentially providing different conclusions for the same research question. As the findings of colocalization analysis are often the basis for follow-up experiments, it is helpful to use several tools in parallel and to compare the results. We developed the Coloc-stats web service to facilitate such analyses. Coloc-stats provides a unified interface to perform colocalization analysis across various analytical methods and method-specific options (e.g. colocalization measures, resolution, null models). Coloc-stats helps the user to find a method that supports their experimental requirements and allows for a straightforward comparison across methods. Coloc-stats is implemented as a web server with a graphical user interface that assists users with configuring their colocalization analyses. Coloc-stats is freely available at https://hyperbrowser.uio.no/coloc-stats/.
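As a hedged sketch of the simplest analysis in this family, the code below measures base-pair overlap between two region sets and derives significance from a null model that randomly relocates one set along the genome; real services such as Coloc-stats expose more careful null models (for example, preserving chromosome structure and region clustering) than this illustration.

import random

def overlap_bp(a, b):
    # quadratic scan is fine for a sketch; real tools use sorted sweeps
    return sum(max(0, min(e1, e2) - max(s1, s2)) for s1, e1 in a for s2, e2 in b)

def permutation_p(a, b, genome_len, n_perm=1000, seed=0):
    rng = random.Random(seed)
    observed = overlap_bp(a, b)
    hits = 0
    for _ in range(n_perm):
        shuffled = []
        for s, e in a:                      # relocate each region uniformly
            start = rng.randrange(genome_len - (e - s))
            shuffled.append((start, start + (e - s)))
        hits += overlap_bp(shuffled, b) >= observed
    return (hits + 1) / (n_perm + 1)        # add-one to avoid p = 0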
In silico prediction of splice-altering single nucleotide variants in the human genome.
Jian, Xueqiu; Boerwinkle, Eric; Liu, Xiaoming
2014-12-16
In silico tools have been developed to predict variants that may have an impact on pre-mRNA splicing. The major limitation of the application of these tools to basic research and clinical practice is the difficulty in interpreting the output. Most tools only predict potential splice sites given a DNA sequence without measuring splicing signal changes caused by a variant. Another limitation is the lack of large-scale evaluation studies of these tools. We compared eight in silico tools on 2959 single nucleotide variants within splicing consensus regions (scSNVs) using receiver operating characteristic analysis. The Position Weight Matrix model and MaxEntScan outperformed other methods. Two ensemble learning methods, adaptive boosting and random forests, were used to construct models that take advantage of individual methods. Both models further improved prediction, with outputs of directly interpretable prediction scores. We applied our ensemble scores to scSNVs from the Catalogue of Somatic Mutations in Cancer database. Analysis showed that predicted splice-altering scSNVs are enriched in recurrent scSNVs and known cancer genes. We pre-computed our ensemble scores for all potential scSNVs across the human genome, providing a whole genome level resource for identifying splice-altering scSNVs discovered from large-scale sequencing studies.
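A sketch of the ensemble step described above, combining the component tools' scores into one predictor with AdaBoost and random forests via scikit-learn stand-ins; the feature matrix and labels here are toy placeholders, not the curated scSNV training data.

import numpy as np
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier

rng = np.random.default_rng(0)
# one row per scSNV, one column per component method's score (PWM, MaxEntScan, ...)
X = rng.random((200, 8))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)        # toy labels for illustration

ada = AdaBoostClassifier(n_estimators=200).fit(X, y)
rf = RandomForestClassifier(n_estimators=500).fit(X, y)
ensemble_score = (ada.predict_proba(X)[:, 1] + rf.predict_proba(X)[:, 1]) / 2
# the published resource precomputes analogous scores for all potential scSNVs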
Cui, Miao; Lin, Che-Yi; Su, Yi-Hsien
2017-09-01
Studies on the gene regulatory networks (GRNs) of sea urchin embryos have provided a basic understanding of the molecular mechanisms controlling animal development. The causal links in GRNs have been verified experimentally through perturbation of gene functions. Microinjection of antisense morpholino oligonucleotides (MOs) into the egg is the most widely used approach for gene knockdown in sea urchin embryos. The modification of MOs into a membrane-permeable form (vivo-MOs) has allowed gene knockdown at later developmental stages. Recent advances in genome editing tools, such as zinc-finger nucleases, transcription activator-like effector nucleases, and the clustered regularly interspaced short palindromic repeats/CRISPR-associated protein 9 (CRISPR/Cas9) system, have provided methods for gene knockout in sea urchins. Here, we review the use of vivo-MOs and genome editing tools in sea urchin studies since the publication of the sea urchin genome in 2006. Various applications of the CRISPR/Cas9 system and their potential in studying sea urchin development are also discussed. These new tools will provide more sophisticated experimental methods for studying sea urchin development. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
The Biomolecular Interaction Network Database and related tools 2005 update
Alfarano, C.; Andrade, C. E.; Anthony, K.; Bahroos, N.; Bajec, M.; Bantoft, K.; Betel, D.; Bobechko, B.; Boutilier, K.; Burgess, E.; Buzadzija, K.; Cavero, R.; D'Abreo, C.; Donaldson, I.; Dorairajoo, D.; Dumontier, M. J.; Dumontier, M. R.; Earles, V.; Farrall, R.; Feldman, H.; Garderman, E.; Gong, Y.; Gonzaga, R.; Grytsan, V.; Gryz, E.; Gu, V.; Haldorsen, E.; Halupa, A.; Haw, R.; Hrvojic, A.; Hurrell, L.; Isserlin, R.; Jack, F.; Juma, F.; Khan, A.; Kon, T.; Konopinsky, S.; Le, V.; Lee, E.; Ling, S.; Magidin, M.; Moniakis, J.; Montojo, J.; Moore, S.; Muskat, B.; Ng, I.; Paraiso, J. P.; Parker, B.; Pintilie, G.; Pirone, R.; Salama, J. J.; Sgro, S.; Shan, T.; Shu, Y.; Siew, J.; Skinner, D.; Snyder, K.; Stasiuk, R.; Strumpf, D.; Tuekam, B.; Tao, S.; Wang, Z.; White, M.; Willis, R.; Wolting, C.; Wong, S.; Wrong, A.; Xin, C.; Yao, R.; Yates, B.; Zhang, S.; Zheng, K.; Pawson, T.; Ouellette, B. F. F.; Hogue, C. W. V.
2005-01-01
The Biomolecular Interaction Network Database (BIND) (http://bind.ca) archives biomolecular interaction, reaction, complex and pathway information. Our aim is to curate the details about molecular interactions that arise from published experimental research and to provide this information, as well as tools to enable data analysis, freely to researchers worldwide. BIND data are curated into a comprehensive machine-readable archive of computable information that provides users with methods to discover interactions and molecular mechanisms. BIND has worked to develop new methods for visualization that amplify the underlying annotation of genes and proteins to facilitate the study of molecular interaction networks. BIND has maintained an open database policy since its inception in 1999. Data growth has proceeded at a tremendous rate, approaching 100 000 records. New services provided include a new BIND Query and Submission interface, a Simple Object Access Protocol (SOAP) service and the Small Molecule Interaction Database (http://smid.blueprint.org), which allows users to determine probable small molecule binding sites of new sequences and examine conserved binding residues. PMID:15608229
Obara, Ilona; Paterson, Alastair; Nazar, Zachariah; Portlock, Jane; Husband, Andrew
2017-01-01
Objective. To assess the development of knowledge, attitudes, and behaviors for collaborative practice among first-year pharmacy students following completion of interprofessional education. Methods. A mixed-methods strategy was employed to detect student self-reported change in knowledge, attitudes, and behaviors. Validated survey tools were used to assess student perception and attitudes. The Nominal Group Technique (NGT) was used to capture student reflections and provide peer discussion on the individual IPE sessions. Results. The validated survey tools did not detect any change in students’ attitudes and perceptions. The NGT succeeded in providing a milieu for participating students to reflect on their IPE experiences. The peer review process allowed students to compare their initial perceptions and reactions and renew their reflections on the learning experience. Conclusion. The NGT process has provided the opportunity to assess the student experience through the reflective process that was enriched via peer discussion. Students have demonstrated more positive attitudes and behaviors toward interprofessional working through IPE. PMID:28381886
Perceived Utility of Pharmacy Licensure Examination Preparation Tools
Peak, Amy Sutton; Sheehan, Amy Heck; Arnett, Stephanie
2006-01-01
Objectives To identify board examination preparation tools most commonly used by recent pharmacy graduates and determine which tools are perceived as most valuable and representative of the actual content of licensure examinations. Methods An electronic survey was sent to all 2004 graduates of colleges of pharmacy in Indiana. Participants identified which specific preparation tools were used and rated tools based on usefulness, representativeness of licensure examination, and monetary value, and provided overall recommendations to future graduates. Results The most commonly used preparation tools were the Pharmacy Law Review Session offered by Dr. Thomas Wilson at Purdue University, the Complete Review for Pharmacy, Pre-NAPLEX, PharmPrep, and the Kaplan NAPLEX Review. Tools receiving high ratings in all categories included Dr. Wilson's Pharmacy Law Review Session, Pre-NAPLEX, Comprehensive Pharmacy Review, Kaplan NAPLEX Review, and Review of Pharmacy. Conclusions Although no preparation tool was associated with a higher examination pass rate, certain tools were clearly rated higher than others by test takers. PMID:17149406
Decisions and Reasons: Examining Preservice Teacher Decision-Making through Video Self-Analysis
ERIC Educational Resources Information Center
Rich, Peter J.; Hannafin, Michael J.
2008-01-01
Methods used to study teacher thinking have both provided insight into the cognitive aspects of teaching and resulted in new, as yet unresolved, relationships between practice and theory. Recent developments in video-analysis tools have allowed preservice teachers to analyze both their practices and thinking, providing important feedback for…
ERIC Educational Resources Information Center
Colwell, Jamie; Hutchison, Amy C.
2015-01-01
A systematic review of relevant literature was conducted to provide a source of information and practical guidelines for teachers and teacher educators to consider instructional methods for using digital tools in elementary language arts classrooms to promote literacy. Focal studies are highlighted to provide rich descriptions of practical uses…
Levels of Job Satisfaction of Coaches Providing Education to Mentally Retarded Children in Turkey
ERIC Educational Resources Information Center
Ilhan, Ekrem Levent
2012-01-01
The purpose of this research is to determine the levels of job satisfaction of sports coaches who are providing education to mentally retarded children and to examine as well as their job satisfaction according to different variables. Survey method was preferred as the data collection tool and "Minnesota Satisfaction…
Practitioner Review: Adolescent Alcohol Use Disorders--Assessment and Treatment Issues
ERIC Educational Resources Information Center
Perepletchikova, Francheska; Krystal, John H.; Kaufman, Joan
2008-01-01
Background: Alcohol use disorders in adolescents are associated with significant morbidity and mortality. Over the past decade, there has been a burgeoning of research on adolescent alcohol use disorders. Methods: A summary of the alcohol assessment tools is provided, and randomized studies reviewed and synthesized to provide an overview of state…
2017-01-01
Background Population datasets and the Internet are playing an ever-growing role in the way cancer information is made available to providers, patients, and their caregivers. The Surveillance, Epidemiology, and End Results Cancer Survival Calculator (SEER*CSC) is a Web-based cancer prognostic tool that uses SEER data, a large population dataset, to provide physicians with highly valid, evidence-based prognostic estimates for increasing shared decision-making and improving patient-provider communication of complex health information. Objective The aim of this study was to develop, test, and implement SEER*CSC. Methods An iterative approach was used to develop SEER*CSC. Based on input from cancer patient advocacy groups and physicians, an initial version of the tool was developed. Next, providers from 4 health care delivery systems were recruited for formal usability testing of SEER*CSC. A revised version of SEER*CSC was then implemented in two health care delivery sites using a real-world clinical implementation approach, and usage data were collected. Post-implementation follow-up interviews were conducted with site champions. Finally, patients from two cancer advocacy groups participated in usability testing. Results Overall feedback on SEER*CSC from both providers and patients was positive, with providers noting that the tool was professional and reliable, and patients finding it informational and helpful to use when discussing their diagnosis with their provider. However, use during the small-scale implementation was low. Reasons for low use included the time required to enter data, the absence of treatment options in the tool, and the tool not being incorporated into the electronic health record (EHR). Patients found the language in the current version too complex. Conclusions The implementation and usability results showed that participants were enthusiastic about the use and features of SEER*CSC, but sustained implementation in a real-world clinical setting faced significant challenges. As a result of these findings, SEER*CSC is being redesigned with more accessible language for a public-facing release. Meta-tools, which put different tools in context of each other, are needed to assist in understanding the strengths and limitations of various tools and their place in the clinical decision-making pathway. The continued development and eventual release of prognostic tools should include feedback from multidisciplinary health care teams, various stakeholder groups, patients, and caregivers. PMID:28729232
Analytical method of waste allocation in waste management systems: Concept, method and case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergeron, Francis C., E-mail: francis.b.c@videotron.ca
Waste is no longer simply a rejected item to be disposed of but increasingly a secondary resource to be exploited, which influences how waste is allocated among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the "analytical method of the waste allocation process" (AMWAP), based on the concept of the "waste allocation process", defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system, and it may be integrated in the framework of impact assessment methods and environmental systems analysis tools. Its originality lies in the interdisciplinary analysis of the WAP and in the development of the conceptual framework. AMWAP is applied in an illustrative case study on the household WM system of Geneva (Switzerland), demonstrating that the method provides in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste management in Geneva.
A tool to evaluate local biophysical effects on temperature due to land cover change transitions
NASA Astrophysics Data System (ADS)
Perugini, Lucia; Caporaso, Luca; Duveiller, Gregory; Cescatti, Alessandro; Abad-Viñas, Raul; Grassi, Giacomo; Quesada, Benjamin
2017-04-01
Land Cover Changes (LCC) affect local, regional and global climate through biophysical variations of the surface energy budget mediated by albedo, evapotranspiration, and roughness. Assessments of the full climate impact of anthropogenic LCC are incomplete without considering biophysical effects, but the high level of uncertainty in quantifying their impacts has so far made it impractical to offer clear advice on which policy makers could act. To overcome this barrier, we provide a tool to evaluate the biophysical impact of a matrix of land cover transitions, following a tiered methodological approach similar to the one provided by the IPCC for estimating biogeochemical effects, i.e. through three levels of methodological complexity, from Tier 1 (default methods and factors) to Tier 3 (specific methods and factors). In particular, the tool provides guidance for quantitative assessment of changes in temperature following a land cover transition. The tool focuses on temperature for two main reasons: (i) it is the main variable of interest for policy makers at local and regional level, and (ii) temperature summarizes the impact of both radiative and non-radiative processes following LCC. The potential changes in annual air temperature that can be expected from various land cover transitions are derived from a dedicated dataset constructed by the JRC in the framework of the LUC4C FP7 project. The inputs for the dataset are air temperature values derived from satellite Earth Observation data (MODIS) and land cover characterization from the ESA Climate Change Initiative product reclassified into the equivalent IPCC land use categories. These data, originally at 0.05 degree spatial resolution, are aggregated and analysed at regional level to provide guidance on the expected temperature impact of specific LCC transitions.
Formal functional test designs with a test representation language
NASA Technical Reports Server (NTRS)
Hops, J. M.
1993-01-01
The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
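The gist of the category-partition step can be shown in a few lines: enumerate choices per category, then prune combinations with constraints to obtain the minimal logical subset of test frames. The categories and the constraint below are invented for illustration; the paper used a dedicated test representation language rather than Python.

from itertools import product

categories = {
    "link":    ["up", "down"],
    "payload": ["empty", "nominal", "oversized"],
    "mode":    ["normal", "safe"],
}

def valid(frame):
    # sample constraint: oversized payloads are only exercised in normal mode
    return not (frame["payload"] == "oversized" and frame["mode"] == "safe")

frames = [dict(zip(categories, combo)) for combo in product(*categories.values())]
test_frames = [f for f in frames if valid(f)]
print(len(frames), "->", len(test_frames), "test frames")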
BiNChE: a web tool and library for chemical enrichment analysis based on the ChEBI ontology.
Moreno, Pablo; Beisken, Stephan; Harsha, Bhavana; Muthukrishnan, Venkatesh; Tudose, Ilinca; Dekker, Adriano; Dornfeldt, Stefanie; Taruttis, Franziska; Grosse, Ivo; Hastings, Janna; Neumann, Steffen; Steinbeck, Christoph
2015-02-21
Ontology-based enrichment analysis aids in the interpretation and understanding of large-scale biological data. Ontologies are hierarchies of biologically relevant groupings. Using ontology annotations, which link ontology classes to biological entities, enrichment analysis methods assess whether there is a significant over- or under-representation of entities for ontology classes. While many tools exist that run enrichment analysis for protein sets annotated with the Gene Ontology, only a few can be used for small-molecule enrichment analysis. We describe BiNChE, an enrichment analysis tool for small molecules based on the ChEBI Ontology. BiNChE displays an interactive graph that can be exported as a high-resolution image or in network formats. The tool provides plain, weighted and fragment analysis based on either the ChEBI Role Ontology or the ChEBI Structural Ontology. BiNChE aids in the exploration of large sets of small molecules produced within metabolomics or other systems biology research contexts. The open-source tool provides easy and highly interactive web access to enrichment analysis with the ChEBI ontology and is additionally available as a standalone library.
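The statistical core of the "plain" analysis is a per-class overrepresentation test; the sketch below uses a hypergeometric tail as a stand-in (BiNChE's exact weighting and corrections may differ).

from scipy.stats import hypergeom

def enrichment_p(n_universe, n_class, n_sample, n_hits):
    # P(X >= n_hits) when drawing n_sample molecules from a universe
    # in which n_class are annotated to the ontology class
    return hypergeom.sf(n_hits - 1, n_universe, n_class, n_sample)

print(enrichment_p(5000, 120, 40, 8))    # hypothetical counts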
MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models
Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines
2016-08-03
Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
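The MetaboTools themselves are MATLAB (COBRA Toolbox) code; as a hedged illustration of the central integration step, mapping measured exchange rates onto a genome-scale model before optimization, here is the equivalent move in the Python COBRA implementation, cobrapy. The reaction IDs and rates are illustrative assumptions.

import cobra

model = cobra.io.load_model("textbook")       # small bundled E. coli demo model
measured_uptake = {"EX_glc__D_e": -8.5, "EX_o2_e": -15.0}   # mmol/gDW/h, hypothetical

for rxn_id, rate in measured_uptake.items():
    # a measured uptake rate becomes a lower bound on the exchange flux
    model.reactions.get_by_id(rxn_id).lower_bound = rate

solution = model.optimize()                   # flux distribution consistent with the data
print(solution.objective_value)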
Benchmarking short sequence mapping tools
2013-01-01
Background The development of next-generation sequencing instruments has led to the generation of millions of short sequences in a single run. The process of aligning these reads to a reference genome is time-consuming and demands the development of fast and accurate alignment tools. However, currently proposed tools make different compromises between the accuracy and the speed of mapping. Moreover, many important aspects are overlooked when comparing the performance of a newly developed tool to the state of the art. Therefore, there is a need for an objective evaluation method that covers all these aspects. In this work, we introduce a benchmarking suite to extensively analyze sequencing tools with respect to various aspects and provide an objective comparison. Results We applied our benchmarking tests on nine well-known mapping tools, namely Bowtie, Bowtie2, BWA, SOAP2, MAQ, RMAP, GSNAP, Novoalign, and mrsFAST (mrFAST), using synthetic data and real RNA-Seq data. MAQ and RMAP are based on building hash tables for the reads, whereas the remaining tools are based on indexing the reference genome. The benchmarking tests reveal the strengths and weaknesses of each tool. The results show that no single tool outperforms all others in all metrics. However, Bowtie maintained the best throughput for most of the tests, while BWA performed better for longer read lengths. The benchmarking tests are not restricted to the mentioned tools and can be further applied to others. Conclusion The mapping process is still a hard problem that is affected by many factors. In this work, we provided a benchmarking suite that reveals and evaluates the different factors affecting the mapping process. Still, no tool outperforms all of the others in all the tests. Therefore, end users should clearly specify their needs in order to choose the tool that provides the best results. PMID:23758764
Implementing iRound: A Computer-Based Auditing Tool.
Brady, Darcie
Many hospitals use rounding or auditing as a tool to help identify gaps and needs in quality and process performance. Some hospitals also use rounding to help improve the patient experience. Purposeful rounding is known to help improve Hospital Consumer Assessment of Healthcare Providers and Systems scores by helping manage patient expectations, providing service recovery, and recognizing quality caregivers. Rounding works when a standard method is used across the facility, so that data are comparable and trustworthy. This facility had a pen-and-paper process in place that made data reporting difficult and created a silo culture between departments, and most audits and rounds were completed differently on each unit. It was recognized that this facility needed to standardize the rounding and auditing process. The iRound tool created by the Advisory Board was chosen as the tool this facility would use for patient experience rounds as well as process and quality rounding. The success of the iRound tool in this facility depended on several factors, beginning many months before implementation and continuing through current everyday usage.
Osunlana, A M; Asselin, J; Anderson, R; Ogunleye, A A; Cave, A; Sharma, A M; Campbell-Scherer, D L
2015-08-01
Despite several clinical practice guidelines, there remains a considerable gap in prevention and management of obesity in primary care. To address the need for changing provider behaviour, a randomized controlled trial with convergent mixed method evaluation, the 5As Team (5AsT) study, was conducted. As part of the 5AsT intervention, the 5AsT tool kit was developed. This paper describes the development process and evaluation of these tools. Tools were co-developed by the multidisciplinary research team and the 5AsT, which included registered nurses/nurse practitioners (n = 15), mental health workers (n = 7) and registered dieticians (n = 7), who were previously randomized to the 5AsT intervention group at a primary care network in Edmonton, Alberta, Canada. The 5AsT tool development occurred through a practice/implementation-oriented, need-based, iterative process during learning collaborative sessions of the 5AsT intervention. Feedback during tool development was received through field notes and final provider evaluation was carried out through anonymous questionnaires. Twelve tools were co-developed with 5AsT. All tools were evaluated as either 'most useful' or 'moderately useful' in primary care practice by the 5AsT. Four key findings during 5AsT tool development were the need for: tools that were adaptive, tools to facilitate interdisciplinary practice, tools to help patients understand realistic expectations for weight loss and shared decision-making tools for goal setting and relapse prevention. The 5AsT tools are primary care tools which extend the utility of the 5As of obesity management framework in clinical practice. © 2015 The Authors. Clinical Obesity published by John Wiley & Sons Ltd on behalf of World Obesity.
New support vector machine-based method for microRNA target prediction.
Li, L; Gao, Q; Mao, X; Cao, Y
2014-06-09
MicroRNA (miRNA) plays important roles in cell differentiation, proliferation, growth, mobility, and apoptosis. An accurate list of precise target genes is necessary in order to fully understand the importance of miRNAs in animal development and disease. Several computational methods have been proposed for miRNA target-gene identification, but these methods still have limitations with respect to sensitivity and accuracy. We therefore developed a new miRNA target-prediction method based on the support vector machine (SVM) model. The model encodes information about two binding sites (primary and secondary) and uses a radial basis function kernel as the similarity measure over the SVM features, which are categorized as structural, thermodynamic, and sequence-conservation features. Using high-confidence datasets selected from public miRNA target databases, we obtained a human miRNA target SVM classifier model with high performance, providing an efficient tool for human miRNA target gene identification. Experiments have shown that our method is a reliable tool for miRNA target-gene prediction and a successful application of an SVM classifier. Compared with other methods, the method proposed here improves the sensitivity and accuracy of miRNA target prediction. Its performance can be further improved by providing more training examples.
Purohit, Bhaskar; Maneskar, Abhishek; Saxena, Deepak
2016-04-14
Addressing the shortage of health service providers (doctors and nurses) in rural health centres remains a huge challenge. The lack of motivation of health service providers to serve in rural areas is one of the major reasons for this shortage. While many studies have aimed at analysing the reasons for low motivation, hardly any studies in India have focused on developing valid and reliable tools to measure motivation among health service providers. Hence, the objective of the study was to develop and test a valid and reliable instrument to assess the motivation of health service providers working with the public health system in India, and the extent to which the motivation factors included in the study motivate health service providers to perform better at work. The present study adapted an already developed tool on motivation. The reliability and validity of the tool were established using different methods. The first stage of tool development involved content development and assessment where, after a detailed literature review, a predeveloped tool with 19 items was adapted. In light of the literature review and a pilot test, the tool was modified to suit the local context by adding 7 items, so that the final modified tool comprised 26 items. A correlation matrix was computed to check the pattern of relationships among the items. The total sample size for the study was 154 health service providers from one Western state in India. To assess sampling adequacy, the Kaiser-Meyer-Olkin measure of sampling adequacy and Bartlett's test of sphericity were applied, and finally factor analysis was carried out to calculate the eigenvalues and to understand the relative impact of the factors affecting motivation. A correlation matrix value of 0.017 was obtained, indicating multicollinearity among the observations. Based on the initial factor analysis, 8 of the 26 items were excluded because their loadings fell below the 0.6 cutoff. Running the factor analysis again supported the inclusion of 18 items, which were subsequently grouped under the following headings: transparency, goals, security, convenience, benefits, encouragement, adequacy of earnings and further growth, and power. There is a great need to develop instruments aimed at assessing the motivation of health service providers. The instrument used in the study has good psychometric properties and may serve as a useful tool to assess motivation among healthcare providers.
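The reported pipeline (KMO, Bartlett's test, exploratory factor analysis with a 0.6 loading cutoff) can be reproduced in outline with the factor_analyzer package; the input file and factor count below are assumptions for illustration.

import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

responses = pd.read_csv("motivation_items.csv")    # hypothetical 26-item survey data

chi2, p = calculate_bartlett_sphericity(responses)
kmo_per_item, kmo_overall = calculate_kmo(responses)

fa = FactorAnalyzer(n_factors=8, rotation="varimax")
fa.fit(responses)
loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
retained = loadings[loadings.abs().max(axis=1) >= 0.6]     # drop weak items, per the text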
Huang, Camillan
2003-01-01
Technology has created a new dimension for visual teaching and learning with web-delivered interactive media. The Virtual Labs Project has embraced this technology, with instructional design and evaluation methodologies behind simPHYSIO, a suite of simulation-based, online interactive teaching modules in physiology for Stanford students. In addition, simPHYSIO provides the convenience of anytime web access and a modular structure that allows for personalization and customization of the learning material. This innovative tool provides a solid delivery and pedagogical backbone that can be applied to developing an interactive simulation-based training tool for the use and management of the Picture Archiving and Communication System (PACS) image information system. The disparity in knowledge between health and IT professionals can be bridged by providing convenient modular teaching tools to fill the gaps in knowledge. An innovative method for teaching the whole PACS is deemed necessary for its successful implementation and operation, since PACS has become widely distributed with many interfaces, components, and customizations. This paper discusses the techniques for developing an interactive teaching tool, a case study of its implementation, and a perspective for applying this approach to an online PACS training tool. Copyright 2002 Elsevier Science Ltd.
A patient-centered electronic tool for weight loss outcomes after Roux-en-Y gastric bypass.
Wood, G Craig; Benotti, Peter; Gerhard, Glenn S; Miller, Elaina K; Zhang, Yushan; Zaccone, Richard J; Argyropoulos, George A; Petrick, Anthony T; Still, Christopher D
2014-01-01
BACKGROUND. Current patient education and informed consent regarding weight loss expectations for bariatric surgery candidates are largely based on averages from large patient cohorts. The variation in weight loss outcomes illustrates the need for establishing more realistic weight loss goals for individual patients. This study was designed to develop a simple web-based tool which provides patient-specific weight loss expectations. METHODS. Postoperative weight measurements after Roux-en-Y gastric bypass (RYGB) were collected and analyzed with patient characteristics known to influence weight loss outcomes. Quantile regression was used to create expected weight loss curves (25th, 50th, and 75th %tile) for the 24 months after RYGB. The resulting equations were validated and used to develop web-based tool for predicting weight loss outcomes. RESULTS. Weight loss data from 2986 patients (2608 in the primary cohort and 378 in the validation cohort) were included. Preoperative body mass index (BMI) and age were found to have a high correlation with weight loss accomplishment (P < 0.0001 for each). An electronic tool was created that provides easy access to patient-specific, 24-month weight loss trajectories based on initial BMI and age. CONCLUSIONS. This validated, patient-centered electronic tool will assist patients and providers in patient teaching, informed consent, and postoperative weight loss management.
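A hedged sketch of the modeling step: quantile regression of percent weight loss at a fixed postoperative month on preoperative BMI and age, using statsmodels as a stand-in. The column names and the single-month simplification are assumptions; the published tool fits full 24-month trajectories with its own validated coding.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("rygb_month12.csv")     # hypothetical columns: pct_wl, bmi0, age

curves = {}
for q in (0.25, 0.50, 0.75):             # the tool's three expectation curves
    curves[q] = smf.quantreg("pct_wl ~ bmi0 + age", df).fit(q=q).params

# median expected weight loss for a 45-year-old with preoperative BMI 48
mid = curves[0.50]
print(mid["Intercept"] + mid["bmi0"] * 48 + mid["age"] * 45)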
Open source tools for fluorescent imaging.
Hamilton, Nicholas A
2012-01-01
As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.
Accuracy Improvement of Multi-Axis Systems Based on Laser Correction of Volumetric Geometric Errors
NASA Astrophysics Data System (ADS)
Teleshevsky, V. I.; Sokolov, V. A.; Pimushkin, Ya I.
2018-04-01
The article describes a volumetric geometric error correction method for CNC-controlled multi-axis systems (machine tools, CMMs, etc.). Kalman's concept of "Control and Observation" is used: a versatile multi-function laser interferometer serves as the Observer to measure the machine's error functions. A systematic error map of the machine's workspace is produced from these error function measurements, and the error map yields an error correction strategy. The article proposes a new method of forming the error correction strategy, based on the error distribution within the machine's workspace and a CNC-program postprocessor. The postprocessor provides minimal error values within the maximal workspace zone. The results are confirmed by error correction of precision CNC machine tools.
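An illustrative sketch of the final correction step: once a volumetric error map has been measured, a commanded position is shifted by the interpolated error at the target point. The grid spacing and error values below are placeholders, not measured interferometer data.

import numpy as np
from scipy.interpolate import RegularGridInterpolator

ax = np.linspace(0.0, 500.0, 11)                    # mm, per-axis grid
rng = np.random.default_rng(0)
err = rng.normal(0.0, 0.005, (11, 11, 11, 3))       # placeholder (x, y, z) error map, mm

interp = RegularGridInterpolator((ax, ax, ax), err)

def corrected_command(target_xyz):
    # shift the commanded point so the tool lands on the nominal target
    return np.asarray(target_xyz) - interp(target_xyz)[0]

print(corrected_command([123.4, 250.0, 77.7]))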
Tube swaging device uses explosive force
NASA Technical Reports Server (NTRS)
Mc Smith, D. G.
1968-01-01
Tool joins a sleeve to a tube by explosive swaging, thus providing a leakproof, lightweight, and strong assembly. No new or different material is used in this method and therefore the thermal and galvanic properties are maintained.
Pluye, Pierre; Hong, Quan Nha
2014-01-01
This article provides an overview of mixed methods research and mixed studies reviews. These two approaches are used to combine the strengths of quantitative and qualitative methods and to compensate for their respective limitations. This article is structured in three main parts. First, the epistemological background for mixed methods will be presented. Afterward, we present the main types of mixed methods research designs and techniques as well as guidance for planning, conducting, and appraising mixed methods research. In the last part, we describe the main types of mixed studies reviews and provide a tool kit and examples. Future research needs to offer guidance for assessing mixed methods research and reporting mixed studies reviews, among other challenges.
Quantitative Susceptibility Mapping of Human Brain Reflects Spatial Variation in Tissue Composition
Li, Wei; Wu, Bing; Liu, Chunlei
2011-01-01
Image phase from gradient echo MRI provides a unique contrast that reflects brain tissue composition variations, such as iron and myelin distribution. Phase imaging is emerging as a powerful tool for the investigation of functional brain anatomy and disease diagnosis. However, the quantitative value of phase is compromised by its nonlocal and orientation dependent properties. There is an increasing need for reliable quantification of magnetic susceptibility, the intrinsic property of tissue. In this study, we developed a novel and accurate susceptibility mapping method that is also phase-wrap insensitive. The proposed susceptibility mapping method utilized two complementary equations: (1) the Fourier relationship of phase and magnetic susceptibility; and (2) the first-order partial derivative of the first equation in the spatial frequency domain. In numerical simulation, this method reconstructed the susceptibility map almost free of streaking artifact. Further, the iterative implementation of this method allowed for high quality reconstruction of susceptibility maps of human brain in vivo. The reconstructed susceptibility map provided excellent contrast of iron-rich deep nuclei and white matter bundles from surrounding tissues. Further, it also revealed anisotropic magnetic susceptibility in brain white matter. Hence, the proposed susceptibility mapping method may provide a powerful tool for the study of brain physiology and pathophysiology. Further elucidation of anisotropic magnetic susceptibility in vivo may allow us to gain more insight into the white matter microarchitectures. PMID:21224002
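For intuition about the first of the two equations, the Fourier relationship between phase and susceptibility, here is a sketch of a standard baseline inversion (thresholded k-space division). It is not the authors' iterative, phase-wrap-insensitive method, and it assumes the input is already an unwrapped, background-corrected field map.

```python
import numpy as np

def dipole_kernel(shape, b0_dir=(0.0, 0.0, 1.0)):
    """Unit dipole kernel D(k) = 1/3 - (k.b0)^2/|k|^2 in the spatial
    frequency domain (the Fourier relationship between field and chi)."""
    kx, ky, kz = np.meshgrid(*[np.fft.fftfreq(n) for n in shape], indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[k2 == 0] = np.inf                     # DC term: avoid 0/0
    kb = kx*b0_dir[0] + ky*b0_dir[1] + kz*b0_dir[2]
    return 1.0/3.0 - kb**2 / k2

def tkd_qsm(field, thresh=0.1):
    """Thresholded k-space division: chi(k) = field(k)/D(k), clipping |D|
    at `thresh` to limit noise amplification and streaking near the cone."""
    D = dipole_kernel(field.shape)
    D_safe = np.where(np.abs(D) < thresh, thresh * np.where(D >= 0, 1.0, -1.0), D)
    return np.real(np.fft.ifftn(np.fft.fftn(field) / D_safe))

chi = tkd_qsm(np.random.rand(32, 32, 32))    # synthetic field-map stand-in
```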
Analysis Tool Web Services from the EMBL-EBI.
McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo
2013-07-01
Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
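As an illustration of how such services can be scripted, the following Python sketch submits a protein sequence to the NCBI BLAST REST service using the run/status/result pattern the dispatcher documents. The endpoint path and parameter names follow that documentation but should be checked against the current service; the polling interval is arbitrary.

```python
import time
import requests

BASE = "https://www.ebi.ac.uk/Tools/services/rest/ncbiblast"

def blast_protein(sequence, email):
    # Submit a job; the dispatcher returns the job identifier as plain text.
    job_id = requests.post(f"{BASE}/run", data={
        "email": email,                      # required by the service's fair-use policy
        "program": "blastp",
        "database": "uniprotkb_swissprot",
        "stype": "protein",
        "sequence": sequence,
    }).text
    # Poll the job status until it is no longer RUNNING.
    while requests.get(f"{BASE}/status/{job_id}").text == "RUNNING":
        time.sleep(5)
    # Retrieve the default plain-text result.
    return requests.get(f"{BASE}/result/{job_id}/out").text
```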
Vawda, Naseema B. M.; Milburn, Norweeta G.; Steyn, Renier; Zhang, Muyu
2016-01-01
Objective Adolescent suicidal behaviour is a public health concern in South Africa. The purpose of this manuscript is to report on the development of a screening tool for teachers to identify South African students who are most at risk for suicidal behaviour. This need is addressed within the context of the limited number of mental health professionals available to provide screening and care services in South Africa. Method Grade 8 students participated by completing sociodemographic questionnaires and self-report psychometric instruments. A screening tool for suicidal behaviour was developed using a four phase approach. Results Twelve factors for high risk suicidal behaviour were identified and included in the screening tool. While further research is needed to validate the screening tool, the findings provide a useful preliminary starting point for teachers to refer students at high risk for suicidal behaviour to mental health services for treatment. Conclusion This screening tool is based on factors that were identified as being associated with suicidal behaviour from local research on South African adolescents. The tool contributes to research on adolescent mental health, particularly suicidal behaviour, in developing low and middle income countries like South Africa, with the aim of creating African prevention and intervention programmes. PMID:28459269
Wang, Yinghua; Yan, Jiaqing; Wen, Jianbin; Yu, Tao; Li, Xiaoli
2016-01-01
Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and Brain Function Mapping (BFM) visualization is still lacking. In this study, we developed a BFM Tool, which facilitates electrode position registration and BFM visualization, with an application to epilepsy surgeries. The BFM Tool mainly utilizes electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as the node size/color, edge color/thickness, mapping method, can be adjusted easily using the setting panel. Moreover, users may manually import/export location and connectivity data to generate figures for further application. The role of this software is demonstrated by a clinical study of language area localization. The BFM Tool helps clinical doctors and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Our tool provides convenient electrode registration, easy brain function visualization, and has good performance. It is clinical-oriented and is easy to deploy and use. The BFM tool is suitable for epilepsy and other clinical iEEG applications.
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
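The precision and accuracy metrics for a hybrid proteome benchmark can be summarized in a few lines: since each species is spiked at a known ratio between samples, the deviation of observed log-ratios from the expected value measures accuracy, and their spread measures precision. The sketch below, in Python rather than the published R package, uses illustrative species ratios and column names.

```python
import numpy as np
import pandas as pd

# Expected log2(A/B) ratios for a hypothetical hybrid proteome sample.
EXPECTED_LOG2 = {"HUMAN": 0.0, "YEAST": 1.0, "ECOLI": -2.0}

def benchmark(df):
    """df has one row per protein with columns
    ['species', 'intensity_A', 'intensity_B'].
    Accuracy = median deviation of observed log2 ratio from the expected one;
    precision = interquartile range of the observed log2 ratios."""
    df = df.assign(log2_ratio=np.log2(df["intensity_A"] / df["intensity_B"]))
    rows = {}
    for sp, grp in df.groupby("species"):
        dev = grp["log2_ratio"] - EXPECTED_LOG2[sp]
        iqr = grp["log2_ratio"].quantile(0.75) - grp["log2_ratio"].quantile(0.25)
        rows[sp] = {"accuracy_median_dev": dev.median(), "precision_iqr": iqr}
    return pd.DataFrame(rows).T
```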
ERIC Educational Resources Information Center
Vermont Inst. for Self-Reliance, Rutland.
This guide provides a description of Responsive Text (RT), a method for presenting job-relevant information within a computer-based support system. A summary of what RT is and why it is important is provided first. The first section of the guide provides a brief overview of what research tells about the reading process and how the general design…
Experiences on developing digital down conversion algorithms using Xilinx system generator
NASA Astrophysics Data System (ADS)
Xu, Chengfa; Yuan, Yuan; Zhao, Lizhi
2013-07-01
The Digital Down Conversion (DDC) algorithm is a classical signal processing method widely used in radar and communication systems. In this paper, the DDC function is implemented on an FPGA with the Xilinx System Generator tool. System Generator is an FPGA design tool provided by Xilinx Inc. and MathWorks Inc. that makes it convenient for programmers to manipulate and debug a design, especially for complex algorithms. The development of the DDC function in System Generator shows that it is a fast and efficient tool for FPGA design.
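Functionally, a DDC mixes the input against a numerically controlled oscillator (NCO), low-pass filters, and decimates. A minimal floating-point Python sketch of that chain follows for orientation; the real System Generator design is fixed-point hardware, and the filter length and cutoff here are arbitrary choices.

```python
import numpy as np
from scipy.signal import firwin, lfilter

def ddc(samples, fs, f_carrier, decim):
    """Digital down conversion: mix the real input to complex baseband
    with an NCO, low-pass filter, and decimate by `decim`."""
    n = np.arange(len(samples))
    nco = np.exp(-2j * np.pi * f_carrier * n / fs)   # numerically controlled oscillator
    baseband = samples * nco
    taps = firwin(101, 0.8 / decim)                  # anti-alias LPF, cutoff rel. to Nyquist
    return lfilter(taps, 1.0, baseband)[::decim]

# Example: shift a 2.1 MHz tone sampled at 10 MS/s to baseband, decimate by 8.
fs = 10e6
t = np.arange(4096) / fs
iq = ddc(np.cos(2 * np.pi * 2.1e6 * t), fs, 2.1e6, 8)
```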
Advanced genetic tools for plant biotechnology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, WS; Yuan, JS; Stewart, CN
2013-10-09
Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.
Use of Influenza Risk Assessment Tool for Prepandemic Preparedness
Trock, Susan C.
2018-01-01
In 2010, the Centers for Disease Control and Prevention began to develop an Influenza Risk Assessment Tool (IRAT) to methodically capture and assess information relating to influenza A viruses not currently circulating among humans. The IRAT uses a multiattribute, additive model to generate a summary risk score for each virus. Although the IRAT is not intended to predict the next pandemic influenza A virus, it has provided input into prepandemic preparedness decisions. PMID:29460739
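A multiattribute additive model of this kind reduces to a weighted average of expert scores. The sketch below illustrates the arithmetic only; the attribute names, weights, and scores are invented placeholders, not the CDC's actual risk elements or weights.

```python
# Illustrative multiattribute additive model; all names and numbers are placeholders.
WEIGHTS = {
    "human_infections": 3.0,
    "receptor_binding": 2.0,
    "transmission_in_animal_models": 2.0,
    "population_immunity": 1.5,
    "antiviral_susceptibility": 1.0,
}

def summary_risk_score(scores):
    """scores: attribute -> expert score on a 1-10 scale.
    Returns the weight-normalized additive summary score (also 1-10)."""
    total = sum(WEIGHTS.values())
    return sum(WEIGHTS[a] * scores[a] for a in WEIGHTS) / total

print(summary_risk_score({
    "human_infections": 4, "receptor_binding": 6,
    "transmission_in_animal_models": 5, "population_immunity": 7,
    "antiviral_susceptibility": 2}))   # -> about 4.9 for this illustrative virus
```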
Breaking Through Applet Security Restrictions in an Internet-Based Visualization System
NASA Astrophysics Data System (ADS)
Chen, Jie; Huang, Yan
In implementing an Internet-based visualization system for protein molecules, the system must allow users to view molecular structures stored on their local computer; that is, the client generates the three-dimensional graphics from a PDB file on the client machine. This requires the Applet to access local files, which runs into the Applet security restrictions. This paper covers two implementation methods: 1. Use the signature tools, key management tools, and Policy Editor provided by the JDK to digitally sign and authenticate the Java Applet, thereby relaxing certain security restrictions in the browser. 2. Use a Servlet proxy to access the data indirectly, bypassing the sandbox-model restrictions that the traditional Java Virtual Machine places on Applet capabilities. Both approaches can break through the Applet's security restrictions, but each has its own strengths.
Geochemical Reaction Mechanism Discovery from Molecular Simulation
Stack, Andrew G.; Kent, Paul R. C.
2014-11-10
Methods to explore reactions using computer simulation are becoming increasingly quantitative, versatile, and robust. In this review, a rationale for how molecular simulation can help build better geochemical kinetics models is first given. We summarize some common methods that geochemists use to simulate reaction mechanisms, specifically classical molecular dynamics and quantum chemical methods, and discuss their strengths and weaknesses. Useful tools such as umbrella sampling and metadynamics that enable one to explore reactions are discussed. Several case studies wherein geochemists have used these tools to understand reaction mechanisms are presented, including water exchange and sorption on aqueous species and mineral surfaces, surface charging, crystal growth and dissolution, and electron transfer. The impact that molecular simulation has had on our understanding of geochemical reactivity is highlighted in each case. In the future, it is anticipated that molecular simulation of geochemical reaction mechanisms will become more commonplace as a tool to validate and interpret experimental data, and provide a check on the plausibility of geochemical kinetic models.
Preserving Symplecticity in the Numerical Integration of Linear Beam Optics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, Christopher K.
2017-07-01
Presented are mathematical tools and methods for the development of numerical integration techniques that preserve the symplectic condition inherent to mechanics. The intended audience is beam physicists with backgrounds in numerical modeling and simulation, with particular attention to beam optics applications. The paper focuses on Lie methods that are inherently symplectic regardless of the integration accuracy order. Section 2 provides the mathematical tools used in the sequel and necessary for the reader to extend the covered techniques. Section 3 places those tools in the context of charged-particle beam optics; in particular, linear beam optics is presented in terms of a Lie algebraic matrix representation. Section 4 presents numerical stepping techniques with particular emphasis on a third-order leapfrog method. Section 5 discusses the modeling of field imperfections with particular attention to the fringe fields of quadrupole focusing magnets. The direct computation of a third-order transfer matrix for a fringe field is shown.
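As a concrete instance of a symplectic stepping scheme, here is the standard second-order kick-drift-kick leapfrog for a separable Hamiltonian (the paper develops a third-order variant); each substep is a shear map, so the composed update is symplectic for any step size.

```python
import numpy as np

def leapfrog(q, p, grad_V, dt, steps):
    """Kick-drift-kick leapfrog for H = p^2/2 + V(q) (unit mass).
    Each substep is a shear map, so the composed update is symplectic
    for any step size; accuracy is second order in dt."""
    for _ in range(steps):
        p = p - 0.5 * dt * grad_V(q)   # half kick
        q = q + dt * p                 # full drift
        p = p - 0.5 * dt * grad_V(q)   # half kick
    return q, p

# Harmonic oscillator V(q) = q^2/2: the energy error stays bounded,
# a hallmark of symplectic integrators.
q, p = leapfrog(np.array([1.0]), np.array([0.0]), lambda q: q, 0.01, 100_000)
print(0.5 * (p**2 + q**2))   # stays near the initial energy 0.5
```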
The application of systems thinking in health: why use systems thinking?
Peters, David H
2014-08-26
This paper explores the question of what systems thinking adds to the field of global health. Observing that elements of systems thinking are already common in public health research, the article discusses which of the large body of theories, methods, and tools associated with systems thinking are more useful. The paper reviews the origins of systems thinking, describing a range of the theories, methods, and tools. A common thread is the idea that the behavior of systems is governed by common principles that can be discovered and expressed. They each address problems of complexity, which is a frequent challenge in global health. The different methods and tools are suited to different types of inquiry and involve both qualitative and quantitative techniques. The paper concludes by emphasizing that explicit models used in systems thinking provide new opportunities to understand and continuously test and revise our understanding of the nature of things, including how to intervene to improve people's health.
Using geographical information systems and cartograms as a health service quality improvement tool.
Lovett, Derryn A; Poots, Alan J; Clements, Jake T C; Green, Stuart A; Samarasundera, Edgar; Bell, Derek
2014-07-01
Disease prevalence can be spatially analysed to provide support for service implementation and health care planning, and these analyses often display geographic variation. A key challenge is to communicate these results to decision makers, with variable levels of Geographic Information Systems (GIS) knowledge, in a way that represents the data and allows for comprehension. The present research describes the combination of established GIS methods and software tools to produce a novel technique of visualising disease admissions, helping to prevent misinterpretation of data and less optimal decision making. The aim of this paper is to provide a tool that supports the ability of decision makers and service teams within health care settings to develop services more efficiently and better cater to the population; this tool has the advantage of combining information on the position of populations, the size of populations and the severity of disease. A standard choropleth of the study region, London, is used to visualise total emergency admission values for Chronic Obstructive Pulmonary Disease and bronchiectasis using ESRI's ArcGIS software. Population estimates of the Lower Super Output Areas (LSOAs) are then used with the ScapeToad cartogram software tool, with the aim of visualising geography at uniform population density. An interpolation surface, in this case ArcGIS' spline tool, allows the creation of a smooth surface over the LSOA centroids for admission values on both standard and cartogram geographies. The final product of this research is the novel Cartogram Interpolation Surface (CartIS). The method provides a series of outputs culminating in the CartIS, applying an interpolation surface to a uniform population density. The cartogram effectively equalises the population density to remove visual bias from areas with a smaller population, while maintaining contiguous borders. CartIS decreases the number of extreme positive values not present in the underlying data, as can be found in interpolation surfaces. This methodology provides a technique for combining simple GIS tools to create a novel output, CartIS, in a health service context, with the key aim of improving visualisation communication techniques which highlight variation in small-scale geographies across large regions. CartIS more faithfully represents the data than interpolation, and visually highlights areas of extreme value more than cartograms, when either is used in isolation. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
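The interpolation stage can be reproduced outside ArcGIS with any smooth scattered-data interpolant. The sketch below uses SciPy's thin-plate-spline radial basis interpolator over synthetic centroid data as a stand-in for the ArcGIS spline tool; the coordinates and admission counts are randomly generated placeholders.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
centroids = rng.random((200, 2)) * 50_000            # LSOA centroids, metres (synthetic)
admissions = rng.poisson(5, size=200).astype(float)  # admissions per LSOA (synthetic)

# Thin-plate spline surface over the centroids; on the cartogram geometry
# the same call would take the transformed centroid coordinates instead.
surface = RBFInterpolator(centroids, admissions, kernel="thin_plate_spline")

# Evaluate on a regular grid for mapping.
gx, gy = np.meshgrid(np.linspace(0, 50_000, 100), np.linspace(0, 50_000, 100))
grid = surface(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
```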
GEMINI: Integrative Exploration of Genetic Variation and Genome Annotations
Paila, Umadevi; Chapman, Brad A.; Kirchner, Rory; Quinlan, Aaron R.
2013-01-01
Modern DNA sequencing technologies enable geneticists to rapidly identify genetic variation among many human genomes. However, isolating the minority of variants underlying disease remains an important, yet formidable challenge for medical genetics. We have developed GEMINI (GEnome MINIng), a flexible software package for exploring all forms of human genetic variation. Unlike existing tools, GEMINI integrates genetic variation with a diverse and adaptable set of genome annotations (e.g., dbSNP, ENCODE, UCSC, ClinVar, KEGG) into a unified database to facilitate interpretation and data exploration. Whereas other methods provide an inflexible set of variant filters or prioritization methods, GEMINI allows researchers to compose complex queries based on sample genotypes, inheritance patterns, and both pre-installed and custom genome annotations. GEMINI also provides methods for ad hoc queries and data exploration, a simple programming interface for custom analyses that leverage the underlying database, and both command line and graphical tools for common analyses. We demonstrate GEMINI's utility for exploring variation in personal genomes and family based genetic studies, and illustrate its ability to scale to studies involving thousands of human samples. GEMINI is designed for reproducibility and flexibility and our goal is to provide researchers with a standard framework for medical genomics. PMID:23874191
Self-optimizing Monte Carlo method for nuclear well logging simulation
NASA Astrophysics Data System (ADS)
Liu, Lianyan
1997-09-01
In order to increase the efficiency of Monte Carlo simulation for nuclear well logging problems, a new method has been developed for variance reduction. With this method, an importance map is generated in the regular Monte Carlo calculation as a by-product, and the importance map is later used to conduct the splitting and Russian roulette for particle population control. By adopting a spatial mesh system, which is independent of the physical geometrical configuration, the method allows superior user-friendliness. This new method is incorporated into the general-purpose Monte Carlo code MCNP4A through a patch file. Two nuclear well logging problems, a neutron porosity tool and a gamma-ray lithology density tool, are used to test the performance of this new method. The calculations are sped up over analog simulation by 120 and 2600 times, for the neutron porosity tool and for the gamma-ray lithology density log, respectively. The new method outperforms MCNP's cell-based weight window by a factor of 4-6, as measured by the converged figures-of-merit. An indirect comparison indicates that the new method also outperforms the AVATAR process for gamma-ray density tool problems. Even though it takes quite some time to generate a reasonable importance map from an analog run, a good initial map can create significant CPU time savings. This makes the method especially suitable for nuclear well logging problems, since one or several reference importance maps are usually available for a given tool. The study shows that the spatial mesh sizes should be chosen according to the mean free path. The overhead of the importance map generator is 6% and 14% for the neutron and gamma-ray cases, respectively. The learning ability towards a correct importance map is also demonstrated. Although false learning may happen, physical judgement can help diagnose it with contributon maps. Calibration and analysis are performed for the neutron tool and the gamma-ray tool. Because a very good initial importance map is always available after the first point has been calculated, high computing efficiency is maintained. The availability of contributon maps provides an easy way of understanding the logging measurement and analyzing the depth of investigation.
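The population-control step driven by an importance map is conceptually simple: particles crossing into a more important region are split, and those entering a less important region play Russian roulette, with expected total weight preserved. The Python sketch below shows the generic scheme, not MCNP4A's internal implementation.

```python
import random

def population_control(weight, imp_old, imp_new, bank):
    """Importance-map-driven population control. Crossing into a region of
    higher importance splits the particle; lower importance triggers
    Russian roulette. Expected total weight is preserved in both branches."""
    r = imp_new / imp_old
    if r >= 1.0:
        n = int(r) + (random.random() < r - int(r))   # E[n] = r copies
        for _ in range(n):
            bank.append(weight / r)                   # each copy carries weight/r
    elif random.random() < r:
        bank.append(weight / r)                       # survivor with boosted weight
    # else: the particle is killed and contributes nothing

bank = []
population_control(1.0, imp_old=1.0, imp_new=4.0, bank=bank)
print(bank)   # four copies of weight 0.25
```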
Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan
2015-12-01
Our current understanding of the molecular mechanisms which regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR) software which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.
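One common way to quantify molecular clustering in single-molecule localization data is density-based clustering. The sketch below applies scikit-learn's DBSCAN to a synthetic localization table as an illustration of the general approach; MIiSR's own algorithms and parameters may differ.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
locs = rng.random((1000, 2)) * 2000      # synthetic localizations, x/y in nm

# eps: neighbourhood radius in nm; min_samples: minimum localizations
# needed to seed a cluster. Both depend on labelling density.
labels = DBSCAN(eps=50, min_samples=10).fit_predict(locs)

n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print(f"{n_clusters} clusters; {np.sum(labels == -1)} unclustered localizations")
```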
Overview of 'Omics Technologies for Military Occupational Health Surveillance and Medicine.
Bradburne, Christopher; Graham, David; Kingston, H M; Brenner, Ruth; Pamuku, Matt; Carruth, Lucy
2015-10-01
Systems biology ('omics) technologies are emerging as tools for the comprehensive analysis and monitoring of human health. In order for these tools to be used in military medicine, clinical sampling and biobanking will need to be optimized to be compatible with downstream processing and analysis for each class of molecule measured. This article provides an overview of 'omics technologies, including instrumentation, tools, and methods, and their potential application for warfighter exposure monitoring. We discuss the current state and the potential utility of personalized data from a variety of 'omics sources including genomics, epigenomics, transcriptomics, metabolomics, proteomics, lipidomics, and efforts to combine their use. Issues in the "sample-to-answer" workflow, including collection and biobanking are discussed, as well as national efforts for standardization and clinical interpretation. Establishment of these emerging capabilities, along with accurate xenobiotic monitoring, for the Department of Defense could provide new and effective tools for environmental health monitoring at all duty stations, including deployed locations. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
Raterink, Ginger
2016-02-01
Critical thinking, clinical decision making, and critical reflection have been identified as skills required of nurses in every clinical situation. The Educating Nurses: A Call for Radical Transformation report suggested that critical reflection is a key to improving the educational process. Reflective journaling is a tool that helps develop such skills. This article presents the tool of reflective journaling and the use of this process by educators working with students. It describes the use of reflective journaling in graduate nursing education, as well as a scoring process to evaluate the reflection and provide feedback. Students and faculty found the journaling to be helpful for reflection of a clinical situation focused on critical thinking skill development. The rubric scoring tool provided faculty with a method for feedback. Reflective journaling is a tool that faculty and students can use to develop critical thinking skills for the role of the advanced practice RN. A rubric scoring system offers a consistent format for feedback. Copyright 2016, SLACK Incorporated.
Cognitive Foundry v. 3.0 (OSS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basilico, Justin; Dixon, Kevin; McClain, Jonathan
2009-11-18
The Cognitive Foundry is a unified collection of tools designed for research and applications that use cognitive modeling, machine learning, or pattern recognition. The software library contains design patterns, interface definitions, and default implementations of reusable software components and algorithms designed to support a wide variety of research and development needs. The library contains three main software packages: the Common package that contains basic utilities and linear algebraic methods, the Cognitive Framework package that contains tools to assist in implementing and analyzing theories of cognition, and the Machine Learning package that provides general algorithms and methods for populating Cognitive Framework components from domain-relevant data.
Augmented reality building operations tool
Brackney, Larry J.
2014-09-09
A method (700) for providing an augmented reality operations tool to a mobile client (642) positioned in a building (604). The method (700) includes, with a server (660), receiving (720) from the client (642) an augmented reality request for building system equipment (612) managed by an energy management system (EMS) (620). The method (700) includes transmitting (740) a data request for the equipment (612) to the EMS (620) and receiving (750) building management data (634) for the equipment (612). The method (700) includes generating (760) an overlay (656) with an object created based on the building management data (634), which may be sensor data, diagnostic procedures, or the like. The overlay (656) is configured for concurrent display on a display screen (652) of the client (642) with a real-time image of the building equipment (612). The method (700) includes transmitting (770) the overlay (656) to the client (642).
Shen, Yufeng; Tolić, Nikola; Xie, Fang; Zhao, Rui; Purvine, Samuel O.; Schepmoes, Athena A.; Moore, Ronald J.; Anderson, Gordon A.; Smith, Richard D.
2011-01-01
We report on the effectiveness of CID, HCD, and ETD for LC-FT MS/MS analysis of peptides using a tandem linear ion trap-Orbitrap mass spectrometer. A range of software tools and analysis parameters were employed to explore the use of CID, HCD, and ETD to identify peptides isolated from human blood plasma without the use of specific “enzyme rules”. In the evaluation of an FDR-controlled SEQUEST scoring method, the use of accurate masses for fragments increased the numbers of identified peptides (by ~50%) compared to the use of conventional low accuracy fragment mass information, and CID provided the largest contribution to the identified peptide datasets compared to HCD and ETD. The FDR-controlled Mascot scoring method provided significantly fewer peptide identifications than with SEQUEST (by 1.3–2.3 fold) at the same confidence levels, and CID, HCD, and ETD provided similar contributions to identified peptides. Evaluation of de novo sequencing and the UStags method for more intense fragment ions revealed that HCD afforded more sequence consecutive residues (e.g., ≥7 amino acids) than either CID or ETD. Both the FDR-controlled SEQUEST and Mascot scoring methods provided peptide datasets that were affected by the decoy database and mass tolerances applied (e.g., the identical peptides between the datasets could be limited to ~70%), while the UStags method provided the most consistent peptide datasets (>90% overlap) with extremely low (near zero) numbers of false positive identifications. The m/z ranges in which CID, HCD, and ETD contributed the largest number of peptide identifications were substantially overlapping. This work suggests that the three peptide ion fragmentation methods are complementary, and that maximizing the number of peptide identifications benefits significantly from a careful match with the informatics tools and methods applied. These results also suggest that the decoy strategy may inaccurately estimate identification FDRs. PMID:21678914
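The FDR control used for scoring can be illustrated with the classic target-decoy calculation: at each score cutoff, the FDR is estimated as the ratio of decoy to target identifications above the cutoff. A minimal sketch follows, assuming a simple global FDR rather than the exact SEQUEST/Mascot workflows.

```python
import numpy as np

def score_cutoff_at_fdr(scores, is_decoy, fdr=0.01):
    """Target-decoy FDR: walking down the score-sorted list, estimate
    FDR(s) = #decoys(score >= s) / #targets(score >= s) and return the
    lowest cutoff whose estimated FDR stays at or below `fdr`."""
    order = np.argsort(scores)[::-1]
    decoys = np.cumsum(is_decoy[order])
    targets = np.cumsum(~is_decoy[order])
    est_fdr = decoys / np.maximum(targets, 1)
    ok = np.where(est_fdr <= fdr)[0]
    return scores[order[ok[-1]]] if ok.size else np.inf

scores = np.array([9.1, 7.4, 6.8, 6.5, 5.0, 4.2])
is_decoy = np.array([False, False, False, True, False, True])
print(score_cutoff_at_fdr(scores, is_decoy))   # -> 6.8 at 1% FDR
```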
New approaches in assessing food intake in epidemiology.
Conrad, Johanna; Koch, Stefanie A J; Nöthlings, Ute
2018-06-22
A promising direction for improving dietary intake measurement in epidemiologic studies is the combination of short-term and long-term dietary assessment methods using statistical methods. Web-based instruments are particularly interesting here, as their application offers several potential advantages such as self-administration and a shorter completion time. The objective of this review is to provide an overview of new web-based short-term instruments and to describe their features. A number of web-based short-term dietary assessment tools for application in different countries and age-groups have been developed so far. Particular attention should be paid to the underlying database and the search function of the tool. Moreover, web-based instruments can improve the estimation of portion sizes by offering several options to the user. Web-based dietary assessment methods are associated with lower costs and reduced burden for participants and researchers, and show validity comparable to traditional instruments. When there is a need for a web-based tool, researchers should consider adapting existing tools rather than developing new instruments. The combination of short-term and long-term instruments seems more feasible with the use of new technology.
Community-driven computational biology with Debian Linux
2010-01-01
Background The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. Results The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Conclusions Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers. PMID:21210984
The Socratic Method: analyzing ethical issues in health administration.
Gac, E J; Boerstler, H; Ruhnka, J C
1998-01-01
The Socratic Method has long been recognized by the legal profession as an effective tool for promoting critical thinking and analysis in the law. This article describes ways the technique can be used in health administration education to help future administrators develop the "ethical rudder" they will need for effective leadership. An illustrative dialogue is provided.
Workshop on Survey Methods in Education Research: Facilitator's Guide and Resources. REL 2017-214
ERIC Educational Resources Information Center
Walston, Jill; Redford, Jeremy; Bhatt, Monica P.
2017-01-01
This Workshop on Survey Methods in Education Research tool consists of a facilitator guide and workshop handouts. The toolkit is intended for use by state or district education leaders and others who want to conduct training on developing and administering surveys. The facilitator guide provides materials related to various phases of the survey…
ERIC Educational Resources Information Center
Weurlander, Maria; Soderberg, Magnus; Scheja, Max; Hult, Hakan; Wernerson, Annika
2012-01-01
This study aims to provide a greater insight into how formative assessments are experienced and understood by students. Two different formative assessment methods, an individual, written assessment and an oral group assessment, were components of a pathology course within a medical curriculum. In a cohort of 70 students, written accounts were…
An improved strategy for regression of biophysical variables and Landsat ETM+ data.
Warren B. Cohen; Thomas K. Maiersperger; Stith T. Gower; David P. Turner
2003-01-01
Empirical models are important tools for relating field-measured biophysical variables to remote sensing data. Regression analysis has been a popular empirical method of linking these two types of data to provide continuous estimates for variables such as biomass, percent woody canopy cover, and leaf area index (LAI). Traditional methods of regression are not...
ERIC Educational Resources Information Center
Dooly, Melinda; O'Dowd, Robert
2012-01-01
This book provides an accessible introduction to some of the methods and theoretical approaches for investigating foreign language (FL) interaction and exchange in online environments. Research approaches which can be applied to Computer-Mediated Communication (CMC) are outlined, followed by discussion of the way in which tools and techniques for…
Towards "Inverse" Character Tables? A One-Step Method for Decomposing Reducible Representations
ERIC Educational Resources Information Center
Piquemal, J.-Y.; Losno, R.; Ancian, B.
2009-01-01
In the framework of group theory, a new procedure is described for a one-step automated reduction of reducible representations. The matrix inversion tool, provided by standard spreadsheet software, is applied to the central part of the character table that contains the characters of the irreducible representation. This method is not restricted to…
ERIC Educational Resources Information Center
Benson, Tammy; Cotabish, Alicia
2014-01-01
Throughout the evolution of education, various methods of teacher training have emerged to provide general professional development to educators. After trial and error, forms of coaching, including peer coaching, emerged as one of several operational training tools and has been a recommended method of teacher development in recent years (Cotabish…
Technical background of the FireLine Assessment MEthod (FLAME)
Jim Bishop
2007-01-01
The FireLine Assessment MEthod (FLAME) provides a fireline-practical tool for predicting significant changes in fire rate-of-spread (ROS). FLAME addresses the dominant drivers of large, short-term change: effective windspeed, fuel type, and fine-fuel moisture. Primary output is the ROS-ratio, expressing the degree of change in ROS. The application process guides and...
ERIC Educational Resources Information Center
Mandel, Lauren Heather
2012-01-01
Wayfinding is the method by which humans orient and navigate in space, and particularly in built environments such as cities and complex buildings, including public libraries. In order to wayfind successfully in the built environment, humans need information provided by wayfinding systems and tools, for instance architectural cues, signs, and…
Database Design Learning: A Project-Based Approach Organized through a Course Management System
ERIC Educational Resources Information Center
Dominguez, Cesar; Jaime, Arturo
2010-01-01
This paper describes an active method for database design learning through practical tasks development by student teams in a face-to-face course. This method integrates project-based learning, and project management techniques and tools. Some scaffolding is provided at the beginning that forms a skeleton that adapts to a great variety of…
Quantifying Solar Cell Cracks in Photovoltaic Modules by Electroluminescence Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spataru, Sergiu; Hacke, Peter; Sera, Dezso
2015-06-14
This article proposes a method for quantifying the percentage of partially and totally disconnected solar cell cracks by analyzing electroluminescence images of the photovoltaic module taken under high- and low-current forward bias. The method is based on the analysis of the module's electroluminescence intensity distribution, applied at module and cell level. These concepts are demonstrated on a crystalline silicon photovoltaic module that was subjected to several rounds of mechanical loading and humidity-freeze cycling, causing increasing levels of solar cell cracks. The proposed method can be used as a diagnostic tool to rate cell damage or quality of modules after transportation. Moreover, the method can be automated and used in quality control for module manufacturers, installers, or as a diagnostic tool by plant operators and diagnostic service providers.
Object oriented studies into artificial space debris
NASA Technical Reports Server (NTRS)
Adamson, J. M.; Marshall, G.
1988-01-01
A prototype simulation is being developed under contract to the Royal Aerospace Establishment (RAE), Farnborough, England, to assist in the discrimination of artificial space objects/debris. The methodology undertaken has been to link Object Oriented programming, intelligent knowledge based system (IKBS) techniques and advanced computer technology with numeric analysis to provide a graphical, symbolic simulation. The objective is to provide an additional layer of understanding on top of conventional classification methods. Use is being made of object and rule based knowledge representation, multiple reasoning, truth maintenance and uncertainty. Software tools being used include Knowledge Engineering Environment (KEE) and SymTactics for knowledge representation. Hooks are being developed within the SymTactics framework to incorporate mathematical models describing orbital motion and fragmentation. Penetration and structural analysis can also be incorporated. SymTactics is an Object Oriented discrete event simulation tool built as a domain specific extension to the KEE environment. The tool provides facilities for building, debugging and monitoring dynamic (military) simulations.
Parks, W.S.; Carmichael, J.K.; Mirecki, J.E.
1993-01-01
Direct Push Technology (DPT) and a modified-auger method of sampling were used at an abandoned wood-preserving plant site at Jackson, Tennessee, to collect lithologic data and ground-water samples in an area known to be affected by a subsurface creosote plume. The ground-water samples were analyzed using (1) gas chromatography with photo-ionization detection (GC/PID), (2) high-performance liquid chromatography (HPLC), (3) colorimetric phenol analysis, and (4) toxicity bioassay. DPT piezocone and cone-penetrometer-type tools provided lithologic data and ground-water samples at two onsite stations to a depth of refusal of about 35 feet below land surface. With the assistance of an auger rig, this depth was extended to about 65 feet by pushing the tools in advance of the augers. Following the DPT work, a modified-auger method was tested by the USGS. This method left doubt as to the integrity of the samples collected once zones of contamination were penetrated. The GC/PID and HPLC methods of water-quality analysis provided the most data concerning contaminants in the ground water and proved to be the most effective in creosote plume detection. Analyses from these methods showed that the highest concentrations of contaminants were detected at depths less than about 35 feet below land surface. Phenol analyses provided data supplemental to the HPLC analyses. Bioassay data indicated that toxicity associated with the plume extended to depths of about 55 feet below land surface.
Implementation of Health Insurance Support Tools in Community Health Centers.
Huguet, Nathalie; Hatch, Brigit; Sumic, Aleksandra; Tillotson, Carrie; Hicks, Elizabeth; Nelson, Joan; DeVoe, Jennifer E
2018-01-01
Health information technology (HIT) provides new opportunities for primary care clinics to support patients with health insurance enrollment and maintenance. We present strategies, early findings, and clinic reflections on the development and implementation of HIT tools designed to streamline and improve health insurance tracking at community health centers. We are conducting a hybrid implementation-effectiveness trial to assess novel health insurance enrollment and support tools in primary care clinics. Twenty-three clinics in 7 health centers from the OCHIN practice-based research network are participating in the implementation component of the trial. Participating health centers were randomized to 1 of 2 levels of implementation support, including arm 1 (n = 4 health centers, 11 clinic sites) that received HIT tools and educational materials and arm 2 (n = 3 health centers, 12 clinic sites) that received HIT tools, educational materials, and individualized implementation support with a practice coach. We used mixed-methods (qualitative and quantitative) to assess tool use rates and facilitators and barriers to implementation in the first 6 months. Clinics reported favorable attitudes toward the HIT tools, which replace less efficient and more cumbersome processes, and reflect on the importance of clinic engagement in tool development and refinement. Five of 7 health centers are now regularly using the tools and are actively working to increase tool use. Six months after formal implementation, arm 2 clinics demonstrated higher rates of tool use, compared with arm 1. These results highlight the value of early clinic input in tool development, the potential benefit of practice coaching during HIT tool development and implementation, and a novel method for coupling a hybrid implementation-effectiveness design with principles of improvement science in primary care research. © Copyright 2018 by the American Board of Family Medicine.
Software-engineering challenges of building and deploying reusable problem solvers.
O'Connor, Martin J; Nyulas, Csongor; Tu, Samson; Buckeridge, David L; Okhmatovskaia, Anna; Musen, Mark A
2009-11-01
Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task-method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach.
Safety assessment tool for construction zone work phasing plans
DOT National Transportation Integrated Search
2016-05-01
The Highway Safety Manual (HSM) is the compilation of national safety research that provides quantitative methods for : analyzing highway safety. The HSM presents crash modification functions related to freeway work zone characteristics such as : wor...
Prioritizing Health: A Systematic Approach to Scoping Determinants in Health Impact Assessment.
McCallum, Lindsay C; Ollson, Christopher A; Stefanovic, Ingrid L
2016-01-01
The determinants of health are those factors that have the potential to affect health, either positively or negatively, and include a range of personal, social, economic, and environmental factors. In the practice of health impact assessment (HIA), the stage at which the determinants of health are considered for inclusion is during the scoping step. The scoping step is intended to identify how the HIA will be carried out and to set the boundaries (e.g., temporal and geographical) for the assessment. There are several factors that can help to inform the scoping process, many of which are considered in existing HIA tools and guidance; however, a systematic method of prioritizing determinants was found to be lacking. In order to analyze existing HIA scoping tools that are available, a systematic literature review was conducted, including both primary and gray literature. A total of 10 HIA scoping tools met the inclusion/exclusion criteria and were carried forward for comparative analysis. The analysis focused on minimum elements and practice standards of HIA scoping that have been established in the field. The analysis determined that existing approaches lack a clear, systematic method of prioritization of health determinants for inclusion in HIA. This finding led to the development of a Systematic HIA Scoping tool that addressed this gap. The decision matrix tool uses factors, such as impact, public concern, and data availability, to prioritize health determinants. Additionally, the tool allows for identification of data gaps and provides a transparent method for budget allocation and assessment planning. In order to increase efficiency and improve utility, the tool was programed into Microsoft Excel. Future work in the area of HIA methodology development is vital to the ongoing success of the practice and utilization of HIA as a reliable decision-making tool.
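A decision-matrix prioritization of this kind is straightforward to express in code. The sketch below scores a few example determinants against weighted criteria; the criteria, weights, and scores are placeholders for illustration, not the published tool's scheme.

```python
import pandas as pd

# Criteria and weights are illustrative placeholders, not the published scheme.
WEIGHTS = {"impact": 0.5, "public_concern": 0.3, "data_availability": 0.2}

matrix = pd.DataFrame(
    {"impact":            [5, 3, 2],
     "public_concern":    [4, 5, 1],
     "data_availability": [3, 2, 5]},
    index=["air quality", "noise", "employment"])   # candidate determinants

matrix["priority"] = sum(w * matrix[c] for c, w in WEIGHTS.items())
print(matrix.sort_values("priority", ascending=False))
# A low data_availability score would also flag a data gap during scoping.
```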
Optimization of Low-Thrust Spiral Trajectories by Collocation
NASA Technical Reports Server (NTRS)
Falck, Robert D.; Dankanich, John W.
2012-01-01
As NASA examines potential missions in the post space shuttle era, there has been a renewed interest in low-thrust electric propulsion for both crewed and uncrewed missions. While much progress has been made in the field of software for the optimization of low-thrust trajectories, many of the tools utilize higher-fidelity methods which, while excellent, result in extremely high run-times and poor convergence when dealing with planetocentric spiraling trajectories deep within a gravity well. Conversely, faster tools like SEPSPOT provide a reasonable solution but typically fail to account for other forces such as third-body gravitation, aerodynamic drag, and solar radiation pressure. SEPSPOT is further constrained by its solution method, which may require a very good guess to yield a converged optimal solution. Here the authors have developed an approach using collocation intended to provide solution times comparable to those given by SEPSPOT while allowing for greater robustness and extensible force models.
NASA Astrophysics Data System (ADS)
Xiong, Yanmei; Zhang, Yuyan; Rong, Pengfei; Yang, Jie; Wang, Wei; Liu, Dingbin
2015-09-01
We developed a simple high-throughput colorimetric assay to detect glucose based on the glucose oxidase (GOx)-catalysed enlargement of gold nanoparticles (AuNPs). Compared with the currently available glucose kit method, the AuNP-based assay provides higher clinical sensitivity at lower cost, indicating its great potential to be a powerful tool for clinical screening of glucose. Electronic supplementary information (ESI) available: Experimental section and additional figures. See DOI: 10.1039/c5nr03758a
Portfolios in Saudi medical colleges
Fida, Nadia M.; Shamim, Muhammad S.
2016-01-01
Over recent decades, the use of portfolios in medical education has evolved, and is being applied in undergraduate and postgraduate programs worldwide. Portfolios, as a learning process and method of documenting and assessing learning, is supported as a valuable tool by adult learning theories that stress the need for learners to be self-directed and to engage in experiential learning. Thoughtfully implemented, a portfolio provides learning experiences unequaled by any single learning tool. The credibility (validity) and dependability (reliability) of assessment through portfolios have been questioned owing to its subjective nature; however, methods to safeguard these features have been described in the literature. This paper discusses some of this literature, with particular attention to the role of portfolios in relation to self-reflective learning, provides an overview of current use of portfolios in undergraduate medical education in Saudi Arabia, and proposes research-based guidelines for its implementation and other similar contexts. PMID:26905344
DRS: Derivational Reasoning System
NASA Technical Reports Server (NTRS)
Bose, Bhaskar
1995-01-01
The high reliability requirements for airborne systems require fault-tolerant architectures to address failures in the presence of physical faults, and the elimination of design flaws during the specification and validation phase of the design cycle. Although much progress has been made in developing methods to address physical faults, design flaws remain a serious problem. Formal methods provide a mathematical basis for removing design flaws from digital systems. DRS (Derivational Reasoning System) is a formal design tool based on advanced research in mathematical modeling and formal synthesis. The system implements a basic design algebra for synthesizing digital circuit descriptions from high-level functional specifications. DRS incorporates an executable specification language, a set of correctness-preserving transformations, a verification interface, and a logic synthesis interface, making it a powerful tool for realizing hardware from abstract specifications. DRS integrates recent advances in transformational reasoning, automated theorem proving, and high-level CAD synthesis systems in order to provide enhanced reliability in designs with reduced time and cost.
White, Sarah A; van den Broek, Nynke R
2004-05-30
Before introducing a new measurement tool it is necessary to evaluate its performance. Several statistical methods have been developed, or used, to evaluate the reliability and validity of a new assessment method in such circumstances. In this paper we review some commonly used methods. Data from a study that was conducted to evaluate the usefulness of a specific measurement tool (the WHO Colour Scale) are then used to illustrate the application of these methods. The WHO Colour Scale was developed under the auspices of the WHO to provide a simple, portable, and reliable method of detecting anaemia. The Colour Scale is a discrete interval scale, whereas the actual haemoglobin values it is used to estimate lie on a continuous interval scale and can be measured accurately using electrical laboratory equipment. The methods we consider are: linear regression; correlation coefficients; paired t-tests; plotting differences against mean values and deriving limits of agreement; kappa and weighted kappa statistics; sensitivity and specificity; an intraclass correlation coefficient; and the repeatability coefficient. We note that although the definitions and properties of each of these methods are well established, inappropriate methods continue to be used in the medical literature for assessing reliability and validity, as evidenced in the context of the evaluation of the WHO Colour Scale. Copyright 2004 John Wiley & Sons, Ltd.
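As a concrete illustration of two of the methods listed, the following sketch computes Bland-Altman limits of agreement and sensitivity/specificity on synthetic haemoglobin data. The data, the 11 g/dL anaemia cutoff, and the 2 g/dL scale spacing are assumptions made for the example, not values from the study.

```python
# A minimal sketch of two agreement methods from the list above, run on
# synthetic (invented) haemoglobin data, not the study's actual data.
import numpy as np

rng = np.random.default_rng(0)
lab = rng.uniform(6.0, 14.0, size=100)          # "true" lab haemoglobin, g/dL
# Discrete scale reading: assumed 2 g/dL steps plus measurement error.
scale = np.round(np.clip(lab + rng.normal(0, 1.0, 100), 4, 14) / 2) * 2

# Bland-Altman: differences against means, then 95% limits of agreement.
diff = scale - lab
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
print(f"bias = {bias:.2f} g/dL, 95% limits of agreement = "
      f"({loa[0]:.2f}, {loa[1]:.2f})")

# Sensitivity and specificity for detecting anaemia (cutoff assumed 11 g/dL).
truth = lab < 11.0
test = scale < 11.0
sensitivity = (test & truth).sum() / truth.sum()
specificity = (~test & ~truth).sum() / (~truth).sum()
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```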
Note: A phase synchronization photography method for AC discharge.
Wu, Zhicheng; Zhang, Qiaogen; Ma, Jingtan; Pang, Lei
2018-05-01
To research discharge physics under AC voltage, a phase synchronization photography method is presented. By using a permanent-magnet synchronous motor to drive a photography mask synchronized with a discharge power supply, discharge images in a specific phase window can be recorded. Some examples of discharges photographed by this method, including the corona discharge in SF6 and the corona discharge along the air/epoxy surface, demonstrate the feasibility of this method. Therefore, this method provides an effective tool for discharge physics researchers.
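The phase-window arithmetic behind such a setup can be sketched in a few lines; the supply frequency, pole count, and aperture width below are hypothetical values chosen only to show the relationship.

```python
# A small sketch of the phase-window arithmetic in such a setup. All numbers
# (supply frequency, pole pairs, slot width) are hypothetical illustrations.
f_supply = 50.0                      # AC supply frequency, Hz (assumed)
pole_pairs = 1                       # synchronous motor pole pairs (assumed)
f_mask = f_supply / pole_pairs       # mask then rotates once per AC cycle
slot_deg = 18.0                      # angular width of mask aperture (assumed)

# The aperture exposes the camera during a fixed slice of each AC cycle.
window_fraction = slot_deg / 360.0
window_ms = window_fraction * 1000.0 / f_mask
print(f"phase window: {slot_deg} deg of each cycle = {window_ms:.2f} ms")
# Rotating the mask's mounting angle shifts which phase of the AC waveform
# is photographed, without changing the window width.
```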
Qualitative tools and experimental philosophy.
Andow, James
2016-11-16
Experimental philosophy brings empirical methods to philosophy. These methods are used to probe how people think about philosophically interesting things such as knowledge, morality, and freedom. This paper explores the contribution that qualitative methods have to make in this enterprise. I argue that qualitative methods have the potential to make a much greater contribution than they have so far. Along the way, I acknowledge a few types of resistance that proponents of qualitative methods in experimental philosophy might encounter, and provide reasons to think they are ill-founded.
NASA Astrophysics Data System (ADS)
Matthews, L.; Gurrola, H.
2015-12-01
Typical petrophysical well log correlation is accomplished by manual pattern recognition, leading to subjective correlations. The change in character in a well log reflects the change in the response of the tool to lithology, and the petrophysical interpreter looks for a change in one log type that corresponds to the way a different tool responds to the same lithology. To develop an objective way to pick changes in well log characteristics, we adapt a method of first-arrival picking used on seismic data to analyze changes in the character of well logs: the fractal method developed by Boschetti et al. [1]. This method worked better than we expected, and we found similar changes in the fractal dimension across very different tool types (sonic vs. density vs. gamma ray). We reason that the fractal response of the log does not depend on the physics of the tool response but rather on the change in the complexity of the log data. When a formation changes physical character in time or space, the recorded magnitudes change complexity at the same point, even when the underlying tool responses are very different; the relative complexity of the data, regardless of the tool used, depends on the complexity of the medium relative to the tool measurement. The character we are measuring is the roughness, or complexity, of the petrophysical curve. Our method therefore provides a way to directly compare different log types based on a quantitative change in signal complexity. For example, changes in data complexity allow us to correlate gamma ray suites with sonic logs within a well and then across to an adjacent well with similar signatures. Our method allows reliable, automatic correlations to be made in data sets beyond the reasonable cognitive limits of geoscientists, in both speed and consistency of pattern recognition. [1] Fabio Boschetti, Mike D. Dentith, and Ron D. List (1996). A fractal-based algorithm for detecting first arrivals on seismic traces. Geophysics, Vol. 61, No. 4, pp. 1095-1102.
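To illustrate the general idea (not Boschetti et al.'s exact algorithm), the sketch below estimates a fractal dimension in sliding windows along a synthetic log curve using Higuchi's estimator; a jump in the resulting profile marks a change in signal complexity of the kind the authors correlate across tools.

```python
# A hedged sketch of the approach: estimate a fractal dimension in a sliding
# window along a log curve and look for abrupt changes. Higuchi's estimator
# is used here for simplicity; the data are synthetic.
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension of a 1-D series."""
    n = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # normalized curve length for this offset/stride
            l = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((len(idx) - 1) * k * k)
            lengths.append(l)
        lk.append(np.mean(lengths))
    # slope of log L(k) vs log(1/k) estimates the fractal dimension
    coeffs = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lk), 1)
    return coeffs[0]

def fd_profile(log_curve, window=64, step=8):
    """Fractal dimension in sliding windows along a well log."""
    depths, fds = [], []
    for start in range(0, len(log_curve) - window, step):
        depths.append(start + window // 2)
        fds.append(higuchi_fd(log_curve[start:start + window]))
    return np.array(depths), np.array(fds)

# Synthetic log: a smooth interval followed by a rough one (invented data).
rng = np.random.default_rng(1)
smooth = np.cumsum(rng.normal(0, 0.1, 500))   # Brownian-like, FD near 1.5
rough = rng.normal(0, 1.0, 500)               # white-noise-like, FD near 2
depths, fds = fd_profile(np.concatenate([smooth, rough]))
print("FD near top:", fds[:3], " FD near bottom:", fds[-3:])
```

The point of the sketch is that the profile responds to complexity alone, so the same computation can be applied unchanged to sonic, density, or gamma ray curves.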
Solernou, Albert
2018-01-01
Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package. PMID:29570700
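The core idea, finite element elasticity driven by stochastic thermal noise, can be caricatured in a few lines. The sketch below is a toy overdamped Langevin update on a one-dimensional chain of nodes; it is not FFEA's actual formulation, and every parameter value is assumed for illustration.

```python
# A toy illustration (not FFEA's implementation) of elastic forces plus
# stochastic thermal noise, advanced with an overdamped Langevin scheme.
# All parameter values are hypothetical.
import numpy as np

kT = 4.11e-21        # thermal energy near 300 K, J
gamma = 1e-8         # drag coefficient per node, kg/s (assumed)
k_el = 1e-3          # linear spring stiffness between neighbours, N/m (assumed)
dt = 1e-9            # time step, s (assumed)
rng = np.random.default_rng(2)

# A 1-D chain of nodes stands in for a volumetric mesh.
x = np.linspace(0, 90e-9, 10)        # node positions, m
rest = np.diff(x).copy()             # rest lengths of the "elements"

for _ in range(1000):
    # elastic force from neighbouring elements (linear springs)
    ext = np.diff(x) - rest
    f = np.zeros_like(x)
    f[:-1] += k_el * ext
    f[1:] -= k_el * ext
    # drift plus thermal kick, consistent with fluctuation-dissipation
    x += f * dt / gamma + np.sqrt(2 * kT * dt / gamma) * rng.normal(size=x.size)

print("mean element strain after run:", (np.diff(x) - rest).mean())
```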
Peter R. Robichaud
1997-01-01
Geostatistics provides a method to describe the spatial continuity of many natural phenomena. Spatial models are based upon the concepts of scaling, kriging, and conditional simulation. These techniques were used to describe the spatially varied surface conditions on timber harvest and burned hillslopes. Geostatistical techniques provided estimates of the ground cover (...
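As a minimal illustration of the geostatistical starting point, the sketch below computes an empirical semivariogram from synthetic ground-cover samples along a transect. The sampling layout and cover values are invented for the example; kriging would then fit a model to this curve.

```python
# A minimal sketch of the first step in such an analysis: an empirical
# semivariogram of ground-cover measurements along a hillslope transect.
# The data are synthetic; the field layout and values are assumed.
import numpy as np

rng = np.random.default_rng(3)
pos = np.sort(rng.uniform(0, 100, 80))                      # locations, m
cover = 50 + 20 * np.sin(pos / 15) + rng.normal(0, 5, 80)   # % ground cover

def empirical_variogram(pos, z, bins):
    """gamma(h) = mean of 0.5*(z_i - z_j)^2 over pairs in each lag bin."""
    i, j = np.triu_indices(len(pos), k=1)
    h = np.abs(pos[i] - pos[j])
    sq = 0.5 * (z[i] - z[j]) ** 2
    idx = np.digitize(h, bins)
    return np.array([sq[idx == b].mean() for b in range(1, len(bins))])

bins = np.arange(0, 60, 10)
gamma = empirical_variogram(pos, cover, bins)
for lo, g in zip(bins[:-1], gamma):
    print(f"lag {lo:2.0f}-{lo+10:2.0f} m: semivariance = {g:6.1f}")
```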
A Systematic Approach to Educating the Emerging Adult Learner in Undergraduate Management Courses
ERIC Educational Resources Information Center
Dachner, Alison M.; Polin, Beth
2016-01-01
Management education research has provided educators with new instructional tools to improve course design and update the methods used in the classroom. In an effort to provide the typical undergraduate management student with the best possible learning experience and outcomes, it is important to recognize how and why these new activities benefit…
Developing Portfolios in Education: A Guide to Reflection, Inquiry, and Assessment [with CD-ROM]
ERIC Educational Resources Information Center
Johnson, Ruth S.; Mims, J. Sabrina; Doyle-Nichols, Adelaide
2006-01-01
Within a conceptual and research framework about the usefulness of portfolios, this book suggests methods to organize the process, and provides tools that will be used not only during preparation programs but also for professional and academic advancement. Key features include: (1) Provides a conceptual framework for portfolio development: Readers…
Developing an undue influence screening tool for Adult Protective Services.
Quinn, Mary Joy; Nerenberg, Lisa; Navarro, Adria E; Wilber, Kathleen H
2017-03-01
The purpose of the study was to develop and pilot an undue influence screening tool for California's Adult Protective Services (APS) personnel, based on the definition of undue influence enacted into California law on January 1, 2014. Methods included four focus groups with APS providers (n = 33), piloting of the preliminary tool by APS personnel (n = 15), and interviews with four elder abuse experts and two APS administrators. The social service literature, including existing undue influence models, was reviewed, as were existing screening and assessment tools. Using information from these sources, the California Undue Influence Screening Tool (CUIST) was developed. It can be applied to APS cases and potentially adapted for use by other professionals and in other states. Implementation of the tool into APS practice, policy, procedures, and personnel training will depend on the initiative of APS management. Future work will need to address the reliability and validity of CUIST.
Using Petri Net Tools to Study Properties and Dynamics of Biological Systems
Peleg, Mor; Rubin, Daniel; Altman, Russ B.
2005-01-01
Petri Nets (PNs) and their extensions are promising methods for modeling and simulating biological systems. We surveyed PN formalisms and tools and compared them based on their mathematical capabilities as well as on their appropriateness for representing typical biological processes. We measured the ability of these tools to model specific features of biological systems and to answer a set of biological questions that we defined. We found that different tools are required to provide all the capabilities that we assessed. We created software to translate a generic PN model into most of the formalisms and tools discussed. We have also made available three models, and suggest that a library of such models would catalyze progress in qualitative modeling via PNs. Development and wide adoption of common formats would enable researchers to share models and to use different tools to analyze them without the need to convert to proprietary formats. PMID:15561791
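A Petri net of the kind surveyed can be executed in a few lines. The sketch below runs a toy enzyme-substrate-product net; the structure and token counts are illustrative of the formalism, not taken from any model in the paper.

```python
# A minimal Petri net executing a biological-style process:
# enzyme + substrate -> complex -> enzyme + product. Places hold token
# counts; a transition may fire when every input place has enough tokens.
import random

places = {"E": 1, "S": 5, "ES": 0, "P": 0}
# transitions: (inputs, outputs), each a {place: weight} map
transitions = {
    "bind":    ({"E": 1, "S": 1}, {"ES": 1}),
    "convert": ({"ES": 1},        {"E": 1, "P": 1}),
}

def enabled(t):
    ins, _ = transitions[t]
    return all(places[p] >= w for p, w in ins.items())

def fire(t):
    ins, outs = transitions[t]
    for p, w in ins.items():
        places[p] -= w
    for p, w in outs.items():
        places[p] += w

random.seed(4)
while any(enabled(t) for t in transitions):
    t = random.choice([t for t in transitions if enabled(t)])
    fire(t)
print(places)   # all substrate converted: expect P == 5
```

Qualitative questions of the sort the authors pose (e.g., reachability of a marking, or whether a species is ever exhausted) reduce to analyses over exactly this firing semantics.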
Pointo - a Low Cost Solution to Point Cloud Processing
NASA Astrophysics Data System (ADS)
Houshiar, H.; Winkler, S.
2017-11-01
With advances in technology, access to data, especially 3D point cloud data, is becoming an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners, or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field, and usually comes as a large package containing a variety of methods and tools. The result is software that is expensive to acquire and difficult to use; the difficulty is caused by the complicated user interfaces required to accommodate long feature lists. These complex packages aim to provide a powerful tool for a specific group of specialists, but most of their capabilities are not required by the growing population of average point cloud users. In addition to their complexity and high cost, they generally rely on expensive, modern hardware and are often compatible with only one operating system. Many point cloud customers are not point cloud processing experts, nor are they willing to bear the high acquisition costs of such software and hardware. In this paper we introduce a solution for low-cost point cloud processing, designed to accommodate the needs of the average point cloud user. To reduce cost and complexity, our approach focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at once. This simple, user-oriented design improves the user experience and allows us to optimize our methods to create efficient software. We introduce the Pointo family as a series of connected programs that provide easy-to-use tools with a simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotation and documentation to the point clouds.
Mesoscale brain explorer, a flexible python-based image analysis and visualization tool.
Haupt, Dirk; Vanni, Matthieu P; Bolanos, Federico; Mitelut, Catalin; LeDue, Jeffrey M; Murphy, Tim H
2017-07-01
Imaging of mesoscale brain activity is used to map interactions between brain regions. This work has benefited from the pioneering studies of Grinvald et al., who employed optical methods to image brain function by exploiting the properties of intrinsic optical signals and small molecule voltage-sensitive dyes. Mesoscale interareal brain imaging techniques have been advanced by cell targeted and selective recombinant indicators of neuronal activity. Spontaneous resting state activity is often collected during mesoscale imaging to provide the basis for mapping of connectivity relationships using correlation. However, the information content of mesoscale datasets is vast and is only superficially presented in manuscripts given the need to constrain measurements to a fixed set of frequencies, regions of interest, and other parameters. We describe a new open source tool written in python, termed mesoscale brain explorer (MBE), which provides an interface to process and explore these large datasets. The platform supports automated image processing pipelines with the ability to assess multiple trials and combine data from different animals. The tool provides functions for temporal filtering, averaging, and visualization of functional connectivity relations using time-dependent correlation. Here, we describe the tool and show applications, where previously published datasets were reanalyzed using MBE.
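The kind of analysis such a tool automates can be sketched briefly: the example below computes a seed-based correlation map over a synthetic (time, height, width) imaging stack after simple temporal filtering. The array shapes, the moving-average filter, and the function names are assumptions for illustration, not MBE's API.

```python
# A small sketch of seed-based functional connectivity mapping over a
# mesoscale imaging stack. The stack is random synthetic data of shape
# (time, height, width); all names and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(5)
stack = rng.normal(size=(500, 32, 32))           # (T, H, W) imaging frames

# Temporal smoothing with a moving average, a stand-in for band filtering.
kernel = np.ones(5) / 5
filtered = np.apply_along_axis(
    lambda s: np.convolve(s, kernel, "same"), 0, stack)

def seed_correlation_map(stack, seed_rc):
    """Pearson correlation of every pixel's time course with the seed's."""
    T = stack.shape[0]
    flat = stack.reshape(T, -1)
    flat = (flat - flat.mean(0)) / flat.std(0)   # z-score each pixel
    seed = flat[:, seed_rc[0] * stack.shape[2] + seed_rc[1]]
    return (flat * seed[:, None]).mean(0).reshape(stack.shape[1:])

cmap = seed_correlation_map(filtered, (16, 16))
print("seed autocorrelation:", cmap[16, 16])     # should be ~1.0
```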
Exploratory Climate Data Visualization and Analysis Using DV3D and UVCDAT
NASA Technical Reports Server (NTRS)
Maxwell, Thomas
2012-01-01
Earth system scientists are being inundated by an explosion of data generated by ever-increasing resolution in both global models and remote sensors. Advanced tools for accessing, analyzing, and visualizing very large and complex climate data are required to maintain rapid progress in Earth system research. To meet this need, NASA, in collaboration with the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) consortium, is developing exploratory climate data analysis and visualization tools which provide data analysis capabilities for the Earth System Grid (ESG). This paper describes DV3D, a UV-CDAT package that enables exploratory analysis of climate simulation and observation datasets. DV3D provides user-friendly interfaces for visualization and analysis of climate data at a level appropriate for scientists. It features workflow interfaces, interactive 4D data exploration, hyperwall and stereo visualization, automated provenance generation, and parallel task execution. DV3D's integration with CDAT's climate data management system (CDMS) and other climate data analysis tools provides a wide range of high-performance climate data analysis operations. DV3D expands the scientists' toolbox by incorporating a suite of rich new exploratory visualization and analysis methods for addressing the complexity of climate datasets.