DOE Office of Scientific and Technical Information (OSTI.GOV)
David Fritz, John Floren
2013-08-27
Minimega is a simple emulytics platform for creating testbeds of networked devices. The platform consists of easily deployable tools that facilitate bringing up large networks of virtual machines, including Windows, Linux, and Android. Minimega is designed so that experiments can be brought up quickly with almost no configuration. Minimega also includes tools for simple cluster management, as well as tools for creating Linux-based virtual machine images.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ian Metzger, Jesse Dean
2010-12-31
This software requires inputs of simple water fixture inventory information and calculates the water/energy and cost benefits of various retrofit opportunities. The tool includes water conservation measures for low-flow toilets, low-flow urinals, low-flow faucets, and low-flow showerheads. It calculates water savings, energy savings, demand reduction, cost savings, and building life-cycle costs, including simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool displays the environmental benefits of a project.
Simplified aeroelastic modeling of horizontal axis wind turbines
NASA Technical Reports Server (NTRS)
Wendell, J. H.
1982-01-01
Certain aspects of the aeroelastic modeling and behavior of the horizontal axis wind turbine (HAWT) are examined. Two simple three degree of freedom models are described in this report, and tools are developed which allow other simple models to be derived. The first simple model developed is an equivalent hinge model to study the flap-lag-torsion aeroelastic stability of an isolated rotor blade. The model includes nonlinear effects, preconing, and noncoincident elastic axis, center of gravity, and aerodynamic center. A stability study is presented which examines the influence of key parameters on aeroelastic stability. Next, two general tools are developed to study the aeroelastic stability and response of a teetering rotor coupled to a flexible tower. The first of these tools is an aeroelastic model of a two-bladed rotor on a general flexible support. The second general tool is a harmonic balance solution method for the resulting second order system with periodic coefficients. The second simple model developed is a rotor-tower model which serves to demonstrate the general tools. This model includes nacelle yawing, nacelle pitching, and rotor teetering. Transient response time histories are calculated and compared to a similar model in the literature. Agreement between the two is very good, especially considering how few harmonics are used. Finally, a stability study is presented which examines the effects of support stiffness and damping, inflow angle, and preconing.
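For readers unfamiliar with the method, the harmonic balance solution described above can be sketched in generic notation (the symbols below are illustrative, not the report's own):

```latex
% Second-order system with periodic coefficients (period 2\pi in the azimuth \psi):
\[
M(\psi)\,\ddot{x} + C(\psi)\,\dot{x} + K(\psi)\,x = F(\psi),
\qquad M(\psi + 2\pi) = M(\psi),\ \text{etc.}
\]
% Harmonic balance assumes a truncated Fourier series for the response:
\[
x(\psi) \approx a_0 + \sum_{n=1}^{N}\left(a_n \cos n\psi + b_n \sin n\psi\right)
\]
% Substituting the ansatz and matching the coefficients of each harmonic converts
% the periodic ODE into an algebraic system for a_0, a_n, b_n; good agreement at
% small N is why convergence "with few harmonics" is a meaningful check.
```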
van Rhee, Henk; Hak, Tony
2017-01-01
We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. PMID:28801932
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crussell, Jonathan; Erickson, Jeremy; Fritz, David
minimega is an emulytics platform for creating testbeds of networked devices. The platform consists of easily deployable tools to facilitate bringing up large networks of virtual machines including Windows, Linux, and Android. minimega allows experiments to be brought up quickly with almost no configuration. minimega also includes tools for simple cluster management, as well as tools for creating Linux-based virtual machines. This release of minimega includes new emulated sensors for Android devices to improve the fidelity of testbeds that include mobile devices. Emulated sensors include GPS and…
Exposure assessment in health assessments for hand-arm vibration syndrome.
Mason, H J; Poole, K; Young, C
2011-08-01
Assessing past cumulative vibration exposure is part of assessing the risk of hand-arm vibration syndrome (HAVS) in workers exposed to hand-arm vibration and invariably forms part of a medical assessment of such workers. The aim was to investigate the strength of relationships between the presence and severity of HAVS and different cumulative exposure metrics obtained from a self-reporting questionnaire. Cumulative exposure metrics were constructed from a tool-based questionnaire applied in a group of HAVS referrals and workplace field studies. These metrics included simple years of vibration exposure, cumulative total hours of all tool use, and differing combinations of acceleration magnitudes for specific tools and their daily use, including the current frequency-weighting method contained in ISO 5349-1:2001. Simple years of exposure proved a weak predictor of HAVS or its increasing severity. Cumulative hours across all vibrating tools used is a more powerful predictor. More complex calculations based on likely acceleration data for specific classes of tools, whether frequency-weighted or not, did not offer a clear further advantage in this dataset. This may be due to the uncertainty associated with workers' recall of their past tool usage or the variability between tools in the magnitude of their vibration emission. Assessing years of exposure or 'latency' in a worker should be replaced by cumulative hours of tool use, which can be readily obtained using a tool-pictogram-based self-reporting questionnaire and a simple spreadsheet calculation.
Cloud-Based Tools to Support High-Resolution Modeling (Invited)
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Swain, N.; Christensen, S.
2013-12-01
The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of the roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52°North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open-source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archival, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.
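To illustrate the kind of batch fan-out HTCondor enables in such a stack, here is a minimal, hypothetical sketch: it writes a basic HTCondor submit description file and queues many scenario runs through the standard condor_submit command. The executable name, file names, and scenario count are placeholders, not details from the CI-WATER project.

```python
import os
import subprocess

os.makedirs("logs", exist_ok=True)

# Hypothetical fan-out of 100 stochastic scenario runs; run_model.sh is a placeholder.
submit_description = """\
executable = run_model.sh
arguments  = scenario_$(Process).json
output     = logs/run_$(Process).out
error      = logs/run_$(Process).err
log        = logs/cluster.log
queue 100
"""

with open("scenarios.sub", "w") as f:
    f.write(submit_description)

# condor_submit is HTCondor's standard job-submission command.
subprocess.run(["condor_submit", "scenarios.sub"], check=True)
```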
Suurmond, Robert; van Rhee, Henk; Hak, Tony
2017-12-01
We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
Bhalla, Kavi; Harrison, James E
2016-04-01
Burden of disease and injury methods can be used to summarise and compare the effects of conditions in terms of disability-adjusted life years (DALYs). Burden estimation methods are not inherently complex. However, as commonly implemented, the methods include complex modelling and estimation. To provide a simple and open-source software tool that allows estimation of incidence-DALYs due to injury, given data on incidence of deaths and non-fatal injuries. The tool includes a default set of estimation parameters, which can be replaced by users. The tool was written in Microsoft Excel. All calculations and values can be seen and altered by users. The parameter sets currently used in the tool are based on published sources. The tool is available without charge online at http://calculator.globalburdenofinjuries.org. To use the tool with the supplied parameter sets, users need to only paste a table of population and injury case data organised by age, sex and external cause of injury into a specified location in the tool. Estimated DALYs can be read or copied from tables and figures in another part of the tool. In some contexts, a simple and user-modifiable burden calculator may be preferable to undertaking a more complex study to estimate the burden of disease. The tool and the parameter sets required for its use can be improved by user innovation, by studies comparing DALYs estimates calculated in this way and in other ways, and by shared experience of its use. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
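As context for what such a calculator computes: incidence-based DALYs are years of life lost (YLL) plus years lived with disability (YLD). A minimal Python sketch using the standard simplified formulas, with no discounting or age weighting; all numbers are illustrative placeholders, not the tool's published parameter sets.

```python
def dalys(deaths: float, life_expectancy: float,
          incident_cases: float, disability_weight: float,
          avg_duration_years: float) -> float:
    """Simplified incidence-based DALYs = YLL + YLD (no discounting/age weights)."""
    yll = deaths * life_expectancy                            # years of life lost
    yld = incident_cases * disability_weight * avg_duration_years
    return yll + yld

# Illustrative numbers only: 10 deaths at a remaining life expectancy of 40 years,
# 500 non-fatal injuries with disability weight 0.1 lasting 0.5 years on average.
print(dalys(10, 40, 500, 0.1, 0.5))  # 400 + 25 = 425 DALYs
```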
Analysis of pre-service physics teachers' skills in designing simple technology-based physics experiments
NASA Astrophysics Data System (ADS)
Susilawati; Huda, C.; Kurniawan, W.; Masturi; Khoiri, N.
2018-03-01
Pre-service physics teachers' skill in designing simple experimental setups is very important for building students' conceptual understanding and practicing scientific skills in the laboratory. This study describes the skills of physics students in designing simple technology-based experiments. The experimental design stages include simple apparatus design and sensor modification. The research method used is a descriptive method, with a sample of 25 students and 5 variations of simple physics experimental design. Based on the results of interviews and observations, the pre-service physics teachers' skill in designing simple technology-based physics experiments is good. The observations show that their skill in designing simple experiments is good, while sensor modification and application are still weak. This suggests that pre-service physics teachers still need substantial practice in designing physics experiments that use sensor modifications. The interviews indicate that students are highly motivated to participate actively in laboratory activities and have strong curiosity to become skilled at building simple practicum tools for physics experiments.
Analytical Tools in School Finance Reform.
ERIC Educational Resources Information Center
Johns, R. L.
This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…
Gravitational Wave Detection in the Introductory Lab
NASA Astrophysics Data System (ADS)
Burko, Lior M.
2017-01-01
Great physics breakthroughs are rarely included in the introductory physics course. General relativity and binary black hole coalescence are no different, and can be included in the introductory course only in a very limited sense. However, we can design activities that directly involve the detection of GW150914, the designation of the gravitational-wave signal detected on September 14, 2015, thereby engaging the students in this exciting discovery directly. The activities naturally do not include the construction of a detector or the detection of gravitational waves. Instead, we designed them to include analysis of the data from GW150914, which offers some interesting analysis activities for students of the introductory course. The same activities can be assigned either as a laboratory exercise or as a computational project for the same population of students. The analysis tools used here are simple and available to the intended student population. They do not include the sophisticated analysis tools that LIGO used to carefully analyze the detected signal. However, these simple tools are sufficient to allow the student to obtain important results. We have successfully assigned this lab project to students of the calculus-based introductory course at Georgia Gwinnett College.
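One example of the kind of analysis such simple tools permit (a sketch of the standard leading-order relation, not necessarily the specific exercise assigned) is estimating the chirp mass from the signal's frequency f and its rate of change:

```latex
% Leading-order (quadrupole) chirp relation and the chirp mass definition:
\[
\dot{f} = \frac{96}{5}\,\pi^{8/3}\left(\frac{G\mathcal{M}}{c^{3}}\right)^{5/3} f^{11/3}
\quad\Longrightarrow\quad
\mathcal{M} = \frac{c^{3}}{G}\left(\frac{5}{96}\,\pi^{-8/3}\, f^{-11/3}\,\dot{f}\right)^{3/5},
\qquad
\mathcal{M} = \frac{(m_1 m_2)^{3/5}}{(m_1 + m_2)^{1/5}}
\]
```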
The Food-Safe Schools Action Guide
ERIC Educational Resources Information Center
Centers for Disease Control and Prevention, 2007
2007-01-01
"The Food-Safe School Needs Assessment and Planning Guide" is a tool that can help schools assess their food safety policies, procedures, and programs and develop plans for improvement. This tool includes a simple, straightforward questionnaire, score card, and planning guide that give administrators, school staff, families, and students a chance…
Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan
2014-01-01
LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784
Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan
2014-12-15
LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.
Software Tools to Support Research on Airport Departure Planning
NASA Technical Reports Server (NTRS)
Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul
2003-01-01
A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.
Anton TenWolde; Mark T. Bomberg
2009-01-01
Overall, despite the lack of exact input data, the use of design tools, including models, is much superior to simply following rules of thumb, and a moisture analysis should be standard procedure for any building envelope design. Exceptions can only be made for buildings in the same climate, with similar occupancy, and similar envelope construction. This chapter…
Remote Control and Data Acquisition: A Case Study
NASA Technical Reports Server (NTRS)
DeGennaro, Alfred J.; Wilkinson, R. Allen
2000-01-01
This paper details software tools developed to remotely command experimental apparatus, and to acquire and visualize the associated data in soft real time. The work was undertaken because commercial products failed to meet the needs. This work has identified six key factors intrinsic to development of quality research laboratory software. Capabilities include access to all new instrument functions without any programming or dependence on others to write drivers or virtual instruments, simple full screen text-based experiment configuration and control user interface, months of continuous experiment run-times, order of 1% CPU load for condensed matter physics experiment described here, very little imposition of software tool choices on remote users, and total remote control from anywhere in the world over the Internet or from home on a 56 Kb modem as if the user is sitting in the laboratory. This work yielded a set of simple robust tools that are highly reliable, resource conserving, extensible, and versatile, with a uniform simple interface.
Nutrition screening tools: an analysis of the evidence.
Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy
2012-05-01
In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and 4 tools received a grade II: the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST). The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.
Little Caney River Prehistory. 1979 Field Season,
1981-01-01
Spurred tool: used as a graver for puncturing soft materials and incising bone or skins. Cutting tool: a purposefully modified flake tool with a working edge... Bifacial implements do occur, including burins (Figure 4j-k) made on thick flake blanks. The debitage includes whole or fragmentary flakes... The pottery is sand tempered and is simple stamped or linearly incised... The ground stone slab/abrader is ground and marred and is from 50 cm...
NASA Astrophysics Data System (ADS)
Hoebelheinrich, N. J.; Lynnes, C.; West, P.; Ferritto, M.
2014-12-01
Two problems common to many geoscience domains are the difficulty of finding tools to work with a given dataset collection and, conversely, the difficulty of finding data for a known tool. A collaborative team from the Earth Science Information Partnership (ESIP) came together to design and create a web service, called ToolMatch, to address these problems. The team began their efforts by defining an initial, relatively simple conceptual model that addressed the two use cases briefly described above. The conceptual model is expressed as an ontology using OWL (Web Ontology Language) and DCterms (Dublin Core Terms), and utilizing standard ontologies such as DOAP (Description of a Project), FOAF (Friend of a Friend), SKOS (Simple Knowledge Organization System) and DCAT (Data Catalog Vocabulary). The ToolMatch service takes advantage of various Semantic Web and Web standards, such as OpenSearch, RESTful web services, SWRL (Semantic Web Rule Language) and SPARQL (SPARQL Protocol and RDF Query Language). The first version of the ToolMatch service was deployed in early fall 2014. While more complete testing is required, a number of communities besides ESIP member organizations have expressed interest in collaborating to create, test and use the service and incorporate it into their own web pages, tools and/or services, including the USGS Data Catalog service, DataONE, the Deep Carbon Observatory, the Virtual Solar Terrestrial Observatory (VSTO), and the U.S. Global Change Research Program. In this session, presenters will discuss the inception and development of the ToolMatch service, the collaborative process used to design, refine, and test the service, and future plans for the service.
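As an illustration of the kind of triple store and SPARQL query such a service builds on, the sketch below uses Python's rdflib; the ToolMatch class and property names here are hypothetical stand-ins, not the project's actual ontology terms.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

# Hypothetical namespace and terms standing in for the ToolMatch ontology.
TM = Namespace("http://example.org/toolmatch#")

g = Graph()
tool = URIRef("http://example.org/tools/subset-service")
g.add((tool, RDF.type, TM.Tool))
g.add((tool, TM.readsFormat, Literal("NetCDF")))

# "Find tools for a known dataset format" expressed as a SPARQL query.
query = """
PREFIX tm: <http://example.org/toolmatch#>
SELECT ?tool WHERE { ?tool a tm:Tool ; tm:readsFormat "NetCDF" . }
"""
for row in g.query(query):
    print(row.tool)
```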
NASA Astrophysics Data System (ADS)
Civera Lorenzo, Tamara
2017-10-01
A brief presentation about the J-PLUS EDR data access web portal (http://archive.cefca.es/catalogues/jplus-edr), covering the different services available to retrieve image and catalogue data. The J-PLUS Early Data Release (EDR) archive includes two types of data: images, and dual and single catalogue data which include parameters measured from the images. The J-PLUS web portal offers catalogue data and images through several online data access tools or services, each suited to a particular need. The services offered are: coverage map, sky navigator, object visualization, image search, cone search, object list search, and Virtual Observatory services (Simple Cone Search, Simple Image Access Protocol, Simple Spectral Access Protocol, and Table Access Protocol).
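For readers unfamiliar with the Virtual Observatory services listed, a Simple Cone Search is just an HTTP GET with RA, DEC and SR (search radius) parameters in degrees, returning a VOTable. The endpoint URL below is a placeholder, not the actual J-PLUS service address.

```python
import requests

# Placeholder endpoint; the real J-PLUS cone-search URL is published on the archive portal.
BASE_URL = "https://archive.example.org/scs"

# IVOA Simple Cone Search parameters: position (RA, DEC) and search radius SR, in degrees.
params = {"RA": 150.12, "DEC": 2.21, "SR": 0.05}
response = requests.get(BASE_URL, params=params, timeout=30)
response.raise_for_status()

# The service returns an XML VOTable with one row per matching catalogue source.
print(response.text[:200])
```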
Build Your Own Solar Air Heater.
ERIC Educational Resources Information Center
Conservation and Renewable Energy Inquiry and Referral Service (DOE), Silver Spring, MD.
The solar air heater is a simple device for catching some of the sun's energy to heat a home. Procedures for making and installing such a heater are presented. Included is a materials list, including tools needed for constructing the heater, sources for obtaining further details, and a list of material specifications. (JN)
pyZELDA: Python code for Zernike wavefront sensors
NASA Astrophysics Data System (ADS)
Vigan, A.; N'Diaye, M.
2018-06-01
pyZELDA analyzes data from Zernike wavefront sensors dedicated to high-contrast imaging applications. This modular software was originally designed to analyze data from the ZELDA wavefront sensor prototype installed in VLT/SPHERE; simple configuration files allow it to be extended to support several other instruments and testbeds. pyZELDA also includes simple simulation tools to measure the theoretical sensitivity of a sensor and to compare it to other sensors.
Franc, Jeffrey Michael; Ingrassia, Pier Luigi; Verde, Manuela; Colombo, Davide; Della Corte, Francesco
2015-02-01
Surge capacity, or the ability to manage an extraordinary volume of patients, is fundamental for hospital management of mass-casualty incidents. However, quantification of surge capacity is difficult: no universal standard for its measurement has emerged, nor has a standardized statistical method been advocated. As mass-casualty incidents are rare, simulation may represent a viable alternative for measuring surge capacity. The objective of the current study was to develop a statistical method for the quantification of surge capacity using a combination of computer simulation and simple process-control statistical tools, with length-of-stay (LOS) and patient volume (PV) as metrics. The use of this method was then demonstrated on a subsequent computer simulation of an emergency department (ED) response to a mass-casualty incident. In the derivation phase, 357 participants in five countries performed 62 computer simulations of an ED response to a mass-casualty incident. Benchmarks for ED response were derived from these simulations, including LOS and PV metrics for triage, bed assignment, physician assessment, and disposition. In the application phase, 13 students of the European Master in Disaster Medicine (EMDM) program completed the same simulation scenario, and the results were compared to the standards obtained in the derivation phase. Patient-volume metrics included the numbers of patients to be triaged, assigned to rooms, assessed by a physician, and disposed. Length-of-stay metrics included median time to triage, room assignment, physician assessment, and disposition. Simple graphical process-control methods were used to compare the application-phase group to the derived benchmarks. The group in the application phase failed to meet the indicated standard for LOS from admission to disposition decision. This study demonstrates how simulation software can be used to derive objective benchmarks of ED surge capacity using PV and LOS metrics. These objective metrics can then be applied to other simulation groups using simple graphical process-control tools to provide a numeric measure of surge capacity. Repeated use in simulations of actual EDs may represent a potential means of objectively quantifying disaster-management surge capacity. It is hoped that the described statistical method, which is simple and reusable, will be useful for investigators in this field to apply to their own research.
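A minimal sketch of the comparison step (my own illustration of simple process-control logic, not the authors' code): derive a benchmark band from the derivation-phase simulations and flag an application-phase group that falls outside it.

```python
import statistics

def control_limits(benchmark_values, k=3.0):
    """Mean +/- k standard deviations, the usual control-chart band."""
    mean = statistics.mean(benchmark_values)
    sd = statistics.stdev(benchmark_values)
    return mean - k * sd, mean + k * sd

# Illustrative numbers only: median minutes from admission to disposition
# in derivation-phase simulations vs. one application-phase group.
derivation_los = [118, 124, 131, 120, 126, 129, 122, 127]  # placeholder sample
low, high = control_limits(derivation_los)

group_los = 162  # placeholder application-phase result
print("within benchmark band" if low <= group_los <= high else "outside benchmark band")
```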
Cutting Symmetrical Recesses In Soft Ceramic Tiles
NASA Technical Reports Server (NTRS)
Nesotas, Tony C.; Tyler, Brent
1989-01-01
Simple tool cuts hemispherical recesses in soft ceramic tiles. Designed to expose wires of thermocouples embedded in tiles without damaging leads. Creates neat, precise holes around wires. End mill includes axial hole to accommodate thermocouple wires embedded in material to be cut. Wires pass into hole without being bent or broken. Used in place of such tools as dental picks, tweezers, spatulas, and putty knives.
Névéol, Aurélie; Pereira, Suzanne; Kerdelhué, Gaetan; Dahamna, Badisse; Joubert, Michel; Darmoni, Stéfan J
2007-01-01
The growing number of resources to be indexed in the catalogue of online health resources in French (CISMeF) calls for curating strategies involving automatic indexing tools while maintaining the catalogue's high indexing quality standards. The objective was to develop a simple automatic tool that retrieves MeSH descriptors from document titles. In parallel to research on advanced indexing methods, a bag-of-words tool was developed for timely inclusion in CISMeF's maintenance system. An evaluation was carried out on a corpus of 99 documents. The indexing sets retrieved by the automatic tool were compared to manual indexing based on the title and on the full text of resources. 58% of the major main headings were retrieved by the bag-of-words algorithm, and the precision of main heading retrieval was 69%. Bag-of-words indexing has effectively been used on selected resources to be included in CISMeF since August 2006. Meanwhile, ongoing work aims at improving the current version of the tool.
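A bag-of-words title matcher of the kind described can be sketched in a few lines; the mini-vocabulary below is a hypothetical stand-in for the French MeSH terms the CISMeF tool actually uses.

```python
import re

# Hypothetical toy vocabulary mapping title words to MeSH descriptors.
MESH_VOCAB = {
    "asthma": "Asthma",
    "asthme": "Asthma",          # French form, as CISMeF indexes French resources
    "diabetes": "Diabetes Mellitus",
    "vaccination": "Vaccination",
}

def index_title(title: str) -> set[str]:
    """Return the MeSH descriptors whose trigger words appear in the title."""
    words = re.findall(r"[a-zà-ÿ]+", title.lower())
    return {MESH_VOCAB[w] for w in words if w in MESH_VOCAB}

print(index_title("Vaccination et asthme chez l'enfant"))
# e.g. {'Vaccination', 'Asthma'} (set ordering varies)
```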
Linear regression metamodeling as a tool to summarize and present simulation model results.
Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M
2013-10-01
Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
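The core of the approach fits a linear model to PSA draws: the intercept estimates the base-case outcome and the standardized coefficients rank parameter influence. The sketch below reproduces that idea on synthetic placeholder draws, not the paper's cancer-model data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Synthetic stand-in for a PSA: standardized input parameters and a model outcome.
X = rng.standard_normal((n, 3))                    # standardized parameter draws
outcome = (5.0 + 1.2 * X[:, 0] - 0.4 * X[:, 1] + 0.1 * X[:, 2]
           + rng.normal(scale=0.05, size=n))       # placeholder "model output"

# Ordinary least squares: intercept ~ base case, slopes ~ parameter sensitivity.
design = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(design, outcome, rcond=None)
print("base case:", round(beta[0], 2))
print("standardized sensitivities:", np.round(beta[1:], 2))
```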
Brachypodium distachyon genetic resources
USDA-ARS?s Scientific Manuscript database
Brachypodium distachyon is a well-established model species for the grass family Poaceae. It possesses an array of features that make it suited for this purpose, including a small sequenced genome, simple transformation methods, and additional functional genomics tools. However, the most critical to...
Modification of the Fosberg fire weather index to include drought
Scott L. Goodrick
2002-01-01
The Fosberg fire weather index is a simple tool for evaluating the potential influence of weather on a wildland fire based on temperature, relative humidity and wind speed. A modification to this index that includes the impact of precipitation is proposed. The Keetch-Byram drought index is used to formulate a 'fuel availability' factor that modifies the...
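For context, the unmodified index combines an equilibrium-moisture damping term with wind speed. The sketch below uses the commonly cited Fosberg formulas (temperature in °F, wind in mph) and attaches a deliberately generic KBDI-based fuel availability multiplier as a placeholder; it is not Goodrick's published formula, which is not reproduced here.

```python
import math

def fosberg_ffwi(temp_f: float, rh: float, wind_mph: float) -> float:
    """Fosberg fire weather index, using the commonly cited EMC piecewise fit."""
    if rh < 10:
        m = 0.03229 + 0.281073 * rh - 0.000578 * rh * temp_f
    elif rh <= 50:
        m = 2.22749 + 0.160107 * rh - 0.01478 * temp_f
    else:
        m = 21.0606 + 0.005565 * rh**2 - 0.00035 * rh * temp_f - 0.483199 * rh
    eta = 1 - 2 * (m / 30) + 1.5 * (m / 30) ** 2 - 0.5 * (m / 30) ** 3
    return eta * math.sqrt(1 + wind_mph**2) / 0.3002

def modified_ffwi(temp_f, rh, wind_mph, kbdi):
    # Placeholder fuel availability factor in [0, 1], increasing with drought
    # (KBDI ranges 0-800); a stand-in to show the structure of the modification.
    fuel_availability = min(1.0, kbdi / 800.0)
    return fosberg_ffwi(temp_f, rh, wind_mph) * fuel_availability

print(round(modified_ffwi(90, 20, 15, 400), 1))
```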
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.
Simonyan, Vahan; Mazumder, Raja
2014-09-30
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis
Simonyan, Vahan; Mazumder, Raja
2014-01-01
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953
Recent Developments in OVERGRID, OVERFLOW-2 and Chimera Grid Tools Scripts
NASA Technical Reports Server (NTRS)
Chan, William M.
2004-01-01
OVERGRID and OVERFLOW-2 feature easy-to-use multiple-body dynamics. The new features of OVERGRID include a preliminary chemistry interface, standard atmosphere and mass-properties calculators, a simple unsteady solution viewer, and a debris tracking interface. Script library development in Chimera Grid Tools has applications in turbopump grid generation. This viewgraph presentation profiles multiple-component dynamics; validation test cases for a sphere, cylinder, and oscillating airfoil; and debris analysis.
NASA Astrophysics Data System (ADS)
Almagro, A.
2013-07-01
Different experiences of surveying Islamic monuments at sites in Tunisia, Algeria and Morocco are presented. The surveys were made with simple tools, a photographic camera and a laser meter, without previous planning or preparation, taking advantage of visits organized during scientific meetings to which the author was invited. Some of these monuments belong to sites included in the World Heritage List, yet no metric documentation, or only low-quality information, is available for them. The monumental Almohad gates of Rabat and Marrakech, the al-Badi palace of Marrakech, the minarets of Mansura and the Qala of Beni Hammad, and the dome in front of the mihrab of the mosque of Tlemcen are among the examples presented. The methodology applied is based on ideas and tools acquired in CIPA meetings, proving the usefulness of these encounters while supporting the idea that "providers" should supply tools and methods and "users" should be responsible for documentation, never missing the opportunity to gain knowledge of the heritage during the survey process.
Speed up of XML parsers with PHP language implementation
NASA Astrophysics Data System (ADS)
Georgiev, Bozhidar; Georgieva, Adriana
2012-11-01
In this paper, the authors introduce PHP5's XML implementation and show how to read, parse, and write a short and uncomplicated XML file using SimpleXML in a PHP environment. The possibilities for joint use of the PHP5 language and the XML standard are described. The details of the parsing process with SimpleXML are also clarified. A practical PHP-XML-MySQL project presents the advantages of XML implementation in PHP modules. This approach allows a comparatively simple search of hierarchical XML data by means of PHP software tools. The proposed project includes a database, which can be extended with new data and new XML parsing functions.
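The read-parse-write cycle the paper demonstrates with PHP's SimpleXML looks like this in Python's xml.etree, shown here as a stand-in for the paper's PHP code (the document content is invented for illustration):

```python
import xml.etree.ElementTree as ET

# A short, uncomplicated XML document of the kind the paper parses.
xml_text = """
<books>
  <book><title>XML in PHP</title><year>2012</year></book>
  <book><title>Parsing Basics</title><year>2010</year></book>
</books>
"""

root = ET.fromstring(xml_text)            # read + parse
for book in root.findall("book"):         # walk the hierarchy
    print(book.findtext("title"), book.findtext("year"))

root.append(ET.fromstring("<book><title>New Entry</title></book>"))
ET.ElementTree(root).write("books.xml", encoding="utf-8", xml_declaration=True)  # write
```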
NASA Technical Reports Server (NTRS)
Petrenko, M.; Hegde, M.; Bryant, K.; Johnson, J. E.; Ritrivi, A.; Shen, S.; Volmer, B.; Pham, L. B.
2015-01-01
Goddard Earth Sciences Data and Information Services Center (GES DISC) has been providing access to scientific data sets since the 1990s. Beginning as one of the first Earth Observing System Data and Information System (EOSDIS) archive centers, GES DISC has evolved to offer a wide range of science-enabling services. With a growing understanding of the needs and goals of its science users, GES DISC continues to improve and expand its broad set of data discovery and access tools, subsetting services, and visualization tools. Nonetheless, the multitude of available tools, a partial overlap in functionality, and the independent and uncoupled interfaces employed by these tools often leave end users confused as to which tools or services are most appropriate for the task at hand. As a result, some of the services remain underutilized or largely unknown to users, significantly reducing the availability of the data and leading to a great loss of scientific productivity. In order to improve the accessibility of GES DISC tools and services, we have designed and implemented UUI, the Unified User Interface. UUI seeks to provide a simple, unified, and intuitive one-stop-shop experience for the key services available at GES DISC, including subsetting (Simple Subset Wizard), granule file search (Mirador), plotting (Giovanni), and other services. In this poster, we will discuss the main lessons, obstacles, and insights encountered while designing the UUI experience. We will also present the architecture and technology behind UUI, including NodeJS, Angular, and MongoDB, and speculate on the future of the tool at GES DISC as well as in the broader context of space science informatics.
Creating Simple Windchill Admin Tools Using Info*Engine
NASA Technical Reports Server (NTRS)
Jones, Corey; Kapatos, Dennis; Skradski, Cory
2012-01-01
Being a Windchill administrator often requires performing simple yet repetitive tasks on large sets of objects. These can include renaming, deleting, checking in, undoing checkout, and much more. This is especially true during a migration. Fortunately, PTC has provided a simple way to dynamically interact with Windchill using Info*Engine. This presentation will describe how to create simple Info*Engine tasks capable of saving Windchill 10.0 administrators hours of tedious work. It will also show how these tasks can be combined and displayed on a simple JSP page that acts as a "Windchill Administrator Dashboard/Toolbox". Attendees will learn some valuable tasks Info*Engine is capable of performing, gain a basic understanding of how to implement Info*Engine tasks, and learn what is involved in creating a JSP page that displays Info*Engine tasks.
Cox, Trevor F; Ranganath, Lakshminarayan
2011-12-01
Alkaptonuria (AKU) is due to excessive homogentisic acid (HGA) accumulation in body fluids, owing to lack of the enzyme homogentisate dioxygenase, leading in turn to varied clinical manifestations, mainly through conversion of HGA to a polymeric melanin-like pigment known as ochronosis. A potential treatment, a drug called nitisinone, which decreases formation of HGA, is available. However, successful demonstration of its efficacy in modifying the natural history of AKU requires an effective quantitative assessment tool. We describe two potential tools that could be used to quantitate disease burden in AKU. The first tool scores clinical features, including clinical assessments, investigations and questionnaires, in 15 patients with AKU. The second tool is a scoring system that only includes items obtained from questionnaires used in 44 people with AKU. Statistical analyses were carried out on the two patient datasets to assess the AKU tools; these included the calculation of Cronbach's alpha, multidimensional scaling and simple linear regression analysis. The conclusion was that there is good evidence that the tools could be adopted as AKU assessment tools, but perhaps with further refinement before being used in the practical setting of a clinical trial.
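Cronbach's alpha, the internal-consistency statistic used here, is simple to compute from an item-score matrix. A minimal sketch with placeholder scores, not the AKU study data:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: rows = respondents, columns = questionnaire items."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Placeholder 5-item scores for 4 respondents.
demo = np.array([[3, 4, 3, 4, 3],
                 [2, 2, 3, 2, 2],
                 [4, 5, 4, 4, 5],
                 [1, 2, 1, 2, 1]])
print(round(cronbach_alpha(demo), 2))
```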
Simple Statistics: - Summarized!
ERIC Educational Resources Information Center
Blai, Boris, Jr.
Statistics are an essential tool for making proper judgement decisions. It is concerned with probability distribution models, testing of hypotheses, significance tests and other means of determining the correctness of deductions and the most likely outcome of decisions. Measures of central tendency include the mean, median and mode. A second…
How to establish business office incentive programs.
Wilkerson, L J
1991-01-01
Incentive programs to help increase collections or reduce days in receivables are becoming popular among healthcare business offices. A successful incentive program addresses major issues during the planning stage and includes realistic incentive goals, simple measurement tools, meaningful incentive payments, and proper monitoring of results.
Next generation simulation tools: the Systems Biology Workbench and BioSPICE integration.
Sauro, Herbert M; Hucka, Michael; Finney, Andrew; Wellock, Cameron; Bolouri, Hamid; Doyle, John; Kitano, Hiroaki
2003-01-01
Researchers in quantitative systems biology make use of a large number of different software packages for modelling, analysis, visualization, and general data manipulation. In this paper, we describe the Systems Biology Workbench (SBW), a software framework that allows heterogeneous application components--written in diverse programming languages and running on different platforms--to communicate and use each other's capabilities via a fast binary encoded-message system. Our goal was to create a simple, high-performance, open-source software infrastructure which is easy to implement and understand. SBW enables applications (potentially running on separate, distributed computers) to communicate via a simple network protocol. The interfaces to the system are encapsulated in client-side libraries that we provide for different programming languages. We describe in this paper the SBW architecture, a selection of current modules, including Jarnac, JDesigner, and SBWMeta-tool, and the close integration of SBW into BioSPICE, which enables both frameworks to share tools and complement and strengthen each other's capabilities.
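The broker-style message passing SBW describes can be pictured with a generic length-prefixed framing sketch; this illustrates the general pattern only, not SBW's actual wire format or API.

```python
import socket
import struct

# Generic length-prefixed binary framing: 4-byte big-endian length, then payload.
def send_message(sock: socket.socket, payload: bytes) -> None:
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_message(sock: socket.socket) -> bytes:
    header = _recv_exact(sock, 4)
    (length,) = struct.unpack(">I", header)
    return _recv_exact(sock, length)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed connection")
        buf += chunk
    return buf
```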
Selb, Melissa; Gimigliano, Francesca; Prodinger, Birgit; Stucki, Gerold; Pestelli, Germano; Iocco, Maurizio; Boldrini, Paolo
2017-04-01
As part of international efforts to develop and implement national models including the specification of ICF-based clinical data collection tools, the Italian rehabilitation community initiated a project to develop simple, intuitive descriptions of the ICF Rehabilitation Set, highlighting the core concept of each category in user-friendly language. This paper outlines the Italian experience in developing simple, intuitive descriptions of the ICF Rehabilitation Set as an ICF-based clinical data collection tool for Italy. Consensus process. Expert conference. Multidisciplinary group of rehabilitation professionals. The first of a two-stage consensus process involved developing an initial proposal for simple, intuitive descriptions of each ICF Rehabilitation Set category, based on descriptions generated in a similar process in China. Stage two involved a consensus conference. Divided into three working groups, participants discussed and voted (vote A) on whether the initially proposed description of each ICF Rehabilitation Set category was simple and intuitive enough for use in daily practice. Afterwards, the categories with descriptions considered ambiguous, i.e. not simple and intuitive enough, were divided among the working groups, which were asked to propose a new description for their allocated categories. These proposals were then voted on (vote B) in a plenary session. The last step of the consensus conference required each working group to develop a new proposal for each of the categories whose descriptions were still considered ambiguous. Participants then voted (final vote) on which of the three proposed descriptions they preferred. Nineteen clinicians from diverse rehabilitation disciplines and various regions of Italy participated in the consensus process. Three ICF categories achieved consensus in vote A, while 20 ICF categories were accepted in vote B. The remaining 7 categories were decided in the final vote. The findings are discussed in light of current efforts toward developing strategies for ICF implementation, specifically for the application of an ICF-based clinical data collection tool, not only in Italy but also in the rest of Europe. The descriptions are promising as minimal standards for monitoring the impact of interventions and for standardized reporting of functioning as a relevant outcome in rehabilitation.
Insights into early lithic technologies from ethnography
Hayden, Brian
2015-01-01
Oldowan lithic assemblages are often portrayed as a product of the need to obtain sharp flakes for cutting into animal carcases. However, ethnographic and experimental research indicates that the optimal way to produce flakes for such butchering purposes is via bipolar reduction of small cryptocrystalline pebbles rather than from larger crystalline cores resembling choppers. Ethnographic observations of stone tool-using hunter-gatherers in environments comparable with early hominins indicate that most stone tools (particularly chopper forms and flake tools) were used for making simple shaft tools including spears, digging sticks and throwing sticks. These tools bear strong resemblances to Oldowan stone tools. Bipolar reduction for butchering probably preceded chopper-like core reduction and provides a key link between primate nut-cracking technologies and the emergence of more sophisticated lithic technologies leading to the Oldowan. PMID:26483534
The Dairy Greenhouse Gas Emission Model: Reference Manual
USDA-ARS?s Scientific Manuscript database
The Dairy Greenhouse Gas Model (DairyGHG) is a software tool for estimating the greenhouse gas emissions and carbon footprint of dairy production systems. A relatively simple process-based model is used to predict the primary greenhouse gas emissions, which include the net emission of carbon dioxide...
BioC: a minimalist approach to interoperability for biomedical text processing
Comeau, Donald C.; Islamaj Doğan, Rezarta; Ciccarese, Paolo; Cohen, Kevin Bretonnel; Krallinger, Martin; Leitner, Florian; Lu, Zhiyong; Peng, Yifan; Rinaldi, Fabio; Torii, Manabu; Valencia, Alfonso; Verspoor, Karin; Wiegers, Thomas C.; Wu, Cathy H.; Wilbur, W. John
2013-01-01
A vast amount of scientific information is encoded in natural language text, and the quantity of such text has become so great that it is no longer economically feasible to have a human as the first step in the search process. Natural language processing and text mining tools have become essential to facilitate the search for and extraction of information from text. This has led to vigorous research efforts to create useful tools and to create humanly labeled text corpora, which can be used to improve such tools. To encourage combining these efforts into larger, more powerful and more capable systems, a common interchange format to represent, store and exchange the data in a simple manner between different language processing systems and text mining tools is highly desirable. Here we propose a simple extensible mark-up language format to share text documents and annotations. The proposed annotation approach allows a large number of different annotations to be represented including sentences, tokens, parts of speech, named entities such as genes or diseases and relationships between named entities. In addition, we provide simple code to hold this data, read it from and write it back to extensible mark-up language files and perform some sample processing. We also describe completed as well as ongoing work to apply the approach in several directions. Code and data are available at http://bioc.sourceforge.net/. Database URL: http://bioc.sourceforge.net/ PMID:24048470
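A document-passage-annotation file in the spirit of BioC can be assembled with the standard library; the element names below follow the BioC layout as commonly described (collection, document, passage, annotation), but treat this as a sketch rather than a schema-validated example.

```python
import xml.etree.ElementTree as ET

# Minimal BioC-style collection: one document, one passage, one entity annotation.
collection = ET.Element("collection")
doc = ET.SubElement(collection, "document")
ET.SubElement(doc, "id").text = "PMC0000001"

passage = ET.SubElement(doc, "passage")
ET.SubElement(passage, "offset").text = "0"
ET.SubElement(passage, "text").text = "BRCA1 mutations increase cancer risk."

ann = ET.SubElement(passage, "annotation", id="A1")
ET.SubElement(ann, "infon", key="type").text = "gene"
ET.SubElement(ann, "location", offset="0", length="5")
ET.SubElement(ann, "text").text = "BRCA1"

print(ET.tostring(collection, encoding="unicode"))
```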
Memory management in genome-wide association studies
2009-01-01
Genome-wide association is a powerful tool for the identification of genes that underlie common diseases. Genome-wide association studies generate billions of genotypes and pose significant computational challenges for most users including limited computer memory. We applied a recently developed memory management tool to two analyses of North American Rheumatoid Arthritis Consortium studies and measured the performance in terms of central processing unit and memory usage. We conclude that our memory management approach is simple, efficient, and effective for genome-wide association studies. PMID:20018047
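One simple idea behind such memory management (a generic illustration, not the specific tool the authors applied) is to keep the genotype matrix on disk and map slices into memory on demand:

```python
import numpy as np

n_subjects, n_snps = 1_000, 50_000  # placeholder dimensions

# Create (once) and reopen a disk-backed genotype matrix of 0/1/2 allele counts.
genotypes = np.memmap("genotypes.dat", dtype=np.int8, mode="w+",
                      shape=(n_subjects, n_snps))

# Only the touched slices are paged into RAM: scan SNPs one column at a time.
for snp in range(n_snps):
    column = genotypes[:, snp]   # a view backed by the mapped file
    _ = column.mean()            # stand-in for a per-SNP association test
```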
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radhakrishnan, Ben D.
2012-06-30
This research project, which was conducted during the Summer and Fall of 2011, investigated some commercially available assessment tools with a focus on IT equipment to see if such tools could round out the DC Pro tool suite. In this research, the assessment capabilities of the various tools were compiled to help make "non-biased" information available to the public. This research should not be considered exhaustive of all existing vendor tools, although a number of vendors were contacted. Large IT equipment OEMs like IBM and Dell provide proprietary internal automated software which does not work on other vendors' IT equipment. However, the research found two companies with products that showed promise in performing automated assessments for IT equipment from different OEM vendors. This report documents the research and provides a list of software products reviewed, contacts and websites, product details, discussions with specific companies, a set of recommendations, and next steps. As a result of this research, a simple 3-level approach to an IT assessment tool is proposed, along with an example of an assessment using a simple IT equipment data collection tool (Level 1, spreadsheet). The tool has been reviewed with the Green Grid and LBNL staff. The initial feedback has been positive, although further refinement of the tool will be necessary. Proposed next steps include a field trial of at least two vendors' software in two different data centers, with the objective of proving the concept and ascertaining the extent of energy and computational assessment, ease of installation, and opportunities for continuous improvement. Based on the discussions, field trials (or case studies) are proposed with two vendors: JouleX (expected to be completed in 2012) and Sentilla.
How High Is It? What Time Is It?
ERIC Educational Resources Information Center
Ulmer, David C., Jr.
1975-01-01
Two instruments are described which were designed to provide beginning students with the tools for determining sun-time and the altitude of the sun. Both can be constructed by the student. A simple-to-construct scale is included. Activities are suggested to use with the instruments. (Author/EB)
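For reference (not part of the original article), the quantity the second instrument measures obeys the standard solar-altitude relation:

```latex
% Altitude h of the sun for latitude \varphi, solar declination \delta, hour angle H:
\[
\sin h = \sin\varphi \,\sin\delta + \cos\varphi \,\cos\delta \,\cos H
\]
% At local solar noon (H = 0) the altitude is maximal: h = 90^\circ - |\varphi - \delta|.
```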
The Beer Lambert Law Measurement Made Easy
ERIC Educational Resources Information Center
Onorato, Pasquale; Gratton, Luigi M.; Polesell, Marta; Salmoiraghi, Alessandro; Oss, Stefano
2018-01-01
We propose the use of a smartphone-based apparatus as a valuable tool for investigating the optical absorption of a material and verifying the exponential decay predicted by Beer's law. The very simple experimental activities presented here, suitable for undergraduate students, allow one to measure the material transmittance including its…
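The exponential decay being verified is the Beer-Lambert law, which in the usual notation reads:

```latex
% Transmitted intensity I after path length l in a medium with attenuation
% coefficient \mu (equivalently, molar absorptivity \varepsilon and concentration c):
\[
I(l) = I_0\, e^{-\mu l},
\qquad
T = \frac{I}{I_0} = 10^{-\varepsilon l c},
\qquad
A = \varepsilon l c
\]
```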
Current and state-of-the-art approaches for detecting mycotoxins in commodities
USDA-ARS?s Scientific Manuscript database
The tools that have been applied to detection of mycotoxins in commodities are numerous and powerful. These include everything from simple to use diagnostic test strips to complex, instrument intensive, methods such as ultra-high performance liquid chromatography-mass spectrometry (UPLC-MS). This wi...
Multilevel and Latent Variable Modeling with Composite Links and Exploded Likelihoods
ERIC Educational Resources Information Center
Rabe-Hesketh, Sophia; Skrondal, Anders
2007-01-01
Composite links and exploded likelihoods are powerful yet simple tools for specifying a wide range of latent variable models. Applications considered include survival or duration models, models for rankings, small area estimation with census information, models for ordinal responses, item response models with guessing, randomized response models,…
The Promise of Open Educational Resources
ERIC Educational Resources Information Center
Smith, Marshall S.; Casserly, Catherine M.
2006-01-01
Open educational resources (OER) include full courses, course materials, modules, textbooks, streaming videos, tests, software, and any other tools, materials, or techniques used to either support access to knowledge, or have an impact on teaching, learning, and research. At the heart of the OER movement is the simple and powerful idea that the…
Precision Sheet Metal. Progress Record and Theory Outline.
ERIC Educational Resources Information Center
Connecticut State Dept. of Education, Hartford. Div. of Vocational-Technical Schools.
This combination progress record and course outline is designed for use by individuals teaching a course in precision sheet metal. Included among the topics addressed in the course are the following: employment opportunities in metalworking, measurement and layout, orthographic projection, precision sheet metal drafting, simple layout, hand tools,…
Creating a Classroom Kaleidoscope with the World Wide Web.
ERIC Educational Resources Information Center
Quinlan, Laurie A.
1997-01-01
Discusses the elements of classroom Web presentations: planning; construction, including design tips; classroom use; and assessment. Lists 14 World Wide Web resources for K-12 teachers; Internet search tools (directories, search engines and meta-search engines); a Web glossary; and an example of HTML for a simple Web page. (PEN)
A new tool to evaluate postgraduate training posts: the Job Evaluation Survey Tool (JEST).
Wall, David; Goodyear, Helen; Singh, Baldev; Whitehouse, Andrew; Hughes, Elizabeth; Howes, Jonathan
2014-10-02
Three reports in 2013 about healthcare and patient safety in the UK, namely Berwick, Francis and Keogh, have highlighted the need for junior doctors' views about their training experience to be heard. In the UK, the General Medical Council (GMC) quality assures medical training programmes and requires postgraduate deaneries to undertake quality management and monitoring of all training posts in their area. The aim of this study was to develop a simple trainee questionnaire for evaluation of postgraduate training posts based on the GMC UK standards, and to examine its reliability and validity, including comparison with a well-established and internationally validated tool, the Postgraduate Hospital Educational Environment Measure (PHEEM). The Job Evaluation Survey Tool (JEST), a fifteen-item job evaluation questionnaire, was drawn up in 2006, piloted with Foundation doctors (2007), field-tested with specialist paediatric registrars (2008), and used over a three-year period (2008-11) by Foundation doctors. Statistical analyses, including descriptives, reliability, correlation and factor analysis, were undertaken, and the JEST was compared with the PHEEM. The JEST had a reliability of 0.91 in the pilot study of 76 Foundation doctors, 0.88 in field testing with 173 paediatric specialist registrars, and 0.91 over three years of general use in Foundation training, with 3367 doctors completing the JEST. Correlation of the JEST with the PHEEM was 0.80 (p < 0.001). Factor analysis showed two factors: a teaching factor, and a social and lifestyle one. The JEST has proved to be a simple, valid and reliable evaluation tool in the monitoring and evaluation of postgraduate hospital training posts.
Renfro, Mindy Oxman; Fehrer, Steven
2011-01-01
Unintentional falls are an increasing public health problem as the incidence of falls rises and the population ages. The Centers for Disease Control and Prevention reports that 1 in 3 adults aged 65 years and older will experience a fall this year; 20% to 30% of those who fall will sustain a moderate to severe injury. Physical therapists caring for older adults are usually engaged with these patients after the first injurious fall and may have little opportunity to abate fall risk before the injuries occur. This article describes the content selection and development of a simple-to-administer, multifactorial Fall Risk Assessment & Screening Tool (FRAST), designed specifically for use in primary care settings to identify those older adults with high fall risk. The Fall Risk Assessment & Screening Tool incorporates previously validated measures within a new multifactorial tool and includes targeted recommendations for intervention. Development of the multifactorial FRAST used a 5-part process: identification of significant fall risk factors, review of best evidence, selection of items, creation of the scoring grid, and development of a recommended action plan. The Fall Risk Assessment & Screening Tool has been developed to assess fall risk in the target population of older adults (older than 65 years) living and ambulating independently in the community. Many fall risk factors were considered and 15 items selected for inclusion. The Fall Risk Assessment & Screening Tool includes 4 previously validated measures to assess balance, depression, falls efficacy, and home safety. Reliability and validity studies of FRAST are under way. Fall risk for community-dwelling older adults is an urgent, multifactorial public health problem. Providing primary care practitioners (PCPs) with a very simple screening tool is imperative. The Fall Risk Assessment & Screening Tool was created to allow safe, quick, and low-cost administration by minimally trained office staff, with interpretation and follow-up provided by the PCP.
Basheti, Iman A; Armour, Carol L; Bosnic-Anticevich, Sinthia Z; Reddel, Helen K
2008-07-01
To evaluate the feasibility, acceptability and effectiveness of a brief intervention about inhaler technique, delivered by community pharmacists to asthma patients. Thirty-one pharmacists received brief workshop education (Active: n=16, CONTROL: n=15). Active Group pharmacists were trained to assess and teach dry powder inhaler technique, using patient-centered educational tools including novel Inhaler Technique Labels. Interventions were delivered to patients at four visits over 6 months. At baseline, patients (Active: 53, CONTROL: 44) demonstrated poor inhaler technique (mean ± SD score out of 9: 5.7 ± 1.6). At 6 months, improvement in inhaler technique score was significantly greater in Active cf. CONTROL patients (2.8 ± 1.6 cf. 0.9 ± 1.4, p<0.001), and asthma severity was significantly improved (p=0.015). Qualitative responses from patients and pharmacists indicated a high level of satisfaction with the intervention and educational tools, both for their effectiveness and for their impact on the patient-pharmacist relationship. A simple feasible intervention in community pharmacies, incorporating daily reminders via Inhaler Technique Labels on inhalers, can lead to improvement in inhaler technique and asthma outcomes. Brief training modules and simple educational tools, such as Inhaler Technique Labels, can provide a low-cost and sustainable way of changing patient behavior in asthma, using community pharmacists as educators.
Insights into early lithic technologies from ethnography.
Hayden, Brian
2015-11-19
Oldowan lithic assemblages are often portrayed as a product of the need to obtain sharp flakes for cutting into animal carcases. However, ethnographic and experimental research indicates that the optimal way to produce flakes for such butchering purposes is via bipolar reduction of small cryptocrystalline pebbles rather than from larger crystalline cores resembling choppers. Ethnographic observations of stone tool-using hunter-gatherers in environments comparable with early hominins indicate that most stone tools (particularly chopper forms and flake tools) were used for making simple shaft tools including spears, digging sticks and throwing sticks. These tools bear strong resemblances to Oldowan stone tools. Bipolar reduction for butchering probably preceded chopper-like core reduction and provides a key link between primate nut-cracking technologies and the emergence of more sophisticated lithic technologies leading to the Oldowan. © 2015 The Author(s).
Machine learning plus optical flow: a simple and sensitive method to detect cardioactive drugs
NASA Astrophysics Data System (ADS)
Lee, Eugene K.; Kurokawa, Yosuke K.; Tu, Robin; George, Steven C.; Khine, Michelle
2015-07-01
Current preclinical screening methods do not adequately detect cardiotoxicity. Using human induced pluripotent stem cell-derived cardiomyocytes (iPS-CMs), more physiologically relevant preclinical or patient-specific screening to detect potential cardiotoxic effects of drug candidates may be possible. However, one of the persistent challenges for developing a high-throughput drug screening platform using iPS-CMs is the need to develop a simple and reliable method to measure key electrophysiological and contractile parameters. To address this need, we have developed a platform that combines machine learning paired with brightfield optical flow as a simple and robust tool that can automate the detection of cardiomyocyte drug effects. Using three cardioactive drugs of different mechanisms, including those with primarily electrophysiological effects, we demonstrate the general applicability of this screening method to detect subtle changes in cardiomyocyte contraction. Requiring only brightfield images of cardiomyocyte contractions, we detect changes in cardiomyocyte contraction comparable to - and even superior to - fluorescence readouts. This automated method serves as a widely applicable screening tool to characterize the effects of drugs on cardiomyocyte function.
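To make the approach concrete, here is a rough Python sketch of the general idea, not the authors' pipeline: dense optical flow between successive brightfield frames yields a contraction-motion signal whose simple features (amplitude, beat rate) could feed any standard classifier. The synthetic "movie" and all parameter values are invented for illustration.

    import numpy as np
    import cv2

    # Synthetic stand-in for a brightfield movie: a bright square that "beats".
    frames = []
    for t in range(20):
        img = np.zeros((64, 64), np.uint8)
        half = 10 + int(4 * np.sin(t / 3.0))  # oscillating size ~ contraction
        img[32 - half:32 + half, 32 - half:32 + half] = 200
        frames.append(img)

    # Mean optical-flow magnitude per frame pair gives a beating signal.
    signal = []
    for prev, nxt in zip(frames, frames[1:]):
        flow = cv2.calcOpticalFlowFarneback(prev, nxt, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        signal.append(np.linalg.norm(flow, axis=2).mean())

    print(f"peak motion = {max(signal):.3f}, mean motion = {np.mean(signal):.3f}")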
Gobe: an interactive, web-based tool for comparative genomic visualization.
Pedersen, Brent S; Tang, Haibao; Freeling, Michael
2011-04-01
Gobe is a web-based tool for viewing comparative genomic data. It supports viewing multiple genomic regions simultaneously. Its simple text format and flash-based rendering make it an interactive, exploratory research tool. Gobe can be used without installation through our web service, or downloaded and customized with stylesheets and javascript callback functions. Gobe is a flash application that runs in all modern web-browsers. The full source-code, including that for the online web application is available under the MIT license at: http://github.com/brentp/gobe. Sample applications are hosted at http://try-gobe.appspot.com/ and http://synteny.cnr.berkeley.edu/gobe-app/.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, Christopher A.
In this dissertation the possibility that chaos and simple determinism are governing the dynamics of reversed field pinch (RFP) plasmas is investigated. To properly assess this possibility, data from both numerical simulations and experiment are analyzed. A large repertoire of nonlinear analysis techniques is used to identify low dimensional chaos in the data. These tools include phase portraits and Poincaré sections, correlation dimension, the spectrum of Lyapunov exponents and short term predictability. In addition, nonlinear noise reduction techniques are applied to the experimental data in an attempt to extract any underlying deterministic dynamics. Two model systems are used to simulate the plasma dynamics. These are the DEBS code, which models global RFP dynamics, and the dissipative trapped electron mode (DTEM) model, which models drift wave turbulence. Data from both simulations show strong indications of low dimensional chaos and simple determinism. Experimental data were obtained from the Madison Symmetric Torus RFP and consist of a wide array of both global and local diagnostic signals. None of the signals shows any indication of low dimensional chaos or simple determinism. Moreover, most of the analysis tools indicate the experimental system is very high dimensional with properties similar to noise. Nonlinear noise reduction is unsuccessful at extracting an underlying deterministic system.
Causal Relation Analysis Tool of the Case Study in the Engineer Ethics Education
NASA Astrophysics Data System (ADS)
Suzuki, Yoshio; Morita, Keisuke; Yasui, Mitsukuni; Tanada, Ichirou; Fujiki, Hiroyuki; Aoyagi, Manabu
In engineering ethics education, the virtual experiencing of dilemmas is essential. Learning through the case study method is a particularly effective means. Many case studies are, however, difficult to deal with because they often include many complex causal relationships and social factors. It would thus be convenient if there were a tool that could analyze the factors of a case example and organize them into a hierarchical structure to get a better understanding of the whole picture. The tool that was developed applies a cause-and-effect matrix and simple graph theory. It analyzes the causal relationship between facts in a hierarchical structure and organizes complex phenomena. The effectiveness of this tool is shown by presenting an actual example.
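As an illustration of the underlying idea (a toy Python sketch, not the published tool), a cause-and-effect matrix over the facts of a case can be layered into a hierarchy in which every cause sits above the facts it contributes to; the case facts below are invented and the causal graph is assumed acyclic.

    EDGES = {  # cause -> effects, an invented case example
        "design flaw": ["test skipped"],
        "schedule pressure": ["test skipped", "warning ignored"],
        "test skipped": ["failure"],
        "warning ignored": ["failure"],
    }

    def levels(edges):
        """Assign each fact a hierarchy level: 0 for root causes, deeper for effects."""
        facts = set(edges) | {e for effs in edges.values() for e in effs}
        level = {}
        def depth(f):
            if f not in level:
                causes = [c for c, effs in edges.items() if f in effs]
                level[f] = 1 + max((depth(c) for c in causes), default=-1)
            return level[f]
        for f in facts:
            depth(f)
        return level

    for fact, lvl in sorted(levels(EDGES).items(), key=lambda kv: kv[1]):
        print("  " * lvl + fact)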
Puppet Resource Handbook for Teachers.
ERIC Educational Resources Information Center
Vogelsang, Robert; And Others
Designed as a teacher and therapist training tool, this booklet demonstrates how to make several types or kinds of puppets, and how to use them in a variety of classroom situations. The first section includes instructions for the following hand puppets: simple cloth, paper mache, knitted finger, paper bag, and Muppet types. The second section…
Your Personal Learning Network: Professional Development on Demand
ERIC Educational Resources Information Center
Bauer, William I.
2010-01-01
Web 2.0 tools and resources can enhance our efficiency and effectiveness as music educators, supporting personal learning networks for ongoing professional growth and development. This article includes (a) an explanation of Really Simple Syndication (RSS) and the use of an RSS reader/aggregator; (b) a discussion of blogs, podcasts, wikis,…
A Java-based web service is being developed within the US EPA’s Chemistry Dashboard to provide real time estimates of toxicity values and physical properties. WebTEST can generate toxicity predictions directly from a simple URL which includes the endpoint, QSAR method, and ...
Using simple agent-based modeling to inform and enhance neighborhood walkability.
Badland, Hannah; White, Marcus; Macaulay, Gus; Eagleson, Serryn; Mavoa, Suzanne; Pettit, Christopher; Giles-Corti, Billie
2013-12-11
Pedestrian-friendly neighborhoods with proximal destinations and services encourage walking and decrease car dependence, thereby contributing to more active and healthier communities. Proximity to key destinations and services is an important aspect of the urban design decision making process, particularly in areas adopting a transit-oriented development (TOD) approach to urban planning, whereby densification occurs within walking distance of transit nodes. Modeling destination access within neighborhoods has been limited to circular catchment buffers or more sophisticated network-buffers generated using geoprocessing routines within geographical information systems (GIS). Both circular and network-buffer catchment methods are problematic. Circular catchment models do not account for street networks, thus do not allow exploratory 'what-if' scenario modeling; and network-buffering functionality typically exists within proprietary GIS software, which can be costly and requires a high level of expertise to operate. This study sought to overcome these limitations by developing an open-source simple agent-based walkable catchment tool that can be used by researchers, urban designers, planners, and policy makers to test scenarios for improving neighborhood walkable catchments. A simplified version of an agent-based model was ported to a vector-based open source GIS web tool using data derived from the Australian Urban Research Infrastructure Network (AURIN). The tool was developed and tested with end-user stakeholder working group input. The resulting model has proven to be effective and flexible, allowing stakeholders to assess and optimize the walkability of neighborhood catchments around actual or potential nodes of interest (e.g., schools, public transport stops). Users can derive a range of metrics to compare different scenarios modeled. These include: catchment area versus circular buffer ratios; mean number of streets crossed; and modeling of different walking speeds and wait time at intersections. The tool has the capacity to influence planning and public health advocacy and practice, and by using open-access source software, it is available for use locally and internationally. There is also scope to extend this version of the tool from a simple to a complex model, which includes agents (i.e., simulated pedestrians) 'learning' and incorporating other environmental attributes that enhance walkability (e.g., residential density, mixed land use, traffic volume).
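A minimal Python sketch of the core idea, on a toy grid street network rather than the AURIN data and vector GIS implementation described above: an agent walks outward from an origin node under a distance budget, and the reached area is compared with the equivalent circular buffer.

    import math
    from collections import deque

    def walkable_catchment(origin, budget_m, edge_m=100.0, blocked=frozenset()):
        """Breadth-first walk over a uniform grid of 100 m street segments."""
        reached = {origin: 0.0}
        queue = deque([origin])
        while queue:
            x, y = queue.popleft()
            for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                d = reached[(x, y)] + edge_m
                if nxt not in blocked and d <= budget_m and d < reached.get(nxt, math.inf):
                    reached[nxt] = d
                    queue.append(nxt)
        return reached

    budget = 1.4 * 600  # assumed 1.4 m/s walking speed for a 10-minute walk
    reached = walkable_catchment((0, 0), budget, blocked={(1, 0), (1, 1)})
    catchment_area = len(reached) * 100 * 100  # one node ~ one 100 m grid cell
    buffer_area = math.pi * budget ** 2
    print(f"catchment/buffer ratio = {catchment_area / buffer_area:.2f}")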
gHRV: Heart rate variability analysis made easy.
Rodríguez-Liñares, L; Lado, M J; Vila, X A; Méndez, A J; Cuesta, P
2014-08-01
In this paper, the gHRV software tool is presented. It is a simple, free and portable tool developed in Python for analysing heart rate variability. It includes a graphical user interface and it can import files in multiple formats, analyse time intervals in the signal, test statistical significance and export the results. This paper also contains, as an example of use, a clinical analysis performed with the gHRV tool, namely to determine whether heart rate variability indexes change across different stages of sleep. Results from tests completed by researchers who have tried gHRV are also explained: in general, the application was positively valued and the results reflect a high level of satisfaction. gHRV is in continuous development and new versions will include suggestions made by testers. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
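gHRV's own code is not reproduced here; as a rough illustration of the kind of time-domain indexes such a tool reports, a short Python sketch using the standard definitions of SDNN, RMSSD and pNN50:

    import numpy as np

    def time_domain_hrv(rr_ms):
        """Classic time-domain HRV indexes from a series of RR intervals (ms)."""
        rr = np.asarray(rr_ms, dtype=float)
        diff = np.diff(rr)
        return {
            "mean_hr_bpm": 60000.0 / rr.mean(),             # mean heart rate
            "sdnn_ms": rr.std(ddof=1),                      # overall variability
            "rmssd_ms": np.sqrt(np.mean(diff ** 2)),        # beat-to-beat variability
            "pnn50_pct": 100 * np.mean(np.abs(diff) > 50),  # share of large successive changes
        }

    rr = [812, 830, 790, 805, 860, 842, 795, 808]  # illustrative RR intervals
    print(time_domain_hrv(rr))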
Pros and cons of body mass index as a nutritional and risk assessment tool in dialysis patients.
Carrero, Juan Jesús; Avesani, Carla Maria
2015-01-01
Obesity is a problem of serious concern among chronic kidney disease (CKD) patients; it is a risk factor for progression to end-stage renal disease, and its incidence and prevalence in dialysis patients exceed those of the general population. Obesity, typically assessed with the simple metric of body mass index (BMI), is considered a mainstay for nutritional assessment in guidelines on nutrition in CKD. While regular BMI assessment in connection with the dialysis session is a simple and easy-to-use monitoring tool, such ease of access can lead to overuse, as the value of this metric to health care professionals is overestimated. This review examines BMI as a clinical monitoring tool in CKD practice and offers a critical appraisal as to what a high or a low BMI may signify in this patient population. Topics discussed include the utility of BMI as a reflection of body size, body composition and body fat distribution, diagnostic versus prognostic performance, and consideration of temporal trends over single assessments. © 2014 Wiley Periodicals, Inc.
Computational assignment of redox states to Coulomb blockade diamonds.
Olsen, Stine T; Arcisauskaite, Vaida; Hansen, Thorsten; Kongsted, Jacob; Mikkelsen, Kurt V
2014-09-07
With the advent of molecular transistors, electrochemistry can now be studied at the single-molecule level. Experimentally, the redox chemistry of the molecule manifests itself as features in the observed Coulomb blockade diamonds. We present a simple theoretical method for explicit construction of the Coulomb blockade diamonds of a molecule. A combined quantum mechanical/molecular mechanical method is invoked to calculate redox energies and polarizabilities of the molecules, including the screening effect of the metal leads. This direct approach circumvents the need for explicit modelling of the gate electrode. From the calculated parameters the Coulomb blockade diamonds are constructed using simple theory. We offer a theoretical tool for assignment of Coulomb blockade diamonds to specific redox states in particular, and a study of chemical details in the diamonds in general. With the ongoing experimental developments in molecular transistor experiments, our tool could find use in molecular electronics, electrochemistry, and electrocatalysis.
NASA Astrophysics Data System (ADS)
Rivers, M. L.; Gualda, G. A.
2009-05-01
One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information using tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel beam tomographic data, including removal of anomalous pixels, ring artifact reduction, and automated determination of the rotation center, and (2) visualize both raw and reconstructed data, either as individual frames or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, measuring sample volumes (useful for porous samples like pumice), and computing volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of tiffs into '.volume' files is available within 'vol_tools'. Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of tiffs to disk, and (3) generate MPEG movies for inclusion in presentations, publications, websites, etc. Both are freely available as run-time ('.sav') versions that can be run using the free IDL Virtual Machine™, available from ITT Visual Information Solutions: http://www.ittvis.com/ProductServices/IDL/VirtualMachine.aspx The run-time versions of 'tomo_display' and 'vol_tools' can be downloaded from: http://cars.uchicago.edu/software/idl/tomography.html http://sites.google.com/site/voltools/
A WebGL Tool for Visualizing the Topology of the Sun's Coronal Magnetic Field
NASA Astrophysics Data System (ADS)
Duffy, A.; Cheung, C.; DeRosa, M. L.
2012-12-01
We present a web-based, topology-viewing tool that allows users to visualize the geometry and topology of the Sun's 3D coronal magnetic field in an interactive manner. The tool is implemented using open-source, mature, modern web technologies, including WebGL, jQuery, HTML5, and CSS3, which are compatible with nearly all modern web browsers. As opposed to the traditional method of visualization, which involves downloading and setting up various software packages, proprietary and otherwise, the tool presents a clean interface that allows the user to easily load and manipulate the model, while also offering great power to choose which topological features are displayed. The tool accepts data encoded in the open JSON format, which has libraries available for nearly every major programming language, making it simple to generate the data.
IT-based wellness tools for older adults: Design concepts and feedback.
Joe, Jonathan; Hall, Amanda; Chi, Nai-Ching; Thompson, Hilaire; Demiris, George
2018-03-01
To explore older adults' preferences regarding e-health applications through use of generated concepts that inform wellness tool design. The 6-8-5 method and affinity mapping were used to create e-health design ideas that were translated into storyboards and scenarios. Focus groups were conducted to obtain feedback on the prototypes and included participant sketching. A qualitative analysis of the focus groups for emerging themes was conducted, and sketches were analyzed. Forty-three older adults participated in six focus group sessions. The majority of participants found the wellness tools useful. Preferences included features that supported participants in areas of unmet needs, such as ability to find reliable health information, cognitive training, or maintaining social ties. Participants favored features such as use of voice navigation, but were concerned over cost and the need for technology skills and access. Sketches reinforced these wants, including portability, convenience, and simplicity. Several factors were found to increase the desirability of such devices including convenient access to their health and health information, a simple, accessible interface, and support for memory issues. Researchers and designers should incorporate the feedback of older adults regarding wellness tools, so that future designs meet the needs of older adults.
A simplified gis-based model for large wood recruitment and connectivity in mountain basins
NASA Astrophysics Data System (ADS)
Franceschi, Silvia; Antonello, Andrea; Vela, Ana Lucia; Cavalli, Marco; Crema, Stefano; Comiti, Francesco; Tonon, Giustino
2015-04-01
During the last 50 years in the Alps, the decline of the rural and forest economy and the depopulation of mountain areas have caused the progressive abandonment of land in general, and of riparian zones in particular, with a consequent increase in vegetation extent. On one hand, this wood increases the availability of organic matter and has positive effects on mountain river systems. However, during flooding events, large wood that reaches the stream can clog bridges and increase flood hazard. Evaluating the availability of large wood during flooding events is still a challenge. Models exist that simulate the propagation of logs downstream, but the evaluation of the trees that can reach the stream is still done using simplified GIS procedures. These procedures are the basis of our research, which will include LiDAR-derived information on vegetation to evaluate large wood recruitment during extreme events. Within the last Google Summer of Code (2014) we developed a set of tools to evaluate large wood recruitment and propagation along the channel network, based on a simplified methodology for monitoring and modeling large wood recruitment and transport in mountain basins implemented by Lucía et al. (2014). These tools are integrated in the JGrassTools project as a dedicated section in the Hydro-Geomorphology library. The section LWRecruitment contains 10 simple modules that allow the user to start from very simple information related to geomorphology, flooding areas and vegetation cover and obtain a map of the most probable critical sections on the streams. The tools cover the two main aspects related to the interaction of large wood with rivers: the recruitment mechanisms and the propagation downstream. While the propagation tool is very simple and does not consider the hydrodynamics of the problem, the recruitment algorithms are more specific and consider the influence of hillslope stability and the flooding extension. The modules are available for download at www.jgrasstools.org. A simple and easy-to-use graphical interface to run the models is available at https://github.com/moovida/STAGE/releases.
NASA Technical Reports Server (NTRS)
Nunes, Arthur C., Jr.
2008-01-01
Friction stir welding (FSW) is a solid state welding process invented in 1991 at The Welding Institute in the United Kingdom. A weld is made in the FSW process by translating a rotating pin along a weld seam so as to stir the sides of the seam together. FSW avoids deleterious effects inherent in melting and promises to be an important welding process for any industries where welds of optimal quality are demanded. This article provides an introduction to the FSW process. The chief concern is the physical effect of the tool on the weld metal: how weld seam bonding takes place, what kind of weld structure is generated, potential problems (possible defects, for example), and implications for process parameters and tool design. Weld properties are determined by structure, and the structure of friction stir welds is determined by the weld metal flow field in the vicinity of the weld tool. Metal flow in the vicinity of the weld tool is explained through a simple kinematic flow model that decomposes the flow field into three basic component flows: a uniform translation, a rotating solid cylinder, and a ring vortex encircling the tool. The flow components, superposed to construct the flow model, can be related to particular aspects of weld process parameters and tool design; they provide a bridge to an understanding of a complex-at-first-glance weld structure. Torques and forces are also discussed. Some simple mathematical models of structural aspects, torques, and forces are included.
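As a notational sketch (our notation; the article presents the model in prose here), the superposed flow field near the tool can be written

    $\vec{v}(\vec{r}) \;=\; \vec{V}_{\mathrm{trans}} \;+\; \vec{\omega}\times\vec{r} \;+\; \vec{v}_{\mathrm{ring}}(\vec{r})$

where $\vec{V}_{\mathrm{trans}}$ is the uniform translation along the seam, $\vec{\omega}\times\vec{r}$ the rotating solid-cylinder flow about the pin axis, and $\vec{v}_{\mathrm{ring}}$ the ring-vortex circulation encircling the tool.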
X-ray system simulation software tools for radiology and radiography education.
Kengyelics, Stephen M; Treadgold, Laura A; Davies, Andrew G
2018-02-01
To develop x-ray simulation software tools to support delivery of radiological science education for a range of learning environments and audiences including individual study, lectures, and tutorials. Two software tools were developed; one simulated x-ray production for a simple two dimensional radiographic system geometry comprising an x-ray source, beam filter, test object and detector. The other simulated the acquisition and display of two dimensional radiographic images of complex three dimensional objects using a ray casting algorithm through three dimensional mesh objects. Both tools were intended to be simple to use, produce results accurate enough to be useful for educational purposes, and have an acceptable simulation time on modest computer hardware. The radiographic factors and acquisition geometry could be altered in both tools via their graphical user interfaces. A comparison of radiographic contrast measurements of the simulators to a real system was performed. The contrast output of the simulators had excellent agreement with measured results. The software simulators were deployed to 120 computers on campus. The software tools developed are easy-to-use, clearly demonstrate important x-ray physics and imaging principles, are accessible within a standard University setting and could be used to enhance the teaching of x-ray physics to undergraduate students. Current approaches to teaching x-ray physics in radiological science lack immediacy when linking theory with practice. This method of delivery allows students to engage with the subject in an experiential learning environment. Copyright © 2017. Published by Elsevier Ltd.
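A minimal Python sketch of the first simulator's principle, with invented attenuation coefficients and geometry (not the authors' implementation): a monoenergetic beam attenuated according to the Beer-Lambert law, with radiographic contrast computed between object and background ray paths.

    import numpy as np

    MU_WATER = 0.2  # cm^-1, illustrative linear attenuation coefficient
    MU_BONE = 0.5   # cm^-1, illustrative

    def detected_intensity(i0, mu, thickness_cm):
        """Beer-Lambert attenuation of a monoenergetic beam."""
        return i0 * np.exp(-mu * thickness_cm)

    i0 = 1.0
    i_background = detected_intensity(i0, MU_WATER, 10.0)  # 10 cm of water
    i_object = detected_intensity(i0, MU_WATER, 9.0) * \
               detected_intensity(1.0, MU_BONE, 1.0)       # 1 cm bone inset
    contrast = (i_background - i_object) / (i_background + i_object)
    print(f"radiographic contrast = {contrast:.3f}")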
Automatic Data Traffic Control on DSM Architecture
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Jin, Hao-Qiang; Yan, Jerry; Kwak, Dochan (Technical Monitor)
2000-01-01
We study data traffic on distributed shared memory machines and conclude that data placement and grouping improve the performance of scientific codes. We present several methods that users can employ to improve data traffic in their code. We report on the implementation of a tool which detects the code fragments causing data congestion and advises the user on improvements to data routing in these fragments. The capabilities of the tool include deduction of data alignment and affinity from the source code; detection of code constructs having abnormally high cache or TLB misses; and generation of data placement constructs. We demonstrate the capabilities of the tool on experiments with the NAS Parallel Benchmarks and with a simple computational fluid dynamics application, ARC3D.
Troubleshooting Microcomputers. A Technical Guide for Polk County Schools.
ERIC Educational Resources Information Center
Black, B. R.; And Others
This guide was started in 1986 as an effort to pull together a collection of several computer guides that had been written over the previous several years to assist schools in making simple computer repairs. The first of six sections contains general tips and hints, including sections on tool requirements, strobe disk speed adjustment, static…
ERIC Educational Resources Information Center
Highsmith, Joni Bitman
Stickybear's Math Splash is a CD-ROM-based software tool for teaching mathematics skills beyond simple number recognition to elementary students. The accompanying printed lesson plans are designed to complement mathematics skills with other methods and areas of emphasis including kinesthetic learning, listening skills, decision making skills, and…
The Polygonal Model: A Simple Representation of Biomolecules as a Tool for Teaching Metabolism
ERIC Educational Resources Information Center
Bonafe, Carlos Francisco Sampaio; Bispo, Jose Ailton Conceição; de Jesus, Marcelo Bispo
2018-01-01
Metabolism involves numerous reactions and organic compounds that the student must master to understand adequately the processes involved. Part of biochemical learning should include some knowledge of the structure of biomolecules, although the acquisition of such knowledge can be time-consuming and may require significant effort from the student.…
Interactive Videodisc: An Emerging Technology for Educators. ERIC Digest.
ERIC Educational Resources Information Center
Grabowski, Barbara L.
Interactive video can be a very complex learning system, or it can be a simple tool for teachers to use to enhance their instruction. The term has been used broadly in the literature and includes three major aspects: (1) interactive video as storage; (2) interactive video as hardware; and (3) interactive video as learning concept. This digest…
The Beer Lambert law measurement made easy
NASA Astrophysics Data System (ADS)
Onorato, Pasquale; Gratton, Luigi M.; Polesello, Marta; Salmoiraghi, Alessandro; Oss, Stefano
2018-05-01
We propose the use of a smartphone-based apparatus as a valuable tool for investigating the optical absorption of a material and to verify the exponential decay predicted by Beer's law. The very simple experimental activities presented here, suitable for undergraduate students, allow one to measure the material transmittance, including its dependence on the incident radiation wavelength.
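For reference, the relation being verified is, in its standard form,

    $I(x) = I_0\, e^{-\mu(\lambda)\,x}$, equivalently $A = \log_{10}(I_0/I) = \varepsilon(\lambda)\,\ell\,c$

for a sample of thickness $x$ (or a solution of concentration $c$ over path length $\ell$), with the wavelength dependence entering through the attenuation coefficient $\mu(\lambda)$ or molar absorptivity $\varepsilon(\lambda)$.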
Yes! You Can Build a Web Site.
ERIC Educational Resources Information Center
Holzberg, Carol
2001-01-01
With specially formatted templates or simple Web page editors, teachers can lay out text and graphics in a work space resembling the interface of a word processor. Several options are presented to help teachers build Web sites. Free templates include Class Homepage Builder, AppliTools: HomePage, MySchoolOnline.com, and BigChalk.com. Web design…
fluff: exploratory analysis and visualization of high-throughput sequencing data
Georgiou, Georgios
2016-01-01
Summary. In this article we describe fluff, a software package that allows for simple exploration, clustering and visualization of high-throughput sequencing data mapped to a reference genome. The package contains three command-line tools to generate publication-quality figures in an uncomplicated manner using sensible defaults. Genome-wide data can be aggregated, clustered and visualized in a heatmap, according to different clustering methods. This includes a predefined setting to identify dynamic clusters between different conditions or developmental stages. Alternatively, clustered data can be visualized in a bandplot. Finally, fluff includes a tool to generate genomic profiles. As command-line tools, the fluff programs can easily be integrated into standard analysis pipelines. The installation is straightforward and documentation is available at http://fluff.readthedocs.org. Availability. fluff is implemented in Python and runs on Linux. The source code is freely available for download at https://github.com/simonvh/fluff. PMID:27547532
Chaos in plasma simulation and experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, C.; Newman, D.E.; Sprott, J.C.
1993-09-01
We investigate the possibility that chaos and simple determinism are governing the dynamics of reversed field pinch (RFP) plasmas using data from both numerical simulations and experiment. A large repertoire of nonlinear analysis techniques is used to identify low dimensional chaos. These tools include phase portraits and Poincaré sections, correlation dimension, the spectrum of Lyapunov exponents and short term predictability. In addition, nonlinear noise reduction techniques are applied to the experimental data in an attempt to extract any underlying deterministic dynamics. Two model systems are used to simulate the plasma dynamics. These are the DEBS code, which models global RFP dynamics, and the dissipative trapped electron mode (DTEM) model, which models drift wave turbulence. Data from both simulations show strong indications of low dimensional chaos and simple determinism. Experimental data were obtained from the Madison Symmetric Torus RFP and consist of a wide array of both global and local diagnostic signals. None of the signals shows any indication of low dimensional chaos or other simple determinism. Moreover, most of the analysis tools indicate the experimental system is very high dimensional with properties similar to noise. Nonlinear noise reduction is unsuccessful at extracting an underlying deterministic system.
Eckner, James T.; Richardson, James K.; Kim, Hogene; Joshi, Monica S.; Oh, Youkeun K.; Ashton-Miller, James A.
2015-01-01
Summary. Slowed reaction time (RT) represents both a risk factor for and a consequence of sport concussion. The purpose of this study was to determine the reliability and criterion validity of a novel clinical test of simple and complex RT, called RTclin, in contact sport athletes. Both tasks were adapted from the well-known ruler drop test of RT and involve manually grasping a falling vertical shaft upon its release, with the complex task employing a go/no-go paradigm based on a light cue. In 46 healthy contact sport athletes (24 men: M age = 16.3 yr., SD = 5.0; 22 women: M age = 15.0 yr., SD = 4.0) whose sports included soccer, ice hockey, American football, martial arts, wrestling, and lacrosse, the latency and accuracy of simple and complex RTclin had acceptable test-retest and inter-rater reliabilities and correlated with a computerized criterion standard, the Axon Computerized Cognitive Assessment Tool. Medium to large effect sizes were found. The novel RTclin tests have acceptable reliability and criterion validity for clinical use and hold promise as concussion assessment tools. PMID:26106803
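The abstract does not spell out the conversion, but drop-based tests conventionally turn the distance $d$ the shaft falls before being caught into a time via free-fall kinematics:

    $d = \tfrac{1}{2} g t^{2} \;\Rightarrow\; t_{\mathrm{RT}} = \sqrt{2d/g}$

so a catch distance of 20 cm, for example, corresponds to a reaction time of roughly 0.20 s.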
Marufu, Takawira C; Mannings, Alexa; Moppett, Iain K
2015-12-01
Accurate peri-operative risk prediction is an essential element of clinical practice. Various risk stratification tools for assessing patients' risk of mortality or morbidity have been developed and applied in clinical practice over the years. This review aims to outline essential characteristics (predictive accuracy, objectivity, clinical utility) of currently available risk scoring tools for hip fracture patients. We searched eight databases: AMED, CINAHL, ClinicalTrials.gov, Cochrane, DARE, EMBASE, MEDLINE and Web of Science for all relevant studies published until April 2015. We included published English-language observational studies that considered the predictive accuracy of risk stratification tools for patients with fragility hip fracture. After removal of duplicates, 15,620 studies were screened. Twenty-nine papers met the inclusion criteria, evaluating 25 risk stratification tools. Risk stratification tools considered in more than two studies were ASA, CCI, E-PASS, NHFS and O-POSSUM. All tools were moderately accurate and validated in multiple studies; however, there are some limitations to consider. The E-PASS and O-POSSUM are comprehensive but complex, and require intraoperative data, making them a challenge to use at the patient bedside. The ASA, CCI and NHFS are simple, easy and inexpensive, using routinely available preoperative data. In contrast to the ASA and CCI, which have subjective variables in addition to other limitations, the NHFS variables are all objective. In the search for a simple, inexpensive, easy-to-calculate, objective and accurate tool, the NHFS may be the most appropriate of the currently available scores for hip fracture patients. However, more studies need to be undertaken before it becomes a national hip fracture risk stratification or audit tool of choice. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Prototyping Tool for Web-Based Multiuser Online Role-Playing Game
NASA Astrophysics Data System (ADS)
Okamoto, Shusuke; Kamada, Masaru; Yonekura, Tatsuhiro
This letter proposes a prototyping tool for Web-based Multiuser Online Role-Playing Games (MORPG). The design goal is to make this tool simple and powerful. The tool comprises a GUI editor, a translator and a runtime environment. The GUI editor is used to edit state-transition diagrams, each of which defines the behavior of a fictional character. The state-transition diagrams are translated into C program code, which plays the role of a game engine in the RPG system. The runtime environment includes PHP, JavaScript with Ajax and HTML, so the prototype system can be played in an ordinary Web browser such as Firefox, Safari or IE. When a player clicks or presses a key, the Web browser sends the event to the Web server so that its consequence is reflected on the screens other players are looking at. Prospective users of this tool include programming novices and schoolchildren. No knowledge or skill in any specific programming language is required to create the state-transition diagrams, and their structure is not only suitable for defining character behavior but also intuitive enough for novices to understand. Therefore, users can easily create a Web-based MORPG system with the tool.
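The tool itself translates the diagrams into C; as a language-neutral sketch (written here in Python, with invented states and events), a state-transition diagram reduces to a simple lookup table that drives a character:

    # A character behavior as (state, event) -> next-state transitions, the
    # kind of definition the GUI editor captures before code generation.
    TRANSITIONS = {
        ("idle", "player_near"): "greet",
        ("greet", "player_talks"): "quest",
        ("greet", "player_leaves"): "idle",
        ("quest", "quest_done"): "idle",
    }

    class Character:
        def __init__(self, start="idle"):
            self.state = start

        def on_event(self, event):
            # Stay in the current state if no transition is defined.
            self.state = TRANSITIONS.get((self.state, event), self.state)
            return self.state

    npc = Character()
    for ev in ["player_near", "player_talks", "quest_done"]:
        print(ev, "->", npc.on_event(ev))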
Kharroubi, Akram; Saba, Elias; Ghannam, Ibrahim; Darwish, Hisham
2017-12-01
Simple self-assessment tools are needed to predict women at high risk of developing osteoporosis. In this study, tools like the IOF One Minute Test, the Fracture Risk Assessment Tool (FRAX), and the Simple Calculated Osteoporosis Risk Estimation (SCORE) were found to be valid for Palestinian women. The threshold for predicting women at risk was estimated for each tool. The purpose of this study is to evaluate the validity of the updated IOF (International Osteoporosis Foundation) One Minute Osteoporosis Risk Assessment Test, FRAX, and SCORE, as well as age alone, to detect the risk of developing osteoporosis in postmenopausal Palestinian women. Three hundred eighty-two women aged 45 years and older were recruited following bone mineral density (BMD) measurement, including 131 women with osteoporosis and 251 controls; 287 completed the questionnaires of the different risk assessment tools. Receiver operating characteristic (ROC) curves were evaluated for each tool using BMD as the gold standard for osteoporosis. The area under the ROC curve (AUC) was highest for FRAX calculated with BMD for predicting hip fractures (0.897), followed by FRAX for major fractures (0.826), with cut-off values >1.5 and >7.8%, respectively. The IOF One Minute Test AUC (0.629) was the lowest of the tested tools but had sufficient accuracy for predicting the risk of developing osteoporosis, with a cut-off value of >4 'yes' answers out of 18. The SCORE test and age alone were also good predictors of the risk of developing osteoporosis. According to the ROC curve for age, women ≥64 years had a higher risk of developing osteoporosis. A higher percentage of women with low BMD (T-score ≤-1.5) or osteoporosis (T-score ≤-2.5) was found among women who were not exposed to the sun, who had menopause before the age of 45 years, or who had lower body mass index (BMI) compared to controls. Women who fall often had lower BMI, and approximately 27% of the recruited postmenopausal Palestinian women had accidents that caused fractures. Simple self-assessment tools like FRAX without BMD, SCORE, and the IOF One Minute Test were valid for predicting Palestinian postmenopausal women at high risk of developing osteoporosis.
Multidisciplinary Conceptual Design for Reduced-Emission Rotorcraft
NASA Technical Reports Server (NTRS)
Silva, Christopher; Johnson, Wayne; Solis, Eduardo
2018-01-01
Python-based wrappers for OpenMDAO are used to integrate disparate software for practical conceptual design of rotorcraft. The suite of tools which are connected thus far include aircraft sizing, comprehensive analysis, and parametric geometry. The tools are exercised to design aircraft with aggressive goals for emission reductions relative to fielded state-of-the-art rotorcraft. Several advanced reduced-emission rotorcraft are designed and analyzed, demonstrating the flexibility of the tools to consider a wide variety of potentially transformative vertical flight vehicles. To explore scale effects, aircraft have been sized for 5, 24, or 76 passengers in their design missions. Aircraft types evaluated include tiltrotor, single-main-rotor, coaxial, and side-by-side helicopters. Energy and drive systems modeled include Lithium-ion battery, hydrogen fuel cell, turboelectric hybrid, and turboshaft drive systems. Observations include the complex nature of the trade space for this simple problem, with many potential aircraft design and operational solutions for achieving significant emission reductions. Also interesting is that achieving greatly reduced emissions may not require exotic component technologies, but may be achieved with a dedicated design objective of reducing emissions.
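As an illustration of the integration pattern (a toy OpenMDAO component; the sizing relation and names are invented and are not the analysis codes wrapped in the paper), momentum-theory hover power can be exposed as inputs and outputs that OpenMDAO connects to other disciplines:

    import openmdao.api as om

    class HoverInducedPower(om.ExplicitComponent):
        """Ideal hover induced power from momentum theory: P = T * sqrt(T / (2 rho A))."""
        def setup(self):
            self.add_input("thrust_n", val=50000.0)      # N, ~gross weight
            self.add_input("rotor_area_m2", val=100.0)   # m^2
            self.add_output("induced_power_w", val=0.0)  # W
            self.declare_partials("*", "*", method="fd")

        def compute(self, inputs, outputs):
            rho = 1.225  # sea-level air density, kg/m^3
            t = inputs["thrust_n"]
            a = inputs["rotor_area_m2"]
            outputs["induced_power_w"] = t * (t / (2 * rho * a)) ** 0.5

    prob = om.Problem()
    prob.model.add_subsystem("hover", HoverInducedPower(), promotes=["*"])
    prob.setup()
    prob.run_model()
    print(prob.get_val("induced_power_w"))  # ~714 kW for the defaults above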
Analyzing Discourse Processing Using a Simple Natural Language Processing Tool
ERIC Educational Resources Information Center
Crossley, Scott A.; Allen, Laura K.; Kyle, Kristopher; McNamara, Danielle S.
2014-01-01
Natural language processing (NLP) provides a powerful approach for discourse processing researchers. However, there remains a notable degree of hesitation by some researchers to consider using NLP, at least on their own. The purpose of this article is to introduce and make available a "simple" NLP (SiNLP) tool. The overarching goal of…
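SiNLP's actual feature set is not listed in this excerpt; as a flavor of what a deliberately simple NLP tool can compute from surface text alone, a short Python sketch of a few illustrative indices:

    import re

    def simple_nlp_indices(text: str) -> dict:
        """Surface-level linguistic indices of the kind simple NLP tools report."""
        words = re.findall(r"[A-Za-z']+", text.lower())
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        return {
            "n_words": len(words),
            "n_sentences": len(sentences),
            "mean_sentence_length": len(words) / max(len(sentences), 1),
            "type_token_ratio": len(set(words)) / max(len(words), 1),
        }

    print(simple_nlp_indices("This is a simple tool. It counts words. It is simple."))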
ERIC Educational Resources Information Center
Russo, Alexander
2004-01-01
Simple checklists, one-shot interviews, brief site visits and narrative evaluations remain widespread as the tools of assessment. In many school districts, the evaluation includes little or no face-to-face contact, and the principal simply gets his or her evaluation in the mail, leading one researcher to describe them as "infrequent, late,…
Medical image segmentation to estimate HER2 gene status in breast cancer
NASA Astrophysics Data System (ADS)
Palacios-Navarro, Guillermo; Acirón-Pomar, José Manuel; Vilchez-Sorribas, Enrique; Zambrano, Eddie Galarza
2016-02-01
This work deals with the estimation of HER2 gene status in breast tumour images treated with in situ hybridization (ISH) techniques. We propose a simple algorithm to obtain the amplification factor of the HER2 gene. The results obtained are very close to those obtained manually by specialists. The algorithm is based on colour image segmentation and has been included in a software application tool for breast tumour analysis. The tool focuses on estimating the seriousness of tumours, facilitating the work of pathologists and contributing to a better diagnosis.
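A minimal sketch of the idea in Python (not the authors' algorithm; the colour separation, thresholds and images are invented): segment the two ISH signal colours, count connected spots, and report the HER2/CEP17 signal ratio conventionally used to call gene amplification.

    import numpy as np
    from scipy import ndimage

    def count_spots(channel: np.ndarray, threshold: float) -> int:
        """Count connected components above an intensity threshold."""
        mask = channel > threshold
        _, n_spots = ndimage.label(mask)
        return n_spots

    rng = np.random.default_rng(1)
    her2_channel = rng.random((64, 64))   # stand-ins for colour-separated images
    cep17_channel = rng.random((64, 64))

    ratio = count_spots(her2_channel, 0.995) / max(count_spots(cep17_channel, 0.995), 1)
    print(f"HER2/CEP17 ratio = {ratio:.2f} (a ratio >= 2.0 is commonly read as amplified)")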
Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach
Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen
2016-01-01
A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance for different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools for various sizes but similar qualities. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design. PMID:28773800
Free web-based modelling platform for managed aquifer recharge (MAR) applications
NASA Astrophysics Data System (ADS)
Stefan, Catalin; Junghanns, Ralf; Glaß, Jana; Sallwey, Jana; Fatkhutdinov, Aybulat; Fichtner, Thomas; Barquero, Felix; Moreno, Miguel; Bonilla, José; Kwoyiga, Lydia
2017-04-01
Managed aquifer recharge represents a valuable instrument for sustainable water resources management. The concept implies purposeful infiltration of surface water into the underground for later recovery or environmental benefits. Over decades, MAR schemes were successfully installed worldwide for a variety of reasons: to maximize the natural storage capacity of aquifers, physical aquifer management, water quality management, and ecological benefits. The INOWAS-DSS platform provides a collection of free web-based tools for planning, management and optimization of the main components of MAR schemes. The tools are grouped into 13 specific applications that cover the most relevant challenges encountered at MAR sites, from both quantitative and qualitative perspectives. The applications include, among others, the optimization of MAR site location, the assessment of saltwater intrusion, the restoration of groundwater levels in overexploited aquifers, the maximization of the natural storage capacity of aquifers, the improvement of water quality, the design and operational optimization of MAR schemes, and clogging development and risk assessment. The platform contains a collection of about 35 web-based tools of various degrees of complexity, which are either included in application-specific workflows or used as standalone modelling instruments. Among them are simple tools derived from data mining and empirical equations, analytical groundwater-related equations, as well as complex numerical flow and transport models (MODFLOW, MT3DMS and SEAWAT). Up to now, the simulation core of the INOWAS-DSS, which is based on the finite-difference groundwater flow model MODFLOW, is implemented and runs on the web. A scenario analyser helps to easily set up and evaluate new management options as well as future developments such as land use and climate change, and compare them to previous scenarios. Additionally, simple tools such as analytical equations to assess saltwater intrusion are already running online. Besides the simulation tools, a web-based database is under development where geospatial and time series data can be stored, managed, and processed. Furthermore, a web-based information system containing user guides for the various tools and applications, as well as basic information on MAR and related topics, has been published and will be regularly expanded as new tools are implemented. The INOWAS-DSS, including its simulation tools, database and information system, provides an extensive framework to manage, plan and optimize MAR facilities. As the INOWAS-DSS is open-source software accessible via the Internet using standard web browsers, it offers new ways of data sharing and collaboration among various partners and decision makers.
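The abstract does not name the analytical equations implemented; as one hedged example of the kind of simple relation such a tool might include, the classic Ghyben-Herzberg approximation relates the depth $z$ of the fresh-saltwater interface below sea level to the freshwater head $h$ above it:

    $z \;\approx\; \dfrac{\rho_f}{\rho_s - \rho_f}\, h \;\approx\; 40\,h$

using typical densities $\rho_f \approx 1000$ kg/m$^3$ for fresh water and $\rho_s \approx 1025$ kg/m$^3$ for seawater.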
Supporting geoscience with graphical-user-interface Internet tools for the Macintosh
NASA Astrophysics Data System (ADS)
Robin, Bernard
1995-07-01
This paper describes a suite of Macintosh graphical-user-interface (GUI) software programs that can be used in conjunction with the Internet to support geoscience education. These software programs allow science educators to access and retrieve a large body of resources from an increasing number of network sites, taking advantage of the intuitive, simple-to-use Macintosh operating system. With these tools, educators can easily locate, download, and exchange not only text files but also sound resources, video movie clips, and software application files from their desktop computers. Another major advantage of these software tools is that they are available at no cost and may be distributed freely. The following GUI software tools are described, including examples of how they can be used in an educational setting: ∗ Eudora—an e-mail program ∗ NewsWatcher—a newsreader ∗ TurboGopher—a Gopher program ∗ Fetch—a software application for easy File Transfer Protocol (FTP) ∗ NCSA Mosaic—a worldwide hypertext browsing program. An explosive growth of online archives is currently underway as new electronic sites are continuously added to the Internet. Many of these resources may be of interest to science educators who learn they can share not only ASCII text files, but also graphic image files, sound resources, QuickTime movie clips, and hypermedia projects with colleagues from locations around the world. These powerful yet simple-to-learn GUI software tools are providing a revolution in how knowledge can be accessed, retrieved, and shared.
Killeen, G F; McKenzie, F E; Foy, B D; Schieffelin, C; Billingsley, P F; Beier, J C
2000-05-01
We have used a relatively simple but accurate model for predicting the impact of integrated transmission control on the malaria entomologic inoculation rate (EIR) at four endemic sites from across sub-Saharan Africa and the southwest Pacific. The simulated campaign incorporated modestly effective vaccine coverage, bed net use, and larval control. The results indicate that such campaigns would reduce EIRs at all four sites by 30- to 50-fold. Even without the vaccine, 15- to 25-fold reductions of EIR were predicted, implying that integrated control with a few modestly effective tools can meaningfully reduce malaria transmission in a range of endemic settings. The model accurately predicts the effects of bed nets and indoor spraying and demonstrates that they are the most effective tools available for reducing EIR. However, the impact of domestic adult vector control is amplified by measures for reducing the rate of emergence of vectors or the level of infectiousness of the human reservoir. We conclude that available tools, including currently neglected methods for larval control, can reduce malaria transmission intensity enough to alleviate mortality. Integrated control programs should be implemented to the fullest extent possible, even in areas of intense transmission, using simple models as decision-making tools. However, we also conclude that to eliminate malaria in many areas of intense transmission is beyond the scope of methods which developing nations can currently afford. New, cost-effective, practical tools are needed if malaria is ever to be eliminated from highly endemic areas.
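Purely illustrative arithmetic, not the published model: if each intervention independently removes a fraction of transmission, the residual factors multiply, which is how a few modestly effective tools compound into the 30- to 50-fold EIR reductions quoted above.

    def eir_fold_reduction(vaccine=0.0, bed_nets=0.0, larval_control=0.0):
        """Each argument is the fractional reduction a tool achieves on its own."""
        remaining = (1 - vaccine) * (1 - bed_nets) * (1 - larval_control)
        return 1 / remaining  # fold-reduction in the entomologic inoculation rate

    # Three individually modest tools combine into a large reduction:
    print(f"{eir_fold_reduction(vaccine=0.5, bed_nets=0.8, larval_control=0.7):.0f}-fold")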
Pulling My Gut out--Simple Tools for Engaging Students in Gross Anatomy Lectures
ERIC Educational Resources Information Center
Chan, Lap Ki
2010-01-01
A lecture is not necessarily a monologue, promoting only passive learning. If appropriate techniques are used, a lecture can stimulate active learning too. One such method is demonstration, which can engage learners' attention and increase the interaction between the lecturer and the learners. This article describes two simple and useful tools for…
Creating Simple Admin Tools Using Info*Engine and Java
NASA Technical Reports Server (NTRS)
Jones, Corey; Kapatos, Dennis; Skradski, Cory; Felkins, J. D.
2012-01-01
PTC has provided a simple way to dynamically interact with Windchill using Info*Engine. This presentation will describe how to create simple Info*Engine tasks capable of relieving Windchill 10.0 administrators of tedious work.
Development of the Concept of Energy Conservation using Simple Experiments for Grade 10 Students
NASA Astrophysics Data System (ADS)
Rachniyom, S.; Toedtanya, K.; Wuttiprom, S.
2017-09-01
The purpose of this research was to develop students' concept of and retention rate in relation to energy conservation. Activities included simple and easy experiments that considered energy transformation from potential to kinetic energy. The participants were 30 purposively selected grade 10 students in the second semester of the 2016 academic year. The research tools consisted of learning lesson plans and a learning achievement test. Results showed that the experiments worked well and were appropriate as learning activities. The students' achievement scores increased significantly at the .05 statistical level, the students' retention rates were at a high level, and learning behaviour was at a good level. These simple experiments allowed students to learn by demonstrating to their peers and encouraged them to use familiar models to explain phenomena in daily life.
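For reference, the energy transformation the experiments demonstrate reduces to a one-line conservation statement; the numbers in the comment are an assumed worked example, not data from the study.

```latex
% Conservation of mechanical energy for a mass m dropped from height h:
\[
  mgh \,=\, \tfrac{1}{2} m v^{2}
  \quad\Longrightarrow\quad
  v \,=\, \sqrt{2 g h}.
\]
% Worked example (assumed numbers): h = 1.25 m and g = 9.8 m/s^2
% give v = sqrt(2 * 9.8 * 1.25) = 4.95 m/s, independent of the mass m.
```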
RSAT 2018: regulatory sequence analysis tools 20th anniversary.
Nguyen, Nga Thi Thuy; Contreras-Moreira, Bruno; Castro-Mondragon, Jaime A; Santana-Garcia, Walter; Ossio, Raul; Robles-Espinoza, Carla Daniela; Bahin, Mathieu; Collombet, Samuel; Vincens, Pierre; Thieffry, Denis; van Helden, Jacques; Medina-Rivera, Alejandra; Thomas-Chollier, Morgane
2018-05-02
RSAT (Regulatory Sequence Analysis Tools) is a suite of modular tools for the detection and the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, including from genome-wide datasets like ChIP-seq/ATAC-seq, (ii) motif scanning, (iii) motif analysis (quality assessment, comparisons and clustering), (iv) analysis of regulatory variations, (v) comparative genomics. Six public servers jointly support 10 000 genomes from all kingdoms. Six novel or refactored programs have been added since the 2015 NAR Web Software Issue, including updated programs to analyse regulatory variants (retrieve-variation-seq, variation-scan, convert-variations), along with tools to extract sequences from a list of coordinates (retrieve-seq-bed), to select motifs from motif collections (retrieve-matrix), and to extract orthologs based on Ensembl Compara (get-orthologs-compara). Three use cases illustrate the integration of new and refactored tools to the suite. This Anniversary update gives a 20-year perspective on the software suite. RSAT is well-documented and available through Web sites, SOAP/WSDL (Simple Object Access Protocol/Web Services Description Language) web services, virtual machines and stand-alone programs at http://www.rsat.eu/.
Improved Analysis of Earth System Models and Observations using Simple Climate Models
NASA Astrophysics Data System (ADS)
Nadiga, B. T.; Urban, N. M.
2016-12-01
Earth system models (ESM) are the most comprehensive tools we have to study climate change and develop climate projections. However, the computational infrastructure required and the cost incurred in running such ESMs preclude direct use of such models in conjunction with a wide variety of tools that can further our understanding of climate. Here we are referring to tools that range from dynamical systems tools that give insight into underlying flow structure and topology, to tools from various applied mathematical and statistical techniques that are central to quantifying stability, sensitivity, uncertainty and predictability, to machine learning tools that are now being rapidly developed or improved. Our approach to facilitate the use of such models is to analyze output of ESM experiments (cf. CMIP) using a range of simpler models that consider integral balances of important quantities such as mass and/or energy in a Bayesian framework. We highlight the use of this approach in the context of the uptake of heat by the world oceans in the ongoing global warming. Indeed, since in excess of 90% of the anomalous radiative forcing due to greenhouse gas emissions is sequestered in the world oceans, the nature of ocean heat uptake crucially determines the surface warming that is realized (cf. climate sensitivity). Nevertheless, ESMs themselves are never run long enough to directly assess climate sensitivity. So, we consider a range of models based on integral balances--balances that have to be realized in all first-principles based models of the climate system, including the most detailed state-of-the-art climate simulations. The models range from simple models of energy balance to those that consider dynamically important ocean processes such as the conveyor-belt circulation (Meridional Overturning Circulation, MOC), North Atlantic Deep Water (NADW) formation, the Antarctic Circumpolar Current (ACC) and eddy mixing. Results from Bayesian analysis of such models using both ESM experiments and actual observations are presented. One such result points to the importance of direct sequestration of heat below 700 m, a process that is not allowed for in the simple models that have been traditionally used to deduce climate sensitivity.
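As a hedged sketch of what a "simple model of energy balance" can look like, here is a minimal two-layer ocean energy-balance model; all parameter values are illustrative assumptions, not results from the study.

```python
# Minimal two-layer energy-balance model: an upper ocean layer exchanging
# heat with a deep layer. All parameters are assumed illustrative values.
F = 3.7        # radiative forcing (roughly 2xCO2), W/m^2
lam = 1.2      # climate feedback parameter, W/m^2/K
gamma = 0.7    # heat exchange coefficient with the deep ocean, W/m^2/K
C, C_d = 8.0, 100.0   # heat capacities (upper / deep), W yr/m^2/K
dt, years = 0.1, 300

T, T_d = 0.0, 0.0     # temperature anomalies, K
for _ in range(int(years / dt)):
    dT = (F - lam * T - gamma * (T - T_d)) / C
    dT_d = gamma * (T - T_d) / C_d
    T, T_d = T + dT * dt, T_d + dT_d * dt

print(f"warming after {years} yr: {T:.2f} K "
      f"(analytic equilibrium F/lambda = {F/lam:.2f} K)")
```

The slow deep-ocean term is exactly the sort of heat-sequestration pathway the abstract notes is missing from the simplest traditional models.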
Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard
2018-06-01
Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image analysis workflows for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks .
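A minimal sketch of the kind of workflow the notebooks demonstrate, assuming SimpleITK is installed; the input filename is a placeholder, not a file shipped with the repository.

```python
# Read an image, apply edge-preserving smoothing, segment with Otsu
# thresholding, and write the mask. Filenames are placeholder assumptions.
import SimpleITK as sitk

image = sitk.ReadImage("input_scan.nii.gz", sitk.sitkFloat32)

# Edge-preserving smoothing before segmentation.
smoothed = sitk.CurvatureFlow(image, timeStep=0.125, numberOfIterations=5)

# Automatic foreground/background split.
mask = sitk.OtsuThreshold(smoothed, 0, 1)

sitk.WriteImage(mask, "segmentation_mask.nii.gz")
print("voxels in mask:", sitk.GetArrayViewFromImage(mask).sum())
```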
ERIC Educational Resources Information Center
Gron, Liz U.; Bradley, Shelly B.; McKenzie, Jennifer R.; Shinn, Sara E.; Teague, M. Warfield
2013-01-01
This paper presents the use of simple, outcome-based assessment tools to design and evaluate the first semester of a new introductory laboratory program created to teach green analytical chemistry using environmental samples. This general chemistry laboratory program, like many introductory courses, has a wide array of stakeholders within and…
2012-01-01
Background The radiation field on most megavoltage radiation therapy units is indicated by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment. Hence it is imperative that the light field be congruent with the radiation field. Method A simple quality assurance tool has been designed for rapid and simple testing of light and radiation field congruence using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Results Both the single and double exposure techniques were evaluated, with the double exposure technique providing better visualization of the light-radiation field markers. Light and radiation field congruence could be detected to within 1 mm. This satisfies the American Association of Physicists in Medicine Task Group report number 142 recommendation of a 2 mm tolerance. Conclusion The QA tool can be used with either an EPID or CR to provide a simple and rapid method to verify light and radiation field congruence. PMID:22452821
The Persistence of Mode 1 Technology in the Korean Late Paleolithic
Lee, Hyeong Woo
2013-01-01
Ssangjungri (SJ), an open-air site with several Paleolithic horizons, was recently discovered in South Korea. Most of the identified artifacts are simple core and flake tools that indicate an expedient knapping strategy. Bifacially worked core tools, which might be considered non-classic bifaces, have also been found. The prolific horizons at the site were dated by accelerator mass spectrometry (AMS) to about 30 kya. Another newly discovered Paleolithic open-air site, Jeungsan (JS), shows a homogeneous lithic pattern during this period. The dominant artifact types and usage of raw materials are similar in character to those from SJ, although JS yielded a larger number of simple core and flake tools with non-classic bifaces. Chronometric analysis by AMS and optically stimulated luminescence (OSL) indicates that the prime stratigraphic levels at JS also date to approximately 30 kya, and the numerous conjoining pieces indicate that the layers were not seriously affected by post-depositional processes. Thus, it can be confirmed that simple core and flake tools were produced at temporally and culturally independent sites until after 30 kya, supporting the hypothesis of a wide and persistent use of simple technology into the Late Pleistocene. PMID:23724113
Nonlinear transient analysis via energy minimization
NASA Technical Reports Server (NTRS)
Kamat, M. P.; Knight, N. F., Jr.
1978-01-01
The formulation basis for nonlinear transient analysis of finite element models of structures using energy minimization is provided. Geometric and material nonlinearities are included. The development is restricted to simple one and two dimensional finite elements which are regarded as being the basic elements for modeling full aircraft-like structures under crash conditions. The results indicate the effectiveness of the technique as a viable tool for this purpose.
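The core idea, finding equilibrium by minimizing total potential energy, can be sketched for a single nonlinear (hardening) spring; the stiffness values and load below are assumptions for illustration, not data from the report.

```python
# Static equilibrium of a hardening spring under load P, found by
# minimizing total potential energy. All numeric values are assumed.
from scipy.optimize import minimize_scalar

k1, k3 = 100.0, 500.0   # linear and cubic stiffness terms
P = 40.0                # applied load

def potential_energy(u):
    # Strain energy of the spring minus the work done by the load.
    return 0.5 * k1 * u**2 + 0.25 * k3 * u**4 - P * u

res = minimize_scalar(potential_energy)
u = res.x
print(f"equilibrium displacement u = {u:.4f}")
print(f"residual force balance: {k1*u + k3*u**3 - P:.2e}")  # ~0 at the minimum
```

At the minimum, the derivative of the potential vanishes, which reproduces the force balance k1*u + k3*u^3 = P; a transient analysis repeats this kind of minimization step by step in time.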
Rees, Alice; Bott, Lewis
2017-01-01
Structural priming is a useful tool for investigating linguistics representations. We argue that structural priming can be extended to the investigation of pragmatic representations such as Gricean enrichments. That is not to say priming is without its limitations, however. Interpreting a failure to observe priming may not be as simple as Branigan & Pickering (B&P) imply.
Molgenis-impute: imputation pipeline in a box.
Kanterakis, Alexandros; Deelen, Patrick; van Dijk, Freerk; Byelas, Heorhiy; Dijkstra, Martijn; Swertz, Morris A
2015-08-19
Genotype imputation is an important procedure in current genomic analysis such as genome-wide association studies, meta-analyses and fine mapping. Although high quality tools are available that perform the steps of this process, considerable effort and expertise are required to set up and run a best practice imputation pipeline, particularly for larger genotype datasets, where imputation has to scale out in parallel on computer clusters. Here we present MOLGENIS-impute, an 'imputation in a box' solution that seamlessly and transparently automates the set up and running of all the steps of the imputation process. These steps include genome build liftover, genotype phasing with SHAPEIT2, quality control, sample and chromosomal chunking/merging, and imputation with IMPUTE2. MOLGENIS-impute builds on MOLGENIS-compute, a simple pipeline management platform for submission and monitoring of bioinformatics tasks in High Performance Computing (HPC) environments like local/cloud servers, clusters and grids. All the required tools, data and scripts are downloaded and installed in a single step. Researchers with diverse backgrounds and expertise have tested MOLGENIS-impute at different locations and have imputed over 30,000 samples so far using the 1,000 Genomes Project and new Genome of the Netherlands data as the imputation reference. The tests have been performed on PBS/SGE clusters, cloud VMs and in a grid HPC environment. MOLGENIS-impute gives priority to the ease of setting up, configuring and running an imputation. It has minimal dependencies and wraps the pipeline in a simple command line interface, without sacrificing flexibility to adapt or limiting the options of underlying imputation tools. It does not require knowledge of a workflow system or programming, and is targeted at researchers who just want to apply best practices in imputation via simple commands. It is built on the MOLGENIS compute workflow framework to enable customization with additional computational steps, or it can be included in other bioinformatics pipelines. It is available as open source from: https://github.com/molgenis/molgenis-imputation.
Helioviewer.org: An Open-source Tool for Visualizing Solar Data
NASA Astrophysics Data System (ADS)
Hughitt, V. Keith; Ireland, J.; Schmiedel, P.; Dimitoglou, G.; Mueller, D.; Fleck, B.
2009-05-01
As the amount of solar data available to scientists continues to increase at faster and faster rates, it is important that there exist simple tools for navigating this data quickly with a minimal amount of effort. By combining heterogeneous solar physics datatypes such as full-disk images and coronagraphs, along with feature and event information, Helioviewer offers a simple and intuitive way to browse multiple datasets simultaneously. Images are stored in a repository using the JPEG 2000 format and tiled dynamically upon a client's request. By tiling images and serving only the portions of the image requested, it is possible for the client to work with very large images without having to fetch all of the data at once. Currently, Helioviewer enables users to browse the entire SOHO data archive, updated hourly, as well as feature/event data from eight different catalogs, including active region, flare, coronal mass ejection, and type II radio burst data. In addition to a focus on intercommunication with other virtual observatories and browsers (VSO, HEK, etc.), Helioviewer will offer a number of externally-available application programming interfaces (APIs) to enable easy third-party use, adoption and extension. Future functionality will include: support for additional data sources including TRACE, SDO and STEREO, dynamic movie generation, a navigable timeline of recorded solar events, social annotation, and basic client-side image processing.
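The tiling arithmetic behind "serving only the portions requested" can be sketched in a few lines; the 512-pixel tile size and the viewport values are assumptions for illustration, not Helioviewer's actual parameters.

```python
# Given a viewport over a large image, compute which fixed-size tiles
# must be fetched. Tile size and viewport values are assumed.
TILE = 512  # tile edge length in pixels

def tiles_for_viewport(x0, y0, width, height):
    """Return (col, row) indices of every tile intersecting the viewport."""
    first_col, first_row = x0 // TILE, y0 // TILE
    last_col = (x0 + width - 1) // TILE
    last_row = (y0 + height - 1) // TILE
    return [(c, r)
            for r in range(first_row, last_row + 1)
            for c in range(first_col, last_col + 1)]

# A 1000x800 viewport needs only 4 tiles of a multi-gigapixel image.
print(tiles_for_viewport(x0=4096, y0=2048, width=1000, height=800))
```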
NASA Astrophysics Data System (ADS)
Gerhard, Christoph; Adams, Geoff
2015-10-01
Geometric optics is at the heart of optics teaching. Some of us may remember using pins and string to test the simple lens equation at school. Matters get more complex at undergraduate/postgraduate levels as we are introduced to paraxial rays, real rays, wavefronts, aberration theory and much more. Software is essential for the later stages, and the right software can profitably be used even at school. We present two free PC programs, which have been widely used in optics teaching, and have been further developed in close cooperation with lecturers/professors in order to address the current content of the curricula for optics, photonics and lasers in higher education. PreDesigner is a single thin lens modeller. It illustrates the simple lens law with construction rays and then allows the user to include field size and aperture. Sliders can be used to adjust key values with instant graphical feedback. This tool thus represents a helpful teaching medium for the visualization of basic interrelations in optics. WinLens3DBasic can model multiple thin or thick lenses with real glasses. It shows the system focii, principal planes, nodal points, gives paraxial ray trace values, details the Seidel aberrations, offers real ray tracing and many forms of analysis. It is simple to reverse lenses and model tilts and decenters. This tool therefore provides a good base for learning lens design fundamentals. Much work has been put into offering these features in ways that are easy to use, and offer opportunities to enhance the student's background understanding.
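For context, the simple lens law that PreDesigner visualizes is easy to check numerically; the focal length and object distance below are assumed example values, not defaults from the software.

```python
# Thin lens equation, 1/f = 1/u + 1/v in the "real is positive"
# convention: object distance u, image distance v, focal length f.
def image_distance(f, u):
    """Return the image distance v for focal length f and object distance u."""
    return 1.0 / (1.0 / f - 1.0 / u)

f, u = 50.0, 200.0            # mm; assumed example values
v = image_distance(f, u)
magnification = -v / u        # negative sign: the real image is inverted
print(f"v = {v:.1f} mm, magnification = {magnification:.2f}")
```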
Galeotti, Francesco; Barile, Elisa; Lanzotti, Virginia; Dolci, Marcello; Curir, Paolo
2008-01-01
One flavone-C-glycoside and two flavonol-O-glycosides were recognized and isolated as the main flavonoid components in nine different carnation cultivars, and their chemical structures have been determined by spectroscopic methods, including UV detection, MS and NMR. The distribution of these three compounds in flowers, leaves, stems, young sprouts, and roots of each cultivar was evaluated by a simple HPLC-UV method: the graphic representation of their content in the different tissues makes it possible to identify and unambiguously characterize each carnation cultivar considered. The presented method could be an easy, inexpensive and reliable tool for carnation cultivar discrimination.
Image manipulation as research misconduct.
Parrish, Debra; Noonan, Bridget
2009-06-01
A growing number of research misconduct cases handled by the Office of Research Integrity involve image manipulations. Manipulations may include simple image enhancements, misrepresenting an image as something different from what it is, and altering specific features of an image. Through a study of specific cases, the misconduct findings associated with image manipulation, detection methods and those likely to identify such manipulations, are discussed. This article explores sanctions imposed against guilty researchers and the factors that resulted in no misconduct finding although relevant images clearly were flawed. Although new detection tools are available for universities and journals to detect questionable images, this article explores why these tools have not been embraced.
Using computer-aided drug design and medicinal chemistry strategies in the fight against diabetes.
Semighini, Evandro P; Resende, Jonathan A; de Andrade, Peterson; Morais, Pedro A B; Carvalho, Ivone; Taft, Carlton A; Silva, Carlos H T P
2011-04-01
The aim of this work is to present a simple, practical and efficient protocol for drug design, in particular for diabetes, which includes selection of the illness, careful choice of a target and a bioactive ligand, and then the use of various computer-aided drug design and medicinal chemistry tools to design novel potential drug candidates for different diseases. We have selected the validated target dipeptidyl peptidase IV (DPP-IV), whose inhibition contributes to reducing glucose levels in type 2 diabetes patients. The most active inhibitor with a reported X-ray complex structure was initially extracted from the BindingDB database. By using molecular modification strategies widely used in medicinal chemistry, together with current state-of-the-art tools in drug design (including flexible docking, virtual screening, molecular interaction fields, molecular dynamics, and ADME and toxicity predictions), we have proposed 4 novel potential DPP-IV inhibitors with drug properties for diabetes control, which have been supported and validated by all the computational tools used herewith.
An Upgrade Pinning Block: A Mechanical Practical Aid for Fast Labelling of the Insect Specimens.
Ghafouri Moghaddam, Mohammad Hossein; Ghafouri Moghaddam, Mostafa; Rakhshani, Ehsan; Mokhtari, Azizollah
2017-01-01
A new mechanical innovation is described to deal with standard labelling of dried specimens on triangular cards and/or pinned specimens in personal and public collections. It works quickly, precisely, and easily and is very useful for maintaining label uniformity in collections. The tool accurately sets the position of labels in the shortest possible time. This tool has advantages including rapid processing, cost effectiveness, light weight, and high accuracy, compared to conventional methods. It is fully customisable, compact, and does not require specialist equipment to assemble. Conventional methods generally require locating holes on the pinning block surface when labelling, with a resulting risk of damage to the specimens. Insects of different orders can be labelled by this simple and effective tool. PMID:29104440
Mycobacteriophages: an important tool for the diagnosis of Mycobacterium tuberculosis (review).
Fu, Xiaoyan; Ding, Mingxing; Zhang, Ning; Li, Jicheng
2015-07-01
The prevention and control of tuberculosis (TB) on a global scale has become increasingly important with the emergence of multidrug-resistant TB. Mycobacterium tuberculosis phages have been identified as an important investigative tool. Phage genomes exhibit a significant level of diversity and mosaic genome architecture; however, they are simple structures which are amenable to genetic manipulation. Based on these characteristics, the phages may be used to construct a shuttle plasmid, which is an indispensable tool in the investigation of TB. Furthermore, they may be used for the rapid diagnosis and drug-susceptibility assessment of TB, including phage amplification assays and reporter phage technology. An improved understanding of mycobacteriophages may further clarify the pathogenesis of TB, as well as the implications for its diagnosis and therapy.
Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations
NASA Technical Reports Server (NTRS)
Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.
2015-01-01
The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools, including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
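One minimal analytic check-case of the kind described, a point mass in vacuum over flat Earth, can be sketched as follows; the initial conditions are assumed example values, not those of the published check-cases.

```python
# RK4-propagated altitude of a vertically launched point mass in vacuum,
# compared against the closed-form solution h(t) = v0*t - g*t^2/2.
G0 = 9.80665  # m/s^2

def rk4_step(state, dt):
    def deriv(s):
        h, v = s
        return (v, -G0)
    k1 = deriv(state)
    k2 = deriv((state[0] + 0.5*dt*k1[0], state[1] + 0.5*dt*k1[1]))
    k3 = deriv((state[0] + 0.5*dt*k2[0], state[1] + 0.5*dt*k2[1]))
    k4 = deriv((state[0] + dt*k3[0], state[1] + dt*k3[1]))
    return (state[0] + dt*(k1[0] + 2*k2[0] + 2*k3[0] + k4[0])/6,
            state[1] + dt*(k1[1] + 2*k2[1] + 2*k3[1] + k4[1])/6)

v0, dt, t_end = 100.0, 0.01, 10.0   # assumed: launch at 100 m/s, 10 s
state = (0.0, v0)
for _ in range(int(t_end / dt)):
    state = rk4_step(state, dt)

analytic = v0*t_end - 0.5*G0*t_end**2
print(f"RK4 altitude {state[0]:.3f} m vs analytic {analytic:.3f} m")
```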
NASA Astrophysics Data System (ADS)
See, Linda; Perger, Christoph; Dresel, Christopher; Hofer, Martin; Weichselbaum, Juergen; Mondel, Thomas; Fritz, Steffen
2016-04-01
The validation of land cover products is an important step in the workflow of generating a land cover map from remotely-sensed imagery. Many students of remote sensing will be given exercises on classifying a land cover map, followed by the validation process. Many algorithms exist for classification, embedded within proprietary image processing software or, increasingly, as open source tools. However, there is little standardization for land cover validation, nor is there a set of open tools available for implementing this process. The LACO-Wiki tool was developed as a way of filling this gap, bringing together standardized land cover validation methods and workflows into a single portal. This includes the storage and management of land cover maps and validation data; step-by-step instructions to guide users through the validation process; sound sampling designs; an easy-to-use environment for validation sample interpretation; and the generation of accuracy reports based on the validation process. The tool was developed for a range of users including producers of land cover maps, researchers, teachers and students. The use of such a tool could be embedded within the curriculum of remote sensing courses at a university level but is simple enough for use by students aged 13-18. A beta version of the tool is available for testing at: http://www.laco-wiki.net.
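A sketch of the accuracy-reporting step of such a validation workflow, using a made-up confusion matrix rather than actual LACO-Wiki output:

```python
# Overall, user's, and producer's accuracy from a confusion matrix.
# The class names and counts are invented illustration data.
import numpy as np

classes = ["forest", "urban", "water"]
# rows = mapped (predicted) class, columns = reference (validation) class
cm = np.array([[50,  4,  1],
               [ 6, 30,  2],
               [ 1,  1, 25]])

overall = np.trace(cm) / cm.sum()
users = np.diag(cm) / cm.sum(axis=1)      # commission side (map rows)
producers = np.diag(cm) / cm.sum(axis=0)  # omission side (reference columns)

print(f"overall accuracy: {overall:.1%}")
for name, u, p in zip(classes, users, producers):
    print(f"{name:>7}: user's {u:.1%}, producer's {p:.1%}")
```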
New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
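For context, the kind of analytic solution such simplified cross sections enable can be as small as the one-group infinite-medium multiplication factor; the cross-section values below are arbitrary assumptions, not data produced by simple_ace.pl.

```python
# One-group infinite-medium multiplication factor:
# k_inf = nu * Sigma_f / Sigma_a, a closed-form benchmark that a
# Monte Carlo eigenvalue run can be verified against.
nu = 2.5          # neutrons released per fission (assumed)
sigma_f = 0.05    # macroscopic fission cross section, 1/cm (assumed)
sigma_a = 0.10    # macroscopic absorption cross section, 1/cm (assumed)

k_inf = nu * sigma_f / sigma_a
print(f"analytic k_inf = {k_inf:.4f}")  # compare against the MC estimate
```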
Final Technical Report Power through Policy: "Best Practices" for Cost-Effective Distributed Wind
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhoads-Weaver, Heather; Gagne, Matthew; Sahl, Kurt
2012-02-28
Power through Policy: 'Best Practices' for Cost-Effective Distributed Wind is a U.S. Department of Energy (DOE)-funded project to identify distributed wind technology policy best practices and to help policymakers, utilities, advocates, and consumers examine their effectiveness using a pro forma model. Incorporating a customized feed from the Database of State Incentives for Renewables and Efficiency (DSIRE), the Web-based Distributed Wind Policy Comparison Tool (Policy Tool) is designed to assist state, local, and utility officials in understanding the financial impacts of different policy options to help reduce the cost of distributed wind technologies. The project's final products include the Distributed Wind Policy Comparison Tool, found at www.windpolicytool.org, and its accompanying documentation: Distributed Wind Policy Comparison Tool Guidebook: User Instructions, Assumptions, and Case Studies. With only two initial user inputs required, the Policy Tool allows users to adjust and test a wide range of policy-related variables through a user-friendly dashboard interface with slider bars. The Policy Tool is populated with a variety of financial variables, including turbine costs, electricity rates, policies, and financial incentives; economic variables including discount and escalation rates; as well as technical variables that impact electricity production, such as turbine power curves and wind speed. The Policy Tool allows users to change many of the variables, including the policies, to gauge the expected impacts that various policy combinations could have on the cost of energy (COE), net present value (NPV), internal rate of return (IRR), and the simple payback of distributed wind projects ranging in size from 2.4 kilowatts (kW) to 100 kW. The project conducted case studies to demonstrate how the Policy Tool can provide insights into 'what if' scenarios and also allow the current status of incentives to be examined or defended when necessary. The ranking of distributed wind state policy and economic environments summarized in the attached report, based on the Policy Tool's default COE results, highlights favorable market opportunities for distributed wind growth as well as market conditions ripe for improvement. Best practices for distributed wind state policies are identified through an evaluation of their effect on improving the bottom line of project investments. The case studies and state rankings were based on incentives, power curves, and turbine pricing as of 2010, and may not match the current results from the Policy Tool. The Policy Tool can be used to evaluate the ways that a variety of federal and state policies and incentives impact the economics of distributed wind (and subsequently its expected market growth). It also allows policymakers to determine the impact of policy options, addressing market challenges identified in the U.S. DOE's '20% Wind Energy by 2030' report and helping to meet COE targets. In providing a simple and easy-to-use policy comparison tool that estimates financial performance, the Policy Tool and guidebook are expected to enhance market expansion by the small wind industry by increasing and refining the understanding of distributed wind costs, policy best practices, and key market opportunities in all 50 states.
This comprehensive overview and customized software to quickly calculate and compare policy scenarios represent a fundamental step in allowing policymakers to see how their decisions impact the bottom line for distributed wind consumers, while estimating the relative advantages of different options available in their policy toolboxes. Interested stakeholders have suggested numerous ways to enhance and expand the initial effort to develop an even more user-friendly Policy Tool and guidebook, including the enhancement and expansion of the current tool, and conducting further analysis. The report and the project's Guidebook include further details on possible next steps. NREL Report No. BK-5500-53127; DOE/GO-102011-3453.
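As a hedged illustration of two of the metrics the Policy Tool reports, simple payback and NPV can be computed in a few lines; the inputs below are invented and are not the tool's defaults.

```python
# Simple payback and net present value for a small wind project.
# All numeric inputs are assumed illustrative values.
def simple_payback(capital_cost, annual_savings):
    return capital_cost / annual_savings

def npv(capital_cost, annual_savings, discount_rate, years):
    pv = sum(annual_savings / (1 + discount_rate) ** t
             for t in range(1, years + 1))
    return pv - capital_cost

cost, savings = 60000.0, 4500.0     # assumed 10 kW-class project, $/yr
print(f"simple payback: {simple_payback(cost, savings):.1f} yr")
print(f"NPV over 20 yr at 5%: ${npv(cost, savings, 0.05, 20):,.0f}")
```

A policy lever such as an investment incentive enters these formulas directly, by reducing the capital cost or raising the annual savings, which is the effect the Policy Tool's slider bars expose.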
NASA Technical Reports Server (NTRS)
Katz, Daniel S.; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.; Borgioli, Andrea
2000-01-01
The process of designing and analyzing a multiple-reflector system has traditionally been time-intensive, requiring large amounts of both computational and human time. At many frequencies, a discrete approximation of the radiation integral may be used to model the system. The code which implements this physical optics (PO) algorithm was developed at the Jet Propulsion Laboratory. It analyzes systems of antennas in pairs, and for each pair, the analysis can be computationally time-consuming. Additionally, the antennas must be described using a local coordinate system for each antenna, which makes it difficult to integrate the design into a multi-disciplinary framework in which there is traditionally one global coordinate system, even before considering deforming the antenna as prescribed by external structural and/or thermal factors. Finally, setting up the code to correctly analyze all the antenna pairs in the system can take a fair amount of time, and introduces possible human error. The use of parallel computing to reduce the computational time required for the analysis of a given pair of antennas has been previously discussed. This paper focuses on the other problems mentioned above. It will present a methodology and examples of use of an automated tool that performs the analysis of a complete multiple-reflector system in an integrated multi-disciplinary environment (including CAD modeling, and structural and thermal analysis) at the click of a button. This tool, named MOD Tool (Millimeter-wave Optics Design Tool), has been designed and implemented as a distributed tool, with a client that runs almost identically on Unix, Mac, and Windows platforms, and a server that runs primarily on a Unix workstation and can interact with parallel supercomputers with simple instruction from the user interacting with the client.
Hearing loss screening tool (COBRA score) for newborns in primary care setting
Poonual, Watcharapol; Navacharoen, Niramon; Kangsanarak, Jaran; Namwongprom, Sirianong
2017-01-01
Purpose To develop and evaluate a simple screening tool to assess hearing loss in newborns. A derived score was compared with the standard clinical practice tool. Methods This cohort study was designed to screen the hearing of newborns using transiently evoked otoacoustic emission and auditory brain stem response, and to determine the risk factors associated with hearing loss of newborns in 3 tertiary hospitals in Northern Thailand. Data were prospectively collected from November 1, 2010 to May 31, 2012. To develop the risk score, clinical-risk indicators were measured by Poisson risk regression. The regression coefficients were transformed into item scores dividing each regression-coefficient with the smallest coefficient in the model, rounding the number to its nearest integer, and adding up to a total score. Results Five clinical risk factors (Craniofacial anomaly, Ototoxicity, Birth weight, family history [Relative] of congenital sensorineural hearing loss, and Apgar score) were included in our COBRA score. The screening tool detected, by area under the receiver operating characteristic curve, more than 80% of existing hearing loss. The positive-likelihood ratio of hearing loss in patients with scores of 4, 6, and 8 were 25.21 (95% confidence interval [CI], 14.69–43.26), 58.52 (95% CI, 36.26–94.44), and 51.56 (95% CI, 33.74–78.82), respectively. This result was similar to the standard tool (The Joint Committee on Infant Hearing) of 26.72 (95% CI, 20.59–34.66). Conclusion A simple screening tool of five predictors provides good prediction indices for newborn hearing loss, which may motivate parents to bring children for further appropriate testing and investigations. PMID:29234358
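The coefficient-to-score transformation described in the Methods can be sketched directly; the coefficients below are invented for illustration and are not the published COBRA estimates.

```python
# Transform regression coefficients into integer item scores: divide each
# coefficient by the smallest one and round to the nearest integer.
# All coefficient values are invented for illustration.
coefficients = {
    "craniofacial anomaly": 0.92,
    "ototoxicity": 0.61,
    "low birth weight": 0.45,
    "family history": 0.88,
    "low Apgar score": 0.50,
}

smallest = min(coefficients.values())
item_scores = {k: round(v / smallest) for k, v in coefficients.items()}
print(item_scores)

# Total score for a hypothetical newborn with two risk factors present:
present = ["ototoxicity", "low Apgar score"]
print("total score:", sum(item_scores[f] for f in present))
```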
Hatler, Carol W; Grove, Charlene; Strickland, Stephanie; Barron, Starr; White, Bruce D
2012-01-01
Many critically ill patients in intensive care units (ICUs) are unable to communicate their wishes about goals of care, particularly about the use of life-sustaining treatments. Surrogates and clinicians struggle with medical decisions because of a lack of clarity regarding patients' preferences, leading to prolonged hospitalizations and increased costs. This project focused on the development and implementation of a tool to facilitate a better communication process by (1) assuring the early identification of a surrogate if indicated on admission and (2) clarifying the decision-making standards that the surrogate was to use when participating in decision making. Before introducing the tool into the admissions routine, the staff were educated about its use and value to the decision-making process. PROJECT AND METHODS: The study sought to determine whether early use of a simple method of identifying a patient's surrogate and treatment preferences might impact length of stay (LOS) and total hospital charges. A pre- and post-intervention study design was used. Nurses completed the surrogacy information tool for all patients upon admission to the neuroscience ICU. Subjects (total N = 203) were critically ill patients who had been on a mechanical ventilator for 96 hours or longer, or in the ICU for seven days or longer. The project included staff education on biomedical ethics, critical communication skills, early identification of families and staff in crisis, and use of a simple tool to document patients' surrogates and previously expressed care wishes. Data on hospital LOS and hospital charges were collected through a retrospective review of medical records for similar four-month time frames pre- and post-implementation of the assessment tool. Significant differences were found between pre- and post-groups in terms of hospital LOS (F = 6.39, p = .01) and total hospital charges (F = 7.03, p = .009). Project findings indicate that the use of a simple admission assessment tool, supported by staff education about its completion, use, and available resources, can decrease LOS and lower total hospital charges. The reasons for the difference between the pre- and post-intervention groups remain unclear. Further research is needed to evaluate whether the quality of communications between patients, their legally authorized representatives, and clinicians--as suggested in the literature--may have played a role in decreasing LOS and total hospital charges.
Simple Tools to Facilitate Project Management of a Nursing Research Project.
Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret
2016-07-01
Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained. © The Author(s) 2016.
Breaking Barriers and Building Bridges: Using EJ SCREEN ...
Communities across the United States are faced with concerns about environmental risks and exposures, including air contaminants near roadways, proximity to hazardous waste sites and children's environmental health. These concerns are compounded by complicated data, limited opportunities for collaboration and resource-based restrictions such as funding. This workshop will introduce innovative approaches for combining the capacity of EPA science tools - EJ SCREEN and the recently released Community Focused Exposure and Risk Screening Tool (C-FERST). Following a nationally applicable case study, participants will learn how these tools can be used sequentially to: (1) identify community environmental health 'hotspots'; (2) take a closer look at local scale sources of exposure and; (3) use new features of the tool to target potential partners and resources across the country. By exploring the power of GIS mapping and crowdsourced data, participants will leave with simple, user-defined approaches for using state of the science tools to advance their community and environmental health projects. Presentation using EJ SCREEN and C-FERST
Evaluating Lexical Coverage in Simple English Wikipedia Articles: A Corpus-Driven Study
ERIC Educational Resources Information Center
Hendry, Clinton; Sheepy, Emily
2017-01-01
Simple English Wikipedia is a user-contributed online encyclopedia intended for young readers and readers whose first language is not English. We compiled a corpus of the entirety of Simple English Wikipedia as of June 20th, 2017. We used lexical frequency profiling tools to investigate the vocabulary size needed to comprehend Simple English…
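A sketch of the lexical frequency profiling idea, with a toy corpus and word list standing in for the real Simple English Wikipedia data:

```python
# Lexical coverage: what fraction of a text's tokens fall within a known
# word list. The text and word list here are tiny invented stand-ins.
from collections import Counter

text = ("the cat sat on the mat the cat saw a dog "
        "the dog ran to the big tree")
tokens = text.split()

known_words = {"the", "cat", "on", "a", "dog", "to", "big"}  # assumed list

counts = Counter(tokens)
covered = sum(n for w, n in counts.items() if w in known_words)
coverage = covered / len(tokens)
print(f"coverage: {coverage:.1%} of {len(tokens)} tokens")
```

Profiling studies typically repeat this calculation with word lists of increasing size to estimate the vocabulary needed to reach a comprehension threshold such as 95% or 98% coverage.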
Novel simple and practical nutritional screening tool for cancer inpatients: a pilot study.
Zekri, Jamal; Morganti, Julie; Rizvi, Azhar; Sadiq, Bakr Bin; Kerr, Ian; Aslam, Mohamed
2014-05-01
There is a lack of consensus on how nutritional screening and intervention should be provided to cancer patients. Nutritional screening and support of cancer patients are not well established in the Middle East. We report our systematic and practical experience, led by a qualified specialist dietician in a cancer inpatient setting, using a novel nutritional screening tool. Ninety-seven consecutive inpatients underwent nutritional screening and were categorised into three nutritional risk groups based on oral intake, gastrointestinal symptoms, body mass index (BMI) and weight loss. Nutritional support was introduced accordingly. Statistical tests used included ANOVA, Bonferroni post hoc, chi-square and log rank tests. Median age was 48 (range 19-87) years. Patients were categorised into three nutritional risk groups: 55% low, 37% intermediate and 8% high. Nutritional intervention was introduced for 36% of these patients. Individually, weight, BMI, oral intake, serum albumin on admission and weight loss significantly affected nutritional risk and nutritional intervention (all significant P values). Eighty-seven percent, 60% and 55% of patients admitted for chemotherapy, febrile neutropenia and other reasons, respectively, did not require specific nutritional intervention. There was a statistically significant relationship between nutritional risk and nutritional intervention (P=0.005). Significantly more patients were alive at 3 months in the low-risk (91%) than the intermediate- (75%) or high-risk (37%) groups. About a third of cancer inpatients require nutritional intervention. The adopted nutritional risk assessment tool is simple and practical. The validity of this tool is supported by its significant relation with known individual nutritional risk factors. This should be confirmed in a larger prospective study comparing this new tool with other established ones.
KILLEEN, GERRY F.; McKENZIE, F. ELLIS; FOY, BRIAN D.; SCHIEFFELIN, CATHERINE; BILLINGSLEY, PETER F.; BEIER, JOHN C.
2008-01-01
We have used a relatively simple but accurate model for predicting the impact of integrated transmission control on the malaria entomologic inoculation rate (EIR) at four endemic sites from across sub-Saharan Africa and the southwest Pacific. The simulated campaign incorporated modestly effective vaccine coverage, bed net use, and larval control. The results indicate that such campaigns would reduce EIRs at all four sites by 30- to 50-fold. Even without the vaccine, 15- to 25-fold reductions of EIR were predicted, implying that integrated control with a few modestly effective tools can meaningfully reduce malaria transmission in a range of endemic settings. The model accurately predicts the effects of bed nets and indoor spraying and demonstrates that they are the most effective tools available for reducing EIR. However, the impact of domestic adult vector control is amplified by measures for reducing the rate of emergence of vectors or the level of infectiousness of the human reservoir. We conclude that available tools, including currently neglected methods for larval control, can reduce malaria transmission intensity enough to alleviate mortality. Integrated control programs should be implemented to the fullest extent possible, even in areas of intense transmission, using simple models as decision-making tools. However, we also conclude that to eliminate malaria in many areas of intense transmission is beyond the scope of methods which developing nations can currently afford. New, cost-effective, practical tools are needed if malaria is ever to be eliminated from highly endemic areas. PMID:11289662
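A deliberately simplified sketch of one reading of the integrated-control argument above: if each modestly effective intervention independently removes a fixed proportion of transmission, the residual EIR is the product of the survival fractions. The efficacy values below are assumptions for illustration, not the paper's fitted model.

```python
# Combine proportional reductions multiplicatively to estimate residual EIR.
# Baseline EIR and all efficacy values are assumed, not from the paper.
baseline_eir = 300.0      # infectious bites per person per year

interventions = {
    "bed nets": 0.60,       # assumed proportional reduction in transmission
    "larval control": 0.50,
    "vaccine": 0.40,
}

residual = baseline_eir
for effect in interventions.values():
    residual *= (1.0 - effect)

print(f"EIR: {baseline_eir:.0f} -> {residual:.0f} "
      f"({baseline_eir / residual:.0f}-fold reduction)")
```

Even this crude multiplicative picture shows why measures that suppress vector emergence or human infectiousness amplify the impact of domestic adult vector control: each additional survival fraction multiplies the others.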
Digitizing for Computer-Aided Finite Element Model Generation.
1979-10-10
The basis of this approach is a collection of programs developed over the last eight years at the University of Arizona, called the GIFTS system. This paper briefly describes the latest version of the system, GIFTS-5, and demonstrates its suitability in a design environment by simple examples. The programs constituting the GIFTS system were used as a tool for research in many areas, including mesh generation, finite element data base design, and interactive graphics.
Predicting Networked Strategic Behavior via Machine Learning and Game Theory
2015-01-13
The funding for this project was used to develop basic models, methodology and algorithms for the application of machine learning and related tools to settings in which strategic behavior is central. Among the topics studied was the development of simple behavioral models explaining and predicting human subject behavior in networked strategic experiments from prior work. These included experiments in biased voting and networked trading, among others.
En route Spacing Tool: Efficient Conflict-free Spacing to Flow-Restricted Airspace
NASA Technical Reports Server (NTRS)
Green, S.
1999-01-01
This paper describes the Air Traffic Management (ATM) problem of flow-restricted en route airspace within the U.S., an assessment of its impact on airspace users, and a set of near-term tools and procedures to resolve the problem. The FAA is committed, over the next few years, to deploy the first generation of modern ATM decision support tool (DST) technology under the Free-Flight Phase-1 (FFP1) program. The associated en route tools include the User Request Evaluation Tool (URET) and the Traffic Management Advisor (TMA). URET is an initial conflict probe (ICP) capability that assists controllers with the detection and resolution of conflicts in en route airspace. TMA orchestrates arrivals transitioning into high-density terminal airspace by providing controllers with scheduled times of arrival (STA) and delay feedback advisories to assist with STA conformance. However, these FFP1 capabilities do not mitigate the en route Miles-In-Trail (MIT) restrictions that are dynamically applied to mitigate airspace congestion. National statistics indicate that en route facilities (Centers) apply MIT restrictions for approximately 5000 hours per month. Based on results from this study, an estimated 45,000 flights are impacted by these restrictions each month. Current-day practices for implementing these restrictions result in additional controller workload and an economic impact of which the fuel penalty alone may approach several hundred dollars per flight. To mitigate much of the impact of these restrictions on users and controller workload, a DST and procedures are presented. The DST is based on a simple derivative of FFP1 technology that is designed to introduce a set of simple tools for flow-rate (spacing) conformance and integrate them with conflict-probe capabilities. The tool and associated algorithms are described based on a concept prototype implemented within the CTAS baseline in 1995. A traffic scenario is used to illustrate the controller's use of the tool, and potential display options are presented for future controller evaluation.
Cunningham, J C; Sinka, I C; Zavaliangos, A
2004-08-01
In this first of two articles on the modeling of tablet compaction, the experimental inputs related to the constitutive model of the powder and the powder/tooling friction are determined. The continuum-based analysis of tableting makes use of an elasto-plastic model, which incorporates the elements of yield, plastic flow potential, and hardening, to describe the mechanical behavior of microcrystalline cellulose over the range of densities experienced during tableting. Specifically, a modified Drucker-Prager/cap plasticity model, which includes material parameters such as cohesion, internal friction, and hydrostatic yield pressure that evolve with the internal state variable relative density, was applied. Linear elasticity is assumed, with the elastic parameters, Young's modulus and Poisson's ratio, dependent on the relative density. The calibration techniques were developed based on a series of simple mechanical tests including diametrical compression, simple compression, and die compaction using an instrumented die. The friction behavior is measured using an instrumented die and the experimental data are analyzed using the method of differential slices. The constitutive model and frictional properties are essential experimental inputs to the finite element-based model described in the companion article. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association. J Pharm Sci 93:2022-2039, 2004
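For reference, the shear-failure line of a linear Drucker-Prager/cap model takes the following schematic form; this is a sketch only, and the calibrated model described above also includes an elliptical cap surface, omitted here.

```latex
% Schematic shear-failure line of a linear Drucker-Prager/cap model,
% with cohesion d and internal friction angle beta evolving with the
% relative density \rho_r (the cap surface is omitted in this sketch):
\[
  F_s(p, q) \;=\; q \;-\; p \tan\beta(\rho_r) \;-\; d(\rho_r) \;=\; 0,
\]
% where p is the hydrostatic pressure and q the von Mises equivalent
% stress; d and beta are the quantities the diametrical and simple
% compression tests calibrate as functions of relative density.
```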
Preparing for Euro 2012: developing a hazard risk assessment.
Wong, Evan G; Razek, Tarek; Luhovy, Artem; Mogilevkina, Irina; Prudnikov, Yuriy; Klimovitskiy, Fedor; Yutovets, Yuriy; Khwaja, Kosar A; Deckelbaum, Dan L
2015-04-01
Risk assessment is a vital step in the disaster-preparedness continuum as it is the foundation of subsequent phases, including mitigation, response, and recovery. The objective was to develop a risk assessment tool geared specifically towards the Union of European Football Associations (UEFA) Euro 2012. In partnership with the Donetsk National Medical University, the Donetsk Research and Development Institute of Traumatology and Orthopedics, the Donetsk Regional Public Health Administration, and the Ministry of Emergency of Ukraine, a table-based tool was created which, based on historical evidence, identifies relevant potential threats, evaluates their impacts and likelihoods on graded scales based on previously available data, identifies potential mitigation shortcomings, and recommends further mitigation measures. This risk assessment tool was applied in the vulnerability assessment phase of UEFA Euro 2012. Twenty-three sub-types of potential hazards were identified and analyzed. Ten specific hazards were recognized as likely to very likely to occur, including natural disasters, bombing and blast events, road traffic collisions, and disorderly conduct. Preventative measures, such as increased stadium security and zero tolerance for impaired driving, were recommended. Mitigating factors were suggested, including clear, incident-specific preparedness plans and enhanced inter-agency communication. This hazard risk assessment tool is a simple aid in vulnerability assessment, essential for disaster preparedness and response, and may be applied broadly to future international events.
Simple Example of Backtest Overfitting (SEBO)
DOE Office of Scientific and Technical Information (OSTI.GOV)
In the field of mathematical finance, a "backtest" is the usage of historical market data to assess the performance of a proposed trading strategy. It is a relatively simple matter for a present-day computer system to explore thousands, millions or even billions of variations of a proposed strategy, and pick the best performing variant as the "optimal" strategy "in sample" (i.e., on the input dataset). Unfortunately, such an "optimal" strategy often performs very poorly "out of sample" (i.e., on another dataset), because the parameters of the investment strategy have been overfit to the in-sample data, a situation known as "backtest overfitting". While the mathematics of backtest overfitting has been examined in several recent theoretical studies, here we pursue a more tangible analysis of this problem, in the form of an online simulator tool. Given an input random walk time series, the tool develops an "optimal" variant of a simple strategy by exhaustively exploring all integer parameter values among a handful of parameters. That "optimal" strategy is overfit, since by definition a random walk is unpredictable. Then the tool tests the resulting "optimal" strategy on a second random walk time series. In most runs using our online tool, the "optimal" strategy derived from the first time series performs poorly on the second time series, demonstrating how hard it is not to overfit a backtest. We offer this online tool, "Simple Example of Backtest Overfitting (SEBO)", to facilitate further research in this area.
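The demonstration can be reproduced in miniature: exhaustively tune a moving-average crossover on one random walk, then test the winning parameters on a fresh walk. The strategy, parameter ranges and walk lengths below are arbitrary assumptions, not SEBO's actual settings.

```python
# Overfit a moving-average crossover to one random walk, then test it
# out of sample on a second, independent walk.
import random

def random_walk(n, seed):
    rng = random.Random(seed)
    prices, p = [], 100.0
    for _ in range(n):
        p += rng.gauss(0.0, 1.0)
        prices.append(p)
    return prices

def strategy_pnl(prices, fast, slow):
    """Long one unit whenever the fast moving average is above the slow."""
    pnl = 0.0
    for t in range(slow, len(prices) - 1):
        ma_fast = sum(prices[t - fast:t]) / fast
        ma_slow = sum(prices[t - slow:t]) / slow
        if ma_fast > ma_slow:
            pnl += prices[t + 1] - prices[t]
    return pnl

train = random_walk(500, seed=1)
test = random_walk(500, seed=2)

# Exhaustive in-sample search over all (fast, slow) parameter pairs.
best = max(((f, s) for f in range(2, 11) for s in range(15, 51, 5)),
           key=lambda fs: strategy_pnl(train, *fs))

print("best in-sample params:", best)
print("in-sample PnL:     %.1f" % strategy_pnl(train, *best))
print("out-of-sample PnL: %.1f" % strategy_pnl(test, *best))
```

Because the walks are unpredictable by construction, the in-sample profit is pure selection bias, and the out-of-sample result typically collapses.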
Suárez Álvarez, Óscar; Fernández-Feito, Ana; Vallina Crespo, Henar; Aldasoro Unamuno, Elena; Cofiño, Rafael
2018-05-11
It is essential to develop a comprehensive approach to institutionally promoted interventions in order to assess their impact on health from the perspective of the social determinants of health and equity. Simple, adapted tools must be developed to carry out these assessments. The aim of this paper is to present two tools to assess the impact of programmes and community-based interventions on the social determinants of health. The first tool is intended to assess health programmes through interviews and analysis of information provided by the assessment team. The second tool, applied through online assessment of community-based interventions, also generates a report on inequality issues that includes recommendations for improvement. In addition to reducing health-related social inequities, the implementation of these tools can also help to improve the efficiency of public health interventions. Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
Delora, Adam; Gonzales, Aaron; Medina, Christopher S; Mitchell, Adam; Mohed, Abdul Faheem; Jacobs, Russell E; Bearer, Elaine L
2016-01-15
Magnetic resonance imaging (MRI) is a well-developed technique in neuroscience. Limitations in applying MRI to rodent models of neuropsychiatric disorders include the large number of animals required to achieve statistical significance, and the paucity of automation tools for the critical early step in processing, brain extraction, which prepares brain images for alignment and voxel-wise statistics. This novel timesaving automation of template-based brain extraction ("skull-stripping") is capable of quickly and reliably extracting the brain from large numbers of whole head images in a single step. The method is simple to install and requires minimal user interaction. This method is equally applicable to different types of MR images. Results were evaluated with Dice and Jacquard similarity indices and compared in 3D surface projections with other stripping approaches. Statistical comparisons demonstrate that individual variation of brain volumes are preserved. A downloadable software package not otherwise available for extraction of brains from whole head images is included here. This software tool increases speed, can be used with an atlas or a template from within the dataset, and produces masks that need little further refinement. Our new automation can be applied to any MR dataset, since the starting point is a template mask generated specifically for that dataset. The method reliably and rapidly extracts brain images from whole head images, rendering them useable for subsequent analytical processing. This software tool will accelerate the exploitation of mouse models for the investigation of human brain disorders by MRI. Copyright © 2015 Elsevier B.V. All rights reserved.
Fischer, John P; Nelson, Jonas A; Shang, Eric K; Wink, Jason D; Wingate, Nicholas A; Woo, Edward Y; Jackson, Benjamin M; Kovach, Stephen J; Kanchwala, Suhail
2014-12-01
Groin wound complications after open vascular surgery procedures are common, morbid, and costly. The purpose of this study was to generate a simple, validated, clinically usable risk assessment tool for predicting groin wound morbidity after infra-inguinal vascular surgery. A retrospective review of consecutive patients undergoing groin cutdowns for femoral access between 2005 and 2011 was performed. Patients necessitating salvage flaps were compared with those who did not, and a stepwise logistic regression was performed and validated using a bootstrap technique. Utilising this analysis, a simplified risk score was developed to predict the risk of developing a wound that would necessitate salvage. A total of 925 patients were included in the study. The salvage flap rate was 11.2% (n = 104). Predictors determined by logistic regression included prior groin surgery (OR = 4.0, p < 0.001), prosthetic graft (OR = 2.7, p < 0.001), coronary artery disease (OR = 1.8, p = 0.019), peripheral arterial disease (OR = 5.0, p < 0.001), and obesity (OR = 1.7, p = 0.039). Based upon the respective logistic coefficients, a simplified scoring system was developed to enable preoperative risk stratification regarding the likelihood of a significant complication that would require a salvage muscle flap. The c-statistic for the regression demonstrated excellent discrimination at 0.89. This study presents a simple, internally validated risk assessment tool that accurately predicts wound morbidity requiring flap salvage in open groin vascular surgery patients. The preoperatively high-risk patient can be identified and selectively targeted as a candidate for a prophylactic muscle flap.
SpikingLab: modelling agents controlled by Spiking Neural Networks in Netlogo.
Jimenez-Romero, Cristian; Johnson, Jeffrey
2017-01-01
The scientific interest attracted by Spiking Neural Networks (SNN) has led to the development of tools for the simulation and study of neuronal dynamics, ranging from phenomenological models to the more sophisticated and biologically accurate Hodgkin-and-Huxley-based and multi-compartmental models. However, despite the multiple features offered by neural modelling tools, their integration with environments for the simulation of robots and agents can be challenging and time-consuming. The implementation of artificial neural circuits to control robots generally involves the following tasks: (1) understanding the simulation tools, (2) creating the neural circuit in the neural simulator, (3) linking the simulated neural circuit with the environment of the agent and (4) programming the appropriate interface in the robot or agent to use the neural controller. The accomplishment of the above-mentioned tasks can be challenging, especially for undergraduate students or novice researchers. This paper presents an alternative tool which facilitates the simulation of simple SNN circuits using the multi-agent simulation and programming environment Netlogo (educational software that simplifies the study of and experimentation with complex systems). The engine proposed and implemented in Netlogo for the simulation of a functional model of SNN is a simplification of integrate-and-fire (I&F) models. The characteristics of the engine (including neuronal dynamics, STDP learning and synaptic delay) are demonstrated through the implementation of an agent representing an artificial insect controlled by a simple neural circuit. The setup of the experiment and its outcomes are described in this work.
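The engine itself is implemented in Netlogo, but its core dynamics follow the standard leaky integrate-and-fire scheme; a minimal Python illustration (not the SpikingLab code, with arbitrary parameter values) is:

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0,
               v_reset=0.0, v_thresh=1.0):
    """Leaky integrate-and-fire: returns the membrane trace and spike times."""
    v = v_rest
    trace, spikes = [], []
    for step, i_t in enumerate(input_current):
        v += dt * (-(v - v_rest) + i_t) / tau   # Euler step of the leaky integrator
        if v >= v_thresh:                       # threshold crossing -> spike
            spikes.append(step * dt)
            v = v_reset                         # reset after firing
        trace.append(v)
    return np.array(trace), spikes

trace, spikes = lif_neuron(np.full(200, 1.5))   # constant drive above threshold
```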
NASA Astrophysics Data System (ADS)
Génot, V.; André, N.; Cecconi, B.; Bouchemit, M.; Budnik, E.; Bourrel, N.; Gangloff, M.; Dufourg, N.; Hess, S.; Modolo, R.; Renard, B.; Lormant, N.; Beigbeder, L.; Popescu, D.; Toniutti, J.-P.
2014-11-01
The interest in data communication between analysis tools in planetary sciences and space physics is illustrated in this paper via several examples of the uses of SAMP. The Simple Application Messaging Protocol was developed in the frame of the IVOA from an earlier protocol called PLASTIC. SAMP enables easy communication and interoperability between astronomy software, stand-alone and web-based; it is now increasingly adopted by the planetary sciences and space physics community. Its attractiveness is based, on one hand, on the use of common file formats for exchange and, on the other hand, on established messaging models. Examples of uses at the CDPP and elsewhere are presented. The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of data products from space missions and ground observatories. Besides these activities, the CDPP developed services like AMDA (Automated Multi Dataset Analysis, http://amda.cdpp.eu/), which enables in-depth analysis of large amounts of data through dedicated functionalities such as visualization, conditional search and cataloging. Besides AMDA, the 3DView (http://3dview.cdpp.eu/) tool provides immersive visualizations and is being further developed to include simulation and observational data. These tools and their interactions with each other, notably via SAMP, are presented via science cases of interest to the planetary sciences and space physics communities.
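As an illustration of this interoperability, the snippet below broadcasts a VOTable over SAMP using the astropy.samp client. It assumes a SAMP hub is already running (e.g. one started by TOPCAT), and the file URL is hypothetical:

```python
from astropy.samp import SAMPIntegratedClient

client = SAMPIntegratedClient(name="samp-demo")
client.connect()   # requires a running SAMP hub

# Broadcast a VOTable so that any subscribed tool (AMDA, 3DView,
# TOPCAT, ...) can load it.
client.notify_all({
    "samp.mtype": "table.load.votable",
    "samp.params": {
        "url": "file:///tmp/timeseries.vot",   # hypothetical local file
        "name": "demo time series",
    },
})
client.disconnect()
```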
The Multisensory Attentional Consequences of Tool Use: A Functional Magnetic Resonance Imaging Study
Holmes, Nicholas P.; Spence, Charles; Hansen, Peter C.; Mackay, Clare E.; Calvert, Gemma A.
2008-01-01
Background Tool use in humans requires that multisensory information is integrated across different locations, from objects seen to be distant from the hand, but felt indirectly at the hand via the tool. We tested the hypothesis that using a simple tool to perceive vibrotactile stimuli results in the enhanced processing of visual stimuli presented at the distal, functional part of the tool. Such a finding would be consistent with a shift of spatial attention to the location where the tool is used. Methodology/Principal Findings We tested this hypothesis by scanning healthy human participants' brains using functional magnetic resonance imaging, while they used a simple tool to discriminate between target vibrations, accompanied by congruent or incongruent visual distractors, on the same or opposite side to the tool. The attentional hypothesis was supported: BOLD response in occipital cortex, particularly in the right hemisphere lingual gyrus, varied significantly as a function of tool position, increasing contralaterally, and decreasing ipsilaterally to the tool. Furthermore, these modulations occurred despite the fact that participants were repeatedly instructed to ignore the visual stimuli, to respond only to the vibrotactile stimuli, and to maintain visual fixation centrally. In addition, the magnitude of multisensory (visual-vibrotactile) interactions in participants' behavioural responses significantly predicted the BOLD response in occipital cortical areas that were also modulated as a function of both visual stimulus position and tool position. Conclusions/Significance These results show that using a simple tool to locate and to perceive vibrotactile stimuli is accompanied by a shift of spatial attention to the location where the functional part of the tool is used, resulting in enhanced processing of visual stimuli at that location, and decreased processing at other locations. This was most clearly observed in the right hemisphere lingual gyrus. Such modulations of visual processing may reflect the functional importance of visuospatial information during human tool use. PMID:18958150
Exploratory Causal Analysis in Bivariate Time Series Data
NASA Astrophysics Data System (ADS)
McCracken, James M.
Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data sets, but little research exists on how these tools compare to each other in practice. This work introduces and defines exploratory causal analysis (ECA) to address this issue, along with the concept of data causality in the taxonomy of causal studies introduced in this work. The motivation is to provide a framework for exploring potential causal structures in time series data sets. ECA is used on several synthetic and empirical data sets, and it is found that all of the tested time series causality tools agree with each other (and intuitive notions of causality) for many simple systems but can provide conflicting causal inferences for more complicated systems. It is proposed that such disagreements between different time series causality tools during ECA might provide deeper insight into the data than could be found otherwise.
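PAI and causal leaning are specific to this thesis, but one of the baseline tools it surveys, Granger causality, is available off the shelf; a minimal sketch with statsmodels on synthetic data in which x drives y is:

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):                    # y is driven by lagged x plus noise
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()

# Tests whether the second column (x) Granger-causes the first (y).
data = np.column_stack([y, x])
results = grangercausalitytests(data, maxlag=2)
```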
Simple Climate Model Evaluation Using Impulse Response Tests
NASA Astrophysics Data System (ADS)
Schwarber, A.; Hartin, C.; Smith, S. J.
2017-12-01
Simple climate models (SCMs) are central tools used to incorporate climate responses into human-Earth system modeling. SCMs are computationally inexpensive, making them an ideal tool for a variety of analyses, including consideration of uncertainty. Despite their wide use, many SCMs lack rigorous testing of their fundamental responses to perturbations. Here, following recommendations of a recent National Academy of Sciences report, we compare several SCMs (Hector-deoclim, MAGICC 5.3, MAGICC 6.0, and the IPCC AR5 impulse response function) to diagnose model behavior and understand the fundamental system responses within each model. We conduct stylized perturbations (emissions and forcing/concentration) of three different chemical species: CO2, CH4, and BC. We find that all four models respond similarly in terms of overall shape; however, there are important differences in the timing and magnitude of the responses. For example, the response to a BC pulse differs over the first 20 years after the pulse among the models, a finding that is due to differences in model structure. Such perturbation experiments are difficult to conduct in complex models due to internal model noise, making a direct comparison with simple models challenging. We can, however, compare the simplified model response from a 4xCO2 step experiment to the same stylized experiment carried out by CMIP5 models, thereby testing the ability of SCMs to emulate complex model results. This work allows an assessment of how well current understanding of Earth system responses is incorporated into multi-model frameworks by way of simple climate models.
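Of the four models compared, the IPCC AR5 impulse response function is simple enough to state directly: the airborne fraction of a CO2 pulse is a constant plus three decaying exponentials. A sketch using the fit coefficients as published in AR5 (Joos et al., 2013) is:

```python
import numpy as np

A   = [0.2173, 0.2240, 0.2824, 0.2763]   # AR5 published fit coefficients
TAU = [394.4, 36.54, 4.304]              # e-folding times in years

def airborne_fraction(t):
    """Fraction of a CO2 pulse remaining in the atmosphere after t years."""
    t = np.asarray(t, dtype=float)
    return A[0] + sum(a * np.exp(-t / tau) for a, tau in zip(A[1:], TAU))

print(airborne_fraction([0, 20, 100]))   # -> [1.0, ~0.60, ~0.41]
```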
NASA Astrophysics Data System (ADS)
Valentin, M. M.; Hay, L.; Van Beusekom, A. E.; Viger, R. J.; Hogue, T. S.
2016-12-01
Forecasting the hydrologic response to climate change in Alaska's glaciated watersheds remains daunting for hydrologists due to sparse field data and few modeling tools, which frustrates efforts to manage and protect critical aquatic habitat. Approximately 20% of the 64,000 square kilometer Copper River watershed is glaciated, and its glacier-fed tributaries support renowned salmon fisheries that are economically, culturally, and nutritionally invaluable to the local communities. This study adapts a simple, yet powerful, conceptual hydrologic model to simulate changes in the timing and volume of streamflow in the Copper River, Alaska as glaciers change under plausible future climate scenarios. The USGS monthly water balance model (MWBM), a hydrologic tool used for two decades to evaluate a broad range of hydrologic questions in the contiguous U.S., was enhanced to include glacier melt simulations and remotely sensed data. In this presentation we summarize the technical details behind our MWBM adaptation and demonstrate its use in the Copper River Basin to evaluate glacier and streamflow responses to climate change.
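The abstract does not detail the glacier-melt enhancement itself; a common choice at monthly resolution is a temperature-index (degree-day) scheme, sketched below with a hypothetical melt factor (an assumption for illustration, not the MWBM formulation):

```python
def monthly_glacier_melt(mean_temp_c, days_in_month, ddf=4.5):
    """Degree-day melt estimate in mm of water over a month; ddf is a
    hypothetical degree-day factor in mm per (degC * day)."""
    return max(mean_temp_c, 0.0) * days_in_month * ddf
```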
ERIC Educational Resources Information Center
Plummer, Donna; Kuhlman, Wilma
2005-01-01
To introduce students to rocks and their characteristics, teachers can begin rock units with the activities described in this article. Students need the ability to make simple observations using their senses and simple tools.
Future Automotive Systems Technology Simulator (FASTSim)
DOE Office of Scientific and Technical Information (OSTI.GOV)
An advanced vehicle powertrain systems analysis tool, the Future Automotive Systems Technology Simulator (FASTSim) provides a simple way to compare powertrains and estimate the impact of technology improvements on light-, medium- and heavy-duty vehicle efficiency, performance, cost, and battery life. Created by the National Renewable Energy Laboratory, FASTSim accommodates a range of vehicle types - including conventional vehicles, electric-drive vehicles, and fuel cell vehicles - and is available for free download in Microsoft Excel and Python formats.
Improved model for the angular dependence of excimer laser ablation rates in polymer materials
NASA Astrophysics Data System (ADS)
Pedder, J. E. A.; Holmes, A. S.; Dyer, P. E.
2009-10-01
Measurements of the angle-dependent ablation rates of polymers that have applications in microdevice fabrication are reported. A simple model based on Beer's law, including plume absorption, is shown to give good agreement with the experimental findings for polycarbonate and SU8, ablated using the 193 and 248 nm excimer lasers, respectively. The modeling forms a useful tool for designing masks needed to fabricate complex surface relief by ablation.
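The underlying Beer's-law etch model is standard: depth per pulse grows logarithmically with fluence above a threshold, and oblique incidence reduces the effective fluence by cos(theta). A sketch with illustrative parameter values (not the paper's fitted ones, and omitting the plume-absorption term) is:

```python
import numpy as np

def etch_depth(fluence, theta_deg, alpha=1.0e5, f_th=40.0):
    """Etch depth per pulse in cm: d = (1/alpha) * ln(F_eff / F_th).
    alpha (1/cm) and f_th (mJ/cm^2) are illustrative values only."""
    f_eff = fluence * np.cos(np.radians(theta_deg))      # oblique incidence
    return np.where(f_eff > f_th, np.log(f_eff / f_th) / alpha, 0.0)
```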
Scaling up digital circuit computation with DNA strand displacement cascades.
Qian, Lulu; Winfree, Erik
2011-06-03
To construct sophisticated biochemical circuits from scratch, one needs to understand how simple the building blocks can be and how robustly such circuits can scale up. Using a simple DNA reaction mechanism based on a reversible strand displacement process, we experimentally demonstrated several digital logic circuits, culminating in a four-bit square-root circuit that comprises 130 DNA strands. These multilayer circuits include thresholding and catalysis within every logical operation to perform digital signal restoration, which enables fast and reliable function in large circuits with roughly constant switching time and linear signal propagation delays. The design naturally incorporates other crucial elements for large-scale circuitry, such as general debugging tools, parallel circuit preparation, and an abstraction hierarchy supported by an automated circuit compiler.
Bioluminescent bioreporter pad biosensor for monitoring water toxicity.
Axelrod, Tim; Eltzov, Evgeni; Marks, Robert S
2016-01-01
Toxicants in water sources are of concern. We developed a tool that is affordable and easy to use for monitoring toxicity in water. It is a biosensor composed of disposable bioreporter pads (a calcium alginate matrix with immobilized bacteria) and a non-disposable CMOS photodetector. Various parameters to enhance the sensor's signal have been tested, including the effect of alginate and bacterium concentrations. The effects of various toxicants, as well as environmental samples, were tested by evaluating their effect on bacterial luminescence. This is the first step in the creation of a sensitive and simple operative tool that may be used in different environments. Copyright © 2015 Elsevier B.V. All rights reserved.
Solar Observations as Educational Tools (P8)
NASA Astrophysics Data System (ADS)
Shylaja, B. S.
2006-11-01
Solar observations are very handy tools to expose students to the joy of research. In this presentation I briefly discuss the various experiments already done here with a small 6" Coude refractor. These include simple experiments like eclipse observations, rotation measurements, variation in the angular size of the sun through the year as well as sunspot size variations, Doppler measurements, identification of elements from the solar spectrum (from a published high-resolution spectrum), limb darkening measurements, and deriving the curve of growth (from published data). I also describe the theoretical implications of the experiments and future plans to develop this as a platform for motivating students towards a career in basic science research.
Intuitive Tools for the Design and Analysis of Communication Payloads for Satellites
NASA Technical Reports Server (NTRS)
Culver, Michael R.; Soong, Christine; Warner, Joseph D.
2014-01-01
In an effort to make future communications satellite payload design more efficient and accessible, two tools were created with intuitive graphical user interfaces (GUIs). The first tool allows payload designers to graphically design their payload by using simple drag and drop of payload components onto a design area within the program. Information about each picked component is pulled from a database of common space-qualified communication components sold by commercial companies. Once a design is completed, various reports can be generated, such as the Master Equipment List. The second tool is a link budget calculator designed specifically for ease of use. Other features of this tool include being able to access a database of NASA ground based apertures for near Earth and Deep Space communication, the Tracking and Data Relay Satellite System (TDRSS) base apertures, and information about the solar system relevant to link budget calculations. The link budget tool allows for over 50 different combinations of user inputs, eliminating the need for multiple spreadsheets and the user errors associated with using them. Both of the aforementioned tools increase the productivity of space communication systems designers, and are approachable enough to allow non-communication experts to design preliminary communication payloads.
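The core of any such link budget calculator is a handful of decibel sums; a minimal sketch (illustrative only, not the NASA tool's implementation) is:

```python
import math

def link_budget(eirp_dbw, rx_gain_dbi, distance_km, freq_ghz,
                misc_losses_db=2.0, noise_temp_k=290.0):
    """Received power (dBW) and C/N0 (dBHz) for a simple downlink."""
    # Free-space path loss in dB for distance in km and frequency in GHz.
    fspl_db = 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45
    prx_dbw = eirp_dbw + rx_gain_dbi - fspl_db - misc_losses_db
    # C/N0 = Prx - 10*log10(k*T), with Boltzmann's constant k = 1.38e-23 J/K.
    cn0_dbhz = prx_dbw - 10 * math.log10(1.38e-23 * noise_temp_k)
    return prx_dbw, cn0_dbhz

# Example: GEO satellite (~35786 km), Ku-band downlink at 12 GHz.
print(link_budget(50.0, 40.0, 35786.0, 12.0))   # ~(-117 dBW, ~87 dBHz)
```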
A glossary of Karst terminology
Monroe, Watson Hiner
1970-01-01
This glossary includes most terms used in describing karst geomorphologic features and processes. The terms are primarily those used in the literature of English-speaking countries, but a few of the more common terms in French, German, and Spanish are included, with references to the corresponding English terms where they are available. The glossary also includes simple definitions of the more common rocks and minerals found in karst terrain, common terms of hydrology, and a number of the descriptive terms used by speleologists. The glossary does not include definitions of most biospeleological terms, geologic structure terms, varieties of carbonate rock that require microscopic techniques for identification, or names describing tools and techniques of cave exploration.
Charles, Patrick G P; Wolfe, Rory; Whitby, Michael; Fine, Michael J; Fuller, Andrew J; Stirling, Robert; Wright, Alistair A; Ramirez, Julio A; Christiansen, Keryn J; Waterer, Grant W; Pierce, Robert J; Armstrong, John G; Korman, Tony M; Holmes, Peter; Obrosky, D Scott; Peyrani, Paula; Johnson, Barbara; Hooy, Michelle; Grayson, M Lindsay
2008-08-01
Existing severity assessment tools, such as the pneumonia severity index (PSI) and CURB-65 (tool based on confusion, urea level, respiratory rate, blood pressure, and age ≥65 years), predict 30-day mortality in community-acquired pneumonia (CAP) and have limited ability to predict which patients will require intensive respiratory or vasopressor support (IRVS). The Australian CAP Study (ACAPS) was a prospective study of 882 episodes in which each patient had a detailed assessment of severity features, etiology, and treatment outcomes. Multivariate logistic regression was performed to identify features at initial assessment that were associated with receipt of IRVS. These results were converted into a simple points-based severity tool that was validated in 5 external databases, totaling 7464 patients. In ACAPS, 10.3% of patients received IRVS, and the 30-day mortality rate was 5.7%. The features statistically significantly associated with receipt of IRVS were low systolic blood pressure (2 points), multilobar chest radiography involvement (1 point), low albumin level (1 point), high respiratory rate (1 point), tachycardia (1 point), confusion (1 point), poor oxygenation (2 points), and low arterial pH (2 points): SMART-COP. A SMART-COP score of ≥3 points identified 92% of patients who received IRVS, including 84% of patients who did not need immediate admission to the intensive care unit. Accuracy was also high in the 5 validation databases. Sensitivities of PSI and CURB-65 for identifying the need for IRVS were 74% and 39%, respectively. SMART-COP is a simple, practical clinical tool for accurately predicting the need for IRVS that is likely to assist clinicians in determining CAP severity.
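Since the point weights are given in the abstract, the score itself is a simple sum; the sketch below encodes them (the clinical cut-offs defining each finding, e.g. what counts as low systolic blood pressure, are specified in the paper and not reproduced here):

```python
SMART_COP_POINTS = {
    "low_systolic_bp":       2,   # S
    "multilobar_cxr":        1,   # M
    "low_albumin":           1,   # A
    "high_respiratory_rate": 1,   # R
    "tachycardia":           1,   # T
    "confusion":             1,   # C
    "poor_oxygenation":      2,   # O
    "low_arterial_ph":       2,   # P
}

def smart_cop(findings):
    """findings: dict mapping the keys above to booleans."""
    score = sum(p for k, p in SMART_COP_POINTS.items() if findings.get(k))
    return score, score >= 3     # a score >= 3 flags likely need for IRVS
```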
Cherkaoui, Imad; Sabouni, Radia; Ghali, Iraqi; Kizub, Darya; Billioux, Alexander C; Bennani, Kenza; Bourkadi, Jamal Eddine; Benmamoun, Abderrahmane; Lahlou, Ouafae; Aouad, Rajae El; Dooley, Kelly E
2014-01-01
Setting: Public tuberculosis (TB) clinics in urban Morocco. Objectives: To explore risk factors for TB treatment default and develop a prediction tool, and to assess consequences of default, specifically the risk of transmission or development of drug resistance. Design: Case-control study comparing patients who defaulted from TB treatment with patients who completed it, using quantitative methods and open-ended questions. Results were interpreted in light of health professionals' perspectives from a parallel study. A predictive model and simple tool to identify patients at high risk of default were developed. Sputum from cases with pulmonary TB was collected for smear and drug susceptibility testing. Results: 91 cases and 186 controls enrolled. Independent risk factors for default included current smoking, retreatment, work interference with adherence, daily directly observed therapy, side effects, quick symptom resolution, and not knowing one's treatment duration. Age >50 years, never smoking, and having friends who knew one's diagnosis were protective. A simple scoring tool incorporating these factors was 82.4% sensitive and 87.6% specific for predicting default in this population. Clinicians and patients described additional contributors to default and suggested locally-relevant intervention targets. Among 89 cases with pulmonary TB, 71% had sputum that was smear positive for TB. Drug resistance was rare. Conclusions: The causes of default from TB treatment were explored through synthesis of qualitative and quantitative data from patients and health professionals. A scoring tool with high sensitivity and specificity to predict default was developed. Prospective evaluation of this tool coupled with targeted interventions based on our findings is warranted. Of note, the risk of TB transmission to others from patients who default treatment is likely to be high. The commonly-feared risk of drug resistance, though, may be low; a larger study is required to confirm these findings.
NASA Astrophysics Data System (ADS)
Mukherji, Sutapa
2018-03-01
In this paper, we study a one-dimensional totally asymmetric simple exclusion process with position-dependent hopping rates. Under open boundary conditions, this system exhibits boundary-induced phase transitions in the steady state. Similarly to totally asymmetric simple exclusion processes with uniform hopping, the phase diagram consists of low-density, high-density, and maximal-current phases. In various phases, the shape of the average particle density profile across the lattice including its boundary-layer parts changes significantly. Using the tools of boundary-layer analysis, we obtain explicit solutions for the density profile in different phases. A detailed analysis of these solutions under different boundary conditions helps us obtain the equations for various phase boundaries. Next, we show how the shape of the entire density profile including the location of the boundary layers can be predicted from the fixed points of the differential equation describing the boundary layers. We discuss this in detail through several examples of density profiles in various phases. The maximal-current phase appears to be an especially interesting phase where the boundary layer flows to a bifurcation point on the fixed-point diagram.
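For readers who want to reproduce qualitative profiles like these, an open-boundary TASEP with position-dependent rates is easy to simulate directly; a minimal Monte Carlo sketch with an arbitrary illustrative rate profile (not the paper's) is:

```python
import numpy as np

def tasep_profile(L=100, alpha=0.6, beta=0.6, sweeps=20000, seed=1):
    """Random-sequential open-boundary TASEP with position-dependent
    bulk hopping rates; returns the time-averaged density profile."""
    rng = np.random.default_rng(seed)
    p = 0.5 + 0.4 * np.sin(np.pi * np.arange(L) / L)   # hypothetical rate profile
    occ = np.zeros(L, dtype=int)
    density, count = np.zeros(L), 0
    for sweep in range(sweeps):
        for _ in range(L + 1):
            i = int(rng.integers(-1, L))
            if i == -1:                            # injection at the left boundary
                if occ[0] == 0 and rng.random() < alpha:
                    occ[0] = 1
            elif i == L - 1:                       # extraction at the right boundary
                if occ[-1] == 1 and rng.random() < beta:
                    occ[-1] = 0
            elif occ[i] == 1 and occ[i + 1] == 0 and rng.random() < p[i]:
                occ[i], occ[i + 1] = 0, 1          # bulk hop with local rate p[i]
        if sweep >= sweeps // 2:                   # discard the first half as burn-in
            density += occ
            count += 1
    return density / count
```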
Screening and syndromic approaches to identify gonorrhea and chlamydial infection among women.
Sloan, N L; Winikoff, B; Haberland, N; Coggins, C; Elias, C
2000-03-01
The standard diagnostic tools to identify sexually transmitted infections are often expensive and have laboratory and infrastructure requirements that make them unavailable to family planning and primary health-care clinics in developing countries. Therefore, inexpensive, accessible tools that rely on symptoms, signs, and/or risk factors have been developed to identify and treat reproductive tract infections without the need for laboratory diagnostics. Studies were reviewed that used standard diagnostic tests to identify gonorrhea and cervical chlamydial infection among women and that provided adequate information about the usefulness of the tools for screening. Aggregation of the studies' results suggests that risk factors, algorithms, and risk scoring for syndromic management are poor indicators of gonorrhea and chlamydial infection in samples of both low and high prevalence and, consequently, are not effective mechanisms with which to identify or manage these conditions. The development and evaluation of other approaches to identify gonorrhea and chlamydial infections, including inexpensive and simple laboratory screening tools, periodic universal treatment, and other alternatives, must be given priority.
Moscow Test Well, INEL Oversight Program: Aqueous geochemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCurry, M.; Fromm, J.; Welhan, J.
1992-09-29
This report presents a summary and interpretation of data gathered during sampling of the Moscow Test Well at Moscow, Idaho during April and May of 1992. The principal objectives of this chemical survey were to validate sampling procedures with a new straddle packer sampling tool in a previously hydrologically well-characterized and simple sampling environment, and to compare analytical results from two independent labs for reproducibility. Analytes included a wide range of metals, anions, nutrients, BNAs, and VOCs. Secondary objectives included analyzing waters from a large distilled water tank (utilized for all field laboratory purposes as "pure" stock water), water which passed through a steamer used to clean the packer, and rinsates from the packer tool itself before it was lowered into the test well. Analyses were also obtained of blanks and spikes for data validation purposes.
Undergraduate Education with the WIYN 0.9-m Telescope
NASA Astrophysics Data System (ADS)
Pilachowski, Catherine A.
2017-01-01
Several models have been explored at Indiana University Bloomington for undergraduate student engagement in astronomy using the WIYN 0.9-m telescope at Kitt Peak. These models include individual student research projects using the telescope, student observations as part of an observational techniques course for majors, and enrichment activities for non-science majors in general education courses. Where possible, we arrange for students to travel to the telescope. More often, we are able to use simple online tools such as Skype and VNC viewers to give students an authentic observing experience. Experiences with the telescope motivate students to learn basic content in astronomy, including the celestial sphere, the electromagnetic spectrum, telescopes and detectors, the variety of astronomical objects, data reduction processes, image analysis, and color image creation and appreciation. The WIYN 0.9-m telescope is an essential tool for our program at all levels of undergraduate education.
A Data-Based Console Logger for Mission Operations Team Coordination
NASA Technical Reports Server (NTRS)
Thronesbery, Carroll; Malin, Jane T.; Jenks, Kenneth; Overland, David; Oliver, Patrick; Zhang, Jiajie; Gong, Yang; Zhang, Tao
2005-01-01
Concepts and prototypes [1,2] are discussed for a data-based console logger (D-Logger) to meet new challenges for coordination among flight controllers arising from new exploration mission concepts. The challenges include communication delays, increased crew autonomy, multiple concurrent missions, reduced-size flight support teams that include multidisciplinary flight controllers during quiescent periods, and migrating some flight support activities to flight controller offices. A spiral development approach has been adopted, making simple, but useful functions available early and adding more extensive support later. Evaluations have guided the development of the D-Logger from the beginning and continue to provide valuable user influence about upcoming requirements. D-Logger is part of a suite of tools designed to support future operations personnel and crew. While these tools can be used independently, when used together, they provide yet another level of support by interacting with one another. Recommendations are offered for the development of similar projects.
A Simple Evacuation Modeling and Simulation Tool for First Responders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, Daniel B; Payne, Patricia W
2015-01-01
Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.
A simple and inexpensive external fixator.
Noor, M A
1988-11-01
A simple and inexpensive external fixator has been designed. It is constructed of galvanized iron pipe and mild steel bolts and nuts. It can easily be manufactured in a hospital workshop with a minimum of tools.
ERIC Educational Resources Information Center
Sigford, Ann; Nelson, Nancy
1998-01-01
Presents a program for elementary teachers to learn how to use hand tools and household appliances to teach the principles of physics. The lesson helps teachers become familiar with simple hand tools, combat the apprehension of mechanical devices, and develop an interest in tools and technology. Session involves disassembling appliances to…
Development and Validation of the Texas Best Management Practice Evaluation Tool (TBET)
USDA-ARS?s Scientific Manuscript database
Conservation planners need simple yet accurate tools to predict sediment and nutrient losses from agricultural fields to guide conservation practice implementation and increase cost-effectiveness. The Texas Best management practice Evaluation Tool (TBET), which serves as an input/output interpreter...
Utility of Mobile phones to support In-situ data collection for Land Cover Mapping
NASA Astrophysics Data System (ADS)
Oduor, P.; Omondi, S.; Wahome, A.; Mugo, R. M.; Flores, A.
2017-12-01
The compelling need for better landscape-monitoring tools to support decision making calls for approaches that are sophisticated yet simple, tapping the potential of non-specialist contributors while responding to the complexity of the information required. SERVIR Eastern and Southern Africa has developed a mobile app that can be used with little or no prior knowledge to collect spatial information on land cover. Because the tool is very simple to use, this in-situ data can be collected by large numbers of contributors and fed into classification algorithms that can then be used to map our ever-changing landscape. The LULC Mapper is a subset of the JiMap system and pulls Google Earth imagery and OpenStreetMap data to help users familiarize themselves with their location. It uses the phone's GPS and network information to map location coordinates, and presents sample pictures showing how to categorize the landscape. The system works offline; when users gain internet access, they can push the information into an Amazon database as bulk data. The location details, including geotagged photos, allow the data to be used to develop a range of spatial information, including land cover data. The app is currently available in the Google Play Store and will soon be uploaded to the App Store for use by a wider community. We foresee great potential in this tool for making data collection cheaper and more affordable by taking advantage of advances in phone technology, and we envisage a data collection campaign in which the tool is used for crowdsourcing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plimpton, Steve; Jones, Matt; Crozier, Paul
2006-01-01
Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3D, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.
The design of an intelligent human-computer interface for the test, control and monitor system
NASA Technical Reports Server (NTRS)
Shoaff, William D.
1988-01-01
The graphical intelligence and assistance capabilities of a human-computer interface for the Test, Control, and Monitor System at Kennedy Space Center are explored. The report focuses on how a particular commercial off-the-shelf graphical software package, Data Views, can be used to produce tools that build widgets such as menus, text panels, graphs, icons, windows, and ultimately complete interfaces for monitoring data from an application; controlling an application by providing input data to it; and testing an application by both monitoring and controlling it. A complete set of tools for building interfaces is described in a manual for the TCMS toolkit. Simple tools create primitive widgets such as lines, rectangles and text strings. Intermediate level tools create pictographs from primitive widgets, and connect processes to either text strings or pictographs. Other tools create input objects; Data Views supports output objects directly, thus output objects are not considered. Finally, a set of utilities for executing, monitoring use, editing, and displaying the content of interfaces is included in the toolkit.
Survey Of Wind Tunnels At Langley Research Center
NASA Technical Reports Server (NTRS)
Bower, Robert E.
1989-01-01
Report presented at AIAA 14th Aerodynamic Testing Conference on current capabilities and planned improvements at NASA Langley Research Center's major wind tunnels. Focuses on 14 major tunnels, 8 unique in world, 3 unique in country. Covers Langley Spin Tunnel. Includes new National Transonic Facility (NTF). Also surveys Langley Unitary Plan Wind Tunnel (UPWT). Addresses resurgence of inexpensive simple-to-operate research tunnels. Predicts no shortage of tools for aerospace researcher and engineer in next decade or two.
Navigating freely-available software tools for metabolomics analysis.
Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph
2017-01-01
The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. The objective of this review is to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. Tools were selected for inclusion if they had ≥50 citations on Web of Science (as of 08/09/16) or if their use was reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.
Benchmarking of Decision-Support Tools Used for Tiered Sustainable Remediation Appraisal.
Smith, Jonathan W N; Kerrison, Gavin
2013-01-01
Sustainable remediation comprises soil and groundwater risk-management actions that are selected, designed, and operated to maximize net environmental, social, and economic benefit (while assuring protection of human health and safety). This paper describes a benchmarking exercise to comparatively assess potential differences in environmental management decision making resulting from application of different sustainability appraisal tools ranging from simple (qualitative) to more quantitative (multi-criteria and fully monetized cost-benefit analysis), as outlined in the SuRF-UK framework. The appraisal tools were used to rank remedial options for risk management of a subsurface petroleum release that occurred at a petrol filling station in central England. The remediation options were benchmarked using a consistent set of soil and groundwater data for each tier of sustainability appraisal. The ranking of remedial options was very similar in all three tiers, and an environmental management decision to select the most sustainable options at tier 1 would have been the same decision at tiers 2 and 3. The exercise showed that, for relatively simple remediation projects, a simple sustainability appraisal led to the same remediation option selection as more complex appraisal, and can be used to reliably inform environmental management decisions on other relatively simple land contamination projects.
[Validation of a nutritional screening tool for hospitalized pediatric patients].
Lama More, R A; Moráis López, A; Herrero Álvarez, M; Caraballo Chicano, S; Galera Martínez, R; López Ruzafa, E; Rodríguez Martínez, G; de la Mano Hernández, A; Rivero de la Rosa, M C
2012-01-01
Malnutrition among hospitalized patients has clinical implications, and interest has arisen in finding screening tools able to identify subjects at risk. At present, there is no consensus about the most suitable nutrition screening tool for pediatric patients. To validate the STAMP (Screening Tool for the Assessment of Malnutrition in Pediatrics) pediatric screening tool in Spain, a descriptive cross-sectional study was performed of patients admitted to a third-level children's hospital with both medical and surgical specialities. During the first 24 hours of admission, the STAMP screening tool was applied. For its validation, results were compared with those obtained from a nutritional assessment performed by specialist staff, which included clinical, anthropometric and body composition data. A sample of 250 children was studied. Nutritional assessment identified 64 patients (25.6%) at risk, 40 of whom were malnourished (16%). STAMP classified 48.4% of the patients as being at nutritional risk. This tool showed 75% sensitivity and 60.8% specificity when identifying patients at risk according to nutritional assessment. It showed 90% sensitivity and 59.5% specificity when identifying malnourished patients. Malnutrition was less frequent than that reported in other European countries, although the diagnostic technique was different. STAMP is a simple and useful tool for nutritional screening, avoiding the need to assess all patients on admission in order to identify those at nutritional risk.
Twelve essential tools for living the life of whole person health care.
Schlitz, Marilyn; Valentina, Elizabeth
2013-01-01
The integration of body, mind, and spirit has become a key dimension of health education and disease prevention and treatment; however, our health care system remains primarily disease centered. Finding simple steps to help each of us find our own balance can improve our lives, our work, and our relationships. On the basis of interviews with health care experts at the leading edge of the new model of medicine, this article identifies simple tools to improve the health of patients and caregivers.
POVME 2.0: An Enhanced Tool for Determining Pocket Shape and Volume Characteristics
2015-01-01
Analysis of macromolecular/small-molecule binding pockets can provide important insights into molecular recognition and receptor dynamics. Since its release in 2011, the POVME (POcket Volume MEasurer) algorithm has been widely adopted as a simple-to-use tool for measuring and characterizing pocket volumes and shapes. We here present POVME 2.0, which is an order of magnitude faster, has improved accuracy, includes a graphical user interface, and can produce volumetric density maps for improved pocket analysis. To demonstrate the utility of the algorithm, we use it to analyze the binding pocket of RNA editing ligase 1 from the unicellular parasite Trypanosoma brucei, the etiological agent of African sleeping sickness. The POVME analysis characterizes the full dynamics of a potentially druggable transient binding pocket and so may guide future antitrypanosomal drug-discovery efforts. We are hopeful that this new version will be a useful tool for the computational- and medicinal-chemist community. PMID:25400521
R-based Tool for a Pairwise Structure-activity Relationship Analysis.
Klimenko, Kyrylo
2018-04-01
Structure-Activity Relationship (SAR) analysis is a complex process that can be enhanced by computational techniques. This article describes a simple tool for SAR analysis that has a graphical user interface and a flexible approach towards the input of molecular data. The application allows calculating molecular similarity, represented by the Tanimoto index and Euclidean distance, as well as determining activity cliffs by means of the Structure-Activity Landscape Index (SALI). The calculation is performed in a pairwise manner, either for the reference compound and other compounds or for all possible pairs in the data set. The results of SAR analysis are visualized using two types of plot. The application's capability is demonstrated by the analysis of a set of COX2 inhibitors with respect to Isoxicam. This tool is available online; it includes a manual and input file examples. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
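Both quantities the tool computes have simple closed forms; a sketch (not the R implementation) on fingerprints stored as sets of "on" bits is:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto index between two fingerprints given as sets of on-bits."""
    shared = len(fp_a & fp_b)
    return shared / (len(fp_a) + len(fp_b) - shared)

def sali(act_a, act_b, similarity):
    """Structure-Activity Landscape Index; activity cliffs give large values."""
    return abs(act_a - act_b) / (1.0 - similarity + 1e-9)  # guard sim == 1
```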
The West Midlands breast cancer screening status algorithm - methodology and use as an audit tool.
Lawrence, Gill; Kearins, Olive; O'Sullivan, Emma; Tappenden, Nancy; Wallis, Matthew; Walton, Jackie
2005-01-01
To illustrate the ability of the West Midlands breast screening status algorithm to assign a screening status to women with malignant breast cancer, and its uses as a quality assurance and audit tool. Breast cancers diagnosed between the introduction of the National Health Service (NHS) Breast Screening Programme and 31 March 2001 were obtained from the West Midlands Cancer Intelligence Unit (WMCIU). Screen-detected tumours were identified via breast screening units, and the remaining cancers were assigned to one of eight screening status categories. Multiple primaries and recurrences were excluded. A screening status was assigned to 14,680 women (96% of the cohort examined); 110 cancers were not registered at the WMCIU, and the cohort included 120 screen-detected recurrences. The West Midlands breast screening status algorithm is a robust, simple tool which can be used to derive data to evaluate the efficacy and impact of the NHS Breast Screening Programme.
WEC3: Wave Energy Converter Code Comparison Project: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien
This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of a code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency domain modelling tools were not included in the WEC3 project.
"AFacet": a geometry based format and visualizer to support SAR and multisensor signature generation
NASA Astrophysics Data System (ADS)
Rosencrantz, Stephen; Nehrbass, John; Zelnio, Ed; Sudkamp, Beth
2018-04-01
When simulating multisensor signature data (including SAR, LIDAR, EO, IR, etc.), geometry data are required that accurately represent the target. Most vehicular targets can, in real life, exist in many possible configurations. Examples of these configurations might include a rotated turret, an open door, a missing roof rack, or a seat made of metal or wood. Previously we have used the Modelman (.mmp) format and tool to represent and manipulate our articulable models. Unfortunately, Modelman is now an unsupported tool and an undocumented binary format. Some work has been done to reverse engineer a reader in Matlab so that the format could continue to be useful. This work was tedious and resulted in an incomplete conversion. In addition, the resulting articulable models could not be altered and re-saved in the Modelman format. The AFacet (.afacet) articulable facet file format is a replacement for the binary Modelman (.mmp) file format. There is a one-time, straightforward path for conversion from Modelman to the AFacet format. It is a simple ASCII, comma-separated, self-documenting format that is easily readable (and in many cases usefully editable) by a human with any text editor, preventing future obsolescence. In addition, because the format is simple, it is relatively easy for even the most novice programmer to create a program to read and write AFacet files in any language without any special libraries. This paper presents the AFacet format, as well as a suite of tools for creating, articulating, manipulating, viewing, and converting the 370+ (when this paper was written) models that have been converted to the AFacet format.
Beckmann, Kerri; O'Callaghan, Michael; Vincent, Andrew; Roder, David; Millar, Jeremy; Evans, Sue; McNeil, John; Moretti, Kim
2018-03-01
The Cancer of the Prostate Risk Assessment Post-Surgical (CAPRA-S) score is a simple post-operative risk assessment tool predicting disease recurrence after radical prostatectomy, which is easily calculated using available clinical data. To be widely useful, risk tools require multiple external validations. We aimed to validate the CAPRA-S score in an Australian multi-institutional population, including private and public settings and reflecting community practice. The study population were all men on the South Australian Prostate Cancer Clinical Outcomes Collaborative Database with localized prostate cancer diagnosed during 1998-2013, who underwent radical prostatectomy without adjuvant therapy (n = 1664). Predictive performance was assessed via Kaplan-Meier and Cox proportional regression analyses, Harrell's Concordance index, calibration plots and decision curve analysis. Biochemical recurrence occurred in 342 (21%) cases. Five-year recurrence-free probabilities for CAPRA-S scores indicating low (0-2), intermediate (3-5) and high risk were 95, 79 and 46%, respectively. The hazard ratio for CAPRA-S score increments was 1.56 (95% confidence interval 1.49-1.64). The Concordance index for 5-year recurrence-free survival was 0.77. The calibration plot showed good correlation between predicted and observed recurrence-free survival across scores. Limitations include the retrospective nature and small numbers with higher CAPRA-S scores. The CAPRA-S score is an accurate predictor of recurrence after radical prostatectomy in our cohort, supporting its utility in the Australian setting. This simple tool can assist in post-surgical selection of patients who would benefit from adjuvant therapy while avoiding morbidity among those less likely to benefit. © 2017 Royal Australasian College of Surgeons.
Zarzycki, Paweł K; Zarzycka, Magdalena B; Clifton, Vicki L; Adamski, Jerzy; Głód, Bronisław K
2011-08-19
The goal of this paper is to demonstrate the separation and detection capability of the eco-friendly micro-TLC technique for the classification of spirulina and selected herbs from pharmaceutical and food products. Target compounds were extracted using relatively low-parachor liquids. A number of spirulina samples, which originated from pharmaceutical formulations and food products, were isolated using a simple one-step extraction with a small volume of methanol, acetone or tetrahydrofuran. Herb samples rich in chlorophyll dyes were analyzed as reference materials. Quantitative data derived from micro-plates under visible light conditions and after iodine staining were explored using chemometric tools, including cluster analysis and principal components analysis. Using this method we could easily distinguish genuine spirulina from non-spirulina samples, as well as fresh from expired commercial products, and furthermore we could identify some biodegradation peaks appearing on micro-TLC profiles. This methodology can be applied as a fast screening or fingerprinting tool for the classification of genuine spirulina and herb samples, and in particular may be used commercially for the rapid quality control screening of products. Furthermore, this approach allows low-cost fractionation of target substances, including cyanobacteria pigments, in raw biological or environmental samples for preliminary chemotaxonomic investigations. Due to the low consumption of the mobile phase (usually less than 1 mL per run), this method can be considered an environmentally friendly analytical tool, which may be an alternative to fingerprinting protocols based on HPLC machines and simple separation systems involving planar micro-fluidic or micro-chip devices. Copyright © 2011 Elsevier B.V. All rights reserved.
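As an illustration of the chemometric step, PCA on the densitometric profiles read from the micro-plates might look as follows (the data layout is hypothetical; scikit-learn is assumed):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# X: rows = samples (spirulina / herb extracts), columns = densitometric
# intensities from the micro-TLC plates (hypothetical layout and data).
X = np.random.rand(24, 40)
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
# A scatter plot of the two score columns separates the sample classes.
```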
Comparison and correlation of Simple Sequence Repeats distribution in genomes of Brucella species
Kiran, Jangampalli Adi Pradeep; Chakravarthi, Veeraraghavulu Praveen; Kumar, Yellapu Nanda; Rekha, Somesula Swapna; Kruti, Srinivasan Shanthi; Bhaskar, Matcha
2011-01-01
Computational genomics is an important tool for understanding the distribution of genomic features, including simple sequence repeats (SSRs), across closely related genomes, which gives valuable information regarding genetic variation. The central objective of the present study was to screen the SSRs distributed in coding and non-coding regions among different human-pathogenic Brucella species, which are involved in a range of pathological disorders. Computational analysis of the SSRs in Brucella indicates few deviations from expected random models. Statistical analysis also reveals that tri-nucleotide SSRs are overrepresented and tetra-nucleotide SSRs underrepresented in Brucella genomes. From the data, it can be suggested that the overrepresented tri-nucleotide SSRs in genomic and coding regions might be responsible for generating functional variation in the expressed proteins, which in turn may contribute to the differing pathogenicity, virulence determinants, stress response genes, transcription regulators and host adaptation proteins of Brucella genomes. Abbreviations: SSRs - Simple Sequence Repeats; ORFs - Open Reading Frames. PMID:21738309
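Screening a genome for perfect SSRs reduces to scanning for short tandem motifs; a naive regex-based sketch (not the pipeline used in the study) is:

```python
import re

def find_ssrs(seq, min_unit=1, max_unit=6, min_repeats=3):
    """Report (position, motif, copy_number) for perfect tandem repeats."""
    hits = []
    for unit in range(min_unit, max_unit + 1):
        # Backreference \2 forces (min_repeats - 1) extra copies of the motif.
        pattern = re.compile(r"(([ACGT]{%d})\2{%d,})" % (unit, min_repeats - 1))
        for m in pattern.finditer(seq):
            hits.append((m.start(), m.group(2), len(m.group(1)) // unit))
    return hits

print(find_ssrs("ATGAGCAGCAGCAGCTTTT"))   # finds the T run and the (AGC)4 repeat
```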
NASA Astrophysics Data System (ADS)
Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.
2003-03-01
A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.
NASA Astrophysics Data System (ADS)
Zhang, X.; Srinivasan, R.
2008-12-01
In this study, a user-friendly GIS tool was developed for evaluating and improving NEXRAD precipitation estimates using raingauge data. This GIS tool can automatically read in raingauge and NEXRAD data, evaluate the accuracy of NEXRAD for each time unit, implement several geostatistical methods to improve the accuracy of NEXRAD through raingauge data, and output spatial precipitation maps for distributed hydrologic models. The geostatistical methods incorporated in this tool include Simple Kriging with varying local means, Kriging with External Drift, Regression Kriging, Co-Kriging, and a geostatistical method newly developed by Li et al. (2008). This tool was applied in two test watersheds at hourly and daily temporal scales. The preliminary cross-validation results show that incorporating raingauge data to calibrate NEXRAD can markedly change the spatial pattern of NEXRAD and improve its accuracy. Using different geostatistical methods, the GIS tool was applied to produce long-term precipitation input for a distributed hydrologic model, the Soil and Water Assessment Tool (SWAT). Animated video was generated to vividly illustrate the effect of using different precipitation input data on distributed hydrologic modeling. Currently, this GIS tool is developed as an extension of SWAT, which is used as a water quantity and quality modeling tool by USDA and EPA. The flexible module-based design of this tool also makes it easy to adapt for other hydrologic models for hydrological modeling and water resources management.
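As a flavour of the geostatistical step, ordinary kriging of gauge-minus-NEXRAD residuals (one simple variant of the merging methods listed above) could be sketched with the PyKrige package; the locations and values below are hypothetical:

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

# Hypothetical gauge coordinates (km) and gauge-minus-NEXRAD residuals (mm).
x = np.array([2.0, 10.0, 25.0, 33.0, 41.0])
y = np.array([5.0, 30.0, 12.0, 40.0, 22.0])
residual = np.array([1.2, -0.8, 0.5, 2.1, -1.4])

ok = OrdinaryKriging(x, y, residual, variogram_model="spherical")
gridx = np.arange(0.0, 45.0, 5.0)
gridy = np.arange(0.0, 45.0, 5.0)
correction, variance = ok.execute("grid", gridx, gridy)
# corrected_field = nexrad_field + correction  (interpolated residual surface)
```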
Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.
Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton
2013-01-01
The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).
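Outside Galaxy, the wrapped third-party tools can of course be scripted directly. A minimal sketch of the NCBI BLAST+ step using standard blastp options; the query, database, and output file names are placeholders, and BLAST+ must be installed and on the PATH:

```python
import subprocess

# Run NCBI BLAST+ (blastp) against a pre-built protein database.
# File and database names below are placeholders, not from the paper.
result = subprocess.run(
    ["blastp",
     "-query", "candidate_effectors.faa",
     "-db", "reference_proteins_db",
     "-outfmt", "6",          # tabular output, one hit per line
     "-evalue", "1e-5",
     "-out", "hits.tsv"],
    check=True,               # raise if BLAST exits with an error
)
print("BLAST finished with return code", result.returncode)
```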
Soulis, Konstantinos X; Valiantzas, John D; Ntoulas, Nikolaos; Kargas, George; Nektarios, Panayiotis A
2017-09-15
In spite of the well-known green roof benefits, their widespread adoption in the management practices of urban drainage systems requires adequate analytical and modelling tools. In the current study, green roof runoff modelling was accomplished by developing, testing, and jointly using a simple conceptual model and a physically based numerical simulation model built with the HYDRUS-1D software. This approach combines the advantages of the conceptual model, namely simplicity, low computational requirements, and easy integration into decision support tools, with the capacity of the physically based simulation model to be transferred to conditions and locations other than those used for calibrating and validating it. The proposed approach was evaluated with an experimental dataset that included various green roof covers (succulent plants - Sedum sediforme; xerophytic plants - Origanum onites; or bare substrate without any vegetation) and two substrate depths (8 cm and 16 cm). Both the physically based and the conceptual models matched the observed hydrographs very closely. In general, the conceptual model performed better than the physically based simulation model, but the overall performance of both models was sufficient in most cases, as revealed by Nash-Sutcliffe Efficiency values generally greater than 0.70. Finally, it was showcased how the two models can be used jointly, extending the simple conceptual model to a wider set of conditions than the available experimental data in order to support green roof design. Copyright © 2017 Elsevier Ltd. All rights reserved.
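A minimal sketch of the conceptual-reservoir idea underlying such models, with purely illustrative storage and evapotranspiration parameters (not the paper's calibrated values):

```python
def green_roof_runoff(rain_mm, s_max_mm=30.0, et_mm_per_step=0.1):
    """Minimal conceptual green-roof reservoir: the substrate stores rain up
    to s_max_mm, the excess becomes runoff, and storage is depleted by a
    constant evapotranspiration rate. All parameter values are illustrative.
    """
    storage, runoff = 0.0, []
    for r in rain_mm:
        storage = max(storage - et_mm_per_step, 0.0) + r   # ET loss, then rainfall
        spill = max(storage - s_max_mm, 0.0)               # overflow becomes runoff
        storage -= spill
        runoff.append(spill)
    return runoff

print(green_roof_runoff([0, 5, 20, 15, 0, 0]))   # runoff appears once storage fills
```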
Simplified, inverse, ejector design tool
NASA Technical Reports Server (NTRS)
Dechant, Lawrence J.
1993-01-01
A simple lumped-parameter-based inverse design tool has been developed which provides flow path geometry and entrainment estimates subject to operational, acoustic, and design constraints. These constraints are manifested through specification of primary mass flow rate or ejector thrust, fully-mixed exit velocity, and static pressure matching. Fundamentally, integral forms of the conservation equations coupled with the specified design constraints are combined to yield an easily invertible linear system in terms of the flow path cross-sectional areas. Entrainment is computed by back substitution. Initial comparison with experimental and analogous one-dimensional methods shows good agreement. Thus, this simple inverse design code provides an analytically based, preliminary design tool with direct application to High Speed Civil Transport (HSCT) design studies.
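A toy rendition of that linear system, under simplifying assumptions of my own (incompressible flow and uniform, matched static pressure so the pressure forces drop out): continuity and momentum are then linear in the unknown secondary-inlet and mixed-exit areas, and entrainment follows by back-substitution. The published tool's compressible formulation may differ; the numbers below are illustrative.

```python
import numpy as np

# Toy incompressible, pressure-matched ejector sizing. Unknowns: secondary
# inlet area A_s and mixed exit area A_m. All values are illustrative.
rho, Vp, Vs, Vm, mp = 1.2, 300.0, 50.0, 150.0, 10.0   # SI units

A = np.array([[rho * Vs,    -rho * Vm],               # continuity
              [rho * Vs**2, -rho * Vm**2]])           # momentum (pressure cancels)
b = np.array([-mp, -mp * Vp])
As, Am = np.linalg.solve(A, b)
ms = rho * Vs * As                                    # entrainment by back-substitution
print(f"A_s={As:.3f} m^2, A_m={Am:.3f} m^2, entrained flow={ms:.1f} kg/s")
```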
Factors affecting adoption and implementation of AHRQ health literacy tools in pharmacies.
Shoemaker, Sarah J; Staub-DeLong, Leah; Wasserman, Melanie; Spranca, Mark
2013-01-01
Pharmacies are key sources of medication information for patients, yet few effectively serve patients with low health literacy. The Agency for Healthcare Research and Quality (AHRQ) supported the development of four health literacy tools for pharmacists to address this problem and to help assess and improve pharmacies' health literacy practices. This study aimed to understand the facilitators of and barriers to the adoption and implementation of AHRQ's health literacy tools, particularly a tool to assess a pharmacy's health literacy practices. We conducted a comparative, multiple-case study of eight pharmacies, guided by an adaptation of Rogers's Diffusion of Innovations model. Data were collected and triangulated through interviews, site visit observations, and the review of documents, and analyzed for the factors affecting pharmacies' adoption decisions and implementation of the tools. Factors important to pharmacies' decision to adopt the health literacy tools included awareness of health literacy; a culture of innovation; a change champion; the relative advantage and compatibility of the tools; and an invitation to utilize and receive support to use the tools. The barriers included a lack of leadership support, limited staff time, and a perception of the tools as complex with limited value. For implementation, the primary facilitators were buy-in from leadership, qualified staff, college-affiliated change champions, the adaptability and organization of the tool, and support. Barriers to implementation were limited leadership buy-in, prioritization of other activities, lack of qualified staff, and tool complexity. If pharmacists are provided tools that could ultimately improve their health literacy practices and patient-centered services, if the tools have a clear relative advantage and are simple as well as adaptable, and if the pharmacists are supported in their efforts, whether by colleagues or by collaborating with colleges of pharmacy, then there could be important progress toward achieving the goals of the National Action Plan for Health Literacy. Copyright © 2013 Elsevier Inc. All rights reserved.
SPARTA: Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis.
Johnson, Benjamin K; Scholz, Matthew B; Teal, Tracy K; Abramovitch, Robert B
2016-02-04
Many tools exist in the analysis of bacterial RNA sequencing (RNA-seq) transcriptional profiling experiments to identify differentially expressed genes between experimental conditions. Generally, the workflow includes quality control of reads, mapping to a reference, counting transcript abundance, and statistical tests for differentially expressed genes. In spite of the numerous tools developed for each component of an RNA-seq analysis workflow, easy-to-use bacterially oriented workflow applications to combine multiple tools and automate the process are lacking. With many tools to choose from for each step, the task of identifying a specific tool, adapting the input/output options to the specific use-case, and integrating the tools into a coherent analysis pipeline is not a trivial endeavor, particularly for microbiologists with limited bioinformatics experience. To make bacterial RNA-seq data analysis more accessible, we developed a Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis (SPARTA). SPARTA is a reference-based bacterial RNA-seq analysis workflow application for single-end Illumina reads. SPARTA is turnkey software that simplifies the process of analyzing RNA-seq data sets, making bacterial RNA-seq analysis a routine process that can be undertaken on a personal computer or in the classroom. The easy-to-install, complete workflow processes whole transcriptome shotgun sequencing data files by trimming reads and removing adapters, mapping reads to a reference, counting gene features, calculating differential gene expression, and, importantly, checking for potential batch effects within the data set. SPARTA outputs quality analysis reports, gene feature counts and differential gene expression tables and scatterplots. SPARTA provides an easy-to-use bacterial RNA-seq transcriptional profiling workflow to identify differentially expressed genes between experimental conditions. This software will enable microbiologists with limited bioinformatics experience to analyze their data and integrate next generation sequencing (NGS) technologies into the classroom. The SPARTA software and tutorial are available at sparta.readthedocs.org.
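A generic driver for such a trim-map-count workflow might look like the following sketch. The command names and flags here are placeholders standing in for the real underlying tools, not SPARTA's actual components:

```python
import subprocess

# Generic driver for a reference-based RNA-seq workflow (trim -> map -> count).
# Command names and arguments are placeholders, not SPARTA's internals; each
# stage would be a real installed tool in practice.
stages = [
    ["trim_reads", "--in", "sample.fastq", "--out", "sample.trimmed.fastq"],
    ["map_reads", "--ref", "genome.fa", "--in", "sample.trimmed.fastq",
     "--out", "sample.sam"],
    ["count_features", "--gff", "genes.gff", "--in", "sample.sam",
     "--out", "counts.txt"],
]
for cmd in stages:
    print("running:", " ".join(cmd))
    subprocess.run(cmd, check=True)   # stop the pipeline on the first failure
```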
An open source GIS-based tool to integrate the fragmentation mechanism in rockfall propagation
NASA Astrophysics Data System (ADS)
Matas, Gerard; Lantada, Nieves; Gili, Josep A.; Corominas, Jordi
2015-04-01
Rockfalls are frequent instability processes on road cuts, open pit mines and quarries, steep slopes, and cliffs. Even though the stability of rock slopes can be determined using analytical approaches, the assessment of large rock cliffs requires simplifying assumptions due to the difficulty of working with a large number of joints and the scatter of both their orientations and strength parameters. The attitude and persistence of joints within the rock mass define the size of kinematically unstable rock volumes. Furthermore, the rock block will eventually split into several fragments during its propagation downhill due to its impacts with the ground surface. Knowledge of the size, energy, trajectory… of each block resulting from fragmentation is critical in determining the vulnerability of buildings and protection structures. The objective of this contribution is to present a simple and open source tool to simulate the fragmentation mechanism in rockfall propagation models and in the calculation of impact energies. The tool includes common modes of motion for falling boulders based on the previous literature. The final tool is being implemented in a GIS (Geographic Information System) using open source Python programming. The tool under development is intended to be simple, modular, compatible with any GIS environment, open source, and able to model rockfall phenomena correctly. It could be used in any area susceptible to rockfalls after a prior adjustment of the parameters. After the adjustment of the model parameters to a given area, a simulation could be performed to obtain maps of kinetic energy, frequency, stopping density, and passing heights. This GIS-based tool and the analysis of the fragmentation laws using data collected from recent rockfalls have been developed within the RockRisk Project (2014-2016). This project is funded by the Spanish Ministerio de Economía y Competitividad and entitled "Rockfalls in cliffs: risk quantification and its prevention" (BIA2013-42582-P).
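As a purely illustrative sketch of the fragmentation step (not the RockRisk project's calibrated fragmentation law), a block can be split into fragments with random relative sizes, with a kinetic energy reported for each:

```python
import random

def fragment_block(volume_m3, n_frag=5, exponent=1.5, density=2600.0,
                   velocity=12.0, seed=1):
    """Split a falling block into fragments with skewed random relative sizes
    and report each fragment's kinetic energy. Purely illustrative: the
    exponent, fragment count, and velocity are assumptions, not a calibrated
    fragmentation law.
    """
    rnd = random.Random(seed)
    weights = [rnd.random() ** exponent for _ in range(n_frag)]
    total = sum(weights)
    for i, w in enumerate(weights):
        v_i = volume_m3 * w / total            # fragment volume share
        mass = density * v_i
        e_kin = 0.5 * mass * velocity ** 2     # kinetic energy at this instant
        print(f"fragment {i}: {v_i:.3f} m^3, {e_kin / 1e3:.1f} kJ")

fragment_block(2.0)
```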
CRISPR/Cas9 Immune System as a Tool for Genome Engineering.
Hryhorowicz, Magdalena; Lipiński, Daniel; Zeyland, Joanna; Słomski, Ryszard
2017-06-01
CRISPR/Cas (clustered regularly interspaced short palindromic repeats/CRISPR-associated) adaptive immune systems constitute a bacterial defence against invading nucleic acids derived from bacteriophages or plasmids. This prokaryotic system was adapted in molecular biology and became one of the most powerful and versatile platforms for genome engineering. CRISPR/Cas9 is a simple and rapid tool which enables the efficient modification of endogenous genes in various species and cell types. Moreover, a modified version of the CRISPR/Cas9 system with transcriptional repressors or activators allows robust transcription repression or activation of target genes. The simplicity of CRISPR/Cas9 has resulted in the widespread use of this technology in many fields, including basic research, biotechnology and biomedicine.
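A small illustration of the sequence-level side of such work: candidate SpCas9 target sites are 20-nt protospacers immediately followed by an NGG PAM, which can be listed with a simple scan (forward strand only, for brevity):

```python
import re

def find_cas9_targets(seq):
    """List candidate SpCas9 target sites: a 20-nt protospacer immediately
    followed by an NGG PAM (forward strand only, for simplicity).
    """
    return [(m.start(), m.group(1), m.group(2))
            for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq.upper())]

demo = "TTACGTACGTACGTACGTACGTAGGCCATG"
for pos, protospacer, pam in find_cas9_targets(demo):
    print(pos, protospacer, pam)   # one site with an AGG PAM in this demo
```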
NASA Astrophysics Data System (ADS)
Crawford, David L.; McKenna, D.
2006-12-01
A good estimate of sky brightness and its variations throughout the night, the months, and even the years is an essential bit of knowledge both for good observing and especially as a tool in efforts to minimize sky brightness through local action. Hence a stable and accurate monitor can be a valuable and necessary tool. We have developed such a monitor, with the financial help of the Vatican Observatory and Walker Management. The device is now undergoing its beta test in preparation for production. It is simple, accurate, well calibrated, and automatic, sending its data directly to IDA over the internet via e-mail. Approximately 50 such monitors will soon be ready for deployment worldwide, including at most major observatories. Those interested in having one should enquire of IDA about details.
High energy PIXE: A tool to characterize multi-layer thick samples
NASA Astrophysics Data System (ADS)
Subercaze, A.; Koumeir, C.; Métivier, V.; Servagent, N.; Guertin, A.; Haddad, F.
2018-02-01
High energy PIXE is a useful and non-destructive tool to characterize multi-layer thick samples such as cultural heritage objects. In a previous work, we demonstrated the possibility of performing quantitative analysis of simple multi-layer samples using high energy PIXE, without any assumption on their composition. In this work, an in-depth study of the parameters involved in the previously published method is proposed, together with its extension to more complex samples with a repeated layer. Experiments have been performed at the ARRONAX cyclotron using 68 MeV protons. The thicknesses and sequence of a multi-layer sample including two different layers of the same element have been determined. The performance and limits of this method are presented and discussed.
NONMEMory: a run management tool for NONMEM.
Wilkins, Justin J
2005-06-01
NONMEM is an extremely powerful tool for nonlinear mixed-effect modelling and simulation of pharmacokinetic and pharmacodynamic data. However, it is a console-based application whose output does not lend itself to rapid interpretation or efficient management. NONMEMory has been created to be a comprehensive project manager for NONMEM, providing detailed summary, comparison and overview of the runs comprising a given project, including the display of output data, simple post-run processing, fast diagnostic plots and run output management, complementary to other available modelling aids. Analysis time ought not to be spent on trivial tasks, and NONMEMory's role is to eliminate these as far as possible by increasing the efficiency of the modelling process. NONMEMory is freely available from http://www.uct.ac.za/depts/pha/nonmemory.php.
Analytical Tools Interface for Landscape Assessments
Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...
Development of a simplified urban water balance model (WABILA).
Henrichs, M; Langner, J; Uhl, M
2016-01-01
During the last decade, water sensitive urban design (WSUD) has become more and more accepted. However, no simple tool is available to evaluate the influence of WSUD measures on the local water balance. To counteract the impact of new settlements, planners focus on mitigating increases in runoff through the installation of infiltration systems. This leads to unnaturally increased groundwater recharge and decreased evapotranspiration. Simple software tools which evaluate or simulate the effect of WSUD on the local water balance are therefore still needed. The authors developed a tool named WABILA (Wasserbilanz) to support planners in optimal WSUD. WABILA is an easy-to-use planning tool based on simplified regression functions for established measures and land covers. Results show that WSUD has to be site-specific, based on climate conditions and the natural water balance.
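A minimal sketch in the spirit of such a regression-based balance: each land cover splits precipitation into runoff, evapotranspiration, and recharge fractions, and the site balance is the area-weighted sum. The coefficients below are illustrative assumptions, not WABILA's fitted functions:

```python
# Land-cover-weighted urban water balance sketch. Each tuple holds the
# (runoff, evapotranspiration, recharge) split; values are illustrative.
COEFFS = {
    "roof":       (0.80, 0.15, 0.05),
    "pavement":   (0.60, 0.20, 0.20),
    "green_roof": (0.30, 0.60, 0.10),
    "lawn":       (0.10, 0.60, 0.30),
}

def site_balance(areas_m2, precip_mm):
    total = sum(areas_m2.values())
    balance = [0.0, 0.0, 0.0]
    for cover, a in areas_m2.items():
        for i, frac in enumerate(COEFFS[cover]):
            balance[i] += frac * a / total       # area-weighted fraction
    return {k: round(f * precip_mm, 1)
            for k, f in zip(("runoff", "ET", "recharge"), balance)}

print(site_balance({"roof": 200, "pavement": 300, "lawn": 500}, precip_mm=800))
```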
UUI: Reusable Spatial Data Services in Unified User Interface at NASA GES DISC
NASA Technical Reports Server (NTRS)
Petrenko, Maksym; Hegde, Mahabaleshwa; Bryant, Keith; Pham, Long B.
2016-01-01
Unified User Interface (UUI) is a next-generation operational data access tool developed at the Goddard Earth Sciences Data and Information Services Center (GES DISC) to provide a simple, unified, and intuitive one-stop-shop experience for the key data services available at GES DISC, including subsetting (Simple Subset Wizard - SSW), granule file search (Mirador), plotting (Giovanni), and other legacy spatial data services. UUI is built on a flexible infrastructure of reusable, self-contained web-service building blocks that can easily be plugged into spatial applications, including third-party clients or services, to enable new functionality as new datasets and services become available. In this presentation, we will discuss our experience in designing UUI services based on open industry standards. We will also explain how the resulting framework can be used for rapid development, deployment, and integration of spatial data services, facilitating efficient access and dissemination of spatial data sets.
Simple Sensitivity Analysis for Orion GNC
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, covering everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool (Critical Factors Tool, or CFT) developed to find the input variables, or pairs of variables, which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. Input variables such as moments, mass, thrust dispersions, and date of launch were found to be significant factors for the success of various requirements. Examples are shown in this paper, as well as a summary and physics discussion of EFT-1 driving factors that the tool found.
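One of the simpler sensitivity ideas mentioned above, estimating success probability conditioned on a dispersed input, can be sketched as follows (synthetic data; this is not the CFT implementation):

```python
import numpy as np

def success_prob_by_bin(x, success, n_bins=5):
    """Bin a dispersed Monte Carlo input and estimate the probability of
    requirement success per bin. A flat profile suggests low influence;
    a strong trend flags a candidate driving factor.
    """
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    return [(edges[b], edges[b + 1], success[idx == b].mean())
            for b in range(n_bins)]

rng = np.random.default_rng(0)
x = rng.normal(size=2000)                        # a dispersed input variable
success = (x + 0.3 * rng.normal(size=2000)) < 1  # synthetic requirement outcome
for lo, hi, p in success_prob_by_bin(x, success):
    print(f"[{lo:+.2f}, {hi:+.2f}]: P(success) = {p:.2f}")
```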
Ernecoff, Natalie C; Witteman, Holly O; Chon, Kristen; Chen, Yanquan Iris; Buddadhumaruk, Praewpannarai; Chiarchiaro, Jared; Shotsberger, Kaitlin J; Shields, Anne-Marie; Myers, Brad A; Hough, Catherine L; Carson, Shannon S; Lo, Bernard; Matthay, Michael A; Anderson, Wendy G; Peterson, Michael W; Steingrub, Jay S; Arnold, Robert M; White, Douglas B
2016-06-01
Although barriers to shared decision making in intensive care units are well documented, there are currently no easily scaled interventions to overcome these problems. We sought to assess stakeholders' perceptions of the acceptability, usefulness, and design suggestions for a tablet-based tool to support communication and shared decision making in ICUs. We conducted in-depth semi-structured interviews with 58 key stakeholders (30 surrogates and 28 ICU care providers). Interviews explored stakeholders' perceptions about the acceptability of a tablet-based tool to support communication and shared decision making, including the usefulness of modules focused on orienting families to the ICU, educating them about the surrogate's role, completing a question prompt list, eliciting patient values, educating about treatment options, eliciting perceptions about prognosis, and providing psychosocial support resources. The interviewer also elicited stakeholders' design suggestions for such a tool. We used constant comparative methods to identify key themes that arose during the interviews. Overall, 95% (55/58) of participants perceived the proposed tool to be acceptable, with 98% (57/58) of interviewees finding six or more of the seven content domains acceptable. Stakeholders identified several potential benefits of the tool including that it would help families prepare for the surrogate role and for family meetings as well as give surrogates time and a framework to think about the patient's values and treatment options. Key design suggestions included: conceptualize the tool as a supplement to rather than a substitute for surrogate-clinician communication; make the tool flexible with respect to how, where, and when surrogates can access the tool; incorporate interactive exercises; use video and narration to minimize the cognitive load of the intervention; and build an extremely simple user interface to maximize usefulness for individuals with low computer literacy. There is broad support among stakeholders for the use of a tablet-based tool to improve communication and shared decision making in ICUs. Eliciting the perspectives of key stakeholders early in the design process yielded important insights to create a tool tailored to the needs of surrogates and care providers in ICUs. Copyright © 2016 Elsevier Inc. All rights reserved.
Ramot, Daniel; Johnson, Brandon E.; Berry, Tommie L.; Carnell, Lucinda; Goodman, Miriam B.
2008-01-01
Background: Caenorhabditis elegans locomotion is a simple behavior that has been widely used to dissect genetic components of behavior, synaptic transmission, and muscle function. Many of the paradigms that have been created to study C. elegans locomotion rely on qualitative experimenter observation. Here we report the implementation of an automated tracking system developed to quantify the locomotion of multiple individual worms in parallel. Methodology/Principal Findings: Our tracking system generates a consistent measurement of locomotion that allows direct comparison of results across experiments and experimenters and provides a standard method to share data between laboratories. The tracker utilizes a video camera attached to a zoom lens and a software package implemented in MATLAB®. We demonstrate several proof-of-principle applications for the tracker, including measuring speed in the absence and presence of food and in the presence of serotonin. We further use the tracker to automatically quantify the time course of paralysis of worms exposed to aldicarb and levamisole and show that tracker performance compares favorably to data generated using a hand-scored metric. Conclusions/Significance: Although this is not the first automated tracking system developed to measure C. elegans locomotion, our tracking software package is freely available and provides a simple interface that includes tools for rapid data collection and analysis. By contrast with other tools, it is not dependent on a specific set of hardware. We propose that the tracker may be used for a broad range of additional worm locomotion applications, including genetic and chemical screening. PMID:18493300
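The published tracker is a MATLAB package; as a language-neutral illustration of the core speed measurement, per-frame centroid displacements can be converted to speed as follows (the frame rate and pixel scale are illustrative assumptions):

```python
import numpy as np

def track_speed(centroids_xy, fps=2.0, um_per_px=8.0):
    """Instantaneous speed from a worm's centroid track. The frame rate and
    pixel scale are illustrative, not the published tracker's settings.
    """
    xy = np.asarray(centroids_xy, dtype=float)
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)   # pixels moved per frame
    return step * um_per_px * fps                        # micrometres per second

print(track_speed([(100, 100), (103, 104), (108, 104)]))   # [80. 80.]
```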
Bonato, Lucio; Minelli, Alessandro; Lopresti, Massimo; Cerretti, Pierfilippo
2014-01-01
ChiloKey is a matrix-based, interactive key to all 179 species of Geophilomorpha (Chilopoda) recorded from Europe, including species of uncertain identity and those whose morphology is only partially known. The key is intended to assist in the identification of subadult and adult specimens, by means of microscopy and simple dissection techniques whenever necessary. The key is freely available on the web at http://www.biologia.unipd.it/chilokey/ and at http://www.interactive-keys.eu/chilokey/.
Parametric Study of Biconic Re-Entry Vehicles
NASA Technical Reports Server (NTRS)
Steele, Bryan; Banks, Daniel W.; Whitmore, Stephen A.
2007-01-01
An optimization based on hypersonic aerodynamic performance and volumetric efficiency was accomplished for a range of biconic configurations. Both axisymmetric and quasi-axisymmetric geometries (bent and flattened) were analyzed. The aerodynamic optimization was based on simple hypersonic incidence-angle analysis tools. The range of configurations included those suitable for a lunar return trajectory with a lifting aerocapture at Earth and an overall volume that could support a nominal crew. The results yielded five configurations that had acceptable aerodynamic performance and met overall geometry and size limitations.
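Simple incidence-angle analysis of this kind is typically a Newtonian-type estimate, Cp = Cp,max sin^2(delta) on windward panels; treating that as the method used here is an assumption on my part. A sketch:

```python
import math

def newtonian_cp(incidence_deg, cp_max=2.0):
    """Newtonian pressure coefficient for a surface panel:
    Cp = Cp_max * sin^2(delta) on windward panels, zero in shadow.
    The classic simple incidence-angle hypersonic estimate.
    """
    delta = math.radians(incidence_deg)
    return cp_max * math.sin(delta) ** 2 if incidence_deg > 0 else 0.0

# Fore and aft cone half-angles of a notional biconic at zero angle of attack
# (angles are illustrative, not from the study).
for angle in (12.5, 7.0):
    print(f"cone angle {angle:>5.1f} deg: Cp = {newtonian_cp(angle):.3f}")
```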
The ARC/INFO geographic information system
NASA Astrophysics Data System (ADS)
Morehouse, Scott
1992-05-01
ARC/INFO is a general-purpose system for processing geographic information. It is based on a relatively simple model of geographic space—the coverage—and contains an extensive set of geoprocessing tools which operate on coverages. ARC/INFO is used in a wide variety of applications areas, including: natural-resource inventory and planning, cadastral database development and mapping, urban and regional planning, and cartography. This paper is an overview of ARC/INFO and discusses the ARC/INFO conceptual architecture, data model, operators, and user interface.
Neuman, Keir C.; Block, Steven M.
2006-01-01
Since their invention just over 20 years ago, optical traps have emerged as a powerful tool with broad-reaching applications in biology and physics. Capabilities have evolved from simple manipulation to the application of calibrated forces on—and the measurement of nanometer-level displacements of—optically trapped objects. We review progress in the development of optical trapping apparatus, including instrument design considerations, position detection schemes and calibration techniques, with an emphasis on recent advances. We conclude with a brief summary of innovative optical trapping configurations and applications. PMID:16878180
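Among the calibration techniques reviewed, the equipartition method is the simplest: the trap stiffness follows from the variance of the bead's thermal position fluctuations, k = kB·T / <x^2>. A short sketch with synthetic data:

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def trap_stiffness(positions_m, temperature_k=298.0):
    """Equipartition calibration of an optical trap: k = k_B * T / <x^2>,
    using the variance of the bead's position fluctuations about the mean.
    """
    var = np.var(np.asarray(positions_m))
    return K_B * temperature_k / var

x = np.random.default_rng(1).normal(0.0, 10e-9, 50_000)  # ~10 nm rms, synthetic
print(f"k = {trap_stiffness(x) * 1e6:.1f} pN/um")         # ~41 pN/um
```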
NASA Technical Reports Server (NTRS)
Hartz, Leslie
1994-01-01
Tool helps worker grip and move along large, smooth structure with no handgrips or footholds. Adheres to surface but easily released by actuating simple mechanism. Includes handle and segmented contact-adhesive pad. Bulk of pad made of soft plastic foam conforming to surface of structure. Each segment reinforced with rib. In sticking mode, ribs braced by side catches. In peeling mode, side catches retracted, and segmented adhesive pad loses its stiffness. Modified versions useful in inspecting hulls of ships and scaling walls in rescue operations.
Jaeger, Christian; Hemmann, Felix
2014-01-01
Elimination of Artifacts in NMR SpectroscopY (EASY) is a simple but very effective tool to remove simultaneously any real NMR probe background signal, any spectral distortions due to deadtime ringdown effects and, specifically, severe acoustic ringing artifacts in NMR spectra of low-gamma nuclei. EASY enables and maintains quantitative NMR (qNMR), as only a single pulse (preferably 90°) is used for data acquisition. After the acquisition of the first scan (which contains the wanted NMR signal and the background/deadtime/ringing artifacts), the same experiment is repeated immediately afterwards, before the T1 waiting delay. This second scan contains only the background/deadtime/ringing parts. Hence, the simple difference of both yields clean NMR line shapes free of artifacts. In this Part I, various examples of complete (1)H, (11)B, (13)C, (19)F probe background removal due to construction parts of the NMR probes are presented. Furthermore, (25)Mg EASY of Mg(OH)2 is presented, and this example shows how extremely strong acoustic ringing can be suppressed (by more than a factor of 200), such that phase and baseline correction for spectra acquired with a single pulse is no longer a problem. EASY is also a step towards deadtime-free data acquisition, as these effects are also canceled completely. EASY can be combined with any other NMR experiment, including 2D NMR, if baseline distortions are a big problem. © 2013 Published by Elsevier Inc.
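The heart of the scheme is a plain subtraction of the artifact-only repeat scan from the first scan; a synthetic illustration:

```python
import numpy as np

def easy_difference(fid_with_signal, fid_background_only):
    """Core of the EASY scheme: the repeat scan, acquired immediately after
    the first (before T1 recovery), contains only probe background, deadtime
    ringdown, and acoustic ringing; subtracting it leaves the clean signal.
    """
    return np.asarray(fid_with_signal) - np.asarray(fid_background_only)

# Synthetic demo: a damped oscillation (signal) plus a slow ringing artifact.
t = np.linspace(0, 1, 1024)
ringing = 0.5 * np.exp(-t / 0.5) * np.sin(2 * np.pi * 3 * t)
scan1 = np.exp(-t / 0.1) * np.sin(2 * np.pi * 40 * t) + ringing
scan2 = ringing                                  # artifact-only repeat scan
clean = easy_difference(scan1, scan2)
print(np.allclose(clean, np.exp(-t / 0.1) * np.sin(2 * np.pi * 40 * t)))  # True
```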
Simple Biological Systems for Assessing the Activity of Superoxide Dismutase Mimics
Tovmasyan, Artak; Reboucas, Julio S.
2014-01-01
Significance: Half a century of research provided unambiguous proof that superoxide and species derived from it—reactive oxygen species (ROS)—play a central role in many diseases and degenerative processes. This stimulated the search for pharmaceutical agents that are capable of preventing oxidative damage, and methods of assessing their therapeutic potential. Recent Advances: The limitations of superoxide dismutase (SOD) as a therapeutic tool directed attention to small molecules, SOD mimics, that are capable of catalytically scavenging superoxide. Several groups of compounds, based on either metal complexes, including metalloporphyrins, metallocorroles, Mn(II) cyclic polyamines, and Mn(III) salen derivatives, or non-metal based compounds, such as fullerenes, nitrones, and nitroxides, have been developed and studied in vitro and in vivo. Very few entered clinical trials. Critical Issues and Future Directions: Development of SOD mimics requires in-depth understanding of their mechanisms of biological action. Elucidation of both molecular features, essential for efficient ROS-scavenging in vivo, and factors limiting the potential side effects requires biologically relevant and, at the same time, relatively simple testing systems. This review discusses the advantages and limitations of genetically engineered SOD-deficient unicellular organisms, Escherichia coli and Saccharomyces cerevisiae, as tools for investigating the efficacy and mechanisms of biological actions of SOD mimics. These simple systems allow the scrutiny of the minimal requirements for a functional SOD mimic: the association of a high catalytic activity for superoxide dismutation, low toxicity, and an efficient cellular uptake/biodistribution. Antioxid. Redox Signal. 20, 2416–2436. PMID:23964890
NASA Technical Reports Server (NTRS)
Clementel, N.; Madura, T. I.; Kruip, C. J. H.; Icke, V.; Gull, T. R.
2014-01-01
Eta Carinae is an ideal astrophysical laboratory for studying massive binary interactions and evolution, and stellar wind-wind collisions. Recent three-dimensional (3D) simulations set the stage for understanding the highly complex 3D flows in Eta Car. Observations of different broad high- and low-ionization forbidden emission lines provide an excellent tool to constrain the orientation of the system, the primary's mass-loss rate, and the ionizing flux of the hot secondary. In this work we present the first steps towards generating synthetic observations to compare with available and future HST/STIS data. We present initial results from full 3D radiative transfer simulations of the interacting winds in Eta Car. We use the SimpleX algorithm to post-process the output from 3D SPH simulations and obtain the ionization fractions of hydrogen and helium assuming three different mass-loss rates for the primary star. The resultant ionization maps of both species constrain the regions where the observed forbidden emission lines can form. Including collisional ionization is necessary to achieve a better description of the ionization states, especially in the areas shielded from the secondary's radiation. We find that reducing the primary's mass-loss rate increases the volume of ionized gas, creating larger areas where the forbidden emission lines can form. We conclude that post processing 3D SPH data with SimpleX is a viable tool to create ionization maps for Eta Car.
PAHFIT: Properties of PAH Emission
NASA Astrophysics Data System (ADS)
Smith, J. D.; Draine, Bruce
2012-10-01
PAHFIT is an IDL tool for decomposing Spitzer IRS spectra of PAH emission sources, with a special emphasis on the careful recovery of ambiguous silicate absorption, and weak, blended dust emission features. PAHFIT is primarily designed for use with full 5-35 micron Spitzer low-resolution IRS spectra. PAHFIT is a flexible tool for fitting spectra, and you can add or disable features, compute combined flux bands, change fitting limits, etc., without changing the code. PAHFIT uses a simple, physically-motivated model, consisting of starlight, thermal dust continuum in a small number of fixed temperature bins, resolved dust features and feature blends, prominent emission lines (which themselves can be blended with dust features), as well as simple fully-mixed or screen dust extinction, dominated by the silicate absorption bands at 9.7 and 18 microns. Most model components are held fixed or are tightly constrained. PAHFIT uses Drude profiles to recover the full strength of dust emission features and blends, including the significant power in the wings of the broad emission profiles. This means the resulting feature strengths are larger (by factors of 2-4) than are recovered by methods which estimate the underlying continuum using line segments or spline curves fit through fiducial wavelength anchors.
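The Drude profile PAHFIT uses for dust features has the form I(λ) = b·γ² / ((λ/λ0 - λ0/λ)² + γ²), whose broad wings carry significant power. A small sketch (the feature parameters below are illustrative):

```python
import numpy as np

def drude(wavelength_um, center_um, frac_fwhm, peak_intensity):
    """Drude profile as used for PAH dust emission features: broader wings
    than a Gaussian, so more feature power is recovered. frac_fwhm is the
    fractional FWHM (gamma); parameter values are illustrative.
    """
    x = wavelength_um
    g = frac_fwhm
    return peak_intensity * g**2 / ((x / center_um - center_um / x) ** 2 + g**2)

lam = np.linspace(5.0, 9.0, 5)
print(drude(lam, center_um=7.7, frac_fwhm=0.044, peak_intensity=1.0))
```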
VennDIS: a JavaFX-based Venn and Euler diagram software to generate publication quality figures.
Ignatchenko, Vladimir; Ignatchenko, Alexandr; Sinha, Ankit; Boutros, Paul C; Kislinger, Thomas
2015-04-01
Venn diagrams are graphical representations of the relationships among multiple sets of objects and are often used to illustrate similarities and differences among genomic and proteomic datasets. All currently existing tools for producing Venn diagrams evince one of two traits: they require expertise in specific statistical software packages (such as R), or they lack the flexibility required to produce publication-quality figures. We describe a simple tool that addresses both shortcomings, Venn Diagram Interactive Software (VennDIS), a JavaFX-based solution for producing highly customizable, publication-quality Venn and Euler diagrams of up to five sets. The strengths of VennDIS are its simple graphical user interface and its large array of customization options, including the ability to modify attributes such as the font, style and position of the labels, background color, size of the circle/ellipse, and outline color. It is platform independent and provides real-time visualization of figure modifications. The created figures can be saved as XML files for future modification or exported as high-resolution images for direct use in publications. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A novel methodology for building robust design rules by using design based metrology (DBM)
NASA Astrophysics Data System (ADS)
Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan
2013-03-01
This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has used a simulation tool and a simple-pattern spider mask. At the early stage of a device, the accuracy of the simulation tool is poor, and the evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a large number of pattern situations including various 1D and 2D design structures. In order to overcome the difficulty of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis of PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of the test patterns were inspected within a few hours. Mass silicon data were handled not by personal judgment but by statistical processing. From the results, robust design rules were successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.
Moghazy, Amr; Abdelrahman, Amira; Fahim, Ayman
2012-01-01
Preparedness is a necessity for the proper handling of emergencies and disasters, particularly in the Suez Canal and Sinai regions. To assure the best success rates, educational programs should be environmentally based. Burn and fire prevention education programs were tailored to the social and educational levels of the audience. In addition, common etiologies and the applicability of preventive measures, according to local resources and logistics, were considered. Presentations were the main educational tool; they were made as simple as possible to assure the best understanding. To assure continuous education, brochures and stickers containing the most common mistakes and questions were distributed after the sessions. The audience was classified according to level of knowledge into a health professional group, student groups, a high-risk group, and a lay people group. For course efficacy evaluation, pre- and posttests were used immediately before and after the sessions. Right answers in both tests were compared for statistical significance. Results showed significant acquisition of proper attitude and knowledge in all educated groups; gains were highest among students and lowest among health professionals. Comprehensive, simple, environmentally based educational programs are ideal for rapid reform and community mobilization in our region. Activities should include direct contact, stickers and flyers, and audiovisual tools if possible.
Touch Interaction with 3D Geographical Visualization on Web: Selected Technological and User Issues
NASA Astrophysics Data System (ADS)
Herman, L.; Stachoň, Z.; Stuchlík, R.; Hladík, J.; Kubíček, P.
2016-10-01
The use of both 3D visualization and devices with touch displays is increasing. In this paper, we focus on Web technologies for 3D visualization of spatial data and interaction with it via touch screen gestures. At the first stage, we compared the support for touch interaction in selected JavaScript libraries on different hardware (desktop PCs with touch screens, tablets, and smartphones) and software platforms. Afterwards, we carried out a simple empirical test (within-subject design, 6 participants, 2 simple tasks, an Acer LCD touch monitor, and digital terrain models as stimuli) focusing on the ability of users to solve simple spatial tasks via touch screens. An in-house testing web tool was developed for this purpose, based on JavaScript, PHP, X3DOM, and the Hammer.js library. The correctness of answers, speed of performance, gestures used, and a simple gesture metric were recorded and analysed. Preliminary results revealed that the pan gesture is the one most frequently used by test participants, and it is also supported by the majority of 3D libraries. Possible gesture metrics and future developments, including interpersonal differences, are discussed in the conclusion.
MyGeoHub: A Collaborative Geospatial Research and Education Platform
NASA Astrophysics Data System (ADS)
Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.
2017-12-01
Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings, GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead it supports diverse needs ranging from just a feature-rich data management system to complex scientific tools and workflows.
External validation of a simple clinical tool used to predict falls in people with Parkinson disease
Duncan, Ryan P.; Cavanaugh, James T.; Earhart, Gammon M.; Ellis, Terry D.; Ford, Matthew P.; Foreman, K. Bo; Leddy, Abigail L.; Paul, Serene S.; Canning, Colleen G.; Thackeray, Anne; Dibble, Leland E.
2015-01-01
Background: Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time consuming component of patient care. Recently a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. Methods: We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. Results: The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76-0.89), comparable to the developmental study. Conclusion: The results validated the utility of the tool for allowing clinicians to quickly and accurately identify an individual's risk of an impending fall. PMID:26003412
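The decision rule itself is simple enough to state in a few lines; a sketch that counts the three predictors (the published score-to-risk mapping is not reproduced here):

```python
def fall_risk_score(fell_last_year, froze_last_month, gait_speed_m_s):
    """Three-predictor fall screen described above: one point each for a fall
    in the previous year, freezing of gait in the past month, and gait speed
    below 1.1 m/s. Higher scores indicate higher risk; the published tool's
    score-to-probability mapping is not reproduced here.
    """
    return int(fell_last_year) + int(froze_last_month) + int(gait_speed_m_s < 1.1)

print(fall_risk_score(True, False, 1.02))   # -> 2 of 3 predictors present
```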
Benefits and Pitfalls: Simple Guidelines for the Use of Social Networking Tools in K-12 Education
ERIC Educational Resources Information Center
Huffman, Stephanie
2013-01-01
The article will outline a framework for the use of social networking tools in K-12 education framed around four thought provoking questions: 1) what are the benefits and pitfalls of using social networking tools in P-12 education, 2) how do we plan effectively for the use of social networking tool, 3) what role does professional development play…
Moret, Whitney M
2018-03-21
Economic strengthening practitioners are increasingly seeking data collection tools that will help them target households vulnerable to HIV and poor child well-being outcomes, match households to appropriate interventions, monitor their status, and determine readiness for graduation from project support. This article discusses efforts in 3 countries to develop simple, valid tools to quantify and classify economic vulnerability status. In Côte d'Ivoire, we conducted a cross-sectional survey with 3,749 households to develop a scale based on the definition of HIV-related economic vulnerability from the U.S. President's Emergency Plan for AIDS Relief (PEPFAR) for the purpose of targeting vulnerable households for PEPFAR-funded programs for orphans and vulnerable children. The vulnerability measures examined did not cluster in ways that would allow for the creation of a small number of composite measures, and thus we were unable to develop a scale. In Uganda, we assessed the validity of a vulnerability index developed to classify households according to donor classifications of economic status by measuring its association with a validated poverty measure, finding only a modest correlation. In South Africa, we developed monitoring and evaluation tools to assess economic status of individual adolescent girls and their households. We found no significant correlation with our validation measures, which included a validated measure of girls' vulnerability to HIV, a validated poverty measure, and subjective classifications generated by the community, data collector, and respondent. Overall, none of the measures of economic vulnerability used in the 3 countries varied significantly with their proposed validation items. Our findings suggest that broad constructs of economic vulnerability cannot be readily captured using simple scales to classify households and individuals in a way that accounts for a substantial amount of variance at locally defined vulnerability levels. We recommend that researchers and implementers design monitoring and evaluation instruments to capture narrower definitions of vulnerability based on characteristics programs intend to affect. We also recommend using separate tools for targeting based on context-specific indicators with evidence-based links to negative outcomes. Policy makers and donors should avoid reliance on simplified metrics of economic vulnerability in the programs they support. © Moret.
NASA Astrophysics Data System (ADS)
Warren, M. A.; Goult, S.; Clewley, D.
2018-06-01
Advances in technology allow remotely sensed data to be acquired at increasingly higher spatial and spectral resolutions. These data may then be used to inform government decision making and to address a range of research- and application-driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. Often the software required to process the data is varied, highly technical, and too advanced for the novice user to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain and allows users to submit jobs and process data remotely over a web interface. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free and open-source tools to take radiometrically corrected data from sensor geometry into geocorrected form and to generate simple or complex band ratio products. The final processed data products are acquired via HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network through a simple-to-use web interface.
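A band ratio product of the kind SCOPS generates, such as an NDVI-style normalised difference, is essentially a one-liner over two band arrays; a small numpy sketch (band values are illustrative):

```python
import numpy as np

def normalised_ratio(band_a, band_b):
    """Simple band-ratio product, e.g. an NDVI-style normalised difference
    between two spectral bands, with divide-by-zero pixels masked to 0.
    """
    a = np.asarray(band_a, dtype=float)
    b = np.asarray(band_b, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        out = (a - b) / (a + b)
    return np.where(np.isfinite(out), out, 0.0)

nir = np.array([[0.42, 0.50], [0.08, 0.0]])
red = np.array([[0.10, 0.12], [0.07, 0.0]])
print(normalised_ratio(nir, red))
```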
A numerical tool for reproducing driver behaviour: experiments and predictive simulations.
Casucci, M; Marchitto, M; Cacciabue, P C
2010-03-01
This paper presents the simulation tool called SDDRIVE (Simple Simulation of Driver performance), which is the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions, contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a Virtual Reality full scale simulator for validating the simulation. Then the predictive potentiality of the tool is shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes in similar traffic conditions.
Diagnosis of Late-Stage, Early-Onset, Small-Fiber Polyneuropathy
2016-10-01
develop biotechnology tools for simple diagnosis (sweat testing and pupillometry), 3) identify gene polymorphisms to detect risk for SFPN. None...(Goal 4) Specific Aim 2: To develop and evaluate simple biotechnology devices for diagnosing and monitoring longstanding eoSFPN based on
Simple Nutrition Screening Tool for Pediatric Inpatients.
White, Melinda; Lawson, Karen; Ramsey, Rebecca; Dennis, Nicole; Hutchinson, Zoe; Soh, Xin Ying; Matsuyama, Misa; Doolan, Annabel; Todd, Alwyn; Elliott, Aoife; Bell, Kristie; Littlewood, Robyn
2016-03-01
Pediatric nutrition risk screening tools are not routinely implemented throughout many hospitals, despite prevalence studies demonstrating malnutrition is common in hospitalized children. Existing tools lack the simplicity of those used to assess nutrition risk in the adult population. This study reports the accuracy of a new, quick, and simple pediatric nutrition screening tool (PNST) designed to be used for pediatric inpatients. The pediatric Subjective Global Nutrition Assessment (SGNA) and anthropometric measures were used to develop and assess the validity of 4 simple nutrition screening questions comprising the PNST. Participants were pediatric inpatients in 2 tertiary pediatric hospitals and 1 regional hospital. Two affirmative answers to the PNST questions were found to maximize the specificity and sensitivity to the pediatric SGNA and body mass index (BMI) z scores for malnutrition in 295 patients. The PNST identified 37.6% of patients as being at nutrition risk, whereas the pediatric SGNA identified 34.2%. The sensitivity and specificity of the PNST compared with the pediatric SGNA were 77.8% and 82.1%, respectively. The sensitivity of the PNST at detecting patients with a BMI z score of less than -2 was 89.3%, and the specificity was 66.2%. Both the PNST and pediatric SGNA were relatively poor at detecting patients who were stunted or overweight, with the sensitivity and specificity being less than 69%. The PNST provides a sensitive, valid, and simpler alternative to existing pediatric nutrition screening tools such as Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP), Screening Tool Risk on Nutritional status and Growth (STRONGkids), and Paediatric Yorkhill Malnutrition Score (PYMS) to ensure the early detection of hospitalized children at nutrition risk. © 2014 American Society for Parenteral and Enteral Nutrition.
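The decision rule, two or more affirmative answers out of four flagging nutrition risk, can be captured directly (the question wording is in the published tool and is not reproduced here):

```python
def pnst_at_risk(answers):
    """Pediatric Nutrition Screening Tool decision rule as described above:
    four yes/no questions, with two or more affirmative answers flagging
    nutrition risk. `answers` is a sequence of four booleans.
    """
    assert len(answers) == 4, "PNST has four screening questions"
    return sum(bool(a) for a in answers) >= 2

print(pnst_at_risk([True, False, True, False]))   # -> True (at risk)
```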
Identification of apple cultivars on the basis of simple sequence repeat markers.
Liu, G S; Zhang, Y G; Tao, R; Fang, J G; Dai, H Y
2014-09-12
DNA markers are useful tools that play an important role in plant cultivar identification. They are usually based on the polymerase chain reaction (PCR) and include simple sequence repeats (SSRs), inter-simple sequence repeats, and random amplified polymorphic DNA. However, DNA markers have not been used effectively for the complete identification of plant cultivars because of the lack of known DNA fingerprints. Recently, a novel approach called the cultivar identification diagram (CID) strategy was developed to facilitate the use of DNA markers for separating plant individuals. The CID is designed so that a polymorphic marker is generated from each PCR, directly allowing cultivar samples to be separated at each step. Therefore, it can be used to identify cultivars and varieties easily with fewer primers. In this study, 60 apple cultivars, including a few main cultivars grown in the field and varieties from descendants (Fuji x Telamon), were examined. Of the 20 pairs of SSR primers screened, 8 pairs gave reproducible, polymorphic DNA amplification patterns. The banding patterns obtained from these 8 primers were used to construct a CID map. Each cultivar or variety in this study was distinguished from the others completely, indicating that this method can be used for efficient cultivar identification. The results contribute to studies on germplasm resources and the seedling industry in fruit trees.
Taylor, S; Byrne, A; Adams, R; Turner, J; Hanna, L; Staffurth, J; Farnell, D; Sivell, S; Nelson, A; Green, J
2016-10-01
Although pelvic radiotherapy is an effective treatment for various malignancies, around half of patients develop significant gastrointestinal problems. These symptoms often remain undetected, despite the existence of effective treatments. This study developed and refined a simple screening tool to detect common gastrointestinal symptoms in outpatient clinics. These symptoms have a significant effect on quality of life. This tool will increase detection rates and so enable access to specialist gastroenterologists, which will in turn lead to improved symptom control and quality of life after treatment. A literature review and expert consensus meeting identified four items for the ALERT-B (Assessment of Late Effects of RadioTherapy - Bowel) screening tool. ALERT-B was face-tested for its usability and acceptability using cognitive interviews with 12 patients experiencing late gastrointestinal symptoms after pelvic radiotherapy. Thematic analysis and probe category were used to analyse interview transcripts. Interview data were presented to a group of experts to agree on the final content and format of the tool. ALERT-B was assessed for reliability and tested for validity against the Gastrointestinal Symptom Rating Scale in a clinical study (EAGLE). Overall, the tool was found to be acceptable in terms of wording, response format and completion time. Participant-reported experiences, including lifestyle modifications and the psychological effect of the symptoms, led to further modifications of the tool. The refined tool includes three questions covering rectal bleeding, incontinence, nocturnal bowel movements and impact on quality of life, including mood, relationships and socialising. ALERT-B was successfully validated against the Gastrointestinal Symptom Rating Scale in the EAGLE study, with the tool shown broadly to be internally consistent (Cronbach's α = 0.61, and all item-subscale correlation [Spearman] coefficients > 0.6). The ALERT-B screening tool can be used in clinical practice to improve post-treatment supportive care by triggering the clinical assessment of patients suitable for referral to a gastroenterologist. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
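The internal-consistency statistic quoted above, Cronbach's alpha, is straightforward to compute from an items-by-respondents matrix; a small sketch with synthetic data (not the EAGLE dataset):

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).
    """
    x = np.asarray(item_scores, dtype=float)     # shape: (respondents, items)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

scores = np.array([[1, 1, 0], [1, 0, 1], [0, 0, 0], [1, 1, 1], [0, 1, 0]])
print(round(cronbach_alpha(scores), 2))          # synthetic yes/no item data
```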
Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology
Grüning, Björn A.; Paszkiewicz, Konrad; Pritchard, Leighton
2013-01-01
The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu). PMID:24109552
Woo, Ann; Hittell, Jodi; Beardsley, Carrie; Noh, Charles; Stoukides, Cheryl A; Kaul, Alan F
2004-01-01
The goal of this ongoing comprehensive osteoporosis disease management initiative is to provide adult primary care physicians' (PCPs') offices with a program enabling them to systematically identify and manage their patient population for osteoporosis. For over six years, Hill Physicians Medical Group (Hill Physicians) has implemented multiple strategies to develop a best practice for identifying and treating members who were candidates for osteoporosis therapy. Numerous tools were used to support this disease management effort, including: evidence-based clinical practice guidelines, patient education sessions, the Simple Calculated Osteoporosis Risk Estimation (SCORE) questionnaire tool, member-specific reports for PCPs, targeted member mailings, office-based Peripheral Instantaneous X-ray Imaging (PIXI) testing and counseling, dual x-ray absorptiometry (DEXA) scan guidelines, and web-based Electronic Simple Calculated Osteoporosis Risk Estimation (eSCORE) questionnaire tools. Hill Physicians tabulated results for patients who completed 2649 SCORE tests, screened 978 patients with PIXI tests, and identified 338 osteopenic and 124 osteoporotic patients. Progress in this unique, ongoing six-year educational initiative has been slow but promising. New physician offices express interest in participating, and those offices that have participated in the program continue to screen for osteoporosis. Hill Physicians' message is consistent and is communicated to the physicians repeatedly in different ways in accordance with the principles of educational outreach. Physicians who have conducted the program have received positive feedback from their patients and office staff and have begun to communicate their experience to their peers.
Burridge-Knopoff Model as an Educational and Demonstrational Tool in Seismicity Prediction
NASA Astrophysics Data System (ADS)
Kato, M.
2007-12-01
While our efforts are ongoing, the fact that predicting destructive earthquakes is not straightforward is a difficult message to convey to the general public. Japan is prone to two types of destructive earthquakes: interplate events along the Japan Trench and Nankai Trough, and intraplate events that often occur beneath megacities. The periodicity of interplate earthquakes is usually explained by the elastic rebound theory, but we are aware that the historical seismicity along the Nankai Trough is not simply periodic. Inland intraplate events have geologically postulated recurrence intervals far longer than a human lifetime, and we lack sufficient knowledge to model their behavior, including interactions among intraplate and interplate events. To demonstrate that the accumulation and release of elastic energy is complex even in a simple system, we propose to utilize the Burridge-Knopoff (BK) model as a demonstrational tool. The original one-dimensional model is easy to construct and handle, making it an effective educational tool for classroom use as well. Our simulator is a simple realization of the original one-dimensional BK model, consisting of small blocks, springs and a motor. The accumulation and release of strain is visibly observable, and by guessing when the next large events will occur we learn intuitively that observation of strain accumulation is only one element in predicting large events. Quantitative analysis of the system is also possible by measuring the movement of the blocks. While the long-term average of strain energy is controlled by the loading rate, the observed seismicity is neither time-predictable nor slip-predictable. The time between successive events is never constant. The distribution of released energy obeys a power law, similar to the Ishimoto-Iida and Gutenberg-Richter laws. This tool is also useful for demonstrating the nonlinear behavior of a complex system.
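A numerical analogue of the demonstrator is also easy to build. The sketch below integrates a quasi-static one-dimensional Burridge-Knopoff-type block chain; all parameter values are arbitrary choices for illustration, not a description of the authors' apparatus.

import random

# Quasi-static sketch of a one-dimensional Burridge-Knopoff-type block chain
# (arbitrary parameters; a toy numerical analogue of the classroom apparatus).
N, kc, kp = 20, 1.0, 0.5      # blocks, coupling and loading spring constants
threshold = 1.0               # static friction limit
v_dt = 0.01                   # plate displacement per loading step
x = [random.uniform(-0.1, 0.1) for _ in range(N)]
plate = 0.0

def force(i):
    f = kp * (plate - x[i])
    if i > 0:
        f += kc * (x[i - 1] - x[i])
    if i < N - 1:
        f += kc * (x[i + 1] - x[i])
    return f

def stiffness(i):
    return kp + kc * ((i > 0) + (i < N - 1))

events = []
for step in range(20000):
    plate += v_dt                          # slow tectonic loading
    size, slipped = 0, True
    while slipped:                         # cascade: one slip can trigger others
        slipped = False
        for i in range(N):
            f = force(i)
            if abs(f) > threshold:
                x[i] += f / stiffness(i)   # slip to the local zero-force point
                size += 1
                slipped = True
    if size:
        events.append(size)

print(len(events), "events; largest cascade:", max(events))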
Use of simple models to determine wake vortex categories for new aircraft.
DOT National Transportation Integrated Search
2015-06-22
The paper describes how to use simple models and, if needed, sensitivity analyses to determine the wake vortex categories for new aircraft. The methodology provides a tool for the regulators to assess the relative risk of introducing new aircraft int...
VARED: Verification and Analysis of Requirements and Early Designs
NASA Technical Reports Server (NTRS)
Badger, Julia; Throop, David; Claunch, Charles
2014-01-01
Requirements are a part of every project life cycle; everything going forward in a project depends on them. Good requirements are hard to write, there are few useful tools to test, verify, or check them, and it is difficult to properly marry them to the subsequent design, especially if the requirements are written in natural language. In fact, the inconsistencies and errors in requirements, along with the difficulty of finding these errors, contribute greatly to the cost of the testing and verification stage of flight software projects [1]. Large projects tend to have several thousand requirements written at various levels by different groups of people. The design process is distributed, and a lack of widely accepted standards for requirements often results in a product that varies widely in style and quality. A simple way to improve this would be to standardize the design process using a set of tools and widely accepted requirements design constraints. The difficulty with this approach is finding the appropriate constraints and tools. Common complaints about available tools concern ease of use, functionality, and feature sets. Also, although this would be preferable, these tools are rarely capable of testing the quality of the requirements themselves.
Plasma Diagnostics: Use and Justification in an Industrial Environment
NASA Astrophysics Data System (ADS)
Loewenhardt, Peter
1998-10-01
The usefulness and importance of plasma diagnostics have played a major role in the development of plasma processing tools in the semiconductor industry. As can be seen through marketing materials from semiconductor equipment manufacturers, results from plasma diagnostic equipment can be a powerful tool in selling the technological leadership of tool design. Some diagnostics have long been used for simple process control such as optical emission for endpoint determination, but in recent years more sophisticated and involved diagnostic tools have been utilized in chamber and plasma source development and optimization. It is now common to find an assortment of tools at semiconductor equipment companies such as Langmuir probes, mass spectrometers, spatial optical emission probes, impedance, ion energy and ion flux probes. An outline of how the importance of plasma diagnostics has grown at an equipment manufacturer over the last decade will be given, with examples of significant and useful results obtained. Examples will include the development and optimization of an inductive plasma source, trends and hardware effects on ion energy distributions, mass spectrometry influences on process development and investigations of plasma-wall interactions. Plasma diagnostic focus, in-house development and proliferation in an environment where financial justification requirements are both strong and necessary will be discussed.
The role of optimization in the next generation of computer-based design tools
NASA Technical Reports Server (NTRS)
Rogan, J. Edward
1989-01-01
There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.
Therapeutic evaluation of GM2 gangliosidoses by ELISA using anti-GM2 ganglioside antibodies.
Tsuji, Daisuke; Higashine, Yukari; Matsuoka, Kazuhiko; Sakuraba, Hitoshi; Itoh, Kohji
2007-03-01
GM2 gangliosidoses, including Tay-Sachs disease, Sandhoff disease and the AB variant, comprise deficiencies of beta-hexosaminidase isozymes and GM2 ganglioside activator protein associated with accumulation of GM2 ganglioside (GM2) in lysosomes and neurosomatic clinical manifestations. A simple assay system for intracellular quantification of GM2 is required to evaluate therapeutic effects on the GM2 gangliosidoses. We newly established a cell-ELISA system involving anti-GM2 monoclonal antibodies for measuring GM2 storage in fibroblasts from Tay-Sachs and Sandhoff disease patients. We succeeded in detecting the corrective effect of enzyme replacement on elimination of GM2 in the cells with this ELISA system. This simple and sensitive system should be useful as an additional diagnostic tool as well as for therapeutic evaluation of the GM2 gangliosidoses.
A simple method to calculate first-passage time densities with arbitrary initial conditions
NASA Astrophysics Data System (ADS)
Nyberg, Markus; Ambjörnsson, Tobias; Lizana, Ludvig
2016-06-01
Numerous applications, all the way from biology and physics to economics, depend on the density of first crossings over a boundary. Motivated by the lack of general-purpose analytical tools for computing first-passage time densities (FPTDs) for complex problems, we propose a new simple method based on the independent interval approximation (IIA). We generalise previous formulations of the IIA to include arbitrary initial conditions as well as to deal with discrete-time and non-smooth continuous-time processes. We derive a closed-form expression for the FPTD in z-transform and Laplace-transform space for a boundary in one dimension. Two classes of problems are analysed in detail: discrete-time symmetric random walks (Markovian) and continuous-time Gaussian stationary processes (Markovian and non-Markovian). Our results are in good agreement with Langevin dynamics simulations.
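For readers who want a quick numerical cross-check of an FPTD, a brute-force Monte Carlo estimate for the discrete-time symmetric random walk case takes only a few lines; this is a simulation sketch, not the IIA method of the paper.

import random
from collections import Counter

def fptd_random_walk(x0, boundary, n_walks=100000, t_max=200):
    """Monte Carlo first-passage time density for a symmetric random walk.
    Walks not absorbed by t_max are simply not counted."""
    counts = Counter()
    for _ in range(n_walks):
        x = x0
        for t in range(1, t_max + 1):
            x += random.choice((-1, 1))
            if x >= boundary:
                counts[t] += 1
                break
    return {t: counts[t] / n_walks for t in sorted(counts)}

density = fptd_random_walk(x0=0, boundary=3)
print({t: round(p, 4) for t, p in list(density.items())[:5]})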
Van Allen Probes Science Gateway and Space Weather Data Processing
NASA Astrophysics Data System (ADS)
Romeo, G.; Barnes, R. J.; Weiss, M.; Fox, N. J.; Mauk, B.; Potter, M.; Kessel, R.
2014-12-01
The Van Allen Probes Science Gateway acts as a centralized interface to the instrument Science Operation Centers (SOCs), provides mission planning tools, and hosts a number of science-related activities such as the mission bibliography. Most importantly, the Gateway acts as the primary site for processing and delivering the VAP Space Weather data to users. Over the past year, the web site has been completely redesigned with a focus on easier navigation and improvements to existing tools such as the orbit plotter, position calculator and magnetic footprint tool. In addition, a new data plotting facility, based on HTML5, has been added, which allows users to interactively plot Van Allen Probes summary and space weather data. The user can tailor the tool to display exactly the plot they wish to see and then share it with other users via either a URL or a QR code. Various types of plots can be created, including simple time series, data plotted as a function of orbital location, and time versus L-shell. We discuss the new Van Allen Probes Science Gateway and the Space Weather Data Pipeline.
Rocker: Open source, easy-to-use tool for AUC and enrichment calculations and ROC visualization.
Lätti, Sakari; Niinivehmas, Sanna; Pentikäinen, Olli T
2016-01-01
The receiver operating characteristic (ROC) curve, together with the calculation of the area under the curve (AUC), is a useful tool for evaluating performance on biomedical and chemoinformatics data. For example, in virtual drug screening ROC curves are very often used to visualize how efficiently an application separates active ligands from inactive molecules. Unfortunately, most of the available tools for ROC analysis are implemented in commercial software packages, or are plugins in statistical software, which are not always the easiest to use. Here, we present Rocker, a simple ROC curve visualization tool that can be used to generate publication-quality images. Rocker also includes automatic calculation of the AUC for the ROC curve and of the Boltzmann-enhanced discrimination of ROC (BEDROC). Furthermore, in virtual screening campaigns it is often important to understand the early enrichment of active ligand identification; for this, Rocker offers an automated calculation routine. To enable further development of Rocker, it is freely available (MIT-GPL license) for use and modification from our web site (http://www.jyu.fi/rocker).
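The core AUC computation that tools like Rocker automate reduces to a ranking statistic: the AUC equals the probability that a randomly chosen active scores above a randomly chosen inactive. A minimal version with made-up scores is sketched below.

def roc_auc(scores_active, scores_inactive):
    """AUC via the rank-sum (Mann-Whitney) formulation: the probability
    that a randomly chosen active outranks a randomly chosen inactive."""
    wins = 0.0
    for a in scores_active:
        for b in scores_inactive:
            wins += 1.0 if a > b else (0.5 if a == b else 0.0)
    return wins / (len(scores_active) * len(scores_inactive))

# Hypothetical docking scores: actives should score higher than inactives
print(roc_auc([0.9, 0.8, 0.6], [0.7, 0.4, 0.3, 0.2]))   # -> 0.9166...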
An empirical model for estimating annual consumption by freshwater fish populations
Liao, H.; Pierce, C.L.; Larscheid, J.G.
2005-01-01
Population consumption is an important process linking predator populations to their prey resources. Simple tools are needed to enable fisheries managers to estimate population consumption. We assembled 74 individual estimates of annual consumption by freshwater fish populations and their mean annual population size, 41 of which also included estimates of mean annual biomass. The data set included 14 freshwater fish species from 10 different bodies of water. From this data set we developed two simple linear regression models predicting annual population consumption. Log-transformed population size explained 94% of the variation in log-transformed annual population consumption. Log-transformed biomass explained 98% of the variation in log-transformed annual population consumption. We quantified the accuracy of our regressions and three alternative consumption models as the mean percent difference from observed (bioenergetics-derived) estimates in a test data set. Predictions from our population-size regression matched observed consumption estimates poorly (mean percent difference = 222%). Predictions from our biomass regression matched observed consumption reasonably well (mean percent difference = 24%). The biomass regression was superior to an alternative model of similar complexity, and comparable to two alternative models that were more complex and difficult to apply. Our biomass regression model, log10(consumption) = 0.5442 + 0.9962 x log10(biomass), will be a useful tool for fishery managers, enabling them to make reasonably accurate annual population consumption predictions from mean annual biomass estimates. Copyright by the American Fisheries Society 2005.
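Applying the reported biomass regression is a one-line computation; the sketch below back-transforms from log10 space (the input biomass value is arbitrary, and units follow the original paper).

import math

def annual_consumption(biomass):
    """Biomass regression from the abstract:
    log10(consumption) = 0.5442 + 0.9962 * log10(biomass).
    Units follow the original paper; the input value here is arbitrary."""
    return 10 ** (0.5442 + 0.9962 * math.log10(biomass))

print(round(annual_consumption(100.0), 1))   # consumption for biomass = 100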
Escape Excel: A tool for preventing gene symbol and accession conversion errors.
Welsh, Eric A; Stewart, Paul A; Kuenzi, Brent M; Eschrich, James A
2017-01-01
Microsoft Excel automatically converts certain gene symbols, database accessions, and other alphanumeric text into dates, scientific notation, and other numerical representations. These conversions lead to subsequent, irreversible corruption of the imported text. A recent survey of popular genomic literature estimates that one-fifth of all papers with supplementary gene lists suffer from this issue. Here, we present an open-source tool, Escape Excel, which prevents these erroneous conversions by generating an escaped text file that can be safely imported into Excel. Escape Excel is implemented in a variety of formats (http://www.github.com/pstew/escape_excel), including a command line based Perl script, a Windows-only Excel Add-In, an OS X drag-and-drop application, a simple web-server, and as a Galaxy web environment interface. Test server implementations are accessible as a Galaxy interface (http://apostl.moffitt.org) and simple non-Galaxy web server (http://apostl.moffitt.org:8000/). Escape Excel detects and escapes a wide variety of problematic text strings so that they are not erroneously converted into other representations upon importation into Excel. Examples of problematic strings include date-like strings, time-like strings, leading zeroes in front of numbers, and long numeric and alphanumeric identifiers that should not be automatically converted into scientific notation. It is hoped that greater awareness of these potential data corruption issues, together with diligent escaping of text files prior to importation into Excel, will help to reduce the amount of Excel-corrupted data in scientific analyses and publications.
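The underlying idea is easy to reproduce: fields that look like dates, scientific notation, or zero-padded numbers are wrapped so that Excel imports them as literal text. The sketch below illustrates the general approach with a small heuristic pattern list; it is not the actual Escape Excel code, whose rules are more thorough.

import re

# Heuristic sketch of Excel-safe escaping (illustrates the general approach;
# the real Escape Excel tool applies its own, more thorough rule set).
MONTHS = "JAN|FEB|MAR|MARCH|APR|MAY|JUN|JUL|AUG|SEP|SEPT|OCT|NOV|DEC"
RISKY = re.compile(
    r"^\d+[.]?\d*[eE]\d+$"             # scientific-notation lookalikes, e.g. 2E5
    r"|^0\d+$"                         # leading zeroes, e.g. 007
    r"|^\d{12,}$"                      # long numeric identifiers
    r"|^(%s)[-/ ]?\d{1,2}$" % MONTHS,  # date lookalikes, e.g. SEPT9, MARCH1
    re.IGNORECASE,
)

def escape_field(field):
    """Wrap risky fields so Excel imports them as literal text."""
    return '="%s"' % field if RISKY.match(field) else field

row = ["SEPT9", "2E5", "007", "12345678901234", "ordinary text"]
print("\t".join(escape_field(f) for f in row))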
Software reuse in spacecraft planning and scheduling systems
NASA Technical Reports Server (NTRS)
Mclean, David; Tuchman, Alan; Broseghini, Todd; Yen, Wen; Page, Brenda; Johnson, Jay; Bogovich, Lynn; Burkhardt, Chris; Mcintyre, James; Klein, Scott
1993-01-01
The use of a software toolkit and development methodology that supports software reuse is described. The toolkit includes source-code-level library modules and stand-alone tools which support such tasks as data reformatting and report generation, simple relational database applications, user interfaces, tactical planning, strategic planning and documentation. The current toolkit is written in C and supports applications that run on IBM PCs under DOS and on UNIX-based workstations under OpenLook and Motif. The toolkit is fully integrated for building scheduling systems that reuse AI knowledge base technology. A typical scheduling scenario and three examples of applications that utilize the reuse toolkit are briefly described. In addition to the tools themselves, a description of the software evolution and reuse methodology that was used is presented.
A programming language for composable DNA circuits
Phillips, Andrew; Cardelli, Luca
2009-01-01
Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing. PMID:19535415
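The kind of chemistry such a language compiles to can be mimicked numerically. The sketch below integrates mass-action kinetics for a single toy displacement step X + G -> Y + W; the rate constant and concentrations are arbitrary illustrative values, not parameters from the paper.

# Toy mass-action model of one strand-displacement step X + G -> Y + W
# (arbitrary rate and concentrations; an illustration, not the paper's language).
k = 1e4                        # bimolecular rate constant, /M/s
x, g, y = 10e-9, 20e-9, 0.0    # initial concentrations, M
dt, t_end = 0.01, 600.0

t = 0.0
while t < t_end:
    rate = k * x * g           # mass-action flux of the displacement step
    x -= rate * dt
    g -= rate * dt
    y += rate * dt
    t += dt

print(f"after {t_end:.0f} s: {y / 10e-9:.1%} of input X converted to output Y")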
Technology: Presentations in the Cloud with a Twist
ERIC Educational Resources Information Center
Siegle, Del
2011-01-01
Technology tools have come a long way from early word processing applications and opportunities for students to engage in simple programming. Many tools now exist for students to develop and share products in a variety of formats and for a wide range of audiences. PowerPoint is probably the most ubiquitously used tool for student projects. In…
Scratch as a Computational Modelling Tool for Teaching Physics
ERIC Educational Resources Information Center
Lopez, Victor; Hernandez, Maria Isabel
2015-01-01
The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…
O'Connor, Annette M; Tsafnat, Guy; Gilbert, Stephen B; Thayer, Kristina A; Wolfe, Mary S
2018-01-09
The second meeting of the International Collaboration for Automation of Systematic Reviews (ICASR) was held 3-4 October 2016 in Philadelphia, Pennsylvania, USA. ICASR is an interdisciplinary group whose aim is to maximize the use of technology for conducting rapid, accurate, and efficient systematic reviews of scientific evidence. Having automated tools for systematic review should enable more transparent and timely reviews, maximizing the potential for identifying and translating research findings to practical application. The meeting brought together multiple stakeholder groups, including users of summarized research, methodologists who explore production processes and systematic review quality, and technologists such as software developers, statisticians, and vendors. This diversity of participants was intended to ensure effective communication with numerous stakeholders about progress toward automation of systematic reviews and to stimulate discussion about potential solutions to identified challenges. The meeting highlighted challenges, both simple and complex, and raised awareness among participants about ongoing efforts by various stakeholders. An outcome of this forum was the identification of several short-term projects that participants felt would advance the automation of tasks in the systematic review workflow, including (1) fostering better understanding of available tools, (2) developing validated datasets for testing new tools, (3) determining a standard method to facilitate interoperability of tools, such as through an application programming interface (API), and (4) establishing criteria to evaluate the quality of tools' output. ICASR 2016 provided a beneficial forum for focused discussion about tool development and resources, and reconfirmed ICASR members' commitment to the automation of systematic reviews.
Rubin, Katrine Hass; Friis-Holmberg, Teresa; Hermann, Anne Pernille; Abrahamsen, Bo; Brixen, Kim
2013-08-01
A huge number of risk assessment tools has been developed, but far from all have been validated in external studies, many lack methodological rigor and transparent evidence, and few are integrated into national guidelines. We therefore performed a systematic review to provide an overview of existing valid and reliable risk assessment tools for prediction of osteoporotic fractures. Additionally, we aimed to determine whether the performance of each tool was sufficient for practical use, and finally, to examine whether the complexity of the tools influenced their discriminative power. We searched the PubMed, Embase, and Cochrane databases for papers and evaluated these with respect to methodological quality using the Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS) checklist. A total of 48 tools were identified; 20 had been externally validated; however, only six tools had been tested more than once in a population-based setting with acceptable methodological quality. None of the tools performed consistently better than the others, and simple tools (i.e., the Osteoporosis Self-assessment Tool [OST], Osteoporosis Risk Assessment Instrument [ORAI], and Garvan Fracture Risk Calculator [Garvan]) often did as well or better than more complex tools (i.e., Simple Calculated Risk Estimation Score [SCORE], WHO Fracture Risk Assessment Tool [FRAX], and Qfracture). No studies determined the effectiveness of tools in selecting patients for therapy and thus improving fracture outcomes. High-quality studies in randomized designs with population-based cohorts with different case mixes are needed. Copyright © 2013 American Society for Bone and Mineral Research.
Several steps/day indicators predict changes in anthropometric outcomes: HUB city steps
USDA-ARS?s Scientific Manuscript database
Walking for exercise remains the most frequently reported leisure-time activity, likely because it is simple, inexpensive, and easily incorporated into most people’s lifestyle. Pedometers are simple, convenient, and economical tools that can be used to quantify step-determined physical activity. F...
Predicting Fish Densities in Lotic Systems: a Simple Modeling Approach
Fish density models are essential tools for fish ecologists and fisheries managers. However, applying these models can be difficult because of high levels of model complexity and the large number of parameters that must be estimated. We designed a simple fish density model and te...
A Progression of Static Equilibrium Laboratory Exercises
ERIC Educational Resources Information Center
Kutzner, Mickey; Kutzner, Andrew
2013-01-01
Although simple architectural structures like bridges, catwalks, cantilevers, and Stonehenge have been integral in human societies for millennia, as have levers and other simple tools, modern students of introductory physics continue to grapple with Newton's conditions for static equilibrium. As formulated in typical introductory physics…
DABAM: an open-source database of X-ray mirrors metrology
Sanchez del Rio, Manuel; Bianchi, Davide; Cocco, Daniele; Glass, Mark; Idir, Mourad; Metz, Jim; Raimondi, Lorenzo; Rebuffi, Luca; Reininger, Ruben; Shi, Xianbo; Siewert, Frank; Spielmann-Jaeggi, Sibylle; Takacs, Peter; Tomasset, Muriel; Tonnessen, Tom; Vivo, Amparo; Yashchuk, Valeriy
2016-01-01
An open-source database containing metrology data for X-ray mirrors is presented. It makes available metrology data (mirror heights and slopes profiles) that can be used with simulation tools for calculating the effects of optical surface errors in the performances of an optical instrument, such as a synchrotron beamline. A typical case is the degradation of the intensity profile at the focal position in a beamline due to mirror surface errors. This database for metrology (DABAM) aims to provide to the users of simulation tools the data of real mirrors. The data included in the database are described in this paper, with details of how the mirror parameters are stored. An accompanying software is provided to allow simple access and processing of these data, calculate the most usual statistical parameters, and also include the option of creating input files for most used simulation codes. Some optics simulations are presented and discussed to illustrate the real use of the profiles from the database. PMID:27140145
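The most common statistics derived from such profiles, the RMS height and slope errors, follow from simple numerical differentiation; the sketch below uses a synthetic sinusoidal profile standing in for a real DABAM entry.

import numpy as np

# RMS height and slope error from a mirror height profile (synthetic data
# standing in for a DABAM entry, which stores abscissa and heights in metres).
x = np.linspace(0.0, 0.3, 601)                    # 30 cm mirror, 0.5 mm step
heights = 2e-9 * np.sin(2 * np.pi * x / 0.1)      # 2 nm sinusoidal figure error

slopes = np.gradient(heights, x)                  # numerical differentiation
print(f"RMS height error: {heights.std():.2e} m")
print(f"RMS slope error:  {slopes.std():.2e} rad")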
Repair of major system elements on Skylab
NASA Technical Reports Server (NTRS)
Pace, R. E., Jr.
1974-01-01
In-flight maintenance, as conceived and preplanned for the Skylab mission, was limited to simple scheduled and unscheduled replacement tasks and minor contingency repairs. Tools and spares were provided accordingly. However, failures during the mission dictated complicated and sophisticated repairs to major systems so that the mission could continue. These repairs included the release of a large structure that had failed to deploy, the assembly and deployment of large mechanical devices, the installation and checkout of precision electronic equipment, troubleshooting and repair of precision electromechanical equipment, and tapping into and recharging a cooling system. The repairs were conducted both inside the spacecraft and during extravehicular activities. Some of the repair tasks required team effort on the part of the crewmen, including close procedural coordination between internal and extravehicular crewmen. The Skylab experience indicates that crewmen can, with adequate training, make major system repairs in space using standard or special tools. Design of future spacecraft systems should acknowledge this capability and provide for more extensive in-flight repair and maintenance.
Simple tool for planting acorns
William R. Beaufait
1957-01-01
A handy, inexpensive tool for planting acorns has been developed at the Delta Research Center of the Southern Forest Experiment Station and used successfully in experimental plantings. One of its merits is that it ensures a planting hole of exactly the desired depth.
Making Temporal Logic Calculational: A Tool for Unification and Discovery
NASA Astrophysics Data System (ADS)
Boute, Raymond
In temporal logic, calculational proofs beyond simple cases are often seen as challenging. The situation is reversed by making temporal logic calculational, yielding shorter and clearer proofs than traditional ones, and serving as a (mental) tool for unification and discovery. A side effect of unifying theories is easier access for practitioners. The starting point is a simple generic (software-tool-independent) Functional Temporal Calculus (FTC). Specific temporal logics are then captured via endosemantic functions. This concept reflects tacit conventions throughout mathematics and, once identified, is general and useful. FTC also yields a reasoning style that helps in discovering theorems by calculation rather than just proving given facts. This is illustrated by deriving various theorems, most related to liveness issues in TLA+, and finding strengthenings of known results. Educational issues are addressed in passing.
A Simple Framework for Evaluating Authorial Contributions for Scientific Publications.
Warrender, Jeffrey M
2016-10-01
A simple tool is provided to assist researchers in assessing contributions to a scientific publication, for ease in evaluating which contributors qualify for authorship, and in what order the authors should be listed. The tool identifies four phases of activity leading to a publication: Conception and Design, Data Acquisition, Analysis and Interpretation, and Manuscript Preparation. By comparing a project participant's contribution in a given phase to several specified thresholds, a score of up to five points can be assigned; the contributor's scores in all four phases are summed to yield a total "contribution score", which is compared to a threshold to determine which contributors merit authorship. This tool may be useful in a variety of contexts in which a systematic approach to authorial credit is desired.
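The scoring scheme lends itself to a few lines of code. The sketch below assumes the four per-phase scores have already been assigned on the 0-5 scale described above; the authorship threshold used here is a placeholder, since the abstract does not state the actual cutoff.

PHASES = ("conception_design", "data_acquisition",
          "analysis_interpretation", "manuscript_preparation")
AUTHORSHIP_THRESHOLD = 6   # placeholder; the paper defines the actual cutoff

def contribution_score(scores):
    """Sum the four per-phase scores (each 0-5) into a contribution score."""
    assert all(0 <= scores[p] <= 5 for p in PHASES)
    return sum(scores[p] for p in PHASES)

contributor = {"conception_design": 4, "data_acquisition": 1,
               "analysis_interpretation": 3, "manuscript_preparation": 2}
total = contribution_score(contributor)
print(total, "->", "author" if total >= AUTHORSHIP_THRESHOLD else "acknowledge")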
Ghali, Iraqi; Kizub, Darya; Billioux, Alexander C.; Bennani, Kenza; Bourkadi, Jamal Eddine; Benmamoun, Abderrahmane; Lahlou, Ouafae; Aouad, Rajae El; Dooley, Kelly E.
2014-01-01
Setting: Public tuberculosis (TB) clinics in urban Morocco. Objective: Explore risk factors for TB treatment default and develop a prediction tool. Assess consequences of default, specifically risk for transmission or development of drug resistance. Design: Case-control study comparing patients who defaulted from TB treatment and patients who completed it using quantitative methods and open-ended questions. Results were interpreted in light of health professionals’ perspectives from a parallel study. A predictive model and simple tool to identify patients at high risk of default were developed. Sputum from cases with pulmonary TB was collected for smear and drug susceptibility testing. Results: 91 cases and 186 controls enrolled. Independent risk factors for default included current smoking, retreatment, work interference with adherence, daily directly observed therapy, side effects, quick symptom resolution, and not knowing one’s treatment duration. Age >50 years, never smoking, and having friends who knew one’s diagnosis were protective. A simple scoring tool incorporating these factors was 82.4% sensitive and 87.6% specific for predicting default in this population. Clinicians and patients described additional contributors to default and suggested locally-relevant intervention targets. Among 89 cases with pulmonary TB, 71% had sputum that was smear positive for TB. Drug resistance was rare. Conclusion: The causes of default from TB treatment were explored through synthesis of qualitative and quantitative data from patients and health professionals. A scoring tool with high sensitivity and specificity to predict default was developed. Prospective evaluation of this tool coupled with targeted interventions based on our findings is warranted. Of note, the risk of TB transmission from patients who default treatment to others is likely to be high. The commonly-feared risk of drug resistance, though, may be low; a larger study is required to confirm these findings. PMID:24699682
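A scoring tool of the kind described can be sketched as a simple additive rule; the point values and cutoff below are placeholders for illustration, since the abstract lists the risk and protective factors but not the published weights.

# Illustrative additive default-risk score (placeholder weights; the study's
# actual point values belong to the published tool, not this sketch).
RISK_POINTS = {"current_smoker": 1, "retreatment": 1, "work_interference": 1,
               "daily_dot": 1, "side_effects": 1, "quick_symptom_resolution": 1,
               "unknown_treatment_duration": 1}
PROTECTIVE_POINTS = {"age_over_50": -1, "never_smoked": -1,
                     "friends_know_diagnosis": -1}
CUTOFF = 2   # placeholder decision threshold

def default_risk(patient):
    points = {**RISK_POINTS, **PROTECTIVE_POINTS}
    return sum(v for k, v in points.items() if patient.get(k))

patient = {"current_smoker": True, "retreatment": True,
           "daily_dot": True, "age_over_50": True}
score = default_risk(patient)
print(score, "-> high risk" if score >= CUTOFF else "-> lower risk")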
NASA Astrophysics Data System (ADS)
Gwiazda, A.; Banas, W.; Sekala, A.; Foit, K.; Hryniewicz, P.; Kost, G.
2015-11-01
The process of designing a workcell is constrained by various design requirements. These relate to the technological parameters of the manufactured element, to the specifications of purchased workcell components, and to the technical characteristics of the workcell scene, which shows the complexity of the design-construction process itself. The result of such an approach is an individually designed workcell suited to a specific location and a specific production cycle; if these parameters change, the whole workcell configuration must be rebuilt. Taking this into consideration, it is important to elaborate a base of typical elements of a robot kinematic chain that can be used as a tool for building such workcells. Virtual modelling of the kinematic chains of industrial robots requires several preparatory phases. First, a database of elements must be created containing models of industrial robot arms. These models can be described as functional primitives that represent the components of kinematic pairs and the structural members of industrial robots. A database with the following elements is created: a base of kinematic pairs, a base of robot structural elements, and a base of robot work scenes. The first of these includes the kinematic pairs that are the key components of the manipulator actuator modules; as mentioned previously, these are rotary pairs of the fifth class, chosen because they occur most frequently in the structures of industrial robots. The second base consists of robot structural elements and therefore allows schematic structures of kinematic chains to be converted into the structural elements of industrial robot arms. It contains, inter alia, structural elements such as the base and stiff members (simple or angular units), which allow recorded schematics to be converted into three-dimensional elements. The last database is a database of scenes. It includes both simple and complex elements: simple models of technological equipment, conveyor models, models of obstacles, and the like. Using these elements, various production spaces (robotized workcells) can be formed, in which the operation of a modelled industrial robot arm can be tracked virtually.
Seed: a user-friendly tool for exploring and visualizing microbial community data.
Beck, Daniel; Dennis, Christopher; Foster, James A
2015-02-15
In this article we present Simple Exploration of Ecological Data (Seed), a data exploration tool for microbial communities. Seed is written in R using the Shiny library. This provides access to powerful R-based functions and libraries through a simple user interface. Seed allows users to explore ecological datasets using principal coordinate analyses, scatter plots, bar plots, hierarchical clustering and heatmaps. Seed is open source and available at https://github.com/danlbek/Seed. danlbek@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony
2018-01-01
This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U.S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, a simple climate matching tool (CLIMEX Match Climates), the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and were then evaluated using the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity effects was compared using a generalized linear mixed model. The choice of modeling tool itself had low statistical significance, while weed species alone accounted for 69.1% and 48.5% of the variance in prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones for predicting the potential distribution of a weed not yet present in the United States. Considerations of model accuracy should also be balanced against those of reproducibility and ease of use. More important than the choice of modeling tool is the construction of robust protocols and the testing of both new and experienced users under blind conditions that approximate operational conditions.
Simple proteomics data analysis in the object-oriented PowerShell.
Mohammed, Yassene; Palmblad, Magnus
2013-01-01
Scripting languages such as Perl and Python are appreciated for solving simple, everyday tasks in bioinformatics. A more recent, object-oriented command shell and scripting language, Windows PowerShell, has many attractive features: an object-oriented interactive command line, fluent navigation and manipulation of XML files, ability to consume Web services from the command line, consistent syntax and grammar, rich regular expressions, and advanced output formatting. The key difference between classical command shells and scripting languages, such as bash, and object-oriented ones, such as PowerShell, is that in the latter the result of a command is a structured object with inherited properties and methods rather than a simple stream of characters. Conveniently, PowerShell is included in all new releases of Microsoft Windows and therefore already installed on most computers in classrooms and teaching labs. In this chapter we demonstrate how PowerShell in particular allows easy interaction with mass spectrometry data in XML formats, connection to Web services for tools such as BLAST, and presentation of results as formatted text or graphics. These features make PowerShell much more than "yet another scripting language."
A simple model of hysteresis behavior using spreadsheet analysis
NASA Astrophysics Data System (ADS)
Ehrmann, A.; Blachowicz, T.
2015-01-01
Hysteresis loops occur in many scientific and technical problems, especially as the field-dependent magnetization of ferromagnetic materials, but also as stress-strain curves of materials measured by tensile tests including thermal effects, in liquid-solid phase transitions, in cell biology, and in economics. While several mathematical models exist which aim to calculate hysteresis energies and other parameters, here we offer a simple model for a general hysteretic system, showing different hysteresis loops depending on the defined parameters. The calculation, which is based on basic spreadsheet analysis plus simple macro code, can be used by students to understand how these systems work and how the parameters influence the system's response to an external field. Importantly, in the step-by-step mode, each change of the system state relative to the previous step becomes visible. The simple program can be developed further through several changes and additions, enabling the construction of a tool capable of answering real physical questions in the broad field of magnetism as well as in other scientific areas in which similar hysteresis loops occur.
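The step-by-step logic the authors implement in a spreadsheet can equally be written as code. The sketch below uses a bistable relay-style update rule with arbitrary switching thresholds; it is a generic analogue, not the authors' macro.

# Minimal relay-style hysteresis model (arbitrary switching thresholds); a code
# analogue of the step-by-step spreadsheet calculation, not the authors' macro.
H_UP, H_DOWN = 0.4, -0.4     # switching fields
state = -1.0                 # initial state (e.g. magnetization "down")

def update(state, h):
    """Return the new system state for applied field h."""
    if h >= H_UP:
        return 1.0
    if h <= H_DOWN:
        return -1.0
    return state             # between the thresholds the history is remembered

# Sweep the field up from -1 to +1 and back down again
fields = [i / 10 for i in range(-10, 11)] + [i / 10 for i in range(10, -11, -1)]
loop = []
for h in fields:
    state = update(state, h)
    loop.append((h, state))

print(loop[10], loop[31])    # h = 0 on the up and the down sweep: opposite states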
[Screening for malnutrition among hospitalized patients in a Colombian University Hospital].
Cruz, Viviana; Bernal, Laura; Buitrago, Giancarlo; Ruiz, Álvaro J
2017-04-01
On admission, 30 to 50% of hospitalized patients have some degree of malnutrition, which is associated with longer length of stay, higher rates of complications, mortality, and greater costs. To determine the frequency of screening for risk of malnutrition in medical records and assess the usefulness of the Malnutrition Screening Tool (MST). In a cross-sectional study, we searched medical records for evidence of malnutrition screening, and we applied the MST to patients hospitalized in the Internal Medicine Wards of San Ignacio University Hospital. Of the 295 patients included, none had been screened for malnutrition since hospital admission. Sixty-one percent were at nutritional risk, with a higher prevalence among patients with HIV (85.7%), cancer (77.5%), and pneumonia. A positive MST result was associated with a 3.2-day increase in length of hospital stay (p = 0.024). The prevalence of malnutrition risk in hospitalized patients is high, but screening for it is inadequate and it is underdiagnosed. The MST is simple, fast, low-cost, and has good diagnostic performance.
AncestrySNPminer: A bioinformatics tool to retrieve and develop ancestry informative SNP panels
Amirisetty, Sushil; Khurana Hershey, Gurjit K.; Baye, Tesfaye M.
2012-01-01
A wealth of genomic information is available in public and private databases. However, this information is underutilized for uncovering population-specific and functionally relevant markers underlying complex human traits. Given the huge amount of SNP data available from the annotation of human genetic variation, data mining is a faster and more cost-effective approach for investigating the number of SNPs that are informative for ancestry. In this study, we present AncestrySNPminer, the first web-based bioinformatics tool specifically designed to retrieve Ancestry Informative Markers (AIMs) from genomic data sets and link these informative markers to genes and ontological annotation classes. The tool includes an automated and simple “scripting at the click of a button” functionality that enables researchers to perform various population genomics statistical analysis methods, with user-friendly querying and filtering of data sets across various populations, through a single web interface. AncestrySNPminer can be freely accessed at https://research.cchmc.org/mershalab/AncestrySNPminer/login.php. PMID:22584067
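One of the simplest informativeness statistics by which such a tool can rank SNPs is the absolute allele-frequency difference (delta) between two populations; a minimal version with made-up frequencies is sketched below.

def delta(freqs_pop1, freqs_pop2):
    """Absolute allele-frequency difference per SNP between two populations."""
    return {snp: abs(freqs_pop1[snp] - freqs_pop2[snp]) for snp in freqs_pop1}

# Hypothetical reference-allele frequencies for three SNPs
pop1 = {"rs0001": 0.90, "rs0002": 0.50, "rs0003": 0.15}
pop2 = {"rs0001": 0.20, "rs0002": 0.45, "rs0003": 0.80}
aims = sorted(delta(pop1, pop2).items(), key=lambda kv: kv[1], reverse=True)
print(aims)   # SNPs with the largest delta are the best ancestry markers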
Proposal of a micromagnetic standard problem for ferromagnetic resonance simulations
NASA Astrophysics Data System (ADS)
Baker, Alexander; Beg, Marijan; Ashton, Gregory; Albert, Maximilian; Chernyshenko, Dmitri; Wang, Weiwei; Zhang, Shilei; Bisotti, Marc-Antonio; Franchin, Matteo; Hu, Chun Lian; Stamps, Robert; Hesjedal, Thorsten; Fangohr, Hans
2017-01-01
Nowadays, micromagnetic simulations are a common tool for studying a wide range of different magnetic phenomena, including the ferromagnetic resonance. A technique for evaluating reliability and validity of different micromagnetic simulation tools is the simulation of proposed standard problems. We propose a new standard problem by providing a detailed specification and analysis of a sufficiently simple problem. By analyzing the magnetization dynamics in a thin permalloy square sample, triggered by a well defined excitation, we obtain the ferromagnetic resonance spectrum and identify the resonance modes via Fourier transform. Simulations are performed using both finite difference and finite element numerical methods, with OOMMF and Nmag simulators, respectively. We report the effects of initial conditions and simulation parameters on the character of the observed resonance modes for this standard problem. We provide detailed instructions and code to assist in using the results for evaluation of new simulator tools, and to help with numerical calculation of ferromagnetic resonance spectra and modes in general.
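The post-processing step described, extracting the resonance spectrum from the simulated magnetization dynamics, amounts to a Fourier transform of m(t). A generic sketch follows, with a synthetic damped oscillation standing in for OOMMF or Nmag output.

import numpy as np

# Extracting an FMR-style spectrum from a magnetization time series via FFT.
# A synthetic damped oscillation stands in for simulator output.
dt = 5e-12                                  # sampling interval, s
t = np.arange(0, 20e-9, dt)
f0, tau = 8.4e9, 3e-9                       # arbitrary mode frequency and decay
m_y = np.exp(-t / tau) * np.sin(2 * np.pi * f0 * t)

spectrum = np.abs(np.fft.rfft(m_y)) ** 2    # power spectrum (arbitrary units)
freqs = np.fft.rfftfreq(len(m_y), d=dt)
peak = freqs[np.argmax(spectrum)]
print(f"resonance peak near {peak / 1e9:.1f} GHz")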
Ho, Daniel W H; Sze, Karen M F; Ng, Irene O L
2015-08-28
Viral integration into the human genome upon infection is an important risk factor for various human malignancies. We developed a viral integration site detection tool called Virus-Clip, which makes use of information extracted from soft-clipped sequencing reads to identify the exact positions of the human and virus breakpoints of integration events. With initial read alignment to the virus reference genome and streamlined procedures, Virus-Clip delivers a simple, fast and memory-efficient solution for viral integration site detection. Moreover, it can also automatically annotate integration events with the corresponding affected human genes. Virus-Clip has been verified using whole-transcriptome sequencing data, and its detection was validated to have satisfactory sensitivity and specificity. Markedly better performance was observed compared with existing tools. It is applicable to versatile types of data, including whole-genome sequencing, whole-transcriptome sequencing, and targeted sequencing. Virus-Clip is available at http://web.hku.hk/~dwhho/Virus-Clip.zip.
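The core signal Virus-Clip exploits can be illustrated with a tiny CIGAR parser: a soft-clipped alignment pins a putative breakpoint at the boundary between the clipped and aligned segments. The sketch below is schematic and is not the tool's actual code.

import re

CIGAR_RE = re.compile(r"(\d+)([MIDNSHP=X])")

def breakpoint_from_softclip(pos, cigar):
    """Infer a putative integration breakpoint from a soft-clipped alignment.
    `pos` is the 1-based leftmost aligned position on the virus genome."""
    ops = [(int(n), op) for n, op in CIGAR_RE.findall(cigar)]
    if ops and ops[0][1] == "S":            # clip at the read's 5' end
        return pos                          # junction just before the alignment
    if ops and ops[-1][1] == "S":           # clip at the read's 3' end
        ref_span = sum(n for n, op in ops if op in "MDN=X")
        return pos + ref_span               # junction just after the alignment
    return None                             # fully aligned: no junction here

print(breakpoint_from_softclip(2850, "60M40S"))   # -> 2910
print(breakpoint_from_softclip(1, "35S65M"))      # -> 1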
Extensional rheometry with a handheld mobile device
NASA Astrophysics Data System (ADS)
Marshall, Kristin A.; Liedtke, Aleesha M.; Todt, Anika H.; Walker, Travis W.
2017-06-01
The on-site characterization of complex fluids is important for a number of academic and industrial applications. Consequently, a need exists to develop portable rheometers that can provide in the field diagnostics and serve as tools for rapid quality assurance. With the advancement of smartphone technology and the widespread global ownership of smart devices, mobile applications are attractive as platforms for rheological characterization. The present work investigates the use of a smartphone device for the extensional characterization of a series of Boger fluids composed of glycerol/water and poly(ethylene oxide), taking advantage of the increasing high-speed video capabilities (currently up to 240 Hz capture rate at 720p) of smartphone cameras. We report a noticeable difference in the characterization of samples with slight variations in polymer concentration and discuss current device limitations. Potential benefits of a handheld extensional rheometer include its use as a point-of-care diagnostic tool, especially in developing communities, as well as a simple and inexpensive tool for assessing product quality in industry.
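For dilute elastic fluids such as the Boger fluids studied, the standard elasto-capillary thinning result gives a midpoint filament diameter decaying as D(t) ~ exp(-t/(3*lambda)), so an extensional relaxation time can be recovered by a log-linear fit of diameters tracked from the video frames. The sketch below uses synthetic data in place of tracked diameters.

import numpy as np

# Extract an extensional relaxation time from filament-diameter data using the
# elasto-capillary relation D(t) ~ exp(-t / (3*lambda)). Synthetic frames below
# stand in for diameters tracked from smartphone high-speed video.
fps = 240.0
lam_true = 0.020                                  # 20 ms, arbitrary
t = np.arange(0, 60) / fps
diam = 1.0e-3 * np.exp(-t / (3 * lam_true))       # metres

slope, _ = np.polyfit(t, np.log(diam), 1)         # log-linear fit
lam_fit = -1.0 / (3.0 * slope)
print(f"fitted relaxation time: {lam_fit * 1e3:.1f} ms")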
Swartz, Jordan; Koziatek, Christian; Theobald, Jason; Smith, Silas; Iturrate, Eduardo
2017-05-01
Testing for venous thromboembolism (VTE) is associated with cost and risk to patients (e.g. radiation). To assess the appropriateness of imaging utilization at the provider level, it is important to know that provider's diagnostic yield (percentage of tests positive for the diagnostic entity of interest). However, determining diagnostic yield typically requires either time-consuming, manual review of radiology reports or the use of complex and/or proprietary natural language processing software. The objectives of this study were twofold: 1) to develop and implement a simple, user-configurable, and open-source natural language processing tool to classify radiology reports with high accuracy and 2) to use the results of the tool to design a provider-specific VTE imaging dashboard, consisting of both utilization rate and diagnostic yield. Two physicians reviewed a training set of 400 lower extremity ultrasound (UTZ) and computed tomography pulmonary angiogram (CTPA) reports to understand the language used in VTE-positive and VTE-negative reports. The insights from this review informed the arguments to the five modifiable parameters of the NLP tool. A validation set of 2,000 studies was then independently classified by the reviewers and by the tool; the classifications were compared and the performance of the tool was calculated. The tool was highly accurate in classifying the presence and absence of VTE for both the UTZ (sensitivity 95.7%; 95% CI 91.5-99.8, specificity 100%; 95% CI 100-100) and CTPA reports (sensitivity 97.1%; 95% CI 94.3-99.9, specificity 98.6%; 95% CI 97.8-99.4). The diagnostic yield was then calculated at the individual provider level and the imaging dashboard was created. We have created a novel NLP tool designed for users without a background in computer programming, which has been used to classify venous thromboembolism reports with a high degree of accuracy. The tool is open-source and available for download at http://iturrate.com/simpleNLP. Results obtained using this tool can be applied to enhance quality by presenting information about utilization and yield to providers via an imaging dashboard. Copyright © 2017 Elsevier B.V. All rights reserved.
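The rule-based approach described can be approximated in a few lines: scan each report for target phrases and suppress matches preceded by a negation within a context window, with the phrase lists exposed as user-configurable parameters. The phrases below are invented; the actual tool ships its own configuration.

import re

# Schematic keyword/negation classifier for radiology reports (invented
# phrase lists; the real tool exposes similar user-configurable parameters).
POSITIVE = [r"deep venous thrombosis", r"pulmonary embol\w+", r"\bdvt\b"]
NEGATION = [r"\bno\b", r"negative for", r"without evidence of"]
WINDOW = 60   # characters of context checked for a preceding negation

def classify(report):
    text = report.lower()
    for pattern in POSITIVE:
        for m in re.finditer(pattern, text):
            context = text[max(0, m.start() - WINDOW):m.start()]
            if not any(re.search(n, context) for n in NEGATION):
                return "VTE-positive"
    return "VTE-negative"

print(classify("Findings: acute pulmonary embolism in the right lower lobe."))
print(classify("Impression: negative for deep venous thrombosis."))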
Simple Sensitivity Analysis for Orion Guidance Navigation and Control
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables, or pairs of variables, which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. Input variables such as moments, mass, thrust dispersions, and date of launch were found to be significant factors for the success of various requirements. Examples are shown in this paper as well as a summary and physics discussion of EFT-1 driving factors that the tool found.
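A minimal sketch of one of the simpler sensitivity measures described, estimating requirement success probability conditioned on binned values of a dispersed input; this is an illustrative stand-in, not the CFT implementation:

```python
import numpy as np

def success_prob_by_bin(x, success, n_bins=10):
    """Estimate P(success | x in bin) for one dispersed input variable.

    A variable whose conditional success probability varies strongly
    across bins is a candidate critical factor.
    """
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    probs = np.array([success[idx == b].mean() for b in range(n_bins)])
    return probs, probs.max() - probs.min()  # spread as a sensitivity score

# Toy Monte Carlo data: one standardized input and a noisy requirement.
rng = np.random.default_rng(0)
mass = rng.normal(0.0, 1.0, 5000)
success = ((mass + 0.3 * rng.normal(size=5000)) < 1.0).astype(float)
probs, score = success_prob_by_bin(mass, success)
print(probs.round(2), round(score, 2))
```

Pairwise interaction checks, as in the paper, would repeat this over a two-dimensional grid of bins for each variable pair.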
NASA Astrophysics Data System (ADS)
Swamy, Ashwin Balegar
This thesis involves the development of an interactive GIS (Geographic Information System) based application, which gives information about the ancient history of Egypt. The astonishing architecture, the strange burial rituals, and the civilization itself were some of the intriguing questions that motivated me towards developing this application. The application is a historical timeline starting from 3100 BC and leading up to 664 BC, focusing on the evolution of the Egyptian dynasties. The tool holds information regarding some of the famous monuments which were constructed during that era and also about the civilizations that co-existed. It also provides details about the religions followed by their kings and the languages spoken during those periods. The tool is developed using Java, a programming language, and MOJO (Map Objects Java Objects), a product of ESRI (Environmental Systems Research Institute), to create map objects that provide geographic information. Java Swing is used for designing the user interface. HTML (Hyper Text Markup Language) pages are created to provide the user with more information related to the historic period. CSS (Cascading Style Sheets) and JavaScript are used with HTML5 to achieve creative display of content. The tool is kept simple and easy for the user to interact with. The tool also includes pictures and videos for the user to get a feel of the historic period. The application is built to motivate people to know more about one of the prominent ancient civilizations of the Mediterranean world.
A Simple Mechanical Model for the Isotropic Harmonic Oscillator
ERIC Educational Resources Information Center
Nita, Gelu M.
2010-01-01
A constrained elastic pendulum is proposed as a simple mechanical model for the isotropic harmonic oscillator. The conceptual and mathematical simplicity of this model recommends it as an effective pedagogical tool in teaching basic physics concepts at advanced high school and introductory undergraduate course levels. (Contains 2 figures.)
The Simple Theory of Public Library Services.
ERIC Educational Resources Information Center
Newhouse, Joseph P.
A simple normative theory applicable to public library services was developed as a tool to aid libraries in answering the question: which books should be bought by the library? Although developed for normative purposes, the theory generates testable predictions. It is relevant to measuring benefits from services which are provided publicly because…
Safety in the Chemical Laboratory: Laboratory Air Quality: Part I. A Concentration Model.
ERIC Educational Resources Information Center
Butcher, Samuel S.; And Others
1985-01-01
Offers a simple model for estimating vapor concentrations in instructional laboratories. Three methods are described for measuring ventilation rates, and the results of measurements in six laboratories are presented. The model should provide a simple screening tool for evaluating worst-case personal exposures. (JN)
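The abstract does not give the model's equations; the standard well-mixed (single-box) concentration model is the usual starting point for this kind of screening estimate, so the sketch below uses that form as an assumption rather than the article's actual formula:

```python
import math

def vapor_concentration(E_mg_min, Q_m3_min, V_m3, t_min):
    """Well-mixed room model: C(t) = (E/Q) * (1 - exp(-Q*t/V)).

    E: emission rate (mg/min), Q: ventilation rate (m^3/min),
    V: room volume (m^3). Returns concentration in mg/m^3.
    """
    return (E_mg_min / Q_m3_min) * (1.0 - math.exp(-Q_m3_min * t_min / V_m3))

# Example: 50 mg/min evaporating into a 300 m^3 lab at 6 air changes/hour.
V, ach = 300.0, 6.0
Q = V * ach / 60.0  # m^3/min
print(vapor_concentration(50.0, Q, V, t_min=60.0))  # approaches E/Q ~ 1.67 mg/m^3
```

The steady-state value E/Q is the worst-case screening number such a model yields for a continuously emitting source.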
USDA-ARS?s Scientific Manuscript database
Simple sequence repeat (SSR) markers are widely used tools for inferences about genetic diversity, phylogeography and spatial genetic structure. Their applications assume that variation among alleles is essentially caused by an expansion or contraction of the number of repeats and that, accessorily,...
Albert, J G; Humbla, O; McAlindon, M E; Davison, C; Seitz, U; Fraser, C; Hagenmüller, F; Noetzel, E; Spada, C; Riccioni, M E; Barnert, J; Filmann, N; Keuchel, M
2015-10-01
Small bowel capsule endoscopy (SBCE) has become a first line diagnostic tool. Several training courses with a similar format have been established in Europe; however, data on the learning curve and training in SBCE remain sparse. Between 2008 and 2011, different basic SBCE training courses were organized internationally in the UK (n = 2), Italy (n = 2), Germany (n = 2), and Finland (n = 1), and nationally in Germany (n = 10), applying similar 8-hour curricula with 50% lectures and 50% hands-on training. The Given PillCam system was used in 12 courses and the Olympus EndoCapsule system in 5. A simple evaluation tool for capsule endoscopy training (ET-CET) was developed using 10 short SBCE videos including relevant lesions and normal or irrelevant findings. For each video, delegates were required to record a diagnosis (achievable total score from 0 to 10) and the clinical relevance (achievable total score 0 to 10). The ET-CET was performed at baseline before the course and repeated, with the videos in altered order, after the course. Two hundred ninety-four delegates (79.3% physicians, 16.3% nurses, 4.4% others) were included for baseline analysis, and 268 completed the final evaluation. Forty percent had no previous experience in SBCE, and 33% had performed 10 or fewer procedures. Median scores for correct diagnosis improved from 4.0 (IQR 3) to 7.0 (IQR 3) during the courses (P < 0.001, Wilcoxon), and for correct classification of the relevance of the lesions from 5.0 (IQR 3) to 7.0 (IQR 3) (P < 0.001). Improvement was not dependent on experience, profession, SBCE system, or course setting. Previous experience in SBCE was associated with higher baseline scores for correct diagnosis (P < 0.001; Kruskal-Wallis). Additionally, independent nonparametric partial correlation with experience in gastroscopy (rho 0.33) and colonoscopy (rho 0.27) was observed (P < 0.001). A simple ET-CET demonstrated significant improvement of diagnostic skills on completion of formal basic SBCE courses with hands-on training, regardless of preexisting experience, profession, and course setting. Baseline scores for correct diagnoses show a plateau after interpretation of 25 SBCE studies before courses, supporting this number as a compromise for credentialing. Experience in flexible endoscopy may be useful before attending an SBCE course.
An automated benchmarking platform for MHC class II binding prediction methods.
Andreatta, Massimo; Trolle, Thomas; Yan, Zhen; Greenbaum, Jason A; Peters, Bjoern; Nielsen, Morten
2018-05-01
Computational methods for the prediction of peptide-MHC binding have become an integral and essential component for candidate selection in experimental T cell epitope discovery studies. The sheer number of published prediction methods, and the often discordant reports on their performance, poses a considerable quandary to the experimentalist who needs to choose the best tool for their research. With the goal to provide an unbiased, transparent evaluation of the state of the art in the field, we created an automated platform to benchmark peptide-MHC class II binding prediction tools. The platform evaluates the absolute and relative predictive performance of all participating tools on data newly entered into the Immune Epitope Database (IEDB) before they are made public, thereby providing a frequent, unbiased assessment of available prediction tools. The benchmark runs on a weekly basis, is fully automated, and displays up-to-date results on a publicly accessible website. The initial benchmark described here included six commonly used prediction servers, but other tools are encouraged to join with a simple sign-up procedure. Performance evaluation on 59 data sets composed of over 10,000 binding affinity measurements suggested that NetMHCIIpan is currently the most accurate tool, followed by NN-align and the IEDB consensus method. Weekly reports on the participating methods can be found online at: http://tools.iedb.org/auto_bench/mhcii/weekly/. mniel@bioinformatics.dtu.dk. Supplementary data are available at Bioinformatics online.
Stuyt, Elizabeth B; Voyles, Claudia A; Bursac, Sara
2018-02-07
Background: The National Acupuncture Detoxification Association (NADA) protocol, a simple standardized auricular treatment, has the potential to provide vast public health relief on issues currently challenging our world. This includes, but is not limited to, addiction, such as the opioid epidemic, and also encompasses mental health, trauma, PTSD, chronic stress, and the symptoms associated with these conditions. Simple accessible tools that improve outcomes can make profound differences. We assert that the NADA protocol can have its greatest impact when broadly applied by behavioral health professionals, Auricular Detoxification Specialists (ADSes). Methods: The concept of the ADS is described, along with how current laws vary from state to state. Using available national data, a survey of practitioners in three selected states with vastly different laws regarding ADSes, and interviews of publicly funded programs that are successfully incorporating the NADA protocol, we consider the possible effects of ADS-friendly conditions. Results: The data presented support the idea that conditions conducive to ADS practice lead to greater implementation. Program interviews reflect settings in which adding ADSes can in turn lead to improved outcomes. Discussion: The primary purpose of non-acupuncturist ADSes is to expand access to this simple but effective treatment to all who are suffering from addictions, stress, or trauma and to allow programs to incorporate acupuncture in the form of the NADA protocol at minimal cost, when and where it is needed. States that have changed laws to allow ADS practice for this standardized ear acupuncture protocol have seen increased access to this treatment, benefiting both patients and the programs.
Kangaroo – A pattern-matching program for biological sequences
2002-01-01
Background Biologists are often interested in performing a simple database search to identify proteins or genes that contain a well-defined sequence pattern. Many databases do not provide straightforward or readily available query tools to perform simple searches, such as identifying transcription binding sites, protein motifs, or repetitive DNA sequences. However, in many cases simple pattern-matching searches can reveal a wealth of information. We present in this paper a regular expression pattern-matching tool that was used to identify short repetitive DNA sequences in human coding regions for the purpose of identifying potential mutation sites in mismatch repair deficient cells. Results Kangaroo is a web-based regular expression pattern-matching program that can search for patterns in DNA, protein, or coding region sequences in ten different organisms. The program is implemented to facilitate a wide range of queries with no restriction on the length or complexity of the query expression. The program is accessible on the web at http://bioinfo.mshri.on.ca/kangaroo/ and the source code is freely distributed at http://sourceforge.net/projects/slritools/. Conclusion A low-level simple pattern-matching application can prove to be a useful tool in many research settings. For example, Kangaroo was used to identify potential genetic targets in a human colorectal cancer variant that is characterized by a high frequency of mutations in coding regions containing mononucleotide repeats. PMID:12150718
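As an illustration of the kind of query Kangaroo supports, here is a short sketch using plain Python regular expressions to locate mononucleotide repeats in a coding sequence (Kangaroo's own query language and database back end are not reproduced here):

```python
import re

def find_mononucleotide_repeats(seq: str, min_len: int = 8):
    """Return (start, run) pairs for mononucleotide runs of at least
    min_len bases; pattern syntax is plain Python regex."""
    pattern = re.compile(r"(A{%d,}|C{%d,}|G{%d,}|T{%d,})" % ((min_len,) * 4))
    return [(m.start(), m.group()) for m in pattern.finditer(seq.upper())]

coding_region = "ATGGCAAAAAAAAAACGTTTTTTTTTGCA"
print(find_mononucleotide_repeats(coding_region))
# [(5, 'AAAAAAAAAA'), (17, 'TTTTTTTTT')]
```

Runs like these in coding regions are the mutation-prone targets the authors searched for in mismatch repair deficient cells.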
ERIC Educational Resources Information Center
New Teacher Project, 2011
2011-01-01
This "Rating a Teacher Observation Tool" identifies five simple questions and provides an easy-to-use scorecard to help policymakers decide whether an observation framework is likely to produce fair and accurate results. The five questions are: (1) Do the criteria and tools cover the classroom performance areas most connected to student outcomes?…
NASA Technical Reports Server (NTRS)
Casey, E. J.; Commadore, C. C.; Ingles, M. E.
1980-01-01
Long wire bundles twist into uniform spiral harnesses with help of simple apparatus. Wires pass through spacers and through hand-held tool with hole for each wire. Ends are attached to low speed bench motor. As motor turns, operator moves hand tool away forming smooth twists in wires between motor and tool. Technique produces harnesses that generate less radio-frequency interference than do irregularly twisted cables.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorokine, Alexandre
2011-10-01
The Simple Ontology Format (SOFT) library and file format specification provides a set of simple tools for developing and maintaining ontologies. The library, implemented as a Perl module, supports parsing and verification of files in SOFT format, operations with ontologies (adding, removing, or filtering entities), and conversion of ontologies into other formats. SOFT allows users to quickly create an ontology using only a basic text editor, verify it, and portray it in a graph layout system using customized styles.
GI-conf: A configuration tool for the GI-cat distributed catalog
NASA Astrophysics Data System (ADS)
Papeschi, F.; Boldrini, E.; Bigagli, L.; Mazzetti, P.
2009-04-01
In this work we present a configuration tool for GI-cat. In a Service-Oriented Architecture (SOA) framework, GI-cat implements a distributed catalog service providing advanced capabilities such as caching, brokering, and mediation functionalities. GI-cat applies a distributed approach, distributing queries to the remote service providers of interest in an asynchronous style and notifying the caller of query status through an incremental feedback mechanism. Today, GI-cat functionalities are made available through two standard catalog interfaces: the OGC CSW ISO and CSW Core Application Profiles. Two other interfaces are under testing: the CIM and the EO Extension Packages of the CSW ebRIM Application Profile. GI-cat is able to interface a multiplicity of discovery and access services serving heterogeneous Earth and Space Sciences resources. They include international standards like the OGC Web Services (i.e., OGC CSW, WCS, WFS, and WMS), as well as interoperability arrangements (i.e., community standards) such as UNIDATA THREDDS/OPeNDAP, SeaDataNet CDI (Common Data Index), GBIF (Global Biodiversity Information Facility) services, and SibESS-C infrastructure services. GI-conf implements a user-friendly configuration tool for GI-cat. This is a GUI application that employs a visual and very simple approach to configure both the GI-cat publishing and distribution capabilities in a dynamic way. The tool allows the user to set one or more GI-cat configurations. Each configuration consists of: a) the catalog standard interfaces published by GI-cat; b) the resources (i.e., services/servers) to be accessed and mediated, i.e., federated. Simple icons are used for interfaces and resources, implementing a user-friendly visual approach. The main GI-conf functionalities are: • Interfaces and federated resources management: the user can set which interfaces are published and can add a new resource or update or remove an already federated resource. • Multiple configuration management: multiple GI-cat configurations can be defined; every configuration identifies a set of published interfaces and a set of federated resources. Configurations can be edited, added, removed, exported, and even imported. • HTML report creation: an HTML report can be created, showing the current active GI-cat configuration, including the resources that are being federated and the published interface endpoints. The configuration tool is shipped with GI-cat and can be used to configure the service after its installation is completed.
Software Models Impact Stresses
NASA Technical Reports Server (NTRS)
Hanshaw, Timothy C.; Roy, Dipankar; Toyooka, Mark
1991-01-01
Generalized Impact Stress Software designed to assist engineers in predicting stresses caused by variety of impacts. Program straightforward, simple to implement on personal computers, "user friendly", and handles variety of boundary conditions applied to struck body being analyzed. Applications include mathematical modeling of motions and transient stresses of spacecraft, analysis of slamming of piston, of fast valve shutoffs, and play of rotating bearing assembly. Provides fast and inexpensive analytical tool for analysis of stresses and reduces dependency on expensive impact tests. Written in FORTRAN 77. Requires use of commercial software package PLOT88.
Benefits of a holistic breathing technique in patients on hemodialysis.
Stanley, Ruth; Leither, Thomas W; Sindelir, Cathy
2011-01-01
Health-related quality of life and heart rate variability are often depressed in patients on hemodialysis. This pilot program used a simple holistic, self-directed breathing technique designed to improve heart rate variability, with the hypothesis that improving heart rate variability would subsequently enhance health-related quality of life. Patient self-reported benefits included reductions in anxiety, fatigue, insomnia, and pain. Using holistic physiologic techniques may offer a unique and alternative tool for nurses to help increase health-related quality of life in patients on hemodialysis.
Swertz, Morris A; Velde, K Joeri van der; Tesson, Bruno M; Scheltema, Richard A; Arends, Danny; Vera, Gonzalo; Alberts, Rudi; Dijkstra, Martijn; Schofield, Paul; Schughart, Klaus; Hancock, John M; Smedley, Damian; Wolstencroft, Katy; Goble, Carole; de Brock, Engbert O; Jones, Andrew R; Parkinson, Helen E; Jansen, Ritsert C
2010-01-01
We present an extensible software model for the genotype and phenotype community, XGAP. Readers can download a standard XGAP (http://www.xgap.org) or auto-generate a custom version using MOLGENIS with programming interfaces to R-software and web-services or user interfaces for biologists. XGAP has simple load formats for any type of genotype, epigenotype, transcript, protein, metabolite or other phenotype data. Current functionality includes tools ranging from eQTL analysis in mouse to genome-wide association studies in humans. PMID:20214801
OLIFE: Tight Binding Code for Transmission Coefficient Calculation
NASA Astrophysics Data System (ADS)
Mijbil, Zainelabideen Yousif
2018-05-01
A new, human-friendly transport calculation code has been developed. It requires a simple tight-binding Hamiltonian as the only input file and uses a convenient graphical user interface to control calculations. The effect of a magnetic field on the junction has also been included. Furthermore, the transmission coefficient can be calculated between any two points on the scatterer, which ensures high flexibility in checking the system. OLIFE can therefore be highly recommended as an essential tool for pretesting, studying, and teaching electron transport in molecular devices, saving a lot of time and effort.
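OLIFE's internals and input format are not described in the abstract; the sketch below shows the standard tight-binding/Green's-function (NEGF) transmission calculation that such codes implement, for a simple one-dimensional chain with in-band energies (all parameters here are illustrative):

```python
import numpy as np

def transmission(E, H_scat, t_lead=1.0, eta=1e-9):
    """Landauer transmission T(E) = Tr[Gamma_L G Gamma_R G^dag] for a
    scattering region H_scat coupled at its two ends to semi-infinite
    1D leads with hopping t_lead (textbook NEGF construction, valid
    for energies inside the lead band |E| < 2*t_lead)."""
    n = H_scat.shape[0]
    z = E + 1j * eta
    # Surface Green's function of a semi-infinite 1D chain lead.
    g_s = (z - 1j * np.sqrt(4 * t_lead**2 - z**2 + 0j)) / (2 * t_lead**2)
    sigma_L = np.zeros((n, n), complex); sigma_L[0, 0] = t_lead**2 * g_s
    sigma_R = np.zeros((n, n), complex); sigma_R[-1, -1] = t_lead**2 * g_s
    G = np.linalg.inv(z * np.eye(n) - H_scat - sigma_L - sigma_R)
    gamma_L = 1j * (sigma_L - sigma_L.conj().T)
    gamma_R = 1j * (sigma_R - sigma_R.conj().T)
    return float(np.trace(gamma_L @ G @ gamma_R @ G.conj().T).real)

# Perfect 4-site chain: T(E) should be ~1 inside the band.
H = -1.0 * (np.eye(4, k=1) + np.eye(4, k=-1))
print(transmission(0.5, H))  # ~1.0
```

Computing transmission "between any two points on the scatterer", as the abstract describes, amounts to attaching the lead self-energies to the chosen pair of sites instead of the chain ends.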
Using McStas for modelling complex optics, using simple building bricks
NASA Astrophysics Data System (ADS)
Willendrup, Peter K.; Udby, Linda; Knudsen, Erik; Farhi, Emmanuel; Lefmann, Kim
2011-04-01
The McStas neutron ray-tracing simulation package is a versatile tool for producing accurate neutron simulations, extensively used for design and optimization of instruments, virtual experiments, data analysis, and user training. In McStas, component organization and simulation flow are intrinsically linear: the neutron interacts with the beamline components in sequential order, one by one. Historically, a beamline component with several parts had to be implemented with a complete, internal description of all these parts, e.g., a guide component including all four mirror plates and the logic required to allow scattering between the mirrors. For quite a while, users have requested the ability to allow “components inside components”, or meta-components, allowing users to combine the functionality of several simple components to achieve more complex behaviour, e.g., four single mirror plates together defining a guide. We show here that it is now possible to define meta-components in McStas, and present a set of detailed, validated examples including a guide with an embedded, wedged, polarizing mirror system of the Helmholtz-Zentrum Berlin type.
Carvalho, Rimenys J; Cruz, Thayana A
2018-01-01
High-throughput screening (HTS) systems have emerged as important tools to provide fast and low-cost evaluation of several conditions at once, since they require small quantities of material and small sample volumes. These characteristics are extremely valuable for experiments with large numbers of variables, enabling the application of design of experiments (DoE) strategies or simple experimental planning approaches. Once the capacity of HTS systems to mimic chromatographic purification steps was established, several studies were performed successfully, including scale-down purification. Here, we propose a method for studying different purification conditions that can be used for any recombinant protein, including complex and glycosylated proteins, using low-binding filter microplates.
The Properties of Confined Water and Fluid Flow at the Nanoscale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwegler, E; Reed, J; Lau, E
This project has been focused on the development of accurate computational tools to study fluids in confined, nanoscale geometries, and the application of these techniques to probe the structural and electronic properties of water confined between hydrophilic and hydrophobic substrates, including the presence of simple ions at the interfaces. In particular, we have used a series of ab-initio molecular dynamics simulations and quantum Monte Carlo calculations to build an understanding of how hydrogen bonding and solvation are modified at the nanoscale. The properties of confined water affect a wide range of scientific and technological problems, including protein folding, cell-membrane flow, materials properties in confined media, and nanofluidic devices.
Kaban, Leonard B; Cappetta, Alyssa; George, Brian C; Lahey, Edward T; Bohnen, Jordan D; Troulis, Maria J
2017-10-01
There are no universally accepted tools to evaluate operative skills of surgical residents in a timely fashion. The purpose of this study was to determine the feasibility of using a smartphone application, SIMPL (System for Improving and Measuring Procedural Learning), developed by a multi-institutional research collaborative, to achieve a high rate of timely operative evaluations and resident communication and to collect performance data. The authors hypothesized that these goals would be achieved because the process is convenient and efficient. This was a prospective feasibility and engagement study using SIMPL to evaluate residents' operative skills. SIMPL requires the attending surgeon to answer 3 multiple-choice questions: 1) What level of help (Zwisch Scale) was required by the trainee? 2) What was the level of performance? 3) How complex was the case? The evaluator also can dictate a narrative. The sample was composed of 3 faculty members and 3 volunteer senior residents. Predictor variables were the surgeons, trainees, and procedures performed. Outcome variables included number and percentage of procedures performed by faculty-and-resident pairs assessed, time required to complete assessments, time lapsed to submission, percentage of assessments with narratives, and residents' response rates. From March through June 2016, 151 procedures were performed in the operating room by the faculty-and-resident teams. There were 107 assessments submitted (71%). Resident response (self-assessment) to faculty evaluations was 81%. Recorded time to complete assessments (n = 75 of 107) was shorter than 2 minutes. The time lapsed to submission was shorter than 72 hours (100%). Dictations were submitted for 35 evaluations (33%). Data for the type of help, performance, and complexity of cases were collected for each resident. SIMPL facilitates timely intraoperative evaluations of surgical skills, engagement by faculty and residents, and collection of detailed procedural data. Additional prospective trials to assess this tool further are planned. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Airtightness the simple(CS) way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, S.
Builders who might buck against such time-consuming air sealing methods as polyethylene wrap and the airtight drywall approach (ADA) may respond better to current strategies. One such method, called SimpleCS, has proven especially effective. SimpleCS, pronounced simplex, stands for simple caulk and seal. A modification of the ADA, SimpleCS is an air-sealing management tool, a simplified systems approach to building tight homes. The system addresses the crucial question of when and by whom various air sealing steps should be done. It avoids the problems that often occur when later contractors cut open polyethylene wrap to drill holes in the drywall. The author describes how SimpleCS works, and the cost and training involved.
New method to evaluate the 7Li(p, n)7Be reaction near threshold
NASA Astrophysics Data System (ADS)
Herrera, María S.; Moreno, Gustavo A.; Kreiner, Andrés J.
2015-04-01
In this work a complete description of the 7Li(p, n)7Be reaction near threshold is given using center-of-mass and relative coordinates. It is shown that this standard approach, not used before in this context, leads to a simple mathematical representation which gives easy access to all relevant quantities in the reaction and allows a precise numerical implementation. It also makes it simple to include proton beam-energy spread effects. The method, implemented as a C++ code, was validated against both numerical and experimental data, finding good agreement. This tool is also used here to analyze scattered published measurements such as (p, n) cross sections and differential and total neutron yields for thick targets. Using these data we derive a consistent set of parameters to evaluate neutron production near threshold. Sensitivity of the results to data uncertainty and the possibility of incorporating new measurements are also discussed.
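For orientation, the reaction threshold itself follows from standard two-body relativistic kinematics; a quick check using tabulated atomic masses (this is textbook kinematics, not the paper's C++ implementation):

```python
# Threshold for 7Li(p,n)7Be from atomic masses (u), relativistic formula
# E_th = -Q * (m_p + m_Li + m_n + m_Be) / (2 * m_Li).
u_MeV = 931.494  # MeV per atomic mass unit
m_p, m_Li, m_n, m_Be = 1.007825, 7.016003, 1.008665, 7.016929

Q = (m_p + m_Li - m_n - m_Be) * u_MeV
E_th = -Q * (m_p + m_Li + m_n + m_Be) / (2 * m_Li)
print(f"Q = {Q:.3f} MeV, E_th = {E_th:.3f} MeV")  # Q ~ -1.64, E_th ~ 1.88
```

The computed threshold near 1.88 MeV matches the well-known value for this reaction, which is why it is a workhorse for accelerator-based near-threshold neutron sources.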
An integrated approach to the use of Landsat TM data for gold exploration in west central Nevada
NASA Technical Reports Server (NTRS)
Mouat, D. A.; Myers, J. S.; Miller, N. L.
1987-01-01
This paper represents an integration of several Landsat TM image processing techniques with other data to discriminate the lithologies and associated areas of hydrothermal alteration in the vicinity of the Paradise Peak gold mine in west central Nevada. A microprocessor-based image processing system and an IDIMS system were used to analyze data from a 512 X 512 window of a Landsat-5 TM scene collected on June 30, 1984. Image processing techniques included simple band composites, band ratio composites, principal components composites, and baseline-based composites. These techniques were chosen based on their ability to discriminate the spectral characteristics of the products of hydrothermal alteration as well as of the associated regional lithologies. The simple band composite, ratio composite, two principal components composites, and the baseline-based composites separately can define the principal areas of alteration. Combined, they provide a very powerful exploration tool.
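Of the techniques listed, band ratio composites are the most directly algorithmic; here is a brief sketch of assembling one with NumPy, using a commonly cited alteration-mapping ratio set as an assumption (the study's exact band combinations are not specified in the abstract):

```python
import numpy as np

def ratio_composite(tm_bands: dict) -> np.ndarray:
    """Stack three classic TM band ratios into an RGB ratio composite.

    Ratios such as 5/7 (hydroxyl/clay alteration), 3/1 (iron oxides),
    and 5/4 (ferrous minerals) are commonly used in alteration mapping;
    the ratio set in the original study may differ.
    """
    eps = 1e-6  # avoid division by zero
    r = tm_bands[5] / (tm_bands[7] + eps)
    g = tm_bands[3] / (tm_bands[1] + eps)
    b = tm_bands[5] / (tm_bands[4] + eps)
    stretch = lambda a: (a - a.min()) / (a.max() - a.min() + eps)
    return np.dstack([stretch(r), stretch(g), stretch(b)])

# Toy 2x2 scene with random reflectances for bands 1, 3, 4, 5, 7.
rng = np.random.default_rng(1)
bands = {b: rng.uniform(0.05, 0.6, (2, 2)) for b in (1, 3, 4, 5, 7)}
print(ratio_composite(bands).shape)  # (2, 2, 3)
```

Ratioing suppresses topographic shading, which is why ratio composites complement the simple band and principal components composites in such studies.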
Ultimate Longitudinal Strength of Composite Ship Hulls
NASA Astrophysics Data System (ADS)
Zhang, Xiangming; Huang, Lingkai; Zhu, Libao; Tang, Yuhang; Wang, Anwen
2017-01-01
A simple analytical model to estimate the longitudinal strength of ship hulls in composite materials under buckling, material failure, and ultimate collapse is presented in this paper. Ship hulls are regarded as assemblies of stiffened panels, which are idealized as groups of plate-stiffener combinations. The ultimate strain of the plate-stiffener combination is predicted under buckling or material failure with composite beam-column theory. The effects of initial imperfection of the ship hull and eccentricity of load are included. Corresponding longitudinal strengths of the ship hull are derived in a straightforward method. A longitudinally framed ship hull made of symmetrically stacked unidirectional plies under sagging is analyzed. The results indicate that the present analytical results agree well with the FEM method. The initial deflection of the ship hull and eccentricity of load can dramatically reduce the bending capacity of the hull. The proposed formulations provide a simple but useful tool for longitudinal strength estimation in practical design.
Quantitative Modeling of Earth Surface Processes
NASA Astrophysics Data System (ADS)
Pelletier, Jon D.
This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
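As a flavor of the book's approach (simple governing equations solved with short programs), here is a hedged sketch of the linear hillslope diffusion model, one of the standard equations in this field; it is illustrative and not taken from the book's own codes:

```python
import numpy as np

def hillslope_diffusion(z, D=0.01, dx=1.0, dt=1.0, steps=1000):
    """Explicit finite-difference solution of dz/dt = D * d2z/dx2,
    the classic linear hillslope diffusion model, with fixed-elevation
    boundaries. D: diffusivity (m^2/yr), dx: node spacing (m)."""
    assert D * dt / dx**2 <= 0.5, "explicit scheme stability limit"
    z = z.copy()
    for _ in range(steps):
        z[1:-1] += D * dt / dx**2 * (z[2:] - 2 * z[1:-1] + z[:-2])
    return z

x = np.linspace(0, 100, 101)
scarp = np.where(x < 50, 10.0, 0.0)   # initial fault scarp profile
print(hillslope_diffusion(scarp, steps=5000)[45:55].round(2))
```

The progressive rounding of the scarp crest over time is the signature behavior used to date fault scarps with this model.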
Shen, Lishuang; Attimonelli, Marcella; Bai, Renkui; Lott, Marie T; Wallace, Douglas C; Falk, Marni J; Gai, Xiaowu
2018-06-01
Accurate mitochondrial DNA (mtDNA) variant annotation is essential for the clinical diagnosis of diverse human diseases. Substantial challenges to this process include the inconsistency in mtDNA nomenclatures, the existence of multiple reference genomes, and a lack of reference population frequency data. Clinicians need a simple bioinformatics tool that is user-friendly, and bioinformaticians need a powerful informatics resource for programmatic usage. Here, we report the development and functionality of the MSeqDR mtDNA Variant Tool set (mvTool), a one-stop mtDNA variant annotation and analysis Web service. mvTool is built upon the MSeqDR infrastructure (https://mseqdr.org), with contributions of expert-curated data from MITOMAP (https://www.mitomap.org) and HmtDB (https://www.hmtdb.uniba.it/hmdb). mvTool supports all mtDNA nomenclatures, converts variants to standard rCRS- and HGVS-based nomenclatures, and annotates novel mtDNA variants. Besides generic annotations from dbNSFP and the Variant Effect Predictor (VEP), mvTool provides allele frequencies in more than 47,000 germline mitogenomes, and disease and pathogenicity classifications from MSeqDR, MITOMAP, HmtDB, and ClinVar (Landrum et al., 2013). mvTool also provides mtDNA somatic variant annotations. The mvTool API is implemented for programmatic access using inputs in VCF, HGVS, or classical mtDNA variant nomenclatures. The results are reported as hyperlinked HTML tables, JSON, Excel, and VCF formats. MSeqDR mvTool is freely accessible at https://mseqdr.org/mvtool.php. © 2018 Wiley Periodicals, Inc.
Fisher, Rohan P; Myers, Bronwyn A
2011-02-25
Background: Despite the demonstrated utility of GIS for health applications, there are perceived problems in low resource settings: GIS software can be expensive and complex; input data are often of low quality. This study aimed to test the appropriateness of new, inexpensive and simple GIS tools in poorly resourced areas of a developing country. GIS applications were trialled in pilot studies based on mapping of health resources and health indicators at the clinic and district level in the predominantly rural province of Nusa Tenggara Timur in eastern Indonesia. The pilot applications were (i) rapid field collection of health infrastructure data using a GPS enabled PDA, (ii) mapping health indicator data using open source GIS software, and (iii) service availability mapping using a free modelling tool. Results: Through contextualised training, district and clinic staff acquired skills in spatial analysis and visualisation and, six months after the pilot studies, they were using these skills for advocacy in the planning process, to inform the allocation of some health resources, and to evaluate some public health initiatives. Conclusions: We demonstrated that GIS can be a useful and inexpensive tool for the decentralisation of health data analysis to low resource settings through the use of free and simple software, locally relevant training materials and by providing data collection tools to ensure data reliability. PMID:21352553
Mohammad Al Alfy, Ibrahim
2018-01-01
A set of three pads was constructed from primary materials (sand, gravel, and cement) to calibrate the gamma-gamma density tool. A simple equation was devised to convert the qualitative cps values to quantitative g/cc values. The neutron-neutron porosity tool measures qualitative cps porosity values; a direct equation was derived to calculate the porosity percentage from these values. The cement-bond log illustrates the quantity of cement surrounding well pipes. Interpreting this log is a difficult process due to various parameters, such as the drilled well diameter as well as the internal diameter, thickness, and type of the well pipes. An equation was derived to calculate the cement percentage at standard conditions. This equation can be modified according to varying conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.
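The paper's calibration equation is not reproduced in the abstract; the sketch below shows one plausible form, fitting ln(cps) linearly against the known pad densities (gamma-gamma count rates fall roughly exponentially with density). All numbers are invented for illustration:

```python
import numpy as np

# Hypothetical pad measurements: known pad densities (g/cc) vs tool
# count rates (cps); values are illustrative, not the paper's data.
rho_pads = np.array([1.7, 2.2, 2.6])        # sand, gravel, cement pads
cps_pads = np.array([9200.0, 5100.0, 3050.0])

b, ln_A = np.polyfit(rho_pads, np.log(cps_pads), 1)  # ln(cps) = ln_A + b*rho

def cps_to_density(cps):
    """Convert a qualitative cps reading to a quantitative g/cc value."""
    return (np.log(cps) - ln_A) / b

print(round(float(cps_to_density(5100.0)), 2))  # ~2.2, recovering a pad value
```

With only three pads, a two-parameter fit of this kind is fully determined up to measurement scatter, which is consistent with the "simple equation" the authors describe.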
OceanVideoLab: A Tool for Exploring Underwater Video
NASA Astrophysics Data System (ADS)
Ferrini, V. L.; Morton, J. J.; Wiener, C.
2016-02-01
Video imagery acquired with underwater vehicles is an essential tool for characterizing seafloor ecosystems and seafloor geology. It is a fundamental component of ocean exploration that facilitates real-time operations, augments multidisciplinary scientific research, and holds tremendous potential for public outreach and engagement. Acquiring, documenting, managing, preserving and providing access to large volumes of video acquired with underwater vehicles presents a variety of data stewardship challenges to the oceanographic community. As a result, only a fraction of underwater video content collected with research submersibles is documented, discoverable and/or viewable online. With more than 1 billion users, YouTube offers infrastructure that can be leveraged to help address some of the challenges associated with sharing underwater video with a broad global audience. Anyone can post content to YouTube, and some oceanographic organizations, such as the Schmidt Ocean Institute, have begun live-streaming video directly from underwater vehicles. OceanVideoLab (oceanvideolab.org) was developed to help improve access to underwater video through simple annotation, browse functionality, and integration with related environmental data. Any underwater video that is publicly accessible on YouTube can be registered with OceanVideoLab by simply providing a URL. It is strongly recommended that a navigational file also be supplied to enable geo-referencing of observations. Once a video is registered, it can be viewed and annotated using a simple user interface that integrates observations with vehicle navigation data if provided. This interface includes an interactive map and a list of previous annotations that allows users to jump to times of specific observations in the video. Future enhancements to OceanVideoLab will include the deployment of a search interface, the development of an application program interface (API) that will drive the search and enable querying of content by other systems/tools, the integration of related environmental data from complementary data systems (e.g. temperature, bathymetry), and the expansion of infrastructure to enable broad crowdsourcing of annotations.
Functional phosphoproteomic mass spectrometry-based approaches
2012-01-01
Mass Spectrometry (MS)-based phosphoproteomics tools are crucial for understanding the structure and dynamics of signaling networks. Approaches such as affinity purification followed by MS have also been used to elucidate relevant biological questions in health and disease. The study of proteomes and phosphoproteomes as linked systems, rather than as individual proteins, is necessary to understand the functions of phosphorylated and un-phosphorylated proteins under spatial and temporal conditions. Phosphoproteome studies also facilitate drug target protein identification, which may be clinically useful in the near future. Here, we provide an overview of general principles of signaling pathways versus phosphorylation. Likewise, we detail chemical phosphoproteomic tools, including their pros and cons, with examples of where these methods have been applied. In addition, the basics of electrospray ionization and collision-induced dissociation fragmentation are explained in a simple manner to support successful phosphoproteomic clinical studies. PMID:23369623
orthoFind Facilitates the Discovery of Homologous and Orthologous Proteins.
Mier, Pablo; Andrade-Navarro, Miguel A; Pérez-Pulido, Antonio J
2015-01-01
Finding homologous and orthologous protein sequences is often the first step in evolutionary studies, annotation projects, and experiments of functional complementation. Despite all currently available computational tools, there is a need for easy-to-use tools that provide functional information. Here, a new web application called orthoFind is presented, which enables a quick search for homologous and orthologous proteins given one or more query sequences, supports recurrent and exhaustive searches against reference proteomes, and can include user databases. It addresses the protein multidomain problem, searching for homologs with the same domain architecture, and gives a simple functional analysis of the results to help in the annotation process. orthoFind is easy to use and has been proven to provide accurate results with different datasets. Availability: http://www.bioinfocabd.upo.es/orthofind/.
Dumont, Elodie; De Bleye, Charlotte; Sacré, Pierre-Yves; Netchacovitch, Lauranne; Hubert, Philippe; Ziemons, Eric
2016-05-01
Over recent decades, growing environmental concern has driven the expansion of green chemistry analytical tools. Vibrational spectroscopy, belonging to this class of analytical tools, is particularly interesting given its numerous advantages, such as fast data acquisition and the absence of sample preparation. In this context, near-infrared, Raman, and especially surface-enhanced Raman spectroscopy (SERS) have gained interest in many fields, including bioanalysis. The former two techniques only allow the analysis of concentrated compounds in simple matrices, whereas the emergence of SERS has extended vibrational spectroscopy to very sensitive and selective analyses. Complex SERS substrates have also been developed that enable biomarker measurements, paving the way for SERS immunoassays. In this paper, the strengths and weaknesses of these techniques are therefore highlighted, with a focus on recent progress.
Justice, W S M; O'Brien, M F; Szyszka, O; Shotton, J; Gilmour, J E M; Riordan, P; Wolfensohn, S
2017-08-05
Animal welfare monitoring is an essential part of zoo management and a legal requirement in many countries. Historically, a variety of welfare audits have been proposed to assist zoo managers. Unfortunately, there are a number of issues with these assessments, including lack of species information, validated tests and the overall complexity of these audits which make them difficult to implement in practice. The animal welfare assessment grid (AWAG) has previously been proposed as an animal welfare monitoring tool for animals used in research programmes. This computer-based system was successfully adapted for use in a zoo setting with two taxonomic groups: primates and birds. This tool is simple to use and provides continuous monitoring based on cumulative lifetime assessment. It is suggested as an alternative, practical method for welfare monitoring in zoos. British Veterinary Association.
2009-01-01
Background: Breast cancer is a significant public health problem worldwide, and the development of tools to identify individuals at risk for hereditary breast cancer syndromes, where specific interventions can be proposed to reduce risk, has become increasingly relevant. A previous study in Southern Brazil has shown that a family history suggestive of these syndromes may be prevalent at the primary care level. Development of a simple and sensitive instrument, easily applicable in primary care units, would be particularly helpful in underserved communities in which identification and referral of high-risk individuals is difficult. Methods: A simple 7-question instrument about family history of breast, ovarian, and colorectal cancer, FHS-7, was developed to screen for individuals with an increased risk for hereditary breast cancer syndromes. FHS-7 was applied to 9218 women during routine visits to primary care units in Southern Brazil. Two consecutive samples were included: 885 women who answered positively to at least one question and 910 women who answered negatively to all questions. The sensitivity, specificity, and positive and negative predictive values were determined. Results: Of the 885 women reporting a positive family history, 211 (23.8%; CI95%: 21.5-26.2) had a pedigree suggestive of a hereditary breast and/or breast and colorectal cancer syndrome. Using one positive answer as the cut point, the sensitivity and specificity of the instrument were 87.6% and 56.4%, respectively. Concordance between answers in two different applications was given by an intra-class correlation (ICC) of 0.84 for at least one positive answer. Temporal stability of the instrument was adequate (ICC = 0.65). Conclusion: A simple instrument for the identification of the most common hereditary breast cancer syndrome phenotypes, showing good specificity and temporal stability, was developed and could be used as a screening tool in primary care to refer at-risk individuals for genetic evaluations. PMID:19682358
Study of Tools for Network Discovery and Network Mapping
2003-11-01
…connected to the switch. iv. Accessibility of historical data and event data: in general, network discovery tools keep a history of the collected… has the following software dependencies: Java Virtual Machine, Perl modules, RRDtool, Tomcat, PostgreSQL. STRENGTHS AND… systems: provide a simple view of the current network status; generate alarms on status change; generate a history of status changes. VISUAL MAP
A simple stochastic weather generator for ecological modeling
A.G. Birt; M.R. Valdez-Vivas; R.M. Feldman; C.W. Lafon; D. Cairns; R.N. Coulson; M. Tchakerian; W. Xi; Jim Guldin
2010-01-01
Stochastic weather generators are useful tools for exploring the relationship between organisms and their environment. This paper describes a simple weather generator that can be used in ecological modeling projects. We provide a detailed description of the methodology, and links to the full C++ source code (http://weathergen.sourceforge.net) required to implement or modify...
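The generator's internal structure is not spelled out in the abstract; a common minimal design, assumed here purely for illustration, is a first-order Markov chain for wet/dry occurrence with exponentially distributed wet-day amounts:

```python
import random

def generate_precip(days, p_wd=0.3, p_ww=0.7, mean_depth=8.0, seed=42):
    """First-order Markov chain for wet/dry occurrence with exponential
    wet-day depths (mm): a common minimal weather-generator design,
    not the exact model of the paper.

    p_wd: P(wet | yesterday dry); p_ww: P(wet | yesterday wet).
    """
    rng = random.Random(seed)
    series, wet = [], False
    for _ in range(days):
        wet = rng.random() < (p_ww if wet else p_wd)
        series.append(rng.expovariate(1.0 / mean_depth) if wet else 0.0)
    return series

rain = generate_precip(365)
print(round(sum(rain)), sum(1 for r in rain if r > 0))  # annual total, wet days
```

The transition probabilities and mean depth would normally be fitted month by month from station records, which preserves both seasonality and wet/dry spell persistence.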
M"Health" for Higher Education
ERIC Educational Resources Information Center
Aburas, Abdurazzag A.; Ayran, Mujgan
2013-01-01
Better education requires better, more advanced tools for students. Smartphones have become a central part of our daily lives. A new mobile-based medical interface is introduced for medical students. The graphical user interface must be easy and simple; the main interface design issue for mobile is being simple and easy to use. Human Mobile…
Social Network Analysis: A Simple but Powerful Tool for Identifying Teacher Leaders
ERIC Educational Resources Information Center
Smith, P. Sean; Trygstad, Peggy J.; Hayes, Meredith L.
2018-01-01
Instructional teacher leadership is central to a vision of distributed leadership. However, identifying instructional teacher leaders can be a daunting task, particularly for administrators who find themselves either newly appointed or faced with high staff turnover. This article describes the use of social network analysis (SNA), a simple but…
EZ and GOSSIP, two new VO compliant tools for spectral analysis
NASA Astrophysics Data System (ADS)
Franzetti, P.; Garill, B.; Fumana, M.; Paioro, L.; Scodeggio, M.; Paltani, S.; Scaramella, R.
2008-10-01
We present EZ and GOSSIP, two new VO compliant tools dedicated to spectral analysis. EZ is a tool to perform automatic redshift measurement; GOSSIP is a tool created to perform the SED fitting procedure in a simple, user-friendly, and efficient way. These two tools have been developed by the PANDORA Group at INAF-IASF (Milano); EZ has been developed in collaboration with Osservatorio Monte Porzio (Roma) and the Integral Science Data Center (Geneve). EZ is released to the astronomical community; GOSSIP is currently in beta-testing.
Methods for quantifying simple gravity sensing in Drosophila melanogaster.
Inagaki, Hidehiko K; Kamikouchi, Azusa; Ito, Kei
2010-01-01
Perception of gravity is essential for animals: most animals possess specific sense organs to detect the direction of the gravitational force. Little is known, however, about the molecular and neural mechanisms underlying their behavioral responses to gravity. Drosophila melanogaster, having a rather simple nervous system and a large variety of molecular genetic tools available, serves as an ideal model for analyzing the mechanisms underlying gravity sensing. Here we describe an assay to measure simple gravity responses of flies behaviorally. This method can be applied for screening genetic mutants of gravity perception. Furthermore, in combination with recent genetic techniques to silence or activate selective sets of neurons, it serves as a powerful tool to systematically identify neural substrates required for the proper behavioral responses to gravity. The assay requires 10 min to perform, and two experiments can be performed simultaneously, enabling 12 experiments per hour.
Simple tool for prediction of parotid gland sparing in intensity-modulated radiation therapy.
Gensheimer, Michael F; Hummel-Kramer, Sharon M; Cain, David; Quang, Tony S
2015-01-01
Sparing one or both parotid glands is a key goal when planning head and neck cancer radiation treatment. If the planning target volume (PTV) overlaps one or both parotid glands substantially, it may not be possible to achieve adequate gland sparing. This finding results in physicians revising their PTV contours after an intensity-modulated radiation therapy (IMRT) plan has been run and reduces workflow efficiency. We devised a simple formula for predicting mean parotid gland dose from the overlap of the parotid gland and isotropically expanded PTV contours. We tested the tool using 44 patients from 2 institutions and found agreement between predicted and actual parotid gland doses (mean absolute error = 5.3 Gy). This simple method could increase treatment planning efficiency by improving the chance that the first plan presented to the physician will have optimal parotid gland sparing. Published by Elsevier Inc.
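The published formula's fitted coefficients are not given in the abstract, so only the quantity it relies on, the overlap between the parotid and the isotropically expanded PTV, is sketched here (mask names and grid are assumptions):

```python
import numpy as np

def overlap_fraction(parotid_mask: np.ndarray, expanded_ptv_mask: np.ndarray) -> float:
    """Fraction of parotid voxels falling inside the isotropically
    expanded PTV; the study maps this overlap to a predicted mean
    parotid dose via a simple fitted formula (coefficients not shown
    in the abstract, so not reproduced here)."""
    parotid = parotid_mask.astype(bool)
    ptv = expanded_ptv_mask.astype(bool)
    return float((parotid & ptv).sum() / parotid.sum())

# Toy 3D masks on a coarse voxel grid.
rng = np.random.default_rng(7)
parotid = rng.random((20, 20, 20)) < 0.1
ptv = rng.random((20, 20, 20)) < 0.3
print(round(overlap_fraction(parotid, ptv), 2))  # ~0.3 for these random masks
```

Computing this fraction before optimization is what lets the planner flag glands whose predicted mean dose would exceed the sparing threshold.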
Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to impleme...
Desktop publishing: a useful tool for scientists.
Lindroth, J R; Cooper, G; Kent, R L
1994-01-01
Desktop publishing offers features that are not available in word processing programs. The process yields an impressive and professional-looking document that is legible and attractive. It is a simple but effective tool to enhance the quality and appearance of your work and perhaps also increase your productivity.
Photoelectron Imaging as a Quantum Chemistry Visualization Tool
ERIC Educational Resources Information Center
Grumbling, Emily R.; Pichugin, Kostyantyn; Mabbs, Richard; Sanov, Andrei
2011-01-01
An overview and simple example of photoelectron imaging is presented, highlighting its efficacy as a pedagogical tool for visualizing quantum phenomena. Specifically, photoelectron imaging of H[superscript -] (the simplest negative ion) is used to demonstrate several quantum mechanical principles. This example could be incorporated into an…
So simple, so hard: Taking medication as directed.
Burkholder, Rebecca; Linn, Elaine
Millions of Americans either fail to take the full course of prescribed medication or take it incorrectly. The problem is particularly serious for people with cardiovascular disease, respiratory disease, and diabetes, and for racially and ethnically diverse populations. The (U.S.) National Consumers League, through its Script Your Future medication adherence awareness campaign, presented the conference "So Simple, So Hard: Taking Medications as Directed", convening health care professionals, community health workers, consumer and patient advocates, researchers, industry representatives, public agencies, and policymakers. This one-day research symposium aimed to explore challenges and barriers to medication adherence and to highlight tools and strategies to improve adherence and health outcomes, particularly among underserved populations. The conference began with presentations on adherence research and health disparities, and continued with presentations on strategies and tools to improve adherence that could be utilized in health care practices or organizations (including assessing adherence, medication synchronization, and comprehensive medication management). Through group discussions, the conference provided a forum for participants to interact and lay the groundwork to develop partnerships for collaborative initiatives to improve appropriate medication use and adherence. Participants surveyed at the end of the day and 30 days after the conference reported that they found the meeting highly useful (rated 4.6 out of 5), with the vast majority saying they learned about research and tools they can apply in their work and made new connections for potential collaborations. The conference learnings are being shared by participants and disseminated to other interested organizations and individuals. Copyright © 2016 Elsevier Inc. All rights reserved.
Huber, Adam M.; Dugan, Elizabeth M.; Lachenbruch, Peter A.; Feldman, Brian M.; Perez, Maria D.; Zemel, Lawrence S.; Lindsley, Carol B.; Rennebohm, Robert M.; Wallace, Carol A.; Passo, Murray H.; Reed, Ann M.; Bowyer, Suzanne L.; Ballinger, Susan H.; Miller, Frederick W.; Rider, Lisa G.
2007-01-01
Objectives: Clinical care and therapeutic trials in idiopathic inflammatory myopathies (IIM) require accurate and consistent assessment of cutaneous involvement. The Cutaneous Assessment Tool (CAT) was designed to measure skin activity and damage in IIM. We describe the development and inter-rater reliability of the CAT, and the frequency of lesions endorsed in a large population of juvenile IIM patients. Methods: The CAT includes 10 activity, 4 damage, and 7 combined lesions. Thirty-two photographic slides depicting IIM skin lesions were assessed by 11 raters. One hundred and twenty-three children were assessed by 11 pediatric rheumatologists at ten centers. Inter-rater reliability was assessed using simple agreements and intra-class correlation coefficients (ICCs). Results: Simple agreements in recognizing lesions as present or absent were generally high (0.5-1.0). ICCs for CAT lesions were moderate (0.4-0.75) in both slides and real patients. ICCs for the CAT activity and damage scores were 0.71 and 0.81, respectively. CAT activity scores ranged from 0-44 (median 7, potential range 0-96) and CAT damage scores ranged from 0-13 (median 1, potential range 0-22). The most common cutaneous lesions endorsed were periungual capillary loop changes (63%), Gottron's papules/sign (53%), heliotrope rash (49%), and malar/facial erythema (49%). Conclusions: Total CAT activity and damage scores have moderate to good reliability. Assessors generally agree on the presence of a variety of cutaneous lesions. The CAT is a promising, semi-quantitative tool to comprehensively assess skin disease activity and damage in IIM. PMID:17890275
Quality Control System using Simple Implementation of Seven Tools for Batik Textile Manufacturing
NASA Astrophysics Data System (ADS)
Ragil Suryoputro, Muhammad; Sugarindra, Muchamad; Erfaisalsyah, Hendy
2017-06-01
In order to produce better products and mitigate defects, every company must implement a quality control system that is capable and reliable. One such method is the simple implementation of the seven basic quality control tools to defect data. The case studied in this research was the defect level of xyz grey fabric produced on shuttle loom 2 at a batik manufacturing company. The seven tools comprise a flowchart, check sheet, histogram, scatter diagram combined with control charts, Pareto diagram, and fishbone (cause-and-effect) diagram. The check sheet identified the defect types in the woven xyz grey fabric as warp defects, double warp, broken warp, empty warp, loose warp, bad edges, thick warp, and rust. The control chart analysis indicates that the process is out of control, as the chart still contains many outlier data points. The scatter diagram shows a positive correlation between the defect percentage and the production volume. Based on the Pareto diagram, repair priority should go to the dominant defect type, warp defects (44%); double warp is also the largest category in the histogram, with a value of 23,635.11 m. In addition, the fishbone diagram analysis traces double warp and the other defect types to causes in materials, methods, machines, measurement, man, and environment. With these results, the company can take preventive and corrective action to minimize defects and improve product quality.
Biotool2Web: creating simple Web interfaces for bioinformatics applications.
Shahid, Mohammad; Alam, Intikhab; Fuellen, Georg
2006-01-01
Currently there are many bioinformatics applications being developed, but there is no easy way to publish them on the World Wide Web. We have developed a Perl script, called Biotool2Web, which makes the task of creating web interfaces for simple ('home-made') bioinformatics applications quick and easy. Biotool2Web uses an XML document containing the parameters needed to run the tool on the Web, and generates the corresponding HTML and common gateway interface (CGI) files ready to be published on a web server. This tool is available for download at http://www.uni-muenster.de/Bioinformatics/services/biotool2web/ (contact: Georg Fuellen, fuellen@alum.mit.edu).
Processing Solutions for Big Data in Astronomy
NASA Astrophysics Data System (ADS)
Fillatre, L.; Lepiller, D.
2016-09-01
This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
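As a concrete illustration of the MapReduce schema mentioned above, here is a minimal sketch in plain Python (our own toy example, not tied to Hadoop or Spark) that mimics the map, shuffle, and reduce phases with the canonical word-count task:

```python
from collections import defaultdict

def map_phase(documents):
    # Emit (word, 1) pairs for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each key's values; here, sum the counts.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data is big", "data pipelines process big data"]
print(reduce_phase(shuffle(map_phase(docs))))
# {'big': 3, 'data': 3, 'is': 1, 'pipelines': 1, 'process': 1}
```

In a real cluster the map and reduce calls run in parallel across machines and the shuffle moves data over the network; the programming model, however, stays this simple, which is exactly the appeal the paper describes.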
I'll take that to go: Big data bags and minimal identifiers for exchange of large, complex datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chard, Kyle; D'Arcy, Mike; Heavner, Benjamin D.
Big data workflows often require the assembly and exchange of complex, multi-element datasets. For example, in biomedical applications, the input to an analytic pipeline can be a dataset consisting of thousands of images and genome sequences assembled from diverse repositories, requiring a description of the contents of the dataset in a concise and unambiguous form. Typical approaches to creating datasets for big data workflows assume that all data reside in a single location, requiring costly data marshaling and permitting errors of omission and commission because dataset members are not explicitly specified. We address these issues by proposing simple methods and tools for assembling, sharing, and analyzing large and complex datasets that scientists can easily integrate into their daily workflows. These tools combine a simple and robust method for describing data collections (BDBags), data descriptions (Research Objects), and simple persistent identifiers (Minids) to create a powerful ecosystem of tools and services for big data analysis and sharing. We present these tools and use biomedical case studies to illustrate their use for the rapid assembly, sharing, and analysis of large datasets.
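BDBags build on the BagIt packaging convention, in which payload files live under data/ and a checksum manifest makes the bag's contents explicit and verifiable. As a rough illustration of that idea only (real use should go through the BDBag tooling; the file names here are hypothetical), a minimal BagIt-style layout can be produced with the Python standard library:

```python
import hashlib
from pathlib import Path

def make_minimal_bag(bag_dir: Path) -> None:
    """Lay out a minimal BagIt-style bag: payload under data/, a bagit.txt
    declaration, and a sha256 manifest that pins the bag's exact contents."""
    data_dir = bag_dir / "data"
    data_dir.mkdir(parents=True, exist_ok=True)
    (data_dir / "readings.csv").write_text("id,value\n1,0.42\n")  # toy payload

    (bag_dir / "bagit.txt").write_text(
        "BagIt-Version: 1.0\nTag-File-Character-Encoding: UTF-8\n"
    )
    with open(bag_dir / "manifest-sha256.txt", "w") as manifest:
        for path in sorted(data_dir.rglob("*")):
            if path.is_file():
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
                manifest.write(f"{digest}  {path.relative_to(bag_dir).as_posix()}\n")

make_minimal_bag(Path("example_bag"))
```

The manifest is what prevents the "errors of omission and commission" the authors mention: any member added, dropped, or altered after bagging no longer matches its recorded checksum.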
Simmonds, Mark; Burch, Jane; Llewellyn, Alexis; Griffiths, Claire; Yang, Huiqin; Owen, Christopher; Duffy, Steven; Woolacott, Nerys
2015-06-01
It is uncertain which simple measures of childhood obesity are best for predicting future obesity-related health problems and the persistence of obesity into adolescence and adulthood. To investigate the ability of simple measures, such as body mass index (BMI), to predict the persistence of obesity from childhood into adulthood and to predict obesity-related adult morbidities. To investigate how accurately simple measures diagnose obesity in children, and how acceptable these measures are to children, carers and health professionals. Multiple sources including MEDLINE, EMBASE and The Cochrane Library were searched from 2008 to 2013. Systematic reviews and a meta-analysis were carried out of large cohort studies on the association between childhood obesity and adult obesity; the association between childhood obesity and obesity-related morbidities in adulthood; and the diagnostic accuracy of simple childhood obesity measures. Study quality was assessed using Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) and a modified version of the Quality in Prognosis Studies (QUIPS) tool. A systematic review and an elicitation exercise were conducted on the acceptability of the simple measures. Thirty-seven studies (22 cohorts) were included in the review of prediction of adult morbidities. Twenty-three studies (16 cohorts) were included in the tracking review. All studies included BMI. There were very few studies of other measures. There was a strong positive association between high childhood BMI and adult obesity [odds ratio 5.21, 95% confidence interval (CI) 4.50 to 6.02]. A positive association was found between high childhood BMI and adult coronary heart disease, diabetes and a range of cancers, but not stroke or breast cancer. The predictive accuracy of childhood BMI to predict any adult morbidity was very low, with most morbidities occurring in adults who were of healthy weight in childhood. Predictive accuracy of childhood obesity was moderate for predicting adult obesity, with a sensitivity of 30% and a specificity of 98%. Persistence of obesity from adolescence to adulthood was high. Thirty-four studies were included in the diagnostic accuracy review. Most of the studies used the least reliable reference standard (dual-energy X-ray absorptiometry); only 24% of studies were of high quality. The sensitivity of BMI for diagnosing obesity and overweight varied considerably; specificity was less variable. Pooled sensitivity of BMI was 74% (95% CI 64.2% to 81.8%) and pooled specificity was 95% (95% CI 92.2% to 96.4%). The acceptability to children and their carers of BMI or other common simple measures was generally good. Little evidence was available regarding childhood measures other than BMI. No individual-level analysis could be performed. Childhood BMI is not a good predictor of adult obesity or adult disease; the majority of obese adults were not obese as children and most obesity-related adult morbidity occurs in adults who had a healthy childhood weight. However, obesity (as measured using BMI) was found to persist from childhood to adulthood, with most obese adolescents also being obese in adulthood. BMI was found to be reasonably good for diagnosing obesity during childhood. There is no convincing evidence suggesting that any simple measure is better than BMI for diagnosing obesity in childhood or predicting adult obesity and morbidity. 
Further research on obesity measures other than BMI is needed to determine which is the best tool for diagnosing childhood obesity, and new cohort studies are needed to investigate the impact of contemporary childhood obesity on adult obesity and obesity-related morbidities. This study is registered as PROSPERO CRD42013005711. Funding was provided by the National Institute for Health Research Health Technology Assessment programme.
Escape Excel: A tool for preventing gene symbol and accession conversion errors
Stewart, Paul A.; Kuenzi, Brent M.; Eschrich, James A.
2017-01-01
Background Microsoft Excel automatically converts certain gene symbols, database accessions, and other alphanumeric text into dates, scientific notation, and other numerical representations. These conversions lead to subsequent, irreversible corruption of the imported text. A recent survey of popular genomic literature estimates that one-fifth of all papers with supplementary gene lists suffer from this issue. Results Here, we present an open-source tool, Escape Excel, which prevents these erroneous conversions by generating an escaped text file that can be safely imported into Excel. Escape Excel is implemented in a variety of formats (http://www.github.com/pstew/escape_excel), including a command line based Perl script, a Windows-only Excel Add-In, an OS X drag-and-drop application, a simple web server, and a Galaxy web environment interface. Test server implementations are accessible as a Galaxy interface (http://apostl.moffitt.org) and as a simple non-Galaxy web server (http://apostl.moffitt.org:8000/). Conclusions Escape Excel detects and escapes a wide variety of problematic text strings so that they are not erroneously converted into other representations upon importation into Excel. Examples of problematic strings include date-like strings, time-like strings, leading zeroes in front of numbers, and long numeric and alphanumeric identifiers that should not be automatically converted into scientific notation. It is hoped that greater awareness of these potential data corruption issues, together with diligent escaping of text files prior to importation into Excel, will help to reduce the amount of Excel-corrupted data in scientific analyses and publications. PMID:28953918
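One common escaping trick for this problem is to wrap risky fields as ="...", which Excel imports verbatim. The sketch below illustrates the idea in Python; the regular expressions are our illustrative guesses at the problematic string classes the abstract describes, and we do not claim they reproduce Escape Excel's exact rule set:

```python
import re

# Illustrative guesses at the string classes Excel mangles; not the exact
# rule set Escape Excel implements.
RISKY = [
    re.compile(r"^[A-Za-z]{3,9}[-/ ]?\d{1,2}$"),  # gene-symbols-as-dates: SEPT2, MARCH1
    re.compile(r"^\d{1,2}[-/][A-Za-z]{3,9}$"),    # date-like: 2-SEP
    re.compile(r"^\d+[eE]\d+$"),                  # scientific-notation-like: 2310009E13
    re.compile(r"^0\d+$"),                        # leading zeros: 0123
]

def escape_field(field: str) -> str:
    # Wrapping a field as ="..." makes Excel import it verbatim.
    return '="{}"'.format(field) if any(p.match(field) for p in RISKY) else field

def escape_tsv_line(line: str) -> str:
    return "\t".join(escape_field(f) for f in line.rstrip("\n").split("\t"))

print(escape_tsv_line("SEPT2\tMARCH1\t2310009E13\t0123"))
# ="SEPT2"  ="MARCH1"  ="2310009E13"  ="0123"
```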
Dietary screening tool identifies nutritional risk in older adults
Miller, Paige E; Mitchell, Diane C; Hartman, Terryl J; Lawrence, Frank R; Sempos, Christopher T; Smiciklas-Wright, Helen
2009-01-01
Background: No rapid methods exist for screening overall dietary intakes in older adults. Objective: The purpose of this study was to develop and evaluate a scoring system for a diet screening tool to identify nutritional risk in community-dwelling older adults. Design: This cross-sectional study in older adults (n = 204) who reside in rural areas examined nutrition status by using an in-person interview, biochemical measures, and four 24-h recalls that included the use of dietary supplements. Results: The dietary screening tool was able to characterize 3 levels of nutritional risk: at risk, possible risk, and not at risk. Individuals classified as at nutritional risk had significantly lower indicators of diet quality (Healthy Eating Index and Mean Adequacy Ratio) and intakes of protein, most micronutrients, dietary fiber, fruit, and vegetables. The at-risk group had higher intakes of fats and oils and refined grains. The at-risk group also had the lowest serum vitamin B-12, folate, β-cryptoxanthin, lutein, and zeaxanthin concentrations. The not-at-nutritional-risk group had significantly higher lycopene and β-carotene and lower homocysteine and methylmalonic acid concentrations. Conclusion: The dietary screening tool is a simple and practical tool that can help to detect nutritional risk in older adults. PMID:19458013
A patient-centered electronic tool for weight loss outcomes after Roux-en-Y gastric bypass.
Wood, G Craig; Benotti, Peter; Gerhard, Glenn S; Miller, Elaina K; Zhang, Yushan; Zaccone, Richard J; Argyropoulos, George A; Petrick, Anthony T; Still, Christopher D
2014-01-01
BACKGROUND. Current patient education and informed consent regarding weight loss expectations for bariatric surgery candidates are largely based on averages from large patient cohorts. The variation in weight loss outcomes illustrates the need for establishing more realistic weight loss goals for individual patients. This study was designed to develop a simple web-based tool which provides patient-specific weight loss expectations. METHODS. Postoperative weight measurements after Roux-en-Y gastric bypass (RYGB) were collected and analyzed with patient characteristics known to influence weight loss outcomes. Quantile regression was used to create expected weight loss curves (25th, 50th, and 75th %tile) for the 24 months after RYGB. The resulting equations were validated and used to develop web-based tool for predicting weight loss outcomes. RESULTS. Weight loss data from 2986 patients (2608 in the primary cohort and 378 in the validation cohort) were included. Preoperative body mass index (BMI) and age were found to have a high correlation with weight loss accomplishment (P < 0.0001 for each). An electronic tool was created that provides easy access to patient-specific, 24-month weight loss trajectories based on initial BMI and age. CONCLUSIONS. This validated, patient-centered electronic tool will assist patients and providers in patient teaching, informed consent, and postoperative weight loss management.
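The modeling step described here, fitting the 25th, 50th, and 75th percentile weight loss curves against preoperative BMI and age, can be sketched with statsmodels' quantile regression. The cohort below is synthetic stand-in data with invented coefficients, not the study's:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic stand-in for the cohort: percent weight loss at 24 months as a
# noisy function of preoperative BMI and age (illustrative numbers only).
n = 500
df = pd.DataFrame({"bmi0": rng.uniform(35, 60, n), "age": rng.uniform(20, 70, n)})
df["pct_wl"] = (35 - 0.25 * (df["bmi0"] - 45)
                   - 0.10 * (df["age"] - 45) + rng.normal(0, 5, n))

# Fit the 25th, 50th, and 75th percentile curves, mirroring the study design.
for q in (0.25, 0.50, 0.75):
    fit = smf.quantreg("pct_wl ~ bmi0 + age", df).fit(q=q)
    print(f"q={q}:", fit.params.round(3).to_dict())
```

Unlike ordinary least squares, which models only the conditional mean, the three quantile fits bound a band of plausible trajectories, which is what lets a patient be shown a realistic range rather than a single average.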
NASA Astrophysics Data System (ADS)
Bakavos, Dimitrios; Chen, Yingchun; Babout, Laurent; Prangnell, Phil
2011-05-01
The requirement for a probe, or pin, in friction stir spot welding (FSSW) leads to an undesirable keyhole and "hooking," which can influence the fracture path and weld strength. Furthermore, the full weld cycle for FSSW is typically longer than ideal for the automotive industry, being 2 to 5 seconds. Here, it is shown that using a novel pinless tool design it is possible to achieve high lap shear strength (~3.4 kN) in thin aluminum sheet (~1 mm thick), with short weld cycle times (<1 second). Several techniques have been exploited to study the material flow and mechanisms of weld formation in pinless FSSW, including high-resolution X-ray tomography, to understand the role of the tool design and weld parameters. Despite the "simple" nature of a pinless tool, material flow in the weld zone was found to be surprisingly complex and strongly influenced by surface features on the tool, which greatly increased the penetration of the plastic zone into the bottom sheet. Because of the rapid thermal cycle and high level of grain refinement, the weld zone was found to develop a higher strength than the parent material with little evidence of a heat affected zone (HAZ) after postweld natural aging.
Current trends for customized biomedical software tools.
Khan, Haseeb Ahmad
2017-01-01
In the past, biomedical scientists were solely dependent on expensive commercial software packages for various applications. However, the advent of user-friendly programming languages and open source platforms has revolutionized the development of simple and efficient customized software tools for solving specific biomedical problems. Many of these tools are designed and developed by biomedical scientists independently or with the support of computer experts and are often made freely available for the benefit of the scientific community. The current trends for customized biomedical software tools are highlighted in this short review.
Evaluation of IOTA Simple Ultrasound Rules to Distinguish Benign and Malignant Ovarian Tumours.
Garg, Sugandha; Kaur, Amarjit; Mohi, Jaswinder Kaur; Sibia, Preet Kanwal; Kaur, Navkiran
2017-08-01
IOTA stands for the International Ovarian Tumour Analysis group. Ovarian cancer is one of the common cancers in women and in the majority of cases is diagnosed at a later stage. The limiting factor for early diagnosis is the lack of standardized terms and procedures in gynaecological sonography. The introduction of the IOTA rules has provided some consistency in defining the morphological features of ovarian masses through a standardized examination technique. To evaluate the efficacy of the IOTA simple ultrasound rules in distinguishing benign and malignant ovarian tumours, and to establish their use as a tool in the early diagnosis of ovarian malignancy, a hospital based case control prospective study was conducted. Patients with suspected ovarian pathology were evaluated using the IOTA ultrasound rules and designated as benign or malignant, and the findings were correlated with histopathological findings. Collected data were statistically analysed using the chi-square test and the kappa statistic. Of the initial 55 patients, 50 who underwent surgery were included in the final analysis. The IOTA simple rules were applicable in 45 of these 50 patients (90%). The sensitivity for the detection of malignancy in cases where the IOTA simple rules were applicable was 91.66% and the specificity was 84.84%; accuracy was 86.66%. Classifying inconclusive cases as malignant, the sensitivity and specificity were 93% and 80%, respectively. Agreement between ultrasound and histopathological diagnosis gave a kappa value of 0.323. The IOTA simple ultrasound rules were highly sensitive and specific in predicting ovarian malignancy preoperatively, while being reproducible and easy to train and use.
Susong, D.; Marks, D.; Garen, D.
1999-01-01
Topographically distributed energy- and water-balance models can accurately simulate both the development and melting of a seasonal snowcover in the mountain basins. To do this they require time-series climate surfaces of air temperature, humidity, wind speed, precipitation, and solar and thermal radiation. If data are available, these parameters can be adequately estimated at time steps of one to three hours. Unfortunately, climate monitoring in mountain basins is very limited, and the full range of elevations and exposures that affect climate conditions, snow deposition, and melt is seldom sampled. Detailed time-series climate surfaces have been successfully developed using limited data and relatively simple methods. We present a synopsis of the tools and methods used to combine limited data with simple corrections for the topographic controls to generate high temporal resolution time-series images of these climate parameters. Methods used include simulations, elevational gradients, and detrended kriging. The generated climate surfaces are evaluated at points and spatially to determine if they are reasonable approximations of actual conditions. Recommendations are made for the addition of critical parameters and measurement sites into routine monitoring systems in mountain basins.
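Of the methods named, the elevational-gradient step is the simplest to illustrate: a station observation is spread over a digital elevation model with a lapse rate, and detrended kriging would then interpolate the residuals. A minimal sketch, with made-up elevations and a typical environmental lapse rate as the assumed correction:

```python
import numpy as np

def distribute_temperature(dem, station_temp_c, station_elev_m,
                           lapse_rate_c_per_m=-0.0065):
    """Spread a station air temperature across a DEM grid using a constant
    elevational lapse rate (about -6.5 degC per km assumed here)."""
    return station_temp_c + lapse_rate_c_per_m * (dem - station_elev_m)

dem = np.array([[1500.0, 1800.0],
                [2100.0, 2400.0]])          # elevations in meters
temps = distribute_temperature(dem, station_temp_c=4.0, station_elev_m=1500.0)
print(temps)   # cooler with elevation: [[4.0, 2.05], [0.1, -1.85]]
```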
Wintermark, M; Zeineh, M; Zaharchuk, G; Srivastava, A; Fischbein, N
2016-07-01
A neuroradiologist's activity includes many tasks beyond interpreting relative value unit-generating imaging studies. Our aim was to test a simple method to record and quantify the non-relative value unit-generating clinical activity represented by consults and clinical conferences, including tumor boards. Four full-time neuroradiologists, each working an average of 50% clinical and 50% academic activity, systematically recorded all the non-relative value unit-generating consults and conferences in which they were involved during 3 months, using a simple Web-based application accessible from smartphones, tablets, or computers. The number and type of imaging studies they interpreted during the same period and the associated relative value units were extracted from our billing system. During the 3 months, the 4 neuroradiologists interpreted 4241 relative value unit-generating imaging studies, representing 8152 work relative value units. During the same period, they recorded 792 non-relative value unit-generating study reviews as part of consults and conferences (not including reading room consults), representing 19% of the interpreted relative value unit-generating imaging studies. We propose a simple Web-based smartphone app to record and quantify non-relative value unit-generating activities, including consults, clinical conferences, and tumor boards. The quantification of non-relative value unit-generating activities is paramount in this time of a paradigm shift from volume to value. It also represents an important tool for determining staffing levels, which cannot be based on relative value units alone, considering the amount of time radiologists spend on non-relative value unit-generating activities. It may also influence payment models from medical centers to radiology departments or practices. © 2016 by American Journal of Neuroradiology.
Moret, Whitney M
2018-01-01
Introduction: Economic strengthening practitioners are increasingly seeking data collection tools that will help them target households vulnerable to HIV and poor child well-being outcomes, match households to appropriate interventions, monitor their status, and determine readiness for graduation from project support. This article discusses efforts in 3 countries to develop simple, valid tools to quantify and classify economic vulnerability status. Methods and Findings: In Côte d'Ivoire, we conducted a cross-sectional survey with 3,749 households to develop a scale based on the definition of HIV-related economic vulnerability from the U.S. President's Emergency Plan for AIDS Relief (PEPFAR) for the purpose of targeting vulnerable households for PEPFAR-funded programs for orphans and vulnerable children. The vulnerability measures examined did not cluster in ways that would allow for the creation of a small number of composite measures, and thus we were unable to develop a scale. In Uganda, we assessed the validity of a vulnerability index developed to classify households according to donor classifications of economic status by measuring its association with a validated poverty measure, finding only a modest correlation. In South Africa, we developed monitoring and evaluation tools to assess economic status of individual adolescent girls and their households. We found no significant correlation with our validation measures, which included a validated measure of girls' vulnerability to HIV, a validated poverty measure, and subjective classifications generated by the community, data collector, and respondent. Overall, none of the measures of economic vulnerability used in the 3 countries varied significantly with their proposed validation items. Conclusion: Our findings suggest that broad constructs of economic vulnerability cannot be readily captured using simple scales to classify households and individuals in a way that accounts for a substantial amount of variance at locally defined vulnerability levels. We recommend that researchers and implementers design monitoring and evaluation instruments to capture narrower definitions of vulnerability based on characteristics programs intend to affect. We also recommend using separate tools for targeting based on context-specific indicators with evidence-based links to negative outcomes. Policy makers and donors should avoid reliance on simplified metrics of economic vulnerability in the programs they support. PMID:29496734
Geena 2, improved automated analysis of MALDI/TOF mass spectra.
Romano, Paolo; Profumo, Aldo; Rocco, Mattia; Mangerini, Rosa; Ferri, Fabio; Facchiano, Angelo
2016-03-02
Mass spectrometry (MS) is producing high volumes of data supporting oncological sciences, especially for translational research. Most of the related elaborations can be carried out by combining existing tools at different levels, but little is currently available for the automation of the fundamental steps. For the analysis of MALDI/TOF spectra, a number of pre-processing steps are required, including joining of isotopic abundances for a given molecular species, normalization of signals against an internal standard, background noise removal, averaging multiple spectra from the same sample, and aligning spectra from different samples. In this paper, we present Geena 2, a public software tool for the automated execution of these pre-processing steps for MALDI/TOF spectra. Geena 2 has been developed in a Linux-Apache-MySQL-PHP web development environment, with scripts in PHP and Perl. Input and output are managed as simple formats that can be consumed by any database system and spreadsheet software. Input data may also be stored in a MySQL database. Processing methods are based on original heuristic algorithms which are introduced in the paper. Three simple and intuitive web interfaces are available: the Standard Search Interface, which allows complete control over all parameters; the Bright Search Interface, which leaves to the user the possibility of tuning parameters for the alignment of spectra; and the Quick Search Interface, which limits the number of parameters to a minimum by using default values for the majority of parameters. Geena 2 has been utilized, in conjunction with a statistical analysis tool, in three published experimental works: a proteomic study on the effects of long-term cryopreservation on the low molecular weight fraction of the serum proteome, and two retrospective serum proteomic studies, one on the risk of developing breast cancer in patients affected by gross cystic disease of the breast (GCDB) and the other on the identification of a predictor of breast cancer mortality following breast cancer surgery, whose results were validated by ELISA, a completely alternative method. Geena 2 is a public tool for the automated pre-processing of MS data originated by MALDI/TOF instruments, with a simple and intuitive web interface. It is now under active development for the inclusion of further filtering options and for the adoption of standard formats for MS spectra.
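Two of the pre-processing steps listed, normalization against an internal standard and averaging replicate spectra, can be expressed generically in a few lines of numpy. This is an illustration of the steps only, not Geena 2's actual heuristic algorithms, and the peak positions and intensities are invented:

```python
import numpy as np

def normalize_to_standard(intensities, mz, standard_mz, tol=0.5):
    """Scale a spectrum so the internal-standard peak has unit intensity."""
    window = np.abs(mz - standard_mz) <= tol      # peak search window
    return intensities / intensities[window].max()

def average_replicates(spectra):
    """Average replicate spectra acquired on a common m/z axis."""
    return np.mean(np.vstack(spectra), axis=0)

mz = np.linspace(1000, 1010, 11)                  # toy common m/z axis
rep1 = np.array([0, 1, 2, 8, 2, 1, 0, 4, 1, 0, 0], float)
rep2 = np.array([0, 2, 3, 9, 2, 1, 0, 5, 1, 0, 0], float)
reps = [normalize_to_standard(r, mz, standard_mz=1003.0) for r in (rep1, rep2)]
print(average_replicates(reps))
```

Real spectra would first need baseline removal and peak alignment so that replicates genuinely share an m/z axis, which is exactly why Geena 2 chains these steps in a fixed order.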
Defining Uncertainty and Error in Planktic Foraminiferal Oxygen Isotope Measurements
NASA Astrophysics Data System (ADS)
Fraass, A. J.; Lowery, C.
2016-12-01
Foraminifera are the backbone of paleoceanography, and planktic foraminifera are one of the leading tools for reconstructing water column structure. Currently, there are unconstrained variables when dealing with the reproducibility of oxygen isotope measurements. This study presents the first results from a simple model of foraminiferal calcification (Foraminiferal Isotope Reproducibility Model; FIRM), designed to estimate the precision and accuracy of oxygen isotope measurements. FIRM produces synthetic isotope data using parameters including location, depth habitat, season, number of individuals included in a measurement, diagenesis, misidentification, size variation, and vital effects. Reproducibility is then tested using Monte Carlo simulations. The results from a series of experiments show that reproducibility is largely controlled by the number of individuals in each measurement, but is also strongly a function of local oceanography if the number of individuals is held constant. Parameters like diagenesis or misidentification have an impact on both the precision and the accuracy of the data. Currently, FIRM is a tool for estimating isotopic error values that is best employed in the Holocene. It is also a tool to explore the impact of myriad factors on the fidelity of paleoceanographic records. FIRM was constructed in the open-source computing environment R and is freely available via GitHub. We invite modification and expansion, and have planned additions for benthic foraminiferal reproducibility and stratigraphic uncertainty.
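FIRM itself is written in R, but the core Monte Carlo logic, averaging n individual shells drawn from a seasonally spread population and adding instrument noise, can be sketched in a few lines. All parameter values below are illustrative assumptions, not FIRM's defaults:

```python
import numpy as np

rng = np.random.default_rng(42)

def measurement(n_individuals, mu=-1.0, seasonal_sd=0.4, machine_sd=0.08):
    """One synthetic d18O measurement: the mean of n individuals drawn from
    a seasonally varying population, plus mass-spectrometer noise."""
    shells = rng.normal(mu, seasonal_sd, n_individuals)
    return shells.mean() + rng.normal(0.0, machine_sd)

for n in (1, 5, 10, 30):
    runs = [measurement(n) for _ in range(5000)]   # Monte Carlo replicates
    print(f"n={n:2d}  1-sigma reproducibility = {np.std(runs):.3f} permil")
```

Running this shows the paper's headline result in miniature: scatter shrinks rapidly as more individuals go into each measurement, with a floor set by the instrument and by the local seasonal spread.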
The SMOKE-FIREPLUME model: a tool for eventual application to prescribed burns and wildland fires.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D. F.; Dunn, W. E.; Lazaro, M. A.
Land managers are increasingly implementing strategies that employ the use of fire in prescribed burns to sustain ecosystems and plan to sustain the rate of increase in its use over the next five years. In planning and executing expanded use of fire in wildland treatment it is important to estimate the human health and safety consequences, property damage, and the extent of visibility degradation from the resulting conflagration-pyrolysis gases, soot and smoke generated during flaming, smoldering and/or glowing fires. Traditional approaches have often employed the analysis of weather observations and forecasts to determine whether a prescribed burn will affect populations, property, or protected Class I areas. However, the complexity of the problem lends itself to advanced PC-based models that are simple to use for both calculating the emissions from the burning of wildland fuels and the downwind dispersion of smoke and other products of pyrolysis, distillation, and/or fuels combustion. These models will need to address the effects of residual smoldering combustion, including plume dynamics and optical effects. In this paper, we discuss a suite of tools that can be applied for analyzing dispersion. These tools include the dispersion models FIREPLUME and SMOKE, together with the meteorological preprocessor SEBMET.
Development of a critical appraisal tool to assess the quality of cross-sectional studies (AXIS)
Downes, Martin J; Brennan, Marnie L; Williams, Hywel C; Dean, Rachel S
2016-01-01
Objectives The aim of this study was to develop a critical appraisal (CA) tool that addressed study design and reporting quality as well as the risk of bias in cross-sectional studies (CSSs). In addition, the aim was to produce a help document to guide the non-expert user through the tool. Design An initial scoping review of the published literature and key epidemiological texts was undertaken prior to the formation of a Delphi panel to establish key components for a CA tool for CSSs. A consensus of 80% was required from the Delphi panel for any component to be included in the final tool. Results An initial list of 39 components was identified through examination of existing resources. An international Delphi panel of 18 medical and veterinary experts was established. After 3 rounds of the Delphi process, the Appraisal tool for Cross-Sectional Studies (AXIS tool) was developed by consensus and consisted of 20 components. A detailed explanatory document was also developed with the tool, giving expanded explanation of each question and providing simple interpretations and examples of the epidemiological concepts being examined in each question to aid non-expert users. Conclusions CA of the literature is a vital step in evidence synthesis and therefore evidence-based decision-making in a number of different disciplines. The AXIS tool is therefore unique and was developed in a way that it can be used across disciplines to aid the inclusion of CSSs in systematic reviews, guidelines and clinical decision-making. PMID:27932337
Faiz, A S; Kaveney, A; Guo, S; Murphy, S; Philipp, C S
2017-09-01
Family members of Von Willebrand disease (VWD) patients may have low levels of VWF without major bleeding episodes and often remain undiagnosed. The purpose of this study was to assess the utility of a modified Screening Tool in identifying previously untested reproductive age female family members of VWD patients for haemostatic evaluation. Ninety-four reproductive age women including 41 previously untested family members of VWD patients, 26 previously diagnosed VWD patients and 27 healthy controls were administered a modified Screening Tool and had blood drawn for CBC, ferritin, and VWF testing. Participants completed a pictorial blood assessment chart (PBAC) with menses. The modified Screening Tool was positive in 32% family members, 77% VWD patients, and 19% controls (P < 0.001). Combined with low ferritin, the modified Screening Tool was positive in 66% family members, 92% VWD patients, and 44% controls (P = 0.001). In family members, incorporating low ferritin with the modified Screening Tool resulted in a sensitivity of 86% (95% CI, 42-100) and negative predictive value of 93% (95% CI, 66-100). In the control group, NPV was between 92% and 95% for the modified Screening Tool and also for the modified Screening Tool combined with low ferritin or a positive PBAC. These data in a racially diverse population suggest the usefulness of a simple, easy to administer modified Screening Tool. In conjunction with ferritin it could be used in a primary care setting to stratify reproductive age women with a family history of VWD for haemostatic evaluation. © 2017 John Wiley & Sons Ltd.
Mapping of Sample Collection Data: GIS Tools for the Natural Product Researcher
Oberlies, Nicholas H.; Rineer, James I.; Alali, Feras Q.; Tawaha, Khaled; Falkinham, Joseph O.; Wheaton, William D.
2009-01-01
Scientists engaged in the research of natural products often either conduct field collections themselves or collaborate with partners who do, such as botanists, mycologists, or SCUBA divers. The information gleaned from such collecting trips (e.g., longitude/latitude coordinates, geography, elevation, and a multitude of other field observations) has provided valuable data to the scientific community (e.g., on biodiversity), even if it is tangential to the direct aims of the natural products research, which are often focused on drug discovery and/or chemical ecology. Geographic Information Systems (GIS) have been used to display, manage, and analyze geographic data, including collection sites for natural products. However, to the uninitiated, these tools are often beyond the financial and/or computational means of the natural product scientist. With new, free, and easy-to-use geospatial visualization tools, such as Google Earth, mapping and geographic imaging of sampling data are now within the reach of natural products scientists. The goals of the present study were to develop simple tools that are tailored for the natural products setting, thereby presenting a means to map such information, particularly via open source software like Google Earth. PMID:20161345
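Since Google Earth consumes KML, the mapping step can be as simple as writing a placemark per collection site. The sketch below uses only the Python standard library and entirely hypothetical sample records; note that KML expects coordinates in lon,lat,elevation order:

```python
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
{placemarks}
  </Document>
</kml>"""

PLACEMARK = """    <Placemark>
      <name>{name}</name>
      <description>{desc}</description>
      <Point><coordinates>{lon},{lat},{elev}</coordinates></Point>
    </Placemark>"""

# Hypothetical collection records: (sample id, field notes, lon, lat, elevation m)
sites = [
    ("JO-001", "streamside shrub, flowering", 35.90, 32.05, 820),
    ("JO-002", "limestone outcrop", 35.72, 31.95, 1040),
]

placemarks = "\n".join(
    PLACEMARK.format(name=n, desc=d, lon=lon, lat=lat, elev=e)
    for n, d, lon, lat, e in sites
)
with open("collections.kml", "w") as f:
    f.write(KML_TEMPLATE.format(placemarks=placemarks))
```

Opening collections.kml in Google Earth then shows each site with its field notes attached, which is essentially the workflow the paper advocates for non-GIS specialists.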
The Adversarial Route Analysis Tool: A Web Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casson, William H. Jr.
2012-08-02
The Adversarial Route Analysis Tool is a kind of Google Maps for adversaries: a web-based geospatial application that helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.
Number Sense Made Simple Using Number Patterns
ERIC Educational Resources Information Center
Su, Hui Fang Huang; Marinas, Carol; Furner, Joseph
2011-01-01
This article highlights investigating intriguing number patterns utilising an emerging technology called the Square Tool. Mathematics teachers of grades K-12 will find the Square Tool useful in making connections and bridging the gap from the concrete to the abstract. Pattern recognition helps students discover various mathematical concepts. With…
Simple Educational Tool for Digital Speckle Shearography
ERIC Educational Resources Information Center
Spagnolo, Giuseppe Schirripa; Martocchia, Andrea; Papalillo, Donato; Cozzella, Lorenzo
2012-01-01
In this study, an educational tool has been prepared for obtaining short-term and more economic training on digital speckle shearography (DSS). Shearography non-destructive testing (NDT) has gained wide acceptance over the last decade, providing a number of important and exciting inspection solutions in aerospace, electronics and medical device…
ERIC Educational Resources Information Center
Jackson, Carrie
2013-01-01
When school leaders engage with their communities, they develop trusting relationships that support student, staff, and family learning. Digital and social media tools open doors and create opportunities to connect with families and communities in ways that have never been seen before. This article provides three simple ideas that can easily be…
drPACS: A Simple UNIX Execution Pipeline
NASA Astrophysics Data System (ADS)
Teuben, P.
2011-07-01
We describe a very simple yet flexible and effective pipeliner for UNIX commands. It creates a Makefile to define a set of serially dependent commands. The commands in the pipeline share a common set of parameters by which they can communicate. Commands must follow a simple convention to retrieve and store parameters. Pipeline parameters can optionally be made persistent across multiple runs of the pipeline. Tools were added to simplify running a large series of pipelines, which can then also be run in parallel.
Simple tools for assembling and searching high-density picolitre pyrophosphate sequence data.
Parker, Nicolas J; Parker, Andrew G
2008-04-18
The advent of pyrophosphate sequencing makes large volumes of sequencing data available at a lower cost than previously possible. However, the short read lengths are difficult to assemble and the large dataset is difficult to handle. During the sequencing of a virus from the tsetse fly, Glossina pallidipes, we found the need for tools to quickly search a set of reads for near-exact text matches. A set of tools is provided to search a large data set of pyrophosphate sequence reads under a "live" CD version of Linux on a standard PC; the tools can be used by anyone without prior knowledge of Linux and without having to install a Linux setup on the computer. The tools permit short lengths of de novo assembly, checking of existing assembled sequences, selection and display of reads from the data set, and gathering counts of sequences in the reads. Demonstrations are given of the use of the tools to help with checking an assembly against the fragment data set; investigating homopolymer lengths, repeat regions and polymorphisms; and resolving inserted bases caused by incomplete chain extension. The additional information contained in a pyrophosphate sequencing data set beyond a basic assembly is difficult to access due to a lack of tools. The set of simple tools presented here allows anyone with basic computer skills and a standard PC to access this information.
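The central operation, scanning reads for near-exact matches to a short query, reduces to a sliding-window Hamming comparison. A minimal sketch with invented reads follows; the actual tools described are utilities on a live CD, not this Python:

```python
def hamming_hits(read, query, max_mismatches=1):
    """Yield (position, mismatches) wherever query matches read with
    at most max_mismatches substitutions."""
    q = len(query)
    for i in range(len(read) - q + 1):
        window = read[i:i + q]
        mismatches = sum(a != b for a, b in zip(window, query))
        if mismatches <= max_mismatches:
            yield i, mismatches

reads = ["ACGTTAGGCT", "TTAGCCTACG", "GGGACGTTAG"]   # toy pyrosequencing reads
query = "ACGTTG"
for r in reads:
    for pos, mm in hamming_hits(r, query):
        print(r, "matches at", pos, "with", mm, "mismatch(es)")
```

A tolerance of one or two mismatches is what makes such searches useful on pyrosequencing data, where homopolymer runs routinely produce single-base insertion and substitution errors.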
NASA Astrophysics Data System (ADS)
Takahashi, Leo
2011-03-01
The use of animation as a teaching tool has long been of interest to the readers of and contributors to this journal [1-5]. While the sophisticated techniques presented in the cited papers are excellent and useful, there is one overlooked technique that may be of interest to the teacher who wants something quick and simple to enhance classroom presentations: PowerPoint animation.
2009-02-12
equivalent to usual printing or typescript. Can read either representations of familiar formulaic verbal exchanges or simple language containing only... read simple, authentic written material in a form equivalent to usual printing or typescript on subjects within a familiar context. Able to read with...
Using High Speed Smartphone Cameras and Video Analysis Techniques to Teach Mechanical Wave Physics
ERIC Educational Resources Information Center
Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano
2017-01-01
We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allows one to measure, in a simple yet rigorous way, the speed of pulses…
Comparison of different objective functions for parameterization of simple respiration models
M.T. van Wijk; B. van Putten; D.Y. Hollinger; A.D. Richardson
2008-01-01
The eddy covariance measurements of carbon dioxide fluxes collected around the world offer a rich source for detailed data analysis. Simple, aggregated models are attractive tools for gap filling, budget calculation, and upscaling in space and time. Key in the application of these models is their parameterization and a robust estimate of the uncertainty and reliability...
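To make the comparison concrete, the sketch below fits a simple Q10 respiration model, R(T) = R10 * Q10^((T-10)/10), to synthetic flux data under two different objective functions, squared versus absolute error. All data and parameter values are invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic nighttime CO2 efflux vs. temperature (illustrative data only).
rng = np.random.default_rng(1)
temp = rng.uniform(0, 25, 200)
resp = 1.8 * 2.1 ** ((temp - 10.0) / 10.0) + rng.normal(0, 0.4, 200)

def model(params, t):
    r10, q10 = params
    return r10 * q10 ** ((t - 10.0) / 10.0)

def sse(params):   # ordinary least-squares objective
    return np.sum((resp - model(params, temp)) ** 2)

def sae(params):   # absolute-error objective, less sensitive to outliers
    return np.sum(np.abs(resp - model(params, temp)))

for name, obj in [("least squares", sse), ("absolute error", sae)]:
    fit = minimize(obj, x0=[1.0, 2.0], method="Nelder-Mead")
    print(f"{name}: R10={fit.x[0]:.2f}, Q10={fit.x[1]:.2f}")
```

With heavy-tailed flux noise the two objectives can pull the fitted Q10 apart noticeably, which is the kind of sensitivity the paper's comparison addresses.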
A Medieval Clock Made out of Simple Materials
ERIC Educational Resources Information Center
Danese, B.; Oss, S.
2008-01-01
A cheap replica of the verge-and-foliot clock has been built from simple materials. It is a didactic tool of great power for physics teaching at every stage of schooling, in particular at university level. An account is given of its construction and its working principles, together with motivated examples of a few activities. (Contains 3 tables…
A Simple and Effective Protein Folding Activity Suitable for Large Lectures
ERIC Educational Resources Information Center
White, Brian
2006-01-01
This article describes a simple and inexpensive hands-on simulation of protein folding suitable for use in large lecture classes. This activity uses a minimum of parts, tools, and skill to simulate some of the fundamental principles of protein folding. The major concepts targeted are that proteins begin as linear polypeptides and fold to…
Forced tearing of ductile and brittle thin sheets.
Tallinen, T; Mahadevan, L
2011-12-09
Tearing a thin sheet by forcing a rigid object through it leads to complex crack morphologies; a single oscillatory crack arises when a tool is driven laterally through a brittle sheet, while two diverging cracks and a series of concertinalike folds forms when a tool is forced laterally through a ductile sheet. On the other hand, forcing an object perpendicularly through the sheet leads to radial petallike tears in both ductile and brittle materials. To understand these different regimes we use a combination of experiments, simulations, and simple theories. In particular, we describe the transition from brittle oscillatory tearing via a single crack to ductile concertina tearing with two tears by deriving laws that describe the crack paths and wavelength of the concertina folds and provide a simple phase diagram for the morphologies in terms of the material properties of the sheet and the relative size of the tool.
Increasing Access and Usability of Remote Sensing Data: The NASA Protected Area Archive
NASA Technical Reports Server (NTRS)
Geller, Gary N.
2004-01-01
Although remote sensing data are now widely available, much of it at low or no cost, many managers of protected conservation areas do not have the expertise or tools to view or analyze it. Thus access to it by the protected area management community is effectively blocked. The Protected Area Archive will increase access to remote sensing data by creating collections of satellite images of protected areas and packaging them with simple-to-use visualization and analytical tools. The user can easily locate the area and image of interest on a map, then display, roam, and zoom the image. A set of simple tools will be provided so the user can explore the data and employ it to assist in management and monitoring of their area. The 'Phase 1' version requires only a Windows-based computer and basic computer skills, and may be of particular help to protected area managers in developing countries.
The Simple Video Coder: A free tool for efficiently coding social video data.
Barto, Daniel; Bird, Clark W; Hamilton, Derek A; Fink, Brandi C
2017-08-01
Videotaping of experimental sessions is a common practice across many disciplines of psychology, ranging from clinical therapy, to developmental science, to animal research. Audio-visual data are a rich source of information that can be easily recorded; however, analysis of the recordings presents a major obstacle to project completion. Coding behavior is time-consuming and often requires ad hoc training of a student coder. In addition, existing software is either prohibitively expensive or cumbersome, which leaves researchers with inadequate tools to quickly process video data. We offer the Simple Video Coder: free, open-source software for behavior coding that is flexible in accommodating different experimental designs, is intuitive for students to use, and produces outcome measures of event timing, frequency, and duration. Finally, the software also offers extraction tools to splice video into coded segments suitable for training future human coders or for use as input for pattern classification algorithms.
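The outcome measures named, event timing, frequency, and duration, amount to simple aggregation over coded onset/offset pairs. A sketch with a hypothetical event table follows; this is our illustration, not the Simple Video Coder's file format:

```python
import pandas as pd

# Hypothetical coded events: one row per onset/offset pair, in seconds.
events = pd.DataFrame({
    "behavior": ["gesture", "vocalize", "gesture", "gesture"],
    "onset":  [2.0, 5.5, 9.0, 14.2],
    "offset": [3.5, 7.0, 12.0, 15.0],
})
events["duration"] = events["offset"] - events["onset"]

# Per-behavior frequency, total time, and mean bout length.
summary = events.groupby("behavior")["duration"].agg(
    frequency="count", total_duration="sum", mean_duration="mean"
)
print(summary)
```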
Gwee, Kok-Ann; Bergmans, Paul; Kim, JinYong; Coudsy, Bogdana; Sim, Angelia; Chen, Minhu; Lin, Lin; Hou, Xiaohua; Wang, Huahong; Goh, Khean-Lee; Pangilinan, John A; Kim, Nayoung; des Varannes, Stanislas Bruley
2017-01-01
Background/Aims There is a need for a simple and practical tool adapted for the diagnosis of chronic constipation (CC) in the Asian population. This study compared the Asian Neurogastroenterology and Motility Association (ANMA) CC tool and the Rome III criteria for the diagnosis of CC in Asian subjects. Methods This multicenter, cross-sectional study included subjects presenting at outpatient gastrointestinal clinics across Asia. Subjects with CC alert symptoms completed a combination Diagnosis Questionnaire to obtain a diagnosis based on 4 different diagnostic methods: self-defined, investigator's judgment, the ANMA CC tool, and the Rome III criteria. The primary endpoint was the level of agreement/disagreement between the ANMA CC diagnostic tool and the Rome III criteria for the diagnosis of CC. Results The primary analysis comprised 449 subjects, 414 of whom had a positive diagnosis according to the ANMA CC tool. Rome III positive/ANMA positive and Rome III negative/ANMA negative diagnoses were reported in 76.8% and 7.8% of subjects, respectively, resulting in an overall percentage agreement of 84.6% between the 2 diagnostic methods. The overall percentage disagreement between these 2 diagnostic methods was 15.4%. A higher level of agreement was seen between the ANMA CC tool and the self-defined (374 subjects [90.3%]) or investigator's judgment criteria (388 subjects [93.7%]) than with the Rome III criteria. Conclusion This study demonstrates that the ANMA CC tool can be useful for Asian patients with CC. PMID:27764907
Supervised learning of tools for content-based search of image databases
NASA Astrophysics Data System (ADS)
Delanoy, Richard L.
1996-03-01
A computer environment, called the Toolkit for Image Mining (TIM), is being developed with the goal of enabling users with diverse interests and varied computer skills to create search tools for content-based image retrieval and other pattern matching tasks. Search tools are generated using a simple paradigm of supervised learning that is based on the user pointing at mistakes of classification made by the current search tool. As mistakes are identified, a learning algorithm uses the identified mistakes to build up a model of the user's intentions, construct a new search tool, apply the search tool to a test image, display the match results as feedback to the user, and accept new inputs from the user. Search tools are constructed in the form of functional templates, which are generalized matched filters capable of knowledge-based image processing. The ability of this system to learn the user's intentions from experience contrasts with other existing approaches to content-based image retrieval that base searches on the characteristics of a single input example or on a predefined and semantically constrained textual query. Currently, TIM is capable of learning spectral and textural patterns, but should be adaptable to the learning of shapes as well. Possible applications of TIM include not only content-based image retrieval, but also quantitative image analysis, the generation of metadata for annotating images, data prioritization or data reduction in bandwidth-limited situations, and the construction of components for larger, more complex computer vision algorithms.
NASA Astrophysics Data System (ADS)
Gadhath, Arpitha Rao
The purpose of this thesis is to build an interactive Geographical Information System (GIS) tool relating to the series of events that occurred during the Battle of France in World War II. The tool gives an insight into the countries involved in the battle, their allies, and their strategies. It was created as a one-stop source of information about all the important battles that took place and led to the fall of France. The tool brings together the maps of all the countries involved. Integrated with each map is the data relevant to that map; for each country this includes the place of attack, the strategies used during the attack, and the kind of warfare. The tool also makes use of HTML files to present this information, along with images from the time of the war and footage explaining each battle. The tool was built using Java, along with MOJO (Map Objects Java Objects), to develop the maps of each of the countries. MOJO is developed by ESRI (Environmental Systems Research Institute) and makes it easy to add data to the maps; it also simplifies highlighting important information through pop-up windows, charts, and infographics. The HTML files were designed using the open-source template developed by Bootstrap. The tool is built in such a way that the interface is simple and easy for the user to use and understand.
PyFolding: Open-Source Graphing, Simulation, and Analysis of the Biophysical Properties of Proteins.
Lowe, Alan R; Perez-Riba, Albert; Itzhaki, Laura S; Main, Ewan R G
2018-02-06
For many years, curve-fitting software has been heavily utilized to fit simple models to various types of biophysical data. Although such software packages are easy to use for simple functions, they are often expensive and present substantial impediments to applying more complex models or for the analysis of large data sets. One field that is reliant on such data analysis is the thermodynamics and kinetics of protein folding. Over the past decade, increasingly sophisticated analytical models have been generated, but without simple tools to enable routine analysis. Consequently, users have needed to generate their own tools or otherwise find willing collaborators. Here we present PyFolding, a free, open-source, and extensible Python framework for graphing, analysis, and simulation of the biophysical properties of proteins. To demonstrate the utility of PyFolding, we have used it to analyze and model experimental protein folding and thermodynamic data. Examples include: 1) multiphase kinetic folding fitted to linked equations, 2) global fitting of multiple data sets, and 3) analysis of repeat protein thermodynamics with Ising model variants. Moreover, we demonstrate how PyFolding is easily extensible to novel functionality beyond applications in protein folding via the addition of new models. Example scripts to perform these and other operations are supplied with the software, and we encourage users to contribute notebooks and models to create a community resource. Finally, we show that PyFolding can be used in conjunction with Jupyter notebooks as an easy way to share methods and analysis for publication and among research teams. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
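As an example of the kind of model such a framework fits, the sketch below fits a standard two-state equilibrium unfolding curve with scipy. It deliberately does not use PyFolding's own API, whose function names we have not verified here, and the denaturation data are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

RT = 8.314e-3 * 298.0   # kJ/mol at 25 degC

def two_state(den, dg_h2o, m, y_folded, y_unfolded):
    """Two-state equilibrium unfolding: observed signal vs. denaturant.
    Uses the linear free-energy relationship dG = dG_H2O - m*[den]."""
    dg = dg_h2o - m * den
    k_eq = np.exp(-dg / RT)                 # folded <-> unfolded equilibrium
    f_unfolded = k_eq / (1.0 + k_eq)
    return y_folded + (y_unfolded - y_folded) * f_unfolded

# Synthetic denaturation curve with a little noise (illustrative values).
den = np.linspace(0, 8, 25)
signal = two_state(den, 20.0, 5.0, 1.0, 0.1) \
         + np.random.default_rng(3).normal(0, 0.01, 25)

popt, _ = curve_fit(two_state, den, signal, p0=[15.0, 4.0, 1.0, 0.0])
print("dG_H2O = %.1f kJ/mol, m = %.1f kJ/mol/M" % (popt[0], popt[1]))
```

PyFolding's value, as the abstract notes, is that such models, and much more elaborate ones like Ising variants for repeat proteins, come packaged and shareable rather than rebuilt ad hoc in each lab.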
Saranya, Raju; Aarthi, Raju; Sankaran, Krishnan
2015-05-01
The spread of drug-resistant Staphylococcus spp. into communities poses a danger, demanding effective non-invasive and non-destructive tools for early detection and surveillance. Characteristic volatile organic compounds (VOCs) produced by bacteria offer new diagnostic targets and novel approaches not exploited so far in infectious disease diagnostics. Our search for such a characteristic VOC for Staphylococcus spp. led to the identification of 2-[3-acetoxy-4,4,14-trimethylandrost-8-en-17-yl] propanoic acid (ATMAP), a moderately volatile compound detected both in the culture and in the headspace when the organism was grown in tryptone soya broth (TSB) medium. A simple and inexpensive colorimetric method (colour change from yellow to orange) using methyl red as the pH indicator provided an absolutely specific way of identifying Staphylococcus spp. The assay, performed in liquid cultures (7-h growth in TSB) as well as in the headspace of plate cultures (grown for 10 h on TSA), was optimised in 96-well and 12-well plate formats, respectively, employing a set of positive and negative strains. Only Staphylococcus spp. showed the distinct colour change from yellow to orange due to the production of the above VOC, while in the case of other organisms the reagent remained yellow. The method, validated using known clinical and environmental strains (56, including Staphylococcus, Proteus, Pseudomonas, Klebsiella, Bacillus, Shigella and Escherichia coli), was found to be highly efficient, showing 100% specificity and sensitivity. Such simple methods of bacterial pathogen identification are expected to form the next-generation tools for the control of infectious diseases through early detection and surveillance of causative agents.
Zhou, Ting; Wang, Bangyan; Liu, Huiquan; Yang, Kaixiang; Thapa, Sudip; Zhang, Haowen; Li, Lu
2018-01-01
Background Cachexia is a multifactorial syndrome that is highly prevalent in advanced cancer patients and leads to progressive functional impairments. The classification of cachexia stages is essential for diagnosing and treating cachexia. However, there is a lack of simple tools with good discrimination for classifying cachexia stages. Therefore, our study aimed to develop a clinically applicable cachexia staging score (CSS) and validate its discrimination of clinical outcomes for different cachexia stages. Methods Advanced cancer patients were enrolled in our study. A CSS comprising the following five components was developed: weight loss, a simple questionnaire of sarcopenia (SARC-F), Eastern Cooperative Oncology Group (ECOG) performance status, appetite loss, and abnormal biochemistry. According to the CSS, patients were classified into non-cachexia, pre-cachexia, cachexia, and refractory cachexia stages, and clinical outcomes were compared among the four groups. Results Of the 297 participating patients, data from 259 patients were ultimately included. Based on the CSS, patients were classified into non-cachexia (n = 69), pre-cachexia (n = 68), cachexia (n = 103), and refractory cachexia (n = 19) stages. Patients with more severe cachexia stages had lower skeletal muscle indexes (P = 0.002 and P = 0.004 in male and female patients, respectively), higher prevalence of sarcopenia (P = 0.017 and P = 0.027 in male and female patients, respectively), more severe symptom burden (P < 0.001), poorer quality of life (P < 0.001 for all subscales except social well-being), and shorter survival times (P < 0.001). Conclusions The CSS is a simple and clinically applicable tool with excellent discrimination for classifying cachexia stages. This score is extremely useful for the clinical treatment and prognosis of cachexia and for designing clinical trials. PMID:29372594
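To show the shape of a component-based staging score, here is a toy classifier over the five listed components. The point weights and cut-offs are invented for illustration only and are not the published CSS, whose actual scoring must be taken from the paper:

```python
def cachexia_stage(weight_loss_pct, sarc_f, ecog, appetite_loss, abnormal_biochem):
    """Toy staging over the five CSS components. All thresholds and point
    values below are hypothetical, chosen only to illustrate the structure."""
    score = 0
    score += 2 if weight_loss_pct >= 5 else (1 if weight_loss_pct >= 2 else 0)
    score += 1 if sarc_f >= 4 else 0          # SARC-F sarcopenia screen
    score += 1 if ecog >= 2 else 0            # ECOG performance status
    score += 1 if appetite_loss else 0
    score += 1 if abnormal_biochem else 0
    if score == 0:
        return "non-cachexia"
    if score <= 2:
        return "pre-cachexia"
    if score <= 4:
        return "cachexia"
    return "refractory cachexia"

print(cachexia_stage(6.0, 5, 2, True, True))   # -> "refractory cachexia"
```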
System for exchanging tools and end effectors on a robot
Burry, David B.; Williams, Paul M.
1991-02-19
A system and method for exchanging tools and end effectors on a robot permits exchange during a programmed task. The exchange mechanism is located off the robot, thus reducing the mass of the robot arm and permitting smaller robots to perform designated tasks. A simple spring/collet mechanism mounted on the robot is used, which permits the engagement and disengagement of the tool or end effector without the need for rotational orientation of the tool to the end effector/collet interface. As the tool-changing system is not located on the robot arm, no umbilical cords are located on the robot.
Regression Models for Identifying Noise Sources in Magnetic Resonance Images
Zhu, Hongtu; Li, Yimei; Ibrahim, Joseph G.; Shi, Xiaoyan; An, Hongyu; Chen, Yashen; Gao, Wei; Lin, Weili; Rowe, Daniel B.; Peterson, Bradley S.
2009-01-01
Stochastic noise, susceptibility artifacts, magnetic field and radiofrequency inhomogeneities, and other noise components in magnetic resonance images (MRIs) can introduce serious bias into any measurements made with those images. We formally introduce three regression models including a Rician regression model and two associated normal models to characterize stochastic noise in various magnetic resonance imaging modalities, including diffusion-weighted imaging (DWI) and functional MRI (fMRI). Estimation algorithms are introduced to maximize the likelihood function of the three regression models. We also develop a diagnostic procedure for systematically exploring MR images to identify noise components other than simple stochastic noise, and to detect discrepancies between the fitted regression models and MRI data. The diagnostic procedure includes goodness-of-fit statistics, measures of influence, and tools for graphical display. The goodness-of-fit statistics can assess the key assumptions of the three regression models, whereas measures of influence can isolate outliers caused by certain noise components, including motion artifacts. The tools for graphical display permit graphical visualization of the values for the goodness-of-fit statistic and influence measures. Finally, we conduct simulation studies to evaluate performance of these methods, and we analyze a real dataset to illustrate how our diagnostic procedure localizes subtle image artifacts by detecting intravoxel variability that is not captured by the regression models. PMID:19890478
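For concreteness, the stochastic model underlying the first of these regressions can be written down explicitly. The following is the standard Rician density for an MR magnitude signal (our notation, chosen for illustration, and not necessarily the paper's):

```latex
p(y \mid \mu, \sigma) \;=\; \frac{y}{\sigma^{2}}\,
\exp\!\left(-\frac{y^{2}+\mu^{2}}{2\sigma^{2}}\right)
I_{0}\!\left(\frac{y\,\mu}{\sigma^{2}}\right), \qquad y \ge 0,
```

where y is the observed magnitude, μ the noise-free intensity, σ the noise level, and I₀ the modified Bessel function of the first kind of order zero. At high signal-to-noise ratio (μ/σ large) this density is approximately normal, which is what makes the two associated normal models natural companions to the Rician regression.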
MOAB : a mesh-oriented database.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tautges, Timothy James; Ernst, Corey; Stimpson, Clint
A finite element mesh is used to decompose a continuous domain into a discretized representation. The finite element method solves PDEs on this mesh by modeling complex functions as a set of simple basis functions with coefficients at mesh vertices and prescribed continuity between elements. The mesh is one of the fundamental types of data linking the various tools in the FEA process (mesh generation, analysis, visualization, etc.). Thus, the representation of mesh data and operations on those data play a very important role in FEA-based simulations. MOAB is a component for representing and evaluating mesh data. MOAB can store structured and unstructured mesh, consisting of elements in the finite element 'zoo'. The functional interface to MOAB is simple yet powerful, allowing the representation of many types of metadata commonly found on the mesh. MOAB is optimized for efficiency in space and time, based on access to mesh in chunks rather than through individual entities, while also versatile enough to support individual entity access.

The MOAB data model consists of a mesh interface instance, mesh entities (vertices and elements), sets, and tags. Entities are addressed through handles rather than pointers, to allow the underlying representation of an entity to change without changing the handle to that entity. Sets are arbitrary groupings of mesh entities and other sets. Sets also support parent/child relationships as a relation distinct from sets containing other sets. The directed graph provided by set parent/child relationships is useful for modeling topological relations from a geometric model or other metadata. Tags are named data which can be assigned to the mesh as a whole, individual entities, or sets. Tags are a mechanism for attaching data to individual entities, and sets are a mechanism for describing relations between entities; the combination of these two mechanisms is a powerful yet simple interface for representing metadata or application-specific data. For example, sets and tags can be used together to describe geometric topology, boundary condition, and inter-processor interface groupings in a mesh.

MOAB is used in several ways in various applications. MOAB serves as the underlying mesh data representation in the VERDE mesh verification code. MOAB can also be used as a mesh input mechanism, using mesh readers included with MOAB, or as a translator between mesh formats, using readers and writers included with MOAB.

The remainder of this report is organized as follows. Section 2, 'Getting Started', provides a few simple examples of using MOAB to perform simple tasks on a mesh. Section 3 discusses the MOAB data model in more detail, including some aspects of the implementation. Section 4 summarizes the MOAB function API. Section 5 describes some of the tools included with MOAB, and the implementation of mesh readers/writers for MOAB. Section 6 contains a brief description of MOAB's relation to the TSTT mesh interface. Section 7 gives a conclusion and future plans for MOAB development. Section 8 gives references cited in this report. A reference description of the full MOAB API is contained in Section 9.
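To make the entity/set/tag data model concrete, here is a deliberately simplified sketch in Python. This is not MOAB's actual API (MOAB is a C++ library); every class and method name below is hypothetical, invented only to illustrate handles, sets, parent/child links, and tags:

```python
# Hypothetical sketch of a MOAB-style mesh data model (not MOAB's real API):
# entities are addressed by integer handles, sets group entities, and tags
# attach named data to entities or sets.

class MiniMeshDB:
    def __init__(self):
        self._next_handle = 1
        self.entities = {}   # handle -> ("vertex", coords) or ("element", connectivity)
        self.sets = {}       # set handle -> set of member handles
        self.children = {}   # set handle -> list of child set handles
        self.tags = {}       # (tag name, handle) -> value

    def _new_handle(self):
        h = self._next_handle
        self._next_handle += 1
        return h

    def create_vertex(self, xyz):
        h = self._new_handle()
        self.entities[h] = ("vertex", tuple(xyz))
        return h

    def create_element(self, connectivity):
        h = self._new_handle()
        self.entities[h] = ("element", tuple(connectivity))
        return h

    def create_set(self, members=()):
        h = self._new_handle()
        self.sets[h] = set(members)
        self.children[h] = []
        return h

    def add_child(self, parent, child):
        # parent/child links are a relation distinct from set membership
        self.children[parent].append(child)

    def tag_set(self, name, handle, value):
        self.tags[(name, handle)] = value

# Usage: tag a set of boundary vertices with a boundary-condition id.
db = MiniMeshDB()
v = [db.create_vertex((x, 0.0, 0.0)) for x in (0.0, 1.0, 2.0)]
bc = db.create_set(members=v[:2])
db.tag_set("NEUMANN_SET", bc, 1)
print(db.tags)
```

Because everything is addressed through handles, the storage behind an entity could change (say, from a Python tuple to a packed array) without invalidating any handle held by an application, which is the design point the report's handle-versus-pointer remark is making.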
Sun, Changhong; Fan, Yu; Li, Juan; Wang, Gancheng; Zhang, Hanshuo; Xi, Jianzhong Jeff
2015-02-01
Transcription activator-like effectors (TALEs) are becoming powerful DNA-targeting tools in a variety of mammalian cells and model organisms. However, generating a stable cell line with specific gene mutations in a simple and rapid manner remains a challenging task. Here, we report a new method to efficiently produce monoclonal cells using integrated TALE nuclease technology and a series of high-throughput cell cloning approaches. Following this method, we obtained three mTOR mutant 293T cell lines within 2 months, which included one homozygous mutant line. © 2014 Society for Laboratory Automation and Screening.
An overview of C. elegans biology.
Strange, Kevin
2006-01-01
The establishment of Caenorhabditis elegans as a "model organism" began with the efforts of Sydney Brenner in the early 1960s. Brenner's focus was to find a suitable animal model in which the tools of genetic analysis could be used to define molecular mechanisms of development and nervous system function. C. elegans provides numerous experimental advantages for such studies. These advantages include a short life cycle, production of large numbers of offspring, easy and inexpensive laboratory culture, forward and reverse genetic tractability, and a relatively simple anatomy. This chapter will provide a brief overview of C. elegans biology.
Nash, Aaron; Soheili, Arash; Tambar, Uttam K
2013-09-20
Unnatural cyclic amino acids are valuable tools in biomedical research and drug discovery. A two-step stereoselective strategy for converting simple glycine-derived aminoesters into unnatural cyclic amino acid derivatives has been developed. The process includes a palladium-catalyzed tandem allylic amination/[2,3]-Stevens rearrangement followed by a ruthenium-catalyzed ring-closing metathesis. The [2,3]-rearrangement proceeds with high diastereoselectivity through an exo transition state. Oppolzer's chiral auxiliary was utilized to access an enantiopure cyclic amino acid by this approach, which will enable future biological applications.
An accessible four-dimensional treatment of Maxwell's equations in terms of differential forms
NASA Astrophysics Data System (ADS)
Sá, Lucas
2017-03-01
Maxwell’s equations are derived in terms of differential forms in the four-dimensional Minkowski representation, starting from the three-dimensional vector calculus differential version of these equations. All the mathematical and physical concepts needed (including the tool of differential forms) are introduced, and using only knowledge of elementary vector calculus and the local vector version of Maxwell’s equations, the equations are reduced to a simple and elegant set of two equations for a unified quantity, the electromagnetic field. The treatment should be accessible to students taking a first course on electromagnetism.
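The "simple and elegant set of two equations" has a well-known form; in one common convention (units and normalizations vary between textbooks) it reads:

```latex
\mathrm{d}F = 0, \qquad \mathrm{d}\,{\star}F = J,
```

where F is the electromagnetic field 2-form, ⋆ is the Hodge star associated with the Minkowski metric, and J is the current 3-form. The first equation packages the homogeneous pair (absence of magnetic monopoles and Faraday's law); the second packages Gauss's law and the Ampère-Maxwell law.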
A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program
NASA Technical Reports Server (NTRS)
Bartoszek, J. T.; Huckins, B.; Coyle, M.
1979-01-01
A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.
Zheng, Hua; Rosal, Milagros C; Li, Wenjun; Borg, Amy; Yang, Wenyun; Ayers, David C
2018-01-01
Background Data-driven surgical decisions will ensure proper use and timing of surgical care. We developed a Web-based patient-centered treatment decision and assessment tool to guide treatment decisions among patients with advanced knee osteoarthritis who are considering total knee replacement surgery. Objective The aim of this study was to examine user experience and acceptance of the Web-based treatment decision support tool among older adults. Methods User-centered formative and summative evaluations were conducted for the tool. A sample of 28 patients who were considering total knee replacement participated in the study. Participants’ responses to the user interface design, the clarity of information, as well as the usefulness, satisfaction, and acceptance of the tool were collected through qualitative (ie, individual patient interviews) and quantitative (ie, standardized Computer System Usability Questionnaire) methods. Results Participants were older adults with a mean age of 63 (SD 11) years. Three-quarters of them had no technical questions while using the tool. User interface design recommendations included larger fonts, bigger buttons, fewer colors, simpler navigation without an extra “next page” click, less mouse movement, and clearer illustrations with simple graphs. Color-coded bar charts and outcome-specific graphs framed as positive actions were the easiest formats for them to understand the outcomes data. Questionnaire data revealed high satisfaction with the tool's usefulness and interface quality, and also showed ease of use of the tool, regardless of age or educational status. Conclusions We evaluated the usability of a patient-centered decision support tool designed for patients with advanced knee arthritis to facilitate their knee osteoarthritis treatment decision making. The lessons learned can inform other decision support tools, improving interface and content design for older patients' use. PMID:29712620
A stereoscopic look into the bulk
Czech, Bartlomiej; Lamprou, Lampros; McCandlish, Samuel; ...
2016-07-26
Here, we present the foundation for a holographic dictionary with depth perception. The dictionary consists of natural CFT operators whose duals are simple, diffeomorphism-invariant bulk operators. The CFT operators of interest are the “OPE blocks,” contributions to the OPE from a single conformal family. In holographic theories, we show that the OPE blocks are dual at leading order in 1/N to integrals of effective bulk fields along geodesics or homogeneous minimal surfaces in anti-de Sitter space. One widely studied example of an OPE block is the modular Hamiltonian, which is dual to the fluctuation in the area of a minimal surface. Thus, our operators pave the way for generalizing the Ryu-Takayanagi relation to other bulk fields. Although the OPE blocks are non-local operators in the CFT, they admit a simple geometric description as fields in kinematic space — the space of pairs of CFT points. We develop the tools for constructing local bulk operators in terms of these non-local objects. The OPE blocks also allow for conceptually clean and technically simple derivations of many results known in the literature, including linearized Einstein’s equations and the relation between conformal blocks and geodesic Witten diagrams.
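Schematically, the geodesic statement above can be written in one line for a scalar exchange (a sketch of the leading-order relation, with normalization factors omitted and notation ours):

```latex
B_{k}(x_{1},x_{2}) \;\propto\; \int_{\gamma(x_{1},x_{2})} \mathrm{d}s\;\; \varphi_{k}\big(x(s)\big),
```

where B_k is the OPE block for the exchanged conformal family, φ_k the dual effective bulk field, and γ(x₁, x₂) the bulk geodesic anchored at the two boundary points.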
The Krylov accelerated SIMPLE(R) method for flow problems in industrial furnaces
NASA Astrophysics Data System (ADS)
Vuik, C.; Saghir, A.; Boerstoel, G. P.
2000-08-01
Numerical modeling of the melting and combustion process is an important tool in gaining understanding of the physical and chemical phenomena that occur in a gas- or oil-fired glass-melting furnace. The incompressible Navier-Stokes equations are used to model the gas flow in the furnace. The discrete Navier-Stokes equations are solved by the SIMPLE(R) pressure-correction method. In these applications, many SIMPLE(R) iterations are necessary to obtain an accurate solution. In this paper, Krylov accelerated versions are proposed: GCR-SIMPLE(R). The properties of these methods are investigated for a simple two-dimensional flow. Thereafter, the efficiencies of the methods are compared for three-dimensional flows in industrial glass-melting furnaces.
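The acceleration idea generalizes beyond this specific application: the basic iteration (here, a SIMPLE(R) sweep) is treated as a preconditioner inside a residual-minimizing Krylov method such as GCR. The sketch below is offered as an illustration of that pattern rather than as the authors' solver; it applies preconditioned GCR to a toy linear system, with a Jacobi sweep standing in for the SIMPLE(R) step, and the function names are our own:

```python
import numpy as np

def gcr(A, b, apply_prec, x0, tol=1e-10, max_iter=50):
    """Preconditioned GCR: each iteration applies the basic (SIMPLE-like)
    step as a preconditioner and minimizes the residual over the Krylov space."""
    x = x0.copy()
    r = b - A @ x
    S, V = [], []                     # search directions s_k and images v_k = A s_k
    for _ in range(max_iter):
        s = apply_prec(r)             # one sweep of the basic iteration
        v = A @ s
        for si, vi in zip(S, V):      # orthogonalize v against previous directions
            beta = vi @ v
            v -= beta * vi
            s -= beta * si            # keep v = A s consistent
        nv = np.linalg.norm(v)
        v /= nv
        s /= nv
        alpha = v @ r                 # residual-minimizing step length
        x += alpha * s
        r -= alpha * v
        S.append(s); V.append(v)
        if np.linalg.norm(r) < tol:
            break
    return x

# Toy example: 1-D Poisson system, with a Jacobi sweep standing in for SIMPLE(R).
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
jacobi = lambda res: res / np.diag(A)
x = gcr(A, b, jacobi, np.zeros(n))
print(np.linalg.norm(b - A @ x))      # should be near machine precision
```

The point of the wrapper is that a stalled fixed-point iteration often converges much faster once its updates are recombined to minimize the residual, which is the effect the paper reports for GCR-SIMPLE(R).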
Geib, Scott M; Hall, Brian; Derego, Theodore; Bremer, Forest T; Cannoles, Kyle; Sim, Sheina B
2018-04-01
One of the most overlooked, yet critical, components of a whole genome sequencing (WGS) project is the submission and curation of the data to a genomic repository, most commonly the National Center for Biotechnology Information (NCBI). While large genome centers or genome groups have developed software tools for post-annotation assembly filtering, annotation, and conversion into the NCBI's annotation table format, these tools typically require back-end setup and connection to a Structured Query Language (SQL) database and/or some knowledge of programming (Perl, Python) to implement. With WGS becoming commonplace, genome sequencing projects are moving away from the genome centers and into the ecology or biology lab, where fewer resources are present to support the process of genome assembly curation. To fill this gap, we developed software to assess, filter, and transfer annotation and convert a draft genome assembly and annotation set into the NCBI annotation table (.tbl) format, facilitating submission to the NCBI Genome Assembly database. This software has no dependencies, is compatible across platforms, and utilizes a simple command to perform a variety of simple and complex post-analysis, pre-NCBI submission WGS project tasks. The Genome Annotation Generator is a consistent and user-friendly bioinformatics tool that can be used to generate a .tbl file that is consistent with the NCBI submission pipeline. The Genome Annotation Generator achieves the goal of providing a publicly available tool that will facilitate the submission of annotated genome assemblies to the NCBI. It is useful for any individual researcher or research group that wishes to submit a genome assembly of their study system to the NCBI.
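The target format itself is plain tab-delimited text, which is part of why a dependency-free converter is feasible. As a rough illustration of what such a converter emits (the data structures and feature values below are hypothetical; only the tab-separated layout follows the published .tbl convention, to the best of our knowledge):

```python
# Minimal sketch of writing an NCBI feature table (.tbl): a ">Feature" header
# per sequence, "start<TAB>stop<TAB>key" lines per feature, and qualifiers on
# triple-tab-indented lines. Not the Genome Annotation Generator's internals.

def write_tbl(seq_id, features, path):
    with open(path, "w") as fh:
        fh.write(f">Feature {seq_id}\n")
        for start, stop, key, qualifiers in features:
            fh.write(f"{start}\t{stop}\t{key}\n")
            for qkey, qval in qualifiers:
                fh.write(f"\t\t\t{qkey}\t{qval}\n")

features = [
    (1, 900, "gene", [("locus_tag", "ABC_0001")]),
    (1, 900, "CDS", [("product", "hypothetical protein"),
                     ("protein_id", "gnl|center|ABC_0001")]),
]
write_tbl("contig_1", features, "contig_1.tbl")
```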
Analysis of biosurfaces by neutron reflectometry: From simple to complex interfaces
Junghans, Ann; Watkins, Erik B.; Barker, Robert D.; ...
2015-03-16
Because of its high sensitivity to light elements and the scattering contrast manipulation via isotopic substitutions, neutron reflectometry (NR) is an excellent tool for studying the structure of soft-condensed material. These materials include model biophysical systems as well as in situ living tissue at the solid–liquid interface. The penetrability of neutrons makes NR suitable for probing thin films with thicknesses of 5–5000 Å at various buried, for example, solid–liquid, interfaces [J. Daillant and A. Gibaud, Lect. Notes Phys. 770, 133 (2009); G. Fragneto-Cusani, J. Phys.: Condens. Matter 13, 4973 (2001); J. Penfold, Curr. Opin. Colloid Interface Sci. 7, 139 (2002)]. Over the past two decades, NR has evolved to become a key tool in the characterization of biological and biomimetic thin films. Highlighted in the current report are some of the authors' recent accomplishments in utilizing NR to study highly complex systems, including in-situ experiments. Such studies will result in a much better understanding of complex biological problems, have significant medical impact by suggesting innovative treatment, and advance the development of highly functionalized biomimetic materials.
NASA Astrophysics Data System (ADS)
Li, Xinlong; Reber, Melanie A. R.; Corder, Christopher; Chen, Yuning; Zhao, Peng; Allison, Thomas K.
2016-09-01
We present a detailed description of the design, construction, and performance of high-power ultrafast Yb:fiber laser frequency combs in operation in our laboratory. We discuss two such laser systems: an 87 MHz, 9 W, 85 fs laser operating at 1060 nm and an 87 MHz, 80 W, 155 fs laser operating at 1035 nm. Both are constructed using low-cost, commercially available components, and can be assembled using only basic tools for cleaving and splicing single-mode fibers. We describe practical methods for achieving and characterizing low-noise single-pulse operation and long-term stability from Yb:fiber oscillators based on nonlinear polarization evolution. Stabilization of the combs using a variety of transducers, including a new method for tuning the carrier-envelope offset frequency, is discussed. High average power is achieved through chirped-pulse amplification in simple fiber amplifiers based on double-clad photonic crystal fibers. We describe the use of these combs in several applications, including ultrasensitive femtosecond time-resolved spectroscopy and cavity-enhanced high-order harmonic generation.
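For readers new to combs, the quantity being stabilized can be stated in one line: each comb tooth sits at an optical frequency set by the repetition rate and the carrier-envelope offset frequency,

```latex
f_{n} \;=\; n\, f_{\mathrm{rep}} + f_{\mathrm{ceo}}, \qquad n \in \mathbb{Z},
```

so locking f_rep and f_ceo (the transducer targets discussed above, including the new offset-tuning method) fixes every optical tooth simultaneously. This is the standard comb equation, quoted here for orientation rather than taken from the paper.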
Yield of computed tomography of the cervical spine in cases of simple assault.
Uriell, Matthew L; Allen, Jason W; Lovasik, Brendan P; Benayoun, Marc D; Spandorfer, Robert M; Holder, Chad A
2017-01-01
Computed tomography (CT) of the cervical spine (C-spine) is routinely ordered for low-impact, non-penetrating or "simple" assault at our institution and others. Common clinical decision tools for C-spine imaging in the setting of trauma include the National Emergency X-Radiography Utilization Study (NEXUS) and the Canadian Cervical Spine Rule for Radiography (CCR). While NEXUS and CCR have served to decrease the amount of unnecessary imaging of the C-spine, overutilization of CT is still of concern. A retrospective, cross-sectional study was performed of the electronic medical record (EMR) database at an urban, Level I Trauma Center over a 6-month period for patients receiving a C-spine CT. The primary outcome of interest was prevalence of cervical spine fracture. Secondary outcomes of interest included appropriateness of C-spine imaging after retrospective application of NEXUS and CCR. The hypothesis was that fracture rates within this patient population would be extremely low. No C-spine fractures were identified in the 460 patients who met inclusion criteria. Approximately 29% of patients did not warrant imaging by CCR, and 25% by NEXUS. Of note, approximately 44% of patients were indeterminate for whether imaging was warranted by CCR, with the most common reason being lack of assessment for active neck rotation. Cervical spine CT is overutilized in the setting of simple assault, despite established clinical decision rules. With no fractures identified regardless of other factors, the likelihood that a CT of the cervical spine will identify clinically significant findings in the setting of "simple" assault is extremely low, approaching zero. At minimum, adherence to CCR and NEXUS within this patient population would serve to reduce both imaging costs and population radiation dose exposure. Copyright © 2016 Elsevier Ltd. All rights reserved.
Optimal low thrust geocentric transfer. [mission analysis computer program
NASA Technical Reports Server (NTRS)
Edelbaum, T. N.; Sackett, L. L.; Malchow, H. L.
1973-01-01
A computer code which will rapidly calculate time-optimal low thrust transfers is being developed as a mission analysis tool. The final program will apply to NEP or SEP missions and will include a variety of environmental effects. The current program assumes constant acceleration. The oblateness effect and shadowing may be included. Detailed state and costate equations are given for the thrust effect, oblateness effect, and shadowing. A simple but adequate model yields analytical formulas for power degradation due to the Van Allen radiation belts for SEP missions. The program avoids the classical singularities by the use of equinoctial orbital elements. Kryloff-Bogoliuboff averaging is used to facilitate rapid calculation. Results for selected cases using the current program are given.
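Equinoctial elements can be defined in several equivalent ways; one common choice (stated for illustration, since symbol conventions vary) replaces the classical elements (a, e, i, Ω, ω, M) with

```latex
a,\qquad
f = e\cos(\omega+\Omega),\qquad
g = e\sin(\omega+\Omega),\qquad
h = \tan\tfrac{i}{2}\cos\Omega,\qquad
k = \tan\tfrac{i}{2}\sin\Omega,\qquad
\lambda = M + \omega + \Omega .
```

Because eccentricity and inclination enter only through these smooth combinations, circular (e = 0) and equatorial (i = 0) orbits no longer make the element rates singular, which is precisely the classical difficulty the program avoids.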
Distributed Parameter Analysis of Pressure and Flow Disturbances in Rocket Propellant Feed Systems
NASA Technical Reports Server (NTRS)
Dorsch, Robert G.; Wood, Don J.; Lightner, Charlene
1966-01-01
A digital distributed parameter model for computing the dynamic response of propellant feed systems is formulated. The analytical approach used is an application of the wave-plan method of analyzing unsteady flow. Nonlinear effects are included. The model takes into account locally high compliances at the pump inlet and at the injector dome region. Examples of the calculated transient and steady-state periodic responses of a simple hypothetical propellant feed system to several types of disturbances are presented. Included are flow disturbances originating from longitudinal structural motion, gimbaling, throttling, and combustion-chamber coupling. The analytical method can be employed for analyzing developmental hardware and offers a flexible tool for the calculation of unsteady flow in these systems.
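The elementary relation underlying any wave-tracking analysis of this kind is the classical Joukowsky jump condition across a pressure wavefront, quoted here as background rather than as the paper's full nonlinear model:

```latex
\Delta p \;=\; \pm\, \rho\, c\, \Delta u,
```

where ρ is the propellant density, c the wavespeed in the line, and Δu the change in flow velocity across the front; the sign depends on the direction of wave travel. Locally high compliances, such as those at the pump inlet and injector dome noted above, act to lower the effective wavespeed and so reshape the computed transients.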
Strehl ratio: a tool for optimizing optical nulls and singularities.
Hénault, François
2015-07-01
In this paper a set of radial and azimuthal phase functions is reviewed that have a null Strehl ratio, which is equivalent to generating a central extinction in the image plane of an optical system. The study is conducted in the framework of Fraunhofer scalar diffraction, and is oriented toward practical cases where optical nulls or singularities are produced by deformable mirrors or phase plates. The identified solutions reveal unexpected links with the zeros of type-J Bessel functions of integer order. They include linear azimuthal phase ramps giving rise to an optical vortex, azimuthally modulated phase functions, and circular phase gratings (CPGs). It is found in particular that the CPG radiometric efficiency could be significantly improved by the null Strehl ratio condition. Simple design rules for rescaling and combining the different phase functions are also defined. Finally, the described analytical solutions could also serve as starting points for an automated searching software tool.
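In the Fraunhofer scalar framework, the condition being engineered can be written compactly. For a pure phase function φ over a uniformly illuminated pupil of area A, the Strehl ratio is (standard expression, our notation):

```latex
S \;=\; \left| \frac{1}{A} \iint_{A} e^{\,i\varphi(r,\theta)}\, \mathrm{d}A \right|^{2},
```

so a null, S = 0, requires the pupil-averaged phasor to vanish exactly. A linear azimuthal ramp φ = mθ with nonzero integer m achieves this immediately, since ∫₀^{2π} e^{imθ} dθ = 0, which is the optical-vortex case mentioned above; the radial solutions are where the Bessel-zero connection enters.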
Modeling the Energy Use of a Connected and Automated Transportation System (Poster)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonder, J.; Brown, A.
Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.
Fazalullasha, Fatima; Taras, Jillian; Morinis, Julia; Levin, Leo; Karmali, Karima; Neilson, Barbara; Muskat, Barbara; Bloch, Gary; Chan, Kevin; McDonald, Maureen; Makin, Sue; Ford-Jones, E Lee
2014-04-01
Previous research has highlighted the importance of addressing the social determinants of health to improve child health outcomes. However, significant barriers exist that limit the paediatrician's ability to properly address these issues. Barriers include a lack of clinical time, resources, training and education with regard to the social determinants of health; awareness of community resources; and case-management capacity. General practice recommendations to help the health care provider link patients to the community are insufficient. The objective of the current article was to present options for improving the link between the office and the community, using screening questions and physician-based tools that link patients to community resources. Simple interventions, such as routine referral to early-year centres and selected referral to public health home-visiting programs, may help to address populations with the greatest needs.
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We report on the real-time creation of an application for hands-on neurophysiology in an advanced undergraduate teaching laboratory. Enabled by the rapid software development tools included in the Matlab technical computing environment (The Mathworks, Natick, MA), a team consisting of a neurophysiology educator and a biophysicist trained as an electrical engineer worked with a course of approximately 15 students from engineering and biology backgrounds. The result is the powerful freeware data acquisition and analysis environment, "g-PRIME." The software was developed from week to week in response to curriculum demands and student feedback. The program evolved from a simple software oscilloscope, enabling RC circuit analysis, to a suite of tools supporting analysis of neuronal excitability and synaptic transmission in invertebrate model systems. The program has subsequently expanded in application to university courses, research, and high school projects in the US and abroad as free courseware.
A Computational and Experimental Study of Slit Resonators
NASA Technical Reports Server (NTRS)
Tam, C. K. W.; Ju, H.; Jones, M. G.; Watson, W. R.; Parrott, T. L.
2003-01-01
Computational and experimental studies are carried out to offer validation of the results obtained from direct numerical simulation (DNS) of the flow and acoustic fields of slit resonators. The test cases include slits with 90-degree corners and slits with 45-degree bevel angle housed inside an acoustic impedance tube. Three slit widths are used. Six frequencies from 0.5 to 3.0 kHz are chosen. Good agreement is found between computed and measured reflection factors. In addition, incident sound waves having white noise spectrum and a prescribed pseudo-random noise spectrum are used in subsequent series of tests. The computed broadband results are again found to agree well with experimental data. It is believed the present results provide strong support that DNS can eventually be a useful and accurate prediction tool for liner aeroacoustics. The usage of DNS as a design tool is discussed and illustrated by a simple example.
NASA Technical Reports Server (NTRS)
Milligan, James R.; Dutton, James E.
1993-01-01
In this paper, we have introduced a comprehensive method for enterprise modeling that addresses the three important aspects of how an organization goes about its business. FirstEP includes infrastructure modeling, information modeling, and process modeling notations that are intended to be easy to learn and use. The notations stress the use of straightforward visual languages that are intuitive, syntactically simple, and semantically rich. ProSLCSE will be developed with automated tools and services to facilitate enterprise modeling and process enactment. In the spirit of FirstEP, ProSLCSE tools will also be seductively easy to use. Achieving fully managed, optimized software development and support processes will be long and arduous for most software organizations, and many serious problems will have to be solved along the way. ProSLCSE will provide the ability to document, communicate, and modify existing processes, which is the necessary first step.
Robust tissue classification for reproducible wound assessment in telemedicine environments
NASA Astrophysics Data System (ADS)
Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves
2010-04-01
In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple handheld digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labeling, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under changes in lighting conditions, viewpoint, and camera, achieving accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3%, against 69.1% for a single expert, after mapping onto the medical reference developed from image labeling by a college of experts.
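The classification stage of such a chain is conceptually simple. Below is an illustrative sketch of that stage only, with hypothetical features and values; the full pipeline described above also involves device color correction and merged expert labels, which are not reproduced here:

```python
# Illustrative sketch: an SVM over per-region color/texture features.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# One row per segmented region: mean R, G, B and a simple texture proxy
# (all values hypothetical, for demonstration only).
X = np.array([[180, 60, 55, 0.21],    # granulation-like
              [210, 190, 140, 0.05],  # slough-like
              [40, 25, 20, 0.33]])    # necrosis-like
y = np.array(["granulation", "slough", "necrosis"])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X, y)
print(clf.predict([[175, 70, 60, 0.19]]))  # expected: granulation-like
```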
Evaluation and early detection of problematic Internet use in adolescents.
Gómez Salgado, Patricia; Rial Boubeta, Antonio; Braña Tobío, Teresa; Varela Mallou, Jesús; Barreiro Couto, Carmen
2014-01-01
Problematic Internet use in adolescents has become an issue of concern for a growing number of researchers and institutions over the past years. Behavioural problems, social isolation, school failure and family problems are some of the consequences of its psychological and behavioural impact on teenagers. Taking into account the interest that this issue has generated at many levels, the aim of this paper is to develop a screening tool for early detection of problematic Internet use in teenagers. A survey of Compulsory Secondary School students from Galicia involving a total of 2,339 individuals was carried out. The results obtained allow (1) gauging the magnitude of the problem and establishing the risk levels among the adolescents, and (2) presenting a new, simple and short screening instrument. The present scale has sufficient theoretical and empirical support, including good psychometric properties (α = .83; specificity = .81; sensitivity = .80; area under the ROC curve = .90), making it an interesting applied tool.
Thoth: Software for data visualization & statistics
NASA Astrophysics Data System (ADS)
Laher, R. R.
2016-10-01
Thoth is a standalone software application with a graphical user interface for making it easy to query, display, visualize, and analyze tabular data stored in relational databases and data files. From imported data tables, it can create pie charts, bar charts, scatter plots, and many other kinds of data graphs with simple menus and mouse clicks (no programming required), by leveraging the open-source JFreeChart library. It also computes useful table-column data statistics. A mature tool, having undergone development and testing over several years, it is written in the Java computer language, and hence can be run on any computing platform that has a Java Virtual Machine and graphical-display capability. It can be downloaded and used by anyone free of charge, and has general applicability in science, engineering, medical, business, and other fields. Special tools and features for common tasks in astronomy and astrophysical research are included in the software.
Interactive computation of coverage regions for indoor wireless communication
NASA Astrophysics Data System (ADS)
Abbott, A. Lynn; Bhat, Nitin; Rappaport, Theodore S.
1995-12-01
This paper describes a system which assists in the strategic placement of rf base stations within buildings. Known as the site modeling tool (SMT), this system allows the user to display graphical floor plans and to select base station transceiver parameters, including location and orientation, interactively. The system then computes and highlights estimated coverage regions for each transceiver, enabling the user to assess the total coverage within the building. For single-floor operation, the user can choose between distance-dependent and partition- dependent path-loss models. Similar path-loss models are also available for the case of multiple floors. This paper describes the method used by the system to estimate coverage for both directional and omnidirectional antennas. The site modeling tool is intended to be simple to use by individuals who are not experts at wireless communication system design, and is expected to be very useful in the specification of indoor wireless systems.
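Distance-dependent and partition-dependent indoor path-loss models of the kind selectable in such a tool are commonly written in log-distance form (a standard formulation given here for illustration, not necessarily the exact model in SMT):

```latex
\mathrm{PL}(d) \;=\; \mathrm{PL}(d_{0}) \;+\; 10\, n \log_{10}\!\left(\frac{d}{d_{0}}\right) \;+\; \sum_{i} \mathrm{PAF}_{i} \quad [\mathrm{dB}],
```

where d₀ is a close-in reference distance, n the path-loss exponent for the building, and PAF_i the attenuation of each wall or floor partition crossed by the direct path. Coverage at a point is then predicted by comparing transmit power minus PL(d) against the receiver sensitivity.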
Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools
ERIC Educational Resources Information Center
Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.
2014-01-01
Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…
ERIC Educational Resources Information Center
Adedokun, Omolola A.
2018-01-01
This article provides an illustrative description of the pre-post difference index (PPDI), a simple, nontechnical yet robust tool for examining the instructional sensitivity of assessment items. Extension educators often design pretest-posttest instruments to assess the impact of their curricula on participants' knowledge and understanding of the…
A Cost Estimation Tool for Charter Schools
ERIC Educational Resources Information Center
Hayes, Cheryl D.; Keller, Eric
2009-01-01
To align their financing strategies and fundraising efforts with their fiscal needs, charter school leaders need to know how much funding they need and what that funding will support. This cost estimation tool offers a simple set of worksheets to help start-up charter school operators identify and estimate the range of costs and timing of…
Strategies for Using Repetition as a Powerful Teaching Tool
ERIC Educational Resources Information Center
Saville, Kirt
2011-01-01
Brain research indicates that repetition is of vital importance in the learning process. Repetition is an especially useful tool in the area of music education. The success of repetition can be enhanced by accurate and timely feedback. From "simple repetition" to "repetition with the addition or subtraction of degrees of freedom," there are many…
ACED IT: A Tool for Improved Ethical and Moral Decision-Making
ERIC Educational Resources Information Center
Kreitler, Crystal Mata; Stenmark, Cheryl K.; Rodarte, Allen M.; Piñón DuMond, Rebecca
2014-01-01
Numerous examples of unethical organizational decision-making highlighted in the media have led many to question the general moral perception and ethical judgments of individuals. The present study examined two forms of a straightforward ethical decision-making (EDM) tool (ACED IT cognitive map) that could be a relatively simple instrument for…
Tools for the Classroom? an Examination of Existing Sociometric Methods for Teacher Use
ERIC Educational Resources Information Center
McMullen, Jake A.; Veermans, Koen; Laine, Kaarina
2014-01-01
Despite the recent technical and theoretical advances in the investigation of children's social relations, the inherent complexity of these methods may prevent their easy integration into the classroom. A simple and effective tool can be valuable for teachers who wish to investigate students' social realities in the classroom. Therefore, the…
Energy Assessment Helps Kaiser Aluminum Save Energy and Improve Productivity
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2008-07-01
The Kaiser Aluminum plant in Sherman, Texas, adjusted controls and made repairs to a furnace for a simple payback of 1 month. Kaiser adopted DOE's Process Heating Assessment and Survey Tool (PHAST) software as the corporate diagnostic tool and has used it to evaluate process heating systems at five other aluminum plants.
Transition Matrices: A Tool to Assess Student Learning and Improve Instruction
ERIC Educational Resources Information Center
Morris, Gary A.; Walter, Paul; Skees, Spencer; Schwartz, Samantha
2017-01-01
This paper introduces a new spreadsheet tool for adoption by high school or college-level physics teachers who use common assessments in a pre-instruction/post-instruction mode to diagnose student learning and teaching effectiveness. The spreadsheet creates a simple matrix that identifies the percentage of students who select each possible…
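The matrix itself is straightforward to compute outside a spreadsheet as well. A minimal sketch with hypothetical data, where entry (i, j) is the percentage of students who chose option i before instruction and option j after:

```python
# Pre/post transition matrix for one multiple-choice item.
import numpy as np

options = ["A", "B", "C", "D"]
pre  = ["A", "B", "B", "C", "A", "D", "B", "C"]   # pretest answers (hypothetical)
post = ["B", "B", "A", "B", "B", "B", "B", "C"]   # posttest answers

counts = np.zeros((len(options), len(options)))
for p, q in zip(pre, post):
    counts[options.index(p), options.index(q)] += 1
percent = 100 * counts / len(pre)   # row i, column j: % moving from i to j
print(percent)
```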
Practical Application of Aspiration as an Outcome Indicator in Extension Evaluation
ERIC Educational Resources Information Center
Jayaratne, K. S. U.
2010-01-01
Extension educators need simple and accurate evaluation tools for program evaluation. This article explains how to use aspiration as an outcome indicator in Extension evaluation and introduces a practical evaluation tool. Aspiration can be described as the readiness for change. By recording participants' levels of aspiration, we will be able to…
NASA Astrophysics Data System (ADS)
Prakash, Manu; Mukundarajan, Haripriya
2013-11-01
A simple bite from an insect is the transmission mechanism for many deadly diseases worldwide--including malaria, yellow fever, West Nile and dengue. Very little is known about how populations of numerous insect species and disease-causing parasites interact in their natural habitats, due to a lack of measurement techniques. At present, vector surveillance techniques involve manual capture using humans as live bait, which is hard to justify on ethical grounds. Individual mosquitoes are manually dissected to isolate salivary glands and detect sporozoites. With typical vector infection rates being very low even in endemic areas, it is almost impossible to get an accurate picture of disease distribution in both space and time. Here we present novel high-throughput microfluidic tools for vector surveillance, specifically of mosquitoes. A two-dimensional high-density array with baits provides an integrated platform for multiplex PCR detection of both vector and parasite species. Combining techniques from engineering and field ecology, the methods and tools developed here will enable high-throughput measurement of infection rates for a number of diseases in mosquito populations under field conditions. Pew Foundation.
KISS for STRAP: user extensions for a protein alignment editor.
Gille, Christoph; Lorenzen, Stephan; Michalsky, Elke; Frömmel, Cornelius
2003-12-12
The Structural Alignment Program STRAP is a comfortable, comprehensive editor and analysis tool for protein alignments. A wide range of functions related to protein sequences and protein structures are accessible through an intuitive graphical interface. Recent features include mapping of mutations and polymorphisms onto structures and production of high-quality figures for publication. Here we address the general problem of multi-purpose program packages keeping up with the rapid development of bioinformatical methods and the demand for specific program functions. STRAP was remade implementing a novel design which aims at Keeping Interfaces in STRAP Simple (KISS). KISS renders STRAP extendable by bio-scientists as well as by bio-informaticians. Scientists with basic computer skills are capable of implementing statistical methods or embedding existing bioinformatical tools in STRAP themselves. For bio-informaticians STRAP may serve as an environment for rapid prototyping and testing of complex algorithms such as automatic alignment algorithms or phylogenetic methods. Further, STRAP can be applied as an interactive web applet to present data related to a particular protein family and as a teaching tool. Requires Java 1.4 or higher. http://www.charite.de/bioinf/strap/
Castaño-Díez, Daniel; Kudryashev, Mikhail; Stahlberg, Henning
2017-02-01
Cryo electron tomography allows macromolecular complexes within vitrified, intact, thin cells or sections thereof to be visualized, and structural analysis to be performed in situ by averaging over multiple copies of the same molecules. Image processing for subtomogram averaging is specific and cumbersome, due to the large amount of data and its three-dimensional nature and anisotropic resolution. Here, we streamline data processing for subtomogram averaging by introducing an archiving system, Dynamo Catalogue. This system manages tomographic data from multiple tomograms and allows visual feedback during all processing steps, including particle picking, extraction, alignment and classification. The file structure of a processing project includes logfiles of performed operations, and can be backed up and shared between users. Command line commands, database queries and a set of GUIs give the user versatile control over the process. Here, we introduce a set of geometric tools that streamline particle picking from simple geometries (filaments, spheres, tubes, vesicles) and complex ones (arbitrary 2D surfaces, rare instances on proteins with geometric restrictions, and 2D and 3D crystals). Advanced functionality, such as manual alignment and subboxing, is useful when initial templates are generated for alignment and for project customization. Dynamo Catalogue is part of the open source package Dynamo and includes tools to ensure format compatibility with the subtomogram averaging functionalities of other packages, such as Jsubtomo, PyTom, PEET, EMAN2, XMIPP and Relion. Copyright © 2016. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Escriva-Bou, A.; Lund, J. R.; Pulido-Velazquez, M.; Medellin-Azuara, J.
2015-12-01
Most individual processes relating to water and energy interdependence have been assessed in many different ways over the last decade. It is time to build on these studies and bring their results into management, by providing a tool that integrates these processes into decision-making so that the tradeoffs between water and energy under different management options and scenarios can be understood effectively. A simple but powerful decision support system (DSS) for water management is described that includes water-related energy use and GHG emissions not solely from water operations, but also from final water end uses, including demands from cities, agriculture, the environment and the energy sector. Because one of the main drivers of energy use and GHG emissions is water pumping from aquifers, the DSS combines a surface water management model with a simple groundwater model, accounting for their interrelationships. The model also explicitly includes economic data to optimize water use across sectors during shortages and to calculate return flows from different uses. Capabilities of the DSS are demonstrated in a case study of California's intertied water system. Results show that urban end uses account for most GHG emissions of the entire water cycle, but large water conveyance produces significant peaks over the summer season. Also, the development of more efficient water application in the agricultural sector has increased total energy consumption and net water use in the basins.
2011-01-01
Background Multiple types of assays allow sensitive detection of virus-specific neutralizing antibodies. For example, the extent of antibody neutralization of HIV-1, SIV and SHIV can be measured in the TZM-bl cell line through the degree of luciferase reporter gene expression after infection. In the past, neutralization curves and titers for this standard assay have been calculated using an Excel macro. Updating all instances of such a macro with new techniques can be unwieldy and introduce non-uniformity across multi-lab teams. Using Excel also poses challenges in centrally storing, sharing and associating raw data files and results. Results We present LabKey Server's NAb tool for organizing, analyzing and securely sharing data, files and results for neutralizing antibody (NAb) assays, including the luciferase-based TZM-bl NAb assay. The customizable tool supports high-throughput experiments and includes a graphical plate template designer, allowing researchers to quickly adapt calculations to new plate layouts. The tool calculates the percent neutralization for each serum dilution based on luminescence measurements, fits a range of neutralization curves to titration results and uses these curves to estimate the neutralizing antibody titers for benchmark dilutions. Results, curve visualizations and raw data files are stored in a database and shared through a secure, web-based interface. NAb results can be integrated with other data sources based on sample identifiers. It is simple to make results public after publication by updating folder security settings. Conclusions Standardized tools for analyzing, archiving and sharing assay results can improve the reproducibility, comparability and reliability of results obtained across many labs. LabKey Server and its NAb tool are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. Many members of the HIV research community can also access the LabKey Server NAb tool without installing the software by using the Atlas Science Portal (https://atlas.scharp.org). Atlas is an installation of LabKey Server. PMID:21619655
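The underlying calculation is compact enough to sketch. The following illustration uses hypothetical numbers and a generic four-parameter logistic; the LabKey tool's exact curve options are configurable and not reproduced here. It computes percent neutralization from luminescence against cell and virus controls, then estimates the dilution giving 50% neutralization:

```python
import numpy as np
from scipy.optimize import curve_fit

# Control wells (hypothetical): background-only and maximum-infection signal.
cell_ctrl, virus_ctrl = 1200.0, 95000.0
dilutions = np.array([20, 60, 180, 540, 1620], float)
rlu = np.array([5000, 12000, 38000, 70000, 90000], float)  # luminescence

# Percent neutralization per serum dilution.
neut = 100 * (1 - (rlu - cell_ctrl) / (virus_ctrl - cell_ctrl))

def logistic4(x, bottom, top, ec50, slope):
    # Four-parameter logistic in the dilution variable.
    return bottom + (top - bottom) / (1 + (x / ec50) ** slope)

popt, _ = curve_fit(logistic4, dilutions, neut, p0=[0.0, 100.0, 200.0, 1.0],
                    bounds=([-10, 80, 1, 0.1], [20, 120, 5000, 10]))
bottom, top, ec50, slope = popt
print(f"estimated 50% neutralization at dilution ~ {ec50:.0f}")
```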
Dose calculation of dynamic trajectory radiotherapy using Monte Carlo.
Manser, P; Frauchiger, D; Frei, D; Volken, W; Terribilini, D; Fix, M K
2018-04-06
With the volumetric modulated arc therapy (VMAT) delivery technique, the gantry position, multi-leaf collimator (MLC) and dose rate change dynamically during the application. However, additional components can be dynamically altered throughout the dose delivery, such as the collimator or the couch. Thus, the degrees of freedom increase, allowing almost arbitrary dynamic trajectories for the beam. While the dose delivery of such dynamic trajectories is technically possible on linear accelerators, there is currently no dose calculation and validation tool available. Thus, the aim of this work is to develop a dose calculation and verification tool for dynamic trajectories using Monte Carlo (MC) methods. The dose calculation for dynamic trajectories is implemented in the previously developed Swiss Monte Carlo Plan (SMCP). SMCP interfaces the treatment planning system Eclipse with a MC dose calculation algorithm and is already able to handle dynamic MLC and gantry rotations. Hence, the additional dynamic components, namely the collimator and the couch, are described similarly to the dynamic MLC, by defining data pairs of positions of the dynamic component and the corresponding MU-fractions. For validation purposes, measurements are performed with the Delta4 phantom and film, using the developer mode on a TrueBeam linear accelerator. These measured dose distributions are then compared with the corresponding calculations using SMCP. First, simple academic cases applying one-dimensional movements are investigated; second, more complex dynamic trajectories with several simultaneously moving components are compared, considering academic cases as well as a clinically motivated prostate case. The dose calculation for dynamic trajectories is successfully implemented into SMCP. The comparisons between the measured and calculated dose distributions for the simple as well as for the more complex situations show an agreement which is generally within 3% of the maximum dose or 3 mm. The required computation time for the dose calculation remains the same when the additional dynamic moving components are included. The results obtained for the dose comparisons for simple and complex situations suggest that the extended SMCP is an accurate dose calculation and efficient verification tool for dynamic trajectory radiotherapy. This work was supported by Varian Medical Systems. Copyright © 2018. Published by Elsevier GmbH.
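The control-point description mentioned above, (MU fraction, position) data pairs per dynamic component, lends itself to a very small sketch. Everything below is a hypothetical illustration, not SMCP code: a position at any delivered MU fraction is recovered by interpolating between the defined pairs:

```python
import numpy as np

# Per-component trajectories as (MU fraction, position) pairs (hypothetical).
mu_fraction = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
gantry_deg  = np.array([180.0, 135.0, 90.0, 45.0, 0.0])
collim_deg  = np.array([0.0, 15.0, 30.0, 45.0, 60.0])

def component_position(mu, grid, positions):
    # Linear interpolation between neighbouring control points.
    return np.interp(mu, grid, positions)

# A Monte Carlo history lands at a random MU fraction along the delivery:
rng = np.random.default_rng(0)
mu = rng.random()
print(component_position(mu, mu_fraction, gantry_deg),
      component_position(mu, mu_fraction, collim_deg))
```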
SU-G-201-15: Nomogram as an Efficient Dosimetric Verification Tool in HDR Prostate Brachytherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, J; Todor, D
Purpose: A nomogram as a simple QA tool for HDR prostate brachytherapy treatment planning has been developed and validated clinically. Reproducibility, including patient-to-patient and physician-to-physician variability, was assessed. Methods: The study was performed on HDR prostate implants from physician A (n=34) and B (n=15) using different implant techniques and planning methodologies. A nomogram was implemented as an independent QA of computer-based treatment planning before plan execution. Normalized implant strength (total air kerma strength Sk·t in cGy·cm² divided by prescribed dose in cGy) was plotted as a function of PTV volume and total V100. A quadratic equation was used to fit the data, with R² denoting the model predictive power. Results: All plans showed good target coverage while OARs met the dose constraint guidelines. Vastly different implant and planning styles were reflected in the conformity index (entire dose matrix V100/PTV volume, physician A implants: 1.27±0.14, physician B: 1.47±0.17) and the PTV V150/PTV volume ratio (physician A: 0.34±0.09, physician B: 0.24±0.07). The quadratic model provided a better fit for the curved relationship between normalized implant strength and total V100 (or PTV volume) than a simple linear function. Unlike the normalized implant strength versus PTV volume nomogram, which differed between physicians, a unique quadratic model based nomogram (Sk·t)/D = −0.0008V² + 0.0542V + 1.1185 (R² = 0.9977) described the dependence of normalized implant strength on total V100 over all the patients from both physicians, despite two different implant and planning philosophies. The normalized implant strength versus total V100 model also produced fewer deviant points and a significantly higher correlation. Conclusion: A simple and universal, Excel-based nomogram was created as an independent calculation tool for HDR prostate brachytherapy. Unlike similar attempts, our nomogram is insensitive to implant style and does not rely on reproducing dose calculations using the TG-43 formalism, thus making it a truly independent check.
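Because the fitted quadratic is given explicitly, the independent check is simple to automate. A sketch follows; only the quadratic coefficients come from the abstract, while the 5% action threshold and the example plan values are hypothetical choices of ours:

```python
# Nomogram-based sanity check: predicted normalized implant strength
# (Sk*t / D) as a quadratic in total V100 (cm^3), per the fitted model.

def predicted_strength_per_dose(v100_cc):
    return -0.0008 * v100_cc**2 + 0.0542 * v100_cc + 1.1185

def qa_check(sk_times_t, prescribed_dose, v100_cc, tolerance=0.05):
    planned = sk_times_t / prescribed_dose        # cGy*cm^2 / cGy
    expected = predicted_strength_per_dose(v100_cc)
    deviation = abs(planned - expected) / expected
    return deviation <= tolerance, deviation

# Hypothetical plan: Sk*t = 2520 cGy*cm^2, D = 1500 cGy, total V100 = 55 cc.
ok, dev = qa_check(sk_times_t=2520.0, prescribed_dose=1500.0, v100_cc=55.0)
print(ok, round(dev, 4))   # flags the plan if deviation exceeds tolerance
```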
ERIC Educational Resources Information Center
Tsai, Chin-Chung
2003-01-01
Examines the effects of using a conflict map on 8th grade students' conceptual change and ideational networks about simple series electric circuits. Analyzes student interview data through a flow map method. Shows that the use of conflict maps could help students construct greater, richer, and more integrated ideational networks about electric…
ERIC Educational Resources Information Center
Unlu, Zeynep Koyunlu; Dokme, Ibilge
2011-01-01
The purpose of this study was to investigate whether the combination of both analogy-based simulation and laboratory activities as a teaching tool was more effective than utilizing them separately in teaching the concepts of simple electricity. The quasi-experimental design that involved 66 seventh grade students from urban Turkish elementary…
Projectiles, pendula, and special relativity
NASA Astrophysics Data System (ADS)
Price, Richard H.
2005-05-01
The kind of flat-earth gravity used in introductory physics appears in an accelerated reference system in special relativity. From this viewpoint, we work out the special relativistic description of a ballistic projectile and a simple pendulum, two examples of simple motion driven by earth-surface gravity. The analysis uses only the basic mathematical tools of special relativity typical of a first-year university course.
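The "accelerated reference system" invoked here has a standard line element. In Rindler-type coordinates attached to a frame with proper acceleration g (one common convention, quoted for orientation rather than from the paper):

```latex
ds^{2} \;=\; -\left(1 + \frac{g z}{c^{2}}\right)^{2} c^{2}\, dt^{2} \;+\; dx^{2} + dy^{2} + dz^{2},
```

so for gz ≪ c² and slow motion the dynamics reduce to flat-earth gravity, which is the limit in which the projectile and pendulum analyses recover their introductory-physics versions.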
OPTHYLIC: An Optimised Tool for Hybrid Limits Computation
NASA Astrophysics Data System (ADS)
Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée
2018-05-01
A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
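The quantity being computed has a compact definition (the standard CLs construction, stated here in our notation):

```latex
\mathrm{CL_s} \;=\; \frac{\mathrm{CL}_{s+b}}{\mathrm{CL}_{b}},
```

where CL_{s+b} and CL_b are the probabilities, under the signal-plus-background and background-only hypotheses respectively, of obtaining a result at least as background-like as the one observed; a signal rate is excluded at the 95% confidence level when CLs ≤ 0.05. The "hybrid" aspect is that nuisance parameters describing systematic uncertainties are treated in a Bayesian fashion while the CLs construction itself remains frequentist.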
SPARSKIT: A basic tool kit for sparse matrix computations
NASA Technical Reports Server (NTRS)
Saad, Youcef
1990-01-01
Presented here are the main features of a tool package for manipulating and working with sparse matrices. One of the goals of the package is to provide basic tools to facilitate the exchange of software and data between researchers in sparse matrix computations. The starting point is the Harwell/Boeing collection of matrices for which the authors provide a number of tools. Among other things, the package provides programs for converting data structures, printing simple statistics on a matrix, plotting a matrix profile, and performing linear algebra operations with sparse matrices.
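One of the basic data-structure conversions such a kit provides can be illustrated with the compressed sparse row (CSR) scheme. The sketch below uses SciPy's CSR implementation rather than the original Fortran routines, purely to show the three-array layout:

```python
import numpy as np
from scipy.sparse import csr_matrix

A = np.array([[4.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])
S = csr_matrix(A)
print(S.data)     # nonzero values, stored row by row
print(S.indices)  # column index of each stored value
print(S.indptr)   # offset where each row begins in data/indices
```

Storing only the nonzeros plus row pointers is what makes operations such as matrix-vector products scale with the number of nonzeros rather than with the full matrix size.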
Dillehay, Tom D.; Goodbred, Steve; Pino, Mario; Vásquez Sánchez, Víctor F.; Tham, Teresa Rosales; Adovasio, James; Collins, Michael B.; Netherly, Patricia J.; Hastorf, Christine A.; Chiou, Katherine L.; Piperno, Dolores; Rey, Isabel; Velchoff, Nancy
2017-01-01
Simple pebble tools, ephemeral cultural features, and the remains of maritime and terrestrial foods are present in undisturbed Late Pleistocene and Early Holocene deposits underneath a large human-made mound at Huaca Prieta and nearby sites on the Pacific coast of northern Peru. Radiocarbon ages indicate an intermittent human presence dated between ~15,000 and 8000 calendar years ago before the mound was built. The absence of fishhooks, harpoons, and bifacial stone tools suggests that technologies of gathering, trapping, clubbing, and exchange were used primarily to procure food resources along the shoreline and in estuarine wetlands and distant mountains. The stone artifacts are minimally worked unifacial stone tools characteristic of several areas of South America. Remains of avocado, bean, and possibly cultivated squash and chile pepper are also present, suggesting human transport and consumption. Our new findings emphasize an early coastal lifeway of diverse food procurement strategies that suggest detailed observation of resource availability in multiple environments and a knowledgeable economic organization, although technologies were simple and campsites were seemingly ephemeral and discontinuous. These findings raise questions about the pace of early human movement along some areas of the Pacific coast and the level of knowledge and technology required to exploit maritime and inland resources. PMID:28560337
Open source software in a practical approach for post processing of radiologic images.
Valeri, Gianluca; Mazza, Francesco Antonino; Maggi, Stefania; Aramini, Daniele; La Riccia, Luigi; Mazzoni, Giovanni; Giovagnoni, Andrea
2015-03-01
The purpose of this paper is to evaluate the use of open source software (OSS) to process DICOM images. We selected 23 programs for Windows and 20 programs for Mac from 150 candidate OSS programs, including DICOM viewers and various tools (converters, DICOM header editors, etc.). The selected programs all meet basic requirements such as free availability, stand-alone operation, a graphical user interface, ease of installation, and advanced features beyond simple image display. Each selected program was evaluated on data import, data export, metadata handling, 2D viewing, 3D viewing, platform support, and usability, on a scale ranging from 1 to 10 points. Twelve programs received a score of eight or higher. Among them, five obtained a score of 9 (3D Slicer, MedINRIA, MITK 3M3, VolView, VR Render), while OsiriX received 10. OsiriX appears to be the only program able to perform all the operations considered, similar to a workstation equipped with proprietary software, allowing the analysis and interpretation of images in a simple and intuitive way. OsiriX is a DICOM PACS workstation for medical imaging and software for image processing for medical research, functional imaging, 3D imaging, confocal microscopy and molecular imaging. This application is also a good tool for teaching activities because it facilitates the attainment of learning objectives among students and other specialists.
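The header-level operations such tools perform are easy to illustrate with the open-source pydicom library, which is a Python package rather than one of the GUI programs evaluated above; the file names here are placeholders:

```python
# Basic DICOM header access and editing with the open-source pydicom library.
# pydicom is a Python package, not one of the GUI programs evaluated above;
# the file names are placeholders.
import pydicom

ds = pydicom.dcmread("study.dcm")     # parse the DICOM file (header + pixels)
print(ds.Modality, ds.StudyDate)      # read standard header elements
ds.PatientName = "ANONYMIZED"         # a simple header edit (de-identification)
ds.save_as("study_anon.dcm")
```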
Using Alice 2.0 to Design Games for People with Stroke.
Proffitt, Rachel; Kelleher, Caitlin; Baum, M Carolyn; Engsberg, Jack
2012-08-01
Computer and video games are gaining in popularity as rehabilitation tools. Unfortunately, most systems still require extensive programming/engineering knowledge to create, something that therapists, as novice programmers, do not possess. There is software designed to allow novice programmers to create storyboards and games through simple drag-and-drop formats; however, its applications for therapeutic game development have not been studied. The purpose of this study was to have an occupational therapy (OT) student with no prior computer programming experience learn how to create computer games for persons with stroke using Alice 2.0, a drag-and-drop editor designed by Carnegie Mellon University (Pittsburgh, PA). The OT student learned how to use Alice 2.0 through a textbook, tutorials, and assistance from computer science students. She kept a journal of her process, detailing her successes and challenges. The OT student created three games for people with stroke using Alice 2.0. She found that although there were many supports in Alice for creating stories, it lacked critical pieces necessary for game design. Her recommendations for a future programming environment for therapists were that it (1) be efficient, (2) include basic game design pieces so therapists do not have to create them, (3) provide technical support, and (4) be simple. With the incorporation of these recommendations, a future programming environment for therapists will be an effective tool for therapeutic game development.
Engineered nanomaterials: toward effective safety management in research laboratories.
Groso, Amela; Petri-Fink, Alke; Rothen-Rutishauser, Barbara; Hofmann, Heinrich; Meyer, Thierry
2016-03-15
It is still unknown which types of nanomaterials and associated doses represent an actual danger to humans and the environment. Meanwhile, there is consensus on applying the precautionary principle to these novel materials until more information is available. To deal with the rapid evolution of research, including the fast turnover of collaborators, a user-friendly and easy-to-apply risk assessment tool offering adequate preventive and protective measures has to be provided. Based on new information concerning the hazards of engineered nanomaterials, we improved a previously developed risk assessment tool by following a simple scheme to gain efficiency. In the first step, using a logical decision tree, one of three hazard levels, from H1 to H3, is assigned to the nanomaterial. Using a combination of decision trees and matrices, the second step links the hazard with the emission and exposure potential to assign one of three nanorisk levels (Nano 3, highest risk; Nano 1, lowest risk) to the activity. These operations are repeated at each process step, leading to the laboratory classification. The third step provides detailed preventive and protective measures for the determined level of nanorisk. We developed a simple and intuitive method adapted for nanomaterial risk management in research laboratories. It allows classifying nanoactivities into three levels and additionally proposes concrete preventive and protective measures and associated actions. This method is a valuable tool for all the participants in nanomaterial safety. The users experience an essential learning opportunity and increase their safety awareness. Laboratory managers have a reliable tool to obtain an overview of the operations involving nanomaterials in their laboratories; this is essential, as they are responsible for employee safety but are sometimes unaware of the work performed. Bringing this risk to a three-band scale (like other types of risks such as biological, radiation, chemical, etc.) facilitates management for occupational health and safety specialists. Institutes and school managers can obtain the necessary information to implement an adequate safety management system. Having an easy-to-use tool enables a dialog between all these partners, whose semantics and priorities in terms of safety often differ.
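The two-step logic described, a hazard level from a decision tree followed by a hazard-by-exposure lookup giving the nanorisk band, maps naturally onto a small matrix. In the sketch below, the specific band assignments are illustrative assumptions; the published tool uses more detailed decision trees at each process step.

```python
# Sketch of the two-step hazard/exposure classification described in the abstract.
# The specific matrix entries below are illustrative assumptions; the published
# tool uses more detailed decision trees at each process step.

HAZARD_LEVELS = ("H1", "H2", "H3")           # from the hazard decision tree
EXPOSURE_LEVELS = ("low", "medium", "high")  # emission/exposure potential

# risk_matrix[hazard][exposure] -> nanorisk band (Nano 1 lowest, Nano 3 highest)
RISK_MATRIX = {
    "H1": {"low": "Nano 1", "medium": "Nano 1", "high": "Nano 2"},
    "H2": {"low": "Nano 1", "medium": "Nano 2", "high": "Nano 3"},
    "H3": {"low": "Nano 2", "medium": "Nano 3", "high": "Nano 3"},
}

def classify_activity(hazard: str, exposure: str) -> str:
    return RISK_MATRIX[hazard][exposure]

def classify_laboratory(process_steps: list[tuple[str, str]]) -> str:
    """The lab takes the band of its highest-risk step; the band names
    ("Nano 1" < "Nano 2" < "Nano 3") happen to sort correctly as strings."""
    return max(classify_activity(h, e) for h, e in process_steps)

print(classify_laboratory([("H2", "low"), ("H3", "medium")]))  # -> Nano 3
```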
Assessing and Addressing Students' Scientific Literacy Needs in Physical Geology
NASA Astrophysics Data System (ADS)
Campbell-Stone, E. A.; Myers, J. D.
2005-12-01
Exacting excellence equally from university students around the globe can be accomplished by providing all students with necessary background tools to achieve mastery of their courses, even if those tools are not part of normal content. As instructors we hope to see our students grasp the substance of our courses, make mental connections between course material and practical applications, and use this knowledge to make informed decisions as citizens. Yet many educators have found that students enter university-level introductory courses in mathematics, science and engineering without adequate academic preparation. As part of a FIPSE-funded project at the University of Wyoming, the instructors of the Physical Geology course have taken a new approach to tackling the problem of lack of scientific/mathematic skills in incoming students. Instead of assuming that students should already know or will learn these skills on their own, they assess students' needs and provide them the opportunity to master scientific literacies as they learn geologic content. In the introductory geology course, instructors identified two categories of literacies, or basic skills that are necessary for academic success and citizen participation. Fundamental literacies include performing simple quantitative calculations, making qualitative assessments, and reading and analyzing tables and graphs. Technical literacies are those specific to understanding geology, and comprise the ability to read maps, visualize changes through time, and conceptualize in three dimensions. Because these skills are most easily taught in lab, the in-house lab manual was rewritten to be both literacy- and content-based. Early labs include simple exercises addressing literacies in the context of geological science, and each subsequent lab repeats exposure to literacies, but at increasing levels of difficulty. Resources available to assist students with literacy mastery include individual instruction, a detailed appendix to the lab manual explaining simple tasks such as converting units, and web-based resources. To document the progress of this program, students take pre- and post-course surveys assessing their grasp of the literacies. The surveys gather data on demographics, background, level of interest, level of confidence, understanding, and willingness to complete additional problem sets. This information has been integral in identifying areas of greatest weakness, least interest, and in gauging how backgrounds, expectations, and students' confidence affect their performance.
Ito, Yoichiro; Ma, Xiaofeng; Clary, Robert
2016-01-01
A simple tool is introduced which can modify the shape of tubing to enhance the partition efficiency in high-speed countercurrent chromatography. It consists of a pair of interlocking identical gears, each coaxially holding a pressing wheel, to intermittently compress plastic tubing over lengths of 0–10 mm at 1 cm intervals. The performance of the processed tubing was examined in protein separation with 1.6 mm ID PTFE tubing intermittently pressed to 3 mm and 10 mm widths, both at 10 mm intervals, at various flow rates and revolution speeds. A series of experiments was performed with a polymer phase system composed of polyethylene glycol and dibasic potassium phosphate, each at 12.5% (w/w) in deionized water, using three protein samples. Overall, the results clearly demonstrate that the compressed tubing can yield substantially higher peak resolution than the non-processed tubing. The simple tubing modifier is very useful for the separation of proteins by high-speed countercurrent chromatography. PMID:27818942
Crawshaw, Benjamin P; Keller, Deborah S; Brady, Justin T; Augestad, Knut M; Schiltz, Nicholas K; Koroukian, Siran M; Navale, Suparna M; Steele, Scott R; Delaney, Conor P
2017-03-01
The HospitAl length of stay, Readmissions and Mortality (HARM) score is a simple, inexpensive quality tool linked directly to patient outcomes. We assessed the HARM score for measuring surgical quality across multiple surgical populations. Upper gastrointestinal, hepatobiliary, and colorectal surgery cases between 2005 and 2009 were identified from the Healthcare Cost and Utilization Project California State Inpatient Database. Composite and individual HARM scores were calculated from length of stay, 30-day readmission, and mortality, correlated with complication rates for each hospital, and stratified by operative type. 71,419 admissions were analyzed. Higher HARM scores correlated with higher complication rates for all cases after risk adjustment and stratification by operation type and elective or emergent status. The HARM score is a simple and valid quality measurement for upper gastrointestinal, hepatobiliary and colorectal surgery, and could facilitate benchmarking to improve patient outcomes and resource utilization. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Manousiouthakis, Vasilios
1995-01-01
We developed simple mathematical models for many of the technologies constituting the water reclamation system in a space station. These models were employed for subsystem optimization and for evaluating the performance of individual water reclamation technologies, by quantifying their operational 'cost' as a linear function of weight, volume, and power consumption. We then performed preliminary investigations of the performance improvements attainable by simple hybrid systems involving parallel combinations of technologies. We are developing a software tool for synthesizing a hybrid water recovery system (WRS) for long-term space missions. As a conceptual framework, we are employing the state space approach. Given a number of available technologies and the mission specifications, the state space approach would help design flowsheets featuring optimal process configurations, including those with stream connections in parallel, in series, or with recycles. We envision this software tool functioning as follows: given the mission duration, the crew size, water quality specifications, and the cost coefficients, the software will synthesize a water recovery system for the space station. It should require minimal user intervention. The following tasks need to be solved to achieve this goal: (1) formulate a problem statement that will be used to evaluate the advantages of a hybrid WRS over a single-technology WRS; (2) model several WRS technologies that can be employed in the space station; (3) propose a recycling network design methodology (since the WRS synthesis task is a recycling network design problem, it is essential to employ a systematic method in synthesizing this network); (4) develop a software implementation of this design methodology, design a hybrid system using this software, and compare the resulting WRS with a base-case WRS; and (5) create a user-friendly interface for this software tool.
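The linear 'cost' quantification described above is straightforward to express. A sketch with hypothetical coefficients and technology parameters (none of these numbers come from the study):

```python
# Sketch of the linear equivalent-cost function described above. All coefficients
# and technology parameters are hypothetical placeholders, not mission data.

def equivalent_cost(weight_kg: float, volume_m3: float, power_w: float,
                    c_w: float = 1.0, c_v: float = 50.0, c_p: float = 0.2) -> float:
    """Operational 'cost' as a linear combination of weight, volume and power."""
    return c_w * weight_kg + c_v * volume_m3 + c_p * power_w

# Hypothetical candidate technologies for one subsystem: (weight, volume, power).
technologies = {
    "multifiltration": (120.0, 0.4, 300.0),
    "reverse_osmosis": (90.0, 0.3, 550.0),
}
best = min(technologies, key=lambda t: equivalent_cost(*technologies[t]))
print("lowest-cost single technology:", best)
```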
Tsai, Ping-Huang; Liu, Jian-Liang; Lin, Ker-Neng; Chang, Chiung-Chih; Pai, Ming-Chyi; Wang, Wen-Fu; Huang, Jen-Ping; Hwang, Tzung-Jeng; Wang, Pei-Ning
2018-01-01
Objectives: To develop a simple dementia screening tool to assist primary care physicians in identifying patients with cognitive impairment among subjects with memory complaints or at a high risk for dementia. Design: The Brain Health Test (BHT) was developed by several experienced neurologists, psychiatrists, and clinical psychologists in the Taiwan Dementia Society. Validation of the BHT was conducted in the memory clinics of various levels of hospitals in Taiwan. Participants: All dementia patients at the memory clinics who met the inclusion criterion of age greater than or equal to 50 years were enrolled. Besides the BHT, the Mini-Mental State Examination and Clinical Dementia Rating were used to evaluate the cognitive state of the patients and the severity of dementia. Results: The BHT includes two parts: a risk evaluation and a cognitive test (BHT-cog). Self- or informant-reported memory decline, or needing help from others to manage money or medications, was significantly associated with cognitive impairment. Among the risk factors evaluated in the BHT, a total risk score greater than or equal to 8 was defined as a high risk for dementia. The total score for the finalized BHT-cog was 16. When the cutoff value for the BHT-cog was set to 10 for differentiating dementia from a normal mental state, the sensitivity was 91.5%, the specificity was 87.3%, the positive predictive value was 94.8%, and the negative predictive value was 80.1%. The area under the receiver operating characteristic curve between dementia and healthy subjects was 0.958 (95% CI = 0.941–0.975). Conclusions: The BHT is a simple tool that may be useful in primary care settings to identify high-risk patients to target for cognitive screening. PMID:29694392
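The screening logic and its reported operating characteristics reduce to a cutoff rule plus a confusion-matrix calculation. In the sketch below, the direction of the cutoff (scores below 10 flagged) and the counts are assumptions for illustration; only the cutoff value and the metric definitions come from the abstract.

```python
# Sketch of the screening logic and operating characteristics reported above.
# The cutoff value (10, max score 16) is from the abstract; the flagging
# direction and the counts below are hypothetical, only to show the arithmetic.

def screen_positive(bht_cog_score: int, cutoff: int = 10) -> bool:
    """Assume scores below the cutoff are flagged as possible dementia."""
    return bht_cog_score < cutoff

def operating_characteristics(tp: int, fp: int, tn: int, fn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

print(screen_positive(8))   # True: flagged for further work-up
print(operating_characteristics(tp=183, fp=10, tn=69, fn=17))
```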
NASA Technical Reports Server (NTRS)
Matney, Mark
2011-01-01
A number of statistical tools have been developed over the years for assessing the risk of reentering objects to human populations. These tools make use of the characteristics (e.g., mass, material, shape, size) of debris that are predicted by aerothermal models to survive reentry. The statistical tools use this information to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of the analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and how to use that information to arrive at realistic risk numbers. Because this information is used in making policy and engineering decisions, it is important that these assumptions be tested using empirical data. This study uses the latest database of known uncontrolled reentry locations measured by the United States Department of Defense. The predicted ground footprint distributions of these objects are based on the theory that their orbits behave basically like simple Kepler orbits. However, there are a number of factors in the final stages of reentry, including the effects of gravitational harmonics, the effects of the Earth's equatorial bulge on the atmosphere, and the rotation of the Earth and atmosphere, that could cause them to diverge from simple Kepler orbit behavior and possibly change the probability of reentering over a given location. In this paper, the measured latitude and longitude distributions of these objects are directly compared with the predicted distributions, providing a fundamental empirical test of the model assumptions.
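The simple-Kepler baseline being tested has a compact closed form: for a circular orbit of inclination i, the fraction of time spent near latitude φ follows p(φ) = cos φ / (π √(sin²i − sin²φ)) for |φ| < i, which is why uncontrolled reentries are expected to concentrate near the extreme latitudes ±i. A short sketch of that idealized density (not the paper's full analysis):

```python
# Sketch of the simple-Kepler baseline: for a circular orbit of inclination i,
# the time spent near latitude phi diverges toward +/- i, which is why
# uncontrolled reentries concentrate near those latitudes. This is only the
# idealized model the paper tests against measured reentry locations.
import numpy as np

def latitude_density(phi_deg: np.ndarray, incl_deg: float) -> np.ndarray:
    """p(phi) = cos(phi) / (pi * sqrt(sin^2 i - sin^2 phi)), for |phi| < i."""
    phi, i = np.radians(phi_deg), np.radians(incl_deg)
    inside = np.abs(phi) < i
    dens = np.zeros_like(phi)
    dens[inside] = np.cos(phi[inside]) / (
        np.pi * np.sqrt(np.sin(i)**2 - np.sin(phi[inside])**2))
    return dens

lats = np.linspace(-50, 50, 11)
print(latitude_density(lats, incl_deg=51.6))  # e.g. an ISS-like inclination
```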
Induced sensitivity of Bacillus subtilis colony morphology to mechanical media compression
Polka, Jessica K.
2014-01-01
Bacteria from several taxa, including Kurthia zopfii, Myxococcus xanthus, and Bacillus mycoides, have been reported to align growth of their colonies to small features on the surface of solid media, including anisotropies created by compression. While the function of this phenomenon is unclear, it may help organisms navigate on solid phases, such as soil. The origin of this behavior is also unknown: it may be biological (that is, dependent on components that sense the environment and regulate growth accordingly) or merely physical. Here we show that B. subtilis, an organism that typically does not respond to media compression, can be induced to do so with two simple and synergistic perturbations: a mutation that maintains cells in the swarming (chained) state, and the addition of EDTA to the growth media, which further increases chain length. EDTA apparently increases chain length by inducing defects in cell separation, as the treatment has only marginal effects on the length of individual cells. These results lead us to three conclusions. First, the wealth of genetic tools available to B. subtilis will provide a new, tractable chassis for engineering compression sensitive organisms. Second, the sensitivity of colony morphology to media compression in Bacillus can be modulated by altering a simple physical property of rod-shaped cells. And third, colony morphology under compression holds promise as a rapid, simple, and low-cost way to screen for changes in the length of rod-shaped cells or chains thereof. PMID:25289183
Nyblade, Laura; Jain, Aparna; Benkirane, Manal; Li, Li; Lohiniva, Anna-Leena; McLean, Roger; Turan, Janet M; Varas-Díaz, Nelson; Cintrón-Bou, Francheska; Guan, Jihui; Kwena, Zachary; Thomas, Wendell
2013-11-13
Within healthcare settings, HIV-related stigma is a recognized barrier to accessing HIV prevention and treatment services, and yet few efforts have been made to scale up stigma reduction programs in service delivery. This is in part due to the lack of a brief, simple, standardized tool for measuring stigma among all levels of health facility staff that works across diverse HIV prevalence, language, and healthcare settings. In response, an international consortium led by the Health Policy Project has developed and field-tested a stigma measurement tool for use with health facility staff. Experts participated in a content-development workshop to review an item pool of existing measures, identify gaps and prioritize questions. The resulting questionnaire was field-tested in six diverse sites (China, Dominica, Egypt, Kenya, Puerto Rico and St. Christopher & Nevis). Respondents included clinical and non-clinical staff. Questionnaires were self- or interviewer-administered. Analysis of item performance across sites examined both psychometric properties and contextual issues. The key outcome of the process was a substantially reduced questionnaire. Eighteen core questions measure three programmatically actionable drivers of stigma within health facilities (worry about HIV transmission, attitudes towards people living with HIV (PLHIV), and health facility environment, including policies), and enacted stigma. The questionnaire also includes one short scale for attitudes towards PLHIV (5-item scale, α=0.78). Stigma-reduction programmes in healthcare facilities are urgently needed to improve the quality of care provided, uphold the human right to healthcare, increase access to health services, and maximize investments in HIV prevention and treatment. This brief, standardized tool will facilitate the inclusion of stigma measurement in research studies and in routine facility data collection, allowing for the monitoring of stigma within healthcare facilities and evaluation of stigma-reduction programmes. There is potential for wide use of the tool, either as a stand-alone survey or integrated within other studies of health facility staff.
Rocket Engine Oscillation Diagnostics
NASA Technical Reports Server (NTRS)
Nesman, Tom; Turner, James E. (Technical Monitor)
2002-01-01
Rocket engine oscillating data can reveal many physical phenomena ranging from unsteady flow and acoustics to rotordynamics and structural dynamics. Because of this, engine diagnostics based on oscillation data should employ both signal analysis and physical modeling. This paper describes an approach to rocket engine oscillation diagnostics, types of problems encountered, and example problems solved. Determination of design guidelines and environments (or loads) from oscillating phenomena is required during initial stages of rocket engine design, while the additional tasks of health monitoring, incipient failure detection, and anomaly diagnostics occur during engine development and operation. Oscillations in rocket engines are typically related to flow driven acoustics, flow excited structures, or rotational forces. Additional sources of oscillatory energy are combustion and cavitation. Included in the example problems is a sampling of signal analysis tools employed in diagnostics. The rocket engine hardware includes combustion devices, valves, turbopumps, and ducts. Simple models of an oscillating fluid system or structure can be constructed to estimate pertinent dynamic parameters governing the unsteady behavior of engine systems or components. In the example problems it is shown that simple physical modeling when combined with signal analysis can be successfully employed to diagnose complex rocket engine oscillatory phenomena.
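One of the simplest signal-analysis steps in such diagnostics is locating the dominant spectral peak in a sensor record. A generic sketch with synthetic data follows; the sample rate and 180 Hz tone are arbitrary stand-ins, not engine data from the paper.

```python
# Minimal example of the kind of signal analysis used in oscillation diagnostics:
# locate the dominant spectral peak in a noisy sensor record. The sample rate
# and 180 Hz tone are arbitrary stand-ins, not engine data from the paper.
import numpy as np

fs = 2000.0                                   # sample rate, Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 180.0 * t) + 0.5 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]     # skip the DC bin
print(f"dominant oscillation near {peak:.1f} Hz")  # ~180 Hz
```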
Helioviewer.org: Browsing Very Large Image Archives Online Using JPEG 2000
NASA Astrophysics Data System (ADS)
Hughitt, V. K.; Ireland, J.; Mueller, D.; Dimitoglou, G.; Garcia Ortiz, J.; Schmidt, L.; Wamsler, B.; Beck, J.; Alexanderian, A.; Fleck, B.
2009-12-01
As the amount of solar data available to scientists continues to increase at faster and faster rates, it is important that there exist simple tools for navigating this data quickly with a minimal amount of effort. By combining heterogeneous solar physics data types such as full-disk images and coronagraphs, along with feature and event information, Helioviewer offers a simple and intuitive way to browse multiple datasets simultaneously. Images are stored in a repository using the JPEG 2000 format and tiled dynamically upon a client's request. By tiling images and serving only the portions of the image requested, it is possible for the client to work with very large images without having to fetch all of the data at once. In addition to a focus on intercommunication with other virtual observatories and browsers (VSO, HEK, etc.), Helioviewer will offer a number of externally available application programming interfaces (APIs) to enable easy third-party use, adoption and extension. Recent efforts have resulted in increased performance, dynamic movie generation, and improved support for mobile web browsers. Future functionality will include: support for additional data sources including RHESSI, SDO, STEREO, and TRACE; a navigable timeline of recorded solar events; social annotation; and basic client-side image processing.
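The tile-selection arithmetic that makes this efficient is simple: fetch only the tiles intersecting the client's viewport. A generic sketch of that calculation (not Helioviewer's actual server API):

```python
# Generic tile-selection arithmetic of the kind a tiled image viewer performs:
# fetch only the tiles that intersect the visible viewport. This illustrates
# the idea; it is not Helioviewer's actual server API.
import math

def visible_tiles(vx: int, vy: int, vw: int, vh: int, tile: int = 512):
    """Tile (column, row) indices covering a viewport at pixel offset (vx, vy)."""
    first_col, first_row = vx // tile, vy // tile
    last_col = math.ceil((vx + vw) / tile) - 1
    last_row = math.ceil((vy + vh) / tile) - 1
    return [(c, r) for r in range(first_row, last_row + 1)
                   for c in range(first_col, last_col + 1)]

# A 1024x768 viewport into a 16k x 16k full-disk image needs only 4 tiles:
print(visible_tiles(4096, 4096, 1024, 768))
```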
Kwasa, Judith; Cettomai, Deanna; Lwanya, Edwin; Osiemo, Dennis; Oyaro, Patrick; Birbeck, Gretchen L; Price, Richard W; Bukusi, Elizabeth A; Cohen, Craig R; Meyer, Ana-Claire L
2012-01-01
To conduct a preliminary evaluation of the utility and reliability of a diagnostic tool for HIV-associated dementia (HAD) for use by primary health care workers (HCW) that would be feasible to implement in resource-limited settings. In resource-limited settings, HAD is an indication for anti-retroviral therapy regardless of CD4 T-cell count. Anti-retroviral therapy, the treatment for HAD, is now increasingly available in resource-limited settings. Nonetheless, HAD remains under-diagnosed, likely because of limited clinical expertise and limited availability of diagnostic tests. Thus, a simple diagnostic tool that is practical to implement in resource-limited settings is urgently needed. A convenience sample of 30 HIV-infected outpatients was enrolled in Western Kenya. We assessed the sensitivity and specificity of a diagnostic tool for HAD as administered by a primary HCW. This was compared to an expert clinical assessment which included examination by a physician, neuropsychological testing, and, in selected cases, brain imaging. Agreement between the HCW and an expert examiner on certain tool components was measured using the kappa statistic. The sample was 57% male, mean age was 38.6 years, mean CD4 T-cell count was 323 cells/µL, and 54% had less than a secondary school education. Six (20%) of the subjects were diagnosed with HAD by expert clinical assessment. The diagnostic tool was 63% sensitive and 67% specific for HAD. Agreement between HCW and expert examiners was poor for many individual items of the diagnostic tool (κ = 0.03–0.65). This diagnostic tool had moderate sensitivity and specificity for HAD. However, reliability was poor, suggesting that substantial training and formal evaluations of training adequacy will be critical to enable HCW to reliably administer a brief diagnostic tool for HAD.
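The kappa statistic used here is standard; for a binary item rated by two raters it is a short calculation from the agreement table. A sketch with hypothetical counts:

```python
# Cohen's kappa for agreement between two raters on a binary item, as used to
# compare health care workers with expert examiners above. Counts are hypothetical.

def cohens_kappa(both_yes: int, both_no: int, a_only: int, b_only: int) -> float:
    n = both_yes + both_no + a_only + b_only
    p_observed = (both_yes + both_no) / n
    # Expected chance agreement from each rater's marginal "yes" rates.
    a_yes = (both_yes + a_only) / n
    b_yes = (both_yes + b_only) / n
    p_expected = a_yes * b_yes + (1 - a_yes) * (1 - b_yes)
    return (p_observed - p_expected) / (1 - p_expected)

print(round(cohens_kappa(both_yes=10, both_no=12, a_only=4, b_only=4), 2))
```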
Evaluation of IOTA Simple Ultrasound Rules to Distinguish Benign and Malignant Ovarian Tumours
Kaur, Amarjit; Mohi, Jaswinder Kaur; Sibia, Preet Kanwal; Kaur, Navkiran
2017-01-01
Introduction: IOTA stands for the International Ovarian Tumour Analysis group. Ovarian cancer is one of the most common cancers in women and, in the majority of cases, is diagnosed at a late stage. The limiting factor for early diagnosis is the lack of standardized terms and procedures in gynaecological sonography. The introduction of the IOTA rules has provided some consistency in defining the morphological features of ovarian masses through a standardized examination technique. Aim: To evaluate the efficacy of the IOTA simple ultrasound rules in distinguishing benign and malignant ovarian tumours and to establish their use as a tool in the early diagnosis of ovarian malignancy. Materials and Methods: A hospital-based case-control prospective study was conducted. Patients with suspected ovarian pathology were evaluated using the IOTA ultrasound rules and designated as benign or malignant. Findings were correlated with histopathological findings. Collected data were statistically analysed using the chi-square test and the kappa statistic. Results: Of the initial 55 patients, 50 who underwent surgery were included in the final analysis. The IOTA simple rules were applicable in 45 of these 50 patients (90%). The sensitivity for the detection of malignancy in cases where the IOTA simple rules were applicable was 91.66% and the specificity was 84.84%. Accuracy was 86.66%. Classifying inconclusive cases as malignant, the sensitivity and specificity were 93% and 80% respectively. Agreement between USG and histopathological diagnosis gave a kappa value of 0.323. Conclusion: The IOTA simple ultrasound rules were highly sensitive and specific in predicting ovarian malignancy preoperatively while being reproducible and easy to train and use. PMID:28969237
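The decision logic of the simple rules is compact: a mass with only malignant (M) features is classified malignant, one with only benign (B) features benign, and masses with both or neither are inconclusive (counted as malignant in the study's secondary analysis). A sketch, with the B/M feature lists abbreviated rather than reproduced from the IOTA publications:

```python
# Sketch of how the IOTA simple rules are applied: only M features -> malignant,
# only B features -> benign, both or neither -> inconclusive. The feature names
# here are abbreviated examples; see the IOTA publications for the full
# B1-B5 / M1-M5 definitions.

def iota_classify(b_features: set[str], m_features: set[str]) -> str:
    if m_features and not b_features:
        return "malignant"
    if b_features and not m_features:
        return "benign"
    return "inconclusive"  # classified as malignant in the secondary analysis

print(iota_classify(b_features={"unilocular cyst"}, m_features=set()))
print(iota_classify(b_features={"acoustic shadows"}, m_features={"ascites"}))
```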
Afzal, Naveed; Sohn, Sunghwan; Abram, Sara; Scott, Christopher G.; Chaudhry, Rajeev; Liu, Hongfang; Kullo, Iftikhar J.; Arruda-Olson, Adelaide M.
2016-01-01
Objective: Lower extremity peripheral arterial disease (PAD) is highly prevalent and affects millions of individuals worldwide. We developed a natural language processing (NLP) system for automated ascertainment of PAD cases from clinical narrative notes and compared the performance of the NLP algorithm to billing code algorithms, using ankle-brachial index (ABI) test results as the gold standard. Methods: We compared the performance of the NLP algorithm to 1) results of the gold standard ABI; 2) previously validated algorithms based on relevant ICD-9 diagnostic codes (simple model); and 3) a combination of ICD-9 codes with procedural codes (full model). A dataset of 1,569 PAD patients and controls was randomly divided into training (n = 935) and testing (n = 634) subsets. Results: We iteratively refined the NLP algorithm in the training set, including narrative note sections, note types, and service types, to maximize its accuracy. In the testing dataset, when compared with both simple and full models, the NLP algorithm had better accuracy (NLP: 91.8%, full model: 81.8%, simple model: 83%, P<.001), PPV (NLP: 92.9%, full model: 74.3%, simple model: 79.9%, P<.001), and specificity (NLP: 92.5%, full model: 64.2%, simple model: 75.9%, P<.001). Conclusions: A knowledge-driven NLP algorithm for automatic ascertainment of PAD cases from clinical notes had greater accuracy than billing code algorithms. Our findings highlight the potential of NLP tools for rapid and efficient ascertainment of PAD cases from electronic health records to facilitate clinical investigation and eventually improve care by clinical decision support. PMID:28189359
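Rule-based clinical NLP of this general kind can be sketched as section filtering plus pattern matching with negation handling. The patterns, section names, and negation window below are illustrative stand-ins, not the study's validated algorithm:

```python
# Illustrative keyword/negation sketch of rule-based case ascertainment from
# clinical notes. The patterns, sections, and negation window are stand-ins;
# the study's validated PAD algorithm is knowledge-driven and far richer.
import re

PAD_PATTERNS = re.compile(
    r"\b(peripheral arterial disease|PAD|claudication|abnormal ankle-brachial)\b",
    re.IGNORECASE)
NEGATION = re.compile(r"\b(no|denies|negative for|without)\b[^.]{0,40}$",
                      re.IGNORECASE)
SECTIONS = ("IMPRESSION", "ASSESSMENT", "DIAGNOSIS")  # assumed relevant sections

def note_suggests_pad(note: str) -> bool:
    for line in note.splitlines():
        section, _, text = line.partition(":")
        if section.strip().upper() not in SECTIONS:
            continue
        for match in PAD_PATTERNS.finditer(text):
            if not NEGATION.search(text[:match.start()]):  # skip negated mentions
                return True
    return False

print(note_suggests_pad("IMPRESSION: claudication of the left calf."))  # True
print(note_suggests_pad("IMPRESSION: patient denies claudication."))    # False
```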
New approach to analyzing soil-building systems
Safak, E.
1998-01-01
A new method of analyzing seismic response of soil-building systems is introduced. The method is based on the discrete-time formulation of wave propagation in layered media for vertically propagating plane shear waves. Buildings are modeled as an extension of the layered soil media by assuming that each story in the building is another layer. The seismic response is expressed in terms of wave travel times between the layers, and the wave reflection and transmission coefficients at layer interfaces. The calculation of the response is reduced to a pair of simple finite-difference equations for each layer, which are solved recursively starting from the bedrock. Compared with the commonly used vibration formulation, the wave propagation formulation provides several advantages, including the ability to incorporate soil layers, simplicity of the calculations, improved accuracy in modeling the mass and damping, and better tools for system identification and damage detection.
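The interface physics underlying the formulation is compact: at each layer boundary, the reflection and transmission coefficients for vertically propagating shear waves follow from the impedance contrast. A minimal sketch of those building blocks (the recursive finite-difference solution itself is not reproduced):

```python
# Interface reflection/transmission coefficients for vertically propagating
# shear waves, the building blocks of the layered wave-propagation formulation
# described above. The full recursive finite-difference solution is not shown.

def impedance(density: float, shear_velocity: float) -> float:
    return density * shear_velocity

def interface_coefficients(rho1, vs1, rho2, vs2):
    """Upgoing wave from layer 1 into layer 2: (reflection, transmission)."""
    z1, z2 = impedance(rho1, vs1), impedance(rho2, vs2)
    r = (z1 - z2) / (z1 + z2)
    t = 2.0 * z1 / (z1 + z2)
    return r, t

# Soft soil layer (rho=1800 kg/m^3, Vs=200 m/s) against bedrock (2500, 1000):
print(interface_coefficients(1800.0, 200.0, 2500.0, 1000.0))
```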
Opal web services for biomedical applications.
Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W
2010-07-01
Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has since grown to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the servers are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web form-based access and programmatic access to all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we successfully executed 239,814 jobs. The number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications via Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.
Schwean-Lardner, Karen; Vermette, Catherine; Leis, Marina; Classen, Henry L.
2016-01-01
Simple Summary: Altering daylength in a poultry management program is a simple tool that can have immense impacts on productivity and bird welfare. It is not uncommon for lighting data derived from broiler research to be extrapolated to turkey production. This review of two studies (one with broilers and the second with turkeys), completed in the same research facility using the same lighting programs, shows evidence that some, but not all, responses to graded daylengths are similar between these two species. It indicates that daylength choices for turkeys should be based on research conducted with turkeys. Abstract: Daylength used as a management tool has powerful implications for the welfare of both broilers and turkeys. Near-constant light results in many detrimental impacts, including lack of behavioural rhythms and circadian melatonin rhythms. Both are suggestive that sleep fragmentation could result in birds reared on long photoperiods, which can lead to the same negative health and physiological responses as total sleep deprivation. An indirect comparison of the welfare implications of graded levels of daylength on broilers and turkeys clearly indicates that long daylengths depress welfare by increasing mortality, reducing mobility, increasing ocular pathologies and changing behaviour in both species. Furthermore, long daylengths change melatonin secretion patterns and eliminate behavioural and melatonin circadian rhythms, which were measured in broilers in these works. However, feather pecking in turkeys was reduced when birds were exposed to long daylengths. Exactly how much darkness should be included in a management program to maximize welfare will depend on the species, the age at marketing, and, in turkeys, bird gender. PMID:27120624
Locating the Center of Gravity: The Dance of Normal and Frictional Forces
ERIC Educational Resources Information Center
Balta, Nuri
2012-01-01
Teaching physics concepts with the basic materials that are around us is one of the beauties of physics. Without expensive lab materials and long experiments, many physics concepts can be taught to students using simple tools. Demonstrations with these tools can be presented as discrepant events that surprise, amaze, or puzzle students. Greenslade…
Why Are Shot Puts Thrown at 31[degrees]? Using Autograph for Applications of the Parabola
ERIC Educational Resources Information Center
Butler, Douglas
2010-01-01
Autograph is a two- and three-dimensional dynamic statistics and graphing utility, developed in England, that has grown out of direct classroom experience. A simple select-and-right-click interface, together with tools such as Autograph's unique Slow Plot, Scribble Tool, and dynamic Constant Controller help make the classroom experience…
Collaborating across Time Zones: How 2.0 Technology Can Bring Your Global Team Together
ERIC Educational Resources Information Center
Hastings, Robin
2008-01-01
The Web 2.0 tools and services that are making socializing, networking, and communicating in general so easy are also making group projects seriously simple. With the judicious use of a few of the popular tools that use Web 2.0 technologies and philosophies, one can collaboratively create documents, spreadsheets, presentations, websites, project…
Four-Month-Old Infants Individuate and Track Simple Tools Following Functional Demonstrations
ERIC Educational Resources Information Center
Stavans, Maayan; Baillargeon, Renée
2018-01-01
Two experiments examined whether 4-month-olds (n = 120) who were induced to assign two objects to different categories would then be able to take advantage of these contrastive categorical encodings to individuate and track the objects. In each experiment, infants first watched functional demonstrations of two tools, a masher and tongs (Experiment…
Eating Disorders in Graduate Students: Exploring the SCOFF Questionnaire as a Simple Screening Tool
ERIC Educational Resources Information Center
Parker, Sarah C.; Lyons, John; Bonner, Julia
2005-01-01
The results of several studies have established the validity of the SCOFF questionnaire (a 5-question screening tool for eating disorders), but researchers need to explore further replicability using the US version in the graduate school population. In this study, the authors asked 335 graduate students attending the Northwestern student health…
A Constitutive Relationship between Crack Propagation and Specific Damping Capacity in Steel
1990-10-01
diagnostic tool for detecting crack growth in structures. The model must be simple to act as a tool, but it must be comprehensive to provide accuracy. Notation: ε_u, the critical strain for static fracture; ε_p, the critical strain above which plastic strain occurs; Δε_p, the average value of the cyclic plastic-strain range; and ε_t = ln(A0/Ai), the true strain.
Ibrahim, G H; Buch, M H; Lawson, C; Waxman, R; Helliwell, P S
2009-01-01
To evaluate an existing tool (the Swedish modification of the Psoriasis Assessment Questionnaire) and to develop a new instrument to screen for psoriatic arthritis in people with psoriasis. The starting point was a community-based survey of people with psoriasis using questionnaires developed from the literature. Selected respondents were examined, and additional known cases of psoriatic arthritis were included in the analysis. The new instrument was developed using univariate statistics and a logistic regression model, comparing people with and without psoriatic arthritis. The instruments were compared using receiver operating characteristic (ROC) curve analysis. 168 questionnaires were returned (response rate 27%) and 93 people attended for examination (55% of questionnaire respondents). Of these 93, twelve were newly diagnosed with psoriatic arthritis during this study. These 12 were supplemented by 21 people with known psoriatic arthritis. Just 5 questions were found to be significant predictors of psoriatic arthritis in this population. Figures for sensitivity and specificity were 0.92 and 0.78 respectively, an improvement on the Alenius tool (sensitivity and specificity, 0.63 and 0.72 respectively). A new screening tool for identifying people with psoriatic arthritis has been developed. Five simple questions demonstrated good sensitivity and specificity in this population, but further validation is required.
Navigational Guidance and Ablation Planning Tools for Interventional Radiology.
Sánchez, Yadiel; Anvari, Arash; Samir, Anthony E; Arellano, Ronald S; Prabhakar, Anand M; Uppot, Raul N
Image-guided biopsy and ablation rely on successful identification and targeting of lesions. Currently, image-guided procedures are routinely performed under ultrasound, fluoroscopy, magnetic resonance imaging, or computed tomography (CT) guidance. However, these modalities have their limitations, including inadequate visibility of the lesion; lesion, organ, or patient motion; compatibility of instruments in a magnetic resonance imaging field; and, for CT and fluoroscopy cases, radiation exposure. Recent advances in technology have resulted in the development of a new generation of navigational guidance tools that can aid in targeting lesions for biopsy or ablation. These navigational guidance tools have evolved from simple hand-held trajectory guidance tools, to electronic needle visualization, to image fusion, to the development of a body global positioning system, to growth in cone-beam CT, and to ablation volume planning. These navigational systems are promising technologies that not only have the potential to improve lesion targeting (thereby increasing the diagnostic yield of a biopsy or the success of tumor ablation) but also have the potential to decrease radiation exposure to the patient and staff, decrease procedure time, decrease sedation requirements, and improve patient safety. The purpose of this article is to describe the challenges in current standard image-guided techniques, provide a definition and overview of these next-generation navigational devices, and describe the current limitations of these still-evolving, next-generation navigational guidance tools. Copyright © 2017 Elsevier Inc. All rights reserved.
Monteiro, Pedro Tiago; Pais, Pedro; Costa, Catarina; Manna, Sauvagya; Sá-Correia, Isabel; Teixeira, Miguel Cacho
2017-01-04
We present the PATHOgenic YEAst Search for Transcriptional Regulators And Consensus Tracking (PathoYeastract - http://pathoyeastract.org) database, a tool for the analysis and prediction of transcription regulatory associations at the gene and genomic levels in the pathogenic yeasts Candida albicans and C. glabrata. Upon data retrieval from hundreds of publications, followed by curation, the database currently includes 28,000 unique documented regulatory associations between transcription factors (TFs) and target genes and 107 DNA binding sites, considering 134 TFs in both species. Following the structure used for the YEASTRACT database, PathoYeastract makes available bioinformatics tools that enable the user to exploit the existing information to predict the TFs involved in the regulation of a gene or genome-wide transcriptional response, while ranking those TFs in order of their relative importance. Each search can be filtered based on the selection of specific environmental conditions, experimental evidence or positive/negative regulatory effect. Promoter analysis tools and interactive visualization tools for the representation of TF regulatory networks are also provided. The PathoYeastract database further provides simple tools for the prediction of gene and genomic regulation based on orthologous regulatory associations described for other yeast species, a comparative genomics setup for the study of cross-species evolution of regulatory networks. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
NASA Astrophysics Data System (ADS)
Strauss, B.; Dodson, D.; Kulp, S. A.; Rizza, D. H.
2016-12-01
Surging Seas Risk Finder (riskfinder.org) is an online tool for accessing extensive local projections and analysis of sea level rise; coastal floods; and land, populations, contamination sources, and infrastructure and other assets that may be exposed to inundation. Risk Finder was first published in 2013 for Florida, New York and New Jersey, expanding to all states in the contiguous U.S. by 2016, when a major new version of the tool was released with a completely new interface. The revised tool was informed by hundreds of survey responses from and conversations with planners, local officials and other coastal stakeholders, plus consideration of modern best practices for responsive web design and user interfaces, and social science-based principles for science communication. Overarching design principles include simplicity and ease of navigation, leading to a landing page with Google-like sparsity and focus on search, and to an architecture based on search, so that each coastal zip code, city, county, state or other place type has its own webpage gathering all relevant analysis in modular, scrollable units. Millions of users have visited the Surging Seas suite of tools to date, and downloaded thousands of files, for stated purposes ranging from planning to business to education to personal decisions; and from institutions ranging from local to federal government agencies, to businesses, to NGOs, and to academia.
NASA Astrophysics Data System (ADS)
Pezzi, M.; Favaro, M.; Gregori, D.; Ricci, P. P.; Sapunenko, V.
2014-06-01
In large computing centers, such as the INFN CNAF Tier1 [1], it is essential to be able to configure all the machines, depending on their use, in an automated way. For several years the Tier1 has used Quattor [2], a server provisioning tool, which is currently in production. Nevertheless, we have recently started a comparison study involving other tools able to provide specific server installation and configuration features and also offer a fully customizable solution as an alternative to Quattor. Our choice at the moment fell on the integration of two tools: Cobbler [3] for the installation phase and Puppet [4] for server provisioning and management operations. The tool should provide the following properties in order to replicate and gradually improve the current system features: a system check for storage-specific constraints, such as a kernel-module blacklist at boot time to avoid undesired SAN (Storage Area Network) access during disk partitioning; a simple and effective mechanism for kernel upgrade and downgrade; the ability to set the package provider using yum, rpm or apt; easy-to-use virtual machine installation support, including bonding and specific Ethernet configuration; and scalability for managing thousands of nodes and parallel installations. This paper describes the results of the comparison and the tests carried out to verify the requirements and the new system's suitability in the INFN-T1 environment.
User Manual for the PROTEUS Mesh Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Micheal A.; Shemon, Emily R
2016-09-19
PROTEUS is built around a finite element representation of the geometry for visualization. In addition, the PROTEUS-SN solver was built to solve the even-parity transport equation on a finite element mesh provided as input. Similarly, PROTEUS-MOC and PROTEUS-NEMO were built to apply the method of characteristics on unstructured finite element meshes. Given the complexity of real-world problems, experience has shown that using a commercial mesh generator to create rather simple input geometries is overly complex and slow. As a consequence, significant effort has been put into creating multiple codes that assist in mesh generation and manipulation. There are three input means to create a mesh in PROTEUS: UFMESH, GRID, and NEMESH. At present, UFMESH is a simple way to generate two-dimensional Cartesian and hexagonal fuel assembly geometries. The UFMESH input allows simple assembly mesh generation, while the GRID input allows the generation of Cartesian, hexagonal, and regular triangular structured grid geometry options. NEMESH is a way for the user to create their own mesh or convert another mesh file format into a PROTEUS input format. Given an input mesh format acceptable to PROTEUS, we have constructed several tools which allow further mesh and geometry construction (e.g., mesh extrusion and merging). This report describes the various mesh tools that are provided with the PROTEUS code, giving descriptions of both the input and the output. In many cases the examples are provided with a regression test of the mesh tools. The most important mesh tools for any user to consider using are the MT_MeshToMesh.x and MT_RadialLattice.x codes. The former allows conversion between most mesh types handled by PROTEUS, while the latter allows the merging of multiple (assembly) meshes into a radial structured grid. Note that the mesh generation process is recursive in nature and that each input specific to a given mesh tool (such as .axial or .merge) can be used as “mesh” input for any of the mesh tools discussed in this manual.
System for exchanging tools and end effectors on a robot
Burry, D.B.; Williams, P.M.
1991-02-19
A system and method for exchanging tools and end effectors on a robot permits exchange during a programmed task. The exchange mechanism is located off the robot, thus reducing the mass of the robot arm and permitting smaller robots to perform designated tasks. A simple spring/collet mechanism mounted on the robot is used, which permits the engagement and disengagement of the tool or end effector without the need for rotational orientation of the tool to the end effector/collet interface. As the tool-changing system is not located on the robot arm, no umbilical cords are located on the robot.
Franzoso, Gianpaolo
2014-01-01
Introduction: The purpose of this article is to share a modus operandi and a tool that allow the recruitment and management of thousands of patients and their treatment, using simple software created by the author and made freely available to all colleague pharmacists. The author, a pharmacist, created this database because there were no tools on the market with all the features needed to manage the treatment of patients and the ordering of drugs to ensure continuity of care without waste of public money. Methods: The data collection is facilitated by the software and allows the monitoring of patients' treatment and their re-evaluation. This tool can create a table containing all the information needed to predict the demand for drugs, the timing of therapies, and the treatment plans. It is an effective instrument to calculate the optimal purchase of drugs and the delivery of therapies to patients. Conclusions: A simple tool that allows the management of many patients, reduces research time and facilitates the control of therapies. It allows us to optimize inventory and minimize the stock of drugs. It allows the pharmacist to focus attention on the clinical management of the patient by helping him to follow therapy and respond to his needs.
Simple Machines. Physical Science in Action[TM]. Schlessinger Science Library. [Videotape].
ERIC Educational Resources Information Center
2000
In today's world, kids are aware that there are machines all around them. What they may not realize is that the function of all machines is to make work easier in some way. Simple Machines uses engaging visuals and colorful graphics to explain the concept of work and how humans use certain basic tools to help get work done. Students will learn…
Schwarz, Jean-Marc; Clearfield, Michael; Mulligan, Kathleen
2017-08-01
Epidemiologic studies suggest a link between excess sugar consumption and obesity, fatty liver disease, metabolic syndrome, and type 2 diabetes mellitus. One important pathway that may link these metabolic diseases to sugar consumption is hepatic conversion of sugar to fat, a process known as de novo lipogenesis (DNL). Mechanistic studies have shown that diets high in simple sugars increase both DNL and liver fat. Importantly, removal of sugar from diets of children with obesity for only 9 days consistently reduced DNL and liver fat and improved glucose and lipid metabolism. Although the sugar and beverage industries continue to question the scientific evidence linking high-sugar diets to metabolic diseases, major health organizations now make evidence-based recommendations to limit consumption of simple sugars to no more than 5% to 10% of daily intake. Clear recommendation about moderating sugar intake to patients may be an important nonpharmacologic tool to include in clinical practice.
Modeling Spacecraft Fuel Slosh at Embry-Riddle Aeronautical University
NASA Technical Reports Server (NTRS)
Schlee, Keith L.
2007-01-01
As a NASA-sponsored GSRP Fellow, I worked with other researchers and analysts at Embry-Riddle Aeronautical University and NASA's ELV Division to investigate the effect of spacecraft fuel slosh. NASA's research into the effects of fuel slosh includes modeling the response in full-sized tanks using equipment such as the Spinning Slosh Test Rig (SSTR), located at Southwest Research Institute (SwRI). NASA and SwRI engineers analyze data taken from SSTR runs and hand-derive equations of motion to identify model parameters and characterize the sloshing motion. With guidance from my faculty advisor, Dr. Sathya Gangadharan, and NASA flight controls analysts James Sudermann and Charles Walker, I set out to automate this parameter identification process by building a simple physical experimental setup to model free surface slosh in a spherical tank with a simple pendulum analog. This setup was then modeled using Simulink and SimMechanics. The Simulink Parameter Estimation Tool was then used to identify the model parameters.
Travagliati, Marco; Girardo, Salvatore; Pisignano, Dario; Beltram, Fabio; Cecchini, Marco
2013-09-03
Spatiotemporal image correlation spectroscopy (STICS) is a simple and powerful technique, well established as a tool to probe protein dynamics in cells. Recently, its potential as a tool to map velocity fields in lab-on-a-chip systems was discussed. However, the lack of studies on its performance has prevented its use in microfluidics applications. Here, we systematically and quantitatively explore STICS microvelocimetry in microfluidic devices. We exploit a simple experimental setup, based on a standard bright-field inverted microscope (no fluorescence required) and a high-fps camera, and apply STICS to map liquid flow in polydimethylsiloxane (PDMS) microchannels. Our data demonstrate reliable 2D velocimetry at flow velocities up to 10 mm/s and spatial resolution down to 5 μm.
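For readers unfamiliar with correlation-based velocimetry, the following minimal sketch shows the underlying idea: the displacement of intensity patterns between two frames can be read off the peak of their cross-correlation, and velocity follows from the pixel size and frame rate. This is a generic illustration with hypothetical inputs, not the authors' STICS implementation.

```python
# Minimal sketch: estimate the frame-to-frame displacement of tracer
# patterns from the peak of the 2D cross-correlation (FFT-based).
# Generic illustration, not the authors' STICS implementation.
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Return the (dy, dx) shift that best aligns frame_b with frame_a."""
    fa = np.fft.fft2(frame_a - frame_a.mean())
    fb = np.fft.fft2(frame_b - frame_b.mean())
    xcorr = np.fft.ifft2(fa * np.conj(fb)).real  # circular cross-correlation
    dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    ny, nx = frame_a.shape
    if dy > ny // 2:  # map wrap-around indices to negative shifts
        dy -= ny
    if dx > nx // 2:
        dx -= nx
    return dy, dx

# Velocity follows from pixel size and frame rate, e.g. with 0.5 um pixels
# at 10,000 frames per second:
# dy, dx = estimate_shift(frame0, frame1)
# vx = dx * 0.5e-6 * 10000  # m/s
```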
Visualization and Interaction in Research, Teaching, and Scientific Communication
NASA Astrophysics Data System (ADS)
Ammon, C. J.
2017-12-01
Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming, but the choices give those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches, and illustrate how even a little programming ability can free scientists from the constraints of existing tools and foster a deeper appreciation of data and models. I present examples from a suite of programming languages ranging from C to JavaScript, including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available on desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but they advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.
Kano, Yoshinobu; Nguyen, Ngan; Saetre, Rune; Yoshida, Kazuhiro; Miyao, Yusuke; Tsuruoka, Yoshimasa; Matsubayashi, Yuichiro; Ananiadou, Sophia; Tsujii, Jun'ichi
2008-01-01
Recently, several text mining programs have reached a near-practical level of performance. Some systems are already being used by biologists and database curators. However, it has also been recognized that current Natural Language Processing (NLP) and Text Mining (TM) technology is not easy to deploy, since research groups tend to develop systems that cater specifically to their own requirements. One of the major reasons for the difficulty of deployment of NLP/TM technology is that re-usability and interoperability of software tools are typically not considered during development. While some effort has been invested in making interoperable NLP/TM toolkits, the developers of end-to-end systems still often struggle to reuse NLP/TM tools, and often opt to develop similar programs from scratch instead. This is particularly the case in BioNLP, since the requirements of biologists are so diverse that NLP tools have to be adapted and re-organized in a much more extensive manner than was originally expected. Although generic frameworks like UIMA (Unstructured Information Management Architecture) provide promising ways to solve this problem, the solution that they provide is only partial. In order for truly interoperable toolkits to become a reality, we also need sharable type systems and a developer-friendly environment for software integration that includes functionality for systematic comparisons of available tools, a simple I/O interface, and visualization tools. In this paper, we describe such an environment that was developed based on UIMA, and we show its feasibility through our experience in developing a protein-protein interaction (PPI) extraction system.
MI-Sim: A MATLAB package for the numerical analysis of microbial ecological interactions.
Wade, Matthew J; Oakley, Jordan; Harbisher, Sophie; Parker, Nicholas G; Dolfing, Jan
2017-01-01
Food webs and other classes of ecological network motifs are a means of describing feeding relationships between consumers and producers in an ecosystem. They have application across scales, where they differ only in the underlying characteristics of the organisms and substrates describing the system. Mathematical modelling, using mechanistic approaches to describe the dynamic behaviour and properties of the system through sets of ordinary differential equations, has been used extensively in ecology. Models allow simulation of the dynamics of the various motifs, and their numerical analysis provides a greater understanding of the interplay between the system components and their intrinsic properties. We have developed the MI-Sim software for use with MATLAB to allow a rigorous and rapid numerical analysis of several common ecological motifs. MI-Sim contains a series of the most commonly used motifs such as cooperation, competition and predation. It does not require detailed knowledge of mathematical analytical techniques and is offered as a single graphical user interface containing all input and output options. The tools available in the current version of MI-Sim include model simulation, steady-state existence and stability analysis, and basin of attraction analysis. The software includes seven ecological interaction motifs and seven growth function models. Unlike other system analysis tools, MI-Sim is designed as a simple and user-friendly tool specific to ecological population type models, allowing for rapid assessment of their dynamical and behavioural properties.
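As a rough illustration of the kind of interaction motif MI-Sim analyzes, the sketch below integrates a two-population consumer-resource (predation) model with SciPy. The parameter values are arbitrary choices for illustration; MI-Sim itself is a MATLAB package with built-in stability and basin-of-attraction analyses rather than plain time integration.

```python
# Toy consumer-resource (predation) motif integrated with SciPy; arbitrary
# parameters chosen so that the two populations coexist at a steady state.
from scipy.integrate import solve_ivp

def predation(t, y, r=1.0, a=1.0, e=0.5, m=0.2):
    prey, pred = y
    dprey = r * prey * (1.0 - prey) - a * prey * pred  # logistic growth minus predation
    dpred = e * a * prey * pred - m * pred             # conversion minus mortality
    return [dprey, dpred]

sol = solve_ivp(predation, (0.0, 200.0), [0.5, 0.1])
print(sol.y[:, -1])  # approaches the coexistence equilibrium (0.4, 0.6)
```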
Integration of Irma tactical scene generator into directed-energy weapon system simulation
NASA Astrophysics Data System (ADS)
Owens, Monte A.; Cole, Madison B., III; Laine, Mark R.
2003-08-01
Integrated high-fidelity physics-based simulations that include engagement models, image generation, electro-optical hardware models and control system algorithms have previously been developed by Boeing-SVS for various tracking and pointing systems. These simulations, however, had always used images with featureless or random backgrounds and simple target geometries. With the requirement to engage tactical ground targets in the presence of cluttered backgrounds, a new type of scene generation tool was required to fully evaluate system performance in this challenging environment. To answer this need, Irma was integrated into the existing suite of Boeing-SVS simulation tools, allowing scene generation capabilities with unprecedented realism. Irma is a US Air Force research tool used for high-resolution rendering and prediction of target and background signatures. The MATLAB/Simulink-based simulation achieves closed-loop tracking by running track algorithms on the Irma-generated images, processing the track errors through optical control algorithms, and moving simulated electro-optical elements. The geometry of these elements determines the sensor orientation with respect to the Irma database containing the three-dimensional background and target models. This orientation is dynamically passed to Irma through a Simulink S-function to generate the next image. This integrated simulation provides a test-bed for development and evaluation of tracking and control algorithms against representative images including complex background environments and realistic targets calibrated using field measurements.
Andrew, Inga M; Waterfield, Kerry; Hildreth, Anthony J; Kirkpatrick, Graeme; Hawkins, Colette
2009-12-01
The objective of this study was to quantify the impact of standardized assessment and management tools on patient symptom scores in cancer-induced anorexia cachexia syndrome (ACS) using a within-group study design. Baseline assessments included the Patient Generated Subjective Global Assessment (PG-SGA) tool and an amended Symptoms and Concerns Checklist (SCC). Symptom management strategies, written for this project, were instigated. Follow-up SCC scores were collected at 2 and 4 weeks. Forty out of 79 patients referred were recruited; 29/79 (36.7%) were too unwell or had died prior to consent. At baseline, the PG-SGA tool revealed 250 active symptoms associated with ACS. Total PG-SGA score was above 9 for all patients. Predominant interventions involved simple dietary advice and prescription of artificial saliva, mouthwash and prokinetic antiemetics. Median total SCC score improved sequentially from 11 at baseline, to 7 and 4 at first and second review, respectively (visit 1 to 2, p = 0.001; visit 1 to 3, p < 0.001; and visit 2 to 3, p = 0.02). We conclude that patients with ACS are recognised late in their disease and have a considerable burden of active symptoms. A structured approach to assessment and management has a significant impact on symptom burden.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duke, Roger T.; Crump, Thomas Vu
This work provides a tool for improving the management of tasks associated with Agile projects. Agile projects are typically completed in an iterative manner, with many short-duration tasks performed as part of iterations generally referred to as sprints. The objective of this work is to create a single tool that enables sprint teams to manage all of their tasks in multiple sprints and automatically produce all standard sprint performance charts with minimum effort. The format of the printed work is designed to mimic a standard Kanban board. The work is developed as a single Excel file with worksheets capable of managing up to five concurrent sprints and up to one hundred tasks. It also includes a summary worksheet providing performance information from all active sprints. There are many commercial project management systems, typically designed with features desired by larger organizations with many resources managing multiple programs and projects. The audience for this work is small organizations and Agile project teams desiring an inexpensive, simple, user-friendly task management tool. This work uses standard, readily available software (Excel), requiring minimum data entry and automatically creating summary charts and performance data. It is formatted to print out and resemble standard flip charts and provide the visuals associated with this type of work.
Graph-based optimization of epitope coverage for vaccine antigen design
Theiler, James Patrick; Korber, Bette Tina Marie
2017-01-29
Epigraph is a recently developed algorithm that enables the computationally efficient design of single or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, include a comparison of different heuristics that can be used when graphs are not acyclic, and we describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we also show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.
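To make the coverage objective concrete, here is a toy sketch of scoring candidate antigens by the population frequency of the potential epitopes (k-mers) they contain. The helper names are hypothetical, and the real epigraph algorithm optimizes paths through a directed graph of overlapping k-mers rather than ranking fixed sequences.

```python
# Toy scoring of candidate antigens by summed population frequency of their
# distinct potential epitopes (k-mers). Hypothetical helpers; the real
# algorithm optimizes a path through a directed graph of overlapping k-mers.
from collections import Counter

def kmer_frequencies(sequences, k=9):
    """Fraction of sequences containing each length-k potential epitope."""
    counts = Counter(km for seq in sequences
                     for km in {seq[i:i + k] for i in range(len(seq) - k + 1)})
    return {km: n / len(sequences) for km, n in counts.items()}

def epitope_coverage(antigen, freqs, k=9):
    """Summed population frequency of the distinct k-mers in one antigen."""
    kmers = {antigen[i:i + k] for i in range(len(antigen) - k + 1)}
    return sum(freqs.get(km, 0.0) for km in kmers)

seqs = ["MKTAYIAKQR", "MKTAYIHKQR", "MKTVYIAKQR"]  # toy 'population'
freqs = kmer_frequencies(seqs, k=3)
best = max(seqs, key=lambda s: epitope_coverage(s, freqs, k=3))
print(best)
```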
A simple web-based tool to compare freshwater fish data collected using AFS standard methods
Bonar, Scott A.; Mercado-Silva, Norman; Rahr, Matt; Torrey, Yuta T.; Cate, Averill
2016-01-01
The American Fisheries Society (AFS) recently published Standard Methods for Sampling North American Freshwater Fishes. Enlisting the expertise of 284 scientists from 107 organizations throughout Canada, Mexico, and the United States, this text was developed to facilitate comparisons of fish data across regions or time. Here we describe a user-friendly web tool that automates among-sample comparisons in individual fish condition, population length-frequency distributions, and catch per unit effort (CPUE) data collected using AFS standard methods. Currently, the web tool (1) provides instantaneous summaries of almost 4,000 data sets of condition, length frequency, and CPUE of common freshwater fishes collected using standard gears in 43 states and provinces; (2) is easily appended with new standardized field data to update subsequent queries and summaries; (3) compares fish data from a particular water body with continent, ecoregion, and state data summaries; and (4) provides additional information about AFS standard fish sampling including benefits, ongoing validation studies, and opportunities to comment on specific methods. The web tool—programmed in a PHP-based Drupal framework—was supported by several AFS Sections, agencies, and universities and is freely available from the AFS website and fisheriesstandardsampling.org. With widespread use, the online tool could become an important resource for fisheries biologists.
Enhanced STEM Learning with the GeoMapApp Data Exploration Tool
NASA Astrophysics Data System (ADS)
Goodwillie, A. M.
2014-12-01
GeoMapApp (http://www.geomapapp.org), is a free, map-based data discovery and visualisation tool developed with NSF funding at Lamont-Doherty Earth Observatory. GeoMapApp provides casual and specialist users alike with access to hundreds of built-in geoscience data sets covering geology, geophysics, geochemistry, oceanography, climatology, cryospherics, and the environment. Users can also import their own data tables, spreadsheets, shapefiles, grids and images. Simple manipulation and analysis tools combined with layering capabilities and engaging visualisations provide a powerful platform with which to explore and interrogate geoscience data in its proper geospatial context thus helping users to more easily gain insight into the meaning of the data. A global elevation base map covering the oceans as well as continents forms the backbone of GeoMapApp. The multi-resolution base map is updated regularly and includes data sources ranging from Space Shuttle elevation data for land areas to ultra-high-resolution surveys of coral reefs and seafloor hydrothermal vent fields. Examples of built-in data sets that can be layered over the elevation model include interactive earthquake and volcano data, plate tectonic velocities, hurricane tracks, land and ocean temperature, water column properties, age of the ocean floor, and deep submersible bottom photos. A versatile profiling tool provides instant access to data cross-sections. Contouring and 3-D views are also offered - the attached image shows a 3-D view of East Africa's Ngorongoro Crater as an example. Tabular data - both imported and built-in - can be displayed in a variety of ways and a lasso tool enables users to quickly select data points directly from the map. A range of STEM-based education material based upon GeoMapApp is already available, including a number of self-contained modules for school- and college-level students (http://www.geomapapp.org/education/contributed_material.html). More learning modules are planned, such as one on the effects of sea-level rise. GeoMapApp users include students, teachers, researchers, curriculum developers and outreach specialists.
Gener: a minimal programming module for chemical controllers based on DNA strand displacement
Kahramanoğulları, Ozan; Cardelli, Luca
2015-01-01
Summary: Gener is a development module for programming chemical controllers based on DNA strand displacement. Gener is developed with the aim of providing a simple interface that minimizes the opportunities for programming errors: Gener allows the user to test the computations of the DNA programs based on a simple two-domain strand displacement algebra, the minimal available so far. The tool allows the user to perform stepwise computations with respect to the rules of the algebra as well as exhaustive search of the computation space with different options for exploration and visualization. Gener can be used in combination with existing tools, and in particular, its programs can be exported to Microsoft Research’s DSD tool as well as to LaTeX. Availability and implementation: Gener is available for download at the Cosbi website at http://www.cosbi.eu/research/prototypes/gener as a Windows executable that can be run on Mac OS X and Linux by using Mono. Contact: ozan@cosbi.eu PMID:25957353
Romero, A F; Oliveira, M; Abessa, D M S
2018-03-01
This study sought to develop a simple index for ranking birds' environmental sensitivity to oil in which birds are used as biological indicators. The study area consisted of both the Santos Estuarine System (SES), and the Laje de Santos Marine State Park (LSMSP), located in Southeastern Brazil. Information on the bird species and their feeding and nesting behaviors were obtained from the literature and were the basis of the sensitivity index created. The SES had a higher number of species, but only about 30% were found to be highly sensitive. The LSMSP presented a much lower number of species, but all of them were considered to be highly sensitive to oil. Due to its simplicity, this index can be employed worldwide as a decision-making tool that may be integrated into other management tools, particularly when robust information on the biology of birds is lacking. Copyright © 2017 Elsevier Ltd. All rights reserved.
Applying open source data visualization tools to standard based medical data.
Kopanitsa, Georgy; Taranik, Maxim
2014-01-01
Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. The varied backgrounds of patients, especially elderly people, require simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. Use of standard-based medical data allows the development of generic visualization methods. Focusing on the deployment of open-source tools, in this paper we applied JavaScript libraries to create data presentations for standard-based medical data.
Building Flexible User Interfaces for Solving PDEs
NASA Astrophysics Data System (ADS)
Logg, Anders; Wells, Garth N.
2010-09-01
FEniCS is a collection of software tools for the automated solution of differential equations by finite element methods. In this note, we describe how FEniCS can be used to solve a simple nonlinear model problem with varying levels of automation. At one extreme, FEniCS provides tools for the fully automated and adaptive solution of nonlinear partial differential equations. At the other extreme, FEniCS provides a range of tools that allow the computational scientist to experiment with novel solution algorithms.
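As a hedged illustration of the fully automated end of that spectrum, the sketch below solves a classic nonlinear Poisson problem with the legacy FEniCS Python interface. The mesh, coefficient, and boundary data are arbitrary choices for illustration, not necessarily the model problem of the note.

```python
# Nonlinear Poisson demo with the legacy FEniCS Python API (assumes the
# fenics/DOLFIN package is installed). Mesh, coefficient, and boundary data
# are illustrative choices only.
from fenics import *

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "P", 1)

u_D = Expression("x[0] + 0.5*x[1]*x[1]", degree=2)  # Dirichlet boundary data
bc = DirichletBC(V, u_D, "on_boundary")

u = Function(V)      # unknown; nonlinear problems use Function, not TrialFunction
v = TestFunction(V)
f = Constant(1.0)
F = (1 + u**2) * dot(grad(u), grad(v)) * dx - f * v * dx  # nonlinear residual

solve(F == 0, u, bc)  # Newton's method is selected automatically
print(u.vector().max())
```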
Documet, Jorge; Liu, Brent J; Documet, Luis; Huang, H K
2006-07-01
This paper describes a picture archiving and communication system (PACS) tool based on Web technology that remotely manages medical images between a PACS archive and remote destinations. Successfully implemented in a clinical environment and also demonstrated for the past 3 years at the conferences of various organizations, including the Radiological Society of North America, this tool provides a very practical and simple way to manage a PACS, including off-site image distribution and disaster recovery. The application is robust and flexible and can be used on a standard PC workstation or a Tablet PC, but more important, it can be used with a personal digital assistant (PDA). With a PDA, the Web application becomes a powerful wireless and mobile image management tool. The application's quick and easy-to-use features allow users to perform Digital Imaging and Communications in Medicine (DICOM) queries and retrievals with a single interface, without having to worry about the underlying configuration of DICOM nodes. In addition, this frees up dedicated PACS workstations to perform their specialized roles within the PACS workflow. This tool has been used at Saint John's Health Center in Santa Monica, California, for 2 years. The average number of queries per month is 2,021, with 816 C-MOVE retrieve requests. Clinical staff members can use PDAs to manage image workflow and PACS examination distribution conveniently for off-site consultations by referring physicians and radiologists and for disaster recovery. This solution also improves radiologists' effectiveness and efficiency in health care delivery both within radiology departments and for off-site clinical coverage.
HPC Profiling with the Sun Studio™ Performance Tools
NASA Astrophysics Data System (ADS)
Itzkowitz, Marty; Maruyama, Yukon
In this paper, we describe how to use the Sun Studio Performance Tools to understand the nature and causes of application performance problems. We first explore CPU and memory performance problems for single-threaded applications, giving some simple examples. Then, we discuss multi-threaded performance issues, such as locking and false-sharing of cache lines, in each case showing how the tools can help. We go on to describe OpenMP applications and the support for them in the performance tools. Then we discuss MPI applications, and the techniques used to profile them. Finally, we present our conclusions.
Flowgen: Flowchart-based documentation for C++ codes
NASA Astrophysics Data System (ADS)
Kosower, David A.; Lopez-Villarejo, J. J.
2015-11-01
We present the Flowgen tool, which generates flowcharts from annotated C++ source code. The tool generates a set of interconnected high-level UML activity diagrams, one for each function or method in the C++ sources. It provides a simple and visual overview of complex implementations of numerical algorithms. Flowgen is complementary to the widely-used Doxygen documentation tool. The ultimate aim is to render complex C++ computer codes accessible, and to enhance collaboration between programmers and algorithm or science specialists. We describe the tool and a proof-of-concept application to the VINCIA plug-in for simulating collisions at CERN's Large Hadron Collider.
Diamond machine tool face lapping machine
Yetter, H.H.
1985-05-06
An apparatus for shaping, sharpening, and polishing diamond-tipped single-point machine tools. Isolating the rotating grinding wheel from its driving apparatus with an air bearing, and moving the tool being shaped, polished, or sharpened across the surface of the grinding wheel so that it does not remain at one radius for more than a single rotation, has been found to readily yield machine tools of a quality previously obtainable only by the most tedious and costly processing procedures, and unattainable by simple lapping techniques.
Simplifying operations with an uplink/downlink integration toolkit
NASA Technical Reports Server (NTRS)
Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine
1994-01-01
The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.
A Novel Passive Robotic Tool Interface
NASA Astrophysics Data System (ADS)
Roberts, Paul
2013-09-01
The increased capability of space robotics has seen their uses increase from simple sample gathering and mechanical adjuncts to humans, to sophisticated multi-purpose investigative and maintenance tools that substitute for humans for many external space tasks. As with all space missions, reducing mass and system complexity is critical. A key component of robotic systems mass and complexity is the number of motors and actuators needed. MDA has developed a passive tool interface that, like a household power drill, permits a single tool actuator to be interfaced with many Tool Tips without requiring additional actuators to manage the changing and storage of these tools. MDA's Multifunction Tool interface permits a wide range of Tool Tips to be designed to a single interface that can be pre-qualified to torque and strength limits such that additional Tool Tips can be added to a mission's "tool kit" simply and quickly.
NASA Astrophysics Data System (ADS)
Sosa, M.; Grundel, L.; Simini, F.
2016-04-01
Logical reasoning has been part of medical practice since its origins. Modern medicine has included information-intensive tools to refine diagnostics and treatment protocols. We are introducing formal logic teaching in the Medical School curriculum prior to clinical internship, to foster sound medical practice. Two simple examples (acute myocardial infarction and diabetes mellitus) are given in terms of formal logic expressions and truth tables. Flowcharts of both diagnostic processes help explain the procedures and validate them logically. A particularity of medical information is that it is often accompanied by missing data, which suggests adapting formal logic to a three-valued logic in the future. Medical education must include formal logic so that students can understand complex protocols and best practices, which are prone to mutual interactions.
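A minimal sketch of the three-valued logic the authors anticipate, using Kleene's strong three-valued connectives with None standing in for missing data; this is an illustrative assumption, not the authors' software.

```python
# Kleene's strong three-valued connectives over {True, False, None},
# with None standing for "missing". Illustrative, not the authors' software.
def and3(a, b):
    """False dominates; otherwise a missing operand makes the result missing."""
    if a is False or b is False:
        return False
    if a is None or b is None:
        return None
    return True

def or3(a, b):
    """True dominates; otherwise a missing operand makes the result missing."""
    if a is True or b is True:
        return True
    if a is None or b is None:
        return None
    return False

# Toy diagnostic rule: chest_pain AND elevated_troponin -> suspect infarction.
print(and3(True, None))   # None: the troponin result is missing, so undecided
print(and3(True, False))  # False: the rule is definitely not satisfied
```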
Cygankiewicz, Iwona; Zareba, Wojciech
2013-01-01
Heart rate variability (HRV) provides indirect insight into autonomic nervous system tone, and has a well-established role as a marker of cardiovascular risk. Recent decades brought an increasing interest in HRV assessment as a diagnostic tool in detection of autonomic impairment, and prediction of prognosis in several neurological disorders. Both bedside analysis of simple markers of HRV, as well as more sophisticated HRV analyses including time, frequency domain and nonlinear analysis have been proven to detect early autonomic involvement in several neurological disorders. Furthermore, altered HRV parameters were shown to be related with cardiovascular risk, including sudden cardiac risk, in patients with neurological diseases. This chapter aims to review clinical and prognostic application of HRV analysis in diabetes, stroke, multiple sclerosis, muscular dystrophies, Parkinson's disease and epilepsy. © 2013 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Oreilly, Daniel; Williams, Robert; Yarborough, Kevin
1988-01-01
This is a tutorial/diagnostic system for training personnel in the use of the Space Shuttle Main Engine Controller (SSMEC) Simulation Lab. It also provides a diagnostic capability for isolating lab failures, at least to the major lab component. The system was implemented using HyperCard, a hypermedia program running on Apple Macintosh computers. HyperCard proved to be a viable platform for the development and use of sophisticated tutorial systems and moderately capable diagnostic systems. This tutorial/diagnostic system uses the basic HyperCard tools to provide the tutorial. The diagnostic part of the system uses a simple interpreter written in the HyperCard language (HyperTalk) to implement the backward-chaining, rule-based logic commonly found in diagnostic systems written in Prolog. The advantages of HyperCard for developing this type of system include sophisticated graphics, animation, sound and voice capabilities, its strength as a hypermedia tool, and its ability to include digitized pictures. The major disadvantage is the slow execution time for evaluation of rules (due to the interpretive processing of the language). Other disadvantages include the limitation on card size, the lack of support for color and gray-scale graphics, and the lack of selectable fonts for text fields.
ERIC Educational Resources Information Center
Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin
2016-01-01
In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…
Stimulus Control: The Sought or Unsought Influence of the Objects We Tend to
ERIC Educational Resources Information Center
Morsella, Ezequiel; Larson, Lindsay R. L.; Zarolia, Pareezad; Bargh, John A.
2011-01-01
Does the mere presence of the things we have tended to influence our actions systematically, in ways that escape our awareness? For example, while entering a tool shed, does perceiving objects that we once tended to (e.g., tools, musical instruments) influence how we then execute a simple action (e.g., flicking the shed's light switch)? Ancient…
Using a Client Survey to Support Continuous Improvement: An Australian Case Study in Managing Change
ERIC Educational Resources Information Center
Besch, Janice
2014-01-01
With the arrival of online survey tools that are low-cost, readily available and easy to administer, all organizations have access to one of the most effective mechanisms for determining quality improvement priorities and measuring progress towards achieving those priorities over time. This case study outlines the use made of this simple tool by a…
ERIC Educational Resources Information Center
Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai
2016-01-01
A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…
Development of a Statistical Validation Methodology for Fire Weather Indices
Brian E. Potter; Scott Goodrick; Tim Brown
2003-01-01
Fire managers and forecasters must have tools, such as fire indices, to summarize large amounts of complex information. These tools allow them to identify and plan for periods of elevated risk and/or wildfire potential. This need was once met using simple measures like relative humidity or maximum daily temperature (e.g., Gisborne, 1936) to describe fire weather, and...
ERIC Educational Resources Information Center
Environmental Protection Agency, Washington, DC. Office of Radiation and Indoor Air.
The U.S. Environmental Protection Agency (EPA) developed the Indoor Air Quality Tools for Schools (IAQ TfS) Program to help schools prevent, identify, and resolve their IAQ problems. This publication describes the program and its advantages, explaining that through simple, low-cost measures, schools can: reduce IAQ-related health risks and…
A Switching-Mode Power Supply Design Tool to Improve Learning in a Power Electronics Course
ERIC Educational Resources Information Center
Miaja, P. F.; Lamar, D. G.; de Azpeitia, M.; Rodriguez, A.; Rodriguez, M.; Hernando, M. M.
2011-01-01
The static design of ac/dc and dc/dc switching-mode power supplies (SMPS) relies on a simple but repetitive process. Although specific spreadsheets, available in various computer-aided design (CAD) programs, are widely used, they are difficult to use in educational applications. In this paper, a graphic tool programmed in MATLAB is presented,…
ERIC Educational Resources Information Center
Lawal-Adebowale, O. A.; Oyekunle, O.
2014-01-01
With the integration of an information technology tool for academic course registration at the Federal University of Agriculture, Abeokuta, the study assessed agro-students' appraisal of the online tool for course registration. A simple random sampling technique was used to select 325 agro-students, and a validated and reliable questionnaire was used…
A late Pleistocene human presence at Huaca Prieta, Peru, and early Pacific Coastal adaptations
NASA Astrophysics Data System (ADS)
Dillehay, Tom D.; Bonavia, Duccio; Goodbred, Steve L.; Pino, Mario; Vásquez, Victor; Tham, Teresa Rosales
2012-05-01
Archaeological excavations in deep pre-mound levels at Huaca Prieta in northern Peru have yielded new evidence of late Pleistocene cultural deposits that shed insights into the early human occupation of the Pacific coast of South America. Radiocarbon dates place this occupation between ~ 14,200 and 13,300 cal yr BP. The cultural evidence shares certain basic technological and subsistence traits, including maritime resources and simple flake tools, with previously discovered late Pleistocene sites along the Pacific coast of Peru and Chile. The results help to expand our knowledge of early maritime societies and human adaptation to changing coastal environments.
Epistemonikos: a free, relational, collaborative, multilingual database of health evidence.
Rada, Gabriel; Pérez, Daniel; Capurro, Daniel
2013-01-01
Epistemonikos (www.epistemonikos.org) is a free, multilingual database of the best available health evidence. This paper describes the design, development and implementation of the Epistemonikos project. Using several web technologies to store systematic reviews, their included articles, overviews of reviews and structured summaries, Epistemonikos is able to provide a simple and powerful search tool to access health evidence for sound decision making. Currently, Epistemonikos stores more than 115,000 unique documents and more than 100,000 relationships between documents. In addition, since its database is translated into 9 different languages, Epistemonikos ensures that non-English speaking decision-makers can access the best available evidence without language barriers.
Can Technology Improve the Quality of Colonoscopy?
Thirumurthi, Selvi; Ross, William A; Raju, Gottumukkala S
2016-07-01
In order for screening colonoscopy to be an effective tool in reducing colon cancer incidence, exams must be performed in a high-quality manner. Quality metrics have been presented by gastroenterology societies and now include higher adenoma detection rate targets than in the past. In many cases, the quality of colonoscopy can often be improved with simple low-cost interventions such as improved procedure technique, implementing split-dose bowel prep, and monitoring individuals' performances. Emerging technology has expanded our field of view and image quality during colonoscopy. We will critically review several technological advances in the context of quality metrics and discuss if technology can really improve the quality of colonoscopy.
Voluntary environmental agreements: Good or bad news for environmental protection?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Segerson, K.; Miceli, T.J.
1998-09-01
There has been growing interest in the use of voluntary agreements (VAs) as an environmental policy tool. This article uses a simple model to determine whether VAs are likely to lead to efficient environmental protection. The authors consider cases where polluters are induced to participate either by a background threat of mandatory controls (the stick approach) or by cost-sharing subsidies (the carrot approach). The results suggest that the overall impact on environmental quality could be positive or negative, depending on a number of factors, including the allocation of bargaining power, the magnitude of the background threat, and the social cost of funds.
A fast, parallel algorithm for distance-dependent calculation of crystal properties
NASA Astrophysics Data System (ADS)
Stein, Matthew
2017-12-01
A fast, parallel algorithm for distance-dependent calculation and simulation of crystal properties is presented, along with speedup results and methods of application. An illustrative example is used to compute the Lennard-Jones lattice constants to up to 32 significant figures for 4 ≤ p ≤ 30 in the simple cubic, face-centered cubic, body-centered cubic, hexagonal-close-packed, and diamond lattices. In most cases, the known precision of these constants is more than doubled, and in some cases previously published values are corrected. The tools and strategies that make this computation possible are detailed, along with application to other potentials, including those that model defects.
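For orientation, the lattice constants in question are sums of inverse powers of distance over all lattice sites. A brute-force, fixed-precision version of the simple cubic case might look like the sketch below, where the cutoff nmax is an arbitrary choice; the paper's algorithm is a fast, parallel, arbitrary-precision refinement of this idea.

```python
# Brute-force partial lattice sum L_p = sum over nonzero sites of 1/|r|^p for
# the simple cubic lattice, truncated at an arbitrary cutoff nmax. The paper's
# algorithm is a fast, parallel, arbitrary-precision refinement of this idea.
import itertools

def lj_lattice_sum_sc(p, nmax=20):
    total = 0.0
    for i, j, k in itertools.product(range(-nmax, nmax + 1), repeat=3):
        if (i, j, k) == (0, 0, 0):
            continue
        total += (i * i + j * j + k * k) ** (-p / 2.0)
    return total

# Converges quickly for large p (e.g. p = 12), slowly for small p.
print(lj_lattice_sum_sc(12))
```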
Making a Place for Space: Spatial Thinking in Social Science
Logan, John R.
2013-01-01
New technologies and multilevel data sets that include geographic identifiers have heightened sociologists’ interest in spatial analysis. I review several of the key concepts, measures, and methods that are brought into play in this work, and offer examples of their application in a variety of substantive fields. I argue that the most effective use of the new tools requires greater emphasis on spatial thinking. A device as simple as an illustrative map requires some understanding of how people respond to visual cues; models as complex as HLM with spatial lags require thoughtful measurement decisions and raise questions about what a spatial effect represents. PMID:24273374
A comprehensive surface-groundwater flow model
NASA Astrophysics Data System (ADS)
Arnold, Jeffrey G.; Allen, Peter M.; Bernhardt, Gilbert
1993-02-01
In this study, a simple groundwater flow and height model was added to an existing basin-scale surface water model. The linked model is: (1) watershed scale, allowing the basin to be subdivided; (2) designed to accept readily available inputs to allow general use over large regions; (3) continuous in time to allow simulation of land management, including such factors as climate and vegetation changes, pond and reservoir management, groundwater withdrawals, and stream and reservoir withdrawals. The model is described, and is validated on a 471 km² watershed near Waco, Texas. This linked model should provide a comprehensive tool for water resource managers in development and planning.
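As an illustration of the simplest kind of groundwater store that can be linked to a surface-water model, the sketch below implements a linear reservoir whose baseflow is proportional to storage. The parameter values are assumed for illustration; this is not the paper's specific formulation.

```python
# Linear-reservoir groundwater store: baseflow is proportional to storage.
# Assumed parameter values; not the paper's specific formulation.
def linear_reservoir(recharge_mm, k=0.05, storage0=100.0):
    """Daily baseflow (mm) for a recharge series, with recession constant k (1/day)."""
    storage, baseflow = storage0, []
    for r in recharge_mm:
        storage += r          # recharge passed in from the surface-water model
        q = k * storage       # baseflow released to the stream
        storage -= q
        baseflow.append(q)
    return baseflow

print(linear_reservoir([5.0, 0.0, 0.0, 10.0, 0.0]))
```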
Repair of major system elements on Skylab
NASA Technical Reports Server (NTRS)
Pace, R. E., Jr.
1975-01-01
In-flight maintenance, as conceived and pre-planned for the Skylab Mission, was limited to simple scheduled and unscheduled replacement tasks and minor contingency repairs. Failures during the mission dictated complicated and sophisticated repairs to major systems so that the mission could continue. These repairs include the release of a large structure that failed to deploy, the assembly and deployment of large mechanical devices, the installation and checkout of precision electronic equipment, troubleshooting and repair of precision electromechanical equipment and tapping into and recharging a cooling system. The Skylab experience proves conclusively that crewmen can, with adequate training, make major system repairs in space using standard or special tools.
Twelve tips for applying change models to curriculum design, development and delivery.
McKimm, Judy; Jones, Paul Kneath
2017-10-25
Drawing primarily from business and management literature and the authors' experience, these 12 tips provide guidance to organizations, teams, and individuals involved in curriculum or program development at undergraduate, postgraduate, and continuing education levels. The tips are based around change models and approaches and can help underpin successful curriculum review, development, and delivery, as well as fostering appropriate educational innovation. A range of tools exist to support systematic program development and review, but even relatively simple changes need to take account of many factors, including the complexity of the environment, stakeholder engagement, cultural and psychological aspects, and the importance of followers.
FLaapLUC: A pipeline for the generation of prompt alerts on transient Fermi-LAT γ-ray sources
NASA Astrophysics Data System (ADS)
Lenain, J.-P.
2018-01-01
The large majority of high-energy sources detected with Fermi-LAT are blazars, which are known to be very variable sources. Because high-cadence, long-term monitoring simultaneously at different wavelengths is prohibitive, the study of their transient activity can help shed light on our understanding of these objects. The early detection of such potentially fast transient events is the key to triggering follow-up observations at other wavelengths. A Python tool, FLaapLUC, built on top of the Science Tools provided by the Fermi Science Support Center and the Fermi-LAT collaboration, has been developed using a simple aperture photometry approach. This tool can effectively detect relative flux variations in a set of predefined sources and alert potential users. Such alerts can then be used to trigger target-of-opportunity observations with other facilities. It is shown that FLaapLUC is an efficient tool for revealing transient events in Fermi-LAT data, providing quick results which can be used to promptly organise follow-up observations. Results from this simple aperture photometry method are also compared to full likelihood analyses. The FLaapLUC package is made available on GitHub and is open to contributions from the community.
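The alert logic of such a pipeline can be illustrated in a few lines: flag time bins whose flux deviates from the long-term weighted mean by more than N sigma. This is a generic sketch with hypothetical inputs, not FLaapLUC's actual API (which builds its light curves with the Fermi Science Tools).

```python
# Flag light-curve bins whose flux exceeds the long-term weighted mean by
# more than n_sigma. Generic sketch with hypothetical inputs, not the
# FLaapLUC API.
import numpy as np

def find_flares(times, fluxes, errors, n_sigma=3.0):
    """Return the times of bins significantly above the long-term mean flux."""
    fluxes, errors = np.asarray(fluxes), np.asarray(errors)
    mean = np.average(fluxes, weights=1.0 / errors**2)  # inverse-variance weights
    significance = (fluxes - mean) / errors
    return [t for t, s in zip(times, significance) if s > n_sigma]

# e.g. alerts = find_flares(mjd, flux, flux_err); a recent entry in `alerts`
# would trigger a target-of-opportunity request.
```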
What is conservation physiology? Perspectives on an increasingly integrated and essential science†
Cooke, Steven J.; Sack, Lawren; Franklin, Craig E.; Farrell, Anthony P.; Beardall, John; Wikelski, Martin; Chown, Steven L.
2013-01-01
Globally, ecosystems and their constituent flora and fauna face the localized and broad-scale influence of human activities. Conservation practitioners and environmental managers struggle to identify and mitigate threats, reverse species declines, restore degraded ecosystems, and manage natural resources sustainably. Scientific research and evidence are increasingly regarded as the foundation for new regulations, conservation actions, and management interventions. Conservation biologists and managers have traditionally focused on the characteristics (e.g. abundance, structure, trends) of populations, species, communities, and ecosystems, and simple indicators of the responses to environmental perturbations and other human activities. However, an understanding of the specific mechanisms underlying conservation problems is becoming increasingly important for decision-making, in part because physiological tools and knowledge are especially useful for developing cause-and-effect relationships, and for identifying the optimal range of habitats and stressor thresholds for different organisms. When physiological knowledge is incorporated into ecological models, it can improve predictions of organism responses to environmental change and provide tools to support management decisions. Without such knowledge, we may be left with simple associations. ‘Conservation physiology’ has been defined previously with a focus on vertebrates, but here we redefine the concept universally, for application to the diversity of taxa from microbes to plants, to animals, and to natural resources. We also consider ‘physiology’ in the broadest possible terms; i.e. how an organism functions, and any associated mechanisms, from development to bioenergetics, to environmental interactions, through to fitness. Moreover, we consider conservation physiology to include a wide range of applications beyond assisting imperiled populations, and include, for example, the eradication of invasive species, refinement of resource management strategies to minimize impacts, and evaluation of restoration plans. This concept of conservation physiology emphasizes the basis, importance, and ecological relevance of physiological diversity at a variety of scales. Real advances in conservation and resource management require integration and inter-disciplinarity. Conservation physiology and its suite of tools and concepts is a key part of the evidence base needed to address pressing environmental challenges. PMID:27293585
A simple method to predict regional fish abundance: an example in the McKenzie River Basin, Oregon
D.J. McGarvey; J.M. Johnston
2011-01-01
Regional assessments of fisheries resources are increasingly called for, but tools with which to perform them are limited. We present a simple method that can be used to estimate regional carrying capacity and apply it to the McKenzie River Basin, Oregon. First, we use a macroecological model to predict trout densities within small, medium, and large streams in the...
Space platform power system hardware testbed
NASA Technical Reports Server (NTRS)
Sable, D.; Patil, A.; Sizemore, T.; Deuty, S.; Noon, J.; Cho, B. H.; Lee, F. C.
1991-01-01
The scope of the work on the NASA Space Platform includes the design of a multi-module, multi-phase boost regulator, and a voltage-fed, push-pull autotransformer converter for the battery discharger. A buck converter was designed for the charge regulator. Also included is the associated mode control electronics for the charger and discharger, as well as continued development of a comprehensive modeling and simulation tool for the system. The design of the multi-module boost converter is discussed for use as a battery discharger. An alternative battery discharger design is discussed using a voltage-fed, push-pull autotransformer converter. The design of the charge regulator is explained using a simple buck converter. The design of the mode controller and effects of locating the bus filter capacitor bank 20 feet away from the power ORU are discussed. A brief discussion of some alternative topologies for battery charging and discharging is included. The power system modeling is described.
Hwang, Eun Gu; Lee, Yunjung
2016-12-01
Simple radiography is the basic diagnostic tool for rib fractures caused by chest trauma, but it has some limitations, so other tools are also being used. The aims of this study were to investigate the effectiveness of ultrasonography (US) for identifying rib fractures and to identify factors influencing its effectiveness. Between October 2003 and August 2007, 201 patients with blunt chest trauma were available to undergo chest radiographic and US examinations for diagnosis of rib fractures. The two modalities were compared in terms of effectiveness based on simple radiographic readings and US examination results. We also investigated the factors that influenced the effectiveness of US examination. Rib fractures were detected on radiography in 69 patients (34.3%) but not in the remaining 132 patients. Rib fractures were diagnosed by US examination in 160 patients (84.6%). Of the 132 patients who showed no rib fractures on radiography, 92 showed rib fractures on US. Among the 69 patients with rib fractures detected on radiography, 33 had additional rib fractures detected on US. Of all patients, 76 (37.8%) had identical radiographic and US results, and 125 (62.2%) had fractures detected on US that were previously undetected on radiography or additional fractures detected on US. Age, time until US examination, and fracture location were not significant influencing factors. However, US was significantly more effective in the group without fractures detected on radiography than in the group with fractures detected on radiography (P = 0.003). We conclude that US examination can detect rib fractures unnoticed on simple radiography and is especially effective in patients without fractures detected on radiography. More attention should be paid to patients with chest trauma who have no fractures detected on radiography.
ViSimpl: Multi-View Visual Analysis of Brain Simulation Data
Galindo, Sergio E.; Toharia, Pablo; Robles, Oscar D.; Pastor, Luis
2016-01-01
After decades of independent morphological and functional brain research, a key point in neuroscience nowadays is to understand the combined relationships between the structure of the brain and its components and their dynamics on multiple scales, ranging from circuits of neurons at micro or mesoscale to brain regions at macroscale. With such a goal in mind, there is a vast amount of research focusing on modeling and simulating activity within neuronal structures, and these simulations generate large and complex datasets which have to be analyzed in order to gain the desired insight. In such context, this paper presents ViSimpl, which integrates a set of visualization and interaction tools that provide a semantic view of brain data with the aim of improving its analysis procedures. ViSimpl provides 3D particle-based rendering that allows visualizing simulation data with their associated spatial and temporal information, enhancing the knowledge extraction process. It also provides abstract representations of the time-varying magnitudes supporting different data aggregation and disaggregation operations and giving also focus and context clues. In addition, ViSimpl tools provide synchronized playback control of the simulation being analyzed. Finally, ViSimpl allows performing selection and filtering operations relying on an application called NeuroScheme. All these views are loosely coupled and can be used independently, but they can also work together as linked views, both in centralized and distributed computing environments, enhancing the data exploration and analysis procedures. PMID:27774062